where ev is the variance error, sometimes referred to as the experimental error, which results from sampling variation, and eb is the so-called bias error (26), which refers to the failure of the approximating function (i.e., polynomials) to exactly represent the true function φ(x). Among the most commonly used designs for RSM studies are central composite designs (6,7), three-level designs (27), mixture designs (28-30), and several other fractional factorials (9,31,32).

After the coefficients βi have been determined (to satisfy equation 25), the stationary point(s) can be located. This is done by taking the first partial derivatives of equation 23 or 24, setting them equal to zero, and solving the resulting system of n equations (see equation 3 and the related discussion). Matrix algebra can accomplish the same objective (7,8).
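The matrix-algebra route can be sketched as follows. Writing the fitted second-order model as y = b0 + b'x + x'Bx, the stationary point is where the gradient b + 2Bx vanishes. The coefficient values below are illustrative, not from any particular study.

```python
import numpy as np

# Hypothetical fitted second-order model in two factors:
# y = b0 + b'x + x'Bx, with b the linear coefficients and B the
# symmetric matrix of quadratic and (halved) interaction coefficients.
b0 = 80.0
b = np.array([2.0, 1.5])                 # linear coefficients (assumed values)
B = np.array([[-1.0, 0.25],
              [0.25, -1.5]])             # quadratic/interaction coefficients

# Setting the gradient b + 2Bx to zero gives the stationary point:
x_s = np.linalg.solve(-2.0 * B, b)
y_s = b0 + b @ x_s + x_s @ B @ x_s       # predicted response there
print(x_s, y_s)
```

Solving one small linear system replaces the manual elimination of the n simultaneous equations, which is why the matrix form is preferred in practice.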

A canonical transformation of equation 23 or 24 indicates the nature of the stationary point (8). Complicated ridge systems, which arise when the stationary point lies far from the origin, outside the experimental region, may require further ridge analysis (33).
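The heart of the canonical transformation is an eigenvalue analysis of the matrix of second-order coefficients; the sketch below uses the same illustrative B matrix convention as above and is not tied to any particular data set.

```python
import numpy as np

# Hypothetical matrix of quadratic/interaction coefficients from a
# fitted second-order model (convention: y = b0 + b'x + x'Bx).
B = np.array([[-1.0, 0.25],
              [0.25, -1.5]])

# The eigenvalues of B are the coefficients of the canonical form.
# All negative -> maximum; all positive -> minimum; mixed signs -> saddle;
# an eigenvalue near zero signals a ridge system.
eigenvalues = np.linalg.eigvalsh(B)
if np.all(eigenvalues < 0):
    nature = "maximum"
elif np.all(eigenvalues > 0):
    nature = "minimum"
else:
    nature = "saddle point"
print(eigenvalues, nature)
```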

If the system has only one response (one performance function), optimization is easily achieved by following the methods and analyses discussed above. Often, however, more than one response is involved, and sometimes the objectives compete with one another. Under these circumstances, methods of multiresponse optimization (MRO) are used to locate the optimum operating conditions. The simplest form of MRO is the graphical approach: predictive models of the form of equation 23 or 24 are built for each response, and their contour plots are superimposed to identify critical (optimum) regions. Examples of this graphical optimization approach exist in the literature (34-36).
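Numerically, superimposing contour plots amounts to intersecting the acceptable regions of the responses on a shared grid of the factor plane. The two response models and their acceptance limits below are purely illustrative.

```python
import numpy as np

# Two hypothetical predictive models (equation 23/24 form) over the
# same two coded factors; coefficients and limits are assumed values.
x1, x2 = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
yield_pred = 80 + 4*x1 + 2*x2 - 3*x1**2 - 2*x2**2 - x1*x2   # response 1
cost_pred = 10 + 2*x1**2 + 3*x2**2                           # response 2

# The overlap of the two acceptable regions is the graphical optimum region.
region = (yield_pred >= 78) & (cost_pred <= 14)
print("feasible grid points:", int(region.sum()))
```

The boolean mask `region` is exactly the area a reader would shade by eye on the superimposed contour plot.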

An improved graphical method has been presented (3) that allows three (instead of two) independent variables to be represented continuously and simultaneously in a contour plot for one or more discrete values of the response y. It facilitates the process of locating optimum regions in multivariable systems, and requires some computer programming. An extended discussion and an application example of the improved graphical method are available (3,36).

Several other MRO techniques exist, such as the generalized distance approach (36-39), the extended response surface procedure (39), the so-called overall desirability function (7,40), and the normalized function approach (41). The application of these techniques requires caution, because they may lead to unanticipated and practically undesirable conditions (7).
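As a minimal sketch of the overall desirability idea: each response is mapped to a desirability d in [0, 1], and the overall desirability is their geometric mean, which is then maximized over the factor settings. The target ranges and response values below are assumptions for illustration.

```python
import numpy as np

def d_maximize(y, low, high):
    """Desirability for a larger-the-better response."""
    return float(np.clip((y - low) / (high - low), 0.0, 1.0))

def d_minimize(y, low, high):
    """Desirability for a smaller-the-better response."""
    return float(np.clip((high - y) / (high - low), 0.0, 1.0))

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

d1 = d_maximize(82.0, low=75.0, high=85.0)   # e.g., a yield response
d2 = d_minimize(12.0, low=10.0, high=16.0)   # e.g., a cost response
D = overall_desirability([d1, d2])
print(d1, d2, D)
```

Because the geometric mean is zero whenever any single desirability is zero, a candidate condition that fails even one response is rejected outright, which is one source of the caution noted above.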

Evolutionary Operation (EVOP). This is an experimental strategy for improving industrial processes on the plant floor during operation. The theoretical basis, methodology, and some application examples of this method have been presented (42,43). The basic idea of EVOP is to change a current manufacturing process slowly and methodically during production until no further improvement is possible. For example, if a process is presently producing an output yp at factor settings x1p and x2p (Fig. 7), then according to EVOP, the factor settings should be changed to measure ya, yb, yc, and yd (in that order). These values are statistically compared with yp. If significant differences exist, the best value becomes the center (the new yp), and the process repeats until no further improvement occurs. Because experimentation takes place during production, it is important to avoid large changes in the factor settings, which may result in significant product variation or monetary loss. If applied wisely, EVOP provides a simple and straightforward technique for continuous process improvement. It works best when applied to simple systems with up to three independent variables (factors); therefore, its use should be limited to such systems.

Figure 7. An evolutionary operation (EVOP) strategy.

Taguchi Methods. A broad spectrum of new ideas on applied statistics and engineering characterizes this approach. Followers of these methods advocate a new philosophy on fundamental statistical theory, probability, experimental design, regression theory for modeling, quality and reliability, basic engineering design, and optimization. These ideas and this philosophy are generally referred to as Taguchi methods (44,45).

To begin with, every quality characteristic (performance function or response) y relates to some loss L if the characteristic is not exactly on target (if it is not an optimum). Traditionally, the loss function L(y) was thought of as being constant when y lay outside the specified limits (specifications, guidelines, or other quality-control requirements) and zero when y lay within those limits (Fig. 8). Expressed mathematically, this is

L(y) = 0 for LL ≤ y ≤ UL
L(y) = c for y < LL or y > UL

where c is a constant and LL and UL are the lower and upper specification limits, respectively. The Taguchi loss function is a quadratic expression (Fig. 8)

L(y) = k(y − a)²

where a is the target (optimum) value and k is a characteristic constant of the system. For a known loss value L corresponding to a given value of y, k is

k = L/(y − a)² (29)
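The two loss functions can be compared directly in code. The target, specification limits, and the known loss point used to calibrate k are assumed values for illustration.

```python
# A sketch comparing the traditional step loss with Taguchi's quadratic loss.
def traditional_loss(y, LL, UL, c):
    """Zero inside the specification limits, constant c outside."""
    return 0.0 if LL <= y <= UL else c

def taguchi_loss(y, a, k):
    """Quadratic loss L(y) = k * (y - a)**2 about the target a."""
    return k * (y - a) ** 2

# Calibrate k from one known loss value (equation 29): k = L / (y - a)**2.
a = 50.0                        # target value (assumed)
L_known, y_known = 4.0, 52.0    # a loss of 4 is known to occur at y = 52
k = L_known / (y_known - a) ** 2

print(taguchi_loss(50.0, a, k), taguchi_loss(53.0, a, k))
print(traditional_loss(50.5, 48.0, 52.0, c=4.0))
```

Unlike the step function, the quadratic loss is already nonzero just off target and keeps growing outside the limits, which is the substance of Taguchi's argument.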

This loss function may be asymmetric, but it should approximate the true loss. By definition, this is the total loss to society, not the loss to a company or an individual, when a product or process deviates from its target (optimum) value a. It has been further suggested that effort should be made to keep the process or product within a tolerance t from the target (a − t ≤ y ≤ a + t). For values of y outside the tolerance region (Fig. 8), the loss to society is greater than the amount of effort required to correct the process.

Figure 8. A comparison of Taguchi's loss function with the traditional quality function.

Another aspect of the Taguchi methods deals with the so-called factor space and states that all systems are affected (subject to changes) by some controllable and some uncontrollable factors. The controllable factors are the decision variables, which can be measured, controlled, and adjusted. The uncontrollable factors are either unknown variables or variables over which there is no real control. In either case, the effect of the uncontrollable factors is termed noise and should be minimized during optimization. Mathematically, this is

y = f(x1, x2, . . . , xn, z1, z2, . . . , zm) (30)

where y is the response (performance function), x1, x2, . . . , xn are the controllable factors (the normal decision variables), and z1, z2, . . . , zm are the uncontrollable factors (noise). To optimize this system, the optimum values x1*, x2*, . . . , xn* must be found such that

y* = f(x1*, x2*, . . . , xn*, z1, z2, . . . , zm)

In this case, y* is the value of y close to the ideal optimum, where the variability due to the effect of z1, z2, . . . , zm is minimum. An example application in this area would be the formulation of a cake mix that produces consistently good cakes under uncontrollable variations in baking temperature and time.
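The parameter-design idea can be simulated: evaluate each candidate controllable setting x under random draws of the noise z and favor the setting whose response varies least. The performance function, the noise model, and all numbers below are invented for illustration only.

```python
import random

random.seed(0)

def response(x, z):
    # Toy performance function y = f(x, z): quality peaks near x = 2,
    # while sensitivity to the noise z is smallest near x = 3 (illustrative).
    return 10.0 - (x - 2.0) ** 2 + z * (1.0 + abs(x - 3.0))

def mean_and_spread(x, n=500):
    """Monte Carlo estimate of the mean and variance of y at setting x."""
    ys = [response(x, random.gauss(0.0, 0.5)) for _ in range(n)]
    m = sum(ys) / n
    var = sum((y - m) ** 2 for y in ys) / n
    return m, var

# Scan candidate settings and pick the least noise-sensitive one.
candidates = [1.0, 2.0, 3.0, 4.0]
best = min(candidates, key=lambda x: mean_and_spread(x)[1])
print(best)
```

Note the trade-off this exposes: the setting with the highest mean response is not necessarily the setting with the smallest variability, which is why robust (parameter) design treats them as separate objectives.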

The Taguchi methodology goes further and divides the uncontrollable factors into outer noise (ambient temperature, relative humidity, etc.), inner noise (machine or equipment deterioration, component wear and tear, etc.), and between-product noise (product-to-product variation). Fractional factorial designs and orthogonal arrays are used to perform the experiments, and part of the analysis is the controversial signal-to-noise ratio.

Overall, the Taguchi methodology follows three basic design phases: (1) system design, the basic engineering design for plants, processes, and equipment; (2) parameter design, optimizing overall processing conditions; and (3) tolerance design, refining the system. This minimizes noise effects, reduces variation, and produces rugged and robust products, processes, and systems. Taguchi methods can also characterize and optimize multiresponse processes (46). Further information on this subject is available (43-46).

Artificial Intelligence. The increasing availability of low-cost computing power has made computationally intensive modeling methods more practical. These include neural network modeling (47), genetic algorithms, and fuzzy logic (48). These methods can provide models that conform more closely to the true response surface than traditional polynomials, but they require significantly more data to create such models.
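Of the methods named above, a genetic algorithm is the easiest to sketch compactly. The toy response surface, population size, and operator settings below are assumptions chosen only to show the selection, crossover, and mutation loop.

```python
import random

random.seed(1)

def response(x):
    # Toy one-factor response surface with a maximum at x = 1.5 (illustrative).
    return 5.0 - (x - 1.5) ** 2

# Initial population of candidate factor settings.
population = [random.uniform(-5.0, 5.0) for _ in range(20)]
for generation in range(60):
    # Selection: keep the fitter half of the population.
    population.sort(key=response, reverse=True)
    parents = population[:10]
    # Crossover (blend of two parents) plus mutation (small perturbation).
    children = []
    while len(children) < 10:
        p1, p2 = random.sample(parents, 2)
        children.append(0.5 * (p1 + p2) + random.gauss(0.0, 0.1))
    population = parents + children

best = max(population, key=response)
print(best)
```

Because the algorithm only evaluates the response function, never its derivatives, the same loop works unchanged on surfaces where the polynomial machinery of RSM would struggle.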
