y = Σ_j C_j Π_i x_i^(a_ij)   (16)

where the C_j are positive coefficients and the a_ij are constants.

### Optimization of Dynamic Systems

A system is dynamic if it varies with time or distance, and optimization must account for such variation. An example of a dynamic system is the time-dependent behavior of batch and continuous fermentation vessels. The problem of optimizing dynamic systems can be stated as follows (19): given the performance equations of a system (possibly a set of differential equations) and the initial and final values of some of the state variables, find the piecewise continuous control (decision) variable(s) that maximizes or minimizes the objective function.

Large dynamic systems with many decision variables can be broken down into stages and reduced to a series of interrelated systems, each containing only a few variables. The stages may be process components or equipment, units of time, or any other suitable entity. One of the methods for optimizing such multistage dynamic systems is dynamic programming. Bellman (22) presented the basic theory and developed the principle of optimality. It states that an optimal policy has the property that, whatever the initial state and initial decisions, the remaining decisions must constitute an optimal policy with regard to the state resulting from the first decision. The principle of optimality can also be stated mathematically (5). In simple terms, it says that in a multistage serial system every component affects every downstream component.

Figure 6 shows an example of a multistage process. Each stage n has an input (I_n), an output (I_{n-1}), a decision (d_n), and a return (R_n). The output and return depend on the input and the decision. A dynamic programming analysis usually begins with the last stage, which affects no other stage in the system, and ends with the first stage, yielding an optimum for every input. For a multistage process, it is necessary first to find f_1(I_1); with f_1(I_1) known, it is then possible to find f_2(I_2), then f_3(I_3), . . . , f_n(I_n).
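The backward recursion just described can be sketched as follows. This is a hypothetical illustration, not a routine from the cited literature: the stage-return and state-transition functions, the tiny state/decision sets, and the name `optimize_stages` are all invented for the example.

```python
# Backward dynamic programming over a serial multistage process.
# Stage n has input I_n, decision d_n, return R_n(I_n, d_n), and
# output I_{n-1} = T_n(I_n, d_n) feeding the downstream stage.

def optimize_stages(states, decisions, ret, trans, n_stages):
    """Tabulate f_n(I) = best total return from stage n down to
    stage 1, for every input state I, starting at the last stage."""
    f = {s: 0.0 for s in states}      # no return remains below stage 1
    policy = []                       # optimal decision per stage
    for n in range(1, n_stages + 1):  # stage 1 (last) first, stage N last
        f_new, best_d = {}, {}
        for s in states:
            # pick the decision maximizing this stage's return plus
            # the optimal return of all downstream stages
            candidates = [(ret(n, s, d) + f[trans(n, s, d)], d)
                          for d in decisions]
            f_new[s], best_d[s] = max(candidates)
        f, policy = f_new, [best_d] + policy
    return f, policy

# invented tiny instance: states 0..2, binary decisions
f, policy = optimize_stages(
    states=[0, 1, 2], decisions=[0, 1],
    ret=lambda n, s, d: s + d,             # made-up stage return
    trans=lambda n, s, d: max(s - d, 0),   # made-up state transition
    n_stages=2)
```

Each pass of the loop yields an optimum for every possible input to that stage, which is exactly why the analysis can start at the last stage without knowing what the upstream stages will deliver.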

Several other approaches can also solve dynamic optimization problems. Among those frequently applied are Pontryagin's maximum principle (19,22-24) and the calculus of variations (25).

If the form of φ is explicitly known, then any of the methods discussed earlier can be used to optimize the system. If the function φ is unknown or very complex, another, mathematically simpler function f must be found to approximate φ and describe the system. This new function,

y = f(x_1, x_2, . . . , x_n)   (22)

provides an estimate of y rather than its true value; it is called an approximating or graduating function and may take the form of practically any mathematical expression. The most commonly used expressions are polynomials of first or second order, given by equations 23 and 24, respectively.
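Equations 23 and 24 themselves are missing from this copy; in the conventional RSM notation (the coefficient symbols b_i are an assumption here), the first- and second-order polynomial models are typically written as:

```latex
\hat{y} = b_0 + \sum_{i=1}^{n} b_i x_i \tag{23}

\hat{y} = b_0 + \sum_{i=1}^{n} b_i x_i
        + \sum_{i=1}^{n} b_{ii} x_i^2
        + \sum_{i<j} b_{ij} x_i x_j \tag{24}
```

The cross-product terms b_ij x_i x_j in the second-order model capture interactions between factors, which the first-order model cannot represent.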

### Experimental Optimization

Experimental optimization methods incorporate elements of statistical thinking into traditional scientific, engineering, and mathematical modeling and optimization. When the behavior of a system or process is unknown and a sufficiently simple deterministic (mathematical) model cannot be developed for further analysis, experimental optimization may be used to analyze the system and search for optimum conditions. The term experimental indicates that physical experimentation is involved. Well-designed experiments may produce statistically sound data, which in turn may yield reliable empirical models. Such models can be used for prediction or optimization, and they often provide the information needed to develop either more rugged or deterministic models. There are three well-known and widely used experimental optimization methods.

Response Surface Optimization. The term response surface methodology (RSM) refers to a group of mathematical and statistical techniques that, through limited physical experimentation, provide the means for attaining optimum operating conditions of complex systems. The theoretical basis of RSM is described in reference 6; it is a powerful tool for experimental optimization.

The basic idea of RSM is that for any given system there must be a functional relationship φ that correlates the factors x_i (decision variables) to the response y (performance function), y = φ(x_1, x_2, . . . , x_n).
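As a minimal sketch of this idea with a single factor, one can fit a second-order approximating polynomial to experimental data and take its stationary point as a candidate optimum. The measurements below are invented for illustration:

```python
# Fit a second-order approximating polynomial (cf. the RSM idea of
# replacing an unknown φ with a simple f) to invented (x, y) data,
# then locate the fitted curve's stationary point.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # factor settings (made up)
y = np.array([2.1, 5.0, 6.2, 5.1, 2.0])   # measured responses (made up)

# least-squares fit: y ≈ b2*x^2 + b1*x + b0
b2, b1, b0 = np.polyfit(x, y, 2)

# stationary point of the fitted quadratic: dy/dx = 2*b2*x + b1 = 0
x_opt = -b1 / (2.0 * b2)
y_opt = b2 * x_opt**2 + b1 * x_opt + b0
```

Because b2 comes out negative for this data, the stationary point is a maximum of the fitted surface; in a real RSM study one would confirm it with further experiments near x_opt.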

[Figure 6. A serial multistage process: stages numbered n, . . . , 2, 1, each with a decision d_n and a return R_n.]