In the Laboratory

Introduction to the Design and Optimization of Experiments Using Response Surface Methodology


A Gas Chromatography Experiment for the Instrumentation Laboratory Patricia L. Lang,* Benjamin I. Miller, and Abigail Tuttle Nowak Department of Chemistry, Ball State University, Muncie, IN 47306; *[email protected]

It is commonly understood that statistical methods can increase the efficiency of an experiment or a laboratory process in which several parameters may control the outcome. Yet the typical chemistry major may never be exposed to any of the techniques used in the design and optimization of experiments (DOE), such as factorial designs, simplex designs, steepest-ascent approaches, and response surface methodology (1–5). These techniques are widely used in chemical research and industry to develop new processes and to optimize the performance of existing ones (6–8). While a complete course in DOE for the undergraduate student is not practical given the current ACS curriculum, a two-week module illustrating a few approaches to experimental design and optimization in the senior-level instrumental analysis laboratory can introduce undergraduate students to several design techniques.

For example, in a typical gas chromatography separation the analyst must choose the type of column, the oven temperature, the injector temperature, the detector type and conditions, and the ramping program and rate, among other parameters, in order to obtain a separation with adequate resolution in an acceptable amount of time. Must one vary one factor at a time over its range in a successive and lengthy set of trials while keeping the others constant? In any process, it is not uncommon for a factor to produce a different effect on a response at different levels of another factor. Such interactions between factors may not be observed with the one-factor-at-a-time approach (1–11). This article presents an approach to teaching students how to design an experiment that varies several factors at once and then shows how to model the response mathematically to determine the optimum experimental conditions.
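The interaction idea can be made concrete with a short numerical sketch. The factor names, coded levels, and elution times below are invented for illustration only (they are not data from this experiment): in a 2² factorial design, the effect of one factor is computed at each level of the other, and the difference between the two reveals an interaction that a one-factor-at-a-time study would miss.

```python
from itertools import product

# A 2^2 full factorial in coded units for two hypothetical GC factors.
# The four invented responses (elution time, min) contain an interaction:
# raising the ramp rate shortens the run far more at the high initial
# oven temperature than at the low one.
runs = list(product([-1, +1], repeat=2))          # (ramp_rate, initial_T)
y = {(-1, -1): 20.0, (+1, -1): 18.0,
     (-1, +1): 16.0, (+1, +1): 10.0}

def main_effect(factor_index):
    # Average response at the high level minus average at the low level.
    hi = [y[r] for r in runs if r[factor_index] == +1]
    lo = [y[r] for r in runs if r[factor_index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Interaction: half the difference between the ramp-rate effect
# at high initial_T and its effect at low initial_T.
ramp_effect_at_hi_T = y[(+1, +1)] - y[(-1, +1)]   # -6 min
ramp_effect_at_lo_T = y[(+1, -1)] - y[(-1, -1)]   # -2 min
interaction = (ramp_effect_at_hi_T - ramp_effect_at_lo_T) / 2

print(main_effect(0), main_effect(1), interaction)  # prints: -4.0 -6.0 -2.0
```

Varying the ramp rate while initial oven temperature is held at its low level would suggest a modest 2-min effect; the full factorial exposes the much larger effect at the high temperature in the same four runs.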
The students use experimental design software to determine the optimum set of GC conditions to adequately and rapidly separate four fatty acid methyl esters and then validate their results. Although a related undergraduate experiment has been published (9), it did not use factorial methods of design, and, furthermore, the lack of easy-to-use software available at that time made it difficult to generate a three-dimensional response surface. A more recently published experiment (10) uses fractional factorial analysis only as a screening experiment to identify important instrumental factors and does not use response surface methodology to find the optimal set of conditions. This article describes how to design and optimize an experiment with multiple factors and multiple responses.
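Optimizing several responses at once is commonly handled by folding them into a single desirability score, the approach taken by packages such as Design-Expert. A minimal sketch of that idea, with invented limits and response values (nothing here is taken from the experiment's data):

```python
# Derringer-style desirability sketch for two responses:
# maximize resolution, minimize elution time. All numbers invented.
def desirability_max(y, low, high):
    # 0 below `low`, 1 above `high`, linear in between.
    return min(max((y - low) / (high - low), 0.0), 1.0)

def desirability_min(y, low, high):
    # Mirror image: 1 below `low`, 0 above `high`.
    return min(max((high - y) / (high - low), 0.0), 1.0)

def overall(resolution, minutes):
    d1 = desirability_max(resolution, low=1.5, high=6.0)
    d2 = desirability_min(minutes, low=8.0, high=20.0)
    return (d1 * d2) ** 0.5  # geometric mean of individual desirabilities

print(overall(resolution=5.0, minutes=11.5))
```

The geometric mean forces a compromise: any single response with zero desirability drives the overall score to zero, so no response can be sacrificed entirely.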

280 • Journal of Chemical Education • Vol. 83 No. 2 • February 2006 • www.JCE.DivCHED.org



Theory

The first stage of experimental design is to become familiar with the particular experimental process under study and the variables, hereafter called factors, that may control the process. The experimenter should be able to identify the desired outcome(s) of the experiment and the possible factors along with their high and low operational ranges. For example, in this experiment we studied how the ramp rate, initial oven temperature, final oven temperature, and injection temperature influenced the resolution between two adjacent gas chromatographic peaks and the elution time. An experimental design is then chosen that provides adequate predictive power with the fewest number of experiments, since there are generally time or financial constraints on the extent of experimentation possible.

After the experiments are performed and the responses measured, the next stage of design is to perform an analysis of variance (ANOVA) to determine the statistical significance of the effect of each factor on each response. Although a detailed discussion of this algorithm is beyond the scope of this article, simply speaking, each effect is calculated by subtracting the average value of a response (for example, the time the last peak elutes in the experiment) obtained at the low factor level from the average value obtained at the high factor level (1). Invoking the central limit theorem, the factor effects are plotted on half-normal probability paper. Those effects that do not lie on a straight line can be viewed as significant, and those that do are viewed as noise, being randomly distributed about a mean of zero (1).

The response values can then be predicted using a linear model or an interaction model. If the response, Y, is not well modeled by a linear function, then a polynomial of higher degree must be used, such as that shown in eq 1.

Y = β0 + Σ(i=1..k) βi Xi + Σ(i=1..k) βii Xi² + Σ(i<j) βij Xi Xj + … + ε        (1)

where the βi are coefficients related to the factor effects, the Xi are the factors, and ε is the error.

The response can be represented graphically to help one understand and visually assess how the factors influence the response. This is especially useful for modeling and analyzing processes in which a response is influenced by several variables and the objective is to optimize that response. The collection of statistical techniques available to perform this analysis is named response surface methodology (RSM). Different RSM designs include three-level factorial, central composite, D-optimal, and Box–Behnken designs (1). An example of a central composite response surface is shown in Figure 1 for the proposed gas chromatography experiment, where the time the last peak eluted is a function of the ramp program rate and the initial oven temperature. In this plot one can observe that both factors influence the response and that there is an interaction between the two.

Experimental and Results


In the students’ first gas chromatography laboratory, they are introduced to the instrument, its operation, and its use to separate, identify, and quantitate using the area normalization method. This introductory experiment was also designed so that students could use the gathered data later in the term for a design and optimization laboratory. To accomplish this goal, the instructor generated a central composite design prior to the gas chromatography lab that involved 30 experiments and assigned 5 experiments to each of 6 lab groups, a number that could be completed in a 3-hour lab period. (The instructor could, of course, design any number of experiments to suit the number of lab sections and the time allowed.) This particular set of experiments also provided an 84–98% chance (depending on the factor) of detecting a factor effect as large as two standard deviations. Students were given a mixture of four fatty acid methyl esters to separate under different conditions of injection temperature, initial oven temperature, final oven temperature, and ramp program rate. Each group calculated the resolution between each pair of adjacent peaks and recorded the time the final peak eluted, for a total of four responses.
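The 30-run count above is consistent with the standard central composite layout for four factors: 2⁴ = 16 factorial points, 2 × 4 = 8 axial points, and 6 replicated center points. The following sketch lays out such a design in coded units; the rotatable axial distance α and the center-point count are textbook defaults, not values read from the class's Design-Expert file.

```python
from itertools import product

k = 4                      # factors: injector T, initial oven T, final oven T, ramp rate
alpha = (2 ** k) ** 0.25   # rotatable axial distance; equals 2.0 for k = 4
n_center = 6               # replicated center points estimate pure error

# 2^k factorial corners at coded levels -1 and +1.
factorial_pts = [list(p) for p in product([-1.0, 1.0], repeat=k)]

# Axial (star) points: one factor at +/- alpha, the rest at 0.
axial_pts = []
for j in range(k):
    for a in (-alpha, +alpha):
        pt = [0.0] * k
        pt[j] = a
        axial_pts.append(pt)

center_pts = [[0.0] * k for _ in range(n_center)]

design = factorial_pts + axial_pts + center_pts
print(len(design))  # 16 + 8 + 6 = 30 runs
```

Each coded row is then mapped onto real instrument settings (for example, -1 and +1 becoming the low and high oven temperatures) before the runs are assigned to lab groups.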


Toward the end of the term a lecture was given on factorial analysis during the regular class time. Soon after this lecture, each lab section met to analyze a simple 2³ factorial design using previously published data for the optimization of microwave popcorn conditions, which included two responses: taste and the number of unpopped kernels (11). The students then proceeded to analyze the collectively acquired GC data, which had been entered into a file before the lab period. It should be emphasized that the entire cache of statistics yielded by the analysis was not (and could not be, in the time allotted) explained. The emphasis was on the following: examining the coefficients and their 95% confidence limits; using the half-normal probability plot of effects to select model terms; inspecting the normal probability plot of residuals, the residual versus run order plot, and the predicted versus actual response plot; and checking for an optimal transformation of the response using the Box–Cox plot.

The students observed which factors and factor interactions affected each response. For example, the injection temperature did not affect any response, while the initial oven temperature affected all responses. Students then generated response surfaces for all responses; an example is shown in Figure 1. They were then asked to discuss in their lab notebooks whether these data made sense in light of chromatography theory. Finally, the students used an optimization algorithm to determine the optimum conditions and then validated their predictions with an experimental run. These values are shown in Table 1.

One student noted that the resolutions were not as “easy to predict” as the time the last peak eluted and speculated that this was due to the band spreading that may result from injection technique. Others in the class speculated that this would improve with an auto-injector, or at least by having one person, rather than 23 individuals (the class size), perform the injections. The students then compared their results with those obtained by a graduate student who alone had performed the same set of experiments. The 95% confidence limits for the predicted values that the graduate student obtained ranged from ±2% to ±12%, much smaller than the class’s values, indicating that sampling technique was indeed an important consideration.
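The model-fitting and numerical-optimization steps carried out in the software can be sketched as an ordinary least-squares fit of the eq 1 model followed by a search of the fitted surface. Every design point and response below is an invented placeholder for two coded factors, not the class's data:

```python
import numpy as np

# Invented face-centered composite design in two coded factors
# (say, ramp rate and initial oven temperature), one response per run.
x1 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0, 0, 0])
x2 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0, 0, 0])
y = np.array([3.1, 4.8, 2.6, 4.1, 3.9, 5.0, 4.2, 3.5, 4.4, 4.5, 4.3])

# Model matrix for eq 1 with k = 2:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 + error
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(A.astype(float), y, rcond=None)

def predict(a, b):
    return beta @ np.array([1.0, a, b, a * a, b * b, a * b])

# Grid search over the coded region for the largest predicted response,
# standing in for the software's numerical optimization step.
grid = np.linspace(-1, 1, 101)
best = max(((predict(a, b), a, b) for a in grid for b in grid))
print("optimum (predicted y, x1, x2):", best)
```

In practice one response surface is fit per response and the surfaces are optimized jointly (for example, through a desirability score), but the fit-then-search structure is the same.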


Figure 1. Response surface relating time, ramp rate, and initial oven temperature conditions for a GC separation (Design-Expert plot). [Surface plot not reproduced; axes: ramp rate, 5.00–50.00 °C/min; initial oven temperature, 100–200 °C; time last peak eluted, 15–30 min.]

Table 1. Student Optimization and Validation Data

Outcome                     Predicted Value^a,b/min    Actual Value^a,c/min
Time last peak eluted       10.6 (8.9–12.4)            11.5
Resolution of peaks 1,2     5.0 (2.5–10.1)             5.5
Resolution of peaks 2,3     6.4 (3.2–13.1)             7.8
Resolution of peaks 3,4     13.4 (7.5–24)              16.5

^a Optimal conditions: injector temperature, 274 °C; ramp rate, 50 °C/min; initial oven temperature, 124 °C; final oven temperature, 210 °C. ^b 95% confidence interval in parentheses. ^c Average of three data points.


Instruments and Reagents

An HP 5890A gas chromatograph equipped with a flame ionization detector was used with a fused-silica capillary column. The J & W Scientific (91 Blue Ravine Road, Folsom, CA 95630-4714) column was a 30-m, narrow-bore column with DB-5MS (5% phenyl) methylpolysiloxane/silicone rubber packing and had a 0.25-mm i.d. with a 0.5-µm film thickness. The chromatograph was connected to a PC equipped with Hewlett-Packard ChemStation B.02.04 software. Injections were performed in split mode: 0.5 µL was injected with the split ratio set to 75/1. The He gas flow was maintained at 1.8 mL/min, the He make-up gas at 30 mL/min, the air flow at 450 mL/min, and the H2 gas at 60 mL/min.

The fatty acid methyl esters were obtained from Polyscience, Kit 611C (Polyscience Corporation, 7800 Merrimac Ave., Niles, IL 60648). Methyl decanoate, methyl hendecanoate, methyl dodecanoate, and methyl tetradecanoate were used, dissolved in HPLC-grade ethanol. Stat-Ease Design-Expert 6.0.10 (Stat-Ease Inc., 2021 East Hennepin Ave., Suite 480, Minneapolis, MN 55413) was used to design the experiment and to perform the statistical analysis.

Hazards

Methanol is highly flammable, and short-term exposure can cause dizziness or blurred vision. Methyl decanoate is an eye and skin irritant: it may cause minor to moderate irritation to the eye, and prolonged skin exposure may produce minor to moderate irritation.

Acknowledgments

We would like to express our gratitude to Troy Rhonemus, Cargill, for his support, encouragement, and expert instruction; to Pat Whitcomb, Stat-Ease, for his help and expert




instruction; and to the CHM 420 class of 2004 for obtaining and analyzing the data. Finally, we would like to thank the Ball State University Department of Chemistry Undergraduate Research Program for its support.

Supplemental Material

Instructions for the students and notes for the instructor are available in this issue of JCE Online.

Literature Cited

1. Montgomery, Douglas C. Design and Analysis of Experiments, 5th ed.; John Wiley and Sons, Inc.: New York, 2001.
2. Anderson, Mark J.; Whitcomb, Patrick J. DOE Simplified: Practical Tools for Effective Experimentation; Productivity Press: Shelton, CT, 2000.
3. Box, George E. P.; Hunter, William G.; Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building; John Wiley and Sons, Inc.: New York, 1978.
4. Box, George E. P.; Draper, Norman R. Empirical Model-Building and Response Surfaces; John Wiley and Sons, Inc.: New York, 1986.
5. Myers, Raymond H.; Montgomery, Douglas C. Response Surface Methodology: Process and Product Optimization Using Designed Experiments, 2nd ed.; John Wiley and Sons, Inc.: New York, 2002.
6. Suliman, Fakhr Eldin O. Talanta 2000, 56, 175–183.
7. Caetano, Manuel; Golding, Rafael E.; Key, Edgar A. J. Anal. At. Spectrom. 1992, 7, 1007–1011.
8. Adinarayana, K.; Ellaiah, P. J. Pharm. Pharmaceut. Sci. 2002, 5, 272–278.
9. Harvey, David T.; Byerly, Shannon; Bowman, Amy; Tomlin, Jeff. J. Chem. Educ. 1991, 68, 162.
10. Stolzberg, Richard J. Chem. Educator 2001, 6, 291–294.
11. Anderson, Mark J.; Anderson, Hank. PI Quality 1993, July/August, 30.
