Analytical Chemistry: the Journal and the science, the 1970's and beyond


Bruce R. Kowalski
Laboratory for Chemometrics, Department of Chemistry, University of Washington, Seattle, Wash. 98195

Anal. Chem., 1978, 50 (14), pp 1309A–1313A


Over the last 50 years, ANALYTICAL CHEMISTRY has been successful at detecting the embryonic stages of new analytical methods and important trends in the development of the science of analytical chemistry. The JOURNAL has fostered these trends by attracting well-written expository papers and timely reports covering the most significant advances. ANALYTICAL CHEMISTRY recognized that the 1970's have been particularly rich in providing chemistry with new and improved analytical tools. As a carryover from the 1960's, analytical instrumentation continues to benefit from the semiconductor revolution in electronics and the development of a wide range of continuous wave and pulsed lasers, and laboratory minicomputers have led to very powerful midicomputers as well as low-cost integrated microprocessors.

The 1970's have also seen an emphasis on analytical measurement systems as two or more instruments are coupled to provide increased capability. The success and proliferation of the gas chromatograph/mass spectrometer pair is an example of an analytical measurement system, especially powerful when a minicomputer is included to control the two instruments and to store and process the data. The days when the spectrometer was considered an expensive detector or the chromatograph just a fancy inlet to the spectrometer have passed. Today, unique instrument combinations are demonstrating that the whole can be greater than the sum of the parts.

The early developments of electron spectroscopy in the 1960's have paved the way for a new subdiscipline sometimes called "surface analysis", which includes not only ESCA and Auger spectrometry but several new methods of probing the surface of materials, such as ion scattering spectrometry, backscattering spectrometry, secondary ion mass spectrometry, and several others. Here too, measurement systems are prevalent, as commercial instruments that can provide data from these and other experiments, all on the same surface sample, are available.

Liquid chromatography continues to enjoy the highest growth rate in analytical chemistry, both with respect to the number of papers published over each year of the 1970's and in terms of the current and projected sales of LC instrumentation. At the 1978 Pittsburgh Conference, the number of papers on LC combined with the number of papers on the growing

field of ion chromatography more than doubled over previous years. In these areas and in several others that were born or experienced rapid growth in the 1970's, ANALYTICAL CHEMISTRY has made a conscious effort to assist in the development of these new tools. Also, in the applied sciences where analytical methods are put to work, ANALYTICAL CHEMISTRY has again tried to attract the best papers showing unique adaptation of basic methods to current problems from each area and leave the more directly applied papers to the specialized journals. This is especially true of environmental science and clinical science, where the need for improved analytical methods is the greatest.

Perhaps the most clearly identifying aspects of the development of the science of analytical chemistry in the 1970's, and the ones on which this paper will focus, are the recognition of analytical chemistry as an information science and the realization that the advanced tools of mathematics and statistics can and must be employed to improve the measurement process and to aid in extracting the maximum amount of useful chemical information from analytical measurements. This new interest amounted to much more than the introduction of a new area of specialization: chemometrics (1). It actually signaled a change in the philosophy of instrument design as well as a willingness on the part of analytical chemists to take a broader and more active role in the overall scientific process of making measurements to acquire knowledge.

Prior to the 1970's the analytical chemist was usually concerned only with the measurement process. As mentioned above, instrumentation advances followed closely on the heels of advances in electrical engineering and computer science. Also, the problems associated with why major scientific studies involving analytical measurements were initiated, how samples were collected and treated, and how the data were analyzed were left to statisticians and engineers. The 1970's mark an end to the isolation of analytical chemistry, and the future holds much in store for the scientist at the very heart of information science: the analytical chemist. This report cites selected examples of analytical chemists using complex mathematical and statistical tools to improve the measurement process, and to properly design experiments and make better use of analytical


measurements in order to generate information for a world with many problems to solve. As is probably already evident, no attempt is made to review all of the important advances of the 1970's, a task which has been assumed by the FUNDAMENTAL REVIEW editions of ANALYTICAL CHEMISTRY. Rather, the thesis above is developed with a connection to the crucial role played by the JOURNAL, currently celebrating 50 successful years as the representation of a science.

Improving the Measurement Process

Transform Domains. In writing this report the author has broken with tradition and ignored the usual emphasis placed on the specialty fields of analytical chemistry. Instead, advances from several of the fields are used to show how the tools of information science have permeated these barriers. Actually, ANALYTICAL CHEMISTRY has been somewhat responsible for the rapid dissemination of these tools by inviting well-written expository articles (Table I) in the REPORT and INSTRUMENTATION sections, which are faithfully read by all analytical chemists as well as others.

To illustrate this point, consider the mathematical tool that has had the greatest impact on analytical chemistry in this decade: the fast Fourier transform (FFT). Although Fourier theory has been a well-developed area of applied mathematics for several decades, computer technology was responsible for providing the chemist with the power of this important method. First to appear in the chemical literature were applications to NMR and IR spectrometry. The impact that the FFT had on research in these areas was both immediate and substantial. Early INSTRUMENTATION articles by Loe (May 1969) and Farrar (April 1970) did much to spur an interest in analytical chemistry. However, in the author's opinion, a greater benefit to the philosophy of instrument design was realized by these pioneering efforts. Chemists learned that significant improvements to the measurement process could be achieved by making chemical measurements in one domain (time) and then, relying on theory and mathematical analysis, crossing over to a more familiar domain (frequency). This marked a real departure from the early analytical method development philosophy that, in effect, strived for generation of the simplest type of data

most easily read by humans. With FT-NMR and FT-IR, chemists were actually building sophisticated instruments that generated raw data of little use to the human interpreter until it could be translated by a computer. Driven by the overwhelming success of Fourier analysis in NMR and IR spectrometry, applications to visible absorption spectroscopy, ion cyclotron resonance spectrometry, and electrochemistry began to appear in the literature, with many of the early research papers and some excellent special reports again appearing on the pages of ANALYTICAL CHEMISTRY.
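
To make the time-to-frequency crossover concrete, the sketch below (an addition to this text, not part of the original article) simulates a free-induction-decay-like signal and transforms it with NumPy's FFT; the frequencies, decay constants, noise level, and peak-picking threshold are all invented for illustration.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)

# Time-domain "interferogram": two exponentially damped cosines plus noise,
# sampled every 1 ms for 1 s (all values invented for this sketch).
n, dt = 1000, 1e-3
t = np.arange(n) * dt
fid = (1.0 * np.cos(2 * np.pi * 60 * t) * np.exp(-t / 0.3) +
       0.5 * np.cos(2 * np.pi * 145 * t) * np.exp(-t / 0.2) +
       0.05 * rng.standard_normal(n))

# Cross over to the familiar frequency domain with the fast Fourier transform.
spectrum = np.abs(np.fft.rfft(fid))
freqs = np.fft.rfftfreq(n, dt)

# The two spectral lines stand out as peaks in the magnitude spectrum.
idx, _ = find_peaks(spectrum, height=0.2 * spectrum.max())
print("line positions (Hz):", freqs[idx])
```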

The philosophy behind the application of the FFT to electrochemistry is particularly interesting. As Smith points out in two well-written INSTRUMENTATION articles in ANALYTICAL CHEMISTRY

(Table I), the FFT is used to obtain on-line electrochemical relaxation measurements that can provide important kinetic-mechanistic information to the analytical chemist. Smith suggests, "One can program the mini-computer to extract from the admittance spectrum the essential kinetic-mechanistic data about the electrode process and use these data to either compensate the observed response mathematically for the undesirable kinetic effects or . . . inform the operator of the occurrence of an intractable kinetic situation which invalidates the assay procedure for the sample in question . . . ." He then says, "We are talking about a relatively novel procedure in which the status of an analytical probe's response characteristics is monitored and useful action is taken if that characteristic changes."

Spectrum and Waveform Analysis. In a series of papers and a special report published in ANALYTICAL CHEMISTRY in the early 1970's, Horlick (Table I) informed analytical chemists of the many additional benefits of Fourier theory besides using the FFT for transportation between two data domains. Convolution, deconvolution, cross-correlation, autocorrelation, and other operations from Fourier theory can be enormous aids to the analytical chemist in performing such tasks as noise reduction, resolution enhancement, and the separation (deconvolution) of the input and the transfer function from an instrument's output.
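
As a small, entirely invented illustration of the kind of Fourier-domain manipulation Horlick describes, the sketch below smooths a noisy synthetic band by convolution with a narrow Gaussian, carried out as a multiplication in the frequency domain via the convolution theorem; the band shape, kernel width, and noise level are arbitrary choices rather than values from any cited work.

```python
import numpy as np

rng = np.random.default_rng(1)

# A synthetic spectral band buried in noise (all parameters invented).
x = np.linspace(-10, 10, 1024)
band = np.exp(-0.5 * (x / 1.5) ** 2)
noisy = band + 0.15 * rng.standard_normal(x.size)

# Narrow Gaussian smoothing kernel, normalized to unit area.
kernel = np.exp(-0.5 * (x / 0.3) ** 2)
kernel /= kernel.sum()

# Convolution theorem: convolution in the data domain equals multiplication
# in the Fourier domain.  ifftshift moves the kernel's centre to index 0 so
# the smoothed output is not circularly shifted.
smoothed = np.real(np.fft.ifft(np.fft.fft(noisy) *
                               np.fft.fft(np.fft.ifftshift(kernel))))

print("residual noise before smoothing:", np.std(noisy - band).round(3))
print("residual noise after smoothing: ", np.std(smoothed - band).round(3))
```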


The emphasis placed on applications of Fourier theory above is not meant to indicate that the FFT is the only vehicle of transportation between two data domains or that Fourier theory is the only method of signal analysis. At least one other transform, the Hadamard transform, has been used to develop a unique IR spectrometer (Table I). As chemists explore the theoretical implications of the use of linear combinations of other basis functions, and even nonlinear transformations, the future will no doubt see the development of chemical instrumentation generating strange forms of output that must be translated by an electronic, optical, or mechanical robot before it can enter the relatively narrow region of human information perception.

Modeling. Research in the area of potentiometric titrations, an area of few major advances in the past several years, has generated an interesting example of the power of a mathematical method for extending the range of an analytical technique. In work done by Meites and coworkers (2, 3), the concentration of a solution of a very dilute weak base is determined by potentiometric titration using a standard acid. The interesting aspect of the paper is that the concentration of the base is so low as to preclude endpoint detection by several commonly employed techniques, including the second derivative method. In the companion paper, similar determinations are made without knowledge of the concentration of the standard acid! These remarkable feats are accomplished using nonlinear regression analysis to fit a model, derived from acid/base equilibrium theory and relating the pH of the solution to the volume of acid added, to the digitized titration curve. The model parameters that are returned to the chemist after data analysis include the concentrations of both the base and the acid, the equilibrium constant for the reaction, and the single-ion molar activity coefficient of hydrogen ions, which is dependent on the temperature and the titration medium.

Birks and coworkers at the Naval Research Laboratory have recently published the culmination of their efforts over this decade to provide X-ray fluorescence spectrometry with a means of performing accurate and simultaneous multielement analysis in thin or thick, homogeneous or heterogeneous samples (4). They use a model containing fundamental and empirical coefficients and a computer algorithm that employs iteration and multivariate, nonlinear regression analysis.
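
The flavor of this kind of nonlinear model fitting can be sketched with a deliberately simplified titration example (it is not the model or code of Meites and coworkers): a weak base titrated with a strong acid is described by a single charge-balance equation, the pH is solved numerically at each titrant volume, and nonlinear least squares then recovers both the base and the titrant concentrations from a noisy, digitized curve. The constants, volumes, and concentrations below are invented, and the activity corrections and equilibrium-constant fitting of the real work are omitted.

```python
import numpy as np
from scipy.optimize import brentq, curve_fit

KW = 1e-14   # ion product of water
KA = 1e-9    # dissociation constant of the protonated base (assumed known here)
V0 = 50.0    # initial volume of the base solution, mL (assumed)

def titration_ph(v_acid, c_base, c_acid):
    """pH after adding v_acid mL of strong acid (c_acid M) to V0 mL of weak
    base (c_base M total), from the charge balance [BH+] + [H+] = [X-] + [OH-]."""
    ph = []
    for v in np.atleast_1d(v_acid):
        cb = c_base * V0 / (V0 + v)      # total base, corrected for dilution
        cx = c_acid * v / (V0 + v)       # strong-acid anion, after dilution
        balance = lambda h: cb * h / (h + KA) + h - cx - KW / h
        h = brentq(balance, 1e-14, 1.0)  # bracketed root for [H+]
        ph.append(-np.log10(h))
    return np.asarray(ph)

# Simulate a digitized curve for a very dilute base, with a little pH noise.
rng = np.random.default_rng(2)
v = np.linspace(0.0, 10.0, 60)
ph_obs = titration_ph(v, 2e-4, 1e-3) + 0.01 * rng.standard_normal(v.size)

# Nonlinear regression returns BOTH the base and the titrant concentration.
popt, _ = curve_fit(titration_ph, v, ph_obs, p0=[1e-4, 5e-4],
                    bounds=(0.0, np.inf))
print("fitted c_base, c_acid:", popt)
```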

The method Birks and coworkers present would seem to have applicability to several other areas of analytical chemistry as well. These examples serve to show that analytical chemists have graduated from the eyeball approach of curve fitting to computer-implemented linear regression analysis for calibration curves and to the application of sophisticated mathematical procedures for fitting nonlinear models to their measurements. Since these tools extend the reach of analytical methods, their use will certainly continue to experience healthy growth. However, these powerful tools must be used with an element of caution and suspicion, since model-fitting computer programs are ignorant of chemical principles and will try to make models accommodate data even though the models may be incomplete or even totally incorrect.

Optimal Control. So far, examples have been given to show that mathematical methods, implemented by computers, can lead to new and improved instrumentation and extend the sensitivity and resolving power of analytical methods. Although computers were used for data acquisition and a few simple control functions in the

1960's, it was not until the early part of this decade that analytical chemists began scouring fields like operations research and control theory for more powerful control mathematics. First, more digital-to-analog components were used so that more instrument functions could be manipulated by computers. Then, univariate (one parameter at a time) control strategies were used in attempts to achieve optimal (e.g., greater sensitivity) control. It was not until 1968 that Ernst applied the simplex method of multivariate (several parameters at a time) control to rapidly optimize NMR field homogeneity by simultaneously varying the linear and quadratic y-axis gradients (5). Recognizing the general applicability and desirability of multivariate optimal control, Deming and Morgan, active in applications of the simplex method, wrote a special REPORT in ANALYTICAL CHEMISTRY

(Table I) describing the method and some potential applications. Since that time, new control methods have been proposed and applied to analytical instrumentation and analytical method development. These important tools are relatively easy to implement and should be appearing on commercially available chemical instrumentation in the future.
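
For readers who want to see what multivariate optimization looks like in practice, the sketch below (an addition to this text, not the procedure of Ernst or of Deming and Morgan) lets the Nelder-Mead simplex in SciPy, a close relative of the sequential simplex those authors describe, climb a made-up two-parameter response surface toward its maximum; the "flow" and "temperature" surface is pure invention.

```python
import numpy as np
from scipy.optimize import minimize

def negative_response(settings):
    """Simulated detector response as a function of two instrument settings
    (a carrier-gas flow and a temperature, both invented).  The optimizer
    minimizes, so the negative of the response is returned."""
    flow, temp = settings
    response = np.exp(-(flow - 2.3) ** 2 / 0.8 - (temp - 410.0) ** 2 / 900.0)
    return -response

# Start from a poor operating point and let the simplex crawl uphill.
result = minimize(negative_response, x0=[1.0, 350.0], method="Nelder-Mead",
                  options={"xatol": 1e-3, "fatol": 1e-6})

print("optimal flow, temperature:", result.x.round(2))
print("relative response at optimum:", round(-result.fun, 3))
```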

More Information from Analytical Measurements

In 1969 Jurs, Kowalski, and Isenhour described a computer program that could determine the chemical formula of a compound from its low-resolution mass spectrum (6). The mathematical method they used, called the "learning machine", appeared to allow a computer to "learn" certain rules of mass spectrometry by examining a library or "training set" of mass spectra. The authors joined forces with C. N. Reilley and, in the early 1970's, published reports of improved learning machine performance and extended applications to other spectral data. In 1971 Isenhour and Jurs prepared a REPORT (Table I) reviewing the early papers on learning machines. Kowalski, Schatzki, and Stross recognized that the learning machine was just one of several multivariate analysis methods under the general heading of Pattern Recognition.

Table I. Feature Articles on Information Science

1970
  REPORT, Feb. (Part I) and April (Part II): "Quantitation in Elemental Analysis", H. Kaiser

1971
  INSTRUMENTATION, Jan.: "Data Domains—An Analysis of Digital and Analog Instrumentation Systems and Components", C. G. Enke
  INSTRUMENTATION, July: "Fourier Transform Approaches to Spectroscopy", G. Horlick
  REPORT, Aug.: "Some Chemical Applications of Machine Intelligence", T. L. Isenhour and P. C. Jurs
  REPORT, Sept.: "Selection of an Optimum Analytical Technique for Process Control", F. A. Leemans

1972
  INSTRUMENTATION, Feb.: "Hadamard Transform Spectrometry: A New Analytical Technique", J. A. Decker, Jr.
  INSTRUMENTATION, May (Part I) and June (Part II): "Signal-to-Noise Enhancement through Instrumental Techniques", G. M. Hieftje
  REPORT, Sept.: "Inferences from Observations: Graphical Intuition to Bayesian Probability", P. C. Kelly

1973
  REPORT, Mar.: "Simplex Optimization of Variables in Analytical Chemistry", S. N. Deming and S. L. Morgan
  REPORT, Dec.: "Operations Research in Analytical Chemistry", D. L. Massart and L. Kaufman

1975
  REPORT, Jan.: "Computerized Signal Processing for Chemical Instrumentation", G. Dulaney
  INSTRUMENTATION, April: "Fourier and Hadamard Transform Methods in Spectroscopy", A. G. Marshall and M. B. Comisarow
  REPORT, Nov.: "Measurement Analysis by Pattern Recognition", B. R. Kowalski

1976
  REPORT, Jan.: "Limits of Analysis", T. Hirschfeld
  INSTRUMENTATION, Feb. (Part I) and May (Part II): "Data Processing in Electrochemistry", D. E. Smith
  INSTRUMENTATION, July: "Signal-to-Noise Ratio Enhancement by Least-Squares Polynomial Smoothing", C. G. Enke and T. A. Nieman

1977
  REPORT, July: "Analysis of Variance in Analytical Chemistry", R. F. Hirsch
  REPORT, Sept.: "Digital Analysis of Electronic Absorption Spectra", D. E. Metzler, C. M. Harris, R. L. Reeves, W. H. Lawton, and M. S. Maggio

1978
  INSTRUMENTATION, July: "Errors in Computer Data Handling", J. W. Cooper

In 1972, in ANALYTICAL CHEMISTRY, they published a research paper (7) in which several pattern recognition methods were used to determine the origin of geological samples by X-ray fluorescence measurements. Since that time, several published papers have described applications of pattern recognition methods and other multivariate statistical methods to analytical measurements (Table I) to solve problems in such diverse areas as product quality control, clinical medicine, forensic science, and several more (8).

This increasing interest in the tools of multivariate analysis is significant for at least two reasons. First, these methods allow the luxury of, in effect, examining n-dimensional plots where samples are represented by several measurements, with each measurement corresponding to one axis in the plot. This new capability comes at an opportune time, as new multicomponent analysis methods (e.g., multielement atomic emission spectrophotometry) are providing several measurements instead of a single measurement per unit of time. Second, by understanding and using these methods to solve problems in applied science and engineering, the analytical chemist has accepted a stronger role in the science of the future. Examples of the new role are quite evident in such areas as clinical medicine and environmental protection, where experimental design and data analysis become as important as the selection of the optimal analytical method.
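A bare-bones illustration of the "learning machine" idea, an error-correction-trained linear classifier of the kind Jurs, Kowalski, and Isenhour used, is sketched below; the eight-channel "spectra" and class patterns are randomly generated stand-ins rather than real analytical data, and the code is an addition to this text, not a reconstruction of the original programs.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic training set: 40 samples x 8 measurement channels, drawn from
# two classes whose mean patterns differ (all numbers invented).
class_means = np.array([[5, 3, 8, 1, 4, 6, 2, 7],
                        [4, 5, 6, 2, 6, 4, 3, 5]], dtype=float)
labels = np.repeat([1, -1], 20)                       # +1 = class A, -1 = class B
patterns = class_means[(labels == -1).astype(int)]
patterns = patterns + rng.normal(0.0, 0.7, patterns.shape)
patterns = np.hstack([patterns, np.ones((40, 1))])    # constant term for the threshold

# Error-correction training: whenever a pattern falls on the wrong side of
# the decision surface, nudge the weight vector toward (or away from) it.
w = np.zeros(patterns.shape[1])
for _ in range(100):                                  # passes through the training set
    for x, y in zip(patterns, labels):
        if y * (w @ x) <= 0:                          # misclassified or on the boundary
            w = w + y * x

recognized = np.sign(patterns @ w) == labels
print("training-set recognition rate:", recognized.mean())
```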

The Future

It is appropriate to consider the future consequences of the point made earlier concerning a change in philosophy for analytical method development. Relying on a greater understanding of new mathematics and the role it will play in the future, analytical chemists can be less concerned with the information representation of their measurements and more concerned with information content. Methods can be developed that obtain mountains of raw data far too voluminous, or in an unfamiliar representation (domain, etc.) prior to data reduction or transformation, to be of use to humans. A few such "necessary evils" (e.g., GC/MS) are currently found in analytical laboratories since they are far too valuable to ignore in spite of the fact that much of the data they generate are wasted.

A few perceptive analytical chemists are well aware of this new potential, as the literature contains recent reports of instrumentation that can take data in two or more domains at once. For example, researchers in the field of surface analysis have recognized the sensitivity of ion probe techniques and have developed instruments, such as secondary ion mass spectrometers (SIMS), that have a multidimensional measurement capability. SIMS can measure a mass spectrum of ions sputtered as a result of bombarding a small area on the surface of a sample with a primary ion beam. The data generation capability of this method is staggering, as the primary ion beam can be raster scanned over two spatial dimensions, resulting in a complete mass spectrum measured at several points over a two-dimensional grid. This amounts to two spatial domains and a third spectral domain. If depth profiling methods are employed, three spatial domains and one spectral domain are obtained. Even though the entire data array can no longer even be visualized (a four-dimensional plot), several different spectral domains can be added, as instruments that allow simultaneous surface analysis methods to be applied are commercially available.

Recognizing that SIMS and other surface analysis methods are multidimensional analytical techniques, Morrison and coworkers at Cornell University published the first account of applying a digital image processing technique to a chemical image in the December 1977 issue of ANALYTICAL CHEMISTRY (9). Instead of using a raster scan ion microprobe, Morrison used an ion microscope "combining ion sputtering, mass spectrometric filtering and ion optics to give a spatially resolved mass analysis of the surface of a solid" (9). The Cornell group recognized that the higher-dimensional images produced by the ion microscopes were really multidimensional spectra and amenable to analysis by n-dimensional spectrum and waveform analysis methods, including those derived from Fourier theory. In their application they used the two-dimensional FFT to cross-correlate two images produced for two different elements on the same surface sample in order to optimally align the two images. The authors barely scratched the surface of possibilities, since in engineering there is a rapidly growing field called "image processing" that has already generated powerful


methods with applications to medical images, satellite photography, and many other types of images.
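
The core of that registration step is easy to demonstrate with synthetic data; the sketch below (an illustration added here, not the Cornell group's code) builds a smooth random "elemental image", shifts a noisy copy of it by a known offset, and recovers the offset from the maximum of the FFT-computed cross-correlation.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 128

# A spatially smooth random map stands in for an elemental image.
base = rng.standard_normal((N, N))
g = np.exp(-0.5 * ((np.arange(N) - N // 2) / 3.0) ** 2)
psf = np.outer(g, g)
psf /= psf.sum()
image_a = np.real(np.fft.ifft2(np.fft.fft2(base) * np.fft.fft2(psf)))

# A second "element's" image: the same map, shifted and slightly noisy.
true_shift = (7, -12)
image_b = (np.roll(image_a, true_shift, axis=(0, 1)) +
           0.02 * rng.standard_normal((N, N)))

# Cross-correlate via the FFT and locate the correlation maximum.
xcorr = np.real(np.fft.ifft2(np.fft.fft2(image_a).conj() * np.fft.fft2(image_b)))
row, col = np.unravel_index(np.argmax(xcorr), xcorr.shape)
recovered = tuple(int(p) - N if p >= N // 2 else int(p) for p in (row, col))

print("true shift:", true_shift, " recovered shift:", recovered)
```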

Since these tools offer much to these new analytical methods, ANALYTICAL CHEMISTRY has already made plans to include image processing as a topic for a forthcoming INSTRUMENTATION section article.

Another interesting example of a multidimensional analytical instrument is the videofluorometer developed by Johnson, Callis, and Christian at the University of Washington (10). Using a unique monochromator, the instrument produces a two-dimensional image at the exit aperture, which in turn is focused on a television camera with a silicon-intensified tube. The two-dimensional image, consisting of the complete emission-excitation fluorescence spectrum, is acquired and stored by a fast data acquisition interface in fractions of a second. Recognizing that interpretation of the large amount of data generated by the instrument could prevent the realization of its full potential, the researchers began testing a number of mathematical analysis methods best suited to multicomponent spectral analysis before construction of the instrument was complete. As a result of investigating the standard methods of multicomponent analysis (least-squares, nonnegative least-squares, and linear programming), the Washington group applied the matrix analysis method of rank annihilation to their data with excellent results (11), thereby providing analytical chemists with an important new tool.
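
The idea behind rank annihilation can be conveyed with a toy calculation; the sketch below is an addition to this text and uses a naive grid scan rather than the eigenvalue formulation of Ho, Christian, and Davidson. An emission-excitation matrix is bilinear in the component profiles, so subtracting the analyte's pure-component matrix scaled by the correct concentration collapses the rank of the mixture matrix, and the second singular value of the residual signals where that happens. All profiles, concentrations, and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

# Bilinear emission-excitation matrices: each component contributes
# concentration x (excitation profile)(emission profile)^T.
ex = np.linspace(0.0, 1.0, 40)
em = np.linspace(0.0, 1.0, 60)
analyte = np.outer(gauss(ex, 0.30, 0.08), gauss(em, 0.60, 0.10))   # unit concentration
interferent = np.outer(gauss(ex, 0.55, 0.10), gauss(em, 0.35, 0.12))

true_c = 0.37
mixture = true_c * analyte + 0.80 * interferent
mixture = mixture + 2e-4 * rng.standard_normal(mixture.shape)       # measurement noise

# Rank annihilation, grid-scan version: the correct concentration makes the
# residual matrix effectively rank one, so its second singular value collapses.
c_grid = np.linspace(0.0, 1.0, 1001)
second_sv = [np.linalg.svd(mixture - c * analyte, compute_uv=False)[1]
             for c in c_grid]
c_hat = c_grid[int(np.argmin(second_sv))]

print("true concentration:", true_c, " estimated:", round(float(c_hat), 3))
```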

Conclusions

All too often the author, and probably every other analytical chemist as well, has heard ANALYTICAL CHEMISTRY referred to as "the journal of the straight line". Even if the expression may have had the ring of truth in the past, the accomplishments of the 1970's should be enough to bury it forever. In the 1976 FUNDAMENTAL REVIEWS issue of ANALYTICAL CHEMISTRY, Shoenfeld and DeVoe, the authors of "Statistical and Mathematical Methods in Analytical Chemistry," note: "Significant changes have occurred since the last review . . . we are only at the threshold of realizing the importance of statistical and mathematical techniques that have been reasonably well understood in the field of applied mathematics for decades" (12). In recognition of the increasing importance of applied mathematics to analytical chemistry, ANALYTICAL CHEMISTRY will include a Fundamental Review entitled "Chemometrics" beginning in 1980. Since this new specialty is becoming a formal part of graduate education in analytical chemistry at a growing number of academic institutions, this new service should be a significant educational aid in the training of modern analytical chemists.

Finally, the author wishes to apologize to his analytical colleagues for covering the developments of our science in the 1970's with such a narrow bandpass. All that can be offered in defense is the rationalization given in the introduction and a firm belief that analytical chemistry is moving to the forefront of the physical sciences as a science that exploits the limits of mathematics to obtain the information necessary to solve the problems of a growing society.


Acknowledgment

The author expresses his gratitude for help and suggestions made by R. M. Barnes, L. S. Birks, G. D. Christian, L. R. Field, W.E.L. Grossman, D. M. Hercules, G. M. Hieftje, R. Hummel, R. Kratochvil, H. A. Laitinen, W. S. Lyon, M. O'Donnell, D. K. Roe, H. H. Ross, J. T. Stock, and J. R. Wasson.


References

(1) B. R. Kowalski, Ed., "Chemometrics: Theory and Application", ACS Symp. Ser. No. 52, American Chemical Society, Washington, D.C., 1977.
(2) D. M. Barry and L. Meites, Anal. Chim. Acta, 68, 435 (1974).
(3) D. M. Barry, L. Meites, and B. H. Campbell, ibid., 69, 143 (1974).
(4) J. W. Criss, L. S. Birks, and J. V. Gilfrich, Anal. Chem., 50, 33 (1978).
(5) R. R. Ernst, Rev. Sci. Instrum., 39, 988 (1968).
(6) P. C. Jurs, B. R. Kowalski, and T. L. Isenhour, Anal. Chem., 41, 21 (1969).
(7) B. R. Kowalski, T. F. Schatzki, and F. H. Stross, ibid., 44, 2176 (1972).
(8) D. B. Pratt, C. B. Moore, M. L. Parsons, and D. L. Anderson, Res. Dev., Feb., 52 (1978).
(9) J. D. Fassett, J. R. Roth, and G. H. Morrison, Anal. Chem., 49, 2322 (1977).
(10) D. W. Johnson, J. B. Callis, and G. D. Christian, ibid., p 747A.
(11) C. H. Ho, G. D. Christian, and E. R. Davidson, ibid., 50, 1108 (1978).
(12) P. S. Shoenfeld and J. R. DeVoe, ibid., 48, 403R (1976).

