Instrumentation

Tomas Hirschfeld
Lawrence Livermore Laboratory, Livermore, Calif. 94550

The Hy-phen-ated Methods

GC-MS, LC-MS, GC-IR, LC-IR, TLC-IR . . . As the rising tide of alphabet soup threatens to drown us, it seems appropriate to look at the common denominator behind all of these new techniques, and of those we have not yet heard from. The hyphen, which is the single common constituent of all these acronyms, is also the symbol of their common principle: the marriage (sometimes a shotgun one) of two separate analytical techniques via appropriate interfaces, usually with the backup of a computer tying everything together. The motivation for the development of these systems comes from a number of converging reasons.

Increasing the Differentiating Power of Analytical Methods

The qualitative analysis task of recognizing an unknown as a specific compound out of a set of possible sample components is more and more often constrained by the large size of this set. In fact, the set of possibly present compounds is all too often that of all possible chemical compounds, a practically infinite number.

Work performed under the auspices of the U.S. Department of Energy under contract No. W-7405-Eng-48.


In deference to the need for interpretability, we can use the subset "all known compounds" instead of "all possible compounds," since the members of the set of unknown compounds are not recognizable. Even so, we are then talking about five to six million possible compounds, and this is indeed the approximate number of compounds that an analytical technique must differentiate from one another to do qualitative analysis, unless the set of the possibly present compounds can be substantially reduced using a priori information that happens to be available. It is a rare analysis indeed in which such information is wholly absent. Optimally, this information is in the form of specific inclusions in the set, as, for example, when a list of compounds possibly present is available. The set of possibly present compounds is then usually small, and qualitative analysis an easy task. However, the reliability of the analysis becomes limited by that of the list, and the workload of producing the list quickly becomes a limiting factor. Here, one must remember that a short list of compounds sought still requires a qualitative technique capable of differentiating a very large number of compounds, unless we can limit the number of other compounds possibly present.

More often, however, the available a priori information is nonspecific, giving us ill-defined categories (by chemical nature, physical properties, genetic relationships, etc.) of chemical compounds that may be present or, an even weaker restriction, that cannot be present. It usually is extremely difficult to cut down the set of possibly present compounds to reasonably small numbers by using such restrictions. The operation of properly cutting up the set of all possible compounds into subsets by such means usually becomes not only very laborious, but intolerably unreliable.

On the other hand, how large is the set of possibly present compounds that can be differentiated by individual analytical procedures? In the case of gas chromatography, for example, the largest differentiable set of randomly chosen compounds is that which produces no repeated retention times. From elementary statistics we can calculate this to be

$N_c = \sqrt{2R\,\ln(1/\alpha)}$   (1)

where N_c is the number of compounds permissible in the set, α is the probability of the desired absence of overlap, and R is the number of resolution elements in the entire GC run (defined by the number of resolvable locations after allowing for reproducibility limitations). This number is surprisingly small; for example, at α = 90% and R = 1000, it is only 14!

This seems clearly contradicted by practice, where many more than 14 compounds can be simultaneously differentiated by gas chromatography. This is usually accomplished by getting around the randomness assumption by an often lengthy search for columns, running conditions, and sample pretreatments. The resulting analytical technique achieves N_c → R by becoming very much nonrandom relative to the sample set in question.

In so using detailed methods development for each sample type instead of intrinsic power in the method, we have, however, mislaid the original qualitative analysis goal. Success in sample resolution at this stage is only brought about by having found out nearly everything about the sample beforehand.

As can be seen from Equation 1, resolution, even if very high, provides only rapidly diminishing returns in terms of the differentiable set size.

Table I. The State of the Art in Hyphenated Methods. [In the original, a matrix whose rows and columns are the techniques below, with each combination marked as presently successful, feasible in the state of the art, or requiring further invention. Techniques: gas chromatography, liquid chromatography, thin layer chromatography, infrared, mass spectroscopy, ultraviolet (visible), atomic absorption, optical emission spectroscopy, fluorescence, scattering, Raman, nuclear magnetic resonance, microwaves, electrophoresis.]

Increasing the dimensionality of the data, via multiple independent measurements, has a much higher potential, as it appears as a power function in R. In GC this may be accomplished by multiple column measurements (1). These, however, contribute to performance only to the extent that they work at all and are mutually independent. It can be proven that under these conditions

$N_c = \sqrt{2\ln(1/\alpha)}\; e^{-b(n-1)}\, R^{\,n(p-n+1)/(2p)}$   (2)

where b is the probability of a specific column-compound combination leading to an immeasurably high or low retention time, p is the number of independent parameters influencing the column-sample interaction (a measure of the mutual independence of a set of columns), and n is the number of columns actually used.

The improvement here is considerable, but usually not sufficient. If in the above example we use three columns with a given success rate each, and five independent parameters are assumed to control column-sample interactions, the maximum possibly present compound set size we can deal with increases to only ~300. Clearly, however, multiple independent measurements will greatly aid in differentiating all members of large sample sets. This can most conveniently be done by chaining together a number of instruments.

If, as described above, such instruments all operate on the same principle, construction economies will be achieved at the cost of reduced mutual independence and thus reduced differentiating power.

However, when the set of all possibly present compounds is large, a much more powerful technique is required. This can be attained if each compound in the set produces more than one data point per measurement. Such "rich" data sets are obtained, for example, in pyrolysis gas chromatography or in infrared or mass spectroscopy.


[Figure 1. Lawrence Livermore Laboratory Combined Analytical System. In the original, a block diagram linking a CIRA, a capillary GC, an LC, an FTIR, and an MS through GC/GC, LC/GC, LC/MS, and GC/MS interfaces, with Eclipse, HP MS, and Nova computers tying the system together.]

For rich data sets, an upper limit on the size of the differentiable set arises from the assumption that all intensities are equiprobable and mutually independent at all points. In this case

$N_c = \sqrt{2\ln(1/\alpha)}\; R_I^{\,R/2}$   (3)

where R_I is the number of intensity levels that can be repeatably resolved at each point. In practice all points are not mutually independent, and there is a substantial excess probability of zero intensity (depending on the "sparseness" of the spectrum) at many locations and intensities ≫ 0 at some others (X-H stretch in IR, CO and H₂O peaks in pyrolysis-GC). But, even with exponents ≪ R/2, very large compound sets (N_c > 10^7) can be differentiated by such rich methods.
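To make these magnitudes concrete, here is a minimal numerical sketch of Equations 1 and 3 as reconstructed above; the function names are illustrative, and the logarithmic form of Equation 3 simply avoids overflow for realistic R.

```python
import math

def nc_single_column(R: int, alpha: float) -> int:
    """Equation 1: largest random compound set giving no repeated
    retention times with probability alpha, N_c = sqrt(2 R ln(1/alpha))."""
    return math.floor(math.sqrt(2 * R * math.log(1 / alpha)))

def log10_nc_rich(R: int, R_I: int, alpha: float) -> float:
    """Equation 3 in log10 form: N_c = sqrt(2 ln(1/alpha)) * R_I**(R/2)."""
    return 0.5 * math.log10(2 * math.log(1 / alpha)) + (R / 2) * math.log10(R_I)

# The example from the text: alpha = 90%, R = 1000 gives only 14 compounds.
print(nc_single_column(R=1000, alpha=0.90))        # -> 14
# A "rich" method with, say, 10 intensity levels at 1000 points is
# astronomically more selective; the exponent is what matters.
print(log10_nc_rich(R=1000, R_I=10, alpha=0.90))   # -> ~499.7, i.e., N_c ~ 10^500
```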

Increasing the Separating Power of Analytical Methods

For most analysts, pure single component samples are one of the fond memories of graduate school; real life samples are quite different. It is thus necessary to unscramble these samples, either by analytical methods responding separately to each component or by physically separating all components from each other.

For any technique where a single compound gives a single reading, things are pretty straightforward, as the differentiating and separating functions are substantially identical under these conditions.

The situation changes when the method has had its differentiating power enhanced by using several serial measurements to increase dimensionality. Here the way in which multiple methods are combined becomes crucial (2). If measurements along each separate dimension (in the case of the GC example, a column) are done separately, then we must either have a pure sample or have a separate identifying tag on each data point to properly assemble their coordinates together in n-dimensional space (again in the GC example, peak intensities might be so used).

True multidimensionality by chaining several measurements will, in fact, require simultaneity, and thus extensive parallelism, at least after the first stage; or ways of tagging individual points; or stepped scan methods where the first stage advances one resolution element for each complete scan of the second one (as in the sketch below), etc.
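A minimal sketch of that stepped-scan bookkeeping, assuming hypothetical instrument-control calls `hold_first_stage` and `scan_second_stage`; the point is only that every reading acquires unambiguous (i, j) coordinates.

```python
def stepped_scan(n_first, n_second, hold_first_stage, scan_second_stage):
    """Stepped-scan acquisition: stage 1 advances one resolution element,
    then stage 2 completes a full scan, so each data point is tagged with
    its coordinates in the two-dimensional space."""
    data = {}
    for i in range(n_first):              # e.g., stopped-flow GC fraction i
        hold_first_stage(i)
        for j in range(n_second):         # e.g., one full spectral scan
            data[(i, j)] = scan_second_stage(j)
    return data
```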


When such true multidimensionality is achieved, separating and differentiating powers stay equal, and moderately complex samples with moderate numbers of possibly present compounds can thus be analyzed.

However, large numbers of possibly present compounds force us to use "rich" data field techniques, and these have little and often no capability of dealing with mixture samples. Routine analysis using rich measurements thus requires sample prepurification or fractionation. The use of a combination of instruments to accomplish this fractionation, followed by the unambiguous identification possible via rich analytical methods, is an extremely powerful general analytical method. In fact, it was for this task that the first popular hyphenated technique, GC-MS, was developed (3).

Synergism Between Methods

By combining several analytical methods, it is possible to pool their virtues. To start with, as described above, we can combine a high discriminating power in one of the instruments with a high separating power in the other, to accomplish general qualitative tasks more easily.

But there are a number of other such possibilities, which we are only beginning to appreciate properly. Among these is the complementarity of the qualitation and quantitation performance of the individual techniques now being combined. Here GC's excellent quantitation and poor qualitation are very well matched to the good qualitation and poor quantitation of either IR or MS.

A much more important synergism, however, lies in a task that most qualitative analysis methods today accomplish equally poorly: machine interpretation of the measured data. Even by taking advantage of all the potential of today's micro- and minicomputers, these searches are lengthy (and getting more so as their data bases increase) and impose quite heavy hardware (and thus cost) demands. Even for this they require condensed, preprocessed data bases, whose production from the available hardcopy data bases has substantial costs, delays, and error rates. The eventual availability of compcopy data bases will do much to ease this problem (4).

A more serious difficulty is that the final output of these searches is not a compound but instead a list of possible ones, sometimes with probabilities attached, from which the analyst must then choose using his preknowledge or by looking at the original uncondensed data. This may be bearable on an individual basis, but it becomes overwhelming for an instrument which, in the case of a GC-MS, can separate approximately 1000 single compounds in one working shift. It is for this reason that the work output of such an instrument is bottlenecked by the operator workload at this final processing stage.

This stage of the interpretation procedure can be substantially improved by combining more than one set of qualitative data—either the lower grade information contained in the retention time, or that from another instrument in a ternary combination—to resolve the indeterminacies in the first data step. A GC-IR-MS combination, for example, will produce two such lists of possible compounds for each GC peak, requiring only coincidence logic for faster, more reliable identification.
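As a toy illustration of that coincidence logic (the compound names and candidate lists here are invented): each spectrometer's library search returns a candidate list for the same GC peak, and their intersection is both shorter and more trustworthy than either list alone.

```python
# Hypothetical library-search outputs for one GC peak.
ir_candidates = {"n-hexane", "2-methylpentane", "cyclohexane"}
ms_candidates = {"n-hexane", "2-methylpentane", "2,2-dimethylbutane"}

# Coincidence logic: only compounds proposed by both spectrometers survive.
coincident = ir_candidates & ms_candidates
print(sorted(coincident))  # ['2-methylpentane', 'n-hexane']
```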

Last but not least, a substantial fraction of the compounds is normally not identified for lack of reference data on the pure compounds, which may not have been synthesized, or may not have made it into a reference data collection. The classical solution is to do the largest amount of structural chemical determinations possible. Mix with one top-notch analytical chemist, stir well, and wait about a month (5). Unfortunately, this level of data processing sophistication is far beyond today's computers. However, it is an exceedingly tempting ultimate goal for hyphenated instrument development.

Curtailment of Methods Development Efforts

As chemical analysis becomes more sophisticated and the variety of problems to be solved grows, methods development becomes a steadily growing portion of the workload. This is enhanced by the increased efficiency of automated techniques in coping with routine workloads.

Excess power in the analytical technique is by itself a good antidote for the optimization that is the major workload in methods development. The complementarity of two analytical techniques, the usual presence of an interfacing computer, and the reduction in background effects when a general separation procedure is used to initiate the analysis are all further steps in this direction. This is particularly so if the relaxed constraints on the various separate techniques are used to make them more universal. It is currently possible to process entire categories of industrial samples in GC-MS and GC-IR machines (6, 7) without any specific methods development whatsoever, using a compromise coating in a generously long capillary column and a general set of running conditions.

High dynamic ranges of specific compound concentrations often make methods development rather difficult, as new techniques must be developed for the concentration extremes. In current hyphenated instruments this range is considerably increased, as the dynamic range of the separation step often multiplies that of the identification one (for example, 10^3 in the separation stage and 10^4 in the identification stage give 10^7 overall). In practice, however, sample acceptance limitations in the first stage and sensitivity in the last one often become the limiting factor.

Further advances in the generality of "zero development" techniques for hyphenated instruments will be enhanced by adaptive feedback between the data and the operating conditions, improved performance in the component systems, and advances in data processing schemes.

The Technology of Hyphenated Methods

In discussing the technology of hyphenated methods, we must bear in mind that these techniques, as defined here, are more than the use of two techniques on a single sample for the same purpose. After all, this is a more or less traditional modus operandi in analytical chemistry. Instead, we consider a hyphenated instrument or method one in which both instruments are automated together as a single integrated unit via a hardware interface.


It is this integration that accounts for much of the advantage of these multi-instrument systems. The hyphen thus actually stands for some real and essential devices whose technology is often the limiting factor in the field (8). While this suggests some possibly more appropriate terminology and acronyms, none would be as succinct, and we should leave well enough alone.

Given the centrality of this integration, more discussion of it is warranted. Typically, the hyphenated instrument consists of two separate instruments, only moderately modified for their task, which have some capacity to stand alone on separation—a capacity that is being steadily eroded as their design optimization continues. Neither instrument represents the limit of the state of the art in its respective field, reflecting both economic constraints and the time lag involved in engineering the combination operation. Prospective users who would like to build an absolutely state-of-the-art hyphenated instrument by interfacing top-of-the-line instruments are well advised to reflect on this time lag, which applies to them as well. By the time they get done, the state of the art will have passed them by. In analytical chemistry, as in many other fields, do-it-yourselfers are less and less competitive with specialists.

The principal technological aspect of hyphenated instruments is the hardware interface, whose function is to reconcile the often extremely contradictory output limitations of one instrument and the input limitations of the other. Here one should, of course, try to have both instruments bend as far toward each other as they can. One important step here is to decide who adapts to whom. A classical example of this was recently seen in the GC-IR problem. The first approach adapted the IR spectrometer via the use of FTIR (7) and gave a $110K accessory to a $5K instrument, capable of moderate GC resolution and extremely good quantity sensitivity. A second approach, adapting the GC via the use of high pressure GC technology (9, 10), solved the problem with an $8K accessory for an $8K instrument, giving somewhat less GC resolution and extremely good concentration sensitivity. Both instruments eventually found quite different application fields.

The remaining discrepancy, however, must be resolved via hardware interfaces.

Historically, the mismatch between the dilute ambient pressure sample issuing from a GC and the concentrated low pressure sample required by an MS seems pretty tame today—after jet extractors, selective membrane permeation, and differentially pumped direct inlets have solved the problem. The current state of the art is aptly represented by LC-MS, which for interfacing purposes can be thought of as a GC-MS with three orders of magnitude higher carrier gas mass flow rates.

In the development of advanced interfaces there is an interesting convergent evolution with the manufacturers of instruments for automated chemistry (11), who have produced devices that rapidly carry out nearly all forms of laboratory microchemical operations and are easily adapted to hyphenated instrument interfacing.

The last major consideration in hyphenated instrument integration is the nearly universal built-in computer, be it a micro or a mini (a distinction which by now is self-evident only to advertising copy writers). The original justification of this computer, data processing and partial interpretation to keep abreast of the instrument's final stage, is still its main duty today. This is gross wastage of one of the major cost factors in the instrument, particularly since data massaging is a far more demanding function than the other tasks that could be entrusted to it. Present tendencies include the integration of the computer into both instruments, not only to reduce cost but also to increase feasibility while eliminating the confusing "mission control board" all too often associated with modern instruments. Other capabilities, such as on-line operator coaching, continuous self-recalibration and control setting reoptimization, "canned expert" operating routines, and unattended operation using sequential samplers and keyed sets of stored commands, are all possibilities that the computer brings with it in fairly short order. Eventually, feedback between the data and the operating settings of both instruments, a more complete utilization of all data from both instruments (such as GC retention times, peak shapes, and temperatures), and new operating modes such as correlation sampling will all become commonplace.

A trivial case of hyphenated methods exists when the two techniques to be used can be accomplished by the same instrument. Here, for example, there are the numerous types of NMR (multiple nuclei, lifetimes, double resonances, etc.) (12) which can be done within a single instrument. Chaining the different methods is here little more than a computer instrument control and data processing exercise.

Other such cases are the combination of fluorescence, scattering, and UV absorption in only slightly modified UV-VIS spectrophotometers, that of fluorescence and scattering in flow photometers (13), etc.

The easy implementation of these special cases unfortunately coincides, except for the case of NMR, with situations where data interpretation is complex and data bases scanty. In the case of NMR, however, such techniques have been brilliantly successful, and only the abundance of successful directions slows down the advance in any one of them.

The State of the Art in Hyphenated Methods

Historically, the development of hyphenated methods seems to have been a random process, with local equipment availability being at least as important a factor as inspiration. A matrix treatment of the problem, in which existing major instruments are used to form a square array whose individual combinations are then examined in detail, seemed a more systematic approach. An example of this procedure is shown in Table I, probably outdated as soon as it was written, which attempts to describe the present state of the art in hyphenated methods. The uncircled crosses are of special interest here, as they reflect present possibilities not fully developed as hyphenated instruments.

It might be useful at this point to briefly discuss some features of several of these techniques. Among these we have:

GC-GC. In the process of developing a GC-IR in which the gas chromatograph was adapted to the requirements of the infrared spectrometer, a high exit pressure GC was developed which could accept large (200 μl) injections, could operate in stopped flow for long periods without peak pileup, distortion, or diffusive smearing, and produced an exit stream with a very high volume concentration of sample, while developing a resolution of a few thousand theoretical plates (9). Such a device is also an ideal inlet for a capillary column GC, for which it fractionates a large initial sample into small fractions. These no longer overload the capillary and are themselves partially resolved, giving a significant gain in dynamic range. The use of a different stationary phase in the high pressure column, together with its stopped flow feature, also allows true two-dimensional GC, for an increase in both resolving and separating power. While the operation of the system is necessarily slow, enough of the resolution gets done in the first column so that the second one can be run fairly rapidly. Still, the analysis tends to be slow, and the intermediate valving required is fairly complex.


GC-OES. By operating on the edge of the vacuum UV, an optical emission spectrometer can be configured as a C, H, N, O, Cl, Br, I, S, and P direct-reading analyzer. Given enough accuracy, the machine can be used to obtain the gross formula of a GC peak, a possibility available until now only with very costly high resolution mass spectrometry. This accuracy had not been attainable until now because of interelement effects in the emission source, a problem only recently circumvented by using a plasma source (14). Such a gross formula readout on a GC would greatly supplement its resolving power, and if built into a ternary device using an IR or mass spectrometer it would greatly ease library searches or even alleviate the ab initio identification problem. The success of this technique depends on the prospects for a comeback of the GC-high-resolution-MS, whose high cost might appear bearable in comparison to a ternary system.

LC-IR. The technology of LC-IR is in a somewhat strange state today. Two manufacturers claim, on the basis of nearly indistinguishable data, that the technique is (15) or is not (16) successful. Except for the special case of gel permeation chromatography, it would appear that the nays are correct. Such caution is a refreshing phenomenon indeed. Meanwhile, the technique suffers from the overwhelming solvent signal, with which the limited dynamic range of an absorption measurement cannot cope. Straightforwardly evaporating away the solvent gives fairly complex interfaces whose compatibility with any sample volatility at all must still be established (17). Just a little more is needed; we are almost there. The nudge over the top may even now be forthcoming with the newly invented micro-LC systems, where considerable reductions in solvent flow have increased the exiting concentration of the sample considerably. Another field requiring investigation would be supercritical gas chromatography, a system intermediate between GC and LC, where IR compatible solvents would make a hyphenated system much simpler.

LC-NMR. The sampling requirements of a modern, high field, Fourier transform proton NMR and the effluent stream composition of an LC are reasonably close to each other. The mismatch between the sample volumes involved can be addressed quite simply by going to prep-scale LC columns.

The mechanical fixtures required for the interface will no doubt require hard work, but are not a feasibility bar. The sample requirements, while moderate, would certainly not allow microanalysis. The importance of LC as a nearly universal separation tool, and that of NMR as an equally universal and incomparably powerful structural analysis technique, make research in this area an intriguing task.

TLC-IR. Previous work in this area had shown TLC-IR to be feasible using FTIR techniques on a special TLC substrate of thin sheet AgCl (18). More recent work has shown that transflectance sampling on thin TLC layers on aluminized Mylar could give reasonable TLC-IR spectra on organic adsorbents such as cellulose and nylon (19). Accepting these constraints on either substrate or coating makes TLC directly workable at normal sample concentrations. To locate sample spots, light exposure to iodine vapor is adequate for spot observation, but has only a weak effect on the spectrum. This use of one of the most expensive of laboratory instruments as an accessory to just about the least expensive one is certainly amusing, but quite useful.

IR/UV/NMR/MS. Various permutations of these four instruments comprise most of the common combinations when several instruments are used separately on a given sample rather than as a hyphenated system. To combine them in an automatic system, using a computer and whatever interfaces seem necessary, seems obvious and is certainly feasible, as the respective sampling requirements are not incompatible. Here, however, feasibility is an insufficient test of whether the job needs doing in the first place. True, the four instruments are usually employed for ab initio identification, but they are used for more besides, and usually in the hands of four different and physically distant specialists. A combined instrument would probably have to be a dedicated one, which would need fairly heavy workloads to justify its cost. The automated interfacing of the instruments would save time only if sample measurement times were comparable with sample preparation times, a situation that has arisen only recently in these fields. Altogether, it is hard to visualize where the need for tying these instruments together in an automated system would come from. The situation is different when one talks about tying together their data systems, in an interactive operator-steered mode, for doing ab initio structural studies, and as an improved means of recognition qualitation.

Given the abundance of available options for hyphenated analysis, a systems engineering study of the potential gains from their use will provide valuable guidance for choosing among them.

Future Progress in Hyphenated Methods

The past few years have seen a boom in hyphenated methods, spearheaded by the success of GC-MS and rapidly expanding into many other combinations. It seems clear from Table I that there is plenty of room for future progress. Some of this progress, however, will not be so much in new combinations as in new hyphenated instruments lying outside the schemes discussed here.

One such advance will come from the computer side, as really large mass storage devices become affordable system components. A mass storage device with rapid access to >10^12 bits (and such systems are being announced now for minicomputers) is more than an improved computer peripheral. In fact, it is a qualitatively new addition to a multi-instrument system. Such a large capacity store will allow, for the first time, storage of entire unabridged data bases in a computer-accessible form. From this a number of major and minor consequences follow rather directly:

• It will be possible to prepare the condensed data bases required for rapid search in the machine itself, a less expensive process than the manual one and one free from encoding error. This will allow more complex or even multilevel condensation schemes, allowing better trade-offs between search speed and accuracy.

• Learning algorithms, which have already proven their ability to provide useful information even for samples not included in their original data base, will become more accurate and powerful, thanks to a much bigger training set.

• Research on the information content and distribution in complex spectra will be possible on a statistical basis, allowing the design of improved search algorithms and better ways of evaluating their shortcomings. It may surprise some readers to realize that the fundamental tenet of qualitative IR spectroscopy—that each chemical has a unique IR fingerprint—has never been credibly tested on a really large scale!


• To the extent that it is possible to define a priori the set of possibly present compounds, partial subsets of large data bases can be used to greatly reduce the search effort. This is presently done by manual subdivision of large data bases using informed guesses as to the needs of specific industries (and that's some guessing!), or by including enough supplementary information in the header of each spectrum to allow a quick presearch based on this information (as, for example, elemental composition) to reduce the total workload. With the large data base available in the machine, the automated assembly of such subsets would be a trivial and rewarding operation.

• Extending the above principle, a self-improving search algorithm could easily be devised in which a machine would automatically sort all spectra that had been detected once before in that laboratory into a subset to be searched first (see the sketch after this list). This type of learning from experience would be a fair imitation of the way experienced analysts operate.

• The laborious, nonautomatic finishing of machine searches could be substantially curtailed by going to the unabridged full data set for a detailed correlation search of the short list of candidate compounds produced by conventional searches.

• Spectrum-structure correlations, as used in ab initio structural analysis, could be substantially refined if detailed examination of 10^5-spectra data bases could be done by machine. I suspect the greater speed and much longer patience of the computer would enable it to effectively complement the greater processing power of the human brain in this task.
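A minimal sketch of that self-improving search, under the assumption that some similarity score `match(spectrum, reference)` is available; the names and the threshold are illustrative only.

```python
def search(spectrum, seen_before, full_library, match, threshold=0.95):
    """Search the laboratory's "seen before" subset first; fall back to the
    full library, promoting any new hit so the next search finds it faster."""
    for name, reference in seen_before.items():     # fast local presearch
        if match(spectrum, reference) >= threshold:
            return name
    for name, reference in full_library.items():    # full-library search
        if match(spectrum, reference) >= threshold:
            seen_before[name] = reference           # learn from experience
            return name
    return None                                     # unidentified compound
```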


The value of such data collections will, of course, greatly depend on their quality, but the great operating speed of modern instrumentation, often a key element of hyphenated systems, will greatly increase the rate of accumulation of new data. This will ease the obsolescence-size trade-off that has greatly reduced data base quality in the past.

Other developments in hyphenated instruments, still very much in their infancy, are the ternary and higher combinations. The basis for these is considerably more than a simplistic "if two are better than one, three should be better yet" reasoning. We have described above how the separating power of GC can be considerably boosted by going to two-dimensional systems. The advantages of MS-MS schemes, in providing data easier to interpret than conventional MS, have recently been described (20). The dynamic range of MS, long a limitation in ultratrace analysis, may be extended by a recently developed MS containing two quadrupole spectrometers in series, the first of which, run with a specially devised electrical waveform, behaves like a notch filter to screen out background (21). The advantages of combining MS and IR have already been mentioned, and of course, the advantages of a GC first stage would all still apply to such a combination. In fact, we see in Figure 1 an example of a quaternary system now under construction at Lawrence Livermore Laboratory.

Such high cost systems appear at first sight to be an extravagance affordable only for rich institutions or those likely to have an accidental juxtaposition of the appropriate basic equipment. But, given the cost and scarcity of highly skilled analysts, and the prodigal use of their time on problems that existing instruments can't cope with, it may be the manual operation that is the extravagant one.

Actually, there is a general approach to the problems posed by such expensive instrumentation, which is valid not only for hyphenated systems but for many top-of-the-line instruments. Such instruments are generally not only expensive, but also fast—fast enough so that, in practice, sample loading, clerical keeping track, and data interpretation bottleneck their operating speed. Technical improvements, sample handling automation, and the intelligence now universally a part of such instruments are delivering drastic increases in the sample throughput of such systems in the laboratory.

In fact, automatic sample changers and software advances are now making it practical to use intelligent instruments in unattended operation. This will eventually move the typical instrumental work week of 20 h to something closer to the 168 h actually available.

Only under these conditions, and provided there are enough samples, does it become possible, and indeed beneficial, to take advantage of such highly sophisticated instrumentation. Conversely, unless instrument development and utilization procedures are carried out with such systems engineering and economics in mind, the further development of analytical instrumentation runs the risk of strangling itself. Current developments in hyphenated instruments will thus advance not only the technology of analytical chemistry, but also our understanding and mastery of its underlying economic basis.

Tomas Hirschfeld received his chemistry Ph.D. summa cum laude (1967) from the National University, Uruguay. He is currently a chemist at the Lawrence Livermore Laboratory, an industrial consultant, and a visiting professor at Indiana University, Bloomington, Ind.




References

(1) R. R. Freeman, T. A. Rooney, T. M. Przybylski, and L. H. Altmayer, IR/D, p 142, September 1979.
(2) I. M. Warner, G. D. Christian, E. R. Davidson, and J. B. Callis, Anal. Chem., 49, 564 (1977).
(3) J. T. Watson, in "Auxiliary Techniques of Gas Chromatography"; L. S. Ettre and W. H. McFadden, Eds.; Wiley-Interscience: New York, 1969; p 145.
(4) R. Shaps, Sadtler Research Labs, Philadelphia, Pa., private communication.
(5) T. Hirschfeld, Anal. Chem., 48, 16A (1976).
(6) R. A. Johnstone, in "Mass Spectrometry"; C. J. W. Brooks and B. S. Middleditch, Eds.; The Chemical Society, Burlington House: London; Vol. 4, Chapter 7; p 146.
(7) D. L. Wall and A. W. Mantz, Appl. Spectrosc., 31, 552 (1977).
(8) P. J. Arpino and G. Guiochon, Anal. Chem., 51, 682A (1979).
(9) T. Hirschfeld and H. McNair, "Pressurized Stop Flow Gas Chromatograph Device," Application for Letters Patent, 1977.
(10) R. H. Shaps and A. Varano, Ind. Res., 19, 86 (1977).
(11) D. A. Burns, I. Fernandez, J. R. Grant, and A. L. Pietrantonio, Am. Lab., p 79, October 1979.
(12) L. Mueller, A. Kumar, and R. R. Ernst, J. Magn. Reson., 25, 383 (1977).
(13) M. R. Melamed, P. F. Mullaney, and M. L. Mendelsohn, "Flow Cytometry and Sorting"; Wiley: New York, 1979.
(14) D. L. Windsor and M. B. Denton, Appl. Spectrosc., 32, 366 (1978).
(15) K. Kizer, Am. Lab., 5, 40 (1973).
(16) D. W. Vidrine and D. R. Mattson, Appl. Spectrosc., 32, 502 (1978).
(17) K. H. Shafer, S. V. Lucas, and R. J. Jakobsen, J. Chromatogr., 17, 464 (1979).
(18) M. M. Gomez-Taylor and P. R. Griffiths, Appl. Spectrosc., 31, 528 (1977).
(19) T. Hirschfeld, "Advances in Coupling FT-IR to GC, LC, and TLC," presented at the FACSS Meeting, Boston, November 1978.
(20) R. A. Yost and C. G. Enke, Anal. Chem., 51, 1251A (1979).
(21) M. B. Denton, Univ. of Arizona, Tucson, Ariz., private communication.