Report

LIMITS OF ANALYSIS

Tomas Hirschfeld
Block Engineering, Inc., Cambridge, Mass. 02139

The ultimate limit of analytical chemistry can be as tautological as “measuring the total information content of a physicochemical system”. A better definition of the limitations of the maximum analytical task requires considerable attention to fundamental physical chemistry and information theory. Indeed, for complex systems, this process will illustrate the limitations of the defining process better than it does those of the analytical one. A manageable approach starts with the individual analytical chemical parameters whose extrema define maximum analytical tasks: qualitation, quantitation, available sample quantity and concentration, spatiotemporal localization, matrix effects, and reportability. The individual limits of these variables may then be considered in terms of some simple models. This paper briefly illustrates the results of such a procedure.

The utility of such a putatively philosophical task resides in the fact that several of these limits are no longer very far out of reach. Already a few instances have occurred where considerable research effort was devoted to goals that on analysis involved beating Heisenberg’s principle, Shannon’s limit, or the second law of thermodynamics, all of which can take rather surprising forms under extremal conditions. Beyond this negative task, such an analysis can pinpoint any unused degrees of freedom by which an encounter with these limits can be postponed, and thus serve as a guide for further research.

Qualitation Task

Three modes of qualitative analysis exist: confirmation, recognition, and identification. Of these, confirmation is so often an implicit part of an analytical procedure that it is not adequately considered as a qualitative analysis problem.

When building selectivity into an analytical procedure, the goal is to ensure a very high probability that whatever else has been found in previous samples or is expected in future ones does not introduce serious errors. Unless an analytical procedure also includes specific steps to confirm the identity of the analyte, there is always a finite (even if extremely small) probability of large errors, which may or may not have the suspicious appearance that suggests a recheck. Furthermore, since such errors are systematic, replicates are no help. Unfortunately, too many situations exist in which the price of some analytical errors, no matter how infrequent, is measured in lives, and is therefore unacceptable. “Zero defect” analytical procedures are unthinkable without qualitative confirmation.

Qualitative confirmation, in turn, requires redundant data which can be checked for consistency. This procedure can go through several stages of increasing sophistication and capability. It may consist of no more than replicate measurements with different techniques having as little as possible in common, or the use of a technique designed to give more than one data point (GC or UV peak location and intensity, or the ratio of two ordinates at different abscissas in a two-dimensional data field). Ultimately, curve matching or fingerprinting in “rich” two-dimensional data fields, preferably of universal response (IR, NMR, MS), can be employed. This may conveniently be implemented by subtracting the reference from the measured data and checking what is left.
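As an illustration of the simplest consistency check, the sketch below (with hypothetical absorbance values and tolerance; nothing here comes from the text) confirms an analyte by requiring that the ratio of two ordinates match the reference:

```python
def confirm(measured, reference, tol=0.02):
    """Redundancy check: the ratio of two ordinates at different
    abscissas must agree with the reference within tolerance."""
    ratio_m = measured["A_wl1"] / measured["A_wl2"]
    ratio_r = reference["A_wl1"] / reference["A_wl2"]
    return abs(ratio_m - ratio_r) / ratio_r < tol

# Hypothetical absorbances at two wavelengths for sample and reference.
print(confirm({"A_wl1": 0.80, "A_wl2": 0.41},
              {"A_wl1": 0.78, "A_wl2": 0.40}))  # True -> identity confirmed
```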



Recognition, probably the most common form of qualitative analysis, compares the sample to a reference data base. The technique to be employed is constrained mainly by the size of the set of possibilities among which the unknown must be located. Since the number of all possible compounds far exceeds our ability to list them, let alone search them comparatively, a subset of the chemical world must always be used here. To render such a subset useful, it must have a very high probability of including the compound sought, and it will therefore be large. Probably the worst case here is that of recognizing a low-level contaminant fractionated out of a sample, where a significant fraction of all compounds known must be included in the subset used. To give differentiable results for a large set, the data can be unidimensional (GC, melting point), bidimensional (most spectroscopic techniques), or multidimensional if possible (fluorescence, activation, and time-resolved spectroscopy, X-ray diffraction, ultramicroscopy, multicolumn GC, etc.). Furthermore, the curve describing the data should preferably have a large number of identifiable features (a “rich” curve). For example, GC curves having only a single peak per sample can resolve a sample set completely only if the number of samples is much less than the number of resolution elements in the chromatogram (N). On the other hand, in a “rich” curve such as pyrolysis GC of the same samples (if reproducible), with P peaks resolved into Z relative intensity levels, this upper limit will be (very roughly) N^P Z^P (1-3). For curves that are not “rich”, like UV or straight GC ones, an increase in dimensionality (fluorescence, multiple noncorrelated columns) is usually necessary.
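The set-size bounds just quoted can be made concrete with assumed numbers (the values below are illustrative, not taken from the text):

```python
from math import comb

N = 500  # resolution elements in the chromatogram (assumed)
P = 20   # peaks resolved in a "rich" pyrolysis GC curve (assumed)
Z = 5    # distinguishable relative-intensity levels per peak (assumed)

# Single peak per sample: at most N samples can be told apart.
single_peak_limit = N

# Rich curve: choose P of the N positions, each peak at one of Z levels;
# N**P * Z**P is the "very rough" upper bound quoted above.
careful_count = comb(N, P) * Z**P
rough_bound = N**P * Z**P

print(f"{single_peak_limit} vs {careful_count:.2g} vs {rough_bound:.2g}")
```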


Given the proper reference sample subset and a technique adequate for resolving it, the main limiting effect becomes the size of the search effort. Given all but the smallest sets, a direct sequential comparison cannot be performed without automation. In fact, for large sets even automated searches become impossible unless the data are condensed in some fashion. This condensation should preferentially remove features containing little or no information or which are unreliable. In IR searches, for example, intensity data are dropped, nonabsorbing regions become implicit, and absorption wavelengths are specified at reduced resolution. Eventually, however, data condensation will reduce the limiting set size or the recognition reliability. Thus, search technology and the available resources set an independent limit to set sizes. Set sizes somewhat beyond the maximum can be handled if the resulting small set of alternative possible samples can be further reduced by using the information condensed out, or any other available to the analyst, including a priori likelihood.

An alternative search technique, the hierarchical search, uses some of the sample features to define the sample as part of a subset of the master set, as sketched below. This procedure has substantially increased the set size limit for nonautomatic recognition searches. It may be automated provided the master set is so cross-indexed as to make its partitioning faster than sequential searching. Functional group recognition in several spectral regions is one of the most common hierarchical search modes.
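In its simplest form, a hierarchical search amounts to indexing the master set once by a cheap feature key, so that each query touches only one partition; a minimal sketch (invented feature flags, not any actual spectral library format) follows:

```python
from collections import defaultdict

# Hypothetical condensed records: (name, coarse functional-group flags).
LIBRARY = [
    ("compound A", frozenset({"C=O", "O-H"})),
    ("compound B", frozenset({"C=O"})),
    ("compound C", frozenset({"N-H", "C=O"})),
]

# Cross-index the master set once; partitioning then beats scanning.
index = defaultdict(list)
for name, features in LIBRARY:
    index[features].append(name)

def hierarchical_search(sample_features):
    """Return only the candidates sharing the sample's feature key."""
    return index.get(frozenset(sample_features), [])

print(hierarchical_search({"C=O", "O-H"}))  # -> ['compound A']
```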

Of course, recognition searches can be extended to larger sets by sequentially using several techniques. Then one has the option of using only part of the information provided by each to optimize search speeds or to set up a hierarchical search strategy. Eventually this can be automated in a fairly effective way. However, the largest data set in existence now is the IR, with ~10^5 spectra, followed by about half that number in MS and UV and somewhat less in GC and NMR (4, 5). Since these data sets only partially overlap, obligately sequential search techniques must necessarily start from a much smaller data base. Thus, multiple-technique recognition searches usually use one technique for searching and the others to resolve overlaps or for confirmation.

Identification, on the other hand, involves qualitative analysis of a sample for which no comparison data are available. Here we run into one of the basic limits of reference data sets: with the largest one available containing ~10^5 samples, and 5 × 10^6 compounds reported in the chemical literature, mutual randomness between these two sets and the frequency with which compounds appear in analytical samples would give the average recognition search a 10^5/(5 × 10^6) = 2% chance of success. Fortunately, the randomness assumption does not hold, and the collection samples actually account for as much as 50-60% of the samples actually looked at. This, however, is still a 40-50% failure rate in recognition analysis. Furthermore, as trace, ecological, and biological analysis progresses, the number of new compounds found in samples will grow faster than these collections, compounding the problem. A different problem with collections is that they represent the output of many years of work, making their average quality inevitably much worse than that obtained from samples measured in today’s instruments. Finally, the large-scale effort involved in building a collection ensures that some of the samples involved will contain unrecognized impurities or stability problems, or reactivity toward ambient materials, and thus contain a few extra features.

Then recognition searching becomes very chancy indeed.

A hybrid between recognition search and ab initio identification occurs when a search procedure is sophisticated enough to recognize not only a match but also a close miss, preferably with some indication of how close (6, 7). Then a failed recognition search will provide so much information about the sought structure that a later identification will start with the bulk of the work already done. Search algorithms of this nature are being developed in various forms of spectroscopy and offer the most immediate potential of circumventing the problems of recognition searches and data collections (which would only need to contain a few close relatives of the compound sought).
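Reduced to its core, such a close-miss search is nearest-neighbor ranking with a distance readout; the toy sketch below (hypothetical feature sets, with a Jaccard distance chosen purely for illustration, not any published algorithm) shows the idea:

```python
def close_misses(sample, library, k=3):
    """Rank library entries by Jaccard distance to the sample's features."""
    def distance(a, b):
        return 1 - len(a & b) / len(a | b)
    ranked = sorted(library.items(), key=lambda kv: distance(sample, kv[1]))
    return [(name, round(distance(sample, feats), 2))
            for name, feats in ranked[:k]]

library = {
    "ethanol":     {"O-H", "C-H", "C-O"},
    "acetic acid": {"O-H", "C-H", "C=O", "C-O"},
    "acetone":     {"C-H", "C=O"},
}
# Even a failed exact match localizes the later identification work.
print(close_misses({"C-H", "C=O", "O-H"}, library))
```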


A more fundamental solution, requiring fewer data and a much smaller workload, is the “synthetic” method of ab initio identification. Appropriate methods identify most of the functional groups (and some of their substitution patterns) present by looking for specific characteristic features in the data, and then synthesize the molecule by putting the puzzle together. To locate enough of the molecule’s substructural features, any single technique is marginal. Typical combinations used for organic molecules include IR, NMR, and gross formulas, combined at times with MS and UV. Other techniques used have been elemental analysis, molecular formulas, identification of decomposition products, etc. To automate such a concatenation effectively will require adaptive control of a number of analytical systems, requiring breakthroughs in automated sampling, externally controlled instruments, systems engineering, and software, which do not seem to be justified by the goal. A more worthwhile approach would be to perfect one of the subsystems involved to the point that one instrument alone could do the entire job by operating in a multiplicity of modes. This may eventually be true for NMR, where multinuclei operation, relaxation time measurements, double resonances, and so forth generate a large number of informational degrees of freedom in a single instrument by adjusting externally controllable parameters. It is not unreasonable to expect such a system, backed by at most a gross formula determination, to eventually provide automatic ab initio qualitative identification.




For all these modes of qualitative analysis, a number of unspoken assumptions have been used. A pure sample is essential to both recognition and identification; attempting to avoid this by using selectively blind techniques or data handling procedures is certainly risky and rarely failsafe, as problems can easily occur without advertising their presence. Furthermore, we neglected the often (but not always) simplifying case where a sample must be recognized only as a member of a class, which may be defined between wide limits. However, a larger (and more doubtful) assumption used was our ability to define what a chemical compound is. For instance, where does a heavily hydrated protein molecule end and the solvent begin? An often-used rule of thumb is to say that only covalent bonds define an organic compound, and hydrogen bonds are secondary perturbations. Thus, liquid acetic acid can be designated CH3COOH. But then single- and double-strand DNA are the same compound, which is clear nonsense. Perhaps an explicit criterion, based on the strength of the linkage of any part of the molecule to the whole, could be useful. And, finally, let us consider how broad a definition may be used for a “single compound”. For example, isotopic substitution species are generally regarded as the same compound.

Since their IR, NMR, and MS spectra will be different, the (usually justified) assumption of natural isotopic abundances is required. But whether a substance showing keto-enol tautomerism is a single compound or two depends on the measurement time frame being faster or slower than the equilibration time. Similar considerations apply to cis-trans isomerism at elevated temperatures, or to rotation about a single bond at low ones. And just where do folding changes in a protein molecule, with their profound effects on chemical and physical properties, define a new chemical structure? Again, some criterion based on the relationship between kT and the transition activation energy could be used if an appropriate standard time frame is adopted.
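One way to make such a criterion explicit (my gloss, using the standard Arrhenius form rather than anything stated in the text) is to compare the interconversion time with the measurement time frame t_m:

```latex
% Two interconverting forms count as one compound on a given time scale
% when the interconversion time is short against the measurement time:
\tau = \tau_0 \exp\!\left(\frac{E_a}{kT}\right),
\qquad
\tau \ll t_m \;\Rightarrow\; \text{one compound},
\qquad
\tau \gg t_m \;\Rightarrow\; \text{two resolvable species}.
```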

Quantitation Task

The limiting phenomena here arise both from fundamental reasons and from the limitations of the data base used. The basic fundamental limit is statistical. We must consider first the fluctuations arising from the stochastic variations in the number of sample particles within the sampling volume. These fluctuations become appreciable only when the number of such particles is small, requiring either very low particle concentrations or extremely small samples. At high sample concentrations, the problem is thus significant only in microscopic or ultramicroscopic analysis, as, for example, in electron microprobes or in Auger spectroscopy of surfaces. The low particle concentration case is much more widespread. For example, in the determination of low enzymatic or antibody activities in cells by microspectroscopy, the stochastic fluctuations are clearly limiting. In virology, we cannot determine viral concentrations in drinking water to 10% accuracy without viewing at least 10 l. of water (8)! Using a more “chemical” example, aerosol composition can only be quantitated to 1% if the least abundant particle categories are represented by 10,000 or more individuals. This problem is exacerbated if any polymerization, agglomeration, micelle, or microcolloid formation occurs. The limiting quantitative accuracy then deteriorates by √n (n = degree of agglomeration). This problem often occurs for fat-soluble molecules in plasma, which may be dissolved in its larger (and less abundant) fatty particles. An interesting example occurs for some biological stains which dissolve in water as 500-1000 Å aggregates. A 10-μM solution of these will require almost 10^6 μ^3 of observation volume for 1% accurate background measurements, ruling out microspectroscopy at this level of precision (9).
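These numbers follow directly from Poisson counting statistics; a minimal check (assumed particle counts only):

```python
from math import sqrt

def relative_error(n):
    """Poisson counting: the fractional fluctuation is 1/sqrt(N)."""
    return 1 / sqrt(n)

print(relative_error(10_000))  # 0.01 -> 1% quantitation needs ~10^4 particles
print(relative_error(100))     # 0.10 -> 10% accuracy from ~100 counted units

# Agglomeration into n-mers divides the number of independent units by n,
# so the attainable accuracy deteriorates by sqrt(n).
print(sqrt(25))  # 25-fold agglomeration -> error grows 5-fold
```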


A different situation arises if the particles are suspended in a fluid and undergo Brownian motion in contact with a far larger reservoir volume. Continuous observations over times long compared with the mean Brownian residence time in the observed volume will average these fluctuations out, ultimately to the fluctuation limit corresponding to the reservoir volume. Long Brownian residence times may be circumvented by flow methods. The statistical processes in the sample will be controlling provided the next steps are either nonstatistical (very high probability) phenomena or involve such a considerable number of individual responses that the statistical fluctuations of the latter are inconsequential.

Among the limitations imposed by the accuracy of the reference data, we must consider first just how the status of “reference” data is attained. Obviously, the data can be no more accurate than the limits of the measurements used to define them. This at once raises the problem of how to verify the accuracy (not reproducibility) of a superior method (10). Only in rare cases will it be possible to accomplish this verification by reference to theory, or to identify and quantify all the error sources in the new technique so as to calculate its errors from theory. The accuracy of a measurement could also be established by comparing it with the mean of a large number of determinations with somewhat inferior methods, but this is hardly practical. In practice, accuracy in a determination can be measured by testing it on a synthetic sample deemed to be pure. Other than the vagaries of the determination itself, we then have those of the extent of the sample’s assumed purity and assay. We take advantage of the fact that proving the persistent homogeneity and constancy in properties of a sample across a large enough number of attempted separation techniques amounts to proof of purity and 100% titer to the limit of the techniques used. However, this approach has become so utterly a matter of routine that all too often we tend to overlook its key loopholes. In the first place, its proof is statistical, and where extreme reliability is required, “large enough” is more than the customary couple of recrystallizations and melting point determinations. This can be a very decided nuisance if one is trying to create a large collection. Furthermore, there will always be samples whose impurities cannot be separated by any of the separation procedures available at the time. A forbidding example is the gradual realization, over the last few years, of just how impure some “standard” enzyme and antibody samples are, and of how much improvement they still require.
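The statistical character of that purity proof is easy to quantify under an idealized assumption of independent separations, each with the same chance of revealing a given impurity (the probabilities below are invented):

```python
def miss_probability(p_reveal, k):
    """Chance an impurity escapes k independent separation attempts,
    each with probability p_reveal of revealing it."""
    return (1 - p_reveal) ** k

# Two recrystallizations at 90% each still miss one impurity in a hundred;
# extreme reliability demands many more independent techniques.
print(miss_probability(0.9, 2))  # 0.01
print(miss_probability(0.9, 6))  # ~1e-06
```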


Clearly, it takes some time for a sample to rise to the status of a reference, and even more if an entire collection of such samples is required. This, in turn, creates an automatic generation gap between the instrumentation and techniques used for such references and those available to their users. This leaves us with the contradiction of reference sample collections being most quantitatively adequate when small (and thus young) and most used when large (and therefore old).

The applicability of such reference data, necessarily taken under a predetermined set of standard conditions, in the more variable world of actual analysis is itself uncertain. The physicochemical properties of a sample are only to the first order its own properties; in any higher degree of approximation, they are a function also of its surroundings (11). Even when in a given analytical problem these are known, compensating for the perturbations they induce would require vast additional amounts of reference data. Another approach, also not very practical, is to attempt to process every sample into the closest possible resemblance to the standard conditions used for the reference. An alternative procedure, probably the most practical when the concentration sensitivity of the analytical technique has not been exhausted, is to use a highly diluted standard and likewise dilute the sample. Sample-sample molecule interactions then drop quadratically and become relatively insignificant, while sample-surrounding interactions quickly become dominated by the sample-solvent ones, which are no different in the reference. This technique suffers from two drawbacks: it is not used often enough, and it depends on sample-surrounding interactions being weak enough that they cannot set up concentration gradients around the samples that resist dilution. In the latter case, accuracy requires either eliminating the offending constituent or, if known and available, adding it to the standard.
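The dilution argument above can be stated compactly (my notation: I_ss for sample-sample and I_sv for sample-solvent interaction rates, neither symbol appearing in the text):

```latex
% Diluting both standard and sample by a factor f:
\frac{I_{ss}(c/f)}{I_{ss}(c)} = \frac{1}{f^{2}}
\quad\text{(pairwise encounters scale as } c^{2}\text{)},
\qquad
\frac{I_{sv}(c/f)}{I_{sv}(c)} = \frac{1}{f}.
```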

Available Sample Quantity

Clearly, the fundamental limit here is the atomic, molecular, or particulate nature of the sample. However, we must remember that even if a single particle can be observed, the sample must contain five on the average to make the presence of one 99% certain (the Poisson probability of finding none is then e^-5 ≈ 0.7%). Both these limits move proportionately up if there is any form of agglomeration. To realize these limits, the phenomenon produced by the analytical interaction with the particle must either occur more than once per particle or be detectable at its unitary level. Phenomena that can be directly detected at the single level include photon emission at energies >1 eV, radioactive emission, exoelectron emission into vacuum, or recoil emission of the particle itself. However, at photon energies …

… >240°C boiling point. Equilibration with laboratory air must thus be avoided. Another problem is dust from the environment. This is augmented by the large output of complex vapors and aerosols from the normal human body, as has become evident from the space and military intruder detection programs. In particular, the aerosol effluvia have been proven responsible for the celebrated erroneous reports of “polywater” in 1973. Clean room conditions, with controlled air flow and the operator downstream of the sample, will eventually provide an answer to this.

Chemical reactions inside the sample cannot be neglected either. We must consider not only reactions between constituents, enhanced by preconcentration or analytical procedures, but also interactions with low-level active species produced by radiochemistry induced by ambient radiation levels. When the analyte is thermodynamically more stable than some abundant solution species, Peter’s equation predicts a minimum possible concentration for it: the one at which the concentration term drives the reaction’s potential beyond the activation one. Finally, we have all seen microorganisms visibly growing in 10% H2SO4, 5% AgNO3, or similar unlikely places. But only one bacterium per ml is a 1-ppt contamination, and it usually can metabolize several times its own mass every hour, often with quite outlandish by-products.

Probably the worst possible sampling situation applies to small particulates in liquids, for which the surface/volume ratio is high, and where all kinds of adsorbates can first be collected or generated by the larger volume of liquid. Furthermore, most particulate-liquid separation techniques include drying as a final step. Surface forces ensure that most of the dissolved solids in the collection surface’s residual liquid layer wind up at the particle.


An interesting effect may arise in fluid samples which, by virtue of temperature variations or agitation, develop gas bubbles between their collection and analysis. This will often provide us with an unsuspected and unintended example of the effectiveness of froth flotation in removing very low-level constituents from a solution. The current practice of not collecting the bubbles in the analytical aliquot should possibly be replaced by chilling the samples to redissolve all bubbles.

Spatiotemporal Sample Localization

Samples not uniform in space and time are becoming steadily more important in biology, material science, and chemical reaction studies. We must also consider the effect of extrinsic inhomogeneities generated in the sample by contamination, mixing problems, adsorption, etc. These must often be studied by the analyst, if only as part of the control procedures for developing an analytical procedure. The ability to produce high spatial resolution is limited in chemical analysis by the limitations of the next step in the sensing chain and by Brownian motion along every step of this chain involving chemical reactions. Eventually, the final sensor must be either a small-volume scanning one or an imaging type. The limits are set by either the mechanical construction limits of a physical sensor or the resolution limits of light or electron microscopy. For physical sensors, a structural limit of the order of 1 μ exists, but its ultimate value is open to question. Much better limits exist for electron microscopy, which with coming improvements in technology will probably exceed atomic dimensions. However, electron microscopy is limited by the sample damage it produces and by the lack of qualitative information from the techniques used at these resolutions. Optical microscopy, on the other hand, is limited in resolution by the wavelength used, and will be of the order of a few thousand Å. Various spectroscopic methods have produced resolutions better than a wavelength. Attenuated total reflection can produce resolutions approaching λ/20 in the vertical axis close to a high-index surface (16). Used in complement with holography, this resolution can be obtained triaxially.
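The λ/20 figure is of the order of the evanescent-wave penetration depth; for reference, the standard expression (my addition; see (16) for the full treatment, with n1 > n2 the crystal and sample indices and θ the internal angle of incidence) is:

```latex
% Penetration depth of the evanescent wave in attenuated total reflection:
d_p = \frac{\lambda}{2\pi\sqrt{\,n_1^{2}\sin^{2}\theta - n_2^{2}\,}}
```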


Intermolecular energy transfer between unlike fluorescent molecules, one of whose locations is known, can provide optical distance measurements down to a few angstroms in special cases (17). Generally, the wavelength limit of resolution may be avoided by using a high-resolution variation in modifying environmental conditions instead. Therefore, in NMR, resonances can be shifted in a space-dependent way by using a magnetic field gradient (18). Observation of a given frequency will then correspond to a given location, with a resolution determined by the gradient.
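In modern notation (my gloss, not the article's), the gradient maps position into frequency: for a gradient G along x, the resonance condition becomes position-dependent, and the attainable frequency resolution Δω sets the spatial resolution:

```latex
% Larmor frequency under a field gradient G along x:
\omega(x) = \gamma\,(B_0 + G x)
\qquad\Longrightarrow\qquad
\Delta x = \frac{\Delta\omega}{\gamma G}.
```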

In all cases, of course, spatial resolution is limited by signal strength. Furthermore, if scanning systems are used, the number of resolution elements that can be viewed depends on the time available for each resolution element. For imaging systems, the limitation is again the number of available resolution elements and the required field-of-view size.

When time resolution is desired, the most obvious limitation is the number of individual events generated by the measurement process within one time resolution element, which must be statistically compatible with the desired accuracy. This can be circumvented by time averaging across repetitive events if possible. More fundamentally, the time constants of the observation mechanism itself must be considered. If, for example, fluorescence is being used, the time resolution will be limited to a few nanoseconds by the usual fluorescent emission lifetimes.

An interesting case arises if the sample localization is somewhere other than the instrument. This can be relatively near to the instrument, as in measurements of what is going on inside a furnace or combustion chamber. This local inaccessibility, where it cannot be avoided by sampling probes, requires an analytical interaction capable of propagation to a distance. Given the further assumption of an intervening atmosphere, this implies the use of either neutrons or, preferably, radiation in the gamma through X-ray range or the UV through radiofrequency range.

Different considerations apply for large instrument-sample distances. For truly large distances, as encountered in astronomy, we are restricted to passive methods, and signal/noise considerations restrict us to the UV-VIS and microwave domains. For intermediate distances, such as we find in remote pollution monitoring or mapping (19), aerial prospecting, earth resources work, and a number of military applications, either passive or active devices are possible, but signal/noise is an overriding consideration. Among passive devices, we have the choice of using natural sample probes such as wind transport or traveling animals, or analyzing an entire line of sight through the atmosphere by using natural light sources, such as the sun or natural temperature differences, with a single-ended instrument pointed as desired. Active devices, on the other hand, employ an artificial source. We may use a distal reflector to give a double-ended (and thus not arbitrarily pointable) measurement of absorption along a line of sight. Or we can use a pulsed single-ended light emitter, which can be freely pointed, to excite Rayleigh, Mie, or Raman backscatter, or absorption from this backscatter, or fluorescence, which return to the instrument with a range-dependent time delay. The sensitivity possible in these instruments may be illustrated by the detection, from the ground, of the ultratrace concentrations of Na atoms in the atmosphere at 100-km altitudes (20). Photon shot noise in these instruments probably sets ultimate detection limits at medium ranges to 10^-3 ppt in fluorescence and ~1 ppm in Raman spectroscopy. Ultimate range resolutions are limited to a few cm, in fluorescence by fluorescent decay times, and in Raman by the transform-limited linewidth of the emitted pulse.
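The range resolutions quoted follow from the two-way flight time of the returned light; a small sketch with assumed limiting times:

```python
C = 3.0e8  # speed of light, m/s

def range_resolution(limiting_time_s):
    """Two-way path: range uncertainty = c * t / 2."""
    return C * limiting_time_s / 2

print(range_resolution(0.2e-9))  # 0.2-ns transform-limited pulse -> 0.03 m
print(range_resolution(0.3e-9))  # assumed sub-ns fluorescent decay -> 0.045 m
```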

The sample may be remote from the instrument in time rather than in space. This seemingly forbidding problem is an everyday one to geochemists, archaeologists, and forensic chemists. However, few general approaches have been developed in this field, and the multiple individual techniques that have evolved do not lend themselves to ready analysis.

Matrix Effects

Matrix components will affect analytical determinations through both direct interferences and perturbations of the sample’s properties.


Interferences will limit the least detectable quantity and concentration as described above, even if they are predictable enough that their effects can be discounted. This requires knowing the interferent’s properties in order to design a technique selectively blind to it, or to separate it out, or knowing its concentration so as to subtract its effect from the results. Where it is possible to obtain a blank sample, this can be done implicitly. However, if none of these can be done, a separation method is essential. If one knows what one is looking for, dry runs with the pure compound allow one to recognize the fraction of interest for further analysis.

Unfortunately, knowing what one is looking for is not always possible. Quite a few analyses are run on problem statements like, “What is abnormal in this sample?” (where “normal” is quite fuzzily defined) or, worse yet, “Is there anything harmful in here?” The latter question requires a definition of what is not harmful and constitutes the logical problem of a negative proof, not obtainable with any finite effort. However, as the famous judicial decision in the “squaring the circle” case states: “A mathematical impossibility is not a legal one.” Under these circumstances, it is easy to yearn for the simple old problems like the needle in the haystack.

The defensive practice of analytical chemistry thus requires detecting and measuring as many sample constituents as possible, known or unknown. This may be done by extensively fractionating the sample and analyzing every fraction obtained. The requirements for such separation techniques will be universality, high separating power, the largest possible number of resolution elements, high sensitivity, and a large dynamic range. Furthermore, the end product of the separation procedure should be compatible with qualitative and quantitative follow-on analyses. Liquid and gas chromatography are probably the most successful such techniques. GC has an advantage in sensitivity, dynamic range, and universality of detection.

LC, on the other hand, does not require a volatile sample, giving a greater universality of separation. But it is bottlenecked by the lower sensitivity and universality of its detection, a limitation which is probably not inherent. However, we must remember that LC, with volume concentrations of sample in its effluent similar to GC, has vastly lower mass concentrations, so that some of its apparent sensitivity loss is not real. Beyond this, LC, with the extra degree of freedom of its sample-solvent interaction, has a higher potential for selectivity. To have enough distinct resolvable elements to avoid overlap in complex mixtures, both techniques would benefit from multidimensional separation. This may be done by column concatenation in either technique, or by using two-dimensional techniques like TLC in place of LC.
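For ideally uncorrelated dimensions the number of resolvable elements multiplies, which is the attraction of such schemes (the capacities below are assumed purely for illustration):

```python
# Peak capacities of two orthogonal, uncorrelated separation dimensions
# multiply (an idealization; real columns are partially correlated).
n_first, n_second = 400, 150  # assumed single-dimension peak capacities
print(n_first * n_second)     # -> 60000 resolvable elements in two dimensions
```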

At present, the output of GC is more compatible with further analytical methods, but this is only a temporary limit. Present circumstances put GC at a slight advantage, while ultimately the advantage will lie with LC. Perhaps the optimal tradeoff may occur with techniques intermediate between GC and LC, such as ultrahigh-pressure GC (21).

The matrix constituents can perturb the properties of the sample either chemically or physically. Chemically, we must consider alterations of dissociation caused by common-ion effects and complexation effects with both the analyte and the reagent. Catalysis or catalytic poisoning of the analytical reaction, as well as any incompatibility with the reagents used, are other obvious sources of problems.

Physical effects can be considerably more subtle. Any micelle-forming constituents in the medium will in effect turn the medium into a two-phase system, a fact which will not always be observable. If the dispersed phase has selective affinity for the analyte, the latter will have an inhomogeneous spatial distribution. This profoundly affects statistical error levels, reaction kinetics, and, through the change in solvation, the sample’s physicochemical properties. If the matrix affects surface tension, the analyte’s distribution between the bulk and the surface will change, contaminants may be displaced from the walls, and the presence of other constituents in the surface layers may affect transport-type sensors such as electrochemical ones.


Furthermore, if the analyte is displaced to the surface, this will produce considerable alterations of its properties and reactivity. We must also consider the effect of the medium’s changes on these properties, as discussed before. Not only the analyte’s susceptibility to such effects matters, but also that of its not-too-short-lived reaction intermediates or excited states.

Reportability

Given the need for a multidimensional data field to identify a single sample constituent, and the number of constituents that can be found down to the trace level, a very large data volume can be generated rather quickly.

The situation is similar if a detailed spatial distribution or time evolution of a single constituent is desired. The very large volumes of data that must be obtained, transmitted, stored, and displayed for this purpose are a quite severe problem. While technological progress may offer a solution, eventually a limit will be set by the human observer’s rate of data assimilation. This ultimate limit can be overcome by a number of data compression schemes. Unfortunately, all such schemes represent an attempt to compete against well-developed data compression schemes in the human brain.


For example, comprehension speed for condensed representations of spectra seems to vary only slowly with the data compression. The most successful schemes are those that exploit well-developed mental habits, as when a spatial data field is displayed with the data content at each point processed into a false color scheme, as used in the earth resources photographs. Another scheme, used in astronomy, exploits the flicker detection ability of the human brain to study the colors of an assembly of luminous points by comparing two pictures taken through different filters in a flicker comparator.

More directly, one can attempt to actually eliminate part of the observer workload. Rather than give the observer the analytical data, it will soon be possible to provide him with partial qualitative data for each constituent. One can then conceive of an interactive programming scheme in which operator choice among these alternatives, verification, and a qualitative-quantitative description of the sample would follow. Eventually the whole process can be automated, and even further into the future, some of the interpretation of the sample composition may be automated as well.

To this futuristic scenario an equally effective present-day alternative exists. By using a simulated application array as an analytical procedure, we skip all the sample characterization steps and the evaluation of the results, and measure the sample’s adequacy for the job instead. Two assumptions are involved. First, that the simulation experiment is feasible. Second, that the consequences of a sample’s application can be detected more readily than all their possible causes. This is true often enough to make these techniques worth consideration.


References


(1) S. L. Grotch, Anal. Chem., 42, 1214 (1970).
(2) H. B. Woodruff, S. R. Lowry, and T. L. Isenhour, Appl. Spectrosc., 29, 226 (1975).
(3) G. T. Toussaint, Proc. 2nd Int. Joint Conf. Pattern Recognition, p 1, Copenhagen, Denmark, August 1974.
(4) Sadtler Commercial Spectra, Sadtler Research Labs, Philadelphia, Pa.
(5) E. Stenhagen, S. Abrahamsson, and F. McLafferty, “Registry of Mass Spectral Data”, Wiley, New York, N.Y., 1974.
(6) S. L. Grotch, Anal. Chem., 47, 1285 (1975).
(7) F. W. McLafferty, R. H. Hertel, and R. D. Villwock, Org. Mass Spectrom., 9, 690 (1974).
(8) E. L. Medzon, Symposium on Viruses in the Environment, Burlington, Ont., Canada, July 1973.
(9) R. Curbelo, E. R. Schildkraut, T. Hirschfeld, R. H. Webb, M. J. Block, and H. M. Shapiro, Histochem. Acta, to be published.
(10) T. Hirschfeld, Appl. Spectrosc., 24, 277 (1970).
(11) T. Hirschfeld and K. Kizer, ibid., 29, 345 (1975).
(12) E. E. Weir, T. G. Pretlow, A. Pitts, and E. E. Williams, J. Histochem. Cytochem., 22, 1135 (1974).
(13) B. Rotman, Proc. Nat. Acad. Sci., 47, 1981 (1961).
(14) B. Abermann and M. M. Salpeter, J. Histochem. Cytochem., 22, 845 (1974).
(15) G. K. Megla, Acta Cytol., 17, 3 (1973).
(16) N. J. Harrick, “Internal Reflection Spectroscopy”, Wiley, New York, N.Y., 1967.
(17) R. Peters, Biochim. Biophys. Acta, 233, 465 (1971).
(18) P. C. Lauterbur, Nature, 242, 190 (1973).
(19) T. Hirschfeld, E. R. Schildkraut, H. Tannenbaum, and D. Tannenbaum, Appl. Phys. Lett., 22, 38 (1973).
(20) R. D. Hake, Jr., D. E. Arnold, D. W. Jackson, W. E. Evans, B. P. Ficklin, and R. A. Long, J. Geophys. Res., 77, 6839 (1972).
(21) J. C. Giddings, M. N. Myers, and J. W. King, J. Chromatogr. Sci., 7, 276 (1969).


Tomas B. Hirschfeld, chief scientist, Block Engineering, received his BS, MS, and PhD degrees in 1957, 1965, and 1967 from National University, Uruguay, where he was an assistant professor in spectrochemistry until 1968. He has taught courses in spectroscopy at the Universities of Buenos Aires, Montreal, and California (Los Angeles). In 1966-67 Dr. Hirschfeld was a visiting scientist at the Basic Science Center, North American Aviation, Thousand Oaks, Calif. He became a consultant for Block Engineering, Inc., in 1966, joined the company as a staff scientist in 1968, and became chief scientist in 1971. His work includes research in reflection, Raman, Fourier transform, and fluorescence spectroscopy, on which he has published 100 papers and received 20 patents. He is a member of the Optical Society of America, the American Chemical Society, the Institute of Electronic Engineers, the American Institute of Physics, and the Canadian Spectroscopic Society.
