
REPORT


Measurement

To take full advantage of modern, solid-state, microcomputer-driven instrumentation that is equipped with sophisticated software, one should critically examine factors that can contribute to system instability, bias, and mistaken identification. Measurements that demand unusual system stability can benefit, as can interlaboratory analytical studies. Greater use of simulations and of studies directed toward estimating relative selectivities for techniques and procedures is recommended.


Lockhart B. Rogers
Department of Chemistry
University of Georgia
Athens, GA 30602

Introduction of the microcomputer has triggered what can be called the first wave of a revolution in laboratory measurements on chemical systems. In my case, the basic concepts were planted in 1968 at Lawrence Livermore National Laboratory when Jack W. Frazer proudly showed me a Digital Equipment Corp. PDP-8 minicomputer system, with its 4K of 12-bit-word memory and a Teletypewriter, capable of printing approximately 10 characters per second and providing input/output of programs and data by means of punched paper tape. Moreover, he had purchased the system for the unbelievably low price of $10,000. He had also added another 4K of memory (the maximum possible), a digital-to-analog converter (DAC), and an analog-to-digital converter (ADC) so as to be able to do instrument control and data acquisition as well as data processing. Furthermore, he had mounted both the computer system and the teletype on separate low, rolling platforms so that the system could be moved easily from one laboratory to another. He explained that he wanted to be able to take data from his instruments at short, reproducible time intervals and then examine his calculated results "at once."

That would allow him, if he desired, to change the conditions he had planned for his next experiment.

The second wave of the revolution was the substitution of solid-state microelectronics for the discrete, hand-wired (at first) circuits. Not only did the bulkiness of the computer shrink but also the cost and the time for a computer cycle. (Another major step toward further shrinkage is already on the horizon [1].) As a byproduct of that first major reduction in size and uniformity of the circuit components, the control of some temperature effects, a topic to be discussed later, became easier.

The third, most recent, wave has been the introduction of many complex mathematical calculations "at the bench." For example, rapid acquisition of multiple-wavelength spectral data for gasoline permits calculations, in less than 20 s, of the concentrations of three classes of compounds plus five physical properties that otherwise would have required five separate procedures (2).



More complex "chemometric" operations, such as multidimensional classifications of data, require longer times; but they, too, can now be handled by new laboratory microcomputers and workstations rather than by centralized mainframes or large minicomputers.

This REPORT is based on the award address given by L. B. Rogers when he received the 1989 Pittsburgh Analytical Chemistry Award at the 40th Pittsburgh Conference and Exposition on Analytical Chemistry and Applied Spectroscopy in Atlanta, GA, March 1989.

Goals

The combination of those advances and the awesome capabilities of some multiprocessor instrument systems has lulled many users into assuming that such a system automatically "takes care of everything." Therefore, one goal of this REPORT is to alert the user to the desirability of critically examining seemingly unimportant sources of uncertainty and/or bias, so as to take full advantage of the inherent capabilities of those systems. Furthermore, the same factors usually produce larger and more readily detected effects on interlaboratory data. For that reason, a brief introductory section will outline some important characteristics of such studies.

A second goal is to remind the reader that chemistry and physics can still intrude on system performance. Especially in trace-level determinations, where the coefficient of variation for replicates obtained for a concentration near 1 ppb can be ±50% or more (3, 4), the possibility of interfering signals also increases (5-8) and can, at worst, result in careful quantitative measurements being made on incorrectly identified signals.

Finally, two other goals are discussed briefly. The first is addressed to software problems. The use of simulated data to test the proprietary programs of instrument manufacturers, or, alternatively, the use of a validated program to analyze the raw data so that the results can be compared with those from the proprietary program, is very worthwhile.


The second concerns the desirability of developing ways to estimate relative selectivities of different techniques. Resolution is, of course, only one factor to be incorporated in such a calculation. Such information might be useful in selecting a technique, or a combination of two or more techniques, for doing a particular analysis.

I regret that some of the references are not to published work but to informal conversations with co-workers and other friends. In addition, some observations apply to outdated equipment, but they have been included because they illustrate a useful approach to a given type of error. Specialists in areas other than those for which specific examples have been given should have little difficulty in identifying appropriate analogies.

Interlaboratory measurements

Two different approaches are used in interlaboratory studies. In one approach, each laboratory analyzes the reference samples using the method in which it has the greatest confidence. In the second approach, used by the Association of Official Analytical Chemists (AOAC) and by the American Society for Testing and Materials (ASTM), a single method is agreed upon in advance for use by all of the laboratories (9). The discussion that follows is largely devoted to the latter.

A variety of factors that affect uncertainty and bias in interlaboratory studies have been reported (5-8, 10). Those that will not be discussed further are: the difficulty in obtaining a representative sample (11-14), sample storage (15, 16), and recognition by the analysts of samples used to test their proficiencies even though the concentrations are unknown to the analysts (17). Another factor that deserves some discussion, although it is not directly relevant to the main thrust of this article, is the existence of a broad "region of uncertain reaction" near the detection limit (16).




Figure 1. Total number of compounds verified by a laboratory, including those it detected that the other laboratory did not.


Figure 2. Change with time of coefficients of variation for the determination of pesticide residues in check samples of fat and blood by EPA contractors. Results are from 10-22 laboratories for 3-9 compounds. Solid circles, fat; open circles, blood.

Rather than reciting the hypothetical set of results that has been reported for intralaboratory results (19), this phenomenon is better illustrated by real data from an interlaboratory study undertaken by the U.S. Environmental Protection Agency (EPA) and the Dow Chemical Company (now Dow USA) in 1979 (20a). The EPA prepared a dilute solution of a very complex mixture of volatile organic compounds and then sent a portion to Dow Chemical Co. Each laboratory prepared for analysis a half-dozen dilutions that covered a range of between 1.5 and 2.0 orders of magnitude. Considering first the case for the most dilute samples in each set, 656 species were not detected by either laboratory whereas 132 were detected by both (a coincidental agreement). More interesting is the fact that there were some species detected by one laboratory but not the other: 132 for the EPA and 55 for Dow Chemical Co. As each laboratory analyzed the next more concentrated sample, the totals for the detected species increased in a rather smooth curve (Figure 1). However, the phenomenon of one laboratory finding a species that the other did not was reported over more than an order of magnitude. The ultimate test of the phenomenon was also observed: the failure of a given laboratory to detect at a higher concentration a compound that it had successfully detected at a lower concentration (20b).
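Because this REPORT recommends greater use of simulations, it is worth noting how easily such a "region of uncertain reaction" can be reproduced numerically. The following sketch is a minimal Monte Carlo illustration with hypothetical numbers (Gaussian noise and a fixed detection threshold); it is not a reconstruction of the EPA/Dow data.

```python
import random

def detection_probability(concentration, noise_sd=1.0, threshold=3.0, trials=10000):
    """Fraction of simulated measurements that exceed a fixed detection
    threshold when the true signal equals `concentration` (arbitrary units)
    and the measurement noise is Gaussian with standard deviation `noise_sd`."""
    hits = sum(
        1 for _ in range(trials)
        if random.gauss(concentration, noise_sd) > threshold
    )
    return hits / trials

# Near the detection limit, two equally competent laboratories can easily
# report opposite detect/non-detect results for the same sample.
for conc in (1.0, 2.0, 3.0, 4.0, 5.0, 6.0):
    p = detection_probability(conc)
    # Probability that one laboratory detects the species and the other
    # does not, assuming independent measurements of equal quality.
    disagreement = 2 * p * (1 - p)
    print(f"true signal {conc:3.1f}: P(detect) = {p:.2f}, "
          f"P(labs disagree) = {disagreement:.2f}")
```

With these made-up numbers, detection flips from unlikely to nearly certain over only a few-fold change in concentration, and the chance that two laboratories disagree is greatest in between, which parallels the behavior described above for the most dilute samples.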

The discussion will now focus upon groups of experimental factors that combine to produce two characteristics that are evident in interlaboratory studies. Horwitz (21) reported the behavior with time of the average annual coefficient of variation (CV) for data taken in an AOAC-type study. Figure 2 shows that, after starting out at a value 4-5 times larger than the CV obtained from within-lab data, the average value for the CV fell smoothly for a period of approximately five years before leveling off at roughly 1.5-2 times the size of the average within-lab value. Clearly, the initial decrease can be rationalized as an increase in the proficiencies of the experienced chemists with the method. This has been confirmed indirectly by Bulkin (22), who reported that the introduction of robots in his laboratories immediately produced data having less scatter (and better accuracy) for the proficiency standards. One would not, however, expect the use of robots per se to influence the second characteristic: the height of the plateau that was finally reached by the CV for the interlaboratory data. There is good reason to believe that the factors that cause that second difference are the same ones that are highly significant in day-to-day variations in intralaboratory measurements (23) as well as those made on a given day at high sensitivities: in trace analyses, in cases where signal averaging for long times is involved, and in differential measurements of two large signals.

In the discussion that follows, the experimental variables have been grouped under the headings of environment, hardware, and software. The discussion also addresses the question of interference and the related topic of selectivity.

Environmental factors

The term environment is used very broadly in this discussion. First, there is the laboratory environment with its airborne contaminants as well as its changes in temperature, pressure, and humidity. Then, there is the more specific sample environment that includes the container, reaction vessels, reagents, and any in situ sensors, stirrers, or other equipment to which the sample is exposed.

Chemical contamination. This first example was brought to my attention by C. J. Rodden (24), who worked at the U.S. National Bureau of Standards (now the National Institute of Standards and Technology) before and during World War II. The results obtained in his laboratory for low part-per-million levels of cadmium in uranium were higher and much more variable than those reported by a half-dozen other laboratories that were analyzing portions of the same samples. Data from Rodden's laboratory fell into line only after someone thought to remove the cadmium-plated ironware from the laboratory.


A variation of that theme was reported by Nordberg (25), who was attempting to follow depletion with time of drugs in hospital patients. His data were highly erratic until he isolated, in two separate rooms, the equipment such as syringes, standard solutions, and the washing operations for (a) the concentrated solutions that were to be injected and (b) the very dilute samples of drugs in body fluids that were withdrawn from patients for analyses.

Another example involves the work of Powell and Kingston described by Currie (26) on determinations of blanks for low part-per-billion levels of chromium. Figure 3 shows that high, variable results were obtained when analyses were performed in the normal way. However, the amounts and the variability were markedly decreased when the analyses were done in a "clean room." Finally, the lowest values and the smallest variability were obtained after adding a step to clean up the reagents. One must also be careful in the mixing of reagents when preparing metal-ion standards for multiple spectroscopic measurements (27).

It is important to realize that contamination of laboratory air also occurs via vapors of low-volatility organic compounds, such as phthalate esters (28), as well as those for the common, more volatile solvents (29). The latter could easily account for data obtained for successive 10-fold dilutions of a solution that still gave measurable IR absorption signals for a solute after 30 dilutions (30).

Temperature. Most scientists put too much faith in readings from room thermostats.

It is easy to forget that thermostats are carefully protected from damage, and from temperature changes, by shields that often have few perforations, and that the room temperature is controlled by blowing cold or hot air into the room through diffusion devices having unknown mixing effects. This was first brought to my attention in the mid-1950s by H. J. Keily, a graduate student at MIT, who complained of erratic behavior when he was performing thermometric titrations in a new humidity- and temperature-controlled room. He was using a Mariotte flask and a long thick-walled capillary tube to obtain "constant" flow of reagent into a Dewar flask that held his stirred sample. His setup was near, but not under, an air inlet in the ceiling. His problem was solved when he took the metal cover off the thermostat on the wall and directed a small fan at it from about 25 cm away, and set a large (~75 cm diameter) fan on a box on the benchtop at the opposite end of the small laboratory and directed it so as to blow air along a line about 1 m below the air inlets. A combination of more nearly uniform temperature of the room air and the faster response of the thermostat to a change in the average room temperature eliminated detectable effects on the titrations.

Recollection of Keily's experience helped to solve a related problem in the late 1960s, when J. E. Oberholtzer, a graduate student at Purdue, was assembling a high-precision, digitally controlled gas chromatograph using an oven capable of ±0.02 °C control while working in a room without temperature control (31).


Figure 3. Analytical blanks for chromium determination. Region I: ordinary laboratory atmosphere and reagents. Region II: purified atmosphere (clean room and clean laminar-flow bench). Region III: purified atmosphere and purified reagents.


The temperature of a thermistor in the oven was measured against a secondary standard resistance box in a bridge circuit on the benchtop. During the course of the day, the short-term mean temperature of the oven appeared to drift slowly in one direction by more than 0.1 °C and then back at the end of the day. By putting the resistance standard along with transducers in a large wooden box in which the temperature was controlled a few degrees above the highest temperature reached in the room, the desired long-term stability of the oven temperature was observed.

Everyone is familiar with the recommendation that an instrument should be turned on for warmup well in advance of its use for measurements. Some chemists also take an additional precaution and prepare solutions well in advance of use so as to minimize the effect of heats of mixing. Both precautions are essential if differential measurements are to be made. For example, in the late 1940s when differential measurements in the UV-vis were introduced, differences of only a few degrees in temperature between the concentrated reference and the slightly more concentrated sample solutions produced very large changes in the measured differences, even though a 0.01% transmittance difference was usually the smallest detectable!

A spectacular example of another effect of sample temperature has recently been reported by Allerhand and Maple (32). They found that resolution in NMR was greatly enhanced by controlling the temperature of the liquid sample to ±0.02 °C. The implications of this finding certainly extend beyond NMR measurements.

Pressure of the laboratory. Our first custom, high-precision gas chromatograph, which was mentioned earlier, boasted a large thermostatically controlled box that held all of the regulators and transducers (except the flame ionization and thermal conductivity detectors, which were sometimes individually thermostated), the digitally controlled sampling valve, and long sections of metal tubing (usually filled with coarse metal shavings to improve heat transfer) to precondition the carrier gases. Flows of carrier gases and those for the flame ionization detector were regulated by manually adjusted, high-precision pressure controllers because of the absence at that time of high-stability electronic flow controllers. One day, a frustrated R. A. Culp, a postdoctoral associate working on very high precision differential measurements of retention times (33), asked me to step into the laboratory next door.


He promised to generate peaks immediately upon my command. He stood behind me as I confirmed the flat baseline on the laboratory recorder that was connected in parallel with the ADC output. Then, yes, a sizable peak did appear at my request, several times! Each time, he had simply opened a door into the hallway! Clearly, the fans in our laboratory hoods were able to lower the atmospheric pressure in the room enough to result in a noticeable change in flow through the low-impedance gas line to the flame ionization detector which, in turn, resulted in a change in its signal. It certainly provided a "high-tech" way in which to confirm the arrivals and departures of his laboratory partners and visitors!

It is worth noting here another observation from the same study. By measuring directly, for successive chromatograms, the difference in the retention time between the peak for the internal standard and that for the solute in question, one could gain nearly an order of magnitude in reproducibility compared with taking differences between the averages for the absolute measurements of each peak. In this comparison, more factors than just the laboratory pressure were undoubtedly involved.

Returning to the effects of changes in laboratory pressure, one could produce not only isolated false peaks but also, as expected, noticeable effects on quantitative measurements of peaks for real components. Shortly thereafter, L. J. Lorenz, a graduate student, using the same chromatographic system to ensemble-average successive chromatograms for methane at a series of very low concentrations (34), found that he obtained responses for his series of standards that were noticeably closer to linear after he corrected for changes in barometric pressure at regular intervals during the day and, especially, from day to day.

Finally, a brief mention can serve to remind the reader of the many effects of partial pressure of water and its day-to-day changes, even in air-conditioned rooms when the humidity is not independently controlled. These include the weighings of hygroscopic and active-surface solids and absorption measurements in both IR and NMR (35).
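The order-of-magnitude gain from taking retention-time differences within each run, noted above, follows from the cancellation of drift that shifts both peaks of a chromatogram together. The sketch below illustrates that cancellation with invented numbers (a shared run-to-run drift plus small independent noise on each peak); it is not the data of reference 33.

```python
import random
import statistics

random.seed(1)

TRUE_T_STD, TRUE_T_SOLUTE = 120.0, 150.0   # "true" retention times, s (hypothetical)
RUN_DRIFT_SD = 0.50                        # drift shared by both peaks within a run, s
PEAK_NOISE_SD = 0.05                       # independent noise on each measured peak, s

diffs_paired, t_std_runs, t_solute_runs = [], [], []
for _ in range(200):                       # 200 simulated chromatograms
    drift = random.gauss(0.0, RUN_DRIFT_SD)
    t_std = TRUE_T_STD + drift + random.gauss(0.0, PEAK_NOISE_SD)
    t_sol = TRUE_T_SOLUTE + drift + random.gauss(0.0, PEAK_NOISE_SD)
    t_std_runs.append(t_std)
    t_solute_runs.append(t_sol)
    diffs_paired.append(t_sol - t_std)     # difference taken within each run

# Spread of the within-run paired differences (the shared drift cancels)
sd_paired = statistics.stdev(diffs_paired)
# Spread imposed on a difference formed from two absolute times measured
# in separate runs (the drift does not cancel)
sd_unpaired = (statistics.variance(t_std_runs)
               + statistics.variance(t_solute_runs)) ** 0.5

print(f"SD of within-run differences:            {sd_paired:.3f} s")
print(f"SD of a difference of absolute times:    {sd_unpaired:.3f} s")
print(f"improvement factor:                      {sd_unpaired / sd_paired:.1f}x")
```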

Hardware

One should always examine critically the basic design features of any new instrument. Oberholtzer and I bought ("sight-unseen") an oven from the Netherlands that users had recommended for high-precision control of the chromatographic column temperature. As soon as we saw the Beckmann thermometer, with its large bulb of mercury, that was used to detect changes in temperature, we put a thermistor in the oven. Instead of control to ±0.02 °C, the range was more than 10 times greater! Happily, the desired level of control could be obtained after the control circuit had been redesigned to incorporate a thermistor in place of the thermometer. Nevertheless, further examination showed that in spite of very rapid air circulation, when one simply moved the thermistor to different locations in the empty oven, the observed temperature was significantly different (but still under good local control). Because that meant we would not know the effective average temperature of the chromatographic column, we emphasized the precisions of our results rather than their absolute values.

One of the first inquiries we received after our first paper on high-precision GC was published came from a medical researcher who was using capillary gas chromatography to monitor the constituents of body fluids. Because the chromatographic run lasted more than an hour and there were countless peaks, the selection of internal standards was very difficult. Furthermore, he could not always be confident of having every species present in every chromatogram. His uncertainty in identifying a peak by retention time was such that he worried about misidentification of peaks. Hence, this researcher was keenly interested in improving the reproducibilities of his retention times.

Another factor to consider is the possible effect of "time jitter" (uncertainty) in data acquisition and control. When early laboratory computers (because of their cost) were used either for combined data acquisition and processing from multiple instruments or for combined instrument control, data acquisition, and processing for a single complex instrument, variable delays in responding to interrupts sometimes had noticeable effects on data. Although those errors can still be found (36), the use of relatively inexpensive coprocessors under such circumstances can now virtually eliminate the need for concern.

Two other factors that can contribute to day-to-day and lab-to-lab variability are often overlooked. The first involves the repeatability of setting instrument parameters ("resetability") before each measurement as opposed to simple repeatability of replicate measurements using the same settings.


For example, a chromatographic pump might bear performance specifications of 0.2% for repeatability and 0.5% for resetability. Because the latter figure includes "slack" or "play" in the adjustment mechanism(s), that figure can sometimes be improved by approaching the set-point from the same direction each time. The second type of set-point problem arises from having too coarse an adjustment device. In a recent study involving the use of a DAC to control the rate of a liquid chromatographic pump, it was found that resolution of the DAC did not contribute to the uncertainty; it was solely a case of resetability of the pump (37).

Finally, failure to calibrate instrument settings and performance is often overlooked. For example, most chromatographers take for granted the temperature settings on their injection ports, column ovens, detector ovens, and temperature programmers. An interesting situation is encountered in a control laboratory where a variety of instruments, differing by manufacturer and/or vintage, are used interchangeably for a given analysis. Marc Feldman (38) reported that when he measured the temperature of each column oven after using its instrument dial to make the setting, the actual temperatures for the half-dozen ovens in one group differed by more than ±10 °C. Although such an uncertainty may be relatively unimportant in analyzing simple mixtures using packed columns, for more complex samples and for long runs using capillary columns, such a range can put a difficult burden on the analysts and/or the software algorithms. A similar check on the temperature-programming hardware is also desirable.

Software

Over the years, one of the sticky problems has been the matter of proprietary software for peak deconvolution and baseline correction. One can understand why a company may not wish to divulge its algorithms. However, there are now relatively simple ways to avoid this problem without breaching confidentiality. They result from three factors: (a) the relatively inexpensive computer memory of all types, (b) the likelihood that a user will have access to another microcomputer in addition to the one in the instrument, and (c) the ease with which one can generate mathematically a simulated data set (39). Such a set has features that are known exactly, such as peak locations, peak areas, signal-to-noise ratios, and different types of baselines. One approach is to feed the simulated data into the instrument system and use the proprietary program to analyze it.
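As a minimal illustration of factor (c), the sketch below generates a synthetic chromatogram from Gaussian peaks whose retention times and areas are known exactly, adds a drifting baseline and random noise, and writes the result to a file that could be presented to the integrator under test. The peak shape, parameter values, and file format are assumptions made for illustration; reference 39 describes the general idea of simulated test data, not this particular code.

```python
import math
import random

def gaussian_peak(t, center, area, sigma):
    """Gaussian peak with an exactly known area."""
    return area / (sigma * math.sqrt(2 * math.pi)) * math.exp(-0.5 * ((t - center) / sigma) ** 2)

def simulated_chromatogram(n_points=6000, dt=0.1, noise_sd=0.002, baseline_slope=1e-4):
    """Return (times, signal, true_peaks) with exactly known peak properties."""
    # (retention time in s, area in arbitrary units, sigma in s): all hypothetical
    true_peaks = [(120.0, 1.00, 2.0), (180.0, 0.25, 2.5), (240.0, 0.05, 3.0)]
    times, signal = [], []
    for i in range(n_points):
        t = i * dt
        y = baseline_slope * t                      # drifting baseline
        y += sum(gaussian_peak(t, c, a, s) for c, a, s in true_peaks)
        y += random.gauss(0.0, noise_sd)            # detector noise
        times.append(t)
        signal.append(y)
    return times, signal, true_peaks

# Write the test set to a simple two-column file that can be fed to the
# program under test; its reported areas can then be compared with the
# known values in `true_peaks`.
times, signal, true_peaks = simulated_chromatogram()
with open("simulated_chromatogram.csv", "w") as f:
    f.write("time_s,signal\n")
    for t, y in zip(times, signal):
        f.write(f"{t:.1f},{y:.6f}\n")
print("known peaks (center_s, area, sigma_s):", true_peaks)
```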

A second approach is to feed a previously validated program into the instrument to analyze its raw data and compare the results with those from the proprietary program. Alternatively, one can take the raw data out to a stand-alone computer in order to use the validated program on it and then compare the results as before. Steps have already been taken by some instrument manufacturers to make such procedures readily possible. In addition, because governmental regulations dealing with Good Laboratory Practice Standards are forcing users to provide evidence of validation (40, 41), vendors of instruments and computerized data management systems are beginning to provide users with the necessary information (41).

In case those precautions seem unnecessary, consider the following. Papas and Delaney (40) have reported that when four chromatographic integrators were tested using simulated data, they obtained not only significantly discrepant results between instruments but also, for a given instrument, widely different effects of noise level and tailing on peaks having different heights and widths. When I later discussed their findings with Lubkowitz (42), I found that he had independently discovered the existence of such problems. By having the output of a chromatogram put into firmware, he, too, could feed exactly the same (but not validated) data into two or more instruments. He found that the programs for different computerized chromatographs and for chromatographic integrators sometimes produced quite different results! Careful calibrations, using for each "unknown" peak a "known" having nearly the same concentration, should help to minimize bias. Unfortunately, analyses of known concentrations will not be useful in improving the factors that result in variations in precision within one instrument (40).

One can find similar problems outside the area of chromatography. Meglen (43) reported that somewhat different conclusions were obtained when two different chemometric programs were used to analyze the same data. Therefore, although the user would like to assume that such concerns are unnecessary, errors resulting from software are not entirely hypothetical.
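One reason such disagreements arise is that integration programs set baselines and integration limits differently. The fragment below integrates the same noisy, drifting peak in two deliberately simplified ways, a fixed window with a flat baseline versus threshold-defined limits with a linearly interpolated baseline; neither is any vendor's algorithm, and all of the numbers are hypothetical, but the two area estimates need not agree with each other or with the known true area.

```python
import math
import random

random.seed(7)

# One noisy Gaussian peak (true area 1.0, arbitrary units) on a sloping
# baseline; every number here is invented for illustration.
DT = 0.1
TIMES = [i * DT for i in range(1200)]

def true_peak(t):
    return 1.0 / (2.0 * math.sqrt(2 * math.pi)) * math.exp(-0.5 * ((t - 60.0) / 2.0) ** 2)

SIGNAL = [true_peak(t) + 1e-4 * t + random.gauss(0.0, 0.002) for t in TIMES]

def area_fixed_window(times, y, start, stop):
    """Method A: trapezoidal integration over a fixed window, with a flat
    baseline taken from the signal value at the start of the window."""
    i0 = min(range(len(times)), key=lambda i: abs(times[i] - start))
    i1 = min(range(len(times)), key=lambda i: abs(times[i] - stop))
    base = y[i0]
    return sum((y[i] - base + y[i + 1] - base) / 2.0 * (times[i + 1] - times[i])
               for i in range(i0, i1))

def area_threshold(times, y, threshold):
    """Method B: integration limits set where the signal exceeds a threshold,
    with a baseline interpolated linearly between the two limit points."""
    idx = [i for i, v in enumerate(y) if v > threshold]
    i0, i1 = idx[0], idx[-1]
    def base(i):
        frac = (times[i] - times[i0]) / (times[i1] - times[i0])
        return y[i0] + frac * (y[i1] - y[i0])
    return sum((y[i] - base(i) + y[i + 1] - base(i + 1)) / 2.0 * (times[i + 1] - times[i])
               for i in range(i0, i1))

print("true peak area:           1.000")
print(f"fixed-window estimate:    {area_fixed_window(TIMES, SIGNAL, 50.0, 70.0):.3f}")
print(f"threshold-based estimate: {area_threshold(TIMES, SIGNAL, 0.03):.3f}")
```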

Interference and mistaken identification

Before discussing detailed examples, it is well to recall two classical requirements for reliable analyses, whether they are satisfied by performing chemical tests or by using highly instrumented measurements of physical properties. First, it is imperative that the identity of the species being measured be confirmed. Either a different chemical reaction or a different physical measurement should be made. Second, in making crucially important quantitative measurements, it is desirable to use two as nearly independent procedures as possible so as to minimize the effect of an interference from an unsuspected source. In many environmental and clinical problems involving trace-level analyses, the stakes can be very high, as can the chances of interference or misidentification (5-7). As the examples below illustrate, identification of the exact chemical species that is the source of the problem is rarely ever pursued because of the difficulty and cost involved. Instead, one takes appropriate steps to avoid or at least minimize the interference.

Let us first consider examples of interferences by unidentified species. Veillon (44) has estimated that a graphite furnace atomic absorption procedure that was inadequate for dealing with one or more interferences produced erroneous results for the chromium content of urine for a period of 10 years or more! Another example, which involved chlorinated dibenzo-p-dioxins, was reported by Shadoff (45). The procedure involved successively a liquid-liquid extraction, LC, and GC/MS using selected ion peaks for quantitation (46). He found unexpectedly high results for dioxin in fractions from a fish. However, when he scanned a large range of mass-to-charge ratios, he found two tiny dioxin peaks sitting on a high, almost featureless background that spread over a wide range of masses. He speculated that the background came from small amounts of glycerides that had unexpectedly passed through the entire procedure.

Similarly, Nestrick and Lamparski (47) found an interesting interference when analyzing for very low levels of dioxins using basically the same isolation and measurement procedures as Shadoff. They found that very erratic and unexpectedly high results were obtained unless they passed high-purity nitrogen through a bed of silica before using that nitrogen to evaporate solvent from the appropriate LC fraction prior to injecting a portion of it into the GC/MS. Since high-purity nitrogen is not a likely place to find tetrachlorodioxins, I suspect that a tank-valve lubricant or some residual pump oil had the "right" properties to be collected by the LC solvent and then pass through the GC/MS to produce the interfering signals. One must always be on the alert for unexpected behavior, but one certainly cannot predict such interferences in the usual way.

Finally, in the clinical area of drug abuse, "false positives" are also of great concern because of the seriousness of the charge to the accused person. That is why one laboratory selects statistical conditions that have a greater than 99.99% chance of detecting a drug user (48). Nevertheless, those conditions will allow an estimated 2000 true users to pass undetected for every nonuser who is judged wrongly to fail the test. As a further consideration, in some tests, common sources of interference are encountered; for example, poppy seeds on a couple of breakfast rolls have been found to give a positive test for opiates (48).

There are other types of interferences that can be examined more logically, just as chemical interferences have traditionally been explored. First, in a situation reminiscent of the early applications of emission spectroscopy using arcs and sparks, a recent paper (49) reminds us that in making inductively coupled plasma measurements, one should not overlook the possibility of interference from second- and third-order spectral lines. Second, there is a publication in mass spectrometry that illustrates the use of calculations of peak locations and relative heights based upon (a) known masses and abundances of isotopic species, and (b) known fragmentation patterns for molecules that contain only certain specified elements. In one dioxin study (50), a computer program was used to calculate all peaks in the region of two prominent peaks used for determining tetrachlorodioxin. All possible combinations of species were calculated that contained up to 50 carbons; 6 oxygens; 2 each of nitrogen, chlorine-35, chlorine-37, and bromine-79; one each of bromine-81, sulfur, and phosphorus; and the needed number of hydrogens. They found totals of 339 and 359 peaks that fell within 0.010 mass units of the respective 319.8965 and 321.8936 peaks being measured. It was estimated that a resolution of at least 30,000 would be needed. However, one is limited in the use of increasingly greater resolution because one faces the problems of (a) appearances of isotopic peaks and (b) greater widths and lower intensities of all peaks. The maximum useful resolution is then set by the amount of available sample (more precisely, the amount of the sought-for chemical species) as well as the previous two factors (50).
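A brute-force version of such a composition search is straightforward to sketch. The program below enumerates elemental compositions under approximately the constraints quoted above and counts those whose calculated mass falls within 0.010 u of a target; the isotope masses are standard values rounded to six decimal places, no valence or ring rules are imposed, and the counts should not be expected to match the totals reported in reference 50.

```python
import itertools
import math

# Monoisotopic masses (u), rounded; assumed standard values for illustration.
M = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915,
     "Cl35": 34.968853, "Cl37": 36.965903, "Br79": 78.918338,
     "Br81": 80.916290, "S": 31.972071, "P": 30.973762}

def count_interferences(target, window=0.010):
    """Count elemental compositions whose exact mass lies within `window`
    of `target`: up to 50 C, 6 O, 2 N, 2 Cl-35, 2 Cl-37, 2 Br-79,
    1 Br-81, 1 S, 1 P, and as many H as needed (no valence rules)."""
    count = 0
    ranges = (range(51), range(7), range(3), range(3), range(3),
              range(3), range(2), range(2), range(2))
    for c, o, n, cl35, cl37, br79, br81, s, p in itertools.product(*ranges):
        core = (c * M["C"] + o * M["O"] + n * M["N"] + cl35 * M["Cl35"]
                + cl37 * M["Cl37"] + br79 * M["Br79"] + br81 * M["Br81"]
                + s * M["S"] + p * M["P"])
        if core > target + window:
            continue
        # Number of hydrogen counts that place the total mass in the window
        # (at most one, since the window is narrower than one H mass).
        h_lo = max(0, math.ceil((target - window - core) / M["H"]))
        h_hi = math.floor((target + window - core) / M["H"])
        count += max(0, h_hi - h_lo + 1)
    return count

# The two ions used for tetrachlorodibenzo-p-dioxin quantitation (see text).
for target in (319.8965, 321.8936):
    print(f"m/z {target}: {count_interferences(target)} candidate compositions")
```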

In practice, the problem can be complicated further if the two molecular species in the sample, the sought-for and the interfering species, are present in widely different amounts.


A minor isotopic peak from an interfering substance present at the 100 parts-per-billion level may obliterate the abundant target peak of the sought-for substance present at the 100 parts-per-trillion level. Hence, there is a continuing need to go in the direction of using yet more complex combinations of separation and measurement techniques.

Before discussing the use of hyphenated techniques and other multiple-operation and multiple-detector systems, it is worth digressing to note a recent paper that dealt with a different multiple-technique problem: the cross-evaluation of independent, multiple analytical procedures for a given species (51). By using four or more different methods for determining an analyte, it is possible to arrive at the most accurate procedure for analyzing a complex sample and also to assess the capabilities of each of the individual methods with respect to bias and precision. This paper clearly addresses a problem that is sometimes encountered when the analytical results obtained by two methods are plotted against one another over a range of concentrations. When, as Figure 4 shows, the results deviate from a broad straight line, especially at one end or the other, one doesn't know which method to blame (8). Although a third, independent method should be helpful (8, 51), the reported mathematical treatment argues strongly for a fourth.

Figure 4. Comparison of results obtained by two independent methods. The region where disagreement arises, either because Method 1 gives low results or because Method 2 gives high results as a result of an interference, is indicated with an asterisk.

The major reason for using multiple operations in series and/or in parallel is to improve the selectivity, a goal that involves more than just one of its components: resolution. This is why it is appealing to speculate as to the feasibility of devising methods for calculating selectivities, at least on a relative basis. Certainly, in a simple example that was cited (10) using first boiling-point and then molecular-weight data from a large compilation of organic compounds, the number of possible species was greatly reduced (from roughly 1300 to 6) by introducing the second criterion. This, of course, is the same general principle that is used in searching libraries of spectra. Intuitively, it seems that it should be worthwhile, by using different combinations of physical and chemical properties, to estimate the relative numbers of expected interferences when applying different combinations of procedures and techniques to a sample containing given classes of compounds (or elemental species). Are certain combinations more useful than competitive ones for certain classes of samples? If careful examination shows that hope to be untrue, one can certainly continue along the present random course.
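That kind of screening is simple to prototype. The sketch below applies successive property filters (boiling point, then molecular weight) to a small, invented candidate table; the compound list, property values, and tolerances are approximate, illustrative stand-ins for the large compilation cited as reference 10.

```python
# Illustrative candidate table: (name, boiling point in degrees C, molecular weight in g/mol).
# The entries are approximate values chosen for illustration, not a curated database.
CANDIDATES = [
    ("toluene",        110.6,  92.14),
    ("butyl acetate",  126.1, 116.16),
    ("chlorobenzene",  131.7, 112.56),
    ("ethylbenzene",   136.2, 106.17),
    ("p-xylene",       138.4, 106.17),
    ("o-xylene",       144.4, 106.17),
    ("3-heptanone",    147.0, 114.19),
    ("anisole",        153.7, 108.14),
    ("cyclohexanone",  155.6,  98.15),
]

def filter_by(candidates, index, measured, tolerance):
    """Keep candidates whose property (at position `index`) lies within
    `tolerance` of the measured value."""
    return [c for c in candidates if abs(c[index] - measured) <= tolerance]

# Suppose the measured boiling point is 137 +/- 10 degrees C and the measured
# molecular weight is 106 +/- 1 g/mol (both hypothetical measurements).
after_bp = filter_by(CANDIDATES, 1, 137.0, 10.0)
after_mw = filter_by(after_bp, 2, 106.0, 1.0)

print("candidates after boiling-point filter:", [c[0] for c in after_bp])
print("candidates after adding molecular weight:", [c[0] for c in after_mw])
```

Repeating such counts for different pairs (or triples) of properties over a realistic compound library is one concrete way to compare, on a relative basis, the selectivities of candidate combinations of techniques.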

General conclusions

Throughout this discussion, I have documented sources of instability and error, including gross misidentifications, that can occur within a laboratory. Minimizing those sources within a laboratory will expand the range of measurements that can be made and the accuracy with which they can be interpreted reliably. In addition, one may, in some cases, extend one's capability even further by improving resolution in a qualitative, rather than quantitative, way, as has been shown for NMR. Finally, regardless of how consistent one laboratory is in its operations, there is a need to reduce the differences between it and an equally consistent second laboratory. Better recognition of the classes of the sources of bias and uncertainty should lead to better research data as well as to improved product control by industry and regulatory control by governmental agencies.

References

(1) Chem. Eng. News 1990, 68(6), 32.
(2) a. Callis, J. B.; Illman, D. L.; Kowalski, B. R. Anal. Chem. 1987, 59, 624 A-637 A; b. Pell, R. J.; Erickson, B. C.; Hannah, R. W.; Callis, J. B.; Kowalski, B. R. Anal. Chem. 1988, 60, 2824.
(3) Horwitz, W.; Kamps, L. R.; Boyer, K. W. J. Assoc. Off. Anal. Chem. 1980, 63, 1344.
(4) a. Whitaker, T. B.; Dickens, J. W.; Monroe, R. J.; Wiser, E. H. J. Am. Oil Chem. Soc. 1972, 49, 590; b. Whitaker, T. B.; Dickens, J. W.; Monroe, R. J. J. Am. Oil Chem. Soc. 1974, 51, 214.
(5) Widmark, G. Adv. Chem. Ser. 1971, 11, 348.
(6) Donaldson, W. T. Environ. Sci. Technol. 1977, 11, 348.
(7) a. Improving the Reliability and Acceptability of Analytical Chemical Data Used for Public Purposes; Ad Hoc Subcommittee Dealing with the Scientific Aspects of Regulatory Measurements, ACS.
(22) Bulkin, B. J., BP America, personal communication, March 1989.
(23) Gemperline, P. J.; Webber, L. D.; Cox, F. O. Anal. Chem. 1989, 61, 138.
(24) Rodden, C. J., New Brunswick Laboratory, U.S. Atomic Energy Commission, personal communication, 1947.
(25) a. Nordberg, R. Acta Pharm. Suecica 1982, 19(1), 51; b. Borgå, O. Idem., 52.
(26) Currie, L. A. Pure Appl. Chem. 1982, 54, 733.
(27) Baas, D. A. Amer. Lab. 1989, 21(12), 24.
(28) Budde, W., U.S. EPA, personal communication, 1976.
(29) Ember, L. R. Chem. Eng. News 1988, 66(48), 23.
(30) Heintz, E. Naturwissen. 1941, 29, 713-25.
(31) a. Oberholtzer, J. E.; Rogers, L. B. Anal. Chem. 1969, 41, 1234; b. Westerberg, R. B.; Davis, J. E.; Phelps, J. E.; Higgins, G. W.; Rogers, L. B. Chem. Instrum. 1975, 6, 273.
(32) Allerhand, A.; Maple, S. R. Anal. Chem. 1987, 59, 441 A.
(33) Culp, R. A.; Lochmüller, C. H.; Moreland, A. K.; Swingle, R. S.; Rogers, L. B. J. Chromatogr. Sci. 1971, 9, 6.
(34) Lorenz, L. J.; Culp, R. A.; Rogers, L. B. Anal. Chem. 1970, 42, 979.
(35) Roberts, D. Amer. Lab. News Ed. 1988.
(36) Joyce, R. Amer. Lab. 1989, 21(6), 48.
(37) Ravichandran, K.; Lewis, J. J.; Yin, I-H.; Koenigbauer, M.; Powley, C. R.; Shah, P.; Rogers, L. B. J. Chromatogr. 1988, 439, 213.
(38) a. Feldman, M., E. I. du Pont de Nemours and Co., personal communication, July 1986; b. Garelick, E. L. Intech 1990, 37, 34.
(39) Pauls, R. E.; Rogers, L. B. Sep. Sci. 1977, 12, 395.
(40) Papas, A. N.; Delaney, M. F. Anal. Chem. 1987, 59, 64 A.
(41) Head, M.; Last, B. Amer. Lab. 1989, 21(12), 56-59.
(42) Lubkowitz, J. A., independent consultant, Gulf Breeze, FL, personal communication, March 1987.
(43) Meglen, R. R. Presented at the 100th AOAC Annual Meeting, Scottsdale, AZ.
(47) Nestrick, T. J., Dow Chemical Co.; Lamparski, L. L., Dow Chemical Co., personal communication, March 1979.
(48) Hively, W. Amer. Sci. 1989, 77, 19-23.
(49) Attar, K. M. Anal. Chem. 1988, 60.
(50) Mahle, N. H.; Shadoff, L. A. Biomed. Mass Spectrom. 1982, 9, 45.
(51) Mark, H.; Norris, K.; Williams, P. C. Anal. Chem. 1989, 62, 398.

Lockhart B. Rogers is Graham Perdue Professor of Chemistry Emeritus at the University of Georgia. He has been teaching since 1942, except during February 1946-August 1948, when he was group leader of long-range research in analytical chemistry at what is now Oak Ridge National Laboratory. He taught for 4 years at Stanford, 13 years at MIT, 13 years at Purdue, and 12.5 years at Georgia before retiring in 1986. Among the numerous awards he has received are the ACS awards in Analytical Chemistry and in Chromatography, the ACS Division of Analytical Chemistry Award for Excellence in Teaching, and the Analytical Chemistry Award of the Society of Analytical Chemists of Pittsburgh.
