
Instrumentation

Time-Sharing Minicomputer Data Acquisition-Processing System

Mack W. Overton and Larry L. Alber, Food and Drug Administration, Chicago District Laboratory, Room 1222, Post Office Building, Chicago, IL 60607

Four years ago, an agency decision was made to investigate the applicability of the on-line minicomputer in Food and Drug Administration district laboratories. Although several commercial "turnkey" systems were available (1-3), the authors suggested an "in-house" development. There were several reasons behind this preferred approach: money restrictions eliminated the possibility of acquiring a large system; continual modification and revision of FDA priorities and procedures minimized the benefits of turnkey systems; and the experience gained on this project could be easily applied in other FDA laboratories and would provide in-house minicomputer competence which would be advantageous in the long term. The in-house procedure was adopted, and through a gradual process of software development, application to laboratory "production," software evaluation and updating, a laboratory computer system evolved which is the subject of this report. With minor modifications the system in question has been routinely applied to laboratory production for the past 24 months. It provides the capability of time sharing up to eight analytical instruments, each sampled at different data rates, while processing stored data. A 4K core memory machine and 32K disk are required for the "foreground" data logging operation, but an additional 4K core memory allows "background" operation on stored raw data. The software thus far developed emulates only the data reduction techniques used routinely by analysts to arrive at an assay result. In other words, no special software is being used for nonroutine procedures, such as deconvolution of overlapping peaks, FFT, cross correlation, or the like.

Donald E. Smith, Department of Chemistry, Northwestern University, Evanston, IL 60201

Numerous possible applications of computerized, automated instrumentation exist in FDA lab operations. Consequently, a pilot project, now completed, was undertaken at the Chicago District Laboratory to evaluate on-line computer applications. Results of this project are reported, emphasizing applications to simultaneous assays using on-line gas-liquid chromatographs, automatic analyzers, and pulse polarographs.

Similarly, no special computer-required controls have been added to the analytical instrument consoles as part of the interfacing procedure. Many computer systems (4-6) provide time-sharing capabilities for multiple-instrument monitoring but may require 8K or more of core memory for the resident monitor alone and a large, fast mass storage device to swap the scheduled subroutines into core.

Some systems offer multiplexed analog-to-digital conversion (ADC), but all data rates must be the same. Therefore, additional bookkeeping is necessary to handle instruments of varying sample rates. Other systems (7-9) provide on-line multiplexing and real-time data reduction but do not store raw digital data arrays. Many operating systems are not useful in cases where high data acquisition rates are necessary because of long response times to interrupt requests. Most of these disadvantages have been circumvented, to a degree which is acceptable in our laboratory, by the laboratory computer system described herein. Its most important characteristic is its capability of monitoring multiple instruments at differing data rates with efficient use of core and disk memory. This is achieved through exclusive use of assembly language programming, rather than relying on sophisticated operating systems and higher-level programming languages (10). Consequently, this report must be considered relevant to the important question of whether minicomputerization is best effected via the hardware-efficient, but time-consuming, strategy which relies on assembly language programming, or by adopting the philosophy which minimizes software development and modification time by high-level language programming in combination with powerful operating systems, at the expense of considerable hardware inefficiency. We do not wish to use the experience obtained with our approach, which was adopted mainly out of necessity, as a basis for actively promoting a particular viewpoint regarding the foregoing question.
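The central idea of the foreground monitor just described, a single programmable clock servicing several channels that each have their own sampling rate, can be sketched in a few lines. This is only an illustrative Python model (the actual monitor is PDP-12 assembly driven by the KW12A clock interrupt); the base tick rate is assumed, and the per-channel rates are borrowed from the sampling rates quoted in Figure 1.

```python
# Illustrative sketch of a clock-driven, multi-rate data logging loop.
# The real monitor is PDP-12 assembly servicing KW12A clock interrupts;
# the tick rate and channel list here are assumptions for demonstration.
TICK_HZ = 100                                    # assumed base clock rate

channel_rates = {"GLC": 1.5, "autoanalyzer": 0.25, "PAR-174": 10.0}   # points/sec
dividers = {name: max(1, round(TICK_HZ / rate)) for name, rate in channel_rates.items()}
buffers = {name: [] for name in channel_rates}

def read_adc(channel):
    """Stand-in for one multiplexed ADC conversion on the named channel."""
    return 0

def clock_tick(tick):
    """One simulated clock interrupt: log every channel whose divider divides
    the tick count, so each instrument is sampled at its own rate."""
    for name, divider in dividers.items():
        if tick % divider == 0:
            buffers[name].append(read_adc(name))

for tick in range(60 * TICK_HZ):                 # one simulated minute of logging
    clock_tick(tick)
```

A stored-array "background" pass over such buffers would then correspond to the data reduction steps described below.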


Figure 1. Scope displays of typical data arrays acquired using apparatus and data logging monitor. A: gas-liquid chromatographic response using electron capture detector with some chlorinated pesticide standards. Identity of first through sixth peaks is as follows: petroleum ether solvent; lindane, 0.51 ng; heptachlor, 0.52 ng; aldrin, 0.86 ng; heptachlor epoxide, 1.04 ng; and p,p'-DDT, 2.6 ng. Sampling rate of 1.5 points/sec. B: autoanalyzer response using visible/UV spectrophotometric detector at 240 nm with phenobarbital tablets. Sampling rate of 0.25 points/sec. C: differential pulse anodic stripping response using hanging mercury drop working electrode with zinc, cadmium, and lead (first through third peaks) in tartaric acid electrolyte at pH 5. Scan from -1.2 to 0.0 V. Sampling rate of 10 points/sec. [Note: Slight discontinuities in point plots are due to a defect in the scope display controller (since corrected) and are not present in the actual digital data arrays.]

Without advocacy, we believe that the present report contributes usefully to application examples which can assist in decision making on the foregoing important fundamental question in contemporary analytical instrumentation. Ultimately, the choice of options remains a highly individualized matter, dependent on factors such as a laboratory's equipment, personnel, and short- and long-range goals (11).

Experimental

The description of the computerized laboratory system which follows covers only the highlights. Full details are available from the authors upon request.

Apparatus. All analytical instruments are interfaced to a Digital Equipment Corp. PDP-12A minicomputer with an 8K core memory. Computer peripherals include a DEC Model DF32 32K disk, two Model TU55 magnetic Linc tape units (125K words each), a Model AD12 32-channel multiplexed 10-bit analog-to-digital converter (ADC), a Model KW12A programmable crystal clock, a Model VR12 display scope with a Model VC12 control and character buffer, a Model FPP-12 36-bit hardware floating point processor, a Texas Instruments Model 733-KSR teletypewriter, and a Houston Omnigraphic 2200 X-Y recorder. The instruments interfaced include: a Hewlett-Packard Model 5750 gas-liquid chromatograph (GLC) with dual flame ionization detectors; a Packard Model 702 GLC with flame ionization detector; a Barber-Colman Model BC-10 GLC with a dual electron capture-thermionic detector system; a Tracor Model 220 GLC with electron capture detector; a Beckman DK-2A UV-visible recording spectrophotometer coupled to a Technicon

Autoanalyzer network; a Coleman Model 124B UV-visible recording spectrophotometer coupled to a Technicon Autoanalyzer network; a Technicon Fluorometer II Autoanalyzer system; and a Princeton Applied Research Corp. Model 174 polarographic analyzer. The 1-10-mV signals from the GLC detector amplifiers are conditioned to the 1-V full-scale level required by the Model AD12 ADC input amplifier by amplification, followed by low-pass filtering (second-order Butterworth) with the aid of Burr-Brown operational amplifiers. The circuit schematic has been reported (12). A precision retransmitting potentiometer is installed in the Beckman DK-2A as a pen position-to-voltage converter. The voltage across the potentiometer is obtained from the Burr-Brown operational amplifier power supply, and the potentiometer output is conditioned by amplification (gain < 1, with dc offset) and low-pass filtering (13). Similar signal conditioning is performed on the Coleman 124B and PAR 174 outputs (0-3 and 0-10 V, respectively). The PAR 174 interface includes a computer-activated SPDT switch wired across the 174's HOLD switch. This enables the scan sweep to be activated at the teletype when the sampling request is given on the channel assigned to the PAR 174. The X-Y recorder used for plotting was economically interfaced by connecting to the two DACs controlling the scope display. All operational amplifier circuits are constructed on Burr-Brown Model 603 manifolds mounted on the PDP-12 chassis. Burr-Brown Model 3057 and 3542S IC operational amplifiers are employed for this purpose, together with Pomona Electronics plug-in accessories.
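The signal conditioning (gain followed by a second-order Butterworth low-pass) is analog hardware in the system described; for readers who want to experiment off-line, a digital stand-in is easy to sketch. The cutoff, sampling rate, and gain below are assumed values for illustration, not the Burr-Brown circuit constants.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 10.0        # assumed digitizing rate, Hz
cutoff = 1.0     # assumed low-pass corner frequency, Hz
gain = 100.0     # assumed amplifier gain taking mV-level detector output toward 1 V

# Second-order Butterworth low-pass, the digital counterpart of the analog filter.
b, a = butter(2, cutoff / (fs / 2.0), btype="low")

rng = np.random.default_rng(0)
detector_mv = 5.0 + rng.normal(0.0, 0.2, 600)                 # synthetic 5-mV signal plus noise
conditioned_v = lfilter(b, a, gain * detector_mv / 1000.0)    # amplified, filtered, in volts
```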


Data Logging Monitor. ... Data reduction proceeds on data in the 2K buffer, while sampling continues on the active channels. The data array being processed is continuously displayed on the scope, so that results of data reduction steps which modify the data array (e.g., digital filtering and peak stripping) are shown in real time. The data array is loaded in core in fixed point format, and individual data points are retained in this state throughout most of the data reduction phase. Data points are transformed to floating point format and stored in small floating point data buffers (58 points) only when being operated on by the FPP. This strategy effects a factor-of-three savings in core memory, relative to storing each point in an array as a floating point number, at the cost of negligible (for us) increases in processing time. The end result of what we define here as data reduction is three floating point files containing: the abscissa values of the peak maxima relative to the first peak (retention time); the peak heights corrected for baseline; and the peak areas corrected for baseline.
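The storage strategy, keeping the whole array in fixed point and lifting only a small window into floating point while it is being processed, can be illustrated with NumPy. Here int16 and float64 merely stand in for the PDP-12's 12-bit words and the FPP's 36-bit format, so the 3:1 core saving is only approximated, and the window operation is a placeholder.

```python
import numpy as np

# The whole array stays in fixed point (int16 standing in for 12-bit PDP-12 words).
data = (1000.0 * np.exp(-0.5 * ((np.arange(2000) - 800) / 40.0) ** 2)).astype(np.int16)

def process_window(fixed, start, length=8):
    """Lift one small window into floating point, operate on it, and write the
    result back in fixed point; the full array is never held as floats."""
    window = fixed[start:start + length].astype(np.float64)
    window -= window.min()                        # placeholder for the FPP arithmetic
    fixed[start:start + length] = np.round(window).astype(np.int16)

process_window(data, 796)
```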


Digital Filtering. Data enhancement via digital filtering normally is the first step in data reduction, although it is optional when using the interactive mode. Digital filtering involves three phases: detection and elimination of large short-term transients (one to four points) on the data array, caused by occasional power line fluctuations; a floating two-point averaging routine; and an eight-point least-squares filtering routine. Transient detection involves successive examination of adjacent sets of three points for a second derivative whose absolute magnitude exceeds a reasonable threshold value (10 decimal units, absolute). When such a condition is detected, a straight-line fit is calculated between the data point immediately preceding the first "bad" point and the seventh point beyond the latter. Values of the intermediate six points are then adjusted to fit this straight line. As an additional precaution against smaller, single-point spikes, the floating two-point average routine (each data point is averaged with its neighbor) is invoked (4).
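A minimal NumPy sketch of these first two filtering phases follows: spikes are flagged by a large second difference and bridged with a straight line between the point preceding the first bad point and the seventh point beyond it, then the floating two-point average is applied. The 10-count threshold comes from the text; everything else is an illustrative reading of the procedure, not the assembly routine itself.

```python
import numpy as np

def remove_transients(y, threshold=10.0):
    """Detect short transients from the second difference of adjacent triples and
    replace the intervening points with a straight-line fit (sketch of phase one)."""
    y = np.asarray(y, dtype=float).copy()
    i = 1
    while i < len(y) - 1:
        if abs(y[i - 1] - 2.0 * y[i] + y[i + 1]) > threshold:
            lo = i - 1                               # point preceding the first "bad" point
            hi = min(i + 7, len(y) - 1)              # seventh point beyond it
            y[lo + 1:hi] = np.interp(np.arange(lo + 1, hi), [lo, hi], [y[lo], y[hi]])
            i = hi
        else:
            i += 1
    return y

def two_point_average(y):
    """Floating two-point average: each point is averaged with its neighbor (phase two)."""
    y = np.asarray(y, dtype=float)
    out = y.copy()
    out[:-1] = 0.5 * (y[:-1] + y[1:])
    return out
```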


These preliminary precautions to eliminate transient "spikes" are essential for optimum performance of the third and principal phase of digital filtering: least-squares filtering (16, 17). The least-squares filtering routine is based on an eight-point quadratic fit algorithm. Eight successive points from the data array, beginning with the array's initial point, are converted to floating point format, and the best quadratic fit to these points is calculated by the least-squares procedure. Ordinate values of the center six points are then modified to exactly match the calculated quadratic, and the eight-point set is converted back to fixed point format and returned to the original core location. The relevant array address then is incremented by one, and the procedure is repeated until the entire data array has been scanned. This operation requires approximately 10 sec for an array of 2000 points, using the FPP.
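The third phase can be sketched the same way: slide an eight-point window along the array, fit a quadratic by least squares, replace the six interior points with the fitted values, and advance one point. This is an illustrative in-place version of the routine as described; the FPP-specific fixed/floating conversions are omitted.

```python
import numpy as np

def quadratic_smooth(y):
    """Eight-point least-squares quadratic smoother: the six center points of each
    window are replaced by the fitted quadratic, and the window advances by one."""
    y = np.asarray(y, dtype=float).copy()
    x = np.arange(8.0)
    for start in range(len(y) - 7):
        coeffs = np.polyfit(x, y[start:start + 8], 2)    # least-squares quadratic fit
        y[start + 1:start + 7] = np.polyval(coeffs, x[1:7])
    return y
```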


Figure 2. Gas-liquid chromatogram showing nature and locations of "key points" and baselines deduced by peak identification and characterization procedures. Flame ionization detector chromatogram of (P1) acetone, (P2) barbital, (P3) amobarbital, (P4) pentobarbital, (P5) secobarbital, (P6) mephobarbital, and (P7) phenobarbital separated on 6-ft, 3% OV-17 column at 200°C. . . . . baseline from which peak heights and areas are calculated for peaks P2-P7; - - - - instrument baseline as interpreted by software; - - overlapping peak and instrument baseline.


The quadratic least-squares fitting software is required in another phase of data reduction (see below). For this reason, the more efficient strategy (timewise) of utilizing equivalent weighted average formulas (16) in place of repetitive least-squares fitting for this filtering step was rejected after initial trial runs. The observed savings in computational time (~30%) was not viewed as sufficient to justify the additional core requirement to invoke the weighted average approach.

Peak Identification and Characterization. With the ADP mode, this second phase of data reduction is implemented immediately following digital filtering without operator intervention. This operation involves locating, labeling, and filing abscissa values of key points in the data array, which are then used to calculate peak heights and areas. The data array is scanned for the following "key points": peaks (P); minima (M); thresholds to peaks (T); tangents to minima (MT); tangents to thresholds (TT). Figure 2 illustrates results of a typical array scan for these key points and provides qualitative clarification of their definitions, if needed. The search routine involved uses the eight-point quadratic fit routine as an aid to finding maxima, minima, and instantaneous tangents. This, together with the requirement that an apparent maximum or minimum is followed by the appropriate downward and upward trend, respectively, over the subsequent eight points, provides effective discrimination against the detection of false peaks and minima resulting from residual noise after digital filtering. Quantitative conditions for detection of various classes of "key points" are given in Table I. These represent approximate optimization for operating conditions used in our laboratory and are reasonably tolerant to variations in peak parameters, such as peak width. The abscissa value for each class of "key point" is stored in a separate fixed point data file in the order of appearance on the data array. An ID file is also generated during the data scan process which contains a record of each key point classification in order of appearance.
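A much-simplified scan for two of the key-point classes, peaks (P) and minima (M), can be written as below. The trend test over the following eight points is taken from the text; the use of simple neighbor comparisons in place of the eight-point quadratic-fit aid, and the omission of thresholds and tangents, are simplifications for illustration (the actual acceptance conditions are those of Table I).

```python
import numpy as np

def scan_key_points(y, trend=8):
    """Locate candidate peaks (P) and minima (M) in a smoothed array, accepting a
    candidate only if the following `trend` points continue downward (for a peak)
    or upward (for a minimum), as a guard against residual noise."""
    y = np.asarray(y, dtype=float)
    peaks, minima = [], []
    for i in range(1, len(y) - trend - 1):
        following = np.diff(y[i + 1:i + 2 + trend])
        if y[i] >= y[i - 1] and y[i] > y[i + 1] and np.all(following <= 0):
            peaks.append(i)
        elif y[i] <= y[i - 1] and y[i] < y[i + 1] and np.all(following >= 0):
            minima.append(i)
    return peaks, minima
```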


From these six files, sufficient information is provided to implement the generation of the three floating point files mentioned earlier. The file of peak abscissa values is used to generate the floating point file of relative peak positions simply by subtracting the first peak's abscissa value from all others. Calculation of peak magnitudes and areas requires baseline correction and special procedures to assign peak areas when unresolved peaks are encountered. Our software uses techniques which have been described in the literature (18), which the reader is urged to consult for details.

Figure 2 illustrates the net result of these procedures for the chromatogram in question. The instrument baseline is drawn from the first threshold (T), which represents the beginning of the solvent peak, to subsequent threshold tangents (TT), which represent returns to instrument baseline. Until the first threshold tangent is reached, areas and heights of peaks following the first are calculated by "skimming" (18) from a minima to a minima threshold. In this way, interferences from a drawn-out solvent peak are deleted. Integrations involving unresolved peaks (e.g., peaks P3 and P4 of Figure 2) are made by "dropping perpendiculars" (18) from minima and minima tangents to the baseline. Via such procedures the floating point files of peak heights and areas are generated, and the ADP process is ended. At this point the operator may request printout of these files for preliminary examination, comparison with standard files, etc.

The foregoing peak identification-characterization routine is characteristic of the ADP mode and is designed for general applicability to assay methods which rely on response peaks. Because of our interest in broad applicability, some of the point files provided are clearly irrelevant for some quantitative assay techniques, but not others. This point will be clarified in subsequent discussion of calculation methods. In any event, the optional IDP routine allows the operator to generate only the relevant files for a particular assay, in addition to exercising his judgment regarding baseline location, peak area allocation, whether digital filtering should be invoked, etc.

In the IDP mode, digital filtering is invoked by an appropriate keyboard command (type "CR") and proceeds in the manner described earlier. The peak identification-characterization operation is implemented with the aid of the scope display, keyboard, and four analog potentiometers located on the computer's control panel, using fairly routine procedures (19, 20). The data array, together with two cursors (bright dots), is displayed on the scope. The abscissa and ordinate of each cursor are controlled by a pair of the analog pots. By proper location of the cursors and appropriate keyboard commands, one can generate the three floating point files provided by the automatic mode. For example, peaks are integrated by the standard procedure (19) of manually locating the cursors at the initiation and termination of a peak and typing "CR." Performing this with successive peaks in the data array, beginning with the first, will effect "manual" generation of the peak area file. At the same time, the peak position and peak height determinations are filed. The IDP routine also provides the means for stripping a peak from an array (type "S" with the cursors located on either side of the peak to be stripped). Provision for obtaining an X-Y point plot of the displayed data array also is included with the IDP mode. As with the ADP mode, printout of the files generated with the interactive approach may be requested. The IDP mode is invoked in our laboratory only on occasions where the data array contains some unusual or complex features which might not be appropriately processed by the ADP mode. Such instances have arisen to date only in GLC measurements (see below).

Calculations. Assay calculations using the three floating point files generated by the data reduction procedure involve standard FDA methods, which have been documented (14, 15, 21, 22). These will be outlined briefly here to indicate how the data files are used in the three assay methods under consideration.
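In the simplest case, the baseline and area-assignment steps described above reduce to drawing a straight instrument baseline between two key points and summing the baseline-corrected signal between successive "perpendicular" cut points at the valley minima. The sketch below captures only that simple case; skimming from a drawn-out solvent peak and the tangent constructions of Figure 2 are not reproduced, and the index arguments are assumed to come from the key-point files or the IDP cursors.

```python
import numpy as np

def integrate_peaks(y, start, end, cuts=()):
    """Integrate peaks between array indices `start` and `end` above a straight
    baseline drawn between those two points, dropping perpendiculars at the valley
    indices in `cuts` to split unresolved peaks.  Returns per-peak areas and heights
    in raw point units (a simplified sketch, not the assembly routines)."""
    y = np.asarray(y, dtype=float)
    x = np.arange(len(y), dtype=float)
    baseline = np.interp(x, [start, end], [y[start], y[end]])
    corrected = y - baseline
    bounds = [start, *sorted(cuts), end]
    areas = [float(corrected[a:b + 1].sum()) for a, b in zip(bounds, bounds[1:])]
    heights = [float(corrected[a:b + 1].max()) for a, b in zip(bounds, bounds[1:])]
    return areas, heights
```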



Figure 3. Hard copy report produced by computer system giving results of 30-tablet autoanalysis run (per-tablet percent of label declaration with average, high, low, range, standard deviation, standard error of mean, confidence limits, and tolerance limits)

Autoanalysis Calculations. In this laboratory, automatic analyzers are used to effect individual tablet analyses of pharmaceutical preparations (12, 21, 22). Detection methods are either UV-visible or fluorescence spectrophotometry. A known standard solution usually is placed in every sixth position of the autoanalyzer sample tray. With the relative peak abscissa and peak height files, calculations are made by comparing sample responses (peak heights) to the average of the two nearest standard responses located before and after the sample response. Other information needed for the calculation is entered at the keyboard, i.e., sample dilution, sample declaration, standard concentration, and standard cup order. The generated hard copy report, Figure 3, provides the percent of label declaration for each tablet, an average, and coefficient of variation for the sample. The calculation procedure has been previously reported in greater detail (23).
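In outline, the bracketing-standard calculation amounts to the sketch below. The cup-position data layout and the assumption that sample and standard solutions represent the same nominal concentration (so that the response ratio maps directly onto percent of declaration) are illustrative simplifications; the dilution, declaration, and standard-concentration corrections entered at the keyboard are lumped into a single factor, and the actual routine is the one documented in ref 23.

```python
import statistics

def percent_of_declaration(samples, standards, factor=100.0):
    """Ratio each sample peak height to the mean of the nearest standards before and
    after it in the tray.  `samples` and `standards` are (cup_position, peak_height)
    pairs; `factor` lumps together the dilution/declaration corrections (assumed)."""
    results = []
    for pos, height in samples:
        before = max((s for s in standards if s[0] < pos), key=lambda s: s[0], default=None)
        after = min((s for s in standards if s[0] > pos), key=lambda s: s[0], default=None)
        bracket = [s[1] for s in (before, after) if s is not None]
        results.append(factor * height / statistics.mean(bracket))
    mean = statistics.mean(results)
    cv = 100.0 * statistics.stdev(results) / mean if len(results) > 1 else 0.0
    return results, mean, cv
```

With standards in every sixth cup, `standards` would hold those cup positions and heights and `samples` the remaining cups; the printed report of Figure 3 adds range, confidence-limit, and tolerance-limit statistics on top of these per-tablet values.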


GLC Assay Calculations. GLC calculations normally involve comparison of sample and standard peak areas (14). Because of the wide variety of GLC analyses run in our laboratories, and because no two samples are alike, the decision is left to the operator to input via keyboard the standard responses for calculation or to develop a standard file on magnetic tape for a particular component-detector combination. In pesticide residue analysis, where the methodology has been standardized (14), the instrument parameters remain constant; therefore, it is feasible to automate identification and calculation steps in which peak-by-peak comparison of standards with unknowns is made. Exceptions occur with certain residues which produce multiple, poorly resolved GLC peaks such as chlordane, toxaphene, polychlorinated biphenyls (PCBs), or strobane. The interactive data processing routine normally is used in these cases. For example, in samples where an electron capture response exhibits both PCB and DDT peaks, the operator first strips away all peaks associated with PCBs and integrates the DDT peaks, then retrieves the raw data from tape, strips away the DDT isomers, and obtains the total integral under the multiple PCB peaks. These results are then quantitated by comparison with standard responses which have been processed identically. Routines have been written in FOCAL-12 (24), a conversational language, to allow an operator to construct a reference file on magnetic tape by obtaining GLC results with standard solutions (either mixtures or single components). In the case of pesticides, the data stored are a file of relative retention times to aldrin and the substance's response level per nanogram on a particular detector. The program automatically compares sample GLC runs to the standard file for the appropriate detector, identifies the unknown by scanning the file for matches in relative retention times (RRT), and, if identified, calculates the residue levels in ppm. The RRT window used in updating a standard file is ±2% (if new standard runs yield a value outside this range, the file is updated). For sample identification the RRT window is ±5%. If a sample peak's RRT falls outside the ±5% range for all components in the standard file, a "NO MATCH" is printed for that peak. Since FOCAL-12 requires 7K core, this file comparison software can be used only when the real-time monitor is not in operation.
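The FOCAL-12 file-comparison step can be paraphrased as follows. The ±5% identification window, the ±2% file-update window, the aldrin reference, and the "NO MATCH" report come from the text; the data layout, function names, and the residue arithmetic (which in practice also involves sample weight and dilution before reporting ppm) are assumptions for illustration.

```python
def identify_residues(sample_peaks, standard_file, aldrin_rt, id_window=0.05):
    """Match sample peaks to a standard file by relative retention time (RRT) to
    aldrin.  `sample_peaks` is a list of (retention_time, response); `standard_file`
    holds (name, rrt, response_per_ng) entries for one detector."""
    report = []
    for rt, response in sample_peaks:
        rrt = rt / aldrin_rt
        for name, std_rrt, response_per_ng in standard_file:
            if abs(rrt - std_rrt) <= id_window * std_rrt:
                report.append((name, response / response_per_ng))   # nanograms found
                break
        else:
            report.append((round(rrt, 3), "NO MATCH"))
    return report

def update_standard_file(entry, new_rrt, update_window=0.02):
    """Refresh one standard-file entry when a new standard run falls outside ±2%."""
    name, old_rrt, response_per_ng = entry
    if abs(new_rrt - old_rrt) > update_window * old_rrt:
        return (name, new_rrt, response_per_ng)
    return entry
```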

Pulse Polarographic Calculations. Most pulse polarographic assays run in our laboratory involve relatively simple one- or two-peak (usually one) polarograms in which the qualitative identity of the sample is known. Assay calculations involve only comparison of sample peak heights or areas to corresponding standard responses obtained under identical conditions. In principle, development of standard files for purposes of unknown identification and quantitation is possible with the program described earlier which utilizes GLC data, but the nature of the samples received to date has made this unnecessary.

Results and Discussion

For the two-year period during which it has been used for "production" applications, the data logging monitor has performed without failure. We estimate that all possible operating conditions have been encountered during this period; therefore, it is concluded that this component of the system is free from software bugs. A similar conclusion is reached regarding the assay calculation routines, which are relatively straightforward. The monitor and calculation aspects of the system share the property of being defined purely by software packages and, thus, are immune to variable factors such as individual instrument performance and noise levels. Once the software is debugged, these system components can be expected to yield consistent performance. We believe the monitor and calculation routines have reached the latter stage and will comment on them no further.

As with most laboratory computer systems, the most critical components in the present one are the interfacing hardware and the data reduction software routines. The interfacing includes some rather low-cost operational amplifiers which originally were considered to be barely within our needs regarding noise levels (including dc drift), input impedance, and common mode rejection characteristics. The possibility existed in our minds that these components might lead to signal degradation, at least with some of the numerous types of instruments encountered in this project. Because of their simplicity, the automatic data reduction routines, such as those assigned to locating peaks, minima, and peak thresholds, are not completely noise immune, nor are they necessarily flawless in partitioning areas between poorly resolved peaks. Because our laboratory makes frequent use of standards, and assays based on poorly resolved peaks normally are not attempted (except where the total area under several unresolved peaks is used, e.g., total PCBs), it was felt that the mathematically simple peak characterization


Table II. Comparison of Magnitudes and Precision of GLC Peak Area Ratios Calculated by Various Procedures

                        Manual            Disk              Computerized      Computerized
                        triangulation     integration       ADP mode          IDP mode
Injection no.           R1 (a)  R2 (b)    R1      R2        R1      R2        R1      R2
1                       0.954   1.174     0.961   1.301     0.996   1.178     0.980   1.171
2                       0.925   1.117     0.920   1.205     0.989   1.175     1.008   1.185
3                       0.830   1.288     0.973   1.229     0.981   1.135     0.971   1.190
4                       0.791   1.160     0.973   1.193     1.001   1.150     0.996   1.173
5                       0.843   1.039     0.972   1.207     0.988   1.112     0.986   1.140
6                       0.948   1.348     0.973   1.217     0.993   1.146     0.995   1.166
Mean value              0.882   1.191     0.962   1.225     0.991   1.149     0.989   1.170
SD, % of mean value     7.84    9.06      2.19    3.18      0.70    2.16      1.32    1.50

(a) R1 is the area ratio of chlorpheniramine free base to lidocaine "internal standard" on flame ionization detector. (b) R2 is the area ratio of pyrilamine free base to tetracaine "internal standard" on flame ionization detector.

procedures described would suffice, allowing a savings in core requirements relative to more sophisticated curve-fitting techniques involving gaussian fits, skewed gaussian fits, etc. Nevertheless, because the interfacing and digital data reduction procedures provide some obvious possibilities for inadequate system performance, we have implemented extensive trial runs to test the fidelity of the computerized assay procedures described above, relative to the conventional noncomputerized methods which have been standard practices in our laboratory. The latter involve manual interpretation of pen-and-ink recordings in most instances. The recordings are obtained routinely during all computerized runs; therefore, frequent implementation of such "trial runs" is convenient. Results of such tests invariably have been quite satisfactory, suppressing any fears that interfacing and data reduction strategies might degrade measurement accuracy. The various comparisons of computerized and noncomputerized assay results which we have performed all lead to essentially the same basic conclusions. Consequently, we will present here results of only one trial run, which we submit as representative. GLC analyses based on peak areas invariably involve comparison of a sample peak's area with a standard's, in which one calculates the peak area ratio from which the sample composition is deduced. Thus, regardless of the method used to evaluate peak area, precision of the peak ratio determination is the most important factor controlling assay fidelity. Table II illustrates the peak area precision obtained by two commonly used noncomputer procedures, manual triangulation and disk integration, together with computerized data reduction via the automatic and interactive modes. Peak area ratios are given for the chlorpheniramine free base-lidocaine and pyrilamine-tetracaine pairs.
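The "SD, % of mean value" rows in Table II are simply relative standard deviations of the six replicate ratios, e.g.:

```python
import statistics

def sd_percent_of_mean(values):
    """Relative standard deviation, as tabulated in Table II."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Manual-triangulation R1 ratios from Table II:
print(round(sd_percent_of_mean([0.954, 0.925, 0.830, 0.791, 0.843, 0.948]), 2))  # 7.84
```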


Perusal of the results in Table II leads to the following conclusions: (a) the best data precision is yielded by the computerized data processing approaches, with disk integration and manual triangulation yielding successively poorer results; (b) the absolute precision of both computerized data processing modes is outstanding, relative to what is normally obtained; (c) with one exception, the agreement between mean peak area ratios obtained by the various methods for a given component pair is excellent.

Conclusions derived from the data in Table II, as specified under items (a), (b), and (c), are echoed in similar trial runs, regardless of the instrumental technique or observable employed. The finding that computerized data reduction yields superior precision to what is obtained via alternatives involving manual and/or mechanical processing is fairly typical of reports in this field (25-27). The one example referred to under item (c), where a significant deviation in the mean peak area is found, involves manual triangulation of the chlorpheniramine free base-lidocaine pair. This probably is due to systematic error in the manual integration of the lidocaine peak, which is very narrow (3-mm baseline width), given the chart speed provided on the GLC instrument. In a normal assay situation, the peak's narrowness would have led to out-of-hand rejection of the manual triangulation method in favor of the disk integration procedure, if computerized processing were unavailable. Regarding the consistency between computerized and manual measurements, extensive comparisons of this type have been made on the basis of autoanalyzer peak heights.

Figure 4. Point plots of GLC data arrays showing effects of digital smoothing. A: response of electron capture GLC detector to 2.5 ng of p,p'-DDT using sampling rate of one point per 700 msec. B: result after data array "A" is subjected to transient removal and two-point averaging routine. C: result after data array "B" is subjected to eight-point least-squares quadratic smoothing routine.


A typical example is provided by one autoanalysis run involving 40 peaks, in which results obtained by manual and computerized interpretation of peak heights varied an average of ±0.1%, the greatest individual deviation being 0.9%.

Actually, the overall system fidelity somewhat exceeds our expectations. Dc drift and ac noise in the inexpensive operational amplifier circuits and in the signal conduction paths (sometimes as long as 30 meters) associated with the interfacing hardware normally have proved to be somewhat less than the corresponding quantities produced in the GLC and autoanalyzer detectors. Regardless of the source of analog noise, the digital filtering procedures seem adequate to eliminate untoward consequences of such noise with the wide variety of instruments employed, so that the ADP data reduction mode is applicable to most data acquired. However, it should be emphasized that, in our rather poor electronic environment, digital filtering is crucial to routinely realizing good computerized data processing results (such as shown in Table II) with the ADP mode. Without digital filtering, some of the ADP data reduction routines would be "fooled" occasionally by noise spikes or the like, leading to the necessity for more extensive use of the slower, less precise IDP mode. This is illustrated in Figure 4, which shows an example of the effects of the digital filtering operation on a typical GLC peak (a portion of a larger data array). The original raw data in Figure 4 show one large noise spike on the peak which is typical of what is occasionally encountered (about 95% of our chromatograms are free from such artifacts) as a result of power line transients induced by solenoid activation and the like. The obvious option of power line conditioning was determined to be less cost effective, as well as less electronically effective, than the above-described software filter designed to eliminate noise spikes. In addition to the only occasional noise spikes, the more common forms of random noise evidenced in the raw data of Figure 4 are sufficiently reduced to make the data compatible with the subsequent data reduction strategies. We consider digital filtering to be a valuable part of computerized data processing. It provides the basis for using, with good fidelity, the relatively simple data reduction software which is more efficient regarding core and operating time than more sophisticated options, and it achieves this at lower cost than the electronic hardware options for improving signal/noise characteristics.

Conclusions

The minicomputer system as described has automated data acquisition and determination of peak


position, height, and area in a manner which we find compatible with data arrays generated in GLC, autoanalysis, and differential pulse polarography. Although it has not been attempted, it is anticipated that the system is equally applicable to any assay procedure which yields a peak-shaped response in a time frame comparable to GLC or autoanalysis (e.g., liquid-liquid chromatography and amino acid analysis). Without operator intervention, assays such as percent label declaration of pharmaceuticals by autoanalysis are obtained. The computer results are accurate, not susceptible to error, and in agreement with manually determined values. The economical operational amplifier interface has been applied to GLC and other instruments provided by numerous manufacturers. Because the entire system was developed in house, software changes are readily made to accommodate changes in instrumental methodology. For example, an auto-injector GLC also has been interfaced recently, so that the computer automatically initiates data sampling when the injection occurs and dumps the complete data array (chromatogram) onto magnetic tape when the injection time signifies that a run is completed.

Sufficient core remains to expand the system to handle 16 instruments on-line. When this is done, the computer hardware cost per detector monitored would be less than the cost of many noncomputerized electronic hardware options of greater simplicity and less versatility, e.g., commercial digital electronic integrators for each detector.

References

(1) A. J. Raymond, D. M. G. Lowrey, and T. J. Mayer, J. Chromatogr. Sci., 8, 1 (1970).
(2) S. W. Downer, Amer. Lab., 6 (2), 93 (1974).
(3) W. N. Shannon, Lab. Management, (3), 26 (1972).
(4) R. A. Landowne, R. W. Morosani, R. A. Herrman, R. M. King, Jr., and H. G. Schmus, Anal. Chem., 44, 1961 (1972).
(5) D. A. Craven, E. S. Everett, and M. Rubel, J. Chromatogr. Sci., 9, 541 (1971).
(6) M. Shapiro and A. Schwitz, Anal. Chem., 43, 398 (1971).
(7) F. Bauman, H. C. Brown, and M. B. Mitchell, J. Chromatogr. Sci., 8, 20 (1970).
(8) D. Glover, Res. Develop., 24 (5), 22 (1973).
(9) S. P. Perone, Anal. Chem., 43, 1288 (1971).
(10) J. Frazer, ibid., 40 (8), 26A (1968).
(11) R. E. Anderson, J. Chromatogr. Sci., 10, 8 (1972).
(12) L. L. Alber, M. W. Overton, and D. E. Smith, J. Ass. Offic. Anal. Chem., 54, 620 (1971).
(13) M. W. Overton, L. L. Alber, and D. E. Smith, ibid., 56, 140 (1973).
(14) "Pesticide Analytical Manual," Vol 1, Food and Drug Administration, Washington, D.C., 1968.
(15) "Drug Autoanalysis Manual," 2nd ed., Food and Drug Administration, Washington, D.C., 1973.
(16) A. Savitzky and M. Golay, Anal. Chem., 36, 1627 (1964).
(17) B. Gold and C. M. Rader, "Digital Processing of Signals," McGraw-Hill, New York, N.Y., 1969.
(18) J. D. Hettinger, J. R. Hubbard, J. M. Gill, and L. A. Miller, J. Chromatogr. Sci., 9, 710 (1971).
(19) J. W. Frazer, L. R. Carlson, A. M. Kray, M. R. Bertoglio, and S. P. Perone, Anal. Chem., 43, 1479 (1971).
(20) S. P. Perone, J. W. Frazer, and A. M. Kray, ibid., p 1485.
(21) "U.S. Pharmacopeia," 18th rev., Mack Publ. Co., Easton, Pa., 1970.
(22) "National Formulary," 13th ed., American Pharmaceutical Assoc., Washington, D.C., 1970.
(23) L. L. Alber, M. W. Overton, and D. E. Smith, J. Ass. Offic. Anal. Chem., 56, 659 (1973).
(24) "FOCAL-12 Manual" (DEC-12AJAA-D), Digital Equipment Corp., Maynard, Mass., 1971.
(25) R. S. Swingle and L. B. Rogers, Anal. Chem., 43, 810 (1971).
(26) S. P. Levine, J. L. Naylor, and J. B. Pearce, ibid., 45, 1560 (1973).
(27) S. C. Creason and D. E. Smith, ibid., p 2401.

The use of equipment described herein does not constitute endorsement by the Food and Drug Administration. Inquiries should be addressed to Larry L. Alber.
