Instrumentation

M. L. Salit
Department of Chemistry
Arizona State University
Tempe, Ariz. 85287

M. L. Parsons
Los Alamos National Laboratory
CHM-1, M/SG-740
Los Alamos, N.M. 87545
Recent advances in computing technology (1-3) and related drops in the cost of computing capability have spawned a new generation of analytical instrumentation. This instrumentation makes measurements that demand a computer as an integral part of control, data acquisition, or data reduction. The symbiosis of the computer and the measurement in these systems has developed into what might be termed software-driven instrumentation. Depicted in Figure 1 is a generic instrument. The goal of this instrument is to make an observation of the phenomenon of interest. This phenomenon may occur under controlled conditions, with feedback through the control and sensor hardware, through the measurement hardware, to the software drivers that structure the experiment. Although the modular boundaries of such systems vary, the functionality of the modules is distinct. This instrumentation offers the
analyst a tool that is capable of making measurements otherwise impossible or prohibitively expensive to make. It is a flexible tool that can be customized to specific tasks through modification of the software instructions that drive the processing, control, and data acquisition hardware. New computing technology has made the shaping of our software-driven tool easier than ever, allowing the nature of the measurement to be defined by software. The instrumentation for the measurement can be fixed, whereas a software "skeleton" defines the way in which the instrument components behave together. This permits radical changes in the experiment while minimizing costly changes in the hardware. Another important area in which software-driven measurement systems have found acceptance is environments that call for versatility of computer control, data acquisition, and data treatment, although the measurement system by its nature does not demand it.
Figure 1. A generic software-driven instrument (software drivers, measurement hardware, control and sensor hardware, and the phenomenon under observation)
The features offered by such a system greatly enhance the utility of standard analytical instrumentation. This type of application might be a network of instruments integrated into a laboratory information management system (LIMS). Each instrument need not be a software-driven instrument, but computerization allows the advantages of a cohesive, sophisticated sample-tracking and report-generating system to be effected. Generally the computerization of an instrument creates an easy-to-use measurement system, eliminating the need for the attention of a skilled analyst and reducing the time demands
on a technician. Data reduction and data archiving are simplified and more accurate with the calculating power of the computer, liberating more analyst time. Often the sample throughput of a measurement system can be dramatically increased through use of computers in a time-consuming phase of the measurement—be it measurement time, instrument preparation, or data processing. The reliability of measurements obtained from such a system will of course be improved by eliminating the human element in the measurement process. The history of software-driven instrumentation stems from the time that the first minicomputers became available for dedication to instrumentation in a laboratory environment. These computers were expensive, difficult to program, limited in memory and storage, and perhaps most important, limited in their ability to handle real-world input and output (I/O). Design sacrifices were necessary to make the instrumentation conform to the capabilities and limitations of the computer. Despite these limitations, the computer offered a far more efficient method for precise control, data acquisition, and processing than manual or analog electronic methods, and analytical chemists took advantage of these improvements.
There have been refinements in processing capability, memory integration, and I/O capacity of hardware. The level of integration (functionality per component) of computer ICs is such that today's most complex computers may have a lower part count than their simpler predecessors. This decreases the complexity and cost of the hardware system and enhances its speed and reliability. The new processors not only process more information in less time, but they often support sophisticated architectures or programming structures to enhance capability and programmability. This enhanced speed and programmability not only make it easier to do the things that were done with the earlier technology, but open up applications that were previously impossible. The new memories not only quadruple the amount of information stored on a chip but are smaller and consume less power. Inexpensive, intelligent peripherals allow system input or output in many forms without burdening the processor or other resources. New measurement schemes have been envisioned in many areas as a result of the availability of this new technology. The high level of control over real-world parameters affecting the phenomena to be observed offers precision otherwise unobtainable, and
the data acquisition and timing capabilities offer the analyst unique flexibility in determining how much data to obtain and when to obtain it. The speed of data acquisition hardware is such that extremely fast, transient physical phenomena can be "stroboscopically" stopped in time, allowing for direct observation of a signal of interest while often discriminating against other signals. These abilities have resulted in new techniques in such areas as atomic, molecular, and mass spectroscopy; surface analysis; polymer characterization; and electrochemistry. Today's data-processing capabilities, in both hardware and software, offer the analyst the opportunity to use sophisticated data reduction on masses of data too large to be treated manually. New techniques have grown out of the ability to use pattern recognition on difficult-to-understand data. Useful information can often be recognized and quantified by using chemometric techniques to reduce large, multivariate data sets to their meaningful relationships. This approach is vital when it is the nature of the analytical probe to disorganize, as in a combustion or pyrolysis experiment. Trends in the laboratory are toward automation of sample handling and
measurement through both fixed (autosamplers, autotitrators) and flexible (robotic) automation systems. Integration of automation systems with instrumentation is most effectively achieved when the experiment is under the control of a computer; the software drivers can be optimized to perform in the required manner for coordination with the automation system. In addition to the measurement advantages, software-driven instrumentation incorporates features brought to analytical instrumentation from computing—ranging from self-diagnostics to expert systems for automatic tuning or optimization to software integration, the ability of programs to share data and offer the user a consistent environment in which to operate on that data. Software integration in the laboratory is a requirement for implementation of a widely accepted, flexible, and efficient LIMS. The electronic laboratory is an achievable concept, with a sample entering the laboratory and being tracked throughout several analyses, the data collated and analyzed, and a report generated without manual data handling. Instrumentation developers are incorporating these sophisticated computing concepts and tools during the design phases of their projects. This
allows for optimization of design, tailoring the computing strengths and limitations to the particular needs of the instrumentation. In many cases the tolerance of the physical measurement apparatus can be significantly relaxed by using feedback techniques in the measurement and control systems. To take full advantage of such features of a computerized instrument, the design of the measurement apparatus, the measurement hardware, and the software drivers must be concurrent and integrated. It is the integration of these three areas of a computerized instrument that allows full exploitation of the capabilities of all subsystems. With tools currently available to the developer, this is a feasible, cost-effective approach with benefits to all phases of the project.

Architecture

Developments in computing have kept the developers of analytical instrumentation actively upgrading their systems (2, 3). For example, an architecture that is having a major impact on software-driven instrumentation is distributed processing, a system in which multiple processors work in cooperation to effect a solution to a problem. This configuration offers increased speed (by sharing tasks), lower cost (several microprocessors are
less costly than a minicomputer), increased modularity of both hardware and software (offering greater reliability and easier modification), and increased standardization in the laboratory. Typically these multiple-processor systems are segregated into host and slave computers, with specific tasks assigned to each. Host systems generally encompass the user interface, data-processing, and system output functions. These computers range from the popular personal computers (PCs) to highly specialized machines tailored to the management of the instrument. It is the task of the host computer to provide control of the slave system, to provide mass storage for archiving data, and to handle the processing and reporting of analytical results. A host computer need not be dedicated to a particular instrument; it can function as a general-purpose computer—at the disposal of laboratory personnel to improve productivity. Specialized host systems may use array processors for performing Fourier transforms at high speed, floating-point calculation hardware, and image digitizers for the acquisition of video data. Slave computers are dedicated to the instrumentation, providing control and data acquisition capability.
Figure 2. A typical distributed-processing environment. The user enters parameters on the host computer; the host compiles and downloads a program; the slave system stores and executes the program while the host is free; the slave transmits data to the host and may repeat the program; the host saves and processes the data
These computers either control or directly interface with the measurement electronics. They have very simple user interfaces (start-stop switches or indicators), if any, and appear to the analyst as black boxes. The level of programmability of the slave system varies from none—the control of the slave is from instructions in read-only memory (ROM)—to complex programs reacting to feedback from external inputs, programmed from a host system with instructions compiled at the slave system. The slave may "buffer" or hold the data it collects in random access memory (RAM), freeing the host from I/O during the experiment, and may preprocess data according to algorithms in ROM or downloaded to program RAM, again offloading a task from the host. A typical distributed-processing measurement system is depicted in Figure 2. Note the slave's capability for independent action once it has been programmed by the host. This configuration allows a single host to service multiple slave computers. Slaves can be generic; they can be moved from instrument to instrument or from one technique to another with only a change in the downloaded software. This provides an easy upgrade path for the development of a LIMS in a laboratory, permitting slave measurement systems to be added as needs arise and resources become available.
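The division of labor in Figure 2 can be sketched in a few lines of code. The following is a minimal, hypothetical Python illustration only; the Slave class and the download, run, and transmit names are invented for this sketch, and a real slave is a separate processor reached over a bus or serial link, not an object in the host's memory.

```python
# Minimal sketch of the host/slave split in Figure 2 (hypothetical names;
# the "measurement" is simulated so the example runs on its own).

class Slave:
    """Dedicated measurement processor: executes a downloaded program,
    buffering readings in RAM so the host is free during the experiment."""

    def __init__(self):
        self.program = None
        self.buffer = []          # data held in RAM until the host asks

    def download(self, program):
        self.program = program    # host compiles and downloads the program

    def run(self, n_points):
        # Executes independently of the host once programmed.
        self.buffer = [self.program(i) for i in range(n_points)]

    def transmit(self):
        data, self.buffer = self.buffer, []
        return data               # slave transmits data to the host


def acquisition_program(i):
    """Stand-in for downloaded slave instructions: set a control output,
    then read the sensor (simulated here as a simple function of i)."""
    return 0.05 * i

# Host side: enter parameters, download, let the slave run, then collect.
slave = Slave()
slave.download(acquisition_program)
slave.run(n_points=16)            # host is free while this executes
data = slave.transmit()           # may be repeated for further runs
print(f"collected {len(data)} points; mean = {sum(data)/len(data):.3f}")
```

Because the slave owns its buffer, the host is idle between download and transmit, which is exactly what allows one host to service several slaves.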
Hardware

Different measurement systems have different requirements for processing capability. The range of microprocessors available today offers performance from inexpensive (less than $5) eight-bit chips to highly integrated 16/32-bit microcomputers (several hundred dollars). The nature of the application defines the level of sophistication required of the processor to meet certain performance criteria. A decision on a given architecture and processor is, therefore, a necessary and important segment of the design phase of the measurement system. Availability of software development capability for a given processor may be as valid a reason for choice of a processor as its performance. Compatibility with installed equipment, cost, availability, and compatibility with peripheral devices are all important factors in the selection of a processor. Often an elegant solution to a measurement problem can be implemented with a simple processor—with lower development overhead than with more complex chips. The purpose of a measurement system is to provide control outputs to a phenomenon and to input information about the phenomenon for further analysis. Clearly the I/O capability of the computer system is critical in a measurement environment. Today's I/O capabilities are impressive, and it can be presumed that trends to increase the amount of information that can be gathered and the speed at which it is transferred will continue. For example, developments in imaging technology have been exciting—the first commercial example in analytical instrumentation is linear photodiode arrays for the multiplexed acquisition of UV-VIS spectra. This technology extends to 512 × 320-pixel imaging devices, which demand high data transfer rates for the large amount of information (160 Kbits) to be passed and processed for each image.
Software

An important current trend in software involves making computer use transparent to the operator. New systems architectures support this trend, incorporating sophisticated user interfaces (pointing devices, pull-down menus, integrated environments) and high-level support for the control and programming of these interfaces. Additional features included in the instrumentation software package may include limited data base capabilities, report generation capability, network support, and integration with standard PC software. This integration will provide a key link between the electronic laboratory and the electronic office. Instrumentation software packages have been affected by these trends; no longer is the software coded in native assembly language and proprietary development languages, a practice that results in software that is difficult to maintain and almost impossible to integrate with other packages. New high-level languages are available with good structure and efficiency suitable for the creation of instrument software packages. Time-critical routines are often still coded in native assembly language for efficiency, but the bulk of the software is written in high-level languages. This has significantly lowered the overhead involved in the development of software-driven instrumentation in addition to increasing the complexity, capability, and flexibility of the package.

Instruments

Examples of software-driven instrumentation include a wavelength-modulated continuum source multielement atomic absorption spectrometry (AAS) system (4, 5), a commercially available Fourier transform infrared (FT-IR) spectrometer (6) that breaks new ground in qualitative and quantitative IR measurements through the use of innovative software, and a pyrolysis gas chromatograph with parallel mass spectral and flame ionization detection for the characterization of polymer systems (7, 8).

The multielement AAS system uses a high-resolution Echelle spectrometer, a high-pressure xenon arc lamp continuum source for excitation over a large spectral range (200-600 nm), a refractor plate mounted on a computer-controlled galvanometric torque motor for wavelength modulation, and a high-speed analog-to-digital (A/D) converter for the synchronous measurement of 16 absorbances (Figure 3). The nature of the measurement scheme (the relationship between data acquisition and wavelength modulation) demands a computer for its implementation.
Figure 3. A multielement atomic absorption spectrometer (PMT = photomultiplier tube). The system comprises an Echelle polychromator with multielement cassette, a 300-W power supply, a torque motor controller with digital/analog converter, a 16-channel multiplexer and high-speed analog/digital converter system, a real-time event clock, and an LSI-11 control and data acquisition computer (MINC-23 lab computer, 256 Kbytes RAM, dual RX02 floppy disks) with terminals, printers, and a digital plotter
A five-step waveform is used to drive the torque motor, allowing data acquisition to be performed at five wavelengths across the absorbance profile. These wavelengths correspond to two background intensity measurements, two off-center intensity measurements, and a line center intensity measurement. The capability to measure the intensity at an off-line wavelength permits extension of the dynamic range of an absorbance measurement to higher concentration through the calculation of a secondary, less sensitive absorbance. The background intensity measurements allow for dynamic background correction—a technique effective in discriminating against the nonspecific absorbance commonly observed in the graphite furnace atomizer. To implement this measurement scheme, a timed, interrupt-driven control architecture is established, with digital-to-analog conversion from a waveform array controlling the refractor plate position (wavelength) and high-speed multiplexed A/D conversion being performed after motor settling. The resultant array of multielement intensity data (~192 Kbytes) is then converted to arrays of absorbance data, one per element, an approximately 3-min task on a PDP 11/23 with a hardware floating-point processor. Such a measurement scheme and data reduction method are far too unwieldy to implement in an analog fashion.
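The acquisition cycle can be outlined roughly as follows. This is a hypothetical Python sketch, not the actual LSI-11 code: set_dac, read_adc, and the simulated intensity values are invented stand-ins, and the timed interrupt structure is reduced to a simple loop with a settling delay.

```python
import math
import time

# Hypothetical hardware helpers -- stand-ins for the real DAC/ADC drivers.
def set_dac(level):
    """Drive the torque motor to position the refractor plate."""
    pass  # would write to the digital-to-analog converter

def read_adc(channel, step):
    """Read one multiplexed PMT intensity (simulated values here)."""
    simulated = [1.00, 0.60, 0.25, 0.60, 1.00]  # profile across the line
    return simulated[step] if channel == 0 else 1.0

WAVEFORM = [-2, -1, 0, 1, 2]  # five plate positions: background, off-center,
                              # line center, off-center, background
SETTLE_S = 0.002              # allow the motor to settle before conversion

def scan_element(channel):
    """One modulation cycle: background-corrected line-center and
    secondary (less sensitive, off-center) absorbances."""
    intensities = []
    for step, level in enumerate(WAVEFORM):
        set_dac(level)
        time.sleep(SETTLE_S)
        intensities.append(read_adc(channel, step))
    background = 0.5 * (intensities[0] + intensities[4])  # dynamic correction
    a_line = math.log10(background / intensities[2])
    a_secondary = math.log10(background / intensities[1])
    return a_line, a_secondary

print(scan_element(channel=0))  # -> (~0.602, ~0.222)
```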
The FT-IR system with vector-based software from Beckman Instruments is one of the new generation of low-cost Michelson-interferometer-based instruments. This system uses a distributed-processing architecture to implement control and data acquisition from the interferometer, with a processor devoted to managing the sophisticated user interface (light-pen-driven graphics and analysis) and data processing. Specialized hardware permits rapid transforms of the data between the frequency and time domains as well as between the absorbance and transmittance domains. The phenomenal growth in FT-IR instrumentation is directly attributable to the availability of suitable, low-cost, computer-driven measurement systems. The control requirements for the mirror drive are exacting, as is the data acquisition timing.
Figure 4. (a) Two chemical vectors; (b) two chemical vectors and a mixture of them resolved
These requirements are easily met with a software-driven measurement system, whereas analog implementation of these tasks to the precision required would be prohibitively expensive. This instrument is unique in that it performs vector-based data analysis in the time domain. This is a calculation-intensive process, suited only for computer implementation, which offers important advantages in using the spectral information to its fullest extent. Instead of the traditional methods of encoding digitized spectra to reduce the amount of data used in library search techniques for qualitative analysis, spectra can be represented in the time domain as vectors of mathematical coefficients of a Fourier series. This permits spectra to be compared—independent of concentration—for identification purposes through the angles between their vectors. Quantitative analysis of the recorded spectrum is performed after the unknown vector has been resolved into the component vectors of interest (pure compounds). It is not necessary to resolve all component vectors to perform quantification on a given component; by using principal factor analysis the component vectors can be orthogonalized. This allows the known components to be independently determined and subtracted from the net unknown vector, leaving a determinable residue vector for further quantification (Figure 4).
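The essential linear algebra can be illustrated numerically. The following is a hypothetical Python sketch of the general vector-angle and least-squares ideas, not Beckman's actual algorithm; the toy coefficient vectors and the names comp_a and comp_b are invented.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    """Angle between two spectral vectors; scaling either vector leaves it
    unchanged, so identification is independent of concentration."""
    c = dot(u, v) / math.sqrt(dot(u, u) * dot(v, v))
    return math.acos(max(-1.0, min(1.0, c)))  # clamp for floating-point safety

# Toy "spectra" as short coefficient vectors (real vectors are far longer).
comp_a = [1.0, 0.2, 0.0, 0.4]
comp_b = [0.1, 0.9, 0.3, 0.0]
mixture = [2.0 * a + 0.5 * b for a, b in zip(comp_a, comp_b)]  # 2 A + 0.5 B

print(angle(comp_a, [3.0 * a for a in comp_a]))  # 0.0 -- concentration-free

# Resolve the mixture onto the known components by least squares
# (2 x 2 normal equations), then subtract to leave the residue vector.
g11, g12, g22 = dot(comp_a, comp_a), dot(comp_a, comp_b), dot(comp_b, comp_b)
y1, y2 = dot(mixture, comp_a), dot(mixture, comp_b)
det = g11 * g22 - g12 * g12
c_a = (y1 * g22 - y2 * g12) / det  # -> ~2.0
c_b = (y2 * g11 - y1 * g12) / det  # -> ~0.5
residue = [m - c_a * a - c_b * b for m, a, b in zip(mixture, comp_a, comp_b)]
print(c_a, c_b, max(abs(r) for r in residue))  # residue ~0 here; it is
# nonzero when the mixture contains a component not yet resolved
```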
These mathematical tools yield heretofore unobtainable information from IR spectrometry, and they are opening new doors in the analysis of mixtures of organics. This approach to using IR spectral data could not have come about without a software-driven measurement system that permitted the complex mathematical manipulation of data obtained in a rigid, demanding measurement scheme.

The polymer reconstruction investigative chromatopyrography system conceptually outlined in Figure 5 is used to characterize polymers and polymer formulations. Stepwise analysis of the analyte system of interest is generally performed, with an outgas step to analyze the volatile components and a pyrolysis step to indirectly gather both qualitative and quantitative information on the sample. After collation of the data set, the use of library data base searching and chemometric tools permits a model reconstruction of the original sample. This model can then be used to predict physical and chemical characteristics of the system of interest.
Figure 5. Pyrolysis GC and parallel FID-MS system for polymer characterization. Headspace volatiles, spectroscopic determinations, and elemental data feed pattern recognition and library comparisons, yielding qualitative and quantitative results and a best-fit reconstruction
To realize this polymer characterization capability, reproducible chromatopyrography had to be achieved. Through the application of a unique cryofocus device, the large dead-volume injections required for chromatopyrography were rendered feasible, with little deleterious effect on the chromatography. Reproducible cryofocus and chromatography event control is necessary for reproducible chromatopyrography. This polymer characterization system uses distributed processing to implement its parallel measurement and control requirements. Control of the pyrolysis, cryofocusing, and chromatography events and flame ionization detector data acquisition are accomplished with one slave computer; control of and data acquisition from the mass spectrometer are performed by another slave computer. These data are then collated, reduced, and archived in a desk-top host computer, which may communicate this data set to a mainframe computer for chemometric analysis. The data interpretation requirements are well beyond manual methods, requiring the analysis tools of chemometrics to observe trends and surpass simple comparisons to achieve a model of polymer characteristics. This system is an example of a measurement process that grew from the availability of good control I/O and data reduction tools; without such tools the disorganizing probe of pyrolysis would yield only fingerprint information. Inception as a software-driven instrument has resulted in a system with far more potential.
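On the host side, the collation step amounts to merging the two slaves' time-stamped streams into one ordered record. A minimal, hypothetical Python sketch follows; the record layout and all values are invented for illustration.

```python
import heapq

# Hypothetical time-stamped records from the two slave computers.
fid_data = [(0.0, "FID", 0.01), (0.5, "FID", 0.03), (1.0, "FID", 0.40)]
ms_scans = [(0.2, "MS", {43: 120, 57: 80}), (0.9, "MS", {43: 900, 57: 610})]

# Host-side collation: merge the two streams into one time-ordered record
# for reduction, archiving, and later chemometric analysis.
collated = list(heapq.merge(fid_data, ms_scans, key=lambda rec: rec[0]))
for t, source, payload in collated:
    print(f"t={t:4.1f} s  {source}: {payload}")
```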
Conclusion

The performance of software-driven instrumentation depends on the software performing the tasks of control, data acquisition, and data processing. To quote Alan Kay: "Computers are to computing as instruments are to music. Software is the score, whose interpretation amplifies our reach and lifts our spirit . . . The same notation that specifies elevator music specifies the organ fugues of Bach"
(1). This notion implies that the quality of the measurement system as a whole may to a large degree be associated with the quality of the software code and architecture. For analytical instrumentation this software should be "bulletproof" against operator error and often should mimic standard instrumentation. These features will ensure ready acceptance of the new wave in instrumentation in the analytical laboratory. It should be reassuring to the analytical chemist that the sophistication of computing in analytical chemistry
is rapidly tracking developments in the computing community, making instrumentation more capable, easier to use, and less expensive. The new wave in instrumentation is more precise, reliable, and flexible than conventional instrumentation. The effect on analytical chemistry has been to open doors previously closed or barely cracked. New techniques have arisen (polymer reconstruction investigative chromatopyrography), as have new insights into established techniques (vector-based FT-IR). Increased efficiency and capability can be incorporated into conventional techniques (multielement AAS), and the widespread use of software-driven architecture surely presages a new phase of automation in the laboratory. The role of analytical chemists will not change drastically, although the tools they have available to probe problems will allow more thorough investigation. The time the analyst spends on a problem will be redistributed toward data analysis and away from data collection as instrument throughput and the amount of information yielded by an experiment increase. The analytical chemist will not need to be an expert in computing to survive in the modern laboratory, although an awareness of an instrument's architecture and its inherent capabilities and limitations will enhance creativity in experimentation and the chemist's faith in the analysis. It is the responsibility of instrument designers to inform the analytical chemist of the nature of the instrument through complete, current documentation of all aspects of the system, both for the analyst's creative use and as reassurance that the measurement being made is free from interference due to inherent design limitations. We are quickly approaching the integrated laboratory, where everything from the inventory of reagents to the balances to the sophisticated instrumentation is integrated into a network easily accessed by the analyst. Although this may alter the nature of his or her job, there is no doubt that in a proper implementation both the productivity and creativity of the analyst will be enhanced.
References

(1) Kay, A. Sci. Am. 1984, 251, 52.
(2) Dessy, R. E., Ed. Anal. Chem. 1985, 57, 77 A.
(3) Dessy, R. E., Ed. Anal. Chem. 1985, 57, 310 A.
(4) Harnly, J. M.; O'Haver, T. C.; Golden, B.; Wolf, W. R. Anal. Chem. 1979, 51, 2007.
(5) Salit, M. L.; Parsons, M. L. "Development of a Multielement Graphite Furnace AA System," presented at Labcon West, Session 20, May 8-10, 1984, Long Beach, Calif.
(6) Obremski, R. J. "Vector Based Software," Technical Information Paper T-1590-IR-84-27, Beckman Instruments, 1984.
(7) Rogers, S. P.; Evans, K. "A Unique Experimental Design for Characterizing Complex Organic Matrices," presented at the Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy, Abstract 023, March 5-9, 1984.
(8) Rogers, S. P.; Salit, M. L.; Parsons, M. L. "Development of the Investigative Portion of PRIC," presented at the 1984 Pacific Conference on Chemistry and Spectroscopy, Abstract PP17, Oct. 11-12, 1984, Sacramento, Calif.
Marc Salit received his BA from Skidmore College in 1981 and is completing work for his PhD at Arizona State University under M. L. Parsons. His research interests include instrument design, computerized measurement systems, laboratory automation, and application of current software tools in the analytical laboratory.