TECHNOLOGY

Computers stand out at Pittsburgh Conference

Hardware, software, input/output, interfacing, computers, closed-loop control—in short, the jargon of systems. In Cleveland, the words hovered not over the spring Joint Computer Conference but the Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. And as the Pittsburgh Conference made strongly evident this year, they represent concepts that the analytical chemist must soon become familiar with, if he isn't already. In as little as a year's time, the leading question for analysts, involving computers, has changed from "What can I do for my particular instrument?" to "How can I automate my laboratory?" More and more, they are getting answers. As befits the systems concept, however, the answers aren't the familiar type. When the emphasis was on new forms of instrumentation or advances in existing forms, the analyst had for the most part merely to check the catalogs. A given instrument would do what he wanted or it wouldn't. He could afford it or not. "Tailoring" is the key word with systems.

When the analyst asks "What have you got?" the answer is most likely to be "What do you want?" The purpose of the laboratory or the instrumentation, the relationship of the laboratory to other parts of the company or organization, company economics beyond individual budgets, and computer sophistication of analytical personnel all become significant. These are the considerations that also help to place in perspective new offerings from the instrument manufacturers, many of whom make computers as well. At one extreme, for example, is the satellite computer concept put forth by Varian as a way to time-share various instruments on a large computer. An example of the other extreme is Hewlett-Packard's new Model 7600A gas chromatograph, which provides a punched tape of the analysis which can then be fed to a computer through any standard time-shared computer terminal. In between are new packaged systems—instrument, computer, software—for x-ray spectrometry from Philips Electronic Instruments and for mass spectrometry from Avco and Perkin-Elmer.

Orion Research's systems built around various combinations of ion-selective and pH electrodes, and temperature and pressure sensors, are further examples. From computer manufacturers comes another variation, with hardware and software designed for analytical applications. IBM's LAB (laboratory automation based) systems are one example. Others are Digital Equipment Corp.'s GLC-8 system for gas chromatographs and its new PDP-12 computer system designed specifically for the laboratory. Also, there are Electronic Associates' packaged Pace III systems for autoanalyzers, gas chromatographs, mass spectrometers, and physical testing instruments. Com-Share, Inc.'s time-sharing service is a somewhat more specialized offering. This proliferation of systems-oriented offerings from manufacturers exhibiting at the Pittsburgh Conference, coupled with the jammed technical sessions on computer applications, attests to the broad acceptance by analysts that computers have entered the laboratory to stay.

ALL-COMPUTER SHOW. Nearly everything on the floor of Cleveland's convention center was computerized—like Philips' digital spectrometer (left) and Jarrell-Ash's Atomcounter 750

What to do with them now that they are there, however, is perhaps not always so clear. One reason is that application and implementation are not always easily separable, although some distinction can be made. Generally, in laboratory applications, the computer handles data acquisition and data processing. Further, it can control the instrument. In processing data, the exact operations performed by the computer may vary from instrument to instrument. Rapid integration, signal averaging to enhance signal-to-noise, or deconvolution of overlapping wave forms are some of the possible operations. With gas chromatographs, to pick one example, the computer can identify peaks, integrate areas, normalize areas to total area, correct for baseline drift, apply correct response factors, and calculate component concentrations. Such operations can, in general, be performed off-line or on-line.
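In modern terms, the gas chromatograph arithmetic amounts to something like the minimal sketch below. It is an illustration only, not Sun Oil's or any vendor's program; the peak windows and response factors are assumed inputs, and the straight-line baseline is the simplest possible choice.

```python
# Minimal sketch of the GC data-processing steps described above: baseline
# correction, peak-area integration, response-factor correction, and
# normalization of areas to percent composition. Peak windows and response
# factors are hypothetical inputs supplied by the analyst.

import numpy as np

def quantitate_chromatogram(time, signal, peak_windows, response_factors):
    """Return percent composition for each identified peak.

    time, signal      -- digitized detector trace
    peak_windows      -- (start, end) retention times bracketing each peak
    response_factors  -- relative response factor for each peak, same order
    """
    time, signal = np.asarray(time), np.asarray(signal)

    # Crude baseline-drift correction: straight line through the end points.
    baseline = np.interp(time, [time[0], time[-1]], [signal[0], signal[-1]])
    corrected = signal - baseline

    corrected_areas = []
    for (start, end), factor in zip(peak_windows, response_factors):
        mask = (time >= start) & (time <= end)
        area = np.trapz(corrected[mask], time[mask])   # integrate the peak area
        corrected_areas.append(area * factor)          # apply the response factor

    total = sum(corrected_areas)
    return [100.0 * a / total for a in corrected_areas]  # normalize to 100%
```

An on-line system would apply the same arithmetic as the run proceeds rather than to a stored trace.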

On-line, closed-loop operation, however, extends the advantages of computer operation to the experiment or analysis itself. In research applications, for example, the computer is capable of real-time interaction with the experiment, a necessity for some types of experiments where real time is measured in milliseconds. For routine analyses, such as in a quality-control lab, the computer can automate the instruments, consequently minimizing routine lab effort and making the performance of tedious measurement tasks feasible. In any event, the end results of computer application are improved accuracy and increased productivity. In addition, the computer makes possible some experiments that couldn't previously be carried out. Pittsburgh Conference technical sessions provided examples of several types of applications. At Sun Oil, in Marcus Hook, Pa., an IBM 1800 computer is now being used for quantitative data acquisition from 16 gas chromatographs. Various makes of chromatographs are used, with packed and capillary columns and with various detectors.

Both isothermal and temperature-programed operations are carried out for routine petroleum analyses, such as alkylate, naphtha, and C1 to C5 gas samples. The computer system was installed, Sun Oil's Arthur J. Raymond says, to provide 24-hour turnaround service on all samples submitted for analysis. The laboratory handles 500 to 800 routine plant samples each month. At Varian, computer methods have been developed to adapt nuclear magnetic resonance (NMR) spectrometry to quantitative analysis. NMR has been used at times for quantitative analysis, Varian's L. H. Smithson explains, but it hasn't been widely accepted. Without computer capabilities, the degree of operator skill required to get high-quality data, he says, is extremely high and often the data just aren't worth the effort. The specific analysis chosen by Varian as a prototype for working out computer techniques is that of edible oils. The spectrometer is a Varian Model T-60 and the computer system is Varian's Spectrosystem 100.

By keying on the triglyceride protons, the instrument produces data which are then related to iodine number, a measure of unsaturation. Only three parameters, Dr. Smithson says, must be entered by the operator—starting point for the RF sweep, the range over which it should be made, and the rate. After calibrating sweep circuitry, the computer acquires the spectrum according to the entered parameters. It then corrects the baseline, integrates the data, and performs calculations for iodine number. It acquires a number of spectra in this way (however many are desired), calculates standard deviation, and prints out results. The precision achieved so far, Dr. Smithson says, is within two iodine numbers of titration values.
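Reduced to its essentials, the routine Dr. Smithson describes might look like the sketch below. The acquire_spectrum interface and the linear calibration from integral to iodine number are hypothetical stand-ins, not the Spectrosystem 100 software.

```python
# Sketch of the quantitative NMR routine described above: acquire several
# spectra using the three operator-entered sweep parameters, baseline-correct
# and integrate each, convert the integral to an iodine number, and report the
# mean and standard deviation. The calibration constants are hypothetical.

import statistics

def iodine_number_run(acquire_spectrum, n_spectra,
                      sweep_start, sweep_range, sweep_rate,
                      slope, intercept):
    """acquire_spectrum(start, range_, rate) -> list of (frequency, intensity) points."""
    results = []
    for _ in range(n_spectra):
        spectrum = acquire_spectrum(sweep_start, sweep_range, sweep_rate)
        intensities = [y for _, y in spectrum]
        baseline = min(intensities)                        # crude baseline estimate
        integral = sum(y - baseline for y in intensities)  # integrate the corrected signal
        results.append(slope * integral + intercept)       # assumed linear calibration
    mean = statistics.mean(results)
    spread = statistics.stdev(results) if n_spectra > 1 else 0.0
    return mean, spread
```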

Dr. Smithson points out the differences that must be considered between a computer system for a research spectrometer and one for routine analysis. For research use, he says, the computer should control as many functions of the spectrometer as possible. Interfacing should be flexible and the scientist should have easy access to it. Software should be flexible, versatile, and modular. And the scientist/system interface should be versatile and flexible. On the other hand, for routine analytical work, often carried out by a technician, the computer should be limited to just those functions necessary for the specific analysis. Interfacing should be simple, less flexible, and inaccessible. Software should be inflexible and limited to specific programs. And the operator/spectrometer interface should be limited to very specific functions.
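The distinction can be pictured as two configuration profiles. The sketch below is purely schematic; the profile fields and entries are invented for illustration and are not any vendor's configuration format.

```python
# Schematic contrast of the research and routine setups described above: the
# research profile exposes every spectrometer function to the scientist, while
# the routine profile is locked down to the few entries the analysis needs.
# All names and entries here are illustrative, not an actual product's.

from dataclasses import dataclass

@dataclass
class SystemProfile:
    controllable_functions: list   # spectrometer functions under computer control
    operator_parameters: list      # what the person at the console may change
    programs: list                 # software the profile will run

RESEARCH = SystemProfile(
    controllable_functions=["sweep", "lock", "decoupler", "temperature", "gain"],
    operator_parameters=["any"],              # full access for the scientist
    programs=["any user-written module"],     # modular, user-extensible software
)

ROUTINE_IODINE_NUMBER = SystemProfile(
    controllable_functions=["sweep"],         # only what this analysis requires
    operator_parameters=["sweep_start", "sweep_range", "sweep_rate"],
    programs=["iodine_number_run"],           # one fixed program
)
```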

Electroanalytical experiments at Purdue University provide an example of still another type of computer application. The experiments are designed to make use of the inherent capabilities of the computer to analyze initial measurements and to use this information in dictating the future course of an experiment. By using a computer, explains Sam P. Perone of the university's chemistry department, stationary electrode polarography experiments can be optimized for each sample. This is difficult to do with conventional analog instruments, he says, because the composition of each sample would have to be known to start with. In the operation, a voltage sweep is made and the computer monitors current-voltage data. When it detects a peak, it interrupts the sweep for a period of time dependent on the peak height, so that reduction of the element in the diffusion layer around the electrode can be completed before the sweep continues. This prevents one element in a mixture from masking the next. Interrupts last 100 to 1000 milliseconds; an uninterrupted sweep lasts about 1 second. In one particular mixture of lead and thallium, the measured signal for the lead reduction was improved from 50.8% of theory to 98%.
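In outline, the Purdue scheme is a sweep loop with a conditional hold, as in the sketch below. The hardware calls, the peak test, and the scaling of hold time with peak height are hypothetical; only the 100-to-1000-millisecond hold range comes from the description above.

```python
# Sketch of an adaptive stationary-electrode sweep: step the potential, watch
# the current, and pause after each detected peak so reduction in the diffusion
# layer can finish before the sweep continues. set_potential and read_current
# are stand-ins for whatever instrument interface is actually available.

import time

def adaptive_sweep(set_potential, read_current, v_start, v_end, step, dwell=0.001):
    """Sweep from v_start toward the more negative v_end, pausing at peaks."""
    readings = []
    v = v_start
    while v >= v_end:
        set_potential(v)
        time.sleep(dwell)
        i = read_current()
        readings.append((v, i))

        # Peak test: the previous point is a local maximum of the current.
        if len(readings) >= 3 and readings[-3][1] < readings[-2][1] > i:
            peak_height = readings[-2][1]
            hold = min(1.0, max(0.1, 0.1 * peak_height))  # 100-1000 ms, scaled with height
            time.sleep(hold)    # let the diffusion-layer reduction complete
        v -= step
    return readings
```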

CARBON PROBE. Jeolco's Y. Ogawa fits the company's new carbon-13 probe, which measures 13C by proton decoupling, into its high-resolution NMR unit

AUTOMATED CHROMATOGRAPH. Dow's D. F. Wisniewski (left) gets a rundown from Hewlett-Packard's R. Galli and J. Poole on capabilities of the HP 7600A

Application is just one aspect of lab systems. Implementation is the other. The particular instrument or instruments involved and their intended use may limit the choices. However, in general, there are four approaches that can be taken. Data can be digitized and punched on paper tape. They can then be processed later, either at a central computer facility or through a time-share terminal connected into an organization's own time-share system or into an outside service. Varian, for example, operates a time-share service nationally and has software programs designed specifically for analytical applications. Com-Share, one of several nationwide time-share services, made its first appearance at the Pittsburgh Conference this year and is prepared to take on analytical applications. A second option is to have a computer—usually a small one—dedicated to a single instrument. This is the most popular approach so far, and a number of new offerings are now available. Philips Electronic Instruments, for example, has put together such a package for its new PW 1220C x-ray spectrometer. Supplied with computer hardware and software, including specific application programs for cement, glass, steel, or lubricating oils, the system is available for $76,000. A Philips general-purpose P9200 computer is used and controls goniometer, sample, crystals, reflections, collimator, counter, mode of measurement, and vacuum delay.

Perkin-Elmer and Avco Corp. both have computer systems for mass spectrometers. The Perkin-Elmer system is designed for use with high-resolution mass spectrometers and handles off-line data processing. Priced at $55,000, it includes a PDP-8/I computer, disk files, interfacing, and software. Avco's system is designed for on-line control of its own spectrometers, controlling the scan and acquiring the data—for example, for determining isotopic abundances. At roughly $50,000, the system is based on an Avco PCU S-1200 computer. Computer manufacturer Digital Equipment Corp. has come out with the PDP-12, a new computer system designed specifically for the laboratory. The 12-bit, 4096-word core memory system is priced at $27,900 and includes two magnetic tape storage units, cathode ray tube display, 16-channel analog-to-digital converter and multiplexer, data terminal, Teletypewriter, and paper-tape reader and punch. It can use software developed for the LINC-8 computer, its predecessor, as well as the company's line of small PDP-8 computers.

Two specific features of the PDP-12, the company points out, are hardware program loading and data loading from magnetic tape. The former eliminates the time required to set up a program with a Teletype terminal, and data loading from magnetic tape eliminates dependence on relatively slow paper tape. Electronic Associates has developed its Pace systems further and is now offering the Pace III. Earlier systems were designed specifically for handling multiple gas chromatographs. Now, the company is supplying the Pace III in turnkey systems for gas chromatographs, autoanalyzers, mass spectrometers, and physical testing machines. The basic system contains 16,384 words of core memory, and turnkey systems start at $63,000. A third approach to implementing a laboratory computer system is to use a large computer dedicated to analytical work, but shared on-line by different instruments in different laboratories.

MODEL 403. Perkin-Elmer's new atomic absorption spectrophotometer comes equipped with digital readout that's linear in concentration

X-RAY SPECTROMETER. Philips' PW 1220C uses a general-purpose computer to control goniometer, sample, crystals, reflections, collimator, counter, and other operations

TWO BOOTHS. Jarrell-Ash featured its new Laser-Raman system among the instruments it took to Cleveland. Across the way, Varian displayed its mobile unit for demonstrating the pluses of its atomic absorption spectrometers

The arguments advanced for such a system point out that each user would have available the sophisticated input/output devices, disk files, and the like of a large computer, giving him more flexibility and computing power. At the same time, the cost, spread among a number of users, would generally be the same or less per instrument than for a small dedicated computer on each unit. Such systems are essentially custom-designed. However, conference attendees at the technical sessions heard a description of just such a system from Engelbert Zieger of West Germany's Max Planck Institute. A system being installed at the institute will handle up to 64 instruments, connected through a multiplexed analog-to-digital converter to a time-sharing PDP-10 computer. Up to eight of the 64 analog lines may be connected to high-speed instruments with high data-transfer rates. These include mass spectrometers and pulsed NMR spectrometers. The remainder are for slow-speed instruments, including gas chromatographs, mass, NMR, and ESR spectrometers, and infrared and Raman spectrophotometers. A fourth option, one being put forward by Varian, is a satellite computer system. Varian sees it as providing the best of both worlds. In this approach, small local dedicated computers are used with each instrument or group of instruments. These, in turn, are connected to a large central time-sharing computer which performs the more complex operations.

The central computer, Varian says, would process jobs for the satellites in a queuing mode. Thus, the programing would be much simpler than for a nonsatellite time-sharing operation, and throughput would be much faster. An analyst could program his dedicated computer in complete isolation from other users. And he could write, debug, and change central computer programs without interfering with other operations and without the need for mastering the complex time-sharing programing. Moreover, one user would not pre-empt the computer—as is often the case when, for example, a mass spectrometer comes on line.

Typically, the satellite computer would handle data acquisition, data reduction, preliminary processing, display, instrument control, intermediate storage, and interaction with the operator. The central computer facility would handle bulk input/output, bulk program storage, and data file storage, and it would perform arithmetic and file search routines requiring large amounts of memory.
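That division of labor can be pictured with a simple queue, as in the schematic sketch below. It is not Varian's software; the job format, class names, and the toy data reduction are invented for illustration.

```python
# Schematic sketch of the satellite concept: each local computer reduces its
# own instrument's data and queues the heavier work for the central machine,
# which processes the jobs in order so no single instrument pre-empts it.
# Everything here is illustrative.

from queue import Queue

central_queue = Queue()   # jobs waiting for the central time-sharing computer

class SatelliteComputer:
    """Small dedicated computer attached to one instrument."""

    def __init__(self, instrument):
        self.instrument = instrument

    def run_analysis(self, raw_points):
        reduced = [p for p in raw_points if p is not None]   # data reduction, preliminary processing
        local_result = sum(reduced) / len(reduced)           # quick answer shown to the operator
        # Hand bulk storage and memory-hungry work to the central facility.
        central_queue.put({"instrument": self.instrument,
                           "data": reduced,
                           "task": "file storage and search"})
        return local_result

def central_computer_service():
    """Central facility: work through the queued satellite jobs one at a time."""
    while not central_queue.empty():
        job = central_queue.get()
        # Bulk input/output, file storage, and large searches would go here.
        print(f"central computer: handled {job['task']} for {job['instrument']}")

# Example: two satellites submit work, then the central computer drains the queue.
gc = SatelliteComputer("gas chromatograph 3")
ms = SatelliteComputer("mass spectrometer 1")
gc.run_analysis([1.0, 1.2, None, 0.9])
ms.run_analysis([5.5, 5.7, 5.6])
central_computer_service()
```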