The panoply of electronic gadgets found in today's laboratory is the product of years of technological development.
Imagine a laboratory in which computerized instruments do not yet exist; in which there are no graphical interfaces, no Internet connections, no high-speed data processors or communication lines; in which all data are processed by slide rule or primitive calculator; and in which chromatographic peaks are painstakingly cut out and weighed by hand, or numbers are read off meters and jotted down on paper. And know that, not so very long ago, this was how laboratory experiments were performed. Indeed, for those of us who lived through the early days (the "dark days") of laboratory research, it is a time almost too painful to remember, particularly the amount of time spent trying to accomplish tasks that today are so commonplace and trivial. Thankfully, those times have passed into history, and relatively quickly at that: the rapid pace of technological innovation through the intervening years (1–8) has brought researchers dramatic relief, reducing previously laborious tasks to microsecond or even faster timescales. Beginning with the first widely distributed laboratory-based minicomputer, a retrospective tour of the remarkable advances that led to today's modern computerized instrumentation shows just how far we've (thankfully) come.

In the beginning …

FIGURE 1. The first widely distributed minicomputer. The Digital Equipment Corp. workhorse PDP8 minicomputer helped spawn an entire industry. (Photo: Digital Equipment Corp.)

The first widely distributed minicomputer, the Digital Equipment Corporation (DEC) PDP8, was shipped in 1965 (Figure 1). The instrument's success, particularly that of the integrated-circuit PDP8/i and PDP8/e models, helped spawn an entire minicomputer industry (9), one in which DEC, Data General, Hewlett-Packard, Raytheon, Varian Associates, Texas Instruments, Modular Computing Systems, Scientific Data Systems, and other companies were major players (10). The inexpensive unit, small enough to reside in the laboratory, enabled chemists to develop more active and interactive computer-based instruments.

Chemical instrumentation. X-ray crystallographers and nuclear chemists were among the first to use minicomputers for chemical research. The PDP8 and its predecessor, the PDP5, were used in computer-controlled diffractometer systems (11, 12) to control experiments and acquire data. In a typical crystallography experiment, a user would input approximate lattice parameters determined from photographic methods or, if the parameters were unknown, would have the computer search for appropriate angles to obtain reflections. If parameters were specified, the computer calculated angles and located reflections. Using the calculated angles, crystals were oriented via computer-controlled motors, and reflection intensities could be measured. Experimenters located the orienting reflections manually or automatically, and they could interact directly with the instrument or indirectly through the computer, which controlled the experiment and automated the measurement process.
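That control sequence, computing setting angles from approximate lattice parameters, driving the motors, and recording intensities, fits in a few lines of modern code. The sketch below is illustrative only: it uses Python rather than PDP8 assembly, assumes the simple cubic-cell form of Bragg's law, and the motor and detector functions are hypothetical stand-ins, not any real diffractometer's interface.

```python
import math

WAVELENGTH = 1.5418  # Cu K-alpha X-rays, in angstroms

def bragg_angle(a, hkl):
    """Setting angle theta (degrees) for a cubic cell of edge a (angstroms)."""
    h, k, l = hkl
    d = a / math.sqrt(h * h + k * k + l * l)              # d-spacing, cubic cell
    return math.degrees(math.asin(WAVELENGTH / (2 * d)))  # Bragg's law

def move_motors(theta):
    """Hypothetical stand-in for the computer-controlled goniometer motors."""
    print(f"driving goniometer to 2-theta = {2 * theta:.2f} deg")

def count_reflection():
    """Hypothetical stand-in for reading the detector; returns counts."""
    return 1234

a_approx = 5.64                                       # approximate lattice parameter (e.g., NaCl)
for hkl in [(1, 1, 1), (2, 0, 0), (2, 2, 0)]:
    theta = bragg_angle(a_approx, hkl)                # computer calculates the angle
    move_motors(theta)                                # orient crystal and detector
    print(hkl, "->", count_reflection(), "counts")    # record reflection intensity
```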
In the mid-1960s, electrochemists began using minicomputer-based systems in electrochemical instruments to initiate experiments, synchronize operations, and collect data (13). In later voltammetric instruments, the minicomputer was interfaced to a potentiostat and sweep generator so that it could control the cell potential and the application of potential-time waveforms. The computer also acquired the cell current after digitization by an analog-to-digital converter (14). Several novel real-time interactive and iterative electrochemical systems were described by Sam Perone (15), then of Purdue University.

In the late 1960s, spectroscopists also saw the advantages and began developing minicomputer-based spectrometers. Early work by Crouch focused on computer-controlled electrothermal-atomization atomic absorption and fluorescence systems (16). The computer controlled the various heating stages of the atomizer and synchronized the acquisition of absorption or fluorescence data with the atomization step. Systems were developed in which optimization of the various instrumental parameters by gradient-search or Simplex methods could be effected under computer control. At the University of Cincinnati, Atkinson used minicomputers in rapid-scanning spectroelectrochemical experiments and for time-resolved phosphorescence emission studies (17).

The value of the minicomputer in the laboratory was also demonstrated when computer-based instruments were used for chemical kinetics studies. One early application involved data acquisition from stopped-flow kinetics experiments (18), in which many reactions take place on millisecond timescales. Before laboratory computers, oscilloscopes were commonly used to acquire such data: an oscilloscope trace (a record of signal versus time as the reaction progressed) was photographed and analyzed, often manually. The minicomputer eliminated this tedious photographic data acquisition process and made possible many advanced data processing modes (smoothing, ensemble averaging, least-squares analysis, etc.). Still, because of the expense, computer-based data acquisition facilities were shared among researchers. At Michigan State University (MSU), a stopped-flow apparatus belonging to Jim Dye was connected by parallel digital transmission lines (19) to Chris Enke's PDP8/i computer located four floors away!
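As a concrete illustration of two of those processing modes, here is a minimal sketch, in Python with NumPy rather than any software of the era, that ensemble-averages repeated noisy traces of a simulated first-order stopped-flow decay and then recovers the rate constant by least squares. The rate constant, noise level, and run count are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 0.05, 500)                 # a 50-ms stopped-flow observation window
k_true = 120.0                                # s^-1, assumed first-order rate constant

# Ensemble averaging: acquire n repeated noisy traces and average them,
# improving signal-to-noise by roughly sqrt(n).
runs = [np.exp(-k_true * t) + rng.normal(0, 0.05, t.size) for _ in range(16)]
avg = np.mean(runs, axis=0)

# Least-squares analysis: ln(signal) is linear in t for a first-order decay,
# so a straight-line fit recovers k. Fit only where the signal is well above
# the noise floor, to keep the logarithm well behaved.
mask = avg > 0.1
slope, intercept = np.polyfit(t[mask], np.log(avg[mask]), 1)
print(f"fitted k = {-slope:.1f} s^-1 (true value {k_true} s^-1)")
```

Averaging the 16 runs cuts the noise by about a factor of four before the fit, which is precisely the benefit that made shared minicomputer facilities worth the walk up four floors.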
Interfacing nightmares

Interfacing instruments to computers was usually difficult and expensive in the early minicomputer days. With the PDP8/e, for example, addressing a peripheral and synchronizing real-world and computer-world activities meant dealing with the infamous input/output programming (IOP) pulses 1, 2,
and 3. These pulses were generated on specific I/O bus lines as a result of the execution of an I/O instruction and were used to gate information into and out of an interface. Other computers had more sophisticated interrupt structures, but programming them was not simple. Data acquisition boards were usually homemade, wire-wrapped affairs, which lacked long-term reliability and were notoriously difficult to troubleshoot. Some commercial data acquisition boards were sold for minicomputers, but these were expensive. A few computer systems, such as the DEC laboratory-oriented PDP12 and PDP8/e, incorporated the analog-to-digital converters, digital-to-analog converters, timers, and triggering devices needed to interface with laboratory instruments, greatly lowering the barriers to interfacing. At MSU, one such laboratory minicomputer (the "Rolling 8") was placed on a cart and rolled from one research laboratory to another. Interfacing with asynchronous serial (INWAS) techniques also became popular during these years; the INWAS scheme had the advantage of being computer- and instrument-independent and, hence, "universal" in nature (20).

In the mid-1970s, standard bus structures began to appear, which allowed computers and instruments to communicate via parallel channels. IEEE Standard 488, developed originally by Hewlett-Packard as the Hewlett-Packard interface bus, is now called the general purpose interface bus (GPIB) (21). Its initial purpose was to allow the computer to communicate with test instruments, such as digital multimeters, logic analyzers, oscilloscopes, and waveform generators. GPIB interfaces were soon designed into computer peripheral devices and later into scientific instruments, such as electrochemical instruments and chromatographs (22). A second widely used standard, the computer automated measurement and control (CAMAC) standard (23), became popular in the same period for nuclear instrumentation. The standard was adopted by the National Instrumentation Methods Committee in the United States and was embodied in the popular CAMAC crates (24). Like INWAS, the IEEE 488 and CAMAC standards were independent of computer design.
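GPIB's command-and-response style is still in everyday use, so it is easy to illustrate with modern tools. The sketch below uses the PyVISA library, which postdates this article by years; the bus address and the SCPI command strings are assumptions for a generic digital multimeter, not anything taken from the text.

```python
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::12::INSTR")  # hypothetical instrument at bus address 12

print(dmm.query("*IDN?"))                   # ask the instrument to identify itself
dmm.write("CONF:VOLT:DC")                   # configure a DC-volts measurement (SCPI)
reading = float(dmm.query("READ?"))         # trigger and fetch one reading
print(f"measured potential: {reading:.6f} V")

dmm.close()
rm.close()
```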
During the minicomputer era, packaged commercial systems became available for chromatography, MS, electrochemistry, NMR spectrometry, UV–visible spectrophotometry, fluorescence spectroscopy, and other instrumental methods. Early systems simply automated the standard features of conventional instrumentation by adding computer control and data acquisition. In the mid-1970s, however, scientists began to exploit the tremendous power and flexibility of the available software and computer hardware to develop new measurement modes and principles. Some involved real-time modification of transducers, data domain converters, and computer interfaces to optimize the acquisition of information about the physical system being studied (25, 26). By the late 1970s, commercial instruments began to take advantage of these breakthroughs in measurement and control principles. By then, however, microprocessors had begun to make inroads into computerized instrumentation, as discussed in the next section.

Software. Minicomputer software was crude by today's standards, although some user-developed programs were quite sophisticated. The input devices on early minicomputers were paper tape readers, which discouraged all but the most avid programmers: the error messages that inevitably appeared on the third pass of the FORTRAN compiler were enough to send most users back to their strip-chart recorders and oscilloscopes. Fortunately, mass storage devices became available in the 1970s, making possible operating systems such as OS/8, RDOS, RT11, and RSX11. These systems created file structures and directories on tape or disk, took care of many housekeeping tasks, and facilitated data interchange among peripheral devices. Application programs were frequently developed in BASIC or FORTRAN using commercially available interpreters and compilers; however, assembly language was still used to control the hardware interface to the experiment. Few programs were widely available for such generally useful tasks as statistical analysis or curve fitting. Most were distributed through users groups (e.g., the DEC users group DECUS) or by word of mouth, and such programs were often sparsely documented and unsupported. During this period, interface hardware vendors began to provide acquisition and control software for their products; one such vendor, however, included the disclaimer, "This software is guaranteed only to occupy memory."
The microcomputer revolution

The world of computing was forever changed in 1971, when Intel introduced the 4004, a "microprogrammable
computer on a chip". Crouch first heard about the microprocessor in 1973, when he attended a joint U.S.–Japan conference in Hawaii with the rather imposing title of "Computer-Assisted Chemical Research Design". The late Charles Reilley of the University of North Carolina gave a brief talk on UNC's multilaboratory data acquisition and analysis system for research and teaching laboratories (27). This system was based on the Intel 8008, an 8-bit microprocessor.
FIGURE 2. Topology of a modern computerized experiment. The hierarchical, distributed collection of transducers, data domain converters, communication channels, and computers shown represents but one possible topology for current system configurations. Modules labeled f_ij represent the input transducers, data domain converters, interface modules, and output transducers.
In 1974, Intel introduced the 8080, which had the advantages over the 8008 of requiring fewer support chips and of addressing much larger amounts of memory. It took several years for microprocessors and microcomputers to become viable in chemical laboratories. Despite the pioneering work of Reilley and others, many potential users were reluctant to devote the time necessary to design
and construct their own computer-based systems. It took the introduction of the MITS Altair 8800 to convince many of us that the microprocessor chip could be made into a viable computer. In the chemistry laboratory, the microprocessor eventually proved very useful for control applications, bringing some "intelligence" to chemical instruments, which began to incorporate microprocessors for control purposes even while often still relying on minicomputers for data processing. Microprocessors and microcomputers were even developed that emulated minicomputers such as the PDP8/e (e.g., the Intersil IM6100). Microprocessor-based sequencers and controllers brought branching, looping, and decision-making capabilities to the instrumentation world.

In the late 1970s, commercially available instruments began to appear with built-in intelligence and advanced automation capabilities, and the early 1980s saw the number of these instruments mushroom. The advanced capabilities brought about by the microprocessor spawned new measurement concepts, such as those incorporated into diode array
spectrometers (28) and electrochemical instruments (29). Along with these developments came new languages that were particularly useful for instrumentation. The FORTH language became very popular among several groups and spawned a number of advances in instrument control and automation. However, FORTH suffered from a major disadvantage: it was so easily modifiable by the user at run time that documentation and orderly evolution of programs were virtually impossible.

Fortunately, many of us were saved from too many wrong turns by the introduction of commercial personal computers in the late 1970s. The Radio Shack TRS-80, based on the Zilog Z-80 chip, was arguably the first personal computer (PC) to appeal to a mass audience; even organic chemists could now become computer-literate. In 1977, the Commodore PET and the Apple II, both based on the MOS Technology 6502 chip, were introduced. Single-board computers, such as the Rockwell AIM and the MOS Technology KIM, were also being used. Getting any of these computers to "talk" to chemical instruments, or to do other useful tasks in the laboratory, was nevertheless a formidable task because of the lack of software and the difficulty of interfacing.
FIGURE 3. Measurement and control system evolution. Block diagrams of (a) a manual instrument, (b) a computer-attached instrument, and (c) a modern computerized instrument indicate the progressive level of control computers have had over laboratory experiments and information collection and processing.
The first generally useful piece of software was VisiCalc, the famous spreadsheet program developed for the Apple II. In 1981, IBM introduced the IBM PC. Although it advanced the field only incrementally (it used the 8088 microprocessor), the IBM PC represented the maturing of the PC. The original machine had Microsoft BASIC in read-only memory and a built-in cassette port. Soon, however, versions with floppy disk drives appeared, and Microsoft's PC-DOS became the dominant operating system. Some useful applications software was also introduced, including a word-processing program, a version of VisiCalc, and some crude accounting programs. The PC age was at hand.
The PC age

The trends begun during the minicomputer and microcomputer eras continue today in the PC age. The scientist now has a computer not only in the laboratory but also on the desktop, and uses it not only for scientific computing (theoretical calculations, simulations, data acquisition, data analysis, and experimental control) but also for manuscript preparation; visualization; and communication with peers, funding agencies, and publishers.

Instrumentation. Many instruments are now constructed as "stand-alone" units: all the pieces come in one box. Just as logic is increasingly being integrated onto a single chip and more functionality is being built into the computer, instruments are becoming more robust, with more functionality included. More of the components of Figure 2, including at least one CPU, are being integrated into a single system. Where an experiment once required several instruments, such as oscilloscopes, voltmeters, potentiostats, power supplies, and analog and digital circuitry, everything might now be contained in one instrument provided by one vendor. In most cases, the instrument or experimental setup is smaller and located closer to the system being investigated.

Networks. As the minicomputer era was drawing to a close and the PC era was beginning, the desire to connect computers in a meaningful way was growing. The days of minicomputers and mainframes had seen the development
of asynchronous multiplexers for connecting large numbers of user terminals to central computers. Often these "seats" were in the same building, with twisted pairs of wire making the RS-232C connection between computer and terminal; in other instances, modems and phone lines completed the serial links. Almost immediately, these serial lines were also used to connect two computers. Terminal-emulator software such as KERMIT allowed a user sitting at one machine to have a session on a second, remote machine, and the same facilities could be used to move files between the two. More elaborate software made networks of such computers viable; examples were ARPANET (1970), USENET (1979), and BITNET (1981). In 1980, Xerox, Intel, and DEC introduced Ethernet, and IBM introduced the Token Ring network. These hardware technologies, coupled with software such as AppleTalk, DECnet, SNA, and TCP/IP, allowed higher-functionality networks connecting large numbers of nodes (30).

The impact of these developments was at least twofold. First, the higher-performance communication mechanisms provided alternative means of implementing the communication channels of Figure 3. Large high-performance instrumentation systems, such as those developed in the 1980s at the National Superconducting Cyclotron Laboratory at MSU, are examples (31, 32). One hierarchical system, spread over ~100,000 square feet, consisted of 5 VAXes, 36 CAMAC crates (each containing a Motorola microprocessor), and the CAMAC modules needed to read and control 800 parameters. The system could gather 1 MB of data per second for days; physicists considered this a moderate-sized experimental system. Second, researchers increasingly wanted to use multiple computers for different scientific endeavors (data acquisition on one computer, data analysis on another, manuscript production on a third, report generation and consolidation on a fourth, etc.). Thus, there was a growing need to transport data out of the experimental environment.
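Today that transport step takes only a few lines. The sketch below streams a raw data file to an analysis machine over a TCP socket; the host name, port, and file name are invented for illustration, and a real system would add framing, acknowledgments, and error handling.

```python
import socket

# Hypothetical analysis machine and data file, standing in for the
# "data analysis on another computer" scenario described above.
HOST, PORT = "analysis-host.example.edu", 5000

with socket.create_connection((HOST, PORT)) as conn:
    with open("run042.dat", "rb") as f:
        while chunk := f.read(4096):   # stream the file in 4-KB chunks
            conn.sendall(chunk)
```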
Software. With the increasingly functional and more economical computer resources available to researchers came an improvement in commercially available software; researchers could no longer afford the time to develop their own. As an example, National Instruments launched its LabVIEW software system in 1986. This system was significant in a number of ways. First, it integrated data acquisition, control, data movement, analysis, visualization, and storage. Second, it supported a wide range of hardware components, both discrete instrument devices and modules residing within the computer. Third, its architecture was modular and hierarchical, much like that of modern operating systems, allowing common higher-level functionality to sit on top of lower-level drivers and computational modules; one module or layer could thus be easily changed when new hardware or communication techniques were implemented. The user interface to LabVIEW was highly graphical, requiring little or no "programming" in the traditional sense. This trend toward graphical user interfaces, begun at Xerox PARC and carried on at Apple and in the X Window System, now extends to packages such as AVS for three-dimensional data visualization and Microsoft Access for databases. National Instruments now calls its graphical language "G", a takeoff on the name of the language "C".

Data management. Legal and regulatory requirements, coupled with the increased volume of data emerging from the laboratory, have led to the development of the laboratory information management system (LIMS). This software tool is used to administer, archive, retrieve, and analyze large databases (data mining) and to produce a variety of reports (33, 34). Such systems have become an integral part of corporate computing.

The World Wide Web. The World Wide Web has affected the geographical distribution of the components in Figure 2. Paul Lauterbur of the University of Illinois has developed a Web-based investigator's interface to various magnetic resonance imaging (MRI) instruments (http://bmrl.med.uiuc.edu:8080/smr-96-abstract.html). This interface allows a researcher to sit at any Web-capable terminal and oversee MRI experiments, and it allows unique modes of collaboration among geographically separated researchers. The investigator still has to get the analytical samples to the instrument, of course, and a team of technicians has to be on site to handle samples and maintain the instrument.

Benefits and lessons learned

FIGURE 4. Changes in computers over time. Since the introduction of the PDP8, the performance-price ratio of computers (green), the number of bits per word divided by the product of cycle time and price, has increased tremendously, while the cost of disk storage (red) has dropped just as dramatically over the last 25 years.
Computerized instruments have indeed come a long way since the late 1960s (Figure 4). Since the introduction of the PDP8 in 1965, the performance-price ratio (PPR) of computers, defined as the number of bits per word divided by the product of cycle time and price, has increased tremendously. For a modern computer with a 32-bit, 700-MHz processor, the PPR approaches 10^7; by contrast, the early 12-bit PDP8, with a cycle time of 1.6 µs, had a PPR of only a few hundred.

Not only has the PPR of the modern computer increased by several orders of magnitude, but in the last 25 years the cost of disk storage has dropped just as dramatically. In 1974, Crouch purchased a 1-MB cartridge disk for the PDP8/e for approximately $8000. Today, a 20-GB disk, which holds 20,000 times as much information, can be purchased for a little over $100, a drop in the cost per megabyte of more than a factor of 10^6. Memory costs have also decreased tremendously, although in a more zigzag fashion.
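The PPR arithmetic is easy to verify. In the quick check below, the word lengths, cycle times, and the 10^7 figure come from the text, whereas both machine prices (roughly $18,000 for a PDP8 and $2000 for a modern PC) are assumptions supplied for illustration.

```python
def ppr(bits_per_word, cycle_time_s, price_usd):
    """Performance-price ratio as defined in the text."""
    return bits_per_word / (cycle_time_s * price_usd)

pdp8 = ppr(12, 1.6e-6, 18_000)       # 12-bit word, 1.6-us cycle; ~$18,000 price assumed
modern = ppr(32, 1 / 700e6, 2_000)   # 32-bit word, 700-MHz clock; ~$2000 price assumed

print(f"PDP8 PPR   ~ {pdp8:,.0f}")   # a few hundred, as the text says
print(f"modern PPR ~ {modern:.1e}")  # approaching 10^7
```

Running this gives roughly 420 for the PDP8 and 1.1 × 10^7 for the modern machine, matching the "few hundred" and "approaching 10^7" figures above.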
These are great improvements in the computers attached to or contained within modern chemical instruments, and the improvement in transducers, analog electronic components, and control elements has been almost as significant. The combination of this new generation of computers with modern sensors and miniaturized instrument components is making possible advances that were inconceivable just a few short years ago.

What have we learned from our look at the past? It may seem to some that all the effort spent computerizing instruments during the minicomputer and microcomputer eras was for naught, since little of what was developed then survives today. We believe that this is a shortsighted view. From our perspective, today's computerized instruments evolved because of those past efforts, and by being involved in the evolutionary process, scientists helped guide the developments and shape the way things are today. We are tempted to speculate about what the next few years will bring, but that's not our job. Still, we can hardly wait.
References

(1) Frazer, J. W. Chem. Instr. 1970, 2, 271–295.
(2) Enke, C. G. Science 1982, 215, 785–791.
(3) Wade, A. P.; Crouch, S. R. Spectroscopy 1988, 3, 24–31.
(4) Malmstadt, H. V.; Enke, C. G.; Crouch, S. R. Microcomputers and Electronic Instrumentation: Making the Right Connections; American Chemical Society: Washington, DC, 1994; pp 1–21.
(5) Enke, C. G. Anal. Chem. 1971, 43, 69 A–80 A.
(6) Malmstadt, H. V.; Enke, C. G.; Crouch, S. R. Electronics and Instrumentation for Scientists; Benjamin/Cummings: Menlo Park, CA, 1981.
(7) Venkataraghavan, R. F.; McLafferty, F. W.; Amy, J. W. Anal. Chem. 1967, 39, 178.
(8) Ceruzzi, P. E. A History of Modern Computing; MIT Press: Cambridge, MA, 1998.
(9) Electronics 1980, 53, 322–372.
(10) Ceruzzi, P. E. A History of Modern Computing; MIT Press: Cambridge, MA, 1998; pp 191–192.
(11) Busing, W. R.; Ellison, R. D.; Levy, H. A.; King, S. P.; Roseberry, R. T. The Oak Ridge Computer-Controlled X-Ray Diffractometer; ORNL-4143; Oak Ridge National Laboratory: Oak Ridge, TN, 1968.
(12) Sparks, R. A. In Laboratory Systems and Spectroscopy; Mattson, J. S.; Mark, H. B., Jr.; MacDonald, H. C., Jr., Eds.; Marcel Dekker: New York, 1977; Vol. 5, pp 19–44.
(13) Lauer, G.; Abel, R.; Anson, F. C. Anal. Chem. 1967, 39, 765–769.
(14) Perone, S. P.; Harrar, J. E.; Stephens, F. B.; Anderson, R. E. Anal. Chem. 1968, 40, 899.
(15) Perone, S. P. Anal. Chem. 1971, 43, 1288–1299.
(16) Crouch, S. R.; Montaser, A.; Goode, S. R. In Information Chemistry—Computer Assisted Chemical Research Design; Fujiwara, S.; Mark, H. B., Jr., Eds.; University of Tokyo Press: Tokyo, 1975; pp 107–124.
(17) Mark, H. B.; Wilson, R. M.; Miller, T. L.; Atkinson, T. V.; Yacynych, A. M.; Woods, H. In Information Chemistry—Computer Assisted Chemical Research Design; Fujiwara, S.; Mark, H. B., Jr., Eds.; University of Tokyo Press: Tokyo, 1975; pp 3–28.
(18) Desa, R. J.; Gibson, Q. H. Comput. Biomed. Res. 1969, 2, 494.
(19) Coolen, R. B.; Papadakis, N.; Avery, J.; Enke, C. G.; Dye, J. L. Anal. Chem. 1975, 47, 1649–1655.
(20) Dessey, R. E.; Titus, J. Anal. Chem. 1974, 46, 294 A–302 A.
(21) IEEE Standard Digital Interfaces for Programmable Instrumentation; IEEE Standard 488-1975; IEEE: New York, 1975.
(22) Ratzlaff, K. L. Introduction to Computer-Assisted Experimentation; Wiley & Sons: New York, 1987; pp 341–346.
(23) Leo, W. R. Techniques for Nuclear and Particle Physics Experiments, 2nd ed.; Springer-Verlag: New York, 1994; pp 338–352.
(24) Krutz, R. L. Microprocessors and Logic Design; Wiley & Sons: New York, 1980; pp 385–392.
(25) Caserta, K. J.; Holler, F. J.; Crouch, S. R.; Enke, C. G. Anal. Chem. 1978, 50, 1534–1541.
(26) Darland, E. J.; Leroi, G. E.; Enke, C. G. Anal. Chem. 1980, 52, 714–723.
(27) Reilley, C. N.; Woodward, W. S.; Ridgeway, T. H. In Information Chemistry—Computer Assisted Chemical Research Design; Fujiwara, S.; Mark, H. B., Jr., Eds.; University of Tokyo Press: Tokyo, 1975; pp 345–386.
(28) Ingle, J. D., Jr.; Crouch, S. R. Spectrochemical Analysis; Prentice Hall: Englewood Cliffs, NJ, 1988; pp 354–361.
(29) He, P.; Avery, J. P.; Faulkner, L. R. Anal. Chem. 1982, 54, 1313 A–1326 A.
(30) Ceruzzi, P. E. A History of Modern Computing; MIT Press: Cambridge, MA, 1998; pp 281–306.
(31) Au, R.; Beneson, W.; Fox, R.; Notman, D. IEEE Trans. Nucl. Sci. 1983, 30, 3808–3812.
(32) VanderMolen, A.; Au, R.; Fox, R.; Maier, M.; Robertson, M. IEEE Trans. Nucl. Sci. 1989, 36, 1559–1561.
(33) Dessey, R. E. Anal. Chem. 1983, 55, 70 A–80 A.
(34) Dessey, R. E. Anal. Chem. 1983, 55, 277 A–303 A.
Stanley R. Crouch is professor emeritus at Michigan State University and adjunct professor at Arizona State University. Thomas V. Atkinson is senior academic specialist in charge of chemistry computing and information technology at Michigan State University. Crouch has recently retired to devote his time to writing textbooks on analytical chemistry and spectroscopy. Atkinson's research and teaching interests are in computers and their applications to chemistry. Address correspondence about this article to Crouch at 4725 Star Rock Dr., Prescott, AZ 86301 ([email protected]).