A/C INTERFACE
INFORMATION TECHNOLOGY AND AUTOMATING THE TECHNICAL CENTER
GETTING IT ALL TOGETHER
Raymond E. Dessy, Chemistry Department, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061
Scientists collect data in ever increasing amounts to meet legal and fiscal imperatives. They convert some small part of it into information, and from that distill a precious drop of knowledge. Ideally all of the knowledge and information should be shared concurrently and retrospectively among colleagues. This assures that time of development is minimized, work efficiency is maximized, and creativity is optimized. Exchange of information and knowledge breeds quality, new ideas, and profit.
Yet today's laboratory often is plagued with a problem paralleling a concept that mathematicians and biologists have recently found intriguing and ubiquitous. Take a population of animals, data, information, or knowledge and assume that it grows from generation to generation according to the simple law X' = RX(1 - X), where X is the current population or amount, R is the reproductive ratio from period to period, and X' is the new value. The equation seems simple and reasonable. The new value is proportional to the old value multiplied by the reproduction constant. The term (1 - X) might suggest that as X increases there is a hindrance to further increase. In animal populations
this might result from a lack of food; in laboratories it might result from increasing difficulties in communicating and sharing data as the numbers of people and facts increase. Alternatively, it could be attributable to the use of inadequate tools that cause the quality of the information and knowledge retrieved to drop.

Use your favorite spreadsheet to explore this equation. Take an initial normalized value of X = 0.3, and successively let R = 1, 2, 3.2, and then 3.57 for about 20 periods. When R = 1 the function quickly drops toward zero; at R = 2 it rapidly reaches a limit of 0.5. At R = 3.2 the function soon oscillates between 0.5 and 0.8. With R = 3.57 the information transfer efficiencies in each succeeding period will eventually oscillate wildly at random number values. This example illustrates the concept of chaos theory.
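The suggested spreadsheet exercise is just as easy to reproduce in a few lines of code. The following is a minimal Python sketch of the same iteration (Python standing in for the spreadsheet; the function and variable names are illustrative only):

```python
# Logistic-map iteration X' = R*X*(1 - X), as described in the text.
# A minimal sketch standing in for the suggested spreadsheet exercise.

def iterate(r, x0=0.3, periods=20):
    """Return successive values of the logistic map for a given R."""
    values = [x0]
    for _ in range(periods):
        x = values[-1]
        values.append(r * x * (1 - x))
    return values

for r in (1.0, 2.0, 3.2, 3.57):
    series = iterate(r)
    tail = ", ".join(f"{x:.3f}" for x in series[-4:])
    print(f"R = {r}: last values {tail}")
```

Running it shows the decay at R = 1, the fixed point at R = 2, the two-cycle at R = 3.2, and the erratic wandering at R = 3.57.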
The purpose of this tutorial is to describe some of the concepts that are allowing many major laboratories to avoid the pitfalls of information chaos as they critically examine and automate their operations to make use of available technology.

Work flow re-engineering: The human side
Coalescing information technology (IT) functions within the confines of the modern corporate laboratory is a necessary task, but it is not an easy one. Many current systems have grown by ad hoc addition, not logic. The normal linear flow of data, information, and knowledge that might occur from research, through development, to production must be replaced by a nested set of elliptical feedback loops. This requires that normal work routines and the system for distribution of results be re-engineered for more efficient and better use of physical, human, and factual resources. Resistance to change makes this difficult, and the process is sometimes compounded by an unwillingness of users to allow their areas to be accessed by the network linkages required for complete IT integration.

Work re-engineering therefore requires deft, diplomatic studies of current methods, and the development and analysis of alternative strategies for accomplishing tasks more efficiently. Although the methodologies to accomplish this are not new in the business and manufacturing arenas (1-5), they are new to the technical laboratory environment. Analytical chemistry, in particular, can be well served by such restructuring because it can be considered a classic case of a production environment: samples in, results out. The variables to be considered are the physical flow of samples, utilization of staff, and deployment of instruments. To these must be added careful analyses of what data are to be acquired and how they are to be stored, retrieved, shared, and reported.

In most laboratories everyone produces data; unfortunately, this data is often excessive and incomplete, and it rapidly becomes obsolete. The future usefulness of data decreases over time if parameters such as the methods used, instrument setup, analyst identification, and other descriptive information are lacking (a sketch of such a self-describing record follows below). In such cases the information content may decline even
more rapidly. Where does all such data and information go? If it is on paper it is ignored, lost, and eventually shredded; if it is electronically stored it clogs computer arteries. Useless data keeps entire industries alive. Technical center work reorganization therefore has some unique problems that involve human as well as technical factors.
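As a minimal sketch of what a self-describing record might carry, consider the following; the field names are hypothetical illustrations, not a standard, and any real system would define its own descriptors:

```python
# Sketch of a self-describing data record: the measurement plus the
# descriptive parameters that keep it useful later. Field names are
# hypothetical, not drawn from any standard.
from dataclasses import dataclass, asdict
from datetime import date
import json

@dataclass
class Measurement:
    sample_id: str
    analyte: str
    value: float
    units: str
    method: str          # method used
    instrument: str      # instrument setup identifier
    analyst: str         # analyst identification
    measured_on: date

m = Measurement("S-001", "lead", 0.02, "ppm",
                "ICP-AES rev. 3", "spec-7/axial", "RED", date(1992, 6, 1))
print(json.dumps(asdict(m), default=str, indent=2))
```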
The nucleus

Ownership of the new tools that will affect work flow and work habits is an important factor. There must be a consensus among those affected, a degree of participation, and an emotional feeling that it is "our" system rather than a solution imposed by "them."

The process of knowing what enterprise integration is and how to accomplish it involves one or more of five approaches. The most satisfactory approach involves a lengthy education of the staff of the first laboratory to be automated by a sympathetic, sensitive group. This process provides the user with the knowledge and vocabulary to express new needs. It is not desirable or efficient merely to replicate old manual systems with new hardware and software. New ways to approach tasks, share data, and extract information and knowledge must be found. Many corporations are unwilling to expend the time or money to do this, although this approach
purchases the strongest ownership bonds, produces the best product, and assures that workers will accept the system, not merely put up with it.

An alternative is to devise a database survey composed of a set of questions involving parameter choices and weights. The initial responses from everyone affected are then shared with the participants, and a second request is made for responses to the same queries. This iterative process arrives at a consensus unbiased by the assertiveness of a few individuals in a public forum, allows polar views to be mitigated by reflective compromise or conciliation, and permits unique views to win a following because of the unpressured contemplation that the system offers. Interactive computer-assisted consensus, strategy, and tactical decision-making "facilitating rooms" are now becoming available to provide this type of approach electronically in real time. The combination of human contact and anonymous iterative input with computer text and thought processing is a rapid way to combine man and machine in planning total integration. Users find the output incisive, less biased, and more synergistic and comprehensive than that obtained by other means.

A third option is to present alternative "paper" scenarios to users who can select or discard features and synthesize their own plan. This method is based on the philosophy that most people know what art or music they prefer, even though they may not be able to paint or compose. One must, however, guard against innate biases in the proffered alternatives.

Installation of operating prototype systems that allow users to develop a plan based on actual exposure is a fourth approach. Although this runs counter to most of the "dialogue, then planning" dogma instilled in computer scientists, such prototyping has long been encouraged by James Martin, a guru of large computer
Table 1. Evolution of laboratory automation

Element            1960        1970           1980           1990               2000
User's computer    Mainframe   Minicomputer   PC             Workstation
Automation level               Instrument     Lab            Technical center
Networks                                      Local area     Wide area          Global
Connectivity                                  Host centric   Client-server      Peer-peer
Figure 1. Host centric environment. The central (distributed) hosts provide all services to users.
systems (6). The only caveat is that if the prototype is a poor fit, discard it and rebuild from the beginning. Jury-rigging an unsuitable system always results in failure because of maintenance difficulties.

A fifth option involves having technically qualified individuals interview those to be affected in very small groups. A consensus plan can be developed from merged reports. The possible difficulties in this approach are inherent interviewer biases and the probability that multiple internal interviewers will provide uneven understandings of the lab, whereas outside interviewers may misinterpret the corporate personality.

Whatever the approach, any development scenario must ask the users to consider the following questions in some way.

• How should sample and work requests travel through the lab? The plan must be built on future needs, not past habits.

• What data should be collected electronically, and what role should bar codes play in sample, test, and operator identification? How can file protocols be standardized?

• How can data be released to the client most effectively? How should sign-off and access control be implemented to protect both laboratory and client from misinterpretation?

• What better mechanisms can be developed to interpret data? How could automatic information and report generation be achieved to reduce paper workload, yet retain the ability to immediately flag unusual events for both lab and client? This could include analytical reliability statistics, correlation of multiple analytical
results where matrix effects might be important, and appropriate warnings to the client about alternative interpretations. All of these would increase the future usefulness of the information provided.

• What information and knowledge is currently being lost by the data manipulation programs now being used? How could expert systems (7) be used to extract more and better information? An expert-system-driven methods development program could suggest procedures for characterizing new analytes. What automatic knowledge extraction tools are needed for the large variety of lab databases? They could include correlations among analyte source, date, previous handling, analytical methods used, and related samples. Long-term statistical analytical trends, possible supplier problems, and potential corrections might be provided, increasing the future usefulness of the information. Analytical lab sample throughput optimization charts based on retrospective loads and current instrument, manpower, and sample makeup could be generated (lab throughput increases of 30% have been reported at Unilever in the Netherlands [8]).

• How can data, information, and knowledge be shared globally to improve product delivery time and quality, reduce duplicate efforts, and improve creativity by cross-fertilization? Few companies fully use the two most valuable and expensive resources they possess: the minds of their employees and the data that these individuals have created. Data is collected, reported, and archived, but it is seldom efficiently shared. Access to the material by those with
related interests via organized electronic search, browse, or special interest group modes is essential.

• Will there be a scientific electronic lab notebook available? Many companies are attempting to provide repositories for research-type activities (9, 10), just as they are doing for more structured analytical and quality control aspects. One goal is to make the individual research notebook more available to others. Another is to provide a better record for patent litigation purposes. The general industrial consensus is that "write once, read many" (WORM) optical drives meet the test of legal acceptability. Holding both scientific results and purchase order information, this medium can demonstrate conception and diligent pursuit (11).

• Will the library be an integral part of the network? Many companies already use fax and e-mail dissemination of reports and have centralized facilities where access to, not ownership of, documents is important. These often include electronically scanned historical reports stored as machine-readable and searchable files. Network access to CD-ROM databases and journals is developing. The ANSI/NISO Z39.50 access standard is becoming more prevalent, providing client-server interaction and transparent interlibrary access. Electronic training facilities using interactive CD-ROMs and video conferencing are part of this environment. New compander (compress/expand) hardware has reduced the bandwidth requirements of long-distance conferencing. Today's librarians are information technologists, a vital part of IT (12-15).

• Are there a move and a commitment to ISO 9000 standards? Laboratories exist to meet somebody's requirements. Good Laboratory Practices (GLP) initiated a move
toward focusing on quality. Competition from the Pacific Rim, the impact of EC '92, and the changing nature of the American workplace all suggest that total quality is a necessary product characteristic for maintaining competitiveness. Primary producers and service laboratories alike are finding it important to be able to certify that they meet a minimum standard of quality, so that purchasers of their product need not implement their own expensive quality control procedures.

The economic sensibility of this approach, and the marketplace pressure from those who adopt it first, will rapidly force the issue. The ISO has developed total quality specifications (ISO 9001, 9002, 9003) that allow manufacturers and labs to place the equivalent of a UL label on their product or analytical report (16, 17). U.S. equivalents to the ISO standards are available as ANSI specifications. Compliance begins in the shop or lab, but its proof is to be found in the databases that make up IT.

• What database search strategies would you like to have? Most older systems are locked into rigid key word search strategies. New systems allow full-text and Boolean searches and permit queries that were never envisaged at the time the database was structured. This latter feature is the strength of relational databases (RDBs). This approach also allows different fields within and among such RDBs to be joined, providing a new view into data and information relationships (a query sketch follows this list).

• Can you embed structures, graphs, and diagrams into reports? What forms of this compound document architecture (CDA) do you need to produce the reports that clients, management, and regulatory bodies demand? Standards for CDA are rapidly being solidified. The Analytical Instrument Association (AIA) and ASTM are at work on analytical data interchange standards that will make it possible to take files of electronically captured data and manipulate these as objects for inclusion in reports (18). Much of the pioneering effort has occurred in the Digital Equipment Corporation ADISS (Analytical Data Interchange and Storage Standards) project. Standard formats for storage of chemical structures already exist.

A development plan can begin to emerge from these queries and the responses they elicit. However, there are other factors that must also be considered.
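As a minimal sketch of the kind of cross-table join such RDBs permit, the following uses Python's built-in sqlite3 module; the table and column names are hypothetical illustrations, not any particular LIMS schema:

```python
import sqlite3

# Hypothetical sample and result tables; names are illustrative only.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE samples (sample_id TEXT PRIMARY KEY,
                          source TEXT, received DATE);
    CREATE TABLE results (sample_id TEXT, analyte TEXT,
                          method TEXT, value REAL);
    INSERT INTO samples VALUES ('S-001', 'Supplier A', '1992-06-01');
    INSERT INTO results VALUES ('S-001', 'lead', 'ICP-AES', 0.02);
""")

# Join fields across tables: a query that rigid keyword systems
# could never have anticipated when the data were first stored.
rows = db.execute("""
    SELECT s.source, r.analyte, r.method, r.value
    FROM samples AS s JOIN results AS r USING (sample_id)
    WHERE r.analyte = 'lead'
""").fetchall()
print(rows)   # [('Supplier A', 'lead', 'ICP-AES', 0.02)]
```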
Figure 2. Client-server environment. The network gives clients access to multiple server resources. Simplicity and reduced incremental expansion costs can be gained by isolating similar services on their own computer servers.
Work flow re-engineering: The technical side

It is perhaps simplistic, but useful, to look at the evolution of laboratory automation from several perspectives (Table 1).

Many companies today are initiating the development of integrated IT systems. They start in an area of obvious need, with a carefully analyzed scenario of what the new system is to provide in the way of work re-engineering to improve throughput, quality, and exchange. The success of this effort is then allowed to ripple through the remaining organizational components, assuring that individual needs are reconciled with the mandate that all systems installed be capable of effective intercommunication.

This usually means a common database architecture, built around an RDB framework and an easy-to-use structured query language (SQL). Unix as an operating system, coupled with RDBs such as Rdb, Ingres, and Oracle, is a commonly encountered combination. Such platforms offer relatively stable, flexible standards coupled with probable longevity: vital factors in the continuing evolution that will occur (19).

Software factors. Many scientists consider commercial laboratory information management systems (LIMS) to be cost-efficient and immediate solutions to laboratory data needs. Unfortunately, most commercial products labeled LIMS are actually only laboratory data management systems (LDMS), not true LIMS.

Many people also have the impression that coupling various LDMS constitutes an effective approach to enterprise-wide automation. Unfortunately, database descriptor format incompatibilities and inefficient or nonexistent information- and knowledge-extraction tools can make this approach difficult.

Some commercial LIMS (LDMS) products can provide a beginning (20). However, customizing a commercial LIMS product may not be as easy as many promise, and customizing usually is necessary. If this approach is attempted, full source code for the modified portions of the software should be available. (All other code should be held in escrow by a
third party.) A full and open license to the database engine involved is essential. Some commercial systems provide run-time versions of the LDMS and database engine only. This severely limits the ability to modify structures, paint new screens, or change database descriptor variables.

For these reasons, about half of the larger companies are building their own systems. Such in-house systems are costly to develop and may involve ten man-years or more to create. On the other hand, they may fit well and potentially can be easily modified. The success of such an effort depends highly on the skills of the software team. Poor planning, improper language and operating system platforms, inadequate documentation, and loss of key system programmers and managers have all led to catastrophe.

For multiple domestic site operations, the large software development costs can be easily amortized, and facile file transfer between sites is assured. For international operations the seamlessness of a common system can be essential. Slightly modified screens accommodate language differences, but provide a common database all may share. Alternatively, a front-end expert system can provide a look-alike access mechanism to older, traditional existing databases with different internal formats. A much less satisfactory approach to achieving compatibility involves the use of file interchange translators. These consume time and computer cycles, and writing them in house may not be a trivial process. Supportive vendors should provide these if proprietary systems need to be preserved.

Networking factors. A long-range facility-wide plan must provide the computer power and network infrastructure (21) for the IT system. This plan must recognize that the distribution of computing power begun with the PC revolution must remain, yet it must be integrated into a more centrally managed networked system to meet the new demands of sharing, certification, validation, archiving, and system management.

The network's evolvable bandwidth must be large enough to accommodate the increasing demands that will be made of it as users become more sophisticated. Multiple local area networks (LANs) transmitting at 10 megabits per second (Mb/s), coupled with redundant fiber distributed data interconnect (FDDI)
at 100-200 Mb/s, are becoming common. This normally takes the form of a segmented LAN made up of baseband or token-ring structures servicing high-volume local traffic within laboratories. These segments may be connected with repeaters, bridges, or routers. Repeaters merely repeat a communication onto another segment and have no directional or security ability. Many companies are using combination bridges and routers (brouters). These permit control of the sensitive information flow to only appropriate destinations and also allow for intelligent rerouting in case of LAN faults. Laboratories are becoming so heavily dependent upon electronic transmission that a failure is
catastrophic. Concentric duplicate ringed high-speed FDDI links among buildings with duplicate brouter access will survive cable severing. LAN segments can be constructed from thin-wire coaxial cable for noisy environments, or unshielded twisted-pair (such as level 3 wire, ~3 twists/ft) for office areas. Where high bandwidth is needed for intralaboratory transfer of images and large spectral data sets, copper distributed data interconnect (CDDI) may be required. Multiple shadowed disks that hold redundant copies of data, so that the failure of one drive is tolerated, are common.

Such infrastructures require a front-end expenditure of capital early in the development of an integrated system. Developers cannot wait until user demand from several labs wishing to intercommunicate forces the issue. They will be too far behind the power curve ever to catch up.

Evolution

The evolution of computer hardware and software that can support such an integrated environment is fairly obvious.

Stage I. In this stage (Figure 1), conservative facilities will start with host centric environments because of existing personnel and habit
Figure 3. Peer-peer environment. Databases and other processes are distributed among computers of equal stature. Services may still be present as client-server constructs, but requests can travel both ways.
patterns. A large central computer is at the center of the web.

Stage II. Client-server environments make much better sense as the system expands (Figure 2). Users (clients) are served by computer servants (servers) holding current and commonly used files. These servers are in turn connected to larger computers. By adding servers it is possible to successively layer the development of the system, and the accompanying cost, as demand increases. Concurrently, constant response time to the user can be maintained. Many Stage I developments have not been able to do this, and user dissatisfaction rapidly leads to declining credibility.

Stage III. Peer-peer structures will become more common as software tools improve (Figure 3). The hierarchical distinction between elements in the network begins to disappear, and PC, workstation, and mainframe become peer-like. At the user level this trend has involved graphical user interfaces (GUIs) that have led to interoperable environments, where users do not need to learn an arcane command string to elicit function. The menus and icons of the PC world are the interface. As an illustration, Molecular Design's Isis can run on a Macintosh or a PC across a network to a variety of RDBs. The user will notice no difference in keystroke or screen appearance between the two machines; they work alike and look alike. Use of such systems is intuitive (22).

Object-oriented programming (OOP) will mature at Stage III (22). This is an old concept that allows programs to treat targets as generic objects and permits already written code to be easily reused (a toy sketch follows below). Object-oriented operating systems are also being developed that permit these reutilization and adaptation characteristics to apply to the operating system itself. System managers will be able to respond to user requests quickly and cost-effectively.

Distributed directory service (DDS) capabilities will allow small and large machines to interact in an environment that permits the user to navigate through a complex network without maintaining a physical image of the hardware, without arcane specifications of file locations, and with an awareness of who else is on the network. These combined developments will lead us to an environment where we can work at our PCs, keeping the friendliness they have now, yet have access to the power and knowledge at the remotest tip of the network.
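As a toy Python sketch of the reuse idea (the instrument classes and method names are hypothetical, not any vendor's API): code written once against a generic object works unchanged for every new type added later.

```python
# Toy sketch of object-oriented reuse: generic code written once
# against a base class works for any new instrument type.
# Class and method names here are hypothetical illustrations.

class Instrument:
    """Generic target: anything that can be asked for a reading."""
    def __init__(self, name: str):
        self.name = name

    def read(self) -> float:
        raise NotImplementedError

class PHMeter(Instrument):
    def read(self) -> float:
        return 7.0          # stub value standing in for real acquisition

class Balance(Instrument):
    def read(self) -> float:
        return 0.1234       # stub value, grams

def log_reading(device: Instrument) -> str:
    # Already-written code, reused unchanged for every subclass.
    return f"{device.name}: {device.read()}"

for dev in (PHMeter("pH-1"), Balance("bal-2")):
    print(log_reading(dev))
```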
Finally, expert system shells are becoming available that automatically generate rules from data obtained from the scientist in a cordial dialogue. These rules then allow new data to be transformed into information (23). Rapid strides are being made in automatic knowledge-extraction tools. These advances feed on database information created by integrated systems and help humans apply their own unique cognitive processes to turn data into knowledge.
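As a minimal sketch of the rule-application half of that idea (forward chaining over hypothetical if-then rules; a real shell would also induce the rules themselves from the dialogue):

```python
# Minimal forward-chaining sketch: rules turn raw data into statements
# of information. Rule thresholds and conclusions are hypothetical.

rules = [
    (lambda d: d["ph"] < 4.0,        "sample is strongly acidic"),
    (lambda d: d["lead_ppm"] > 0.05, "lead exceeds the action level"),
]

def infer(data: dict) -> list:
    """Apply every rule whose condition the data satisfies."""
    return [conclusion for condition, conclusion in rules if condition(data)]

print(infer({"ph": 3.2, "lead_ppm": 0.08}))
# ['sample is strongly acidic', 'lead exceeds the action level']
```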
Management and the integrated laboratory system

Given these human and technical challenges, how does one politically bootstrap such an effort? The eventual total investment required in political, fiscal, and human capital is significant, so how is the system justified and how does this pearl grow? The key is effective, knowledgeable, and committed higher management. Although the project begins in a few highly visible labs, it must ripple through the entire organization. This requires a long-range plan established by a group with a high degree of representation from the users. In the best cases the committee is made up of users, and the installation is user-need-driven. The implementers provide a service. This reverses the traditional role of computer groups, which formerly dictated policy. This service role is the trend in organizations that respect the bottom line and understand why it is best to think not of a computer center but of the computer service.

The long-range plan clearly envisages that, after the highly visible and more easily automated labs are in hand, the integration will proceed to labs that are more complex, labs where difficult political problems are present, and where less immediate and visible results are expected. Building an entire system in one massive step is dangerous. Selling of the plan is based on increased efficiency, more creative efforts, and the ability to meet the time and cost factors essential to global competition. This requires management's total involvement.

Management's role. The personnel in the nuclear lab(s) serving as the first focal point for integration and the implementers must be supported by someone in the top level of management (24, 25). This "white knight" must have sufficient technical grasp, charisma, and power to champion the effort. Integration
must be built from the bottom up and supported from the top down. Lab managers and management information system personnel cannot, alone, drive through the blockades of corporate politics, conservatism, and ego that tend to block integration. The long-range plan must be highly visible, agreed upon by consensus, and have the proactive backing of top management.

Experience suggests that 50% of the attempts to utilize LIMS effectively have succeeded. The ability to conceive and deliver a truly integrated IT system will have a much lower success rate. Such poor success rates result not from failures in technology but from deficiencies in the organization.

Failure. Failures in such endeavors have stemmed from a variety of human sources. Attempts to build empires out of IT and the communication network are paramount. Hoarded information and ineffective communication systems characterized the 1960s and 1970s, which were dominated by large computers. The distributed processing environment resulting from the PC revolution, which brought comfortable desktop computing to the masses, changed our views and expectations of the computer as a servant. As we move back to a more integrated, yet still distributed environment, it is essential that the ease of use and access typifying the 1980s be maintained. Unfortunately, empires are being built again. Shared information is power; unshared information is disaster.

Often, precipitous attempts at integration, or pushing an antiquated system beyond its capabilities, have left lab managers with concerns about integrated systems. They see their control and work flow being altered by an alien force. In many instances ill-conceived plans with promises of too much too soon have led to skeptical, pessimistic, and defeatist attitudes among employees. Only tact, diplomacy, and the building of a sense of ownership of the system by the lab personnel will correct this situation.

Success. The system we get is the product of what we need and what we are willing to adopt. If the "we" defined here clearly understand the benefits to be derived from integration, the re-engineering changes will be accepted. The corporation will become more competitive, producing higher quality products in a shorter time frame. Such sustained success is essential to survival (26).
The glistening pearl

Like a pearl that glows more colorfully against skin, an integrated laboratory system accrues power as its users become more sophisticated and integrated themselves. Science has become complex, producing more data, information, and knowledge than we can digest with current methods. The PC allowed us to better control our own small world. All laboratory personnel must now reach out and begin to share what they know with others. Problems in development must be transferred to research quickly, and production upsets should be anticipated or more rapidly solved by development. Research can learn a great deal from its downstream partners. Two-way communication up and down the creation pathway is the only solution. Understanding other people's problems and how they think and act allows integration in the social sense. It makes equal sense in the technical environment.

The author would like to thank the many companies and their employees who have provided the incentive to think about this problem, and those whose conversations stimulated some of the concepts presented here. Interactions with participants in our ACS short course in this area have helped hone and refine many of the ideas. Special thanks go to Michael Starling, Clifford Baker, Keith Casserta, Robert MacDowall, Marc Salit, Dave Duewer, Bernard Vandeginste, and Gerrit Kleijwegt, whose philosophies are intimately intertwined within this document. Particular thanks are extended to one of the referees who helped greatly by suggesting changes to the architecture and content that improved substantially the usefulness of the paper. The figures are adapted with permission from Michael Starling (Union Carbide, Charleston, WV), who also provided insight into the use of computer-aided facilitation.

References
(1) Ono, T. Workplace Management; Productivity Press: Cambridge, MA, 1988.
(2) Nolan, W. Harvard Business Review 1979, March-April, 115.
(3) Brown, D.; White, C. Organizational Research and Artificial Intelligence; Kluwer Academic Publishing: Boston, 1990.
(4) Rubin, D.; Stinson, J. A Quantitative Approach to Management; McGraw-Hill: New York, 1986.
(5) Boothroyd, H. Articulate Intervention; Taylor and Francis: London, 1978.
(6) Martin, J. Application Development without System Programmers and An Information Systems Manifesto; Prentice-Hall: Englewood Cliffs, NJ, 1982.
(7) Dessy, R. E. Anal. Chem. 1984, 56, 1200 A; 1312 A.
(8) Vandeginste, B., Vlaardingen, The Netherlands; personal communication, July 1991.
(9) Burger, A.; Meyer, B.; Jung, C.; Long, K. Hypertext '91 Proceedings. (Available from The ForeFront Group, 1709 Dryden, Suite 901, Houston, TX 77030.)
(10) Dessy, R. E. Chemom. Intell. Lab. Syst. 1990, 8, 2.
(11) Dessy, R. E. Anal. Chem. 1985, 57, 692 A.
(12) Kibbey, M.; Evans, N. Educom 1989, Fall, 15.
(13) Lynch, C. Educom 1989, Fall, 21.
(14) McGill, M. Educom 1989, Fall, 27.
(15) Simutis, L. Educom 1992, Winter, 2.
(16) Breitenberg, M. "The ISO 9000 Series"; NIST Internal Report 4721; U.S. Department of Commerce, National Institute of Standards and Technology: Gaithersburg, MD, 1991.
(17) Quality Systems: ANSI/ASQC Q91-1987; American Society for Quality Control, 310 W. Wisconsin Ave., Milwaukee, WI 53203.
(18) "Standard Specification for the Analytical Information Model for Analytical Data Interchange and Storage"; ASTM Document E49.52.002.R03; Analytical Instrument Association: Alexandria, VA, 1992. (Inquiries may be directed to R. Lysakowski, Digital Equipment Corp., Four Results Way, MR043/C9, Marlboro, MA 01752. Copies of the May 1992 release of the Chromatography Standard may be obtained from M. Duff, AIA, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314.)
(19) Dessy, R. E. Anal. Chem. 1983, 55, 883 A.
(20) Warner, S. A. Anal. Chem. 1990, 62, 389 A.
(21) Warner, S. A. Anal. Chem. 1990, 62, 95 A.
(22) Dessy, R. E. Chemom. Intell. Lab. Syst. 1991, 11, 251.
(23) Hohne, B. A.; Pierce, T. H. Expert System Applications in Chemistry; ACS: Washington, DC, 1989.
(24) Dessy, R. E. Anal. Chem. 1984, 56, 725 A.
(25) Dessy, R. E. Chemom. Intell. Lab. Syst. 1991, 10, 271.
(26) An excellent video presentation of integrated laboratory IT is available in the form of the fall 1991 COMDEX keynote address "The Second Decade: Computer-Supported Collaboration," presented by Andrew Grove, CEO of Intel. The last 30 minutes portrays a live transcontinental and trans-Atlantic integrated IT exchange involving text, voice, SEM images, CD-ROM clips, and live TV. Call the Intel Technical Literature Distribution Center (800-548-4725) to request a free copy (order no. 241226-001, Literature Packet DA03).

Raymond E. Dessy received a B.S. degree in pharmacy (1953) and a Ph.D. in chemistry (1956) from the University of Pittsburgh. After a decade at the University of Cincinnati, he joined the faculty of Virginia Polytechnic Institute and State University in 1966. From 1982 to 1986 he was contributing editor of ANALYTICAL CHEMISTRY's A/C INTERFACE feature. In 1986 he was the first recipient of the ACS Award for Computers in Chemistry. He is currently an associate editor of Chemometrics and Intelligent Laboratory Systems. His research group works on the development of microelectronic biosensors, expert systems for chemical processes, and novel means of processing analytical information. He is internationally recognized for his teaching in the fields of laboratory and technical center automation.