Information technology and automating the technical center. Getting it all together

Anal. Chem. 1992, 64 (14), pp 733 A-739 A

Raymond E. Dessy, Chemistry Department, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061

Scientists collect data in ever-increasing amounts to meet legal and fiscal imperatives. They convert some small part of it into information, and from that distill a precious drop of knowledge. Ideally all of the knowledge and information should be shared concurrently and retrospectively among colleagues. This assures that time of development is minimized, work efficiency is maximized, and creativity is optimized. Exchange of information and knowledge breeds quality, new ideas, and profit.


Yet today’s laboratory often is plagued with a problem paralleling a concept that mathematicians and biologists have recently found intriguing and ubiquitous. Take a population of animals, data, information, or knowledge and assume that it grows from generation to generation according to the simple law

X_new = RX(1 - X)

where X is the current population or amount, R is the reproductive ratio from period to period, and X_new is the new value. The equation seems simple and reasonable. The new value is proportional to the old value multiplied by the reproductive ratio. The term (1 - X) might suggest that as X increases there is a hindrance to further increase.

In animal populations this might result from a lack of food; in laboratories it might result from increasing difficulties in communicating and sharing data as the numbers of people and facts increase. Alternatively, it could be attributable to the use of inadequate tools that cause the quality of the information and knowledge retrieved to drop. Use your favorite spreadsheet to explore this equation. Take an initial normalized value of X = 0.3, and successively let R = 1, 2, 3.2, and then 3.57 for about 20 periods. When R = 1 the function quickly drops to zero; at R = 2 it rapidly reaches a limit of 0.5. At R = 3.2 the function soon oscillates between 0.5 and 0.8. With R = 3.57 the information transfer efficiencies in each succeeding period will eventually oscillate wildly among seemingly random values.
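The same exploration can be sketched outside a spreadsheet; here is a minimal Python version (illustrative only; the starting value of 0.3, the R values, and the 20 periods are taken from the text above):

```python
# Iterate the logistic relation X_new = R * X * (1 - X), as described above,
# for each reproductive ratio R discussed in the text.

def logistic_series(r, x0=0.3, periods=20):
    """Return the successive population/information values for a given R."""
    values = [x0]
    x = x0
    for _ in range(periods):
        x = r * x * (1 - x)
        values.append(x)
    return values

for r in (1.0, 2.0, 3.2, 3.57):
    tail = ", ".join(f"{v:.3f}" for v in logistic_series(r)[-4:])
    print(f"R = {r}: last few values -> {tail}")
```

The printed tails show the four behaviors described: decay toward zero, a steady limit of 0.5, a two-value oscillation, and the erratic wandering characteristic of chaos.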


This example illustrates the concept of chaos theory. The purpose of this tutorial is to describe some of the concepts that are allowing many major laboratories to avoid the pitfalls of information chaos as they critically examine and automate their operations to make use of available technology.

Work flow re-engineering: The human side

Coalescing information technology (IT) functions within the confines of the modern corporate laboratory is a necessary task, but it is not an easy one. Many current systems have grown by ad hoc addition, not logic. The normal linear flow of data, information, and knowledge that might occur from research, through development, to production must be replaced by a nested set of elliptical feedback loops. This requires that normal work routines and the system for distribution of results be re-engineered for more efficient and better use of physical, human, and factual resources. Resistance to change makes this difficult, and the process is sometimes compounded by an unwillingness of users to allow their areas to be accessed by the network linkages required for complete IT integration. Work re-engineering therefore requires deft, diplomatic studies of current methods, and the development and analysis of alternative strategies for accomplishing tasks more efficiently. Although the methodologies to accomplish this are not new in the business and manufacturing arenas (1-5), they are new to the technical laboratory environment. Analytical chemistry, in particular, can be well served by such restructuring because it can be considered as a classic case of a production environment: samples in, results out. The variables to be considered are the physical flow of samples, utilization of staff, and deployment of instruments. To these must be added careful analyses of what data are to be acquired and how they are to be stored, retrieved, shared, and reported. In most laboratories everyone produces data; unfortunately, this data is often excessive and incomplete, and it rapidly becomes obsolete. The future usefulness of data decreases over time if parameters such as the methods used, instrument setup, analyst identification, and other descriptive information are lacking. In such cases the information content may decline even more rapidly.


Where does all such data and information go? If it is on paper it is ignored, lost, and eventually shredded; if it is electronically stored it clogs computer arteries. Useless data keeps entire industries alive. Technical center work reorganization therefore has some unique problems that involve human as well as technical factors.

The nucleus

Ownership of the new tools that will affect work flow and work habits is an important factor. There must be a consensus among those affected, a degree of participation, and an emotional feeling that it is "our" system rather than a solution imposed by "them." The process of knowing what enterprise integration is and how to accomplish it involves one or more of five approaches. The most satisfactory approach involves a lengthy education of the staff of the first laboratory to be automated by a sympathetic, sensitive group. This process provides the user with the knowledge and vocabulary to express new needs. It is not desirable or efficient merely to replicate old manual systems with new hardware and software. New ways to approach tasks, share data, and extract information and knowledge must be found. Many corporations are unwilling to expend the time or money to do this, although this approach purchases the strongest ownership bonds, produces the best product, and assures that workers will accept the system, not merely put up with it.


An alternative is to devise a database survey composed of a set of questions involving parameter choices and weights. The initial responses from everyone affected are then shared with the participants, and a second request is made for responses to the same queries. This iterative process arrives at a consensus unbiased by the assertiveness of a few individuals in a public forum, allows polar views to be mitigated by reflective compromise or conciliation, and permits unique views to win a following because of the unpressured contemplation that the system offers. Interactive computer-assisted consensus, strategy, and tactical decision-making "facilitating rooms" are now becoming available to provide this type of approach electronically in real time. The combination of human contact and anonymous iterative input with computer text and thought processing is a rapid way to combine man and machine in planning total integration. Users find the output incisive, less biased, and more synergistic and comprehensive than that obtained by other means. A third option is to present alternative "paper" scenarios to users who can select or discard features and synthesize their own plan. This method is based on the philosophy that most people know what art or music they prefer, even though they may not be able to paint or compose. One must, however, guard against innate biases in the proffered alternatives. Installation of operating prototype systems that allow users to develop a plan based on actual exposure is a fourth approach. Although this runs counter to most of the "dialogue, then planning" dogma instilled in computer scientists, such prototyping has long been encouraged by James Martin, a guru of large computer systems (6).

Figure 1. Host centric environment. The central (distributed) hosts provide all services to users.

The only caveat is that if the prototype is a poor fit, discard it and rebuild from the beginning. Jury-rigging an unsuitable system always results in failure because of maintenance difficulties. A fifth option involves having technically qualified individuals interview those to be affected in very small groups. A consensus plan can be developed from merged reports. The possible difficulties in this approach are inherent interviewer biases and the probability that multiple internal interviewers will provide uneven understandings of the lab, whereas outside interviewers may misinterpret the corporate personality. Whatever the approach, any development scenario must ask the users to consider the following questions in some way. How should sample and work requests travel through the lab? The plan must be built on future needs, not past habits. What data should be collected electronically, and what role should bar codes play in sample, test, and operator identification? How can file protocols be standardized? How can data be released to the client most effectively? How should sign-off and access control be implemented to protect both laboratory and client from misinterpretation? What better mechanisms can be developed to interpret data? How could automatic information and report generation be achieved to reduce paper workload, yet retain the ability to immediately flag unusual events for both lab and client? This could include analytical reliability statistics, correlation of multiple analytical results where matrix effects might be important, and appropriate warnings to the client about alternative interpretations.

All of these would increase the future usefulness of the information provided. What information and knowledge is currently being lost by the data manipulation programs now being used? How could expert systems (7) be used to extract more and better information? An expert system-driven methods development program could suggest procedures for characterizing new analytes. What automatic knowledge extraction tools are needed for the large variety of lab databases? They could include correlations among analyte source, date, previous handling, analytical methods used, and related samples. Long-term statistical analytical trends, possible supplier problems, and potential corrections might be provided, increasing the future usefulness of the information. Analytical lab sample throughput optimization charts based on retrospective loads and current instrument, manpower, and sample makeup could be generated (lab throughput increases of 30% have been reported at Unilever in the Netherlands [8]). How can data, information, and knowledge be shared globally to improve product delivery time and quality, reduce duplicate efforts, and improve creativity by cross-fertilization? Few companies fully use the two most valuable resources they possess: their employees and the data that these individuals have created. Data is collected, reported, and archived, but it is seldom efficiently shared. Access to the material by those with related interests via organized electronic search, browse, or special interest group modes is essential.

Will there be a scientific electronic lab notebook available? Many companies are attempting to provide repositories for research-type activities (9, 10), just as they are doing for more structured analytical and quality control aspects. One goal is to make the individual research notebook more available to others. Another is to provide a better record for patent litigation purposes. The general industrial consensus is that "write once, read many times" (WORM) optical drives meet the test of legal acceptability. Holding both scientific results and purchase order information, this medium can demonstrate conception and diligent pursuit (11). Will the library be an integral part of the network? Many companies already use fax and e-mail dissemination of reports and have centralized facilities where access to, not ownership of, documents is important. These often include electronically scanned historical reports stored as machine-readable and searchable files. Network access to CD-ROM databases and journals is developing. The ISO (International Standards Organization) Z39.50 access standard is becoming more prevalent, providing client-server interaction and transparent interlibrary access. Electronic training facilities using interactive CD-ROMs and video conferencing are part of this environment. New compander (compress/expand) hardware has reduced the bandwidth requirements of long-distance conferencing. Today's librarians are information technologists, a vital part of IT (12-15). Are there a move and a commitment to ISO 9000 standards? Laboratories have started a move toward focusing on quality.


Competition from the Pacific Rim, the impact of EC '92, and the changing nature of the American workplace all suggest that total quality is a necessary product characteristic for maintaining competitiveness. Primary producers and service laboratories alike are finding it important to be able to certify that they meet a minimum standard of quality, so that purchasers of their product need not implement their own expensive quality control procedures. The economic sensibility of this approach, and the marketplace pressure from those who adopt it first, will rapidly force the issue. The ISO has developed total quality specifications (ISO 9001, 9002, 9003) that allow manufacturers and labs to place the equivalent of a UL label on their product or analytical report (16, 17). U.S. equivalents to the ISO standards are available as ANSI specifications. Compliance begins in the shop or lab, but its proof is to be found in the databases that make up IT. What database search strategies would you like to have? Most older systems are locked into rigid keyword search strategies. New systems allow full-text and Boolean searches and permit queries that were never envisaged at the time the database was structured. This latter feature is the strength of relational databases (RDBs). This approach also allows different fields within and among such RDBs to be joined, providing a new view into data and information relationships (a minimal query sketch appears at the end of this section).

Can you imbed structures, graphs, and diagrams into reports? What forms of this compound document architecture (CDA) do you need to produce the reports that clients, management, and regulatory bodies demand? Standards for CDA are rapidly being solidified. The Analytical Instrument Association (AIA) and ASTM are at work on analytical data interchange standards that will make it possible to take files of electronically captured data and manipulate these as objects for inclusion in reports (18). Much of the pioneering effort has occurred in the Digital Equipment Corporation ADISS (Analytical Data Interchange and Storage Standards) project. Standard formats for storage of chemical structures already exist.

A development plan can begin to emerge from these queries and the responses they elicit. However, there are other factors that must also be considered.
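The kind of join alluded to above can be sketched with Python's built-in sqlite3 module. This is a toy illustration, not taken from any particular LIMS or RDB product; the table and column names (samples, results, sample_id, and so on) are invented for the example:

```python
import sqlite3

# Two hypothetical tables -- one describing samples, one holding results --
# joined on a shared sample_id to give a view that neither table offers alone.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE samples (sample_id TEXT PRIMARY KEY, source TEXT, received TEXT);
    CREATE TABLE results (sample_id TEXT, analyte TEXT, value REAL, method TEXT);
    INSERT INTO samples VALUES ('S-001', 'Supplier A', '1992-07-01');
    INSERT INTO results VALUES ('S-001', 'lead', 0.04, 'ICP-MS');
    INSERT INTO results VALUES ('S-001', 'cadmium', 0.01, 'ICP-MS');
""")

# The join relates fields drawn from both tables in a single query.
query = """
    SELECT s.source, r.analyte, r.value, r.method
    FROM samples AS s JOIN results AS r ON s.sample_id = r.sample_id
    ORDER BY r.analyte
"""
for source, analyte, value, method in db.execute(query):
    print(source, analyte, value, method)
```

The same idea, applied across separately maintained databases, is what gives an integrated RDB framework the "new view" into existing data described above.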

Figure 2. Client-server environment. The network gives clients access to multiple server resources. Simplicity and reduced incremental expansion costs can be gained by isolating similar services on their own computer servers.


Work flow re-engineering: The technical side

It is perhaps simplistic, but useful, to look at the evolution of laboratory automation from several perspectives (Table I). Many companies today are initiating the development of integrated IT systems. They start in an area of obvious need, with a carefully analyzed scenario of what the new system is to provide in the way of work re-engineering to improve throughput, quality, and exchange. The success of this effort is then allowed to ripple through the remaining organizational components, assuring that individual needs are reconciled with the mandate that all systems installed be capable of effective intercommunication. This usually means a common database architecture, built around an RDB framework and an easy-to-use structured query language (SQL). Unix as an operating system, coupled with RDBs such as Rdb, Ingres, and Oracle, are commonly encountered examples. Such platforms offer relatively stable, flexible standards coupled with probable longevity, which are vital factors in the continuing evolution that will occur (19).

Software factors. Many scientists consider commercial laboratory information management systems (LIMS) to be cost-efficient and immediate solutions to laboratory data needs. Unfortunately, most commercial products labeled LIMS are actually only laboratory data management systems (LDMS), not true LIMS. Many people also have the impression that coupling various LDMS constitutes an effective approach to enterprise-wide automation. Unfortunately, database descriptor format incompatibilities and inefficient or nonexistent information- and knowledge-extraction tools can make this approach difficult. Some commercial LIMS (LDMS) products can provide a beginning (20). However, customizing a commercial LIMS product may not be as easy as many promise, and customizing usually is necessary. If this approach is attempted, full source code for the modified portions of the software should be available. (All other code should be held in escrow by a third party.)

A full and open license to the database engine involved is essential. Some commercial systems provide run-time versions of the LDMS and database engine only. This severely limits the ability to modify structures, paint new screens, or change database descriptor variables. For these reasons, about half of the larger companies are building their own systems. Such in-house systems are costly to develop and may involve ten man-years or more to create. On the other hand, they may fit well and potentially can be easily modified. The success of such an effort depends highly on the skills of the software team. Poor planning, improper language and operating system platforms, inadequate documentation, and loss of key system programmers and managers have all led to catastrophe. For multiple domestic site operations, the large software development costs can be easily amortized, and facile file transfer between sites is assured. For international operations the seamlessness of a common system can be essential. Slightly modified screens accommodate language differences, but provide a common database all may share. Alternatively, a front-end expert system can provide a look-alike access mechanism to older, traditional existing databases with different internal formats. A much less satisfactory approach to achieving compatibility involves the use of file interchange translators. These consume time and computer cycles, and writing them in house may not be a trivial process. Supportive vendors should provide these if proprietary systems need to be preserved.

Networking factors. A long-range facility-wide plan must provide the computer power and network infrastructure (21) for the IT system. This plan must recognize that the distribution of computing power begun with the PC revolution must remain, yet it must be integrated into a more centrally managed networked system to meet the new demands of sharing, certification, validation, archiving, and system management. The network's evolvable bandwidth must be large enough to accommodate the increasing demands that will be made of it as users become more sophisticated. Multiple local area networks (LANs) transmitting at 10 megabits per second (Mb/s), coupled with redundant fiber distributed data interconnect (FDDI) at 100-200 Mb/s, are becoming common.

This normally takes the form of a segmented LAN made up of baseband or token-ring structures servicing high-volume local traffic within laboratories. These segments may be connected with repeaters, bridges, or routers. Repeaters merely repeat a communication onto another segment and have no directional or security ability. Many companies are using combination bridges and routers (brouters). These permit control of the sensitive information flow to only appropriate destinations and also allow for intelligent rerouting in case of LAN faults. Laboratories are becoming so heavily dependent upon electronic transmission that a failure is catastrophic.

Concentric duplicate ringed high-speed FDDI links among buildings with duplicate brouter access will survive cable severing. LAN segments can be constructed from thin-wire coaxial cable for noisy environments, or unshielded twisted-pair (such as level 3 wire, about 3 twists/ft) for office areas. Where high bandwidth is needed for intralaboratory transfer of images and large spectral data sets, copper distributed data interconnect (CDDI) may be required. Multiple shadowed disks that hold redundant copies of data, so that failure of one drive leads to a fault-tolerant condition, are common. Such infrastructures require a front-end expenditure of capital early in the development of an integrated system. Developers cannot wait until user demand from several labs wishing to intercommunicate forces the issue. They will be too far behind the power curve ever to catch up.

Evolution

The evolution of computer hardware and software that can support such an integrated environment is fairly obvious.

Stage I. In this stage (Figure 1), conservative facilities will start with host centric environments because of existing personnel and habit patterns.

Figure 3. Peer-peer environment. Databases and other processes are distributed among computers of equal stature. Services may still be present as client-server constructs, but requests can travel both ways.


A large central computer is at the center of the web.

Stage II. Client-server environments make much better sense as the system expands (Figure 2). Users (clients) are served by computer servants (servers) holding current and commonly used files. These servers are in turn connected to larger computers. By adding servers it is possible to successively layer the development of the system, and the accompanying cost, as demand increases. Concurrently, constant response time to the user can be maintained. Many Stage I developments have not been able to do this, and user dissatisfaction rapidly leads to declining credibility.

Stage III. Peer-peer structures will become more common as software tools improve (Figure 3). The hierarchical distinction between elements in the network begins to disappear, and PC, workstation, and mainframe become peer-like. At the user level this trend has involved graphical user interfaces (GUIs) that have led to interoperable environments, where users do not need to learn an arcane command string to elicit function. The menus and icons of the PC world are the interface. As an illustration, Molecular Design's Isis can run on a Macintosh or a PC across a network to a variety of RDBs. The user will notice no difference in keystroke or screen appearance between the two machines; they work alike and look alike. Use of such systems is intuitive (22).

Object-oriented programming codes (OOPs) will mature at Stage III (22). This is an old concept that allows programs to treat targets as generic objects and permits already written code to be easily reused (a short illustrative sketch appears a few paragraphs below). Object-oriented operating systems are also being developed that permit these reutilization and adaptation characteristics to apply to the operating system itself. System managers will be able to respond to user requests quickly and cost-effectively.

Distributed directory service (DDS) capabilities will allow small and large machines to interact in an environment that permits the user to navigate through a complex network without maintaining a physical image of the hardware, without arcane specifications of file locations, and with an awareness of who else is on the network. These combined developments will lead us to an environment where we can work at our PCs, keeping the friendliness they have now, yet have access to the power and knowledge at the remotest tip of the network.
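A minimal Python sketch of that reuse idea follows (the class names LabRecord and ChromatogramRecord are invented for illustration and are not drawn from the article or from any commercial object-oriented system):

```python
# A generic record class written once ...
class LabRecord:
    def __init__(self, sample_id, analyst):
        self.sample_id = sample_id
        self.analyst = analyst

    def summary(self):
        return f"{self.sample_id} (analyst: {self.analyst})"

# ... and reused unchanged by a new instrument-specific type, which adds
# only what is new to it rather than rewriting the shared code.
class ChromatogramRecord(LabRecord):
    def __init__(self, sample_id, analyst, peak_areas):
        super().__init__(sample_id, analyst)
        self.peak_areas = peak_areas

    def summary(self):
        return super().summary() + f", {len(self.peak_areas)} peaks"

print(ChromatogramRecord("S-001", "RED", [1.2, 3.4]).summary())
```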

Finally, expert system shells are becoming available that automatically generate rules from data obtained from the scientist in a cordial dialogue. These rules then allow new data to be transformed into information (23). Rapid strides are being made in automatic knowledge-extraction tools. These advances feed on database information created by integrated systems and help humans apply their own unique cognitive processes to turn data into knowledge.

Management and the integrated laboratory system

Given these human and technical challenges, how does one politically bootstrap such an effort? The eventual total investment required in political, fiscal, and human capital is significant, so how is the system justified and how does this pearl grow? The key is effective, knowledgeable, and committed higher management. Although the project begins in a few highly visible labs, it must ripple through the entire organization. This requires a long-range plan established by a group with a high degree of representation from the users. In the best cases the committee is made up of users, and the installation is user-need-driven. The implementers provide a service. This reverses the traditional role of computer groups, which formerly dictated policy. This service role is the trend in organizations that respect the bottom line and understand why it is best to think not of a computer center but of the computer service. The long-range plan clearly envisages that, after the highly visible and more easily automated labs are in hand, the integration will proceed to labs that are more complex, labs where difficult political problems are present, and where less immediate and visible results are expected. Building an entire system in one massive step is dangerous. Selling of the plan is based on increased efficiency, more creative efforts, and the ability to meet the time and cost factors essential to global competition. This requires management's total involvement.

Management's role. The personnel in the nuclear lab(s) serving as the first focal point for integration and the implementers must be supported by someone in the top level of management (24, 25). This "white knight" must have sufficient technical grasp, charisma, and power to champion the effort.


Integration must be built from the bottom up and supported from the top down. Lab managers and management information system personnel cannot, alone, drive through the blockades of corporate politics, conservatism, and ego that tend to block integration. The long-range plan must be highly visible, agreed upon by consensus, and have the proactive backing of top management. Experience suggests that 50% of the attempts to utilize LIMS effectively have succeeded. The ability to conceive and deliver a truly integrated IT system will have a much lower success rate. Such poor success rates result not from failures in technology but from deficiencies in the organization.

Failure. Failures in such endeavors have stemmed from a variety of human sources. Attempts to build empires out of IT and the communication network are paramount. Hoarded information and ineffective communication systems characterized the 1960s and 1970s, which were dominated by large computers. The distributed processing environment resulting from the PC revolution, which brought comfortable desktop computing to the masses, changed our views and expectations of the computer as a servant. As we move back to a more integrated, yet still distributed environment, it is essential that the ease of use and access typifying the 1980s be maintained. Unfortunately, empires are being built again. Shared information is power; unshared information is disaster. Often, precipitous attempts at integration, or pushing an antiquated system beyond its capabilities, have left lab managers with concerns about integrated systems. They see their control and work flow being altered by an alien force. In many instances ill-conceived plans with promises of too much too soon have led to skeptical, pessimistic, and defeatist attitudes among employees. Only tact, diplomacy, and the building of a sense of ownership of the system by the lab personnel will correct this situation.

Success. The system we get is the product of what we need and what we are willing to adopt. If the "we" defined here clearly understand the benefits to be derived from integration, the re-engineering changes will be accepted. The corporation will become more competitive, producing higher quality products in a shorter time frame. Such sustained success is essential to survival (26).

The glistening pearl

Like a pearl that glows more colorfully against skin, an integrated laboratory system accrues power as its users become more sophisticated and integrated themselves. Science has become complex, producing more data, information, and knowledge than we can digest with current methods. The PC allowed us to better control our own small world. All laboratory personnel must now reach out and begin to share what they know with others. Problems in development must be transferred to research quickly, and production upsets should be anticipated or more rapidly solved by development. Research can learn a great deal from its downstream partners. Two-way communication up and down the creation pathway is the only solution. Understanding other people's problems and how they think and act allows integration in the social sense. It makes equal sense in the technical environment.

The author would like to thank the many companies and their employees who have provided the incentive to think about this problem, and those whose conversations stimulated some of the concepts presented here. Interactions with participants in our ACS short course in this area have helped hone and refine many of the ideas. Special thanks go to Michael Starling, Clifford Baker, Keith Casserta, Robert MacDowall, Marc Salit, Dave Duewer, Bernard Vandeginste, and Gerrit Kleijwegt, whose philosophies are intimately intertwined within this document. Particular thanks are extended to one of the referees who helped greatly by suggesting changes to the architecture and content that improved substantially the usefulness of the paper. The figures are adapted with permission from Michael Starling (Union Carbide, Charleston, WV), who also provided insight into the use of computer-aided facilitation.

References

(1) Ono, T. Workplace Management; Productivity Press: Cambridge, MA, 1988.
(2) Nolan, W. Harvard Business Review 1979, March-April, 115.
(3) Brown, D.; White, C. Organizational Research and Artificial Intelligence; Kluwer Academic Publishing: Boston, 1990.
(4) Rubin, D.; Stinson, J. A Quantitative Approach to Management; McGraw-Hill: New York, 1986.
(5) Boothroyd, H. Articulate Intervention; Taylor and Francis: London, 1978.
(6) Martin, J. Application Development without System Programmers and An Information Systems Manifesto; Prentice-Hall: Englewood Cliffs, NJ, 1982.
(7) Dessy, R. E. Anal. Chem. 1984, 56, 1200 A; 1312 A.
(8) Vandeginste, B., Vlaardingen, The Netherlands; personal communication, July 1991.
(9) Burger, A.; Meyer, B.; Jung, C.; Long, K. Hypertext '91 Proceedings. (Available from The ForeFront Group, 1709 Dryden, Suite 901, Houston, TX 77030.)
(10) Dessy, R. E. Chemom. Intell. Lab. Syst. 1990, 8, 2.
(11) Dessy, R. E. Anal. Chem. 1986, 57, 692 A.
(12) Kibbey, M.; Evans, N. Educom 1989, Fall, 15.
(13) Lynch, C. Educom 1989, Fall, 21.
(14) McGill, M. Educom 1989, Fall, 27.
(15) Simutis, L. Educom 1992, Winter, 2.
(16) Breitenberg, M. "The ISO 9000 Series"; NIST Internal Report 4721; U.S. Department of Commerce, National Institute of Standards and Technology: Gaithersburg, MD, 1991.
(17) Quality Systems; ANSI/ASQC Q91-1987; American Society for Quality Control, 310 W. Wisconsin Ave., Milwaukee, WI 53203.
(18) "Standard Specification for the Analytical Information Model for Analytical Data Interchange and Storage"; ASTM Document E49.52.002.RO3; Analytical Instrument Association: Alexandria, VA, 1992. (Inquiries may be directed to R. Lysakowski, Digital Equipment Corp., Four Results Way, MR043/C9, Marlboro, MA 01752. Copies of the May 1992 release of the Chromatography Standard may be obtained from M. Duff, AIA, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314.)
(19) Dessy, R. E. Anal. Chem. 1983, 55, 883 A.
(20) Warner, S. A. Anal. Chem. 1990, 62, 389 A.
(21) Warner, S. A. Anal. Chem. 1990, 62, 95 A.
(22) Dessy, R. E. Chemom. Intell. Lab. Syst. 1991, 11, 251.
(23) Hohne, B. A.; Pierce, T. H. Expert System Applications in Chemistry; ACS: Washington, DC, 1989.
(24) Dessy, R. E. Anal. Chem. 1984, 56, 725 A.
(25) Dessy, R. E. Chemom. Intell. Lab. Syst. 1991, 10, 271.
(26) An excellent video presentation of integrated laboratory IT is available in the form of the fall 1991 COMDEX keynote address "The Second Decade: Computer-Supported Collaboration," presented by Andrew Grove, CEO of Intel. The last 30 minutes portrays a live transcontinental and trans-Atlantic integrated IT exchange involving text, voice, SEM images, CD-ROM clips, and live TV. Call the Intel Technical Literature Distribution Center (800-548-4725) to request a free copy (order no. 241226-001, Literature Packet DA03).

Raymond E. Dessy received a B.S. degree in pharmacy (1953) and a Ph.D. in chemistry (1956) from the University of Pittsburgh. After a decade at the University of Cincinnati, he joined the faculty of Virginia Polytechnic Institute and State University in 1966. From 1982 to 1986 he was contributing editor of ANALYTICAL CHEMISTRY's A/C INTERFACE feature. In 1986 he was the first recipient of the ACS Award for Computers in Chemistry. He is currently an associate editor of Chemometrics and Intelligent Laboratory Systems. His research group works on the development of microelectronic biosensors, expert systems for chemical processes, and novel means of processing analytical information. He is internationally recognized for his teaching in the fields of laboratory and technical center automation.

