Focus
Tearing Down the Tower of Babel

First steps toward a common language for analytical data

Jim Kling

Imagine what life would be like if everyone in the lab spoke different languages and no one was fluent in more than one. Entire departments would need to be reorganized so that labmates could communicate with one another; experiments would have to be repeated again and again because researchers wouldn't be able to decipher each other's notes and data. Such a situation currently exists with analytical instrumentation. For example, a Hewlett-Packard detector doesn't output results that a Perkin-Elmer detector can read, which prevents researchers from archiving and reviewing data or directly comparing results from instruments made by rival manufacturers. Why can't instruments communicate? After all, Hewlett-Packard makes floppy disks that write in a format that can be read by IBM computers. The reason is that analytical instrument manufacturers have not yet set standards for communications.

This lack of standards is preventing the industry from fulfilling its market potential, says Lynn Matthews, president of Thru-put Systems. "I suspect manufacturers will pay more attention to this, because the market won't grow until we standardize. Users will be held captive by the formats the manufacturers put forth, and that's restrictive," she says. "Things we take for granted as consumers, we don't have in the laboratory."

But that may be changing. In the first step toward better instrument-to-instrument communication, Matthews and others are leading an effort to establish protocols that will dictate how readouts from instruments are stored electronically and allow users to archive the results of experiments in the same file format, regardless of what instrument they came from. Those data-exchange standards, called the Analytical Data Interchange Protocol (ANDI), won't let you connect a Varian GC with a Perkin-Elmer autosampler, but they would be the first step toward uniting the instrument manufacturers and easing the load on lab managers everywhere.
Early attempts

The drive to establish analytical instrumentation standards got its start in October 1989, when a committee of vendors and end users met in Monterey, California, during an American Society for Mass Spectrometry (ASMS) conference. Searching for an already-developed standard that mass spectrometrists could use to output experimental results, the group initially focused on the JCAMP program, which was created under the auspices of the Joint Committee on Atomic and Molecular Physical Data. That standard, developed with the assistance of IR spectrometer manufacturers and already adopted by several of them, writes data in the plain-text ASCII format, so any word processing program can read the file.

However, at the time, JCAMP wasn't very well supported, says David Stranz, a partner at Sierra Analytics LLC and one of the founders of ANDI. "It was developed, but there wasn't any software to support it, and there was no one in charge of supporting the software. There had been a couple of papers published in journals, but that was the extent of it," he recalls. The committee mulled using JCAMP or developing a new standard, but the idea had very little appeal. "Nobody wanted to do that much work," says Stranz.

But good fortune intervened. Rich Lysakowski, who was working for Digital Equipment Corporation at the time, showed up at Perkin-Elmer (where Stranz worked), trying to get them interested in the Analytical Data Interchange and Storage Standards (ADISS), which he had developed for implementing an electronic notebook. (Lysakowski is now executive director of the Collaborative Electronic Notebook Systems Association, an industry group that develops and markets general record keeping systems.) "After talking about it, [the committee] decided that the standard [Lysakowski was using] had all that JCAMP had, and it was complex enough to handle the kinds of data we wanted to push around," says Stranz.

Lysakowski's standard used a format called netCDF, which had been developed by the consortium known as the University Corporation for Atmospheric Research (UCAR) to handle data from researchers modeling thunderstorms and tornadoes. Those storm-watchers used their own modeling programs and had trouble comparing data because the output formats differed. The netCDF format handled raw data and experimental information, and the ADISS standard added sample descriptions, computational results, and other information required to meet Good Laboratory Practices (GLP) specifications. "The target for Rich's effort was the pharmaceutical industry and electronic record keeping. So by using thi

Analytical Chemistry News & Features, September 1, 1999, 621 A
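The appeal of netCDF for ANDI lies in its "self-describing" layout: named dimensions, data arrays, and descriptive attributes travel together in one file, so a reader needs no vendor-specific knowledge to interpret the raw points. A real ANDI file is binary netCDF, but the idea can be sketched in plain Python using only the standard library; every field name below (dimensions, attribute keys, sample values) is illustrative, not taken from the actual ANDI specification.

```python
# Conceptual sketch of a self-describing dataset in the netCDF/ADISS
# spirit: raw data, its dimensions, and its metadata are inseparable.
# All names and values here are hypothetical examples, not ANDI fields.
import json

dataset = {
    # Named dimensions that the data variables refer to
    "dimensions": {"point_number": 5},
    # Experiment-level metadata of the kind ADISS added for GLP records
    "global_attributes": {
        "sample_name": "caffeine standard",
        "instrument": "hypothetical GC",
        "operator": "J. Smith",
    },
    # Each variable carries its own dimensions and units, so any
    # reader can interpret the numbers without the writing software
    "variables": {
        "raw_data": {
            "dimensions": ["point_number"],
            "units": "detector counts",
            "values": [0.0, 12.5, 87.3, 40.1, 3.2],
        },
        "retention_time": {
            "dimensions": ["point_number"],
            "units": "seconds",
            "values": [0.0, 0.5, 1.0, 1.5, 2.0],
        },
    },
}

# Round-trip through a serialized form: a different program reading
# the file recovers both the numbers and their meaning.
text = json.dumps(dataset)
restored = json.loads(text)
assert restored["variables"]["raw_data"]["units"] == "detector counts"
```

The round trip is the point: because units, dimensions, and sample descriptions are stored next to the numbers, two programs that have never seen each other's source code can still agree on what the data mean.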