news
Simulated spectra extend the lifetime of NIR calibrations

At the 2008 Olympics in Beijing, scientists pricked the ears of Michael Phelps and his teammates after every race to draw blood and check their lactate levels; across the world, millions of diabetics prick their fingers several times a day for blood-glucose tests. A noninvasive NIR assay for lactate or glucose is the ultimate goal of researchers Gary Small at the University of Iowa and his former graduate student Yusuf Sulub (now at Novartis Pharmaceuticals). In a recent AC paper (DOI 10.1021/ac801746n), Sulub and Small report how they improved the practicality of NIR spectroscopy assays, presenting a multivariate calibration method that bypasses the need to repeatedly prepare and measure a series of calibration standards.

Despite the promise of NIR spectroscopy, the U.S. Food and Drug Administration has yet to approve a noninvasive monitor based on the technique, partly because the performance of NIR calibration models degrades over time. Because of this degradation, scientists must repeatedly perform and update laborious multivariate, or multiwavelength, calibrations. Previous attempts to minimize the deterioration still required the preparation and spectral measurement of calibration samples. “Making a set of calibration samples and measuring them: that could easily be a week’s worth of work,” says Small. “So one of the big questions in applications where you use multivariate calibration is: How do you keep [the calibration model] functioning over time?”

To address this issue, the researchers measured the background spectra of buffer blanks and combined those spectra with reference spectra of the samples’ pure components to generate synthetic calibration spectra for a particular day. “When we first went down this road, I thought we were being pretty ambitious to think that we could synthesize
spectra well enough,” says Small. In the investigators’ preliminary study of the synthetic spectral modeling approach, which incorporated background spectra, the set of measured calibration spectra was reduced but not eliminated (Appl. Spectrosc. 2007, 61, 406–413). In the current work, the group found that under certain conditions, sufficient numbers of background spectra could effectively supplant the need to measure calibration spectra. “This method says that if the sample matrix is well characterized, you know the constituents, and you know the concentration ranges that they’re going to be in, then you may not even have to do calibration samples,” says Small. “You may be able to just synthesize [the calibration spectra] and collect backgrounds and use the backgrounds to characterize the instrument.”

In effect, incorporating the background spectra into the calibration compensated for instrumental drift and environmental fluctuations over time. The method also significantly decreased the time required for the analyses, because most of the background spectra were collected automatically while the instrument was warming up.

For this project, the researchers generated two partial-least-squares calibration models, one from measured calibration spectra and the other from synthetic calibration spectra, and tested them over a 325-day period. Synthetic calibration spectra based on a particular day’s background spectra were generated for each measurement session. The measured and synthetic calibration models were then evaluated for their ability to accurately predict glucose concentration. The analytical samples consisted of varying concentrations of glucose, lactate, urea, ascorbate, alanine, and triacetin in a phosphate buffer, loosely approximating the composition of blood plasma or serum. Spectra were acquired in triplicate in a random order within each series, and the glucose concentrations in the analytical samples were predicted by applying the
calibration models to the analytical spectra. Single-beam and dual-beam spectra were collected to compare the two types of instruments, and the researchers verified the glucose concentrations with an automated glucose analyzer.

With the measured calibration model, the concentration predictions for the single-beam analytical spectra began to degrade at day 34 and failed completely by day 114. For the dual-beam analytical spectra, the concentration predictions based on the measured calibration model degraded more gradually. The synthetic calibration model predictions, however, did not degrade over time.

To test the viability and robustness of the synthetic calibration method, the researchers prepared samples containing an additional component, proline. When information about the extra component was not fed into the synthetic calibration method, the method could not accurately predict the glucose levels in those samples. “The clear limitation of this approach is the requirement that the sample matrix be well enough characterized such that realistic synthetic spectra can be computed,” says Small.

The researchers hypothesize that their synthetic calibration method would be particularly useful for on-line monitoring in chemical and pharmaceutical plants, and they hope to test that hypothesis in the near future. Perhaps by the 2012 Games in London, athletes and diabetics will be able to say goodbye to sore ears and fingers and simply test their lactate and glucose with a beam of light. —Christine Piggee

10.1021/AC802669W © 2009 American Chemical Society
Published on Web 01/08/2009
Analytical Chemistry, February 1, 2009, p. 861