Report
Measurement Uncertainty

Uncertainty in the broad sense is no new concept in chemistry; analysts have always sought to quantify and control the accuracy of their results. Few analysts would dispute that a result is of little value without some knowledge of the associated uncertainty; clearly, without such information, interpretation is impossible. Correct interpretation depends on a good assessment of accuracy. When estimates of accuracy are optimistic, results may appear irreconcilable and may be overinterpreted; with unduly pessimistic assessments, methods may appear unfit for a particular purpose and may be optimized when it is not necessary.
Steve Ellison, Laboratory of the Government Chemist (U.K.)
Wolfhard Wegscheider, University of Leoben (Austria)
Alex Williams, EURACHEM Working Group on Measurement Uncertainty (U.K.)
Correct interpretation of accuracy ensures that results are judged neither overly optimistically nor unduly pessimistically. In general, different methods of estimating uncertainty will lead to different values. Most estimates of accuracy have been based on the standard deviation of a series of experiments or of interlaboratory comparisons, often in association with estimates of bias in the form of recovery estimates. When individual effects are being considered, the contribution from random variability can be estimated from repeatability, reproducibility, or other precision measures. In addition, separate contributions from several systematic or random effects can be combined linearly or by the root sum of squares. Finally, the way uncertainty is expressed can vary substantially. Confidence intervals, absolute limits, standard deviations, and coefficients of variation are all in common use. Clearly, with so many possibilities for estimating and expressing such a critical parameter, a consensus is essential for comparability. The most recent recommendation is that accuracy be expressed in terms of a quantitative estimate of uncertainty as described in the International Organization for Standardization's (ISO) Guide to the Expression of Uncertainty in Measurement (1) and in corresponding guidance from other measurement authorities (2, 3). The guide is published under the auspices of several organizations, including ISO, the International Bureau of Weights and Measures (BIPM), the International Organization of Legal Metrology (OIML), and the International Union of Pure and Applied Chemistry (IUPAC). The document lays out a standard approach to estimating and expressing uncertainty across many fields of measurement and, in view of its pedigree, is widely accepted by accreditation and certification agencies worldwide.
This Report deals primarily with the provisions and interpretation of this document, though it is recognized that different approaches are used in other ISO documents.

Definitions
The definition of measurement uncertainty is "a parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably
be attributed to the measurand" (1, 4). Thus, measurement uncertainty describes a range or distribution of possible values. For example, 82 ± 5 describes a range of values. Measurement uncertainty, therefore, differs from "error", which is defined as a single value—the difference between a result and the true value. The stated range must also include the values the measurand could reasonably take, on the basis of the result. That makes it quite different from measures of precision, which give only the range within which the mean of a series of experiments will lie.
Precision makes no allowance for bias; measurement uncertainty includes both random and systematic components. Note that known systematic errors, or bias, should be corrected for as fully as possible; failure to make such a correction is simply to report a result known to be wrong. But an uncertainty associated with each correction factor remains and must be considered. This consideration of systematic effects makes measurement uncertainty more realistic than measures such as standard error. Finally, measurement uncertainty is an estimate. Obviously, all statistical calculations on finite samples provide estimates of population parameters, but the estimate goes deeper than this. Devising experiments that can accurately characterize uncertainties in method bias and other systematic effects is extremely difficult. For example, most derivatizations are presumed to proceed to completion. How certain can the analyst be of this? Unfortunately, statistics help little; in practice, the chemist is often forced to make an educated estimate from prior experience. However, it is crucial to realize that the attempt must be made; the correction for bias and the uncertainty of this correction factor cannot simply be ignored if comparability is to be established.

Box 1. Calculating uncertainty using ISO rules

Rule 1: All contributions are combined in the form of standard deviations (SDs). Combining as SDs allows a rigorous combined SD to be calculated using standard forms. It does not imply that the underlying distribution is, or needs to be, normal—every distribution has an SD. It is not perfectly rigorous to deduce a confidence interval from a combined SD, but in most cases, especially when three or more comparable contributions are involved, the approximation is at least as good as most contributing estimates.

Rule 2: Uncertainties are combined according to

u(y)² = Σᵢ (∂y/∂xᵢ)² uᵢ²     (Eq. 1)

in which u(y) is the uncertainty of a value y; u₁, u₂, . . . are the uncertainties of the independent parameters x₁, x₂, . . . on which it depends; and ∂y/∂xᵢ is the partial derivative of y with respect to xᵢ. When variables are not independent, the relationship is extended to include a correlation term (1). Rule 2 establishes the principle of combination by root sum of squares. One corollary is that small components are quickly swamped by larger contributions, making it particularly important to obtain good values for large uncertainties and unnecessary to spend time on small components. In pictures, this looks like a simple Pythagorean triangle: for two uncertainties u₁ and u₂, the combined uncertainty can be visualized as the hypotenuse of a right triangle whose sides are u₁ and u₂.

Rule 3: The SD obtained from Eq. 1 needs to be multiplied by a coverage factor k to obtain a range, called the expanded uncertainty, which includes a large fraction of the distribution. For most purposes, k = 2 is sufficient (2) and will give a range corresponding to an approximately 95% confidence interval; k = 3 is recommended for more demanding cases.
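For readers who prefer to see the three rules in executable form, the short Python sketch below combines a set of standard uncertainties by root sum of squares and applies a coverage factor. The component values and function names are invented for illustration and are not taken from the article; the sketch assumes the contributions are independent and already expressed as SDs of the result.

import math

def combined_standard_uncertainty(components):
    # Rules 1 and 2: combine independent contributions, each already
    # expressed as a standard deviation of the result, by root sum of squares.
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(u_combined, k=2):
    # Rule 3: multiply by a coverage factor k (k = 2 for ~95% confidence
    # in routine work, k = 3 for more demanding cases).
    return k * u_combined

# Hypothetical standard uncertainties, in the same units as the result.
# Note how the small 0.05 term is largely swamped by the 0.25 term.
components = [0.10, 0.25, 0.05]
u_c = combined_standard_uncertainty(components)   # about 0.27
U = expanded_uncertainty(u_c, k=2)                 # about 0.55
print(f"u_c = {u_c:.3f}, expanded uncertainty (k = 2) = {U:.3f}")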
Error and uncertainty. In common parlance, the terms error and uncertainty are frequently used interchangeably. However, several significant differences in the concepts are implied by the terms as defined by ISO (4). Error is defined as the difference between an individual result and the true value of the measurand. Error, therefore, has a single value for each result. In principle, an error could be corrected if all the sources of error were known, though the random part of an error varies from one determination to the next. Uncertainty, on the other hand, takes the form of a range and, if estimated for an analytical procedure and a defined sample type, may apply to all determinations so described. No part of uncertainty can be corrected for. In addition, estimation of uncertainty does not require reference to a true value, only to a result and the factors that affect the result. This shift in philosophy marks a concept rooted in observable, rather than theoretical, quantities.

To further illustrate the difference, the result of an analysis after correction may—by chance—be very close to the value of the measurand, and hence have a negligible error. The uncertainty may nonetheless still be very large, simply because the analyst is unsure of the size of the error.

Uncertainty and quality assurance. ISO explicitly excludes gross errors of procedure from consideration within an uncertainty assessment. Uncertainty estimates can realistically apply only to well-established measurement processes in statistical control, and thus they are a statement of the uncertainty expected when proper quality control (QC) measures are in place. It is thus implicit that QC and quality assurance (QA) processes be in place and within specification if an uncertainty statement is to be at all meaningful.

Repeatability and reproducibility
The most generally applied estimates of uncertainty at present are those obtained from interlaboratory comparisons, particularly those using the collaborative trial protocols of ISO 5725 (5) and the Association of Official Analytical Chemists (AOAC) (6). For legislative purposes, the collaborative trial reproducibility figure is the closest approach to uncertainty that attempts to estimate the full dispersion of results obtained by a particular method, and it has the considerable advantages of simplicity and generality, though at high cost. Another substantial advantage is its objectivity, because it is based entirely on experimental observations in a representative range of laboratories. Though it serves well in cases in which the chief issue is comparability among particular laboratories with a common aim, several factors leave this approach wanting. Reproducibility is, inevitably, a measure of precision; although it covers a range of laboratory bias, it cannot cover bias inherent in the method itself nor, in general, sample matrix effects. Arguably, these effects are not relevant for a standard method, which may simply define a procedure that generates a result for trade or legislative purposes. Many methods do, indeed, fall into this class; even when a method purports to determine a specific molecular species, there is no guarantee that it determines all that is present or, indeed, any particular species at all.
Figure 1. Uncertainty estimation process.
An example is the semiquantitative AOAC method for detecting cholesterol. Though standardized and properly accepted for certain trade and regulatory purposes on the basis of collaborative trial data showing sufficient agreement between laboratories (7), subsequent work using internal calibration (8) has shown that method recovery is poorer than the reproducibility figure suggests. It follows that long-term studies of cholesterol levels in food could be expected to misinterpret changes in apparent level, particularly among nations using different cholesterol determination methods. Reproducibility figures will, in general, suffer from the absence of bias information. These arguments suggest that reproducibility will generally underestimate uncertainty—but not necessarily. A single laboratory can have much smaller uncertainties for a determination than the reproducibility figure would indicate, which
tends to include a range of poor as well as good results. This issue can be put more bluntly: What does the spread of results found by a handful of laboratories on a specific set of samples at some time in the past have to do with the results of an individual laboratory today? Indeed, this is the very question that must be answered quantitatively before any laboratory can make use of collaborative trial information in a formal uncertainty estimate. It follows that reproducibility, although a powerful tool, is not a panacea.

The ISO approach
The approach recommended in the ISO guide, outlined below, is based on combining the uncertainties in contributory parameters to provide an overall estimate of uncertainty (Figure 1). To begin, write down a clear statement of what is being measured, including the relationship between the measurand and the parameters (measured quantities, constants, calibration standards, and other influences) on which it depends. When possible, include corrections for known systematic effects.
Figure 2. Dioxin analysis.
Though the basic specification information is normally given in the relevant standard operating procedure or other method description, it is often necessary to add explicit terms for factors such as operating temperatures and extraction time or temperature, which will not normally be included in the basic calculation given in a method description. Then, for each parameter in this relationship, list the possible sources of uncertainty, including chemical assumptions. Measure or estimate the size of the uncertainty associated with each possible source (or with a combination of sources). Combine the quantified uncertainty components, expressed as standard deviations, according to the appropriate rules (see Box 1) to give a combined standard uncertainty, and apply the appropriate coverage factor to give an expanded combined uncertainty. The most important features are that all contributing uncertainty components are quantified as standard deviations in the first instance, whether they arise from random variability or from systematic effects, and that estimates of uncertainty from experiment, prior knowledge, and professional judgment are treated in the same way and given the same weight.
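The workflow just described (specify the measurement model, quantify each input uncertainty, combine according to the rules in Box 1) can be sketched in a few lines of Python. The model, values, and uncertainties below are hypothetical, and the helper name uncertainty_budget is ours; the sketch simply applies Eq. 1 with numerically estimated sensitivity coefficients and assumes independent inputs.

import math

def uncertainty_budget(model, values, uncertainties, rel_step=1e-6):
    # Propagate standard uncertainties through a measurement model
    # y = model(**values) using Eq. 1 of Box 1:
    #   u(y)^2 = sum over i of (dy/dx_i)^2 * u(x_i)^2,
    # with sensitivity coefficients estimated by central finite differences.
    y = model(**values)
    u_squared = 0.0
    for name, u_x in uncertainties.items():
        x = values[name]
        h = abs(x) * rel_step or rel_step
        hi = dict(values, **{name: x + h})
        lo = dict(values, **{name: x - h})
        sensitivity = (model(**hi) - model(**lo)) / (2 * h)
        u_squared += (sensitivity * u_x) ** 2
    return y, math.sqrt(u_squared)

# Hypothetical model: concentration of a standard solution,
# c = m * P / V (mass m in mg, purity P, volume V in L).
model = lambda m, P, V: m * P / V
values = {"m": 100.5, "P": 0.999, "V": 0.1000}
uncertainties = {"m": 0.05, "P": 0.001, "V": 0.0002}

c, u_c = uncertainty_budget(model, values, uncertainties)
print(f"c = {c:.1f} mg/L, u(c) = {u_c:.1f} mg/L")  # about 1004.0 and 2.3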
Quantifying all contributing uncertainty components as standard deviations provides a particularly simple and consistent method of calculation based on standard expressions for combining variances. It is justified in principle because, although an error in a particular case may be systematic, lack of knowledge about the size of the error leads to a probability distribution for the error, and this distribution can be treated in the same way as that of a random variable. Treating estimates of uncertainty from experiment, prior knowledge, and professional judgment in the same way, and giving them the same weight, ensures that all known factors contributing to uncertainty are accounted for, even when experimental determination is not possible.

In principle, this approach overcomes many of the deficiencies in currently used approaches. It is much quicker and less costly to apply than a collaborative trial, but it can use collaborative trial data advantageously if available. The approach covers all the effects on a result, systematic or random, and it takes into account all available knowledge. In addition, it mandates a particular form of expression, leading to improved comparability in uncertainty estimates.

However, disadvantages exist.
The ISO approach, because it requires appropriate judgment, cannot be entirely objective; to some extent it relies on the experience of the analyst. A significant cost in time and effort is also a factor; estimating uncertainties on the basis of local conditions without using published data involves more effort than simply quoting a published reproducibility figure. The lack of objectivity can be compensated for by third-party review, such as quality system assessment, interlaboratory comparisons, in-house QC sample results, and certified reference material checks. It should also be clear that a decision to exclude a particular contribution entirely, rather than make some judgment of its size, represents a de facto decision to allocate the contribution a size of zero, hardly an improvement. Cost, too, may be recouped in direct or indirect benefits. Uncertainty estimation improves knowledge of analytical techniques and principles, forming a powerful adjunct to training. Knowing the main contributions to uncertainty shows where method improvement effort is best directed, and efficiency can be improved with minimal impact on method performance. Finally, normal QA procedures, such as checking the method before use and maintaining records of calibration and statistical QC, should provide all the required data; the additional cost should be no more than that of combining the data appropriately.

Sources of uncertainty
Many factors affect analytical results, and every one is a potential source of uncertainty. In sampling, effects such as random variations between different samples and any potential for bias in the sampling procedure are components of uncertainty affecting the final result. Recovery of an analyte from a complex matrix, or an instrument response, may be affected by other constituents of the matrix, and analyte speciation may further compound this effect. When a spike is used to estimate recovery, the recovery of the analyte from the sample may differ from the recovery of the spike. Stability effects are also important but frequently are not well known. Cross-contamination between samples and contamination from the laboratory environment are ever-present risks.
Though ISO does not include accidental gross cross-contamination in its definition of uncertainty, as it represents loss of control of the measurement process, the possibility of background contamination should nonetheless be considered and evaluated when appropriate.

Although instruments are regularly checked and calibrated, the limits of accuracy on the calibration constitute uncertainties. The calibration used may not accurately reflect the samples presented; for example, analytical balances are commonly calibrated using nickel check weights, although samples are rarely of such high density. Though not large in most circumstances, buoyancy effects therefore differ between calibration weight and sample. Other factors include carry-over and systematic truncation effects. The molarity of a volumetric solution is not exactly known, even if the parent material has been assayed, because some uncertainty relating to the assay procedure exists. A wide range of ambient conditions, notably temperature, affects analytical results. Reference materials are also subject to uncertainty; fortunately, most providers of reference materials now state the uncertainty in the manner recommended in the guide.

The uncritical use of computer software can also introduce errors. Selecting the appropriate calibration model is important, and software may not permit the best choice. Early truncation and rounding off can also lead to inaccuracies in the final result.

Operator effects may be significant; they can be evaluated either by predicting them or by conducting experiments involving many operators. The latter approach will not normally detect an overall operator bias (for example, a particular scale reading may be taken in the same manner by a group of similarly trained operators), but the scope of variation can be estimated. "Operator effect" could reasonably be considered a proxy for a range of poorly controlled input parameters, such as scale-reading accuracy, time and duration of agitation during extraction, and so on. It follows that a formal mathematical model of the experimental process would not normally include "operator" as an input factor, but only the specific factors under operator control.
Random effects contribute to uncertainty in all determinations, and this entry is usually included in the list as a matter of course. It is also frequently useful to think of every component of uncertainty as arising from both systematic and random effects; doing so avoids the most common trap for the unwary—overlooking systematic effects in the effort to obtain good precision measures. Both need to be taken into account, though the ISO approach requires only the overall value.

Determinands are not always completely defined. For example, volumes may or may not be defined with reference to a particular set of ambient conditions. Similarly, the determinand may be defined in terms of a range of conditions; for example, "material extracted from an acidified aqueous solution at pH below 3.0" allows substantial latitude. Such incomplete definitions result in the determinand itself having a range of values, irrespective of good analytical technique, and that range constitutes an uncertainty. Many common analytes, such as fat, moisture, ash, and protein, are defined not in terms of a particular molecular or atomic species but against some essentially arbitrary process. In effect, the result is simply a response to a stated procedure, expressed in the most convenient units. Such measurements are not generally compared with results from other methods; in effect, bias is neglected by convention. However, the procedure itself may lack full definition or permit a range of conditions, giving rise to uncertainties. Of course, if comparison with other methods is desired, additional sources of uncertainty, including method bias, must be taken into account.
Increasing confidence
The ISO guide suggests multiplying the standard uncertainty by a coverage factor k to express uncertainties when a high degree of confidence is desired. This representation exactly mirrors the situation in conventional statistics, in which a confidence interval is obtained by multiplying a standard deviation for a parameter by a factor derived from the Student t-distribution for the appropriate number of degrees of freedom. The formal approach in the guide requires estimation of a similar parameter, the "effective degrees of freedom", and uses this value in the same way. Though the details are beyond the scope of this article, some important points can be made. This parameter is almost invariably dominated by the number of degrees of freedom in the dominant contribution to the overall uncertainty. When the dominant contribution arises from sound and well-researched information, the effective degrees of freedom remain high, normally leading to k = 2 for near 95% confidence. Only where large uncertainty contributions are based on meager data will the choice of k become significant. A pragmatic approach, therefore, is simply to adopt k = 2 for routine work and k = 3 when a particularly high confidence is required (2).

The question of possible distributions must also be considered when deciding coverage factors. Although the guide uses a combination of standard deviations based on established error-propagation theory, the step from standard deviation to confidence involves some assumptions. The guide takes the view that, in most circumstances, the central limit theorem will apply and the appropriate distribution will be approximately normal.
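For readers who want to see the mechanics alluded to above, the sketch below estimates the effective degrees of freedom by the Welch-Satterthwaite formula and derives k from the Student t-distribution (here using SciPy). The component uncertainties and degrees of freedom are invented purely for illustration.

import math
from scipy import stats

def coverage_factor(u_components, dof_components, confidence=0.95):
    # Welch-Satterthwaite estimate of effective degrees of freedom,
    # followed by the corresponding Student-t coverage factor.
    u_c2 = sum(u * u for u in u_components)
    nu_eff = u_c2 ** 2 / sum(u ** 4 / nu
                             for u, nu in zip(u_components, dof_components))
    k = stats.t.ppf(0.5 + confidence / 2, nu_eff)
    return nu_eff, k

# Hypothetical budget: a well-characterized dominant component (60 dof)
# and a minor component based on only a few observations (4 dof).
nu_eff, k = coverage_factor(u_components=[0.10, 0.03],
                            dof_components=[60, 4])
print(f"effective dof = {nu_eff:.0f}, k = {k:.2f}")  # k comes out close to 2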
Table 1. Contributions to uncertainty in dioxin analysis.

Parameter              u (RSD)   Main contribution (a)
Qss                    0.02      Syringe specification; certified reference solution uncertainty
V                      0.02      Density (volume determined by weight)
Ak and Ass             0.09      Permitted abundance ratio range
RRFn                   0.08      Range permitted by method
Rspk                   0.12      Permitted range of spike recovery (c)
Combined uncertainty   0.17      [(0.02)² + (0.02)² + (0.09)² + (0.08)² + (0.12)²]^(1/2)

(a) Contributions are listed if they contribute more than 10% of the stated uncertainty. (b) Permitted ranges are treated as limits of rectangular distributions and adjusted to SD values (1) by dividing by the square root of 3. (c) Recovery of added material is not, in general, fully representative of recovery of analyte materials.
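To make the arithmetic behind Table 1 explicit, the short sketch below recomputes the combined relative standard uncertainty from the tabulated components and shows the rectangular-distribution conversion described in footnote b. Only the tabulated RSD values come from the table; the half-width in the final line is a made-up illustration.

import math

def rectangular_to_sd(half_width):
    # Footnote b: a permitted range of +/- half_width, treated as a
    # rectangular distribution, has standard deviation half_width / sqrt(3).
    return half_width / math.sqrt(3)

# Relative standard uncertainties (RSDs) from Table 1. Because the
# measurement equation is a simple product/quotient, relative SDs
# combine directly by root sum of squares.
components = {
    "Qss": 0.02,      # amount of spike
    "V": 0.02,        # sample volume
    "Ak/Ass": 0.09,   # peak-area (abundance ratio) contribution
    "RRFn": 0.08,     # response factor
    "Rspk": 0.12,     # spike recovery
}
u_rel = math.sqrt(sum(u * u for u in components.values()))
print(f"combined relative uncertainty = {u_rel:.2f}")  # 0.17, as in Table 1

# Footnote-b conversion for a hypothetical permitted range of +/-0.05
print(f"SD for a +/-0.05 rectangular range = {rectangular_to_sd(0.05):.3f}")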
Figure 3. Uncertainty and compliance. (a) Result above limit plus uncertainty; (b) result above limit, but limit within uncertainty; (c) result below limit, but limit within uncertainty; (d) result below limit minus uncertainty.
Certainly, it is rare to calculate confidence intervals based on other distributions in general analytical chemistry, if only because it is unusual to have sufficient data to justify other assumptions. Nonetheless, when additional knowledge about the underlying distributions exists, it is most sensible to base k on the best available information.

Dioxin example
The analysis of dioxins in the effluent of paper and pulp mills by isotope dilution MS (Figure 2) is a good example (9). For the sake of discussion, we will consider only the analysis of 2,3,7,8-tetrachlorodibenzodioxin (2,3,7,8-TCDD) and will ignore the normally important uncertainties caused by interference from other TCDD isomers, GC integration, and resolution difficulties. By way of illustration, some minor contributions that would not normally be included will also be examined. The basic equation for determining the concentration Cx of TCDD is

Cx = Ak Qss / (Ass RRFn V Rspk)

in which Ak is the peak area of the analyte, Qss is the amount of spike, Ass is the peak area of the standard, RRFn is the relative response factor for the relevant ion of the ¹³C₁₂ standard, V is the original sample volume, and Rspk is the (nominal) recovery of the analyte relative to added material.

Rspk merits explanation, because it is not used in the standard. Because the ¹³C₁₂ calibration spike is added to the slurry and is not naturally part of the sample, differential behavior is possible. If measurable, this behavior would appear as imperfect recovery of analyte. A complete mathematical model of the system therefore requires some representation of the effect. Because no existing parameter in the equation is directly influenced by recovery, the recovery term has been added in the form of a nominal correction factor. The result is a basic equation encompassing all the main effects on the result.

Identification of the remaining contributions to the overall uncertainty is best achieved by considering the parameters in the equation, any intermediate measurements from which they are derived, and any effects that arise from particular operations within the method (such as the possibility of "spike partitioning"). Table 1 lists the parameters, their calculated uncertainties (as relative standard deviations), and some contributory factors.

The information in Table 1 shows that the uncertainties associated with the physical measurements of volume and mass contribute essentially nothing to the combined uncertainty and that any further study should be directed primarily at the remaining components. The largest contribution arises from the extraction recovery step, in line with most analysts' experience.

The method studied here is unusual in specifying direct control of all the major factors affecting uncertainty, which makes it relatively easy to estimate uncertainty as long as the method is operating within control. For most methods currently in use, however, such control limits are not closely specified. Typically, one or two critical parameters are given single target values, and precision control limits are set. It then falls to the laboratory to estimate the contribution of its own level of control to the uncertainty, rather than simply demonstrating compliance with an established set of figures and an associated, carefully studied, uncertainty estimate.

Legislation and compliance

Two issues are important when uncertainty is considered in the context of legislation and enforcement. The first concerns the simpler problem of whether a result constitutes evidence of noncompliance with some limit, particularly when the limit is within the uncertainty quoted. The second issue is the use of uncertainty information in setting limits.

Two instances in compliance are clear-cut: Either the result is above the upper limit, including its uncertainty, which means that the result is in noncompliance (Figure 3a); or the result, including its uncertainty, is between the upper and lower limits, and is therefore in compliance (Figure 3d). For any other case, some interpretation is necessary and can be made only with knowledge and understanding of the end use of the information.

For example, Figure 3b represents probable noncompliance with the limit, but noncompliance is not demonstrated beyond reasonable doubt. In the case of legislation, the precise wording needs to be consulted; some legislation requires that, for example, process operators demonstrate that they are complying with a limit. In such a case, Figure 3b represents noncompliance with the legislation; compliance has not been demonstrated beyond doubt.

Similarly, if legislation requires clear evidence of noncompliance with a limit before enforcement is triggered, then although there is no clear demonstration of compliance, there is insufficient evidence of noncompliance to support action, as in Figure 3c. In these situations, end-users and legislators must spell out how the situation should be handled.
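The decision logic of Figure 3 can be expressed compactly in code. The sketch below is only an illustration of the four cases; the function name and the numerical values are our own, and, as the text emphasizes, how the two intermediate cases are acted on depends on the wording of the legislation.

def assess_compliance(result, expanded_uncertainty, upper_limit, lower_limit=None):
    # Classify a result against a limit in the spirit of Figure 3.
    # Returns one of: noncompliant (3a), above limit but limit within
    # uncertainty (3b), below limit but limit within uncertainty (3c),
    # or compliant (3d).
    low = result - expanded_uncertainty
    high = result + expanded_uncertainty
    if low > upper_limit:
        return "noncompliant (3a)"
    if lower_limit is not None and high < lower_limit:
        return "noncompliant (below lower limit)"
    within = high <= upper_limit and (lower_limit is None or low >= lower_limit)
    if within:
        return "compliant (3d)"
    return ("above limit, limit within uncertainty (3b)"
            if result > upper_limit
            else "below limit, limit within uncertainty (3c)")

# Hypothetical example: upper limit of 10 units, expanded uncertainty U = 1.5
for r in (12.0, 10.8, 9.4, 8.0):
    print(r, assess_compliance(r, 1.5, upper_limit=10.0))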
A recent editorial (10) pointed out the need to avoid setting limits that cannot be enforced without disproportionate effort. An important factor to consider is the actual uncertainty involved in determining a level of analyte; if legislation is to be effective, the uncertainty must be small in relation to any limiting range. In chemistry, measurement requirements tend to follow a "best available" presumption, even when this policy is not actually written into legislation; an example is the Delaney Clause (11). As the state of the art improves, new measurements become possible and are immediately applied, leading to a situation in which the best available technology is the only acceptable technology. In such a situation, uncertainties are, inevitably, hard to quantify well; they will often be larger than required for the purpose. It is particularly important that legislation take into account the full uncertainty; failure to include significant components may unreasonably restrict enforcement. In particular, it is vital that the possibility of systematic effects be considered; legislation based on measurement of absolute amounts of substance, as in most new European environmental legislation, must consider the full range of methods and sample matrices that may fall within that legislation. Another important consideration is the interpretation of results and their relevant uncertainties against limits. Assumptions about the handling of experimental uncertainty in interpretation for enforcement purposes must be clearly stated in the legislation. Specifically, do limits allow for an experimental uncertainty or not? If so, how large is the allowance? A fundamental factor is how well legislators understand uncertainty. The need to set limits in some contexts, such as how much of a toxic compound is acceptable in an environmental matrix, is easily understood. However, judging compliance is trickier, and a better understanding of analytical uncertainty is required. The current move toward specifying method performance parameters, such as repeatability, reproducibility, and recovery, rather than the method itself, is a step in the right direction, but these parameters do not necessarily cover all of the significant components of
uncertainty. What is required is the additional specification of the measurement uncertainty to meet the needs of the legislation.
Ellison's work was supported under contract with the Department of Trade and Industry as part of the National Measurement System Valid Analytical Measurement Programme.
References
(1) Guide to the Expression of Uncertainty in Measurement; ISO: Geneva, 1993; ISBN 92-67-10188-9.
(2) Quantifying Uncertainty in Analytical Measurement; published on behalf of EURACHEM by the Laboratory of the Government Chemist: London, 1995; ISBN 0-948926-08-2.
(3) Taylor, B. N.; Kuyatt, C. E. Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results; NIST Technical Note 1297; National Institute of Standards and Technology: Gaithersburg, MD, 1994.
(4) International Vocabulary of Basic and General Terms in Metrology; ISO: Geneva, 1993; ISBN 92-67-10175-1.
(5) ISO 5725:1986, Precision of Test Methods: Determination of Repeatability and Reproducibility for a Standard Method by Interlaboratory Tests; ISO: Geneva, 1987.
(6) Youden, W. J.; Steiner, E. H. Statistical Manual of the Association of Official Analytical Chemists; AOAC: Washington, DC, 1982.
(7) Thorpe, C. W. J. Assoc. Off. Anal. Chem. 1969, 52, 778-81.
(8) Lognay, G. C.; Pearse, J.; Pocklington, D.; Boenke, A.; Schurer, B.; Wagstaffe, P. J. Analyst 1995, 120, 1831-35.
(9) Report EPS 1/RM/19; Environment Canada: Ottawa, Ontario, 1992.
(10) Thompson, M. Analyst 1995, 120, 117N-118N.
(11) Delaney Clause: Federal Food, Drug, and Cosmetic Act; Food Additives Amendment, 1958.

Steve Ellison is head of the analytical quality and chemometrics section at the Laboratory of the Government Chemist (U.K.). His research interests include statistics, validation and measurement uncertainty, and chemometrics in the contexts of regulatory analysis and analytical chemistry. Wolfhard Wegscheider is professor of chemistry and dean of graduate studies at the University of Leoben (Austria) and chair of EURACHEM Austria. Alex Williams is chair of the EURACHEM Working Group on Measurement Uncertainty. Address correspondence about this article to Wegscheider at the Institute of General and Analytical Chemistry, University of Leoben, A-8700 Leoben, Austria (wegschei@unileoben.ac.at).