Comment on "An Alternative Minimum Level Definition for Analytical Quantification"

SIR: The paper by Gibbons et al. (1) proposing an Alternative Minimum Level (AML) as a substitute for the existing statistical procedures for determining reporting limits is certainly welcome, as the numerous shortcomings of the existing procedures are well known. The authors are correct that statistical procedures based solely on standard deviations produce a wide range of reporting limits depending on the concentration of the analyte used to determine the reporting limit (2). The development of a statistical method that produces the same reporting limit regardless of the spiking concentration is obviously important. However, the authors only briefly touch upon one of the main practical problems from which all statistically based reporting limits suffer: they ignore the data quality objectives (DQOs) of the data user. In particular, statistically based methods do not incorporate any measure of accuracy into the reporting limit (3, 4).

To illustrate this point, the results presented in Table 1 of Gibbons et al. (1) were re-analyzed in terms of percent bias (% bias) and percent relative standard deviation (% RSD). % bias is defined here as the absolute difference between the spiked concentration and the measured mean concentration, divided by the spiked concentration × 100. % RSD was defined in two ways: one, following Gibbons et al. (1), is the standard deviation divided by the spiked concentration × 100 (% RSD-spiked); the other is the standard deviation divided by the measured mean concentration × 100 (% RSD-mean), which is how it would be calculated if the spiked (true) value were not known. These three measures were plotted against the spiked concentration and are presented in Figure 1, which shows that % bias and % RSD-mean increase dramatically at lower concentrations.
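The three measures just defined can be sketched in a short Python function. The replicate values in the usage line are hypothetical numbers chosen to give round results, not the actual Table 1 data of Gibbons et al.:

```python
import statistics

def bias_and_rsd(spiked, replicates):
    """Return (% bias, % RSD-spiked, % RSD-mean) for one spike level.

    % bias:       |spiked - measured mean| / spiked * 100
    % RSD-spiked: SD / spiked concentration * 100
    % RSD-mean:   SD / measured mean * 100 (the form a laboratory
                  must use when the true value is unknown)
    """
    mean = statistics.mean(replicates)
    sd = statistics.stdev(replicates)  # sample standard deviation
    return (abs(spiked - mean) / spiked * 100,
            sd / spiked * 100,
            sd / mean * 100)

# Hypothetical replicates at a 50 ng/L spike (not the Table 1 data):
print(bias_and_rsd(50.0, [1.0, 2.0, 3.0]))  # -> (96.0, 2.0, 50.0)
```

Note how the two % RSD forms diverge when the measured mean is far below the spiked value: the denominator shrinks, so % RSD-mean balloons exactly as Figure 1 shows at low concentrations.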
This indicates that, at lower concentrations, results are less accurate and less precise than at higher concentrations. Suppose that a laboratory using the same analytical method as Gibbons et al. (1) were to analyze the same samples, which had a true value of 50 ng/L, but did not know the true value. Assuming this laboratory obtained exactly the same results as the authors did, the mean measured value would be 2 ng/L, a % bias of 96 with a % RSD-mean of 286 (since the laboratory does not know the true value, this is how % RSD would be calculated). The question here is: are results with such a large % bias and % RSD acceptable? That depends entirely on the data quality objectives of the data user. If, for example, the data user wants to know whether this sample exceeds the Maximum Contaminant Level (MCL) set by the U.S. EPA for drinking water, which is 1000 ng/L, then this is an acceptable result, as it accurately reflects the fact that the amount of cadmium in the sample is far below the MCL and that no violation of Federal regulations has taken place. That the result was negatively biased and imprecise is irrelevant; any one of the seven individual replicates alone would produce an acceptable answer, as would any result within 3 SD of the mean measured value. The mean could have had an equally large positive bias (98 ng/L) and been just as imprecise (standard deviation = 149) and still be acceptable, as it would remain well below the MCL.
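The point that acceptability depends on the DQO rather than on the statistics alone can be sketched as a simple decision rule. This is a hypothetical screen for illustration, not an EPA procedure:

```python
def decisively_below(mean, sd, limit, k=3.0):
    """Hypothetical DQO screen: is the measured mean below the
    regulatory limit even after allowing a k-standard-deviation
    margin for imprecision?"""
    return mean + k * sd < limit

# The letter's numbers: mean 2 ng/L with % RSD-mean of 286
# (SD = 2.86 * 2 = 5.72 ng/L), against the 1000 ng/L MCL:
print(decisively_below(2.0, 5.72, 1000.0))    # True: DQO met despite bias
print(decisively_below(98.0, 149.0, 1000.0))  # True: positive-bias case
print(decisively_below(2.0, 5.72, 5.0))       # False: a 5 ng/L decision
                                              # point cannot be resolved
```

The same measurement passes or fails purely as a function of where the decision point sits relative to the measurement's accuracy and precision, which is the letter's central argument.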

S0013-936X(97)00691-3 CCC: $14.00 © 1997 American Chemical Society

FIGURE 1. ICP-MS results for cadmium at mass 114 in reagent-grade water.

On the other hand, if the data user were performing a risk assessment, and an important health effect were expected to occur at a cadmium concentration of 50 ng/L but not at 5 ng/L, then this level of accuracy and precision would be unacceptable. This level of quality control would be equally unacceptable if the MCL were 25 ng/L. Which statistical tests are passed or failed, or what level of theoretical rigor is applied, seems unimportant compared to the question of whether the data meet the accuracy and precision needs of the data user. A truly alternative approach is to ask what is the smallest quantity of an analyte that meets the data quality objectives of the data user (for a fuller discussion of this perspective, see refs 2-6).

The statistical approach advocated by Gibbons et al. (1) is by far the most commonly used. It is the basis for the U.S. EPA's Method Detection Limit (7), Reliable Detection Limit (8), and Minimum Level (9), as well as numerous other procedures. In sharp contrast, the U.S. EPA's Office of Drinking Water (ODW) has adopted a radically different approach. As part of the Information Collection Rule (ICR), the ODW has created a Minimum Reporting Level (MRL) (10). This is the lowest concentration that meets the data quality objectives of the ICR. For a laboratory to demonstrate that it can quantify accurately at the MRL, its analyst must spike the analyte at the MRL concentration into drinking water and recover it within a fixed percentage of the spiked concentration (either ±25% or ±50%). Thus, any result greater than the MRL generated by these methods can be expected to have equal or better accuracy and precision, as the results in Figure 1 and other similar studies show (2-6). Any concentration less than the MRL will not have this level of accuracy and thus is not reported.
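The MRL demonstration described above amounts to a recovery-window check, which can be sketched as follows. This is a hypothetical helper for illustration; the actual ICR acceptance criteria are specified in the methods manual (ref 10):

```python
def passes_mrl_demo(spiked, measured, window_pct=25.0):
    """Does a spiked-recovery result fall within +/- window_pct of the
    spiked (MRL) concentration? A sketch of the ICR-style
    demonstration, not the official procedure."""
    recovery = measured / spiked * 100.0
    return (100.0 - window_pct) <= recovery <= (100.0 + window_pct)

# Spike at a 50 ng/L MRL with a +/-25% acceptance window:
print(passes_mrl_demo(50.0, 44.0))  # True: 88% recovery is in window
print(passes_mrl_demo(50.0, 2.0))   # False: 4% recovery fails
```

Because the criterion is a recovery percentage at a fixed concentration, it ties the reporting level directly to demonstrated accuracy, unlike a limit derived from standard deviations alone.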
This is the sort of user-based quantitation limit, incorporating accuracy, that is needed for the evaluation of environmental analyses.

Literature Cited

(1) Gibbons, R. D.; Coleman, D. E.; Maddalone, R. F. Environ. Sci. Technol. 1997, 31, 2071-2077.
(2) Kimbrough, D. E.; Wakakuwa, J. R. Environ. Sci. Technol. 1993, 27, 2692-2699.

VOL. 31, NO. 12, 1997 / ENVIRONMENTAL SCIENCE & TECHNOLOGY


(3) Kimbrough, D. E.; Wakakuwa, J. R. Environ. Sci. Technol. 1994, 28, 338-345.
(4) Kimbrough, D. E.; Suffet, I. H.; Hertz, C. D. Rethinking Reporting Limits for Regulatory Purposes. Proceedings of the Water Quality & Technology Conference, Vol. 2; Boston, MA, 1996; Paper ST-2.
(5) Yohe, T. L.; Hertz, C. D. Importance of PQLs in the Development of MCLs: A Water Utility Perspective. Proceedings of the Water Quality & Technology Conference, Orlando, FL, 1991.
(6) Hertz, C. D.; Brodovsky, J.; Marrollo, L.; Harper, R. E. Minimum Reporting Levels Based on Precision and Accuracy for Inorganic Parameters in Water. Proceedings of the Water Quality & Technology Conference, Toronto, ON, Canada, 1992.
(7) Glaser, J. A.; Foerst, D. L.; McKee, G. D.; Quave, S. A.; Budde, W. L. Environ. Sci. Technol. 1981, 15, 1426.
(8) Keith, L. H.; Lewis, D. L. Revised Concepts for Reporting Data Near Method Detection Levels. Proceedings of the 203rd Meeting of the American Chemical Society, Committee on Environmental Improvement, San Francisco, CA, June 1992.


(9) U.S. EPA. Guidance on Evaluation, Resolution, and Documentation of Analytical Problems Associated with Compliance Monitoring; EPA 821-B-93-001; United States Printing Office: Washington, DC, June 1993.
(10) U.S. EPA. DBP/ICR Analytical Methods Manual, Section 7; EPA 814-B-96-002; Office of Water, United States Printing Office: Washington, DC, April 1996.

David E. Kimbrough
Castaic Lake Water Agency
27234 Bouquet Canyon Road
Santa Clarita, California 91350

ES9706918