ES&T FEATURES

RISK-BASED DECISION MAKING

AN INTEGRATED APPROACH FOR EFFICIENT SITE CLEANUP

STANLEY BLACKER
MAC Technical Services, Germantown, MD 20874

DANIEL GOODMAN
Montana State University, Bozeman, MT 59717

Two counterproductive themes run through the history of Superfund and Resource Conservation and Recovery Act cleanups: a tendency toward overinvestment in remediation actions (when regulators have directed the cleanup) and a tendency toward costly litigation, which often does nothing but delay cleanup (when responsible, i.e., regulated, parties have attempted to assert themselves). In both situations, the constituencies paying the bills do not get their money's worth, because the investments are not efficiently directed at reducing risk in a timely manner.

These problems are no secret, and federal and state regulators are taking a new look at how cleanups should be managed. Regulators have begun to emphasize real risk reduction, both as a standard and as a frame of reference for ranking priorities, and the agencies have begun to explore ways to streamline the procedures of regulator-driven cleanup. Employing risk assessment to determine priorities for what should be cleaned up, when it should be cleaned up, and how much it should be cleaned up was discussed extensively in a recent National Research Council study of the Department of Energy's cleanup liabilities (1).

Two recent developments at EPA, the Superfund Accelerated Cleanup Model (SACM) (2) and the guidelines for setting data quality objectives (DQO) (3), offer real promise for using risk standards to clean up sites efficiently. The SACM primarily reflects the regulators' perspective; it relaxes some previous constraints on managing a cleanup, allowing more flexible and efficient cleanups while still meeting consistent standards. The DQO guidance is preeminently a set of management and technical tools for articulating technical objectives, facilitating communication among the diverse players in complex projects, and optimizing the collection and use of data in environmental decision making. In our experience, the techniques available in the DQO portfolio capitalize on the opportunities for efficient, flexible cleanups offered by the SACM.

The SACM and the DQO guidance were developed and first applied in public sector settings where the regulator conducted the cleanup (e.g., Part 2 of this paper) or where the regulated party was another public sector agency (4). The private sector needs to be alerted to the opportunities that these developments offer for reducing the costs of a cleanup action.



This paper briefly describes some key elements of the SACM and DQO logic. We will show how four aspects of the DQO methods lend themselves to the streamlining and efficiency that is latent in the SACM initiative. Applying these DQO methods should save significant amounts of time and money in cleanups, while still achieving the risk-based cleanup standards expected by the public and regulators.

The Superfund Accelerated Cleanup Model

The SACM, an initiative of EPA's Office of Emergency and Remedial Response, is a major rethinking of how to manage the cleanup process (5). The model emphasizes early identification of the real cause of regulator and public concern at each site. The SACM standardizes the level of residual risk that is the goal of the cleanup. In addition, the SACM streamlines the site characterization process by focusing data collection on the priority risks rather than pursuing characterization and risk assessment laundry lists that involve implausibly remote scenarios or contaminants that are not problems at the site. Moreover, the SACM facilitates remedy selection by drawing on cumulative experience with effective remedial technologies (presumptive remedies). Finally, the SACM encourages early action by helping the stakeholders reach consensus and by drawing on the available menu of accepted risk standards, presumptive remedies, and focused site-specific risk assessments and site characterization.

Data Quality Objectives

The important early steps for streamlining the cleanup process at a particular site are, first, to create a short list of critical conditions that make the site a genuine problem, and second, to agree on the data required to determine where cleanup is needed (and, by extension, where cleanup is not needed) and where and when the cleanup is complete. Both of these steps are facilitated and enhanced by the methods of DQOs.

DQO procedures were developed by EPA's Quality Assurance Management Staff (QAMS) to provide general assistance for planning data collection programs. The emphasis in the guidance is on collecting the right amount and kind of data to answer the program's needs (3). The DQO methods can help the stakeholders in a regulator-driven remediation sort through their initial myriad concerns to reach consensus on a focused statement of the problem. The participants work through a risk-based decision logic to arrive at cleanup objectives that are fixed and clear. A variety of alternative remedial strategies can be examined systematically to determine the expected cost of each and whether it can be expected to meet the cleanup objectives. Alternative strategies for using the data to guide the remediation are also evaluated. Efficiencies are achieved by selecting the least costly design that meets the cleanup objectives.

Four Key Elements in the Logic of Data Quality Objectives

1. Separation between policy calls about "risk-goals" and technical discussion of alternatives for implementation.
2. Translation of the regulatory policy calls, including the difficult definition of tolerances for "uncertainty," into specific, concrete, measurable cleanup criteria.
3. Documentation of agreements reached, among all stakeholders, on the critical requirements for the cleanup.
4. Use of the full arsenal of technical optimization methods to develop the most efficient design for the remediation.

Four key elements in the logic of DQOs

In applying DQO methods to regulator-driven cleanup, four elements are essential to reduce the time and cost of cleanup (see the box, "Four Key Elements in the Logic of Data Quality Objectives"). The remainder of this paper discusses these four elements; the companion article that follows describes the application of this approach to a particular Superfund site as an illustration of the achieved cost savings and the minimization of controversy.

Separating risk-goal policy from negotiable technical factors. The central concern about any site cleanup is the risk posed by the site.

If that risk can be reduced to an acceptable level, then the regulators and the public should be satisfied. Similarly, a responsible party operating in good faith should be willing to invest in cleaning up the site to reduce the risk to an acceptable level. From the perspective of the party paying the bills, the strategy should be to invest as efficiently as possible in reducing the risk to an acceptable level, and not to overinvest in activities that do not effectively reduce risk.

A statement of what constitutes an acceptable level of risk determines the "risk-goal" for the cleanup. Risk-goals are statements of policy that, within the limitations set by programmatic policy, are the prerogative of the regulators responsible for the cleanup. Because they are policy calls, they are not natural topics for negotiation or technical review. By contrast, technical alternatives for implementation should be inherently negotiable as long as these alternatives reach the goal set by policy.

Notwithstanding the broad authority of the regulator at the site, there are a number of general features of the risk-goal that will be similar across sites. When the concern is with human health, as it often is, there is now a degree of agency consensus on appropriate target levels for nominal risk. A reasonable quantitative range of risk values has been established between unacceptably high risks and acceptable risks. Superfund guidance (6) recommends that the site-specific goal for carcinogens should be a residual risk in the range of 10⁻⁴ to 10⁻⁶ lifetime risk of cancer, assuming a plausible worst case exposure scenario. This type of exposure scenario is constructed by evaluating the contaminants present at the site, their exposure routes and exposure levels, transport mechanisms, transformation mechanisms, and future site use. When a quantitative interpretation is sought for "worst case," it is often the most severe 2.5% tail of the distribution of plausible outcomes (often computed as an upper 95% confidence limit). The risk associated with a given scenario is computed from nominal dose-response curves.

Many of the nominal dose-response curves in common use are subject to serious uncertainty. These curves are often derived from extrapolations of data outside the concentration ranges of concern, and they may be based on animal rather than human data. Furthermore, exposure estimates necessarily involve assumptions whose validity is uncertain. Nevertheless, reasonable conventions have been formulated to standardize the practice of the risk calculations despite the uncertainties. At the level of the individual site, it is generally best to accept these conventions. As a rule, both sides lose in litigation over the metaphysical aspects of establishing acceptable risk levels or choosing among assumptions that cannot be decided by available data. When there is no scientific consensus on a particular assumption or goal, the courts cannot be expected to settle the matter in any more reasonable fashion. Litigation in such a situation is costly, and the results are unpredictable.

In designing the cleanup for a site, it usually makes sense to accept the existing program-wide policy conventions. Site-specific technical negotiation and factual investigation should be restricted to those aspects of the cleanup for which new facts and discussion are likely to pay real dividends. Flexibility can be negotiated in those aspects of the cleanup that are not policy matters. Wise use of this flexibility creates opportunities for savings. A hard look at factual relationships among such factors as waste form, transport mechanisms, site use, and exposure routes can lead to a more efficient cleanup strategy. Examples of tactical considerations are given in the box, "Some Tactical Considerations that Promote Flexibility and Opportunities for Efficiency in Site Cleanups."
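To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of point-estimate calculation described above: a chronic daily intake computed from an exposure scenario, multiplied by a dose-response slope factor to give a nominal lifetime cancer risk that can be compared against the 10⁻⁴ to 10⁻⁶ target range. The scenario parameters and slope factor are hypothetical illustrations, not values from this article or from any site.

```python
# A minimal sketch (our illustration, not from the article) of the standard
# point-estimate risk calculation: chronic daily intake (CDI) from an exposure
# scenario, times a dose-response slope factor, gives nominal lifetime cancer
# risk. All parameter values are hypothetical, not site data.

def chronic_daily_intake(conc_mg_per_L, intake_L_per_day, freq_days_per_yr,
                         duration_yr, body_weight_kg, averaging_time_days):
    """Chronic daily intake (mg/kg-day) for a drinking-water exposure route."""
    return (conc_mg_per_L * intake_L_per_day * freq_days_per_yr * duration_yr) \
        / (body_weight_kg * averaging_time_days)

cdi = chronic_daily_intake(
    conc_mg_per_L=0.005,           # e.g., upper 95% confidence limit on the mean
    intake_L_per_day=2.0,          # conventional adult ingestion rate
    freq_days_per_yr=350,
    duration_yr=30,
    body_weight_kg=70.0,
    averaging_time_days=70 * 365,  # lifetime averaging for carcinogens
)

slope_factor = 0.1                 # hypothetical slope factor, (mg/kg-day)^-1
risk = cdi * slope_factor
print(f"lifetime excess cancer risk = {risk:.1e}")  # ~5.9e-06
print(1e-6 <= risk <= 1e-4)        # inside the 10^-4 to 10^-6 target range?
```

Every input in such a calculation embodies a convention (ingestion rate, body weight, averaging time); accepting the standard conventions, as argued above, keeps the negotiation focused on the site-specific inputs that data can actually resolve.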

Translation of regulatory policy decisions into measurable criteria. The statement of the acceptable risk for a particular site is more than a statement of philosophy. It serves as the basis for determining how much cleanup is required. Going from a statement of acceptable risk to a concrete statement of measurable criteria that will guide remediation at a site is accomplished by translating the statement of acceptable risk to a dose standard, translating the dose standard to a concentration standard, and translating the concentration standard to a criterion expressed in terms of field measurements. A description of the implementation of these translations is given in the box, "Translating Regulatory Policy Decisions into Measurable Criteria."

These translation processes are at the heart of obtaining concrete, unambiguous statements of criteria that fully address all the essential policy concerns for the cleanup, but that are no more stringent than necessary. The clarity and concreteness will be important when these criteria are used as the technical specifications the cleanup design must meet. In particular, explicit quantification of the requirements for certainty in the cleanup allows the use of decision theory. As discussed below (under "Optimization of the design"), this allows rigorous application of sophisticated optimization techniques, such as those of operations research, to structure the search for an efficient remedial design.


Documentation of agreements reached on critical requirements. Reaching a consensus on the concentration standard and associated error tolerance marks an important transition in establishing the cleanup requirements. At this point, all the policy issues are settled and documented in a clear paper trail that is endorsed by the participants (to preclude later misunderstandings). Once documentation of the policy judgments, and their rationale, is complete, the remaining aspects of the cleanup are technical. Then discussion and analysis can focus on selecting efficient means for achieving the desired risk-based cleanup levels.

Because success in a regulator-mandated cleanup is so dependent on consensus, it is important throughout the course of the DQO effort to document carefully the sequence of discussions from which the specific cleanup requirements (and, hence, the cleanup implementation plan) evolve. The paper trail maintains continuity in the intervals between the face-to-face meetings; the incorporation of episodic sign-off steps in the record preserves the agreements that have been reached. Finally, capturing the logic and the rationale for the agreements can be invaluable if questions arise later during the course of the cleanup.

Optimization of the design. When the cleanup goal is expressed in terms of a concentration standard and an associated error tolerance for estimates of the concentration relative to the standard, the regulators have effectively made a contractual statement. They are committed to accepting any implementation plan that achieves that concentration with that level of certainty. This leaves open many of the details of the implementation plan, including sampling design, analytical procedures, and remediation activities. With this level of flexibility in the details for meeting the concentration standard and the certainty level, there can be many remediation implementation plans that should be acceptable to the regulators. Plans that meet the established requirements will vary in cost. A systematic search for efficient plans can lead to substantial savings. The methods of operations research offer many techniques (such as linear and nonlinear programming, dynamic programming, and control theory) for finding efficient solutions, once the goals and constraints of the remediation are sufficiently defined. The DQO methods facilitate definition of these goals and constraints, and the use of decision-theoretic formulations to state tolerances for uncertainty completes the translation of the cleanup requirements into a set of unambiguous technical specifications.
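As a concrete illustration of the operations research framing described above, the following sketch casts a simplified remedy selection as a linear program: choose nonnegative levels of effort for candidate technologies to minimize cost, subject to meeting required contaminant reductions in each site zone. The technology names, coefficients, and requirements are hypothetical; a real formulation would be derived from the documented concentration standard and error tolerance.

```python
# A minimal sketch (our construction, not the article's method) of remedy
# selection cast as a linear program: choose nonnegative effort levels for
# candidate technologies to minimize cost while meeting required contaminant
# reductions in each zone. All names and numbers are hypothetical.
from scipy.optimize import linprog

costs = [50.0, 20.0, 8.0]    # $k per unit effort: excavate, pump-and-treat, cap

# Mass reduction achieved per unit effort, by zone; linprog uses "<=" constraints,
# so the ">= required reduction" constraints are negated.
A_ub = [[-5.0, -2.0, -1.0],  # zone 1
        [-1.0, -4.0, -0.5]]  # zone 2
b_ub = [-30.0, -20.0]        # required reductions from the concentration standard

res = linprog(c=costs, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, res.fun)        # least-cost effort mix and its total cost
```

In a real application the constraint set would also encode the certainty requirement (e.g., the sample sizes needed to demonstrate attainment), so that characterization, remediation, and monitoring costs are optimized jointly.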

Some Tactical Considerations that Promote Flexibility and Opportunities for Efficiency in Site Cleanups

• If the risk is driven by long-term average exposure, then spatial or temporal "hot spots" of contamination may not be important except as they affect the overall average (unless the hot spots exceed a threshold where acute, rather than chronic, health effects become a concern); see the sketch after this box.
• If projected site use will limit certain exposure routes, then the cleanup requirements need not focus on the risk associated with those exposure routes, except to ensure that the control of site use will be effective and will be maintained for as long as the source of contamination remains.
• If future site use will lead to exposure only to the contaminants that escape the site boundary via a particular route, then containment measures or actions that intercept the contaminant at this particular route may be sufficient, provided that containment can be guaranteed for as long as required.
• If the effectiveness of containment or interdiction measures can be ensured for the duration of a remediation, this removes the urgency of eliminating the source, allowing slower but possibly more efficient remediation strategies.
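The first bullet can be made concrete with a minimal sketch under assumed (hypothetical) numbers: the exposure unit is judged by its spatial average against the chronic concentration standard, while individual subunits are screened against an acute threshold that overrides the averaging.

```python
# A minimal sketch (our illustration, hypothetical numbers) of the first bullet:
# when chronic risk is driven by long-term average exposure, judge the exposure
# unit by its spatial average, but screen subunits against an acute threshold.

subunit_ppm = [12.0, 8.0, 150.0, 5.0, 9.0, 11.0]  # measured concentrations
chronic_standard_ppm = 40.0  # concentration standard tied to the chronic risk-goal
acute_threshold_ppm = 500.0  # level at which acute effects override averaging

average = sum(subunit_ppm) / len(subunit_ppm)     # 32.5 ppm
meets_chronic = average <= chronic_standard_ppm   # True: the hot spot matters only
acute_hot_spots = [c for c in subunit_ppm if c > acute_threshold_ppm]  # none here

print(average, meets_chronic, acute_hot_spots)
```

Here the 150 ppm subunit would trigger remediation under a uniform not-to-exceed rule, but under the exposure-averaged risk-goal it requires no action as long as no acute threshold is crossed.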


Translating Regulatory Policy Decisions into Measurable Criteria

Translating the risk-goal to a dose standard. When the contaminant of concern, the exposure route, and the acceptable risk level have been identified, a conventional dose-response curve should be used. Such a curve will identify the dose believed to be associated with that level of risk, for that type of exposure, and for that contaminant. The modifier "conventional" acknowledges that a variety of assumptions are often built into the construction of a dose-response curve. In the interests of consensus, and in the absence of more conclusive scientific alternatives, the standard assumptions should be accepted, rather than attempting to wrestle with policy issues that are not likely to be resolved at the level of the individual site.

Translating the dose standard to a concentration standard. When the medium, exposure route, transport mechanisms, and mechanisms of decay and byproduct formation are taken into account, calculating the source concentration associated with the standard allowable dose is routine. This becomes the concentration standard for judging whether a cleanup has achieved the risk-goal. In the case of a containment strategy, the concentration standard applies only outside the containment structure. In the case of a removal strategy, the concentration standard applies to the portion of the site that is contaminated. The concentration standard must be expressed as an average over a particular volume of space (or area). For dynamic situations, such as groundwater plumes, the average also needs to reflect a window of time. The spatial and temporal basis of the averaging must be logically connected with the exposure scenario (e.g., the area used in averaging should be the same as the area that an individual may cover in accumulating a dose in the plausible worst case scenario). The nature of the dose calculation (e.g., How will the individual's exposure be accumulated over time? At what levels might there be a transition from chronic to acute effects?) must be documented. This dose calculation confirms that the concentration standard is consistent with the risk-goal, given the assumptions of the dose-response curve and the exposure scenario.

Translating the concentration standard to field measurements. The contaminant concentration standard is expressed in terms of the true concentration averaged over some interval of space (and time, if appropriate). This concentration standard could be used as the working standard in the field only if the true concentration in the field were known. However, we can know only estimates of the true concentration from measurements on samples. Samples are subsets of the spatial and time domain specified in the concentration standard. Further, the data come from measurements that are subject to analytical error. Because of the variation introduced by sampling and analytical error, it is unreasonable to expect an exact correspondence between the true value and the estimate. This discrepancy creates an uncertainty which is itself part of the risk posed by the site. For this reason, it is a policy call to define how much average discrepancy is tolerable (i.e., the error tolerance) between the measurable estimate and the unknown true value. One way to state the error tolerance is in terms of acceptable frequencies of false positive and false negative decisions about whether the cleanup goal has been reached.

For example, in determining whether a subunit of a contaminated site is above or below the threshold defined in the concentration standard, a false negative would arise when the estimate was below the threshold even though the true concentration was above the threshold. Thus, a stated 15% tolerance for such a false negative would convey that a sampling and measurement system is acceptable if it provides estimates that give rise to such false negatives no more than 15% of the time. The definition of error tolerance, arising as a statement of policy, will then have implications for the design and cost of data collection. The number of samples can be increased, or a more advanced analytical method used, to obtain a more accurate estimate at greater cost; or fewer samples can deliver a less accurate estimate at lower cost.
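The translation from a stated error tolerance to sampling cost can be made explicit with the standard normal-approximation sample-size formula used in DQO-style planning. The sketch below is our illustration, not text from the guidance; the standard deviation, gray-region width, and tolerances are hypothetical.

```python
# A minimal sketch (our illustration, not text from the guidance) of how stated
# false positive/negative tolerances drive sampling cost, via the standard
# normal-approximation sample-size formula used in DQO-style planning.
from math import ceil
from statistics import NormalDist

def required_samples(sigma, delta, alpha, beta):
    """Approximate n for a one-sided test of the mean against the action level.
    sigma: total (sampling + analytical) standard deviation, same units as delta
    delta: gray-region width (action level minus the alternative mean)
    alpha, beta: tolerated false positive and false negative rates"""
    z = NormalDist().inv_cdf
    return ceil(((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2)

# Hypothetical: 5% false positive and 15% false negative tolerance (as in the
# example above), sigma = 20 ppm, gray region = 10 ppm  ->  29 samples.
print(required_samples(sigma=20.0, delta=10.0, alpha=0.05, beta=0.15))
```

Because n scales with (sigma/delta)², halving sigma (a better analytical method) or doubling delta (a looser gray region) cuts the required number of samples by a factor of four; this is exactly the cost trade-off that stating the error tolerance is meant to expose.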


Summary

The methods for setting DQOs take advantage of the central features of the SACM: consistency of standards and flexibility of implementation. The DQO methods can help focus a remediation on the risk standards that really matter for that site, translate these standards into measurable quantities, and structure a search for the least expensive remediation plan that satisfies the critical risk standard for that site. In our experience in using DQO methods, we have come to expect large cost savings while assuring regulators that the regulatory standards are satisfied. The atmosphere of comparative trust fostered by a stepwise, logical process with constructive participation of all the important stakeholders, and with careful documentation of expectations and actual performance, is refreshing.


The efficiencies achieved in this approach result from several positive features. In the early stages of identifying the core issue for a site, eliminating extraneous concerns can lower the costs of remediation. Treating risk explicitly as a quantifiable feature of the cleanup leads to efficiencies in remediation design. The regulators establish quantitative acceptance levels (i.e., risk-goals and risk-based error tolerances) for the cleanup, and this determines what levels of remediation, characterization, and monitoring are really required. (By contrast, when risk calculations are not explicit, there is a tendency to throw in intuitive "margins of safety" at many steps in the process. These multiple, ad hoc distortions can compound, leading to a much more conservative and costly cleanup.) By collecting all the policy decisions early and documenting the logic and motivation for them, the scope for the remaining technical decisions is well defined, allowing the best use of technical optimization tools. Overall, the use of a stepwise logic for justifying, refining, and documenting cleanup requirements enhances communication and consensus and reduces the confusion, moving targets, and false starts that cause delay and invite confrontation.
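The compounding effect of ad hoc safety margins noted above is easy to quantify. As a back-of-the-envelope illustration (our arithmetic, not the article's), a modest 1.5× margin applied independently at five steps of a risk calculation tightens the effective cleanup requirement almost eightfold.

```python
# A back-of-the-envelope sketch (our arithmetic, not the article's) of how
# intuitive "margins of safety" compound when applied at several steps.
margin_per_step = 1.5   # a modest 1.5x conservatism at each step (hypothetical)
steps = 5               # e.g., exposure, transport, dose, measurement, remediation
print(f"effective conservatism: {margin_per_step ** steps:.1f}x")  # 7.6x
```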

Acknowledgment

Portions of this work were supported by EPA cooperative agreement R-818563 to Montana State University.

References

(1) "Building Consensus Through Risk Assessment and Management of the Department of Energy's Remediation Program"; Committee to Review Risk Management in the DOE's Environmental Remediation Program, National Research Council, January 1994.
(2) "Guidance on Implementation of the Superfund Accelerated Cleanup Model under CERCLA and the NCP"; OSWER Directive No. 9203.1-03, U.S. Environmental Protection Agency: Washington, DC, 1992.
(3) "Guidance for Planning for Data Collection in Support of Environmental Decision Making—Using the Data Quality Objectives Process"; EPA QA/G-4, Quality Assurance Management Staff, U.S. Environmental Protection Agency: Washington, DC, October 6, 1993.
(4) Blacker, S. M.; Goodman, D.; Clark, J. M. Environ. Test. Anal. 1994, 3(4), 38-43.
(5) Statement of Carol M. Browner, Administrator, U.S. Environmental Protection Agency, before the Subcommittee on Transportation and Hazardous Materials of the Committee on Energy and Commerce, U.S. House of Representatives, Feb. 3, 1994.
(6) "Risk Assessment Guidance for Superfund: Volume 1, Human Health Evaluation Manual" (Interim); Publication 9285.7-01B, Office of Emergency and Remedial Response, U.S. Environmental Protection Agency: Washington, DC, Oct. 1991.

Stanley Blacker is vice president for environment at Management Analysis Company Technical Services, providing technical and policy support to the public and private sectors. He was director of the Quality Assurance Management Staff at EPA until he retired from the USPHS after 20 years of service at EPA. His main interests are in applying strategic planning tools to design and implement cost-effective cleanups in tough cases. He has a master's degree in chemical engineering from Cornell University and a J.D. from Villanova University.


Daniel Goodman is a professor in the Biology Department at Montana State University and directs the Environmental Statistics Group. He received his Ph.D. from Ohio State University. His interests include uncertainty analysis and validation of environmental models. He has worked with the public and private sectors to improve environmental decision making.
