AN INTEGRATED APPROACH FOR EFFICIENT SITE CLEANUP

Two counterproductive themes run through the history of Superfund and Resource Conservation and Recovery Act cleanups: a tendency toward overinvestment in remediation actions (when regulators have directed the cleanup) and a tendency toward costly litigation, which often does nothing but delay cleanup (when responsible, i.e., regulated, parties have attempted to assert themselves). In both situations, the constituencies paying the bills do not get their money's worth because the investments are not efficiently directed at reducing risk in a timely manner.

These problems are no secret, and federal and state regulators are taking a new look at how the cleanups should be managed. Regulators have begun to emphasize real risk reduction, both as a standard and as a frame of reference for ranking priorities, and the agencies have begun to explore ways to streamline the procedures of regulator-driven cleanup. Employing risk assessment to determine priorities for what should be cleaned up, when it should be cleaned up, and how much it should be cleaned up was discussed extensively in a recent National Research Council study of the Department of Energy's cleanup liabilities (1). Two recent developments at EPA, the Superfund Accelerated Cleanup Model (SACM) (2) and the guidelines for setting data quality
objectives (DQO) (3), offer real promise for using risk standards to clean up sites efficiently.

This paper briefly describes some key elements of the SACM and the DQO logic. We will show how four aspects of the DQO methods lend themselves to the streamlining and efficiency that is latent in the SACM initiative. Applying these DQO methods should save significant amounts of time and money in cleanups, while still achieving the risk-based cleanup standards expected by the public and regulators.

The SACM primarily reflects the regulators' perspective; it relaxes some of the previous constraints in managing a cleanup in a way conducive to more flexible and efficient cleanup while still meeting consistent standards. The DQO guidance is preeminently a set of management and technical tools for articulating technical objectives to facilitate communication among diverse players in complex projects, and for optimizing the collection and use of data in environmental decision making. In our experience, the techniques available in the DQO portfolio capitalize on the opportunities for efficient, flexible cleanups offered by the SACM.

The SACM and the DQO guidance were developed and first applied in public sector settings where the regulator conducted the cleanup (e.g., Part 2 of this paper) or where the regulated party was another public sector agency (4). The private sector needs to be alerted to the opportunities that these developments offer for reducing costs in a cleanup action.

STANLEY BLACKER
MAC Technical Services
Germantown, MD 20874

DANIEL GOODMAN
Montana State University
Bozeman, MT 59717

The Superfund Accelerated Cleanup Model
The SACM, an initiative of EPA's Office of Emergency and Remedial Response, is a major rethinking of how to manage the cleanup process (5). The model emphasizes early identification of the real cause of regulator and public concern at each site. The SACM standardizes the level of residual risk that is the goal of the cleanup. In addition, the SACM streamlines the site characterization process by focusing data collection on the priority risks rather than pursuing characterization and risk assessment laundry lists that involve implausibly remote scenarios or contaminants that are not problems at the site. Moreover, the SACM facilitates remedy selection by drawing on cumulative experience with effective remedial technologies (presumptive remedies). Finally, the SACM encourages early action by helping the stakeholders reach consensus and by drawing on the available menu of accepted risk standards, presumptive remedies, and focused
466A  Environ. Sci. Technol., Vol. 28, No. 11, 1994
0013-936X/94/0927-466A$04.50/0 © 1994 American Chemical Society
site-specific risk assessments and site characterization.

Data Quality Objectives
The important early steps for streamlining the cleanup process at a particular site are, first, to create a short list of critical conditions that make the site a genuine problem, and second, to agree on the data required to determine where cleanup is needed (and, by extension, where cleanup is not needed) and where and when the cleanup is complete. Both of these steps are facilitated and enhanced by the methods of DQOs.

DQO procedures were developed by EPA's Quality Assurance Management Staff (QAMS) to provide general assistance for planning data collection programs. The emphasis in the guidance is on collecting the right amount and kind of data to answer the program's needs (3).

The DQO methods can help the stakeholders in a regulator-driven remediation sort through their initial myriad concerns to reach consensus on a focused statement of the problem. The participants work through a risk-based decision logic to arrive at cleanup objectives that are fixed and clear. A variety of alternative remedial strategies can be examined systematically to determine what the expected cost of each will be and whether it can be expected to meet the cleanup objectives. Alternative strategies for using the data to guide the remediation are also evaluated. Efficiencies are achieved by selecting the least costly design that meets the cleanup objectives.
Four Key Elements in the Logic of Data Quality Objectives
1. Separation between policy calls about "risk-goals" and technical discussion of alternatives for implementation.
2. Translation of the regulatory policy calls, including the difficult definition of tolerances for "uncertainty," into specific, concrete, measurable cleanup criteria.
3. Documentation of agreements reached, among all stakeholders, on the critical requirements for the cleanup.
4. Use of the full arsenal of technical tools to achieve an efficient design for the remediation.
Four key elements in the logic of DQOs
In applying DQO methods to regulator-driven cleanup, four elements are essential to reduce the time and cost of cleanup (see the box, "Four Key Elements in the Logic of Data Quality Objectives"). The remainder of this paper will discuss these four elements; in the following companion article we will describe the application of this approach to a particular Superfund site as an illustration of the achieved cost savings and the minimization of controversy.

Separating risk-goal policy from negotiable technical factors. The central concern about any site cleanup is the risk posed by the site.
If that risk can be reduced to an acceptable level, then the regulators and the public should be satisfied. Similarly, a responsible party operating in good faith should be willing to invest in cleaning up the site to reduce the risk to an acceptable level. From the perspective of the party paying the bills, the strategy should be to invest as efficiently as possible in reducing the risk to an acceptable level, and not to overinvest in activities that do not effectively reduce risk.

A statement of what constitutes an acceptable level of risk determines the "risk-goal" for the cleanup. Risk-goals are statements of policy that, within the limitations set by programmatic policy, are the prerogative of the regulators responsible for the cleanup. Because they are policy calls, they are not natural topics for negotiation or technical review. By contrast, technical alternatives for implementation should be inherently negotiable as long as these alternatives reach the goal set by policy.

Notwithstanding the broad authority of the regulator at the site, there are a number of general features of the risk-goal that will be similar across sites. When the concern is with human health, as it often is, there is now a degree of agency consensus on appropriate target levels for nominal risk. A reasonable quantitative range of risk values has been established between unacceptably high risks and acceptable risks. Superfund guidance (6) recommends that the site-specific goal for carcinogens should be a residual risk in the range of 10⁻⁴ to 10⁻⁶ lifetime risk of cancer, assuming a plausible worst case exposure scenario. This type of exposure scenario is constructed by evaluating the contaminants present at the site, their exposure routes and exposure levels, transport mechanisms, transformation mechanisms, and future site use. When a quantitative interpretation is sought for "worst case," it is often the most severe 2.5% tail of the distribution of plausible outcomes (often computed as an upper 95% confidence limit). The risk associated with a given scenario is computed from nominal dose-response curves.

Many of the nominal dose-response curves in common use are subject to serious uncertainty. These curves are often derived from extrapolations of data outside the concentration ranges of concern, and they may be based on animal rather than human data. Furthermore, exposure estimates necessarily involve assumptions whose validity is uncertain. Nevertheless, reasonable conventions have been formulated to standardize the practice of the risk calculations despite the uncertainties. At the level of the individual site, it is generally best to accept these conventions. As a rule, both sides lose in litigation over the metaphysical aspects of establishing acceptable risk levels or choosing among assumptions that cannot be decided by available data. When there is no scientific consensus on a particular assumption or goal, the courts cannot be expected to settle the matter in any more reasonable fashion. Litigation in such a situation is costly and the results are unpredictable. In designing the cleanup for a site, it usually makes sense to accept the existing program-wide policy conventions. Site-specific technical negotiation and factual investigation should be restricted to those aspects of the cleanup for which new facts and discussion are likely to pay real dividends.
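The "upper 95% confidence limit" convention mentioned above can be sketched in a few lines. This is only an illustration, not part of the original guidance: the soil concentrations below are invented, and the one-sided Student's t interval shown is just one common way to compute an exposure point concentration.

```python
import math
from statistics import mean, stdev

# One-sided 95% t critical values for df = n - 1; a small lookup table
# (values from standard t tables) keeps the sketch dependency-free.
T95 = {4: 2.132, 9: 1.833, 14: 1.761, 19: 1.729, 29: 1.699}

def ucl95(samples):
    """One-sided 95% upper confidence limit on the mean concentration."""
    n = len(samples)
    t = T95[n - 1]                      # assumes n - 1 is in the table
    return mean(samples) + t * stdev(samples) / math.sqrt(n)

# Hypothetical soil concentrations (mg/kg) from 10 borings
conc = [1.2, 0.8, 2.5, 1.9, 0.6, 3.1, 1.4, 2.2, 0.9, 1.7]
epc = ucl95(conc)   # exposure point concentration fed into the risk calc
print(f"mean = {mean(conc):.2f} mg/kg, UCL95 = {epc:.2f} mg/kg")
```

The UCL sits above the sample mean by an amount that shrinks as more samples are taken, which is exactly why sampling effort and the conservatism of the exposure estimate trade off against each other.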
Flexibility can be negotiated in those aspects of the cleanup that are not policy matters. Wise use of this flexibility creates opportunities for savings. A hard look at factual relationships among such factors as waste form, transport mechanisms, site use, and exposure routes can lead to a more efficient cleanup strategy. Examples of tactical considerations are given in the box, "Some Tactical Considerations . . .".

Translation of regulatory policy decisions into measurable criteria. The statement of the acceptable risk for a particular site is more than a statement of philosophy. It serves as the basis for determining how much cleanup is required. Going from a statement of acceptable risk to a concrete statement of measurable criteria that will guide remediation at a site is accomplished by translating the statement of acceptable risk to a dose standard, translating the dose standard to a concentration standard, and translating the concentration standard to a criterion expressed in terms of field measurements. A description of the implementation of these translations is given in the box, "Translating Regulatory Policy Decisions . . .".

These translation processes are at the heart of obtaining concrete, unambiguous statements of criteria that fully address all the essential policy concerns for the cleanup, but that are no more stringent than necessary. The clarity and concreteness will be important when these criteria are used as the technical specifications the cleanup design must meet. In particular, explicit quantification of the requirements for certainty in the cleanup allows the use of decision theory. As discussed below (under "Optimization of the design"), this allows rigorous application of sophisticated optimization techniques, such as those of operations research, to structure the search for an efficient remedial design.

Documentation of agreements reached on critical requirements. Reaching a consensus on the concentration standard and associated
error tolerance marks an important transition in establishing the cleanup requirements. At this point, all the policy issues are settled and documented in a clear paper trail that is endorsed by the participants (to preclude later misunderstandings). Once documentation of the policy judgments, and their rationale, is complete, the remaining aspects of the cleanup are technical. Then discussion and analysis can focus on selecting efficient means for achieving the desired risk-based cleanup levels.

Because success in a regulator-mandated cleanup is so dependent on consensus, it is important throughout the course of the DQO effort to document carefully the sequence of discussions from which the specific cleanup requirements (and, hence, the cleanup implementation plan) evolve. The paper trail maintains continuity in the intervals between the face-to-face meetings; the incorporation of episodic sign-off steps in the record preserves the agreements that have been reached. Finally, capturing the logic and the rationale for the agreements can be invaluable if questions arise later during the course of the cleanup.

Optimization of the design. When the cleanup goal is expressed in terms of a concentration standard and an associated error tolerance for estimates of the concentration relative to the standard, the regulators have effectively made a contractual statement. They are committed to accepting any implementation plan that achieves that concentration with that level of certainty. This leaves open many of the details of the implementation plan, including
Some Tactical Considerations that Promote Flexibility and Opportunities for Efficiency in Site Cleanups

If the risk is driven by long-term average exposure, then spatial or temporal "hot spots" of contamination may not be important except as they affect the overall average (unless the hot spots exceed a threshold where acute, rather than chronic, health effects become a concern).

If projected site use will limit certain exposure routes, then the cleanup requirements need not focus on the risk associated with those exposure routes, except to ensure that the control of site use will be effective and will be maintained for as long as the source of contamination remains.

If future site use will lead to exposure only to the contaminants that escape the site boundary via a particular route, then containment measures or actions that intercept the contaminant at this particular route may be sufficient, provided that containment can be guaranteed for as long as required.

If effectiveness of containment or interdiction measures can be ensured for the duration of a remediation, this removes the urgency of eliminating the source of contamination.
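As a rough numeric illustration of the first tactical consideration (long-term averages versus hot spots), the sketch below checks subunit concentrations against a chronic standard that applies to the exposure-area average and an acute threshold that applies to each subunit individually. All of the numbers are invented; nothing here comes from the paper or from EPA guidance.

```python
# Check a site against both a chronic (long-term average) standard and
# an acute hot-spot threshold. Concentrations and limits are invented.

chronic_std = 2.0    # mg/kg, applies to the exposure-area average
acute_limit = 10.0   # mg/kg, above which acute effects become a concern

# Hypothetical subunit concentrations across the exposure area
subunits = [0.5, 1.1, 4.8, 0.9, 1.3, 0.7, 3.9, 1.0]

avg = sum(subunits) / len(subunits)
hot_spots = [c for c in subunits if c > acute_limit]

print(f"area average = {avg:.2f} mg/kg "
      f"({'meets' if avg <= chronic_std else 'exceeds'} chronic standard)")
print(f"acute hot spots: {len(hot_spots)}")
```

In this invented case the two elevated subunits (4.8 and 3.9 mg/kg) stay below the acute threshold and the area average still meets the chronic standard, so no subunit would need individual remediation on these criteria alone.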
Translating Regulatory Policy Decisions into Measurable Criteria

Translating the risk-goal to a dose standard. When the contaminant of concern, the exposure route, and the acceptable risk level have been identified, a conventional dose-response curve should be used. Such a curve will identify the dose believed to be associated with that level of risk, for that type of exposure, and for that contaminant. The modifier "conventional" acknowledges that a variety of assumptions are often built into the construction of a dose-response curve. In the interests of consensus and in the absence of more conclusive scientific alternatives, the standard assumptions should be accepted, rather than attempting to wrestle with policy issues that are not likely to be resolved at the level of the individual site.

Translating the dose standard to a concentration standard. When the medium, exposure route, transport mechanisms, and mechanisms of decay and byproduct formation are taken into account, calculating the source concentration associated with the standard allowable dose is routine. This becomes the concentration standard for judging whether a cleanup has achieved the risk-goal. In the case of a containment strategy, the concentration standard applies only outside the containment structure. In the case of a removal strategy, the concentration standard applies to the portion of the site that is contaminated. The concentration standard must be expressed as an average over a particular volume of space (or area). For dynamic situations, such as groundwater plumes, the average also needs to reflect a window of time. The spatial and temporal basis of the averaging must be logically connected with the exposure scenario (e.g., the area that is used in averaging should be the same as the area that an individual may cover in accumulating a dose in the plausible worst case scenario). The nature of the dose calculation (e.g., How will the individual's exposure be accumulated over time? At what levels might there be a transition from chronic to acute effects?) must be documented. This dose calculation is to confirm that the concentration standard is consistent with the risk-goal, given the assumptions of the dose-response curve and the exposure scenario.

Translating the concentration standard to field measurements. The contaminant concentration standard is expressed in terms of the true concentration averaged over some interval of space (and time, if appropriate). This concentration standard could be used as the working standard in the field only if the true concentration in the field were known. However, we can know only estimates of the true concentration from measurements on samples. Samples are subsets of the spatial and time domain specified in the concentration standard. Further, the data come from measurements that are subject to analytical error. Because of the variation introduced by sampling and analytical error, it is unreasonable to expect an exact correspondence between the true value and the estimate. This discrepancy creates an uncertainty which itself is part of the risk posed by the site. For this reason, it is a policy call to define how much average discrepancy is tolerable (i.e., the error tolerance) between the measurable estimate and the unknown true value.

One way to state the error tolerance is in terms of acceptable frequencies of false positive and false negative decisions about whether the cleanup goal has been reached. For example, in determining whether a subunit of a contaminated site is above or below the threshold defined in the concentration standard, a false negative would arise when the estimate was below the threshold even though the true concentration was above the threshold. Thus, a stated 15% tolerance for such a false negative would convey that a sampling and measurement system would be acceptable if it provided estimates that gave rise to such false negatives no more than 15% of the time.

The definition of error tolerance, arising as a statement of policy, will then have implications for the design and cost of data collection. The number of samples can be increased, or a more advanced analytical method may be used, to obtain a more accurate estimate but at greater cost; or fewer samples can deliver a less accurate estimate at lower cost.
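The three translations in this box can be sketched end to end. Everything numeric below is a placeholder (the slope factor, the soil-ingestion intake assumptions, the variability), and the linear risk model and the normal approximation are standard default conventions, not anything prescribed by the original guidance.

```python
import math

# --- Step 1: risk-goal -> dose standard --------------------------------
# Linear low-dose model: risk = CSF * CDI, with CDI in mg/kg-day.
risk_goal = 1e-6                 # acceptable lifetime cancer risk
csf = 0.5                        # cancer slope factor, (mg/kg-day)^-1
cdi_allowed = risk_goal / csf    # allowable chronic daily intake

# --- Step 2: dose standard -> concentration standard -------------------
# Invert the soil-ingestion intake equation
#   CDI = C * IR * EF * ED / (BW * AT)
ir = 100e-6                      # soil ingestion rate, kg/day (100 mg/day)
ef, ed = 350, 30                 # exposure frequency (d/yr), duration (yr)
bw, at = 70.0, 70 * 365          # body weight (kg), averaging time (d)
c_std = cdi_allowed * bw * at / (ir * ef * ed)   # mg/kg soil

# --- Step 3: concentration standard -> field measurement criterion -----
# Policy call: at most a 15% false negative rate when the true mean sits
# at 'true_mean', just above the standard. Normal approximation for the
# mean of n measurements with combined sampling + analytical sigma.
def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def false_negative_rate(n, true_mean, threshold, sigma):
    return norm_cdf((threshold - true_mean) / (sigma / math.sqrt(n)))

sigma = 1.5                      # mg/kg, measurement + sampling variation
true_mean = c_std + 0.8          # a contamination level we must not miss
tolerance = 0.15

n = 1
while false_negative_rate(n, true_mean, c_std, sigma) > tolerance:
    n += 1

print(f"dose standard: {cdi_allowed:.1e} mg/kg-day")
print(f"concentration standard: {c_std:.2f} mg/kg")
print(f"samples per subunit: {n}")
```

The last step makes the cost implications of the policy call concrete: tightening the false-negative tolerance, or shrinking the margin between the standard and the level that must not be missed, drives the required number of samples up.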
sampling design, analytical procedures, and remediation activities. With this level of flexibility in the details for meeting the concentration standard and the certainty level, there can be many remediation implementation plans that should be acceptable to the regulators. Plans that meet the established requirements will vary in cost. A systematic search for efficient plans can lead to substantial savings. The methods of operations research offer many techniques (such as linear and nonlinear programming, dynamic programming, and control theory) for finding efficient solutions, once the goals and constraints of the remediation are sufficiently defined. The DQO methods facilitate definition of these goals and constraints, and the use of decision theoretic formulations to state tolerances for uncertainty (as per the DQO methods) completes the translation of the cleanup requirements to a set of technical, unambiguous specifications.
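In the same spirit, the search for a least-cost plan can be sketched as a toy enumeration. The two measurement methods, their costs, and their standard deviations are invented for illustration; a real design search would range over far more dimensions (sampling layout, remedial technology, scheduling).

```python
import math

# Candidate measurement methods: (name, cost per sample, std dev).
# All names, costs, and standard deviations are invented placeholders.
methods = [
    ("field screening", 50.0, 2.0),
    ("fixed lab",      400.0, 0.5),
]

se_target = 0.4   # required standard error of the estimated mean, mg/kg

def cheapest_design(methods, se_target, max_n=200):
    """Enumerate (method, n) designs; return the least costly one whose
    standard error sigma/sqrt(n) meets the target."""
    best = None
    for name, cost, sigma in methods:
        n = max(1, math.ceil((sigma / se_target) ** 2))
        if n <= max_n:
            total = n * cost
            if best is None or total < best[2]:
                best = (name, n, total)
    return best

name, n, total = cheapest_design(methods, se_target)
print(f"cheapest acceptable design: {n} x {name}, ${total:,.0f}")
```

In this invented case the more expensive analytical method wins because its lower variance lets far fewer samples meet the same error tolerance, which is exactly the kind of trade-off the text describes.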
Summary
The methods for setting DQOs take advantage of the central features of the SACM: consistency of standards and flexibility of implementation. The DQO methods can help focus a remediation on the risk standards that really matter for that site, translate these standards into measurable quantities, and structure a search for the least expensive remediation plan that satisfies the critical risk standard for that site. In our experience in using DQO methods, we have come to expect large cost savings while ensuring regulators that the regulatory standards are satisfied. The atmosphere of comparative trust fostered by a stepwise, logical process with constructive participation of all the important stakeholders, and with careful documentation of expectations and actual performance, is refreshing.

The efficiencies achieved in this approach result from several positive features. In the early stages of identifying the core issue for a site, eliminating extraneous concerns can lower costs of remediation. Treating risk explicitly as a quantifiable feature of the cleanup leads to efficiencies in remediation design. The regulators establish quantitative acceptance levels (i.e., risk-goals and risk-based error tolerances) for the cleanup, and this determines what levels of remediation, characterization, and monitoring are really required. (By contrast, when risk calculations are not explicit, there is a tendency to throw in intuitive "margins of safety" at many steps in the process. These multiple, ad hoc distortions can compound, leading to a much more conservative and costly cleanup.) By collecting all the policy decisions early and documenting the logic and motivation for them, the scope for the remaining technical decisions is well-defined, allowing the best use of technical optimization tools. Overall, the use of a stepwise logic for justifying, refining, and documenting cleanup requirements enhances communication and consensus and reduces the confusion, moving targets, and false starts that cause delay and invite confrontation.

Acknowledgment
Portions of this work were supported by EPA cooperative agreement R-818563 to Montana State University.

References
(1) "Building Consensus Through Risk Assessment and Management of the Department of Energy's Remediation Program"; Committee to Review Risk Management in the DOE's Environmental Remediation Program, National Research Council: Washington, DC, January 1994.
(2) "Guidance on Implementation of the Superfund Accelerated Cleanup Model under CERCLA and the NCP"; OSWER Directive No. 9203.1-03, U.S. Environmental Protection Agency: Washington, DC, 1992.
(3) "Guidance for Planning for Data Collection in Support of Environmental Decision Making-Using the Data Quality Objectives Process"; EPA QA/G-4, Quality Assurance Management Staff, U.S. Environmental Protection Agency: Washington, DC, October 6, 1993.
(4) Blacker, S. M.; Goodman, D.; Clark, J. M. Environ. Test. Anal. 1994, 3(4), 38-43.
(5) Statement of Carol M. Browner, Administrator, U.S. Environmental Protection Agency, before the Subcommittee on Transportation and Hazardous Materials of the Committee on Energy and Commerce, U.S. House of Representatives, Feb. 3, 1994.
(6) "Risk Assessment Guidance for Superfund: Volume 1, Human Health Evaluation Manual"; Publication 9285.7-01B, Office of Emergency and Remedial Response, U.S. Environmental Protection Agency: Washington, DC, Interim, Oct. 1991.
Stanley Blacker is vice president, environment, at Management Analysis Company Technical Services, providing technical and policy support to the public and private sectors. He was formerly director of the Quality Assurance Management Staff at EPA; he retired from USPHS after 20 years of service at EPA. His main interests are in applying strategic planning tools to design and implement cost-effective cleanups in tough cases. He has a master's degree in chemical engineering from Cornell University and a J.D. from Villanova University.
sr Daniel Goodmnn is o professor in the Biology Departinent at Montana State University a n d directs the Environmental Statistics Group. He received his Ph.D. from Ohio State University. His interests include uncertainty analysis and validation of environmental models. He h a s worked with the public a n d private sectors to improve environmental decision making.