Dispersion modeling

Dispersion modeling: an increasingly used and increasingly questioned tool in air quality planning. Researchers point to the need for more accurate and more complete input data, field verification, a better understanding of the applicability of the various models, and possibly a change in the form of air quality standards.

This article is the first in a series on mathematical modeling of the atmosphere. Future articles will examine models of carbon dioxide-induced changes in climate and halocarbon-induced changes in stratospheric ozone.

Mathematical models of the atmosphere, once subjects of the fiefdom of pure research, have now entered the realm of the day-to-day regulatory decision. The ban on fluorocarbons as aerosols went into effect a year ago, largely on the basis of computer simulations of the chemistry and dynamics of the stratosphere which showed a depletion in the ozone layer. Intensive studies of the effect of a buildup of atmospheric carbon dioxide are under way, with possible implications for the continued heavy use of fossil fuels; again, computer simulation is the primary tool. Mathematical modeling of the effect of new sources on ground-level air quality is a standard part of EPA permit procedures.

In each of these instances, the use of a model has been dictated by the need to make decisions on an action which is known in advance to pose a possible environmental problem. The increasing use of models is in one sense an indication of our better understanding of the environment, our ability to look ahead to potential problems rather than wait for a disaster and attempt to apply a remedy after the fact, when it is usually too late to do anything about it. If we wait for observational confirmation of the predicted destruction of stratospheric ozone from fluorocarbon release, for example, the effect will already be snowballing. The National Academy of Sciences report on stratospheric ozone said, "There is a time delay between halocarbon release and ozone destruction; the maximum ozone decrease actually occurs some 15 years after all release has stopped." Thus, we cannot afford to "experiment" to find out what will happen; a decision has to be made on the less-than-perfect information available now. The "greenhouse effect," which links a buildup of carbon dioxide with a global warming, presents a similar dilemma.

Although a rise in atmospheric carbon dioxide concentration has been observed, its predicted effect on climate has yet to be conclusively observed; again, waiting until the effect has risen above the "noise" level may mean waiting until the effect is well on the way to causing drastic changes in the environment. Models can help to predict what the magnitude of the effect will be, and what some of its consequences may be.

Decisions which must be made regularly on a local scale also must rely on model calculations. A utility cannot be expected to spend millions of dollars building a new plant and installing pollution control devices only to discover that their effect on local air quality is unacceptable. There is general agreement that no good alternatives are available to the use of models. But how the results are to be interpreted, how they may be verified, and how they may be improved are matters of great debate. The validity of the assumptions and simplifications made in a model, and where exactly the uncertainties are introduced, are also questions with no firm answers right now. This first article in a series on mathematical modeling of the atmosphere examines these questions as they apply to dispersion modeling, the technique used in new source permit applications.

The regulatory requirements

The Clean Air Act amendments of 1977 specify the use of air quality models in the analysis of prevention of significant deterioration (PSD) permit applications. The PSD requirements limit the amount by which ground-level concentrations of sulfur dioxide and particulate matter may increase over existing baseline levels of 1977 as a result of all new sources. The object of using an air quality model is thus to relate the expected emission rates of these pollutants from a proposed new source to their effect on ground-level concentrations, to establish that the permissible increment over baseline will not be exceeded. Modeling is also usually necessary to demonstrate that National Ambient Air Quality Standards for carbon monoxide, hydrocarbons, nitrogen dioxide, and photochemical oxidants will not be exceeded as a result of the new source.

The major worry of those facing the permitting process is how accurate the models currently in use are. The utility industry in particular is concerned over what it sees as the strong "conservative" bias of the widely used "off-the-shelf" models developed by the EPA.

These biases, the industry says, overestimate concentrations and may lead a utility to spend hundreds of millions of dollars on unnecessary controls. Though models other than those developed by the EPA may be used, gaining approval for their use may be difficult. The result is that the EPA models have attained a certain "regulatory status," according to many. In a recent discussion of the problem in the Journal of the Air Pollution Control Association, V. A. Mirabella of the Southern California Edison Company argued, "There seems to be a built-in reluctance to employ alternative modeling techniques . . . due to the arduous task of gaining EPA approval." To use an alternative or nonrecommended model for a given situation, an applicant must present evidence that the alternative method is equivalent to a recommended model. Equivalency is judged "not on a comparison of performance and accuracy of individual models, but on the degree to which the alternative model treats individual features of the modeling problem in a manner similar to that of the reference model."

Sources of error

The EPA models, which naturally have their defenders as well as their critics, are all based on a "Gaussian" plume spread. The Gaussian (or binormal, or bell-shaped) distribution of the concentration of a substance is the general solution to the diffusion equation when the simplifying assumption of "Fickian" diffusion is made (see box). The biggest plus for Gaussian models is their relative simplicity. Little computation time and input data are required. "When it all comes down to a reasonable simplicity, the Gaussian model can't be beat," said Isaac van der Hoven, chief of the National Oceanic and Atmospheric Administration's Air Resources Environmental Laboratory. "More sophisticated models mean more sophisticated data you need to put in."

Although errors in predicted concentration may be large, from 30% to a factor of 2, or even 10, depending on the pollutant, the averaging time, the spatial scale, the terrain, and who is asked, the errors may not be significantly greater than those associated with the input data. According to van der Hoven, "The major uncertainty is in the source term. Unfortunately, this is not often well known to within a factor of 2."

Robin Dennis of the National Center for Atmospheric Research agreed: "The errors that are now coming along in the models are equivalent to the error in emission inventories and knowledge of micrometeorology."

But others point to the "crude" assumptions and simplifications inherent to Gaussian models. Heading the list is that Gaussian models are all steady-state models. Over the time interval considered in the calculation, factors such as wind speed, temperature, emission rates, and mixing height (the height above which a spreading plume does not penetrate) are taken as constants.

A second source of uncertainty is the empirical formulation used to specify the horizontal and vertical dispersion parameters. These factors determine how the plume spreads as a function of distance from the source. The commonly used formulation empirically adjusts the dispersion-versus-distance curves according to what is called stability class, a semiquantitative description of how stable or unstable the atmosphere is. More data and more complete field studies could improve this formulation. According to Bruce Turner of the EPA's meteorology and assessment division, "In most cases we have a theoretical foundation and the equations are theoretically based; it's just the values of the parameters going into them that are empirical."
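To make the empirical character of that formulation concrete, here is a minimal sketch in which the dispersion parameters are written as power laws of downwind distance, with different coefficients for each stability class. The coefficients are invented placeholders for illustration; they are not the values tabulated in the EPA models.

```python
# Illustrative sketch only: horizontal and vertical dispersion parameters
# (sigma_y, sigma_z) represented as power laws of downwind distance, with
# coefficients that depend on the stability class. The numbers below are
# placeholders chosen to show the shape of the dependence; working models
# use tabulated empirical curves instead.

ILLUSTRATIVE_COEFFS = {
    # stability class: (a_y, b_y, a_z, b_z)
    "unstable": (0.22, 0.90, 0.20, 0.95),
    "neutral":  (0.11, 0.90, 0.08, 0.85),
    "stable":   (0.06, 0.90, 0.03, 0.75),
}

def dispersion_parameters(x_m, stability):
    """Return (sigma_y, sigma_z) in meters at downwind distance x_m (meters)."""
    a_y, b_y, a_z, b_z = ILLUSTRATIVE_COEFFS[stability]
    return a_y * x_m ** b_y, a_z * x_m ** b_z

if __name__ == "__main__":
    for stability in ("unstable", "neutral", "stable"):
        sy, sz = dispersion_parameters(1000.0, stability)
        print(f"{stability:8s} sigma_y = {sy:6.1f} m  sigma_z = {sz:6.1f} m")
```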

Data, data, who's got the data?

Turner and others who question the need to go to more complex models than the Gaussian keep coming back to this point: Improved data and more data can be incorporated into the Gaussian models to advantage. "If you improve the data, you're going to change the model to use the improved data. It's not just that we want less error in the data we're getting; we want new parameters," said Turner. For example, measurements of the statistical fluctuation in wind direction could provide the basis for a better estimate of horizontal dispersion than could stability class; this information, though, is not normally available. Turner noted that there is a tendency to get stuck in a rut with available data. "The principal reason for the simplification [in the model] is that certain data is readily available and other data isn't, and we didn't even ask for the other data." Thus those who make the measurements aren't aware of a need for additional data.

But Gaussian formulations cannot absorb all data improvements. Some simplifications in the model, such as assuming that average wind velocity is not a function of position, are inherent to the Gaussian model.


Using a single wind velocity locks the calculation into an empirical framework, since there is no obvious answer to the question of which wind speed to use. And since the effective injection point of the plume is not the top of the stack but rather some point above the stack (the plume rises vertically for some distance before leveling off), some sort of empirical extrapolation of ground-level or top-of-stack wind measurements is necessary.
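One common way to handle that extrapolation is a power-law wind profile; the sketch below assumes that form. The exponent and the sample numbers are arbitrary placeholders, since in practice the exponent depends on stability class and surface roughness.

```python
# A minimal sketch, assuming a power-law wind profile, of extrapolating a
# measured wind speed to the effective plume height (physical stack top
# plus plume rise). The exponent p is an illustrative value only.

def wind_at_height(u_ref, z_ref, z, p=0.15):
    """Extrapolate wind speed u_ref measured at height z_ref to height z."""
    return u_ref * (z / z_ref) ** p

def effective_height(stack_height, plume_rise):
    """Effective injection height: stack top plus plume rise."""
    return stack_height + plume_rise

if __name__ == "__main__":
    u10 = 4.0                                   # m/s measured at 10 m
    h_eff = effective_height(stack_height=100.0, plume_rise=150.0)
    print(f"wind at {h_eff:.0f} m: {wind_at_height(u10, 10.0, h_eff):.1f} m/s")
```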

Those who are developing more complex models insist that it is important to make explicit the true data requirements of a complete description of pollutant spread. They argue that although the Gaussian model is appropriate under conditions which correspond to the model's assumptions, it cannot reveal how much the results would vary under differing conditions. An example was provided by Philip Roth, vice-president of Systems Applications

The Gaussian plume model

The basis of most mathematical descriptions of atmospheric diffusion is the "K-theory" equation, which relates the concentration of a substance, as a function of position and time, with the action of average wind flows and turbulence.

The K-theory equation in one dimension:

$$\frac{\partial C}{\partial t} = -u\,\frac{\partial C}{\partial x} + \frac{\partial}{\partial x}\left(K\,\frac{\partial C}{\partial x}\right)$$

where
C = concentration of the substance
u = wind speed in the x direction
K = a coefficient, known as the "turbulent diffusivity" or "eddy diffusivity"
x = distance from the source
t = time

The first term on the right side of the equation represents transport due to average winds; the second term represents diffusion due to turbulence. To obtain the Gaussian solution, four assumptions must be made:
- the solution is time-invariant
- the wind speed is not a function of position
- the diffusivities are not functions of position (the key assumption in "Fickian" diffusion)
- diffusion in the x direction is insignificant compared with mean flow in the x direction

With these assumptions, the equation has an analytical solution, which expresses the concentration as a Gaussian, or normal, distribution in the y and z directions.

The Gaussian solution:

$$C(x,y,z) = \frac{Q}{2\pi\,u\,\sigma_y\,\sigma_z}\,\exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)\exp\!\left(-\frac{z^2}{2\sigma_z^2}\right)$$

where
C = concentration
Q = emission rate at the source
u = wind speed in the x direction
σ_y = a function of the turbulent diffusivity in the y direction and the distance x
σ_z = a function of the turbulent diffusivity in the z direction and the distance x
x = horizontal distance from the source along the direction of u
y = horizontal distance from the source perpendicular to u
z = vertical distance from the source

The figure shows the characteristic horizontal spread of a Gaussian plume as it travels downwind from the source. Concentration is shown on the vertical axis as a function of x and y for a fixed value of z.
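For readers who want to see the box's equation in executable form, the following is a direct, minimal transcription. It places the source at the origin and omits the ground-reflection and plume-rise refinements that working models add; the sample numbers are arbitrary.

```python
import math

# Direct transcription of the Gaussian solution in the box: concentration
# from a continuous point source, given emission rate Q, wind speed u, and
# dispersion parameters sigma_y and sigma_z evaluated at the receptor's
# downwind distance. Reflection at the ground and at the mixing height is
# deliberately ignored here.

def gaussian_plume_concentration(q, u, sigma_y, sigma_z, y, z):
    """Concentration at crosswind offset y and height z (units follow q)."""
    return (q / (2.0 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-z**2 / (2.0 * sigma_z**2)))

if __name__ == "__main__":
    # Example numbers are placeholders, for illustration only.
    c = gaussian_plume_concentration(q=100.0, u=5.0,
                                     sigma_y=80.0, sigma_z=40.0,
                                     y=0.0, z=0.0)
    print(f"centerline concentration: {c:.2e}")
```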

of San Rafael, Calif., a consulting firm which does work in air quality for government and industry, and which is currently developing models under an EPA contract: "Suppose you knew that one model didn't take into account wind shear" (the variation in wind speed and direction with height) "while another model did. Then, under conditions where wind shear might be a factor, you'd still want to use the more complex formulation and see how that formulation responds to variations in wind shear over the range of uncertainty," even though exact measurements may not be available. If the more sophisticated model shows the results to be relatively insensitive to these variations, then the simpler model can be used with more confidence, Roth said.

Range of applicability

Everyone agrees, though, that the Gaussian formulation has its limitations: "You just can't seem to address photochemical problems with Gaussian models; you have to make assumptions if you're going to address a complex terrain with a Gaussian model," said Turner of EPA. Photochemistry is a major factor in determining the concentrations of ozone and nitrogen oxides in the lower atmosphere; since it is a time-dependent phenomenon, the steady-state Gaussian model cannot incorporate it. This fact is widely recognized and indisputable.

The application of Gaussian models to rough terrain is open to question, however. It is possible to use a Gaussian model in such instances, and, according to many, totally incorrect. "The models are terrain-sensitive," said Glenn Hilst of the Electric Power Research Institute's environmental assessment department. "If they're developed for a flat terrain and then applied even to rolling hills, they are not appropriate." The danger, according to Hilst, is that by not firmly establishing the range of validity of a model, particularly a readily available, "approved" model, it may be applied to conditions which it cannot possibly handle correctly.

Even the EPA model developed specifically for rough terrain is questioned. "The use of it has inspired a lot of criticism; it predicts the maximum concentration would be twice that of the plume itself," said Bruce Egan of Environmental Research & Technology, a consulting firm which provides modeling assistance to permit applicants.

The problem, said Egan, is that "modifications to Gaussian models are essential in complex terrain to consider the way the flow is deformed around terrain objects." A second problem is that roughness creates an increase in turbulence that needs to be taken into account through modification of the dispersion coefficients. As it stands, the EPA model predicts ground-level concentrations that are as much as eight times those predicted by models embodying different assumptions, Egan said.

Complex models

What are the alternatives, then, to Gaussian models? Complex models exist in a number of formulations (see box); they have a number of common features, however. One is that they are all numerical models. While a Gaussian model makes certain assumptions which lead to an analytical solution of the basic diffusion equation, a complex model performs a numerical solution of the equation, integrating it step by step over time. The Gaussian solution is thus an exact solution of an inexact equation, one in which steady-state assumptions have been made; the solution provided by a complex model, on the other hand, is an inexact solution of a more exact equation.

From the viewpoint of what the solutions describe, the Gaussian is statistical. It does not trace the movement of the plume through time and space as the complex models do, following changes in wind speed, direction, and stability; rather, it uses the overall description of wind and stability to produce a statistical, steady-state picture of how a plume spreads with distance from a source. Since complex models apply the diffusion equation at each point in space over a series of time intervals, more detailed information, particularly on wind speed and direction, is necessary. A "wind field," or map of wind speed and direction over the entire region of study, must be specified. Such complete information is rarely available.

Because complex models involve numerical integration of the equations, truncation errors can be a problem. A small truncation error can accumulate through the successive integration steps; this "numerical diffusion" is a major bugaboo in grid models. The effect can be minimized by shrinking the grid spacing or shortening the integration time steps, though this increases the computer capacity needed for the calculation.
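A minimal sketch of how numerical diffusion arises: a concentration spike is carried along a one-dimensional grid with a first-order upwind scheme, and after many steps the peak is visibly smeared even though the equation being solved contains no diffusion term. The grid spacing, time step, and wind speed are arbitrary illustrative values.

```python
# Toy demonstration of numerical diffusion in a grid model: first-order
# upwind advection of a unit spike. The smearing of the peak comes purely
# from the discretization, not from any physical diffusion term.

def upwind_advection_step(c, u, dx, dt):
    """Advance concentrations one time step with first-order upwind differencing."""
    courant = u * dt / dx
    new = c[:]
    for i in range(1, len(c)):
        new[i] = c[i] - courant * (c[i] - c[i - 1])
    return new

if __name__ == "__main__":
    cells = [0.0] * 100
    cells[10] = 1.0                      # unit spike near the upwind edge
    for _ in range(200):                 # 200 steps of 200 s each
        cells = upwind_advection_step(cells, u=2.0, dx=1000.0, dt=200.0)
    spread = sum(1 for v in cells if v > 0.01)
    print(f"peak after transport: {max(cells):.3f} (was 1.000), spread over {spread} cells")
```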

Computer running time as well as computer capacity are considerations with complex models. Both increase with model complexity, as does expense. But Roth of Systems Applications pointed out that other factors may be more important. "It's true that they're more expensive, but that's not where the cost is tied up. Your real cost is in collecting field data and preparing it for input. Carrying out the runs and analysis turns out to be a relatively minor part of the cost." Roth also argued that the additional expense of gathering the more complete meteorological data needed for complex models may be minor when compared with the potential cost of unnecessary pollution-control equipment.

But the real issue, according to Roth, is not one of expense, or even simply of Gaussian models versus complex models, but rather of the need to step back and examine the nature of the problem before choosing a model. "You have to consider what pollutants you want to model; then there's the time scale of the problem: Are you looking at an hour or a day or a year? Then there's the spatial scale of the problem: Are you looking at a local problem or a regional problem?" Most of the scientists working in the field with whom ES&T spoke agreed that such factors should dictate the choice of the model, but disagreed over where to draw the line.

Opinions ranged from the view that the Gaussian model is at most a "screening" procedure which, by embodying conservative assumptions, can yield a "worst case" prediction, to the view that only extreme circumstances justify the greater expense, time, and data requirements of the complex models.

Field tests

The most obvious way to decide the issue would be on the basis of field tests, under varying conditions, which could determine how well the various models perform. But in the rush to develop models and put them into use, this key work has been given the back seat. "It costs a lot of money to do these tests, and they aren't done very often," said van der Hoven. His group at NOAA is "desperately trying to validate these models" through field tests in which sulfur hexafluoride, a tracer gas, is released at a known rate and its concentration measured downwind by an array of up to 300 samplers, along with a good deal of meteorological data.

Egan of ERT, while noting the importance of such work, injected a note of caution with regard to studies that simply compare predicted and observed concentrations. "When you realize that a model would be required to estimate ground-level concentrations for all the different meteorological conditions that may be encountered," you realize that a few comparisons of predicted and observed concentrations is not sufficient "verification," he said.

Numerical models

"Complex" models all rely on numerical solutions of the K-theory equation. Three general approaches are possible.

In the Eulerian formulation, a fixed coordinate system, or grid, is laid out over the entire region of interest. The concentration of a pollutant in each square of the grid is then calculated by explicitly solving the equation, using numerical methods and the help of a computer, over a series of small time intervals. In squares which contain sources, a source term is added to the right side of the K-theory equation. This approach is most useful for situations in which there are multiple sources or for which predicted concentrations are needed for the entire region.

The Lagrangian formulation, on the other hand, considers a coordinate system moving with the local mean winds. This approach is particularly useful for modeling transport over a long distance or when the effect on only a particular receptor site is of concern; in other words, when there is no need to calculate concentrations over an extensive array of fixed locations. While this saves on computer space and time, it leads to results which can be difficult to interpret, since the coordinate grid is distorted as it twists and turns with the local winds. A subset of Lagrangian models, trajectory models, apply the Lagrangian formulation to a single moving cell, thus avoiding the grid distortion. In this case, however, turbulent diffusion must be calculated from external data, and is in practice usually neglected altogether.

The particle-in-cell method is a hybrid approach. Here, the source emissions are divided into individual Lagrangian cells, each of which is tracked over a fixed coordinate system. The concentration in each fixed grid square is then calculated simply by counting up the number of these cells present in each square.
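As a toy illustration of the particle-in-cell bookkeeping just described, the sketch below advances Lagrangian particles with a (here constant) wind field and then counts particles per fixed grid cell as a stand-in for concentration. All numbers are placeholders.

```python
from collections import Counter

# Toy particle-in-cell bookkeeping: each emitted "particle" carries a
# Lagrangian position advanced with the local wind; concentrations on the
# fixed grid are proxied by counting particles per grid square.

def advect(particles, wind_field, dt):
    """Move each (x, y) particle with the wind sampled at its location."""
    moved = []
    for x, y in particles:
        u, v = wind_field(x, y)
        moved.append((x + u * dt, y + v * dt))
    return moved

def grid_counts(particles, cell_size):
    """Count particles per fixed grid cell."""
    return Counter((int(x // cell_size), int(y // cell_size)) for x, y in particles)

if __name__ == "__main__":
    def uniform_wind(x, y):
        return 3.0, 1.0                           # m/s, constant everywhere
    puff = [(0.0, float(j)) for j in range(10)]   # 10 particles released
    for _ in range(100):                          # 100 steps of 60 s
        puff = advect(puff, uniform_wind, dt=60.0)
    print(grid_counts(puff, cell_size=5000.0))
```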



"To verify a model, you should be looking into verifying the choice of some of the parameters in the model," Egan said. An example would be determining whether the plume spread really does follow that assumed in the Gaussian model. But even studies of the sort that Egan calls insufficient are scarce. And often, according to Egan, Hilst, and others, there is so much uncertainty in defining the meteorological conditions that little difference in performance can be seen in comparisons of Gaussian and complex models. A fair comparison is also difficult for the reasons outlined by Roth: Performance of a given model depends on what you are trying to predict; different models are appropriate to different applications.

One study which did apply two different models to the same data, and which did reveal differences in performance, was based on measurements of radioactive krypton-85 in the vicinity of the Savannah River nuclear plant. Annual averages were predicted to within 20% by a complex trajectory model, but only to within a factor of 4 by a Gaussian model. But other studies, applying the same data to predictions of seasonal and weekly averages, were far less conclusive.

EPRI is currently undertaking a large model validation study, which it hopes will come up with some definitive answers as to the reliability, accuracy, and range of applicability of the various models. A second goal of the project is the development of improved models. In the EPRI study, a network of 30 full-time monitoring stations, which can be supplemented by a second network of 250 stations for intensive studies, along with airborne remote-sensing equipment and in-plant monitors, supplies a "massive" data base on emissions and meteorology for a single power-plant site in Springfield, Ill. The EPA is reportedly considering a similar study over complex terrain.
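Comparisons like the krypton-85 study are often summarized by asking what fraction of predictions fall within a given factor of the observations. The sketch below computes that statistic for made-up sample pairs; it is illustrative only and not tied to any particular validation protocol.

```python
# Illustrative predicted-versus-observed comparison: the fraction of
# predictions falling within a factor of 2 and within a factor of 4 of
# the observations. The sample numbers are placeholders.

def fraction_within_factor(predicted, observed, factor):
    pairs = [(p, o) for p, o in zip(predicted, observed) if p > 0 and o > 0]
    hits = sum(1 for p, o in pairs if 1.0 / factor <= p / o <= factor)
    return hits / len(pairs)

if __name__ == "__main__":
    pred = [12.0, 40.0, 7.5, 90.0, 3.0]
    obs  = [10.0, 25.0, 30.0, 70.0, 2.0]
    for f in (2, 4):
        print(f"within a factor of {f}: {fraction_within_factor(pred, obs, f):.0%}")
```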

Form of air standards

But when it comes to judging the performance of a model with respect to its task in the "real world," that is, permit procedures, a very basic problem arises with regard to all models. "The part we have the most problems with is that most of the modeling we're trying to do relates to air quality standards," Turner said. Many of the standards are expressed in terms of concentrations which may be exceeded only once per year; thus the job of the model is to estimate the second-highest concentration that occurs in a year, an exceptionally difficult task to perform and to verify.

"It's the end of the statistical spectrum we're trying to address," said Turner. Egan, writing in the Journal of the Air Pollution Control Association, agreed: "The difficulty with attempting to predict reliably such extreme values with present modeling techniques makes modelers very uncomfortable with having to provide such estimates for critical decision-making purposes." Egan told ES&T that this opinion is shared by a number of members of a National Commission on Air Quality panel assigned to examine dispersion modeling. The panel's report is expected early this month.

One possibility that has been suggested is to change the standard to a more stringent one, but one that could be exceeded more often. This would bring the problem closer to the centerline of the statistical spectrum. The opportunity for such a modification will present itself this year with EPA's review of the ambient standards, though any change from the status quo is likely to be opposed by either industrial or environmental activist groups; and since a change such as this one will be hard to judge as either a "tightening" or a "loosening" of the standard, it could conceivably be opposed by both sides.
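The statistic the standards demand is easy to state even though it is hard to predict: from a year of hourly values, take the second-highest. A trivial sketch, using synthetic placeholder data:

```python
import random

# From a year of hourly concentrations, the quantity the standards care
# about is not the mean but the second-highest value. Synthetic data only.

def second_highest(values):
    """Return the second-highest value in a sequence of concentrations."""
    return sorted(values, reverse=True)[1]

if __name__ == "__main__":
    random.seed(0)
    hourly = [random.lognormvariate(2.0, 0.8) for _ in range(8760)]  # one year
    print(f"annual mean:    {sum(hourly) / len(hourly):6.1f}")
    print(f"second highest: {second_highest(hourly):6.1f}")
```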

New directions

For the immediate future, the most pressing need seems to be for new and better data, a conclusion which the NCAQ panel also reportedly reached. A number of researchers interviewed by ES&T also spoke of some pressing needs in new areas which deserve the attention of modelers. One is regional sulfate transport, a key factor in acid rain and visibility impairment. Systems Applications recently received a contract from the EPA for the development of visibility models, which the agency will need for setting and implementing visibility regulations. The agency is under a court order to issue its final regulations on visibility by November. Acid rain, though not subject to regulation at present, is an issue of increasing concern. It is a complicated problem involving multiple sources, long-range transport, and heterogeneous chemistry. Extending photochemical models to cover large regions, such as the greater New York City area, and to include nighttime conditions of vertical stratification and thermochemistry, so that multiple-day events can be modeled, is also receiving attention.

-Stephen Budiansky


Reports worth reading

Three recent reports worthy of any serious environmental professional's reading time are the EPA publications "Environmental Outlook 1980" and "Research Outlook 1980," and the CEQ report of the "Interagency Task Force on Environmental Data and Monitoring." These reports are better than most federal agency reports in that each projects what will happen or is planned in the upcoming years.

"Environmental Outlook 1980," the first major report prepared by the EPA Strategic Analysis Group of the EPA Office of Research and Development (ORD), is a good source of information for those interested in following the agency's long-range research and development program designed to support its regulatory function. Irvin L. White, special assistant to Stephen J. Gage, EPA assistant administrator for research and development, says that the more than 500-page document (EPA 600/8-80-003) provides agency planners with information on future environmental trends and problems. At press time the report was still in draft form. The final version, which will incorporate comments and suggestions received from reviewers, will be available later this year, tentatively sometime in June.

Although this document is in fact the fourth in a series of annual reports, it is the first to provide more than quantitative trend information on selected air, water, and solid waste pollutants. This 1980 report was produced by an interdisciplinary team from the EPA Strategic Analysis Group and four contractors: The MITRE Corp./Metrek Division, CONSAD Research Corp., International Research & Technology Corp., and Urban Systems Research and Engineering, Inc. In the foreword, Dr. Gage welcomes comments and suggestions on its Strategic Analysis programs.

The second report, "Research Outlook 1980," is a congressionally mandated report that spells out the agency's environmental R&D plans for the next five years. It also details a number of significant R&D planning and management changes. Among them is the establishment of a joint planning system, a research committee system oriented to the users of research information. Although this 224-page document (EPA 600/9-80-006, February 1980) is the fifth in a series of EPA annual reports mandated by Congress, the scope and content of this year's report has continually broadened and now includes a focus on future research and emerging problems. For example, the research plans for some 10 interrelated programs are discussed. In separate chapters the topics are air, water quality, industrial and municipal wastewaters, drinking water, solid and hazardous wastes, pesticides, nonionizing radiation, noise, and energy. Additionally, there are chapters on research options and anticipating environmental problems. Elaine Fitzback of EPA, 1980 Research Outlook coordinator, indicated that this report focuses on environmental concerns, approaches to abate pollution, and the research needed to support these approaches. Dr. Gage also invites comments on this research program.

Three

A first report of the "Interagency Task Force on Environmental Data and Monitoring" is the third report worthy of reading. Called for in the President's environmental message of May 23, 1977, a message that called attention to the deficiencies in the nation's environmental data and monitoring programs, a task force comprising some 200 federal participants from nearly 20 agencies was organized on Oct. 5, 1977. The task force's 40-page report, which presents only conclusions (the technical support documents, essentially working papers, will be available from the NTIS this summer), examined the efforts of the government in the monitoring area and made recommendations for ways to improve them.

Not surprisingly, the two main recommendations relate to the issues of coordination and quality assurance. Coordination is at the heart of improving environmental data and monitoring programs. One way to improve coordination is to get the federal producers of data together with the nonfederal users of such information. The workshop of federal producers and nonfederal users of monitoring data is tentatively set for early summer in Washington, D.C. The task force felt very strongly about the quality assurance issue and decided that it was best dealt with at the individual agency level. The EPA established its QA policy last May, and the air and water QA programs of the agency were detailed in an ES&T "Special Report" (ES&T, November 1979, p 1356).

Another two recommendations related to data systems and ways to improve budget procedures. The Task Force also recommended federal agency support of international global monitoring data and monitoring activities such as those of UNEP and OECD. These recommendations are not binding on any of the nearly 20 federal agencies; they go to the President, who called for the creation of the task force in the first place.

Douglas Buffington of the CEQ staff indicated the need for a statistical reporting system, another Task Force recommendation. At this late date, there is no place in the federal government where statistics on the various environmental thrusts that the government legislatively made in the decade of the '70s, the environmental decade, can be evaluated. The result is that the success or failure of environmental policies simply cannot be assessed.

Nevertheless, Buffington pointed out that perhaps the best information on statistics is found in yet another publication, "Environmental Statistics 1978." With more than 200 tables in some 12 chapters on topics such as industrial production, hazardous substances, water quality, air quality, and others, this publication contains data which provide trends in the environment. This information will be updated in biennial reports. But what this publication is not is also important to note. It is not, for example, comprehensive coverage, nor do the tables include sufficient detail for regional and local analysis. Also, no statistics on expenditures of time and money by private and government institutions have been included.

-Stan Miller


Capital formation

Do environmental regulations inhibit it, enhance it, or have any effect at all? Here are several views from government, industry, and the academic community.

"Environmental laws and regulations inhibit the formation of private capital" is a statement heard in many quarters. Is that indeed the case? If so, are the inhibitory effects material or marginal? Spirited debate has raged around these and related questions since environmental regulations became a significant factor in doing business. Answers to these questions may vary in proportion to how many knowledgeable people are asked them.

One thing is clear, however: when pollution control (pc) equipment must be installed and operated to comply with environmental laws and regulations, more capital investment is needed to achieve a given level of productivity than would have been needed had these laws and regulations not been "on the books." What is not certain is the amount of extra capital needed. For instance, a "software" firm preparing and marketing computer programs for financial institutions might not need to consider environmental compliance when it determines its capital requirements. On the other hand, in projecting capital needs, a steel, chemical, paper, or power company cannot fail to take pollution abatement funding into consideration.

In discussions with various representatives of the federal establishment, industry, and academic institutions concerning effects of environmental laws and "regs" on capital formation, ES&T found solid agreement on at least one point: Getting a "handle" on these effects is an extremely difficult task. These representatives also concurred that not much work has been done, to date, to define qualitatively and quantitatively the extent of any capital-inhibiting role of environmental laws and regulations. But the effort has not been neglected, by any means.

Two peaks

To comply with pollution abatement rules dealing with air, water, solid waste, radiation, land reclamation, noise, and hazardous/toxic substances as of 1977, industry and other entities had to assemble $8.9 billion ($12.9 billion for 1978, in 1978 dollars). The $8.9 billion, in 1977 dollars, covered capital costs, including depreciation and interest. These dollars represent amounts beyond what would have been needed had those rules not been in force or soon forthcoming, and are labelled "incremental costs." Operation/maintenance (O/M) costs for pollution abatement systems were $10.4 billion, giving total annual costs of $19.3 billion, as estimated by the Council on Environmental Quality (CEQ) and EPA. In 1986, capital, O/M, and total annual costs may be $25.7 billion, $26.6 billion, and $52.4 billion, respectively, in 1977 dollars (see charts for increased 1978 estimates in 1978 dollars). The cumulative costs over 1977-86, according to CEQ and EPA, would be $172.5 billion (capital), $188.8 billion (O/M), and $361.3 billion (total annual). These totals may have to be revised upward somewhat for land reclamation and toxic substances (TSCA) compliance.

Excluding mobile sources, private sector pc investments were estimated by CEQ at 4.7% of total business investment in plants and equipment in 1978. The council expects this percentage to fall, then rise to a second peak early this decade, as compliance with "best available" under the Clean Water Act becomes necessary. This says nothing of additional increments for TSCA and Resource Conservation and Recovery Act (RCRA) compliance and other impending laws and "regs."

Few "nervous Nellies"

To try to get a better "handle" on the economic effects, including capital formation effects, of environmental "regs," EPA now has a policy of conducting an overall study every two years, Frans J. Kok, director of the agency's Economic Analysis Division, told ES&T. In his words, the environmental regulation-capital formation connection "is not well researched." However, he described a 1976 Chase Econometrics study which estimated that for every dollar invested in pc, 40¢ less is invested in other plant assets. A Data Resources, Inc. (DRI, Boston, Mass.) study puts this figure at 33¢. According to Kok, a Council on Wage and Price Stability study, in this case of the paper industry, "found no evidence of capital 'displacement' ascribable to investment in pc." Edward Denison, formerly with the Brookings Institution and now with the U.S. Department of Commerce (DOC), was quoted by CEQ as saying that in percentage terms, 6% of otherwise "productive" capital is "displaced."

Kok acknowledged that there might be "short-run" capital formation problems and additional costs of doing business; but he said that since these costs are probably passed on to the consumer over the long term anyhow, "long-run returns on investment won't change. In any case, we are talking fractions of a percent." With respect to regulations and their development, Kok added, "Sure, EPA has made some mistakes; but in the large majority of cases, the agency did not come out with regulations that are way out of line."

"No financial return"

The EPA/CEQ view, then, is essentially that environmental laws and regulations have a marginal or, at times, a moderately adverse effect on capital formation over the near term. This view would not coincide with that of David Roderick, chairman of the board of U.S. Steel Corp. (Pittsburgh, Pa.). He told a Senate subcommittee that at his company alone, capital spending for pc equipment, now and for the next few years, will be about 30% of total spending on facilities. "This severely restricts the funds available for other projects. And there is no financial return from these environmental control expenditures," Roderick told the subcommittee. He also pointed out that "except for a couple of years during the 1970s, iron/steel industry profit rates have been at, or near, the bottom of all manufacturing industries."

To that, Roderick explained, add the fact that tax depreciation is limited to historical cost, so the industry's capital recovery via retained earnings becomes "grossly inadequate" during inflationary periods (such as the present time). There is not enough capital to replace plant and equipment as it wears out. Thus, environmental control needs strain the capital formation situation even more severely in iron/steel, a very capital-intensive industry in any case, according to Roderick.

A new plant

The view of a capital shortage in the steel industry was also given to ES&T by Philip Masciantonio, director of environmental control for U.S. Steel, as he described the problems of its proposed Conneaut (Ohio) plant. Estimates are that the plant will be able to reduce labor and energy costs by 30-40% over present plants, put out a product that can compete with those of foreign steel makers, and have good transportation logistics, he said. Capital funding must cover not only the plant, but also the pc hardware.

There are large outlays to be made before the ground is broken, Masciantonio pointed out. For example, the environmental impact statement (EIS), finalized last May, was a 17-volume document, two years in the making. A Business Roundtable spokesman told ES&T that this type of document alone, for which a firm must provide data even though the federal agency actually writes it, could surely cost that firm several million dollars. Then there are the costs of applying for all the necessary air, water, RCRA, and other pertinent permits. And since the EIS apparently prompted a lawsuit by several groups, including the Sierra Club, there may be litigation costs.


Frank J. Kok of EPA “it’s not well researched”

All of these EIS, permit, and litigation factors cause further delays in getting plant construction and start-up under way. Because of such delays, inflation also raises costs across the board; so what capital has been formed is further eroded.

An unusual case?

Nevertheless, the steel industry may be "an unusual case," or "a worst case," according to the Business Roundtable spokesman. The industry is also buffeted by competition from foreign technology, and by government subsidies and lower labor costs for many non-U.S. companies. Moreover, there are other stresses not directly related to environmental considerations, the spokesman said. By way of comparison, he mentioned the paper industry, explaining that it "may not have anywhere near as severe capital-formation problems as the steel industry has."


"Troubles in that industry are more in the way of red tape and permit delays, which, of course, erode capital through the added costs of inflation," he said.

The problem of permit delays may have been dramatized in the chemical industry by Dow's decision not to build a plant in California in 1978. According to Myron Foveaux of the Chemical Manufacturers Association (CMA, Washington, D.C.), Dow needed three dozen permits and, after three years, had only four at most. So Dow decided to "throw in the towel." Continued delays would have sent capital requirements for the plant "out of sight."

A new-product "damper"

For the chemical industry, as for other industries and companies, one way to form capital is through healthy retained earnings. These earnings, in turn, are derived from profits. CMA's Foveaux pointed out that in the chemical industry, profits are "boosted" through new products; these profits and subsequent retained earnings form needed capital for the research and development (R&D) necessary to keep the new products coming. But if the capital increasingly has to go to regulation compliance, R&D is inhibited. The company must then rely on older products that become "mature" and much less profitable. That situation, in turn, leads to less retained earnings and, thus, less capital available for R&D, and a "damper" on new products. According to CMA figures, 1979 spending for all plant/equipment was a bit above $8.1 billion (estimated).

U.S. Steel's Masciantonio "a 17-volume document"

Of that amount, 7.2% was spent on pollution abatement. This is a capital outlay; one must add operating and other costs that could reduce retained earnings for future capital formation. True, these added costs could be passed on to customers, but the products then might risk becoming uncompetitive in the marketplace.

The environmental/capital-formation quandary would affect not only the iron/steel and chemical industries. Probably every manufacturing industry, as well as metal-plating shops and numerous other businesses, would be concerned. Their competitive position, if not their viability, is jeopardized, especially when the business in question is small.

A punitive mood

Whether one is concerned with a large or a small business, it is vital to address the capital formation question, said Sidney Galler, deputy assistant secretary of commerce for environmental affairs (now retired). He told ES&T that to do so, the process of regulatory development should be reexamined to ensure that resulting regulations allow enough options and flexibility to enable optimum solutions, the most cost-effective environmental quality response, to be achieved.

Perhaps a part of the problem is found in the legislative and regulatory history of the period 1965-73 (recall that the petroleum crisis hit in the latter year). For example, Galler observed that in preparation of the 1970 Clean Air Act and the 1972 water law (PL 92-500), drafted before the oil crisis, there was brought to bear a sizeable element of a punitive "put-the-screws-to-industry" philosophy. This punitive rigidity inhibits the "fine tuning" necessary to harmonize environmental and energy needs and to devise and install more cost-effective cleanup technologies, Galler said.

He expressed concern that "not enough is being done to reappraise laws and regulations in the light of 10 years of experience, so that action can be taken to insure cost-effectiveness." An example Galler gave involves "laws and regulations forcing end-of-pipe pollution cleanup that is not only not cost-effective, but also energy-inefficient." More flexibility, perhaps, could offer incentives for the development of low- or no-waste technology or other streamlined environmental options, which Galler said present laws and regulations largely foreclose. An essential result of more cost-effective pollution control would be, of course, more retained earnings, improved capital formation, "and, most important, lower costs and prices for the taxpayer and consumer," Galler said.

Ripple effects

The capital outlays and operating costs for pc systems, and other such direct effects that impinge on a firm's financial position, are primary effects, according to economist Murray Weidenbaum of the American Enterprise Institute for Policy Research. These effects on capital formation are the "most visible," and can be reasonably estimated numerically, he said.

But then there are secondary effects of environmental regulations, and these become much harder to evaluate. Such effects might comprise paperwork, such as record keeping and forms to be submitted to cognizant government agencies, as well as recruitment and training costs for environmental control personnel. Moreover, environmental matters can involve litigation, as well as more mundane legal activity. All of that must be financed, Weidenbaum observed.

A tertiary effect is very hard to "cost out," but may be the most powerful. That is the cost of R&D necessary to bring about regulatory compliance. This type is "defensive" R&D that does not support innovation, Weidenbaum said.

Benefits

The capital funds allocated to compliance with environmental regulations: do they buy benefits? Certainly there appears to have been improvement in air quality since 1970. For instance, a University of Wyoming study (EPA-600/5-79-001e, February 1979) estimated that mortality reduction (e.g., lowering of "excess" deaths from pneumonia-related diseases) and morbidity reduction (increased labor productivity) benefits would total $40.8 billion for a 60% reduction in ambient particles from 1970 levels. For 1977 alone, the total was put at $8.16 billion. Of the $8.16 billion sum for 1977, $7.28 billion worth of benefits accrued from lowered morbidity, according to the study. No doubt, there are analogous benefits of better water quality achieved.

Certainly all of the foregoing represents great strides toward necessary, desirable, and praiseworthy social goals. But, to put it bluntly, a capital planner or potential investor may be more concerned with how these social benefits show up numerically on ledgers, journals, income/expense statements, and balance sheets. One answer is that perhaps technologies forced by regulations would lead (and have led) to new ways of saving energy, increasing efficiency, reducing or eliminating wastes, and recovering marketable materials, thereby cutting costs. EPA offers several examples of beneficial effects of environmental investment that some companies do enjoy.

Nevertheless, consideration of tangible and intangible benefits, be they social or accruing directly to the regulated firm, is a whole area of contention in its own right. It needs more precise definition. Indeed, perhaps environmental "reg" effects on capital formation might not be seen in as negative a light in many quarters as has been the case up to now, if we can get a better "handle" on what the funds diverted from traditional business applications buy for the regulated firm and for society in general. Also, as U.S. Steel's Masciantonio reminded ES&T, the concept of technological changes toward low- and no-waste systems, which are most helpful when they can be brought about, probably cannot apply universally.

Less innovation means less goods and services; lower productivity; less sales, profits, and retained earnings; and, consequently, deteriorated capital formation from retained earnings. Moreover, poorer retained earnings prospects tend to discourage potential investors that could be sources of fresh debt or equity capital.

A new risk factor

Another discouraging factor for investors of capital is a new dimension of risk: "government and political risk, in addition to traditional business risk," Weidenbaum pointed out. Such government risk could entail instability of regulations, which means that even after a plant or process is started up, with pc systems in place, and is complying with environmental "regs," the rules could be tightened and could require costly retrofitting. However, an EPA spokesman told ES&T that the agency does not know of any examples of the need for such retrofitting. Weidenbaum said that these shifting "regs" are a fearful uncertainty, one which investors may well shy away from, turning instead toward companies not needing all sorts of regulatory permits. But there is a "plus side," too.

Needed: $7 billion/y

According to a recent report by the American Iron and Steel Institute (AISI), the industry will need $7 billion/y, over the next several years, to get back into a strong, world-competitive position. To achieve that goal and comply with environmental regulations, the steel industry will have to allocate $4 billion in capital expenditures plus $600 million/y to operate and maintain pollution control systems. To meet zero pollution discharge requirements, the industry would have to come up with an additional $3 billion for capital costs plus $700 million/y for operation/maintenance, as part of the total package.

However, as AISI points out, "not all of steel's capital headaches are caused by environmental regulations." There are rafts of other government regulations as well. Also, there have been problems with foreign steel makers "dumping" their products, that is, selling below costs in their own countries, in the U.S. Moreover, employee productivity has been declining, and plant modernization and replacement has been "hamstrung" for many reasons other than environment, such as adverse tax structures, AISI says.


Murray Weidenbaum “a series of effects”

Environmental regulations have "generated markets for people, consulting, equipment, and systems," and, in that respect, have a capital-forming effect.

"No shortage of capital"

A divergent view: "Capital formation can actually be stimulated by environmental laws and regulations," Lester Lave, professor of economics at Carnegie-Mellon University (CMU, Pittsburgh, Pa.) and Senior Fellow, Brookings Institution, told ES&T. For example, the need to install and operate cleanup systems, and to build new plants to replace ones in which environmental compliance retrofitting is uneconomical, would in fact force formation of capital to meet those needs, he explained. "If a venture has reasonable profit potential, you will find the dollars to get it going," Lave said. "In that sense, there really is no capital shortage. Besides, the cost of environmental 'regs' comes out relatively small [primary capital formation effects, in Weidenbaum's terms]; they'd normally add no more than 2% to costs. Even in the steel industry, the regulations would cost about $5-8/t, or 2% at most.

"The kicker is not the regulations per se, but their administration and stability. For instance, take a power plant that has to convert from oil or gas to coal. It seems 'OK,' because coal is a cheaper fuel, anyway. The new plant is planned with a 30-year life. But if the plant keeps getting hit with new and tougher 'regs' and must keep retrofitting, then obviously, the cost advantages of coal are quickly lost. Another risk is that EPA or a state may put out such tough new rules that the plant won't be able to comply, so it might have to shut down," Lave explained.

"What I suggest is this," he said. "Let's insist that a plant must meet, over its lifetime, the 'regs' that are in force when the plant starts up as a new source. If, later on, tougher regulations are needed or desired, that is like changing the scope of a contract; then let society, through the government, pay, since a social benefit, in the form of additional environmental cleanup, would be reaped.

"But how are you going to develop the technology for better environmental cleanup, as well as other innovation?" Lave asked. "Foster capital investment! One important way to foster this investment, besides showing reasonable profit prospects, is to get rid of regulatory uncertainty."

The SEC connection
There are several ways to form capital. One, of course, consists of accumulating retained earnings. Others consist of incurring debt or obtaining equity capital: taking out loans, selling bonds of various types, or selling common or preferred stock, of which there are also various types and classes. Raising capital through stocks and bonds, especially when such activity involves approaching the public, will often require registering the securities with the Securities and Exchange Commission (SEC), a federal agency. This registration is made pursuant to provisions of the Securities Act of 1933 and the Securities Exchange Act of 1934. The company registering securities (the registrant) must make a "full and fair" disclosure of its activities, financial situation, and planned use of securities sale proceeds, among other things. Thus, for many types of securities registration for public sale, a registrant must disclose material effects that federal, state, and local environmental regulations may have on its capital expenditures, earnings, and competitive position. In most cases, the registrant must also disclose environmental litigation it is involved in, as well as environmental expenditures for current and future fiscal years that may be deemed material. All of these matters must be disclosed regardless of whether the registrant believes that those planning to invest in the registrant's securities may decide not to do so, based on the disclosures. For additional reading on this subject, see SEC Releases 33-6130 (Sept. 27, 1979) and 34-16224. They are also in the Code of Federal Regulations, 17 CFR Parts 231 and 241.


CMU Professor Lave: "actually, there's enhancement"

While regulatory uncertainty is an undesirable situation from the business point of view, and a seeming catastrophe to small business, perhaps the picture is not entirely bleak. That is the view of Dwight Baumann, the executive director of CMU's Center for Entrepreneurial Development (CED).

Some are not fazed
Baumann acknowledged that regulatory uncertainty is not a factor most conducive to capital formation. Yet, he said that a newly started business "might not be fazed, for example, by an additional tightening of, say, SO2 'regs' by 5%. This lack of trepidation may be psychological, in part. But remember that entrepreneurs worth their salt, and who can 'get off the ground,' become used to overcoming many imponderables.

"Remember, also, that most entrepreneurs are not hidebound and wedded to established thinking," Baumann reminded ES&T. "As a class of people, they tend to be much more creative, and they don't have big technological investments or other large capital investments to protect." According to Baumann, such creative people can often find ways of forming capital, despite regulations and their uncertainties, that do not occur to others. But it does take a good deal of imagination and intestinal fortitude!

Perhaps future studies and experience, covering both big and small business, will lead to a clearer perception of this whole perplexing question of the extent to which private sector capital formation is adversely or otherwise affected by environmental regulations. Perhaps ways will be found to modify the regulatory process in such a manner that the portion of capital allocated to compliance can be spent in the most socially and economically cost-effective way. -Julian Josephson



Process measurements for environmental assessment
In a phased approach, all emissions and effluents from industrial processes are identified by analytical chemistry and bioassay techniques

The concept of environmental assessment developed by the Environmental Protection Agency (EPA) in response to concern about waste discharges from industrial and energy processes goes far beyond the usual lists of a few selected pollutants. The answers to two questions are sought in an environmental assessment: To what extent does a particular source or industry cause pollution damage to the environment? What can be done to minimize or eliminate the problem?

The latest analytical and bioassay methodology for environmental assessment was the subject of the Second Symposium on Process Measurements for Environmental Assessment, held Feb. 25-27 in Atlanta. The symposium was sponsored by EPA's Industrial Environmental Research Laboratory in Research Triangle Park, N.C. James A. Dorsey, chief of the Process Measurements Branch at the laboratory, served as general chairman of the symposium, as he did for the first symposium held in 1978. Symposium coordinators were Philip L. Levins and Judith C. Harris, both of Arthur D. Little, Inc., Acorn Park, Cambridge, Mass. The symposium consisted of three invited papers and 19 contributed presentations; a poster session featured summaries of 12 additional projects related to environmental assessment.

Environmental assessments are prerequisites to potential future regulatory approaches and are aimed at identifying problems before they occur. One EPA publication points out that environmental assessment allows resolution of problems on other than a one-pollutant-at-a-time basis, "which is fraught with endless studies, only partially effective results, and high costs at all levels of implementation." The components of an assessment include:

a review of the technological process under study and its potential for national and regional utilization; environmental data acquisition, involving biological, chemical, and physical tests on process sources, effluents, and pollutants; impact analysis relating pollutant levels to projected environmental effects; and control technology assessment, assessing the effectiveness of various control options.

Because of the complexity of industrial processes, the direct approach of complete chemical and biological characterization of all process streams would be prohibitively expensive. This is why the Industrial Environmental Research Laboratory has developed a phased approach to environmental assessment. In the phased approach, all streams are first surveyed (Level 1) using simplified sampling and analytical methods, permitting the ranking of streams on a priority basis. Level 1 is a screening study which indicates whether or not a potential environmental problem exists in connection with any of the process streams. The next step in the phased approach is a Level 2 study, intended to identify the specific chemical species responsible for the environmental hazard in streams ranked most hazardous in the Level 1 survey.

James A. Dorsey: general symposium chairman

The next step, Level 3, involves continuous monitoring of key indicator materials to evaluate long-term process variability. In addition to chemical characterization, the phased approach also specifies biological testing to directly determine toxic effects of emissions on biological test organisms.

The analytical protocol developed for environmental source assessments adheres to the same level concept. At each phase of the assessment, the specificity and sophistication of analytical techniques are tailored to the information required in that phase, thus avoiding the utilization of expensive analytical manpower and methodology on streams of unknown pollution potential.

The Level 1 approach is designed to detect environmental problems at minimum cost. It shows, within broad limits, the absence or presence and approximate concentrations of elements, anions, and classes of organic compounds in gaseous, liquid, and solid process effluents. Particulate matter in effluent gases is analyzed separately for chemical composition, particle size distribution, and other physical parameters. Selective biotesting is also performed on samples to indicate possible human health and ecological impacts. Among the analytical techniques used in Level 1 testing are spark source mass spectrometry (SSMS), wet chemical methods for anions, gas chromatography (GC), liquid chromatography (LC), infrared spectrometry (IR), and low resolution mass spectrometry (LRMS). Level 1 biological tests include rodent acute toxicity, microbial mutagenesis, cytotoxicity, fish acute toxicity, and algal bioassay. When Level 1 tests indicate potentially hazardous emissions, the offending streams are priority-ranked and investigated by Level 2 efforts.
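The screening logic just described (survey every stream cheaply at Level 1, then spend Level 2 effort only on the worst offenders) amounts to a simple weighted ranking. A minimal sketch follows; the weights, scores, and stream names are invented for illustration and are not part of the EPA protocol.

```python
# Illustrative sketch of Level 1 priority ranking (weights and data are hypothetical,
# not the EPA protocol): each stream gets rough screening scores, and the streams
# with the highest combined scores are flagged for detailed Level 2 study.

WEIGHTS = {"chemical": 1.0, "bioassay": 2.0}   # assumed relative importance

streams = {
    # stream name: (chemical screening score, bioassay screening score), 0-10 scale
    "boiler flue gas": (6, 3),
    "quench water":    (4, 8),
    "ash sluice":      (2, 1),
}

def level1_rank(streams):
    """Rank process streams by a weighted Level 1 screening score."""
    score = lambda chem, bio: WEIGHTS["chemical"] * chem + WEIGHTS["bioassay"] * bio
    return sorted(streams, key=lambda name: score(*streams[name]), reverse=True)

for name in level1_rank(streams):
    print(name)   # highest-priority candidates for Level 2 study come first
```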


Level 2 analyses are less susceptible than Level 1 to the development of specific protocols. Here it is necessary to allow for flexibility and leave method development to the discretion and skill of the analytical chemist. Analytical procedures for Level 3 are oriented toward determining time variation in the concentration of key indicator substances. This is an area where continuous monitors for selected pollutants can be effectively incorporated into the program.

The data acquired in the testing stage are used to determine the extent of hazard associated with particular waste streams. An impact analysis is then made by comparing these hazards to relevant environmental objectives. Finally, an environmental assessment will evaluate pollution control technology or alternative processes capable of minimizing the environmental damage.

The multidisciplinary combination of biological and chemical tests utilized in environmental assessment constitutes what Michael R. Guerin of Oak Ridge National Laboratory (ORNL) referred to in his invited lecture as an "integrated" approach to chemical-biological analysis. The papers presented concentrated, for the most part, on the application of chemical analysis and biological testing to Level 1 and Level 2 measurements. Although the concept of environmental assessment is broadly applicable to any industrial process, speakers at the symposium were primarily interested in the application of these tests to processes connected with the manufacture and combustion of natural and synthetic fuels.

Level 1-analytical chemistry
As mentioned before, two of the most frequently used analytical techniques are IR and LRMS, and W. F. Gutknecht and A. Gaskill, Jr. addressed problems in IR and LRMS spectral interpretation in their presentation at the symposium. These two investigators from Research Triangle Institute found wide variability in approaches to measurement and interpretation of spectra sent to a series of EPA contractors. The contractors also obtained their own spectra for interpretation from prepared organic test mixtures sent to them in the study. The basic finding of the interlaboratory comparison was that different analysts get the same answers for the most part. In general, the different labs agreed quite well with each other as far as qualitatively identifying the sample components. Having said that, the presentation then concentrated on the problems that did arise in the spectral interpretations.

Gutknecht reported variations in signal location of ±5-10 cm⁻¹ from analyst to analyst in the IR study. He noted that analysts often used too much or too little sample by incorrectly loading KBr pellets and salt plates, which frequently resulted in inappropriate signal intensities. But most of the errors in spectral interpretation were caused by incorrect signal interpretation. Errors in assignment were common, partly because different organic functional groups give rise to signals at very nearly the same wavenumber. For example, carboxylic acids, aldehydes, esters, and ketones all give rise to signals around 1700 cm⁻¹. To make matters worse, the researchers found that wavenumber ranges for particular structures are reported differently in different reference tables.

Philip L. Levins: symposium coordinator

Broad signals caused by intermolecular interactions, and complex samples with overlapping bands, can also make life difficult for the spectroscopist. But the majority of errors noted in IR spectral interpretation were errors of omission: not making optimal use of available data. In one case an absorption at 1240 cm⁻¹ was assigned as the C-O stretch of an ester. However, there was no signal at 1735 cm⁻¹ in the spectrum, where the C=O stretching band should have been. The analyst failed to call his original interpretation into question, overlooking the absence of the requisite complementary data. In general, Gutknecht recommended that the IR spectroscopist should interpret all signals of significant amplitude, search out and use complementary signals that support or refute other interpretations, and make alternative interpretations when such are reasonable.
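Gutknecht's cross-checking advice can be thought of as a lookup of complementary bands: a tentative assignment is accepted only if the band that should accompany it is also present. The sketch below is a minimal illustration of that idea; the band positions, tolerances, and peak list are hypothetical values chosen for the example, not a reference table.

```python
# Illustrative complementary-band check: before accepting an ester C-O assignment
# near 1240 cm^-1, verify that the C=O stretch expected near 1735 cm^-1 is also
# present. Band positions and tolerances below are invented for the example.

COMPLEMENTARY_BANDS = {
    # tentative assignment: (supporting band position in cm^-1, tolerance in cm^-1)
    "ester C-O stretch (~1240)": (1735, 15),
    "carboxylic acid C=O (~1710)": (2900, 400),   # broad O-H envelope
}

def assignment_supported(assignment, observed_peaks):
    """Return True if the complementary band expected for an assignment is present."""
    expected, tol = COMPLEMENTARY_BANDS[assignment]
    return any(abs(peak - expected) <= tol for peak in observed_peaks)

peaks = [2960, 1462, 1240, 1100]   # hypothetical observed peak positions, cm^-1
print(assignment_supported("ester C-O stretch (~1240)", peaks))   # False: no band near 1735
```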

Gaskill reported on LRMS interpretation errors. Among the most common were failing to find molecular ions (this factor alone accounted for half the errors), reporting molecular ions as fragment ions and vice versa, and incorrectly assigning ions to homologous series. He also noted differences in the approach to interpretation among different analysts. Most looked for molecular peaks and fragment ions. Several sought out the eight most abundant peaks and then referred to the "Eight Peak Index." Several tried to account for all peaks above a certain intensity and also above a certain m/e value, generally 30. Gaskill made a number of specific suggestions for successful spectral interpretation. He recommended using the "Eight Peak Index" for interpretation and verification, and setting intensity and m/e criteria to simplify the spectra. To combine the two spectral techniques, he recommended independent analysis of IR and LRMS data to avoid bias. Results should then be compared.

R. P. Baldwin of the University of Louisville reported on the development of a new electrochemical method for Level-1-type measurement of dissolved aldehydes and ketones. Electrochemistry's applicability to organic analysis in complex matrices is severely limited by its lack of specificity and relatively poor resolution. Baldwin's technique addresses these limitations by using a chemically modified electrode (CME) to selectively preconcentrate compounds containing particular functional groups. The CME is an electrode to which chemical groups are attached by covalent chemical reaction or irreversible chemisorption. Baldwin attached amine functionalities to a Pt electrode and preconcentrated aldehydes and ketones on the electrode surface through the formation of imine condensation products. The attached ketones and aldehydes were then determined by differential pulse scanning of the electrode potential. Selectivity of response for aldehydes and ketones is derived from the chemical selectivity of the preconcentration step and from the redox potential of the attached product in the analysis step. Baldwin stated, "Extension of this approach to CME systems appropriate for the selective analysis of other environmentally significant classes of organic compounds is easily envisioned." To date, however, Baldwin's technique has only been tested on one compound, ferrocene carboxaldehyde.

J. C. Harris, M. J. Cohen, and M. J. Hayes of Arthur D. Little, Inc. pointed out that the use of solid adsorbent resins could be an attractive alternative to the more commonly used solvent extraction techniques for the collection and recovery of organics from aqueous industrial solvents. Solvent extraction for trace organics necessitates the use of large sample volumes to reach the necessary detection limits. In addition, recovery of polar species by solvent extraction is frequently poor, and detection limits are generally at the ppm level or higher. Use of sorbents, on the other hand, means that smaller volumes of sample can be used, and substances can be determined at ppm to ppb levels. The mixed resin cartridge developed and reported on by Harris, Cohen, and Hayes contained a layer of XAD-2 for collection of nonpolar organics, followed by a layer of XE-347 for the polar organics. Recoveries were as good as for solvent extraction, and better for some polar compounds such as phenol. They also presented data showing that the efficiency of adsorption was highly flow-rate dependent, and that most literature references they have seen indicate that presently used flow rates may be too high.

Vapor-particle interaction
D. F. S. Natusch of Colorado State University gave a fascinating presentation on the interaction between the vapor phase and the particulate matter in stack emissions from combustion processes. The environmental impact of particulate material, such as coal fly ash, depends on the surface characteristics of the particle, its composition, its aerodynamic size, and its thermal history. Natusch reported that at temperatures inside the stack, organics such as polynuclear aromatic hydrocarbons (PAHs) will tend to stay in the vapor phase. Once the stack gases emerge from the stack and cool, however, the organics will adsorb to the particles. Particles were collected from inside the stack of a coal-fired power plant and from the plume 40 ft from the stack at 0 °C. The particles from both sources were then Soxhlet-extracted with benzene. When ultraviolet light was directed at the extract from the stack particles, fluorescence was negligible. The extract from the plume particles, on the other hand, fluoresced brightly from PAHs adsorbed in the cooler plume. Natusch pointed out that we must therefore be sure to at least collect both gases and particles when sampling power plant emissions. Otherwise, PAH concentration may be seriously underestimated.

A number of other presentations were concerned with Level 1 analytical methodology. I. Bodek and K. T. Menzies of Arthur D. Little, Inc. reported on the ion chromatographic analysis of organic acids in diesel exhaust and mine air. Low-molecular-weight carboxylic acids are among the highly water soluble compounds that are difficult to quantify by conventional procedures. The relatively new technique of ion chromatography has the potential for determining these acids in complex matrices, according to Bodek and Menzies. Ion chromatography is a combination of the techniques of ion exchange, liquid chromatography, and conductimetric detection, made feasible by use of a suppressor column to eliminate the native conductivity of the eluant. ES&T reported on ion chromatography in its July and October 1979 issues (p 804 and p 1214).

Judith C. Harris: symposium coordinator

On-line monitoring of toxic materials in sewage effluent at Lawrence Livermore Laboratory (LLL) in California was discussed by M. Auyong, J. L. Cate, Jr., and D. W. Rueppel. A representative fraction of the 350 000-gal/d total waste stream at LLL is passed through a detection assembly consisting of an X-ray fluorescence unit which detects high levels of metals, NaI(Tl) crystal detectors that scan the sewage for radiation excursions, and an industrial probe for pH monitoring. Currently, if radiation or pH standards are exceeded, an alarm is sent to a station where personnel can respond, and a sample of the sewage is automatically collected for more complete analytical evaluation.
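The alarm logic described for the LLL assembly is, in essence, a set of threshold comparisons on the monitored parameters. A minimal sketch follows, assuming hypothetical limits and sensor readings; it illustrates the idea only and is not the LLL software.

```python
# Illustrative sketch (not the LLL system) of the alarm logic described above:
# each monitored parameter is checked against its discharge standard, and an
# exceedance triggers an operator alert plus an automatic grab sample.
# Threshold values, keys, and readings here are hypothetical.

LIMITS = {"pH_min": 5.5, "pH_max": 9.5, "gross_gamma_cps": 200.0}

def check_effluent(reading):
    """Return a list of exceedances for one set of sensor readings."""
    alarms = []
    if not LIMITS["pH_min"] <= reading["pH"] <= LIMITS["pH_max"]:
        alarms.append("pH out of range")
    if reading["gross_gamma_cps"] > LIMITS["gross_gamma_cps"]:
        alarms.append("radiation excursion")
    return alarms

reading = {"pH": 10.1, "gross_gamma_cps": 45.0}
for alarm in check_effluent(reading):
    print("ALERT:", alarm, "- collecting sample for laboratory analysis")
```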

R. L. Barbour and R. J. Jakobsen of Battelle Columbus Laboratories reported on the versatility of Fourier transform infrared spectroscopy (FTIR) in the environmental assessment field. Of course, organic class information has long been the province of infrared spectroscopy, but Barbour and Jakobsen have been able to identify specific inorganic compounds as well. FTIR is primarily oriented toward Level 1 analyses, but better separations can be achieved by combining FTIR with high performance liquid chromatography or gas chromatography. This allows Level-2-type analyses to be made. ES&T reported on FTIR in the June 1977 issue on p 568.

Level 1-biological testing
In a highly industrialized society, it is essential that hazardous discharges be detected and controlled before they cause damage to the environment. In Japan, for example, a plastics factory discharging wastes containing mercury into Minamata Bay ultimately caused 46 human deaths and poisoned another 120 persons. Biological testing is a direct technique for rapid assessment of this kind of hazard.

D. G. Nichols and A. W. Kolber of Research Triangle Institute reported on the economic advantages of Level 1 biotesting as compared to other applicable techniques. The National Cancer Institute whole animal carcinogenesis test, for example, takes 3 1/2 years to execute and costs $250 000. In contrast, in vitro Level 1 biotesting takes 0.01-0.2 years and costs $6000. However, there are important precautions with any testing protocol, and Nichols and Kolber reminded attendees that mutagenicity testing results may be misleading unless a complex sample is fractionated into its crude constituents prior to testing. For example, in research performed on a coal gasification process in Wyoming, investigators found crude tar from the process negative in mutagenicity (0-3 revertants/µg). But four samples of tar bases separated from coal tar showed activities of 6-38 revertants/µg. Thus, the mutagenic activity of tar bases might have been missed had crude coal tar not been fractionated prior to testing. Since tars from the production of synfuels can vary widely in composition, tars containing higher proportions of tar bases tend to have more mutagenic activity than tests on some crude coal tar samples might indicate.

It is, of course, essential that the biological activity of a sample be unaffected by sample handling techniques to prevent inaccurate results. D. J. Brusick of Litton Bionetics addressed this important problem in his discussion on possible effects of collection methods and sample preparation on Level 1 health effects testing of complex mixtures.


Extraction of particles, for example, is generally accomplished by sonication or Soxhlet extraction with organic solvents. However, Brusick warned that such treatment may very well cause preferential release of toxic and mutagenic organic materials from the bound state. If these materials are not released from the particles under normal environmental or physiological conditions, they can skew the ranking scheme. Again, in concentrating substances in organic solution, the XAD-2 resins used may let inorganic substances pass through, effecting concentration of organics only. Some of the inorganics lost may be strong toxicants. In addition, concentration of the organic constituents in solution may introduce artifacts by altering the chemistry of the system and by enhancing chemical dynamics not encountered in dilute solutions. One precaution used at Litton Bionetics, Brusick reported, is the use of standardized sample collection and sample processing forms, on which a complete history of each bioassay sample and explicit instructions for further sample processing are recorded.

Photosynthetic algal bioassay
J. M. Giddings of ORNL reported on the development of a rapid algal bioassay for assessing the toxicity of coal-derived materials. Algal cultures or natural algal communities are exposed to the test material for 4 h. Photosynthesis is determined by the ¹⁴C-bicarbonate method during the final two hours of exposure and compared with controls for a measure of toxicity.
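Reducing the photosynthesis endpoint to a toxicity measure amounts to normalizing carbon uptake in exposed cultures against the controls. The short sketch below shows one way such a calculation might look; the uptake rates, dose labels, and 50% flag are invented for the example and are not Giddings' data.

```python
# Hypothetical illustration of reducing the photosynthesis endpoint to a toxicity
# measure: carbon-uptake rates in exposed cultures are normalized to controls.
# The counts, doses, and the 50% flag are invented, not Giddings' data.

def percent_inhibition(control_rate, treated_rate):
    """Photosynthesis inhibition relative to unexposed controls, in percent."""
    return 100.0 * (1.0 - treated_rate / control_rate)

control = 1850.0   # e.g., carbon fixed per hour in the control culture (arbitrary units)
treatments = {"0.1% WSF": 1710.0, "1% WSF": 1030.0, "10% WSF": 240.0}   # WSF = water soluble fraction

for dose, rate in treatments.items():
    inhib = percent_inhibition(control, rate)
    flag = " <- exceeds 50% inhibition" if inhib > 50 else ""
    print(f"{dose}: {inhib:.0f}% inhibition{flag}")
```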

Giddings reported that 4-h algal bioassays of the water soluble fractions of more than 20 natural oils and synthetic oils from coal liquefaction showed the coal liquefaction products to be considerably more toxic to algae than the petroleum products. Shale oils were intermediate in toxicity. Giddings found the organic bases in the samples to be the most toxic fraction, with acidic components such as phenols following close behind. He particularly recommended consideration of some of the advantages of his algal bioassay relative to the widely used 14-day algal growth test, often called the bottle test. The latter is being considered by the EPA and the ASTM as a standard method, but Giddings points out that his photosynthetic assay is faster and affords greater control over experimental variables. Conditions are simply not held constant in the bottle test: the algae undergo profound physiological changes during the 14 days of the test, and the solution's pH may also change radically during this period. "There's obviously a danger here that the algal assay bottle test, because it's been used for 10 or 15 years, has generated enough momentum to carry it into becoming the standard bioassay procedure," said Giddings. "I want to emphasize that there are points we ought to look into before we do that."

The quality and quantity of wastes in almost any industrial plant vary enormously. In a 1-year study of one plant, J. Cairns, Jr. and K. W. Thompson of Virginia Polytechnic Institute and State University found one waste line that had a 10⁴-fold variation in toxicity over time. The two investigators reported on their development of a toxicity testing system that provides a means of systematically varying the concentration of a test chemical or chemicals in a continuous-flow system. They pointed out that the relationship between the traditional bioassay and its applicability to real-world situations in which concentrations of toxicants are not constant deserves more serious consideration. Cairns questioned the wisdom of using the most sensitive species as a surrogate to protect the ecosystem in toxicological tests. "It may be that the system is much more resilient and more durable than we believe, in which case we need a system level response to use as a criterion rather than a single species response."

Space does not permit a description of all the bioassay research results presented at the conference. Conspicuous among the omissions was a presentation by G. L. Fisher, C. E. Chrisp, and F. D. Wilson, University of California, on the application of bioassays to the study of the biologically significant physical and chemical properties of coal fly ash.

Level 2 testing
Also discussed at the symposium were a number of techniques whose applications lie more in the domain of Level 2 testing because of their greater specificity and accuracy. One of these techniques has the dubious distinction of being called LESS, which stands for laser-excited Shpol'skii spectroscopy.

V. A. Fassel reported on this technique on behalf of himself and his colleagues, A. P. D'Silva and Y. Yang, from the U.S. Department of Energy and Iowa State University. Although PAHs are among the most carcinogenic compounds known, there has heretofore been no method that permits selective detection of PAHs, individually or as a group, without prior separation. This is because their fluorescence spectra are broadband and thus overlap. In the LESS technique, the PAH mixture is frozen into a hydrocarbon matrix at 15 K. The PAHs occupy strictly oriented positions in well-defined crystal sites within this matrix, and thus behave as if they were individual molecules. As soon as you have individual molecules in fields that are precisely duplicated, you get sharp line spectra. In LESS, the full width at half maximum of spectral peaks is about 10 cm⁻¹. Since hundreds of PAHs may be mixed together in a sample, selectivity is accomplished by laser excitation. One molecule at a time is excited by the tunable laser, and the individual spectra are observed. The Shpol'skii effect has been observed for a rather large number of solute-solvent combinations, so potentially a wide variety of compounds can be determined this way.

Other Level 2 studies included work by R. B. Gammage, T. Vo-Dinh, and P. R. Martinez of ORNL on the analysis of samples from fuel processes by synchronous fluorescence and phosphorescence at room temperature. They reported that the experimental procedures used are no more complicated than those required for Level 1 environmental assessment, and the 30% accuracy of the phosphorescence measurements is more than adequate for Level 2 applications. R. F. Maddalone of TRW, Inc. reported on the application of Level 2 inorganic sampling and analysis methods to particulate samples from oil- and coal-fired boilers. The Level 2 analysis scheme developed at TRW uses such techniques as scanning electron microscopy, atomic absorption spectrometry, inductively coupled plasma spectrometry, FTIR, and a number of sensitive surface techniques.

At the Second Symposium on Process Measurements for Environmental Assessment, the quality of the technical papers was high, as were the scientific goals of the attendees. It is certain these goals will be even closer to realization as further symposia on environmental assessment convene in the future. -Stuart Borman