Climate modeling: the key to predicting the effect of a buildup of atmospheric carbon dioxide. Incorporating some subtle feedback mechanisms and a more realistic representation of clouds and oceans are the steps needed to improve the quality and scope of the predictions.
This article is the second in a series on mathematical modeling of the atmosphere. The third and final article will examine models of halocarbon-induced changes in stratospheric ozone. An atmospheric model is a mathematical representation of key physical processes of the atmosphere; the major aim of a model is to predict the consequences of some human action. Modeling has been made at once possible and necessary by a better understanding of atmospheric processes: possible, by providing the needed tools; necessary, by providing the recognition that some of our actions may engender such drastic environmental change that we cannot afford to “do the experiment” to confirm them. The construction of a new pollutant source is one such action. A company cannot be expected to spend millions of dollars on a plant only to discover that its effect on air quality is unacceptable, nor can local residents be expected to tolerate the construction of a plant without assurances that it will not damage the air they breathe. A more subtle barrier to “doing the experiment” arises with global atmospheric problems. Here, the obstacle is not a company’s expenditures, but detecting the problem before it is too late. Although we know through observation that the carbon dioxide concentration of the atmosphere is increasing, largely as a result of fossil-fuel combustion, the calculated effect of this increase, a global warming, may not be observed until it is well
under way. “We may not be given a warning until the CO2 loading is such that an appreciable climate change is inevitable,” said a National Academy of Sciences report on the subject. Thus the problem becomes one of predicting the outcome of a “continuing experiment” before any of the experimental results have been obtained, and while it still can be halted.
A global problem
Adding enormously to the complexity of this task is the global scale of the problem. Identifying and determining the magnitude of sources and sinks, developing an adequate representation of topography, and accounting for physical processes on a local scale are difficult enough, as the controversy surrounding local dispersion models demonstrates (discussed
in the first article of this series, ES&T, April 1980, p 370). Pinning down sources and sinks is in fact almost impossible for the global CO2 problem. The uncertainties include not only economic and political factors that determine the world’s use of fossil fuels, but also a limited understanding of the role the oceans play in absorbing some of the CO2 injected into the atmosphere. The National Academy report summed up the situation: “Our limited knowledge of the basic features of the carbon cycle means that projections of future increases of CO2 in the atmosphere as a result of fossil-fuel emissions are uncertain. It has been customary to assume that 50% of the emissions will stay in the atmosphere. The possibility that the intermediate waters of the ocean, and maybe also the deep sea, are in more rapid contact with the atmosphere may reduce this figure to 40%, perhaps even to a somewhat smaller figure. On the other hand, a continuing reduction of world forests will further add to any increase due to fossil-fuel combustion. The ability of the oceans to serve as a sink for CO2 emissions to the atmosphere is reduced as the concentrations increase because of the chemical characteristics of the carbonate system of the sea.” The role of forests is hotly debated; while some argue that forest clearing has resulted in a release of some 200 billion tons of carbon to the atmosphere since the beginning of the 19th century, others argue that regrowth of cut forests has balanced this release with an equal uptake of carbon. The only things that can be said
with any confidence about sources and sinks are that some 5 billion tons of carbon in the form of CO2 are released to the atmosphere each year through the burning of fossil fuels; that that rate is growing by 4.3% each year; and that through the combined action of man-induced emissions (fossil-fuel burning and possibly forest clearing) and natural sinks, the CO2 content of the atmosphere has increased by 42 billion tons since 1958, and perhaps 90 billion tons since 1850. Modelers trying to predict the effect of increasing atmospheric CO2 on climate have, in an effort to cut this problem down to a manageable size, limited themselves to trying to answer the question: What will be the effect on climate if the atmospheric concentration of CO2 doubles (or triples or quadruples), regardless of how or when that occurs? But the problem they are left with is still nothing to sneeze at. The global scale of the problem still stretches to the limit our understanding of geophysical processes large and small, as well as our resources of computer time and capacity.
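The bookkeeping in the preceding paragraph invites a quick back-of-the-envelope projection. The sketch below uses the article’s figures (about 5 billion tons of carbon per year, growing 4.3% per year, with roughly half remaining airborne); the starting concentration of 335 ppm and the conversion of about 2.1 billion tons of carbon per ppm of CO2 are outside assumptions, not numbers from the text.

```python
# Back-of-the-envelope CO2 projection using the emissions figures cited above.
# The airborne fraction, starting concentration, and tons-per-ppm conversion
# are assumptions for illustration.

emissions = 5.0          # billion tons C per year (the article's 1980 figure)
growth = 0.043           # 4.3% growth per year
airborne_fraction = 0.5  # the article notes this could be closer to 40%
tons_per_ppm = 2.1       # assumed: ~2.1 billion tons C raises CO2 by 1 ppm
concentration = 335.0    # assumed starting concentration, ppm (circa 1980)

for year in range(1980, 2031):
    concentration += airborne_fraction * emissions / tons_per_ppm
    emissions *= 1.0 + growth
    if year % 10 == 0:
        print(f"{year}: about {concentration:.0f} ppm CO2")
```

Under these assumptions the exponential growth alone carries the atmosphere well past a 50% increase over its preindustrial level within a few decades, which is why the modelers frame their question in terms of a doubling.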
The radiative balance
The basis of the CO2-induced climate warming, however, is a simple and well-understood process. The earth’s temperature is determined by a balance between the radiative energy it emits to space (which is proportional to the fourth power of its temperature) and the radiative energy it receives from the sun. A buildup of atmospheric CO2 disrupts this balance: Although CO2 is transparent to incoming solar radiation, it absorbs outgoing terrestrial radiation. The carbon dioxide “blanket” then reemits this absorbed radiation, half upward to space and half downward to the earth. The earth warms up as a result, until its outgoing radiation increases to the point that it once again balances total incoming radiation. The higher the concentration of CO2 in the atmosphere, the greater the warming needed to restore equilibrium. If there were no more to this problem, the entire calculation could be done on the back of an envelope and the effort would scarcely rate being called “modeling.” The difficulty is the myriad changes that accompany a climate change, and the effect each may in turn have on climate. For example, water vapor in the atmosphere acts in much the same way as CO2, trapping terrestrial radiation; increased temperature means increased evaporation and increased water vapor in the atmosphere. The result is an
FIGURE 1
Increasing CO2 production
[Plot of annual CO2 production from fossil-fuel combustion (billion tons C/year, logarithmic scale) and atmospheric CO2 change (ppm/year) versus year, 1860-1980.] The annual production of CO2 from fossil-fuel combustion has increased at a rate of 4.3%/year, except for interruptions during World War I, the Great Depression, and World War II. Note that the vertical scale is logarithmic. Source: C. F. Baes et al., Am. Sci., May/June 1977.
amplification, or positive feedback, of a CO2-induced warming. The simplest models that incorporate at least some of these secondary effects are the so-called radiative-convective global-average, or one-dimensional, models. The only space variable considered is altitude. Values of the incoming solar radiation, the earth’s “albedo” (that is, the percentage of solar radiation reflected by the earth’s surface), and cloud cover are averaged over the entire surface of the earth and over all four seasons. A prescription of the change in relative humidity with height is usually supplied to the model, and often the assumption is made that convection of heat vertically in the lower atmosphere will prevent the lapse rate (change in temperature with altitude) from becoming greater than some specified value. The CO2 concentration is then specified and the model integrated over time until a steady state is reached. Calculations with these models have typically shown a 2 °C average temperature rise for a doubling of atmospheric CO2.
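The “back of the envelope” balance described above can in fact be written down directly. The sketch below is a zero-dimensional illustration of that balance, not one of the radiative-convective models just described; the one-layer grey-atmosphere treatment and the effective emissivity values are assumptions chosen only to show the direction of the effect.

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1367.0       # solar constant, W m^-2 (approximate)
ALBEDO = 0.30     # approximate planetary albedo

def surface_temperature(emissivity):
    """Surface temperature of a one-layer grey-atmosphere balance.

    Absorbed sunlight, S0 * (1 - ALBEDO) / 4, must equal the longwave
    radiation escaping to space. The atmospheric layer absorbs a fraction
    `emissivity` of the surface emission and re-emits half of it back
    downward, which is the "blanket" effect described in the article.
    """
    absorbed = S0 * (1.0 - ALBEDO) / 4.0
    return (absorbed / (SIGMA * (1.0 - emissivity / 2.0))) ** 0.25

# The emissivity values below are illustrative assumptions, not the result
# of a radiative transfer calculation for present-day or doubled CO2.
t_now = surface_temperature(0.78)
t_more_co2 = surface_temperature(0.80)
print(f"baseline: {t_now:.1f} K")
print(f"higher emissivity: {t_more_co2:.1f} K  (warming {t_more_co2 - t_now:.1f} K)")
```

With these illustrative numbers the balance lands near 288 K; everything the rest of the article discusses (water vapor, clouds, ice, ocean heat uptake) concerns how the real system departs from this single equation.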
Missing factors
A one-dimensional representation clearly omits many processes. “Certain aspects of the 1-D models are arbitrary assumptions,” said Robert Dickinson of the National Center for Atmospheric Research (NCAR) and a member of the Academy panel. The specification of a limiting lapse rate is an example. Richard Wetherald of the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory (GFDL) explained that, in his view, the real point of the 1-D models was to give the sign and order of magnitude of the change, rather than a specific number. He noted that these models are radiative models, which cannot take into account the dynamics of the earth’s atmosphere or any of a number of complex feedback effects. One such feedback mechanism is provided through changes in snow and ice albedo. As temperature rises, the surface area covered by snow and ice decreases. Since snow and ice are more reflective than bare ground, this change leads to an overall decrease in reflectivity, and thus an increase in absorbed incoming radiation. Because this effect is inherently regional (it depends on latitude at least), a one-dimensional model cannot possibly incorporate it.

Multidimensional models
The simplest models which can begin to incorporate some of these factors do so by adding one additional dimension: latitude. Incoming radiation, albedo, and temperature are
allowed to vary with latitude; the averaging is done over a zone, or circle of latitude, rather than the entire surface of the earth. Empirical parameters still carry the day, however. The complex circulations of the atmosphere and ocean which redistribute heat over the globe are reduced to a parameterized flow from equator to poles. And while snow and ice albedo feedback are included in these two-dimensional “energy-balance” models, the zonal averaging forces an empirical specification of the average albedo as a function of temperature. But the results of the 2-D models are not radically different from the 1-D results. The “JASON” study, prepared for the Department of Energy and chaired by Gordon MacDonald, former chairman of the Council on Environmental Quality, found an average rise of 2.4 °C for a doubling of CO2. A complete description of global heat flows that avoids any empirical parameterization can, of course, only be provided by a three-dimensional, or general circulation, model. In this approach the fundamental equations of atmospheric motion are integrated over time; temperature, pressure, wind velocity, and water vapor concentration are calculated at each point of a three-dimensional grid. Perhaps the major obstacle to the use of general circulation models is the computer time and capacity needed to store and manipulate such vast quan-
tities of data. “In a 3-D calculation, you need almost an entire IBM 370 computer totally committed to the modeling effort,” said John Hummel of General Motors Research Laboratories. “The numerical burden is just enormous.” As a result, general circulation modeling has been carried out only at institutes which can afford such an allocation of resources: NOAA’s Geophysical Fluid Dynamics Laboratory, NCAR, and NASA’s Goddard Institute for Space Studies. Results that have been obtained from these models are again in line with those produced by other models: a rise of between about 2 °C and 4 °C for a doubling of CO2. But uncertainties remain even with the 3-D models. Although James Hansen of the Goddard Institute expressed the view of nearly everyone working in the field when he said, “It’s very hard to imagine how the basic prediction could be wrong,” he cautioned that there are uncertainties in the magnitude of the warming and in how long it will take for it to appear. “It would be premature to say it’s nailed down.”
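The ice-albedo feedback and the “albedo as a function of temperature” specification used in the zonal energy-balance models can be illustrated with a toy latitude-band model. Everything below is a sketch in the spirit of Budyko- and Sellers-type models; the parameter values and the linear ice-albedo ramp are textbook-style assumptions, not numbers from any of the models discussed in the article.

```python
import math

# Toy zonal energy-balance model with an ice-albedo feedback.
# All parameter values are illustrative assumptions.
S0 = 1367.0                 # solar constant, W m^-2
A, B = 203.0, 2.1           # outgoing longwave fit: OLR = A + B*T, T in deg C
D = 3.8                     # relaxation toward the global mean, W m^-2 per deg C
ALBEDO_ICE, ALBEDO_BARE = 0.62, 0.30
T_ICE = -10.0               # band treated as fully ice-covered below this, deg C

lats = [5 + 10 * i for i in range(9)]                 # band centers, 5-85 degrees
weights = [math.cos(math.radians(lat)) for lat in lats]
total_weight = sum(weights)

def insolation(lat):
    """Annual-mean insolation per band (second-Legendre-polynomial fit)."""
    x = math.sin(math.radians(lat))
    return (S0 / 4.0) * (1.0 - 0.482 * (3.0 * x * x - 1.0) / 2.0)

def albedo(temp):
    """Empirical ice-albedo feedback: albedo rises linearly as a band cools."""
    ice_fraction = min(1.0, max(0.0, temp / T_ICE))
    return ALBEDO_BARE + ice_fraction * (ALBEDO_ICE - ALBEDO_BARE)

def equilibrium(forcing=0.0, steps=20000, dt=0.01):
    """March band temperatures to a steady state; `forcing` mimics added CO2 (W m^-2)."""
    T = [10.0] * len(lats)
    for _ in range(steps):
        t_mean = sum(w * t for w, t in zip(weights, T)) / total_weight
        T = [t + dt * (insolation(lat) * (1.0 - albedo(t))
                       - (A + B * t) + forcing + D * (t_mean - t))
             for lat, t in zip(lats, T)]
    return T

base = equilibrium()
warmed = equilibrium(forcing=4.0)   # ~4 W m^-2 is a commonly quoted figure for doubled CO2
for lat, t0, t1 in zip(lats, base, warmed):
    print(f"{lat:2d} deg latitude: {t1 - t0:+.1f} deg C")
```

In this kind of toy model the bands near the ice edge typically warm the most, the same qualitative polar amplification that shows up in the general circulation results of Figure 3.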
Oceans and clouds
Hansen told ES&T of some particular areas that need more work. “The greatest uncertainty is in the role of the oceans, which have been treated in a very simple way, particularly their role in the storage of heat.” Most models assume that the ocean is either
a “swamp” (wet land with no heat capacity at all) or consider only the surface layer of the ocean, which is in rapid communication with the atmosphere. There is evidence, however, that much deeper parts of the ocean can be involved in heat exchange with the atmosphere through a process which pumps water from the surface in the subtropics. The Academy report noted that this means that “the effective thermal capacity of the ocean for absorbing heat . . . is nearly an order of magnitude greater than that of the mixed layer alone. If this reservoir is indeed involved, it could delay the attainment of ultimate global thermal equilibrium by the order of a few decades.” Thus, although the final result would be the same, it would be delayed. Dickinson of NCAR noted that this “adds confusion to the subject from the point of view of actually seeing anything” of the effect. But this is only one effect the ocean might have. Ocean currents and ocean heat transport could change as a result of climate change. This could have a significant effect on the distribution of the temperature change over the globe. What has been the difficulty in including a more complete description of the oceans in models? According to Hansen, it’s particularly a data problem: we don’t have many ocean measurements, certainly nowhere near as many as we have for the atmosphere. Measurements are needed not only to provide comparisons with model calculations, but also to allow us to develop a better understanding of large-scale ocean flows. The problem is also one of finite computer capacities. A universally stated need is for “better and faster” computers; coupling oceans to the atmosphere accentuates this need. The model atmosphere comes to equilibrium in about one model year of integration, while the ocean takes about 100 years. Synchronizing the two, while keeping real computer running time within bounds, is tricky. “A second major uncertainty is in how to treat possible changes in clouds that will occur as CO2 warms up the atmosphere,” Hansen continued. “In my opinion, there are two things that could be done over the next few years. One is to get better data on the distribution of clouds at this time; that’s possible with satellite measurements. Something can also be learned about cloud physics by looking at other planets.” Weather satellites routinely make measurements of the earth’s clouds,
FIGURE 3
Calculated temperature rise
[Plot of calculated temperature rise (°C, 0-16) versus latitude, 90° to 0°.] The effect of a doubling and quadrupling of the atmospheric CO2 content is shown as a function of latitude. Calculations were made by a general circulation model, and then averaged over longitude. Source: S. Manabe and R. Wetherald, J. Atmos. Sci., January 1980.
but the enormous amount of data collected makes data storage for more than a few days impossible, and models need information on seasonal or annual variations in cloud properties. But Hansen explained that all that would really be necessary would be to “tap in” to the data stream and keep running averages of the data as they are obtained. He also noted that there is already quite a bit of information on the clouds of Venus; data from Jupiter is expected to become available in the middle of the decade. “The modeling of clouds is one of the weakest links in the general circulation modeling efforts,” concluded the Academy report. Clouds not only reflect incoming solar radiation, but also absorb terrestrial radiation. How these two opposing effects balance out is part of the uncertainty. The radiative properties of clouds depend strongly on cloud height, which further complicates the problem. Low clouds, which are in a warmer environment, intercept and reradiate more energy than high clouds. The difficulty in predicting cloud patterns from first principles has led to the specification of fixed cloud coverage in many models. But Hummel of GM Research believes this is a serious mistake. Calculations he has done with a radiative-convective model in which cloud heights and thicknesses are calculated on the basis of water vapor and temperature profiles show the effect of cloud feedback to be significant. On the other hand, Wetherald of the GFDL believes, on the basis of his 3-D
calculations, that cloud changes will not have very much effect on the globally averaged results. He also cautioned against drawing conclusions about cloud feedback effects on the basis of 1-D models. The Academy report provides what may be the best conclusion: “How important the overall cloud effects are is, however, an extremely difficult question to answer. The cloud distribution is a product of the entire climate system, in which many feedbacks are involved.” And the report agreed that data is the key: “Trustworthy answers can be obtained only through comprehensive numerical modeling of the general circulations of the atmosphere and oceans together with validation by comparison of the observed with the model-produced cloud types and amounts. Unfortunately, cloud observations in sufficient detail for accurate validation of models are not available at present.”
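The delaying effect of ocean heat storage discussed under “Oceans and clouds” above can be illustrated with a toy two-box model: a fast surface mixed layer exchanging heat with a slow deep reservoir. The forcing, feedback, exchange, and depth values below are assumptions chosen to give an equilibrium warming near 3 °C; none of them come from the article.

```python
SECONDS_PER_YEAR = 3.15e7
RHO_CP = 4.0e6        # volumetric heat capacity of seawater, J m^-3 per deg C (approx.)

# Illustrative assumptions:
FORCING = 4.0         # W m^-2, a step forcing standing in for doubled CO2
FEEDBACK = 1.33       # W m^-2 per deg C; equilibrium warming = FORCING / FEEDBACK
EXCHANGE = 0.7        # W m^-2 per deg C of mixed-layer/deep-ocean difference
H_MIXED, H_DEEP = 70.0, 900.0    # layer depths, m

c_mixed, c_deep = RHO_CP * H_MIXED, RHO_CP * H_DEEP
t_mixed = t_deep = 0.0            # warming relative to the unperturbed state
steps_per_year = 20
dt = SECONDS_PER_YEAR / steps_per_year

for year in range(1, 151):
    for _ in range(steps_per_year):
        downward = EXCHANGE * (t_mixed - t_deep)
        t_mixed += dt * (FORCING - FEEDBACK * t_mixed - downward) / c_mixed
        t_deep += dt * downward / c_deep
    if year in (10, 25, 50, 100, 150):
        print(f"year {year:3d}: surface warming {t_mixed:.2f} deg C "
              f"(equilibrium {FORCING / FEEDBACK:.2f} deg C)")
```

With the deep reservoir attached, the surface approaches its equilibrium warming only over several decades, which is the delay, and the detection problem, that the Academy report and Dickinson describe.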
Validation
The difficulty in validating the models’ cloud predictions points up the larger difficulty in validating the model as a whole. In a nutshell, the problem is “you have no data to say whether you have the right answer or not,” said Dickinson of NCAR. He outlined two approaches, however, which can at least help. One is to look at past climatic data contained in the geological record. A model that can satisfactorily recreate the features of a past climate change is presumably on the right track.
He also noted that simply comparing different models, which make different simplifications and assumptions, can provide some assurance. “If the models give similar results, you have a little more confidence.” Because the general circulation models construct a picture of global climate from first principles, their ability to construct the present climate is itself a test of their validity. A typical starting condition for a general circulation model is a stationary, isothermal atmosphere. The model then applies the equations of motion and the specified CO2 concentration, integrating the atmosphere over time until a steady state is attained, usually 400-500 model days from the starting point. The atmosphere is then typically integrated another 400-500 model days to obtain an average climate prediction. The distribution of average temperature over space and seasons, calculated by the model when the present CO2 concentration is assumed, can be compared with actual observations. The model can also be tested by its ability to predict day-to-day changes in weather when actual meteorological data are fed in.
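The spin-up and averaging procedure just described is, at bottom, a simple piece of control flow. In the sketch below, `step_model`, `initial_isothermal_state`, and the one-variable toy at the end are hypothetical stand-ins invented for illustration; they are not the interface of any actual general circulation model.

```python
def run_climate(step_model, initial_isothermal_state, co2_ppm,
                spinup_days=450, averaging_days=450, steps_per_day=48):
    """Spin a model up from an isothermal start, then average its climate."""
    state = initial_isothermal_state
    # Phase 1: integrate until the model climate is statistically steady.
    for _ in range(spinup_days * steps_per_day):
        state = step_model(state, co2_ppm)
    # Phase 2: keep integrating and accumulate a running mean; this mean
    # is what gets compared against the observed present-day climate.
    mean, count = None, 0
    for _ in range(averaging_days * steps_per_day):
        state = step_model(state, co2_ppm)
        count += 1
        if mean is None:
            mean = dict(state)
        else:
            for key, value in state.items():
                mean[key] += (value - mean[key]) / count
    return mean

# Toy usage: a one-variable stand-in "model" that simply relaxes toward a
# CO2-dependent temperature, just to exercise the control flow above.
def toy_step(state, co2_ppm):
    target = 288.0 + 0.01 * (co2_ppm - 335.0)
    return {"T": state["T"] + 0.001 * (target - state["T"])}

print(run_climate(toy_step, {"T": 255.0}, co2_ppm=670.0))
```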
More questions
But despite the difficulties in validating the models, and despite the uncertainties in modeling clouds and the oceans, there is widespread confidence in the general results. Nearly everyone agrees that the temperature will go up by 2.5-3 °C as a best estimate, that the change will be greater at the poles than at the equator, and that hydrologic patterns (the distribution of rainfall, for example) will change. Getting a handle on the regional distribution of the temperature change is of top priority now. Obtaining this information and information on the distribution of rainfall and evaporation changes was in fact one reason for developing the 3-D models. Here, however, the uncertainties are greater and the understanding less. The Academy report found that “two models may give rather similar zonal averages [averages over a circle of latitude] but, for example, very different monsoon circulations, positions, and intensities . . . and quite different rainfall patterns. It is for this reason that we do not consider existing models to be at all reliable in their predictions of regional climate changes due to changes in CO2 concentration.” The report cited shortcomings in the treatment of clouds, precipitation, evaporation, and transport across boundary layers, as well as in the use
Waste disposal chemistry
The treatment and disposal of hazardous waste stands on shaky scientific ground. Some first attempts to shore up our understanding of this complex subject were discussed at the EPA’s Solid and Hazardous Waste Research Symposium in Chicago.
“Our knowledge about land disposal, treatment, and incineration is almost a black art.” The essence of the task ahead, continued Gary Dietrich of the EPA’s Office of Solid Waste, is to make it into a science. Some first steps towards that goal were reported at the Sixth Annual Research Symposium, sponsored by the EPA Solid and Hazardous Waste Research Division (Cincinnati, Ohio) and the Southwest Research Institute. The meeting was held in Chicago on March 17-20, and dealt for the most part with work funded by the EPA. The urgency of determining what constitutes sound hazardous waste management (or, in EPAese, “Best Engineering Judgment”) has been intensified by the flurry of legislative and regulatory activity in the area. Particularly pressing is the EPA’s deadline of the end of August to come up with the technical standards which will be used in issuing permits on disposal sites under the Resource Conservation and Recovery Act (RCRA). The actual regulations, as opposed to the supporting technical standards, were due at the very end of last month. The issuing of permits is to begin April 30 of next year.

Characterizing waste
Complicating the problem terribly is that “hazardous waste” covers a wide range of chemicals. Heavy metals, pesticide residues, organic solvents, acids, inorganic salts, explosives: all of these may fall under the heading of “hazardous.” Each has its own chemical characteristics; each must be handled differently. “Proper management of hazardous waste is not possible without adequate knowledge of composition,” said Robert Stephens of California’s Hazardous Materials Laboratory. Yet the precise characterization of a sample is a time-consuming and expensive task.
“We just cannot afford to make a research project out of each sample,” Stephens said. The only approach possible, then, is to develop certain standard procedures for analyzing and categorizing wastes in a general way. Stephens and his co-workers are creating a flowchart of standard procedures to be followed in analysis. Their method emphasizes a sequence of increasingly refined tests, so that the most precise (and time-consuming and costly) methods are saved for those samples that really require them. The initial screening tests, for example, are designed to indicate certain general hazardous properties, such as acidity, flammability, and toxic gas generation. Results from these tests are in some cases adequate to characterize a waste as nonhazardous, thus sparing the need for more detailed work. If further analysis is required, though, the next step is phase separation and analysis with thin-layer chromatography, before going to the state-of-the-art methods like gas chromatography/mass spectrometry. Thin-layer chromatography, said Stephens, is a fast, inexpensive, and versatile method for separating a mixture into its components; again, the results may be sufficiently revealing to obviate more detailed work. And even if there is still a need for GC/MS analysis, the problem can at least be reduced by this initial study, for example, to analyzing only one or two components of interest obtained from the thin-layer separation. In addition to pointing the way to the best treatment or disposal strategy, analysis and characterization of wastes is vitally important in preventing dangerous reactions which would result from mixing incompatible materials. Howard Hatayama, also of California’s Hazardous Materials Lab., noted that serious accidents of
this sort occur most often because waste handlers do not know the chemical composition of the waste or are unaware of how different wastes interact chemically. Again, making a “research project” out of every possible specific chemical reaction is impossible; the only way to get a handle on the problem is to lump wastes into general categories. Hatayama’s study produced a hazardous waste compatibility chart which separates wastes into categories, such as oxidizing mineral acids, organophosphates, and nitriles, and lists possible interactions, such as flammable gas generation, violent polymerization, and solubilizing of toxics. This study, however, was based primarily on a literature search and considered only binary interactions; further studies, including lab work, will be needed to examine catalytic effects of metals, surface reactions in containers, and complex interactions of more than two substances.
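A chart of this kind is, in software terms, a lookup table keyed on unordered pairs of waste categories. The sketch below shows the idea; the category pairings and hazards listed are illustrative entries made up for the example, not entries taken from Hatayama’s chart.

```python
# Minimal sketch of consulting a binary waste-compatibility chart.
# The entries below are illustrative, not the actual chart.
COMPATIBILITY = {
    frozenset(["oxidizing mineral acids", "organophosphates"]):
        ["heat generation", "toxic gas generation"],
    frozenset(["oxidizing mineral acids", "nitriles"]):
        ["toxic gas generation", "flammable gas generation"],
    frozenset(["oxidizing mineral acids", "caustics"]):
        ["heat generation", "violent reaction"],
}

def hazards_of_mixing(category_a, category_b):
    """Return the listed hazards of mixing two waste categories, if any."""
    listed = COMPATIBILITY.get(frozenset([category_a, category_b]))
    # An absent entry only means the chart lists no binary interaction;
    # as the article notes, ternary and catalytic effects are not covered.
    return listed if listed is not None else ["no binary interaction listed"]

print(hazards_of_mixing("nitriles", "oxidizing mineral acids"))
```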
Chemical treatment
In the long run, however, hazardous waste management may, thankfully, be less concerned with complex mixtures. “The days of collecting waste together,” said Dietrich, “can probably be numbered.” He added, “I personally believe that a large part of the solution of the hazardous waste problem will be in dealing with segregated waste.” Yet if the work discussed at the symposium is any indication of the state of our knowledge of chemical treatment, it would seem that we are a long way from knowing what to do even with segregated waste. “We certainly haven’t begun to catalog all the unit treatment processes,” Dietrich said. One encouraging development, though, was reported in the treatment of PCBs and other halogenated organics.
Louis Pytlewski of the Franklin Research Center (Philadelphia) described the successful decomposition of these compounds by the use of molten sodium in polyethylene glycol. The sodium aids in the oxidation of organics and the removal of chlorine; products are sodium chloride, hydrogen gas, and polyhydroxylated biphenyls or other phenolic compounds. The reaction is exothermic, thus self-sustaining, and very fast. “Complete dechlorination of PCB [polychlorinated biphenyl] oil occurs approximately 3-5 minutes after addition to the reaction mixture,” said Pytlewski. Decomposition of kepone, DDT, pentachlorophenol, hexachlorobenzene, mustard gas, and hexachlorocyclohexane was also reported. Polyethylene glycol is used as the solvent; it has the advantages of high thermal stability (reactions can run up to 250 °C) and the ability to dissolve PCBs and other chlorinated compounds to a high degree.

Another area where progress is being made is in inorganic waste treatment. An ongoing program to test three inorganic treatment methods was described by Warren Lyman of Arthur D. Little, Inc. (Cambridge, Mass.). High-gradient magnetic separation, a process which is used principally to whiten clay by removing a colored magnetic fraction, may also be applicable to hazardous wastes containing heavy metals. “A large fraction of the more common heavy metals encountered in hazardous wastes are either ferromagnetic or paramagnetic in the elemental form or in compounds,” said Lyman. “Included are Fe, Co, Ni, Cr, Cu, Ti, and Cd; not included are Pb, Hg, and Zn. In some instances it may be possible to remove nonmagnetic [materials] associated in any way with the magnetic material.” The metals are collected on a ferromagnetic filter, in the presence of a high-intensity (up to 20 000 gauss) magnetic field. The relatively small volume of concentrated heavy metals collected could then be disposed of in a secure landfill or treated further for material recovery. Also under study are a solvent extraction process for removing water and oils from sludges containing both organic and inorganic material, and an activated-carbon adsorption process for removing heavy metals and certain anions from mixed acidic plating
wastes.
The burning issue
The little that is understood of the chemistry of treatment is rivaled only
by the little that is understood of the chemistry of incineration. While many researchers believe that incineration can provide the best overall solution, they freely admit that much needs to be learned about just how to do it and what goes on during incineration. Richard Carnes of EPA, Cincinnati, made this point at a recent meeting of the National Governors’ Association in Washington. A key issue, he said, is to “make sure that the breakdown products themselves are not hazardous.” Other questions which need answering, said Carnes, are how much residence time is needed in the incinerator for a particular compound, what temperatures and pressures are required, and what kind of flue gas scrubbers would be required. Dietrich agreed: “You cross your fingers and hope that nothing comes out of the stack that’s harmful to public health. We don’t know how an incinerator operates in the sense that we can predict with confidence how a particular incinerator design will deal with a particular mixture of hazardous wastes even in those cases where we do know the breakdown characteristics.” D. S. Duvall of the University of Dayton Research Institute reported to the Chicago symposium on laboratory research which is beginning to supply some of the needed answers. Duvall and his group have designed and assembled what they call a “thermal decomposition analytical system,” which combines a small high-temperature reactor with an in-line GC and a minicomputer. Exposure temperature, pressure, residence time, and atmosphere can be carefully con-
Thermal decomposition of a PCB
[Plot of concentration (weight %) versus exposure temperature (°C), 600-1000 °C, showing the parent compound disappearing and breakdown products such as pentachlorobenzene appearing.] The decomposition of a polychlorinated biphenyl (in this case 2,2',4,4',5,5'-hexachlorobiphenyl) is shown along with the appearance of breakdown products as a function of exposure temperature. Source: D. S. Duvall, University of Dayton Research Institute.
trolled. The GC/MS is used to monitor the amount of the target compound remaining as well as the breakdown products formed. The two classes of compounds tested, PCBs and “hex” waste (a mixture of hexachlorobenzene, pentachlorobenzene, and assorted other chlorinated and nonchlorinated organics), both showed high thermal stability. Temperatures of 750 °C were required for 99.95% decomposition of PCBs (for a 2-s exposure), for example. An analysis of breakdown products,
though, showed that significant concentrations of chlorinated organics remained even after exposures to 800 °C. Hexachlorobenzene, both a breakdown product of PCB and a primary constituent of “hex” waste, was found to be particularly resistant to decomposition; low levels remained even after 1000 °C exposure. Other breakdown products, too, such as some formed by the incorporation of oxygen, show greater stability than their parent compounds. This point emphasizes the need to study breakdown products, not just the disappearance of the primary compound. Not surprisingly, reducing the proportion of oxygen in the reactor had the effect of increasing thermal stability, by as much as 200 °C. Some further work on the hows and whys of incineration is being planned at the National Center for Toxicolog-
ical Research in Pine Bluff, Ark. An experimental incinerator will be used to carry on similar studies on a larger scale, and also to learn how much energy might be released through combustion of wastes. The subject of obtaining useful energy from the incineration of organic wastes came up at the governors’ meeting. According to Arch Pettit, former president of Arkansas Power and Light, now with Stephens, Inc. (Little Rock, Ark.), the notion is not as far-fetched as it might first appear. He noted that synthetic organics account for 40% of hazardous wastes, and that the incineration energy of these wastes could drive 20-MW power plants. One incineration method that may deserve particular attention is fluidized-bed combustion. According to Barbara Edwards of Ebon Research Systems (Washington, D.C.), who
reported to the Chicago symposium on a number of emerging disposal technologies, the turbulent action of the fluidized bed allows rapid oxidation of organic compounds at relatively low temperatures and with a minimum of excess oxygen. Furthermore, the addition of sodium carbonate to the bed can trap chlorine gas as it is produced when compounds such as polyvinyl chloride are burned. Particulate emissions, though, are likely to be a problem with fluidized-bed combustion, Edwards said, and some sort of precipitator or cyclone collector would be needed. Fluidized-bed combustion has so far been used to destroy oils, refinery wastes, phenol, an organic water dye slurry, and a variety of chlorinated hydrocarbons.
-Stephen Budiansky and Julian Josephson
Drinking water and its treatment
An interview with Professor Sontheimer
Public concern about the quality of drinking water has risen sharply since the early 1970s. The discovery of synthetic organic compounds in New Orleans’ water supply (ES&T, January 1973, p 14) put pressure on Congress to enact the Safe Drinking Water Act of 1974 (ES&T, March 1975, p 194). Scientists also found that many of the naturally occurring organics, which actually represent the bulk of organic chemicals in water, were being converted to chloroorganics, which are presumed to be more toxic than their nonchlorinated precursors. In the U.S., chlorine is typically used both as an oxidant in the water treatment process and as a disinfectant at the end of the treatment process. It was discovered in late 1974 that chloroform and other trihalomethanes (THMs) are present in the drinking water of the U.S. systems that chlorinate. More than five years of concentrated effort by dozens of individuals, including scientists, toxicologists, en-
gineers, attorneys, and administrators, went into the U.S. effort to regulate THMs. This formal rule-making process required more than three and a half years. It started in July 1976 when the EPA published an advance notice of proposed rule making entitled “Control Options for Organic Chemicals in Drinking Water.” In February 1978, EPA published a proposal to amend the National Interim Primary Drinking Water Regulations to include a maximum concentration level (MCL) for THMs as well as a proposal to deal with synthetic organic chemical contaminants. The final THM regulation, exclusive of controls for synthetics, was published as a final regulation on Nov. 29, 1979. It specified a 0.10 mg/L level for THMs in finished drinking water. EPA is now being sued by the American Water Works Association (AWWA) on this regulation, but the case has not yet been heard. Joseph Cotruvo, director of the EPA Drinking Water Standards Division,
told ES&T that most waterworks in the U.S. would be able to meet this MCL by making relatively minor modifications in their current treatment practices, such as by changing the place where chlorine is added in the treatment process. In a number of cases, waterworks have actually saved money by working to reduce THM levels. In the U.S., the source of water supply for large drinking waterworks is surface water; but a large number of small drinking waterworks take their supply from groundwater sources. Surface waters, which usually have large amounts of dissolved natural organics, tend to have the greatest potential for producing chloroorganics after chlorination. A typical treatment process consists of breakpoint chlorination, flocculation, sedimentation, sand filtration, and safety chlorination (disinfection). Breakpoint chlorination commonly has been used to remove ammonia. The chlorine would convert the NH3 to free N2 in a series of steps which includes
formation of the intermediaries chloramine, NH2Cl; dichloramine, NHCl2; nitrogen trichloride, NCl3; and eventually free nitrogen, N2. The chlorine used in this step also competes with organic material, leading to formation of the chloroorganic by-products, not only THMs but all of the organically bound halogen expressed as TOCl or TOX (total organic halides). Today breakpoint chlorination is practiced less often and less chlorine is used in the early stage of the treatment process. However, EPA is particularly concerned that “in no case should biological quality and the barriers against pathogen transmission be reduced in the cause of attempting to reduce THMs,” Cotruvo said. “The technology is available to both optimize biological quality and minimize the formation of the unwanted by-products of chlorination.” One way to reduce organic chlorine by-products is to substitute prechlorination with some other preoxidation process, such as one involving ozone, chlorine dioxide, permanganate, or chloramine (NH2Cl). The best way is to significantly reduce precursors in the water prior to the chlorine addition. Thus with precursors reduced, chlorine demand is less; disinfection efficiency is improved; and by-product formation is reduced. “It is not going to be difficult for many U.S. waterworks to meet the MCL,” Cotruvo said, “although some waterworks will require more extensive changes. The main point is that significant amounts of the undesirable and potentially harmful chemicals are being introduced during the treatment process, and waterworks have it within their capacity to substantially reduce them and produce a better quality water.” Cotruvo emphasized that one should optimize the treatment process, making certain to remove bacteria and pathogens and to minimize the formation of by-products, and that each waterworks has a unique water treatment problem to deal with. He said, “Avoid the patchwork approach; examine the total treatment process; identify the technical problems; evaluate the options, including permutations and modifications of processes; work out the analytical process control needs; examine the economics of the options; and above all, assure microbiological quality of the water.” Shortly after enactment of the Safe Drinking Water Act, EPA proposed that NATO (the North Atlantic Treaty Organization) initiate a study of drinking water issues under the auspices of its Committee on the
Challenges of Modern Society (CCMS). About this time, it seemed, EPA felt that drinking water quality issues in industrialized countries were so similar that an international effort should be mounted to identify the problems and utilize the collective experience of the national experts to find approaches to solve them. Conceived and headed by Joseph Cotruvo, the activity became known as the CCMS Drinking Water Pilot Project.
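As a point of reference for the breakpoint chemistry described earlier, the sketch below works out the rule-of-thumb chlorine dose. The stoichiometry 2 NH3 + 3 Cl2 -> N2 + 6 HCl gives a weight ratio of about 7.6 g of Cl2 per g of ammonia nitrogen; practical doses run higher, and the raw-water ammonia level used here is an assumption, not a figure from the article.

```python
# Rule-of-thumb breakpoint chlorination dose from the stoichiometry
# 2 NH3 + 3 Cl2 -> N2 + 6 HCl. The sample ammonia level is an assumption.
CL2_PER_G_NH3_N = (3 * 70.9) / (2 * 14.0)   # ~7.6 g Cl2 per g of NH3-N

ammonia_nitrogen = 0.5   # mg/L NH3-N in the raw water (assumed)
dose = CL2_PER_G_NH3_N * ammonia_nitrogen
print(f"stoichiometric breakpoint dose: about {dose:.1f} mg/L Cl2")
```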
Dr. Heinrich Sontheimer is a professor of water chemistry at the University of Karlsruhe, the oldest technical university in Germany, where he studied and obtained his Ph.D. degree. He returned to the university after 15 years with the Lurgi engineering firm. Besides teaching, he also directs the research activities of 50 scientists and technicians in basic and applied research at the Engler-Bunte Institute, which is supported by the German waterworks.

EPA has overall management responsibility for the project, which was subdivided into six areas, with responsibility given to different NATO member countries. With the exception of one area, the project is nearing completion. Reports are now being prepared for submission at the Fall 1980 plenary session of the NATO/CCMS tentatively planned for this October at a site yet to be determined. The six working groups in the Drinking Water Pilot Project include: Analytical Chemistry, whose chairman, Lawrence Pittwell, is with the Directorate of the Environment, U.K.; Advanced Treatment Technology, with Germany being the lead country, and whose chairman is
Professor Heinrich Sontheimer of the University of Karlsruhe; Wastewater Reuse, whose chairman is Andrew Goodman of the Directorate of the Environment, U.K.; Microbiology, with U.S. responsibility, whose chairman is Dr. Dean O. Cliver, professor of virology at the Food Research Institute of the University of Wisconsin; Health Effects, again with U.S. responsibility, whose chairman is Professor Joseph Borzelleca of the Medical College of Virginia (this is the group whose report will not be finalized prior to the October meeting); and Groundwater, with Germany being the host country, whose chairman is Horst Kussmaul of the Ministry of the Interior of Bonn. The NATO countries participating in this Drinking Water Pilot Project include the U.S., Canada, England, Germany, the Netherlands, France, Italy, Norway, Belgium, Turkey, and Greece. Some NATO countries did not participate in the project, while some other countries which are not members, including Israel, Sweden, and Spain, are participating. In all, 14 countries and more than 75 individuals are involved. Two reports from the CCMS groups, based upon two international symposia, have already been completed: “Oxidation Techniques” and “Adsorption Techniques.” These reports are available from the EPA Office of Drinking Water in limited quantity and from the National Technical Information Service (NTIS). The full series of reports will be published eventually. Professor Heinrich Sontheimer, chairman of the group on Advanced Treatment Technology, had this to say about the practices of providing good drinking water.

ES&T. What is the source and typical treatment for drinking water in Germany?

Sontheimer. Surface water is the source of 50% of our drinking water, but practically all river waterworks use some sort of ground filtration. All waterworks along the Rhine River do not take the water directly from the river. Rather, they take the water from the wells along the river at distances from 50 m to 100 m so that the retention time within the ground turns out to be 30, 50, and sometimes 100 days. This typical German practice reduces the organics concentration about 75% from what was originally in the raw water. This then enables us to use activated carbon in our treatment train with a much greater efficiency.
Most biodegradable substances are removed during ground passage. The remaining substances, besides the humic acids, which are nonbiodegradable to some extent too, are mostly chemicals that have to be removed by treatment with activated carbon because the Rhine River contains a high loading of chemicals coming from the chemical industry. We remove some of these materials by using ozone treatment before the carbon columns to get conversion of some of the nonbiodegradable organics to biodegradable ones. We also encourage biological activity within these filters. In Germany we try to use biological treatment processes as much as possible because they are very cost effective. The only disadvantage is the need for large land areas. Where river bank filtration is not possible, we use slow sand filters, which are followed by ground filtration to reduce organics concentrations as well as some bacteria. This practice is why we do not have a problem with trihalomethanes. In only a very few waterworks did we use a treatment with chlorine similar to that in the U.S. In fact, there are only five or six large waterworks out of a total of about 50 large river waterworks in Germany that use chlorine for ammonia removal, and they have changed their treatment during the last two years.

ES&T. For years breakpoint chlorination has been touted as standard practice and the safe way to remove ammonia as well as pathogens from drinking water. Why has this treatment process changed today?

Sontheimer. The main reason for the practice of breakpoint chlorination was to remove ammonia at low temperature. Earlier, we did not know that we could remove ammonia with biological treatment. We found that with preozonization the biological activity becomes so enhanced that ammonia is removed even at temperatures as low as 0.1 °C. For this reason, then, we don’t need breakpoint chlorination. With the introduction of ozonization, however, it became necessary to change our flocculation units to some extent. If you don’t use chlorine then the flocculation is not so easy. We needed to pay careful attention to the dosage and the use of flocculant aids. Today, in Germany we have no breakpoint chlorination at all. The general feeling in our country is that groundwater is much better (higher in quality) than river water. People tried to transfer river water to groundwater and this historical development is how
this ground filtration practice became preeminent in our country. There was no specific reason at the time related to chemical quality, but it, in conjunction with granular activated carbon (GAC), turned out to be very effective as a process to reduce organic chemical contamination. Our parents did it, as the treatment is easy to do; it’s low in cost but needs the land space.

ES&T. The newer treatment processes use oxidative and adsorptive techniques for treatment of drinking water. Indeed, the two international meetings (the first in September 1978 at the University of Karlsruhe and the second in May 1979 in Reston, Va.) under the CCMS Drinking Water Pilot Project for which you were cochairman stressed these new processes. To what extent is ozone used in Germany today?
“In Germany we try to use biological treatment processes as much as possible because they are very cost effective . . . about 50 waterworks in our country are using ClO2 for disinfection instead of chlorine . . .”
Sontheimer. We use ozone very often, but only for special purposes. Some waters, for example, are difficult to flocculate without chlorine. In order to enhance flocculation you must use some oxidative process. So we use a small dosage of ozone to improve flocculation. We first used ozone 20 years ago in Dusseldorf. There they compared ozone alone, activated carbon alone, and ozone in combination with activated carbon-the first time that the oxidative and adsorptive processes were used in combination. They found that taste and odor removal was much cheaper when they used ozone before activated carbon. From this Dusseldorf experience, many waterworks followed the same practice. Until a few years ago, we included treatment with activated carbon because some biological activity in the carbon extends the time between reactivation, thus reducing the cost. One group within our institute, for example, is concerned with the study
of the optimum ozone dosage. It is difficult to find the exact ozone dosage that is best for each situation. Too high a dosage of ozone may cause problems too. For example, excessive oxidation of organics to carboxylic groups may occur. These materials are not readily adsorbed on carbon. So we try to change the organics to some extent to make them biodegradable, but not too much, because without adsorption there would not be much opportunity to utilize the biodegradation process. At present, we are doing studies on a large drinking water plant in Essen where ozonization is used before ground filtration only to enhance biodegradation within the ground. But ozone is not the only answer. We have other large waterworks that make good water without the use of ozone.

ES&T. What is the experience with the use of other oxidants, for example, chlorine dioxide?

Sontheimer. We have about 50 waterworks in our country that are using chlorine dioxide for disinfection (safety chlorination) instead of chlorine at the end of the treatment process. The use of ClO2 as a disinfectant eliminates the chlorine odor or taste. At present we are undertaking a large study on the use of chlorine dioxide as an oxidant within the treatment process. We feel that ClO2 can be used within the process because all chlorites resulting from its use can be removed by the activated-carbon treatment. So we would have no residual at all. I think we can use chlorine dioxide in place of chlorine or ozone as an oxidant. To date, however, we have never used chlorine dioxide in the treatment process (except for final disinfection).

ES&T. The activities of the Engler-Bunte Institute are sponsored by the waterworks in Germany. How does the research that you direct differ from that conducted at institutes in the U.S.?

Sontheimer. I have a staff of about 50 people who are working on water treatment or water quality problems. The institute was founded by the German Gas and Water Works. It has a very unusual construction and started as a gas institute where they studied gas purification and the like. Later on in the 1930s it became a university institute, but a technical part remained which does not belong to the university. It has done research on water treatment since 1957. Today, there is an agreement between the state government and the Federation of Gas and Waterworks that within our institute there will be a department whose people are paid by the water-
works. Most large waterworks pay the institute a certain sum each year to support this activity. My people do basic and applied work at the same time, and we work closely with the operators of the waterworks. It is my experience that the only way to help build a better treatment sequence is to have fundamental knowledge about each step in the process. No one can design a carbon-treatment system who does not understand diffusion and the chemistry of adsorption, as well as engineering and operating problems. We have groups working on oxidation, flocculation, filtration, adsorption, and the analytics. The institute has doctoral students who do basic research, while the post-doctoral candidates do more of the applied work, in general. In the U.S. there are excellent researchers, but they tend not to be associated with the waterworks. On the other hand, you also have very competent waterworks people, but they tend not to be associated with basic research activities. The EPA laboratory tries to bring the two together. I think the situation in the U.S. has been changing and more and more interaction between universities and waterworks is occurring. I think you will attract more university people into drinking water treatment questions and applications if you have more combined research.

ES&T. EPA set a regulation on the THM level in drinking water in the U.S. last November. Is there a THM level for drinking water in Germany?

Sontheimer. We have a Commission for Drinking Water. It is not an official government body; rather, it is a commission of scientists giving public recommendations. This commission made a recommendation to the government in February 1979 and proposed, for technical reasons, a THM level of 25 µg/L. Remember, in Germany we use no chlorine in most of our treatment plants.

ES&T. What is happening in other NATO countries regarding THM levels?

Sontheimer. The English are considering setting guidelines, but historically the U.K. has relied heavily on chlorine to assure biological quality. The Canadians have established a guideline at a maximum of 350 µg/L. In France there are only two large companies that deliver drinking water. They reduced the dosage of chlorine and prefer to use ozone. Very similar developments are going on in many other countries.

ES&T. You mentioned that one of
the activities to disseminate the knowledge about treatment was the sponsorship of two international meetings on the subject: the September 1978 meeting at Karlsruhe, Germany and the May 1979 meeting in Reston, Va. Is the new knowledge from these conferences being applied today?

Sontheimer. Yes, it is. The newest knowledge can be seen in the use of activated-carbon columns. Ours and other research work has shown that. If you treat with chlorine and then use activated-carbon filters, good results are difficult to get. In this case, chlorine makes chloroorganics that are nonbiodegradable. If you use activated carbon to remove these materials, you find that the carbon columns fill up very soon and must be regenerated.
“Biological activity in activated carbon means that we can get the same biological effect (for biodegradable chemicals) in one-half hour that would have required 50-100 days for ground filtration.”
This can be avoided by using other oxidants or lower chlorine dosages. Thus we have to understand what we are doing in each special case and have to learn about it through basic research. (In the U.S., for example, there are active examinations of the use of ozone and chlorine dioxide, as well as on GAC use and other adsorption processes.)

ES&T. For the first time, several years ago you and your colleagues identified biological activity on carbon filters, which was hailed as a very significant breakthrough. What does this mean?

Sontheimer. Yes, it is true that we identified this activity as important, but we do not yet understand all of the factors that affect it. Now there are many international studies in this area and I am sure the answers will be found in another five years. Biological activity in activated carbon means that we can get the same biological effect (for biodegradable chemicals) in one-half hour that would
have required 50-100 days for ground filtration. We get the same effects but in a much shorter time. The general research was started years ago in Dusseldorf, where we had trouble with carbon columns because countings (of microorganisms) had been too high after the filters. We know now that there are two ways that microorganisms work within a carbon filter. They behave like microorganisms metabolizing biodegradable chemicals at or near the surface of the carbon. For the other part, we know that most organics are first adsorbed on the activated carbon and this carbon is then regenerated biologically. It’s a dynamic process. The organics adhere to the column somewhere, come off through the biological regeneration, move down the column, and are adsorbed at another site in the column and so on. I must say that we do not completely understand all effects now, but, again, I am sure that it can be explained within the next five years.

ES&T. What are the areas of technological research that are the most promising for the future?

Sontheimer. I mentioned that the biological activity on carbon filters has to be studied much more during the next years. It will take this time to gain a basic understanding of what is really going on. Another problem that we are working on is filtration and the use of polyelectrolytes. Within our studies, three-layer filters which use activated carbon as a top layer for filtration are most important. The filters with these three layers have much longer running times and can treat water with high turbidity, too. Flocculation and sedimentation are not needed and so these steps can be avoided. We can now flocculate and go directly to this three-layer filter. This type of treatment looks very promising from an economic standpoint. Our other concern is corrosion in water. For the past 95 years, people in Germany as well as in other countries believed that the precipitation of calcium carbonate was most important for preventing corrosion of pipes in drinking water supplies. During the last year, we found out that this is not true. It is not calcium carbonate; it’s ferrous carbonate that is important. It is less soluble than calcium carbonate and can be oxidized to the ferric salt. This competition between the precipitation of FeCO3 and the oxidation to the ferric oxides governs the type of coating one can get at the surface. This is a fascinating type of reaction, worth researching in much more detail.
-Stanton Miller