Lead: the debate goes on, but not over science

Research on the extent of lead contamination of the modern environment and on the health effects of low levels of lead has done much to remove scientific uncertainty and to shift debate from the scientific to the political realm.

It is two millennia since the first description of lead poisoning appeared, two centuries since Benjamin Franklin wrote on the effects of lead, half a century since lead paint was tracked down as the cause of lead poisoning in children. "Why," asks Herbert L. Needleman of the Harvard Medical School, "does the problem of lead continue to be debated? And why is the orderly process of decreasing lead in the human environment slowed in the face of what many consider an overwhelming body of facts documenting its toxicity at low dose?" The answer, says Needleman, is no longer scientific uncertainty, but economics and politics.

On one side are six government agencies charged with regulating and studying lead. They face an enormous task. Lead appears in food from the solder used on cans; in water from the lead pipes used in older plumbing; and in air from lead additives used in gasoline, from the dust of old lead paint and putty, and from the initial smelting of lead ore that makes lead available for all its uses. Several of those agencies are playing a game of musical chairs, each trying to pass the responsibility to another by showing that the source of lead exposure it regulates is insignificant compared to the others (Chemical and Engineering News, June 23, 1980, p. 23).

On the other side is an industry that is fighting for its very existence, and will fight all the harder as more take up the cry of Clair C. Patterson, a geochemist at the California Institute of Technology: "It is intrinsically wrong to mine and smelt millions of tons of a highly toxic, poisonous substance such as lead each year and disperse it within human environments."

Two controversies
Research that in the last few years has done much to settle two longstanding scientific controversies concerning lead in the environment has in particular made it clear that the continuing debate hinges on politics, not science.

The first, over the extent of contamination of the contemporary environment, has been all but put to rest by Patterson's work; he has shown contamination of the oceans of the Northern Hemisphere 10-fold over prehistoric levels; the atmosphere, 50-fold; the Greenland ice cap, 300-fold; and most Americans' bodies, 600-fold. Most significantly, he has demonstrated that conflicting results obtained by others were the result of contamination of samples and laboratory equipment, leading to analytic errors of as much as three orders of magnitude in the numbers they obtained for prehistoric levels.

The other controversy, over the effects of low levels of lead, while not put to rest, is at least not thrashing about in the quagmire of a few years ago. This is largely the consequence of a carefully controlled epidemiological study by Needleman that showed poorer performance and more behavioral problems among children with elevated, but subacute, levels of lead in their bodies.


Natural or industrial?
Patterson's work is the climax of a story that began in 1924, a story of analytical chemistry with a touch of imaginative detective work. In 1924, J. C. Aub et al. published a paper in the Journal of the American Medical Association which reported that lead did not occur naturally in humans; only workers improperly exposed to industrial lead were contaminated. Improved analytical techniques allowed R. A. Kehoe to refute this claim a decade later; he showed that lead occurred at low levels even in the blood

and excreta of typical adults. His facts were correct but his conclusion was not. He believed that the levels he detected in typical adults were natural, not the result of industrial activity. This view survived intact for three decades. Even by the mid-1960s, the prevailing medical opinion was that no more than one-half of the lead in humans was of industrial origin.

It was Patterson's early work on lead contamination of the environment that unleashed the controversy. "Patterson claimed that humans were contaminated with industrial lead by a factor of 100," explains Robert W. Elias of Virginia Polytech. "This estimate was based on his estimate of lead contamination in the environment. It has taken almost 20 years to verify Patterson's numbers, and he was right."

Patterson was right because he takes elaborate precautions to prevent contamination of his samples and his laboratory by the elevated levels of lead that are omnipresent in the modern environment. The interior of his lab is lined with plastic; an air lock and pressurized ultraclean air prevent outside contaminated air from entering; and his reagents are cleaned of lead contamination within the lab. His precautions, and his use of the most sensitive analytic technique for lead, isotope dilution mass spectrometry, have given him the edge needed to detect the true, minute quantities of lead occurring in samples that reflect the prehistoric environment. And he has shown time and time again that conflicting results obtained by other workers are the result of sample contamination and inadequate analytic techniques.

The most recent instance was data that questioned Patterson's analysis of deep, ancient arctic ice. Patterson's group had found that 3000-year-old ice contained only 2 ng lead/kg, compared with 200 ng/kg at present. Three other groups subsequently reported finding prehistoric levels to be much higher, in fact nearly the same as present levels, findings that suggested Kehoe was right, that current typical levels of contamination are natural and not the result of industrial activity. But in a paper to be submitted to Geochimica et Cosmochimica Acta, Patterson and a graduate student, Amy Ng, report that their analyses of the ice cores used by the other groups revealed considerable contamination of the cores during drilling and handling. Lead concentrations at the center of the cores turned out to be less than a tenth of the values reported. Concentrations on the outside of the


cores were six orders of magnitude over the true uncontaminated values.

Patterson's group has dug out other obscure records of the ancient environment (pond sediments, deep ocean water, bones of prehistoric Peruvians), and all have confirmed his basic contention: Industrial lead is ubiquitous in the modern environment at concentrations orders of magnitude above natural levels. The Peruvian bone data have a special significance because they provide a direct measure of the natural level in humans, some two to three orders of magnitude below present typical levels.

The data also lend concrete support to the "biopurification" theory that Elias has used to estimate natural levels in humans and other organisms. Elias explains that nonpollutant trace metals such as barium and strontium are increasingly purified with respect to calcium as they move up the food chain. Calcium is essential to the workings of all living things, and they therefore actively concentrate calcium at the expense of the other trace metals. By analogy, the ratio of lead to calcium should drop as well in going from plants to herbivores to carnivores. Elias's predicted value for natural lead in humans so obtained is within a factor of two of the value found in the Peruvian bones.
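Elias's biopurification argument is essentially a chain of ratio reductions, and it can be illustrated with a few lines of arithmetic. The Python sketch below applies assumed biopurification factors to an assumed plant Pb/Ca ratio to arrive at a predicted natural skeletal lead burden; every number in it is an illustrative placeholder, not one of Elias's actual values.

```python
# Hypothetical sketch of the "biopurification" estimate described above.
# All numbers here are illustrative placeholders, not Elias's actual values.

def biopurified_ratio(base_ratio, step_factors):
    """Reduce an element/Ca atom ratio by a factor at each trophic step."""
    ratio = base_ratio
    for f in step_factors:
        ratio /= f
    return ratio

# Assumed Pb/Ca atom ratio in plants (placeholder value)
pb_ca_plants = 1e-5
# Assumed biopurification factors for the steps plant -> herbivore -> human
factors = [5.0, 5.0]

pb_ca_human = biopurified_ratio(pb_ca_plants, factors)

# Convert to a skeletal lead burden, assuming ~1 kg of calcium in an adult skeleton
CA_SKELETON_G = 1000.0
MOLAR_MASS_CA = 40.08   # g/mol
MOLAR_MASS_PB = 207.2   # g/mol

moles_ca = CA_SKELETON_G / MOLAR_MASS_CA
natural_pb_g = pb_ca_human * moles_ca * MOLAR_MASS_PB

print(f"Predicted natural Pb/Ca atom ratio in humans: {pb_ca_human:.1e}")
print(f"Predicted natural skeletal lead burden: {natural_pb_g * 1000:.2f} mg")
```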

Politics and pride
In the face of this overwhelming body of evidence, resistance to Patterson's findings can be attributed largely to politics and professional pride; no one likes to be told that his analytic technique is off by a factor of 1000, and Patterson hasn't balked at stepping on toes. Patterson, who was reached in American Samoa where he is doing field work, believes it is a matter of time for his results and methods to sink in: "Knowledgeable and competent scientists who understand the significance and who are aware of the quality and integrity of the work of my colleagues and myself are in uniform agreement. A large number of applied chemists have made an enormous number of measurements of environmental occurrences of lead that are in error by many orders of magnitude. This is a fact which is not controversial. The problem is the length of time it will take for applied chemists to recognize this fact.

"Here at the U.S. NOAA Air Resources Laboratory in Pago Pago, American Samoa, I and my colleagues are determining the input flux of lead to the South Pacific. One of the parameters we measure is the

concentration of lead in the easterly trade winds. It is difficult to properly collect filtered samples of such air that are scientifically significant because of interference by local sources of industrial lead contamination. So elaborate precautions must be taken to properly locate the air collector, which can be operated only a fraction of the time, under suitable meteorological conditions. During the past week, we operated our air filters only for five hours.

"During this same period, an air filter device placed nearby, but at a scientifically meaningless and totally unacceptable location, droned on and on 24 hours a day for days on end, collecting local lead pollution on filters that had been packaged and handled in the most crude and unacceptable fashion. In past years, these filters were sent to the U.S. National Health and Safety Laboratory of the Department of Energy. And their mindless, meaningless analytic results, which have been published as reports of lead in air at remote locations, typify the quality and significance of most of the large amount of environmental lead data now being collected and which will continue to be collected and published for some time to come."

Patterson's case does not end with his showing the analytic errors of others. He believes that the extent to which the modern environment is contaminated with lead is proof itself that we must stop the further mining and smelting of lead. And his argument takes a striking form when he points out that we have already increased our dietary intake of lead 100 times over natural levels, and that a further increase of a factor of only five brings us into the level of acute toxic effects. The "engineering approach," which demands subacute effects to be demonstrated before they are believed, ignores the extent of contamination, he says. No other metal may be increased in the diet by a factor of 100 without deleterious effect.

But not everyone buys this conclusion. The Food and Drug Administration, which has responsibility over canned foods, continues to use the typical levels of lead in canned foods as the reference point. When Patterson, in a paper published in Science last year, showed that tuna packed in lead-soldered cans is contaminated 4000-fold over fresh raw tuna, and that, once again, analyses of fresh tuna by others (including the FDA) came up with values 1000 times too high, FDA called it a "meaningless academic exercise."

"For the quantities of lead that are

found in these food samples, you don't need the kind of laboratory setup that Dr. Patterson has," says Kathryn R. Mahaffey, assistant to the director of FDA's division of nutrition. Patterson calls this circular bureaucratic reasoning. "We have proved that they made 1000-fold errors in the determination of lead in raw tuna. Their response to this disclosure of their flagrant ineptitude is that the FDA is required only to correctly determine when lead concentrations increase above typical levels of about a half a part per million in canned tuna fish." Patterson insists that the reference point is not that half a ppm, but the natural levels, 1000 times less.

Subtle effects
But even while Mahaffey argues that "the assertion that the level of lead in the food supply causes health effects is not proved" and takes Patterson, a geochemist, to task for venturing into biology ("One thing that's important for scientists to recognize is that there are limits to their expertise"), she is convinced that progressively more subtle health effects of low-level exposures to lead will be identified.

This is the second issue, really the other side of the coin of the environmental contamination issue, and one which has also seen considerable progress in the last few years. Spearheading that progress is an epidemiologic study by Needleman of some 2000 school children in the Boston area. His principal finding: "Lead in children at levels below that which bring them to a doctor are dangerous to their brains."

Children are particularly sensitive to lead. They absorb 30-50% of ingested lead, as opposed to the 5-10% that adults absorb. The acute effects of lead poisoning (constipation, vomiting, anemia, swelling of the brain, palsies, and eventually death) are well documented. Needleman set out to document more subtle effects that he believed showed up at subacute exposure levels: behavioral problems, poor intellectual performance, short attention span.

But epidemiology is an inexact science. Even studies that look for acute effects, such as lung cancer, are plagued with uncertainty. It is not unusual to find as many different results as there are epidemiological studies on a given subject. The subtle effects that Needleman was looking for make the problem all the harder. He set out to design a study that would answer the objections leveled at previous studies of the subtle effects of lead, studies that are a morass of

[Figure: Increasing lead production. World production of lead (t/y) plotted against years before present, on a logarithmic vertical scale. Man began releasing lead to the environment with the discovery of cupellation, the process of separating silver from base metals including lead. Source: National Academy of Sciences, "Lead in the Human Environment."]

[Figure: ...and environmental accumulation. Lead in Greenland snow (µg lead/kg snow) versus age of samples; samples of snow from the Greenland ice cap show the increasing contamination of the environment with lead. Source: C. Patterson, California Institute of Technology.]


inconclusiveness. He identified four specific problems:
• Poor markers of past exposure. The almost universally used marker of exposure, blood lead concentration, reflects current exposure only.
• Ascertainment bias. Subjects who agree to participate in a study may differ markedly from those who refuse.
• Weak measures of outcome. Tests of behavior and performance are often limited or insensitive.
• Confounding variables. Factors other than lead exposure may affect the outcome.

Needleman used lead concentrations in lost primary teeth (lead accumulates in teeth as in bone) as the marker of lifetime exposure, and he collected teeth from 70% of the children studied. A variety of double-blind performance tests (IQ, reaction time, sentence repetition) were used along with teachers' ratings of the children's classroom behavior ("distractable," "impulsive," "does not follow instructions") to measure the outcome. And after separating children into high-lead and low-lead groups, Needleman looked at some 30 nonlead variables (medical histories, race, sex, parents' income, parents' attitudes towards school, mother's number of pregnancies) and controlled for five that he found to differ significantly between the two groups.

"High-lead children were found to be significantly less able on a number of performance measures," Needleman says. "High-lead subjects were also significantly more likely to be reported as disordered in their classroom behavior." The high-lead group had a four-point lower mean IQ than the low-lead group. And when Needleman separated the children into six groups according to lead levels, he found a continuous dose-response relationship on each of the items of the teachers' ratings. There appeared to be no threshold.

Insensitive or negative?
The resistance to Needleman's findings may once again be more a matter of politics than science. The lead industry continues to cart around results of studies that Needleman and others call poorly controlled and insensitive; these show no correlation between lead exposure and performance of children. At the recent annual meeting of the American Association for the Advancement of Science, Jerome F. Cole of the International Lead Zinc Research Organization (ILZRO), an industry-supported group, gave considerable


attention to one of these studies, carried out among children who had lived near a lead smelter in El Paso. Cole quoted from the findings of a committee set up by the Society of Occupational and Environmental Health to evaluate that work: "In summary, the committee concluded, reports and data made available in these studies of El Paso have not clearly demonstrated psychological or neurological effects in the children under study." Yet Cole admits in passing that the committee also found that the studies were not powerful enough to demonstrate the absence of health impairments, which is the very essence of why negative results from insensitive studies are of little value.

And that is Needleman's key message: A negative result does not simply cancel out a positive result. "The point at which brain effects show up will depend on the sensitivity of the study and the rigor of the epidemiology," he maintains. But Cole argues that since "there is an impossibility of proving the absence of an effect; you can never, ever prove the negative," the scale is unfairly tipped towards positive findings. He points to an EPA evaluation of 22 studies of health effects of lead; only five of these were considered to be any good, and of these only two (including Needleman's) showed any effects. And from this he concludes that "if there is an effect, it isn't all that severe."

Punctuating this conflict in basic approaches is continuous sniping over details of the scientific methods and conclusions of the various studies. Cole, for example, argues that the four-point difference in IQ that Needleman found is quite small; Needleman responds that "a lot of people who make that point should really know better," since a four-point difference in the mean can translate to a large difference in the wings of the distribution. There may be many more children in the high-lead group who are, say, below 75 in IQ. Cole counters that we don't know what the actual distribution is. Needleman says that the El Paso study has "serious problems"; for example, "a lot of those people did not participate. There was litigation against the industry, and it is possible that the families of the kids most injured, who were bringing suit, would not participate in an industry-sponsored study." Cole claims that the excluded group did not differ in blood lead levels. Needleman argues that blood lead is an inadequate marker of past exposure. And so on.
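Needleman's point about the wings of the distribution can be made concrete with a short calculation. The Python sketch below assumes IQ is roughly normal with a standard deviation of 15 and a placeholder mean of 100 for the low-lead group (assumptions not given in the article, and the very assumption Cole disputes) and asks how a four-point downward shift in the mean changes the fraction of children falling below IQ 75.

```python
# A rough numerical illustration of Needleman's point about distribution tails:
# a small shift in the mean can produce a large relative change far from the mean.
# Assumes IQ scores are approximately normal with a standard deviation of 15;
# the article itself reports only the four-point difference in group means.
from math import erf, sqrt

def fraction_below(threshold, mean, sd=15.0):
    """Fraction of a normal(mean, sd) population falling below `threshold`."""
    z = (threshold - mean) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

low_lead_mean = 100.0                  # placeholder mean for the low-lead group
high_lead_mean = low_lead_mean - 4.0   # the four-point deficit reported

p_low = fraction_below(75, low_lead_mean)
p_high = fraction_below(75, high_lead_mean)

print(f"Fraction below IQ 75, low-lead group:  {p_low:.3%}")
print(f"Fraction below IQ 75, high-lead group: {p_high:.3%}")
print(f"Relative increase: {p_high / p_low:.2f}x")
```

Under these assumptions the fraction below IQ 75 grows by roughly two thirds, even though the mean shifts by only four points; with a different true distribution, as Cole notes, the numbers would differ.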

Both ILZRO and Needleman are now working on prospective studies of lead's effect on children; the results of these studies may take away some of the ammunition of future sniping. Needleman plans to examine children from pregnancy on, measuring their total exposure to lead from air, food, water, and paint.

Unanswered questions
That work may also help to answer one of the genuine scientific questions that remains: What are the relative contributions of different sources of lead to typical human exposures? Getting the answer to that question could do much to halt the runaround among the regulatory agencies. (A recent report issued by the General Accounting Office confirms that the runaround continues. GAO found that the Department of Housing and Urban Development, which is charged with eliminating lead paint from federally assisted housing and which maintains that lead from sources other than paint is more important, is "not fully complying with its own regulations and procedures directed at eliminating the hazards of lead-based paint." Meanwhile, EPA continues to move slowly and cautiously on enforcing its ambient air standard for lead, even after it was recently upheld by the Supreme Court.)

Patterson is quick to insist that research on the relative contributions of different sources should not be relevant to regulatory strategies, however. He is convinced that the scientific evidence already in hand makes the case for eliminating exposures from all sources, and for halting mining of lead altogether. The political reality, though, is that technical fixes will be the short-term answer to lead pollution, and that until the buck-passing among the agencies stops, even these fixes will not be enforced.

Patterson points to a second unresolved scientific issue, one that is a sobering reminder and a capsule summary of the extent of lead contamination of the modern environment. "It is highly probable that many biochemical processes within the cells of humans and animals, universally subjected to lead pollution on a widespread scale, are highly perturbed and unnatural. It is a matter of great scientific interest to discover how natural biochemical processes operate under lead-free conditions. This means that such studies should be carried out in the future in lead-free sanctuaries." In our lead-contaminated world, Patterson is saying, there are no unperturbed controls.
-Stephen Budiansky


Environmental health issues
How does science assess the hazards of chemicals to humans? What are the institutional mechanisms for risk assessment? How can better, scientifically valid regulatory decisions be made?

Generally speaking, science is the underpinning of many regulatory judgments in the environmental health field. In the past, federal agencies often failed to draw on the best scientific experts and to expose their preliminary judgments to scientific review. Only recently have some agencies attempted to incorporate the peer review process in their selection of research projects. During the next few years, the administrative process for testing presumable facts and making regulatory decisions concerning environmental health hazards is likely to come under special scrutiny.

Emerging scientific issues that are important to government and to those influenced by government policies, regulations, and statutes were summarized in the annual program plan (1981) of the National Academy of Sciences-National Research Council's Board on Toxicology and Environmental Health Hazards (ES&T, June 1980, p. 648). This plan was put together by its ad hoc group on critical issues of the 1980s.

One issue is short-term testing of chemicals. These tests are inexpensive, rapid, and have generated a very large data base. Many, especially the microbial tests for carcinogens and mutagens, are being applied before being extensively validated, according to this NAS-NRC document. However, some of these mutagenicity tests, such as the widely used Ames test, have been extensively validated qualitatively: about 80-90% of carcinogens are detected as mutagens. Another question arises: What priorities should be established to decide what further testing the hundreds of test-positive chemicals should undergo?

Scientific decisions must meet high professional standards and must not be skewed by anticipated regulatory consequences. Many agree that the risk-assessment function should be removed from the regulatory apparatus entirely. The suggestion has been



made that NAS and its components could perform the risk-assessment function. But not all agree; because this proposal raises the difficult issue of having one scientific group doing all risk-assessment evaluations, it will be debated within both the political and scientific communities during the 1980s.

Ad hoc group on critical issues of the '80s
(Ad hoc group of the NAS-NRC Board on Toxicology and Environmental Health Hazards)
Richard Merrill, chairman of the group, is at the Law School, University of Virginia. John Rake, cochairman, is at the National Institute of Environmental Health Sciences. Ronald Estabrook is at the Department of Biochemistry at the University of Texas Medical School in Dallas. Julius Johnson is at the Dow Chemical Co. Ian Nisbet is director of the scientific staff of the Massachusetts Audubon Society.

In making regulatory decisions involving scientific issues, government agencies have been criticized severely for acting precipitously, for taking too much time, and for distorting or neglecting good science. Such criticism

has generated numerous proposals for reforming the administrative process. A majority support the proposition that it is important to separate the assessment of health risks from the selection of regulatory options. In response to this, EPA set up a Cancer Assessment Group as a procedural process about five years ago, but it is too early to have a definitive report on this.

Early experiments
In the '80s, all regulatory agencies will explore other administrative techniques to assure the integrity of their scientific judgments. To do so, however, some administrative procedures must be changed. For example, the current federal conflict-of-interest policy prevents an agency from using scientists who work in the private sector. This policy makes it difficult for agencies to employ, even temporarily, scientists who benefit from industry research support. To correct this difficulty, Rep. W. C. Wampler (R-Va.) has proposed legislation that would create a quasi-independent body of scientists, some of whom could be affiliated with industry, which would perform the risk-assessment function for all federal regulatory agencies. The objective of this proposal is not merely to assure that risk assessment and regulatory analysis are treated as distinct functions, but to remove the former function from the regulatory apparatus entirely.

In the '80s, there will be numerous experiments with other types of procedures for finding scientific facts and resolving conflicts. The approaches already proposed include procedural innovations such as the "science court," which has not functioned to date, and FDA's use of "a scientific board of inquiry," which has only been used once. However, these approaches are so diverse that one cannot identify


a "best" model, according to the NAS-NRC document. FDA persuaded the makers of aspartame, a new artificial sweetener, and the opponents of its approval to submit their dispute to a jointly selected panel of nongovernment scientists. This group heard witnesses, evaluated testimony, and produced a recommended decision. This approach substituted an informal hearing before a panel of experts for the traditional trial before an administrative judge.

A potency index
Knowing which environmental chemicals pose the greatest hazard to humans would greatly aid the regulatory process. The public now assigns equal significance to its exposure to saccharin, trichloroethylene in decaffeinated coffee, chloroform in drinking water, and aflatoxin in peanut butter, but the potency of these cancer-causing chemicals varies considerably. For years, attempts have been made to develop a carcinogenic potency index for chemicals, an index that would take into account both the extent and level of human exposure to chemicals as well as the vast differences in intrinsic carcinogenicity. Such a carcinogenic potency value for specific environmental chemicals could then be used to improve risk assessment, risk-benefit judgments, and regulatory policy.

For several years, Bruce N. Ames and colleagues at the University of California at Berkeley have been developing a carcinogenic potency index. At the 1980 Cold Spring Harbor Laboratory conference, Ames reported that quantitative information on the capacity of various chemicals to cause cancer in man is preferable, but with rare exception these data are not available. There is, however, an abundance of research reports of animal bioassays on hundreds of chemicals. They therefore have compiled a comprehensive data base which incorporates worldwide animal bioassays that, in their view, are suitable for determining a carcinogenic potency index. But most workers in the field agree that there are many uncertainties in extrapolating animal cancer results to man.

The Ames et al. carcinogenic potency index is called the tumorigenic dose, TD50; it is the daily dose that results in tumors in 50% of otherwise tumor-free animals over a standard lifetime. This dose rate is usually not far from an actual dose used in an experiment that yields statistically positive results; only a small extrapolation from experimental observation is necessary, according to Ames's

progress report.

Ames declared that the differences in potency of different carcinogens can be enormous. For example, the TD50 for saccharin fed to rats is 10 g/kg of body weight per day. For aflatoxin, the TD50 is 1 µg/kg of body weight per day: ten million times less. Ames reported that computer programs have been developed to estimate a TD50 together with its confidence limits; the programs also analyze the shape of the dose-response relation and the probability that the TD50 is significant. The computer file will eventually

contain this information, plus tumor site and type, on more than 600 chemicals. Many will include multiple tests that use different strains and species. Ames reported that an error check is being completed on the data base. In the meantime, Ames and his colleagues hope to examine the TD50 values of chemicals that have already aroused public concern because of widespread human exposure, such as DDT, dioxin, benzene, saccharin, benzo[a]pyrene, vinyl chloride, ethylene dibromide, and ethylene dichloride.
-Stanton Miller
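The TD50 definition quoted above translates directly into a small calculation. The Python sketch below fits a simple one-hit dose-response model to a hypothetical bioassay and solves for the dose that produces tumors in half the animals; it is only an illustration of the idea, not Ames's actual program, and every number in it is made up except the saccharin and aflatoxin TD50 values quoted in the text.

```python
# Illustrative sketch of a TD50-style estimate using a one-hit model,
# P(tumor at dose d) = 1 - exp(-b * d). This is NOT Ames's actual program;
# the bioassay numbers below are hypothetical.
import math

# Hypothetical lifetime bioassay: (dose in mg/kg body weight per day, tumor incidence)
bioassay = [(0.5, 0.12), (1.0, 0.22), (2.0, 0.40)]

# Least-squares estimate of the one-hit slope b from -ln(1 - P) = b * d
num = sum(d * (-math.log(1.0 - p)) for d, p in bioassay)
den = sum(d * d for d, _ in bioassay)
b = num / den

# TD50 is the dose at which half the animals develop tumors: 1 - exp(-b*d) = 0.5
td50 = math.log(2.0) / b

print(f"Estimated one-hit slope b = {b:.3f} per (mg/kg/day)")
print(f"Estimated TD50 = {td50:.2f} mg/kg of body weight per day")

# Potency differences span many orders of magnitude; for the figures quoted above,
# saccharin (TD50 = 10 g/kg/day) vs. aflatoxin (TD50 = 1 µg/kg/day):
print(f"Saccharin/aflatoxin TD50 ratio: {10.0 / 1e-6:.0e}")
```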


Hazardous waste landfills
Though generally performance-oriented,

Despite the ongoing development of sophisticated hazardous waste management (HWM) methods (high-temperature incineration, deep-well injection, starved-air pyrolysis, and, in some cases, physical-chemical treatment and recovery), landfills will probably be the predominant method for dealing with the nearly 57 million metric tons/y of such waste produced in the U.S. Though fraught with problems, landfills continue to be the least expensive, in comparison with the other technologies, and may account for up to 80% of the hazardous waste disposed of.

Landfills will be much more strictly regulated in the future. Regulations will be aimed at making certain that no hazardous wastes interact dangerously within a fill or migrate from it and contaminate the surrounding air, soil, and especially surface and groundwater. The rules will encompass geological formations, rainfall runoff, infiltration, leachate control and collection, and numerous other factors.

Performance
The regulation of landfills will take effect in three phases; the first two are already in force. Technical standards for landfilling will be performance-oriented. Essentially, this means that while EPA requires certain standards to be met, "how they are met would be left up to the facility's operator," explained Wayne Tusa of Fred C. Hart Associates (New York, N.Y.). This approach, described in the May 19, 1980, Federal Register and subsequent regulations, has been


characterized as "best engineering judgment." Its development will be required for each permit application, Tusa pointed out. At this writing, technical standards for Phase II are not yet available, nor have they been published in the Federal Register. They are expected to cover drawings, specifications, and engineering design and documentation of any site improvements.

There are two major caveats concerning coming regulations in the HWM field. One deals with the role that states will play and how they may run their programs; it is expected that up to 40 states will seek primacy. The other is the effort to bring about "relief from 'burdensome' regulations," as advocated by the new Reagan administration. As this went to press, President Reagan ordered a 60-day freeze on certain regulations promulgated during the waning days of the Carter administration. This freeze adds a measure of uncertainty to the nature and timing of regulations concerning landfills, as well as many others.

Closure plans
Under the regulations, before a landfill commences operations, a detailed closure and postclosure plan must be filed with EPA and appropriate state authorities. This plan must indicate the steps necessary to close the fill after it can no longer accept wastes, to cover it with an impermeable cap, and to maintain it after closure so that it does not threaten the environment and public health. At least 180 days prior to the

planned closure date, EPA must approve the plan. Within six months of final waste deposition in the fill, closure must be completed and certified by a registered professional engineer.

A principal aim of regulations concerning landfill operation and closure is to make certain that a fill is secure. This means, in essence, that no leachate or other contaminant may escape from the fill and cause adverse impacts on the surface or groundwater. Ideally, making a landfill secure would prohibit, for all time, any contact between the hazardous wastes and the fill's surroundings. In practice, the landfill's owner or operator must care for the site for 30 years after closure to prevent such contacts, unless he can prove to EPA's satisfaction that care for that length of time is unnecessary. Leakage from the site is not acceptable during or after operations. Neither is any external or internal displacement, which could be brought about by slumping, sliding, and flooding. Wastes must not be allowed to migrate from the site, John Lynch of ERT, Inc. (Concord, Mass.) reminded a recent Washington workshop sponsored by that firm. He explained that the primary goal of these requirements is to prevent groundwater contamination.

Much easier said than done, perhaps. Lynch pointed out, "It is next to impossible to create an impervious burial vault [for hazardous wastes] and guarantee its integrity forever." He noted that for this reason, a secure landfill carries with it an associated liability. Lynch advocated using landfilling as "the method of last resort" and employing "every conceivable" HWM technique to reduce the amount of material to be disposed of. These techniques may consist of biological, chemical, physical, or thermal treatment, or involve the separation and removal of specific components.

The secure landfill must be engineered to ensure that rainwater (or snow melt) is not impounded. If precipitation and runoff have contacted the waste in active, exposed areas of the fill, that water must be collected and treated as hazardous waste. Even if this water has been treated and proven to EPA's satisfaction that it is no longer hazardous, its discharge could still be subject to provisions of the Clean Water Act.

Permeability
The fill should be sited to prevent flooding, ponding, and direct surface runon (runoff flowing onto the fill). Exposed waste surfaces in active areas should be minimized, and free liquids must be excluded. A free-liquid problem exists if, when a 1-5-kg sample of solid waste is placed on a sloping glass plate for at least five min, the liquid drains or the material itself flows. The subsurface should contain no rock fractures, fissures, or soils of high permeability. If these geologic formations exist, they must be permanently sealed, so that an unbroken barrier prevents any liquid movement from the site to usable surface or subsurface water.

Not only might subsurface sealing be required, but an impermeable liner may be needed for the sides and bottom to secure the landfill. A liner may not be required if the site is in an arid region where evaporation exceeds precipitation by at least 20 in./y and if nature provides a sufficiently thick natural liner (a very heavy clay, for example) that is properly configured geologically with respect to shape, depth, and certain other factors. Otherwise, "imported" clay or a synthetic membrane layer, documented to be chemically resistant to waste materials in the fill, must be installed. A rule of thumb for a clay liner calls for a thickness of at least 5 ft and a permeability not to exceed 10⁻⁷ cm/s, or equivalent. That is, if the clay layer is thinner, it must be less permeable, and vice versa. A synthetic liner must have similar overall impermeability characteristics; Arthur Lazarus of ERT told ES&T that some flexible membrane liners have achieved permeabilities lower still.

Regulation: three phases
The regulation of landfills will take effect in three phases; the first two are now in force. Under Phase I, in effect since last Nov. 19, an owner or operator of a hazardous waste management facility was to notify EPA by last Aug. 18 that he had an operating facility. Then, he should have submitted a Part A Resource Conservation and Recovery Act (RCRA) permit application by last Nov. 19. If approved, the application covered the facility under Interim Status Standards (ISS). This ISS stage was a "passive" program, in that EPA (or states) did not actively issue permits, or routinely review and approve plans and procedures for these existing facilities. The regulations themselves specified terms of compliance at that stage, and only compliance documentation and groundwater monitoring reports had to be filed with EPA and states. However, inspection programs are being set up. Also, only facilities that filed Part A in a timely manner are covered by ISS. New ones, far more tightly and actively regulated, must file both Parts A and B.

Both existing and new site owner/operators must file Part B of a RCRA permit application, under Phase II regulations that were signed by the EPA administrator on Jan. 12. This part will include waste analysis, inspection schedules, and contingency plans. Within applicable regulatory guidelines, permits granted (all sites will eventually require permits) will be based on the "best engineering judgment" of permit writers.

Phase III will involve the development of considerably more detailed technical standards for use in issuing RCRA permits. Perhaps, in part, because of an incomplete scientific/engineering understanding of many aspects of hazardous waste management, that phase may not be implemented for five years.

Leachates
In other than arid regions, the owner or operator would most likely have to install a leachate collection and treatment system, which could be costly and troublesome. One approach calls for the impermeable clay, lining the fill's bottom and sides, to be overlain by the leachate collection system itself. This system is composed of a layer of sand or gravel at least 12 in. thick. A polyvinyl chloride perforated piping system, normally 4-6 in. in diameter, is installed at intermediate and low points in the sand or gravel layer. This piping actually collects the leachate and conducts it away for treatment.

The absence of impermeable clay on site, especially coupled with the presence of porous materials (silt, sand, or gravel, for example), creates the most difficult and costly situation. After lining the fill on bottom and sides with properly impervious material, the owner or operator would install two leachate collection systems, with a bottom one acting as a backup to collect fluid that may escape the top system. These systems would be separated by a layer of compacted clay, or of sandy silt if it is impractical to bring clay to the site.
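For a sense of what the clay-liner rule of thumb above implies, the Python sketch below applies Darcy's law to estimate steady seepage through a 5-ft clay liner at the 10⁻⁷ cm/s limit. The 1-ft leachate head assumed on top of the liner is a hypothetical example value, not a regulatory figure.

```python
# Rough Darcy's-law estimate of steady seepage through a clay liner.
# Uses the rule-of-thumb liner specs quoted above (5 ft thick, K = 1e-7 cm/s);
# the 1-ft leachate head ponded on the liner is a hypothetical example value.

K_CM_PER_S = 1e-7          # saturated hydraulic conductivity of the clay liner
THICKNESS_CM = 5 * 30.48   # 5 ft of compacted clay
HEAD_CM = 1 * 30.48        # assumed 1 ft of leachate ponded on the liner

# Darcy flux per unit area for downward flow: q = K * (head + thickness) / thickness
gradient = (HEAD_CM + THICKNESS_CM) / THICKNESS_CM
q_cm_per_s = K_CM_PER_S * gradient

SECONDS_PER_YEAR = 365.25 * 24 * 3600
cm_per_year = q_cm_per_s * SECONDS_PER_YEAR
liters_per_m2_per_year = cm_per_year * 10.0   # 1 cm of water over 1 m^2 = 10 L

print(f"Darcy flux: {q_cm_per_s:.2e} cm/s")
print(f"About {cm_per_year:.1f} cm/y, or roughly {liters_per_m2_per_year:.0f} L per m² per year")
```

Under these assumptions the liner still passes a few centimeters of leachate per square meter each year, which is why collection systems and long postclosure care periods are required in the first place.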

[Figure: Landfill leachate collection system, in two variations (middle layer of 3 ft of compacted clay, or of compacted sandy silt or silty sand, the latter being the most difficult situation): flexible membrane or soil/bentonite liner, 12+ in. of sand or pea stone, and a backup PVC perforated drain pipe. Source: presentation by ERT's Arthur Lazarus at the workshop.]

Landfill cells
To aid in the safeguarding of surface and groundwater, the atmosphere, and surrounding land, hazardous wastes must be placed in "cells" within the fill for management and storage purposes. Cells are essentially discrete waste storage areas with liners which are sealed from neighboring cells, and from their other surroundings, as applicable regulations prescribe. The site operator must keep careful records of the location and dimensions of each cell, and must depict each cell on a map keyed to permanently surveyed vertical and horizontal markers, Fred C. Hart Associates' Tusa and ERT's Lynch pointed out. Records must show the contents of each cell and the approximate location of each hazardous waste type within the cell. Incompatible wastes must be placed in different cells within a fill so that they never come into contact with each other; otherwise explosions or other undesirable reactions within the cell and fill could occur. Also, when a cell is full, it should be quickly sealed off, covered, and revegetated in order to curtail wind and water erosion, and leachate production.

Gas collection
In some landfills, when wastes (particularly biodegradable ones) are covered over and sealed off, potentially malodorous and hazardous gases may be generated. For that reason, along with the various provisions for site security, public access restriction, and the like, the fill owner/operator must often include plans for monitoring, collecting, and venting these gases. Many hazardous waste management experts say that "best engineering judgment" calls for such plans to be included as a regular practice. These gases include carbon dioxide, hydrogen sulfide, methane, and several others. If not vented (in accordance with any applicable Clean Air regulations), they could build up enough pressure beneath an impermeable landfill cap to rupture the cover.

To minimize gas generation, wastes that might cause this problem could be biodegraded before their deposition in the fill. But if this cannot be done, or if it is suspected that gas may be a problem in any case, a perforated pipe must be placed in a gravel layer atop the waste, but under the landfill's impermeable cap. This pipe collects the gas, which is then conducted to a riser for safe venting to the atmosphere. In certain instances, the production of gases in landfills may offer some benefits. A few enterprising organizations are engaged in collecting the fill gas; scrubbing out carbon dioxide, hydrogen sulfide, and other impurities; and retaining combustible


hydrocarbon gases, mainly methane, for industrial or residential distribution and sale.

Watch on groundwater
The purpose of groundwater monitoring is to ensure that programs for managing runon, runoff, and leachates are functioning properly so that groundwater remains uncontaminated. If contamination is occurring, early warning can be given and countermeasures taken. For openers, a site owner/operator has to place at least four monitoring wells (one up-gradient, three down-gradient) around the limits of the facility, and set up appropriate sampling and analysis programs. The regulations set forth, in detail, how the monitoring wells must be sunk, screened, sealed, sampled, and located, with special emphasis on location of the down-gradient wells.

Groundwater monitoring regulations cover certain organic chemicals whose presence, even at concentrations as low as 10-50 parts per billion (ppb), is feared to cause long-term adverse health effects. To detect such compounds, very sophisticated sampling and analysis techniques are required. For instance, sample collection and handling must be done so that volatiles are not lost through evaporation or air stripping. Analysis requires methods such as gas chromatography and mass spectrometry. Note that emphasis on organics does not reduce concern with inorganic contaminants and their monitoring requirements. General groundwater quality, especially the suitability of the uppermost aquifer for use as a drinking water source, must be assured in keeping with EPA's interim primary drinking water standards, pursuant to

the Safe Drinking Water Act. Tests must be run quarterly for the first year, annually thereafter, to monitor this water horizon. They cover at least 21 contaminants, such as arsenic, cadmium, chromium, lead, endrin, radium, 2,4-D, toxaphene, Escherichia coli, phenol, and inorganics such as chloride, iron, manganese, sodium, and sulfate. If the landfill is leaking to the groundwater, the site operator must file an assessment plan with EPA. Other testing covers parameters such as pH, specific conductivity, total organic carbon, and total organic halogens, ERT's Richard Cadwgan explained.

All analytical results must be reported to the EPA regional administrator. During the first monitoring year, reports must be submitted within 15 days of completion of quarterly analyses. A special listing must be made of contaminants present in amounts in excess of allowable concentration levels set forth in the applicable regulations.
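The exceedance listing just described is, computationally, a simple comparison of measured concentrations against allowable limits. The Python sketch below illustrates it with placeholder limits and measurements; the values are made up for the example and are not the actual interim primary drinking water standards.

```python
# Hypothetical sketch of the "special listing" of contaminants that exceed
# allowable concentrations. Limits and measurements below are placeholder
# values for illustration only, not the actual regulatory standards.

allowable_mg_per_l = {
    "arsenic": 0.05,
    "cadmium": 0.01,
    "lead": 0.05,
    "endrin": 0.0002,
}

# One quarterly sample from a hypothetical down-gradient monitoring well
measured_mg_per_l = {
    "arsenic": 0.02,
    "cadmium": 0.03,
    "lead": 0.07,
    "endrin": 0.0001,
}

exceedances = {
    contaminant: (value, allowable_mg_per_l[contaminant])
    for contaminant, value in measured_mg_per_l.items()
    if value > allowable_mg_per_l[contaminant]
}

for contaminant, (value, limit) in sorted(exceedances.items()):
    print(f"{contaminant}: {value} mg/L exceeds allowable {limit} mg/L")
```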

Doubts and alternatives
Even with all of the existing and forthcoming administrative and technical standards pertaining to secure landfills, some doubts have been expressed whether surrounding soil, air, and surface and subsurface water can be as well protected as the regulations' developers and drafters envisioned. For instance, in a recent study, the U.S. General Accounting Office (GAO) concluded that landfilling as a disposal method "presents the greatest potential risk for surface and groundwater contamination and liability for damages." Still, as the least costly option to date, secure landfill disposal of hazardous wastes "will most likely remain attractive," GAO predicted.

[Table: Incineration processes: liquid injection, fluidized bed, multiple hearth, coincineration, and starved-air combustion/pyrolysis, with operating temperature ranges (850-1800, 450-1000, 315-1000, 150-1800, and 480-800, respectively, as listed) and residence times from a fraction of a second for liquids and gases to hours for solids.]
The hazardous waste management community is also looking closely at waste reduction and destruction methods, especially high-temperature incineration, usually in excess of 1000 °C. Some HWM experts foresee incineration as a principal commercial-scale technology within the next several years. GAO called for consideration of incineration, despite its higher dollar and energy costs, because of the possibilities of reduced land demands and environmental risks. For these reasons, EPA is also encouraging incineration, Eugene Crumpler, Jr., of EPA told a Washington conference on consolidated permits sponsored by The Energy Bureau Inc. (New York, N.Y.). Crumpler said that tests run by EPA and others "demonstrated that a broad range of hazardous wastes, especially organics, can be destroyed with existing incinerator technology."

Other HWM scientists and engineers are not so confident. They maintain that high-temperature incineration is in its technological infancy. These experts add that, in their view, even when the technology matures, its application would be limited to the destruction of highly hazardous and refractory organic wastes, such as PCBs, mainly because of high costs and energy needs. Even though secure landfilling will probably be the principal method for a number of years to come, the search for technically and economically practical alternatives to landfilling must be encouraged, developed, and brought on-line as soon as possible.
-Julian Josephson

Additional reading
Code of Federal Regulations. 40 CFR 122.25, for Part B of the RCRA permit application; 40 CFR 264 and 265 for general landfilling regulations and requirements.
Federal Register, May 19, 1980; Nov. 19, 1980; Jan. 12 and 26, 1981.
"Hazardous Waste Disposal Methods: Major Problems With Their Use"; Report No. CED-81-21; GAO, Document Handling and Information Services Facility, Box 6015, Gaithersburg, Md. 20760.
"A Method for Determining the Compatibility of Hazardous Wastes"; EPA-600/2-80-076; Municipal Environmental Research Laboratory, Office of Research and Development, U.S. EPA, Cincinnati, Ohio 45268, April 1980; see also supplemental Caution Notice for Compatibility Chart, November 1980.
Report No. PB-293-335; U.S. Department of Commerce.
Note: An EPA report concerning change in liner permeability under conditions of increased hydraulic pressure head should be coming out shortly.

Classical or radical?
The case for ascribing the toxic effects of NO2 and ozone to free-radical intermediates was discussed at a session of the AAAS meeting.

The hows and whys of the toxicity of ozone and NO2 stubbornly remain a mystery. Although the effects themselves have been recognized for decades, only recently has a serious effort been made to get at the mechanism; and one principal mechanism that has been suggested, peroxidation of the target cell's fatty acids by free-radical intermediates, is raising as many questions as it is answering.

"Radical intermediates for NO2 toxicity have to be looked at with some skepticism," said J. Brian Mudd, a biochemist at the University of California at Riverside. In a review presented to a symposium at the annual meeting of the American Association for the Advancement of Science ("Radicals and the Biosphere," Jan. 8, 1981, Toronto), Mudd noted that much of the evidence for, or against, radical intermediates is by


necessity indirect. Faced with this obstacle, he said, "Two extreme approaches can be taken. One is to analyze the biological system after exposure. This is the most biological approach, but what you analyze may be the consequence of a series of reactions. The other approach is what I'd like to call synthetic: to build up a chemical scheme that can eventually be tested in the biological system."

Mudd, taking the synthetic tack, described several well-known reactions of NO2 with fatty acids, or lipids, that can lead to peroxidation (see Figure 1). But fitting these simple chemical pieces into a model of what goes on in an actual living organism leaves some ragged edges. In Mudd's terminology, the results of the analytic approach and the results of the synthetic approach are not meshing. Tests that have attempted to demonstrate lipid



peroxidation in the lungs of animals exposed to NO2 have, overall, been inconclusive. And in plants, which suffer lesions of the leaf tissue when exposed to high levels of NO2, the apparent metabolism of NO2 can be explained without invoking free radicals. The NO2 appears to dissolve in water to form nitrite and nitrate; these compounds are then handled through the plant's normal channels: nitrate is reduced to nitrite and then to ammonia. Evidence for this is found in the higher leaf concentration of nitrite and nitrate and the greater ammonia production in exposed plants.

Daniel B. Menzel of the Cancer Institute of Hawaii outlined some other sorts of evidence that may help settle the matter. "Direct evidence, in vivo, is rather scant," he said; but indirect indications of lipid peroxidation are available in vivo. Lipid peroxidation gives rise to ethane and pentane in the breath of exposed animals; it liberates malonaldehyde; and it changes the ratio of saturated to unsaturated fatty acids.

Ozone is a more complicated case, presenting a bewildering array of possible reaction sequences and intermediates. Water, ever-present in biological systems, adds an extra dimension of complexity. The well-understood ozone reactions of classical organic chemistry, which give rise to well-characterized intermediates, are almost all carried out in nonaqueous media, and so do not apply. With water present, ozone can break down to several radicals: O3−·, O2−· (superoxide), and OH· (hydroxyl).

A general scheme of indirect evidence that can help sort out the details of ozone's effect in living organisms makes use of the specificity with which differing substrates are attacked (see Figure 2). But the model toxicological reactions of lipids with ozone are still inconclusive; both free radicals and

the classical products of ozonolysis are produced in the process. In reactions with proteins, however, Mudd argues that the results are more straightforward. Laboratory reactions of ozone with protein seem to be characteristic of direct ozonolysis. In particular, the oxidation is highly selective for certain amino acid residues along the protein chain, one of the markers of a direct attack, as shown in Figure 2. Mudd also reported that in experiments with higher systems (red blood cells), which contain both lipids and proteins, the proteins were much more susceptible to oxidation than were the lipids. Susceptible proteins were even found on the inner cell membrane, indicating that ozone passed through the lipid membrane layer without significant reaction taking place.

But the evidence in all cases is tantalizingly indirect. "We're dealing with transient, reactive species that cannot be readily identified," Anne P. Autor of the University of Iowa told ES&T. Autor, who chaired the AAAS symposium, explained that some ambiguity remains in the results obtained with the methods available for identifying radicals, principally electron-spin resonance spectroscopy and reaction with a selective scavenger. "The precise and unequivocal identification of the radicals is still in doubt," she said.

The absence of direct proof has created an impasse. "We're stuck at this point," said Autor. "At the moment we have a lot of inferential evidence. We know that radicals are probably involved, but a really hard demonstration is yet to be had." She believes that the consensus in the field is that new analytical techniques will be required to come up with that demonstration and to settle the sorts of questions raised by Mudd.
-Stephen Budiansky

[Figure 2: Ozone's routes of attack. Schematic of the ways ozone can attack a substrate S, distinguishing direct attack from attack by breakdown products such as O2−· (reduction of S, high selectivity) and H2O2.]