Environmental News

Unknown mechanism for airborne mercury

The mercury belched out of coal-burning power plants seems to change as it travels downwind, according to new research published in this issue of ES&T (pp 3848–3854), but just how that happens remains unclear. If verified, the research confounds the current understanding of what happens to mercury carried in the atmosphere—and suggests that new regulations on power plants' mercury emissions may not have the desired impact.

[Photo: Coal-burning power plants disperse mercury in far-traveling plumes, but the toxic metal may be undergoing some unknown transformation as it moves through the atmosphere. PHOTODISC]

Last year, the Bush Administration finalized the Clean Air Mercury Rule, which imposes cutbacks on mercury emissions from coal-fired power plants, capping total U.S. output at 15 t by 2018. The effectiveness of the rule depends on the form of mercury emitted from power plants. Elemental mercury (Hg0) can linger in the atmosphere for years. Reactive gaseous mercury (RGM; Hg2+), on the other hand, is water-soluble and washes out of the air within days, close to its source. As a result, controlling Hg0 will probably require international efforts, but countries can target RGM.

Studies have tracked mercury deposition, but monitoring mercury in its different forms during transport remains tricky, particularly after it is emitted from coal-fired power plants, according to experts. Lead author Kristen Lohman of Atmospheric & Environmental Research, Inc., a consulting firm, and her coauthors tackle this issue by combining, for the first time, monitoring data with a model of discrete plumes of material emitted from power-plant smokestacks. The authors focused their efforts on one monitoring station downwind of several power plants in Georgia (owned by Southern Co., an energy firm that cosponsored the research). There, the team collected data from nine different plumes that wafted over the station, which they could pinpoint to one of four nearby power plants by the samples' sulfur content and timing. They found that levels of RGM decreased significantly downwind of the power plants, up to 10-fold. However, their findings contrast sharply with simulations of the plumes generated by ROME (Reactive & Optics Model of Emissions), a 2D physical and chemical model developed by Christian Seigneur of Atmospheric & Environmental Research, a coauthor of the new results. The ROME simulations predict that much more RGM should have been present.

"Either the models are missing the physics of what's happening in the atmosphere or the measurements are not perfect," Seigneur says. The researchers propose that the models are more likely to be suspect, and that an unidentified reaction that transforms RGM to Hg0 could account for the absent mercury.

"This is probably the first time a model of that scale has been matched with the field data that they have, to look at the change in speciation of mercury in transport," says Steven Lindberg, an emeritus fellow of Oak Ridge National Laboratory. Although Lindberg and others say that the results are not definitive, the work "comes closest to making the case that we need to understand this better," he says, while stressing the uncertainties in the research.

Those uncertainties stem largely from the fact that the team "did not directly measure the RGM emission rate from the power plants they were looking at," says Russ Bullock of the U.S. National Oceanic and Atmospheric Administration's Air Resources Laboratory. "They estimated the amount of mercury, [which means] you have to question whether they had the right number for the emission rate to start with" for downwind modeling.

Larry Bruss of the Wisconsin Department of Natural Resources, which has advocated for immediate reductions of mercury emissions from power plants, notes that the distances from the power-plant sources to the monitoring station further muddy the concentration measurements. He adds that the limited number of plumes reported affects the accuracy of the modeling results. Bruss's colleague John Heinrich points out that "some recent papers have suggested that the pool of Hg0 may transform into RGM," the opposite of what the researchers propose.

The authors acknowledge some of these criticisms. They point out that inaccurate measurements of mercury released from a source or increased dry deposition, which is difficult to pin down, could affect their findings. In addition, decay of RGM or reactions with SO2 could transform mercury into other species, they write.

Bench studies to identify a potential mechanism are in the works right now, says Leonard Levin, a mercury specialist with the Electric Power Research Institute, an industry consulting firm. However, the results are still preliminary, he adds. "Even though we're finding all this evidence for [mercury transformation] in the data, it's still an incomplete phenomenon because we don't have the chemical mechanism," Levin says. —NAOMI LUBICK
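The policy stakes rest on simple removal arithmetic: a form of mercury that survives for years travels globally, while one that washes out within days deposits near its source. Below is a minimal sketch of that first-order reasoning; the lifetimes, travel times, and function names are illustrative assumptions for this example, not values from the study.

```python
import math

def fraction_airborne(hours_downwind: float, lifetime_hours: float) -> float:
    """First-order (exponential) removal: the fraction of emitted
    mercury still airborne after a given travel time."""
    return math.exp(-hours_downwind / lifetime_hours)

# Illustrative e-folding lifetimes only -- not numbers from the ES&T paper.
RGM_LIFETIME_H = 3 * 24     # reactive gaseous mercury: washes out within days
HG0_LIFETIME_H = 365 * 24   # elemental mercury: can linger for about a year

for hours in (6, 48, 240):  # roughly plume, regional, and continental scales
    rgm = fraction_airborne(hours, RGM_LIFETIME_H)
    hg0 = fraction_airborne(hours, HG0_LIFETIME_H)
    print(f"after {hours:>3} h: {rgm:5.2f} of RGM, {hg0:5.3f} of Hg0 still airborne")
```

Under these assumed numbers, nearly all of the Hg0 remains aloft even after ten days, while most of the RGM has already deposited. That is why an unidentified in-plume reaction converting RGM to Hg0 would shift mercury from local deposition into the global pool, changing what the Clean Air Mercury Rule actually buys.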
News Briefs

Ozone, traffic, and developing countries
Researchers have modeled the potential climate impacts of extra ozone, along with nitrogen and other chemicals that will be belched out in the next half century. In a model that analyzed automobile emissions, local air pollution, and photochemical smog, the team showed that countries in the Northern Hemisphere generally increase their radiative forcing because of summertime driving. Radiative forcing denotes the amount of heat absorbed by the planet’s surface. In the Journal of Geophysical Research (doi 10.1029/2005JD006407), the researchers report that Northern Hemisphere emissions will increase radiative forcing by 0.05 W/m2 by 2050. Analyzing projected changes for global driving habits, they found that summertime radiative forcing would spike by 0.27 W/m2 by 2050.
Clean diesel for U.S. fleet

The U.S. EPA has entered into an agreement with a company to evaluate the commercial viability of new turbochargers, air management systems, and electronic sensors for use with cleaner diesel and high-efficiency gasoline engines. Ensuring that such engines run cleanly yet remain cost-effective presents a technical challenge, according to the agency. Diesel vehicles are inherently more fuel-efficient than their gasoline counterparts, but they currently represent only 0.4% of all new car sales in the U.S. However, the popularity of diesel passenger vehicles has grown 80% since 2000. The research firm J. D. Power and Associates predicts that U.S. consumers will purchase 2× as many diesel-powered vehicles by 2012.

Sulfur leads to methylmercury

Researchers have long worked under the premise that acid rain triggers the production of methylmercury (MeHg). Now, the first large-scale field experiment on the interactions of mercury and sulfate in wetlands, described in this issue of ES&T (pp 3800–3806), provides the most persuasive evidence yet that acid rain increases production of MeHg. The "cause-and-effect relationship" between sulfur and mercury deposition from the atmosphere has been "demonstrated in the lab and in the field in small-scale experiments," says Charles Driscoll of Syracuse University. But this is the first such large-scale ecosystem experiment, he points out.

Jeff Jeremiason of Gustavus Adolphus College and colleagues took an acre-sized patch of a wetland in Minnesota and peppered half of the site with sprinklers that could simulate rainfall with sulfate loads 4× as high as annual background levels. The sulfate levels used in the test are equivalent to historic levels of sulfate deposition in the northeastern U.S., the authors note.

[Photo: Edward Swain of the Minnesota Pollution Control Agency works on a sprinkler head designed to deliver sulfate-bearing "rain" to a freshwater wetland in northeastern Minnesota. DAN ENGSTROM, COURTESY OF JEFF JEREMIASON]

The team applied the first 6-h sulfate "rainstorm" to the wetland in May 2002, followed by 2 more "rains" in July and September of that year. They measured the sulfate and MeHg content of the wetland on subsequent days, with matching measurements in a control section in an upstream and untouched part of the same wetland.

The team found a jump in sulfate levels followed by a surge of MeHg after the first application, just as theory predicts. The sulfur apparently stimulates certain microbes that transform other forms of mercury to MeHg as they respire. They could also track the concentrations of MeHg exported out of the swampy wetland, which increased threefold.

Then things got muddy. The researchers still measured MeHg flowing out of the marsh after the two applications later in the summer, but the levels did not increase dramatically. They could no longer find evidence of sulfate in the marsh, either. "We might have missed all the action," says Jeremiason, by waiting a day after the extra sulfate was applied before beginning to take staggered measurements. He says his group is also split on other explanations for the conundrum, including a change in marsh chemistry and biology because of higher temperatures as the summer waxed. Another possibility is that the microbial community may have fluctuated as the seasons changed, affecting how much sulfur the bugs could use up and in turn potentially influencing the amount of MeHg that they produced.

Mark Hines, a microbiologist at the University of Massachusetts Lowell, says that he, too, would have expected bursts of activity following later applications similar to the first event. "You'd think [the microbes] would be raring to go the second time around." He suggests that because the team only measured marsh waters, they may have missed both sulfate and MeHg that ended up bound to dead organic matter. In fact, the continued output of dissolved MeHg that the researchers reported could be explained by the slow release of the bound pool, Hines conjectures.

This newest study complements observations elsewhere, such as the Everglades, where experiments also substantiate the link between sulfate and MeHg. The new study makes the connection that much clearer, says Brian Branfireun, a hydrologist at the University of Toronto. The Everglades, for example, start out at higher sulfate concentrations. However, Branfireun says that these Minnesota peatlands lack sulfate to begin with, and that makes this experiment even more important. He notes, "adding sulfate alone increased the methylmercury in this system and increased the output into receiving waters" downstream. "That's an important message. There aren't any fish living in the peat. . . . The question is: Is that the [same] methylmercury that gets into the fish [downstream]?"

"Implications of this work are interesting from a policy perspective," Driscoll says. For example, the research implies that acid-rain regulations, such as the 2005 Clean Air Interstate Rule, could control MeHg and thus tempt policymakers to relax controls on mercury itself. Still, "mercury is an extremely dangerous substance," Driscoll says, and any evidence that acid rain controls alone could reduce mercury concentrations in fish "will play out for the next 15 to 20 years." —NAOMI LUBICK
An EPA legend retires

William Telliard's efforts in developing analytical techniques underpin almost every U.S. EPA regulation related to controlling pollutants in waterways, according to an agency memo recommending him for the Distinguished Career Service Award that was presented to him in April. Normally reserved for senior, high-level managers, the award is "a rare, high honor for a career employee, and it's because of the body of work he put together," says Richard Reding, head of the Engineering and Analytical Support Branch of the EPA's Office of Water.

Telliard, who retired earlier this year as director of analytical methods for the Office of Water, has been with EPA since its infancy. He is perhaps best known for developing the Priority Pollutant List, which became a cornerstone of the Clean Water Act for controlling industrial wastewater discharges. ES&T published this list in a research article coauthored by Telliard in 1979, which is now one of ES&T's most highly cited papers (Environ. Sci. Technol. 2001, 35, 488A–494A).

In addition to picking which pollutants to measure, Telliard was also involved with developing methods to test for these substances in waterways. In fact, he helped to make gas chromatography/mass spectrometry the key, routine tool for monitoring pollutants that it is today, says Larry Keith, an independent environmental consultant who was with EPA's Office of Research and Development in Athens, Ga., when he coauthored the priority pollutants paper with Telliard.

In all, >100 analytical methods were developed under Telliard's direction during his 33-year tenure at EPA, according to the agency memo. These methods are used for detecting the presence of numerous contaminants, including mercury, dioxin, PCB congeners, and parasites such as Cryptosporidium and Giardia.

"He was a straight shooter who came to be respected even by the people he regulated, because when he came at you, he came at you with facts, not opinions," Reding says. —KRIS CHRISTEN
PERSPECTIVE

Chemical screening—Faster, cheaper, better?
[Figure: By factoring in what is known about the emissions of chemicals currently in use, Arnot and Mackay ranked the relative risk of 70 chemicals and found that they varied by >14 orders of magnitude. The original graphic plots the log risk assessment factor for 9 of them: camphene, chloroform, DCE, propylene, atrazine, chlorothalonil, TBBPA, hexachlorophene, and triclosan. The calculations include emissions to air, water, soil, and equally to all three; risk assessment factors use level II and level III calculations. TBBPA = tetrabromobisphenol A; DCE = 1,2-dichloroethane. JOHN ARNOT]
As the EU and Canada gear up to evaluate the tens of thousands of chemicals currently in use that have never before been screened for risk to humans or the environment, questions about the right strategy and practical issues about the duration and cost of such an undertaking are being raised. An approach recently published (Environ. Sci. Technol. 2006, 40, 2316–2323) could fit the bill, say several nongovernmental scientists familiar with these activities.

Governments began crafting laws to evaluate the health risks of existing chemicals in the wake of the 2001 Stockholm Treaty that banned the "dirty dozen": 12 persistent, bioaccumulative, and toxic (PBT) organic chemicals. Before that, the vast majority of chemicals used in commerce were not subject to such evaluations, because the environmental laws put in place some two decades earlier in Europe, Canada, and the U.S. exempted substances already on the market. This group of "grandfathered" chemicals comprises >95% by volume of the chemicals used in production processes and consumer products.

Canada, the first country to prioritize chemicals, started in 2000 and aims to finish screening ~23,000 chemicals on its Domestic Substances List—the official list of chemicals used in the country—for their PBT properties by September 2006. Chemicals that fail the PBT screen will be subject to further assessment. The EU is currently moving toward legislation that would require reviews of ~100,000 chemicals.

Assessments like Canada's effort are welcome, says chemical fate and transport pioneer Don Mackay at Trent University (Canada). But splitting up the assessment job into two parts—an initial PBT screen and then a more detailed risk assessment—makes extra work and ignores important information that could make this a better, faster, and easier one-step job, he says. Moreover, schemes that rely only on PBT properties could miss chemicals that are marginally persistent or bioaccumulative, such as phthalate esters and silicones, but are still problematic because they are used in such high volumes. Or they can focus unnecessary effort on very toxic chemicals that are emitted in such low volumes that they pose little risk.

"These hazard assessment schemes are fundamentally inadequate, and they are not what the public wants," Mackay says. "People rightly want to know if they are at risk." PBT schemes only assess the hazard posed by a chemical, he says.

Regulators can go straight to a risk analysis by combining estimates of chemical properties, including toxicity and environmental fate, with a rough estimate of emissions rate to conduct a simple, transparent screening-level risk assessment. In their ES&T paper, Mackay and his student Jon Arnot demonstrate such a scheme by evaluating a set of 70 chemicals from Canada's Domestic Substances List. They show that the relative risk associated with releasing these 70 to the environment varies by >14 orders of magnitude. Those that came up as relatively risky include the herbicide atrazine, the antibacterial agent triclosan, the flame retardant tetrabromobisphenol A, the antibacterial agent hexachlorophene, and the fungicide chlorothalonil. Screening programs based solely on PBT properties could miss these chemicals, says Arnot. Among those near the bottom in relative risk are chemical intermediates such as toxic DCE (1,2-dichloroethane) and camphene, an ingredient in flavors and fragrances. Chloroform also ranks low, but typical exposure is primarily from indoor evaporation, which the model does not address.

This kind of data should make it very easy for regulators to prioritize their efforts, Mackay says. "Including emissions is an obvious way to evaluate risk," says modeler Thomas McKone at Lawrence Berkeley National Laboratory.

Such a screening effort could also catch problem chemicals early, says Mackay. Perfluorinated chemicals, which have been emerging as contaminants of concern over the past 5 years, might have come under closer scrutiny earlier had this model been used
for screening, agree University of Toronto chemist Scott Mabury and former 3M environmental toxicologist Rich Purdy. "This model applied to precursor fluorinated alcohols would have likely indicated the atmosphere as the dominant sphere for contamination," Mabury says. "Perhaps scientists would have wondered about the fate of these reactive fluorochemicals and thereby helped to avoid a global contamination problem we are now just understanding," he adds.

One anonymous regulator agrees that the model performs well: "It handles emissions in a clever way. [With the model,] it is possible to screen a larger number of chemicals more quickly and, as a result, to more efficiently rule in, or rule out, chemicals." Actual emissions estimates are often very uncertain, says Arnot, so they are not included directly as a model input. Instead, the model calculates a critical level of emissions that would cause concern. Regulators can then compare emissions estimates with this critical value to assess the risk.

"However, [in] the real world, assessment and regulation is more complex and has many factors, as much economic and political as scientific, that render a model like this inadequate," the regulator warns.

But Mackay asks, "Should we continue on the present course of using PBT screening to slowly evaluate some thousands of chemicals for hazard, which is interesting but misses the point, [and] some hundreds for risk, and leave tens of thousands totally unassessed? Or do we assess the tens of thousands for approximate risk in a timely manner with a view to identifying the potentially risky chemicals for closer subsequent scrutiny? I suspect the public wants the latter."

Toxicologist Tim Kroop of the Environmental Working Group, an advocacy organization, agrees: "This is the kind of thing that needs to be done, something that looks at all the factors that determine risk. The current chemical screening program in the United States, where there is no effort to evaluate all chemicals in commerce, is based on timing: Is a chemical grandfathered in or not? What people want is chemical screening based on risk." —REBECCA RENNER
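To make the one-step scheme concrete, here is a hedged sketch of the kind of screening-level ranking described above: a rough emission rate is combined with fate and toxicity estimates into a single score, and the calculation can be inverted to yield a critical emission rate for comparison. Every name, functional form, and number below is a hypothetical placeholder, not the actual Arnot–Mackay model or its data.

```python
import math

def log_risk_assessment_factor(emission_kg_per_h: float,
                               fate_factor: float,
                               toxic_threshold: float) -> float:
    """Screening-level score: emission rate times an environmental-fate
    factor, scaled by a toxicity threshold. Higher = riskier. The
    functional form is a stand-in for a fugacity-based calculation."""
    return math.log10(emission_kg_per_h * fate_factor / toxic_threshold)

def critical_emission(fate_factor: float, toxic_threshold: float) -> float:
    """Invert the score: the emission rate at which the factor reaches 0,
    i.e., a 'critical level of emissions that would cause concern'."""
    return toxic_threshold / fate_factor

# Hypothetical property estimates for three chemicals (not real data):
# (emission rate in kg/h, fate factor, toxicity threshold).
chemicals = {
    "atrazine-like herbicide":  (50.0, 2e3, 1e-2),
    "high-volume intermediate": (500.0, 1e-1, 1e2),
    "low-volume biocide":       (0.5, 1e2, 1e-3),
}

# Rank from riskiest to least risky and flag critical-rate exceedances.
for name, (emission, fate, threshold) in sorted(
        chemicals.items(),
        key=lambda kv: -log_risk_assessment_factor(*kv[1])):
    score = log_risk_assessment_factor(emission, fate, threshold)
    crit = critical_emission(fate, threshold)
    flag = "exceeds" if emission > crit else "below"
    print(f"{name:<26} log RAF {score:6.1f}  ({flag} critical rate {crit:.2g} kg/h)")
```

Ranking on a combined factor like this, rather than on pass/fail PBT cutoffs, is what lets high-volume but only mildly persistent chemicals surface while trivial-volume toxics sink, which is the point Mackay makes above.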
INTERVIEW: Evan Mills
With the hurricane season upon us, expect a reenactment of last year's debate over whether global warming is causing more large hurricanes like Katrina. Insurance executives are one group that is growing ever more nervous about this issue. The Association of British Insurers finds that weather-related losses now outpace trends in population growth and inflation. The association's analyses have identified changes in weather as the driver of a 2–4% annual increase in U.K. property losses.

As the world's largest industry, the insurance business faces more financial risk from global warming than any other sector of the economy. To better understand how business leaders are dealing with the dilemma, ES&T spoke with Evan Mills, a staff scientist at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (LBNL). In the past 15 years, Mills has distinguished himself as an expert on the economic risks posed by climate change. In a study in Science last August, he detailed how global warming stands to hurt the insurance industry. He further explicated this work in a recent report released by Ceres, a network of investors and public interest groups that promote environmental stewardship on the part of corporations. And his work is beginning to find its way into popular publications, including Consumer Reports, The New Yorker, and Forbes.
Right now, the media seems to be caught in a debate over whether hurricanes are becoming stronger because of global warming. What does the insurance industry predict?

Earlier this year, the insurers' catastrophe [CAT] modelers unveiled
their first attempt to incorporate the implications of climate change and other factors in the outlook for hurricane losses. The net result was an ~45% increase in previously expected insured losses due to changes in the physical characteristics of the extreme weather events alone. Demographic trends are amplifying these increases. Responsible insurers are continuing to watch the science unfold.
When did the insurance industry first recognize the potential costs of climate change?

Remarkably, the issue was first publicly flagged by Munich Re way back in 1973. European and Asian insurers first engaged in the work with the Intergovernmental Panel on Climate Change [IPCC] in the late 1980s, but it's only recently that U.S. insurers have begun to take notice. [Note: Munich Re is the world's second-largest company in reinsurance—they insure other insurance companies.]

For some time, the business community, specifically the oil and gas industry, has waged a campaign to magnify scientific uncertainties. Plus, they consistently highlight studies that show that controlling CO2 emissions creates burdensome economic costs. Why do the insurance companies buy into the science?

I would say that insurers are better equipped to understand and evaluate the science than most other industries, and they have no particular vested interest in propping up polluting industries. To the contrary, pollution liability is one of the emerging (often insured) risks that keep them up at night. They are also more vulnerable to the impacts; they can't afford to overlook or be wrong about the science. Insurers who have looked at the climate-change issue closely see more burdensome economic costs from inaction than from prudent action, and, in fact, they are developing business opportunities associated with climate-change mitigation and adaptation solutions. They are also quick to recognize that investments in reducing greenhouse-gas emissions can be highly cost-effective in terms of reduced energy expenditures.

Insured losses from weather-related events in 2005 approached $80 billion (4× those from 9/11), and that excludes a host of small-scale events that don't appear in the official statistics. Insurers are the masters of analyzing and coping with uncertainty, and, based on the observed patterns of weather-related losses and [the] state of the art in modeling the future outlook, are increasingly convinced that climate change needs to be addressed. Insurers have a strong tradition of natural science research and worked with us as coauthors on the insurance chapter in the IPCC's Third Assessment Report. Insurers are again involved in the Fourth Assessment, to be published next year.
Why didn't the insurance industry protect itself from the disruptive practice by the oil and gas corporations of "manufacturing uncertainty"?

I suppose, in part, because while the energy sector was more or less forced to respond to the climate-change discussion early on, the insurers have, until recently, been focusing their attention on issues they perceived to be more pressing, such as terrorism, asbestos, the mold crisis, and corporate governance. The U.S. insurance industry is much more diverse and fragmented than the fossil-fuel industry. There are thousands of firms and many trade associations with different areas of emphasis. Few insurers in the U.S. have in-house expertise on natural science, enabling the climate contrarians and mixed signals from government to paralyze movement. And some just saw it as a nonissue, but this is a less frequent view today. In Europe, most of these distractions are much less operative, and the insurers have consequently been much more engaged in looking at climate change. But insurers around the globe are now paying more attention.
What do insurance executives think will happen to them because of climate change?

Climate change will compound existing demographic forces already pushing losses upwards. The bottom line will be eroded profits, and, in the worst case, insolvency of individual firms. This would come about through a combination of premiums failing to keep pace with rising losses plus market contraction. Meanwhile, their customers and regulators will be disgruntled as insurers tighten their terms and conditions. Compounding the problem, many of insurers' investments are weather-sensitive, to varying degrees. If extreme weather losses coincide with a downturn in financial markets—à la the Great Drought and Depression of the 1930s—the consequences could be especially dire.

Of particular concern are nonlinear changes in the intensity, frequency, magnitude, or geographic locus of losses. According to RMS, [the intensity of] Hurricane Katrina was 5 standard deviations from the norm. Rising uncertainty will confound pricing and reduce insurability in some cases. From an actuarial perspective, abrupt climate change is much more of a challenge to insurers than a stylized view of gradual and linear changes over long time frames. [Note: RMS (Risk Management Solutions) is the leading provider of natural-hazard risk modeling to financial markets.]

Insurers also face various forms of regulatory risk, especially if they do not act on their own to maintain their own solvency or the availability and affordability of their products. Interestingly, the National Association of Insurance Commissioners just formed an executive-level task force to look closely at whether the industry is being sufficiently prudent in its responses to climate change.

Have the models or algorithms that insurers use to project the costs from global warming changed over time?

Actually, insurers rarely build their
own models, relying instead on third-party CAT modelers. These CAT modelers have only recently begun to incorporate climate change in their assessments. The results have been startling and are creating a stir. It is important to note that many of the impacts of climate change, especially small-scale or gradual-loss events that have enormous aggregate costs—lightning, permafrost melt, mold, drought, or sea-level rise—are poorly (if at all) incorporated in these models. This creates some worrisome blind spots, which I’m afraid will grow larger under climate change.
What other impacts does the insurance industry expect from climate change?

Impacts will vary depending on which type of insurance products you look at. The most obvious impacts are for property insurers, but business-interruption losses are expected to also be high. Mention has also been made of rising claims through political risk insurance, arising from civil unrest triggered by the effects of climate change. This has been discussed in the Pentagon's report on abrupt climate change. You also have litigation against utilities or other polluters, which is already under way in parts of the U.S. Insured crop losses from any number of factors—temperature, moisture, natural disasters, pestilence, disease—are yet another domain of vulnerability.

As explored in our recent study with the Harvard Medical School, about half of the industry's revenues are from life and health insurance. Climate change can trigger losses from extreme heat episodes, infectious diseases, elevated pollen levels due to "carbon fertilization" of the atmosphere, and the mobilization of pollutants in the wake of natural disasters. Some stakeholders used to say that carbon fertilization would be a welcome impact. Ecosystem impacts, at first blush not perceived as relevant to insurers, could have financial consequences such as loss of storm-surge protection resulting from coral reef die-off.
Some industry sectors are now saying, "Well, it's too expensive to reduce greenhouse gas production. Let's just practice mitigation." Does this make any sense for the insurers?

Quite the contrary. Leading insurers believe in pursuing both approaches jointly. They recognize that reducing greenhouse gases is a highly cost-effective strategy. There are also interesting synergisms between the two approaches. For example, better forest management helps sequester carbon while reducing flood and mudslide risks. Also, insurers see real business opportunities, such as introducing new products like energy-savings insurance or financial and risk-management products associated with carbon trading.

Some predict that the developing world will be hurt more than the U.S. What is expected, and how will this harm business?

This is one of my primary concerns and something we looked at in detail in a recent study for USAID [the U.S. Agency for International Development]. It is well established that the developing world is particularly vulnerable to the impacts of climate change but adapts less readily than industrialized countries. I was just in Tanzania, where protracted drought has given rise to daily 12-h power outages. This is having a crippling effect on otherwise vibrant economic activity.

The developing world is the fastest-growing insurance market. While "only" $400 billion in insurance premiums are collected from the developing world today (~12% of the entire insurance market), this will rise to about half of the market in the next few decades. However, if insurers react as they have in vulnerable areas of the industrialized world, then this growth could be radically attenuated. This would translate into a major constraint to development. —PAUL D. THACKER