Scientists challenge EPA methods for assessing dioxin cancer risk

Leading scientists and risk experts disagree on the merits of an analytical tool that has been proposed to assess cancer risk from dioxin exposure. Proponents say their methodology produces more accurate calculations showing that dioxin does not pose as great a cancer risk as EPA concluded in its draft dioxin reassessment (ES&T, Jan. 1995, 24A). Other scientists, however, say the tool is only one of many that federal regulators should use, adding that it was examined in EPA's dioxin reassessment and found to be inadequate.

The issue cuts to the heart of the dioxin debate. If proponents can prove this methodology is best, it will support industry's arguments for less stringent regulations. However, other scientists say risk assessments cannot be based on one analytic tool alone; disparate methodologies are needed to fully understand the risks posed by the many different ways people are exposed to dioxin.

Risk assessments rely on dose metrics: mathematical methodologies designed to estimate the dose, rate, and duration of a population's exposure to a toxic agent. Analysts then determine whether that total exposure is likely to cause harm.

In the December 1996 issue of ES&T, Lesa Aylward and three other industry consultants published a research paper that analyzed data from the National Institute for Occupational Safety and Health (NIOSH), which evaluated more than 5000 workers exposed to dioxin from 1950 to 1980. Aylward focused on a 3000-worker cohort and used a variety of methods, including a dose metric called area-under-the-curve (AUC), to estimate the relative susceptibility of animals and humans to the carcinogenic risk of the most commonly studied dioxin, 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). AUC looks at the curve produced by changing levels of TCDD serum lipid concentrations over time and considers the time of exposure and the half-life of dioxin in the body (see figure). The NIOSH data provide a wealth of information on exposed individuals, including age, last exposure and, for a subset of the cohort, the measured TCDD level at time of exposure.
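To make the idea concrete, the sketch below estimates an AUC-style cumulative dose from a single measured serum-lipid TCDD level, assuming simple first-order elimination. It is only an illustration of the general approach, not the authors' actual model; the 7-year half-life, the concentrations, and the exposure histories are hypothetical placeholder values.

```python
# Illustrative sketch (not the published model): an AUC-style dose metric
# for serum-lipid TCDD, assuming first-order elimination. All numbers are
# hypothetical placeholders.
import math

def auc_serum_tcdd(c_measured_ppt, years_since_last_exposure,
                   exposure_years, half_life_years=7.0):
    """Rough AUC (ppt-years) reconstructed from one measured serum level.

    c_measured_ppt            -- TCDD in serum lipid when the sample was drawn
    years_since_last_exposure -- time between last exposure and sampling
    exposure_years            -- duration of occupational exposure
    half_life_years           -- assumed elimination half-life (illustrative)
    """
    k = math.log(2) / half_life_years          # first-order rate constant
    # Back-extrapolate to the concentration at the end of exposure.
    c_end = c_measured_ppt * math.exp(k * years_since_last_exposure)
    # Crude plateau approximation over the exposure period ...
    auc_during = c_end * exposure_years
    # ... plus the exponential-decay tail up to the sampling date.
    auc_after = (c_end / k) * (1.0 - math.exp(-k * years_since_last_exposure))
    return auc_during + auc_after

# Two workers with the same measured level but different exposure histories
# imply very different cumulative doses, which is why relying on a single
# sampled body-burden value can mislead.
print(auc_serum_tcdd(100.0, years_since_last_exposure=1.0,  exposure_years=2.0))
print(auc_serum_tcdd(100.0, years_since_last_exposure=20.0, exposure_years=15.0))
```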

[Figure: Dioxin exposure. TCDD concentration over time.] Rats reach a steady state of dioxin concentration in four months of exposure, the authors say, but humans do not reach such a state even after years of exposure. The authors say researchers have not adequately considered this difference when assessing dioxin toxicity to humans based on rodent tests.

Source: Environ. Sci. Technol. 1996, 30, 3537.
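The kinetic reason for that difference is the elimination half-life: under constant intake with first-order elimination, the body burden approaches steady state as C(t) = (R/k)(1 - e^(-kt)), so a species that clears dioxin in weeks plateaus within months while one that clears it over years does not. The sketch below uses assumed, order-of-magnitude half-lives (weeks for rodents, years for humans), not values taken from the study.

```python
# Minimal sketch of approach to steady state under constant intake with
# first-order elimination. Half-lives are illustrative assumptions only.
import math

def fraction_of_steady_state(t_years, half_life_years):
    """Fraction of the steady-state concentration reached after t_years."""
    k = math.log(2) / half_life_years
    return 1.0 - math.exp(-k * t_years)

rat_half_life = 3.0 / 52.0    # ~3 weeks expressed in years (assumed)
human_half_life = 7.0         # ~7 years (assumed)

print(f"Rat after 4 months:   {fraction_of_steady_state(4/12, rat_half_life):.0%}")
print(f"Human after 4 months: {fraction_of_steady_state(4/12, human_half_life):.0%}")
print(f"Human after 10 years: {fraction_of_steady_state(10.0, human_half_life):.0%}")
```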


According to Aylward, researchers have relied too strongly on a single data point: exposure levels at the time a TCDD concentration is measured. If researchers consider the duration of occupational exposure and the time between the last dioxin exposure and when a sample is drawn, a very different view of carcinogenic potency emerges. Relying on body burden levels at the time a sample is taken is misleading, she said, especially if an individual was highly exposed to dioxin long before the sample was drawn and the dioxin levels in blood and tissue have declined over the ensuing years.

The authors concluded that when risk is calculated with the AUC method, humans are shown to be as much as 90 times less sensitive to dioxin than is assumed on the basis of rodent tests, which is how most dioxin toxicity standards are determined. By contrast, EPA's draft dioxin reassessment relies heavily on two dose metrics, average daily dose and current body burden, in rodent tests. EPA's use of these and related methods assumes that humans and rodents are equally sensitive to dioxin, a conclusion that may lead to tighter emission controls, cleanup standards, and other regulations.

Some scientists dispute Aylward's findings and the importance proponents are placing on AUC. There are numerous problems in the calculations and conclusions in Aylward's paper, according to Christopher Portier, a statistician and director of the National Institute of Environmental Health Sciences' (NIEHS) computational biology and risk analysis laboratory in Research Triangle Park, N.C. He charged that the analysis fails to support the conclusion that humans are less susceptible. Portier wrote much of the dose-response chapter of EPA's dioxin reassessment.

However, James Wilson, a toxicologist and senior fellow at Resources for the Future, called Aylward's paper "a first-class piece of work" and the "coup de grace on EPA's justification for using cancer risk as the reason to regulate dioxin as tightly as it wants to." Other scientists agreed.

"This is not a pro-industry, pro-environment argument," said Alan Poland, a paper industry science adviser and pharmacologist with the McArdle Laboratories for Cancer Research Center in Madison, Wis. Poland is credited with spearheading the research that led to discovery of the Ah receptor, through which dioxin appears to exert its effects. "This is about bad science and obfuscation," he said. EPA has made a policy decision that dioxin poses a significant risk and consequently has "ignored all data or methodologies that would refute that position," Poland asserted, calling EPA's way of calculating dioxin's dose and consequent risk "simplistic and naive." "No pharmacologist in the world would say it is legitimate," he continued. "The only argument about dioxin is whether human exposure is sufficient to cause adverse effects and, if so, what those adverse effects are. AUC is a very logical way to calculate dioxin's risk." Poland noted, however, that AUC may not be as appropriate for other environmental chemicals that have not been studied as extensively.

"It's simplistic to think that one approach applies to all situations," countered George Lucier, who directs the NIEHS environmental toxicology program. Lucier, who has played a key role in preparing EPA's reassessment, has said repeatedly that dioxin may be causing harm at or near current exposure levels. "Each [dose metric] has its advantages and disadvantages," Lucier said. "All may under- or overstate risks." Another dose metric might conclude that animals are equally sensitive to dioxin, or perhaps 10 times less sensitive than people, he said. EPA did not ignore AUC, Lucier stressed, noting that the agency's reassessment devotes 40 pages to a discussion of the methodology. But EPA chose not to use AUC exclusively because it has many limitations, he said.

For example, AUC did not always accurately predict tumor incidences in his own research on dioxin, Lucier said. In addition, AUC assumes that people and animals are equally sensitive throughout their life spans. Yet, dissimilar species may have different sensitivities at various times in their lives, he noted.

Aylward and her co-authors plan to use AUC to analyze other dioxin data to see whether they find a similar difference between human and animal susceptibility. The paper's point, however, was not to develop a precise estimate of the relative sensitivity for animals and humans, she said, but to illustrate that a "more sophisticated" analytic approach could make a significant difference in estimating the risks of dioxin. AUC requires more time, effort, and information than EPA's methods, and consequently it is not always warranted, Aylward noted. But in the case of TCDD, the data exist and the stakes are high.

PAT PHIBBS

U.S., Russia release wealth of classified Arctic environmental data

Once-secret data totaling at least 1.4 million scientific observations of the Arctic Ocean were released to the public by the U.S. and Russian governments Jan. 14. Vice President Al Gore made the announcement, saying the data will double the publicly available environmental scientific information about this remote region. Gore was credited by U.S. and Russian scientists and government officials, speaking at a briefing at the National Geographic Society in Washington, D.C., with pressing for the release of Arctic data gathered by military and intelligence agencies.

The data include ice and sediment samples, ocean temperatures, current flows, seabed topography, and other information. Availability of the data is expected to lead to more accurate global warming models, better long-range meteorological forecasting, and a more thorough understanding of ocean currents, according to James Baker, head of the National Oceanic and Atmospheric Administration.

Baker and other scientists speaking at the briefing noted that most of the Earth's seawater receives its properties from climatic and physical actions that take place in the Arctic Ocean. The lion's share of the new material is wintertime scientific observations collected on Russian ice floe science stations. It is available on compact disk and through the World Wide Web (http://ns.noaa.gov/atlas).

Over the next year, three additional disks will be released. These include observations made during summer months, meteorological data, and ice characteristics. To obtain CD information, write to User Services, National Snow and Ice Data Center, CIRES Campus Box 449, University of Colorado, Boulder, CO 80309-0449.

JEFF JOHNSON

This 40-year average temperature transect of the Arctic Ocean was created by combining recently released U.S. and Russian data. A vertical mixing zone (A), although known to U.S. scientists, had been under close study by Russians since the 1930s. (Photo courtesy Environmental Research Institute of Michigan, Ann Arbor)