Making sense of toxicogenomics data

Britt E. Erickson. Environ. Sci. Technol. 2003, 37 (7), 125A–126A. DOI: 10.1021/es032411w. Published April 1, 2003.

Although the emerging field of toxicogenomics—the study of how genes respond to environmental toxicants or stressors—could answer many questions that have long plagued chemical risk assessment, such as determining the minimum dose that causes a health effect and evaluating the risks posed by mixtures, significant issues must be resolved before the regulatory community is willing to embrace the new technology. To help the U.S. federal government overcome some of the hurdles in using toxicogenomics information, the National Research Council (NRC) has formed a Committee on Emerging Issues and Data on Environmental Contaminants, which will provide a public forum for stakeholders to discuss environmental toxicology, risk assessment, exposure assessment, toxicogenomics, and related fields over the next five years. The committee will not prepare reports but may recommend topics for future NRC workshops and studies. At its second public meeting, held February 6, much of the time was devoted to one of the biggest challenges in toxicogenomics: making sense of the tremendous variability in the data generated so far by different laboratories, so that the data can be compared and compiled into a central database.

According to Brenda Weis, program coordinator for extramural toxicogenomics research at the National Institute of Environmental Health Sciences (NIEHS), researchers are using many different types of microarrays, tiny glass or plastic chips that contain thousands of genes, to measure which genes are turned “on” or “off” by exposure to a particular chemical. Some research groups use commercially available microarrays, while others have their own custom-designed platforms, she says. Not all of these platforms contain the same number of genes, the genes are not always in the same place in the ordered arrays, and there are no standard protocols. To complicate matters further, there are often inconsistencies in gene expression data generated by the same laboratory using the same platform.
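To make the idea concrete, here is an illustrative sketch, not drawn from the article: the gene names, intensities, and twofold cutoff are all invented. It shows how a gene's “on/off” call can be derived from the relative intensities a microarray scanner reports.

```python
import math

# Invented fluorescence intensities for three genes in a control
# sample and a chemical-treated sample (arbitrary units).
control = {"cyp1a1": 210.0, "gstp1": 480.0, "actb": 1520.0}
treated = {"cyp1a1": 1680.0, "gstp1": 150.0, "actb": 1490.0}

for gene in control:
    # log2 fold-change: positive means induced ("on"),
    # negative means repressed ("off").
    log2fc = math.log2(treated[gene] / control[gene])
    # A common but arbitrary call: at least a twofold change.
    if log2fc >= 1.0:
        call = "induced"
    elif log2fc <= -1.0:
        call = "repressed"
    else:
        call = "unchanged"
    print(f"{gene}: log2FC = {log2fc:+.2f} ({call})")
```

Because every platform reports intensities on its own scale, even a simple ratio like this is comparable across laboratories only after normalization, which is one reason the variability is so hard to untangle.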



[Figure: sources of technical variation that can occur during a cDNA microarray experiment, spanning RNA extraction, labeling/hybridization to the DNA “chip,” scanning, and analysis (bioinformatics).]

To better understand what is causing the variation, NIEHS’s Microarray Center joined forces in 2001 with five academic centers that have expertise in toxicology and gene expression research to form the Toxicogenomics Research Consortium (TRC). One of TRC’s primary goals is to standardize how gene expression data are generated. Currently, TRC is investigating the technical sources of variation in gene expression profiling, such as how RNA is extracted from animal tissue, how it is stored, and how the complementary DNA is amplified, labeled, and bound to the microchip. The consortium also plans to address variations in data analysis that arise from using different software and statistics programs, as well as differences in gene nomenclature. Eventually, the consortium hopes to examine biological variation, or genetic differences among animals of the same species, which could explain some of the intralaboratory variation.
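One way to picture what the consortium is measuring is a toy sketch, with all numbers invented, that splits the spread in one gene's repeated measurements into within-lab noise, reflecting technical variation under a single protocol, and between-lab disagreement, the part that standardized protocols are meant to shrink.

```python
import statistics

# Invented log2 expression ratios for one gene, measured three
# times in each of three hypothetical laboratories.
measurements = {
    "lab_a": [1.10, 1.25, 0.95],
    "lab_b": [0.40, 0.55, 0.35],
    "lab_c": [1.60, 1.80, 1.70],
}

# Within-lab variance: average spread of replicates in one lab.
within = statistics.mean(
    statistics.pvariance(vals) for vals in measurements.values()
)

# Between-lab variance: spread of the lab means themselves.
lab_means = [statistics.mean(vals) for vals in measurements.values()]
between = statistics.pvariance(lab_means)

print(f"within-lab variance:  {within:.3f}")
print(f"between-lab variance: {between:.3f}")
```

When the between-lab term dominates, as it does for these invented numbers, pooling the data into a central database without harmonized protocols would mostly record differences between laboratories rather than effects of the chemical.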

Standards for microarray data have been developed and adopted by an international organization called the Microarray Gene Expression Data (MGED) Society, but those standards, referred to as the Minimum Information About a Microarray Experiment (MIAME), do not specify a protocol to follow. Rather, MIAME outlines what information about an experiment must be reported. MIAME provides a means for sharing information, so that a full data set and its annotation can be retrieved from a public database in a usable form, says Chris Stoeckert of the University of Pennsylvania’s Center for Bioinformatics. “Although we have MGED standards, independent source variation is unknown, and the impact of variation on data interpretation is unknown,” says Weis.
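A rough sketch of the kind of experiment annotation such a reporting checklist calls for follows; the field names and values below are illustrative paraphrases, not the MGED Society's official schema.

```python
# Illustrative, MIAME-style annotation for one microarray experiment.
# Field names and values are hypothetical; the actual reporting
# checklist is defined by the MGED Society.
experiment_annotation = {
    "experimental_design": "control vs. single-dose chemical exposure",
    "array_design": "custom cDNA array, ~5000 clones",
    "samples": {
        "organism": "Rattus norvegicus",
        "tissue": "liver",
        "rna_extraction": "protocol reference attached",
    },
    "hybridizations": "two-channel, with dye swap",
    "measurements": "raw and normalized intensity files deposited",
    "normalization": "global median centering",
}
```

The point of such a record is not to force one protocol but to make whatever protocol was used fully reportable, so that a public database can return a data set with enough annotation to reinterpret it.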


Nearly everyone agrees that it is important to create a common language for toxicology, but some members of the NRC committee feel that too much time is being spent discussing variation in gene expression data and not enough on other important issues. “Let’s get the information and give it to the statisticians. It will be solved,” says committee member Timothy Zacharewski of Michigan State University. Still others, such as Dow Chemical’s James Bus, believe “we will need to give more thought to biological variability.” One area in which the committee hopes to spend more time is the impacts and limitations of using toxicogenomics data in risk assessment and environmental decision making. The technology could help identify subpopulations that are more susceptible to health effects from a particular chemical exposure, as well as common mechanisms shared by humans and other species, so that traditional toxicity data can be better extrapolated from laboratory animals to humans, says Linda Greer, director of public health at the Natural Resources Defense Council. Another area is chemical mixtures. The National Institutes of Health is already funding toxicogenomics research on mixtures, and according to Kenneth Olden, director of NIEHS, “they are producing good data.” —BRITT E. ERICKSON

Corps, EPA pull back from wetlands regulation

Two years after a U.S. Supreme Court decision questioned to what extent the Clean Water Act (CWA) applies to isolated wetlands, the Army Corps of Engineers (Corps) and EPA have issued guidance withdrawing federal protections from 20% of the nation’s wetlands. In concert with the guidance, the agencies issued an advance notice of proposed rulemaking on January 15 that solicits comments on how to better define federal jurisdiction over isolated wetlands (Fed. Regist. 2003, 68, 1991–1998). The action has alarmed environmentalists, who believe it will lead to the loss of federal regulation of industrial discharges, while the regulated community, mainly builders, welcomes the opportunity to refine a muddy definition. “The guidance helps clarify how to apply federal authority since a 2001 Supreme Court case struck down the use of migratory birds as the sole basis for the Corps to assert CWA jurisdiction over isolated, non-navigable intrastate wetlands,” explains John Millett, press officer for EPA. The CWA allows the Corps to issue permits for the discharge of dredged or fill material.

…these gases, Heydlauff says. CCX is one way to respond to the Bush administration’s voluntary climate change program, he adds (see story on page 123A). Environmentalists gave the scheme a qualified approval. “The problem with voluntary schemes is that there are never enough volunteers,” says David Doniger of the Natural Resources Defense Council. “Ultimately, it needs to be mandatory. But these companies believe that some carbon regulation is inevitable and are trying to get ahead of the curve.” Doniger’s top concern is that the scheme will rely on giving credits for actions taken outside the program, such as reforestation projects in South America or investments in reduction technologies at nonmember companies. Billy Pizer, a fellow in the Quality of the Environment Division of Resources for the Future, a nonpartisan research group, agrees. “There are always problems when reductions under the cap [or emission limits set by the program] are shifted to emissions outside the cap.” Offset projects add another challenge because there is no obvious baseline for awarding credits, he says. “I do not believe that voluntary programs can achieve significant emission reductions, that is, reductions that entail significant costs,” Pizer adds. “Of course, there might well be some cheap reductions lying around out there. In a competitive world, how can a company spend anything other than [public relations funds] on something that its competitors are not doing?” He also doubts the liquidity of a market with only 14 players and suggests that trading would be very modest. For information on CCX, visit www.chicagoclimateexchange.com. —MARIA BURKE

News Briefs

UNEP mercury report
Worldwide mercury pollution could be significantly reduced by curbing emissions from power stations, concludes a report from the United Nations Environment Programme (UNEP). Coal-fired plants and waste incinerators account for approximately 1500 tons, or 70%, of new anthropogenic mercury emissions, and most of that comes from developing countries. At 860 tons, emissions from Asia are the highest, the researchers found. The report was considered during UNEP’s Governing Council meeting in February on global action to control mercury emissions (Environ. Sci. Technol. 2003, 36, 441A). Global Mercury Assessment: UNEP Chemicals is at www.unep.org/GoverningBodies.

Other GHGs
To cost-effectively tackle climate change, climate policies must extend beyond CO2 and address all of the greenhouse gases (GHGs), according to a report by the Pew Center on Global Climate Change, a nonprofit research group. The authors note that over the past century the total effect of all other GHGs, including banned chlorofluorocarbons, has been roughly equal to CO2’s influence on climate. The report acknowledges that CO2 is an important GHG and that its ties to fossil fuel emissions make its contribution simple to estimate. However, capabilities to measure and assess aerosols, related pollutants such as SOx and CO, and other GHGs have improved. Multi-Gas Contributors to Global Climate Change: Climate Impacts and Mitigation Costs of Non-CO2 Gases is available at http://pewclimate.org.