
GOVERNMENT & POLICY

TESTING MORE CHEMICALS FASTER

Federal agencies collaborate on high-throughput toxicity studies to generate data for regulation

CHERYL HOGUE, C&EN WASHINGTON

FEDERAL AGENCIES with the job of protecting human health have long relied on toxicity tests on laboratory animals to help them decide whether to regulate chemicals. These experiments are expensive and may take years to complete. Consequently, only a few thousand commercial chemicals have undergone thorough toxicity testing.

A new collaboration between the National Institutes of Health and the Environmental Protection Agency may provide more toxicity data faster to regulators. The ultimate goal is to foster a healthier population. “As a society, we need to be able to test thousands of compounds at a much faster rate than we did before,” NIH Director Elias A. Zerhouni says.

The collaboration will develop, validate, and translate new toxicity testing methods by employing the same high-throughput screening techniques used by the pharmaceutical industry in drug discovery (C&EN, Aug. 6, 2007, page 34). The agencies hypothesize that high-throughput methods will produce toxicity data as well as, or perhaps even better than, experiments on laboratory animals can. The collaboration, announced on Feb. 14, will test that hunch.

“We are laying out a very logical framework to use these technologies, to understand them,” and to refine them so they can be used in regulation of chemicals, says Robert Kavlock, director of EPA’s National Center for Computational Toxicology, Research Triangle Park, N.C. As the scientific basis improves for predicting toxicity from disturbances in cells and metabolic pathways, regulators may increasingly rely on the information from such experiments, Kavlock says.

“We are officially starting something that could change the way in which toxicology and compounds are assessed in the 21st century,” says Francis S. Collins, director of NIH’s National Human Genome Research Institute (NHGRI), Bethesda, Md. The new federal effort, he points out, follows the recommendations in a 2007 report by the National Research Council to replace animal testing with experiments on cells, cell lines, or cellular components (C&EN, June 18, 2007, page 13).

Traditional toxicology testing, Collins says, “is slow, it’s expensive, and its precise predictive ability has often turned out not to be as good as we would want.” Now, the field of toxicology is transforming into one that predicts the toxicity of a chemical on the basis of whether it damages cells or interferes with metabolic pathways.

According to Christopher P. Austin, director of the NIH Chemical Genomics Center (NCGC), the high-throughput method involves taking “a dish that’s about 3 by 5 inches that contains 1,536 different little wells in it. Those little wells are a fraction of a millimeter across, and we put the same cells in every one of those [wells]. Then we take 1,536 different chemicals and we put them on top of those cells.”

Each chemical is tested at 15 different concentrations, and the cell cultures are exposed to the chemicals for different lengths of time, ranging from five minutes to several days. “To get the answers you want, you have to do all the conditions, all the different concentrations, all the times,” Austin says. “That’s why we need to have such a high-throughput system.”

The computerized, robotic system that does the work takes two days to test 100,000 compounds each at the 15 different concentrations, Austin says. In contrast, some toxicity tests in laboratory rodents can take up to two years to complete.
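Austin’s figures invite a quick back-of-the-envelope check. The sketch below (Python, not from the article) tallies the wells and 1,536-well plates a screen on this scale implies; the number of exposure time points is a placeholder assumption, since the article gives only a range of five minutes to several days.

```python
# Rough arithmetic for the screen Austin describes. Plate size (1,536 wells),
# concentrations per chemical (15), and library size (100,000 compounds) come
# from the article; TIME_POINTS is a hypothetical placeholder, since the
# article gives only a range ("five minutes to several days").

WELLS_PER_PLATE = 1_536   # one 3-by-5-inch plate
CONCENTRATIONS = 15       # doses tested per chemical
COMPOUNDS = 100_000       # library screened in about two days
TIME_POINTS = 4           # assumed; the article states no exact count

wells = COMPOUNDS * CONCENTRATIONS * TIME_POINTS
plates = -(-wells // WELLS_PER_PLATE)  # ceiling division

print(f"{wells:,} wells, or {plates:,} plates of {WELLS_PER_PLATE:,} wells")
# -> 6,000,000 wells, or 3,907 plates of 1,536 wells
```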

THE DREAM is that researchers could, “in a battery of tests, end up with very specific molecular signatures that will be predictive of human toxicology in ways that you just can’t do in animal testing today,” Zerhouni says. But he warns that high-throughput techniques will not soon displace traditional toxicity studies on laboratory animals. “You cannot abandon animal testing overnight,” Zerhouni says. Traditional toxicity testing on laboratory animals and the new technologies “will have to be intertwined for years,” he says.

The collaboration initially will use the new techniques to test nearly 2,800 chemicals already studied through traditional toxicology methods on laboratory animals. The National Toxicology Program (NTP), a part of NIH located in Research Triangle Park, generated some of these data over the past 30 years as part of its mandate to test chemicals of interest to federal regulators. Information on other compounds was submitted to EPA by chemical manufacturers to support registration of their products as active ingredients in pesticides.

“We have a wonderful legacy database of information where you know what the answer is,” Collins says. The new collaboration will rely on that information, drawn from animal tests, to assess which of these new assays are most predictive of toxicity, he says. Some high-throughput studies are already under way by federal contractors through ToxCast, an EPA research program that is evaluating a range of cellular assays on hundreds of pesticides and a handful of other well-studied chemicals (C&EN, Aug. 13, 2007, page 36).
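Collins’ point about the legacy database suggests a simple way to score a new assay: compare its calls with the known animal-test outcomes. The sketch below (Python, not from the article) computes sensitivity and specificity against such reference data; the chemical names, animal-test labels, and assay calls are entirely hypothetical.

```python
# A minimal sketch of scoring a cell-based assay against legacy animal-test
# results, as the collaboration plans to do. All names and labels here are
# hypothetical; real reference data would come from NTP and EPA records.

legacy_animal_toxic = {   # True = toxic in the historical animal study
    "chem_A": True, "chem_B": False, "chem_C": True, "chem_D": False,
}
assay_flags_toxic = {     # True = flagged by the high-throughput assay
    "chem_A": True, "chem_B": True, "chem_C": False, "chem_D": False,
}

tp = sum(legacy_animal_toxic[c] and assay_flags_toxic[c] for c in legacy_animal_toxic)
fn = sum(legacy_animal_toxic[c] and not assay_flags_toxic[c] for c in legacy_animal_toxic)
fp = sum(not legacy_animal_toxic[c] and assay_flags_toxic[c] for c in legacy_animal_toxic)
tn = sum(not legacy_animal_toxic[c] and not assay_flags_toxic[c] for c in legacy_animal_toxic)

sensitivity = tp / (tp + fn)  # share of animal-toxic chemicals the assay catches
specificity = tn / (tn + fp)  # share of animal-negative chemicals the assay clears
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```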


NCGC will test chemicals previously studied by NTP. NCGC uses robotic, computerized equipment to identify small molecules to probe genes, cells, and biological pathways. The screening system used to identify beneficial compounds, however, can also be used to determine whether compounds have toxic effects.

ENVIRONMENTAL ACTIVISTS and industry groups, though supportive of the new federal effort, attach caveats to emerging technologies. Although their concerns diverge, the basis for them is the same. The use of the high-throughput methods shifts chemical testing from whole animals with intact organs working together in a complex system to a simpler arrangement of cells in a lab dish. This simplification of the system used to test for toxicity will make interpretation of results more difficult, say Caroline (Cal) Baier-Anderson, a health scientist at Environmental Defense, and Richard A. Becker, senior toxicologist with the American Chemistry Council.


According to Becker, a lot of science needs to be done before regulated industry and federal agencies can have confidence that the high-throughput tests predict chemical toxicity with adequate certainty. It is important to differentiate cellular changes that lead to toxic effects from those that are reversible perturbations in normal physiology, he tells C&EN.

Baier-Anderson points out that whether people might develop cancer or a birth defect or reproductive problems from a chemical often depends on when during their development they are exposed to the substance. Cell-based tests may not be able to mimic the toxic effects that change during stages of development, she says.

She also predicts that regulators, manufacturers, environmental groups, and people exposed to a chemical will disagree about when cell-based toxicity data show enough evidence to trigger regulatory action. “We argue over it with animal bioassays,” Baier-Anderson says, adding, “We will continue to argue over it with in vitro bioassays.” ■