Skills and Training for the 21st Century Chemical Toxicologist


Two thousand and eleven is the International Year of Chemistry. Chemistry has delivered some of the major innovations of the last century, including drug and crop products, to cite just two of many chemical categories of benefit to society. However, if a representative sample of society were questioned about their perception of chemicals, a consistent theme would likely emerge of perceived dangers to the environment or to themselves, over and above any recognition of benefit. Some might realize that drugs are chemicals and recognize their benefits more readily than those of bulk chemicals, but they may still express concern about adverse drug reactions and side effects. This concern about the adverse effects of chemicals is justified: a recent study reported that 8.3% of preventable deaths in 2004 were due to just a few substances.1 Although that study showed that the majority of attributed deaths were due not to bulk manufactured chemicals but rather to environmental hazards such as air pollutants, the principle is sound. Furthermore, many people are likely to be less concerned about the risk to the population as a whole than about the risk to themselves. Prediction of risk to the individual has, to date, been little more than a theoretical concept in toxicology. However, new technology, itself dependent on chemical advances, is bringing us to the point where it is not inconceivable that risk to the individual will have to be considered. What is certain is that risk, at least at the level of the subpopulation, must be considered. This is a substantial challenge for the toxicologist of the 21st century.2

The past few years have seen a considerable reduction in the size of the pharmaceutical industry. This reduction has been spread across many countries, with facilities closing or consolidating as profits have been squeezed by a lack of new products and the expiry of patents on marketed drugs. The most recent closure at the time of writing was announced by Pfizer on February 1.3 Not an isolated incident, Pfizer's announcement followed similar reductions by most of the other pharmaceutical giants. The reason for these closures is largely a reduction in the number of compounds in the pipeline, with many compounds being lost in the development phase because of toxicity. These events highlight the need for better testing methods, better biomarkers, and a better understanding of the mechanisms of chemical toxicity.4 Of even more concern to the pharmaceutical industry is "failure in use" leading to withdrawals; there have been a number of these over the past few years, rosiglitazone being one of the most recent. It is now vital that toxicologists embrace all of the new tools and systems available to recognize hazard and determine risk, both to prevent risky chemicals from reaching the public and to reduce false positive calls so that more development can proceed. Furthermore, the toxicologist must be familiar with all of the new tools available in order to properly assess and interpret the tidal wave of new information being deposited in the public domain. Are these skills being developed in the younger generation of toxicologists?

The loss of the Pfizer facility in Kent, United Kingdom, and its 2500 jobs prompted some soul searching that was articulated by five letters appearing in The Times of London on February 7, 2011. J. K. Aronson, Emeritus President of the British Pharmacological Society, highlighted the paucity of UK experts in pathology, toxicology, pharmacology, and physiology. Aronson commented further that this skills gap was highlighted in the Life Sciences Blueprint published in 2009 by the Office of Life Sciences in the UK; in particular, there is a lack of in vivo experience. R. Maynard of the UK Department of Health, writing in 2006,5 commented that "toxicology as a scientific discipline will need to be significantly strengthened during the coming century". Writing in this journal, A. T. Jacobs and L. J. Marnett observed that there has been a decrease in the flow of talented younger scientists into toxicology,6 and P. C. Burcham has noted the same phenomenon in Australia.7 Furthermore, Maynard correctly pointed out that toxicology as a discipline suffers from having its "successes ignored and its failures paraded".

Among academics, toxicology suffers from being a multidisciplinary, largely applied science. The impact factors of its journals tend to be lower than those of journals focused on fundamental areas such as genetics or biochemistry. This is a problem because the "rule of the impact factor" in academic departments is in danger of pushing multidisciplinary sciences such as toxicology to the margins. Toxicology does, however, have immediate relevance to society, and its real impact lies there as much as in impact factors. Does this make toxicology less academically worthy than other biomedical sciences? Not at all: mechanism of action (MOA) work is often published in high impact factor (greater than 10) journals, for example, the article on the primary protein target of thalidomide published in Science.8 MOA work such as this points the way forward, but it must not be confused with the narrower use of a xenobiotic to perturb a system where the ultimate aim is to study the system rather than the xenobiotic. Studies of that type are fundamentally biochemical or physiological, though they may ultimately contribute to toxicological understanding if interpreted in the context of the chemical. Toxicology, by contrast, takes a more holistic approach and is directed by the chemical rather than by the biochemical or physiological system.

Jacobs and Marnett suggest that the new generation of toxicologists could be drawn from undergraduates trained in disciplines such as chemistry, the implication being that toxicology training would take place at the postgraduate level.6 This is certainly a good way to proceed, and initiatives such as the Medical Research Council Integrative Toxicology Training Programme in the United Kingdom are taking a lead in this respect (http://www.mrc.ac.uk/Fundingopportunities/Fellowships/Integrativetoxicologytrainingpartnership/MRC004300). Universities should take note and consider introducing more undergraduate programs in toxicology, either alone or as specialist additions to courses such as pharmacy.



TOXICOLOGY TECHNOLOGY FOR THE 21ST CENTURY

Toxicology and Genetics. In 2010, Ozzy Osbourne, former lead singer of the band Black Sabbath, had his genome sequenced.9 After a lifetime of drug and chemical exposure, he had a question to ask: "Why am I still alive?" Quite rightly, he thought that at least some of the answer might lie in his genetic makeup. A short while later, and for about $30,000, he had his genome sequenced and analyzed. A number of polymorphisms were identified that could allow him to metabolize some xenobiotics, such as ethanol, more quickly and that may have conferred on him a degree of resistance to these materials. Thus, he had some answers. In a wider context, though, the sequencing of this man's DNA points us toward a much more profound challenge for toxicologists: the concept of risk assessment for the individual. A cost of $30,000 for a genome is not exorbitant but is perhaps still beyond the reach of most people. This cost is coming down very rapidly, however, and as the capacity of next generation sequencing (NGS) machines increases, the $1000 genome could well become reality within a relatively short period of time. As this revolution in sequencing unfolds, polymorphisms associated with individual susceptibility and resistance to xenobiotics will undoubtedly be discovered. We already know of many, for example, in metabolism or in the conferral of susceptibility to allergic reactions. These, without doubt, represent the tip of the iceberg, and there are many more to come.10 For disease phenotypes such as Parkinson's disease, the story now emerging is that as little as a 30% change in the expression of some single genes can be associated with disease. Polymorphisms of genes involved in metabolism that give rise to differential sensitivity and resistance can produce much larger changes in protein activity than 30%.11 If a 30% difference in expression can confer susceptibility to a disease phenotype, it is rational to deduce that the same may be true for susceptibility to a xenobiotic. In addition to polymorphisms, NGS is already having a profound influence on our understanding of the role of epigenetic modification in toxicology.12 Data published to date indicate that this is an important area of toxicology, with fundamental roles in nongenotoxic carcinogenesis and transgenerational toxicology, to name but two. Epigenetic change may well represent an even greater challenge than polymorphisms for the toxicologist of the 21st century.

Given that many tens of thousands of polymorphisms, and even more epigenetic variants, are going to be recognized in the near future, assessing their relevance with the new sequencing technologies will be a bewildering and impossible task for the toxicologist without knowledge of MOA. Only by understanding MOA will there be any possibility of associating polymorphisms and epigenetic variation with susceptibility to toxicological effects. If the genes involved in a mechanism of toxicity are understood, there is some possibility of discovering whether there are populations, subpopulations, or individuals who are especially likely to show an adverse response to exposure to a given chemical. With this information we can begin to make more adequate estimates of risk. Understanding MOA is perhaps the most significant challenge for the toxicologist. Once biochemical interactions are identified, the process of identifying polymorphisms that may cause adverse events can begin. The public deposition of genetic data on individuals from sequencing projects will greatly assist in this endeavor, but such analyses will also require adept toxicologists trained in bioinformatics and biomathematics. It would be naive to assume that all differential susceptibility arises from alterations in the genome. Lifestyle factors such as age, body mass index, and current health status all have well recognized roles to play, and knowledge of MOA will assist the modern toxicologist toward an understanding of the effects of these factors on individual risk.
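As a purely illustrative sketch of the bioinformatics skills alluded to above, the short Python snippet below flags sequence variants that fall in a small, hypothetical list of xenobiotic-metabolism genes; the gene list, file layout, and column names are assumptions made for the example rather than any established pipeline.

```python
# A minimal, hypothetical sketch: flag sequence variants that fall in genes
# of toxicological interest (e.g., xenobiotic metabolism). The gene list,
# input format, and column names are illustrative assumptions only.
import csv

# Hypothetical genes of interest; a real analysis would use a curated resource.
METABOLISM_GENES = {"CYP2D6", "CYP2C9", "CYP2C19", "NAT2", "ADH1B", "ALDH2"}

def flag_metabolism_variants(path):
    """Return variants annotated to a gene in METABOLISM_GENES.

    Expects a tab-delimited file with 'gene', 'variant_id', and 'genotype'
    columns (an assumed, simplified stand-in for annotated sequencing output).
    """
    flagged = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle, delimiter="\t"):
            if row["gene"].upper() in METABOLISM_GENES:
                flagged.append((row["variant_id"], row["gene"], row["genotype"]))
    return flagged

if __name__ == "__main__":
    for variant_id, gene, genotype in flag_metabolism_variants("variants.tsv"):
        print(f"{variant_id}\t{gene}\t{genotype}")
```

In practice such screening would sit downstream of proper variant calling and annotation, and the flagged variants would become meaningful only when linked to MOA.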

New Cellular Models. In general, the public desire is for completely safe medicines and chemicals, a goal made even more challenging by the further desire for the use of fewer animals in toxicology testing and risk assessment. There are few in vitro systems that can faithfully model the in vivo response to chemicals. Genotoxicity can be assessed in vitro, though metabolic activation remains challenging, and ADME (absorption, distribution, metabolism, and excretion) is more difficult still to assess in vitro, as are events such as nongenotoxic carcinogenicity. Primary human cells are difficult to acquire, and even when acquired they change their characteristics in culture. Conversely, the age-old assertion that animals are not humans remains true, making extrapolation from experimental animals to humans complex, though an understanding of MOA can greatly assist this process. The production of embryonic and induced pluripotent stem cells and their differentiated progenitors has advanced rapidly in the past few years.13 Most of this development has been undertaken with a medical application in mind, namely the replacement of differentiated cells lost through disease or injury, but a significant proportion has been directed toward differentiated cells that can be used in research, including toxicology. Their full worth has yet to be proven, but the prospect of unlimited supplies of differentiated human cells such as cardiomyocytes, hepatocytes, and neurons is tantalizing. One ethical problem, particularly apparent in the US, has been the derivation of embryonic stem cells from human embryos, which has to some extent impeded work. Now, however, the advent of induced pluripotent stem cells (iPSCs), which are derived by reprogramming normal somatic cells, offers an even more attractive way forward for toxicologists. Not only do iPSCs not carry that ethical burden, but they can also be derived from individuals of different genetic backgrounds. So, for the first time, we have the possibility of chemical and drug toxicological research, particularly on MOA, in relevant differentiated stem cells from humans of different genetic backgrounds. Issues such as variation in the epigenetic status of the differentiated cells and achieving consistency in the differentiation process remain. Nevertheless, the use of stem cell technologies in toxicology will probably produce relevant models and allow a much better assessment of risk through MOA work in an appropriate species and cell type.

Toxicology and Mathematics. The use of more advanced mathematics and statistics in toxicology arrived with the genomic revolution driven by microarrays and will continue with the application of next generation sequencing technologies. This, however, represents only one application of mathematics in toxicology and can be classified as data interpretation.
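As a minimal illustration of the data-interpretation side, the sketch below applies the Benjamini-Hochberg false discovery rate procedure to gene-wise p-values of the kind a microarray or sequencing comparison produces; the gene names, p-values, and 5% threshold are invented for the example.

```python
# Minimal sketch: Benjamini-Hochberg false discovery rate control applied to
# gene-wise p-values from a hypothetical treated-vs-control comparison.
# The gene names, p-values, and the 5% FDR threshold are illustrative only.

def benjamini_hochberg(pvalues, fdr=0.05):
    """Return indices of hypotheses rejected at the given false discovery rate."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # indices, smallest p first
    max_k = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank * fdr / m:
            max_k = rank  # largest rank whose p-value passes the BH condition
    # Reject every hypothesis up to and including that rank.
    return sorted(order[:max_k])

if __name__ == "__main__":
    # Hypothetical p-values for six genes.
    pvals = [0.0002, 0.009, 0.013, 0.04, 0.21, 0.76]
    genes = ["CYP1A1", "HMOX1", "GCLC", "NQO1", "ACTB", "GAPDH"]
    for i in benjamini_hochberg(pvals):
        print(f"{genes[i]} differentially expressed (p = {pvals[i]})")
```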

The second and perhaps more profound application of mathematics in toxicology is modeling, with the ultimate aim of building a mathematical model of the biological system in which chemical interactions can be assessed. Mathematical modeling of chemical interactions in biological systems can operate at several levels. For example, the relatively well established quantitative structure-activity relationship (QSAR) approach attempts to predict the toxicity profile of a chemical from its structure, and connectivity mapping achieves the same end using gene transcription signatures.14 While these approaches can potentially recognize chemical hazard, actual risk depends on many other factors. Therefore, further modeling is required, for example, of exposure by physiologically based pharmacokinetic (PBPK) modeling. The further development of genetics and MOA will inform these models as more is understood of the role of polymorphisms in determining internal chemical exposure and response. Ultimately, it can be hoped that mathematics will be able to model, and account for, the differences between an in vitro system and humans in vivo, allowing toxicological risk assessment to be evaluated entirely in in vitro systems. What has been clear from the early and somewhat disappointing development of QSAR is that expectations have outpaced the actual capability, leading to some frustration. This is most probably because the models have been built on incomplete data sets; as the data sets grow, the models will improve. This is an evolutionary science, and long-term commitment is required to achieve the desired output. The application of mathematics and computation in toxicology is explained well in a commentary by Rusyn and Daston.15
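To make the exposure-modeling idea concrete, the following is a deliberately minimal sketch, far simpler than a genuine PBPK model: a one-compartment pharmacokinetic calculation with first-order absorption and elimination. All parameter values are invented for illustration.

```python
# Minimal sketch of exposure modeling: a one-compartment pharmacokinetic model
# with first-order absorption (ka) and elimination (ke). A real PBPK model
# would use many physiologically defined compartments; the parameters here
# are invented purely for illustration.
import math

def concentration(t, dose_mg, f, v_l, ka, ke):
    """Plasma concentration (mg/L) at time t (h) after a single oral dose."""
    if ka == ke:
        raise ValueError("ka and ke must differ for this closed-form solution")
    return (f * dose_mg * ka) / (v_l * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

if __name__ == "__main__":
    # Hypothetical parameters: 100 mg oral dose, 80% bioavailability,
    # 40 L volume of distribution, absorption 1.2 /h, elimination 0.15 /h.
    for hour in range(0, 25, 4):
        c = concentration(hour, dose_mg=100, f=0.8, v_l=40.0, ka=1.2, ke=0.15)
        print(f"t = {hour:2d} h  C = {c:.3f} mg/L")
```

A full PBPK model would replace the single compartment with physiologically defined organ compartments linked by blood flows, and data on MOA and metabolic polymorphisms could then inform parameters such as clearance for particular subpopulations.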


CONCLUSIONS

We as individuals are likely to know a lot more about ourselves in the coming years as advancing technology becomes able to analyze our genomes, scan our bodies, or rapidly evaluate a wide range of biomarkers. This is going to make risk assessment of drug or environmental chemical exposure more individual, at least down to the level of populations and subpopulations. The good news is that these technologies, and models such as stem cells, are giving us new insight into MOA that will allow a better assessment of risk. Coping with these data and making sense of them will be a huge challenge for toxicology and for the training of toxicologists. The examples above are by no means exhaustive; others that could have been included are biomarkers for MOA and exposure assessment, all of the "omic" technologies, and mutant, transgenic, and humanized animal models, to name but a few. The toxicologist of the 21st century is going to need to be a very multiskilled individual. Are our universities meeting this challenge?

Timothy W. Gant
Guest Editor

REFERENCES

(1) Prüss-Ustün, A., Vickers, C., Haefliger, P., and Bertollini, R. (2011) Knowns and unknowns on burden of disease due to chemicals: a systematic review. Environ. Health 10 (9), 1–15.
(2) Cohen Hubal, E., Richard, A., Shah, I., Gallagher, J., Kavlock, R., Blancato, J., and Edwards, S. (2008) Exposure science and the U.S. EPA National Center for Computational Toxicology. J. Exposure Sci. Environ. Epidemiol. 20 (3), 213–236.
(3) Cressey, D. (2011) Pfizer slashes R & D. Nature 470, 154.
(4) Kola, I. (2008) The state of innovation in drug development. Clin. Pharmacol. Ther. 83 (2), 227–230.
(5) Maynard, R. (2006) Toxicology in the twenty-first century. Hum. Exp. Toxicol. 25 (3), 163–165.
(6) Jacobs, A., and Marnett, L. (2007) The future of toxicology: wrap up. Chem. Res. Toxicol. 20 (7), 983–985.
(7) Burcham, P. (2008) Toxicology down under: past achievements, present realities and future prospects. Chem. Res. Toxicol. 21 (5), 967–970.
(8) Ito, T., Ando, H., Suzuki, T., Ogura, T., Hotta, K., Imamura, Y., Yamaguchi, Y., and Handa, H. (2010) Identification of a primary target of thalidomide teratogenicity. Science 327, 1345–1350.
(9) Ayres, C. (2010) The Osbourne Identity. The Sunday Times of London, October 24.
(10) Johansson, I., and Ingelman-Sundberg, M. (2011) Genetic polymorphism and toxicology—with emphasis on cytochrome P450. Toxicol. Sci. 120 (1), 1–13.
(11) Kumaran, R., Vandrovcova, J., Luk, C., Sharma, S., Renton, A., Wood, N., Hardy, J., Lees, A., and Bandopadhyay, R. (2009) Differential DJ-1 gene expression in Parkinson's disease. Neurobiol. Dis. 36 (2), 393–400.
(12) (a) LeBaron, M., Rasoulpour, R., Klapacz, J., Ellis-Hutchings, R., Hollnagel, H., and Gollapudi, B. (2010) Epigenetics and chemical safety assessment. Mutat. Res. Fundam. Mol. Mech. Mutagen. 705 (2), 83–95. (b) Goodman, J., Augustine, K., Cunningham, M., Dixon, D., Dragan, Y., Falls, J., Rasoulpour, R., Sills, R., Storer, R., Wolf, D., and Pettit, S. (2010) What do we need to know prior to thinking about incorporating an epigenetic evaluation into safety assessments? Toxicol. Sci. 116 (2), 375–381.
(13) Wobus, A., and Loser, P. (2011) Present state and future perspectives of using pluripotent stem cells in toxicology research. Arch. Toxicol. 85 (2), 79–117.
(14) Lamb, J., Crawford, E., Peck, D., Modell, J., Blat, I., Wrobel, M., Lerner, J., Brunet, J., Subramanian, A., Ross, K., Reich, M., Hieronymus, H., Wei, G., Armstrong, S., Haggarty, S., Clemons, P., Wei, R., Carr, S., Lander, E., and Golub, T. (2006) The Connectivity Map: using gene-expression signatures to connect small molecules, genes, and disease. Science 313 (5795), 1929–1935.
(15) Rusyn, I., and Daston, G. (2010) Computational toxicology: realizing the promise of the toxicity testing in the 21st century. Environ. Health Perspect. 118 (8), 1047–1050.

dx.doi.org/10.1021/tx200204q | Chem. Res. Toxicol. 2011, 24, 985–987