Viewpoint pubs.acs.org/est

Could the Quality of Published Ecotoxicological Research Be Better?

Catherine A. Harris* and John P. Sumpter

Institute for the Environment, Brunel University London, Uxbridge, Middx, UB8 3PH, United Kingdom

An increasing number of scientific publications (including some describing ecotoxicological research) are found to contain poor quality experiments and unreproducible results. This is widely recognized as being a problem not only from an ethical and even obligatory point of view (scientists have a responsibility both to the funders and to the wider society to undertake accurate and objective research), but also from a more practical standpoint, that is, that of the use of peer-reviewed data to underpin the management of the environment. The use of erroneous data in the latter field can lead to costly mistakes, both with respect to wildlife and economics.

Evidently this issue of poor-quality science is not new. For over a decade, Ioannidis and colleagues have been pointing out that a very significant percentage of all claims in biomedical research are false (i.e., not reproducible), and have been providing the reasons why this is so.1 There is now wide acceptance that many claims in the biomedical literature cannot be substantiated. Direct attempts to replicate more than 50 of the key findings of recent years reported in the biomedical literature have shown that a high proportion cannot be replicated, albeit to different degrees.2 The situation in other fields of research is less clear. There has, for example, been no systematic analysis of the ecotoxicology literature conducted in the way it has been in the biomedical field. It is nonetheless clear that many ecotoxicologists have become concerned about the quality of published research in their field, and some have started to address the issue objectively. For example, Agerstrand and colleagues have built on the start provided by Klimisch to develop and use criteria capable of evaluating ecotoxicology data.3,4

In 2014 we published a paper in this journal presenting 12 "Principles of Sound Ecotoxicology" that, in our opinion, should be adhered to in order to produce a reliable set of data in this area of research.5 The paper also aimed to provide guidance for reviewers/regulators assessing publications, as well as for young researchers beginning their careers. We have recently been using some of these principles to assess the current state of affairs with respect to the quality of published ecotoxicological research. We embarked on this as a first step toward gauging the scale of the problem at hand, as well as to start to pinpoint which areas of scientific (specifically, ecotoxicological) experiments are most often neglected, in order that we can start to consider practical measures to improve the quality of this research. We were not aiming to identify all of the causes of poor science in this instance, but rather to identify some of the most regularly occurring "mistakes" that can lead to irreproducible data.

We selected two journals specializing in toxicological publications (Environmental Toxicology and Chemistry (ET&C), and Aquatic Toxicology), and one (ES&T) which has a high number of papers published in this field each month. From these, we analyzed ecotoxicology papers published during the first 6 months of 2013. All (66) of the relevant publications from this time period were analyzed from ET&C; a random selection (up to 10 from each issue; 58 in total) was analyzed from Aquatic Toxicology; and all (49) of the ecotoxicology articles from ES&T were analyzed. Our primary aim was to get a preliminary feel for the overall situation and not to draw comparisons between the individual journals. Any apparent differences between the journals may partly be a reflection of the somewhat different remit of the three journals.

Rather than using all 12 of the "Principles of Sound Ecotoxicology", which would have included several somewhat subjective end points, we instead chose the three most objective Principles, in order to try to avoid the potential for bias. These were:

• Was the actual concentration of the test chemical measured?
• Was more than one concentration of chemical used?
• Was the experiment repeated (within the current publication)?

While these appear to be objective and hence easy to apply, we discovered that even the most apparently objective criterion has an element of subjectivity to it. In many cases this was due to inadequate reporting and not necessarily to poor study design.
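The scoring exercise described above reduces each surveyed paper to three yes/no answers, which are then expressed as per-journal percentages (the quantities plotted in Figure 1). The following Python snippet is a minimal sketch of that bookkeeping, not the authors' actual workflow; the record fields, the example entries, and the helper function percent_fulfilling are hypothetical illustrations.

```python
# Minimal sketch of tallying three yes/no quality criteria per journal.
# The field names and example records are hypothetical, not the authors' data.
from collections import defaultdict

papers = [
    # journal, measured concentration?, >1 concentration?, repeated?
    {"journal": "ET&C", "measured": True, "multi_conc": True, "repeated": False},
    {"journal": "Aquatic Toxicology", "measured": False, "multi_conc": True, "repeated": False},
    {"journal": "ES&T", "measured": True, "multi_conc": False, "repeated": True},
]

CRITERIA = ("measured", "multi_conc", "repeated")

def percent_fulfilling(records):
    """Return, per journal, the percentage of papers meeting each criterion."""
    counts = defaultdict(lambda: {"n": 0, **{c: 0 for c in CRITERIA}})
    for rec in records:
        tally = counts[rec["journal"]]
        tally["n"] += 1
        for c in CRITERIA:
            tally[c] += bool(rec[c])  # count papers answering "yes"
    return {
        journal: {c: 100.0 * tally[c] / tally["n"] for c in CRITERIA}
        for journal, tally in counts.items()
    }

if __name__ == "__main__":
    for journal, pct in percent_fulfilling(papers).items():
        print(journal, {c: round(p) for c, p in pct.items()})
```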



Figure 1. Percentage of papers published in Environmental Toxicology and Chemistry, Aquatic Toxicology, and Environmental Science and Technology in the first 6 months of 2013 that were found to fulfill three criteria for good-quality research.

The results of this analysis are presented in Figure 1. Essentially, we found that the majority of papers used more than one concentration of chemical in the study in question, and also reported measured concentrations of the experimental chemical. The majority of studies, however, did not describe a repeat study to ensure reproducibility.

While this may sound positive in the areas of establishing dose-related responses and measuring the actual concentrations of chemical used, a closer look at the actual numbers presents a somewhat bleaker picture. The analysis shows that, in one journal, two-fifths of publications did not report measured concentrations of the chemical used. That is, there was no indication that the chemical in question was actually present in the exposure medium, and if it was, there was no confirmation that it was present at, or near to, the expected concentrations. Hence these data are of no use for Risk Assessment purposes. With regard to the use of more than one concentration of chemical, up to one-third of publications assessed only a single concentration of the test substance. Again, this provides no information of any use to a Risk Assessor, as there is no indication of the threshold of response or the steepness of any potential dose−response curve.

The situation with respect to repeat studies is clear. The vast majority of publications (85%, 71%, and 53% in ET&C, Aquatic Toxicology, and ES&T, respectively) reported either the results of only one experiment, or results from more than one experiment with no overlap between the experiments, and hence these could not be considered repeat studies. In some instances the observed response is highly statistically significant, and the study uses a number of different chemical concentrations (and a high number of replicates) to provide a robust analysis from a single experiment; in such cases the lack of repetition could perhaps be understood given current restraints on funding. In very many cases, however, the data are not robust enough to conclude that the response is unequivocally a result of exposure to that chemical, and in such cases the authors should acknowledge that the experiment needs to be repeated in order to demonstrate that the results are reproducible.

While this is in no way a comprehensive analysis of the currently available ecotoxicological literature, it does provide an indication of the amount of inadequate research that is being conducted (and, indeed, published) in this field. We are not able at this stage to provide solutions to the current problems; those will require changes in all organisations involved in research, from funding bodies through to the scientists doing the research and the journals publishing it.



AUTHOR INFORMATION

Corresponding Author

*E-mail: [email protected].

Notes

The authors declare no competing financial interest.


ACKNOWLEDGMENTS

Many thanks to Defra for funding this work.

REFERENCES

(1) Ioannidis, J. P. A. Why most published research findings are false. PLoS Med. 2005, 2 (8), 696−701.
(2) Begley, C. G.; Ellis, L. M. Raise standards for preclinical cancer research. Nature 2012, 483, 531−533.
(3) Agerstrand, M.; Küster, A.; Bachmann, J.; Breitholtz, M.; Ebert, I.; Rechenberg, B.; Ruden, C. Reporting and evaluation criteria as means towards a transparent use of ecotoxicity data for environmental risk assessment of pharmaceuticals. Environ. Pollut. 2011, 159 (10), 2487−2492.
(4) Klimisch, H. J.; Andreae, M.; Tillman, U. A systematic approach for evaluating the quality of experimental toxicological and ecotoxicological data. Regul. Toxicol. Pharmacol. 1997, 25, 1−5.
(5) Harris, C. A.; Scott, A.; Johnson, A.; Panter, G. H.; Sheahan, D.; Roberts, M.; Sumpter, J. P. Principles of Sound Ecotoxicology. Environ. Sci. Technol. 2014, 48, 3100−3111.
