Editorial
pubs.acs.org/acssensors

Should There Be Minimum Information Reporting Standards for Sensors?

Two interrelated issues that are attracting considerable attention in the scientific literature at present are reproducibility of data and minimum reporting standards. Both relate to the reporting of science so that it can be reproduced by others and so that the results of one paper can be compared with another. Not being able to reproduce a body of work wastes enormous amounts of time and resources, so we all have a responsibility to do our best to report our results in a way that allows them to be replicated as easily as possible. And that is before considering the reputational harm irreproducible work does to scientific research as a whole. Both these issues have been covered in recent editorials on reproducibility of data1 and standardization of nanomaterials2 in our sister journals, Analytical Chemistry and ACS Nano, respectively. So, we do not seek to repeat those eloquent editorials, but rather to discuss the question: how does this relate to sensing?

The reproducibility problem was underscored in a 2012 study by the pharmaceutical company Amgen, whose team of highly qualified scientists could reproduce only 6 of 53 high-profile papers in hematology and oncology. In that study, Amgen applied quite stringent criteria for designating a result as “nonreproduced”: a finding was nonreproduced when it was not “sufficiently robust to drive a drug-development programme”.3 This is possibly the best known of these types of studies, although there are others4 across disciplines. Such studies do not necessarily suggest that these challenges with reproducibility are a result of disingenuous behavior, but they do highlight the importance of presenting the experimental methods and details in a paper in a clear and complete manner that allows others to repeat the work. Equally important, but less often emphasized, is the need for experimental details to be presented in a way that allows a clear comparison between experimental studies.

Being able to make comparisons between methods and materials is where the two concepts of standards and minimum reported information come in. Minimum information for reporting has been the subject of a number of projects in the biological sciences. For example, minimum information or reporting guidelines have been promoted for genome sequences,5 microarray experiments,6 the annotation of biochemical models,7 biological and biomedical investigations,8 and, recently, single amplified genomes.9 Within the chemical community, of course, our organic chemistry, inorganic chemistry, and crystallography colleagues have long held standard expectations for the characterization data needed when describing a new molecule or crystal.

As is clear from the ACS Nano editorial,2 the nanomaterials community is beginning to think about and discuss the minimum information that should be reported to facilitate both direct reproduction of reported results and comparison between studies. Such standardization of methods is particularly challenging in a field like nanotechnology, where there is not only a vast array of materials but also a vast array of ways these materials can be assembled and further functionalized to create the nanosystem of interest. Furthermore, there exists a plethora of methods for characterizing these materials, which are not always readily accessible. This challenge in standardizing nanomaterials has parallels in sensing, where we are dealing with many different types of sensors, transducers, modification schemes, nanomaterials, and recognition molecules.

So, is it even possible to have minimum reporting requirements in sensing? At ACS Sensors, we think the time is right to start discussing minimum reporting requirements for sensing publications so as to allow direct comparison between studies. We argue that, despite the rapid recent developments in sensing, it is quite a mature field: the biosensing concept and the first semiconducting gas sensors emerged in 1962, ion-selective field-effect transistors date back to 1973, and some chemical sensors were in use long before these dates.

So, how should one balance the complexity of sensors with the need for minimum reporting guidelines? Despite the breadth and complexity of sensors, we feel the logical way to identify minimum reporting requirements for such a complex field is to focus on what the sensor is designed to achieve, that is, to provide analytical data. Like all journals with a focus on analytical chemistry, in our Guidelines for Authors we ask for detailed experimental protocols so that the work can be reproduced. Unlike many other analytical journals, our guidelines state: “All analytical data should include uncertainties, comparisons to a standard analytical method, and demonstration of the sensor’s performance in the complex samples for which the device is intended to be used.” So, in many ways we already have a minimum information requirement. We would argue that these requirements are simply the minimum your colleagues will want to know about your sensor: How reproducible is it? Does it actually give the right answer? And is it fit-for-purpose? The last point goes beyond “does it work in the intended samples” in that it embraces the analytical requirements of the sensor in terms of the concentration range in which it operates, its detection limit, its selectivity, and its reproducibility, as appropriate for the intended application.

It is important here to add a caveat: not all papers in ACS Sensors satisfy these analytical criteria, and that is for a reason. A journal like ACS Sensors is in the business of transmitting new ideas, ways of thinking, and concepts. So, apart from application papers, we also seek papers that have a high degree of novelty, where providing such analytical data may not be viable at such an early stage. And we seek papers that address fundamental aspects of sensing, where the analytical performance is not even the focus of the paper. From the papers we receive, the requirements for uncertainties and for demonstration that the sensor works in complex samples are generally embraced by our authors (although we do see papers where the complex sample, such as a biological fluid, is diluted 10,000- to a million-fold, which hardly reflects how a sensor might be used in most cases).
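To make the analytical criteria above concrete, the short sketch below shows one common way to extract a detection limit and a simple reproducibility figure from calibration data. It is a minimal illustration only: the calibration and replicate data are invented for the example, and the 3s/slope detection-limit estimate is just one widely used convention, not a requirement prescribed by this editorial.

```python
import numpy as np

# Hypothetical calibration data for a sensor (concentrations and responses
# are invented for illustration; units are arbitrary).
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])      # analyte concentration
resp = np.array([0.02, 0.13, 0.25, 0.61, 1.19, 2.38])  # sensor response

# Least-squares fit over the linear range: response = slope * conc + intercept.
slope, intercept = np.polyfit(conc, resp, 1)

# Standard deviation of the fit residuals as an estimate of response noise
# (n - 2 degrees of freedom for a two-parameter linear fit).
residuals = resp - (slope * conc + intercept)
s_y = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

# One common detection-limit convention: LOD = 3 * s / slope.
lod = 3.0 * s_y / slope

# Replicate measurements of a single sample give a simple reproducibility
# figure as a relative standard deviation (values again hypothetical).
replicates = np.array([0.60, 0.63, 0.59, 0.62])
rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()

print(f"slope = {slope:.3f}, LOD ~ {lod:.2f}, RSD = {rsd:.1f}%")
```

The point of the sketch is not the particular statistic but that slope, noise, detection limit, and replicate spread are all cheap to report and are exactly the numbers a reader needs to judge fitness for purpose.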

The one aspect we believe could be improved is showing how well the sensor actually works relative to a standard method. In fact, we feel the whole area of comparisons is something we, as a community, must improve upon. We often see comparisons that are somewhat flawed. For example, in areas where advances in materials are driving developments, comparisons often consist of the authors’ preferred material measured against another material they made, which they know does not work well, rather than against the state of the art in the literature. Another common approach is to compare the sensor’s analytical performance with selected examples from the literature in tabulated form. Although better than no comparison, such tables are seldom comprehensive, and they still fail to answer the question of whether the sensor actually gives reliable analytical data. Although on some occasions this is not possible, such as when the sensor operates in an unusual environment or has unrivaled performance parameters, in most cases we strongly advocate that authors make an experimental comparison in which the same sample is analyzed with an established method and with the new sensor (see the sketch below). For example, there are many gas sensors already on the market; many metals can be analyzed using standard and commonly available analytical methods; a lot of proteins and small organic molecules can be quantified by commercially available enzyme-linked immunosorbent assays; and there are standard methods accepted by industry standards agencies, like AOAC, for most organics, inorganics, gases, and some biological species.

Of course, many of the “standard” methods are not amenable to emerging needs for high speed, high throughput, in-field or point-of-care scenarios, real-time analysis, extremely low power, small form factor, minimal cost, limited calibration, etc., and it is appropriate to highlight the advantages of a new sensor or sensor methodology against the existing benchmarks. The point of highlighting advantages also comes back to the concept of fit-for-purpose.

One potential barrier to the presentation of comparative data is some authors’ concern that the referees, or the journal, will reject their paper if the sensor does not match a published sensor or method on all criteria. As we stated above, the focus of ACS Sensors is very much toward presenting new ideas. So, a sensor might not perform as well as some other sensors, but it might be based on new concepts or materials, or it might be intended for a different purpose where its performance is more than adequate. Better performance is therefore not always essential, but we do want to know how well the sensor works, as a benchmark from which the idea can be improved. We would think that such comparisons are a logical minimum reporting requirement for a new sensor.
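As one illustration of the kind of experimental comparison advocated above, the sketch below summarizes hypothetical paired measurements of the same samples by an established reference method and by a new sensor. The data, and the choice of a simple bias summary plus a paired t-test, are our own assumptions for the example; any statistically sound comparison would serve equally well.

```python
import numpy as np
from scipy import stats

# Hypothetical paired results: the same samples analyzed with an established
# reference method and with the new sensor (values invented for illustration).
reference = np.array([4.8, 9.6, 15.1, 20.3, 30.2])  # e.g., mg/L
sensor = np.array([5.1, 9.2, 15.8, 19.7, 31.0])

# Bland-Altman-style summary: mean bias and the spread of the differences.
diff = sensor - reference
print(f"mean bias = {diff.mean():+.2f}, SD of differences = {diff.std(ddof=1):.2f}")

# Paired t-test: is the bias distinguishable from zero at this sample size?
t_stat, p_value = stats.ttest_rel(sensor, reference)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

A table of literature figures cannot answer “does the sensor give the right answer on my samples?”; a paired experiment like this one can, and it costs only a handful of additional measurements.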

J. Justin Gooding, Editor-in-Chief
The University of New South Wales, Sydney, Australia

Eric Bakker, Associate Editor
The University of Geneva, Geneva, Switzerland

Shana Kelley, Associate Editor
The University of Toronto, Toronto, Canada

Yitao Long, Associate Editor
East China University of Science and Technology, Shanghai, China

Maarten Merkx, Associate Editor
Technische Universiteit Eindhoven, Eindhoven, The Netherlands

Michael Sailor, Associate Editor
University of California, San Diego, United States

Nongjian Tao, Associate Editor
Arizona State University, Tempe, United States

AUTHOR INFORMATION

ORCID

J. Justin Gooding: 0000-0002-5398-0597
Eric Bakker: 0000-0001-8970-4343
Shana Kelley: 0000-0003-3360-5359
Yitao Long: 0000-0003-2571-7457
Maarten Merkx: 0000-0001-9484-3882
Michael Sailor: 0000-0002-4809-9826

Notes

Views expressed in this editorial are those of the authors and not necessarily the views of the ACS.

Received: October 6, 2017
Published: October 27, 2017
DOI: 10.1021/acssensors.7b00737; ACS Sens. 2017, 2, 1377−1379


REFERENCES

(1) Sweedler, J. V. Striving for Reproducible Science. Anal. Chem. 2015, 87 (23), 11603−11604.
(2) Mulvaney, P.; et al. Standardizing Nanomaterials. ACS Nano 2016, 10 (11), 9763−9764.
(3) Begley, C. G.; Ellis, L. M. Raise standards for preclinical cancer research. Nature 2012, 483 (7391), 531−533.



(4) Aarts, A. A.; et al. Estimating the reproducibility of psychological science. Science 2015, 349 (6251), 8.
(5) Field, D.; et al. The minimum information about a genome sequence (MIGS) specification. Nat. Biotechnol. 2008, 26 (5), 541−547.
(6) Brazma, A.; et al. Minimum information about a microarray experiment (MIAME) - toward standards for microarray data. Nat. Genet. 2001, 29 (4), 365−371.
(7) Le Novère, N.; et al. Minimum information requested in the annotation of biochemical models (MIRIAM). Nat. Biotechnol. 2005, 23 (12), 1509−1515.
(8) Taylor, C. F.; et al. Promoting coherent minimum reporting guidelines for biological and biomedical investigations: the MIBBI project. Nat. Biotechnol. 2008, 26 (8), 889−896.
(9) Bowers, R. M.; et al. Minimum information about a single amplified genome (MISAG) and a metagenome-assembled genome (MIMAG) of bacteria and archaea. Nat. Biotechnol. 2017, 35 (8), 725−731.

