
Editorial

Metrics to Evaluate Journals, Scientists, and Science: We Are Not There Yet

As a researcher and editor, I have the opportunity to travel and interact with scientists around the world. Inevitably, the discussion will turn to how Analytical Chemistry is doing, with changes in the journal’s impact oftentimes being one of the first questions (or comments for those in the know). I am proud that Analytical Chemistry’s impact, as determined by most quantitative metrics, ranks well relative to similar journals. However, these individuals are not asking about the difficult-to-measure broader “impact” of Analytical Chemistry on the measurement science field but rather about our score, or journal impact factor (JIF), as determined by Clarivate (formerly Thomson Reuters).

While most know this, the JIF is the number of citations a journal receives in a given year to the items it published in the prior 2 years, divided by the number of articles it published in those 2 years (with some complexity added to determine the denominator).1
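To make the arithmetic concrete (a sketch in my own notation, still glossing over exactly which items count in the denominator), the 2017 JIF of a journal would be roughly

\[ \mathrm{JIF}_{2017} = \frac{C_{2017 \rightarrow 2015} + C_{2017 \rightarrow 2016}}{N_{2015} + N_{2016}} \]

where \(C_{2017 \rightarrow y}\) is the number of citations received in 2017 to items the journal published in year \(y\), and \(N_{y}\) is the number of articles it published in year \(y\).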

Intriguingly, up to 75% of the articles published in a journal can have citations over the prior 2-year period totaling less than the journal’s impact score because of the skew created by a few highly cited articles.2 If you are selecting a journal to publish your work, the JIF is useful, but so are many other metrics, as recently discussed by the Scholarly Kitchen,3 including CiteScore, Impact per Publication, H5-index, Eigenfactor, and others. As one example, Google Scholar Metrics calculates many journals’ H5-indices. As we rank number one among analytical journals for our H5-index, I could suggest that the H5-index is an important metric. As an analytical chemist and Editor, I like quantitative numbers and certainly pay attention to a range of journal metrics to assess how Analytical Chemistry is doing.

I consider the use of the JIF metric to judge a scientist’s research output to be a larger issue of concern. Like many of you, I review tenure cases and receive job applications from faculty and postdoctoral associates where the candidates list the JIF for the journals they have published in. Think about this. The JIF is a hypothetical number of citations for a manuscript; most of us should be more interested in how our specific manuscripts were received by the research community. While perhaps an article in Analytical Chemistry will receive an average of six citations in its first 2 years (and many more citations after 5 years), when looking at an individual, I want to know how their specific articles fared, not journal citation averages. When evaluating an individual, should we not read their manuscripts and make an informed decision on their research creativity and importance, rather than make a judgment based on citations and related scores?

When department and university evaluation systems reduce the determination of faculty output to a series of metrics and base status, promotions, and salary on those metrics, the metrics take on a life of their own. Yes, I have heard from individuals that a specific manuscript under review in Analytical Chemistry is required to get tenure because they are short a paper based on their university’s scoring system. This reminds me of the classic statement: “On the folly of rewarding A while hoping for B”.4 If the reward system values the JIF more than long-term scientific impact/quality, then researchers work to maximize their number of high-JIF publications. Of course, there are several reasons for the explosion of metrics such as the JIF, including the ease of generating them and the desire to simplify complex comparisons (between journals, scientists, departments, and universities) to a single number;5 as these quantifiable evaluation systems are already in place, an understanding of them may be useful as you navigate your career. If you want to read many additional interesting comments and editorials on metrics in science, the recent Infozine from ETH Zürich is a fun read.5

To finish my thoughts, perhaps comparisons such as changes in a journal’s metrics can indicate a trend in how a journal is changing over time, but is it not better to select a journal to publish your manuscript in based on your target audience? As Renato Zenobi pointed out,6 most metrics offer too simplistic a view to capture the value of scientific advances, and they should be viewed with caution. It is better not to adjust your research objectives, publication practices, or field of study to optimize a metric, and do not judge others based on a simple number. Evaluate them by reading their manuscripts.



Jonathan V. Sweedler

AUTHOR INFORMATION

ORCID

Jonathan V. Sweedler: 0000-0003-3107-9922

Notes

Views expressed in this editorial are those of the author and not necessarily the views of the ACS.



REFERENCES

(1) McVeigh, M. E.; Mann, S. J. JAMA 2009, 302 (10), 1107−1109.
(2) Bohannon, J. Science 2016, DOI: 10.1126/science.aag0643.
(3) Davis, P. Citation Performance Indicators - A Very Short Introduction. Scholarly Kitchen, 2017.
(4) Kerr, S. Academy of Management Executive 1995, 9, 7−14.
(5) Dolenc, J.; Hünenberger, P.; Renn, O. Infozine 2016, 1.
(6) Zenobi, R. Infozine 2016, 31−32.


DOI: 10.1021/acs.analchem.7b01872