EDITOR-IN-CHIEF
Laura L. Kiessling, University of Wisconsin, Madison

BOARD OF EDITORS
Jennifer A. Doudna, University of California, Berkeley
Kai Johnsson, École Polytechnique Fédérale de Lausanne
Anna K. Mapp, University of Michigan, Ann Arbor
Michael A. Marletta, University of California, Berkeley
James R. Williamson, The Scripps Research Institute

EDITORIAL ADVISORY BOARD
Carolyn R. Bertozzi, University of California, Berkeley
Brian T. Chait, Rockefeller University
Tim Clackson, ARIAD Pharmaceuticals, Inc.
Jon C. Clardy, Harvard Medical School
Benjamin F. Cravatt, The Scripps Research Institute
Peter B. Dervan, California Institute of Technology
Rebecca W. Heald, University of California, Berkeley
Tony Hunter, Salk Institute
Richard H. Kramer, University of California, Berkeley
Rolf Müller, Saarland University/Helmholtz Institute for Pharmaceutical Research Saarland
Joseph P. Noel, Howard Hughes Medical Institute, The Salk Institute for Biological Studies
Thomas V. O’Halloran, Northwestern University
Hiroyuki Osada, RIKEN
Anna M. Pyle, Yale University
Ronald T. Raines, University of Wisconsin, Madison
Charles Sawyers, University of California, Los Angeles
Stuart L. Schreiber, Harvard University
Carsten Schultz, EMBL
Peter G. Schultz, The Scripps Research Institute
Michael P. Sheetz, Columbia University
H. Ulrich Stilz, Sanofi-Aventis, Frankfurt
Hiroaki Suga, The University of Tokyo
Wilfred A. van der Donk, Howard Hughes Medical Institute, University of Illinois at Urbana-Champaign
Christopher T. Walsh, Harvard Medical School


EDITOR'S LETTER

Controlling Science

With the fallout of the financial crisis reaching all aspects of life in most countries, the media continue to report on the problems faced by different industries. So I was not surprised to hear a discussion of innovation management on the radio. The president of a European university program that offered courses to companies for substantial fees reported that 2009 had seen a precipitous drop in the number of companies interested in innovation management courses. This was not unexpected, but she mentioned that one area that had seen drastically increased interest was "innovation controlling". Innovation controlling professes to teach companies how to assess the return on each euro invested in innovation. In times of tight budgets, companies are particularly interested in controlling research and development efforts to maximize return on investment.

Innovation controlling is now spreading so rapidly through science that we are beginning to see it in every institution. Universities, research institutes, and of course commercial ventures all seem to have become obsessed with controlling. Strategic planning, implementation, and outcome assessment are clearly important. In many cases, results can be clearly measured and adjustments to the process can improve efficiency. Manufacturing production is a great example of a process that benefits from strict control. In my opinion, innovation is at the other end of the spectrum. We can measure qualitative and quantitative factors, but how do we measure innovation?

A lot of smart people have put a lot of thought into this question and have come up with plenty of end points for appraisal (apart from the number of publications). As a result, there is a host of bibliometric metrics, such as citation counts, the h-index, relative impact, impact normalized over a field, and so on. These numbers do provide useful data about a researcher, and that data is often representative. But this system can be gamed, too (1). One might argue that the number of patents could also be used to assess innovation. However, this is not always an accurate index either; many academic scientists refrain from patenting discoveries in order to get broader exposure for their work. Counting the research funding that a researcher brings in is also an imperfect measure, since money raised does not correlate perfectly with scientific output. Possibly the most important output of academics is the "production" of young scientists, since this activity multiplies our impact as scientists. But is it more important to produce professors at high-ranking research universities or researchers in small or large companies? In my opinion, it is impossible to answer this question accurately.

Taken together, these quantitative indicators can provide a snapshot of a researcher, a group, or an institute that can be compared to others at a similar stage, and such data can be used in strategic planning. But in the end, it takes experts who understand the field to interpret the data and to really understand the science.
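To see how mechanical these indicators are, consider the h-index: the largest number h such that a researcher has h papers that have each been cited at least h times. A minimal sketch in Python, assuming per-paper citation counts are available as a plain list (the numbers in the example are invented purely for illustration):

    # Minimal, illustrative h-index calculation; the citation counts in the
    # example are invented, not taken from any real researcher.
    def h_index(citations):
        """Largest h such that h papers each have at least h citations."""
        ranked = sorted(citations, reverse=True)        # most-cited papers first
        h = 0
        for rank, cites in enumerate(ranked, start=1):  # rank is 1-based
            if cites >= rank:
                h = rank                                # rank papers still have >= rank citations
            else:
                break
        return h

    # Five papers cited 10, 8, 5, 4, and 3 times yield an h-index of 4.
    print(h_index([10, 8, 5, 4, 3]))

A single number of this kind says nothing about why the papers were cited, which is part of why such metrics can be gamed and why they cannot replace expert judgment.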
Controlling to some degree is necessary, because institutions want to know how funds are being used and how their performance compares to that of other institutions. At the same time, controlling should not result in researchers in academia and industry wasting valuable time constantly filling out reports on minute details of their activities. In recent times, I have noticed an alarming increase in university administrative activity devoted to planning. These administrators produce a large quantity of data without concomitant consideration of its usefulness. A colleague put the situation into a rather fitting picture: instead of putting more horses before the cart that has to be pulled, more drivers are sitting down on the cart and investing in better whips.

Here is what I propose. Instead of further inflating administration, it is time that universities and research institutions consider doing the following: (i) making sure they hire really good people, (ii) giving these people the opportunity to do their jobs effectively and largely undisturbed, so that they can focus on the task at hand, and (iii) assessing progress at longer intervals (such as every five years) and taking corrective measures only when necessary.

Such a system can indeed work, as I have experienced during my tenure at both ETH Zurich and the Max-Planck Society. Once everyone is on board with shared ideals, the leaders trust the scientists, and relatively "slim" administrations can view themselves as enabling the core functions of teaching and research. Arguments have been made that only well-funded, "rich" institutions can afford these initiatives, but I think that the model can succeed at all levels.

The key word here is trust. If you hire people whom you cannot trust, even constant controlling will not help much, and investing in more controlling will frustrate those who are constantly bothered to provide reports and data. It is time that scientists take a stand against the controlling nonsense sweeping the land, and it will have to be the senior researchers who lead the charge.

Peter H. Seeberger
Max-Planck Institute for Colloids and Surfaces, Potsdam, and the Freie Universität Berlin

REFERENCE

1. Williamson, J. R. (2009) My h-index turns 40: my midlife crisis of impact, ACS Chem. Biol. 4 (5), 311–313.

ACS Chemical Biology 2010, Vol. 5, No. 7, pp 623–624. DOI: 10.1021/cb1001735. Published online July 16, 2010. © 2010 American Chemical Society.