Faculty Evaluation by Formula

This paper, which is intended to introduce readers to the notion of a formula approach to faculty evaluation, has troubled me for a long time. It has reminded me of the time that a friend who is an excellent established chemist commented while reading a paper written by a junior colleague, "X looks as if he is going to publish and perish."

I embrace the use of formulas for faculty evaluation because they put a departmental value system out front. As such, rational persons may decide upon the formula's worth, and, therefore, the value system's worth. My main interest is in the teaching of chemistry. Teaching must be valued in order to be rewarded. Teaching often goes unrewarded and even unnoticed in large, research-oriented departments. A formula can deal with teaching explicitly, and thereby make it difficult for teaching to go unnoticed. Formulas do not necessarily promote collegiality; they are not particularly prescriptive with respect to faculty development. Although it is possible for me to describe the application of a formula within my department, I will instead describe how a formula can be arrived at in a collegial fashion for the purposes of distributing merit increases and shaping the efforts of a faculty toward achieving collective goals.

The faculty begins by deciding what an appropriate distribution of effort might be. For a chemistry department in a community college this might be 85% teaching, 13% service, and 2% research. For a graduate chemistry department this might be 35% teaching, 50% research, and 15% service. Administrative inputs are necessary and appropriate to help guide this decision; within a given college, for example, not all departments need have the same distribution. It is not necessary that each person apportion his or her own effort to match the departmental distribution of effort. The presumed ultimate objective would be to have the distribution of merit increases, etc., track the departmental distribution: if 50% of the effort goes toward research, then 50% of the merit increase would be apportioned according to the various faculty members' research productivity.

Let's assume that your colleagues are still speaking to one another after the departmental distribution is decided. How are an individual's contributions in each area taken into account? A general premise is that each contribution be considered to be the product of an extensive quantity and an intensive quantity: the amount of each contribution times a quality factor for each contribution.

Research is alleged to be "easy" to measure. Assume that your departmental faculty decides that research should be measured in terms of grants, publications, and presentations.
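
To make the merit-pool principle above concrete, here is a minimal sketch of the apportionment arithmetic. The 35/50/15 distribution is the graduate-department example from the text; the faculty names, category scores, and pool size are hypothetical, and each score is assumed to already be the product of an amount and a quality factor.

    # Minimal sketch of the merit-pool principle described above.
    # The 35/50/15 distribution is the graduate-department example from
    # the text; names, scores, and the pool size are hypothetical.

    DISTRIBUTION = {"teaching": 0.35, "research": 0.50, "service": 0.15}
    MERIT_POOL = 30000.0  # total merit-increase dollars (hypothetical)

    # Each score is already "amount times quality factor" for that category.
    scores = {
        "Smith": {"teaching": 12.0, "research": 3.0, "service": 2.0},
        "Jones": {"teaching": 6.0, "research": 9.0, "service": 1.0},
        "Lee":   {"teaching": 9.0, "research": 5.0, "service": 4.0},
    }

    def merit_increases(scores, distribution, pool):
        """Apportion each category's share of the pool by each person's
        share of that category's total score."""
        increases = {name: 0.0 for name in scores}
        for category, weight in distribution.items():
            category_pool = weight * pool
            total = sum(person[category] for person in scores.values())
            for name, person in scores.items():
                increases[name] += category_pool * person[category] / total
        return increases

    if __name__ == "__main__":
        for name, amount in merit_increases(scores, DISTRIBUTION, MERIT_POOL).items():
            print(f"{name}: ${amount:,.2f}")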

".

A.

~

~

Grants probably ought not count dollar for dollar; there probably should be some accounting for differences, say a weighting function of ∛$, where $ represents the total annual amount of grants in effect. Should it matter where the dollars come from? If so, weighting factors such as NSF = NIH = 1.0, Navy = DOE = 0.8, and EPA contract = 0.35 might be used. Should one count total grant dollars, or only total overhead dollars? After all, the overhead dollars are the ones that directly benefit the department. Should grants for "teaching" (CAUSE, LOCI, precollege teacher training, etc.) be counted here? Should intrauniversity grants count? Should proposals that are not funded count? If so, should reviews be read and considered? How should grants jointly obtained by two or more faculty be counted? It is not easy to decide how to count research dollars. It is obvious that one's selection of the criteria is critical both to the future pattern of departmental funding and to the maintenance of collegiality.

How should papers be counted? Should there be a weighting function based upon the number of papers? Should there be weighting based upon the number of pages per article, nᵢ, say N = Σᵢ ∛nᵢ? Should there be a weighting according to journal? How much more valuable is a data-based article in JACS than a non-data-based article in this Journal, if at all? Should there be an expected difference according to area; e.g., are synthetic organic chemists really expected to be more prolific than statistical mechanicians? Should a score from the Citation Index be weighted in? Should there be a systematic external peer review designed to assign quality factors? How should research awards and prizes be counted in?
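
As one way to see how such weightings behave, here is a minimal sketch. The source weighting factors and the cube-root forms come from the text; applying the source weight to the dollars before taking the cube root, and the sample figures, are assumptions.

    # Minimal sketch of the grant and paper weightings discussed above.
    # The source weights and cube-root forms come from the text; applying
    # the source weight before the cube root, and the sample data, are
    # assumptions.

    SOURCE_WEIGHTS = {"NSF": 1.0, "NIH": 1.0, "Navy": 0.8, "DOE": 0.8, "EPA contract": 0.35}

    def grant_score(grants):
        """grants: list of (source, annual dollars in effect) pairs.
        Returns the cube root of the weighted total annual dollars."""
        weighted_total = sum(SOURCE_WEIGHTS[source] * dollars for source, dollars in grants)
        return weighted_total ** (1.0 / 3.0)

    def paper_score(pages_per_article):
        """N = sum over articles of the cube root of the page count n_i."""
        return sum(n ** (1.0 / 3.0) for n in pages_per_article)

    if __name__ == "__main__":
        print(grant_score([("NSF", 45000), ("DOE", 20000)]))  # about 39.4
        print(paper_score([12, 4, 7]))                        # about 5.8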

How should presentations be counted? How much more valuable is an invited paper at a Gordon Conference than a contributed paper to the Podunk Academy of Science? Clearly, the way in which papers and talks are counted determines whether there will be lots of little ones or a few big ones, whether they will be in quickie press or prestige journals, and whether they will be at local forums or broader forums. What mix of grants, papers, and presentations is appropriate? Ours happens to be 3:2:1. How should joint papers be counted: as one per the number of authors, or as something else? This decision has long-term effects upon collegiality, for sure!

The formula game can be a great deal more than counting papers. Would it be a valuable exercise for you and your colleagues to set weights and functions for counting research contributions in the hope of clarifying departmental goals and rewards? I'm quite sure that the evaluation of research in your department is quantitative but is not based upon as many as four of these dimensions. Perhaps research performance isn't as easy to evaluate as prevalent clichés would have us believe.
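
Here is a minimal sketch of how a composite research score using the 3:2:1 grants:papers:presentations mix mentioned above might be combined. Splitting a joint paper evenly among its authors is only one of the options the text raises; the function names and sample values are hypothetical.

    # Minimal sketch of a composite research score with the 3:2:1
    # grants:papers:presentations mix mentioned above. The even split of
    # joint papers and the sample values are assumptions.

    def research_score(grant_score, paper_author_counts, presentation_count,
                       mix=(3.0, 2.0, 1.0)):
        """paper_author_counts: one entry per paper, giving its number of
        authors; each paper contributes 1/(number of authors).
        Presentations are counted one each."""
        papers = sum(1.0 / authors for authors in paper_author_counts)
        w_grants, w_papers, w_talks = mix
        return w_grants * grant_score + w_papers * papers + w_talks * presentation_count

    if __name__ == "__main__":
        # one sole-author paper, one three-author paper, two talks
        print(research_score(grant_score=39.4,
                             paper_author_counts=[1, 3],
                             presentation_count=2))  # about 123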

Let's now look at service. How do we count committee work: based upon memberships? Hours spent? Committee importance? What weight does departmental work have relative to college work, etc.? Are national committees especially good? Does paid consulting for DuPont count as much as reviewing for NSF? Does running the department's United Fund drive count? Does coaching the school's ping pong team count? Does being a school board member count? Does being a Scoutmaster count? Does being a church elder count? How is the quality of committee work to be assessed? How much extra should chairing a committee count? Does student advising fit in here, under teaching, or in a separate fourth category?

Now let's turn to teaching. Obviously, the number of hours taught should have some bearing. Should the level of a course count, and, if so, which course levels should count for more? Should class size be counted? How should laboratory hours be counted relative to lecture hours? How should outside-of-class hours be counted, if at all? How should the number of preparations be counted?

No mention has yet been made of the most controversial aspect of faculty evaluation, which is the evaluation of teaching quality. Valid and important criteria of quality can be measured for teaching without ever asking a student his or her opinion of teaching quality! Course content can be evaluated by peers by comparing the level of examination questions with those of an accepted contemporary text. Students can use a simple yes/no form to report on unacceptable teaching behaviors such as missing class or not returning graded papers. Of course, some student rating is often employed to arrive at a quality factor.

My perception of the problem with the underrating of teaching as a part of the faculty member's performance in research departments is that teaching and service assignments are often counted in a simple-minded fashion and tend to be evenly distributed; i.e., there exists a ". . . we all teach nine contact hours and serve on three committees" policy. This leaves research as the only variable. Given only the function

T = Σᵢ Nᵢ [0.1 + 0.1 ∛nᵢ]

where T is the formula number of teaching hours, Nᵢ is the number of lecture contact hours, nᵢ is the enrollment, and i is the index for courses taught by the teacher, any administrator could field excellent lecturers for large undergraduate classes. They would count just a little bit more, and everyone with available time and interest would want the large class assignments first! (It usually happens in the reverse order; staffing of graduate courses with enrollments of 5 takes place first.) A short computational sketch of this formula is given below.

To make a formula work to collective advantage, one must be prepared to allow that some colleagues earn zero scores in certain categories. Good teachers who are poor researchers can emphasize teaching; good researchers who are poor at teaching can emphasize research. Faculty good at everything can contribute where they are most needed. In a system using an 85% teaching formula, everyone must teach. However, where the balance of factors among teaching and research is close to even, faculty loads may be completely unbalanced in the direction of "you do my teaching and I'll do your research." A department can actually try to decide whether or not to hire a "teacher" or a "researcher" (although I find myself committed to an ideal that everyone be expected to perform at some creative activity).

Formula schemes are in use at several institutions. One reason why formulas are finding increased use is that they have an air of objectivity about them, so you can win in court from each side, faculty or administration. From a faculty point of view, the real contribution of a formula is that it can get the values out front. In chemical education, we need to do more of that, much more of that!
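
A minimal sketch of the teaching-hours formula T = Σᵢ Nᵢ[0.1 + 0.1 ∛nᵢ] given above: the formula and the meanings of Nᵢ and nᵢ come from the text, while the example course loads are hypothetical.

    # Minimal sketch of the teaching-hours formula T = sum_i N_i(0.1 + 0.1 * cbrt(n_i))
    # given above. N_i is lecture contact hours and n_i is enrollment, as
    # defined in the text; the example course loads are hypothetical.

    def teaching_hours(courses):
        """courses: list of (contact_hours, enrollment) pairs."""
        return sum(N * (0.1 + 0.1 * n ** (1.0 / 3.0)) for N, n in courses)

    if __name__ == "__main__":
        # a 300-student lecture course vs. a 5-student graduate course,
        # each meeting three hours per week
        print(teaching_hours([(3, 300)]))  # about 2.3 -- the large class counts a bit more
        print(teaching_hours([(3, 5)]))    # about 0.8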


Persons interested in exploring further the systematic evaluation of faculty should contact Michael Scriven at the Evaluation Institute, University of San Francisco.

A microfiche supplement has been prepared for this paper. It includes issues to be considered when working with formulas. Also, it discusses some management strategies designed to enhance collegiality. Finally, it takes 10 model biosketches for a department and works these through several formula approaches using different parameters. To obtain this supplement, write to the author at 227 Hamilton Hall, University of Nebraska-Lincoln, Lincoln, NE 68588.

David W. Brooks
University of Nebraska-Lincoln
Lincoln, NE 68588