Even though the productivity of research personnel cannot be measured as one measures the product output of a process, there has long been a need for at least a qualitative evaluation of the quality of research. Management generally recognizes the need for research if a company is to maintain its competitive position. Yet there has been no quantitative method of evaluating the quality of the research department or of the individuals in the research group. As evidenced by the many symposia and articles published on this subject, a variety of different approaches have been taken to evaluate research.

One interesting method of measuring the quality of research contributions is set forth in an article by Dr. Jack H. Westbrook in the General Electric Company's Research Laboratory Bulletin (Fall 1961 issue). He notes that one approach used quite often in recent years has been simply to count the number of papers published by the laboratory staff. While this is a reasonable measure of scientific activity, it is not an accurate indication of the quality or significance of that work. A more objective basis for measuring the quality of published research, Dr. Westbrook feels, is examination of the references cited in these published papers. Repeated citation of a particular source by independent research workers is indicative of the worth of the cited article.
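To make the counting procedure concrete, here is a minimal sketch in Python of how such a tally might be kept, assuming the citation records have already been reduced to (citing paper, cited laboratory) pairs; the laboratory names and counts below are invented for illustration and are not taken from Dr. Westbrook's study.

```python
from collections import Counter

# Hypothetical (citing paper, cited laboratory) records; invented for
# illustration, not Dr. Westbrook's actual data.
citations = [
    ("paper-01", "General Electric"),
    ("paper-01", "MIT"),
    ("paper-01", "MIT"),          # duplicate reference within one paper
    ("paper-02", "General Electric"),
    ("paper-03", "Penn State"),
    ("paper-03", "General Electric"),
]

# Drop duplicate citations of the same laboratory within a single paper,
# then tally the remaining ("net") citations per laboratory.
net_citations = Counter(lab for _, lab in set(citations))

for lab, count in net_citations.most_common():
    print(f"{lab}: {count} net citation(s)")
```

Laboratories cited repeatedly by independent papers rise to the top of such a tally, which is the intuition behind treating citation counts as a proxy for significance.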
[Figures I and III: bar charts of the number of net citations per laboratory for the first and second groups (231 citations from 10 sources and 194 citations from 10 sources, respectively). Figures II and IV: bar charts of the number of published papers in the sample per laboratory for the two groups. Laboratories shown include GE, MIT, Penn State, NBS, U. Illinois, Ohio State, and others.]
Assuming that there are a sufficient number of literature citations to be statistically significant, it should be possible to identify laboratories, individuals, or even specific papers of unusual significance. To check this hypothesis, Dr. Westbrook selected the field of ceramics and limited his study to identifying the laboratories (not individuals or specific papers) doing the most significant work. The concept, he believes, should be applicable to other fields. There are admittedly some shortcomings in this approach. One is that laboratories working in an area of great interest at the moment may get a higher rating than is justified. This effect would be overcome if the analysis covered a period of years. Another weakness is that unpublished work, which may be significant, cannot be included in the analysis. As far as basic research is concerned, however, he does not feel that this is a great problem.

As a cross-check, Dr. Westbrook chose one group of 99 papers from the 1958 Journal of the American Ceramic Society and a second group of papers on ceramics which appeared in several other technical journals in 1958. He eliminated certain references such as private communications, interim reports, government contract reports, theses, reference works, and monographs. Proceedings of special symposia, published separately from the usual journals, were included.

First Group. He ended up with 838 references from the Journal of the American Ceramic Society. These were entered on punched cards so that the name of the parent journal, the author affiliation, and the laboratory where the work was done could be evaluated. He found that 213 laboratories accounted for 811 of the 838 references. A few of these citations were then eliminated where they appeared to be duplicates. A study of the remaining citations shows
that 10 laboratories, only 5% of the entire group, accounted for about 30% of the total (Fig. I). It is of interest to note that 65 of the 99 papers selected for the study came from 17 laboratories; the remaining 34 came from other laboratories (Fig. II). Six laboratories appear in both groups.

Second Group. To make the results somewhat comparable, the second group of papers, selected from technical journals other than the Journal of the American Ceramic Society, was limited to 100. These consisted of 20 papers each from the 1958 issues of five journals: two on physics, two on chemistry, and one on metallurgy. In these 100 papers, Dr. Westbrook found 949 total citations. Ten laboratories accounted for 20% of the citations; as in the first group, these laboratories constituted less than 5% of all laboratories in the group (Fig. III). Five laboratories appear on both citation lists. It should be noted that in this second group, 17 laboratories accounted for 56 of the 100 papers examined; each of the remaining 44 laboratories contributed one paper. This is quite similar to the statistics obtained for the first group (Fig. IV). Only 3 laboratories are common to the first and second groups with respect to sources (not citations) of papers. This, Dr. Westbrook believes, is due to emphasis on fundamental versus applied science or to author preference for a particular journal.

In conclusion, Dr. Westbrook feels that the analysis of literature citations is a useful measure of the significance of research; that relatively few laboratories are publishing especially significant work (in ceramics); and that a sample of 100 papers yields about 1000 usable, identifiable citations, an adequate number for this type of study. If it were deemed desirable to measure the performance of individual scientists or to identify unusually significant papers, much larger samples would be required.
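As a rough illustration of the concentration statistic cited above (a small fraction of laboratories accounting for 20–30% of all citations), the following Python sketch computes the share of net citations contributed by the k most-cited laboratories; the counts are invented and the choice of k is an assumption for illustration, not part of the original analysis.

```python
from collections import Counter

def top_k_share(net_citations: Counter, k: int) -> float:
    """Fraction of all net citations accounted for by the k most-cited laboratories."""
    total = sum(net_citations.values())
    top = sum(count for _, count in net_citations.most_common(k))
    return top / total if total else 0.0

# Invented counts: a few heavily cited laboratories plus a long tail of
# laboratories cited only once.  Not Westbrook's data.
counts = Counter({"lab-A": 40, "lab-B": 30, "lab-C": 25})
counts.update({f"lab-{i:03d}": 1 for i in range(100)})

print(f"Top 3 of {len(counts)} laboratories supply "
      f"{top_k_share(counts, 3):.0%} of net citations")
```

Comparing this share against the fraction of laboratories it represents is the kind of summary Westbrook reports for both groups.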