June 18, 2008
Three international mathematics organizations have strongly cautioned against relying too heavily on citation statistics for measuring research quality. Their joint report, titled "Citation Statistics," contends that citation data, such as the impact factor, provide "only a limited and incomplete view of research quality."
Prepared by John Ewing of the American Mathematical Society, Robert Adler of the Technion-Israel Institute of Technology, and Peter Taylor of the University of Melbourne, the report describes not only the limitations of citation statistics but also how better to use these data.
The report represents a response from mathematicians and statisticians to a "culture of numbers," created by a drive toward more transparency and accountability in the academic world. In effect, institutions and individuals assume that fair decisions can be reached by algorithmic evaluation of some statistical data. "Unable to measure quality (the ultimate goal), decision makers replace quality by numbers they measure," the report notes.
"There is a belief that citation statistics are inherently more accurate because they substitute simple numbers for complex judgments, and hence overcome the possible subjectivity of peer review," the report states. "But this belief is unfounded."
"The sole reliance on citation data provides at best an incomplete and often shallow understanding of research—an understanding that is valid only when reinforced by other judgments," the report continues. "Numbers are not inherently superior to sound judgments."
A journal's impact factor, for example, is a simple average derived from the distribution of citations for a collection of articles in that journal. However, the average captures only a small amount of information about this distribution. "Using the impact factor alone is like using weight alone to judge a person's health," the report argues.
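The report's point about averages can be illustrated with a small sketch. The citation counts below are invented for illustration; real journal distributions are typically even more skewed, with a few highly cited articles dominating the mean.

```python
# Hypothetical citation counts for ten articles in a journal's
# impact-factor window (illustrative numbers, not real data).
citations = [0, 0, 0, 1, 1, 1, 2, 2, 3, 40]

# The impact factor is essentially this simple average: total
# citations divided by the number of citable articles.
impact_factor = sum(citations) / len(citations)

# The median describes the "typical" article very differently.
ordered = sorted(citations)
n = len(ordered)
median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2

print(f"impact factor (mean): {impact_factor}")  # 5.0
print(f"median citations:     {median}")         # 1.0
```

Here one heavily cited article pulls the mean up to 5.0 while the median article receives a single citation, which is exactly the kind of information the single-number impact factor discards.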
"People are using numbers in a way that's really inappropriate," Ewing told Carl Bialik of the Wall Street Journal. "It seems to me that it is up to mathematicians to speak out when it comes to this kind of misuse."
The International Mathematical Union, the International Council for Industrial and Applied Mathematics, and the Institute of Mathematical Statistics released the report, whose findings draw on the literature about using citation data to evaluate research and on practices reported by mathematicians and scientists around the world.