A. Ben Wagner, Sciences Librarian, Science and Engineering Library, Arts and Sciences Libraries, University at Buffalo.


In light of the recent cancellation of ISI’s Journal Citation Reports, this document was created to help faculty find and properly use journal impact factors while also discussing alternate metrics for measuring the quality and impact of journal articles. Eugene Garfield himself, founder of ISI, cautions against using journal impact factors to establish the quality or significance of individual articles.1




The Institute for Scientific Information/ISI (now Thomson Scientific) began publishing journal impact factors in 1975 using the Science Citation Index database (now Web of Science). Over the past two decades, publishing, tenure, and promotion decisions have increasingly been based on impact factors, seemingly to the exclusion of any other measure of quality.


Definition of Impact Factor: For a given two-year period, the impact factor measures the number of times the average article in a given journal is cited by the entire body of journal articles published in the subsequent year. If a journal has a 2004 impact factor of 4.35, then the average article published in that journal during the preceding two-year period (2002-03) received 4.35 citations from the entire body of 2004 scientific journal literature covered by the Web of Science.
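The arithmetic behind that definition can be sketched in a few lines. All numbers below are invented for illustration; the divisor of 200 is chosen only so the result matches the 4.35 example above:

```python
# Illustrative impact-factor calculation (all figures are hypothetical).
# IF(2004) = citations received in 2004 by items published in 2002-03,
#            divided by the number of citable items published in 2002-03.

def impact_factor(citations_received: int, citable_items: int) -> float:
    """citations_received: citations in the JCR year to the journal's
    articles from the two preceding years.
    citable_items: articles the journal published in those two years."""
    return citations_received / citable_items

# A journal that published 200 citable items in 2002-03, cited 870
# times during 2004, has a 2004 impact factor of 4.35:
print(impact_factor(870, 200))  # 4.35
```

Note that the numerator and denominator come from different record sets, which is one source of the measurement problems discussed later in this document.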


Section IV of this paper discusses the limitations and misuse of journal impact factors in detail. Many of these concerns are discussed by Garfield in an online essay.2


·         Impact factors measure the gross average for all articles appearing in a journal and should not be used to establish the quality or significance of individual articles. Even in a prominent journal such as Nature Cell Biology, individual articles range from those cited hundreds of times to many that are never cited at all.


·         Impact factors measure citations for a single year against only the previous 2 years, biasing the statistic towards heavily funded, rapidly moving research fields. Comparisons across disciplines and sub-disciplines are not valid. 


·         Impact factors can be, and reportedly are being, manipulated by journal editors and publishers. Publishing more review articles, or encouraging manuscript authors to cite more articles from the journal they are submitting to, can significantly increase a journal’s impact factor.


·         Not all journals are covered by the Web of Science.  In addition, erroneous citations, journal title changes, mergers, and splits create problems in measuring impact factors. 




Although the Journal Citation Reports database is a copyrighted, subscription-only product, impact factors for many journals can often be found in three ways.


A. Publisher Press Releases & Newsletters


The Journal Citation Reports is released annually in May for the previous year. Publishers often advertise favorable results and rankings within disciplines in press releases and newsletters. For example, the American Chemical Society promotes its impact factors through various means, including a complete list on its web site. Of course, a publisher with unfavorable impact factors is unlikely to advertise that fact, but then one may not want to publish in journals with impact factors so low that the publisher does not cite them.


B. Editors and Reviewers


Editors and reviewers are usually very aware of the impact factors of the journals they work for. Phone or email contact with professional colleagues can often supply the required information.


C. General Internet Search Engines


Searching for a journal title together with the keyword “impact factor” can often retrieve impact factors appearing in open web documents. For example, a Google search on “biochemical journal” “impact factor” retrieves a page listing impact factors for dozens of medical journals. As with any Internet search, be sure to establish the reliability and currency of the data by noting the source and cross-checking values when possible.




Many within the library and academic community have rightly become increasingly concerned about the seeming fixation on journal impact factors, especially since new and interesting alternative metrics have recently become available.


A. “h-index”, a Real Innovation


As reported last year in Nature,3 the h-index was proposed by Jorge Hirsch, a physicist at the University of California, San Diego. The h-index is the largest number h such that a scholar has published h papers that have each received at least h citations. Thus, someone with an h-index of 25 has written 25 papers that have each received at least 25 citations.


Determining the h-index with the Web of Science takes only a few minutes. One creates a search set of an author’s or department’s work and sorts the results by Times Cited. One then moves down the ranked list until an article’s rank exceeds its citation count; the h-index is the last rank at which the citations equal or exceed the rank. Obviously, care must be taken to eliminate extraneous hits from other authors with the same or similar names. Typically this can be done by using the address (institutional affiliation) search field. Please contact your subject specialist librarian if you have any questions about determining the h-index for individuals or departments.
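The same walk-down-the-ranked-list procedure is easy to express in code. A minimal sketch, with citation counts invented for illustration:

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that the author has
    h papers each cited at least h times."""
    ranked = sorted(citation_counts, reverse=True)  # Times Cited, descending
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:   # citation count still meets or exceeds the rank
            h = rank
        else:
            break           # rank now exceeds citations; stop
    return h

# An author whose papers have been cited 40, 25, 25, 12, 6, 3, and 0
# times has five papers with at least five citations each:
print(h_index([40, 25, 25, 12, 6, 3, 0]))  # 5
```

As the example shows, the highly cited first paper and the uncited last paper barely move the result, which is exactly the robustness property described above.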


The h-index is difficult to inflate by self-citation, since it reflects the entire body of work over time. It is relatively insensitive to the total number of publications. As with impact factors, comparisons across disciplines are inadvisable, given the varying citation patterns of each field.


B. Benchmarking individuals, departments and institutions – Analyzing publication patterns.


Although Thomson Scientific long had the only significant citation database, more than 16 other databases now track citation data, permitting comparison with, and enrichment of, the Web of Science data. A few of these databases, notably SciFinder Scholar, Web of Science, INSPEC, and Compendex Plus (the last two will be available later this year on Engineering Village 2), have a powerful Analyze feature that extracts and ranks the journal titles in any set of search results. Hence, one can empirically determine which journals any individual, department, institution, or research field is publishing in. Benchmarking against a group of peers or peer institutions can be very revealing. For example, MIT has one of the top earth sciences/geology departments in the world. An analysis of this department’s 2002-2004 publications generates the following list of the top 5 journals its members publish in:


1.       Journal of Geophysical Research-Solid Earth       

2.       Geophysical Research Letters      

3.       Journal of Physical Oceanography   

4.       Earth and Planetary Science Letters        

5.       Geochimica et Cosmochimica Acta 


Although the Analyze feature is easy to use, one must make sure the result set includes all relevant citations. Your subject specialist librarian would be glad to review the step-by-step process described in Section V.
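For databases that lack a built-in Analyze feature, the same ranking can be approximated by tallying the source-title field of an exported result set. A sketch, assuming each exported record carries a "source_title" field (the records below are invented):

```python
from collections import Counter

# Hypothetical exported search results; only the source title matters here.
records = [
    {"source_title": "Geophysical Research Letters"},
    {"source_title": "Earth and Planetary Science Letters"},
    {"source_title": "Geophysical Research Letters"},
    {"source_title": "Journal of Physical Oceanography"},
    {"source_title": "Geophysical Research Letters"},
]

# Count how often each journal appears and rank from most to least frequent.
ranking = Counter(r["source_title"] for r in records).most_common()
for title, count in ranking:
    print(f"{count:3d}  {title}")
```

This reproduces in miniature what the Analyze feature does across thousands of records; the completeness caveat above still applies, since a ranking built from an incomplete result set will be misleading.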




1.   Impact factors describe a gross average of all articles a journal published in the preceding two years. The range of citations to a single year’s articles in one journal can be enormous. For example, 257 articles were published in Nature Cell Biology in 2003. Citations received by these articles from newer articles range from 283 down to zero, with 55 articles never cited at all. Impact factors were never intended to measure the quality or significance of individual articles by individual researchers. That is why they are called journal impact factors.


2.   The metric is inherently biased toward very recent work. If a number of articles in a particular journal are not immediately recognized as significant because of their complexity, because they are “ahead of their time”, or because they belong to a field without significant funding and a rapidly moving research front, impact factors will not accurately reflect the significance of that body of work. This is why Thomson Scientific prominently disclaims the validity of comparing impact factors between different disciplines and subdisciplines.


3.   Impact factors can be manipulated in ways analogous to seeking high rankings for web pages in Internet search engine results. Thomson Scientific reports that journal self-citation (articles in a journal citing other articles in the same journal) accounts for about 13% of the citations a journal receives. A few editors have reportedly requested that authors increase the number of citations to work appearing in the journal to which they have submitted their manuscript. Review articles are nearly always more heavily cited than single-study articles, so journals that increase the number of review articles they publish can significantly raise their impact factor.


4.    Web of Science does not cover every journal published. Building citation databases is an expensive proposition, and it can take a number of years before a new journal develops enough “status” to be covered. Once Web of Science starts covering a journal, it then takes three years’ worth of data (the current year plus the two preceding years) before an impact factor can be calculated. Title changes, mergers, and splits are common in publishing, and since impact factors are calculated on a per-title basis, all of these changes create discontinuities in impact factor measurement.




1.   Access the Web of Science (aka Science Citation Index) via the UB Library web site, then click on the General Search button.


2.   Search authors in the Author field or institutions/departments in the Address field, or both simultaneously. Only initials are input for first and middle names. The Address field is highly abbreviated; click on the Abbreviations Help link for a list of most of the abbreviations used.


3.   When limiting to a particular department, use the SAME operator to ensure that the various parts of the address all appear in the same entry. Addresses are listed for every author. Example: the UB Chemistry Department can be searched by entering: buffalo same (suny or univ or 14260) same chem not (engn or 14222). Engn eliminates the Biological and Chemical Engineering Department, and 14222 eliminates Buffalo State. Please consult your subject specialist librarian for further guidance in defining a comprehensive search strategy for your desired purposes.


4.   Input limits for years as desired. Note that the final search set must contain no more than 2,000 hits to be completely analyzed in the next step.


5.   Click on the Analyze button in the right-hand column of the results display. In the “Select field to rank by:” drop-down list, choose Source Title. Choose the number of records to analyze (the first 500, or all records up to 2,000), the number of top entries to list (10, 25, 50, …), and the minimum count threshold as desired. Click on the Analyze button. Depending on the number of hits, the analysis may take a few minutes.


6.   The ranked list is displayed. Note that the list of titles can be saved to a file using the Save analysis data to a file button, which makes it easy to insert the ranked listing into reports.




1.       Garfield, E., How can impact factors be improved? British Medical Journal 1996, 313, (7054), 411-413.

2.       Garfield, E., The Impact Factor. ISI, 1994.

3.       Ball, P., Index aims for fair ranking of scientists. Nature 2005, 436, (7053), 900.