
Citation impact


“Whether a text is interesting and provides you with something is more important than whether it is published somewhere important”.
PhD, humanities

“Even though it depends a lot on how one writes. If you have a choice between one that has a high impact factor and one that has a low one, I would take the high one”.
PhD, life science

Citations are increasingly used for the purpose of evaluating research. On this page you will learn about:

  • how bibliometric indicators, such as the Impact Factor and h-index are calculated
  • the criticism that has been voiced against bibliometric calculations and the application of them
  • the functions citations play in research and in research evaluations
  • implications bibliometric indicators may have for your research and your career

Journal Impact Factor

The Impact Factor is designed for assessing journals indexed by Web of Knowledge. The Impact Factor is a measure of how often an article in a particular journal has been cited on average per year. For journals within the same subject category, the factor indicates the journal’s relative influence or impact.

Calculating a journal’s Impact Factor

To retrieve the Impact Factor of a given journal, open the database Journal Citation Reports (JCR) from your library homepage. Search for a specific journal or a group of journals by discipline.

The Impact Factor (IF) is based on the number of citations (A) in the current year to items published in the previous 2 years and the number of articles (B) published in the same two years: IF=A/B.


Figure: The grey shaded areas indicate the citations received for articles published in 2009 (light grey) and 2010 (dark grey). Only citations in 2011 contribute to the Impact Factor for 2011.

IF for 2011                                   2009    2010    Sum
Citations in 2011 to articles published in     900     600   1500
Articles published in                          140     160    300

Table: Citations and publications involved in the calculation of the Impact Factor for 2011.

By definition the Impact Factor is:

IF (2011)=(Number of citations in 2011 to articles published in 2009 and 2010)/(Number of articles published in 2009 and 2010)

In numbers this would be: 1500/300=5.

For 2011 the Impact Factor of this journal is equal to 5.
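The calculation above can be sketched in a few lines of Python, using the numbers from the example table:

```python
def impact_factor(citations, citable_articles):
    """Journal Impact Factor for year Y: citations in Y to items
    published in the two preceding years (A), divided by the number
    of citable articles published in those two years (B): IF = A/B."""
    return citations / citable_articles

# Figures from the example table for 2011:
citations_2011 = 900 + 600       # citations to articles from 2009 and 2010
articles_2009_2010 = 140 + 160   # articles published in 2009 and 2010

print(impact_factor(citations_2011, articles_2009_2010))  # 5.0
```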

Observing journal trends

The Impact Factor is recalculated annually. It represents a current value and makes it possible to observe journals over time.


Diagram: Impact Factor trend.

The diagram shows the performance trend of our example journal. The value of the Impact Factor lies around 5, meaning that, on average, articles from the preceding two years have been cited five times in the census year. Apart from a dip in 2010, the journal has maintained a consistent performance over the last seven years.

Critical remarks – Impact Factor

Over the years criticism has been raised against the Impact Factor. You can read more about some of the critical remarks here.

Field dependency

Because the Impact Factor is field dependent, only journals within the same scientific field are comparable. Nevertheless, the Impact Factor has been used to compare different fields.

Figure: Amount of references by age to articles published in 2011. The figure is based on Adler, R., Ewing, J., & Taylor, P. (2009).

The figure demonstrates the field dependency of the Impact Factor. The citations contributing to the calculation of the Impact Factor are the ones inside the two-year citation window, marked grey. Citations outside the window do not count, even though most citations fall outside it and refer to older articles. For rapidly developing fields (blue line), the Impact Factor is considerably higher than for slowly developing fields (red line). A lower Impact Factor does not mean a lower quality of one subject field compared to another. The citation window may simply be too short to be representative for subjects which develop slowly.

Anglo-American bias

The pool of selected journals has a strong Anglo-American bias. Influential journals written in other languages are rarely captured by Thomson Reuters Journal Citation Reports.

Unintended use

“Typically, when the author’s bibliography is examined, a journal’s impact factor is substituted for the actual citation count. Thus, use of the impact factor to weight the influence of a paper amounts to a prediction, albeit coloured by probabilities”.
(Garfield, 1999)

The Impact Factor is not only used for ranking journals, as initially intended, but also for measuring the performance of individual researchers. This use has been criticized by a broad scholarly community, not least by Eugene Garfield himself, the founder of the Citation Index (now part of Thomson Reuters). Assessing an individual researcher by the Impact Factor reflects only an assumed impact, not the actual impact based on the accumulated citations of that researcher’s own publications.

Manipulation

The Impact Factor can be manipulated. It is influenced by when in the year a journal issue is published: issues published at the beginning of a year have a greater chance to accumulate citations than issues published at the end of the year. Furthermore, editors may influence the value of their journal’s Impact Factor by writing editorial material containing references to articles in their own journal (journal self-citations). In addition, references given in editorial material count toward the numerator, while the editorial items themselves do not count toward the denominator: by definition the denominator consists only of citable articles, and editorial material is not regarded as such.

Incomplete references

References given in articles may be incomplete or incorrect. Incorrect references are not matched automatically and therefore are not counted as citations. This affects the value of the Impact Factor and of other citation indicators such as the h-index.

Alternative journal indicators

There are several alternatives to the Impact Factor. The ones listed here are all based on citation measures, but indicate different aspects of impact:

Immediate or long lasting impact

Whilst the Impact Factor reflects a current measure of the journal’s impact, immediate or long-lasting impact may be expressed by two further indicators: the Immediacy Index and the Journal Cited Half-Life.

Immediacy Index

The Immediacy Index is a measure of how often an average article of a particular journal is cited in the year it is published. When comparing journals, the Immediacy Index is useful for indicating whether a journal specializes in cutting-edge research. The higher the index, the more immediately the presented research is taken up.

NB: The index tends to be higher for journal issues published early in the current year. Early publications have a longer time to accumulate citations through the rest of the year.
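The Immediacy Index follows the same pattern as the Impact Factor but uses a zero-year window. A minimal sketch in Python; the figures here (120 citations in 2011 to the 150 articles published in 2011) are hypothetical:

```python
def immediacy_index(citations_same_year, articles_same_year):
    """Immediacy Index: citations in a given year to articles
    published that same year, per article published that year."""
    return citations_same_year / articles_same_year

# Hypothetical figures: 120 citations in 2011 to the journal's
# 150 articles published in 2011.
print(immediacy_index(120, 150))  # 0.8
```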

Journal Cited Half-Life

The cited half-life is the median age of the articles that were cited in the current year. Half of the citations of the journal refer to items published within the cited half-life. The half-life tells us something about the age of the citations. In contrast to the Immediacy Index, the half-life indicates how long the articles of a particular journal remain of interest to the research community.
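Since the cited half-life is simply the median age of the cited items, it can be computed directly. A minimal sketch; the ages below are hypothetical:

```python
import statistics

def cited_half_life(ages_of_cited_items):
    """Cited half-life: the median age (in years) of the items cited
    in the current year. Half of the journal's citations refer to
    items no older than this value."""
    return statistics.median(ages_of_cited_items)

# Hypothetical ages (years since publication) of the cited items:
ages = [1, 1, 2, 2, 3, 4, 5, 7, 9, 12]
print(cited_half_life(ages))  # 3.5
```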

Advanced journal impact measures

In order to compensate for the weaknesses of the Impact Factor (field dependency, inclusion of self-citations, length of the citation window, quality of citations), several approaches have been undertaken to develop an indicator that better covers the different aspects of assessing journals. More advanced metrics are usually based on network analysis, for example SNIP, the Source Normalized Impact per Paper, based on data from Scopus.

H-index

The h-index is a measure of the total importance of a researcher measured by how often she or he gets cited.

  • A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each (Hirsch, 2005).
  • The index combines both an author’s scientific production (publications) and impact (number of citations).

When exploring the literature of your research field, the h-index may give you a picture of the impact of individual researchers and research groups. You may retrieve the h-index in e.g. Web of Science, Scopus and Google Scholar.

When applying for a scholarship, project funding or an employment position, you may be required to state your h-index.

Calculating someone’s h-index

The h-index for a given author (Karen) calculated step-by-step:

Step 1: Search for author Karen in a given database


Figure: Search result of author Karen’s publications in a given database.

The figure illustrates the search result of author Karen in an arbitrary database. The figure also indicates all citing publications (counting-lines) within the same database. In our example author Karen has 10 publications (a,b,c,d,e,f,g,h,i,j). N = 10.

Step 2: Sort publications by decreasing number of citations

Rank          1    2    3    4    5    6    7    8    9   10
Publication   c    i    a    g    f    h    d    j    b    e
Citations    90   45   20   10    4    4    3    3    1    1

Table: Karen’s publications sorted by decreasing number of citations.

Step 3: Fulfil the condition: h of the publications have at least h citations each.
h = max(rank) such that citations(rank) ≥ rank.
In our example, 4 publications (c, i, a, g) have received at least 4 citations each.
The remaining (N − h) publications have no more than h citations each: in our example, the remaining 6 publications (f, h, d, j, b, e) have no more than 4 citations each.

Result: Karen’s h-index is equal to 4.
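The three steps above can be sketched as a short Python function, using Karen’s citation counts from the table:

```python
def h_index(citation_counts):
    """h-index (Hirsch, 2005): the largest h such that h publications
    have at least h citations each."""
    ranked = sorted(citation_counts, reverse=True)  # step 2: sort descending
    h = 0
    for rank, citations in enumerate(ranked, start=1):
        if citations >= rank:  # step 3: this rank still fulfils the condition
            h = rank
        else:
            break
    return h

# Karen's citation counts (publications c, i, a, g, f, h, d, j, b, e):
karen = [90, 45, 20, 10, 4, 4, 3, 3, 1, 1]
print(h_index(karen))  # 4
```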

Example: H-index retrieved in Web of Science, Scopus and Google Scholar

In this example we use a renowned Norwegian researcher in ecology and evolutionary biology: Nils C. Stenseth. We demonstrate that his h-index is different in each of the databases due to their different coverage of content.

Step 1: Search for the author, making sure you cover all possible signatures.

Step 2: Go to the statistics available in the three databases

  • Web of Science: Citation Report
  • Scopus: View Citation Overview (watch video by Scopus)
  • Google Scholar: Scholarometer (add on in Firefox)

Results presented here are based on data as of January 2013. Citation counts typically increase with time and so does the h-index. To determine the present value, perform a new search.

Web of Science

h-index = 61

To be certain that all publications by the author are retrieved, the example shows search results for stenseth n*. Using stenseth nc would have led to an h-index equal to 60.

Scopus

 

h-index = 57

The time interval is shorter than in the other two databases and covers only publications from 1996 until now. Early publications are in general those which are cited frequently and thereby add to the index. Missing these early publications explains why the h-index is lower here.

Scholarometer

 

h-index = 68

Google Scholar covers a wider range of publication types; therefore the h-index is higher here.

Critical remarks – H-index

The h-index alone does not give a complete picture of the performance of an individual researcher or research group.

The h-index underrepresents the impact of the most cited publications and does not consider the long tail of rarely cited publications. In particular, the h-index cannot exceed a researcher’s total number of publications. The impact of researchers with a short scientific career may therefore be underestimated and their potential go undiscovered. Read more about this below: “Problem: The Matthew effect in science”.

Be aware:

“Using a three-year citation window we find that 36% of all citations represent author self-citations. However, this percentage decreases when citations are traced for longer periods. We find the highest share of self-citations among the least cited papers”.
(Aksnes, 2003)

  • The h-index is comparable only for authors working in the same field.
  • The h-index is comparable only for authors of the same scientific age.
  • The h-index differs from database to database depending on database coverage.
  • The h-index depends on your institution’s subscription time range. The h-index may underestimate researchers’ impact if their older publications are not included.
  • The h-index is manipulable. Exaggerated use of self-citations may influence the h-index and result in an inflated value.

Citations in communication

Citing is an activity maintaining intellectual traditions in scientific communication. Usually, citations and references provide peer recognition; when you use others’ work by citing that work, you give credit to its creator. Citations are used for conversational reasons and express participation in an academic debate. They are aids to persuasion; assumed authoritative documents are selected to underpin further research. However, citations may be motivated by other reasons as well.

Citations may also express

  • criticism of former research
  • friendship services to support colleagues
  • payment of intellectual debt, e.g. toward supervisors or collaborators
  • self-marketing one’s own research, i.e. self-citations

Citations and evaluation

Applicable across fields?

Be aware that scholarly communication varies from field to field. Comparisons across different fields are therefore problematic.

However, there are attempts to make citation indicators field independent. For example, the Times Higher Education World University Rankings use citation indicators which are field independent, i.e. normalized (Times Higher Education, 2013).

Citations are basic units measuring research output. Citations are regarded as an objective (or at least less subjective) measure to determine impact, i.e. influence and importance. They are used in addition to, or as a substitute for peer judgments.

There is a strong correlation between peer judgments and citation frequencies. For this reason, citations are relied on as an indicator of quality and used, among other things, for:

  • benchmarking universities
  • scholarship and employment decisions
  • political decisions regarding research funding
  • exploring research fields and identifying influential works and research trends

Citations must be handled carefully when evaluating research.

Citation data vary from database to database, depending on each database’s coverage of content.

Furthermore, both different motivations for citing, and the fact that the distribution of citations is highly skewed, are problematic aspects.

Problem: The Matthew effect in science

To those who have, shall be given…

When sorting a set of publications by the numbers of citations received, the distribution shows a typical exponential or skewed pattern. Works which have been cited are more visible and get more easily cited again (vertical tail in figure), while other works remain hidden and hardly get any citations (horizontal tail in figure). This fact is referred to as the Matthew effect in science.

Figure: Citation pattern.

What is the problem with skewed distributions?

Skewed patterns make it difficult to determine an average citation count. Different approaches may be applied, see the figure.

  • Mean citation count: Long vertical or horizontal tails disturb the average value. The Impact Factor is an example of this type of average.
  • Citation median: Long vertical or horizontal tails disturb the calculation of the median value. For example, a long tail of rarely cited publications results in a low median value, while a minority of highly cited publications is ignored.
  • H-index: The h-index is an alternative average value. It is designed to compensate for the effect of long tails.
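A minimal sketch comparing the three approaches on the same skewed set of citation counts (the figures from the h-index example above):

```python
import statistics

counts = [90, 45, 20, 10, 4, 4, 3, 3, 1, 1]  # skewed citation distribution

# Mean: pulled upwards by the two highly cited publications.
mean = statistics.mean(counts)

# Median: pulled downwards by the long tail of rarely cited publications.
median = statistics.median(counts)

# h-index: the largest h such that h publications have >= h citations each.
ranked = sorted(counts, reverse=True)
h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

print(mean, median, h)  # 18.1 4.0 4
```

Note how the mean (18.1) and the median (4.0) tell very different stories about the same ten publications; the h-index sits between the two approaches by design.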

Optimize your impact

Good research + High visibility
=
Your best chance to make an impact

When you are aware of how academic performance is evaluated, you can make informed decisions and devise strategies to optimize your chances of making an impact on your research community, as well as on society. This can be essential for getting tenure and research funds, and a powerful tool for evaluating different career choices, such as where to go for a postdoc. Therefore, make your work visible, accessible and understandable.

Make your work visible to other researchers

  • Publish with acknowledged, international publishers/journals in your field. These are traditionally the ones most regularly read and cited by your fellow researchers. However, in the digital age, highly cited research is frequently published in less prominent journals, and it is article metrics, not journal metrics, that determine the actual impact of your work.
  • Share your work on social networks. Impact measures based on usage are emerging and available on many websites, e.g. on journal and database websites. Work that has been shared and spread is more likely to get cited.
  • Engage and participate in scholarly debates in society, e.g. in popular press.
  • Showcase your work:
    – Create your scholarly online identity. Persons with common names are difficult to distinguish. To avoid ambiguity create your profile (e.g. ORCID, Google Scholar) and link your publications to your profile.
    – Cite your previous works in order to give a broader picture of your research. However, be balanced and make sure that you always stay in line with the topic discussed.
  • Check whether the journal is indexed in databases used in your field; databases increase the visibility of your work.
  • Make sure your work is added to the research register at your institution (CRIStin in Norway). Usually these data are used for performance measures, so state your name and institutional affiliation on your publications in a proper and consistent manner.
  • Collaborate with other researchers. In general, collaboration benefits your career by having a positive impact on your production. Co-publishing may also imply borrowing status from more renowned co-authors who are read and cited regularly.

Make your work accessible to other researchers

Make your work understandable to other researchers

  • Use informative, memorable descriptions including key words in the title and abstract.
  • Put your findings in a larger context, by using references to previous work.
  • Publish in English for wider international distribution. If you have written in your native language, consider republishing in English, or publishing a summary of your main findings in an international journal.
  • Share the research data along with your publication. This strengthens your research and makes your findings replicable and verifiable.

References

Adler, R., Ewing, J., & Taylor, P. (2009). Citation statistics. Statistical Science, 24(1), 1.

Aksnes, D. W. (2003). A macro study of self-citation. Scientometrics, 56(2), 235-246.

Garfield, E. (1999). Journal impact factor: a brief review. CMAJ, 161, 979-980.

Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569-16572.

Times Higher Education. (2013). The essential elements in our world-leading formula.
