The Journal Impact Factor: A brief history, critique, and discussion of adverse effects

Abstract: The Journal Impact Factor (JIF) is, by far, the most discussed bibliometric indicator. Since its introduction over 40 years ago, it has had enormous effects on the scientific ecosystem: transforming the publishing industry, shaping hiring practices and the allocation of resources, and, as a result, reorienting the research activities and dissemination practices of scholars. Given both the ubiquity and impact of the indicator, the JIF has been widely dissected and debated by scholars of every disciplinary orientation. Drawing on the existing literature as well as on original research, this chapter provides a brief history of the indicator and highlights well-known limitations, such as the asymmetry between the numerator and the denominator, differences across disciplines, the insufficient citation window, and the skewness of the underlying citation distributions. The inflation of the JIF and its weakening predictive power are discussed, as well as the adverse effects on the behaviors of individual actors and the research enterprise. Alternative journal-based indicators are described, and the chapter concludes with a call for responsible application and a commentary on future developments in journal indicators.
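As context for the numerator/denominator asymmetry the abstract mentions, here is a minimal sketch of the standard two-year calculation; the precise scope of "citable items" is determined by the database producer, so this is a simplification rather than the exact operational definition:

\[
\mathrm{JIF}_{Y} \;=\; \frac{\text{citations received in year } Y \text{ by all items the journal published in } Y-1 \text{ and } Y-2}{\text{number of citable items (articles and reviews) the journal published in } Y-1 \text{ and } Y-2}
\]

The asymmetry arises because the numerator counts citations to every document type, including editorials, letters, and news items, while the denominator counts only the "citable items", so journals that publish substantial front matter can see the ratio drift upward.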

Good citations | Feature | Law Society Gazette

“More academic journals are making their content freely available online through ‘open access’, making the dissemination of scholarly articles quicker and wider. [Jon Yorke, law professor at Birmingham City University] points to an academic paper he co-authored in 2013 on the EU and the abolition of the death penalty as ‘the most downloaded article in the Pace International Law Review’, with over 2,000 downloads by governments, institutions and non-governmental organisations.

‘With open access of journals globally, policy-makers can have at their fingertips instant access to the quality of material which they are required to [use to] form intricate arguments. It definitely helps them and they do listen to legal academics in that way,’ Yorke adds….”

Seeking Impact and Visibility: Scholarly Communication in Southern Africa

“African scholarly research is relatively invisible globally because even though research production on the continent is growing in absolute terms, it is falling in comparative terms. In addition, traditional metrics of visibility, such as the Impact Factor, fail to make legible all African scholarly production. Many African universities also do not take a strategic approach to scholarly communication to broaden the reach of their scholars’ work. To address this challenge, the Scholarly Communication in Africa Programme (SCAP) was established to help raise the visibility of African scholarship by mapping current research and communication practices in Southern African universities and by recommending and piloting technical and administrative innovations based on open access dissemination principles. To do this, SCAP conducted extensive research in four faculties at the Universities of Botswana, Cape Town, Mauritius and Namibia.”

Library celebrates C. Judson King’s new open access book on the University of California | UC Berkeley Library News

“The book is close to King’s heart for many reasons. During his time as UC provost, King helped launch both the California Digital Library, one of the world’s largest online libraries, and eScholarship, the University of California’s open access, electronic repository for publications by UC authors. King is passionate about the power of open access materials to strengthen scholarship and has made his book freely available online through eScholarship. The goal, King said, is to allow administrators in developing countries interested in building a university to access his book free of restraints….

King has experienced the role of open access publishing in spreading scientific knowledge firsthand. In 1980, King published a second edition of a seminal chemical engineering textbook he’d written on separation processes — operations that pull apart two or more chemicals in a mixture, like in the purification of seawater. After the book went out of print, King put it on eScholarship. Today, the book racks up 100 to 150 downloads per month on the digital platform, King said — which is equivalent to what it sold when it was brand-new. That highlighted for King the potential of open access publishing to help countless researchers around the world….”


Responsible metrics: where it’s at? – The Bibliomagician

“At the Lis-Bibliometrics event, Katie Evans raised the important question as to how we can encourage openness in early-career colleagues when they face such pressures to publish in usually closed ‘high impact’ journals.  David Price said that he felt senior colleagues had to lead the way.  At UCL, Paul Ayris pointed out, promotion criteria now included openness metrics.  The challenges of measuring openness, and open measures were acknowledged.  Interestingly enough, Lis-Bibliometrics plans to take a look at this in more detail at a future event….”

Bibliometrics: The Leiden Manifesto for research metrics : Nature News & Comment

“…Yet the abuse of research metrics has become too widespread to ignore. We therefore present the Leiden Manifesto, named after the conference at which it crystallized (see http://sti2014.cwts.nl). Its ten principles are not news to scientometricians, although none of us would be able to recite them in their entirety because codification has been lacking until now. Luminaries in the field, such as Eugene Garfield (founder of the ISI), are on record stating some of these principles [3, 4]. But they are not in the room when evaluators report back to university administrators who are not expert in the relevant methodology. Scientists searching for literature with which to contest an evaluation find the material scattered in what are, to them, obscure journals to which they lack access.

We offer this distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account, and evaluators can hold their indicators to account….”

Few UK universities have adopted rules against impact-factor abuse

“A survey of British institutions reveals that few have taken concrete steps to stop the much-criticized misuse of research metrics in the evaluation of academics’ work. The results offer an early insight into global efforts to clamp down on such practices.

More than three-quarters of the 96 research organizations that responded to the survey said they did not have a research-metrics policy, according to data presented at a London meeting on metrics on 8 February. The same number — 75 — had not signed up to the Declaration on Research Assessment (DORA), an international concord that aims to eliminate the misuse of research metrics, which was developed in San Francisco in December 2012….

The survey found 52 institutions had implemented some measures to promote responsible-metrics principles, but only four had taken what the forum considers to be comprehensive action….”

RCUK statement on the responsible use of metrics in research assessment

[Undated but released c. February 8, 2018.]

“Research councils consider that the journal impact factor and metrics such as the H-index are not appropriate measures for assessing the quality of publications or the contribution of individual researchers, and so will not use these measures in our peer review processes. …The research councils will highlight to reviewers, panel members, recruitment and promotion panels that they should not place undue emphasis on the journal in which papers are published, but assess the content of specific papers, when considering the impact of an individual researcher’s contribution….The Research Councils will sign DORA as a public indication of their support for these principles….”