Springer Nature is committed to being a part of the open-access movement | by Steven Inchcoombe, chief publishing officer

“Institutions, research funding bodies and publishers must all work together to change the system in the interest of advancing research, says Steven Inchcoombe

As part of our recent IPO process, there was a regulatory requirement for Springer Nature to prepare a “prospectus”: a lengthy legal reference document intended for “qualified investors”. In the past week, some content from this 400-plus-page document has been taken out of context to make inaccurate and unfair comments about us, our plans and our business; and we want to set the record straight.

We have been accused of “paying lip service” to the San Francisco Declaration on Research Assessment (DORA). This is not true and is particularly upsetting for our colleagues who are proud to stand firmly behind DORA and who have been implementing the large-scale changes needed to fulfil our obligations. This has seen us stop using journal impact factors in isolation in our marketing (note: a prospectus is a legal document aimed at potential investors, not a marketing tool for authors or librarians). In fact, for more than 10 years, long before DORA, Nature editorials have expressed concerns about the overuse of impact factors and have set out the case for a greater variety of more suitable metrics for different purposes. We continue to see this need, and we will continue to offer our librarians, authors, readers, editors and partners other choices, especially those at article level.

We have been accused of “exploiting” impact factor to market our journals. We are not. At Springer Nature, we have increased the use of other journal-level and article-level metrics including article usage and altmetrics. This is clearly stated in the prospectus, which references the importance of other metrics such as views/downloads or mentions in social media. The fact, however, remains that authors do choose which journals to publish in partly based on their impact factors, which is why we had a duty to explain this. Indeed, their long history of being independently calculated and published means that they are an important reference point in a prospectus, which is a verifiable, fact-based document aimed at investors. In our author survey last year (completed by more than 70,000 authors from all disciplines and regions), a journal’s impact factor is one of the top four criteria when choosing where to submit their draft articles, alongside a journal’s reputation, relevance and quality of peer review, in that order.

Finally, it has been claimed that our only motivation for higher impact factors is to drive higher article-processing charges. This is also not true. Part of our commitment to developing the largest and most comprehensive range of open-access journals in the publishing industry includes a desire to have a range of community-based OA journals, sound science OA journals and selective OA journals. For example, we flipped Nature Communications many years ago to become fully OA to ensure that such a choice existed for authors, and it is now the highest-cited OA journal in the world, demonstrating its appeal to authors and readers alike….”


The Journal Impact Factor: A brief history, critique, and discussion of adverse effects

Abstract: The Journal Impact Factor (JIF) is, by far, the most discussed bibliometric indicator. Since its introduction over 40 years ago, it has had enormous effects on the scientific ecosystem: transforming the publishing industry, shaping hiring practices and the allocation of resources, and, as a result, reorienting the research activities and dissemination practices of scholars. Given both the ubiquity and impact of the indicator, the JIF has been widely dissected and debated by scholars of every disciplinary orientation. Drawing on the existing literature as well as on original research, this chapter provides a brief history of the indicator and highlights well-known limitations, such as the asymmetry between the numerator and the denominator, differences across disciplines, the insufficient citation window, and the skewness of the underlying citation distributions. The inflation of the JIF and its weakening predictive power are discussed, as well as the adverse effects on the behaviors of individual actors and the research enterprise. Alternative journal-based indicators are described, and the chapter concludes with a call for responsible application and a commentary on future developments in journal indicators.
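[For reference, a minimal sketch of the conventional two-year JIF for a census year \(Y\); this is the standard definition, not part of the quoted abstract:

\[
\mathrm{JIF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}
\]

Here \(C_Y(y)\) denotes the citations received in year \(Y\) by everything the journal published in year \(y\), and \(N_y\) denotes the number of “citable items” (research articles and reviews) published in year \(y\). The asymmetry the abstract criticizes follows directly from this definition: the numerator counts citations to all document types, including editorials, news pieces and letters, while the denominator counts only citable items, so journals with substantial front matter can see their ratio inflated.]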

Bibliometrics: The Leiden Manifesto for research metrics | Nature News & Comment

“…Yet the abuse of research metrics has become too widespread to ignore. We therefore present the Leiden Manifesto, named after the conference at which it crystallized (see http://sti2014.cwts.nl). Its ten principles are not news to scientometricians, although none of us would be able to recite them in their entirety because codification has been lacking until now. Luminaries in the field, such as Eugene Garfield (founder of the ISI), are on record stating some of these principles [3, 4]. But they are not in the room when evaluators report back to university administrators who are not expert in the relevant methodology. Scientists searching for literature with which to contest an evaluation find the material scattered in what are, to them, obscure journals to which they lack access.

We offer this distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account, and evaluators can hold their indicators to account….”

Few UK universities have adopted rules against impact-factor abuse

“A survey of British institutions reveals that few have taken concrete steps to stop the much-criticized misuse of research metrics in the evaluation of academics’ work. The results offer an early insight into global efforts to clamp down on such practices.

More than three-quarters of the 96 research organizations that responded to the survey said they did not have a research-metrics policy, according to data presented at a London meeting on metrics on 8 February. The same number — 75 — had not signed up to the Declaration on Research Assessment (DORA), an international concord that aims to eliminate the misuse of research metrics, which was developed in San Francisco in December 2012….

The survey found 52 institutions had implemented some measures to promote responsible-metrics principles, but only four had taken what the forum considers to be comprehensive action….”

RCUK statement on the responsible use of metrics in research assessment

[Undated but released c. February 8, 2018.]

“Research councils consider the journal impact factor and metrics such as the H-index are not appropriate measures for assessing the quality of publications or the contribution of individual researchers, and so will not use these measures in our peer review processes. …The research councils will highlight to reviewers, panel members, recruitment and promotion panels that they should not place undue emphasis on the journal in which papers are published, but assess the content of specific papers, when considering the impact of an individual researcher’s contribution….The Research Councils will sign DORA as a public indication of their support for these principles….”

Prof Randy Schekman: Giving Science To The People

“It is a peculiar situation when commercial science journals can not only ask investigators to pay for the privilege of sending in their work but also charge universities and others for the privilege of accessing work that was publicly funded.”

Recommendation on the evaluation of individual researchers in the mathematical sciences

“Nothing (and in particular no semi-automatized pseudo-scientific evaluation that involves numbers or data) can replace evaluation by an individual who actually understands what he/she is evaluating. Furthermore, tools such as impact factors are clearly not helpful or relevant in the context of mathematical research….”

Open access research | Revista Pesquisa Fapesp

“Brazil stands out on the international landscape when it comes to open access, a movement launched in the early 2000s with the aim of making scientific output freely available online. According to data compiled by Spanish research group Scimago, 33.5% of the Brazilian articles indexed in the Scopus database in 2016 were published in journals whose content is free to read online as soon as it is published, under a model known as the “golden road.” This is the largest proportion among the 15 nations with the highest volume of scientific output recorded on Scopus. Brazil is also top of the list of nations with the highest number of open access scientific journals (see charts).”

Article visibility: journal impact factor and availability of full text in PubMed Central and open access

Abstract: Both the impact factor of the journal and immediate full-text availability in PubMed Central (PMC) have featured in editorials before [1-3]. In 2004, the editor of the Cardiovascular Journal of Africa (CVJA) lamented, like so many others, the injustice of not having an impact factor, questioned its validity as a tool for measuring science output, and noted the negative effect of a low perceived impact in drawing attention away from publications from developing countries [1,4].

Since then, after a selection process, we have been indexed by the Web of Science® (WoS) and Thomson Reuters (Philadelphia, PA, USA), and have seen a growing impact factor. In the case of PMC, our acceptance to this database was announced in 2012 [2], and now we are proud that it is active and full-text articles are available dating back to 2009. The journal opted for immediate full open access (OA), which means that full-text articles are available on publication date for anybody with access to the internet.