Additional support for RCR: A validated article-level measure of scientific influence

“In their comment, Janssens et al. [1] offer a critique of the Relative Citation Ratio (RCR), objecting to the construction of both the numerator and denominator of the metric. While we strongly agree that any measure used to assess the productivity of research programs should be thoughtfully designed and carefully validated, we believe that the specific concerns outlined in their correspondence are unfounded.

Our original article acknowledged that RCR, or for that matter any bibliometric measure, has limited power to quantify the influence of a very recently published paper, because citation rates are inherently noisy when the absolute number of citations is small [2]. For this reason, our iCite tool has not reported RCRs for papers published in the calendar year preceding the current one [3]. However, while agreeing with our initial assertion that RCR cannot be used to conclusively evaluate recent papers, Janssens et al. also suggest that the failure to report RCRs for new publications might unfairly penalize some researchers. It is widely understood that it takes time to accurately assess the influence that new papers have on their field, but we have attempted to accommodate this concern by updating iCite: RCRs are now reported for all papers in the database that have at least 5 citations, and a visual indicator flags values for papers published in the last 18 months, which should be considered provisional [3]. This modified practice will be maintained going forward.

Regarding the citation rates of older articles, we have added data on the stability of RCR values to the “Statistics” page of the iCite website [4, 5]. We believe that these new data, which demonstrate that the vast majority of influential papers retain their influence over the course of an investigator’s career, should reassure users that RCR does not unfairly disadvantage older papers. Our analysis of the year-by-year changes in RCR values of National Institutes of Health (NIH)-funded articles published in 1991 reinforces this point (Fig 1). From 1992 to 2014, both at the individual level and in aggregate, RCR values are remarkably stable. In the cases where RCRs do change significantly, the values typically increase. That said, we strongly believe that the potential for RCR to decrease over time is necessary and important; as knowledge advances and old models are replaced, publications rooted in those outdated models naturally become less influential….”
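
To make the revised iCite reporting rule described above concrete, here is a minimal sketch of how such a filter might look in code: RCRs are shown only for papers with at least 5 citations, and values for papers published within the last 18 months carry a provisional flag rather than being suppressed. The record fields, the date arithmetic, and the output format are illustrative assumptions, not the actual iCite implementation.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Illustrative thresholds; the 18-month window is approximated in days.
MIN_CITATIONS = 5
PROVISIONAL_WINDOW = timedelta(days=548)  # roughly 18 months

@dataclass
class Paper:
    pmid: str
    pub_date: date
    citations: int
    rcr: Optional[float]  # pre-computed Relative Citation Ratio, if available

def reportable_rcr(paper: Paper, today: date) -> Optional[dict]:
    """Return an RCR record for display, or None when the paper falls below
    the citation threshold. Values for recently published papers are shown
    but flagged as provisional rather than suppressed."""
    if paper.rcr is None or paper.citations < MIN_CITATIONS:
        return None  # too few citations for a stable estimate
    provisional = (today - paper.pub_date) < PROVISIONAL_WINDOW
    return {"pmid": paper.pmid, "rcr": paper.rcr, "provisional": provisional}

# Example: a paper published ten months ago with 7 citations is reported,
# but carries the provisional flag.
print(reportable_rcr(Paper("12345678", date(2016, 9, 1), 7, 1.4), today=date(2017, 7, 1)))
```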

HuMetricsHSS – Rethinking humane indicators of excellence in the humanities and social sciences

“HuMetricsHSS takes the approach that metrics should only be used to measure a scholar’s progress toward embodying five values that our initial research suggests are central to all HSS disciplines:

  • Collegiality, which can be described as the professional practices of kindness, generosity, and empathy toward other scholars and oneself;
  • Quality, a value that demonstrates one’s originality, willingness to push boundaries, methodological soundness, and the advancement of knowledge both within one’s own discipline and across other disciplines, as well as with the general public;
  • Equity, or the willingness to undertake study with social justice, equitable access to research, and the public good in mind;
  • Openness, which includes a researcher’s transparency, candor, and accountability, in addition to the practice of making one’s research open access at all stages; and
  • Community, the value of being engaged in one’s community of practice and with the public at large and also in practicing principled leadership. …”

Clarivate Analytics announces landmark partnership with Impactstory to make open access content

“Clarivate Analytics today announced a novel public/private strategic partnership with Impactstory that will remove a critical barrier for researchers: limited open access (OA) to high-quality, trusted peer-reviewed content. Under the terms of the partnership, Clarivate Analytics is providing a grant to Impactstory to build on its oaDOI service, making open access content more easily discoverable and the research workflow more efficient from discovery through publishing. … The oaDOI service is from Impactstory, a nonprofit creating online tools to make science more open and reusable. It currently indexes 90 million articles and delivers open-access full-text versions over a free, fast, open API that is used by over 700 libraries worldwide and fulfills over 2 million requests daily. Impactstory has also built Unpaywall, a free browser extension that uses oaDOI to find full text whenever researchers come across paywalled articles….”
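
For readers who want to see how a tool like Unpaywall consults oaDOI programmatically, the sketch below queries a DOI-lookup endpoint and returns a free full-text URL if one is known. The endpoint pattern, the required `email` parameter, and the JSON fields (`is_oa`, `best_oa_location`) follow the public Unpaywall v2 API, the successor to oaDOI, rather than anything stated in the announcement, so treat them as assumptions.

```python
import json
import urllib.request
from typing import Optional

def find_oa_copy(doi: str, email: str) -> Optional[str]:
    """Look up a DOI and return a free full-text URL if one is known.

    The endpoint and the response fields ('is_oa', 'best_oa_location') follow
    the public Unpaywall v2 API; adjust them if the service you target differs.
    """
    url = f"https://api.unpaywall.org/v2/{doi}?email={email}"
    with urllib.request.urlopen(url) as resp:
        record = json.load(resp)
    location = record.get("best_oa_location")
    if record.get("is_oa") and location:
        return location.get("url_for_pdf") or location.get("url")
    return None

# Example (placeholder DOI and email):
# print(find_oa_copy("10.1234/example-doi", "you@example.org"))
```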

Usage Bibliometrics as a Tool to Measure Research Activity

Abstract: Measures of research activity and impact have become an integral ingredient in the assessment of a wide range of entities (individual researchers, organizations, instruments, regions, disciplines). Traditional bibliometric indicators, such as publication- and citation-based indicators, provide an essential part of this picture but cannot describe it completely. Since reading scholarly publications is an essential part of the research life cycle, it is only natural to introduce measures for this activity in attempts to quantify the efficiency, productivity, and impact of an entity. Citations and reads are significantly different signals, so taken together they provide a more complete picture of research activity. Most scholarly publications are now accessed online, making the study of reads and their patterns possible. Clickstream logs allow us to follow information access by the entire research community in real time, whereas publication and citation datasets reflect only activity by authors. In addition, download statistics help identify publications that have significant impact but do not attract many citations. Clickstream signals are arguably more complex than, say, citation signals. For one, they are a superposition of different classes of readers. Systematic downloads by crawlers also contaminate the signal, as does browsing behavior. We discuss the complexities associated with clickstream data and how, with proper filtering, statistically significant relations and conclusions can be inferred from download statistics. We describe how download statistics can be used to characterize research activity at different levels of aggregation, ranging from organizations to countries. These statistics show a correlation with socio-economic indicators. A comparison will be made with traditional bibliometric indicators. We will argue that astronomy is representative of more general trends.
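
As a rough illustration of the filtering step the abstract alludes to, the sketch below drops likely crawler traffic and systematic bulk downloads from a clickstream log before aggregating the remaining reads per organization. The log layout, the user-agent heuristic, and the per-IP cap are simplified assumptions for illustration, not the pipeline used in the paper.

```python
from collections import Counter, defaultdict

# Simplified log record: (ip, user_agent, org, paper_id). Real clickstream
# logs carry far more detail (timestamps, sessions, referrers) than shown here.
BOT_MARKERS = ("bot", "crawler", "spider")  # crude user-agent heuristic
MAX_DOWNLOADS_PER_IP = 500                  # flag systematic bulk downloading

def filtered_reads_by_org(log_records):
    """Count downloads per organization after dropping likely non-human traffic."""
    per_ip = Counter(ip for ip, _ua, _org, _pid in log_records)
    reads = defaultdict(int)
    for ip, user_agent, org, _paper_id in log_records:
        if any(marker in user_agent.lower() for marker in BOT_MARKERS):
            continue  # declared crawlers and spiders
        if per_ip[ip] > MAX_DOWNLOADS_PER_IP:
            continue  # systematic downloads unlikely to reflect human reading
        reads[org] += 1
    return dict(reads)

# Example: two human reads from EPFL are kept, one crawler hit is discarded.
log = [("1.2.3.4", "Mozilla/5.0", "EPFL", "p1"),
       ("1.2.3.4", "Mozilla/5.0", "EPFL", "p2"),
       ("5.6.7.8", "Googlebot/2.1", "MIT", "p1")]
print(filtered_reads_by_org(log))  # {'EPFL': 2}
```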

European Institutions Adopt Altmetric Explorer for Institutions – Digital Science

“Our portfolio company Altmetric announces that École Polytechnique Fédérale de Lausanne (EPFL) has become the latest institution to adopt the Explorer for Institutions platform to help analyse the online engagement surrounding its scholarly research outputs.

With an intuitive interface which enables users to browse, filter and report on the latest shares and mentions for over 10 million research outputs, the Explorer for Institutions platform makes it easy to identify where academic work has received mainstream or social media coverage, been referenced in public policy, or received attention from scholarly and broader audiences in places such as Wikipedia, Reddit and post-publication peer-review forums. Citation data from Scopus and Web of Science is also included where available.

EPFL joins leading institutions including Ghent University, ETH Zurich, the University of Helsinki, and the International Institute of Social Studies and the Erasmus Research Institute of Management at Erasmus University Rotterdam in utilising Altmetric data to better understand the reach and influence of published research.”

Tread carefully with altmetrics, European Commission told | Times Higher Education (THE)

“Alternative metrics should be used by the European Commission alongside expert judgement and other measures of research quality, according to a new report.

The report cautions against relying too heavily on new ways of measuring research when developing the open science agenda in Europe….The group, led by James Wilsdon, professor of research policy at the University of Sheffield, came to its conclusions by reviewing the literature and evidence submitted to it about how new metrics could help to advance the work on opening up European science….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics … and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016 to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to gather further feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous, provided via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

PBJ is now a leading open access plant journal – Daniell – 2017 – Plant Biotechnology Journal – Wiley Online Library

“Welcome to the first issue of the fifteenth volume of Plant Biotechnology Journal. I would like to start this editorial by announcing the successful transition of PBJ from a subscription-based journal to an open access journal supported exclusively by authors. This resulted in enhanced free global access to all readers. I applaud the PBJ management team for offering free open access to all articles published in this journal in the past 14 years. As the first among the top ten open access plant science journals, based on 2016 citations, PBJ is very likely to be ranked among the top three journals publishing original research. PBJ is now compatible with mobile platforms, tablets, iPads, and iPhones and offers several new options to evaluate short- and long-term impact of published articles, including Altmetric scores, article readership, and citations….”