Perception of the importance of chemistry research papers and comparison to citation rates

Abstract: Chemistry researchers are frequently evaluated on the perceived significance of their work, with citation count the most commonly used metric for gauging this property. Recent studies have called for a broader evaluation of significance that includes more nuanced bibliometrics as well as altmetrics to more completely evaluate scientific research. To better understand the relationship between metrics and peer judgements of significance in chemistry, we conducted a survey of chemists to investigate their perceptions of previously published research. Focusing on a specific issue of the Journal of the American Chemical Society published in 2003, respondents were asked to select the articles they thought best matched importance and significance in several contexts: highest number of citations, most significant (subjectively defined), most likely to share among chemists, and most likely to share with a broader audience. The survey responses can be summed up in several observations. Respondents predict the citation counts of established research markedly less well than the h-index of each article's corresponding author does. This observation holds even when considering only responses from chemists whose expertise falls within the subdiscipline that best describes the work performed in an article. Respondents view both cited papers and significant papers differently from papers that should be shared with chemists. We conclude from our results that peer judgements of importance and significance differ from metrics-based measurements, and that chemists should work with bibliometricians to develop metrics that better capture the nuance of opinions on the importance of a given piece of research.
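
The central comparison here, how well respondents versus the corresponding author's h-index anticipate actual citation counts, can be made concrete with a small sketch. The numbers below are placeholders rather than data from the study, and the use of Spearman rank correlation is an assumption about how "predictive ability" might be scored.

```python
# Illustrative sketch only: placeholder numbers, not data from the study.
# Compares how well two signals track actual citation counts using
# Spearman rank correlation (one plausible way to frame "predictive ability").
from scipy.stats import spearmanr

# Hypothetical per-article values for one journal issue.
citations    = [12, 45, 230, 8, 97, 60]   # actual citation counts
h_index      = [18, 25, 52, 10, 40, 33]   # corresponding author's h-index
survey_votes = [0.10, 0.05, 0.30, 0.20, 0.15, 0.20]  # share of respondents picking each article as most cited

rho_h, _ = spearmanr(h_index, citations)
rho_s, _ = spearmanr(survey_votes, citations)
print(f"h-index vs citations:      rho = {rho_h:.2f}")
print(f"survey picks vs citations: rho = {rho_s:.2f}")
```

In this framing, a higher rank correlation for the h-index column than for the survey column would reproduce the pattern the abstract reports.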

The academic papers researchers regard as significant are not those that are highly cited

“For many years, academia has relied on citation count as the main way to measure the impact or importance of research, informing metrics such as the Impact Factor and the h-index. But how well do these metrics actually align with researchers’ subjective evaluation of impact and significance? Rachel Borchardt and Matthew R. Hartings report on a study that compares researchers’ perceptions of significance, importance, and what is highly cited with actual citation data. The results reveal a strikingly large discrepancy between perceptions of impact and the metric we currently use to measure it.”

Altmetric Scores, Citations, and Publication of Studies Posted as Preprints | JAMA

“As preprints in medicine are debated, data on how preprints are used, cited, and published are needed. We evaluated views and downloads and Altmetric scores and citations of preprints and their publications. We also assessed whether Altmetric scores and citations of published articles correlated with prior preprint posting….Published articles with preprints had significantly higher Altmetric scores than published articles without preprints….”

Additional support for RCR: A validated article-level measure of scientific influence

“In their comment, Janssens et al. [1] offer a critique of the Relative Citation Ratio (RCR), objecting to the construction of both the numerator and denominator of the metric. While we strongly agree that any measure used to assess the productivity of research programs should be thoughtfully designed and carefully validated, we believe that the specific concerns outlined in their correspondence are unfounded.

Our original article acknowledged that RCR or, for that matter, any bibliometric measure has limited power to quantify the influence of any very recently published paper, because citation rates are inherently noisy when the absolute number of citations is small [2]. For this reason, in our iCite tool, we have not reported RCRs for papers published in the calendar year previous to the current year [3]. However, while agreeing with our initial assertion that RCR cannot be used to conclusively evaluate recent papers, Janssens et al. also suggest that the failure to report RCRs for new publications might unfairly penalize some researchers. While it is widely understood that it takes time to accurately assess the influence that new papers have on their field, we have attempted to accommodate this concern by updating iCite so that RCRs are now reported for all papers in the database that have at least 5 citations and by adding a visual indicator to flag values for papers published in the last 18 months, which should be considered provisional [3]. This modified practice will be maintained going forward.

Regarding article citation rates of older articles, we have added data on the stability of RCR values to the “Statistics” page of the iCite website [4, 5]. We believe that these new data, which demonstrate that the vast majority of influential papers retain their influence over the period of an investigator’s career, should reassure users that RCR does not unfairly disadvantage older papers. Our analysis of the year-by-year changes in RCR values of National Institutes of Health (NIH)-funded articles published in 1991 reinforces this point (Fig 1). From 1992–2014, both on the individual level and in aggregate, RCR values are remarkably stable. For cases in which RCRs change significantly, the values typically increase. That said, we strongly believe that the potential for RCR to decrease over time is necessary and important; as knowledge advances and old models are replaced, publications rooted in those outdated models naturally become less influential….”
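
The reporting policy described above, showing an RCR only for papers with at least 5 citations and flagging values for papers from the last 18 months as provisional, can be sketched as a simple rule. This is only an illustration of the stated policy, not iCite's actual code; the Paper fields and the simplified RCR ratio (an article's citation rate over an expected, field-benchmarked rate) are assumptions.

```python
# Illustrative sketch of the reporting rule described in the text; not iCite's code.
from dataclasses import dataclass
from datetime import date

@dataclass
class Paper:
    citations: int
    citations_per_year: float            # article citation rate
    expected_citations_per_year: float   # field-benchmarked expectation (assumed input)
    published: date

def rcr_report(paper: Paper, today: date):
    """Return (rcr, provisional) or None when no RCR should be shown."""
    if paper.citations < 5:
        return None  # too few citations: citation rates are noisy, so no RCR is reported
    rcr = paper.citations_per_year / paper.expected_citations_per_year
    age_months = (today.year - paper.published.year) * 12 + (today.month - paper.published.month)
    provisional = age_months <= 18  # recent papers are flagged; values may still shift
    return rcr, provisional

# Example (hypothetical values):
# p = Paper(citations=12, citations_per_year=3.0, expected_citations_per_year=2.0,
#           published=date(2016, 5, 1))
# print(rcr_report(p, today=date(2017, 8, 1)))   # -> (1.5, True): RCR 1.5, flagged provisional
```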

HuMetricsHSS – Rethinking humane indicators of excellence in the humanities and social sciences

“HuMetricsHSS takes the approach that metrics should only be used to measure a scholar’s progress toward embodying five values that our initial research suggests are central to all HSS disciplines:

  • Collegiality, which can be described as the professional practices of kindness, generosity, and empathy toward other scholars and oneself;
  • Quality, a value that demonstrates one’s originality, willingness to push boundaries, methodological soundness, and the advancement of knowledge both within one’s own discipline and among other disciplines and with the general public, as well;
  • Equity, or the willingness to undertake study with social justice, equitable access to research, and the public good in mind;
  • Openness, which includes a researcher’s transparency, candor, and accountability, in addition to the practice of making one’s research open access at all stages; and
  • Community, the value of being engaged in one’s community of practice and with the public at large and also in practicing principled leadership. …”

Clarivate Analytics announces landmark partnership with Impactstory to make open access content more easily discoverable

“Clarivate Analytics today announced a novel public/private strategic partnership with Impactstory that will remove a critical barrier for researchers: limited open access (OA) to high-quality, trusted peer-reviewed content. Under the terms of the partnership, Clarivate Analytics is providing a grant to Impactstory to build on its oaDOI service, making open access content more easily discoverable, and the research workflow more efficient from discovery through publishing….The oaDOI service is from Impactstory, a nonprofit creating online tools to make science more open and reusable. It currently indexes 90 million articles and delivers open-access full text versions over a free, fast, open API that is currently used by over 700 libraries worldwide and fulfills over 2 million requests daily. Impactstory has also built Unpaywall, a free browser extension that uses oaDOI to find full text whenever researchers come across paywalled articles….”
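
For readers who want to try the service, a minimal lookup sketch follows. It assumes the oaDOI interface now exposed as the Unpaywall REST API (a DOI-keyed GET with an email parameter); the example DOI and email address are placeholders, and the field names should be checked against the current documentation.

```python
# Minimal sketch of querying the oaDOI service (now exposed as the Unpaywall API)
# for an open-access copy of an article.
import requests

def find_oa_copy(doi: str, email: str):
    """Return the best open-access URL for a DOI, or None if none is known."""
    resp = requests.get(f"https://api.unpaywall.org/v2/{doi}", params={"email": email})
    resp.raise_for_status()
    record = resp.json()
    best = record.get("best_oa_location")  # assumed field name; verify against the docs
    return (best.get("url_for_pdf") or best.get("url")) if best else None

# Example (placeholder DOI and email):
# print(find_oa_copy("10.1038/nature12373", "you@example.org"))
```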

Usage Bibliometrics as a Tool to Measure Research Activity

Abstract: Measures of research activity and impact have become an integral ingredient in the assessment of a wide range of entities (individual researchers, organizations, instruments, regions, disciplines). Traditional bibliometric indicators, like publication- and citation-based indicators, provide an essential part of this picture, but they cannot describe it completely. Since reading scholarly publications is an essential part of the research life cycle, it is only natural to introduce measures for this activity when attempting to quantify the efficiency, productivity, and impact of an entity. Citations and reads are significantly different signals, so taken together they provide a more complete picture of research activity. Most scholarly publications are now accessed online, making the study of reads and their patterns possible. Clickstream logs allow us to follow information access by the entire research community in real time, whereas publication and citation datasets reflect activity by authors only. In addition, download statistics help us identify publications that have significant impact but attract few citations. Clickstream signals are arguably more complex than, say, citation signals: they are a superposition of different classes of readers, and systematic downloads by crawlers contaminate the signal, as does browsing behavior. We discuss the complexities associated with clickstream data and how, with proper filtering, statistically significant relations and conclusions can be inferred from download statistics. We describe how download statistics can be used to characterize research activity at different levels of aggregation, ranging from organizations to countries, and show that these statistics correlate with socio-economic indicators. A comparison is made with traditional bibliometric indicators, and we argue that astronomy is representative of more general trends.
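
The "proper filtering" mentioned in the abstract is not spelled out there; the sketch below shows one common style of cleanup (a user-agent blocklist plus a per-client rate cap) applied to a download log before counting reads. The field names and thresholds are assumptions, not the authors' pipeline.

```python
# Illustrative sketch of the kind of filtering the abstract alludes to; not the
# authors' pipeline. It drops declared crawler traffic (user-agent match) and
# clients whose daily download rate is implausibly high before counting reads.
from collections import Counter, defaultdict

BOT_MARKERS = ("bot", "crawler", "spider", "curl", "wget")  # assumed blocklist
MAX_DOWNLOADS_PER_CLIENT_PER_DAY = 100                      # assumed threshold

def count_reads(log_entries):
    """log_entries: iterable of dicts with 'client', 'user_agent', 'day', 'paper_id'."""
    per_client_day = Counter()
    kept = []
    for e in log_entries:
        if any(m in e["user_agent"].lower() for m in BOT_MARKERS):
            continue                      # skip declared crawlers
        per_client_day[(e["client"], e["day"])] += 1
        kept.append(e)
    reads = defaultdict(int)
    for e in kept:
        if per_client_day[(e["client"], e["day"])] <= MAX_DOWNLOADS_PER_CLIENT_PER_DAY:
            reads[e["paper_id"]] += 1     # count only plausibly human reads
    return dict(reads)
```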

European Institutions Adopt Altmetric Explorer for Institutions – Digital Science

“Our portfolio company Altmetric announces that École Polytechnique Fédérale de Lausanne (EPFL) has become the latest institution to adopt the Explorer for Institutions platform to help analyse the online engagement surrounding its scholarly research outputs.

With an intuitive interface which enables users to browse, filter and report on the latest shares and mentions for over 10 million research outputs, the Explorer for Institutions platform makes it easy to identify where academic work has received mainstream or social media coverage, been referenced in public policy, or received attention from scholarly and broader audiences in places such as Wikipedia, Reddit and post-publication peer-review forums. Citation data from Scopus and Web of Science is also included where available.

EPFL joins leading institutions including Ghent University, ETH Zurich, the University of Helsinki, and the International Institute of Social Studies and the Erasmus Research Institute of Management at Erasmus University Rotterdam in utilising Altmetric data to better understand the reach and influence of published research.”

Tread carefully with altmetrics, European Commission told | Times Higher Education (THE)

“Alternative metrics should be used by the European Commission alongside expert judgement and other measures of research quality, according to a new report.

The report cautions against relying too heavily on new ways of measuring research when developing the open science agenda in Europe….The group, led by James Wilsdon, professor of research policy at the University of Sheffield, came to its conclusions by reviewing the literature and evidence submitted to it about how new metrics could help to advance the work on opening up European science….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established to advise on strategic direction and implementation [3]. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics … and other aspects of open science’ [4].

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016 to gather the views of stakeholders [11]. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) [12] and the 3AM Conference in Bucharest (September 29, 2016) [13]. Both occasions were used to gather further feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous, via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”