HuMetricsHSS – Rethinking humane indicators of excellence in the humanities and social sciences

“HuMetricsHSS takes the approach that metrics should only be used to measure a scholar’s progress toward embodying five values that our initial research suggests are central to all HSS disciplines:

  • Collegiality, which can be described as the professional practices of kindness, generosity, and empathy toward other scholars and oneself;
  • Quality, a value that demonstrates one’s originality, willingness to push boundaries, methodological soundness, and the advancement of knowledge both within one’s own discipline and among other disciplines and with the general public, as well;
  • Equity, or the willingness to undertake study with social justice, equitable access to research, and the public good in mind;
  • Openness, which includes a researcher’s transparency, candor, and accountability, in addition to the practice of making one’s research open access at all stages; and
  • Community, the value of being engaged in one’s community of practice and with the public at large and also in practicing principled leadership. …”

Clarivate Analytics announces landmark partnership with Impactstory to make open access content

“Clarivate Analytics today announced a novel public/private strategic partnership with Impactstory that will remove a critical barrier for researchers: limited open access (OA) to high-quality, trusted peer-reviewed content. Under the terms of the partnership, Clarivate Analytics is providing a grant to Impactstory to build on its oaDOI service, making open access content more easily discoverable, and the research workflow more efficient from discovery through publishing…. The oaDOI service is from Impactstory, a nonprofit creating online tools to make science more open and reusable. It currently indexes 90 million articles and delivers open-access full-text versions over a free, fast, open API that is used by over 700 libraries worldwide and fulfills over 2 million requests daily. Impactstory has also built Unpaywall, a free browser extension that uses oaDOI to find full text whenever researchers come across paywalled articles….”
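
For readers who want to try the lookup the excerpt describes, a minimal sketch follows. It queries the oaDOI/Unpaywall v2 API for a single DOI; the endpoint, the required email parameter, and the response fields (is_oa, best_oa_location, url_for_pdf) are taken from the service's public documentation and should be verified against the current docs before use.

```python
# Minimal sketch: look up an open-access copy of a paper via the oaDOI/Unpaywall API.
# Endpoint, ?email= parameter, and field names are assumptions based on the public
# v2 documentation; check the current docs before relying on them.
import requests

def find_oa_copy(doi, email):
    """Return a URL for an open-access full text of the given DOI, if one is known."""
    resp = requests.get(f"https://api.unpaywall.org/v2/{doi}", params={"email": email})
    resp.raise_for_status()
    record = resp.json()
    if not record.get("is_oa"):
        return None
    best = record.get("best_oa_location") or {}
    return best.get("url_for_pdf") or best.get("url")

# Example: find_oa_copy("10.1038/nature12373", "you@example.org")
```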

Usage Bibliometrics as a Tool to Measure Research Activity

Abstract: Measures for research activity and impact have become an integral ingredient in the assessment of a wide range of entities (individual researchers, organizations, instruments, regions, disciplines). Traditional bibliometric indicators, like publication- and citation-based indicators, provide an essential part of this picture, but cannot describe the complete picture. Since reading scholarly publications is an essential part of the research life cycle, it is only natural to introduce measures for this activity in attempts to quantify the efficiency, productivity and impact of an entity. Citations and reads are significantly different signals, so taken together, they provide a more complete picture of research activity. Most scholarly publications are now accessed online, making the study of reads and their patterns possible. Clickstream logs allow us to follow information access by the entire research community in real time, whereas publication and citation datasets reflect activity by authors only. In addition, download statistics help us identify publications with significant impact that do not attract many citations. Clickstream signals are arguably more complex than, say, citation signals. For one, they are a superposition of different classes of readers. Systematic downloads by crawlers also contaminate the signal, as does browsing behavior. We discuss the complexities associated with clickstream data and how, with proper filtering, statistically significant relations and conclusions can be inferred from download statistics. We describe how download statistics can be used to describe research activity at different levels of aggregation, ranging from organizations to countries. These statistics show a correlation with socio-economic indicators. A comparison will be made with traditional bibliometric indicators. We will argue that astronomy is representative of more general trends.
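
The abstract's central caveat is that raw download logs mix genuine reads with crawler traffic and bulk harvesting. The sketch below shows one plausible form of the filtering it alludes to; the log format, bot signatures, and rate threshold are illustrative assumptions, not the authors' actual pipeline.

```python
# Illustrative sketch of crawler filtering before aggregating per-article read counts.
# The record layout, bot pattern, and 500-article threshold are assumptions for
# illustration only.
import re
from collections import Counter, defaultdict

BOT_PATTERN = re.compile(r"bot|crawler|spider|curl|wget", re.IGNORECASE)

def count_reads(log_records):
    """log_records: iterable of dicts with 'ip', 'user_agent', and 'article_id' keys."""
    per_ip = defaultdict(set)
    cleaned = []
    for rec in log_records:
        if BOT_PATTERN.search(rec["user_agent"]):
            continue                      # drop self-declared crawlers
        per_ip[rec["ip"]].add(rec["article_id"])
        cleaned.append(rec)
    # Drop IPs that touch implausibly many distinct articles (systematic harvesting).
    bulk_ips = {ip for ip, articles in per_ip.items() if len(articles) > 500}
    return Counter(rec["article_id"] for rec in cleaned if rec["ip"] not in bulk_ips)
```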

European Institutions Adopt Altmetric Explorer for Institutions – Digital Science

“Our portfolio company Altmetric announces that École Polytechnique Fédérale de Lausanne (EPFL) has become the latest institution to adopt the Explorer for Institutions platform to help analyse the online engagement surrounding its scholarly research outputs.

With an intuitive interface which enables users to browse, filter and report on the latest shares and mentions for over 10 million research outputs, the Explorer for Institutions platform makes it easy to identify where academic work has received mainstream or social media coverage, been referenced in public policy, or received attention from scholarly and broader audiences in places such as Wikipedia, Reddit and post-publication peer-review forums. Citation data from Scopus and Web of Science is also included where available.

EPFL joins leading institutions including Ghent University, ETH Zurich, The University of Helsinki and the International Institute of Social Studies and the Erasmus Research Institute of Management at Erasmus University Rotterdam in utilising Altmetric data to better understand the reach and influence of published research.”
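
Explorer for Institutions is a commercial platform, but the kind of per-output attention data it aggregates can be illustrated with Altmetric's free Details Page API. The endpoint and field names below follow the public API documentation and may change; they are assumptions to verify, not part of the Explorer product itself.

```python
# Minimal sketch: pull attention counts for a single output from Altmetric's public
# Details Page API. Endpoint and field names are taken from the public documentation
# and should be checked before use.
import requests

def attention_summary(doi):
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if resp.status_code == 404:
        return {}                     # no recorded attention for this DOI
    resp.raise_for_status()
    data = resp.json()
    return {
        "score": data.get("score"),
        "news": data.get("cited_by_msm_count"),
        "policy": data.get("cited_by_policies_count"),
        "wikipedia": data.get("cited_by_wikipedia_count"),
        "twitter": data.get("cited_by_tweeters_count"),
    }
```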

Tread carefully with altmetrics, European Commission told | Times Higher Education (THE)

“Alternative metrics should be used by the European Commission alongside expert judgement and other measures of research quality, according to a new report.

The report cautions against relying too heavily on new ways of measuring research when developing the open science agenda in Europe….The group, led by James Wilsdon, professor of research policy at the University of Sheffield, came to its conclusions by reviewing the literature and evidence submitted to it about how new metrics could help to advance the work on opening up European science….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics … and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016 to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to gather further feedback. The audience at the STI Conference consisted mainly of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference came mainly from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous, given via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

PBJ is now a leading open access plant journal – Daniell – 2017 – Plant Biotechnology Journal – Wiley Online Library

“Welcome to the first issue of the fifteenth volume of Plant Biotechnology Journal. I would like to start this editorial by announcing the successful transition of PBJ from a subscription-based journal to an open access journal supported exclusively by authors. This resulted in enhanced free global access to all readers. I applaud the PBJ management team for offering free open access to all articles published in this journal in the past 14 years. As the first among the top ten open access plant science journals, based on 2016 citations, PBJ is very likely to be ranked among the top three journals publishing original research. PBJ is now compatible with mobile platforms, tablets, iPads, and iPhones and offers several new options to evaluate short- and long-term impact of published articles, including Altmetric scores, article readership, and citations….”

Using PageRank to assess scientific importance | Ars Technica

“Researchers have turned to network theory to better model and understand scientific importance. A number of interesting node-based models describe papers as an interconnected web of citations, including some that rank the relative impact of journals. To a simple approximation, most of these models combine the number of links to a particular paper (the number of citations), the number of links to the citing papers, and the interconnectedness of that network with other networks of papers. It occurred to researchers at Brookhaven National Lab (BNL) and Boston University (BU) that an additional and extremely effective network analysis tool already exists: Google’s PageRank….”
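
To make the idea concrete, here is a compact, self-contained PageRank sketch applied to a toy citation graph: a paper's rank depends not only on how many papers cite it but on the rank of the papers doing the citing. The graph, damping factor, and iteration count are illustrative; this is not the specific model used by the BNL/BU researchers.

```python
# PageRank by power iteration on a small citation graph.
def pagerank(citations, damping=0.85, iterations=100):
    """citations: dict mapping each paper to the list of papers it cites."""
    papers = list(citations)
    rank = {p: 1.0 / len(papers) for p in papers}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(papers) for p in papers}
        for citing, cited_list in citations.items():
            if not cited_list:        # papers that cite nothing spread rank uniformly
                share = damping * rank[citing] / len(papers)
                for p in papers:
                    new_rank[p] += share
            else:
                share = damping * rank[citing] / len(cited_list)
                for cited in cited_list:
                    new_rank[cited] += share
        rank = new_rank
    return rank

# Toy citation network: D is cited by every other paper and ends up with the highest rank.
graph = {"A": ["D"], "B": ["A", "D"], "C": ["B", "D"], "D": []}
print(sorted(pagerank(graph).items(), key=lambda kv: -kv[1]))
```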