Requiem for impact factors and high publication charges | Accountability in Research

Abstract: Journal impact factors, publication charges, and assessment of the quality and accuracy of scientific research are critical for researchers, managers, funders, policy makers, and society. Editors and publishers compete for impact factor rankings to demonstrate how important their journals are, and researchers strive to publish in perceived top journals despite high publication and access charges. This raises questions of how top journals are identified, whether assessments of impact are accurate, and whether the high publication charges borne by the research community are justified, bearing in mind that researchers also collectively provide free peer review to the publishers. Although traditional journals accelerated peer review and publication during the COVID-19 pandemic, preprint servers made a greater impact, with over 30,000 open access articles becoming available, accelerating a trend already seen in other fields of research. We review and comment on the advantages and disadvantages of a range of assessment methods and the way in which they are used by researchers, managers, employers, and publishers. We argue that new approaches to assessment are required to provide a realistic and comprehensive measure of the value of research and journals, and we support open access publishing at a modest, affordable price to benefit research producers and consumers.
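For readers unfamiliar with the metric under critique, the Journal Impact Factor is conventionally a two-year ratio: citations received in a given year to items the journal published in the two preceding years, divided by the number of citable items published in those two years. A minimal sketch of that arithmetic (hypothetical function name, illustrative numbers):

```python
def journal_impact_factor(citations_by_year, citable_items_by_year, year):
    """Two-year Journal Impact Factor for `year`: citations received in
    `year` to items published in the two preceding years, divided by the
    number of citable items published in those two years."""
    prior = (year - 1, year - 2)
    cites = sum(citations_by_year.get(year, {}).get(y, 0) for y in prior)
    items = sum(citable_items_by_year.get(y, 0) for y in prior)
    return cites / items if items else 0.0

# Illustrative data: 200 citations in 2021 to the journal's 2019-2020
# output, which comprised 100 citable items.
cites = {2021: {2020: 120, 2019: 80}}
items = {2020: 60, 2019: 40}
print(journal_impact_factor(cites, items, 2021))  # → 2.0
```

The sketch also makes the metric's limits visible: it is a journal-level average over a short window, saying nothing about the merit of any individual article.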


Openness Profile: Modelling research evaluation for open scholarship | Zenodo

“Knowledge Exchange (KE) has published the report ‘Openness Profile: Modelling research evaluation for open scholarship’.

The report describes mechanisms and approaches to improve and incentivize the recording, the evaluation and the recognition of contributions to Open Scholarship practice. The report presents the Openness Profile and how it can help address existing gaps in the assessment of open science.

The Openness Profile is a digital resource, a portfolio of a research contributor’s outputs and activities, accessible in a single place. Academic and non-academic open scholarship activities become visible and more easily recognised. The Openness Profile models how research evaluation in an open science context can be improved. Expected benefits are highlighted and requirements listed. Recommendations are provided to various stakeholders on how to establish the Openness Profile as a research evaluation routine.

Over 80 individual stakeholders from 48 different organisations provided input to this report on research assessment and open scholarship. The work and writing were done by consultants Fiona Murphy and Phill Jones of MoreBrains Cooperative, together with the KE Open Scholarship Research Evaluation task & finish group….”

Incentivization Blueprint — Open Research Funders Group

“A growing number of funders are eager to encourage grantees to share their research outputs – articles, code and materials, and data. To accelerate the adoption of open norms, deploying the right incentives is of paramount importance. Specifically, the incentive structure needs to both reduce its reliance on publication in high-impact journals as a primary metric, and properly value and reward a range of research outputs.

This Incentivization Blueprint seeks to provide funders with a stepwise approach to adjusting their incentivization schemes to more closely align with open access, open data, open science, and open research. Developed by the Open Research Funders Group, the Blueprint provides organizations with guidance for developing, implementing, and overseeing incentive structures that maximize the visibility and usability of the research they fund.

A number of prominent funders have committed to taking steps to implement the Incentivization Blueprint. Among them are the following: …”

Open Research Funders Group (ORFG) | DORA

“The ORFG released guidance for funders called Incentivizing the sharing of research outputs through research assessment: a funder implementation blueprint. The group created the document to assist funders in encouraging researchers to maximize the impact of their work by openly sharing research outputs. The blueprint identifies three goals to be successful:

change the perception that publication in high-impact journals is the only metric that counts;
provide demonstrable evidence that, while journal articles are important, we value and reward all types of research outputs; and
ensure that indicators like the venue of publication or journal impact factor are not used as surrogate measures of quality in researcher assessment.

To do this, the blueprint provides three steps with concrete actions for funders: 1) policy development and declarations, 2) implementation, and 3) engagement. Template language for funders is included in the document to promote easy uptake….”

Recognition and rewards in the Open Era: Turning thoughts into actions | Open Working

“The TU Delft Open Science programme held its very first thematic session on the Recognition and Rewards cross-cutting theme on October 5, 2020. The Open Science Programme currently has 5 projects and 3 cross-cutting themes, from FAIR software to Open Education. This means that the programme core team is composed of members from many different departments (not only within the Library), bringing in their diverse perspectives and skills! But this also poses a challenge for teamwork: we need a way for us all to stay in touch, see and learn from each other’s work, and contribute and provide feedback – hence the idea of the thematic sessions.

Ingrid Vos, the leader of the Recognition and Rewards theme, kindly volunteered to lead this first thematic session. Since this theme relates to everyone’s work within the Open Science Programme, Ingrid wanted to make sure everyone could be effectively engaged in the session and their voices could be heard – more on this below.

Key takeaways:

A re-examination of rewards and recognition is needed to further fuel the cultural and behavioural changes towards open science.
TU Delft’s work in this area builds upon VSNU’s “Room for everyone’s talent” position paper. Every university in the Netherlands has a committee on Recognition & Rewards; the TU Delft committee is led by Ena Voûte.
The Open Science Programme team had fruitful discussions around open research and education behaviours and “products”, how to evaluate, appreciate and reward these, as well as emerging career paths.

We’d love to hear your ideas and thoughts, both on rewards and recognition and on how you’d like to contribute and participate in these discussions – please use the comment section of this post! …”

Indonesia should stop pushing its academics to chase empty indicators – Nikkei Asia

“An assessment system that predominantly evaluates research performance based on journal output and citations is steering academics from developing countries like mine to chasing quantity over quality. And being exploited while doing so.

Researchers in Indonesia are the second most likely in the world to publish in dubious journals that print articles for a fee without proper scientific peer review, a process where several experts in the field review the merit of the research, according to a new study by economists Vit Machacek and Martin Srholec.


These predatory journals prey on academics whose career progressions, and therefore salary increases, are determined by credit points. They exploit the processing fees that authors pay to make articles open to the public. They pocket the payment, an average of $178, an amount close to the basic salary of an entry-level lecturer at a state university in Indonesia, without facilitating proper peer review. The papers published by predatory journals are often low quality, with typographical and grammatical errors….

In addition to the predatory journal problem, the metric also discourages scientific collaboration. Because the metric values article count, academics who want to turn out several journal articles from a data set have an incentive to hold on to the data rather than sharing it for other scientists to analyze….”

Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”

Taylor & Francis signs up to principles outlined in DORA supporting balanced and fair research assessment – Taylor & Francis Newsroom

“Taylor & Francis Group has signed the San Francisco Declaration on Research Assessment (DORA), which aims to improve the ways in which researchers and the outputs of scholarly research are evaluated.

Signatories to DORA recognize that the Journal Impact Factor should not be used as an all-encompassing tool for evaluating research. They advocate for a sea change where all research articles are assessed on their own merits and impact, and not assessed on the basis of their publication venue. By signing DORA, Taylor & Francis aligns with these concepts….”
