The changing role of funders in responsible research assessment: progress, obstacles and the way ahead

“Encouraging interim results of different vaccine trials reflect the speed, innovation and dedication that the research community has shown in its response to Covid-19. But the pandemic has also shone a spotlight on the inner workings of research, and in lots of ways—good and bad—has intensified scrutiny of how research is funded, practiced, disseminated and evaluated, and how research cultures can be made more open, inclusive and impactful.

The uncertain possibilities that flow from this moment follow a period in which concern has intensified over several long-standing problems, all linked to research assessment. As attention shifts from describing these problems, towards designing and implementing solutions, efforts are coalescing around the idea of responsible research assessment (RRA). This is an umbrella term for approaches to assessment which incentivise, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures.

This working paper explores what RRA is, and where it comes from, by outlining fifteen initiatives that have influenced the content, shape and direction of current RRA debates. It goes on to describe some of the responses that these have elicited, with a particular focus on the role and contribution of research funders, who have more freedom and agency to experiment and drive change than many of the other actors in research systems.

The paper also presents the findings of a new survey of RRA policies and practices in the participant organisations of the Global Research Council (GRC)—most of which are national public funding agencies—with responses from 55 organisations worldwide….”

Responsible Research Assessment – a virtual conference from the Global Research Council

“Around the world, the COVID-19 pandemic has reaffirmed the importance of international collaboration in research and innovation. The impact of research has become ever more apparent during the pandemic, and so there is a renewed urgency for funders to come together and reconsider how research is assessed and evaluated. 

At the GRC Responsible Research Assessment Conference 2020, participants will be invited to consider the existing sector-wide frameworks on responsible research assessment and have a global discussion on how funders can drive a positive research culture through research assessment criteria and processes. The discussions will reflect on how to support a diverse, inclusive and thriving research sector….”

A pivotal moment for responsible research assessment – Research Professional News

“We’ve been involved in diagnosing, assembling evidence and banging drums about these problems, through initiatives such as the Declaration on Research Assessment (Dora), the Metric Tide report and the UK Forum for Responsible Research Metrics.

So we welcome signs that attention is shifting towards implementing solutions, and coalescing around a more expansive agenda for responsible research assessment (RRA). Early debates on metrics and measurement have expanded to encompass questions about how to create a healthy work culture for researchers, how to promote research integrity, how to move from closed to open scholarship, and how to embed the principles of equality, diversity and inclusion across the research community.

This more holistic approach can be seen, for example, in UK Research and Innovation’s commitment to a healthy research culture, and in the recent guidelines on good research practice from the German Research Foundation (DFG).

Next week’s Global Research Council virtual conference on RRA—hosted by UKRI in collaboration with the UK Forum for Responsible Research Metrics and South Africa’s National Research Foundation—comes at a pivotal time….

Declarations and statements of principle have been an important part of this story. But even though we have co-authored some of these, we feel the time for grand declarations has passed. They risk becoming substitutes for action.

RRA now needs to focus on action and implementation—testing and identifying what works in building a healthy and productive research culture. Institutional commitments must be followed by the hard graft of reforming cultures, practices and processes….”

Academy of Finland to adopt reforms – Academy of Finland

“Starting with calls to be opened after 1 January 2021, the Academy of Finland will introduce a number of reforms concerning open access to scientific publications and responsible researcher evaluation. Through the reforms, the Academy wants to further strengthen its long-established policies on openness of scientific outputs and responsibility in researcher evaluation.

The policies are supported by the Academy’s commitments to international and national declarations over the past two years: the Plan S initiative for open access publishing (2018), the Declaration on Research Assessment (DORA) for improved research assessment (2019), the national recommendation on responsible researcher evaluation (2020), and the Finnish Declaration for Open Science and Research (2020)….”

Journal- or article-based citation measure? A study… | F1000Research

Abstract: In academia, decisions on promotions are influenced by the citation impact of the works published by the candidates. The Medical Faculty of the University of Bern used a measure based on the journal impact factor (JIF) for this purpose: the JIF of the papers submitted for promotion should rank in the upper third of journals in the relevant discipline (JIF rank >0.66). The San Francisco Declaration on Research Assessment (DORA) aims to eliminate the use of journal-based metrics in academic promotion. We examined whether the JIF rank could be replaced with the relative citation ratio (RCR), an article-level measure of citation impact developed by the National Institutes of Health (NIH). An RCR percentile >0.66 corresponds to the upper third of citation impact of articles from NIH-sponsored research. We examined 1,525 publications submitted by 64 candidates for academic promotion at the University of Bern. There was only a moderate correlation between the JIF rank and RCR percentile (Pearson correlation coefficient 0.34, 95% CI 0.29-0.38). Among the 1,199 articles (78.6%) published in journals ranking >0.66 for the JIF, fewer than half (509, 42.5%) were in the upper third of the RCR percentile. Conversely, among the 326 articles published in journals ranking <0.66 for the JIF, 72 (22.1%) ranked in the upper third of the RCR percentile. Our study demonstrates that the rank of the JIF is a poor proxy measure for the actual citation impact of individual articles. The Medical Faculty of the University of Bern has signed DORA and replaced the JIF rank with the RCR percentile to assess the citation impact of papers submitted for academic promotion.
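
The cross-tabulation at the heart of this abstract is straightforward to reproduce. Below is a minimal sketch in Python using simulated data (the scores and the noise model are hypothetical stand-ins, not the study's dataset or code), showing how a moderate correlation between a journal-level and an article-level score can coexist with poor agreement at a hard >0.66 promotion cut-off:

```python
# Toy re-creation of the abstract's comparison, on simulated data.
# jif_rank and rcr_pct are hypothetical stand-ins, not the Bern dataset.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n = 1525  # number of publications examined in the study

# Journal-level score: percentile rank of the journal's impact factor.
jif_rank = rng.uniform(0.0, 1.0, n)

# Article-level score: RCR percentile, modelled here as the journal score
# plus substantial noise, so the two are only loosely related.
rcr_pct = np.clip(jif_rank + rng.normal(0.0, 0.5, n), 0.0, 1.0)

r, _ = pearsonr(jif_rank, rcr_pct)
print(f"Pearson correlation: {r:.2f}")

# Apply the promotion cut-off (upper third, >0.66) to both scores and ask:
# of the articles in "upper-third" journals, how many are upper third on
# their own citation impact?
top_journal = jif_rank > 0.66
top_article = rcr_pct > 0.66
share = top_article[top_journal].mean()
print(f"'Top-journal' articles that are also 'top-article': {share:.1%}")
```

Whatever noise level you pick, the point stays the same: a threshold applied to a noisy journal-level proxy misclassifies a large share of individual articles, which is what the study found empirically.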

Statement on the Scholarly Merit and Evaluation of Open Scholarship in Linguistics | Linguistic Society of America

“The Linguistic Society of America values the open sharing of scholarship, and encourages the fair review of open scholarship in hiring, tenure, and promotion. The LSA encourages scholars, departments, and personnel committees to actively place value on open scholarship in their evaluation process with the aim of encouraging greater accessibility, distribution, and use of linguistic research….”

Research Assessment Policy | Templeton World Charity Foundation, Inc.

“Research Assessment Policy (With effect from 2021)

- We do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in funding decisions.
- For the purposes of research assessment, we consider the value and impact of all research outputs (including datasets and software) and a broad range of impact measures.
- We make explicit the criteria used in evaluating the scientific productivity of grant applicants and we expect applicants, grantees, and reviewers to respond to these criteria accordingly.
- We expect Grantees that are research institutions to have a:
  - statement of commitment to implementing the DORA principles on their website – this should be prominent and accessible
  - plan for implementing the DORA principles, or a clear process in place for developing a plan (with a specified delivery date)
  - process in place for monitoring and reporting on progress….”

What Our New Open Science Policy Means for the Future of Research | by Dawid Potgieter | Templeton World | Medium

“We are at the beginning of a new, five-year strategy to support scientific research on human flourishing, and as part of that, Templeton World Charity Foundation has revised its grant-making activities to incentivize open science best practices across all fields of inquiry which we support. Open science refers to a process whereby research data, methods and findings are made open and available to all researchers — regardless of affiliation — for free. This may sound like inside baseball, but it will affect all of us by radically changing the way scientists work, accelerating the pace of scientific breakthroughs, and making the upper echelons of science more global and more inclusive.

OUR NEW POLICIES

Our new commitment includes two policies. Our Open Access Policy requires that anyone who uses Foundation research dollars must make their final paper openly accessible to anyone with an internet connection. They can still publish in any journal they like, and our policy allows for a number of options to stay compliant. This policy aligns with Plan S, and we are delighted to also be joining cOAlition S. As a part of this new policy we will also commit more resources toward article processing charges to facilitate this transformation.

In support of this, we also launched a Research Assessment Policy, which seeks to increase fairness and scientific rigor. Researchers have typically been encouraged to publish in journals with a high impact factor, but they tend to have a paywall. Under our new research assessment policy, we put value on the quality of data, code and methodologies produced by the researcher, and we will not prioritize impact factor. These changes are the result of a long process of analysis and our core conviction that open science is a requirement for driving scientific breakthroughs in the future. This policy aligns with the San Francisco Declaration on Research Assessment (DORA)….”

Elsevier have endorsed the Leiden Manifesto: so what? – The Bibliomagician

“If an organisation wants to make a public commitment to responsible research evaluation they have three main options: i) sign DORA, ii) endorse the Leiden Manifesto (LM), or iii) go bespoke – usually with a statement based on DORA, the LM, or the Metric Tide principles.

The LIS-Bibliometrics annual responsible metrics survey shows that research-performing organisations adopt a wide range of responses to this including sometimes signing DORA and adopting the LM. But when it comes to publishers and metric vendors, they tend to go for DORA. Signing DORA is a proactive, public statement and there is an open, independent record of your commitment. DORA also has an active Chair in Professor Stephen Curry, and a small staff in the form of a program director and community manager, all of whom will publicly endorse your signing which leads to good PR for the organisation.

A public endorsement of the LM leads to no such fanfare. Indeed, the LM feels rather abandoned by comparison. Despite a website and blog, there has been little active promotion of the Manifesto, nor any public recognition for anyone seeking to endorse it….”
