A policy and legal Open Science framework: a proposal

Abstract: We develop a proposal for an Open Science definition as a political and legal framework in which research outputs are shared and disseminated so that they are rendered visible, accessible, and reusable. The proposal builds on the concepts advanced by the Budapest Open Access Initiative (BOAI) and by the Free/Open Source Software (FOSS) and Open Data movements. We elaborate it through a detailed analysis of selected EC policies and laws and of the role of research evaluation practices.

Academy of Finland to adopt reform – Academy of Finland

“Starting with calls to be opened after 1 January 2021, the Academy of Finland will introduce a number of reforms concerning open access to scientific publications and responsible researcher evaluation. Through the reforms, the Academy wants to further strengthen its long-established policies on openness of scientific outputs and responsibility in researcher evaluation.

The policies are supported by the Academy’s commitments to international and national declarations over the past two years: the Plan S initiative for open access publishing (2018), the Declaration on Research Assessment (DORA) for improved research assessment (2019), the national recommendation on responsible researcher evaluation (2020), and the Finnish Declaration for Open Science and Research (2020)….”

Journal- or article-based citation measure? A study… | F1000Research

Abstract: In academia, decisions on promotions are influenced by the citation impact of the works published by the candidates. The Medical Faculty of the University of Bern used a measure based on the journal impact factor (JIF) for this purpose: the JIF of the papers submitted for promotion should rank in the upper third of journals in the relevant discipline (JIF rank >0.66). The San Francisco Declaration on Research Assessment (DORA) aims to eliminate the use of journal-based metrics in academic promotion. We examined whether the JIF rank could be replaced with the relative citation ratio (RCR), an article-level measure of citation impact developed by the National Institutes of Health (NIH). An RCR percentile >0.66 corresponds to the upper third of citation impact of articles from NIH-sponsored research. We examined 1,525 publications submitted by 64 candidates for academic promotion at the University of Bern. There was only a moderate correlation between the JIF rank and RCR percentile (Pearson correlation coefficient 0.34, 95% CI 0.29-0.38). Among the 1,199 articles (78.6%) published in journals ranking >0.66 for the JIF, less than half (509, 42.5%) were in the upper third of the RCR percentile. Conversely, among the 326 articles published in journals ranking <0.66 for the JIF, 72 (22.1%) ranked in the upper third of the RCR percentile. Our study demonstrates that the rank of the JIF is a poor proxy for the actual citation impact of individual articles. The Medical Faculty of the University of Bern has signed DORA and replaced the JIF rank with the RCR percentile to assess the citation impact of papers submitted for academic promotion.
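The mismatch the abstract reports can be checked directly from the quoted counts. A minimal sketch (Python; the figures are the ones given in the abstract, and this is an illustrative recomputation, not the study's own analysis code):

    # Recompute the proportions reported in the abstract (illustrative only).
    total = 1525              # publications submitted by 64 candidates
    high_jif = 1199           # articles in journals with JIF rank > 0.66
    high_jif_top_rcr = 509    # of those, articles with RCR percentile > 0.66
    low_jif = 326             # articles in journals with JIF rank < 0.66
    low_jif_top_rcr = 72      # of those, articles with RCR percentile > 0.66

    print(f"Articles in high-JIF journals: {high_jif / total:.1%}")                 # 78.6%
    print(f"High-JIF articles in top RCR third: {high_jif_top_rcr / high_jif:.1%}") # 42.5%
    print(f"Low-JIF articles in top RCR third: {low_jif_top_rcr / low_jif:.1%}")    # 22.1%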

Research Square Launches Beta Testing of Ripeta’s Open Science Assessment Tool – ripeta

“The first 200 authors who opt in can use this manuscript improvement tool at no cost.

Research Square has launched a beta trial of its new automated Open Science Assessment tool, which can help authors enhance the quality of their research and the robustness of their scientific reporting.

This opt-in tool, powered by Ripeta and currently in the beta testing phase, is available at no cost for authors who upload their preprints to the Research Square platform….

Ripeta’s natural language processing technology targets several critical elements of a scientific manuscript, including purpose, data and code availability statements, funding statements, and more to gauge the level of responsible reporting in authors’ scientific papers and suggest improvements….”
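To make the idea concrete, here is a deliberately simplified heuristic in the same spirit (a sketch only; the regular expressions and function below are illustrative assumptions and are not Ripeta's technology):

    import re

    # Illustrative heuristic (not Ripeta's technology): flag whether a manuscript
    # appears to contain data/code availability and funding statements.
    CHECKS = {
        "data availability": re.compile(r"data (availability|are available|can be obtained)", re.I),
        "code availability": re.compile(r"(code|software) (availability|is available)", re.I),
        "funding statement": re.compile(r"(funded by|funding was provided|grant no\.?)", re.I),
    }

    def report_checks(manuscript_text: str) -> dict:
        """Return which responsible-reporting elements appear to be present."""
        return {name: bool(pattern.search(manuscript_text)) for name, pattern in CHECKS.items()}

    sample = "Data are available on Zenodo. This work was funded by grant no. 12345."
    print(report_checks(sample))
    # {'data availability': True, 'code availability': False, 'funding statement': True}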

Statement on the Scholarly Merit and Evaluation of Open Scholarship in Linguistics | Linguistic Society of America

“The Linguistic Society of America values the open sharing of scholarship, and encourages the fair review of open scholarship in hiring, tenure, and promotion. The LSA encourages scholars, departments, and personnel committees to actively place value on open scholarship in their evaluation process with the aim of encouraging greater accessibility, distribution, and use of linguistic research….”

Best Practices in Research Metrics: A Conversation with Dr. Diana Hicks Registration, Tue, Oct 13, 2020 at 4:00 PM | Eventbrite

“Join Professor Diana Hicks of the Georgia Tech School of Public Policy for a conversation about the Leiden Manifesto for Research Metrics. There will be a high-level overview of the 10 principles to guide research evaluation, followed by a participant-driven Q&A with Professor Hicks.”

Research Assessment Policy | Templeton World Charity Foundation, Inc.

“Research Assessment Policy (With effect from 2021)

We do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions or in funding decisions.
For the purposes of research assessment, we consider the value and impact of all research outputs (including datasets and software) and a broad range of impact measures.
We make explicit the criteria used in evaluating the scientific productivity of grant applicants and we expect applicants, grantees, and reviewers to respond to these criteria accordingly.
We expect Grantees that are research institutions to have a:

statement of commitment to implementing the DORA principles on their website – this should be prominent and accessible
plan for implementing the DORA principles, or a clear process in place for developing a plan (with a specified delivery date) 
process in place for monitoring and reporting on progress….”

What Our New Open Science Policy Means for the Future of Research | by Dawid Potgieter | Templeton World | Sep, 2020 | Medium

“We are at the beginning of a new, five-year strategy to support scientific research on human flourishing, and as part of that, Templeton World Charity Foundation has revised its grant-making activities to incentivize open science best practices across all fields of inquiry which we support. Open science refers to a process whereby research data, methods and findings are made open and available to all researchers — regardless of affiliation — for free. This may sound like inside baseball, but it will affect all of us by radically changing the way scientists work, accelerating the pace of scientific breakthroughs, and making the upper echelons of science more global and more inclusive.

OUR NEW POLICIES

Our new commitment includes two policies. Our Open Access Policy requires that anyone who uses Foundation research dollars must make their final paper openly accessible to anyone with an internet connection. They can still publish in any journal they like, and our policy allows for a number of options to stay compliant. This policy aligns with Plan S, and we are delighted to also be joining cOAlition S. As a part of this new policy we will also commit more resources toward article processing charges to facilitate this transformation.

In support of this, we also launched a Research Assessment Policy, which seeks to increase fairness and scientific rigor. Researchers have typically been encouraged to publish in journals with a high impact factor, but they tend to have a paywall. Under our new research assessment policy, we put value on the quality of data, code and methodologies produced by the researcher, and we will not prioritize impact factor. These changes are the result of a long process of analysis and our core conviction that open science is a requirement for driving scientific breakthroughs in the future. This policy aligns with the San Francisco Declaration on Research Assessment (DORA)….”

Ouvrir la Science – Knowledge Exchange activities | Partners in improving services to higher education and research

Knowledge Exchange (KE) brings together six organizations from six countries. Their common objective is to examine issues related to research support and to the development of infrastructure and services.

Members:

CNRS (France),
CSC (Finland),
DEIC (Denmark),
DFG (Germany),
JISC (United Kingdom),
SURF (Netherlands).

Recent results:

Monographs

A landscape study on open access and monographs – DOI: 10.5281/zenodo.815932
Knowledge Exchange Survey on Open Access Monographs – DOI: 10.5281/zenodo.1475446
Towards a Roadmap for Open Access Monographs – DOI: 10.5281/zenodo.3238545

Preprints

Accelerating scholarly communication – The transformative role of preprints – DOI: 10.5281/zenodo.3357727

Economy of Open Science

Insights into the Economy of Open Scholarship: A Collection of Interviews – DOI: 10.5281/zenodo.2840171
Open Scholarship and the need for collective action – DOI: 10.5281/zenodo.3454688


Research 2030 podcast: Can the reward system learn to love open science? Part 1 with Jean-Claude Burgelman

“The open science movement has been gaining momentum over the past decade, prompting initiatives such as cOAlition S, with its plan to increase open access publications. But while the goals of open science are welcomed by many, challenges remain. And top of the list is the researcher reward system.

This is the first episode in our short series on open science and the reward system. Host Dr. Stephane Berghmans, Elsevier VP of Academic and Research Relations EU, welcomes Prof. Jean-Claude Burgelman to the podcast. Prof. Burgelman is eminently qualified to talk about this topic. Not only is he a part-time Professor of Open Science Policy at Vrije Universiteit Brussel, he was recently Head of Unit Open Data Policies and Science Cloud at the European Commission and an open access envoy for the organization….”