Covering research preprints amid the coronavirus: 6 things to know

“Below, we highlight six key things reporters need to know about preprints, based on interviews with two people with vast experience in biomedical research: Bill Hanage, an associate professor of epidemiology at the Harvard T.H. Chan School of Public Health, and John R. Inglis, who has launched and managed multiple academic journals and, as executive director of the nonprofit Cold Spring Harbor Laboratory Press, co-founded medRxiv and bioRxiv….”

Knowledge Exchange Openness Profile – Knowledge Exchange

“Part of KE’s work on Open Scholarship aims to enhance the evaluation of research and researchers. Current evaluation practice does not recognise the non-academic contributions that make Open Scholarship work, such as opening up and curating data for re-use, or making research results findable and available. Our approach is to raise awareness of this lack of recognition in current evaluation practice and to work towards a possible solution through the development of an ‘Openness Profile’.

The KE Open Scholarship Research Evaluation task & finish group works on the awareness issue, listing all academic and non-academic contributions that are essential to Open Scholarship and should be recognised when evaluating research. The group also works on the Openness Profile, a tool that is meant to allow evaluation of currently ignored contributions that are essential for Open Scholarship. For the development of the Openness Profile we seek involvement of various key stakeholders and alignment with current identifiers such as DOI and ORCID iD.

By demonstrating the immaturity of current research evaluation practice, and by developing the Openness Profile tool, KE helps researchers and non-researchers alike get credit for all the contributions that make Open Scholarship possible. Our ambition is for recognition of these essential activities to become part of routine research evaluation….”
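
The announcement mentions aligning the Openness Profile with existing identifiers such as the DOI and ORCID iD. As a purely illustrative sketch (the record structure, field names, and example identifiers below are assumptions, not part of KE’s actual specification), a profile entry might link a contributor’s ORCID iD to the DOIs of the outputs they helped open up:

```python
# Hypothetical sketch of an Openness Profile entry linking a contributor's
# ORCID iD to the DOIs of outputs they helped open up. The schema is an
# illustrative assumption, not KE's actual Openness Profile format.
from dataclasses import dataclass, field

@dataclass
class OpennessContribution:
    doi: str              # persistent identifier of the output (e.g. a curated dataset)
    role: str             # e.g. "data curation", "metadata enrichment"
    description: str = ""

@dataclass
class OpennessProfile:
    orcid: str                                    # contributor's ORCID iD
    contributions: list = field(default_factory=list)

    def add(self, doi: str, role: str, description: str = "") -> None:
        self.contributions.append(OpennessContribution(doi, role, description))

# Example (identifiers are made up for illustration):
profile = OpennessProfile(orcid="0000-0002-1825-0097")
profile.add(doi="10.1234/example.dataset", role="data curation",
            description="Cleaned and documented survey data for re-use")
print(profile)
```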

Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract:  Despite the increase in the number of journals issuing data policies that require authors to make the data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet quality standards that allow others to verify or extend published results. This suggests the need to consider how effectively journal data policies present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretations of policy requirements. Survey responses from authors and editors, together with a content analysis of the policies themselves, revealed discrepancies among editors’ assertions of data policy requirements, authors’ understanding of those requirements, and the requirements stated in the policy language as written. We explain these discrepancies and offer recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.

Open Scholarship as a mechanism for the United Nations Sustainable Development Goals

Abstract:  Traditional methods of scholarly publishing and communication are ineffective in meeting the United Nations Sustainable Development Goals (SDGs). The SARS-CoV-2 pandemic has demonstrated that, in times of need, the global research community can activate and pool its knowledge and resources to collaborate on solving problems. The use of innovative Web-based technologies, including open source software, data-sharing archives, open collaboration methods, and the liberation of thousands of relevant research articles from proprietary sources, shows us that the fundamental components of a fully open system are readily available, technologically efficient, and cost-effective. If we are to achieve the SDGs by 2030, systematic reform and the explicit adoption of open scholarship strategies at scale are necessary. We propose that the United Nations and parallel entities take a position of leadership by creating or funding an organisation, or a federated alliance of organisations, to implement these reforms.

The COVID Tracking Project – Homepage

“The COVID Tracking Project collects information from 50 US states, the District of Columbia, and 5 other US territories to provide the most comprehensive testing data we can collect for the novel coronavirus, SARS-CoV-2. We attempt to include positive and negative results, pending tests, and total people tested for each state or district currently reporting that data….

State Testing Data Release – Best Practices—Our recommendations on what data state public health authorities should be releasing, and how….”
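
For readers who want to work with the numbers directly, the project also publishes its compiled data through a public JSON API. The sketch below pulls the current per-state snapshot and totals positives, negatives, and pending tests; the endpoint URL and field names are assumptions taken from the project’s public API documentation and may differ from what is currently served.

```python
# Minimal sketch of pulling the per-state testing snapshot from the COVID
# Tracking Project's public JSON API. The endpoint URL and field names
# ("state", "positive", "negative", "pending") are assumptions taken from the
# project's API documentation and may not match what is currently served.
import json
import urllib.request

API_URL = "https://api.covidtracking.com/v1/states/current.json"  # assumed endpoint

def fetch_states(url=API_URL):
    """Return the latest per-state records as a list of dicts."""
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

def national_totals(records):
    """Sum positives, negatives, and pending tests across all reporting states."""
    totals = {"positive": 0, "negative": 0, "pending": 0}
    for record in records:
        for key in totals:
            totals[key] += record.get(key) or 0  # not every state reports every field
    return totals

if __name__ == "__main__":
    print(national_totals(fetch_states()))
```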

Open Access publishing practice in geochemistry: overview of current state and look to the future: Heliyon

Abstract:  Open Access (OA) describes the free, unrestricted access to and re-use of research articles. Recently, a new wave of interest, debate, and practice surrounding OA publishing has emerged. In this paper, we provide a simple overview of the trends in OA practice in the broad field of geochemistry. Characteristics of the approach, such as whether or not an article processing charge (APC) exists, what embargo periods or restrictions on self-archiving are in place, and whether or not the sharing of preprints is permitted, are described. The majority of journals have self-archiving policies that allow authors to share their peer-reviewed work via green OA without charge. There is no clear relationship between journal impact and APC. The journals with the highest APCs are typically those of the major commercial publishers, rather than those of the geochemistry community itself. The rise in OA publishing has potential impacts on the profiles of researchers and tends to devolve costs from organizations to individuals. Until the geochemistry community makes the decision to move away from journal-based evaluation criteria, it is likely that such high costs will continue to impose financial inequities upon the research community. However, geochemists could more widely choose legal self-archiving as an equitable and sustainable way to disseminate their research.
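
The finding that there is no clear relationship between journal impact and APC is the kind of claim a reader could probe on a journal-level table. The sketch below is purely illustrative: the input file and column names are hypothetical placeholders, not the dataset analysed in the paper.

```python
# Illustrative check of the "no clear relationship between journal impact and
# APC" finding using a rank correlation. The CSV file and its column names
# ("journal", "apc_usd", "impact") are hypothetical placeholders, not the
# dataset used in the paper.
import pandas as pd
from scipy.stats import spearmanr

journals = pd.read_csv("geochemistry_journals.csv")          # hypothetical input
subset = journals.dropna(subset=["apc_usd", "impact"])

rho, p_value = spearmanr(subset["apc_usd"], subset["impact"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}, n = {len(subset)} journals")
```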

Launch of Platform for Responsible Editorial Policies – Leiden Madtrics

“With these features, PREP aims to contribute to more responsible journal management and to open science. By supporting authors, reviewers and editors in obtaining information about the editorial process of academic journals, it addresses well-known issues with one of science’s central institutions. By enabling journal editors and publishers to share their review procedures transparently, and by providing suggestions on alternative review options, it additionally aims to support some of the key stakeholders in academic publishing. This should ultimately lead to more open and responsible publishing….”

Time for NIH to lead on data sharing | Science

“The U.S. National Institutes of Health (NIH), the largest global funder of biomedical research, is in the midst of digesting public comments toward finalizing a data sharing policy. Although the draft policy is generally supportive of data sharing (1), it needs strengthening if we are to collectively achieve a long-standing vision of open science built on the principles of findable, accessible, interoperable, and reusable (FAIR) (2) data sharing. Relying on investigators to voluntarily share data has not, thus far, led to widespread open science practices (3); thus, we suggest steps that NIH could take to lead on scientific data sharing, with an initial focus on clinical trial data sharing….”