Data deposition required for all C19 Rapid Review publishers – OASPA

“The C19 Rapid Review Initiative – a large-scale collaboration of organisations across the scholarly publishing industry – has agreed to mandate data deposition across the original group of journals that set up the collaboration (eLife, F1000 Research, Hindawi, PeerJ, PLOS, Royal Society, FAIRsharing, Outbreak Science Rapid PREreview, GigaScience, Life Science Alliance, Ubiquity Press, UCL, MIT Press, Cambridge University Press, BMC, RoRi and AfricArXiv). New members aim to align in due course. 

The Initiative, which grew from a need to improve efficiency of peer review and publishing of crucial COVID-19 research, began in April 2020 and now involves over 20 publishers, industry experts, and scholarly communication organizations, supporting over 1,800 rapid reviewers across relevant fields. …”

From Bioethics to Data Sharing for Transparency in Nursing Research

“Our journal, Journal of Korean Academy of Nursing (JKAN), adopted a data sharing policy in December 2020 (https://www.jkan.or.kr/index.php?body=dataSharing) [3], which was applied from volume 50 issue 6 after extensive discussion. As editor-in-chief, I would like to inform our readers to enhance their understanding of the data sharing policy….”

Replicate Others as You Would Like to Be Replicated Yourself | PS: Political Science & Politics | Cambridge Core

“This article presents some principles on constructive ways to conduct replications. Following the style of the Transparency and Openness Promotion (TOP) guidelines for open science (Nosek et al. 2015), we summarize recommendations as a series of tiers from what is commonly done (Level I), what would be better (Level II), and what would be better still (Level III) (table 1). The aspiration of our constructive replication recommendations is to help fields move toward a research culture in which self-correction is welcomed, honest mistakes are normalized, and different interpretations of results are recognized as a routine outcome of the process. Changing culture is always difficult, of course, but conducting projects in line with these ideals, and encouraging them in others, are ways researchers can help bring an improved culture closer to reality….”

APA Joins as New Signatory to TOP Guidelines

“The American Psychological Association (APA), the nation’s largest organization representing psychologists and psychological researchers, has become a signatory to the Transparency and Openness Promotion (TOP) Guidelines, an important step for helping to make research data and processes more open by default, according to the Center for Open Science (COS).

The TOP Guidelines are a community-driven effort to align research behaviors with scientific ideals. Transparency, open sharing, and reproducibility are core values of science, but not always part of daily practice. Journals, funders, and institutions can increase reproducibility and integrity of research by aligning their author or grantee guidelines with the TOP Guidelines.

The APA said it will officially begin implementing standardized disclosure requirements of data and underlying research materials (TOP Level 1). Furthermore, it encourages editors of core journals to move to Level 2 TOP (required transparency of data and research items when ethically possible). More information on the specific levels of adoption by each of the core journals will be coming in the first half of 2021….”

Open Science: Promises and Performance | Qualtrics Survey Solutions

“Although many scientists and organisations endorse this notion, progress has been slow. Some of my research explores the barriers that have impeded progress and makes recommendations to encourage future success. This survey forms part of that work and addresses a variety of issues, including attitudes towards data storage and access, the role of journals in open science, and associated ethical issues.

Those interested in scientific progress are invited to take part, and participation should take less than 10 minutes. Responses will be anonymous and participants can withdraw at any time.

The findings from the survey will be submitted to an open access journal and made available as an open access preprint. The raw data will be lodged with the Open Science Foundation …”

Transparency and open science principles… | Wellcome Open Research

Abstract:  Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in recent years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 28 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness (TOP) Factor. This instrument rates the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.
Results: Across the 28 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 2.5 [1, 3], min. 0, max. 9, out of a total possible score of 28) in sleep research and chronobiology journals.
Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support the recent developments in transparent and open science by implementing transparency and openness principles in their guidelines and making adherence to them mandatory.
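The Results above report TOP Factor scores as a median with 25th and 75th percentiles. As a rough sketch of how such a summary is computed, here is a minimal example using Python's standard library; the scores below are hypothetical illustration values chosen to match the reported summary, not the study's actual data:

```python
# Sketch: summarising per-journal TOP Factor scores as a median with
# 25th/75th percentiles, as in the abstract's Results section.
# The scores are hypothetical (each out of a possible 28), not the study's data.
import statistics

top_scores = [0, 1, 1, 1, 1, 2, 2, 3, 3, 3, 3, 4, 5, 9]

median = statistics.median(top_scores)
# quantiles(n=4) returns the three quartile cut points [Q1, Q2, Q3];
# method="inclusive" treats the data as the whole sample, not a subsample.
q1, _, q3 = statistics.quantiles(top_scores, n=4, method="inclusive")

print(f"median [25th, 75th percentile]: {median} [{q1}, {q3}]")
# → median [25th, 75th percentile]: 2.5 [1.0, 3.0]
print(f"min {min(top_scores)}, max {max(top_scores)}")
# → min 0, max 9
```

Note that `method="inclusive"` computes quartiles over the observed sample itself; the default `"exclusive"` method would give slightly different cut points for small samples.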

Transparency and Openness Promotion (TOP) Guidelines

Abstract:  The Transparency and Openness Promotion (TOP) Committee met in November 2014 to address one important element of the incentive systems – journals’ procedures and policies for publication. The outcome of the effort is the TOP Guidelines. There are eight standards in the TOP Guidelines; each moves scientific communication toward greater openness. These standards are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources.

Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract:  Despite the increase in the number of journals issuing data policies requiring authors to make data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet standards of quality that allow others to verify or extend published results. This phenomenon suggests the need to consider the effectiveness of journal data policies to present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretation of policy requirements. Survey responses from authors and editors, along with results from a content analysis of data policies, found discrepancies among editors’ assertion of data policy requirements, authors’ understanding of policy requirements, and the requirements stated in the policy language as written. We offer explanations for these discrepancies and make recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.