Journal research data sharing policies: a study of highly-cited journals in neuroscience, physics, and operations research | SpringerLink

Abstract: How, and whether, scholarly journals instruct authors to share the research data underlying published research is changing rapidly as science policy moves towards facilitating open science, and as subject-specific repositories and practices become established. This study provides an analysis of the research data sharing policies of highly-cited journals in the fields of neuroscience, physics, and operations research as of May 2019. For these 120 journals (40 per subject category), a unified policy coding framework was developed to capture the most central elements of each policy, i.e. what, when, and where research data are instructed to be shared. The results affirm that considerable differences between research fields remain in policy existence, strength, and specificity. The findings revealed that one of the most important factors influencing the what, where, and when of research data policies was whether the journal’s scope included life-science data types with established methods of sharing through community-endorsed public repositories. The findings surface the potential of future research approaching policy analysis at the publisher level as well as the journal level. The collected data and coding framework are provided as open data to facilitate future research and journal policy monitoring.

 

Data-sharing recommendations in biomedical journals and randomised controlled trials: an audit of journals following the ICMJE recommendations | BMJ Open

Abstract: Objective To explore the implementation of the International Committee of Medical Journal Editors (ICMJE) data-sharing policy, which came into force on 1 July 2018, by ICMJE-member journals and by ICMJE-affiliated journals declaring that they follow the ICMJE recommendations.

Design A cross-sectional survey of data-sharing policies in 2018 on journal websites and in data-sharing statements in randomised controlled trials (RCTs).

Setting ICMJE website; PubMed/Medline.

Eligibility criteria ICMJE-member journals and 489 ICMJE-affiliated journals that published an RCT in 2018, had an accessible online website and were not considered as predatory journals according to Beall’s list. One hundred RCTs for member journals and 100 RCTs for affiliated journals with a data-sharing policy, submitted after 1 July 2018.

Main outcome measures The primary outcome for the policies was the existence of a data-sharing policy (explicit data-sharing policy, no data-sharing policy, policy merely referring to ICMJE recommendations) as reported on the journal website, especially in the instructions for authors. For RCTs, our primary outcome was the intention to share individual participant data set out in the data-sharing statement.

Results Eight (out of 14; 57%) member journals had an explicit data-sharing policy on their website (three were more stringent than the ICMJE requirements, one was less demanding and four were compliant), five (36%) additional journals stated that they followed the ICMJE requirements, and one (7%) had no policy online. In RCTs published in these journals, there were data-sharing statements in 98 out of 100, with expressed intention to share individual patient data reaching 77 out of 100 (77%; 95% CI 67% to 85%). One hundred and forty-five (out of 489) ICMJE-affiliated journals (30%; 26% to 34%) had an explicit data-sharing policy on their website (11 were more stringent than the ICMJE requirements, 85 were less demanding and 49 were compliant) and 276 (56%; 52% to 61%) merely referred to the ICMJE requirements. In RCTs published in affiliated journals with an explicit data-sharing policy, data-sharing statements were rare (25%), and expressed intentions to share data were found in 22% (15% to 32%).

Conclusion The implementation of ICMJE data-sharing requirements in online journal policies was suboptimal for ICMJE-member journals and poor for ICMJE-affiliated journals. The implementation of the policy in published RCTs, as reflected in data-sharing statements, was good in member journals but of concern in affiliated journals. We suggest the conduct of continuous audits of medical journal data-sharing policies in the future.

Publisher Data Availability Policies Index – CHORUS

“Over the last few years, publishers have been making their Data Availability Policies known either at the publisher level or at the journal level. These policies range in their mandate, but most require authors to make all data necessary to replicate their study’s findings publicly available without restriction at the time of publication. When specific legal or ethical restrictions prohibit public sharing of a data set, authors must indicate how others may obtain access to the data. Often, when submitting a manuscript to the publisher, authors must provide a Data Availability Statement describing compliance with the data availability policy. If the article is accepted for publication, the Data Availability Statement will be published as part of the article. CHORUS has created a centralized index of our member publishers’ policies with links to the publisher’s site. This chart will be updated at least annually….”

Increasing the transparency, openness and replicability of psychological research: Mandatory data sharing for empirical studies in the Journal of Health Psychology – David F Marks

Abstract:  This editorial announces this journal’s policy on transparency, openness and replication. From 1 July 2020, authors of manuscripts submitted to Journal of Health Psychology (JHP) are required to make the raw data fully accessible to all readers. JHP will only consider manuscripts which follow an open publication model defined as follows: M = Mandatory, I = Inclusion (of), R = Raw, D = Data (MIRD). All data and analytical procedures must be sufficiently well described to enable a third party with the appropriate expertise to replicate the data analyses. It is expected that findings and analyses in the JHP will be fully capable of being accurately reproduced.

 

The citation advantage of linking publications to research data

Abstract: Efforts to make research results open and reproducible are increasingly reflected by journal policies encouraging or mandating authors to provide data availability statements. As a consequence, there has been a strong uptake of data availability statements in recent literature. Nevertheless, it is still unclear what proportion of these statements actually contain well-formed links to data, for example via a URL or permanent identifier, and whether there is added value in providing such links. We consider 531,889 journal articles published by PLOS and BMC, develop an automatic system for labelling their data availability statements according to four categories based on their content and the type of data availability they display, and finally analyze the citation advantage of different statement categories via regression. We find that, following mandated publisher policies, data availability statements have become very common. In 2018, 93.7% of 21,793 PLOS articles and 88.2% of 31,956 BMC articles had data availability statements. Data availability statements containing a link to data in a repository—rather than stating that data are available on request or included as supporting information files—are a fraction of the total. In 2017 and 2018, 20.8% of PLOS publications and 12.2% of BMC publications provided a data availability statement containing a link to data in a repository. We also find, using a citation prediction model, that articles with statements linking to data in a repository are associated with up to 25.36% (± 1.07%) higher citation impact on average. We discuss the potential implications of these results for authors (researchers) and journal publishers who make the effort of sharing their data in repositories. All our data and code are made available in order to reproduce and extend our results.

 

Did awarding badges increase data sharing in BMJ Open? A randomized controlled trial | Royal Society Open Science

Abstract:  Sharing data and code are important components of reproducible research. Data sharing in research is widely discussed in the literature; however, there are no well-established evidence-based incentives that reward data sharing, nor randomized studies that demonstrate the effectiveness of data sharing policies at increasing data sharing. A simple incentive, such as an Open Data Badge, might provide the change needed to increase data sharing in health and medical research. This study was a parallel group randomized controlled trial (protocol registration: doi:10.17605/OSF.IO/PXWZQ) with two groups, control and intervention, with 80 research articles published in BMJ Open per group, with a total of 160 research articles. The intervention group received an email offer for an Open Data Badge if they shared their data along with their final publication and the control group received an email with no offer of a badge if they shared their data with their final publication. The primary outcome was the data sharing rate. Badges did not noticeably motivate researchers who published in BMJ Open to share their data; the odds of awarding badges were nearly equal in the intervention and control groups (odds ratio = 0.9, 95% CI [0.1, 9.0]). Data sharing rates were low in both groups, with just two datasets shared in each of the intervention and control groups. The global movement towards open science has made significant gains with the development of numerous data sharing policies and tools. What remains to be established is an effective incentive that motivates researchers to take up such tools to share their data.

 

Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract: Despite the increase in the number of journals issuing data policies requiring authors to make the data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet standards of quality that allow others to verify or extend published results. This phenomenon suggests the need to consider how effectively journal data policies present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretations of policy requirements. Survey responses from authors and editors, along with results from a content analysis of data policies, revealed discrepancies among editors’ assertions of data policy requirements, authors’ understanding of policy requirements, and the requirements stated in the policy language as written. We offer explanations for these discrepancies and recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.