Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract:  Despite the increase in the number of journals issuing data policies that require authors to make the data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet quality standards that allow others to verify or extend published results. This phenomenon suggests the need to consider how effectively journal data policies present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretations of policy requirements. Survey responses from authors and editors, together with the results of a content analysis of data policies, revealed discrepancies among editors’ assertions of data policy requirements, authors’ understanding of those requirements, and the requirements stated in the policy language as written. We offer explanations for these discrepancies and recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.

 

Journal transparency index will be ‘alternative’ to impact scores | Times Higher Education (THE)

“A new ranking system for academic journals measuring their commitment to research transparency will be launched next month – providing what many believe will be a useful alternative to journal impact scores.

Under a new initiative from the Center for Open Science, based in Charlottesville, Virginia, more than 300 scholarly titles in psychology, education and biomedical science will be assessed on 10 measures related to transparency, with their overall result for each category published in a publicly available league table.

The centre aims to provide scores for about 1,000 journals within six to eight months of their site’s launch in early February….”

A study of the impact of data sharing on article citations using journal policies as a natural experiment

Abstract:  This study estimates the effect of data sharing on the citations of academic articles, using journal policies as a natural experiment. We begin by examining 17 high-impact journals that have adopted the requirement that data from published articles be publicly posted. We match these 17 journals to 13 journals without policy changes and find that empirical articles published just before their change in editorial policy have citation rates with no statistically significant difference from those published shortly after the shift. We then ask whether this null result stems from poor compliance with data sharing policies, and use the data sharing policy changes as instrumental variables to examine more closely two leading journals in economics and political science with relatively strong enforcement of new data policies. We find that articles that make their data available receive 97 additional citations (estimated standard error of 34). We conclude that: a) authors who share data may be rewarded eventually with additional scholarly citations, and b) data-posting policies alone do not increase the impact of articles published in a journal unless those policies are enforced.
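To make the identification strategy described above concrete, here is a minimal two-stage least-squares sketch on synthetic data: a journal's policy change serves as the instrument for whether an article shares its data, and the 97-citation effect is hard-coded into the simulation only to echo the abstract. All variable names and parameter values are illustrative assumptions, not the authors' code or data, and second-stage standard errors are omitted because naive ones would be incorrect without adjustment.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical setup: a journal's data-posting mandate (the instrument) shifts
# whether an article shares data, but affects citations only through sharing.
policy = rng.integers(0, 2, n)             # 1 = published after the policy change
quality = rng.normal(0, 1, n)              # unobserved article quality (confounder)
shared = ((0.8 * policy + 0.5 * quality + rng.normal(0, 1, n)) > 0.5).astype(float)
citations = 20 + 97 * shared + 15 * quality + rng.normal(0, 10, n)

# Stage 1: predict data sharing from the policy instrument.
X1 = np.column_stack([np.ones(n), policy])
shared_hat = X1 @ np.linalg.lstsq(X1, shared, rcond=None)[0]

# Stage 2: regress citations on the predicted (exogenous) sharing behaviour.
X2 = np.column_stack([np.ones(n), shared_hat])
beta = np.linalg.lstsq(X2, citations, rcond=None)[0]
print(f"2SLS estimate of the citation effect of data sharing: {beta[1]:.1f}")
```

Because unobserved quality drives both sharing and citations in this toy setup, a naive regression of citations on sharing would be biased upward; instrumenting with the policy change recovers an estimate close to the simulated effect.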

 

Data Repository Selection: Criteria That Matter – Request For Comments – F1000 Blogs

“Publishers and journals are developing data policies to ensure that datasets, as well as other digital products associated with articles, are deposited and made accessible via appropriate repositories, also in line with the FAIR Principles. With thousands of options available, however, the lists of deposition repositories recommended by publishers are often different and consequently the guidance provided to authors may vary from journal to journal. This is due to a lack of common criteria used to select the data repositories, but also to the fact that there is still no consensus on what constitutes a good data repository. 

To tackle this, FAIRsharing and DataCite have joined forces with a group of publisher representatives (authors of this work) who are actively implementing data policies and recommending data repositories to researchers. The result of our work is a set of proposed criteria that journals and publishers believe are important for the identification and selection of data repositories, which can be recommended to researchers when they are preparing to publish the data underlying their findings. …”

ASTRO Journals’ Data Sharing Policy and Recommended Best Practices – ClinicalKey

Abstract:  Transparency, openness, and reproducibility are important characteristics in scientific publishing. Although many researchers embrace these characteristics, data sharing has yet to become common practice. Nevertheless, data sharing is becoming an increasingly important topic among societies, publishers, researchers, patient advocates, and funders, especially as it pertains to data from clinical trials. In response, ASTRO developed a data policy and guide to best practices for authors submitting to its journals. ASTRO’s data sharing policy is that authors should indicate, in data availability statements, if the data are being shared and if so, how the data may be accessed.

 

Building Trust to Break Down Barriers | The Official PLOS Blog

“At PLOS we have invested significantly in people and processes to support a strong journal data sharing policy since 2014. We are seeing a steady increase year-on-year in the proportion of PLOS authors who use a data repository. Although less costly for publishers, journal policies that only encourage data sharing have much lower levels of compliance….”

Journal practices (other than OA) promoting Open Science goals | Zenodo

“Journal practices (other than OA) promoting Open Science goals (relevance, reproducibility, efficiency, transparency)

Early, full and reproducible content

preregistration – use preregistrations in the review process
registered reports – apply peer review to preregistration prior to the study and publish results regardless of outcomes
preprint policy – liberally allow preprinting in any archive without license restrictions
data/code availability – foster or require open availability of data and code for reviewers and readers
TDM allowance – allow unrestricted TDM [text and data mining] of full text and metadata for any use
null/negative results – publish regardless of outcome
 

Machine readable ecosystem

data/code citation – promote citation and use standards
persistent IDs – e.g. DOI, ORCID, ROR, Open Funder Registry, grant IDs
licenses (in Crossref) – register (open) licenses in Crossref
contributorship roles – credit all contributors for their part in the work
open citations – make citation information openly available via Crossref [see the sketch after this excerpt]
 

Peer review

open peer review – e.g. open reports and open identities
peer review criteria – evaluate methodological rigour and reporting quality only or also judge expected relevance or impact?
rejection rates – publish rejection rates and reconsider high selectivity
post-publication peer review – publish immediately after sanity check and let peer review follow that?
 

Diversity

author diversity – age, position, gender, geography, ethnicity, colour
reviewer diversity – age, position, gender, geography, ethnicity, colour
editor diversity – age, position, gender, geography, ethnicity, colour

Metrics and DORA

DORA: journal metrics – refrain from promoting
DORA: article metrics – provide a range and use responsibly…”
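As a concrete illustration of the “machine readable ecosystem” items listed above (licenses registered in Crossref and open citation information exposed through Crossref), the sketch below queries Crossref’s public REST API for a single work. The DOI is a placeholder chosen purely for illustration, and which fields are returned depends on what the publisher actually deposited.

```python
import json
import urllib.request

# Placeholder DOI used purely for illustration; substitute any registered DOI.
doi = "10.1371/journal.pone.0230416"
url = f"https://api.crossref.org/works/{doi}"

with urllib.request.urlopen(url) as resp:
    record = json.load(resp)["message"]

# License URLs registered with Crossref (the "licenses (in Crossref)" practice).
for lic in record.get("license", []):
    print("license:", lic.get("URL"))

# Open citation information available through Crossref metadata,
# where the publisher has deposited (and opened) the reference list.
print("references deposited:", len(record.get("reference", [])))
print("cited by (Crossref count):", record.get("is-referenced-by-count"))
```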