Data Repository Selection: Criteria That Matter – Request For Comments – F1000 Blogs

“Publishers and journals are developing data policies to ensure that datasets, as well as other digital products associated with articles, are deposited and made accessible via appropriate repositories, in line with the FAIR Principles. With thousands of options available, however, the lists of deposition repositories recommended by publishers are often different, and consequently the guidance provided to authors may vary from journal to journal. This is due to a lack of common criteria used to select the data repositories, but also to the fact that there is still no consensus on what constitutes a good data repository.

To tackle this, FAIRsharing and DataCite have joined forces with a group of publisher representatives (authors of this work) who are actively implementing data policies and recommending data repositories to researchers. The result of our work is a set of proposed criteria that journals and publishers believe are important for the identification and selection of data repositories, which can be recommended to researchers when they are preparing to publish the data underlying their findings. …”

ASTRO Journals’ Data Sharing Policy and Recommended Best Practices – ClinicalKey

Abstract: Transparency, openness, and reproducibility are important characteristics in scientific publishing. Although many researchers embrace these characteristics, data sharing has yet to become common practice. Nevertheless, data sharing is becoming an increasingly important topic among societies, publishers, researchers, patient advocates, and funders, especially as it pertains to data from clinical trials. In response, ASTRO developed a data policy and guide to best practices for authors submitting to its journals. ASTRO’s data sharing policy is that authors should indicate, in data availability statements, if the data are being shared and if so, how the data may be accessed.

Building Trust to Break Down Barriers | The Official PLOS Blog

“At PLOS we have invested significantly in people and processes to support a strong journal data sharing policy since 2014. We are seeing a steady increase year-on-year in the proportion of PLOS authors who use a data repository. Although less costly for publishers, journal policies that only encourage data sharing have much lower levels of compliance….”

Journal practices (other than OA) promoting Open Science goals | Zenodo

“Journal practices (other than OA) promoting Open Science goals (relevance, reproducibility, efficiency, transparency)

Early, full and reproducible content

preregistration – use preregistrations in the review process
registered reports – apply peer review to preregistration prior to the study and publish results regardless of outcomes
preprint policy – liberally allow preprinting in any archive without license restrictions
data/code availability – foster or require open availability of data and code for reviewers and readers
TDM allowance – allow unrestricted TDM (text and data mining) of full text and metadata for any use
null/negative results – publish regardless of outcome

Machine readable ecosystem

data/code citation – promote citation and use standards
persistent IDs – e.g. DOI, ORCID, ROR, Open Funder Registry, grant IDs
licenses (in Crossref) – register (open) licenses in Crossref
contributorship roles – credit all contributors for their part in the work
open citations – make citation information openly available via Crossref

Peer review

open peer review – e.g. open reports and open identities
peer review criteria – evaluate methodological rigour and reporting quality only, or also judge expected relevance or impact?
rejection rates – publish rejection rates and reconsider high selectivity
post-publication peer review – publish immediately after a sanity check and let peer review follow?

Diversity

author diversity – age, position, gender, geography, ethnicity, colour
reviewer diversity – age, position, gender, geography, ethnicity, colour
editor diversity – age, position, gender, geography, ethnicity, colour

Metrics and DORA

DORA: journal metrics – refrain from promoting
DORA: article metrics – provide a range and use responsibly…”
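
Several of the machine-readable-ecosystem practices listed above (registered licenses, open citations) can be checked programmatically through the public Crossref REST API. Below is a minimal sketch in Python, assuming the requests library; the DOI shown is Crossref's test record and the User-Agent address is a placeholder, so treat this as an illustration rather than a complete client.

import requests

def crossref_openness(doi: str) -> dict:
    """Fetch license and reference metadata registered in Crossref for a DOI."""
    resp = requests.get(
        f"https://api.crossref.org/works/{doi}",
        # Crossref asks polite users to identify themselves; placeholder address.
        headers={"User-Agent": "openness-check/0.1 (mailto:you@example.org)"},
        timeout=30,
    )
    resp.raise_for_status()
    msg = resp.json()["message"]
    return {
        # Licenses appear only if the publisher registered them.
        "licenses": [lic.get("URL") for lic in msg.get("license", [])],
        # The reference list appears only if the publisher deposited it openly.
        "references_deposited": "reference" in msg,
        "reference_count": msg.get("reference-count", 0),
    }

# Crossref's test DOI; substitute any real DOI of interest.
print(crossref_openness("10.5555/12345678"))

Empty license and reference fields are themselves informative: they show how much of a journal's metadata is actually machine readable.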

A cross-sectional description of open access publication costs, policies and impact in emergency medicine and critical care journals. – PubMed – NCBI

Abstract

INTRODUCTION:

Finding a journal's open access information alongside its global impact requires access to multiple databases. We describe a single, searchable database of all emergency medicine and critical care journals that includes their open access policies, publication costs, and impact metrics.

METHODS:

A list of emergency medicine and critical care journals (including citation metrics) was created using Scopus (CiteScore) and the Web of Science (Impact Factor). Costs of gold/hybrid open access and article processing charges (open access fees) were collected from journal websites. Self-archiving policies were collected from the Sherpa/RoMEO database. Relative costs of access in different regions were calculated using the World Bank Purchasing Power Parity index for authors from the United States, Germany, Turkey, China, Brazil, South Africa and Australia.

RESULTS:

We identified 78 emergency medicine and 82 critical care journals. The median CiteScore for emergency medicine was 0.73 (interquartile range, IQR 0.32-1.27) and the median Impact Factor was 1.68 (IQR 1.00-2.39). The median CiteScore for critical care was 0.95 (IQR 0.25-2.06) and the median Impact Factor was 2.18 (IQR 1.73-3.50). The mean article processing charge was $2243.04 (SD = $1136.16) for emergency medicine and $2201.64 (SD = $1174.38) for critical care. Article processing charges were 2.24, 1.75, 2.28 and 1.56 times more expensive for South African, Chinese, Turkish and Brazilian authors respectively than for United States authors, but roughly neutral for German and Australian authors (1.02 and 0.81 respectively). The database can be accessed here: http://www.emct.info/publication-search.html.

CONCLUSIONS:

We present a single database that captures emergency medicine and critical care journal impact rankings alongside their respective open access costs and green open access policies.
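
As a rough illustration of the purchasing power parity adjustment described in the methods: one common reading is that the relative burden of a USD-denominated fee equals the market exchange rate divided by the World Bank PPP conversion factor (local currency units per international dollar). The sketch below is in Python and uses hypothetical rates, not actual World Bank figures.

def relative_apc_burden(market_rate_lcu_per_usd: float,
                        ppp_lcu_per_intl_usd: float) -> float:
    """Cost of a USD-denominated fee in local purchasing-power terms,
    relative to a United States author (baseline 1.0)."""
    return market_rate_lcu_per_usd / ppp_lcu_per_intl_usd

# Hypothetical rates for a South African author: about 14 ZAR per USD at
# market rates versus about 6.25 ZAR per international dollar (PPP).
print(round(relative_apc_burden(14.0, 6.25), 2))  # -> 2.24

Under such rates a fixed dollar fee consumes roughly 2.2 times as much local purchasing power for a South African author as for one in the United States, in line with the multipliers reported above.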