Two Competing Visions for Research Data Sharing – The Scholarly Kitchen

“In recent years, mechanisms for sharing and preserving research data have grown considerably. But the landscape is crowded with a number of divergent models for data sharing. And because these divergent approaches to research data sharing are poorly distinguished in much of the discourse, it can be a confusing landscape. Some are driven by the needs of science, some by business strategy. Today, I propose that two fundamentally competing visions are emerging for sharing research data….”

An assessment of transparency and reproducibility-related research practices in otolaryngology – Johnson – The Laryngoscope – Wiley Online Library

Abstract

Objectives/Hypothesis

Clinical research serves as the foundation for evidence-based patient care, and reproducibility of results is consequently critical. We sought to assess the transparency and reproducibility of research studies in otolaryngology by evaluating a random sample of publications in otolaryngology journals between 2014 and 2018.

Study Design

Review of published literature for reproducible and transparent research practices.

Methods

We used the National Library of Medicine catalog to identify otolaryngology journals that met the inclusion criteria (available in the English language and indexed in MEDLINE). From these journals, we extracted a random sample of 300 publications using a PubMed search for records published between January 1, 2014 and December 31, 2018. Specific indicators of reproducible and transparent research practices were evaluated in a blinded, independent, and duplicate manner using a pilot-tested Google form.
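Purely as an illustration of the sampling step described in the Methods (a minimal sketch, not the authors' code; the identifier list, the loader, and the fixed seed are assumptions), in Python:

    import random

    def sample_records(pmids, n=300, seed=2018):
        # Draw a simple random sample of PubMed identifiers for screening.
        # A fixed seed (an assumption here) makes the sample regenerable.
        rng = random.Random(seed)
        return rng.sample(pmids, n)

    # Hypothetical usage with the 26,498 identifiers the search returned:
    # pmids = [line.strip() for line in open("otolaryngology_2014_2018_pmids.txt")]
    # selected = sample_records(pmids)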

Results

Our initial search returned 26,498 records, from which 300 were randomly selected for analysis. Of these 300 records, 286 met inclusion criteria and 14 did not. Among the empirical studies, 2% (95% confidence interval [CI]: 0.4%–3.5%) of publications indicated that raw data were available, 0.6% (95% CI: 0.3%–1.6%) reported an analysis script, 5.3% (95% CI: 2.7%–7.8%) were linked to an accessible research protocol, and 3.9% (95% CI: 1.7%–6.1%) were preregistered. None of the publications had a clear statement claiming to replicate, or to be a replication of, another study.
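As a worked illustration of the interval estimates reported above (the abstract does not state the exact method or denominators, so the Wald formula and the counts below are assumptions), a minimal Python sketch:

    import math

    def wald_ci(successes, n, z=1.96):
        # 95% Wald confidence interval for a proportion: p +/- z*sqrt(p(1-p)/n).
        # Whether the authors used this exact interval is not stated.
        p = successes / n
        half_width = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half_width), min(1.0, p + half_width)

    # Illustrative only: about 2% of ~286 studies is roughly 6 records.
    p, lo, hi = wald_ci(6, 286)
    print(f"{p:.1%} (95% CI: {lo:.1%} to {hi:.1%})")  # 2.1% (95% CI: 0.4% to 3.8%)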

Conclusions

Inadequate reproducibility practices exist in otolaryngology. Nearly all studies in our analysis lacked a data or material availability statement, did not link to an accessible protocol, and were not preregistered. Taking steps to improve reproducibility would likely improve patient care.

Open Access Week 2019 | Research Data Management Program

“International Open Access Week is an opportunity to take action in making openness the default for research—to raise the visibility of scholarship and accelerate research.

At Harvard, the Library is dedicated to fostering equitable systems of open research and scholarship that serve the needs of our diverse global community.

This year’s Open Access Week invites all interested stakeholders to participate in advancing this important work. Please join us for a variety of workshops on open platforms to help you make your research, data, and scholarship more accessible, collaborative, and reproducible. …”

Guest Post – A Look at the User-Centric Future of Academic Research Software — And Why It Matters, Part 2: Implications – The Scholarly Kitchen

“Yesterday’s post discussed current trends in the landscape of research and academic software. Today, we look at the implications of those trends. First, we look at the reproducibility crisis as a case study of how researcher-built tools can help to solve tough problems faced by the community. Second, we look at some of the broader possible implications for the scholarly communication space….”

Transforming the culture of data science | The Alan Turing Institute

“The crisis of reproducibility in science is well known. The combination of ‘publish or perish’ incentives, secrecy around data and the drive for novelty at all costs can result in fragile advances and lots of wasted time and money. Even in data science, when a paper is published there is generally no way for an outsider to verify its results, because the data from which the findings were derived are not available for scrutiny. Such science cannot be built upon very easily: siloed science is slow science.

That’s one of the reasons funders and publishers are beginning to require that publications include access to the underlying data and analysis code. It’s clear that this new era of data science needs a new cultural and practical approach, one which embraces openness and collaboration more than ever before. To this end, a group of Turing researchers have created The Turing Way – an evolving online “handbook” on how to conduct world-leading, reproducible research in data science and artificial intelligence….”

15 Years of a Movement for Open Access Medical Science | Speaking of Medicine

“To kick off the celebration of PLOS Medicine's 15th Anniversary, Specialty Consulting Editor Sanjay Basu discusses the journal's contributions to scientific communication and his favorite article from the past 15 years.

It’s fitting that one of PLOS Medicine’s most viewed and cited articles remains the cult classic, Why Most Published Research Findings Are False (2005). The article codifies the challenge taken up as a mantle by the contributors and editors of PLOS Medicine for the last 15 years: to make science more transparent, reproducible, and trustworthy….”

Cambridge journal aims for ‘radical new approach’ | Research Information

“A journal from Cambridge University Press (CUP) is aiming for a ‘radical new approach’ to both publishing and peer reviewing research.

Experimental Results aims to tackle the crisis in the reproducibility of results, to provide an outlet for standalone research that currently goes unpublished – and to make peer review faster, less onerous and more transparent.

Submissions are open for the journal, which will give researchers a place to publish valid, standalone results, regardless of whether those results are novel, inconclusive, negative or supplementary to other published work.

It will also publish the outcome of attempts to reproduce previously published experiments, including those that dispute past findings….”

Home – Profeza

“A set of workflow software tools that guides article authors in making scientific outputs, including Additional Research Objects (AROs, i.e., datasets and null results), more easily reproducible and reusable. It also captures reuse events after publication to reward those who have contributed, making the improvement of research outputs a continuous process rather than a one-time event….”