Supporting rigor through reproducibility | JAMIA Open | Oxford Academic

“Community abstracts are now mandatory for accepted submissions to JAMIA Open. Community abstracts support a key goal in dissemination of published work to the stakeholders that have the most to gain: the patients. I acknowledge the challenge for many of us who spend our time writing and communicating to research audiences to write Community Abstracts. However, for our field to have the most impact, we must convey our work to the greater community. Community understanding of our findings and innovations in leveraging informatics approaches to improve health and health care is a crucial first step toward building a foundation for reproducibility. That is, if a patient understands work presented in a publication, they should expect that it can be reproduced for themselves.

Reproducibility is on equal footing with rigor in terms of importance in the pursuit of knowledge. Many funding agencies now require an explicit description of how proposed work will be not only rigorous but also reproducible. JAMIA Open has strongly encouraged the availability of any associated data (eg, through Dryad) to support reproducibility. Data that are made available through readily accessible, public repositories not only support verification studies but can also form the basis for new studies. Data sets can also be enhanced and curated to provide common “benchmark” datasets for algorithm evaluation.

JAMIA Open was established as a Level 1 Data Availability journal, meaning that authors were encouraged to share their data publicly. We are now shifting to be a Level 2 Data Availability journal, meaning that, in addition to sharing data publicly, each publication must include a Data Availability Statement. Of course, the release of data should only be done where ethically possible and in accordance with relevant laws. A description of the Oxford University Press Data Availability policies can be found here: https://academic.oup.com/journals/pages/authors/preparing_your_manuscript/research-data-policy….”

PsyArXiv Preprints | Questionable and open research practices: attitudes and perceptions among quantitative communication researchers

Abstract:  Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this claim is primarily derived from other disciplines. Before change in communication research can happen, it is important to document the extent to which QRPs are used and whether researchers are open to the changes proposed by the so-called open science agenda. We conducted a large survey among authors of papers published in the top-20 journals in communication science in the last ten years (N=1039). A non-trivial percentage of researchers report using one or more QRPs. While QRPs are generally considered unacceptable, researchers perceive QRPs to be common among their colleagues. At the same time, we find optimism about the use of open science practices in communication research. We end with a series of recommendations outlining what journals, institutions and researchers can do moving forward.

[2011.07571] Software must be recognised as an important output of scholarly research

Abstract:  Software now lies at the heart of scholarly research. Here we argue that as well as being important from a methodological perspective, software should, in many instances, be recognised as an output of research, equivalent to an academic paper. The article discusses the different roles that software may play in research and highlights the relationship between software and research sustainability and reproducibility. It describes the challenges associated with the processes of citing and reviewing software, which differ from those used for papers. We conclude that whilst software outputs do not necessarily fit comfortably within the current publication model, there is a great deal of positive work underway that is likely to make an impact in addressing this.

 

SocArXiv Papers | Metascience as a scientific social movement

Abstract:  Emerging out of the “reproducibility crisis” in science, metascientists have become central players in debates about research integrity, scholarly communication, and science policy. The goal of this article is to introduce metascience to STS scholars, detail the scientific ideology that is apparent in its articles, strategy statements, and research projects, and discuss its institutional and intellectual future. Put simply, metascience is a scientific social movement that seeks to use the tools of science, especially quantification and experimentation, to diagnose problems in research practice and improve efficiency. It draws together data scientists, experimental and statistical methodologists, and open science activists into a project with both intellectual and policy dimensions. Metascientists have been remarkably successful at winning grants, motivating news coverage, and changing policies at science agencies, journals, and universities. Moreover, metascience represents the apotheosis of several trends in research practice, scientific communication, and science governance, including increased attention to methodological and statistical criticism of scientific practice, the promotion of “open science” by science funders and journals, the growing importance of both preprint and data repositories for scientific communication, and the new prominence of data scientists as research makes a turn toward Big Science.

 

A web-native approach to open source scientific publishing | Opensource.com

“This summer, eLife was pleased to launch Executable Research Articles (ERAs) in partnership with Stencila, allowing authors to post computationally reproducible versions of their published papers in the open-access journal.

The open source ERA technology stack delivers a truly web-native format that treats live, interactive code as a first-class asset. It was developed to address current challenges around reproducing and reusing published results—challenges mostly caused by the lack of infrastructure for publishers to showcase the richness and sophistication of the computational methods used by researchers in their work.

As part of its mission to transform research communication, eLife invests in open source technology innovation to modernize the infrastructure for science publishing and improve online tools for sharing, using, and interacting with new results. The organization began work on the concept of computationally reproducible papers in 2017, first in partnership with Substance and later, with Stencila, and announced a number of milestones along the road to delivering ERA….”
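To make the idea of live code as a first-class asset concrete: an executable cell in a computationally reproducible article is essentially a short, self-contained script whose figure or table is regenerated from the article's data whenever the article is rebuilt or rerun. The sketch below illustrates that pattern in Python only; the inline data and the output name figure2.png are invented for the example, and this is not eLife's or Stencila's actual cell format.

    # Illustrative executable cell: regenerate a figure from data at build time.
    # The inline CSV and the output file name are hypothetical placeholders.
    import io

    import matplotlib.pyplot as plt
    import pandas as pd

    # Inline stand-in for the article's dataset so the sketch runs on its own;
    # a real executable article would pin the data file alongside the text.
    raw = io.StringIO("dose,response\n0,1.0\n1,1.4\n2,2.1\n4,3.9\n8,7.6\n")
    data = pd.read_csv(raw)

    # Rebuild the figure from the data rather than embedding a static image.
    fig, ax = plt.subplots()
    ax.plot(data["dose"], data["response"], marker="o")
    ax.set_xlabel("dose")
    ax.set_ylabel("response")
    ax.set_title("Figure regenerated from the underlying data")
    fig.savefig("figure2.png", dpi=150)

The aim of the ERA format is that readers can inspect, modify, and re-execute code like this against the article's own data and dependencies, which is what makes the published results directly verifiable rather than static images.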

Two years into the Brazilian Reproducibility Initiative: reflections on conducting a large-scale replication of Brazilian biomedical science

Abstract:  Scientists have increasingly recognised that low methodological and analytical rigour combined with publish-or-perish incentives can make the published scientific literature unreliable. As a response to this, large-scale systematic replications of the literature have emerged as a way to assess the problem empirically. The Brazilian Reproducibility Initiative is one such effort, aimed at estimating the reproducibility of Brazilian biomedical research. Its goal is to perform multicentre replications of a quasi-random sample of at least 60 experiments from Brazilian articles published over a 20-year period, using a set of common laboratory methods. In this article, we describe the challenges of managing a multicentre project with collaborating teams across the country, as well as its successes and failures over the first two years. We end with a brief discussion of the Initiative’s current status and its possible future contributions after the project is concluded in 2021.

 

Escaping science’s paradox – Works in Progress

“There are lots of ideas about how to improve scientific reproducibility through the way federal research is funded. After all, quality control and assurance are hardly new ideas.

For example, we could require that data and computer code be shared openly so that others can scrutinize and rerun it. In too many cases to list, this sort of reanalysis has led to revisions, retractions, and even the discovery of outright fraud….”

Data Management UKRN workshop

“We are offering a one-day short course on data management skills for Open Research online in November 2020. It is open to researchers at all career stages, and across all quantitative disciplines in the biomedical sciences (broadly defined). The workshop is free, supported by Cancer Research UK. We especially encourage researchers funded by CRUK to apply. The course aims to teach researchers how to make their work more reproducible, open and robust through the use of data management tools such as R and Git. Attendees will come away with an overview of how open/reproducible data pipelines work, and why they are important, as well as hands-on experience delving into specific tools. The workshop will include four main components:…”
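To give a concrete flavour of the kind of open, reproducible data pipeline such a course covers, here is a minimal sketch of the pattern; it is written in Python rather than the R/Git stack the workshop itself teaches, and the file paths and column names (data/raw/survey.csv, data/derived/summary.csv, group, score) are hypothetical placeholders for illustration only.

    # Minimal reproducible-pipeline sketch: raw data in, derived data out,
    # with a provenance record so the step can be audited and rerun.
    # All file paths and column names here are hypothetical placeholders.
    import csv
    import hashlib
    import json
    from pathlib import Path

    RAW = Path("data/raw/survey.csv")          # raw data: treated as read-only
    DERIVED = Path("data/derived/summary.csv") # produced only by this script
    LOG = Path("data/derived/provenance.json")

    def sha256(path: Path) -> str:
        """Checksum an input file so the provenance log pins the exact version used."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def main() -> None:
        DERIVED.parent.mkdir(parents=True, exist_ok=True)

        # Derive a per-group mean score; the raw file itself is never edited.
        scores_by_group: dict[str, list[float]] = {}
        with RAW.open(newline="") as fh:
            for row in csv.DictReader(fh):
                scores_by_group.setdefault(row["group"], []).append(float(row["score"]))

        with DERIVED.open("w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["group", "mean_score", "n"])
            for group, scores in sorted(scores_by_group.items()):
                writer.writerow([group, sum(scores) / len(scores), len(scores)])

        # Record what was read and written so others can verify and repeat the step.
        LOG.write_text(json.dumps({"input": str(RAW), "input_sha256": sha256(RAW),
                                   "output": str(DERIVED)}, indent=2))

    if __name__ == "__main__":
        main()

The point of the pattern, whatever the language, is that raw data are read-only, every derived file is produced by a script that can be rerun end to end, and the script and provenance record are versioned (eg, in Git) so collaborators and reviewers can audit and reproduce each step.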