Towards FAIR protocols and workflows: the OpenPREDICT use case [PeerJ]

Abstract: It is essential for the advancement of science that researchers share, reuse and reproduce each other’s workflows and protocols. The FAIR principles are a set of guidelines that aim to maximize the value and usefulness of research data, and emphasize the importance of making digital objects findable and reusable by others. The question of how to apply these principles not just to data but also to the workflows and protocols that consume and produce them is still under debate and poses a number of challenges. In this paper we describe a two-fold approach of simultaneously applying the FAIR principles to scientific workflows as well as the involved data. We apply and evaluate our approach on the case of the PREDICT workflow, a highly cited drug repurposing workflow. This includes FAIRification of the involved datasets, as well as applying semantic technologies to represent and store data about the detailed versions of the general protocol, of the concrete workflow instructions, and of their execution traces. We propose a semantic model to address these specific requirements and evaluate it by answering competency questions. This semantic model consists of classes and relations from a number of existing ontologies, including Workflow4ever, PROV, EDAM, and BPMN. This then allowed us to formulate and answer new kinds of competency questions. Our evaluation shows the high degree to which our FAIRified OpenPREDICT workflow now adheres to the FAIR principles, and the practicality and usefulness of being able to answer our new competency questions.
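The abstract above describes answering competency questions over RDF representations of workflow versions and execution traces. As a rough, hypothetical sketch of what such a question amounts to (the IRIs and triples below are invented placeholders, not the actual OpenPREDICT identifiers; the real model draws on PROV, Workflow4ever, EDAM, and BPMN), a question like “which datasets did a given execution use?” is a pattern match over provenance triples:

```python
# Toy in-memory triple store: (subject, predicate, object) tuples.
# All identifiers are illustrative stand-ins for real ontology IRIs.
RDF_TYPE = "rdf:type"
PROV_USED = "prov:used"

triples = [
    ("ex:run1", RDF_TYPE, "prov:Activity"),
    ("ex:drugbank", RDF_TYPE, "prov:Entity"),
    ("ex:run1", PROV_USED, "ex:drugbank"),
    ("ex:run2", RDF_TYPE, "prov:Activity"),
    ("ex:run2", PROV_USED, "ex:sider"),
]

def match(store, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in store
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Competency question: which datasets were used by execution ex:run1?
used = [o for _, _, o in match(triples, s="ex:run1", p=PROV_USED)]
print(used)  # ['ex:drugbank']
```

In practice the same pattern would be expressed as a SPARQL query against a triple store holding the FAIRified workflow metadata; the wildcard-matching function above mirrors a SPARQL basic graph pattern in miniature.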

 

Publish your Registered Report on F1000Research

“We believe that the value of science is in the rigor of the method, not the appeal of the results – an ethos at the heart of our publishing model. Choosing to publish your research as a Registered Report puts this into practice, by shifting the focus away from the results and back to the research question. Registered Reports can be used for research in almost any field of study, from psychology and neuroscience, to medicine or ecology.

 

Registered Reports on F1000Research follow a two-stage process: firstly, the Study Protocol (Stage 1) is published and peer-reviewed by subject experts before data collection begins. Then, once the research has been completed, the Research Article (Stage 2) is published, peer reviewed, and awarded a Registered Report badge.

 

F1000Research is the first publisher to combine the Registered Reports format with an open, post-publication peer review model. Alongside our open data policy, this format enhances credibility, and takes transparency and reproducibility in research to the next level….”

Cambridge scientist ‘breaks up the old-fashioned academic paper’ | Research Information

“Over the past two years, Freeman has been working on Octopus, an alternative publishing model that divides the various elements of publishing into eight different steps. This model allows for all the complexities and failures that are part of research to be published as part of the final output. Researchers will no longer have to cram all their work, often accrued over many years, into simplified, easy-to-read articles.

Freeman says: ‘Each of these mini publications will be publishable instantly, rather than submitted for peer review and selected by editors first. This way, research can be instantly in the public domain to be both reviewed and rated by all, speeding up research and solving some of the problems of the existing peer review process. The model will also credit researchers for their individual contributions and offer a tangible solution to the reproducibility crisis.’

For instance, Octopus allows for people who are specialists in research design to publish stand-alone protocols, those who have collected data to publish it (regardless of the size of the data), and for researchers specialised in analysing data to publish statistical analyses of data published by others. Each of these publications would be reviewed independently. This creates quality control through greater collaboration, and specialisation related to each step….”

The Argument for Open Research in the Time of COVID-19

“Many funders and health organizations are demanding that research approaches and results be made open. Preprints have offered one solution, and their value during this challenging time has been evident in the huge volume of COVID-19 related content appearing online. For example, this collection of COVID-19 SARS-CoV-2 preprints on medRxiv and bioRxiv has more than 1900 manuscripts.

Now, protocols.io and Code Ocean are working to ensure that those research approaches remain open. These open access online tools are ideal repositories for all protocol and methodological approaches as well as computational pipelines and code. Online collaborative research tools are helpful to researchers who are restricted in how they can work and collaborate. For those at the frontline conducting scientific research, these tools serve as an ideal way to share their insights and approaches.

Here’s how protocols.io and Code Ocean are supporting the research community during this unprecedented time:…”

The Declaration to Improve Biomedical & Health Research

“3) That all publicly funded research is registered and published in designated Research Repositories. The majority of research is funded by public and charitable funds. Yet huge amounts of research are never published at all, which, aside from being an indefensible waste of public money, is a major source of publication bias [3]. Meanwhile, basic research documentation that is essential to ensure appropriate research conduct, such as protocols, is only sometimes available, either on voluntary databases or upon agreement of study authors. The World Health Organization (WHO) has long urged registration of trials in affiliated ‘primary registries’, such as ClinicalTrials.gov [17] and the EU Clinical Trials Register [18], which can all be searched simultaneously on a dedicated WHO website [19]. Mandatory registration of trials has improved transparency, although compliance with publication requirements is poor [20], possibly hampered by problems with the basic functionality of some major registries [21, 22]. Even where trials have been registered, usually only very limited information is shared, rather than the full protocols required to really understand study plans. Most researchers don’t work in trials. Some principled scientists do register their work, but while this remains voluntary such researchers are likely to remain a minority. For all publicly funded research, not just trials, comprehensive documentation including protocols, statistical analysis plans, statistical analysis code, and raw or appropriately de-identified summary data should be available on a single WHO-affiliated repository, designated for that purpose by each state or group of states. Depositing documentation need not become onerous for researchers and could actually replace much of the overly bureaucratic reporting currently required by funders and ethics committees. Different solutions may exist in different countries.
For example, England’s Health Research Authority could develop such a registry [23] by building on its existing public databases [24]. Or, through additional national funding and international support, existing platforms which promote transparency and accessibility [25, 26, 27] could be designated for this purpose through collaboration with national research bodies.”

JMIR Preprints #16078: Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: A Cross-sectional Analysis

Abstract:  Background:

Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature.

Objective:

We sought to determine the quality and transparency of the literature in dermatology journals by evaluating publications for the presence of eight indicators of reproducible and transparent research practices.

Methods:

Using a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014 and December 31, 2018. After generating a list of eligible dermatology publications, we then searched for full-text PDF versions using Open Access Button, Google Scholar, and/or PubMed. Each publication was analyzed for eight indicators of reproducibility and transparency, using a pilot-tested Google Form.

Results:

After exclusions, 127 studies with empirical data were included in our analysis. The majority of publications (113, 89%) did not provide unmodified, raw data used to make computations, 124 (98%) failed to make complete protocols available, and 126 (99%) did not include step-by-step analysis scripts.

Conclusions:

The studies in our sample of dermatology journal publications do not appear to include sufficient detail to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted. More robust reporting of key methodological details, open data sharing, and stricter standards imposed by journals on authors regarding disclosure of study materials might help improve the climate of reproducible research in dermatology.

To cure brain diseases, neuroscientists must collaborate: That’s why I’m giving my data away

“L-DOPA remains the most effective therapy for Parkinson’s. But it was discovered back in the 1960s, and no other disease-modifying therapy has emerged since then.

This is partly due to the complexity of the disease, but also because we haven’t done a good enough job sharing our protocols and data in an open and accessible manner, so others can take the next step forward and avoid making the same mistakes or repeating the same experiments over and over again.

That’s why I’m giving my lab’s standard methods away over the next year — making them available to anyone who would like to access them….”

UC-wide pilot of protocols.io – Office of Scholarly Communication

“One of the serious barriers to reproducibility of research is the lack of detailed methods in published articles. As trainees leave a research lab, it is often impossible to identify precisely the steps of their performed experiments. As we look to tackle the various aspects of open access and open research, the University of California continues to explore how we can unlock the underlying methods and protocols used in lab experiments.

With this goal in mind, we are excited to announce a new pilot for UC-wide use of protocols.io — an open access repository for research methods. The pilot, which will run for a three-year period from June 1, 2019 through May 31, 2022, will remove all cost barriers and allow UC researchers to test the uses of protocols.io for private collaboration around method development and for use in classrooms. In the long term, this initiative should also increase the reproducibility and rigour of the research published by UC academics….”

Power to the Protocols! – Chan Zuckerberg Science Initiative – Medium

“We require our grantees to contribute to open science in several ways, including:

  • Depositing software code to an open repository such as GitHub;
  • Submitting results to open-access preprint servers like bioRxiv upon submission to a peer-reviewed journal, if not earlier;
  • Making experimental protocols openly accessible….”

“These challenges helped us identify a new technology platform for developing and sharing protocols. Protocols.io is an open access resource that allows researchers to discover and share up-to-date science methods, similar to the way code can be shared on GitHub….”