Building capacity through open approaches: Lessons from developing undergraduate electrophysiology practicals

Abstract:  Electrophysiology has a wide range of biomedical research and clinical applications. As such, education in the theoretical basis and hands-on practice of electrophysiological techniques is essential for biomedical students, including at the undergraduate level. However, offering hands-on learning experiences is particularly difficult in environments with limited resources and infrastructure. In 2017, we began a project to design and incorporate electrophysiology laboratory practicals into our Biomedical Physics undergraduate curriculum at the Universidad Nacional Autónoma de México. We describe some of the challenges we faced, how we maximized resources to overcome some of these challenges, and in particular, how we used open scholarship approaches to build both educational and research capacity. The use of open tools, open platforms, and open licenses was key to the success and broader impact of our project. We share examples of our practicals and explain how we use these activities to strengthen interdisciplinary learning, namely the application of concepts in physics to understanding functions of the human body. Our goal is to provide ideas, materials, and strategies for educators working in similar resource-limited environments.


Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities | The BMJ

Abstract:  Objective To determine the presence of a set of pre-specified traditional and non-traditional criteria used to assess scientists for promotion and tenure in faculties of biomedical sciences among universities worldwide.

Design Cross sectional study.

Setting International sample of universities.

Participants 170 randomly selected universities from the Leiden ranking of world universities list.

Main outcome measure Presence of five traditional (for example, number of publications) and seven non-traditional (for example, data sharing) criteria in guidelines for assessing assistant professors, associate professors, and professors and the granting of tenure in institutions with biomedical faculties.

Results A total of 146 institutions had faculties of biomedical sciences, and 92 had eligible guidelines available for review. Traditional criteria of peer reviewed publications, authorship order, journal impact factor, grant funding, and national or international reputation were mentioned in 95% (n=87), 37% (34), 28% (26), 67% (62), and 48% (44) of the guidelines, respectively. Conversely, among non-traditional criteria, only citations (any mention in 26%; n=24) and accommodations for employment leave (37%; 34) were relatively commonly mentioned. Mention of alternative metrics for sharing research (3%; n=3) and data sharing (1%; 1) was rare, and three criteria (publishing in open access mediums, registering research, and adhering to reporting guidelines) were not found in any guidelines reviewed. Among guidelines for assessing promotion to full professor, traditional criteria were more commonly reported than non-traditional criteria (traditional criteria 54.2%, non-traditional items 9.5%; mean difference 44.8%, 95% confidence interval 39.6% to 50.0%; P=0.001). Notable differences were observed across continents in whether guidelines were accessible (Australia 100% (6/6), North America 97% (28/29), Europe 50% (27/54), Asia 58% (29/50), South America 17% (1/6)), with more subtle differences in the use of specific criteria.

Conclusions This study shows that the evaluation of scientists emphasises traditional criteria as opposed to non-traditional criteria. This may reinforce research practices that are known to be problematic while insufficiently supporting the conduct of better quality research and open science. Institutions should consider incentivising non-traditional criteria.

The NIH Preprint Pilot: A New Experiment for a New Era – NLM Musings from the Mezzanine

“Recognizing the growing interest in preprints, NLM is today launching the first phase of the NIH Preprint Pilot, which will test the viability of making preprints searchable in PubMed Central (PMC) and, by extension, discoverable in PubMed, starting with COVID-19 preprints reporting NIH-supported research.

To be clear, NLM is not building a preprint server for NIH investigators, nor are we developing a comprehensive preprint discovery resource. Rather, through this pilot, we plan to add a curated collection of preprints from eligible preprint servers to our established literature resources. In doing so, our goal is to improve scholarly communications by accelerating and expanding the findability of NIH research results.

With the encouragement of NIH leadership, NLM has been exploring ways to leverage its literature databases to help accelerate the discoverability and maximize the impact of NIH-supported research via preprints. The planned pilot builds on guidance released by NIH in March 2017, which encouraged NIH investigators to use preprints and other interim research products to speed the dissemination of research and enhance the rigor of their work through public comments and new scientific collaborations….”

Data-sharing recommendations in biomedical journals and randomised controlled trials: an audit of journals following the ICMJE recommendations | BMJ Open

Abstract:  Objective To explore the implementation of the International Committee of Medical Journal Editors (ICMJE) data-sharing policy which came into force on 1 July 2018 by ICMJE-member journals and by ICMJE-affiliated journals declaring they follow the ICMJE recommendations.

Design A cross-sectional survey of data-sharing policies in 2018 on journal websites and in data-sharing statements in randomised controlled trials (RCTs).

Setting ICMJE website; PubMed/Medline.

Eligibility criteria ICMJE-member journals and 489 ICMJE-affiliated journals that published an RCT in 2018, had an accessible online website and were not considered as predatory journals according to Beall’s list. One hundred RCTs for member journals and 100 RCTs for affiliated journals with a data-sharing policy, submitted after 1 July 2018.

Main outcome measures The primary outcome for the policies was the existence of a data-sharing policy (explicit data-sharing policy, no data-sharing policy, policy merely referring to ICMJE recommendations) as reported on the journal website, especially in the instructions for authors. For RCTs, our primary outcome was the intention to share individual participant data set out in the data-sharing statement.

Results Eight (out of 14; 57%) member journals had an explicit data-sharing policy on their website (three were more stringent than the ICMJE requirements, one was less demanding and four were compliant), five (35%) additional journals stated that they followed the ICMJE requirements, and one (8%) had no policy online. In RCTs published in these journals, there were data-sharing statements in 98 out of 100, with expressed intention to share individual patient data reaching 77 out of 100 (77%; 95% CI 67% to 85%). One hundred and forty-five (out of 489) ICMJE-affiliated journals (30%; 26% to 34%) had an explicit data-sharing policy on their website (11 were more stringent than the ICMJE requirements, 85 were less demanding and 49 were compliant) and 276 (56%; 52% to 61%) merely referred to the ICMJE requirements. In RCTs published in affiliated journals with an explicit data-sharing policy, data-sharing statements were rare (25%), and expressed intentions to share data were found in 22% (15% to 32%).

Conclusion The implementation of ICMJE data-sharing requirements in online journal policies was suboptimal for ICMJE-member journals and poor for ICMJE-affiliated journals. In published RCTs, adherence to the policy was good in member journals but of concern in affiliated journals. We suggest the conduct of continuous audits of medical journal data-sharing policies in the future.

Recommendations to enhance rigor and reproducibility in biomedical research | GigaScience | Oxford Academic

Abstract:  Biomedical research depends increasingly on computational tools, but mechanisms ensuring open data, open software, and reproducibility are variably enforced by academic institutions, funders, and publishers. Publications may present software whose source code or documentation is unavailable or later becomes so; this compromises the role of peer review in evaluating technical strength and scientific contribution. Incomplete ancillary information for an academic software package may bias or limit subsequent work. We provide 8 recommendations to improve reproducibility, transparency, and rigor in computational biology—precisely the values that should be emphasized in life science curricula. Our recommendations for improving software availability, usability, and archival stability aim to foster a sustainable data science ecosystem in life science research.


Applying FAIRness: Redesigning a Biomedical Informatics Research Data Management Pipeline

Abstract:  Background Managing research data in biomedical informatics research requires solid data governance rules to guarantee sustainable operation, as it generally involves several professions and multiple sites. As every discipline involved in biomedical research applies its own set of tools and methods, research data as well as applied methods tend to branch out into numerous intermediate and output data objects, making it very difficult to reproduce research results.

Objectives This article gives an overview of our implementation status applying the Findability, Accessibility, Interoperability and Reusability (FAIR) Guiding Principles for scientific data management and stewardship onto our research data management pipeline, focusing on the software tools that are in use.

Methods We analyzed our progress in FAIRifying the whole data management pipeline, from processing non-FAIR data up to data usage. We looked at software tools for data integration, data storage, and data usage, as well as how the FAIR Guiding Principles helped to choose appropriate tools for each task.

Results We were able to advance the degree of FAIRness of our data integration and data storage solutions, but have yet to enable more of the FAIR Guiding Principles regarding data usage. Existing evaluation methods for the FAIR Guiding Principles (FAIRmetrics) were not applicable to our analysis of software tools.

Conclusion Using the FAIR Guiding Principles, we FAIRified relevant parts of our research data management pipeline, improving findability, accessibility, interoperability, and reuse of datasets and research results. We aim to apply the FAIRmetrics to our data management infrastructure and—where required—to contribute to the FAIRmetrics for research data in the biomedical informatics domain, as well as for software tools, to achieve a higher degree of FAIRness of our research data management pipeline.

Digital Scholarship [at Harvard Medical School]

“The Countway Library’s Publishing & Data Services team is committed to supporting the Harvard community in advancing access, dissemination, and reproducibility of biomedical & health science research.

Our team offers workshops, tutorials, and consultations to increase awareness, proficiency and adoption of digital technologies that are essential to scholarly communication, research collaboration, data management, & bioinformatics.

Our goal is to provide expert services and support to the Harvard community in efforts that foster participation in Open Science initiatives, and share Harvard’s scholarly outputs and unique digital collections with researchers throughout the world. …”

The Declaration to Improve Biomedical & Health Research

“3) That all publicly funded research is registered and published in designated Research Repositories. The majority of research is funded by public and charitable funds. Yet huge amounts of research are never published at all, which, aside from being an indefensible waste of public money, is a major source of publication bias [3]. Meanwhile, basic research documentation that is essential to ensure appropriate research conduct, such as protocols, is only sometimes available, either on voluntary databases or by agreement of study authors. The World Health Organization (WHO) has long urged registration of trials in affiliated ‘primary registries’, such as ClinicalTrials.gov [17] and the EU Clinical Trials Register [18], which can all be searched simultaneously on a dedicated WHO website [19]. Mandatory registration of trials has improved transparency, although compliance with publication requirements is poor [20], possibly hampered by problems with the basic functionality of some major registries [21, 22]. Even where trials have been registered, usually only very limited information is shared, rather than the full protocols required to really understand study plans. Most researchers don’t work in trials. Some principled scientists do register their work, but while this remains voluntary such researchers are likely to remain a minority. For all publicly funded research, not just trials, comprehensive documentation—including protocols, statistical analysis plans, statistical analysis code, and raw or appropriately de-identified summary data—should be available on a single WHO-affiliated repository, designated for that purpose by each state or group of states. Depositing documentation need not become onerous for researchers and could actually replace much of the overly bureaucratic reporting currently required by funders and ethics committees. Different solutions may exist in different countries.
For example, England’s Health Research Authority could develop such a registry [23] by building on its existing public databases [24]. Or, through additional national funding and international support, existing platforms which promote transparency and accessibility [25, 26, 27] could be designated for this purpose through collaboration with national research bodies.”