Abstract: We conducted an audit of 60 clinical psychology journals, covering the first 2 quartiles by impact factor on Web of Science. We evaluated editorial policies in 5 domains crucial to reproducibility and transparency (prospective registration, data sharing, preprints, endorsement of reporting guidelines and conflict of interest [COI] disclosure). We examined implementation in a randomly selected cross-sectional sample of 201 articles published in 2017 in the “best practice” journals, defined as having explicit supportive policies in 4 out of 5 domains. Our findings showed that 15 journals cited prospective registration, 40 data sharing, 15 explicitly permitted preprints, 28 endorsed reporting guidelines, and 52 had mandatory policies for COI disclosure. Except for COI disclosure, few policies were mandatory: registration in 15 journals, data sharing in 1, and reporting guidelines for randomized trials in 18 and for meta-analyses in 15. Seventeen journals were identified as “best practice.” An analysis of recent articles showed extremely low compliance for prospective registration (3% of articles) and data sharing (2%). Only one preprint could be identified. Reporting guidelines were endorsed in 19% of the articles, though for most articles this domain was rated as nonapplicable. Only half of the articles included a COI disclosure. Desired open science policies should become clear and mandatory, and their enforcement streamlined by reducing the multiplicity of guidelines and templates.
“An ad hoc committee of the National Academies of Sciences, Engineering, and Medicine is convening a public workshop to discuss the current state of transparency in reporting pre-clinical biomedical research (e.g., disclosure of the availability and location of data, materials, analysis, and methodology) and to explore the possibility of improving the harmonization of guidelines across journals and funding agencies so that biomedical researchers propose and report data in a consistent manner. This workshop is sponsored by the National Institutes of Health, Cell Press, The Lancet, and Nature Research.
Highlight current efforts by researchers, institutions, funders, and journals to increase transparency in proposing and reporting pre-clinical biomedical research;
Discuss journal and funder assessments of researchers’ adherence to reporting guidelines, including a discussion of the effectiveness of checklists;
Consider lessons learned from field-specific best practices for increased transparency in reporting rigor elements (i.e., research design, methodology, analysis, interpretation and reporting of results) that are generalizable across biomedical research domains;
Discuss opportunities for improving the consistency of reporting guidelines and requirements for rigor and transparency by journals, funders, and institutions across the biomedical research lifecycle; and
Consider approaches to compare reporting of rigor elements proposed in grant applications to those included in publications.
The committee will plan and organize the workshop, develop the agenda, select and invite speakers and discussants, and moderate or identify moderators for the discussions. The agenda will include a panel discussion on facilitating the development of consistent guidelines (e.g., a common set of minimal reporting standards) that could be applied across journals and funders to increase transparency in proposing and reporting biomedical research.
A proceedings of the presentations and discussions at the workshop will be prepared by a designated rapporteur in accordance with institutional guidelines….”
Abstract: Efforts to make research results open and reproducible are increasingly reflected by journal policies encouraging or mandating authors to provide data availability statements. As a consequence, there has been a strong uptake of data availability statements in recent literature. Nevertheless, it is still unclear what proportion of these statements actually contain well-formed links to data, for example via a URL or permanent identifier, and whether there is added value in providing them. We consider 531,889 journal articles published by PLOS and BMC that are part of the PubMed Open Access collection, categorize their data availability statements according to their content and analyze the citation advantage of different statement categories via regression. We find that, following mandated publisher policies, data availability statements have become common by now, yet statements containing a link to a repository are still just a fraction of the total. We also find that articles with these statements, in particular, can have up to 25.36% higher citation impact on average: an encouraging result for all publishers and authors who make the effort of sharing their data. All our data and code are made available in order to reproduce and extend our results.
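The abstract's headline number comes from regressing citation counts on data availability statement categories. A minimal sketch of that style of analysis on synthetic data (the variable names, effect size, and log-linear model here are illustrative assumptions, not the paper's actual specification or data) might look like:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic corpus: 1 = data availability statement links to a repository.
n = 5000
has_repo_link = rng.integers(0, 2, size=n)

# Simulate citation counts with a multiplicative boost for linked data
# (illustrative effect size, not the paper's estimate).
true_advantage = 0.25
log_mu = 1.5 + np.log1p(true_advantage) * has_repo_link
citations = rng.poisson(np.exp(log_mu))

# Log-linear OLS: log(1 + citations) ~ intercept + has_repo_link.
X = np.column_stack([np.ones(n), has_repo_link])
y = np.log1p(citations)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Back-transform the coefficient into an approximate percent advantage.
pct_advantage = (np.exp(beta[1]) - 1) * 100
print(f"Estimated citation advantage: {pct_advantage:.1f}%")
```

The log transform makes the coefficient interpretable as a multiplicative (percentage) citation advantage; count models such as negative binomial regression would be the more rigorous choice for real citation data.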
Abstract: To truly achieve reproducible research, having reproducible analytics must be a principal research goal. Biological discovery is not the only deliverable; reproducibility is an essential part of our research.
[From the body of the paper:] “As mandated data sharing resolves a portion of the overall transparency/reproducibility challenge, the unaddressed issue remains the sharing of analyses….”
“One of the serious barriers to reproducibility of research is the lack of detailed methods in published articles. As trainees leave a research lab, it is often impossible to identify precisely the steps of their performed experiments. As we look to tackle the various aspects of open access and open research, the University of California continues to explore how we can unlock the underlying methods and protocols used in lab experiments.
With this goal in mind, we are excited to announce a new pilot for UC-wide use of protocols.io — an open access repository for research methods. The pilot, which will run for a three-year period from June 1, 2019 through May 31, 2022, will remove all cost barriers and allow UC researchers to test the uses of protocols.io for private collaboration around method development and for use in classrooms. In the long term, this initiative should also increase the reproducibility and rigour of the research published by UC academics….”
Abstract: Assessing scientists using exploitable metrics can lead to the degradation of research methods even without any strategic behavior on the part of individuals, via “the natural selection of bad science.” Institutional incentives to maximize metrics like publication quantity and impact drive this dynamic. Removing these incentives is necessary, but institutional change is slow. However, recent developments suggest possible solutions with more rapid onsets. These include what we call open science improvements, which can reduce publication bias and improve the efficacy of peer review. In addition, there have been increasing calls for funders to move away from prestige- or innovation-based approaches in favor of lotteries. We investigated whether such changes are likely to improve the reproducibility of science even in the presence of persistent incentives for publication quantity through computational modeling. We found that modified lotteries, which allocate funding randomly among proposals that pass a threshold for methodological rigor, effectively reduce the rate of false discoveries, particularly when paired with open science improvements that increase the publication of negative results and improve the quality of peer review. In the absence of funding that targets rigor, open science improvements can still reduce false discoveries in the published literature but are less likely to improve the overall culture of research practices that underlie those publications.
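The "modified lottery" the authors model allocates funding randomly among proposals that clear a methodological-rigor bar. A toy simulation can illustrate why this reduces false discoveries (the threshold, rigor distribution, and false-discovery probabilities below are invented for illustration, not taken from the authors' computational model):

```python
import numpy as np

rng = np.random.default_rng(0)

def false_discovery_rate(fund_by_lottery, n_proposals=10_000, n_funded=1_000,
                         rigor_threshold=0.6):
    """Toy model: each proposal has a rigor score in [0, 1]; higher rigor
    lowers the chance that a funded study yields a false discovery."""
    rigor = rng.uniform(0, 1, n_proposals)
    if fund_by_lottery:
        # Modified lottery: random draw among proposals above a rigor bar.
        eligible = np.flatnonzero(rigor >= rigor_threshold)
        funded = rng.choice(eligible, size=n_funded, replace=False)
    else:
        # Status-quo stand-in: funding uncorrelated with rigor.
        funded = rng.choice(n_proposals, size=n_funded, replace=False)
    # Probability of a false discovery falls linearly with rigor.
    p_false = 0.5 * (1 - rigor[funded])
    return rng.binomial(1, p_false).mean()

print("modified lottery FDR:", false_discovery_rate(True))
print("baseline FDR        :", false_discovery_rate(False))
```

Because only above-threshold proposals enter the draw, the funded pool's average rigor rises and the false discovery rate falls, without the funder having to rank proposals on prestige or predicted impact.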
“Open science is on the rise. Across disciplines, there are increasing rates of sharing data, making available underlying materials and protocols, and preregistering studies and analysis plans. Hundreds of services have emerged to support open science behaviors at every stage of the research lifecycle. But, what proportion of the research community is practicing open science? Where is penetration of these behaviors strongest and weakest? Answers to these questions are important for evaluating progress in culture reform and for strategic planning of where to invest resources next.
The hardest part of getting meaningful answers to these questions is quantifying the population that is NOT doing the behaviors. For example, in a recent post, Nici Pfeiffer summarized the accelerating growth of OSF users on the occasion of hitting 150,000 registered users. That number and its non-linear growth suggest cultural movement associated with this one service, but how much movement?…”
“Centralized depositing of materials advances science in so many ways. It saves authors the time and burden of shipping requested materials. Researchers who request from repositories save time by not having to recreate reagents or wait months or years to receive samples. Many scientists have been on the receiving end of a request that was filled by an incorrect or degraded sample, which further delays research. Repositories like the ones recommended by PLOS handle the logistics of material requests, letting the scientists focus on what’s important: doing research….
By encouraging authors to deposit materials at the time of publication, journals will help accelerate research through timely distribution and accurate identification of reagents. Biological repositories exist to serve the scientific community. Take Addgene’s involvement in the explosive advancement of CRISPR research. Since 2012, over 8,400 CRISPR plasmids have been deposited and Addgene has distributed over 144,000 CRISPR plasmids worldwide, enabling researchers to share, modify, and improve this game-changing molecular tool. It is a prime example of the positive impact that biological repositories are making on research….”