Abstract: Recent concerns about the reproducibility of science have led to several calls for more open and transparent research practices and for the monitoring of potential improvements over time. However, with tens of thousands of new biomedical articles published per week, manually mapping and monitoring changes in transparency is unrealistic. We present an open-source, automated approach to identify 5 indicators of transparency (data sharing, code sharing, conflicts of interest disclosures, funding disclosures, and protocol registration) and apply it across the entire open access biomedical literature of 2.75 million articles on PubMed Central (PMC). Our results indicate remarkable improvements in some areas of transparency over time (e.g., conflict of interest [COI] disclosures and funding disclosures) but not in others (e.g., protocol registration and code sharing), and they map transparency across fields of science, countries, journals, and publishers. This work has enabled the creation of a large, integrated, and openly available database to expedite further efforts to monitor, understand, and promote transparency and reproducibility in science.
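The automated indicator detection described in this abstract could, in rough spirit, be sketched as a keyword screen over article full text. This is an illustrative simplification only, not the authors' published method: the pattern names, regexes, and `detect_indicators` function below are all assumptions for demonstration.

```python
import re

# Illustrative regex patterns for the five transparency indicators.
# These are hypothetical simplifications; real classifiers are far more nuanced.
INDICATOR_PATTERNS = {
    "data_sharing": re.compile(
        r"\bdata (are|is) available\b|\bdata availability\b", re.I),
    "code_sharing": re.compile(
        r"\bcode (is|are) available\b|github\.com/", re.I),
    "coi_disclosure": re.compile(
        r"\bconflicts? of interest\b|\bcompeting interests?\b", re.I),
    "funding_disclosure": re.compile(
        r"\bfunded by\b|\bfunding statement\b", re.I),
    "protocol_registration": re.compile(
        r"\bpre-?registered\b|clinicaltrials\.gov", re.I),
}

def detect_indicators(article_text: str) -> dict:
    """Return a True/False flag per transparency indicator for one article."""
    return {name: bool(pattern.search(article_text))
            for name, pattern in INDICATOR_PATTERNS.items()}

# Hypothetical article snippet for demonstration.
sample = ("This work was funded by grant 123. The authors declare no "
          "competing interests. All code is available at github.com/example/repo.")
flags = detect_indicators(sample)
```

Applied at scale, each article would yield one row of boolean flags, which is what makes aggregate mapping across journals, countries, and years tractable.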
Abstract: The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While several social science disciplines are undergoing a “credibility revolution” focused on openness and replication, policy analysis has yet to systematically embrace transparency and reproducibility. We argue that policy analysis should adopt the open research practices increasingly espoused in related disciplines to advance the credibility of evidence-based policymaking. We first discuss the importance of evidence-based policy in an era of increasing disagreement about facts, analysis, and expertise. We then present a novel framework for “open” policy analysis (OPA) and explain how to achieve it, focusing on examples of recent policy analyses that have incorporated open research practices such as transparent reporting, open data, and code sharing. We conclude with recommendations on how key stakeholders in evidence-based policy can make OPA the norm and thus safeguard trust in using empirical evidence to inform important public policy decisions.
Science and data are interwoven in many ways. The scientific method has lent much of its overall approach and practices to data-driven analytics, software development, and data science. Now data science and software development lend tools back to scientific research.
“Our goal is universal and immediate open sharing of all scientific knowledge and outputs. With our Open Science program, we empower more people to engage in research practices that accelerate the pace, robustness, and reproducibility of science through partnerships, policies, and grants. Helping scientists build on each other’s work can dramatically accelerate the pace of discovery, and in turn, our understanding of health and disease.
We support our grantees and the broader scientific community to deposit software code to open repositories, make experimental protocols openly accessible, and submit manuscripts to preprint servers to communicate results more quickly….”
“What are the publishing requirements of ASAP [Aligning Science Across Parkinson’s] and Plan S?
All ASAP-funded researchers will follow the basic tenets of OA publication set forth in Plan S, as follows:
Immediate free access: Peer-reviewed, author-accepted research must be made freely available immediately upon publication, without any embargo period (zero embargo).
Unrestricted reuse rights:
ASAP-funded authors or their institutions must retain the copyright for their research articles unless they are published in the public domain.
Articles must be published under the Creative Commons Attribution license (CC BY 4.0), under the CC0 license (which does not require attribution), or under an equivalent license. Both licenses permit reuse of the material without restriction….”
“The Dryad and Zenodo teams are proud to announce the launch of our first formal integration. As we’ve noted over the last few years, we believe that the best way to support the broad scientific community in publishing their outputs is to leverage each other’s strengths and build together. Our plan has always been to find ways to seamlessly connect software publishing and data curation in ways that are easy enough that the features will be used, and also beneficial to the researchers reusing and building on scientific discoveries. This month, we’ve released our first set of features to support exactly that….”
“ESA has adopted a society-wide open research policy for its publications to further support scientific exploration and preservation, allow a full assessment of published research, and streamline policies across our family of journals. An open research policy provides full transparency for scientific data and code, facilitates replication and synthesis, and aligns ESA journals with current standards. As of Feb. 1, 2021, all new manuscript submissions to ESA journals must abide by the following policy:
As a condition for publication in ESA journals, all underlying data and statistical code pertinent to the results presented in the publication must be made available in a permanent, publicly accessible data archive or repository, with rare exceptions (see “Details” for more information). Archived data and statistical code should be sufficiently complete to allow replication of the tables, graphs, and statistical analyses reported in the original publication, and to support new analyses or meta-analyses. As such, the desire of authors to control additional research with these data and/or code shall not be grounds for withholding material. …”
“The Community-led Open Publication Infrastructures for Monographs project (COPIM) has today released the code originally written for their Opening the Future initiative, which collects and processes library signups. This release makes the software freely available for any publisher to adapt and use themselves – it is a generic signup system for open-access projects that have consortial membership models….”
Abstract: Scientific software registries and repositories serve various roles in their respective disciplines. These resources improve software discoverability and research transparency, provide information for software citations, and foster preservation of computational methods that might otherwise be lost over time, thereby supporting research reproducibility and replicability. However, developing these resources takes effort, and few guidelines are available to help prospective creators of registries and repositories. To address this need, we present a set of nine best practices that can help managers define the scope, practices, and rules that govern individual registries and repositories. These best practices were distilled from the experiences of the creators of existing resources, convened by a Task Force of the FORCE11 Software Citation Implementation Working Group during the years 2019-2020. We believe that putting in place specific policies such as those presented here will help scientific software registries and repositories better serve their users and their disciplines.