Abstract: Recent concerns about the reproducibility of science have led to several calls for more open and transparent research practices and for the monitoring of potential improvements over time. However, with tens of thousands of new biomedical articles published per week, manually mapping and monitoring changes in transparency is unrealistic. We present an open-source, automated approach to identify 5 indicators of transparency (data sharing, code sharing, conflicts of interest disclosures, funding disclosures, and protocol registration) and apply it across the entire open access biomedical literature of 2.75 million articles on PubMed Central (PMC). Our results indicate remarkable improvements in some (e.g., conflict of interest [COI] disclosures and funding disclosures), but not other (e.g., protocol registration and code sharing) areas of transparency over time, and map transparency across fields of science, countries, journals, and publishers. This work has enabled the creation of a large, integrated, and openly available database to expedite further efforts to monitor, understand, and promote transparency and reproducibility in science.
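The abstract describes automated detection of five transparency indicators across millions of PMC articles. As a rough illustration of how such indicator detection can work, here is a minimal keyword-matching sketch in Python; the patterns and function names are my own assumptions for illustration, not the published tool's actual algorithm:

```python
import re

# Simplified, illustrative patterns for the five transparency indicators.
# These are assumptions for demonstration, not the authors' validated rules.
INDICATOR_PATTERNS = {
    "data_sharing": re.compile(r"data (are|is) (openly |publicly )?available", re.I),
    "code_sharing": re.compile(r"code (is|are) available|github\.com/\S+", re.I),
    "coi_disclosure": re.compile(r"conflicts? of interest|competing interests?", re.I),
    "funding_disclosure": re.compile(r"funded by|supported by", re.I),
    "registration": re.compile(r"preregistered|clinicaltrials\.gov|NCT\d{8}", re.I),
}

def detect_indicators(article_text: str) -> dict:
    """Return a True/False flag for each transparency indicator in one article."""
    return {name: bool(pat.search(article_text))
            for name, pat in INDICATOR_PATTERNS.items()}
```

Applied across a corpus, per-article flags like these can be aggregated by year, field, country, journal, or publisher to produce the kind of trend maps the abstract reports.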
“NERL members are among the most prestigious and productive research institutions in the United States, with researchers at NERL-affiliated institutions producing an estimated 10-12% of the most important and impactful scholarship in the world. We are committed to leveraging our influence to achieve global sustainability, parity, and access in scholarly publishing. Ensuring a sustainable ecosystem for scholarly communications is crucial across our institutions for impact, access, and preservation. When we say we demand a better deal, we mean more than a good price. In keeping with NERL’s support for The MIT Framework for Publisher Contracts, we are committed to contracts that allow for maximum flexibility and options for researchers. As partners in the scholarly communication ecosystem, publishers and libraries share in the challenges of unprecedented health and economic crises, and our shared priority must be opening access to scholarship as our best way of supporting solutions to those crises….”
“The NERL Consortium issued a statement, “NERL Demands a Better Deal,” articulating the values NERL will adopt in negotiating agreements with publishers. The statement, which originated in the NERL Program Council and which has generated broad support across the NERL community, outlines the following core values in service to an open, equitable, and healthy academic publishing ecosystem:
Transparency: NERL commits to transparency of the negotiating process and will share details of discussions, outcomes, and cost whenever possible to demonstrate leadership for academic libraries. We commit to demanding transparency from our vendor partners and will prioritize vendor partners who honor this commitment.
Sustainability: NERL negotiates for terms that ensure greater sustainability, pursuing opportunities to support collective infrastructure and collective ownership. We prioritize agreements that move past historical pricing models and precedent. We encourage smarter, better, and often smaller deals that do not increase cost with unrequested content while providing clear and transparent pricing models.
Equity: NERL negotiates for terms that support the rights of all researchers to participate in the scholarly communications ecosystem as knowledge creators; to do so requires partnership between libraries and publishers to eliminate barriers. We work to ensure that costs to researchers and institutions are aligned with the costs of publishing, so everyone has access to open access publishing.
Reproducibility: NERL agreements uphold Author’s Rights, ensuring no forced copyright transfer from author to publisher, computational rights for researchers to use articles in text mining or other practices, and the right to deposit articles in institutional repositories.
Flexibility: We will encourage and prioritize NERL Agreements that incentivize emerging, efficient, and sustainable business models. We seek meaningful and creative alternatives that support the dissemination and preservation of the scholarly record. …”
“The new Center for Open and REproducible Science (CORES) aims to develop and nurture transparency and reproducibility in the collection, analysis, and dissemination of data across all domains of scientific activity. The Center will focus on two core objectives. The first is to develop resources and support activities that promote the adoption of open science practices at Stanford and beyond. The second is to foster methodological innovations that can enhance the adoption and effectiveness of open science practices….”
“Dozens of professors from Stanford’s science, engineering and humanities departments came together last week to launch the Center for Open and REproducible Science, an initiative that seeks to increase the transparency, reproducibility and openness of science.
The Center, also known as CORES, is encouraging early adoption of open science practices at Stanford, which include data sharing and study pre-registration. Eventually, it hopes to become the “gold standard” for open science, a fundamental shift that makes science more inclusive by emphasizing accessibility and dissemination of data, methods and tools, rather than just results….”
Abstract: Background: Open data on the locations and services provided by health facilities have, in some countries, allowed the development of software tools contributing to COVID-19 response. The UN and WHO encourage countries to make health facility location data open, to encourage use and improvement. We provide a summary of open access health facility location data in Africa using reusable R code. We aim to support data analysts developing software tools to address COVID-19 response in individual countries. In Africa there are currently three main sources of such open data: 1) direct from national ministries of health, 2) a database for sub-Saharan Africa collated and published by a team from the KEMRI-Wellcome Trust Research Programme and now hosted by WHO, and 3) the Global Healthsites Mapping Project in collaboration with OpenStreetMap.
Methods: We searched for and documented official national facility location data that were openly available. We developed reusable open-source R code to summarise and visualise facility location data by country from the three sources. This reusable code is used to provide a web user interface allowing data exploration through maps and plots of facility type.
Results: Out of 52 African countries, seven currently provide an official open facility list that can be downloaded and analysed reproducibly. Considering all three sources, there are over 185,000 health facility locations available for Africa. However, there are differences and overlaps between sources and a lack of data on capacities and service provision.
Conclusions: These summaries and software tools can be used to encourage greater use of existing health facility location data, incentivise further improvements in the provision of those data by national suppliers, and encourage collaboration within wider data communities. The tools are a part of the afrimapr project, actively developing R building blocks to facilitate the use of health data in Africa.
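The Results note both overlaps between the three sources and differences in coverage. A minimal sketch of the kind of per-country summary involved (the project's actual tools are in R; this Python fragment uses invented source names and counts purely for illustration):

```python
# Hypothetical record counts per source per country; names and numbers
# are invented for illustration, not taken from the afrimapr datasets.
facility_sources = {
    "ministry_of_health": {"Kenya": 10000, "Malawi": 1500},
    "kemri_wellcome_who": {"Kenya": 9500, "Malawi": 1400, "Chad": 800},
    "healthsites_osm": {"Kenya": 2000, "Chad": 300},
}

def summarise_by_country(sources: dict) -> dict:
    """Total facility records per country across all sources.

    Overlapping records between sources are NOT deduplicated, which is
    one reason raw totals across sources can overstate true coverage.
    """
    totals: dict = {}
    for counts in sources.values():
        for country, n in counts.items():
            totals[country] = totals.get(country, 0) + n
    return totals
```

Such summaries make the gaps visible: a country present in only one source, or a large mismatch between sources, flags where national open data or deduplication effort is most needed.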
“Institutions are committing to working together to determine how their cultural practices, such as emphasizing the importance of novelty, discovery and priority, undermine the value of replication, verification and transparency. That is the goal of the UK Reproducibility Network, which I co-founded earlier this year. It started as informal groups of researchers at individual institutions that met with representatives from funders and publishers (including Nature) who were open to discussions about how best to align open-science initiatives — reproducibility sections in grant applications and reporting checklists in article submissions, for example. Now institutions themselves are cooperating to consider larger changes, from training to hiring and promotion practices….
Our ten university members span the United Kingdom from Aberdeen to Surrey, and we expect that list to grow. Each will appoint a senior academic to focus on research quality and improvement. Figuring out which system-level changes are needed and how to make them happen will now be someone’s primary responsibility, not a volunteer activity. What changes might ensue? Earlier this year, the University of Bristol, where I work, made the use of data sharing and other open-research practices an explicit criterion for promotion….
But these cultural changes might falter. Culture eats strategy for breakfast — grand plans founder on the rocks of implicit values, beliefs and ways of working. Top-down initiatives from funders and publishers will fizzle out if they are not implemented by researchers, who review papers and grant proposals. Grass-roots efforts will flourish only if institutions recognize and reward researchers’ efforts.
Funders, publishers and bottom-up networks of researchers have all made strides. Institutions are, in many ways, the final piece of the jigsaw. Universities are already investing in cutting-edge technology and embarking on ambitious infrastructure programmes. Cultural change is just as essential to long-term success.”
Abstract: Research indicating many study results do not replicate has raised questions about the credibility of science and prompted concerns about a potential reproducibility crisis. Moreover, most published research is not freely accessible, which limits the potential impact of science. Open science, which aims to make the research process more open and reproducible, has been proposed as one approach to increase the credibility and impact of scientific research. Although relatively little attention has been paid to open science in relation to single-case design, we propose that open-science practices can be applied to enhance the credibility and impact of single-case design research. In this paper, we discuss how open-science practices align with other recent developments in single-case design research, describe four prominent open-science practices (i.e., preregistration, registered reports, data and materials sharing, and open access), and discuss potential benefits and limitations of each practice for single-case design.