Abstract: Open Science is creating new forms of scientific interaction that were impossible or undreamed of in an earlier age. This has a strong impact on core academic processes like research, education and innovation. It is, for instance, easier to replicate an experiment if the relevant data sets are digitally available to any scientist who wishes to corroborate her colleague’s findings. TU Delft has a long history of engagement with Open Science. Yet, with its Open Science Programme 2020-2024, Research and Education in the Open Era, TU Delft wishes to take Open Science to the next level: a situation in which Open Science has become the default way of practising research and education, and the “information era” has become the “open era”. It is TU Delft’s ambition to be a frontrunner in this revolutionary process. This is reflected in the TU Delft Strategic Framework 2018-2024, with “openness” as one of its major principles. The TU Delft Open Science Programme 2020-2024 tackles all areas of scholarly engagement where restrictions limit the flow of academic knowledge. It proposes new approaches to the process of research, education and innovation, with a strong focus on transparency, integrity and efficiency. The programme consists of five interrelated projects: Open Education, Open Access, Open Publishing Platform, FAIR Data, and FAIR Software. The projects are aimed at creating and disseminating various types of resources for the benefit of TU Delft researchers, teachers and students, as well as the general public. They will range from educational materials and software to a publishing platform. All outputs of the programme will be as ‘FAIR’ as possible: findable, accessible, interoperable and reusable.
Abstract: The ACcess to Transparent Statistics (ACTS) call to action assembles four measures that are rapidly achievable by journals and funding agencies to enhance the quality of statistical reporting. The ACTS call to action is an appeal for concrete actions from institutions that should spearhead the battle for reproducibility.
Abstract: Open access policies have been progressing since the beginning of this century. Important global initiatives, both public and private, have set the tone for what we understand by open access. The emergence of tools and web platforms for open access (both legal and illegal) have placed the focus of the discussion on open access to knowledge, both for academics and for the general public, who finance such research through their taxes, particularly in Latin America. This historically overlooked issue must, we believe, be debated publicly, given the characteristics of the Latin American scientific community, as well as its funding sources. This article includes an overview of what is meant by open access and describes the origins of the term, both in its philosophical sense and in its practical sense, expressed in the global declarations of Berlin and Bethesda. It also includes the notion of open access managed (or not) by some reputable institutions in Chile, such as CONICYT (National Commission for Scientific and Technological Research) and nationally reputed higher education institutions, such as the Universidad de Chile and Pontificia Universidad Católica de Chile. Various Latin American initiatives related to open access (SciELO, Redalyc, among others) are described, as well as the presence of Chilean documents in those platforms. The national institutional repositories are listed, along with their current status, and we discuss what open access has implied in Latin America and its importance for the replicability of the investigations carried out locally. Finally, we describe some governmental initiatives (mainly legislative) at the Latin American level and propose some recommendations regarding the promotion and implementation of repositories for access to the scientific data of national research (for access and replication purposes).
Abstract: Scholarly publishers can help to increase data quality and reproducible research by promoting transparency and openness. Increasing transparency can be achieved by publishers in six key areas: (1) understanding researchers’ problems and motivations, by conducting and responding to the findings of surveys; (2) raising awareness of issues and encouraging behavioural and cultural change, by introducing consistent journal policies on sharing research data, code and materials; (3) improving the quality and objectivity of the peer-review process by implementing reporting guidelines and checklists and using technology to identify misconduct; (4) improving scholarly communication infrastructure with journals that publish all scientifically sound research, promoting study registration, partnering with data repositories and providing services that improve data sharing and data curation; (5) increasing incentives for practising open research with data journals and software journals and implementing data citation and badges for transparency; and (6) making research communication more open and accessible, with open-access publishing options, permitting text and data mining and sharing publisher data and metadata and through industry and community collaboration. This chapter describes practical approaches being taken by publishers, in these six areas, their progress and effectiveness and the implications for researchers publishing their work.
“Scientific software often requires installing, navigating and troubleshooting a byzantine network of computational ‘dependencies’ — the code libraries and tools on which each software module relies. Some have to be compiled from source code or configured just so, and an installation that should take a few minutes can degenerate into a frustrating online odyssey through websites such as Stack Overflow and GitHub. “One of the hardest parts of reproducibility is getting your computer set up in exactly the same way as somebody else’s computer is set up. That is just ridiculously difficult,” says Kirstie Whitaker, a neuroscientist at the Alan Turing Institute in London….”
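The environment-setup problem described in this excerpt is commonly addressed by containerizing the analysis, so that collaborators rebuild the same stack instead of reinstalling dependencies by hand. A minimal sketch of the idea (the image tag, package versions and script name are illustrative assumptions, not from the article):

```dockerfile
# Pin the base image so every rebuild starts from the same OS and Python layer
FROM python:3.11-slim

# Pin exact library versions so collaborators resolve identical dependencies
RUN pip install --no-cache-dir numpy==1.26.4 pandas==2.2.2

# Copy the analysis script and make it the default entry point
WORKDIR /app
COPY analysis.py /app/analysis.py
CMD ["python", "analysis.py"]
```

Anyone with the Dockerfile can then run `docker build` and `docker run` to reproduce the computation without touching their own machine's configuration, which is precisely the setup-reproducibility gap Whitaker describes.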
“The crisis of reproducibility in science is well known. The combination of ‘publish or perish’ incentives, secrecy around data and the drive for novelty at all costs can result in fragile advances and lots of wasted time and money. Even in data science, when a paper is published there is generally no way for an outsider to verify its results, because the data from which the findings were derived are not available for scrutiny. Such science cannot be built upon very easily: siloed science is slow science.
That’s one of the reasons funders and publishers are beginning to require that publications include access to the underlying data and analysis code. It’s clear that this new era of data science needs a new cultural and practical approach, one which embraces openness and collaboration more than ever before. To this end, a group of Turing researchers have created The Turing Way – an evolving online “handbook” on how to conduct world-leading, reproducible research in data science and artificial intelligence….”
From Google’s English: “Established indicators for research and innovation processes have so far insufficiently covered open science and open innovation. As a result, their opportunities and risks often remain obscure. A new discussion paper therefore makes proposals for extending existing indicators and developing new ones. We looked at possible innovations in the field of open science….”
“Journal practices (other than OA) promoting Open Science goals (relevance, reproducibility, efficiency, transparency):

Early, full and reproducible content
- preregistration – use preregistrations in the review process
- registered reports – apply peer review to the preregistration prior to the study and publish results regardless of outcomes
- preprint policy – liberally allow preprinting in any archive without license restrictions
- data/code availability – foster or require open availability of data and code for reviewers and readers
- TDM allowance – allow unrestricted TDM of full text and metadata for any use
- null/negative results – publish regardless of outcome

Machine readable ecosystem
- data/code citation – promote citation and use standards
- persistent IDs – e.g. DOI, ORCID, ROR, Open Funder Registry, grant IDs
- licenses (in Crossref) – register (open) licenses in Crossref
- contributorship roles – credit all contributors for their part in the work
- open citations – make citation information openly available via Crossref
- open peer review – e.g. open reports and open identities
- peer review criteria – evaluate methodological rigour and reporting quality only, or also judge expected relevance or impact?
- rejection rates – publish rejection rates and reconsider high selectivity
- post-publication peer review – publish immediately after a sanity check and let peer review follow?
- author diversity – age, position, gender, geography, ethnicity, colour
- reviewer diversity – age, position, gender, geography, ethnicity, colour
- editor diversity – age, position, gender, geography, ethnicity, colour

Metrics and DORA
- DORA: journal metrics – refrain from promoting
- DORA: article metrics – provide a range and use responsibly…”
“Question What percentage of clinical trials published in high-impact journals in 2017 generated evidence that could feasibly be replicated using observational methods and data sources?
Findings In this cross-sectional study of 220 clinical trials published in high-impact journals in 2017, only 15% could feasibly be replicated using currently available real-world data sources.
Meaning This study suggests that, although the increasing use of real-world evidence in medical research presents opportunities to supplement or even replace some clinical trials, observational methods are not likely to obviate the need for traditional clinical trials….”
Abstract: In the context of open science, the availability of research materials is essential for knowledge accumulation and to maximize the impact of scientific research. In microbiology, microbial domain biological resource centers (mBRCs) have long-standing experience in preserving and distributing authenticated microbial strains and genetic materials (e.g., recombinant plasmids and DNA libraries) to support new discoveries and follow-on studies. These culture collections play a central role in the conservation of microbial biodiversity and have expertise in cultivation, characterization, and taxonomy of microorganisms. Information associated with preserved biological resources is recorded in databases and is accessible through online catalogues. Legal expertise developed by mBRCs guarantees end users the traceability and legality of the acquired material, notably with respect to the Nagoya Protocol. However, awareness of the advantages of depositing biological materials in professional repositories remains low, and the necessity of securing strains and genetic resources for future research must be emphasized. This review describes the unique position of mBRCs in microbiology and molecular biology through their history, evolving roles, expertise, services, challenges, and international collaborations. It also calls for an increased deposit of strains and genetic resources, a responsibility shared by scientists, funding agencies, and publishers. Journal policies requesting a deposit during submission of a manuscript represent one of the measures to make more biological materials available to the broader community, hence fully releasing their potential and improving openness and reproducibility in scientific research.