“The NIH encourages investigators to use interim research products, such as preprints, to speed the dissemination and enhance the rigor of their work. This notice clarifies reporting instructions to allow investigators to cite their interim research products and claim them as products of NIH funding….”
“Science should not, and need not, be shackled by journal publication. Three sensible reforms would ensure that researchers’ results could be communicated to more people more quickly, without any compromise on quality. Step one is for the organisations that finance research to demand that scientists put their academic papers, along with their experimental data, in publicly accessible ‘repositories’ before they are sent to a journal. That would allow other researchers to make use of the findings without delay. Those opposed to such ‘preprints’ argue that they allow shoddy work to proliferate because it has not yet been peer-reviewed. That may surprise physicists and mathematicians, who have been posting work to arXiv, a preprint repository, for more than 25 years with no ill effects. After peer review, research should also be freely available for all to read. Too much science, much of it paid for from the public purse, languishes behind paywalls.
Step two is to improve the process of peer review itself. Journals currently administer a system of organising anonymous peer reviewers to pass judgment on new research—a fact they use, in part, to justify their hefty subscription prices. But this murky process is prone to abuse. At its worst, cabals of researchers are suspected of guaranteeing favourable reviews for each other’s work. Better that reviewers are named and that the reviews themselves are published. The Gates Foundation has announced its support for an online repository where such open peer review of papers takes place. The repository was launched last year by the Wellcome Trust, meaning that the world’s two largest medical charities have thrown their weight behind it. Others should follow.
Fight for your right
Finally, science needs to stop relying so much on journal publication as the only recognised credential for researchers and the only path to career progression. Tools exist that report how often a preprint has been viewed, for example, or whether a clinical data set has been cited in guidelines for doctors. A handful of firms are using artificial intelligence to assess the scientific importance of research, irrespective of how it has been disseminated. Such approaches need encouragement. Journals may lose out, but science itself will benefit.”
ON JANUARY 1st the Bill & Melinda Gates Foundation did something that may help to change the practice of science. It brought into force a policy, foreshadowed two years earlier, that research it supports (it is the world’s biggest source of charitable money for scientific endeavours, to the tune of some $4bn a year) must, when published, be freely available to all. On March 23rd it followed this up by announcing that it will pay the cost of putting such research in one particular repository of freely available papers.
“One of the world’s wealthiest charities, the Bill & Melinda Gates Foundation in Seattle, Washington, is set to launch its own open-access publishing venture later this year. The initiative, Gates Open Research, was announced on 23 March and will be modelled on a service begun last year by the London-based biomedical charity, the Wellcome Trust. Like that effort, the Gates Foundation’s platform is intended to accelerate the publication of articles and data from research funded by the charity.”
“One of the world’s biggest funders of scientific research is to establish an open access platform that will allow its grant winners to publish their findings, in a move that could be swiftly followed by the European Commission…. Initiative will emulate Wellcome Trust’s publishing model, with European Commission set to follow”
“While it is illegal to destroy government data, removing data from accessible agency websites can effectively impede accessibility. Revising websites or creating other barriers to the underlying information can make it very difficult to find vital information. Also, much of the scientific information painstakingly collected over past decades, and costing hundreds of billions of dollars, remains held only by the government, and it is distributed through thousands of servers in hundreds of federal departments where it might not be backed up, making it difficult or impossible to find. Once information becomes sequestered, it becomes nearly impossible to know what has been lost if one doesn’t know what originally was there.
Thus, there is growing anxiety among many scientists who rely on the vast cache of data housed on government servers that key data may become sequestered or unavailable for public access. Many researchers further fear a crusade by the Trump administration against the scientific information provided to the public; the National Centers for Environmental Information may be one federal agency especially vulnerable to having vital information sequestered or removed from ready access. The proposed deep budget cuts for several government agencies have added to the fears of important databases being selectively reduced or removed….”
“The solution to the scientific reproducibility crisis is to move towards Open Research – the idea that scientific knowledge of all kinds should be openly shared as early as it is practical in the discovery process. We need to reward the publication of research outputs along the entire process, rather than just each journal article as it is published.”
“Open access to qualified research data is a precondition for the reproducibility, verification and falsification of data for further scientific and practical purposes. Hence, the FWF, supported by the Nationalstiftung für Forschung, Technologie und Entwicklung, has initiated the pilot programme Open Research Data (ORD) in order to create role models for the openness of research data in the digital age.
Open research data is defined as data produced in the course of research projects by experiments, source research, measurements, excavations, surveys or software developments, and which are, following the FAIR Data Principles, findable, openly accessible, interoperable and re-usable. In January 2016, the FWF issued a call for expressions of interest for the pilot programme Open Research Data (ORD). 48 letters of interest were submitted. Based on the decision of the FWF Board in May 2016, 47 of those were invited to submit a full proposal. By July 2016, the FWF had received 41 full proposals: 19 in the Humanities & Social Sciences and eleven each in the Natural Sciences and the Life Sciences. After international peer review according to the FWF’s rigorous procedures, twelve projects were funded:
List of projects
The high share of projects from the Humanities is especially remarkable. On the basis of this experience, the FWF will implement guidelines in all programmes to help increase the openness of research data. Furthermore, with the Synthesis Networks programme, a new initiative is planned that will enable international projects to combine, process and analyse large datasets in order to answer questions of high relevance to science and society.”
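The FAIR principles the FWF invokes are abstract, but they map naturally onto concrete metadata fields: an identifier for findability, a retrieval point for accessibility, a standard format for interoperability, and a licence for reusability. As a minimal illustrative sketch only (the field names and checks below are our own choices, not an FWF or FAIR specification), a dataset record might be screened like this:

```python
# Illustrative sketch: the four FAIR principles mapped onto a minimal
# metadata record. Field names here are hypothetical, not a standard schema.
FAIR_CHECKS = {
    "findable": lambda r: bool(r.get("persistent_id")),   # e.g. a DOI
    "accessible": lambda r: bool(r.get("access_url")),    # open retrieval point
    "interoperable": lambda r: r.get("format") in {"csv", "json", "netcdf"},
    "reusable": lambda r: bool(r.get("license")),         # e.g. CC-BY
}

def fair_report(record):
    """Return which FAIR principles a metadata record satisfies."""
    return {name: check(record) for name, check in FAIR_CHECKS.items()}

# Hypothetical example record (identifier and URL are made up).
record = {
    "persistent_id": "doi:10.1234/example",
    "access_url": "https://repository.example/dataset/42",
    "format": "csv",
    "license": "CC-BY-4.0",
}
print(fair_report(record))  # all four checks pass for this record
```

Real repositories express the same idea through richer, standardised metadata (for instance DataCite-style records), but the point stands: each FAIR principle is checkable against concrete fields.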
An interview with neuroscientist Jeremy Freeman, manager of computational biology at the Chan Zuckerberg Initiative (CZI). Quoting Freeman: “We [at CZI] are absolutely committed to open dissemination of data and code and knowledge; that’s something I have been committed to all the time I’ve been a scientist.”