Chance discovery of forgotten 1960s ‘preprint’ experiment

“For years, scientists have complained that it can take months or even years for a scientific discovery to be published, because of the slowness of peer review. To cut through this problem, researchers in physics and mathematics have long used ‘preprints’ – preliminary versions of their scientific findings published on internet servers for anyone to read. In 2013, similar services were launched for biology, and many scientists now use them. This is traditionally viewed as an example of biology finally catching up with physics, but following a chance discovery in the archives of Cold Spring Harbor Laboratory, Matthew Cobb, a scientist and historian at the University of Manchester, has unearthed a long-forgotten experiment in biology preprints that took place in the 1960s, and has written about it in a study published 16 November in the open access journal PLOS Biology.”

The original preprint system was scientists sharing photocopies

“The movement to make biology papers freely available before they have been peer-reviewed, let alone published in a reputable journal, finally succeeded in 2013, when bioRxiv (pronounced bio-archive) was launched by Cold Spring Harbor Laboratory. But 50 years before, the National Institutes of Health tried something similar: distributing unpublished scientific papers, or preprints, to a handpicked group of leading researchers.”

The Center for Open Science and MarXiv Launch Branded Preprint Service

The Center for Open Science (COS) and MarXiv have launched a new branded preprint service, sources for both organizations announced today. The new service, called MarXiv, provides a free, open-access, open-source archive for the ocean conservation and marine climate sciences.

Free the Science

“Free the Science is The Electrochemical Society’s initiative to move toward a future that embraces open science to further advance research in our fields. This is a long-term vision for transformative change in the traditional models of communicating scholarly research….ECS publishes over a third of its journal articles as open access. Other ECS programs that advance the shift to open science include a preprint server through a partnership with the Center for Open Science, enhanced research dissemination with Research4Life, ECS OpenCon, and expanding our publications to include more research in data sciences….”

Is the Center for Open Science a Viable Alternative for Elsevier? – Enago Academy

“Data management has become an increasingly discussed topic among the academic community. Managing data is an element of open science, which has proven to increase dissemination of research and citations for journal articles. Open science increases public access to academic articles, mostly through preprint repositories. Indeed, according to this study, open access (OA) articles are associated with a 36-172% increase in citations compared to non-OA articles. Publishers such as Elsevier have acquired preprint repositories to increase the dissemination of academic research.”

Green Open Access: An Imperfect Standard – Politics, Distilled

“In my last post on the lack of accessibility of Gold Open Access for early career researchers (ECRs), I mentioned that in my opinion Green Open Access was a very imperfect solution – in fact, hardly a solution at all. I expand here on why that is the case, and why a focus on green OA presents new challenges for publication practices which compound the – already many – challenges of moving towards a greater accessibility of research.

Not all OA initiatives are equal. Green Open Access, by far the commonest kind, refers to the depositing of a non-final version of the published manuscript into a research repository – generally either an institutional repository (managed by the university with which the researcher is affiliated), a subject-specific repository (such as ArXiv/SocArXiv), an academic networking website such as Academia.edu, ResearchGate, or Mendeley, or a personal website. Various publishers have rules on what version can be posted where and when, with the most common being that accepted manuscripts (after peer review, but before proofreading and typesetting) can be made public in repositories after an embargo period, while the “version of record” – the published version – may not be shared publicly for free. The published article remains accessible only with paid access (with publishers either explicitly authorizing (SAGE) or tacitly tolerating the private sharing of full articles).”

Peer review: the end of an error?

“It is not easy to have a paper published in the Lancet, so Wakefield’s paper presumably underwent a stringent process of peer review. As a result, it received a very strong endorsement from the scientific community. This gave a huge impetus to anti-vaccination campaigners and may well have led to hundreds of preventable deaths. By contrast, the two mathematics preprints were not peer reviewed, but that did not stop the correctness or otherwise of their claims being satisfactorily established.

An obvious objection to that last sentence is that the mathematics preprints were in fact peer-reviewed. They may not have been sent to referees by the editor of a journal, but they certainly were carefully scrutinized by peers of the authors. So to avoid any confusion, let me use the phrase “formal peer review” for the kind that is organized by a journal and “informal peer review” for the less official scrutiny that is carried out whenever an academic reads an article and comes to some sort of judgement on it. My aim here is to question whether we need formal peer review. It goes without saying that peer review in some form is essential, but it is much less obvious that it needs to be organized in the way it usually is today, or even that it needs to be organized at all.

What would the world be like without formal peer review? One can get some idea by looking at what the world is already like for many mathematicians. These days, the arXiv is how we disseminate our work, and the arXiv is how we establish priority. A typical pattern is to post a preprint to the arXiv, wait for feedback from other mathematicians who might be interested, post a revised version of the preprint, and send the revised version to a journal. The time between submitting a paper to a journal and its appearing is often a year or two, so by the time it appears in print, it has already been thoroughly assimilated. Furthermore, looking up a paper on the arXiv is much simpler than grappling with most journal websites, so even after publication it is often the arXiv preprint that is read and not the journal’s formatted version. Thus, in mathematics at least, journals have become almost irrelevant: their main purpose is to provide a stamp of approval, and even then one that gives only an imprecise and unreliable indication of how good a paper actually is….

An alternative system would almost certainly not be perfect, but to insist on perfection, given the imperfections of the current system, is nothing but status quo bias. To guard against this, imagine that an alternative system were fully established and see whether you can mount a convincing argument for switching to what we have now, where all the valuable commentary would be hidden away and we would have to pay large sums of money to read each other’s writings. You would be laughed out of court.”

Impact of Social Sciences – The next stage of SocArXiv’s development: bringing greater transparency and efficiency to the peer review process

“Almost 1,500 papers have been uploaded to SocArXiv since its launch last year. Up to now the platform has operated alongside the peer-review journal system rather than seriously disrupting it. Looking ahead to the next stage of its development, Philip Cohen considers how SocArXiv might challenge the peer review system to be more efficient and transparent, firstly by confronting the bias that leads many who benefit from the status quo to characterise mooted alternatives as extreme. The value and implications of openness at the various decision points in the system must be debated, as should potentially more disruptive innovations such as non-exclusive review and publication or crowdsourcing reviews.”