Elsevier charge $37.95 for access to an unformatted manuscript with intrusive watermarking and the illustrations removed | Sauropod Vertebra Picture of the Week

“It’s not, though. Because not only is this paper behind a paywall in Elsevier’s journal Cretaceous Research, but the paywalled paper is what they term a “pre-proof” — a fact advertised in a tiny font halfway down the page rather than in giant red letters at the top.

“Pre-proof” is not a term in common usage. What does it mean? It turns out to be an unformatted, double-spaced, and line-numbered manuscript. In other words, this is an AAM (author’s accepted manuscript) of the kind that the authors could have deposited in their institutional repository for anyone to read for free.

But wait — there’s more! By way of “added value”, Elsevier have slapped a big intrusive “journal pre-proof” watermark across the middle of every single page, to make it even less readable than a double-spaced line-numbered manuscript already is….”

Guest Post – Putting Publications into Context with the DocMaps Framework for Editorial Metadata – The Scholarly Kitchen

“Trust in academic journal articles is based on similar expectations. Journals carry out editorial processes from peer review to plagiarism checks. But these processes are highly heterogeneous in how, when, and by whom they are undertaken. In many cases, it’s not always readily apparent to the outside observer that they take place at all. And as new innovations in peer review and the open research movement lead to new experiments in how we produce and distribute research products, understanding what events take place is an increasingly important issue for publishers, authors, and readers alike.

With this in mind, the DocMaps project (a joint effort of the Knowledge Futures Group, ASAPbio, and TU Graz, supported by the Howard Hughes Medical Institute), has been working with a Technical Committee to develop a machine-readable, interoperable and extensible framework for capturing valuable context about the processes used to create research products such as journal articles. This framework is being designed to capture as much (or little) contextual data about a document as desired by the publisher: from a minimum assertion that an event took place, to a detailed history of every edit to a document….”
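To make the idea of "capturing as much (or little) contextual data as desired" concrete, here is a purely illustrative sketch of what a minimal machine-readable editorial-event record in the DocMaps spirit might look like. The field names and structure below are hypothetical and do not reproduce the official DocMaps schema (which is JSON-LD based); they only show the general shape of a publisher asserting that a review event took place.

```python
# Hypothetical sketch of a DocMaps-style editorial-event record.
# Field names are illustrative only, NOT the official DocMaps schema.

minimal_docmap = {
    "type": "docmap",
    "publisher": {"name": "Example Journal"},  # who asserts the events
    "steps": [
        {
            # the document the step acted on (DOI is a placeholder)
            "inputs": [{"doi": "10.1234/example.preprint.v1"}],
            "actions": [
                {
                    "participants": [{"role": "peer-reviewer"}],
                    "outputs": [{"type": "review", "published": "2021-03-01"}],
                }
            ],
            # the minimum useful assertion: this event happened
            "assertions": [{"status": "peer-reviewed"}],
        }
    ],
}

def asserted_statuses(docmap):
    """Collect the editorial statuses a publisher asserts about a document."""
    return [a["status"] for step in docmap["steps"] for a in step["assertions"]]

print(asserted_statuses(minimal_docmap))  # ['peer-reviewed']
```

The point of such a structure is the spectrum the excerpt describes: a record could carry only the single `assertions` entry, or a long list of `steps` tracing every edit.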

Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”

How the world is adapting to preprints

“A certain trend emerged from the preprint discussions. Praise for preprints and their virtues was reliably bracketed by an acknowledgement that the medium was a bit green and a bit untamed: the Wild West of scientific publishing, where anything can happen.

Similar analogies from the publishing community have been abundant. At a talk organized by the Society for Scholarly Publishing, Shirley Decker-Lucke, Content Director at SSRN, likened preprints to raw oysters: They’re generally safe, but sometimes you get a bad one. On the same panel, Lyle Ostrow, Assistant Professor of Neurology at Johns Hopkins, compared the adoption of preprints to the shift from horse-drawn buggies to automobiles: They’re faster, they’re better, but they will require education to use safely. The following week, a Scholarly Kitchen article about preprints drew a parallel with unruly teenagers: “tremendous promise, but in need of more adult supervision to achieve their potential”….

As it is, most preprint servers have not been agnostic about the papers they post. Generally, they restrict passage of content that is clearly unscientific, unethical, potentially harmful or not representative of a novel, empirically derived finding. That is, they already impose some editorial standards.

Preprint platforms offer authors a legitimate place to host their work with unprecedented speed, for free. In time, they could be in a position to enforce, or at least strongly incentivize, standards that are widely acknowledged to support research integrity, like data and code availability, details of randomization and blinding, study limitations and lay summaries for findings that are consequential to human health….”

Meta-Research: Citation needed? Wikipedia and the COVID-19 pandemic | bioRxiv

Abstract:  With the COVID-19 pandemic’s outbreak at the beginning of 2020, millions across the world flocked to Wikipedia to read about the virus. Our study offers an in-depth analysis of the scientific backbone supporting Wikipedia’s COVID-19 articles. Using references as a readout, we asked which sources informed Wikipedia’s growing pool of COVID-19-related articles during the pandemic’s first wave (January-May 2020). We found that coronavirus-related articles referenced trusted media sources and cited high-quality academic research. Moreover, despite a surge in preprints, Wikipedia’s COVID-19 articles had a clear preference for open-access studies published in respected journals and made little use of non-peer-reviewed research uploaded independently to academic servers. Building a timeline of COVID-19 articles on Wikipedia from 2001-2020 revealed a nuanced trade-off between quality and timeliness, with a growth in COVID-19 article creation and citations, from both academic research and popular media. It further revealed how preexisting articles on key topics related to the virus created a framework on Wikipedia for integrating new knowledge. This “scientific infrastructure” helped provide context and regulated the influx of new information into Wikipedia. Lastly, we constructed a network of DOI-Wikipedia articles, which showed the landscape of pandemic-related knowledge on Wikipedia and revealed how citations create a web of scientific knowledge to support coverage of scientific topics like COVID-19 vaccine development. Understanding how scientific research interacts with the digital knowledge-sphere during the pandemic provides insight into how Wikipedia can facilitate access to science. It also sheds light on how Wikipedia successfully fended off disinformation about COVID-19 and may provide insight into how its unique model may be deployed in other contexts.

Bona Fide Journals – Creating a predatory-free academic publishing environment – Leiden Madtrics

Predatory journals pose a significant problem to academic publishing. In the past, a number of attempts have been made to identify them. This blog post presents a novel approach towards a predatory-free academic publishing landscape: Bona Fide Journals.

Honest signaling in academic publishing

Abstract:  Academic journals provide a key quality-control mechanism in science. Yet, information asymmetries and conflicts of interests incentivize scientists to deceive journals about the quality of their research. How can honesty be ensured, despite incentives for deception? Here, we address this question by applying the theory of honest signaling to the publication process. Our models demonstrate that several mechanisms can ensure honest journal submission, including differential benefits, differential costs, and costs to resubmitting rejected papers. Without submission costs, scientists benefit from submitting all papers to high-ranking journals, unless papers can only be submitted a limited number of times. Counterintuitively, our analysis implies that inefficiencies in academic publishing (e.g., arbitrary formatting requirements, long review times) can serve a function by disincentivizing scientists from submitting low-quality work to high-ranking journals. Our models provide simple, powerful tools for understanding how to promote honest paper submission in academic publishing.


Open science means better science – Leiden University

“Leiden University has an active open science community. Open science means transparency in all phases of research by precisely documenting every step of the way and making this publicly available. ‘It’s time to be open,’ say psychologists Anna van ’t Veer and Zsuzsika Sjoerds. There is increasing awareness of the need for open science, or open scholarship as it is sometimes called, also at Leiden University….”