Should authors pay to submit their papers? · john hawks weblog

“An article by Tim Vines in The Scholarly Kitchen looks at the pay-to-submit model of open access publication: “Plan T: Scrap APCs and Fund Open Access with Submission Fees”….

The article is worth considering. Articles cost money to publish. If we insist upon journal publication, that money needs to come from somewhere. I would be happy if my university subsidized submission of papers to open-access journals instead of subscriptions to closed-access journals.

However, I tend to agree with Richard Sever, who tweeted a link to the article and commented:

Plan U: just mandate preprint deposition and let a downstream ecosystem of overlays/journals with various business models evolve in response to community needs. Side benefit: speeding up science massively…”

Welcome to The Great Acceleration – The Scholarly Kitchen

I like to think of the period that we’ve entered into now as “The Great Acceleration,” a term coined by author Warren Ellis (or, as a recent exhibition states it, “Everything Happens So Much”). We aren’t really dealing with new issues – arXiv has been posting preprints since 1991, mergers have been common for a while now (Wiley buying Blackwell happened more than 11 years ago), and the open access movement has been front and center since at least the year 2000….

But, like every other aspect of our lives in this interconnected, digital utopia in which we live, we’ve reached a point where everything feels like it’s happening at once. Every week it seems like another piece of crucial publishing infrastructure is changing hands, or a new open access policy is announced, or there’s a new open letter petitioning for change that you’re expected to sign onto, or a new technology or standard that you absolutely must implement….

Plan S is a great example of acceleration — the research world has been moving slowly toward open access, with different fields moving at different paces via different routes. This evolution has taken place at, not surprisingly, an evolutionary pace, and a small group of significant research funders have declared their impatience with this level of progress. Plan S is a deliberate attempt to accelerate change, throwing a comet into a complex ecosystem in hope that it will produce mammals, rather than mass extinction….

That brings us back to the notion of much-needed infrastructure. If the open source community really wants to make a difference, then some focus should be directed toward back-end, e-commerce billing systems. The regulatory conditions of the market have reached a point where tracking and applying them by hand is incredibly inefficient. We need systems that can take advantage of persistent identifiers (ORCID, the CrossRef Funder Registry, the developing ORG-ID) and automate the process of ensuring that each author on a paper has met their requirements. A modular system where each funder, government, and institution can plug in their rules and have those applied to the publication process would enable much more rapid progress than reinventing the article submission system or building yet another publishing platform….”
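It is easy to picture the modular rules engine this passage calls for. Below is a minimal sketch in Python of what a funder-pluggable compliance checker might look like; everything here (the `Author` and `Manuscript` classes, the `check_compliance` helper, the Plan S-style example rule) is a hypothetical illustration, not an existing system or API. The ORCID iD is ORCID’s standard documentation example, and while 10.13039 is the real Crossref Funder Registry DOI prefix, the suffix shown is invented.

```python
from dataclasses import dataclass, field

# A manuscript as a submission system might represent it: authors carry
# persistent identifiers (ORCID iDs, Funder Registry DOIs) rather than
# free-text names and affiliations.
@dataclass
class Author:
    orcid: str                                      # e.g. "0000-0002-1825-0097"
    funder_ids: list = field(default_factory=list)  # Funder Registry DOIs

@dataclass
class Manuscript:
    title: str
    authors: list
    license: str            # e.g. "CC-BY"
    embargo_months: int     # delay before a free copy becomes readable

# The registry is the "modular" part: each funder, government, or
# institution plugs in one rule function keyed to its persistent
# identifier, and the core pipeline never changes.
RULES = {}

def rule(funder_id):
    """Decorator that registers a compliance check for one funder ID."""
    def register(fn):
        RULES[funder_id] = fn
        return fn
    return register

@rule("10.13039/000000")    # hypothetical Funder Registry DOI
def plan_s_style(ms):
    """A Plan S-flavoured policy: open license, zero embargo."""
    problems = []
    if ms.license != "CC-BY":
        problems.append("requires a CC-BY license")
    if ms.embargo_months > 0:
        problems.append("requires immediate (zero-embargo) access")
    return problems

def check_compliance(ms):
    """Collect, per author, every unmet requirement from their funders."""
    report = {}
    for author in ms.authors:
        for fid in author.funder_ids:
            for problem in RULES.get(fid, lambda m: [])(ms):
                report.setdefault(author.orcid, []).append(problem)
    return report

if __name__ == "__main__":
    ms = Manuscript(
        title="Example paper",
        authors=[Author("0000-0002-1825-0097", ["10.13039/000000"])],
        license="CC-BY-NC",
        embargo_months=6,
    )
    print(check_compliance(ms))
    # {'0000-0002-1825-0097': ['requires a CC-BY license',
    #                          'requires immediate (zero-embargo) access']}
```

The decorator-based registry is one way to get the modularity the post asks for: adding a new funder’s policy means writing one rule function, not reworking the submission platform.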

Europe Speeds Ahead on Open Access: 2018 in Review | Electronic Frontier Foundation

“Plan S reflects a more aggressive open access policy than FASTR does. FASTR requires that government agencies that fund scientific research require grantees to make their papers available to the public within a year of publication; the original publication can happen in a traditional, closed journal. (Most U.S. government agencies already have that requirement under a 2013 White House memo.)

Plan S takes that much further, requiring grantees to publish their research in an open access journal or repository from day one. What’s more, grantees must publish their papers under an open license allowing others to share and reuse them. In discussions on open access laws, EFF has long urged lawmakers to consider including open licensing mandates. Allowing the public to read the research is a great first step, but allowing the public to reuse and adapt it (even commercially) unlocks its true economic and educational potential. We hope to see more similarly strong open access reforms, both in the U.S. and around the world….”

Making Scientific Research More Open and More Effective

“A Mozilla Fellow and a team of open-science advocates have been awarded two major grants to make scientific research more open and effective.

Daniela Saderi and the rest of the PREreview leadership team will use a £50,000 grant from the Wellcome Trust’s Open Research Fund and $66,780 from the Alfred P. Sloan Foundation to carry out the work.

PREreview is a community and a platform for the crowd-sourcing of preprint peer reviews in scientific research. Preprints are early versions of scientific manuscripts that are published online before undergoing journal peer review. They allow researchers to share early scientific findings more openly and collaboratively….

PREreview will partner with the nonprofit Outbreak Science and use the Wellcome funds to develop “Rapid PREreview,” an interoperable and open-source extension for the PREreview platform. Rapid PREreview will allow scientists to share preprints swiftly during public health crises, and also to generate aggregated data visualizations based on feedback….”

On the value of preprints: an early career researcher perspective [PeerJ Preprints]

Abstract: Peer-reviewed journal publication is the main means for academic researchers in the life sciences to create a permanent, public record of their work. These publications are also the de facto currency for career progress, with a strong link between journal brand recognition and perceived value. The current peer-review process can lead to long delays between submission and publication, with cycles of rejection, revision and resubmission causing redundant peer review. This situation creates unique challenges for early career researchers (ECRs), who rely heavily on timely publication of their work to gain recognition for their efforts. ECRs face changes in the academic landscape including the increased interdisciplinarity of life sciences research, expansion of the researcher population and consequent shifts in employer and funding demands. The publication of preprints, publicly available scientific manuscripts posted on dedicated preprint servers prior to journal-managed peer review, can play a key role in addressing these ECR challenges. Preprinting benefits include rapid dissemination of academic work, open access, establishing priority or concurrence, receiving feedback and facilitating collaborations. While there is a growing appreciation for and adoption of preprints, only a minority of all articles in the life sciences and medicine are preprinted. The current low rate of preprint submissions in the life sciences and ECR concerns regarding preprinting need to be addressed. We provide a perspective from an interdisciplinary group of early career researchers on the value of preprints and advocate the wide adoption of preprints to advance knowledge and facilitate career development.

Publishing speed and acceptance rates of open access megajournals | Online Information Review | Ahead of Print

“Purpose. The purpose of this paper is to look at two particular aspects of open access megajournals, a new type of scholarly journal. Such journals review only for scientific soundness and leave the judgment of scientific impact to the readers. The two leading journals currently each publish more than 20,000 articles per year. The publishing speed and acceptance rates of such journals are the topics of this study.

Design/methodology/approach. Submission, acceptance and publication dates for a sample of articles in 12 megajournals were manually extracted from the articles. Information about acceptance rates was obtained using web searches of journal home pages, editorials, blogs, etc.

Findings. The time from submission to publication varies a lot, with engineering megajournals publishing much more rapidly. But on average it takes almost half a year to get published, particularly in the high-volume biomedical journals. As some of the journals have grown in publication volume, the average review time has increased by almost two months. Acceptance rates have slightly decreased over the past five years, and are now in the range of 50–55 percent.

Originality/value. This is the first empirical study of how long it takes to get published in megajournals, and it highlights a clear increase of around two months in publishing time. Currently, the review process in the biomedical megajournals takes as long as in regular, more selective journals in the same fields. Possible explanations could be increasing difficulties in finding willing and motivated reviewers and a higher share of submissions from developing countries….”

Open access: Academic publishing in transition

It’s the year 2024: a scientist in Sudan, the family member of a patient with a rare disease in the United States, a farmer in China – assuming they have access to the internet, they are all able to access the latest scientific findings at any time, without restriction and free of charge. On this basis, they can develop new energy supply options for their community, prepare for visits to the doctor or follow the latest research on seeds and breeds. A pipe dream? Or isn’t free access to academic literature something we should have had for a long time, three decades after the development of the World Wide Web?

Why Do We Digitize? The Case for Slow Digitization – Archive Journal

“But this advocacy for digitization has discouraged the development of critical and reflective discussions on the way in which digitization is undertaken. There is a risk that digitization programs, by focusing on making “treasures” more widely available, will reinforce existing cultural stereotypes and canonicities. The criteria used to select manuscripts for digitization and the way they are presented online are very poorly articulated and require wider discussion and debate.

Since the advent of Google Books, many librarians and curators have been anxious to maximize digital coverage of their collections as quickly as possible. However, by seeking to rapidly digitize large numbers of books, manuscripts, and archives, archivists, librarians, and scholars may sacrifice many of the benefits that digital technologies offer for the exploration of manuscripts and books as textual artifacts. Too often, digitization is treated as a form of color microfilm, thereby offering distorted views of the manuscript and making it appear to be a simpler and more stable object than it really is. Digitization provides a constantly expanding toolbox for probing and analyzing manuscripts that goes beyond simple color imaging. Like archaeological artifacts, manuscripts should be explored gradually, using a variety of technical aids and methods, building a multifaceted digital archive of the manuscript….”

Sluggish data sharing hampers reproducibility effort : Nature News & Comment

“An initiative that aims to validate the findings of key cancer papers is being slowed by an unexpected hurdle — problems accessing data from the original studies.

The Reproducibility Initiative: Cancer Biology consortium aims to repeat experiments from 50 highly cited studies published in 2010–12 in journals such as Nature, Cell and Science, to see how easy it is to reproduce their findings. Although these journals require authors to share their data on request, it has taken two months on average to get the data for each paper, said William Gunn, a co-leader of the project, at the 4th World Conference on Research Integrity in Rio de Janeiro, Brazil, on 3 June.

For one paper, securing the necessary data took a year. And the authors of four other papers have stopped communicating with the project altogether. In those instances, the journals that published the studies are stepping in to remind researchers of their responsibilities….”

Do authors comply when funders enforce open access to research?

“Last month, European research funders collectively called for research publications to be made free, fully and immediately; so far, 14 funders have signed up. Before that, at least 50 funders and 700 research institutions worldwide had already mandated some form of open access for the work they support. Federal funding agencies and institutions argue that taxpayers should be able to read publicly funded research, and that broader accessibility will allow researchers whose institutions do not subscribe to a particular journal to build on existing research.

However, few empirical analyses have examined whether work supported by funding agencies with such mandates actually is open access. Here, we report the first large-scale analysis of compliance, focusing on 12 selected funding agencies. Bibliometric data are fraught with idiosyncrasies (see ‘Analysis methods’), but the trends are clear.

Of the more than 1.3 million papers we identified as subject to the selected funders’ open-access mandates, we found that some two-thirds were indeed freely available to read. Rates varied greatly, from around 90% for work funded by the US National Institutes of Health (NIH) and UK biomedical funder the Wellcome Trust, to 23% for work supported by the Social Sciences and Humanities Research Council of Canada (see ‘Mandates matter’)….

Our findings have policy implications. They highlight the importance to open access of enforcement, timeliness and infrastructure. And they underline the need to establish sustainable and equitable systems as the financial burdens for science publishing shift from research libraries to authors’ research funds….

Funders that allow authors to deposit papers after publication see lower rates of compliance, presumably because authors lose track of this obligation….

in chemistry research, 81% of work funded by the NIH is publicly available, whereas that is true of only around one-quarter of chemistry studies supported by the NSF and CIHR. Different funders support different types of work, but the variations we found also remain consistent within sub-disciplines (see Supplementary Information, Figure S5). Although researchers cite norms and needs within disciplines as a reason not to comply with open-access mandates, we believe that the funding agency is a stronger driver of open access than is the culture of any particular discipline….”