Trends for open access to publications | European Commission

“On this page you will find indicators on how the policies of journals and funding agencies favour open access, and the percentage of publications (gold, green, hybrid and bronze) actually available through open access.

The indicators cover bibliometric data on publications, as well as data on funders’ and journals’ policies. Indicators and case studies will be updated over time.

You can download the chart and its data through the dedicated menu within each chart (top right of the image). …”

What’s Wrong with Social Science and How to Fix It: Reflections After Reading 2578 Papers | Fantastic Anachronism

[Some recommendations:]

Ignore citation counts. Given that citations are unrelated to (easily-predictable) replicability, let alone any subtler quality aspects, their use as an evaluative tool should stop immediately.
Open data, enforced by the NSF/NIH. There are problems with privacy but I would be tempted to go as far as possible with this. Open data helps detect fraud. And let’s have everyone share their code, too—anything that makes replication/reproduction easier is a step in the right direction.
Financial incentives for universities and journals to police fraud. It’s not easy to structure this well because on the one hand you want to incentivize them to minimize the frauds published, but on the other hand you want to maximize the frauds being caught. Beware Goodhart’s law!
Why not do away with the journal system altogether? The NSF could run its own centralized, open website; grants would require publication there. Journals are objectively not doing their job as gatekeepers of quality or truth, so what even is a journal? A combination of taxonomy and reputation. The former is better solved by a simple tag system, and the latter is actually misleading. Peer review is unpaid work anyway, it could continue as is. Attach a replication prediction market (with the estimated probability displayed in gargantuan neon-red font right next to the paper title) and you’re golden. Without the crutch of “high ranked journals” maybe we could move to better ways of evaluating scientific output. No more editors refusing to publish replications. You can’t shift the incentives: academics want to publish in “high-impact” journals, and journals want to selectively publish “high-impact” research. So just make it impossible. Plus as a bonus side-effect this would finally sink Elsevier….”

New Approaches to Evaluate Researchers’ Impact

“The second Basel Sustainable Publishing Forum (BSPF) will be held virtually on 26–27 October 2020. Until then, due to the current COVID-19 situation, we would like to offer you some of the conference topics via webinars! In the weeks and months preceding the conference, selected speakers will discuss various aspects of Open-Access publishing. The first webinar in the series was already held on 2 July 2020. It touched upon topics such as Plan S and Price Transparency and consisted of talks from different publishers. The webinar website and the full recording can be found here.

Now, the second webinar within the framework of the Basel Sustainable Publishing Forum will be held on 11 September 2020 at 2:00 pm (CEST). The topic is “New Approaches to Evaluate Researchers’ Impact” and Prof. Dr. Ed Constable from the University of Basel and co-chair of the BSPF 2020 will moderate the webinar. So do not miss out on this webinar and register now! …”

Gaming the Metrics | The MIT Press

“The traditional academic imperative to “publish or perish” is increasingly coupled with the newer necessity of “impact or perish”—the requirement that a publication have “impact,” as measured by a variety of metrics, including citations, views, and downloads. Gaming the Metrics examines how the increasing reliance on metrics to evaluate scholarly publications has produced radically new forms of academic fraud and misconduct. The contributors show that the metrics-based “audit culture” has changed the ecology of research, fostering the gaming and manipulation of quantitative indicators, which lead to the invention of such novel forms of misconduct as citation rings and variously rigged peer reviews. The chapters, written by both scholars and those in the trenches of academic publication, provide a map of academic fraud and misconduct today. They consider such topics as the shortcomings of metrics, the gaming of impact factors, the emergence of so-called predatory journals, the “salami slicing” of scientific findings, the rigging of global university rankings, and the creation of new watchdogs and forensic practices.”

Elsevier have endorsed the Leiden Manifesto: so what? – The Bibliomagician

“If an organisation wants to make a public commitment to responsible research evaluation they have three main options: i) sign DORA, ii) endorse the Leiden Manifesto (LM), or iii) go bespoke – usually with a statement based on DORA, the LM, or the Metric Tide principles.

The LIS-Bibliometrics annual responsible metrics survey shows that research-performing organisations adopt a wide range of responses to this, including sometimes signing DORA and adopting the LM. But when it comes to publishers and metric vendors, they tend to go for DORA. Signing DORA is a proactive, public statement and there is an open, independent record of your commitment. DORA also has an active Chair in Professor Stephen Curry, and a small staff in the form of a program director and community manager, all of whom will publicly endorse your signing, which leads to good PR for the organisation.

A public endorsement of the LM leads to no such fanfare. Indeed, the LM feels rather abandoned by comparison. Despite a website and blog, there has been little active promotion of the Manifesto, nor any public recognition for anyone seeking to endorse it….”

Elsevier endorses Leiden Manifesto to guide its development of improved research evaluation

“Elsevier, a global leader in research publishing and information analytics, today announced that it is endorsing the Leiden Manifesto for Research Metrics, ten principles that guide best practice in metrics-based research assessment.

The Leiden Manifesto is a set of practical and action-oriented recommendations for those engaged in the evaluation of research, whether in the role of evaluator, those being evaluated, or those responsible for designing and delivering research metrics and indicators. Elsevier, working through its recently-launched International Center for the Study of Research (ICSR), will now strive to develop its research evaluation tools and services aligned with the recommendations of the Leiden Manifesto….”

The cost of publishing in an indexed ophthalmology journal in 2019 – Canadian Journal of Ophthalmology

Abstract:

Objective

To determine the proportion of indexed ophthalmology journals with article processing charges (APCs) and potential factors associated with APCs.

Design

Cross-sectional study.

Participants

Web of Science–indexed Ophthalmology journals in 2019.

Methods

Indexed ophthalmology journal web sites were reviewed to obtain information on APCs, impact factor (IF), publication mode, publisher type, journal affiliation, waiver discount, and continent of origin. For data unavailable on the web site, the journal was contacted. Journal publication mode was categorized into subscription, fully open access, and hybrid (open access and subscription combined). Linear regression analysis was used to evaluate the association between APCs and the above variables.
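The abstract does not include the authors' analysis code, so the following is only a minimal sketch of the kind of linear regression described above, assuming a journal-level table with invented column names (apc_usd, impact_factor, pub_mode, publisher) and toy values.

# Hypothetical sketch: not the authors' actual analysis code.
# Column names and rows are invented placeholders for a journal-level dataset.
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: one row per journal.
journals = pd.DataFrame({
    "apc_usd":       [3000, 2500, 4900, 1800, 3200, 2600],   # APC in US$
    "impact_factor": [4.1, 2.3, 7.8, 1.2, 3.5, 2.9],
    "pub_mode":      ["hybrid", "hybrid", "open", "open", "hybrid", "hybrid"],
    "publisher":     ["commercial", "commercial", "commercial",
                      "society", "society", "commercial"],
})

# Ordinary least squares: APC as outcome, impact factor plus categorical predictors.
model = smf.ols("apc_usd ~ impact_factor + C(pub_mode) + C(publisher)",
                data=journals)
print(model.fit().summary())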

Main Outcome Measure

Proportion of ophthalmology journals with APCs.

Results

59 indexed ophthalmology journals were identified: 3 (5.1%) subscription only, 10 (16.9%) open access, and 46 (78.0%) hybrid. Overall, 52/59 (88.1%) journals had APCs; 10 of 59 journals (16.9%) required APCs for publication (7 fully open access and 3 hybrid journals), whereas 42/59 (71.2%, all hybrid journals) had optional APCs for open access. The 7/59 journals (11.9%) without APCs included 100% (3/3) of the subscription-only journals, 30% (3/10) of the open access journals, and 2% (1/46) of the hybrid journals. The mean cost for journals with APCs was US$2854 ± 708.9 (range US$490–5000). Higher IF, publication mode, and commercial publishers were associated with higher APCs.
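As a quick check, the reported percentages can be recomputed from the counts given in the Results; the short script below (figures taken from the abstract, variable names invented) reproduces them.

# Recomputing the proportions reported in the Results from the abstract's counts.
total = 59
subscription_only, open_access, hybrid = 3, 10, 46
apc_any, apc_required, apc_optional = 52, 10, 42

assert subscription_only + open_access + hybrid == total
assert apc_required + apc_optional == apc_any

for label, n in [("subscription only", subscription_only),
                 ("fully open access", open_access),
                 ("hybrid", hybrid),
                 ("any APC", apc_any),
                 ("APC required", apc_required),
                 ("APC optional", apc_optional)]:
    print(f"{label}: {n}/{total} = {n / total:.1%}")
# Prints 5.1%, 16.9%, 78.0%, 88.1%, 16.9%, and 71.2%, matching the abstract.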

Conclusions

In 2019, 16.9% of indexed ophthalmology journals required APCs for publication, and an additional 71.2% (all hybrid journals) had optional APCs for open access. Independent predictors of APCs were IF and publication mode.