# Name the journal. Shame the publisher. | Sauropod Vertebra Picture of the Week

“Here’s an odd thing. Over and over again, when a researcher is mistreated by a journal or publisher, we see them telling their story but redacting the name of the journal or publisher involved. Here are a couple of recent examples….”

# Gaming the Metrics | The MIT Press

“The traditional academic imperative to “publish or perish” is increasingly coupled with the newer necessity of “impact or perish”—the requirement that a publication have “impact,” as measured by a variety of metrics, including citations, views, and downloads. Gaming the Metrics examines how the increasing reliance on metrics to evaluate scholarly publications has produced radically new forms of academic fraud and misconduct. The contributors show that the metrics-based “audit culture” has changed the ecology of research, fostering the gaming and manipulation of quantitative indicators, which lead to the invention of such novel forms of misconduct as citation rings and variously rigged peer reviews. The chapters, written by both scholars and those in the trenches of academic publication, provide a map of academic fraud and misconduct today. They consider such topics as the shortcomings of metrics, the gaming of impact factors, the emergence of so-called predatory journals, the “salami slicing” of scientific findings, the rigging of global university rankings, and the creation of new watchdogs and forensic practices.”

# Elsevier have endorsed the Leiden Manifesto: so what? – The Bibliomagician

“If an organisation wants to make a public commitment to responsible research evaluation they have three main options: i) sign DORA, ii) endorse the Leiden Manifesto (LM), or iii) go bespoke – usually with a statement based on DORA, the LM, or the Metric Tide principles.

The LIS-Bibliometrics annual responsible metrics survey shows that research-performing organisations adopt a wide range of responses to this, including sometimes both signing DORA and adopting the LM. But when it comes to publishers and metric vendors, they tend to go for DORA. Signing DORA is a proactive, public statement and there is an open, independent record of your commitment. DORA also has an active Chair in Professor Stephen Curry, and a small staff in the form of a program director and community manager, all of whom will publicly endorse your signing, which leads to good PR for the organisation.

A public endorsement of the LM leads to no such fanfare. Indeed, the LM feels rather abandoned by comparison. Despite a website and blog, there has been little active promotion of the Manifesto, nor any public recognition for anyone seeking to endorse it….”

# The cost of publishing in an indexed ophthalmology journal in 2019 – Canadian Journal of Ophthalmology

Abstract: Objective: To determine the proportion of indexed ophthalmology journals with article processing charges (APCs) and potential factors associated with APCs. Design: Cross-sectional study. Participants: Web of Science–indexed ophthalmology journals in 2019. Methods: Indexed ophthalmology journal websites were reviewed to obtain information on APCs, impact factor (IF), publication mode, publisher type, journal affiliation, waiver discount, and continent of origin; journals were contacted for data unavailable on the website. Journal publication mode was categorized as subscription-only, fully open access, or hybrid (open access and subscription combined). Linear regression analysis was used to evaluate the association between APCs and the above variables. Main Outcome Measure: Proportion of ophthalmology journals with APCs. Results: 59 indexed ophthalmology journals were identified: 3 (5.1%) subscription-only, 10 (16.9%) open access, and 46 (78.0%) hybrid. Overall, 52/59 (88.1%) journals had APCs; 10/59 (16.9%) required APCs for publication (7 fully open access and 3 hybrid journals), whereas 42/59 (71.2%, all hybrid journals) had optional APCs for open access. The 7/59 journals (11.9%) without APCs comprised 100% (3/3) of the subscription-only journals, 30% (3/10) of the open access journals, and 2% (1/46) of the hybrid journals. The mean charge for journals with APCs was US$2854 ± 708.9 (range US$490–5000). Higher IF, publication mode, and commercial publishers were associated with higher APCs. Conclusions: 16.9% of indexed ophthalmology journals in 2019 required APCs, and an additional 71.2% (all hybrid journals) had optional APCs for open access. Independent predictors of APCs were IF and publication mode.
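The percentages in the Results follow directly from the reported counts; a quick sketch to cross-check the arithmetic (all counts are taken from the abstract, nothing here is new data):

```python
# Journal counts by publication mode, as reported in the abstract.
journals = {
    "subscription_only": 3,
    "open_access": 10,
    "hybrid": 46,
}
total = sum(journals.values())  # 59 indexed ophthalmology journals

def pct(n, total=total):
    """Percentage of the 59 journals, rounded to one decimal as in the abstract."""
    return round(100 * n / total, 1)

print(pct(journals["hybrid"]))   # 78.0 -- share of hybrid journals
print(pct(52))                   # 88.1 -- journals with APCs in some form
print(pct(10), pct(42), pct(7))  # 16.9 mandatory, 71.2 optional, 11.9 no APCs
```

The three APC groups (10 mandatory, 42 optional, 7 none) sum back to all 59 journals, so the reported proportions are internally consistent.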

# Pengene bak vitenskapelig publisering [The money behind scientific publishing] | Tidsskrift for Den norske legeforening

From Google’s English: “Most doctors treat the pharmaceutical industry with a healthy skepticism. Scientific publications are likewise something all doctors and researchers deal with every single day, yet knowledge of, and skepticism toward, the scientific publishing industry seems weaker. The topic has become more relevant as everyday publishing has changed radically in recent decades. The Research Council of Norway has also, like 14 other countries, approved Plan S. This means that research funded by Research Council grants announced after 2021 must be published in open-access scientific journals (1–3). How does this change scientific publishing, and what will the industry itself have to change? The purpose of this article is to draw attention to existing problems with scientific publication and to new problems created by open access and Plan S….

The most important thing we, as users of the system, can do is to be aware of the actual conditions and to meet the publishing houses, journals, and scientific publications we read with a healthy skepticism. With increased attention, the professional communities can put pressure on the industry and the authorities. This has already led to changes in Plan S….”

# Publications | Free Full-Text | A Provisional System to Evaluate Journal Publishers Based on Partnership Practices and Values Shared with Academic Institutions and Libraries

Abstract: Background: Journals with high impact factors (IFs) are the “coin of the realm” in many review, tenure, and promotion decisions; ipso facto, IFs influence academic authors’ views of journals and publishers. However, IFs do not evaluate how publishers interact with libraries or academic institutions. Goal: This provisional system introduces an evaluation of publishers exclusive of IF, measuring how well a publisher’s practices align with the values of libraries and public institutions of higher education (HE). Identifying publishers with similar values may help libraries and institutions make strategic decisions about resource allocation. Methods: Democratization of knowledge, information exchange, and the sustainability of scholarship were the values identified to define partnership practices and develop a scoring system for evaluating publishers. Four publishers were then evaluated. A high score indicates alignment with the values of libraries and academic institutions and a strong partnership with HE. Results: The highest scores were earned by a learned society publishing two journals and a library publisher supporting over 80 open-access journals. Conclusions: Publishers, especially nonprofit publishers, could use the criteria to guide practices that align with mission-driven institutions. Institutions and libraries could use the system to identify publishers acting in good faith towards public institutions of HE.
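A values-based rubric of this kind reduces to weighted criteria summed per publisher. The sketch below illustrates the shape of such a system only: the criterion names and weights are hypothetical, since the abstract names the three value areas (democratization of knowledge, information exchange, sustainability of scholarship) but not the actual scoring items defined in the full paper.

```python
# HYPOTHETICAL criteria and weights, loosely grouped under the paper's
# three named value areas; the real rubric is defined in the full text.
CRITERIA = {
    "open_access_options":  2,  # democratization of knowledge
    "transparent_pricing":  2,  # information exchange
    "library_partnership":  1,  # sustainability of scholarship
    "nonprofit_governance": 1,
}

def score(practices: dict) -> int:
    """Sum the weights of every criterion the publisher satisfies."""
    return sum(w for name, w in CRITERIA.items() if practices.get(name))

# A hypothetical society publisher meeting three of the four criteria:
society = {"open_access_options": True, "transparent_pricing": True,
           "library_partnership": True, "nonprofit_governance": False}
print(score(society), "of a possible", sum(CRITERIA.values()))  # 5 of a possible 6
```

The design point is that, unlike the IF, every term in the score is something a library can verify about a publisher's practices rather than about its journals' citation counts.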

# Dynamics of Journal Impact Factors and Limits to Their Inflation | Journal of Scholarly Publishing

Abstract:  Journal Impact Factors (JIFs) appear to increase for the majority of scientific journals. The current analysis was initiated to better define the dynamics of JIFs. Original data from the Journal Citation Reports, from 1997 to 2016, were analysed. The number of citations referring to publications of the previous two years was correlated with the number of articles and the increase in the number of articles. A model was calculated by smoothing the correlation curves. The mean JIF increased from 1.1 to 2.2 almost continuously. The model suggested that the mean JIF will asymptotically reach a maximum value of 2.6. The number of publications has been growing annually by a factor of 1.048. Correlating the overall number of countable citations with the number of published articles revealed a stable relationship of 6.3 citations referring to the previous two years. Validation of the model with a sample of forty-nine journals that have been published since 1961 showed that their recent JIF dynamics are well reflected in the data, but extrapolation of the current dynamics did not reflect the JIFs of these journals in the past. Average JIF is likely to reach a plateau in the future.
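The abstract's numbers can be tied together with a small sketch. The saturating-exponential form below is an assumption chosen only to match the three figures the abstract reports (mean JIF 1.1 in 1997, 2.2 in 2016, modelled asymptote 2.6); it is not the authors' actual model, and the annual article-growth factor of 1.048 is used only to derive a doubling time.

```python
import math

# Endpoints and asymptote as reported in the abstract.
JIF0, JIF_MAX, YEARS = 1.1, 2.6, 2016 - 1997  # 19 observed years

# Assume JIF(t) = JIF_MAX - (JIF_MAX - JIF0) * exp(-k * t) and solve
# for the rate k using the observed JIF(19) = 2.2:
k = -math.log((JIF_MAX - 2.2) / (JIF_MAX - JIF0)) / YEARS

def jif(t):
    """Mean JIF t years after 1997 under the assumed saturating curve."""
    return JIF_MAX - (JIF_MAX - JIF0) * math.exp(-k * t)

# Article output growing by a factor of 1.048 per year doubles roughly
# every log(2)/log(1.048) years:
doubling_years = math.log(2) / math.log(1.048)
print(round(jif(19), 1), round(doubling_years, 1))  # 2.2 14.8
```

Under this assumed curve the mean JIF closes most of the remaining gap to 2.6 within a few more decades, consistent with the paper's conclusion that the average JIF is likely to plateau, while the literature itself keeps doubling about every 15 years.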