# Suppression as a form of liberation? – Ross Mounce

“On Monday 29th June 2020, I learned from Retraction Watch that Clarivate, the for-profit proprietor of the Journal Impact Factor™, has newly “suppressed” 33 journals from their indexing service. The immediate consequence of this “suppression” is that these 33 journals do not get assigned an official Clarivate Journal Impact Factor™. Clarivate justify this action on the basis of “anomalous citation patterns”, but give little further detail for each journal beyond its overall “% Self-cites” and the effect of those self-cites on Clarivate’s citation-based ranking of journals (% Distortion of category rank)….

The zoology section of the Chilean Society of Biology has already petitioned Clarivate to unsuppress Zootaxa, to give it back its Journal Impact Factor™. I understand why they would do this, but I would call for something quite different and more far-reaching.

I would encourage all systematists, taxonomists, zoologists, microbiologists, and biologists in general to see the real problem here: Clarivate, a for-profit analytics company, should never be relied upon by research evaluation committees to arbitrarily decide the value of a research output, especially given that the Journal Impact Factor™ is untransparent, irreproducible, and fundamentally statistically illiterate.

Thus, to bring us back to my title: I wonder if Clarivate’s wacky “suppression” might actually be a pathway to liberation from the inappropriate stupidity of using the Journal Impact Factor™ to evaluate individual research outputs. Given we have all now witnessed just how brainless some of Clarivate’s decision-making is, I would ask Clarivate to please “suppress” all journals, thereby removing the harmful stupidity of the Journal Impact Factor™ from the lives of researchers.”
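The two figures Clarivate publishes for each suppressed journal are simple ratios. A minimal sketch of both, using invented citation counts (the real calculation runs on Web of Science data, and the function names here are illustrative, not Clarivate's):

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Journal Impact Factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by the citable items from those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

def self_cite_percentage(total_citations, self_citations):
    """Share of a journal's incoming citations that come from the journal itself;
    a high value is the kind of "anomalous citation pattern" cited above."""
    return 100.0 * self_citations / total_citations

# Hypothetical journal: 5,000 citations in year Y to 2,000 items from Y-1/Y-2,
# of which 1,500 citations come from the journal's own articles.
jif = impact_factor(5000, 2000)               # 2.5
self_pct = self_cite_percentage(5000, 1500)   # 30.0
```

A journal-wide mean like this says nothing about any individual article, which is the heart of the objection above.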

# Covid-19 Shows Scientific Journals Like Elsevier Need to Open Up – Bloomberg

“One big change brought on by Covid-19 is that virtually all the scientific research being produced about it is free to read. Anyone can access the many preliminary findings that scholars are posting on “preprint servers.” Data are shared openly via a multitude of different channels. Scientific journals that normally keep their articles behind formidable paywalls have been making an exception for new research about the virus, as well as much (if not all) older work relevant to it.

This response to a global pandemic is heartening and may well speed that pandemic to its end. But after that, what happens with scientific communication? Will everything go back behind the journal paywalls?

Well, no. Open-access advocates in academia have been pushing for decades to make more of their work publicly available and paywall-free, and in recent years they’ve been joined by the government agencies and large foundations that fund much scientific research. Covid-19 has accelerated this shift. I’m pretty sure there’s no going back. …”

# Responsible Metrics – YouTube

“The overreliance on metrics to assess academic outputs has led to the call for a more responsible use of these measures. This short video outlines the key principles of the responsible metrics movement.

Created as part of the Research Support Ambassador Programme from Cambridge University Libraries….”

# For China’s ambitious research reforms to be successful, they will need to be supported by new research assessment infrastructures | Impact of Social Sciences

“A radical reform of research evaluation and funding in China was recently launched in two prescriptive policy documents published by the Ministry of Science and Technology and the Ministry of Education. China now moves from a strong focus on Web of Science-based indicators towards a more balanced combination of qualitative and quantitative research evaluation, with one third of all publications to be oriented towards domestic journals. Universities are urged to implement the policy locally by the end of July at the latest. How to do it, and the possible consequences, have aroused intense discussion among Chinese academics and gained worldwide attention and debate.

This change has not come out of the blue. In 2016, President Xi Jinping called for reform towards a more comprehensive evaluation system for individual researchers. Further, in 2018, a document issued by three ministries and two national central institutions specifically proposed moving away from the “Four only” phenomenon of recognising and rewarding “only papers, only titles, only diplomas and only awards”….

# The growth of open access publishing in geochemistry – ScienceDirect

“Highlights

• 40% of articles in 2018–2019 published as Gold Open Access (OA).
• 70% in fully OA journals with a mean Article Processing Charge (APC) of US$900.
• 30% in historical hybrid journals with a higher APC of more than US$1,800.
• Correlation between the number of OA articles in hybrid journals and impact factor.
• Relationship between the number of OA articles in fully OA journals and APC….”

# Rethinking Research Assessment: Ideas for Action

Five myths about research assessment and five recommendations for concrete change.

# Introducing CiteScore, Our Journal’s Preferred Citation Index: Moving Beyond the Impact Factor – Joint Commission Journal on Quality and Patient Safety

“The mission of The Joint Commission Journal on Quality and Patient Safety is to improve health care quality, safety, and value by providing professionals and researchers a learning community to share innovative thinking, strategies, and practices. Although we publish a wide range of research in quality and safety, we emphasize rigorous, generalizable quality improvement research that our readers can use to improve care at their own institutions. Thus, our ultimate metric of success should be how often organizations read our articles, apply what they learn, and improve care and patient outcomes. Unfortunately, no such measure exists, so we must rely on several proxies, including downloads and references to the articles in press coverage and social media.”
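CiteScore, the index the journal is moving to, is defined by Scopus over a four-year window rather than the Impact Factor's two-year one. A minimal sketch of that definition, with invented counts for illustration:

```python
def citescore(citations_4yr, documents_4yr):
    """CiteScore for year Y, per Scopus's current methodology: citations
    received in years Y-3..Y to documents published in Y-3..Y, divided by
    the number of those documents."""
    return citations_4yr / documents_4yr

# Hypothetical journal: 1,200 citations to 400 documents over the window.
score = citescore(1200, 400)  # 3.0
```

The longer window smooths year-to-year volatility, but it remains a journal-level mean, not the readership-and-outcomes measure the editorial wishes existed.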

# Use of the journal impact factor for assessing individual articles need not be statistically wrong

Abstract: Most scientometricians reject the use of the journal impact factor for assessing individual articles and their authors. The well-known San Francisco Declaration on Research Assessment also strongly objects to this way of using the impact factor. Arguments against the use of the impact factor at the level of individual articles are often based on statistical considerations. The skewness of journal citation distributions typically plays a central role in these arguments. We present a theoretical analysis of statistical arguments against the use of the impact factor at the level of individual articles. Our analysis shows that these arguments do not support the conclusion that the impact factor should not be used for assessing individual articles. In fact, our computer simulations demonstrate the possibility that the impact factor is a more accurate indicator of the value of an article than the number of citations the article has received. It is important to critically discuss the dominant role of the impact factor in research evaluations, but the discussion should not be based on misplaced statistical arguments. Instead, the primary focus should be on the socio-technical implications of the use of the impact factor.
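The abstract's counterintuitive claim can be illustrated with a toy simulation (this is a sketch in the spirit of the argument, not the authors' actual model; every parameter here is invented): if citation counts are a very noisy signal of an article's latent value, the journal's mean citation count can estimate that value more accurately than the article's own citations do.

```python
import random

def simulate(noise_sd=3.0, within_sd=0.5, n_journals=50, n_articles=200, seed=0):
    """Toy model: each article's latent value is drawn around its journal's
    quality; observed citations are that value plus heavy noise. Compare the
    mean squared error of two estimators of article value: the article's own
    citation count vs. the journal's mean citation count (a JIF stand-in)."""
    rng = random.Random(seed)
    err_cites = err_jif = 0.0
    n = 0
    for _ in range(n_journals):
        quality = rng.gauss(0.0, 1.0)
        values = [quality + rng.gauss(0.0, within_sd) for _ in range(n_articles)]
        cites = [v + rng.gauss(0.0, noise_sd) for v in values]
        jif = sum(cites) / len(cites)  # journal-level mean, the JIF analogue
        for v, c in zip(values, cites):
            err_cites += (c - v) ** 2
            err_jif += (jif - v) ** 2
            n += 1
    return err_cites / n, err_jif / n

mse_cites, mse_jif = simulate()
# With citation noise much larger than the within-journal spread of article
# values, the journal mean is the more accurate estimator (mse_jif < mse_cites).
```

Shrinking `noise_sd` toward zero reverses the outcome, which is why the skewness argument alone does not settle the question.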

# The Megajournal Lifecycle – The Scholarly Kitchen

“PLOS ONE and Scientific Reports have been very successful journals. Any publisher would be thankful to have them in their portfolio. Nonetheless, their unstable performance should also serve as a warning. In the year of their steepest decline, each journal shrank by about 7,000 articles, which can translate to a loss of more than $10m year-on-year. That will reflect poorly on the balance sheet of any publisher.

The takeaways for publishers are simple:

- Do not get carried away: megajournal revenue can be inconsistent, so avoid overselling their success to investors and avoid reckless investments.
- Invest heavily in marketing: if the journal is shedding 10% of citability every year, marketing should try to plug this hole as well as possible.
- Build around their success: launch affiliated, higher-impact journals that will absorb some of the eventual content loss.
- Do not put all your eggs in one basket: pursue a less risky, broad-portfolio approach rather than a smaller, focused megajournal approach….”