Suppression as a form of liberation? – Ross Mounce

“On Monday 29th June 2020, I learned from Retraction Watch that Clarivate, the for-profit proprietor of Journal Impact Factor™, has newly “suppressed” 33 journals from their indexing service. The immediate consequence of this “suppression” is that these 33 journals do not get assigned an official Clarivate Journal Impact Factor™. Clarivate justify this action on the basis of “anomalous citation patterns” but without much further detail given for each of the journals other than the overall “% Self-cites” of the journal, and the effect of those self-cites on Clarivate’s citation-based ranking of journals (% Distortion of category rank)….

The zoology section of the Chilean Society of Biology has already petitioned Clarivate to unsuppress Zootaxa, to give it back its Journal Impact Factor™. I understand why they would do this, but I would actually call for something quite different and more far-reaching.

I would encourage all systematists, taxonomists, zoologists, microbiologists, and biologists in general to see the real problem here: Clarivate, a for-profit analytics company, should never be so relied upon by research evaluation committees to arbitrarily decide the value of a research output. Especially given that the Journal Impact Factor™ is untransparent, irreproducible, and fundamentally statistically illiterate.

Thus, to bring us back to my title: I wonder if Clarivate’s wacky “suppression” might actually be a pathway to liberation from the inappropriate stupidity of using Journal Impact Factor™ to evaluate individual research outputs. Given we have all now witnessed just how brainless some of Clarivate’s decision making is, I would ask Clarivate to please “suppress” all journals, thereby removing the harmful stupidity of Journal Impact Factor™ from the lives of researchers.”
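Clarivate publishes little beyond the headline “% Self-cites” figure for each suppressed journal. As a rough illustration only (Clarivate’s actual methodology is not public), that metric is simply the share of a journal’s incoming citations that originate from the journal’s own articles:

```python
def self_citation_rate(self_cites: int, total_cites: int) -> float:
    """Percentage of a journal's incoming citations that come from its own articles."""
    if total_cites == 0:
        return 0.0
    return 100.0 * self_cites / total_cites

# Hypothetical journal: 5,000 incoming citations, 2,200 of them self-citations
print(self_citation_rate(2200, 5000))  # 44.0
```

The numbers above are invented for illustration; what Clarivate flags is not the rate itself but its distortion of the journal’s category rank, for which no formula has been disclosed.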

Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities | The BMJ

Abstract: Objective: To determine the presence of a set of pre-specified traditional and non-traditional criteria used to assess scientists for promotion and tenure in faculties of biomedical sciences among universities worldwide.

Design: Cross sectional study.

Setting: International sample of universities.

Participants: 170 randomly selected universities from the Leiden ranking of world universities list.

Main outcome measure: Presence of five traditional (for example, number of publications) and seven non-traditional (for example, data sharing) criteria in guidelines for assessing assistant professors, associate professors, and professors and the granting of tenure in institutions with biomedical faculties.

Results: A total of 146 institutions had faculties of biomedical sciences, and 92 had eligible guidelines available for review. Traditional criteria of peer reviewed publications, authorship order, journal impact factor, grant funding, and national or international reputation were mentioned in 95% (n=87), 37% (34), 28% (26), 67% (62), and 48% (44) of the guidelines, respectively. Conversely, among non-traditional criteria, only citations (any mention in 26%; n=24) and accommodations for employment leave (37%; 34) were relatively commonly mentioned. Mention of alternative metrics for sharing research (3%; n=3) and data sharing (1%; 1) was rare, and three criteria (publishing in open access mediums, registering research, and adhering to reporting guidelines) were not found in any guidelines reviewed. Among guidelines for assessing promotion to full professor, traditional criteria were more commonly reported than non-traditional criteria (traditional criteria 54.2%, non-traditional items 9.5%; mean difference 44.8%, 95% confidence interval 39.6% to 50.0%; P=0.001). Notable differences were observed across continents in whether guidelines were accessible (Australia 100% (6/6), North America 97% (28/29), Europe 50% (27/54), Asia 58% (29/50), South America 17% (1/6)), with more subtle differences in the use of specific criteria.

Conclusions: This study shows that the evaluation of scientists emphasises traditional criteria as opposed to non-traditional criteria. This may reinforce research practices that are known to be problematic while insufficiently supporting the conduct of better quality research and open science. Institutions should consider incentivising non-traditional criteria.

Covid-19 Shows Scientific Journals Like Elsevier Need to Open Up – Bloomberg

“One big change brought on by Covid-19 is that virtually all the scientific research being produced about it is free to read. Anyone can access the many preliminary findings that scholars are posting on “preprint servers.” Data are shared openly via a multitude of different channels. Scientific journals that normally keep their articles behind formidable paywalls have been making an exception for new research about the virus, as well as much (if not all) older work relevant to it.

This response to a global pandemic is heartening and may well speed that pandemic to its end. But after that, what happens with scientific communication? Will everything go back behind the journal paywalls?

Well, no. Open-access advocates in academia have been pushing for decades to make more of their work publicly available and paywall-free, and in recent years they’ve been joined by the government agencies and large foundations that fund much scientific research. Covid-19 has accelerated this shift. I’m pretty sure there’s no going back. …”

The relationship between bioRxiv preprints, citations and altmetrics | Quantitative Science Studies | MIT Press Journals

Abstract:  A potential motivation for scientists to deposit their scientific work as preprints is to enhance its citation or social impact. In this study we assessed the citation and altmetric advantage of bioRxiv, a preprint server for the biological sciences. We retrieved metadata of all bioRxiv preprints deposited between November 2013 and December 2017, and matched them to articles that were subsequently published in peer-reviewed journals. Citation data from Scopus and altmetric data from Altmetric.com were used to compare citation and online sharing behavior of bioRxiv preprints, their related journal articles, and nondeposited articles published in the same journals. We found that bioRxiv-deposited journal articles had sizably higher citation and altmetric counts compared to nondeposited articles. Regression analysis reveals that this advantage is not explained by multiple explanatory variables related to the articles’ publication venues and authorship. Further research will be required to establish whether such an effect is causal in nature. bioRxiv preprints themselves are being directly cited in journal articles, regardless of whether the preprint has subsequently been published in a journal. bioRxiv preprints are also shared widely on Twitter and in blogs, but remain relatively scarce in mainstream media and Wikipedia articles, in comparison to peer-reviewed journal articles.

DORA Community Call: Strategies for responsible research assessment in the Asia-Pacific region – DORA

“DORA is pleased to announce its first webinar for the Asia-Pacific region with the Australasian Open Access Strategy Group on Thursday, July 2 at 12:00 PM Australian Eastern Standard Time (10:00 AM China Standard Time). The webinar is open to all and will provide an update from DORA and offer ideas about strategies to implement responsible research assessment practices….”

Does Tweeting Improve Citations? One-Year Results from the TSSMN Prospective Randomized Trial – ScienceDirect

Abstract: Background

The Thoracic Surgery Social Media Network (TSSMN) is a collaborative effort of leading journals in cardiothoracic surgery to highlight publications via social media. This study aims to evaluate the 1-year results of a prospective randomized social media trial to determine the effect of tweeting on subsequent citations and non-traditional bibliometrics.

Methods

A total of 112 representative original articles were randomized 1:1 to be tweeted via TSSMN or a control (non-tweeted) group. Measured endpoints included citations at 1-year compared to baseline, as well as article-level metrics (Altmetric score) and Twitter analytics. Independent predictors of citations were identified through univariable and multivariable regression analyses.

Results

When compared to control articles, tweeted articles achieved a significantly greater increase in Altmetric scores (Tweeted 9.4±5.8 vs. Non-Tweeted 1.0±1.8, p<0.001) and in Altmetric score percentiles relative to articles of similar age from each respective journal (Tweeted 76.0±9.1 %ile vs. Non-Tweeted 13.8±22.7 %ile, p<0.001), with a greater change in citations at 1 year (Tweeted +3.1±2.4 vs. Non-Tweeted +0.7±1.3, p<0.001). Multivariable analysis showed that independent predictors of citations were randomization to tweeting (OR 9.50; 95% CI 3.30-27.35, p<0.001), Altmetric score (OR 1.32; 95% CI 1.15-1.50, p<0.001), open-access status (OR 1.56; 95% CI 1.21-1.78, p<0.001), and exposure to a larger number of Twitter followers as quantified by impressions (OR 1.30, 95% CI 1.10-1.49, p<0.001).

Conclusions

One-year follow-up of this TSSMN prospective randomized trial importantly demonstrates that tweeting results in significantly more article citations over time, highlighting the durable scholarly impact of social media activity.
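The headline comparison in the results above can be reproduced approximately from the reported summary statistics alone. A minimal sketch, assuming 56 articles per arm (from the 1:1 randomization of 112 articles) and using Welch's t statistic — this is an illustration, not the paper's actual analysis code:

```python
import math

def welch_t(mean1: float, sd1: float, n1: int,
            mean2: float, sd2: float, n2: int) -> float:
    """Welch's t statistic for two independent samples with unequal variances."""
    standard_error = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / standard_error

# Change in citations at 1 year: tweeted +3.1 (SD 2.4) vs. non-tweeted +0.7 (SD 1.3)
t = welch_t(3.1, 2.4, 56, 0.7, 1.3, 56)
print(round(t, 2))  # a t statistic this far from zero is consistent with the reported p < 0.001
```

Summary-statistic reconstructions like this cannot recover the trial's exact p-values, but they are a quick sanity check on reported group differences.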

For China’s ambitious research reforms to be successful, they will need to be supported by new research assessment infrastructures | Impact of Social Sciences

“A radical reform of research evaluation and funding in China was recently launched in two prescriptive policy documents published by the Ministry of Science and Technology and the Ministry of Education. China now moves from a strong focus on Web of Science-based indicators towards a more balanced combination of qualitative and quantitative research evaluation, with one third of all publications to be oriented towards domestic journals. Universities are urged to implement the policy locally by the end of July at the latest. How to do it, and the possible consequences, have aroused intense discussion among Chinese academics and gained worldwide attention and debate. 

This change has not come out of the blue. In 2016, President Xi Jinping called for reform towards a more comprehensive evaluation system for individual researchers. Further, in 2018, a document issued by three ministries and two national central institutions specifically proposed moving away from the “Four only” phenomenon of recognising and rewarding “only papers, only titles, only diplomas and only awards”….

The growth of open access publishing in geochemistry – ScienceDirect

“Highlights

• 40% of articles in 2018-2019 published as Gold Open Access (OA).
• 70% in fully OA journals with a mean Article Processing Charge (APC) of US$900.
• 30% in historical hybrid journals with a higher APC of more than US$1,800.
• Correlation between number of OA articles in hybrid journals and impact factor.
• Relationship between number of OA articles in fully OA journals and APC….”