Manipulation of bibliometric data by editors of scientific journals

“Such misuse of terms not only justifies the erroneous practice of research bureaucracy of evaluating research performance on those terms but also encourages editors of scientific journals and reviewers of research papers to ‘game’ the bibliometric indicators. For instance, if a journal seems to lack an adequate number of citations, the editor of that journal might decide to make it obligatory for its authors to cite papers from the journal in question. I know an Indian journal of fairly reasonable quality by several other criteria, but I can no longer consider it so because it forces authors to include unnecessary (that is, plainly false) citations to papers in that journal. Any further assessment of this journal that includes self-citations will lead to a distorted measure of its real status….

An average paper in the natural or applied sciences lists at least 10 references [1]. Some enterprising editors have taken this number to be the minimum for papers submitted to their journals. Such a norm is enforced in many journals from Belarus, and we authors are now so used to it that we do not even realize the distortions it creates in bibliometric data. Indeed, I often notice that some authors – merely to meet the norm of at least 10 references – cite very old textbooks and Internet resources with URLs that are no longer valid. The average for a good paper may be more than 10 references, and a paper with fewer than 10 references may yet be a good paper (Einstein’s first paper did not contain even one reference in its original version!). I believe that it is up to a peer reviewer to judge whether the author has given enough references and whether they are suitable; it is not for a journal’s editor to set any mandatory quota for the number of references….

Some international journals intervene arbitrarily to revise the citations in articles they receive: I submitted a paper with my colleagues to an American journal in 2017, and one of the reviewers demanded that we replace Russian-language references with references in English. Two of us responded with a correspondence note titled ‘Don’t dismiss non-English citations’, which we submitted to Nature: in publishing that note, the editors of Nature removed some references – from the very paper [2] that condemned the practice of replacing an author’s references with those more to the editor’s liking – and replaced them with a perhaps more relevant reference to a paper that we had never even read at that point! … 

Editors of many international journals are now looking not for quality papers but for papers that will not lower the impact factor of their journals….”

The open access advantage for studies of human electrophysiology: Impact on citations and Altmetrics – ScienceDirect

“Highlights

• Barriers to accessing science contribute to knowledge inequalities

• 35% of articles published in the last 20 years in electrophysiology are open access.

• Open access articles received 9–21% more citations and 39% more Altmetric mentions.

• Green open access (author-archived) enjoyed a benefit similar to that of Gold open access.

• Studies of human electrophysiology enjoy the “open access advantage” in citations….”

 

Altmetrics: Part 2 – Celebrating Altmetric’s Decade – An ATG Original – Charleston Hub

“Altmetric, the company, has been in existence for ten years now. The company has grown, and to get the view from company officials themselves, we submitted questions and various company officials responded to give us an inside look at Altmetric today – and what we might expect in the future….”

Factors associated with high Altmetric Attention Score in dermatology research – Iglesias-Puzas – Australasian Journal of Dermatology – Wiley Online Library

Abstract:  Background

Alternative metrics are emerging scores to assess the impact of research beyond the academic environment.

Objective

To analyse whether a correlation exists between manuscript characteristics and alternative citation metrics.

Materials and methods

This bibliometric analysis included original articles published in the five journals with the highest impact factors during 2019.

We extracted the following characteristics from each record: journal, publication month, title, number of authors, type of institution, type of publication, research topic, number of references, financial support, free/open access status and literature citations. The main outcome measure was the identification of variables associated with higher social attention (an Altmetric Attention Score ≥25) using binary logistic regression. Model performance was assessed by the change in the area under the curve (AUC).
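The analysis described above can be sketched in code: a binary logistic regression predicting "high attention" (Altmetric Attention Score ≥25) from manuscript characteristics, with performance summarized by the AUC. This is a minimal illustration using simulated data, not the authors' actual dataset or code; the variable names and effect sizes are assumptions made for demonstration.

```python
# Hypothetical sketch of a "high Altmetric attention" logistic regression.
# The data are simulated; only the modeling approach mirrors the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 840  # number of manuscripts in the study

# Illustrative binary predictors: open access, declared conflicts of interest
open_access = rng.integers(0, 2, n)
conflict = rng.integers(0, 2, n)
X = np.column_stack([open_access, conflict])

# Simulated outcome: high attention made more likely by both predictors
# (toy effects, chosen only so the example has signal to detect)
logit = -2.0 + 1.2 * open_access + 0.8 * conflict
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
probs = model.predict_proba(X)[:, 1]
print("odds ratios:", np.exp(model.coef_[0]))  # exponentiated coefficients
print("AUC:", roc_auc_score(y, probs))
```

Exponentiating the fitted coefficients yields odds ratios comparable in form to those reported in the Results (e.g. OR 3.2 for open access), and the AUC measures how well the characteristics discriminate high-attention articles.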

Results

A total of 840 manuscripts were included. The Altmetric scores across all five journals ranged from 0 to 465 (mean 12.51 ± 33.7; median 3). The most prevalent topic was skin cancer, and the most common study design was clinical science. The scientific journal (P < 0.001), the presence of conflicts of interest (OR 2.2 [95%CI 1.3–3.7]; P = 0.002) and open access status (OR 3.2 [95%CI 1.6–6.7]; P = 0.002) were identified as independent predictors of high Altmetric scores.

Conclusions

Our study suggests that an article’s social recognition may depend on certain manuscript characteristics, thus providing useful information on the dissemination of dermatology research to the general public.

A Farewell to ALM, but not to article-level metrics! – The Official PLOS Blog

“In fact, the altmetrics movement has been so successful that it has spawned a market of providers who specialize in collecting and curating metrics and metadata about how research outputs are used and discussed. 

One of these services, in particular, has far outpaced the reach and capabilities of ALM, and PLOS is now excited to pass the baton of our altmetrics operations to the experts at Altmetric….”


The Most Widely Disseminated COVID-19-Related Scientific Publications in Online Media: A Bibliometric Analysis of the Top 100 Articles with the Highest Altmetric Attention Scores

Abstract:  The novel coronavirus disease 2019 (COVID-19) is a global pandemic. This study’s aim was to identify and characterize the top 100 COVID-19-related scientific publications, which had received the highest Altmetric Attention Scores (AASs). Hence, we searched Altmetric Explorer using search terms such as “COVID” or “COVID-19” or “Coronavirus” or “SARS-CoV-2” or “nCoV” and then selected the top 100 articles with the highest AASs. For each article identified, we extracted the following information: the overall AAS, publishing journal, journal impact factor (IF), date of publication, language, country of origin, document type, main topic, and accessibility. The top 100 articles most frequently were published in journals with high (>10.0) IF (n = 67), were published between March and July 2020 (n = 67), were written in English (n = 100), originated in the United States (n = 45), were original articles (n = 59), dealt with treatment and clinical manifestations (n = 33), and had open access (n = 98). Our study provides important information pertaining to the dissemination of scientific knowledge about COVID-19 in online media.

 

Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics – Lemke – – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  The amount of annually published scholarly articles is growing steadily, as is the number of indicators through which the impact of publications is measured. Little is known about how the increasing variety of available metrics affects researchers’ processes of selecting literature to read. We conducted ranking experiments embedded in an online survey with 247 participating researchers, most of them from the social sciences. Participants completed a series of tasks in which they were asked to rank fictitious publications by their expected relevance, based on their scores on six prototypical metrics. Through applying logistic regression, cluster analysis, and manual coding of survey answers, we obtained detailed data on how prominent metrics for research impact influence our participants’ decisions about which scientific articles to read. Survey answers revealed a combination of qualitative and quantitative characteristics that researchers consult when selecting literature, while regression analysis showed that among quantitative metrics, citation counts tend to be of highest concern, followed by Journal Impact Factors. Our results suggest a comparatively favorable view of many researchers on bibliometrics and widespread skepticism toward altmetrics. The findings underline the importance of equipping researchers with solid knowledge about specific metrics’ limitations, as these seem to play significant roles in researchers’ everyday relevance assessments.

 

How is science clicked on Twitter? Click metrics for Bitly short links to scientific publications – Fang – – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  To provide some context for the potential engagement behavior of Twitter users around science, this article investigates how Bitly short links to scientific publications embedded in scholarly Twitter mentions are clicked on Twitter. Based on the click metrics of over 1.1 million Bitly short links referring to Web of Science (WoS) publications, our results show that around 49.5% of them were never clicked by Twitter users. For those Bitly short links that did receive clicks from Twitter, the majority of their Twitter clicks accumulated within a short period of time after they were first tweeted. Bitly short links to publications in the Social Sciences and Humanities tend to attract more clicks from Twitter than those in other subject fields. This article also assesses the extent to which Twitter clicks are correlated with other impact indicators: Twitter clicks are weakly correlated with scholarly impact indicators (WoS citations and Mendeley readers), but moderately correlated with other Twitter engagement indicators (total retweets and total likes). In light of these results, we highlight the importance of paying more attention to the click metrics of URLs in scholarly Twitter mentions, to improve our understanding of the more effective dissemination and reception of science information on Twitter.
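The correlation comparison the abstract describes (weak correlation of clicks with scholarly indicators, moderate correlation with engagement indicators) is typically computed as a rank correlation, since click and citation counts are heavily skewed. The sketch below illustrates the idea with Spearman's rho on simulated counts; the data and the generating parameters are invented for demonstration and are not the study's figures.

```python
# Illustrative sketch (not the authors' code): Spearman rank correlation
# between Bitly click counts and other indicators for tweeted publications.
# All counts below are simulated toy data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 1000  # tweeted publications with Bitly links (illustrative sample size)

clicks = rng.poisson(5, n).astype(float)
# An engagement indicator simulated to track clicks closely (retweets)...
retweets = clicks + rng.poisson(5, n)
# ...and a scholarly indicator only weakly tied to clicks (citations)
citations = 0.2 * clicks + rng.poisson(10, n)

rho_engagement, _ = spearmanr(clicks, retweets)
rho_scholarly, _ = spearmanr(clicks, citations)
print(f"clicks vs retweets:  rho = {rho_engagement:.2f}")
print(f"clicks vs citations: rho = {rho_scholarly:.2f}")
```

Rank correlation is preferred here over Pearson's r because a handful of viral links would otherwise dominate the statistic.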

 
