All Citations Aren’t Created Equal

“Indeed, when we as researchers first look at a paper, we look at where it was published, who the authors are and where they are from, and some metrics like downloads, reads, and of course, citations. Most of this information is superficial, contributing little to a real understanding of the research. Even citations, as used today, mostly serve as a number for making a quick assessment of the work, where the more citations an article has, the better.

However, citations represent a wealth of information. Behind each of the 41 articles that cite my work are years of directly related research and many thousands, if not millions, of dollars of research funding. But if I want to learn what these articles say about my work, I would need to read each of them. This is so impractical that it is effectively never fully done.

We’re changing that at scite, a new platform that uses deep learning to show how an article has been cited and, specifically, whether it has been supported or contradicted, where the citations appear in the citing paper, and whether it is a self-cite or a citation from a review or article. In short, we want to make citations smart: citations that not only tell how many times an article has been cited, but also provide the context for each citation and its meaning, such as whether it offers supporting or contradicting evidence for the cited claim. …”

Europe PMC Integrates Smart Citations from scite – scite – Medium

“scite, an award-winning citation analysis platform, and Europe PMC, an open science discovery tool that provides access to a worldwide collection of life science publications, have partnered to display what scite calls smart citations on the Europe PMC platform.

Smart citations advance regular citations by providing more contextual information beyond the information that one study references another. Specifically, smart citations provide the excerpt of text surrounding the citation, the section of the article in which the reference is mentioned, and indicate whether the citing study provides supporting or contradicting evidence. As a result, one can evaluate a study of interest much faster….”
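The description above implies a simple record shape for a smart citation: the citing and cited works, the surrounding text, the section, and a supporting/contradicting/mentioning label. A minimal sketch in Python, with hypothetical field and type names (scite's actual API schema may differ):

```python
from dataclasses import dataclass
from enum import Enum

class CitationType(Enum):
    SUPPORTING = "supporting"
    CONTRADICTING = "contradicting"
    MENTIONING = "mentioning"

@dataclass
class SmartCitation:
    citing_doi: str
    cited_doi: str
    snippet: str           # excerpt of text surrounding the in-text citation
    section: str           # section of the citing article where it appears
    citation_type: CitationType
    is_self_cite: bool

def tally(citations):
    """Count citations by type -- the kind of summary a smart-citation view shows."""
    counts = {t: 0 for t in CitationType}
    for c in citations:
        counts[c.citation_type] += 1
    return counts
```

A reader of such records could, for example, filter to contradicting citations in Results sections to evaluate a claim quickly.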

Predatory-journal papers have little scientific impact

“Predatory journals are those that charge authors high article-processing fees but don’t provide expected publishing services, such as peer review or other quality checks. Researchers and publishers have long voiced fears that these practices could be harming research by flooding the literature with poor-quality studies.

But the authors of the analysis, posted to the preprint server arXiv on 21 December, say their findings suggest papers in predatory journals have “very limited readership among academics”, and therefore have little effect on science….”

The Citation Advantage of Promoted Articles in a Cross-Publisher Distribution Platform: A 12-Month Randomized Controlled Trial – Kudlow – – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  There is currently a paucity of evidence-based strategies that have been shown to increase citations of peer-reviewed articles following their publication. We conducted a 12-month randomized controlled trial to examine whether the promotion of article links in an online cross-publisher distribution platform (TrendMD) affects citations. In all, 3,200 articles published in 64 peer-reviewed journals across eight subject areas were block randomized at the subject level to either the TrendMD group (n = 1,600) or the control group (n = 1,600) of the study. Our primary outcome compares the mean citations of articles randomized to TrendMD versus control after 12 months. Articles randomized to TrendMD showed a 50% increase in mean citations relative to control at 12 months. The difference in mean citations at 12 months for articles randomized to TrendMD versus control was 5.06, 95% confidence interval [2.87, 7.25], was statistically significant (p < .001) and found in three of eight subject areas. At 6 months following publication, articles randomized to TrendMD showed a smaller, yet statistically significant (p = .005), 21% increase in mean citations, relative to control. To our knowledge, this is the first randomized controlled trial to demonstrate how an intervention can be used to increase citations of peer-reviewed articles after they have been published.
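The reported confidence interval lets a reader sanity-check the abstract's numbers: assuming a normal approximation, the implied standard error and z-statistic can be recovered from the difference in means and its 95% CI alone.

```python
# Recover the implied standard error and z-statistic from the reported
# difference in mean citations (5.06) and its 95% CI [2.87, 7.25],
# assuming a normal approximation (CI half-width = 1.96 * SE).
diff = 5.06
ci_low, ci_high = 2.87, 7.25

se = (ci_high - ci_low) / (2 * 1.96)  # implied standard error
z = diff / se                         # implied z-statistic

print(round(se, 2))  # ~1.12
print(round(z, 2))   # ~4.53, consistent with the reported p < .001
```

A z-statistic above 3.3 corresponds to a two-sided p below .001, so the reported significance is internally consistent.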

 

Articles in ‘predatory’ journals receive few or no citations | Science | AAAS

“Six of every 10 articles published in a sample of “predatory” journals attracted not one single citation over a 5-year period, according to a new study. Like many open-access journals, predatory journals charge authors to publish, but they offer little or no peer review or other quality controls and often use aggressive marketing tactics. The new study found that the few articles in predatory journals that received citations did so at a rate much lower than papers in conventional, peer-reviewed journals.

The authors say the finding allays concerns that low-quality or misleading studies published in these journals are getting undue attention. “There is little harm done if nobody reads and, in particular, makes use of such results,” write Bo-Christer Björk of the Hanken School of Economics in Finland and colleagues in a preprint posted 21 December 2019 on arXiv.

But Rick Anderson, an associate dean at the University of Utah who oversees collections in the university’s main library, says the finding that 40% of the predatory journal articles drew at least one citation “strikes me as pretty alarming.” …”

Revisiting the Open Access Citation Advantage for Legal Scholarship

“Citation studies in law have shown a significant citation advantage for open access legal scholarship. A recent cross-disciplinary study, however, gave opposite results. This article shows how methodology, including the definition of open access and the source of the citation data, can affect the results of open access citation studies.”


[1912.08648] Inferring the causal effect of journals on citations

Abstract:  Articles in high-impact journals are by definition more highly cited on average. But are they cited more often because the articles are somehow “better”? Or are they cited more often simply because they appeared in a high-impact journal? Although some evidence suggests the latter, the causal relationship is not clear. Here we compare citations of published journal articles to citations of their preprint versions to uncover the causal mechanism. We build on an earlier model to infer the causal effect of journals on citations. We find evidence for both effects. We show that high-impact journals seem to select articles that tend to attract more citations. At the same time, we find that high-impact journals augment the citation rate of published articles. Our results yield a deeper understanding of the role of journals in the research system. The use of journal metrics in research evaluation has been increasingly criticised in recent years and article-level citations are sometimes suggested as an alternative. Our results show that removing impact factors from evaluation does not negate the influence of journals. This insight has important implications for changing practices of research evaluation.

 

A study of the impact of data sharing on article citations using journal policies as a natural experiment

Abstract:  This study estimates the effect of data sharing on the citations of academic articles, using journal policies as a natural experiment. We begin by examining 17 high-impact journals that have adopted the requirement that data from published articles be publicly posted. We match these 17 journals to 13 journals without policy changes and find that empirical articles published just before their change in editorial policy have citation rates with no statistically significant difference from those published shortly after the shift. We then ask whether this null result stems from poor compliance with data sharing policies, and use the data sharing policy changes as instrumental variables to examine more closely two leading journals in economics and political science with relatively strong enforcement of new data policies. We find that articles that make their data available receive 97 additional citations (estimated standard error of 34). We conclude that: a) authors who share data may be rewarded eventually with additional scholarly citations, and b) data-posting policies alone do not increase the impact of articles published in a journal unless those policies are enforced.
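The instrumental-variables logic in the abstract can be illustrated with the classic Wald estimator for a binary instrument: the policy change shifts how many articles share data, and the citation gain is scaled by that shift. All numbers below are hypothetical, not the study's:

```python
# Wald/IV estimator sketch with a binary instrument (toy numbers):
#   instrument Z = journal adopted a data-sharing policy
#   treatment  D = article actually shared its data
#   outcome    Y = citations
mean_y_z1, mean_y_z0 = 30.0, 24.0  # mean citations with/without the policy (hypothetical)
share_z1, share_z0 = 0.70, 0.10    # data-sharing rates with/without the policy (hypothetical)

# Effect of sharing per "complier" = citation shift / sharing-rate shift
iv_effect = (mean_y_z1 - mean_y_z0) / (share_z1 - share_z0)
print(iv_effect)  # 10.0 extra citations per data-sharing article, in this toy example
```

This is why weak enforcement matters: if the policy barely moves the sharing rate, the denominator shrinks and the policy alone produces little citation impact, matching conclusion (b).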

 

Meta-Research: Releasing a preprint is associated with more attention and citations for the peer-reviewed article | eLife

Abstract:  Preprints in biology are becoming more popular, but only a small fraction of the articles published in peer-reviewed journals have previously been released as preprints. To examine whether releasing a preprint on bioRxiv was associated with the attention and citations received by the corresponding peer-reviewed article, we assembled a dataset of 74,239 articles, 5,405 of which had a preprint, published in 39 journals. Using log-linear regression and random-effects meta-analysis, we found that articles with a preprint had, on average, a 49% higher Altmetric Attention Score and 36% more citations than articles without a preprint. These associations were independent of several other article- and author-level variables (such as scientific subfield and number of authors), and were unrelated to journal-level variables such as access model and Impact Factor. This observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
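At its core, the log-linear comparison described above amounts to comparing mean log-citations between articles with and without a preprint and exponentiating the difference to get a percentage effect. A toy sketch with invented numbers (not the study's data, and omitting the covariates and meta-analysis the paper uses):

```python
import math

# Toy illustration of a log-linear group comparison:
# exp(difference in mean log-citations) - 1 is the implied percentage increase.
with_preprint = [20, 35, 18, 50]     # citation counts, hypothetical
without_preprint = [12, 25, 10, 30]  # citation counts, hypothetical

def mean_log(xs):
    return sum(math.log(x) for x in xs) / len(xs)

b1 = mean_log(with_preprint) - mean_log(without_preprint)
pct_increase = (math.exp(b1) - 1) * 100
print(round(pct_increase, 1))  # roughly a 60% increase in this toy example
```

The actual study estimates this coefficient within a regression controlling for article- and author-level variables, then pools journal-specific estimates by random-effects meta-analysis.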