The relationship between bioRxiv preprints, citations and altmetrics | Quantitative Science Studies | MIT Press Journals

Abstract:  A potential motivation for scientists to deposit their scientific work as preprints is to enhance its citation or social impact. In this study we assessed the citation and altmetric advantage of bioRxiv, a preprint server for the biological sciences. We retrieved metadata of all bioRxiv preprints deposited between November 2013 and December 2017, and matched them to articles that were subsequently published in peer-reviewed journals. Citation data from Scopus and altmetric data from Altmetric.com were used to compare the citation and online sharing behavior of bioRxiv preprints, their related journal articles, and nondeposited articles published in the same journals. We found that bioRxiv-deposited journal articles had sizably higher citation and altmetric counts compared to nondeposited articles. Regression analysis reveals that this advantage is not explained by multiple explanatory variables related to the articles’ publication venues and authorship. Further research will be required to establish whether such an effect is causal in nature. bioRxiv preprints themselves are being directly cited in journal articles, regardless of whether the preprint has subsequently been published in a journal. bioRxiv preprints are also shared widely on Twitter and in blogs, but remain relatively scarce in mainstream media and Wikipedia articles, in comparison to peer-reviewed journal articles.



Does Tweeting Improve Citations? One-Year Results from the TSSMN Prospective Randomized Trial – ScienceDirect

Abstract:  Background

The Thoracic Surgery Social Media Network (TSSMN) is a collaborative effort of leading journals in cardiothoracic surgery to highlight publications via social media. This study aims to evaluate the 1-year results of a prospective randomized social media trial to determine the effect of tweeting on subsequent citations and non-traditional bibliometrics.


Methods

A total of 112 representative original articles were randomized 1:1 to be tweeted via TSSMN or to a control (non-tweeted) group. Measured endpoints included citations at 1 year compared to baseline, as well as article-level metrics (Altmetric score) and Twitter analytics. Independent predictors of citations were identified through univariable and multivariable regression analyses.


Results

When compared to control articles, tweeted articles achieved a significantly greater increase in Altmetric scores (Tweeted 9.4±5.8 vs. Non-Tweeted 1.0±1.8, p<0.001), higher Altmetric score percentiles relative to articles of similar age from each respective journal (Tweeted 76.0±9.1 %ile vs. Non-Tweeted 13.8±22.7 %ile, p<0.001), and a greater change in citations at 1 year (Tweeted +3.1±2.4 vs. Non-Tweeted +0.7±1.3, p<0.001). Multivariable analysis showed that independent predictors of citations were randomization to tweeting (OR 9.50; 95% CI 3.30-27.35, p<0.001), Altmetric score (OR 1.32; 95% CI 1.15-1.50, p<0.001), open-access status (OR 1.56; 95% CI 1.21-1.78, p<0.001), and exposure to a larger number of Twitter followers as quantified by impressions (OR 1.30; 95% CI 1.10-1.49, p<0.001).


Conclusions

One-year follow-up of this TSSMN prospective randomized trial demonstrates that tweeting results in significantly more article citations over time, highlighting the durable scholarly impact of social media activity.
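The odds ratios reported above describe multiplicative changes in odds, not in probabilities, which is easy to misread. As a minimal sketch (the 20% baseline probability is hypothetical, not a figure from the trial), an odds ratio can be converted into an implied probability like this:

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Convert a baseline probability plus an odds ratio into the implied probability."""
    odds = p_baseline / (1 - p_baseline)   # probability -> odds
    new_odds = odds * odds_ratio           # odds ratios multiply odds
    return new_odds / (1 + new_odds)       # odds -> probability

# Hypothetical 20% baseline chance of gaining a citation at 1 year,
# combined with the reported OR of 9.50 for randomization to tweeting:
print(apply_odds_ratio(0.20, 9.50))  # implied probability ≈ 0.70
```

Note that the same OR implies a different absolute change at a different baseline, which is why ORs alone do not convey effect size on the probability scale.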

10 tips for tweeting research | Nature Index

“A paper presented earlier this month at the CHEST Congress 2019 in Thailand by researchers from the University of Toronto in Canada found that when authors tweeted about their own work, they saw as much as a 3.5-fold increase in tweets about their studies that year from other people, compared with authors who did not tweet about their studies at all….

A recent study by Finch and his colleagues investigating social media responses to ornithology papers found that Altmetrics – which measure attention received by a paper, including how many times it’s viewed, downloaded, or mentioned on social media, in blogs, news articles, and elsewhere online – not only complement traditional measures of scholarly impact such as citations, but might also anticipate or even drive them….

According to a 2018 study by Isabelle Côté from Simon Fraser University in Canada and Emily Darling from the University of Toronto, more than half of the average scientist’s Twitter followers are other scientists….”


Open Access and Altmetrics in the pandemic age: Forecast analysis on COVID-19 literature | bioRxiv

Abstract:  We present an analysis of the uptake of open access (OA) in COVID-19-related literature, as well as the social media attention it gathers compared with non-OA papers. We use a dataset of publications curated by Dimensions and analyze articles and preprints. Our sample includes 11,686 publications, of which 67.5% are openly accessible. OA publications tend to receive the largest share of social media attention as measured by the Altmetric Attention Score (AAS). 37.6% of OA publications are bronze, meaning toll-access journals are providing free access. medRxiv contributes 36.3% of documents in repositories, but papers in bioRxiv exhibit a higher AAS on average. We predict the growth of COVID-19 literature over the following 30 days by estimating ARIMA models for the overall publication set, for OA vs. non-OA publications, and by location of the document (repository vs. journal). We estimate that COVID-19 publications will double in the next 20 days, but that non-OA publications will grow at a higher rate than OA publications. We conclude by discussing the implications of these findings for the dissemination and communication of research findings to mitigate the coronavirus outbreak.
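The paper estimates ARIMA models to forecast publication growth. As a much simpler illustrative stand-in (not the authors' method, and using a synthetic series rather than their data), a log-linear least-squares fit to a cumulative count recovers the implied doubling time behind a claim like "publications will double in 20 days":

```python
import math

def doubling_time_days(cum_counts):
    """Estimate doubling time (in days) from a daily cumulative-count series
    by fitting log(count) = a + r * t via ordinary least squares."""
    ts = list(range(len(cum_counts)))
    ys = [math.log(c) for c in cum_counts]
    n = len(ts)
    t_bar = sum(ts) / n
    y_bar = sum(ys) / n
    # OLS slope r of log-counts against time
    r = sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys)) \
        / sum((t - t_bar) ** 2 for t in ts)
    return math.log(2) / r  # exponential growth doubles every ln(2)/r days

# Synthetic cumulative series that doubles every 20 days: 100 * 2**(t/20)
series = [100 * 2 ** (t / 20) for t in range(30)]
print(round(doubling_time_days(series), 1))  # -> 20.0
```

A full ARIMA model additionally captures autocorrelated day-to-day fluctuations; the log-linear fit only models the smooth exponential trend.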


Research Square Partners with Dimensions to Provide Citation Data on Preprints | Research Square

”Research Square, the company behind the world’s fastest-growing preprint platform, is partnering with Dimensions to provide early citation data on preprints. The Dimensions Badge will now display on all Research Square preprints that have been cited and will provide 4 different types of data: the total citations, most recent citations, Field Citation Ratio (FCR), and Relative Citation Ratio (RCR)….”

Social Media Coverage of Scientific Articles Immediately After Publication Predicts Subsequent Citations – #SoME_Impact Score: Observational Analysis | Sathianathen | Journal of Medical Internet Research

“Social media attention predicts citations and could be used as an early surrogate measure of scientific impact. Owing to the cross-sectional study design, we cannot determine whether correlation relates to causation.”

Social engagement and institutional repositories: a case study

Abstract:  This article explores the community reach and societal impact of institutional repositories, in particular Griffith Research Online (GRO), Griffith University’s institutional repository. To promote research on GRO, and to encourage people to click through to the repository content, a pilot social media campaign and some subsequent smaller social media activities were undertaken in 2018. After briefly touching on these campaigns, this article provides some reflections from these activities and proposes options for the future direction of social engagement and GRO in particular, and for institutional repositories in general. This undertaking necessitates a shift in focus from repositories as a resource for the scholarly community to a resource for the community at large. The campaign also highlighted the need to look beyond performance metrics to social media metrics as a measure of the social and community impact of a repository.

Whilst the article is written from one Australian university’s perspective, the drivers and challenges behind researchers and universities translating their research into economic, social, environmental and cultural impacts are national and international. The primary takeaway message is for libraries to take more of a proactive stance and to kick-start conversations within their institutions and with their clients to actively partner in creating opportunities to share research.

Journal transparency index will be ‘alternative’ to impact scores | Times Higher Education (THE)

“A new ranking system for academic journals measuring their commitment to research transparency will be launched next month – providing what many believe will be a useful alternative to journal impact scores.

Under a new initiative from the Center for Open Science, based in Charlottesville, Virginia, more than 300 scholarly titles in psychology, education and biomedical science will be assessed on 10 measures related to transparency, with their overall result for each category published in a publicly available league table.

The centre aims to provide scores for about 1,000 journals within six to eight months of their site’s launch in early February….”

Do articles in open access journals have more frequent altmetric activity than articles in subscription-based journals? An investigation of the research output of Finnish universities | SpringerLink

Abstract:  Scientific articles available in Open Access (OA) have been found to attract more citations and online attention to the extent that it has become common to speak about OA Altmetrics Advantage. This research investigates how the OA Altmetrics Advantage holds for a specific case of research articles, namely the research outputs from universities in Finland. Furthermore, this research examines disciplinary and platform specific differences in that (dis)advantage. The new methodological approaches developed in this research focus on relative visibility, i.e. how often articles in OA journals receive at least one mention on the investigated online platforms, and relative receptivity, i.e. how frequently articles in OA journals gain mentions in comparison to articles in subscription-based journals. The results show significant disciplinary and platform specific differences in the OA advantage, with articles in OA journals within for instance veterinary sciences, social and economic geography and psychology receiving more citations and attention on social media platforms, while the opposite was found for articles in OA journals within medicine and health sciences. The results strongly support field- and platform-specific considerations when assessing the influence of journal OA status on altmetrics. The new methodological approaches used in this research will serve future comparative research into OA advantage of scientific articles over time and between countries.
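The study's two measures, relative visibility and relative receptivity, can be made concrete with a short sketch (the mention counts below are toy values for illustration, not data from the study):

```python
def visibility(mentions):
    """Share of articles with at least one mention on a platform."""
    return sum(1 for m in mentions if m > 0) / len(mentions)

def mean_mentions(mentions):
    """Average number of mentions per article on a platform."""
    return sum(mentions) / len(mentions)

# Toy per-article mention counts on one platform (hypothetical data)
oa_mentions = [0, 3, 1, 5, 0, 2]            # articles in OA journals
sub_mentions = [0, 1, 0, 0, 2, 0]           # articles in subscription journals

# Relative visibility: how much more often OA articles get >= 1 mention
rel_visibility = visibility(oa_mentions) / visibility(sub_mentions)

# Relative receptivity: how many more mentions OA articles gain on average
rel_receptivity = mean_mentions(oa_mentions) / mean_mentions(sub_mentions)
print(rel_visibility, rel_receptivity)
```

Values above 1 indicate an OA advantage on that platform and below 1 a disadvantage, which is why the ratios can flip sign of the effect between disciplines, as the abstract reports for medicine and health sciences.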

Dutch universities and research funders move away from the impact factor – ScienceGuide

“By the end of 2019, all parties involved in this project pledge to have signed DORA. This commitment has to be more than an empty gesture. For example, norms such as ‘four publications to obtain a PhD’ will be abolished, and NWO and ZonMw will no longer inquire about h-index or journal impact factor when academics submit grant applications. Instead of asking for a publication list and CV, they will ask for a more ‘narrative’ approach – inquiring about why this research is important, and why the applicant is the right person to carry it out.

This change will be fast, but not instant. The parties involved acknowledge that change takes time. Considering that the focus on metrics such as impact factors took decades to become part of established practice, unlearning these routines will require a considerable amount of time, energy and perseverance. Correctly identifying diverse forms of talent and ‘good research’ will be a learning experience: “To accelerate the desired cultural change in recognition and rewards, we at NWO and ZonMW will strongly focus on training and instruction for our grant evaluation committees.” …”