Releasing a preprint is associated with more attention and citations | bioRxiv

Abstract:  Preprints in the life sciences are gaining popularity, but release of a preprint still precedes only a fraction of peer-reviewed publications. Quantitative evidence on the relationship between preprints and article-level metrics of peer-reviewed research remains limited. We examined whether having a preprint on bioRxiv.org was associated with the Altmetric Attention Score and number of citations of the corresponding peer-reviewed article. We integrated data from PubMed, CrossRef, Altmetric, and Rxivist (a collection of bioRxiv metadata). For each of 26 journals (comprising a total of 46,451 articles and 3,817 preprints), we used log-linear regression, adjusted for publication date and scientific subfield, to estimate fold-changes of Attention Score and citations between articles with and without a preprint. We also performed meta-regression of the fold-changes on journal-level characteristics. By random effects meta-analysis across journals, releasing a preprint was associated with a 1.53 times higher Attention Score + 1 (95% CI 1.42 to 1.65) and 1.31 times more citations + 1 (95% CI 1.24 to 1.38) of the peer-reviewed article. Journals with larger fold-changes of Attention Score tended to have lower impact factors and lower percentages of articles released as preprints. In contrast, a journal’s fold-change of citations was not associated with impact factor, percentage of articles released as preprints, or access model. The findings from this observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
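The per-journal model described here (a log-linear regression of Attention Score + 1 on preprint status, adjusted for publication date and subfield, with the coefficient exponentiated into a fold-change) can be sketched roughly as follows. This is a minimal illustration rather than the authors' code: the column names are invented and the subfield adjustment is omitted.

```python
# Minimal sketch of the per-journal fold-change estimate described in the
# abstract. Column names (attention_score, has_preprint, pub_date) are
# illustrative; the actual study also adjusted for scientific subfield.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fold_change_for_journal(articles: pd.DataFrame):
    """Return the fold-change of (Attention Score + 1) and its 95% CI."""
    df = articles.copy()
    df["log_attention"] = np.log(df["attention_score"] + 1)
    # Days since the journal's earliest article stand in for publication date.
    df["pub_days"] = (df["pub_date"] - df["pub_date"].min()).dt.days
    model = smf.ols("log_attention ~ has_preprint + pub_days", data=df).fit()
    coef = model.params["has_preprint"]
    lo, hi = model.conf_int().loc["has_preprint"]
    return np.exp(coef), np.exp(lo), np.exp(hi)
```

Per-journal fold-changes estimated this way could then be pooled with a random effects meta-analysis, as the abstract describes.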

Proposal for a Standard Article Metrics Dashboard to Replace the Journal Impact Factor

Abstract:  This paper proposes the creation of a dashboard consisting of five metrics that could be used to replace the journal impact factor. It should be especially useful in circumstances, such as promotion and tenure committees, where the evaluators do not share the authors' subject expertise and where they are working under time constraints.

Share or perish: Social media and the International Journal of Mental Health Nursing – McNamara – International Journal of Mental Health Nursing – Wiley Online Library

Abstract:  The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information. The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research. The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles. To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor start date, and the eighteen months after that date (i.e., from 1 July 2015 to 30 June 2018) were acquired and analysed. Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes. This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.
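The before/after design sketched in this abstract (article-level metrics for the eighteen months on either side of the editor's 1 January 2017 start date) amounts to a simple split-and-summarise. A rough illustration, with assumed column names rather than the study's actual data, might look like this:

```python
# Rough sketch of the pre/post comparison described above: split article
# metrics at the social media editor's start date and summarise each
# 18-month window. The columns pub_date and altmetric_score are assumed.
import pandas as pd

def before_after_summary(articles: pd.DataFrame) -> pd.DataFrame:
    cutoff = pd.Timestamp("2017-01-01")
    window = articles[
        (articles["pub_date"] >= pd.Timestamp("2015-07-01"))
        & (articles["pub_date"] < pd.Timestamp("2018-07-01"))
    ].copy()
    window["period"] = window["pub_date"].lt(cutoff).map({True: "before", False: "after"})
    return window.groupby("period")["altmetric_score"].agg(["count", "mean", "median"])
```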

Journal impact factor: a bumpy ride in an open space | Journal of Investigative Medicine

Abstract:  The journal impact factor (IF) is the leading method of scholarly assessment in today’s research world. An important question is whether or not this is still a constructive method. For a specific journal, the IF is the number of citations for publications over the previous 2 years divided by the number of total citable publications in these years (the citation window). Although this simplicity works to the advantage of this method, complications arise when answers to questions such as ‘What is included in the citation window’ or ‘What makes a good journal impact factor’ contain ambiguity. In this review, we discuss whether or not the IF should still be considered the gold standard of scholarly assessment in view of the many recent changes and the emergence of new publication models. We will outline its advantages and disadvantages. The advantages of the IF include promoting the author while giving readers a visualization of the magnitude of review. On the other hand, its disadvantages include reflecting the journal’s quality more than the author’s work, the fact that it cannot be compared across different research disciplines, and the struggles it faces in the world of open access. Recently, alternatives to the IF have been emerging, such as the SCImago Journal & Country Rank, the Source Normalized Impact per Paper and the Eigenfactor Score, among others. However, all alternatives proposed thus far are associated with their own limitations as well. In conclusion, although the IF has its drawbacks, until better alternative methods are proposed, it remains one of the most effective ways of assessing scholarly activity.
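The two-year citation-window definition quoted here is simple enough to compute directly; the following toy function and numbers are purely illustrative, not data about any real journal:

```python
# Two-year impact factor: citations received in year Y to items published
# in years Y-1 and Y-2, divided by the citable items published in those
# two years. Inputs are hypothetical dictionaries, for illustration only.
def journal_impact_factor(citations_this_year, citable_items, year):
    cites = citations_this_year.get(year - 1, 0) + citations_this_year.get(year - 2, 0)
    items = citable_items.get(year - 1, 0) + citable_items.get(year - 2, 0)
    return cites / items if items else 0.0

# Example: 2,400 citations in 2019 to 800 citable items from 2017-2018 -> IF = 3.0
print(journal_impact_factor({2017: 1100, 2018: 1300}, {2017: 420, 2018: 380}, 2019))
```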

The effect of bioRxiv preprints on citations and altmetrics | bioRxiv

Abstract:  A potential motivation for scientists to deposit their scientific work as preprints is to enhance its citation or social impact, an effect which has been empirically observed for preprints in physics, astronomy and mathematics deposited to arXiv. In this study we assessed the citation and altmetric advantage of bioRxiv, a preprint server for the biological sciences. We retrieved metadata of all bioRxiv preprints deposited between November 2013 and December 2017, and matched them to articles that were subsequently published in peer-reviewed journals. Citation data from Scopus and altmetric data from Altmetric.com were used to compare citation and online sharing behaviour of bioRxiv preprints, their related journal articles, and non-deposited articles published in the same journals. We found that bioRxiv-deposited journal articles received a sizeable citation and altmetric advantage over non-deposited articles. Regression analysis reveals that this advantage is not explained by multiple explanatory variables related to the article and its authorship. bioRxiv preprints themselves are being directly cited in journal articles, regardless of whether the preprint has been subsequently published in a journal. bioRxiv preprints are also shared widely on Twitter and in blogs, but remain relatively scarce in mainstream media and Wikipedia articles, in comparison to peer-reviewed journal articles.
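The kind of regression this abstract alludes to, citation counts modelled on bioRxiv deposition while controlling for article-level covariates, can be sketched as below. The covariates and column names are placeholders; the study's actual model used its own set of explanatory variables and also covered altmetric outcomes.

```python
# Hedged sketch of a citation-advantage regression. Negative binomial
# regression is a common choice for over-dispersed citation counts; the
# exponentiated coefficient on `deposited` is a rate ratio (advantage).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def citation_advantage(articles: pd.DataFrame) -> float:
    model = smf.negativebinomial(
        "citations ~ deposited + n_authors + months_since_publication",
        data=articles,
    ).fit(disp=False)
    return float(np.exp(model.params["deposited"]))
```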

The European University Association and Science Europe Join Efforts to Improve Scholarly Research Assessment Methodologies

“Evaluating research and assessing researchers is fundamental to the research enterprise and core to the activities of research funders and research performing organisations, as well as universities. The European University Association (EUA) and Science Europe are committed to building a strong dialogue between their members, who share the responsibility of developing and implementing more accurate, open, transparent and responsible approaches that better reflect the evolution of research activity in the digital era.

Today, the outcomes of scholarly research are often measured through methods based on quantitative, albeit approximate, indicators such as the journal impact factor. There is a need to move away from reductionist ways of assessing research, as well as to establish systems that better assess research potential. Universities, research funders and research performing organisations are well-placed to explore new and improved research assessment approaches, while also being indispensable in turning these innovations into systemic reforms….”

Rethinking impact factors: better ways to judge a journal

“Global efforts are afoot to create a constructive role for journal metrics in scholarly publishing and to displace the dominance of impact factors in the assessment of research. To this end, a group of bibliometric and evaluation specialists, scientists, publishers, scientific societies and research-analytics providers are working to hammer out a broader suite of journal indicators, and other ways to judge a journal’s qualities. It is a challenging task: our interests vary and often conflict, and change requires a concerted effort across publishing, academia, funding agencies, policymakers and providers of bibliometric data.

Here we call for the essential elements of this change: expansion of indicators to cover all functions of scholarly journals, a set of principles to govern their use and the creation of a governing body to maintain these standards and their relevance….”

Altmetrics Come to OJS: Announcing the Paperbuzz Plugin | Public Knowledge Project

“PKP is pleased to announce the release of the Paperbuzz Plugin for Open Journal Systems (OJS) versions 3.1.2 and above, built in cooperation with the Paperbuzz team at Impactstory. This new plugin will bring free altmetrics (an alternative to traditional citation-based metrics) based on open data to thousands of OJS journals….”
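For context, a hedged sketch of how a client might pull those open altmetric events for a single DOI from the Paperbuzz API is shown below; the endpoint and response fields reflect my understanding of the public API and should be checked against current Paperbuzz documentation rather than taken as the plugin's own code.

```python
# Sketch of querying the Paperbuzz API for a DOI's open altmetric events.
# The endpoint and field names (altmetrics_sources, source_id, events_count)
# are assumptions about the public API, not code from the OJS plugin.
import requests

def paperbuzz_event_counts(doi: str) -> dict:
    resp = requests.get(f"https://api.paperbuzz.org/v0/doi/{doi}", timeout=30)
    resp.raise_for_status()
    data = resp.json()
    return {
        source.get("source_id", "unknown"): source.get("events_count", 0)
        for source in data.get("altmetrics_sources", [])
    }
```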

Deal or No Deal | Periodicals Price Survey 2019

“Pressure increases on publishers to move more quickly to open access, but this leaves many questions unanswered

For the past decade, libraries have battled declining university budgets and increasing serials expenditures. With each Big Deal package renewal or cancellation, librarians and publishers have asked themselves: Did I make the best deal? Did I make the right deal? Recent developments in open access (OA) promise to bring major reform to academic publishing and, with that, new challenges and opportunities to the way that librarians and publishers choose to deal….”