Set citation data free

“However, most poll respondents felt that citation-based indicators are useful, but that they should be deployed in more nuanced and open ways. The most popular responses to the poll were that citation-based indicators should be tweaked to exclude self-citations, or that self-citation rates should be reported alongside other metrics (see ‘The numbers game’). On the whole, respondents wanted to be able to judge for themselves when self-citations might be appropriate, and when not; to be able to compare self-citation across fields; and more….

But this is where there is a real problem, because for many papers citation data are locked inside proprietary databases. Since 2000, more and more publishers have been depositing information about research-paper references with an organization called Crossref, the non-profit agency that registers digital object identifiers (DOIs), the strings of characters that identify papers on the web. But not all publishers allow their reference lists to be made open for anyone to download and analyse — only 59% of the almost 48 million articles deposited with Crossref currently have open references.
There is, however, a solution. Two years ago, the Initiative for Open Citations (I4OC) was established for the purpose of promoting open scholarly citation data. As of 1 September, more than 1,000 publishers were members, including Sage Publishing, Taylor and Francis, Wiley and Springer Nature — which joined last year. Publishers still to join I4OC include the American Chemical Society, Elsevier — the largest not to do so — and the IEEE….”
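Whether a given paper’s references are open can be checked against Crossref’s public REST API: a work whose publisher has opened its references carries a `reference` array in its metadata record. A minimal sketch (the helper names are illustrative, and the exact response fields should be verified against the current Crossref API documentation):

```python
import json
from urllib.request import urlopen

CROSSREF_WORK_URL = "https://api.crossref.org/works/{doi}"  # public REST API

def has_open_references(work_message: dict) -> bool:
    """True if a Crossref work record exposes its reference list.

    Works with open references carry a 'reference' array in their
    metadata; closed-reference works omit it (a reference count may
    still be reported either way).
    """
    return bool(work_message.get("reference"))

def fetch_work(doi: str) -> dict:
    """Fetch a work's metadata record from the Crossref REST API."""
    with urlopen(CROSSREF_WORK_URL.format(doi=doi)) as response:
        return json.load(response)["message"]

# usage (network required), with a placeholder DOI:
#   has_open_references(fetch_work("10.5555/12345678"))
```

Running the predicate over a sample of DOIs is essentially how the 59% open-references figure can be reproduced in aggregate.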

[1909.01476] How much research shared on Facebook is hidden from public view? A comparison of public and private online activity around PLOS ONE papers

Abstract:  Despite its undisputed position as the biggest social media platform, Facebook has never entered the main stage of altmetrics research. In this study, we argue that this lack of attention is not due to a lack of relevant activity on the platform, but because the challenges of collecting Facebook data have restricted researchers to activity that takes place in a select group of public pages and groups. We present a new method of collecting shares, reactions, and comments across the platform, including private timelines, and use it to gather data for all articles published between 2015 and 2017 in the journal PLOS ONE. We compare the gathered data with altmetrics collected and aggregated by Altmetric. The results show that 58.7% of the sharing of these papers on the platform happens outside of public view and that, when all shares are counted, the volume of activity approximates patterns of engagement previously observed only for Twitter. Both results suggest that the role and impact of Facebook as a medium for science and scholarly communication have been underestimated. Furthermore, they emphasise the importance of openness and transparency around the collection and aggregation of altmetrics.
Digital Science and the International Society for Scientometrics and Informetrics join forces to provide ISSI members with free access to Dimensions and Altmetric data  – Digital Science

“Digital Science, a leader in scholarly technology, is pleased to announce a collaboration with the International Society for Scientometrics and Informetrics (ISSI) that will give ISSI members enhanced access to Dimensions and Altmetric data for scientometric research.

ISSI is an international association of scholars and professionals active in the interdisciplinary study of the science of science, science communication, and science policy. The ISSI community advances the boundaries of quantitative science studies, from theoretical, empirical, and practical perspectives.

Starting on 1 October 2019, ISSI members will formally be invited to apply for no-cost access to Altmetric and Dimensions web tools and APIs. A committee of ISSI members will provide expert assessment of researchers’ applications and guidance on using Altmetric and Dimensions in their research.

This partnership builds upon Altmetric and Dimensions’ existing no-cost data sharing programs, which are currently open to all researchers conducting non-commercial scientometric research, while providing ISSI members with additional expert advice on early-stage research….”

Releasing a preprint is associated with more attention and citations | bioRxiv

Abstract:  Preprints in the life sciences are gaining popularity, but release of a preprint still precedes only a fraction of peer-reviewed publications. Quantitative evidence on the relationship between preprints and article-level metrics of peer-reviewed research remains limited. We examined whether having a preprint on bioRxiv.org was associated with the Altmetric Attention Score and number of citations of the corresponding peer-reviewed article. We integrated data from PubMed, CrossRef, Altmetric, and Rxivist (a collection of bioRxiv metadata). For each of 26 journals (comprising a total of 46,451 articles and 3,817 preprints), we used log-linear regression, adjusted for publication date and scientific subfield, to estimate fold-changes of Attention Score and citations between articles with and without a preprint. We also performed meta-regression of the fold-changes on journal-level characteristics. By random effects meta-analysis across journals, releasing a preprint was associated with a 1.53-fold higher value of (Attention Score + 1) (95% CI 1.42 to 1.65) and a 1.31-fold higher value of (citations + 1) (95% CI 1.24 to 1.38) for the peer-reviewed article. Journals with larger fold-changes of Attention Score tended to have lower impact factors and lower percentages of articles released as preprints. In contrast, a journal’s fold-change of citations was not associated with impact factor, percentage of articles released as preprints, or access model. The findings from this observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
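The “+ 1” in the reported fold-changes comes from modelling log(metric + 1). Ignoring the paper’s adjustments for publication date and subfield, an unadjusted two-group fold-change can be sketched as follows (function name and sample numbers are illustrative, not from the study):

```python
import math

def fold_change(metric_with_preprint, metric_without_preprint):
    """Unadjusted fold-change of (metric + 1) between two groups of articles.

    Exponentiating the difference of group means of log(metric + 1) equals
    the exponentiated group-indicator coefficient of an OLS fit on the
    indicator alone; the published analysis additionally adjusts for
    publication date and scientific subfield.
    """
    def mean_log1p(values):
        return sum(math.log(v + 1) for v in values) / len(values)

    return math.exp(mean_log1p(metric_with_preprint)
                    - mean_log1p(metric_without_preprint))

# toy data: citation counts for articles with and without a preprint
print(round(fold_change([3, 3, 3], [1, 1, 1]), 2))  # 2.0
```

A fold-change of 1.31 on this scale therefore means the preprint group’s geometric mean of (citations + 1) is 31% higher, not that every article gains 31% more citations.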

Proposal for a Standard Article Metrics Dashboard to Replace the Journal Impact Factor

Abstract:  This paper proposes the creation of a dashboard consisting of five metrics that could be used to replace the journal impact factor. It should be especially useful in circumstances, such as promotion and tenure committees, where the evaluators do not share the authors’ subject expertise and where they are working under time constraints.

Share or perish: Social media and the International Journal of Mental Health Nursing – McNamara – – International Journal of Mental Health Nursing – Wiley Online Library

Abstract:  The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information. The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research. The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles. To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor start date and the eighteen months after that date (i.e. from 1 July 2015 to 30 June 2018) were acquired and analysed. Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes. This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.

Journal impact factor: a bumpy ride in an open space | Journal of Investigative Medicine

Abstract:  The journal impact factor (IF) is the leading method of scholarly assessment in today’s research world. An important question is whether or not this is still a constructive method. For a specific journal, the IF is the number of citations of publications from the previous 2 years divided by the total number of citable publications in those years (the citation window). Although this simplicity works to the method’s advantage, complications arise when answers to questions such as ‘What is included in the citation window?’ or ‘What makes a good journal impact factor?’ contain ambiguity. In this review, we discuss whether or not the IF should still be considered the gold standard of scholarly assessment in view of the many recent changes and the emergence of new publication models, and we outline its advantages and disadvantages. The advantages of the IF include promoting the author while giving readers a visualization of the magnitude of review. On the other hand, its disadvantages include reflecting the journal’s quality more than the author’s work, the fact that it cannot be compared across different research disciplines, and the struggles it faces in the world of open access. Recently, alternatives to the IF have emerged, such as the SCImago Journal & Country Rank, the Source Normalized Impact per Paper and the Eigenfactor Score, among others. However, all alternatives proposed thus far have limitations of their own. In conclusion, although the IF has its drawbacks, until better alternatives are proposed it remains one of the most effective methods for assessing scholarly activity.
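The definition quoted above reduces to a single ratio: for year Y, citations received in Y to items published in Y-1 and Y-2, divided by the citable items published in those two years. A toy illustration with made-up numbers:

```python
def impact_factor(citations_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Journal impact factor for a given year: citations received this year
    to items from the previous two years, divided by the number of citable
    items published in those two years (the 'citation window')."""
    return citations_prev_two_years / citable_items_prev_two_years

# e.g. 600 citations in 2019 to 150 citable items from 2017-2018
print(impact_factor(600, 150))  # 4.0
```

The ambiguity the review highlights lives in the denominator: which item types count as “citable” is an editorial judgement, not part of the formula.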

The effect of bioRxiv preprints on citations and altmetrics | bioRxiv

Abstract:  A potential motivation for scientists to deposit their scientific work as preprints is to enhance its citation or social impact, an effect which has been empirically observed for preprints in physics, astronomy and mathematics deposited to arXiv. In this study we assessed the citation and altmetric advantage of bioRxiv, a preprint server for the biological sciences. We retrieved metadata of all bioRxiv preprints deposited between November 2013 and December 2017, and matched them to articles that were subsequently published in peer-reviewed journals. Citation data from Scopus and altmetric data from Altmetric.com were used to compare citation and online sharing behaviour of bioRxiv preprints, their related journal articles, and non-deposited articles published in the same journals. We found that bioRxiv-deposited journal articles received a sizeable citation and altmetric advantage over non-deposited articles. Regression analysis reveals that this advantage is not explained by multiple explanatory variables related to the article and its authorship. bioRxiv preprints themselves are being directly cited in journal articles, regardless of whether the preprint has been subsequently published in a journal. bioRxiv preprints are also shared widely on Twitter and in blogs, but remain relatively scarce in mainstream media and Wikipedia articles, in comparison to peer-reviewed journal articles.

The European University Association and Science Europe Join Efforts to Improve Scholarly Research Assessment Methodologies

“Evaluating research and assessing researchers is fundamental to the research enterprise and core to the activities of research funders and research performing organisations, as well as universities. The European University Association (EUA) and Science Europe are committed to building a strong dialogue between their members, who share the responsibility of developing and implementing more accurate, open, transparent and responsible approaches that better reflect the evolution of research activity in the digital era.

Today, the outcomes of scholarly research are often measured through methods based on quantitative, albeit approximate, indicators such as the journal impact factor. There is a need to move away from reductionist ways of assessing research, as well as to establish systems that better assess research potential. Universities, research funders and research performing organisations are well-placed to explore new and improved research assessment approaches, while also being indispensable in turning these innovations into systemic reforms….”