Do articles in open access journals have more frequent altmetric activity than articles in subscription-based journals? An investigation of the research output of Finnish universities | SpringerLink

Abstract: Scientific articles available in Open Access (OA) have been found to attract more citations and online attention, to the extent that it has become common to speak of an OA Altmetrics Advantage. This research investigates how the OA Altmetrics Advantage holds for a specific case of research articles, namely the research outputs from universities in Finland. Furthermore, this research examines disciplinary and platform-specific differences in that (dis)advantage. The new methodological approaches developed in this research focus on relative visibility, i.e. how often articles in OA journals receive at least one mention on the investigated online platforms, and relative receptivity, i.e. how frequently articles in OA journals gain mentions in comparison to articles in subscription-based journals. The results show significant disciplinary and platform-specific differences in the OA advantage, with articles in OA journals in, for instance, veterinary sciences, social and economic geography and psychology receiving more citations and attention on social media platforms, while the opposite was found for articles in OA journals in medicine and health sciences. The results strongly support field- and platform-specific considerations when assessing the influence of journal OA status on altmetrics. The new methodological approaches used in this research will serve future comparative research into the OA advantage of scientific articles over time and between countries.
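Both indicators reduce to simple ratios. The abstract does not spell out the exact formulas, so the sketch below is one plausible reading; the function and variable names are illustrative, not taken from the paper.

```python
# Sketch: relative visibility and relative receptivity for one platform.
# "Visibility": share of articles with at least one mention.
# "Receptivity": mean number of mentions per article.
# The relative values compare articles in OA journals with articles in
# subscription-based journals; values above 1 indicate an OA advantage.

def visibility(mention_counts: list[int]) -> float:
    return sum(1 for m in mention_counts if m > 0) / len(mention_counts)

def receptivity(mention_counts: list[int]) -> float:
    return sum(mention_counts) / len(mention_counts)

def relative_indicators(oa_mentions: list[int], subscription_mentions: list[int]) -> dict:
    return {
        "relative_visibility": visibility(oa_mentions) / visibility(subscription_mentions),
        "relative_receptivity": receptivity(oa_mentions) / receptivity(subscription_mentions),
    }

# Example with made-up Twitter mention counts per article:
print(relative_indicators(oa_mentions=[0, 3, 1, 5], subscription_mentions=[0, 0, 2, 1]))
```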

Dutch universities and research funders move away from the impact factor – ScienceGuide

“By the end of 2019, all parties involved in this project pledge to have signed DORA. This commitment has to be more than an empty gesture. For example, norms such as ‘four publications to obtain a PhD’ will be abolished, and NWO and ZonMw will no longer inquire about h-index or journal impact factor when academics submit grant applications. Instead of asking for a publication list and CV, they will ask for a more ‘narrative’ approach – inquiring about why this research is important, and why the applicant is the right person to carry it out.

This change will be fast, but not instant. The parties involved acknowledge that change takes time. Considering that the focus on metrics such as impact factors took decades to become part of established practice, unlearning these routines will require a considerable amount of time, energy and perseverance. Correctly identifying diverse forms of talent and ‘good research’ will be a learning experience: “To accelerate the desired cultural change in recognition and rewards, we at NWO and ZonMW will strongly focus on training and instruction for our grant evaluation committees.” …”

Tracing the path from social attention to scientific impact | Cardiovascular Research | Oxford Academic

“The ‘citation’ has long served as the bedrock out of which the scientific impact of a published work is carved. This, too, is the basis of common metrics such as the impact factor (IF) and h-index. However, these traditional metrics have been criticized over time for being opaque, difficult to compare across fields, easily manipulatable and ultimately confined to assessing impact within academic circles (rather than educational, or in the general public).1 To confuse matters further, different article databases often report different citation counts for papers depending on their level of coverage.2 For scientists, the ‘publish or perish’ attitude in contemporary science has also led to a fundamental shift in behaviour to emphasize increased output. This includes, for example, splitting papers into multiple, smaller publications. Given that the rate of scientific output has doubled approximately every 9 years, it has become increasingly difficult to know exactly what is worth paying attention to in the scientific milieu. Similarly, the considerable time for citations to accumulate hinders an immediate assessment of a work’s impact. These factors are among those which provided the impetus behind the development of Altmetrics: an alternative metric capitalizing on the advent of the internet and social media, where over a third of scholars are on Twitter, to allow a glimpse into a scientific work’s reach at an earlier stage than citations would typically allow.3

Altmetric Attention Scores (AAS) are largely qualitative scores calculated based on weighted variables associated with an article, such as the number of referencing blog posts and Wikipedia articles, Twitter mentions and peer-reviews on Faculty of 1000.4 The different score weightings applied to each variable ostensibly reflect their relative social reach, with news posts and Twitter mentions contributing 8 and 1 points towards the overall AAS, respectively. It gets slightly more complex with Twitter, where scores are applied based on the nature of the Tweeter: a publisher’s account should be valued less than an unconnected individual’s account with a sizeable following. Fundamental to the use of AAS is that they are indicators of attention rather than quality. For this reason, Altmetrics can be seen as complementary to traditional bibliometrics. Where the score is particularly useful is in helping to discover papers, evaluate scientific impact in a broader context and as a tool for examining trends in disciplines over time….”
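As a rough illustration of how such a weighted score comes together (only the news = 8 and tweet = 1 weights come from the excerpt above; the remaining weights and the tweeter modifier are placeholder assumptions, not Altmetric's actual values):

```python
# Illustrative (not official) recreation of a weighted attention score.
# Only the news (8) and tweet (1) base weights come from the text above;
# everything else, including the tweeter modifier, is a placeholder.

BASE_WEIGHTS = {
    "news": 8.0,       # from the excerpt
    "tweet": 1.0,      # from the excerpt
    "blog": 5.0,       # assumed for illustration
    "wikipedia": 3.0,  # assumed for illustration
}

def tweet_modifier(is_publisher_account: bool, follower_count: int) -> float:
    """Down-weight publisher accounts, up-weight well-followed individuals (assumed rule)."""
    if is_publisher_account:
        return 0.5
    return 1.25 if follower_count >= 1000 else 1.0

def attention_score(mentions: list[dict]) -> float:
    """Sum weighted mentions, e.g. {"source": "tweet", "follower_count": 5000}."""
    score = 0.0
    for mention in mentions:
        weight = BASE_WEIGHTS.get(mention["source"], 0.0)
        if mention["source"] == "tweet":
            weight *= tweet_modifier(mention.get("is_publisher_account", False),
                                     mention.get("follower_count", 0))
        score += weight
    return score

print(attention_score([
    {"source": "news"},
    {"source": "tweet", "is_publisher_account": True, "follower_count": 200000},
    {"source": "tweet", "follower_count": 5000},
]))  # 8 + 0.5 + 1.25 = 9.75
```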

Journal practices (other than OA) promoting Open Science goals | Zenodo

“Journal practices (other than OA) promoting Open Science goals (relevance, reproducibility, efficiency, transparency)

Early, full and reproducible content

preregistration – use preregistrations in the review process
registered reports – apply peer review to preregistration prior to the study and publish results regardless of outcomes
preprint policy – liberally allow preprinting in any archive without license restrictions
data/code availability – foster or require open availability of data and code for reviewers and readers
TDM allowance – allow unrestricted TDM of full text and metadata for any use
null/negative results – publish regardless of outcome

Machine readable ecosystem

data/code citation – promote citation and use standards
persistent IDs – e.g. DOI, ORCID, ROR, Open Funder Registry, grant IDs
licenses (in Crossref) – register (open) licenses in Crossref
contributorship roles – credit all contributors for their part in the work
open citations – make citation information openly available via Crossref

Peer review

open peer review – e.g. open reports and open identities
peer review criteria – evaluate methodological rigour and reporting quality only or also judge expected relevance or impact?
rejection rates – publish rejection rates and reconsider high selectivity
post-publication peer review – publish immediately after sanity check and let peer review follow that?

Diversity

author diversity – age, position, gender, geography, ethnicity, colour
reviewer diversity – age, position, gender, geography, ethnicity, colour
editor diversity – age, position, gender, geography, ethnicity, colour

Metrics and DORA

DORA: journal metrics – refrain from promoting
DORA: article metrics – provide a range and use responsibly…”
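Several of the "machine readable ecosystem" practices listed above (persistent IDs, licenses registered in Crossref, open citations) can be checked for any individual article via the public Crossref REST API. A minimal sketch, assuming the article has a Crossref DOI; treating each metadata field as a proxy for the corresponding practice is my reading, not part of the Zenodo list:

```python
# Sketch: check how "machine readable" one article's Crossref record is.
# Uses the public Crossref REST API (api.crossref.org); no key required.
import requests

def crossref_openness(doi: str) -> dict:
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    record = resp.json()["message"]
    return {
        "license_registered": bool(record.get("license")),
        "orcids_present": any("ORCID" in author for author in record.get("author", [])),
        "funder_ids_present": any("DOI" in funder for funder in record.get("funder", [])),
        "open_references": "reference" in record,  # reference list exposed via the public API
        "deposited_reference_count": record.get("reference-count", 0),
    }

print(crossref_openness("10.xxxx/your-doi"))  # substitute a real DOI
```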

Set citation data free

“However, most poll respondents felt that citation-based indicators are useful, but that they should be deployed in more nuanced and open ways. The most popular responses to the poll were that citation-based indicators should be tweaked to exclude self-citations, or that self-citation rates should be reported alongside other metrics (see ‘The numbers game’). On the whole, respondents wanted to be able to judge for themselves when self-citations might be appropriate, and when not; to be able to compare self-citation across fields; and more….

But this is where there is a real problem, because for many papers citation data are locked inside proprietary databases. Since 2000, more and more publishers have been depositing information about research-paper references with an organization called Crossref, the non-profit agency that registers digital object identifiers (DOIs), the strings of characters that identify papers on the web. But not all publishers allow their reference lists to be made open for anyone to download and analyse — only 59% of the almost 48 million articles deposited with Crossref currently have open references.


There is, however, a solution. Two years ago, the Initiative for Open Citations (I4OC) was established for the purpose of promoting open scholarly citation data. As of 1 September, more than 1,000 publishers were members, including Sage Publishing, Taylor and Francis, Wiley and Springer Nature — which joined last year. Publishers still to join I4OC include the American Chemical Society, Elsevier — the largest not to do so — and the IEEE….”
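The 59% figure quoted above can be re-estimated at any time from the public Crossref REST API, which only returns reference lists that have been made openly available. A rough sketch (the filter names are Crossref's; equating "references visible via the public API" with "open references" is my simplification):

```python
# Sketch: estimate the share of Crossref-registered journal articles whose
# reference lists are retrievable (i.e. open) via the public REST API.
import requests

API = "https://api.crossref.org/works"

def total_results(filters: str) -> int:
    # rows=0 returns only the result count, no records
    resp = requests.get(API, params={"filter": filters, "rows": 0}, timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]["total-results"]

articles = total_results("type:journal-article")
with_open_refs = total_results("type:journal-article,has-references:true")
print(f"{with_open_refs / articles:.1%} of {articles:,} journal articles have open references")
```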

[1909.01476] How much research shared on Facebook is hidden from public view? A comparison of public and private online activity around PLOS ONE papers

Abstract: Despite its undisputed position as the biggest social media platform, Facebook has never entered the main stage of altmetrics research. In this study, we argue that the lack of attention by altmetrics researchers is not due to a lack of relevant activity on the platform, but because the challenges of collecting Facebook data have limited analyses to activity that takes place in a select group of public pages and groups. We present a new method of collecting shares, reactions, and comments across the platform, including private timelines, and use it to gather data for all articles published between 2015 and 2017 in the journal PLOS ONE. We compare the gathered data with altmetrics collected and aggregated by Altmetric. The results show that 58.7% of papers shared on the platform are shared outside of public view and that, when collecting all shares, the volume of activity approximates patterns of engagement previously only observed for Twitter. Both results suggest that the role and impact of Facebook as a medium for science and scholarly communication has been underestimated. Furthermore, they emphasise the importance of openness and transparency around the collection and aggregation of altmetrics.
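The abstract does not describe the collection pipeline itself; as a rough sketch of the kind of platform-wide signal involved, Facebook's Graph API exposes aggregate engagement counts (shares, reactions, comments) for a URL object, which could then be set against the public mentions tracked by Altmetric. The engagement field is the Graph API's own; the API version, token handling and example URL are illustrative assumptions, and access is subject to Facebook's current policies:

```python
# Sketch: query platform-wide engagement counts for an article URL via the
# Facebook Graph API URL node ("engagement" field). Requires an access token.
import requests

GRAPH = "https://graph.facebook.com/v3.2"  # version shown here is illustrative

def url_engagement(article_url: str, access_token: str) -> dict:
    resp = requests.get(
        GRAPH,
        params={"id": article_url, "fields": "engagement", "access_token": access_token},
        timeout=30,
    )
    resp.raise_for_status()
    # Typical fields: share_count, reaction_count, comment_count, comment_plugin_count
    return resp.json().get("engagement", {})

# counts = url_engagement("https://doi.org/<article-doi>", access_token="<token>")
```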


Digital Science and the International Society for Scientometrics and Informetrics join forces to provide ISSI members with free access to Dimensions and Altmetric data  – Digital Science

“Digital Science, a leader in scholarly technology, is pleased to announce a collaboration with the International Society for Scientometrics and Informetrics (ISSI) that will give ISSI members enhanced access to Dimensions and Altmetric data for scientometric research.

ISSI is an international association of scholars and professionals active in the interdisciplinary study of the science of science, science communication, and science policy. The ISSI community advances the boundaries of quantitative science studies, from theoretical, empirical, and practical perspectives.

Starting on October 1 2019, ISSI members will formally be invited to apply for no-cost access to Altmetric and Dimensions web tools and APIs. A committee of ISSI members will provide expert assessment of researchers’ applications and guidance on using Altmetric and Dimensions in their research.

This partnership builds upon Altmetric and Dimensions’ existing no-cost data sharing programs, which are currently open to all researchers conducting non-commercial scientometric research, while providing ISSI members with additional expert advice on early-stage research….”

Releasing a preprint is associated with more attention and citations | bioRxiv

Abstract: Preprints in the life sciences are gaining popularity, but release of a preprint still precedes only a fraction of peer-reviewed publications. Quantitative evidence on the relationship between preprints and article-level metrics of peer-reviewed research remains limited. We examined whether having a preprint on bioRxiv.org was associated with the Altmetric Attention Score and number of citations of the corresponding peer-reviewed article. We integrated data from PubMed, CrossRef, Altmetric, and Rxivist (a collection of bioRxiv metadata). For each of 26 journals (comprising a total of 46,451 articles and 3,817 preprints), we used log-linear regression, adjusted for publication date and scientific subfield, to estimate fold-changes of Attention Score and citations between articles with and without a preprint. We also performed meta-regression of the fold-changes on journal-level characteristics. By random effects meta-analysis across journals, releasing a preprint was associated with a 1.53 times higher (Attention Score + 1) (95% CI 1.42 to 1.65) and 1.31 times higher (citations + 1) (95% CI 1.24 to 1.38) for the peer-reviewed article. Journals with larger fold-changes of Attention Score tended to have lower impact factors and lower percentages of articles released as preprints. In contrast, a journal’s fold-change of citations was not associated with impact factor, percentage of articles released as preprints, or access model. The findings from this observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
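A minimal sketch of the kind of model described, for a single journal (this is not the authors' code; the column names, the toy data, and the use of statsmodels are assumptions, and the subfield adjustment is omitted for brevity):

```python
# Sketch: log-linear regression of (citations + 1) on preprint status,
# adjusted for publication year, within one journal.
# exp(coefficient) of has_preprint is the fold-change of (citations + 1).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical article-level table for one journal.
df = pd.DataFrame({
    "citations":    [4, 12, 0, 7, 25, 3, 9, 1],
    "has_preprint": [0, 1, 0, 1, 1, 0, 1, 0],
    "pub_year":     [2016, 2016, 2017, 2017, 2018, 2018, 2016, 2018],
})
df["log_citations_plus1"] = np.log(df["citations"] + 1)

fit = smf.ols("log_citations_plus1 ~ has_preprint + C(pub_year)", data=df).fit()
fold_change = np.exp(fit.params["has_preprint"])
ci_low, ci_high = np.exp(fit.conf_int().loc["has_preprint"])
print(f"fold-change of (citations + 1): {fold_change:.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f})")
```

Per-journal fold-changes like this can then be pooled with a random effects meta-analysis, as the abstract describes.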

Proposal for a Standard Article Metrics Dashboard to Replace the Journal Impact Factor

Abstract: This paper proposes the creation of a dashboard consisting of five metrics that could be used to replace the journal impact factor. It should be especially useful in circumstances, like promotion and tenure committees, where the evaluators do not share the authors’ subject expertise and where they are working under time constraints.