The Most Widely Disseminated COVID-19-Related Scientific Publications in Online Media: A Bibliometric Analysis of the Top 100 Articles with the Highest Altmetric Attention Scores

Abstract:  The novel coronavirus disease 2019 (COVID-19) is a global pandemic. This study aimed to identify and characterize the top 100 COVID-19-related scientific publications that had received the highest Altmetric Attention Scores (AASs). We searched Altmetric Explorer using search terms such as “COVID”, “COVID-19”, “Coronavirus”, “SARS-CoV-2”, and “nCoV”, and then selected the top 100 articles with the highest AASs. For each article identified, we extracted the following information: the overall AAS, publishing journal, journal impact factor (IF), date of publication, language, country of origin, document type, main topic, and accessibility. The top 100 articles were most frequently published in journals with a high (>10.0) IF (n = 67), published between March and July 2020 (n = 67), written in English (n = 100), originated in the United States (n = 45), were original articles (n = 59), dealt with treatment and clinical manifestations (n = 33), and had open access (n = 98). Our study provides important information pertaining to the dissemination of scientific knowledge about COVID-19 in online media.

 

What Is the Price of Science? | mBio

Abstract:  The peer-reviewed scientific literature is the bedrock of science. However, scientific publishing is undergoing dramatic changes, which include the expansion of open access, an increased number of for-profit publication houses, and ready availability of preprint manuscripts that have not been peer reviewed. In this opinion article, we discuss the inequities and concerns that these changes have wrought.

 

Rethinking Research Assessment: Ideas for Action | DORA

“DORA is developing a toolkit of resources to help academic institutions improve their policies and practices. So far, it includes two briefing documents that offer principles to guide institutional change and strategies to address the infrastructural implications of common cognitive biases to increase equity.

Ideas for Action outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices….”

Reimagining Academic Career Assessment: Stories of innovation and change

“This report and the accompanying online repository bring together case studies in responsible academic career assessment. Gathered by the San Francisco Declaration on Research Assessment (DORA), the European University Association (EUA), and the Scholarly Publishing and Academic Resources Coalition (SPARC) Europe, the case studies independently serve as a source of inspiration for institutions looking to improve their academic career assessment practices. Following the publication of guidelines and recommendations on more responsible evaluation approaches, such as DORA, the Leiden Manifesto for Research Metrics, and the Metric Tide, more and more institutions have begun to consider how to implement a range of practical changes and innovations in recent years. However, information about the creation and development of new practices in academic career assessment is not always easy to find. Collectively, the case studies will further facilitate this “practical turn” toward implementation by providing a structured overview and conceptual clarity on key characteristics and contextual factors. In doing so, the report examines emerging pathways of institutional reform of academic career assessment…”

Open access in Europe: a national and regional comparison | SpringerLink

Maddi, A., Lardreau, E. & Sapinho, D. Open access in Europe: a national and regional comparison. Scientometrics (2021). https://doi.org/10.1007/s11192-021-03887-1

Abstract:

Open access to scientific publications has progressively become a key issue for European policy makers, resulting in concrete measures by the different member countries to promote its development. The aim of this paper is, after providing a quick overview of OA policies in Europe, to carry out a comparative study of OA practices within European countries, using data from the Web of Science (WoS) database. This analysis is based on two indicators: the OA share, which illustrates the evolution over time, and the normalized OA indicator (NOAI), which allows spatial comparisons by taking into account the disciplinary structures of countries. Results show a general trend towards the development of OA over time, as expected, but with large disparities between countries, depending on how early they began taking measures in favor of OA. While it is possible to stress the importance of policy and its influence on open access at the country level, this does not appear to be the case at the regional level: there is not much variability between regions within the same country in terms of open access indicators.
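The abstract distinguishes a raw OA share from a normalized indicator that corrects for a country's disciplinary mix. The exact NOAI formula is not given here, so the sketch below is only a plausible illustration of the standard field-normalization pattern (observed share divided by the share expected from the country's field composition); the function names and data shapes are hypothetical.

```python
# Hedged sketch: OA share and a field-normalized OA indicator.
# This is NOT the authors' exact NOAI; it illustrates the common
# observed-vs-expected normalization used in scientometrics.

def oa_share(publications):
    """Fraction of a country's publications flagged as open access."""
    if not publications:
        return 0.0
    return sum(1 for p in publications if p["oa"]) / len(publications)

def normalized_oa(publications, world_oa_share_by_field):
    """Observed OA share divided by the OA share expected from the
    country's disciplinary mix (1.0 = exactly as expected)."""
    if not publications:
        return 0.0
    expected = sum(world_oa_share_by_field[p["field"]]
                   for p in publications) / len(publications)
    return oa_share(publications) / expected if expected else 0.0

# Toy country: physics-heavy fields publish OA more often worldwide,
# so a high raw share may still be unremarkable once normalized.
pubs = [
    {"field": "physics", "oa": True},
    {"field": "physics", "oa": True},
    {"field": "history", "oa": False},
    {"field": "history", "oa": True},
]
world = {"physics": 0.80, "history": 0.40}

print(oa_share(pubs))              # 0.75
print(normalized_oa(pubs, world))  # 0.75 / 0.60 = 1.25
```

A value above 1.0 means the country's OA uptake exceeds what its disciplinary profile alone would predict, which is the kind of comparison the abstract says NOAI enables across countries.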

ACS Publications signs DORA – ACS Axial

“Effective February 2021, ACS Publications has signed the Declaration on Research Assessment (DORA). This demonstrates our commitment as a publisher and professional organization to support broader assessment of research output.

DORA recognizes the need to improve the ways in which the outputs of scholarly research are evaluated. The declaration was developed in 2012 during the Annual Meeting of the American Society for Cell Biology in San Francisco. It has become a worldwide initiative covering all scholarly disciplines and all key stakeholders including funders, publishers, professional societies, institutions, and researchers. DORA’s vision is to advance practical and robust approaches to research assessment globally and across all scholarly disciplines….”

Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics | Journal of the Association for Information Science and Technology

Abstract:  The amount of annually published scholarly articles is growing steadily, as is the number of indicators through which the impact of publications is measured. Little is known about how the increasing variety of available metrics affects researchers’ processes of selecting literature to read. We conducted ranking experiments embedded in an online survey with 247 participating researchers, most from the social sciences. Participants completed a series of tasks in which they were asked to rank fictitious publications by their expected relevance, based on their scores on six prototypical metrics. By applying logistic regression, cluster analysis, and manual coding of survey answers, we obtained detailed data on how prominent metrics for research impact influence our participants’ decisions about which scientific articles to read. Survey answers revealed a combination of qualitative and quantitative characteristics that researchers consult when selecting literature, while the regression analysis showed that, among quantitative metrics, citation counts tend to be of highest concern, followed by Journal Impact Factors. Our results suggest a comparatively favorable view of many researchers on bibliometrics and widespread skepticism toward altmetrics. The findings underline the importance of equipping researchers with solid knowledge about specific metrics’ limitations, as these metrics seem to play significant roles in researchers’ everyday relevance assessments.

 

Responsible Metrics Implementation Officer

“We are looking for an enthusiastic and engaging colleague to lead the implementation of a project to embed the principles of the Declaration on Research Assessment in the university’s practice. An implementation plan is in place, and you will be involved in designing and delivering relevant training, liaising closely with academic faculties (both academic and professional services colleagues), and establishing systems to monitor the progress of the responsible metrics policy’s rollout. Based in the Library’s Open Research team, you will be comfortable engaging with academic and professional services colleagues at all levels in the university structure.”