You Want to See My Data? I Thought We Were Friends!

“Stuart Ritchie is a Lecturer in the Social, Genetic and Developmental Psychiatry Centre at King’s College London. His new book, Science Fictions: How Fraud, Bias, Negligence and Hype Undermine the Search for Truth, explains the ideas in this comic, by Zach Weinersmith, in more detail, telling shocking stories of scientific error and misconduct. It also proposes an abundance of ideas for how to rescue science from its current malaise….”

Guidance for research organisations on how to implement responsible and fair approaches for research assessment | Wellcome

“Our open access policy 2021 requires Wellcome-funded organisations to publicly commit to:

assessing research outputs and other research contributions based on their intrinsic merit 
discouraging the inappropriate use of proxies or metrics – such as the title or impact factor of the journal in which the work was published.

We believe that research assessment processes used by research organisations and funders in making recruitment, promotion and funding decisions should embody two core principles (‘the principles’) as set out in the San Francisco Declaration on Research Assessment (DORA):

be explicit about the criteria used to evaluate scientific productivity, and clearly highlight that the scientific content of a paper is more important than publication metrics or the identity of the journal in which it is published
recognise the value of all relevant research outputs (for example publications, datasets and software), as well as other types of contributions, such as training early-career researchers and influencing policy and practice….”

Faculty are concerned about research assessment in the wake of COVID-19 – DORA

“Starting in mid-February, research that needed to be conducted in a laboratory or another setting on campus was dramatically scaled back or, more likely, stopped completely. This disruption has drawn attention to long-standing challenges in academia, including the ways that researchers are assessed for hiring, promotion, and tenure. Figuring out how to put the academic workforce on a better footing following the pandemic is a major question that needs to be addressed….”

Position Statement and Recommendations on Research Assessment Processes – Science Europe

“With limited funding and research positions available, there is increasing pressure on research organisations to put processes in place that ensure assessments of research quality are effective, efficient, fair, and transparent. For this reason, research organisations dedicate significant effort and resources towards the assessments they conduct, and continually look for ways to optimise and adapt these processes.

This document presents a set of policy recommendations that can be used as a framework to guide the evaluation of these assessment processes. They were developed following an extensive study performed in 2019 and a comprehensive consultation process, and are intended for both Science Europe Member Organisations and other research organisations.”

Research Assessment Processes – Science Europe

“In 2020, Science Europe created a set of policy recommendations for its Member Organisations and other research organisations. They were developed following an extensive study performed in 2019 and through a comprehensive consultation process. More information about the methodology followed is also available.

The recommendations will help research organisations to review and improve the effectiveness and efficiency of their assessment processes for career progression and funding allocation. They also promote the sharing of knowledge so that organisations can learn from each other, which will enrich and strengthen national and international research systems as a whole….”

Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities | The BMJ

Abstract: Objective To determine the presence of a set of pre-specified traditional and non-traditional criteria used to assess scientists for promotion and tenure in faculties of biomedical sciences among universities worldwide.

Design Cross sectional study.

Setting International sample of universities.

Participants 170 randomly selected universities from the Leiden ranking of world universities list.

Main outcome measure Presence of five traditional (for example, number of publications) and seven non-traditional (for example, data sharing) criteria in guidelines for assessing assistant professors, associate professors, and professors and the granting of tenure in institutions with biomedical faculties.

Results A total of 146 institutions had faculties of biomedical sciences, and 92 had eligible guidelines available for review. Traditional criteria of peer reviewed publications, authorship order, journal impact factor, grant funding, and national or international reputation were mentioned in 95% (n=87), 37% (34), 28% (26), 67% (62), and 48% (44) of the guidelines, respectively. Conversely, among non-traditional criteria, only citations (any mention in 26%; n=24) and accommodations for employment leave (37%; 34) were relatively commonly mentioned. Mention of alternative metrics for sharing research (3%; n=3) and data sharing (1%; 1) was rare, and three criteria (publishing in open access mediums, registering research, and adhering to reporting guidelines) were not found in any guidelines reviewed. Among guidelines for assessing promotion to full professor, traditional criteria were more commonly reported than non-traditional criteria (traditional criteria 54.2%, non-traditional items 9.5%; mean difference 44.8%, 95% confidence interval 39.6% to 50.0%; P=0.001). Notable differences were observed across continents in whether guidelines were accessible (Australia 100% (6/6), North America 97% (28/29), Europe 50% (27/54), Asia 58% (29/50), South America 17% (1/6)), with more subtle differences in the use of specific criteria.

Conclusions This study shows that the evaluation of scientists emphasises traditional criteria as opposed to non-traditional criteria. This may reinforce research practices that are known to be problematic while insufficiently supporting the conduct of better quality research and open science. Institutions should consider incentivising non-traditional criteria.

Open science and the reward system: how can they be aligned?

“How does the current reward system reflect how we value research? Is a reform of this system necessary to encourage researchers to engage in open science activities? How do current and proposed reward systems support early career researchers?

Questions on how academics’ careers and contributions are assessed and valued are under discussion. This webinar brings together a panel of experts on open science and career assessment to focus on the current reward system and the potential for its reform. This promises to be a lively exchange of ideas between representatives of Eurodoc, Young Academy of Europe, Marie Curie Alumni Association, and Elsevier. The aim is to gain a deeper understanding of possible changes to how we consider academic value, retain mobility internationally and beyond academia, and create incentives for open science activities….”

For China’s ambitious research reforms to be successful, they will need to be supported by new research assessment infrastructures | Impact of Social Sciences

“A radical reform of research evaluation and funding in China was recently launched in two prescriptive policy documents published by the Ministry of Science and Technology and the Ministry of Education. China now moves from a strong focus on Web of Science-based indicators towards a more balanced combination of qualitative and quantitative research evaluation, with one third of all publications to be oriented towards domestic journals. Universities are urged to implement the policy locally by the end of July at the latest. How to do it, and the possible consequences, have aroused intense discussion among Chinese academics and gained worldwide attention and debate. 

This change has not come out of the blue. In 2016, President Xi Jinping called for reform towards a more comprehensive evaluation system for individual researchers. Further, in 2018, a document issued by three ministries and two national central institutions specifically proposed moving away from the “Four only” phenomenon of recognising and rewarding “only papers, only titles, only diplomas and only awards”….”

Hiring: Research Communication Officer – Scholarly Communications Lab | ScholCommLab

“One of the key components of academic career advancement is the review, promotion, and tenure (RPT) packets that are prepared by faculty on a regular basis as part of their standard performance evaluations. As such, the RPT process in general, and the guidelines that inform the content of RPT packets, are a natural place to effect change in academia. While this is true of many aspects of academic life, it is especially true of the changes needed for a greater opening of access to research (something both Canada and the US have been pushing for through federal policies and laws). This project examined the RPT process in the US and Canada with the goal of directly informing actions likely to translate into behavioural change and to a greater opening of research.

The project has already yielded several important findings, which have been disseminated through peer reviewed publications:

On the Public Dimensions of Faculty Work;
On the Use of the Journal Impact Factor; and
On the Perceptions of Faculty….”