Research published in pay-and-publish journals won’t count: UGC panel | India News, The Indian Express

“Suggesting sweeping reforms to promote the quality of research in India, a UGC panel has recommended that publication of research material in “predatory” journals or presentations at conferences organised by their publishers should not be considered for academic credit in any form.

Such credit includes selection, confirmation, promotion, appraisal, and the award of scholarships and degrees, the panel has suggested. The committee, which recently submitted its 14-page report to the UGC, has also recommended changes in PhD and MPhil programmes, including a new board for social sciences research….

Last week, the UGC launched the Consortium of Academic and Research Ethics (CARE) to approve a new official list of academic publications….”

Good Practices – Research Institutes – DORA

“DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment.  One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to gather and share existing examples of good practice in research assessment, including approaches to funding and fellowships, hiring and promotion, and awarding prizes, that emphasize research itself and not where it is published. 

If you know of exemplary research assessment methods that could provide inspiration and ideas for research institutes, funders, journals, professional societies, or researchers, please contact DORA….”

To fix research assessment, swap slogans for definitions

“Two years ago, the DORA steering committee hired me to survey practices in research assessment and promote the best ones. Other efforts, such as the Leiden Manifesto and the HuMetricsHSS Initiative, have similar goals.

My view is that most assessment guidelines permit sliding standards: instead of clearly defined terms, they give us feel-good slogans that lack any fixed meaning. Facing the problem will get us much of the way towards a solution.

Broad language increases room for misinterpretation. ‘High impact’ can be code for where research is published. Or it can mean the effect that research has had on its field, or on society locally or globally — often very different things.

Yet confusion is the least of the problems. Descriptors such as ‘world-class’ and ‘excellent’ allow assessors to vary comparisons depending on whose work they are assessing. Academia cannot be a meritocracy if standards change depending on whom we are evaluating. Unconscious bias associated with factors such as a researcher’s gender, ethnic origin and social background helps to perpetuate the status quo. It was only with double-blind review of research proposals that women finally got fair access to the Hubble Space Telescope. Research suggests that using words such as ‘excellence’ in the criteria for grants, awards and promotion can contribute to hypercompetition, in part through the ‘Matthew effect’, in which recognition and resources flow mainly to those who have already received them….”

Knowledge sector takes major step forward in new approach to recognising and rewarding academics

“Academics can excel in many areas, but thus far they have primarily been assessed based on research achievements. From now on, the public knowledge institutions and research funders want to consider academics’ knowledge and expertise more broadly in determining career policy and grant requirements. In doing so, our aim is to ensure that the recognition and rewards system is better suited to the core tasks of the knowledge institutions in the areas of education, research, impact and patient care, and that the appreciation academics receive is better aligned with society’s needs.

A change is urgently needed in the way universities recognise and reward their academic staff. Research achievements have long determined academics’ career paths, and this dominance is becoming increasingly at odds with reality. Education and impact are also crucial to the success of a modern knowledge institution, as is patient care for our university medical centres. New developments relating to Open Access and Open Science are placing different demands on modern-day academics as well. Tackling complex scientific and social issues requires greater collaboration. At the moment, there are still insufficient career prospects for staff who (in addition to doing good research) mainly excel in education….”

Room for everyone’s talent: Toward a new balance in the recognition and reward of academics

Dutch public knowledge institutions and funders call for a modernization of the academic system of recognition and rewards, in particular in five key areas: education, research, impact, leadership and (for university medical centres) patient care. Writing for ScienceGuide, Sicco de Knecht notes that a culture change and national and international cooperation are required to achieve such modernization.

“Many academics feel there is a one-sided emphasis on research performance, frequently leading to the undervaluation of the other key areas such as education, impact, leadership and (for university medical centres) patient care. This puts strain on the ambitions that exist in these areas. The assessment system must be adapted and improved in each of the areas and in the connections between them.”

A look at prediction markets | Research Information

“Assessing the quality of research is difficult. Jisc and the University of Bristol are partnering to develop a tool that may help institutions improve this process.  

To attract government funding for their crucial research, UK universities are largely reliant on good ratings from the Research Excellence Framework (REF) – a process of expert review designed to assess the quality of research outputs. REF scores determine how much government funding will be allocated to their research projects. For instance, research that is world-leading in terms of originality, significance and rigour will be scored higher than research that is only recognised nationally.
 
Universities spend considerable time trying to figure out which research outputs will be rated highest (4*) on quality and impact. The recognised ‘gold standard’ for this process is close reading by a few internal academics, but this is time-consuming, onerous, and subject to the relatively limited perspective of just a few people. …

Prediction markets capture the ‘wisdom of crowds’ by asking large numbers of people to bet on outcomes of future events – in this case how impactful a research project will be in the next REF assessment. It works a bit like the stock market, except that, instead of buying and selling shares in companies, participants buy and sell virtual shares online that will pay out if a particular event occurs – for instance, if a paper receives a 3* or above REF rating.  …”
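The article does not say which market mechanism the Jisc/Bristol tool uses. As a rough illustration of how such a market can work, below is a minimal Python sketch of a logarithmic market scoring rule (LMSR), a standard automated market maker for prediction markets; the class name, the liquidity value and the example REF question are illustrative assumptions, not details from the article.

```python
import math


class LMSRMarket:
    """Minimal LMSR market maker for a binary question, e.g.
    "will this output be rated 3* or above in the next REF?"
    (outcome 0 = YES, outcome 1 = NO). Illustrative sketch only."""

    def __init__(self, liquidity=100.0):
        self.b = liquidity        # higher b = deeper market, slower price moves
        self.shares = [0.0, 0.0]  # outstanding shares per outcome

    def _cost(self, shares):
        # C(q) = b * ln(sum_i exp(q_i / b)); a trade is priced as the
        # difference in this cost function before and after the trade.
        return self.b * math.log(sum(math.exp(q / self.b) for q in shares))

    def price(self, outcome):
        # Instantaneous price of one share, readable as the crowd's
        # current probability estimate for that outcome.
        denom = sum(math.exp(q / self.b) for q in self.shares)
        return math.exp(self.shares[outcome] / self.b) / denom

    def buy(self, outcome, amount):
        # Returns what the trader pays for `amount` shares; each share
        # pays out 1 unit of (virtual) currency if the outcome occurs.
        proposed = list(self.shares)
        proposed[outcome] += amount
        paid = self._cost(proposed) - self._cost(self.shares)
        self.shares = proposed
        return paid


market = LMSRMarket()
print(f"P(3* or above) before trading: {market.price(0):.2f}")  # 0.50
paid = market.buy(0, 30)  # a participant bets the output will score 3*+
print(f"paid {paid:.2f} virtual units; price is now {market.price(0):.2f}")
```

The liquidity parameter b governs how sharply a single trade moves the price; because each share pays out one unit if the event occurs, the instantaneous price doubles as the crowd's probability estimate that a given output will reach the 3* threshold.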

Knowledge Exchange Openness Profile – Knowledge Exchange

“Part of KE’s work on Open Scholarship aims to enhance the evaluation of research and researchers. Current evaluation practice does not recognise the non-academic contributions that make Open Scholarship work, such as activities to open up and curate data for re-use, or making research results findable and available. Our approach is to raise awareness of this gap and work towards a possible solution, through the development of an ‘Openness Profile’.

The KE Open Scholarship Research Evaluation task & finish group works on the awareness issue, listing all academic and non-academic contributions that are essential to Open Scholarship and should be recognised when evaluating research. The group is also developing the Openness Profile, a tool meant to allow evaluation of these currently ignored contributions. For its development we seek the involvement of key stakeholders and alignment with existing identifiers such as the DOI and ORCID iD.

By demonstrating the immaturity of current research evaluation practice, and by developing the Openness Profile tool, KE supports researchers and non-researchers alike in getting credit for all the contributions that make Open Scholarship possible. Our ambition is that recognition of these essential activities becomes part of the standard research evaluation routine….”

Driving Institutional Change for Research Assessment Reform

“Academic institutions and funders assess their scientists’ research outputs to help allocate their limited resources. Research assessments are codified in policies and enacted through practices. Both can be problematic: policies if they do not accurately reflect institutional mission and values; and practices if they do not reflect institutional policies.

Even if new policies and practices are developed and introduced, their adoption often requires significant cultural change and buy-in from all relevant parties – applicants, reviewers and decision makers.

We will discuss how to develop and adopt new research assessment policies and practices through panel discussions, short plenary talks and breakout sessions. We will use the levels of intervention described in the “Changing a Research Culture” pyramid (Nosek, 2019) to organize the breakout sessions….”

Chasing cash cows in a swamp? Perspectives on Plan S from Australia and the USA | Unlocking Research

“Rankings are a natural enemy of openness….

Australian universities are heavily financially reliant on overseas students….

University rankings are extremely important in the recruitment of overseas students….

There is incredible pressure on researchers in Australia to perform. This can take the form of reward, with many universities offering financial incentives for publication in ‘top’ journals….

For example, Griffith University’s Research and Innovation Plan 2017-2020 includes: “Maintain a Nature and Science publication incentive scheme”. Publication in these two journals comprises 20% of the score in the Academic Ranking of World Universities….”

India Not Joining Plan S, Pursuing More Nationally Focused Efforts: K. VijayRaghavan

“In February 2018, K. VijayRaghavan, the principal scientific adviser to the Government of India, announced through a series of tweets that the Government of India, which funds over half of all scientific research undertaken in the country, would be joining an ambitious European effort to lower the costs of scientific publishing and improve public access to the scientific literature.

However, at a talk he delivered in Bengaluru on October 25, VijayRaghavan said that India would not be enrolling in this initiative – called Plan S – and that it is pursuing a parallel effort to negotiate with journal publishers….”