Bringing Scholarship Back to the Heart of Scholarly Communication

“What are our chances of better aligning the paved and unpaved routes, or, in other words, what are our options to reduce the gap between established, ‘paved’ practices of scholarly communication and actual, evolving research practices? My thoughts are situated in the contexts of arts and humanities research, but similar phenomena are surely present in other disciplines as well….”

Research Assessment in the Transition to Open Science

“This report provides a comprehensive and up-to-date overview of the current state of research assessment at European universities, and shows why and how institutions are reviewing their evaluation practices. Based on the results of the 2019 EUA Open Science and Open Access Survey on Research Assessment, it aims to inform and strengthen the discussion by gathering and sharing information about current and future university approaches to research assessment….”

How a working group began the process of DORA implementation at Imperial College London – DORA

“Even so, it is much easier to sign DORA than to deliver on the commitment that signing entails. And while I would always recommend that universities sign as soon as they are ready to commit, because doing so sends such a positive message to their researchers, they should not put pen to paper without a clear idea of how signing will impact their approach to research assessment, or how they are going to develop any changes with their staff….

Out went phrases such as “contributions to research papers that appear in high-impact journals” to be replaced by “contributions to high quality and impactful research.” The change is subtle but significant – the revised guidance makes it plain that ‘impactful research’ in this context is not a cypher for the JIF; rather it is work “that makes a significant contribution to the field and/or has impact beyond the immediate field of research.” …”

Evaluating FAIR maturity through a scalable, automated, community-governed framework | Scientific Data

Abstract:  Transparent evaluations of FAIRness are increasingly required by a wide range of stakeholders, from scientists to publishers, funding agencies and policy makers. We propose a scalable, automatable framework to evaluate digital resources that encompasses measurable indicators, open source tools, and participation guidelines, which come together to accommodate domain relevant community-defined FAIR assessments. The components of the framework are: (1) Maturity Indicators – community-authored specifications that delimit a specific automatically-measurable FAIR behavior; (2) Compliance Tests – small Web apps that test digital resources against individual Maturity Indicators; and (3) the Evaluator, a Web application that registers, assembles, and applies community-relevant sets of Compliance Tests against a digital resource, and provides a detailed report about what a machine “sees” when it visits that resource. We discuss the technical and social considerations of FAIR assessments, and how this translates to our community-driven infrastructure. We then illustrate how the output of the Evaluator tool can serve as a roadmap to assist data stewards to incrementally and realistically improve the FAIRness of their resources.
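
The abstract describes Compliance Tests as small web applications that each check one narrowly scoped, machine-measurable FAIR behaviour, which the Evaluator then assembles into a report of what a machine "sees" at a resource. As a rough, hypothetical illustration of that idea only (the function and class names, the two example indicators, and the placeholder DOI below are all invented and are not part of the authors' framework or its API), a standalone check of this kind might look like:

```python
# Hypothetical sketch of a single "compliance test"-style check, assuming two
# illustrative indicators: (1) the resource is addressed by a recognised
# persistent-identifier scheme, and (2) its landing page embeds parseable
# JSON-LD metadata. Standard library only.
import json
import urllib.request
from html.parser import HTMLParser


class JsonLdExtractor(HTMLParser):
    """Collects the text of <script type="application/ld+json"> blocks."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self._buffer = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type", "").lower() == "application/ld+json":
            self._in_jsonld = True
            self._buffer = []

    def handle_data(self, data):
        if self._in_jsonld:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._in_jsonld:
            self.blocks.append("".join(self._buffer))
            self._in_jsonld = False


def compliance_test(resource_url: str) -> dict:
    """Return a small report of what a machine 'sees' at resource_url."""
    report = {
        "url": resource_url,
        "persistent_identifier": False,
        "machine_readable_metadata": False,
    }

    # Illustrative indicator 1: the identifier uses a persistent scheme.
    report["persistent_identifier"] = resource_url.startswith(
        ("https://doi.org/", "http://doi.org/", "https://hdl.handle.net/")
    )

    # Illustrative indicator 2: the landing page embeds JSON-LD metadata.
    try:
        with urllib.request.urlopen(resource_url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
    except OSError:
        return report  # unreachable resources fail the metadata check

    parser = JsonLdExtractor()
    parser.feed(html)
    for block in parser.blocks:
        try:
            json.loads(block)
            report["machine_readable_metadata"] = True
            break
        except json.JSONDecodeError:
            continue
    return report


if __name__ == "__main__":
    # Placeholder DOI for illustration only.
    print(compliance_test("https://doi.org/10.1234/example-dataset"))
```

In the framework itself, checks like this are community-authored against a specific Maturity Indicator and invoked over the web by the Evaluator, rather than run locally as a script; the sketch is only meant to convey how narrow and automatable each individual test is.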

REF should accommodate more diverse outputs, says study | Times Higher Education (THE)

“The UK’s research excellence framework should evolve to support the growing diversity of scholarly outputs, a major report says.

The study by consultants Rand Europe, who were commissioned by Research England to consider how research assessment might need to evolve over the next decade, draws on a survey of 3,768 academics in England.

It says that, while scholars currently produce an average of 4.7 different types of research output, this is likely to increase to 6.5 over the next decade, with 65 per cent of respondents saying that they expected to produce a greater diversity of output.

Respondents said that the three most dominant forms of output were likely to remain journal articles, conference contributions and book chapters. But many mentioned other types of content that they expected to produce more of in future: for example, website content, openly published peer reviews and research reports for external bodies….”

Ethiopia adopts a national open access policy | EIFL

“The new national open access policy adopted by the Ministry of Science and Higher Education of Ethiopia (MOSHE) will transform research and education in our country. The policy comes into effect immediately. It mandates open access to all published articles, theses, dissertations and data resulting from publicly-funded research conducted by staff and students at universities that are run by the Ministry – that is over 47 universities located across Ethiopia.

In addition to mandating open access to publications and data, the new policy encourages open science practices by including ‘openness’ as one of the criteria for assessment and evaluation of research proposals. All researchers who receive public funding must submit their Data Management Plans to research offices and to university libraries for approval, to confirm that data will be handled according to international FAIR data principles. (FAIR data are data that meet standards of Findability, Accessibility, Interoperability and Reusability.)…”

Driving Institutional Change for Research Assessment Reform – DORA

“What is this meeting about?

DORA and the Howard Hughes Medical Institute (HHMI) are convening a diverse group of stakeholders to consider how to improve research assessment policies and practices. By exploring different approaches to cultural and systems change, we will discuss practical ways to reduce the reliance on proxy measures of quality and impact in hiring, promotion, and funding decisions. To focus on practical steps forward that will improve research assessment practices, we are not going to discuss the well-documented deficiencies of the Journal Impact Factor (JIF) as a measure of quality….”

Preprints are valid research outputs for REF2021 – ASAPbio

“In conversations about preprints in the UK, the question is often raised: ‘are preprints included in REF?’ In brief: yes. This is most likely to be applicable for any research manuscript that is prepared close to the REF2021 submission deadline, and is deemed to be amongst your best work in the current cycle, but which would otherwise not be eligible for REF2021 due to not having time to be published in a journal before the deadline.

The Research Excellence Framework (or REF) is the exercise the UK higher education funding bodies undertake periodically to assess UK research institutions for excellence and impact of research outputs. The REF scores determine allocation of approximately £2bn/year national funding for research, so REF is a major driver of UK institutional policy and researcher behaviour.  Learn more about REF at the end of the post.

Below, we describe how preprints can be included in REF submissions, with extracts from the official REF guidance. …”