Knowledge Exchange Openness Profile – Knowledge Exchange

“As part of our work on Open Scholarship, we are working to raise awareness of the lack of recognition in current evaluation practice and work towards a possible solution; through development of an ‘Openness Profile’…

Part of KE’s work on Open Scholarship aims to enhance the evaluation of research and researchers. This currently does not cover recognition of non-academic contributions that make Open Scholarship work, such as activities to open up and curate data for re-use, or making research results findable and available. Our approach is to raise awareness of the lack of recognition in current evaluation practice and work towards a possible solution, through the development of an ‘Openness Profile’.

The KE Open Scholarship Research Evaluation task & finish group works on the awareness issue, listing all academic and non-academic contributions that are essential to Open Scholarship and should be recognised when evaluating research. The group also works on the Openness Profile, a tool that is meant to allow evaluation of currently ignored contributions that are essential for Open Scholarship. For the development of the Openness Profile we seek involvement of various key stakeholders and alignment with current identifiers such as DOI and ORCID iD.

By demonstrating the immaturity of current research evaluation practice, and by developing the Openness Profile tool, KE supports researchers as well as non-researchers to get credit for all their contributions that make Open Scholarship possible. Our ambition is that recognition of these essential activities becomes part of standard research evaluation routine….”

Driving Institutional Change for Research Assessment Reform

“Academic institutions and funders assess their scientists’ research outputs to help allocate their limited resources. Research assessments are codified in policies and enacted through practices. Both can be problematic: policies if they do not accurately reflect institutional mission and values; and practices if they do not reflect institutional policies.

Even if new policies and practices are developed and introduced, their adoption often requires significant cultural change and buy-in from all relevant parties – applicants, reviewers and decision makers.

We will discuss how to develop and adopt new research assessment policies and practices through panel discussions, short plenary talks and breakout sessions. We will use the levels of intervention described in the “Changing a Research Culture” pyramid (Nosek, 2019), to organize the breakout sessions….”

Chasing cash cows in a swamp? Perspectives on Plan S from Australia and the USA | Unlocking Research

“Rankings are a natural enemy of openness….

Australian universities are heavily financially reliant on overseas students….

University rankings are extremely important in the recruitment of overseas students….

There is incredible pressure on researchers in Australia to perform. This can take the form of reward, with many universities offering financial incentives for publication in ‘top’ journals….

For example, Griffith University’s Research and Innovation Plan 2017-2020 includes: “Maintain a Nature and Science publication incentive scheme”. Publication in these two journals comprises 20% of the score in the Academic Ranking of World Universities….”

India Not Joining Plan S, Pursuing More Nationally Focused Efforts: K. VijayRaghavan

“In February 2018, K. VijayRaghavan, the principal scientific adviser to the Government of India, announced through a series of tweets that the Government of India, which funds over half of all scientific research undertaken in the country, will be joining an ambitious European effort to lower the costs of scientific publishing and improve public access to the scientific literature.

However, at a talk he delivered in Bengaluru on October 25, VijayRaghavan said that India will not be enrolling with this initiative – called Plan S – and that it is pursuing a parallel effort to negotiate with journal publishers….”

Bringing Scholarship Back to the Heart of Scholarly Communication

“What are our chances of better aligning the paved and unpaved routes, or, in other words, what are our options to reduce the gap between established, ‘paved’ practices of scholarly communication and actual, evolving research practices? My thoughts are situated in the contexts of arts and humanities research, but similar phenomena are surely present in other disciplines as well….”

Research Assessment in the Transition to Open Science

“This report provides a comprehensive and up-to-date overview of the current state of research assessment at European universities, and shows why and how institutions are reviewing their evaluation practices. Based on the results of the 2019 EUA Open Science and Open Access Survey on Research Assessment, it aims to inform and strengthen the discussion by gathering and sharing information about current and future university approaches to research assessment….”

How a working group began the process of DORA implementation at Imperial College London – DORA

“Even so, it is much easier to sign DORA than to deliver on the commitment that signing entails. And while I would always recommend that universities sign as soon as they are ready to commit, because doing so sends such a positive message to their researchers, they should not put pen to paper without a clear idea of how signing will impact their approach to research assessment, or how they are going to develop any changes with their staff….

Out went phrases such as “contributions to research papers that appear in high-impact journals” to be replaced by “contributions to high quality and impactful research.” The change is subtle but significant – the revised guidance makes it plain that ‘impactful research’ in this context is not a cypher for the JIF; rather it is work “that makes a significant contribution to the field and/or has impact beyond the immediate field of research.” …”

Evaluating FAIR maturity through a scalable, automated, community-governed framework | Scientific Data

Abstract: Transparent evaluations of FAIRness are increasingly required by a wide range of stakeholders, from scientists to publishers, funding agencies and policy makers. We propose a scalable, automatable framework to evaluate digital resources that encompasses measurable indicators, open source tools, and participation guidelines, which come together to accommodate domain relevant community-defined FAIR assessments. The components of the framework are: (1) Maturity Indicators – community-authored specifications that delimit a specific automatically-measurable FAIR behavior; (2) Compliance Tests – small Web apps that test digital resources against individual Maturity Indicators; and (3) the Evaluator, a Web application that registers, assembles, and applies community-relevant sets of Compliance Tests against a digital resource, and provides a detailed report about what a machine “sees” when it visits that resource. We discuss the technical and social considerations of FAIR assessments, and how this translates to our community-driven infrastructure. We then illustrate how the output of the Evaluator tool can serve as a roadmap to assist data stewards to incrementally and realistically improve the FAIRness of their resources.
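The abstract's three components – Maturity Indicators, Compliance Tests, and the Evaluator – compose naturally. The sketch below illustrates that composition in miniature; all names, fields, and the two toy indicators are illustrative assumptions, not the actual FAIR Evaluator API, and real Compliance Tests are Web apps that inspect a live resource rather than an in-memory dict.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class MaturityIndicator:
    """A community-authored, automatically measurable FAIR behavior."""
    mi_id: str
    description: str

@dataclass
class ComplianceTest:
    """Tests a digital resource against one Maturity Indicator."""
    indicator: MaturityIndicator
    check: Callable[[Dict], bool]

def evaluate(resource: Dict, tests: List[ComplianceTest]) -> Dict[str, bool]:
    """Evaluator role: apply a community-relevant set of Compliance
    Tests to a resource and report pass/fail per Maturity Indicator."""
    return {t.indicator.mi_id: t.check(resource) for t in tests}

# Two toy indicators (hypothetical IDs): a persistent identifier
# and machine-readable license metadata.
tests = [
    ComplianceTest(
        MaturityIndicator("MI-F1", "Resource has a persistent identifier"),
        lambda r: str(r.get("identifier", "")).startswith("https://doi.org/"),
    ),
    ComplianceTest(
        MaturityIndicator("MI-R1.1", "Resource declares a license"),
        lambda r: bool(r.get("license")),
    ),
]

resource = {"identifier": "https://doi.org/10.1234/example",
            "license": "CC-BY-4.0"}
report = evaluate(resource, tests)
print(report)  # one boolean verdict per Maturity Indicator
```

A report of this shape is what lets the Evaluator serve as a roadmap: each failing indicator points a data steward at one concrete, incremental improvement.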

REF should accommodate more diverse outputs, says study | Times Higher Education (THE)

“The UK’s research excellence framework should evolve to support the growing diversity of scholarly outputs, a major report says.

The study by consultants Rand Europe, who were commissioned by Research England to consider how research assessment might need to evolve over the next decade, draws on a survey of 3,768 academics in England.

It says that, while scholars currently produce an average of 4.7 different types of research output, this is likely to increase to 6.5 over the next decade, with 65 per cent of respondents saying that they expected to produce a greater diversity of output.

Respondents said that the three most dominant forms of output were likely to remain journal articles, conference contributions and book chapters. But many mentioned other types of content that they expected to produce more of in future: for example, website content, openly published peer reviews and research reports for external bodies….”