Good Practices – Research Institutes – DORA

“DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment.  One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to gather and share existing examples of good practice in research assessment, including approaches to funding and fellowships, hiring and promotion, and awarding prizes, that emphasize research itself and not where it is published. 

If you know of exemplary research assessment methods that could provide inspiration and ideas for research institutes, funders, journals, professional societies, or researchers, please contact DORA….”

To fix research assessment, swap slogans for definitions

“Two years ago, the DORA steering committee hired me to survey practices in research assessment and promote the best ones. Other efforts have similar goals. These include the Leiden Manifesto and the HuMetricsHSS Initiative.

My view is that most assessment guidelines permit sliding standards: instead of clearly defined terms, they give us feel-good slogans that lack any fixed meaning. Facing the problem will get us much of the way towards a solution.

Broad language increases room for misinterpretation. ‘High impact’ can be code for where research is published. Or it can mean the effect that research has had on its field, or on society locally or globally — often very different things.

Yet confusion is the least of the problems. Descriptors such as ‘world-class’ and ‘excellent’ allow assessors to vary comparisons depending on whose work they are assessing. Academia cannot be a meritocracy if standards change depending on whom we are evaluating. Unconscious bias associated with factors such as a researcher’s gender, ethnic origin and social background helps to perpetuate the status quo. It was only with double-blind review of research proposals that women finally got fair access to the Hubble Space Telescope.

Research suggests that using words such as ‘excellence’ in the criteria for grants, awards and promotion can contribute to hypercompetition, in part through the ‘Matthew effect’, in which recognition and resources flow mainly to those who have already received them….”

Internal Collaboration: Using the IR to Build a Promotion and Tenure Package

“At Embry-Riddle Aeronautical University, world-class experts conduct cutting-edge air and space research, and the Library works diligently to capture and share all of this unique work through its institutional repository. When the Chief Information Officer asked library staff for help creating a Promotion and Tenure tool to better serve faculty looking to advance, librarians Debra Rodensky and Chip Wolfe knew they had the technology, the full content of the IR, and the strong campus relationships to make it happen. Join Debra and Chip on December 11 for a webinar on the Library’s collaboration with IT and its relationships with faculty through the tenure and promotion process. Topics will include:

– The historical relationship between the Library, its IR and the IT department at Embry-Riddle
– The changing culture of promotion and tenure on campus
– Challenges and successes of building a tool to meet the needs of both IT and faculty….”

Knowledge sector takes major step forward in new approach to recognising and rewarding academics

“Academics can excel in many areas, but until now they have been assessed primarily on their research achievements. From now on, public knowledge institutions and research funders want to consider academics’ knowledge and expertise more broadly when determining career policy and grant requirements. In doing so, our aim is to ensure that the recognition and rewards system is better suited to the core tasks of the knowledge institutions in the areas of education, research, impact and patient care, and that the appreciation academics receive is better aligned with society’s needs.

A change is urgently needed in the way universities recognise and reward their academic staff. Research achievements have long determined academics’ career paths, and this dominance is increasingly at odds with reality. Education and impact are also crucial to the success of a modern knowledge institution, as is patient care for our university medical centres. New developments relating to Open Access and Open Science are placing different demands on modern-day academics as well. Tackling complex scientific and social issues requires greater collaboration. At the moment, there are still insufficient career prospects for staff who (in addition to doing good research) mainly excel in education….”

The fundamental problem blocking open access and how to overcome it: the BitViews project

Abstract: In our view, the fundamental obstacle to open access (OA) is the lack of any incentive-based mechanism that unbundles authors’ accepted manuscripts (AMs) from publishers’ versions of record (VoRs). The former can be seen as the public good that ought to be openly accessible, whereas the latter is owned by publishers and rightly paywall-restricted. We propose one such mechanism to overcome this obstacle: BitViews. BitViews is a blockchain-based application that aims to revolutionize the OA publishing ecosystem. Currently, the main academic currency of value is the citation. There have been attempts in the past to create a second currency whose measure is the online usage of research materials (e.g. PIRUS). However, these have failed due to two problems. Firstly, it has been impossible to find a single agency willing to co-ordinate and fund the validation and collation of global online usage data. Secondly, online usage metrics have lacked transparency in how they filter non-human online activity. BitViews is a novel solution that uses blockchain technology to bypass both problems: online AM usage will be recorded on a public, distributed ledger, obviating the need for a central responsible agency, and the rules governing activity-filtering will be part of the open-source BitViews blockchain application, creating complete transparency. Once online AM usage has measurable value, researchers will be incentivized to promote and disseminate AMs. This will fundamentally re-orient the academic publishing ecosystem. A key feature of BitViews is that its success (or failure) is wholly and exclusively in the hands of the worldwide community of university and research libraries, as we suggest that it ought to be financed by conditional crowdfunding, whereby the actual financial commitment of each contributing library depends on the total amount raised. If the financing target is not reached, all contributions are returned in full; if the target is over-fulfilled, the surplus is returned pro rata.
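
The conditional crowdfunding rule at the end of the abstract is concrete enough to express as a short calculation. The sketch below is a minimal illustration of that rule as described, not code from the BitViews project; the function name, data layout and figures are invented for the example, and the financing target is assumed to be positive.

```python
# Minimal sketch of the conditional crowdfunding rule described in the
# abstract (illustrative only; not part of the BitViews project).
# Each library pledges an amount; commitments only become real if the
# total pledged reaches the financing target, and any surplus is
# returned pro rata by scaling every commitment down proportionally.

def settle_pledges(pledges, target):
    """Return the actual amount charged to each contributing library.

    pledges: dict mapping library name -> pledged amount
    target:  financing target (assumed positive)
    """
    total = sum(pledges.values())
    if total < target:
        # Target not reached: all contributions are returned in full.
        return {library: 0.0 for library in pledges}
    # Target met or over-fulfilled: each library pays the same fraction
    # of its pledge, so the amount actually raised equals the target.
    scale = target / total
    return {library: amount * scale for library, amount in pledges.items()}


if __name__ == "__main__":
    # Hypothetical figures: 30,000 pledged against a 24,000 target, so each
    # library pays 80% of its pledge and the 6,000 surplus is returned pro rata.
    pledges = {"Library A": 10_000, "Library B": 5_000, "Library C": 15_000}
    print(settle_pledges(pledges, target=24_000))
```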

Textbooks could be free if universities rewarded professors for writing them – Academic Matters

“eCampusOntario commissioned me to produce a report on how institutions of higher learning could support the implementation of open educational resources. I worked with the centre for a year as an Open Education Fellow, one of six who were selected because of our own involvement in producing open educational resources at our colleges and universities….

We found only two institutions in Canada, the University of British Columbia and the Southern Alberta Institute of Technology, where explicit mention of open education had been made in performance and tenure policies.

We recommended that Ontario’s colleges and universities recognize the creation of open resources in policies governing tenure and promotion. Doing so would change the culture of these institutions and be a more effective incentive than course buy-outs or small grants. It would communicate clearly that institutions of higher education take seriously the responsibility to tailor knowledge to students and to reduce barriers….”

Promoting openness – Research Professional News

“Of the potential solutions, open research practices are among the most promising. The argument is that transparency acts as an implicit quality control process. If others are able to scrutinise our work—not just the final published output, but the underlying data, code, and so on—researchers will be incentivised to ensure these are high quality.

So, if we think that research could benefit from improved quality control, and if we think that open research might have a role to play in this, why aren’t we all doing it? In a word: incentives….”

Knowledge Exchange Openness Profile – Knowledge Exchange

“As part of our work on Open Scholarship, we aim to raise awareness of the lack of recognition in current evaluation practice and to work towards a possible solution, through the development of an ‘Openness Profile’…

Part of KE’s work on Open Scholarship aims to enhance the evaluation of research and researchers. This currently does not cover recognition of the non-academic contributions that make Open Scholarship work, such as activities to open up and curate data for re-use, or to make research results findable and available. Our approach is to raise more awareness of the lack of recognition in current evaluation practice and to work towards a possible solution, through the development of an ‘Openness Profile’.

The KE Open Scholarship Research Evaluation task & finish group works on the awareness issue, listing all academic and non-academic contributions that are essential to Open Scholarship and should be recognised when evaluating research. The group also works on the Openness Profile, a tool meant to allow evaluation of currently ignored contributions that are essential for Open Scholarship. For the development of the Openness Profile we seek the involvement of various key stakeholders and alignment with current identifiers such as DOI and ORCID iD.
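
As a purely illustrative sketch of what ‘alignment with current identifiers such as DOI and ORCID iD’ might look like in practice, the snippet below models a hypothetical Openness Profile record that links a contributor’s ORCID iD to contributions identified by DOIs. None of these class or field names come from Knowledge Exchange’s own design; they are assumptions made for the example.

```python
# Hypothetical data model for an Openness Profile entry (illustrative only;
# not Knowledge Exchange's specification). It links a contributor's ORCID iD
# to contributions identified by DOIs, including non-article work such as
# data curation that standard evaluation currently overlooks.

from dataclasses import dataclass, field

@dataclass
class Contribution:
    doi: str               # persistent identifier of the output, e.g. a dataset DOI
    kind: str              # e.g. "data curation", "software", "metadata"
    description: str = ""  # short note on the open-scholarship activity

@dataclass
class OpennessProfile:
    orcid_id: str                        # the contributor's ORCID iD
    contributions: list = field(default_factory=list)

    def add(self, doi, kind, description=""):
        """Record a contribution so it can be surfaced during evaluation."""
        self.contributions.append(Contribution(doi, kind, description))


if __name__ == "__main__":
    # ORCID's published example iD and an invented DOI, for illustration only.
    profile = OpennessProfile(orcid_id="0000-0002-1825-0097")
    profile.add(doi="10.1234/example-dataset", kind="data curation",
                description="Curated and documented survey data for re-use")
    print(profile)
```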

By demonstrating the immaturity of current research evaluation practice, and by developing the Openness Profile tool, KE supports researchers and non-researchers alike in getting credit for all the contributions that make Open Scholarship possible. Our ambition is for recognition of these essential activities to become part of the standard research evaluation routine….”

Driving Institutional Change for Research Assessment Reform

“Academic institutions and funders assess their scientists’ research outputs to help allocate their limited resources. Research assessments are codified in policies and enacted through practices. Both can be problematic: policies if they do not accurately reflect institutional mission and values; and practices if they do not reflect institutional policies.

Even if new policies and practices are developed and introduced, their adoption often requires significant cultural change and buy-in from all relevant parties – applicants, reviewers and decision makers.

“We will discuss how to develop and adopt new research assessment policies and practices through panel discussions, short plenary talks and breakout sessions. We will use the levels of intervention described in the “Changing a Research Culture” pyramid (Nosek, 2019) to organize the breakout sessions….”

Open Access Week 2019 – What are we talking about and where are we going? | Impact of Social Sciences

“As the organisers of Open Access Week describe, the debate around open access has transitioned from one about the viability of the concept of open access to one about creating an equitable research culture in which openness is the default. As Daniel Hook argued in The Open Tide – How openness in research and communication is becoming the default setting, a post directly related to this question, an open culture is emerging but remains unevenly distributed, dependent on different national research systems that support research cultures with varying degrees of openness. As the post suggests, the ways in which governments fund and maintain open infrastructures are therefore critical to delivering not only openness, but also equity.

For this reason, although 2019 has seen many important developments for open access, the most significant event for open access in 2019 likely occurred in 2018, when cOAlition S unveiled Plan_S, the ambitious funder-led mandate to initiate a global transition to open forms of research publication. Whilst this initiative has had significant implications for academic publishers, as described by Martin Szomszor in his post Making Waves – Assessing the potential impacts of Plan S on the scholarly communications ecosystem, it has also raised more existential questions about the purpose of open access and Plan_S. In particular, as Jon Tennant argues in Plan S – Time to decide what we stand for, it asks whether the open future we would like to see is one that maintains the status quo of commercial academic publishing, or one that promotes a scholarly communication system run by and focused on the needs of the academic community. …”