Good Practices – Research Institutes – DORA

“DORA’s ultimate aim is not to accumulate signatures but to promote real change in research assessment. One of the keys to this is the development of robust and time-efficient ways of evaluating research and researchers that do not rely on journal impact factors. We are keen to gather and share existing examples of good practice in research assessment, including approaches to funding and fellowships, hiring and promotion, and awarding prizes, that emphasize research itself and not where it is published.

If you know of exemplary research assessment methods that could provide inspiration and ideas for research institutes, funders, journals, professional societies, or researchers, please contact DORA….”

To fix research assessment, swap slogans for definitions

“Two years ago, the DORA steering committee hired me to survey practices in research assessment and promote the best ones. Other efforts have similar goals. These include the Leiden Manifesto and the HuMetricsHSS Initiative.

My view is that most assessment guidelines permit sliding standards: instead of clearly defined terms, they give us feel-good slogans that lack any fixed meaning. Facing the problem will get us much of the way towards a solution.

Broad language increases room for misinterpretation. ‘High impact’ can be code for where research is published. Or it can mean the effect that research has had on its field, or on society locally or globally — often very different things. Yet confusion is the least of the problems. Descriptors such as ‘world-class’ and ‘excellent’ allow assessors to vary comparisons depending on whose work they are assessing. Academia cannot be a meritocracy if standards change depending on whom we are evaluating. Unconscious bias associated with factors such as a researcher’s gender, ethnic origin and social background helps to perpetuate the status quo. It was only with double-blind review of research proposals that women finally got fair access to the Hubble Space Telescope. Research suggests that using words such as ‘excellence’ in the criteria for grants, awards and promotion can contribute to hypercompetition, in part through the ‘Matthew effect’, in which recognition and resources flow mainly to those who have already received them….”

Addendum to the cOAlition S Guidance on the Implementation of Plan S | Plan S

“cOAlition S endorse a number of strategies to encourage subscription publishers to transition to Open Access. These approaches are referred to as ‘transformative arrangements’ and include transformative agreements, transformative model agreements and transformative journals[1].

The Guidance on the Implementation of Plan S indicates an ambition of developing a framework for ‘transformative journals’. Such ‘transformative journals’ are journals that (i) gradually increase the share of Open Access content, (ii) offset subscription income from payments for publishing services (to avoid double payments), and (iii) have a clear commitment to a transition to full and immediate Open Access for all peer-reviewed scholarly articles within an agreed timeframe.

The requirements below constitute this framework.

[Here omitting 8 mandatory criteria for transformative journals and 3 suggested criteria.]

We are now seeking input from the community on this draft framework and encourage all interested stakeholders to respond. The consultation on this draft framework is open until 09.00 CET on Monday 6th January 2020. We plan to publish a final version of this framework by the end of March 2020.”

The fundamental problem blocking open access and how to overcome it: the BitViews project

Abstract: In our view the fundamental obstacle to open access (OA) is the lack of any incentive-based mechanism that unbundles authors’ accepted manuscripts (AMs) from published versions of record (VoRs). The former can be seen as the public good that ought to be openly accessible, whereas the latter is owned by publishers and rightly paywall-restricted. We propose one such mechanism to overcome this obstacle: BitViews. BitViews is a blockchain-based application that aims to revolutionize the OA publishing ecosystem. Currently, the main academic currency of value is the citation. There have been attempts in the past to create a second currency whose measure is the online usage of research materials (e.g. PIRUS). However, these have failed due to two problems. Firstly, it has been impossible to find a single agency willing to co-ordinate and fund the validation and collation of global online usage data. Secondly, online usage metrics have lacked transparency in how they filter non-human online activity. BitViews is a novel solution which uses blockchain technology to bypass both problems: online AM usage will be recorded on a public, distributed ledger, obviating the need for a central responsible agency, and the rules governing activity-filtering will be part of the open-source BitViews blockchain application, creating complete transparency. Once online AM usage has measurable value, researchers will be incentivized to promote and disseminate AMs. This will fundamentally re-orient the academic publishing ecosystem. A key feature of BitViews is that its success (or failure) is wholly and exclusively in the hands of the worldwide community of university and research libraries, as we suggest that it ought to be financed by conditional crowdfunding, whereby the actual financial commitment of each contributing library depends on the total amount raised. If the financing target is not reached, all contributions are returned in full; if the target is exceeded, the surplus is returned pro rata.
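The conditional-crowdfunding settlement rule described in the abstract (full refund below the target, pro-rata refund of any surplus above it) can be sketched as follows. The function name and data shapes are illustrative assumptions, not part of the BitViews proposal:

```python
def settle_crowdfunding(pledges, target):
    """Settle a conditional crowdfunding round.

    pledges: dict mapping contributor -> pledged amount.
    Returns (funded, charges), where charges maps each contributor
    to the amount actually kept:
      - total < target: nothing is kept, all pledges refunded in full
      - total >= target: only the target is kept, split pro rata
    """
    total = sum(pledges.values())
    if total < target:
        return False, {lib: 0.0 for lib in pledges}
    scale = target / total  # pro-rata factor; surplus is refunded
    return True, {lib: amount * scale for lib, amount in pledges.items()}
```

For example, two libraries pledging 60 each against a target of 100 would each be charged 50, with the 20 surplus returned pro rata.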

Do articles in open access journals have more frequent altmetric activity than articles in subscription-based journals? An investigation of the research output of Finnish universities | SpringerLink

Abstract:  Scientific articles available in Open Access (OA) have been found to attract more citations and online attention to the extent that it has become common to speak about OA Altmetrics Advantage. This research investigates how the OA Altmetrics Advantage holds for a specific case of research articles, namely the research outputs from universities in Finland. Furthermore, this research examines disciplinary and platform specific differences in that (dis)advantage. The new methodological approaches developed in this research focus on relative visibility, i.e. how often articles in OA journals receive at least one mention on the investigated online platforms, and relative receptivity, i.e. how frequently articles in OA journals gain mentions in comparison to articles in subscription-based journals. The results show significant disciplinary and platform specific differences in the OA advantage, with articles in OA journals within for instance veterinary sciences, social and economic geography and psychology receiving more citations and attention on social media platforms, while the opposite was found for articles in OA journals within medicine and health sciences. The results strongly support field- and platform-specific considerations when assessing the influence of journal OA status on altmetrics. The new methodological approaches used in this research will serve future comparative research into OA advantage of scientific articles over time and between countries.
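The two measures defined in the abstract, relative visibility (share of articles with at least one mention) and relative receptivity (mention frequency of OA articles relative to subscription articles), can be sketched roughly as follows. The function names and input shapes are assumptions for illustration, not the study’s actual code:

```python
def visibility(mention_counts):
    """Share of articles that received at least one mention.

    mention_counts: list of per-article mention counts on a platform.
    """
    return sum(1 for m in mention_counts if m > 0) / len(mention_counts)


def relative_receptivity(mentions_oa, mentions_sub):
    """Mean mentions per OA-journal article divided by mean mentions
    per subscription-journal article; a ratio above 1 suggests an
    OA advantage on that platform."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(mentions_oa) / mean(mentions_sub)
```

Comparing `visibility(oa_counts)` against `visibility(sub_counts)` per discipline and per platform mirrors the field- and platform-specific comparison the study describes.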

Dutch universities and research funders move away from the impact factor – ScienceGuide

“By the end of 2019, all parties involved in this project pledge to have signed DORA. This commitment has to be more than an empty gesture. For example, norms such as ‘four publications to obtain a PhD’ will be abolished, and NWO and ZonMw will no longer inquire about h-index or journal impact factor when academics submit grant applications. Instead of asking for a publication list and CV, they will ask for a more ‘narrative’ approach – inquiring about why this research is important, and why the applicant is the right person to carry it out.

This change will be fast, but not instant. The parties involved acknowledge that change takes time. Considering that the focus on metrics such as impact factors took decades to become part of established practice, unlearning these routines will require a considerable amount of time, energy and perseverance. Correctly identifying diverse forms of talent and ‘good research’ will be a learning experience: “To accelerate the desired cultural change in recognition and rewards, we at NWO and ZonMw will strongly focus on training and instruction for our grant evaluation committees.” …”

Tracing the path from social attention to scientific impact | Cardiovascular Research | Oxford Academic

“The ‘citation’ has long served as the bedrock out of which the scientific impact of a published work is carved. This, too, is the basis of common metrics such as the impact factor (IF) and h-index. However, these traditional metrics have been criticized over time for being opaque, difficult to compare across fields, easily manipulatable and ultimately confined to assessing impact within academic circles (rather than educational, or in the general public).1 To confuse matters further, different article databases often report different citation counts for papers depending on their level of coverage.2 For scientists, the ‘publish or perish’ attitude in contemporary science has also led to a fundamental shift in behaviour to emphasize increased output. This includes, for example, splitting papers into multiple, smaller publications. Given the rate of scientific output has doubled approximately every 9 years, it has become increasingly difficult to know exactly what is worth paying attention to in the scientific milieu. Similarly, the considerable time for citations to accumulate hinders an immediate assessment of a work’s impact. These factors are among those which provided the impetus behind the development of Altmetrics: an alternative metric capitalizing on the advent of the internet and social media, where over a third of scholars are on Twitter, to allow a glimpse into a scientific work’s reach at an earlier stage than citations would typically allow.3

Altmetric Attention Scores (AAS) are largely qualitative scores calculated based on weighted variables associated with an article, such as the number of referencing blog posts and Wikipedia articles, Twitter mentions and peer-reviews on Faculty of 1000.4 The different score weightings applied to each variable ostensibly reflect their relative social reach, with news posts and Twitter mentions contributing 8 and 1 points towards the overall AAS, respectively. It gets slightly more complex with Twitter, where scores are applied based on the nature of the Tweeter: a publisher’s account should be valued less than an unconnected individual’s account with a sizeable following. Fundamental to the use of AAS is that they are indicators of attention rather than quality. For this reason, Altmetrics can be seen as complementary to traditional bibliometrics. Where the score is particularly useful is in helping to discover papers, evaluate scientific impact in a broader context and as a tool for examining trends in disciplines over time….”
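The weighted-sum idea behind the AAS can be sketched as below. Only the news (8) and tweet (1) weights come from the excerpt above; the remaining weights, the Twitter down-weighting mechanism, and all names are assumptions for illustration, not Altmetric’s actual algorithm:

```python
# Assumed weights: only "news" = 8 and "tweet" = 1 are stated in the
# excerpt; the others are placeholders for illustration.
WEIGHTS = {"news": 8.0, "blog": 5.0, "wikipedia": 3.0, "tweet": 1.0}


def attention_score(counts, tweet_modifier=1.0):
    """Weighted sum of per-source mention counts.

    counts: dict mapping source name -> number of mentions.
    tweet_modifier: factor below 1 down-weights tweets, e.g. those
    from a publisher's own account rather than an unconnected
    individual's, as the excerpt describes for Twitter scoring.
    """
    score = 0.0
    for source, n in counts.items():
        weight = WEIGHTS.get(source, 0.0)
        if source == "tweet":
            weight *= tweet_modifier
        score += weight * n
    return score
```

So an article with 2 news mentions and 10 ordinary tweets would score 2 × 8 + 10 × 1 = 26 under these assumed weights, while the same 10 tweets from down-weighted accounts would contribute proportionally less.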

A new era for research publication: Will Open Access become the norm? – Hotta – Journal of Diabetes Investigation – Wiley Online Library

“This new challenge [Plan S] causes some concerns to us. This program is unlikely to be equivalent between Europe and the United States8), because key US federal agencies, such as the National Institutes of Health (NIH), mandate a ‘green’ Open Access policy, whereby articles in subscription journals are automatically made available after a 12-month embargo. This policy protects the existing ‘paywalled’ subscription business model. Also, ‘Plan S’ does not allow for scientists to publish their papers in hybrid journals….

One piece of bright news, however, is that Open Access publication fees would be covered by funders or research institutions, not by individual researchers. Although our journal is already Open Access, we have some concerns regarding the publication fee being covered by either researchers or institutions….”

“Given that the publishing industry is approaching a new era in which 85% or more of journals are Open Access, it is necessary for us to develop a survival strategy against this coming fierce competition….”