Science Europe Position Statement – On a New Vision for More Meaningful Research Impact Assessment

“Research has always had a wide impact on society, but this does not always come in the form of a clearly defined outcome, application, or effect. It frequently occurs as a more gradual development in the understanding of the consequences of new knowledge. It may also happen long after the corresponding research was done, or be the outcome of repeated interactions between research and society. This can make societal impact difficult or impossible to trace back and attribute to the specific research from which it originated. In addition, that research may have already been evaluated without this impact having been taken into account.”

HIRMEOS Project – High Integration of Research Monographs in the European Open Science infrastructure

“Several projects, especially in Europe, pursue the aim of publishing Open Access research monographs. However, not enough has been done yet to integrate Open Access monographs into the open science ecosystem in a systematic and coordinated fashion. That is the main goal of the High Integration of Research Monographs in the European Open Science (HIRMEOS) project. The project addresses the particularities of academic monographs as a specific medium of scholarly communication in the Social Sciences and the Humanities, and tackles the main obstacles to the full integration of monographs into the European Open Science Cloud. It aims to prototype innovative services for monographs in support of the Open Science infrastructure by providing additional data, links, and interactions to the documents, while at the same time paving the way to new potential tools for research assessment, which remains a major challenge in the Humanities and Social Sciences.

By improving existing publishing platforms and repositories participating in the OpenAIRE infrastructure, the HIRMEOS project will increase its impact and help bring more disciplines into the Open Science paradigm, widening its boundaries towards the Humanities and Social Sciences and reaching out to fields that have so far been poorly integrated….”

The challenge of open access compliance

“Researchers at UK Higher Education Institutions (HEIs) are now subject to HEFCE’s open access policy if they want to submit their work to the Research Excellence Framework (REF) in 2021. The policy applies to journal articles and conference proceedings accepted for publication after 1 April 2016. These research outputs must be deposited in an institutional or subject repository as soon as possible after the point of acceptance. For the first two years of the policy there is flexibility to deposit up to three months after the date of early online publication. After April 2018, it is anticipated that the policy terms will become stricter and deposit must occur within three months of acceptance….

The financial costs associated with supporting compliance with the policy are high. Many HEIs initially relied heavily on their Research Councils UK funding to meet staffing costs. Over time, institutions have taken on staff costs to ensure the longevity of their open access teams, and some have even been in a position to create institutional funds for gold open access. At a time when increasing subscription costs are regularly imposed by publishers it can be difficult for institutions to find the means to support open access, despite its obvious importance. The cultural challenges associated with the HEFCE policy can prove to be even more difficult to overcome….

After three years of promotion and engagement with researchers through school board meetings, research support meetings, training sessions and online support materials, attitudes have gradually shifted towards support for open access. Following a review of 2016, we discovered that 93% of the papers in our repository that are subject to HEFCE’s policy are REF eligible. This positive trend has continued into 2017 with many more papers being deposited on a daily basis….”
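The deposit rule described above reduces to a simple date comparison. The following Python sketch is purely illustrative: the function name, the 92-day stand-in for ‘three months’, and the example dates are invented here and are not part of the policy text.

    from datetime import date, timedelta

    POLICY_START = date(2016, 4, 1)    # outputs accepted after this date fall under the policy
    STRICTER_TERMS = date(2018, 4, 1)  # anticipated stricter terms apply from this date
    GRACE = timedelta(days=92)         # 'three months', approximated here as 92 days

    def hefce_deposit_compliant(accepted, first_online, deposited):
        """Illustrative check of the deposit-timing rule quoted above."""
        if accepted <= POLICY_START:
            return True  # accepted before the policy took effect, so out of scope
        if accepted < STRICTER_TERMS:
            # first two years of the policy: up to three months after early online publication
            return deposited <= first_online + GRACE
        # anticipated stricter terms: within three months of acceptance
        return deposited <= accepted + GRACE

    # A paper accepted in May 2017, online in July, deposited in September: compliant
    print(hefce_deposit_compliant(date(2017, 5, 1), date(2017, 7, 1), date(2017, 9, 1)))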

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established, to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics … and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016, to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to gather further feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous, via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

Consultation on the second Research Excellence Framework (HEFCE 2016/36)

“This document sets out the proposals of the four UK higher education funding bodies for the second Research Excellence Framework (REF) for the assessment of research in UK higher education institutions. The proposals seek to build on the first REF conducted in 2014, and to incorporate the principles identified in Lord Stern’s Independent Review of the REF….”

For open access, see especially paragraphs 55, 68, 69, 116, 117, and all of Annex C (on OA monographs).

David Wojick’s writings and stuff: Beall-based Indian turmoil?

“Synopsis: New data sheds light on Indian researchers’ use of low-cost journals. The Indian Government’s attack on these journals, based on Beall’s list, could adversely affect the Indian university science community.

Three weeks ago we reported that an Indian agency was using a whitelist to ban the use of unlisted journals for the purpose of evaluating researcher performance. The agency is the University Grants Commission (UGC), which apparently plays a major role in university-based Indian science. I know little about this realm, but it seems to include setting the criteria for hiring and promotion, perhaps as well as the granting of PhDs.

See http://cbseugcnetforum.in/jobs/ugc-notice-approved-list-journals-career-advancement-scheme-direct-recruitment-teachers/”

A Letter to Thomson Reuters – ASCB

“In April 2013, some of the original signers of DORA [Declaration on Research Assessment] wrote to executives at Thomson Reuters to suggest ways in which it might improve its bibliometric offerings. Suggestions included replacing the flawed and frequently misused two-year Journal Impact Factor (JIF) with separate JIFs for the citable reviews and for the primary research article content of a journal; providing more transparency in Thomson Reuters’ calculation of JIFs; and publishing the median value of citations per citable article in addition to the JIFs. Thomson Reuters acknowledged receipt of the letter and said, “We are carefully reviewing all the points raised and will respond as soon as possible.”…”
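The letter’s central objection is that the two-year JIF is an arithmetic mean, which a handful of heavily cited items can dominate. The following Python sketch is purely illustrative (the citation distribution is invented) and shows how the JIF and the median citations per citable article, one of the additions the signers suggested, can diverge for the same journal.

    from statistics import median

    def two_year_jif(citations_this_year, citable_items_prev_two_years):
        """Two-year JIF: citations received this year to items published in the previous
        two years, divided by the number of citable items from those two years."""
        return citations_this_year / citable_items_prev_two_years

    # Invented, skewed distribution: 200 citable items, most barely cited, a few heavily cited
    citations_per_item = [0] * 120 + [1] * 40 + [3] * 30 + [40] * 10
    print(two_year_jif(sum(citations_per_item), len(citations_per_item)))  # 2.65
    print(median(citations_per_item))                                      # 0.0

A gap of this size between mean and median is the kind of distortion that motivates the letter’s suggestion to publish the median alongside the JIF.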

Declaration on Research Assessment (DORA)

“The San Francisco Declaration on Research Assessment (DORA), initiated by the American Society for Cell Biology (ASCB) together with a group of editors and publishers of scholarly journals, recognizes the need to improve the ways in which the outputs of scientific research are evaluated. The group met in December 2012 during the ASCB Annual Meeting in San Francisco and subsequently circulated a draft declaration among various stakeholders. DORA as it now stands has benefited from input by many of the original signers listed below. It is a worldwide initiative covering all scholarly disciplines. We encourage individuals and organizations who are concerned about the appropriate assessment of scientific research to sign DORA….”