Europe set to miss flagship open access target | THE News

“The European Union is set to miss its target of having all scientific research freely available by 2020, as progress towards open access hits a “plateau” because of deeper problems in how research is assessed. Sixty to 70 per cent of universities reported that less than a fifth of their researchers’ peer-reviewed publications are freely available, depending on the type of open access, according to a survey of more than 300 members of the European University Association. 

Only one in 10 universities said that more than 40 per cent of their research was published as “gold” open access, where there is no delay making it public. In 2016, EU member states’ science and industry ministers, supported by the European Commission, backed a move to full open access in just four years. This latest survey asks members about papers published in 2013, 2014 and 2015, so may not capture all progress made to date. But it still concludes that to hit the 2020 target “will require greater engagement by all of the relevant stakeholders”.

This chimes with an EU progress report released at the end of February which concludes that “100 per cent full open access in 2020 is realistically not achievable in the majority of European countries participating in this exercise in the foreseeable future”. Lidia Borrell-Damian, the EUA’s director for research and innovation, said that “unfortunately [full open access] is very difficult to achieve” and that “we have reached a plateau in which it’s very difficult to move forward”.

Open access had taken off in some subjects – like physics, where the open access arXiv pre-print platform is widely used – in which “traditional indicators” of journal prestige such as impact factors and other measures of citations were “less relevant”, she explained. But in most disciplines, these measures were still crucial for burnishing researchers’ career prospects, she added, making it difficult for authors to switch to less prestigious, lower impact factor open access journals. “As long as it [research assessment] is based on these proxy indicators, it’s impossible to change the game,” Dr Borrell-Damian said.

This is backed up by the survey findings. The biggest barrier to publishing in an open access repository was the “high priority given to publishing in conventional journals”, a hindrance cited by more than eight in 10 universities. “Concerns about the quality of open access publications” were also mentioned by nearly 70 per cent of respondents. In some disciplines, to publish open access, “you have to be a believer or activist” and it comes “at the risk of damaging your own career”, Dr Borrell-Damian said.

Echoing a long-standing concern in science, she argued that “we need a whole new system” of research assessment that does not rely so heavily on citations and impact factors. The EU’s flagship Horizon 2020 funding scheme requires grant recipients to publish their findings openly, but this was a far from universal policy for national funding bodies, she added. A spokesman for the EU Council acknowledged that “more efforts will be needed overall to accelerate progress towards full open access for all scientific publications”.”

Guest Post: Institutional Alignment: The University Press Redux – The Scholarly Kitchen

“That question of institutional relationship may have a whole new sense of urgency for some presses depending on how the Higher Education Funding Council for England (HEFCE), and its successor Research England, unpacks a key announcement made at the conference. Actually, “announcement” is overstating it: it was more an expansion of an earlier hint on page 36 (Annex C) of December 2016’s Second Consultation on the Second Research Excellence Framework, that to be eligible for the next-but-one Research Excellence Framework (REF), which feeds the distribution of £1.6 billion of annual quality-related university funding in the UK, all monographs will need to be available in an OA manner. That is, in just over 1,000 days from now, in January 2021, when the REF 2027 cycle starts, UK university academic book authors will be expected to meet some as yet unspecified OA requirements. Only time will tell the exact form of OA that will be prescribed – Annex C somewhat frustratingly states ‘We do not intend to set out any detailed open-access policy requirements for monographs in a future REF exercise in this annex,’ and there hasn’t been a great deal of public discussion with publishers since its publication, at least until HEFCE’s Head of Research Policy, Steven Hill, threw down the gauntlet at Redux. Meanwhile, the 19 ‘new university presses’ in the UK and 12 institutions considering following suit, according to JISC’s Graham Stone, look distinctly like a hedge on the long-term future of scholarly communication, and those US university presses that have been reluctant to engage with OA may feel obliged to do so or risk losing UK authors….”

RCUK statement on the responsible use of metrics in research assessment

[Undated but released c. February 8, 2018.]

“Research councils consider that the journal impact factor and metrics such as the H-index are not appropriate measures for assessing the quality of publications or the contribution of individual researchers, and so will not use these measures in our peer review processes. …The research councils will highlight to reviewers, panel members, and recruitment and promotion panels that they should not place undue emphasis on the journal in which papers are published, but assess the content of specific papers, when considering the impact of an individual researcher’s contribution….The Research Councils will sign DORA as a public indication of their support for these principles….”

REF 2021 Decisions on staff and outputs

“37. Evidence gathered through a recent survey on open access (OA) shows that, for over 80 per cent of outputs in the scope of the policy, either the outputs met the REF policy requirements in the first year (1 April 2016 to 1 April 2017), or an exception to the policy requirement is known to have applied. This reflects significant progress toward the policy intent to increase substantially the proportion of research that is made available open access in the UK.

38. The funding bodies have carefully considered the evidence gathered in the survey relating to the policy’s deposit requirements. We wish to continue building on the progress achieved to date and to maintain the momentum towards developing new tools to implement deposit as soon after the point of acceptance as possible. We therefore confirm the implementation of the REF OA policy as previously set out. The policy will require outputs to be deposited as soon after the point of acceptance as possible, and no later than three months after this date (as given in the acceptance letter or email from the publication to the author) from 1 April 2018.

39. Taking account of some of the practical concerns raised through the survey in relation to deposit on acceptance, we will introduce a deposit exception into the policy from 1 April 2018. This exception will allow outputs unable to meet this deposit timescale to remain compliant if they are deposited up to three months after the date of publication. The exception will read: ‘The output was not deposited within three months of acceptance date, but was deposited within three months of the earliest date of publication.’ This exception will remain in place for the rest of the REF 2021 publication period.

40. Further detail on the evidence assessed to make this decision is set out at Annex B. The REF OA policy has been updated to include the additional exception. A full report of the UK-wide survey on the delivery of funders’ open access policies will be published early in 2018….”
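
The timing rule in paragraphs 38 and 39 combines a core requirement (deposit no later than three months after acceptance) with an exception applying from 1 April 2018 (deposit within three months of the earliest date of publication). The following minimal sketch illustrates how that check could be expressed; it is not official REF tooling, the function and argument names are hypothetical, and “three months” is approximated here as three calendar months using the third-party dateutil package.

```python
# Illustrative sketch only -- not official REF tooling. Names are hypothetical,
# and "three months" is approximated as three calendar months.
from datetime import date
from typing import Optional

from dateutil.relativedelta import relativedelta


def deposit_meets_oa_policy(accepted: date,
                            deposited: date,
                            first_published: Optional[date] = None) -> bool:
    """Check the deposit-timing rule described in paragraphs 38 and 39."""
    # Core requirement: deposit no later than three months after acceptance.
    if deposited <= accepted + relativedelta(months=3):
        return True
    # Deposit exception (from 1 April 2018): deposit within three months of
    # the earliest date of publication keeps the output compliant.
    if first_published is not None:
        return deposited <= first_published + relativedelta(months=3)
    return False


# Example: accepted 1 May 2018, first published 1 August 2018,
# deposited 20 October 2018 -- outside the acceptance window,
# but inside the publication exception.
print(deposit_meets_oa_policy(date(2018, 5, 1),
                              date(2018, 10, 20),
                              first_published=date(2018, 8, 1)))  # True
```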

Science Europe Position Statement – On a New Vision for More Meaningful Research Impact Assessment

“Research has always had a wide impact on society, but this does not always come in the form of a clearly defined outcome, application, or effect. It frequently occurs as a more gradual development in the understanding of the consequences of new knowledge. It may also happen long after the corresponding research was done, or be the outcome of repeated interactions between research and society. This can make societal impact difficult or impossible to trace back and attribute to the specific research from which it originated. In addition, that research may have already been evaluated without this impact having been taken into account.”

Hirmeos Project – High Integration of Research Monographs in the European Open Science infrastructure

“Several projects, especially in Europe, pursue the aim of publishing Open Access research monographs. However, not enough has been done yet to integrate Open Access monographs into the open science ecosystem in a systematic and coordinated fashion. That is the main goal of the High Integration of Research Monographs in the European Open Science (HIRMEOS) project. The project addresses the particularities of academic monographs as a specific medium for scientific communication in the Social Sciences and the Humanities, and tackles the main obstacles to the full integration of monographs into the European Open Science Cloud. It aims at prototyping innovative services for monographs in support of Open Science infrastructure by providing additional data, links and interactions to the documents, at the same time paving the way to new potential tools for research assessment, which is still a major challenge in the Humanities and Social Sciences.

By improving already existing publishing platforms and repositories participating in the OpenAIRE infrastructure, the HIRMEOS project will increase its impact and help include more disciplines in the Open Science paradigm, widening its boundaries towards the Humanities and Social Sciences and reaching out to fields that have so far been poorly integrated….”

The challenge of open access compliance

“Researchers at UK Higher Education Institutions (HEIs) are now subject to HEFCE’s open access policy if they want to submit their work to the Research Excellence Framework (REF) in 2021. The policy applies to journal articles and conference proceedings accepted for publication after 1 April 2016. These research outputs must be deposited in an institutional or subject repository as soon as possible after the point of acceptance. For the first two years of the policy there is flexibility to deposit up to three months after the date of early online publication. After April 2018, it is anticipated that the policy terms will become stricter and deposit must occur within three months of acceptance….The financial costs associated with supporting compliance with the policy are high. Many HEIs initially relied heavily on their Research Councils UK funding to meet staffing costs. Over time, institutions have taken on staff costs to ensure the longevity of their open access teams, and some have even been in a position to create institutional funds for gold open access. At a time when increasing subscription costs are regularly imposed by publishers it can be difficult for institutions to find the means to support open access, despite its obvious importance. The cultural challenges associated with the HEFCE policy can prove to be even more difficult to overcome….After three years of promotion and engagement with researchers through school board meetings, research support meetings, training sessions and online support materials, attitudes have gradually shifted towards support for open access. Following a review of 2016, we discovered that 93% of the papers in our repository that are subject to HEFCE’s policy are REF eligible. This positive trend has continued into 2017 with many more papers being deposited on a daily basis….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established, to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics….and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk-research and an extensive literature review. The group also issued a call for evidence in June 2016, to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to receive more feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

Consultation on the second Research Excellence Framework

“This document sets out the proposals of the four UK higher education funding bodies for the second Research Excellence Framework (REF) for the assessment of research in UK higher education institutions. The proposals seek to build on the first REF conducted in 2014, and to incorporate the principles identified in Lord Stern’s Independent Review of the REF….”

For open access, see especially paragraphs 55, 68, 69, 116, 117, and all of Annex C (on OA monographs).