NMC Horizon Report: 2017 Library Edition

“Six key trends, six significant challenges, and six developments in technology profiled in this report are poised to impact library strategies, operations, and services….These top 10 highlights capture the big picture themes of organizational change that underpin the 18 topics: …[Highlight 3:] In the face of financial constraints, open access is a potential solution. Open resources and publishing models can combat the rising costs of paid journal subscriptions and expand research accessibility. Although this idea is not new, current approaches and implementations have not yet achieved peak efficacy….”

Rationale for Requiring Immediate Deposit Upon Acceptance – Open Access Archivangelism

There are multiple reasons for depositing the AAM (Author Accepted Manuscript) immediately upon acceptance:

1. The date of acceptance is known; the date of publication is not. Publication often comes long after acceptance, and often does not even correspond to the calendar date of the journal.
2. It is when research is refereed and accepted that it should become accessible to all potential users.
3. The delay between the date of acceptance and the date of publication can be anywhere from six months to a year or more.
4. Publishers are already trying to embargo OA for a year from the date of publication; the gratuitous delay from acceptance could double that.
5. The date of acceptance is the natural date-stamp for deposit and the natural point in the author's workflow for deposit.
6. The AAM at the date of acceptance is the version with the fewest publisher restrictions on it: many publishers endorse making the AAM OA immediately, but not the PV (Publisher's Version).
7. Having deposited the AAM, authors can update it if and when they wish, to incorporate any copy-editing and corrections (including the PV).
8. If the author elects to embargo the deposit, the copy-request button is available to authorize the immediate automatic sending of individual copies on request. Authors can make the deposit OA when they choose. (They can also decline to send the AAM until the copy-edited version has been deposited, but most authors will not want to delay compliance with copy requests: refereed AAMs that have not yet been copy-edited can be clearly marked as such.)
9. The acceptance letter provides the means of verifying timely compliance with the deposit mandate. It is the key to making the immediate-deposit policy timely, verifiable, and effective, and it is the simplest and most natural way to integrate deposit into the author's workflow.
10. The above timing and compliance considerations apply to all refereed research, including research published in Gold OA journals.
Of the 853 OA policies registered in ROARMAP, 96 of the 515 that require (rather than merely request or recommend) deposit have adopted the immediate-deposit-upon-acceptance requirement.
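To put those counts in proportion, here is a quick calculation using only the figures quoted above (the percentages are derived, not stated in the source):

```python
# Proportions implied by the ROARMAP policy counts quoted above
total_policies = 853      # OA policies registered in ROARMAP
mandatory_deposit = 515   # policies that require (not merely request) deposit
immediate_deposit = 96    # of those, policies requiring deposit upon acceptance

share_mandatory = mandatory_deposit / total_policies
share_immediate = immediate_deposit / mandatory_deposit

print(f"{share_mandatory:.1%} of registered policies mandate deposit")
print(f"{share_immediate:.1%} of deposit mandates require immediate deposit")
```

That is, roughly three in five registered policies mandate deposit at all, and fewer than one in five of those mandates so far require deposit immediately upon acceptance.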

Below are references to some articles that have spelled out the rationale and advantages of the immediate-deposit requirement.

Vincent-Lamarre, Philippe; Boivin, Jade; Gargouri, Yassine; Larivière, Vincent; & Harnad, Stevan (2016) Estimating Open Access Mandate Effectiveness: The MELIBEA Score. Journal of the Association for Information Science and Technology (JASIST) 67(11): 2815-2828

Swan, Alma; Gargouri, Yassine; Hunt, Megan; & Harnad, Stevan (2015) Open Access Policy: Numbers, Analysis, Effectiveness. Pasteur4OA Workpackage 3 Report.

Harnad, Stevan (2015) Open Access: What, Where, When, How and Why. In: Ethics, Science, Technology, and Engineering: An International Resource, eds. J. Britt Holbrook & Carl Mitcham (2nd edition of Encyclopedia of Science, Technology, and Ethics, Farmington Hills MI: MacMillan Reference)

Harnad, Stevan (2015) Optimizing Open Access Policy. The Serials Librarian, 69(2): 133-141

Sale, A.; Couture, M.; Rodrigues, E.; Carr, L.; & Harnad, S. (2014) Open Access Mandates and the “Fair Dealing” Button. In: Dynamic Fair Dealing: Creating Canadian Culture Online (Rosemary J. Coombe & Darren Wershler, Eds.)

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established, to advise on strategic direction and implementation.3 In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics….and other aspects of open science.’4

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk-research and an extensive literature review. The group also issued a call for evidence in June 2016, to gather the views of stakeholders11. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016)12 and the 3AM Conference in Bucharest (September 29, 2016)13. Both occasions were used to receive more feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

The science ‘reproducibility crisis’ – and what can be done about it

“The solution to the scientific reproducibility crisis is to move towards Open Research – the idea that scientific knowledge of all kinds should be openly shared as early as it is practical in the discovery process. We need to reward the publication of research outputs along the entire process, rather than just each journal article as it is published.”

Offsetting and its discontents: challenges and opportunities of open access offsetting agreements

“The growth of open access (OA) via the payment of article processing charges (APCs) in hybrid journals has been a key feature of the approach to OA in the UK. In response, Jisc Collections has been piloting ‘offsetting agreements’ that explicitly link subscription and APCs, seeking to reduce one as the other grows. However, offsetting agreements have become increasingly contentious with institutions, advocates and publishers.

With reference to issues such as cost, administrative efficiency, transparency and the transition to open access, this paper provides an update on the status of UK negotiations, reflects on the challenges and opportunities presented by such agreements, and considers the implications for the path of future negotiations.”

Campus open-access policy “Choice Points”

“The basic policy framework recommended in this document highlights the institution’s ability to play a central role in the stewardship of the scholarly record generated by its faculty. The framework is straightforward; campus OA policies require authors to make manuscripts available for deposit in an institution’s repository at the time they are accepted for publication in a peer-reviewed journal. Authors automatically grant the institution the right to make their manuscripts openly accessible. At the same time, authors may request a waiver, or “opt out,” of the institutional license for a given article if needed to accommodate a pressing individual circumstance….”

“Fostering Open Science in Global Health – the case of data sharing in Public Health Emergencies”: overview of the CVV Workshop at the World Health Summit | Centre Virchow-Villermé

On October 10, 2016 the Centre Virchow-Villermé hosted a workshop on ‘Fostering Open Science in Global Health – the case of data sharing in Public Health Emergencies’. Featuring a diverse panel with representatives from different fields and different parts of the world, the workshop identified barriers and bottlenecks of an open approach to data sharing and to science in general.

More than just data, more than just emergencies

Public health emergencies like the recent Zika and Ebola outbreaks illustrate the need for a more collaborative approach to research. Withholding information on screened viral genomes in situations where time counts the most directly delays the development of adequate responses.

However, Katherine Littler from the Wellcome Trust pointed out early in the workshop: ‘What is good for Public Health Emergencies is good for any research. There is no reason to hold information back.’ Comments from the audience sounded even more drastic: ‘There is no other reason for not sharing research data but prestige or selfishness.’

Katherine Littler of the Wellcome Trust describing how much more effort is needed to change practices in science. (Image: World Health Summit)

Open, but…

Dr Ali Sié, a researcher from Nuna in Burkina Faso, generally supported ‘Open Science’ as a concept. Nevertheless, he cautioned that immediate sharing of clinical trial data could lead to even greater North-South inequalities in the research community. He argued that those collecting data are not necessarily those with the greatest computing power. When raw data are made directly accessible and processable, researchers in the global north could apply their technological expertise to piggyback on the data and leverage it faster than those in the global south who collected it.

As long as scientific reputation is based on publications in high-impact journals, sharing is not incentivized. A paradigm shift towards recognizing the value of shared data sets is sorely needed.

Context matters

Dr Diallo, from the Guinean Ministry of Health and Professor at the University of Conakry, pointed out that data is only valuable when seen in the context of its creation. The community aspect of research, especially in outbreak situations, needs to be considered when opening datasets to the public.

Scepticism meets interest – impressions from the workshop (Image: World Health Summit)

How to open science 

Even though data and information on trials are technically available, they are often spread across the Internet and difficult to find. ‘OpenTrials’, an initiative by Open Knowledge International, aims to address this issue by collecting ‘all the data, on all the trials, linked’. The public beta version was launched at our workshop.

Researchers, the private sector, and the government representatives present agreed that early career researchers in particular need to sustain the movement towards more open and collaborative science as the default within academia. Some questions remain unanswered, but the Centre Virchow-Villermé remains committed to offering a platform for exchange to discuss and support a shift in current practices.

You can read more about the workshop in this BMJ blog post:

http://blogs.bmj.com/bmj/2016/10/25/peter-grabitz-et-al-how-can-we-improve-data-sharing-in-public-health-emergencies/

COAR Next Generation Repositories | Draft for Public Comment

“In April 2016, the Confederation of Open Access Repositories (COAR) launched a working group to help identify new functionalities and technologies for repositories and develop a road map for their adoption. For the past several months, the group has been working to define a vision for repositories and sketch out the priority user stories and scenarios that will help guide the development of new functionalities.

The vision is to position repositories as the foundation for a distributed, globally networked infrastructure for scholarly communication, on top of which layers of value added services will be deployed, thereby transforming the system, making it more research-centric, open to and supportive of innovation, while also collectively managed by the scholarly community.

Underlying this vision is the idea that a distributed network of repositories can and should be a powerful tool to promote the transformation of the scholarly communication ecosystem. In this context, repositories will provide access to published articles as well as a broad range of artifacts beyond traditional publications such as datasets, pre-prints, working papers, images, software, and so on….”

COPDESS – Coalition on Publishing Data in the Earth and Space Sciences

“The Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) connects Earth and space science publishers and data facilities to help translate the aspirations of open, available, and useful data from policy into practice.  COPDESS has developed a statement of commitment, now signed by most leading publishers and repositories, and provides a directory of repositories for publishers and recommended best practices around data and identifiers (see links on left)….”