This briefing paper aims to support decision makers at research organisations and research funders in developing new monitoring exercises, or in assessing and improving existing processes, to measure the Open Access status of publications.
The availability of data and information on the current state of scholarly publishing is invaluable for advancing Open Access. Given the complexity of the scholarly publishing system, designing a monitoring exercise involves a multitude of decisions.
This briefing paper provides recommendations on the three main questions an organisation should answer to develop a monitoring exercise: Why, What, and How?
Examples of different monitoring exercises have been selected to represent different use cases, organisational setups, data sources, and strategies of interpretation.
“Last month the National Health and Medical Research Council sought submissions on going immediate OA on publication. If publishers refuse, the council suggested, authors’ accepted manuscripts could be made available by named institutional repositories (CMM April 16).
Which is good, but Drs Kingsley and Smith (both ex Cambridge University’s Office of Scholarly Communication) suggest tighter wording to make intent impossible to ignore.
And they call for checks, which institutions could use to make sure OA actually occurs. “There is evidence that even ‘light touch’ compliance checking results in significant behavioural change,” they write. Especially if “there is a significant consequence for non-compliance” – which could be tying grants to OA rules….”
From Google’s English: “The indicator is produced and launched annually by the Danish Agency for Education and Research, which is part of the Ministry of Education and Research. The indicator monitors the implementation of the Danish Open Access strategy 2018-2025 by collecting and analyzing publication data from the Danish universities.
OVERVIEW – National strategic goals and the realization of them at national and university level.
OA TYPES – Types of Open Access realization at national and local level.
DATA – Data for download as well as documentation at an overview and technical level.
GUIDANCE – Information to support the Danish universities’ implementation of Open Access, such as important dates and guidelines.
FAQ – Frequently Asked Questions….”
“For the sake of analysis, we compared what might happen if ALL authors chose one Plan S compliance route over another. In practice there will be a mix, and so the reality is likely to land somewhere between our two extremes. …
Compliance via fully OA journals
Plan S could lead to a slight lift in market value of just under 0.25% in the long term. Plan S articles add incremental revenues by boosting volumes in fully OA journals. Meanwhile with a mild drop in volumes from subscription journals, publishers are able to maintain their prices.
The UK’s UKRI is currently considering its position on OA. If UKRI were to adopt Plan S principles, it would make little difference to the market if the fully OA compliance route were followed.
Compliance via repositories
Plan S could lead to a slight fall in market value of just under 0.6% in the long term. This is driven by lost hybrid OA revenue, as authors opt for subscription journals instead.
If UKRI were to adopt Plan S principles, the long-term fall in market value would be just under 0.8%, around a third more than under Plan S on its own. The UK’s current policies have driven significant hybrid uptake, and if the value of these APCs is lost, it will have a noticeable effect….”
Compliance via fully OA journals
Plan S could lead to a fall in market value of around 2.8%. Subscription journals generate more revenue per article than their OA counterparts, so the reduction in subscription revenue for a given volume of articles will be greater than the gains made from APCs. This adjustment will happen once. Then, because OA output grows faster than the market as a whole, it will start to drive a very mild increase in market value.
If UKRI were to adopt Plan S principles, the long-term fall in market value would be just under 3.4%, or around 20% more than under Plan S alone. The same dynamics apply as for Plan S alone….
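The relative figures in these scenarios can be sanity-checked with simple arithmetic. A minimal sketch, assuming only the percentages quoted in the excerpts above (the variable names and calculation are ours, not the report's):

```python
# Long-term falls in market value (%) quoted in the excerpts.
fall_repo_plan_s = 0.6   # repository route, Plan S alone
fall_repo_ukri = 0.8     # repository route, Plan S + UKRI
fall_oa_plan_s = 2.8     # fully OA route, Plan S alone
fall_oa_ukri = 3.4       # fully OA route, Plan S + UKRI

# "another third or so compared with Plan S on its own"
extra_repo = (fall_repo_ukri - fall_repo_plan_s) / fall_repo_plan_s
print(f"repository route: {extra_repo:.0%} larger fall")   # ~33%

# "around 20% more than Plan S alone"
extra_oa = (fall_oa_ukri - fall_oa_plan_s) / fall_oa_plan_s
print(f"fully OA route: {extra_oa:.0%} larger fall")       # ~21%
```

Both relative increases match the report's rounded phrasing ("a third or so", "around 20%").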
“Advancing public access to research data is important to improving transparency and reproducibility of scientific results, increasing scientific rigor and public trust in science, and — most importantly — accelerating the pace of discovery and innovation through the open sharing of research results. Additionally, it is vital that institutions develop and implement policies now to ensure consistency of data management plans across their campuses to guarantee full compliance with federal research agency data sharing requirements. Beyond the establishment of policies, universities must invest in the infrastructure and support necessary to achieve the desired aspirations and aims of the policies. The open sharing of the results of scientific research is a value our two associations have long fought to protect and preserve. It is also a value we must continue to uphold at all levels within our universities. This will mean overcoming the various institutional and cultural impediments which have, at times, hampered the open sharing of research data….”
“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from 3 open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature regarding the influences of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and an increased rate of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).
Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provide human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”
“Researchers who receive federal help consistently fail to report their results to the public. The government should hold them accountable….
Researchers using federal funds to conduct cancer trials — experiments involving drugs or medical devices that rely on volunteer subjects — were sometimes taking more than a year to report their results to the N.I.H., as required. “If you don’t report, the law says you shouldn’t get any funding,” he said, citing an investigation I had published in Stat with my colleague Talia Bronshtein. “Doc, I’m going to find out if it’s true, and if it’s true, I’m going to cut funding. That’s a promise.”
It was true then. It’s true now. More than 150 trials completed since 2017 by the N.I.H.’s National Cancer Institute, which leads the $1.8 billion Moonshot effort, should have reported results by now. About two-thirds reported after their deadlines or not at all, according to a University of Oxford website that tracks clinical trials regulated by the Food and Drug Administration and National Institutes of Health. Some trial results are nearly two years overdue. Over all, government-sponsored scientists have complied less than half the time for trial results due since 2018. (A spokeswoman for the N.I.H. said, “We are willing to do all measures to ensure compliance with ClinicalTrials.gov results reporting.”)…
In 2016, Dr. Francis Collins, the director of the National Institutes of Health, announced that the agency would begin penalizing researchers for failing to comply with its reporting requirements. “We are serious about this,” he said at the time. Yet in the years since, neither the F.D.A. nor N.I.H. has enforced the law. …”
“From January 2021, there are some changes for ACS authors funded by certain members of cOAlition S. You may be required to make sure that you publish your work immediately open access under a CC-BY license. ACS offers a wide range of options enabling our authors to comply with these requirements through publication in a fully open access journal or a gold open access option in all our hybrid journals. In addition, your institution may have signed an ACS Read + Publish Agreement that provides funding for open access publishing. See below for more information regarding these changes.”
The United States has mobilized the full force of its clinical research enterprise to address the Covid-19 pandemic, allocating billions of dollars to support timely research. As of January 2021, for example, the National Institutes of Health (NIH) had issued nearly a thousand awards cumulatively worth roughly $2 billion to support Covid-19 projects ranging from the development of medical products (including diagnostics and vaccines) to evaluations of population-specific risk factors and outcomes.1 Such initiatives, which have yielded new technologies and important evidence, illustrate the value of robust scientific infrastructure.
Abstract: PLOS has long supported Open Science. One of the ways in which we do so is via our stringent data availability policy established in 2014. Despite this policy, and more data sharing policies being introduced by other organizations, best practices for data sharing are adopted by a minority of researchers in their publications. Problems with effective research data sharing persist and these problems have been quantified by previous research as a lack of time, resources, incentives, and/or skills to share data.
In this study we built on this research by investigating the importance of tasks associated with data sharing, and researchers’ satisfaction with their ability to complete these tasks. By investigating these factors we aimed to better understand opportunities for new or improved solutions for sharing data. In May-June 2020 we surveyed researchers from Europe and North America to rate tasks associated with data sharing on (i) their importance and (ii) their satisfaction with their ability to complete them. We received 728 completed and 667 partial responses. We calculated mean importance and satisfaction scores to highlight potential opportunities for new solutions and to compare different cohorts. Tasks relating to research impact, funder compliance, and credit had the highest importance scores. 52% of respondents reuse research data, but the average satisfaction score for obtaining data for reuse was relatively low. Tasks associated with sharing data were rated somewhat important and respondents were reasonably well satisfied with their ability to accomplish them. Notably, this included tasks associated with best data sharing practice, such as use of data repositories. However, the most common method for sharing data was in fact via supplemental files with articles, which is not considered to be best practice. We presume that researchers are unlikely to seek new solutions to a problem or task that they are satisfied in their ability to accomplish, even if many do not attempt this task. This implies there are few opportunities for new solutions or tools to meet these researcher needs. Publishers can likely meet these needs for data sharing by working to seamlessly integrate existing solutions that reduce the effort or behaviour change involved in some tasks, and by focusing on advocacy and education around the benefits of sharing data.
There may, however, be opportunities – unmet researcher needs – in relation to better supporting data reuse, which could be met in part by strengthening the data sharing policies of journals and publishers, and by improving the discoverability of data associated with published articles.
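The survey's importance-versus-satisfaction analysis can be illustrated with a short sketch. The ratings below are invented for illustration (the study's own data are not reproduced here); the logic simply flags tasks whose mean importance clearly exceeds mean satisfaction, which is how the abstract singles out data reuse as a possible unmet need:

```python
# Hypothetical 1-10 ratings for two data-sharing tasks; the real survey
# scored many tasks on importance and on satisfaction with ability.
tasks = {
    "share data via a repository": {"importance": [8, 7, 9], "satisfaction": [7, 8, 7]},
    "obtain others' data for reuse": {"importance": [8, 9, 8], "satisfaction": [4, 5, 3]},
}

def mean(values):
    return sum(values) / len(values)

# A large positive gap (important but unsatisfying) marks a potential
# opportunity for new tools or stronger policies; a small gap suggests
# the task is already adequately served.
for name, scores in tasks.items():
    gap = mean(scores["importance"]) - mean(scores["satisfaction"])
    flag = "possible unmet need" if gap >= 2 else "adequately served"
    print(f"{name}: gap {gap:+.2f} ({flag})")
```

The threshold of 2 points is arbitrary here; the point is only that the gap, not either score alone, identifies where solutions are worth building.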