“From 1st January 2021, the cOAlition S Rights Retention Strategy (RRS) will start to be implemented by funders. A key reason for adopting this initiative is to give authors the widest possible range of journals to choose from for article submission and to make sure they take advantage of the benefits of OA, whilst meeting their funder’s OA requirements. The RRS is not principally about compliance – OA should never primarily be about box-ticking and compliance – it is about restoring intellectual control of works describing research findings to the authors themselves. Adoption of the RRS gives authors the security that, once their article is accepted, they will be able to make their work OA either via the Version of Record (VoR) or the author accepted manuscript (AAM), independently of the choice of venue (fully OA or subscription journal).
The RRS cuts through much of the confusion, obfuscation, and – to be frank – utter nonsense surrounding copyright transfer claims made by some publishers.
Abstract: This paper studies a selection of eleven Norwegian journals in the humanities and social sciences and their conversion from subscription to open access, a move heavily incentivized by governmental mandates and open access policies. By investigating the journals’ visiting logs in the period 2014–2019, the study finds that a conversion to open access induces higher visiting numbers; all journals in the study had a significant increase which can be attributed to the conversion. Converting a journal had no spillover in terms of increased visits to previously published articles still behind the paywall in the same journals. Visits from previously subscribing Norwegian higher education institutions did not account for the increase in visits, indicating that the increase must be accounted for by visitors from other sectors. The results could be relevant for policymakers concerning the effects of strict policies targeting economically vulnerable national journals, and could further inform journal owners and editors on the effects of converting to open access.
Abstract: A preprint is a version of a scientific paper that is publicly distributed before formal peer review. Since the launch of arXiv in 1991, preprints have been increasingly distributed over the Internet as opposed to paper copies. Preprints allow open online access that disseminates original research within a few days, often at a very low operating cost. This work overviews how preprints have evolved and impacted the research community over the past thirty years alongside the growth of the Web. In this work, we first report that the number of preprints has increased 63-fold in 30 years, although preprints still account for only 4% of research articles. Second, we quantify the benefits that preprints bring to authors: preprints reach an audience 14 months earlier on average and are associated with five times more citations compared with non-preprint counterparts. Last, to address the quality concern around preprints, we find that 41% of preprints are ultimately published at a peer-reviewed destination, and that their published venues are as influential as those of papers without a preprint version. Additionally, we discuss the unprecedented role of preprints in communicating the latest research data during recent public health emergencies. In conclusion, we provide quantitative evidence to unveil the positive impact of preprints on individual researchers and the community. Preprints make scholarly communication more efficient by disseminating scientific discoveries more rapidly and widely with the aid of Web technologies. The measurements we present in this study can help researchers and policymakers make informed decisions about how to effectively use and responsibly embrace a preprint culture.
“This data protection agency could be combined with Data.gov, a government website created in 2009 that assembles and hosts hundreds of thousands of data sets for public use. Together they could form a kind of federal data library, democratizing knowledge for the digital age.
Just as traditional libraries curate and organize their collections, so could a digital library, adding new data sources and cleaning and assembling them for public use. A federal data library could also take the lead in developing and using new tools such as differential privacy, a technique designed to preserve important features of data while protecting individual identities.
Data’s increasing value as an economic resource requires a new way of thinking. Strict privacy protections are needed to make socially valuable data available for the public good.”
“André’s dataset was shortlisted for the Mendeley Data FAIRest Datasets Award, which recognizes researchers who make their data available for the research community in a way that exemplifies the FAIR Data Principles – Findable, Accessible, Interoperable, Reusable. The dataset was applauded for a number of reasons, not least the provision of clear steps to reproduce the data. What’s more, the data was clearly catalogued and stored in subfolders, with additional links to Blender and GitHub, making the dataset easily available and reproducible for all….”
Abstract: Data management plans (DMPs) have increasingly been encouraged as a key component of institutional and funding body policy. Although DMPs necessarily place administrative burden on researchers, proponents claim that DMPs have myriad benefits, including enhanced research data quality, increased rates of data sharing, and institutional planning and compliance benefits.
In this article, we explore the international history of DMPs and describe institutional and funding body DMP policy. We find that economic and societal benefits from presumed increased rates of data sharing were the original driver of mandating DMPs by funding bodies. Today, 86% of UK Research Councils and 63% of US funding bodies require submission of a DMP with funding applications. Given that no major Australian funding bodies require DMP submission, it is of note that 37% of Australian universities have taken the initiative to internally mandate DMPs. Institutions both within Australia and internationally frequently promote the professional benefits of DMP use, and endorse DMPs as ‘best practice’. We analyse one such typical DMP implementation at a major Australian institution, finding that DMPs have low levels of apparent translational value. Indeed, an extensive literature review suggests there is very limited published systematic evidence that DMP use has any tangible benefit for researchers, institutions or funding bodies.
We are therefore led to question why DMPs have become the go-to tool for research data professionals and advocates of good data practice. By delineating multiple use-cases and highlighting the need for DMPs to be fit for intended purpose, we question the view that a good DMP is necessarily that which encompasses the entire data lifecycle of a project. Finally, we summarise recent developments in the DMP landscape, and note a positive shift towards evidence-based research management through more researcher-centric, educative, and integrated DMP services.
Abstract: Extensive research has taken place over the years to examine the barriers to OER adoption, but few empirical studies have been undertaken to map the amount of OER reuse. The discussion around the actual use of OER, outside the context in which they were developed, remains ongoing. Previous studies have already shown that searching for and evaluating resources are barriers to actual reuse. Hence, in this quantitative survey study we explored teachers’ practices with resources in Higher Education Institutes in the Netherlands. The survey had three runs, each in a different context, with a total of 439 respondents. The results show that resources that are hard or time-consuming to develop are most often reused from third parties without adaptations. Resources that need to be more context specific are often created by teachers themselves. To improve our understanding of reuse, follow-up studies must explore reuse with a more qualitative research design in order to explore what these hidden practices of dark reuse look like and how teachers and students benefit from them.
“From its inception, the open access movement has postulated that publishing costs should be controlled by research institutions and funded by redirecting resources after canceling journal subscriptions. In reality, things have proved more complex. Although “transformative agreements” that cover both publishing and reading have rapidly increased the percentage of articles published in open access in some institutions, the details of these agreements are generally kept secret and so their scope is difficult to compare.
Nevertheless, it is clear that making most articles open access for a fee will, if charges are not a realistic reflection of actual costs, explode university library budgets (Harvard estimates this increase at 71%) and create large disparities in the ability to publish. Indeed, this could create a vicious circle whereby well-funded researchers publish more, gain more visibility as well as recognition and, as a result, get more funding.
If Plan S does not explicitly monitor and maintain, within the terms of its open publication requirement, a firm ceiling on publication costs, these perverse effects of exploding budgets will be inevitable. This is now where the challenge of communicating public research lies….”
From Google’s English: “Open access is enjoying increasing success. Since 2017, the majority of new articles in all academic disciplines, especially in science, have been published in open access. In 2020, at the request of UNESCO, most publishers removed toll barriers from articles on the COVID-19 pandemic in order to quickly understand the characteristics of the SARS-CoV-2 virus and accelerate the development of vaccines and treatments. In this regard, the COVID-19 pandemic will have made many understand the usefulness, even the absolute necessity, of instant and open communication in the face of a large-scale collective challenge….
Another danger is that, by its binding nature (which is also its strength and its chance of working), Plan S offers traditional publishers a tempting opportunity to demand excessively high publication fees (known as APCs, article processing charges), not to cover costs, but to compensate for the shortfall from cancelled subscriptions….
Coalition S seeks to exert downward pressure on publishing prices by seeking transparency. When a grant recipient’s research is published, Plan S requires publishers to disclose their rates to funders, including the cost of services such as screening, organizing peer review, improving writing, and proofreading. The coalition is committed to sharing this information openly with authors and institutions, in the hope of ensuring some level of price control….
Some authors are also hesitant because of the requirement that they publish in prestigious, high-impact journals to obtain tenure, promotion or the means to carry out their work. In addition, they may fall victim to the misconception that journals offering only open access articles lack rigor.
In addition, paying to publish in journals which benefit from the prestige of their publishing house creates a flagrant inequality between researchers according to the financial means at their disposal….”
In Australia the first challenge is to overcome the apathy about open access issues. The term “open access” has been too easy to ignore. Many consider it a low priority compared to achievements in research, obtaining grant funding, or university rankings glory.