Ten myths around open scholarly publishing [PeerJ Preprints]

Abstract:  The changing world of scholarly communication and the emergence of ‘Open Science’ or ‘Open Research’ has brought to light a number of controversial and hotly debated topics. Yet, evidence-based rational debate is regularly drowned out by misinformed or exaggerated rhetoric, which does not benefit the evolving system of scholarly communication. The aim of this article is to provide a baseline evidence framework for ten of the most contested topics, in order to help frame and move forward discussions, practices and policies. We address preprints and scooping, the practice of copyright transfer, the function of peer review, and the legitimacy of ‘global’ databases. The presented facts and data will be a powerful tool against misinformation across wider academic research, policy and practice, and may be used to inform changes within the rapidly evolving scholarly publishing system.

Analyze the impact of the rising Open Access movement on your organization – Dimensions

Open Access is an integral part of the journey to a more collaborative research environment and continues to grow in importance across a variety of communities, including publishers, funders, librarians and of course the academic research community. Open Access in combination with Open Data has quickly become a key issue impacting both the quantity and the quality of scholarly communications.

In this recently published Digital Science research report, Dimensions data were used to explore the implications of restricted access and to analyze current Open Access trends. Among the report’s key findings: the volume of Open Access articles has clearly risen in recent years, and countries that have invested in Open Access have typically increased their level of international collaboration.

All this and more can be discovered through Dimensions’ rich data and analytical capabilities. We recently developed and released a number of updates and new features that will help you gain richer and more precise insights about Open Access for your organization.

Dimensions provides multiple filters to easily display results that are Open Access. Our filters are built around the four most commonly used basic classifications (a sketch of this classification logic follows the list):

  • Bronze – articles made available on websites hosted by their publisher, either immediately or following an embargo, but not formally licensed for reuse.
  • Green – articles freely available somewhere other than the publisher’s website, e.g. in a subject or university repository or on the author’s personal website. Generally applies to self-archived preprints or postprints, sometimes following an embargo period.
  • Gold – articles in fully open access journals, available immediately upon publication without requiring a subscription.
  • Hybrid – articles in subscription journals made open access individually, usually when a fee is paid to the publisher or journal by the author, the author’s organization, or the research funder….
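
The four categories above lend themselves to a simple decision tree. Below is a minimal Python sketch of that classification logic, assuming simplified boolean fields on each record; the field names are hypothetical, not Dimensions’ actual schema.

    # Minimal sketch of the four-way OA classification described above.
    # The fields are hypothetical simplifications, not Dimensions' schema.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PublicationRecord:
        free_on_publisher_site: bool   # readable on the publisher's website
        journal_is_fully_oa: bool      # the journal publishes everything OA
        open_license: bool             # formal reuse license (e.g. CC BY)
        free_copy_in_repository: bool  # self-archived copy available elsewhere

    def classify_oa(rec: PublicationRecord) -> Optional[str]:
        """Return one of the four basic OA categories, or None for closed access."""
        if rec.free_on_publisher_site:
            if rec.journal_is_fully_oa:
                return "gold"    # fully OA journal
            if rec.open_license:
                return "hybrid"  # subscription journal, individually opened article
            return "bronze"      # free to read, but no formal reuse license
        if rec.free_copy_in_repository:
            return "green"       # self-archived copy only
        return None              # closed access

    # A record that is free on the publisher's site, in a subscription
    # journal, under an open license, classifies as hybrid:
    print(classify_oa(PublicationRecord(True, False, True, False)))  # hybrid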

Say you wanted to know how many gold Open Access papers from the University of Oxford, funded by the Wellcome Trust, were published in Springer Nature journals between 2013 and 2018? We made discovering that easy, as you can see in the screenshot below….”
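
A hedged sketch of what that query might look like over a local export of publication records; the file name and column names below are hypothetical, not the actual Dimensions schema or API:

    # Hypothetical sketch: filter an exported table of publication records.
    # Column names are illustrative only.
    import pandas as pd

    pubs = pd.read_csv("publications_export.csv")  # assumed local export
    mask = (
        (pubs["oa_category"] == "gold")
        & (pubs["research_org"] == "University of Oxford")
        & (pubs["funder"] == "Wellcome Trust")
        & (pubs["publisher"] == "Springer Nature")
        & pubs["year"].between(2013, 2018)
    )
    print(len(pubs[mask]))  # number of matching gold OA papers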

Blockchain and OECD data repositories: opportunities and policymaking implications | Library Hi Tech | Vol 37, No 1

Abstract:  The purpose of this paper is to employ the case of Organization for Economic Cooperation and Development (OECD) data repositories to examine the potential of blockchain technology in the context of addressing basic contemporary societal concerns, such as transparency, accountability and trust in the policymaking process. Current approaches to sharing data employ standardized metadata, in which the provider of the service is assumed to be a trusted party. However, derived data, analytic processes and links from policies are in many cases not shared in the same form, thus breaking the provenance trace and making it difficult to repeat past analyses. Similarly, it becomes difficult to test whether the conditions that justified implemented policies still apply. A higher level of reuse would require a decentralized approach to sharing both data and analytic scripts and software. This could be supported by a combination of blockchain and decentralized file system technology.

Design/methodology/approach

The findings presented in this paper are derived from the analysis of a case study, i.e., analytics using data made available by the OECD. The set of data the OECD provides is vast and broadly used. The argument is structured as follows. First, current issues and topics shaping the debate on blockchain are outlined. Then, the main artifacts on which simple or complex analytic results are based are redefined for some concrete purposes. Finally, the requirements on provenance, trust and repeatability are discussed with regard to the proposed architecture, and a proof of concept using smart contracts is used for reasoning about relevant scenarios.

Findings

A combination of decentralized file systems and an open blockchain such as Ethereum, which supports smart contracts, can ensure that the set of artifacts used for the analytics is shared. This enables the sequence underlying the successive stages of research and/or policymaking to be preserved. In turn, and ex post, it becomes possible to test whether the evidence supporting certain findings and/or policy decisions still holds. Moreover, unlike traditional databases, blockchain technology makes it possible to store immutable records, which means the artifacts can be reused for further exploitation or for repeating results. In practical terms, the use of blockchain technology creates the opportunity to enhance the evidence-based approach to policy design and policy recommendations that the OECD fosters. That is, it might enable stakeholders not only to use the data available in the OECD repositories but also to assess corrections to a given policy strategy or modify its scope.
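
The paper’s proof of concept uses Ethereum smart contracts and a decentralized file system; the sketch below illustrates only the underlying mechanism in plain Python, content-addressing each artifact by its hash and chaining the records so that later tampering is detectable. It is an illustration of the idea under those assumptions, not the authors’ implementation, and all names are made up.

    # Plain-Python sketch of the provenance mechanism the paper builds on:
    # each artifact (dataset, script, result) is content-addressed by its
    # hash, and the records are chained so that tampering breaks verification.
    import hashlib
    import json

    def digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    class ProvenanceChain:
        def __init__(self):
            self.records = []

        def register(self, artifact: bytes, role: str) -> str:
            prev = self.records[-1]["record_hash"] if self.records else "0" * 64
            record = {"role": role, "artifact_hash": digest(artifact), "prev": prev}
            record["record_hash"] = digest(json.dumps(record, sort_keys=True).encode())
            self.records.append(record)
            return record["record_hash"]

        def verify(self) -> bool:
            prev = "0" * 64
            for r in self.records:
                body = {k: r[k] for k in ("role", "artifact_hash", "prev")}
                if r["prev"] != prev:
                    return False
                if r["record_hash"] != digest(json.dumps(body, sort_keys=True).encode()):
                    return False
                prev = r["record_hash"]
            return True

    chain = ProvenanceChain()
    chain.register(b"oecd_gdp_2019.csv contents", role="dataset")
    chain.register(b"analysis.py contents", role="script")
    chain.register(b"figure_1.png contents", role="result")
    print(chain.verify())  # True; altering any stored record breaks verification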

Is the Value of the Big Deal in Decline? – The Scholarly Kitchen

Last week, the University of California system terminated its license with Elsevier. There has been a great deal of attention to California’s efforts to reach a Publish & Read (P&R) agreement. The what-could-have-been of this deal is interesting and important. But I wish to focus today on the what-no-longer-is of scholarly content licensing: the big deal model of subscription journals bundled together on a single-publisher basis in three- to five-year deals. In the eyes of major libraries in Europe and the US, the value of the big deal has declined. As a result, we are moving into a new period, in which publisher pricing power has declined and the equilibrium price for content and related services is being reset. What is the principal culprit? I will maintain today that we must look in large part to what publishers call “leakage.”…

I have heard estimates that suggest publisher usage numbers could be at least 60-70% higher if “leakage” were included in addition to their on-platform usage statistics. This includes “green” options through a variety of repositories (including some that are operated by publishers themselves in addition to library and not-for-profit repositories), materials on scholarly collaboration networks, and piracy. The share of leakage among entitled users at an institution with a license is probably lower than this estimate, but it is likely well in the double digits.
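
As a quick check of what that estimate implies: a 60-70% uplift over on-platform usage puts leakage’s share of total usage at roughly 38-41%, i.e., indeed well into the double digits (a small sketch using the article’s own figures):

    # If off-platform "leakage" adds 60-70% on top of measured on-platform
    # usage, its share of total usage is uplift / (1 + uplift):
    for uplift in (0.60, 0.70):
        share = uplift / (1 + uplift)
        print(f"{uplift:.0%} uplift -> leakage is {share:.0%} of total usage")
    # 60% uplift -> leakage is 38% of total usage
    # 70% uplift -> leakage is 41% of total usage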

I am in no way arguing against green models. Indeed, publishers have largely become comfortable with green open access. I am simply observing that these percentages are beginning to add up….”

cOAlition S: Response to PNAS | PNAS

The assertion is made that most society publishers—who currently make use of hybrid OA—will “likely be prohibited for authors with Plan S funders.” This is not correct since Plan S supports deposition of articles in repositories as an option for compliance. Indeed, in recent weeks we have seen several United Kingdom learned societies—including the Royal Society and the Microbiology Society—adopt a Plan S-compliant model, by allowing authors to self-archive their author-accepted manuscript in a repository at the time of publication, with a Creative Commons Attribution license (CC BY)….

To help learned societies explore alternative revenue streams and business models, the Wellcome Trust, in partnership with United Kingdom Research and Innovation and the Association of Learned and Professional Society Publishers, has just funded a consultancy to work with learned societies to help them adapt and thrive in a Plan S world….”

Repository Implementation Webinar: March 26, 2019

“Join us on March 26th at 8:00am PST/4:00pm GMT for a webinar on repository implementation of our COUNTER Code of Practice for Research Data and Make Data Count recommendations. This webinar will feature a panel of developers from repositories that have implemented, or are about to release, standardized data metrics: Dataverse, Dryad, and Zenodo. We will interview each repository about its implementation process. This recorded discussion between our technical team and the repositories, offering various perspectives on implementation, should provide helpful guidance for any repository interested in implementing!…”

David Worlock | Developing digital strategies for the information marketplace | Supporting the migration of information providers and content players into the networked services world of the future.

The Springer Nature announcement that they were working with ResearchGate on a fair sharing policy has elements that run right through the tracery of fissures. It tells us that commercial players have no commercial reason to do anything but compete, and that Springer Nature, Thieme, CUP and, in time, others want to be seen as more user-supportive in this regard than others. This is not, for me, a new form of permitted “syndication” – simply a gracious concession to license what users were doing anyway and remove some friction. It also says that in the games yet to be played, many people see tracking usage of the traceable communication as an important source of information, and potentially of revenues. The pressures felt by players like Springer Nature and Wiley, as they try at once to differentiate themselves from the very clear stance of a market leader like Elsevier while protecting their service integrity, are similarly shown in the Projekt DEAL developments. Market leaders get trapped and isolated in market positions they cannot give up, while the rest dissociate and differentiate themselves as best they can, while trying hard not to lose revenue margins in the process. Then sit down and read the reactions to Plan S – Springer Nature were paragons of moderation and reason. The loudest squeals came from those with most to lose – scholarly societies with journal revenue dependence. …

So what can the market leader do about this change as they face increasing user criticism? The traditional answer always was: “push intransigence as far as it will go, and if those who would change the terms of trade do not come to heel, change your CEO as a way of changing your own policy without losing face.” It may of course be entirely coincidental that Elsevier’s CEO Ron Mobed retired last week without prior indication that he was about to go, and has been replaced by a very experienced RELX strategy specialist, Kumsal Bayazit. She is warmly welcomed and deserves a good chance to rethink the strategies that have backed Elsevier into a corner with Projekt DEAL and with the University of California. The people who work at Elsevier are, to my certain knowledge, as dedicated as any group I know to the objectives of their customers and the improvement of scholarly communications: they know that at the end of the day the customer has the final say. And let’s think about what the power of a market leader now really means: 20 years ago companies like Elsevier demanded that authors surrender their copyrights on the grounds that only the publisher was powerful enough to protect them, while today no publisher is powerful enough to shutter SciHub….”

Digital landform reconstruction using old and recent open access digital aerial photos – ScienceDirect

Abstract:  Technological progress in remote sensing has enabled digital representation of terrain through new techniques (e.g. digital photogrammetry) and instruments (e.g. 3D laser scanners). However, the use of old aerial images remains important in geosciences to reconstruct past landforms and detect long-term topographic changes. Administrations have recently expressed growing interest in sharing photogrammetric datasets on public repositories, providing opportunities to exploit these resources and detect natural and anthropogenic topographic changes. The SfM-MVS photogrammetric technique was applied to scanned historical black and white aerial photos of the Serra de Fontcalent (Alicante, Spain), as well as to recent high-quality digital aerial photos. Ground control points (GCPs) extracted from a LiDAR-derived three-dimensional point cloud were used to georeference the results with non-linear deformations. Two point clouds obtained with SfM-MVS were compared with the LiDAR-derived reference point cloud. Based on the results, the quality of the models was analysed by comparing the stages over stable areas, i.e., land where no variations were detected, and active areas, with quarries, new infrastructure, fillings, excavations or new buildings. This study also indicates that errors are higher for old aerial photos (up to 5 m on average) than for recent digital photos (up to 0.5 m). The application of SfM-MVS to open access data generated 3D models that enhance geomorphological analysis, compared to stereophotogrammetry, and effectively detected activities in quarries and the building of landfills.
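
The core comparison step, measuring how far each SfM-MVS point lies from the LiDAR reference cloud, can be sketched as a nearest-neighbour query. The snippet below is a minimal illustration on synthetic data, not the authors’ workflow, which also involves GCP-based georeferencing and separate stable/active area masks.

    # Minimal cloud-to-cloud comparison: for every point in the SfM-MVS
    # cloud, find the distance to its nearest neighbour in the LiDAR
    # reference cloud. Synthetic stand-in data; illustrative only.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    lidar_ref = rng.uniform(0, 100, size=(10_000, 3))                 # reference cloud
    sfm_cloud = lidar_ref + rng.normal(0, 0.5, size=lidar_ref.shape)  # noisy SfM copy

    tree = cKDTree(lidar_ref)
    dist, _ = tree.query(sfm_cloud)  # nearest-neighbour distance per point

    # Summaries of the kind reported in the paper:
    print(f"mean error: {dist.mean():.2f} m, 95th pct: {np.percentile(dist, 95):.2f} m")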