Statement on the Scholarly Merit and Evaluation of Open Scholarship in Linguistics | Linguistic Society of America

“The Linguistic Society of America values the open sharing of scholarship, and encourages the fair review of open scholarship in hiring, tenure, and promotion. The LSA encourages scholars, departments, and personnel committees to actively place value on open scholarship in their evaluation process with the aim of encouraging greater accessibility, distribution, and use of linguistic research….”

Next Generation Library Publishing Open Forum Tickets, Mon, Oct 5, 2020 at 1:00 PM | Eventbrite

“Please join Dr. Katherine Skinner and Sarah Lippincott and the Next Generation Library Publishing team for an hour-long open forum on their recent work to develop values and principles-based assessment tools to incentivize stronger alignment between publishing tools, services, and platforms and the scholarly communities and publics they ultimately serve.

During this open forum, we will be seeking input and feedback from attendees regarding the Values and Principles Framework and its affiliated Assessment Checklist, which we have issued for public comment from August-September 30, 2020.

Pending public feedback, we plan to refine and issue these tools in 2021 for broad use.”

Decentralized Assessment of FAIR datasets

“The pilot is based on DEIP’s own developed deep-tech innovation – Decentralized Assessment System (DAS). DAS is a peer review system that uses an incentive model with reputation rewards and produces a quantifiable metric about the quality and reliability of any data set(s) being assessed. DAS is designed specifically for assessment of assets in expertise-intensive areas, such as scientific research. DAS introduces a comprehensive and robust assessment model:

it sources the consensus about the quality of data sets among the domain experts through continuous two-level peer-review;

it ensures fair rewards for contributions and curation efforts;
it formalizes the result of assessment into explicit metrics/indicators useful for non-experts….”
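DEIP does not publish the aggregation formula behind DAS, but the idea of turning reputation-weighted expert reviews into an explicit metric for non-experts can be sketched roughly as follows. All names, the 0-10 scale, and the weighting scheme here are illustrative assumptions, not DEIP's actual design:

```python
from dataclasses import dataclass

@dataclass
class Review:
    score: float       # reviewer's quality rating of the dataset, 0-10 (assumed scale)
    reputation: float  # reviewer's accumulated reputation weight, > 0

def consensus_score(reviews: list[Review]) -> float:
    """Reputation-weighted mean: higher-reputation experts move the metric more."""
    total_weight = sum(r.reputation for r in reviews)
    if total_weight == 0:
        raise ValueError("no reviews with positive reputation")
    return sum(r.score * r.reputation for r in reviews) / total_weight

reviews = [Review(score=8.0, reputation=5.0), Review(score=6.0, reputation=1.0)]
print(round(consensus_score(reviews), 2))  # 7.67
```

A weighted mean is only the simplest possible reading of "consensus among domain experts"; a two-level review as described above would add a second pass in which the reviews themselves are curated and re-weighted.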

Roadmap to Plan S for Australia: Final Report

“This report, commissioned by the Council of Australian University Librarians for delivery to the DVCsR Committee, provides an analysis of the challenges and opportunities arising from Plan S for Australian researchers and universities, including high-level recommendations on how Australian universities should proceed in order to meet compliance obligations from 2021. The report considers the scale of the Plan S compliance issue, finding that 5% of Australian university research publications are affected by Plan S compliance obligations, and typically 0-2% of total research funding is from Coalition S funders. However, addressing compliance issues for affected researchers can provide more open access publication options for all Australian university researchers, in line with indications of similar requirements by other funding bodies. This allows the challenges presented by Plan S compliance to be transformed into opportunities to enhance Australian research visibility more broadly. While a full set of recommendations can be found at the end of the report, the following summarises the high-priority, urgent actions required:

- University Executives must set out clear institutional open access policy positions that align with Plan S and align recognition and reward frameworks accordingly.
- University Executives must ensure there is a central research support capability to identify affected researchers and to offer highly tailored advice.
- Universities must adequately support institutional repositories to fulfil Plan S technical and service requirements.
- CAUL must pursue negotiations with publishers to minimise or eliminate transactional APCs for open access journals.
- CAUL must ensure publishing output data and new consortium models are developed to improve the value of transformative agreements….”

Data journals: incentivizing data access and documentation within the scholarly communication system

Abstract:  Data journals provide strong incentives for data creators to verify, document and disseminate their data. They also bring data access and documentation into the mainstream of scholarly communication, rewarding data creators through existing mechanisms of peer-reviewed publication and citation tracking. These same advantages are not generally associated with data repositories, or with conventional journals’ data-sharing mandates. This article describes the unique advantages of data journals. It also examines the data journal landscape, presenting the characteristics of 13 data journals in the fields of biology, environmental science, chemistry, medicine and health sciences. These journals vary considerably in size, scope, publisher characteristics, length of data reports, data hosting policies, time from submission to first decision, article processing charges, bibliographic index coverage and citation impact. They are similar, however, in their peer review criteria, their open access license terms and the characteristics of their editorial boards.


What drives and inhibits researchers to share and use open research data? A systematic literature review to analyze factors influencing open research data adoption

Abstract:  Both sharing and using open research data have revolutionary potential for advancing science. Although previous research gives insight into researchers’ drivers and inhibitors for sharing and using open research data, these drivers and inhibitors have not yet been integrated via a thematic analysis, and a theoretical argument is lacking. This study’s purpose is to systematically review the literature on individual researchers’ drivers and inhibitors for sharing and using open research data. The study systematically analyzed 32 open data studies (published between 2004 and 2019 inclusive) and elicited drivers and inhibitors for both open research data sharing and use in eleven categories: ‘the researcher’s background’, ‘requirements and formal obligations’, ‘personal drivers and intrinsic motivations’, ‘facilitating conditions’, ‘trust’, ‘expected performance’, ‘social influence and affiliation’, ‘effort’, ‘the researcher’s experience and skills’, ‘legislation and regulation’, and ‘data characteristics.’ The study discusses these categories at length and argues, via a thematic analysis, how the categories and factors are connected. It also discusses several opportunities for applying, extending, and testing theories in open research data studies. The resulting overview of categories and factors can be applied to examine researchers’ drivers and inhibitors across research disciplines, for instance comparing disciplines with low rates of data sharing and use against those with high rates. Finally, the study serves as a first step towards developing effective incentives for open data sharing and use.


Business Models and Market Structure within the Scholarly Communications Sector

“The paper proceeds by first positioning the situation within the broader setting of how to effectively regulate digital markets. The dominant business model and industrial structure within scholarly communications at the end of the last century is then discussed, as a springboard from which to consider new business models that have arisen over the past twenty years and their likely implications for the sector. The paper concludes that there would be considerable benefit to the establishment of a permanent digital markets unit to monitor and assess ongoing developments in the scholarly communications sector and to coordinate and encourage “good behaviour” across all actors in the sector….”

What’s Wrong with Social Science and How to Fix It: Reflections After Reading 2578 Papers | Fantastic Anachronism

[Some recommendations:]

Ignore citation counts. Given that citations are unrelated to (easily-predictable) replicability, let alone any subtler quality aspects, their use as an evaluative tool should stop immediately.
Open data, enforced by the NSF/NIH. There are problems with privacy but I would be tempted to go as far as possible with this. Open data helps detect fraud. And let’s have everyone share their code, too—anything that makes replication/reproduction easier is a step in the right direction.
Financial incentives for universities and journals to police fraud. It’s not easy to structure this well because on the one hand you want to incentivize them to minimize the frauds published, but on the other hand you want to maximize the frauds being caught. Beware Goodhart’s law!
Why not do away with the journal system altogether? The NSF could run its own centralized, open website; grants would require publication there. Journals are objectively not doing their job as gatekeepers of quality or truth, so what even is a journal? A combination of taxonomy and reputation. The former is better solved by a simple tag system, and the latter is actually misleading. Peer review is unpaid work anyway, it could continue as is. Attach a replication prediction market (with the estimated probability displayed in gargantuan neon-red font right next to the paper title) and you’re golden. Without the crutch of “high ranked journals” maybe we could move to better ways of evaluating scientific output. No more editors refusing to publish replications. You can’t shift the incentives: academics want to publish in “high-impact” journals, and journals want to selectively publish “high-impact” research. So just make it impossible. Plus as a bonus side-effect this would finally sink Elsevier….”
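The post does not specify how its replication prediction market would be run. One standard mechanism for such markets is Hanson's logarithmic market scoring rule (LMSR), under which the instantaneous price of a "will replicate" share can be read directly as the market's implied replication probability. A minimal sketch, with the liquidity parameter `b` and all quantities chosen purely for illustration:

```python
import math

def lmsr_price(q_yes: float, q_no: float, b: float = 100.0) -> float:
    """Instantaneous YES price under the logarithmic market scoring rule.

    q_yes / q_no: outstanding YES / NO shares; b: liquidity parameter.
    The price lies in (0, 1) and is the market's implied probability
    that the paper will replicate.
    """
    e_yes = math.exp(q_yes / b)
    e_no = math.exp(q_no / b)
    return e_yes / (e_yes + e_no)

# With no trades, the implied replication probability starts at 0.5 ...
print(lmsr_price(0, 0))               # 0.5
# ... and net buying of YES shares pushes it up.
print(round(lmsr_price(120, 40), 3))  # 0.69
```

It is this implied probability that would sit "in gargantuan neon-red font" next to the paper title; the LMSR's bounded subsidy also means the market maker's worst-case loss is capped at `b * ln 2`, which matters if a funder like the NSF were underwriting the markets.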

Research 2030 podcast: Can the reward system learn to love open science? Part 1 with Jean-Claude Burgelman

“The open science movement has been gaining momentum over the past decade, prompting initiatives such as cOAlition S, with its plan to increase open access publications. But while the goals of open science are welcomed by many, challenges remain. And top of the list is the researcher reward system.

This is the first episode in our short series on open science and the reward system. Host Dr. Stephane Berghmans, Elsevier VP of Academic and Research Relations EU, welcomes Prof. Jean-Claude Burgelman to the podcast. Prof. Burgelman is eminently qualified to talk about this topic. Not only is he a part-time Professor of Open Science Policy at Vrije Universiteit Brussel, he was recently Head of Unit Open Data Policies and Science Cloud at the European Commission and an open access envoy for the organization….”