Invest in Open Infrastructure: An Interview with Dan Whaley – The Scholarly Kitchen

“Dan Whaley from the Invest in Open Infrastructure (IOI) initiative answers questions about what IOI is doing, and sets a broad context for the global effort….

Open infrastructure is the solution to all this.  For me, open infrastructure is simply shorthand for technology in which the incentives to collaborate and work together are built in by design. That includes elements like open source software, open APIs, open data and open standards, but more fundamentally it’s a mindset in which your reward — either personal or organizational — comes from working together as a community for the benefit of all.  

As someone who is product focused, I always try to ask: what is the best user experience, regardless of who owns which piece? Does what we’re implementing actually make it easier for people to accomplish their goals? Closed systems often make decisions simply for the sake of preventing or restricting access, decisions that create terrible experiences and result in lower utility. Open systems do this too sometimes, but at least the inherent motivations are more likely to be aligned….”

OSF Preprints | Open science and modified funding lotteries can impede the natural selection of bad science

Abstract: Assessing scientists using exploitable metrics can lead to the degradation of research methods even without any strategic behavior on the part of individuals, via “the natural selection of bad science.” Institutional incentives to maximize metrics like publication quantity and impact drive this dynamic. Removing these incentives is necessary, but institutional change is slow. However, recent developments suggest possible solutions with more rapid onsets. These include what we call open science improvements, which can reduce publication bias and improve the efficacy of peer review. In addition, there have been increasing calls for funders to move away from prestige- or innovation-based approaches in favor of lotteries. Using computational modeling, we investigated whether such changes are likely to improve the reproducibility of science even in the presence of persistent incentives for publication quantity. We found that modified lotteries, which allocate funding randomly among proposals that pass a threshold for methodological rigor, effectively reduce the rate of false discoveries, particularly when paired with open science improvements that increase the publication of negative results and improve the quality of peer review. In the absence of funding that targets rigor, open science improvements can still reduce false discoveries in the published literature but are less likely to improve the overall culture of research practices that underlie those publications.
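
The two-stage mechanism the abstract describes, screening proposals for methodological rigor and then drawing winners at random, is straightforward to make concrete. Below is a minimal Python sketch of such a modified lottery; the function name, rigor scores, and threshold are illustrative assumptions, not the authors’ actual model.

```python
import random

def modified_lottery(proposals, rigor_threshold, n_awards, seed=None):
    """Fund a random subset of the proposals that clear a rigor bar.

    `proposals` is a list of (proposal_id, rigor_score) pairs, where
    rigor_score stands in for whatever methodological-rigor measure
    reviewers assign (illustrative, not the paper's parameterization).
    """
    rng = random.Random(seed)
    # Stage 1: screen out proposals below the methodological-rigor threshold.
    eligible = [pid for pid, rigor in proposals if rigor >= rigor_threshold]
    # Stage 2: allocate awards uniformly at random within the eligible pool.
    return rng.sample(eligible, min(n_awards, len(eligible)))

# Example: five proposals scored on [0, 1], threshold 0.6, two awards.
pool = [("p1", 0.81), ("p2", 0.40), ("p3", 0.66), ("p4", 0.92), ("p5", 0.55)]
print(modified_lottery(pool, rigor_threshold=0.6, n_awards=2, seed=42))
```

Unlike purely merit-ranked funding, nothing above the threshold gains an edge from impressive-looking but fragile results, which is the property the abstract credits with reducing false discoveries.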

Credit data generators for data reuse

“Much effort has gone towards crafting mandates and standards for researchers to share their data1–3. Considerably less time has been spent measuring just how valuable data sharing is, or recognizing the scientific contributions of the people responsible for those data sets. The impact of research continues to be measured by primary publications, rather than by subsequent uses of the data….

To incentivize the sharing of useful data, the scientific enterprise needs a well-defined system that links individuals with reuse of data sets they generate4….

A system in which researchers are regularly recognized for generating data that become useful to other researchers could transform how academic institutions evaluate faculty members’ contributions to science….”
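
The system the authors call for is, at bottom, a citation graph linking dataset identifiers to the people who generated them. Here is a toy sketch of that linkage in Python, with invented DOIs and ORCID iDs; a real implementation would draw on citation metadata from services such as DataCite and Crossref.

```python
from collections import Counter

# Invented mapping from dataset DOI to the generator's ORCID iD.
dataset_generators = {
    "10.5061/dryad.example1": "0000-0002-1825-0097",
    "10.5061/dryad.example2": "0000-0001-5109-3700",
}

# Invented reuse events: (citing article DOI, reused dataset DOI).
reuse_events = [
    ("10.1000/paper.a", "10.5061/dryad.example1"),
    ("10.1000/paper.b", "10.5061/dryad.example1"),
    ("10.1000/paper.c", "10.5061/dryad.example2"),
]

# Tally documented reuses per generator: the kind of credit the comment
# argues evaluation committees should be able to see.
credit = Counter(dataset_generators[ds] for _, ds in reuse_events)
for orcid, n in credit.items():
    print(f"ORCID {orcid}: {n} documented reuse(s)")
```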

Plan S and the Transformation of Scholarly Communication: Are We Missing the Woods? – The Scholarly Kitchen

“My initial disclaimer, if it’s not obvious, is that I am completely supportive of the driving principles and objectives of Plan S. I lead an organization [PLoS] that is – and always has been – Plan S compliant from top to toe. But more than that, I share the goal of a future in which the research literature is fully and immediately open with liberal rights of reuse. And so, I am pleased to see that this bold goal not only remains unchanged but that the feedback process has encouraged (or, in some cases, perhaps forced) stakeholders to support these general principles …

With that battle won, the question is now all about the transition. The revised guidelines show evidence of having listened to stakeholder concerns without watering down the fundamental principles. That’s key, but there are also a number of positive changes that both help to clear up confusion and should make the transition easier….”

The Guild: Recommendations for Open Access and the implementation of Plan S | Science|Business

In view of the forthcoming publication of Plan S’s revised Implementation Guidance, The Guild has published a position paper presenting its proposals for a successful transition towards Open Access. With these recommendations, The Guild builds on its submission to the Plan S consultation, contributing to a wider debate about how Plan S can help realise the ambitions of Open Science.

Full position paper at https://www.the-guild.eu/news/2019/12_open-science.pdf  

How to avoid borrowed plumes in academia

Abstract: Publications in top journals today have a powerful influence on academic careers, although there is much criticism of using journal rankings to evaluate individual articles. We ask why this practice of performance evaluation is still so influential. We suggest this is the case because a majority of authors benefit from the present system due to the extreme skewness of citation distributions. “Performance paradox” effects aggravate the problem. Three extant suggestions for reforming performance management are critically discussed. We advance a new proposal based on the insight that fundamental uncertainty is characteristic of scholarly work. It suggests focal randomization using a rationally founded and well-orchestrated procedure.

Open access and subscription based journals have similar problems in terms of quantity and relaying science to the public | The BMJ

In the Head to Head debate on whether to publish in an open access journal, Ashton and Beattie report that PLOS One accepts 70% of submissions.1 That might have been true in 2013, but a more recent and perhaps more accurate figure is that, as of 2017, PLOS One accepts about 50% of submissions, an acceptance rate equivalent to that of BMJ Open.2 I also question whether acceptance rate is a meaningful statistic when research is moving towards a publish first, curate later model.3 Simply not publishing, or batting a manuscript around various journals until one finally accepts it after a lengthy delay, constitutes a form of research waste and is something that ought to be avoided.4

The argument that certain forms of open access encourage higher quantity applies equally to subscription based publishing. Predominantly subscription based publishers routinely market their subscriptions to library consortia on the basis of price per article: the lower the price, the better the value. As the subscription prices that libraries can afford remain flat, subscription based publishers have an incentive to make their services look better by publishing more, thereby reducing the apparent price per article that a subscription gets you. The publishers then further obfuscate this by bundling journals full of chaff articles together with journals full of higher quality material. But under so called diamond or platinum open access publishing models, in which neither authors nor readers pay to support the publication process, there is no such dangerous incentive to erode professional standards….”
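
The price-per-article arithmetic behind that incentive is easy to see with invented numbers: holding a bundle’s subscription price flat while doubling output halves the apparent cost per article.

```python
# Hypothetical figures, chosen only to illustrate the incentive.
subscription_price = 100_000  # flat annual bundle price, in dollars

for articles_per_year in (2_000, 4_000, 8_000):
    per_article = subscription_price / articles_per_year
    print(f"{articles_per_year:>5} articles/year -> ${per_article:.2f} per article")
# Prints $50.00, $25.00, $12.50 -- publishing more "chaff" makes the
# same bundle look like progressively better value to a consortium.
```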

The European University Association and Science Europe Join Efforts to Improve Scholarly Research Assessment Methodologies

EUA and Science Europe have issued a joint statement on the need for research funders and research performing organisations, as well as universities, to combine their efforts to develop and implement more accurate, transparent and responsible approaches to scholarly research assessment.

“EUA and Science Europe are committed to working together on building a strong dialogue between their members, with a view to:

• support necessary changes for a better balance between qualitative and quantitative research assessment approaches, aiming at evaluating the merits of scholarly research. Furthermore, novel criteria and methods need to be developed towards a fairer and more transparent assessment of research, researchers and research teams, conducive to selecting excellent proposals and researchers;

• recognise the diversity of research outputs and other relevant academic activities and their value in a manner that is appropriate to each research field and that challenges the overreliance on journal-based metrics;

• consider a broad range of criteria to reward and incentivise research quality as the fundamental principle of scholarly research, and ascertain assessment processes and methods that accurately reflect the vast dimensions of research quality and credit all scientific contributions appropriately.

EUA and Science Europe will launch activities to further engage their members in improving and strengthening their research assessment practices. Building on these actions, both associations commit to maintaining a continuous dialogue and exploring opportunities for joint actions, with a view to promoting strong synergies between the rewards and incentives structures of research funders and research performing organisations, as well as universities….”
