Why we should worry less about predatory publishers and more about the quality of research and training at our academic institutions

In 2014, over 400,000 articles were published in about 8,000 journals that many regard as predatory. The term “predatory publishers” was coined by Jeffrey Beall, a librarian at the University of Colorado Denver, who until recently documented the phenomenon on his blog and in an annual list. Although this term, and variants such as “predatory journals”, are widely used, they have been criticised. One problem is that “predatory” may cover a spectrum of organisations, business activities and publications, ranging from the amateurish but genuine to the deliberately misleading.

Stop binning negative results, researchers told | Times Higher Education (THE)

“A new Europe-wide code of research conduct has ordered academics and journals to treat negative experimental results as being equally worthy of publication as positive ones….The new European Code of Conduct for Research Integrity frames the bias against negative results as an issue of research conduct, stipulating that “authors and publishers [must] consider negative results to be as valid as positive findings for publication and dissemination”….It has been drawn up by All European Academies (Allea), a network of academic organisations including the British Academy, Germany’s Leopoldina and the French Académie des Sciences….The new code also puts more emphasis on research organisations themselves to prevent and detect misconduct; for example, universities should reward “open and reproducible practices” when it comes to hiring and promoting researchers, it says….”

Chambers, C.: The Seven Deadly Sins of Psychology: A Manifesto for Reforming the Culture of Scientific Practice. (eBook and Hardcover)

“Outlining a core set of best practices that can be applied across the sciences, Chambers demonstrates how all these sins can be corrected by embracing open science, an emerging philosophy that seeks to make research and its outcomes as transparent as possible….”

Academics can change the world – if they stop talking only to their peers

“Some academics insist that it’s not their job to write for the general public. They suggest that doing so would mean they’re “abandoning their mission as intellectuals”. They don’t want to feel like they’re “dumbing down” complex thinking and arguments.

The counter argument is that academics can’t operate in isolation from the world’s very real problems.

They may be producing important ideas and innovations that could help people understand and perhaps even begin to address issues like climate change, conflict, food insecurity and disease….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established, to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics….and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016, to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to receive more feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

Do US Patent Incentives Need To Change To Get The ‘Cancer Moonshot’ Off The Ground? – Intellectual Property Watch

“The Moonshot is a partnership with the public, and significant public funds are being invested in the project. It is not so simple, however, to argue that information generated under the Moonshot can be ordered to be shared. When private companies work together on this project, they won’t just be sharing their data but sharing how they collect that data. Much of this information falls under the category of trade secrets, which companies guard closely….[Jacob] Sherkow advocates for prohibiting information that could be shared from eligibility as trade secrets….This would be coupled with a broadening of patentability criteria, said Sherkow. “Broadening up patentable subject matter has its disadvantages, to be sure, but in something like a public-private partnership, where the name of the game is creating information and then disclosing it to people, that’s definitely better than having the taxpayer pay for private trade secrets that vest in a for-profit company,” he said….”

Religious studies scholars not readily adopting open access, according to new Ithaka S+R report | Omega Alpha | Open Access

“Although the report noted that scholars are keen to use online venues like Academia.edu for sharing and discovery of their research among colleagues, they are distrustful or uncertain about open access as a primary publishing model, either due to lack of appropriate open access venues in their (sub-)discipline, a perception of lower academic standards, or that it would not be recognized for tenure or promotion….”

Energy scientists must show their workings : Nature News & Comment

“The list of reasons why energy models and data are not openly available is long: business confidentiality; concerns over the security of critical infrastructure; a desire to avoid exposure and scrutiny; worries about data being misrepresented or taken out of context; and a lack of time and resources.

This secrecy is problematic, because it is well known that closed systems hide and perpetuate mistakes. A classic example is the spreadsheet error discovered in the influential Reinhart–Rogoff paper used to support economic policies of national austerity. The European Commission’s Energy Roadmap 2050 was based on a model that could not be viewed by outsiders, leaving it open to criticism. Assumptions that remain hidden, like the costs of technologies, can largely determine what comes out of such models. In the United Kingdom, opaque and overly optimistic cost assumptions for onshore wind went into models used for policymaking, and that may well have delayed the country’s decarbonization.

This closed culture is alien to younger researchers, who grew up with collaborative online tools and share code and data on platforms such as GitHub. Yet academia’s love affair with metrics and the pressure to publish set the wrong incentives: every hour spent on cleaning up a data set for public release or writing open-source code is time not spent working on a peer-reviewed paper.”
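The Reinhart–Rogoff example above deserves a gloss: the mistake was a spreadsheet averaging formula whose cell range silently excluded several countries, and it stayed hidden until outside researchers obtained the original file. The minimal sketch below (in Python, with invented numbers rather than the real dataset) illustrates why the same class of error is easier to catch in an open, scripted analysis: the faulty selection is an explicit, reviewable line of code rather than an invisible cell range.

```python
# Illustrative sketch only: the figures below are made up and are not
# taken from the Reinhart-Rogoff dataset.

growth_by_country = {
    "A": 2.1, "B": 1.8, "C": -0.3, "D": 0.9, "E": 1.2,
}

countries = sorted(growth_by_country)

# Spreadsheet-style error: a hard-coded range that silently drops the
# last rows -- the kind of mistake hidden inside a formula such as
# =AVERAGE(L30:L44). In code, the bug sits in plain sight.
selected = countries[:3]  # BUG: excludes countries "D" and "E"
buggy_mean = sum(growth_by_country[c] for c in selected) / len(selected)

# Correct, auditable version: iterate over the full dataset.
true_mean = sum(growth_by_country.values()) / len(growth_by_country)

print(f"buggy mean: {buggy_mean:.2f}")  # 1.20
print(f"true mean:  {true_mean:.2f}")   # 1.14
```

The point is not that spreadsheets are uniquely error-prone, but that a script released alongside its data, ideally under version control, leaves an audit trail that a closed model does not.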

Guest Post, Tony Sanfilippo: University Press Publishing Under an Autocracy – The Scholarly Kitchen

“So continuing the production of high-quality, peer-reviewed scholarship is only going to grow in importance because it provides a provable counternarrative to what is becoming an onslaught of misinformation and disinformation. The research we produce and our willingness to share both the actual data and the analysis as openly as possible is going to be on the frontline of what is shaping up to be a war on truth and on verification….”