Academics can change the world – if they stop talking only to their peers

“Some academics insist that it’s not their job to write for the general public. They suggest that doing so would mean they’re ‘abandoning their mission as intellectuals’. They don’t want to feel like they’re ‘dumbing down’ complex thinking and arguments.

The counterargument is that academics can’t operate in isolation from the world’s very real problems.

They may be producing important ideas and innovations that could help people understand and perhaps even begin to address issues like climate change, conflict, food insecurity and disease….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics … and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work; it will report its findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016 to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to gather further feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous, given via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

Do US Patent Incentives Need To Change To Get The ‘Cancer Moonshot’ Off The Ground? – Intellectual Property Watch

“The Moonshot is a partnership with the public, and significant public funds are being invested in the project. It is not so simple, however, to argue that information generated under the Moonshot can be ordered to be shared. When private companies work together on this project, they won’t just be sharing their data but sharing how they collect that data. Much of this information falls under the category of trade secrets, which companies guard closely….[Jacob] Sherkow advocates for prohibiting information that could be shared from eligibility as trade secrets….This would be coupled with a broadening of patentability criteria, said Sherkow. “Broadening up patentable subject matter has its disadvantages, to be sure, but in something like a public-private partnership, where the name of the game is creating information and then disclosing it to people, that’s definitely better than having the taxpayer pay for private trade secrets that vest in a for-profit company,” he said….”

Religious studies scholars not readily adopting open access, according to new Ithaka S+R report | Omega Alpha | Open Access

“Although the report noted that scholars are keen to use online venues like Academia.edu for sharing and discovery of their research among colleagues, they are distrustful or uncertain about open access as a primary publishing model, whether due to a lack of appropriate open access venues in their (sub-)discipline, a perception of lower academic standards, or a concern that open access publication would not be recognized for tenure or promotion….”

Energy scientists must show their workings : Nature News & Comment

“The list of reasons why energy models and data are not openly available is long: business confidentiality; concerns over the security of critical infrastructure; a desire to avoid exposure and scrutiny; worries about data being misrepresented or taken out of context; and a lack of time and resources.

This secrecy is problematic, because it is well known that closed systems hide and perpetuate mistakes. A classic example is the spreadsheet error discovered in the influential Reinhart–Rogoff paper used to support economic policies of national austerity. The European Commission’s Energy Roadmap 2050 was based on a model that could not be viewed by outsiders, leaving it open to criticism. Assumptions that remain hidden, like the costs of technologies, can largely determine what comes out of such models. In the United Kingdom, opaque and overly optimistic cost assumptions for onshore wind went into models used for policymaking, and that may well have delayed the country’s decarbonization.

This closed culture is alien to younger researchers, who grew up with collaborative online tools and share code and data on platforms such as GitHub. Yet academia’s love affair with metrics and the pressure to publish set the wrong incentives: every hour spent on cleaning up a data set for public release or writing open-source code is time not spent working on a peer-reviewed paper.”

Guest Post, Tony Sanfilippo: University Press Publishing Under an Autocracy – The Scholarly Kitchen

“So continuing the production of high-quality, peer-reviewed scholarship is only going to grow in importance because it provides a provable counternarrative to what is becoming an onslaught of misinformation and disinformation. The research we produce and our willingness to share both the actual data and the analysis as openly as possible is going to be on the frontline of what is shaping up to be a war on truth and on verification….”

Managing the Transitional Impact of Open Access Journals

Abstract:  The explosion of open access (OA) journals in recent years has impacted not only how libraries manage content and budgets, but also which journals academic researchers choose when submitting their articles for publication. A study conducted at the University of Hong Kong indicated that academic researchers have been gradually shifting some of their publications toward OA journals, and interestingly these shifts are discipline-specific. While OA does offer an alternative to the unsustainable pricing of serials and supports a core value of ensuring openness to knowledge, stakeholders still lack consensus on the perceived value and impact of OA journals.

The aims of this study are to better understand the paper-submission preferences of academic researchers in four broad disciplines: Health Science, Life Science, Physical Science and Social Science. Data on actual article submission trends at HKU will be analyzed together with qualitative feedback from researchers to examine the trends and incentives in shifting toward OA publishing in different disciplines. Researchers’ attitudes will be understood within the context of the university’s open access policy and research assessment, as well as the current OA landscape, to inform the scholarly communication trend going forward.

The Economics of Replication

Abstract:  Replication studies are considered a hallmark of good scientific practice. Yet among researchers they are treated as an ideal to be professed but not practiced. To provide incentives and favorable boundary conditions for replication practice, the main stakeholders need to be aware of what drives replication. Here we investigate how often replication studies are published in empirical economics and what types of journal articles are replicated. We find that from 1974 to 2014 less than 0.1% of publications in the top 50 economics journals were replications. We find no empirical support that mandatory data disclosure policies or the availability of data or code have a significant effect on the incidence of replication. The mere provision of data repositories may be ineffective unless accompanied by appropriate incentives. However, we find that higher-impact articles and articles by authors from leading institutions are more likely to be the subject of published replication studies, whereas the replication probability is lower for articles published in higher-ranked journals.

Kudos to the Simon Fraser University Publishing Program

“Kudos to the Simon Fraser University Publishing Program for this key provision in its updated criteria for promotion and tenure:

“In keeping with the University’s Open Access Policy of 2017, only those publications that are in compliance with the policy will be considered by tenure, promotion, and review committees….”

https://docs.google.com/document/d/1I9bZK3hAHzzWAVILMO1_MvvDPT3vCWnkvdYS6vp6h_Y/edit

All universities should follow suit….”