When are researchers willing to share their data? – Impacts of values and uncertainty on open data in academia

Abstract: Background

E-science technologies have significantly increased the availability of data. Research grant providers such as the European Union increasingly require open access publishing of research results and data. However, despite its significance to research, the adoption rate of open data technology remains low across all disciplines, especially in Europe. Prior research has primarily focused on technical solutions (such as Zenodo or the Open Science Framework) or has considered only parts of the issue.

Methods and findings

In this study, we emphasized the non-technical factors of perceived value and uncertainty in the context of academia, which impact researchers’ acceptance of open data: the idea that researchers should not only publish their findings in the form of articles or reports but also share the corresponding raw data sets. We present the results of a broad quantitative analysis including N = 995 researchers from 13 large and medium-sized universities in Germany. To test 11 hypotheses regarding researchers’ intentions to share their data, as well as to detect any hierarchical or disciplinary differences, we employed a structural equation model (SEM) following the partial least squares (PLS) modeling approach.

Conclusions

Grounded in value-based theory, this article argues that most individuals in academia embrace open data when the perceived advantages outweigh the perceived disadvantages. Furthermore, uncertainty factors impact the perceived value (consisting of the perceived advantages and disadvantages) of sharing research data. We found that researchers’ assumptions about the effort required during the data preparation process were diminished by awareness of e-science technologies (such as Zenodo or the Open Science Framework), which also increased their tendency to perceive personal benefits from data exchange. Uncertainty factors also appear to influence the intention to share data, and these effects differ between disciplines and hierarchical levels.
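
For readers unfamiliar with PLS-SEM, the sketch below gives a rough sense of the approach. It is not the authors' model or data: it simulates indicator blocks for three hypothetical constructs (perceived benefit, perceived effort, uncertainty), collapses each block into a standardized composite score, and estimates the structural paths to the intention to share data with ordinary least squares, a deliberately simplified stand-in for the iterative PLS weighting scheme the study actually used.

```python
# Minimal sketch, not the authors' actual model or data: it approximates a
# PLS-SEM analysis with equally weighted composite scores and an OLS path
# regression. Construct names, item counts, and path weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 995  # sample size reported in the abstract

def composite(items):
    """Standardize each indicator and average the block into one composite score."""
    z = (items - items.mean(axis=0)) / items.std(axis=0)
    return z.mean(axis=1)

# Simulated 3-item indicator blocks for three hypothetical constructs.
perceived_benefit = composite(rng.normal(size=(n, 3)))
perceived_effort = composite(rng.normal(size=(n, 3)))
uncertainty = composite(rng.normal(size=(n, 3)))

# Hypothetical structural model: intention ~ benefit + effort + uncertainty.
intention = (0.5 * perceived_benefit
             - 0.3 * perceived_effort
             - 0.2 * uncertainty
             + rng.normal(scale=0.7, size=n))

# Estimate the path coefficients by ordinary least squares.
X = np.column_stack([np.ones(n), perceived_benefit, perceived_effort, uncertainty])
coefs, *_ = np.linalg.lstsq(X, intention, rcond=None)
print(dict(zip(["intercept", "benefit", "effort", "uncertainty"], coefs.round(3))))
```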

Suppression as a form of liberation? – Ross Mounce

“On Monday 29th June 2020, I learned from Retraction Watch that Clarivate, the for-profit proprietor of Journal Impact Factor™, has newly “suppressed” 33 journals from their indexing service. The immediate consequence of this “suppression” is that these 33 journals do not get assigned an official Clarivate Journal Impact Factor™. Clarivate justify this action on the basis of “anomalous citation patterns” but without much further detail given for each of the journals other than the overall “% Self-cites” of the journal, and the effect of those self-cites on Clarivate’s citation-based ranking of journals (% Distortion of category rank)….

The zoology section of the Chilean Society of Biology has already petitioned Clarivate to unsuppress Zootaxa, to give it back its Journal Impact Factor™. I understand why they would do this, but I would actually call for something quite different and more far-reaching.

I would encourage all systematists, taxonomists, zoologists, microbiologists, and biologists in general to see the real problem here: Clarivate, a for-profit analytics company, should never be so relied upon by research evaluation committees to arbitrarily decide the value of a research output, especially given that the Journal Impact Factor™ is untransparent, irreproducible, and fundamentally statistically illiterate.

Thus, to bring us back to my title: I wonder if Clarivate’s wacky “suppression” might actually be a pathway to liberation from the inappropriate stupidity of using Journal Impact Factor™ to evaluate individual research outputs. Given we have all now witnessed just how brainless some of Clarivate’s decision making is, I would ask Clarivate to please “suppress” all journals, thereby removing the harmful stupidity of Journal Impact Factor™ from the lives of researchers.”

UK Research and Development Roadmap – GOV.UK

“Research has rapidly improved our understanding of COVID-19. Supported by rapid action by funding bodies, scientists around the world have directed their efforts to this global priority, working collaboratively across countries and disciplines, and sharing findings openly and quickly. Rapid targeted funding has enabled researchers and policy makers to join up to clarify and tackle pressing questions and has enabled businesses to collaborate in new ways to address national needs. For example, the COVID-19 Genomics UK consortium has achieved rapid sequencing of over 50% of all the SARS-CoV-2 genomes in the world. The UK has led the world’s largest randomised controlled trial for COVID-19, with findings helping the sickest patients not only in the UK but all around the world. We should aspire to this level of openness, connectivity and pace across our whole R&D system….

Crucially, we must embrace the potential of open research practices. First, we will require that research outputs funded by the UK government are freely available to the taxpayer who funds research. Such open publication will also ensure that UK research is cited and built on all over the world. We will mandate open publication and strongly incentivise open data sharing where appropriate, so that reproducibility is enabled, and knowledge is shared and spread collaboratively. Second, we will ensure that more modern research outputs are recognised and rewarded. For example, we will ensure that digital software and datasets are properly recognised as research outputs, so that we can minimise efforts spent translating digital outputs into more traditional formats. Third, we will consider the case for new infrastructure to enable more effective sharing of knowledge between researchers and with industry to accelerate open innovation where possible….”

Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities | The BMJ

Abstract: Objective To determine the presence of a set of pre-specified traditional and non-traditional criteria used to assess scientists for promotion and tenure in faculties of biomedical sciences among universities worldwide.

Design Cross sectional study.

Setting International sample of universities.

Participants 170 randomly selected universities from the Leiden ranking of world universities list.

Main outcome measure Presence of five traditional (for example, number of publications) and seven non-traditional (for example, data sharing) criteria in guidelines for assessing assistant professors, associate professors, and professors and the granting of tenure in institutions with biomedical faculties.

Results A total of 146 institutions had faculties of biomedical sciences, and 92 had eligible guidelines available for review. Traditional criteria of peer reviewed publications, authorship order, journal impact factor, grant funding, and national or international reputation were mentioned in 95% (n=87), 37% (34), 28% (26), 67% (62), and 48% (44) of the guidelines, respectively. Conversely, among non-traditional criteria, only citations (any mention in 26%; n=24) and accommodations for employment leave (37%; 34) were relatively commonly mentioned. Mention of alternative metrics for sharing research (3%; n=3) and data sharing (1%; 1) was rare, and three criteria (publishing in open access mediums, registering research, and adhering to reporting guidelines) were not found in any guidelines reviewed. Among guidelines for assessing promotion to full professor, traditional criteria were more commonly reported than non-traditional criteria (traditional criteria 54.2%, non-traditional items 9.5%; mean difference 44.8%, 95% confidence interval 39.6% to 50.0%; P=0.001). Notable differences were observed across continents in whether guidelines were accessible (Australia 100% (6/6), North America 97% (28/29), Europe 50% (27/54), Asia 58% (29/50), South America 17% (1/6)), with more subtle differences in the use of specific criteria.

Conclusions This study shows that the evaluation of scientists emphasises traditional criteria as opposed to non-traditional criteria. This may reinforce research practices that are known to be problematic while insufficiently supporting the conduct of better quality research and open science. Institutions should consider incentivising non-traditional criteria.
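
For readers who want to see how a headline figure such as the 44.8 percentage-point mean difference and its 95% confidence interval can be computed, the sketch below works through a paired comparison on simulated per-guideline percentages. The data are illustrative assumptions only, and the authors' exact analysis may differ.

```python
# Minimal sketch with simulated data: paired mean difference between the share
# of traditional and non-traditional criteria mentioned per guideline, with a
# t-based 95% confidence interval. Values are illustrative, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_guidelines = 92  # eligible guidelines reported in the abstract

# Hypothetical per-guideline percentages of criteria mentioned in each category.
traditional = rng.normal(loc=54.2, scale=20.0, size=n_guidelines).clip(0, 100)
non_traditional = rng.normal(loc=9.5, scale=10.0, size=n_guidelines).clip(0, 100)

diff = traditional - non_traditional
mean_diff = diff.mean()
se = diff.std(ddof=1) / np.sqrt(n_guidelines)
t_crit = stats.t.ppf(0.975, df=n_guidelines - 1)
lower, upper = mean_diff - t_crit * se, mean_diff + t_crit * se
t_stat, p_value = stats.ttest_rel(traditional, non_traditional)

print(f"mean difference = {mean_diff:.1f} points, "
      f"95% CI {lower:.1f} to {upper:.1f}, p = {p_value:.3g}")
```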

Open science and the reward system: how can they be aligned?

“How does the current reward system reflect how we value research? Is a reform of this system necessary to encourage researchers to engage in open science activities? How do current and proposed reward systems support early career researchers?

Questions on how academics’ careers and contributions are assessed and valued are under discussion. This webinar brings together a panel of experts on open science and career assessment to focus on the current reward system and the potential for its reform. This promises to be a lively exchange of ideas between representatives of Eurodoc, Young Academy of Europe, Marie Curie Alumni Association, and Elsevier. The aim is to gain a deeper understanding of possible changes to how we consider academic value, retain mobility internationally and beyond academia, and create incentives for open science activities….”

Is a software revolution on the cards? | Research Information

“New research outputs also create new software challenges as a wide variety of formats must be integrated into existing information and knowledge systems. In fact, one of the main reasons researchers are not sharing data at scale is that they don’t know where to share it and lack incentives from the community to do so.

Existing tools, such as institutional repositories, content workflow or discovery services, do not put user experience or innovative discovery and dissemination concepts at the forefront, nor do they target specific formats such as pre-published research. As such, software services that can make content easily discoverable and useful for researchers are becoming all the more relevant and represent a massive business opportunity in the scholarly ecosystem….

Having access to new kinds of highly relevant and useful software services helps businesses achieve success and accelerate their growth by providing a tailor-made solution to their needs.

There’s evidence that a similar shift is underway in scholarly publishing. The research workflow and the way that content is shared, discovered, and analysed are being reinvented to invigorate processes that are often decades old….

Supporting researchers to do their best work while ensuring research is more accessible is a win for science and a win for sustainable business models.”

Pharmaceutical patents: reconciling the human right to health with the incentive to invent – ScienceDirect

“Highlights

Drug discovery is exciting and transformative but conflicts exist between the incentive to invent and the rights of others to access medicines
Tensions between fundamental rights to access essential medicines and rights of the inventor and investors are considered 
Effective incentives to innovate in developed countries can lead to global improvements in access to medicine if the intellectual property system is calibrated to permit this 
Compulsory licensing and alternative mechanisms facilitating global access to drugs in the context of rights to the highest attainable standard of health and intellectual property are also discussed…”