“‘Open research’ (used interchangeably with ‘open science’) is an all-encompassing term speaking to the set of practices that aim to improve the accessibility, reproducibility, and integrity of research outputs. It’s also complex, spanning issues such as open access, open practices that increase the integrity and reproducibility of research (e.g., Registered Reports, open data and code), open collaboration, and open recognition (e.g. transparent peer review and CRediT Contributor Roles Taxonomy).
So, what do researchers think about open research? We invited researchers to participate in Wiley’s Open Research Survey to share their views and experiences of open research practices. It’s clear from our findings that researchers welcome open research initiatives in terms of their motivation for publishing open access, willingness to share data and to experiment with opening up the peer review process (see overview below for more detail).
Recent studies have shown that articles that are freely available obtain more citations and are downloaded more often. Institutions are beginning to reward and recognise open research practices, especially in recruitment and for promotion. Funders are also requiring that researchers publish open access and share data (for example, Horizon Europe).
Open research isn’t the future – it’s the here and now, and journal editors have a vital role to play in facilitating open research and open publishing practices alongside researchers, institutions, funders, and publishers. Editors can play their part by supporting open access publishing, adopting Registered Reports, adopting open data policies and data availability statements, recognizing and celebrating open research practices such as displaying open research badges on published articles, and opening up peer review. If you want to implement one or more of these initiatives on your journal, please speak with your Wiley Journal Publishing Manager….”
“Open science reduces waste and accelerates the discovery of knowledge, solutions, and cures for the world’s most pressing needs. Shifting research culture toward greater openness, transparency, and reproducibility is challenging, but there are incremental steps at every stage of the research lifecycle that can improve rigor and reduce waste. Visit cos.io to learn more.”
Abstract: In January 2020, I presented at the Librarians Building Momentum for Reproducibility virtual conference. The theme of the presentation was preregistration and registered reports and their role in reproducibility of research results. The presentation was twofold in that it provided background information on these themes and then advocated for the adoption of a registered reports submission track in Library and Information Science journals. I asked attendees to notify me if they wanted to learn more and to join me in contacting LIS journals to advocate for this model. The first journal that we targeted was College & Research Libraries. We drafted a letter that was sent to editor Wendi Arant Kaspar who discussed the topic with the editorial board and ultimately asked me to write a guest editorial for C&RL.
Ignore citation counts. Given that citations are unrelated to (easily-predictable) replicability, let alone any subtler quality aspects, their use as an evaluative tool should stop immediately.
Open data, enforced by the NSF/NIH. There are problems with privacy but I would be tempted to go as far as possible with this. Open data helps detect fraud. And let’s have everyone share their code, too—anything that makes replication/reproduction easier is a step in the right direction.
Financial incentives for universities and journals to police fraud. It’s not easy to structure this well because on the one hand you want to incentivize them to minimize the frauds published, but on the other hand you want to maximize the frauds being caught. Beware Goodhart’s law!
Why not do away with the journal system altogether? The NSF could run its own centralized, open website; grants would require publication there. Journals are objectively not doing their job as gatekeepers of quality or truth, so what even is a journal? A combination of taxonomy and reputation. The former is better solved by a simple tag system, and the latter is actually misleading. Peer review is unpaid work anyway; it could continue as is. Attach a replication prediction market (with the estimated probability displayed in gargantuan neon-red font right next to the paper title) and you’re golden. Without the crutch of “high ranked journals” maybe we could move to better ways of evaluating scientific output. No more editors refusing to publish replications. You can’t shift the incentives: academics want to publish in “high-impact” journals, and journals want to selectively publish “high-impact” research. So just make it impossible. Plus as a bonus side-effect this would finally sink Elsevier….”
“We believe that the value of science is in the rigor of the method, not the appeal of the results – an ethos at the heart of our publishing model. Choosing to publish your research as a Registered Report puts this into practice, by shifting the focus away from the results and back to the research question. Registered Reports can be used for research in almost any field of study, from psychology and neuroscience, to medicine or ecology.
Registered Reports on F1000Research follow a two-stage process: firstly, the Study Protocol (Stage 1) is published and peer-reviewed by subject experts before data collection begins. Then, once the research has been completed, the Research Article (Stage 2) is published, peer reviewed, and awarded a Registered Report badge.
F1000Research is the first publisher to combine the Registered Reports format with an open, post-publication peer review model. Alongside our open data policy, this format enhances credibility, and takes transparency and reproducibility in research to the next level….”
“3.5.7 Registered reports and open practices badges
One possible way to incorporate all the information listed above and to combat the stigma against papers that report nonsignificant findings is through the implementation of Registered Reports or rewarding transparent research practices. Registered Reports are a form of empirical article, designed to eliminate publication bias and incentivize best scientific practice, in which the methods and proposed analyses are preregistered and reviewed prior to the research being conducted. This format is designed to minimize bias, while also allowing complete flexibility to conduct exploratory (unregistered) analyses and report serendipitous findings. The cornerstone of the Registered Reports format is that the authors submit as a Stage 1 manuscript an introduction, complete and transparent methods, and the results of any pilot experiments (where applicable) that motivate the research proposal, written in the future tense. These proposals include a description of the key research question and background literature, hypotheses, experimental design and procedures, analysis pipeline, a statistical power analysis, and a full description of the planned comparisons. Submissions are reviewed by editors, peer reviewers, and, in some journals, statistical editors; those meeting the rigorous and transparent requirements for conducting the proposed research are offered an in-principle acceptance, meaning that the journal guarantees publication if the authors conduct the experiment in accordance with their approved protocol. Many journals publish the Stage 1 report, which can be beneficial not only for citations but also for the authors’ progress reports and tenure packages. Following data collection, the authors prepare and resubmit a Stage 2 manuscript that includes the introduction and methods from the original submission plus their obtained results and discussion.
The manuscript will undergo full review; referees will consider whether the data test the authors’ proposed hypotheses by satisfying the approved outcome-neutral conditions, will ensure the authors adhered precisely to the registered experimental procedures, and will review any unregistered post hoc analyses added by the authors to confirm they are justified, methodologically sound, and informative. At this stage, the authors must also share their data (see also Wiley’s Data Sharing and Citation Policy) and analysis scripts in a public and freely accessible archive such as Figshare, Dryad, or the Open Science Framework. Additional details, including template reviewer and author guidelines, can be found by clicking the link to the Open Science Framework from the Center for Open Science (see also [94]).
Authors who practice transparent and rigorous science should be recognized for this work. Funders can encourage and reward open practice in significant ways (see https://wellcome.ac.uk/what-we-do/our-work/open-research). One way journals can support this is to award badges to the authors in recognition of these open scientific practices. Badges certify that a particular practice was followed, but do not define good practice. As defined by the Open Science Framework, three badges can be earned. The Open Data badge is earned for making publicly available the digitally shareable data necessary to reproduce the reported results. These data must be accessible via an open-access repository, and must be permanent (e.g., a registration on the Open Science Framework, or an independent repository at www.re3data.org). The Open Materials badge is earned when the components of the research methodology needed to reproduce the reported procedure and analysis are made publicly available. The Preregistered badge is earned for having a preregistered design, whereas the Preregistered+Analysis Plan badge is earned for having both a preregistered research design and an analysis plan for the research; the authors must report results according to that plan. Additional information about the badges, including the necessary information to be awarded a badge, can be found by clicking this link to the Open Science Framework from the Center for Open Science….”
Abstract: The Open Science movement has gained considerable traction in the last decade. It seeks to increase trust in research results and to open all elements of a research project to the public. Central to these goals, Open Science has promoted five critical tenets: Open Data, Open Analysis, Open Materials, Preregistration, and Open Access. All Open Science elements can be thought of as extensions to the traditional way of achieving openness in science, which has been scientific publication of research outcomes in journals or books. Open Science in the education sciences, however, has the potential to be much more than a safeguard against questionable research: it provides opportunities to (a) increase the transparency and therefore replicability of research and (b) develop and answer research questions about individuals with learning disabilities and learning difficulties that were previously impossible to answer due to complexities in data analysis methods. We will provide overviews of the main tenets of Open Science (i.e., Open Data, Open Analysis, Open Materials, Preregistration, and Open Access), show how they are in line with grant funding agencies’ expectations for rigorous research processes, and present resources on best practices for each of the tenets.
Abstract: Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in the past years, but their uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.
Methods: We scored the author guidelines of a comprehensive set of 28 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness Promotion (TOP) Factor. This instrument rates the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.
Results: Across the 28 journals, we find low values on the TOP Factor (median [25th, 75th percentile] 2.5 [1, 3], min. 0, max. 9, out of a total possible score of 28) in sleep research and chronobiology journals.
Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support the recent developments in transparent and open science by implementing transparency and openness principles in their guidelines and making adherence to them mandatory.
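The summary statistics reported in the Results above (median with 25th/75th percentiles, min, max) are straightforward to compute from a vector of per-journal TOP Factor scores. The sketch below uses made-up scores for 28 hypothetical journals, chosen only so the output matches the shape of the reported figures; the study's actual per-journal scores are not reproduced here.

```python
from statistics import median

# Hypothetical TOP Factor scores for 28 journals (illustrative only,
# NOT the study's data); sorted for readability.
scores = [0, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2,
          3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 5, 7, 9]

def percentile(data, p):
    """Nearest-rank percentile: value at rank round(p% * n) in sorted data."""
    s = sorted(data)
    k = max(0, round(p / 100 * len(s)) - 1)  # convert 1-based rank to index
    return s[k]

print(f"median [25th, 75th percentile]: "
      f"{median(scores)} [{percentile(scores, 25)}, {percentile(scores, 75)}]")
print(f"min. {min(scores)}, max. {max(scores)}, "
      f"out of a total possible score of 28")
```

Note that percentile conventions differ (nearest-rank vs. interpolation); the paper does not state which it used, so the nearest-rank method here is an assumption.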
“During the 2010s, I gradually adopted open science practices. With each study I started, I began to take more and more steps to make my research transparent. I started uploading my data, documenting analysis procedures, pre-registering my work, and taking other steps to ensure my research was transparent. After adding components of open science to my work, I finally decided in fall 2017 that I would conduct a fully open science project. My only regret was that I didn’t fully embrace open science earlier….
This article is the result of the first fully open science study of my career, though I had adopted pieces of open science beforehand. Here is what I learned from this study: …”