Open science, communal culture, and women’s participation in the movement to improve science | PNAS

Abstract:  Science is undergoing rapid change with the movement to improve science focused largely on reproducibility/replicability and open science practices. This moment of change—in which science turns inward to examine its methods and practices—provides an opportunity to address its historic lack of diversity and noninclusive culture. Through network modeling and semantic analysis, we provide an initial exploration of the structure, cultural frames, and women’s participation in the open science and reproducibility literatures (n = 2,926 articles and conference proceedings). Network analyses suggest that the open science and reproducibility literatures are emerging relatively independently of each other, sharing few common papers or authors. We next examine whether the literatures differentially incorporate collaborative, prosocial ideals that are known to engage members of underrepresented groups more than independent, winner-takes-all approaches. We find that open science has a more connected, collaborative structure than does reproducibility. Semantic analyses of paper abstracts reveal that these literatures have adopted different cultural frames: open science includes more explicitly communal and prosocial language than does reproducibility. Finally, consistent with literature suggesting the diversity benefits of communal and prosocial purposes, we find that women publish more frequently in high-status author positions (first or last) within open science (vs. reproducibility). This finding is further patterned by team size and time. Women are more represented in larger teams within reproducibility, and women’s participation is increasing in open science over time and decreasing in reproducibility. We conclude with actionable suggestions for cultivating a more prosocial and diverse culture of science.

What’s Wrong with Social Science and How to Fix It: Reflections After Reading 2578 Papers | Fantastic Anachronism

“[Some recommendations:]

Ignore citation counts. Given that citations are unrelated to (easily-predictable) replicability, let alone any subtler quality aspects, their use as an evaluative tool should stop immediately.
Open data, enforced by the NSF/NIH. There are problems with privacy but I would be tempted to go as far as possible with this. Open data helps detect fraud. And let’s have everyone share their code, too—anything that makes replication/reproduction easier is a step in the right direction.
Financial incentives for universities and journals to police fraud. It’s not easy to structure this well because on the one hand you want to incentivize them to minimize the frauds published, but on the other hand you want to maximize the frauds being caught. Beware Goodhart’s law!
Why not do away with the journal system altogether? The NSF could run its own centralized, open website; grants would require publication there. Journals are objectively not doing their job as gatekeepers of quality or truth, so what even is a journal? A combination of taxonomy and reputation. The former is better solved by a simple tag system, and the latter is actually misleading. Peer review is unpaid work anyway; it could continue as is. Attach a replication prediction market (with the estimated probability displayed in gargantuan neon-red font right next to the paper title) and you’re golden. Without the crutch of “high ranked journals” maybe we could move to better ways of evaluating scientific output. No more editors refusing to publish replications. You can’t shift the incentives: academics want to publish in “high-impact” journals, and journals want to selectively publish “high-impact” research. So just make it impossible. Plus as a bonus side-effect this would finally sink Elsevier….”

ripeta – responsible science

“Ripeta is a credit review for scientific publications. Similar to a financial credit report, which reviews the fiscal health of a person, Ripeta assesses the responsible reporting of the scientific paper. The Ripeta suite identifies and extracts the key components of research reporting, thus drastically shortening and improving the publication process; furthermore, Ripeta’s ability to extract data makes these pieces of text easily discoverable for future use….

Researchers: Rapidly check your pre-print manuscripts to improve the transparency of reporting your research.

Publishers: Improve the reproducibility of the articles you publish with an automated tool that helps evidence-based science.

Funders: Evaluate your portfolio by checking your manuscripts for robust scientific reporting.”

Sharing and organizing research products as R packages | SpringerLink

Abstract:  A consensus on the importance of open data and reproducible code is emerging. How should data and code be shared to maximize the key desiderata of reproducibility, permanence, and accessibility? Research assets should be stored persistently in formats that are not software restrictive, and documented so that others can reproduce and extend the required computations. The sharing method should be easy to adopt by already busy researchers. We suggest the R package standard as a solution for creating, curating, and communicating research assets. The R package standard, with extensions discussed herein, provides a format for assets and metadata that satisfies the above desiderata and facilitates reproducibility, open access, and sharing of materials through online platforms like GitHub and Open Science Framework. We discuss a stack of R resources that help users create reproducible collections of research assets, from experiments to manuscripts, in the RStudio interface. We created an R package, vertical, to help researchers incorporate these tools into their workflows, and discuss its functionality at length in an online supplement. Together, these tools may increase the reproducibility and openness of psychological science.
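
To make the workflow concrete: the proposal amounts to using ordinary R package structure (DESCRIPTION, R/, data/, vignettes/) as the container for a study’s assets. The sketch below uses the general-purpose usethis and devtools helpers rather than the vertical package itself, and the project and object names are placeholders:

    # Sketch: start a research compendium as an R package (names are placeholders)
    install.packages(c("usethis", "devtools"))
    usethis::create_package("~/projects/mystudy")   # creates DESCRIPTION, R/, etc.

    # Then, working inside the new package project:
    trials <- read.csv("data-raw/trials.csv")       # hypothetical raw data file
    usethis::use_data(trials)                       # stores data/trials.rda for reuse
    usethis::use_vignette("analysis")               # R Markdown vignette holding the analysis
    usethis::use_readme_md()                        # README describing the compendium

    devtools::document()                            # build documentation from roxygen comments
    devtools::check()                               # verify the package installs and builds cleanly

Once it passes check, the same directory can be pushed to GitHub or archived on the Open Science Framework, and others can install it and re-run the vignettes, which is the permanence and accessibility argument the abstract makes.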


Publish your Registered Report on F1000Research

“We believe that the value of science is in the rigor of the method, not the appeal of the results – an ethos at the heart of our publishing model. Choosing to publish your research as a Registered Report puts this into practice, by shifting the focus away from the results and back to the research question. Registered Reports can be used for research in almost any field of study, from psychology and neuroscience, to medicine or ecology.


Registered Reports on F1000Research follow a two-stage process: firstly, the Study Protocol (Stage 1) is published and peer-reviewed by subject experts before data collection begins. Then, once the research has been completed, the Research Article (Stage 2) is published, peer reviewed, and awarded a Registered Report badge.


F1000Research is the first publisher to combine the Registered Reports format with an open, post-publication peer review model. Alongside our open data policy, this format enhances credibility, and takes transparency and reproducibility in research to the next level….”

Open and Reproducible Research Group (ORRG)

“Open Science is better science. Research benefits when data and scientific knowledge are shared and publicly available, making open science essential. The Open and Reproducible Research Group (ORRG) uses evidence-based and computational approaches to make research cultures more open, transparent and participatory through new practices and technologies. In our interdisciplinary team we combine competences in philosophy, sociology, and information science with computer science and life science. We research services, policies, and tools to investigate and foster the uptake and evaluation of Open Science practices in the following areas: …”


Help me redesign the scientific paper | Dynamic Ecology

“If scientists spend taxpayer money to generate irreproducible results, the public’s logical response should be to either withhold funds or demand a new process that emphasizes reproducibility….

Collaborative Independent Review is one way that funders, journals and scientists could implement a more reproducible paper….”

Welcome to a new ERA of reproducible publishing | Labs | eLife

“Since 2017, we have been working on the concept of computationally reproducible papers. The open-source suite of tools that started life as the Reproducible Document Stack is now live on eLife as ERA, the Executable Research Article, delivering a truly web-native format for taking published research to a new level of transparency, reproducibility and interactivity.

From today, authors with a published eLife paper can register their interest to enrich their published work with the addition of live code blocks, programmatically-generated interactive figures, and dynamically generated in-line values, using familiar tools like R Markdown and Jupyter in combination with Stencila Hub’s intuitive asset management and format conversion interface. The resulting new ERA publication will be presented as a complement to the original published paper. Very soon, a Google Docs plugin will also be made available to let authors insert executable code and data blocks into their documents using the cloud service.

Readers of ERA publications will be able to inspect the code, modify it, and re-execute it directly in the browser, enabling them to better understand how a figure is generated. They will be able to change a plot from one format to another, alter the data range of a specific analysis, and much more. All changes are limited to an individual’s browsing session and do not affect the published article, so anyone can experiment safely. Readers can also download the ERA publication – with all embedded code and data preserved – and use it as a basis for further study or derivative works….”
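
The post names R Markdown and Jupyter as the authoring tools but does not show ERA’s own markup; a generic R Markdown fragment of the kind an executable figure and an in-line value are built from might look like the following (illustrative only, with made-up data):

    ---
    title: "Illustrative executable figure"
    output: html_document
    ---

    ```{r dose-response}
    # Figure generated by code at render time rather than pasted in as a static image
    dose <- seq(0, 10, by = 0.5)
    response <- 100 / (1 + exp(-(dose - 5)))  # invented logistic curve for illustration
    plot(dose, response, type = "l", xlab = "Dose", ylab = "Response (%)")
    ```

    The maximum response in this simulation was `r round(max(response), 1)`%.

In an ERA-style rendering, a reader could edit the chunk (for example, change the dose range), re-execute it in the browser, and watch the figure and the in-line value update, without altering the published article.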

An evaluation of the practice of transparency and reproducibility in addiction medicine literature – ScienceDirect

“Highlights


• Reproducibility is an essential aspect of open science that allows synthesis of old and new knowledge.
• Reproducibility makes research more efficient and prevents waste of resources.
• It is unknown whether reproducible research practices are being followed by the research community in addiction medicine.
• Our findings show a lack of practices that encourage and promote reproducible research in addiction medicine.
• We propose several ways by which this lack of reproducibility can be improved….”