Abstract: The article describes a position statement, endorsed at a Synergy Expert Group Meeting, and recommendations for actions needed to develop best practices for promoting scientific integrity through open science in health psychology. Sixteen Synergy Meeting participants developed a set of recommendations for researchers, gatekeepers, and research end-users. The group process followed a nominal group technique and voting system to elicit and decide on the most relevant and topical issues. Seventeen priority areas were listed and voted on; 15 of them were recommended by the group. Specifically, the following priority actions for health psychology were endorsed: (1) for researchers: advancing when and how to make data open and accessible at various research stages, and understanding researchers’ beliefs and attitudes regarding open data; (2) for educators: integrating open science into research curricula, e.g., through online open science training modules, promoting preregistration, transparent reporting, and open data, and applying open science as a learning tool; (3) for journal editors: providing an open science statement and open data policies, including a minimal-requirements submission checklist. Health psychology societies and journal editors should collaborate to develop a coordinated plan for research integrity and open science promotion across behavioural disciplines.
“Even though The Journal of Social Psychology was one of the first psychology journals to adopt open science badges (J. E. Grahe, 2014), and the first to require Research Materials Transparency (J. Grahe, 2018), we have resisted requiring Data Transparency. The reasons for this have varied across the years, but most recently we paused for two reasons which I will present momentarily. However, we were generally concerned that early adoption would drive away too many authors and that we needed to wait. In the early spring of 2020, the editors once again discussed adopting Data Transparency as a requirement for publication, but again demurred. Though our other concerns were again discussed, the onset of the CV-19 pandemic was our primary caution. In short, we recognized that this decision would require a transition as authors grapple with a new reality of sharing their data as a condition of publication, and we were waiting until the time was right to implement the new rules. Well, the time has come, and this editorial is the announcement that Data Transparency will now be required for publication in The Journal of Social Psychology. Along with a short explanation of the timing, this editorial also describes what is required versus recommended in our new data sharing policy.”
Abstract: Open data-sharing is a valuable practice that ought to enhance the impact, reach, and transparency of a research project. While widely advocated by many researchers and mandated by some journals and funding agencies, little is known about detailed practices across psychological science. In a pre-registered study, we show that overall, few research papers directly link to available data in many, though not all, journals. Most importantly, even where open data can be identified, the majority of these lacked completeness and reusability—conclusions that closely mirror those reported outside of psychology. Exploring the reasons behind these findings, we offer seven specific recommendations for engineering and incentivizing improved practices, so that the potential of open data can be better realized across psychology and social science more generally.
“The American Psychological Association (APA), the nation’s largest organization representing psychologists and psychological researchers, has become a signatory to the Transparency and Openness Promotion (TOP) Guidelines, an important step for helping to make research data and processes more open by default, according to the Center for Open Science (COS).
The TOP Guidelines are a community-driven effort to align research behaviors with scientific ideals. Transparency, open sharing, and reproducibility are core values of science, but not always part of daily practice. Journals, funders, and institutions can increase reproducibility and integrity of research by aligning their author or grantee guidelines with the TOP Guidelines.
The APA said it will officially begin implementing standardized disclosure requirements of data and underlying research materials (TOP Level 1). Furthermore, it encourages editors of core journals to move to Level 2 TOP (required transparency of data and research items when ethically possible). More information on the specific levels of adoption by each of the core journals will be coming in the first half of 2021….”
“As a jocular retort to one of a few cases of strange and aggressive behaviour from some open science people towards others online, one of us (Olivia) coined the expression #bropenscience in a June 2017 tweet. This was after a discussion with other women within the open science movement, who had noticed this phenomenon, but were looking for a concise description. #bropenscience is a tongue-in-cheek expression but also has a serious side, shedding light on the narrow demographics and off-putting behavioural patterns seen in open science. The phrase is a necessary rhetorical device to draw attention to an issue that has been systematically underappreciated. It evokes a visceral reaction. By design. Labelling broblems allows us to tackle them. As a field, psychology is well-equipped to self-reflect on patterns of behaviours and rhetorical devices – most of us are used to analysing complex social dynamics. However, #bropenscience has also been misunderstood and misrepresented, not least because Twitter has a tricky interface and people love drama!
Here we will clarify the important points for those who might not have been following these discussions. We will explain why having a hashtag like #bropenscience, or at least having this dialogue, is useful as part of the process of achieving openness in scholarship. Along the way we will explain what open science and open scholarship are, why we care about them, and finally, we will describe specific actions that readers can take to help promote equity and inclusion, the fundamentals for openness.
We offer our opinions as open science advocates, albeit with different priorities and expertise. Just as it is important for scientists to criticise the scientific process, so too must open science advocates critically engage with the suggested reforms….”
Abstract: The Repository of Psychological Instruments in Serbian (REPOPSI; https://osf.io/5zb8p/), run by the Laboratory for Research of Individual Differences at the University of Belgrade and hosted on the Open Science Framework, is an open-access repository of psychological instruments. REPOPSI is a collection of over 130 instruments (e.g., scales, tests) commonly used in social and behavioral science research. Documented are Serbian, English and multilingual instruments, which can be used free of charge for non-commercial purposes (e.g., academic research or education). We argue that REPOPSI enables scientists to increase the efficiency of their research and the visibility of their output. We analyze REPOPSI’s commitment to ensure that its (meta)data is findable, accessible, interoperable, and reusable (the FAIR Data Principles) and its trustworthiness with respect to transparency, responsibility, user focus, sustainability, and technology (the TRUST principles). Finally, we describe how the FAIR and the TRUST principles will support the process of continuous improvement of REPOPSI.
“Professor of psychology and public affairs Betsy Levy Paluck and behavioral sciences librarian Meghan Testerman were recently awarded Princeton’s Data-Driven Social Science Initiative (DDSSI) Grant for their joint proposal, “Prejudice Reduction: Creating an Open Repository of ‘What Works’ from Experimental Research.”
A decade ago, Paluck and Donald P. Green, professor of political science at Columbia University, composed an essay for the Annual Review of Psychology that addressed the vast literature on prejudice from a new vantage point. Instead of reviewing leading theories of how prejudices are formed and expressed, Paluck and Green (2009) summarized theory and evidence on how to reduce prejudice. The review article immediately attracted scholarly attention and earned a “highly cited” badge on ISI Web of Science.
That essay’s most salient contribution was to put “prejudice reduction” on the map as a central object of study that speaks to both social science theory and real-world policymaking. Importantly, they linked their essay to a resource that proved to be equally if not more influential—a database that listed and made available for thematic searching the comprehensive list of over 900 studies that they had compiled for the purposes of writing the essay. The database, hosted on Paluck’s personal website, received hundreds of hits per month and has been cited in subsequent work as a resource for scholars reviewing the literature on prejudice reduction and developing new interventions to be tested. In short, the need for a database of studies curated by an expert in the field was clear. …”
Abstract: A consensus on the importance of open data and reproducible code is emerging. How should data and code be shared to maximize the key desiderata of reproducibility, permanence, and accessibility? Research assets should be stored persistently in formats that are not software restrictive, and documented so that others can reproduce and extend the required computations. The sharing method should be easy to adopt by already busy researchers. We suggest the R package standard as a solution for creating, curating, and communicating research assets. The R package standard, with extensions discussed herein, provides a format for assets and metadata that satisfies the above desiderata, facilitates reproducibility, open access, and sharing of materials through online platforms like GitHub and Open Science Framework. We discuss a stack of R resources that help users create reproducible collections of research assets, from experiments to manuscripts, in the RStudio interface. We created an R package, vertical, to help researchers incorporate these tools into their workflows, and discuss its functionality at length in an online supplement. Together, these tools may increase the reproducibility and openness of psychological science.
“In summary, only two studies (7%) reporting results of a psychological treatment for common mental disorders in LMICs provided citations to the exact manual used in the study, and only two (7%) provided open access to the manual.
Access to treatment manuals for psychological interventions is important for the replication and independent scrutiny of study results and for the dissemination of effective interventions.
Change is not only needed but also feasible. For example, two relevant RCTs of psychological treatments were released around the same time as the systematic review3 and were thus not included in our analyses. One included a reference to an online version of the exact manual used8, and the other offered access to a linked training programme to learn the intervention9.
Accessibility to treatment manuals is a key aspect of open science of psychological treatments. Mental health journals and research funders should consider setting up mechanisms that require authors of RCTs to make the psychological treatment manuals they used open access.”
Abstract: The present crisis demands an all-out response if it is to be mastered with minimal damage. This means we, as the behavioural science community, need to think about how we can adapt to best support evidence-based policy in a rapidly changing, high-stakes environment. This piece is an attempt to initiate this process. The ‘recommendations’ made are first stabs that will hopefully be critiqued, debated and improved.