“Professor of psychology and public affairs Betsy Levy Paluck and behavioral sciences librarian Meghan Testerman were recently awarded Princeton’s Data-Driven Social Science Initiative (DDSSI) Grant for their joint proposal, “Prejudice Reduction: Creating an Open Repository of ‘What Works’ from Experimental Research.”
A decade ago, Paluck and Donald P. Green, professor of political science at Columbia University, composed an essay for the Annual Review of Psychology that addressed the vast literature on prejudice from a new vantage point. Instead of reviewing leading theories of how prejudices are formed and expressed, Paluck and Green (2009) summarized theory and evidence on how to reduce prejudice. The review article immediately attracted scholarly attention and earned a “highly cited” badge on ISI Web of Science.
That essay’s most salient contribution was to put “prejudice reduction” on the map as a central object of study that speaks to both social science theory and real-world policymaking. Importantly, they linked their essay to a resource that proved to be equally if not more influential—a database that listed, and made searchable by theme, the more than 900 studies they had compiled in writing the essay. The database, hosted on Paluck’s personal website, received hundreds of hits per month and has been cited in subsequent work as a resource for scholars reviewing the literature on prejudice reduction and developing new interventions to be tested. In short, the need for a database of studies curated by an expert in the field was clear. …”
Abstract: A consensus on the importance of open data and reproducible code is emerging. How should data and code be shared to maximize the key desiderata of reproducibility, permanence, and accessibility? Research assets should be stored persistently in formats that are not software restrictive, and documented so that others can reproduce and extend the required computations. The sharing method should be easy for already busy researchers to adopt. We suggest the R package standard as a solution for creating, curating, and communicating research assets. The R package standard, with extensions discussed herein, provides a format for assets and metadata that satisfies the above desiderata and facilitates reproducibility, open access, and sharing of materials through online platforms like GitHub and Open Science Framework. We discuss a stack of R resources that help users create reproducible collections of research assets, from experiments to manuscripts, in the RStudio interface. We created an R package, vertical, to help researchers incorporate these tools into their workflows, and discuss its functionality at length in an online supplement. Together, these tools may increase the reproducibility and openness of psychological science.
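The R-package layout the abstract advocates can be sketched in a few shell commands; the package name and file contents below are illustrative assumptions, not part of vertical itself. The core idea is that data, analysis functions, and manuscripts each get a standard directory, with a DESCRIPTION file supplying machine-readable metadata:

```shell
# Minimal sketch of an R-package-style research compendium
# (directory names follow the R package standard; "mystudy" is hypothetical).
mkdir -p mystudy/R mystudy/data mystudy/vignettes

# DESCRIPTION holds the package's metadata in the standard field format.
cat > mystudy/DESCRIPTION <<'EOF'
Package: mystudy
Title: Data and Analysis for an Example Study
Version: 0.1.0
Description: Raw data live in data/, analysis functions in R/,
    and a reproducible manuscript in vignettes/.
EOF

ls mystudy
```

Because this is the same layout R's own tooling expects, a compendium like this can be installed, checked, and documented with standard commands (e.g. `R CMD check`), and shared as an ordinary repository on GitHub or OSF.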
“In summary, only two studies (7%) reporting results of a psychological treatment for common mental disorders in LMICs provided citations to the exact manual used in the study, and only two (7%) provided open access to the manual.
Access to treatment manuals for psychological interventions is important for the replication and independent scrutiny of study results and for the dissemination of effective interventions.
Change is not only needed but also feasible. For example, two relevant RCTs of psychological treatments were released around the same time as the systematic review [3] and were thus not included in our analyses. One included a reference to an online version of the exact manual used [8], and the other offered access to a linked training programme to learn the intervention [9].
Accessibility to treatment manuals is a key aspect of open science of psychological treatments. Mental health journals and research funders should consider setting up mechanisms that require authors of RCTs to make the psychological treatment manuals they used open access.”
Abstract: The present crisis demands an all-out response if it is to be mastered with minimal damage. This means we, as the behavioural science community, need to think about how we can adapt to best support evidence-based policy in a rapidly changing, high-stakes environment. This piece is an attempt to initiate this process. The ‘recommendations’ made are first stabs that will hopefully be critiqued, debated and improved.
Abstract: This editorial announces this journal’s policy on transparency, openness and replication. From 1 July 2020, authors of manuscripts submitted to Journal of Health Psychology (JHP) are required to make the raw data fully accessible to all readers. JHP will only consider manuscripts which follow an open publication model defined as follows: M = Mandatory, I = Inclusion (of), R = Raw, D = Data (MIRD). All data and analytical procedures must be sufficiently well described to enable a third party with the appropriate expertise to replicate the data analyses. It is expected that findings and analyses in the JHP will be fully capable of being accurately reproduced.
“[W]e are very pleased to announce that IJMPR [International Journal of Methods in Psychiatric Research] has transitioned to an open access journal, effective January 2020. As a result, all submissions to IJMPR will be subject to an Article Publication Charge (APC) if accepted and published in the journal. Since 10th July 2019, all articles submitted and accepted for publication in IJMPR have been published open access under a creative commons license. You can find details of the APCs here: https://onlinelibrary.wiley.com/page/journal/15570657/article_publication_charges.html and details about IJMPR’s open access licensing and copyright here: https://onlinelibrary.wiley.com/page/journal/15570657/homepage/open_access_license_and_copyright.htm?. If you have any questions regarding this transition, please feel free to contact the editors directly….”
Abstract: The ability to independently verify and replicate observations made by other researchers is a hallmark of science. In this article, we provide an overview of recent discussions concerning replicability and best practices in mainstream psychology with an emphasis on the practical benefits to both researchers and the field as a whole. We first review challenges individual researchers face in producing research that is both publishable and reliable. We then suggest methods for producing more accurate research claims, such as transparently disclosing how results were obtained and analyzed, preregistering analysis plans, and publicly posting original data and materials. We also discuss ongoing changes at the institutional level to incentivize stronger research. These include officially recognizing open science practices at the journal level, disconnecting the publication decision from the results of a study, training students to conduct replications, and publishing replications. We conclude that these open science practices afford exciting low-cost opportunities to improve the quality of psychological science.
“On Friday, December 13, 2019, Research!America confirmed to the broader scientific community that the US White House Office of Science and Technology Policy (OSTP) would soon issue an Executive Order (EO) requiring open access immediately for all scientific publications resulting from research supported by US federal grants. Such an immediate change would inject chaos into the current means of disseminating research findings and potentially cause serious financial challenges for many scientific societies.
Over the ensuing weekend, two public letters were drafted to the US presidential administration and other American politicians. One letter was led by the American Chemical Society (which focused on the impact of the EO on scientific societies) and the other letter was led by the Association of American Publishers (which focused on the economic impact of the EO). The goal was to insist that the US administration reconsider any impulsive action so journals and societies could evaluate and adapt to the proposed change in an orderly way. There was no opportunity for APS to offer edits….”