BABCP journals, openness and transparency | Behavioural and Cognitive Psychotherapy | Cambridge Core

“Our BABCP journals have for some time been supportive of open science in its various forms. We are now taking the next steps towards this in terms of our policies and practices. For some things we are transitioning to the changes (but would encourage our contributors to embrace these as early as possible), and in others we are implementing things straight away. This is part of the global shift to open practices in science, and has many benefits and few, if any, drawbacks. See, for example, http://www.unesco.org/new/en/communication-and-information/portals-and-platforms/goap/open-science-movement/

One of the main drivers for open science has been the recent ‘reproducibility crisis’, which crystallised long-standing concerns about a range of biases within and across research publication. Open science and research transparency will provide the means to reduce the impact of such biases, and can reasonably be considered to be a paradigm change. There are benefits beyond dealing with problems, however.

McKiernan et al. (2016) for example suggest that ‘open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities’. This is, of course, from a researcher-focused perspective. The BABCP and the Journal Editors take the view that open and transparent research practices will have the greatest long-term impact on service users both directly and indirectly through more accurate reporting and interpretation of research and its applications by CBT practitioners. So what are the practical changes we are implementing in partnership with our publisher, Cambridge University Press?…”

PsyArXiv Preprints | Replicability, Robustness, and Reproducibility in Psychological Science

Abstract:  Replication, an important, uncommon, and misunderstood practice, is making a comeback in psychology. Achieving replicability is a necessary but not sufficient condition for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledge. Assessing replicability can be productive for generating and testing hypotheses by actively confronting current understanding to identify weaknesses and spur innovation. For psychology, the 2010s might be characterized as a decade of active confrontation. Systematic and multi-site replication projects assessed current understanding and observed surprising failures to replicate many published findings. Replication efforts also highlighted sociocultural challenges, such as disincentives to conduct replications, framing of replication as personal attack rather than healthy scientific practice, and headwinds for replication contributing to self-correction. Nevertheless, innovation in doing and understanding replication, and its cousins, reproducibility and robustness, have positioned psychology to improve research practices and accelerate progress.

As new venues for peer review flower, will journals catch up? – Psychonomic Society Featured Content

“Given that preprints are here to stay, the field should be devoting resources to getting them certified more quickly as having received some amount of expert scrutiny. This is particularly important, of course, for preprints making claims relevant to the response to the pandemic.

In many cases, one component of this certification is already happening very quickly. More publicly-available peer review is happening today than ever before – just not at our journals. While academic journals typically call on half a handful of hand-picked, often reluctant referees, social media is not as limiting, and lively expert discussions are flourishing at forums like Twitter, Pubpeer, and the commenting facility of preprint servers.

So far, most journals have simply ignored this. As a result, science is now happening on two independent tracks, one slow, and one fast. The fast track is chaotic and unruly, while the slow track is bureaucratic and secretive – at most journals the experts’ comments never become available to readers, and the resulting evaluation by the editor of the strengths and weaknesses of the manuscript is never communicated to readers….

Will we need to reinvent the scientific journal wheel, or will legacy journals catch up with the modern world, by both taking advantage of and adding value to the peer review that is happening on the fast track?”


PsyArXiv Preprints | The Pandemic as a Portal: Reimagining Psychological Science as Truly Open and Inclusive

Abstract:  Psychological science is at an inflection point: The COVID-19 pandemic has already begun to exacerbate inequalities that stem from our historically closed and exclusive culture. Meanwhile, reform efforts to change the future of our science are too narrow in focus to fully succeed. In this paper, we call on psychological scientists—focusing specifically on those who use quantitative methods in the United States as one context in which such a conversation can begin—to reimagine our discipline as fundamentally open and inclusive. First, we discuss who our discipline was designed to serve and how this history produced the inequitable reward and support systems we see today. Second, we highlight how current institutional responses to address worsening inequalities are inadequate, as well as how our disciplinary perspective may both help and hinder our ability to craft effective solutions. Third, we take a hard look in the mirror at the disconnect between what we ostensibly value as a field and what we actually practice. Fourth and finally, we lead readers through a roadmap for reimagining psychological science in whatever roles and spaces they occupy, from an informal discussion group in a department to a formal strategic planning retreat at a scientific society.


To share or not to share – 10 years of European Journal of Psychotraumatology

Abstract:  The European Journal of Psychotraumatology, owned by the European Society for Traumatic Stress Studies (ESTSS), launched as one of the first full Open Access ‘specialist’ journals in its field. Has this Open Access model worked in how the Journal has performed? With the European Journal of Psychotraumatology celebrating its ten-year anniversary, we look back at the past decade of sharing our research with the world and at how the journal sits within the broader movement beyond Open Access to Open Research, and we present new policies we have adopted to move the field of psychotraumatology to the next level of Open Research. While we as researchers now make our publications more often freely available to all, how often do we share our protocols, our statistical analysis plans, or our data? We all gain from more transparency and reproducibility, and big steps are being made in this direction. The journal’s decennial performance as well as the exciting new Open Research developments are presented in this editorial. The journal is no longer in its infancy and is eager to step into the next decade of Open Research.

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study | Royal Society Open Science

Abstract:  For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
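The bracketed figures are 95% confidence intervals for simple proportions. As a minimal illustration (not the authors’ code; the abstract does not say which interval method was used, so a Wilson score interval is assumed here), the final discrepancy rate can be recomputed in a few lines of Python:

    import math

    def wilson_ci(successes, n, z=1.96):
        # 95% Wilson score interval for a binomial proportion
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    # 37 remaining major numerical discrepancies out of 789 checked values
    low, high = wilson_ci(37, 789)
    print(f"{37 / 789:.0%} [{low:.0%}, {high:.0%}]")  # -> 5% [3%, 6%]

Applied to the article-level counts (e.g. wilson_ci(16, 25)), this helper gives somewhat narrower intervals than those quoted above, so the authors presumably used a different interval procedure for those; the sketch is illustrative only.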


Not One but Many Models of Open-Access Publishing – Association for Psychological Science – APS

“The OA movement has proliferated in numerous directions over the last two decades, and a color-naming system has evolved in an attempt to simplify this diversity. PsyArXiv is classified in this system as “green” OA because it is a repository for authors who seek to freely share their scholarly output with both consumers (readers) and producers of research (Samberg et al., 2018). The niches that Kitayama has described—serving “cutting-edge” and “nontraditional” research projects—are both examples of “gold” OA. These outlets are peer-reviewed journals that publish open articles and make use of article processing charges (APCs). This approach differs substantially from traditional publishing models where peer-reviewed articles are published without expense for the authors, but at substantial expense to libraries; further, articles are locked away behind a “paywall.” Many readers of the APS Observer are likely familiar with hybrid approaches as well (sometimes called “paid open access”). This model gives authorship teams the choice, after peer review, to pay APCs to add OA publishing to their accepted paper, or they can choose to publish without expense by effectively signing away the licensing rights to their article. Many additional variations exist, each with its own color-name (see Barnes, 2020, and Samberg et al., 2018)….

At the most fundamental level, PsyArXiv complements all forms of publishing by equitably providing psychological researchers with a free, simple, and immediate outlet that can be accessed by anyone with reliable Internet service. This gives early access to timely research findings, provides an alternative access option for works that are not published openly, increases discoverability (Norris et al., 2008; Lewis, 2018), and reduces the file-drawer problem (Franco et al., 2014). Beyond this, the PsyArXiv infrastructure allows for further innovation in psychology publishing that can build on the benefits of OA. These might include overlay journals, which have gained considerable attention in other scientific disciplines recently and provide peer-review and/or editorial curation of content posted on arXiv (for examples, see Discrete Analysis and The Open Journal of Astrophysics). Models like these offer the potential for niche journals to flourish in a manner that would not be viable within the traditional publishing ecosystem. In short, we hope that researchers, including submitters to APS journals, will take advantage of APS’s generous article-posting policies and make copies of their pre- and post-publication work available for the community at PsyArXiv, thereby helping the community capitalize on these many benefits.”
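As a quick reference for the colour taxonomy described in the excerpt above, the quoted definitions can be condensed into a small mapping (a reader’s sketch of the passage, not an official or exhaustive classification; the "closed" entry names the traditional model the excerpt contrasts against):

    # A reader's condensed summary of the OA colour names quoted above (illustrative only)
    OA_MODELS = {
        "green": "author self-archives in an open repository such as PsyArXiv; "
                 "free to both readers and authors",
        "gold": "peer-reviewed journal publishes articles openly, funded by "
                "article processing charges (APCs)",
        "hybrid": "subscription journal; after peer review, authors may pay an "
                  "APC to make their accepted article open",
        "closed": "traditional model: no author-side expense, but articles sit "
                  "behind a paywall paid for by libraries",
    }

    for colour, meaning in OA_MODELS.items():
        print(f"{colour}: {meaning}")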

Promoting scientific integrity through open science in health psychology: results of the Synergy Expert Meeting of the European Health Psychology Society

Abstract:  The article describes a position statement and recommendations for actions that need to be taken to develop best practices for promoting scientific integrity through open science in health psychology, endorsed at a Synergy Expert Group Meeting. Sixteen Synergy Meeting participants developed a set of recommendations for researchers, gatekeepers, and research end-users. The group process followed a nominal group technique and voting system to elicit and decide on the most relevant and topical issues. Seventeen priority areas were listed and voted on; 15 of them were recommended by the group. Specifically, the following priority actions for health psychology were endorsed: (1) for researchers: advancing when and how to make data open and accessible at various research stages and understanding researchers’ beliefs and attitudes regarding open data; (2) for educators: integrating open science in research curricula, e.g., through online open science training modules, promoting preregistration, transparent reporting, open data and applying open science as a learning tool; (3) for journal editors: providing an open science statement, and open data policies, including a minimal requirements submission checklist. Health psychology societies and journal editors should collaborate in order to develop a coordinated plan for research integrity and open science promotion across behavioural disciplines.


The necessity of data transparency to publish

“Even though The Journal of Social Psychology was one of the first psychology journals to adopt open science badges (J. E. Grahe, 2014), and the first to require Research Materials Transparency (J. Grahe, 2018), we have resisted requiring Data Transparency. The reasons for this have varied across the years, but most recently we paused for two reasons, which I will present momentarily. However, our reasons generally reflected a concern that early adoption would drive away too many authors and that we needed to wait. In the early spring of 2020, the editors once again discussed adopting Data Transparency as a requirement for publication, but again demurred. Though our other concerns were again discussed, the onset of the COVID-19 pandemic was our primary caution. In short, we recognized that this decision would require a transition as authors grapple with a new reality of sharing their data as a condition of publication, and we were waiting until the time was right to implement the new rules. Well, the time has come, and this editorial is the announcement that Data Transparency will now be required for publication in The Journal of Social Psychology. Along with a short explanation of the timing, this editorial also describes what is required versus recommended in our new data sharing policy.”

Opening Pandora’s Box: Peeking inside Psychology’s data sharing practices, and seven recommendations for change | SpringerLink

Abstract:  Open data-sharing is a valuable practice that ought to enhance the impact, reach, and transparency of a research project. While widely advocated by many researchers and mandated by some journals and funding agencies, little is known about detailed practices across psychological science. In a pre-registered study, we show that overall, few research papers directly link to available data in many, though not all, journals. Most importantly, even where open data can be identified, the majority of these lacked completeness and reusability—conclusions that closely mirror those reported outside of Psychology. Exploring the reasons behind these findings, we offer seven specific recommendations for engineering and incentivizing improved practices, so that the potential of open data can be better realized across psychology and social science more generally.