Increasing transparency through open science badges

“Authors who adopt transparent practices for an article in Conservation Biology are now able to select from three open science badges: open data, open materials, and preregistration. Badges appear on published articles as visible recognition and highlight these efforts to the research community. There is an emerging body of literature on the influence of badges, for example, an increased number of articles with open data (Kidwell et al. 2016) and an increased rate of data sharing (Rowhani-Farid et al. 2018). However, in another study, Rowhani-Farid et al. (2020) found that badges did not “noticeably motivate” researchers to share data. Badges, as far as we know, are the only data-sharing incentive that has been tested empirically (Rowhani-Farid et al. 2017).

Rates of data and code sharing are typically low (Herold 2015; Roche et al. 2015; Archmiller et al. 2020; Culina et al. 2020). Since 2016, we have asked authors of contributed papers, reviews, method papers, practice and policy papers, and research notes to tell us whether they “provided complete machine- and human-readable data and computer code in Supporting Information or on a public archive.” Authors of 31% of these articles published in Conservation Biology said they shared their data or code, and all authors provided human-survey instruments in Supporting Information or via a citation or online link (i.e., shared materials)….”

Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals

Abstract: Background: “Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of these practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in recent years, but uptake differs from discipline to discipline. Here, we asked to what extent journals in the field of sleep research and chronobiology encourage or even require following transparent and open science principles in their author guidelines.

Methods: We scored the author guidelines of a comprehensive set of 27 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness Promotion (TOP) Factor. The TOP Factor is a quantitative summary of the extent to which journals encourage or require various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.
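
For illustration, a minimal sketch in Python of how a journal's guidelines might be scored into a single TOP Factor value. The criterion names follow the list above; the per-criterion ratings are hypothetical placeholders, not the study's actual coding, and the per-criterion maxima are assumptions (the paper cites a total possible score of 29).

```python
# Hypothetical TOP Factor scoring sketch (NOT the study's data).
# Each criterion is rated by how strongly the journal's author
# guidelines encourage or require it (0 = not mentioned; higher =
# more stringent, up to 3 for most criteria). The TOP Factor is
# the sum of the per-criterion ratings.
top_ratings = {
    "data citation": 1,
    "data transparency": 1,
    "analysis code transparency": 0,
    "materials transparency": 0,
    "design and analysis guidelines": 1,
    "study pre-registration": 0,
    "analysis plan pre-registration": 0,
    "replication": 0,
    "registered reports": 0,
    "open science badges": 0,
}

top_factor = sum(top_ratings.values())
print(f"TOP Factor: {top_factor}")  # -> TOP Factor: 3
```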

Results: Across the 27 sleep research and chronobiology journals, we found low TOP Factor values (median [25th, 75th percentile] 3 [1, 3]; min. 0, max. 9; out of a total possible score of 29).
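
A sketch of how such summary statistics could be computed from per-journal scores, using only the Python standard library. The score list below is a short illustrative example chosen to match the reported summary (the study scored 27 journals), not the study's data.

```python
import statistics

# Illustrative per-journal TOP Factor scores (NOT the study's data;
# the paper reports median 3, quartiles [1, 3], min. 0, max. 9).
scores = [0, 1, 1, 2, 3, 3, 3, 3, 9]

median = statistics.median(scores)
# statistics.quantiles with n=4 returns the three quartile cut points.
q1, q2, q3 = statistics.quantiles(scores, n=4)
print(f"median {median}, [25th, 75th percentile] [{q1}, {q3}], "
      f"min. {min(scores)}, max. {max(scores)}")
```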

Conclusions: Our findings suggest an opportunity for sleep research and chronobiology journals to further support recent developments in transparent and open science by implementing transparency and openness principles in their author guidelines.

Transparency and Open Science at the Journal of Personality – Wright – Journal of Personality – Wiley Online Library

“Changes are afoot in the way the scientific community is approaching the practice and reporting of research. Spurred by concerns about the fundamental reliability (i.e., replicability), or rather lack thereof, of contemporary psychological science (e.g., Open Science Collaboration, 2015), as well as how we go about our business (e.g., Gelman & Loken, 2014), several recommendations have been put forward for increasing the rigor of published research through openness and transparency. The Journal has long prized and published the type of research, with features like large sample sizes (Fraley & Vazire, 2014), that has fared well by replicability standards (Soto, 2019). The type of work traditionally published here, often relying on longitudinal samples, large public datasets (e.g., the Midlife in the United States Study), or complex data collection designs (e.g., ambulatory assessment and behavioral coding), did not seem to fit neatly into the template of the emerging transparency practices. However, as thinking in the open science movement has progressed and matured, we have decided to full-throatedly endorse these practices and join the growing chorus of voices encouraging and rewarding more transparent work in psychological science. We believe this can be achieved while maintaining the “big tent” spirit of personality research at the Journal, with the broad scope in content, methods, and analytical tools that has made it so special and successful all of these years. Moving forward, we will be rigorously implementing a number of procedures for openness and transparency consistent with the Transparency and Openness Promotion (TOP) Guidelines.

The TOP Guidelines are organized into eight standards, each of which can be implemented at three levels of stringency (Nosek et al., 2015). In what follows, we outline the initial TOP Standards Levels adopted by the Journal and the associated rationale. Generally, we have adopted Level 2 standards, as we believe these compel a high degree of openness and transparency without being overly onerous or deterring authors interested in the Journal as an outlet for their work….”
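
As a rough illustration of that structure: the eight standard names and the Level 0-3 stringency scale come from the TOP Guidelines (Nosek et al., 2015), while the uniform Level 2 assignment below is a placeholder reflecting the editorial's "generally Level 2" statement, not the Journal's exact per-standard policy.

```python
# The eight TOP standards (Nosek et al., 2015). Each can be adopted
# at Level 0 (not implemented) through Level 3 (most stringent).
TOP_STANDARDS = [
    "citation standards",
    "data transparency",
    "analytic methods (code) transparency",
    "research materials transparency",
    "design and analysis transparency",
    "preregistration of studies",
    "preregistration of analysis plans",
    "replication",
]

# Placeholder policy: "generally Level 2" across the board.
adopted_levels = {standard: 2 for standard in TOP_STANDARDS}
```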

NISO’s Recommended Practice on Reproducibility Badging and Definitions Now Published | Industry Announcements and Events SSP-L

“The National Information Standards Organization (NISO) today announces the publication of its Recommended Practice, RP-31-2021, Reproducibility Badging and Definitions. Developed by the NISO Taxonomy, Definitions, and Recognition Badging Scheme Working Group, this new Recommended Practice provides a set of recognition standards that can be deployed across scholarly publishing outputs, to easily recognize and reward the sharing of data and methods….”

Badge Detail – Badgr

“This digital credential (Open Badge) recognises the completion of the online course “Open Badges for Open Science” at Foundations Level and certifies that the owner of this credential has attained these learning outcomes: 1. Report on how you understand the concept of Open Badges. Focus on the aims and practical uses of Open Badges. 2. Present/visualise the context of the Open Badges including history and organisations involved. 3. Create a portfolio of application fields and good practice examples of Open Badges which are relevant to you and your work. Designed by the OBERRED Erasmus+ project, the Foundations Level of the MOOC “Open Badges for Open Science” provides researchers, practitioners, educators, students and other stakeholders in the field of Research Data Management (RDM) with skills and knowledge in Open Badges which are relevant for successful engagement in Open Science.”

Full article: To share or not to share – 10 years of European Journal of Psychotraumatology

Abstract: The European Journal of Psychotraumatology, owned by the European Society for Traumatic Stress Studies (ESTSS), launched as one of the first full Open Access ‘specialist’ journals in its field. Has this Open Access model worked, in terms of how the Journal has performed? With the European Journal of Psychotraumatology celebrating its ten-year anniversary, we look back at the past decade of sharing our research with the world, consider how the journal sits within the broader movement beyond Open Access to Open Research, and present new policies we have adopted to move the field of psychotraumatology to the next level of Open Research. While we as researchers now make our publications more often freely available to all, how often do we share our protocols, our statistical analysis plans, or our data? We all gain from more transparency and reproducibility, and big steps are being made in this direction. The journal’s decennial performance, as well as the exciting new Open Research developments, are presented in this editorial. The journal is no longer in its infancy and is eager to step into the next decade of Open Research.

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study | Royal Society Open Science

Abstract:  For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
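
As a rough sketch of the checks such a study implies, in Python: the >10% relative-difference threshold for a "major numerical discrepancy" comes from the abstract, while the Wilson score interval is one common choice for the reported binomial confidence intervals, not necessarily the authors' exact method.

```python
from math import sqrt

def major_discrepancy(reported: float, reproduced: float,
                      tol: float = 0.10) -> bool:
    """Flag a 'major numerical discrepancy': relative difference > 10%."""
    if reported == 0:
        return reproduced != 0
    return abs(reproduced - reported) / abs(reported) > tol

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# e.g. the 37 remaining discrepancies out of 789 checked values
lo, hi = wilson_ci(37, 789)
print(f"{37 / 789:.0%} [{lo:.0%}, {hi:.0%}]")  # ~5% [3%, 6%]
```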