Badge Detail – Badgr

“This digital credential (Open Badge) recognises the completion of the online course “Open Badges for Open Science” at Foundations Level and certifies that the owner of this credential has attained these learning outcomes: 1. Report on how you understand the concept of Open Badges. Focus on the aims and practical uses of Open Badges. 2. Present/visualise the context of the Open Badges including history and organisations involved. 3. Create a portfolio of application fields and good practice examples of Open Badges which are relevant to you and your work. Designed by the OBERRED Erasmus+ project, the Foundations Level of the MOOC “Open Badges for Open Science” provides researchers, practitioners, educators, students and other stakeholders in the field of Research Data Management (RDM) with skills and knowledge in Open Badges which are relevant for successful engagement in Open Science.”
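For readers unfamiliar with the standard behind such credentials: an Open Badge is, at bottom, a small piece of verifiable JSON metadata. Below is a minimal sketch of an Open Badges 2.0 assertion, written in Python for concreteness; every URL, the email address, and the salt are hypothetical placeholders, not OBERRED's actual badge metadata.

# Minimal sketch of an Open Badges 2.0 "hosted" assertion.
# All URLs, the email, and the salt are hypothetical placeholders.
import hashlib
import json

salt = "deadsea"                    # hypothetical salt published with the assertion
email = "learner@example.org"       # hypothetical badge earner

assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/oberred-foundations-123.json",
    "recipient": {
        "type": "email",
        "hashed": True,
        "salt": salt,
        # the earner's identity is stored as a salted SHA-256 hash, not in the clear
        "identity": "sha256$" + hashlib.sha256((email + salt).encode()).hexdigest(),
    },
    # points at the BadgeClass describing the course, its criteria, and its issuer
    "badge": "https://example.org/badges/open-badges-for-open-science-foundations.json",
    "verification": {"type": "hosted"},
    "issuedOn": "2020-08-01T00:00:00Z",
}

print(json.dumps(assertion, indent=2))

Because the assertion lives at a stable URL and hashes the recipient identity, anyone can verify who issued the credential and to whom, without a central registry.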

 


To share or not to share – 10 years of European Journal of Psychotraumatology

Abstract:  The European Journal of Psychotraumatology, owned by the European Society for Traumatic Stress Studies (ESTSS), launched as one of the first fully Open Access ‘specialist’ journals in its field. Has this Open Access model worked for the Journal? With the European Journal of Psychotraumatology celebrating its ten-year anniversary, we look back at the past decade of sharing our research with the world and at how the journal sits within the broader movement beyond Open Access to Open Research, and we present new policies we have adopted to move the field of psychotraumatology to the next level of Open Research. While we as researchers now make our publications freely available to all more often, how often do we share our protocols, our statistical analysis plans, or our data? We all gain from more transparency and reproducibility, and big steps are being made in this direction. The journal’s decennial performance as well as the exciting new Open Research developments are presented in this editorial. The journal is no longer in its infancy and is eager to step into the next decade of Open Research.

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study | Royal Society Open Science

Abstract:  For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
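The study's key threshold, a "major numerical discrepancy" being a reproduced value that differs from the reported one by more than 10%, is easy to operationalize. A minimal sketch of that check follows; the value pairs are invented, and the handling of a reported value of zero is our assumption, not the authors' protocol.

# Sketch of the >10% "major numerical discrepancy" check from the abstract.
# The (reported, reproduced) pairs below are invented for illustration.

def is_major_discrepancy(reported: float, reproduced: float, threshold: float = 0.10) -> bool:
    """Flag reproduced values differing from the reported value by more than 10%."""
    if reported == 0:
        # Assumption: any nonzero deviation from a reported zero is flagged
        return reproduced != 0
    return abs(reproduced - reported) / abs(reported) > threshold

checked = [(0.42, 0.43), (12.0, 14.1), (3.14, 3.15), (0.05, 0.02)]
flagged = [pair for pair in checked if is_major_discrepancy(*pair)]
print(f"{len(flagged)} major discrepancies out of {len(checked)} checked values")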

 

FAIR metrics and certification, rewards and recognition, skills and training: FAIRsFAIR contribution to the EOSC Strategic Research and Innovation Agenda | FAIRsFAIR

“FAIRsFAIR is a key contributor to the ongoing development of global standards for FAIR data and repository certification and to the policies and practices that will turn the EOSC programme into a functioning infrastructure. The project strongly endorses all of the guiding principles already identified as relevant to implementing the EOSC vision, with a special emphasis on the importance of FAIR-by-design tools. The guiding principles are a multi-stakeholder approach; data as open as possible and as closed as necessary; implementation of a Web of FAIR data and related services for science; federation of existing research infrastructures; and the need for machine-run algorithms transparent to researchers….”

Open science badges are coming | by Bruce Caron | Aug 2020 | Medium

“The notion of using open digital badges to acknowledge certain practices and learning achievements has been circulating in the open science endeavor for more than a decade. Over these years, this has become a perennial “near future” augmentation/implementation of how open science can recognize and reward practices and skills. Instead of using game-able metrics that rank individuals as though they were in a race, badges can promote active learning, current standards, professional development, and research quality assurance.

The transition from arbitrarily scarce reputation markers (impact metrics, prizes, awards) to universally available recognition markers also helps to level the ground on which careers can be built across the global republic of science. Every scientist who wants to take the time and effort to earn a badge for achieving some level of, say, research-data reusability, or graduate-student mentorship, can then show off this badge to the world. Every student/scientist who acquires a specific skill (R programming, software reusability, statistics, etc.) can add a new badge to their CV….”

Improving transparency and scientific rigor in academic publishing – Prager – 2019 – CANCER REPORTS – Wiley Online Library

“3.5.7 Registered reports and open practices badges

One possible way to incorporate all the information listed above and to combat the stigma against papers that report nonsignificant findings is through the implementation of Registered Reports or rewarding transparent research practices. Registered Reports are empirical articles designed to eliminate publication bias and incentivize best scientific practice: the methods and proposed analyses are preregistered and reviewed prior to the research being conducted. This format is designed to minimize bias, while also allowing complete flexibility to conduct exploratory (unregistered) analyses and report serendipitous findings. The cornerstone of the Registered Reports format is that the authors submit as a Stage 1 manuscript an introduction, complete and transparent methods, and the results of any pilot experiments (where applicable) that motivate the research proposal, written in the future tense. These proposals include a description of the key research question and background literature, hypotheses, experimental design and procedures, analysis pipeline, a statistical power analysis, and a full description of the planned comparisons. Submissions are reviewed by editors, peer reviewers, and, in some journals, statistical editors; those meeting the rigorous and transparent requirements for conducting the proposed research are offered an in-principle acceptance, meaning that the journal guarantees publication if the authors conduct the experiment in accordance with their approved protocol. Many journals publish the Stage 1 report, which could be beneficial not only for citations, but for the authors’ progress reports and tenure packages. Following data collection, the authors prepare and resubmit a Stage 2 manuscript that includes the introduction and methods from the original submission plus their obtained results and discussion. The manuscript then undergoes full review; referees will consider whether the data test the authors’ proposed hypotheses by satisfying the approved outcome-neutral conditions, will ensure the authors adhered precisely to the registered experimental procedures, and will review any unregistered post hoc analyses added by the authors to confirm they are justified, methodologically sound, and informative. At this stage, the authors must also share their data (see also Wiley’s Data Sharing and Citation Policy) and analysis scripts on a public and freely accessible archive such as Figshare and Dryad or at the Open Science Framework. Additional details, including template reviewer and author guidelines, can be found via the Open Science Framework from the Center for Open Science (see also ref. 94).

The authors who practice transparent and rigorous science should be recognized for this work. Funders can encourage and reward open practice in significant ways (see https://wellcome.ac.uk/what-we-do/our-work/open-research). One way journals can support this is to award badges to the authors in recognition of these open scientific practices. Badges certify that a particular practice was followed, but do not define good practice. As defined by the Open Science Framework, three badges can be earned. The Open Data badge is earned for making publicly available the digitally shareable data necessary to reproduce the reported results. These data must be accessible via an open-access repository, and must be permanent (e.g., a registration on the Open Science Framework, or an independent repository at www.re3data.org). The Open Materials badge is earned when the components of the research methodology needed to reproduce the reported procedure and analysis are made publicly available. The Preregistered badge is earned for having a preregistered design, whereas the Preregistered+Analysis Plan badge is earned for having both a preregistered research design and an analysis plan for the research; the authors must report results according to that plan. Additional information about the badges, including the necessary information to be awarded a badge, can be found via the Open Science Framework from the Center for Open Science….”
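Stripped of editorial detail, the two-stage gate quoted above has a simple control flow: review of the protocol happens before data exist, in-principle acceptance commits the journal, and the Stage 2 decision turns on protocol adherence and openness rather than on the results. The sketch below restates that flow purely as a schematic; the field names and decision strings are our own shorthand, not any journal's actual system.

# Schematic restatement of the two-stage Registered Reports workflow.
# Field names and decision strings are our own shorthand, not a real system.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RegisteredReport:
    stage1_protocol: str                      # question, hypotheses, methods, analysis plan
    in_principle_acceptance: bool = False     # granted after Stage 1 review
    stage2_results: Optional[str] = None
    unregistered_deviations: List[str] = field(default_factory=list)
    data_and_scripts_shared: bool = False     # e.g. Figshare, Dryad, or OSF

    def stage1_review(self, reviewers_satisfied: bool) -> None:
        # In-principle acceptance: publication is guaranteed if the approved
        # protocol is followed, whatever the results turn out to be.
        self.in_principle_acceptance = reviewers_satisfied

    def stage2_decision(self) -> str:
        if not self.in_principle_acceptance:
            return "reject: no in-principle acceptance on record"
        if self.unregistered_deviations:
            return "revise: justify deviations or label them exploratory"
        if not self.data_and_scripts_shared:
            return "revise: deposit data and analysis scripts openly"
        return "accept: outcome-neutral conditions satisfied"

report = RegisteredReport(stage1_protocol="preregistered design and analysis plan")
report.stage1_review(reviewers_satisfied=True)
report.data_and_scripts_shared = True
print(report.stage2_decision())  # accept: outcome-neutral conditions satisfied

The order of the checks is the point: nothing in the Stage 2 decision asks whether the findings were statistically significant.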

Open Metrics Require Open Infrastructure

“Today, Zenodo announced their intentions to remove the altmetrics.com badges from their landing pages, and we couldn’t be more energized by their commitment to open infrastructure, supporting their mission to make scientific information open and free.

“We strongly believe that metadata about records including citation data & other data used for computing metrics should be freely available without barriers” – Zenodo Leadership….

In light of emerging needs for metrics and our work at Make Data Count (MDC) to build open infrastructure for data metrics, we believe that it is necessary for corporations or entities that provide analytics and researcher tools to share the raw data sources behind their work. In short, if we trust these metrics enough to display on our websites or add to our CVs, then we should also demand that they be available for us to audit….

These principles are core to our mission to build the infrastructure for open data metrics. As emphasis shifts in scholarly communication toward “other research outputs” beyond the journal article, we believe it is important to build intentionally open infrastructure, not repeating mistakes made in the metrics systems developed for articles. We know that it is possible for the community to come together and develop the future of open metrics, in a non-prescriptive manner, and importantly built on completely open and reproducible infrastructure.”
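Zenodo's position is testable in practice: metadata for published records is exposed through its public REST API with no authentication required (documented at https://developers.zenodo.org). A minimal sketch follows; the record ID is a placeholder, and the exact shape of the statistics block is our assumption from the current API rather than a guarantee.

# Minimal sketch: fetch open record metadata from Zenodo's public REST API.
# The record ID is a placeholder; no API token is needed for public records.
import requests

record_id = "1234567"  # hypothetical Zenodo record ID
resp = requests.get(f"https://zenodo.org/api/records/{record_id}", timeout=30)
resp.raise_for_status()
record = resp.json()

# Descriptive metadata and usage counters arrive in the same open payload,
# so the numbers behind any displayed metric can be audited directly.
print(record["metadata"]["title"])
print(record.get("stats", {}))  # views/downloads counters, if present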

Open Science Practices at the Journal of Traumatic Stress – Kerig – 2020 – Journal of Traumatic Stress – Wiley Online Library

Abstract:  This editorial describes new initiatives designed to promote and maintain open science practices (OSP) at the Journal of Traumatic Stress, to be enacted beginning January 2020. Following a brief description of the rationale underlying the argument for conducting and reporting research in ways that maximize transparency and replicability, this article summarizes changes in Journal submission and publication procedures that are designed to foster and highlight such practices. These include requesting an Open Science Practices Statement from authors of all accepted manuscripts, which will be published as supplementary material for each article, and providing authors with the opportunity to earn OSP badges for preregistering studies, making data available to other researchers by posting on a third party archive, and making available research materials and codes used in the study.