FAIR metrics and certification, rewards and recognition, skills and training: FAIRsFAIR contribution to the EOSC Strategic Research and Innovation Agenda | FAIRsFAIR

“FAIRsFAIR is a key contributor to the ongoing development of global standards for FAIR data and repository certification and to the policies and practices that will turn the EOSC programme into a functioning infrastructure. The project strongly endorses all of the guiding principles already identified as relevant to implementing the EOSC vision, with a special emphasis on the importance of FAIR-by-design tools. The guiding principles are a multi-stakeholder approach; data as open as possible and as closed as necessary; implementation of a Web of FAIR data and related services for science; federation of existing research infrastructures; and the need for machine-run algorithms transparent to researchers….”

Open science badges are coming. “A ‘badge’ is a symbol or indicator of… | by Bruce Caron | Aug, 2020 | Medium

“The notion of using open digital badges to acknowledge certain practices and learning achievements has been circulating in the open science endeavor for more than a decade. Over these years, this has become a perennial “near future” augmentation/implementation of how open science can recognize and reward practices and skills. Instead of using game-able metrics that rank individuals as though they were in a race, badges can promote active learning, current standards, professional development, and research quality assurance.

The transition from arbitrarily scarce reputation markers (impact metrics, prizes, awards) to universally available recognition markers also helps to level the ground on which careers can be built across the global republic of science. Every scientist who wants to take the time and effort to earn a badge for achieving some level of, say, research-data reusability, or graduate-student mentorship, can then show off this badge to the world. Every student/scientist who acquires a specific skill (R programming, software reusability, statistics, etc.) can add a new badge to their CV….”

Improving transparency and scientific rigor in academic publishing – Prager – 2019 – CANCER REPORTS – Wiley Online Library

“3.5.7 Registered reports and open practices badges

One possible way to incorporate all the information listed above, and to combat the stigma against papers that report nonsignificant findings, is the implementation of Registered Reports or the rewarding of transparent research practices. Registered Reports are empirical articles, designed to eliminate publication bias and incentivize best scientific practice, in which the methods and proposed analyses are preregistered and reviewed prior to the research being conducted. This format is designed to minimize bias while still allowing complete flexibility to conduct exploratory (unregistered) analyses and report serendipitous findings. The cornerstone of the Registered Reports format is that the authors submit as a Stage 1 manuscript an introduction, complete and transparent methods, and the results of any pilot experiments (where applicable) that motivate the research proposal, written in the future tense. These proposals include a description of the key research question and background literature, hypotheses, experimental design and procedures, the analysis pipeline, a statistical power analysis, and a full description of the planned comparisons. Submissions are reviewed by editors, peer reviewers, and, in some journals, statistical editors; those meeting the rigorous and transparent requirements for conducting the proposed research are offered an in-principle acceptance, meaning that the journal guarantees publication if the authors conduct the experiment in accordance with their approved protocol. Many journals publish the Stage 1 report, which can be beneficial not only for citations but also for the authors’ progress reports and tenure packages.

Following data collection, the authors prepare and resubmit a Stage 2 manuscript that includes the introduction and methods from the original submission plus the obtained results and discussion. The manuscript then undergoes full review: referees consider whether the data test the authors’ proposed hypotheses by satisfying the approved outcome-neutral conditions, ensure the authors adhered precisely to the registered experimental procedures, and review any unregistered post hoc analyses added by the authors to confirm they are justified, methodologically sound, and informative. At this stage, the authors must also share their data (see also Wiley’s Data Sharing and Citation Policy) and analysis scripts in a public, freely accessible archive such as Figshare or Dryad, or on the Open Science Framework. Additional details, including template reviewer and author guidelines, can be found via the link to the Open Science Framework from the Center for Open Science (see also 94).
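
Purely as an editorial illustration (not part of the quoted article), the sketch below models the two-stage workflow just described as a simple state machine; the stage names and transitions are hypothetical labels for the steps in the excerpt, not any journal's actual system.

```python
from enum import Enum, auto

class RRStage(Enum):
    """Stages of a Registered Report, per the excerpt above (names are hypothetical)."""
    STAGE1_SUBMITTED = auto()       # intro, full methods, pilot results, planned analyses
    IN_PRINCIPLE_ACCEPTED = auto()  # journal commits to publish if the protocol is followed
    STAGE2_SUBMITTED = auto()       # Stage 1 content plus obtained results and discussion
    PUBLISHED = auto()

# Stage 2 review checks adherence to the approved protocol, the outcome-neutral
# conditions, and whether any unregistered analyses are justified.
TRANSITIONS = {
    RRStage.STAGE1_SUBMITTED: RRStage.IN_PRINCIPLE_ACCEPTED,
    RRStage.IN_PRINCIPLE_ACCEPTED: RRStage.STAGE2_SUBMITTED,
    RRStage.STAGE2_SUBMITTED: RRStage.PUBLISHED,
}

def advance(stage: RRStage) -> RRStage:
    """Move a manuscript to the next stage after a successful review."""
    return TRANSITIONS[stage]
```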

Authors who practice transparent and rigorous science should be recognized for this work. Funders can encourage and reward open practice in significant ways (see https://wellcome.ac.uk/what-we-do/our-work/open-research). One way journals can support this is to award badges to authors in recognition of these open scientific practices. Badges certify that a particular practice was followed, but do not define good practice. As defined by the Open Science Framework, three badges can be earned. The Open Data badge is earned for making publicly available the digitally shareable data necessary to reproduce the reported results. These data must be accessible via an open-access repository and must be permanent (e.g., a registration on the Open Science Framework, or an independent repository at www.re3data.org). The Open Materials badge is earned when the components of the research methodology needed to reproduce the reported procedure and analysis are made publicly available. The Preregistered badge is earned for having a preregistered design, whereas the Preregistered+Analysis Plan badge is earned for having both a preregistered research design and an analysis plan for the research; the authors must report results according to that plan. Additional information about the badges, including the information necessary to be awarded a badge, can be found by following this link to the Open Science Framework from the Center for Open Science….”
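
As an editorial aside, the badge criteria quoted above map naturally onto a small rule set. The sketch below is a hypothetical encoding, not the Open Science Framework's actual award process; every field and function name is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Submission:
    """Hypothetical record of a manuscript's open-science artefacts."""
    data_in_open_repository: bool      # data in a permanent, open-access repository
    materials_public: bool             # methodology components shared publicly
    design_preregistered: bool         # study design registered before data collection
    analysis_plan_preregistered: bool  # analysis plan registered; results reported per plan

def eligible_badges(s: Submission) -> list[str]:
    """Map the excerpt's criteria onto the badges it names (illustrative only)."""
    badges = []
    if s.data_in_open_repository:
        badges.append("Open Data")
    if s.materials_public:
        badges.append("Open Materials")
    if s.design_preregistered and s.analysis_plan_preregistered:
        badges.append("Preregistered+Analysis Plan")
    elif s.design_preregistered:
        badges.append("Preregistered")
    return badges

print(eligible_badges(Submission(True, False, True, True)))
# ['Open Data', 'Preregistered+Analysis Plan']
```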

Open Metrics Require Open Infrastructure

“Today, Zenodo announced their intention to remove the altmetrics.com badges from their landing pages, and we couldn’t be more energized by their commitment to open infrastructure, supporting their mission to make scientific information open and free.

“We strongly believe that metadata about records including citation data & other data used for computing metrics should be freely available without barriers” – Zenodo Leadership….

In light of emerging needs for metrics and our work at Make Data Count (MDC) to build open infrastructure for data metrics, we believe that it is necessary for corporations or entities that provide analytics and researcher tools to share the raw data sources behind their work. In short, if we trust these metrics enough to display on our websites or add to our CVs, then we should also demand that they be available for us to audit….

These principles are core to our mission to build the infrastructure for open data metrics. As emphasis shifts in scholarly communication toward “other research outputs” beyond the journal article, we believe it is important to build intentionally open infrastructure, not repeating mistakes made in the metrics systems developed for articles. We know that it is possible for the community to come together and develop the future of open metrics, in a non-prescriptive manner, and importantly built on completely open and reproducible infrastructure.”

Open Science Practices at the Journal of Traumatic Stress – Kerig – 2020 – Journal of Traumatic Stress – Wiley Online Library

Abstract: This editorial describes new initiatives designed to promote and maintain open science practices (OSP) at the Journal of Traumatic Stress, to be enacted beginning January 2020. Following a brief description of the rationale for conducting and reporting research in ways that maximize transparency and replicability, this article summarizes changes in Journal submission and publication procedures designed to foster and highlight such practices. These include requesting an Open Science Practices Statement from the authors of all accepted manuscripts, to be published as supplementary material for each article, and providing authors with the opportunity to earn OSP badges for preregistering studies, making data available to other researchers by posting it on a third-party archive, and making available the research materials and code used in the study.

SSHOC WEBINAR: How to improve the quality of your repository? SSHOC and certification of repositories | DARIAH

“Certification is a sign of trust that benefits a data repository in many ways. How can your repository achieve certification? The SSHOC webinar will focus on the certification of digital repositories and how your repository can apply for the CoreTrustSeal. The webinar will also touch upon how SSHOC can support repositories seeking certification.

CoreTrustSeal is a community-driven certification framework with over 80 past certifications. The certification consists of sixteen requirements for which applicants are asked to provide self-assessment statements along with relevant evidence. CoreTrustSeal certification is sufficiently stringent for data repositories within the social sciences and humanities, but significantly less costly and labour-intensive than a formal audit against ISO/DIN standards. Certification requirements for the CoreTrustSeal are also reviewed every three years, compared with every five years for ISO/DIN standards. CoreTrustSeal is open to feedback and is continuously considering the widest possible range of certification candidates….”

Did awarding badges increase data sharing in BMJ Open? A randomized controlled trial | Royal Society Open Science

Abstract: Sharing data and code are important components of reproducible research. Data sharing in research is widely discussed in the literature; however, there are no well-established, evidence-based incentives that reward data sharing, nor randomized studies that demonstrate the effectiveness of data sharing policies at increasing data sharing. A simple incentive, such as an Open Data Badge, might provide the change needed to increase data sharing in health and medical research. This study was a parallel-group randomized controlled trial (protocol registration: doi:10.17605/OSF.IO/PXWZQ) with two groups, control and intervention, with 80 research articles published in BMJ Open per group (160 research articles in total). The intervention group received an email offering an Open Data Badge if they shared their data along with their final publication; the control group received an email with no offer of a badge if they shared their data with their final publication. The primary outcome was the data sharing rate. Badges did not noticeably motivate researchers who published in BMJ Open to share their data; the odds of awarding badges were nearly equal in the intervention and control groups (odds ratio = 0.9, 95% CI [0.1, 9.0]). Data sharing rates were low in both groups, with just two datasets shared in each of the intervention and control groups. The global movement towards open science has made significant gains with the development of numerous data sharing policies and tools. What remains to be established is an effective incentive that motivates researchers to take up such tools to share their data.
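
As an editorial check on the arithmetic: with two datasets shared out of 80 articles in each arm, the raw odds ratio is 1.0 with a very wide Wald confidence interval, in the same neighbourhood as the published estimate (OR = 0.9, 95% CI [0.1, 9.0]); the paper's exact analysis method likely differs slightly. A minimal sketch, assuming the counts given in the abstract:

```python
from math import exp, log, sqrt

def odds_ratio_wald(a: int, b: int, c: int, d: int) -> tuple[float, tuple[float, float]]:
    """Odds ratio and 95% Wald CI for a 2x2 table:
        a = events in intervention, b = non-events in intervention,
        c = events in control,      d = non-events in control.
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Counts from the abstract: 2 of 80 articles shared data in each arm.
or_, (lo, hi) = odds_ratio_wald(a=2, b=78, c=2, d=78)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# OR = 1.00, 95% CI [0.14, 7.28]
```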

Supporting journal publishing practices in the global south | Research Information

“Journals in the developing world face challenges in becoming known and respected in the international research landscape. Siân Harris describes Journal Publishing Practices and Standards, established and managed by African Journals Online and INASP.”

OSF | Badges to Acknowledge Open Practices Wiki

“A “PA” (Protected Access) notation may be added to Open Data badges if sensitive, personal data are available only from an approved third-party repository that manages access to the data for qualified researchers through a documented process. To be eligible for an Open Data badge with such a notation, the repository must publicly describe the steps necessary to obtain the data, and detailed data documentation (e.g., variable names and allowed values) must be made publicly available. This notation is not available to researchers who state that they will make “data available upon request”, and is not available if requests for data sharing are evaluated on any criteria beyond considerations for compliance with proper handling of sensitive data. For example, this notation is not available if limitations are placed on the permitted use of the data, such as for data that are only made available for the purposes of replicating previously published results or for which there is substantive review of analytical results. Review of results to avoid disclosure of confidential information is permissible….”
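
The eligibility rules in this excerpt amount to a small decision procedure. As an editorial illustration only, here is one hypothetical encoding of them; all field names are invented, and this is not an official OSF checklist.

```python
from dataclasses import dataclass

@dataclass
class ProtectedAccessRequest:
    """Hypothetical summary of how a sensitive dataset is made available."""
    held_by_approved_third_party: bool  # approved repository manages access for qualified researchers
    access_process_documented: bool     # steps to obtain the data are publicly described
    documentation_public: bool          # variable names, allowed values, etc. are public
    available_only_on_request: bool     # authors merely state "data available upon request"
    use_restricted_beyond_sensitivity: bool  # e.g. replication-only use, or substantive review
                                             # of results (disclosure review alone is permissible)

def pa_notation_eligible(r: ProtectedAccessRequest) -> bool:
    """Apply the excerpt's criteria for adding a "PA" notation to an Open Data badge."""
    return (
        r.held_by_approved_third_party
        and r.access_process_documented
        and r.documentation_public
        and not r.available_only_on_request
        and not r.use_restricted_beyond_sensitivity
    )
```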