Improving transparency and scientific rigor in academic publishing – Prager – 2019 – CANCER REPORTS – Wiley Online Library

“3.5.7 Registered reports and open practices badges

One possible way to incorporate all the information listed above and to combat the stigma against papers that report nonsignificant findings is through the implementation of Registered Reports or the rewarding of transparent research practices. Registered Reports are a form of empirical article designed to eliminate publication bias and incentivize best scientific practice: the methods and proposed analyses are preregistered and reviewed prior to the research being conducted. This format is designed to minimize bias, while also allowing complete flexibility to conduct exploratory (unregistered) analyses and report serendipitous findings. The cornerstone of the Registered Reports format is that the authors submit as a Stage 1 manuscript an introduction, complete and transparent methods, and the results of any pilot experiments (where applicable) that motivate the research proposal, written in the future tense. These proposals include a description of the key research question and background literature, hypotheses, experimental design and procedures, analysis pipeline, a statistical power analysis, and a full description of the planned comparisons. Submissions are reviewed by editors, peer reviewers, and, in some journals, statistical editors; those meeting the rigorous and transparent requirements for conducting the proposed research are offered an in-principle acceptance, meaning that the journal guarantees publication if the authors conduct the experiment in accordance with their approved protocol. Many journals publish the Stage 1 report, which can be beneficial not only for citations but also for the authors’ progress reports and tenure packages. Following data collection, the authors prepare and resubmit a Stage 2 manuscript that includes the introduction and methods from the original submission plus their obtained results and discussion. The manuscript then undergoes full review: referees consider whether the data test the authors’ proposed hypotheses by satisfying the approved outcome-neutral conditions, ensure the authors adhered precisely to the registered experimental procedures, and review any unregistered post hoc analyses added by the authors to confirm they are justified, methodologically sound, and informative. At this stage, the authors must also share their data (see also Wiley’s Data Sharing and Citation Policy) and analysis scripts in a public, freely accessible archive such as Figshare, Dryad, or the Open Science Framework. Additional details, including template reviewer and author guidelines, are available from the Center for Open Science via the Open Science Framework (see also 94).

Authors who practice transparent and rigorous science should be recognized for this work. Funders can encourage and reward open practice in significant ways (see https://wellcome.ac.uk/what-we-do/our-work/open-research). One way journals can support this is to award badges to authors in recognition of these open scientific practices. Badges certify that a particular practice was followed, but do not define good practice. As defined by the Open Science Framework, three badges can be earned. The Open Data badge is earned for making publicly available the digitally shareable data necessary to reproduce the reported results. These data must be accessible via an open-access repository, and must be permanent (e.g., a registration on the Open Science Framework, or an independent repository at www.re3data.org). The Open Materials badge is earned when the components of the research methodology needed to reproduce the reported procedure and analysis are made publicly available. The Preregistered badge is earned for having a preregistered design, whereas the Preregistered+Analysis Plan badge is earned for having both a preregistered research design and an analysis plan for the research; the authors must report results according to that plan. Additional information about the badges, including the information necessary to be awarded a badge, is available from the Center for Open Science via the Open Science Framework….”
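The statistical power analysis required in a Stage 1 submission is, in practice, a short calculation that can itself be shared as part of the registered protocol. Below is a minimal sketch in Python, assuming a simple two-group comparison; the effect size, alpha, and power target are illustrative assumptions, not values taken from any article excerpted here.

```python
# A priori power analysis of the kind a Stage 1 Registered Report
# preregisters: the per-group sample size needed to detect an assumed
# effect. All numbers below are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,          # assumed Cohen's d (a medium effect)
    alpha=0.05,               # two-sided significance level
    power=0.80,               # desired probability of detecting the effect
    alternative="two-sided",
)
print(f"Required sample size per group: {n_per_group:.1f}")  # ~63.8
```

Registering the calculation as code, rather than reporting only its result, lets Stage 2 referees verify that the achieved sample size matches the approved plan.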

Open Metrics Require Open Infrastructure

“Today, Zenodo announced their intention to remove the altmetrics.com badges from their landing pages, and we couldn’t be more energized by their commitment to open infrastructure, supporting their mission to make scientific information open and free.

“We strongly believe that metadata about records including citation data & other data used for computing metrics should be freely available without barriers” – Zenodo Leadership….

In light of emerging needs for metrics and our work at Make Data Count (MDC) to build open infrastructure for data metrics, we believe that it is necessary for corporations or entities that provide analytics and researcher tools to share the raw data sources behind their work. In short, if we trust these metrics enough to display them on our websites or add them to our CVs, then we should also demand that they be available for us to audit….

These principles are core to our mission to build the infrastructure for open data metrics. As emphasis in scholarly communication shifts toward “other research outputs” beyond the journal article, we believe it is important to build intentionally open infrastructure, not repeating the mistakes made in the metrics systems developed for articles. We know that it is possible for the community to come together and develop the future of open metrics in a non-prescriptive manner and, importantly, to build it on completely open and reproducible infrastructure.”
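As a concrete illustration of the auditability Make Data Count argues for, here is a minimal sketch of fetching the raw data behind a citation metric from an openly documented source, the public Crossref REST API, instead of trusting an opaque badge. The sketch is ours, not MDC’s: the endpoint and the is-referenced-by-count field are part of Crossref’s documented public API, and the DOI is only an example.

```python
# Pull the raw metadata behind a citation count from Crossref's open
# REST API, so the number shown on a badge can be independently re-checked.
import json
import urllib.request

doi = "10.1103/PhysRevLett.116.061102"  # example DOI; substitute any record
url = f"https://api.crossref.org/works/{doi}"

with urllib.request.urlopen(url) as resp:
    record = json.load(resp)["message"]

print(record.get("title", ["(no title)"])[0])
print("Citations (is-referenced-by-count):",
      record.get("is-referenced-by-count", "n/a"))
```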

Open Science Practices at the Journal of Traumatic Stress – Kerig – 2020 – Journal of Traumatic Stress – Wiley Online Library

Abstract:  This editorial describes new initiatives designed to promote and maintain open science practices (OSP) at the Journal of Traumatic Stress, to be enacted beginning January 2020. Following a brief description of the rationale underlying the argument for conducting and reporting research in ways that maximize transparency and replicability, this article summarizes changes in Journal submission and publication procedures that are designed to foster and highlight such practices. These include requesting an Open Science Practices Statement from the authors of all accepted manuscripts, which will be published as supplementary material for each article, and providing authors with the opportunity to earn OSP badges for preregistering studies, making data available to other researchers by posting them on a third-party archive, and making available the research materials and code used in the study.

SSHOC WEBINAR: How to improve the quality of your repository? SSHOC and certification of repositories | DARIAH

“Certification is a sign of trust that benefits a data repository in many ways. How can your repository achieve certification? The SSHOC webinar will focus on the certification of digital repositories and how your repository can apply for the CoreTrustSeal. The webinar will also touch upon how SSHOC can support repositories seeking certification.

CoreTrustSeal is a community-driven certification framework with over 80 past certifications. The certification consists of sixteen requirements, for which applicants are asked to provide self-assessment statements along with relevant evidence. CoreTrustSeal certification is sufficiently stringent for data repositories within the social sciences and humanities, but significantly less costly and labour-intensive than a formal audit against ISO/DIN standards. Certification requirements for the CoreTrustSeal are also reviewed every three years, compared with every five years for ISO/DIN standards. CoreTrustSeal is open to feedback and is continuously considering the widest possible range of certification candidates….”

Did awarding badges increase data sharing in BMJ Open? A randomized controlled trial | Royal Society Open Science

Abstract:  Sharing data and code are important components of reproducible research. Data sharing in research is widely discussed in the literature; however, there are no well-established, evidence-based incentives that reward data sharing, nor randomized studies that demonstrate the effectiveness of data sharing policies at increasing data sharing. A simple incentive, such as an Open Data Badge, might provide the change needed to increase data sharing in health and medical research. This study was a parallel-group randomized controlled trial (protocol registration: doi:10.17605/OSF.IO/PXWZQ) with two groups, control and intervention, with 80 research articles published in BMJ Open per group, for a total of 160 research articles. The intervention group received an email offering an Open Data Badge if the authors shared their data along with their final publication, while the control group received a similar email with no offer of a badge for sharing their data. The primary outcome was the data sharing rate. Badges did not noticeably motivate researchers who published in BMJ Open to share their data; the odds of awarding badges were nearly equal in the intervention and control groups (odds ratio = 0.9, 95% CI [0.1, 9.0]). Data sharing rates were low in both groups, with just two datasets shared in each of the intervention and control groups. The global movement towards open science has made significant gains with the development of numerous data sharing policies and tools. What remains to be established is an effective incentive that motivates researchers to take up such tools to share their data.
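For readers less familiar with odds ratios, the arithmetic behind the trial’s headline figure is simple enough to check by hand. The sketch below computes an unadjusted odds ratio and a standard Wald 95% confidence interval from the counts reported in the abstract (2 sharers out of 80 articles in each arm); the paper’s reported 0.9 presumably reflects its own estimation approach, so this is a sanity check, not a reproduction.

```python
# Unadjusted odds ratio and Wald 95% CI for a 2x2 table, using the
# counts reported in the abstract: 2 of 80 articles shared data per arm.
import math

a, b = 2, 78   # intervention arm: shared / did not share
c, d = 2, 78   # control arm:      shared / did not share

or_hat = (a * d) / (b * c)             # = 1.0 when the arms are identical
se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se)
hi = math.exp(math.log(or_hat) + 1.96 * se)
print(f"OR = {or_hat:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# -> OR = 1.00, 95% CI [0.14, 7.28]: a wide interval, consistent with
#    the paper's conclusion that no badge effect was detected
```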

Supporting journal publishing practices in the global south | Research Information

“Journals in the developing world face challenges in becoming known and respected in the international research landscape. Siân Harris describes Journal Publishing Practices and Standards, established and managed by African Journals Online and INASP.”

OSF | Badges to Acknowledge Open Practices Wiki

“A “PA” (Protected Access) notation may be added to open data badges if sensitive, personal data are available only from an approved third-party repository that manages access to the data for qualified researchers through a documented process. To be eligible for an open data badge with such a notation, the repository must publicly describe the steps necessary to obtain the data, and detailed data documentation (e.g., variable names and allowed values) must be made available publicly. This notation is not available to researchers who state that they will make “data available upon request,” and is not available if requests for data sharing are evaluated on any criteria beyond considerations for compliance with proper handling of sensitive data. For example, this notation is not available if limitations are placed on the permitted use of the data, such as for data that are only made available for the purposes of replicating previously published results or for which there is substantive review of analytical results. Review of results to avoid disclosure of confidential information is permissible….”
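The requirement that detailed data documentation (variable names and allowed values) be public is easy to satisfy with a machine-readable codebook published alongside the access instructions. A minimal sketch follows; every variable name and value range in it is invented for illustration, not drawn from any real protected-access dataset.

```python
# Write a minimal machine-readable codebook: variable names, types, and
# allowed values stay public even though the data sit behind managed access.
# All entries below are invented for illustration.
import csv

codebook = [
    {"variable": "participant_id", "type": "string",
     "allowed_values": "anonymized ID (P0001-P9999)"},
    {"variable": "age_band", "type": "categorical",
     "allowed_values": "18-29; 30-44; 45-64; 65+"},
    {"variable": "symptom_score", "type": "integer",
     "allowed_values": "0-80 (questionnaire total)"},
]

with open("codebook.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=codebook[0].keys())
    writer.writeheader()
    writer.writerows(codebook)
```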

Open Science Comes To Policy Analysis – CEGA – Medium

“This post is co-authored by Fernando Hoces de la Guardia, BITSS postdoctoral scholar, along with Sean Grant (Associate Behavioral and Social Scientist at RAND) and CEGA Faculty Director Ted Miguel. It is cross-posted with the BITSS Blog.

The Royal Society’s motto, “Take nobody’s word for it,” reflects a key principle of scientific inquiry: as researchers, we aspire to discuss ideas in the open, to examine our analyses critically, to learn from our mistakes, and to constantly improve. This type of thinking shouldn’t guide only the creation of rigorous evidence; rather, it should extend to the work of policy analysts whose findings may affect very large numbers of people. At the end of the day, a commitment to scientific rigor in public policy analysis is the only durable response to potential attacks on credibility. We, the three authors of this blog (Fernando Hoces de la Guardia, Sean Grant, and Ted Miguel), recently published a working paper suggesting a parallel between the reproducibility crisis in social science and observed threats to the credibility of public policy analysis. Researchers and policy analysts both perform empirical analyses; have a large amount of undisclosed flexibility when collecting, analyzing, and reporting data; and may face strong incentives to obtain “desired” results (for example, p-values of <0.05 in research, or large negative/positive effects in policy analysis)….”

APPRAISE (A Post-Publication Review and Assessment In Science Experiment) | ASAPbio

“I describe here a new project – called Appraise – that is both a model and an experimental platform for what peer review can and should look like in a world without journals….

The rise of preprints gives us the perfect opportunity to create a new system that takes full advantage of the Internet to more rapidly, effectively and fairly engage the scientific community in assessing the validity, audience and impact of published works….

APPRAISE (A Post-Publication Review and Assessment In Science Experiment)…

It is perhaps easiest to think of Appraise as an editorial board without a journal (and we hope to be a model for how existing editorial boards can transition away from journals). Like journal editorial boards, Appraise members will curate the scientific literature through the critical process of peer review. However, members of Appraise will not be reviewing papers submitted to a journal and deciding whether they should be published. Rather, Appraise reviewers work in service of members of the scientific community, selecting papers they think warrant scrutiny and attention, and reviewing them to help others find, understand, and assess published papers….

In the spirit of openness we encourage Appraise members to identify themselves, but recognize that the ability to speak freely sometimes requires anonymity. Appraise will allow members to post reviews anonymously provided that there are no conflicts of interest and the reviewer does not use anonymity as a shield for inappropriate behavior. Whether reviewers are publicly identified or not, Appraise will never tolerate personal attacks of any kind.

We are launching Appraise with a small group of scientists. This is for purely practical purposes – to develop our systems and practices without the challenges of managing a large, open community. But the goal is to open the platform up to everyone as quickly as possible.”

APA releases new journal article reporting standards

“Brian Nosek, PhD, co-founder and director of the Center for Open Science, welcomed the new standards. “Achieving the ideals of transparency in science requires knowing what one needs to be transparent about,” he said. “These updated standards will improve readers’ understanding of what happened in the research. This will improve both the accuracy of interpretation of the existing evidence and the ability to replicate and extend the findings to improve understanding.” APA has partnered with the Center for Open Science to advance open science practices in psychological research through open science badges on articles, a data repository for APA-published articles, and the designation of COS’s PsyArXiv as the preferred preprint server for APA titles….”