The co-authors of this work call themselves the Open Science Collaboration.
“No single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility. Nonetheless, collectively these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings despite using materials provided by the original authors, review in advance for methodological fidelity, and high statistical power to detect the original effect sizes. Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than variation in the characteristics of the teams conducting the research (such as experience and expertise). The latter factors certainly can influence replication success, but they did not appear to do so here.
Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that “we already know this” belies the uncertainty of scientific evidence. Innovation points out paths that are possible; replication points out paths that are likely; progress relies on both. Replication can increase certainty when findings are reproduced and promote innovation when they are not. This project provides accumulating evidence for many findings in psychological research and suggests that there is still more work to do to verify whether we know what we think we know….”
“Brian Nosek, PhD, co-founder and director of the Center for Open Science, welcomed the new standards. “Achieving the ideals of transparency in science requires knowing what one needs to be transparent about,” he said. “These updated standards will improve readers’ understanding of what happened in the research. This will improve both the accuracy of interpretation of the existing evidence, and the ability to replicate and extend the findings to improve understanding.” APA has partnered with the Center for Open Science to advance open science practices in psychological research through open science badges on articles, a data repository for APA-published articles, and designation of the COS’ PsyArXiv as the preferred preprint server for APA titles….”
“SIPS affirms a set of key values, which are core to the society: … Transparency and openness. Scientific knowledge is independently verifiable; veracity relies only on content, not source. SIPS supports free and open sharing of the process of conducting research (e.g., preregistration), of content generated during research (e.g., code, data, materials), and of reported findings (e.g., open access)….”
“How do you encourage researchers to share the data underlying their publications? The journal Psychological Science introduced a digital badge system in 2014 to signify when authors make the data and related materials accompanying their articles openly available. Criteria to earn the Open Data badge include (1) sharing data via a publicly accessible repository with a persistent identifier, such as a DOI, (2) assigning an open license, such as CC-BY or CC0, allowing reuse and credit to the data producer, and (3) providing enough documentation that another researcher could reproduce the reported results (Badges to Acknowledge Open Practices project on the Open Science Framework)….”
An open letter to the new editor-in-chief of Journal of Personality and Social Psychology: Attitudes and Social Cognition, urging the adoption of best practices for data sharing, reproducibility, and open science.
“There’s this idea that open science will attract more ‘disciples’ if it comes across as having a more positive, inclusive tone. Goodness me, what a load of honking bullshit this is. Open science will attract individual adopters for three reasons: (1) when scientists grow a conscience and appreciate that their public mission demands transparency and reproducibility; (2) when scientists decide to take advantage of individual incentives (career and social) for being open (e.g. Registered Reports, joining SIPS etc.); (3) when funders, journals and institutions make it a requirement. All of these are in progress. The cold embrace of open science by gatekeepers and regulators is in the post – it is only a matter of time before transparent, reproducible practices will be required if you want to spend public money. That’s why I tell early career researchers to get ahead now because the ground is shifting under your feet….”
“The American Psychological Association, the nonprofit publisher of 90 psychology journals, has entered a partnership with the Center for Open Science to offer open science badges to authors, create an APA data repository to ease sharing, and designate a preferred preprint server for APA journal articles.”
Abstract: In this article, we explore the state of the OA market and the current situation with respect to offsetting deals in the Netherlands. We then offer a case study of the LingOA model for a transition to open access, backed by a consortial funding mechanism: the Open Library of Humanities (OLH). We also suggest how this approach can be extended into new disciplinary spaces (in particular, mathematics and psychology, where there is already some willingness from editors).