“An interdisciplinary archive of articles focused on improving research transparency and reproducibility. Maintained by The Berkeley Initiative for Transparency in the Social Sciences (BITSS) …”
“The Countway Library of Medicine at the Harvard Medical School has received funding from the NIH Big Data to Knowledge (BD2K) Initiative for Research Education, to develop a Massive Open Online Course (MOOC) on Data Management for Biomedical Big Data. This MOOC is built upon and expands the training materials of the New England Collaborative Data Management Curriculum (NECDMC), developed by several libraries in the New England region.
The Best Practices for Biomedical Research Data Management course is hosted by the Canvas Network and provides training to librarians, biomedical researchers, undergraduate and graduate biomedical students, and other individuals interested in best practices for discoverability, access, integrity, reuse value, privacy, security, and long-term preservation of biomedical research data. The course is free and self-paced….”
“Biomedical research today not only has to be rigorous, innovative, and insightful; it also has to be organized and reproducible. With more capacity to create and store data, there is the challenge of making data discoverable, understandable, and reusable. Many funding agencies and journal publishers are requiring publication of relevant data to promote open science and reproducibility of research.
In order to meet these requirements and evolving trends, researchers and information professionals will need the data management and curation knowledge and skills to support the access, reuse, and preservation of data.
This course is designed to address present and future data management needs….”
The co-authors of this work call themselves the Open Science Collaboration.
“No single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility. Nonetheless, collectively these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings despite using materials provided by the original authors, review in advance for methodological fidelity, and high statistical power to detect the original effect sizes. Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than variation in the characteristics of the teams conducting the research (such as experience and expertise). The latter factors certainly can influence replication success, but they did not appear to do so here.
Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that “we already know this” belies the uncertainty of scientific evidence. Innovation points out paths that are possible; replication points out paths that are likely; progress relies on both. Replication can increase certainty when findings are reproduced and promote innovation when they are not. This project provides accumulating evidence for many findings in psychological research and suggests that there is still more work to do to verify whether we know what we think we know….”
“CODATA-RDA schools changed my career, making me a more responsible researcher but also an Open Science ambassador for the Central American area. I now aspire to be a young researcher who can teach Open and Data Science principles through my job at the University of Costa Rica and through the CODATA-RDA Schools, as well as serve as a mentor for other people who want to learn how to practice Open Science.”
Abstract: Energy policy often builds on insights gained from quantitative energy models and their underlying data. As climate change mitigation and economic concerns drive a sustained transformation of the energy sector, transparent and well-founded analyses are more important than ever. We assert that models and their associated data must be openly available to facilitate higher quality science, greater productivity through less duplicated effort, and a more effective science-policy boundary. There are also valid reasons why data and code are not open: ethical and security concerns, unwanted exposure, additional workload, and institutional or personal inertia. Overall, energy policy research ostensibly lags behind other fields in promoting more open and reproducible science. We take stock of the status quo and propose actionable steps forward for the energy research community to ensure that it can better engage with decision-makers and continues to deliver robust policy advice in a transparent and reproducible way.
“Brian Nosek, PhD, co-founder and director of the Center for Open Science, welcomed the new standards. “Achieving the ideals of transparency in science requires knowing what one needs to be transparent about,” he said. “These updated standards will improve readers’ understanding of what happened in the research. This will improve both the accuracy of interpretation of the existing evidence, and the ability to replicate and extend the findings to improve understanding.” APA has partnered with the Center for Open Science to advance open science practices in psychological research through open science badges on articles, a data repository for APA published articles and designating the COS’ PsyArXiv as the preferred preprint server for APA titles….”
“A growing number of scientists are reporting their methods and data online and in real time, rather than only publishing their most exciting results behind a paywall in some academic journal. It’s called open science, but is nowhere near being the accepted way to carry out scientific research. This has to change. Now. Maintaining public trust in science depends on it….”
“The Science Journals support the Transparency and Openness Promotion (TOP) guidelines to raise the quality of research published in Science and to increase transparency regarding the evidence on which conclusions are based….All data used in the analysis must be available to any researcher for purposes of reproducing or extending the analysis. Data must be available in the paper, deposited in a community special-purpose repository, accessible via a general-purpose repository such as Dryad, or otherwise openly available….”
“Data management has become an increasingly discussed topic among the academic community. Managing data is an element of open science, which has proven to increase dissemination of research and citations for journal articles. Open science increases public access to academic articles, mostly through preprint repositories. Indeed, according to this study, open access (OA) articles are associated with a 36-172% increase in citations compared to non-OA articles. Publishers such as Elsevier have acquired preprint repositories to increase the dissemination of academic research.”