“Open Access India partners with the Center for Open Science to launch IndiaRxiv on the eve of India’s 73rd Independence Day as the country joins the global march for open science.
Open Access India and the Center for Open Science have collaborated to launch IndiaRxiv, India’s first preprint service. IndiaRxiv began as a vision for a single open platform that could provide free access to all publicly-funded research outputs (publications) from India and to provide Indian scholars with a way to share their scholarly outputs. Today, on the eve of India’s 73rd Independence Day, we are happy to introduce IndiaRxiv. Beginning 15th August, 2019, the preprint service will be open to all researchers and scholars of India and others who are working on issues related to India.
IndiaRxiv is being launched not only for scholars to share their articles and read the work published by their peers, but also to provide public access to the latest research, allowing authors to gather feedback and ideas and build upon existing work….”
“Over its 16 years of experience, Redalyc has promoted collaborative, sustainable, and non-commercial scientific communication for the benefit of Latin American scientific communities, mainly in the Social Sciences and the Humanities, through continuous technological development and support for editors.
In pursuit of this goal, Redalyc celebrates the emergence of Invest in Open Infrastructure (IOI), an initiative that has brought together various institutions (including OPERAS, SPARC, the Center for Open Science and, recently, Redalyc) with the goal of building an open, scalable, and durable scientific infrastructure whose benefits extend to a global scale.
Redalyc is pleased to be part of this initiative, which consolidates its objective of building a collaborative, sustainable, and non-commercial Open Access ecosystem for Latin America….”
“Several years ago I moved to help fill a void I saw in sociology— a need for greater openness and transparency in research practices and publications—something that many scientists in other disciplines were moving to embrace. I founded SocArXiv, an open social science archive for research papers, modeled after arXiv in math and physics and bioRxiv in life sciences. Working with the Center for Open Science and a steering committee of sociologists and librarians (including Chris Bourg), we started accepting papers in 2016, and now host more than 3,000. The work is free to share and read, with links to research materials, and proper archiving and tagging, so it’s accessible and discoverable by anyone.
Since 2016, I’ve had lots of work to do to help build an equitable, open, and durable system of knowledge communication, and it’s work I love. Thanks to the leadership of Chris Bourg, support from a group of libraries from the Association of Research Libraries, and a sabbatical leave from Maryland, in 2018 I had the opportunity to extend that work at MIT’s new Center for Research on Equitable and Open Scholarship (CREOS) as its first visiting scholar….”
“Researchers are trying to fix the problem. They’re encouraging more sharing of data sets and urging each other to preregister their hypotheses—declaring what they intend to find and how they intend to find it. The idea is to cut down on the statistical shenanigans and memory-holing of negative results that got the field into this mess. No more collecting a giant blob of data and then combing through it for a publishable outcome, a practice known as “HARKing”—hypothesizing after results are known.
And self-appointed teams are even going back through old work, manually, to see what holds up and what doesn’t. That means doing the same experiment again, or trying to expand it to see if the effect generalizes. It’s a slog—boring, expensive, and time-consuming. To the Defense Advanced Research Projects Agency, the Pentagon’s mad-science wing, the problem demands an obvious solution: Robots.
A Darpa program called Systematizing Confidence in Open Research and Evidence—yes, SCORE—aims to assign a “credibility score” (see what they did there) to research findings in the social and behavioral sciences, a set of related fields to which the reproducibility crisis has been particularly unkind. In 2017, I called the project a bullshit detector for science, somewhat to the project director’s chagrin. Well, now it’s game on: Darpa has promised $7.6 million to the Center for Open Science, a nonprofit organization that’s leading the charge for reproducibility. COS is going to aggregate a database of 30,000 claims from the social sciences. For 3,000 of those claims, the Center will either attempt to replicate them or subject them to a prediction market—asking human beings to essentially bet on whether the claims would replicate or not. (Prediction markets are pretty good at this; in a study of reproducibility in the social sciences last summer, for example, a betting market and a survey of other researchers performed about as well as actual do-overs of the studies.)…”
“While open access to peer-reviewed publications is important for achieving open science, this is just one part of the solution; the data and study materials that underpin findings also need to be as open as possible.
Plan S principles touch upon the need to change the incentive structures for researchers, but FAIR (findable, accessible, interoperable and reusable) open data is left out of the picture. The international consortium of research funders, the cOAlition, could have used its Plan S announcement to signal that funders will soon be turning attention to FAIR and open research practices. This would help reproducibility to be considered as a part of the research lifecycle, rather than risk it being an afterthought.
To ensure reproducibility is justly considered, a group of academics recently established the UK Reproducibility Network, which has grown rapidly in just a few months. Top of the list of actions for this group is the promotion of open research practices and carrying out meta-research into how effective such initiatives are. The network will provide a platform to advocate for specific practices, in an evidence-based way, and provide training for UK researchers.
Around the globe, others are investigating how to support researchers to produce more robust evidence, such as the Center for Open Science in the US, which has links with programmes in Japan and Canada. …”
“eLife, in collaboration with software engineer Vincent Tunru from Flockademic and the Center for Open Science (COS), is supporting the development of Plaudit – a mechanism for academics to share their research recommendations openly with readers.
Stemming from a concept refined at the eLife Innovation Sprint 2018 by a team of publishers, technologists and researchers, Plaudit aims to provide an easy way to recognise the value of scholarly content, regardless of where it is published. The tool has three main benefits for users: those who recommend research objects lend their authority to the endorsement, the authors of the objects benefit from the endorsement, and readers gain insight into the objects’ potential value….”
“A “PA” (Protected Access) notation may be added to open data badges if sensitive, personal data are available only from an approved third-party repository that manages access to the data for qualified researchers through a documented process. To be eligible for an open data badge with such a notation, the repository must publicly describe the steps necessary to obtain the data, and detailed data documentation (e.g., variable names and allowed values) must be made available publicly. This notation is not available to researchers who state that they will make “data available upon request,” and is not available if requests for data sharing are evaluated on any criteria beyond compliance with proper handling of sensitive data. For example, this notation is not available if limitations are placed on the permitted use of the data, such as for data that are made available only for the purpose of replicating previously published results, or for which there is substantive review of analytical results. Review of results to avoid disclosure of confidential information is permissible….”
The Center for Open Science (COS) has launched MarXiv, a new preprint service providing free, open-access, open-source archives for the ocean conservation and marine climate sciences, sources at the organization announced today.
“Data management has become an increasingly discussed topic in the academic community. Managing data is an element of open science, which has been shown to increase the dissemination of research and the citation rates of journal articles. Open science increases public access to academic articles, mostly through preprint repositories. Indeed, according to this study, open access (OA) articles are associated with a 36–172% increase in citations compared with non-OA articles. Publishers such as Elsevier have acquired preprint repositories to increase the dissemination of academic research.”