Input to “Data Repository Selection: Criteria that Matter” – COAR

“There has been significant concern expressed in the repository community about the requirements contained in the Data Repository Selection: Criteria that Matter, which sets out a number of criteria for the identification and selection of data repositories that will be used by publishers to guide authors about where they should deposit their data.

COAR agrees that it is important to encourage and support the adoption of best practices in repositories. And there are a number of initiatives looking at requirements for repositories, based on different objectives such as the FAIR Principles, CoreTrustSeal, the TRUST Principles, and the CARE Principles of Indigenous Data Governance. Recently, COAR brought together many of these requirements – assessed and validated them with a range of repository types and across regions – resulting in the publication of the COAR Community Framework for Best Practices in Repositories.

However, there is a risk that if repository requirements are set very high or applied strictly, then only a few well-resourced repositories will be able to fully comply. The criteria set out in Data Repository Selection: Criteria that Matter are not currently supported by most domain or generalist data repositories, in particular the dataset-level requirements. If implemented by publishers, this will have a very detrimental effect on the open science ecosystem by concentrating repository services within a few organizations, further exacerbating inequalities in access to services. Additionally, it will introduce bias against some researchers, for example, researchers who prefer to share their data locally; researchers in the global south; or researchers who want to share their data in a relevant domain repository, so it can be visible to their peers and integrated with other similar datasets….”

Full article: Promoting scientific integrity through open science in health psychology: results of the Synergy Expert Meeting of the European Health Psychology Society

Abstract: The article describes a position statement and recommendations for actions that need to be taken to develop best practices for promoting scientific integrity through open science in health psychology endorsed at a Synergy Expert Group Meeting. Sixteen Synergy Meeting participants developed a set of recommendations for researchers, gatekeepers, and research end-users. The group process followed a nominal group technique and voting system to elicit and decide on the most relevant and topical issues. Seventeen priority areas were listed and voted on; fifteen of them were recommended by the group. Specifically, the following priority actions for health psychology were endorsed: (1) for researchers: advancing when and how to make data open and accessible at various research stages and understanding researchers’ beliefs and attitudes regarding open data; (2) for educators: integrating open science in research curricula, e.g., through online open science training modules, promoting preregistration, transparent reporting, open data and applying open science as a learning tool; (3) for journal editors: providing an open science statement, and open data policies, including a minimal requirements submission checklist. Health psychology societies and journal editors should collaborate in order to develop a coordinated plan for research integrity and open science promotion across behavioural disciplines.

DataSeer

“DataSeer scans scientific texts for sentences describing data collection, then gives best-practice advice for sharing that type of data.

Researchers can use DataSeer to ensure that their data sharing is complete and follows best practice.

Funders, journals, and institutions can use DataSeer to find all of the data associated with a corpus of articles, or use it to promote compliance with their data sharing policies….”

Preprint Servers’ Policies, Submission Requirements, and Transparency in Reporting and Research Integrity Recommendations – JAMA Network

“Preprint servers are online platforms that enable free sharing of preprints, scholarly manuscripts that have not been peer reviewed or published in a traditional publishing venue (eg, journal, conference proceeding, book). They facilitate faster dissemination of research, soliciting of feedback or collaborations, and establishing of priority of discoveries and ideas.1 However, they can also enable sharing of manuscripts that lack sufficient quality or methodological details necessary for research assessment, and can help spread unreliable and even fake information.2 Since 2010, more than 30 new preprint servers have emerged, yet research on preprint servers is still scarce.3 With the increase in the numbers of preprints and preprint servers, we explored servers’ policies, submission requirements, and transparency in reporting and research integrity recommendations, as the latter are often perceived as mechanisms by which academic rigor and trustworthiness are fostered and preserved. …”

Why is uploading clinical trial results onto trial registries so important?

“Some university researchers still believe that if their clinical trial publishes its outcomes in a peer-reviewed journal, they do not also have to upload its summary results onto trial registries.

That is wrong. Here are the facts:

Both EU regulations and US law require the results of many (though not all) clinical trials to be uploaded onto trial registries within 12 months of trial completion.

Best practices set out by the World Health Organization (WHO) require the results of all clinical trials to be uploaded onto a trial registry within that timeframe.

Posting results onto registries accelerates medical progress because the 12-month timeline permits far more rapid results sharing than the slow academic publication process allows.

Posting results onto registries minimises the risk of a trial never reporting its results and becoming research waste, which can happen when a principal investigator dies or leaves their post during the prolonged process of submitting an academic paper to a succession of medical journals.

Results posted on registries are easier to locate and are open access.

Research shows that trial results posted on registries typically give a more comprehensive and accurate picture of patient-relevant trial outcomes than corresponding journal articles do.

Registry reporting facilitates comparison of trial outcomes with a trial’s originally stated aims, and thus discourages harmful research malpractices such as the ‘silent’ suppression, addition, or switching of selected outcomes, HARKing, and p-hacking.

Results on trial registries enable the more rapid and reliable identification of potential safety risks posed by medicines already on the market. …”

Microsoft and the Open Data Institute join together to launch a Peer Learning Network for Data Collaborations – Microsoft on the Issues

“Today, in partnership with the Open Data Institute (ODI), we are delighted to announce an open call for participation in a new Peer Learning Network for Data Collaborations. Peer learning networks are an important tool to foster the exchange of knowledge and help participants learn from one another so they can more effectively address the challenges they face.

In April, with the launch of Microsoft’s Open Data Campaign, we committed to putting open and shared data into practice by addressing specific challenges through data collaborations. For a data collaboration to achieve its goals, there are many factors that must come together successfully. Oftentimes, this process can be incredibly challenging. From aligning on key outcomes and data use agreements to preparing datasets for use and analysis, these considerations require time and extensive coordination….

Awardees will have the opportunity to:

receive up to £20,000 for their time over the six months of the peer learning network
learn about and receive guidance from the ODI and Microsoft on different technical approaches, governance mechanisms, and other means for managing data collaborations
connect with peers also working on these challenges

For the purpose of the Peer Learning Network, data collaborations are defined as:

involving a collaboration of companies, research institutions, non-profits, and/or government entities
addressing a clear societal or business-related challenge
working to make their data as open as possible in the context of the collaboration (collaborations working with restrictions related to privacy or commercial sensitivity are encouraged to apply)
ultimately demonstrating increased access to, and/or meaningful use of, data in reaching the specific goal …”