Transparency and Openness Promotion (TOP) Guidelines

Abstract:  The Transparency and Openness Promotion (TOP) Committee met in November 2014 to address one important element of the incentive system – journals’ procedures and policies for publication. The outcome of the effort is the TOP Guidelines. There are eight standards in the TOP Guidelines; each moves scientific communication toward greater openness. These standards are modular, facilitating adoption in whole or in part. However, they also complement each other, in that commitment to one standard may facilitate adoption of others. Moreover, the guidelines are sensitive to barriers to openness by articulating, for example, a process for exceptions to sharing because of ethical issues, intellectual property concerns, or availability of necessary resources.

Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves

Abstract:  Despite the increase in the number of journals issuing data policies requiring authors to make the data underlying reported findings publicly available, authors do not always do so, and when they do, the data do not always meet standards of quality that allow others to verify or extend published results. This phenomenon suggests the need to consider how effectively journal data policies present and articulate transparency requirements, and how well they facilitate (or hinder) authors’ ability to produce and provide access to data, code, and associated materials that meet quality standards for computational reproducibility. This article describes the results of a research study that examined the ability of journal-based data policies to: 1) effectively communicate transparency requirements to authors, and 2) enable authors to successfully meet policy requirements. To do this, we conducted a mixed-methods study that examined individual data policies alongside editors’ and authors’ interpretations of policy requirements to answer these questions. Survey responses from authors and editors, along with results from a content analysis of data policies, revealed discrepancies among editors’ assertions of data policy requirements, authors’ understanding of policy requirements, and the requirements stated in the policy language as written. We offer explanations for these discrepancies and recommendations for improving authors’ understanding of policies and increasing the likelihood of policy compliance.

 

Introducing the STM 2020 Research Data Year

“STM has declared 2020 the ‘STM Research Data Year’ and is working with publishers and other partners to boost effective sharing of research data:

SHARE: Increase the number of journals with data policies and articles with Data Availability Statements (DAS)
LINK: Increase the number of journals that deposit the data links to the SCHOLIX framework
CITE: Increase the citations to datasets along the Joint Declaration of Data Citation Principles…”

Is it Finally the Year of Research Data? – The STM Association Thinks So – The Scholarly Kitchen

“At the recent Researcher to Reader conference in London, Mark Allin (@allinsnap) had the job of doing the conference round-up, which is the slot immediately before the closing keynote where the themes and take-homes of the conference are brought together. In his four summary themes, Allin inevitably drew out Open Access / Open Science. It’s almost impossible to have a publishing or library conference without it, however, in terms of significance, he put it at the bottom of the list, almost as an afterthought. His reasoning is that open science now feels like an inevitability. With a clear trend towards both open access and open data mandates among funders, institutions, and publishers, the question that each of us must ask ourselves isn’t whether it will or should happen, but how are we going to adapt as change continues….

Practices around open research data are gaining traction. In 2019’s The State of Open Data Report, 64% of respondents claimed that they made their data openly available in 2018. That’s a rise of 4% from the previous year. Comprehensive information on the prevalence of open data policies is hard to come by, but there is a general sense that publishers, funders, and institutions alike are all moving towards firstly having data policies and then steadily strengthening those policies over time. 

The JoRD project, based at Nottingham University in the UK, was funded by Jisc and ran from December 2012 until its final blog post in 2014. In this article, Sturges et al. report that JoRD found the state of open data policies among journals to be patchy and inconsistent, with about half of all the journals they looked at having no policy at all, and with 75% of those policies that did exist being categorized as weak….

Unfortunately, the short timescale of the JoRD project limits its findings to a snapshot. However, there has since been piecemeal evidence of progress towards a more robust open research data landscape. The case studies presented in this article by Jones et al., — a different Jones, not me — describe how both Taylor and Francis, and Springer Nature have followed the path of steadily increasing the number of journals with data policies while strengthening those that exist….”

New Measure Rates Quality of Research Journals’ Policies to Promote Transparency and Reproducibility

“Today, the Center for Open Science launches TOP Factor, an alternative to journal impact factor (JIF) to evaluate qualities of journals. TOP Factor assesses journal policies for the degree to which they promote core scholarly norms of transparency and reproducibility. TOP Factor provides a first step toward evaluating journals based on their quality of process and implementation of scholarly values. This alternative to JIF may reduce the dysfunctional incentives for journals to publish exciting results whatever their credibility….

TOP Factor is based primarily on the Transparency and Openness Promotion (TOP) Guidelines, a framework of eight standards that summarize behaviors that can improve transparency and reproducibility of research such as transparency of data, materials, code, and research design, preregistration, and replication. Journals can adopt policies for each of the eight standards that have increasing levels of stringency. For example, for the data transparency standard, a score of 0 indicates that the journal policy fails to meet the standard, 1 indicates that the policy requires that authors disclose whether data are publicly accessible, 2 indicates that the policy requires authors to make data publicly accessible unless it qualifies for an exception (e.g., sensitive health data, proprietary data), and 3 indicates that the policy includes both a requirement and a verification process for the data’s correspondence with the findings reported in the paper. TOP Factor also includes indicators of whether journals offer Registered Reports, a publishing model that reduces publication bias of ignoring negative and null results, and badging to acknowledge open research practices to facilitate visibility of open behaviors….”
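The tiered scoring described above can be sketched in code. The level descriptions for the data transparency standard are taken from the passage; the function name, the per-standard dictionary structure, and the idea of summing levels into a journal total are illustrative assumptions, not the Center for Open Science's actual implementation.

```python
# Hypothetical sketch of TOP-style policy scoring. Level texts for the data
# transparency standard paraphrase the description above; everything else
# (names, structure, the summed total) is an illustrative assumption.

DATA_TRANSPARENCY_LEVELS = {
    0: "Journal policy does not meet the standard",
    1: "Policy requires authors to disclose whether data are publicly accessible",
    2: "Policy requires public data access unless an exception applies "
       "(e.g., sensitive health data, proprietary data)",
    3: "Requirement plus a verification process for the data's "
       "correspondence with the findings reported in the paper",
}

def score_journal(policy_levels):
    """Sum per-standard levels (each 0-3) into a simple TOP-style total."""
    for standard, level in policy_levels.items():
        if not 0 <= level <= 3:
            raise ValueError(f"invalid level {level} for standard {standard!r}")
    return sum(policy_levels.values())

# Example: a journal that requires disclosure of data availability (level 1)
# and mandates public code sharing with an exception process (level 2).
example = {"data_transparency": 1, "code_transparency": 2}
total = score_journal(example)
```

A journal adopting all eight standards at the strictest level would score the maximum under this sketch; the point of the tiers is that journals can strengthen policies incrementally rather than all at once.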

TOP (Transparency and Openness Promotion)

“Transparency, open sharing, and reproducibility are core values of science, but not always part of daily practice. Journals, funders, and societies can increase research reproducibility by adopting the TOP Guidelines….”

5 Scholarly Publishing Trends to Watch in 2020

“The vision for a predominantly open access (OA) publishing landscape has shifted from a possibility to a probability in the opinions of many. A 2017 Springer Nature survey of 200 professional staff working in research institutions around the world found that over 70% of respondents agreed scholarly content should be openly accessible and 91% of librarians agreed that “open access is the future of academic and scientific publishing.” …

As noted, there is growing consensus within academia that the majority of scholarly content will be available OA in the future — but how to reach that end is still a matter of debate. The announcement of Plan S in September 2018, an initiative by a consortium of national and international research funders to make research fully and immediately OA, sent shockwaves throughout academia. 2019 saw the release of the revised Plan S guidelines with some significant changes, including an extension of the Plan S deadline to January 2021, a clearer Green OA compliance pathway, and greater flexibility around non-derivative copyright licenses. What remains the same — and has been a matter of significant debate — is that Plan S will not acknowledge hybrid OA as a compliant publishing model.

In response to concerns raised by scholarly societies around the feasibility of transitioning to full and immediate OA publishing without compromising their operational funding, Wellcome and UKRI in partnership with ALPSP launched the “Society Publishers Accelerating Open Access and Plan S” (SPA-OPS) project to identify viable OA publishing models and transition options for societies. The final SPA-OPS report was released in September of 2019, encompassing over 20 potential OA models and strategies as well as a “transformative agreement toolkit.” …”

Open data: growing pains | Research Information

“In its latest State of Open Data survey, Figshare revealed that a hefty 64 per cent of respondents made their data openly available in 2018.

The percentage, up four per cent from last year and seven per cent from 2016, indicates a healthy awareness of open data and for Daniel Hook, chief executive of Figshare’s parent company, Digital Science, it spells good news….

For example, the majority of respondents – 63 per cent – support national mandates for open data, an eight per cent rise from 2017. And, at the same time, nearly half of the respondents – 46 per cent – reckon data citations motivate them to make data openly available. This figure is up seven per cent from last year….

Yet, amid the data-sharing success stories, myriad worries remain. Top of the pile is the potential for data misuse….

Inappropriate sharing of data is another key concern….

Results indicated that a mighty 58 per cent of respondents felt they do not receive sufficient credit for sharing data, while only nine per cent felt they do….

Coko recently won funding from the Sloan Foundation to build DataSeer, an online service that will use Natural Language Processing to identify datasets that are associated with a particular article. …”