Influence of accessibility (open and toll-based) of scholarly publications on retractions | SpringerLink

“We have examined retracted publications in different subject fields and attempted to analyse whether free online accessibility (Open Access) influences retraction by examining the scholarly literature published from 2000 through 2019, covering the most recent 20 years of publications. InCites, a research analytics tool developed by Clarivate Analytics®, in consultation with the Web of Science, PubMed Central, and Retraction Watch databases, was used to harvest data for the study. Retracted ‘Article’ and ‘Review’ publications were examined with respect to their online accessibility mode (Toll Access and Open Access), using non-parametric tests such as the Odds Ratio, Wilcoxon Signed Rank Test, Mann–Whitney U Test, and the Mann–Kendall and Sen’s methods. The odds for OA articles to be retracted are about 1.62 times as large (62% higher) as for TA articles (95% CI 1.5, 1.7): 0.028% of OA publications are retracted compared with 0.017% of TA publications. Retractions have occurred in all subject areas. In eight subject areas, the odds of retraction for OA articles are larger than for TA articles; in three subject areas they are smaller; in the remaining 11 subject areas, no significant difference is observed. Post-retraction, though a decline is observed in the citation count of both OA and TA publications (p < .01), the odds for OA articles to be cited after retraction are about 1.21 times as large (21% higher) as for TA articles (95% CI 1.53, 1.72). TA publications are retracted earlier than OA publications (p < .01). We observed an increasing trend of retracted works published in both modes. However, the rate of retraction of OA publications is double the rate of retraction of TA publications.”
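For readers unfamiliar with the statistic, an odds ratio like the one reported above can be reproduced from a simple 2×2 table. The sketch below uses hypothetical counts chosen only to match the reported retraction rates (0.028% OA, 0.017% TA); the paper's actual sample sizes are not given here, so the resulting confidence interval is illustrative, not the study's.

```python
# Sketch: computing an odds ratio and a Wald 95% CI from a 2x2 table.
# The counts are hypothetical, chosen only to reproduce the reported
# retraction rates (0.028% of OA and 0.017% of TA publications).
import math

retracted_oa, total_oa = 2800, 10_000_000   # hypothetical OA counts
retracted_ta, total_ta = 1700, 10_000_000   # hypothetical TA counts

# Odds = retracted / not-retracted, for each access mode
odds_oa = retracted_oa / (total_oa - retracted_oa)
odds_ta = retracted_ta / (total_ta - retracted_ta)
odds_ratio = odds_oa / odds_ta

# Wald 95% CI: standard error on the log odds ratio, then exponentiate
se_log_or = math.sqrt(1 / retracted_oa + 1 / (total_oa - retracted_oa)
                      + 1 / retracted_ta + 1 / (total_ta - retracted_ta))
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
# prints: OR = 1.65, 95% CI (1.55, 1.75)
```

Because retractions are rare events, the odds ratio here is close to the simple ratio of retraction rates (0.028 / 0.017 ≈ 1.65); with the study's real counts it comes out at about 1.62.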

Joint Statement on transparency and data integrity | International Coalition of Medicines Regulatory Authorities (ICMRA) and WHO

“ICMRA and WHO call on the pharmaceutical industry to provide wide access to clinical data for all new medicines and vaccines (whether full or conditional approval, under emergency use, or rejected). Clinical trial reports should be published without redaction of confidential information for reasons of overriding public health interest….

Regulators continue to spend considerable resources negotiating transparency with sponsors. Both positive and negative clinically relevant data should be made available, while only personal data and individual patient data should be redacted. In any case, aggregated data are unlikely to lead to re-identification of personal data and techniques of anonymisation can be used….

 

Providing systematic public access to data supporting approvals and rejections of medicines reviewed by regulators is long overdue, despite existing initiatives such as those from the European Medicines Agency and Health Canada. The COVID-19 pandemic has revealed how essential access to data is to public trust. ICMRA and WHO call on the pharmaceutical industry to commit, within short timelines, and without waiting for legal changes, to provide voluntary unrestricted access to trial results data for the benefit of public health.”


The Quality of Statistical Reporting and Data Presentation in Predatory Dental Journals Was Lower Than in Non-Predatory Journals

Abstract:  Proper peer review and quality of published articles are often regarded as signs of reliable scientific journals. The aim of this study was to assess whether the quality of statistical reporting and data presentation differs between articles published in ‘predatory dental journals’ and in other dental journals. We evaluated 50 articles published in ‘predatory open access (OA) journals’ and 100 clinical trials published in legitimate dental journals between 2019 and 2020. The quality of statistical reporting and data presentation of each paper was assessed on a scale from 0 (poor) to 10 (high). The mean (SD) quality score was 2.5 (1.4) for the predatory OA journals, 4.8 (1.8) for the legitimate OA journals, and 5.6 (1.8) for the more visible dental journals. The mean values differed significantly (p < 0.001). The quality of statistical reporting of clinical studies published in predatory journals was thus lower than in legitimate open access and highly cited journals. This difference in quality is a wake-up call to read study results critically. Poor statistical reporting points to generally lower quality in publications whose authors and journals are less likely to be critiqued through peer review.

 

Post-publication peer review: another sort of quality control of the scientific record in biomedicine | Gaceta Médica de México

Abstract:  Traditional peer review is undergoing increasing questioning, given the increase in detected scientific fraud and the replication crisis biomedical research is currently going through. Researchers, academic institutions, and research funding agencies actively promote scientific record analysis, and multiple tools have been developed to achieve this. Several biomedical journals were founded with post-publication peer review as a feature, and there are several digital platforms that make this process possible. In addition, an increasing number of biomedical journals allow commenting on articles published on their websites, which is also possible in preprint repositories. Moreover, publishing houses and researchers are making extensive use of social networks for the dissemination and discussion of articles, which sometimes culminates in refutations and retractions.

 

Journal Open Access and Plan S: Solving Problems or Shifting Burdens? – Kamerlin – Development and Change – Wiley Online Library

Abstract:  This academic thought piece provides an overview of the history of, and current trends in, publishing practices in the scientific fields known to the authors (chemical sciences, social sciences and humanities), as well as a discussion of how open access mandates such as Plan S from cOAlition S will affect these practices. It begins by summarizing the evolution of scientific publishing, in particular how it was shaped by the learned societies, and highlights how important quality assurance and scientific management mechanisms are being challenged by the recent introduction of ever more stringent open access mandates. The authors then discuss the various reactions of the researcher community to the introduction of Plan S, and elucidate a number of concerns: that it will push researchers towards a pay-to-publish system which will inevitably create new divisions between those who can afford to get their research published and those who cannot; that it will disrupt collaboration between researchers on the different sides of cOAlition S funding; and that it will have an impact on academic freedom of research and publishing. The authors analyse the dissemination of, and responses to, an open letter distributed and signed in reaction to the introduction of Plan S, before concluding with some thoughts on the potential for evolution of open access in scientific publishing.


It’s Time for Open Educational Resources

“Yet, in far too many cases, we are still requiring very expensive textbooks in our classes. Over the degree program, students are expending thousands of dollars for texts that many sell back to the bookstore for less than half of their original value. Much of the material embedded in the texts is either already available freely online or could be assembled by the instructor from open-access sources. At the same time, many instructors still complain that the text does not precisely fit their needs; they skip chapters and assign additional readings to update the material in the text that is already one or two years out of date before the book hits the students’ desks. Why not just create your own texts and update them as often as is needed?

During the first three semesters in COVID times, awareness of open educational resources (OER) has surged among faculty members. Faculty members who put their classes online through remote learning discovered more fully the range and timeliness of relevant materials that are available online. A study by Bay View Analytics, sponsored by the William and Flora Hewlett Foundation, found that faculty who adopted OER rated their materials superior to the commercial alternatives, and while the percentage of required OER materials did not increase, the percentage of supplemental OER materials did….”

F1000 working on ‘digital twin’ platform launches | Research Information

“F1000 is collaborating with two Chinese customers to develop open research publishing platforms dedicated to the research and application of collaborative robots and ‘digital twin’ technologies. Both will be the world’s first open publishing platforms in their fields and will launch for submission in July 2021. 

The platforms will utilise F1000’s open research publishing model, enabling all research outputs to be published open access, as well as combining the benefits of pre-printing (providing rapid publication with no editorial bias) with mechanisms to assure quality and transparency (invited and open peer review, archiving and indexing). They also offer researchers an open and transparent peer review process and have a mandatory FAIR data policy to provide full and easy access to the source data underlying the results….”


How faculty define quality, prestige, and impact in research | bioRxiv

Abstract:  Despite the calls for change, there is significant consensus that when it comes to evaluating publications, review, promotion, and tenure processes should aim to reward research that is of high “quality,” has an “impact,” and is published in “prestigious” journals. Nevertheless, such terms are highly subjective, and it is challenging to ascertain precisely what such research looks like. Accordingly, this article responds to the question: how do faculty from universities in the United States and Canada define the terms quality, prestige, and impact? We address this question by surveying 338 faculty members from 55 different institutions. This study’s findings highlight that, despite their highly varied definitions, faculty often describe these terms in overlapping ways. Additionally, results show that the marked variance in definitions across faculty does not correspond to demographic characteristics. This study’s results highlight the need to implement evaluation regimes that do not rely on such ill-defined concepts.

 

Is MDPI a predatory publisher? – Paolo Crosetto

“So, is MDPI predatory or not? I think it has elements of both. I would call their methods aggressive rent extraction, rather than predation. And I also think that their current methods and growth rate are likely to make them shift towards becoming more predatory over time.

MDPI publishes good papers in good journals, but it also employs some strategies that are proper to predatory publishers. I think that the success of MDPI in recent years is due to the creative combination of these two apparently contradictory strategies. One — the good journals with high quality — creates a rent that the other — spamming hundreds of colleagues to solicit papers, an astonishing increase in Special Issues, publishing papers as fast as possible — exploits. This strategy makes a lot of sense for MDPI, which shows strong growth rates and is en route to becoming the largest open access publisher in the world. But I don’t think it is a sustainable strategy. It suffers from basic collective action problems that might deal a lot of damage to MDPI first, and, most importantly, to scientific publishing in general….

A predatory publisher is one that would publish anything — usually in return for money. MDPI’s rejection rates make this argument hard to sustain. Yet, MDPI is using some of the same techniques as predatory journals….”