“‘Common wisdom,’ according to the authors of a new piece in Nature, ‘assumes that the hazard of predatory publishing is restricted mainly to the developing world.’ But the authors of the new paper, led by David Moher of the Ottawa Hospital Research Institute, found that more than half — 57% — of the 2,000 articles published in journals they determined were predatory were from high-income countries. In fact, the U.S. was second only to India in number of articles published in such journals. We asked Moher, who founded Ottawa Hospital’s Centre for Journalology in 2015, a few questions about the new work.”
“Now, a new twist is emerging, and that seems to be that PubMed may be consciously or unwittingly acting as a facilitator of predatory or unscrupulous publishing.
In a paper published in Neuroscience, the authors analyzed the neurology and neuroscience journals included in PubMed and found that:
- Twenty-five predatory neurology journals were indexed in PubMed, accounting for 24.7% of all predatory neurology journals.
- Fourteen predatory neuroscience journals were indexed in PubMed, accounting for 16.1% of all predatory neuroscience journals.
- Only one of the 188 predatory neuroscience or neurology journals appeared in the DOAJ index.
- Only 54.6% of the journals deemed predatory in neuroscience actually contained articles.”
“Several studies have shown that Wikipedia is as reliable as, if not more reliable than, more traditional encyclopedias. A 2012 study commissioned by Oxford University and the Wikimedia Foundation, for example, showed that Wikipedia articles scored higher overall with respect to accuracy, references and overall judgment than entries from more traditional encyclopedias. Wikipedia articles were also generally seen as being more up-to-date, better-referenced and at least as comprehensive and neutral. This study followed a similar 2005 study from Nature that found Wikipedia articles on science as reliable as their counterparts from Encyclopedia Britannica.”
“With growing calls for transparency and data disclosure, global publications leaders find themselves in a balancing act—ensuring both scientific credibility and commercial viability. To help publications leaders navigate this emerging landscape, research and consulting leader Best Practices, LLC undertook benchmarking research to investigate how top pharmaceutical and biotechnology companies shape their global scientific publication strategies to maintain credibility in the scientific community and deliver publications that drive brand success.
The study found that open access platforms are gaining popularity for publication processes; 44% of companies in the study believe such platforms will impact publication strategy going forward. Part of the allure of open access platforms is that they make information readily available to physicians and patients alike. As open access platforms spread, companies foresee an impact on areas such as journal selection, publication approval and delivery, and speed of data disclosure.
In particular, this study provides benchmarks around publications structure and leadership; staffing and budget levels; publication strategy creation and data delivery; publication channel utilization across product lifecycle; and measuring publication effectiveness. In addition, the 85-page study identifies publication strategy changes for the new marketplace, best practices for maximizing the effectiveness of strategic publication planning, top publication challenges and lessons learned for implementing successful scientific publication strategy.”
In 2014, over 400,000 articles were published in about 8,000 journals that many regard as predatory. The term “predatory publishers” was first used by Jeffrey Beall of the University of Colorado, who until recently documented this phenomenon on his blog and in an annual list. Although this term, and variants such as “predatory journals”, are widely used, they have been criticised. One problem is that the term “predatory” may cover a spectrum of organisations, business activities and publications, ranging from the amateurish but genuine to the deliberately misleading.
“Pisanski and three colleagues concocted the fake application—supported by a cover letter, a CV boasting phoney degrees, and a list of non-existent book chapters — and sent it to 360 peer-reviewed social science publications.
In the peer-review process, journals ask outside experts to assess the methodology and importance of submissions before accepting them.
The journals were drawn equally from three directories: one listing reputable titles available through subscriptions, and a second devoted to ‘open access’ publications.
The third was a blacklist — compiled by University of Colorado librarian Jeffrey Beall — of known or suspected ‘predatory journals’ that make money by extracting fees from authors.
The number of these highly dubious publications has exploded in recent years, now numbering at least 10,000.
Indeed, 40 of the 48 journals that took the bait and offered a position to the fictitious Anna O. figured on Beall’s list, which has since been taken offline.
The other eight were from the open-access registry. No one made any attempt to contact the university listed on the fake CV, and few probed her obviously spotty experience.
One journal suggested ‘Ms Fraud’ organise a conference after which presenters would be charged for a special issue.
‘Predatory publishing is becoming an organised industry’, said Pisanski, who decided not to name-and-shame the journals caught out by the sting.
Their rise ‘threatens the quality of scholarship’, she added.
Even after the researchers contacted all the journals to inform them that Anna O. Szust did not really exist, her name continued to appear on the editorial board of 11 — including one to which she had not even applied.
None of the journals from the most selective directory fell into the trap, and a few sent back tartly worded answers.”
“That being said, some folks I spoke to, including Beall and people in the open access community, thought it was a larger problem than open access publishing alone. The community tries to regulate itself after all, Andrew Wesolek, head of digital scholarship at Clemson, pointed out to Gizmodo. The DOAJ removed 39 of the 120 journals listed in its directory before the analysis came out in Nature today, though six of the eight journals that accepted the fake editor still remain. When I called Lars Bjørnshauge, their founder and managing director, he immediately asked to be put in touch with Pisanski so he could find out the titles of the six journals. He said the DOAJ removes journals with fake editors immediately. …”
“We sent Szust’s application to 360 journals, 120 from each of three well-known directories: the JCR (journals with an official impact factor as indexed on Journal Citation Reports), the DOAJ (journals included on the Directory of Open Access Journals) and ‘Beall’s list’ (potential, possible or probable predatory open-access publishers and journals, compiled by University of Colorado librarian Jeffrey Beall; Beall took down his list in January this year for unknown reasons, after we had completed our study)….Although journals that accepted our fraud were informed that Szust “kindly withdraws her application”, her name still appears on the editorial boards listed by at least 11 journals’ websites. In fact, she is listed as an editor of at least one journal to which we did not apply. She is also listed as management staff, a member of conference organizing committees, and ironically, a member of the Advisory Board of the Journals Open Access Indexing Agency whose mission it is to “increase the visibility and ease of use of open access scientific and scholarly journals”….”
“Synopsis: New data sheds light on Indian researchers’ use of low-cost journals. The Indian Government’s attack on these journals, based on Beall’s list, could adversely affect the Indian university science community.
Three weeks ago we reported that an Indian agency was using a whitelist to ban the use of unlisted journals for the purpose of evaluating researcher performance. The agency is the University Grants Commission (UGC), which apparently plays a major role in university-based Indian science. I know little about this realm, but it seems to include setting the criteria for hiring and promotion, perhaps as well as granting PhDs.