DOAJ deliberately limits its coverage to the peer-reviewed variety, and evaluates each listed journal individually.
At the same time, some scam or “predatory” OA journals claim to perform peer review but do not. They give OA a bad name, and get wide publicity, creating the false impression that all or most OA journals are scams.
Analogy: Some police are corrupt, and cases of (actual or suspected) police corruption get wide publicity. But that doesn’t mean that all or most police are corrupt….”
“In this study, you will be asked to complete a short survey. During the survey, you will be asked to provide your opinion about the factors that affect the credibility of preprints and what you see as the potential benefits and costs of preprints.”
Abstract: A thriving black-market economy of scam scholarly publishing, typically referred to as ‘predatory publishing,’ threatens the quality of scientific literature globally. The scammers publish research with minimal or no peer review and are motivated by article processing charges and not the advancement of scholarship. Authors involved in this scam are either duped or willingly taking advantage of the low rejection rates and quick publication process. Geographic analysis of the origin of predatory journal articles indicates that they predominantly come from developing countries. Consequently, most universities in developing countries operate blacklists of deceptive journals to deter faculty from submitting to predatory publishers. The present article discusses blacklisting and, conversely, whitelisting of legitimate journals as options of deterrence. Specifically, the article provides a critical evaluation of the two approaches by explaining how they work and comparing their pros and cons to inform a decision about which is the better deterrent.
“If South Africa truly wants to encourage good research, it must stop paying academics by the paper…
Why are South Africans relying so much on journals that do little or nothing to ensure quality? In an effort to boost academic productivity, the country’s education department launched a subsidy scheme in 2005. It now awards roughly US$7,000 for each research paper published in an accredited journal. Depending on the institution, up to half of this amount is paid directly to faculty members. At least one South African got roughly $40,000 for research papers published in 2016 — about 60% of a full professor’s annual salary. There is no guarantee (or expectation) that a researcher will use this money for research purposes. Most simply see it as a financial reward over and above their salaries….
In my experience, publication subsidies promote several other counterproductive practices. Some researchers salami-slice their research to spread it across more papers. Others target low-quality journals that are deemed less demanding….”
Abstract: The world of medical science literature is increasingly accessible via the Internet. Open access online medical journals, in particular, offer access to a wide variety of useful information at no cost. In addition, they provide publishing avenues available to health care providers at all levels of training and practice. While publishing costs are lower for online open access journals, fewer resources for funding and technical support also exist. A recent rise in predatory journals, which solicit authors but charge high fees per published paper and provide little oversight, poses further challenges to ensuring the credibility of accessible scientific literature. Recognizing the value and efforts of legitimate open access online medical journals can help the reader navigate the more than 11,000 open access journals available to date.
Abstract: Objectives: In 2017 the journal Nature published challenges to the assumption that research-intensive U.S. institutions are immune to the hazards of predatory publishing. Sample articles from hundreds of potentially predatory journals were analyzed: the NIH was the most frequent funder and Harvard was among the most frequent institutions. Our study was designed to identify the prevalence of such publishing at our own institution.
Methods: Predatory publishers were defined using an archived version of Beall’s list, a now-defunct website that was widely recognized as the only comprehensive blacklist of potential predators. The archive was collected January 15, 2017 and reflects updates made 1-2 weeks prior. To identify our NIH publications, records were collected from PubMed Central using an institution search, limited to 2011-2016 to reflect the five-year period covered by Beall’s last update. PMC was selected under the assumption that direct journal inclusion in PubMed/MEDLINE serves as a proxy for quality. Journal and ISSN data were checked against Ulrich’s Periodicals Directory to determine publishers. Data were then compared against Beall’s listings of potentially predatory publishers and standalone journals. Publication costs for the predatory journals were used to determine the total amount of NIH funding spent on publishing in them.
Results: The review of the University’s publications submitted to PubMed Central from 2011 to 2016 revealed 15,090 publications. Of those, 218 (1.4%) were from publishers on Beall’s list of predatory publishers. A review of publication fees for the publishers in which University faculty published revealed that approximately $300,000 of Federal grant money was spent over the five-year period on publishing in predatory publications.
Conclusions: Previously, it was thought that publishing in predatory journals was primarily a problem in developing countries. However, like the 2017 Nature study, we found that researchers at Emory are publishing in journals that are considered predatory. While the rate of publication in predatory journals is low (1.4%), it cost approximately $300,000 of Federal taxpayer money, which amounts to approximately 70% of one year’s funds of an average NIH R01 grant.
“The literature claims that mainly researchers from low-ranked universities in developing countries publish in predatory journals. We decided to challenge this claim using the University of Southern Denmark as a case. We ran the Beall’s List against our research registration database and identified 31 possibly predatory publications from a set of 6,851 publications within 2015-2016. A qualitative research interview revealed that experienced researchers from the developed world publish in predatory journals mainly for the same reasons as do researchers from developing countries: lack of awareness, speed and ease of the publication process, and a chance to get work rejected elsewhere published. However, our findings indicate that the open access potential and a larger readership outreach were also motives for publishing in open access journals with quick acceptance rates. …”
“The work that DOAJ is doing to improve transparency and the screening process is very important for open access advocates, who will soon have a tool that they can trust to provide much more complete information for scholars and librarians. For too long we have been forced to use the concept of a list of “questionable” or even “predatory” journals. A directory of journals with robust standards and an easy-to-understand interface will be a fresh start for the rhetoric of open access journals….”
“A recent investigation led by an international group of journalists raised concerns over the scale of the problem of deceptive publishing practices, with many researchers of standing and reputation found to have published in “predatory” journals. However, while the findings of this investigation garnered significant media attention, the robustness of the study itself was not subject to the same scrutiny. To Tom Olijhoek and Jon Tennant, the profile afforded to investigations of this type causes some to overstate the problem of predatory publishing, while often discrediting open access publishing at the same time. The real problem here is one of education around questionable journals, and should not distract from more urgent questions around the shifting scholarly ecosystem….”
“Following a Peer Review session at the AAAS meeting this week, I am going to record my thoughts for posterity, proselytizing shamelessly about my vision for the future of peer review.
So… let me begin with the catchy name. The system I propose is entirely reliant on the Internet, and everyone knows that the first requirement for success of any new Internet entity is a catchy name. I trust (especially in context) that the intended connotations are obvious: Peer needs no explanation; the O’ prefix stands variously for of or by peers and for a shortening of Open, which you will see is a key feature. That being said, if you want to call it something else, go for it! This is only a suggestion….
What we really need is a (multiparameter) “credibility profile” for each reviewer of any paper. If every would-be referee were thus rated, it might be feasible to Open up peer review without erasing its effectiveness….”