New resource for books added to Think. Check. Submit. | Think. Check. Submit.

“Further to our announcement in October, the Steering Committee of Think. Check. Submit. is delighted to announce a new addition to its resources: a checklist for authors wishing to verify the reliability and trustworthiness of a book or monograph publisher.

Drawing on existing expertise from within the group and from experiences of our newest partner, OAPEN, the checklist for books offers sound advice along the lines of the recommendations already offered by the journal checklist….”

Research published in pay-and-publish journals won’t count: UGC panel | India News | The Indian Express

“Suggesting sweeping reforms to promote the quality of research in India, a UGC panel has recommended that publication of research material in “predatory” journals or presentations in conferences organised by their publishers should not be considered for academic credit in any form.

These include selection, confirmation, promotion, appraisal, and the award of scholarships and degrees, the panel has suggested. The committee, which submitted its 14-page report to the UGC recently, has also recommended changes in PhD and MPhil programmes, including a new board for social sciences research….

Last week, the UGC launched the Consortium of Academic and Research Ethics (CARE) to approve a new official list of academic publications….”

International observatory targets predatory publishers | Times Higher Education (THE)

“A coalition of scientists, funders, publishing societies and librarians believes that the formation of an international observatory to study predatory journals will lead to improved advice on how to tackle them.

The initiative aims to fill the void left by the closure three years ago of Jeffrey Beall’s blacklist of predatory publishers. Since then, many others have set up their own blacklists and checklists, but there is “a lack of unity across the community about what predatory journals are”, said Agnes Grudniewicz, assistant professor at the Telfer School of Management at the University of Ottawa.

The coalition’s biggest achievement so far is to create a consensus definition of predatory journals. It defines predatory journals and publishers as “entities that prioritise self-interest at the expense of scholarship” and “are characterised by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices”….

Creating an international observatory – potentially funded by research funders, charities, publishers and research institutions – was a less contentious solution than relying on blacklists or “whitelists” of approved providers, said Dr Grudniewicz. Research led by Michaela Strinzel, from the Swiss National Science Foundation, found that 34 journals listed as predatory by Professor Beall appeared on an approved list of titles run by the Directory of Open Access Journals (DOAJ), while 31 DOAJ titles were deemed predatory by subscription service Cabells….”

Predatory journals: no definition, no defence

“The consensus definition reached was: “Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.” …”

Identifying publications in questionable journals in the context of performance-based research funding

Abstract: In this article we discuss the five yearly screenings for publications in questionable journals which have been carried out in the context of the performance-based research funding model in Flanders, Belgium. The Flemish funding model expanded from 2010 onwards, with a comprehensive bibliographic database for research output in the social sciences and humanities. Along with an overview of the procedures followed during the screenings for articles in questionable journals submitted for inclusion in this database, we present a bibliographic analysis of the publications identified.

First, we show how the yearly number of publications in questionable journals has evolved over the period 2003–2016. Second, we present a disciplinary classification of the identified journals. In the third part of the results section, three authorship characteristics are discussed: multi-authorship, the seniority (or experience level) of authors in general and of the first author in particular, and the relation of the disciplinary scope of the journal (cognitive classification) to the departmental affiliation of the authors (organizational classification).

Our results regarding yearly rates of publications in questionable journals indicate that awareness of the risks of questionable journals does not lead to a turn away from open access in general. The number of publications in open access journals rises every year, while the number of publications in questionable journals decreases from 2012 onwards. We find further that both early career and more senior researchers publish in questionable journals. We show that the average proportion of senior authors contributing to publications in questionable journals is somewhat higher than that for publications in open access journals.

In addition, this paper yields insight into the extent to which publications in questionable journals pose a threat to the public and political legitimacy of a performance-based research funding system of a western European region. We include concrete suggestions for those tasked with maintaining bibliographic databases and screening for publications in questionable journals.


Professors Receive NSF Grant to Develop Training for Recognizing Predatory Publishing | Texas Tech Today | TTU

“With more open-access journals making research articles free for people to view, some journals are charging authors publication fees to help cover costs. While some journals that do this are still peer-reviewed and credible, others are not and will publish lower quality work strictly for profit. The difference can be hard to tell, even to the most seasoned author….”

Plaudit · Open endorsements from the academic community

“Plaudit links researchers, identified by their ORCID, to research they endorse, identified by its DOI….

Because endorsements are publisher-independent and provided by known and trusted members of the academic community, they provide credibility for valuable research….

Plaudit is built on open infrastructure. We use permanent identifiers from ORCID and DOI, and endorsements are fed into CrossRef Event Data.

We’re open source, community-driven, and not for profit….”

Where Can I Publish? Part 2: Is there a definitive list? – Delta Think

“We set out to examine whether there is a definitive, curated list of journals that researchers can use when deciding on their publication venue. While some offer very good coverage, the short answer appears to be that no one index offers a definitive list.

Across all journals, there seems to be significant overlap among the mainstream indexes. However, fully OA journals present a more varied landscape: you need to combine multiple lists to round up a comprehensive list of curated fully OA journals.

Our analysis has combined over 100,000 ISSNs across over 65,000 titles and, we think, represents one of the most comprehensive round-ups of the coverage of curated lists available….”

How Americans view research and findings | Pew Research Center

“The Pew Research Center survey asked about several factors that could potentially increase – or decrease – trust in research findings and recommendations. The two steps that inspire the most confidence among members of the public are open access to data and an independent review.

A majority of U.S. adults (57%) say they trust scientific research findings more if the researchers make their data publicly available. Another 34% say that makes no difference, and just 8% say they are less apt to trust research findings if the data is released publicly….

People with higher levels of science knowledge are especially likely to say that open access to data and an independent review boost their confidence in research findings. For example, 69% of those with high science knowledge say that having data publicly available makes them trust research findings more, versus 40% of those with low science knowledge….”


Peter Suber: The largest obstacles to open access are unfamiliarity and misunderstanding of open access itself

“I’ve already complained about the slowness of progress. So I can’t pretend to be patient. Nevertheless, we need patience to avoid mistaking slow progress for lack of progress, and I’m sorry to see some friends and allies make this mistake. We need impatience to accelerate progress, and patience to put slow progress in perspective. The rate of OA growth is fast relative to the obstacles, and slow relative to the opportunities.”