2nd HIRMEOS Webinar: A Peer-review Certification System for Open Access Books – Hirmeos Project

“The webinar is aimed at presenting the peer-review certification service developed in the course of the HIRMEOS project.

Peer review is of critical importance in scholarly communication, but both its practices and understanding exhibit a great deal of opacity. This is especially true for the peer-review processes concerning open access monographs.

The HIRMEOS Open Book Peer-Review Certification service is a response to the increasing need for transparency and a better understanding of book peer review processes. The certification system, developed in collaboration with the Directory of Open Access Books (DOAB), provides a convenient way to reassure authors and evaluation agencies about the scientific quality of Open Access books. In the webinar, we are going to introduce this service to different communities by bringing together the perspectives of scholars, publishers, developers and librarians. …”

Plan S, the Verschlimmbesserung of Scholarly Information

“Perhaps it isn’t surprising that Germany steered clear of signing on to Plan S. If you can create the word verschlimmbesserung to describe an attempted improvement that actually makes things worse, you are probably pretty good at spotting and avoiding a verschlimmbesserung more quickly than you can say it….

But if we widen the aperture to align with the mission of Plan S funders and consider whether Plan S is good for science, medicine, humanities, and knowledge, the focus changes, and we can see that Plan S could well actually make things worse….

Plan S undermines this complex ecosystem, making the more selective and curated subscription outlets less viable. In doing so, Plan S flattens the multitude of venues where scholarly information appears, and funnels research towards high-volume, low-cost, less-discerning outlets. …

Plan S is not really about advancing science, or OA, but about harming large commercial publishers (I made this argument here). …

[W]e may find that low-margin society publishers, who are dedicated to advancing their fields, find Plan S makes their operations unsustainable and are forced to divest their publishing assets. As a result, we may well see large commercial players become even larger, and while there may be some margin compression in traversing to a Plan S-catalyzed flipped world, net profits of commercial players could well grow….”

How to bring preprints to the charged field of medicine

“The founders of the popular biology preprint server bioRxiv have launched a repository on which medical scientists can share their results before peer review.

BioRxiv’s success prompted some clinical scientists to push for such a site because the biology repository accepts preprints in only certain fields of medical science. But some researchers are concerned that releasing unvetted clinical research could be risky, if patients or doctors act on what could end up being inaccurate information.

The organizations behind the new server, named medRxiv, have been working on the project since 2017 and say they have built in safeguards to address those concerns….”

Compliance with ethical rules for scientific publishing in biomedical Open Access journals indexed in Journal Citation Reports | proLékaře.cz

Abstract: This study examined compliance with the criteria of transparency and best practice in scholarly publishing defined by COPE, DOAJ, OASPA and WAME in biomedical Open Access journals indexed in Journal Citation Reports (JCR). 259 Open Access journals were drawn from the JCR database and their compliance with 14 criteria for transparency and best practice in scholarly publishing was verified on the basis of their websites. Journals received a penalty point for each criterion they failed to fulfil. The average number of penalty points obtained was 6: 149 journals (57.5%) received ≤ 6 points and 110 journals (42.5%) received ≥ 7 points. Only 4 journals met all criteria and did not receive any penalty points. Most of the journals did not comply with the criteria: declaration of a Creative Commons license (164 journals), affiliation of editorial board members (116), unambiguity of article processing charges (115), anti-plagiarism policy (113) and the number of editorial board members from developing countries (99). The research shows that JCR cannot be used as a whitelist of journals that comply with the criteria of transparency and best practice in scholarly publishing.
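The penalty-point scheme the study describes is straightforward to reproduce. A minimal Python sketch, with hypothetical criterion names and journal data (the study's actual 14 criteria and dataset are not reproduced here):

```python
# Hypothetical sketch of the penalty-point scheme described in the study:
# one penalty point per unfulfilled transparency criterion (14 in total).

CRITERIA = [
    "creative_commons_license_declared",
    "editorial_board_affiliations_listed",
    "article_processing_charges_unambiguous",
    "anti_plagiarism_policy_stated",
    # ... the study checks 14 criteria in total
]

def penalty_points(journal: dict) -> int:
    """Count the criteria the journal's website fails to satisfy."""
    return sum(1 for c in CRITERIA if not journal.get(c, False))

journal = {
    "creative_commons_license_declared": False,
    "editorial_board_affiliations_listed": True,
    "article_processing_charges_unambiguous": True,
    "anti_plagiarism_policy_stated": False,
}
print(penalty_points(journal))  # 2: two criteria unmet
```

A journal meeting every criterion would score 0, matching the 4 fully compliant journals the study reports.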

Journal transparency rules to help scholars pick where to publish | Times Higher Education (THE)

“New requirements for journals to be more transparent about their editorial processes could help researchers to make more informed decisions about where to submit their work, as the European-led Plan S initiative moves into its next phase.

Freshly revised requirements for the open access mandate – which is now due to come into force in January 2021, a year later than originally planned – outline a series of mandatory conditions that journals and other platforms must adhere to if academics financed by participating funders are to publish in them.

This states that a journal must provide on its website “a detailed description of its editorial policies and decision-making processes”, with a “solid system” in place for peer review that must adhere to guidelines produced by the Committee on Publication Ethics. “In addition, at least basic statistics must be published annually, covering in particular the number of submissions, the number of reviews requested, the number of reviews received, the approval rate, and the average time between submission and publication,” the guidance says. 
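The annual statistics the guidance requires can be derived from a journal's own submission records. A minimal Python sketch, with hypothetical field names and data (Plan S does not prescribe any particular record format):

```python
from datetime import date
from statistics import mean

# Hypothetical submission records; the field names are illustrative only.
submissions = [
    {"submitted": date(2021, 1, 5), "published": date(2021, 6, 1),
     "accepted": True, "reviews_requested": 3, "reviews_received": 2},
    {"submitted": date(2021, 2, 10), "published": None,
     "accepted": False, "reviews_requested": 2, "reviews_received": 2},
]

def annual_statistics(records):
    """Compute the basic statistics named in the Plan S guidance."""
    accepted = [r for r in records if r["accepted"]]
    return {
        "submissions": len(records),
        "reviews_requested": sum(r["reviews_requested"] for r in records),
        "reviews_received": sum(r["reviews_received"] for r in records),
        "approval_rate": len(accepted) / len(records),
        "avg_days_submission_to_publication": mean(
            (r["published"] - r["submitted"]).days for r in accepted
        ),
    }

stats = annual_statistics(submissions)
print(stats["approval_rate"])  # 0.5
```

The average submission-to-publication time is computed over accepted articles only, since rejected manuscripts have no publication date.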

David Sweeney, executive chair of Research England and co-chair of the Plan S implementation task force, described greater transparency in journals’ editorial and publishing practices as the logical “next step in the puzzle” of creating a “fairer, more open publishing landscape”. …

Journals will also be required to price the services they provide, such as reviewing and copy-editing, since funders will find themselves supporting the article processing charges associated with many forms of open access publishing….”

Open access and subscription based journals have similar problems in terms of quantity and relaying science to the public | The BMJ

“In the Head to Head debate on whether to publish in an open access journal, Ashton and Beattie report that PLOS One accepts 70% of submissions.1 That might have been true in 2013, but a more recent and perhaps more accurate figure would be that as of 2017 PLOS One accepts about 50% of submissions, which is an equivalent rate to that of BMJ Open.2 I also question whether acceptance rate is a meaningful statistic when research is moving towards a publish first, curate later model.3 Simply not publishing, or batting manuscripts around various journals until one finally accepts it after a lengthy delay, constitutes a form of research waste and is something that ought to be avoided.4

The argument that certain forms of open access encourage higher quantity is also true of subscription based publishing. Predominantly subscription based publishers routinely market their subscriptions to library consortia on the basis of price per article: the lower the price per article, the better the apparent value. As the subscription prices that libraries can afford remain flat, subscription based publishers have an incentive to make their services look better by publishing more, reducing the apparent price per article that a subscription buys. The publishers then further obfuscate this by bundling journals full of chaff articles together with journals full of higher quality material. But under so called diamond or platinum open access publishing models, in which neither authors nor readers pay to support the publication process, there is no such dangerous incentive to erode professional standards….”

Are open access journals peer reviewed? – Quora

“As of today, the Directory of Open Access Journals (DOAJ) lists 13,229 peer-reviewed open-access (OA) journals.

DOAJ deliberately limits its coverage to the peer-reviewed variety, and evaluates each listed journal individually.

At the same time, some scam or “predatory” OA journals claim to perform peer review but do not. They give OA a bad name, and get wide publicity, creating the false impression that all or most OA journals are scams.

Analogy: Some police are corrupt, and cases of (actual or suspected) police corruption get wide publicity. But that doesn’t mean that all or most police are corrupt….”

Why Beall’s List Died — and What It Left Unresolved About Open Access – The Chronicle of Higher Education

“Why, after toiling so hard for five years — and creating a resource cherished by scientists wary of exploitative publishers — did the University of Colorado at Denver’s Jeffrey Beall abruptly give it all up? Who, or what, forced his hand?

There are several prime suspects:

  • His fellow university librarians, whom Mr. Beall faults for overpromoting open-access publishing models.
  • A well-financed Swiss publisher, angry that Mr. Beall had had the temerity to put its journals on his list.
  • His own university, perhaps fatigued by complaints from the publisher, the librarians, or others.
  • The broader academic community — universities, funders of research, publishers, and fellow researchers, many of whom long understood the value of Mr. Beall’s list but did little to help him out.
  • Mr. Beall himself, who failed to recognize that a bit of online shaming wouldn’t stop many scientists from making common cause with journals that just don’t ask too many questions.

In the end, all played important roles in the demise of Beall’s List. On one level, Mr. Beall’s saga is just another tale of warring personalities. On another, though, it points to a broader problem in publishing: Universities still have a long way to go to create systems for researchers to share and collaborate with one another, evaluate one another’s work, and get credit for what really matters in research….”

Exploring the quality of government open data | Comparison study of the UK, the USA and Korea | The Electronic Library | Vol 37, No 1

Abstract:  Purpose

The use of “open data” can help the public find value in various areas of interests. Many governments have created and published a huge amount of open data; however, people have a hard time using open data because of data quality issues. The UK, the USA and Korea have created and published open data; however, the rate of open data implementation and level of open data impact is very low because of data quality issues like incompatible data formats and incomplete data. This study aims to compare the statuses of data quality from open government sites in the UK, the USA and Korea and also present guidelines for publishing data format and enhancing data completeness.

Design/methodology/approach

This study uses statistical analysis of different data formats and examination of data completeness to explore key issues of data quality in open government data.

Findings

Findings show that the USA and the UK have published more than 50 per cent of their open data at level one, while Korea has published 52.8 per cent of its data at level three. Level one data are not machine-readable; therefore, users have a hard time using them. The level one data are found in Portable Document Format (PDF) and Hypertext Markup Language (HTML) and are locked up in documents, so machines cannot extract the data. Findings also show that incomplete data exist in all three governments’ open data.

Originality/value

Governments should investigate data incompleteness of all open data and correct incomplete data of the most used data. Governments can find the most used data easily by monitoring data sets that have been downloaded most frequently over a certain period.
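The monitoring approach suggested above amounts to ranking datasets by download frequency over the observed period. A minimal Python sketch, with hypothetical dataset identifiers and log data:

```python
from collections import Counter

# Hypothetical download log: one dataset identifier per download event.
download_log = [
    "transport/bus-routes", "health/hospitals",
    "transport/bus-routes", "budget/spending-2019",
    "transport/bus-routes", "health/hospitals",
]

def most_used(log, n=2):
    """Rank datasets by download frequency over the monitored period."""
    return Counter(log).most_common(n)

print(most_used(download_log))
# [('transport/bus-routes', 3), ('health/hospitals', 2)]
```

A government portal could then prioritise completeness audits for the top-ranked datasets, as the abstract recommends.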

The European University Association and Science Europe Join Efforts to Improve Scholarly Research Assessment Methodologies

“EUA and Science Europe are committed to working together on building a strong dialogue between their members, with a view to:

• support necessary changes for a better balance between qualitative and quantitative research assessment approaches, aiming at evaluating the merits of scholarly research. Furthermore, novel criteria and methods need to be developed towards a fairer and more transparent assessment of research, researchers and research teams, conducive to selecting excellent proposals and researchers.

• recognise the diversity of research outputs and other relevant academic activities and their value in a manner that is appropriate to each research field and that challenges the overreliance on journal-based metrics.

• consider a broad range of criteria to reward and incentivise research quality as the fundamental principle of scholarly research, and ascertain assessment processes and methods that accurately reflect the vast dimensions of research quality and credit all scientific contributions appropriately.

EUA and Science Europe will launch activities to further engage their members in improving and strengthening their research assessment practices. Building on these actions, both associations commit to maintaining a continuous dialogue and explore opportunities for joint actions, with a view to promoting strong synergies between the rewards and incentives structures of research funders and research performing organisations, as well as universities….”