The Varying Openness of Digital Open Science Tools | Zenodo

Abstract:  Digital tools that support Open Science practices play a key role in the seamless accumulation, archiving and dissemination of scholarly data, outcomes and conclusions. Despite their integration into Open Science practices, the provenance and design of these digital tools are rarely explicitly scrutinized. As a result, influential factors, such as the funding models of the parent organizations, their geographic location, and their dependency on digital infrastructures, are rarely considered. Suggestions from the literature and anecdotal evidence already draw attention to the impact of these factors and raise the question of whether the Open Science ecosystem, in its current structure, can realise the aspiration to become a truly “unlimited digital commons”. 

In an online research approach, we compiled and analysed the geolocation, terms and conditions, and funding models of 242 digital tools increasingly used by researchers in various disciplines. Our findings indicate that design decisions and restrictions are biased towards researchers in North American and European scholarly communities. In order to make the future Open Science ecosystem inclusive and operable for researchers in all world regions, including Africa, Latin America, Asia and Oceania, researchers from these regions should be actively included in design decision processes. 
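
A minimal sketch of how such a tool inventory might be compiled and summarized, assuming a hypothetical CSV file (tools.csv) with invented column names such as name, region and funding_model; none of these names come from the study itself:

```python
# Hypothetical sketch: summarizing a digital-tool inventory by region and
# funding model. File and column names are illustrative assumptions, not
# the dataset used in the study.
import pandas as pd

tools = pd.read_csv("tools.csv")  # assumed columns: name, country, region, funding_model, license

# Share of tools hosted in each world region (percent)
by_region = tools["region"].value_counts(normalize=True).mul(100).round(1)
print("Tools by host region (%):")
print(by_region)

# Cross-tabulate region against funding model to expose structural bias
crosstab = pd.crosstab(tools["region"], tools["funding_model"])
print(crosstab)
```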


Digital Open Science Tools carry the promise of enabling collaboration across disciplines, world regions and language groups through responsive design. We therefore encourage long-term funding mechanisms, as well as ethnically and culturally inclusive approaches to tool design and construction that serve local prerequisites and conditions, allowing a globally connected digital research infrastructure to evolve in a regionally balanced manner.

Systematize information on journal policies and practices – A call to action – Leiden Madtrics

In most research fields, journals play a dominant role in the scholarly communication system. However, the availability of systematic information on the policies and practices of journals, for instance with respect to peer review and open access publishing, is surprisingly limited and scattered. Of course, we have the journal impact factor, as well as a range of other citation-based journal metrics (e.g., CiteScore, SNIP, SJR, and Eigenfactor), but these metrics provide information only on one very specific aspect of a journal. As is widely recognized, there is a strong need for a wider range of information on journals. Such information is needed, for instance, to facilitate responsible evaluation practices, to promote open access publishing, and to improve journal peer review.
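
As an illustration of what systematic, machine-readable information on journal policies could look like, here is a hypothetical record structure; the field names are invented for the example and do not reflect any existing registry schema:

```python
# Hypothetical sketch of a machine-readable journal-policy record.
# Field names are illustrative, not an existing registry schema.
from dataclasses import dataclass, field

@dataclass
class JournalPolicyRecord:
    issn: str
    title: str
    peer_review_type: str              # e.g. "single-blind", "double-blind", "open"
    reviews_published: bool            # are review reports made public?
    open_access_model: str             # e.g. "subscription", "hybrid", "gold", "diamond"
    apc_usd: float | None = None       # article processing charge, if any
    preprint_policy: str = "unknown"   # e.g. "allowed", "allowed-with-doi", "disallowed"
    citation_metrics: dict[str, float] = field(default_factory=dict)  # e.g. {"SNIP": 1.2}

example = JournalPolicyRecord(
    issn="0000-0000",
    title="Example Journal",
    peer_review_type="double-blind",
    reviews_published=False,
    open_access_model="hybrid",
    apc_usd=2500.0,
)
```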

Open Access Transformation in Switzerland & Germany > ./scidecode

“Christian Gutknecht published an exciting posting on the Swiss EUR 57 million Elsevier deal, in which he outlines the transformative Open Access agreement between Elsevier and swissuniversities. Since Germany has been trying for years to reach such a contract with Elsevier, it is worth comparing it with the two transformative contracts with Wiley and Springer Nature in Germany, which were reached and coordinated by Project DEAL. Both German agreements were discussed here before, just as other transformative Open Access agreements were. For those in a hurry: at the end of the posting there is a synopsis of the costs and Open Access components of the Open Access Transformation in Switzerland & Germany. At the very beginning I would like to thank Christian Gutknecht very much for sharing and discussing information that went into this posting….”
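
A back-of-the-envelope way to compare such agreements is cost per expected Open Access article, as in this sketch; all figures below are placeholder assumptions for illustration, not actual contract terms:

```python
# Hypothetical sketch: comparing transformative agreements by annual cost
# and cost per expected OA article. Numbers are placeholders, NOT actual
# contract terms from the Swiss or German deals.
agreements = {
    "Publisher A": {"total_cost_eur": 57_000_000, "years": 4, "expected_oa_articles": 10_000},
    "Publisher B": {"total_cost_eur": 30_000_000, "years": 3, "expected_oa_articles": 9_000},
}

for name, a in agreements.items():
    per_year = a["total_cost_eur"] / a["years"]
    per_article = a["total_cost_eur"] / a["expected_oa_articles"]
    print(f"{name}: {per_year:,.0f} EUR/year, {per_article:,.0f} EUR per OA article")
```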

National comparisons of early career researchers’ scholarly communication attitudes and behaviours – Jamali – Learned Publishing – Wiley Online Library

Abstract:  The paper compares the scholarly communication attitudes and practices of early career researchers (ECRs) in eight countries concerning discovery, reading, publishing, authorship, open access, and social media. The data are taken from the most recent investigation in the 4-year-long Harbingers project. A survey was undertaken to establish whether the scholarly communication behaviours of the new wave of researchers are uniform, progressing, or changing in the same overall direction, or whether they are impacted significantly by national and cultural differences. A multilingual questionnaire hosted on SurveyMonkey was distributed in 2019 via social media networks of researchers, academic publishers, and key ECR platforms in the UK, USA, France, China, Spain, Russia, Malaysia, and Poland. Over a thousand responses were obtained, and the main findings are that there is a significant degree of diversity in the scholarly communication attitudes and practices of ECRs from the various countries represented in the study, which cannot be solely explained by the different make-up of the samples. China, Russia, France, and Malaysia were more likely to differ with respect to any given scholarly activity, and responses from the UK and USA were relatively similar.
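
One standard way to test whether such country differences exceed what sample make-up alone would explain is a chi-square test of independence on a country-by-response contingency table; this sketch uses invented counts purely to show the mechanics, not the study's data:

```python
# Hypothetical sketch: testing whether responses to one survey item
# (e.g. "uses social media for scholarly work": yes/no) are independent
# of country. All counts are invented.
from scipy.stats import chi2_contingency

# Rows: countries; columns: [yes, no] responses to one item
observed = [
    [120, 40],   # UK
    [115, 45],   # USA
    [60, 100],   # China
    [70, 90],    # Russia
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests response patterns differ across countries
# beyond what a single shared distribution would predict.
```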

 


Publishing computational research – a review of infrastructures for reproducible and transparent scholarly communication | Research Integrity and Peer Review | Full Text

Abstract:

Background

The trend toward open science increases the pressure on authors to provide access to the source code and data they used to compute the results reported in their scientific papers. Since sharing materials reproducibly is challenging, several projects have developed solutions to support the release of executable analyses alongside articles.

Methods

We reviewed 11 applications that can assist researchers in adhering to reproducibility principles. The applications were found through a literature search and through interactions with the reproducible research community. An application was included in our analysis if it (i) was actively maintained at the time the data for this paper were collected, (ii) supported the publication of executable code and data, and (iii) was connected to the scholarly publication process. By investigating the software documentation and published articles, we compared the applications across 19 criteria, such as deployment options and features that support authors in creating, and readers in studying, executable papers.
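
To make such a comparison concrete, the 19 criteria can be thought of as a feature matrix over applications; this sketch (with invented application names and criteria) shows how such a matrix supports filtering:

```python
# Hypothetical sketch: modelling the review's comparison as a feature
# matrix of applications x criteria. Names and values are invented.
comparison = {
    "App A": {"free_self_hosting": True,  "supports_jupyter": True,  "post_pub_edits": False},
    "App B": {"free_self_hosting": False, "supports_jupyter": True,  "post_pub_edits": True},
    "App C": {"free_self_hosting": True,  "supports_jupyter": False, "post_pub_edits": True},
}

# e.g. find applications a publisher could self-host at no cost
free_options = [name for name, c in comparison.items() if c["free_self_hosting"]]
print(free_options)
```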

Results

Of the 11 applications, eight allow publishers to self-host the system for free, whereas three provide paid services. Authors can submit an executable analysis as Jupyter Notebooks or R Markdown documents (10 applications support these formats). All approaches provide features to assist readers in studying the materials, e.g., one-click reproduction of results or tools for manipulating the analysis parameters. Six applications allow materials to be modified after publication.
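
Both supported formats are literate-programming documents that interleave narrative and executable code. As a minimal illustration, here is a Python script in the Jupyter-compatible “percent” cell format (as used by tools like Jupytext); the analysis content is invented:

```python
# %% [markdown]
# # Example executable analysis
# Narrative text and code live in one document, so a reader (or a
# one-click reproduction service) can rerun every result. Content invented.

# %%
import statistics

measurements = [2.1, 2.4, 1.9, 2.2, 2.0]

# %%
# The reported result is recomputed from the data, not pasted in by hand.
print(f"mean = {statistics.mean(measurements):.2f}")
```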

Conclusions

The applications predominantly support authors in publishing reproducible research through literate programming. For readers, most applications provide user interfaces to inspect and manipulate the computational analysis. The next step is to investigate the gaps identified in this review, such as the costs publishers should expect when hosting an application, the handling of sensitive data, and the impact on the review process.

Comparing journal-independent review services – ASAPbio

“Preprinting not only accelerates the dissemination of science, but also enables early feedback from a broad community. Therefore, it’s no surprise that there are many innovative projects offering feedback, commentary, and peer reviews on preprints. Such feedback can range from the informal (tweets, comments, annotations, or a simple endorsement) to the formal (an editor-organized process that can provide an in-depth assessment of the manuscript leading to a formal acceptance/endorsement like in a journal). This organized, journal-independent peer review might accomplish several goals: it can provide readers with context to evaluate the paper and foster constructive review that is focused on improving the science rather than gatekeeping for a particular journal. It can also be used as a way to validate the scientific content of a preprint, supporting its value as a citable reference for the scientific literature. When preprints are submitted to a journal, journal-independent peer review can be used by editors to speed up their editorial decisions. Additionally, since 15 million hours of reviewers’ time is wasted every year in re-reviewing revised manuscripts, transparent peer review on preprints could be one way to make the entire publishing process more efficient for reviewers, authors, and editors alike.

Nevertheless, some researchers have expressed confusion about the wealth of options available. To address this, we’ve built a table to compare four services that allow authors to request journal-independent peer reviews: eLife’s Preprint Review, Peerage of Science, Peer Community In, and Review Commons from EMBO and ASAPbio. (Note that there are other services, such as the overlay journal RR:C19, that perform reviews without the authors’ request and allow the reviews to be reused.) All four of the services covered here create reviews that can be used for journal editorial decisions, and they share reviewers’ identities with journals if authors decide to submit their article to a journal. At the same time, they differ in several key areas, outlined below….”

Generalist Repository Comparison Chart | Zenodo

“The General Repository Comparison Chart and FAIRsharing Collection (https://fairsharing.org/collection/GeneralRepositoryComparison) is an outcome of the NIH Workshop on the Role of Generalist Repositories to Enhance Data Discoverability and Reuse held 11-12 February 2020 (workshop summary).  Following the workshop, representatives of the participating generalist repositories collaborated to develop a tool researchers could use to make decisions about selecting a general repository. We intend for the content to be dynamically updated through our partnership with FAIRsharing.  As we work towards that goal, we currently have a static version of the comparison.  

It is important to state that researchers should first determine whether an appropriate domain repository exists for their research data. Tools such as FAIRsharing.org and re3data.org can help with this determination. If using a domain repository is not possible, then a researcher should review the generalist repository chart and also consider their own institutional repository as a possible location to store their data. Researchers need to comply with the requirements of their community, funder, country, publisher, and possibly others to ensure the best repository is selected. 
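
The recommended selection order can be read as a simple decision procedure; this sketch encodes it with hypothetical helper functions (the lookups against FAIRsharing.org or re3data.org are stand-ins, not real API calls):

```python
# Hypothetical sketch of the repository-selection order described above.
# The helper functions are stand-ins, not real FAIRsharing/re3data API calls.
def find_domain_repository(discipline: str) -> str | None:
    """Stand-in for searching FAIRsharing.org / re3data.org by discipline."""
    return None  # pretend no domain repository was found

def institutional_repository_available() -> bool:
    return False

def choose_repository(discipline: str, funder_requirements: list[str]) -> str:
    # 1. Prefer an appropriate domain repository.
    repo = find_domain_repository(discipline)
    if repo:
        return repo
    # 2. Otherwise consider the institutional repository.
    if institutional_repository_available():
        return "institutional repository"
    # 3. Fall back to a generalist repository meeting all requirements.
    return "generalist repository meeting: " + ", ".join(funder_requirements)

print(choose_repository("genomics", ["open license", "DOI assignment"]))
```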

For those interested in continuing the discussion on the role of generalist repositories to enhance data discoverability and reuse, please consider joining the Research Data Alliance (RDA) and particularly the joint RDA/Force11 FAIRsharing WG (https://www.rd-alliance.org/group/fairsharing-registry-connecting-data-policies-standards-databases.html). There is now an RDA Community to support conversation and a mailing list you can join (https://www.rd-alliance.org/groups/generalist-repository-comparison-chart-management-group). Watch for relevant sessions in the upcoming plenaries.”

Publication models in scientific publishing: to open or not? | Royal College of Physicians of Edinburgh

“The process of scientific publication entails significant input, not only from the authors but also from the editors, reviewers and publishers. Journals may be published by commercial publishers or by scientific societies, such as the Royal College of Physicians of Edinburgh (RCPE), which publishes this journal. Such published information may be available only to subscribers of the journal (the subscription-based model) or freely available to anyone, i.e. the open access (OA) model.1 In this article, we provide an overview of various models of publication and also briefly analyse some recent developments….”