Transparent review in preprints – Cold Spring Harbor Laboratory

“Cold Spring Harbor Laboratory (CSHL) today announced a new pilot project—Transparent Review in Preprints (TRiP)—that enables journals and peer review services to post peer reviews of submitted manuscripts on CSHL’s preprint server bioRxiv.

“The new project is part of broader efforts by bioRxiv to work with other organizations to help the scholarly publishing ecosystem evolve,” said John Inglis, co-founder of bioRxiv at CSHL.

The project is powered by the web annotation tool Hypothesis and will allow participating organizations to post peer reviews in dedicated Hypothesis groups alongside relevant preprints on the bioRxiv website. Authors must opt in with the journal/service in advance. The use of restricted Hypothesis groups allows participating organizations to control the process and ensure that only reviews they approve are displayed. Readers will continue to be able to post their own reactions to individual preprints through bioRxiv’s dedicated comment section.
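The restricted-group mechanism can be sketched against the public Hypothesis API, which accepts annotations via `POST /api/annotations`. This is only an illustration: the group ID, token, preprint URL, and helper function below are placeholders, not TRiP’s actual configuration.

```python
# Hypothetical sketch: how a journal might post an approved peer review as a
# Hypothesis annotation in a restricted group. Field names follow the public
# Hypothesis API; the specific values are placeholders.

def build_review_annotation(preprint_url, review_text, group_id):
    """Assemble the JSON payload for a Hypothesis annotation request."""
    return {
        "uri": preprint_url,      # the bioRxiv preprint page being annotated
        "text": review_text,      # the full peer-review text
        "group": group_id,        # restricted group controlled by the journal
        "tags": ["peer-review"],
    }

payload = build_review_annotation(
    "https://www.biorxiv.org/content/10.1101/000000v1",  # placeholder URL
    "Reviewer 1: The methods are sound...",
    "JournalGroupID",  # placeholder group identifier
)

# The actual request would be sent with the journal's API token, e.g.:
# requests.post("https://api.hypothes.is/api/annotations",
#               json=payload,
#               headers={"Authorization": "Bearer <API_TOKEN>"})
```

Because only the journal holds the group’s credentials, readers see the approved review in the group layer, while unsolicited comments stay in bioRxiv’s separate comment section.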

eLife and the EMBO Press journals, together with Peerage of Science and Review Commons, two journal-independent peer review initiatives, will be the first to participate. Several other groups plan to join the pilot later, including the American Society of Plant Biologists and the Public Library of Science….”

Enabling A Conversation Across Scholarly Monographs through Open Annotation

Abstract:  The digital format opens up new possibilities for interaction with monographic publications. In particular, annotation tools make it possible to broaden the discussion on the content of a book, to suggest new ideas, to report errors or inaccuracies, and to conduct open peer reviews. However, this requires the support of users who might not yet be familiar with the annotation of digital documents. This paper gives concrete examples and recommendations for exploiting the potential of annotation in academic research and teaching. After presenting the Hypothesis annotation tool, the article focuses on its use in the context of HIRMEOS (High Integration of Research Monographs in the European Open Science Infrastructure), a project aimed at improving Open Access digital monographs. The general outline and aims of a post-peer-review experiment with the annotation tool, as well as its usage in didactic activities concerning monographic publications, are presented and proposed as potential best practices for similar annotation activities.

Open Web annotation as collaborative learning | Kalir | First Monday

Abstract:  This paper describes the use of open Web annotation (OWA) for collaborative learning among online communities. OWA is defined by the open standards, principles, and practices associated with the open Web. Specifically, this case study examines collaborative learning mediated by the OWA technology Hypothesis, a standards-compliant and open-source technology that situates collaboration in texts-as-contexts. Hypothesis OWA supports a repertoire of six collaborative learning practices: Affording multimodal expression, establishing connections across contexts, archiving activity, visualizing expertise and cognition, contributing to open educational resources, and fostering open educational practices. The use of Hypothesis OWA is then described in three online communities associated with scientific research and communication, educator professional development, and Web literacy and fact-checking. The article concludes by advancing three broad questions and related research agendas regarding how OWA as collaborative learning attends to linkages among formal and informal learning environments, the growth of both open educational resources and practices, and the use of open data as learning analytics.

Announcing Invest in Open Infrastructure – Hypothesis

“Today, together with a set of global partners, Hypothesis is proud to announce the formation of Invest in Open Infrastructure (IOI), a new initiative to dramatically increase the amount of funding available to open scholarly infrastructure….

Projects like Hypothesis are extremely difficult to begin, grow, and sustain over time. We were fortunate to have had early believers on Kickstarter, and then stalwart supporters over the last 8 years in foundations like Sloan, Mellon, Shuttleworth, Knight, Helmsley, and Omidyar. However, foundation support alone is insufficient for the longer-term, larger funding required to bridge most open projects, including ours, to a sustainable future. Foundations tend to support early projects, but that support usually falls off with time. The kind of mezzanine funding that a for-profit technology company might find from venture groups in later stages is simply not available within the ecosystem of non-profit, open source projects.

The core problem is that the true consumers of scholarly infrastructure (the researchers, scholars, and the institutions and agencies that form the great majority of users) have the means to sustain it but lack the structure to do so. Libraries know of a few platforms they need and support them directly, but there are hundreds of other projects with no visibility at the institutional level, because they are still early or because researchers, rather than the institutions themselves, depend on them directly. Projects like Hypothesis, like any technology infrastructure trying to scale over years to maturity, need ongoing funding until sustainability can be achieved.


What is needed is a coordinating system which can identify, track and assess open infrastructure across diverse categories and constituencies and make recommendations to funders who can pool their resources to sustain it. This coordinating system is exactly the idea behind IOI….”

Web annotation tool Hypothesis hits a milestone

“The team behind Hypothesis, an open-source software tool that allows people to annotate web pages, announced in March that its users had collectively posted more than 5 million comments across the scholarly web since the tool was launched in 2011. That’s up from about 220,000 total comments in 2015 (see ‘Comment counts’). The company has grown from 26,000 registered users to 215,000 over the same period….”

Towards Open Annotation: Examples and Experiments

Abstract:  This article interrogates how digital text annotation tools and projects facilitate online engagement and virtual communities of practice. With the rise of the Web 2.0 movement and the proliferation of digital resources, annotation has evolved from an isolated practice to a collaborative one. This article unpacks the impact of this shift by providing an in-depth discussion of five web-based tools and two social reading projects. This article examines issues of design, usability, and applicability to pedagogical intervention as well as underscores how productive group dynamics can be fostered through digital, social annotation. 

‘No comment’? A study of commenting on PLOS articles – Simon Wakeling, Peter Willett, Claire Creaser, Jenny Fry, Stephen Pinfield, Valerie Spezi, Marc Bonne, Christina Founti, Itzelle Medina Perea, 2019

Abstract:  Article-commenting functionality allows users to add publicly visible comments to an article on a publisher’s website. As well as facilitating forms of post-publication peer review, for publishers of open-access mega-journals (large, broad scope, open-access journals that seek to publish all technically or scientifically sound research) comments are also thought to serve as a means for the community to discuss and communicate the significance and novelty of the research, factors which are not assessed during peer review. In this article we present the results of an analysis of commenting on articles published by the Public Library of Science (PLOS), publisher of the first and best-known mega-journal PLOS ONE, between 2003 and 2016. We find that while overall commenting rates are low, and have declined since 2010, there is substantial variation across different PLOS titles. Using a typology of comments developed for this research, we also find that only around half of comments engage in an academic discussion of the article and that these discussions are most likely to focus on the paper’s technical soundness. Our results suggest that publishers are yet to encourage significant numbers of readers to leave comments, with implications for the effectiveness of commenting as a means of collecting and communicating community perceptions of an article’s importance.

Göttingen University Press Platform supports Annotation via Hypothes.is – Hirmeos Project

“Within the HIRMEOS project, Göttingen University Press aims to add new services to its platform that allow for deeper interaction with Open Access monographs. The University Press is pleased to announce that it is now possible to annotate all its publications within the browser through the Hypothes.is annotation tool….”

Publisciences – Sciences publishing by Researchers

“Scientific publication is an essential tool for the dissemination and transfer of knowledge. Free access to publications and “bibliodiversity” are fundamental both for researchers, who wish to broadcast their work and thus secure funding for it, and for the scientific community, which is fuelled by the progress of each of its members. Researchers ensure that scientific discoveries can be replicated. Today, publishers, via their editorial choices, influence the direction of research, whereas this should be the prerogative of researchers. As evidenced in the “Appel de Jussieu”(1), researchers are in favour of an Open Access model, yet the community is faced with one pressing problem: “Who will foot the bill?”. Within the current Open Access model, publishing is expensive for authors/researchers (from €1000 to €5000 for one article), despite the low added value provided by publishers in terms of editing, but also reviewing. Peer review and validation are performed by other researchers for free. There is a real need to create a more affordable, more diverse, and fairer Open Access solution. Researchers’ work, especially as authors and reviewers, is financed by public funds with no compensation provided by private publishers.

An innovative cooperative

We propose a scientific publication platform led by the research community itself. For the collective interest to prevail, we want to set up the first cooperative platform dedicated to scientific publication. In France there is a type of business entity that suits this purpose perfectly: the SCIC. Short for Société Coopérative d’Intérêt Collectif (public-interest cooperative company), this type of entity lets each participant weigh in as a stakeholder: researchers (authors and reviewers), publishers, public institutions and investors….”