Abstract: Maximizing the impact and value of scientific research requires efficient knowledge distribution, which increasingly depends on the integration of standardized published data into online databases. To make data integration more comprehensive and efficient for fission yeast research, PomBase has pioneered a community curation effort that engages publication authors directly in FAIR-sharing of data representing detailed biological knowledge from hypothesis-driven experiments. Canto, an intuitive online curation tool that enables biologists to describe their detailed functional data using shared ontologies, forms the core of PomBase’s system. With 8 years’ experience, and as the author response rate reaches 50%, we review community curation progress and the insights we have gained from the project. We highlight incentives and nudges we deploy to maximize participation, and summarize project outcomes, which include increased knowledge integration and dissemination as well as the unanticipated added value arising from co-curation by publication authors and professional curators.
“Given the growth of preprint servers and alternative platforms, it is increasingly important to describe their disciplinary scope and compare and contrast policies including governance, licensing, archiving strategies and the nature of any screening checks. These practices are important to both researchers and policymakers.
Here we present searchable information about preprint platforms relevant to life sciences, biomedical, and clinical research….”
“Hypothesis just reached its 10 millionth annotation. Half of those have happened in the last year.
This milestone is the achievement of a community: all the scientists, scholars, journalists, authors, publishers, fact-checkers, technologists and, now more than ever, teachers and students who have used and valued collaborative annotation over the years. Thank you all for reaching this momentous number with us, especially during this challenging time….”
“Research Square is a preprint platform that allows you to share your work early, gain feedback and improve your manuscript, and discover emerging science all in one place….
Research Square features all the characteristics of a traditional preprint server, but with some notable differences:
All preprints are displayed in HTML. The full text is indexed and machine-readable so that it is more discoverable by search engines.
Authors can demonstrate to the community they meet established standards in scientific reporting by purchasing assessments in integrity, reproducibility, and statistical rigor. Badge icons are displayed on their article page for assessments they pass.
Video summaries can be added to the article page to communicate your research to a broader audience.
Readers can comment on a paper using our custom-built commenting system or the hypothes.is annotation tool.
Figures are rendered using a lightbox that allows for zooming and downloading. …”
“Over the past weeks, our contacts at schools, colleges, and universities have been writing to us asking about how they can use Hypothesis in response to campus closures and the move to online courses as a result of the COVID-19 crisis. We’d like to help.
Collaborative annotation can help connect students and teachers while they keep their distance to safeguard their health during the current crisis. Reading alongside and interacting with each other using Hypothesis is about as close to a seminar-style experience as they can have online.
To support the role that collaborative annotation can play in facilitating expanded online classes, Hypothesis is waiving all fees to educational institutions for the remainder of 2020, and will evaluate whether to extend this as the current situation develops. Existing partners can request a refund or apply any fees that they have already paid towards future costs….”
“Cold Spring Harbor Laboratory (CSHL) today announced a new pilot project—Transparent Review in Preprints (TRiP)—that enables journals and peer review services to post peer reviews of submitted manuscripts on CSHL’s preprint server bioRxiv.
“The new project is part of broader efforts by bioRxiv to work with other organizations to help the scholarly publishing ecosystem evolve,” said John Inglis, co-founder of bioRxiv at CSHL.
The project is powered by the web annotation tool Hypothesis and will allow participating organizations to post peer reviews in dedicated Hypothesis groups alongside relevant preprints on the bioRxiv website. Authors must opt in with the journal/service in advance. The use of restricted Hypothesis groups allows participating organizations to control the process and ensure that only reviews they approve are displayed. Readers will continue to be able to post their own reactions to individual preprints through bioRxiv’s dedicated comment section.
eLife and the EMBO Press journals, together with Peerage of Science and Review Commons, two journal-independent peer review initiatives, will be the first to participate. Several other groups plan to join the pilot later, including the American Society of Plant Biologists and the Public Library of Science….”
Abstract: The digital format opens up new possibilities for interaction with monographic publications. In particular, annotation tools make it possible to broaden the discussion on the content of a book, to suggest new ideas, to report errors or inaccuracies, and to conduct open peer reviews. However, this requires the support of users who might not yet be familiar with the annotation of digital documents. This paper gives concrete examples and recommendations for exploiting the potential of annotation in academic research and teaching. After presenting the Hypothesis annotation tool, the article focuses on its use in the context of HIRMEOS (High Integration of Research Monographs in the European Open Science Infrastructure), a project aimed at improving the Open Access digital monograph. The general outline and aims of a post-peer-review experiment with the annotation tool, as well as its use in didactic activities concerning monographic publications, are presented and proposed as potential best practices for similar annotation activities.
Abstract: This paper describes the use of open Web annotation (OWA) for collaborative learning among online communities. OWA is defined by the open standards, principles, and practices associated with the open Web. Specifically, this case study examines collaborative learning mediated by the OWA technology Hypothesis, a standards-compliant and open-source technology that situates collaboration in texts-as-contexts. Hypothesis OWA supports a repertoire of six collaborative learning practices: affording multimodal expression, establishing connections across contexts, archiving activity, visualizing expertise and cognition, contributing to open educational resources, and fostering open educational practices. The use of Hypothesis OWA is then described in three online communities associated with scientific research and communication, educator professional development, and Web literacy and fact-checking. The article concludes by advancing three broad questions and related research agendas regarding how OWA as collaborative learning attends to linkages among formal and informal learning environments, the growth of both open educational resources and practices, and the use of open data as learning analytics.
Projects like Hypothesis are extremely difficult to begin, grow and sustain over time. We were fortunate to have had early believers on Kickstarter, and then stalwart supporters over the last 8 years in foundations like Sloan, Mellon, Shuttleworth, Knight, Helmsley and Omidyar. However, this foundation support is still insufficient for the longer-term, larger funding required to bridge to a sustainable future for most open projects, including ours. Foundations tend to support early projects, but that support usually falls off with time. The kind of mezzanine funding that a for-profit technology might find from venture groups in later stages is simply not available within the ecosystem of non-profit, open source projects.
The core problem is that the true consumers of scholarly infrastructure — namely the researchers, scholars and their institutions and agencies which form the vast majority of users — have the means to sustain it, but lack the structure to do so. Libraries know of a few platforms that they need and provide them direct support, but there are hundreds of other projects with no visibility at the institutional level, because they are still early, or because researchers rather than institutions depend on them directly. Projects like Hypothesis, like any technology infrastructure trying to scale over years to maturity, need ongoing funding until sustainability can be achieved.
What is needed is a coordinating system which can identify, track and assess open infrastructure across diverse categories and constituencies and make recommendations to funders who can pool their resources to sustain it. This coordinating system is exactly the idea behind IOI….”
“The team behind Hypothesis, an open-source software tool that allows people to annotate web pages, announced in March that its users had collectively posted more than 5 million comments across the scholarly web since the tool was launched in 2011. That’s up from about 220,000 total comments in 2015 (see ‘Comment counts’). The company has grown from 26,000 registered users to 215,000 over the same period….”