Crowdsourcing Scholarly Discourse Annotations | 26th International Conference on Intelligent User Interfaces

Abstract:  The number of scholarly publications grows steadily every year, and it is becoming harder to find, assess and compare scholarly knowledge effectively. Scholarly knowledge graphs have the potential to address these challenges. However, creating such graphs remains a complex task. We propose a method to crowdsource structured scholarly knowledge from paper authors with a web-based user interface supported by artificial intelligence. The interface enables authors to select key sentences for annotation. It integrates multiple machine learning algorithms to assist authors during the annotation, including class recommendation and key sentence highlighting. We envision the interface being integrated into paper submission processes, for which we define three main task requirements. We evaluated the interface with a user study in which participants were asked to annotate one of their own articles. With the resulting data, we determined whether the participants were able to perform the task successfully. Furthermore, we evaluated the interface’s usability and the participants’ attitudes towards the interface with a survey. The results suggest that sentence annotation is a feasible task for researchers and that they do not object to annotating their articles during the submission process.
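
The abstract describes two AI assists, class recommendation and key sentence highlighting. As a rough illustration of the first, the sketch below recommends an annotation class for a selected sentence using a simple TF-IDF baseline; the discourse classes and training sentences are invented stand-ins, not the paper’s actual model or data.

```python
# Minimal sketch of a sentence-class recommender of the kind the abstract
# describes. Classes and training sentences are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled sentences standing in for a real annotated corpus.
train_sentences = [
    "Prior work has studied knowledge graphs for publications.",
    "We recruited 30 participants for the study.",
    "The interface achieved a high usability score.",
    "Scholarly search is becoming increasingly difficult.",
    "Participants annotated one of their own articles.",
    "Annotation accuracy improved with highlighting enabled.",
]
train_labels = ["background", "method", "result",
                "background", "method", "result"]

# TF-IDF features plus a linear classifier: a common, simple baseline
# for suggesting an annotation class for a selected sentence.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_sentences, train_labels)

sentence = "We asked each author to select key sentences in their paper."
probs = model.predict_proba([sentence])[0]
ranked = sorted(zip(model.classes_, probs), key=lambda pair: -pair[1])
for cls, p in ranked:
    print(f"{cls}: {p:.2f}")  # top entries can be shown as suggestions
```

A real deployment would train on a curated corpus, and the same per-sentence scores could also drive key sentence highlighting.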

Open Context: Web-based research data publishing

“Open Context reviews, edits, annotates, publishes and archives research data and digital documentation. We publish your data and preserve it with leading digital libraries. We take steps beyond archiving to richly annotate and integrate your analyses, maps and media. This links your data to the wider world and broadens the impact of your ideas….”

ANN: A platform to annotate text with Wikidata IDs | Zenodo

Abstract:  Report of the work done by the Ann team at the eLife Sprint 2020. 

It describes the effort towards a system for the universal annotation of biomedical articles using Wikidata’s collaborative knowledge graph.

The project is currently active at https://github.com/lubianat/ann. 
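
As a hedged sketch of the core lookup such a system needs, the snippet below maps a free-text term to candidate Wikidata IDs via Wikidata’s public wbsearchentities API; it is not taken from the Ann repository, and the search term is only an example.

```python
# Sketch of the kind of term-to-Wikidata-ID lookup an annotation system
# like Ann might perform. Uses Wikidata's public wbsearchentities API.
import requests

def wikidata_candidates(term: str, limit: int = 5):
    """Return candidate (Wikidata ID, label) pairs for a free-text term."""
    resp = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbsearchentities",  # public entity-search endpoint
            "search": term,
            "language": "en",
            "format": "json",
            "limit": limit,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [(e["id"], e.get("label", "")) for e in resp.json()["search"]]

# Example term from a biomedical article; results depend on live Wikidata.
for qid, label in wikidata_candidates("mitochondrion"):
    print(qid, label)
```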

Community curation in PomBase: enabling fission yeast experts to provide detailed, standardized, sharable annotation from research publications | Database | Oxford Academic

Abstract:  Maximizing the impact and value of scientific research requires efficient knowledge distribution, which increasingly depends on the integration of standardized published data into online databases. To make data integration more comprehensive and efficient for fission yeast research, PomBase has pioneered a community curation effort that engages publication authors directly in FAIR-sharing of data representing detailed biological knowledge from hypothesis-driven experiments. Canto, an intuitive online curation tool that enables biologists to describe their detailed functional data using shared ontologies, forms the core of PomBase’s system. With 8 years’ experience, and as the author response rate reaches 50%, we review community curation progress and the insights we have gained from the project. We highlight incentives and nudges we deploy to maximize participation, and summarize project outcomes, which include increased knowledge integration and dissemination as well as the unanticipated added value arising from co-curation by publication authors and professional curators.
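
To make “standardized, sharable annotation” concrete: a community-curated statement boils down to a gene, a shared-ontology term, an evidence code and the source publication. The record below is an invented example of that shape, not Canto’s actual data model.

```python
# Illustrative shape of a standardized, ontology-based annotation record;
# all field values are invented examples.
from dataclasses import dataclass

@dataclass
class CurationRecord:
    gene: str       # systematic gene identifier
    term_id: str    # term from a shared ontology, e.g. the Gene Ontology
    evidence: str   # evidence code describing the supporting experiment
    reference: str  # publication the annotation was curated from

# Example values in the style of a fission yeast annotation.
record = CurationRecord(
    gene="SPAC3H1.11",
    term_id="GO:0006914",       # autophagy
    evidence="IMP",             # inferred from mutant phenotype
    reference="PMID:XXXXXXXX",  # placeholder, not a real citation
)
print(record)
```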

Directory of preprint server policies and practices – ASAPbio

“Given the growth of preprint servers and alternative platforms, it is increasingly important to describe their disciplinary scope and compare and contrast policies including governance, licensing, archiving strategies and the nature of any screening checks. These practices are important to both researchers and policymakers.

Here we present searchable information about preprint platforms relevant to life sciences, biomedical, and clinical research….”

Our 10 Millionth Annotation – Hypothesis

“Hypothesis just reached its 10 millionth annotation. Half of those have happened in the last year.

This milestone is the achievement of a community: all the scientists, scholars, journalists, authors, publishers, fact-checkers, technologists and, now more than ever, teachers and students who have used and valued collaborative annotation over the years. Thank you all for reaching this momentous number with us, especially during this challenging time….”

Research Square

“Research Square is a preprint platform that allows you to share your work early, gain feedback and improve your manuscript, and discover emerging science all in one place….

Research Square features all the characteristics of a traditional preprint server, but with some notable differences:

All preprints are displayed in HTML. The full text is indexed and machine-readable so that it is more discoverable by search engines.
Authors can demonstrate to the community that they meet established standards in scientific reporting by purchasing assessments in integrity, reproducibility, and statistical rigor. Badge icons are displayed on their article page for assessments they pass.
Video summaries can be added to the article page to communicate your research to a broader audience.
Readers can comment on a paper using our custom-built commenting system or the hypothes.is annotation tool.
Figures are rendered using a lightbox that allows for zooming and downloading. …”

Hypothesis for Instructional Continuity During COVID-19 – Hypothesis

“Over the past weeks, our contacts at schools, colleges, and universities have been writing to us asking about how they can use Hypothesis in response to campus closures and the move to online courses as a result of the COVID-19 crisis. We’d like to help.

Collaborative annotation can help connect students and teachers while they keep their distance to safeguard their health during the current crisis. Reading alongside and interacting with each other using Hypothesis is about as close to a seminar-style experience as they can have online.

To support the role that collaborative annotation can play in facilitating expanded online classes, Hypothesis is waiving all fees to educational institutions for the remainder of 2020, and will evaluate whether to extend this as the current situation develops. Existing partners can request a refund or apply any fees that they have already paid towards future costs….”

Transparent review in preprints – Cold Spring Harbor Laboratory

“Cold Spring Harbor Laboratory (CSHL) today announced a new pilot project—Transparent Review in Preprints (TRiP)—that enables journals and peer review services to post peer reviews of submitted manuscripts on CSHL’s preprint server bioRxiv.

“The new project is part of broader efforts by bioRxiv to work with other organizations to help the scholarly publishing ecosystem evolve,” said John Inglis, co-founder of bioRxiv at CSHL.

The project is powered by the web annotation tool Hypothesis and will allow participating organizations to post peer reviews in dedicated Hypothesis groups alongside relevant preprints on the bioRxiv website. Authors must opt in with the journal/service in advance. The use of restricted Hypothesis groups allows participating organizations to control the process and ensure that only reviews they approve are displayed. Readers will continue to be able to post their own reactions to individual preprints through bioRxiv’s dedicated comment section.

eLife and the EMBO Press journals, together with Peerage of Science and Review Commons, two journal-independent peer review initiatives, will be the first to participate. Several other groups plan to join the pilot later, including the American Society for Plant Biology and the Public Library of Science….”
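
Since TRiP is powered by Hypothesis, posted reviews are ordinary annotations and can be read back through Hypothesis’s public search API. A minimal sketch, assuming a placeholder preprint URL and group ID (real TRiP reviews live in dedicated, journal-controlled groups):

```python
# Sketch of reading annotations on a document via the public Hypothesis
# search API. The preprint URL below is a placeholder, not a real paper.
import requests

def fetch_annotations(uri: str, group: str = "__world__", limit: int = 20):
    """Fetch annotations on a document, optionally restricted to one group."""
    resp = requests.get(
        "https://api.hypothes.is/api/search",
        params={"uri": uri, "group": group, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["rows"]

# Substitute a real preprint URL and, for TRiP reviews, the group's ID.
for ann in fetch_annotations("https://www.biorxiv.org/content/10.1101/XXXXXX"):
    print(ann["user"], ann.get("text", "")[:80])
```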

Enabling A Conversation Across Scholarly Monographs through Open Annotation

Abstract:  The digital format opens up new possibilities for interaction with monographic publications. In particular, annotation tools make it possible to broaden the discussion on the content of a book, to suggest new ideas, to report errors or inaccuracies, and to conduct open peer reviews. However, this requires the support of users, who might not yet be familiar with the annotation of digital documents. This paper gives concrete examples and recommendations for exploiting the potential of annotation in academic research and teaching. After presenting the Hypothesis annotation tool, the article focuses on its use in the context of HIRMEOS (High Integration of Research Monographs in the European Open Science Infrastructure), a project aimed at improving Open Access digital monographs. The general outline and aims of a post-peer-review experiment with the annotation tool, as well as its use in didactic activities around monographic publications, are presented and proposed as potential best practices for similar annotation activities.