The In/Visible, In/Audible Labor of Digitizing the Public Domain

Abstract: In this article I call for more recognition of and scholarly engagement with public, volunteer digital humanities projects, using the example of LibriVox.org to consider what public, sustainable, digital humanities work can look like beyond the contexts of institutional sponsorship. Thousands of volunteers are using LibriVox to collaboratively produce free audiobook versions of texts in the US public domain. The work of finding, selecting, and preparing texts to be digitized and published in audio form is complex and slow, and not all of this labor is ultimately visible, valued, or rewarded. Drawing on an ethnographic study of 12 years of archived discourse and documentation, I interrogate digital traces of the processes by which several LibriVox versions of Anne of Green Gables have come into being, watching for ways in which policies and infrastructure have been influenced by variously visible and invisible forms of work. Making visible the intricate, unique, archived experiences of the crowdsourcing community of LibriVox volunteers and their tools adds to still-emerging discussions about how to value extra-institutional, public, distributed digital humanities work.

RightsStatements in Wikidata

“We are pleased to report that the volunteer community behind Wikidata – the freely licensed structured database of information, sister to Wikipedia – has recently approved the creation of a dedicated metadata Property for RightsStatements: P6426, to be precise. This will increase the chances that accurate, understandable, and precise rights-labelling information about cultural heritage works will be findable by end-users.

Here Liam Wyatt explains how this change came about, and what it means for cultural heritage organisations around the world who contribute items to Wikidata….”
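As a concrete illustration of how the new property could make rights information findable, here is a minimal sketch that queries the public Wikidata Query Service for items carrying P6426. Only the property ID comes from the post above; the query shape, result handling, and User-Agent string are illustrative assumptions.

```python
# Minimal sketch (see lead-in for assumptions): list a few Wikidata
# items carrying the RightsStatements property P6426, via the public
# Wikidata Query Service SPARQL endpoint.
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT ?item ?itemLabel ?rights WHERE {
  ?item wdt:P6426 ?rights .   # P6426: the RightsStatements property
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 10
"""

def rights_labelled_items():
    """Return (item URI, label, rights value) triples from Wikidata."""
    response = requests.get(
        WDQS_ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "rightsstatements-demo/0.1 (illustrative)"},
    )
    response.raise_for_status()
    rows = response.json()["results"]["bindings"]
    return [
        (row["item"]["value"], row["itemLabel"]["value"], row["rights"]["value"])
        for row in rows
    ]

if __name__ == "__main__":
    for uri, label, rights in rights_labelled_items():
        print(f"{label} <{uri}>: {rights}")
```

Whether P6426 stores its value as a URL or as a linked item is not specified in the excerpt; the query treats it as an opaque value, which works either way.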

The Wikipedia Library – Meta

“The Wikipedia Library is an open research hub, a place for active Wikipedia editors to gain access to the vital reliable sources that they need to do their work and to be supported in using those resources to improve the encyclopedia. We aim to make access and use of sources free, easy, collaborative and efficient.

The Wikipedia Library is run by a team of Wikimedia Foundation staff and global volunteers. We operate on a community-organized satellite model: we administer the global project but work with local coordinators in local Wikipedia projects to help each community set up their own libraries….”

Enabling A Conversation Across Scholarly Monographs through Open Annotation

Abstract: The digital format opens up new possibilities for interaction with monographic publications. In particular, annotation tools make it possible to broaden the discussion of a book's content, to suggest new ideas, to report errors or inaccuracies, and to conduct open peer reviews. However, this requires support for users who may not yet be familiar with annotating digital documents. This paper gives concrete examples and recommendations for exploiting the potential of annotation in academic research and teaching. After presenting the Hypothesis annotation tool, the article focuses on its use in the context of HIRMEOS (High Integration of Research Monographs in the European Open Science Infrastructure), a project aimed at improving the Open Access digital monograph. The general outline and aims of a post-peer-review experiment with the annotation tool, as well as its use in teaching activities around monographic publications, are presented and proposed as potential best practices for similar annotation activities.
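To make the workflow concrete, here is a minimal sketch of how the public annotation layer on a monograph could be retrieved through Hypothesis's public search API. The endpoint is Hypothesis's documented public API; the document URL is a placeholder, and without authentication only public annotations are returned.

```python
# Minimal sketch: fetch public Hypothesis annotations for one document
# URL via the public search API. The document URL is a placeholder.
import requests

HYPOTHESIS_SEARCH = "https://api.hypothes.is/api/search"

def public_annotations(document_url, limit=20):
    """Yield (user, annotation text) pairs for public annotations."""
    response = requests.get(
        HYPOTHESIS_SEARCH,
        params={"uri": document_url, "limit": limit},
    )
    response.raise_for_status()
    for row in response.json()["rows"]:
        yield row["user"], row["text"]

if __name__ == "__main__":
    # Placeholder monograph address, for illustration only.
    for user, text in public_annotations("https://example.org/monograph"):
        print(f"{user}: {text[:80]}")
```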

The Rise of Open Science in Psychology, a Preliminary Report

“Open science is on the rise. Across disciplines, there are increasing rates of sharing data, making available underlying materials and protocols, and preregistering studies and analysis plans. Hundreds of services have emerged to support open science behaviors at every stage of the research lifecycle. But what proportion of the research community is practicing open science? Where is the penetration of these behaviors strongest and weakest? Answers to these questions are important for evaluating progress in culture reform and for strategic planning of where to invest resources next.

The hardest part of getting meaningful answers to these questions is quantifying the population that is NOT doing the behaviors. For example, in a recent post, Nici Pfeiffer summarized the accelerating growth of OSF users on the occasion of hitting 150,000 registered users. That number and its non-linear growth suggest cultural movement associated with this one service, but how much movement?…”
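The denominator problem the post describes can be made concrete with a little arithmetic: the numerator (150,000 registered OSF users, from the post) is known, but the size of the research population is not, and the penetration estimate swings accordingly. A small sketch, with population figures that are invented purely for illustration:

```python
# Illustration of the denominator problem: the same numerator yields
# very different penetration estimates under different assumed
# population sizes. Both population figures below are hypothetical.
OSF_USERS = 150_000  # figure taken from the post above

hypothetical_populations = {
    "narrow denominator (one field's active researchers)": 300_000,
    "broad denominator (all active researchers worldwide)": 8_000_000,
}

for label, population in hypothetical_populations.items():
    print(f"{label}: {OSF_USERS / population:.1%}")
```

Under these invented denominators the estimate ranges from 50% down to under 2%, which is exactly why quantifying the non-adopting population matters.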

The Wikipedia Library/1Lib1Ref/Lessons/2019 – Meta

“The January [2019] #1Lib1Ref campaign saw infectious energy among its participants, with major additions, new entrants, and a new sense of competition between languages and institutions. In this iteration #1Lib1Ref reached record highs and saw extensive participation from emerging communities and languages. For the first time, the French Wikipedia took the lead, with over 33% of the total contributions made during the campaign. Based on these results, we anticipate that #1Lib1Ref has the potential to support outreach in diverse communities….”

Crowdsourcing in medical research: concepts and applications

Abstract: Crowdsourcing shifts medical research from a closed environment to an open collaboration between the public and researchers. We define crowdsourcing as an approach to problem solving in which an organization has a large group attempt to solve all or part of a problem and then shares the solutions. Crowdsourcing allows large groups of individuals to participate in medical research through innovation challenges, hackathons, and related activities. The purpose of this literature review is to examine the definition, concepts, and applications of crowdsourcing in medicine. This multi-disciplinary review defines crowdsourcing for medicine, identifies its conceptual antecedents (collective intelligence and open-source models), and explores the implications of the approach. Several critiques of crowdsourcing are also examined. Although several definitions of crowdsourcing exist, two elements are essential: (1) a large group of individuals, with and without relevant skills, proposes potential solutions; (2) solutions are shared through implementation or open-access materials. The public can be a central force in contributing to formative, pre-clinical, and clinical research. A growing evidence base suggests that crowdsourcing in medicine can result in high-quality outcomes, broad community engagement, and more open science.