Abstract: In this article I call for more recognition of and scholarly engagement with public, volunteer digital humanities projects, using the example of LibriVox.org to consider what public, sustainable, digital humanities work can look like beyond the contexts of institutional sponsorship. Thousands of volunteers are using LibriVox to collaboratively produce free audiobook versions of texts in the US public domain. The work of finding, selecting, and preparing texts to be digitized and published in audio form is complex and slow, and not all of this labor is ultimately visible, valued, or rewarded. Drawing on an ethnographic study of 12 years of archived discourse and documentation, I interrogate digital traces of the processes by which several LibriVox versions of Anne of Green Gables have come into being, watching for ways in which policies and infrastructure have been influenced by variously visible and invisible forms of work. Making visible the intricate, unique, archived experiences of the crowdsourcing community of LibriVox volunteers and their tools adds to still-emerging discussions about how to value extra-institutional, public, distributed digital humanities work.
“In order to increase the contribution of open science to producing better science, the National Academies of Sciences, Engineering, and Medicine’s Roundtable on Aligning Incentives for Open Science convenes critical stakeholders to discuss the effectiveness of current incentives for adopting open science practices, current barriers of all types, and ways to move forward in order to align reward structures and institutional values. The Roundtable convenes two times per year and creates a venue for the exchange of ideas and joint strategic planning among key stakeholders. Each Roundtable meeting has a theme. The diverse themes target slightly different audiences but the core audience will consist of universities, government agencies, foundations, and other groups doing work related to open science. The Roundtable aims to improve coordination among stakeholders and increase awareness of current and future efforts in the broader open science community. The Roundtable will also convene one symposium per year, which may produce National Academies proceedings in brief….”
Abstract: Open research data is one of the key areas in the expanding open scholarship movement. Scholarly journals and publishers find themselves at the heart of the shift towards openness, with recent years seeing an increase in the number of scholarly journals with data-sharing policies aiming to increase transparency and reproducibility of research. In this article we present two case studies which examine the experiences that two leading academic publishers, Taylor & Francis and Springer Nature, have had in rolling out data-sharing policies. We illustrate some of the considerations involved in providing consistent policies across journals of many disciplines, reflecting on successes and challenges.
Abstract: We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF, or closely related terms. Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.
Abstract: Using an online survey of academics at 55 randomly selected institutions across the US and Canada, we explore priorities for publishing decisions and their perceived importance within review, promotion, and tenure (RPT). We find that respondents most value journal readership, while they believe their peers most value prestige and related metrics such as impact factor when submitting their work for publication. Respondents indicated that total number of publications, number of publications per year, and journal name recognition were the most valued factors in RPT. Older and tenured respondents (most likely to serve on RPT committees) were less likely to value journal prestige and metrics for publishing, while untenured respondents were more likely to value these factors. These results suggest disconnects between what academics value versus what they think their peers value, and between the importance of journal prestige and metrics for tenured versus untenured faculty in publishing and RPT perceptions.
“Overall, the results of our survey give reason to be optimistic: the majority of faculty understand that OA is about making research accessible and available. However, they also point to persistent misconceptions about OA, like necessarily high costs and low quality. This raises questions: How might these misconceptions be affecting RPT [review, promotion, and tenure] evaluations? How should researchers who want to prioritise the public availability of their work guard against the potential that their peers hold one of these negative associations? And, as a community, how can we better communicate the complexities of OA without further diluting the central message of open access? Perhaps we can begin by adequately representing and incentivising the basic principles of openness in our RPT documents.”
“Minister of State for Training, Skills, Innovation, Research & Development, John Halligan, has launched Ireland’s National Framework on the Transition to an Open Research Environment.
Prepared by the National Open Research Forum (NORF), the framework was a response to developments in open research, both in the EU and internationally.
Open research refers to the movement towards more transparent, collaborative, accessible and efficient research.
The framework's objective is to enhance the integrity, public trust and excellence in research across all disciplines. Its principles are to support access to research funded by the Irish government, improve the free flow of information across research communities, and boost transparency, accountability and public awareness of the results of publicly funded research. This is aligned with European Commission policy that has developed in this area. It makes recommendations on a range of topics, including open access to research data, the preservation and reuse of scientific information, skills and competencies, and incentives and rewards….”
“Minister John Halligan has launched Ireland’s National Framework on the Transition to an Open Research Environment….
The National Framework is a key deliverable of the National Open Research Forum (NORF), which was set up in 2017 to bring together key members of the research community to drive Ireland’s open research agenda as set out in Innovation 2020, Ireland’s research and development, science and technology strategy.
Patricia Clarke of the Health Research Board and co-chair of the NORF said: “The National Framework is a clear statement of intent by the Irish research community to take practical steps to embed open research in Ireland….
The framework is aligned with emerging European Union policy and includes principles on: open access to publications; enabling FAIR research data; underpinning infrastructures for access to and preservation of research; development of skills and competencies, and incentives and rewards for open research within research evaluation processes.
The framework will open up access to publicly funded research in Ireland and support research excellence across all disciplines. Open Research will be a requirement of the next EU Framework Programme, Horizon Europe, and Irish researchers and institutions need to be ready….”
“Research papers that make their underlying data openly available are significantly more likely to be cited in future work, according to an analysis led by researchers at the Alan Turing Institute in London that has been published as a preprint. The study, which is currently under peer review, examined nearly 532,000 articles in over 350 open access journals published by Public Library of Science (PLoS) and BioMed Central (BMC) between 1997 and 2018, and found those that linked directly to source data sets received 25% more citations on average….”
- Implementation of FAIR (findable, accessible, interoperable, and reusable) offers significant return on investment (ROI) but requires major changes in research culture and incentives, as well as substantial funding; implementation is also hindered by the need to coordinate across the European Union’s member states.
- FAIR is constituted by data objects and a wider technical and data ecosystem.
- Publishers’ role is broad but prescribed in this report – although there may be business opportunities.
- While the continued validity of non-open data is acknowledged, the report recognizes that ROI is maximized where data are both FAIR and open….”