Raising research quality will require collective action

“Institutions are committing to working together to determine how their cultural practices, such as emphasizing the importance of novelty, discovery and priority, undermine the value of replication, verification and transparency. That is the goal of the UK Reproducibility Network, which I co-founded earlier this year. It started as informal groups of researchers at individual institutions that met with representatives from funders and publishers (including Nature) who were open to discussions about how best to align open-science initiatives — reproducibility sections in grant applications and reporting checklists in article submissions, for example. Now institutions themselves are cooperating to consider larger changes, from training to hiring and promotion practices….

Our ten university members span the United Kingdom from Aberdeen to Surrey, and we expect that list to grow. Each will appoint a senior academic to focus on research quality and improvement. Figuring out which system-level changes are needed and how to make them happen will now be someone’s primary responsibility, not a volunteer activity. What changes might ensue? Earlier this year, the University of Bristol, where I work, made the use of data sharing and other open-research practices an explicit criterion for promotion….

But these cultural changes might falter. Culture eats strategy for breakfast — grand plans founder on the rocks of implicit values, beliefs and ways of working. Top-down initiatives from funders and publishers will fizzle out if they are not implemented by researchers, who review papers and grant proposals. Grass-roots efforts will flourish only if institutions recognize and reward researchers’ efforts.

Funders, publishers and bottom-up networks of researchers have all made strides. Institutions are, in many ways, the final piece of the jigsaw. Universities are already investing in cutting-edge technology and embarking on ambitious infrastructure programmes. Cultural change is just as essential to long-term success.”

How can Editors encourage Open Research Practices?

“‘Open research’ (used interchangeably with ‘open science’) is an all-encompassing term for the set of practices that aim to improve the accessibility, reproducibility, and integrity of research outputs. It’s also complex, spanning issues such as open access, open practices that increase the integrity and reproducibility of research (e.g., Registered Reports, open data and code), open collaboration, and open recognition (e.g., transparent peer review and the CRediT Contributor Roles Taxonomy).

So, what do researchers think about open research? We invited researchers to participate in Wiley’s Open Research Survey to share their views and experiences of open research practices. It’s clear from our findings that researchers welcome open research initiatives: they are motivated to publish open access, willing to share data, and willing to experiment with opening up the peer review process (see overview below for more detail).

Recent studies have shown that freely available articles obtain more citations and are downloaded more often. Institutions are beginning to reward and recognise open research practices, especially in recruitment and for promotion. Funders are also requiring that researchers publish open access and share data (for example, Horizon Europe).

Open research isn’t the future – it’s the here and now, and journal editors have a vital role to play in facilitating open research and open publishing practices alongside researchers, institutions, funders, and publishers. Editors can play their part by supporting open access publishing, adopting Registered Reports, adopting open data policies and data availability statements, recognizing and celebrating open research practices such as displaying open research badges on published articles, and opening up peer review. If you want to implement one or more of these initiatives on your journal, please speak with your Wiley Journal Publishing Manager….”

Go To Hellman: Open Access for Backlist Books, Part II: The All-Stars

“In my post about the value of Open Access for books, I suggested that usage statistics (circulation, downloads, etc.) are a useful proxy for the value that books generate for their readers. The logical conclusion is that the largest amount of value that can be generated from opening the backlist comes from the books that are most used, the “all-stars” of the library, not the discount rack or the discards. If libraries are to provide funding for Open Access backlist books, shouldn’t they focus their resources on the books that create the most value?

The question, of course, is how the library community would ever convince publishers, who have monopolies on these books as a consequence of international copyright laws, to convert these books to Open Access. Although some sort of statutory licensing or fair-use carve-outs could eventually do the trick, I believe that Open Access for a significant number of “backlist All-Stars” can be achieved today by pushing ALL the buttons available to supporters of Open Access. Here’s where the Open Access movement can learn from the game (and business) of baseball….

Open Access should be an All-Star game for backlist books. We need to create community-based award programs that recognize and reward backlist conversions to OA. If the world’s libraries want to spend $50,000 on backlist physics books, for example, isn’t it better to spend it on the Mike Trout of physics books than on a team full of discount-rack replacement-level players? …

If you doubt that “All-Star Open Access” could work, don’t discount the fact that it’s also the right thing to do. Authors of All-Star backlist books want their books to be used, cherished and remembered. Libraries want books that measurably benefit the communities they serve. Foundations and governmental agencies want to make a difference. Even publishers who look only at their bottom lines can structure a rights conversion as a charitable donation to reduce their tax bills.

And did I mention that there could be Gala Award Celebrations? We need more celebrations, don’t you think?”

Honest signaling in academic publishing

Abstract:  Academic journals provide a key quality-control mechanism in science. Yet, information asymmetries and conflicts of interests incentivize scientists to deceive journals about the quality of their research. How can honesty be ensured, despite incentives for deception? Here, we address this question by applying the theory of honest signaling to the publication process. Our models demonstrate that several mechanisms can ensure honest journal submission, including differential benefits, differential costs, and costs to resubmitting rejected papers. Without submission costs, scientists benefit from submitting all papers to high-ranking journals, unless papers can only be submitted a limited number of times. Counterintuitively, our analysis implies that inefficiencies in academic publishing (e.g., arbitrary formatting requirements, long review times) can serve a function by disincentivizing scientists from submitting low-quality work to high-ranking journals. Our models provide simple, powerful tools for understanding how to promote honest paper submission in academic publishing.
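
The mechanism the abstract describes can be made concrete with a toy expected-payoff calculation. The sketch below is not the paper’s actual model; every number in it (acceptance probabilities, publication benefits, submission costs) is invented for illustration. It reproduces the qualitative claim: when submission is free and rejected papers can be resubmitted, every paper goes to the top journal first, while a per-submission cost makes that strategy unprofitable for low-quality work.

```python
# Toy expected-payoff model of journal submission. Illustrates the
# abstract's claim that per-submission costs can deter scientists from
# sending low-quality papers to high-ranking journals first.
# All numbers are invented for illustration.

def submit_mid_only(p_mid: float, b_mid: float, cost: float) -> float:
    """Expected payoff of submitting straight to the mid-tier journal."""
    return p_mid * b_mid - cost

def try_top_first(p_top: float, b_top: float,
                  p_mid: float, b_mid: float, cost: float) -> float:
    """Expected payoff of trying the top journal first, then
    resubmitting to the mid-tier journal if rejected."""
    return p_top * b_top - cost + (1 - p_top) * (p_mid * b_mid - cost)

# Acceptance probabilities by paper quality and journal tier (invented).
p = {"high": {"top": 0.5, "mid": 0.9},
     "low":  {"top": 0.1, "mid": 0.6}}
benefit = {"top": 10.0, "mid": 4.0}  # value of publishing in each tier

for cost in (0.0, 2.0):  # free submission vs. a costly, slow process
    print(f"submission cost = {cost}")
    for q in ("high", "low"):
        ev_top = try_top_first(p[q]["top"], benefit["top"],
                               p[q]["mid"], benefit["mid"], cost)
        ev_mid = submit_mid_only(p[q]["mid"], benefit["mid"], cost)
        choice = "top first" if ev_top > ev_mid else "mid tier directly"
        print(f"  {q}-quality paper: {choice} "
              f"(EV {ev_top:.2f} vs {ev_mid:.2f})")
```

With these illustrative numbers, a cost of zero sends both paper types to the top journal first, while a cost of 2 flips the low-quality strategy to the mid-tier journal: the separating, honest outcome that the paper’s mechanisms (differential costs and benefits) are meant to produce.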

Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics – Lemke – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  The amount of annually published scholarly articles is growing steadily, as is the number of indicators through which the impact of publications is measured. Little is known about how the increasing variety of available metrics affects researchers’ processes of selecting literature to read. We conducted ranking experiments embedded in an online survey with 247 participating researchers, most from the social sciences. Participants completed a series of tasks in which they were asked to rank fictitious publications by their expected relevance, based on their scores on six prototypical metrics. Through applying logistic regression, cluster analysis, and manual coding of survey answers, we obtained detailed data on how prominent metrics for research impact influence our participants in decisions about which scientific articles to read. Survey answers revealed a combination of qualitative and quantitative characteristics that researchers consult when selecting literature, while regression analysis showed that among quantitative metrics, citation counts tend to be of highest concern, followed by Journal Impact Factors. Our results suggest a comparatively favorable view of bibliometrics among many researchers and widespread skepticism toward altmetrics. The findings underline the importance of equipping researchers with solid knowledge about specific metrics’ limitations, as these metrics seem to play significant roles in researchers’ everyday relevance assessments.
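
For readers unfamiliar with the method, the pipeline the abstract names (choices over fictitious publications, then logistic regression to recover how much each metric mattered) can be sketched briefly. The code below uses simulated data, three metrics instead of the study’s six, and invented weights; it shows only the general technique of regressing pairwise relevance choices on metric-score differences, not the study’s actual analysis.

```python
# Sketch of a conjoint-style analysis: fit a logistic regression to
# paired-comparison data to recover the weight each metric carries in
# relevance judgments. Data are simulated; weights are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
metrics = ["citations", "jif", "altmetric_score"]

# Simulate 500 choices between two fictitious articles, A and B.
# Assume the chooser weights citations most, then JIF, then altmetrics
# (the ordering the study reports), plus random noise.
true_w = np.array([1.0, 0.6, 0.2])
diff = rng.normal(size=(500, 3))             # metric-score differences A - B
p_choose_a = 1 / (1 + np.exp(-diff @ true_w))
chose_a = rng.random(500) < p_choose_a

# Regressing the choices on the score differences recovers the weights.
model = LogisticRegression(fit_intercept=False).fit(diff, chose_a)
for name, coef in zip(metrics, model.coef_[0]):
    print(f"{name:16s} estimated weight ≈ {coef:+.2f}")
```

Larger coefficients correspond to metrics that moved the simulated choices more, mirroring the study’s finding that citation counts outweigh Journal Impact Factors, which in turn outweigh altmetrics.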

Rewarding contributions to research culture is part of building a better university | Impact of Social Sciences

“We introduced the awards to surface, celebrate and share good practice. We announced the awardees at our annual research celebration event that is hosted by the Vice-Chancellor. This event normally recognises grant awards, scholarships, and external forms of recognition such as prizes or prestigious academy membership. By including the awards in this celebration, we reinforced a broader definition of success in academia. The four winners were awarded a monetary reward to use as they wished, for example to celebrate team contributions. The awards were one initiative in a broader programme of work to advance our research culture, including research integrity, open research, support for careers, and fair approaches to evaluating research quality. The awards also sit alongside the changes made in 2019 to our promotion criteria requiring applicants to demonstrate collegiality for professorial promotion….”

Universities without walls: A vision for 2030

“Open Science, making research accessible to all, will be the default way of producing knowledge. Universities will support a diverse non-commercial publishing system and will, themselves, be directly involved in such a system, by promoting and supporting non-commercial and smaller publishing initiatives. Data and other outputs resulting from research will be made FAIR (Findable, Accessible, Interoperable, Reusable). Scientists will be adequately rewarded for the processing and publishing of data. Europe’s scholarly information infrastructure will facilitate cross-border, multidisciplinary research with advanced digital services and tools….”

Evaluation of Data Sharing After Implementation of the International Committee of Medical Journal Editors Data Sharing Statement Requirement | Medical Journals and Publishing | JAMA Network Open | JAMA Network

“Question  What are the rates of declared and actual sharing of clinical trial data after the medical journals’ implementation of the International Committee of Medical Journal Editors data sharing statement requirement?

Findings  In this cross-sectional study of 487 clinical trials published in JAMA, Lancet, and New England Journal of Medicine, 334 articles (68.6%) declared data sharing. Only 2 (0.6%) individual-participant data sets were actually deidentified and publicly available on a journal website, and among the 89 articles declaring that individual-participant data would be stored in secure repositories, data from only 17 articles were found in the respective repositories as of April 10, 2020.

Meaning  These findings suggest that there is a wide gap between declared and actual sharing of clinical trial data.”

Web Accessibility in the Institutional Repository: Crafting User-Centered Submission Policies: The Serials Librarian

Abstract:  While institutional repositories have long focused on ensuring the availability of research, recent university initiatives have begun to focus on other aspects of open access, such as digital accessibility. Indiana University’s institutional repository (IR), IUScholarWorks, audited the accessibility of its existing content and created policies to encourage accessible submissions. No established workflows considering accessibility existed when this audit took place, and no additional resources were allocated to facilitate this shift in focus. As a result, the Scholarly Communication team altered the repository submission workflow to encourage authors to make their finished documents accessible with limited intervention. This paper shares an overview of the accessibility audit that took place, describes the changes made to our submission process, and provides tips and resources for universities that aim to integrate accessibility more thoroughly into their IR practices.