CAUT signs the San Francisco Declaration on Research Assessment | CAUT

“The Canadian Association of University Teachers (CAUT) has signed the San Francisco Declaration on Research Assessment (DORA), an international initiative to address the overreliance on journal-based metrics in hiring, promotion, and funding decisions and to promote and support equity in the academy….”

Introduction to DORA: a short presentation at the Global Research Council’s virtual Responsible Research Assessment Conference | DORA

“DORA chair, Prof. Stephen Curry made a short introduction to DORA for the Global Research Council conference on Responsible Research Assessment, which was held online over the week of 23-27 November 2020. He briefly explains the origins of DORA, the meaning of the declaration, and how DORA developed into an active initiative campaigning for the world-wide reform of research assessment. In advance of the conference, Curry, and Program Director, Dr. Anna Hatch, contributed to a working paper outlining the state of play regarding responsible research assessment, exploring what it means and describing existing initiatives in the space….”

Rethinking Research Assessment for the Greater Good | DORA

“The ScholCommLab in Canada conducted a multi-year project examining more than 850 review, promotion, and tenure (RPT) guidelines in the United States and Canada to better understand academic career advancement. The lab examined how the public dimensions of faculty work, use of the Journal Impact Factor, and non-traditional scholarly outputs were recognized and rewarded in review, promotion, and tenure. Key findings have been represented in a series of infographics for the scholarly community….”

The transformative power of values-enacted scholarship | Humanities and Social Sciences Communications

Abstract:  The current mechanisms by which scholars and their work are evaluated across higher education are unsustainable and, we argue, increasingly corrosive. Relying on a limited set of proxy measures, current systems of evaluation fail to recognize and reward the many dependencies upon which a healthy scholarly ecosystem relies. Drawing on the work of the HuMetricsHSS Initiative, this essay argues that by aligning values with practices, recognizing the vital processes that enrich the work produced, and grounding our indicators of quality in the degree to which we in the academy live up to the values for which we advocate, a values-enacted approach to research production and evaluation has the capacity to reshape the culture of higher education.

ACRL Framework for Impactful Scholarship and Metrics

“ACRL recommended “as standard practice that academic librarians publish in open access venues.” …. In June 2019, ACRL outlined priorities and plans to reshape the current system of scholarly communications to increase equity and inclusivity. While by no means an exhaustive list of the values that institutions should discuss and balance, both of these priorities place value on a scholarly infrastructure that is new, emerging, different, and may not completely align with current evaluative practices. We urge institutions to discuss their core institutional values and priorities, and how support for open access, equity, and inclusion, and impact will be represented by the codified institutional guidelines, expectations, and rank/tenure/promotion/evaluation processes….”

The document itself is undated, but the announcement is dated December 11, 2020.

Open Access Legislation and Regulation in the United States: Implications for Higher Education | Journal of Copyright in Education & Librarianship

Accessing quality research when not part of an academic institution can be challenging. Dating back to the 1980s, open access (OA) was a response to journal publishers who restricted access to publications by requiring a subscription, thereby limiting access to knowledge. Although the OA movement seeks to remove costly barriers to accessing research, especially when it is funded by state and federal governments, it remains the subject of ongoing debate. After providing a brief overview of OA, this article summarizes OA statutory and regulatory developments at the federal and state levels regarding free and open access to research. It compares similarities and differences among enacted and proposed legislation and describes the advantages and disadvantages of these laws. It analyzes the effects of these laws in higher education, especially on university faculty regarding tenure and promotion decisions as well as intellectual property rights, in order to provide recommendations and best practices regarding the future of legislation and regulation in the United States.

Publishing, P&T, and Equity, an Open Access Week Miniseries, Part 3: How Librarians Became Experts on Publishing and Equity

Happy Open Access Week! This is the final installment in our 3-part mini-series of blog posts on Publishing, P&T, and Equity. The overarching issue: how to reform our research evaluation processes to eliminate bias and promote structural equity. On Monday I argued for ending P&T standards that reward journal ‘prestige.’ On Wednesday I wrote about why institutions who want to build structural equity should reward open publishing practices in their research evaluation processes. Today I will conclude with a little meta-piece on the Library’s place in all this.

Publishing, P&T, and Equity, an Open Access Week Miniseries, Part 1: Stop Rewarding Journal “Prestige”

“-Part 1 (this post!) will discuss why updating P&T standards to eschew journal level metrics and journal prestige is an important strategy for advancing equity and inclusion, as well as open access.

-Part 2 (Wednesday-ish) will suggest that rewarding open practice and open publishing in P&T standards is an important step toward affirmatively promoting equity and inclusion in the academy (and in the communities we serve).

-Part 3 (Friday-ish) is a kind of postscript that explains a bit about why libraries are especially interested in these issues and how we see the intersection of Open Access, Equity/Inclusion, and Promotion and Tenure with perhaps a unique clarity relative to other parts of the scholarly ecosystem….”

Journal- or article-based citation measure? A study… | F1000Research

Abstract:  In academia, decisions on promotions are influenced by the citation impact of the works published by the candidates. The Medical Faculty of the University of Bern used a measure based on the journal impact factor (JIF) for this purpose: the JIF of the papers submitted for promotion should rank in the upper third of journals in the relevant discipline (JIF rank >0.66). The San Francisco Declaration on Research Assessment (DORA) aims to eliminate the use of journal-based metrics in academic promotion. We examined whether the JIF rank could be replaced with the relative citation ratio (RCR), an article-level measure of citation impact developed by the National Institutes of Health (NIH). An RCR percentile >0.66 corresponds to the upper third of citation impact of articles from NIH-sponsored research. We examined 1525 publications submitted by 64 candidates for academic promotion at University of Bern. There was only a moderate correlation between the JIF rank and RCR percentile (Pearson correlation coefficient 0.34, 95% CI 0.29-0.38). Among the 1,199 articles (78.6%) published in journals ranking >0.66 for the JIF, less than half (509, 42.5%) were in the upper third of the RCR percentile. Conversely, among the 326 articles published in journals ranking <0.66 regarding the JIF, 72 (22.1%) ranked in the upper third of the RCR percentile. Our study demonstrates that the rank of the JIF is a bad proxy measure for the actual citation impact of individual articles. The Medical Faculty of University of Bern has signed DORA and replaced the JIF rank with the RCR percentile to assess the citation impact of papers submitted for academic promotion.
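
As a rough sanity check on the concordance figures quoted above, here is a minimal Python sketch (not the study's own analysis code) that recomputes the reported percentages from the counts stated in the abstract; the counts are assumed exact and the variable names are purely illustrative.

# A minimal sketch, assuming the counts reported in the abstract are exact.
# Recomputes the shares of articles above the JIF-rank and RCR-percentile
# thresholds (>0.66 = upper third). Variable names are illustrative.

total_articles = 1525           # publications submitted by the 64 candidates
high_jif = 1199                 # in journals with JIF rank > 0.66
low_jif = 326                   # in journals with JIF rank < 0.66
high_rcr_given_high_jif = 509   # of the high-JIF articles, RCR percentile > 0.66
high_rcr_given_low_jif = 72     # of the low-JIF articles, RCR percentile > 0.66

print(f"High-JIF share of all articles:         {high_jif / total_articles:.1%}")           # ~78.6%
print(f"High-RCR share among high-JIF articles: {high_rcr_given_high_jif / high_jif:.1%}")  # ~42.5%
print(f"High-RCR share among low-JIF articles:  {high_rcr_given_low_jif / low_jif:.1%}")    # ~22.1%

The abstract's argument is visible directly in these numbers: fewer than half of the articles in high-JIF journals fall in the upper third of article-level citation impact, so the journal's rank is a weak guide to how an individual paper actually performed.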