Reframing research access | Emerald Insight

Abstract:

Purpose

The paper describes how Charles Darwin University (CDU) used a three-pronged approach to better serve its researchers: it developed a single interface to improve the accessibility and discoverability of its research outputs, consolidated the corresponding policies and procedures, and implemented training programs to support the new portal. This in turn made the University's suite of research outputs more openly accessible and more easily discoverable. The intention was to make CDU research compliant with the FAIR (Findable, Accessible, Interoperable and Reusable) policy statement, which affirms the need to make Australia's research more visible, thereby enabling better access, better collaboration locally and internationally, and greater researcher accountability to the community.

Design/methodology/approach

This paper uses case study methodology and a qualitative approach.

Findings

CDU Library collaborated with the University's Research Office on a series of strategies for reframing access to its research. The partners migrated their research collections into a single, new, integrated interface; developed new policies and consolidated existing ones; and rolled out a training and educational program for the research community. The program was designed to introduce the Pure repository to new researchers and to train all staff to self-archive and curate their own research outputs. This new streamlined approach ensured more comprehensive and timely availability and accessibility of the University's research outputs.

Originality/value

A single source of truth was established through the migration of CDU's research collections, ensuring that data quality was maintained. At the start of this project, few institutions in Australia were using the Pure system, and even fewer were using it as their sole repository for displaying research outputs.

School of Data Science Open Access Guidelines and Recommendations — School of Data Science

“On February 3, 2021 the School of Data Science’s Academic Affairs Committee (AAC) officially passed the Open Access Guidelines and Recommendations. The University of Virginia School of Data Science is guided by goals to further discovery through open, collaborative, and responsible data science research. These guidelines and recommendations are adhered to by all faculty members in their research.

The Open Access Guidelines and Recommendations are part of the School of Data Science’s effort to drive innovation across boundaries in a culture of transparency and open access to knowledge….”

UVA School of Data Science Sets Example for Campus on Open Access – SPARC

“The University of Virginia School of Data Science (SDS) has adopted Open Access Guidelines and Recommendations for its faculty members to follow in sharing their research. The move was recommended by the school’s Academic Affairs Committee and approved by the dean of SDS on Feb. 3.

The guidelines call on faculty to make all scholarly articles, papers, books, data, and software openly available, free of charge in formats that allow reuse. It acknowledges the value of transparency in driving innovation so scholars can build upon each other’s research and accelerate science. The hope is that others on campus will follow the lead of SDS and the guidelines will be embraced more broadly….”

Rewarding contributions to research culture is part of building a better university | Impact of Social Sciences

“We introduced the awards to surface, celebrate and share good practice. We announced the awardees at our annual research celebration event that is hosted by the Vice-Chancellor. This event normally recognises grant awards, scholarships, and external forms of recognition such as prizes or prestigious academy membership. By including the awards in this celebration, we reinforced a broader definition of success in academia. The four winners were awarded a monetary reward to use as they wished, for example to celebrate team contributions. The awards were one initiative in a broader programme of work to advance our research culture, including research integrity, open research, support for careers, and fair approaches to evaluating research quality. The awards also sit alongside the changes made in 2019 to our promotion criteria requiring applicants to demonstrate collegiality for professorial promotion….”

Who Does What? – Research Data Management at ETH Zurich

Abstract: We present the approach to Research Data Management (RDM) support for researchers taken at ETH Zurich. Overall requirements are governed by institutional guidelines for Research Integrity, funders’ regulations, and legal obligations. The ETH approach is based on the distinction of three phases along the research data life-cycle: 1. Data Management Planning; 2. Active RDM; 3. Data Publication and Preservation. Two ETH units, namely the Scientific IT Services and the ETH Library, provide support for different aspects of these phases, building on their respective competencies. They jointly offer training, consulting, information, and materials for the first phase.

The second phase deals with data which is in current use in active research projects. Scientific IT Services provide their own platform, openBIS, for keeping track of raw, processed and analysed data, in addition to organising samples, materials, and scientific procedures.

ETH Library operates solutions for the third phase within the infrastructure of ETH Zurich’s central IT Services. The Research Collection is the institutional repository for research output including Research Data, Open Access publications, and ETH Zurich’s bibliography.

Open Access & the Library of the 21st Century: A Discussion of the Open Access Initiatives and Practices at Carnegie Mellon University – LibCal – Carnegie Mellon University

“As publishers and funders have used various methods or requirements to stimulate the adoption of open access, academic libraries have sought to alter their role in further supporting their authors and researchers. Evolving from supporting the mechanisms to OA with institutional repositories or Article Processing Charge (APC) Funds, universities like Carnegie Mellon have taken the responsibility to take direct action upon themselves through replacing the ‘Big Deal’ agreement with models focused on “read and publish” that see scholarly publishing as a single service covering both readership and publishing. As one of the world’s leading universities, the University Libraries at Carnegie Mellon shifted in 2020 towards licensing agreements that would allow CMU authors publishing with Elsevier, ACM, and PLoS, to focus on publishing their works open access by default.

Our university bears the name of business titan and philanthropist Andrew Carnegie, whose personal donations defined the library of the 20th century. Carnegie was driven by his desire to make knowledge and education accessible to the working class, so they would have the tools to better their own condition.

This Open Access Week 2020 event will present a brief history of open access at CMU, followed by a lively discussion with faculty and graduate student authors supported by the recent agreements, the CMU APC Fund, and other open access initiatives and services at CMU. Panelists will discuss their individual and disciplinary insights and perspectives. The event will conclude with Dean Keith Webster presenting a brief perspective on the future of advocacy and leadership in open access at CMU, as we seek to build upon our founder’s legacy and define the library of the 21st century.”

Journal- or article-based citation measure? A study… | F1000Research

Abstract: In academia, decisions on promotions are influenced by the citation impact of the works published by the candidates. The Medical Faculty of the University of Bern used a measure based on the journal impact factor (JIF) for this purpose: the JIF of the papers submitted for promotion should rank in the upper third of journals in the relevant discipline (JIF rank >0.66). The San Francisco Declaration on Research Assessment (DORA) aims to eliminate the use of journal-based metrics in academic promotion. We examined whether the JIF rank could be replaced with the relative citation ratio (RCR), an article-level measure of citation impact developed by the National Institutes of Health (NIH). An RCR percentile >0.66 corresponds to the upper third of citation impact of articles from NIH-sponsored research. We examined 1,525 publications submitted by 64 candidates for academic promotion at the University of Bern. There was only a moderate correlation between the JIF rank and RCR percentile (Pearson correlation coefficient 0.34, 95% CI 0.29-0.38). Among the 1,199 articles (78.6%) published in journals ranking >0.66 for the JIF, less than half (509, 42.5%) were in the upper third of the RCR percentile. Conversely, among the 326 articles published in journals ranking <0.66 for the JIF, 72 (22.1%) ranked in the upper third of the RCR percentile. Our study demonstrates that the rank of the JIF is a bad proxy measure for the actual citation impact of individual articles. The Medical Faculty of the University of Bern has signed DORA and replaced the JIF rank with the RCR percentile to assess the citation impact of papers submitted for academic promotion.

Knowledge for all: A decade of open access at uOttawa | Gazette | University of Ottawa

“This month marks the 10th anniversary of uOttawa’s OA program—the first of its kind in Canada. By helping to make research freely available online, the University has positioned itself as a global leader in the transformation of scholarly communication….”

Promoting openness – Research Professional News

“Of the potential solutions, open research practices are among the most promising. The argument is that transparency acts as an implicit quality control process. If others are able to scrutinise our work—not just the final published output, but the underlying data, code, and so on—researchers will be incentivised to ensure these are high quality.

So, if we think that research could benefit from improved quality control, and if we think that open research might have a role to play in this, why aren’t we all doing it? In a word: incentives….”