How a working group began the process of DORA implementation at Imperial College London – DORA

“Even so, it is much easier to sign DORA than to deliver on the commitment that signing entails. And while I would always recommend that universities sign as soon as they are ready to commit, because doing so sends such a positive message to their researchers, they should not put pen to paper without a clear idea of how signing will impact their approach to research assessment, or how they are going to develop any changes with their staff….

Out went phrases such as “contributions to research papers that appear in high-impact journals” to be replaced by “contributions to high quality and impactful research.” The change is subtle but significant – the revised guidance makes it plain that ‘impactful research’ in this context is not a cypher for the JIF; rather it is work “that makes a significant contribution to the field and/or has impact beyond the immediate field of research.” …”

Driving Institutional Change for Research Assessment Reform – DORA

“DORA and the Howard Hughes Medical Institute (HHMI) are convening a diverse group of stakeholders to consider how to improve research assessment policies and practices.
By exploring different approaches to cultural and systems change, we will discuss practical ways to reduce the reliance on proxy measures of quality and impact in hiring, promotion, and funding decisions. To focus on practical steps forward that will improve research assessment practices, we are not going to discuss the well-documented deficiencies of the Journal Impact Factor (JIF) as a measure of quality….”

A cross-sectional description of open access publication costs, policies and impact in emergency medicine and critical care journals. – PubMed – NCBI

Abstract

INTRODUCTION:

Finding a journal's open access information alongside its impact metrics currently requires searching multiple databases. We describe a single, searchable database of all emergency medicine and critical care journals that includes their open access policies, publication costs, and impact metrics.

METHODS:

A list of emergency medicine and critical care journals (including citation metrics) was created using Scopus (CiteScore) and the Web of Science (Impact Factor). Costs of gold/hybrid open access and article processing charges (open access fees) were collected from journal websites. Self-archiving policies were collected from the Sherpa/RoMEO database. The relative cost of access in different regions was calculated using the World Bank Purchasing Power Parity index for authors from the United States, Germany, Turkey, China, Brazil, South Africa and Australia.
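The PPP adjustment described above can be sketched in a few lines. This is a minimal illustration, not the study's actual code: the function name `relative_apc_burden` and all numeric inputs are assumptions chosen for demonstration, not figures taken from the paper or from World Bank tables.

```python
# Minimal sketch (illustrative, not the study's code): expressing a fixed
# USD-denominated article processing charge relative to a US author's cost
# using a World Bank PPP conversion factor and a market exchange rate.

def relative_apc_burden(ppp_factor: float, exchange_rate: float) -> float:
    """Relative burden of a USD charge versus a US author.

    ppp_factor    -- local currency units per international dollar (PPP)
    exchange_rate -- local currency units per US dollar (market rate)
    """
    price_level = ppp_factor / exchange_rate  # < 1.0 means local prices are lower
    return 1.0 / price_level                  # a fixed USD fee "feels" this much larger

# Illustrative (assumed) figures only:
print(round(relative_apc_burden(6.9, 15.4), 2))   # hypothetical lower-income case -> 2.23
print(round(relative_apc_burden(0.78, 0.84), 2))  # hypothetical near-parity case  -> 1.08
```

A ratio above 1 means a flat USD fee consumes more local purchasing power than it does for a US author, which is the sense in which the abstract reports charges as "times more expensive" for some regions.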

RESULTS:

We identified 78 emergency medicine and 82 critical care journals. The median CiteScore for emergency medicine was 0.73 (interquartile range, IQR 0.32-1.27); the median Impact Factor was 1.68 (IQR 1.00-2.39). The median CiteScore for critical care was 0.95 (IQR 0.25-2.06); the median Impact Factor was 2.18 (IQR 1.73-3.50). The mean article processing charge was $2243.04 (SD = $1136.16) for emergency medicine and $2201.64 (SD = $1174.38) for critical care. Article processing charges were 2.24, 1.75, 2.28 and 1.56 times more expensive for South African, Chinese, Turkish and Brazilian authors respectively than for United States authors, but roughly at parity for German authors (1.02) and lower for Australian authors (0.81). The database can be accessed here: http://www.emct.info/publication-search.html.

CONCLUSIONS:

We present a single database that captures emergency medicine and critical care journal impact rankings alongside their respective open access costs and green open access policies.

The allure of the journal impact factor holds firm, despite its flaws | Nature Index

“Many researchers still see the journal impact factor (JIF) as a key metric for promotions and tenure, despite concerns that it’s a flawed measure of a researcher’s value….

A recent survey of 338 researchers from 55 universities in the United States and Canada showed that more than one-third (36%) consider JIFs to be “very valued” for promotions and tenure, and 27% said they were “very important” when deciding where to submit articles….

[N]on-tenured and younger researchers, for whom RPT matters most, put more weight on JIFs when deciding where to publish….

According to Björn Brembs, a neuroscientist from the University of Regensburg, in Germany, who reviewed the study for eLife, the continuing deference to the JIF shows how scientists can be highly critical in their own subject domain, yet “gullible and evidence-resistant” when evaluating productivity. “This work shows just how much science is in dire need of a healthy dose of its own medicine, and yet refuses to take the pill,” he says….”

Identification of Influencing Factors Regarding the Decision for or Against an Open Access Publication of Scientists of Medical Informatics: Description and First Results of Group Discussions and Interviews

Abstract:  Open Access offers researchers an additional route to publication alongside traditional subscription-based journals. By providing wider dissemination, and therefore greater visibility and better accessibility, Open Access helps meet the changing needs of authors and readers in today's information and communication society. Although this publication model offers many advantages for both readers and authors, there are also obstacles. To identify the incentives that can lead medical informatics researchers to publish Open Access, we conducted a study consisting of group discussions, interviews, and surveys. This tripartite evaluation begins with group discussions and interviews, whose first results show that, among other factors, higher visibility, indexing, the Impact Factor, and better accessibility favour the decision for an Open Access publication.

Disrupting medical publishing and the future of medical journals: a personal view – Gee – 2019 – Medical Journal of Australia – Wiley Online Library

“We strongly support the principle that research must be freely accessible. At the MJA [Medical Journal of Australia], we practise what we believe and make all research freely accessible from publication, a unique feature of a subscription journal. We further support the idea that subscription journals should ensure all peer-reviewed articles are freely accessible after an embargo period and suggest this period be set at no more than 24 months after final publication. We suggest that Plan S is off track in its opposition to hybrid journals. There are many metrics of quality and impact, including media (and social media) attention, but the primary currency by which research quality is judged remains citations by peers; major breakthroughs attract very high citations as the work is replicated then adapted and extended by others around the world, which is in reality how science advances and research is translated. Several of the journals with the greatest impact and highest citations will be excluded under Plan S if they maintain their current subscription models.

When it all boils down to basics, researchers want to have their research published quickly after peer and editorial review, with near perfect certainty in the most prestigious, most impactful place possible. In 2019, authors do not necessarily need a traditional subscription medical journal to achieve this goal, and if this spells the end of the subscription model, time will tell as the market decides. In the meantime and whatever our personal views, researchers will continue to seek to have their work widely read and cited, which is why the top medical journals (many of which remain subscription journals) will continue to attract the best research and will have a wide choice of what to accept….”

Meta-Research: Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations | eLife

Abstract:  We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF, or closely related terms. Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.

Perceptions of Educational Leadership Faculty Regarding Open Access Publishing | Richardson | International Journal of Education Policy and Leadership

Abstract:  There is a dearth of research on the perceptions of faculty members in educational leadership regarding open access publications. This reality may exist because of a lack of funding for educational leadership research, financial obstacles, tenure demands, or reputation concerns. It may be that there are simply fewer established open access publishers with reputable impact factors to encourage publication by members in the field. The current study seeks to answer the following question: “What are the perceptions of educational leadership faculty members in UCEA about open access publishing?” The results are based on responses from 180 faculty members in the field of educational leadership.

Why we publish where we do: Faculty publishing values and their relationship to review, promotion and tenure expectations | bioRxiv

Abstract:  Using an online survey of academics at 55 randomly selected institutions across the US and Canada, we explore priorities for publishing decisions and their perceived importance within review, promotion, and tenure (RPT). We find that respondents most value journal readership, while they believe their peers most value prestige and related metrics such as impact factor when submitting their work for publication. Respondents indicated that total number of publications, number of publications per year, and journal name recognition were the most valued factors in RPT. Older and tenured respondents (most likely to serve on RPT committees) were less likely to value journal prestige and metrics for publishing, while untenured respondents were more likely to value these factors. These results suggest disconnects between what academics value versus what they think their peers value, and between the importance of journal prestige and metrics for tenured versus untenured faculty in publishing and RPT perceptions.