The Impact of Open Access Status on Journal Indexes of Radiology Journals (American Journal of Roentgenology, ahead of print)

ABSTRACT:

OBJECTIVE. The impact of open access (OA) journals is still understudied in the field of radiology. In this study, we compared the measures of impact (e.g., CiteScore, citation count, SCImago Journal Rank) between OA and subscription radiology journals.

MATERIALS AND METHODS. We collected data on journals included in the Scopus Source List on November 1, 2018, and filtered the list for radiology journals for the years 2011 to 2017. Journals covered by Scopus (Elsevier) are flagged as OA if they are listed in the Directory of Open Access Journals, the Directory of Open Access Scholarly Resources, or both. We then compared citation metrics between OA and subscription radiology journals.

RESULTS. The 2017 Scopus report included 265 radiology journals. The percentage of OA journals increased from 14.7% in 2011 to 21.9% in 2017 (a 49% relative increase). Median scholarly output and citation count were both significantly lower for OA radiology journals than for subscription journals (p < 0.001 and p = 0.016, respectively). The proportion of documents that received at least one citation was higher in OA journals (50.2%) than in subscription journals (44.4%), but the difference was not statistically significant.

CONCLUSION. This study found that the trend toward OA publishing in the fields of radiology and nuclear medicine has slowed in recent years, although the percent cited (i.e., the proportion of documents that receive at least one citation) is higher for OA journals. We believe the radiology field should be more supportive of OA publishing.
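A side note on the arithmetic in the RESULTS section above: the 49% figure is the relative change in the OA share between 2011 and 2017, not a change in percentage points. A minimal check in Python, using only the rounded percentages quoted in the abstract:

```python
# Relative change in the share of OA radiology journals, using the
# rounded percentages quoted in the abstract (14.7% in 2011, 21.9% in 2017).
share_2011 = 14.7
share_2017 = 21.9

relative_change = (share_2017 - share_2011) / share_2011
print(f"{relative_change:.0%}")  # -> 49%
```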

Credit data generators for data reuse

“Much effort has gone towards crafting mandates and standards for researchers to share their data. Considerably less time has been spent measuring just how valuable data sharing is, or recognizing the scientific contributions of the people responsible for those data sets. The impact of research continues to be measured by primary publications, rather than by subsequent uses of the data….

To incentivize the sharing of useful data, the scientific enterprise needs a well-defined system that links individuals with reuse of data sets they generate….

A system in which researchers are regularly recognized for generating data that become useful to other researchers could transform how academic institutions evaluate faculty members’ contributions to science….”

Over-optimization of academic publishing metrics: observing Goodhart’s Law in action (GigaScience)

Abstract:  Background

The academic publishing world is changing significantly, with ever-growing numbers of publications each year and shifting publishing patterns. However, the metrics used to measure academic success, such as the number of publications, citation number, and impact factor, have not changed for decades. Moreover, recent studies indicate that these metrics have become targets and follow Goodhart’s Law, according to which, “when a measure becomes a target, it ceases to be a good measure.”

Results

In this study, we analyzed >120 million papers to examine how the academic publishing world has evolved over the last century, with a deeper look into the specific field of biology. Our study shows that the validity of citation-based measures is being compromised and their usefulness is lessening. In particular, the number of publications has ceased to be a good metric as a result of longer author lists, shorter papers, and surging publication numbers. Citation-based metrics, such as citation number and h-index, are likewise affected by the flood of papers, self-citations, and lengthy reference lists. Measures such as a journal’s impact factor have also ceased to be good metrics due to the soaring numbers of papers that are published in top journals, particularly from the same pool of authors. Moreover, by analyzing properties of >2,600 research fields, we observed that citation-based metrics are not beneficial for comparing researchers in different fields, or even in the same department.

Conclusions

Academic publishing has changed considerably; now we need to reconsider how we measure success.
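Part of why the h-index criticized in the Results above spread so widely is that it is trivial to compute: a researcher has index h if h of their papers each have at least h citations. A minimal sketch (the citation counts are made up for illustration):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the bar
        else:
            break
    return h

# Illustrative only: four of these five papers have >= 4 citations,
# but there are not five papers with >= 5, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

The abstract's complaint follows directly from the code: self-citations and padded reference lists raise every input to this function without any corresponding change in research quality.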

The European University Association and Science Europe Join Efforts to Improve Scholarly Research Assessment Methodologies

“Evaluating research and assessing researchers is fundamental to the research enterprise and core to the activities of research funders and research performing organisations, as well as universities. The European University Association (EUA) and Science Europe are committed to building a strong dialogue between their members, who share the responsibility of developing and implementing more accurate, open, transparent and responsible approaches that better reflect the evolution of research activity in the digital era.

Today, the outcomes of scholarly research are often measured through methods based on quantitative, albeit approximate, indicators such as the journal impact factor. There is a need to move away from reductionist ways of assessing research, as well as to establish systems that better assess research potential. Universities, research funders and research performing organisations are well-placed to explore new and improved research assessment approaches, while also being indispensable in turning these innovations into systemic reforms….”

Rethinking impact factors: better ways to judge a journal

“Global efforts are afoot to create a constructive role for journal metrics in scholarly publishing and to displace the dominance of impact factors in the assessment of research. To this end, a group of bibliometric and evaluation specialists, scientists, publishers, scientific societies and research-analytics providers are working to hammer out a broader suite of journal indicators, and other ways to judge a journal’s qualities. It is a challenging task: our interests vary and often conflict, and change requires a concerted effort across publishing, academia, funding agencies, policymakers and providers of bibliometric data.

Here we call for the essential elements of this change: expansion of indicators to cover all functions of scholarly journals, a set of principles to govern their use and the creation of a governing body to maintain these standards and their relevance….”

How to avoid borrowed plumes in academia

Abstract:  Publications in top journals today have a powerful influence on academic careers although there is much criticism of using journal rankings to evaluate individual articles. We ask why this practice of performance evaluation is still so influential. We suggest this is the case because a majority of authors benefit from the present system due to the extreme skewness of citation distributions. “Performance paradox” effects aggravate the problem. Three extant suggestions for reforming performance management are critically discussed. We advance a new proposal based on the insight that fundamental uncertainty is symptomatic for scholarly work. It suggests focal randomization using a rationally founded and well-orchestrated procedure.

Understanding the Impact of OER: Achievements and Challenges – UNESCO IITE

The publication “Understanding the Impact of OER: Achievements and Challenges” is the result of a partnership between the UNESCO Institute for Information Technologies in Education (UNESCO IITE) and OER Africa, an initiative established by Saide.

It critically reviews the growth of open educational resources (OER) and their potential impact on education systems around the world, and points to some significant achievements as well as key challenges that hinder the growth and potential of OER and need to be addressed.

The publication summarizes the conclusions of a series of country case studies conducted by experts from Australia, Brazil, Canada, Chile, China, Germany, Mexico, Mongolia, New Zealand, Nigeria, Slovenia, South Africa, Tanzania, Tunisia, and the United Kingdom. It seeks to shed light on such important issues as the economic and pedagogical value of investing in OER; the role of OER in fostering diversity and inclusion and in purposively pursuing quality improvement and innovation; and, finally, the extent to which these important issues are being researched.

The publication is addressed to decision-makers, educators, and innovators, and aims to stimulate debate about the impact of OER and to encourage governments to engage with OER in ways that drive defined pedagogical improvements while encouraging equity and diversity in global knowledge networks….”

Impact Assessment of Non-Indexed Open Access Journals: A Case Study

Abstract:  This case study assesses the impact of a small, open-access social sciences journal not included in citation tracking indexes by exploring measures of the journal’s influence beyond the established “impact factor” formula. An analysis of Google Scholar data revealed the journal’s global reach and value to researchers. This study enabled the journal’s editors to measure the success of their publication according to its professed scope and mission, and to quantify its impact for prospective contributors. The impact assessment strategies outlined here can be leveraged effectively by academic librarians to provide high-value consultancy for scholar-editors of open access research journals.

Can Twitter, Facebook, and Other Social Media Drive Downloads, Citations? – The Scholarly Kitchen

Even before the development of the Internet and social media tools, the association between media promotion and article performance was well documented. What was not fully understood, however, was the underlying cause of this association. Editors and journalists tend to promote what they view as the most important and novel papers. As a result, it is difficult to disambiguate selection effects from dissemination and amplification effects, especially in uncontrolled observational studies. Likely, multiple effects operate in concert. If we want to isolate these effects, we need to rely on a more rigorous methodology: the randomized controlled trial….

While there are many studies exploring the relationships among indicators, most are methodologically weak and may suffer from confounding between causes and effects. The more rigorous trials, summarized above, report little, if any, effect of social media interventions on readership. Nevertheless, although social media campaigns may have limited effect within the research and clinical community, they may provide other ancillary benefits to a journal, such as outreach to healthcare professionals, direct communication with the general public, and increased brand recognition….”

Access to Top-Cited Emergency Care Articles (Published Between 2012 and 2016) Without Subscription

Abstract:  Introduction: Unrestricted access to journal publications speeds research progress, productivity, and knowledge translation, which in turn develops and promotes the efficient dissemination of content. We describe access to the 500 most-cited emergency medicine (EM) articles (published between 2012 and 2016) in terms of publisher-based access (open access or subscription), alternate access routes (self-archived or author provided), and relative cost of access.

Methods: We used the Scopus database to identify the 500 most-cited EM articles published between 2012 and 2016. Access status was collected from the journal publisher. For studies not available via open access, we searched Google, Google Scholar, ResearchGate, Academia.edu, and the Unpaywall and Open Access Button browser plugins to locate self-archived copies. We contacted the corresponding authors of the remaining inaccessible studies for a copy of each of their articles. We collected article processing and access costs from the journal publishers, and then calculated relative cost differences using the World Bank purchasing power parity index for the United States (U.S.), Germany, Turkey, China, Brazil, South Africa, and Australia. This allows costs to be understood relative to the economic context of the countries from which they originated.

Results: We identified 500 articles for inclusion in the study. Of these, 167 (33%) were published in an open access format. Of the remaining 333 (67%), 204 (61% of these) were available elsewhere on the internet, 18 (4% of the 500) were provided by the authors, and 111 (22% of the 500) were accessible by subscription only. The mean article processing and access charges were $2,518.62 and $44.78, respectively. These costs were 2.24, 1.75, 2.28, and 1.56 times more expensive for South African, Chinese, Turkish, and Brazilian authors, respectively, than for U.S. authors (p < 0.001 for all).

Conclusion: Despite the advantages of open access publication for knowledge translation, social responsibility, and increased citation, one in five of the 500 most-cited EM articles was accessible only via subscription. Access for scientists from upper-middle-income countries was significantly hampered by cost. It is important to acknowledge the value this has for authors from low- and middle-income countries. Authors should also consider the citation advantage afforded by open access publishing when deciding where to publish.
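The relative-cost comparison in the Methods above rests on purchasing power parity: a fixed article processing charge in US dollars weighs more heavily where local purchasing power is lower. Here is a sketch of the general idea (not necessarily the study's exact formula; the function name and the sample rates below are illustrative, not the study's data):

```python
def relative_cost_factor(fx_rate, ppp_factor):
    """How many times more expensive a USD-priced charge feels locally.

    fx_rate:    local currency units per US dollar (market exchange rate)
    ppp_factor: local currency units per international dollar (PPP)

    A USD charge converts to fx_rate * charge in local currency; dividing
    by the PPP factor re-expresses it in international dollars, i.e. in
    US-equivalent purchasing power. For a US author both rates are 1,
    so the factor is 1.0 by construction.
    """
    return fx_rate / ppp_factor

# Illustrative numbers only: a country whose market exchange rate is
# double its PPP conversion factor bears, in purchasing-power terms,
# twice the burden a US author does for the same USD-denominated charge.
print(relative_cost_factor(fx_rate=18.0, ppp_factor=9.0))  # -> 2.0
```

This is why identical dollar charges can be reported as 2.24 times "more expensive" for one country's authors than for another's even though the invoice amount is the same.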