imagjn – Bringing Trust to the World

“Science drives innovation. Unfortunately, scientific knowledge is locked behind closed doors. The current system requires academics to publish in high-impact journals. This is inefficient knowledge sharing: it is slow, bureaucratic and requires academics to give away their copyright. Above all, it is very expensive. Each university has to pay €2-7 million per year in public money to access research that it already paid for in the first place. It is a 32 billion market, controlled by five publishers with a higher profit margin than Google. That is money that could have been spent on research.

Imagjn open knowledge, where everybody has access to all scientific papers without artificial barriers such as paywalls. To do that, we have to change the rules of how we judge scientific impact. We should no longer focus on where someone publishes; instead, we should focus on what someone publishes. Therefore, we want to move from a Journal Impact Factor to an Open Impact Factor, controlled and owned by academics. We are developing a platform that simplifies writing, citing, reviewing and publishing scientific papers, making knowledge freely available to anyone….”

‘Undue reliance’ on journal impact factor in academic evaluation | Times Higher Education (THE)

“At least one in three of the research-intensive universities in North America examined by a study leaned on the journal impact factor of the periodicals that academics had published in when making promotion and tenure decisions, but the true proportion may be much higher.

The study, believed to be the first to examine the use of the journal impact factor in academic performance reviews, warns that there is an “undue reliance” on the controversial metric….

Among the documents from 57 research-intensive institutions considered by the study, 23 (40 per cent) referred to journal impact factors, with 19 of these mentions (83 per cent of the subtotal) being supportive. Only three of the mentions expressed caution about use of journal impact factors.

Of the documents that did refer to journal impact factors, 14 associated the metric with research quality, while eight tied it to impact and a further five referred to prestige or reputation.

The overall results, which included large numbers of universities that offer few doctoral degrees, showed that 23 per cent of review, promotion and tenure policies mentioned the journal impact factor, with 87 per cent of these mentions being supportive….”

Impact factors are still widely used in academic evaluations

“Almost half of research-intensive universities consider journal impact factors when deciding whom to promote, a survey of North American institutions has found.

About 40% of institutions with a strong focus on research mention impact factors in documents used in the review, promotion and tenure process, according to the analysis, which examined more than 800 documents across 129 institutions in the United States and Canada.

The data imply that many universities are evaluating the performance of their staff using a metric that has been widely criticized as a crude and misleading proxy for the quality of scientists’ work….

Less than one-quarter of the institutions mentioned impact factor or a closely related term such as “high impact journal” in their documents. But this proportion rose to 40% for the 57 research-intensive universities included in the survey. By contrast, just 18% of universities that focused on master’s degrees mentioned journal impact factors (see ‘High impact’).

In more than 80% of the mentions at research-heavy universities, the language in the documents encouraged the use of the impact factor in academic evaluations. Only 13% of mentions at these institutions came with any cautionary words about the metric. The language also tended to imply that high impact factors were associated with better research: 61% of the mentions portrayed the impact factor as a measure of the quality of research, for example, and 35% stated that it reflected the impact, importance or significance of the work….”

Universities should be working for the greater good | Times Higher Education (THE)

“What might happen if the provost of a highly visible research university that had recently reconfirmed its public-facing mission gathered the entire campus together – deans, department chairs and faculty – to rethink the university’s promotion and tenure standards from top to bottom? What might become possible if that provost were to say that our definitions of “excellence” in research, teaching and service must have that public-facing mission at their heart? What might be possible if that public mission really became Job One?

The provost paused. Then he gave his answer: “Any institution that did that would immediately lose competitiveness within its cohort.” …

The pursuit of prestige is not the problem in and of itself, and excellence is, of course, something to strive for. In fact, friendly competition can push us all to do better. But excellence and prestige and the competitiveness that fuels their pursuit are too often based in marketing – indeed, in the logic of the market – rather than in the actual purposes of higher education. It’s a diversion from the on-the-ground work of producing and sharing knowledge that can result in misplaced investments and misaligned priorities….”

 

The citation advantage for open access science journals with and without article processing charges – Mohammad Reza Ghane, Mohammad Reza Niazmand, Ameneh Sabet Sarvestani, 2019

Abstract:  In this study of access models, we compared citation performance in journals that do and do not levy article processing charges (APCs) as part of their business model. We used a sample of journals from the Directory of Open Access Journals (DOAJ) science class and its 13 subclasses and recorded four citation metrics: JIF, H-index, citations per publication (CPP) and quartile rank. We examined 1881 science journals indexed in DOAJ. Thomson Reuters Journal Citation Reports and Web of Science were used to extract JIF, H-index, CPP and quartile category. Overall, the JIF, H-index and CPP indicated that APC and non-APC open access (OA) journals had equal impact. Quartile category ranking indicated a difference in favour of APC journals. In each science subclass, we found significant differences between APC and non-APC journals in all citation metrics except for quartile rank. Discipline-related variations were observed in non-APC journals. Differences in the rank positions of scores in different groups identified citation advantages for non-APC journals in physiology, zoology, microbiology and geology, followed by botany, astronomy and general biology. Impact ranged from moderate to low in physics, chemistry, human anatomy, mathematics, general science and natural history. The results suggest that authors should consider field- and discipline-related differences in the OA citation advantage, especially when they are considering non-APC OA journals categorised in two or more subjects. This may encourage OA publishing at least in the science class.
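As a purely illustrative aside (not the study’s code or data), the kind of group comparison described in this abstract can be sketched in a few lines of Python. The example below applies a rank-based Mann-Whitney U test from SciPy to invented citations-per-publication values for hypothetical APC and non-APC journals; the paper does not publish its analysis scripts, so the choice of test and every number here are assumptions for illustration only.

```python
# Illustrative sketch of a rank-based comparison of a citation metric between
# APC and non-APC open access journals. All values are invented for illustration
# and are NOT taken from the study.
from scipy.stats import mannwhitneyu

apc_cpp     = [2.1, 3.4, 1.8, 4.0, 2.9, 3.1, 2.5]   # hypothetical citations per publication, APC journals
non_apc_cpp = [2.0, 3.6, 1.7, 3.9, 3.0, 2.8, 2.6]   # hypothetical citations per publication, non-APC journals

# A rank-based test avoids assuming that skewed citation data are normally distributed.
stat, p = mannwhitneyu(apc_cpp, non_apc_cpp, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```

In a sketch like this, a large p-value would be consistent with the abstract’s overall finding that APC and non-APC journals have roughly equal citation impact, and the same comparison could be repeated on subsets of journals to probe the discipline-level differences the authors report.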

Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations [PeerJ Preprints]

Abstract:  The Journal Impact Factor (JIF) was originally designed to aid libraries in deciding which journals to index and purchase for their collections. Over the past few decades, however, it has become a relied-upon metric used to evaluate research articles based on journal rank. Surveyed faculty often report feeling pressure to publish in journals with high JIFs and mention reliance on the JIF as one problem with current academic evaluation systems. While faculty reports are useful, information is lacking on how often and in what ways the JIF is currently used for review, promotion, and tenure (RPT). We therefore collected and analyzed RPT documents from a representative sample of 129 universities from the United States and Canada and 381 of their academic units. We found that 40% of doctoral, research-intensive (R-type) institutions and 18% of master’s, or comprehensive (M-type) institutions explicitly mentioned the JIF, or closely related terms, in their RPT documents. Undergraduate, or baccalaureate (B-type) institutions did not mention it at all. A detailed reading of these documents suggests that institutions may also be using a variety of terms to indirectly refer to the JIF. Our qualitative analysis shows that 87% of the institutions that mentioned the JIF supported the metric’s use in at least one of their RPT documents, while 13% of institutions expressed caution about the JIF’s use in evaluations. None of the RPT documents we analyzed heavily criticized the JIF or prohibited its use in evaluations. Of the institutions that mentioned the JIF, 63% associated it with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. In sum, our results show that the use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and indicate that there is work to be done to improve evaluation processes to avoid the potential misuse of metrics like the JIF.

Changing trends in otorhinolaryngology publishing | ACTA Otorhinolaryngologica Italica

Abstract:  The aim of this study is to compare the changes in impact factors and citation numbers of Open Access (OA) vs subscription-based (SB) journals between 1999 and 2016 and to explore the changing trends in ORL publishing. All data extracted from the SCImago Journal and Country ranking (SJR) website were used as input for statistical analysis. The chi-square test of independence was applied to determine whether the proportion of OA journals in the ORL category had changed significantly between 1999 and 2016. The years and impact factors of the OA and SB journals were graphed separately, and the changes in the annual SJR ranks of both journal types were compared using a one-way Z-test. There was a statistically significant difference: the proportion of OA journals was not equal to that of SB journals across the years 1999 to 2016, and OA journals showed a greater tendency to increase than SB journals (p < 0.01). Although the overall level of impact factors of SB journals was generally high, a comparison of the two regression models showed that the rate of increase in the impact factors of OA journals was significantly greater (p < 0.01). When choosing where to publish, it is important to consider the journal’s visibility, cost of publication, IF or SJR of the journal and speed of publication, as well as changing trends in medical publishing nourished by the Web of Science.
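For readers unfamiliar with the chi-square test of independence named in this abstract, the sketch below shows how such a test can be run in Python with SciPy on a small contingency table of OA and SB journal counts for 1999 and 2016. The counts are invented for illustration; they are not the study’s data, and the study’s actual analysis may differ in its details.

```python
# Illustrative sketch of a chi-square test of independence on journal counts.
# The counts below are invented; they are NOT the study's data.
from scipy.stats import chi2_contingency

# Rows: years (1999, 2016); columns: journal type (OA, SB).
observed = [
    [3, 35],   # hypothetical 1999 counts: few OA ORL journals
    [14, 38],  # hypothetical 2016 counts: the OA share has grown
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value would indicate that the OA/SB proportion differs between the two years.
```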

The value of a journal is the community it creates, not the papers it publishes | Impact of Social Sciences

“Initially PLOS ONE was a “club” of radicals who could afford to experiment with a new publishing model. This resulted in a higher than expected initial JIF and a massive influx of new authors, who were attracted to this (now) “proven” publishing model. Consequently, article processing times expanded (congestion), the initial sense of community became harder to maintain and the influx of articles ultimately reduced the JIF, leading to the flight of authors that were just seeking access to the prestige of the journal. The journal then shifted from a community (if not properly a knowledge club, as the disciplines were too disparate) to a social network market, which it could not sustain.

Scientific Reports follows a similar trajectory, but for different reasons. Initial submissions were not driven by a desire to be radical or progressive, as the concept of a mega-journal was already proven. Rather, Scientific Reports launched as a social network market, providing access to the prestige of the Nature brand. This model in turn became unsustainable, as the journal developed its own reputation and niche, which had been carefully planned through the naming (which does not include the name “Nature”) to avoid any dilution of the existing Nature brand.

 

What does this mean for Open Access and for initiatives like PlanS? Note that the club-theoretic model is ambivalent about how payments are made. We see similar patterns of growth and decline for subscription and APC journals alike. However, the model is arguably better configured to understand how to create knowledge-value efficiently, because it asks how a community can be created and sustained, and how open access to membership can both stimulate and dilute knowledge-making itself. In our next post, we will discuss the implications of our model for planning a transition to full open access.”