The MIT Press has received a three-year $850,000 grant from Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin, to perform a broad-based monograph publishing cost analysis and to develop and openly disseminate a durable financial framework and business plan for open-access (OA) monographs. The press, a leader in OA publishing for almost 25 years, will also undertake a pilot program to implement the resulting framework for scholarly front- and backlist titles.
“What is this meeting about?
DORA and the Howard Hughes Medical Institute (HHMI) are convening a diverse group of stakeholders to consider how to improve research assessment policies and practices.
By exploring different approaches to cultural and systems change, we will discuss practical ways to reduce the reliance on proxy measures of quality and impact in hiring, promotion, and funding decisions. To focus on practical steps forward that will improve research assessment practices, we are not going to discuss the well-documented deficiencies of the Journal Impact Factor (JIF) as a measure of quality….”
Finding a journal's open access information alongside its global impact currently requires consulting multiple databases. We describe a single, searchable database of all emergency medicine and critical care journals that includes their open access policies, publication costs, and impact metrics.
A list of emergency medicine and critical care journals (including citation metrics) was created using Scopus (CiteScore) and the Web of Science (Impact Factor). Costs of gold/hybrid open access and article processing charges (open access fees) were collected from journal websites. Self-archiving policies were collected from the Sherpa/RoMEO database. The relative cost of access in different regions was calculated using the World Bank Purchasing Power Parity index for authors from the United States, Germany, Turkey, China, Brazil, South Africa, and Australia.
We identified 78 emergency medicine and 82 critical care journals. The median CiteScore for emergency medicine was 0.73 (interquartile range, IQR 0.32-1.27) and the median Impact Factor was 1.68 (IQR 1.00-2.39). The median CiteScore for critical care was 0.95 (IQR 0.25-2.06) and the median Impact Factor was 2.18 (IQR 1.73-3.50). The mean article processing charge was $2243.04 (SD = $1136.16) for emergency medicine and $2201.64 (SD = $1174.38) for critical care. Article processing charges were 2.24, 1.75, 2.28, and 1.56 times more expensive for South African, Chinese, Turkish, and Brazilian authors respectively than for United States authors, but roughly neutral for German and Australian authors (1.02 and 0.81 respectively). The database can be accessed here: http://www.emct.info/publication-search.html.
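The Purchasing Power Parity adjustment described above can be sketched as follows. This is a minimal illustration, assuming the relative burden of a USD-denominated fee is the ratio of the market exchange rate to the PPP conversion factor; the country figures below are hypothetical placeholders, not the study's data.

```python
def relative_apc_burden(exchange_rate, ppp_factor):
    """Real cost of a USD-denominated APC for a local author relative to a
    US author: local currency units per USD (market rate) divided by local
    currency units per international dollar (PPP). Values > 1 mean the fee
    is more expensive in real terms than for a US author."""
    return exchange_rate / ppp_factor

# Hypothetical illustrative figures (market rate, PPP factor) per country:
countries = {
    "Country A": (7.0, 4.0),   # fee is 1.75x as burdensome as for US authors
    "Country B": (1.0, 1.0),   # parity with US authors
}
for name, (fx, ppp) in countries.items():
    print(name, round(relative_apc_burden(fx, ppp), 2))
```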
We present a single database that captures emergency medicine and critical care journal impact rankings alongside their respective open access costs and green open access policies.
Abstract: Despite its undisputed position as the biggest social media platform, Facebook has never entered the main stage of altmetrics research. In this study, we argue that the lack of attention by altmetrics researchers is not due to a lack of relevant activity on the platform, but because the challenges of collecting Facebook data have restricted studies to activity that takes place in a select group of public pages and groups. We present a new method of collecting shares, reactions, and comments across the platform, including private timelines, and use it to gather data for all articles published between 2015 and 2017 in the journal PLOS ONE. We compare the gathered data with altmetrics collected and aggregated by Altmetric. The results show that 58.7% of paper shares on the platform happen outside of public view and that, when all shares are collected, the volume of activity approximates patterns of engagement previously observed only for Twitter. Both results suggest that the role and impact of Facebook as a medium for science and scholarly communication have been underestimated. Furthermore, they emphasise the importance of openness and transparency around the collection and aggregation of altmetrics.
“Many researchers still see the journal impact factor (JIF) as a key metric for promotions and tenure, despite concerns that it’s a flawed measure of a researcher’s value….
A recent survey of 338 researchers from 55 universities in the United States and Canada showed that more than one-third (36%) consider JIFs to be “very valued” for promotions and tenure, and 27% said they were “very important” when deciding where to submit articles….
[N]on-tenured and younger researchers, for whom RPT matters most, put more weight on JIFs when deciding where to publish….
According to Björn Brembs, a neuroscientist from the University of Regensburg, in Germany, who reviewed the study for eLife, the continuing deference to the JIF shows how scientists can be highly critical in their own subject domain, yet “gullible and evidence-resistant” when evaluating productivity. “This work shows just how much science is in dire need of a healthy dose of its own medicine, and yet refuses to take the pill,” he says….”
“If (for example) the seeds of Plan S were sown when funders began to introduce requirements and funding around access to research publications, what do today’s funder preferences and requirements tell us about how to prepare for tomorrow? Here is a handful of insights from the project….
95% of respondents considered that being able to demonstrate broader communications and impacts is important to their future funding and career progression….
The dialogue, and thus the focus of publisher efforts at the moment, continues to be around open access and data sharing – but funders’ focus seems to be moving on to other aspects of communication, with dissemination / impact plans, knowledge exchange / transfer and broader audiences all more commonly required by respondents’ funders than open access and data sharing. This may be an early indicator of emerging opportunities for publishers – helping researchers with knowledge exchange, knowledge transfer, and research commercialization….”
To investigate whether there is a difference in citation rate between open access and subscription access articles in the field of radiology.
This study included consecutive original articles published online in European Radiology. Pearson χ2, Fisher’s exact, and Mann-Whitney U tests were used to assess for any differences between open access and subscription access articles. Linear regression analysis was performed to determine the association between open access publishing and citation rate, adjusted for continent of origin, subspecialty, study findings in the article title, number of authors, number of references, length of the article, and number of days the article had been online. In a secondary analysis, we determined the association between open access and the number of downloads and shares.
A total of 500 original studies were included, of which 86 (17.2%) were open access and 414 (82.8%) were subscription access articles. Articles from Europe or North America were significantly more frequently published open access (p = 0.024 and p = 0.001), while articles with corresponding authors from Asia were significantly less frequently published open access (p < 0.001). Open access articles were cited significantly more frequently than subscription access articles (beta coefficient = 3.94, 95% confidence interval [CI] 1.44 to 6.44, p = 0.002). In adjusted linear regression analysis, open access articles remained significantly more frequently cited (beta coefficient = 3.588, 95% CI 0.668 to 6.508, p = 0.016), downloaded (beta coefficient = 759.801, 95% CI 630.917 to 888.685, p < 0.001), and shared (beta coefficient = 0.748, 95% CI 0.124 to 1.372, p = 0.019).
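The covariate-adjusted regression described above can be sketched as follows. This is a minimal illustration on synthetic data (all values are hypothetical, not the study's), using one example covariate in place of the full adjustment set.

```python
import numpy as np

# Synthetic data only -- coefficients and covariates are hypothetical.
rng = np.random.default_rng(0)
n = 500
open_access = rng.integers(0, 2, size=n)      # 1 = open access article
days_online = rng.uniform(100, 1000, size=n)  # one example covariate
citations = (2.0 + 3.5 * open_access + 0.01 * days_online
             + rng.normal(0, 2, size=n))

# Ordinary least squares with an intercept: the coefficient on
# open_access is the citation difference adjusted for days_online.
X = np.column_stack([np.ones(n), open_access, days_online])
beta, *_ = np.linalg.lstsq(X, citations, rcond=None)
print(f"adjusted open-access coefficient: {beta[1]:.2f}")
```

With enough data the fitted coefficient recovers the simulated effect (3.5 here), which is the sense in which the study's beta coefficients estimate the open-access citation difference holding the other covariates fixed.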
Open access publishing is independently associated with an increased citation, download, and share rate in the field of radiology.
Abstract: Open access provides researchers with another publishing option besides traditional publication in subscription-based journals. By offering wider dissemination, and therefore greater visibility, as well as better accessibility, open access helps to meet the changing needs of authors and readers in today's information and communication society. Although this publication model offers many advantages for both readers and authors, there are also some obstacles. In order to identify the incentives that can lead medical informatics scientists to publish open access, we conducted a study consisting of group discussions, interviews, and surveys. The first part of this tripartite evaluation comprises the group discussions and interviews. Their initial results show that, among other factors, higher visibility, indexing, the Impact Factor, and better accessibility motivate an open access publication.
The field of oncology is among the most productive fields in medicine and includes some of the highest-impact journals, yet the impact of open access (OA) publishing in oncology is still understudied. In this study, we examine the open access status of oncology journals and the effect of that status on journal indices.
We collected data on the included journals from the Scopus Source List on 1 November 2018 and filtered the list for oncology journals covering the years 2011 to 2017. Journals covered by Scopus are indicated as OA if they are listed in the Directory of Open Access Journals (DOAJ) and/or the Directory of Open Access Scholarly Resources (ROAD).
By 2017 there were 318 oncology journals, compared to 260 in 2011, an increase of about 24.2%, and the percentage of OA journals had increased from 19.6% to 23.9%. Although non-OA journals had significantly higher scholarly output (P=0.001), the percent cited and the source normalized impact per paper (SNIP) were higher for OA journals.
Publishing in oncology OA journals will yield more impact, in terms of citations, and will reach a broader audience.
Abstract: We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF, or closely related terms. Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.