“But do people use that information to make choices? Does it change where they get their care? There is little evidence that this is the case. While the study mentioned earlier showed hospitals respond to such data, that same study showed that consumers and purchasers of healthcare rarely seek out the information and do not understand or trust it. It had a small, although increasing, impact on their decision making. According to another study, there is little evidence that patients use publicly reported data to make a choice. The authors of that study suggested several reasons: consumers didn’t believe they had a choice because of their insurance provider, consumers couldn’t understand the quality data (the reports are poorly designed), and consumers didn’t trust the information provided….”
“Even so, it is much easier to sign DORA than to deliver on the commitment that signing entails. And while I would always recommend that universities sign as soon as they are ready to commit, because doing so sends such a positive message to their researchers, they should not put pen to paper without a clear idea of how signing will impact their approach to research assessment, or how they are going to develop any changes with their staff….
Out went phrases such as “contributions to research papers that appear in high-impact journals” to be replaced by “contributions to high quality and impactful research.” The change is subtle but significant – the revised guidance makes it plain that ‘impactful research’ in this context is not a cypher for the JIF; rather it is work “that makes a significant contribution to the field and/or has impact beyond the immediate field of research.” …”
“What is this meeting about?
DORA and the Howard Hughes Medical Institute (HHMI) are convening a diverse group of stakeholders to consider how to improve research assessment policies and practices.
By exploring different approaches to cultural and systems change, we will discuss practical ways to reduce the reliance on proxy measures of quality and impact in hiring, promotion, and funding decisions. To focus on practical steps forward that will improve research assessment practices, we are not going to discuss the well-documented deficiencies of the Journal Impact Factor (JIF) as a measure of quality….”
Finding a journal’s open access information alongside its global impact metrics requires access to multiple databases. We describe a single, searchable database of all emergency medicine and critical care journals that includes their open access policies, publication costs, and impact metrics.
A list of emergency medicine and critical care journals (including citation metrics) was created using Scopus (CiteScore) and the Web of Science (Impact Factor). Costs of gold/hybrid open access and article processing charges (open access fees) were collected from journal websites. Self-archiving policies were collected from the Sherpa/RoMEO database. The relative cost of access in different regions was calculated using the World Bank Purchasing Power Parity index for authors from the United States, Germany, Turkey, China, Brazil, South Africa, and Australia.
We identified 78 emergency medicine and 82 critical care journals. The median CiteScore for emergency medicine was 0.73 (interquartile range, IQR 0.32-1.27) and the median impact factor was 1.68 (IQR 1.00-2.39). The median CiteScore for critical care was 0.95 (IQR 0.25-2.06) and the median impact factor was 2.18 (IQR 1.73-3.50). The mean article processing charge was $2243.04 (SD = $1136.16) for emergency medicine and $2201.64 (SD = $1174.38) for critical care. Article processing charges were 2.24, 1.75, 2.28 and 1.56 times more expensive for South African, Chinese, Turkish and Brazilian authors respectively than for United States authors, but neutral for German and Australian authors (1.02 and 0.81 respectively). The database can be accessed here: http://www.emct.info/publication-search.html.
We present a single database that captures emergency medicine and critical care journal impact rankings alongside their respective open access costs and green open access policies.
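The Purchasing Power Parity adjustment described in the abstract above can be sketched in a few lines. The function below is a hedged illustration of the general technique, not the study’s actual code, and the exchange rate and PPP conversion factor in the example are hypothetical placeholder values rather than World Bank figures.

```python
def relative_apc_cost(fx_rate_lcu_per_usd: float,
                      ppp_lcu_per_intl_usd: float) -> float:
    """How many times more expensive a USD-priced APC feels to a
    local author than to a US author.

    fx_rate_lcu_per_usd: market exchange rate (local currency units per USD)
    ppp_lcu_per_intl_usd: PPP conversion factor (local currency units
                          per international dollar)
    """
    # A $1 charge costs fx_rate LCU at market rates; dividing by the PPP
    # factor re-expresses that amount in international (PPP) dollars.
    return fx_rate_lcu_per_usd / ppp_lcu_per_intl_usd

# Illustrative only: a currency trading at 15 LCU/USD whose PPP
# conversion factor is 6.7 LCU per international dollar.
ratio = relative_apc_cost(15.0, 6.7)
print(f"APC feels {ratio:.2f}x more expensive than for a US author")
```

For the US baseline the two rates coincide, so the ratio is 1.0; values above 1 indicate a heavier real-terms burden, values below 1 a lighter one.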
“However, most poll respondents felt that citation-based indicators are useful, but that they should be deployed in more nuanced and open ways. The most popular responses to the poll were that citation-based indicators should be tweaked to exclude self-citations, or that self-citation rates should be reported alongside other metrics (see ‘The numbers game’). On the whole, respondents wanted to be able to judge for themselves when self-citations might be appropriate, and when not; to be able to compare self-citation across fields; and more….
But this is where there is a real problem, because for many papers citation data are locked inside proprietary databases. Since 2000, more and more publishers have been depositing information about research-paper references with an organization called Crossref, the non-profit agency that registers digital object identifiers (DOIs), the strings of characters that identify papers on the web. But not all publishers allow their reference lists to be made open for anyone to download and analyse — only 59% of the almost 48 million articles deposited with Crossref currently have open references.
There is, however, a solution. Two years ago, the Initiative for Open Citations (I4OC) was established for the purpose of promoting open scholarly citation data. As of 1 September, more than 1,000 publishers were members, including Sage Publishing, Taylor and Francis, Wiley and Springer Nature — which joined last year. Publishers still to join I4OC include the American Chemical Society, Elsevier — the largest not to do so — and the IEEE….”
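Whether a given paper’s references are open can be checked programmatically. The sketch below is a minimal illustration assuming the public Crossref REST API (`https://api.crossref.org/works/{DOI}`), whose work records include a `reference` list only when the depositing publisher has opened its references; the placeholder DOI and the mock responses are hypothetical.

```python
def has_open_references(work: dict) -> bool:
    """Return True if a Crossref work record exposes its reference list.

    `work` is the parsed JSON of a Crossref REST API response; the
    `reference` field is assumed to be present only for open references.
    """
    return bool(work.get("message", {}).get("reference"))

# A live lookup would look roughly like this (commented out so the
# sketch runs offline; the DOI is a placeholder, not a real paper):
#   import json, urllib.request
#   doi = "10.1234/example"
#   work = json.load(urllib.request.urlopen(
#       f"https://api.crossref.org/works/{doi}"))

# Offline examples using minimal mocks of the response shape:
open_work = {"message": {"reference": [{"DOI": "10.1000/example"}]}}
closed_work = {"message": {"reference-count": 42}}  # count given, list withheld
print(has_open_references(open_work), has_open_references(closed_work))
```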
Abstract: Despite its undisputed position as the biggest social media platform, Facebook has never entered the main stage of altmetrics research. In this study, we argue that the lack of attention by altmetrics researchers is not due to a lack of relevant activity on the platform, but because challenges in collecting Facebook data have limited research to activity that takes place in a select group of public pages and groups. We present a new method of collecting shares, reactions, and comments across the platform, including private timelines, and use it to gather data for all articles published between 2015 and 2017 in the journal PLOS ONE. We compare the gathered data with altmetrics collected and aggregated by Altmetric. The results show that 58.7% of papers shared on the platform happen outside of public view and that, when collecting all shares, the volume of activity approximates patterns of engagement previously only observed for Twitter. Both results suggest that the role and impact of Facebook as a medium for science and scholarly communication has been underestimated. Furthermore, they emphasise the importance of openness and transparency around the collection and aggregation of altmetrics.
“Digital Science, a leader in scholarly technology, is pleased to announce a collaboration with the International Society for Scientometrics and Informetrics (ISSI) that will give ISSI members enhanced access to Dimensions and Altmetric data for scientometric research.
ISSI is an international association of scholars and professionals active in the interdisciplinary study of the science of science, science communication, and science policy. The ISSI community advances the boundaries of quantitative science studies from theoretical, empirical, and practical perspectives.
Starting on October 1, 2019, ISSI members will formally be invited to apply for no-cost access to Altmetric and Dimensions web tools and APIs. A committee of ISSI members will provide expert assessment of researchers’ applications and guidance on using Altmetric and Dimensions in their research.
This partnership builds upon Altmetric and Dimensions’ existing no-cost data sharing programs, which are currently open to all researchers conducting non-commercial scientometric research, while providing ISSI members with additional expert advice on early-stage research….”
“Many researchers still see the journal impact factor (JIF) as a key metric for promotions and tenure, despite concerns that it’s a flawed measure of a researcher’s value….
A recent survey of 338 researchers from 55 universities in the United States and Canada showed that more than one-third (36%) consider JIFs to be “very valued” for promotions and tenure, and 27% said they were “very important” when deciding where to submit articles….
[N]on-tenured and younger researchers, for whom RPT matters most, put more weight on JIFs when deciding where to publish….
According to Björn Brembs, a neuroscientist from the University of Regensburg, in Germany, who reviewed the study for eLife, the continuing deference to the JIF shows how scientists can be highly critical in their own subject domain, yet “gullible and evidence-resistant” when evaluating productivity. “This work shows just how much science is in dire need of a healthy dose of its own medicine, and yet refuses to take the pill,” he says….”
Abstract: Open Access provides researchers another way of publishing, besides the traditional publication in subscription-based journals. By providing higher dissemination and therefore visibility, as well as better accessibility, among other benefits, Open Access helps to fulfil the changed needs of authors and readers in today’s information and communication society. Though this publication model provides many advantages for both readers and authors, there are also some obstacles. In order to identify the incentives that can lead scientists in medical informatics to publish Open Access, we conducted a study consisting of group discussions, interviews, and surveys. This tripartite evaluation begins with the group discussions and interviews. First results show that, among others, higher visibility, indexing, the Impact Factor, and better accessibility are factors in the decision to publish Open Access.
“We strongly support the principle that research must be freely accessible. At the MJA [Medical Journal of Australia], we practise what we believe and make all research freely accessible from publication, a unique feature of a subscription journal. We further support the idea that subscription journals should ensure all peer-reviewed articles are freely accessible after an embargo period and suggest this period be set at no more than 24 months after final publication. We suggest that Plan S is off track in its opposition to hybrid journals. There are many metrics of quality and impact, including media (and social media) attention, but the primary currency by which research quality is judged remains citations by peers; major breakthroughs attract very high citations as the work is replicated then adapted and extended by others around the world, which is in reality how science advances and research is translated. Several of the journals with the greatest impact and highest citations will be excluded under Plan S if they maintain their current subscription models.
When it all boils down to basics, researchers want to have their research published quickly after peer and editorial review, with near-perfect certainty, in the most prestigious, most impactful place possible. In 2019, authors do not necessarily need a traditional subscription medical journal to achieve this goal; whether this spells the end of the subscription model, time will tell as the market decides. In the meantime and whatever our personal views, researchers will continue to seek to have their work widely read and cited, which is why the top medical journals (many of which remain subscription journals) will continue to attract the best research and will have a wide choice of what to accept….”