3 Reasons to Publish in Open Access Journals | ELISA Genie

“1. Great scientific discoveries stand alone and do not require the brand of a high-impact journal to become seminal publications that influence the field. If your work is good enough, your peers will recognize this, and the same opportunities will open for you as would have opened had you published it elsewhere.

2. All data is good data: Projects fail for many reasons, whether a technical problem, the tools researchers are using, or previously published observations that turn out to be incorrect or only partially correct. Publishing data in open access journals and platforms allows researchers to take a complete view of the field, rather than the partial view that restricted access to data imposes.

3. Impact factors are growing: In 2010, Nature Publishing Group launched its own open access journal, Nature Communications. Since its launch, Nature Communications has impressively acquired an impact factor of 10.015. Within the last year, Cell Press has launched its own open access journal, Cell Reports, which has attracted publications from some of the leading laboratories in the world, such as those of Doug Green and Alex Behrens, and will surely post a respectable impact factor in the coming months. Overall, publishing in open access journals is critical for the future of science and will be the driving force behind the greatest human discoveries….”
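The impact factor quoted above is the standard two-year Journal Impact Factor: citations received in a given year to items the journal published in the previous two years, divided by the number of citable items published in those two years. A minimal sketch of that arithmetic (the counts below are invented to reproduce the quoted figure, not Nature Communications' actual citation statistics):

```python
# Minimal sketch of the two-year Journal Impact Factor (JIF) arithmetic.
# The counts below are hypothetical, chosen only to land on the quoted
# 10.015; they are not Nature Communications' actual statistics.

def two_year_jif(citations_in_year: int, citable_items_prev_two_years: int) -> float:
    """JIF for year Y: citations received in Y to items published in
    Y-1 and Y-2, divided by the citable items published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Hypothetical: 20,030 citations in 2012 to 2,000 items from 2010-2011.
print(round(two_year_jif(20_030, 2_000), 3))  # -> 10.015
```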

Open Access: Advocacy

“Widespread acceptance of open access has progressed more slowly than many advocates had hoped. One such advocate, Dr. Peter Suber, explains the barriers and misconceptions, and offers some strategic and practical advice….”

ATG Newschannel Original: The Passing of a Giant: Eugene Garfield Dies at 90 | Against The Grain

“The citation indexes and the Current Contents service became essential tools, not only in libraries but in research labs and technology companies across the globe….With ready access to the tables of contents of core research journals, researchers were easily able to mark articles of key interest and then contact the authors for a copy. Perhaps this can be seen as a precursor to today’s Open Access movement, allowing for direct communication between researchers and their colleagues as well as potential developers….”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established, to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics… and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016, to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to receive more feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous, via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”

Budapest Open Access Initiative | Open Access: Toward the Internet of the Mind

“On February 14, 2002, a small text of fewer than a thousand words quietly appeared on the Web: titled the “Budapest Open Access Initiative” (BOAI), it gave a public face to discussions between sixteen participants that had taken place on December 1 and 2, 2001 in Budapest, at the invitation of the Open Society Foundations (then known as the Open Society Institute)….Wedding the old – the scientific ethos – with the new – computers and the Internet – elicited a powerful, historically grounded synthesis that gave gravitas to the BOAI. In effect, the Budapest Initiative stated, Open Access was not the hastily cobbled-up conceit of a small, marginal band of scholars and scientists dissatisfied with their communication system; instead, it asserted anew the central position of communication as the foundation of the scientific enterprise. Communication, as William D. Garvey famously posited, is the “essence of science,” and thanks to the Internet, scientific communication could be further conceived as the distributed system of human intelligence….”

What are the personal and professional characteristics that distinguish the researchers who publish in high- and low-impact journals? A multi-national web-based survey. ecancermedicalscience – The open access journal from the European Institute of Oncology and the OECI

Abstract:  Purpose: This study identifies the personal and professional profiles of researchers with a greater potential to publish high-impact academic articles.

Method: The study involved conducting an international survey of journal authors using a web-based questionnaire. The survey examined personal characteristics, funding, perceived barriers to research quality, work-life balance, and satisfaction and motivation in relation to career. The processes of manuscript writing and journal publication were measured using an online questionnaire developed for this study. The responses were compared between the two groups of researchers using logistic regression models.

Results: A total of 269 questionnaires were analysed. The researchers shared some common perceptions; both groups reported that they were seeking recognition (or to be leaders in their areas) rather than financial remuneration. Furthermore, both groups identified time and funding constraints as the main obstacles to their scientific activities. The amount of time spent on research activities, having >5 graduate students under supervision, never using text-editing services prior to the publication of articles, and living in a developed, English-speaking country were the independent variables associated with a greater chance of publishing in a high-impact journal. In contrast, using one’s own resources to perform studies decreased the chance of publishing in high-impact journals.

Conclusions: The researchers who publish in high-impact journals have distinct profiles compared with those who publish in low-impact journals. English-language ability, the actual amount of time dedicated to research and scientific writing, and the availability of financial resources are the factors associated with a successful researcher’s profile.
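The group comparison this abstract describes is a logistic regression of “published in a high-impact journal” on survey variables. A minimal sketch of that setup on synthetic data (the variable names, coding, and effect sizes below are illustrative assumptions, not the study’s actual model or coefficients):

```python
# Sketch of the kind of logistic regression described above, on synthetic
# data. Variable names and effect sizes are illustrative, not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 269  # same size as the analysed sample, for flavour

# Hypothetical predictors suggested by the abstract:
hours_research = rng.normal(20, 8, n)    # weekly hours spent on research
many_students = rng.integers(0, 2, n)    # supervises >5 graduate students
english_country = rng.integers(0, 2, n)  # lives in an English-speaking country
own_funding = rng.integers(0, 2, n)      # uses own resources for studies

# Synthetic outcome loosely following the reported directions of effect.
logit = -2 + 0.08*hours_research + 0.7*many_students + 0.9*english_country - 0.8*own_funding
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([hours_research, many_students,
                                     english_country, own_funding]))
fit = sm.Logit(y, X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios: intercept + each predictor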

New study explores disparities between researchers who publish in high- and low-impact journals

“A new study surveying authors from a range of countries investigates the crucial differences between authors who publish in high- and low-impact factor medical journals. This original research shows that the growth of open access hasn’t significantly changed the publishing landscape as regards impact factor….”

A Letter to Thomson Reuters – ASCB

“In April 2013, some of the original signers of DORA [Declaration on Research Assessment] wrote to executives at Thomson Reuters to suggest ways in which it might improve its bibliometric offerings. Suggestions included replacing the flawed and frequently misused two-year Journal Impact Factor (JIF) with separate JIFs for the citable reviews and for the primary research article content of a journal; providing more transparency in Thomson Reuters’ calculation of JIFs; and publishing the median value of citations per citable article in addition to the JIFs. Thomson Reuters acknowledged receipt of the letter and said, “We are carefully reviewing all the points raised and will respond as soon as possible.”…”
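Two of the letter’s suggestions are easy to make concrete. Citation counts per article are heavily skewed, so the mean (which the JIF reports) can sit far above the median; and reviews typically attract more citations than primary research articles, so a pooled JIF flatters the primary content. A small illustration with invented citation counts:

```python
# Illustration of two of the DORA letter's suggestions, using invented
# per-article citation counts for a single hypothetical journal.
import statistics

reviews = [40, 55, 62, 90, 120]                   # reviews are cite-heavy
primary = [0, 0, 1, 1, 2, 2, 3, 4, 5, 6, 8, 30]   # skewed: mostly low, one outlier

pooled = reviews + primary
print("pooled JIF-style mean:", round(statistics.mean(pooled), 2))
print("pooled median:        ", statistics.median(pooled))
print("reviews-only mean:    ", round(statistics.mean(reviews), 2))
print("primary-only mean:    ", round(statistics.mean(primary), 2))
```

With these invented numbers the pooled mean is about 25 citations per item while the median is 5, which is exactly the gap that reporting the median (and splitting reviews from primary research) is meant to expose.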

Retracted Science and the Retraction Index

Abstract:  Articles may be retracted when their findings are no longer considered trustworthy due to scientific misconduct or error, when they plagiarize previously published work, or when they are found to violate ethical guidelines. Using a novel measure that we call the “retraction index,” we found that the frequency of retraction varies among journals and shows a strong correlation with the journal impact factor. Although retractions are relatively rare, the retraction process is essential for correcting the literature and maintaining trust in the scientific process.
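The retraction index is, roughly, retractions per 1,000 articles a journal published over the study window, and the abstract’s result is that this index rises with the journal impact factor. A sketch of the computation on invented journal data (the journals and counts below are assumptions for illustration, not the paper’s dataset):

```python
# Sketch: retraction index (retractions per 1,000 published articles) and
# its correlation with journal impact factor. All journal data are invented.
import numpy as np

def retraction_index(retractions: int, articles: int) -> float:
    """Retractions x 1,000 / published articles over the study window,
    roughly the measure described in the abstract."""
    return 1_000 * retractions / articles

# (retractions, articles published, impact factor) for hypothetical journals
journals = [(4, 8_000, 31.0), (2, 9_500, 16.0), (1, 20_000, 9.0), (0, 15_000, 4.0)]

idx = np.array([retraction_index(r, a) for r, a, _ in journals])
jif = np.array([j for _, _, j in journals])
print("retraction indices:", idx.round(3))
print("correlation with JIF:", np.corrcoef(idx, jif)[0, 1].round(2))
```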