What happens to journals that break away? | Filling a much-needed gap

Although it is still a relatively rare occurrence, several journal boards have broken away from large commercial publishers. A good list is at the Open Access Directory. These journals are usually required to change their names, because the previous publisher will not relinquish them. They are cut off from the enormous support provided by large commercial publishers (after all, their subscription prices are so high that the money is surely being put back into developing better infrastructure, rather than, say, enriching shareholders, giving inflated honoraria to editors, or paying inefficient support staff). Thus one might expect that these journals would struggle.

I looked at the fortunes of the mathematics journals that have taken this route. Below I list the original title, the approximate date of the breakaway, the new title and publisher, and citation impact measures taken from 2014 data at eigenfactor.org, and compare them to the results for the original journal….

It seems clear that the new journals are doing considerably better than the old ones overall. I wonder whether the idea often touted by radical leftist OA advocates that large commercial publishers don’t add much value could have a grain of truth in it.

Exploring possibilities to use bibliometric data to monitor Gold open access publishing at the national level – van Leeuwen – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract:  This article describes the possibilities to analyze open access (OA) publishing in the Netherlands in an international comparative way. OA publishing is now actively stimulated by Dutch science policy, similar to the United Kingdom. We conducted a bibliometric baseline measurement to assess the current situation, to be able to measure developments over time. We collected data from various sources, and for three different smaller European countries (the Netherlands, Denmark, and Switzerland). Not all of the analyses for this baseline measurement are included here. The analysis presented in this article focuses on the various ways OA can be defined using the Web of Science, limiting the analysis mainly to Gold OA. From the data we collected we can conclude that the way OA is currently registered in various electronic bibliographic databases is quite unclear, and various methods applied deliver results that are different, although the impact scores derived from the data point in the same direction.

Impact of Social Sciences – A variety of strategies and funding approaches are required to accelerate the transition to open access. But in all, authors are key

“More than two decades of work towards liberating scholarly publishing from paywalled constraints has left many within the scholarly community exploring ways to accelerate the transition to open access. Not all institutions or author communities will agree upon which strategies or funding approaches to undertake, and nor do they need to. But whichever strategy is pursued, having university faculty lead the charge represents the most effective way forward. Rachael G. Samberg, Richard A. Schneider, Ivy Anderson and Jeff MacKie-Mason share the University of California’s range of open access policy and advocacy materials, and highlight some potential next steps that may be of use to faculty and author communities.”

Research Data Infrastructures in the UK: Landscape Report

“This report reviews:

a. the policies and the services that support and promote open research data in the UK,

b. some of the evidence as to the impact of those policies,

c. the take-up of services,

d. and the adoption of open data and more generally open science practices….”

Increasing Participation in Your Institutional Repository

“So you’ve established an institutional repository (IR), where users can put papers, theses, and experimental data on file, making it easily accessible to the larger world. While getting an institutional repository up and running is no small feat, it’s only the first step. To make the most of this tool, you have to fill it, and that means getting ongoing participation from faculty and students.

While making information as broadly accessible as possible is a high priority for most librarians, the same can’t necessarily be said for all the faculty members producing that information. To drive participation by potential contributors, librarians have to show students and faculty what’s in it for them in addition to making a principled appeal. One way of doing that is helping to tie participation to things that already matter to academics, like tracking (and increasing) citations and other proof of usage of their work….”

Springer Nature is committed to being a part of the open-access movement | by Steven Inchcoombe, chief publishing officer

“Institutions, research funding bodies and publishers must all work together to change the system in the interest of advancing research, says Steven Inchcoombe

As part of our recent IPO process, there was a regulatory requirement for Springer Nature to prepare a “prospectus”: a lengthy legal reference document intended for “qualified investors”. In the past week, some content from this 400-plus-page document has been taken out of context to make inaccurate and unfair comments about us, our plans and our business; and we want to set the record straight.

We have been accused of “paying lip service” to the San Francisco Declaration on Research Assessment (DORA). This is not true and is particularly upsetting for our colleagues who are proud to stand firmly behind DORA and who have been implementing the large-scale changes needed to fulfil our obligations. This has seen us stop using journal impact factors in isolation in our marketing (note: a prospectus is a legal document aimed at potential investors, not a marketing tool for authors or librarians). In fact, for more than 10 years, long before DORA, Nature editorials have expressed concerns about the overuse of impact factors and have set out the case for a greater variety of more suitable metrics for different purposes. We continue to see this need, and we will continue to offer our librarians, authors, readers, editors and partners other choices, especially those at article level.

We have been accused of “exploiting” impact factor to market our journals. We are not. At Springer Nature, we have increased the use of other journal-level and article-level metrics including article usage and altmetrics. This is clearly stated in the prospectus, which references the importance of other metrics such as views/downloads or mentions in social media. The fact, however, remains that authors do choose which journals to publish in partly based on their impact factors, which is why we had a duty to explain this. Indeed, their long history of being independently calculated and published means that they are an important reference point in a prospectus, which is a verifiable, fact-based document aimed at investors. In our author survey last year (completed by more than 70,000 authors from all disciplines and regions), a journal’s impact factor is one of the top four criteria when choosing where to submit their draft articles, alongside a journal’s reputation, relevance and quality of peer review, in that order.

Finally, it has been claimed that our only motivation for higher impact factors is to drive higher article-processing charges. This is also not true. Part of our commitment to developing the largest and most comprehensive range of open-access journals in the publishing industry includes a desire to have a range of community-based OA journals, sound science OA journals and selective OA journals. For example, we flipped Nature Communications many years ago to become fully OA to ensure that such a choice existed for authors, and it is now the highest-cited OA journal in the world, demonstrating its appeal to authors and readers alike….”


Perception of the importance of chemistry research papers and comparison to citation rates

Abstract:  Chemistry researchers are frequently evaluated on the perceived significance of their work, with citation count the most commonly used metric for gauging this property. Recent studies have called for a broader evaluation of significance that includes more nuanced bibliometrics as well as altmetrics to more completely evaluate scientific research. To better understand the relationship between metrics and peer judgements of significance in chemistry, we have conducted a survey of chemists to investigate their perceptions of previously published research. Focusing on a specific issue of the Journal of the American Chemical Society published in 2003, respondents were asked to select which articles they thought best matched importance and significance given several contexts: highest number of citations, most significant (subjectively defined), most likely to share among chemists, and most likely to share with a broader audience. The answers to the survey can be summed up in several observations. The ability of respondents to predict the citation counts of established research is markedly lower than the ability of those counts to be predicted by the h-index of the corresponding author of each article. This observation is conserved even when only considering responses from chemists whose expertise falls within the subdiscipline that best describes the work performed in an article. Respondents view both cited papers and significant papers differently than papers that should be shared with chemists. We conclude from our results that peer judgements of importance and significance differ from metrics-based measurements, and that chemists should work with bibliometricians to develop metrics that better capture the nuance of opinions on the importance of a given piece of research.
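For readers less familiar with the h-index invoked in the abstract above, a minimal sketch of how it is conventionally computed (the function name and sample citation counts here are illustrative, not from the study):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers, each with at least h citations."""
    h = 0
    # Rank papers from most- to least-cited; walk down until the
    # citation count falls below the paper's rank.
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers have >= 4 citations)
```

The study's observation is that this single number, computed for a corresponding author, predicted an article's eventual citation count better than the subjective judgements of surveyed chemists did.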

The academic papers researchers regard as significant are not those that are highly cited

“For many years, academia has relied on citation count as the main way to measure the impact or importance of research, informing metrics such as the Impact Factor and the h-index. But how well do these metrics actually align with researchers’ subjective evaluation of impact and significance? Rachel Borchardt and Matthew R. Hartings report on a study that compares researchers’ perceptions of significance, importance, and what is highly cited with actual citation data. The results reveal a strikingly large discrepancy between perceptions of impact and the metric we currently use to measure it.”