“Now an analysis shows that researchers in the UK are indeed posting their papers online earlier, as are their colleagues all over the world. The time researchers take to post papers online shrank by an average of 472 days per country between 2013 and 2017, finds a study published on 17 April and to be presented at the ACM/IEEE Joint Conference on Digital Libraries in June. Though the authors can’t definitively say what’s behind the trend, they suggest that the Research England policy and other funding eligibility requirements recently announced worldwide are pushing academics to rapidly make their work freely available….”
“Almost half of research-intensive universities consider journal impact factors when deciding whom to promote, a survey of North American institutions has found.
About 40% of institutes with a strong focus on research mention impact factors in documents used in the review, promotion and tenure process, according to the analysis, which examined more than 800 documents across 129 institutions in the United States and Canada.
The data imply that many universities are evaluating the performance of their staff using a metric that has been widely criticized as a crude and misleading proxy for the quality of scientists’ work….
Less than one-quarter of the institutions mentioned impact factor or a closely related term such as “high impact journal” in their documents. But this proportion rose to 40% for the 57 research-intensive universities included in the survey. By contrast, just 18% of universities that focused on master’s degrees mentioned journal impact factors (see ‘High impact’).
In more than 80% of the mentions at research-heavy universities, the language in the documents encouraged the use of the impact factor in academic evaluations. Only 13% of mentions at these institutions came with any cautionary words about the metric. The language also tended to imply that high impact factors were associated with better research: 61% of the mentions portrayed the impact factor as a measure of the quality of research, for example, and 35% stated that it reflected the impact, importance or significance of the work….”
“What might happen if the provost of a highly visible research university that had recently reconfirmed its public-facing mission gathered the entire campus together – deans, department chairs and faculty – to rethink the university’s promotion and tenure standards from top to bottom? What might become possible if that provost were to say that our definitions of “excellence” in research, teaching and service must have that public-facing mission at their heart? What might be possible if that public mission really became Job One?
The provost paused. Then he gave his answer: “Any institution that did that would immediately lose competitiveness within its cohort.” …
The pursuit of prestige is not the problem in and of itself, and excellence is, of course, something to strive for. In fact, friendly competition can push us all to do better. But excellence and prestige and the competitiveness that fuels their pursuit are too often based in marketing – indeed, in the logic of the market – rather than in the actual purposes of higher education. It’s a diversion from the on-the-ground work of producing and sharing knowledge that can result in misplaced investments and misaligned priorities….”
Abstract: The Journal Impact Factor (JIF) was originally designed to aid libraries in deciding which journals to index and purchase for their collections. Over the past few decades, however, it has become a widely relied-upon metric for evaluating research articles based on journal rank. Surveyed faculty often report feeling pressure to publish in journals with high JIFs and mention reliance on the JIF as one problem with current academic evaluation systems. While faculty reports are useful, information is lacking on how often and in what ways the JIF is currently used for review, promotion, and tenure (RPT). We therefore collected and analyzed RPT documents from a representative sample of 129 universities from the United States and Canada and 381 of their academic units. We found that 40% of doctoral, research-intensive (R-type) institutions and 18% of master’s, or comprehensive (M-type) institutions explicitly mentioned the JIF, or closely related terms, in their RPT documents. Undergraduate, or baccalaureate (B-type) institutions did not mention it at all. A detailed reading of these documents suggests that institutions may also be using a variety of terms to indirectly refer to the JIF. Our qualitative analysis shows that 87% of the institutions that mentioned the JIF supported the metric’s use in at least one of their RPT documents, while 13% of institutions expressed caution about the JIF’s use in evaluations. None of the RPT documents we analyzed heavily criticized the JIF or prohibited its use in evaluations. Of the institutions that mentioned the JIF, 63% associated it with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. In sum, our results show that the use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and indicate that there is work to be done to improve evaluation processes to avoid the potential misuse of metrics like the JIF.
“The publication of the results of the fourth EUA Open Access Survey coincides with the emergence of two important approaches in the construction of an Open Science environment. The first is “Plan S”, signed by an increasing number of research funding organisations. The second is the development of “Publish and Read” models in negotiations with publishers by consortia negotiating on behalf of scholarly institutions. These can be considered complementary in the sense that the first aims to rapidly expand Open Access to research publications, and the second to control the total amount of funds spent by research performing organisations, that is, universities and research institutes, to publish in and to have access to scientific journals. The need to address these two major aims concurrently is the main goal of the work of the EUA Expert Group on Science 2.0/Open Science, and more generally EUA’s central objective for the future of scientific publications….
Key results regarding Open Access to research publications
• 62% of the institutions surveyed have an Open Access policy on research publications in place and 26% are in the process of drafting one.
• At institutions with an OA policy in place:
– Almost 50% require publications to be self-archived in the repository
– 60% recommend that researchers publish in OA
– 74% do not include any provisions linking Open Access to research evaluation; only 12% have mandatory guidelines linking OA to internal research assessment.
• Despite the fact that most surveyed institutions have implemented an Open Access policy for research publications, 73% had not defined specific Open Access targets or timelines.
• 70% of these institutions monitor deposits in the repository. However, only 40% monitor Open Access publishing and only 30% monitor related costs (gold OA).
• Librarians are most knowledgeable about and most committed to (~80%) Open Access (publishers’ policies, H2020 rules) followed by institutional leadership (~50%). For researchers, including early-stage researchers, the figure drops to ~20%.
• Raising awareness and developing additional incentives for researchers to make their work available via Open Access are top priorities….”
“The following are the responses discussed regarding the Plan S Guidance on Implementation.
01 We are in broad support of Plan S and its goal of ensuring immediate and complete open access worldwide to journal articles resulting from publicly funded research. We applaud the effort of Plan S to provide strong incentives to make research open access. We support an international effort to achieve this goal worldwide as soon as possible.
02 We fully recognize the need for forceful and accountable policies by public funders in research, education, and libraries to facilitate open access against various entrenched interests or the inertia of the status quo. We urge all in research, education, publishing, platforms, repositories, and libraries to engage diligently in timely, transformative efforts to meet the challenges.
03 We support the Final Conference Statement of the 14th Berlin Conference on Open Access with its commitments. We urge all the publishers to work with the global research community to effect complete and immediate open access according to the Statement.
04 We support the principles and roadmap of the OA2020 Initiative, which aims to transform the majority of today’s scholarly journals from subscription to OA publishing while continuing to support new forms of OA publishing. We believe the transition can be realized within the framework of currently available resources. We see no legitimate reason for, and will object to, any attempt to increase spending by the original subscribing institutions during the transformation.
05 We support that authors retain copyrights of their publications in open access publishing through journals or open access platforms.
06 We support that open access publications be made under open licenses. We support the use of the CC BY license as the preferred one but recommend that other CC licenses also be allowed as compliant with Plan S.
07 We recognize the strong need for compliance requirements, agreed by the research communities, for open access journals and platforms. We agree that infrastructural instruments such as DOAJ and OpenDOAR can be used to help identify and signal compliance, but we urge cOAlition S and other funders to recognize and support other appropriate mechanisms for this purpose, and to require that any such instruments be placed under international oversight by the global research community to ensure their not-for-profit nature, inclusiveness, objectivity, integrity, and efficiency.
08 We commend the recognition by Plan S that there exist different models of financing and paying for Open Access publication. We support an inclusive range of immediate open access publishing approaches. We support the transparency and monitoring of open access publication costs and fees.
09 We urge cOAlition S and other funders, through Plan S or other means, to provide financial support for no-fee OA journals. A wide range of support approaches to no-fee OA journals should be encouraged to enhance the diversity of open access publishing and the competitiveness of the publishing market, and to avoid the perverse effect of giving no-fee journals an incentive to start charging fees. While support can begin with general statements, measures can be designed and tested in a timely fashion to encourage quality, integrity, transparency, and openness, and to increase host investment and other diverse, appropriate income.
10 We support that, where article processing charges (APCs) apply, efforts be made to establish a fair and reasonable APC level, including equitable waiver policies, that reflects the costs involved in the quality assurance, editing, and publishing process and how these add value to the publication. We hold it very important that any such effort take into consideration the diversity of the world, to ensure the applicability and affordability of any such measures across countries and disciplines.
11 We commend the support and requirements of Plan S for financing APCs for open access publication in subscription journals (‘hybrid Open Access’) only under transformative agreements. These agreements should be temporary and transitional, with a shift to full open access within a very few years.
12 We understand the purposes and benefits of using ORCIDs in journal publications. Considering the different paces of ORCID adoption across regions and disciplines, we recommend that it be implemented as a preferred condition, at least in the initial years. We recommend the same treatment for the use of DOIs.
13 We support the Plan S recommendation that “all publications and also other research outputs deposited in open repositories.” We recommend that Plan S fully acknowledge and make use of the full range of capabilities of open repositories to support open access, long-term preservation, research management, and re-use.
14 We encourage Plan S to take the transformative green OA mechanism as one of the venues to implement open access, as long as the embargo period of com…”
“I think the biggest barrier is the existing system of incentives – people are not made professors for making their research openly available — that needs to change. The current system was never built to scale to the current size of the research world. I think that there will be some radical changes in scholarly communication and evaluation. Research, however, is quite rightly a conservative world. Systems need to be tried and tested – we can’t afford to switch to a system that is susceptible to effects like fake news. So, I don’t think that change will happen quickly….
As a researcher, I want it to be simple. I don’t want to have to find money from different pots to publish my work. I don’t want to have to understand licensing and copyright law nor do I want to have to understand if my funder’s requirements are at odds with my institution’s requirements of me or indeed my government’s views on what constitutes open. I also really don’t want to have to go through the same thing with my data and my software as well as my journal article. So, in short, yes, I do think that there needs to be simplification. Not wanting to wade into the minefield that is Plan S, I will say that one thing that must be welcome to everyone is that there is now clear coordination going on between different stakeholders. Ideally this would lead to a framework or standard that allows stakeholders to adopt or to sign up to a standardized set of Open Access requirements that are internally consistent and easy to understand….”
“Perhaps the paper itself is to blame. Scientific methods evolve now at the speed of software; the skill most in demand among physicists, biologists, chemists, geologists, even anthropologists and research psychologists, is facility with programming languages and “data science” packages. And yet the basic means of communicating scientific results hasn’t changed for 400 years. Papers may be posted online, but they’re still text and pictures on a page.
What would you get if you designed the scientific paper from scratch today? …
Software is a dynamic medium; paper isn’t. When you think in those terms it does seem strange that research like Strogatz’s, the study of dynamical systems, is so often being shared on paper …
I spoke to Theodore Gray, who has since left Wolfram Research to become a full-time writer. He said that his work on the notebook was in part motivated by the feeling, well formed already by the early 1990s, “that obviously all scientific communication, all technical papers that involve any sort of data or mathematics or modeling or graphs or plots or anything like that, obviously don’t belong on paper. That was just completely obvious in, let’s say, 1990,” he said. …”
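Gray’s point is easiest to see with a concrete toy. In a notebook-style document a reader can change a parameter and re-run the model, whereas a printed page can only show one frozen trajectory. A minimal sketch in plain Python (the function name and parameter values here are illustrative, not from the article) iterates the logistic map, a textbook example from the study of dynamical systems of the kind Strogatz works on:

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n): a one-line dynamical system
# whose behavior changes qualitatively as the parameter r is varied.
# In a live document the reader could edit r and re-run; on paper, they can't.

def logistic_trajectory(r, x0=0.2, steps=50):
    """Return the orbit [x0, x1, ..., x_steps] of the logistic map."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

if __name__ == "__main__":
    # Three regimes: convergence to a fixed point, a 2-cycle, and chaos.
    for r in (2.5, 3.2, 3.9):
        tail = logistic_trajectory(r)[-4:]
        print(f"r = {r}: last iterates {[round(x, 4) for x in tail]}")
```

Re-running the loop with a different `r` is the whole “dynamic medium” argument in miniature: the interesting content is the parameterized behavior, not any single static plot of it.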