How a working group began the process of DORA implementation at Imperial College London – DORA

“Even so, it is much easier to sign DORA than to deliver on the commitment that signing entails. And while I would always recommend that universities sign as soon as they are ready to commit, because doing so sends such a positive message to their researchers, they should not put pen to paper without a clear idea of how signing will impact their approach to research assessment, or how they are going to develop any changes with their staff….

Out went phrases such as “contributions to research papers that appear in high-impact journals” to be replaced by “contributions to high quality and impactful research.” The change is subtle but significant – the revised guidance makes it plain that ‘impactful research’ in this context is not a cypher for the JIF; rather it is work “that makes a significant contribution to the field and/or has impact beyond the immediate field of research.” …”

The business model for publishing has not changed since the 17th century. It should. | Sustainable Scholarship

“Four hundred years after the first scholarly journals appeared, the internet means everything has changed. And, yet, nothing has changed. The scholarly journal remains the currency of the academic realm and one of the most important means that researchers have to share their findings, make a reputation and earn tenure. …”

Driving Institutional Change for Research Assessment Reform – DORA

“What is this meeting about?

DORA and the Howard Hughes Medical Institute (HHMI) are convening a diverse group of stakeholders to consider how to improve research assessment policies and practices.
By exploring different approaches to cultural and systems change, we will discuss practical ways to reduce the reliance on proxy measures of quality and impact in hiring, promotion, and funding decisions. To focus on practical steps forward that will improve research assessment practices, we are not going to discuss the well-documented deficiencies of the Journal Impact Factor (JIF) as a measure of quality….”

Blogging as an Open Scholarship Practice | W. Ian O’Byrne

“I’ve found that blogging helps me in my scholarship in a variety of ways. There are also challenges as I strive to embed these practices in my everyday work….

When I submitted my materials for third year review at UNH, the first page of my binder included the URL and a QR code to the address for my main blog. I indicated that my binder would contain my publications, teaching evaluations, and service documentation. But that I believed my best work lived on my website, and it was an example of how I viewed my role as a scholar. My dean at the time ripped out the page at my review meeting and threw it away. She indicated that none of that mattered, and would only serve to confuse reviewers and my colleagues.

I learned a lesson that day. My work blogging as an open scholar was set aside from my work at the institution. If I chose to continue this work, it would (for the most part) not be valued in most/all of my evaluations. I have continued this practice, and have been motivated by others as they continue to write, share, and document their thinking….”

The allure of the journal impact factor holds firm, despite its flaws | Nature Index

“Many researchers still see the journal impact factor (JIF) as a key metric for promotions and tenure, despite concerns that it’s a flawed measure of a researcher’s value….


A recent survey of 338 researchers from 55 universities in the United States and Canada showed that more than one-third (36%) consider JIFs to be “very valued” for promotions and tenure, and 27% said they were “very important” when deciding where to submit articles….

[N]on-tenured and younger researchers, for whom RPT matters most, put more weight on JIFs when deciding where to publish….

According to Björn Brembs, a neuroscientist from the University of Regensburg, in Germany, who reviewed the study for eLife, the continuing deference to the JIF shows how scientists can be highly critical in their own subject domain, yet “gullible and evidence-resistant” when evaluating productivity. “This work shows just how much science is in dire need of a healthy dose of its own medicine, and yet refuses to take the pill,” he says….”

The Canadian Open Neuroscience Platform: Catching Up to Plan S and Going Further | The Official PLOS Blog

“It is worth pausing here for a brief aside about the distinction between open sharing, open publishing of research resources, and open access publishing of articles. All of these are important but for open science to be successful the distinction between them has to be clear.

Open sharing consists of making research resources available in a way they can be freely accessed and used. Sharing datasets in a repository or data sharing platform like Dryad, or code used for data analysis and visualization via a service like Github, are good examples. Sharing in this way rapidly disseminates resources and makes them available for use and adaptation by others as quickly as possible. Open publishing of research resources, however, involves the filtration of these resources through other researchers. These peer researchers make sure that the shared resource – whether it is data, code, single figures, or any of the plethora of resources developed throughout the scientific process – is in a form that is standard and easily usable by others, as well as presenting those resources in a curated form on a website or repository. Open access publishing of articles is the primary target of efforts like Plan S and relates to publishing scholarly articles in such a way that they are freely accessible and usable.

The Canadian Open Neuroscience Platform (CONP), along with myriad other organizations, is developing the resources needed to enable open sharing, open publishing of research resources, and open publishing of articles. By doing so the CONP is helping to open science and reduce the current inequalities in access to all of the tools and research outputs science needs to thrive….

Opening science requires the collective effort of funders, data sharing platforms, academic institutions, and individual scientists. Science doesn’t have to be opened all at once, but steps down the open road must be taken, and must be taken now. The CONP will provide tools and guidance, but scientific culture shift requires a concerted community effort.

Some first steps needed to enable the open publishing of all research resources include: (1) forging agreements and partnerships between journals and open science platforms to make it easy for scientists to share their data, publish it in a curated form, and link it to publications, (2) promotion and tenure policies at academic institutions that value the sharing and publishing of data on par with producing articles, (3) funding agencies that require (and enforce) sharing and publishing data, code, and materials associated with publications as a condition of receiving a grant, and (4) a commitment from scientists themselves to change the culture of science towards openly sharing and publishing as many of their resources as they can.”


Meta-Research: Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations | eLife

Abstract:  We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF, or closely related terms. Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.

Why we publish where we do: Faculty publishing values and their relationship to review, promotion and tenure expectations | bioRxiv

Abstract:  Using an online survey of academics at 55 randomly selected institutions across the US and Canada, we explore priorities for publishing decisions and their perceived importance within review, promotion, and tenure (RPT). We find that respondents most value journal readership, while they believe their peers most value prestige and related metrics such as impact factor when submitting their work for publication. Respondents indicated that total number of publications, number of publications per year, and journal name recognition were the most valued factors in RPT. Older and tenured respondents (most likely to serve on RPT committees) were less likely to value journal prestige and metrics for publishing, while untenured respondents were more likely to value these factors. These results suggest disconnects between what academics value versus what they think their peers value, and between the importance of journal prestige and metrics for tenured versus untenured faculty in publishing and RPT perceptions.

Academic review promotion and tenure documents promote a view of open access that is at odds with the wider academic community | Impact of Social Sciences

“Overall, the results of our survey give reason to be optimistic: the majority of faculty understand that OA is about making research accessible and available. However, they also point to persistent misconceptions about OA, like necessarily high costs and low quality. This raises questions: How might these misconceptions be affecting RPT [review, promotion, and tenure] evaluations? How should researchers who want to prioritise the public availability of their work guard against the potential that their peers hold one of these negative associations? And, as a community, how can we better communicate the complexities of OA without further diluting the central message of open access? Perhaps we can begin by adequately representing and incentivising the basic principles of openness in our RPT documents.”

So, are early career researchers the harbingers of change? – Nicholas – 2019 – Learned Publishing – Wiley Online Library

“Interestingly, open science, which is something that many ECRs are still only waking up to as a concept, is the next most unchanging aspect. The large gap between positive attitudes (30%) and actual practice (14%) is partly explained by the fact that it is only just obtaining traction and partly by fears over tenure and reputation. Take Spanish ECRs, for instance, where assessment policies and reputational concerns – absolutely critical, of course, to ECRs in obtaining secure employment – conspire to prevent the ready adoption of open science in practice. That is not to say that all ECRs are completely happy with all the component parts of open science. Thus, they tend not to welcome the visibility open peer review brings with it, as it could have reputational consequences; as one French ECR said: ‘Open Peer Review is tricky because you engage your own reputation as a reviewer’. Open data can be a poisoned chalice as well, because ECRs do not want to give away their data until they have fully exploited it, as one Spanish ECR told us: ‘Sharing data is good for verification and reproducibility, but we should wait before we do this until they have been completely exploited to avoid losing our competitive edge’. Nevertheless, a number of countries (e.g. France and Poland) are rolling out open science national plans, and funders will expect compliance down the line….

Returning to the question posed at the very beginning of the study, whether ECRs are the harbingers of change, weighing up all the evidence, the answer has to be yes, albeit a slightly qualified yes. The drivers of change are social media, open science, and collaboration, propelled by ECRs’ Millennial-generation beliefs. …

Indeed, there may be plenty of papers exhorting ECRs to embrace open practices (Eschert, 2015; Gould, 2015; McKiernan et al., 2016), but no research robustly showing that ECRs are in fact rushing to do this. Of course, most of these studies predate the Harbingers study, so, maybe, things have changed in the interim, which explains why the results of this study, indicating that the scholarly walls have been breached in places and that ECRs have planted one foot in the future, are at odds with the research of many of our peers. …”