Bibliometrics: The Leiden Manifesto for research metrics : Nature News & Comment

“…Yet the abuse of research metrics has become too widespread to ignore. We therefore present the Leiden Manifesto, named after the conference at which it crystallized (see http://sti2014.cwts.nl). Its ten principles are not news to scientometricians, although none of us would be able to recite them in their entirety because codification has been lacking until now. Luminaries in the field, such as Eugene Garfield (founder of the ISI), are on record stating some of these principles [3, 4]. But they are not in the room when evaluators report back to university administrators who are not expert in the relevant methodology. Scientists searching for literature with which to contest an evaluation find the material scattered in what are, to them, obscure journals to which they lack access.

We offer this distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account, and evaluators can hold their indicators to account….”

Few UK universities have adopted rules against impact-factor abuse

“A survey of British institutions reveals that few have taken concrete steps to stop the much-criticized misuse of research metrics in the evaluation of academics’ work. The results offer an early insight into global efforts to clamp down on such practices.

More than three-quarters of the 96 research organizations that responded to the survey said they did not have a research-metrics policy, according to data presented at a London meeting on metrics on 8 February. The same number, 75, had not signed up to the Declaration on Research Assessment (DORA), an international concord, developed in San Francisco in December 2012, that aims to eliminate the misuse of research metrics….

The survey found that 52 institutions had implemented some measures to promote responsible-metrics principles, but only four had taken what the forum considers to be comprehensive action….”

RCUK statement on the responsible use of metrics in research assessment

[Undated but released c. February 8, 2018.]

“Research councils consider that the journal impact factor and metrics such as the h-index are not appropriate measures for assessing the quality of publications or the contribution of individual researchers, and so will not use these measures in our peer review processes. …The research councils will highlight to reviewers, panel members, recruitment and promotion panels that they should not place undue emphasis on the journal in which papers are published, but assess the content of specific papers, when considering the impact of an individual researcher’s contribution….The research councils will sign DORA as a public indication of their support for these principles….”
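[For context on the metrics being set aside: the h-index is the largest number h such that a researcher has h papers cited at least h times each. A minimal sketch, using hypothetical citation counts:]

```python
# Minimal sketch of the h-index: the largest h such that a researcher
# has h papers each cited at least h times. Citation counts are hypothetical.
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)  # most-cited first
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations each
```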

Prof Randy Schekman: Giving Science To The People

“It is a peculiar situation when commercial science journals can not only ask investigators to pay for the privilege of sending in their work but also charge universities and others for the privilege of accessing work that was publicly funded.”

Recommendation on the evaluation of individual researchers in the mathematical sciences

“Nothing (and in particular no semi-automatized pseudo-scientific evaluation that involves numbers or data) can replace evaluation by an individual who actually understands what he/she is evaluating. Furthermore, tools such as impact factors are clearly not helpful or relevant in the context of mathematical research….”

Open access research | Revista Pesquisa Fapesp

“Brazil stands out on the international landscape when it comes to open access, a movement launched in the early 2000s with the aim of making scientific output freely available online. According to data compiled by Spanish research group SCImago, 33.5% of the Brazilian articles indexed in the Scopus database in 2016 were published in journals whose content is free to read online as soon as it is published, under a model known as the “golden road.” This is the largest proportion among the 15 nations with the highest volume of scientific output recorded on Scopus. Brazil is also top of the list of nations with the highest number of open access scientific journals (see charts).”

Article visibility: journal impact factor and availability of full text in PubMed Central and open access

Abstract: Both the impact factor of the journal and immediate full-text availability in PubMed Central (PMC) have featured in editorials before [1–3]. In 2004, the editor of the Cardiovascular Journal of Africa (CVJA) lamented, like so many others, the injustice of not having an impact factor, the questionable validity of the impact factor as a tool for measuring science output, and the negative effect of a low perceived impact in drawing attention away from publications from developing countries [1, 4].

Since then, after a selection process, we have been indexed in Thomson Reuters’ (Philadelphia, PA, USA) Web of Science® (WoS) and have seen a growing impact factor. In the case of PMC, our acceptance into this database was announced in 2012 [2], and now we are proud that it is active and that full-text articles are available dating back to 2009. The journal opted for immediate full open access (OA), which means that full-text articles are available on the publication date to anybody with internet access.
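[For context: the two-year journal impact factor discussed here is a simple ratio, citations received in year Y to items published in years Y-1 and Y-2, divided by the citable items published in those two years. A minimal sketch with hypothetical counts:]

```python
# Minimal sketch of the two-year journal impact factor. All counts are
# hypothetical; only the standard formula is taken as given.
citations_2017_to_2015_16_items = 180  # citations in 2017 to 2015-16 items (hypothetical)
citable_items_2015 = 60                # hypothetical
citable_items_2016 = 65                # hypothetical

jif_2017 = citations_2017_to_2015_16_items / (citable_items_2015 + citable_items_2016)
print(f"JIF(2017) = {jif_2017:.2f}")  # 1.44
```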

Assessing Current Practices in Review, Tenure, and Promotion – #ScholCommLab

“One of the key components of workplace advancement at the university level is the review, promotion and tenure (RPT) packet that is typically submitted every other year by early career faculty. These guidelines and forms are considered to be of highest importance for all faculty, especially for early career faculty who need to demonstrate the value and impact of their work to the university and the broader scientific community. Quite often impact is equated with “impact factor,” leading many researchers to target a narrow range of journals at the expense of broader societal considerations (such as the public’s right to access). The importance of RPT guidelines and forms makes them a natural place to effect change towards an opening of access to research (something both Canada and the US have been pushing for through federal policies and laws).

While we believe changes in RPT guidelines and forms may provide the impetus for behavioural change, leading to broader interest in and adoption of open access principles, the reality is that very little is known about current RPT practices as they relate to questions of openness. This project seeks to examine the RPT process in the US and Canada in ways that can directly inform actions likely to translate into behavioural change and a greater opening of research….”

Amplifying the impact of open access: Wikipedia and the diffusion of science – Teplitskiy – 2016 – Journal of the Association for Information Science and Technology – Wiley Online Library

Abstract: With the rise of Wikipedia as a first-stop source for scientific information, it is important to understand whether Wikipedia draws upon the research that scientists value most. Here we identify the 250 most heavily used journals in each of 26 research fields (4,721 journals, 19.4M articles) indexed by the Scopus database, and test whether topic, academic status, and accessibility make articles from these journals more or less likely to be referenced on Wikipedia. We find that a journal’s academic status (impact factor) and accessibility (open access policy) both strongly increase the probability of it being referenced on Wikipedia. Controlling for field and impact factor, the odds that an open access journal is referenced on the English Wikipedia are 47% higher compared to paywall journals. These findings provide evidence that a major consequence of open access policies is to significantly amplify the diffusion of science, through an intermediary like Wikipedia, to a broad audience.
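[The headline figure is an odds ratio: “47% higher odds” corresponds to a ratio of 1.47, the exponentiated open-access coefficient of a logistic regression. The simulation below is a minimal sketch on entirely hypothetical data; only the 1.47 odds ratio and the 4,721-journal count echo the abstract, and the model specification is illustrative rather than the authors’ actual one:]

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 4721                                             # journal count from the abstract
impact = rng.lognormal(mean=0.5, sigma=0.8, size=n)  # hypothetical impact factors
open_access = rng.integers(0, 2, size=n)             # 1 = open access (hypothetical)

# Simulate "referenced on Wikipedia" with a true open-access odds ratio of 1.47,
# plus an impact-factor effect, so the fitted model should recover roughly 1.47.
log_odds = -1.0 + np.log(1.47) * open_access + 0.3 * np.log(impact)
referenced = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

X = sm.add_constant(np.column_stack([open_access, np.log(impact)]))
fit = sm.Logit(referenced, X).fit(disp=0)
print(f"fitted odds ratio for open access: {np.exp(fit.params[1]):.2f}")  # ~1.47
```

Exponentiating the fitted open-access coefficient turns a log-odds estimate into the odds ratio the abstract reports.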