Abstract: Citation indices are tools used by the academic community for research and research evaluation that aggregate scientific literature output and measure scientific impact by collating citation counts. Citation indices help measure the interconnections between scientific papers, but they fall short because they display only paper titles, authors, and dates of publication, and fail to communicate contextual information about why a citation was made. The use of citations in research evaluation without due consideration of context can be problematic, if only because a citation that disputes a paper is treated the same as a citation that supports it. To address this problem, we have used machine learning and other techniques to develop a “smart citation index” called scite, which categorizes citations based on context. Scite shows how a citation was used by displaying the surrounding textual context from the citing paper, along with a classification from our deep learning model indicating whether the statement provides supporting or disputing evidence for a referenced work, or simply mentions it. Scite was developed by analyzing over 23 million full-text scientific articles and currently has a database of more than 800 million classified citation statements. Here we describe how scite works and how it can be used to further research and research evaluation.
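The classification step described in the abstract can be illustrated with a minimal sketch. To be clear, scite's actual system is a deep learning model trained on labeled citation statements; the cue-phrase baseline below is purely a hypothetical illustration of the task (mapping a citation statement to supporting, disputing, or mentioning), and the phrase lists and function name are assumptions, not scite's implementation.

```python
# Illustrative baseline only: scite uses a trained deep learning model,
# not hand-picked cue phrases like these.
SUPPORTING_CUES = ["consistent with", "confirm", "in agreement with", "support the findings"]
DISPUTING_CUES = ["contradict", "dispute", "inconsistent with", "in contrast to"]


def classify_citation_statement(text: str) -> str:
    """Label a citation statement as 'supporting', 'disputing', or 'mentioning'."""
    lowered = text.lower()
    # Check disputing cues first so "inconsistent with" is not
    # swallowed by the "consistent with" supporting cue.
    if any(cue in lowered for cue in DISPUTING_CUES):
        return "disputing"
    if any(cue in lowered for cue in SUPPORTING_CUES):
        return "supporting"
    # Default: the statement merely mentions the cited work.
    return "mentioning"


print(classify_citation_statement("Our results are consistent with Smith et al. (2019)."))
# prints: supporting
```

A real classifier must also handle negation, hedging, and rhetorical context, which is why a learned model rather than a keyword list is needed in practice.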
“Aries Systems Corporation, a leading technology workflow solutions provider to the scholarly publishing community, and scite, a platform for discovering and evaluating scientific articles via Smart Citations, are pleased to announce their partnership to facilitate the veracity of scientific references….”
“scite, an award-winning platform for discovering and evaluating scientific articles, and Cambridge University Press (CUP), a leading academic publisher and the world’s oldest university press, have partnered to index CUP articles on scite.
The indexing partnership gives scite access to the full-text of all articles published by CUP, which it will use to create Smart Citations. Smart Citations show how a scientific paper has been cited by providing the context of the citation and a classification describing whether it provides supporting or disputing evidence for the cited claim….”
“Scientists, institutions and journals have been increasingly evaluated statistically, by metrics that focus on the number of published reports rather than on their content, raising a concern that this approach interferes with the progress of biomedical research. To offset this effect, we propose to use the R-factor, a metric that indicates whether a report or its conclusions have been verified.”
“scite, an award-winning platform for discovering and evaluating scientific articles, and Rockefeller University Press (RUP), a leading publisher in the life sciences, have partnered to improve how research articles are discovered and evaluated.
With this partnership, all relevant articles published in the Journal of Cell Biology (JCB), the Journal of Experimental Medicine (JEM), and the Journal of General Physiology (JGP) will display Smart Citations from scite. Smart Citations allow researchers to see how a scientific paper has been cited by providing the context of the citation and indicating whether it provides supporting or disputing evidence for the cited claim….”
Abstract: This study investigates whether negative citations in articles and comments posted on post-publication peer review platforms contribute equally to the correction of science. These two types of written evidence of disputes are compared by analyzing their occurrence in relation to articles that have already been retracted or corrected. We identified retracted or corrected articles in a corpus of 72,069 articles coming from the Engineering field, from 3 journals (Science, Tumor Biology, Cancer Research), and from 3 authors with many retractions to their credit (Sarkar, Schön, Voinnet). We used scite to retrieve contradicting citations and PubPeer to retrieve the number of comments for each article, and then we considered them as traces left by scientists to contest published results. Our study shows that contradicting citations are very uncommon and that retracted or corrected articles are not contradicted more often in scholarly articles than articles that are neither retracted nor corrected, but they do generate more comments on PubPeer, presumably because contributors can remain anonymous. Moreover, post-publication peer review platforms, although external to the scientific publication process, contribute more to the correction of science than negative citations do. Consequently, post-publication peer review venues, and more specifically the comments found on them, although not contributing to the scientific literature, are a mechanism for correcting science. Lastly, we introduced the idea of strengthening the role of contradicting citations to rehabilitate the clear expression of judgment in scientific papers.
“scite, a platform for the discovery and evaluation of scientific articles, today announced it has been awarded $1.5 million from Phase II of its Fast-Track Small Business Innovation Research (SBIR) grant by the National Institute on Drug Abuse (NIDA) of the National Institutes of Health (NIH)….”
“COVID-19 has caused worldwide anxiety and thousands of deaths. Each day, new data and research are published. With the high level of activity and collaboration worldwide, what gets published today may be outdated the week after. How do researchers, medical professionals, policymakers, and the public keep up to date and informed?
At scite, we have created an easy way for anyone to see how a scientific article has been cited and, specifically, if it has been supported or contradicted by subsequent research. We do this by analyzing millions of full-text publications, extracting the citation statements from these publications, and then classifying these as supporting or contradicting evidence.
Recently, to help the world make more sense of COVID-19 research, we turned our attention and novel functionality to COVID-19 papers and preprints….”
“Indeed, when we as researchers first look at a paper, we look at where it was published, who the authors are and where they are from, and some metrics like downloads, reads, and of course, citations. Most of this information is superficial, contributing no real useful information to understanding the research. Even citations, as used today, are mostly treated as a number for making a quick assessment of the work, where the higher the number of citations an article has, the better.
However, citations represent a wealth of information. Behind each of the 41 articles that cite my work are years of directly related research and many thousands, if not millions, of dollars of research funding. But if I want to learn what these articles say about my work, I would need to read each of them. This is so impractical that it is effectively never fully done.
We’re changing that at scite, a new platform that uses deep learning to show how an article has been cited and, specifically, if it has been supported or contradicted, where the citations appear in the citing paper, and if it is a self-cite or a citation from a review or article. In short, we want to make citations smart: citations that not only tell how many times an article is cited, but also provide the context for each citation and the citation meaning, such as whether it provides supporting or contradicting evidence for the cited claim. …”
“scite, an award-winning citation analysis platform, and Europe PMC, an open science discovery tool that provides access to a worldwide collection of life science publications, have partnered to display what scite calls smart citations on the Europe PMC platform.
Smart citations advance regular citations by providing more contextual information beyond the fact that one study references another. Specifically, smart citations provide the excerpt of text surrounding the citation, the section of the article in which the reference appears, and an indication of whether the citing study provides supporting or contradicting evidence. As a result, one can evaluate a study of interest much faster….”
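The components enumerated in the excerpt above (the citation context, the section of the citing article, and the classification) can be modeled as a simple record. The sketch below is a minimal illustration of that shape; the field names, class name, and example DOIs are assumptions made for this example, not scite's actual schema or data.

```python
from dataclasses import dataclass


@dataclass
class SmartCitation:
    """Illustrative record of one Smart Citation.

    Field names are assumptions for this sketch, not scite's schema.
    """
    citing_doi: str      # the article making the citation
    cited_doi: str       # the article being cited
    context: str         # excerpt of text surrounding the citation
    section: str         # e.g. "Introduction", "Methods", "Discussion"
    classification: str  # "supporting", "disputing", or "mentioning"


# Hypothetical example record (DOIs are placeholders).
example = SmartCitation(
    citing_doi="10.1000/xyz123",
    cited_doi="10.1000/abc456",
    context="These results support the findings of Doe et al. [12].",
    section="Discussion",
    classification="supporting",
)
print(example.classification)
```

Carrying the context and section alongside the classification is what lets a reader judge a citation without opening the citing paper, which is the speed-up the excerpt describes.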