Influence of accessibility (open and toll-based) of scholarly publications on retractions | SpringerLink

“We have examined retracted publications in different subject fields and attempted to analyse whether online free accessibility (Open Access) influences retraction by examining the scholarly literature published from 2000 through 2019, covering the most recent 20 years of publications. InCites, a research analytics tool developed by Clarivate Analytics®, was used together with the Web of Science, PubMed Central, and Retraction Watch databases to harvest data for the study. Retracted ‘Article’ and ‘Review’ publications were examined with respect to their online accessibility mode (Toll Access and Open Access), using non-parametric methods such as the Odds Ratio, Wilcoxon Signed Rank Test, Mann–Whitney U Test, and the Mann–Kendall and Sen’s methods. The odds for OA articles to be retracted are about 1.62 times as large (62% higher) as for TA articles (95% CI 1.5, 1.7). 0.028% of OA publications are retracted compared with 0.017% of TA publications. Retractions have occurred in all subject areas. In eight subject areas, the odds of retraction for OA articles are larger than for TA articles. In three subject areas, the odds of retraction for OA articles are smaller than for TA articles. In the remaining 11 subject areas, no significant difference is observed. Post-retraction, although a decline is observed in the citation count of both OA and TA publications (p < .01), the odds for OA articles to be cited after retraction are about 1.21 times as large (21% higher) as for TA articles (95% CI 1.53, 1.72). TA publications are retracted earlier than OA publications (p < .01). We observed an increasing trend of retracted works published in both modes. However, the rate of retraction of OA publications is double the rate of retraction of TA publications.”
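
The headline odds ratio can be checked against the retraction rates quoted above. A minimal sketch, assuming only the reported shares of 0.028% (OA) and 0.017% (TA); the confidence interval cannot be reproduced without the underlying counts, which the abstract does not give:

```python
# Reproduce the approximate odds ratio from the quoted retraction rates.
# Rates are taken from the abstract; raw counts (needed for the CI) are not given.
p_oa = 0.00028  # 0.028% of Open Access publications retracted
p_ta = 0.00017  # 0.017% of Toll Access publications retracted

odds_oa = p_oa / (1 - p_oa)
odds_ta = p_ta / (1 - p_ta)
odds_ratio = odds_oa / odds_ta

print(f"Odds ratio (OA vs TA): {odds_ratio:.2f}")  # ~1.65, in line with the reported ~1.62
```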

Is Sci-Hub Increasing Visibility of Indian Research Papers? An Analytical Evaluation

Abstract:  Sci-Hub, founded by Alexandra Elbakyan in Kazakhstan in 2011, has over the years emerged as a very popular source for researchers to download scientific papers. It is believed that Sci-Hub contains more than 76 million academic articles. However, three foreign academic publishers (Elsevier, Wiley and American Chemical Society) have recently filed a lawsuit against Sci-Hub and LibGen before the Delhi High Court, praying for the complete blocking of these websites in India. It is in this context that this paper attempts to find out how many Indian research papers are available in Sci-Hub and who downloads them. The citation advantage of Indian research papers available on Sci-Hub is analysed, with results confirming that such an advantage does exist.

Data Citation: Let’s Choose Adoption Over Perfection | Zenodo

“In the last decade, attitudes towards open data publishing have continued to shift, including a rising interest in data citation as well as incorporating open data in research assessment (see Parsons et al. for an overview). This growing emphasis on data citation is driving incentives and evaluation systems for researchers publishing their data. While increased efforts and interest in data citation are a move in the right direction for understanding research data impact and assessment, there are clear difficulties and roadblocks in having universal and accessible data citation across all research disciplines. But these roadblocks can be mitigated and do not need to keep us in a consistent limbo. The unique properties of data as a citable object have attracted much-needed attention, although this attention has also created an unhelpful perception that data citation is a challenge and requires uniquely burdensome processes to implement. This perception of difficulty begins with defining a ‘citation’ for data. The reality is that all citations are relationships between scholarly objects. A ‘data citation’ can be as simple as a journal article or other dataset declaring that a dataset was important to the creation of that work. This is not a unique challenge. However, many publishers and funders have elevated the relationship of data that “underlies the research” into a Data Availability Statement (DAS). This has helped address some issues publishers have found with typesetting or production techniques that stripped non-articles from citations. However, because of this segmentation of data from typical citation lists, and the exclusion of data citations in article metadata, many communities have felt they are in a stalemate about how to move forward….”
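
The claim that “all citations are relationships between scholarly objects” can be made concrete with a toy data structure. This is only an illustrative sketch with placeholder DOIs, not any publisher’s or registry’s actual metadata schema:

```python
# A data citation modelled as a typed relationship between two scholarly objects.
# The DOIs below are placeholders, not real identifiers.
data_citation = {
    "citing_object": "10.1234/example-article",  # the article (or dataset) doing the citing
    "relation": "cites",                          # could also be e.g. "is-supplemented-by"
    "cited_object": "10.5678/example-dataset",    # the dataset that underpinned the work
}

# Whether this link lives in the reference list, a Data Availability Statement,
# or the article metadata is a publishing-workflow question, not a conceptual one.
print(f'{data_citation["citing_object"]} {data_citation["relation"]} {data_citation["cited_object"]}')
```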

Google Scholar, Web of Science, and Scopus: Which is best for me? | Impact of Social Sciences

“Being able to find, assess and place new research within a field of knowledge is integral to any research project. For social scientists this process is increasingly likely to take place on Google Scholar, closely followed by traditional scholarly databases. In this post, Alberto Martín-Martín, Enrique Orduna-Malea, Mike Thelwall and Emilio Delgado-López-Cózar analyse the relative coverage of the three main research databases, Google Scholar, Web of Science and Scopus, finding significant divergences in the social sciences and humanities, and suggest that researchers face a trade-off when using different databases: between more comprehensive but disorderly systems and orderly but limited systems….”

Correlation Between Social Media Postings and Academic Citations of Hand Surgery Research Publications: A Pilot Study Using Twitter and Google Scholar – Journal of Hand Surgery

Abstract:  Purpose

The relationship between social media postings and academic citations of hand surgery research publications is not known. The objectives of this study were (1) to quantify adoption of social media for the dissemination of original research publications by 3 hand surgery journals, and (2) to determine the correlation between social media postings and academic citations in recent hand surgery research publications.

Methods

An Internet-based study was performed of all research articles from 3 hand surgery journals published from January 2018 to March 2019. A final sample of 472 original full-length scientific research articles was included. For each article, the total number of social media postings was determined using Twitter, as well as the number of tweets, number of retweets, number of tweets from an official outlet, and number of tweets from an author. The number of academic citations for each article was determined using Google Scholar.

Results

Average number of academic citations per article was 3.9. Average number of social media posts per article was 3.2, which consisted of an average of 1.3 tweets and 1.9 retweets per article. The number of academic citations per article was weakly correlated with the number of social media postings, the number of tweets, and the number of retweets. The numbers of tweets from an official outlet and from an author were also only weakly correlated with academic citations.

Conclusions

In the early adoption of social media for the dissemination of hand surgery research, there is a weak correlation between social media posting of hand surgery research and academic citation.
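
The weak correlations reported above are simple to compute once per-article counts exist. A minimal sketch, using invented counts and a Spearman rank correlation (the abstract does not name the statistic the authors used):

```python
# Correlate per-article social media postings with Google Scholar citations.
# The counts below are invented illustrations, not the study's data.
from scipy.stats import spearmanr

citations    = [3, 0, 7, 2, 12, 5, 1, 4]  # Google Scholar citations per article
social_posts = [2, 1, 6, 0,  9, 3, 2, 1]  # tweets + retweets per article

rho, p_value = spearmanr(citations, social_posts)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```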

Web analytics for open access academic journals: justification, planning and implementation | BiD: textos universitaris de biblioteconomia i documentació

Abstract:  An overview is presented of resources and web analytics strategies useful for capturing usage statistics and assessing audiences for open access academic journals. A set of metrics complementary to citations is proposed to help journal editors and managers provide evidence of the performance of the journal as a whole, and of each article in particular, in the web environment. The measurements and indicators selected seek to generate added value for editorial management in order to ensure its sustainability. The proposal is based on three areas: counts of visits and downloads, optimization of the website along with campaigns to attract visitors, and preparation of a dashboard for strategic evaluation. It is concluded that, by creating web performance measurement plans based on the resources and proposals analysed, journals may be in a better position to plan data-driven web optimization in order to attract authors and readers and to offer the accountability that the actors involved in the editorial process need to assess their open access business model.
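
As a small illustration of the first of the three areas, the sketch below rolls invented per-article visit and download events up into a monthly dashboard table with pandas; it is not the toolset proposed in the article:

```python
# Roll raw per-article usage events up into a monthly dashboard table.
# The event log below is invented; a real journal would pull this from its
# web analytics platform and/or COUNTER-style download reports.
import pandas as pd

events = pd.DataFrame({
    "article_id": ["a1", "a1", "a2", "a2", "a1", "a3"],
    "month":      ["2024-01", "2024-01", "2024-01", "2024-02", "2024-02", "2024-02"],
    "visits":     [120, 80, 45, 60, 150, 30],
    "downloads":  [30, 10, 5, 12, 40, 3],
})

dashboard = events.groupby(["month", "article_id"], as_index=False)[["visits", "downloads"]].sum()
print(dashboard)
```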

The open access advantage for studies of human electrophysiology: Impact on citations and Altmetrics – ScienceDirect

“Highlights

• Barriers to accessing science contribute to knowledge inequalities.

• 35% of articles published in the last 20 years in electrophysiology are open access.

• Open access articles received 9–21% more citations and 39% more Altmetric mentions.

• Green open access (author archived) enjoyed a similar benefit to Gold open access.

• Studies of human electrophysiology enjoy the “open access advantage” in citations….”

What happens when a journal converts to Open Access? A bibliometric analysis

Abstract:  In recent years, increased stakeholder pressure to transition research to Open Access has led to many journals converting, or ‘flipping’, from a closed access (CA) to an open access (OA) publishing model. Changing the publishing model can influence the decision of authors to submit their papers to a journal, and increased article accessibility may influence citation behaviour. In this paper we aimed to understand how flipping a journal to an OA model influences the journal’s future publication volumes and citation impact. We analysed two independent sets of journals that had flipped to an OA model, one from the Directory of Open Access Journals (DOAJ) and one from the Open Access Directory (OAD), and compared their development with two respective control groups of similar journals. For bibliometric analyses, journals were matched to the Scopus database. We assessed changes in the number of articles published over time, as well as two citation metrics at the journal and article level: the normalised impact factor (IF) and the average relative citations (ARC), respectively. Our results show that overall, journals that flipped to an OA model increased their publication output compared to journals that remained closed. Mean normalised IF and ARC also generally increased following the flip to an OA model, at a greater rate than was observed in the control groups. However, the changes appear to vary considerably by scientific discipline. Overall, these results indicate that flipping to an OA publishing model can bring positive changes to a journal.
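
The article-level metric used here, average relative citations (ARC), is usually read as each article’s citation count normalised by the average for comparable articles (same field and publication year), then averaged over the journal. A minimal sketch under that reading, with invented baselines:

```python
# Average relative citations (ARC): each article's citations divided by the
# average citation count of comparable articles (same field, same year),
# then averaged over the journal. Baselines below are invented for illustration.
articles = [
    {"citations": 10, "field_year_average": 8.0},
    {"citations": 2,  "field_year_average": 4.0},
    {"citations": 6,  "field_year_average": 6.0},
]

relative_citations = [a["citations"] / a["field_year_average"] for a in articles]
arc = sum(relative_citations) / len(relative_citations)
print(f"ARC = {arc:.2f}")  # > 1 means the journal is cited above its field/year baseline
```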

scite: a smart citation index that displays the context of citations and classifies their intent using deep learning | bioRxiv

Abstract:  Citation indices are tools used by the academic community for research and research evaluation; they aggregate scientific literature output and measure scientific impact by collating citation counts. Citation indices help measure the interconnections between scientific papers but fall short because they only display paper titles, authors, and dates of publication, and fail to communicate contextual information about why a citation was made. The usage of citations in research evaluation without due consideration to context can be problematic, if only because a citation that disputes a paper is treated the same as a citation that supports it. To solve this problem, we have used machine learning and other techniques to develop a “smart citation index” called scite, which categorizes citations based on context. Scite shows how a citation was used by displaying the surrounding textual context from the citing paper, and a classification from our deep learning model that indicates whether the statement provides supporting or disputing evidence for a referenced work, or simply mentions it. Scite has been developed by analyzing over 23 million full-text scientific articles and currently has a database of more than 800 million classified citation statements. Here we describe how scite works and how it can be used to further research and research evaluation.
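
The classification task scite performs (supporting, disputing, or mentioning) can be illustrated with a deliberately simple stand-in. The sketch below is not scite’s deep learning model; it is a TF-IDF plus logistic regression toy on invented citation statements, just to show the shape of the problem:

```python
# Three-class citation-intent classification on toy data.
# A TF-IDF + logistic regression stand-in, not scite's actual model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

statements = [
    "Our results replicate the effect reported by Smith et al.",
    "We confirm the association described in the earlier trial.",
    "These findings contradict the mechanism proposed previously.",
    "We were unable to reproduce the reported improvement.",
    "Prior work has examined this question in mouse models.",
    "Several studies have used this dataset before.",
]
labels = ["supporting", "supporting", "disputing", "disputing", "mentioning", "mentioning"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(statements, labels)
print(model.predict(["Our experiment failed to replicate their main finding."]))
```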

Article-Level Metrics

Abstract:  In the era of digitization and Open Access, article-level metrics are increasingly employed to distinguish influential research works and adjust research management strategies. Tagging individual articles with digital object identifiers exposes them to numerous channels of scholarly communication and makes it possible to quantify the related activities. The aim of this article was to overview currently available article-level metrics and highlight their advantages and limitations. Article views and downloads, citations, and social media metrics are increasingly employed by publishers to move away from the dominance and inappropriate use of journal metrics. Quantitative article metrics are complementary to one another and often require qualitative expert evaluations. Expert evaluations may help to avoid manipulation through indiscriminate social media activity that artificially boosts altmetrics. Values of article metrics should be interpreted in view of confounders such as patterns of citation and social media activity across countries and academic disciplines.
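
Because article-level metrics hang off the DOI, they are straightforward to collect side by side. The sketch below pulls one such metric (the Crossref citation count, via Crossref’s public REST API) and leaves the others named above as placeholders to be filled from publisher logs and an altmetrics provider; the DOI shown is a placeholder, not a real identifier:

```python
# Collect article-level metrics keyed by DOI. The citation count comes from the
# public Crossref REST API; views/downloads and social media mentions would come
# from publisher usage logs and an altmetrics provider, so they are left unset here.
import requests

def crossref_citation_count(doi: str) -> int:
    """Return Crossref's 'is-referenced-by-count' for a DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]["is-referenced-by-count"]

doi = "10.1234/placeholder"  # substitute a real DOI before calling the API
metrics = {
    "doi": doi,
    "crossref_citations": None,    # crossref_citation_count(doi) once a real DOI is supplied
    "views": None,                 # from the publisher's platform
    "downloads": None,             # from COUNTER-style usage reports
    "social_media_mentions": None, # from an altmetrics provider
}
print(metrics)
```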