“Being able to find, assess and place new research within a field of knowledge is integral to any research project. For social scientists this process is increasingly likely to take place on Google Scholar, closely followed by traditional scholarly databases. In this post, Alberto Martín-Martín, Enrique Orduna-Malea, Mike Thelwall and Emilio Delgado-López-Cózar analyse the relative coverage of the three main research databases: Google Scholar, Web of Science and Scopus. Finding significant divergences in the social sciences and humanities, they suggest that researchers face a trade-off when using different databases: between more comprehensive but disorderly systems and orderly but limited systems….”
Abstract: Traditionally, Web of Science and Scopus have been the two most widely used databases for bibliometric analyses. However, during the last few years some new scholarly databases, such as Dimensions, have emerged. Several previous studies have compared different databases, either through a direct comparison of article coverage or by comparing the citations across the databases. This article aims to present a comparative analysis of the journal coverage of the three databases (Web of Science, Scopus and Dimensions), with the objective of describing, understanding and visualizing the differences between them. The most recent master journal lists of the three databases are used for analysis. The results indicate that the databases have significantly different journal coverage, with Web of Science being the most selective and Dimensions the most exhaustive. About 99.11% and 96.61% of the journals indexed in Web of Science are also indexed in Scopus and Dimensions, respectively. Scopus has 96.42% of its indexed journals also covered by Dimensions. The Dimensions database has the most exhaustive journal coverage, with 82.22% more journals than Web of Science and 48.17% more journals than Scopus. This article also analysed the research outputs for 20 selected countries for the 2010–2018 period, as indexed in the three databases, and identified database-induced variations in research output volume, rank, global share and subject area composition for different countries. It is found that there are clearly visible variations in the research output from different countries in the three databases, along with differential coverage of different subject areas by the three databases. The analytical study provides an informative and practically useful picture of the journal coverage of the Web of Science, Scopus and Dimensions databases.
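The overlap figures quoted in this abstract are simple set statistics: each database's journal list is a set of identifiers, and the percentages are intersections relative to one set's size. A minimal sketch of the calculation, using invented journal IDs rather than the actual master lists:

```python
# Hypothetical illustration of the coverage-overlap statistics above.
# Each database's journal coverage is modelled as a set of journal IDs
# (e.g. ISSNs); the reported percentages are set intersections expressed
# relative to the first database's size. All data below is invented.

wos = {"0001", "0002", "0003", "0004"}
scopus = {"0001", "0002", "0003", "0004", "0005", "0006"}
dimensions = {"0001", "0002", "0003", "0005", "0006", "0007", "0008"}

def overlap_pct(a, b):
    """Share of journals indexed in `a` that are also indexed in `b` (percent)."""
    return 100 * len(a & b) / len(a)

print(f"WoS journals also in Scopus:        {overlap_pct(wos, scopus):.2f}%")
print(f"WoS journals also in Dimensions:    {overlap_pct(wos, dimensions):.2f}%")
print(f"Scopus journals also in Dimensions: {overlap_pct(scopus, dimensions):.2f}%")
```

Note that the percentages are asymmetric: Scopus can contain nearly all of Web of Science's journals while Web of Science contains only a fraction of Scopus's, which is exactly the selective-versus-exhaustive contrast the abstract describes.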
Maddi, A., Lardreau, E. & Sapinho, D. Open access in Europe: a national and regional comparison. Scientometrics (2021). https://doi.org/10.1007/s11192-021-03887-1
Open access to scientific publications has progressively become a key issue for European policy makers, resulting in concrete measures by the different member countries to promote its development. The aim of this paper is, after providing a quick overview of OA policies in Europe, to carry out a comparative study of OA practices within European countries, using data from the Web of Science (WoS) database. This analysis is based on two indicators: the OA share, which illustrates the evolution over time, and the normalized OA indicator (NOAI), which allows spatial comparisons, taking into account the disciplinary structures of countries. Results show a general trend towards the development of OA over time, as expected, but with large disparities between countries, depending on how early they began taking measures in favor of OA. While it is possible to stress the importance of policy and its influence on open access at the country level, this does not appear to be the case at the regional level. There is not much variability between regions within the same country in terms of open access indicators.
Abstract: This study is one of the first that uses the recently introduced open access (OA) labels in the Web of Science (WoS) metadata to investigate whether OA articles published in Directory of Open Access Journals (DOAJ) listed journals experience a citation advantage in comparison to subscription journal articles, specifically articles for which no self-archived versions are available. Bibliometric data on all articles and reviews indexed in WoS, and published from 2013 to 2015, were analysed. In addition to the normalised citation score (NCS), we used two additional measures of citation advantage: whether an article was cited at all; and whether an article is among the most frequently cited percentile of articles within its respective subject area (pptopX%). For each WoS subject area, the strength of the relationship between access status (whether an article was published in an OA journal) and each of these three measures was calculated. We found that OA journal articles experience a citation advantage in very few subject areas and, in most of these subject areas, the citation advantage was found on only a single measure of citation advantage, namely whether the article was cited at all. Our results lead us to conclude that access status accounts for little of the variability in the number of citations an article accumulates. The methodology and the calculations that were used in this study are described in detail and we believe that the lessons we learnt, and the recommendations we make, will be of much use to future researchers interested in using the WoS OA labels, and to the field of citation advantage in general.
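The abstract does not spell out how the normalised citation score is computed, but in bibliometrics a field-normalised score is conventionally an article's citation count divided by the mean citations of publications in the same subject area and publication year. A minimal sketch under that assumption, with invented records:

```python
# Sketch of a field-normalised citation score (NCS), assuming the
# conventional definition: observed citations divided by the mean
# citation count of all articles in the same (subject area, year)
# stratum. The records below are invented for illustration only.

from collections import defaultdict
from statistics import mean

records = [
    {"id": "a1", "field": "Ecology", "year": 2014, "citations": 30},
    {"id": "a2", "field": "Ecology", "year": 2014, "citations": 10},
    {"id": "a3", "field": "Ecology", "year": 2014, "citations": 5},
    {"id": "a4", "field": "Physics", "year": 2014, "citations": 60},
    {"id": "a5", "field": "Physics", "year": 2014, "citations": 20},
]

# Expected (mean) citations per (field, year) stratum.
strata = defaultdict(list)
for r in records:
    strata[(r["field"], r["year"])].append(r["citations"])
expected = {key: mean(cites) for key, cites in strata.items()}

# NCS = observed citations / expected citations for the article's stratum.
ncs = {r["id"]: r["citations"] / expected[(r["field"], r["year"])]
       for r in records}
```

An NCS of 1.0 means an article is cited exactly as often as the average article in its field and year; normalising this way is what makes comparisons across subject areas meaningful, since raw citation counts differ greatly between fields.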
“Clarivate Plc (NYSE:CCC), a global leader in providing trusted information and insights to accelerate the pace of innovation, is supporting the Open Access Monitor (OA Monitor), Germany with the provision of Web of Science™ publication, grant and funding data to increase the impact of scientific scholarship and to enable more equitable participation in research. Clarivate™ will provide weekly customised data from the Web of Science covering the publication literature for the DACH region (which includes Germany, Switzerland and Austria).
Supported by the German Federal Ministry of Education and Research (BMBF) and managed by Forschungszentrum Jülich, the OA Monitor provides evaluations of both the volume and financing of publications at federal, state and institutional level in the DACH region. The ability to connect the corresponding author data from the Web of Science with the publication fee information sourced by OA Monitor will have particularly broad implications for the German academic library community. The data will also help policy makers gauge the status of the transformation to Open Access (OA). …”
Abstract: New sources of citation data have recently become available, such as Microsoft Academic, Dimensions, and the OpenCitations Index of CrossRef open DOI-to-DOI citations (COCI). Although these have been compared to the Web of Science Core Collection (WoS), Scopus, or Google Scholar, there is no systematic evidence of their differences across subject categories. In response, this paper investigates 3,073,351 citations found by these six data sources to 2,515 English-language highly-cited documents published in 2006 from 252 subject categories, expanding and updating the largest previous study. Google Scholar found 88% of all citations, many of which were not found by the other sources, and nearly all citations found by the remaining sources (89–94%). A similar pattern held within most subject categories. Microsoft Academic is the second largest overall (60% of all citations), including 82% of Scopus citations and 86% of WoS citations. In most categories, Microsoft Academic found more citations than Scopus and WoS (182 and 223 subject categories, respectively), but had coverage gaps in some areas, such as Physics and some Humanities categories. After Scopus, Dimensions is fourth largest (54% of all citations), including 84% of Scopus citations and 88% of WoS citations. It found more citations than Scopus in 36 categories, more than WoS in 185, and displays some coverage gaps, especially in the Humanities. Following WoS, COCI is the smallest, with 28% of all citations. Google Scholar is still the most comprehensive source. In many subject categories Microsoft Academic and Dimensions are good alternatives to Scopus and WoS in terms of coverage.
Abstract: We present a large-scale comparison of five multidisciplinary bibliographic data sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic. The comparison considers all scientific documents from the period 2008–2017 covered by these data sources. Scopus is compared in a pairwise manner with each of the other data sources. We first analyze differences between the data sources in the coverage of documents, focusing for instance on differences over time, differences per document type, and differences per discipline. We then study differences in the completeness and accuracy of citation links. Based on our analysis, we discuss strengths and weaknesses of the different data sources. We emphasize the importance of combining a comprehensive coverage of the scientific literature with a flexible set of filters for making selections of the literature.
Abstract: Rigorous evidence identification is essential for systematic reviews and meta-analyses (evidence syntheses), because the sample selection of relevant studies determines a review’s outcome, validity, and explanatory power. Yet, the search systems allowing access to this evidence provide varying levels of precision, recall, and reproducibility and also demand different levels of effort. To date, it remains unclear which search systems are most appropriate for evidence synthesis and why. Advice on which search engines and bibliographic databases to choose for systematic searches is limited and lacking systematic, empirical performance assessments.
This study investigates and compares the systematic search qualities of 28 widely used academic search systems, including Google Scholar, PubMed and Web of Science. A novel, query-based method tests how well users are able to interact with each system and retrieve records from it. The study is the first to show the extent to which search systems can effectively and efficiently perform (Boolean) searches with regard to precision, recall and reproducibility. We found substantial differences in the performance of search systems, meaning that their usability in systematic searches varies. Indeed, only half of the search systems analysed, and only a few Open Access databases, can be recommended for evidence syntheses without adding substantial caveats. In particular, our findings demonstrate why Google Scholar is inappropriate as a principal search system.
We call for database owners to recognise the requirements of evidence synthesis, and for academic journals to re-assess quality requirements for systematic reviews. Our findings aim to support researchers in conducting better searches for better evidence synthesis.
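The precision and recall measures used above to compare search systems have a simple operational form: for a given query, precision is the share of retrieved records that are relevant, and recall is the share of all known relevant records that the system retrieved. A minimal sketch with invented record sets:

```python
# Illustration of the precision/recall measures used to benchmark
# search systems against a known evidence base. The record sets below
# are invented for demonstration purposes.

retrieved = {"r1", "r2", "r3", "r4", "r5"}  # what the search system returned
relevant = {"r1", "r2", "r6", "r7"}         # the known relevant records

hits = retrieved & relevant

precision = len(hits) / len(retrieved)  # relevant share of what was returned
recall = len(hits) / len(relevant)      # retrieved share of what is relevant
```

For systematic reviews, recall is the critical measure: a system that silently misses relevant studies biases the synthesis no matter how precise its result list looks, which is one reason the study cautions against relying on a single system such as Google Scholar.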
“How much is your institution spending on APC fees?
How does your institution’s Open Access footprint compare to your peers?
In this session, learn how you can use data from the Web of Science to calculate your institution’s spend on Open Access, and to benchmark your institution’s participation in OA publishing against activity at peer institutions.
We’ll also discuss recent market developments, including how Plan S, a multi-national initiative aimed at making an increasing share of research findings available in OA publications, may impact faculty at U.S. institutions….”
“The Web of Science Group (a Clarivate Analytics company) has entered into a new partnership with Emerald Publishing, to pilot the industry’s first cross-publisher, scalable and transparent peer review workflow from Publons and ScholarOne across three of Emerald’s leading journals.
Transparent peer review shows the complete peer review process from initial review to final decision, and has gained popularity with authors, reviewers and editors alike in recent years.
The new transparent peer review service will be rolled out across Online Information Review, Industrial Lubrication and Tribology and International Journal of Social Economics. The workflows ensure that alongside the published article, readers can access a comprehensive peer review history, including reviewer reports, editor decision letters and authors’ responses. Each of these elements is assigned its own digital object identifier (DOI), which helps readers easily reference and cite the peer review content. Transparency can also aid teaching of best practice in peer review. The transparent peer review workflow complies with best-practice data privacy regulation, ensuring the individual preferences of authors, peer reviewers and journals are met….”