Perceptions and beliefs of academic librarians in Germany and the USA: a comparative study

“The German respondents felt it more likely (by ½ point) than those in the USA that attracting a new generation to the profession will become more difficult, and that for scholarly articles, Open Access will emerge as the predominant publishing model. The American respondents felt it more likely (by ? of a point) than those in Germany that one or a few commercial entities will dominate the global scientific information infrastructure, and that libraries will be forced to reduce purchasing or subscribing to publisher-controlled information sources due to their rising costs….

The German respondents think it more likely than their American colleagues that it will become more difficult to attract young people to the profession, and that Open Access will become the predominant publishing model for scholarly articles. The respondents in the USA, on the other hand, see a greater likelihood than those in Germany that large parts of the global scientific information infrastructure will become dominated by one or a few commercial entities, and that rising costs of publisher-controlled information sources will force libraries to reduce their purchases and subscriptions….”

The rise of the “open” discovery indexes? Lens.org, Semantic Scholar and Scinapse | Musings about librarianship

“In this blog post, I will talk specifically about a very important source of data used by academic search engines – Microsoft Academic Graph (MAG) – and do a brief review of four academic search engines – Microsoft Academic, Lens.org, Semantic Scholar and Scinapse – which use MAG among other sources….

We live in a time where large (>50 million record) scholarly discovery indexes are no longer as hard to create as in the past, thanks to freely available scholarly article index data like Crossref and MAG.”
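The freely available index data the post mentions is easy to try first-hand: Crossref exposes its metadata through a public REST API (`api.crossref.org/works`) that returns article records as JSON. A minimal sketch of parsing such a response — the record below is a trimmed, hypothetical example in the shape Crossref returns, not live output:

```python
import json

# Trimmed, hypothetical example of the JSON shape returned by
# GET https://api.crossref.org/works?query=...
# Real responses carry many more fields per item.
response_text = """
{
  "status": "ok",
  "message": {
    "items": [
      {"DOI": "10.1000/example.1", "title": ["An Example Article"],
       "is-referenced-by-count": 12},
      {"DOI": "10.1000/example.2", "title": ["Another Example"],
       "is-referenced-by-count": 3}
    ]
  }
}
"""

records = json.loads(response_text)["message"]["items"]
for rec in records:
    # Crossref stores titles as a list; take the first entry if present.
    title = rec["title"][0] if rec.get("title") else "(untitled)"
    print(rec["DOI"], "-", title, "-", rec.get("is-referenced-by-count", 0), "citations")
```

For live queries, Crossref's documentation asks clients to include a `mailto` parameter to use the "polite" pool; the DOIs and counts above are invented for illustration.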

Completeness of reporting in abstracts of randomized controlled trials in subscription and open access journals: cross-sectional study | Trials | Full Text

Abstract

Background

Open access (OA) journals are becoming a publication standard for health research, but it is not clear how they differ from traditional subscription journals in the quality of research reporting. We assessed the completeness of results reporting in abstracts of randomized controlled trials (RCTs) published in these journals.

Methods

We used the Consolidated Standards of Reporting Trials Checklist for Abstracts (CONSORT-A) to assess the completeness of reporting in abstracts of parallel-design RCTs published in subscription journals (n = 149; New England Journal of Medicine, Journal of the American Medical Association, Annals of Internal Medicine, and Lancet) and OA journals (n = 119; BioMedCentral series, PLoS journals) in 2016 and 2017.

Results

Abstracts in subscription journals completely reported 79% (95% confidence interval [CI], 77–81%) of 16 CONSORT-A items, compared with 65% (95% CI, 63–67%) of these items in abstracts from OA journals (P < 0.001, chi-square test). The median number of completely reported CONSORT-A items was 13 (95% CI, 12–13) in subscription journal articles and 11 (95% CI, 10–11) in OA journal articles. Subscription journal articles had significantly more complete reporting than OA journal articles for nine CONSORT-A items and did not differ in the reporting of trial design, outcome, randomization, blinding (masking), recruitment, and conclusions. OA journals were better than subscription journals at reporting the randomized study design in the title.

Conclusion

Abstracts of randomized controlled trials published in subscription medical journals have greater completeness of reporting than abstracts published in OA journals. OA journals should take appropriate measures to ensure that published articles contain adequate detail to facilitate understanding and quality appraisal of research reports about RCTs.
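The headline comparison above (79% vs. 65% complete reporting, chi-square test) can be sketched numerically. This is an illustration only: the counts below are back-calculated from the reported percentages, not the study's raw data, and treating checklist items as independent observations is a simplification of the authors' actual analysis.

```python
# Illustrative Pearson chi-square test of independence on a 2x2 table.
# Counts are approximated from the abstract: 79% of (149 abstracts x 16 items)
# for subscription journals, 65% of (119 x 16) for OA journals.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (obs - expected) ** 2 / expected
    return chi2

subscription = (round(0.79 * 149 * 16), round(0.21 * 149 * 16))  # (reported, not reported)
open_access = (round(0.65 * 119 * 16), round(0.35 * 119 * 16))
chi2 = chi_square_2x2((subscription, open_access))
print(f"chi-square = {chi2:.1f}")
```

A statistic far above 10.83 (the 1-df critical value for P = 0.001) is consistent with the abstract's P < 0.001, though the study's exact procedure may differ.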

CURRENT STATUS OF THE INSTITUTIONAL REPOSITORY AT THE SOUTH EASTERN UNIVERSITY OF SRI LANKA

Abstract: DSpace is open-source software and the most popular, cost-effective tool for building digital repositories. There are 15 Sri Lankan institutional repositories listed in the Directory of Open Access Repositories (OpenDOAR), the global directory of academic open access repositories. The present study focuses on the current status of the Institutional Repository at the South Eastern University of Sri Lanka (SEUIR) and compares SEUIR with the other Sri Lankan institutional repositories listed in OpenDOAR. The data were extracted from statistics generated by the DSpace software and analysed for the necessary information. The study highlights the current status of SEUIR and further developments to improve the accessibility of its content to users.
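Repository comparisons like this typically rely on machine-readable metadata: DSpace exposes its records over OAI-PMH (usually at `/oai/request?verb=ListRecords&metadataPrefix=oai_dc`). A minimal sketch of parsing one Dublin Core record — the XML below is a trimmed, hypothetical example of the shape DSpace emits, not output from SEUIR:

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical oai_dc record in the shape a DSpace OAI-PMH
# endpoint returns; real records carry more <dc:*> elements.
record_xml = """
<record xmlns="http://www.openarchives.org/OAI/2.0/">
  <metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>An Example Thesis</dc:title>
      <dc:creator>Doe, Jane</dc:creator>
      <dc:date>2019-01-15</dc:date>
    </oai_dc:dc>
  </metadata>
</record>
"""

# Map namespace prefixes so find() can address the Dublin Core elements.
NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}
root = ET.fromstring(record_xml)
title = root.find(".//dc:title", NS).text
creator = root.find(".//dc:creator", NS).text
print(title, "by", creator)
```

Harvesting all records this way (and following OAI-PMH resumption tokens) is one common route to the kind of per-repository content statistics the study reports.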

Comparing published scientific journal articles to their pre-print versions | SpringerLink

Abstract: Academic publishers claim that they add value to scholarly communications by coordinating reviews and contributing and enhancing text during publication. These contributions come at a considerable cost: US academic libraries paid $1.7 billion for serial subscriptions in 2008 alone. Library budgets, in contrast, are flat and unable to keep pace with serial price inflation. We have investigated the publishers’ value proposition by conducting a comparative study of pre-print papers from two distinct science, technology, and medicine corpora and their final published counterparts. This comparison had two working assumptions: (1) if the publishers’ argument is valid, the text of a pre-print paper should vary measurably from its corresponding final published version, and (2) by applying standard similarity measures, we should be able to detect and quantify such differences. Our analysis revealed that the text contents of the scientific papers generally changed very little from their pre-print to final published versions. These findings contribute empirical indicators to discussions of the added value of commercial publishers and therefore should influence libraries’ economic decisions regarding access to scholarly publications.
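The abstract does not name its "standard similarity measures" here; as one hypothetical stand-in for the general approach, Python's standard-library difflib computes a simple similarity ratio between two versions of a text:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1]; 1.0 means the texts are identical."""
    return difflib.SequenceMatcher(None, a, b).ratio()

# Invented example sentences standing in for a preprint passage and its
# lightly copy-edited published counterpart.
preprint = ("We investigate the value added by publishers by comparing "
            "preprint manuscripts with their final published versions.")
published = ("We investigate the value added by publishers by comparing "
             "pre-print manuscripts with their final, published versions.")

print(f"similarity: {similarity(preprint, published):.3f}")
```

A ratio near 1.0, as here, indicates only minor copy-editing between versions — the pattern the study reports at corpus scale, though the authors' actual measures may differ from this one.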