scite: Making Science More Reliable – scite – Medium

“Today, we are introducing scite to make it easier to tell what is fact and what is not.

scite is a platform that allows anyone to see whether a scientific report has been supported or contradicted by subsequent work. We do this by using deep learning and a network of experts to analyze hundreds of millions of citation statements, classifying them as supporting, contradicting, or just mentioning, and presenting the results in an easy-to-understand interface. Thus, anyone can check whether a scientific paper has been supported or contradicted with just a few clicks….”
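
The announcement doesn't describe scite's model beyond "deep learning and a network of experts," but the core task, sorting each citation statement into supporting, contradicting, or mentioning, is a standard three-class text classification problem. A minimal sketch in Python, using a bag-of-words baseline and invented example statements rather than scite's actual pipeline or data:

    # Illustrative three-class citation-statement classifier.
    # This is a toy baseline, not scite's deep-learning system.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical labeled citation statements (not scite data).
    statements = [
        "Our results confirm the effect reported by Smith et al.",
        "We were unable to reproduce the findings of Smith et al.",
        "Smith et al. previously studied this population.",
    ]
    labels = ["supporting", "contradicting", "mentioning"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(statements, labels)

    # Predict the class of a new citation statement.
    print(clf.predict(["These data contradict the model proposed by Smith et al."]))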

Scientists who share data publicly receive more citations | Science Codex

“A new study finds that papers with data shared in public gene expression archives received increased numbers of citations for at least five years. The large size of the study allowed the researchers to exclude confounding factors that have plagued prior studies of the effect and to spot a trend of increasing dataset reuse over time. The findings will be important in persuading scientists that they can benefit directly from publicly sharing their data.

The study, which adds to growing evidence for an open data citation benefit across different scientific fields, is entitled “Data reuse and the open citation advantage”. It was conducted by Dr. Heather Piwowar of Duke University and Dr. Todd Vision of the University of North Carolina at Chapel Hill, and published today in PeerJ, a peer reviewed open access journal in which all articles are freely available to everyone….”
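
The excerpt's claim that the study could "exclude confounding factors" refers to multivariate analysis: estimating the effect of data sharing on citations while adjusting for other variables. A minimal sketch on simulated data, with hypothetical covariates and effect sizes (not Piwowar and Vision's actual model specification):

    # Confounder-adjusted citation model on simulated data.
    # Covariates and effect sizes here are invented for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "data_shared": rng.integers(0, 2, n),     # 1 = data in a public archive
        "pub_year": rng.integers(2001, 2010, n),  # publication year
        "journal_if": rng.gamma(2.0, 2.0, n),     # journal impact factor
    })
    # Simulate a modest citation boost for papers that share data.
    df["log_citations"] = (0.2 * df["data_shared"]
                           + 0.1 * df["journal_if"]
                           + rng.normal(0.0, 1.0, n))

    # The coefficient on data_shared estimates the citation benefit
    # after adjusting for year and journal impact factor.
    fit = smf.ols("log_citations ~ data_shared + pub_year + journal_if", data=df).fit()
    print(fit.params["data_shared"])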

The Wikipedia Library/1Lib1Ref/Lessons/2019 – Meta

“The January [2019] #1Lib1Ref campaign saw infectious energy among participants. The campaign brought major additions, new entrants, and a new sense of competition between languages and institutions. In this iteration #1Lib1Ref reached record highs and saw extensive participation from emerging communities and languages. For the first time, the French Wikipedia took the lead, accounting for over 33% of the total contributions made during the campaign. Based on these results, we anticipate that #1Lib1Ref has the potential to support outreach in diverse communities….”

Same Question, Different World: Replicating an Open Access Research Impact Study | Arendt | College & Research Libraries

“To examine changes in the open access landscape over time, this study partially replicated Kristin Antelman’s 2004 study of the open access citation advantage. Results indicated that open access articles still have a citation advantage. For three of the four disciplines examined, the most common sites hosting freely available articles were independent sites, such as academic social networks or article-sharing sites. For the same three disciplines, more than 70 percent of the open access copies were publishers’ PDFs. The major difference from Antelman’s study is the increase in the number of freely available articles that appear to be in violation of publisher policies….”

Data citation needed | Scientific Data

“Starting last month, publications at Scientific Data now include data citations in the main reference list, rather than in a separate data citations section. This change will be supported by changes to the underlying structure of our content to promote machine readability and reuse of links between scholarly articles and datasets. This aligns the journal with a roadmap for data citation co-developed by representatives of the academic community and several publishers, which seeks to make data citation a standard part of the scholarly publishing process….”

Impact factors are still widely used in academic evaluations

“Almost half of research-intensive universities consider journal impact factors when deciding whom to promote, a survey of North American institutions has found.

About 40% of institutions with a strong focus on research mention impact factors in documents used in the review, promotion and tenure process, according to the analysis, which examined more than 800 documents across 129 institutions in the United States and Canada.

The data imply that many universities are evaluating the performance of their staff using a metric that has been widely criticized as a crude and misleading proxy for the quality of scientists’ work….

Less than one-quarter of the institutions mentioned impact factor or a closely related term such as “high impact journal” in their documents. But this proportion rose to 40% for the 57 research-intensive universities included in the survey. By contrast, just 18% of universities that focused on master’s degrees mentioned journal impact factors.

In more than 80% of the mentions at research-heavy universities, the language in the documents encouraged the use of the impact factor in academic evaluations. Only 13% of mentions at these institutions came with any cautionary words about the metric. The language also tended to imply that high impact factors were associated with better research: 61% of the mentions portrayed the impact factor as a measure of the quality of research, for example, and 35% stated that it reflected the impact, importance or significance of the work….”

The citation advantage for open access science journals with and without article processing charges – Mohammad Reza Ghane, Mohammad Reza Niazmand, Ameneh Sabet Sarvestani, 2019

Abstract:  In this study of access models, we compared citation performance in journals that do and do not levy article processing charges (APCs) as part of their business model. We used a sample of journals from the Directory of Open Access Journals (DOAJ) science class and its 13 subclasses and recorded four citation metrics: journal impact factor (JIF), H-index, citations per publication (CPP) and quartile rank. We examined 1881 science journals indexed in DOAJ. Thomson Reuters Journal Citation Reports and Web of Science were used to extract JIF, H-index, CPP and quartile category. Overall, the JIF, H-index and CPP indicated that APC and non-APC open access (OA) journals had equal impact. Quartile category ranking indicated a difference in favour of APC journals. In each science subclass, we found significant differences between APC and non-APC journals in all citation metrics except quartile rank. Discipline-related variations were observed in non-APC journals. Differences in the rank positions of scores across groups identified citation advantages for non-APC journals in physiology, zoology, microbiology and geology, followed by botany, astronomy and general biology. Impact ranged from moderate to low in physics, chemistry, human anatomy, mathematics, general science and natural history. The results suggest that authors should consider field- and discipline-related differences in the OA citation advantage, especially when they are considering non-APC OA journals categorised in two or more subjects. This may encourage OA publishing, at least in the science class.
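
The abstract doesn't name the statistical tests behind these comparisons. As a sketch of the general approach (comparing a citation metric between APC and non-APC journal groups), here is a nonparametric two-sample test in Python with hypothetical JIF values; the authors' actual methods may differ:

    # Compare JIF distributions for APC vs non-APC OA journals.
    # The values below are hypothetical, for illustration only.
    from scipy.stats import mannwhitneyu

    jif_apc = [2.1, 1.8, 3.0, 0.9, 2.5, 1.2]      # APC journals (hypothetical)
    jif_non_apc = [1.9, 2.2, 2.8, 1.1, 2.4, 1.5]  # non-APC journals (hypothetical)

    stat, p = mannwhitneyu(jif_apc, jif_non_apc, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p:.3f}")  # a large p suggests no detectable difference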

Changing trends in otorhinolaryngology publishing | ACTA Otorhinolaryngologica Italica

Abstract:  The aim of this study is to compare changes in impact factors and citation numbers of Open Access (OA) vs subscription-based (SB) journals between 1999 and 2016 and to explore changing trends in ORL publishing. All data extracted from the SCImago Journal and Country Rank (SJR) website were used as input for statistical analysis. The chi-square test of independence was applied to determine whether the proportion of OA journals in the ORL category changed significantly between 1999 and 2016. The yearly impact factors of OA and SB journals were also graphed separately, and the changes in annual SJR ranks of the two journal types were compared using a one-way Z-test. The proportions of OA and SB journals were not equal across the years 1999 to 2016, and OA journals showed a significantly greater tendency to increase than SB journals (p < 0.01). Although the overall level of impact factors of SB journals was generally higher, a comparison of two regression models showed that the impact factors of OA journals increased significantly faster (p < 0.01). When choosing where to publish, it is important to consider the journal’s visibility, cost of publication, IF or SJR, and speed of publication, as well as changing trends in medical publishing nourished by the Web of Science.
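
The abstract names its main test: a chi-square test of independence on the counts of OA and SB journals across years. A minimal sketch in Python with hypothetical counts (the paper's actual SJR-derived numbers are not given in the excerpt):

    # Chi-square test of independence: journal type (OA/SB) vs year.
    # The contingency-table counts below are hypothetical.
    from scipy.stats import chi2_contingency

    #           1999  2016
    counts = [[   5,   40],   # OA ORL journals
              [  45,   60]]   # SB ORL journals

    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    # A small p-value indicates the OA share changed between the two years.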