Abstract: Preprints in the life sciences are gaining popularity, but release of a preprint still precedes only a fraction of peer-reviewed publications. Quantitative evidence on the relationship between preprints and article-level metrics of peer-reviewed research remains limited. We examined whether having a preprint on bioRxiv.org was associated with the Altmetric Attention Score and number of citations of the corresponding peer-reviewed article. We integrated data from PubMed, CrossRef, Altmetric, and Rxivist (a collection of bioRxiv metadata). For each of 26 journals (comprising a total of 46,451 articles and 3,817 preprints), we used log-linear regression, adjusted for publication date and scientific subfield, to estimate fold-changes of Attention Score and citations between articles with and without a preprint. We also performed meta-regression of the fold-changes on journal-level characteristics. By random effects meta-analysis across journals, releasing a preprint was associated with a 1.53 times higher Attention Score + 1 (95% CI 1.42 to 1.65) and 1.31 times more citations + 1 (95% CI 1.24 to 1.38) of the peer-reviewed article. Journals with larger fold-changes of Attention Score tended to have lower impact factors and lower percentages of articles released as preprints. In contrast, a journal’s fold-change of citations was not associated with impact factor, percentage of articles released as preprints, or access model. The findings from this observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
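The analysis described above, fitting log(metric + 1) against a preprint indicator and pooling per-journal fold-changes by random-effects meta-analysis, can be sketched as follows. This is a minimal illustration, not the study's actual code: the function names, the numpy-only OLS fit, and the DerSimonian-Laird pooling are assumptions about one reasonable way to implement the stated methods.

```python
import numpy as np

def fold_change_from_log_linear(y, preprint, covariates):
    """Fit OLS of log(y + 1) on a preprint indicator plus adjustment
    covariates; exp(coefficient) is the fold-change of (metric + 1)."""
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones(len(y)), preprint, covariates])
    beta, *_ = np.linalg.lstsq(X, np.log(y + 1), rcond=None)
    return np.exp(beta[1])  # fold-change for the preprint indicator

def random_effects_pool(log_fc, se):
    """DerSimonian-Laird random-effects pooling of per-journal log
    fold-changes; returns the pooled fold-change on the original scale."""
    log_fc, se = np.asarray(log_fc, float), np.asarray(se, float)
    w = 1.0 / se**2                                  # fixed-effect weights
    mu_fixed = np.sum(w * log_fc) / np.sum(w)
    Q = np.sum(w * (log_fc - mu_fixed) ** 2)         # heterogeneity statistic
    df = len(log_fc) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (se**2 + tau2)                    # random-effects weights
    return np.exp(np.sum(w_star * log_fc) / np.sum(w_star))
```

In this framing, a pooled value of 1.53 for the Attention Score would mean that, across journals, articles with a preprint had a 1.53 times higher Attention Score + 1 than comparable articles without one.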
“Research papers that make their underlying data openly available are significantly more likely to be cited in future work, according to an analysis led by researchers at the Alan Turing Institute in London that has been published as a preprint. The study, which is currently under peer review, examined nearly 532,000 articles in over 350 open access journals published by Public Library of Science (PLoS) and BioMed Central (BMC) between 1997 and 2018, and found those that linked directly to source data sets received 25% more citations on average….”
Abstract: Efforts to make research results open and reproducible are increasingly reflected by journal policies encouraging or mandating authors to provide data availability statements. As a consequence, data availability statements have seen strong uptake in recent literature. Nevertheless, it is still unclear what proportion of these statements actually contain well-formed links to data, for example via a URL or permanent identifier, and whether there is added value in providing them. We consider 531,889 journal articles published by PLOS and BMC which are part of the PubMed Open Access collection, categorize their data availability statements according to their content, and analyze the citation advantage of different statement categories via regression. We find that, following mandated publisher policies, data availability statements have now become common, yet statements containing a link to a repository are still just a fraction of the total. We also find that articles with these statements, in particular, can have up to 25.36% higher citation impact on average: an encouraging result for all publishers and authors who make the effort of sharing their data. All our data and code are made available in order to reproduce and extend our results.
Abstract: The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information. The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research. The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles. To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor start date, and the eighteen months after that date (i.e., from 01 July 2015 to 30 June 2018) were acquired and analysed. Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes. This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.
Abstract: A potential motivation for scientists to deposit their scientific work as preprints is to enhance its citation or social impact, an effect which has been empirically observed for preprints in physics, astronomy and mathematics deposited to arXiv. In this study we assessed the citation and altmetric advantage of bioRxiv, a preprint server for the biological sciences. We retrieved metadata of all bioRxiv preprints deposited between November 2013 and December 2017, and matched them to articles that were subsequently published in peer-reviewed journals. Citation data from Scopus and altmetric data from Altmetric.com were used to compare citation and online sharing behaviour of bioRxiv preprints, their related journal articles, and non-deposited articles published in the same journals. We found that bioRxiv-deposited journal articles received a sizeable citation and altmetric advantage over non-deposited articles. Regression analysis revealed that this advantage is not explained by multiple explanatory variables related to the article and its authorship. bioRxiv preprints themselves are being directly cited in journal articles, regardless of whether the preprint has been subsequently published in a journal. bioRxiv preprints are also shared widely on Twitter and in blogs, but remain relatively scarce in mainstream media and Wikipedia articles, in comparison to peer-reviewed journal articles.
OBJECTIVE. The impact of open access (OA) journals is still understudied in the field of radiology. In this study, we compared the measures of impact (e.g., CiteScore, citation count, SCImago Journal Rank) between OA and subscription radiology journals.
MATERIALS AND METHODS. We collected data on journals included in the Scopus Source List on November 1, 2018. We filtered the list for radiology journals for the years from 2011 to 2017. Journals covered by Scopus (Elsevier) are indicated as OA if listed in the Directory of Open Access Journals, the Directory of Open Access Scholarly Resources, or both. We also compared citation metrics between OA and subscription radiology journals.
RESULTS. The 2017 Scopus report included 265 radiology journals. The percentage of OA journals increased from 14.7% in 2011 to 21.9% in 2017 (49% increase). The median scholarly output and the citation count were both significantly lower in OA radiology journals compared with subscription journals (p < 0.001 and p = 0.016, respectively). The proportion of documents that received at least one citation was higher in OA (50.2%) compared with subscription journals (44.4%), but the difference was not statistically significant.
CONCLUSION. This study found that the trend toward OA publishing in the fields of radiology and nuclear medicine has slowed in recent years, although the percent cited (i.e., the proportion of documents that receive at least one citation) is higher for OA journals. We believe the radiology field should be more supportive of OA publishing.
Abstract: In the last 3 years, several new (free) sources for academic publication and citation data have joined the now well-established Google Scholar, complementing the two traditional commercial data sources: Scopus and the Web of Science. The most important of these new data sources are Microsoft Academic (2016), Crossref (2017) and Dimensions (2018). Whereas Microsoft Academic has received some attention from the bibliometric community, there are as yet very few studies that have investigated the coverage of Crossref or Dimensions. To address this gap, this brief letter assesses Crossref and Dimensions coverage in comparison to Google Scholar, Microsoft Academic, Scopus and the Web of Science through a detailed investigation of the full publication and citation record of a single academic, as well as six top journals in Business & Economics. Overall, this first small-scale study suggests that, when compared to Scopus and the Web of Science, Crossref and Dimensions have a similar or better coverage for both publications and citations, but a substantively lower coverage than Google Scholar and Microsoft Academic. If our findings can be confirmed by larger-scale studies, Crossref and Dimensions might serve as good alternatives to Scopus and the Web of Science for both literature reviews and citation analysis. However, Google Scholar and Microsoft Academic maintain their position as the most comprehensive free sources for publication and citation data.
“Much effort has gone towards crafting mandates and standards for researchers to share their data [1–3]. Considerably less time has been spent measuring just how valuable data sharing is, or recognizing the scientific contributions of the people responsible for those data sets. The impact of research continues to be measured by primary publications, rather than by subsequent uses of the data….
To incentivize the sharing of useful data, the scientific enterprise needs a well-defined system that links individuals with reuse of data sets they generate [4]….
A system in which researchers are regularly recognized for generating data that become useful to other researchers could transform how academic institutions evaluate faculty members’ contributions to science….”
The academic publishing world is changing significantly, with ever-growing numbers of publications each year and shifting publishing patterns. However, the metrics used to measure academic success, such as the number of publications, citation number, and impact factor, have not changed for decades. Moreover, recent studies indicate that these metrics have become targets and follow Goodhart’s Law, according to which, “when a measure becomes a target, it ceases to be a good measure.”
In this study, we analyzed >120 million papers to examine how the academic publishing world has evolved over the last century, with a deeper look into the specific field of biology. Our study shows that the validity of citation-based measures is being compromised and their usefulness is lessening. In particular, the number of publications has ceased to be a good metric as a result of longer author lists, shorter papers, and surging publication numbers. Citation-based metrics, such as citation number and h-index, are likewise affected by the flood of papers, self-citations, and lengthy reference lists. Measures such as a journal’s impact factor have also ceased to be good metrics due to the soaring numbers of papers that are published in top journals, particularly from the same pool of authors. Moreover, by analyzing properties of >2,600 research fields, we observed that citation-based metrics are not beneficial for comparing researchers in different fields, or even in the same department.
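To make the citation-based metrics discussed above concrete, the h-index can be computed as the largest h such that a researcher has h papers each cited at least h times. A minimal sketch (the function name and plain-list input format are illustrative assumptions, not tied to any particular study's code):

```python
def h_index(citations):
    """Largest h such that at least h of the given papers
    have h or more citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h
```

For example, a researcher with papers cited 10, 8, 5, 4, and 3 times has an h-index of 4. As the passage above notes, such a metric is inflated by self-citations and paper counts, and it is not comparable across fields with different citation norms.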
Academic publishing has changed considerably; now we need to reconsider how we measure success.