A cohort study of how faculty in LIS schools perceive and engage with open-access publishing – Wilhelm Peekhaus

Abstract:  This article presents results from a survey of faculty in North American Library and Information Studies (LIS) schools about their attitudes towards and experience with open-access publishing. As a follow-up to a similar survey conducted in 2013, the article also outlines the differences in beliefs about and engagement with open access that have occurred between 2013 and 2018. Although faculty in LIS schools are proponents of free access to research, journal publication choices remain informed by traditional considerations such as prestige and impact factor. Engagement with open access has increased significantly, while perceptions of open access have remained relatively stable between 2013 and 2018. Nonetheless, those faculty who have published in an open-access journal or are more knowledgeable about open access tend to be more convinced about the quality of open-access publications and less apprehensive about open-access publishing than those who have no publishing experience with open-access journals or who are less knowledgeable about various open-access modalities. Willingness to comply with gold open-access mandates has increased significantly since 2013.

Releasing a preprint is associated with more attention and citations | bioRxiv

Abstract:  Preprints in the life sciences are gaining popularity, but release of a preprint still precedes only a fraction of peer-reviewed publications. Quantitative evidence on the relationship between preprints and article-level metrics of peer-reviewed research remains limited. We examined whether having a preprint on bioRxiv.org was associated with the Altmetric Attention Score and number of citations of the corresponding peer-reviewed article. We integrated data from PubMed, CrossRef, Altmetric, and Rxivist (a collection of bioRxiv metadata). For each of 26 journals (comprising a total of 46,451 articles and 3,817 preprints), we used log-linear regression, adjusted for publication date and scientific subfield, to estimate fold-changes of Attention Score and citations between articles with and without a preprint. We also performed meta-regression of the fold-changes on journal-level characteristics. By random-effects meta-analysis across journals, releasing a preprint was associated with a 1.53 times higher (Attention Score + 1) (95% CI 1.42 to 1.65) and 1.31 times more (citations + 1) (95% CI 1.24 to 1.38) for the peer-reviewed article. Journals with larger fold-changes of Attention Score tended to have lower impact factors and lower percentages of articles released as preprints. In contrast, a journal’s fold-change of citations was not associated with impact factor, percentage of articles released as preprints, or access model. The findings from this observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
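The fold-change idea in the abstract can be illustrated with a minimal, unadjusted sketch (the actual study also adjusted for publication date and subfield, which this does not): take the difference in mean log(metric + 1) between the two groups and exponentiate it. The citation counts below are hypothetical, not from the study.

```python
import math
from statistics import mean

def fold_change(with_preprint, without_preprint):
    """Unadjusted fold-change of (metric + 1): exponentiate the
    difference in mean log(metric + 1) between the two groups."""
    log_with = [math.log(x + 1) for x in with_preprint]
    log_without = [math.log(x + 1) for x in without_preprint]
    return math.exp(mean(log_with) - mean(log_without))

# Hypothetical citation counts for articles in one journal
preprint_yes = [12, 30, 7, 22]
preprint_no = [8, 15, 4, 10]
print(round(fold_change(preprint_yes, preprint_no), 2))  # → 1.75
```

A value above 1 means articles with a preprint accumulated more of the metric; the +1 offset keeps zero-citation articles in the log-scale analysis.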

Cambridge University signs San Francisco Declaration on Research Assessment | University of Cambridge

“The University of Cambridge and Cambridge University Press today announce that they have signed up to the San Francisco Declaration on Research Assessment (DORA), a set of recommendations agreed in 2012 that seek to ensure that the quality and impact of research outputs are “measured accurately and evaluated wisely”. …”

Proposal for a Standard Article Metrics Dashboard to Replace the Journal Impact Factor

Abstract:  This paper proposes the creation of a dashboard consisting of five metrics that could be used to replace the journal impact factor. It should be especially useful in circumstances, like promotion and tenure committees, where the evaluators do not share the authors’ subject expertise and where they are working under time constraints.

Share or perish: Social media and the International Journal of Mental Health Nursing – McNamara – – International Journal of Mental Health Nursing – Wiley Online Library

Abstract:  The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information. The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research. The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles. To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor start date and the eighteen months after that date (i.e. from 01 July 2015 to 30 June 2018) were acquired and analysed. Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes. This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.

Journal impact factor: a bumpy ride in an open space | Journal of Investigative Medicine

Abstract:  The journal impact factor (IF) is the leading method of scholarly assessment in today’s research world. An important question is whether or not this is still a constructive method. For a specific journal, the IF is the number of citations for publications over the previous 2 years divided by the number of total citable publications in these years (the citation window). Although this simplicity works to the method’s advantage, complications arise when answers to questions such as ‘What is included in the citation window’ or ‘What makes a good journal impact factor’ contain ambiguity. In this review, we discuss whether or not the IF should still be considered the gold standard of scholarly assessment in view of the many recent changes and the emergence of new publication models. We will outline its advantages and disadvantages. The advantages of the IF include promoting the author while giving readers a sense of the magnitude of review. On the other hand, its disadvantages include reflecting the journal’s quality more than the author’s work, the fact that it cannot be compared across different research disciplines, and the struggles it faces in the world of open access. Recently, alternatives to the IF have been emerging, such as the SCImago Journal & Country Rank, the Source Normalized Impact per Paper and the Eigenfactor Score, among others. However, all alternatives proposed thus far are associated with their own limitations as well. In conclusion, although the IF has its drawbacks, until better alternative methods are proposed, it remains one of the most effective methods for assessing scholarly activity.
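The two-year citation-window definition given in the abstract can be sketched directly in code. The counts below are hypothetical, purely to show the arithmetic:

```python
def impact_factor(citations_by_pub_year, citable_items_by_year, year):
    """Journal impact factor for `year`: citations received in `year`
    to items published in the two preceding years, divided by the
    number of citable items published in those same two years."""
    window = (year - 1, year - 2)
    cites = sum(citations_by_pub_year.get(y, 0) for y in window)
    items = sum(citable_items_by_year.get(y, 0) for y in window)
    return cites / items

# Hypothetical: citations received in 2018 to articles from 2016/2017
cites_2018 = {2017: 320, 2016: 280}
items_published = {2017: 150, 2016: 130}
print(impact_factor(cites_2018, items_published, 2018))  # 600 / 280
```

The ambiguities the abstract mentions live in the inputs: which items count as “citable” in the denominator, and which citations are included in the numerator, are editorial decisions rather than part of the formula itself.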

UKRI signs San Francisco Declaration on Research Assessment – UK Research and Innovation

UK Research and Innovation (UKRI) has signed an international declaration aimed at strengthening and promoting best practice in the way research is assessed.

The San Francisco Declaration on Research Assessment (DORA) recognises the need to improve the ways in which the outputs of research are evaluated with regards to appropriate use of metrics and makes high-level recommendations for how this can be achieved. DORA includes specific recommendations for funders and organisations that undertake evaluation.

The seven Research Councils* are current signatories, and the Higher Education Funding Council for England was a signatory. Research Councils UK (RCUK), the umbrella organisation for the seven Research Councils before the formation of UKRI, signed DORA in February 2018.

UKRI is a member of the Plan S coalition, an international initiative launched to make full and immediate open access to research publications a reality. Plan S recognises DORA and that research needs to be assessed on its own merits rather than on the venue of publication….”
