Abstract: We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF or closely related terms. Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status. We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.
Abstract: There is a dearth of research on the perceptions of faculty members in educational leadership regarding open access publications. This gap may stem from a lack of funding for educational leadership research, financial obstacles, tenure demands, or reputation concerns. It may be that there are simply fewer established open access publishers with reputable impact factors to encourage publication by members of the field. The current study seeks to answer the following question: “What are the perceptions of educational leadership faculty members in UCEA about open access publishing?” The results are based on responses from 180 faculty members in the field of educational leadership.
Abstract: Using an online survey of academics at 55 randomly selected institutions across the US and Canada, we explore priorities for publishing decisions and their perceived importance within review, promotion, and tenure (RPT). We find that respondents most value journal readership, while they believe their peers most value prestige and related metrics such as impact factor when submitting their work for publication. Respondents indicated that total number of publications, number of publications per year, and journal name recognition were the most valued factors in RPT. Older and tenured respondents (most likely to serve on RPT committees) were less likely to value journal prestige and metrics for publishing, while untenured respondents were more likely to value these factors. These results suggest disconnects between what academics value and what they think their peers value, and between the importance of journal prestige and metrics for tenured versus untenured faculty in publishing and RPT perceptions.
Abstract: This article presents results from a survey of faculty in North American Library and Information Studies (LIS) schools about their attitudes towards and experience with open-access publishing. As a follow-up to a similar survey conducted in 2013, the article also outlines the differences in beliefs about and engagement with open access that have occurred between 2013 and 2018. Although faculty in LIS schools are proponents of free access to research, journal publication choices remain informed by traditional considerations such as prestige and impact factor. Engagement with open access has increased significantly, while perceptions of open access have remained relatively stable between 2013 and 2018. Nonetheless, those faculty who have published in an open-access journal or are more knowledgeable about open access tend to be more convinced about the quality of open-access publications and less apprehensive about open-access publishing than those who have no publishing experience with open-access journals or who are less knowledgeable about various open-access modalities. Willingness to comply with gold open-access mandates has increased significantly since 2013.
“For decades, the syllabus has been the roadmap to college classes, listing homework, assignments, and most crucially, texts for students to read and reference. But while a syllabus might be able to teach students what they’re in for during the semester, academics have lacked a tool to analyze large masses of syllabi to better understand what teachers are teaching in different disciplines. That means there isn’t as much empirical data about the content being taught at universities.
The Open Syllabus Project aims to fix this problem. Researchers at the American Assembly, a nonprofit housed within Columbia University, have collected an archive of more than six million syllabi from college courses all over the world that could help teachers to create new syllabi and researchers to garner a cross-cultural understanding of higher education.
The project first launched three years ago, but this new update has six times as many syllabi, plus search tools and visualizations designed to map out how academia works right now….”
Abstract: Preprints in the life sciences are gaining popularity, but release of a preprint still precedes only a fraction of peer-reviewed publications. Quantitative evidence on the relationship between preprints and article-level metrics of peer-reviewed research remains limited. We examined whether having a preprint on bioRxiv.org was associated with the Altmetric Attention Score and number of citations of the corresponding peer-reviewed article. We integrated data from PubMed, CrossRef, Altmetric, and Rxivist (a collection of bioRxiv metadata). For each of 26 journals (comprising a total of 46,451 articles and 3,817 preprints), we used log-linear regression, adjusted for publication date and scientific subfield, to estimate fold-changes of Attention Score and citations between articles with and without a preprint. We also performed meta-regression of the fold-changes on journal-level characteristics. By random effects meta-analysis across journals, releasing a preprint was associated with a 1.53 times higher (Attention Score + 1) (95% CI 1.42 to 1.65) and 1.31 times more (citations + 1) (95% CI 1.24 to 1.38) for the peer-reviewed article. Journals with larger fold-changes of Attention Score tended to have lower impact factors and lower percentages of articles released as preprints. In contrast, a journal’s fold-change of citations was not associated with impact factor, percentage of articles released as preprints, or access model. The findings from this observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
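The fold-change estimate described in this abstract comes from regressing log(metric + 1) on a preprint indicator and exponentiating the indicator's coefficient. The sketch below illustrates that core idea on synthetic data; it is a minimal illustration, not the authors' actual pipeline (their model also adjusted for publication date and subfield, and pooled journals by random-effects meta-analysis). All variable names and the simulated data are assumptions for the example.

```python
import numpy as np

def fold_change_log_linear(y, has_preprint):
    """Estimate the multiplicative effect (fold-change) of releasing a
    preprint on a metric y, via ordinary least squares on log(y + 1).

    y            : metric values (e.g., citations or Attention Score)
    has_preprint : binary indicator, 1 if a preprint was released
    Returns exp(beta) for the preprint indicator, i.e., the fold-change.
    """
    y = np.asarray(y, dtype=float)
    x = np.asarray(has_preprint, dtype=float)
    # Design matrix: intercept column plus the preprint indicator
    X = np.column_stack([np.ones_like(x), x])
    # Log-transform with the +1 offset used in the abstract
    log_y = np.log(y + 1.0)
    beta, *_ = np.linalg.lstsq(X, log_y, rcond=None)
    return float(np.exp(beta[1]))

# Toy data: articles with a preprint get ~1.5x more (citations + 1)
rng = np.random.default_rng(0)
n = 2000
preprint = rng.integers(0, 2, size=n)
log_mean = np.log(20.0) + np.log(1.5) * preprint
citations = np.exp(log_mean + rng.normal(0.0, 0.1, size=n)) - 1.0
fc = fold_change_log_linear(citations, preprint)
print(round(fc, 2))  # recovers a value near the simulated 1.5x fold-change
```

Exponentiating the coefficient is what turns an additive effect on the log scale into the multiplicative "1.53 times higher" statement quoted in the abstract.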
“Ludo [Waltman] started his inaugural lecture by discussing the improvements made to the indicators, algorithms, and software tools used by CWTS. He explained how these improvements have led to better ways in which scientometric methods support the evaluation of scientific research. Ludo then talked about contextualized scientometrics. In this new approach to scientometrics, scientometric analyses are made fully transparent, enabling a deep engagement of end users in the interpretation of the analyses. Ludo emphasized that contextualized scientometrics requires openly available scientometric data sources. Finally, Ludo called for higher levels of quantitative literacy to improve the way scientometric analyses are used to inform research policy. Improvements need to be made in research and education. In addition, Ludo stressed the importance of having realistic expectations from scientometric analyses.”
“The University of Cambridge and Cambridge University Press today announce that they have signed up to the San Francisco Declaration on Research Assessment (DORA), a set of recommendations agreed in 2012 that seek to ensure that the quality and impact of research outputs are “measured accurately and evaluated wisely”. …”
Abstract: This paper proposes the creation of a dashboard consisting of five metrics that could be used to replace the journal impact factor. It should be especially useful in circumstances, like promotion and tenure committees, where the evaluators do not share the authors’ subject expertise and where they are working under time constraints.
Abstract: The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information. The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research. The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles. To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor start date and the eighteen months after that date (i.e., from 01 July 2015 to 30 June 2018) were acquired and analysed. Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes. This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.