OpenAIRE Monitoring Services – EC FP7 & H2020 and other national funders (presentation at the Open Science FAIR 2017 conference)

“Presentation at the FOSTERplus project workshop on ‘Fostering the Practical Implementation of Open Science in Horizon 2020 and Beyond’, at the Open Science FAIR conference in Athens, September 7, 2017.”

Article visibility: journal impact factor and availability of full text in PubMed Central and open access

Abstract:  Both the impact factor of the journal and immediate full-text availability in PubMed Central (PMC) have featured in editorials before.1-3 In 2004, the editor of the Cardiovascular Journal of Africa (CVJA) lamented, like so many others, the injustice of not having an impact factor, its validity as a tool for measuring science output, and the negative effect of a low perceived impact in drawing attention away from publications from developing countries.1,4

Since then, after a selection process, we have been indexed in the Web of Science® (WoS) by Thomson Reuters (Philadelphia, PA, USA), and have seen a growing impact factor. In the case of PMC, our acceptance to this database was announced in 2012,2 and now we are proud that it is active and full-text articles are available dating back to 2009. The journal opted for immediate full open access (OA), which means that full-text articles are available on publication date for anybody with access to the internet.

Paola Di Maio, Toward shared system knowledge: an empirical study of knowledge sharing policy and practice in systems engineering research in the UK

Abstract:  Research in Open Access (OA) to Scholarly Publications has flourished in recent years; however, studies published to date tend to be quantitative, statistical analyses over undifferentiated corpora that monitor the overall uptake (Björk et al. 2010; Laakso et al. 2011). This doctoral thesis explores a different path of inquiry: it examines the effectiveness of OA policies in relation to the perspective of a ‘knowledge seeker’ and considers them in the context of the wider regulatory landscape that motivates their existence, specifically monitoring the availability of shared resources – journal publications, as well as other knowledge sharing artefacts adopted in technical domains – in relation to systems engineering research in the UK. Research Funding Councils adopt Open Access policies and display them prominently on their websites, yet not all funded research projects seem to share knowledge by publishing Open Access resources. The main hypothesis driving this thesis is that a gap exists between Open Access in theory and Open Access in practice. A unique research methodology is devised that combines evidence based research (EBR) with a wide range of mixed method techniques, including FOI (freedom of information) requests. A novel collection instrument, a set of heuristic indicators, is developed to support the empirical observation of the gap between ‘Open Access policies in theory’, corresponding approximately to what the funding body states on its website, and ‘Open Access policies in practice’, corresponding to the level of adoption of these policies by grant holders. A systematic review and a meta-analysis of 100 publicly funded projects are carried out. The research demonstrates empirically that in the majority of the audited publicly funded projects, no Open Access resources can be located.

Science Europe Position Statement – On a New Vision for More Meaningful Research Impact Assessment

“Research has always had a wide impact on society, but this does not always come in the form of a clearly defined outcome, application, or effect. It frequently occurs as a more gradual development in the understanding of the consequences of new knowledge. It may also happen long after the corresponding research was done, or be the outcome of repeated interactions between research and society. This can make societal impact difficult or impossible to trace back and attribute to the specific research from which it originated. In addition, that research may have already been evaluated without this impact having been taken into account.”

HuMetricsHSS – Rethinking humane indicators of excellence in the humanities and social sciences

“HuMetricsHSS takes the approach that metrics should only be used to measure a scholar’s progress toward embodying five values that our initial research suggests are central to all HSS disciplines:

  • Collegiality, which can be described as the professional practices of kindness, generosity, and empathy toward other scholars and oneself;
  • Quality, a value that demonstrates one’s originality, willingness to push boundaries, methodological soundness, and the advancement of knowledge both within one’s own discipline and among other disciplines and with the general public, as well;
  • Equity, or the willingness to undertake study with social justice, equitable access to research, and the public good in mind;
  • Openness, which includes a researcher’s transparency, candor, and accountability, in addition to the practice of making one’s research open access at all stages; and
  • Community, the value of being engaged in one’s community of practice and with the public at large and also in practicing principled leadership. …”

Assessing Current Practices in Review, Tenure, and Promotion – #ScholCommLab

“One of the key components of workplace advancement at the university level is the review, promotion, and tenure (RPT) packet that is typically submitted every other year by early career faculty. These guidelines and forms are considered to be of highest importance for all faculty, especially for early career faculty who need to demonstrate the value and impact of their work to the university and the broader scientific community. Quite often impact is equated with “impact factor,” leading many researchers to target a narrow range of journals at the expense of broader societal considerations (such as the public’s right to access). The importance of RPT guidelines and forms makes them a natural place to effect change towards an opening of access to research (something both Canada and the US have been pushing for through federal policies and laws).

While we believe changes in RPT guidelines and forms may provide the impetus for behavioral change, leading to broader interest and adoption of open access principles, the reality is that very little is known about current RPT practices as they relate to questions of openness. This project seeks to examine the RPT process in the US and Canada in ways that can directly inform actions likely to translate into behavioural change and to a greater opening of research….”

Usage Bibliometrics as a Tool to Measure Research Activity

Abstract:  Measures for research activity and impact have become an integral ingredient in the assessment of a wide range of entities (individual researchers, organizations, instruments, regions, disciplines). Traditional bibliometric indicators, like publication and citation based indicators, provide an essential part of this picture, but cannot describe the complete picture. Since reading scholarly publications is an essential part of the research life cycle, it is only natural to introduce measures for this activity in attempts to quantify the efficiency, productivity and impact of an entity. Citations and reads are significantly different signals, so taken together, they provide a more complete picture of research activity. Most scholarly publications are now accessed online, making the study of reads and their patterns possible. Clickstream logs allow us to follow information access by the entire research community in real time. Publication and citation datasets just reflect activity by authors. In addition, download statistics will help us identify publications with significant impact, but which do not attract many citations. Clickstream signals are arguably more complex than, say, citation signals. For one, they are a superposition of different classes of readers. Systematic downloads by crawlers also contaminate the signal, as does browsing behavior. We discuss the complexities associated with clickstream data and how, with proper filtering, statistically significant relations and conclusions can be inferred from download statistics. We describe how download statistics can be used to describe research activity at different levels of aggregation, ranging from organizations to countries. These statistics show a correlation with socio-economic indicators. A comparison will be made with traditional bibliometric indicators. We will argue that astronomy is representative of more general trends.
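The filtering step the abstract describes can be illustrated with a minimal sketch: drop crawler traffic and de-duplicate repeat views before counting reads per publication. The log format, field names, and bot heuristics below are assumptions for illustration, not the authors' actual pipeline.

```python
from collections import Counter

# Hypothetical clickstream records: (user_agent, ip, paper_id).
# The substring heuristic below is a crude illustrative stand-in for
# real bot-detection; production filters are far more elaborate.
BOT_MARKERS = ("bot", "crawler", "spider", "curl", "wget")

def is_bot(user_agent: str) -> bool:
    """Flag systematic crawler traffic by user-agent substring."""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

def read_counts(log):
    """Count reads per paper, dropping crawler hits and de-duplicating
    repeat views of the same paper from the same IP (browsing noise)."""
    seen = set()
    counts = Counter()
    for user_agent, ip, paper_id in log:
        if is_bot(user_agent):
            continue
        if (ip, paper_id) in seen:  # count each reader once per paper
            continue
        seen.add((ip, paper_id))
        counts[paper_id] += 1
    return counts

log = [
    ("Mozilla/5.0", "1.2.3.4", "arXiv:1001.0001"),
    ("Googlebot/2.1", "5.6.7.8", "arXiv:1001.0001"),  # crawler: filtered out
    ("Mozilla/5.0", "1.2.3.4", "arXiv:1001.0001"),    # repeat view: de-duplicated
    ("Mozilla/5.0", "9.9.9.9", "arXiv:1001.0002"),
]
print(read_counts(log))  # each paper ends up with one counted read
```

Aggregating the resulting counts by author affiliation or country then yields the organization- and country-level activity measures the abstract mentions.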

European Institutions Adopt Altmetric Explorer for Institutions – Digital Science

“Our portfolio company Altmetric announces that École Polytechnique Fédérale de Lausanne (EPFL) has become the latest institution to adopt the Explorer for Institutions platform to help analyse the online engagement surrounding its scholarly research outputs.

With an intuitive interface which enables users to browse, filter and report on the latest shares and mentions for over 10 million research outputs, the Explorer for Institutions platform makes it easy to identify where academic work has received mainstream or social media coverage, been referenced in public policy, or received attention from scholarly and broader audiences in places such as Wikipedia, Reddit and post-publication peer-review forums. Citation data from Scopus and Web of Science is also included where available.

EPFL joins leading institutions including Ghent University, ETH Zurich, the University of Helsinki, and the International Institute of Social Studies and the Erasmus Research Institute of Management at Erasmus University Rotterdam in utilising Altmetric data to better understand the reach and influence of published research.”

Launch of MyScienceOpen gives researchers new ways to promote their work | STM Publishing News

“Today, we [ScienceOpen] are pleased to announce the launch of MyScienceOpen, our professional networking platform designed for the modern research environment. By leveraging the power of ORCID, MyScienceOpen is an integrated profile where academics can visualize their research impact through our enhanced author-level metrics….”

Make Data Count: Building a System to Support Recognition of Data as a First Class Research Output | Data Pub

“The Alfred P. Sloan Foundation has made a 2-year, $747K award to the California Digital Library, DataCite, and DataONE to support collection of usage and citation metrics for data objects. Building on pilot work, this award will result in the launch of a new service that will collate and expose data-level metrics.

The impact of research has traditionally been measured by citations to journal publications: journal articles are the currency of scholarly research. However, scholarly research is made up of a much larger and richer set of outputs beyond traditional publications, including research data. In order to track and report the reach of research data, methods for collecting metrics on complex research data are needed. In this way, data can receive the same credit and recognition that is assigned to journal articles.

‘Recognition of data as valuable output from the research process is increasing and this project will greatly enhance awareness around the value of data and enable researchers to gain credit for the creation and publication of data’ – Ed Pentz, Crossref.

This project will work with the community to create a clear set of guidelines on how to define data usage. In addition, the project will develop a central hub for the collection of data-level metrics. These metrics will include data views, downloads, citations, saves, and social media mentions, and will be exposed through customized user interfaces deployed at partner organizations. Working in an open source environment, and including extensive user experience testing and community engagement, the products of this project will be available to data repositories, libraries and other organizations to deploy within their own environment, serving their communities of data authors.”
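The “central hub” idea described above amounts to merging per-source metric feeds into one record per dataset. A minimal sketch, assuming a hypothetical tuple feed format and made-up DOIs; the metric names mirror those listed in the announcement (views, downloads, citations, social media mentions):

```python
from collections import defaultdict

def collate(feeds):
    """Merge per-source (doi, metric, count) records into one tally
    per dataset, summing counts when sources overlap."""
    hub = defaultdict(lambda: defaultdict(int))
    for source in feeds:
        for doi, metric, count in source:
            hub[doi][metric] += count
    return {doi: dict(metrics) for doi, metrics in hub.items()}

# Illustrative feeds from three hypothetical sources.
repo_usage = [("10.5061/dryad.x1", "views", 120),
              ("10.5061/dryad.x1", "downloads", 40)]
citations  = [("10.5061/dryad.x1", "citations", 3)]
altmetrics = [("10.5061/dryad.x1", "social_media_mentions", 7)]

print(collate([repo_usage, citations, altmetrics]))
```

A real hub would of course also normalise identifiers and deduplicate events per the community usage guidelines the project proposes; the sketch only shows the collation step.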