An overview of hybrid open access journals that meet two conditions: (1) academic institutions sponsored the publication fee according to the Open APC initiative, and (2) publishers shared licensing information about full-text accessibility and re-use rights with Crossref.
Recently, we announced a new Open Access hosting partnership with UCL Press. But just what does this mean, exactly?
Our customised hosting services are designed to help publishers showcase and distribute the Open Access journals they publish to maximum effect. They are the natural extension of our marketing and indexing services, built on our years of experience in content management architecture layered with advanced discovery technologies. By working with a range of publishers and content types, we have built a flexible platform that interconnects scholarly articles at the level of their metadata and establishes a forum for user interaction around them. For Open Access journals, however, we can offer a further advantage: embedding the full-text articles within our discovery environment.
“This report presents the first major comparative analysis of usage data for OA and non-OA scholarly books, and provides an informed view of how a book benefits from OA publication. It also highlights the challenges involved in measuring the impact of OA on scholarly books and suggests that there is much to do across the whole scholarly communications network in supporting authors and their funders.”
“Goal 3: Change collection focus from “outside in” to “inside out”. This is the critical flip in strategy that academic libraries need to make….Measures: 1. The number of library staff who are allocated to “inside-out” activities, including scholarly communication, data management, repository management, digitization, etc. This might also be expressed as a percentage of all library staff or of staff involved in collections, including selection, acquisitions, cataloging, and circulation. 2. The portion of the collections budget, defined to include funds allocated to digital scholarship activities such as an Open Access authors’ fund and to support community Open Access projects. 3. The amount of money invested in the acquisition of special collections. This could be represented in dollars or as a percentage of the collections budget….”
“To round off a great Open Access week, we’d like to announce an interesting new project we’ve started. Continuing our efforts in the field of Open Science, Open Knowledge Finland was commissioned by CSC – IT Center for Science and the Finnish Ministry of Education and Culture to implement a Study on the Openness of Scientific Publishers.”
“In my last post on the lack of accessibility of Gold Open Access for early career researchers (ECRs), I mentioned that in my opinion Green Open Access was a very imperfect solution – in fact, hardly a solution at all. I expand here on why that is the case, and why a focus on green OA presents new challenges for publication practices which compound the – already many – challenges of moving towards a greater accessibility of research. Not all OA initiatives are equal. Green Open Access, by far the most common kind, refers to the depositing of a non-final version of the published manuscript into a research repository – generally either an institutional repository (managed by the university with which the researcher is affiliated), a subject-specific repository (such as ArXiv/SocArXiv), an academic networking website such as Academia.edu, ResearchGate, or Mendeley, or a personal website. Various publishers have rules on what version can be posted where and when, with the most common being that accepted manuscripts (after peer review, but before proofreading and typesetting) can be made public in repositories after an embargo period, while the “version of record” – the published version – may not be shared publicly for free. The published article remains accessible only with paid access (with publishers either explicitly authorizing (SAGE) or tacitly tolerating the private sharing of full articles).”
“Nothing (and in particular no semi-automatized pseudo-scientific evaluation that involves numbers or data) can replace evaluation by an individual who actually understands what he/she is evaluating. Furthermore, tools such as impact factors are clearly not helpful or relevant in the context of mathematical research….”
“In their comment, Janssens et al. offer a critique of the Relative Citation Ratio (RCR), objecting to the construction of both the numerator and denominator of the metric. While we strongly agree that any measure used to assess the productivity of research programs should be thoughtfully designed and carefully validated, we believe that the specific concerns outlined in their correspondence are unfounded.
Our original article acknowledged that RCR or, for that matter, any bibliometric measure has limited power to quantify the influence of any very recently published paper, because citation rates are inherently noisy when the absolute number of citations is small. For this reason, in our iCite tool, we have not reported RCRs for papers published in the calendar year previous to the current year. However, while agreeing with our initial assertion that RCR cannot be used to conclusively evaluate recent papers, Janssens et al. also suggest that the failure to report RCRs for new publications might unfairly penalize some researchers. While it is widely understood that it takes time to accurately assess the influence that new papers have on their field, we have attempted to accommodate this concern by updating iCite so that RCRs are now reported for all papers in the database that have at least 5 citations and by adding a visual indicator to flag values for papers published in the last 18 months, which should be considered provisional. This modified practice will be maintained going forward.
Regarding article citation rates of older articles, we have added data on the stability of RCR values to the “Statistics” page of the iCite website [4, 5]. We believe that these new data, which demonstrate that the vast majority of influential papers retain their influence over the period of an investigator’s career, should reassure users that RCR does not unfairly disadvantage older papers. Our analysis of the year-by-year changes in RCR values of National Institutes of Health (NIH)-funded articles published in 1991 reinforces this point (Fig 1). From 1992–2014, both on the individual level and in aggregate, RCR values are remarkably stable. For cases in which RCRs change significantly, the values typically increase. That said, we strongly believe that the potential for RCR to decrease over time is necessary and important; as knowledge advances and old models are replaced, publications rooted in those outdated models naturally become less influential….”
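The reporting policy described in this passage — show an RCR only for papers with at least 5 citations, and flag values for papers published in the previous 18 months as provisional — can be sketched as follows. This is a minimal illustration of the stated rule, not iCite's actual implementation; the `Paper` fields, function name, and the 18-month day count are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date

MIN_CITATIONS = 5              # threshold stated in the excerpt
PROVISIONAL_WINDOW_DAYS = 540  # ~18 months, approximated for illustration

@dataclass
class Paper:
    citations: int
    rcr: float
    published: date

def report_rcr(paper: Paper, today: date):
    """Return (rcr, provisional_flag), or None if the RCR is not reported.

    Mirrors the excerpt's policy: too few citations -> no value shown;
    recent publication -> value shown but flagged as provisional.
    """
    if paper.citations < MIN_CITATIONS:
        return None  # citation counts this small make RCR too noisy to report
    provisional = (today - paper.published).days <= PROVISIONAL_WINDOW_DAYS
    return paper.rcr, provisional
```

Keeping the noise threshold and the recency flag as two separate checks matches the distinction the authors draw: one suppresses unreliable values outright, the other displays a value while signalling that it may still change.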
“16. We affirm the principle that efforts should be directed to promote a widespread participation of researchers in the network of global research infrastructures, taking account of the opportunities offered by open science paradigms. Significant contributions to this discussion come from the “Group of Senior Officials on Global Research Infrastructures” (GSO) and the G7 “Open Science Working Group” (OS WG)…. We welcome the GSO’s 2017 report, which includes both the evolution of the Framework, corresponding to a broader and deeper consensus on global access criteria, and the developments on open innovation and open data policies….19. We recognize that ICT developments, the digitisation and the vast availability of data, efforts to push the science frontiers, and the need to address complex economic and societal challenges are transforming the way in which science is performed towards Open Science paradigms. We agree that an international approach can help the speed and coherence of this transition, and that it should target two aspects in particular. First, the incentives for the openness of the research ecosystem: the evaluation of research careers should better recognize and reward Open Science activities. Secondly, the infrastructures for an optimal use of research data: all researchers should be able to deposit, access and analyse scientific data across disciplines and at the global scale, and research data should adhere to the FAIR principles of being findable, accessible, interoperable, and reusable….20. We support the work and results achieved so far by the G7 Open Science Working Group. The OS Working Group has identified priorities that deserve and require common, aligned actions, both in encouraging openness in scientific research practice and in building data skills through workforce development and training.
We encourage the OS WG to follow up on actions taken by G7 members according to the WG’s recommendations and to collect good practices, in order to report to the next G7 Science Ministers’ Meeting. In particular, we support the OS WG deepening its efforts on the two topics identified above (paragraph 19), namely the incentives for openness of the research ecosystem, including the role of research indicators and metrics relevant to open science, and the infrastructures and standards for optimal use of research data. The summary report of the OS Working Group is attached to this Communiqué….”