Ubiquity Press

“Hyku and Invenio. Hyku is community-developed as a turnkey Samvera application and Invenio is developed by CERN. Both systems are modern, attractive and well-suited for both traditional and non-traditional content. Bring together your theses, articles, research data, and software under one high-quality repository….All repositories we host will be fully open source. We guarantee to transfer the entire installation to a host of your choice if you decide to switch platforms. …”

CREDIT reflects Complete Workflow

“CREDIT is a cloud-enabled SaaS tool for data management to provide an opportunity to authors to register their Additional Research Outputs (AROs) reflecting RAW, REPEAT & NULL/NEGATIVE entities generated at various stages of the research workflow to ensure their reusability & gain credit, hence contributing towards enriching research articles & reproducible science. The CREDIT framework & interface is developed on FAIR data principles….The appearance of these badges happens dynamically, hence creates a possibility that the metrics around the data, when readers engage with it, would be fed back to the main published article in real-time (accessible via the badge – enhancing discoverability and also giving credit to authors). And in the near future we also have plans to roll out badges that can be embedded in PDF articles….”

New reports on the European Open Science Cloud governance and on Open Access Publishing in Europe. | eoscpilot.eu

“The Open Science Policy Platform has adopted consensual reports on the European Open Science Cloud governance and on Open Access Publishing in Europe. The report on EOSC includes recommendations on governance and financial schemes, while the report on Open Science Publishing gives recommendations on how to implement open access publishing in Europe by 2020.”

EOSC (European Open Science Cloud for Research Pilot Project)

“The EOSCpilot project will support the first phase in the development of the European Open Science Cloud (EOSC). It will:

  • Propose and trial the governance framework for the EOSC and contribute to the development of European open science policy and best practice;
  • Develop a number of demonstrators functioning as high-profile pilots that integrate services and infrastructures to show interoperability and its benefits in a number of scientific domains; and
  • Engage with a broad range of stakeholders, crossing borders and communities, to build the trust and skills required for adoption of an open approach to scientific research.

These actions will build on and leverage already available resources and capabilities from research infrastructure and e-infrastructure organisations to maximise their use across the research community.

  • Reduce fragmentation between data infrastructures by working across scientific and economic domains, countries and governance models, and
  • Improve interoperability between data infrastructures by progressing and demonstrating how data and resources can be shared even when they are large and complex and in varied formats.

The EOSCpilot project will improve the ability to preserve and reuse data resources and provide an important step towards building a dependable open innovation environment where data from publicly funded research is always open and there are clear incentives and rewards for the sharing of data and resources….”

Europe joins forces to create largest ever shared data repository for researchers | Horizon: the EU Research & Innovation magazine | European Commission

“World-leading research institutes have agreed to join forces with funding agencies and policymakers to create the European Open Science Cloud, the largest shared data repository in history.”

European Open Science Cloud, pilot project

“EOSCpilot Mission

Facilitating access of researchers across all scientific disciplines to data

Establishing a governance and business model that sets the rules for the use of EOSC

Creating a cross-border and multi-disciplinary open innovation environment for research data, knowledge and services

Establishing global standards for interoperability for scientific data…”

Next-generation metrics: Responsible metrics and evaluation for open science

“Over the past year, the Expert Group has reviewed available metrics, with special attention to altmetrics, and identified frameworks for responsible usage, in the context of the EC’s agenda for open science. This agenda is developing under five action lines: fostering and creating incentives for open science; removing barriers for open science; mainstreaming and further promoting open access policies; developing an open science cloud; and open science as a socio-economic driver.

A multi-stakeholder Open Science Policy Platform has been established, to advise on strategic direction and implementation. In May 2016, the EU Competitiveness Council issued a set of conclusions on the transition towards an open science system. It noted that the remit of the Open Science Policy Platform should include ‘adapting reward and evaluation systems, alternative models for open access publishing and management of research data (including archiving), altmetrics….and other aspects of open science.’

This is the context in which the Expert Group on Altmetrics undertook its work, and will input findings to EC policymakers and to the Open Science Policy Platform.

[…] 

This report builds on the expertise of the group members, complemented by desk research and an extensive literature review. The group also issued a call for evidence in June 2016, to gather the views of stakeholders. Respondents had one month to reply with brief submissions. They were asked to indicate whether they were making an individual or organisational response, and what role they occupied in the open science agenda. In total, twenty responses to the call for evidence were received, of which nineteen were valid answers. The list of respondents can be found in Appendix 1.

A summary of the results from the call for evidence was presented at the Science and Technology Indicators (STI) Conference in Valencia (September 15, 2016) and the 3AM Conference in Bucharest (September 29, 2016). Both occasions were used to receive more feedback. The audience at the STI Conference mainly consisted of researchers in scientometrics and bibliometrics, whereas attendees at the 3AM Conference mainly came from research institutes, altmetric providers, and libraries. Feedback was mostly anonymous via plenary contributions and a paper-and-pencil exercise during the 3AM Conference.”