Cost-benefit analysis for FAIR research data – ENVRI community

“FAIR research data encompasses the way to create, store and publish research data in a way that they are findable, accessible, interoperable and reusable. In order to be FAIR, research data published should meet certain criteria described by the FAIR principles. Despite this, many research performing organisations and infrastructures are still reluctant to apply the FAIR principles and share their datasets due to real or perceived costs, including time investment and money.

To answer such concerns, this report formulates 36 policy recommendations on cost-effective funding and business models to make the model of FAIR data sustainable. It provides evidence to decision makers on setting up short and long-term actions pertinent to the practical implementation of FAIR principles….”

A conceptual peer review model for arXiv and other preprint databases – Wang – 2019 – Learned Publishing – Wiley Online Library

Abstract:  A global survey conducted by arXiv in 2016 showed that 58% of arXiv users thought arXiv should have a peer review system. The current opinion is that arXiv should adopt the Community Peer Review model. This paper evaluates and identifies two weak points of Community Peer Review and proposes a new peer review model – Self-Organizing Peer Review. We propose a model in which automated methods of matching reviewers to articles and ranking both users and articles can be implemented. In addition, we suggest a strategic plan to increase recognition of articles in preprint databases within academic circles so that second generation preprint databases can achieve faster and cheaper publication.
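The abstract does not specify how the automated reviewer–article matching would work; a minimal hypothetical sketch, assuming a simple bag-of-words cosine similarity between an article abstract and reviewer expertise profiles (all names and profile texts below are invented for illustration):

```python
from collections import Counter
from math import sqrt


def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(count * b[term] for term, count in a.items())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def match_reviewers(abstract: str, reviewer_profiles: dict[str, str], k: int = 2) -> list[str]:
    """Rank reviewers by textual similarity of their expertise profile to the article."""
    article_vec = Counter(abstract.lower().split())
    scores = {
        name: cosine_similarity(article_vec, Counter(text.lower().split()))
        for name, text in reviewer_profiles.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]


# Hypothetical reviewer pool (not from the paper):
profiles = {
    "alice": "deep learning neural networks image classification",
    "bob": "quantum field theory lattice gauge simulations",
    "carol": "convolutional neural networks for medical image analysis",
}
print(match_reviewers("neural networks for image segmentation", profiles))
# → ['carol', 'alice']
```

A production system along these lines would need richer representations (e.g. learned embeddings) and conflict-of-interest filtering, but the core matching step reduces to a similarity ranking of this kind.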

Reproducibility and Transparency by Design | Molecular & Cellular Proteomics

Abstract:  To truly achieve reproducible research, having reproducible analytics must be a principal research goal. Biological discovery is not the only deliverable; reproducibility is an essential part of our research.

[From the body of the paper:] “As mandated data sharing resolves a portion of the overall transparency/reproducibility challenge, the unaddressed issue remains the sharing of analyses….” 

Proposal for a Standard Article Metrics Dashboard to Replace the Journal Impact Factor

Abstract:  This paper proposes the creation of a dashboard consisting of five metrics that could be used to replace the journal impact factor. It should be especially useful in circumstances, like promotion and tenure committees, where the evaluators do not share the authors' subject expertise and where they are working under time constraints.

Implementing the FAIR Data Principles in precision oncology: review of supporting initiatives | Briefings in Bioinformatics | Oxford Academic

Abstract:  Compelling research has recently shown that cancer is so heterogeneous that single research centres cannot produce enough data to fit prognostic and predictive models of sufficient accuracy. Data sharing in precision oncology is therefore of utmost importance. The Findable, Accessible, Interoperable and Reusable (FAIR) Data Principles have been developed to define good practices in data sharing. Motivated by the ambition of applying the FAIR Data Principles to our own clinical precision oncology implementations and research, we have performed a systematic literature review of potentially relevant initiatives. For clinical data, we suggest using the Genomic Data Commons model as a reference as it provides a field-tested and well-documented solution. Regarding classification of diagnosis, morphology and topography and drugs, we chose to follow the World Health Organization standards, i.e. ICD-10, ICD-O-3 and Anatomical Therapeutic Chemical classifications, respectively. For the bioinformatics pipeline, the Genome Analysis ToolKit Best Practices using Docker containers offer a coherent solution and have therefore been selected. Regarding the naming of variants, we follow the Human Genome Variation Society’s standard. For the IT infrastructure, we have built a centralized solution to participate in data sharing through federated solutions such as the Beacon Networks.

Canadian copyright report: Let’s wait and see how upload filters and press publishers rights will fail. – International Communia Association

“Last week the Canadian Parliament’s Standing Committee on Industry, Science and Technology (INDU) released a report with 36 recommendations to reform Canadian copyright law. Under Canadian law the committee is required to review the Canadian copyright statutes every five years and the report presented now is the outcome of such a review. While this means that it is relatively unlikely that many of the recommendations contained in the report will result in immediate legislative actions (the government is not required to act on them) the report is nevertheless interesting as it contains a number of recommendations that go in the opposite direction of the changes that the DSM directive will bring to copyright in the European Union (for a full overview of the recommendations see Michael Geist’s summary).

After a year-long study that includes a public consultation and a number of committee hearings on a wide variety of issues, the INDU committee has come to the conclusion that there is a lack of evidence for both a DSM-style press publishers right and for changes to the liability position of platform intermediaries as foreseen in Article 17 of the DSM directive. While Canadian rightsholders argued for the necessity of such interventions, they failed to convince the committee of the merits for these provisions….”

The Authoritative Canadian Copyright Review: Industry Committee Issues Balanced, Forward-Looking Report on the Future of Canadian Copyright Law – Michael Geist

“In December 2017, the government launched its copyright review with a Parliamentary motion to send the review to the Standing Committee on Industry, Science and Technology. After months of study and hundreds of witnesses and briefs, the committee released the authoritative review with 36 recommendations that include expanding fair dealing, a rejection of a site blocking system, and a rejection of proposals to exclude education from fair dealing where a licence is otherwise available. The report represents a near-total repudiation of the one-sided Canadian Heritage report that was tasked with studying remuneration models to assist the actual copyright review. While virtually all stakeholders will find aspects they agree or disagree with, that is the hallmark of a more balanced approach to copyright reform.

This post highlights some of the most notable recommendations in the report that are likely to serve as the starting point for any future copyright reform efforts. There is a lot here but the key takeaways on the committee recommendations:

  • expansion of fair dealing by making the current list of fair dealing purposes illustrative rather than exhaustive (the “such as” approach)
  • rejection of new limits on educational fair dealing with further study in three years
  • retention of existing Internet safe harbour rules
  • rejection of the FairPlay site blocking proposal with insistence that any blocking include court oversight
  • expansion of the anti-circumvention rules by permitting circumvention of digital locks for purposes that are lawful (i.e. permit circumvention to exercise fair dealing rights)
  • extend the term of copyright only if ratifying the USMCA and include a registration requirement for the additional 20 years
  • implement a new informational analysis exception
  • further study of statutory damages for all copyright collectives along with greater transparency
  • adoption of an open licence rather than the abolition of crown copyright….”

The OER Starter Kit – Simple Book Publishing

“This starter kit has been created to provide instructors with an introduction to the use and creation of open educational resources (OER). The text is broken into five sections: Getting Started, Copyright, Finding OER, Teaching with OER, and Creating OER. Although some chapters contain more advanced content, the starter kit is primarily intended for users who are entirely new to Open Education….”

Libraries in a computational age | Feral Librarian

“Openness is also a very important part of our culture and a widely shared value at MIT. We are one of the few private universities in the US with an open campus, including libraries that are open to all visitors. We are also committed to openly sharing our educational and research materials with the world.

MIT created Open Courseware in 2000, “a simple but bold idea that MIT should publish all of our course materials online and make them widely available to everyone.” To date Open Courseware has over 2 million visitors/month, and hosts 2400 courses.

In 2009, MIT passed one of the first campus-wide open access policies in the US, passed by a unanimous vote of the faculty. MIT turned to the libraries to implement the policy, and because of a commitment to provide adequate staffing and resources to collecting faculty research, we now share 45% of MIT faculty journal articles written since 2009 openly with the world through our OA repository….

The first conclusion was that although the initial digital turn in libraries was not yet complete, we were already on the cusp of a second, potentially more profound one. The first, original digital shift in libraries was print to digital plus print, and was brought about by the internet, google, and e-books/journals….

Although this was a HUGE shift, it did not open up access to scholarly content the way many of us hoped it would. In large part because of the market power of many large commercial publishers, the advent of online journals did not democratize access to knowledge, and the potential for the rise of the internet and of online information and scholarship to create information equality has been stunted. Nonetheless, the first digital turn in libraries and scholarly communication did make research and reading arguably more efficient for those who had access….

In describing the next evolution of libraries, the MIT future of libraries task force emphasized not only the technological shift, but also the importance of combining this shift with a renewed commitment to open science and open scholarship. What is the next shift? It is an evolution of libraries from service to platform: not just digital and physical, but also computational….”

A single, open access journal may prevent the primary publishing problems in the life sciences: Accountability in Research: Vol 0, No 0

Abstract:  Herein, we discuss a novel way to knit current life sciences publishing structures together under the scope of a single life science journal that would countermand many of the issues faced in current publishing paradigms. Such issues include, but are not limited to, publication fees, subscription fees, impact factor, and publishing in more “glamorous” journals for career health. We envision a process flow involving (i) a single, overall, life sciences journal, (ii) divided into sections headed by learned societies, (iii) to whom all scientific papers are submitted for peer review, and (iv) all accepted scientific literature would be published open access and without author publication fees. With such a structure, journal fees, the merit system of science, and unethical aspects of open access would be reformed for the better. Importantly, such a journal could leverage existing online platforms; that is to say, it is conceptually feasible. We conclude that wholly inclusive publishing paradigms can be possible. A single, open access, online, life sciences journal could solve the myriad problems associated with current publishing paradigms and would be feasible to implement.