A new era for research publication: Will Open Access become the norm? – Hotta – Journal of Diabetes Investigation – Wiley Online Library

“This new challenge [Plan S] raises some concerns for us. This program is unlikely to be equivalent between Europe and the United States, because key US federal agencies, such as the National Institutes of Health (NIH), mandate a ‘green’ Open Access policy, whereby articles in subscription journals are automatically made available after a 12-month embargo. This policy protects the existing ‘paywalled’ subscription business model. Also, ‘Plan S’ does not allow scientists to publish their papers in hybrid journals….

One piece of bright news, however, is that Open Access publication fees would be covered by funders or research institutions, not by individual researchers. Although our journal is already Open Access, we have some concerns regarding the publication fee being covered by either researchers or institutions….”

Given that the publishing industry is approaching a new era in which 85% or more of journals are Open Access, it is necessary for us to develop a survival strategy for the fierce competition to come….

bioRxiv: the preprint server for biology | bioRxiv

Abstract:  The traditional publication process delays dissemination of new research, often by months, sometimes by years. Preprint servers decouple dissemination of research papers from their evaluation and certification by journals, allowing researchers to share work immediately, receive feedback from a much larger audience, and provide evidence of productivity long before formal publication. Launched in 2013 as a non-profit community service, the bioRxiv server has brought preprint practice to the life sciences and recently posted its 64,000th manuscript. The server now receives more than four million views per month and hosts papers spanning all areas of biology. Initially dominated by evolutionary biology, genetics/genomics and computational biology, bioRxiv has been increasingly populated by papers in neuroscience, cell and developmental biology, and many other fields. Changes in journal and funder policies that encourage preprint posting have helped drive adoption, as has the development of bioRxiv technologies that allow authors to transfer papers easily between the server and journals. A bioRxiv user survey found that 42% of authors post their preprints prior to journal submission whereas 37% post concurrently with journal submission. Authors are motivated by a desire to share work early; they value the feedback they receive, and very rarely experience any negative consequences of preprint posting. Rapid dissemination via bioRxiv is also encouraging new initiatives that experiment with the peer review process and the development of novel approaches to literature filtering and assessment.

Promoting openness – Research Professional News

“Of the potential solutions, open research practices are among the most promising. The argument is that transparency acts as an implicit quality control process. If others are able to scrutinise our work—not just the final published output, but the underlying data, code, and so on—researchers will be incentivised to ensure these are high quality.

So, if we think that research could benefit from improved quality control, and if we think that open research might have a role to play in this, why aren’t we all doing it? In a word: incentives….”

ASTRO Journals’ Data Sharing Policy and Recommended Best Practices – ClinicalKey

Abstract:  Transparency, openness, and reproducibility are important characteristics in scientific publishing. Although many researchers embrace these characteristics, data sharing has yet to become common practice. Nevertheless, data sharing is becoming an increasingly important topic among societies, publishers, researchers, patient advocates, and funders, especially as it pertains to data from clinical trials. In response, ASTRO developed a data policy and guide to best practices for authors submitting to its journals. ASTRO’s data sharing policy is that authors should indicate, in data availability statements, if the data are being shared and if so, how the data may be accessed.

 

Open access efforts begin to bloom: ESC Heart Failure gets full attention and first impact factor – Anker – 2019 – ESC Heart Failure – Wiley Online Library

Abstract:  In 2014, the Heart Failure Association (HFA) of the European Society of Cardiology (ESC) founded the first open access journal focusing on heart failure, called ESC Heart Failure (ESC-HF). In its first 5 years, ESC-HF published more than 450 articles. Through ESC-HF, the HFA gives room to heart failure research output from around the world. A transfer process from the European Journal of Heart Failure to ESC-HF has also been established. As a consequence, in 2018 ESC-HF received 289 submissions and published 148 items (acceptance rate 51%). The journal has been listed in Scopus since 2014 and on PubMed since 2015. In 2019, we received our first impact factor from ISI Web of Knowledge/Thomson Reuters, which is 3.407 for 2018. This report reviews which papers are cited most. Not surprisingly, many of the best-cited papers are reviews and facts & numbers mini-reviews, but original research is also well cited.

How journals are using overlay publishing models to facilitate equitable OA

“Preprint repositories have traditionally served as platforms to share copies of working papers prior to publication. But today they are being used for so much more, like posting datasets, archiving final versions of articles to make them Green Open Access, and another major development — publishing academic journals. Over the past 20 years, the concept of overlay publishing, or layering journals on top of existing repository platforms, has developed from a pilot project idea to a recognized and growing publishing model.

In the overlay publishing model, a journal performs refereeing services, but it doesn’t publish articles on its website. Rather, the journal’s website links to final article versions hosted on an online repository….”

“Research Data Management Among Life Sciences Faculty” by Kelly A. Johnson and Vicky Steeves

Abstract:  Objective: This paper aims to identify opportunities for librarians to assist faculty with research data management by examining practices and attitudes among life sciences faculty at a tier-one research university.

Methods: The authors issued a survey to estimate actual and perceived research data management needs of New York University (NYU) life sciences faculty in order to understand how the library could best contribute to the research life cycle.

Results: Survey responses indicate that over half of the respondents were aware of publisher and funder mandates, and most are willing to share their data, but many indicated they do not utilize data repositories. Respondents were largely unaware of data services available through the library, but the majority were open to considering such services. Survey results largely mimic those of similar studies, in that storing data (and the subsequent ability to share it) is the most easily recognized barrier to sound data management practices.

Conclusions: At NYU, as with other institutions, the library is not immediately recognized as a valuable partner in managing research output. This study suggests that faculty are largely unaware of, but open to, existing library services, indicating that immediate outreach efforts should be aimed at promoting them.

“Assessing usability of GTD supplementary files” by Steven Van Tuyl

Abstract:  Objectives: The objective of this study is to evaluate the quality and usability of supplementary data files deposited to our university institutional repository between 1971 and 2015. Understanding the extent to which content historically deposited in digital repositories is usable by today’s researchers can help inform digital preservation and documentation practices for researchers today.

Methods: I identified all graduate level theses and dissertations (GTDs) in the institutional repository with multiple files as a first pass at identifying documents that included supplementary data files. These GTDs were then individually examined, removing supplementary files that were artifacts of either the upload or digitization process. The remaining “true” supplementary files were then individually opened and evaluated following elements of the DATA rubric of Van Tuyl and Whitmire (2016).

Results: Supplementary files were discovered in the repository dating back to 1971 in 116 GTD submissions totalling more than 25,000 files. Most GTD submissions included fewer than 30 files, though some submissions included thousands of individual data files. The most common file types submitted include imagery, tabular data, and databases, with a very large number of unknown file types. Overall, levels of documentation were poor while actionability of datasets was generally middling.

Conclusions: The results presented in this study suggest that legacy data submitted to our institutional repository with GTDs is generally in poor shape with respect to Transparency, and somewhat less so for Actionability. It is clear from this study and others that researchers have a long road ahead when it comes to sharing data in a way that makes it potentially usable by other researchers.
