“Because of the low job security in the early stages of an academic career, early career researchers may be negatively affected by Plan S. Plan S currently involves 14 national funding agencies (including India, which announced its participation on January 12th) and draws support from large private funders such as the Wellcome Trust and the Bill & Melinda Gates Foundation. Combined, these funders represent no more than 15% of the available research money in the world.
This relatively small market share could hurt young researchers dependent on Plan S funders, as they will not be allowed to publish in some prestigious but closed-access journals. If researchers funded by other agencies can still put these publications on their CVs, they will have an unfair advantage on the academic labour market. Only when Plan S or similar initiatives cover a critical mass of the world’s research output will the playing field be levelled….”
Abstract: Sharing research data is an increasingly necessary requirement for the advancement of science. The goal of this paper is twofold: first, to analyze the policies on openness in sharing scientific research data in a sample of pediatric journals and to determine whether there is any correlation with a journal’s impact factor; second, to determine whether these policies changed between 2013 and 2016. Journals included in the Pediatrics area of the Journal Citation Reports were used for the analysis, with reference to the instructions to authors published on the journals’ websites; these instructions were reviewed in 2012 and in 2016. Most pediatric journals recommend that authors deposit their data in a repository, but they do not provide clear instructions for doing so. No correlation was found between the value of a journal’s impact factor and the availability of open data. Policies regarding deposit in specific repositories vary among publishing entities, with PubMed Central and various clinical trial repositories being those primarily suggested for deposit.
“Open Access publishing is more widespread in Latin America than in any other region of the world, and continues to grow. We sat down with CLACSO’s Open Access Advisor Dominique Babini to find out why….”
“The National Library of Luxembourg has a consortial section which is planning to transition its agreements to the new realities of Open Access. The past years have been spent setting up an integrated administrative structure that allows for flexible shifting of costs and devising entirely new models. We envision the transition playing out on two levels:
1. Our subscriptions contain more and more Open Access content, so the subscription costs should fall proportionally: the “Transition credit”;
2. Our consortial partners pay increasing amounts for Open Access publications; these costs should be covered by the savings on the subscription side.
Goals for transition agreements:
1. Transparent and sustainable for both publishers and libraries
2. Long-term commitment (3-5 years)
3. Data-driven…”
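The two-level offsetting logic described above can be sketched as a small calculation. This is a hypothetical illustration, not the library's actual formula; all figures and the proportional-reduction assumption are invented for the example:

```python
def transition_costs(subscription_fee, oa_share, apc_spend):
    """Sketch of a two-level transition model (hypothetical numbers).

    subscription_fee: current annual subscription cost
    oa_share: fraction of subscribed content that is now Open Access
    apc_spend: annual consortial spend on Open Access publication fees
    """
    # Level 1: the "Transition credit" -- subscription cost falls in
    # proportion to the Open Access share of the content.
    credit = subscription_fee * oa_share
    reduced_subscription = subscription_fee - credit

    # Level 2: Open Access publication costs should be covered by the
    # savings on the subscription side; only the excess (if any) is paid.
    net_total = reduced_subscription + max(apc_spend - credit, 0)
    return credit, reduced_subscription, net_total

# Example: 100k subscription, 25% OA content, 15k in OA publication fees.
credit, reduced, net = transition_costs(100_000, 0.25, 15_000)
# credit = 25000.0; the 15k OA spend is fully covered by the credit,
# so the consortium's net total stays at the reduced subscription.
```

The `max(..., 0)` caps the offset so that a credit larger than the OA spend reduces the subscription bill but does not produce a payout.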
“Researchers posted more preprints to the bioRxiv server in 2018 alone than in the four previous years, according to an analysis of the 37,648 preprints posted on the site in its first 5 years.
The analysis also shows that the number of downloads from the site has topped 1 million per month. BioRxiv, which allows researchers in the life sciences to post preliminary versions of studies, turned five last November….
Preprints that are downloaded more often on bioRxiv tend to be published in journals with higher impact factors than preprints that are not downloaded as much….”
Abstract: This article looks at whether there is evidence to support two prevailing assumptions about open access (OA). These assumptions are: (1) fully OA journals are inherently of poorer quality than journals supported by other business models and (2) the OA business model, that is, paying for publication, is more ‘competitive’ than the subscription journal access business model. The assumptions have been discussed in contemporary industry venues, and we have encountered them in the course of our work advising scholarly communications organizations. Our objective was to apply data analytics techniques to see if these assumptions bore scrutiny. By combining citation-based impact scores with data from publishers’ price lists, we were able to look for relationships between business model, price, and ‘quality’ across several thousand journals. We found no evidence suggesting that OA journals suffer significant quality issues compared with non-OA journals. Furthermore, authors do not appear to ‘shop around’ based on OA price.
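The kind of price-versus-quality check the authors describe can be sketched with toy data. The journal fees and impact scores below are invented for illustration; the paper's actual dataset and scoring method are not reproduced here:

```python
# Toy Pearson correlation between listed APC price and a citation-based
# impact score, illustrating the price/"quality" relationship check.
# All numbers are hypothetical.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

apc_usd = [1500, 2500, 980, 3000, 1200]  # listed publication fees
impact  = [2.1, 3.4, 1.8, 2.9, 2.6]      # citation-based scores
r = pearson(apc_usd, impact)
# A weak |r| across a large sample would echo the paper's finding that
# authors do not appear to "shop around" on OA price.
```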
“The current dissatisfaction with scientific publishers is an obvious issue, with practices such as ‘double dipping’ and services such as preprint servers prompting the community to ask: what exactly is the added value of a traditional publisher? From our point of view, as a fully gold open access publisher and technology provider, there is an urgent need for publishers to demonstrate transparency when it comes to forming their pricing policy, alongside a strong will to develop and adapt technologically together with the needs of the community. This is exactly why we have projects such as the Research Ideas and Outcomes (RIO) journal (www.riojournal.com) in our portfolio. RIO demonstrates our will to work towards true open science, where outputs from across the full research cycle are published alongside the final peer-reviewed research article. These include research proposals, data, methods, negative results, presentation abstracts, and software descriptions, all in a single research project collection. Opening the research cycle in this way not only stimulates re-use and helps researchers avoid duplication of work, but can also promote collaboration and interdisciplinarity. The real value of publishing in RIO also comes from the fact that upon publication all these outputs become citable and discoverable. This functionality is, in fact, enabled on all our journals thanks to our publishing platform ARPHA, which is specifically developed to provide a high level of automation and technologically advanced workflows, not only in terms of dissemination and archiving, but also for semantic enhancements of published content and integrations with the industry’s leading service providers. …”
Abstract: This paper examines issues relating to the perceptions and adoption of open access (OA) and institutional repositories. Using a survey research design, we collected data from academics and other researchers in the humanities, arts and social sciences (HASS) at a university in Australia. We looked at factors influencing choice of publishers and journal outlets, as well as the use of social media and nontraditional channels for scholarly communication. Data were collected via an online questionnaire and analysed using descriptive statistics. Our findings suggest that researchers are highly influenced by traditional measures of quality, such as journal impact factor, and are less concerned with making their work more findable and promoting it through social media. This highlights a disconnect between researchers’ desired outcomes and the effort they put in to achieve them. Our findings also suggest that institutional policies have the potential to increase OA awareness and adoption. This study contributes to the growing literature on scholarly communication by offering evidence from the HASS field, where limited studies have been conducted. Based on the findings, we recommend that academic librarians engage with faculty through outreach and workshops to change perceptions of OA and the institutional repository.
Abstract: The use of bibliometric measures in the evaluation of research has increased considerably based on expertise from the growing research field of evaluative citation analysis (ECA). However, mounting criticism of such metrics suggests that the professionalization of bibliometric expertise remains contested. This paper investigates why impact metrics, such as the journal impact factor and the h-index, proliferate even though their legitimacy as a means of professional research assessment is questioned. Our analysis is informed by two relevant sociological theories: Andrew Abbott’s theory of professions and Richard Whitley’s theory of scientific work. These complementary concepts are connected in order to demonstrate that ECA has so far failed to provide scientific authority for professional research assessment. This argument is based on an empirical investigation of the extent of reputational control in the relevant research area. Using three measures of reputational control that are computed from longitudinal inter-organizational networks in ECA (1972–2016), we show that peripheral and isolated actors contribute the same number of novel bibliometric indicators as central actors. In addition, the share of newcomers to the academic sector has remained high. These findings demonstrate that recent methodological debates in ECA have not been accompanied by the formation of an intellectual field in the sociological sense of a reputational organization. Therefore, we conclude that a growing gap exists between an academic sector with little capacity for collective action and increasing demand for routine performance assessment by research organizations and funding agencies. This gap has been filled by database providers. By selecting and distributing research metrics, these commercial providers have gained a powerful role in defining de facto standards of research excellence without being challenged by expert authority.
“Research evaluation has become routine and often relies on metrics. But it is increasingly driven by data and not by expert judgement. As a result, the procedures that were designed to increase the quality of research are now threatening to damage the scientific system. To support researchers and managers, five experts led by Diana Hicks, professor in the School of Public Policy at Georgia Institute of Technology, and Paul Wouters, director of CWTS at Leiden University, have proposed 10 principles for the measurement of research performance: the Leiden Manifesto for Research Metrics published as a comment in Nature….”