“The Passport For Open Science is a guide designed to accompany PhD students at every step of their research career, whatever their disciplinary field. It provides a set of tools and good practices that can be directly implemented….”
Abstract: Open science (OS) is of paramount importance for the improvement of science worldwide and across research fields. Recent years have witnessed a transition toward open and transparent scientific practices, but there is still a long way to go. Early career researchers (ECRs) are of crucial relevance in the process of steering toward the standardization of OS practices, as they will become the future decision makers of the institutional change that necessarily accompanies this transition. Thus, it is imperative to gain insight into where ECRs stand on OS practices. Under this premise, the Open Science group of the Max Planck PhDnet designed and conducted an online survey to assess the stance toward OS practices of doctoral candidates from the Max Planck Society. As one of the leading scientific institutions for basic research worldwide, the Max Planck Society provides a considerable population of researchers from multiple scientific fields, grouped into three sections: biomedical sciences; chemistry, physics, and technology; and human and social sciences. From an approximate total population of 5,100 doctoral candidates affiliated with the Max Planck Society, the survey collected responses from 568 doctoral candidates. The survey assessed self-reported knowledge, attitudes, and implementation of different OS practices, namely, open access publications, open data, preregistrations, registered reports, and replication studies. ECRs seemed to hold a generally positive view toward these different practices and to be interested in learning more about them. Furthermore, we found that ECRs’ knowledge and positive attitudes predicted the extent to which they implemented these OS practices, although past levels of implementation were rather low. We observed differences and similarities between scientific sections.
We discuss these differences in terms of need and feasibility to apply these OS practices in specific scientific fields, but additionally in relation to the incentive systems that shape scientific communities. Lastly, we discuss the implications that these results can have for the training and career advancement of ECRs, and ultimately, for the consolidation of OS practices.
Abstract: This study investigates the attitudes of Chinese PhD students toward predatory journals. Data were gathered using an online questionnaire to which 332 Chinese PhD students responded. Our main conclusions are that 1) in the sciences, technology, and medicine, respondents frequently confused predatory journals with open access journals; 2) in the humanities and social sciences, the respondents identified only Chinese-language (not English-language) journals as predatory and made a number of misidentifications; and 3) most respondents indicated that they would not submit papers to predatory journals, mainly because doing so would hurt their reputation, yet the minority who were willing to do so mentioned easy acceptance and a short wait time for publication as the top reasons for considering it.
“Research is relatively new in many countries in Asia, Africa and Latin America. Across these regions, young scientists are working to build practices for open science from the ground up. The aim is that scientific communities will incorporate these principles as they grow. But these communities’ needs differ from those that are part of mature research systems. So, rather than shifting and shaping established systems, scientists are endeavouring to design new ones….”
“A survey of nearly 1,000 academic researchers in South Africa suggests that the majority are in favour of keeping a government scheme that offers cash rewards for publishing research papers in accredited journals, even though they agree that this can promote unethical practices.
The publication-incentive programme — which awards payments when researchers publish journal articles, conference proceedings and book chapters — is the country’s largest single pool of research funding, worth an estimated 2.4 billion South African rand (US$160 million) each year.
Under the scheme, researchers can receive about 120,000 rand per published article. The subsidies were initially implemented in 2005 to drive academic output, and it worked: South Africa’s overall research output rose from 4,063 articles in 2005 to 25,371 in 2018….”
“Now, thanks to support from the Andrew W. Mellon Foundation, Smarthistory is able to offer thirty additional $1,000 honoraria to emerging scholars who have suffered financial hardship due to the pandemic.
These honoraria are available for the successful publication, on Smarthistory, of a short, accessible essay of general interest and in the scholar’s area of specialization (the topic will be determined in consultation with the editors at Smarthistory), and are open to active Ph.D. students who are ABD, as well as those who have earned a Ph.D. in art history within the past two years. Smarthistory essays are aimed at non-specialist, undergraduate learners….
Authors will retain intellectual property rights to their work and will grant the right for Smarthistory to publish the resulting essay with a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License across all of its channels including Smarthistory.org and Khanacademy.org. Essays must be submitted before March 1, 2020. The acceptance of essays and the awarding of honoraria will be at the sole discretion of Smarthistory….”
Abstract: The pursuit of simple, yet fair, unbiased, and objective measures of researcher performance has occupied bibliometricians and the research community as a whole for decades. However, despite the diversity of available metrics, most are either complex to calculate or not readily applied in the most common assessment exercises (e.g., grant assessment, job applications). The ubiquity of metrics like the h-index (h papers with at least h citations) and its time-corrected variant, the m-quotient (h-index ÷ number of years publishing) therefore reflect the ease of use rather than their capacity to differentiate researchers fairly among disciplines, career stage, or gender. We address this problem here by defining an easily calculated index based on publicly available citation data (Google Scholar) that corrects for most biases and allows assessors to compare researchers at any stage of their career and from any discipline on the same scale. Our ??-index violates fewer statistical assumptions relative to other metrics when comparing groups of researchers, and can be easily modified to remove inherent gender biases in citation data. We demonstrate the utility of the ??-index using a sample of 480 researchers with Google Scholar profiles, stratified evenly into eight disciplines (archaeology, chemistry, ecology, evolution and development, geology, microbiology, ophthalmology, palaeontology), three career stages (early, mid-, late-career), and two genders. We advocate the use of the ??-index whenever assessors must compare research performance among researchers of different backgrounds, but emphasise that no single index should be used exclusively to rank researcher capability.
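For reference, the two baseline metrics this abstract defines (the h-index and the m-quotient) can be computed directly from a list of per-paper citation counts. The sketch below implements only those standard definitions, not the paper's proposed index; the function names and sample citation counts are illustrative, not taken from the paper.

```python
def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

def m_quotient(citations, years_publishing):
    """m-quotient: h-index divided by the number of years publishing."""
    return h_index(citations) / years_publishing

# Illustrative example: five papers with these citation counts.
print(h_index([10, 8, 5, 4, 3]))        # 4 (four papers have >= 4 citations)
print(m_quotient([10, 8, 5, 4, 3], 8))  # 0.5 after eight years publishing
```

The division by career length is what makes the m-quotient "time-corrected": two researchers with the same h-index but different career lengths receive different scores.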
“What does research integrity mean in an ideal open science ecosystem and how can libraries contribute to heighten professional ethics and standards required by open science? The sixth session of the OCLC/LIBER Open Science Discussion series brought together a small group of engaged participants focusing on these questions….”
“The European Research Council (ERC) has withdrawn its support for a radical open-access initiative in Europe, known as Plan S, saying that it will follow its own path towards open access….
While the council is “still committed to implementing full and immediate open access”, it states that it wants to focus more on researchers’ needs – especially early-career researchers – as well as preserving equity among European countries, particularly those with more limited national financial support for research.
The main sticking point for the ERC over Plan S was cOAlition S’s stance on so-called “hybrid” journals….”