“Nothing (and in particular no semi-automatized pseudo-scientific evaluation that involves numbers or data) can replace evaluation by an individual who actually understands what he/she is evaluating. Furthermore, tools such as impact factors are clearly not helpful or relevant in the context of mathematical research….”
“A partnership of the American Folklore Society and the Indiana University Libraries, Open Folklore is a scholarly resource devoted to increasing the number and variety of open access resources, published and unpublished, that are available for the field of folklore studies and the communities with whom folklore scholars partner….”
“It is not easy to have a paper published in the Lancet, so Wakefield’s paper presumably underwent a stringent process of peer review. As a result, it received a very strong endorsement from the scientific community. This gave a huge impetus to anti-vaccination campaigners and may well have led to hundreds of preventable deaths. By contrast, the two mathematics preprints were not peer reviewed, but that did not stop the correctness or otherwise of their claims being satisfactorily established.
An obvious objection to that last sentence is that the mathematics preprints were in fact peer-reviewed. They may not have been sent to referees by the editor of a journal, but they certainly were carefully scrutinized by peers of the authors. So to avoid any confusion, let me use the phrase “formal peer review” for the kind that is organized by a journal and “informal peer review” for the less official scrutiny that is carried out whenever an academic reads an article and comes to some sort of judgement on it. My aim here is to question whether we need formal peer review. It goes without saying that peer review in some form is essential, but it is much less obvious that it needs to be organized in the way it usually is today, or even that it needs to be organized at all.
What would the world be like without formal peer review? One can get some idea by looking at what the world is already like for many mathematicians. These days, the arXiv is how we disseminate our work, and the arXiv is how we establish priority. A typical pattern is to post a preprint to the arXiv, wait for feedback from other mathematicians who might be interested, post a revised version of the preprint, and send the revised version to a journal. The time between submitting a paper to a journal and its appearing is often a year or two, so by the time it appears in print, it has already been thoroughly assimilated. Furthermore, looking a paper up on the arXiv is much simpler than grappling with most journal websites, so even after publication it is often the arXiv preprint that is read and not the journal’s formatted version. Thus, in mathematics at least, journals have become almost irrelevant: their main purpose is to provide a stamp of approval, and even then one that gives only an imprecise and unreliable indication of how good a paper actually is….
An alternative system would almost certainly not be perfect, but to insist on perfection, given the imperfections of the current system, is nothing but status quo bias. To guard against this, imagine that an alternative system were fully established and see whether you can mount a convincing argument for switching to what we have now, where all the valuable commentary would be hidden away and we would have to pay large sums of money to read each other’s writings. You would be laughed out of court.”
“In early 2016, the Office for Scholarly Communication (OSC) launched a pilot project to recruit help from around the university to deposit faculty-authored articles in DASH, Harvard’s open-access repository. This project has the full support of the Harvard Library. In January of this year, the project emerged from the pilot phase, and was officially renamed the Distributed DASH Deposits program, or D3. All Harvard schools have made a start with D3, and the next goal is to scale up.”
“The Harvard-China Project on Energy, Economy and Environment is pleased to announce that the Project’s faculty, researchers, and staff have adopted an open-access policy. They unanimously endorsed the policy on September 21, 2017 to grant Harvard a nonexclusive and worldwide right to distribute ‘the fruits of [their] research and scholarship as widely as possible.’”
“In deciding where to publish our research, we have to consider why we do research. While some of us would probably undertake research for the intellectual challenge or excitement of discovery alone, for many of us it is important that our research will impact society in some way. This may be from contributing to the advance of our scientific discipline, or through the use of our research by the public, policymakers or industry. For all of these to come to pass, there is a basic premise that our publications can be found and accessed by those who can make use of the information they contain. Hence one of the key decisions around choice of where to publish is to think of the audience that reads the journal, and whether to make your paper Open Access.”
“This link to an open access version of an article comes through oadoi.org, which indexes millions of articles and delivers open-access full-text versions over an open API. The MIT Libraries are excited to offer this new path to access scholarly content. oaDOI is a contribution to an open access infrastructure that, by taking readers to versions of articles that are not behind paywalls, supports MIT’s aim of democratizing access to information. …”
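The quoted passage describes oaDOI as a service that resolves a DOI to an open-access copy over an open API. As a rough illustration of how a client might form such a lookup, here is a minimal sketch; the endpoint path and version shown are assumptions based on the description above, not the service's documented interface (consult oadoi.org, now Unpaywall, for the current API).

```python
from urllib.parse import quote

# Hypothetical helper: build an oaDOI-style lookup URL for a given DOI.
# The base URL below is an assumed example, not the documented endpoint.
def oadoi_url(doi: str, base: str = "https://api.oadoi.org/v2/") -> str:
    # DOIs may contain characters that need percent-encoding;
    # keep "/" intact since it separates the prefix and suffix.
    return base + quote(doi, safe="/")

print(oadoi_url("10.1000/xyz123"))
# https://api.oadoi.org/v2/10.1000/xyz123
```

In practice a client would issue an HTTP GET against such a URL and read the open-access location from the JSON response; the field names vary by API version, so they are omitted here.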
Selections from David A. Pendlebury’s blog articles – ‘Easing the Path to Access OA Content for Researchers – Part 1’ and ‘That was then but this is now… #OAWeek – Part 2’
David Pendlebury presents a look into the future of Open Access discovery, and how Clarivate Analytics is leading the way.
‘This week marks the 10th anniversary of ‘International Open Access Week,’ and this year’s observance carries a thought-provoking theme: “Open in order to….” This is in itself an open-ended statement which urges us all to focus on what Open Access (OA) really enables, and how we can further enable OA.
Clarivate Analytics is not a publisher, but we are a valued and long-standing part of the scholarly ecosystem, partnering with scholars and authors, with publishers, and with editors and reviewers: all the hats a researcher wears.
At a time when debate is raging in Europe about OA and the costs associated with access to scholarly research, we are proud to be easing the path to access OA content for researchers.’
‘Clarivate Analytics, which now produces Garfield’s Web of Science, carries his legacy forward and continues to be aligned first and foremost with the interests of researchers.’
‘Like Eugene Garfield, we value what scientists and scholars do and we honor their commitment and dedication to their institutions and their colleagues. A vital research activity that has been undervalued historically is peer review, a truly unselfish and “under-rewarded” service.
That is why, earlier this year, we acquired Publons, a pioneering enterprise that will document and provide data on the review activities of researchers. We also want to enable them to validate their contribution to the quality of published content through their reviewer activities.’
To learn more about how Clarivate Analytics is maximizing the free availability of new research findings and other scholarly output, please follow this link to download our white paper on ‘Opening the way to Open Access’.
If you wish to read the full articles, you will see what is in store for the future of open access discovery, along with links to other valuable sources of information on open access – ‘Easing the Path to Access OA Content for Researchers – Part 1’ and ‘That was then but this is now… #OAWeek – Part 2’
Happy International Open Access Week!
“The Science International Accord on ‘Open Data in a Big Data World’ presents an inclusive vision of the need for and the benefits of Open Data for science internationally, and in particular for Lower and Middle Income Countries. A major outcome is the African Open Science Platform initiative, supported by the South African Department of Science and Technology, directed by CODATA and implemented by ASSAf.
The development of an open science and innovation platform depends not only on the physical infrastructure for acquiring, curating and disseminating data and information, but also on protocols, policies and procedures in the science system that provide the structure and support to ensure that science objectives are achieved….”