In support of journal-agnostic review | ASAPbio

“It’s easy to dismiss a paper when we think we know what the journal wants. In contrast, I found time and again that when a journal is new (and I’ve helped launch several journals, including Molecular Cell, Developmental Cell, PLOS Biology, and Disease Models & Mechanisms), peer reviews would take on a different quality. Reviewers would write something like, “I don’t know how to advise you because I don’t know what your standards are or what you want in the journal, so I guess I’ll just tell you the strengths and weaknesses of the paper and you will have to decide.” (Thank you, this is all I ever wanted). At the same time, authors would remark that the reviews were unusually constructive – whether we chose to accept or reject the paper. These are anecdotal observations, and I would love to collect some data on the quality of review when you don’t know journal identity.

Back in my days as a journal editor, not many biologists were posting preprints, so my idea was to create a “journal blind” submission system during standard peer review. But with bioRxiv and other preprint servers, performing peer review in advance of journal publication becomes possible. I encourage the community to try it, gather some data, and proceed based on evidence.”

APPRAISE (A Post-Publication Review and Assessment In Science Experiment) | ASAPbio

“I describe here a new project – called Appraise – that is both a model and experimental platform for what peer review can and should look like in a world without journals….

The rise of preprints gives us the perfect opportunity to create a new system that takes full advantage of the Internet to more rapidly, effectively and fairly engage the scientific community in assessing the validity, audience and impact of published works….

APPRAISE (A Post-Publication Review and Assessment In Science Experiment)…

It is perhaps easiest to think of Appraise as an editorial board without a journal (and we hope to be a model for how existing editorial boards can transition away from journals). Like journal editorial boards, they will curate the scientific literature through the critical process of peer review. However, members of Appraise will not be reviewing papers submitted to a journal and deciding whether they should be published. Rather, Appraise reviewers are working in service of members of the scientific community, selecting papers they think warrant scrutiny and attention, and reviewing them to help others find, understand and assess published papers….

In the spirit of openness we encourage Appraise members to identify themselves, but recognize that the ability to speak freely sometimes requires anonymity. Appraise will allow members to post reviews anonymously provided that there are no conflicts of interest and the reviewer does not use anonymity as a shield for inappropriate behavior. Whether reviewers are publicly identified or not, Appraise will never tolerate personal attacks of any kind.

We are launching Appraise with a small group of scientists. This is for purely practical purposes – to develop our systems and practices without the challenges of managing a large, open community. But the goal is to as quickly as possible open the platform up to everyone.”

RFP: Review OER in Your Discipline – CUNY Teaching and Learning Center

“The Teaching and Learning Center and the Graduate Center Library invite individual or group proposals from CUNY Graduate Center students for literature reviews of OER in specific disciplines. We will pay $1000 per discipline, and can fund up to five distinct projects in the Spring 2018 semester.

Each project will result in a report of 1000-2000 words that evaluates Open Educational Resources in a specific discipline with attention to breadth and depth of coverage, inclusiveness of emergent voices and arguments, and appropriateness for deployment in an undergraduate classroom. Funded projects will be developed with support from staff at the Teaching and Learning Center and Graduate Center Library, and grantees will be expected to attend up to three meetings during the Spring semester to report on their progress. Final reports will be due June 1, 2018, and will be published in Summer 2018 on Visible Pedagogy.  

To apply, submit a single PDF by email to tlc@gc.cuny.edu by midnight on February 12, 2018….”

Perspectives on #OpenAccess During #OpenLearning17 hangout | Reflecting Allowed

“We just ended the first of two #OpenLearning17 hangouts, with Frances Bell, Chris Gilliard, Chris Friend and surprise guest, Peter Suber, whose book on Open Access we’ve been reading this week. The hangout was co-facilitated by Sue Erickson and myself, and I also invited folks from the community to participate, so Amy Nelson and Jim Luke joined us and enriched the discussion further. When putting together the guest list for this, I thought of reaching out to people with diverse approaches to openness, and I think while we all have a similar orientation towards openness and social justice, we definitely took different approaches to it in the hangout. From Chris Friend talking about openness in the Hybrid Pedagogy review process, to Frances Bell providing her perspective on open access over time, and offering critical questions (what Frances has to offer is so multi-faceted it’s difficult to summarize, honestly), and Chris Gilliard talking about digital redlining – and Peter Suber answering questions on different topics, but particularly giving his views on Gold Open Access that involves Article Processing Charges. …”

Phil Davis: Future of the OA Megajournal – The Scholarly Kitchen

“On June 1st, 2011, Peter Binfield, then publisher of PLOS ONE, made a bold and shocking prediction at the Society for Scholarly Publishing annual meeting: “I believe we have entered the era of the OA mega journal,” adding, “Some basic modeling predicts that in 2016, almost 50% of the STM literature could be published in approximately 100 mega journals…Content will rapidly concentrate into a small number of very large titles. Filtering based solely on Journal name will disappear and will be replaced with new metrics. The content currently being published in the universe of 25,000 journals will presumably start to dry up.” If you were not present for that pre-meeting workshop, you likely heard it repeated throughout the conference. The open access (OA) megajournal was taking over STM publishing and Binfield had data to prove it. PLOS ONE, which had received its first 2010 Impact Factor (4.351) the previous summer, was exploding with new submissions. In a few weeks, the journal would receive its second Impact Factor (4.411), a confirmation that its model was both wildly successful and dangerously competitive. PLOS had discovered the future of STM publishing and others had better get on board or get out of the way….”

Open and Shut?: Realising the BOAI vision: Peter Suber’s Advice

Peter Suber’s current high-priority recommendations for advancing open access.

The idea of an open-access evidence rack

“Here’s the idea in three steps.

First, identify the basic propositions in the field or sub-field you want to cover. To start small, identify the basic propositions you want to defend in a given article.

Second, create a separate OA web page for each proposition. For now, don’t worry about the file format or other technicalities. What’s important is that the pages should (1) be easy to update, (2) carry a time-stamp showing when they were last updated, and (3) give each proposition a unique URL. Let’s call them “proposition pages”.

Third, start filling in each page with the evidence in support of its proposition. If some evidence has been published in an article or book, then cite the publication. When the work is online (OA or TA), add a link as well. Whenever you can link directly to evidence, rather than merely to publications describing evidence, do that. For example, some propositions can be supported by linkable data in an open dataset. But because citations and data don’t always speak for themselves, consider adding some annotations to explain how cited pieces of evidence support the given proposition.

Each supporting study or piece of evidence should have an entry to itself. A proposition page should look more like a list than an article. It should look like a list of citations, annotated citations, or bullet points. It should look like a footnote, perhaps a very long footnote, for the good reason that one intended use of a proposition page is to be available for citation and review as a compendious, perpetually updated, public footnote. …”
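The three steps above could be sketched, purely for illustration, as a tiny page generator: one function that takes a proposition, a time-stamp, and a list of evidence entries, and emits the kind of time-stamped, list-shaped page the post describes. The proposition, citation, note, and URL below are invented placeholders, not content from the original post.

```python
from datetime import date

def render_proposition_page(proposition, evidence, updated):
    """Render one "proposition page": the claim as the heading, a
    last-updated stamp, and one list entry per piece of evidence,
    each linking directly to the cited work or dataset."""
    items = "\n".join(
        '  <li><a href="{url}">{citation}</a>{note}</li>'.format(
            url=entry["url"],
            citation=entry["citation"],
            # Optional annotation explaining how this evidence
            # supports the proposition.
            note=": " + entry["note"] if entry.get("note") else "",
        )
        for entry in evidence
    )
    return (
        "<h1>{p}</h1>\n"
        "<p>Last updated: {d}</p>\n"
        "<ul>\n{items}\n</ul>"
    ).format(p=proposition, d=updated.isoformat(), items=items)

# Hypothetical example entry: a single annotated citation.
page = render_proposition_page(
    "OA articles are cited more often than comparable TA articles",
    [
        {
            "citation": "Example study (2018)",
            "url": "https://example.org/study",
            "note": "links directly to the underlying open dataset",
        },
    ],
    date(2018, 2, 1),
)
print(page)
```

Because each page is a flat, annotated list with a stable URL and a visible time-stamp, it can be cited as the "perpetually updated, public footnote" the post envisions, and regenerated whenever new evidence is added.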