National Bike Month comes to a close

National Bike Month, which takes place every May, ends this week. To celebrate a month of cycling-focused activities, EveryONE is highlighting some recent cycling research published in PLOS ONE.

As biking becomes ever more popular and bike-sharing programs expand, as in New York City last weekend, cycling injuries and fatalities may increase as well. Although most people acknowledge the utility of wearing a helmet, encouraging cyclists to actually use one can be difficult. A study examining the efficacy of several helmet-promotion measures showed that attitudes about helmets making people “look ridiculous” or “old-fashioned” can be hard to counter. Even providing cyclists with a free helmet was only mildly successful in convincing non-helmet users to wear one. The most effective measures included pressure from family or friends, as well as shifting the safety dialogue from helmets as head and brain protection to promoting helmets as “face-protecting” devices. So folks, as you get on your bikes this weekend, protect your face and wear a helmet.

Aesthetics aside, competitive cyclists are frequently seeking ways to improve their performance and speed recovery. Compression sportswear, from sleeves to knee-high socks to shorts, is one current performance-enhancing trend. These products tout improved arterial blood flow from the compression as a way to increase speed, reduce chances of injury and shorten recovery time. A pair of compression cycling shorts can be quite expensive, though, so before you purchase your way to shorter race times, consider that recent research indicates claims of their efficacy may be overreaching. In fact, a study of athletes wearing compression shorts showed that blood flow to the muscle actually decreased, contradicting many of these sportswear companies’ claims. Getting faster may just require more time in the saddle.

Theft of bikes is a persistent issue facing casual and competitive cyclists alike, but there’s some good news on this front: recent research showed a relatively simple deterrent to be surprisingly effective. The study found a 62% decrease in bicycle theft in locations where an ominous sign showing a person’s eyes and the words “Cycle Thieves We Are Watching You” (below) was posted above the bike racks. Theft in locations without these posters rose.


For more research on bikes and cycling performance, visit PLOS ONE.

Citation:  Constant A, Messiah A, Felonneau M-L, Lagarde E (2012) Investigating Helmet Promotion for Cyclists: Results from a Randomised Study with Observation of Behaviour, Using a Semi-Automatic Video System. PLoS ONE 7(2): e31651. doi:10.1371/journal.pone.0031651

Sperlich B, Born D-P, Kaskinoro K, Kalliokoski KK, Laaksonen MS (2013) Squeezing the Muscle: Compression Clothing and Muscle Metabolism during Recovery from High Intensity Exercise. PLoS ONE 8(4): e60923. doi:10.1371/journal.pone.0060923

Nettle D, Nott K, Bateson M (2012) ‘Cycle Thieves, We Are Watching You’: Impact of a Simple Signage Intervention against Bicycle Theft. PLoS ONE 7(12): e51738. doi:10.1371/journal.pone.0051738

Image Credits: cyclist by jesse.millan, poster from pone.0051738

Brain and Behavior: Issue 3 Highlights and Introducing Altmetric!

Brain and Behavior’s latest issue brings together research on increased risk of anxiety from cigarette smoking, the effects of bisphenol A on social behavior in mice, and how the brain responds to cognitive load. The cover features an image from “Neuroanatomical and neuropharmacological approaches to postictal antinociception-related prosencephalic neurons: the role of muscarinic and nicotinic cholinergic receptors” by Renato Leonardo de Freitas, Luana Iacovelo Bolognesi, André Twardowschy, Fernando Morgan Aguiar Corrêa, Nicola R. Sibson, and Norberto Cysne Coimbra.

We’re also excited to announce that Brain and Behavior is participating in a pilot program offered by the service Altmetric to offer authors and users alternative metrics for articles and datasets to measure their impact on both traditional and social media. 

Below is another highlight chosen by the editorial team. 

Crucifixion and median neuropathy
By Jacqueline M. Regan, Kiarash Shahlaie, Joseph C. Watson
Abstract: Crucifixion as a means of torture and execution was first developed in the 6th century B.C. and remained popular for over 1000 years. Details of the practice, which claimed hundreds of thousands of lives, have intrigued scholars as historical records and archaeological findings from the era are limited. As a result, various aspects of crucifixion, including the type of crosses used, methods of securing victims to crosses, the length of time victims survived on the cross, and the exact mechanisms of death, remain topics of debate. One aspect of crucifixion not previously explored in detail is the characteristic hand posture often depicted in artistic renditions of crucifixion. In this posture, the hand is clenched in a peculiar and characteristic fashion: there is complete failure of flexion of the thumb and index finger with partial failure of flexion of the middle finger. Such a “crucified clench” is depicted across different cultures and from different eras. A review of crucifixion history and techniques, median nerve anatomy and function, and the historical artistic depiction of crucifixion was performed to support the hypothesis that the “crucified clench” results from proximal median neuropathy due to positioning on the cross, rather than from direct trauma of impalement of the hand or wrist.

Link to the full table of contents here.

Submit your research here.

Don’t miss an issue!  Register for email table of contents alerts here.

Jailbreaking the PDF – 4; Making text from characters

In previous posts I have shown how we can, in most cases, create a set of Unicode characters from a PDF. If the original authors (e.g. many Government documents) were standards-compliant, this is almost trivial. For scholarly publications, where the taxpayer/student pays 5000 USD per paper, the publishers refuse to use standards. So we have to use heuristics on this awful mess. (I have not yet found a scholarly publisher which is compliant and makes a syntactically manageable PDF – we pay them and they corrupt the information). But we have enough experience that for a given publisher we are correct 99–99.999% of the time (depending on the discipline – maths is harder than narrative text).

So now we have pages and on each page we have an UNORDERED list of characters. (We cannot rely on the order in which characters are transmitted – I spent two “wasted” months trying to use sequences and character groupings). We have to reconstruct text from the following STANDARD information for each character:

  • Its XY coordinates (raw PDF uses complex coordinates, PDFBox normalises to the page (0-600, 0-800))
  • Its FontFamily (e.g. Helvetica). This is because semantics are often conveyed by fonts – monospace implies code or data. (I shall upset typographical purists, as I should use “typeface” ( ) and not “font” or “font family”. But “FontFamily” is universal in PDF and computer terminology.)
  • Its colour. This can be moderately complex – a character has an outline (stroke) and body (fill) and there are alpha overlays, transparency, etc. But most of the time it’s black.
  • Its font Weight. Normal or Bold. It’s complicated when publishers use fonts like MediumBold (greyish).
  • Its Size. The size is the actual font-size in pixels and not necessarily the points as in ( ).

    Characters in the same font have different extents because of ascenders and descenders:

  • Its width. Monospaced fonts ( ) have equal width for all characters:

    Note that “I” and “m” have the same width. Any deliberate spaces also have the same width. That makes it easy to create words. The example above would have words “Aa”, “Ee”, “Qd”. (A word here is better described as a space-separated token, but “word” is simpler. It doesn’t mean it makes linguistic or numeric sense.)

    If the font is not monospaced then we need to know the width. Here’s a proportional font ( ):

    See how the “P” is twice as wide as the “I” or “l” in the proportional font. We MUST know the width to work out whether there is a space after it. Because there are NO SPACES in PDFs.

  • Its style. Conflated with “slope”. Most scientists simply think “italic” (as in Java). But we find “oblique” and “underline” and many others. We need to project these to “italic” and “underline” as these have semantics.
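To make the reconstruction concrete, here is a minimal Java sketch of the core step: sort the unordered glyphs into lines by Y and then X, and insert a word break wherever the horizontal gap after a glyph exceeds a fraction of its width (remember, there are no space characters in the PDF). The class, field and threshold names are mine for illustration; this is not AMI2 or PDFBox code.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class WordBuilder {
    public static class Glyph {
        public final char c;
        public final double x, y, width;
        public Glyph(char c, double x, double y, double width) {
            this.c = c; this.x = x; this.y = y; this.width = width;
        }
    }

    // Group glyphs into lines by Y, order each line by X, and emit a
    // space-separated "word" whenever the gap after a glyph exceeds
    // gapFactor * that glyph's width.
    public static List<String> buildWords(List<Glyph> glyphs,
                                          double yTolerance, double gapFactor) {
        List<Glyph> sorted = new ArrayList<>(glyphs);
        sorted.sort(Comparator.comparingDouble((Glyph g) -> g.y)
                              .thenComparingDouble(g -> g.x));
        List<String> words = new ArrayList<>();
        StringBuilder word = new StringBuilder();
        Glyph prev = null;
        for (Glyph g : sorted) {
            boolean newLine = prev != null && Math.abs(g.y - prev.y) > yTolerance;
            boolean bigGap = prev != null && !newLine
                    && g.x - (prev.x + prev.width) > gapFactor * prev.width;
            if ((newLine || bigGap) && word.length() > 0) {
                words.add(word.toString());
                word.setLength(0);
            }
            word.append(g.c);
            prev = g;
        }
        if (word.length() > 0) words.add(word.toString());
        return words;
    }
}
```

For a monospaced font the gap test is exact; for proportional fonts the per-glyph width makes the same test work, which is why the width information matters so much.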

Note that Normal|Bold, Normal|Italic, Normal|Underline can be multiplied to give 8 variants. Conformant PDF makes this easy – PDFBox has an API which includes:

  • public float getItalicAngle()
  • public float getUnderlineThickness();
  • public boolean isBold(Font font)
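When that conformant API information is absent (as with the opaque font names from some publishers), a fallback is to infer weight and slope heuristically from the font name itself. A rough sketch, where the name patterns are my own guesses at common conventions (e.g. “Helvetica-BoldOblique”), not an AMI2 algorithm:

```java
// Heuristic projection of a PDF font name onto bold/italic flags,
// for PDFs whose font descriptors are missing or unreliable.
public class StyleFlags {
    // "bold", "black" and "heavy" are common weight markers in font names
    public static boolean isBold(String fontName) {
        String n = fontName.toLowerCase();
        return n.contains("bold") || n.contains("black") || n.contains("heavy");
    }

    // project both "italic" and "oblique" onto the single semantic "italic"
    public static boolean isItalic(String fontName) {
        String n = fontName.toLowerCase();
        return n.contains("italic") || n.contains("oblique");
    }
}
```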


If we have all this information then it isn’t too difficult to reconstruct:

  • words
  • Weight of words (bold)
  • Style of word (italic or underline)

Which already takes us a long way.

Do scholarly publishers use this standard?


(You probably guessed this.) For example, I cannot get the character width out of eLife, the new Wellcome/MPI/HHMI journal. This seems to be because eLife hasn’t implemented the standard. They launched in 2012. There is no excuse for a modern publisher not being standards-compliant.

So the last posts have shown non-compliance in eLife, PeerJ, and BMC. Oh, and PLOS ONE also uses opaque fontFamilies (e.g. AdvP49811). So the Open Access publishers all use non-standard fonts.

Do you assume that because closed access publishers charge more, they do better?

I can’t answer that because they have more money to pay lawyers.

I’ll let you guess. Since #AMI2 is Open Source you can do it yourself.

Global Research Council: Counting Gold OA Chicks Before the Green OA Eggs Are Laid

The Global Research Council’s Open Access Action Plan is, overall, timely and welcome, but it is far too focused on OA as (“Gold”) OA publishing, rather than on OA itself (online access to peer-reviewed research free for all).

And although GRC does also discuss OA self-archiving in repositories (“Green” OA), it does not seem to understand Green OA’s causal role in OA itself, nor does it assign it its proper priority.

There is also no mention at all of the most important, effective and rapidly growing OA plan of action, which is for both funders and institutions to mandate (require) Green OA self-archiving. Hence neither does the action plan give any thought to the all-important task of designing Green OA mandates and ensuring that they have an effective mechanism for monitoring and ensuring compliance.

The plan says:

“The major principles and aims of the Action Plan are simple: they are (a) encouragement and support for publishing in open access journals, (b) encouragement and support for author self-deposit into open access repositories, and (c) the creation and inter-connection of repositories.”

Sounds like it covers everything — (a) Gold, (b) Green, and (c) Gold+Green – but the devil is in the details, the causal contingencies, and hence the priorities and sequence of action.

“In transitioning to open access, efficient mechanisms to shift money from subscription budgets into open access publication funds need to be developed.”

But the above statement is of course not about transitioning to OA itself, but just about transitioning to OA publishing (Gold OA).

And the GRC?s action plans for this transition are putting the cart before the horse.

There are very strong, explicit reasons why Green OA needs to come first — rather than double-paying for Gold pre-emptively (subscriptions plus Gold) without first having effectively mandated Green, since it is Green OA that will drive the transition to Gold OA at a fair, affordable, sustainable price:

Plans by universities and research funders to pay the costs of Open Access Publishing (“Gold OA”) are premature. Funds are short; 80% of journals (including virtually all the top journals) are still subscription-based, tying up the potential funds to pay for Gold OA; the asking price for Gold OA is still high; and there is concern that paying to publish may inflate acceptance rates and lower quality standards. What is needed now is for universities and funders to mandate OA self-archiving (of authors’ final peer-reviewed drafts, immediately upon acceptance for publication) (“Green OA”). That will provide immediate OA; and if and when universal Green OA should go on to make subscriptions unsustainable (because users are satisfied with just the Green OA versions) that will in turn induce journals to cut costs (print edition, online edition, access-provision, archiving), downsize to just providing the service of peer review, and convert to the Gold OA cost-recovery model; meanwhile, the subscription cancellations will have released the funds to pay these residual service costs. The natural way to charge for the service of peer review then will be on a “no-fault basis,” with the author’s institution or funder paying for each round of refereeing, regardless of outcome (acceptance, revision/re-refereeing, or rejection). This will minimize cost while protecting against inflated acceptance rates and decline in quality standards.

Harnad, S. (2010) No-Fault Peer Review Charges: The Price of Selectivity Need Not Be Access Denied or Delayed. D-Lib Magazine 16 (7/8).

Action 5: Develop an integrated funding stream for hybrid open access

Worst of all, the GRC action plan proposes to encourage and support hybrid Gold OA, with publishing not just being paid for doubly (via subscriptions to subscription publishers + via Gold OA fees to Gold OA publishers) but, in the case of hybrid Gold, with the double-payment going to the very same publisher, which not only entails double-payment by the research community, but allows double-dipping by the publisher.

That is the way to leave both the price and the timetable for any transition to OA in the hands of the publisher.

Action 6: Monitor and assess the affordability of open access

There is no point monitoring the affordability of Gold OA today, at a stage when it is just a needless double-payment, at the publisher’s current arbitrary, inflated Gold OA asking price.

What does need monitoring is compliance with mandates to provide cost-free Green OA, while subscriptions are still paying in full (and fulsomely) for the cost of publication, as they are today.

Action 7: Work with scholarly societies to transition society journals into open access

The only thing needed from publishers today – whether scholarly or commercial – is that they not embargo Green OA. Most (60%) don’t.

The transition to Gold OA will only come after Green OA has made subscriptions unsustainable, which will not only induce publishers to cut obsolete costs, downsize and convert to Gold OA, but it will also release the concomitant institutional subscription cancellation windfall savings to pay the price of that affordable, sustainable post-Green Gold.

Action 8: Supporting self-archiving through funding guidelines and copyright regulations

“The deposit of publications in open access repositories is often hampered not only by legal uncertainties, but also by the authors’ reluctance to take on such additional tasks. Funding agencies will address this issue by exploring whether and how authors can be encouraged and supported in retaining simple copyrights as a precondition to self-archiving. In doing so, funders will also address authors’ need to protect the integrity of their publications by providing guidance on suitable licenses for such purpose.”

Yes, Green OA needs to be supported. But the way to do that is certainly not just to “encourage” authors to retain copyright and to self-archive.

It is (1) to mandate (require) Green OA self-archiving (as 288 funders and institutions are already doing: see ROARMAP), (2) to adopt effective mandates that moot publisher OA embargoes by requiring immediate-deposit, whether or not access to the deposit is embargoed, and (3) to designate institutional repository deposit as the mechanism for making articles eligible for research performance review. Then institutions will (4) monitor and ensure that their own research output is being deposited immediately upon acceptance for publication.

Action 9: Negotiate publisher services to facilitate deposit in open access repositories

Again, the above is a terribly counterproductive proposal. On no account should it be left up to publishers to deposit articles.

For subscription publishers, it is in their interests to gain control over the Green OA deposit process, thereby making sure that it is done on their timetable (if it is done at all).

For Gold OA, it’s already OA, so depositing it in a repository is no challenge.

It has to be remembered and understood that the “self” in self-archiving is the author. The keystrokes don’t have to be personally executed by the author (students, librarians, secretaries can do the keystrokes too). But they should definitely not be left to publishers to do!

Green OA mandates are adopted to ensure that the keystrokes get done, and on time. Most journals are not Gold OA, but a Green OA mandate requires immediate deposit whether or not the journal is Gold OA, and whether or not access to the deposit is embargoed.

Action 10: Work with publishers to find intelligent billing solutions for the increasing amount of open access articles

The challenge is not to find “billing solutions” for the minority of articles that are published as Gold OA today. The challenge is to adopt an effective, verifiable Green OA mandate to self-archive all articles.

Action 11: Work with repository organisations to develop efficient mechanisms for harvesting and accessing information

This is a non-problem. Harvesting and accessing OA content is already powerful and efficient.

It can of course be made incomparably more powerful and efficient. But there is no point or incentive in doing this while the target content is still so sparse – because it has not yet been made OA (whether Green or Gold)!

Only about 10–40% of content is OA in most fields.

The way to drive that up to the 100% that it could already have been for years is to mandate Green OA.

Then (and only then) will there be the motivation to “develop [ever more] efficient mechanisms for harvesting and accessing [OA] information”.

Action 12: Explore new ways to assess quality and impact of research articles

This too is happening already, and is not really an OA matter. But once most articles are OA, OA itself will generate rich new ways of measuring quality and impact.

Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79 (1)

(Some of these comments have already been made in connection with Richard Poynder’s interview of Johannes Fournier.)

Evolutionary Applications Issue 6.4 Now Live

Evolutionary Applications has now published its latest issue. This issue includes a number of top papers highlighted by Editor-in-Chief Louis Bernatchez:

The impact of natural selection on health and disease: uses of the population genetics approach in humans by Estelle Vasseur and Lluis Quintana-Murci

Genomic and environmental selection patterns in two distinct lettuce crop–wild hybrid crosses by Yorike Hartman, Brigitte Uwimana, Danny A. P. Hooftman, Michael E. Schranz, Clemens C. M. van de Wiel, Marinus J. M. Smulders, Richard G. F. Visser and Peter H. van Tienderen

Evolutionary rescue in populations of Pseudomonas fluorescens across an antibiotic gradient by Johan Ramsayer, Oliver Kaltz and Michael E. Hochberg

The journal continues to receive a high number of submissions across all areas of evolutionary biology and we would encourage you to submit your paper to the journal. Evolutionary Applications publishes papers that utilize concepts from evolutionary biology to address biological questions of health, social and economic relevance. In order to better serve the community, we also now strongly encourage submissions of papers making use of modern molecular and genetic methods to address important questions in any of these disciplines and in an applied evolutionary framework. Two of the papers highlighted above are good examples of this sort of paper. Further information about the journal’s aims and scopes can be found on the website.

Make sure that you never miss an issue by signing up for free table of contents alerts here >

“Licences4Europe” has not accepted “The Right to Read is the Right to Mine”

One sentence summary (this link has all the documentation)

Stakeholders representing the research sector, SMEs and open access publishers withdraw from Licences for Europe


I have formally been a member of EC-L4E-WG4, a working group of the European Commission concentrating on Text and Data Mining (TDM, though I prefer “Content Mining”). I haven’t attended meetings (due to date clashes) but Ross Mounce has stood in for me and given brilliant presentations. The initial idea of the WG was to facilitate TDM as an added value to conventional publications and other sources. (The current problem is that copyright can be interpreted as forbidding TDM.) When I and others joined this effort it was on the assumption that we would be looking for positive ways forward to encourage TDM.

When I buy a book I can do what I like with it. I can write on it.

I can cut it up into bits. I can give/sell the book to someone else. I can give/sell the cut-out bits to someone else. I can stick the cut-out bits into a new book. I can transcribe the factual content. I can do almost anything other than copy non-facts.

With scholarly articles I can’t do any of this. I cannot own an article, I can only rent it. (Appalling concession #1 by Universities went completely unnoticed – I shall blog more). I cannot extract facts from it. (Even more appalling concession #2 by Universities went completely unnoticed – I shall blog more). So the publishers have dictated to Universities that we cannot do anything with the 10,000,000,000 USD we give to the publishers each year.

The publishers are now proposing that if we want to use any of OUR content (which we have already paid for) we should pay the publishers MORE. That TDM is an “added service” provided by publishers. It’s not. I can TDM without any help from the publishers. The only thing the publishers are doing is holding us to ransom.

If you don’t feel this is unjust and counterproductive stop reading. Back to “Licences for Europe”…

The L4E group has had no chance to set the group assumptions. From the outset the chair has insisted that this group is “L4E”, licences for Europe. The default premise is that document producers can and should add additional restrictions through licences. In short – we have fought this publicly and the chair has failed to listen to us, let alone consider our arguments. Who are we?

  • The Association of European Research Libraries (LIBER)
  • The Coalition for a Digital Economy
  • European Bureau of Library Information and Documentation Associations (EBLIDA)
  • The Open Knowledge Foundation
  • Communia
  • Ubiquity Press Ltd.
  • Trans-Atlantic Consumer Dialogue
  • National Centre for Text Mining, University of Manchester
  • European Network for Copyright in support of Education and Science (ENCES)
  • Jisc

Not a lightweight list. Here’s the formal history:

We welcomed the orientation debate by the Commission in December 2012 and the subsequent commitment to adapt the copyright framework to the digital age. We believe that any meaningful engagement on the legal framework within which data driven innovation exists must, as a point of centrality, address the issue of limitations and exceptions. Having placed licensing as the central pillar of the discussion, the “Licences for Europe” Working Group has not made this focused evaluation possible. Instead, the dialogue on limitations and exceptions is only taking place through the refracted lens of licensing. This incorrectly presupposes that additional relicensing of already licensed content (i.e. double licensing) – and by implication also licensing of the open internet – is the solution to the rapid adoption of TDM technology.

We wrote expressing our concerns (March 14) – some sentences (highlighting is mine):

10. Data driven innovation requires the lowest barriers possible to reusing content. Requiring the relicensing of copyright works one already has lawful access to for a non-competing use is entirely disproportionate, and raises strong ethical questions as it will affect what computer-based medical and scientific research can and cannot be undertaken in the EU.

11. A situation where each proposed TDM based research or use of content, to which one already has lawful access, has to be submitted for approval is unscalable, and will raise barriers to research and reduce online innovation. It will slow medical discoveries and data driven innovation inexorably, and will only serve to drive jobs, research, health and wealth-creation elsewhere.

12. For the full potential of data driven innovation to become a reality, a limitation and exception that allows text and data mining for any purposes, which cannot be overridden by private contracts, is required in EU law.

13. Subject to point 3, we must be able to share the results of text and data mining with no hindrances irrespective of copyright laws or licensing terms to the contrary.

14. In the European information society, the right to read must be the right to mine.

(I am particularly pleased that my phrase “the right to read must be the right to mine” expresses our message succinctly.)

Unfortunately the response ( ) was anodyne and platitudinous (“win-win solutions for all stakeholders”). It became clear that this group could not make any useful progress and at worst would legitimize the interests of the “content owners”.

So we have withdrawn.


Therefore, we can no longer participate in the “Licences for Europe” process. We maintain that a vibrant internet and a healthy scholarly publishing community need not be at odds with a modern copyright framework that also allows for the barrier-free extraction of facts and data. We have already expressed this view sufficiently well within the Working Group.

And we have concerns about transparency.

We would like to reiterate our request for transparency around the “Licences for Europe” dialogue and kindly request that the following actions be taken:

  • That the list of organisations participating in all of the “Licenses for Europe” Working Groups be made publicly available on the “Licences for Europe” website;
  • That the date of withdrawal for organisations leaving the process is also recorded on this list;
  • That it is made clear on any final documents that the outputs from the working group on TDM are not endorsed by our organisations and communities.


If you feel that we have a right to mine our information, then help us fight for it. Because inaction simply hands our rights to vested interests.

Sharing was Caring for Ancient Humans and Their Prehistoric Pups

While the tale of how man’s best friend came to be (i.e., domestication) is still slowly unfolding, a recently published study in PLOS ONE may provide a little context—or justification?—for dog lovers everywhere. It turns out that even thousands of years ago, humans loved to share food with, play with, and dress up their furry friends.

In the study titled “Burying Dogs in Ancient Cis-Baikal, Siberia: Temporal Trends and Relationships with Human Diet and Subsistence Practices,” biologists, anthropologists, and archaeologists joined forces to investigate the nature of the ancient human-dog relationship by analyzing previously excavated canid remains worldwide, with a large portion of specimens from modern-day Eastern Siberia, Russia. The authors performed genetic analysis and skull comparisons to establish that the canid specimens were most likely dogs, not wolves, an unsurprising but important distinction when investigating the human-canine bond. The canid skulls from the Cis-Baikal region most closely resembled those of large Siberian huskies, or sled dogs. Radiocarbon dating from previous studies also provided information regarding the dates of death and other contextual information at the burial sites.

The researchers found that the dogs buried in Siberia, many during the Early Neolithic period 7,000-8,000 years ago, were only found at burial sites shared with foraging humans. Dogs were found buried in resting positions, or immediately next to humans at these sites, and their graves often included various items or tools seemingly meant for the dogs. One dog in particular was adorned with a red deer tooth necklace around its neck and deer remnants by its side, and another was buried with what appears to be a pebble or toy in its mouth.


By analyzing the carbon and nitrogen in human and dog specimens in this region, the researchers were able to determine similarities in human and dog diets, both of which were rich in fish. This finding may be somewhat surprising because one might assume that dogs helped humans hunt terrestrial game, and would consequently be less likely found among humans that ate primarily fish.

The authors speculate that dogs were considered spiritually similar to humans, and were therefore buried at the same time in the same graves. The nature of the burials and the similarities in diet also point toward an intimate and personal relationship, both emotional and social, between humans and their dogs—one that involved sharing food and giving dogs the same burial rites as the humans they lived among. Ancient dogs weren’t just work animals or hunters, the authors suggest, but important companion animals and friends as well.

Citation: Losey RJ, Garvie-Lok S, Leonard JA, Katzenberg MA, Germonpré M, et al. (2013) Burying Dogs in Ancient Cis-Baikal, Siberia: Temporal Trends and Relationships with Human Diet and Subsistence Practices. PLoS ONE 8(5): e63740. doi:10.1371/journal.pone.0063740

Image Credits: Losey RJ, Garvie-Lok S, Leonard JA, Katzenberg MA, Germonpré M, et al. (2013) Burying Dogs in Ancient Cis-Baikal, Siberia: Temporal Trends and Relationships with Human Diet and Subsistence Practices. PLoS ONE 8(5): e63740. doi:10.1371/journal.pone.0063740

Siberian husky photo by Pixel Spit

Invite: SPARC Europe Open Session at the Pre-LIBER conference in Munich Germany

You are warmly invited to the SPARC Europe Open Session, held in conjunction with the Pre-LIBER conference in Munich, Germany, on 25 June from 2.30pm to 5.30pm.

Venue: Hilton Park Hotel, Am Tucherpark 7, 80538 Munich.

We will discuss how libraries can make open access work now that open access is moving into the mainstream. Can libraries/universities reallocate funds from the big deals to the support of open access publishing? Speakers for this topic are:

Prof. Björn Brembs, Universität Regensburg, Germany
Prof. Dr. Susanne Weigelin-Schwiedrzik, Pro Vice Chancellor, University of Vienna, Austria
Anna Lundén, Coordinator, the Swedish Library Consortium, BIBSAM, National Library of Sweden
Berndt Dugall, Library Director, Johann Wolfgang Goethe-Universität, Frankfurt am Main, Germany

There will also be an informal discussion between Dr Celina Ramjoué, European Commission and Dr Alma Swan, SPARC Europe’s Director for Advocacy, on how organisations are devoting considerable resources to working for a good open access policy for the EU. How far have we come and what is the nature of progress?

Please send a mail to if you wish to participate in the session. 

Invite in German: