Modern Humans: Were We Really Better than Neanderthals, or Did We Just Get Lucky?


We’ve all heard the story: dim-witted Neanderthals couldn’t quite keep up with our intelligent modern human ancestors, leading to their eventual downfall and disappearance from the world we know now. Apparently they even needed more brain space for their eyes. The authors of a recent PLOS ONE paper dig into the ideas behind this perception, taking a closer look at eleven common hypotheses for the demise of the Neanderthals and comparing each to the latest research in the field to argue that Neanderthals weren’t the simpletons we’ve made them out to be.

The authors tackled ideas like Neanderthals’ capacity for language and their ability to innovate, both often cited as weaknesses that led to their decline. Analyzing the published research on each topic, they found that archaeologists often used their finds to “build scenarios” that fit the running theories of human superiority, and that some long-held truths have since been challenged by new discoveries and ongoing research at the same excavation sites.

As one example, researchers who found shell beads and pieces of ochre and manganese (used as pigments) in South Africa claimed them as evidence of structured language in anatomically modern humans. Linking items like these to the presence of language is guesswork at best, and new findings at Neanderthal sites indicate that Neanderthals, too, decorated objects with pigments and created personal ornaments from feathers and claws. Whatever the anatomically modern humans were doing in South Africa, Neanderthals were also doing in Europe around the same time, undermining the claim that this ability gave anatomically modern humans better survival prospects once they arrived in Europe.

Another set of South African artifacts led the archaeological community to believe that anatomically modern humans were capable of rapidly improving on their own technology, keeping them ahead of their Neanderthal contemporaries. Two generations of tools, created during the Still Bay and Howiesons Poort periods, were originally believed to have come and gone in phases shorter than 10,000 years—a drop in the bucket compared to the Neanderthals’ use of certain tools, unchanged, for 200,000 years. However, new findings suggest that the Still Bay and Howiesons Poort periods lasted much longer than previously thought, meaning that anatomically modern humans may not have been the great visionaries we had assumed. Additionally, while Neanderthals were not thought capable of crafting the adhesives that anatomically modern humans used to assemble weapons and tools, it is now known that they did, distilling pitch from birch bark in an intricate, multi-step process.

We’re all living proof that anatomically modern humans survived in the end. Perhaps in an effort to flatter our ancestors, we have been holding on to dated hypotheses and ignoring recent evidence showing that Neanderthals were capable of a lot more (and perhaps the anatomically modern humans of a lot less), skill-wise, than previously believed. Genetic studies continue to support the idea that anatomically modern humans and Neanderthals interbred, showing that the genome of modern humans with Asian or European ancestry contains nearly 2% Neanderthal DNA, a substantial quantity considering 40,000 years, or roughly 2,000 generations, have passed since Neanderthals ceased to exist. These genes may have helped modern humans adjust to life outside of Africa, possibly aiding in the development of our immune system and variation in skin color. Researchers believe that the concentration of Neanderthal DNA in modern humans was once much higher, but genetic patterns suggest that male Neanderthal-human hybrids may have been sterile, leaving no opportunity for their genes to pass to the next generation.

So, while they may not walk among us today, we have Neanderthals to thank for some major adaptations that allowed us to thrive and spread across the planet. Too bad they’re not here to see the wonderful things we were able to accomplish with their help.

Related links:

Picked Clean: Neanderthals’ Use of Toothpicks to Fight Toothache

Contextualizing the Hobbits

Sharing was Caring for Ancient Humans and Their Prehistoric Pups

Citation: Villa P, Roebroeks W (2014) Neandertal Demise: An Archaeological Analysis of the Modern Human Superiority Complex. PLoS ONE 9(4): e96424. doi:10.1371/journal.pone.0096424

Image 1: Neandertaler im Museum from Wikimedia Commons

The post Modern Humans: Were We Really Better than Neanderthals, or Did We Just Get Lucky? appeared first on EveryONE.

For Yeast’s Sake: The Benefits of Eating Cheese, Chocolate, and Wine


Yeast—including more than 1,500 species that make up 1% of all known fungi—plays an important role in many of our favorite foods. Involved in everything from cheese making to alcohol production to cocoa preparation, this microscopic, unicellular sous-chef makes possible food products we could not produce on our own. While we have long been aware of our dependence on yeast, new research in PLOS ONE suggests that some strains of yeast would not be the same without us, either.

Studies have previously shown how our historical use of yeast has affected the evolution of one of the most commonly used species, Saccharomyces cerevisiae, creating different strains that are used for different purposes (bread, wine, and so on). To further investigate our influence on yeast, researchers from the University of Bordeaux, France, took a look at a different yeast species of recent commercial interest, Torulaspora delbrueckii. In mapping the T. delbrueckii family tree, the authors not only show that human intervention played a major role in the shaping of this species, but also provide valuable information for further improving this yeast as a tool for food production.

The authors collected 110 strains of T. delbrueckii from global sources of wine grapes, baked goods, dairy products, and fermented beverages. They first scanned one strain’s DNA for candidate microsatellites—short DNA motifs repeated in tandem—and used these to build tools that could detect the same loci in the other strains. From these, they selected eight microsatellite markers whose repeat lengths vary from strain to strain as a measure of genetic variation across the T. delbrueckii family. Each strain’s marker profile was determined using microchip electrophoresis, a process in which an electric field drives DNA fragments through a gel, separating them by size. As each strain’s microsatellite markers were genotyped, the information was added to a dendrogram (a funny-looking tree diagram, shown below) illustrating the level of similarity between strains. The researchers also estimated how long ago different strains diverged by comparing the average mutation rate and reproduction time of T. delbrueckii to the level of genetic difference between strains.
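The clustering step behind such a dendrogram can be sketched in a few lines: each strain’s microsatellite profile becomes a numeric vector, pairwise distances between vectors feed a hierarchical clustering, and the resulting tree is what the dendrogram plot displays. The strain labels, allele sizes, and the Euclidean/average-linkage choices below are illustrative assumptions, not the paper’s actual data or settings.

```python
# Illustrative sketch: clustering yeast strains by microsatellite profiles.
# All values are made up for demonstration; the study genotyped 110 strains
# at eight microsatellite loci.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Each row is one strain; each column an allele size (bp) at one marker.
profiles = np.array([
    [180, 212, 145, 301],  # "wine" strain A (hypothetical)
    [182, 212, 147, 301],  # "wine" strain B
    [180, 214, 145, 303],  # "wine" strain C
    [220, 250, 190, 340],  # "dairy" strain A
    [222, 252, 190, 342],  # "dairy" strain B
])

# Pairwise distances, then average-linkage hierarchical clustering;
# plotting `tree` with scipy's dendrogram() would draw the tree diagram.
dist = pdist(profiles, metric="euclidean")
tree = linkage(dist, method="average")

# Cutting the tree into two clusters separates the two groups of strains.
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)
```

With these toy profiles, the three similar “wine” strains fall into one cluster and the two “dairy” strains into the other, mirroring how the study’s dendrogram groups strains by source material.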


The dendrogram shows four clear clusters of yeast strains strongly linked to each sample’s origin. Two groups contain most of the strains isolated from nature, distinguished from each other by geography: strains collected in the Americas (nature Americas group) and strains collected in Europe, Asia, and Africa (nature Old World group). The other two clusters contain strains collected from food and drink samples and cannot be separated by geographic location. The grape/wine group contains 27 strains isolated from grape habitats in the major wine-producing regions of the world: Europe, California, Australia, New Zealand, and South America. The bioprocess group contains geographically diverse strains collected from other areas of food processing—such as bread products, spoiled food, and fermented beverages—and includes a subgroup of strains used specifically for dairy products. Further analysis of the variation between strains confirmed that, while the clusters don’t perfectly segregate strains by human usage and geographic origin played some role in diversity, a large part of the population’s structure is explained by the material source of the strain.

Divergence times calculated for the different groups further emphasize the connection between human adoption of T. delbrueckii and the continued evolution of the species. The grape/wine cluster diverged from the Old World group approximately 1,900 years ago, aligning with the expansion of the Roman Empire and, with it, the spread of Vitis vinifera, the common grape. The bioprocess group diverged much earlier, an estimated four millennia ago (around the Neolithic era), showing that this yeast was used for food production long before it was domesticated for winemaking.
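The logic of such a divergence estimate reduces to simple arithmetic: genetic distance accumulates along both diverging lineages at the mutation rate, so dividing the observed distance by twice the per-generation rate gives an age in generations, which a generations-per-year figure converts to years. Every number below is a made-up placeholder, not a value from the study.

```python
# Back-of-envelope sketch of the divergence-time logic described above.
# Inputs are illustrative assumptions, not the paper's measurements.

def divergence_years(genetic_distance, mu_per_generation, generations_per_year):
    """Rough divergence age: distance accumulates at 2*mu per generation
    because both lineages mutate independently after the split."""
    generations = genetic_distance / (2.0 * mu_per_generation)
    return generations / generations_per_year

# Placeholder inputs chosen so the answer lands near the 1,900-year figure:
years = divergence_years(genetic_distance=7.6,
                         mu_per_generation=1e-6,
                         generations_per_year=2000)
print(round(years))
```

The real analysis is more involved (microsatellites mutate in steps, and rates vary by locus), but the proportionality between distance, mutation rate, and time is the core of the estimate.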

While T. delbrueckii has often been overlooked by winemakers in favor of the more common S. cerevisiae, it has recently been gaining traction for its ability to reduce levels of volatile compounds that negatively affect wine’s flavor and scent. It has also shown a high freezing tolerance when used as a leavening agent, making it of great interest to companies attempting to freeze and transport dough. Though attempts to develop improved strains of this yeast for commercial use have already begun, we previously lacked an understanding of its life cycle and reproductive habits. In creating this T. delbrueckii family tree, the authors also gained a deeper understanding of both, which may help with further development of the species for technological use.

Yeast has weaseled its way into our hearts via our stomachs, and it seems that, in return, we have fully worked our way into its identity. With a bit of teamwork, and perhaps a splash of genetic tweaking, we can continue this fruitful relationship and pursue new opportunities in Epicureanism. I think we would all drink to that!

Related Links:

A Novel Strategy to Construct Yeast Saccharomyces cerevisiae Strains for Very High Gravity Fermentation

The Vineyard Yeast Microbiome, a Mixed Model Microbial Map

Reference: Albertin W, Chasseriaud L, Comte G, Panfili A, Delcamp A, et al. (2014) Winemaking and Bioprocesses Strongly Shaped the Genetic Diversity of the Ubiquitous Yeast Torulaspora delbrueckii. PLoS ONE 9(4): e94246. doi:10.1371/journal.pone.0094246

Image 1: Figure 1 from article

Image 2: Figure 3 from article

The post For Yeast’s Sake: The Benefits of Eating Cheese, Chocolate, and Wine appeared first on EveryONE.

Contextualizing the Hobbits


18,000 years ago, the remote Indonesian island of Flores was home to a population of tiny humans. They stood only about 3.5 feet tall on their large feet, and their skulls housed unusually small brains approximately the size of a grapefruit. The identity of these ‘hobbits’ has been hotly debated for years: Were they modern humans suffering a disease, or a new species, Homo floresiensis?

Biological anthropologist Karen Baab first studied a model of LB1, the only skull recovered from the site, at the American Museum of Natural History in 2005. In a recently published PLOS ONE study, she and other researchers compare this specimen to a range of other modern human and extinct hominin skulls to get closer to settling the identity of Homo floresiensis, or ‘Flores man’.

The origins of ‘Flores man’ have been debated for quite a while now. What are the possible origins that are being discussed, and why the uncertainty?

The primary debate has centered on whether LB1 (and possibly the other individuals found on Flores) represents a new species that descended from an extinct species of the genus Homo, or whether it is instead a pathological modern Homo sapiens, i.e., the same species as us. If the Flores remains do in fact represent a distinct species, then the next question is whether they descended from Homo erectus, a species that may be our direct ancestor, or from an even more primitive species. The latter scenario implies an otherwise undocumented migration out of Africa.

What makes it so hard to settle the argument one way or the other?

One of the difficulties in settling this particular argument is that most studies have focused on one or the other of these ideas and compared the Flores remains to either fossil hominins or to pathological modern humans, each using a different set of features. This makes it challenging to compare the alternative hypotheses side-by-side.

What kind of diseases might have caused modern humans to have features similar to these ‘hobbits’?

The three that have been discussed most prominently (and the three we looked at) are microcephaly, endemic hypothyroidism (“cretinism”) and Laron Syndrome. Microcephaly is not a disease per se, but rather a symptom of many different disorders. It refers to having an abnormally small brain and therefore skull. “Cretins” suffer from a lack of thyroid hormone before and after birth, which leads to stunted growth and possibly a slight decrease in skull size. Laron Syndrome individuals produce growth hormone, but their bodies do not properly recognize it, again leading to stunted growth and other developmental issues.

Only a few specimens of this hominin have been found, and there’s only one known skull, from the specimen named LB1. Are there reasons why these specimens have not been discovered elsewhere?

If Homo floresiensis descended from Homo erectus, then their closest relative lived just “next door” on the nearby island of Java. In this case, the unique features of the Homo floresiensis species probably evolved in the isolated island environment of Flores. If, however, the ancestor was a more primitive species and Homo floresiensis didn’t branch off from H. erectus, then this lineage may have migrated out of Africa earlier than currently documented, and we could still find older sites in mainland Asia containing the ancestral species.

Liang Bua cave

You compared the morphology of the LB1 skull to many hominin ancestors and modern human populations from around the world. What were some of the most striking similarities and differences?

The LB1 skull is very distinct from a typical modern human’s: it has a lower, more elongated silhouette when viewed from the side, greater width at the rear of the braincase, and a flatter frontal bone (the bone underlying the forehead) with a more pronounced brow ridge. Interestingly, these are some of the same features that distinguish archaic species like Homo erectus from modern humans.

Laron Syndrome and “cretin” skulls from modern Homo sapiens have large, rounded, globular braincases, very different from the smaller, lower, and less rounded braincase of LB1. Microcephalic human skulls come closer to LB1, but still show clear distinctions from it, in much the same way that they differ from species like Homo erectus or Homo habilis.

Overall, the LB1 braincase is most similar in its overall shape to small-brained Homo erectus from Eurasia that are 1.8 million years old.

How does this analysis add to, or change, what we knew about Flores man? 

This analysis provides a unique opportunity to evaluate these evolutionary and pathological hypotheses side by side based on the same criterion: cranial shape similarity. The results support a stronger affiliation of LB1 with fossil Homo than with any of the proposed pathologies. This study also offers an improvement over previous assessments of the microcephaly hypothesis by using a more extensive sample that better captures the variability in this disorder.

Do these results conclusively settle the discussion? What other possibilities still exist for the origins of H. floresiensis?

While very little in paleoanthropology is ever “settled,” I do think this study represents an important step forward in terms of putting the pathological hypotheses to rest. The question that remains to be answered definitively is which species of archaic Homo is the most likely ancestor of Homo floresiensis: Homo erectus, or an earlier and more primitive species of Homo?

Citation: Baab KL, McNulty KP, Harvati K (2013) Homo floresiensis Contextualized: A Geometric Morphometric Comparative Analysis of Fossil and Pathological Human Samples. PLoS ONE 8(7): e69119. doi:10.1371/journal.pone.0069119

Images: Homo floresiensis by Ryan Somma; Cave where the remains of Homo floresiensis were discovered in 2003, Liang Bua, Flores, Indonesia by Rosino


Hairy, Sticky Leg Pads are In: How Different Spiders Hunt


Spiders are everywhere (Arachnophobes, stop reading now). They’re among the most successful predators on earth today and colonize nearly every terrestrial habitat (that is, not just ceiling corners and under beds), and occasionally do so in numbers large enough to take over small islands. Spider silk may be strong enough to stop a speeding train and some webs, ten times stronger than Kevlar, can be large enough to cross rivers in tropical rainforests.

But more than half of today’s spider species don’t rely on webs or silk to capture prey. Instead, these hunting spiders have evolved hairy adhesive pads on their legs to grab and pin down struggling prey, according to the results of a recently published PLOS ONE study. The adhesive pads, called scopulae, had been observed in many spider species, but what wasn’t clear until now was whether they occur in all species or are more likely to occur in hunting spiders.

In this study, researchers used a phylogenetic analysis of spider family trees to correlate different species’ prey-capture strategies with the presence or absence of adhesive pads on their legs. They found that the majority of spiders are either web builders or free-ranging hunters (apart from these two, at least one rare variety may be mostly vegetarian), and the hunters were far more likely to have adhesive hairs: nearly 83% of hunting spiders had adhesive bristles on their legs, compared with 1.1% of web-building varieties. Most of these hunters had either never developed silk-dependent strategies to capture prey or had abandoned web building for hunting.
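Setting aside the phylogenetic correction such comparative analyses require, the strength of the 83%-versus-1.1% association can be illustrated with a simple contingency test. The counts below are invented to roughly match the reported proportions; they are not the study’s actual species tallies.

```python
# Illustrative (non-phylogenetic) association check between hunting
# strategy and adhesive pads. Counts are made up to approximate the
# reported proportions (~83% of hunters vs ~1.1% of web builders).
from scipy.stats import fisher_exact

#                 pads  no pads
hunters      = [   83,      17]   # hypothetical counts
web_builders = [    1,      89]   # hypothetical counts

odds, p = fisher_exact([hunters, web_builders])
print(f"odds ratio = {odds:.1f}, p = {p:.2e}")
```

Even this naive test yields an enormous odds ratio and a vanishingly small p-value; the actual study goes further by accounting for the fact that related species are not independent data points.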

Why would so many spiders abandon an obviously successful way to catch prey? Web building is a useful way to trap insects and even some small mammals, but even to a spider, silk is expensive. Creating a web takes work, damage caused by prey or passersby requires frequent repairs, and certain kinds of webs need large amounts of silk to be effective. The classic orb web (seen in the picture here) radically reduced these costs, which may be why the spiders that make them are particularly common. Even so, this new study reveals that hunting has proved at least as successful a strategy as web building for more than half of today’s spiders.

Bristly scopulae on hunting spiders’ legs have played a big part in this, enabling spiders to grasp and hold on to struggling prey. The thin bristles on scopulae come in many shapes and forms, and also contribute to these spiders’ mad climbing skills. Read more about which spiders evolved these bristles or learn about other arachnid research published in PLOS ONE here.


Citations: Gregorič M, Agnarsson I, Blackledge TA, Kuntner M (2011) How Did the Spider Cross the River? Behavioral Adaptations for River-Bridging Webs in Caerostris darwini (Araneae: Araneidae). PLoS ONE 6(10): e26847. doi:10.1371/journal.pone.0026847

Rogers H, Hille Ris Lambers J, Miller R, Tewksbury JJ (2012) ‘Natural experiment’ Demonstrates Top-Down Control of Spiders by Birds on a Landscape Level. PLoS ONE 7(9): e43446. doi:10.1371/journal.pone.0043446

Wolff JO, Nentwig W, Gorb SN (2013) The Great Silk Alternative: Multiple Co-Evolution of Web Loss and Sticky Hairs in Spiders. PLoS ONE 8(5): e62682. doi:10.1371/journal.pone.0062682

Nyffeler M, Knörnschild M (2013) Bat Predation by Spiders. PLoS ONE 8(3): e58120. doi:10.1371/journal.pone.0058120

Images: Foot of the little jumping spider Euophrys frontalis, credit Jonas Wolff; varied shapes and sizes of bristles on scopulae, from pone.0062682; spider web on plant by mikebaird