March 19, 2013
In 2010, the surprising discovery that Neanderthals likely crossbred with our ancestors tens of thousands of years ago generated headlines around the world.
Now, we have a new finding about the sex lives of early Homo sapiens: It looks like they engaged in some inbreeding as well.
That is the conclusion of anthropologist Erik Trinkaus of Washington University in St. Louis and Xiu-Jie Wu and Song Xing of the Chinese Academy of Sciences’ Institute of Vertebrate Paleontology and Paleoanthropology, based on a fractured 100,000-year-old skull excavated from China’s Nihewan Basin. Their finding, published yesterday in PLOS ONE, is that the skull shows evidence of an unusual genetic mutation that is likely the result of high levels of inbreeding.
The researchers used CT scanning and 3D modeling to digitally reassemble, for the first time, the five pieces of the fractured skull—known as Xujiayao 11, after the site where it was found back in 1977—and realized that it exhibited an unusual deformity. When the pieces are combined, they leave a hole in the crown of the skull, but there is no evidence that the opening was caused by a traumatic injury or disease. As a result, they consider it most likely that the hole is a defect known as an enlarged parietal foramen.
Nowadays, this hole is mostly found in people with a particular pair of genetic mutations on chromosomes 5 and 11—most often a consequence of inbreeding—and occurs in about 1 in 25,000 live births. The mutation interferes with bone formation in the skull over the first five months of an infant’s life, when the pieces of the skull are supposed to fuse together to cover up the “soft spot.”
Given the tiny sample size of human skulls this old and the fact that similar kinds of genetic abnormalities have been seen so often in other prehistoric skulls—the researchers count 22 individuals with skull deformities discovered from this era—Trinkaus thinks the simplest explanation is that small and unstable human populations forced our ancestors to inbreed.
If no inbreeding occurred, “the probability of finding one of these abnormalities in the small available sample of human fossils is very low, and the cumulative probability of finding so many is exceedingly small,” he said in a press statement. “The presence of the Xujiayao and other Pleistocene [2.6 million to 12,000 years ago] human abnormalities therefore suggests unusual population dynamics, most likely from high levels of inbreeding and local population instability.”
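A back-of-the-envelope version of that probability argument can be sketched in a few lines of Python. The incidence figure comes from the modern rate quoted above (about 1 in 25,000 births); the sample size of 300 fossil individuals is a hypothetical stand-in, since the study’s actual sample counts aren’t given here, and this is not the authors’ actual statistical model.

```python
# Rough sketch of the statistical logic, not the researchers' actual model.
# p: modern incidence of an enlarged parietal foramen (~1 in 25,000 births).
# n: hypothetical number of Pleistocene individuals in the fossil sample.
p = 1 / 25_000
n = 300

# Chance of finding at least one case in the sample under random mating:
p_at_least_one = 1 - (1 - p) ** n   # roughly 1.2 percent

# Crude compounding: if 22 distinct, similarly rare abnormalities each had
# to turn up independently, the joint probability collapses toward zero.
p_all_22 = p_at_least_one ** 22

print(p_at_least_one, p_all_22)
```

Even with these generous assumptions, a single rare abnormality in the sample is already unlikely, and 22 of them is vanishingly so—which is the intuition behind the quoted “exceedingly small” cumulative probability.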
Such inbreeding was probably inevitable, given that most of humanity lived in small, isolated populations for most of our species’ evolution. For example, some scientists believe that an earlier population bottleneck, predating this skull, may have driven the worldwide human population to as low as 2,000 individuals, at times making inbreeding a necessity. Our ancestors certainly didn’t understand the importance of genetic diversity or the dangerous consequences of inbreeding. But with such a scant population, the survival of our species might actually have depended on our ancient grandmothers procreating with their male relatives.
The good news? The researchers say that the genetic deformity preserved in this skull may not have been too detrimental for this individual. Normally, the condition is linked with major cognitive problems, but that seems doubtful in this case: despite the demanding conditions of the Pleistocene, this prehistoric human appears to have survived to a ripe old age—which, in those days, probably means the individual lived into his or her thirties.
March 13, 2013
In one of the fastest-growing areas in psychology, researchers are gaining insight into the mental processes of subjects that are barely able to communicate: babies. In recent years, innovative and playful experimental setups have suggested that infants as young as six months old have a sense of morality and fairness, and that 18-month-olds are capable of altruistically helping others.
Some of this research, though, has also shed light on babies’ dark side. A new study published in Psychological Science suggests that 9- to 14-month-olds exhibit a particularly unwelcome trait—in watching a puppet show, at least, they seem to prefer their own kind, and support puppets that pick on those who are different from them.
Because babies can’t communicate verbally, J. Kiley Hamlin of the University of British Columbia has pioneered the use of puppet shows to probe their psychology and better understand how they see the world. In this study, her research team put on a show in which 52 infant participants were led to identify themselves as similar to one of the characters in the show and different from the other.
To accomplish this, the researchers started off by asking the infants to pick a food, either graham crackers or green beans (a little surprisingly, a full 42 percent chose the vegetables). Then, the infants were shown a pair of rabbit puppets, one who liked graham crackers and one who liked green beans:
Once they’d solidly demonstrated each rabbit’s choice, one of them—either the one with the same preference as the infant observer, or the one with an opposite preference—would be randomly chosen to encounter a pair of new characters: one dog, termed a “helper,” and another, called a “harmer.” As the rabbit played with a ball and dropped it, the nice “helper” dog threw it back, but the mean “harmer” dog (below) held onto the ball:
After both of the scenes were over, both dogs were presented to the infant, and the particular dog that the baby first reached for was interpreted as the character it preferred.
The results were a bit startling: When the infants had watched a play involving a rabbit with a food choice that matched theirs, 83 percent preferred the “helper” dog. When they’d watched a play with a rabbit who liked a different food, 88 percent chose the “harmer” dog. This held true regardless of the babies’ original food choices—the only thing that mattered was whether the rabbit’s identity, in terms of food choice, matched their own.
To further parse the motivations underlying the infants’ choices, the researchers conducted a similar experiment involving a neutral dog that neither helped nor harmed the rabbit. In this part of the study, when watching rabbits whose favorite foods differed from their own, the infants not only liked “harmer” dogs more than neutral dogs, but even preferred neutral dogs to “helpers” (this was true among the 14-month-olds, but not the 9-month-olds). In other words, it seemed that they not only wanted to see the rabbit treated poorly, but also would rather see it treated neutrally than get some help.
Of course, when designing experiments for subjects that can’t use words to communicate, the simplest of variables could potentially throw off the results. It’s unclear, for example, if the researchers alternated which side the “helper” and “harmer” puppets appeared on, so the babies could have been influenced by their emerging sense of handedness. In the past, critics of such puppet show experiments have also charged that a baby merely reaching for one puppet or another might be an impulsive reflex, rather than reflecting an underlying moral judgement.
What’s clear, though, is that this experiment demonstrated a consistent reflex across the babies tested. While extrapolating this to mean that the babies are racist or bigoted is probably a step too far—for one, they were merely considering individual puppets, not groups of puppets with similar characteristics—it does raise interesting questions about the origins of xenophobia in an individual’s lifetime.
March 12, 2013
Neanderthals never invented written language, developed agriculture or progressed past the Stone Age. At the same time, they had brains just as big in volume as modern humans’. The question of why we Homo sapiens are significantly more intelligent than the similarly big-brained Neanderthals—and why we survived and proliferated while they went extinct—has puzzled scientists for some time.
Now, a new study by Oxford researchers provides evidence for a novel explanation. As they detail in a paper published today in the Proceedings of the Royal Society B, a greater percentage of the Neanderthal brain seems to have been devoted to vision and control of their larger bodies, leaving less mental real estate for higher thinking and social interactions.
The research team, led by Eiluned Pearce, came to the finding by comparing the skulls of 13 Neanderthals who lived 27,000 to 75,000 years ago to 32 human skulls from the same era. In contrast to previous studies, which merely measured the interior of Neanderthal skulls to arrive at a brain volume, the researchers attempted to compute a “corrected” volume, which would account for the fact that the Neanderthals’ brains were in control of rather differently-proportioned bodies than our ancestors’ brains were.
One of the easiest differences to quantify, they found, was the size of the visual cortex—the part of the brain responsible for interpreting visual information. In primates, the volume of this area is roughly proportional to the size of the animal’s eyes, so by measuring the Neanderthals’ eye sockets, they could get a decent approximation of the size of their visual cortices as well. The Neanderthals, it turns out, had much larger eyes than ancient humans. The researchers speculate that this could be because they evolved exclusively in Europe, which lies at higher latitudes (and thus has poorer light conditions) than Africa, where H. sapiens evolved.
Along with eyes, Neanderthals had significantly larger bodies than humans, with wider shoulders, thicker bones and a more robust build overall. To account for this difference, the researchers drew upon previous research into the estimated body masses of the skeletons found with these skulls and of other Neanderthals. In primates, the amount of brain capacity devoted to body control is also proportionate to body size, so the scientists were able to calculate roughly how much of the Neanderthals’ brains were assigned to this task.
After correcting for these differences, the research team found that the amount of brain volume left over for other tasks—in other words, the mental capacity not devoted to seeing the world or moving the body—was significantly smaller for Neanderthals than for ancient H. sapiens. Although the average raw brain volumes of the two groups studied were practically identical (1473.84 cubic centimeters for humans versus 1473.46 for Neanderthals), the average “corrected” Neanderthal brain volume was just 1133.98 cubic centimeters, compared to 1332.41 for the humans.
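Plugging in the averages reported above makes the size of the shift concrete—a quick arithmetic check using only the figures quoted in this article, not the study’s raw data:

```python
# Average brain volumes (cubic centimeters) as reported in the study.
raw = {"human": 1473.84, "neanderthal": 1473.46}
corrected = {"human": 1332.41, "neanderthal": 1133.98}

raw_gap = raw["human"] - raw["neanderthal"]                    # ~0.4 cc
corrected_gap = corrected["human"] - corrected["neanderthal"]  # ~198 cc
pct_gap = corrected_gap / corrected["human"] * 100             # ~15 percent

print(raw_gap, corrected_gap, pct_gap)
```

In raw terms the two species are separated by less than half a cubic centimeter, but once the estimated visual and body-control allocations are subtracted, the Neanderthals come up roughly 15 percent short.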
This divergence in mental capacity for higher cognition and social networking, the researchers argue, could have led to the wildly different fates of H. sapiens and Neanderthals. “Having less brain available to manage the social world has profound implications for the Neanderthals’ ability to maintain extended trading networks,” Robin Dunbar, one of the co-authors, said in a press statement. “[They] are likely also to have resulted in less well developed material culture—which, between them, may have left them more exposed than modern humans when facing the ecological challenges of the Ice Ages.”
Previous studies have also suggested that the internal organization of Neanderthal brains differed significantly from ours. For example, a 2010 project used computerized 3D modeling of Neanderthal skulls of varying ages and found that, despite comparable brain volumes, Neanderthal brains developed at different rates than human brains over the course of an individual’s adolescence.
The overall explanation for why Neanderthals went extinct while we survived, of course, is more complicated. Emerging evidence points to the idea that Neanderthals were smarter than previously thought, though perhaps not smart enough to outmaneuver humans for resources. And they didn’t vanish without a trace—in another major 2010 discovery, a team of researchers compared human and Neanderthal genomes and found evidence that our ancestors in Eurasia may have interbred with Neanderthals, preserving a few of their genes amidst our present-day DNA.
Apart from the offspring of a small number of rare interbreeding events, though, the Neanderthals did die out. Their brains might have been just as big as ours, but ours might have been better at a few key tasks, particularly those involved in building social bonds, allowing us to survive the most recent glacial period while the Neanderthals expired.
January 7, 2013
Around 120 B.C.E., the Relitto del Pozzino, a Roman shipping vessel, sank off the coast of Tuscany. More than two millennia later, in the 1980s and 90s, a team sent by the Archeological Superintendency of Tuscany began to excavate the ruins, hauling up planks of rotting wood.
“It wasn’t an easy task. The wreck is covered by marine plants and their roots. This makes it hard to excavate it,” underwater archaeologist Enrico Ciabatti told Discovery News in 2010. “But our efforts paid off, since we discovered a unique, heterogeneous cargo.”
That cargo, it turned out, included ceramic vessels made to carry wine, glass cups from the Palestine area and lamps from Asia Minor. But in 2004, the archaeologists discovered it also included something even more interesting: the remains of a 2,000-year-old medicine chest.
Although the chest itself—which had presumably belonged to a Roman doctor—was apparently destroyed, researchers found a surgery hook, a mortar, 136 wooden drug vials and several cylindrical tin vessels (called pyxides) all clustered together on the ocean floor. When they x-rayed the pyxides, they saw that one of them had a number of layered objects inside: five circular, relatively flat grey medicinal tablets. Because the vessels had been sealed, the pills had been kept completely dry over the years, providing a tantalizing opportunity for us to find out what exactly the ancient Romans used as medicine.
Now, as revealed today in a paper in the Proceedings of the National Academy of Sciences, a team of Italian chemists has conducted a thorough chemical analysis of the tablets for the first time. Their conclusion? The pills contain a number of zinc compounds, as well as iron oxide, starch, beeswax, pine resin and other plant-derived materials. One of the pills seems to have the impression of a piece of fabric on one side, indicating it may have once been wrapped in fabric in order to prevent crumbling.
Based on their shape and composition, the researchers venture that the tablets may have served as some sort of eye medicine or eyewash. The Latin name for eyewash (collyrium), in fact, comes from the Greek word κολλύρα, which means “small round loaves.”
Although it remains to be seen just how effective this sort of compound would have been as an actual eye treatment, the rare glimpse into Roman-era medicinal practices is fascinating nonetheless. The vast majority of our knowledge of ancient medicine comes from writings—which may vary in accuracy and lack crucial details—so the presence of actual physical evidence is especially exciting.
January 2, 2013
In 1719, Daniel Defoe wrote in Robinson Crusoe, “He declar’d he had reserv’d nothing from the Men, and went Share and Share alike with them in every Bit they eat.” Defoe’s famous sharing phrase has persisted throughout the years, passing from parent to child as a lesson on the virtues of sharing with family, peers and even strangers.
But in the context of evolution and survival of the fittest, sharing makes no sense. Until now, scientists assumed that humans alone subscribed to this behavior, especially when it comes to sharing with strangers, and wrote the trait off as a quirk stemming from our unique cognitive and social development.
Sure, primatologists know that great apes help and voluntarily share food with other group mates (acts that indirectly benefit themselves). But strangers? Such behavior is unheard of among species that often compete aggressively with other groups and even murder foreign individuals.
Researchers from Duke University decided to challenge the great ape’s bad sharing rep, seeking to discover whether our furry relatives may also have a propensity for partitioning goods with animals they do not know. The scientists chose bonobos—a type of great ape sometimes referred to as the pygmy chimpanzee—for their study. Compared to chimpanzees, bonobos possess a relatively high tolerance for strangers, so they seemed like a logical candidate for investigations into the nature of sharing.
At a bonobo sanctuary in the Democratic Republic of the Congo, they enrolled 15 wild-born bonobos, orphaned and rescued from the illegal wildlife trade, in four experiments. In the first experiment, the researchers led a bonobo into a room piled high with delicious banana slices. Behind two sliding doors, they placed either a friend of the main bonobo or a stranger (a bonobo unrelated and unknown to their main research subject). The bonobo with the bananas could choose to eat the food all on its own, or open the sliding doors and invite either or both of the other apes to join in. In the second experiment, they placed only one bonobo—either the friend or the stranger—behind a door and left the second room empty.
The results, which they describe this week in the journal PLOS ONE, confounded the researchers. In more than 70 percent of the trials, the bonobos shared their food at least once. They preferred to release the stranger over their group mate, and the stranger in turn often released the other bonobo, even though that meant splitting the food three ways and being outnumbered by two bonobos that already knew each other. They ignored the door leading to the empty room, showing that the novelty of opening the door was not motivating their behavior.
So, were the bonobos willing to share their food with strangers because of an overwhelming desire to interact with the unknown apes, or were they motivated by a sense of altruism? The researchers set up two more experiments to find out. They arranged a rope which, when pulled, released either a bonobo stranger or friend into a room which held more bananas. A mesh divider separated the main bonobo from that room, however, meaning it could neither reach the food nor interact directly with the released ape. Even when there was no immediate social or culinary reward on offer, the researchers found, 9 out of 10 bonobos still chose to release their friend or the stranger at least once, allowing the other ape to reach the banana reward.
Bonobos drew the line, however, in the final experiment. This setup allowed both bonobos to access the food, but did not let them interact physically with the stranger or friend. In other words, the main bonobo would have to forfeit some of its food but receive no reward of sniffing, petting or playing with another ape. None of the bonobos chose to open the door, suggesting that the seemingly altruistic sharing of the first two experiments was just a ploy to gain gratifying access to intriguing strangers and, to a lesser extent, friends. The third experiment, however, showed that the bonobos’ motivations are not completely selfish: when the food was so far out of reach that they themselves could not benefit, they allowed a friend or stranger to enjoy it instead.
Bonobos, in other words, break the rules when it comes to sharing, showing that kindness towards strangers is not unique to humans. Oddly enough, unlike their bipedal counterparts, bonobos even seem to prefer strangers to group mates. This behavior, the study authors think, might have evolved to help groups of bonobos expand their social networks. Further investigations may lend clues about the evolution of sharing in humans.
“Like chimpanzees, our species would kill strangers; like bonobos, we could also be very nice to strangers,” said Jingzhi Tan, an evolutionary anthropologist at Duke University and lead author of the paper, in a statement. “Our results highlight the importance of studying bonobos to fully understand the origins of such human behaviors.”