December 4, 2013
Since its discovery in 1990, La Sima de los Huesos, an underground cave in Northern Spain’s Atapuerca Mountains, has yielded more than 6,000 fossils from 28 individual ancient human ancestors, making it Europe’s most significant site for the study of ancient humans. But despite years of analysis, the exact age and even the species to which these individuals belonged have been in doubt.
Now, though, an international group of scientists has extracted and sequenced DNA from the fossilized femur of one of these individuals for the first time. The resulting data—which represent the oldest genetic material ever sequenced from a hominin, or ancient human ancestor—finally give us an idea of the age and lineage of these mysterious individuals, and it’s not what many scientists expected.
The fossilized bone tested, a femur, is roughly 400,000 years old. But the big surprise is that, although scientists had previously believed the fossils belonged to Neanderthals because of their anatomical appearance, the DNA analysis actually shows they’re more closely related to Denisovans, a recently-discovered third lineage of human ancestors known only from DNA isolated from a few fossils found in Siberia in 2010. The findings, published today in Nature, will force anthropologists to further reconsider how the Denisovans, Neanderthals and the direct ancestors of modern-day humans fit together in a complicated family tree.
The analysis was enabled by recent advances in methods for recovering ancient DNA fragments developed at the Max Planck Institute for Evolutionary Anthropology in Germany, previously used to analyze the DNA of a cave bear fossil found in the same cave. “This wouldn’t have been possible just two years ago,” says Juan Luis Arsuaga, a paleontologist at the University of Madrid who led the initial excavations of the cave and collaborated on the new study. “And even given these new methods, we still didn’t expect these bones to preserve DNA, because they’re so old—ten times older than some of the oldest Neanderthals from whom we’ve taken DNA.”
After extracting two grams of crushed bone from the femur, a group of scientists led by Matthias Meyer isolated its mitochondrial DNA (mtDNA), a pool of genetic material that’s distinct from the DNA in the chromosomes located in our cells’ nuclei. Instead, mtDNA lives in our cells’ mitochondria—microscopic organelles responsible for cellular respiration—and is much shorter than nuclear DNA.
There’s another quirk of mtDNA that makes it especially valuable as a means of studying the evolution of ancient humans: Unlike your nuclear DNA, which is a mix of DNA from both your parents, your mtDNA comes solely from your mother, because a sperm’s mitochondria are packed into the midpiece of its tail, which is shed at fertilization. As a result, mtDNA is nearly identical from generation to generation, and a limited number of distinct sequences of mtDNA (called haplogroups) have been observed in both modern humans and ancient human ancestors. Unlike anatomical characteristics and nuclear DNA, which can vary within a group and make it difficult to confidently distinguish one from another, mtDNA is generally consistent, making it easier to link a particular specimen with a lineage.
Which is why, when the researchers compared the femur’s mtDNA to previously sequenced samples from Neanderthals, from a Denisovan finger bone and tooth found in Siberia and from many different modern humans, they found it so surprising that it more closely resembled the Denisovans. “This was really unexpected,” Arsuaga says. “We had to think really hard to come up with a few scenarios that could potentially explain this.”
Anthropologists had already known that all three lineages (humans, Neanderthals and Denisovans) shared a common ancestor, but it’s far from clear how all three groups fit together, and the picture is further clouded by the fact that interbreeding may have occurred between them after they diverged. Helpfully, comparing the femur’s mtDNA to the Neanderthal, Denisovan and modern human samples allowed the researchers to estimate its age—based upon known rates of mtDNA mutation, the previously established ages of the other samples, and the degree of difference between them—leading to the 400,000 year figure.
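The age estimate rests on molecular-clock logic: two lineages accumulate mutations independently after they split, so the per-site divergence between their sequences, divided by twice the mutation rate, gives the time since the split. A minimal sketch of that arithmetic (all numbers here are hypothetical placeholders for illustration, not the study’s actual figures):

```python
def divergence_time_years(num_differences: int, sequence_length: int,
                          mutation_rate_per_site_per_year: float) -> float:
    """Toy molecular-clock estimate. Both lineages mutate independently
    after splitting, so divergence accumulates at twice the per-lineage
    rate: t = (per-site divergence) / (2 * mutation rate)."""
    per_site_divergence = num_differences / sequence_length
    return per_site_divergence / (2 * mutation_rate_per_site_per_year)

# Hypothetical inputs: ~400 differences across a ~16,500-bp mitochondrial
# genome, at an assumed rate of 3e-8 mutations per site per year
print(round(divergence_time_years(400, 16500, 3e-8)))  # roughly 400,000 years
```

In practice the researchers calibrated against the known ages of other Neanderthal and Denisovan samples rather than a single fixed rate, but the underlying logic is the same.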
To explain how a Neanderthal-looking individual could come to have Denisovan mtDNA during this time period, the scientists present several different hypothetical scenarios. It’s possible, for instance, that the fossil in question belongs to a lineage that served as ancestors of both Neanderthals and Denisovans, or more likely, one that came after the split between the two groups (estimated to be around 1 million years ago) and was closely related to the latter but not the former. It’s also possible that the femur belongs to a third, different group, and that its similarities to Denisovan mtDNA are explained by either interbreeding with the Denisovans or the existence of yet another hominin lineage that bred with both Denisovans and the La Sima de los Huesos population and introduced the same mtDNA to both groups.
If this sounds like a complicated family tree to you, you’re not alone. This analysis, along with earlier work, adds further mystery to an already puzzling situation. Initial testing on the Denisovan finger bone found in Siberia, for instance, found that it shared mtDNA with modern humans living in New Guinea, but nowhere else. Meanwhile, it was previously thought that Neanderthals had settled in Europe and Denisovans further east, on the other side of the Ural Mountains. The new analysis complicates that idea.
For now, the researchers believe the most plausible scenario (illustrated below) is that the femur belongs to a lineage that split off from Denisovans sometime after they diverged from the common ancestor of both Neanderthals and modern humans. But perhaps the most exciting conclusion to come out of this work is that it proves that genetic material can survive for at least 400,000 years, and can be analyzed even after that amount of degradation. Armed with this knowledge and the new techniques, anthropologists can now attempt to genetically survey many other ancient specimens in hopes of better understanding our family tree.
December 2, 2013
It’s a platitude that we’ve all heard dozens of times, whether to justify our treatment of other species or simply to celebrate a carnivorous lifestyle: humans are at the top of the food chain.
Ecologists, though, have a statistical way of calculating a species’ trophic level—its level, or rank, in a food chain. And interestingly enough, no one had ever rigorously applied this method to see exactly where humans fall.
Until, that is, a group of French researchers recently decided to use food supply data from the U.N. Food and Agriculture Organization (FAO) to calculate human trophic level (HTL) for the first time. Their findings, published today in the Proceedings of the National Academy of Sciences, might be a bit deflating for anyone who’s taken pride in occupying the top position.
On a scale of 1 to 5, with 1 being the score of a primary producer (a plant) and 5 being a pure apex predator (an animal that only eats meat and has few or no predators of its own, like a tiger, crocodile or boa constrictor), they found that based on diet, humans score a 2.21—roughly equal to an anchovy or pig. Their findings confirm common sense: We’re omnivores, eating a mix of plants and animals, rather than top-level predators that only consume meat.
To be clear, this doesn’t imply that we’re middle-level in that we routinely get eaten by higher-level predators—in modern society, at least, that isn’t a common concern—but that to be truly at the “top of the food chain,” in scientific terms, you have to strictly consume the meat of animals that are predators themselves. Obviously, as frequent consumers of rice, salad, bread, broccoli and cranberry sauce, among other plant products, we don’t fit that description.
The researchers, led by Sylvain Bonhommeau of the French Research Institute for Exploitation of the Sea, used FAO data to construct models of peoples’ diets in different countries over time, and used this to calculate HTL in 176 countries from 1961 to 2009. Calculating HTL is fairly straightforward: If a person’s diet is made up of half plant products and half meat, his or her trophic level will be 2.5. More meat, and the score increases; more plants, and it decreases.
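The weighted-average arithmetic behind that score can be sketched in a few lines. This is a toy version that assumes plants sit at trophic level 1 and meat (from herbivorous animals) at level 2; the published model works from detailed FAO food-item data rather than a single two-way split:

```python
def human_trophic_level(plant_fraction: float, meat_fraction: float) -> float:
    """Toy trophic-level calculation: a consumer's level is 1 plus the
    weighted mean trophic level of its diet (plants = 1, meat = 2)."""
    assert abs(plant_fraction + meat_fraction - 1.0) < 1e-9, "fractions must sum to 1"
    return 1 + plant_fraction * 1 + meat_fraction * 2

# Half plants, half meat -> 2.5, as in the example above
print(human_trophic_level(0.5, 0.5))                     # 2.5

# A 96.7 percent plant-based diet lands near Burundi's score
print(round(human_trophic_level(0.967, 0.033), 2))       # 2.03
```

Note how a fully plant-based diet bottoms out at 2.0 under this scheme, which is why even the lowest-scoring country sits just above 2.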
With the FAO data, they found that while the worldwide HTL is 2.21, this varies widely: The country with the lowest score (Burundi) was 2.04, representing a diet that was 96.7 percent plant-based, while the country with the highest (Iceland) was 2.54, reflecting a diet that contained slightly more meat than plants.
On the whole, since 1961, our species’ overall HTL has increased just slightly—from 2.15 to 2.21—but this average obscures several important regional trends.
A group of 30 developing nations in Southeast Asia and Sub-Saharan Africa (shown in red)—including Indonesia, Bangladesh and Nigeria, for example—have had HTLs below 2.1 during the entire period. But a second group of developing countries that includes India and China (shown in blue) has slightly higher HTL measures that have consistently risen over time, going from around 2.18 to over 2.2. The HTLs of a third group, shown in green (including Brazil, Chile, South Africa and several countries in Southern Europe), have risen further, from around 2.28 to 2.33.
By contrast, HTL in the world’s wealthiest countries (shown in purple)—including those in North America, Northern Europe and Australia—was extremely high for most of the study period but decreased slightly starting during the 1990s, going from around 2.42 to 2.4. A fifth group of small, mostly island countries with limited access to agricultural products (shown in yellow, including Iceland and Mauritania) has seen more dramatic declines, from over 2.6 to less than 2.5.
These trends closely correlate, it turns out, with a number of World Bank development indicators, such as gross domestic product, urbanization and education level. The basic trend, in other words, is that as people become wealthier, they eat more meat and fewer vegetable products.
That has translated to massive increases in meat consumption in many developing countries, including China, India, Brazil and South Africa. It also explains why meat consumption has leveled off in the world’s richest countries, as gains in wealth leveled off as well. Interestingly, these trends in meat consumption also correlate with observed and projected trends in trash production—data indicate that more wealth means more meat consumption and more garbage.
But the environmental impacts of eating meat go far beyond the trash thrown away afterward. Because of the quantities of water used, the greenhouse gases emitted and the pollution generated during the meat production process, it’s not a big leap to speculate that the transition of huge proportions of the world’s population from a plant-based diet to a meat-centric one could have dire consequences for the environment.
Unfortunately, like the garbage problem, the meat problem doesn’t hint at an obvious solution. Billions of people getting wealthier and having more choice over what they eat is, on a basic level, a good thing. In an ideal world, we’d figure out ways to make that transition less damaging while still feeding huge populations. For example, some researchers have advocated for offbeat food sources like mealworms as a sustainable meat alternative, while others are trying to develop lab-grown cultured meat as an environmentally-friendly alternative. Meanwhile, some in Sweden are proposing a tax on meat to curb its environmental cost while government officials in the UK are urging consumers to cut back on their demand for meat to increase global food security and to improve health. Time will tell which approaches stick.
In the meantime, simply keeping track of the amount of meat we’re eating as a society via HTL could provide a host of useful baseline information. As the authors write, “HTL can be used by educators to illustrate the ecological position of humans in the food web, by policy makers to monitor the nutrition transition at global and national scales and to analyze the effects of development on dietary trends, and by resource managers to assess the impacts of human diets on resource use.”
In other words, monitoring the intricacies of our middling position on the food chain may yield scientific fodder to tackle problems like food security, obesity, malnutrition and environmental costs of the agricultural industry. A heavy caseload for a number that ranks us on the same trophic level as anchovies.
November 15, 2013
There are plenty of ways to study history. You can conduct archaeological digs, examining the artifacts and structures buried under the ground to learn about past lifestyles. You can read historical texts, perusing the written record to better understand events that occurred long ago.
But an international group of medical researchers led by Andrés Moreno-Estrada and Carlos Bustamante of Stanford and Eden Martin of the University of Miami is looking instead at a decidedly unconventional historical record: human DNA.
Hidden in the microscopic genetic material of people from the Caribbean, they’ve found, is an indelible record of human history, stretching back centuries to the arrival of Europeans, the decimation of Native American populations and the trans-Atlantic slave trade. By analyzing these genetic samples and comparing them to the genes of people around the world, they’re able to pinpoint not only the geographic origin of various populations but even the timing of when great migrations occurred.
As part of a new project, documented in a study published yesterday in PLOS Genetics, the researchers sampled and studied the DNA of 251 people living in Florida who had ancestry from one of six countries and islands that border the Caribbean—Cuba, Haiti, Dominican Republic, Puerto Rico, Honduras and Colombia—along with 79 residents of Venezuela who belong to one of three Native American groups (the Yukpa, Warao and Bari tribes). Each study participant was part of a triad—two parents and one of their children—all of whom were surveyed, so the researchers could track which particular genetic markers were passed on from which parent.
The researchers sequenced the DNA of these participants, analyzing their entire genomes in search of particular genetic sequences—called single-nucleotide polymorphisms (SNPs)—that often differ between unrelated individuals and are passed down from parent to child. To provide context for the SNPs they found in people from these groups and areas, they compared them to existing databases of sequenced DNA from thousands of people globally, such as data from the HapMap Project.
Tracing a person’s DNA to a geographical area is relatively straightforward—it’s well-established that particular SNPs tend to occur in different frequencies in people with different ancestries. As a result, sequencing the DNA of someone living in Florida whose family came from Haiti can reveal what proportion of his or her ancestors originally came from Africa and even where in Africa those people lived.
But one of the most amazing things about the state of modern genetics is that it also allows scientists to draw chronological conclusions about human migration, because blocks of these SNPs shorten over time at a generally consistent rate. “You can essentially break the genome up into European chunks, Native American chunks and African chunks,” Martin says. “If each of these regions are longer, it suggests they arrived in the gene pool more recently, because time tends to break up the genome. If these chunks are shorter, it suggests there’s been a lot of recombination and mixing up of the genome, which suggests the events were longer ago.”
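A back-of-the-envelope version of this chunk-length logic can be written down directly. Under a simplified one-pulse admixture model, recombination whittles ancestry tracts so that their expected length after g generations is roughly 1/g Morgans (100/g centimorgans), so the tract length itself dates the mixing event. The numbers below are hypothetical, chosen only to illustrate the relationship, not the study’s actual estimates:

```python
def generations_since_admixture(mean_tract_length_cm: float) -> float:
    """Toy one-pulse admixture dating: expected ancestry-tract length is
    about 100/g centimorgans after g generations, so g ~ 100 / length."""
    return 100.0 / mean_tract_length_cm

def years_since_admixture(mean_tract_length_cm: float,
                          years_per_generation: float = 25.0) -> float:
    """Convert the generation estimate to years, assuming a generation time."""
    return generations_since_admixture(mean_tract_length_cm) * years_per_generation

# Hypothetical: tracts averaging 10 cM would point to an admixture event
# roughly 10 generations ago
print(years_since_admixture(10.0))  # 250.0
```

Longer tracts thus mean fewer generations of recombination (a recent arrival), and shorter tracts mean more (an older one)—exactly the intuition Martin describes.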
Modeling their DNA data with these assumptions built in, the researchers created a portrait of Caribbean migration and population change that stretches back to before the arrival of Columbus. One of their most interesting findings was just how few Native Americans survived the arrival of Europeans, based on the DNA data. “There was an initial Native American genetic component on the islands,” Martin says, “but after colonization by the Europeans, they were almost decimated.”
This decimation was the result of European attacks and enslavement, as well as the disease and starvation that came in their wake. The DNA analysis showed that the native population collapse of Caribbean islands happened almost immediately after the arrival of Columbus, within one generation of his first visits and the appearance of other Europeans. The gene pool on the mainland, by contrast, shows a more significant Native American influence, indicating that they didn’t die off at the same rates.
What replaced the missing Native American genes in island populations? The answer reflects the conquering Europeans’ solution to diminishing populations available for labor: slaves kidnapped and imported from Africa. The DNA analysis showed a heavy influence from characteristically African SNPs, but notably, it revealed two separate phases in the trans-Atlantic slave trade. “There were two distinct pulses of African immigration,” Martin says. “The first pulse came from one part of West Africa—the Senegal region—and the second, larger pulse came from another part of it, near the Congo.”
This corresponds to written records and other historical sources, which show an initial phase of slave trade starting around 1550, in which slaves were mostly kidnapped from the Senegambia area of the Mali Empire, covering modern-day Senegal, Gambia and Mali (the orange area in the map at right). This first push accounted for somewhere between 3 and 16 percent of the total Atlantic slave trade. It was followed by a second, much heavier period that made up more than half of the trade and peaked during the late 1700s, in which slaves were largely taken from what is now Nigeria, Cameroon, Gabon and the Congo (the red and green areas).
The genetic analysis can also look at genes that are passed down on the X chromosome in particular, revealing the historical influence of different ancestries on both the female and male sides of the genome. They found that, in the populations studied, Native American SNPs were more prevalent on the X chromosome than on the other chromosomes, reflecting the history of both marriage and rape of Native American women by Spanish men who settled in the area.
As medical researchers, the scientists are primarily interested in using the findings to advance research into the role of genetics in diseases that disproportionately affect Hispanic populations. Similar research on genetics and ethnicity has revealed that, for instance, Europeans are much more likely to suffer from cystic fibrosis, while sickle-cell anemia tends to strike people of African ancestry.
“Hispanics are extremely diverse genetically—they originate from countries all over the world,” Martin says. “So that poses great challenges in genetic studies. We can’t just lump all Hispanics into a group and think of them as homogenous, so we’re trying to look more deeply into their genetic heritage and where it came from.”
November 13, 2013
None of these drinks, though, has anything on the tradition of drinking spicy beverages in Mexico. A new analysis of ancient pottery unearthed from archaeological sites near Chiapa de Corzo, in southern Mexico, shows that people were using chili peppers to make their drinks spicy as far back as 400 BC.
The analysis, conducted by a group of researchers led by Terry Powis of Kennesaw State University, was published today in PLOS ONE. As part of the study, the scientists chemically tested 13 pottery vessels that had been excavated from a series of sites in the area linked to speakers of the Mixe-Zoquean group of languages—closely related to the language of the Olmec civilization—and were previously dated to years ranging from 400 BC to 300 AD.
When they scraped tiny samples out of the inside of each of the vessels, used chemical solvents to extract organic compounds, and analyzed them with liquid chromatography testing, they found dihydrocapsaicin and other irritants that serve as evidence that Capsicum species, the taxonomic group that includes spicy chili peppers, once filled five of the vessels. Based on the vessels’ shape and prior archaeological work on the Mixe-Zoquean culture, the researchers believe they were used for all sorts of liquids—likely beverages, but perhaps condiments or sauces.
Previously, research by Smithsonian scientists had shown that chili peppers were domesticated much earlier—perhaps as far back as 6,000 years ago—in Ecuador. This new research, however, is the oldest evidence of chili pepper use in Central or North America, and the first known instance of their use in ancient beverages, rather than in solid food.
Interestingly, the researchers originally began the project looking for evidence of the ancient use of cocoa beans in beverages. But their testing didn’t reveal any traces of cocoa left behind in the vessels, suggesting that the tradition of spicy drinks came first, and chocolate flavoring was only added to such drinks later on.
Other contextual evidence also suggests that the spicy drinks of Mixe-Zoquean culture differed significantly from the spiced hot chocolate enjoyed in Mexico today. Three of the vessels were found buried in the tombs of elite-status individuals, while the other two were excavated from temple-like structures. This context, they say, suggests that the beverages might have been used in ceremonial and ritual circumstances.
The authors note that this doesn’t rule out the possibility that the beverages were commonly drunk as well—a more thorough survey of vessels would need to be conducted to know for sure. Additionally, the researchers speculate that rather than a flavoring, chili peppers might have been ground up into a paste and coated on the walls of vessels as an insect and vermin repellent. If that was indeed the case, then bless the serendipity of whoever put liquidy chocolate into one of those vessels and created the wonder that is spicy hot cocoa.
October 30, 2013
In 2013, if you’re someone who cares about the environment, your first and foremost concern is probably climate change. After that, you might worry about things like radioactive contamination, collapsing honeybee colonies and endangered ecosystems, among other contemporary environmental perils that fill recent news headlines.
But a number of researchers in the field are focused on a problem that has faded out of the news cycle: the piles of garbage that are growing around the world.
A recent World Bank report projected that the amount of solid waste generated globally will nearly double by the year 2025, going from 3.5 million tons to 6 million tons per day. But the truly concerning part is that these figures will only keep growing for the foreseeable future. We likely won’t hit peak garbage—the moment when our global trash production hits its highest rate, then levels off—until sometime after the year 2100, the projection indicates, when we produce 11 million tons of trash per day.
Why does this matter? One reason is that much of this waste isn’t handled properly: Millions of plastic fragments flood the world’s oceans, disrupting marine ecosystems, and plenty of trash in developing countries is either burned in incinerators that generate air pollution or dumped recklessly in urban environments.
Even if we sealed all our waste in sanitary landfills, however, there’d be a much bigger problem with our growing piles of garbage—all the industrial activities and consumption that they represent. “Honestly, I don’t see waste disposal as a huge environmental problem in itself,” explains Daniel Hoornweg, one of the authors of the World Bank report and a professor at the University of Ontario, who authored an article on peak garbage published today in Nature. “But it’s the easiest way to see how the environment is being affected by our lifestyles overall.”
The quantity of garbage we generate reflects the amount of new products we buy, and therefore the energy, resources and upstream waste that are involved in producing those items. As a result, Hoornweg says, “solid waste is the canary in the coal mine. It shows how much of an impact we’re having globally, as a species, on the planet as a whole.”
This is why he and others are concerned about peak garbage and are attempting to project our trash trends decades into the future. To make such estimates, they rely upon projections of population growth along with a number of established trends in waste: People create much more trash when they move to cities (and begin consuming more packaged products) and when they become wealthier (and increase their consumption overall).
Historical data indicate, though, that at a certain point, the per capita amount of garbage generated in wealthy societies tends to level off—apparently, there’s only so much a person can consume (and only so much trash they can produce). As a result, in many of the world’s wealthy countries, the average person produces slightly more than 3 pounds of solid waste per day, and that number isn’t estimated to change significantly going forward.
The number of people moving to cities and consuming more in the rest of the world, however, is projected to surge over the coming century—and even as the resulting waste production finally levels off in East Asia around 2075, it’ll be offset by continuing increases in the growing urban areas of South Asia and Sub-Saharan Africa, the authors of the Nature article note. As a result, unless we significantly reduce the per capita waste production of wealthy city-dwellers,
the world as a whole won’t hit peak garbage until sometime after 2100, when we’re creating three times as much trash as we are right now.
How can we address our population’s growing consumption problem? One of the main things to consider is that it’s largely driven by people in the developing world voluntarily moving to cities and improving their standard of living, both signs of economic progress in their own right. But even if these demographic shifts continue, the projected rates of garbage growth aren’t entirely inevitable, because there are cultural and policy dimensions to waste production.
The average person in Japan, for example, creates about one-third less trash than an American, even though the two countries have similar levels of GDP per person. This is partly because of higher-density living arrangements and higher prices for imported goods, but also because of norms surrounding consumption. In many Japanese municipalities, trash must be disposed of in clear bags (to publicly show who isn’t bothering to recycle) and recyclables are routinely sorted into dozens of categories, policies driven by the limited amount of space for landfills in the small country.
Creating policies elsewhere that give people incentives to produce less waste, therefore, could be a way of tackling the problem. But, because our garbage is just the end result of a host of industrial activities, some reduction measures will be less important than others. Designing recyclable packaging would be a much less useful solution, for instance, than designing products that don’t need to be replaced as often. Even better, as Hoornweg and his coauthors argue in the article, would be accelerating ongoing increases in education and economic development in the developing world, especially Africa, which would cause urban population growth—and also the amount of trash produced per capita—to level off sooner.
Garbage might seem like a passé environmental issue, but it’s a proxy for nearly all the others—so tripling our global rate of garbage production is a particularly bad idea. “The planet is having enough trouble handling the cumulative impacts that we’re subjecting it to today,” Hoornweg says. “So with this projection, we’re basically looking at tripling the total amount of stress that we’re putting the planet under.”