June 18, 2013
Cyprus, the Mediterranean island nation just south of Turkey, took centuries to gain its independence. The Greeks, Assyrians, Egyptians, Persians, Romans, Ottomans, British and others all took their turns taking over the island, and each left their mark on the archeological record. But in a ruined chamber in a castle on the western corner of the island, it may be more apt to say the invaders left a smear.
In 1191, during the Third Crusade, King Richard I of England invaded Cyprus and ordered that a castle be built on the island’s western corner in order to defend the harbor there. Called Saranda Kolones, Greek for “forty columns,” the castle took its name from its many monolithic columns. But in typical tumultuous Cyprus fashion, the medieval castle was used for only thirty years before it was destroyed by an earthquake. By then, King Richard had sold Cyprus to Guy de Lusignan, the King of Jerusalem. Lusignan and his successors had other plans for the island: the wrecked port was abandoned and the castle never rebuilt.
As castles go, Saranda Kolones had a pretty poor run. But two University of Cambridge researchers recently realized that, precisely thanks to the castle’s short use, a priceless treasure had been left behind in Saranda Kolones’ bowels. One of the centuries-old castle latrines (read: ancient toilet), they found, was still full of dried-up poo. That feces, they thought, could provide valuable insight into what kind of parasites plagued the former residents’ guts. And because only 30 years’ worth of waste clogged the ancient sewage system, those parasites could provide specific insight into what ailed medieval crusaders. The researchers rolled up their sleeves and collected samples from the desiccated cesspool.
To rehydrate the ancient night soil, the team placed one gram of their sample into a liquid chemical solution. They used micro sieves, or tiny strainers, to separate parasite eggs from the digested remains of the crusaders’ meals. They created 20 slides and peered into their microscopes to see what creatures the soldiers may have left behind.
The samples revealed 118 “lemon-shaped” Trichuris trichiura eggs–a type of roundworm commonly called the whipworm–as well as 1,179 Ascaris lumbricoides, or giant roundworm, eggs. A control sample of non-toilet soil they tested did not contain any parasite eggs, confirming that the eggs did indeed come from the toilet, they report in the International Journal of Paleopathology.
The study of ancient parasites, whether through old bones that reveal leprosy-causing pathogens or dried-up leaves that elucidate the cause of the Irish potato famine, is a thriving field. In this case, the long-dead parasite eggs were pooped out by the crusaders using the toilet centuries ago. These species reproduce within human bodies, and go on to infect new hosts through egg-contaminated soil or food, delivered courtesy of the host.
Heavy infection with either of these worms was no picnic. The authors write, first of giant roundworms:
The mature female then starts to lay about 200,000 eggs per day that can be fertile or unfertile if no male worms are present. Although a mild infection with roundworms is mostly asymptomatic, heavy burdens with Ascaris can cause intestinal blockage and abdominal pain in adults. Because children are less able to tolerate parasites that compete with them for nutrients in their diet, heavy infection with roundworms can cause nutritional impairment, vitamin deficiencies, anaemia and growth retardation.
And of whipworms:
When the females reach maturity they can release 2000–10,000 eggs per day. As with roundworm a heavy worm burden may contribute to malnutrition, stunted growth in childhood and sometimes mechanical damage of the intestinal mucosa, diarrhoea and prolapsed rectum.
The presence of these worms, the authors write, attests to the poor hygienic conditions the castle residents likely practiced and put up with. “Poor hygiene with dirty hands, contamination of the food and water supplies with faecal material, inadequate disposal of the faecal material, and consumption of unwashed vegetables fertilized with human faeces are some of the means through which roundworms and whipworms are spread.”
The worms also could have jeopardized the health of their hosts, especially during years of famine when both parasite and human competed for scarce nutrients from meals few and far between. Previous studies found that between 15 and 20 percent of nobles and the clergy died from malnutrition and infectious disease during the crusades. Although death records for poor soldiers are not available, the authors think it’s safe to assume that malnutrition probably hit the lower-ranking crusaders even harder.
“It is quite likely that a heavy load of intestinal parasites in soldiers on crusade expeditions and in castles undergoing long sieges would have predisposed to death from malnutrition,” they write. “This clearly has implications for our understanding of health and disease on mediaeval military expeditions such as the crusades.”
Before contemporary readers breathe a sigh of relief that these parasites infested the guts of people living more than 800 years ago, it’s important to note that the giant roundworm infests an estimated one-sixth of all humans living today. As the authors write, “In modern times A. lumbricoides and T. trichiura are two of the most common and widespread intestinal parasites.” Other parasites continue to plague human populations worldwide, especially in developing countries. Who knows what the archaeologists of the future will find in the scum of your latrine?
June 17, 2013
If you’ve ever attempted to move to a foreign country and learn to speak the local language, you’re aware that successfully doing so is an enormous challenge.
But in our age of widely distributed Wi-Fi hotspots, free Skype video calls from one hemisphere to another and favorite TV shows available anywhere in the world over the web, speaking a foreign language may be more difficult than ever.
That’s because, as new research shows, merely seeing faces and images that you associate with home could make speaking in a foreign tongue more difficult. In a study published today in the Proceedings of the National Academy of Sciences, researchers from Columbia University and Singapore Management University found that for Chinese students who’d recently moved to the U.S., seeing several different types of China-related visual cues measurably reduced their fluency in English.
In the first part of the study, the researchers tested 42 students’ English speaking ability under two different circumstances—while looking at a computer screen, they either spoke with an image of a Caucasian face or a Chinese one. In both cases, they heard the exact same prerecorded passage (about the experience of living on a college campus) and responded with their own thoughts in English.
When the students conversed with the image of a Chinese person, they spoke more slowly and with slightly less fluency, as judged by an observer who was unaware of which image they’d been looking at. As a control, the researchers also tested a group of native English speakers’ fluency when speaking with both images, and found no difference.
In the second part, instead of looking at a face, 23 other students viewed quintessential icons of Americana (Mt. Rushmore, for example) or icons of Chinese culture (such as the Great Wall) and then were asked to describe the image in English. Once again, looking at the Chinese-related images led to reduced fluency and word speed.
Finally, in the last part of the experiment, the researchers examined particular objects with names that can be tricky to translate from Chinese to English. The Chinese word for pistachio, for example, translates literally as “happy nut,” while the word for lollipop translates as “stick candy,” and the researchers wanted to see how prone the students would be, under various conditions, to using these sorts of literal Chinese linguistic structures. (For an English example, think of the word “watermelon”—you wouldn’t translate it into other languages by combining that language’s word for “water” with its word for “melon.”)
The researchers showed photos of these sorts of objects to 85 Chinese-speaking students who’d only been in the country for about 3 months. Some students saw familiar images from Chinese culture, while others saw American or neutral images. All groups demonstrated the same level of accuracy in describing objects like pistachios, but they proved to be much more likely to make incorrect overly literal translations (calling pistachios, for example, “happy nuts”) when shown Chinese imagery first than with either of the other two categories.
The researchers explain the results as an example of “frame-switching.” In essence, for the native Chinese speakers still learning English as a second language, being exposed to faces or images that they associated with China unconsciously primed them to think in a Chinese frame of reference. As a result, it took more effort to speak English—causing them to speak more slowly—and perhaps made them more likely to “think in” Chinese too, using literal Chinese linguistic structures instead of translating into the correct English words.
All this could eventually have some impact on the practices surrounding the teaching of second languages, providing further evidence that immersion is the most effective way for someone to gain mastery because it reduces the sort of counter-priming examined in this study. If you’ve recently immigrated to a place where your native tongue isn’t widely spoken or you’re living abroad for an extended period of time, there’s a lesson here for you too: If you want to become fluent in the language around you, keep the video calls and TV shows from home to a minimum.
June 14, 2013
In the past two decades, we’ve seen dramatic images of ice shelves and the floating tongues of glaciers crumble into the ocean. The summer of 2012 saw a huge chunk of ice–two times the size of Manhattan–snap off of Greenland’s Petermann Glacier. Two years earlier, a piece of ice twice as big as that one split from the glacier’s front. In early 2002, ice covering an area greater than the size of Rhode Island sloughed into the ocean from a lobe of the Antarctic Peninsula’s Larsen Ice Shelf, releasing three-quarters of a trillion tons of ice. Seven years before that, the northernmost sector of the same ice shelf completely collapsed, and an area of ice roughly the size of Hawaii’s Oahu island dissolved into the sea.
Scientists have long thought that sudden and dramatic ice calving events like these, along with more moderate episodes of calving that occur daily, were the main mechanisms for how polar ice gets lost to the sea. New research, however, shows that iceberg calving is only the tip of the iceberg–seawater bathing the undersides of ice shelves contributes most to ice loss even before calving begins, at least in Antarctica.
The discovery, published in the journal Science, shows that interactions with the ocean underneath floating ice account for 55 percent of the ice lost from Antarctic ice shelves between 2003 and 2008. The researchers arrived at their findings by studying airborne measurements of ice thickness from radar sounders and the rates of change in ice thickness based on satellite data. Combining these data allowed them to calculate the rates of bottom melting.
Given that thick platforms of floating ice surround nearly 75 percent of Earth’s southernmost continent, covering nearly 580,000 square miles, ice melted in this fashion may well be the main contributor to sea level rise. “This has profound implications for our understanding of interactions between Antarctica and climate change,” said lead author Eric Rignot, a researcher at UC Irvine and NASA’s Jet Propulsion Laboratory, in a statement. “It basically puts the Southern Ocean up front as the most significant control on the evolution of the polar ice sheet.”
Interestingly, the big ice shelves–Ross, Ronne and Filchner, which cover about 61 percent of Antarctica’s total ice shelf area–contribute only a small fraction of meltwater through their bases. Instead, fewer than a dozen small ice shelves, particularly those on the Antarctic Peninsula, are responsible for most–nearly 85 percent–of the basal melting observed by the authors during their study period. These shelves not only float in relatively warmer water, but their small sizes may mean that their interiors are less sheltered from the already warmer ocean waters that creep underneath the ice.
The findings reveal a lot about the vulnerability of polar ice in a warming world. Ice sheets ooze through glaciers to the sea, where the glaciers interlace and form ice shelves. These shelves are akin to a cork that keeps the contents inside from spewing out–when ice shelves collapse, the glaciers that feed them thin and accelerate, helping to drain the interior ice sheet. Polar ice sheets are already losing at least three times as much ice each year as they were in the 1990s, and the findings released today may provide a mechanism for this frantic pace.
In fact, the major ice calving events of the last two decades on the Petermann Glacier and Larsen Ice Shelf may have begun with melting from underneath, which weakened the ice’s ability to coalesce into a solid mass.
“Ice shelf melt can be compensated by ice flow from the continent,” Rignot added. “But in a number of places around Antarctica, they are melting too fast, and as a consequence, glaciers and the entire continent are changing.”
June 13, 2013
For centuries, millions of Europeans suffering from leprosy were shunned by society, made to wear bells that signaled to healthy citizens they were nearby. The infectious illness, also known as Hansen’s Disease, was poorly understood, often believed to be hereditary or a punishment from God. At its height, nearly one in 30 had the disease in some regions; by the 13th century, the number of leper hospitals active in Europe hit its peak at 19,000. Then, in the 16th century, the affliction fell into decline. Soon, it had virtually disappeared from the continent.
The pathogen responsible for leprosy was discovered in 1873 in Norway, squashing previous assumptions about its cause. The earliest written mention of leprosy, one of the oldest-known pathogens to plague humans, appeared in 600 B.C. in China. Historical records show it plagued ancient Greek, Egyptian and Indian civilizations. In 2009, DNA analysis of a first-century man’s remains found in a Jerusalem tomb provided the earliest proven case of leprosy.
Now, DNA sequencing technology has provided clues about the evolution of the bacteria itself. Using well-preserved DNA samples from ancient skeletons, an international team of researchers has sequenced the genome of the pathogen Mycobacterium leprae as it existed in medieval times.
Until now, scientists hadn’t even been able to sequence the pathogen from living people—the bacterium can’t be grown in cell culture in the lab, so scientists usually infect mice with it to achieve a sample big enough for sequencing. The material gleaned from human bones for this study, exhumed from medieval graves, contained a tiny amount of bacterial DNA—less than 0.1 percent, to be exact. But thanks to extremely sensitive and precise technology, scientists were able to sequence five strains of M. leprae.
Today, more than 225,000 cases of leprosy arise each year, mostly in developing countries. Using samples from some of these cases, the researchers compared the centuries-old sequences to 11 modern strains of the pathogen, extracted from recent biopsies from several geographic regions.
The results, published today in the journal Science, reveal that the bacterium has, in terms of genetic makeup, remained largely the same over the last 1,000 years. Only 800 mutations occurred among the 16 genomes in that time, the researchers write. This number means that the mysterious disappearance of the disease by the Middle Ages in Europe can’t be attributed to M. leprae losing its virulence.
“If the explanation of the drop in leprosy cases isn’t in the pathogen, then it must be in the host—that is, in us,” says Stewart Cole, co-director of the study and the head of the École Polytechnique Fédérale de Lausanne’s Global Health Institute. “So that’s where we need to look.”
The pathogen’s genetic resilience was evident in its modern strains. Researchers found that a medieval strain present in Sweden and the U.K. was nearly identical to one currently found in the Middle East. Their findings also suggest that some strains found in the Americas originated in Europe. What they can’t tell us, however, is the direction in which the epidemic spread throughout history.
This research marks a growing trend in using DNA analysis to learn more about epidemics and other devastating events in human history. Last month, scientists sampled 166-year-old Irish potato leaves using similar technology: they determined that a previously unknown strain of P. infestans caused the blight that shrank 19th-century Ireland’s population by 25 percent. Perhaps future research could someday pinpoint the pathogen responsible for the bubonic plague, commonly known as the Black Death, which wiped out nearly half of Europe’s population between 1347 and 1351.
June 12, 2013
You likely don’t give a ton of thought to the sounds and patterns that make up the language you speak every day. But the human voice is capable of making a tremendous variety of noises, and no language includes all of them.
About 20 percent of the world’s languages, for example, make use of a type of sound called an ejective consonant, in which an intense burst of air is released suddenly. (Listen to all the ejectives here.) English, however—along with most European languages—does not include this noise.
Linguists have long assumed that the incorporation of different sounds into various languages is an entirely random process—that the fact that English includes no ejectives, for instance, is an accident of history, simply a result of the sounds arbitrarily incorporated into the language that would evolve into German, English and most other European languages. But recently, Caleb Everett, a linguist at the University of Miami, made a surprising discovery that suggests the assortment of sounds in human languages is not so random after all.
When Everett analyzed hundreds of different languages from around the world, as part of a study published today in PLOS ONE, he found that those that originally developed at higher elevations are significantly more likely to include ejective consonants. Moreover, he suggests an explanation that, at least intuitively, makes a lot of sense: The lower air pressure present at higher elevations enables speakers to make these ejective sounds with much less effort.
The finding—if it holds up when all languages are analyzed—would be the first instance in which geography is found to influence the sound patterns present in spoken words. It could open up many new avenues of inquiry for researchers seeking to understand the evolution of language throughout human history.
Everett started out by pulling a geographically diverse sampling of 567 languages from the pool of an estimated 6,909 that are currently spoken worldwide. For each language, he used one location that most accurately represented its point of origin, according to the World Atlas of Linguistic Structures. English, for example, was plotted as originating in England, even though it’s spread widely in the years since. But for most of the languages, making this determination is much less difficult than for English, since they’re typically pretty restricted in terms of geographic scope (the average number of speakers of each language analyzed is just 7,000).
He then compared the traits of the 475 languages that do not contain ejective consonants with the 92 that do. The ejective languages were clustered in eight geographic groups that roughly corresponded with five regions of high elevation—the North American Cordillera (which includes the Cascades and the Sierra Nevadas), the Andes and the Andean altiplano, the southern African plateau, the plateau of the east African rift and the Caucasus range.
When Everett broke things down statistically, he found that 87 percent of the languages with ejectives were located in or near high altitude regions (defined as places with elevations of 1,500 meters or greater), compared to just 43 percent of the languages without the sound. Of all languages located far from regions with high elevation, just 4 percent contained ejectives. And when he sliced the elevation criteria more finely—rather than just high altitude versus low altitude—he found that the odds of a given language containing ejectives kept increasing as the elevation of its origin point also increased.
Everett’s explanation for this phenomenon is fairly simple: Making ejective sounds requires effort, but slightly less effort when the air is thinner, as is the case at high altitudes. This is because the sound depends upon the speaker compressing a breath of air and releasing it in a sudden burst that accompanies the sound, and compressing air is easier when it’s less dense to begin with. As a result, over the thousands of years and countless random events that shape the evolution of a language, those that developed at high altitudes became gradually more and more likely to incorporate and retain ejectives. Noticeably absent, however, are ejectives in languages that originate close to the Tibetan and Iranian plateaus, a region known colloquially as the roof of the world.
The finding could prompt linguists to look for other geographically driven trends in the languages spoken around the world. For instance, there might be sounds that are easier to make at lower elevations, or perhaps drier air could make certain sounds trip off the tongue more readily.