May 23, 2013
Last September, a team of archaeologists in the UK made a remarkable find: under a city council parking lot in Leicester, they found the remains of King Richard III. The king ruled England for just two years (from 1483 until 1485) before his violent war-time death.
In February, after comparing DNA taken from the skeleton with that of surviving descendants of the king and testing the skeleton’s age, the group officially confirmed the identity of the body. Since then, forensic analysis has indicated that the king was killed by traumatic sword blows to the head—perhaps with enough force to drive his crown into his skull.
Now, the first academic paper to be published on the discovery provides more unnerving details on the circumstances of Richard III’s death. In a study to be published tomorrow in the journal Antiquity, the University of Leicester team writes that the king’s body looks like it was buried in a hurry, crammed into a hastily prepared grave that was too small for him. Further, he was left in a strange, slightly folded position, perhaps even with his hands tied together.
Instead of a carefully dug grave with straight walls, as was customary during the era, Richard III’s has sloping walls and is wider at the surface than at the bottom. The team determined this by comparing the layered patterns in the dirt abutting the grave with the jumbled soil filling it and surrounding the king’s remains.
What’s more, the king’s head was left leaning against one corner of the grave, indicating that a gravedigger stood in the hole to receive the body and didn’t bother repositioning it in the center after laying it down. There’s also no evidence that a coffin or even a death shroud was used. Given the historical context of Richard III’s death, none of this is a huge surprise, although the apparent lack of care surrounding the burial of this king might exceed even what historians had previously expected.
Richard III was killed at age 32 during the Battle of Bosworth Field, close to the end of the infamously violent Wars of the Roses—a 30-plus-year struggle between supporters of competing branches of the royal family for control of the throne. After he was defeated and killed in battle by the forces of rival Henry Tudor (who would become King Henry VII), the new king reportedly kept the burial location intentionally secret—he feared it would otherwise become a rallying point for his enemies—and knowledge of Richard III’s grave was lost over time.
Now we know that Richard III’s body was brought to the nearby city of Leicester, passed along to Franciscan friars and buried at what was then Grey Friars church “without any pomp or solemn funeral,” according to the contemporary historian Polydore Vergil. (Legend holds that his body was stripped naked, transported on the back of a horse and mocked by passers-by during the entire journey.) Eventually, the church was dismantled, and the site was paved over.
Apart from analyzing the unusual characteristics of the king’s grave, the new paper also provides the first peer-reviewed forensic details about his remains. As the archaeologists had previously mentioned in public statements, the body matches the physical details of Richard III as described in historical sources: a curved spine, due to childhood scoliosis, and slim features. In addition to the fierce blows to his head, there were a total of 10 wounds discovered on his body, including stabs in his buttocks and back that the researchers believe were probably made after he’d already been killed, because of their location and the fact that they couldn’t have been made while he was still wearing armor.
So, did Richard III die in violent humiliation? The new findings seem to support this idea. At the very least, he was buried in a manner that certainly didn’t befit a king. But now, a number of groups and localities are suddenly interested in giving him a proper burial. The cities of Leicester and York are dueling over the right to preserve his remains and attract the tourists who will flock to see the king who was buried in a parking lot. We can only hope this new battle doesn’t last for another 30 years.
There’s a lot we don’t understand about an itch. Why do itches sometimes pop up for no apparent reason? Why is itching contagious? Why can the very idea of an itch—maybe even the fact that you’re currently reading about itching—cause you to feel the actual physical sensation of one?
Given all this uncertainty, a new discovery reported today in Science should at least scratch the surface of your curiosity and answer a question you’ve been itching to ask (terrible puns intended). A pair of molecular geneticists from the National Institutes of Health, Santosh Mishra and Mark Hoon, isolated a crucial signaling molecule produced by nerve cells that is necessary for passing along the sensation of an itch to the brain.
The pair worked with mice, and started off by examining the neurotransmitter chemicals produced by a type of neuron that runs all the way from the animals’ skin into their spinal columns. These neurons are known to be involved in passing along sensory information about the outer environment, including sensations of heat and pain. The researchers found that one of the neurotransmitters produced by these nerve cells—a chemical called Nppb (natriuretic polypeptide b)—was secreted in excess when the mice were subjected to a range of itch-inducing substances, such as histamine (the natural compound that triggers the itchiness associated with allergies) and chloroquine (a malaria drug notorious for causing itching as a side effect).
To test whether Nppb played a role in the itching, they genetically engineered mice that failed to produce the chemical. First, they checked whether these engineered mice could still register other sensations conveyed by the same neurons (pain, movement and heat); the animals behaved just like normal mice, indicating that Nppb wasn’t involved in transmitting those stimuli.
Then, they exposed them once again to the itch-inducing chemicals. The normal mice scratched away, but the genetically engineered mice were another story. “It was amazing to watch,” Mishra said in a press statement. “Nothing happened. The mice wouldn’t scratch.”
Nppb, they determined, plays a key role in passing along the sensation of an itch from these neurons to the brain—a conclusion reinforced when they injected the engineered mice with doses of Nppb and the animals suddenly started scratching just like the others.
To investigate just how Nppb relays this message, they zeroed in on a spot in the mice’s spinal cords called the dorsal horn, where sensory information from the skin and muscles is integrated and sent on to the brain. In this area, they discovered a high concentration of neurons with a receptor called Npra (natriuretic peptide receptor A) that seemed likely to accept the Nppb molecules secreted when the mice encountered an itch-triggering substance.
Sure enough, when they removed the Npra-bearing neurons from normal, non-engineered mice that still produced Nppb, those mice too stopped scratching when exposed to the substances. This indicates that Nppb is crucial for passing the itch sensation from the nerves that reach into the skin to the spine, where it fits into the Npra receptor on spinal nerve cells, which then convey the sensation to the brain. Removing these receptors didn’t impair the transmission of pain or touch, though, indicating that Npra is specifically involved in the itch pathway. This comes as a surprise, since most previous research has indicated that the pain and itch nervous networks are intricately related.
While this chemical pathway explains part of the physical mechanism behind an itch, scientists still don’t fully understand the underlying evolutionary reason for the sensation in the first place. Some have speculated that it serves as a defense measure against insects, parasites and allergens, prompting us to scratch—and, ideally, remove the offending item from our skin—before it causes further damage.
Regardless of the evolutionary reason, our nervous system is similar enough to that of mice that the finding could help us better understand itching patterns in humans—perhaps people who are more prone to itching naturally produce higher levels of Nppb than those who can get bitten by a mosquito and find the itchiness easy to ignore. On a practical level, the discovery could eventually help us develop anti-itch drugs for people with chronic itching ailments, such as allergic reactions or skin conditions like eczema, which affects an estimated 30 million people.
The problem, though, is that Nppb plays several other important roles in the body (it was originally discovered through its role in regulating blood circulation and pressure), so a drug that simply disables Nppb is likely to cause disruptive side effects that go way beyond itching. But looking more closely at the way the Nppb molecule acts as a “start switch” for itching in humans—and perhaps figuring out how to safely turn the switch off—could provide relief for itchiness caused by all sorts of triggers: in the mice, at least, the molecule was involved in responses to the whole range of itch-inducing substances the team tested.
Most people consider saving the Amazon rainforest a noble goal, but nothing comes without a cost. Cut down a rainforest, and the planet loses untold biodiversity along with ecosystem services like carbon dioxide absorption. Conserve that tract of forest, however, and you risk facilitating malaria outbreaks in local communities, a recent study finds.
Nearly half of malaria deaths in the Americas occur in Brazil, and nearly all of those cases originate in the Amazon. Yet few conservationists consider the forest’s role in spreading the disease, and the researchers who do take malaria into account disagree on what role forest cover plays in its transmission.
Some think that living near a cleared patch of forest–which may be pockmarked with ditches that mosquitoes love to breed in–increases malaria incidence. Others find the opposite–that living near an intact forest fringe brings the highest risk for malaria. Still others find that close proximity to forests decreases malaria risk because the mosquitoes that carry the disease are kept in check through competition with mosquitoes that don’t. Most past studies, however, focused only on small patches of land.
To get to the bottom of how rainforests contribute to malaria risk, two Duke University researchers collected records of 1.3 million positive malaria tests, spanning four and a half years and an area of 4.5 million square kilometers in Brazil. Using satellite imagery, they added information about the local environment where each case occurred, and they also took rainfall into account, because precipitation affects mosquitoes’ breeding cycles. Using statistical models, they then analyzed how malaria incidence, the environment and deforestation interacted.
Their results starkly point to the rainforest as the main culprit in malaria outbreaks. “We find overwhelming evidence that areas with higher forest cover tend to be associated with higher malaria incidence whereas no clear pattern could be found for deforestation rates,” the authors write in the journal PLoS One. People living near forest cover had a 25-fold greater chance of catching malaria than those living near recently cleared land. Men tended to catch malaria more often than women, implying that forest-related jobs and activities–traditionally carried out by men–put people at greater risk of catching the disease. Finally, the authors found that people living next to protected areas suffered the highest malaria incidence of all.
Extrapolating these results, the authors calculated that, if the Brazilian government avoids just 10 percent of projected deforestation in the coming years, citizens living near those spared forests will contend with a 2-fold increase in malaria by 2050. “We note that our finding directly contradicts the growing body of literature that suggests that forest conservation can decrease disease burden,” they write.
The authors of the malaria study do not propose, however, that we should mow down the Amazon in order to obliterate malaria. “One possible interpretation of our findings is that we are promoting deforestation,” they write. “This is not the case.” Instead, they argue that conservation plans should include malaria mitigation strategies. This could include building more malaria detection and treatment facilities, handing out bed nets and spraying for mosquitoes.
This interaction between deforestation and disease outbreaks is just one example of the way efforts to protect the environment can bring nature and humans into conflict. Around the world, other researchers have discovered that conservation efforts sometimes produce negative effects for local communities. Lyme disease–once all but obliterated–reemerged with a vengeance (pdf) in the northeastern U.S. when abandoned farmland was allowed to turn back into forest. Human-wildlife conflict–including elephants tearing up crops, tigers attacking livestock, and wolves wandering into people’s backyards–often comes to a head when a once-declining or locally extinct species makes a comeback thanks to conservation efforts.
“We believe there are undoubtedly numerous ecosystem services from pristine environments,” the PLoS One authors conclude. “However, ecosystem disservices also exist and need to be acknowledged.”
May 22, 2013
For most of human history, any baby who suffered a collapsed trachea or bronchus faced a tragic fate: suffocation. These tubes convey air from the mouth to the lungs, and some infants are born with congenitally weakened cartilage surrounding them, a condition known as tracheomalacia. In severe cases, the trachea or bronchi can collapse completely, blocking the flow of air and causing a newborn to suddenly stop breathing.
To the amazingly wide-ranging list of accomplishments attributed to 3D printing technology, we can now add one more: a custom-made tracheal splint that saved the life of an infant with tracheomalacia and will be safely absorbed into his tissue over the next two years. A team of doctors and engineers from the University of Michigan printed the splint and implanted it into six-week-old Kaiba Gionfriddo last year, and announced the feat in a letter published today in the New England Journal of Medicine.
In December of 2011, Gionfriddo was born with tracheomalacia, a condition that affects roughly 1 in 2,200 American babies. Typically, the weakened cartilage causes some difficulty breathing, but children grow out of it by age 2 or 3 as the trachea naturally strengthens. His case, though, was particularly severe, and in February 2012, his parents April and Bryan were out to dinner when they noticed that he had suddenly stopped breathing and was turning blue.
He was rushed to a hospital and kept alive with a ventilator, but doctors said there was a good chance he wouldn’t survive long-term. Several weeks later, a team of Michigan engineers led by Scott Hollister began designing the device, building on prior research in which they’d 3D printed splints and other prostheses but hadn’t implanted them in clinical patients. For this splint, they used a CT scan of Gionfriddo’s trachea and left bronchus to create a 3D digital model that was then printed, allowing them to produce a splint that would perfectly match his airway’s size and contours.
On February 21, 2012, the splint was surgically sewn around Gionfriddo’s collapsed bronchus; almost immediately, it held open his air passages and allowed him to breathe normally. “It was amazing. As soon as the splint was put in, the lungs started going up and down for the first time,” Glenn Green, the doctor who performed the surgery and helped design the splint, said in a press statement.
Twenty-one days later, Gionfriddo was taken off the ventilator, and he has had no breathing problems in the 14 months since the surgery. In addition to holding the bronchus open, the splint provides a scaffold upon which natural cartilage tissue can grow, and because it was printed from a biopolymer called polycaprolactone, it will gradually be absorbed into this body tissue over time.
Previously, severe tracheomalacia was treated with extended periods on a ventilator, or with the implantation of mesh tubes around the trachea or bronchus to keep the airway open. By custom-designing the splint from a CT scan, though, the team created a treatment method that they say is more effective. Additionally, the dissolvable material means Gionfriddo won’t need invasive surgery later to remove the device.
The team has also worked on using this same CT scanning and 3D printing process to produce custom-made ear, nose, skull and bone prostheses that are currently in experimental phases. Other research groups have successfully implanted 3D printed ears, noses and skulls in clinical patients, while last month, an Oxford team figured out how to print microscopic droplets that behave like human tissue.
Concepts from science and nature pervade our language’s common phrases, idioms and colloquialisms. The incredulous expression “Well, I’ll be a monkey’s uncle” stems from sarcastic disbelief over Darwin’s writings on evolution. To be “in the limelight”—at the center of attention—harks back to how theater stages used to be lit by heating lime (calcium oxide) until it glowed a brilliant white, then focusing the emitted light into a spotlight.
Someone as “mad as a hatter” exhibits behavior similar to that of 18th- and 19th-century hat makers, who stiffened felt with mercury—an ingredient that, with continued exposure, causes dementia. “Tuning in” to someone’s message has its origins in the slight turns of a dial needed to focus on a radio signal.
These colorful expressions bring spice to our language. Yet certain well-used phrases from science are misrepresentations of what they’re trying to express. Others are just plain wrong!
Some are obvious, yet we use them anyhow. A person who sagaciously shakes her head and says “A watched pot never boils” while you wait, second after agonizing second, for test results to arrive or a job offer to come in knows perfectly well that if she sat down and watched a pot of water on high heat for long enough, the water would eventually boil. Likewise, the person who utters the placating phrase “the darkest hour is just before dawn,” meant to give hope during troubled times, probably knows that the sky gets progressively lighter well before the Sun rises, just as light lingers well after the Sun sets, until the Earth rotates beyond the reach of the Sun’s rays. Thus, the darkest hour of the night (in the absence of the Moon) is midway between sunset and sunrise.
A few phrases, however, have less obvious scientific inaccuracies. Here are a few for you to consider:
1. Once in a blue moon: This poetic phrase refers to something extremely rare in occurrence. A blue moon is the term commonly used for a second full moon that occasionally appears in a single month of our solar-based calendars. The problem with the phrase, however, is that blue moons are not so rare—they happen every few years at least, and can even happen within months of each other when the 29.5-day lunar cycle puts the full moon at the beginning of any month but February.
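To put rough numbers on that claim, here is a minimal back-of-the-envelope sketch in Python. It assumes a fixed 29.53-day average lunar cycle and a recent full moon date (both are assumptions: real full moons drift by hours around that average, so months whose full moon lands near a boundary may be miscounted). It steps from one full moon to the next and flags any calendar month that ends up with two:

```python
# A rough sketch: count "blue moons" (a second full moon in one
# calendar month) by stepping forward in fixed 29.53-day increments.
# Assumptions: May 25, 2013 was a full moon around the time of this
# article; the fixed average cycle is an approximation, not an ephemeris.
from datetime import datetime, timedelta

SYNODIC_MONTH = timedelta(days=29.53)  # average time between full moons

full_moon = datetime(2013, 5, 25)
full_moons_per_month = {}  # (year, month) -> number of full moons

for _ in range(12 * 20):  # roughly 20 years' worth of full moons
    key = (full_moon.year, full_moon.month)
    full_moons_per_month[key] = full_moons_per_month.get(key, 0) + 1
    full_moon += SYNODIC_MONTH

blue_moons = sorted(k for k, n in full_moons_per_month.items() if n >= 2)
print(f"Blue moons in roughly 20 years: {len(blue_moons)}")
for year, month in blue_moons:
    print(f"  {year}-{month:02d}")
```

Run over a couple of decades, a tally like this turns up a blue moon every two to three years on average, which is hardly a once-in-a-lifetime rarity.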
The usage of “blue moon” for the second full moon in a month dates back to a 1937 Maine Farmers’ Almanac. But prior to that, blue moons meant something slightly different. Typically, 12 full moons occur from one winter solstice to the next (roughly three per season), but occasionally a fourth full moon can be observed in a single season. In such a case, one of the four full moons in that season was labeled “blue.”
Readers may recall that baby Smurfs are delivered to the Smurf village during blue moons. If this were to occur every blue moon, we’d soon be awash in blue creatures three apples high!
2. Where there’s smoke, there’s fire: The phrase means that if something looks wrong, it likely is wrong. But let’s step back—do you always have to have fire if you see smoke?
Answering that first requires defining “fire.” Merriam-Webster’s first definition of fire is “the phenomenon of combustion manifested in light, flame, and heat.” Combustion is the chemical reaction that occurs when fuel is burned in the presence of oxygen. So for a fire to ignite and be sustained, it needs heat, fuel and oxygen—denying a fire any of these three things will extinguish it, and attempting to start a fire without any one of the three will be futile.
In complete combustion—what occurs when you light a gas stove—the fire produces no smoke. However, when most materials are burned, they undergo incomplete combustion, which means that the fire isn’t able to completely burn all of the fuel. Smoke is an airborne collection of little particles of these unburned materials.
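To make the distinction concrete, consider methane, the main component of the natural gas burned on a stove. In complete combustion, all of the fuel ends up as invisible gases: CH4 + 2 O2 → CO2 + 2 H2O. Starve the flame of oxygen, though, and incomplete reactions become possible, one simplified example being CH4 + O2 → C + 2 H2O, which leaves unburned carbon behind as soot (real oxygen-starved flames also produce carbon monoxide and partially burned hydrocarbons). Those stray particles are the raw material of smoke.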
The reason these materials didn’t burn is pyrolysis—the breakdown of organic material at elevated temperatures in the absence, or under a shortage, of oxygen. Think of it this way: a wood fire’s quick consumption of oxygen depletes the gas around a burning log, and this localized lack of oxygen while the log is at high temperature causes the log to char, breaking it down into a substance much richer in carbon. The resulting charcoal, if still under high heat, can then smolder—a flameless form of combustion—until all the fuel is consumed.
Smoke, then, can be considered to be a product of pyrolysis rather than of fire itself. You’re probably thinking—so what? To get the smoke, a fire needed to be present at some point, right?
Not always. Let’s consider pyrolysis taken to the extreme. Tobacco leaves heated to 800 degrees Celsius in a pure nitrogen atmosphere, for example, undergo pyrolysis and release smoke without ever being on fire.
Pyrolysis without fire can also occur in more familiar circumstances. Imagine blackening a piece of fish in a pan on an electric range, where electricity heats the metal coils of the cooktop until they are incandescent, but not on fire. Leave the fish unattended for too long and it will start to char and smoke. But why bother with fish? Those looking for fireless smoke need go no further than melting a slab of butter in a sauté pan. All oils and fats used in cooking have smoke points—the temperature at which they start to degrade into a charred goo of glycerol and fatty acids—as seen in this video.
Sure, leaving these smoking substances on the range for too long will cause them to eventually combust (oils and fats, after all, do have flash points), but before that, you have a whole lot of smoke with no fire!
3. The fish rots from the head down: The phrase seems to pop up more frequently when political scandals or accusations of malfeasance make headlines. The origin of the phrase is murky, likely stemming from folk proverbs of Europe and Asia Minor. But the meaning is simple–if a system is corrupt, its leaders instigated the corruption.
The authoritative ring of this phrase belies its inaccuracy. Fish, in fact, start to rot from the gut. According to David Groman, an expert on fish pathology at the University of Prince Edward Island, the proverb is a “poor metaphor. And, I must say, it’s biologically incorrect,” he told Anna Muoio of the business magazine Fast Company. “When a fish rots, the organs in the gut go first. If you can’t tell that a fish is rotting by the smell of it, you’ll sure know when you cut it open and everything pours out–when all the internal tissue loses its integrity and turns into liquid.”
The reporter then got hold of Richard Yokoyama, manager of Seattle’s Pike Place Fish Market, who said “Before I buy a fish from one of our dealers, I always look at the belly. On a fish, that’s the first thing to go. That’s where all the action is–in the gut. If the belly is brown and the bones are breaking through the skin, I toss the fish out. It’s rotten.”
Unfortunately for scientific accuracy, saying “The fish rots from the belly outward” lacks gravitas and is unlikely to be picked up by the punditsphere.
4. Hard as nails: The saying is often used to describe a person who is stern, unyielding and unsympathetic, bordering on ruthless. An early appearance of the phrase can be found in Dickens’ Oliver Twist, when the Artful Dodger and the other street urchins describe their pickpocketing work ethic.
But let’s take a step back–are nails really that hard? The hardness of a material can be estimated relative to other substances according to where it falls on the Mohs scale of mineral hardness. This scale, which runs from one through 10, was developed by the German geologist Friedrich Mohs in 1812 to help him classify the minerals he encountered on his excursions. Talc, a soft mineral easily powdered, is a one on the scale. The malleable element copper sits at a three. Quartz—the clear crystal common in sand and the spiny lining inside a geode—is a seven. Diamond, the hardest naturally occurring substance on the planet, is a 10.
The Mohs scale is an ordinal scale, which means it doesn’t measure the degree to which one substance is harder than another. Rather, it is based on the idea that a material can scratch anything that falls at a lower number on the scale, and cannot scratch anything at a higher number. On this scale, a steel nail used to fasten wood falls at about 5.5. Feldspars, such as the pink minerals in granite, are harder than those nails, as are topaz, quartz, sapphires and, of course, diamonds. Even unglazed porcelain, at about a seven on the scale, is harder than an average nail.
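The scratch-test logic is simple enough to capture in a few lines of code. Here is a toy sketch in Python, with approximate hardness values drawn from this article and common references (the numbers are illustrative; since the scale is ordinal, only the ordering matters):

```python
# A toy model of the Mohs scratch test. Hardness values are approximate;
# the scale is ordinal, so only the ordering matters, not the gaps.
MOHS_HARDNESS = {
    "talc": 1.0,
    "copper": 3.0,
    "mild steel nail": 5.5,
    "unglazed porcelain": 7.0,
    "quartz": 7.0,
    "topaz": 8.0,
    "sapphire": 9.0,
    "diamond": 10.0,
}

def can_scratch(tool: str, target: str) -> bool:
    """A material scratches another only if it is strictly harder."""
    return MOHS_HARDNESS[tool] > MOHS_HARDNESS[target]

print(can_scratch("mild steel nail", "copper"))   # True: 5.5 > 3.0
print(can_scratch("mild steel nail", "quartz"))   # False: 5.5 < 7.0
print(can_scratch("diamond", "sapphire"))         # True: 10 > 9
```

By this logic, a wood nail makes short work of copper pipe but can’t leave a mark on a quartz pebble.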
But not all nails are created equal. The nails used in wood are made of low-carbon or “mild” steel, meaning that their alloy contains only about 0.05 to 0.6 percent carbon. Nails used to fasten concrete, by contrast, have higher carbon content–approaching one percent–which can push their hardness up to as high as a nine on the Mohs scale.
So the more correct version of this phrase would be, “Hard as high-carbon steel nails,” but somehow that just doesn’t have the same ring, does it?
5. Diamonds are forever: Thanks to the De Beers slogan, adorning your honey’s neck, wrists and fingers with bits of pressurized carbon has somehow become a metaphor for true and timeless love. Of course, no object that you can hold in your hand lasts forever. But diamonds have a special reason for being incapable of eternity–away from the extreme pressures of the deep Earth where they formed, a diamond will slowly revert to graphite–which is why the older a diamond is, the more inclusions it’s likely to have.
Although it will usually take millions of years for the rock on your finger to become ready for use in pencils, some mineral forms of carbon seem to flash quickly between diamond and graphite depending on the pressures they’re exposed to in the lab. For those mutable sometimes-gems, diamonds are in fact transient.
What common phrases push your buttons when viewed under the microscope of science? Or perhaps you have the inside scoop on whether wet hens really get angry? Let us know!