June 11, 2013
Science is generally considered a rather serious business, full of big questions, dense calculations and incomprehensible jargon.
Then there is the Annals of Improbable Research, a venerable journal that has published data on the effects of peanut butter on the rotation of the Earth and how access to television can be an effective method of birth control. The publication’s stated goal is to publish “research that makes people laugh and then think.” Its articles—which are mostly satire, but with some occasional real research into offbeat issues—probably accomplish the former goal more often than the latter, but they do often contain a grain of scientific truth at their core. And, of course, the organization’s Luxuriant Flowing Hair Club for Scientists™ is an indispensable institution on the international scientific landscape.
For your reading pleasure, we bring you an (admittedly unscientific) list of the 5 most improbable research projects from the Annals:
How did Fiorella Gambale, a scientist at the (nonexistent) Institute for Feline Research in Milano, Italy, answer this age-old question? Simple: she dropped the cat Esther 100 times from each of several heights and charted the results. Improbably, the cat landed on its feet all 100 times when dropped from 2, 3, 4, 5 or 6 feet, but failed to do so even once when dropped from 1 foot.
Although these results were never vetted by other scientists—so there’s no way of knowing whether Gambale actually performed the tests—the finding that cats really do land on their feet when dropped from more than 12 inches above the ground actually does jibe with established scientific beliefs. The explanation is that they need a brief moment of free fall to trigger their righting reflex, which allows them to arch their back and twist their torso to orient their feet towards the ground.
“The field of culinary evolution faces one great dilemma,” wrote Joseph Staton, of Harvard’s Museum of Comparative Zoology. “Why do most cooked, exotic meats taste like cooked Gallus gallus, the domestic chicken?” Staton tasted a wide variety of meats (including kangaroo, rabbit, goose, pigeon, and iguana) in exploring the question, and ultimately determined that the quality of “chicken taste” is a conserved trait, something that came about once in the evolutionary history of vertebrates and was passed on to many species.
Sadly, Staton’s attempt to sample dinosaurs was thwarted: He apparently made several calls to Chicago’s Field Museum to “borrow merely a single bone” from their T. rex, but his request was “entangled in red tape.”
A team of geologists from Texas State and Arizona State Universities addressed this very serious question with the cutting-edge tools of their field: digital elevation analysis software, complex mathematical equations, and a standard-size flapjack from the local IHOP. They found that Kansas is, in fact, considerably flatter than an average pancake, which is actually more rugged than the Grand Canyon when viewed up close. They write that Kansas, on the other hand, “might be described, mathematically, as ‘damn flat.’”
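For a sense of how such a comparison can be quantified, here is a minimal sketch (in Python) of a toy flatness score: one minus the ratio of vertical relief to horizontal extent. This is purely illustrative; the input numbers are placeholders, and the approach is much cruder than the profile-fitting method the geologists actually used.

```python
# Illustrative only: a toy flatness score, NOT the study's actual method.
# All inputs below are made-up placeholder values.

def flatness(relief_m: float, extent_m: float) -> float:
    """Return a 0-to-1 flatness score, where 1.0 means perfectly flat.

    Defined here simply as 1 minus (vertical relief / horizontal extent).
    """
    return 1.0 - (relief_m / extent_m)

# Hypothetical inputs: ~2 cm of surface relief on a ~13 cm pancake,
# versus a few hundred meters of relief across ~640 km of Kansas.
print(f"pancake: {flatness(0.02, 0.13):.3f}")        # noticeably below 1
print(f"Kansas:  {flatness(300.0, 640_000.0):.4f}")  # extremely close to 1
```

Even with made-up inputs, the sketch captures the basic point: relative to its width, a pancake has far more relief than the state does.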
Comparing these two fruits is not quite so difficult, it turns out, when you have access to a Nicolet 740 FTIR spectrometer, which can precisely measure the frequencies of light emitted from any substance. Scott Sandford, a NASA researcher, put this device to use on dried samples of a Granny Smith apple and a Sunkist orange that had been pulverized and compressed into pellets. He found that the spectra of light emissions from the two fruits were remarkably similar, a rather stunning revelation given how frequently people employ what he calls the “apples and oranges defense”: the claim that two things shouldn’t be compared because they’re as different as the two fruits.
“It would appear that the comparing apples and oranges defense should no longer be considered valid,” Sandford wrote. “It can be anticipated to have a dramatic effect on the strategies used in arguments and discussions in the future.”
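As a rough illustration of how one might put a number on “remarkably similar” spectra (this is not Sandford’s actual analysis, and the band shapes below are invented), a simple correlation coefficient between two spectra sampled at the same wavenumbers does the job:

```python
# Rough illustration (not Sandford's analysis): quantify how similar two
# spectra are with a correlation coefficient. Band shapes are invented.
import numpy as np

wavenumbers = np.linspace(400, 4000, 500)            # cm^-1, illustrative
apple = np.exp(-((wavenumbers - 1700) / 300) ** 2)   # made-up absorption band
orange = 0.9 * np.exp(-((wavenumbers - 1650) / 320) ** 2)

similarity = np.corrcoef(apple, orange)[0, 1]
print(f"spectral correlation: {similarity:.2f}")     # near 1 means "very similar"
```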
Alice Shirrell Kaswell, a staff member at the Annals of Improbable Research, definitively answered this question once and for all in 2003: The chicken, it turns out, came approximately 11 hours before the egg. Kaswell came to this finding by separately mailing a dozen eggs and one (1) live chicken via the U.S. Postal Service from Cambridge, Massachusetts to New York City. Both items, sent out on a Monday, arrived on Wednesday, but the chicken was delivered at 10:31 a.m., while the eggs didn’t arrive until 9:37 p.m. Problem = solved.
May 7, 2013
Last month, a trio of engineers debuted an app that allows Icelanders to determine if they’re actually related to a potential date. Why, you ask? Because the entire population of Iceland, roughly 320,000 people, derives from a single family tree, and it’s very possible to bump into a former flame at a family gathering.
The case of Iceland is an extreme one, but the idea that we are all distant cousins, in the scope of human history, is well accepted. A new study, published today in the journal PLOS Biology, explains this degree of relatedness in modern-day Europeans.
The study reveals that just about any two random people from anywhere in Europe, even those living on opposite sides of the continent, share hundreds of genetic ancestors from only 1,000 years ago. In fact, a person living in the United Kingdom shares a chunk of genomic material with someone living in Turkey 20 percent of the time.
Researchers from the University of California, Davis and the University of Southern California studied genomic data for 2,257 Europeans from a massive database of genome-mapped individuals known as the Population Reference Sample. They measured ancestral ties going back 3,000 years by analyzing long segments of genome, passed down from generation to generation, shared by individuals.
Distant relatives share these long blocks of genome because they have both inherited them from common ancestors. First cousins share about one-eighth of their genome, inherited from a shared set of grandparents. Second cousins share just one-thirty-second of their genome, thanks to the same pair of great-grandparents. The researchers detected 1.9 million of these shared DNA sequences within the data pool, and then used their varying lengths to infer how long ago the shared ancestors lived.
These shared chunks of genome become shorter and shorter between more distant relatives because DNA strands undergo recombination, shuffling our genetic makeup around, with each successive generation. For example, a shared block of genome is shorter between second cousins than it is between first cousins. The longer a shared segment, the more recent the common ancestor.
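To make that logic concrete, here is a small back-of-the-envelope sketch (mine, not the paper’s model) using the standard approximation that a block inherited from a common ancestor g generations back has survived roughly 2g rounds of recombination, so its expected length is about 100/(2g) centimorgans:

```python
# Back-of-the-envelope sketch of the segment-length logic described above.
# Assumes the standard approximation that a shared block from an ancestor
# g generations back has passed through ~2*g meioses, giving an expected
# length of about 100 / (2*g) centimorgans (cM). Illustrative only.

def expected_block_length_cm(generations_back: int) -> float:
    """Expected length (cM) of a block shared via an ancestor that many
    generations in the past."""
    return 100.0 / (2.0 * generations_back)

for g in (2, 5, 10, 33):  # roughly: grandparents, ~150, ~300, ~1,000 years ago
    print(f"{g:>2} generations back -> ~{expected_block_length_cm(g):.1f} cM blocks")
```

Long shared blocks therefore point to recent ancestors, while the blocks left over from ancestors a thousand years back are only a couple of centimorgans long.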
As we might expect, the numbers of shared genetic ancestors dramatically decrease as geographic distance (in this case, across Europe) increases. This means that people who live near each other are more likely to be related to each other than those who don’t. For example, someone living in England will typically be more closely related to a fellow Briton than to someone from Germany. Researchers found that two modern Europeans living in neighboring populations, for example two adjacent countries, share between two and 12 genetic ancestors from the last 1,500 years.
This pattern can be seen in historically small or more isolated populations too, where fewer possible ancestors exist. Such is the case on the Italian and Iberian peninsulas—areas least affected by Slavic and Hunnic migrations between the fourth and eighth centuries—where people share more ancestors with each other than people in most other regions of Europe. Those living in Western Europe are also somewhat less related to each other than people living in Eastern Europe, a historically tight-knit region in terms of population.
However, some findings deviate from this genealogical norm. The researchers found that people from the United Kingdom shared more recent ancestors with people living in Ireland than with other UK residents. Recent ancestry also tied Germans more closely with Polish people than with other Germans. These instances likely reflect human migration in recent centuries, as smaller populations moved into larger ones.
Although this study looked only at European lineage, the researchers suggest that such patterns probably exist in the rest of the world. In any case, such research into human history brings us closer to learning more about the most recent common ancestor of all modern humans, who, according to mathematical models, might have walked the Earth as recently as roughly 3,500 years ago (PDF). This common ancestor, a product of the intermixing of once-isolated population groups, could have lived much earlier if remote populations had stayed isolated and avoided mating with far-flung explorers, but the recent paper’s findings seem to support the idea that distant populations converged relatively recently when compared with the long history of ancient humans.
April 15, 2013
This year, prolonged extreme temperatures and seemingly never-ending snowstorms in the United States forced many inside, seeking shelter from what felt like an unusually long winter. This meant some of us were stuck in bed for a day or two clutching a box of Kleenex and downing cough syrup. That’s because viruses that cause the common cold love enclosed spaces with lots of people—the family room, the office, the gym.
And though spring has arrived, cold-causing microbes haven’t slowed down. More than 200 viruses can trigger a runny nose, sore throat, sneezing and coughing, and more than 1 billion cases of the common cold occur in the United States each year. The worst offenders (and the most common), known as human rhinoviruses, are most active in spring, summer and early fall.
While it’s difficult to pinpoint exactly when infected people cease to be contagious, they’re most likely to spread their cold when symptoms are at their worst, explains Dr. Teresa Hauguel of the National Institute of Allergy and Infectious Diseases. However, there’s another window of contagiousness to be wary of. “A person can be infected before they actually develop symptoms, so they can be spreading it without even realizing it if they’re around people,” Hauguel writes in an email.
Surprised? Here are five more facts about the common cold.
Cold-causing viruses can be found in all corners of the world. Rhinoviruses (from the Greek word rhin, meaning “nose”) evolved from enteroviruses, which cause minor infections throughout the human body. They have been identified even in remote areas inside the Amazon. But it’s impossible to tell how long humans have been battling colds. Scientists can’t pinpoint when rhinoviruses evolved: they mutate too quickly and don’t leave a footprint behind in preserved human fossils. They could have been infecting
hominids before our species appeared. Or they might have sprung up as small groups of humans moved out of isolation and into agricultural communities, where the pathogen became highly adapted to infecting them.
Cold-causing microbes can survive for up to two days outside of the body. Rhinoviruses, which cause 30 to 50 percent of colds, usually live for three hours on your skin or any touchable surface, but can sometimes survive for up to 48 hours. The list of touchable surfaces is a lengthy one: door knobs, computer keyboards, kitchen counters, elevator buttons, light switches, shopping carts, toilet paper rolls—the things we come in contact with on a regular basis. The number of microbes that can grow on these surfaces varies, but each spot can contain several different types of microbes.
You can calculate how far away to stand from someone who’s sick. When a sick person coughs, sneezes or talks, they expel virus-containing droplets into the air. These respiratory droplets can travel up to six feet to another person. A recent study found that the largest visible distance a sneeze travels is 0.6 meters, almost two feet, and that it does so at 4.5 meters per second, about 15 feet per second. A breath travels the same distance but much more slowly, at 1.4 meters—4.5 feet—per second. Moral of the story: stay at least six feet from infected people, and move quickly when they gear up to sneeze.
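Taking the heading literally, here is a tiny arithmetic sketch using the speeds quoted above. It assumes droplets travel at constant speed, which real droplets do not (they slow down and fall), so treat the times as rough figures only:

```python
# Toy calculation using the speeds quoted above. Real droplets decelerate
# and settle, so constant speed is a simplifying assumption.

SNEEZE_SPEED_M_S = 4.5   # sneeze-droplet speed cited above
BREATH_SPEED_M_S = 1.4   # exhaled-breath speed cited above

def travel_time_s(distance_m: float, speed_m_s: float) -> float:
    """Seconds to cover a distance at constant speed."""
    return distance_m / speed_m_s

six_feet_m = 6 * 0.3048  # about 1.83 meters
print(f"sneeze: {travel_time_s(six_feet_m, SNEEZE_SPEED_M_S):.2f} s to cross six feet")
print(f"breath: {travel_time_s(six_feet_m, BREATH_SPEED_M_S):.2f} s to cross six feet")
```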
The weather plays a role in when and how we get sick—but not in the way you might think. Humidity levels can help those droplets whiz through the air more quickly: the lower the humidity, the more moisture evaporates from each droplet, shrinking it so that it can stay airborne over longer distances. Cold weather is notoriously dry, which explains why we’re more likely to catch a cold while we huddle up inside when temperatures start sinking. Dry air can also dry out the mucus lining in our nasal passages; without this protective barrier that traps microbes before they enter the body, we’re more vulnerable to infection. So we’re weakened by the air we breathe in when it’s chilly out, not by the chilly weather itself.
Contrary to popular belief, stocking up on vitamin C won’t help. Linus Pauling, a Nobel Prize-winning chemist, popularized the idea of taking high doses of vitamin C to ward off colds. But when put to the test, this cold remedy doesn’t actually work. If you take at least 0.2 grams of vitamin C every day, you’re not likely to have any fewer colds, but you may have colds that are a day or two shorter. When symptoms start to appear, drizzling packets of Emergen-C into glass after glass of water won’t help either. The vitamin is no more effective than a placebo at reducing how long we suffer from cold symptoms.
March 18, 2013
Some scientists investigate the universe’s biggest mysteries, like the Higgs boson, the elusive particle associated with the field thought to give other fundamental particles their mass.
Other researchers look into questions that are, well, a bit humbler—like the age-old puzzle of whether roosters simply crow when they see light of any kind, or if they truly know to crow when the morning sun arrives.
Lofty or not, it’s the goal of science to answer all questions that arise from the natural world, from roosters to bosons and everything in between. And a new study by Japanese researchers published today in Current Biology resolves the rooster question once and for all: The birds truly do have an inner circadian rhythm that tells them when to crow.
The research team, from Nagoya University, investigated via a fairly straightforward route: They put several groups of four roosters in a room for weeks at a time, turned the lights off, and left a video camera running. Although roosters can occasionally crow at any time of day, the majority of their crowing was like clockwork, peaking in frequency at intervals roughly 24 hours apart—the time their bodies knew to be morning based on the sunlight they’d last seen before entering the experiment.
This consistency continued for about two weeks, then gradually began to break down. The roosters were left in the room for four weeks in total, and during the second half of the experiment their crowing occurred less regularly, at any time of day, suggesting that they do need to see the sun on a regular basis for their circadian rhythms to function properly.
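For readers curious how a free-running period like this can be pulled out of raw recordings, here is a rough sketch (not the authors’ analysis) that bins hypothetical crow timestamps into hourly counts and picks the lag with the strongest autocorrelation:

```python
# Not the authors' analysis: a rough sketch of estimating a free-running
# period from a list of crow timestamps (in hours since the start).
import numpy as np

def estimate_period_hours(crow_times_h, max_lag_h=30):
    """Bin crow times into hourly counts and return the lag (18..max_lag_h
    hours) with the highest autocorrelation."""
    counts, _ = np.histogram(crow_times_h,
                             bins=np.arange(0, float(max(crow_times_h)) + 2))
    counts = counts - counts.mean()
    lags = range(18, max_lag_h + 1)
    scores = [np.dot(counts[:-lag], counts[lag:]) for lag in lags]
    return lags[int(np.argmax(scores))]

# Hypothetical data: bursts of crowing every ~23.7 hours for two weeks.
rng = np.random.default_rng(0)
crows = np.concatenate([day * 23.7 + rng.normal(0, 0.5, size=20)
                        for day in range(14)])
print(estimate_period_hours(crows))  # expect a value close to 24
```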
In the experiment’s second part, the researchers also subjected the roosters to alternating periods of 12 hours of light and 12 hours of darkness, while using bright flashes of light and the recorded crowing of roosters (since crowing is known to be contagious) to induce crowing at different times of day. When they activated these stimuli at or near the dawn of the roosters’ 12-hour day, crowing rates increased significantly. At other times of day, though, exposing them to sudden flashes of light or playing the sound of crowing had virtually no effect, showing that the underlying circadian cycle played a role in the birds’ response to the stimuli.
Of course, many people who live in close proximity to roosters note that they often crow in response to a random light source turning on, like a car’s headlights, no matter what time of day it is. While this may be true, the experiment shows that the odds of a rooster responding to a car’s headlights depend on how close the current time is to dawn—at some level, the rooster’s body knows whether it should be crowing or not, and it responds to artificial stimuli based on that rhythm.
For the research team, all this is merely a prelude to their bigger, more complex questions: Why do roosters have a biological clock that controls crowing in the first place, and how does it work? They see the simple crowing patterns of the rooster as an entry point into better understanding the vocalizations of a range of animals. “We still do not know why a dog says ‘bow-wow’ and a cat says ‘meow,’” Takashi Yoshimura, one of the co-authors, said in a press statement. “We are interested in the mechanism of this genetically controlled behavior and believe that chickens provide an excellent model.”
October 26, 2012
What was a group of orthopedic physicians doing with a hydraulic press, a set of kitchen knives, a set of pumpkin carving tools and a dead human hand? Well, if the headline didn’t give it away, then the approaching holiday might give you a hint about their exceedingly creepy experimental setup.
In 2004, a team from SUNY Upstate Medical University in Syracuse decided to rigorously investigate the potential risks of pumpkin carving, comparing the threats posed by conventional kitchen knives with those of other tools specifically intended for pumpkins. As Marc Abrahams (the editor of the Annals of Improbable Research) recently pointed out at the Guardian, the research published in the journal Preventive Medicine provides the most comprehensive account of pumpkin-carving danger we have to date.
“Even with optimal treatment, injuries from pumpkin carving accidents may leave people with compromised hand function,” they wrote. Common pumpkin-carving injuries come in a number of forms: hand punctures, resulting from instances where a knife is accidentally pushed through the pumpkin and contacts the opposite hand stabilizing it; and lacerations, caused by the cutting hand slipping off the knife handle and sliding across the blade.
Because of these risks, many companies market pumpkin-specific carving tools, claiming that they’re safer than sharp knives. Naturally, the researchers wanted to test these supposed safety benefits. As they noted, “evidence that they are safer is required before these knives can be recommended.”
In order to find such evidence, they compared various carving instruments—a serrated kitchen knife, a plain knife and two brands of pumpkin-specific tools (the Pumpkin Kutter and the Pumpkin Masters’ Medium Saw)—by placing each one in the grip of a hydraulic press and carefully measuring exactly how much force needed to be applied to pierce a pumpkin and to lacerate a human hand. Since live volunteers for such an experiment might not be all that plentiful, they used six cadaver hands, harvested at the elbow.
In the first stage of the study, when the implements were tested on a pumpkin, each one was pushed into the squash’s flesh at the rate of 3 mm per second. The pumpkin-specific instruments performed as advertised, cutting into the pumpkins more easily than the kitchen knives. Theoretically, if less force is needed to actually carve the pumpkin, the risk of pushing too hard and accidentally cutting yourself should be lower.
In the second phase, each of the cutting tools was tested on the cadaver hands in two different ways: the researchers measured how much force was needed to lacerate the finger and to puncture the palm. In this case, the kitchen knives cut more easily into the hands, requiring less force and causing “more skin lacerations that would require suturing than either pumpkin knife.” When it comes to hands, the knives were more dangerous.
The researchers’ conclusions? “Tools designed specifically for pumpkin carving may indeed be safer. Use of these products, and increased overall awareness of the risks of pumpkin carving in general, which clinicians could help promote, might reduce the frequency and severity of pumpkin carving injuries.”
Another pressing question solved by science. No word yet on what happened to the lacerated and punctured cadaver hands afterward.