July 15, 2013
Déjà vu is a rare occurrence, but you know it when you feel it. As you walk through a new city for the first time, something familiar clicks in your mind, giving you pause. You’ve definitely been here before.
But you haven’t. So what gives?
Well, no one really knows for sure. The origin of déjà vu (French for “already seen”), a sense of familiarity with something entirely new, remains hidden somewhere deep in our brains. The phenomenon is difficult to study—most people, when they experience déjà vu, aren’t hooked up to a bunch of electrodes, with clipboard-toting researchers at the ready.
However, scientists have pondered the question for quite some time: A description of a déjà vu experience in patients with epilepsy appears as early as 1888. The observation was no coincidence—those with some types of epilepsy seem to feel déjà vu more often than those without the neurological disorder. Research on such patients showed that their feelings of déjà vu
were likely linked to seizure activity in the medial temporal lobe, the part of the brain associated with sensory perception, speech production and memory association.
During a seizure, neurons misfire, sending mixed-up messages to different parts of the body. For these patients, déjà vu is a result of getting their wires crossed. When some patients undergo brain surgery to stop the seizures, they wake up to a world free of the phenomenon.
Some scientists posit that similar neural misfiring—a glitch in the system—also causes healthy, seizure-free brains to experience a sense of familiarity when there’s no reason to.
A second hypothesis involves another brain error; this time, the problem is with our memory, says Anne Cleary, a cognitive psychology professor at Colorado State University. Something about a new situation or setting activates a memory of a similar past experience, but our brains fail to recall it. Cleary offers this scenario to help explain: Imagine you’re visiting Paris for the first time, and you have arrived at the Louvre. Your gaze lands on the giant glass pyramid jutting out of the museum’s main courtyard, and you get that strange feeling.
At that moment, your brain is failing to retrieve a memory that could explain it away: A few months ago, you watched The Da Vinci Code, a film that provides an up-close look at the Louvre Pyramid. “In the absence of recalling that specific experience,” Cleary says, “you’re left only with this feeling of familiarity with the current situation.”
Cleary suspected that this sense of familiarity results from our ability to remember the spatial configuration of surroundings. To test this hypothesis, she set out to induce déjà vu in a laboratory setting (PDF). Using the life simulation game The Sims, Cleary and her team built two scenes, different in their features but identical in their layout. The first was a courtyard setting featuring a potted tree in the center, encircled by various plants, and hanging plant baskets on the walls. The second was a museum setting that swapped the tree for a large statue, the floor plants with rugs and the hanging baskets with sconces.
When participants explored the second room, they reported experiencing a feeling of déjà vu, but they couldn’t connect that to their time spent navigating the first room. “People do have an increased sense of déjà vu when the scene has a similar layout, but they’re failing to recall the source of that familiarity,” Cleary says.
Yet another possible explanation for déjà vu, says Cleary, dates back to 1928, when psychologist Edward Titchener described the sensation using the example of crossing a street. As we begin to cross a street, we instinctively look to the left, but if something catches our attention on our right,
we turn in that direction. By the time we look to our left again, our brains may have forgotten the first glance. This second glance triggers a feeling of familiarity, because, in this case, we really have seen something before.
In many cases, people who experience déjà vu can’t pinpoint why it’s happening. But for what it’s worth, our brains are trying to tell us, Cleary says. Tip-of-the-tongue experiences work in much the same way: for instance, we know that we know the name of that actor in that one movie, but we can’t pull it to the front of our minds. “When retrieval does fail, our memories still have a way of alerting us to the fact that there’s something relevant in there,” she says. “There’s something there that maybe we want to keep searching for.”
July 3, 2013
To keep their bodies running at peak performance, people often hit the gym, pounding away at the treadmill to strengthen muscles and build endurance. This dedication has enormous benefits—being in shape now means warding off a host of diseases when you get older. But does the brain work in the same way? That is, can doing mental exercises help your mind stay just as sharp in old age?
Experts say it’s possible. As a corollary to working out, people have begun joining brain gyms to flex their mental muscles. For a monthly fee of around $15, websites like Lumosity.com and MyBrainTrainer.com promise to enhance memory, attention and other mental processes through a series of games and brain teasers. Such ready-made mind exercises are an alluring route for people who worry about their ticking clock. But there’s no need to slap down the money right away—new research suggests the secret to preserving mental agility may lie in simply cracking open a book.
The findings, published online today in Neurology, suggest that reading books, writing and engaging in other similar brain-stimulating activities slows down cognitive decline in old age, independent of common age-related neurodegenerative diseases. In particular, people who participated in mentally stimulating activities over their lifetimes, in young, middle and old age alike, had a slower rate of decline in memory and other mental capacities than those who did not.
Researchers used an array of tests to measure 294 people’s memory and thinking every year for six years. Participants also answered a questionnaire about their reading and writing habits, from childhood to adulthood to advanced age. Following the participants’ deaths at an average age of 89, researchers examined their brains for evidence of the physical signs of dementia, such as lesions, plaques and tangles. Such brain abnormalities are most common in older people, causing them to experience memory lapses. They proliferate in the brains of people with Alzheimer’s disease, leading to memory and thinking impairments that can severely affect victims’ daily lives.
Using information from the questionnaire
and autopsy results, the researchers found that any reading and writing is better than none at all. Remaining a bookworm into old age reduced the rate of memory decline by 32 percent compared to engaging in average mental activity. Those who didn’t read or write often later in life did even worse: their memory decline was 48 percent faster than people who spent an average amount of time on these activities.
The researchers found that mental activity accounted for nearly 15 percent of the difference in memory decline, beyond what could be explained by the presence of plaque buildup. “Based on this, we shouldn’t underestimate the effects of everyday activities, such as reading and writing, on our children, ourselves and our parents or grandparents,” says study author Robert S. Wilson, a neuropsychologist at the Rush University Medical Center in Chicago, in a statement.
Reading gives our brains a workout because comprehending text requires more mental energy than, for example, processing an image on a television screen. Reading exercises our working memory, which actively processes and stores new information as it comes. Eventually, that information gets transferred into long-term memory, where our understanding of any given material deepens. Writing can be likened to practice: the more we rehearse the perfect squat, the better our form becomes, tightening all the right muscles. Writing helps us consolidate new information for the times we may need to recall it, which boosts our memory skills.
So the key to keeping our brains sharp for the long haul does have something in common with physical exercise: we have to stick with it. And it’s best to start early. In 2009, a seven-year study of 2,000 healthy individuals aged 18 to 60 found that mental agility peaks at 22. By 27, mental processes like reasoning, spatial visualization and speed of thought began to decline.
June 28, 2013
Snakes and fish do it. Cats and dogs do it. Even human babies do it inside the womb. And maybe after seeing the picture above, you’re doing it now: yawning.
Yawning appears to be ubiquitous within the animal kingdom. But despite being such a widespread feature, scientists still can’t explain why yawning happens, or why for social mammals, like humans and their closest relatives, it’s contagious.
As yawning experts themselves will admit, the behavior isn’t exactly the hottest research topic in the field. Nevertheless,
they are getting closer to the answer to these questions. An oft-used explanation for why we yawn goes like this: when we open wide, we suck in oxygen-rich air. The oxygen enters our bloodstream and helps to wake us up when we’re falling asleep at our desks.
Sounds believable, right? Unfortunately, this explanation is actually a myth, says Steven Platek, a psychology professor at Georgia Gwinnett College. So far, there’s no evidence that yawning affects levels of oxygen in the bloodstream, blood pressure or heart rate.
The real function of yawning, according to one hypothesis, could lie in the human body’s most complex system: the brain.
Yawning—a stretching of the jaw, gaping of the mouth and long deep inhalation, followed by a shallow exhalation—may serve as a thermoregulatory mechanism, says Andrew Gallup, a psychology professor at SUNY College at Oneonta. In other words, it’s kind of like a radiator. In a 2007 study, Gallup found that holding hot or cold packs to the forehead influenced how often people yawned when they saw videos of others doing it. When participants held a warm pack to their forehead, they yawned 41 percent of the time. When they held a cold pack, the incidence of yawning dropped to 9 percent.
The human brain takes up 40 percent of the body’s metabolic energy, which means it tends to heat up more than other organ systems. When we yawn, that big gulp of air travels through to our upper nasal and oral cavities. The mucus membranes there are covered with tons of blood vessels that project almost directly up to the forebrain. When we stretch our jaws, we increase the rate of blood flow to the skull, Gallup says. And as we inhale at the same time, the air changes the temperature of that blood flow, bringing cooler blood to the brain.
In studies of mice, an increase in brain temperature was found to precede yawning. Once the tiny rodents opened wide and inhaled, the temperature decreased. “That’s pretty much the nail in the coffin as far as the function of yawning being a brain cooling mechanism, as opposed to a mechanism for increasing oxygen in the blood,” says Platek.
Yawning as a thermoregulatory mechanism could explain why we seem to yawn most often when it’s almost bedtime or right as we wake up. “Before we fall asleep, our brain and body temperatures are at their highest point during the course of our circadian rhythm,” Gallup says. As we fall asleep, these temperatures steadily decline, aided in part by yawning. But, he added, “Once we wake up, our brain and body temperatures are rising more rapidly than at any other point during the day.” Cue more yawns as we stumble toward the coffee machine. On average, we yawn about eight times a day, Gallup says.
Scientists haven’t yet pinpointed the reason we often feel refreshed after a hearty morning yawn. Platek suspects it’s because our brains function more efficiently once they’re cooled down, making us more alert as a result.
A biological need to keep our brains cool may have trickled into early humans’ and other primates’ social networks. “If I see a yawn, that might automatically cue an instinctual behavior that if so-and-so’s brain is heating up, that means I’m in close enough vicinity, I may need to regulate my neural processes,” Platek says. This subconscious copycat behavior could improve individuals’ alertness, improving their chances of survival as a group.
Mimicry is likely at the heart of why yawning is contagious. This is because yawning may be a product of a quality inherent in social animals: empathy. In humans, it’s the ability to understand and feel another individual’s emotions. The way we do that is by stirring a given emotion in ourselves, says Matthew Campbell, a researcher at the Yerkes National Primate Research Center at Emory University. When we see someone smile or frown, we imitate them to feel happiness or sadness. We catch yawns for the same reasons—we see a yawn, so we yawn. “It isn’t a deliberate attempt to empathize with you,” Campbell says. “It’s just a byproduct of how our bodies and brains work.”
Platek says that yawning is contagious in about 60 to 70 percent of people—that is, if people see photos or footage of or read about yawning, the majority will spontaneously do the same. He has found that this phenomenon occurs most often in individuals who score high on measures of empathic understanding. Using functional magnetic resonance imaging (fMRI) scans, he found that areas of the brain activated during contagious yawning, the posterior cingulate and precuneus, are involved in processing our own and others’ emotions. “My capacity to put myself in your shoes and understand your situation is a predictor for my susceptibility to contagiously yawn,” he says.
Contagious yawning has been observed in humans’ closest relatives, chimpanzees and bonobos, animals that are also characterized by their social natures. This raises a corollary question: is their capacity to contagiously yawn further evidence of the ability of chimps and bonobos to feel empathy?
Along with being contagious, yawning is highly suggestible, meaning that for English speakers, the word “yawn” is a representation of the action, a symbol to which we’ve learned to attach meaning. When we hear, read or think about the word or the action itself, that symbol becomes “activated” in the brain. “If you get enough stimulation to trip the switch, so to speak, you yawn,” Campbell says. “It doesn’t happen every time, but it builds up and at some point, you get enough activation in the brain and you yawn.”
June 21, 2013
Sharks have it tougher than most when it comes to public relations. Unlike a number of disgraced celebrities, politicians and athletes who’ve more or less managed to come out the other side of a scandal, the marine creatures haven’t been able to shake their bad reputation for 38 years. What’s more, they probably didn’t even deserve it in the first place.
Steven Spielberg’s Jaws, which premiered this week in 1975, was adapted from a 1974 novel of the same name. The book was inspired by real-life events, a series of shark attacks along the Jersey Shore in July 1916 that killed four people. The type of shark behind the attacks was never confirmed, but Spielberg picked the prime suspect to be his villain: the great white shark. However, the movie has led viewers to paint all kinds of sharks as massive, bloodthirsty killers with a taste for revenge.
That’s about 440 species of sharks. Talk about one fish (unknowingly) ruining it for the rest of them.
Here’s the thing: most of these sharks don’t have a taste for human blood—they don’t express special interest in mammal blood as opposed to fish blood. Diets vary across the many species around the globe, but they usually include other fish, crustaceans and marine mammals such as seals. The biggest species, the whale shark (which can reach up to 60 feet in length), feeds only on plankton.
And those supposedly voracious appetites that, in the movies, give them unnatural speed?
Most of the time, sharks are just not hungry. While they can reach up to 30 miles per hour or more in sudden bursts, they tend to cruise at a lackadaisical pace of about five miles per hour. And sharks that swim with their mouths open aren’t always in attack mode—they open wide to ventilate their gills.
Not all sharks are big enough to ram into and capsize unsuspecting boats, either. About 80 percent of all shark species grow to be less than five feet long. Only 32 species have been documented in attacks on humans, the repeat players being the great white, tiger and bull sharks. Your lifetime risk of dying from an attack by one of these predators is pretty small: about 1 in 3,700,000. Compare that to your odds of dying in a car accident (1 in 84), a fall (1 in 218), a lightning strike (1 in 79,746) or fireworks (1 in 340,733). Yet many people have an irrational fear of sharks, born from movies like Jaws.
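Just how lopsided these odds are is easier to see with a quick back-of-the-envelope calculation. The sketch below simply rescales the 1-in-N lifetime odds quoted above (the figures are the article’s, not independently verified):

```python
# Lifetime odds of death, as 1-in-N figures quoted in the text.
odds = {
    "shark attack": 3_700_000,
    "car accident": 84,
    "fall": 218,
    "lightning strike": 79_746,
    "fireworks": 340_733,
}

# How many times more likely each other cause of death is than a
# fatal shark attack (a larger N means a less likely event).
ratios = {
    cause: odds["shark attack"] / n
    for cause, n in odds.items()
    if cause != "shark attack"
}

for cause, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"A {cause} is roughly {ratio:,.0f} times more likely.")
```

By this reckoning, a fatal car accident comes out on the order of tens of thousands of times more likely than a fatal shark attack.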
Today, an emerging public relations campaign is underway to show that sharks aren’t the bad guys anymore—they’re the victims. According to the International Union for Conservation of Nature, 30 percent of open-ocean sharks and stingrays, their fellow sea dwellers, face extinction. True, 12 people are killed by sharks each year worldwide. However, 11,417 sharks are killed every hour by humans, adding up to roughly 100 million a year. Some of these deaths are intentional: sharks are often hunted for their fins to make soup or caught for sport, their toothy jaws kept as trophies. Others fall prey to recreational fishing or nets meant to protect humans. Still others die because their habitats are slowly disappearing due to human activity, which reduces their food supply and pollutes the water pumping through their gills.
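The per-hour and per-year kill figures above are consistent with each other, as a quick multiplication shows (using the article’s numbers and a 365-day year):

```python
# Scale the hourly shark-kill figure quoted above to a full year.
sharks_per_hour = 11_417
hours_per_year = 24 * 365  # 8,760 hours in a non-leap year

sharks_per_year = sharks_per_hour * hours_per_year
print(f"{sharks_per_year:,} sharks killed per year")  # just over 100 million

# For contrast: the 12 reported human deaths from shark attacks per year.
human_deaths_per_year = 12
print(f"{sharks_per_year // human_deaths_per_year:,} sharks killed per human death")
```

That works out to more than eight million sharks killed for every person killed by a shark.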
The numbers are stark: In some parts of the world, the scalloped hammerhead shark population has shrunk by 99 percent in the last 30 years. In tropical Atlantic waters, the population of silky sharks is now half of what it was in the early 1970s. The Pacific’s whitetip shark population fell by 93 percent between 1995 and 2010.
This spring, an international organization implemented a ban on international trade in the whitetip, the porbeagle and three species of hammerhead sharks. The Shark Conservation Act, signed into law by President Barack Obama in 2011, closed loopholes in existing shark conservation legislation and promoted U.S.-led protection efforts worldwide. Even Discovery Channel’s Shark Week, which for a quarter of a century has hooked viewers with the promise of a fear-filled thrill ride, is partnering with conservationists to help boost sharks’ public image.
But perhaps the biggest shift in the Jaws-dominated shark culture is this: some survivors of shark attacks are actually teaming up to save the creatures that once nearly killed them. As shark attack survivor Debbie Salamone explains on the Pew Charitable Trusts website, “If a group like us can see the value in saving sharks, shouldn’t everyone?”
June 13, 2013
For centuries, millions of Europeans suffering from leprosy were shunned by society, made to wear bells that signaled to healthy citizens they were nearby. The infectious illness, also known as Hansen’s Disease, was poorly understood, often believed to be hereditary or a punishment from God. At its height, nearly one in 30 had the disease in some regions; by the 13th century, the number of leper hospitals active in Europe hit its peak at 19,000. Then, in the 16th century, the affliction fell into decline. Soon, it had virtually disappeared from the continent.
The pathogen responsible for leprosy was discovered in 1873 in Norway, squashing previous assumptions about its cause. The earliest written mention of leprosy, one of the oldest-known pathogens to plague humans, appeared in 600 B.C. in China. Historical records show it plagued ancient Greek, Egyptian and Indian civilizations. In 2009, DNA analysis of a first-century man’s remains found in a Jerusalem tomb provided the earliest proven case of leprosy.
Now, DNA sequencing technology has provided clues about the evolution of the bacteria itself. Using well-preserved DNA samples from ancient skeletons, an international team of researchers has sequenced the genome of the pathogen Mycobacterium leprae as it existed in medieval times.
Until now, scientists hadn’t even been able to sequence the pathogen from living people—the bacterium can’t be grown in cell culture in the lab, so scientists usually infect mice with it to achieve a sample big enough for sequencing. The material gleaned from human bones for this study, exhumed from medieval graves, contained a tiny amount of bacterial DNA—less than 0.1 percent, to be exact. But thanks to extremely sensitive and precise technology, scientists were able to sequence five strains of M. leprae.
Today, more than 225,000 cases of leprosy arise each year, mostly in developing countries. Using samples from some of these cases, the researchers compared the centuries-old sequences to 11 modern strains of the pathogen, extracted from recent biopsies from several geographic regions.
The results, published today in the journal Science, reveal that the bacterium has, in terms of genetic makeup, remained relatively unchanged over the last 1,000 years. Only 800 mutations occurred among the 16 genomes in that time, the researchers write. This number means that the mysterious disappearance of the disease in Europe at the end of the Middle Ages can’t be attributed to M. leprae losing its virulence.
“If the explanation of the drop in leprosy cases isn’t in the pathogen, then it must be in the host—that is, in us,” says Stewart Cole, co-director of the study and the head of the École Polytechnique Fédérale de Lausanne’s Global Health Institute. “So that’s where we need to look.”
The pathogen’s genetic resilience was evident in its modern strains. Researchers found that a medieval strain present in Sweden and the U.K. was nearly identical to one currently found in the Middle East. Their findings also suggest that some strains found in the Americas originated in Europe. What they can’t tell us, however, is the direction in which the epidemic spread throughout history.
This research marks a growing trend in using DNA analysis to learn more about epidemics and other devastating events in human history. Last month, scientists sampled 166-year-old Irish potato leaves using similar technology: They determined that a previously unknown strain of Phytophthora infestans caused the blight that shrunk 19th-century Ireland’s population by 25 percent. Perhaps future research could someday pinpoint the pathogen responsible for the bubonic plague, commonly known as the Black Death, which wiped out nearly half of Europe’s population between 1347 and 1351.