March 28, 2013
In 1968, Andy Warhol—already a celebrity in his own right—added to his fame by coining a lasting cliché: “In the future, everyone will be world-famous for 15 minutes.”
Prescient as Warhol might have been, it seems we haven’t reached that future quite yet, at least according to science. A new study, published today in the American Sociological Review, finds that true fame lasts a good deal longer than 15 minutes. In an analysis of celebrity journalism nationwide, researchers found that the most famous (and most often mentioned) celebrities stick around for decades.
To come to the finding, a number of sociologists each spent a multi-year sabbatical meticulously combing the “Stars: They’re Just Like Us” feature of Us Magazine. Several reportedly declined to return to the field of academia, apparently taking their talents to the analytical departments of the glossy magazine industry full-time.
Just kidding! In all seriousness, the sociologists, led by Eran Shor of McGill University and Arnout van de Rijt of Stony Brook University, used an automated search to take a random sample of roughly 100,000 names that appeared in the entertainment sections of 2,200 daily American newspapers published between 2004 and 2009. Their sample didn’t include every single name published, but rather a random selection of names published at all different frequencies—so it wouldn’t be useful for telling you who was the most often-mentioned celebrity overall, but would be illustrative of the sorts of trends that famous (and not-so-famous) names go through over time.
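For intuition, here is a minimal sketch of that kind of frequency-blind sampling in Python. The corpus, name pool, and counts below are toy stand-ins (assumptions for illustration), not the study’s actual data or code:

```python
import random
from collections import Counter

random.seed(0)

# Toy stand-in for the study's corpus: (name, year) mention records.
# The real study ran an automated search over the entertainment sections
# of 2,200 U.S. daily newspapers published between 2004 and 2009.
names = [f"Celebrity {i}" for i in range(500)]
mentions = [(random.choice(names), random.choice(range(2004, 2010)))
            for _ in range(20_000)]

# Count how often each name is mentioned in each year.
counts = Counter(mentions)

# Sample names uniformly, ignoring how often each one appears. This yields
# names from across the frequency spectrum, so it can show how famous and
# not-so-famous names persist over time, but it cannot rank overall fame.
sampled = random.sample(sorted({name for name, _ in counts}), k=100)

for name in sampled[:5]:
    trajectory = {year: counts[(name, year)] for year in range(2004, 2010)}
    print(name, trajectory)
```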
The ten most frequently mentioned names in their sample: Jamie Foxx, Bill Murray, Natalie Portman, Tommy Lee Jones, Naomi Watts, Howard Hughes, Phil Spector, John Malkovich, Adrien Brody and Steve Buscemi. All ten, the researchers note, were already relatively famous before the year 2000, in some cases decades earlier (Howard Hughes rose to fame in the 1920s). All ten names, additionally, are still fairly well-known today.
Overall, 96 percent of the most famous names in the sample (those mentioned more than 100 times over the course of a given year) had already been frequently featured in the news three years earlier, further dispelling the 15-minutes-of-fame cliché. Furthermore, if a name was mentioned extremely often in its first year of appearing, it stood a greater chance of sticking around for an extended period of time.
There is, however, some truth to the 15-minutes idea: Names of lesser fame (those less frequently mentioned to start) exhibit significantly higher turnover from year to year. The researchers say these names mostly fall into the category of people involved in newsworthy events—such as natural disasters and crimes—rather than people who readers find newsworthy in their own right. As an example, Van de Rijt mentions Chesley Sullenberger, the US Airways pilot who briefly achieved celebrity after successfully executing an emergency landing on the Hudson River in 2009, but is now scarcely mentioned in the press.
The list of the most famous names, though, stays relatively similar every year. “The vast majority of coverage goes to names that have already been in the news for several years, and new names rarely penetrate the higher strata of fame,” the researchers write in the study. The bottom of the fame hierarchy is filled with new names annually, but at the top, they write, is “a reshuffling of already familiar names and not rapid replacement of an outgoing cohort by an incoming cohort.”
Apart from the newspaper data, the team also looked at a much smaller sample of celebrity mentions on blogs and TV, and found a similar trend. New media, it seems, follow roughly the same pattern as old outlets—which is why you don’t see much about figures like the “balloon boy” across the web nowadays either.
Frivolous as the work may seem, the researchers say it carries important implications about our society. Upward mobility in the celebrity world is extremely scarce. Becoming famous requires some combination of talent and luck that allows a person to break into the elite class of being mentioned over and over by the press. But what is that combination? What makes a person famous? Or has the press created a cycle that allows a person to remain famous, in some cases after his or her career has peaked, or even after his or her death?
No word yet on whether scientists will someday be able to create a multivariable model to quantify celebrity “fierceness” over time as well.
March 25, 2013
One of the most striking trends of modern society is the number of people who choose to live alone. As sociologist Eric Klinenberg observed in his 2012 book Going Solo, living alone was virtually unheard of in most world cultures prior to the 20th century, but an estimated 32.7 million people now live alone in the United States, accounting for about 28 percent of the country’s households today, compared with 17 percent in 1970.
The medical and mental effects of this shift are complex. As Klinenberg notes, many people who live alone still remain highly social and connected with friends and family, so living alone doesn’t necessarily mean that a person is isolated.
But what of those who live alone and are socially isolated? In a study published today in the Proceedings of the National Academy of Sciences, a group of researchers from University College London explored the health consequences for people who are isolated from others, and found that limited contact with others increases a person’s overall risk of death over time.
The group, led by Andrew Steptoe, examined data on 6,500 older adults (aged 52 and up) who took part in the English Longitudinal Study of Ageing in 2004, and monitored which participants survived through March 2012. The researchers specifically looked at the association between mortality (overall risk of death) and a pair of conditions: social isolation (as indicated by a lack of contact with others) and loneliness (as reflected by participants’ answers on a survey).
In total, 14.1 percent of the people who’d participated in the survey died in the eight years after it was administered, but those who were classified as socially isolated died at considerably higher rates. Of the most socially isolated respondents, 21.9 percent did not survive to March 2012, compared with 12.3 percent of the least isolated. Even after the participants’ baseline health and demographic factors were taken into account, being socially isolated still correlated with increased mortality.
Interestingly, though, defining oneself as lonely—via the answers about one’s emotions and psychological state on the survey—did not have the same effect. Those who were lonely did have overall higher mortality, but this was because on average, they were older and had poorer baseline health conditions at the start. When the researchers controlled for baseline health and age, the mortality gap between the lonely and the non-lonely largely vanished.
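To make that “controlling for” step concrete, here is a minimal sketch of this kind of covariate adjustment with a Cox proportional-hazards model, using the lifelines library on simulated data. The variables, effect sizes, and cohort below are assumptions for illustration, not the study’s actual dataset or analysis:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 6500  # same order of magnitude as the ELSA sample described above

# Simulated cohort: loneliness is made more common among older and less
# healthy participants, but has no direct effect on the death hazard.
age = rng.integers(52, 91, n)
health = rng.normal(0.0, 1.0, n)  # higher = better baseline health
lonely_prob = 1 / (1 + np.exp(-(0.05 * (age - 70) - 0.8 * health)))
df = pd.DataFrame({
    "age": age,
    "baseline_health": health,
    "lonely": rng.binomial(1, lonely_prob),
    "isolated": rng.integers(0, 2, n),
})

# Death hazard depends on age, health and isolation, not on loneliness.
hazard = 0.02 * np.exp(0.03 * (df["age"] - 52)
                       - 0.3 * df["baseline_health"]
                       + 0.4 * df["isolated"])
time_to_death = rng.exponential(1.0 / hazard)
df["duration"] = np.minimum(time_to_death, 8.0)  # eight years of follow-up
df["died"] = (time_to_death < 8.0).astype(int)

# Crude model: loneliness looks deadly because it travels with age and
# poor health. Adjusted model: the loneliness effect largely vanishes.
crude = CoxPHFitter().fit(df[["duration", "died", "lonely"]],
                          "duration", "died")
adjusted = CoxPHFitter().fit(df, "duration", "died")
print(crude.summary.loc["lonely", "exp(coef)"])     # hazard ratio > 1
print(adjusted.summary.loc["lonely", "exp(coef)"])  # hazard ratio ~ 1
```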
This indicates that the real danger of living alone is not feeling lonely per se, but having reduced contact with others. One possibility is that an older person who seldom sees friends and family is less likely to get the help they need in managing various ailments, and is probably also less likely to be encouraged to go see a doctor when new health problems pop up. The researchers speculate that living alone might even cause people to have poorer health habits, such as smoking, eating an unhealthy diet and getting less physical activity.
This jibes with previous work by other researchers, such as findings that living alone with a serious cardiovascular problem makes you more likely to die, and a 2011 Finnish study showing that living on your own increases your risk of an alcohol-related death. Being around others, it seems, helps ensure that we take better care of ourselves—so if you’re planning on joining the many who have opted to live solo, you’re best off making sure you maintain frequent contact with friends and family.
March 19, 2013
In 2010, the surprising discovery that Neanderthals likely crossbred with our ancestors tens of thousands of years ago generated headlines around the world.
Now, we have a new finding about the sex lives of early Homo sapiens: It looks like they engaged in some inbreeding as well.
That is the conclusion of anthropologist Erik Trinkaus of Washington University in St. Louis and Xiu-Jie Wu and Song Xing of the Chinese Academy of Sciences’ Institute of Vertebrate Paleontology and Paleoanthropology, based on a fractured 100,000-year-old skull excavated from China’s Nihewan Basin. Their finding, published yesterday in PLOS ONE, is that the skull shows evidence of an unusual genetic mutation that is likely the result of high levels of inbreeding.
The researchers used CT scanning and 3D modeling to join together, for the first time, the five pieces of the fractured skull—known as Xujiayao 11, named for the site where it was found back in 1977—and realized that it exhibited an unusual deformity. When the pieces are combined, they leave a hole on the crown of the skull, but there is no evidence that the fracture was caused by a traumatic injury or disease. As a result, the researchers consider it most likely that the hole is a defect known as an enlarged parietal foramen.
Nowadays, this hole is mostly found in people carrying mutations in either of two genes, on chromosomes 5 and 11—most often a consequence of inbreeding—and occurs in about 1 of 25,000 live births. The mutation interferes with bone formation in the skull over the first five months of an infant’s life, when the pieces of the skull are supposed to fuse together to cover up the “soft spot.”
Given the tiny sample size of human skulls this old and the fact that similar kinds of genetic abnormalities have turned up so often in other prehistoric skulls—the researchers count 22 individuals with skull deformities from this era—Trinkaus thinks the simplest explanation is that small and unstable human populations forced our ancestors to inbreed.
If no inbreeding occurred, “the probability of finding one of these abnormalities in the small available sample of human fossils is very low, and the cumulative probability of finding so many is exceedingly small,” he said in a press statement. “The presence of the Xujiayao and other Pleistocene [2.6 million to 12,000 years ago] human abnormalities therefore suggests unusual population dynamics, most likely from high levels of inbreeding and local population instability.”
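For a back-of-the-envelope sense of that probability argument, consider the sketch below. Only the 1-in-25,000 incidence and the count of 22 abnormal individuals come from the text above; the skull-sample size is a hypothetical assumption, and treating every deformity as equally rare is a simplification:

```python
from math import comb

p = 1 / 25_000  # modern incidence of an enlarged parietal foramen (cited above)
n = 200         # assumed number of well-preserved skulls from the era (hypothetical)
k = 22          # individuals with skull deformities counted by the researchers

# Probability of seeing even one such defect by chance in n skulls.
p_any = 1 - (1 - p) ** n
print(f"P(at least 1 defect in {n} skulls) = {p_any:.4f}")  # ~0.008

# Binomial tail: probability of k or more affected individuals by chance.
p_tail = sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))
print(f"P(at least {k} defects) = {p_tail:.2e}")  # vanishingly small
```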
Such inbreeding was probably inevitable, given that humans lived in small, isolated populations for most of our species’ evolution. For example, some scientists believe that an earlier population bottleneck, predating this skull, may have driven the worldwide human population to as low as 2,000 individuals, at times making inbreeding a necessity. Our ancestors certainly didn’t understand the importance of genetic diversity and the dangerous consequences of inbreeding. But with such a scant population, the survival of our species might actually have depended on our ancient grandmothers procreating with their male relatives.
The good news? The researchers say that the genetic deformity preserved in this skull as a result of inbreeding may not have been too detrimental for this individual. The defect is normally linked with major cognitive problems, but that seems doubtful in this case: despite the demanding conditions of surviving in the Pleistocene, this prehistoric human appears to have lived to a ripe old age—which, in those days, probably means into his or her thirties.
March 13, 2013
In one of the fastest-growing areas in psychology, researchers are gaining insight into the mental processes of subjects that are barely able to communicate: babies. In recent years, innovative and playful experimental setups have suggested that infants as young as six months old have a sense of morality and fairness, and that 18-month-olds are capable of altruistically helping others.
Some of this research, though, has also shed light on babies’ dark side. A new study published in Psychological Science suggests that 9- to 14-month-olds exhibit a particularly unwelcome trait—in watching a puppet show, at least, they seem to prefer their own kind, and support puppets that pick on those who are different from them.
Because babies can’t communicate verbally, J. Kiley Hamlin of the University of British Columbia has pioneered the use of puppet shows to probe their psychology and better understand how they see the world. In this study, her research team put on a show in which 52 infant participants were led to identify themselves as similar to one of the characters and different from the other.
To accomplish this, the researchers started off by asking the infants to pick a food, either graham crackers or green beans (a little surprisingly, a full 42 percent chose the vegetables). Then, the infants were shown a pair of rabbit puppets, one who liked graham crackers and one who liked green beans.
Once each rabbit’s preference had been clearly demonstrated, one of the two—either the rabbit with the same preference as the infant observer, or the one with the opposite preference—would be randomly chosen to encounter a pair of new characters: one dog, termed a “helper,” and another, called a “harmer.” As the rabbit played with a ball and dropped it, the nice “helper” dog threw it back, but the mean “harmer” dog held onto the ball.
After both of the scenes were over, both dogs were presented to the infant, and the particular dog that the baby first reached for was interpreted as the character it preferred.
The results were a bit startling: When the infants had watched a play involving a rabbit with a food choice that matched theirs, 83 percent preferred the “helper” dog. When they’d watched a play with a rabbit who liked a different food, 88 percent chose the “harmer” dog. This held true regardless of the babies’ original food choices—the only thing that mattered was whether the rabbit’s identity, in terms of food choice, matched their own.
To further parse the motivations underlying the infants’ choices, the researchers conducted a similar experiment that involved a neutral dog that neither helped nor harmed the rabbit. In this part of the study, when watching rabbits whose favorite foods differed from their own, the infants not only liked “harmer” dogs more than neutral dogs, but even preferred neutral dogs to “helpers” (this was true among the 14-month-olds, but not the 9-month-olds). In other words, it seemed that they not only wanted to see the rabbit treated poorly, but also would rather see it treated neutrally than get some help.
Of course, when designing experiments for subjects that can’t use words to communicate, the simplest of variables could potentially throw off the results. It’s unclear, for example, whether the researchers alternated which side the “helper” and “harmer” puppets appeared on, so the babies could have been influenced by their emerging sense of handedness. In the past, critics of such puppet-show experiments have also charged that a baby merely reaching for one puppet or another might be an impulsive reflex, rather than a reflection of an underlying moral judgment.
What’s clear, though, is that the experiment demonstrated a consistent response across the babies tested. While extrapolating this to mean that babies are racist or bigoted is probably a step too far—for one, they were merely considering individual puppets, not groups of puppets with similar characteristics—it does raise interesting questions about when xenophobia first emerges in an individual’s lifetime.
March 12, 2013
Neanderthals never invented written language, developed agriculture or progressed past the Stone Age. At the same time, they had brains just as big in volume as modern humans’. The question of why we Homo sapiens are significantly more intelligent than the similarly big-brained Neanderthals—and why we survived and proliferated while they went extinct—has puzzled scientists for some time.
Now, a new study by Oxford researchers provides evidence for a novel explanation. As they detail in a paper published today in the Proceedings of the Royal Society B, a greater percentage of the Neanderthal brain seems to have been devoted to vision and control of their larger bodies, leaving less mental real estate for higher thinking and social interactions.
The research team, led by Eiluned Pearce, came to the finding by comparing the skulls of 13 Neanderthals who lived 27,000 to 75,000 years ago with 32 human skulls from the same era. In contrast to previous studies, which merely measured the interior of Neanderthal skulls to arrive at a brain volume, the researchers attempted to come to a “corrected” volume, accounting for the fact that Neanderthal brains controlled rather differently proportioned bodies than our ancestors’ brains did.
One of the easiest differences to quantify, they found, was the size of the visual cortex—the part of the brain responsible for interpreting visual information. In primates, the volume of this area is roughly proportional to the size of the animal’s eyes, so by measuring the Neanderthals’ eye sockets, the researchers could get a decent approximation of the size of their visual cortices as well. The Neanderthals, it turns out, had much larger eyes than ancient humans. The researchers speculate that this could be because they evolved exclusively in Europe, which lies at higher latitudes (and thus has poorer light conditions) than Africa, where H. sapiens evolved.
Along with larger eyes, Neanderthals had significantly larger bodies than humans, with wider shoulders, thicker bones and a more robust build overall. To account for this difference, the researchers drew upon previous estimates of the body masses of the skeletons found with these skulls and of other Neanderthals. In primates, the amount of brain capacity devoted to body control is also proportional to body size, so the scientists could calculate roughly how much of the Neanderthals’ brains was assigned to this task.
After correcting for these differences, the research team found that the amount of brain volume left over for other tasks—in other words, the mental capacity not devoted to seeing the world or moving the body—was significantly smaller for Neanderthals than for ancient H. sapiens. Although the average raw brain volumes of the two groups studied were practically identical (1473.84 cubic centimeters for humans versus 1473.46 for Neanderthals), the average “corrected” Neanderthal brain volume was just 1133.98 cubic centimeters, compared to 1332.41 for the humans.
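As a rough illustration of the arithmetic: the study standardized brains to a common eye and body size, but the sketch below treats the adjustment as a simple subtraction, with the deducted volumes back-calculated from the reported averages rather than taken from the paper:

```python
# Reported average raw endocranial volumes, in cubic centimeters.
raw = {"H. sapiens": 1473.84, "Neanderthal": 1473.46}

# Volume assumed devoted to vision and body control, back-calculated here
# from the reported raw and corrected averages (illustrative only): larger
# eyes and a larger body mean a larger deduction for the Neanderthals.
allocated = {"H. sapiens": 141.43, "Neanderthal": 339.48}

for species, volume in raw.items():
    corrected = volume - allocated[species]
    print(f"{species}: {volume:.2f} cc raw -> {corrected:.2f} cc corrected")
# H. sapiens: 1473.84 -> 1332.41 cc; Neanderthal: 1473.46 -> 1133.98 cc
```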
This divergence in mental capacity for higher cognition and social networking, the researchers argue, could have led to the wildly different fates of H. sapiens and Neanderthals. “Having less brain available to manage the social world has profound implications for the Neanderthals’ ability to maintain extended trading networks,” Robin Dunbar, one of the co-authors, said in a press statement. “[They] are likely also to have resulted in less well developed material culture—which, between them, may have left them more exposed than modern humans when facing the ecological challenges of the Ice Ages.”
Previous studies have also suggested that the internal organization of Neanderthal brains differed significantly from ours. For example, a 2010 project used computerized 3D modeling of Neanderthal skulls of varying ages to find that their brains developed at different rates over the course of adolescence than human brains did, despite comparable volumes.
The overall explanation for why Neanderthals went extinct while we survived, of course, is more complicated. Emerging evidence points to the idea that Neanderthals were smarter than previously thought, though perhaps not smart enough to outmaneuver humans for resources. But they didn’t vanish without a trace: in another major 2010 discovery, a team of researchers compared human and Neanderthal genomes and found evidence that our ancestors in Eurasia may have interbred with Neanderthals, preserving a few of their genes amid our present-day DNA.
Apart from the offspring of those rare interbreeding events, though, the Neanderthals did die out. Their brains might have been just as big as ours, but ours might have been better at a few key tasks—those involved in building social bonds in particular—allowing us to survive the most recent glacial period while the Neanderthals expired.