October 1, 2012
With the first presidential debate scheduled for Wednesday night, we’re about to hit the whitewater of the campaign, the time when any slip, any rock beneath the surface, can turn the boat over.
And though it doesn’t seem possible, the political advertising will shift into an even higher gear. Last week alone Barack Obama, Mitt Romney and outside political groups spent an estimated $55 million to drum their messages into the minds of voters.
But whose minds might they be? Must be the undecideds–that 2 to 8 percent of American voters who remain uncommitted and, it turns out, are largely uninformed.
It couldn’t be the rest of us, right? We’ve made up our minds, we know what we believe, right?
Change is good?
Well, maybe so. But perhaps not as much as you think. A new study of moral attitudes by a team of Swedish researchers would seem to suggest that our minds are considerably more changeable than we imagine.
Here’s how the study worked: Subjects were asked to take a survey on a number of issues for which people are likely to have strong moral positions–such as whether government surveillance of e-mail and the Internet should be allowed, to protect against terrorism. Or if helping illegal aliens avoid being sent back to their home countries was commendable or deplorable.
Once they assigned a number to each statement reflecting their level of agreement or disagreement, the participants turned to a second page of the survey attached to a clipboard. And in doing so, they unwittingly mimicked an old magic trick. The section of the first page containing the original statements lifted off the page, thanks to glue on the back of the clipboard. In its place was a collection of statements that seemed identical to the ones on the first list, but now each espoused the direct opposite position of the original. For instance, a stance deemed commendable in the first list was now described as deplorable.
On the other hand
The numerical values selected by those surveyed remained the same, but now they were in response to the other side of a moral issue. When the participants were asked to explain their responses, almost 70 percent of them didn’t realize they had performed one fine flip-flop.
Okay, let’s cut them some slack. It’s easy to miss a one-word change, even when it flips a statement into the exact opposite of what they had responded to. But here’s where it gets interesting. More than half, about 53 percent, actually offered arguments in favor of positions that just minutes before they had indicated they opposed.
I know what you’re thinking–you’d never do that. Maybe you wouldn’t. But the best conclusion the researchers could draw was that many of us just might not be as locked into our beliefs as we like to think.
Me, my bias, and I
If you want to see how flexible your political principles can be, consider downloading a plug-in developed at the University of Michigan called The Balancer. It’s designed to track your online reading habits and then calculate your political bias.
Researcher Sean Munson created The Balancer because, as he told NBC News’ Alan Boyle, he wanted to see if “having real-time feedback about your online news reading habits affects the balance of the news that you read.”
By matching your Web activity to a list of 10,000 news sources and blogs–each with a ranking on the political spectrum–The Balancer, through a button on your browser bar, lets you know how unbalanced your choices are. Depending on where you get your info, a stick figure will be shown overloaded with either conservative-red blocks or liberal-blue ones.
The plug-in, which works only on the Google Chrome browser, also suggests websites to visit if you don’t want your stick figure to tilt too much to one side.
Says Munson, who was surprised at the degree of his own bias: “Even self-discovery is a valuable outcome, just being aware of your own behavior. If you do agree that you should be reading the other side, or at least aware of the dialogue in each camp, you can use it as a goal: Can I be more balanced this week than I was last week?”
Stalking the vote
Here’s more recent research on what shapes and sometimes changes our political beliefs:
- That does not compute: A study published last month in Psychological Science in the Public Interest found that people are reluctant to correct misinformation in their memories if it fits in with their political beliefs.
- You like who?: According to a survey by the Pew Research Center, almost 40 percent of people on social networking sites say they’ve been surprised by the political leanings of some of their friends. Two-thirds say they don’t bother to respond to political posts from friends with whom they don’t agree.
- Facebook made me do it: A message on Facebook on the day of the 2010 congressional elections may have been responsible for an additional 340,000 Americans voting, concludes a study published in the journal Nature. They were most influenced, say researchers, by messages that their closest friends had clicked an “I voted” button.
- No, my parents made me do it: Research published recently in Trends in Genetics, based on the political beliefs of twins, suggests that your genetic makeup can influence your stance on issues such as abortion, unemployment and the death penalty, though children tend not to express those opinions until they leave home.
- It’s my party and I’ll lie if I want to: A study at Washington State University posits that a “belief gap” has replaced the “education gap” in American politics. Positions on many issues–and how much someone knows about an issue–are no longer largely determined by how much education someone has, but rather by which party they identify with.
- Funny how that happens: Late-night comedy shows, such as “The Daily Show with Jon Stewart” and “The Colbert Report,” can actually spur political discussions among friends, according to a new study at the University of Michigan.
Video bonus: In case you missed it, check out the “Saturday Night Live” take on undecided voters.