November 5, 2013
The Director of the African-American History and Culture Museum on What Makes “12 Years a Slave” a Powerful Film
As I sat in the theater crowded with nervous patrons, unsure of what to expect from a movie about slavery, I was startled by the audience’s visceral reaction to a scene depicting the violence that was so much a part of what 19th-century America called the “peculiar institution.” And then I found myself beginning to smile, not at the violence but with the realization that this movie, this brilliant movie, just might help to illuminate one of the darkest corners of American history. In many ways, American slavery is one of the last great unmentionables in public discourse. Few places, outside of history classes in universities, help Americans wrestle with an institution that dominated American life for more than two centuries. The imprint of slavery was once omnipresent, from the economy to foreign policy, from the pulpit to the halls of Congress, from westward expansion to the educational system. I smiled because if 12 Years a Slave found a wide audience, it just might help America overcome its inability to understand the centrality of slavery and its continuing impact on our society.
12 Years a Slave, imaginatively directed by Steve McQueen with an Oscar-worthy performance by Chiwetel Ejiofor, is the story of Solomon Northup, a free African-American living in New York who is kidnapped, “sold south” and brutally enslaved. Northup’s refusal to let his enslavement strip him of his humanity and his dignity, and his 12-year fight to reclaim his freedom and his family, are the dramatic heart of this amazing movie. Part of what makes this film experience so powerful is that it is based on the true story of Northup, a musician and man of family and community who had known only freedom until his kidnapping transplanted him into the violent world of Southern slavery.
The film’s depiction of slavery is raw and real. From the moment of his capture, Northup experiences the violence, the confinement, the sense of loss and the uncertainty that came with being enslaved. It is interesting that some of the criticism heaped on this film revolves around its use of violence. The scenes where Northup is beaten into submission, or where the brutal plantation owner, Edwin Epps (played with nuance and depth by Michael Fassbender), whips Patsey, an enslaved woman who could not escape the owner’s sexual abuse and rape, have been called excessive. In actuality, these scenes force us to confront the reality that violence was a key instrument used to maintain the institution of slavery. It is telling that movie audiences accept and revel in the violence that dominates films from Westerns to horror flicks to the recently lauded Django Unchained, and yet have a difficult time accepting the notion that some Americans used violence to attempt to control other Americans. The difference is that the violence in this movie makes it hard for Americans not to see our historical culpability, an uncomfortable position for a nation that traditionally views itself as on the side of the right and the righteous.
12 Years a Slave is such an important movie because it entertains and educates in a manner that is rife with nuance, historical accuracy and dramatic tension. It reveals stories about the African-American experience that are rarely seen or rarely as well depicted. Northup’s life as a free person of color is revelatory because it hints at the existence of the more than 500,000 African-Americans who experienced freedom while living in the North in the years just prior to the Civil War. Northup’s life of middling-class respectability and community acceptance was not the norm; most free blacks lived on the margins, their lives and communities limited by laws and customs that sought to enforce notions of racial inequality. Yet Northup’s very presence belied many of the racial beliefs of the period. There is a scene in the movie where Northup and his well-dressed family walk down the street, about to enter a shop, while being observed by an enslaved man whose southern owner has brought him north to serve him while on holiday in Saratoga. The enslaved man is amazed at the sight of a black family strolling freely and being greeted with respect by the shopkeeper. The owner quickly calls the man away, as if to ensure that he not be infected by the freedom exhibited by the Northup family.
The importance of family is also a key element in the film. While Northup’s desire to be reunited with his wife and children is part of what motivates him to survive his time in bondage, the power of kinship is revealed in the scenes where a mother struggles to keep her family together. Like Northup, a young boy is kidnapped and held in a slave pen in Washington, D.C. (ironically, I am writing this piece within 30 yards of the site of the slave pen where Northup was first held). When the mother learns where her son has been detained, she enters the pen with her daughter, hoping to reclaim her child. She is devastated when she and her daughter are also captured and readied to be sold into slavery. As the family is offered at auction, the pain the mother feels is almost unbearable as she begs, ultimately in vain, for someone to buy them all and not destroy her family. During the months that follow the sale, the woman is inconsolable. On the plantation where she and Northup now live, she cries almost non-stop, whether serving the owner’s family or attending church service. Eventually she is sold to another owner because the mistress of the plantation does not understand why she cannot simply get over the loss of her children. These scenes make clear that time could not heal all the wounds inflicted by slavery. In the years immediately following emancipation, thousands of the formerly enslaved searched for any hint that would help them reunite with their families. Letters were sent to the Freedmen’s Bureau seeking assistance, and well into the 1880s, the formerly enslaved placed ads in newspapers searching for loved ones cruelly separated by slavery. Rarely did these hoped-for reunions occur.
While 12 Years a Slave rightfully and appropriately privileges Solomon Northup’s resiliency and resolve, it also reminds us that men and women of good will crossed the color line, stood against the popular sentiments of the period and risked much to help abolish slavery. Northup’s encounter with a Canadian sympathetic to the cause of abolition, played by Brad Pitt, reveals much about Northup’s ingenuity and the need to enlist the help of sympathetic whites. After hearing Pitt’s character debate the plantation owner, Epps, over the morality of slavery, Northup cautiously convinces the Canadian to send a letter to the shopkeeper who knew him in New York and could prove that Northup was a free man. This begins the process that eventually returns Northup to his family in upstate New York. While Solomon Northup reunited with his family, most who were kidnapped never escaped the brutality of enslavement.
12 Years a Slave is a marvel. It works as a film and it works as a story that helps us to remember a part of the American past that is too often forgotten. We have all been made better by this film if we remember the shadow that slavery cast and if we draw strength and inspiration from those who refused to let their enslavement define them and those who, by refusing, helped make real the American ideals of freedom and equality.
November 4, 2013
Until 1982, anyone who used insulin to manage their diabetes got it from what we’d now think of as an unusual source: the pancreases of cows and pigs, harvested from slaughterhouses and shipped en masse to pharmaceutical processing plants. But there were problems with getting all our insulin this way—fluctuations in the meat market affected the price of the drug, and projected increases in the number of diabetic people made scientists worry that shortfalls in insulin supply could strike within the next few decades.
That all changed with the introduction of Humulin, the first synthetic human insulin. But the drug was a milestone for another reason, too: It was the first commercial product to come out of genetic engineering, synthesized by bacteria that had been altered to include the gene for producing human insulin.
Last year, the American History Museum acquired from Genentech, the San Francisco company responsible for Humulin’s development, a handful of key items used to create the drug, and put them on view last week in a display titled “The Birth of Biotech,” giving visitors a look into the dawn of the era of genetic engineering.
Genentech’s work began with a discovery made in the 1970s by a pair of Bay Area scientists, Herbert Boyer of UC San Francisco and Stanley Cohen of Stanford: Genes from multi-cellular organisms, including humans, could be implanted into bacteria and still function normally. Soon afterward, Boyer teamed with venture capitalist Robert Swanson to form the company, with the hope of using genetic engineering to create a commercially viable product.
Early on, they decided insulin was a logical choice. “It was convenient. It was an easy protein to handle, and it was obviously something that a lot of people needed,” says Diane Wendt, a Smithsonian curator who worked on the display.
One of their first achievements was synthetically building the human insulin gene in the lab, a single genetic base pair at a time. In order to check the accuracy of their sequence, they used a technique called gel electrophoresis, in which electricity forces the DNA through a gel. Because larger pieces of DNA migrate more slowly than smaller pieces, the process effectively filters the genetic material by size, allowing researchers to pick out the pieces they want, one of the key steps in early genetic sequencing methods.
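For readers curious about the principle, here is a toy model (not Genentech’s actual protocol) of why electrophoresis sorts DNA by size: in a simple approximation, a fragment’s migration distance falls off with the logarithm of its length, so shorter pieces end up farther down the gel. Every number below is invented for illustration.

```python
import math

# Toy model of gel electrophoresis: a fragment's migration distance
# falls off roughly with the logarithm of its length, so short pieces
# travel farther down the gel than long ones. All values are invented.

def migration_distance(length_bp, gel_length_cm=10.0, max_bp=10_000):
    """Approximate distance (cm) a fragment travels in a fixed run."""
    return gel_length_cm * (1 - math.log10(length_bp) / math.log10(max_bp))

fragments_bp = [5000, 1000, 200, 50]  # fragment sizes in base pairs
for bp in sorted(fragments_bp):
    print(f"{bp:>5} bp -> {migration_distance(bp):.2f} cm from the well")
```

Running this prints the 50-base-pair fragment nearly six centimeters down the gel while the 5,000-base-pair fragment barely leaves the well, which is exactly the size filtering the researchers relied on.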
Electrophoresis is still widely used, but the equipment donated by Genentech is decidedly more improvised than the standard setups seen in labs today. “You can see it’s sort of made by hand,” says Mallory Warner, who also worked on the display. “They used glass plates and binder clips, because they were working really quickly all the time and they wanted something they could take apart and clean easily.”
In order to manipulate DNA and other microscopic molecules, the researchers used a variety of tiny glass instruments. They made many of these tools themselves with a device called a microforge—essentially, a tool shop in extreme miniature, equipped with its own microscope so the makers could see what they were doing.
After synthesizing a gene for insulin, the scientists needed to incorporate it into a bacterium’s DNA so that the organism would produce insulin on its own. They used a variety of enzymes to do so, including EcoRI, which cuts DNA at a precise location determined by the surrounding base pairs. Researchers extracted small DNA molecules called plasmids from the bacterium, severed them with these enzymes, then used other enzymes to stitch the synthetic insulin gene in place. The new hybrid plasmid could then be inserted into live bacteria.
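As a rough sketch of what “cutting at a precise location” means, the snippet below scans a DNA string for EcoRI’s recognition sequence, GAATTC, and splits it at each site. Real EcoRI cuts both strands of double-stranded DNA, leaving “sticky” overhangs that make splicing in a new gene easier; this single-strand string model and the toy plasmid sequence are simplifications for illustration.

```python
# A minimal sketch (not Genentech's actual workflow): finding EcoRI
# recognition sites in a DNA sequence. EcoRI recognizes GAATTC and
# cuts after the first G, leaving overhangs that help splice genes in.

ECORI_SITE = "GAATTC"
CUT_OFFSET = 1  # EcoRI cuts between the G and the first A of its site

def ecori_fragments(dna: str) -> list:
    """Return the fragments produced by cutting dna at every EcoRI site."""
    fragments, start = [], 0
    pos = dna.find(ECORI_SITE)
    while pos != -1:
        fragments.append(dna[start:pos + CUT_OFFSET])
        start = pos + CUT_OFFSET
        pos = dna.find(ECORI_SITE, pos + 1)
    fragments.append(dna[start:])
    return fragments

plasmid = "ATGCGAATTCTTAGGCCGAATTCAAT"  # invented toy sequence
print(ecori_fragments(plasmid))  # ['ATGCG', 'AATTCTTAGGCCG', 'AATTCAAT']
```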
After the Genentech scientists successfully created bacteria carrying copies of the insulin gene, they confirmed that the microbes could produce human insulin in sufficient quantities in a fermentation tank like the one now in the display. The genetically modified bacteria were then passed off to researchers at Eli Lilly, who began producing the insulin in commercial quantities for sale. Voila: synthetic human insulin.
Of course, the state of biotechnology continued to evolve in the years after Humulin debuted, and the museum has collected notable items from that time as well. One is a prototype of a gene gun, developed by scientists at Cornell University in the mid-1980s.
The device makes it easier for scientists to introduce foreign genes into plant cells by coating tiny metal particles in DNA and firing them at the cells, forcing a small percentage of the genetic material into the cells’ nuclei, where it can enter their genomes. The original gene gun prototype used a modified air pistol as a firing mechanism, and the technique proved successful when it modified onion cells, chosen for their relatively large size.
Another subsequent innovation ushered in the age of biotechnology in earnest: polymerase chain reaction, or PCR, a chemical reaction developed in 1983 by biochemist Kary Mullis that allowed scientists to multiply a small DNA sample into far greater quantities with significantly less manual work. The first prototype PCR machine, or thermal cycler, was based on researchers’ knowledge of how enzymes like DNA polymerase (which synthesizes DNA from smaller building blocks) functioned at various temperatures. It relied on cycles of heating and cooling to rapidly generate large amounts of DNA from a small sample.
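The power of that heating-and-cooling cycle is easy to see with back-of-the-envelope arithmetic: each cycle lets the polymerase copy every strand, roughly doubling the target DNA. The starting copy number and cycle counts below are illustrative, and the sketch assumes ideal doubling on every cycle.

```python
# Back-of-the-envelope PCR arithmetic: each thermal cycle roughly
# doubles the amount of target DNA. Starting copies are hypothetical.

start_copies = 100  # e.g., DNA recovered from a tiny sample
for cycle in (10, 20, 30):
    copies = start_copies * 2 ** cycle  # ideal doubling per cycle
    print(f"after {cycle} cycles: ~{copies:,} copies")

# after 10 cycles: ~102,400 copies
# after 20 cycles: ~104,857,600 copies
# after 30 cycles: ~107,374,182,400 copies
```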
“The Birth of Biotech” is on display on the ground floor of the American History Museum through April 2014.
October 30, 2013
Andy Carvin is a man of many titles—“digital media anchor,” “real-time news DJ” and “online community organizer,” to name a few—but the one he is most comfortable with is “storyteller.” As NPR’s social media strategist, Carvin used Twitter during the Arab Spring to communicate with protesters in the Middle East and verify eyewitness accounts from the front lines, most of the time from his iPhone in the United States. He recently published a book about his work, Distant Witness.
Carvin has donated his old phone to the American History Museum, which will include it in “American Enterprise,” a 2015 exhibition on the role of innovation in the nation’s emergence as a world power. “Engaging with people through my phone on Twitter was a story itself,” he says of his reporting in 2011. Carvin, who still tweets up to 16 hours a day, sees his work as a “form of real-time storytelling…sorting itself out, 140 characters at a time.”
See how the process works in this selection of tweets, and read on for our interview with Carvin on social media in journalism:
How did you use this phone during the Arab Spring?
My job at NPR is to be a journalistic test pilot: I experiment with new ways of conducting journalism and figure out what works and what doesn’t. At the beginning of the Arab Spring, I had contacts in Tunisia and other parts of the region who were talking about protests through Twitter and other social media. Initially I was simply retweeting what they were saying, but as the revolutions expanded from one country to another, I ended up using Twitter to create an online community of volunteers who served as sources, translators and researchers for me. We would all engage with each other mostly through my mobile phone, trying to sort out what was true and what wasn’t.
From 2011 to 2012, I was on Twitter upwards of 18 hours a day, 7 days a week, much of the time on that phone, and rarely in the places where these revolutions were taking place. I don’t have a background as a combat reporter, so this was very much an experiment in collaborative, virtual reporting, in which ultimately my iPhone and Twitter served as the focal points.
I was mostly in the U.S. while this was going on, but I made trips to Egypt, Lebanon, Libya, Tunisia and a number of other countries in the region. I discovered very quickly that when I would be in a place like Tahrir Square in Egypt, I found it really hard to get a big picture of what was going on, simply because when you’re surrounded by tear gas and people throwing rocks, you have a fairly limited field of view. Once I could get away from that scene and get back online, over my phone, I’d immediately have contact with dozens of sources across the field of battle who could help paint this picture for me and give me the type of situational awareness that I actually didn’t have when I was there in person.
A lot of your social media work was fact-checking or fact verification. Did you then funnel those facts to NPR or other journalists?
It varied. I was regularly in contact with our reporters on the ground, so as I discovered things that seemed relevant to our reporting on air and online, it would get incorporated into that work. But much of the time, the goal was to do a long-term experiment in social media and mobile journalism in which I wasn’t working under the assumption that my tweets would ultimately develop into some type of news product, like a blog post or a radio piece. Instead, engaging with people through my phone on Twitter was the story itself. It was the experience of being part of this real-time rollercoaster, with me essentially as a broadcast host trying to explain to people what was going on, what’s true, what’s not—but doing it through Twitter and pulling in people who are on the ground, using these same mobile technologies to share their experiences in real time.
[Social media] worked in parallel to our other reporting methods. It certainly wasn’t a replacement for our foreign correspondents being on the ground in all these places. If anything, it complemented that kind of journalism.
But Twitter can also amplify rumors and spread false reports very quickly. How do you answer that criticism?
All we have to do is look at the last year or two to see a vast array of egregious errors that journalists have made on cable television and broadcast news and online news in general. Whether it’s the Boston bombing mistakes or some of the reporting during the shooting in Newtown, the rumors that spread those days didn’t begin on social media; they began with incorrect reporting on air and online. Now, people immediately began talking about them through social media, so word of this reporting spread just as fast as it would have spread if the reporting had been accurate.
The problem is that news organizations often don’t see this social media space as their concern, except for promoting their work. If they report something incorrectly on air, they’ll correct it when they can—but ultimately the people online are going to have to sort it out themselves. I personally think that’s a big mistake. If anything, I think news organizations should have journalists active in these communities so we can slow down the conversation, ironically, because you think of Twitter as speeding up the news cycle.
You can slow it down by telling people: “This is what we know and what we don’t know. We have not been able to confirm what this other network is reporting, and we don’t have the evidence to back that up.” The types of things that you sometimes say on air but don’t always spell out. The average news consumer doesn’t know the difference between when a news anchor says, “We have confirmed,” versus “We have received reports,” or “Our news outlet has learned.” These all have very distinct meanings in journalism, and we never explain to anyone what they mean.
If you’re part of a conversation with the public on Twitter, you can say to them, just because this network said they’ve received reports that something has happened, that doesn’t mean it’s anywhere near being confirmed. You can actually improve the media literacy of the public so they become more responsible and less apt to be part of that rumor cycle.
So generally speaking, yes, social media amplifies rumors. There’s absolutely no doubt about it. But I think we have to take a really hard look at ourselves in the media and ask, where are these rumors originating? And when they’re originating through our own reporting, what can we do to alleviate them online?
Twitter is also used by ordinary people, celebrities, comedians, etc. Do you see all those uses of Twitter as different silos, or are they all part of the same phenomenon?
They’re all part of the same ecosystem in the same way that life and culture overlap different ecosystems. If you think about what we do in our online worlds, we occasionally enjoy comedy, we talk to our friends about the crappy meal we had at a restaurant the night before or the bad customer service we got from some business. Other times we’ll talk about serious things, try to help friends online, maybe talk about the news. None of these are mutually exclusive. They’re all aspects of who we are and how we engage with our friends and family.
Twitter and social media in general just amplify those same concepts and put them in a space that makes it easier for people who would never normally meet to engage in conversations. So I’m perfectly proud to admit that I watch cat videos and read BuzzFeed and TMZ on a daily basis, while at the same time talking to sources in Syria and reading the latest essays coming out of Foreign Policy magazine. I don’t see that as contradictory because those are things that interest me offline as well.
I think a lot of the people who follow me for professional reasons follow me because I’m also a real human being on Twitter. I talk about my family, I talk about how things are going at work, the apple picking that I took my kids to a week ago or whatever. Social media gives you a chance to demonstrate to the world that you’re not just a talking head on a screen somewhere and that you actually are multidimensional. I think that adds to your authenticity in ways that make people more likely to trust you, to the point where they may want to share things with you as well. Being yourself on Twitter and social media is just a natural part of being a good citizen and cultivating sources online.
Is it possible to share too much information?
People overshare. There’s no doubt that happens. I’ve been guilty of doing it myself sometimes. But we’re all figuring this stuff out at the same time. There is really no precedent in history for this type of network that we’ve created. There’s an identity crisis when it comes to privacy right now, too. On the one hand we have a habit of oversharing, but on the other hand, people are very concerned about what the government is doing here or overseas. I don’t think anyone’s been able to sort this out yet. They know privacy when they see it, and they know oversharing when they see it. That’s just something that’s gonna have to sort itself out over time. I don’t think at the moment it’s necessarily going to stop those people who want to use social media in constructive ways from using them in constructive ways.
What phone do you have now?
I have an iPhone 5.
How do you feel about iOS 7?
I actually haven’t upgraded to it yet. It’s funny, I don’t consider myself a true early adopter of technologies in the sense that I don’t get new gadgets or tools in the first generation. I’d rather watch other people figure out whether they’re functional or not, and once they’re a bit more stable, then I like to tinker with them and figure out how they can be used in a broad sense.
I’d rather be on the cutting edge of figuring out what’s going on in the world than figuring out how to work my iPhone. I can always play catch-up on that as I need to.
October 29, 2013
“Times have changed,” reads a disclaimer at the Natural History Museum, “and so have the dates in many of our fossil displays.” This notice, accompanied by a revised geological timeline, is currently posted throughout the museum’s fossil halls. It’s a stopgap measure to update exhibitions that haven’t changed in 30 years—but it won’t be needed for much longer. The Natural History Museum is about to undergo a gut renovation that will not only update these exhibitions, but also transform their narrative of earth’s fossil record.
The “Deep Time” project is the largest and most complex renovation in the museum’s history. All of the current fossil exhibitions, including Life in the Ancient Seas, Dinosaurs and Ice Ages, will come down to make way for the Deep Time Hall, a thematic, rather than encyclopedic, timeline of life on Earth. This exhibition, slated to open in 2019, will illustrate the relevance of paleontology to modern life, portraying ancient plants and animals as interconnected parts of ecosystems and revealing a fossilized world just as complicated as ours.
“We study things like climate change and carbon dioxide in the past, extinction, things that are going on in the world today,” says Matt Carrano, lead curator of the Deep Time initiative. “It’s all of these big systems that work together…those are the systems that we are paying attention to in the present.”
The biggest change is chronological: the Deep Time story will run in reverse. Visitors entering the exhibition from the rotunda will start with the most recent past—the Ice Age, during which humans actually lived—and travel backward in time to the primordial Earth. In many museums, Carrano says, the prehistoric world feels like an “alien experience” and visitors “may as well be taking a spaceship to different planets.” Deep Time, on the other hand, will move from the familiar to the abstruse: “You have a house, you’ve taken it down and now you’re looking at the foundation—rather than you have a hole in the ground and you’re trying to tell people that there’ll be a house there later.”
The infrastructure of the gallery space will also receive its first makeover in more than a century. When the Natural History Museum first opened in 1910, the paleobiology wing consisted solely of the “Hall of Extinct Monsters,” little more than a trophy gallery for dinosaur fossils. Over the years, more and more exhibitions were tacked onto the space, resulting in the labyrinthine form of the fossil halls today. The renovation will remove the false walls subdividing the space and restore its original Beaux-Arts architecture. The new Deep Time Hall will be one cavernous, continuous gallery, with “display islands” that elaborate on specific themes.
Of course, no paleontology exhibit would be complete without a few dinosaurs, and the revamped space will display them to maximum effect. The fossil halls’ biggest draws, including the giant diplodocus on view and the Wankel T. rex on the way, will be placed in the center of the gallery so that visitors can see them all in one glance.
Other changes will be less noticeable, but more scientifically compelling. Carrano points to the current display of an allosaurus about to attack a stegosaurus: “What’s the point of showing that, besides the entertainment? We could talk about: What is it that predators do? What is it that herbivores do? Is that any different from today? Probably not. As dramatic as those animals are, they’re doing things that you can see happening out your window right now.” In the new exhibition, these creatures might represent predation or the relationship between species form and function. The work of the Deep Time team is as much about storytelling as it is about stage-setting for some of the Smithsonian’s best-loved fossils.
After the current fossil exhibitions go back into storage, a temporary gallery, focusing mainly on dinosaurs, will open on the second floor. Carrano puts it mildly: “We’re very conscious of the fact that you can’t just take the dinosaurs away for five years.”
October 25, 2013
I am an unapologetic fan of show biz glitz. When organizing an exhibition, my approach is to dip scholarship in dazzle: I firmly believe that injecting an exhibition with spectacle and showmanship fuels the path to understanding. The idea is to inspire visitors rather than to intimidate, baffle or bore them. I’ve always wanted to roll out the red carpet and this time I did.
In the current exhibition, “Dancing the Dream,” which recently opened at the National Portrait Gallery, the idea was to show how Broadway, Hollywood, modern, classical and contemporary dance have captured American culture in motion. In 1900, Loie Fuller unleashed her barefoot and uncorseted version of the “New Woman” on stages around the world; in the 1930s, Fred and Ginger danced an elegant escapism for Depression audiences; at the height of the Cold War, Rudolf Nureyev and Mikhail Baryshnikov sought asylum and sparked a mania for ballet in America; from the 1980s to today, MTV and YouTube have showcased such dancers as Michael Jackson and Beyoncé and created audiences that are both more diverse and more individualized than ever before.
The dance exhibition’s basic ingredients—strong images of iconic personalities—were already present, as the Gallery has an extraordinary collection of key dance figures—Isadora Duncan, Irene Castle, Josephine Baker, Busby Berkeley, Rita Moreno, Alvin Ailey, Shakira and Justin Timberlake, to name a few. The challenge for the museum’s design team was to create a lively showcase that conveyed dance’s dynamism. “I don’t like white walls,” I chirped. “Make it dazzle.”
And they did. One of the most exciting design elements is the red carpet that runs down the center hall connecting each of the six exhibition rooms. Yes, the National Portrait Gallery has a real red carpet. Designer Raymond Cunningham told me that he researched A-List red carpet events and discovered that the “red” used by the Golden Globes is a bluer red than the brighter hue used for the Academy Awards. The color used for “Dancing the Dream” is close to Oscar’s, but has been uniquely created for the Gallery.
Tibor Waldner, the museum’s chief of design, and his remarkable staff created a space that radiates with color—a drawing of Josephine Baker shimmies and shakes in a gallery with stunning teal walls; young ballet dancer Misty Copeland soars as a flaming Firebird in a gallery the color of her fires; Beyoncé hot-steps her “Single Ladies” number in a yellow-green gallery that I call “the riot of Spring.”
I was vastly intrigued by Raymond’s red carpet research, and have since discovered that the red carpet itself has an amazing history. The earliest reference to “walking a red carpet” is in Aeschylus’s Agamemnon in 458 B.C., when the title character is greeted by his vengeful wife Clytemnestra, who invites him to walk a “crimson path” to his house. In Georgetown, South Carolina, a ceremonial red carpet was purportedly rolled out for President James Monroe when he disembarked from a riverboat in 1821. Mainly, though, it seems the red carpet was a railroad phenomenon: in 1902, the New York Central used plush crimson carpets to direct people boarding the 20th Century Limited. It was this usage that seems to mark the origin of the phrase “red carpet treatment.”
Today, we think of red carpets as fashion and celebrity runways at major entertainment events. I asked Linda Mehr, director of the Academy of Motion Pictures’ Margaret Herrick Library, when the Academy began using a red carpet, and she told me that it wasn’t until 1961. Television broadcasts of the Oscars had begun in 1953, and by 1966, when the awards were first broadcast in color, the red carpet had become a major factor in the Oscar experience. Turner Classic Movies primetime host Robert Osborne has said that “for most of us, even a walk down the red carpet is just a dream.” It has also become the stage for one of the biggest fashion events of the year. At the 2013 Oscars, Jessica Chastain told a reporter that “as a little girl…I always dreamed about my Oscar dress. I love fashion that celebrates a woman’s body, and that maybe is a throwback to the glamour of Old Hollywood.” Amy Adams said of her Oscar de la Renta dress, “I’ve worn a lot of different dresses, but I’ve never worn a big ballgown, so I thought I wanna wear a dress you can’t wear anywhere but the Oscars.”
Many of the iconic figures in the dance exhibition have walked the red carpet: several have won Oscars—including Gene Kelly, James Cagney, Rita Moreno, and Liza Minnelli—and several have been awarded Grammys, including Lady Gaga, Justin Timberlake, and Beyoncé.
Installing the red carpet was the exclamation point that finished the exhibition’s high impact design. But once it was unrolled, there was yet another surprise: the carpet’s red reflected off the walls and ceiling in a way that suffused the entire corridor with an unexpected glow.
“Dancing the Dream” will be on view at the National Portrait Gallery through July 13, 2014.