October 11, 2013
For the privileged eaters of the Western world, so much of eating is done routinely: cereal for breakfast, a sandwich for lunch, probably a protein and vegetable for dinner. Sometimes, the act of eating is so second nature that the guidelines that dictate how and when we eat are invisible—guidelines such as eating a steak for dinner but not for breakfast, or eating lunch in the middle of the day. Eating wasn’t always dictated by these rules—so why is it now? That’s the question that food historian Abigail Carroll set out to answer in her new book, Three Squares: The Invention of the American Meal. Tracing the meal’s history from colonial America to the present day, Carroll explores why we eat cereal for breakfast, how dinner became American and how revisiting the history of our meals can have a positive impact on the future of eating. Carroll spoke with Smithsonian.com about the guidelines that control our dining.
How did the associations between certain meals and certain foods, like cereal for breakfast, form?
You start in the very early colonial era with one meal in the middle of the day—and it’s the hot meal of the day, dinner. Farmers and laborers ate earlier because they were up really early, and the elite were eating later in the day because they could sleep in. Breakfast and supper were kind of like glorified snacks, often leftovers or cornmeal mush, and there was not a lot of emphasis placed on these meals. Dinner, the main meal, at which people did tend to sit down together and eat, was really not the kind of social event that it has become. People did not emphasize manners, they did not emphasize conversation, and if conversation did take place it wasn’t very formal: it was really about eating and refueling. That’s the time when there are very blurry lines between what is and what isn’t a meal, and very blurry lines between what is breakfast, dinner and lunch.
Then, with the Industrial Revolution, everything changed, because people’s work schedules changed drastically. People were moving from the agrarian lifestyle to an urban, factory-driven lifestyle, and weren’t able to go home in the middle of the day. Instead, dinner shifted to the end of the day, when everyone could come home and eat together, so that meal becomes special. And that’s when manners become very important, and protocol and formality. It’s really around then that people start to associate specific foods with certain meals.
Then, with dinner shifting you have the vacuum in the middle of the day that lunch is invented to fill. People are bringing pie for lunch, they’re bringing biscuits, but the sandwich really lends itself to lunch well. So the popularity of the sandwich really does have something to do with the rise of lunch—and especially the rise of children’s lunch, because it’s not messy. You don’t need utensils, you don’t have to clean up—you can stick it in a lunch pail really easily.
Why is it acceptable to eat cereal and eggs and a waffle for breakfast, but not for lunch or dinner? How did breakfast go from being a necessity meal—fueled by leftovers—to a meal with clear guidelines for what is acceptable to eat?
There was a problem during the Industrial Revolution: people were still eating a farmer’s diet, but they were shifting to a more sedentary lifestyle, which caused indigestion. People who were interested in health started looking into that and started coming up with solutions. Sylvester Graham, the reformer who became a preacher of health ideology, advocated for vegetarian food and whole wheat as a kind of panacea for health problems, which becomes the answer to the question of breakfast. Then, people who ran sanitariums, including John Harvey Kellogg, in the late 1800s and early 1900s, really took that idea and flew with it and invented new ways to eat farinaceous [starchy] foods.
Entrepreneurs, some of whom worked in the sanitariums, really build on these ideas and commercialize them. C. W. Post, for instance, creates all sorts of crazy testimonials that serve as advertisements for Grape-Nuts, in which people’s lives are saved from chronic illness and they’re able to walk again.
Then, there’s also the history of orange juice and milk, with the discovery of vitamins in the 1910s. Milk came to be seen as a superfood, something that would keep you from getting deficiency diseases. It shows up at other meals too, but for much of the 20th century, breakfast isn’t complete unless you have milk.
Why is it that, in America, we have maintained the feeling that lunch needs to be a quick meal in the middle of the day?
We still are working a lot—we’re working more hours in the United States than any other industrialized nation. Lunch is the original quick meal; it accommodated changing work schedules.
And dinner has taken on the ideological weight of the meal. Dinner has been the time when we celebrate family, and when we concentrate on having a nice, hot meal, ideally. Because dinner fulfilled that need, there was less of a need for the other meals to. Lunch doesn’t have a lot of cultural work to do; it just has to get us by.
But, if you think about it, it’s not just lunch—it’s breakfast too. We can just pour milk over cereal, or pop some toast in the toaster and walk out the door without even needing a plate or utensils. Breakfast accommodates work. It’s not the meal that shapes work, it’s the work that shapes the meal.
Could you talk about how dinner became a particularly American institution?
Dinner was not initially a strong identifying factor, in terms of nationality, for colonists. At first, they were eating more or less peasant food, porridges brought from England that said more about class than nationality. Then, dinner shifts in the 1700s to become an identifying factor in terms of being English. They’re in this new world, seen as primitive, and so they feel that they have to compensate for that. They inherit the fashions that cross the ocean, like eating a roast with dinner.
In the 19th century, the emerging middle class identifies itself through French food and French ways of eating. Things that we take for granted now, like starting a meal with soup or having a salad, were really French concepts. Dessert was largely a French concept, and many of the desserts that we adopted in the 19th century were French desserts. For the Victorian middle class, eating in the French way was a way to imitate the elite.
With the decline of servants in the late 1800s, people just couldn’t keep that up. Then there are the [World] Wars and the Depression, and those require Americans to be frugal. But they don’t just require Americans to be frugal—they give Americans the opportunity to celebrate frugality as patriotic. To eat frugally, to have a Victory Garden and can your own foods is patriotic. The model for dinner is no longer the French multicourse formal meal, but Thanksgiving. Thanksgiving becomes the model for the everyday American dinner. Of course, you don’t eat a whole roast every night, but the idea is that you have “a chicken in every pot,” which was Herbert Hoover’s 1928 campaign slogan. You would have some kind of meat on the table.
Are there any dishes or foods that you would classify as typically, or even exclusively, “American?”
A number of iconic foods—hot dogs and hamburgers, snack food—are hand-held. They’re novelties associated with entertainment. These are the kinds of food you eat at the ballpark, buy at a fair and eventually eat in your home. I think that there is a pattern there of iconic foods being quick and hand-held that speaks to the pace of American life, and also speaks to freedom. You’re free from the injunctions of Victorian manners and having to eat with a fork and knife and hold them properly, sit at the table and sit up straight and have your napkin properly placed. These foods shirk all that. There’s a sense of independence and a celebration of childhood in some of those foods, and we value that informality, the freedom and the fun that is associated with them.
Along those lines, there’s a lot of pushback against those processed foods today, with people wanting to recall old ways of eating, with eating local and fresh. But, how do you think that knowing the kinds of food that we used to eat and the ways that we used to eat, and think about eating, influences the future of American food?
History can play a really central role in thinking about the way that we want to eat in the future. The evolution of the meal is a process, and it continues.
With all of the talk of food and health, I think a really good question to ask is “Can we actually be healthy without eating meals?” And without even, perhaps, eating a family dinner? Studies show that when we eat together, we eat better, always.
The family meal is the opportunity to put to work what we’re talking about. If we’re learning about fresh foods and ingredients, the family meal has potential to be another way of instructing our children and ourselves. There’s an interest in renewing the family meal, even reinventing it. We’re not going to be able to revive a Victorian notion of dining; I don’t think we’re interested in it. If we want to spend time together, if we want to invest in our children, if we want to be healthy, the family meal can be a vehicle for that.
November 30, 2012
It’s not peanut butter jelly time. In fact, put down the peanut butter and walk away slowly. If the spread you are putting on your morning toast is from a jar of Trader Joe’s organic Creamy Salted Valencia peanut butter, you may just want to stick with jelly. The reason? The Food and Drug Administration moved to shut down the country’s largest organic peanut butter processor earlier this week, per the Associated Press.
Salmonella in peanut butter is no new discovery—in 2007, contaminated Peter Pan products resulted in 329 reported cases in 41 states—and this past September, Trader Joe’s voluntarily recalled its Creamy Salted Valencia Peanut Butter due to contamination with salmonella thought to be from Sunland, Inc., located in Portales, New Mexico. The outbreak of salmonella poisoning—41 people infected in 20 states—has since been traced to the New Mexico plant, which distributes to major food retailers including Trader Joe’s, Whole Foods and Target. FDA inspections found salmonella in 28 locations in the plant, along with unclean equipment and uncovered trailers of peanuts sitting outside the factory. Not to worry, though: Sunland hasn’t manufactured peanut butter since the initial voluntary recall in September.
But how does salmonella get into peanut butter in the first place? Dr. Mike Doyle, who has helped Sunland get its plant back up and running and serves as director of the Center for Food Safety at the University of Georgia, explains that peanuts grow in the ground and can be contaminated from a variety of sources: manure, water, wild animals—even the soil. Studies have shown that once present, salmonella can survive for many months—even years—in peanut butter, according to Scientific American. Before treatment, in fact, about two percent of all peanuts are contaminated with salmonella.
“When harvested, we assume there can be some salmonella present and we have to use a treatment to kill it,” Doyle says. A roaster with air temperatures set to about 300 degrees Fahrenheit destroys salmonella in peanuts. For this reason, this moment in the process is often referred to as the “kill step” by manufacturers. The biggest challenge, then, is to prevent contamination in the processing plant after roasting.
“Water is one of the biggest problems in dry food processing for salmonella proliferation,” Doyle says. “If water is available to salmonella, it will grow.”
Dry food manufacturers, such as peanut butter plants or breakfast cereal producers, must minimize the use of water in the plant. Everything from leaks in the roof to the water used to clean up a mess needs to be controlled.
So what can be done to prevent future contamination? There are a variety of things that can be done to upgrade systems and facilities, Doyle says. But all food processors are different in how they control harmful microbes in their plants. As for the Sunland plant, Doyle says they’ve traced the root cause of the contamination to the roaster room.
“The company is in the process of making changes to prevent future contamination,” he says. “They’re gutting the room—new walls, new floors—and fixing other things that need to be addressed.”
August 6, 2012
In 1992, while working on assignment in Italy for several magazines, author Bob Spitz got an unusual call from the Italian Trade Commission.
“Would you like to be an escort for an older woman?”
Spitz was quick to answer, “Lady, I don’t do that kind of work.”
“It’s for Julia Child,” the woman on the phone informed him. Even quicker to answer this time, Spitz said, “I’ll be right over.”
And thus began his month-long tour with one of the greatest culinary figures in American history.
Julia Child would have been 100 years old this August 15. Known for her distinct vibrato voice, her height and her role in bringing French food across the Atlantic in the 1960s, Child stood an impressive 6-foot-2 and couldn’t help but be noticed.
The first time Spitz met her, all he could hear was a chorus of lunching Americans chirping, “It’s Julia. It’s Julia.” Seated at a hotel in Taormina, he watched her walk across the piazza. “Every head in the place turned,” he says, everyone referring to her simply as Julia, not Julia Child.
Together the pair ate their way across Sicily, talking about food and reexamining her life. Child had just watched her husband and business partner Paul enter a medical facility as his mental faculties began to fade, and she was in a contemplative mood, says Spitz.
Of course, that didn’t diminish her spirit, which Spitz describes as “relentless.” Even though she didn’t particularly care for Italian food (“The sauces were too boring for her”), Child took her tour seriously.
“We went into the restaurants, but then she would head into the kitchen,” often without invitation, says Spitz. “She talked to the chef, she’d shake everybody’s hand in the kitchen, even the busboys and the dishwashers,” Spitz remembers, “And always made sure to count how many women were working in the kitchen.”
If Child received warm receptions from vacationing Americans, the Italian chefs were less than star-struck. Many, says Spitz, didn’t even know who she was. “The Italian chefs, most of them men where we went, were not very happy to see a 6-foot-2 woman come into their kitchen and, without asking them, dip her big paw into the stock pot and taste the sauce with her fingers.” Her brash behavior often brought reproachful, murderous stares, says Spitz. Not easily daunted, she found it amusing. “She would say to me, ‘Oh, they don’t speak English. Look at them! They don’t know what I’m made of. They don’t know what to do with me.’ It was great,” Spitz says.
Few people in Child’s life seemed to know what to do with her. She grew up in a conservative family in Pasadena, Calif., playing tennis and basketball. After college and a brief copywriting career in New York, she headed back home and volunteered with the Junior League. Craving adventure, she tried to enlist in the Women’s Army Corps but was too tall. Instead, she wound up in the Office of Strategic Services, beginning her career in Sri Lanka in 1944 before heading to China and eventually France after Paul was assigned there.
The rest is a familiar history. She developed a deep passion for French food and technique, and trained and worked tirelessly to record her findings. The first volume of her Mastering the Art of French Cooking was published in 1961, with a second volume to come in 1970. In between, she began her TV career hosting “The French Chef.”
“She never tried to work on a personality,” Spitz says of the show’s success. “The day she first walked on TV, it was all there–the whole Julia Child persona was intact.”
Her dedication to getting real French food into American homes that were used to TV dinners and Jell-O desserts energized every episode. But, Spitz insists, she didn’t just change the way Americans ate; she changed the way they lived.
Given the opportunity to clear one thing up, Spitz has one misconception on his mind: “Julia never dropped anything. People swear she dropped chickens, roasts–never happened.” Likewise, the mythology around her drinking on the show, which was limited to the close of each episode when she sat down to enjoy her meal, developed a life of its own. “Julia was by no means a lush,” says Spitz. “Although,” he adds, “when we were in Sicily, she consumed alcohol in quantities that made my eyes bug out.”
“She was a woman who liked adventure,” Spitz says. The pair would sometimes tour the Italian countryside by motorcycle. “Just knowing that this 80-year-old, 6-foot-2 woman, no less Julia Child was on the back of a motorcycle, riding with me–it told me everything I needed to know about her.”
Spitz will read from and discuss his new biography, Dearie: The Remarkable Life of Julia Child, Wednesday, August 8, at 7 p.m. at the Natural History Museum. He will also attend the 100th anniversary celebration August 15.
July 19, 2012
Beating the lazy, mid-afternoon summer heat with a cold energy drink?
Energy drinks are a staple among active Americans, who substitute the canned, sugary beverages for coffee or tea and have launched brands like Red Bull, Monster and Rockstar to the top of a $7.7 billion industry. Not only do energy drinks pack a caffeine punch, they are also filled with energy-boosting supplements.
It’s a tough call whether the benefits associated with supplemental boosters outweigh all the unhealthy sugars that give energy drinks their sweet flavor. Red Bull contains 3.19 grams of sugar per fluid ounce, Monster contains 3.38 g/oz. and Rockstar has 3.75 g/oz. Marketed as health drinks, energy drinks are as high in sugar as classic Coca-Cola, which contains 3.25 g/oz. of sugar.
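For a rough sense of what those per-ounce figures mean per serving, here is a minimal Python sketch that multiplies the article’s grams-per-ounce numbers by can sizes; the can sizes are assumed typical retail sizes for illustration, not figures from the article.

```python
# Scale the article's grams-of-sugar-per-fluid-ounce figures to a single can.
# The per-ounce values come from the article; the can sizes are assumptions.

SUGAR_G_PER_FL_OZ = {
    "Red Bull": 3.19,
    "Monster": 3.38,
    "Rockstar": 3.75,
    "Coca-Cola": 3.25,
}

ASSUMED_CAN_FL_OZ = {  # assumption: common retail can sizes
    "Red Bull": 8.4,
    "Monster": 16.0,
    "Rockstar": 16.0,
    "Coca-Cola": 12.0,
}

for drink, g_per_oz in SUGAR_G_PER_FL_OZ.items():
    can_oz = ASSUMED_CAN_FL_OZ[drink]
    total_g = g_per_oz * can_oz
    print(f"{drink}: about {total_g:.0f} g of sugar in a {can_oz:g} oz can")
```

By that back-of-the-envelope math, and under those assumed can sizes, a 16-ounce can of Monster or Rockstar would carry roughly 54 to 60 grams of sugar, compared with about 39 grams in a 12-ounce can of Coke.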
So what exactly are those “energy-boosting natural supplements” that supposedly set energy drinks apart from other sugary beverages — and how do they affect the bodies of those who consume energy drinks?
Taurine: Although it sounds as though it was dreamed up in a test lab, taurine isn’t foreign to the human body. Its name stems from the fact that it was first discovered and isolated from ox bile, but the naturally occurring supplement is the second-most abundant amino acid in our brain tissue, and is also found in our bloodstream and nervous system.
The taurine used in energy drinks is produced synthetically in commercial laboratories. Since excess taurine is excreted by the kidneys, it’s improbable that someone could overdose on the supplemental form. To be on the safe side, one expert recommends staying under 3,000 mg per day. Animal experiments have shown that taurine acts as an antioxidant and may have anti-anxiety and anti-epileptic properties. Some studies have even suggested that dosages of the amino acid may help to stave off age-related bodily degeneration.
And taurine’s anti-anxiety effects might be useful when consumed as part of an energy drink; the amount of accompanying stimulant found in popular beverages is capable of causing some seriously anxious jitters.
Guarana: The caffeine component of many energy drinks is guarana, which comes from a flowering plant native to the Amazon rainforest. In fact, most people in South America get their caffeine intake from the guarana plant rather than coffee beans. Guarana seeds are about the same size as a coffee bean, but their caffeine potency can be up to three times as strong.
Both coffee and guarana have weight-loss inducing effects through the suppression of appetite, a common side-effect of caffeine. Although caffeine can improve mental alertness, it can also cause dizziness, nervousness, insomnia, increased heart rate and stomach irritation.
Ginseng: Some of the most interesting, if debatable, effects come from supplemental Panax ginseng, which is included in 200 mg doses in several energy drink brands. As a traditional herbal treatment associated with East Asian medicine, ginseng has many folkloric uses — although many of those uses are not proven scientifically. Rumored uses for ginseng have included improved psychological functioning, boosted immune defenses and increased sexual performance and desire.
Myths aside, ginseng does offer some attractive benefits. Studies have indicated a positive correlation between daily ginseng intake and improved immune system responses, suggesting ginseng has anti-bacterial qualities in addition to boosting the body’s “good” cells.
Ginseng has also been shown in animal and clinical studies to have anticancer properties, due to the presence of ginsenosides within the extract of the plant. Ginsenosides are a type of saponin, compounds that protect the plant from microbes and fungi and have been described as “tumor killers.” Scientists are still working to understand the effects of ginseng supplements for use in preventative and post-diagnosis cancer treatment.
Energy drinks may be overhyped as a source of supplemental substances. All of the supplements found in energy drinks can be bought individually as dietary supplements, which allows consumers to ingest the substances without the complementary sugar load found in energy drinks.
Please, though, if you’ve ever sprouted wings after chugging back an energy drink, we’d like to be the first to know.
January 20, 2012
The Perennial Plate is an online documentary series by Daniel Klein about food and communities. Season 1 had a Minnesota and Midwest focus. Season 2, which is still being rolled out, covers the continental United States.
Gilt Taste’s stories section, which started up last spring, is another must-read. While it can get a little recipe-heavy during the holiday season, it features stories about food and culture from a wide variety of writers.
McSweeney’s, the book publisher, is putting out David Chang’s dude-centric Lucky Peach and also, get this, a cookbook written by Eat Pray Love‘s Liz Gilbert’s grandma.
Nicola Twilley of Foodprint/Edible Geography writes about “smellscapes,” the odors that define certain places; wacky food-based artists; and edible insects; she also runs a lot of Q&As with interesting characters.