October 30, 2013
In 1971, Walt Disney World had just opened in Orlando, Florida. Led Zeppelin was about to blow our minds, a prison riot had been shut down at Attica, and all across America, kids were pooping pink. Hundreds of mothers hospitalized their children for fecal testing out of fear of internal bleeding. Within that same year, not-so-coincidentally, General Mills released their classic monster cereals Count Chocula and Franken Berry. The latter was colored red using Food, Drug, and Cosmetic (FD&C) Red No. 2 and No. 3; Red No. 2, chemically known as amaranth, is a synthetic color named after the natural flower. The synthetic dye can’t be broken down or absorbed by the body.
A 1972 case study, “Benign Red Pigmentation of Stool Resulting from Food Coloring in a New Breakfast Cereal (The Franken Berry Stool),” published in Pediatrics explains the phenomenon later known as “Franken Berry Stool.” A 12-year-old boy was hospitalized for four days after being admitted for possible rectal bleeding. “The stool had no abnormal odor but looked like strawberry ice cream,” reports the study’s author, Dr. Payne. Further questioning of the mother revealed that the child had enjoyed a bowl of Franken Berry cereal two days and one day prior to his hospitalization. By the fourth day, his doctors did a little experiment: they fed the boy four bowls of Franken Berry cereal, and for the next two days, he passed bright pink stools. But aside from the pink poop, there were no other symptoms. Payne reports, “Physical examination upon admission revealed [a boy] in no acute distress and with normal vital signs…Physical examination was otherwise unremarkable.”
At the time of the study, the product had only been on the market for a few weeks. The author warns that “physicians should be aware of its potential for producing reddish stools.” Other monster cereals at the time also used dyes that changed the color of stool. Boo Berry, which debuted in December of 1972, for example, uses Blue No. 1 (a dye currently banned in Norway, Finland and France) and turns stool green. Apparently, green stool seems less life-threatening than the reddish hue caused by Franken Berry.
But pink poop is hardly the worst side effect colored confections have produced. Ruth Winter’s A Consumer’s Dictionary of Cosmetic Ingredients details the history of commercial food dyes, including those later used in Franken Berry. At the turn of the 20th century, with virtually no regulation of the more than 80 dyes then used to color food, the same dyes used for clothes could also be used to color confections and other edibles.
In 1906, Congress passed the first legislation for food colors, the Pure Food and Drug Act, deeming seven colors suitable for use in food: orange, erythrosine, ponceau 3R, amaranth (the color later used in Franken Berry cereal), indigotin, naphthol yellow, and light green. Upon further study, several of these colors have since been delisted.
More than 30 years later, in 1938, Congress passed the Federal Food, Drug, and Cosmetic Act, which gave these colors numbers instead of chemical names and required every batch to be certified by the Food and Drug Administration. Problems still arose, though: in the fall of 1950, for example, many children became ill from eating an orange Halloween candy containing one to two percent FD&C Orange No. 1.
Red Dye No. 2, the one used in the original Franken Berry cereal, was one of the most widely used color additives at the time, until a 1971 Russian study reported that the dye caused tumors in female rats. Years of research showed the Russian study to be extremely flawed (the FDA couldn’t even prove that amaranth was one of the dyes used), but the agency removed the dye from its Generally Recognized As Safe (GRAS) list in 1976 anyway. Between public outcry against the dye and the chance that trace contaminants could be carcinogenic, the FDA banned a number of other dyes as well. According to the FDA, 47 other countries, including Canada and the United Kingdom, still allow the use of Red Dye No. 2.
That same year, Mars removed its red M&M’s from the candy-color spectrum for nearly a decade, even though the company didn’t use Red No. 2; the removal of the red candies was a response to the scare, livescience.com reports:
The red food coloring in question was not actually used in M&M’s chocolate candies, according to mms.com. “However, to avoid consumer confusion, the red candies were pulled from the color mix.”
General Mills did not respond to inquiries about when Franken Berry’s ingredients switched to less poop-worrying dyes. These days, the only red colors accepted by the FDA are Red No. 40, which appears in all five of the General Mills monster cereals, and Red No. 3, typically used in candied fruits.
The symptoms of “Franken Berry Stool” were pretty benign compared to other, more notable confectionery mishaps in history. The poisoning of more than 200 people in Bradford, England, in 1858 comes to mind: the sweets were accidentally made with arsenic. Let’s be thankful there’s a bit more regulation of food dyes these days.
Another stool scare in cereal history: Smurfberry Crunch Cereal, released in 1982 by Post Foods, turned the poop of those who ate it blue—the ultimate Smurfs experience. Post then changed the formula and re-released the cereal in 1987 as Magic Berries Cereal.
Looking for a sugar high now? You’re safe. When you open your celebratory box of Franken Berry or any of the other monster cereals this Halloween [for the first time, all five monsters are available in stores, following the well-received re-release of Frute Brute (1975-1984) and Fruity Yummy Mummy (1987-1992)], expect a sugar high without the pink poop aftermath. We tasted all five of the cereals, and Count Chocula is the best by a long shot.
The best part is when the chocolate “sweeties,” as the marshmallows were called in the original commercials in 1971, are all gone: the plain milk becomes chocolate milk. Let’s be real, what child—or “adult”—prefers regular milk to chocolate? I haven’t met this kind of person.
October 11, 2013
For the privileged eaters of the Western world, so much of eating is done routinely: cereal for breakfast, a sandwich for lunch, probably a protein and vegetable for dinner. Sometimes, the act of eating is so second nature that the guidelines that dictate how and when we eat are invisible—guidelines such as eating a steak for dinner but not for breakfast, or eating lunch in the middle of the day. Eating wasn’t always dictated by these rules—so why is it now? That’s the question that food historian Abigail Carroll set out to answer in her new book, Three Squares: The Invention of the American Meal. Tracing the meal’s history from colonial America to the present day, Carroll explores why we eat cereal for breakfast, how dinner became American and how revisiting the history of our meals can have a positive impact on the future of eating. Carroll spoke with Smithsonian.com about the guidelines that control our dining.
How did the associations between certain meals and certain foods, like cereal for breakfast, form?
You start in the very early colonial era with one meal in the middle of the day—and it’s the hot meal of the day, dinner. Farmers and laborers ate earlier because they were up really early, and the elite were eating later in the day because they could sleep in. Breakfast and supper were kind of like glorified snacks, often leftovers or cornmeal mush, and there was not a lot of emphasis placed on these meals. Dinner, the main meal, at which people did tend to sit down together and eat, was really not the kind of social event that it has become. People did not emphasize manners, they did not emphasize conversation, and if conversation did take place it wasn’t very formal: it was really about eating and refueling. That’s the time when there are very blurry lines between what is and what isn’t a meal, and very blurry lines between what is breakfast, dinner and lunch.
Then, with the Industrial Revolution, everything changed, because people’s work schedules changed drastically. People were moving from the agrarian lifestyle to an urban, factory-driven lifestyle, and weren’t able to go home in the middle of the day. Instead, the evening was when they could all come home and have dinner together, so that meal becomes special. And that’s when manners become very important, and protocol and formality. It’s really around then that people start to associate specific foods with certain meals.
Then, with dinner shifting, you have a vacuum in the middle of the day that lunch is invented to fill. People are bringing pie for lunch, they’re bringing biscuits, but the sandwich really lends itself to lunch well. So the popularity of the sandwich really does have something to do with the rise of lunch—and especially the rise of children’s lunch, because it’s not messy. You don’t need utensils, you don’t have to clean up—you can stick it in a lunch pail really easily.
Why is it acceptable to eat cereal and eggs and a waffle for breakfast, but not for lunch or dinner? How did breakfast go from being a necessity meal—fueled by leftovers—to a meal with clear guidelines for what is acceptable to eat?
There was a problem during the Industrial Revolution: people were still eating a farmer’s diet, but they were shifting to a more sedentary lifestyle, which caused indigestion. People who were interested in health started looking into that and started coming up with solutions. Sylvester Graham, the reformer who became a preacher of health ideology, advocated for vegetarian food, and whole wheat as kind of a panacea for health problems, which becomes the answer to the question of breakfast. Then, people who ran sanitariums, including John Harvey Kellogg, in the late 1800s and early 1900s, really took that idea and flew with it and invented new ways to eat farinaceous [starchy] foods.
Entrepreneurs—some of whom worked in the sanitariums, like C.W. Post—really build on these ideas and make them a healthy requirement. Post creates all sorts of crazy testimonials that serve as advertisements for Grape-Nuts, where people’s lives are saved from chronic illness and they’re able to walk again.
Then, there’s also the history of orange juice and milk, with the discovery of vitamins in the 1910s. Milk came to be seen as a super food, and something that would keep you from getting deficiency diseases. It shows up at other meals too, but for much of the 20th century, it’s not a complete meal unless you have milk.
Why is it that, in America, we have maintained the feeling that lunch needs to be a quick meal in the middle of the day?
We still are working a lot—we’re working more hours in the United States than in any other industrialized nation. Lunch is the original quick meal; it accommodated changing work schedules.
And dinner has taken on the ideological weight of the meal. Dinner has been the time when we celebrate family, and when we concentrate on having a nice, hot meal, ideally. Because dinner fulfilled that need, there was less of a need for the other meals to. Lunch doesn’t have a lot of cultural work to do; it just has to get us by.
But, if you think about it, it’s not just lunch—it’s breakfast too. We can just pour milk over cereal, or pop some toast in the toaster and walk out the door without even needing a plate or utensils. Breakfast accommodates work. It’s not the meal that shapes work, it’s the work that shapes the meal.
Could you talk about how dinner became a particularly American institution?
Dinner was not initially a strong identifying factor, in terms of nationality, for colonists. At first, they were eating more or less peasant food, porridges brought from England that said more about class than nationality. Then, dinner shifts in the 1700s to become an identifying factor in terms of being English. They’re in this new world, seen as primitive, and so they feel that they have to compensate for that. They inherit the fashions that cross the ocean, like eating a roast with dinner.
In the nineteenth century, the emerging middle class identifies itself through French food and French ways of eating. Things that we take for granted now, like starting a meal with soup or having a salad, were really French concepts. Dessert was largely a French concept, and many of the desserts that we adopted in the 19th century were French desserts. For the Victorian middle class, eating in the French way was a way to imitate the elite.
With the decline of servants in the late 1800s, people just couldn’t keep that up. Then there are the [World] Wars and the Depression, and those require Americans to be frugal. But they don’t just require Americans to be frugal—they give Americans the opportunity to celebrate frugality as patriotic. To eat frugally, to have a Victory Garden and can your own foods is patriotic. The model for dinner is no longer the French multicourse formal meal, but Thanksgiving. Thanksgiving becomes the model for the everyday American dinner. Of course, you don’t eat a whole roast every night, but the idea is that you have “a chicken in every pot,” which was Herbert Hoover’s 1928 campaign slogan. You would have some kind of meat on the table.
Are there any dishes or foods that you would classify as typically, or even exclusively, “American?”
A number of iconic foods—hot dogs and hamburgers, snack food—are hand-held. They’re novelties associated with entertainment. These are the kinds of food you eat at the ballpark, buy at a fair and eventually eat in your home. I think that there is a pattern there of iconic foods being quick and hand-held that speaks to the pace of American life, and also speaks to freedom. You’re free from the injunctions of Victorian manners and having to eat with a fork and knife and hold them properly, sit at the table and sit up straight and have your napkin properly placed. These foods shirk all that. There’s a sense of independence and a celebration of childhood in some of those foods, and we value that informality, the freedom and the fun that is associated with them.
Along those lines, there’s a lot of pushback against processed foods today, with people wanting to recall old ways of eating by eating local and fresh. How do you think that knowing the kinds of food we used to eat, and the ways we used to eat and think about eating, influences the future of American food?
History can play a really central role in thinking about the way that we want to eat in the future. The evolution of the meal is a process, and it continues.
With all of the talk of food and health, I think a really good question to ask is “Can we actually be healthy without eating meals?” And without even, perhaps, eating a family dinner? Studies show that when we eat together, we always eat better.
The family meal is the opportunity to put to work what we’re talking about. If we’re learning about fresh foods and ingredients, the family meal has potential to be another way of instructing our children and ourselves. There’s an interest in renewing the family meal, even reinventing it. We’re not going to be able to revive a Victorian notion of dining; I don’t think we’re interested in it. If we want to spend time together, if we want to invest in our children, if we want to be healthy, the family meal can be a vehicle for that.
March 14, 2012
Henry D. Perky is best remembered as the inventor of Shredded Wheat, one of the first ready-to-eat cereals and a food that’s changed the way Americans think about breakfast. Perky was a devout vegetarian who believed that good health came from simple, wholesome foods. His whole-wheat biscuits were not intended exclusively as a breakfast cereal—the biscuits were a health food that could be paired with mushrooms, or even sardines. Despite claims that the Shredded Wheat Biscuit was “the Wonder of the Age,” a cure-all for societal and personal woes, the little edible brown pillows did not immediately take off.
In order to get grocery stores to stock Shredded Wheat, Perky began publishing booklets—millions of booklets. And by emphasizing the link between health food and industrial efficiency, he accomplished something else: Perky published the earliest images of American ships in the Spanish-American War—in a cookbook.
His 1898 book, The Vital Question and Our Navy, featured recipes for shredded wheat along with an addendum about the U.S. Naval exercises in the Philippines and Cuba. The photos “have nothing to do with the rest of the book,” Andrew F. Smith, a culinary historian and author of Eating History, said at the recent Cookbook Conference. “As far as I know, they’re the first pictures that appear of these battle cruisers and destroyers that are public.” To think, health foods and war once went hand in hand.
November 8, 2011
Fast-food aficionados are all abuzz over the McRib, the sandwich with a sizable cult following enjoying a return engagement at McDonald’s locations through November 14. Seriously, how many foodstuffs do you know of that have their own locator map so that die-hard fans can get their fix? The pork patty itself is something of a technological marvel, with emulsified bits of pork meat molded into the shape of ribs.
The more I pondered the McRib, the more it seemed like a descendant of scrapple. For those not in the know, this traditional breakfast food combines grain with the scraps and trimmings of meat, including organ meat, left over from butchering a hog. The mixture is boiled and allowed to set before being molded into a loaf, sliced up and finally pan-fried until golden brown. Like the McRib, scrapple is a distinctively American pork product and remains a regional favorite.
The dish has its roots in the black blood puddings found in Dutch and German cuisine. Immigrants brought the dish, also known as pawnhoss, to the New World in the 17th century, where it became most closely associated with the Pennsylvania Dutch communities. In this country, blood was omitted from the meat mix and European grains were replaced with American ones, such as buckwheat and cornmeal. Seasonings can vary depending on locality, with Philadelphia scrapple going heavy on the sage, while more Germanic versions favor marjoram and coriander. The dish was a commonsense means of extending leftover meat and avoiding waste, making as much use of an animal as possible. While the dish is pragmatic, the flip side is that organ meats can be very high in fat and cholesterol, so regularly incorporating scrapple into your diet might not be the best idea. Nevertheless, it remains popular and has spawned local celebrations, such as Philadelphia’s Scrapplefest and Bridgeville, Delaware’s Apple-Scrapple Festival, which sports events like a scrapple shot-put contest. (And Xbox users out there might also recall the scrapple commercial that was worked into the game Whacked!, with a line of dancing pigs being sent down a conveyor belt before being sloshed into tin cans. And I have to admit, the jingle is pretty catchy.)
My first encounter with scrapple was at the L&S Diner in Harrisonburg, Virginia, courtesy of an uncle who treated me to breakfast and didn’t explain what it was I was eating until after my plate was cleared. I took pause, but didn’t dwell on the matter too long because, frankly, the nondescript brown slice of pork-flavored something-or-other tasted great—though it’s difficult for anything that’s fried to be rendered unpalatable. When Snowpocalypse hit the D.C. area last year, this meatloaf of the morning was my comfort food of choice to get me through being stuck indoors for a few days. Former Food and Think blogger Amanda Bensen, on the other hand, seems to have had an unpleasant introduction to the dish, so much so that she turned vegetarian. Though based on her description of being served pork mush, I’m not sure that it was properly prepared. But, as with any regional cuisine, there are dozens of variations on the dish. Do you enjoy scrapple? If so, tell us in the comments section how you like it served.
September 23, 2011
I think I’ve known one person in my entire life who actually drinks straight-up buttermilk as a beverage. Something about a sour-tasting dairy drink is low on appeal for most Americans. (However, it should be noted that other nationalities have similar cultured dairy beverages that are very popular.) But, oh, the things it can do in tandem with other ingredients.
Today’s buttermilk is really fermented milk, different from the butter-churning byproduct of olden days. Because it contains high amounts of lactic acid, buttermilk is excellent at helping baked goods rise and at tenderizing meat, not to mention adding tangy flavor to other recipes. The problem is that it always seems to be sold in a larger quantity than any one recipe calls for. And, although it has a fairly long shelf life, it’s always a challenge to find enough uses for the remainder before it goes to waste. Here are a few ideas to help make full use of your next quart.
1. Marinate meats. According to Fine Cooking magazine, buttermilk and yogurt are the only marinades that truly work to tenderize meat. Vinegar-based marinades are too acidic and could actually make meat tougher, while for some reason—possibly the calcium—the only slightly acidic buttermilk seems to stimulate the breakdown of proteins. However it works, it’s especially good with chicken, whether grilled (as in this simple marinade from Cheeky Kitchen) or fried (like this double-dipped version from Epicurious).
2. Add low-fat creaminess. Low-fat buttermilk is creamier and more flavorful than regular low-fat milk, so it’s perfect for mashed potatoes (this herbed recipe from Dash and Bella also contains butter, but it sure sounds good); creamy soups, like a buttermilk summer squash soup from 101 Cookbooks; or sauces, like Jean-Georges Vongerichten’s fish poached in buttermilk, from the New York Times.
3. Cook up breakfast. Some of the best morning foods are even better with buttermilk. It makes for fluffy pancakes, crispy outside/soft inside waffles (so says Smitten Kitchen), and rich scones (these lemon-blueberry buttermilk scones from Sing For Your Supper sound delicious).
4. Bake some bread. Buttermilk’s slight acidity helps activate baking soda and make bread rise. It’s the traditional liquid used in Irish soda bread. Oatmeal buttermilk bread gets high marks from Clockwork Lemon. And chances are good Grandma’s delicious, flaky biscuits were made with buttermilk. Sweet breads also get low-fat moistness from buttermilk, as in this banana-blueberry buttermilk bread from Eating Well magazine.
5. Save room for dessert. The same moistness also does wonders for cake, whether Bon Appétit magazine’s blackberry buttermilk cake or what the Pioneer Woman calls the best chocolate sheet cake. Ever. And don’t forget the Southern specialty, sweet, custardy buttermilk pie; Homesick Texan shares her Grandma Blanche’s recipe, which you just know has to be good.