December 6, 2013
Next time you are at an all-you-can-eat buffet, imagine the food displays without any covering: there are flies in the coleslaw, and the man in front of you leans over the spread, breathing heavily. His nose scrunches up as though he might sneeze at any moment. You cringe, but it’s too late. Mashed potatoes are off the menu tonight.
Johnny Garneau is the reason people like this man can’t sneeze on your food today.
On March 10, 1959, the restaurateur and inventor filed his patent for the “Food Service Table” later known as the “sneeze guard,” meant to protect food on display from bacteria and other germs that may be spread by sneezing. These days, it’s required by law that retail, self-service food bars have one—nary a salad bar shall be left uncovered.
At the time of his invention, he owned and ran a chain of American Style Smorgasbord restaurants in Ohio and Pennsylvania—a set-price, all-you-can-eat buffet model based on the traditional Swedish “smorgasbord,” a celebratory, buffet-style meal with a laid-out table of food. The first example of a smorgasbord in America appeared at the 1939 New York World’s Fair. Garneau’s “American Style Smorgasbord” restaurant was one of the first of many self-service restaurants that would pop up in the United States in the ’50s.
“Being the germaphobe that he was, he couldn’t stand people going down the Smorgasbords smelling things and having their noses too close to the food,” says Barbara Kelley, one of Garneau’s five children. “He said to his engineers, ‘We have to devise something—I don’t want these people sneezing on the food.’”
When the patent was granted (for a term of 14 years), Garneau installed sneeze guards in each of his restaurants. His daughter Barbara was born the year her father filed for the patent and remembers growing up in the spotless kitchens and dining rooms of her father’s businesses.
“He had that typical entrepreneur mind—he was always thinking of the next great idea,” Kelley says. “These common things we use every day—somebody, somewhere had an idea, and they had the guts to take it to fruition. My dad was one of them. There wasn’t one thing he thought that he couldn’t make or do.”
At 15, Garneau got a taste for the restaurant business as a “soda jerk,” and began forming dreams of his first restaurant, “The Beanery,” which opened in 1949. The six-stool, 20-foot-by-15-foot diner served American classics like the hot dog, with curb service. By 1952, he had opened his first American Style Smorgasbord restaurant.
When the smorgasbord style became less trendy, he turned each of his restaurants into steakhouses called the Golden Spike, the first of which opened in 1954. The railroad theme (there was a toy train set up at the bar that delivered your drink) came from Garneau’s interest in Promontory Summit in Utah, where the first transcontinental railroad was completed in 1869. At the height of his business, he had six successful restaurants: four in Pittsburgh, one in Clarion, Pennsylvania, where Garneau raised his family, and one in South Florida. Garneau died in May of this year at his home in Florida at age 90.
Garneau’s invention effectively changed the standard for food safety in self-service environments. Even though there isn’t evidence of direct causation between Garneau’s patent and food-safety initiatives, the FDA has regulated the presence of food shields as far back as the early ’60s. “The 1962 Model Food Service Sanitation Ordinance and 1976 Model Food Service Sanitation Ordinance also has very similar language,” David Steigman, a communications representative of the FDA, stated in an email to Smithsonian.com. “Instead of ‘salad bar food guards’, the term ‘counter protective devices’ and ‘salad bar protective devices’ were used in 1962 and 1976 respectively.” The NSF’s Food Service Equipment criteria for the design and construction of “counter guards” go as far back as 1965, and perhaps even earlier.
The most current law, the 2013 Food Code, states under section 3-306.11 that “FOOD on display shall be protected from contamination by the use of PACKAGING; counter, service line, or salad bar FOOD guards; display cases; or other effective means.”
All 50 states have adopted food codes patterned after one of the eight versions of the FDA’s model (1993, 1995, 1997, 1999, 2001, 2005, 2009 and, as of last month, 2013), which include requirements for protecting food on display that resemble Garneau’s original design. Though each state’s regulation remains in line with the FDA’s guidelines, it is up to state, local and tribal agencies to regulate and inspect retail food establishments. The degree of coverage and specific dimensions of “food guards” vary. New Jersey, for example, follows the NSF International (formerly National Sanitation Foundation) food guard requirements, which state that a sneeze guard must be positioned 14 inches above the food counter surface and must extend seven inches beyond the edge of the utensil on which food is placed.
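To make the New Jersey example concrete, the two NSF figures above can be expressed as a simple check. This is only an illustrative sketch: the function name is my own, and treating 14 inches as a maximum mounting height is an assumption; the actual standard contains further provisions.

```python
# Illustrative check against the two NSF figures cited above.
# ASSUMPTIONS: 14 inches is read as the maximum height of the guard's
# bottom edge above the counter, and 7 inches as the minimum overhang
# past the utensil edge; the real standard has more provisions.

NSF_MAX_HEIGHT_IN = 14.0   # guard positioned 14 inches above the counter surface
NSF_MIN_OVERHANG_IN = 7.0  # guard extends 7 inches beyond the utensil edge

def meets_nsf_food_guard_dims(height_in: float, overhang_in: float) -> bool:
    """Return True if both cited dimensional figures are satisfied."""
    return height_in <= NSF_MAX_HEIGHT_IN and overhang_in >= NSF_MIN_OVERHANG_IN

print(meets_nsf_food_guard_dims(14, 7))   # a guard right at both limits passes
print(meets_nsf_food_guard_dims(18, 4))   # too high, too little overhang: fails
```

The point of the two numbers working together is that a guard mounted low enough, with enough overhang, intercepts the downward arc of a sneeze before it reaches the food.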
According to Elizabeth Dougherty, director of inventor education at the U.S. Patent and Trademark Office, there are only about 100 patents filed in the area of food storage, safety and care—a small number when you consider that there are eight million U.S. patents in total. Since Garneau’s 1959 patent, there have been some innovations in the field, with minor changes to the original design.
“The late ’50s does seem to be the era in time when the sneeze guards started to become an object for innovation and invention,” Dougherty says. “Prior to this time, there are very few documented patents in this technology area.”
The saying goes that “necessity is the mother of invention.” It took a Midwestern restaurateur to realize that, without something to protect them, everyone’s favorite buffet foods were defenseless against the attack of a 40-mph sneeze.
November 27, 2013
Americans consume 5,062,500 gallons of jellied cranberry sauce—Ocean Spray’s official name for the traditional Thanksgiving side dish we know and love, the one that holds the shape of the can it comes in—every holiday season. That’s four million pounds of cranberries—200 berries in each can—that reach a gel-like consistency from pectin, a natural setting agent found in the fruit. If you’re part of the 26 percent of Americans who make homemade sauce during the holidays, consider that only about five percent of America’s total cranberry crop is sold as fresh fruit. Also consider that 100 years ago, cranberries were available fresh for a mere two months out of the year (they are usually harvested from mid-September until around mid-November in North America, making them the perfect Thanksgiving side). In 1912, one savvy businessman devised a way to change the cranberry industry forever.
Marcus L. Urann was a lawyer with big plans. At the turn of the 20th century, he left his legal career to buy a cranberry bog. “I felt I could do something for New England. You know, everything in life is what you do for others,” Urann said in an interview published in the Spokane Daily Chronicle in 1959, decades after his inspired career change. His altruistic motives aside, Urann was a savvy businessman who knew how to work a market. After he set up cooking facilities at a packinghouse in Hanson, Massachusetts, he began to consider ways to extend the berries’ short selling season. Canning them, he knew, would make the berry a year-round product.
“Cranberries are picked during a six-week period,” says Robert Cox, coauthor of Massachusetts Cranberry Culture: A History from Bog to Table. “Before canning technology, the product had to be consumed immediately, and the rest of the year there was almost no market. Urann’s canned cranberry sauce and juice are revolutionary innovations because they produced a product with a shelf life of months and months instead of just days.”
Native Americans were the first to cultivate the cranberry in North America, but the berries weren’t marketed and sold commercially until the early 19th century. Revolutionary War veteran Henry Hall is often credited with planting the first-known commercial cranberry bed in Dennis, Massachusetts, in 1816, but Cox says Sir Joseph Banks, one of the most important figures of his time in British science, was harvesting cranberries in Britain a decade earlier from seeds that were sent over from the States—Banks just never marketed them. By the mid-19th century, what we know as the modern cranberry industry was in full swing, and the competition among bog growers was fierce.
The business model worked on a small scale at first: families and members of the community harvested wild cranberries and then sold them locally or to a middleman before retail. As the market expanded to larger cities like Boston, Providence and New York, growers relied on cheap labor from migrant workers. Farmers competed to unload their surpluses fast—what was once a small, local venture became a boom-or-bust business.
What kept the cranberry market from really exploding was a combination of geography and economics. The berries require a very particular environment for a successful crop, and are localized to areas like Massachusetts and Wisconsin. Last year, I investigated where various items on the Thanksgiving menu were grown: “Cranberries are picky when it comes to growing conditions… Because they are traditionally grown in natural wetlands, they need a lot of water. During the long, cold winter months, they also require a period of dormancy which rules out any southern region of the U.S. as an option for cranberry farming.”
Urann’s idea to can and juice cranberries in 1912 created a market that cranberry growers had never seen before. But his business sense went even further.
“He had the savvy, the finances, the connections and the innovative spirit to make change happen. He wasn’t the only one to cook cranberry sauce, he wasn’t the only one to develop new products, but he was the first to come up with the idea,” says Cox. His innovative ideas were helped by a change in how cranberries were harvested.
In the 1930s, techniques transitioned from “dry” to “wet”—a confusing distinction, says Sharon Newcomb, brand communication specialist with Ocean Spray. Cranberries grow on vines and can be harvested either by picking them individually by hand (dry) or by flooding the bog at the time of harvest (wet), as in many Ocean Spray commercials. Today, about 90 percent of cranberries are picked using wet-harvesting techniques. “Cranberries are a hardy plant; they grow in acidic, sandy soil,” Newcomb says. “A lot of people, when they see our commercials, think cranberries grow in water.”
The water helps to separate the berries from the vine, and small air pockets in the berries allow them to float to the surface. Rather than taking a week, a harvest could be done in an afternoon; instead of a team of 20 or 30, bogs now need a team of four or five. After the wet-harvesting option was introduced in the mid-to-late 1900s, growers looked to new methods of using their crop, including canning, freezing, drying and juicing the berries, Cox says.
Urann also helped develop a number of novel cranberry products, like the cranberry juice cocktail in 1933; six years later, he came up with a syrup for mixed drinks. The famous (or infamous) cranberry sauce “log” we know today became available nationwide in 1941.
Urann had tackled the challenge of harvesting a crop prone to glut and seesawing prices, but federal regulations stood in the way of him cornering the market. He had seen other industries fall under scrutiny for violating antitrust laws; in 1890, Congress passed the Sherman Anti-Trust Act, which was followed by additional legislation, including the Clayton Act of 1914 and the Federal Trade Commission Act of 1914.
In 1930, Urann convinced his competitors John C. Makepeace of the A.D. Makepeace Company—the nation’s largest grower at the time—and Elizabeth F. Lee of the New Jersey-based Cranberry Products Company to join forces under the cooperative Cranberry Canners, Inc. His creation, a cooperative that minimized the risks from the crop’s price and volume instability, would have been illegal had attorney John Quarles not found an exemption for agricultural cooperatives in the Capper-Volstead Act of 1922, which gave “associations” making agricultural products limited exemptions from antitrust laws.
After World War II, in 1946, the cooperative became the National Cranberry Association, and by 1957 it had changed its name to Ocean Spray. (Fun fact: Urann at first “borrowed” the Ocean Spray name and added the image of the breaking wave and cranberry vines from a fish company in Washington State, from which he later bought the rights.) Later, Urann would tell the Associated Press why he believed the cooperative structure worked: “grower control (which) means ‘self control’ to maintain the lowest possible price to consumers.” In theory, the cooperative would keep the competition among growers at bay. Cox explains:
From the beginning, the relationship between the three [Urann, Makepeace and Lee] was fraught with mistrust, but on the principle that one should keep one’s enemies closer than one’s friends, the cooperative pursued a canned version of the [American Cranberry Exchange] ACE’s fresh strategy, rationalizing production, distribution, quality control, marketing and pricing.
Ocean Spray is still a cooperative of 600 independent growers across the United States who work together to set prices and standards.
We can’t thank Urann in person for his contribution to our yearly cranberry intake (he died in 1963), but we can at least visualize this: if you laid all the cans of sauce consumed in a year end to end, they would stretch 3,385 miles—the length of 67,500 football fields. To those of you ready to crack open your can of jellied cranberry sauce this fall, cheers.
November 25, 2013
A few years back, when she was the director and librarian of the Pilgrim Hall Museum, Peggy Baker came across a fascinating document at a rare book and ephemera sale in Hartford, Connecticut. It was a four-course menu for a luxurious dinner at the Hotel Vendome in Boston for November 29, 1894—Thanksgiving.
Appetizers consisted of Blue Point oysters or oyster crabs in béarnaise sauce. The soup was consommé Marie Stuart, with carrots and turnips; or, a real delicacy, terrapin à la gastronome (that’s turtle soup to you).
The choice of entrées included mousse de foie gras with cauliflower au gratin, prime ribs with Yorkshire pudding, Peking duck with onions and squash and—a nod to the traditionalists—roasted turkey with cranberry sauce and mashed potatoes.
Then came salad—at the end of the meal, as they do in Europe—followed by a plethora of desserts: petits fours, plum pudding with maple brandy sauce, Neapolitan ice cream; mince, apple and pumpkin pie; and almond cake with maple frosting. To round out the meal: coffee or sweet cider with assorted cheeses and nuts.
Baker’s discovery of this belt-busting tour de force sent her on a mission to shed light on a long-forgotten chapter of the holiday’s history: a time when wealthy Americans celebrated their Thanksgivings not in the confines of the home with family, but at fancy hotels and restaurants, with extravagant haute cuisine dinners and entertainments.
“I was thoroughly entranced, having no idea any such thing existed,” recalls Baker. She began collecting similar bills of fare from other establishments, in other cities.
“It was like an anthropological expedition to a different culture,” recalls Baker. “I wasn’t aware people dined out as a regular annual event for Thanksgiving. It was just so foreign to me.”
Baker amassed more than 40 of these menus, which she displayed at the museum in 1998, in an exhibit called “Thanksgiving a la Carte.” Baker retired in 2010, but the pieces from the exhibit can still be viewed on the Pilgrim Hall Museum website. (PDF)
The reason a Thanksgiving Day spent anywhere but home seems so jarring today is due in large part to the power of a painting: Norman Rockwell’s 1943 “Freedom from Want”—part of the famous “Four Freedoms” series that Rockwell painted as part of the effort to sell war bonds. Published on the cover of the March 6, 1943, edition of the Saturday Evening Post, the painting depicts a kindly-looking, white-haired patriarch and matriarch standing at the head of the table as hungry family members—their smiling faces only partially visible—eagerly anticipate the mouthwatering turkey dinner that’s about to be served.
But Rockwell’s idealized Thanksgiving celebration is not the way it has always been; it could even be argued that the idea of a tightly knit family celebration at home would have been unfamiliar even to the Pilgrims.
“The meal we harken back to in 1621 is a totally anomalous situation to the way we think about it today,” says Kathleen Wahl, a culinarian and 17th-century food expert at Plimoth Plantation, the living history museum of the Pilgrim period in Plymouth, Massachusetts. “You have about 50 English people whose families were torn apart, by death or distance. It’s like a very modern, make-do family. Family is your neighbors; it’s whoever happens to be in the situation with you.”
Those survivors of the first winter in the New World celebrated the harvest with the Wampanoag sachem Massasoit and about 90 of his men. While there were no restaurants or catering halls in 1621, this was about as close as you could come without a waiter taking Squanto and Miles Standish’s drink orders. “The original Thanksgiving dinner was an ‘out-of-home’ experience,” Wahl argues. “I think going out is more in the tradition of the 1621 event.”
According to James W. Baker, author of the 2009 book Thanksgiving: The Biography of an American Holiday (and husband to Peggy), part of the celebration has always involved events outside the home. Thanksgiving Day balls were popular in New England in the early 19th century—although they followed a day that included church services and a meal at home. “The dinner was just one small element along with these other things,” says Baker, “but over the years it’s swallowed up the other things.” The primacy of the meal continues in more recent times: the Thanksgiving Day parade, the high school football game and the local foot race have all become common holiday events in various parts of the country, but they are usually held in the morning, allowing participants to race home for the family dinner.
It seems to have been during the Gilded Age that the Thanksgiving banquet at the luxury hotel or restaurant first became popular. This coincided with a general movement of the upper class into fashionable new restaurants. “Before then, you stayed home because you didn’t want the riffraff to see what you were doing,” says Evangeline Holland, a social historian who writes about the late Victorian and Edwardian periods on her website, edwardianpromenade.com. “But with the rise of the nouveau riche, people in England started dining out at restaurants, and Americans followed suit.”
What better day to flaunt what you had than on Thanksgiving? “With the Gilded Age, everything is over the top,” says Stephen O’Neill, associate director and curator of collections at Pilgrim Hall Museum. “Thanksgiving is very much a celebration of abundance, so I think they sort of used that as an excuse to promote these extravagantly large dinners.”
The affairs were held at such famous luxury hotels and restaurants as Boston’s Vendome and New York’s Delmonico’s and Waldorf Astoria. Even luxury cruise ships got into the act, offering elaborate Thanksgiving Day dinners to their seaborne passengers. The upper crust in smaller communities had them as well, usually at the fanciest place in town.
The Waldorf, which opened in 1893, probably gets the prize for the most outrageous celebration. In 1915, the hotel erected an elaborate, mock “New England barn” in its grillroom for Thanksgiving Day—complete with, if the press reports are true, live animals and a dancing scarecrow. Well-heeled, urban diners feasted and danced, paying an odd tribute to the rural, New England roots of the holiday. As garish as it may sound today, the event was a smash.
“The Thanksgiving Day revel attracted one of the largest crowds that ever attended an affair at the hotel,” gushed The New York Times.
What changed all that? Baker thinks it was the combination of Prohibition in the 1920s and the Great Depression of the following decade. While some restaurants continued to offer grand Thanksgiving Day dinners, the practice had declined to the point that by the mid-20th century, as Rockwell’s painting suggests, it seemed almost un-American to have Thanksgiving Dinner anywhere but around grandma’s table.
“When my father came back from World War II, he was going to have nothing but the full homemade Thanksgiving dinner around the family table,” recalls Peggy Baker with a laugh. “He did relent enough to let my mother buy a pie from the store…that’s only because she wasn’t good at making pies.”
But some say that in the 21st century, dining out on Thanksgiving could be in again. In a 2011 survey, the National Restaurant Association found that 14 million Americans dined out on Thanksgiving, and anecdotal evidence suggests that more restaurants are open for the holiday to accommodate greater demand.
“It is still a very domestically oriented holiday,” O’Neill says, “but I think now, especially with smaller families or families that are spread out a lot, it is much more fluid and adaptable. Whether it’s at the family home, or someone else’s home, or a restaurant, it’s now more of a ‘let’s just sort of have a big dinner’ holiday.”
Although probably not one with turtle soup and duck liver on the menu.
November 8, 2013
In 1908, over a bowl of seaweed soup, Japanese scientist Kikunae Ikeda asked a question that would change the food industry forever: what gave dashi, a ubiquitous Japanese soup base, its meaty flavor? In Japanese cuisine, dashi, a fermented base made from boiled seaweed and dried fish, was widely used by chefs to add extra oomph to meals–pairing well with other savory but meatless foods like vegetables and soy. For some reason that was generally accepted but inexplicable, dashi made these meatless foods meaty–and Ikeda was determined to find out why.
Ikeda was able to isolate the main substance of dashi–the seaweed Laminaria japonica. He then ran the seaweed through a series of chemical experiments, using evaporation to isolate a specific compound within it. After days of evaporating and treating the seaweed, he saw the development of a crystalline form. When he tasted the crystals, he recognized the distinct savory taste that dashi lent to other foods, a taste he deemed umami, from the Japanese umai (delicious). It was a breakthrough that challenged a cornerstone of culinary thinking: instead of four tastes—sweet, salty, bitter and sour—there were now five. A new frontier of taste had been discovered, and Ikeda wasted no time capitalizing on his discovery.
He determined the molecular formula of the crystals: C5H9NO4, the same as glutamic acid, an amino acid designated as non-essential because the human body—like a large smattering of other animals and plants—is able to produce it on its own. In the body, glutamic acid is often found as glutamate, a compound with one fewer hydrogen atom. Glutamate is one of the most abundant excitatory neurotransmitters in the brain, playing a crucial role in memory and learning. The FDA estimates that the average adult consumes 13 grams of it a day from the protein in food. Non-meat food sources like tomatoes and Parmesan cheese have high levels of glutamic acid.
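The two formulas differ by a single hydrogen, which a little arithmetic makes concrete. This sketch is my own illustration (not from Ikeda’s work), summing standard atomic masses for glutamic acid, C5H9NO4, and for glutamate with its one fewer hydrogen:

```python
# Sum standard atomic masses (g/mol) to check the formulas in the text:
# glutamic acid is C5H9NO4; glutamate has one fewer hydrogen atom.
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def molar_mass(formula: dict) -> float:
    """Sum each element's atomic mass times its atom count."""
    return sum(ATOMIC_MASS[el] * n for el, n in formula.items())

glutamic_acid = {"C": 5, "H": 9, "N": 1, "O": 4}   # C5H9NO4
glutamate = {"C": 5, "H": 8, "N": 1, "O": 4}       # one fewer hydrogen

print(round(molar_mass(glutamic_acid), 2))  # 147.13
print(round(molar_mass(glutamate), 2))      # 146.12
```

The 147.13 g/mol total matches the published molar mass of glutamic acid, so the formula and the one-hydrogen difference check out.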
In 1909, Ikeda began mass-producing Ajinomoto (meaning “essence of taste”), an additive that came out of his creation of the first method of industrially producing glutamate, by way of fermented vegetable proteins. The resulting sodium salt of glutamic acid (the acid with a single sodium ion attached) became famous for its ability to imbue dishes with a meaty flavor, or simply enhance the natural flavor of food. It was touted as a nutritional wonder, helping bland but nutritious food become delicious. A growing number of Japanese housewives used the product, and by the 1930s, recipes included Ajinomoto in their directions. The sodium salt of glutamic acid remains prevalent today–anyone who has eaten KFC or Doritos has ingested it; it’s just known by a different name: monosodium glutamate, or MSG.
Few letters have the power to stop conversation in its tracks more than MSG, one of the most infamous additives in the food industry. The three little letters carry so much negative weight that they’re often whispered sheepishly or, more often, decidedly preceded by the modifier “NO” that seems to make everyone breathe a collective sigh of relief when they go out to eat. Nobody wants MSG in their food—the protest goes—it causes headaches, stomachaches, dizziness and general malaise. It’s unhealthy and, maybe even worse, unsexy, used by lazy chefs as an excuse for flavor, not an enhancement.
At the other end of the spectrum lies umami: few foodie buzzwords pop off the lips with such entertaining ease. Enterprising young chefs like David Chang (of Momofuku fame) and Adam Fleischman, of the LA-based chain Umami Burger, have built their culinary careers on the basis of the fifth taste, revitalizing interest in the meaty depth of umami. It’s difficult to watch the Food Network, the Travel Channel or any food-based program without hearing mention of the taste wunderkind—a host or chef cooing over the deep umami flavors of a portobello mushroom. Where MSG is scary, umami is exciting.
What few people understand is that the hated MSG and the adored umami are chemically related: umami is tasted by the very receptors that MSG targets. At the MAD Symposium in Denmark, a TED-like conference for the food industry, Chang spoke about MSG and umami: “For me, the way that I’m looking at umami, it’s the same way I look at MSG. It’s one and the same.” But if chefs like Chang (neither inept nor lazy when it comes to flavor, as his Michelin stars attest) are down with MSG, why does the additive retain such a bad reputation?
After gaining a foothold in Japanese cooking columns, MSG spread throughout Asia, becoming especially popular in Chinese cooking for enhancing both stocks and vegetarian dishes. Everyone knows this connection, and probably associates MSG use in America most heavily with Chinese restaurants–thanks in large part to the absurdly racist name for MSG sensitivity, “Chinese Restaurant Syndrome.” But MSG’s foray into American cuisine came from more than Chinese dishes; MSG became popular in the United States during World War II thanks in large part to the country’s growing military-industrial complex. The military thought it had found in MSG an answer to the flavorless rations allotted to soldiers, and when the war ended, the troops came home and so did the industrialization of food production. From canned vegetables to frozen dinners, industrially created food was met with wonder in the United States.
That all changed in the 1960s, when trust in industrial food began to wane. In 1962, Rachel Carson published Silent Spring, a manifesto against pesticides that kicked off the environmental movement. As pesticides quickly fell from grace, faith in the industry of yesteryear–the chemicals and additives born from the war–declined as well. In 1968, MSG’s death knell rang in the form of a letter written to the New England Journal of Medicine by Robert Ho Man Kwok, a Chinese-American doctor from Maryland. Kwok claimed that after eating at Chinese restaurants, he often came down with certain unpleasant symptoms, namely “numbness at the back of the neck, gradually radiating to both arms and the back” and “general weakness and palpitation.” After Kwok’s letter ran, the journal received a deluge of letters from other readers, all claiming to suffer from the same affliction, deemed “Chinese Restaurant Syndrome” by the editors. Some readers presented the same symptoms as Kwok, but most were extremely varied, ranging from cold sweats to extreme dizziness. In response, the journal offered up MSG as the likely culprit for its readers’ unpleasant symptoms.
Public interest spurred a number of scientific inquiries into the potential danger of MSG. According to food historian Ian Mosby’s exploration of MSG in “That Won-Ton Soup Headache,” these inquiries went one of two ways: they either sought to prove the harmful short-term effects of MSG (and Chinese Restaurant Syndrome) or they looked to identify more long-term damage caused by the additive. Initially, researchers had success proving both the short-term and long-term dangers of MSG: mice injected with the additive showed signs of brain lesions, and humans fed 3 grams of MSG per 200 ml of soup presented symptoms congruent with “Chinese Restaurant Syndrome.” Subsequent studies, however, provided mixed results: some confirmed findings of brain lesions in animals or symptoms in humans, but other studies were unable to replicate the results. Double-blind studies often showed little correlation between MSG and adverse symptoms. Parties on both sides of the debate slung accusations at each other, with the anti-MSG researchers claiming that studies were being funded by MSG producers, and pro-MSG researchers accusing the other side of fear-mongering.
From the FDA to the United Nations to various governments (Australia, Britain and Japan), the public bodies that have investigated MSG have deemed it a safe food additive. The FDA states on its website:
FDA considers the addition of MSG to foods to be “generally recognized as safe” (GRAS). Although many people identify themselves as sensitive to MSG, in studies with such individuals given MSG or a placebo, scientists have not been able to consistently trigger reactions.
Scientific interest in its deleterious effects seems to be waning: one of the last studies to gain public attention was published in 2011. The authors of that study claimed to have found a link between MSG and obesity, though those results have been questioned. While the general scientific consensus seems to be that only in large doses and on an empty stomach can MSG temporarily affect a small subset of the population, MSG’s reputation is still maligned in the public eye.
On the other hand, MSG’s glutamic cousin umami suffers no public scorn: in 2010, umami was deemed one of the most delicious food trends to watch. When Adam Fleischman’s Umami Burger (a burger chain devoted to all things umami) opened a New York outpost, the wait for a meaty bite stretched to three hours. In addition to piling natural glutamates onto their burger to ensure the most umami flavor, Umami Burger enhances the burger with their “umami dust,” a blend of dried mushrooms and seaweed, and umami sauce, which includes soy and Marmite. Altogether, an original Umami Burger contains 2,185 mg of glutamate.
“Most people don’t know the connection between umami and MSG. They know about it from the fifth taste, and the fifth taste was always called umami and not MSG,” Fleischman explains. “We didn’t feel that using MSG was creative enough. We wanted to do it ourselves. By doing it ourselves, we could create a flavor that was umami without the stigma of MSG. MSG, whether you like it or not, has been marketed so poorly, it sounds like this horrible thing.”
By harnessing natural glutamates for their burgers, Umami Burger avoids the negative connotations associated with MSG. But the “natural” glutamates in an Umami Burger aren’t chemically any different from the glutamates in MSG.
“The short answer is that there is no difference: glutamate is glutamate is glutamate,” says Richard Amasino, professor of biochemistry at the University of Wisconsin-Madison. “It would be identical unless different things created a different rate of uptake.”
Glutamates that occur naturally in food come intertwined with other chemicals or fiber, which the body is naturally inclined to regulate, explains Amy Cheng Vollmer, professor of biology at Swarthmore College. MSG, however, comes without the natural components of food that help the body regulate glutamate levels. It’s like taking an iron supplement versus obtaining iron from spinach or red meat: the iron supplement creates an expressway between the iron and your bloodstream that you wouldn’t find in natural iron sources.
“The bottom line here is context is everything,” Vollmer adds.
So does MSG deserve its bad rap? For the small section of the population that shows sensitivity to it, probably. But for the rest of America, maybe it’s time to reconsider exactly what we’re so afraid of when it comes to MSG.
November 5, 2013
Highlighted by its distinctive gold-yellow label, a bottle of Veuve Clicquot champagne is hard to ignore. In 2012, it was the second-highest-selling brand of champagne in the world, with 1,474,000 nine-liter cases sold worldwide. But Veuve Clicquot wasn’t always so successful: if it weren’t for the efforts of a cunning 19th-century business mind, the champagne might never have existed. That remarkable mind belonged to the eponymous Widow (veuve in French) Clicquot, one of the world’s first international businesswomen, who brought her wine business back from the brink of destruction and created the modern champagne market in the process.
The Widow Clicquot was born Barbe-Nicole Ponsardin, daughter of an affluent textile industrialist in Reims, France. Born in the years leading up to the French Revolution, Barbe-Nicole had a childhood heavily influenced by the political leanings of her father, Ponce Jean Nicolas Philippe Ponsardin, which switched from monarchist to Jacobin as the tide of the Revolution turned against the monarchy. Through his shrewd politics, Barbe-Nicole’s family was able to escape the Revolution relatively unscathed, a rarity for an affluent bourgeois family.
Next door to Hôtel Ponsardin, the large family estate where Barbe-Nicole grew up, lived the Clicquot family, under the patriarch Philippe. Philippe Clicquot also ran a successful textile business, making him the chief competitor to Barbe-Nicole’s father. In an attempt to consolidate the power of their two businesses, Mr. Ponsardin and Mr. Clicquot did what any shrewd business owner in the 18th century would have done: they married their children. In 1798, when she was 21 years old, Barbe-Nicole married Francois Clicquot, Philippe Clicquot’s only son. The marriage was, in effect, an arranged one: a business deal devised by two industrial leaders in the small town of Reims.
Still, as the two embarked on their life together, a real partnership seemed to grow between them. Francois was a lively young man with large aspirations: instead of taking over his father’s textile business, as his father wanted him to, Francois was interested in growing the family’s small wine business. Up to that point, the Clicquot family’s involvement in the wine industry constituted a minor portion of the family business. Philippe often sold wine only as an afterthought to his large textile business, adding bottles of still or sparkling white wine to orders merely to round them out (once a boat had been commissioned and paid for, Philippe wanted to make sure he was getting his money’s worth). Though sparkling wine had been invented, the Champagne region was more famous for its still white wines, which Philippe would buy from wine producers and export on an as-needed basis. Philippe Clicquot had no intention of expanding his wine business into production, but Francois had a different plan.
Francois announced to his father his intention of expanding the family’s wine business, but was met with disapproval. As France plunged into the Napoleonic Wars, Philippe didn’t see wine as a profitable endeavor. Francois dismissed his father’s concerns and set about learning the wine trade, along with his young wife. While Francois had little knowledge of winemaking, the craft ran in Barbe-Nicole’s family: one of her grandmothers had been part of a winemaking operation generations earlier. Still, the two set out to learn the industry from the ground up together.
Despite their apparent passion for the industry, Philippe Clicquot’s judgment seems to have been correct: their champagne business stalled and looked ready to collapse. In 1805, seven years after their marriage, Francois fell suddenly ill with a fever; 12 days later, he was dead. Rumors swirled around the town that his death had been a suicide born of despair at the failing business, though other accounts attribute his death to an infectious fever such as typhoid. Both Barbe-Nicole and Philippe were devastated by Francois’ death, and Philippe announced that by the end of the year, he would shut down the wine business.
Barbe-Nicole had other plans, and approached her father-in-law with a bold proposition.
“Barbe-Nicole goes to her father-in-law and says, ‘I’d like to risk my inheritance, I’d like you to invest the equivalent of an extra million dollars in me running this wine business.’ And he says yes,” explains Tilar Mazzeo, author of The Widow Clicquot. “It’s surprising that he would let a woman who has no business training take this on, and what it speaks to is that Philippe Clicquot was no fool. He understood how very keenly intelligent his daughter-in-law was.”
Keenly intelligent, perhaps, but at that point, Barbe-Nicole had been unsuccessful in selling champagne. So Philippe agreed under one condition: Barbe-Nicole would go through an apprenticeship, after which she would be able to run the business herself, provided she proved her abilities. She entered into an apprenticeship with the well-known winemaker Alexandre Fourneaux, and for four years tried to make the dying wine business grow. It didn’t work, and at the end of her apprenticeship, the business was just as broke as before. So Barbe-Nicole went to her father-in-law a second time asking for money, and for a second time, Philippe Clicquot invested in his daughter-in-law’s business.
“That’s the time that comes right at the end of the Napoleonic Wars, when she has in her cellars what will become the legendary vintage of 1811, and she’s about ready to go bankrupt,” Mazzeo explains. Facing bankruptcy, Barbe-Nicole took a huge business gamble: she knew that the Russian market, as soon as the Napoleonic Wars ended, would be thirsty for the kind of champagne she was making, an extremely sweet champagne that contained nearly 300 grams of sugar per liter (about double that of today’s sweet dessert wines, like a Sauternes). At that moment in champagne history, the market was fairly small, but Russians were early enthusiasts. If she could appeal to their burgeoning desire for champagne and corner that market, Barbe-Nicole believed that success would be hers.
There was only one problem: the naval blockades that had crippled commercial shipping during the wars. Barbe-Nicole smuggled the vast majority of her best wine out of France as far as Amsterdam, where it waited for peace to be declared. As soon as it was, the shipment made its way to Russia, beating her competitors by weeks. Soon after her champagne debuted in Russia, Tsar Alexander I announced that it was the only kind he would drink. Word of his preference spread throughout the Russian court, which was essentially ground zero for international marketing.
“She goes from being a very minor player to a name that everyone knows, and everybody wants her champagne,” Mazzeo says. Suddenly, the demand for her champagne increased so much that she was worried she would not be able to fill all the orders. Champagne making, at that time, was an incredibly tedious and wasteful business, and Barbe-Nicole realized that she would need to improve the process if she was going to keep up with the new demand for her product.
Champagne is made by adding sugar and live yeast to bottles of white wine, triggering what is known as secondary fermentation. As the yeast digests the sugar, the byproducts are alcohol and carbon dioxide, which give the wine its bubbles. There’s only one catch: when the yeast consumes all the sugar, it dies, leaving the winemaker with a sparkling bottle of wine and dead yeast at the bottom. The dead yeast was more than unappetizing: it left the wine looking cloudy and visually unappealing. The first champagne makers dealt with this by pouring the finished product from one bottle to another to rid the wine of its yeast. The process was more than time-consuming and wasteful: it damaged the wine by constantly agitating the bubbles.
Barbe-Nicole knew there had to be a better way. Instead of transferring the wine from bottle to bottle to rid it of its yeast, she devised a method that kept the wine in the same bottle but consolidated the yeast by gently agitating the wine. The bottles were turned upside down and twisted, causing the yeast to gather in the neck of the bottle. This method, known as riddling, is still used by modern champagne makers.
Barbe-Nicole’s innovation was a revolution: not only was her champagne’s quality improved, she was able to produce it much faster. Her new technique was an extreme annoyance to her competitors, especially Jean-Rémy Moët, who could not replicate her method. It wasn’t an easy secret to keep, since Barbe-Nicole employed a large number of workers in her cellars, but no one betrayed her secret, a testament to her workers’ loyalty, Mazzeo explains. It would be decades before any of her competitors became wise to the method of riddling, giving Barbe-Nicole another advantage in the champagne market.
With the production of champagne increasing, Barbe-Nicole set her sights on building a global empire. By the time she died in 1866, Veuve Clicquot was exporting champagne to the far reaches of the world, from Lapland to the United States. Veuve Clicquot helped turn champagne from a beverage enjoyed solely by the upper class into a drink available to almost anyone in the upper middle class, a seemingly small distinction, but one that vastly increased Barbe-Nicole’s market.
“The invention of riddling allows the mass production of an artisanal and luxury product, just not at the tiny quantities that they were dealing with before,” Mazzeo explains. “Barbe-Nicole begins exporting wine around the world in large quantities and is known as being one of the great businesswomen of her century.”
In spite of the extent of her champagne empire, Barbe-Nicole never left France during her lifetime: it would have been inappropriate for a woman of her era to travel alone. She also never remarried, though there is evidence of mild flirtations with some of her business associates (“She was rumored to have had a penchant for handsome young men working in her company,” Mazzeo explains). Had she remarried, she would almost certainly have had to relinquish control of her business, an unthinkable act for the first modern businesswoman.
From risking her inheritance on a failing business to gambling her champagne against a naval blockade, Barbe-Nicole built her champagne empire on bold decisions, a business model she never regretted. As she wrote in the later years of her life in a letter to a grandchild: “The world is in perpetual motion, and we must invent the things of tomorrow. One must go before others, be determined and exacting, and let your intelligence direct your life. Act with audacity.”