August 13, 2013
The science is pretty sound that carrots, by virtue of their heavy dose of Vitamin A (in the form of beta carotene), are good for your eye health. A 1998 Johns Hopkins study, as reported by the New York Times, even found that supplemental pills could reverse poor vision among those with a Vitamin A deficiency. But as John Stolarczyk knows all too well as curator of the World Carrot Museum, the truth has been stretched into a pervasive myth: that carrots hold a super-vegetable power to improve your night-time vision. But carrots cannot help you see better in the dark any more than eating blueberries will turn you blue.
“Somewhere on the journey the message that carrots are good for your eyes became disfigured into improving eyesight,” Stolarczyk says. His virtual museum, 125 pages full of surprising and obscure facts about carrots, investigates how the myth became so popular: British propaganda from World War II.
Stolarczyk is not certain of the exact origin of the faulty carrot theory, but believes it was reinforced and popularized by the Ministry of Information as part of a subterfuge campaign to hide a technology critical to an Allied victory.
During the Blitz of 1940, the Luftwaffe often struck under the cover of darkness. In order to make it more difficult for the German planes to hit targets, the British government issued citywide blackouts. The Royal Air Force was able to repel the German fighters in part because of the development of a new, secret radar technology. The on-board Airborne Interception Radar (AI), first used by the RAF in 1939, had the ability to pinpoint enemy bombers before they reached the English Channel. But to keep that under wraps, according to Stolarczyk’s research pulled from the files of the Imperial War Museum, the Mass Observation Archive, and the UK National Archives, the Ministry provided another reason for their success: carrots.
In 1940, RAF night fighter ace John Cunningham, nicknamed “Cat’s Eyes,” was the first to shoot down an enemy plane using AI. He’d later rack up an impressive total of 20 kills—19 of which were at night. According to “Now I Know” writer Dan Lewis, also a Smithsonian.com contributor, the Ministry told newspapers that the reason for the pilots’ success was that men like Cunningham ate an excess of carrots.
The ruse, meant to send German tacticians on a wild goose chase, may or may not have fooled them as planned, says Stolarczyk.
“I have no evidence they [the Germans] fell for it, other than that the use of carrots to help with eye health was well ingrained in the German psyche. It was believed that they had to fall for some of it,” Stolarczyk wrote in an email as he reviewed Ministry files for his upcoming book, tentatively titled How Carrots Helped Win World War II. “There are apocryphal tales that the Germans started feeding their own pilots carrots, as they thought there was some truth in it.”
Whether or not the Germans bought it, the British public generally believed that eating carrots would help them see better during the citywide blackouts. Advertisements with the slogan “Carrots keep you healthy and help you see in the blackout” (like the one pictured below) appeared everywhere.
But the carrot craze didn’t stop there—according to the Food Ministry, when a German blockade of food supply ships made many resources such as sugar, bacon and butter unavailable, the war could be won on the “Kitchen Front” if people changed what they ate and how they prepared it. In 1941, Lord Woolton, the Minister of Food, emphasized the call for self-sustainability in the garden:
“This is a food war. Every extra row of vegetables in allotments saves shipping. The battle on the kitchen front cannot be won without help from the kitchen garden. Isn’t an hour in the garden better than an hour in the queue?”
That same year, the British Ministry of Food launched a Dig For Victory Campaign which introduced the cartoons “Dr. Carrot” and “Potato Pete” to get people to eat more of the vegetables (bread and vegetables were never on the ration during the war). Advertisements encouraged families to start “Victory Gardens” and to try new recipes using surplus foods as substitutes for those less available. Carrots were promoted as a sweetener in desserts in the absence of sugar, which was rationed to eight ounces per adult per week. The Ministry’s “War Cookery Leaflet 4” was filled with recipes for carrot pudding, carrot cake, carrot marmalade and carrot flan. Concoctions like “Carrolade” made from rutabagas and carrots emerged from other similar sources.
Citizens regularly tuned into radio broadcasts like “The Kitchen Front,” a daily, five-minute BBC program that doled out hints and tips for new recipes. According to Stolarczyk, the Ministry of Food encouraged so much extra production of the vegetable that by 1942, it was looking at a 100,000-ton surplus of carrots.
Stolarczyk has tried many of the recipes including Woolton Pie (named for Lord Woolton), Carrot Flan and Carrot Fudge. Carrolade, he says, was one of the stranger ideas.
“The Ministry of Food had what I call a ‘silly ideas’ section where they threw out crazy ideas to see what would stick—this was one of those,” he says. “At the end of the day, the people were not stupid. If it tasted horrible, they tended to shy away.”
Dr. Carrot was everywhere—radio shows, posters, even Disney helped out. Hank Porter, a leading Disney cartoonist, designed a whole family based on the idea of Dr. Carrot—Carroty George, Pop Carrot and Clara Carrot—for the British to promote to the public.
Dr. Carrot and Carroty George had some competition in the U.S., however—from wise-guy carrot-chomping Bugs Bunny, born around the same time. While Bugs served his own role in U.S. WWII propaganda cartoons, the connection between his tagline, “What’s up Doc?,” and the UK’s “Dr. Carrot” is probably just a coincidence.
August 1, 2013
In 1905, John Schneider sat down, put pen to paper, and began writing an account of his life. Elderly, his wiry white beard and mustache framing a face marked with deeply creased wrinkles, his memories came simply, matter-of-fact words and descriptions perhaps disguising how ill-at-ease the German immigrant felt with his adopted language. “We were 250 brewers in [Cincinnati] and founded the Gambrinus Support Association and demanded 30 dollars per month, which the bosses didn’t agree to, and we went on strike,” he wrote. “Business was good; left Eichenlaube. Went to Moerlein’s Brewery, only there was another strike, so I left soon and went to Herancourt’s Brewery, from there to the Jackson Brewery as maltser and got 3 dollars more wage here.” His words reveal the success of his chosen industry; breweries were plentiful, and Schneider had innumerable options for work. The year was 1854, and Schneider, who would become a brewmaster before long, found himself on the ground-level of the American brewing boom, a business that would peak in 1873 with over 3,700 independent breweries operating in the United States.
140 years later, the American brewing industry is back on the rise, thanks in large part to the reinvigorated appeal — and economic success — of small batch craft breweries. In its midyear report — released this week — the Brewers Association announced strong financial growth for American breweries, growth backed by the sheer number of breweries operating: 2,538, the most since 1873. What sounded the death knell for the brewing boom, and why did it take nearly a century and a half for American brewing to reclaim its former glory? The death of the American brewery can be attributed — at least in part — to the heartbreak of loving something too much: when beer became popular, it became profitable, opening itself up to large-scale corporate control and consolidation.
Before 1810, production statistics for beer are widely unavailable, speaking to its lack of standing in the American beverage rotation. Toward the mid-1850s, however, a number of social and technological advancements made beer an appealing option for drinkers. For one, an influx of immigrants from Britain, Germany and Ireland contributed to the idea of a beer-drinking culture. Additionally, wages were on the rise, affording workers the economic means to knock back a cold one after work. Substantive improvements in technology — such as refrigeration and pasteurization — also contributed to beer becoming more widely available. In 1865, per capita consumption of beer in the United States was 3.4 gallons — by the end of the 19th century, that number had nearly quadrupled.
Up through 1873, most of America’s beer came from small, locally owned and operated breweries. While craft breweries of today are concerned with creating a breadth of creative beer styles (see the Rogue Brewery’s Bacon Maple Ale, a beer inspired by Portland, Oregon’s famous Voodoo Doughnut shop), small batch breweries of the 19th century were more concerned with distributing quality beer to their immediate, local clientele. “Today, America’s craft brewers are creating innovative, high quality beer in a variety of styles and flavors,” explains Paul Gatza, director of the Brewers Association. “But, for a good part of the 20th century, it was hard to find many examples of ales in the U.S.” Lighter styles like lagers and pilsners began to squeeze heavier ales out of the market, thanks in large part to the influx of German immigrants — like Schneider — who brought their country’s penchant for the pilsner to America.
As thirst for the malty beverage increased, a new dynamic pitted big business against small craftsmanship. In 1870, 3,286 breweries produced, on average, 2,009 barrels of beer per year. By 1915, only 1,345 breweries remained, but these were prodigious in their production, cranking out 44,461 barrels per year. “Brewery declines in the 1870s were related to refrigerated and iced rail cars allowing breweries to extend their reach, pushing consolidation and closure of small, local brewers,” says Gatza.
It wasn’t until after Prohibition, however, that these large scale “shipping breweries” began to truly outwit the smaller craft breweries — which, though outnumbered, had been able to sustain their business by supplying small batch brews to their immediate local markets. With the passing of the 21st Amendment, a measure was put in place that banned brewers from owning bars or saloons, requiring a middleman to go between bar owners and beer manufacturers. Such a step drove up costs for small breweries, making their model economically unfeasible. “After Prohibition, over 700 breweries opened, but consolidation of smaller brewers by larger brewers started quickly and continued to around 1980,” Gatza says. “The post-Prohibition low point was 89 breweries owned by 42 companies in the late-1970s.” A combination of factors began to make beer — especially craft beer — less appealing to the American public. Marketing campaigns effectively dictated that the industry center around pale lagers, and diet crazes proselytized the light beer above all. The bell was tolling for the American brewery: experts projected that by the 1980s, there would be five brewing companies left in the United States.
After dancing with extinction, the American tradition of craft brewing has undergone a rebirth in the past 30 years. “A book could be written on what is behind the renaissance,” Gatza explains. “In a nutshell, beer drinkers are far more educated about breweries and beer styles, and having great experiences with delicious beers.” From 89 to 2,538 in three decades might be more than a renaissance, however — we may be witnessing the second coming of an American craft brewing boom.
Which isn’t to say that history is repeating itself — merely mirroring a pattern of expanding industry.
July 31, 2013
A glass of champagne is often synonymous with toasting some of life’s biggest moments—a big promotion at work, weddings, the New Year. So too is the tickle that revelers feel against their skin when they drink from long-stemmed flutes filled with bubbly.
There’s more to that fizz than just a pleasant sensation, though. Inside a freshly poured glass of champagne, or really any sparkling wine, hundreds of bubbles are bursting every second. Tiny drops are ejected up to an inch above the surface with a powerful velocity of nearly 10 feet per second. They carry aromatic molecules up to our noses, foreshadowing the flavor to come.
In Uncorked: The Science of Champagne, recently revised and translated into English, physicist Gerard Liger-Belair explains the history, science and art of the wine. His book also features high-speed photography of champagne bubbles in action and stop-motion photography of the exact moment a cork pops (potentially at a speed of 31 miles per hour!). Such technology allows Liger-Belair to pair the sommelier with the scientist. “Champagne making is indeed a three-century-old art, but it can obviously benefit from the latest advances in science,” he says.
Liger-Belair became interested in the science of bubbles while sipping a beer after his finals at Paris University about 20 years ago. The bubbles in champagne, he explains, are actually vehicles for the flavor and smell of champagne, elements that contribute to our overall sipping experience. They are also integral to the winemaking process, which produces carbon dioxide not once but twice. Stored away in a cool cellar, the champagne, which could be a blend of up to 40 different base wines, ferments slowly in the bottle. When the cork is popped, the carbon dioxide escapes in the form of Liger-Belair’s beloved bubbles. Once poured, bubbles form on several spots on the glass, detach and then rise toward the surface, where they burst, emitting a crackling sound and sending a stream of tiny droplets upward.
These bubble-forming hot spots launch about 30 bubbles per second. In beer, that rate is just 10 bubbles a second. But without this phenomenon, known as effervescence, champagne, beer and soda would all be flat.
Once the bubbles reach the top of the flute, the tension of the liquid below becomes too great as it pulls on them. The bubbles pop in a matter of microseconds. When they burst, they release enough energy to create tiny auditory shock waves; the fizzing sound is a chorus of individual bubbles bursting. By the time champagne goes flat, nearly 2 million bubbles have escaped from the glass.
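The multi-million-bubble figure is easy to sanity-check with a rough back-of-envelope calculation. Every number below is an illustrative assumption, not a figure from Liger-Belair’s book: roughly 12 grams of dissolved carbon dioxide per liter of champagne, a 100-milliliter pour, half-millimeter bubbles at the surface, and the supposition that only a fraction of the gas (say a fifth) escapes as bubbles while the rest diffuses invisibly through the liquid’s surface.

```python
import math

# Assumed inputs (illustrative, not measured values from Uncorked)
co2_per_litre_g = 12.0   # dissolved CO2 in a freshly opened bottle, g/L
pour_ml = 100.0          # a typical flute pour, mL
bubble_diameter_mm = 0.5 # approximate bubble size at the surface
bubble_fraction = 0.2    # guess: share of CO2 that leaves as bubbles

# Total dissolved CO2 in the glass, then its volume as gas
# at roughly 20 degrees C and 1 atm (ideal gas law, CO2 = 44 g/mol)
co2_g = co2_per_litre_g * pour_ml / 1000.0
moles = co2_g / 44.0
gas_litres = moles * 0.0821 * 293.0

# Volume of a single bubble, converted from cubic meters to liters
radius_m = (bubble_diameter_mm / 2.0) / 1000.0
bubble_litres = (4.0 / 3.0) * math.pi * radius_m ** 3 * 1000.0

# Divide the escaping gas by the per-bubble volume
bubbles_estimate = bubble_fraction * gas_litres / bubble_litres
print(f"~{bubbles_estimate:,.0f} bubbles per flute")
```

With these assumed inputs the estimate lands on the order of two million bubbles, which is consistent with the count cited above; tweak the bubble size or escaping-gas fraction and the answer shifts accordingly, which is why published figures vary.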
The collapse of bubbles at the surface is Liger-Belair’s favorite thing about champagne. “Bubbles collapsing close to each other produce unexpected lovely flower-shaped structures unfortunately completely invisible by the naked eye,” he says. “This is a fantastic example of the beauty hidden right under our nose.”
Europeans, though, once considered the bubbling beverage a product of poor winemaking. In the late 1400s, temperatures on the continent plunged suddenly, freezing many of its lakes and rivers, including the Thames River and the canals of Venice. The monks of the Abbey of Hautvillers in Champagne, where the high altitude made it possible to grow top-quality grapes, were already hard at work creating reds and whites. The cold temporarily halted fermentation, the process by which wine is made. When spring arrived with warmer temperatures, the dormant wines began to ferment again. This produced an excess of carbon dioxide inside wine bottles, giving the liquid inside a fizzy quality.
In 1668, the Catholic Church called upon a monk by the name of Dom Pierre Pérignon to finally control the situation. The rebellious wine was so fizzy that bottles kept exploding in the cellar, and Dom Pérignon was tasked with staving off a second round of fermentation.
In time, however, tastes changed, starting with the Royal Court at Versailles. By the end of the 17th century, Dom Pérignon was asked to reverse everything he was doing and focus on making champagne even bubblier. Although historical records show that a British doctor developed a recipe for champagne six years before Pérignon began his work, Pérignon would come to be known as the father of champagne thanks to his blending techniques. The process he developed, known as the French Method, incorporated the weather-induced “oops” moment that first created champagne—and it’s how champagne is made today.
So the next time you raise a glass of bubbly, take a second to appreciate its trademark tickle on another level—a molecular one.
July 24, 2013
There’s nothing inherently wrong with the Korean taco – nothing sinister about the combination of kimchi and hot sauce, nothing terribly iconoclastic about bulgogi wrapped in billowy tortillas. If anything, the Korean taco represents a creative moment in foodie culture, the blending of two seemingly disparate taste profiles into a surprisingly tasty – and palatally coherent – meal. It’s the dish-du-moment of the fusion food trend, the chic movement sometimes credited to Wolfgang Puck that gave us things such as the buffalo chicken spring roll and BBQ nachos. But to call the Korean taco – or the fusion food movement – something new would be rewriting history. “Fusion food,” the blending of culinary worlds to create new, hybrid dishes, has been around since the beginning of trade; so vast is its history that it’s almost impossible to discern the “original” iteration of fusion food. The most famous example, however, so ubiquitous that it’s difficult to connect origin to culture, is the noodle: spaghetti wouldn’t exist if the Chinese hadn’t perfected the method first.
“It’s really hard to invent new dishes, and even harder to invent new techniques,” Rachel Laudan, food historian and author of Cuisine and Empire: Cooking in World History, explains. “Almost all foods are fusion dishes.” But there’s a difference between food we easily recognize as fusion and food whose blended past remains hidden to the casual observer. Dishes often thought of as intensely national, like ramen in Japan or curry in India, frequently have origins in the fusion of cuisines that met during colonial expansion and migration.
“When cultures mix, fusion is inevitable,” adds Corrine Trang, author of Food Lovers Vietnamese: A Culinary Journey of Discovery. “[Colonists] wanted to eat the foods they were used to eating.” But as the hold of imperialism started to slip in the 19th and 20th centuries, a unique idea of nationalism began to take its place. As fledgling nations struggled to prove their might on an international scale, countries often adopted a national dish much like they adopted a flag or national anthem. Yet dishes adopted as representations of a country’s “national” culture generally reflected an area’s culturally diverse history. Below, we’ve compiled a list of foods whose origins exemplify the blending of cultures into a classically “fusion” dish.
Bánh mì: A typical Vietnamese street food, the bánh mì (specifically, the bánh mì thit) combines notes crunchy, salty and spicy to the delight of sandwich lovers everywhere. But this typical Vietnamese sandwich represents a prime example of fusion food. A traditional bánh mì is made up of meat (often pâté), pickled vegetables, chilies and cilantro, served on a baguette. The influence of French colonialism is clear: from the pâté to the mayonnaise, held together by the crucial French baguette, the typically Vietnamese sandwich speaks of Vietnam’s colonial past. Which is not to say that it doesn’t hold a place in Vietnam’s culinary present. “As long as there is demand you’ll always have the product. Basic business practice. Why would you take something off the market, if it sells well?” Trang asks, explaining why this vestige of colonialism enjoys such modern success. “Bánh mì is convenient and delicious. It’s their version of fast food.”
Jamaican patty: One of the most popular Jamaican foods, the patty is similar in idea to an empanada (a dish which also has cross-cultural origins): pastry encases a meaty filling animated with herbs and spices indigenous to Jamaican cuisine. But the snack “essential to Jamaican life” isn’t one hundred percent Jamaican; instead, it’s a fusion product of colonialism and migration, combining the English turnover with East Indian spices, African heat (from cayenne pepper) and the Jamaican Scotch Bonnet pepper. So while the patty might be giving the Chinese noodle a run for its money in terms of late-night street food, its complex culinary history is much less rough-and-tumble.
Vindaloo: Curry vindaloo is an omnipresent staple in any Indian restaurant’s repertoire, but this spicy stew comes from the blending of Portuguese and Goan cuisine. Goa, India’s smallest state, was under Portuguese rule for 450 years, during which time the European colonists influenced everything from architecture to cuisine, including the popular spicy stew known as vindalho (the dropped ‘h’ is merely an Anglicized spelling of the dish). The name itself derives from the Portuguese vinho (wine vinegar) and alho (garlic), two ingredients that give the curry its unique taste. The dish is a replication of the traditional Portuguese stew Carne de Vinha d’Alhos. In Goa, the Portuguese revamped their traditional dish to include the chilies of the region, and today, curry vindaloo is known as one of the spicier curry dishes available. And this trend isn’t singular to vindaloo, as Laudan points out: “curry, as we know it, also has largely British origins.”
Ramen: Nothing says “college student” quite like the fluorescent-orange broth of instant ramen noodles. The real dish, however, remains a Japanese culinary mainstay — and a dish that claims roots in Japan’s imperialist history. In the late 1800s and into the early 1900s, Japan won a series of power struggles with China, allowing the island nation to claim various Chinese territories as its own (including Taiwan and former Chinese holdings in Korea). But land wasn’t the only way the Japanese chose to exert their imperial might over their longtime rivals. They also took the traditional Chinese noodle — saltier, chewier and more yellow due to the technique of adding alkali to salty water during the cooking process — and created a dish known as Shina soba, literally “Chinese noodle.” The name for the dish gradually tempered with time (Shina is a particularly pejorative way to describe something as Chinese) and came to be known as ramen, but its imperial history remains. As food historian Katarzyna Joanna Cwiertka writes in Modern Japanese Cuisine: Food, Power and National Identity, “by physically interacting with China through the ingestion of Chinese food and drink, the Japanese masses were brought closer to the idea of empire.”
July 19, 2013
It’s hard to imagine anything positive coming out of the dreaded hangover, that ultimate punishment doled out by the universe in the form of headaches, nausea and general discomfort. After a night of revelry, the unlucky afflicted often retreat to their beds, nursing aches and pains with rest and water. A brave few, however, have surged ahead, grasping at a mix of science and migraine-induced cravings to create their own remedies for the infamous day-after-blues. While some of these inventive cures have failed the test of time (deep-fried canary was a favorite of the Romans which, thankfully, you won’t find on your nearest diner menu), others have reached a level of success so mainstream that you might be surprised by their more nefarious origins.
Brunch: Though currently a popular occasion for weekend gossip and day drinking, this portmanteau-meal actually began as a hangover cure. Before English writer Guy Beringer proposed the most ingenious combination of breakfast and lunch, weekend feasting was strictly reserved for the early Sunday dinner, where heavy fare like meat and pies was served to the after-church crowd. Instead of forcing this early dinner, Beringer argued that life would be happier for all if a new meal was created, “served around noon, that starts with tea or coffee, marmalade and other breakfast fixtures before moving along to the heavier fare.” By letting people sleep in on Sundays, and awake later for a meal, Beringer noted that life would be made easier for “Saturday night carousers.” Beyond the appeal of a nice, substantial meal after a night of debauchery, Beringer testified to the soothing social interaction brunch brings, reasoning that it helped to “sweep away the cobwebs of the weekend.” Brunch didn’t gain traction with the American crowd, however, until the 1920s and into the 1930s, when celebrities and socialites hosted brunch parties in their homes. Brunch received an even larger following in the 70s and 80s, when church attendance dropped nationwide, and Americans swapped their religious dedication to breaking bread for the secular tradition of breaking yolks.
The Bloody Mary: Battling a hangover with more drinking has been a purported cure for as long as alcohol itself. Famously referred to as “hair of the dog” (a phrase that actually comes from an old cure for rabies, wherein the afflicted would rub a bit of dog hair into the wound), the hungover have often turned to libations as a way to ease their pain. Perhaps no iteration of this is more famous than the Bloody Mary, ubiquitous on brunch menus (see above). But the drink itself wasn’t created to cause hangovers — instead, it was created to cure them. As bartender Josh Krist explains, the roaring crowd of ex-pats that populated Paris in the 1920s required a drink that could ease the pain from their The Sun Also Rises-esque gallivanting of the night before. In response to such demand, Fernand Petiot, bartender at Harry’s New York Bar in Paris, first created the concoction by adding equal parts vodka and tomato juice. In terms of scientific hangover cures, the tomato-juice half of the libation is fairly ingenious, because tomato juice contains high amounts of both lycopene and potassium, which help stimulate blood flow and replenish electrolytes (hair of the dog, however, has been debunked as a healthy way to hurdle the hangover slump).
Fernet: Continuing the fine tradition of spirits invented to cure an over-indulgence in spirits (again, see above), Fernet, a famous Italian liquor now used as a post-meal digestive, was actually created to cure hangovers. As the story goes, Italian spice trader Bernadino Branca invented the spirit in 1845, adding the traditional hangover cure-all myrrh to a lot of grape-infused spirits. He then added a plethora of other flavorings and ingredients, including rhubarb, chamomile, aloe, cardamom, peppermint oil, and — get this — opiates. The resulting mix certainly succeeded in perking up drinkers after a night on the town and, in much more extreme cases, patients suffering from cholera.
Eggs Benedict: If we are sensing a trend here, it’s that the world of brunch is very meta (a hangover cure that inspired other hangover cures…like some headache-ridden version of Groundhog Day). We’ve all heard of the greasy breakfast – eggs, bacon, whatever your heaving stomach can handle – as a cure for the hangover, but if you thought of eggs Benedict as too highbrow to constitute the classic “greasy breakfast,” think again: lore surrounding the origin of this famous brunch food actually cites one seriously hungover Wall Street worker as the original Benedict. In 1942, The New Yorker published an article claiming the dish had its roots in a man named Lemuel Benedict, a Wall Street worker known for his eccentric-for-the-time lifestyle choices (like marrying a woman who worked as an opera singer) and heavy partying habits. After one especially raucous night of partying, Lemuel awoke in the morning and went to breakfast at the Waldorf Hotel, where he invented his own breakfast sandwich of two poached eggs, bacon, buttered toast, and a pitcher of hollandaise sauce. Lemuel’s inventive sandwich caught the eye of the Waldorf’s famous maître d’hôtel Oscar, who sampled the sandwich, made some personal alterations (ham was swapped for bacon, an English muffin for the toast), put the sandwich on the menu, and sailed peacefully into history, much to the delight of hungover brunch attendees everywhere.
Coca-Cola: Brunch, eggs Benedict, Bloody Marys — these items are already so associated with post-drinking maladies that their origin in the history of hangovers might not come as a huge surprise. But the ever-present Coca-Cola bottle in vending machines and corner stores was also the brainchild of those looking to cure hangovers. Coca-Cola went public in 1886, but the recipe the popular beverage was based on had been popular for years at pharmacist John Pemberton’s Atlanta drugstore and soda fountain. By mixing caffeine from kola nuts with cocaine from coca leaves, and adding a thick syrupy base, Pemberton’s original cola sold widely as a miracle hangover remedy. Soon, the beverage’s enjoyable taste made it popular with the non-drinking crowd, and Coca-Cola erupted into the famous soda we know today.