November 8, 2013
In 1908, over a bowl of seaweed soup, Japanese scientist Kikunae Ikeda asked a question that would change the food industry forever: what gave dashi, a ubiquitous Japanese soup base, its meaty flavor? In Japanese cuisine, dashi, a fermented base made from boiled seaweed and dried fish, was widely used by chefs to add extra oomph to meals, pairing well with savory but meatless foods like vegetables and soy. For reasons that cooks accepted but could not explain, dashi made these meatless foods taste meaty–and Ikeda was determined to find out why.
Ikeda was able to isolate the main substance of dashi–the seaweed Laminaria japonica. He then ran the seaweed through a series of chemical experiments, using evaporation to isolate a specific compound within it. After days of evaporating and treating the seaweed, he saw the development of a crystalline form. When he tasted the crystals, he recognized the distinct savory taste that dashi lent to other foods, a taste he deemed umami, from the Japanese umai (delicious). It was a breakthrough that challenged a cornerstone of culinary thinking: instead of four tastes—sweet, salty, bitter and sour—there were now five. A new frontier of taste had been discovered, and Ikeda wasted no time capitalizing on his discovery.
He determined the molecular formula of the crystals: C5H9NO4, the same as glutamic acid, an amino acid designated as non-essential because the human body, along with a great many other plants and animals, is able to produce it on its own. In the body, glutamic acid is often found as glutamate, the ionized form of the acid, which has one fewer hydrogen atom. Glutamate is one of the most abundant excitatory neurotransmitters in the brain, playing a crucial role in memory and learning. The FDA estimates that the average adult consumes 13 grams of it a day from the protein in food. Non-meat food sources like tomatoes and Parmesan cheese have high levels of glutamic acid.
In 1909, Ikeda began mass-producing Ajinomoto (meaning “essence of taste”), an additive born of his invention of the first method for industrially producing glutamate by way of fermented vegetable proteins. The resulting sodium salt of glutamic acid (the acid bound to a single sodium ion) became famous for its ability to imbue dishes with a meaty flavor, or simply to enhance the natural flavor of food. It was touted as a nutritional wonder, helping bland but nutritious food become delicious. A growing number of Japanese housewives used the product, and by the 1930s, recipes called for Ajinomoto in their directions. The sodium salt of glutamic acid remains prevalent today–anyone who has eaten KFC or Doritos has ingested it; it’s just known by a different name: monosodium glutamate, or MSG.
Few letters have the power to stop conversation in its tracks more than MSG, one of the most infamous additives in the food industry. The three little letters carry so much negative weight that they’re often whispered sheepishly or, more often, decidedly preceded by the modifier “NO” that seems to make everyone breathe a collective sigh of relief when they go out to eat. Nobody wants MSG in their food—the protest goes—it causes headaches, stomachaches, dizziness and general malaise. It’s unhealthy and, maybe even worse, unsexy, used by lazy chefs as an excuse for flavor, not an enhancement.
On the other side of the spectrum lies umami: few foodie buzzwords pop off the lips with such entertaining ease. Enterprising young chefs like David Chang (of Momofuku fame) and Adam Fleischman, of the LA-based chain Umami Burger, have built their culinary careers on the basis of the fifth taste, revitalizing an interest in the meaty depth of umami. It’s difficult to watch the Food Network or Travel Channel or any food-based program without hearing mention of the wunderkind taste, a host or chef cooing over the deep umami flavors of a Portobello mushroom. Where MSG is scary, umami is exciting.
What few people understand is that the hated MSG and the adored umami are chemically related: umami is tasted by the very receptors that MSG targets. At a MAD Symposium in Denmark, a TED-like conference for the food industry, Chang spoke about MSG and umami: “For me, the way that I’m looking at umami, it’s the same way I look at MSG. It’s one in the same.” But if chefs like Chang (neither inept nor lazy when it comes to flavor, as his Michelin stars attest) are down with MSG, why does the additive retain such a bad reputation?
After gaining a foothold in Japanese cooking columns, MSG spread throughout Asia, becoming especially popular in Chinese cooking for enhancing both stocks and vegetarian dishes. The connection is familiar: Americans probably associate MSG most heavily with Chinese restaurants–thanks in large part to the absurdly racist name for MSG sensitivity, “Chinese Restaurant Syndrome.” But MSG’s foray into American cuisine came from more than Chinese dishes; MSG became popular in the United States during World War II, thanks in large part to the country’s growing military-industrial complex. The military thought it had found in MSG an answer to the flavorless rations allotted to soldiers, and when the war ended, the troops came home and so did the industrialization of food production. From canned vegetables to frozen dinners, industrially created food was met with wonder in the United States.
That all changed in the 1960s, when trust in industrial food began to wane. In 1962, Rachel Carson published Silent Spring, a manifesto against pesticides that kicked off the environmental movement. As pesticides quickly fell from grace, faith in the industry of yesteryear–of the chemicals and additives born from the war–declined as well. In 1968, MSG’s death knell rang in the form of a letter written to the New England Journal of Medicine by Robert Ho Man Kwok, a Chinese-American doctor from Maryland. Kwok claimed that after eating at Chinese restaurants, he often came down with certain unpleasant symptoms, namely “numbness at the back of the neck, gradually radiating to both arms and the back” and “general weakness and palpitation.” After Kwok’s letter ran, the journal received a deluge of letters from other readers, all claiming to suffer from the same affliction, deemed “Chinese Restaurant Syndrome” by editors. Some readers presented the same symptoms as Kwok, but most reports varied widely, ranging from cold sweats to extreme dizziness. In response, the journal offered up MSG as the likely culprit for its readers’ unpleasant symptoms.
Public interest spurred a number of scientific inquiries into the potential danger of MSG. According to food historian Ian Mosby’s exploration of MSG in “That Won-Ton Soup Headache,” these inquiries went one of two ways: they either sought to prove the harmful short-term effects of MSG (and Chinese Restaurant Syndrome) or they looked to identify more long-term damage caused by the additive. Initially, researchers had success proving both the short-term and long-term dangers of MSG: mice injected with the additive showed signs of brain lesions, and humans fed 3 grams of MSG per 200 ml of soup presented symptoms congruent with “Chinese Restaurant Syndrome.” Subsequent studies, however, provided mixed results: some confirmed findings of brain lesions in animals or symptoms in humans, but others were unable to replicate the results. Double-blind studies often showed little correlation between MSG and adverse symptoms. Parties on both sides of the debate slung accusations at each other, with anti-MSG researchers claiming that studies were being funded by MSG producers, and pro-MSG researchers accusing the other side of fear-mongering.
From the FDA to the United Nations to various national governments (Australia, Britain and Japan), the public bodies that have investigated MSG have deemed it a safe food additive. The FDA states on its website:
FDA considers the addition of MSG to foods to be “generally recognized as safe” (GRAS). Although many people identify themselves as sensitive to MSG, in studies with such individuals given MSG or a placebo, scientists have not been able to consistently trigger reactions.
Scientific interest in its deleterious effects seems to be waning: one of the last studies to gain public attention was published in 2011. The authors of that study claimed to have found a link between MSG and obesity, though those results have been questioned. While the general scientific consensus seems to be that MSG can temporarily affect a small subset of the population only in large doses and on an empty stomach, the additive is still maligned in the public eye.
On the other hand, MSG’s glutamic cousin umami suffers no public scorn: in 2010, umami was deemed one of the most delicious food trends to watch. When Adam Fleischman’s Umami Burger (a burger chain devoted to all things umami) opened a New York outpost, the wait for a meaty bite stretched to three hours. In addition to piling natural glutamates onto its burgers to ensure the most umami flavor, Umami Burger enhances them with its “umami dust,” a blend of dried mushrooms and seaweed, and an umami sauce that includes soy and Marmite. Altogether, an original Umami Burger contains 2,185 mg of glutamate.
“Most people don’t know the connection between umami and MSG. They know about it from the fifth taste, and the fifth taste was always called umami and not MSG,” Fleischman explains. “We didn’t feel that using MSG was creative enough. We wanted to do it ourselves. By doing it ourselves, we could create a flavor that was umami without the stigma of MSG. MSG, whether you like it or not, has been marketed so poorly, it sounds like this horrible thing.”
By harnessing natural glutamates for their burgers, Umami Burger avoids negative connotations associated with MSG. But the “natural” glutamates in an Umami Burger aren’t chemically any different from the glutamates in MSG.
“The short answer is that there is no difference: glutamate is glutamate is glutamate,” says Richard Amasino, professor of biochemistry at the University of Wisconsin-Madison. “It would be identical unless different things created a different rate of uptake.”
Glutamates that occur naturally in food come intertwined with other chemicals or fiber, which the body is naturally inclined to regulate, explains Amy Cheng Vollmer, professor of biology at Swarthmore College. MSG, however, comes without the natural components of food that help the body regulate glutamate levels. It’s like taking an iron supplement versus obtaining iron from spinach or red meat: the supplement creates an expressway between the iron and your bloodstream that you wouldn’t find in natural iron sources.
“The bottom line here is context is everything,” Vollmer adds.
So does MSG deserve its bad rap? For the small segment of the population that shows sensitivity to it, probably. But for the rest of America, maybe it’s time to reconsider exactly what we’re so afraid of when it comes to MSG.
October 30, 2013
In 1971, Walt Disney World had just opened in Orlando, Florida. Led Zeppelin was about to blow our minds, a prison riot had been put down at Attica, and all across America, kids were pooping pink. Hundreds of mothers hospitalized their children for fecal testing out of fear of internal bleeding. Within that same year, not-so-coincidentally, General Mills released its classic monster cereals Count Chocula and Franken Berry. The latter was colored red using “Food, Drug and Cosmetics” (FD&C) Red No. 2 and No. 3; the former of those dyes, originally and chemically known as amaranth, is a synthetic color named after the natural flower, and it can’t be broken down or absorbed by the body.
A 1972 case study, “Benign Red Pigmentation of Stool Resulting from Food Coloring in a New Breakfast Cereal (The Franken Berry Stool),” published in Pediatrics, explains the phenomenon later known as “Franken Berry Stool.” A 12-year-old boy was hospitalized for four days after being admitted for possible rectal bleeding. “The stool had no abnormal odor but looked like strawberry ice cream,” the study’s author, Payne, reports. Further questioning of the mother revealed that the child had eaten a bowl of Franken Berry cereal on each of the two days before his hospitalization. On the fourth day, the doctors ran a little experiment: they fed the boy four bowls of Franken Berry cereal, and for the next two days he passed bright pink stools. But other than pink poop, there were no other symptoms. Payne reports, “Physical examination upon admission revealed [a boy] in no acute distress and with normal vital signs…Physical examination was otherwise unremarkable.”
At the time of the study, the product had only been on the market for a few weeks. The author warns that “physicians should be aware of its potential for producing reddish stools.” Other monster cereals at the time also used dyes that caused stool to change colors. Boo Berry, which debuted in December of 1972, for example, uses Blue No. 1 (a dye currently banned in Norway, Finland and France) and turns stool green. Apparently, green stool looks less life-threatening than the reddish hue caused by Franken Berry.
But pink poop wasn’t always the worst side effect of colored confections. Ruth Winter’s A Consumer’s Dictionary of Cosmetic Ingredients details the history of commercial food dyes, including those later used in Franken Berry. At the turn of the 20th century, with virtually no regulation of the more than 80 dyes used to color food, the same dyes used for clothes could also be used to color confections and other edibles.
In 1906, Congress passed the first legislation governing food colors, the Pure Food and Drug Act, deeming seven colors suitable for use in food: orange, erythrosine, ponceau 3R, amaranth (the color later used in Franken Berry cereal), indigotin, naphthol yellow and light green. Since then, upon further study, several of these colors have been delisted.
More than 20 years later, in 1938, Congress passed the Federal Food, Drug, and Cosmetic Act, which gave these colors numbers instead of chemical names and required every batch to be certified by the Food and Drug Administration. Problems still arose: in the fall of 1950, for example, many children became ill after eating an orange Halloween candy containing one to two percent FD&C Orange No. 1.
Red Dye No. 2, the one used in the original Franken Berry cereal, was one of the most widely used color additives at the time, until a 1971 Russian study reported that the dye caused tumors in female rats. Years of research led the FDA to conclude that the Russian study was deeply flawed (the agency couldn’t even prove that amaranth was one of the dyes used), but between public outcry against the dye and the chance that trace amounts could be carcinogenic, the agency removed it from its Generally Recognized As Safe (GRAS) list in 1976 and banned a number of other dyes as well. According to the FDA, 47 other countries, including Canada and the United Kingdom, still allow the use of Red Dye No. 2.
That same year, Mars removed its red M&M’s from the candy-color spectrum for nearly a decade, even though the company didn’t use Red No. 2; the removal of the red candies was a response to the scare, livescience.com reports:
The red food coloring in question was not actually used in M&M’s chocolate candies, according to mms.com. “However, to avoid consumer confusion, the red candies were pulled from the color mix.”
General Mills did not respond to inquiries about when Franken Berry’s ingredients switched to less poop-worrying dyes. These days, the only red colors accepted by the FDA are Red No. 40, which appears in all five of the General Mills monster cereals, and Red No. 3, typically used in candied fruits.
The symptoms of “Franken Berry Stool” were pretty benign compared to other, more notable confectionery mishaps in history: the poisoning of more than 200 people in Bradford, England, in 1858 comes to mind, after sweets were accidentally made with arsenic. Let’s be thankful there’s a bit more regulation of food dyes these days.
Another stool scare in cereal history: Smurfberry Crunch Cereal, released in 1982 by Post Foods, turned the poop of those who ate it blue—the ultimate Smurfs experience. Post then changed the formula and re-released the cereal in 1987 as Magic Berries Cereal.
Looking for a sugar high now? You’re safe. When you open a celebratory box of Franken Berry or any of the other monster cereals this Halloween (for the first time, all five monsters are available in stores, thanks to the well-received re-releases of Frute Brute (1975-1984) and Fruity Yummy Mummy (1987-1992)), expect a sugar high—without the pink poop aftermath. We tasted all five of the cereals, and Count Chocula is the best by a long shot.
The best part is when the chocolate “sweeties,” as the marshmallows were called in the original commercials in 1971, are all gone: the plain milk becomes chocolate milk. Let’s be real, what child—or “adult”—prefers regular milk to chocolate? I haven’t met this kind of person.
October 24, 2013
The avocado is a fruit of a different time. The plant hit its evolutionary prime during the beginning of the Cenozoic era when megafauna, including mammoths, horses, gomphotheres and giant ground sloths (some of them weighing more than a UPS truck) roamed across North America, from Oregon to the panhandle of Florida. The fruit attracted these very large animals (megafauna by definition weigh at least 100 pounds) that would then eat it whole, travel far distances and defecate, leaving the seed to grow in a new place. That’s the goal of all botanical fruits, really. Survival and growth via seed dispersal.
But the great mammals disappeared forever about 13,000 years ago in the Western Hemisphere. Around that time, North America lost 68 percent of its diverse Pleistocene megafauna, and South America lost 80 percent, says Connie Barlow, author of The Ghosts of Evolution: Nonsensical Fruit, Missing Partners, and Other Ecological Anachronisms. But even after this major shift in the land mammal population, the wild avocado still relies on the same method of seed dispersal, which makes it something of an evolutionary anachronism.
“After 13,000 years, the avocado is clueless that the great mammals are gone,” Barlow explains. “Without larger mammals like the ground sloth to carry the seed far distances, the avocado seeds would rot where they’ve fallen and must compete with the parent tree for light and growth.”
A fruit with smaller seeds, like a berry, for example, can be consumed whole and dispersed by small mammals, making the chances of fruiting in a new place higher.
After the giant mammals died out, if an avocado tree was lucky, a jaguar might’ve found the fruit attractive—the cat’s stomach is designed for digesting large hunks of meat, leaving potential for swallowing the avocado whole, though there is no evidence to support this idea. Rodents like squirrels and mice may have also contributed, traveling and burying seeds in the ground rather than letting them rot on the surface. Wild avocados appealed to larger animals because they had enough tasty flesh to lure them in and could be eaten in one bite. The fruit had a larger pit and less flesh than today’s avocados, but it served as a quick snack for big mammals like the mammoth. Barlow writes in “Haunting the Wild Avocado,” originally published in Biodiversity:
The identities of the dispersers shifted every few million years, but from an avocado’s perspective, a big mouth is a big mouth and a friendly gut is a friendly gut. The passage of a trifling 13,000 years (since the Pleistocene extinction) is too soon to exhaust the patience of genus Persea. The genes that shape fruits ideal for megafauna retain a powerful memory of an extraordinary mutualistic relationship.
How the avocado still exists in the wild after losing its main seed dispersers remains a puzzle. But once Homo sapiens evolved to the point where it could cultivate the species, the fruit had the chance to thrive anew. Back when the giant beasts roamed the earth, the avocado would’ve been a large seed with a small fleshy area—less attractive to smaller mammals such as ourselves. Through cultivation, humans have bulked up avocados so there is more flesh for us to eat.
The avocado has been a staple food in Mexico, as well as Central and South America, since 500 B.C. Spanish conquistadors learned of the fruit from the Aztecs in the 16th century, but the fruit the Aztecs called ahuacatl wasn’t grown commercially in the United States until the turn of the 20th century. By 1914, the exotic fruit had made an appearance on California soil. Roughly 90 percent of the nation’s avocados are now grown in California, according to NPR. But Barlow is quick to point out the difference between a cultivated avocado and those found naturally.
“The wild varieties of avocados that are still somewhat available have a thin fleshy area around the seed—it wouldn’t necessarily be something that we would recognize as edible,” says Barlow. “When we go to the store and we see an avocado on sale, it’s always a question of will this be one with a tiny seed, or will it be a batch where the seed takes up five-sixths of the space of the fruit?”
Ecologist Dan Janzen conducted groundbreaking research on these and other “anachronistic fruits” and found that the avocado isn’t alone in this regard. His research in the late ’70s in the neotropics—an ecozone that includes both Americas and the entire South American temperate zone—sparked a shift in ecological thinking regarding these evolutionarily stunted fruits. Other examples include papaya, cherimoya, sapote and countless other fleshy fruits of the neotropics. Another surprising “ghost” you may see every day: honey locust pods scattered about your driveway. None of these fruits is considered edible by most native mammals today. Barlow continues:
“In 1977, however, [Janzen] was beginning to suspect that he—along with every other ecologist working with large tropical fruits of the New World—had been wrong in one very big way. They all had failed to see that some fruits are adapted primarily for animals that have been extinct for 13,000 years.”
“We don’t have the liver or the enzyme systems to detoxify our bodies from something like the avocado seed,” Barlow says. “But at the same time, the rhino which has been around for ages, can eat all kinds of things that are toxic to everyone else.”
A South American folk recipe for rat poison mixes avocado pits with cheese or lard to kill off unwanted rodents. Whether or not humans are supposed to eat avocados from an evolutionary standpoint, America produced 226,450 tons of the fruit and consumed 4.5 pounds per capita in 2011. The avocado, a true “ghost of evolution,” lives on.
More avocado facts to drop at your next party:
- The Aztec word for avocado, ahuacatl, means “testicle,” most likely because the avocado, growing in pairs, resembled the body part. After the arrival of Spanish conquistadors, Spanish speakers substituted the form avocado for the Aztec (Nahuatl) word because ahuacatl sounded like the early Spanish word avocado (now abogado), meaning “lawyer.”
- The Spanish-Mexican word “guacamole” was derived from ahuacamolli, meaning “avocado soup or sauce,” made from mashed avocados, chiles, onions and tomatoes.
- For reasons related to the word’s origin, the avocado is also considered an aphrodisiac. According to the book The Aphrodisiac Encyclopaedia, by the time the fruit traveled to Europe, the Sun King (Louis XIV) nicknamed avocados la bonne poire (the good pear) because he believed it restored his lagging libido.
- The Hass variety of avocado was named after a postal employee, Rudolph Hass, who purchased the seedling in 1926 from a California farmer.
- For more information regarding other “ghosts of evolution,” Barlow’s theme song is a great listen.
September 25, 2013
Though most people rely on commercial producers for their bread, baking one’s own at home is rather simple to do. Combined in a bowl with flour and water, dried yeast reacts marvelously, coming vigorously to life as it ferments sugars and creates a delicious balloon of gas-filled dough. Thirty minutes in the oven produces a house full of aromas and a hot, steaming loaf on the table. It’s easier, for sure, than pie. With white flour, anyway.
But using whole wheat takes things up a notch. Unlike white flour, whole wheat–like other unrefined grains–contains germ and bran. These two components bear minerals like zinc, magnesium and iron, as well as omega-3 fatty acids and dietary fiber. They also add a nutty array of flavors to a loaf of bread, as well as a fuller texture. Thing is, they also make life harder for bakers. For one thing, bran and germ soak up water, which can dry out a loaf and make it crumbly–and largely for this reason, bakers cannot simply substitute whole grain for white. Rather, recipes must be entirely recomposed. Germ and bran also add weight to the dough, which can impede its capacity to rise, leading to loaves almost as dense as French cobblestone. But a properly made whole wheat loaf can be surprisingly light as well as healthy to eat in ways that white bread isn’t, and if one loaf should fail, it’s worth it for the home baker to try again for that perfect honey-brown bread.
It helps to try a few basic methods. First and foremost, you must use enough water.
“Probably the most frequent mistake in baking whole wheat bread is not using enough water,” says Dave Miller, a whole wheat enthusiast and the owner of Miller’s Bakehouse near Chico, Calif. “You really need to hydrate the flour. Only then can you get really beautiful, soft bread.” White flour dough can be made with as little water as just 60 percent of the flour weight–a so-called “baker’s percentage” of 60 percent. But whole grain flour demands significantly more. Most commercial bakers use at least a 90-percent baker’s percentage of water–that is, 14.4 ounces to a pound of whole wheat flour. Miller uses even more water than that–often a 105-percent baker’s percentage. That means he uses almost 17 ounces of water to 16 ounces of flour.
And in San Rafael, Calif., Craig Ponsford, of the bakery Ponsford’s Place, goes even higher–up to 120 and even 130 percent water. “My dough is like soup when I first combine the flour and water,” says Ponsford, who makes breads and pastries with nothing but 100-percent whole grain flour. “Bread is all about the water. Water is what makes light, fluffy loaves, and in the case of whole wheat you need lots of water.”
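For home bakers who want to make the baker’s-percentage arithmetic concrete, here is a minimal sketch in Python (the helper function is purely illustrative and assumed, not anything the bakers above actually use; the percentages are the figures cited in this section):

# A rough sketch of the baker's-percentage math described above.
# The function name and example values are illustrative only.
def water_for_flour(flour_oz, bakers_percent):
    """Ounces of water for a given flour weight and a baker's percentage of water."""
    return flour_oz * bakers_percent / 100.0

flour = 16.0  # one pound of whole wheat flour
print(water_for_flour(flour, 60))   # 9.6 oz  -- a stiff white-flour dough
print(water_for_flour(flour, 90))   # 14.4 oz -- typical commercial whole wheat
print(water_for_flour(flour, 105))  # 16.8 oz -- Dave Miller's wetter dough
print(water_for_flour(flour, 120))  # 19.2 oz -- Craig Ponsford's soup-like dough

Put another way, every extra 15 points of hydration on a pound of flour adds another 2.4 ounces of water, which is why Ponsford’s dough starts out looking like soup.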
You also don’t want to over-knead your whole wheat dough. That’s because it contains flakes of bran, which can actually cut the dough like knives.
“Those will slice through the gluten strands when you’re kneading the dough,” says Jonathan Bethony-McDowell, a research baker in Washington State University’s Bread Lab, a facility used in national wheat breeding programs. This cutting action, he explains, will damage the consistency and structure of the dough and curtail its ability to rise. In any case, an extra-wet, gooey dough may be too sticky to knead easily; a quick mix will do.
You’ll also probably have to give your whole wheat dough more time to rise than you would white dough, thanks to the heavy germ and bran particulates. But Ponsford warns that there is only so much time you can give. That is, at a certain point, a ball of dough will reach its maximum volume. Then, as the fermenting yeast continues metabolizing the sugars in the wheat, the dough stops rising and reverses. “If you let your dough over-ferment, then the gluten deteriorates, and the dough can collapse,” Ponsford explains.
So, what’s the sweet spot? The rule of thumb when using a baker’s percentage of 1 percent yeast (remember, that’s 1 percent of the flour weight) says you can let whole wheat dough rise for about three-and-a-half hours at 75 degrees Fahrenheit before it attains its maximum volume, according to Ponsford. But Ponsford usually uses one-tenth of a percent yeast. (A gram-sensitive scale would be helpful here.) Thus, the yeast takes longer to attain its full vigor–and the dough longer to reach its maximum gas capacity. Some of Ponsford’s whole wheat breads spend 36 hours rising, he says–a time span that he explains allows great development of flavor as the yeasts work on the germ, bran and endosperm. Ponsford likens these day-and-a-half breads to the great red wines of Bordeaux. Like a good Cabernet Sauvignon, he explains, such complex, long-fermented whole grain bread will last longer on the shelf and can be matched to stronger-tasting foods.
Beyond bread, those with a sweet tooth can also bake using whole grain flour. That’s what professional pastry chef Kim Boyce has been doing since 2007, after she discovered while experimenting with a recipe just how good whole wheat pancakes can be. Today, Boyce owns and operates Bakeshop, a pastry house in Northeast Portland, Ore. For Boyce, using whole grains is not about the health benefits. Rather, she believes they make better pastries, plain and simple.
“Whole grains give you a toothsome texture and a little nuttiness,” she says. “There is so much more flavor in whole grains, and that lets me pair my pastries with fruits and wines.” For cookie recipes, Boyce uses entirely whole grain flour, but for items that require some fluff, like scones and muffins, she uses a 50-50 blend of white and whole grain flour.
Boyce says it doesn’t take a pro baker to replicate her recipes, many of which she has published in her 2010 cookbook, Good to the Grain. “People can totally do this at home,” Boyce says. For those hoping to try their own creations, Boyce advises starting with a favorite baking recipe that calls for white flour and substituting in a quarter or a half cup of whole grain flour in a one-to-one swap. Those who proceed further toward entirely whole wheat pastries must start boosting the liquid volumes, she advises, whether milk, water or cream, to accommodate the higher levels of water-grabbing germ and bran.
Whole wheat baking, clearly, takes some effort and time to do well. But whole grain proselytizers believe it’s well worth it–that the health benefits of eating whole grain flour, as well as the bonus of improved flavor, outweigh the challenges of turning it into bread. White flour, says Bethony-McDowell at the WSU Bread Lab, is nothing but powdery white endosperm–almost entirely devoid of nutrition. “It’s just starch,” he says. “Ninety percent of the nutrients in whole wheat go out the door as soon as you mill it into white flour.” Monica Spiller is another advocate for whole grains–and for leavening them with sourdough, which she and others say is good for the digestive tract. She sells heirloom seeds to farmers through her online nonprofit, the Whole Grain Connection, and she voices an increasingly supported notion that gluten intolerance is a misidentified condition. “I think gluten intolerance is actually an intolerance to refined flour,” she says. Ponsford, too, has observed this, he says, in customers at his bakery who once reported stomach aches after eating refined wheat products but who can digest his whole grain pastries and breads just fine.
The verdict may not be in yet on this health claim–but the jury, anyway, is baking good bread.
Following are two recipes from the experts.
Dave Miller’s Basic Whole Wheat Bread
16 ounces whole wheat flour
16.32 ounces water (102 percent of flour weight, though extra dry flour may call for 105 percent, or 16.8 ounces, of water)
3.2 ounces sourdough starter (or, for non-sourdough, 1 tsp active dry yeast)
0.38 ounces salt
Mix the flour with 90 percent of the water in a bowl. Let sit for 30 minutes–a lapse of time called the “autolyse,” during which enzymes activate and convert starches into sugar. Next, mix the dough in an automatic mixer or by hand for several minutes. Add the remaining water, the sourdough starter and the salt. The dough will be very gooey–almost like batter. Allow it to sit for three hours in a bowl at room temperature. Next, break apart the dough and shape it into loaves. Allow 20 minutes of rising. Punch down the loaves and allow one more rise. After three hours, place them in an oven preheated to 520 degrees F (yes–this is very hot). After 15 minutes, reduce the temperature to 470 for 20 minutes. For 15 more minutes, open the oven door a crack, which allows moisture to escape and facilitates crust formation. Remove the finished bread.
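To scale this formula up or down, a small hypothetical sketch (Python again; the percentages are simply derived from the ingredient list above–roughly 102 percent water, 20 percent starter and about 2.4 percent salt, relative to the flour weight) can compute ingredient weights for any amount of flour:

# An illustrative scaling of the formula above by baker's percentages;
# the numbers are derived from the ingredient list, not from Miller himself.
FORMULA = {
    "water": 102.0,              # use 105.0 for extra-dry flour
    "sourdough starter": 20.0,   # 3.2 oz per 16 oz of flour
    "salt": 2.4,                 # 0.38 oz per 16 oz of flour
}

def scale_recipe(flour_oz, formula=FORMULA):
    """Return ingredient weights in ounces for a given flour weight."""
    amounts = {"whole wheat flour": flour_oz}
    for ingredient, percent in formula.items():
        amounts[ingredient] = round(flour_oz * percent / 100.0, 2)
    return amounts

# For example, a larger batch built on 24 ounces of flour:
for ingredient, ounces in scale_recipe(24.0).items():
    print(f"{ounces:>6} oz  {ingredient}")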
Monica Spiller’s Sourdough Starter
1/2 cup water
1/2 cup whole wheat flour
Directions: Combine half the flour and half the water in a glass jar and cover with a cloth. Stir two times per day. After about three days, the mixture should be bubbling. Using pH paper, measure the acidity; Spiller suggests aiming for a pH of 3.5. Now feed the starter half of the remaining flour and water. The pH should hit 3.5 again in slightly less time–two days, perhaps. When it does, add the remaining flour and water. This time, the increasingly vigorous starter will hit the desired pH in just eight hours. It is now ready to use. Always leave a portion in the jar to allow indefinite propagation. Maintaining the starter is easy: simply remove about half of its volume every week, either to discard or (preferably) use in bread, and “feed” the starter with fresh whole wheat flour and water. If you bake less frequently, keep the starter in the fridge. Keep it covered with a cloth.
March 26, 2013
These days, the classic wedge salad—wherein the chef smothers a chunk of crisp Iceberg lettuce with creamy blue cheese dressing, and crumbles bacon all over the top—is seen as a cornerstone of American “comfort food.”
The dish is also often credited with single-handedly causing an “Iceberg comeback.” All of this raises the question: Did this crisp salad green, the “polyester of lettuce,” really go so far away that it needed to come back? And if so, can one menu item really make a difference?
But first a note—for those who aren’t old enough to remember—about just how ubiquitous Iceberg lettuce once was. Introduced for commercial production in the late 1940s, Iceberg (or crisphead) lettuce was the only variety bred to survive cross-country travel (the name Iceberg comes from the piles of ice the light green lettuce heads were packed in before the advent of the refrigerated train car). Therefore, throughout the middle of the century, unless you grew your own or dined in a high-end establishment, Iceberg essentially was lettuce.
Most of the nation’s lettuce is grown in California, and in 1974, leafy green “non-crisphead” varieties of lettuce still made up only around five percent of the total acres grown in California. Then things changed. For one, consumers became more aware of the nutritional value of greens that are, well, greener. (Made up mostly of water, Iceberg has only around 1/20th the vitamins of the darker leafy greens, says David Still, a plant science professor at California State Polytechnic University at Pomona.)
America’s everyday lettuce for half a century was losing market share. By 1995, other lettuce varieties made up around 30 percent of the lettuce Americans ate, and that share has been rising steadily since, according to the California Leafy Greens Research Programs (a salad industry group). That’s precisely why, by 2007, the Salinas, California-based Tanimura and Antle—the nation’s largest lettuce supplier—decided it needed to start promoting Iceberg. And rather than compete with varieties that have more flavor or nutrition, Tanimura and Antle went straight for nostalgia, and opted to draw a connection to steaks, fathers and sports. A press release from the time reads:
“Mother’s Day has strawberries, Thanksgiving has celery, but historically no holiday has been associated with Iceberg lettuce,” says Antle. “What better product to claim ownership of Father’s Day than the cornerstone salad of steakhouse menus?”
Wal-Mart, Albertsons, and several other big retailers hung signs and banners promoting the campaign, and sales got a boost. The company also planted wedge salad recipes around the food media world, in hopes that they would inspire chefs to return to this American Classic.
It’s hard to say whether the Father’s Day angle made a difference, but the larger effort to reconnect Iceberg to simpler times with fewer complicated health choices appears to have worked. Sort of.
On the one hand, chefs like the fact that Iceberg is a completely neutral way to add crunch and filler to an otherwise flavorful medley of ingredients. So it appears that this classic salad will be sticking around on menus for a while. (Last fall the San Francisco Chronicle ran a list of nearly a dozen upscale restaurants serving some variation on the wedge salad, including everything from croutons to apple, walnuts and avocado. One Napa restaurant even serves it with the Iceberg frozen for extra crispness.)
On the production level, however, Iceberg may never return to its reigning position. It’s a little cheaper to grow and has long been easy to ship and store, but it has a hard time standing up to romaine, butter and all the other specialty greens that have become popular in recent years.
This also appears to be true outside the U.S. In 2011, for example, the UK-based Telegraph declared, “The era of Iceberg lettuce is over,” as “bagged leaf varieties such as [arugula] and watercress are up by 37 per cent compared to last year.” Of course, it may never be hard to find Iceberg lettuce in fast food tacos and Sizzler salad bars. But the decline of Iceberg might also signal some good news for Americans’ diets.
“Iceberg sales have gone down, but romaine has gone up,” says Mary Zischke from the California Leafy Greens Research Programs. “Tastes have changed. And the darker, leafy greens have a better story to tell from a nutrition standpoint.”
Compared to 20 years ago, Zischke added, “there are a lot more choices. Especially in some parts of the country, like the Midwest.” Overall, she’s glad to report that: “The product mix has changed, but our [greens] industry has also gotten bigger.”