August 11, 2011
In the story of Kermit the Frog’s rise to fame recounted in The Muppet Movie, the road to stardom is paved with danger—namely in the form of Doc Hopper, the owner of a fast-food chain specializing in frogs’ legs, who wants Kermit as a singing, dancing spokesman. Our amphibian friend is horrified by the prospect. “All I can see are millions of frogs with tiny crutches,” he says in response to Hopper’s initial business proposal. And while things turned out well for Kermit and his talented troupe of friends, in real life, it’s not that easy being green. A worldwide penchant for frogs’ legs results in billions of frogs being snapped up and eaten every year, and according to a new study, it’s a dining habit that is putting considerable strain on frog populations.
In Europe, the mild-flavored meat has been a part of the cuisine for centuries, but demand for frogs’ legs skyrocketed after World War II to the point that local frog populations in Romania went extinct. France had to place a ban on the collection of indigenous frogs in 1992. To meet consumer demands, the European Union has been importing frogs from Asia. The United States is another major frog consumer, importing an average of 2,280 tons of legs per year, most of which come from, ironically, American bullfrogs.
India was a major frog exporter starting in the 1950s; however, the wild populations of those animals eventually collapsed, and with fewer predators to feed on insects and other pests, local agriculture started to suffer. It was a problem that prompted India to ban trade in frogs in 1987, and populations have since recovered. But now history may be repeating itself in Indonesia. Using farmed frogs may be a means of taking some pressure off the animals hopping around in the wild, but even that route poses problems: non-native frogs raised on farms can escape and introduce diseases or turn into an invasive species, which is the case with Indian bullfrogs raised in Madagascar. And then there are animal welfare issues (as dramatized on “The Muppet Show”); frogs are sometimes dismembered while still alive.
The study offers a number of ways to make frog leg trade sustainable and to minimize ecological impacts, such as setting export quotas, carefully monitoring wild populations, restricting commercial farming to native species and setting humane standards for the capture and slaughter of the animals. All that said, with so many issues surrounding this food source, would you spring for a plate of frogs’ legs?
April 6, 2011
A few years ago, while driving through rural Washington County, New York—a picturesque area that has attracted retirees and city-weary escapees—I noticed a sign declaring it a “right to farm” area. A city person myself until recently, it struck me as strange that anyone would feel the need to declare such an obvious right, kind of like insisting on the right to practice accounting or teach piano lessons. Clearly, I hadn’t spent a lot of time around farms, or understood the conflicts that can arise when city folk start moving into farm country and imposing their city standards.
Say Old MacDonald had a neighbor. And that neighbor didn’t appreciate the constant “oink oink” here and “moo moo” there coming from Old MacDonald’s farm—not to mention the wafting chemicals, noisy machinery operated at all hours and the ever-present stink of animal flatulence.
Assuming the farm was there first, that neighbor had better get used to it. Since the 1970s, all 50 states have enacted some version of “right to farm” statutes, which protect farmers from being considered a nuisance by new neighbors if they weren’t a nuisance before. Some areas (like the one where I saw the sign) have also enacted local ordinances. Although they vary slightly from place to place, they share a motivation: to help preserve farmland in the face of encroaching suburbia. Before the statutes, some farms were forced to shut or change their operations, or spend large sums defending themselves against lawsuits. As the bumper stickers say, No Farms No Food.
But some people think the laws go too far. Idaho is considering a stronger version of its right to farm law that critics say favors big agribusiness and could support environmentally damaging practices. A small-scale hay farmer quoted in the Idaho Press-Tribune called it a “right to pollute” act, saying, “it does nothing to protect small family farmers.” Others complained that it prevents neighbors from seeking recourse when a farm expands or begins offensive practices that make their homes unlivable—as happened to one family who said they could no longer stomach their tap water after a neighboring farm began dumping onions near their water source.
Supporters of the bill, including the newspaper’s editorial board, say that farming is a vital industry and should take precedence over the sensibilities of neighbors. “Cow poop stinks, folks,” the editorial asserts. “Tractors make noise. Expect to smell and hear them if you live near agricultural land. It’s not reasonable to expect otherwise.”
Lately, a new development has flipped the scenario: what happens when it’s farmers encroaching on urban areas? With the advent of the urban farming movement, the culture clash is occasionally going the other way. Many cities have enacted livestock bans; to some people, pre-dawn rooster crowing and barn smells are more offensive than car alarms and rotting garbage.
Novella Carpenter, whose book Farm City describes how she raised veggies and animals on squatted property in her scruffy Oakland, California, neighborhood, recently ran into zoning trouble, according to the San Francisco Chronicle. She now owns the property and sells some of her surplus produce, but a neighbor who didn’t care for her raising rabbits turned her in for operating without a permit. The permit would probably cost more than the couple thousand dollars she makes as an urban farmer.
“Why am I even trying? Why not just move to the country and do whatever I want?” Carpenter wrote on her blog, before answering her own questions. “I’ll tell you why: I love Oakland…. And, at the same time, I love keeping animals and growing vegetables.”
March 29, 2011
Art and culture flourished throughout Europe during the Renaissance. It was the period when Michelangelo wielded his chisel, Galileo defied preconceived notions about the universe and William Shakespeare penned some of the most enduring dramatic works. It was also a period that saw the evolution of manners, as the article “Mind Your Manners” in the Spring 2011 issue of Folger magazine will attest. Manners were a response to the violence and crude behaviors running rampant in burgeoning cities, and a means of reinforcing social order and distinguishing the privileged class from everyone else. A first generation of Miss Manners-es—typically men—took up the quill. And the newly defined codes of conduct were especially important at the dinner table.
Italy more or less led the cultural revolution, table manners included. Italian poet Giovanni della Casa advised in “Galateo,” his 1558 book on manners: “One should not comb his hair nor wash his hands in public… The exception to this is the washing of the hands when done before sitting down to dinner, for then it should be done in full sight of others, even if you do not need to wash them at all, so that whoever dips into the same bowl as you will be certain of your cleanliness.” To the modern reader, these attitudes toward public displays of personal cleanliness might seem a little over the top; however, considering that one’s hands were also one’s dining utensils, this sort of advice was of utmost importance. In his study on the social customs of this period, sociologist Norbert Elias noted that “In good society one does not put both hands into the dish. It is most refined to use only three fingers of the hand. … Forks scarcely exist, or at most for taking meat from the dish.”
That’s right: no forks. They were initially viewed as excessively refined or, in the case of men, a sign of effeminacy. The newfangled fork custom began in Italy and was a hit there, but forks were slow to catch on in Northern Europe. The use of forks to get food from plate to mouth didn’t gain wide acceptance until the 17th century—and even then, only the well-to-do could afford them.
Utensils such as spoons were communally used—making the etiquette of eating soups a delicate matter. “If what is given is rather fluid,” Dutch theologian Erasmus of Rotterdam writes, “take it on a spoon for tasting and return the spoon after wiping it on a napkin.”
But even amid efforts to polish social customs, some human behaviors were deemed permissible at the dinner table. On farting, Erasmus writes, “If it is possible to withdraw, it should be done alone. But if not, in accordance with the ancient proverb, let a cough hide the sound.” Slick, no? However, lest you follow this example, modern manners maven Miss Conduct says that “civilized folk will protect others from any sounds or smells that may be displeasing.”
This is not to say that all Renaissance manners are outdated. On respecting fellow diners’ personal space, Giovanni della Casa says, “It is also an unsuitable habit to put one’s nose over someone else’s glass of wine or food to smell it.” And again, from Erasmus: “It is rude to offer someone what you have half eaten yourself; it is boorish to redip half-eaten bread into the soup.” Anyone remember the “did you just double dip that chip” episode of Seinfeld? George Costanza was definitely a couple hundred years behind the etiquette curve. Even modern science shows that re-dipping partially eaten foods is a great means of spreading bacteria. It certainly gives you an idea of what Renaissance society was trying to improve upon—and how far we’ve come since.
December 29, 2010
Every December, the Salvation Army deploys bell-ringers to shopping areas to collect donations for the needy, acting as jingling reminders that not everyone has a roof over his head or food in her belly, much less gifts under the tree.
The ringers’ iconic red collection kettles, which represent soup pots, have been a tradition since 1891. That was the year, according to the Salvation Army, that Joseph McFee brainstormed an idea to fund a Christmas dinner for the destitute in San Francisco. Recalling his sailor days, McFee thought of the port in Liverpool, where passersby would toss coins for the poor into a kettle called “Simpson’s Pot.” He put out a similar pot by the Oakland ferry landing on Market Street, along with a sign reading, “Keep the pot boiling,” and soon had enough to feed 1,000 people dinner.
It’s no coincidence that a soup kettle was the symbol for feeding the poor, rather than, say, a roasting pan or a skillet. Soup has always been one of the most economical ways to provide nourishing, filling food to a large quantity of people. Although he was hardly the first person to come up with the idea to feed the poor, an interesting fellow known as Count Rumford is often credited with establishing the first real soup kitchen.
Born Benjamin Thompson in Woburn, Massachusetts, in 1753, he fled to Britain during the American Revolution, having been accused of being loyal to the crown. He went on to have a brilliant career as a scientist, social reformer and inventor. His work for the Bavarian government earned him the title of Count of the Holy Roman Empire, and he chose Rumford, the New Hampshire town where he lived for a time, as the place he was from (the full name was Benjamin Count von Rumford).
His biggest project may have been his plan to rid Munich of its beggar problem by feeding—and, more pointedly, employing—the poor. According to the handbook he wrote for other cities to emulate, “mendicity” was epidemic there—“In short, these detestable vermin swarmed everywhere,” he wrote. He was speaking specifically of those able-bodied cadgers who would send out scuffed-up children to prey on public sympathy, and who had developed an elaborate system of mooching food from merchants, which they would then sell to other shopkeepers at a profit.
After sending out troops to roust the beggars, Rumford established workhouses, where poor people, including children, were employed to make military uniforms. Those who were too weak, young or awkward to do more strenuous work were given the easier tasks of carding wool or spooling yarn. The youngest children were to sit in chairs in the workroom, where, it was hoped, boredom would entice them to prefer work. Children attended an on-premises school before and after work and, Rumford noted, were also given the opportunity to recreate and play.
“At the hour of dinner,” Rumford wrote, “a large bell was rung in the court, when those at work in the different parts of the building repaired to the dining-hall; where they found a wholesome and nourishing repast.” This consisted of “a very rich soup of peas and barley, mixed with cuttings of fine white bread; and a piece of excellent rye bread, weighing seven ounces, which last they commonly put in their pockets, and carried home for their supper.”
Rumford was also an early proponent of the potato as good, cheap and filling food, though this New World ingredient was still viewed with suspicion by many Europeans.
Although some of his methods (like child labor) wouldn’t necessarily mesh with today’s sensibilities, the basic concept of Rumford’s program set the groundwork for the last century’s soup kitchens. And through his many scientific innovations, he developed tools that improved cooking for everyone, poor or not, including the cast-iron Rumford stove (the first commercially available kitchen range), which kept in heat and allowed temperature to be regulated better than on an open hearth; a pressure cooker (though not necessarily the first one); and a drip coffee maker.
But the item bearing Rumford’s name that is probably most familiar to cooks today wasn’t actually his invention: a brand of baking powder was named in his honor.
October 29, 2010
Samira Kawash writes the blog “Candy Professor” and is working on a book about the cultural and social history of candy in twentieth-century America. She spoke to Smithsonian’s Amanda Bensen about Americans’ tricky relationship with treats.
Amanda: At this time of year, even people who don’t eat a lot of sweets are stocking up. When did our obsession with Halloween candy start?
Samira: It surprised me to discover that Halloween was not a candy holiday until well into the 1950s. If you go back to the ‘teens and ‘twenties, and look at what the candy companies were making in terms of holidays, Christmas was a big one, Easter was a big one, but Halloween wasn’t even on their radar. There’s no sign of trick-or-treating at all until the 1930s and it really wasn’t until the late 1940s that it became widespread. Even then, kids might have gotten a homemade cookie, a piece of cake, money, or a toy. There really wasn’t a sense that it was all about candy.
So what was Halloween about, if not candy?
Up until before World War II, Americans had Halloween parties that might have involved some of what we do today, like costumes and games, but it was more of a harvest festival than a spooky thing. Candy that was made and sold especially for Halloween appeared in the 1930s, but it was something you’d have in a bowl at your party, not the main focus.
The trick-or-treat giveaway was pretty flexible in the 1950s and 1960s. Candy was becoming more important. At the same time, the door was open to other kinds of treats. No one objected to unwrapped or homemade things like cookies and nuts. Kool-Aid’s Halloween ads suggested that kids would come in for a refreshing glass of soft drink. And Kellogg’s advertised cereal Snack-Packs for trick-or-treating.
Cereal, huh? Not sure that would pass muster with trick-or-treaters anymore.
I know—here’s a box of corn flakes, kids, happy Halloween! (Laughs.) But you know, when they did get candy, it was often a full-sized portion, not the mini ones we have today. For example, Brach’s was packaging candy corn for trick or treat in the 1960s, and the 5-cent package was the typical size. This was a pouch with 40 or 50 pieces of candy corn. Today you get just 6 or 8 little pieces in a tiny “treat” size pouch.
Did kids back then get the kinds of massive hauls of candy many now get at Halloween?
It’s hard to say, but my sense is that trick-or-treaters in the 1950s, especially younger kids, were more likely to go into someone’s house and have some punch and visit for a while. The newspaper women’s pages had a lot of ideas for entertaining trick-or-treaters with party refreshments and games, and it is clear that these were frequently strangers’ kids. Some of the social interaction of trick-or-treating has since disappeared; I hear a lot of adults complain that kids now don’t even bother to say thank you. Kids going door-to-door today are just a lot more efficient at covering ground, so they fill up their treat bags much faster.
So what happened to make candy so central to the holiday?
Definitely marketing. Starting in the 1950s, big candy manufacturers started putting out a lot more Halloween promotions. But candy also was viewed in the 1950s and 1960s as a more acceptable treat. Kids, of course, really like it. And convenience was probably a big factor for the women who were handing out the treats. Candy was pre-packaged and portioned—if you bake cookies or make popcorn balls you have to wrap them, you know.
Also, in the 1970s, there was the emergence of the myth of the Halloween sadist; the idea that there are people out there who are going to poison the popcorn balls, put razors in the apples, etc. Anything that wasn’t factory-sealed wasn’t considered safe. We didn’t trust the handmade, the unmarked or unbranded. Which is hugely ironic, because in the early 20th century it was the factory-made candy that was viewed as suspicious when it was first introduced!
Even though it’s since been established that the Halloween sadist was an urban legend, there was a sense of loss of small-townness in that era of suburbanization. The neighbors were strangers for the first time. Fear of the neighbors’ candy sort of captured that sense of loss of community.
Tell me about yourself. How did you become the so-called Candy Professor? Is this a lifelong interest?
I have a Ph.D. in cultural studies and literary criticism, so I’ve always been interested in interpreting culture and everyday life. I was a professor at Rutgers University for many years, first in the English department, and later in Women’s Studies. After I decided to leave the university, I was looking for a new research project that would connect with my interests and also be fun and engaging for a broader non-academic audience.
At the time, I was a new mother with a little girl. One day she wanted a lollipop. Should I give it to her? That turned out to be a very difficult question. Should a kid have candy? How much? How often? The more I thought about it, the more I realized that candy was pretty complicated. It has such powerful emotional associations, especially with childhood. Even the words we use to talk about eating candy, like “temptation” and “guilty pleasure,” carry that weight. I got interested in trying to understand the meanings of candy and the uses of candy, and what that tells us about ourselves.
I have been researching the history of candy in American culture, and it turns out that ideas we have about candy today are deeply connected to the past. I’m also discovering that what candy means in different contexts has to do with many different ideas in our culture about food, health and medicine—ideas about what’s good for you, what’s harmful, and what’s pleasurable.
Hmm, I don’t think most of us associate candy with medicine these days.
Right, but the first candies were medicinal! An apothecary in the 18th century would prescribe you sugar candy for things like chest ailments or digestion problems. Back then, the “spoonful of sugar” idea was literal—if you had some sort of unpleasant medicine to take, usually a concoction of herbs that might not taste very good, the apothecary would suspend it in sugar.
It wasn’t until the 19th century that the apothecary and confectionery started becoming separate professions. Candy of the sort that you might recognize today really took off after the Civil War, after the price of sugar had fallen. And then the new industrial machines of the late 19th and early 20th centuries made it possible to produce candy in a whole new way.
Actually, the first candy-making machine was invented by a pharmacist, Oliver Chase, in 1847, to crank out medicated candy lozenges. I think that the idea of candy as medicine still lingers in the way we’re aware of its effect on our bodies. We think it must cause your blood sugar to rise, cause cavities, or make you hyperactive…and it’s true that candy can do all of those things, but so can other things you eat, like a big bowl of noodles!
Medicine and poison are always very close together: The thing that heals you, if you have too much of it, can harm you. So there’s a sort of subconscious anxiety about candy. There’s still this notion that candy somehow soothes, ameliorates pain—you get a lollipop at the doctor’s office, although it’s probably sugar-free these days. And just go to the drugstore and look at the gummy vitamins, sugary cough remedies, chocolate laxatives, etc. Candy looks like the opposite of medicine, but it turns out that a lot of the ways we think about candy’s dangers are closely related to the idea of candy as a kind of drug.
Have the types of candy we like changed over the years?
Chocolate has become more central, and I think that has to do with the idea we have that it is the most luxurious, decadent flavor ever. If you go back to the early 1900s, chocolate was not as ubiquitous, but now there’s a sense that somehow chocolate is better, more adult, than sugar candy. And now the National Confectioners Association survey of kids’ preferences finds the most favored trick-or-treating candy is chocolate.
What strikes you as interesting about our current attitudes toward Halloween candy?
There’s this weird ballet of Halloween now, where families buy a bunch of candy to give away to other kids, but then they take the candy their own kids are given and either throw it away or give it to someone else. So there’s all this candy circulating, but it’s not clear that anyone’s eating it!
From what I’ve seen, trick-or-treating is sort of hyper-controlled by parents. I saw some bit of advice on TV that parents should put candy in their children’s pockets before going out, so they won’t be tempted to eat the candy they get from others—such a strange idea, that you can eat candy, but only the “safe” candy from home.
Do you think we’ve villainized candy too much?
Yes. We treat candy as being so powerful that we try to protect ourselves from it in these almost magical ways. Let’s go back to the lollipop I was debating offering my daughter: it has less sugar in it than a juice box. So it surprised me a little that a lot of moms that I knew seemed happy if their kids drank apple juice, but worried if they wanted candy. There was something about not just the sugar, but the form of sugar as candy, that seemed to make it especially troubling.
I think that candy becomes a place to put a lot of our anxieties and worries about food, because candy’s at the very edge of food. When you go to the supermarket and you’re surrounded by these things in boxes that have 20 ingredients, it gets confusing. It’s handy to say: That’s NOT food, that’s candy. This breakfast bar, on the other hand, that IS food.
There are so many of these processed, food-like substances, and we want to know where to draw the bright line at what’s wholesome and nutritious, so we use candy that way—even though when you look closely, there is no bright line.
So, back to the lollipop. Do you let your daughter go trick-or-treating, and eat candy?
My daughter is 7 now, and Halloween is her favorite holiday. We live in Brooklyn so it’s a little different, but we go out and take candy, and we give it out. She loves it. One of the things I struggle with as a parent is, how can we have a healthy relationship with candy? I think saying, “it’s a bad thing, you can never have it” is a sure way to create an unhealthy obsession. So I’ve been trying to figure out how to teach that candy is something nice, something I like, but I don’t have to eat it all at once. I think that’s a nice way to experience Halloween.
Do you have a favorite candy yourself?
This time of year, I cannot resist candy corn. I have the biggest candy corn problem. I eat one, pretty soon the bag is gone, and I’m like…what have I done?