October 24, 2013
The avocado is a fruit of a different time. The plant hit its evolutionary prime during the beginning of the Cenozoic era when megafauna, including mammoths, horses, gomphotheres and giant ground sloths (some of them weighing more than a UPS truck) roamed across North America, from Oregon to the panhandle of Florida. The fruit attracted these very large animals (megafauna by definition weigh at least 100 pounds) that would then eat it whole, travel far distances and defecate, leaving the seed to grow in a new place. That’s the goal of all botanical fruits, really. Survival and growth via seed dispersal.
But the great mammals disappeared forever about 13,000 years ago in the Western Hemisphere. Around that time, North America lost 68 percent of its diverse Pleistocene megafauna, and South America lost 80 percent, says Connie Barlow, author of The Ghosts of Evolution: Nonsensical Fruit, Missing Partners, and Other Ecological Anachronisms. But even after this major shift in the land mammal population, the wild avocado still requires the same method of seed dispersal, which makes it something of an evolutionary anachronism.
“After 13,000 years, the avocado is clueless that the great mammals are gone,” Barlow explains. “Without larger mammals like the ground sloth to carry the seed far distances, the avocado seeds would rot where they’ve fallen and must compete with the parent tree for light and growth.”
A fruit with smaller seeds, like a berry, for example, can be consumed whole and dispersed by small mammals, making the chances of fruiting in a new place higher.
After the giant mammals died out, if an avocado tree was lucky, a jaguar might have found the fruit attractive—the cat’s stomach is designed for digesting large hunks of meat, leaving potential for swallowing the avocado whole, though there is no evidence to support this idea. Rodents like squirrels and mice may have also contributed, as they traveled and buried seeds in the ground rather than letting them rot on the surface. Wild avocados appealed to larger animals because they had enough tasty flesh to lure them in and could be eaten in one bite. The fruit had a larger pit and less flesh than today’s avocados, but it served as a quick snack for big mammals like the mammoth. Barlow writes in “Haunting the Wild Avocado,” originally published in Biodiversity:
The identities of the dispersers shifted every few million years, but from an avocado’s perspective, a big mouth is a big mouth and a friendly gut is a friendly gut. The passage of a trifling 13,000 years (since the Pleistocene extinction) is too soon to exhaust the patience of genus Persea. The genes that shape fruits ideal for megafauna retain a powerful memory of an extraordinary mutualistic relationship.
How the avocado still exists in the wild after losing its dispersal partners remains a puzzle. But once Homo sapiens evolved to the point of cultivating the species, the fruit had the chance to thrive anew. Back when the giant beasts roamed the earth, the avocado would have been a large seed with a small fleshy area—less attractive to smaller mammals such as ourselves. Through cultivation, humans have bulked up avocados so there is more flesh for us to eat.
The avocado has been a staple food in Mexico, as well as Central and South America, since 500 B.C. Spanish conquistadors learned of the fruit from the Aztecs in the 16th century, but the ahuacate, the Aztec word for “avocado,” wasn’t grown commercially in the United States until the turn of the 20th century. By 1914, the exotic fruit had made an appearance on California soil. Roughly 90 percent of today’s avocados are grown in California, according to NPR. But Barlow is quick to point out the difference between a cultivated avocado and those found naturally.
“The wild varieties of avocados that are still somewhat available have a thin fleshy area around the seed—it wouldn’t necessarily be something that we would recognize as edible,” says Barlow. “When we go to the store and we see an avocado on sale, it’s always a question of will this be one with a tiny seed, or will it be a batch where the seed takes up five-sixths of the space of the fruit?”
Ecologist Dan Janzen conducted groundbreaking research on these and other “anachronistic fruits” and found that the avocado isn’t alone in this regard. His research in the late ’70s in the neotropics—an ecozone that includes both Americas and the entire South American temperate zone—sparked a shift in ecological thinking about these evolutionarily stunted fruits. Other examples include papaya, cherimoya, sapote and countless other fleshy fruits of the neotropics. Another surprising “ghost” you may see every day: honey locust pods scattered about your driveway. By today’s standards, none of these fruits is considered edible by most native mammals. Barlow continues:
“In 1977, however, [Janzen] was beginning to suspect that he—along with every other ecologist working with large tropical fruits of the New World—had been wrong in one very big way. They all had failed to see that some fruits are adapted primarily for animals that have been extinct for 13,000 years.”
“We don’t have the liver or the enzyme systems to detoxify our bodies from something like the avocado seed,” Barlow says. “But at the same time, the rhino, which has been around for ages, can eat all kinds of things that are toxic to everyone else.”
A South American folk recipe for rat poison mixes avocado pits with cheese or lard to kill off unwanted rodents. Whether or not humans are supposed to eat avocados from an evolutionary standpoint, America produced 226,450 tons of the fruit and consumed 4.5 pounds per capita in 2011. The avocado, a true “ghost of evolution,” lives on.
More avocado facts to drop at your next party:
- The Aztec word for avocado, ahuacatl, means “testicle.” This is most likely because the avocado, growing in pairs, resembled the body part. After the arrival of Spanish conquistadors, Spanish speakers substituted the form avocado for the Aztec (Nahuatl) word because ahuacatl sounded like the early Spanish word avocado (now abogado), meaning “lawyer.”
- The Spanish-Mexican word “guacamole” was derived from ahuacamolli, meaning “avocado soup or sauce,” made from mashed avocados, chiles, onions and tomatoes.
- For reasons related to the word’s origin, the avocado is also considered an aphrodisiac. According to the book The Aphrodisiac Encyclopaedia, by the time the fruit traveled to Europe, the Sun King (Louis XIV) nicknamed avocados la bonne poire (the good pear) because he believed it restored his lagging libido.
- The Hass variety of avocado was named after a postal employee, Rudolph Hass, who purchased the seedling in 1926 from a California farmer.
- For more information regarding other “ghosts of evolution,” Barlow’s theme song is a great listen.
October 21, 2013
It was the second day of autumn term at a small boys’ school in South London in 1979. Without warning, 78 schoolboys and a handful of monitors simultaneously fell ill. Symptoms included vomiting, diarrhea, abdominal pain and, in severe cases, depression of the central nervous system. Several patients were comatose with episodes of convulsive twitching and violent fits of fever. In many patients, there were signs of peripheral circulatory collapse. Within five days of the initial outbreak, all patients recovered in full, though some hallucinated for several days, Mary McMillan and J.C. Thompson report in the Quarterly Journal of Medicine. But what could cause such a sudden and mysterious illness?
Turns out, a bag of potatoes left in storage from the previous summer term.
After careful analysis of the sequence of events, the onset of symptoms was pinpointed to about four to 14 hours after the boys had eaten boiled potatoes with a high concentration of solanine, a glycoalkaloid toxin first isolated in 1820 in the berries of the European black nightshade. Nightshade is the term used to describe over 2,800 species of plants in the scientific family Solanaceae. Eggplants, tomatoes and some berries are common members of the nightshade family—and many of them contain highly toxic alkaloids.
That said, the potato is the most common cause of solanine poisoning in humans. But how do you know when solanine is present in a potato? The tuber turns green.
Though the green color that forms on the skin of a potato is actually chlorophyll, which isn’t toxic at all (it’s the plant’s response to light exposure), its presence indicates concentrations of solanine. The nerve toxin is produced in the green parts of the potato (the leaves, the stem and any green spots on the skin). Why does it exist? It’s part of the plant’s defense against insects, disease and other predators.
If you eat enough of the green stuff, it can cause vomiting, diarrhea, headaches and paralysis of the central nervous system (as evidenced by the incident above), and in rare cases the poisoning can cause coma—even death. Studies have recorded illnesses caused by 30 to 50 mg of solanine per 100 grams of potato, but symptoms vary depending on the ratio of toxin to body weight and the individual’s tolerance of the alkaloid. The following cases, recorded in various medical journals, include some of the most severe examples of solanine poisoning (several of which resulted in death):
1899: After eating cooked potatoes containing 0.24 mg of solanine per gram of potato, 56 German soldiers experienced solanine poisoning. Though all recovered, in a few cases, jaundice and partial paralysis were observed.
1918: In Glasgow, Scotland, 61 people from 18 separate households were affected at once by a bad batch of potatoes. The following day, a five-year-old boy died of strangulation of the bowel following extreme retching and vomiting. According to “An Investigation of Solanine Poisoning” by S. G. Willimott, PhD, B.Sc., published in 1933, the case was investigated by scientists R. W. Harris and T. Cockburn, who concluded in their article, “Alleged Poisoning By Potatoes” (1918), that the poisoning was the result of eating potatoes which contained five or six times the amount of solanine found in normal potatoes. Willimott cites this particular occurrence as an example of the toxin’s prevalence: “A review of the literature reveals the fact that authentic cases of solanine poisoning are not so rare as authorities appear to believe.”
1925: Seven members of a family were poisoned by greened potatoes. Two of them died. According to reports, symptoms included vomiting and extreme exhaustion, but no convulsions like those of the schoolboys in London. Breathing was rapid and labored until consciousness was lost a few hours before death.
1948: A case of solanine poisoning involving the potato’s nightshade relative, the berry, was recorded in the article “A Fatal Case of Solanine Poisoning,” published in the British Medical Journal. On August 13 of that year, a 9-year-old girl with a bad habit of snacking on the berries that grew along the railroad tracks by her house was admitted to the hospital with symptoms of vomiting, abdominal pain and distressed breathing. She died two days later. An autopsy found hemorrhages in the mucosa of the stomach and the middle section of her small intestine. The stomach contained about one pint of dark brown fluid.
1952: According to the British Medical Journal, solanine poisoning is most common during times of food shortage. In the face of starvation, there have been accounts of large groups eating older potatoes with a higher concentration of the toxin. In North Korea during the war years of 1952–1953, entire communities were forced to eat rotting potatoes. In one area alone, 382 people were affected, of whom 52 were hospitalized and 22 died. The most severe cases died of heart failure within 24 hours of potato consumption. Some of the less severe symptoms included irregular pulses, enlargement of the heart, and blue lips and ears. Those who displayed these ailments died within five or ten days. Authors John Emsley and Peter Fell explain in their book Was It Something You Ate?: Food Intolerance: What Causes It and How to Avoid It: “In the final stages [of the illness] there was sometimes a state of high excitability with shaking attacks and death was due to respiratory failure.”
1983: Sixty-one of 109 schoolchildren and staff in Alberta, Canada, fell ill within five minutes of eating baked potatoes. Forty-four percent of those affected noted a green tinge and a bitter taste in the potatoes.
Not to worry, though: fatal cases of solanine poisoning are very rare these days. Most commercial varieties of potatoes are screened for solanine, but any potato will build up the toxin to dangerous levels if exposed to light or stored improperly. Often, the highest concentrations of solanine are in the peel, just below the surface and in the sprouted “eyes”—parts that are typically removed in cooking preparation—though Warren would argue that even the boiling water used in potato prep dissolves only a little of the alkaloid. Emsley and Fell continue:
Most people can easily cope with the solanine in the average portion of potato and show no symptoms of poisoning because the body can break it down rapidly and excrete the products in the urine. But if the level of solanine is as high as 40 mg per 100 g of potato, symptoms include diarrhea…even coma.
The best way to prevent solanine poisoning is to store tubers in a cool, dark place and remove the skin before consumption. A general rule for avoiding illnesses like the ones described above? Green and sprouted? Throw it out.
October 11, 2013
For the privileged eaters of the Western world, so much of eating is done routinely: cereal for breakfast, a sandwich for lunch, probably a protein and vegetable for dinner. Sometimes, the act of eating is so second nature that the guidelines that dictate how and when we eat are invisible—guidelines such as eating a steak for dinner but not for breakfast, or eating lunch in the middle of the day. Eating wasn’t always dictated by these rules—so why is it now? That’s the question that food historian Abigail Carroll set out to answer in her new book, Three Squares: The Invention of the American Meal. Tracing the meal’s history from colonial America to the present day, Carroll explores why we eat cereal for breakfast, how dinner became American and how revisiting the history of our meal can have a positive impact on the future of eating. Carroll spoke with Smithsonian.com about the guidelines that control our dining.
How did the associations between certain meals and certain foods, like cereal for breakfast, form?
You start in the very early colonial era with one meal in the middle of the day—and it’s the hot meal of the day, dinner. Farmers and laborers ate earlier because they were up really early, and the elite were eating later in the day because they could sleep in. Breakfast and supper were kind of like glorified snacks, often leftovers or cornmeal mush, and there was not a lot of emphasis placed on these meals. Dinner, the main meal, at which people did tend to sit down together and eat, was really not the kind of social event that it has become. People did not emphasize manners, they did not emphasize conversation, and if conversation did take place it wasn’t very formal: it was really about eating and refueling. That’s the time where there are very blurry lines between what is and what isn’t a meal, and very blurry lines between what is breakfast, dinner and lunch.
Then, with the Industrial Revolution, everything changed, because people’s work schedules changed drastically. People were moving from the agrarian lifestyle to an urban, factory-driven lifestyle, and weren’t able to go home in the middle of the day. Instead, they could all come home and have dinner together, so that meal becomes special. And that’s when manners become very important, and protocol and formality. It’s really around then that people start to associate specific foods with certain meals.
Then, with dinner shifting you have the vacuum in the middle of the day that lunch is invented to fill. People are bringing pie for lunch, they’re bringing biscuits, but the sandwich really lends itself to lunch well. So the popularity of the sandwich really does have something to do with the rise of lunch—and especially the rise of children’s lunch, because it’s not messy. You don’t need utensils, you don’t have to clean up—you can stick it in a lunch pail really easily.
Why is it acceptable to eat cereal and eggs and a waffle for breakfast, but not for lunch or dinner? How did breakfast go from being a necessity meal—fueled by leftovers—to a meal with clear guidelines for what is acceptable to eat?
There was a problem during the Industrial Revolution: people were still eating a farmer’s diet, but they were shifting to a more sedentary lifestyle, which caused indigestion. People who were interested in health started looking into that and coming up with solutions. Sylvester Graham, the reformer who became a preacher of health ideology, advocated vegetarian food and whole wheat as a kind of panacea for health problems, which became the answer to the question of breakfast. Then people who ran sanitariums, including John Harvey Kellogg, in the late 1800s and early 1900s, really took that idea and ran with it, inventing new ways to eat farinaceous [starchy] foods.
Entrepreneurs—some of whom worked in the sanitariums, like C.W. Post—really build on these ideas and market them as a healthy requirement. Post creates all sorts of crazy testimonies that serve as advertisements for Grape-Nuts, in which people’s lives are saved from chronic illness and they’re able to walk again.
Then, there’s also the history of orange juice and milk, with the discovery of vitamins in the 1910s. Milk came to be seen as a super food, and something that would keep you from getting deficiency diseases. It shows up at other meals too, but for much of the 20th century, it’s not a complete meal unless you have milk.
Why is it that, in America, we have maintained the feeling that lunch needs to be a quick meal in the middle of the day?
We still are working a lot—we’re working more hours in the United States than in any other industrialized nation. Lunch is the original quick meal; it accommodated changing work schedules.
And dinner has taken on the ideological weight of the meal. Dinner has been the time when we celebrate family, and when we concentrate on having a nice, hot meal, ideally. Because dinner fulfilled that need, there was less of a need for the other meals to. Lunch doesn’t have a lot of cultural work to do; it just has to get us by.
But, if you think about it, it’s not just lunch—it’s breakfast too. We can just pour milk over cereal, or pop some toast in the toaster and walk out the door without even needing a plate or utensils. Breakfast accommodates work. It’s not the meal that shapes work, it’s the work that shapes the meal.
Could you talk about how dinner became a particularly American institution?
Dinner was not initially a strong identifying factor, in terms of nationality, for colonists. At first, they were eating more or less peasant food, porridges brought from England that said more about class than nationality. Then, dinner shifts in the 1700s to become an identifying factor in terms of being English. They’re in this new world, seen as primitive, and so they feel that they have to compensate for that. They inherit the fashions that cross the ocean, like eating a roast with dinner.
In the nineteenth century, the emerging middle class identifies itself through French food and French ways of eating. Things that we take for granted now, like starting a meal with soup or having a salad, were really French concepts. Dessert was largely a French concept, and many of the desserts that we adopted in the 19th century were French desserts. For the Victorian middle class, eating in the French way was a way to imitate the elite.
With the decline of servants in the late 1800s, people just couldn’t keep that up. Then there are the [World] Wars and the Depression, and those require Americans to be frugal. But they don’t just require Americans to be frugal—they give Americans the opportunity to celebrate frugality as patriotic. To eat frugally, to have a Victory Garden and can your own foods is patriotic. The model for dinner is no longer the French multicourse formal meal, but Thanksgiving. Thanksgiving becomes the model for the everyday American dinner. Of course, you don’t eat a whole roast every night, but the idea is that you have “a chicken in every pot,” which was Herbert Hoover’s 1928 campaign slogan. You would have some kind of meat on the table.
Are there any dishes or foods that you would classify as typically, or even exclusively, “American?”
A number of iconic foods—hot dogs and hamburgers, snack food—are hand-held. They’re novelties associated with entertainment. These are the kinds of food you eat at the ballpark, buy at a fair and eventually eat in your home. I think that there is a pattern there of iconic foods being quick and hand-held that speaks to the pace of American life, and also speaks to freedom. You’re free from the injunctions of Victorian manners and having to eat with a fork and knife and hold them properly, sit at the table and sit up straight and have your napkin properly placed. These foods shirk all that. There’s a sense of independence and a celebration of childhood in some of those foods, and we value that informality, the freedom and the fun that is associated with them.
Along those lines, there’s a lot of pushback against those processed foods today, with people wanting to recall old ways of eating, with eating local and fresh. But, how do you think that knowing the kinds of food that we used to eat and the ways that we used to eat, and think about eating, influences the future of American food?
History can play a really central role in thinking about the way that we want to eat in the future. The evolution of the meal is a process, and it continues.
With all of the talk of food and health, I think a really good question to ask is “Can we actually be healthy without eating meals?” And without even, perhaps, eating a family dinner? Studies show that when we eat together, we always eat better.
The family meal is the opportunity to put to work what we’re talking about. If we’re learning about fresh foods and ingredients, the family meal has potential to be another way of instructing our children and ourselves. There’s an interest in renewing the family meal, even reinventing it. We’re not going to be able to revive a Victorian notion of dining; I don’t think we’re interested in it. If we want to spend time together, if we want to invest in our children, if we want to be healthy, the family meal can be a vehicle for that.
October 7, 2013
One of my favorite pastimes is joining the Sunday morning hordes outside San Francisco’s Ton Kiang, a popular dim sum restaurant in the city’s Outer Richmond neighborhood. So when the opportunity recently arose to visit Hong Kong and not only dine on the bite-size delicacies but actually learn how to make them, I jumped at the chance.
Hong Kong is dim sum’s cultural epicenter and here, the cuisine is king. The name dim sum, which means ‘to touch the heart,’ derives from its roots as a simple snack food offered with tea to the weary travelers of Asia’s Silk Road. Even today, dim sum and tea go hand in hand, and going for dim sum in Hong Kong is known as going for yum cha, which translates to ‘drink tea.’
Cantonese immigrants first introduced dim sum to the U.S. during the mid-1800s, and the cuisine’s varied selection and small, convenient portions eventually caught the attention of Westerners. Still, although there have been about 2,000 types of dim sum since its inception, most dim sum eateries in the States stick to several dozen offerings that appeal mostly to westernized palates and incorporate easy-to-find ingredients, such as sui mai (pork dumplings), wah tip (pot stickers) and ha yeung (crispy shrimp balls). In Hong Kong, however, chefs have the advantage of utilizing a larger variety of tropical vegetables from nearby Asian countries, as well as catering to a clientele that’s grown up on dim sum and tends to be more adventurous in its tastes. This means exotic treats like Sun Tung Lok’s baked sea conch shells, or steamed hairy crab roe with pork dumpling at the InterContinental Hong Kong’s Yan Toh Heen.
For over a decade, the Peninsula Hong Kong has been offering weekday workshops in dim sum making as part of its larger Peninsula Academy, a series of location-specific workshops that range from papier-mâché and Chinese puppet mastery to insights into the region’s contemporary art scene. The hour-and-a-half-long course takes participants behind the scenes of the luxury hotel’s 1920s Shanghai-inspired Spring Moon restaurant and into its industrial kitchen to learn the art of crafting both shrimp and vegetable dumplings. Henry Fong, the Peninsula’s dedicated dim sum chef, has been working in the culinary world for nearly 20 years. He is also the workshop’s teacher and will be leading our group of six in our efforts to mix, roll and wrap restaurant-style cuisine.
With so many dim sum eateries across Hong Kong, it takes an extra something to stand out. To keep his clientele happy—and his creative juices flowing—Fong hits up local farmers markets and specialty stores like the region’s popular City’super on weekends, searching out fresh, new ingredients to incorporate into his menu. He says it’s the endless variety that makes dim sum more interesting to him than other types of cuisine. Though well-versed in creating traditional dim sum favorites like wah tip (pot stickers) and lo mai gai (sticky rice and meats wrapped in lotus leaves), Fong also likes coming up with innovative creations by mixing the conventional with the unusual, such as drumstick-shaped steamed dumplings filled with carrots, spider crab leg and pumpkin; steamed vegetarian dumplings packed with locally grown imperial fungus and topped with gold leaf; and baked crispy buns filled with minced Wagyu beef, onions and black pepper.
As the workshop begins, Fong provides us each with an apron and invites us to gather around a large stainless steel table. He then begins mixing what will be the translucent skin for shrimp dumplings. First, he measures out equal portions of corn starch and high protein powder and pours them into a bowl together, and then adds some boiling water and a small bit of vegetable oil. Next he begins working the mixture with his hands. As he presses, scoops, and turns the mix repetitively it becomes thick and doughy, almost like marzipan. Fong then offers each of us a try.
Once the dough cools, Fong rolls it into a long, thin, rope-like stretch and slices off half-inch pieces, using a blunt stainless steel Chinese cleaver to flatten each into a paper-thin circle. When it’s my turn, Fong shows me how to press down on the flat side of the cleaver with the palm of my hand, turning it as I go. My first attempt at creating a dumpling skin is nearly perfect, though my excitement is short-lived. As it happens, wrapping a shrimp dumpling is not so easy. Fong demonstrates, topping the skin with a teaspoon-size portion of dumpling filler—a blend of finely minced prawn meat, shredded bamboo shoots and chicken powder with some salt, sugar and vegetable oil—and, using two fingers, quickly creates a dozen uniform folds across its top, almost akin to a fan.
“The trick,” he explains through a translator, “is to not let the two sides touch in the middle.” When I’m through, my creation looks more like a shrimp-nado than a dumpling, though it’s still perfectly edible (and delicious), as I find out soon enough. Someone then asks Fong if there are any natural dim sum makers. “Not too many,” he says, laughing. “If there were, I’d be out of a job.”
For the next 45 minutes we continue honing our shrimp dumpling skills, and also give vegetable dumplings (easier to fold because they require less dexterity) a go. Once we’re through, Fong steams them all on a stove top. After five minutes, they’re ready to eat. Along with our own creations, Fong also treats us to plates of roast pork buns, custard balls, and—for the group’s vegetarians—mushroom dumplings. He then offers each of us a cup of jasmine tea.
We are weary travelers, after all.
Where to get delicious dim sum in the States? Fong offers his recommendations for a range of price levels:
Less expensive: “The food is good quality and comparable to dim sum in Asia,” says Fong.
365 Gellert Blvd
Daly City, CA
Moderately expensive: “There is a great variety of dim sum,” says Fong, “and the choices are the same as what we offer in most restaurants in Hong Kong.”
14 Elizabeth Street
New York City, NY
Most expensive: “Every dim sum dish is hand-made with the finest seasonal ingredients and the taste is authentic,” says Fong. “Also, the food presentation is outstanding.”
529 Hudson Street
New York City, NY
October 1, 2013
Pizza has come a long way since the 18th century. This winning combination of bread, tomato and cheese, which food writer Alan Richman dubbed the “perfect food,” is said to have originated in Naples, but today it claims admirers the world over, inspiring endless variations, effusive odes and even, in Philadelphia, a pizza museum. It was only a matter of time before the humble pizza pie got the fine art treatment.
“PIZZA TIME!,” the inaugural show of Manhattan’s Marlborough Broome Street Gallery, features more than 25 works of pizza-inspired art. It’s a playful take on pizza as food, as consumer brand, as cultural icon and, perhaps most importantly, as common denominator. Curator Vera Neykov calls pizza a “metaphor for community,” something that is “not too fussy” and brings people together.
That sense of community animates John Riepenhoff‘s conceptual piece, “Physical Pizza Networking Theory,” which debuted on opening night as a 38-inch pizza topped with miniature pizzas. Riepenhoff hired a local pizzeria to cook up the largest pie its oven could hold and then custom-built the box in which the pizza was delivered. On opening night, visitors were invited to dig into this edible artwork, leaving an empty pizza box in the gallery. Riepenhoff describes the work as a recursive “collage” that “address[es] the ontology of the social as material in art,” and Neykov was struck by its temporality, as visitors came, saw and ate the artwork—“there it was and now it’s gone.”
Michelle Devereux’s “Caveman on Pizza” and “Dude on Pizza #6” couple pizza with other pop culture icons. The irreverent colored-pencil drawings imagine a Tron-like grid world and hovering pizza crafts topped with a surfing Neanderthal and a reclining “dude.” In “Dude,” pastel dinosaurs cavort before an airbrushed aurora borealis, while in “Caveman,” the Bat-signal looms over the cityscape in the background.
Other works are more evocative. Andrew Kuo’s “Slice 8/23/13” and “Piece/Peace” render the pizza’s familiar triangular form in geometric shards and colorful smears, respectively. Will Boone’s “Brothers Pizza” series shows the spooky outcome of photocopying a pizza; these images feature red pockmarks, presumably pepperoni, on black backgrounds.
Neykov, who started working on the show last fall, was surprised by how much pizza art is out there. “I feel like this show can be done three more times with completely different artwork,” she says. The variety makes sense to her because pizza is itself a “canvas”: “There are so many different levels, from super cheap sliced pizza to fancy restaurant pizza to frozen pizza to make-it-yourself pizza. You can dress it up or you can dress it down.”
Some of Neykov’s favorites are Oto Gillen’s photographic still life, “untitled, (Vanitas),” and Willem de Kooning’s pencil drawing, “Untitled Circle.” Although it’s unclear whether de Kooning had pizza in mind, Neykov observes that shadowy circles on the work suggest toppings and thin lines seem to cut it into slices.
For Neykov, PIZZA TIME! is not so much a response to foodie culture as it is a reflection of globalized, digitized, mash-up culture generally. Pizza has “come into [popular] culture in a way that people no longer look at it and think it’s absurd,” she says; it’s a product of culture just as worthy of study and artistic exploration as any other. “It may be silly,” Neykov says of the show, “but it’s not dumb.”