September 9, 2013
Whiskey drinkers know that the moment they swirl a bit of the smoky spirit in their mouth, they’re bound to find a world of flavors: some oak, some smoke, a little vanilla, maybe a slight bite from tannin. Brown liquors, from scotch to bourbon and all the whiskeys in between, are complex spirits that lend themselves to purposeful tasting, creating connoisseurs willing to shell out top dollar for the peatiest scotch or their favorite spicy bourbon. The magic of whiskey may come down to chemistry: each spirit carries a chemical fingerprint that separates it from the others and changes the way it tastes.
It’s an idea that the aptly named Tom Collins, a researcher at the University of California, Davis, is actively pursuing. “I worked on my Ph.D., and it was a project looking at aroma and flavor chemistry in wine [fermented] in oak barrels,” Collins explains, crediting the barrels with sparking his initial interest in the chemistry of spirits. “It sort of seemed a natural extension to look from the chemistry of wine to the chemistry of whiskeys, because the chemistry of oak barrels plays a huge role in what you see in whiskeys of all sorts.”
Collins and researchers at Davis set out to see if they could determine the chemical differences among 60 different whiskeys: 38 straight bourbon whiskeys, 10 rye whiskeys, five Tennessee whiskeys and seven other American whiskeys, varying in age from two to 15 years old. What they found was a spectacular testament to the spirit’s complex chemistry: more than 4,000 different non-volatile compounds across the samples, results he presented today at the 246th National Meeting & Exposition of the American Chemical Society. “It’s very complex,” Collins says of the chemistry. “There are components that are barrel derived, as we would expect, but there are also things that are related to the grains that are used to make the distillates in the first place—so the corn and wheat and rye and things that are fermented to form the distillate. We see some components that appear to be grain related, and there are also likely to be components that are derived from the yeast that are used to do the fermentation.”
Of the thousands of chemical compounds Collins found, there was a fair amount of overlap between the different spirits. But Collins found that each spirit contained unique compounds, or unique concentrations of compounds, that he could use to distinguish a scotch from a bourbon, or a Tennessee whiskey from a bourbon, simply by looking at the liquor’s chemistry. “If you try to make sense of all of the components that are there, it’s essentially overwhelming, but if you filter out the things that are not used in Tennessee whiskeys, or things that are only present in some of the bourbons, you can sort of whittle away down to the things that define what a bourbon is or what a Tennessee whiskey is chemically,” Collins said.
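Collins’s winnowing approach is, at heart, a filtering problem: start with every compound detected, then keep only those whose concentrations actually separate one category from another. A toy sketch of the idea in Python; the compound names and concentrations below are invented for illustration, not taken from the study:

```python
# Illustrative sketch of the winnowing idea: keep only the compounds whose
# average concentrations differ sharply between two whiskey categories.
# Compound names and numbers are invented, not from Collins's data.

bourbons = [
    {"syringaldehyde": 8.1, "vanillin": 5.2, "linoleic_acid": 0.9},
    {"syringaldehyde": 7.6, "vanillin": 4.8, "linoleic_acid": 1.1},
]
tennessee = [
    {"syringaldehyde": 7.9, "vanillin": 5.0, "linoleic_acid": 3.8},
    {"syringaldehyde": 8.3, "vanillin": 5.1, "linoleic_acid": 4.2},
]

def mean(samples, compound):
    return sum(s[compound] for s in samples) / len(samples)

def distinguishing(group_a, group_b, ratio=2.0):
    """Compounds whose mean concentration differs by at least `ratio`-fold."""
    markers = {}
    for c in group_a[0]:
        a, b = mean(group_a, c), mean(group_b, c)
        if max(a, b) / min(a, b) >= ratio:
            markers[c] = (a, b)
    return markers

print(distinguishing(bourbons, tennessee))
# prints {'linoleic_acid': (1.0, 4.0)}
```

The shared barrel-derived compounds drop out of the comparison, leaving only the marker that tells the two categories apart, which is the same whittling-away Collins describes, minus the mass spectrometer.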
It might be the perfect answer to that eternal question of novice whiskey drinkers everywhere: what exactly is the difference between a whiskey and a bourbon?
The confusing answer is that bourbon is always whiskey, but not all whiskey is bourbon. This has always been true from a historical and regulatory perspective. Historian Michael Veach spoke with Food and Think in June and dispelled the myths that bourbon has its roots in Bourbon County, Kentucky, and that all bourbons must originate there. “People started asking for ‘that whiskey they sell on Bourbon Street,’” Veach says, “which eventually became ‘that bourbon whiskey.’”
The regulatory distinction presents a slight complication: some Tennessee whiskeys, from a regulatory standpoint, actually qualify as bourbons, but choose not to market themselves as such (Jack Daniel’s, for example, adamantly markets itself as a Tennessee whiskey, even though it meets regulatory standards for being a bourbon). Natalie Wolchover at Live Science outlines the regulatory standards for bourbon:
While bourbon whiskey has its roots in Kentucky, and continues to be primarily produced there, it is now manufactured in distilleries all over the United States. Manufacturers must meet the following requirements in order to advertise their whiskey product as “bourbon”:
It must be produced in the U.S. from a grain mixture (called “mash”) made up of at least 51 percent corn.
It must be distilled to a maximum strength of 160 proof, bottled at a strength of at least 80 proof, and barreled for aging at no more than 125 proof.
It must be aged in new, charred oak barrels.
To qualify as “straight bourbon,” the spirits must meet the above requirements as well as being aged for at least two years and containing no added coloring, flavoring or other spirits.
Many bourbon whiskey distilleries in Kentucky advertise their use of unique water filtered by the limestone shelf in Bourbon County; while this feature may add to the allure of Kentucky bourbon whiskey, the federal trade regulations do not stipulate what water must be used.
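Those quoted requirements amount to a short checklist. As an illustration only (the field names here are made up, and real labeling approval comes from federal regulators, not code), the rules could be sketched as a toy validator:

```python
# Toy validator for the bourbon rules quoted above.
# Field names are invented for illustration; this is not a regulatory tool.

def is_bourbon(spirit):
    return (
        spirit["made_in_us"]
        and spirit["mash_corn_pct"] >= 51      # mash at least 51% corn
        and spirit["distilled_proof"] <= 160   # distilled to at most 160 proof
        and spirit["barreled_proof"] <= 125    # barreled at no more than 125 proof
        and spirit["bottled_proof"] >= 80      # bottled at at least 80 proof
        and spirit["barrel"] == "new charred oak"
    )

def is_straight_bourbon(spirit):
    return (
        is_bourbon(spirit)
        and spirit["years_aged"] >= 2          # aged at least two years
        and not spirit["additives"]            # no added coloring, flavoring, spirits
    )

example = {
    "made_in_us": True, "mash_corn_pct": 70, "distilled_proof": 140,
    "barreled_proof": 120, "bottled_proof": 90, "barrel": "new charred oak",
    "years_aged": 4, "additives": False,
}
print(is_straight_bourbon(example))  # True
```

Run the same checklist against a Tennessee whiskey like Jack Daniel’s and it passes too, which is exactly why the marketing distinction, not the rulebook, is what keeps the two categories apart.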
Collins thinks he might have a more chemically elegant answer to the conundrum. As his team discovered, there are 50 to 100 chemical compounds, such as fatty acids and tannins, that distinguish a Tennessee whiskey from a bourbon so reliably that Collins can tell the difference between them without tasting either. Chemically, it’s often a question of concentration: how much of a plant-derived compound does a spirit have? How much tannin? “There are, in many cases, certain compounds that are only found in one or the other, but more often, there are compounds that are present in both but at different concentrations. Those are the tannins, the fatty acids, and in some cases, turpentine – compounds that are plant-derived.”
These compounds complicate the matter further: certain chemicals are extracted from the wood barrels during the aging process, and so are not unique to the distillate itself. As Collins notes, barrels are, after all, made from trees, which are unarguably plant material. So how do researchers separate the plant-derived compounds native to the distillate from those that come from the barrel? “Some of the ways we get through that is to look at whiskeys that have been freshly distilled, and haven’t been put in barrels yet, so we can see what’s there in the fresh distillate before we put it in oak, and then we can see what changes between the newly distilled spirit and the spirit that has been aged in barrels for some period of time,” Collins explains. “That helps us to understand what the things are that come from the barrels, versus the things that come from the distillate itself.”
Collins and his team have yet to embark on the next step of their experiments, relating the differences in chemical makeup to potential sensory differences in aroma and flavor, but he feels fairly confident that the two are related. “I think, being a chemist, that the sensory differences arise from the chemistry,” Collins admits. Take, for example, the chemical compounds that arise when the spirit is being aged in a charred barrel. “The sensory component that you smell, that you associate with toasted oak, or charred oak, is going to be related to the compounds that are extracted by the whiskey from the wood,” Collins explains.
Understanding the delicate interplay between chemistry and aroma could be a huge help to distillers looking to tweak their whiskey to encapsulate that perfect blend of smoky and spicy. “This could be a tool [distillers] could use to understand if they make a change to their distillation processes, how does that impact the resulting whiskey,” Collins said, noting that the better distillers understand how the process of distillation impacts the final product, the better they can manipulate the process to their advantage. “It’s a tool that can be used by distillers large and small to understand the impact of what they’re doing on the chemistry, and then the sensory.”
It’s research that suggests the perfect whiskey, smoky, spicy, or however you want it, might not be so elusive after all.
June 27, 2013
Imagine if you will: Agropolis, a supermarket where all your produce is hydroponically grown right there in the store. Even living in dense, urban areas you’d have access to fresh fruits and vegetables. It eliminates the issue of transportation, further driving down costs, and because you’d pluck what you wanted straight from the farm/store display, there’d be less waste in the form of plastic bags and cartons. Unfortunately, Agropolis is purely conceptual, the idea of a team of Danish designers who wanted to take the farm-to-table concept to a new level. Their grown-in-store model, while fun, has its drawbacks, namely that the technology required to make an Agropolis-like market a reality is prohibitively expensive. So while these idyllic urban markets remain a figment of the human imagination, grocery stores are finding ways to innovate and make use of technology to create better shopping experiences. Here are five ways you can already see the supermarket of the future:
Same-Day Delivery: Many food retailers now allow customers to fill a virtual cart online and have their order of goods delivered directly to their doorstep; however, there is a delay between the time you place your order and the time you receive your goods—as much as a few days depending on the delivery time slots available. If you’re ace at planning ahead, this works great. Google is looking to change that. In April, the company began testing a new service dubbed Shopping Express in the San Francisco Bay Area. Customers can order from big box stores—like Target and Walgreens—as well as from participating local stores, which means a person doesn’t have to build their pantry up through a series of trips to different stores. At Slate, Reid Mitenbuler notes that this service could be revolutionary in how it allows a person access to better food: “A lot of times I’m looking for specialty goods—higher quality seafood, some specific ethnic spice, fresh roasted coffee beans, high-end local bread, a snooty variety of coconut water—that requires a trip to Whole Foods, Trader Joe’s, the Chinese or Indian market, or some other out-of-the-way place.” Not to be outdone, Amazon and Walmart are each testing same-day and next-day delivery services.
Receipts in the Cloud: Cloud computing has been promoted as a means to break the bonds of your hard drive and to access your data—music, movies, documents—from anywhere, as long as you have a data connection. Grocery stores are starting to jump on the bandwagon. This June, Booths supermarket in the UK started phasing out paper receipts, instead sending them to a customer’s cloud-based account. The idea of e-receipts, where a retailer emails you a receipt in lieu of handing you a paper one, isn’t new; however, Booths’ cloud service refines the idea so that digital-only receipts offer real advantages for the consumer. Shoppers have an account, so they can track not just how much they spend on each shopping visit but also their expenditures by category, allowing them to make budgetary—and dietary—adjustments as needed. There’s also the ecological bonus of eliminating an estimated 100,000 rolls of receipt paper per year.
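The category tracking Booths describes boils down to grouping receipt line items by category and summing them. A minimal sketch, with invented items and prices in pence:

```python
from collections import defaultdict

# Sketch of per-category spending totals from cloud-stored receipt lines.
# Items, categories and prices (in pence) are invented for illustration.
receipt = [
    ("milk", "dairy", 120),
    ("cheddar", "dairy", 350),
    ("apples", "produce", 210),
    ("crisps", "snacks", 180),
]

totals = defaultdict(int)
for name, category, price in receipt:
    totals[category] += price

print(dict(totals))  # {'dairy': 470, 'produce': 210, 'snacks': 180}
```

Run the same tally over a month of receipts instead of one, and the budgetary and dietary picture the consumer gets is exactly the kind Booths’ account view promises.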
Scanning With Your Smartphone: Scan It devices have been around for a few years already. On entering the store, shoppers pick up a device that looks like a remote control with a monitor built in and can scan items as they shop, keeping a running total of their purchases that is designed to make the checkout process faster. Some chains, like Giant and Stop and Shop, are taking that concept a step further by publishing apps that turn your smartphone into a barcode scanner. Though these apps are usually free to download, you may get hit in the wallet elsewhere: stores are also using mobile technology to get shoppers to spend more money by offering app-exclusive coupons to spur impulse buys. A supermarket in Paris is taking this a step further still: customers use their phones to scan each item and, in addition to keeping a running tally of the grocery order, receive nutritional information and other data about the item before deciding to place it in their cart.
No More Typing in Produce Codes: While smartphones may be the new barcode readers, Toshiba is figuring out how to do away with barcodes altogether by developing a scanner savvy enough to tell the difference between your Fuji and Granny Smith apples. Unveiled in spring 2012, the Object Recognition Scanner homes in on patterns and colors in food, much in the same way that facial recognition systems use certain criteria—like the distance between a person’s eyes and nose width—to identify people. The scanner can also discern between fresh produce and prepackaged goods. While this technology could one day spell the end for barcodes, as of this writing the scanners have not been tested outside of a demo environment.
Shorter Waits in Line: Infrared cameras that detect body heat are traditionally tools of police and the military. But food retailer Kroger sees a use for them in the grocery store. Mounted at the store entrance and at the cash registers, the cameras feed in-house-developed software that records supermarket traffic at different times of day, letting managers know how many lanes need to be open and when to open them. The system is currently in use at some 2,400 stores, where the average customer wait time has dropped from four minutes to 26 seconds.
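Kroger hasn’t published how its software turns traffic counts into staffing decisions, but the core arithmetic any such system has to do can be sketched with basic queueing math: given a forecast arrival rate and an average checkout time, how many lanes keep each cashier below a target utilization? The numbers and the 80 percent target below are illustrative assumptions, not Kroger’s:

```python
import math

# Illustrative lane-staffing arithmetic; not Kroger's actual software.
def lanes_needed(customers_per_hour, avg_checkout_minutes, target_utilization=0.8):
    """Minimum open lanes so each lane stays under target_utilization busy."""
    # Total checkout work arriving per hour, measured in lane-hours.
    workload_hours = customers_per_hour * avg_checkout_minutes / 60.0
    return max(1, math.ceil(workload_hours / target_utilization))

# 120 customers/hour at 3 minutes each is 6 lane-hours of work per hour;
# at an 80% utilization target that calls for ceil(6 / 0.8) = 8 open lanes.
print(lanes_needed(120, 3))  # 8
```

The cameras’ contribution is the forecast itself: feed the software the arrival counts it has recorded for, say, Tuesday at 5 p.m., and a calculation like this tells the manager how many registers to staff before the line ever forms.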
June 10, 2013
Curt Jones, founder and CEO of Dippin’ Dots, was always interested in ice cream and science. He grew up on a small farm in Pulaski County, Illinois. As a child, he and his neighbors would get together and make homemade ice cream with an old hand crank: he’d fill up the machine with cream and sugar, add ice and salt to lower the temperature below zero and enjoy the dessert on the front porch.
When he first made Dippin’ Dots in 1987, the treat required a little more than a hand crank. By flash-freezing ice cream into tiny pellets with liquid nitrogen, Jones made the ice crystals in his dessert 40 to 50 times smaller than in regular ice cream—something he marketed as “the future” of the classic summer snack. Today, the company sells about 1.5 million gallons of dots a year and can be found in 100 shopping centers and retail locations, 107 amusement parks and more than one thousand stadiums, movie theaters and other entertainment venues across the United States.
But, 26 years after its invention, can we still call it the “Ice Cream of the Future”? Now that competitors including Mini Melts and MolliCoolz have caught on and begun shaking things up with their own versions of the flash-frozen dessert, has the novelty begun to fade?
In the mid-2000s, when the recession made it difficult for the average amusement-park-goer to drop the extra dollars for the fun dessert, Dippin’ Dots’ sales plummeted. In 2007, Dippin’ Dots entered a patent battle with the competitor Mini Melts (Frosty Bites Distribution)—a legal defeat that would ultimately contribute to the company’s financial struggles. A federal court jury invalidated Jones’ patent for “cryogenic encapsulation” on a technicality: Jones had sold the product for over a year before filing for the patent. The New York Times cites a memo prepared by the law firm Zuber & Taillieu:
One of the arguments that Mini Melts used in undermining Dippin’ Dots was that the company committed patent fraud by not disclosing that it had sold its ice cream product one year prior to applying for its patent. Technically, an inventor of a new product (or process) is required to apply for a patent within one year of inventing the product or the product is considered to be “public art” and the right to file for a patent is forfeited.
In the suit Dippin’ Dots, Inc. v. Frosty Bites Distribution, LLC, aka Mini Melts, it was determined that Jones had sold a similar version of the product he eventually patented to more than 800 customers more than one year prior to the filing of the patent, making the company’s claim against Mini Melts unfounded. The Federal Circuit Court ruled that Dippin’ Dots’ patent on its method of making frozen ice cream pellets was invalid because the method was obvious.
In 2011, Dippin’ Dots filed for Chapter 11 bankruptcy in federal court in Kentucky. According to the Times, the company owed more than $11 million to Regions Bank on eight different promissory notes. In 2012, Dippin’ Dots secured a $12.7 million offer from an Oklahoma energy executive to buy the company out of bankruptcy. The Wall Street Journal reports:
The deal would preserve the flow of colorful flash-frozen ice cream beads to baseball stadiums and amusement parks across the country…Under the new ownership, the company would continue to pump out the dots from its 120,000-square-foot Paducah, Kentucky, manufacturing plant…
Even with the new owners, the plan was to keep Jones actively involved in the product. To stop the “Ice Cream of the Future” from becoming a thing of the past, the company tried a few twists on the original ice cream beads that eventually helped drag it out of its crushing debt. These days, the company has a few spin-off products in the works—a fusion of dots and regular ice cream called Dots N’ Cream and a Harry Potter-themed ice cream at Universal Studios, for example. And by August, Dippin’ Dots will be in close to a thousand grocery store locations with freezers chilled to 40 degrees below zero Fahrenheit.
But in the late ’80s, the company was still in its nascent stages. Jones was a Southern Illinois University graduate with a degree in microbiology—a solid foundation for his futuristic idea to take shape. After graduating in 1986, he took a job with Alltech, a biotechnology company based in Kentucky. The science behind the invention is impressive, even a quarter-century later.
His main responsibility at Alltech was to isolate the probiotic cultures found in yogurt, freeze-dry them into a powder, and then add them to animal feed as an alternative to antibiotics. Once ingested, these “good bacteria” came back to life and helped with the animal’s digestion. Jones experimented with different ways to freeze the cultures and discovered that the faster the freeze, the smaller the resulting ice crystals. After many attempts, he found that by dipping cultures into liquid nitrogen (a staggering 320 degrees Fahrenheit below zero) he could form pellets—making it easier to pour the small balls of probiotics into different containers.
A couple of months after this discovery, he was making homemade ice cream with his neighbor when they struck up a casual conversation about ice crystals. Jones had loved homemade ice cream since childhood, but he never liked the icy taste—he wished they could freeze the dessert faster. “That’s when the light bulb came on,” Jones says. “I thought, ‘I know a way to do that better. I work with liquid nitrogen.’” Jones immediately began working on this budding business.
In 1988, Jones and his wife opened their creamery in Lexington, Kentucky with zero restaurant experience under their belt, and their rookie mistakes were costly, at least at first.
“There just weren’t enough customers coming through the door,” Jones says. “We got by because we sold one of our cars and we had some money saved up.” In that same year, he began converting an old garage on his father’s property into a makeshift factory. With the help of his sister Connie, his father and his father-in-law, the Joneses were able to make the conversion.
By 1989, undeterred, Kay and Curt closed their failed restaurant and tried their luck at county and state fairs instead. Success there brought them to Nashville, Tennessee, and Opryland USA. At first, Jones sold the product to the park in designated kiosks throughout Opryland. They were just barely breaking even, and the Opryland employees working the stands didn’t know how to answer questions about the product. “It totally failed the first few years,” Jones says. “The people that tried it liked it, but at that time Dippin’ Dots didn’t mean anything—we didn’t have the slogan yet.” (Sometime between 1989 and 1990, Jones and his sister Charlotte came up with “The Ice Cream of the Future” tagline that would help raise the product’s profile.) After two years of terrible sales at Opryland, a new food service supervisor at the park gave Dippin’ Dots another shot, letting Jones sell and sample the dots at the retail level and explain the technology to customers himself.
When sales at Opryland took off, Jones pitched the product to other amusement parks, and by 1995 Dippin’ Dots made its international market debut in Japan. By 2000, the company’s network spanned coast to coast.
It’s strange to embrace the nostalgia of a product that made a name for itself as a thing of the “future”—ironic, even. But for anyone who pleaded with their parents to buy them a bowl of Jones’ straight-from-the-lab ice cream, it’s difficult to imagine Dippin’ Dots going the way of the Trapper Keeper and the Hypercolor T-shirt.
April 15, 2013
When we think about pigs today, most of us likely imagine the Wilbur or Babe-type variety: pink and more or less hairless. Mention pig farming and images of hundreds upon hundreds of animals crammed into indoor cages may come to mind, too. But it wasn’t always like this. Prior to the industrial revolution, pigs came in an astounding variety of shapes, sizes, colors and personalities. And the ham made from their cured meat was just as diverse.
“The tale of ham’s innovation began around 200 years ago, and it paved the way for how ham is produced today,” said Nicola Swift, the creative food director of the Ginger Pig, a company of butchers and farmers that specializes in rare breeds of livestock reared in England’s North York Moors. Swift presented a talk on the history of ham at the BACON conference in London last weekend, which sadly was not devoted to bacon but to “things developers love.”
One family in particular, the Harrises, almost single-handedly changed the way England turned pigs into ham, she explained, and in doing so, they inadvertently laid the foundations for large-scale, homogenized pig farming.
Mary and John Harris were pig folk. Their family hailed from Calne, a quiet town in Southwest England. In the early and mid-1800s, they played a small but important role in providing London with pork. At the time, much of London’s pork arrived by way of Ireland. But without refrigeration, transporting large amounts of meat was impossible. Instead, pig handlers would literally walk the animals to the Irish coast, corral them onto boats destined for Bristol, and then continue to trek to London by foot.
But a deliciously fat pig forced to trot more than 100 miles would soon turn into a lean, tough mass of muscle. To make sure the ham, chops and bacon that those animals were destined to become remained fatty, tender and flavorful, pig herders would make pit stops along the way to give the animals a rest and fatten them up. The Harris farm was one such destination. The family also supplied Calne with meat from their small shop on Butcher’s Row, founded in 1770.
The Harrises were by no means well off. If they butchered six or eight pigs in a week, they counted it a success. Still, they got by all right. That is, until tragedy struck. In 1837, John Harris, the relatively young head of the household, died suddenly, leaving his wife, Mary, to manage the business and look after the couple’s 12 children. A few years later, just as the family was getting back on its feet, hard times fell upon them once again. It was 1847, and the Irish potato famine had arrived.
In Ireland, potatoes fed not only people but their pigs, too. As season after season of potato crops failed, the Irish could not feed themselves, much less their animals. The supply of pork to the Harris’ farm and butcher shop stopped arriving. In desperation, Mary and her son, George, hatched a scheme to send George to America by ship. The idea, they decided, was for George to strike up a pig business deal with American farmers and figure out a way to transport their slaughtered animals across the Atlantic in boxes packed with salt to ward off spoilage during the long journey. On its way to England, that meat would cure into ham and George’s entrepreneurial venture would save the family.
Not surprisingly, George failed in his mission. But while in the States, he did learn of a remarkable new practice the Americans were pursuing: ice houses. In the U.S., this method allowed farmers to slaughter pigs not only in months ending in an ‘r’ (or those cold enough for the meat not to rot before it could be cured and preserved), but during any time of year – even in steamy July or August. Curing, the process of preventing decomposition-causing bacteria from setting in by packing the meat in salt, was then the only way to preserve pork for periods longer than 36 hours. Such horrendously salty meat was eaten out of necessity rather than enjoyment, however, and it often required sitting in a bucket of water for days at a time before it could be rinsed of enough salt to even be palatable. “This all harks back to the day when people had to preserve something when they had lots of it because there were other times when they didn’t have much,” Swift said. “This type of preserving goes back hundreds and hundreds of years.”
Ice houses, specially constructed sheds with packed ice blocks either collected locally or imported from Norway, offered partial relief from that practice, however. Charcoal acted as an insulator, preventing the ice from melting quickly and trapping the cool air within the small room.
When George returned home, curly tail between legs, he immediately got busy earning back his family’s trust by experimenting with ice house design. By 1856, he had succeeded in constructing what was likely the first ice house in England. The ham that resulted from slaughtering pigs in that cool confine was more tender and tasty since it didn’t have to be aggressively cured with large amounts of salt. Eventually, the Harrises shifted to brining techniques, or curing in liquid, which led to the creation of the massively popular Wiltshire ham.
The family patented George’s creation, and it soon spread to other farmers and ham producers, who licensed the technology around the country. The Harrises’ wealth increased so quickly and so dramatically that they partly financed the construction of a branch of the Great Western Railway to their village in 1863. Several decades after that, they helped bring electricity to Calne.
While the Harrises’ tale is one of personal triumph, their mark on England’s ham production did not come without cultural costs. Prior to the ice house, each region in the UK and Ireland enjoyed its own specific breed of pig. In Lincolnshire, for example, Lincolnshire ham originated from the Lincolnshire curly coat, an enormous beast of a pig around twice the size of the animals typically bred today. Its long, thick, curly white coat kept the hardy animal warm throughout the damp winters, and its high fat content provided plenty of energy for the farm laborers who relied upon its exceptionally salty ham for sustenance. After a long decline, that breed finally went extinct in the 1970s thanks to industrialized farming.
Other regions once boasted their own breeds and unique ham brews. In Shropshire, people made “black ham,” which they cured along with molasses, beer and spices. This created an exceptional mix of salty sweetness, with a tinge of sourness from the beer. In Yorkshire, a breed called the large white – which is still around today – inspired a method of steaming cured ham in order to more efficiently remove the salt, while in Gloucestershire people preferred to add apples to their ham cures. But after the Harris’ ham empire took off, a massive advertising campaign that followed painted a picture of what ham and bacon should look and taste like, largely removing these traditions from kitchens around the country. “Most of the regional variances are sadly not known any more except to ham geeks,” Swift said.
In addition to stamping out ham variety, the Harrises’ factory – which soon employed hundreds of staff and processed thousands of pigs each week – and others like it began favoring homogenized mass-production methods of indoor pig rearing. Older residents in Calne recall the factory’s unmistakable reek in the 1930s. Eventually, public protests led to its closure and demolition in the 1960s, but for local pigs and ham, the damage was already done. Between 1900 and 1973, 26 unique regional breeds of pigs and other livestock went extinct, with others surviving only in very small numbers.
To try to preserve pig and other livestock heritage, concerned citizens formed the non-profit Rare Breeds Survival Trust in 1973, which maintains a sort of endangered species list and conservation group for farm animals on the fringe. In addition, farms such as Swift’s Ginger Pig specialize in breeding and reintroducing some of these lines into restaurants and local butcher shops in London and beyond, and in introducing traditional curing techniques through their upcoming book, the Farmhouse Cook Book. “Innovation is awesome and brilliant, but there’s also a dark side,” Swift said. “That’s the history of ham.”
August 1, 2012
In today’s global village, it should come as no surprise that Eastern and Western cultures are often wedded, and sometimes in weird and ingenious ways. Enter the Chork. While it may sound like an expletive, or a clever name given to the odd guttural noise produced when an over-zealous chortle leads you to choke, it is neither.
The Chork is an innovative new eating tool that combines chopsticks with a fork. It is the brainchild of Jordan Brown, who saw the need for the Chork at a sushi dinner when, struggling to grasp smaller grains of rice with chopsticks, he found himself constantly reaching for a fork. Brown, a partner at the concept development and marketing company Brown Innovation Group Incorporated (B.I.G.) in Salt Lake City, then resolved to make the transition between the fork and chopsticks easier with the Chork.
With chopsticks on one end and a fork on the other, you’re bound to ask why you didn’t come up with this simple yet brilliant innovation yourself. Keeping in mind that most people need to use a fork because they haven’t quite mastered the art of using chopsticks, Brown has designed the Chork such that the adjoining sticks can be pinched together to grasp food without needing to be separated, functioning as trainers. For the initiated, the sticks come apart and click back into place just as easily.
When we wrote before about the origins of the fork and chopsticks, little did we imagine that these implements with such diverse and storied histories could be blended so harmoniously. The fork, the younger of the two, is said to have caused quite a stir when it was first introduced:
In 1004, the Greek niece of the Byzantine emperor used a golden fork at her wedding feast in Venice, where she married the doge’s son. At the time most Europeans still ate with their fingers and knives, so the Greek bride’s newfangled implement was seen as sinfully decadent by local clergy.
Chopsticks, in contrast, had a more humble beginning:
The earliest versions were probably twigs used to retrieve food from cooking pots. When resources became scarce, around 400 BC, crafty chefs figured out how to conserve fuel by cutting food into small pieces so it would cook more quickly.
After two years and several prototype revisions, the final product hit the shelves early last year. “People are really interested to see something new and unique, especially in a part of food service that hasn’t really had a lot of changes. The utensils that you use to consume your meal have been the same for forever, so I think part of it is just the novelty of having a different tool with which to eat your food, really gets people excited,” says Nick Van Dyken, general manager of the Chork.
Receiving rave reviews from Gizmodo blogger Casey Chan, who goes so far as to say that “the chork, instead of pandas, could be used to maintain US/China relations,” and Daily Mail writer Ted Thornhill, who writes, “this new kid on the utensil block is certainly proving a hit with diners,” the Chork seems to have made an impression. But it remains to be seen how lasting that impression will be. For now, this versatile tool has made inroads toward dethroning the simple fork. According to Van Dyken, the utensil is available at grocery stores on the East Coast, the Atlantis resort in the Bahamas, and on Carnival cruise ships. Here in D.C., the PhoWheels food truck distributes them in lieu of more traditional utensils.
The Chork has inspired a spinoff from B.I.G., namely, the creation of a spoon version of it, tailored to accompany the many soup-based Chinese and Vietnamese dishes, which should be available early next year (the Choon, perhaps?).
Cutlery might have been slow to change thus far, but the tide is turning. Another newcomer that seeks to find room on your table is the Trongs. This claw-like device was created to help grip finger foods while avoiding the mess. No longer will finger-lickin’ good wings or ribs require just that.