"Don't you have a machine that puts food into the mouth and pushes it down?" the Soviet leader Nikita Khrushchev sarcastically asked Richard Nixon in the now infamous Kitchen Debate of 1959. That memorable exchange took place at the opening of the American National Exhibition in Moscow, where Nixon, then vice president, went to promote the latest innovations of the decadent West.
Today, even the most Pampered Chef has no such food-pushing machine, but the quest to make our kitchens smarter continues unabated. Our kitchen technologies are no longer the dumb, passive appliances of the 1950s. Some of them feature tiny and sophisticated sensors that "understand"—if that's the right word—what's going on in our kitchens and attempt to steer us, their masters, in the right direction. And if Khrushchev's rhetorical question sought to highlight the limitations of the consumer, today's attempts to build a "smart kitchen" highlight those of the culinary geek.
A recent article in the British magazine New Scientist has brought attention to several such initiatives. Meet Jinna Lei, a computer scientist at the University of Washington who has built a system in which a cook is monitored by several video cameras installed in the kitchen. These cameras are quite clever: They can recognize the depth and shape of objects in their view and distinguish between, say, apples and bowls.
With this surveillance, chefs can be informed whenever they have deviated from their chosen recipe. Each object has a number of activities associated with it—you don't normally boil spoons or fry arugula—and the system tracks how well the current activity matches the object in use. "For example, if the system detects sugar pouring into a bowl containing eggs, and the recipe does not call for sugar, it could log the aberration," Lei told the New Scientist. To improve the accuracy of tracking, Lei is also considering adding a special thermal camera that would identify the user's hands by body heat.

The quest here is to turn the modern kitchen into a temple of modern-day Taylorism, with every task tracked, analyzed, and optimized. Geeks hate making errors and love sticking to algorithms. That cooking thrives on failure and experimentation, that deviating from recipes is what creates culinary innovations and pushes cuisine forward, is discarded as whimsical and irrelevant. For many such well-meaning innovators, the context of the practice they seek to improve doesn't matter—not as long as efficiency can be increased. As a result, chefs are imagined not as autonomous virtuosi or gifted craftsmen but as enslaved robots who should never defy the commands of their operating systems.
Another project mentioned in the New Scientist is even more degrading. A group of computer scientists at Kyoto Sangyo University in Japan is trying to marry the logic of the kitchen with the logic of "augmented reality"—the fancy term for infusing our everyday environment with smart technologies. (Think of QR codes that can be scanned with a smartphone to unlock additional information or of the upcoming goggles from Google's Project Glass, which use data streams to enhance your visual field.)
To this end, the Japanese researchers have mounted cameras and projectors on the kitchen's ceiling so that they can project instructions—in the form of arrows, geometric shapes, and speech bubbles guiding the cook through each step—right onto the ingredients. Thus, if you are about to cut a fish, the system will project a virtual knife and mark where exactly that knife ought to go on the fish's body. And there's also a tiny physical robot that sits on the countertop. Thanks to the cameras, it can sense that you've stopped touching the ingredients and inquire whether you want to move on to the next step in the recipe.
Now, what exactly is "augmented" in such reality? It may be augmented technologically, but it also seems diminished intellectually; at best, we are left with "augmented diminished reality." Some geeks stubbornly refuse to recognize that challenges and obstacles—of which initial ignorance about the right way to cut the fish might be one—enhance rather than undermine the human condition. To make cooking easier is not necessarily to augment it—quite the opposite. To subject it fully to the debilitating logic of efficiency is to deprive humans of the ability to achieve mastery in this activity, to make human flourishing impossible, and to impoverish our lives.
This is not a snobbish defense of the sanctified traditions of cooking. In a world where only a select few could master the tricks of the trade, such “augmented” kitchens would probably be welcome, if only for their promise to democratize access to this art. But this is not a world we inhabit: The Internet is chock-full of detailed recipes and instruction videos on how to cook the most exquisite dish. Do we really need a robot—not to mention surveillance cameras above our heads—to cook that stuffed turkey or roast that lamb?
Besides, it's not so hard to predict where such progress leads: Once inside our kitchens, these data-gathering devices will never leave, developing new, supposedly unanticipated functions. First, we'd install cameras in our kitchens to receive better instructions, then food and consumer electronics companies would tell us that they'd like us to keep the cameras to improve their products, and, finally, we'd discover that all our cooking data now resides on a server in California, with insurance companies analyzing just how much saturated fat we consume in order to adjust our insurance premiums. Cooking abetted by smart technology could be just a Trojan horse for far more sinister projects.
None of this is to say that technology cannot increase our pleasure from cooking—and not just in terms of making our food tastier and healthier. Technology, used with some imagination and without the traditional geek fetishism of efficiency and perfection, can actually make the cooking process more challenging, opening up new vistas for experimentation and giving us new ways to violate the rules.
Compare the impoverished culinary vision on offer in the New Scientist with just some of the fancy gadgetry embraced by the molecular gastronomy movement. From thermal immersion circulators for cooking at low temperature to printers with edible paper, from syringes used to produce weird noodles and caviar to induction cookers that send magnetic waves through metal pans, all these gadgets make cooking more difficult, more challenging, more exciting. They can infuse any aspiring chef with great passion for the culinary arts—much more so than surveillance cameras or instructions-spewing robots.
Strict adherence to recipes can produce predictable, albeit tasty, dishes—and occasionally, this is just what we want. But such standardization can also make our kitchens as exciting as McDonald's joints. Celebrating innovation for its own sake is in bad taste. For technology to truly augment reality, its designers and engineers should get a better idea of the complex practices of which that reality is composed.
Disclosure: Slate and the New Scientist have a content-sharing partnership.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.