Future Tense

Should We Outsource Emotional Labor to Robots?

Cuddly mechanical bears and seals may seem harmless, but there’s a real risk in depending on machines for comfort.


Huggable, the robot teddy bear, and Paro, the baby harp seal robot.

Photo illustration by Slate. Screenshot via NYT and image courtesy of Koichi Kamoshida/Getty Images

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. On Wednesday, Jan. 20, Future Tense will host a lunchtime event in Washington, D.C., on human-robot interaction. For more information and to RSVP, visit the New America website.

If you’ve ever been to Walt Disney World and seen the unnervingly consistent smiles and surface cheerfulness on the faces of the theme park’s employees, you have experienced the phenomenon sociologist Arlie Russell Hochschild described as “emotional labor”—the performance of feelings that service workers must provide to their customers.

Disney workers are an extreme example, but emotional labor is a requirement in many jobs. Hochschild studied flight attendants and bill collectors, but in everyday life emotional labor is everywhere: It is the smile your waiter is expected to give you at a restaurant or the pleasantries the nurse exchanges with you during a doctor’s visit.

However enjoyable such experiences are for the customers, emotional labor can be draining for those who have to perform it. (Those smiles and exhortations to “Have a nice day!” can be stressful to produce on demand, day in and day out.) Perhaps this is why we look to software and technology to alleviate some of the burden. We are already comfortable using apps and software that act as mild-mannered techno-nannies, counting our steps or calories or productivity levels and reminding us of upcoming appointments or flight delays; more recent software such as ToneCheck offers a form of “emotional spellcheck” for our online lives, scanning emails for words or phrases that might “convey unintended emotion or tone.”

We’ve also become comfortable using things to fulfill the emotional needs that real people might not be willing or able to meet. The clothing company Lululemon markets the “sensation innovation” of its athletic wear by asking, “How do you want to feel?” and includes options such as “hugged,” “naked,” and “relaxed.” “We engineer our Hugged Sensation to feel like a comfortable embrace from a close friend,” its website states. Because if you can’t find a friend to hug you, then your yoga pants should. (It’s cheaper than hiring a professional cuddler.)

Roboticists have taken this idea several steps further with the creation of therapeutic robots like Paro the baby harp seal and Huggable the robot teddy bear. These robots are explicitly marketed to the public as nurturing companions, not merely machines. We are supposed to view them as emotionally hyperresponsive pets. Holding Huggable “feels like holding a puppy,” according to its creators, even though Huggable’s video-camera eyes and microphone ears give it surveillance capabilities the Stasi would have coveted.

A few critics, most notably Sherry Turkle, have shown how our embrace of these technologies poses a counterintuitive but real danger to our ability to relate to one another as human beings. And yet we still tend to emphasize the positive side of these companion robots, viewing them not as emotionally manipulative but as emotionally fulfilling. (One doctor told the New York Times he thought Huggable should become the “standard of care” in pediatric hospitals.)

In an episode of Aziz Ansari’s Netflix series, Master of None, one of the characters inherits his grandfather’s Paro robotic therapy seal when the grandfather dies. At first skeptical, the character is soon responding to Paro’s adorable emotional manipulations and eventually finds himself spending nights in with his robot companion, happily watching TV together.

The episode pokes fun at the ease with which we project our feelings onto robots. But this emotional neediness could pose a real challenge in the future. If artificial intelligence develops as swiftly and powerfully as researchers such as Nick Bostrom predict, we might be dispatched not by Skynet-sponsored Terminators but by an army of snuggly therapy bots; our A.I. overlords could look more like Oprah than HAL.

Outsourcing emotional labor has other more immediate consequences. What happens when we know that someone is faking a feeling? We might sense implicitly that our overworked waiter isn’t actually happy to see us when we sit down at his table, but unspoken social conventions allow us to make as much of a pretense of believing his feelings as the waiter does of performing them. But when we interact with robots that we know have been programmed to give everyone the same friendly greeting regardless of anyone’s actual feelings, that unspoken compact disappears. Even if social robots become more skilled at expressing a fuller range of human emotion (like “Nadine,” who uses Siri-like technology and can express anger as well as happiness and sadness), we risk doing to expressions of social feeling what Muzak has done to music—homogenizing it to the point that it becomes nearly unrecognizable.

The risk is not a world run by robots (although employers in Japan already use Smile-Scan machines to analyze the smiles of their service workers). The risk is that outsourcing emotional labor to robots and machines could lead to mass emotional deskilling on the part of people. Writing about “smart” technology in the home, Albert Borgmann warned, “We will slide from housekeeping to being kept by our house.” (Or, as some Nest “smart” thermostat owners recently discovered, being frozen out of their homes.) A similar decay of skill could occur with our ability to read and understand others’ emotions.

Furthermore, despite the hype that often accompanies their release, these technologies can reinforce existing inequalities and stereotypes about caregiving, for example. Julia Ticona, a doctoral fellow in sociology at the Institute for Advanced Studies in Culture at the University of Virginia who studies the social impact of technology, told me, “I think it’s important to ask questions about how these technologies may reinforce or perhaps change our ideas about care and who’s ‘naturally’ more inclined to do it. Which patients will receive human care, and who will be tended to by robots? For me, thinking about devices that are designed to save humans emotional labor in the future is a really interesting place to look at the politics of care in the present—who gives it, who gets it, and how much we value it both emotionally and in dollars and cents.”

Right now we are embracing emotional robots out of efficiency and cost savings and, let’s face it, because they are really cool machines. We aren’t thinking through whether it’s either appropriate or desirable for them to become part of our everyday lives. But there are interactions in which empathy must trump efficiency. Many of those interactions occur in private. Hochschild’s work demonstrated that people understand emotional labor in private life very differently than they do at work. In private, emotional labor acts as a kind of “gift exchange,” ideally one fueled by mutual understanding.

Robots are now poised to become part of that exchange, acting as our emotional Stepford Wives in private as well as in public. New robots like Paro respond “as if” they are alive and “like” they understand our feelings and needs. But they don’t. Not in the way that humans understand one another. “The machines will do what we ask them to do and not what we ought to ask them to do,” Norbert Wiener warned in 1949. In our age of ersatz intimacy, perhaps sophisticated emotional mimicry is enough for us. But we should at least acknowledge that we are setting a low bar for our emotional lives. Emotions are like weather—only partially understandable and predictable. But that is precisely what makes them both a bug and a feature of being human.