Future Tense

Should AI Makers Be Legally Responsible for Emotionally Manipulating Customers?

Siri is just the beginning.


Harvey Milk famously advised his “gay brothers and sisters” to come out, but “only to the people you know, and who know you.” He understood the power of human empathy. It is hard to hate people you know and who know you.

We’ll soon see how human empathy shapes the design of artificial intelligence because, as it turns out, our empathy is so strong that we will treat objects as people if we can interact with them the way we interact with people. The movie Her played with this idea in a future where Joaquin Phoenix’s Theodore falls in love with his operating system, Samantha. But right now on Indiegogo, you can contribute to EmoSPARK, “the first artificial intelligence console committed to your happiness.”

The campaign claims that EmoSPARK can track your emotions, creating a record of each user’s facial expressions, preferences, and reactions. The device then tries to improve and maintain your mood by sharing with you the pictures, music, and video that its observations and analysis indicate will make you happier.

Obviously, there are some key differences between EmoSPARK and Samantha. EmoSPARK doesn’t appear to read your emails. Its voice is closer to the emotionless Majel Barrett-Roddenberry on Star Trek than the flirty Scarlett Johansson. Its ability to have real interactions with users is suspect and might be limited to suggesting YouTube clips and Facebook posts. But they’re both designed to get to know you and adapt their actions to what they learn. And both EmoSPARK and Samantha interact with people using a conversational interface.

If humans start to think of their EmoSPARKs as actual people, that interface might be the reason. When we have conversations with a machine, we start to think of that machine as a person. The late Stanford professor Clifford Nass said, “Our brains are built to treat these conversations with computer-based voices to an incredible degree like [conversations] we are having with actual people—including flattery, flirtation and all the rest.” If EmoSPARK determines that nothing perks you up after a bad day like Community bloopers and starts to verbally tell you about clips available on the Internet, your brain will rewire itself to think of the AI appliance as a fun, quirky roommate.

Smart developers and manufacturers will start to incorporate this into their designs. Siri, for example, is a step in that direction, as users have described “her” as “a good listener [and] funny in a smart, dry way,” while also undercutting those compliments by saying “She’s just a little too glib.” Although Siri’s hit-or-miss execution can limit the bond users feel with their iPhones, Apple clearly knows that there is real value in developing a Siri personality that users interact with. It has sought writers, looking to expand and solidify Siri’s conversation skills, and appears to want to leverage Siri’s “emotional relationship” with iPhone customers to compensate for its performance deficiencies.

When it comes to Siri, I don’t think designing a verbal interface to increase user satisfaction with a product is problematic. It’s a sort of benign design. But what if a future conversational robot—or even, say, a toaster—is designed to build a relationship with users and then strongly suggest, or outright advertise, other products sold by the machine’s manufacturer? Or what if a smartphone app designed by Fox News or MSNBC tries to influence your politics once its interpersonal-analysis algorithm establishes that you have formed an emotional bond with the program? Machines and programs like this move beyond benign design into a morally questionable area that could be legally actionable.

Right now, there’s no way to know how the law would treat them. The legal duty owed by the manufacturers of these hypothetical products might be no more than a warning: “This product is going to make you love it and then trick you into doing things you wouldn’t do otherwise. Please see past manipulative boyfriends and girlfriends for examples.” But depending on the results of future research, companies whose products create the impression of emotional connection may be required to exercise a certain amount of caution and restraint. Manufacturers might even owe a fiduciary duty to customers, because they will have such superior knowledge of their products’ abilities to connect and manipulate.

Harvey Milk’s understanding of, and belief in, human empathy was a turning point for the LGBT community. If AI becomes a world-changing technology, I hope that its developers share his respect for our empathy.

Future Tense is a partnership of Slate, New America, and Arizona State University.