Should AI Makers Be Legally Responsible for Emotionally Manipulating Customers?

The Citizen's Guide to the Future
Jan. 20 2014 11:31 AM


Siri is just the beginning.

Photo by Lintao Zhang/Getty Images

Harvey Milk famously advised his “gay brothers and sisters” to come out, but “only to the people you know, and who know you.” He understood the power of human empathy. It is hard to hate people you know and who know you.

We'll soon see how human empathy shapes the design of artificial intelligence because, as it turns out, our empathy is so strong that we will treat objects as people if we can interact with them the way we interact with people. The film Her played with this idea, imagining a future in which Joaquin Phoenix's Theodore falls in love with his operating system, Samantha. But right now on Indiegogo, you can contribute to EmoSPARK, “the first artificial intelligence console committed to your happiness.”

The campaign claims that EmoSPARK can track your emotions, creating a record of each person's facial expressions, preferences, and reactions. The device then tries to improve and maintain your mood by sharing pictures, music, and video that its observations and analysis indicate will make you happier.

Obviously, there are some key differences between EmoSPARK and Samantha. EmoSPARK doesn’t appear to read your emails. Its voice is closer to the emotionless Majel Barrett-Roddenberry of Star Trek than to the flirty Scarlett Johansson. Its ability to have real interactions with users is unproven and might be limited to suggesting YouTube clips and Facebook posts. But both are designed to get to know you and adapt their actions to what they learn, and both interact with people through a conversational interface.

If humans start to think of their EmoSPARKs as actual people, that interface might be the reason. When we have conversations with a machine, we start to think of that machine as a person. The late Stanford professor Clifford Nass put it this way: “Our brains are built to treat these conversations with computer-based voices to an incredible degree like [conversations] we are having with actual people—including flattery, flirtation and all the rest.” If EmoSPARK determines that nothing perks you up after a bad day like Community bloopers and starts telling you about clips available on the Internet, your brain will rewire itself to think of the AI appliance as a fun, quirky roommate.

Smart developers and manufacturers will start to incorporate this into their designs. Siri, for example, is a step in that direction: users have described “her” as “a good listener [and] funny in a smart, dry way,” while also undercutting those compliments by saying “She’s just a little too glib.” Although Siri’s hit-or-miss execution can limit the bond users feel with their iPhones, Apple clearly sees real value in giving Siri a personality users want to interact with. It has sought writers to expand and solidify Siri’s conversational skills, and it appears to want to leverage Siri’s “emotional relationship” with iPhone customers to compensate for its performance deficiencies.

When it comes to Siri, I don’t think designing a verbal interface to increase user satisfaction with a product is problematic. It’s a sort of benign design. However, what if a future conversational robot—or even, say, a toaster—is designed to develop a relationship with users before strongly suggesting/advertising other products sold by the machine’s manufacturer? Or what about a smartphone app designed by Fox News or MSNBC that tries to influence your politics after its interpersonal-analysis algorithm establishes that you have formed an emotional bond with the program? Machines and programs like this move beyond benign design into a morally questionable area that could be legally actionable.

Right now, there’s no way to know. The legal duty owed by the manufacturers of these hypothetical products might be no more than a warning: “This product is going to make you love it and then trick you into doing things you wouldn’t do otherwise. Please see past manipulative boyfriends and girlfriends for examples.” But depending on the results of future research, companies whose products create the impression of emotional connection may be required to exercise a certain amount of caution and restraint. Manufacturers might even owe a fiduciary duty to customers because they will have such superior knowledge of their products’ abilities to connect and manipulate.

Harvey Milk’s understanding of, and belief in, human empathy was a turning point for the LGBT community. If AI becomes a world-changing technology, I hope that its developers share his respect for our empathy.

Future Tense is a partnership of Slate, New America, and Arizona State University.

John Frank Weaver is an attorney in Portsmouth, N.H., who works on artificial intelligence law. He is the author of Robots Are People, Too. Follow him @RobotsRPeople.
