Amazon Echo's A.I. Alexa throws shade when you bring up HAL.

Amazon Echo and the Art of Artificially Throwing Shade

Future Tense
The Citizen's Guide to the Future
July 31 2015 8:35 AM

The Amazon Echo can get a little snippy.

Photo courtesy Amazon

As today’s artificial intelligences grow more and more capable of natural language interaction with humans, they will need to master a peculiar yet highly important design requirement: ready-made snarky responses for when their human owners troll them with science fiction movie A.I. references. As you can see in a video I recorded of myself playing with the Amazon Echo, Alexa, its intelligent assistant, got sassy when I repeated a famous line from 2001: A Space Odyssey.

In the movie, the astronaut Dave Bowman asks the homicidal supercomputer HAL to let him back inside the spacecraft, and HAL responds with a curt “I’m sorry, Dave. I’m afraid I can’t do that.” When you say “HAL, open the pod bay doors,” Alexa responds by not only mimicking the first part of HAL’s response but also reminding you that she is not HAL and that you’re not in space.

Granted, Alexa’s shade-throwing is really that of the team of programmers who built her. But that’s also the point. There are many ways of building a human connection to machines, and Alexa reflects several of them. For example, by assuming a human female’s name and speaking in a vaguely female voice, Alexa encourages you to refer to it as “her” or “she.” And whenever I call an “it” a “she,” I linguistically imbue a cloud-based computer program speaking through a faceless black cylinder with a socially constructed marker of human identity: gender.

But, as my video demonstrates, another component of feeling connected to a machine could be the machine faking a form of self-awareness. Alexa “knows” that she is an A.I. well enough to understand what it means when I tease her by asking her to open the pod bay doors. And Alexa responds by effectively rolling her eyes at me. The fact that Alexa seems unhappy and even passive-aggressive when you troll her with HAL jokes makes it easier for us to assume that “she” has beliefs, desires, and intentions.

Small touches like this will help people adapt to a world in which they will live and work alongside machines like Alexa—as well as tease them in the hope of getting a “What, this joke again?” reaction.

Future Tense is a partnership of Slate, New America, and Arizona State University.

Adam Elkus is a Ph.D. student in computational social science at George Mason University and a fellow at New America’s Cybersecurity Initiative.