Fear and loathing of robots is a trope of long standing in science fiction, captured best in Isaac Asimov’s 1942 short story “Runaround,” which introduced the Three Laws of Robotics, fictional limits proposed to protect humans from being harmed by the machines they created.
But if popular culture is any guide, fear and loathing might be giving way to grudging acceptance—not surprising, given our growing enthusiasm for and use of robots in everyday life. Robots cater to a variety of human needs today: Companies such as Boston Dynamics manufacture powerful robots such as the Cheetah and Petman for military use, while Japan’s National Institute of Advanced Industrial Science and Technology (AIST) focuses its efforts on snuggly “therapeutic” robots such as Paro the baby harp seal, which is used in nursing homes and has proven particularly effective in comforting patients with dementia.
As the new Fox television series Almost Human suggests, in the near future we might seek robots that give us combat-ready protection and a warm embrace at day’s end. The show, which follows the exploits of human police officer John Kennex (Karl Urban) and his android partner Dorian (Michael Ealy), imagines a world of human-robot cooperation that consistently questions whether it is the humans or the machines that exercise a moral advantage.
The show is set in a near-future dystopia rife with crime and confusion. “Science and technology evolve at an uncontrollable pace,” the voice-over at the beginning of the pilot episode intones. Every police officer must have an android as a partner, a requirement that annoys Kennex (so much so that he destroys his first partner by pushing him—it?—out of a moving car). The appeal of these androids is that they are entirely rational—walking, talking embodiments of algorithmic superiority. In one scene, when Kennex commands a robot to help him evacuate an injured human cop, the robot refuses, saying, “Others have a better statistical chance of survival.” The androids’ eerily smooth skin, flat eyes, and perfectly modulated voices represent the emptiness of the programmed soul.
By contrast, Kennex’s new android partner, Dorian, is an earlier model discontinued because it was taught too well how to understand human emotion. Dorian oozes empathy: He’s a natural with children and women and even takes the initiative to create an online dating profile for Kennex after assessing his partner’s level of sexual frustration. (This leads to one of the more cringe-worthy bits of television writing, when Kennex responds, “Don’t scan my balls again ... ever.”) Dorian’s sensitivity is the leitmotif of the show and the twist in the usual science-fiction narrative about humans and robots.
The show also touches lightly on anxiety about our dependence on technology; in one scene in the pilot, upgraded android cops collapse en masse at police headquarters, the victims of a disabling technology, leaving their human partners to fend for themselves. Older android model Dorian—the Walkman to their iPods—observes wryly, “Sometimes new technology isn’t better.”
But this dystopia is surprisingly hopeful. Judging by the first two episodes, the show’s creators seem to be suggesting that although we might have to rely on robots that are stronger, more intelligent, and more rigorously rational than human beings, what we really need are robots that can remind us of what it means to be emotionally well-developed human beings.
This subtext emerges clearly in the show’s second episode, which introduces the show’s first female androids. All too predictably, they are prostitutes: “sexbots” that are licensed and programmed to bond with their customers and make them believe that the machines are enjoying themselves as well. The market driving the creation of these sexbots is human loneliness—or rather, male loneliness. Women are never shown availing themselves of male sexbots. In the episode, Kennex and Dorian pursue an Albanian gang that is kidnapping real women in order to harvest their skin to cover the bodies of the sexbots in the Albanians’ harem. Their customers evidently want sexual partners that are as close as possible to human women, but without the emotional volatility that real women might bring to the relationship. (One need not have a Ph.D. in women’s studies to notice that there are, as yet, no female android cops in the show, and that the two main women characters, played by a doe-eyed Minka Kelly and the excellent Lili Taylor, are familiar stereotypes—the sensitive lady-cop and the tough-love boss, respectively.) But by episode’s end, the Albanians trounced and the damsels rescued, it is once again Dorian the android who teaches Kennex the man about the way genuine connection can alleviate loneliness.
All of this android empathizing raises the question: Can intuition be programmed? The show’s creators imply that it can and should be, if the result is the kind of companionship and reliability that Dorian offers Kennex. Besides entertaining us, which it does very well, Almost Human is normalizing the notion that we can create technologies that can teach us how to be better human beings. Even the show’s title teases viewers: Is it Kennex, with his synthetic leg, brusque manner, and faulty memory, who is “almost human,” or the emotionally grounded android Dorian?
The show’s creators, writer J.H. Wyman and producer J.J. Abrams, pay homage to earlier science-fiction classics—one scene in the pilot episode, which features Kennex and Dorian sitting at an outdoor noodle bar amid dark rain and flashing neon lights, is a near-perfect re-creation of a similar scene in the movie Blade Runner. Science fiction is a useful vehicle for exploring the hopes and fears we have about technology and human nature, and for questioning what makes being human a different experience from being an animal or a machine. Even kitschy television cop shows like Knight Rider explored such themes. KITT the computerized talking car was a reliable and sympathetic friend to Michael (played by David Hasselhoff). But although the Hoff was always thankful for KITT, he rarely ended an episode of Knight Rider without securing a new girlfriend. The ineffable pleasure that relationships with other human beings bring was something earlier shows took for granted.
Not anymore. If you believe the claims of leaders of technology companies such as Google that they can engineer serendipity and predict your next behavior, then it is not surprising that our television shows are beginning to mirror such claims and ask the provocative question: Will the souls of our new machines one day become more sensitive than our own?
This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.