Future Tense

Does Siri’s Suicide Prevention Update Cross a Line?

The Golden Gate Bridge in San Francisco. Photo by Justin Sullivan/Getty Images.

It used to be that if you told Apple’s Siri you wanted to jump off a bridge, she’d provide a list of nearby bridges. But after this week’s iOS update, she now asks whether you want her to call the National Suicide Prevention Lifeline. (And even if you say no, she pulls up directions to local suicide prevention centers.)

From a human standpoint, this is an excellent use of technology. In this country, more than 100 people kill themselves each day, and anything we can do to help our troubled friends, neighbors, and family members is a worthy pursuit. That said, when Siri provides suicide prevention numbers instead of bridge listings, her creators are making a value judgment about what is right. Of course, Siri is just a gimmicky gateway to the Internet, one with plenty of workarounds. But she’s also part browser, part search engine. Even though this case feels straightforward—most of us would agree that people shouldn’t kill themselves—it raises important ethical questions about how free of interference the Internet and technology should really be.

There are plenty of reasons why gaming the system this way could be handy. If virtually all scientists agree that climate change is real or that vaccines do not cause autism, why should a Google search bring up an array of conspiracy sites that say otherwise? (See Evgeny Morozov’s “Warning: This Site Contains Conspiracy Theories.”)

Conversely, there are all sorts of arguments for why the Internet shouldn’t have a guiding hand—freedom of speech, press, and protest chief among them. If someone has to make decisions based on what’s “right,” whom will we trust to be that arbiter?

Apple declined ABC News’ request for comment, but the Siri update is consistent with a number of other recent online initiatives pertaining to suicide. Facebook, for its part, has been working with the National Suicide Prevention Lifeline to study how people who commit suicide behave on the social network in the days and weeks leading up to their deaths. Similarly, if you Google “best ways to kill yourself,” a small banner with the Lifeline’s phone number trumps all the other results.

I know, all of this is a lot to extrapolate from a small programming tweak designed to save lives—it’s also an interesting coincidence that it should come out the same week as Aaron’s Law—but Siri’s role as guardian angel is rather inconsistent. She has no problem directing me to the closest gun store. OK, firearms are dangerous, but they’re also legal. Prostitution, however, is not (at least in my state), yet when asked, Siri directs me to the next closest thing without so much as a wink: nearby escort services. All ambiguity is lost when I ask for advice on the best place to hide a body and she instantly starts navigating to the closest reservoir.

Regardless of the philosophical ramifications, suicide prevention professionals see the update as a good thing. Robert Gebbia, executive director of the American Foundation for Suicide Prevention, says: “It’s conceivable that people will use the technology as a way to reach out when they don’t know where to turn.”

I hope it does help someone. But if everything we’re learning about the NSA and Apple is true, then the questions I asked Siri for this article probably got my name on a lot of weird lists.