Siri fails to produce good results if you ask it for abortion or birth control instead of Viagra.

The XX Factor
What Women Really Think
Dec. 1 2011 11:18 AM

Siri Doesn't Know About Your Lady Stuff

Does Siri give you the information you are looking for? Let us know.

Photograph by Oli Scarff/Getty Images.

The past couple of days, the lady-sphere has erupted in outrage over the discovery that Siri, the iPhone's voice-activated assistant, is a real dunce when it comes to women's basic health care. If you tell Siri you've been raped, it directs you to long-term services instead of immediate ones such as the police or the hospital. Siri knows that drugstores sell Viagra, but it doesn't seem to have the same information if you ask it for birth control, even though, unlike Viagra, many forms of birth control are sold over the counter. It knows how to find an escort service if you plug in any manner of words relating to sex (unless you mention slang terms for acts performed specifically for women), but it isn't smart enough to know that Planned Parenthood gives abortion referrals. I asked it specifically for an abortion, and it had no answers, even though one of the busiest abortion clinics in all of New York is within walking distance of my apartment. At least it didn't give me anti-choice crisis pregnancy centers instead, which is what some women have been finding when they ask Siri about abortion services. Statistically speaking, way more women use abortion services than men use prostitution services, so the oversight is maddening.

The problem here is one of neglect, not malice. The programmers behind Siri seem to be a bunch of gleefully juvenile dudes who took the time to teach Siri corny jokes, marijuana know-how, and sci-fi references, along with serious problems that can affect both men and women, such as suicidal thoughts. And even though they really like the idea of sex with women, they seem not to have thought much about the work that women have to put into being sexually accessible. Just as with the mind-boggling name fail of the iPad, the problem seems to be that there simply aren't enough women working in innovative, customer-driven technology services, and the ones who do have to adopt a bro-like attitude that makes them nearly as forgetful of the concerns of ordinary women as the men are.

I don't have Siri on my phone, but my boyfriend does, and like pretty much all dorks left alone with Siri for five minutes, we've had our fun playing with it. It was also pretty stupid when I asked it for a vasectomy. Just as with the phrase "birth control," it had no ability to look past the actual names of clinics, so instead of producing the names of local urologists, it gave me a couple of seedy-sounding places with the word "vasectomy" right in their names. But I still can't see this as some kind of egalitarian fail on the reproductive health front; even though vasectomies are performed on men, they are done to protect women from pregnancy. Again, it just seems that some of the most basic, everyday health concerns of women hadn't registered as important with Siri's programmers. When I used some common slang terms for oral sex performed on women, Siri seemed to think I was in the mood for a hamburger or in the market to buy a cat (and shame on Siri for sending me to a pet store instead of a local animal shelter!). It had zero problem knowing what I meant when I referenced fellatio.

So, I ask commenters: What have you found that Siri knows and doesn't know? How ignorant is Siri of women's basic concerns, specifically? Does your Siri really think that people use the word "pussy" to mean "cat"? Ask your Siri to help you with some of the concerns of lady-life, and report back in the comments with what it says. Apple is clearly embarrassed by this whole thing, so maybe they're cruising the internet looking for ways to fix the problem, and your comments could help.