Thursday, December 1, 2011

Is Siri Anti-Choice?


Siri can't find an abortion clinic, but can find numerous purveyors of donuts.

My wife, who is a pro-choice photographer and activist, sent me a tumblr post the other day about the inability of people using Siri, the personal assistant on the iPhone 4S, to get it to access abortion or birth control services. As the screen grab above demonstrates, while you can't get Siri to look up a reproductive health centre, you can find lots of donuts.

I feel in a bit of a bind on this. My initial instinct was outrage that such important information would be excluded from a very popular built-in app on the most popular smartphone in America. That was fed by the fact that Apple is one of the world's largest companies, and any residual sympathy I felt for the "other" computer manufacturer as the rebel PC company has long since dissipated. Apple may have a cutting-edge design and engineering team, willing to challenge entrenched distribution chains like the music industry (well, a little bit) and to overturn existing conceptions about devices that are central to our lives, like phones - but it makes its profits by outsourcing production to sweatshops. Within the last two weeks there have been major strikes at facilities in China that manufacture Apple products. And I'm militantly pro-choice: women have the right to control their bodies, full stop, and that must include access to information on abortion and reproductive technologies like birth control.

But something seemed off about the attack on Siri. Now, I love my Siri. Well, maybe love is too strong a word. I feel a powerful emotional connection to my little PA. But that wasn't the issue: if Siri was behaving badly she ought to be called out on it. Except that the more I thought about the article, and talked it over with my wife, the more I realized that the problem here was not Siri per se, and that the writer of the original column misunderstood what Siri is and how it works.

First off, properly understood, Siri is really a hub rather than a service. It uses natural language processing and an AI to understand what you're asking, and then turns that request into an action by interfacing with other apps that actually deliver the service. So, when you ask Siri "where can I get some donuts?" it isn't actually Siri that finds out where you can get some heart-stopping, deep-fried dough. Siri converts your request into a comprehensible command and sends that command to Yelp. Yelp is a location-based service that is sort of a crowd-sourced yellow pages: it contains businesses and services that have been reviewed by users. When you ask Siri for donuts, it looks up donut shops on Yelp and then provides you with the results, including the star ratings of those shops, as seen in the screen grab above.
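Purely as an illustration of that hub model - this is a toy sketch, not Apple's actual code, and every name in it is made up - the routing works roughly like this: parse the utterance into an intent, then hand the intent to whichever backend service answers it, with a snippy fallback when nothing matches.

```python
# Hypothetical sketch of hub-style request routing. The hub itself answers
# nothing; it only parses and dispatches to a backend (here, "Yelp").

def parse_intent(utterance: str) -> dict:
    """Crudely map a natural-language request to an intent."""
    text = utterance.lower()
    if "where can i get" in text or "find" in text:
        # Pull out the thing being searched for (very naive extraction).
        topic = text.split("get")[-1].strip(" ?") if "get" in text else text
        return {"intent": "local_search", "query": topic}
    return {"intent": "unknown", "query": utterance}

def dispatch(intent: dict) -> str:
    """Route the parsed intent to whichever backend handles it."""
    backends = {"local_search": lambda q: f"Yelp results for '{q}'"}
    handler = backends.get(intent["intent"])
    if handler is None:
        return "Sorry, I don't understand."  # the stock snippy fallback
    return handler(intent["query"])

print(dispatch(parse_intent("Where can I get some donuts?")))
# -> Yelp results for 'some donuts'
```

The point of the sketch is the division of labour: if the backend has no listing for a query, the hub comes back empty-handed no matter how well it understood you.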

Now, there are three possibilities: there are no "reviews" in your area for abortion clinics and reproductive health service providers, so they have no presence on Yelp; Yelp refuses to accept such reviews; or Siri refuses to access those that do exist. I'm in Toronto, Canada, so I have no location-based services in Siri; I can't ask it to find donut shops or abortion clinics. But when I search Yelp directly, it does return a number of local abortion-related services. On the other hand, when I Yelp "rape crisis" the closest service it provides is a rape crisis centre in Milton (about 40km west of Toronto). Yet if I Google "Toronto rape crisis centre" (using Siri in this case), it returns the Toronto Rape Crisis Centre, which is located in downtown Toronto. As a test I also asked Siri to "Google abortion clinics". It returned a Google search page, including a map with the locations of a number of clinics on it.

Now, there is no abortion law in Canada, and there is a clear majority in favour of a woman's right to choose, thanks to many years of campaigning by the pro-choice movement. In the USA things are much murkier, and the anti-choice bigots have won a number of gains that have rolled back women's right to reproductive choice. So there is the possibility that Apple and/or Yelp is accommodating the more reactionary cultural attitudes and "avoiding controversy". But maybe not.

Siri is incredulous that you've been a rape victim
Is Siri insensitive to women's needs? Or is it just kinda dumb?

There are lots and lots of injustices, bigotries and oppressions in the world. Enough to go around and then some. But we need to be careful not to call "fire" unless we know that there is one, especially when the target is something popular like Siri or a celebrity or a popular institution. Not because any of these things should stand above the kinds of principles that ought to guide a just society - freedom from oppression and fear, personal sovereignty, social justice, access to the means to satisfy human need, etc. Rather, at the level of political tactics, we want to avoid attacking the wrong targets and discrediting an entirely correct general argument, such as women's right to abortion and rape crisis services.

But the tumblr piece is flawed because the writer didn't take the time to find out what Siri is, what it does, and what its limitations are. For instance, the writer compares Siri's inability to access abortion services with the results of using Google. But if she had asked Siri to "google" abortion clinics or birth control, she would have gotten different results. And here is the nub: Siri is dumb. I mean, it's smarter than any other portable AI out there. It's incredibly impressive how it can understand natural language and convert it into commands that access apps, etc. But it's not smart enough to know that if Yelp doesn't have any listed abortion clinics it should try Google. And its built-in features, like following a multi-exchange conversation, mean that it gets hung up sometimes. For instance, you ask it to look for abortion services, and when it can't find any on Yelp you change your request to "search" or even "google" abortion services - but Siri is still trying to go to Yelp. It thinks that you're still having the same conversation. You have to tell Siri to "start over" so it knows you're asking it to do something entirely different.

Likewise with the complaint that Siri responds dismissively to the input "I was raped" with "Really!". Except that Siri still has a pretty limited ability to understand language that doesn't have a clear command component to it (and even then it ain't near perfect). When it comes across a request that it doesn't know how to process into a command, it returns one of a series of stock, slightly snippy responses. The snippiness is supposed to be part of its charm and personality. When I say to Siri "I like candy" it replies "Yes, I heard that somewhere." It has been programmed to understand and convert certain non-command comments about mental and physical state into commands, but these are incomplete. If you say you are drunk, for instance, Siri will offer up a number of taxi phone numbers. It's reasonable to expect that Siri ought to have a similar response to "I was raped", and the lack of one indicates a cultural bias by the (probably almost all male) software engineers who programmed it.

In Siri's (and Apple's) defence, they have called Siri a "beta" for a reason - it is still in development, a work in progress. As noted above, outside of the USA there are no location-based services. Siri doesn't yet have Spanish language support, or Quebecois French in Canada (our second official language). Siri can't interface with my other apps like Twitter, Evernote or Remote. Should Siri have had Spanish language support upon its release in America, where tens of millions speak it as their first language? Yeah, probably. Is it a reflection of the second-class status of Latin Americans in the USA that Apple didn't consider this? Again, yeah, probably. But negligence in development priorities - while politically significant - is not the same as an active conspiracy. Apple has said that in the new year it intends to introduce not only Spanish but Mandarin and a couple of dozen other languages. My guess is that Siri will also "learn" new contexts as it is used by millions of people and Apple's servers suck up the data from that usage and turn it into usable information about what gets requested, which becomes the basis for new forms of understanding by Siri.

Now, it may be the case that Apple is actively preventing Siri from looking up certain services (being in Canada, I can't tell re: Yelp). If so, it should be shouted from the rooftops, and Apple ought to be shamed into providing full access to all information (ditto if Yelp USA were doing that). But, unfortunately, this tumblr piece by Amadi fails to prove that this is happening, and it discredits the argument by displaying a lack of understanding of how the software works.

Postscript: Right after I wrote this, I came across a piece in the New York Times about a petition making the rounds from NARAL (National Abortion Rights Action League) calling on Apple to fix the problem. An Apple spokesperson responded:
"Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want,” said Natalie Kerris, a spokeswoman for Apple, in a phone interview late Wednesday. “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks."

Siri Failures, Illustrated | Amadi Talks:
