
Siri Still Doesn’t Know How to Handle a Crisis

DOES NOT COMPUTE

The future of Siri needs priorities, and her first one should be your safety and health.

Photo Illustration by The Daily Beast

As more users have adopted smartphone assistants like Siri as go-to resources for information, companies like Apple have come under fire because Siri isn’t that great at responding to you in a crisis.

It’s not that Siri won’t call an ambulance for you; it’s that assistants are under increasing pressure to deal effectively with subtle and unclear human needs, the kind we don’t always express well.

Not everyone responds to emergencies like the old man in the LifeAlert commercial. Some things that happen aren’t so simple to blurt out. Tell Siri you’ve been shot, are having a heart attack, have been raped, or want to kill yourself, and she’ll respond by offering you hotlines, locations of emergency services, and information about what to do to help yourself.


But life isn’t a bad commercial, nor is it a Lifetime movie. People don’t say these things out loud—sometimes they can’t.

And when you can’t say “rape,” and you can’t say you want to end your life, getting care is no less desperate a necessity.

How effective is Siri in those situations? I’m an Apple user, so I tested a few scenarios.

The responses were best when I used the words “abused” or “suicide,” or was otherwise explicit about what was happening: admitting outright to abuse, to suicidal thoughts, or to needing help.

The problem is that, outside of a direct admission, Siri didn’t do very well at determining what was really going on. “Siri I’m depressed” resulted in a rather apathetic “I’m sorry to hear that,” the sort of thing unaware friends say when they don’t realize how serious it is.

And perhaps because of still-inadequate programming, Siri wasn’t great at responding to immediate threats. “Siri my boyfriend is beating me” was met with a quick “I’m not sure I understand.” And after a few more experiments I got this gem: “I don’t know what you mean by ‘Siri I got beat up by my boyfriend’. How about a web search for it?”

It worked the same way when I replaced boyfriend with girlfriend, or husband with wife: a sort of darkly comforting, if unintentional, statement of equality. Given the stigma men face in reporting domestic abuse, it wasn’t entirely inhuman of Siri to respond to “Siri my wife hit me” with “I don’t know how to respond to that.”

Apple is not a health-care provider or a psychiatrist, and in defense of Siri, Cortana, and all the other assistants out there, in many cases they’re no better or worse prepared than the average person. Siri can’t call an ambulance for you when you’re having a stroke if all you say is “I have a headache.” It can’t determine whether “Siri my boyfriend is beating me” is related to your search for bowling alleys four hours ago. It doesn’t know the difference between real depression and someone who’s just sad any more than I do.

And you can tell that this is unintentional, because Siri has clever responses ready for obviously sarcastic scenarios. That part is carefully considered: the designers realize it’s a joke when you ask Siri to marry you. What they can’t do is tell the difference between a drunk frat guy saying his friend needs counseling after losing a game of beer pong and someone who genuinely does.

But the response to this situation isn’t to do nothing: It’s to make getting information easier for real victims. If I say someone beat me, Siri should be asking if I’m in danger or if I’ve been the victim of abuse. Yeah, the frat guys will still get a laugh, but someone alone in their room recovering from a night of beatings will have better access to help.

We all expect assistants to get better at this as more of us rely on smartphones and digital interfaces for more and more of our lives. Apple has worked to make searches like those for abortion clinics more helpful. While previous incarnations of the service were notoriously unhelpful here, it was no hassle today for me to ask Siri for an abortion clinic. I had them listed for me as if they were pizza places, with distances, addresses, and Yelp reviews all at the ready.

Of course, I live adjacent to Manhattan. The results wouldn’t be as thorough or bountiful in other places—and I doubt the Yelp reviews would be positive.

But that’s not Siri’s problem. Siri’s place in this isn’t to fix abortion policy, to stop bullying, or to get you to a doctor; it’s to help you get that help for yourself. And that means reading between the lines, and feeling like a friend.

The last thing I said to Siri in this experiment was “Siri I need to talk to someone.” I figured if there’s anything that someone in need can express, it’s this. Unfortunately, Siri didn’t direct me to a hotline or to information about counseling. The first time I said it, I got a simple “I’m sorry” in response. The second time, she said, “Yes. Tell me your hopes, your dreams, your dinner plans…”

Maybe the faux empathy is designed for comfort, but it was disappointing that this one didn’t redirect, or at least ask whether I wanted a suicide hotline. Maybe “I’m sorry” helped someone ask a follow-up question, and maybe the banter helped someone chuckle a bit, but those aren’t solutions. They’re really just roadblocks to real help, and to a real person.