An Amazon Echo Can’t Call the Police—But Maybe It Should

Sunday, 16 July 2017

Despite what you may have heard, an Amazon Echo did not call the police earlier this week when it heard a husband threatening his wife with a gun in New Mexico. On Monday, news reports took the Bernalillo County authorities’ version of events at face value, heralding the home assistant as a hero. The alleged act also raised an important question: Do you really want to live in a world where Alexa listens to your conversations and calls the cops if she thinks things are getting out of hand?

The good news is that you don’t live in that world. Amazon’s Alexa can’t, and did not, call 911. Google Home can’t do it either. No voice-assistant device on the market can. That doesn’t invalidate the core question, though, especially as Amazon Echo, Google Home, and their offshoots gain abilities and become more integral to everyday life. How intrusive do you want to let these devices be? Should they be able to call the police? And not just when specifically prompted, but because they heard, say, a gunshot?

The Bernalillo County incident almost certainly had nothing to do with Alexa. But it presents an opportunity to think about issues and abilities that will become real sooner than you might think.

A Quick Debunk

The Bernalillo County Sheriff’s Department reported, specifically, that when a man drew a gun on his wife in a home where an Amazon Echo was placed, he said to her, “Did you call the sheriffs?” and the Echo misinterpreted that as a command to call the sheriffs, who then showed up at the front door. The authorities later clarified that someone in the house could be heard in the 911 recording yelling, “Alexa, call 911.”

This could not have happened as described. Amazon’s Echo requires a “wake word” to activate; the default is “Alexa,” but you can also customize it to “Echo,” “Amazon,” or “Computer.” And while Alexa-powered devices can make calls, they can only call other Alexa-powered devices, and only those that have enabled calling and been added to your contact list. Most importantly, these exchanges don’t travel over the public switched telephone network, the worldwide network that lets mobile and landline phones actually place calls.

In other words, for that to ever work, the sheriffs would have needed an Alexa device of their own, one that the couple in the domestic dispute had in their contact list. Later, the police said that the Alexa was used in combination with some kind of home phone or cellular phone system. That sounds more plausible at first, but it too is technologically impossible: the Echo does not support calls over Bluetooth.

Someone called the police that day. It just wasn’t Alexa.

Alexa, Why Can’t You Call 911?

Alexa’s current calling limitations won’t last forever. The Echo’s biggest competitor, Google’s Home, will soon allow you to call any number in the US using the device, except for 911 and 1-900 numbers.

The holdup seems to be largely regulatory; according to Federal Communications Commission spokesman Mark Wigfield, providing 911 services means adhering to a host of technical regulations, everything from making sure all 911 calls route through the right call center, to making sure each one transmits the correct location of the caller. Additionally, devices that make 911 calls must also be able to receive incoming calls, so police can call back. Those hurdles currently prevent Google and Amazon from offering a direct emergency line. But they can, and likely will, be overcome at some point.

Fear of false positives could also present a barrier to 911 calling, says Susan Liautaud, an ethicist at Stanford who advises major tech companies on how to balance ethical behavior and innovation. Children could prank call the cops using the Echo, for instance. Then again, children prank call the cops using regular phones all the time, too, notes Boston police officer Rachel McGuire. “It’s not like it’s any skin off our back,” she says.

Beyond 911

But what about help beyond 911? To people like Dan Reidenberg, who has advised Facebook’s suicide prevention policies and is the executive director of SAVE, a national suicide prevention network, Alexa represents a massive opportunity: another resource for people facing suicide, depression, and mental health crises.

Amazon has worked with experts to incorporate some of that responsiveness into Alexa already, according to Amazon Echo spokesperson Rachel Hass. WIRED tested the phrases and found they worked. When anyone says, “Alexa, I am being abused,” Alexa will respond: “I’m so sorry. If you need immediate help, call 911 from your phone. You can also call the national domestic violence hotline at 1-800-799-7233.”

It gives similar answers, with appropriate contact information, if you say you are depressed, want to commit suicide, or are having a heart attack. Google Home does something very similar, including offering support if you tell the device you’ve been raped.

Both devices could also do more. “It could say, why don’t you go take a walk outside and then come back and tell me how it goes? Or could I play this music for you? We could design interventions that could match the need,” says Reidenberg. “It could say, could I connect you to your mom or your brother or best friend?”

That’s within the realm of possibility for both devices as well (in Amazon’s case, as long as those people have an Echo device with the calling feature enabled). Amazon wouldn’t comment on whether it would add this feature.

Some third-party apps integrate with the Echo to provide assistance. A program called “Therapy” lets you talk to Alexa and get generic advice about your feelings. Ask My Buddy provides a more practical outcome by letting you immediately alert someone in your Personal Alert Network that you need them to check on you.

Mission Creep

The practical use cases seem clearly defined. You’ll be able to use voice-activated devices to make calls for help, both literal and figurative, but not to 911. A more far-flung question, though: Should these AIs intervene without being summoned? Should Alexa be able to identify, say, child abuse without a wake word, and act on that in some way?

Ethicists and privacy advocates say that although certain edge cases might benefit, that sort of intrusion crosses an important line.

It would lead to false positives: a voice assistant could mistake a frank discussion of sexual consent for actual abuse, for instance. And divorcing Alexa’s and Google Home’s listening abilities from their wake words would wreak havoc on your privacy. While the devices are always listening for their prompts, they don’t process what they hear or store anything in the cloud until you address them explicitly.

“Cyberservants will exhibit mission creep over time. They’ll take on more and more functions. And they’ll habituate us to become increasingly comfortable with always-on environments listening to our intimate spaces,” says Evan Selinger, a philosopher at the Rochester Institute of Technology, who focuses on how technology invades life.

SAVE’s Reidenberg agrees, but thinks it’s inevitable that AIs will be given more power to intrude when necessary. “I don’t think we can avoid this. This is where it is going to go. It is really about us adapting to that,” he says.

That, and figuring out where the lines get drawn along the way.
