Security

Radio Waves Can Be Used To Hijack Androids and iPhones Via Siri and Google Now

An anonymous reader writes: Two French researchers have discovered a way to use the Siri and Google Now voice assistants to relay malicious commands to smartphones without the user's consent or knowledge. The method relies on a special hardware rig that sends radio waves to smartphones that have earphones plugged into them. The radio waves are picked up by the earphone cable, transformed into electrical signals, and then interpreted as voice commands. The research is accompanied by a YouTube video. Note that this attack, as the article explains, so far relies on bulky dedicated equipment, and on the attacker being close to the device he wants to target.
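For readers who want the signal-processing picture: the principle is ordinary AM radio run in reverse, with the headset cable as the receiving antenna and the phone's mic input as an accidental demodulator. Below is a minimal Python/NumPy sketch of what a transmit side might generate. Everything in it (the sample rate, the 440 Hz tone standing in for a recorded voice command, the output file name) is an illustrative assumption, not a detail from the article.

    import numpy as np

    fs = 250_000                 # baseband sample rate (Hz), illustrative
    duration = 2.0               # seconds of "voice command" audio

    # Stand-in for a recorded command ("OK Google ...") -- a bare tone here.
    t = np.arange(0, duration, 1 / fs)
    command_audio = 0.3 * np.sin(2 * np.pi * 440 * t)

    # Classic AM with the envelope kept positive, so that a crude envelope
    # detector in the phone's input stage can recover the audio.
    am_envelope = 1.0 + command_audio    # modulation index 0.3

    # As complex-baseband IQ samples, an SDR front end (hypothetical here)
    # would upconvert this to whatever carrier the headset cable resonates
    # at, amplify it, and radiate it.
    iq = am_envelope.astype(np.complex64)
    iq.tofile("am_command.iq")           # hypothetical output file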
  • Ask someone's Siri where the horse dick is. Ask someone's Siri where the hard drugs are. Ask someone's Siri where the inflatable dolls are. Make sure you target politicians, you pranksters...
    • ... yeah, and it'll play over their headphones ... so no one will hear it ...

      Next time read the summary, not the headline. It works with headphones plugged in, by stimulating the microphone on the earbuds with RF.

      No earbuds, no worky. With earbuds plugged in, no one but the wearer will hear its response ... so effectively the prank doesn't work.

      Of course the required RF is going to cause other issues besides Now/Siri acting up, but go ahead, continue to be ignorant and too stupid to realize this is nothing more than another sensationalist Slashdot headline.

      • Risk (Score:4, Interesting)

        by fyngyrz ( 762201 ) on Sunday October 18, 2015 @11:43AM (#50753851) Homepage Journal

        You may be misunderstanding the risk, such as it is.

        o Siri is given instructions via RF injection and incidental demodulation within the phone's mic input electronics.

        o Siri performs an action you didn't ask it to do.

        o You won't necessarily hear the instructions come in. In the cable, it's RF, so your earphones would also have to demodulate the signal. If they're purely inductive (most headphones are), they won't. If the circuitry they're plugged into doesn't provide incidental demodulation (much less likely on an output stage than on a mic input), it won't get back to the earphones that way either. The last chance is if whatever the phone hears is read back to your earphones by Siri / etc. Does it do that? My Galaxy Note 3 doesn't with Google voice. Why would it, anyway? (A sketch of this incidental demodulation follows below.)

        o If you're not looking at your phone, you might not even be aware this had happened. You might even be asleep. I nap with my earphones in, listening to music, on a fairly regular basis, for instance.

        So while the equipment and proximity requirements make it extremely unlikely to be an immediate threat, it could still cause problems in the rare cases where those requirements are met. Mostly it depends on what the phone can be told to do, and how much of that it will do without further interaction / confirmation.
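        A minimal Python/NumPy sketch of that incidental demodulation, treating the mic input's nonlinearity as an ideal envelope detector (rectifier plus low-pass filter). The carrier and tone frequencies here are illustrative assumptions, not values from the research.

            import numpy as np

            fs = 1_000_000   # simulation sample rate (Hz)
            fc = 100_000     # illustrative carrier frequency (Hz)
            fa = 440         # voice-band tone standing in for a command (Hz)
            t = np.arange(0, 0.05, 1 / fs)

            # AM signal arriving on the headset cable: audio on an RF carrier.
            envelope = 0.5 * (1 + np.sin(2 * np.pi * fa * t))
            rf = envelope * np.sin(2 * np.pi * fc * t)

            # A nonlinear input stage acts like a rectifier, and the input's
            # limited audio bandwidth acts as a low-pass filter -- together,
            # an accidental envelope detector.
            rectified = np.abs(rf)
            kernel = np.ones(200) / 200          # ~200 us moving average
            recovered = np.convolve(rectified, kernel, mode="same")

            # 'recovered' now tracks 'envelope': the injected audio reappears
            # at voice frequencies even though only RF reached the cable.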

  • Since the researcher did not try to see whether the same trick would work on a Lumia
  • Use Bluetooth headphones/headsets.

  • They can use radio waves to remotely control and tap into/scan anything, even DRAM, CPUs, the brain/nerves, USB, keyboards, monitors.

    The technique is called interferometry/electronic warfare, but you can also do it with off-the-shelf parts. The off-the-shelf approach is called Van Eck phreaking: https://en.wikipedia.org/wiki/... [wikipedia.org]

    More info on the interferometry/electronic warfare kind used by our government from space satellites and over the horizon radar at http://www.drrobertduncan.com/ [drrobertduncan.com]

    Info on interferometry: https:// [wikipedia.org]

  • "OK Google, begin DDoS script."

    Imagine rolling through Times Square on New Year's Eve. Omnidirectional antenna on a micro version of this, get in the middle of the crowd, pwn everyone using wired headsets with a microphone: instant cellular botnet. And since you're not issuing commands from a cell phone or through the cellular network, you're not going to be traceable through that system.

    You are effectively an invisible and untouchable command-and-control server. All you do is issue the command in a quick burst and go silent.

    • by TWX ( 665546 )
      Omnidirectional antennae probably can't generate the desired effect.
      • by Guignol ( 159087 )
        No, this is a French attack; we have to wave in your general direction...
      • by Khyber ( 864651 )

        Just push enough power to it. Burst transmissions aren't that difficult to achieve.

      • Omnidirectional isn't important. What's important is that the RF wavelength be a near match for the length of the headset cord, so the cord acts as a resonant antenna.
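        Roughly worked out (illustrative numbers, not from the paper): a wire driven as a quarter-wave monopole resonates near f = c / (4L).

            C = 299_792_458.0  # speed of light, m/s

            def quarter_wave_resonance_hz(cord_length_m: float) -> float:
                """Approximate resonant frequency of a wire of the given
                length acting as a quarter-wave monopole antenna."""
                return C / (4.0 * cord_length_m)

            # A typical ~1 m earbud cord resonates near 75 MHz (VHF, near the
            # FM broadcast band), which is why a modest nearby transmitter can
            # couple usable power into it.
            print(quarter_wave_resonance_hz(1.0) / 1e6, "MHz")   # ~74.9 MHz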
  • OK, this is the sort of question that could be answered by RTFA, however when it's a 40-minute long video, I don't feel as bad.

    When configuring Siri for voice activation, you go through some steps that give the impression that it's tuning the activation for your specific pattern of speech. Which presumably is to prevent false activation when somebody next to you is using the feature on their phone.

    Assuming this is actually happening, would that prevent this sort of attack?

    • by jo_ham ( 604554 )

      OK, this is the sort of question that could be answered by RTFA, however when it's a 40-minute long video, I don't feel as bad.

      When configuring Siri for voice activation, you go through some steps that give the impression that it's tuning the activation for your specific pattern of speech. Which presumably is to prevent false activation when somebody next to you is using the feature on their phone.

      Assuming this is actually happening, would that prevent this sort of attack?

      I doubt it. The voice training just makes Siri respond more effectively to you when there are other noises around during activation. It can still be activated by someone else saying "Hey Siri" even after this training step (although commands are more limited if the iPhone is locked).

  • Just have Siri or OK Google say something whenever interpreting a voice command. Something simple like "OK Boss" would let the user know something is going on with their phone.

    Which, of course, leaves the problem of how a non-tech-savvy person would know that when your phone is doing weird shit you unplug the headphones, which is probably the harder thing to figure out, but hey.

    • by wbr1 ( 2538558 )
      And the output you speak of comes through the headset. If that is not on a user, no bueno. However, with Google Now, if you have a screen lock it will not run commands without unlocking.
      • For the hack to work the headphones have to be plugged in. They are the attack vector. I can't think of a lot of use-cases where the headphones would be plugged in, but not in your ears.

  • Just say'n.
  • I'd love to lean into the mic at a packed concert and say, "Ok Google, call mom, yes ... Hold the tourniquet tight while I find the vein."
