Security AI

New (Deep Learning-Enhanced) Acoustic Attack Steals Data from Keystrokes With 95% Accuracy (bleepingcomputer.com) 50

Long-time Slashdot reader SonicSpike quotes this article from BleepingComputer: A team of researchers from British universities has trained a deep learning model that can steal data from keyboard keystrokes recorded using a microphone with an accuracy of 95%...

Such an attack severely affects the target's data security, as it could leak people's passwords, discussions, messages, or other sensitive information to malicious third parties. Moreover, contrary to other side-channel attacks that require special conditions and are subject to data rate and distance limitations, acoustic attacks have become much simpler due to the abundance of microphone-bearing devices that can achieve high-quality audio captures. This, combined with the rapid advancements in machine learning, makes sound-based side-channel attacks feasible and a lot more dangerous than previously anticipated.

The researchers achieved 95% accuracy from the smartphone recordings, 93% from Zoom recordings, and 91.7% from Skype.

The article suggests potential defenses against the attack might include white noise, "software-based keystroke audio filters," switching to password managers — and using biometric authentication.
  • No no no (Score:5, Funny)

    by weirdow ( 9298 ) on Sunday August 06, 2023 @01:44PM (#63744684) Homepage
    What we need is a keyboard that randomizes the layout of the keys after every keypress.
    • Make the F keys non-functional when the cursor is in a password field and then incorporate an arbitrary number of F keystrokes into your passwords.
      • Make the F keys non-functional when the cursor is in a password field and then incorporate an arbitrary number of F keystrokes into your passwords.

        Which they’d simply recognize and filter out.

        Also, given that this is AI-trained, would it need to be trained on a recording of you typing known text first? Otherwise they’d be using key frequency and other techniques to try and guess which letters were which, but even then it feels like they’d need a sample of non-password typing.

        • Which they’d simply recognize and filter out.

          How? You think that the AI is identifying different keys based on pitch?

          Actually, I was dismissive of that idea, but looking at the paper it does seem that's their method. I didn't think that would be possible without significant data from each individual keyboard, because every keyboard sounds so different. Typing rhythm, by contrast, has some commonalities across different keyboards and people.

          • Yeah, I had assumed (thankfully not wrongly, based on your quick glance at the paper, though I could have been) that it was acoustically fingerprinting them in some manner, hence why higher-fidelity recordings yielded better results. And yeah, I’m with you in assuming that it has to be trained on each keyboard, perhaps even for each typist, which to me would significantly limit the danger posed by this attack. Unless it can learn without a curated training data set, in which case this is terrifying.

      • by PPH ( 736903 ) on Sunday August 06, 2023 @05:14PM (#63745204)

        F... that!

    • At work many years ago we actually used a pinpad on a secured door that randomized the positions of the numbers every use. Boy did that SUCK! Like so... [reddit.com]

      Here's another solution, only marginally better [wikimedia.org]...

    • The big variance in mechanical keyboard keys is in the force curve and in the clickiness. Invent keys where you can control these things on the fly, and you can defeat this attack and have the holy grail of customizable mechanical keyboards at the same time.
  • All my login/password information is pasted in using Bitwarden. They can have everything else.

    • by Arethan ( 223197 )

      Exactly this.
      The only danger with this approach is that they might grab your master password, and thus get access to the whole lot of your secret store.
      However, Bitwarden and several competitors support OTP integrations to unlock the vault. So, even if they catch all of your keystroke audio, replaying a recording through an AI model would not be very helpful, as the OTP invalidates rather quickly.
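      A quick illustration of why a replayed code goes stale (a sketch using the pyotp library; the secret below is just an example, and a real vault verifies server-side):

        import pyotp, time

        totp = pyotp.TOTP("JBSWY3DPEHPK3PXP")   # example base32 secret, not a real one
        captured = totp.now()                   # the code an eavesdropper might recover
        time.sleep(35)                          # ...but TOTP windows roll every 30 seconds
        print(totp.verify(captured))            # False: the captured code no longer validates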

    • All my login/password information is pasted in using Bitwarden. They can have everything else.

      Yep. All my online passwords are generated by randomly bashing the keyboard, so I could never remember them; they have to be copy-pasted. Good luck listening to CTRL-V and figuring out what my password is.

      Plus: This probably needs training for every keyboard. It probably only works on MacBooks at the moment, not my Model M.

  • by quonset ( 4839537 ) on Sunday August 06, 2023 @02:05PM (#63744740)

    The researchers achieved 95% accuracy from the smartphone recordings, 93% from Zoom recordings, and 91.7% from Skype

    I checked the aforementioned document and the word 'Teams' doesn't appear, which isn't surprising considering the poor audio quality of Teams.

    • I've had a LOT of complaints about Teams over the years, but audio quality is one of the few things that it actually does pretty well...at least within the past four years or so.
  • by phantomfive ( 622387 ) on Sunday August 06, 2023 @02:40PM (#63744830) Journal
    Reproducing work published in 2005 [berkeley.edu]. Woo.

    Building on work from the 1950s [wikipedia.org]. Good job, guys.
    • THANK YOU.
      I clearly remember reading, decades ago, about basically the same attack; it didn't use deep learning, but it was still quite effective. Radio noise also gives off a lot of data, and I've read a few papers on that topic... I suspect we'll see another paper along those lines, but with deep learning applied to the radio signal. It doesn't appear the AI is adding much improvement here either.

    • by klashn ( 1323433 )

      When there's grant money, you use it.

    • or maybe it was the 90s.

      Some show with a mountie down in the US for some reason or another.

      He heard a password typed in, and then repeated the sequence/tempo of clicks while he tried finger strokes in the air, and then entered the password.

      And that's *all* I remember about that apparently unmemorable show . . .

  • And nowe... (Score:4, Funny)

    by ukoda ( 537183 ) on Sunday August 06, 2023 @02:54PM (#63744868) Homepage
    If I could just typo with greater thgan 95% accuraccy....
  • by Anonymous Coward
    Is a "deep learning AI" that analyzes bathroom sounds to determine just how dicey that breakfast burrito I ate was.
  • My employer requires me to log in with a smart card, a personal PIN followed by random digits from an RSA fob. Digits change every 15 seconds. Laptop doesn't work without the card inserted.

  • by Tony Isaac ( 1301187 ) on Sunday August 06, 2023 @04:05PM (#63745046) Homepage

    The attacker would first have to get you to record specific keystrokes on command, at different pressures and speeds, so they have a baseline to compare to. Only with this kind of data up front is the attack able to work.

    Nobody is going to do this.

    See page 5 of the linked research document, describing the process they used to train their system. There is no reason to think the training would be effective on other keyboards, or other people using them.
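    For context, the training step is roughly: record labelled presses on the target keyboard, turn each press into a mel-spectrogram, and fit a classifier on those images. A minimal sketch of that idea (my own simplification using librosa and scikit-learn rather than the paper's deep model; file names and keystroke timestamps are assumed to be known):

      import numpy as np
      import librosa
      from sklearn.neural_network import MLPClassifier

      def keystroke_features(wav_path, press_times, window=0.3, sr=44100):
          """Cut a short clip around each press and return flattened mel-spectrograms."""
          audio, _ = librosa.load(wav_path, sr=sr)
          feats = []
          for t in press_times:
              start = int(t * sr)
              clip = audio[start:start + int(window * sr)]
              mel = librosa.feature.melspectrogram(y=clip, sr=sr, n_mels=64)
              feats.append(librosa.power_to_db(mel).flatten())
          return np.array(feats)

      # Labelled recording of the victim's keyboard (hypothetical files and timestamps):
      # X = keystroke_features("training.wav", known_press_times)
      # clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=500).fit(X, known_keys)
      # guesses = clf.predict(keystroke_features("captured.wav", detected_press_times))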

    • I would like to thank you for being a person capable and willing to read and think.
    • Let me put on my 'evil hat' for a moment. Generally I'm adequately paranoid but insufficiently devious, but I think I got this one...

      What if you can get a listening device into a secure location, but not a key logger? What if you know where the person lives... and you go to their home and put in a combo keylogger / listening device to get your training data?

      It's nothing you or I will ever come up against, but maybe the high security folks should give it a think.

      • Agreed. If you are the specific target of a well-funded and highly-motivated attacker, they can use this to get you.

        Most of us will never be such a high-value target.

        If you *are* such a high-value target, you'd better be using a password manager and 2FA.

      • My guess is that the training dataset would be sensitive to the particular keyboard in use. I use a different keyboard at home because I can. At work, I must use whatever bog-standard piece of garbage IT provides "because reasons."
    • The attacker would first have to get you to record specific keystrokes on command, at different pressures and speeds, so they have a baseline to compare to.

      For now, perhaps. But figuring out the 'space' key should be pretty easy. That gives you word breaks, which should allow various analysis aids - including LLMs and a word-frequency lookup table - to start figuring out words and therefore keys pressed. I suspect stringent application of already existing tech could solve the problem given an hour or two - perhaps much less - of the same person typing on the same keyboard.
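      A toy sketch of that kind of aid (my own illustration, with a made-up frequency table): once the space key is identified, the number of presses in each burst already narrows the dictionary.

        from collections import Counter

        # Hypothetical word-frequency table; a real attack would use a large corpus.
        WORD_FREQ = Counter({"the": 500, "and": 470, "password": 12, "keyboard": 9, "acoustic": 3})

        def candidates(length, top=3):
            """Most frequent dictionary words with the given number of letters."""
            matches = [(w, c) for w, c in WORD_FREQ.items() if len(w) == length]
            return [w for w, _ in sorted(matches, key=lambda wc: -wc[1])[:top]]

        # Keystroke counts between detected space presses, e.g. words of 3 and 8 letters:
        for n in (3, 8):
            print(n, candidates(n))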

      • No doubt such tools could be developed, but an hour or two? I don't think so. And when it comes to passwords, these are generally subject to complexity requirements, which will thwart predictions made by LLMs.

        In any case, even if my time and cost estimate isn't correct, most of us don't have to worry about a well-funded attack specifically targeted at *us*. The bigger concern is malware that is widely distributed and depends on low-hanging security vulnerabilities, like people's willingness to click on links.

  • by organgtool ( 966989 ) on Sunday August 06, 2023 @04:10PM (#63745056)
    Instead of working hard on implementing DRM that absolutely no one asked for [bleepingcomputer.com], why don't you work on extra security features such as automatically disabling all mic(s) when a user is typing in a password box?
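    Something along those lines is at least easy to prototype; a rough sketch (my own illustration, assuming Linux with PulseAudio's pactl and a plain Tk password field standing in for the real thing):

      import subprocess
      import tkinter as tk

      def set_mic_mute(muted: bool) -> None:
          # Mute or unmute the default capture device via PulseAudio.
          subprocess.run(["pactl", "set-source-mute", "@DEFAULT_SOURCE@", "1" if muted else "0"])

      root = tk.Tk()
      password = tk.Entry(root, show="*")
      password.pack(padx=20, pady=20)
      password.bind("<FocusIn>", lambda e: set_mic_mute(True))    # mic off while the field has focus
      password.bind("<FocusOut>", lambda e: set_mic_mute(False))  # restore the mic afterwards
      root.mainloop()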
    • The last fucking thing we need is Google being so up in your PC's ass it can transmit to your phone when you've set focus on a password field.
  • Make some noise (Score:4, Informative)

    by illogicalpremise ( 1720634 ) on Sunday August 06, 2023 @04:36PM (#63745098)

    I read the whole paper (yeah I know, weird, right?!) and I'm pretty skeptical of how well this would work outside an experimental setup.

    The authors briefly acknowledge that desk vibrations, white noise, audio filters and fake key noise seriously affect accuracy. They use a cloth to reduce desk vibrations and adjust the call settings to reduce distortion. They discuss simultaneous key presses reducing accuracy. They discuss potential white-noise filters. They're clearly aware of real-world problems/mitigations; however, they make absolutely no mention of how accuracy is affected by regular background noise or speech (you know, the things most people are doing in meetings - in the kinds of places most people are doing them).

    The authors specifically call out public areas as places you might be vulnerable to a listening device (libraries, coffee shops and study spaces) but didn't actually test their method in any of those places. They use an experiment where the keyboard is the ONLY source of noise.

    I get the feeling the authors went out of their way to avoid testing real world scenarios where the keyboard would be regularly drowned out by other sounds, including obvious things like people talking during a video call.

    So, in my opinion, if you sit silently in a silent room, doing nothing but typing sensitive text during a video call with untrusted parties on a common/known device, then maybe there's a real threat here, but I see no reason for most people to panic. Not that there is zero risk, but I think it's a bit much for the authors to call it "practical" without showing it works outside of a contrived experiment.
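    For what it's worth, the white-noise mitigation they mention is trivial to try; a minimal sketch (my own illustration, using numpy and the sounddevice library):

      import numpy as np
      import sounddevice as sd

      def play_masking_noise(seconds=10.0, samplerate=44100, level=0.05):
          """Play broadband white noise near the keyboard while typing something sensitive."""
          noise = (level * np.random.randn(int(seconds * samplerate))).astype(np.float32)
          sd.play(noise, samplerate)
          sd.wait()

      play_masking_noise(5.0)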

  • What might be useful are USB devices that not only have the functionality of a YubiKey but also have a touch screen on them. The closest to this would be a Trezor Model T.

    When authenticating with the Trezor Model T's FIDO functionality, it presents a PIN pad and randomly places the digits. That way, smudges due to typing in "12345" will never be in the same place twice. From there, once the device is unlocked, it is just tapping "allow" or "deny". What might be a useful addition would be a fingerprint scanner,
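    The randomized layout itself is a simple trick; a toy sketch (my own illustration, not Trezor's firmware):

      import random

      def random_pin_layout():
          # Shuffle the ten digits into a fresh 3x3+1 grid for each unlock attempt.
          digits = list("0123456789")
          random.shuffle(digits)
          return [digits[0:3], digits[3:6], digits[6:9], [digits[9]]]

      for row in random_pin_layout():
          print(" ".join(row))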

  • by PPH ( 736903 ) on Sunday August 06, 2023 @05:04PM (#63745178)

    ... one of these [youtube.com] on your desk.

  • by NotEmmanuelGoldstein ( 6423622 ) on Sunday August 06, 2023 @05:29PM (#63745236)

    ... switching to password managers

    Anyone avoiding the sin of password re-use will already have one. In addition, it allows the use of OTP (2FA), making all casual observations worthless.

  • Soon they'll be able to read our freaking minds. Then I'm moving to a cabin in the woods.

  • First of all, people type passwords and other sensitive things while on Teams calls and such all the time.

    But an even bigger issue, if you have a microphone enabled device nearby while you work it can be scraping content all day. If it can hear you say "hey google" or "hey siri" then it *is* listening. And thanks to ubiquitous 2FA everyone is sure to have such a device by their machine.

  • My personal masking preference is to use hard rock at brain-damaging sound levels, augmented by neighbors pounding on the front door. No keys were harmed in the making of this film.
  • It's back to the good old ZX rubber keyboard if you need security. Though I doubt this attack will be common in normal situations.
