Brute-Force Test Attack Bypasses Android Biometric Defense (techxplore.com)
schwit1 shares a report from TechXplore: Chinese researchers say they successfully bypassed fingerprint authentication safeguards on smartphones by staging a brute force attack. Researchers at Zhejiang University and Tencent Labs capitalized on vulnerabilities of modern smartphone fingerprint scanners to stage their break-in operation, which they named BrutePrint. Their findings are published on the arXiv preprint server.
A flaw in the Match-After-Lock feature, which is supposed to bar authentication activity once a device is in lockout mode, was overridden to allow a researcher to continue submitting an unlimited number of fingerprint samples. Inadequate protection of biometric data stored on the Serial Peripheral Interface of fingerprint sensors enables attackers to steal fingerprint images. Samples also can be easily obtained from academic datasets or from biometric data leaks.
And a feature designed to limit the number of unsuccessful fingerprint matching attempts -- Cancel-After-Match-Fail (CAMF) -- has a flaw that allowed researchers to inject a checksum error disabling CAMF protection. In addition, BrutePrint altered illicitly obtained fingerprint images to appear as though they were scanned by the targeted device. This step improved the chances that images would be deemed valid by fingerprint scanners. To launch a successful break-in, an attacker requires physical access to a targeted phone for several hours, a printed circuit board easily obtainable for $15, and access to fingerprint images.
"enables attackers to steal fingerprint images" (Score:1)
So much for "only a hash is stored".
But even if really only a hash was stored: Obtaining a hash means you can try and, if not outright reverse the image, then at least collide the hash on your own time. And since biometric hashes are necessarily lossy, that means obtaining a colliding image ought to be doable as well. Might even be able to speed up the process with a latent partial fingerprint on the device.
These guys report an equipment cost of $15. I'm sure law enforcement or any other snooping outfit w
Re:"enables attackers to steal fingerprint images" (Score:4, Informative)
Only storing the hash is designed to prevent your fingerprints being stolen by software. This doesn't affect that, they need to first obtain your fingerprints to execute the attack, and the secret nonce that is part of the hash isn't being recovered.
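The "secret nonce that is part of the hash" idea can be sketched as a keyed digest: without the per-device nonce, a leaked digest cannot be brute-forced offline. This is a minimal illustration only (all names hypothetical, and real matchers are fuzzy rather than exact-equality, so an actual design stores a processed template, not a plain HMAC):

```python
import hashlib
import hmac
import os

def enroll(template_bytes: bytes) -> tuple[bytes, bytes]:
    """Enrollment sketch: derive a keyed digest of the template using a
    per-device secret nonce. Only the nonce (held in secure hardware in
    a real design) and the digest are stored; the raw bytes are discarded."""
    nonce = os.urandom(32)
    digest = hmac.new(nonce, template_bytes, hashlib.sha256).digest()
    return nonce, digest

def verify(nonce: bytes, stored_digest: bytes, candidate: bytes) -> bool:
    """A match requires the same nonce to key the digest, so an attacker
    who obtains only the stored digest cannot search for collisions
    offline without also extracting the nonce."""
    computed = hmac.new(nonce, candidate, hashlib.sha256).digest()
    return hmac.compare_digest(computed, stored_digest)
```

Again, this only shows why the nonce matters; biometric templates vary between scans, so production systems match features, not exact bytes.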
They also need to take the phone apart and connect to its internal SPI bus. iPhones are vulnerable as well, although they do at least enforce a limit on the number of failed attempts (though it's 3x higher than it is supposed to be).
It affects certain phones that use a particular sensor. Hopefully it can be fixed with a firmware update, as some are current models and still supported.
Just curious (Score:3)
Re: (Score:2)
Re: (Score:3)
It's a question of what your security model is. Most criminals are not able or willing to go to that amount of effort to unlock your phone. Because many desirable phones have anti-theft features, often they just get broken down into parts anyway.
If you are worried about the government/LEA unlocking your phone, say because you are a protestor, then don't use biometric unlock. Even without these attacks, they may be able to force you to authenticate by pressing your finger to the sensor, or holding the phone
Re: (Score:2)
Re: (Score:3)
Re:Just curious (Score:4, Interesting)
Re: (Score:2)
Yes, you can. As a matter of fact, there is a group in Germany who captured a photo of Angela Merkel's hand as she was waving at a public event, and were able to generate a spoof fingerprint from the photo (https://www.theregister.com/2014/12/29/german_minister_fingered_as_hackers_steal_her_thumbprint_from_a_photo/).
The problem with having a fingerprint is then turning it into a usable spoof. In the early days of fingerprint sensors, it was easy; an image or an easily created Gumi bear (gummy bear) mold c
Fix the software (Score:2)
If I understand correctly, the "Cancel-After-Match-Fail (CAMF)" flaw that allowed researchers to inject a checksum error is a software bug the fix for which would render this attack a lot less practical.
Re: (Score:2)
The core vulnerability isn't really fixable. Biometrics are a password that can't be changed. Police have them, former devices and employers have them... database leaks contain them. You can't have this lock rekeyed after you break up.
Re: (Score:3)
That's not the core vulnerability. Biometrics wouldn't be a problem if they could not be faked. It's supposed to be your fingerprint that unlocks the device, not an image or recreation of your fingerprint.
Decent fingerprint scanners do have some resistance to fakes, such as being able to scan through the epidermis to parts of your finger that are not reproducible from a latent fingerprint image.
Re: (Score:3)
"Decent fingerprint scanners do have some resistance to fakes, such as being able to scan through the epidermis to parts of your finger that are not reproducible from a latent fingerprint image."
First, there are very few 'decent' fingerprint scanners in the wild. Second, that just makes them an improved scanner for use in producing more sophisticated fakes, and it checks features less reliably proven to be unique in the first place.
Also these systems will always be vulnerable to side-channel attacks. Biometric
Re: (Score:2)
I'll agree with you that, sadly, there are few 'decent' fingerprint scanners in the wild. I will submit that Apple devices and HP Enterprise / Lenovo Enterprise laptops with fingerprint sensors have 'decent' fingerprint scanners, though, which makes them not rare.
>>> that just makes them an improved scanner
Well, that's your uninformed opinion. I like to think mine is just a bit better informed.
I will agree that a biometric is not a perfect authentication device; it can be spoofed, it can't be chan
Re: (Score:2)
"Well, that's your uninformed opinion. I like to think mine is just a bit better informed."
We are all entitled to our uninformed opinions I suppose. :)
"the remarkable people like yourself, who can remember dozens of complex passwords and have information that Mossad is interested in, aren't common"
Complex passwords are far inferior to long ones, and that is what password managers are for, but that is a side rant. I personally might be a higher priority target for many than most but don't knock the value o
Re: (Score:3, Insightful)
Re: (Score:3)
I haven't tried but I'll bet phone fingerprint scanners work on toes too.
Re: (Score:2)
Why would Chinese researchers publish this? (Score:2)
Re: Why would Chinese researchers publish this? (Score:1)
You misspelled "worst".
They'll publish it for the same reasons anyone else would...ultimately to improve security of devices.
Re: (Score:2)
No surprises there (Score:3)
Caught by the Race to the Bottom (Score:5, Informative)
Disclaimer: I worked on fingerprint sensors in the 00's, shipping a couple hundred million units to major Phone and Laptop manufacturers.
Our top line sensors fully encrypted all communications between the sensor and the host, similar to what the Authentec sensor in Apple devices used. Note that the researchers had zero success against Apple devices. We sold tens of millions of these to the likes of HP and Lenovo. But the mobile world wanted cheaper, cheaper, cheaper and simpler, simpler, simpler - the mobile customers neither wanted to pay an extra dime for encryption, nor did they want to deal with the hassle of encryption. So we sold them hundreds of millions of sensors with no encryption, and eventually exited the business when we could no longer compete at the sub-$1 price points.
What these researchers were attacking were the bottom-of-the-barrel sensors, selling for pennies, that the manufacturers demanded. The result isn't too surprising. It's mostly an attack on the architecture of the Android fingerprint stack, written by Google and common to all manufacturers - whoever thought that enforcing retry lockouts AFTER matching wasn't thinking very deeply. That's like accepting passwords, validating them, then deciding whether or not the user is locked out due to too many bad passwords in a row; there are way too many side-channel attacks to even think that this is a good idea in the 2020's.
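The ordering the parent is advocating can be sketched in a few lines (a hypothetical design, not any vendor's actual stack): check the lockout state *before* the matcher ever runs, so a locked-out attempt leaks nothing and consumes no match cycle.

```python
class Authenticator:
    """Sketch of lockout-before-match ordering. A sample submitted while
    locked out is rejected without reaching the matching logic at all."""
    MAX_FAILS = 5

    def __init__(self, matcher):
        self.matcher = matcher  # callable: sample -> bool
        self.fails = 0

    def try_unlock(self, sample) -> bool:
        if self.fails >= self.MAX_FAILS:
            return False          # locked out: matcher is never invoked
        if self.matcher(sample):
            self.fails = 0
            return True
        self.fails += 1           # every failed match is counted
        return False
```

With the flawed order (match first, then consult the lockout), an attacker can still observe or intercept the match result; here there is simply nothing to observe once the limit is reached.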
Re: (Score:1)
Re:Caught by the Race to the Bottom (Score:4, Insightful)
There's lots of magic involved in making a fingerprint sensor work, but the deepest, blackest magic is in the piece of software called the "Matcher" that compares the image from the sensor with the stored fingerprint template (the stored template is generally not an image, it's a processed description of the multiple fingerprint images taken during enrollment). I don't know what sensor is in the Kensington, or what matcher software was used, but primitive matchers had the kinds of issues that you're describing. They tried to store an image of your fingerprint, then match against that in succeeding days. That didn't work well, as you're noting, because fingerprints change from day to day - humidity can cause the skin to expand or contract, daily damage can cause images to change, etc. More sophisticated matchers work hard to extract information from the enrollment images that doesn't change over time - ours involved various neural nets and a process that, even after having it explained to me several times, I didn't fully grasp.
Yes, it is possible to update stored templates over time, but it's very risky - there are various methods that an attacker could use to modify the template over time to make it easier and easier for them to log in. We looked into it, but never pulled the trigger on that.
The iPhone has a sophisticated fingerprint sensor and matcher - it came from Authentec, who had been doing fingerprint sensors for a long time before being acquired. I have an iPhone 8 with their sensor, and it always matches (unless my finger is wet), despite not having re-enrolled in years. I doubt very much that it's updating the template over time, we didn't have to and had similar performance.
So, short answer, the Kensington has a crappy Matcher, and there's not much you can do about it.
Re: (Score:1)
Re:Caught by the Race to the Bottom (Score:4, Insightful)
The problem with encryption is you need to pair the sensors with the SoC. And the problem with that is the right-to-repair crowd screeches that it's artificially locking down repairs of the device. After all, it means you can't just change the fingerprint sensor, because now you need to re-pair the new sensor to establish a new encryption key.
Oh, but why can't you just have the device pair on the screen?
Well, then that just defeats the entire purpose of the encryption! You want to lock out man-in-the-middle attacks, and replay attacks, and likely compromised sensor attacks (where the sensor is replaced with a fake one - either to capture fingerprints in cleartext, or to allow attacks like this where the sensor is basically being a bad actor).
Any mechanism you think of to fix the issue is also one which a bad guy can use to try these attacks.
It's one thing I'd like to see the right to repair groups propose - a way to allow people to repair things like this by replacing sensors/screens/etc with new ones, but being able to pair the new sensor in such a way that it cannot be used as a way to perform an attack. Because the problem is, they're not happy until you can replace anything you want to bring it to full functionality. So disabling security functions because you cannot trust the sensors used is a non-starter.
Re: (Score:3)
Pairing isn't really a problem.
All of the secure sensors we sold were paired with the host. Some of them used public key processes to assure that the sensor would only work with a host from the particular OEM - not our choice, but they were writing the checks.
Apple chose the path of pairing at manufacturing - so replacing the fingerprint sensor is not something you could do as a third-party repair. That was their choice.
Pairing a new sensor can easily be done securely on the screen, but OEMs chose not to
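The factory-pairing model described above boils down to provisioning a shared secret into both sides, then having the host challenge the sensor before trusting it. A minimal sketch (hypothetical names; real implementations would use a secure element and a public-key ceremony rather than a bare shared key):

```python
import hashlib
import hmac
import os

def factory_pair():
    """Pairing at manufacture: one shared secret is written into both
    the sensor's and the host's secure storage. Returns the two stores."""
    secret = os.urandom(32)
    return {"key": secret}, {"key": secret}   # (sensor_store, host_store)

def sensor_respond(sensor_store: dict, challenge: bytes) -> bytes:
    """Sensor proves it holds the paired key by answering a challenge."""
    return hmac.new(sensor_store["key"], challenge, hashlib.sha256).digest()

def host_accepts(host_store: dict, challenge: bytes, response: bytes) -> bool:
    """A swapped-in sensor without the paired key cannot produce a valid
    response, so the host refuses to use it."""
    expected = hmac.new(host_store["key"], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

A repair-friendly variant would add an authenticated re-pairing ceremony (e.g. requiring the device passcode) so a legitimate replacement sensor can be provisioned without opening the door to a hostile one.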
Mostly an easy fix. (Score:3, Informative)
I looked at the actual paper (PDF) [arxiv.org] and the big issue here is that the "failed fingerprint attempts lockout" is easily foiled by submitting an attempt containing a bad checksum. The bad checksum causes the failed attempts counter to be reinitialized to zero and thus submitting one after a few failed attempts will enable unlimited attempts.
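The flawed counter logic as described in the paper can be sketched like this (hypothetical code, not the actual vendor firmware): the error path for a bad checksum clears the failure counter, so an attacker interleaves one bad-checksum frame after every few real attempts and never hits the lockout.

```python
def vulnerable_attempt(state: dict, sample, checksum_ok: bool) -> bool:
    """Sketch of the CAMF-bypass behavior: a frame with a bad checksum is
    treated as 'no attempt at all' and reinitializes the failure counter."""
    if not checksum_ok:
        state["fails"] = 0        # the bug: error path resets the counter
        return False
    if state["fails"] >= 5:
        return False              # lockout (never reached in practice)
    if sample == state["enrolled"]:
        state["fails"] = 0
        return True
    state["fails"] += 1
    return False
```

The fix is equally small in this sketch: a checksum error should leave the counter untouched (or count as a failure), never reset it.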
The lesser issue is that the fingerprint scanner data isn't signed which means anything can pretend to be your fingerprint scanner.
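Signing the sensor data would close that hole: if each frame carries an authentication tag over the image plus a sequence number, the host can reject both injected frames and replays. A minimal sketch under the assumption of a key shared at pairing time (all names hypothetical):

```python
import hashlib
import hmac
import os

SENSOR_KEY = os.urandom(32)  # provisioned into sensor and host at pairing

def sensor_send(frame: bytes, seq: int):
    """Sensor side: tag the frame together with a monotonic sequence
    number, so neither the frame nor the counter can be forged."""
    msg = seq.to_bytes(8, "big") + frame
    tag = hmac.new(SENSOR_KEY, msg, hashlib.sha256).digest()
    return frame, seq, tag

def host_verify(frame: bytes, seq: int, tag: bytes, last_seq: int) -> bool:
    """Host side: accept only fresh (seq > last_seq), correctly tagged
    frames. Anything injected on the SPI bus fails the check."""
    msg = seq.to_bytes(8, "big") + frame
    expected = hmac.new(SENSOR_KEY, msg, hashlib.sha256).digest()
    return seq > last_seq and hmac.compare_digest(expected, tag)
```

This is the same kind of scheme the encrypted high-end sensors mentioned elsewhere in the thread already use; the attacked sensors simply ship without it.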
The takeaway here is that your fingerprint scanner isn't going to stop a motivated individual or organization. However, that should be the assumption in the first place because you leave your fingerprints on everything you touch.
Hm (Score:2)