
Biometric Face Recognition Exploit
clscott writes "A researcher
at the U. of Ottawa has developed an exploit to which most
biometric systems are probably vulnerable.
He developed an algorithm which allows a fairly high
quality image of a person to be regenerated from a
face recognition template. Three commercial face rec.
algorithms were tested and in all cases the image could
masquerade to the algorithm as the target person.
Here are links to a talk and a paper.
Unfortunately, biometric templates are currently considered
to be non-identifiable, much like a password hash.
This means that
legislation gets passed to require
hundreds of millions of people to have their biometrics
encoded onto their passports. This kind of vulnerability
could mean that anyone who reads these documents has access
to the holder's fingerprint, iris images, etc."
This problem is solved by redundancy (Score:5, Funny)
Re:This problem is solved by redundancy (Score:5, Funny)
This isn't a problem because most people have extras of the body parts used for most biometric schemes. For example, you probably have a large supply of fingers (about ten), so it doesn't matter if a few get compromised. Similarly, if you have two eyes, it's not a big deal if your retinal print becomes known to bad guys. (P.S. Please no replies from humor-impaired folks.)
I don't get it. The way you're talking isn't in a standard joking format at all. Maybe you Canadians have a different sense of humor?
Re:This problem is solved by redundancy (Score:5, Funny)
Yeah really.
In the States, all of our humor formats have been standardized by the Department of Homeland Security. Currently, I'm 80% done with my ISO9666 humor certification. When I'm done, everyone will be able to understand and interface with my humor.
Re:This problem is solved by redundancy (Score:5, Funny)
American humor is expected to involve either bodily functions or blonde women.
Failure to employ region-appropriate humor will potentially flag you for review as a potential terrorist.
Re:This problem is solved by redundancy (Score:5, Funny)
Re:This problem is solved by redundancy (Score:2)
Yeah, or Polish people.
Re:This problem is solved by redundancy (Score:2)
Hey in that case, I think I qualify
Re:This problem is solved by redundancy (Score:2)
Of course, the best humor involves both.
Humor (Score:2)
Do what I did - upgrade your humo[u]r to Region-Free!
Re:This problem is solved by redundancy (Score:3, Funny)
If you *insist* on American style humor, here it is:
Re:This problem is solved by redundancy (Score:3, Insightful)
It's not a problem at all. On the contrary, it is a really good discovery IMHO. The most important conclusion from this is (from the talk slides):
Biometric software systems should provide yes/no only, with no match score values.
My question is: why would the software systems ever need to give a match score value, instead of a yes/no answer in the first place? It's not like the algorithm develop
Re:This problem is solved by redundancy (Score:3, Insightful)
And if it is a camera in the cash machine and you claim that you are Joe and want to get your $500, you bet
Think of what might happen to body parts (Score:3, Insightful)
This is even easier to compromise than having a keycard or something, as the individual could at least hide it somewhere. They CAN'T hide their face without
Regarding eyes (Score:2)
I remember reading a paper about biometric identification using the iris. The bit I remember is that it is really easy to tell if the eye you're scanning is alive or not. For example, as part of the scanning process the machine just needs to go from dark to bright in a short time. If it does that and the pupil doesn't narrow then the eye isn't attached to a living body. I can't speak for other body parts, but it's unlikely anybody will pluck out your eyes and scan them.
Other systems too? (Score:5, Interesting)
Re:Other systems too? (Score:3, Interesting)
Re:Other systems too? (Score:5, Informative)
I think many people miss the boat when it comes to biometric identity authentication. The fact is, any security protocol can be exploited. The idea is to make it a protocol difficult enough to exploit so that it isn't in the best interests of an attacker to go after whatever is being secured. It's like cryptography. There is no unbreakable code or cipher, but there are codes that are difficult enough to break that it isn't worth the time or effort required to break them.
Re:Other systems too? (Score:2)
That way, the system is only compromised when:
a) You lose the card
b) Someone threatens you at knife-point to hand the card over.
In such cases, you simply call the card authority to invalidate the card's key and get a new one.
Re:Other systems too? (Score:2)
A huge key is unnecessary. If they have the card, they have the key. The key exists solely to keep someone from whipping up a card with your user ID and getting instant access. No one is going to guess your key even if it's only 128 bits.
That way, the system is only compromised when:
a) You lose the card
b) Someone threatens you at knife-point to hand the card over.
Seeing that we already have the system you describe
Re:Something i've always wondered (Score:2)
paranoia (Score:5, Funny)
Re:paranoia (Score:2, Funny)
Re:paranoia (Score:2)
Re:paranoia (Score:3, Funny)
Problems include a high failure rate when women switched between high-heels and flats, etc...
Re:paranoia (Score:2, Interesting)
Re:paranoia (Score:3, Funny)
Re:paranoia (Score:2)
Re:paranoia (Score:2)
Something like that would, as someone else noted, also produce false rejections depending on the type of shoes worn, whether your pants are tight or loose, etc. What if you broke an ankle? Your gait would change considerably for months as it healed up (I've spiral-fractured one, and it was nearly a year before I could walk decently again).
But you know, I'd bet some company somewhere is already
or... (Score:1)
From TechTV [techtv.com]
Re:paranoia (Score:2)
> facemask and a pair of shiny gloves... that way
> they'll never recognise me!
Nice idea, C3PO, but I don't think you'll get away with it...
I left my wallet at Wonderland Ranch. (Score:2)
That's true, they mightn't recognize you, but if you're planning to venture into public you had best practice your dance moves and your falsetto singing voice, Mr. Jackson.
Facial recognition (Score:1, Insightful)
Re:Facial recognition (Score:5, Insightful)
At least a good guy discovered this (Score:1, Funny)
Re:At least a good guy discovered this (Score:4, Funny)
Not anymore, Palladium is here to save us.
the intent of Palladium (Score:2)
Who's going to protect either MS or us?
As I understand it, X-Box was intended as a testbed for "Trustworthy Computing". A small bunch of dedicated fanatics cracked it.
How many million people are going to try to make a rep for themselves by trying to crack Palladium / TCPA, and will all of them be "good guys" who at least will let us who subscribe to BugTraq and Full Disclosure know where the security holes are?
Re:At least a good guy discovered this (Score:2)
Biometric identification is inherently flawed because it relies on things that cannot easily be changed (i.e., without major surgery), but that can be reproduced. This has been known for years. They even use similar situations on TV shows (Paul Milander, anyone?).
Re:At least a good guy discovered this (Score:2)
One thing that is missing from "the spoof" (Score:5, Interesting)
Sometimes we give criminals too much credit. Again, if someone can get through all three of those, they were going to get past the toughest of Indiana Jones hurdles.
Re:One thing that is missing from "the spoof" (Score:5, Funny)
Staff time to implement new security procedures: $12,500
Sledgehammer: $25
Expression on the Project Manager's face after he realized he should have installed a better door: Priceless
Re:One thing that is missing from "the spoof" (Score:2)
How long does it take to measure someone's average body temperature? Days? Months?
Re:One thing that is missing from "the spoof" (Score:2)
So if you catch a cold you can't even get into the building?
Re:One thing that is missing from "the spoof" (Score:2)
Tell me about it. It's damn exhausting hauling that boulder back up all the time.
Old News (Score:5, Funny)
RTFA (Score:1, Interesting)
All they're saying is that if they have access to that information, they can generate something that can authenticate against it. (DUH!)
The moral of the story is that if you don't want someone to pretend to be Bob's face, don't give anyone access to the database that has the information on what Bob's face looks like to the biometric scanners.
RTFA yourself (Score:5, Insightful)
This is very similar to the one-way hashing that happens with unix passwords, only that in this case the hashing is 'lossier' so you have 'confidence scores' instead of a black/white answer.
The article shows that given this 'hashed' value you can recreate an image that has a good chance of not only being authenticated by the same system/algorithm (which already should be very hard, given the one-way nature of the templatization) =BUT= also by different systems!
It is also really interesting that if you have access to the 'confidence score' output by the recognizer, you can take arbitrary images and, by blending/averaging them, again come up with an image that works.
This is definitely not useless news, and it will have some real implications.
Re:RTFA yourself (Score:2)
The exploit requires both the template and (repeated) access to score results (i.e., the evaluation/matching algorithm). The template itself is insufficient, as the exploit depends on iterative image manipulations and "hotter, warmer, cooler" feedback from the evaluation algorithm to work.
So, although you seem to get this in your final paragraphs (th
Re:RTFA yourself (Score:3, Insightful)
Total cost for piercing the false security of the system? Way too little
Re:RTFA yourself (Score:3, Insightful)
'Access to templates OR match scores implies access to biometric sample image' (emphasis mine)
I originally thought that you needed both, but after re-reading the presentation a few times it seems the researcher has -TWO- different exploits: one which regenerates things from the biometric data (samples not shown), and the other which takes arbitrary pics and, by using the match percentage, iterates a few times until it finds somet
Re:RTFA (Score:2)
Yikes! (Score:2, Informative)
So this means that spotty, streaky photo of me (or is it a dog... a wombat maybe?) on the back of my CostCo membership card isn't safe! Just about anyone could march in the door, past their rigorously trained staff, and buy Boca Burgers for half off!
Someone showed me a fake driver's license made by a "novelty" company. The only distinguishable difference was a missing a
Re:Yikes! (Score:2)
Dude, you should have a serious conversation with your folks
SB
The solution: store biometric data on a Java Card (Score:1, Insightful)
(btw, I don't work for Sun)
A Java Card would allow you to store information (in this case biometric data) in a way that the data could be used in some sort of transformation but the original data is protected.
Were biometric data to be included on Passports, I see no better way to store it than in a Java Card. Portions of the biometric data analysis could be offloaded onto the Java Card itself, until an acceptable and mutual balan
Re:The solution: store biometric data on a Java Ca (Score:2)
> need not worry about it getting leaked to the "bad guys" even if your passport were stolen.
Note that in the article they did not use any reference to the original image or to the dataset that the face recognition software creates from it. Rather, they chose 30 different (visually unrelated) images and then evolutionarily selected the best
Re:The solution: store biometric data on a Java Ca (Score:2)
Eh? I understand the part about being able to use a score to slowly converge on a working template, but that's not the way any smartcard I've seen works.
I've never worked with a card that returned a score. The biometric template is instead used like a PIN: it either unlocks the card or it doesn't, and the card makes that determination. When the card
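A minimal sketch of that match-on-card behaviour, using a purely hypothetical card class (none of these names are any vendor's API): the card keeps the enrolled template, does the comparison internally, and answers only yes or no, so there is no score for an outsider to hill-climb against.

#include <cmath>
#include <iostream>
#include <utility>
#include <vector>

class HypotheticalCard {
    std::vector<double> enrolled_;                // never leaves the card
    double tolerance_;
public:
    HypotheticalCard(std::vector<double> t, double tol)
        : enrolled_(std::move(t)), tolerance_(tol) {}
    bool verify(const std::vector<double>& live) const {
        double d = 0;
        for (size_t i = 0; i < enrolled_.size(); ++i)
            d += std::fabs(live[i] - enrolled_[i]);
        return d <= tolerance_;                   // yes/no only; the distance is never exposed
    }
};

int main() {
    HypotheticalCard card({0.2, 0.7, 0.4}, 0.1);  // made-up enrolled template and tolerance
    std::cout << (card.verify({0.21, 0.69, 0.41}) ? "unlocked" : "rejected") << "\n";
}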
Does the database depend on obscurity? (Score:3, Insightful)
But how will they handle changes? I mean, people will probably figure out how the recognition works, and learn how to trick it. If you know the scheme, it probably wouldn't be too hard.
If they have a giant database of these n-tuples, generated from photos, will they have to recrunch every photo in the db when they want to improve the system, or respond to holes that emerge? I guess they'll have a lot of computer power, so it's probably not too bad.
The thing that worries me about this stuff is the possibility that the crooks and terrorists will be able to defeat it trivially, but the average citizen will be tracked everywhere he or she goes.
x10 Get your Biometric Face Master Template (Score:3, Funny)
Want to snoop on your neighbor?? Want to trespass?? Want to know if there are Aliens at Area 51???
GET YOUR OWN BIOMETRIC FACE MASTER TEMPLATE. Guaranteed to *FOOL* all Biometric Scanners. Get the *NEW* and *IMPROVED* BIOMETRIC FACE MASTER TEMPLATE from X10. It will even fool our OWN SECURITY CAMERA!!! Our NEW special offer, buy one BFMT and get PRE-APPROVED Bail for FREE (good for 5000 dollars) ORDER NOW!!!
you cant get a one-ace master template (Score:2)
"Time Cube proves a 1 face god impossible, due to 4 corner face metamorphic human - baby, child, parent and grandparent faces."
Am I reading the description incorrectly? (Score:1)
Those two statements seem to be contradictory. If biometric templates are considered to be "non-identifiable" (much like lie-detector tests are inadmissible in court due to unreliability), why would legislation be passed to require them to be used in passports? A
Nah, it's just backasswards (Score:2)
Re:Am I reading the description incorrectly? (Score:2)
They make decisions based on vendor presentations and canned demos.
They also wonder why the stuff never works quite as well after they spend our money on it. Usually, they blame the IT staff they saddled with this crap to begin with.
You don't like this? Vote for leaders who aren't lawyers.
Sounds easy to fix... (Score:2)
Re:Sounds easy to fix... (Score:2, Insightful)
The paper explicitly covers encryption, etc., of the data.
Any system that uses the data to decide whether or not the presented (fake) pattern matches the template is subject to this attack, i.e., has
Re:Sounds easy to fix... (Score:2, Informative)
Put simply:
1. start with some random face
2. ask the system to compute the recognition score for this face
3. make changes to the face
4. compute the new score
5. if the score is higher, keep the change to the face; if the score is lower, reject the change
6. goto 3
You'll notice that nowhere do you have to look at the biometric data its
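A minimal, self-contained sketch of that loop. Everything here is illustrative: match_score() is a stand-in that scores a candidate against a hidden target, playing the role of the real recognizer the attacker would actually query.

#include <cstdlib>
#include <iostream>
#include <vector>

static const std::vector<int> secret_target(256, 200);   // stands in for the enrolled template

double match_score(const std::vector<int>& candidate) {  // the attacker only ever sees this number
    double d = 0;
    for (size_t i = 0; i < candidate.size(); ++i)
        d += std::abs(candidate[i] - secret_target[i]);
    return 1.0 - d / (255.0 * candidate.size());
}

int main() {
    std::vector<int> face(256, 128);            // 1. start with some arbitrary "face"
    double best = match_score(face);            // 2. ask the system for its score
    while (best < 0.99) {
        size_t i = std::rand() % face.size();   // 3. make a small change
        int old = face[i];
        face[i] += (std::rand() % 2 ? 1 : -1);
        double s = match_score(face);           // 4. compute the new score
        if (s > best) best = s;                 // 5. keep the change only if the score went up
        else face[i] = old;                     //    otherwise reject it
    }                                           // 6. goto 3
    std::cout << "accepted with score " << best << "\n";
}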
Read the technical paper (Score:2)
I don't think this can be worked around in any way that winds up with a usable product.
Hash the data (Score:2)
Joe Average User... (Score:5, Interesting)
He will be in the position of being assumed guilty, because everyone knows that biometrics don't lie and are completely infallible. Thanks to legislation like the DMCA, no one will testify that the systems are, indeed, very easy to compromise. It'll be illegal to talk about those aspects of security. Not that the law has ever stopped the black hats...
Re:Joe Average User... (Score:2)
Re:Joe Average User... (Score:3, Interesting)
Sir Francis Galton's work reg
Not a surprise (Score:4, Insightful)
The passport angle is probably a red herring though. The unreliability of photo identification is already known. Identity theft is simple and easy. Hell, here in New Mexico, we've already been the first state to accept 'Matricula Consular' cards as valid ID for driver's licenses. Matricula Consular cards, of course, are given out by Mexican embassies to undocumented Mexicans living in the US. By 'undocumented,' I mean illegal, of course. Check out the immigration reform site www.vdare.com for some more information on the subject.
Biometrics 101 (Score:2, Interesting)
However, for this particular exploit to affect passport security and the like, the entire system would have to be automated, so that there would be no one to notice the perpetrator was holding a photo of someone else in front of his face as he walked by.
To guard against exploits like these in totally automated systems, the data that is fed into the matching system should be digi
Ident-i-Eeze (Score:2, Funny)
There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the n
How to fix the problem (Score:4, Interesting)
Make the cameras use x-ray backscattering (as in the earlier story today) of your face. Then in order to spoof the system, a printout of your picture (generated from the hash or not) would not work -- you'd have to build something that recreates your x-ray backscatter and show that to the camera. (I'm assuming that would be much more difficult, like making a sculpture out of meat or something -- anyone in the know wish to shoot down my theory?)
Of course, then there's the issue of getting x-rayed in the face every time you walk in the door...
Re:How to fix the problem (Score:2, Interesting)
Sorry, not comparatively hard (Score:3, Interesting)
An associate of mine runs a small factory in Japan where they make 3D printers; much of the technology is from Texas-based DTM. I can't find their homepage; I think they are, or were, owned by BFGoodrich. Many companies use their Sinterstation, which uses a laser to fuse nylon or metal powder deposited in thin layers inside the production bay.
The machines are I believe in the hundreds of thousands of dollars each but they are used to make prototypes like mobile phone shells, or mold
Not as significant as you might think (Score:5, Insightful)
This isn't such a big deal for face recognition systems, because face recognition systems suck at identifying people anyway. Why? First, a little terminology:
With any biometric matcher you have to define a match "tolerance", which defines how close a pair of templates (usually one from a database and one from a livescan) have to be before they're considered to be a match. Set this tolerance too "loose" and you get lots of false positives (matches that shouldn't match), set it too "tight" and you get the opposite, false negatives. The tolerance setting where you get roughly the same number of errors each way is called the equal error point, and the error rate is called the equal error rate (abbreviated ERR for some unfathomable reason).
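A toy illustration of sweeping the tolerance to find the equal error point, using made-up genuine and impostor scores (all the numbers here are purely illustrative):

#include <iostream>
#include <vector>

int main() {
    std::vector<double> genuine  = {0.91, 0.85, 0.78, 0.96, 0.66, 0.88};  // same-person comparisons
    std::vector<double> impostor = {0.12, 0.35, 0.52, 0.28, 0.61, 0.44};  // different-person comparisons

    for (int i = 0; i <= 20; ++i) {
        double t = i * 0.05;                          // the match tolerance being swept
        int fa = 0, fr = 0;
        for (double s : impostor) if (s >= t) ++fa;   // impostor accepted: false positive
        for (double s : genuine)  if (s <  t) ++fr;   // genuine rejected: false negative
        std::cout << "tolerance " << t << "  false accepts " << fa << "  false rejects " << fr << "\n";
    }
    // The tolerance where the two error counts are roughly equal is the equal
    // error point, and the corresponding error rate is the equal error rate.
}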
Well, all current face recognition systems have an ERR that is too high to be useful in nearly any situation, even when used for identity verification, as opposed to the much-harder problem of identification (verification: I say I'm Bill Gates, and the system agrees; identification: The system says I'm Bill Gates, not RMS or anyone else). It's possible that in the future this will change, of course.
However, this doesn't really matter because we already have ready access to an excellent and very widely available face recognition system: the Mark I eyeball. Millions of years of evolution have made people extremely good at identifying and matching human faces. What people aren't so good at (with notable exceptions) is matching a face against a database of thousands of faces they've seen only once, and *that* is something that face recognition systems can do extremely well. They may not be able to decide which faces are a "match", but they can do an excellent job of finding the *closest* faces, which can then be reviewed by the super-duper face-matching algorithm contained in the average person's head.
When automated face recognition is used in that sort of context, spoofs like this one are unlikely to be very useful; if you want to impersonate someone you'd better get a face that's good enough to fool another human. It's doable, certainly, but much harder. And holding a laptop screen in front of your face is likely to raise some suspicions.
Re:Not as significant as you might think (Score:2)
Just like my beloved Apple Newton -- It got the handwriting right 98% of the time, but for the other 2%, you'd find yourself double-tapping the word to see what else it thought you might have written. I'd be surprised to learn that this isn't the way most firms are implementing the technology. After all, "Blocks more than 98% of intruders" isn't a great advertising slogan unless you plan to use
Better Than (Score:3, Funny)
Whew! What a relief.
Re: (Score:2)
Re:One to one relationship / pigeonhole principle (Score:2)
Much like a hashing algorithm (and the pigeonhole principle) if two items can hash to the same spot, then the algorithm is broken; or in this instance two people look alike and the computer can't tell them apart.
Er, actually no. Hashing two templates to the same key is not evidence of a broken algorithm, as long as a whole range of other factors can be used to "work" the collision. In particular, you want the algorithm to return an even distribution across the key space, and even more particularly
Re: (Score:2)
Biometrics are the visual equivalent of soundex (Score:2)
For instance, if she had a little less facial hair, my aunt's bouffant hairdo under a scarf might give her the same biometric as Osama bin Laden.
Frequent changes... (Score:2)
Re:Frequent changes... (Score:2)
Links lost... (Score:2)
Everyone has missed the point (Score:5, Informative)
This is not an exploit designed to show that biometric systems can be fooled or that you could create some kind of fake image that would match an existing one.
The whole point is that this shows that biometric templates are privacy-sensitive. Previously it was thought that they could be stored and promulgated without interfering with anyone's privacy, because it was thought to be infeasible to start from the template and reconstruct personally identifiable information about the subject.
The new paper shows that this is not true; from the templates, you can reconstruct an identifiable picture of the individual. That means that, for example, if you had a bunch of templates of people who went in for an AIDS test, you could re-create pictures of the people who went in, adequate to recognize individuals.
This would therefore interfere with the privacy of those individuals. And that implies that templates need to be subject to the same kind of privacy restrictions as other forms of personally identifying information, a standard to which they have not traditionally been held.
And that's the point of the paper.
Simple algorithm. It works. (Score:4, Insightful)
system as "oracle" and present different images until the match
is achieved. The different images are not chosen at random, but
rather evolutionary. That is, a selection of images is presented,
and the best (highest score) is chosen. Recursively, new selections
are derived from the best image, and again presented to the oracle.
According to the article 24,000 images are necessary to achieve
convergence, when the initial images were specifically chosen to
NOT be visually similar to the "target" image.
Some oracles can't be questionned 24,000 times - eg at an airport
or an ATM machine. You might become arrested long before finished.
However, press releases often indicate which company designed the software for a particular implementation of face recognition. You can easily purchase other software from the same company (or find an OEM product) and thus have the same (or a very similar) oracle on your desk at home. There you can do the 24,000 iterations to get hold of the "good" image, and then proceed to remodel your face, or use whatever other way you intend to "present" the image to the real face recognition system.
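A rough sketch of that oracle-driven evolutionary search, under the same kind of assumption as before: match_score() stands in for the attacker's desktop copy of the recognizer, and only its score is visible.

#include <algorithm>
#include <cstdlib>
#include <vector>

using Image = std::vector<int>;

static const Image target(256, 200);                         // stands in for the victim's enrolled face

// Stand-in oracle: the attacker's own copy of the matcher. Only the score is observed.
double match_score(const Image& img) {
    double d = 0;
    for (size_t i = 0; i < img.size(); ++i) d += std::abs(img[i] - target[i]);
    return 1.0 - d / (255.0 * img.size());
}

Image mutate(Image img) {
    for (int k = 0; k < 8; ++k)                              // tweak a handful of pixels
        img[std::rand() % img.size()] += (std::rand() % 2 ? 3 : -3);
    return img;
}

int main() {
    std::vector<Image> pool(30, Image(256));
    for (Image& img : pool)
        for (int& px : img) px = std::rand() % 256;          // 30 visually unrelated starting images
    Image best = pool[0];
    while (match_score(best) < 0.95) {                       // loop until the oracle would accept
        for (const Image& img : pool)
            if (match_score(img) > match_score(best)) best = img;
        for (Image& img : pool) img = mutate(best);          // derive the next selection from the best
    }
    return 0;
}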
In my opinion, biometrics just doesn't work for security, because the datasets are out in the open for everyone to see.
Just look at those stupid press releases from Siemens/Infineon, who have highly paid security engineers inventing ATM cards with fingerprint sensors. Owner's fingerprint => money from ATM. And where does the owner leave his fingerprint when handling the card? Surely not on the very same ATM card?
Acceptable security requires
a) something you have, and
b) something you know.
When the item you have is stolen, the thief still lacks the information you know. And vice versa: when the secret is learned (e.g., by shoulder surfing at the ATM), the thief still lacks the item you have, so the electronic robbery can't be completed.
Biometrics is something you have, not something you know. That is the key thing to learn here!
It can be copied without your noticing, but that doesn't make it category b). It is still something you have, because everybody has access to it when they're physically near you. You can't just shut up to make it stay secret.
Therefore, biometrics won't (ever) work as long as it's only combined with other category a) items. A biometric dataset can possibly replace a physical token, but it can NOT replace a PIN code.
I'm happy that this is once again demonstrated, with press coverage.
Marc
Re:Simple algorithm. It works. (Score:3, Insightful)
>No. Biometrics is something you *are*.
>A card or other token is something you have.
Your finger and your face are "something you are".
But the biometric is something you have.
I can't "be you". But I can have your measurements. You are not your measurements.
Not Surprising In Ottawa (Score:2, Funny)
Re:Not Surprising In Ottawa (Score:2)
As a resident of Ottawa, I can say this is true... really! It's actually rather insightful. From January to March here you're only likely to see the tip of someone's nose, as the rest of the face is (and should be) covered by parkas, tuques, balaclavas, or scarves.
Facial recognition biometrics would never be used here for that very reason.
Not anything like a password hash (Score:5, Informative)
For instance, take this simple hash:
#include <cstdint>
#include <string>
uint32_t simple_hash(const std::string& str) {
    uint32_t hash = 0;                  // must be initialized to zero
    for (size_t i = 0; i < str.length(); i++)
        hash += str[i];                 // just sum the character codes
    return hash;
}
Given an input of, say, foobar, one would get a hash of 633. Now, if I start with an arbitrary password of, say, google, I get a hash of 637.
Since I know that slight adjustments to the word produce slight differences in the hash, I can just start moving letters one space down the alphabet until I find a matching value.
Let's say I choose:
google -> 637
foogle -> 636
fnogle -> 635
fnngle -> 634
fnnfle -> 633 *bingo*
So now I've successfully "exploited" this password protection mechanism. This is why it's referred to as plaintext-equivalent.
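Here's a tiny sketch of that letter-shifting attack against the simple_hash() above (illustrative only; it reproduces the fnnfle example):

#include <cstdint>
#include <iostream>
#include <string>

uint32_t simple_hash(const std::string& str) {
    uint32_t hash = 0;
    for (size_t i = 0; i < str.length(); i++) hash += str[i];
    return hash;
}

int main() {
    const uint32_t target = simple_hash("foobar");          // 633, the stored "template"
    std::string guess = "google";                           // arbitrary starting password
    size_t pos = 0;
    while (simple_hash(guess) != target) {
        if (simple_hash(guess) > target) guess[pos] -= 1;   // nudge this letter down the alphabet
        else guess[pos] += 1;                               // or up, toward the target sum
        pos = (pos + 1) % guess.size();                     // spread the changes across letters
    }
    std::cout << guess << " also hashes to " << target << "\n";   // prints the equivalent "fnnfle"
}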
A cryptographic hash, though, has the interesting property that a small change in the input results in an unpredictable difference in the output. For instance, in the same example you might get:
google -> 3453
foogle -> 234543
fnogle -> 234
fnngle -> 23425434
fnnfle -> 53424
There's no reason biometrics can't be cryptographically strong. It's just that the algorithms currently in use aren't. That's no big news for anyone with even half a clue stick.
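For contrast, here's a sketch of the avalanche behaviour the parent describes, using an FNV-1a-style mix. FNV-1a is not cryptographically strong either, but unlike the additive sum above, changing one letter scrambles the whole output, so there is no "warmer/colder" signal to follow (the outputs are whatever the function produces, not the made-up numbers above).

#include <cstdint>
#include <iostream>
#include <string>

uint32_t fnv1a(const std::string& s) {
    uint32_t h = 2166136261u;            // FNV offset basis
    for (unsigned char c : s) {
        h ^= c;
        h *= 16777619u;                  // FNV prime: mixes every bit of the running hash
    }
    return h;
}

int main() {
    for (std::string w : {"google", "foogle", "fnogle", "fnngle", "fnnfle"})
        std::cout << w << " -> " << fnv1a(w) << "\n";   // nearby words give wildly different values
}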
Re:Not anything like a password hash (Score:2)
Biometric systems, however, supply a score. If password systems did this, you could crack them like this: if the password is "aaaa" and you first try "mmmm", it'll (let's say) give a score of 50. So you try "mmma" and "mmmz" and see which one gives the higher score. The first would give 62.5% and the second 37.5%, so you'd stick with the first and make another change.
With biometrics this is like
Oh, really? Didn't Roger Wilco already do this? (Score:4, Funny)
This kinda reminds me of the part in Space Quest III, where you gain access to the restricted area inside ScumSoft by holding up a xeroxed picture of the CEO's face to the facial recognition scanner.
Re:I don't have one, do you? (Score:2, Insightful)
Re:I don't have one, do you? (Score:2)
Re:Gee (Score:2)
Visual or at least optical biometrics are a disaster. Anyone (including government agencies) who thinks otherwise will end up getting into trouble because of it.
Re:Yo' Mama's So Ugly... (Score:2)
SB
Re:has the professor been arrested... (Score:2)
Figure it out.
Re:Easily fixed (Score:2)
Re:Easily fixed (Score:2)