Security

Biometric Face Recognition Exploit

clscott writes "A researcher at the U. of Ottawa has developed an exploit to which most biometric systems are probably vulnerable. He developed an algorithm that allows a fairly high-quality image of a person to be regenerated from a face recognition template. Three commercial face recognition algorithms were tested, and in all cases the regenerated image could masquerade to the algorithm as the target person. Here are links to a talk and a paper. Unfortunately, biometric templates are currently considered to be non-identifiable, much like a password hash. This means that legislation gets passed to require hundreds of millions of people to have their biometrics encoded onto their passports. This kind of vulnerability could mean that anyone who reads these documents has access to the holder's fingerprints, iris images, etc."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by NumberField ( 670182 ) * on Friday June 27, 2003 @05:57PM (#6315242)
    This isn't a problem because most people have extras of the body parts used for most biometric schemes. For example, you probably have a large supply of fingers (about ten), so it doesn't matter if a few get compromised. Similarly, if you have two eyes, it's not a big deal if your retinal print becomes known to bad guys.

    (P.S. Please no replies from humor-impaired folks.)

    • by gerf ( 532474 ) on Friday June 27, 2003 @05:59PM (#6315259) Journal

      This isn't a problem because most people have extras of the body parts used for most biometric schemes. For example, you probably have a large supply of fingers (about ten), so it doesn't matter if a few get compromised. Similarly, if you have two eyes, it's not a big deal if your retinal print becomes known to bad guys. (P.S. Please no replies from humor-impaired folks.)

      I don't get it. The way you're talking isn't in a standard joking format at all. Maybe you Canadians have a different sense of humor?

    • This isn't a problem because most people have extras of the body parts used for most biometric schemes.
      It's not a problem at all. On the contrary, it is a really good discovery IMHO. The most important conclusion from this is (from the talk slides):

      Biometric software systems should provide yes/no only, with no match score values.

      My question is: why would the software systems ever need to give a match score value, instead of a yes/no answer in the first place? It's not like the algorithm develop
      • Maybe because in different situations different thresholds would have to be applied. E.g., if it is a terrorist-monitoring camera on a random street corner, it might not be feasible to unleash FBI agents after every guy who matched at 80%, but if that random street corner happens to be in Washington, DC across the street from the White House, 80% confidence might be a reason to trigger further actions.

        And if it is a camera in the cash machine and you claim that you are Joe and want to get your $500, you bet
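The per-deployment-threshold idea in this subthread can be sketched in a few lines. The contexts, threshold values, and function names below are all invented for illustration, not taken from any real system:

```python
# Hypothetical per-deployment thresholds for one and the same matcher.
THRESHOLDS = {
    "random_street_corner": 0.95,    # only act on very confident matches
    "white_house_perimeter": 0.80,   # higher-stakes location, lower bar
    "atm_withdrawal": 0.90,
}

def decide(context, match_score):
    # Same match score, different policy depending on where the camera is.
    return match_score >= THRESHOLDS[context]

# An 80% match triggers action in one context but not the other.
corner = decide("random_street_corner", 0.80)      # not actionable
perimeter = decide("white_house_perimeter", 0.80)  # actionable
```

This is the tension the thread is circling: a yes/no-only API would have to bake one policy into the matcher, while different deployments legitimately want different thresholds.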
    • When will people get concerned that their body parts are now vulnerable? Desperate criminals who want to infiltrate, or governments, for that matter, would find it rather suitable to simply kill a person and remove their face, eyes, fingers, etc., to use in a biometrics device.

      This is even easier to compromise than having a keycard or something, as the individual could at least hide it somewhere. They CAN'T hide their face without

      • I remember reading a paper about biometric identification using the iris. The bit I remember is that it is really easy to tell if the eye you're scanning is alive or not. For example, as part of the scanning process the machine just needs to go from dark to bright in a short time. If it does that and the pupil doesn't narrow then the eye isn't attached to a living body. I can't speak for other body parts, but it's unlikely anybody will pluck out your eyes and scan them.

  • Other systems too? (Score:5, Interesting)

    by mgcsinc ( 681597 ) on Friday June 27, 2003 @05:59PM (#6315256)
    Personally I use BioPassword for authenticating my workstation using keystroke recognition, so I seem to be safe from the exploit as yet; holding an image up to a computer seems like it would require considerably less effort than attaching a PS2 device that typed at exactly the correct rate. Nonetheless, I wonder if this discovery will prompt the redesigning of the way user data is stored across the biometric spectrum, going as far as the oft-considered-foolproof keystroke systems...
    • by spydir31 ( 312329 ) *
      Keystroke and timing capture/playback is trivial, I wouldn't go trusting that as secure.
    • by NixterAg ( 198468 ) on Friday June 27, 2003 @06:29PM (#6315483)
      BioPassword unfortunately suffers from a habit of producing false rejections. It really diminishes its usability. BioPassword's best trait is that it doesn't require an additional hardware purchase to work. Several high profile banks inspected BioPassword to determine whether they could use it for identity authentication within the context of online purchases. They came to the conclusion that it wasn't usable enough.

      I think many people miss the boat when it comes to biometric identity authentication. The fact is, any security protocol can be exploited. The idea is to make the protocol difficult enough to exploit that it isn't in the best interests of an attacker to go after whatever is being secured. It's like cryptography. There is no unbreakable code or cipher, but there are codes that are difficult enough to break that it isn't worth the time or effort required to break them.
      • So why don't we just create a long 50000-bit key and slap it onto a magnetic-swipe card?

        That way, the system is only compromised when:

        a) You lose the card
        b) Someone threatens you at knife-point to hand the card over.

        In such cases, you simply call the card authority to invalidate the card's key and get a new one.

        • So why don't we just create a long 50000-bit key and slap it onto a magnetic-swipe card?

          A huge key is unnecessary. If they have the card, they have the key. The key exists solely to keep someone from whipping up a card with your user ID and getting instant access. No one is going to guess your key even if it's only 128 bits.

          That way, the system is only compromised when:

          a) You lose the card
          b) Someone threatens you at knife-point to hand the card over.

          Seeing that we already have the system you describe

  • paranoia (Score:5, Funny)

    by klokwise ( 610755 ) on Friday June 27, 2003 @06:01PM (#6315280)
    maybe i should extend my tin-foil hat to a tin-foil facemask and a pair of shiny gloves... that way they'll never recognise me!
    • Re:paranoia (Score:2, Funny)

      by Emugamer ( 143719 ) *
      Michael, is that you? I didn't recognize you with two gloves on!
    • I bet they've already got a system that can identify you by the way you walk, or it's being developed. All they need you to do now is walk down that hallway in your apartment building and the floor sensors will have you identified...
      • Yeah, There was news here a few months ago about this - it does exist.

        Problems include a high failure rate when women switched between high-heels and flats, etc...

      • ...aah, so then we go for the "ants in the pants" disguise!
      • It'd never work for me. My gait changes considerably on cold, wet days when the arthritis in my knees kicks in.
        Something like that would, as someone else noted, also produce false rejections depending on the type of shoes worn, whether your pants are tight or loose, etc. What if you broke an ankle? Your gait would change considerably for months as it healed up (I've spiral-fractured one, and it was nearly a year before I could walk decently again).

        But you know, I'd bet some company somewhere is already
    • Rumors out of Europe tell of Eminem dressing his hotel room in tinfoil in order to thwart people from eavesdropping on his cellphone calls.

      From TechTV [techtv.com]
    • > maybe i should extend my tin-foil hat to a tin-foil
      > facemask and a pair of shiny gloves... that way
      > they'll never recognise me!

      Nice idea, C3PO, but I don't think you'll get away with it...
    • "maybe i should extend my tin-foil hat to a tin-foil facemask and a pair of shiny gloves... that way they'll never recognise me!"

      That's true, they mightn't recognize you, but if you're planning to venture into public you had best practice your dance moves and your falsetto singing voice, Mr. Jackson.
  • Facial recognition (Score:1, Insightful)

    by Anonymous Coward
    ...doesn't work worth a damn anyway. Other forms of biometric authentication are much more reliable.
  • by Anonymous Coward
    I'm glad to know that someone legit found this out before it got into the hands of those evil terrorists. Seriously, it's great that these kinds of things are being discovered now. It just goes to show that no matter what, things can be hacked/bypassed/etc. somehow.
    • by gregmac ( 629064 ) on Friday June 27, 2003 @06:06PM (#6315322) Homepage
      It just goes to show that no matter what, things can be hacked/bypassed/etc somehow.

      Not anymore, Palladium is here to save us.

      • is to protect Microsoft and DRM customers from the public.

        Who's going to protect either MS or us?

        As I understand it, X-Box was intended as a testbed for "Trustworthy Computing". A small bunch of dedicated fanatics cracked it.

        How many million people are going to try to make a rep for themselves by trying to crack Palladium / TCPA, and will all of them be "good guys" who at least will let us who subscribe to BugTraq and Full Disclosure know where the security holes are?

    • What do you mean, "a good guy discovered this"? Do you actually think that anybody who would exploit it would TELL us that he could exploit it? Honestly.

      Biometric identification is inherently flawed because it relies on things that cannot easily be changed (i.e., without major surgery), but that can be reproduced. This has been known for years. They even use similar situations on TV shows (Paul Milander, anyone?).
    • Anything can be hacked when you're storing it in the clear. Of course you'll be able to regenerate the template when you already know exactly which points need to match. The solution to this "exploit" is to run a cryptographic algorithm on the facial template before storing it anywhere. Just like /etc/shadow... you can provide an unencrypted facial template, encrypt it, and compare the two results - but you can never regenerate the original from the encrypted record (well, not without more computing pow
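The /etc/shadow analogy above can be made concrete, along with the catch that other comments in this thread point out: an exact hash only verifies exact data, while biometric readings vary from scan to scan. A minimal sketch; the 4-byte "templates" are invented for illustration:

```python
import hashlib

def store_template(template_bytes):
    # Keep only a one-way digest, /etc/shadow style.
    return hashlib.sha256(template_bytes).hexdigest()

def verify(live_bytes, stored_digest):
    # Hash the live reading and compare digests, never raw templates.
    return hashlib.sha256(live_bytes).hexdigest() == stored_digest

stored = store_template(b"\x10\x42\x07\x99")

exact_match = verify(b"\x10\x42\x07\x99", stored)  # identical reading
near_miss = verify(b"\x10\x42\x07\x98", stored)    # off by one bit
```

The near-miss fails outright even though the readings are almost identical, which is why real matchers compare raw templates against a tolerance instead of comparing digests, and why this scheme is hard to apply to fuzzy biometric data.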
  • by adzoox ( 615327 ) * on Friday June 27, 2003 @06:04PM (#6315300) Journal
    A local company near me has a biometric scan plus retina and thumbprint scan, but it also takes your average body temperature/signature.... The combination of the three is pretty hard, if not impossible, to spoof. And anyone who can was going to break into your system anyway. (With the VERY expensive equipment and extensive knowledge it would take to reproduce all three.)

    Sometimes we give criminals too much credit. Again, if it's someone who can get through all three of those, they were going to get past the toughest of Indiana Jones hurdles.

  • Old News (Score:5, Funny)

    by fobbman ( 131816 ) on Friday June 27, 2003 @06:05PM (#6315314) Homepage
    The fallibility of biometric systems has been widely known since a scientific expose [imdb.com] was released on the topic no less than five years ago.

  • RTFA (Score:1, Interesting)

    by Uhh_Duh ( 125375 )
    You'll notice that the data is only as secure as the database the biometric information is stored in.

    All they're saying is that if they have access to that information, they can generate something that can authenticate against it. (DUH!)

    The moral of the story is that if you don't want someone to pretend to be Bob's face, don't give anyone access to the database that has the information on what Bob's face looks like to the biometric scanners. /. has sure been good at wasting my time with use
    • RTFA yourself (Score:5, Insightful)

      by MarcoAtWork ( 28889 ) on Friday June 27, 2003 @06:28PM (#6315473)
      You don't understand what the article is talking about. When you enroll in a biometric system, the system itself -doesn't- match based on your picture, but on a 'template' which is created by taking your standard data and performing certain destructive operations to arrive at a much smaller 'template' which can still be used to identify you.

      This is very similar to the one-way hashing that happens with unix passwords, only that in this case the hashing is 'lossier' so you have 'confidence scores' instead of a black/white answer.

      The article shows that given this 'hashed' value you can recreate an image that has a good chance of not only being authenticated by the same system/algorithm (which already should be very hard, given the one-way nature of the templatization) =BUT= also by different systems!

      It is also really interesting how, if you have access to the 'confidence score' output by the recognizer, you can take arbitrary images and, by blending/averaging them, again come up with an image that works.

      This is definitely not useless news and will have quite some implications.
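The parent's "lossy hash plus confidence score" description can be sketched as follows. The feature reduction and the 0.9 threshold here are made up for illustration and bear no resemblance to a real face-recognition algorithm:

```python
def make_template(image, buckets=8):
    # Lossy, destructive reduction: collapse the "image" (a list of
    # pixel values) into a few block averages. Many images map to the
    # same template, so the original cannot be read back directly.
    n = len(image) // buckets
    return [sum(image[i * n:(i + 1) * n]) / n for i in range(buckets)]

def confidence(t1, t2):
    # Similarity score in (0, 1]; 1.0 means identical templates.
    return 1.0 / (1.0 + sum(abs(a - b) for a, b in zip(t1, t2)))

enrolled = make_template([0.1, 0.9, 0.4, 0.4, 0.8, 0.2, 0.6, 0.6] * 8)
live = make_template([0.12, 0.88, 0.41, 0.4, 0.79, 0.2, 0.61, 0.6] * 8)

score = confidence(enrolled, live)  # a graded "confidence score"
accepted = score >= 0.9             # the yes/no decision hides the score
```

Unlike a Unix password hash, the comparison is graded rather than exact, and it is precisely that graded score which the exploit in the article feeds on.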
      • I guess this is a meta-meta-RTFA, but you seem to have missed a key point in the article (though you clearly read at least some of it -- kudos for that).

        The exploit requires both the template and (repeated) access to score results (i.e., the evaluation/matching algorithm). The template itself is insufficient, as the exploit depends on iterative image manipulations and "hotter, warmer, cooler" feedback from the evaluation algorithm to work.

        So, although you seem to get this in your final paragraphs (th
        • Re:RTFA yourself (Score:3, Insightful)

          by dbrutus ( 71639 )
          Did you notice that nobody's using biometric systems that aren't also sold to companies? All you really need is a front company that says it needs a secure biometric company ID system. The same people who sold the US their system will happily sell you an exact copy scaled down to one site. Once you own the system, you can run it to your heart's content. You can get data off of passports and create proper fakes at your leisure.

          Total cost for piercing the false security of the system? Way too little
        • Re:RTFA yourself (Score:3, Insightful)

          by MarcoAtWork ( 28889 )
          I originally thought the same, but have a look at slide 15, the researcher says:

          'Access to templates OR match scores implies access to biometric sample image' (emphasis mine)

          I originally thought that you needed both, but after re-reading the presentation a few times it seems the researcher has -TWO- different exploits, one which regenerates things from the biometric data (samples not shown) and the other which takes arbitrary pics and by using the match percentage iterates a few times until it finds somet
    • what do you mean lately?
  • Yikes! (Score:2, Informative)

    by ackthpt ( 218170 ) *
    This means that legislation gets passed to require hundreds of millions of people to have their biometrics encoded onto their passports.

    So this means that spotty, streaky photo of me (or is it a dog... a wombat maybe?) on the back of my CostCo membership card isn't safe! Just about anyone could march in the door, past their rigorously trained staff, and buy Boca Burgers for half off!

    Someone showed me a fake driver's license made by a "novelty" company. The only distinguishable difference was a missing a

  • I think this only further proves the need for something like a Java Card [sun.com]

    (btw, I don't work for Sun)

    A Java Card would allow you to store information (in this case biometric data) in a way that the data could be used in some sort of transformation but the original data is protected.

    Were biometric data to be included on Passports, I see no better way to store it than in a Java Card. Portions of the biometric data analysis could be offloaded onto the Java Card itself, until an acceptable and mutual balan

    • > In this way the biometric data is never exposed directly to the outside world, so one
      > need not worry about it getting leaked to the "bad guys" even if your passport were stolen.

      ...except of course, when the JavaCard can be used as an oracle by the attacker.
      Note that in the article they did not use any reference to the original image
      or to the dataset that the face recognition software creates from it. They rather
      chose 30 different (visually not related) images and then evolutionary selected
      the best
      • As soon as your JavaCard is going to be universal (and serve multiple purposes with varying degree of security) it has to return a "score" (rather than a yes/no decision).

        Eh? I understand the part about being able to use a score to slowly converge on a working template, but that's not the way any smartcard I've seen works.

        I've never worked with a card that returned a score. The biometric template is instead used like a PIN, it either unlocks the card or not and the card determines that. When the card
  • by astrashe ( 7452 ) * on Friday June 27, 2003 @06:10PM (#6315350) Journal
    I've been curious about these databases and how they work. They have to take the images and process them, presumably into some sort of n-tuple. And then they database that.

    But how will they handle changes? I mean, people will probably figure out how the recognition works, and learn how to trick it. If you know the scheme, it probably wouldn't be too hard.

    If they have a giant database of these n-tuples, generated from photos, will they have to recrunch every photo in the db when they want to improve the system, or respond to holes that emerge? I guess they'll have a lot of computer power, so it's probably not too bad.

    The thing that worries me about this stuff is the possibility that the crooks and terrorists will be able to defeat it trivially, but the average citizen will be tracked everywhere he or she goes.
  • by bugsmalli ( 638337 ) on Friday June 27, 2003 @06:10PM (#6315351)
    **Guy snooping on a girl sunbathing**

    Want to snoop on your neighbor?? Want to trespass?? Want to know if there are Aliens at Area 51???

    GET YOUR OWN BIOMETRIC FACE MASTER TEMPLATE. Guaranteed to *FOOL* all Biometric Scanners. Get the *NEW* and *IMPROVED* BIOMETRIC FACE MASTER TEMPLATE from X10. It will even fool our OWN SECURITY CAMERA!!! Our NEW special offer, buy one BFMT and get PRE-APPROVED Bail for FREE (good for 5000 dollars) ORDER NOW!!!
  • Unfortunately, biometric templates are currently considered to be non-identifiable, much like a password hash. This means that legislation gets passed to require hundreds of millions of people to have their biometrics encoded onto their passports.

    Those two statements seem to be contradictory. If biometric templates are considered to be "non-identifiable" (much like lie-detector tests are inadmissible in court due to unreliability), why would legislation be passed to require them to be used in passports? A
    • It means you can't construct a fake, starting from scratch, that gives a valid match to the "template". Only, now, it turns out that you can after all.
    • The people who make the decisions don't read the technical literature because they can't.

      They make decisions based on vendor presentations and canned demos.

      They also wonder why the stuff never works quite as well after they spend our money on it. Usually, they blame the IT staff they saddled with this crap to begin with.

      You don't like this? Vote for leaders who aren't lawyers.

  • Unlike all the *other* problems with biometrics, like false positives/false negatives/gelatin sheet spoofing, showing the camera a photograph, etc., this one seems like it should be easy to solve: don't store the biometric data, instead, treat it like a password and store a cryptographic hash of it instead.
    • Unlike all the *other* problems with biometrics, like false positives/false negatives/gelatin sheet spoofing, showing the camera a photograph, etc., this one seems like it should be easy to solve: don't store the biometric data, instead, treat it like a password and store a cryptographic hash of it instead.

      The paper explicitly covers encryption, etc., of the data.

      Any system that uses the data to decide whether or not the presented (fake) pattern matches the template is subject to this attack, i.e., has

    • If you bothered to RTFA (I know, this is /.), you would find that this exploit does not need access to the biometric data, instead it only needs access to the scoring function.

      Put simply:
      1. start with some random face
      2. ask the system to compute the recognition score for this face
      3. make changes to the face
      4. compute the new score
      5. if the score is higher, keep the change to the face, if the score is lower, reject the change
      6. goto 3

      You'll notice that nowhere do you have to look at the biometric data its
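The six steps above amount to hill climbing against the score oracle. A toy sketch, with a plain vector of numbers standing in for the face image and a distance-based score standing in for the real matcher (both invented for illustration):

```python
import random

TEMPLATE = [random.random() for _ in range(16)]  # the hidden enrolled "face"

def match_score(face):
    # Stand-in score oracle: higher means closer to the enrolled template.
    return -sum((a - b) ** 2 for a, b in zip(face, TEMPLATE))

def hill_climb(steps=20000, step_size=0.05):
    face = [random.random() for _ in range(16)]  # 1. some random face
    best = match_score(face)                     # 2. ask for its score
    for _ in range(steps):                       # 6. goto 3
        candidate = list(face)
        i = random.randrange(len(candidate))
        candidate[i] += random.uniform(-step_size, step_size)  # 3. change it
        score = match_score(candidate)           # 4. compute the new score
        if score > best:                         # 5. keep only improvements
            face, best = candidate, score
    return face, best

recovered, final_score = hill_climb()
```

Nothing in the loop reads TEMPLATE directly; the recovered face converges on it using only the scores, which is exactly the point of the comment above.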
    • Decryption isn't necessary, all the cracker needs to get is the "confidence level" that the image submitted to the sensors matches the image hash in the database.

      I don't think this can be worked around in any way that winds up with a usable product.

  • I'm not sure if it's possible, since the face-recognition data probably has to be "fuzzy". But if there's any data that is exact, you could just hash that.
  • Joe Average User... (Score:5, Interesting)

    by Greyfox ( 87712 ) on Friday June 27, 2003 @06:19PM (#6315411) Homepage Journal
    Is going to be awfully put out when the authorities hold him because someone with his biometric pattern did something highly illegal.

    He will be in the position of being assumed guilty, because everyone knows that biometrics don't lie and are completely infallible. Thanks to legislation like the DMCA, no one will testify that the systems are, indeed, very easy to compromise. It'll be illegal to talk about those aspects of security. Not that the law has ever stopped the black hats...

    • Testifying about the system's ease of compromise is entirely different from trying to bust some guy with a cast-iron alibi, and trust me, it will happen. All the sooner if it's someone high-profile, like a congressmonkey or star athlete or actor. At that point, the system's fallibility will have to be questioned, and once it is, every case after will have the defense scrambling to cite Senator Bob vs. BioID Ltd. This is also another reason why people will always remain in the identification equation for the
      • by Poeir ( 637508 )
        Alphonse Bertillon advanced a system which would provide "unique" identification by taking measurements of various bones throughout the body. In 1903, two prisoners at the same facility were found to have almost identical Bertillon measurements, and the system was more or less scrapped. Modern facial recognition systems work in a manner similar to the Bertillon one, by comparing the ratios/measurements between various components of the face, like eyes, ears, nose, et cetera.

        Sir Francis Galton's work reg
  • Not a surprise (Score:4, Insightful)

    by Henry V .009 ( 518000 ) on Friday June 27, 2003 @06:27PM (#6315460) Journal
    Anyone who has done work on computer vision would have guessed this to be so. What would interest me is in how it would be possible to exploit the algorithms, i.e., how bad of a picture can you get away with? Certain images that might not look anything like a face to you or me will quite possibly be able to fool the system.

    The passport angle is probably a red herring though. The unreliability of photo identification is already known. Identity theft is simple and easy. Hell, here in New Mexico, we've already been the first state to accept 'Matricula Consular' cards as valid ID for driver's licenses. Matricula Consular cards, of course, are given out by Mexican embassies to undocumented Mexicans living in the US. By 'undocumented,' I mean illegal, of course. Check out the immigration reform site www.vdare.com for some more information on the subject.
  • Biometrics 101 (Score:2, Interesting)

    by stupendou ( 466135 )
    While this is an interesting exploit, the sky isn't falling. Any and all biometric systems can be exploited, and in similar ways.

    However, for this particular exploit to affect passport security and the like, the entire system would have to be automated, so that there would be no one to notice the perpetrator was holding a photo of someone else in front of his face as he walked by.

    To guard against exploits like these in totally automated systems, the data that is fed into the matching system should be digi
  • by Anonymous Coward

    There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the n
  • by Atario ( 673917 ) on Friday June 27, 2003 @06:30PM (#6315486) Homepage

    Make the cameras use x-ray backscattering (as in the earlier story today) of your face. Then in order to spoof the system, a printout of your picture (generated from the hash or not) would not work -- you'd have to build something that recreates your x-ray backscatter and show that to the camera. (I'm assuming that would be much more difficult, like making a sculpture out of meat or something -- anyone in the know wish to shoot down my theory?)

    Of course, then there's the issue of getting x-rayed in the face every time you walk in the door...

    • Or some face topography scheme (IR distance sensors, etc.), or make people turn their head so that the computer has to validate x number of positions between a frontal and quarter profile. Thermal is too easy to fool. No doubt these methods could also be fooled, and likely successfully reversed as well. But the more complicated the verification, the more complicated circumvention will have to be. It appears that the current scheme is easier to circumvent than implement.
    • Nope, check out this [3dsystems.com].

      An associate of mine runs a small factory in Japan where they make 3D printers; much of the technology is from Texas-based DTM. I can't find their homepage; I think they are or were owned by BFGoodrich. Many companies use their Sinterstation, which uses a laser to fuse nylon or metal powder deposited in thin layers inside the production bay.

      The machines are I believe in the hundreds of thousands of dollars each but they are used to make prototypes like mobile phone shells, or mold

  • by swillden ( 191260 ) * <shawn-ds@willden.org> on Friday June 27, 2003 @06:33PM (#6315513) Journal

    This isn't such a big deal for face recognition systems, because face recognition systems suck at identifying people anyway. Why? First, a little terminology:

    With any biometric matcher you have to define a match "tolerance", which defines how close a pair of templates (usually one from a database and one from a livescan) have to be before they're considered to be a match. Set this tolerance too "loose" and you get lots of false positives (matches that shouldn't match); set it too "tight" and you get the opposite, false negatives. The tolerance setting where you get roughly the same number of errors each way is called the equal error point, and the error rate there is called the equal error rate (EER).

    Well, all current face recognition systems have an EER that is too high to be useful in nearly any situation, even when used for identity verification, as opposed to the much harder problem of identification (verification: I say I'm Bill Gates, and the system agrees; identification: the system says I'm Bill Gates, not RMS or anyone else). It's possible that in the future this will change, of course.
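The tolerance trade-off described above can be illustrated with a threshold sweep over two fabricated score lists (the numbers are invented; real evaluations use large score distributions):

```python
# Scores a matcher might assign: genuine pairs should score high,
# impostor pairs low. One genuine outlier and one impostor outlier
# make the two sets overlap, so no threshold is error-free.
genuine = [0.91, 0.85, 0.88, 0.95, 0.60, 0.90, 0.83, 0.87]
impostor = [0.40, 0.55, 0.62, 0.80, 0.70, 0.48, 0.58, 0.52]

def error_rates(threshold):
    frr = sum(s < threshold for s in genuine) / len(genuine)     # false rejects
    far = sum(s >= threshold for s in impostor) / len(impostor)  # false accepts
    return frr, far

# Sweep thresholds; the equal error point is where the two rates meet.
equal_error_t = min((t / 100 for t in range(101)),
                    key=lambda t: abs(error_rates(t)[0] - error_rates(t)[1]))
frr, far = error_rates(equal_error_t)
```

Loosening the threshold below the equal error point trades false rejects for false accepts, and vice versa; the equal error rate is just the height of that crossing point.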

    However, this doesn't really matter because we already have ready access to an excellent and very widely available face recognition system: the Mark I eyeball. Millions of years of evolution have made people extremely good at identifying and matching human faces. What people aren't so good at (with notable exceptions) is matching a face against a database of thousands of faces they've seen only once, and *that* is something that face recognition systems can do extremely well. They may not be able to decide which faces are a "match", but they can do an excellent job of finding the *closest* faces, which can then be reviewed by the super-duper face-matching algorithm contained in the average person's head.

    When automated face recognition is used in that sort of context, spoofs like this one are unlikely to be very useful; if you want to impersonate someone you'd better get a face that's good enough to fool another human. It's doable, certainly, but much harder. And holding a laptop screen in front of your face is likely to raise some suspicions.

    • Yeah, yeah. That's what they said about handwriting. Oh, wait. They were right. Maybe I'm agreeing with you.

      Just like my beloved Apple Newton -- It got the handwriting right 98% of the time, but for the other 2%, you'd find yourself double-tapping the word to see what else it thought you might have written. I'd be surprised to learn that this isn't the way most firms are implementing the technology. After all, "Blocks more than 98% of intruders" isn't a great advertising slogan unless you plan to use
  • Better Than (Score:3, Funny)

    by somethinghollow ( 530478 ) on Friday June 27, 2003 @06:39PM (#6315551) Homepage Journal
    At least I don't have to cut someone's fingers off/eyes out/head off/etc. to get past these types of security measures any more.

    Whew! What a relief.
  • Comment removed based on user account deletion
    • Much like a hashing algorithm (and the pigeonhole principle) if two items can hash to the same spot, then the algorithm is broken; or in this instance two people look alike and the computer can't tell them apart.

      Er, actually no. Hashing two templates to the same key is not evidence of a broken algorithm, as long as some of a whole range of other factors can be used to "work" the collision. In particular you want the algorithm to return an even distribution across the key space and even more particularly

  • I'm guessing that these biometric templates could misidentify people that look quite different to a human observer.

    For instance, if she had a little less facial hair, my aunt's bouffant hairdo under a scarf might give her the same biometric as Osama bin Laden.

  • For some reason, I don't think biometric face scans would hold up in Hollywood (well, Los Angeles for that matter) very well. Having lived there, people's faces just seem to keep changing. And so do hair and eye colors. It's almost like a hobby for some people.
  • I remember reading an article (possibly from here) about the challenges facial recognition systems faced, in particular comparing the facilities in the human brain. It had very interesting examples, for instance showing only a mouth and chin, but even with just that information, most people recognized it as Julia Roberts. They also altered a picture of Clinton and Gore but switched their mouths, something again that everyone notices but that a computer would have a very hard time picking up on. Finally,
  • by SiliconEntity ( 448450 ) on Friday June 27, 2003 @07:29PM (#6315818)
    Every comment I have read has missed the point!

    This is not an exploit designed to show that biometric systems can be fooled or that you could create some kind of fake image that would match an existing one.

    The whole point is that this shows that biometric templates are privacy-sensitive. Previously it was thought that they could be stored and promulgated without interfering with anyone's privacy, because it was thought to be infeasible to start from the template and reconstruct personally identifiable information about the subject.

    The new paper shows that this is not true; from the templates, you can reconstruct an identifiable picture of the individual. That means that, for example, if you had a bunch of templates of people who went in for an AIDS test, you could re-create pictures of the people who went in, adequate to recognize individuals.

    This would therefore interfere with the privacy of those individuals. And that implies that templates need to be subject to the same kind of privacy restrictions as other forms of personally identifying information, a standard to which they have not traditionally been held.

    And that's the point of the paper.
  • by jetmarc ( 592741 ) on Friday June 27, 2003 @07:37PM (#6315873)
    The algorithm they used is simple. They use the face recognition
    system as an "oracle" and present different images until a match
    is achieved. The different images are not chosen at random, but
    evolutionarily: a selection of images is presented, and the best
    (highest-scoring) one is chosen. Recursively, new selections are
    derived from the best image and again presented to the oracle.

    According to the article, 24,000 images are necessary to achieve
    convergence, even when the initial images were specifically chosen
    NOT to be visually similar to the "target" image.

    Some oracles can't be questioned 24,000 times - e.g. at an airport
    or an ATM. You might get arrested long before you finish.

    However, press releases often indicate which company designed the
    software for a particular implementation of face recognition. You
    can easily purchase other software from the same company (or find
    an OEM product) and thus have the same (or a very similar) oracle
    on your desk at home. There you can run the 24,000 iterations to
    get hold of the "good" image, then proceed to remodel your face
    or otherwise "present" the image to the real face recognition
    system.
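The oracle-guided search described above can be sketched as a toy hill climb. Everything here is made up for illustration - the matcher, the 64-value "template", and the step size - and this is not the researcher's actual algorithm; it only shows how a score-returning oracle leaks:

```python
import random

# Toy stand-in for a commercial matcher. The enrolled template is a
# 64-value vector; score() tells the attacker how well a candidate
# matches, but the attacker never sees TARGET itself.
TARGET = [random.randrange(256) for _ in range(64)]

def score(candidate):
    # Higher (closer to 0) means a better match.
    return -sum(abs(c - t) for c, t in zip(candidate, TARGET))

def hill_climb(iterations=24000):
    best = [random.randrange(256) for _ in range(64)]
    best_score = score(best)
    for _ in range(iterations):
        mutant = list(best)
        i = random.randrange(len(mutant))
        mutant[i] = min(255, max(0, mutant[i] + random.choice((-8, 8))))
        s = score(mutant)
        if s > best_score:      # keep only improvements
            best, best_score = mutant, s
    return best, best_score
```

After roughly the 24,000 queries the article mentions, the candidate converges close to the hidden template - exactly the property the paper exploits.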

    In my opinion, biometrics just doesn't work for security, because
    the datasets are open for everyone to see.

    Just look at those stupid press releases from Siemens/Infineon,
    who have highly paid security engineers invent ATM cards with
    fingerprint sensors. Owner's fingerprint => money from the ATM.
    And where does the owner leave his fingerprint when handling the
    card? Couldn't be on the very ATM card itself, could it?

    Acceptable security requires

    a) something you have, and

    b) something you know.

    When the item you have is stolen, the thief lacks the information
    you know. And vice versa: when the secret is learned (e.g. by
    shoulder surfing at an ATM), the thief still lacks the item you
    have, so the electronic robbery stays incomplete.

    Biometrics is something you have, not something you know. That is
    the key thing to learn here!

    It can be copied without your noticing, but that doesn't make it
    category b). It is still something you have, because everybody who
    gets physically near you has access to it. You can't just shut
    up to make it stay secret.

    Therefore, biometrics won't (ever) work as long as it's only
    coupled with other category a) stuff. A biometric dataset can
    possibly replace a physical token, but it can NOT replace a PIN
    code.

    I'm happy that this is once again demonstrated, with press coverage.

    Marc
  • A couple of decades ago Ottawa was the world's coldest capital city (I forget which one it is now). The saying goes that come winter it's impossible to tell people apart, because everyone's wearing parkas. Now there's a challenge for facial recognition!
    • Flamebait?

      As a resident of Ottawa, I can say this is true...really! It's actually rather insightful. From January to March here you're only likely to see the tip of someone's nose, as the rest of the face is (and should be) covered by parkas, toques, balaclavas or scarves.

      Facial recognition biometrics would never be used here, for that very reason.

  • by lkaos ( 187507 ) <anthony@NOspaM.codemonkey.ws> on Friday June 27, 2003 @09:03PM (#6316452) Homepage Journal
    A useful password hash (at least one that isn't considered plain-text equivalent) is a cryptographic hash: one designed to be computationally infeasible to invert.

    For instance, take this simple (and deliberately weak) hash:

    uint32_t hash = 0;

    for (size_t i = 0; i < str.length(); i++) {
        hash += str[i];    /* sum the character codes of str */
    }

    Given an input of say, foobar, one would get a hash of 633. Now, if I start with an arbitrary password of say, google, I get a hash of 637.

    Since I know that slight adjustments to the word produce slight differences in the hash, I can just start moving letters one place down the alphabet until I find a matching value.

    Lets say I choose:

    google -> 637
    foogle -> 636
    fnogle -> 635
    fnngle -> 634
    fnnfle -> 633 *bingo*

    So now I've successfully "exploited" this password protection mechanism. This is why it's referred to as plain-text equivalent.
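The letter-shifting attack described above takes only a few lines to recreate. This sketch decrements letters one alphabet position at a time, and assumes the starting word's hash is above the target and that the target is reachable:

```python
# Toy recreation of the attack on the additive hash above
# (the hash is just the sum of the character codes).
def weak_hash(s):
    return sum(ord(c) for c in s)

def forge(start, target_hash):
    # Walk letters down the alphabet until the sums match.
    word = list(start)
    i = 0
    while weak_hash("".join(word)) > target_hash:
        if word[i] > "a":
            word[i] = chr(ord(word[i]) - 1)
        else:
            i += 1      # this position is exhausted; move on
    return "".join(word)
```

For example, forge("google", 633) yields a six-letter word that collides with "foobar" under this hash, just as the worked example shows.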

    A cryptographic hash, though, has the interesting property that a small change results in an unpredictable difference. For instance, in the same example you might get:

    google -> 3453
    foogle -> 234543
    fnogle -> 234
    fnngle -> 23425434
    fnnfle -> 53424 ...

    There's no reason biometrics can't be cryptographically strong. It's just that the algorithms currently in use aren't. That's no big news for anyone with even half a clue stick.
    • The problem is that passwords are all-or-nothing. "google" works, "goofle" does not. There's no hint.

      Biometric systems, however, supply a score. If password systems did this, you could crack them like this: if the password is "aaaa" and you first try "mmmm", it'll (let's say) give a score of 50. So you try "mmma" and "mmmz" and see which one gives the higher score. The first might give 62.5% and the second 37.5%, so you'd stick with the first and make another change.

      With biometrics this is like
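The score-guided cracking described in that comment can be sketched directly. The leaky score function and the four-letter secret here are invented for illustration - no real password system returns a score, which is exactly the point:

```python
import string

SECRET = "aaaa"   # made-up secret; the attacker never reads it directly

def score(guess):
    # Hypothetical leaky checker: instead of a yes/no answer it returns
    # a closeness score, the way a biometric matcher does.
    per_char = (1 - abs(ord(g) - ord(s)) / 25 for g, s in zip(guess, SECRET))
    return sum(per_char) / len(SECRET)

def crack(length=4):
    guess = ["m"] * length            # start mid-alphabet
    for i in range(length):
        # Keep whichever letter raises the overall score the most.
        best = max(string.ascii_lowercase,
                   key=lambda c: score("".join(guess[:i] + [c] + guess[i + 1:])))
        guess[i] = best
    return "".join(guess)
```

With only 26 x 4 = 104 queries the score leaks the whole secret; a matcher that reports match scores leaks in the same gradual way.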
  • by willith ( 218835 ) on Friday June 27, 2003 @09:10PM (#6316487) Homepage
    "He developed an algorithm which allows a fairly high quality image of a person to be regenerated from a face recognition template..."

    This kinda reminds me of the part in Space Quest III, where you gain access to the restricted area inside ScumSoft by holding up a xeroxed picture of the CEO's face to the facial recognition scanner.
