Face-Recognition Software Fingers Suspects 184

eldavojohn writes, "In Holyoke and Northampton, Massachusetts, the police have a new member on the team. It's facial recognition software that will mine the 9.5 million state license images of Massachusetts residents. From the article: 'Police Chief Anthony R. Scott said yesterday he will take advantage of the state's offer to tap into a computer system that can identify suspects through the Registry of Motor Vehicles' Facial Recognition System.' The kicker is that this system has been in use since May and has been successful." An article from Iowa a few weeks back mentions that software from the same company (Digimark) is in use to catch potential fraud in applying for driver's licenses in Alabama, Colorado, Kansas, Massachusetts, Oregon, and Texas. But offering the software and photo database as a resource to police departments raises the stakes considerably. I wonder what the false positive rate is.
This discussion has been archived. No new comments can be posted.

  • by Monsieur_F ( 531564 ) <.moc.liamtoh. .ta. .xff.> on Saturday November 18, 2006 @10:32PM (#16901282) Homepage Journal
    $ finger suspect
    finger: suspect: no such user.

    $ finger suspects
    finger: suspects: no such user.
    • by smittyoneeach ( 243267 ) * on Saturday November 18, 2006 @11:06PM (#16901436) Homepage Journal
      You, for one, have clearly ventured into Soviet Rootkit territory: your new PCI overlord [slashdot.org] welcomes you!
    • by Web Goddess ( 133348 ) on Sunday November 19, 2006 @03:09AM (#16902442)
      I had one criminal conviction when I was 18. It has dogged me my entire life. It is so upsetting to hear people say, oh well, as long as it's only *convicted criminals* who go into these insane database searches.

      It's wrong to mass-search drivers' license pictures. It's also wrong to mass-search pictures of anyone who has ever been convicted of a crime. Many, many people have a regrettable misdeed in their past. It's wrong to continue to punish people who have, as was once said, "paid their debt to society."

      Penitent and paranoid in California.
      • That argument is awfully similar to GMail scanning your email.

        It's a freaking computer! It doesn't give a damn about you.
      • Re: (Score:2, Insightful)

        by ectotherm ( 842918 )
        I understand your point, but by the same token would you want a convicted burglar as a house-sitter, or a convicted child molester babysitting your children? Trust, unfortunately, is like fine crystal: once broken, it will never be the same. It can be repaired, but the initial breakage will always be there as a reminder. It is unfortunate that you made a bad choice in your youth, and that the results of your choice appear in any background checks done, but you cannot expect to engender 100% trustworthine
        • I understand your point, but by the same token would you want a convicted burglar as a house-sitter or a convicted child molester babysitting your children?

          The convicted burglar? Depending on circumstances. If he broke into a store because he was starving on the street at age 18, and he's now 40, sure, why not? The child molester should never be let out onto the streets again. If someone's enough of a bastard to rape a child, they should get a mandatory death penalty.

          Trust, unfortunately, is like fi

      • Re: (Score:2, Insightful)

        Exactly how is it "wrong" to mass-search drivers' license pictures? So long as they don't ONLY rely on the computer then it's alright.
        • Person A gets caught on tape shooting a grocery store clerk. Unfortunately, that is the only lead.
        • While pursuing the ordinary investigation (involving the public, looking for witnesses, looking for speeding vehicles, etc.)...
        • the police run the face through the recognition DB and get a number of hits
        • They then use ordinary police work to check out the potential suspects:
      • It's wrong to mass-search drivers' license pictures. It's also wrong to mass-search pictures of anyone who has ever been convicted of a crime. Many, many people have a regrettable misdeed in their past. It's wrong to continue to punish people who have, as was once said, "paid their debt to society."

        I truly understand what you mean. Although I have not been convicted of any crime (other than a few traffic issues and a failure to appear) my closest friend was so accused. What really sucks is that, in his case
  • False positive rate? (Score:5, Interesting)

    by Anonymous Brave Guy ( 457657 ) on Saturday November 18, 2006 @10:33PM (#16901290)

    I wonder what the false positive rate is.

    Speaking as someone with (a) some common sense and (b) a formal CS background including image processing work, I think it's fair to say that it won't be zero.

    I hope they have good procedures in place to immediately drop any proceedings against those who are misidentified, and that any automatic identification using this system is not somehow considered 100% reliable in court.

    • by bob65 ( 590395 )
      It's just another tool - I don't see any harm in this whatsoever, even if the false positive rate is like 30% (although that might make the tool less useful). Ultimately it's up to the police to figure out who the suspects are, and using this tool is sort of analogous to receiving a bunch of "tips" from "witnesses" that point to a group of potential suspects.
      • "Ultimately it's up to the police to figure out who the suspects are, and using this tool is sort of analogous to receiving a bunch of "tips" from "witnesses" that point to a group of potential suspects."

        Trouble is... to the general populace... the computer is infallible... much more correct than a person could be... people really believe that in droves out there.

        Much like they all believe that Doctors know it all...give them complete 100% trust...not knowing that the Dr's largely are making only very educat

      • If the false positive rate is around 200%, and if the false negative rate is 50%, then any given search might tend to yield 2-4 suspects (depending on the variance of the false positives, of course) and about half the time the actual culprit will be included. Of the 2-4 suspects that are not the culprit, most of the time it will be quite easy to eliminate them, either visually by the user, or from simple detective work (i.e., a rock-solid alibi or possibly even common sense in some cases). So, for a fairly

    • by Adam Zweimiller ( 710977 ) on Saturday November 18, 2006 @10:48PM (#16901372) Homepage
      Yeah, but furthermore wouldn't it be safe to say that they don't just go indict someone with charges without a gumshoe comparing the photographs themselves? I mean, there's got to be some sort of human involvement. Let's say they have a CCTV image of a burglary suspect and they use this software to scan the DMV photos for a match, and the software returns 1 or more matches. They don't just throw the match(es) in jail right then. I think it's a safe bet that law enforcement would use their own peepers to compare the DMV photographs with the CCTV to see if it's close, and then go about questioning the match(es) for their whereabouts, etc., looking for other evidence before going ahead with prosecution. It's obvious that this system is meant to give leads rather than 100% solve cases. Sure there are going to be false positives; it's a computer looking for matches. It's more than likely that it's designed to be liberal with its matches simply to give detectives a list of a dozen possible suspects rather than the entire population of a city/town etc. Regardless, I can't say I'm entirely surprised that a slashdot editor took this chance to stir the pot on something that for the most part is cool, useful, and manages to assist law enforcement without trampling our privacy.
      • by trianglman ( 1024223 ) on Saturday November 18, 2006 @11:50PM (#16901608) Journal
        The issue is more than just the false positive rate. The problem is that they are going through the entire DMV records. As it stands right now, most places can only go through previously arrested people for things like fingerprint and facial matches, which is something that comes with having a record. I, as a law-abiding citizen on the other hand, should not be immediately thrown under suspicion just because my face is somewhat similar to a blurry CCTV image, which is what the false positive rate could cause. I have a job that requires me to be in a certain place at a certain time; that's not exactly possible if I am being held for questioning because someone I have never met did something on the other side of town. If I could trust our government to use new technologies judiciously and with restraint, it wouldn't be a problem, but this hasn't ever been the case and, short of some utopia suddenly appearing, probably never will.
        • by KKlaus ( 1012919 )

          I, as a law abiding citizen on the other hand, should not be immediately thrown under suspicion just because my face is somewhat similar to a blurry CCTV image, which is what the false positive rate could cause.

          That's not really a valid complaint in and of itself. The system already works with you being a suspect for looking like whoever committed the crime. That's what wanted posters are about, what "have you seen this man" questions are about, etc. It's not like criminals pose for cameras, so using im

          • If the system is used fairly, and police understand that people the system fingers are fairly likely to be law abiding citizens and should therefore be treated with courtesy and respect, I think it will work fine.

            Well, there's your problem right there. Sometimes it's not about justice, it's about the police arresting someone and the DA getting a conviction for political gain, or just so that they don't look like a bunch of fools. The justice system in the US is supposed to operate on a presumption of
            • by KKlaus ( 1012919 )
              Sure I guess my point was that if the police want to throw you in jail for looking like someone who committed a crime, they don't need some database of license photos to do it. They already have that capability.
              • by shmlco ( 594907 )
                Yes, but what if you work in some area, like DC, where the only way to get to work every day is to drive past a battery of these cameras? And you're stopped, daily, because the system says so?

                Which is the difference between then and now. Then, someone COULD have done it, true, but statistically, the odds of it happening were slight. (Cop being there, you being there, cop noticing you and running a check, dozens of other people not being checked because cop was checking someone else, cop not munching a donut
        • Re: (Score:3, Funny)

          "Round up twice the usual suspects!" - Louie Renault
    • I'd hope this would be used as much to rule out suspects as to convict a specific one. And even if the false positive rate were a not-unrealistic 0.1%... I don't think facial recognition alone would do someone in. But this could certainly help, even if alone it wouldn't convict.
    • Re: (Score:1, Insightful)

      by snark23 ( 122331 )
      It won't be zero, but it also can't be very high or else it wouldn't be cost effective for the police. Assuming that it takes a non-trivial amount of human time to process each positive, a high false-to-true positive ratio would be a show-stopper.
      • by Znork ( 31774 )
        "It won't be zero, but it also can't be very high or else it wouldn't be cost effective for the police."

        You're assuming they want to nail _the_ perpetrator, not _a_ suspect. Consider the number of times that US prosecutors have actually opposed conceivable exonerating DNA tests even for convicts on death row, and you might not think a high false positive rate would be a showstopper at all.

        From what I've seen of facial recognition software, the error rates are horrible. Set it to sensitive and you get error
    • If juries weight this kind of evidence heavily enough to convict on the basis of it alone, then the false positive rate will be zero, by definition.
    • People seem to have the wrong idea here. I highly doubt that a system like this would examine a photo and say "this is your criminal"; it's more likely to be used to narrow down searches from... well... everyone, to maybe the top 50 matches or something to work with. Sounds like a great idea to me.
    • Re: (Score:3, Informative)

      by nbauman ( 624611 )
      >I wonder what the false positive rate is.

      It will be like the Do Not Fly list.

      Years ago, Scientific American had a story about a prototype facial recognition system built from photos of students at Brooklyn College. They had a database of about 1,000 faces, and they showed the 2 most similar and the 2 most different. The 2 most different were very different. The 2 most similar were so similar, I couldn't tell them apart. So back-of-the-envelope, I'd say about 2 faces in 1,000 will be so similar you can't tell
      • by Firehed ( 942385 )
        One guy was hispanic, another guy was Italian.

        Hispanic and Italian? Surely not... they were obviously both Italic.
        *rimshot*
    • It's funny how people are crying over this invasion of their rights. Fine. The police are horrible people. I'll accept that for the sake of the discussion.

      But consider the opposite side of the equation... if you are able. What if the system was only 50% successful? Isn't that at least a high enough success rate to send out a cop to personally ID the guy? What's the difference between this system, and some old lady down the street calling in to the cops saying she recognized the guy from the photo in the post
      • by shmlco ( 594907 )
        "Isn't that at least a high enough success rate to send out a cop to personally ID the guy?"

        If the "match" is against, say, an escaped felon, "armed and presumed dangerous," I strongly suspect the cops are going to do just a bit more than saunter up and ask for your id...
        • So you're saying we shouldn't even try to catch the armed and presumed dangerous felon?
          • by shmlco ( 594907 )
            Your original comment was, "What if the system was only 50% successful?" With the implication that 50% of the time it's going to be unsuccessful.

            My example simply served to demonstrate the severity of that potential mistake...
    • by mdfst13 ( 664665 )
      "any automatic identification using this system is not somehow considered 100% reliable in court."

      They already have systems like this for fingerprints: local police send fingerprints to the FBI; the FBI puts them in computer; computer spits out a possible match or matches; a real person then looks at the submitted fingerprint and the stored fingerprint and makes a decision. If it goes to court, a real person who is locally available will testify as to the match (well first, you can see both have a tented
    • Police don't walk around blindfolded. I'm sure they make their own decisions.
      This would just help them narrow down the possibilities.
  • by Anonymous Coward
    This might be software that is a lot more popular with the ladies than the gents!
  • Well... (Score:1, Funny)

    by Anonymous Coward
    Did the software at least buy them dinner first?
  • Oh yeah? (Score:5, Funny)

    by slughead ( 592713 ) on Saturday November 18, 2006 @10:40PM (#16901346) Homepage Journal
    Face-Recognition Software Fingers Suspects

    And what does it do if they're male?
  • As a license-holding Massachusetts resident who lives right near Holyoke and Northampton (Amherst) it's nice to know I can look forward to being a criminal suspect in the near future.
    • Your picture stays the same.
      That, coupled with the ravages of solar radiation upon your facial skin after a few years, means you will never be recognized.
      You will be unstoppable.
  • by Wylfing ( 144940 ) <brian@@@wylfing...net> on Saturday November 18, 2006 @11:01PM (#16901426) Homepage Journal

    I fear "automatic" matching of criminals and trying to catch them, e.g., when they renew their license. Here is a true false-positive story that happened to me. I went to renew my driver's license, and the nice lady informed me that she could not issue me a license because I had had mine revoked in Maryland due to felony charges. Now, I have never committed a felony and I have never been to Maryland, let alone had a driver's license there. The nice lady was unpersuaded by this information. The database said I was a felon in Maryland, and that was the end of the story.

    After much yelling about the problem, it was finally revealed that the real felon's name was exactly like mine except for one letter, and some moron doing data entry had gone ahead and decided we were the same person, based solely on name. Since this data problem was local to the "matching" system they had implemented, and not prevalent in who-knows-how-many databases, it was cleared up with a little investigation. However, if that "match" had been replicated into other systems, I could very well have had a nasty time clearing my name. The lady at the DMV was 100% convinced that I was a felon based on what the computer told her. Quite likely, no one else would have believed I was innocent either.

    I can see this system playing havoc with people too. I have met people with no connection to each other but who nevertheless look virtually identical.

    • by whm ( 67844 ) on Saturday November 18, 2006 @11:42PM (#16901580)
      I can see this system playing havoc with people too. I have met people with no connection to each other but who nevertheless look virtually identical.

      This article is a great example of what you've described,

      http://nebraska.statepaper.com/pages/drudged/innocent.html [statepaper.com]

      In summary: There are two girls that look nearly identical. One of them committed a crime, and the other was put in jail for a week. There are photos in the article.
    • Maybe it will be for the best if this story becomes the norm. If it just happens in a few rare cases people (and politicians) will ignore the problem. If it happens a lot the system will be abandoned.
    • I didn't see any mention in the article that a match from the computer system would be admissible in court. It is just a tool for narrowing the suspect list down from "everyone but me" to maybe this one or more people from the database, to maybe someone not in the database...
    • by Skidge ( 316075 ) *
      I had a similar problem renewing my license in NY state. I have a fairly common name and someone with an identical name, down to the middle initial, and I believe even the same birthday had some charges for running a red light and fleeing the scene of an accident in NY City and some violations in Florida, as well. Problem was, I was living as far from NYC as you can get in the state and had only been to NYC when I was 5 years old. Not to mention that I was only 18 and had rarely driven outside of my county.
    • After much yelling about the problem, it was finally revealed that the real felon's name was exactly like mine except for one letter

      Is your name Tuttle [wikipedia.org], by chance?
  • by eyeball ( 17206 ) on Saturday November 18, 2006 @11:10PM (#16901448) Journal
    Part of the SQL better include something like "... WHERE OCCUPATION IS NOT 'politician' " otherwise there'd be total anarchy.

  • sounds like a Realdoll(TM) upgrade
  • but no stats (Score:5, Interesting)

    by troll -1 ( 956834 ) on Saturday November 18, 2006 @11:21PM (#16901510)
    Sounds promising for law agencies but given that no caught suspects have been named and that criticism persists that face recognition technology is inherently unreliable, I wonder how much of this is just (sales) hype. I mean, come on, give us some real data where you can say it's effective because ... here are the names of the criminals we caught and it can all be credited to the system.
    • Re:but no stats (Score:4, Insightful)

      by colmore ( 56499 ) on Saturday November 18, 2006 @11:59PM (#16901654) Journal
      There's a fundamental, mathematical problem for any system that screens large populations looking for a small number of targets.

      Let's say your system is 99% reliable, that is to say, 1% of the time it checks a negative it reports a positive and vice versa.

      Now you screen 1,000,000 people looking for one suspect, your system turns up 10,001 positives. Which one is it?

      This is a problem that has been well studied in cancer screenings. For certain rare types of cancer, there are nearly 100% reliable tests that are nonetheless usually wrong when they report a positive.

      Now it's fine to say, in the case of the cancer, that the 1% of the population should be informed and then checked via another procedure or something. But when we're talking about a process that fingers potential criminals, in a modern criminal justice system where merely being a suspect hurts your life in a myriad of ways, the cost of each false positive is far higher (god help you if the information winds up somewhere accessible to Google, or worse yet, the case has anything to do with terrorism).

      I have the same objection to large-scale wiretapping operations, if anything, the human factor there greatly increases the problem.
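      The parent's numbers are the classic base-rate problem, and they can be checked in a few lines. This is just a sketch of that arithmetic; the 99% accuracy figure and the single suspect in a million are the comment's hypothetical, not a property of any real system:

      ```python
      # Base-rate sketch: a matcher that is 99% accurate in both directions,
      # screening 1,000,000 people for a single real suspect.
      population = 1_000_000
      true_suspects = 1
      accuracy = 0.99  # assumed symmetric error rate, per the comment above

      # Innocent people expected to be flagged, plus the suspect if detected:
      false_positives = (population - true_suspects) * (1 - accuracy)
      expected_hits = false_positives + true_suspects * accuracy

      # Chance that any single flagged person is actually the suspect:
      posterior = (true_suspects * accuracy) / expected_hits

      print(round(false_positives))   # -> 10000 innocent people flagged
      print(f"{posterior:.6f}")       # -> 0.000099, about 1 in 10,000
      ```

      In other words, even an accuracy that sounds excellent leaves each individual hit overwhelmingly likely to be a false positive when the prior is one in a million.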
      • Now you screen 1,000,000 people looking for one suspect, your system turns up 10,001 positives. Which one is it?

        Worse, what if the answer is "none," since the actual criminal is either not in the database or wasn't recognized? Will the pressure to solve the case become great enough to pick the likeliest suspect out of those 10,001, and try to get a conviction?

        In Texas, there was a case where two janitors were the suspects in the killing of a girl. One was Black, one was White. There was no strong evid

  • The next time someone tells me that the slippery slope is an invalid argument, I'm going to slap them.
  • Coming soon to Toll Booths and ATMs, little brother.
    • the booth will slam shut on you, and armed guards will come and escort you to a holding cell pending questioning. All the while telling you it's for your own good.
  • by `Sean ( 15328 ) <sean@ubuntu.com> on Sunday November 19, 2006 @12:09AM (#16901694) Homepage Journal

    Finally! Inept police departments will be able to solve murders and other heinous crimes using awesome computer graphics in 47 minutes or less...just like on TV!

    Enhance...enhance...enhance...

  • Suddenly... (Score:3, Interesting)

    by Anonymous Coward on Sunday November 19, 2006 @12:20AM (#16901750)
    A whole lot of moustaches and beards become the new fashion...
  • Two potential issues with automated matching (not just facial recognition) are: ending up in files as a 'person of interest', 'subject of investigation' or whatever you want to call it, simply because you were a potential match; this doesn't look good in deep background checks. And, what would be more troublesome, spending time defending yourself in a preliminary investigation if authorities begin to rely too heavily on fuzzy matches, unless you track your every move.
  • Oh goodies (Score:2, Insightful)

    by Anonymous Coward
    Great, yet another unpleasant use of interdepartmental government cooperation. One completely unrelated activity (in this case, driving) being used to gather data for another activity (criminal apprehension). I don't know about everyone else, but when I went to get my license I didn't think "oh swell, this information will be culled through every time the police are looking for someone, criminal or not". Perhaps it is my incessant paranoia, but I don't like the idea of my name/information being put in a
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      I have a short barreled semi-automatic 12 gauge shotgun and other weapons loaded and ready to kill anyone (police or not) who tries to force their way into my home without showing a warrant first. I realize I would die in the process if it were a cop or SWAT team executing a so called "no-knock warrant", but some things are worth dying for, and the sanctity of my own home is one of them.
  • WTF!?~ (Score:4, Insightful)

    by sc0p3 ( 972992 ) <jaredbroad AT gmail DOT com> on Sunday November 19, 2006 @12:55AM (#16901924) Homepage Journal
    This is exactly like fingerprinting everyone in the state. Privacy has gone out the window. Making use of photos which people provided for their licenses in order to finger them as suspects is criminal.
  • That's the one that programmed me for evil!!!!!!
  • by JimBobJoe ( 2758 ) on Sunday November 19, 2006 @01:03AM (#16901960)
    You'd be surprised how many state legislatures never bothered authorizing their respective DMVs to archive the photographs (which is a huge change from the days of the original photo licenses, when only a negative was produced and no photograph was maintained.)

    I just took a look at the MA code [mass.gov] and couldn't find anything allowing the photographs to be archived by the Registry of Motor Vehicles. Maybe someone else with a better knowledge of MA law can find such a law.

    This is not an insignificant issue... the archival of the photographs, and the sharing of them with law enforcement basically without limit and without a warrant to access the database, is the practical equivalent of requiring every citizen above the age of 16 to show up at the local police station and be photographed.

    I consider the photograph archival of US license pictures to be one of the biggest and least known/understood privacy invasions in the last 10-15 years.
  • http://politics.slashdot.org/article.pl?sid=06/11/17/1630255 [slashdot.org]: "But rather than work out these dilemmas in partnership with their elected leaders, they were encouraged to regard all politicians as corrupt or mendacious by the media, which he described as 'a conspiracy to maintain the population in a perpetual state of self-righteous rage.' Whether media was left wing or right wing, the message was always that 'leaders are out there to shaft you.'"

    Obviously they are. A License to drive has just been t

  • Heh (Score:1, Informative)

    by Anonymous Coward
    "Face-Recognition Software Fingers Suspects"

    Oh, that's just great. First face-recognition violates our privacy, and now it's violating our orifices!

  • by quax ( 19371 ) on Sunday November 19, 2006 @02:43AM (#16902390)
    ... such a system is truly scary. What's next? How about 24/7 machine-assisted surveillance of all telephone calls just because it may help catch a terrorist? Oh, wait a sec =:-0
  • Twins (Score:5, Interesting)

    by ms1234 ( 211056 ) on Sunday November 19, 2006 @04:14AM (#16902622)
    How does it handle identical twins?
  • by Anonymous Coward
    I'm intrigued that there's so little objection to the gradual reversal of the first principle of law: innocent until proven guilty. In this case it starts on the wholly false premise that the images used on driver licenses are of a sufficient quality to make a proper match without a high false-positive rate (you can find the EU specs for biometric passports here [europa.eu]; I know that it has turned making a passport picture into something like an art form, and out of reach of your average 'picture me' box). Result
  • That's why I always wear a mask when I commit my crimes.
  • I wonder what the false positive rate is.

    Even if the rate is unacceptably high for automated use, it could prove very helpful.

    Imagine someone coming in to get a license, gets his picture taken, and while he waits for the license to be printed and laminated, the system searches its database. It produces the 10 closest matches it can find, and presents them to the DMV worker. The worker then visually compares the ten images with the actual person sitting in the waiting area. It's not necessary for the syst
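      That shortlist-for-a-human workflow can be sketched in a few lines. Everything here is invented for illustration (the record names, the similarity scores, and the assumption that the matcher returns a score where higher means more similar); the point is only that the system ranks candidates and a person makes the call:

      ```python
      import heapq

      # Hypothetical similarity scores from a face matcher (1.0 = identical).
      scores = {
          "license_0001": 0.12,
          "license_0002": 0.87,
          "license_0003": 0.45,
          "license_0004": 0.91,
          "license_0005": 0.33,
      }

      def shortlist(scores, n=3):
          """Return the n most similar records for a human to review by eye."""
          return heapq.nlargest(n, scores, key=scores.get)

      print(shortlist(scores))  # ['license_0004', 'license_0002', 'license_0003']
      ```

      The matcher never accuses anyone; it just orders the haystack so the DMV worker's eyeball comparison has a manageable number of candidates.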
    • In your scenario, the matching tool is used correctly because it ASSISTS in the decision making process. But how often have you heard "it's on the computer so it must be right"?

      The whole problem starts with someone considering the computer to be authoritative instead of yet another fraud detection tool - usually followed by downskilling the frontline workers which makes the whole matter worse.

    • I worked on image recognition software on my last job. The project was a failure, for many reasons. Our stuff was worse than average for such software, but the biggest problem is that facial recognition is very hard. The best stuff out there can maybe hit 90% accuracy. With lots of time consuming help from people, the accuracy rate can be pushed up to 98% or 99%. For 90% accuracy to even be possible, the subjects have to be photographed in a very controlled environment. Everyone's face must be in exac

  • So it would have fingered the suspect in the case of the Wendy's chili bowl... with... the... human... finger..? =\
