Face-Recognition Software Fingers Suspects 184
eldavojohn writes, "In Holyoke and Northampton, Massachusetts, the police have a new member on the team. It's facial recognition software that will mine the 9.5 million state license images of Massachusetts residents. From the article: 'Police Chief Anthony R. Scott said yesterday he will take advantage of the state's offer to tap into a computer system that can identify suspects through the Registry of Motor Vehicle's Facial Recognition System.' The kicker is that this system has been in use since May and has been successful." An article from Iowa a few weeks back mentions that software from the same company (Digimark) is in use to catch potential fraud in driver's license applications in Alabama, Colorado, Kansas, Massachusetts, Oregon, and Texas. But offering the software and photo database as a resource to police departments raises the stakes considerably. I wonder what the false positive rate is.
Re:False positive rate? (Score:1, Insightful)
Re:be careful.. (Score:2, Insightful)
Re:False positive rate? (Score:5, Insightful)
Re:but no stats (Score:4, Insightful)
Let's say your system is 99% reliable; that is, 1% of the time it reports a positive when checking a true negative, and vice versa.
Now screen 1,000,000 people looking for one suspect: your system turns up roughly 10,001 positives. Which one is the suspect?
This problem has been well studied in cancer screening. For certain rare cancers, there are tests that are nearly 100% reliable and yet, when they report a positive, are usually wrong.
With cancer, it's fine to say the flagged 1% of the population should be informed and then checked with another procedure. But this is a process that fingers potential criminals, and in modern criminal justice merely being a suspect hurts your life in a myriad of ways (God help you if the information winds up somewhere accessible to Google, or worse yet, if the case has anything to do with terrorism).
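A quick back-of-the-envelope sketch of the arithmetic above (the 99% figure, the million-person screen, and the single true suspect are all illustrative assumptions from the parent comment, not real numbers for any deployed system):

```python
# Base-rate sketch: even a 99%-reliable screen over a large population
# produces overwhelmingly many false positives when the thing being
# screened for is rare.

population = 1_000_000
true_suspects = 1
error_rate = 0.01  # assume symmetric 1% false-positive / false-negative rate

innocents = population - true_suspects
false_positives = innocents * error_rate            # ~10,000 innocent matches
true_positives = true_suspects * (1 - error_rate)   # the suspect, 99% of the time

expected_positives = false_positives + true_positives
# Probability that a flagged person is actually the suspect:
precision = true_positives / expected_positives

print(f"expected positives: {expected_positives:,.0f}")
print(f"chance a given match is the real suspect: {precision:.4%}")
```

Run it and the screen flags about 10,001 people, of whom the odds that any given one is the real suspect are roughly 1 in 10,000, which is the base-rate fallacy the cancer-screening analogy is pointing at.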
I have the same objection to large-scale wiretapping operations, if anything, the human factor there greatly increases the problem.
Oh goodies (Score:2, Insightful)
As for the level of trust that can be placed in this system, I would put it at low at best, and that's setting aside the known inaccuracies of current facial recognition software. The fact that SWAT teams routinely smash into the wrong person's home because of a misspelled address or a faulty description should be a clue that this system would trouble a lot of innocent people. I have little doubt there would be many false positives involving people who merely looked similar to a criminal, some of whom would make it all the way to the "arrest" phase before the police finally discovered the mistake. And in an environment where, at least as far as police mistakes and abuse are concerned, a light slap on the wrist, paid leave, or a reprimand in the file is about all the punishment that can be expected, I don't think they need a tool as inaccurate and dangerous as this. If they can eventually learn to use their current tools responsibly (like putting heavy, warranted restrictions on access to DMV info, phone records, and credit card info) and punish and repair mistakes appropriately, then maybe they should be allowed equally restricted access to a tool this dangerous, under the aforementioned criminal mug shots restriction, but not until then.
WTF!?~ (Score:4, Insightful)
license photograph archive (Score:5, Insightful)
I just took a look at the MA code [mass.gov] and couldn't find anything allowing the photographs to be archived by the Registry of Motor Vehicles. Maybe someone with a better knowledge of MA law can find such a law.
This is not an insignificant issue: archiving the photographs and sharing them with law enforcement, basically without limit and without a warrant to access the database, is the practical equivalent of requiring every citizen over the age of 16 to show up at the local police station and be photographed.
I consider the photograph archival of US license pictures to be one of the biggest and least known/understood privacy invasions in the last 10-15 years.
Re:False positive rate? (Score:4, Insightful)
Re:False positive rate? (Score:5, Insightful)
dealing with false positives from "tips". I suspect that is not proven.
The amount of comments here, endorsing ... (Score:3, Insightful)
Turning the law upside down.. (Score:1, Insightful)
Other examples of having to prove your innocence are any broad RIAA 'john doe' suit, being labelled a terrorist (in some cases you can't even prove your innocence there because you're quickly shipped outside the internationally agreed legal framework at Guantanamo Bay), Microsoft's WGA, oh, and forgetting your college card, which apparently is enough for police officers to subject you to unwarranted violence that in other nations would lead to such officers facing jail. On that topic, I vaguely recall that the other argument for Iraq was the police brutality the citizens were subjected to. Well, it appears a few lessons were learned there - just not the right ones..
Let me ask you something: does Washington actually have any politicians left with a spine or have they all been bought? Does anyone actually CARE about human rights there other than to harass other nations with and as a pretext to start the odd war when it's politically convenient?
The US is not 'on' the slippery slope - it's damn well sliding fast if citizens don't start making Washington behave like most citizens want (I'm making the distinction here because most Americans I know don't seem to agree with what's happening in Washington, as the latest election results showed). It'll be interesting to see if that 'bloody nose' Bush received in the elections will make a difference.
Given the amount of money involved, I somehow don't think so.
Re:Once a criminal, always a suspect? (Score:2, Insightful)
Re:Once a criminal, always a suspect? (Score:2, Insightful)
As for your problems, yeah, it sucks. But the article isn't saying that only convicted criminals go in: everybody goes in. So how is that punishing you? If you got nailed for involuntary manslaughter during a drunk driving hit-and-run a few years ago, and then a couple of years later it happens again in your neighborhood, you can be sure your name will come up again (even if only for a few seconds) -- with or without the system.
It's just an investigative tool to help generate a list of suspects. The software is not good enough to be treated like fingerprints or DNA; it's just saying "these guys kind of look like the guy in the grainy black-and-white videotape." They'd need a lot more to charge you.