Submission: When facial recognition goes wrong (bbc.co.uk)
Bruce66423 writes: 'A man who is bringing a High Court challenge against the Metropolitan Police after live facial recognition technology wrongly identified him as a suspect has described it as "stop and search on steroids".
'Shaun Thompson, 39, was stopped by police in February last year outside London Bridge Tube station.
'Privacy campaign group Big Brother Watch said the judicial review, due to be heard in January, was the first legal case of its kind against the "intrusive technology".
'The Met, which announced last week that it would double its live facial recognition technology (LFR) deployments, said it was removing hundreds of dangerous offenders and remained confident its use is lawful.
'LFR maps a person's unique facial features, and matches them against faces on watch-lists.'
I suspect a payout of £10,000 for each false match that is acted on would encourage more careful use, perhaps with a second payout of £100,000 if the same person is victimised again.