AI

UK Police's Porn-Spotting AI Keeps Mistaking Desert Pics for Nudes (gizmodo.com)

An anonymous reader quotes Gizmodo: London's Metropolitan Police believes that its artificial intelligence software will be up to the task of detecting images of child abuse in the next "two to three years." But, in its current state, the system can't tell the difference between a photo of a desert and a photo of a naked body... "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography," Mark Stokes, the department's head of digital and electronics forensics, recently told The Telegraph. "For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour."
The article concludes that the London police software "has yet to prove that it can successfully differentiate the human body from arid landscapes."
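To see why sandy scenery can trip up an image classifier, here is a minimal, purely illustrative sketch of a naive per-pixel skin-colour heuristic (based on the commonly cited Kovac RGB rule). This is an assumption for illustration only, not the Met's actual software: tan desert hues fall inside the same crude colour range as skin, so a simple colour-based detector scores a desert screensaver as mostly "skin."

```python
# Hypothetical sketch, NOT the Met's system: a naive per-pixel skin-colour rule
# (Kovac et al.'s RGB heuristic) that also accepts tan/beige desert pixels.
import numpy as np

def skin_fraction(rgb: np.ndarray) -> float:
    """Return the fraction of pixels whose colour falls in a crude 'skin' range.

    rgb: H x W x 3 uint8 image array.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    spread = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int)
    mask = (
        (r > 95) & (g > 40) & (b > 20)
        & (spread > 15)
        & (abs(r - g) > 15) & (r > g) & (r > b)
    )
    return float(mask.mean())

# A flat 'desert sand' colour (roughly RGB 210/180/140) satisfies every condition,
# so a desert screensaver scores as almost entirely "skin" under this heuristic.
desert = np.full((100, 100, 3), (210, 180, 140), dtype=np.uint8)
print(f"desert image skin fraction: {skin_fraction(desert):.2f}")  # ~1.00
```

The point of the sketch is only that colour alone is a weak signal; any detector leaning heavily on skin-tone statistics will flag arid landscapes the same way.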
