When people try to share this disgusting content, they are caught and prosecuted.
That seems like the wrong way of going about it. The image has already been created and the child has already been exploited; the fact that perverts share the image shouldn't really matter. They should be putting more effort into finding the source of the image, the guy who's actually touching kids.

It's a bit of a weak analogy, but compare this to the way we handled the drug war in the past: Users were pursued just as hard as Producers, and then nobody thought about the User after incarceration. If the User springs a trap, fine, that's cool; it's a great chance to get him some therapy and make him better. But the User is not the one hurting children, and this is definitely aimed at the Users. The algorithm looks for images that have already been circulated, not new ones. The new images are the ones that matter, because they're what would help trap Producers; as it stands, Google is mostly doing a PR stunt with this program.

Unrelated: it also bothers me that pictures of dead children and exsanguinated teens float around and are perfectly legal to see. That's some shaky logic.
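To make the distinction concrete: "looks for images that have already been circulated" means matching uploads against a database of hashes of previously identified material. Here is a minimal sketch of that idea; the hash set, function names, and exact-match SHA-256 are illustrative assumptions on my part, since real systems (e.g. Microsoft's PhotoDNA) use robust perceptual hashes so that resized or re-encoded copies still match:

    import hashlib

    # Hypothetical database of fingerprints of already-circulated images.
    KNOWN_IMAGE_HASHES = {
        hashlib.sha256(b"bytes of an already-known image").hexdigest(),  # placeholder entry
    }

    def image_fingerprint(image_bytes: bytes) -> str:
        """Return a hex digest identifying this exact byte sequence."""
        return hashlib.sha256(image_bytes).hexdigest()

    def is_known_image(image_bytes: bytes) -> bool:
        """True only if the image is already in the database.

        A brand-new image from an active Producer never matches,
        which is exactly the gap described above.
        """
        return image_fingerprint(image_bytes) in KNOWN_IMAGE_HASHES

    print(is_known_image(b"bytes of an already-known image"))  # True: already circulated
    print(is_known_image(b"bytes of a never-seen image"))      # False: new image slips through

The point the sketch makes is that this kind of matching can only ever flag redistribution of known images; by construction it says nothing about freshly produced material.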
A brand spanking new Google Nexus 7 arrived in the office today. I was told by the person who brought it to go ahead and mess with it and try it out, so the first thing I did was a factory reset, after which it updated to Android 4.2. Very nice. I've been messing with it for a couple of hours now, and here are my very first impressions.
"Little else matters than to write good code." -- Karl Lehenbauer