ISPs to Create Database to Combat Child Porn
BlueCup writes to tell us that several media companies are banding together to create a database of child pornography images to help law enforcement officials combat distribution of questionable material. In addition to the database, several tools and new technologies are also planned, but most notable is what some perceive as a willingness to cooperate, which critics say has been lacking in the past. From the article: "Each company will set its own procedures on how it uses the database, but executives say the partnership will let companies exchange their best ideas — ultimately developing tools for preventing child-porn distribution instead of simply catching violations."
Wanna bet? (Score:2, Interesting)
Anyway, it's just another case of "think of the children!!1"
Hashing? (Score:4, Interesting)
This can be a problem (Score:5, Interesting)
Recently, a popular imageboard at http://img.4chan.org/b/imgboard.html [4chan.org] has been added to that list for reasons unknown. Several UK ISPs, including BT Internet and NTL, have blocked that URL. Complaints to either the ISPs or the IWF from both the users and the site admin have gone unanswered. I am personally quite annoyed by this as I'm a regular user of that board.
It's this sort of unaccountable censorship of the Internet that makes me suspicious of such 'helpful' databases.
sets a bad precedent (Score:5, Interesting)
These online companies were previously protecting themselves from liability for their customers' transmissions by claiming that filtering this data would be an expensive and prohibitive task. By volunteering this service, they've crossed that line. It should be possible for the music companies, MPAA, etc. to demand filtering as well.
It's a pretty stupid plan nonetheless. These digital fingerprints will only catch casual or newbie child porn traffickers. Encryption will easily render these fingerprints useless. The worrisome side effect is the false positives that will be triggered by this fingerprinting technique. As an example, try using one of those packages that tries to tag your mp3s by fingerprinting [musicbrainz.org]... Pretty unreliable stuff.
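To see how fingerprint false positives arise, here is a toy sketch (this is not how MusicBrainz actually fingerprints audio; the deliberately tiny 8-bit fingerprint and the "file" names are invented to make collisions easy to demonstrate):

```python
import hashlib

def weak_fingerprint(data: bytes) -> int:
    # Deliberately tiny fingerprint (8 bits, 256 possible values)
    # so that collisions between distinct files become visible.
    return hashlib.md5(data).digest()[0]

seen = {}
collision = None
for i in range(1000):
    data = f"file-{i}".encode()
    fp = weak_fingerprint(data)
    if fp in seen:
        # Two different "files" share a fingerprint: a false positive.
        collision = (seen[fp], data)
        break
    seen[fp] = data

print(collision)
```

With only 256 possible fingerprint values, a collision among 1000 distinct inputs is guaranteed by the pigeonhole principle; real fingerprints are far larger, but the same trade-off between fuzziness and false positives applies.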
Seth
Re:The big problem (Score:5, Interesting)
Use a signature generation method like http://vision.unige.ch/publications/postscript/98
What is child porn? (Score:5, Interesting)
How do they categorise what is collected in their database as child porn? I have yet to see an automated system that can look at a photo and describe what it is (although several have been promoted over the years), so I imagine the decision as to what category the pics fall under must be made by a human. So my question is: whose standard do they apply in the process?
I can see that this process could be very arbitrary. So while I am not advocating child porn, I can also see that the data collection process could get very messy and have lots of false positives and negatives. And, like the TSA's no-fly list, it could be very hard to get off once you are on it.
Oh shit
Re:This discussion will come to nothing, so... (Score:3, Interesting)
Re:Devil's Advocate (Score:2, Interesting)
Let's hope that some detail has been lost: A database with some kind of hash sum of the images (and *not* the images themselves) would be a good idea. The big problem is to identify images (and videos) without (or with a very low rate of) false positives and false negatives. There are known ways to bypass simple file checksum algorithms: Append some junk bytes, recompress, rotate, mirror, and so on. The hash sum algorithm has to be able to detect this. The images could be packed into archives or iso files, and so on. There are nearly infinite possibilities to bypass blacklist filters. I won't go into details, for obvious reasons.
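A minimal illustration of the append-junk-bytes bypass (the byte strings below are invented stand-ins for an image's contents, not a real file format):

```python
import hashlib

# Stand-in for the raw bytes of an image file.
original = b"\xff\xd8\xff\xe0" + b"image payload" * 100
# Append a single junk byte, as a trafficker trivially could.
tampered = original + b"\x00"

h1 = hashlib.md5(original).hexdigest()
h2 = hashlib.md5(tampered).hexdigest()

# One appended byte yields a completely different digest,
# so a blacklist of plain file checksums no longer matches.
print(h1 == h2)  # False
```

The same applies to recompressing, rotating, or archiving the file: any change to the byte stream defeats a plain checksum, which is why a robust system needs a content-aware hash.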
Tux2000
Re:The big problem (Score:5, Interesting)
Eivind.
Computer-generated images will win out (Score:5, Interesting)
So, yeah, go ahead and build your database. By the time it is up and running, it will be obsolete, and we'll be discussing other problems.
Re:Corporate and Government Censorship (Score:4, Interesting)
There is no such thing as innocent until proven guilty. Never was. It was something we were meant to aspire to, because as humans we do not honestly believe it.
You disagree with me, you're guilty! You don't like my politician, you're the enemy... You don't support the war, you're a commie!
You don't join the party, so you're anti-American.
It's us vs. them. I'm right, you're wrong. You can't possibly be right, because I am right. You are guilty because I say so.
What the hell for? (Score:3, Interesting)
Is there some question as to the definition of "child porn" or some type of miscommunication that prevents someone from looking it up in the dictionary? Because if the people enforcing these policies can't identify child porn without looking at 100 other child porn images first, then we have one hell of a problem on our hands.
Stockpiling these images isn't going to do anything at all. If they wanted to create some type of program that could identify porn, they could do it with the millions of legal (most of which are free) images on the web.
Re:Everything about this seems... (Score:1, Interesting)
We can just laugh off being tagged as a potential terrorist and tell it as a funny story to our friends and work colleagues. Would you do the same thing if you'd been investigated by the police as a potential paedophile? I could see it happening quite easily - send a photo of your kids in the bath to their grandma, AOL's system tags it, police come knocking at your door and take your computer and all your archives away. You get the computer back a week later with an apology from the police. But the damage is done; your neighbours and work colleagues have found out why the police visited... It's a nightmare scenario, but I'm afraid it's going to happen. And perhaps more innocent people will be investigated than real paedophiles caught, as is the case with "the war on terror".
With this specific system, that can't happen - they're basing it on hashes of known images, whereas your picture to grandma would be a unique image that isn't in their database (barring collisions, but that's a totally different issue).
That would only happen if ISPs were to begin doing image recognition on, or manually looking at, all images sent through email... the first is probably technically possible but doubtfully legal, and the second is physically impossible.
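A sketch of what such hash-only matching presumably looks like on the ISP side (the blacklist contents and the helper function are hypothetical, not from the article):

```python
import hashlib

# Hypothetical blacklist: hashes of *known* images only.
# The byte strings here are invented placeholders.
known_hashes = {
    hashlib.sha1(b"known-image-1").hexdigest(),
    hashlib.sha1(b"known-image-2").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Flag only exact byte-for-byte matches against the known set."""
    return hashlib.sha1(image_bytes).hexdigest() in known_hashes

print(is_flagged(b"known-image-1"))              # True: exact match
print(is_flagged(b"photo of kids for grandma"))  # False: unique image, not in the set
```

The point being: a never-before-seen photo produces a hash that simply isn't in the set, so it cannot match, collisions aside.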
Re:Wanna bet? (Score:5, Interesting)
Compromised? Not in the way you mean.
Unfortunately, I have some experience of this from about 10 years ago. While I was working for a large corporation as a sysadmin I came across a stash of this stuff. To cut a long story short it went from that to helping the police gather evidence against three individuals and from there to helping them to crack a much larger ring of paedophiles.*
A normal adult wants to love and protect kids. I can tell you these people (I use the term advisedly) are *really* not normal and some of the images made me physically sick - literally. We are not talking about kids in the nude - you don't want to know. There is NO way a NORMAL adult will be compromised... really! What that police officer was probably feeling was... nothing. You have to be like that to be able to take it at all and even then it does damage. It's so bad that you *must* stop after a couple of years.
"One wonders why cops are allowed to work on this on their own, seems to me it would make much more sense to allow people access to the material only in teams, perhaps mixed-gender."
Well, you are in a team. Part of the reason of trawling through the material on your own is logistics (manpower, etc) and the other is; why expose people to more than necessary? And, as I mention above, the dangers aren't that you'll turn into a paedophile yourself.
* Yes we got them - it was on the front page of the papers - especially the bit about most of them getting 15 months. We spent two years taking them down. Go figure!
Re:This can be a problem (Score:4, Interesting)
Note that I'd never even heard of the site until now; I'm just curious as you are clearly so worked up about it.
Re:What is child porn? (Score:3, Interesting)
> How do they categorise what is collected in their database as child porn? I have yet to see an automated system that can look at a photo and describe what it is (although several have been promoted over the years), so I imagine the decision as to what category the pics fall under must be made by a human. So my question is: whose standard do they apply in the process?
Indeed. And it gets even murkier when one considers famous images such as this [globalsecurity.org] (SFW).
The article indicates that hashes of the images will be kept, not the images themselves. This is dangerous, since there's no accountability for what image "TSxnWMoHpb9QY" is. Without reversibility there's no way to clean the database if it gets subverted.
Re:This can be a problem (Score:5, Interesting)
Many pedophiles were themselves sexually abused as children, and it has affected them for life. Many are filled with self-loathing. Some have never once abused a child. Yet unlike violent murderers, drug abusers, "adult" rapists, thieves, psychotics, necrophiliacs, and even zoophiles, these people will never be able to get help, even if they wanted to. They are the modern Untermenschen, who are either expected to commit a crime so they can be summarily incarcerated, or to quietly commit suicide.
In either event, their flaws will sell newspapers.
Re:wont work (Score:5, Interesting)
I've randomly seen ("mild") child porn a couple of times, and I'll admit it turned me on. However, I'm smart enough that I still don't intentionally look it up, nor do I collect it, for both ethical and pragmatic reasons. Those who do look it up aren't smart enough to see and follow those pragmatic reasons.
Eivind.
Is this really a problem? (Score:3, Interesting)
What is the percentage of clearly illegally created porn as opposed to legally created porn?
Does this justify these measures? Does this reduce the incidents of actual abuse?
My thinking is that there is not all that much actual child abuse going on, and that much of the 'illegal' porn floating about the internet is multiple copies from the few actual abuses, or is legal porn masquerading as 'illegal' porn. I also don't believe the problem is so widespread that I need to relinquish any more of my privacy or rights than the ones already stolen from me by the federal government's 'war on terror'. Nor do I think this will in any way lessen the incidence of child abuse, and that is what we as a society need to stop. I'm all for stopping child abuse, and I don't mind paying to stop it. However, I *do* mind losing rights, and I do mind paying for ridiculous, ineffective boondoggles. And it seems lately that the government, when faced with any 'problem', can *only* come up with ridiculous, ineffective boondoggles.
This will be about as effective as stopping the consumption of cocaine in the United States by dumping millions of tons of roundup in South America.
Or about as effective as stopping terrorism by killing 50,000 Iraqi civilians *and* reading all of my email and listening to all of my phone calls.
Re:Devil's Advocate (Score:3, Interesting)
Re:The big problem (Score:2, Interesting)
No jury would buy the 'it was an accident' story either. This is one of those things where, if you're accused, it doesn't matter what the truth really is.
All because the police are too lazy to find the people actually creating the stuff in the first place.
Re:The big problem (Score:1, Interesting)
Consider yourself lucky. I used newsrobot to download a ton of stuff from alt.binaries.erotica.* and then later used file sharing networks (such as Kazaa). I would usually mass-download a ton of files and sort it out later. Every once in a while I'd get some child porn which I really didn't want to see. Child pornography is really disturbing. I could describe what I saw, but you really don't want to hear about it.
What sucks even more is if I didn't go through those downloaded files and one of them remained on my computer. It's completely possible for someone to have a child porn file on their computer without even knowing it, because those sick fucks are putting it out there and labelling the files as something more innocent.
Re:The big problem (Score:5, Interesting)
I think you've been spending too much time on Slashdot.
I've been using the interweb since 1998 when I was 13, and I have been exposed to child pornography since day one. I remember logging in to Microsoft Chat (which was bundled with Windows) and all the rooms were devoted to kid porn... I also remember the channel listings on DALnet just being filled with stuff like, "!!!!!!!!!!!!11LolIta-_OMG-filesrvr" although these channels tended to be pure smoke.
On a more interesting point, a few years ago, I was paid to go through a list of about 10,000 randomly selected international websites and categorize them by hand for a search engine. For every thousand or so, I would see at least a couple child pornography sites.
Re:The big problem (Score:1, Interesting)
Re:So this is like... (Score:4, Interesting)
Then again I wonder how this will affect other cultures. Does a culture where females marry at 14 perceive nude images of a person of the same age to be child porn? I'd never thought about that before. I recall an incident where some local photo developing shop called the cops on a foreign couple because they had images developed of the woman and her child nude in the tub and on a bed. Of course SRS freaked out. In reality no harm was done (except to the family). This was a common thing in their culture (and most others I would think). It makes you wonder.
Re:Official stance (Score:3, Interesting)
In fact, I am very cautious about people claiming to fight child pornography, because they tend to claim a lot of power for a good cause; but if they abuse that power, they really could end up as Big Brother.
Re:Official stance (Score:2, Interesting)
Typical Slashdot (Score:5, Interesting)
NCMEC will undoubtedly be supplying a hash database to ISPs - probably MD5 or SHA-1, as these are in common use today. This would enable matching of identical files quickly and easily.
Unfortunately, we are already running into the limits of simple MD5 matching with child porn cases today. You resize the picture or brighten it up a little bit and that changes the MD5 value and your database, library or whatever is then useless. You have a new, original picture with a new original hash value. There are other ways to accomplish this which do not suffer from these limitations without giving up high-speed autonomous comparisons. Check out http://www.infinadyne.com/icatch.html [infinadyne.com] for some ideas.
Yes, I work at the company that is producing this product.
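The resize/brighten limitation described above can be sketched with a toy "average hash" (this only illustrates the general perceptual-hashing idea, not iCatch's actual algorithm; the pixel values below are invented):

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual 'average hash': one bit per pixel, set if the pixel
    is above the image's mean brightness. Real systems work on resized,
    transformed images; this is only a sketch of the idea."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# An invented 4x4 grayscale image, flattened to 16 values.
img = [10, 200, 30, 180, 90, 120, 60, 250, 15, 140, 70, 210, 40, 160, 20, 230]
# Brighten every pixel a little, as a trafficker might.
brightened = [min(255, p + 20) for p in img]

# The cryptographic hash changes completely...
print(hashlib.md5(bytes(img)).hexdigest() ==
      hashlib.md5(bytes(brightened)).hexdigest())   # False
# ...but the mean shifts along with the pixels, so the
# perceptual bit pattern stays the same for this image.
print(average_hash(img) == average_hash(brightened))  # True
```

That stability under small edits is exactly what plain MD5/SHA-1 matching lacks.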
Re:So this is like... (Score:3, Interesting)
I'm pretty sure that the parts of the picture that make it pornographic (view of the child's genitals or view of child in contact with adult genitals, for example) will make it unsuitable for public view.
There was a case a while back where the entire child was grayed out of the picture, leaving the furniture, bedspread, etc. visible, which allowed wide dissemination of the picture and subsequent identification of the room as being in a particular motel, which led to a review of the motel's guest register and, if I recall correctly, an eventual arrest and recovery of the child.
As for de-criminalizing the dissemination (perhaps an unfortunate word choice under the circumstances) of the images, that would still be a violation of the child's privacy rights.
Re:So this is like... (Score:1, Interesting)
In my opinion, at least selling should be illegal, and for that matter trading (as in the old mailbox days: give me n, then you can have m of mine), since both provide an incentive for producing more. One could argue that paying for child porn should be illegal for the same reason.
Re:Hashing? (Score:3, Interesting)
Also, I believe the law (in the US) is something terrible like "intended to elicit a sexual response", so even a 12-year-old posing seductively in a swimsuit would be deemed child porn. Probably a 'lesser' child porn, but still...
I'm not 100% sure about that, and if ISPs are going to start filtering things, I'd prefer to be wrong. You know some nutjob is turned on by the sight of a kid's feet. Are we then going to have to filter pictures of underage feet? Or of feet that look as if they may be a child's?
Re:Everything about this seems... (Score:4, Interesting)
Having seen numerous advertisements for children's skincare products on primetime television, no, I'm afraid I don't see the inherent problem. Or have unclothed infants become somehow taboo? Then again, I don't read tabloids, so I imagine I'm rather behind on the latest child-hysteria trends.
Re:Yeah. (Score:3, Interesting)
The hypothetical situation stated is "we suspect a teacher has done X, what do we do." The answer should not be "allow countrywide surveillance on internet traffic." In fact, we already have a convenient (albeit less sure) method of finding out if he/she's guilty:
1) Get warrant
2) Install tap on teacher's PC (not on the ISP so it can be abused)
3) Profit (or not, depending on whether the crime was really committed). Be sure to ignore all other evidence of crimes not specified in the warrant, if discovered.
Re:Official stance (Score:1, Interesting)
Having not been raped, I can't imagine what it'd be like to have people distributing copies of my rape to other people. However, I do realize that simply redistributing copies of a non-child rape isn't illegal (except possibly under copyright, though whether the victim holds the copyright by virtue of being on camera depends on the jurisdiction). Further, while there are people with a mental disorder that makes them pedophiles, there are also people with a mental disorder that makes them rapists (i.e., in both instances it's not an act regarded as something a mentally normal person would engage in). Having stated that, it would be quite useful to draw parallels between the two groups.
Why, you ask? Because, again AFAIK, it's not illegal to redistribute non-child rape videos. First, it would be helpful to figure out why this is the case. Now, seeing how many non-child rapes occur and how many are recorded, it'd seem rather probable that there would be many copies of said videos circulating on the internet. Are there? Again, AFAIK, there aren't. Why? I would guess it's because, for those interested in "rape fantasy", there exist videos that depict what is desired in much more detail than a real rape does.
And not surprisingly, there are many people who are up in arms about such videos. Yet there hasn't been much of a flurry of activity, because in the end it's consenting adults who commit the acts together, a production crew that consensually records them, and consenting adults who watch. In all, the parts are entirely legal. So while it is possible that all of these things together actually contribute to more rapes, there isn't very strong legal footing to make it illegal.
Compare this to child pornography, where some laws are written to try to include the mere *likeness* of youth in anything *remotely* sexual, and it becomes clear that the laws as written classify "fantasy" as equal to "reality", precisely because while they can't ban the production of "rape fantasy", people are more than willing to repeatedly back unconstitutional anti-"pedo fantasy" laws. In short, the people so interested in screening out the "bad thoughts" have made it such that it's all underground, and since it's all underground, it might as well be the real thing.
So, the real answer to your conundrum is to encourage the legal production of "pedo fantasy" so that not only will fewer children be directly harmed (since commercial enterprises end up making the real thing now) but fewer children will be indirectly harmed (their image would be less likely to be circulated, meaning less emotional trauma). As an added bonus, many people who are now closet pedophiles might be able to be a little more open and honest, since they would have a legal outlet and be more able to point out when they discover the abuse of a child. And the laws would no longer be motivated to go after people for thought crimes but for actual criminal acts (since the only people with real child porn would be actual abusers or their accomplices).
But of course, the argument could still be made that more fantasy porn could lead to more abuse. And the only thing I can say is, if that theory is true, then we should work to dissolve copyright, because if anything, copyright has been the basis for humanity's largest distribution of violent and amoral sexual fantasy; surely the support of it is then the support of violence and amoral sexual acts.
Re:Computer-generated images will win out (Score:2, Interesting)
1: It is nearly impossible to argue in defense of the material without looking like a pedophile supporter
2: It makes something harmless into another crime for which you can arrest people randomly.
Thomas Bodström (the Swedish minister of justice who was behind the Pirate Bay raid fiasco) recently proposed a law here in Sweden that would allow the police to spy on people's surfing habits and install trojans on their computers. If anything criminal was found, it would be valid evidence against the target in question. Combine this with the above law and you could essentially jail anyone with a reasonably sized hentai collection at will. If they don't have hentai, then they have mp3s, or something else. Bestiality, necrophilia. Perhaps some snuff. Then again, cartoon child porn is worse than snuff, yeah right.