ISPs to Create Database to Combat Child Porn 595
BlueCup writes to tell us that several media companies are banding together to create a database of child pornography images to help law enforcement officials combat distribution of questionable material. In addition to the database, several tools and new technologies are also planned, but most notable is what some perceive as a willingness to cooperate, which critics say has been lacking in the past. From the article: "Each company will set its own procedures on how it uses the database, but executives say the partnership will let companies exchange their best ideas — ultimately developing tools for preventing child-porn distribution instead of simply catching violations."
Yeah. (Score:5, Insightful)
ISPs have long said that they are just carriers and are not responsible for the content they provide access to. As soon as the technological solution for implementing a "content filter" is there, RIAA and friends will _require_ ISPs to use it for that purpose as well.
This is completely ignoring the technical stupidity of trying to "fingerprint" media that is _not_ going to be transferred in plaintext.
Re:Yeah. (Score:5, Insightful)
> _not_ going to be transferred in plaintext.
And even if it is, it's trivial to come up with a way of altering images so that they look identical but where every bit is different to the original.
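A minimal sketch of how trivial that is, assuming the image is handled as raw grayscale pixel bytes (no real file format): flip the least-significant bit of every pixel. The result is visually indistinguishable, but every byte, and therefore any bit-exact fingerprint, differs.

```python
import hashlib

def flip_lsbs(pixels: bytes) -> bytes:
    """Flip the least-significant bit of every pixel value.
    A change of +/-1 in a 0-255 brightness value is invisible."""
    return bytes(p ^ 1 for p in pixels)

original = bytes(range(256)) * 4   # stand-in for raw pixel data
altered = flip_lsbs(original)

# Every single byte now differs from the original...
assert all(a != b for a, b in zip(original, altered))
# ...so any exact-match fingerprint (e.g. MD5) no longer matches.
assert hashlib.md5(original).digest() != hashlib.md5(altered).digest()
```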
I'm sure the Chinese government would literally kill to have a way of tracking the movement of files too.
But yeah..kids...photographs...the internet...
Re:Yeah. (Score:3, Funny)
Re:Yeah. (Score:5, Funny)
I nearly spat my tea out all over my keyboard ... I read that as "tissues".
Re:Yeah. (Score:5, Funny)
Cute chica at bar: "So, what do you do for a living?"
DBA: "I am the DBA for the largest single collection of child pornography on the planet. You?"
Re:Yeah. (Score:5, Funny)
Re:Yeah. (Score:5, Insightful)
Besides, I'd rather have someone like a teacher arrested because child porn was found on his PC than have him just work there for years, with nobody knowing because the ISP blocked the traffic.
Re:Yeah. (Score:5, Insightful)
See, that's the problem -- "rather 100 innocent jailed than one guilty man go free." It's supposed to be the other way around.
Re:Yeah. (Score:3, Insightful)
This kind of pointless action doesn't help anyone except those who hunger for power. The people who try to objectivel
So this is like... (Score:5, Insightful)
Re:So this is like... (Score:5, Funny)
If we make sure The Good Guys (read: us) have 99 times as much nukular weapons as The Bad Guys (read: them), then only 1% of all nukular weapons will be in the hands of the Bad Guys. Now if we continue to increase the nukular stockpile so we have 999 times as much as The Bad Guys then only 0.1%
So if 'we' have an infinite amount of nukular weapons, the Bad Guys virtually have none at all!
Re:So this is like... (Score:5, Insightful)
Yes, but the bad guys will still have nukes. Making statistics that say "they only have 0.1% the number of nukes we have" doesn't fix that.
.. and *WOOOSH* goes the sound of the joke.. :-)
Re:So this is like... (Score:5, Insightful)
It's like stopping the proliferation of nuclear weapons by creating a stockpile of blueprints describing what various nuclear weapons look like so they can be detected more easily.
Re:So this is like... (Score:5, Insightful)
If you only store a small piece of information per image, the number of false positives will make the whole thing useless. Store too much, and you're storing the image.
Using SSL etc. will make it impossible.
The analogy with nuclear weapons would be similar, change the box, add a few decoy parts, paint the others a different color and the original "plans" or pictures are worthless, the machine won't detect squat. A human expert probably would.
I think this is probably all B.S., i.e. it's someone's idea of how they will make a lot of money in consulting and software development. All the ISPs will buy in so they can say they are doing something, even though they know it is B.S.
This is really a sociological problem, which is hard to fix, and this just makes it sound like everyone's doing something. They don't have the answer. If the PCs of 100 people are confiscated and their personal lives invaded for every one person caught, this is a vast injustice.
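The small-fingerprint tradeoff above is easy to quantify. A hedged sketch (the numbers are illustrative, not from TFA): truncate MD5 to n bits, and the pigeonhole principle guarantees false positives as soon as you index more than 2**n files.

```python
import hashlib

def tiny_fingerprint(data: bytes, bits: int = 8) -> int:
    """Keep only the first `bits` bits of an MD5 digest."""
    digest = int.from_bytes(hashlib.md5(data).digest(), "big")
    return digest >> (128 - bits)

# 1000 distinct "files" squeezed into 2**8 = 256 possible fingerprints:
# collisions (i.e. false positives) are guaranteed by the pigeonhole principle.
files = [("file%d" % i).encode() for i in range(1000)]
distinct = {tiny_fingerprint(f) for f in files}
assert len(distinct) <= 256          # only 256 buckets exist
assert len(distinct) < len(files)    # so some unrelated files must collide
```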
Re:So this is like... (Score:5, Insightful)
This is really a sociological problem which is hard to fix
Rubbish. It's (fairly) easy to fix. The trouble is that it's been demonised so much that it's turned into a "thoughtcrime".
Here's an idea. Remove all laws against copying, selling and downloading child porn, but keep the laws against things that actually involve the children - like statutory rape, child abuse, etc. This makes it more likely that police will be able to find images of kids being abused, partially because the black market won't be so hidden and partially because it's more likely that the illegal stuff will be photographed. If the police have images of abuse, they can crop out everything but the kid's face and stick it on a milk carton with "do you know this kid"-style messages, thus actually tracking down the kids that are being abused and stopping the real crime, not the symptom.
Unfortunately, this tactic would involve scaling back the paranoia and hatred and making a distinction between people who actually abuse children and people who are attracted to underage people. That's not a distinction society is willing to make, in my opinion, we collectively seem to like having people that we can point unreserved hatred at.
Re:So this is like... (Score:3, Interesting)
I'm pretty sure that the parts of the picture that make it pornographic (view of the child's genitals or view of child in contact with adult genitals, for example) will make it unsuitable for public view.
There was a case a while back where the entire child was grayed out of the picture, leaving the furniture, bedspread, etc. visible, which allowed wide dissemination of the picture and subseq
Re:So this is like... (Score:3, Insightful)
Re:So this is like... (Score:5, Insightful)
Re:So this is like... (Score:5, Insightful)
> You need a fingerprint, not a copy of the act.
That might be correct for examination of files. However, we're talking about ISPs here. It is not very far fetched that an ISP would try to match TCP/IP packets. That would require a fingerprint of a part of the image (impossible to produce without the original image).
My point is that an "ad hoc" database won't be useful without the original images. Sooner or later a user will come up with a new (incompatible) usage mode. Without the original images, the database can not support it.
The statement "Each company will set its own procedures on how it uses the database" just asks for it.
Re:So this is like... (Score:4, Interesting)
Then again I wonder how this will affect other cultures. Does a culture where females marry at 14 perceive nude images of a person of the same age to be child porn? I'd never thought about that before. I recall an incident where some local photo developing shop called the cops on a foreign couple because they had images developed of the woman and her child nude in the tub and on a bed. Of course SRS freaked out. In reality no harm was done (except to the family). This was a common thing in their culture (and most others I would think). It makes you wonder.
Hashing? (Score:4, Interesting)
Re:Hashing? (Score:5, Insightful)
There's a lot of gray area, and a huge list of hashes isn't going to be very descriptive. While we're at it, they're just flagging files transferred.. What if someone sets up a relayer in a country where it's legal and uses it to send kiddieporn to you via email? Click a message, commit a crime and go to jail. Or if someone defaces a site and puts up CP, or if someone just ups random CP to a public site (4chan), or any number of other ways.
Going after real pedophiles hurting real people would be great, but this isn't going to help and passing this kind of tech off as "for the children" is downright offensive.
Re:Hashing? (Score:5, Insightful)
This is what worries me about the "it's illegal to view $foo" laws - it's entirely possible that you don't know you're about to view $foo until it's too late and you've broken the law. Is there a need to go after people who have simply downloaded something dodgy since they may not have intentionally done so? Better to concentrate on people who are *paying* for content since by paying they are financially supporting the continuation of the crime (the people who haven't paid are not supporting the real criminals).
Re:Hashing? (Score:5, Informative)
Then again, our filters are made mostly to protect the innocent from being subjected to CP by accident (and yes, it'll stop a few from ever getting into the stuff), not so much prevent someone who really wants it from getting it - they'll always find a way...
Re:Hashing? (Score:3, Interesting)
Also, I believe the law (in the US) is something terrible like "intended to elicit a sexual response", so even a 12 year old posing seductively in a swimsuit would be deemed child porn. Probably a 'lesser' child porn, but still...
I'm not 100% sure about that, and if ISPs are going to start filtering things, I'd prefer to be wrong
Re:Hashing? (Score:3, Informative)
I set up my own "porn" server one time by using MD5 hashes. I used a program called suck to, err, suck down all new pics from certain alt.binaries groups, stored the md5s in a MySQL database, and if the md5 existed, I just skipped the duplicate.
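The parent's setup reduces to a few lines; here a plain dict stands in for the MySQL table (the suck/newsgroup plumbing is omitted, and the filenames are made up):

```python
import hashlib

seen = {}   # md5 hex digest -> filename; stands in for the MySQL table

def ingest(filename: str, data: bytes) -> bool:
    """Store a file only if its MD5 hasn't been seen before."""
    digest = hashlib.md5(data).hexdigest()
    if digest in seen:
        return False              # exact duplicate: skip it
    seen[digest] = filename
    return True

assert ingest("pic1.jpg", b"first image") is True
assert ingest("copy-of-pic1.jpg", b"first image") is False  # same bytes, same MD5
assert ingest("pic2.jpg", b"second image") is True
```

Note this only catches byte-identical copies; as others in the thread point out, changing a single bit defeats it.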
Re:Hashing? (Score:5, Insightful)
The big problem (Score:5, Insightful)
However, I don't agree with this database. Keeping these images, even for law enforcement purposes, is a violation of the privacy of children who have already been subjected to a horrific violation. Leave them alone already.
Re:The big problem (Score:5, Interesting)
Use a signature generation method like http://vision.unige.ch/publications/postscript/98
Re:The big problem (Score:5, Insightful)
Still, I'm scared of how much 'for the children' there is today. It's become the clarion call of those who want to take our rights away.
I mean, think about what else this can be used for, and you know it will be used for other things. Looking for copyrighted media, anyone?
Re:The big problem (Score:5, Interesting)
Eivind.
Re:The big problem (Score:5, Insightful)
Bullshit. In the 10 years I've been using the Internet, I've come across child porn one (1) time, and even that looked more like two kids playing doctor than any pedophilic photo setup. If that's the "darkest side of the Internet", then the Net's brighter than the surface of the Sun.
No, what's happening here is simply another censorship / surveillance system being built with the mantra "think of the children". And the makers do think of the children - they think of those children in the future, all grown up and in chains and get a hardon from that.
So no, all the Net's users should not be on guard for the infinitesimally small chance that they happen upon CP by accident, anymore than all the people in Real Life should be on guard for the infinitesimally small chance that the guy passing you on the street happens to be a terrorist. Yeah, it's possible, but even if it happened, what the heck are you going to do - you sick pervert looked at the picture, so by law you should go to prison, since such pictures incite people to such acts, so you can't now be trusted anymore, right ? And what were you doing on a netsite where pedophiles hang out at, anyway ? You must be one too !
Every time I hear "think of the children", I think of the future of those children and want to cry. Well, actually I want to protect those children by beating the living crap out of whoever it is trying to enslave them this time, but crying is more socially accepted.
Do you honestly think that those who are building this censorship & surveillance system are doing it for the children's sake? No, it is something that will be used to put those children into chains once they grow up.
Don't be fooled by their lies; these people care nothing for the children, or anyone else for that matter; they only care about power.
Re:The big problem (Score:4, Insightful)
It's a pity that I already used my mod points because I agree 100% with you.
I have been using the Internet for 20 years. Before the web was invented, I saw hardcore porn pictures floating around in the alt.* newsgroups and on some ftp servers, including on a server that I was administering (the unprotected incoming directory was used by some porn traders until I discovered it and deleted the whole stuff - no, I did not keep a copy). Some of it was rather nasty: zoophilia, BDSM, deep fisting, lots of fetish stuff and so on...
Later, when the web was invented and started to grow, I started seeing porn popping up on many web sites. Although the number of porn sites has been growing steadily, I would say that the amount of porn that you can be exposed to by accident is not larger than 10 or 20 years ago. The amount of porn that you can find if you are actively looking for it may be a bit bigger, but not much (taking into account all sources of porn that existed then and that exist now: magazines, tapes and now the web).
But during all that time, I did not see a single child porn picture (save for some censored pictures illustrating articles about how to fight against child porn). Of course I'm not actively searching for that because I find the idea disgusting. But I am convinced that those who make so much publicity around the fight against child porn are overstating the problem and (most likely) have a hidden agenda that I cannot agree with.
Re:The big problem (Score:5, Interesting)
I think you've been spending too much time on Slashdot.
I've been using the interweb since 1998 when I was 13, and I have been exposed to child pornography since day one. I remember logging in to Microsoft Chat (which was bundled with Windows) and all the rooms were devoted to kid porn... I also remember the channel listings on DALnet just being filled with stuff like, "!!!!!!!!!!!!11LolIta-_OMG-filesrvr" although these channels tended to be pure smoke.
On a more interesting point, a few years ago, I was paid to go through a list of about 10,000 randomly selected international websites and categorize them by hand for a search engine. For every thousand or so, I would see at least a couple child pornography sites.
Re:The big problem (Score:5, Insightful)
Re:The big problem (Score:3, Funny)
Devil's Advocate (Score:5, Insightful)
~Rebecca
Re:Devil's Advocate (Score:4, Informative)
"create a unique mathematical signature for each one based on a common formula"
"If child porn is detected, AOL would refer the case to the missing-children's center for further investigation, as service providers are required to do under federal law."
Kinda covers most of your post, no?
Re:Devil's Advocate (Score:3, Interesting)
This can be a problem (Score:5, Interesting)
Recently, a popular imageboard at http://img.4chan.org/b/imgboard.html [4chan.org] has been added to that list for reasons unknown. Several UK ISPs, including BT Internet and NTL, have blocked that URL. Complaints to either the ISPs or the IWF from both the users and the site admin have gone unanswered. I am personally quite annoyed by this as I'm a regular user of that board.
It's this sort of unaccountable censorship of the Internet that makes me suspicious of such 'helpful' databases.
Re:This can be a problem (Score:3, Insightful)
Re:This can be a problem (Score:3, Insightful)
So actually, you are not for the children, but against child porn (even fake) consumers. Interesting..
Re:This can be a problem (Score:5, Interesting)
Many pedophiles were themselves sexually abused as children, and it has affected them for life. Many are filled with self-loathing. Some have never once abused a child. Yet unlike violent murderers, drug abusers, "adult" rapists, thieves, psychotics, necrophiliacs and even zoophiles, these people will never be able to get help, even if they wanted to. They are the modern untermensch, who are either expected to commit a crime so they can be summarily incarcerated or quietly commit suicide.
In either event, their flaws will sell newspapers.
Re:This can be a problem (Score:3, Insightful)
Re:This can be a problem (Score:5, Insightful)
This is a commonly held belief. I wonder, though, why it only applies to sexual fantasy (again, FANTASY, not real) about children? Look for instance at the NY Times list of best-selling books. Currently the top 5 are:
Here this "whetting" argument is often ridiculed when Jack Thompson comes out with another vilification of video games.
Children know that cartoons are not real. They don't think they can fall off cliffs and survive like Wile E Coyote. People can indulge themselves in all kinds of horrible fantasies, and then close the book and live in the real world.
Re:This can be a problem (Score:3, Insightful)
Yes, REAL child porn does necessarily involve abuse in its manufacture.
I will argue that quasi child porn is similar enough to real child porn that there is no substantial difference between a child porn consumer and a quasi child porn consumer.
That's not an "argument", it's just expressing distaste.
I'll again pose the challenge, what is the difference between real child porn, photo-realistic child porn with a model, and photo realistic child porn
Re:This can be a problem (Score:5, Insightful)
So because some asshole posts offensive images, he gets the whole site banned? Once that policy becomes established, think how easy it would be for any determined person to get just about any site blacklisted. Just post some kiddie porn every day for a week, reporting the site immediately, before it can be removed.
Re:This can be a problem (Score:4, Interesting)
Note that I'd never even heard of the site until now; I'm just curious as you are clearly so worked up about it.
wont work (Score:5, Insightful)
zip, rar, and other compression formats
encrypted
hidden inside other files (steganography)
the list goes on...
these people should learn: you can't fight the internet
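The first item on that list is easy to verify; a quick sketch with zlib standing in for zip/rar (the payload bytes are an arbitrary stand-in):

```python
import hashlib
import zlib

payload = b"stand-in for any flagged file"
wrapped = zlib.compress(payload)          # same effect as zipping or rar'ing it

# The carrier no longer matches any exact fingerprint of the payload...
assert hashlib.md5(wrapped).digest() != hashlib.md5(payload).digest()
# ...yet the recipient recovers the original bytes exactly.
assert zlib.decompress(wrapped) == payload
```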
Re:wont work (Score:5, Insightful)
I'm sick of this mentality that criminals (esp terrorists) are not as smart as you or I. They know just as well as we do they can throw it in a zip or rar file (It's probably a better way for them to transfer the files, anyway!). In fact, IF THEY AREN'T SMART THEY GO TO JAIL. I think that's a pretty strong motivation for covering their ass.
Re:wont work (Score:5, Interesting)
I've randomly seen ("mild") child porn a couple of times, and I'll admit it turned me on. However, I'm smart enough that I still don't intentionally look it up, nor do I collect it, both for ethical and pragmatic reasons. Those that do look it up aren't smart enough to see and follow those pragmatic reasons.
Eivind.
Re:wont work (Score:4, Insightful)
2. Like any male adult with a sex drive who isn't a lying sack of shit, you admitted that sometimes individuals that haven't quite reached the age of consent turn you on. I applaud you for your integrity, but think about what you said right afterward: these pragmatic reasons you talk about amount to the laws being so screwed up that you're afraid to do what you want with your own computer in the privacy of your own home. And unless you believe law = ethics, the ethical argument falls apart when you realize there are perfectly civilized, modern, and inhabitable countries where the age of consent falls anywhere between 15 and 18. The US is an anomaly in treating every individual under 18 as a child (except for purposes of administering the death penalty, of course).
Re:wont work (Score:3, Funny)
Everything about this seems... (Score:5, Insightful)
AOL, for instance, plans to check e-mail attachments that are already being scanned for viruses. If child porn is detected, AOL would refer the case to the missing-children's center for further investigation, as service providers are required to do under federal law.
Sounds like one of those 'good on paper' ideas that later spins itself into a slavering monster that eats half the internet. What's to say they don't start scanning for other things? Is the RIAA going to be knocking on my door because I sent an AOL member a Metallica MP3?
Re:Everything about this seems... (Score:5, Insightful)
How many people have been seriously inconvenienced when trying to take a flight because the system has flagged them as a potential terrorist? A lot more innocent people have been inconvenienced than terrorists have been caught. Now, imagine the same situation but applied to this...
We can just laugh off being tagged as a potential terrorist and tell it as a funny story to our friends and work colleagues. Would you do the same thing if you'd been investigated by the police as a potential paedophile? I could see it happening quite easily - send a photo of your kids in the bath to their grandma, AOL's system tags it, police come knocking at your door and take your computer and all your archives away. You get the computer back a week later with an apology from the police. But the damage is done; your neighbours and work colleagues have found out why the police visited... It's a nightmare scenario but I'm afraid it's going to happen. And perhaps more innocent people are going to be investigated than real paedophiles caught, as is the case with "the war on terror".
Re:Everything about this seems... (Score:4, Informative)
Why imagine? People have already been condemned [blogspot.com] for taking pictures of their kids at bathtime. [bbc.co.uk]
Bathtime has become a taboo activity, best undergone alone, one child at a time, and if a supervisor must be present, only the child's mother is allowed. Possibly an aunt, but that's pushing it. No fathers allowed. Eyes only. IR goggles preferred.
God Bless The News Of The World.
Re:Everything about this seems... (Score:4, Interesting)
Having seen numerous advertisements for children's skincare products on primetime television, no, I'm afraid I don't see the inherent problem. Or have unclothed infants become somehow taboo? Then again, I don't read tabloids, so I imagine I'm rather behind on the latest child-hysteria trends.
Re:Everything about this seems... (Score:5, Insightful)
That was just one example of how an innocent person might be flagged; there are many others I can think of. For instance, we all know people who have very insecure Windows machines. Say they get infected by a worm that then emails kiddie porn. The same scenario applies... Visit from police, computers taken away, the shy funny-looking guy in the office who everyone thinks is a bit weird commits suicide because everyone thinks he must be a paedophile since he was investigated by the police...
privacy issues... (Score:5, Insightful)
so they will be scanning our web traffic in real-time to determine if we are sharing child porn?
anyone else see this and think something along the lines of "this is just a 'think of the children' excuse to implement advanced monitoring systems, which in due time the govt. will take over 'in the public interest'"?
sets a bad precedent (Score:5, Interesting)
These online companies were previously protecting themselves from liability for their customers' transmissions by claiming that filtering this data would be an expensive and prohibitive task. By volunteering this service, they've crossed that line. It should be possible for the music companies, MPAA, etc. to demand filtering as well.
It's a pretty stupid plan nonetheless. These digital fingerprints will only catch casual or newbie child porn traffickers. Encryption will easily render these fingerprints useless. The worrisome side effect is the false positives that will be triggered by this fingerprinting technique. As an example, try using one of those packages that try to tag your mp3s by fingerprinting [musicbrainz.org]... Pretty unreliable stuff.
Seth
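For what it's worth, "fingerprinting" in this context usually means a *perceptual* hash rather than an exact one. A toy "average hash" over a 2x2 grayscale image, just to show the idea (this is my illustration, not the scheme from TFA): each bit records whether a pixel is brighter than the mean, so small edits leave the fingerprint unchanged, and, by the same token, unrelated images can collide.

```python
def average_hash(gray):
    """Toy perceptual hash: one bit per pixel, set if the pixel
    is brighter than the image's mean brightness."""
    flat = [p for row in gray for p in row]
    mean = sum(flat) / len(flat)
    return sum((p > mean) << i for i, p in enumerate(flat))

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

img     = [[10, 200], [220, 30]]
tweaked = [[12, 198], [221, 29]]   # slightly edited copy
other   = [[200, 10], [30, 220]]   # genuinely different image

assert hamming(average_hash(img), average_hash(tweaked)) == 0  # edit still matches
assert hamming(average_hash(img), average_hash(other)) > 0     # but not everything does
```

The fuzziness is exactly what makes it robust to re-encoding and exactly what produces the false positives the parent worries about.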
What is child porn? (Score:5, Interesting)
How do they categorise what is collected in their database as child porn? I have yet to see an automated system that can look at a photo and describe what it is (although several have been promoted over the years), so I imagine that the decision as to what category the pics fall under must be made by a human. So my question is: whose standard do they apply for the process?
I can see that this process could be very arbitrary. So while I am not advocating child porn, I can also see that the data collection process could get very messy and have lots of false positives and negatives. And, like the TSA's no-fly list, it could be very hard to get off once you are on it.
Oh shit
Re:What is child porn? (Score:3, Interesting)
> How do they categorise what is collected in their database as child porn? I have yet to see an automated system that can look at a photo and describe what it is (although several have been promoted over the years), so I imagine that the decision as to what category the pics fall under must be made by a human. So my question is: whose standard do they apply for the process?
Indeed. And it gets even murkier when one considers famous images such as this [globalsecurity.org] (SFW).
The article indicates that hashes of the image
And of course... (Score:5, Insightful)
And with all the porn, they'll need .. (Score:5, Funny)
*rim shot*
Thank-you, thank-you, I'll be here all week
RTFA? (Score:5, Funny)
Re:RTFA? (Score:3, Funny)
We'd be lucky for 3 slashdotters to rtfa on any given topic
It's a really delicate subject (Score:4, Insightful)
The subject is really complicated; here you have a joint action from the top ISP companies, but there are some things we must know.
This means that if "somebody" sends me an image that triggers the filter, I'm gonna be a "suspect" (at least for a while), so AOL refers the case and 1 minute later I have an investigation running on my private emails.
BTW... I don't want to sound paranoid, but this is a "way to start"; then the database can include other kinds of images (who knows?). Or just filter anything they want. The comparison with the antivirus system (intentional and not so technically related) puts me even more on alert.
I don't want to sound liberal, I'm against child pornography, but I think that this is not the way to fight it. If some sick man (A) has a picture of some even sicker asshole (B) doing nasty things with a child, he (A) is a sick person but not a criminal; the asshole (B) must go to jail because he abused (mentally and physically) the boy (the other guy (A) must go to a doctor).
Another idea could be the "infection" of some images/files/videos, left in the wild (these pedophile bastards are not technical specialists; the majority of them are teachers, fathers or military related). So we keep track of the files all over, and figure out "sources" where they upload these files: not a "single email address", I mean, but places where a lot of files converge from different places. Then security experts, with some legal support, 0wn the server and monitor everything... and the investigation continues.
Also, the P2P networks have a LOT of "pedophilic" shares, but you can't run after every sick person; you must go to the source and condemn the one who abuses the child.
I don't like the idea of "monitor everything -> search for something". I think it must be like I said before... it's a HUGE difference.
How would it work?? (Score:3, Insightful)
Duplication of effort (Score:5, Insightful)
http://www.interpol.int/Public/ICPO/PressReleases
I've met some of the guys running it, and while I really admire their dedication and achievements, I can honestly say there's no job on earth I'd less like to have.
Computer-generated images will win out (Score:5, Interesting)
So, yeah, go ahead and build your database. By the time it is up and running, it will be obsolete, and we'll be discussing other problems.
Re:Computer-generated images will win out (Score:3, Informative)
In the UK, computer-generated porn is classified in law exactly the same way as regular digital / wet photography. AFAIK the same is true of drawings and paintings as well.
What the hell for? (Score:3, Interesting)
Is there some question as to the definition of "child porn" or some type of miscommunication that prevents someone from looking it up in the dictionary? Because if the people enforcing these policies can't identify child porn without looking at 100 other child porn images first, then we have one hell of a problem on our hands.
Stockpiling these images isn't going to do anything at all. If they wanted to create some type of program that could identify porn, they could do it with the millions of legal (most of which are free) images on the web.
I run an ISP. (Score:5, Funny)
rhY
So much potential for abuse (Score:5, Insightful)
I piss off the wrong person. This person has access to material of this kind, and a zombie botnet. He arranges for this botnet to spam me with pictures of kiddy porn. The emails are caught by this system and flagged, and suddenly I'm the subject of an investigation. The way that sort of thing works here in the UK, I'm likely to be splashed all over the papers before my innocence is proved (which won't make nearly as large headlines, of course). Even if I am cleared, my reputation may well be shot to hell; people over here aren't too picky when it comes to this sort of thing. A few years ago a tabloid paper raised hell about paedophiles having been released into the community after serving their sentence. Some of the resulting protests saw a paediatrician being hounded from her home - people saw "paed" and thought "paedo". Rationality often takes a back seat where kids are concerned; this could be a very cheap and easy way to utterly ruin someone.
Hypothetical scenario 2:
I go on holiday with my family. I take photographs. I email some of these photographs to my friends and parents. Some of them contain shots of my 6 year old daughter in her swimming costume. An overzealous automated process tags this as a false positive, and suddenly we're all under investigation.
To be honest, scenario 2 doesn't worry me so much; it should be obvious to even the most rabid "think of the children" zealot that the photos are perfectly innocent. It's the first one that gives me grave cause for concern. It would potentially take some effort to prove ones innocence, during which time you're very likely to have been utterly pilloried in the press. If you have kids yourself, they may even have been taken into care for the duration, and are likely to have been teased or bullied about it at school.
I appreciate that measures do need to be taken to fight against child porn, but given the highly sensitive nature of the subject, I have concerns about implementing any sort of automated system.
Is this really a problem? (Score:3, Interesting)
What is the percentage of clearly illegally created porn as opposed to legally created porn?
Does this justify these measures? Does this reduce the incidents of actual abuse?
My thinking is that there is not all that much actual child abuse going on, and that much of the 'illegal' porn floating about the internet is multiple copies from the few actual abuses, or legal porn masquerading as 'illegal' porn. I also don't believe that the problem is so widespread that I need to relinquish any more of my privacy or rights than the ones already stolen from me by the federal government's 'war on terror'. I also don't think that this will in any way lessen the incidence of child abuse, and that is what we as a society need to stop. I'm all for stopping child abuse and I don't mind paying to stop it. However, I *do* mind losing rights, and I do mind paying for ridiculous, ineffective boondoggles. And it seems lately that the government, when faced with any 'problem', can *only* come up with ridiculous, ineffective boondoggles.
This will be about as effective as stopping the consumption of cocaine in the United States by dumping millions of tons of roundup in South America.
Or about as effective as stopping terrorism by killing 50,000 Iraqi civilians *and* reading all of my email and listening to all of my phone calls.
Re:Is this really a problem? (Score:3, Insightful)
Child porn filtering only helps its distribution (Score:3, Insightful)
Essentially, any filtering mechanism depends on the ability to detect the illegal act. If you block every method of distribution that can be detected, the only channels left for child porn distribution are the ones which currently cannot be detected. Thus, in the long run this will only make it safer and more secure for people to download child porn. With filtering in place, end users will know that if they're able to get the material, it probably cannot be traced.
If you want real solutions to the child porn problem, you should attack the people involved. "Divide and conquer" is the basic strategy: the different groups have to be isolated from each other and dismantled. Currently there are large anonymous p2p networks which are mainly run by people who want to share files, chiefly for copyright infringement. The child porn distributors use the same networks. If you want to eliminate child porn, you need to isolate these two groups from each other by giving them different goals. Currently, they both want to hide what they're doing from the authorities. One straightforward solution would be to allow filesharing for non-commercial purposes and encourage it to be done in plain sight on moderated networks, so child porn distributors couldn't piggyback on warez networks. Not going to happen anytime soon, eh, so does anyone else have any other ideas?
Wouldn't it make more sense ..... (Score:5, Insightful)
I have absolutely no problem whatsoever with people who just want to look at pictures. Yes, they may well be pictures documenting a crime that was committed
I say let people jack off into a box of tissues as much as they damn well like. At least once they've spent their pocket money, they're no danger to anyone for a couple of hours. If they're doing more than look at pictures, then by all means go after them. But what a person does within the privacy of their own imagination is nobody else's business.
Re:Wouldn't it make more sense ..... (Score:5, Informative)
Oh, but arresting people for thought-crimes and future-crimes is so much more fun. Easier too!
Seriously though, what's scary to me is how little discretion the cops/prosecutors use when arresting people for CP-related crimes. They arrest underage teens for sending out nude pictures of themselves!
http://www.usatoday.com/tech/webguide/internetlif
People always assume that everyone arrested for CP is a 50-year-old guy in a trenchcoat looking at pictures of babies being raped. Not so. There are so many cops working on these cases that they bust everyone they can find.
Is a picture of your kids naked child porn? (Score:3, Insightful)
While I find it mildly weird to put family photos with naked kids on Flickr or your own family picture site, I can see no reason why this should be illegal. But isn't there a chance of these pictures finding their way into the kiddie porn database? If so, isn't there a decent chance someone may end up being tracked as a pedophile simply for proudly posting family pictures on the Internet?
Differentiating between kiddie porn and legal pictures of kids is probably hard enough when you do it manually and individually, but doing this on a massive scale just sounds incredibly hard and possibly dangerous.
Open up the can o'worms (Score:3, Insightful)
That's pretty much it.
Now, when A can't get his pictures from B anymore the "normal" way, what will happen? Will they stop trading?
Would you stop getting music from the 'net if the RIAA (who do I fool, that should read "when", not "if") buys the corresponding law to apply this technology to music?
What will happen is that the ways to transfer those items become more obscured. Hashes are worthless as soon as you change a single byte. Both ends agree on an encryption scheme and the transfer is possible again. What automatically fails is any kind of tracking possibility.
Currently, when those files can pass, CP traders might be carelessly using traditional means to transfer their material. Because "it works". When it doesn't "work" anymore, they won't stop, they will turn to technologies that can not be stopped.
Those can't be tracked as easily either, though.
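The "worthless as soon as you change a single byte" point is easy to demonstrate. Here is a minimal sketch (the byte string is just a stand-in for a real file's contents): flipping one bit yields a completely unrelated MD5 digest, so any blocklist keyed on exact hashes is defeated by the most trivial alteration.

```python
import hashlib

# Stand-in for the raw bytes of some file being transferred
original = b"some binary content standing in for an image file"

altered = bytearray(original)
altered[0] ^= 0x01  # flip a single bit in the first byte

h1 = hashlib.md5(original).hexdigest()
h2 = hashlib.md5(bytes(altered)).hexdigest()

# The two digests share no recognizable relationship
print(h1 == h2)
```

This prints False: cryptographic hashes are designed so that any change to the input scrambles the entire output, which is exactly why exact-hash matching is so brittle here.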
REALLY bad idea. (Score:4, Insightful)
I don't think anybody is against the idea of nailing the kiddie pornographers and getting their "customers" into therapy or whatever they need, but I think this particular idea is a bad misfire.
Gov. sending you child porn? (Score:4, Insightful)
So, the ISPs put this system in place, the GOV hires a bunch of spammers (all under the table of course) to email low-grade kiddy porn to everybody who looks like the next terrorist, and VOILA: instant access to all your information, digital and physical. A kiddy porn investigation gets the judges to write out all kinds of warrants for the FBI, and you are powerless to stop it.
Some asshat senator mad at your company for opposing one of his bills? Send some kiddy porn to you, and start an investigation. Even if they don't find anything, you'll most likely lose half of your customers and most of your respect.
I'm scared.
Search Warrant (Score:4, Insightful)
Typical Slashdot (Score:5, Interesting)
NCMEC will undoubtedly be supplying a hash database to ISPs, probably MD5 or SHA-1, as these are in common use today. This would enable matching of identical files quickly and easily.
Unfortunately, we are already running into the limits of simple MD5 matching in child porn cases today. Resize the picture or brighten it up a little bit and the MD5 value changes, and your database, library or whatever is then useless. You have a new, original picture with a new, original hash value. There are other ways to accomplish this which do not suffer from these limitations without giving up high-speed autonomous comparisons. Check out http://www.infinadyne.com/icatch.html [infinadyne.com] for some ideas.
Yes, I work at the company that is producing this product.
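To make the parent's point concrete, here is a toy sketch of one family of "other ways": a perceptual "average hash," which sets one bit per pixel depending on whether that pixel is brighter than the image's mean. This is an illustration of the general idea only, not the method any actual product uses; the 4x4 "image" and the brighten-by-20 edit are invented for the example. Brightening shifts every pixel and the mean together, so the perceptual hash survives an edit that completely changes the MD5.

```python
import hashlib

def average_hash(pixels):
    """Perceptual 'average hash': one bit per pixel, set if the
    pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def md5_of(pixels):
    """Exact cryptographic hash of the raw pixel bytes."""
    return hashlib.md5(bytes(p for row in pixels for p in row)).hexdigest()

# A tiny stand-in 4x4 grayscale "image"
img = [[10, 200, 30, 220],
       [15, 210, 25, 230],
       [12, 205, 35, 215],
       [18, 225, 28, 240]]

# Brighten every pixel by 20 (clipped to 255): the MD5 changes...
brighter = [[min(p + 20, 255) for p in row] for row in img]
assert md5_of(img) != md5_of(brighter)

# ...but the average hash is unchanged, so matching still works.
assert average_hash(img) == average_hash(brighter)
```

Real perceptual-matching systems are far more involved (they downscale, use frequency transforms, and compare hashes by distance rather than equality), but the basic trade-off is the one shown: tolerance to small edits in exchange for giving up the exactness of MD5.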
you are all missing the point (Score:5, Insightful)
what is REALLY shocking is that this opens the door for ISPs to get their 'fingers on the bits' (it's a data-comm term - sorry about the double entendre).
so far, it has not been 'ok' to let ISPs scan for content and make judgements on it. most ISPs have drawn the line to say that we are just a carrier of bits and we are not RESPONSIBLE for what the user includes in the payload.
the music and film industry has tried to get ISPs to do their spying. with mixed success.
but scream 'CP' and you can't publicly NOT support that (and still keep your job). "have you stopped beating your wife yet?" goes the old joke. there's no safe way to answer that. if you publicly oppose such a politically charged idea, you are a boogeyman and an evil person. if you support it, you will pass under the suspicion-radar and will more or less be left alone.
this is a power grab to OFFICIALLY define an isp's job as net-nanny. first they claim to be protecting the citizenry - but it's really far more devious than that. once the gov and the isps convince joe sixpack that it's to their 'benefit' for the net-nannies to read all your content ahead of you, you will NEVER get that level of privacy back again.
this is a sham. whenever someone says "won't you please think of the children!" you can bet that there are ulterior motives going on.
remember: those in power just want to keep and increase their control level. fingers on the datacomm bits is one thing they've been after for a long time!
Re:Wanna bet? (Score:5, Funny)
Re:Wanna bet? (Score:5, Funny)
Think of the children!!
Er, wait, that's the problem to begin with . . .
(It's an oldie but a goodie, folks!)
Official stance (Score:5, Insightful)
Sharing of copyrighted music leads to less copyrighted music.
Find the anomaly.
In fact, to follow the "think of the children" idea, I believe that such a database would lead to more CP production, as you would have to "replace" the censored material (assuming this measure were effective), leading to profits for pornography producers.
Just a thought
Re:Official stance (Score:3, Insightful)
Sharing of copyrighted music leads to less copyrighted music.
Find the anomaly.
More sharing means more instances of child pornography which inspires more people which leads to more child molestation which again leads to more child pornography.
More sharing means less sales of copyrighted music which leads to less revenue which leads to less (copyrighted) commercial music.
While I suppose there could be some commercial child pornography producers who
Re:Official stance (Score:3, Interesting)
In fact I am very cautious about people claiming to fight agains
Oh, please. (Score:4, Insightful)
How did this get moderated up? I'll point out the anomaly: no company in the world has a legitimate market in child pornography. The rationale is that illicit/illegal downloading leads to more illicit/illegal downloading in the cases of both child pornography and copyrighted music.
The damage (theorized by the RIAA) to legitimate music markets by illegal downloading cannot happen to the market for child pornography because there is no market of child pornography to harm.
Re:Wanna bet? (Score:5, Funny)
If anything, I think the point is to NOT think of the children. At all. STOP IT SCUMBAG.
Re:Wanna bet? (Score:5, Insightful)
I thought the same thing while watching some news report about child porn on television recently. A cop was sitting at his computer doing some clicking as he viewed child porn (obviously the camera didn't show the screen), and he talked about his war against distributors. Something just wasn't right about the way he talked about child porn, almost as if it took effort to disparage it, and I got the sneaking suspicion that he had been compromised by it in some way. It made me wonder how much of a risk there is of a police officer developing an addiction to the material he's sworn to defend against, a la Philip K. Dick's A Scanner Darkly [amazon.com]. One wonders why cops are allowed to work on this on their own; it seems to me it would make much more sense to allow people access to the material only in teams, perhaps mixed-gender.
Re:Wanna bet? (Score:5, Interesting)
Compromised? Not in the way you mean.
Unfortunately, I have some experience of this from about 10 years ago. While I was working for a large corporation as a sysadmin I came across a stash of this stuff. To cut a long story short it went from that to helping the police gather evidence against three individuals and from there to helping them to crack a much larger ring of paedophiles.*
A normal adult wants to love and protect kids. I can tell you these people (I use the term advisedly) are *really* not normal and some of the images made me physically sick - literally. We are not talking about kids in the nude - you don't want to know. There is NO way a NORMAL adult will be compromised... really! What that police officer was probably feeling was... nothing. You have to be like that to be able to take it at all and even then it does damage. It's so bad that you *must* stop after a couple of years.
"One wonders why cops are allowed to work on this on their own, seems to me it would make much more sense to allow people access to the material only in teams, perhaps mixed-gender."
Well, you are in a team. Part of the reason for trawling through the material on your own is logistics (manpower, etc.); the other is: why expose people to more than necessary? And, as I mention above, the dangers aren't that you'll turn into a paedophile yourself.
* Yes we got them - it was on the front page of the papers - especially the bit about most of them getting 15 months. We spent two years taking them down. Go figure!
Re:Wanna bet? (Score:5, Insightful)
"He who fights with monsters might take care lest he thereby become a monster. And if you gaze for long into an abyss, the abyss gazes also into you."
Friedrich Nietzsche, Beyond Good and Evil, Aphorism 146
Re:Wanna bet? (Score:4, Insightful)
Anyway, I wasn't using the word "normal" in a moral sense. I meant normal as in the vast majority who have the instincts to nurture and protect children.
To illustrate my point: Theoretically, someone with malformed instincts might be able to suppress the actions or even thoughts that accompany this flaw through their morals. Just because they have a moral stance against what their diseased instincts are telling them doesn't mean that they are *not* normal.
Or, to put it another way: A NORMAL adolescent male (I use male 'cos it's much more pronounced in males) wants sex as often as possible - he's not interested in children because they don't trigger his instincts. He doesn't try to rape every female who comes his way because he has NORMAL morals (and/or a normal understanding of what will happen to him if he's caught). In this case the instinct is normal, but it doesn't *necessarily* result in moral behaviour.
Err... does that make sense?
Re:Wanna bet? (Score:5, Insightful)
I'd open a book on it, but only at 1/33.
Just like the Catholic Church is full of pedophiles and pederasts, no doubt "internet" law enforcement is filled with closet perverts who delight in amassing volumes upon volumes of illicit data. It's probably also filled with those who get their thrills from snooping on other people's emails.
Let's put it this way: where's the best place for a criminal to hide? A position of authority.
Re:This discussion will come to nothing, so... (Score:3, Interesting)
Re:Corporate and Government Censorship (Score:4, Interesting)
There is no such thing as innocent until proven guilty. There never was. It was something we were to aspire to, because as humans we do not honestly believe it.
You disagree with me, you're guilty! You don't like my politician, you're the enemy... You don't support the war, you're a commie!
You don't join the party? You're anti-American.
It's us vs. them. I'm right, you're wrong. You can't possibly be right, because I am right. You are guilty because I say so.