ISPs to Create Database to Combat Child Porn

BlueCup writes to tell us that several media companies are banding together to create a database of child pornography images to help law enforcement officials combat distribution of such material. In addition to the database, several tools and new technologies are also planned, but most notable is what some perceive as a willingness to cooperate, which critics say has been lacking in the past. From the article: "Each company will set its own procedures on how it uses the database, but executives say the partnership will let companies exchange their best ideas — ultimately developing tools for preventing child-porn distribution instead of simply catching violations."
This discussion has been archived. No new comments can be posted.

  • Yeah. (Score:5, Insightful)

    by Dibblah ( 645750 ) on Tuesday June 27, 2006 @04:21AM (#15611256)
    This is a great idea. With a couple of tiny issues.

    ISPs have long said that they are just carriers and are not responsible for the content they provide access to. As soon as the technological solution for implementing a "content filter" is there, RIAA and friends will _require_ ISPs to use it for that purpose as well.

    This is completely ignoring the technical stupidity of trying to "fingerprint" media that is _not_ going to be transferred in plaintext.
    • Re:Yeah. (Score:5, Insightful)

      by Threni ( 635302 ) on Tuesday June 27, 2006 @04:23AM (#15611258)
      > This is completely ignoring the technical stupidity of trying to "fingerprint" media that is
      > _not_ going to be transferred in plaintext.

      And even if it is, it's trivial to come up with a way of altering images so that they look identical but where every bit is different to the original.

      I'm sure the Chinese government would literally kill to have a way of tracking the movement of files too.

      But yeah..kids...photographs...the internet...
      • it's trivial to come up with a way of altering images so that they look identical but where every bit is different to the original.
        Shouldn't there be an "... er ... so I'm told" in there somewhere?
    • Re:Yeah. (Score:5, Funny)

      by AGMW ( 594303 ) on Tuesday June 27, 2006 @05:41AM (#15611465) Homepage
      This is a great idea. With a couple of tiny issues.

      I nearly spat my tea out all over my keyboard ... I read that as "tissues".

    • Re:Yeah. (Score:5, Insightful)

      by jellomizer ( 103300 ) * on Tuesday June 27, 2006 @07:01AM (#15611657)
      I don't think so. It will be more like other content filters and spam filtering: used as a selling point by ISPs, but not mandatory. If this were the trend, I would expect it to be mandatory for all ISPs to scan everything for viruses (given that viruses affect the economy more, and politicians worry more about money than people).
      Besides, I'd rather have someone like a teacher arrested because child porn was found on his PC than have him work there for years with no one knowing, because the ISP has blocked the traffic.
      • Re:Yeah. (Score:5, Insightful)

        by voice_of_all_reason ( 926702 ) on Tuesday June 27, 2006 @09:45AM (#15612294)
        Besides, I'd rather have someone like a teacher arrested because child porn was found on his PC than have him work there for years with no one knowing, because the ISP has blocked the traffic.

        See, that's the problem -- "rather 100 innocent jailed than one guilty man go free." It's supposed to be the other way around.
    • Re:Yeah. (Score:3, Insightful)

      by muzzy ( 164903 )
      Unfortunately, the only people to profit from filtering are the people who sell filtering systems and the pedos who will set up more secure distribution channels out of necessity. Oh, and ISPs who will use this for PR purposes. And "child rights" groups who only want to police the children and will secure more funding through all the attention they get from these kinds of pointless operations...

      This kind of pointless action doesn't help anyone except those who hunger for power. The people who try to objectivel
  • So this is like... (Score:5, Insightful)

    by Powercntrl ( 458442 ) on Tuesday June 27, 2006 @04:24AM (#15611259) Homepage
    ...stopping the proliferation of nuclear weapons by creating a massive stockpile?
    • by BorgDrone ( 64343 ) on Tuesday June 27, 2006 @04:55AM (#15611357) Homepage
      So this is like stopping the proliferation of nuclear weapons by creating a massive stockpile?
      Yes, good idea btw.

      If we make sure The Good Guys (read: us) have 99 times as much nukular weapons as The Bad Guys (read: them), then only 1% of all nukular weapons will be in the hands of the Bad Guys. Now if we continue to increase the nukular stockpile so we have 999 times as much as The Bad Guys then only 0.1% ...

      So if 'we' have an infinite amount of nukular weapons, the Bad Guys virtually have none at all!
    • by Jugalator ( 259273 ) on Tuesday June 27, 2006 @06:12AM (#15611548) Journal
      Stockpile of what? Not actual nuclear weapons anyway.

      It's like stopping the proliferation of nuclear weapons by creating a stockpile of blueprints telling what various nuclear weapons look like so they can be detected more easily.
      • by enrevanche ( 953125 ) on Tuesday June 27, 2006 @06:58AM (#15611651)
        This will probably only work against particular instances of an image. Changing the resolution or compression rate even slightly will make it look like a whole new image. Zipping images with a password and/or various compression rates etc. will make this difficult also. This may catch the easy suspects, though.

        If you only store a small piece of information per image, the number of false positives will make the whole thing useless. Store too much and you're storing the image.

        Using SSL etc. will make it impossible.

        The analogy with nuclear weapons would be similar: change the box, add a few decoy parts, paint the others a different color, and the original "plans" or pictures are worthless; the machine won't detect squat. A human expert probably would.

        I think this is probably all B.S., i.e. it's someone's idea of how they will make a lot of money in consulting and software development. All the ISPs will buy in to say that they are doing something, even though they know it is B.S.

        This is really a sociological problem, which is hard to fix, and this just makes it sound like everyone's doing something. They don't have the answer. If the PCs of 100 people are confiscated and their personal lives invaded for every one person caught, that is a vast injustice.

        • by Anonymous Coward on Tuesday June 27, 2006 @08:35AM (#15611949)

          This is really a sociological problem, which is hard to fix

          Rubbish. It's (fairly) easy to fix. The trouble is that it's been demonised so much that it's turned into a "thoughtcrime".

          Here's an idea. Remove all laws against copying, selling and downloading child porn, but keep the laws against things that actually involve the children - like statutory rape, child abuse, etc. This makes it more likely that police will be able to find images of kids being abused, partially because the black market won't be so hidden and partially because it's more likely that the illegal stuff will be photographed. If the police have images of abuse, they can crop out everything but the kid's face and stick it on a milk carton with "do you know this kid"-style messages, thus actually tracking down the kids that are being abused and stopping the real crime, not the symptom.

          Unfortunately, this tactic would involve scaling back the paranoia and hatred and making a distinction between people who actually abuse children and people who are attracted to underage people. That's not a distinction society is willing to make; in my opinion, we collectively seem to like having people that we can point unreserved hatred at.

          • by unitron ( 5733 )
            "If the police have images of abuse, they can crop out everything but the kid's face and stick it on a milk carton..."

            I'm pretty sure that the parts of the picture that make it pornographic (view of the child's genitals or view of child in contact with adult genitals, for example) will make it unsuitable for public view.

            There was a case a while back where the entire child was grayed out of the picture, leaving the furniture, bedspread, etc. visible, which allowed wide dissemination of the picture and subseq

          • The view against child sex and related issues is only a recent invention (roughly the last 150 years). Prior to that, there was no real aversion to having sex with children or other "sexual assault" ideas; the Chinese (circa 18th century) had children that "entertained" guests under the table during dinner. Many cultures encouraged child marriages as a way to lessen the burden on families and ensure the girls would get enough support. Just like prostitution was encouraged in Rome so men wouldn't
    • by Firethorn ( 177587 ) on Tuesday June 27, 2006 @07:10AM (#15611677) Homepage Journal
      One would tend to think that a checksum/hash code would be sufficient. You need a fingerprint, not a copy of the act.
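A whole-file checksum scheme of the kind described above is straightforward; here is a minimal sketch in Python (the function names and the choice of SHA-256 are mine, not anything stated in the article):

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Hash a file in 64 KiB chunks so large files need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: str, known_hashes: set) -> bool:
    """True if the file's digest appears in a database of known digests."""
    return sha256_of_file(path) in known_hashes
```

The obvious weakness, raised throughout this thread, is that changing a single byte of the file changes the digest completely.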
      • by jetmarc ( 592741 ) on Tuesday June 27, 2006 @08:00AM (#15611816)
        > One would tend to think that a checksum/hash code would be sufficient.
        > You need a fingerprint, not a copy of the act.

        That might be correct for examination of whole files. However, we're talking about ISPs here. It is not very far-fetched that an ISP would try to match TCP/IP packets. That would require a fingerprint of a part of the image (impossible to produce without the original image).

        My point is that an "ad hoc" database won't be useful without the original images. Sooner or later a user will come up with a new (incompatible) usage mode. Without the original images, the database cannot support it.

        The statement "Each company will set its own procedures on how it uses the database" just asks for it.
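Matching a fragment of a known file inside arbitrary traffic, as described above, amounts to substring fingerprinting; a toy Rabin-Karp-style rolling hash shows the idea (an illustration only, not any ISP's actual scheme):

```python
def window_hashes(data: bytes, w: int, base: int = 257, mod: int = 2**89 - 1) -> set:
    """Polynomial rolling hashes of every w-byte window of `data`."""
    if len(data) < w:
        return set()
    h = 0
    for b in data[:w]:
        h = (h * base + b) % mod
    hashes = {h}
    top = pow(base, w - 1, mod)  # weight of the byte leaving the window
    for i in range(w, len(data)):
        h = ((h - data[i - w] * top) * base + data[i]) % mod
        hashes.add(h)
    return hashes

def stream_contains_fragment(stream: bytes, known: set, w: int) -> bool:
    """True if any w-byte window of `stream` matches a known window hash."""
    return not window_hashes(stream, w).isdisjoint(known)
```

Note that this only sees plaintext bytes: compressing or encrypting the payload, as other posters point out, defeats it entirely.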
      • by macdaddy ( 38372 ) on Tuesday June 27, 2006 @10:31AM (#15612560) Homepage Journal
        That's good and all, but how are they supposed to be able to identify child porn anyway? Sure, some of it is obvious. Quite frankly, some of it isn't. I've seen some images that most people would immediately assume were child porn when in fact it's a young twenty-something dolled up to look younger. If you didn't recognize the actress, you would mistake her for a minor. I'd like to know just how exactly they plan on eliminating the false positives. They must eliminate all FPs, because a mistake could literally ruin a person's life.

        Then again I wonder how this will affect other cultures. Does a culture where females marry at 14 perceive nude images of a person of the same age to be child porn? I'd never thought about that before. I recall an incident where some local photo developing shop called the cops on a foreign couple because they had images developed of the woman and her child nude in the tub and on a bed. Of course SRS freaked out. In reality no harm was done (except to the family). This was a common thing in their culture (and most others I would think). It makes you wonder.

  • Hashing? (Score:4, Interesting)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Tuesday June 27, 2006 @04:24AM (#15611261) Homepage
    I hope they apply a strong hash - I certainly wouldn't want to be the victim of a collision. Which also makes me wonder - though some hashes haven't been broken yet, they likely will be in the future - does this mean pedos will get off scot-free because it might have just been a collision?
    • Re:Hashing? (Score:5, Insightful)

      by irc.goatse.cx troll ( 593289 ) on Tuesday June 27, 2006 @06:07AM (#15611535) Journal
      For that matter, how are they verifying their copy? Obviously if it's a 6-year-old getting raped you'd flag it and add the hash, but what if it's just a girl taking a picture for her boyfriend that leaks out? Especially if it's a 16-year-old who looks like she's 18? Or an 18-year-old who looks like she's 16? What about art? A family photograph from a country where they're open about nudity (okay, it would still be illegal here, but you get what I'm getting at)?

      There's a lot of gray area, and a huge list of hashes isn't going to be very descriptive. While we're at it, they're just flagging files transferred. What if someone sets up a relayer in a country where it's legal and uses it to send kiddie porn to you via email? Click a message, commit a crime, and go to jail. Or if someone defaces a site and puts up CP, or if someone just uploads random CP to a public site (4chan), or any number of other ways.

      Going after real pedophiles hurting real people would be great, but this isn't going to help and passing this kind of tech off as "for the children" is downright offensive.
      • Re:Hashing? (Score:5, Insightful)

        by FireFury03 ( 653718 ) <slashdot@NoSPAm.nexusuk.org> on Tuesday June 27, 2006 @08:12AM (#15611855) Homepage
        While we're at it, they're just flagging files transferred. What if someone sets up a relayer in a country where it's legal and uses it to send kiddie porn to you via email? Click a message, commit a crime, and go to jail. Or if someone defaces a site and puts up CP, or if someone just uploads random CP to a public site (4chan), or any number of other ways.

        This is what worries me about the "it's illegal to view $foo" laws - it's entirely possible that you don't know you're about to view $foo until it's too late and you've broken the law. Is there a need to go after people who have simply downloaded something dodgy since they may not have intentionally done so? Better to concentrate on people who are *paying* for content since by paying they are financially supporting the continuation of the crime (the people who haven't paid are not supporting the real criminals).
        • Re:Hashing? (Score:5, Informative)

          by monsted ( 6709 ) on Tuesday June 27, 2006 @09:42AM (#15612271)
          This is why the child pornography filters employed by most Danish ISPs will now only redirect the user to an "Oops, you do know that this stuff is illegal, right?" page.

          Then again, our filters are made mostly to protect the innocent from being subjected to CP by accident (and yes, it'll stop a few from ever getting into the stuff), not so much prevent someone who really wants it from getting it - they'll always find a way...
    • Re:Hashing? (Score:3, Informative)

      by hackstraw ( 262471 ) *
      I hope they apply a strong hash - I certainly wouldn't want to be the victim of a collision. Which also makes me wonder - though some hashes haven't been broken yet, they likely will be in the future - does this mean pedos will get off scot-free because it might have just been a collision?

      I set up my own "porn" server one time by using MD5 hashes. I used a program called suck to, err, suck down all new pics from certain alt.binaries groups, stored the md5s in a MySQL database, and if the md5 existed, I jus
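The dedup scheme described above - hash each incoming picture, skip it if the hash is already in the database - can be sketched without the news and MySQL plumbing (names are mine; an in-memory set stands in for the database):

```python
import hashlib

def dedup(files: dict) -> dict:
    """Keep only the first copy of each distinct payload, keyed by filename."""
    seen = set()
    unique = {}
    for name, data in files.items():
        digest = hashlib.md5(data).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique[name] = data
    return unique
```

This catches exact reposts only; any re-encoding of the same picture produces a new MD5 and slips through, which is the collision poster's point in reverse.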
      • Re:Hashing? (Score:5, Insightful)

        by budgenator ( 254554 ) on Tuesday June 27, 2006 @11:05AM (#15612768) Journal
        Human sexuality is a continuum: most of us find the opposite sex attractive, most prefer the same age and discriminate based on things like hair color, body shape, etc.; fewer are attracted to the same sex but the same age; some are farther out on the fringe. It's the way we are born.
  • The big problem (Score:5, Insightful)

    by damburger ( 981828 ) on Tuesday June 27, 2006 @04:24AM (#15611264)
    Child porn is the darkest side of the internet. It's the thing all net users should be on guard for, and the argument invoked against the internet by countless alarmists.

    However, I don't agree with this database. Keeping these images, even for law enforcement purposes, is a violation of the privacy of children who have already been subjected to a horrific violation. Leave them alone already.
    • Re:The big problem (Score:5, Interesting)

      by neomage86 ( 690331 ) on Tuesday June 27, 2006 @04:34AM (#15611298)
      Fine, they won't keep the actual images in their databases, but instead keep a hash/signature of images.

      Use a signature generation method like http://vision.unige.ch/publications/postscript/98/MilaneseCherbuliezPun_icapr98.pdf [unige.ch] or even more flexible (kind of like a visual version of musicbrainz) so the signature would be invariant to minor changes in the image. Not really my field, but it seems relatively trivial.
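A signature that is "invariant to minor changes" is what perceptual hashes aim for; a toy average-hash sketch, assuming the image has already been downscaled to a small grayscale matrix (real systems do the resizing and filtering first, typically with an imaging library):

```python
def average_hash(gray):
    """Perceptual "average hash": one bit per pixel, set when the pixel is
    brighter than the image mean. Mild recompression flips few bits."""
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance means similar images."""
    return bin(a ^ b).count("1")
```

Two re-encodings of the same picture land a small Hamming distance apart, so near-duplicates can be matched without storing the image itself.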
    • Re:The big problem (Score:5, Interesting)

      by Eivind Eklund ( 5161 ) on Tuesday June 27, 2006 @05:06AM (#15611381) Journal
      According to the article, it is based on one way hashes - in other words, the image is not kept. Also, no matter what, this is a tradeoff. If we assume that the database is an effective tool for stopping distribution, then keeping an image in the database would be less of a violation of privacy than letting the images float free.

      Eivind.

    • Re:The big problem (Score:5, Insightful)

      by ultranova ( 717540 ) on Tuesday June 27, 2006 @06:42AM (#15611617)

      Child porn is the darkest side of the internet. It's the thing all net users should be on guard for, and the argument invoked against the internet by countless alarmists.

      Bullshit. In the 10 years I've been using the Internet, I've come across child porn one (1) time, and even that looked more like two kids playing doctor than any pedophilic photo setup. If that's the "darkest side of the Internet", then the Net's brighter than the surface of the Sun.

      No, what's happening here is simply another censorship/surveillance system being built with the mantra "think of the children". And the makers do think of the children - they think of those children in the future, all grown up and in chains, and get a hardon from that.

      So no, all the Net's users should not be on guard for the infinitesimally small chance that they happen upon CP by accident, any more than all the people in real life should be on guard for the infinitesimally small chance that the guy passing them on the street happens to be a terrorist. Yeah, it's possible, but even if it happened, what the heck are you going to do - you sick pervert, you looked at the picture, so by law you should go to prison, since such pictures incite people to such acts, so you can't be trusted anymore, right? And what were you doing on a site where pedophiles hang out, anyway? You must be one too!

      Every time I hear "think of the children", I think of the future of those children and want to cry. Well, actually I want to protect those children by beating the living crap out of whoever it is trying to enslave them this time, but crying is more socially accepted.

      However, I don't agree with this database. Keeping these images, even for law enforcement purposes, is a violation of the privacy of children who have already been subjected to a horrific violation. Leave them alone already.

      Do you honestly think that those who are building this censorship and surveillance system are doing it for the children's sake? No, it is something that will be used to put those children into chains once they grow up.

      Don't be fooled by their lies; these people care nothing for the children, or anyone else for that matter; they only care about power.

      • Re:The big problem (Score:4, Insightful)

        by Anonymous Coward on Tuesday June 27, 2006 @07:41AM (#15611762)

        It's a pity that I already used my mod points because I agree 100% with you.

        I have been using the Internet for 20 years. Before the web was invented, I saw hardcore porn pictures floating around in the alt.* newsgroups and on some ftp servers, including on a server that I was administering (the unprotected incoming directory was used by some porn traders until I discovered it and deleted the whole stuff - no, I did not keep a copy). Some of it was rather nasty: zoophilia, BDSM, deep fisting, lots of fetish stuff and so on...

        Later, when the web was invented and started to grow, I started seeing porn popping up on many web sites. Although the number of porn sites has been growing steadily, I would say that the amount of porn that you can be exposed to by accident is not larger than 10 or 20 years ago. The amount of porn that you can find if you are actively looking for it may be a bit bigger, but not much (taking into account all sources of porn that existed then and that exist now: magazines, tapes and now the web).

        But during all that time, I did not see a single child porn picture (save for some censored pictures illustrating articles about how to fight against child porn). Of course I'm not actively searching for that because I find the idea disgusting. But I am convinced that those who make so much publicity around the fight against child porn are overstating the problem and (most likely) have a hidden agenda that I cannot agree with.

      • Re:The big problem (Score:5, Interesting)

        by jesuscyborg ( 903402 ) on Tuesday June 27, 2006 @08:58AM (#15612042)
        "Bullshit. In the 10 years I've been using the Internet, I've come across child porn one (1) time, and even that looked more like two kids playing doctor than any pedophilic photo setup. If that's the "darkest side of the Internet", then the Net's brighter than the surface of the Sun."

        I think you've been spending too much time on Slashdot.

        I've been using the interweb since 1998 when I was 13, and I have been exposed to child pornography since day one. I remember logging in to Microsoft Chat (which was bundled with Windows) and all the rooms were devoted to kid porn... I also remember the channel listings on DALnet just being filled with stuff like, "!!!!!!!!!!!!11LolIta-_OMG-filesrvr" although these channels tended to be pure smoke.

        On a more interesting point, a few years ago, I was paid to go through a list of about 10,000 randomly selected international websites and categorize them by hand for a search engine. For every thousand or so, I would see at least a couple child pornography sites.
        • Re:The big problem (Score:5, Insightful)

          by QCompson ( 675963 ) on Tuesday June 27, 2006 @09:51AM (#15612324)
          I've been using the interweb since 1998 when I was 13, and I have been exposed to child pornography since day one.
          The way these draconian laws are designed, you should be thrown into jail for a very long time. Every child you saw in those pictures, you have personally exploited (or so the theory seems to go). Busting the creeps who take the pictures makes sense to me; busting the saps that look at the pictures seems absurd.
  • Devil's Advocate (Score:5, Insightful)

    by rkcallaghan ( 858110 ) on Tuesday June 27, 2006 @04:25AM (#15611265)
    What exactly is different between Company A (ISP) and Company B (Offshore Freakshow) amassing a huge database of child porn? Company B is probably even in a jurisdiction where having it is legal by local laws, but Company A is certainly not. We have zero tolerance laws so strict they ruin people's lives for a banner ad containing a legal model that simply wasn't documented properly. So how come it doesn't apply here?

    ~Rebecca
  • by Anonymous Coward on Tuesday June 27, 2006 @04:26AM (#15611272)
    This can be problematic and annoying for users when the databases aren't correctly updated. A case in point: the Internet Watch Foundation [iwf.org.uk] maintains a database of child porn / other obscene URLs so that ISPs can take that list (hashed, so the URLs are not revealed) and block them.

    Recently, a popular imageboard at http://img.4chan.org/b/imgboard.html [4chan.org] has been added to that list for reasons unknown. Several UK ISPs, including BT Internet and NTL, have blocked that URL. Complaints to either the ISPs or the IWF from both the users and the site admin have gone unanswered. I am personally quite annoyed by this as I'm a regular user of that board.

    It's this sort of unaccountable censorship of the Internet that makes me suspicious of such 'helpful' databases.
    • I tend to frequent said image board, and, while the posting of child pornography is rare, it happens.
  • wont work (Score:5, Insightful)

    by mtxf ( 948276 ) on Tuesday June 27, 2006 @04:26AM (#15611273)
    how many ways can these pictures be hidden?

    zip, rar, and other compression formats
    encrypted
    hidden inside other files (steganography)
    the list goes on...

    these people should learn: you can't fight the internet
    • Re:wont work (Score:5, Insightful)

      by mboverload ( 657893 ) on Tuesday June 27, 2006 @04:34AM (#15611297) Journal
      People who view child pornography are NOT IDIOTS. Stop treating them like it.

      I'm sick of this mentality that criminals (esp terrorists) are not as smart as you or I. They know just as well as we do they can throw it in a zip or rar file (It's probably a better way for them to transfer the files, anyway!). In fact, IF THEY AREN'T SMART THEY GO TO JAIL. I think that's a pretty strong motivation for covering their ass.

      • Re:wont work (Score:5, Interesting)

        by Eivind Eklund ( 5161 ) on Tuesday June 27, 2006 @06:22AM (#15611574) Journal
        People who view child pornography are not all idiots - like the rest of the population, they're a mix of idiots and non-idiots. However, I suspect there are somewhat more idiots among them than in the rest of the population.

        I've randomly seen ("mild") child porn a couple of times, and I'll admit it turned me on. However, I'm smart enough that I still don't intentionally look it up, nor do I collect it, both for ethical and pragmatic reasons. Those that do look it up aren't smart enough to see and follow those pragmatic reasons.

        Eivind.

        • Re:wont work (Score:4, Insightful)

          by f1r3br4nd ( 16047 ) on Tuesday June 27, 2006 @11:13AM (#15612825)
          1. Perhaps the purpose of CP hysteria is to give law enforcement broader powers that can be used to bust idiots they don't like in general, be they CP idiots or some other type of idiot which can be made to look like a CP idiot.

          2. Like any male adult with a sex drive who isn't a lying sack of shit, you admitted that sometimes individuals that haven't quite reached the age of consent turn you on. I applaud you for your integrity, but think about what you said right afterward: these pragmatic reasons you talk about amount to the laws being so screwed up that you're afraid to do what you want with your own computer in the privacy of your own home. And unless you believe law = ethics, the ethical argument falls apart when you realize there are perfectly civilized, modern, and inhabitable countries where the age of consent falls anywhere between 15 and 18. The US is an anomaly in treating every individual under 18 as a child (except for purposes of administering the death penalty, of course).
  • by bluemeep ( 669505 ) <bluemeep@@@gmail...com> on Tuesday June 27, 2006 @04:30AM (#15611283) Homepage
    ..."Yucky" I guess would be the best word. Not just the fact that they're planning a corporate sponsored mecca of kiddie porn, but things like this too.

    AOL, for instance, plans to check e-mail attachments that are already being scanned for viruses. If child porn is detected, AOL would refer the case to the missing-children's center for further investigation, as service providers are required to do under federal law.

    Sounds like one of those 'good on paper' ideas that later spins itself into a slavering monster that eats half the internet. What's to say they don't start scanning for other things? Is the RIAA going to be knocking on my door because I sent an AOL member a Metallica MP3?

    • by pubjames ( 468013 ) on Tuesday June 27, 2006 @05:37AM (#15611458)
      The thing I really hate about this stuff is the people who say "If you're not doing anything wrong you don't have to worry". But consistently, when law enforcement starts treating everyone as potential criminals, innocent people are affected, sometimes very adversely.

      How many people have been seriously inconvenienced when trying to take a flight because the system has flagged them as a potential terrorist? A lot more innocent people have been inconvenienced than terrorists have been caught. Now, imagine the same situation but applied to this...

      We can just laugh off being tagged as a potential terrorist and tell it as a funny story to our friends and work colleagues. Would you do the same thing if you'd been investigated by the police as a potential paedophile? I could see it happening quite easily - send a photo of your kids in the bath to their grandma, the AOL system tags it, police come knocking at your door and take your computer and all your archives away. You get the computer back a week later with an apology from the police. But the damage is done; your neighbours and work colleagues have found out why the police visited... It's a nightmare scenario, but I'm afraid it's going to happen. And perhaps more innocent people are going to be investigated than real paedophiles caught, as is the case with "the war on terror".
       
  • privacy issues... (Score:5, Insightful)

    by mtxf ( 948276 ) on Tuesday June 27, 2006 @04:33AM (#15611290)
    from tfa: "the goal is to ultimately develop techniques for checking other distribution techniques as well, such as instant messaging or Web uploads"

    so they will be scanning our web traffic in real-time to determine if we are sharing child porn?

    anyone else see this and think something along the lines of "this is just a 'think of the children' excuse to implement advanced monitoring systems, which in due time the govt. will take over 'in the public interest'"?
  • sets a bad precedent (Score:5, Interesting)

    by SethJohnson ( 112166 ) on Tuesday June 27, 2006 @04:33AM (#15611292) Homepage Journal


    These online companies were previously protecting themselves from liability for their customers' transmissions by claiming that filtering this data would be an expensive and prohibitive task. By volunteering this service, they've crossed that line. It should be possible for the music companies, MPAA, etc. to demand filtering as well.

    It's a pretty stupid plan nonetheless. These digital fingerprints will only catch casual or newbie child porn traffickers. Encryption will easily render these fingerprints useless. The worrisome side effect is the false positives that will be triggered by this fingerprinting technique. As an example, try using one of those packages that tries to tag your mp3s by fingerprinting [musicbrainz.org]... Pretty unreliable stuff.

    Seth
  • What is child porn? (Score:5, Interesting)

    by OzPeter ( 195038 ) on Tuesday June 27, 2006 @04:38AM (#15611313)
    No a troll but a serious question.

    How do they categorise what is collected in their database as child porn? I have yet to see an automated system that can look at a photo and describe what it is (although several have been promoted over the years) I imagine that the decision as to what category the pics falls under must be made by a human. So my question is whose standard do they apply for the process?

    I can see that this process could be very arbitrary. So while I am not advocating child porn, I can also see that the data collection process could get very messy and have lots of false positives and negatives. and like the TSAs no fly list, could be very hard to get off it once you are on.

    Oh shit .. I knew I should have read TFA .. they are advocating an automated process that is trained to recognise signatures of pics that are deemed to be bad. If they can do that for $1,000,000 I will be really surprised, as I don't think it has ever been successfully done before for any type of image. I wonder who sold them this snake oil (again)
    • by FTL ( 112112 )

      > How do they categorise what is collected in their database as child porn? I have yet to see an automated system that can look at a photo and describe what it is (although several have been promoted over the years) I imagine that the decision as to what category the pics falls under must be made by a human. So my question is whose standard do they apply for the process?

      Indeed. And it gets even murkier when one considers famous images such as this [globalsecurity.org] (SFW).

      The article indicates that hashes of the image

  • And of course... (Score:5, Insightful)

    by TCM ( 130219 ) on Tuesday June 27, 2006 @04:41AM (#15611327)
    ...those who speak up against this incredibly stupid idea are just latent child porn users. Voila, more people you can potentially detain if you see fit.
  • by OzPeter ( 195038 ) on Tuesday June 27, 2006 @04:54AM (#15611354)
    Petaphiles of disk space.

    *rim shot*

    Thank-you, thank-you, I'll be here all week
  • RTFA? (Score:5, Funny)

    by HaydnH ( 877214 ) on Tuesday June 27, 2006 @04:56AM (#15611359)
    RTFA - no way! Not when the link is on the words "database of child pornography"... I can imagine the headlines now... 3,000,000 /.ers arrested for paedophilia!
  • by KarMax ( 720996 ) <KarMax&gmail,com> on Tuesday June 27, 2006 @04:59AM (#15611363) Homepage
    Where I work, child porn is an important subject. It's not my area, but I still know what happens. You can't imagine the pics and videos that the specialists must see (I never watch any).

    The subject is really complicated, here you have a conjunction action from the top ISP companies, but there are some things we must know.

    AOL, for instance, plans to check e-mail attachments that are already being scanned for viruses. If child porn is detected, AOL would refer the case to the missing-children's center for further investigation, as service providers are required to do under federal law.
    This means that if "somebody" sends me an image that triggers the filter, I'm gonna be a "suspect" (at least for a while), so AOL refers the case and one minute later I have an investigation running on my private emails.

    BTW... I don't want to sound paranoid, but this is a "way to start"; then the database can include other kinds of images (who knows?). Or just filter anything they want. The comparison with the antivirus system (intentional and not so technically related) puts me more on alert.

    I don't want to sound liberal; I'm against child pornography, but I think that this is not the way to fight it. If some sick man (A) has a picture of some-more-sick-asshole (B) doing nasty things with a child, he (A) is a sick person but not a criminal; the asshole (B) must go to jail because he abused (mentally and physically) the boy (the other guy (A) must go to a doctor).

    Another idea could be the "infection" of some images/files/videos, left in the wild (these pedophile bastards are not technical specialists; the majority of them are teachers, fathers or military related). So we keep track of the files all over and figure out "sources" (not a "single email address", I mean places where a lot of files converge). Then security experts, with some legal support, 0wn the server and monitor everything... and the investigation continues.

    Ryan said that although AOL will initially focus on scanning e-mail attachments, the goal is to ultimately develop techniques for checking other distribution techniques as well, such as instant messaging or Web uploads.
    Also the P2P networks have a LOT of "pedophilic" shares, but you can't run after every sick person; you must go to the source and condemn the one who abuses the child.
    I don't like the idea of "monitor everything -> search for something". I think it must be like I said before... it's a HUGE difference.
  • by saurabhdutta ( 904490 ) <saurabh@dutta.gmail@com> on Tuesday June 27, 2006 @05:09AM (#15611389) Homepage
    I am wondering how the system would differentiate between me uploading my lil bro in his swimwear and some other almost-naked pic of a kid meant for some sick bastard in some dingy corner. Wait till you see the feds knocking on your door for no apparent reason. I bet false positives will be enormous... far too many to outweigh the advantages of the system. Also, as another dude pointed out earlier, obfuscation of this type of content isn't really difficult. The entire system is flawed and makes me think... could Google/Yahoo be of any help in combating child porn??
  • by Clovert Agent ( 87154 ) on Tuesday June 27, 2006 @05:23AM (#15611421)
    This'll be different in what way from the massive database and set of image search tools that Interpol already maintains? It's not like every signatory agency (including those in the US) doesn't already have access to it, and it's been running for years.

    http://www.interpol.int/Public/ICPO/PressReleases/ PR2005/PR200536.asp [interpol.int]

    I've met some of the guys running it, and while I really admire their dedication and achievements, I can honestly say there's no job on earth I'd less like to have.
  • by Nice2Cats ( 557310 ) on Tuesday June 27, 2006 @05:31AM (#15611443)
    In, oh, ten, twenty years at the most, everybody will have a computer powerful enough and software good enough to generate any sort of pornography on the fly. And when that happens, they will not have to trade pictures anymore (and the clever ones won't do it), and the rest of us are left with the question of whether that sort of software should be banned. Is it better to have these people sitting in front of a computer generating their fantasies in the seclusion of their houses, or do we want to (try to) take that away from them and risk that they take their cameras out to playgrounds again?

    So, yeah, go ahead and build your database. By the time it is up and running, it will be obsolete, and we'll be discussing other problems.

  • What the hell for? (Score:3, Interesting)

    by Mr. Freeman ( 933986 ) on Tuesday June 27, 2006 @05:44AM (#15611472)
    OK, they want to stop child porn. And to do that they're going to stockpile a ton of it?
    Is there some question as to the definition of "child porn" or some type of miscommunication that prevents someone from looking it up in the dictionary? Because if the people enforcing these policies can't identify child porn without looking at 100 other child porn images first, then we have one hell of a problem on our hands.

    Stockpiling these images isn't going to do anything at all. If they wanted to create some type of program that could identify porn, they could do it with the millions of legal (most of which are free) images on the web.
  • by crhylove ( 205956 ) <rhy@leperkhanz.com> on Tuesday June 27, 2006 @05:45AM (#15611480) Homepage Journal
    Is there anyway I can get a copy of that database? Anyone? Bueller?

    rhY
  • by Tim C ( 15259 ) on Tuesday June 27, 2006 @06:29AM (#15611591)
    Hypothetical scenario 1:

    I piss off the wrong person. This person has access to material of this kind, and a zombie botnet. He arranges for this botnet to spam me with pictures of kiddy porn. The emails are caught by this system and flagged, and suddenly I'm the subject of an investigation. The way that sort of thing works here in the UK, I'm likely to be splashed all over the papers before my innocence is proved (which won't make nearly as large headlines, of course). Even if I am cleared, my reputation may well be shot to hell; people over here aren't too picky when it comes to this sort of thing. A few years ago a tabloid paper raised hell about paedophiles having been released into the community after serving their sentence. Some of the resulting protests saw a paediatrician being hounded from her home - people saw "paed" and thought "paedo". Rationality often takes a back seat where kids are concerned; this could be a very cheap and easy way to utterly ruin someone.

    Hypothetical scenario 2:

    I go on holiday with my family. I take photographs. I email some of these photographs to my friends and parents. Some of them contain shots of my 6 year old daughter in her swimming costume. An overzealous automated process flags this (a false positive), and suddenly we're all under investigation.

    To be honest, scenario 2 doesn't worry me so much; it should be obvious to even the most rabid "think of the children" zealot that the photos are perfectly innocent. It's the first one that gives me grave cause for concern. It would potentially take some effort to prove ones innocence, during which time you're very likely to have been utterly pilloried in the press. If you have kids yourself, they may even have been taken into care for the duration, and are likely to have been teased or bullied about it at school.

    I appreciate that measures do need to be taken to fight against child porn, but given the highly sensitive nature of the subject, I have concerns about implementing any sort of automated system.
  • by bhima ( 46039 ) <(Bhima.Pandava) (at) (gmail.com)> on Tuesday June 27, 2006 @06:43AM (#15611618) Journal
    What is the incidence rate of the abuse of children to create pornography?

    What is the percentage of clearly illegally created porn as opposed to legally created porn?

    Does this justify these measures? Does this reduce the incidents of actual abuse?
    My thinking is that there is not all that much actual child abuse going on, and that much of the 'illegal' porn that is floating about the internet is multiple copies from the few actual abuses, or it is legal porn masquerading as 'illegal' porn. I also don't believe that the problem is so widespread that I need to relinquish any more of my privacy or rights than the ones already stolen from me by the federal government's 'war on terror'. I also don't think that this will in any way lessen the incidence rate of child abuse, and that is what we as a society need to stop. I'm all for stopping child abuse and I don't mind paying to stop it. However, I *do* mind losing rights, and I do mind paying for ridiculous, ineffective boondoggles. And it seems lately that the government, when faced with any 'problem', can *only* come up with ridiculous, ineffective boondoggles.

    This will be about as effective as stopping the consumption of cocaine in the United States by dumping millions of tons of Roundup in South America.

    Or about as effective as stopping terrorism by killing 50,000 Iraqi civilians *and* reading all of my email and listening to all of my phone calls.
  • by muzzy ( 164903 ) on Tuesday June 27, 2006 @06:47AM (#15611629) Homepage Journal
    In the long run, all filtering schemes will only make distribution systems stronger. Child porn is already distributed in password-protected rar files in certain places, and anonymous p2p networks have hundreds of gigabytes of the material in circulation. Technology isn't the problem here; the problem is the people who distribute the material. Any attacks on technology will fail as long as the people and their interests remain.

    Essentially, any filtering mechanism depends on ability to detect the illegal act. If you prevent every method of distribution possible, the only channels left for child porn distributions are ones which are currently impossible to detect. Thus, in the long run this will only make it safer and more secure for people to download child porn. With filtering in place, the end users will know that if they're able to get the material, it means it probably cannot be traced.

    If you want real solutions to the child porn problem, you should attack the people involved. "Divide and conquer" is the basic strategy: the different groups have to be isolated from each other and dismantled. Currently there are large anonymous p2p networks which are mainly run by people who want to share files, namely to perform copyright infringement. The child porn distributors use the same networks. If you want to eliminate child porn, you need to isolate these two groups from each other by giving them different goals. Currently, they both want to hide what they're doing from the authorities. One straightforward solution would be to allow filesharing for non-commercial purposes and encourage it to be done in plain sight and in moderated networks, so child porn distributors couldn't piggyback on warez networks. Not going to happen anytime soon, eh, so does anyone else have any other ideas?
  • by ajs318 ( 655362 ) <sd_resp2@earthsh ... .co.uk minus bsd> on Tuesday June 27, 2006 @07:02AM (#15611664)
    Wouldn't it make more sense to arrest people if and when they actually harm a child?

    I have absolutely no problem whatsoever with people who just want to look at pictures. Yes, they may well be pictures documenting a crime that was committed ..... but so what? The kids in the pictures aren't getting any worse just because other people are looking at them. The harm was already done when the pictures were taken, and it isn't going to be undone.

    I say let people jack off into a box of tissues as much as they damn well like. At least once they've spent their pocket money, they're no danger to anyone for a couple of hours. If they're doing more than look at pictures, then by all means go after them. But what a person does within the privacy of their own imagination is nobody else's business.
  • by GauteL ( 29207 ) on Tuesday June 27, 2006 @08:08AM (#15611844)
    Literally EVERY parent I know has lots of pictures of their kids naked. Kids run around naked on the beach in pretty much all of Europe, and small children simply enjoy taking their clothes off and running around the house and garden, sometimes to the embarrassment of their parents.

    While I find it mildly weird to put family photos with naked kids on Flickr or your own family picture site, I can see no reason why this should be illegal. But isn't there a chance of these pictures finding their way into the kiddie porn database? If so, isn't there a decent chance someone may end up being tracked as a pedophile simply for proudly posting family pictures on the Internet?

    Differentiating between kiddie porn and legal pictures of kids is probably hard enough when you do it manually and individually, but doing this on a massive scale just sounds incredibly hard and possibly dangerous.
  • by Opportunist ( 166417 ) on Tuesday June 27, 2006 @08:40AM (#15611969)
    Currently, CP traders are occasionally found out: A was getting it off filesharing tools from B, and one of them got busted during a "mundane" sting op, and on his PC they found the trace to the other one.

    That's pretty much it.

    Now, when A can't get his pictures from B anymore the "normal" way, what will happen? Will they stop trading?

    Would you stop getting music from the 'net if the RIAA (who do I fool, that should read "when", not "if") buys the corresponding law to apply this technology to music?

    What will happen is that the ways to transfer those items become more obscured. Hashes are worthless as soon as you change a single byte. Both ends agree on an encryption scheme and the transfer is possible again. What automatically fails is any kind of tracking possibility.

    Currently, when those files can pass, CP traders might be carelessly using traditional means to transfer their material. Because "it works". When it doesn't "work" anymore, they won't stop, they will turn to technologies that can not be stopped.

    Those can't be tracked as easily either, though.
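    The parent's point about hashes is easy to demonstrate. A minimal Python sketch (the byte string is just a stand-in for a real image file): flip a single bit and the MD5 digest changes completely, so a hash database only ever matches exact, unmodified copies.

```python
import hashlib

# Stand-in for an image file's bytes; any data shows the same effect.
data = bytearray(b"example image bytes")
h1 = hashlib.md5(bytes(data)).hexdigest()

data[0] ^= 0x01  # flip a single bit
h2 = hashlib.md5(bytes(data)).hexdigest()

print(h1 == h2)  # False: the two digests share no useful similarity
```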
  • REALLY bad idea. (Score:4, Insightful)

    by TomatoMan ( 93630 ) on Tuesday June 27, 2006 @08:58AM (#15612041) Homepage Journal
    So what this database is telling the producers of kiddie porn is: if you distribute the stuff we already know about, there's a higher chance you'll get busted, so be safe and only produce/distribute fresh new material?

    I don't think anybody is against the idea of nailing the kiddie pornographers and getting their "customers" into therapy or whatever they need, but I think this particular idea is a bad misfire.
  • by Revolver4ever ( 860659 ) on Tuesday June 27, 2006 @09:05AM (#15612082)

    So, the ISPs put this system in place, the GOV hires a bunch of spammers (all under the table, of course) to email low-grade kiddy porn to everybody who looks like the next terrorist, and VOILA, instant access to all your information: digital and physical. A kiddy porn investigation gets the judges to write out all kinds of warrants for the FBI, and you are powerless to stop it.

    Some asshat senator mad at your company for opposing one of his bills? Send some kiddy porn to you, and start an investigation. Even if they don't find anything, you'll most likely lose half of your customers and most of your respect.

    I'm scared.
  • Search Warrant (Score:4, Insightful)

    by Detritus ( 11846 ) on Tuesday June 27, 2006 @11:09AM (#15612798) Homepage
    Whatever happened to the idea of a search warrant? The Postal Service isn't allowed to open my mail and check it for illegal or subversive material without a warrant. An ISP has no business scanning my email or web requests for questionable material.
  • Typical Slashdot (Score:5, Interesting)

    by cdrguru ( 88047 ) on Tuesday June 27, 2006 @11:09AM (#15612799) Homepage
    First off, the National Center for Missing and Exploited Children already has this "database" or "library" of child porn images. They would be the maintainers of it, not the ISPs themselves. That is what the article says, and that would be the legal requirement: police and other government agencies cannot keep child porn even for sample purposes.

    NCMEC will undoubtedly be supplying a hash database to ISPs, probably MD5 or SHA-1 as these are in common use today. This would enable matching of identical files quickly and easily.

    Unfortunately, we are already running into the limits of simple MD5 matching with child porn cases today. You resize the picture or brighten it up a little bit and that changes the MD5 value and your database, library or whatever is then useless. You have a new, original picture with a new original hash value. There are other ways to accomplish this which do not suffer from these limitations without giving up high-speed autonomous comparisons. Check out http://www.infinadyne.com/icatch.html [infinadyne.com] for some ideas.

    Yes, I work at the company that is producing this product.
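    The "other ways" alluded to are presumably perceptual hashes. As a toy sketch (not the linked product's actual algorithm), an "average hash" sets one bit per pixel depending on whether it is brighter than the image's mean; unlike MD5, a uniform brightness edit then leaves the hash untouched, because shifting every pixel also shifts the mean by the same amount:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

# A tiny 2x2 grayscale "image" and a globally brightened copy of it.
img = [[10, 200], [30, 220]]
brighter = [[p + 25 for p in row] for row in img]

print(average_hash(img) == average_hash(brighter))  # True
```

    A real implementation would first downscale the image to a fixed grid (say 8x8) so resizing doesn't matter either, and would compare hashes by Hamming distance rather than exact equality.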
  • by TheGratefulNet ( 143330 ) on Tuesday June 27, 2006 @11:31AM (#15612949)
    the point is NOT about the KIND of content. that's just a way to get popular soccer-moms (etc) up in arms and mobilized on your side.

    what is REALLY shocking is that this opens the door for ISPs to get their 'fingers on the bits' (it's a data comm term - sorry about the double entendre).

    so far, it has not been 'ok' to let ISPs scan for content and make judgements on it. most ISPs have drawn the line to say that we are just a carrier of bits and we are not RESPONSIBLE for what the user includes in the payload.

    the music and film industry has tried to get ISPs to do their spying. with mixed success.

    but scream 'CP' and you can't publicly NOT support that (and still keep your job). "have you stopped beating your wife yet?" goes the old joke. there's no safe way to answer that. if you publicly oppose such a politically charged idea, you are a boogeyman and an evil person. if you support it, you will pass under the suspicion-radar and will more or less be left alone.

    this is a power grab to OFFICIALLY define an isp's job as net-nanny. first they claim to be protecting the citizenry - but it's really far more devious than that. once the gov and the isps convince joe sixpack that it's in their 'benefit' for the net-nannies to read all your content ahead of you, you will NEVER get that level of privacy back again.

    this is a sham. whenever someone says "won't you please think of the children!" you can bet that there are ulterior motives going on.

    remember: those in power just want to keep and increase their control level. fingers on the datacomm bits is one thing they've been after for a long time!
