
EU Privacy Rule Would Rein In the Hunt for Online Child Sexual Abuse (nytimes.com) 66

An anonymous reader shares a report: Privacy concerns in Europe have led to some of the world's toughest restrictions on companies like Facebook and Google and the ways they monitor people online. The crackdown has been widely popular, but the regulatory push is now entangled in the global fight against child exploitation, setting off a fierce debate about how far internet companies should be allowed to go when collecting evidence on their platforms of possible crimes against minors. A rule scheduled to take effect on Dec. 20 would inhibit the monitoring of email, messaging apps and other digital services in the European Union. It would also restrict the use of software that scans for child sexual abuse imagery and so-called grooming by online predators. The practice would be banned without a court order. European officials have spent the past several weeks trying to negotiate a deal allowing the detection to continue. But some privacy groups and lawmakers argue that while the criminal activity is abhorrent, scanning for it in personal communications risks violating the privacy rights of Europeans.

"Every time things like these unbelievable crimes are happening, or there is a terrorist attack, it's very easy to say we have to be strong and we have to restrict rights," said Birgit Sippel, a German member of the European Parliament. "We have to be very careful." Of the more than 52 million photos, videos and other materials related to online child sexual abuse reported between January and September this year, over 2.3 million came from the European Union, according to the U.S. federal clearinghouse for the imagery. If the regulation took effect, the rate of reports from Europe would drop precipitously, because automated scanning is responsible for nearly all of them. Photo- and video-scanning software uses algorithms to compare users' content with previously identified abuse imagery. Other software targeted at grooming searches for key words and phrases known to be used by predators. Facebook, the most prolific reporter of child sexual abuse imagery worldwide, said it would stop proactive scanning entirely in the E.U. if the regulation took effect. In an email, Antigone Davis, Facebook's global head of safety, said the company was "concerned that the new rules as written today would limit our ability to prevent, detect and respond to harm," but said it was "committed to complying with the updated privacy laws."


Comments Filter:
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Friday December 04, 2020 @10:28AM (#60793628)
    Comment removed based on user account deletion
    • How will you create the "DB of known illegal images"?

      • by EvilSS ( 557649 ) on Friday December 04, 2020 @10:39AM (#60793684)

        How will you create the "DB of known illegal images"?

        Did you not read the summary? The databases already exist.

      • Comment removed based on user account deletion
        • How exactly will that stom any child abuse?

          You people are like babies that think mommy has vanished because she holds her hands in front of her eyes...

          Alternatively, you don't give a single fuck about protecting children. Only about how it looks. How YOU look.
          Plastic people.

          • s/stom/stop/

            -- Jeez, I hate mobile touch UIs!

          • by dfghjk ( 711126 )

            Exactly, it will do none of that. It will only serve, at best, to ensnare people who aren't likely to be a threat to anyone. It won't help with abuse, nor will it address new images or the motivation to create them.

            • Comment removed based on user account deletion
            • It really takes very little reasoning to see how measures like these could significantly reduce production, and since abuse is a prerequisite of production, reduce abuse. If you increase the risk for people who share any image known to be child pornography, fewer people are going to be willing to endure the risk of doing so. If there's less risk, there will be more people willing to become part of a group that obtains, produces and shares these things. While technically savvy people can certainly u

              • The risk of getting caught hasn't seemed to stop all the school teachers getting caught banging their students. Rarely a week goes by that someone isn't busted. It's funny that before the big crackdown in the 90s, it was always presumed it was men who were the terrible teachers sneaking around banging the innocent young girls. All this time it was really the ladies enticing the willing horny young boys.
              • by gweihir ( 88907 )

                Threatening punishment does not stop people with non-standard sexual urges that they do not have under control. That at least should be amply clear in this day and age. You may perhaps stop the occasional psycho that simply does not care who he hurts, but that is not the typical person we are talking about here.

                For an example, read the story about that Polish politician who was recently caught at a male-only orgy while being ultra-conservative. That guy could also not help himself, despite that this will c

                • Threatening punishment does not stop people with non-standard sexual urges that they do not have under control.

                  It doesn't stop people with standard sexual urges either - see ... oh, try your local bronze-age goat-herder's morality guide (there's probably one in a church near you, even if there isn't a congregation) and the rules banning the coveting of either your neighbour's wife, or his ass. Even then, they knew about standard and non-standard urges.

                  The various people within living memory who have been j

          • Comment removed based on user account deletion
            • by gweihir ( 88907 )

              Nope. That is an extremely perverted view of things. Actual child abuse is what allows the making of such pictures, but the vast majority of child abuse happens without being documented. And there is the real problem. People like you effectively argue that abusing children is quite ok as long as you do not take pictures or make movies of it.

          • by gweihir ( 88907 )

            How exactly will that stop any child abuse?

            It will not. It is also amply clear that the threat of penalties and convictions does not stop this. But it does give the police nice opportunities to claim "victories" and some particularly despicable politicians to claim the same. At the same time, making mere possession illegal nicely curbed any research into the area. You know, activities that could have identified measures that would actually have helped and _prevented_ children from getting abused. I am completely convinced by now that this is intende

            • Completely agree. Child abuse, like any other crime, will be much better addressed by prevention rather than deterrence. Most abusers have psychiatric problems and the type of deterrence currently being used precludes searching for help before they act or finding mitigation strategies for their problem. In the end, it's the children that suffer. It's one of the few instances where I think the "OMG think of the children" actually applies.
              • by gweihir ( 88907 )

                Exactly. Nobody sane will abuse a child unless they cannot help themselves. But if they cannot help themselves, no amount of threatened punishment will do anything.

      • Checksums, thumbnails, the raw images themselves, comparison algorithms, filenames: there's a huge amount of data that can be used for that.

        • They're also things that can be completely changed by swapping one pixel
          • Not if you slice the image up into different pieces and index their location. Trust me, busting CP producers is a major effort and they'll track them down from the tiniest detail.
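            A self-contained sketch of the tiling idea above, under the same sort of illustrative toy-hash assumptions as before (this is not any real forensic tool): hashing each tile separately means a single changed pixel only perturbs one tile's fingerprint, so the remaining tiles still match.

              # Illustrative per-tile hashing; not any real forensic tool.
              # Requires Pillow (pip install Pillow).
              from PIL import Image

              def tile_hashes(path, grid=4, size=8):
                  """Split the image into grid x grid tiles and hash each tile,
                  keeping (row, col, hash) so partial matches can be located."""
                  img = Image.open(path).convert("L")
                  w, h = img.size
                  out = []
                  for r in range(grid):
                      for c in range(grid):
                          box = (c * w // grid, r * h // grid,
                                 (c + 1) * w // grid, (r + 1) * h // grid)
                          tile = img.crop(box).resize((size, size))
                          pixels = list(tile.getdata())
                          mean = sum(pixels) / len(pixels)
                          bits = 0
                          for p in pixels:
                              bits = (bits << 1) | (1 if p > mean else 0)
                          out.append((r, c, bits))
                  return out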

            • by gweihir ( 88907 )

              When somebody says "trust me", what you should hear is "do not believe a word I say".

              • My MIL is a social worker who (among many others) works with victims of CSA, people convicted of crimes against children, and people who investigate such crimes. She told me how the police will scrutinize things as simple as wall sockets or mentions of mundane events like weather problems or internet/power outages to narrow down a nonce's location; this part came up when she was telling me that one of her clients starts suffering PTSD symptoms if he's in a room where there are no sockets visible, as his abus

          • They're also things that can be completely changed by swapping one pixel

            Facebook analyses the images with CNNs.

            CNNs are trained to tolerate "salt & pepper" noise added to images. It is a standard technique used to make the models more robust and avoid overfitting.

            Changing one pixel, or even thousands of pixels, is not going to make a difference.

            Convolutional Neural Networks [wikipedia.org]
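            To make the noise-tolerance point concrete, here is a minimal sketch of salt & pepper augmentation as it is commonly applied while training image classifiers; the 2% corruption rate and the NumPy-only implementation are illustrative assumptions, not Facebook's actual pipeline.

              # Illustrative salt & pepper augmentation, a standard trick for
              # making CNNs robust to isolated pixel changes; not any specific
              # production pipeline. Requires NumPy.
              import numpy as np

              def salt_and_pepper(img, amount=0.02, rng=None):
                  """Set a random `amount` fraction of pixels to pure white or black."""
                  rng = rng or np.random.default_rng()
                  noisy = img.copy()
                  mask = rng.random(img.shape[:2]) < amount   # pixels to corrupt
                  salt = rng.random(img.shape[:2]) < 0.5      # half white, half black
                  noisy[mask & salt] = 255
                  noisy[mask & ~salt] = 0
                  return noisy

              # Each training batch would be augmented before the forward pass,
              # e.g. augmented = salt_and_pepper(image_array)  # HxW or HxWxC uint8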

      • by gweihir ( 88907 )

        How will you create the "DB of known illegal images"?

        You get somebody in law enforcement with a large private collection of the stuff and use that as a basis...

        Unfortunately, that is not even a dark joke. I know of a European police agency that internally asked for analysts. Most who applied were policemen with convictions for possession of such materials.

        • You get somebody in law enforcement with a large private collection of the stuff and use that as a basis...

          I used to work with legal agencies and law enforcement IT; the detectives were usually proud of their collections and of shocking the staff.

          • by gweihir ( 88907 )

            You get somebody in law enforcement with a large private collection of the stuff and use that as a basis...

            I used to work with legal agencies and law enforcement IT; the detectives were usually proud of their collections and of shocking the staff.

            I am not surprised at all. Thanks for confirming that I am not the only one who has observed this.

  • Good (Score:4, Insightful)

    by Joce640k ( 829181 ) on Friday December 04, 2020 @10:30AM (#60793634) Homepage

    Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety.

    Benjamin Franklin (1706-1790)

    • Those who bring overused words to justify a simplistic laissez-faire solution to a difficult problem deserve neither simplicity nor words.

    • by fermion ( 181285 )
      It is about what is used and saved. Some of this data is saved and used for targeted ads, which is problematic. By making this practice illegal, it does lead to a level of privacy we should expect. While it is legal for any node to scan anything that goes through it, and use it to generate revenue, it is not always acceptable. We do not expect our ISP to blackmail us because we sent an email saying we lied about missing work. What is real is that law enforcement believes they have the right to break into your ho
    • It's a tradeoff. And there are a lot of people who don't really understand this. If their sole life-consuming goal is the elimination of child porn, for example, then the solution obviously is a totalitarian state where children are raised by robots on a continent with no adults present. Well, no one is actually advocating that, but it's not a stretch to imagine those people wanting extremely intrusive police powers.

      And you do see this constant struggle in government between the power of law enforcement v

    • I do love this quote considering Franklin was talking about the exact opposite of what you think he was. Also there's nothing "temporary" about the safety that this quote is often used against.

      Franklin was literally talking about a security force defending an estate against the British and taxing the Penns for the privilege. The Penns blocked that effort by paying the governor to veto laws. The quote misused to prevent governments from infringing the rights of people was actually in context a compl

  • Translation (Score:5, Insightful)

    by JBMcB ( 73720 ) on Friday December 04, 2020 @10:30AM (#60793636)

    We want to ban large internet companies from collecting data from their users. But we want them to collect data from their users and give it to us.

  • by Anonymous Coward
    Try hunting at source, and at the destination, and stop snooping on people's private communications. A fight that encryption has already won.
  • by Valkyre ( 101907 ) on Friday December 04, 2020 @10:36AM (#60793668) Journal

    It's become the go-to way for tech to fight things they lose money on. If they would spend 1/10th of their lobbying and marketing budget on actually fighting child exploitation we might actually do something about the problem! But no, it's just another reason to slow down any consumer-friendly tech legislation.

    • by ytene ( 4376651 ) on Friday December 04, 2020 @10:53AM (#60793752)
      Because of the 4 Horsemen of the Infocalypse [wikipedia.org].

      Translation: because those in government with authoritarian tendencies know that if they proposed a law to allow surveillance of citizens for, e.g., "bad language online", they would be laughed at. So when they want to bring in something that would otherwise be rejected on the grounds that it was needlessly intrusive, the invocation of "but think of the children" is a good way to convince a careless public to go along with what they want.

      It is an incredibly common tactic. It is common because it is effective. In fact, the most annoying aspect is the way that the media tend to get duped each time it is used - and play along with manipulative lawmakers.
    • by ceoyoyo ( 59147 )

      Child porn pushes one of our big cognitive buttons. Terrorism too. The point of terrorism is to multiply force by triggering our irrational fear reflexes, and evolution has figured out that insane parents produce the best reproductive outcomes.

      Any policy has benefits and harms; the argument is just about what weight to put on each. When one of the harms is the potential for a child to be hurt, or terrorism, many people's brains shut off, they assign infinite weight to that factor, and you've won not just a

    • by gweihir ( 88907 )

      Simple: The war-on-drugs is winding down and they cannot keep the fundamental truth that it has done massive damage and helped nobody under wraps much longer. There is now legalized use, legally tolerated use and actual research into effects and how to reduce the harm done, instead of locking up harmless pot-smokers for long times. Hence there is a dire need for a new blown-up threat and this one is it.

      In actual reality, it is basically a certainty that almost all child-abusers do not document their deeds.

  • wat (Score:5, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday December 04, 2020 @10:43AM (#60793714) Homepage Journal

    "An anonymous reader shares a report:"

    Won't someone think of the children? Signed, A. Nonymous.

    Anytime someone uses that argument you know with 99.44% certainty that they are an authoritarian toolbag.

    If they do it anonymously, you know 100% that they're trying to gaslight and manipulate you.

  • Correct. Privacy rights are not subject to any politician wanting to improve his profile by "saving" people from the latest scare.

    Hey, scarecrow-politicians: How about reining in child abuse instead of hiding pictures of child abuse so you can look good while abusers can keep operating in the dark?

    Do you want to raise unreported child abuse?
    Because that's how you raise unreported child abuse!

    The core issue is that we care so little about mental health and raising kids that we don't even have weekly classes on

  • by bradley13 ( 1118935 ) on Friday December 04, 2020 @10:59AM (#60793788) Homepage

    Putting backdoors into encryption would make the lives of the police a lot easier. Also installing cameras in every room of every house.

    The thing is: most of us are not criminals. Violating the privacy rights of the entire population, in order to make hunting criminals easier? No thanks, that's not how life should work.

    • If people wish to protect their rights, they have to be more careful about who they vote for; otherwise they are only enabling fascism themselves.

  • Wrong. (Score:4, Insightful)

    by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Friday December 04, 2020 @11:02AM (#60793808)

    A regular warrant gives authorities all the power they need to hack and view a suspect's electronic communications and data.

    Just like with phone tapping or reading somebody's mail.

    So just stop the sensationalism already. Nothing to see here, move along.

  • by ytene ( 4376651 ) on Friday December 04, 2020 @12:20PM (#60794160)
    One of the aspects of the perpetual fight against the online spread of "restricted materials" that I find hardest to understand, and most worrying, is that a lot of the legislation and activity seems to focus on the "end users" of the content.

    Let's put that to the test here. How many times have you seen or heard about someone being found with images relating, to quote from the title, to "Online Child Sexual Abuse"? The stories are commonplace: "Unsuspecting fool takes their PC to a high street repair store, only for the engineers to find and report the kiddie porn it contains". (See e.g. here [wikipedia.org].)

    Don't get me wrong: there is a correlation between someone who "uses" images of that nature and someone who is likely to commit such acts in future.

    But why is it that we don't seem to hear of cases in which the creators of this material - i.e. the actual abusers - are caught and punished? OK, in one sense there is a simple mathematical reason for this: once an image has been taken, it can be digitally duplicated and shared by tens, hundreds or thousands of individuals. So, statistically speaking, law enforcement would have a greater probability of locating an "end user" and not a "content creator".

    But Edward Snowden taught us just how incredibly pervasive the monitoring powers of Western nations [at least, the Five Eyes countries] have become. We also know [see e.g. the "Four Horsemen of the Infocalypse"] that crimes of this nature are considered so egregious and so likely to push the buttons of the general public as to grant lawmakers ever-more intrusive powers. So is it reasonable to assume - especially in light of what Snowden revealed - that governments remain incapable of tracing this sort of abuse to its source?


    We should absolutely do what we can to stamp out online child abuse. But why do we seem to be concentrating on the "last mile" of the problem and not the identification and prosecution of the content creators?

    Think of it as if you were the child: would you rather the government invested time and money in stopping people viewing the content in which you had been abused, or would you rather the government found and stopped the abusers in the first place?

    If you think about it like this, a lot of the focus being applied to this problem starts to look a bit misplaced...
    • by Qwertie ( 797303 ) on Friday December 04, 2020 @05:33PM (#60795414) Homepage

      In addition, if I understand U.S. law [justice.gov] correctly, there is no legal distinction between:

      1. Photos taken by a 16-year-old of himself masturbating
      2. Photos of a 50-year-old man sexually abusing a 3-year-old.

      If these things are legally identical, there is no incentive for police or prosecutors to prioritize stopping distribution in the second case. They could of course choose to do so, but the law does not seem to imply that any more resources should be devoted to stopping the glorification of child abuse than stopping recordings of normal sexual behavior. And I have the impression that the teen could be sent to prison for a minimum of 5 years for recording himself, even if he doesn't share the photos with anyone. And fictional child porn, such as pencil drawings, seems to be treated almost the same as actual child abuse imagery. [wikipedia.org]

      But what politician wants to be seen weakening child porn law? The only saving grace here is the de-facto lack of enforcement against teens criminally recording themselves.

    • If you created a database of school faces, you could match the faces of exploited kids and have a lead to rescue the child.
      • by ytene ( 4376651 )
        I would agree that this is the sort of step that we might expect law enforcement to push for, but I am not sure that it is the right solution in this particular case...

        For one thing, you are taking a huge trove of personal data about innocent people and giving it to law enforcement. We have so many examples of law enforcement abusing bulk data that I'm not sure where to begin laying out the reasons this would be a bad idea.

        Or we could point out that children, especially young children, undergo rapi
  • Won't somebody please think of the children?
  • Just assume they collect everything and act accordingly. Your encryption probably isn't effective either.

    • "Just assume they collect everything and act accordingly. You're encryption probably isn't effective either"

      Oh, it is, of course it is. Why do you think they come up with yet another "ban encryption" regulation scheme from time to time?

      • political theater

      • No direct knowledge of this either way.

        But could it be pure theater? Edward Snowden taught us that the NSA had critically compromised various encryption schemes by corrupting the pseudo-random number generator (PRNG) to be not-entirely-random and more-easily-guessable [by them] because they knew how it had been compromised.

        And if you've ever worked for a large corporation that allows you to access the web from your workstation, chances are that *all* your encrypted traffic is broken open and inspected
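        A toy illustration of the PRNG point above: if an attacker can predict the generator behind your keys or keystream (as the Dual_EC_DRBG affair suggested the NSA could), the encryption is effectively transparent to them. The seeded random.Random stream below is a deliberately insecure stand-in, not how any real protocol derives keys.

            # Toy demonstration only: a stream "cipher" keyed by a predictable PRNG.
            # Anyone who knows or can predict the seed recovers the plaintext.
            import random

            def keystream_xor(data, seed):
                prng = random.Random(seed)                  # predictable generator
                return bytes(b ^ prng.randrange(256) for b in data)

            secret = b"meet at the usual place"
            ciphertext = keystream_xor(secret, seed=1234)   # "encrypt"

            # An eavesdropper who can reproduce the PRNG output decrypts trivially:
            recovered = keystream_xor(ciphertext, seed=1234)
            assert recovered == secret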
  • Write the rule to explicitly allow the search for child pornography and the like.

    All done. Now ignore the simplicity and hold more debates on the rule.

  • by NotEmmanuelGoldstein ( 6423622 ) on Friday December 04, 2020 @06:36PM (#60795624)

    “I don’t hear anybody complaining that my spam filter reads my email” ...

    The spam filter isn't applying a presumption of guilt to the recipient. That's what all this scanning technology is doing, with the unspoken declaration that child abuse is so heinous that everyone should give up their rights for a poorly-defined 'greater good'. If this was fishing for music pirates (i.e. using Alexa/Siri/Echo/Bixby to detect unlicensed playback), people would be horrified that corporate profits excused a loss of privacy: corporate needs aren't sufficiently 'good' to invoke the slippery slope of using crime to prevent crime. The revulsion a normal person feels about sexual assault is a poor excuse for presuming guilt of the majority.

    In France, in 1997, 2004 and 2014, police demanded DNA samples from hundreds of males, all somewhat adjacent to a murder or rape: in all cases, the results were negative. There's a lot of trust in the police there (and surprise that DNA isn't a magic bullet). But it's a fine line from 'nothing to hide' to 'prove your innocence'.

"No matter where you go, there you are..." -- Buckaroo Banzai

Working...