AI-generated imagery and other forms of deepfakes depicting child sexual abuse (CSA) could be criminalized
Who's old enough to remember the long arguments we had about whether video-game violence caused real-life violence? Or (more on the nose) about whether the "rape" fantasies on porn sites caused real rapes? AFAIK there was never any scientific conclusion that games/fantasies lead to real-life crimes. My conclusions were (1) the evidence we have shows at most a weak correlation, with no evidence of causation, and (2) for video games the ratio of people who play violent games to people who commit violent crimes is extremely high, so banning them is unjustified. And personally, when I finally obtained access to porn in ~1995, it didn't make me have more real-life sex - in fact I had none at all for many, many years afterward.
So it's obvious that the groups supporting these policies hate pedophiles; it's much less obvious that they care about protecting children. Think about it: imagine the web disappears tomorrow and there's no more p0rn. Does this really make you less likely to seek out real-life sex? That's the theory needed to support a law like this, and I think it's exactly backwards. Pedophiles know perfectly well that it's wrong to [you know], but the human sex drive is powerful. I think many of them would accept a substitute if they could, yet laws and enforcement against fictional child p0rn have only gotten tighter over the years. Meanwhile, real-life children are no rarer than before.
Something else. If a 16-year-old wanks on camera, that's illegal production of child porn under typical laws (though curiously nobody seems to get prosecuted for it?). Likewise, two 16-year-olds having sex with each other is perfectly legal, but if they make a recording of it, that's a serious crime. I bring this up because while these two cases may be serious "child pornography" crimes, it would be quite a stretch to call them "CSAM". Yet that is precisely the label activist groups want applied. Two examples:
United States federal law defines child pornography as any visual depiction of sexually explicit conduct involving a minor [....] NCMEC chooses to refer to these images as Child Sexual Abuse Material (CSAM) to most accurately reflect what is depicted
While the term child pornography is still widely used by the public, it's more accurate to call it what it is: evidence of child sexual abuse. That's why RAINN and others have stopped using the term child pornography and switched to referring to it as CSAM - child sexual abuse materials.
While some of the pornography online depicts adults who have consented to be filmed, that's never the case when the images depict children. Just as kids can't legally consent to sex, they can't consent to having images of their abuse recorded and distributed. Every explicit photo or video of a kid is actually evidence that the child has been a victim of sexual abuse.
Nowhere does RAINN's article mention teenagers; it presents only a child-adult dichotomy. It does say "In about four out of 10 cases, there was more than one minor victim, ranging from two to 440 children", which makes it clear that "child" is being used as a synonym for "minor" and so includes teenagers.
Since activist groups encourage everyone to sed s/child porn(ography)?/CSAM/g, when Apple or Google talks about their "CSAM" detection system, this seems to actually be a system for detecting porn (or simple nudity, or medical pictures) involving minors, which they call CSAM because activists insist on it.
This is an example of a more general phenomenon I call "casting negative affect": using words to create negative feelings in the listener. For example, calling Martin Luther King a "criminal" because he was jailed 29 times, convicted of contempt of court, and convicted of disobeying a police order and fined $14. Likewise: suggesting that 16-year-olds (or an AI, or a hebephile with a box of pencils) can't make child porn, only Child Sex Abuse Material.