Inside the Deepfake Porn Economy (nbcnews.com)

The nonconsensual deepfake economy has remained largely out of sight, but it's easily accessible, and some creators can accept major credit cards. From a report: Digitally edited pornographic videos featuring the faces of hundreds of unconsenting women are attracting tens of millions of visitors on websites, one of which can be found at the top of Google search results. The people who create the videos charge as little as $5 to download thousands of clips featuring the faces of celebrities, and they accept payment via Visa, Mastercard and cryptocurrency. While such videos, often called deepfakes, have existed online for years, advances in artificial intelligence and the growing availability of the technology have made it easier -- and more lucrative -- to make nonconsensual sexually explicit material.

An NBC News review of two of the largest websites that host sexually explicit deepfake videos found that they were easily accessible through Google and that creators on the websites also used the online chat platform Discord to advertise videos for sale and the creation of custom videos. The deepfakes are created using AI software that can take an existing video and seamlessly replace one person's face with another's, even mirroring facial expressions. Some lighthearted deepfake videos of celebrities have gone viral, but the most common use is for sexually explicit videos. According to Sensity, an Amsterdam-based company that detects and monitors AI-developed synthetic media for industries like banking and fintech, 96% of deepfakes are sexually explicit and feature women who didn't consent to the creation of the content. Most deepfake videos are of female celebrities, but creators now also offer to make videos of anyone. A creator offered on Discord to make a 5-minute deepfake of a "personal girl," meaning anyone with fewer than 2 million Instagram followers, for $65.

  • as there is no stronger force in nature than male libido.

  • by Powercntrl ( 458442 ) on Thursday March 30, 2023 @03:27PM (#63412490) Homepage

    I can't fap to this. Not that I've gone down the paid side of the deepfake rabbit hole, but from what I've seen of "celeb fakes", they're clearly still in uncanny valley territory. It is also a pretty disturbing concept if you're talking about actors who were adolescents at the height of their film careers, and promotional photos from that era end up being part of the source material used to generate the fake. Depending on the interpretation of the law, deepfakes produced from images of underage celebrities could be considered CSAM at a federal level. [govinfo.gov]

    • they're clearly still in uncanny valley territory

      I can't fap to this.

      Ahhh padawan, you'll get there eventually. I recommend approaching the problem from both sides. Start by fapping to Hentai comics, then move on to 3D animations of video game characters, furries, Disney characters, and eventually you can meet both sides of the realism spectrum right in the middle, at deepfakes.

      It's amazing what people will fap to. You may not be into it, but I guarantee that a LOT of people are.

  • by Miles_O'Toole ( 5152533 ) on Thursday March 30, 2023 @03:37PM (#63412504)

    The world needs a sexually explicit deepfake of Biden and Trump engaging in wet, slippery, enthusiastic gay sex, with top members of their governments joining in part way through to bring it all home.

    This is how the world ends...not with a whimper, but with a gang bang.

  • I mean, I'm posting on /.. I'm clearly a Dog.

    That said, while I'm not sure we can dismiss the harm caused by even fake porn circulating of someone, at a certain point society at large will have to discount any and all images and even video of someone performing in pornography, since there will be so much fake porn.

    So it's possible that the societal harm might just go away. That said, people are *mean*. Especially the kind of people uptight about sex and sexuality. So I'm equally skeptical that'll happen.
    • Re: (Score:2, Interesting)

      by Baron_Yam ( 643147 )

      >it's possible that the societal harm might just go away. That said, people are *mean*. Especially the kind of people uptight about sex and sexuality.

      Anyone who complains about the existence of porn they're not being forced to watch should immediately be the subject of a deepfake to teach them what it's like to be on the receiving end of their bullshit.

      You might have to go a few rounds until you get a video that hits them where they're most sensitive, but eventually they'll learn to STFU.

      Except kids, of course.

      • it might make you feel better, but reddit has an entire forum about "Face Eating Leopards". Nobody ever believes it'll happen to them and they act accordingly.
        • It doesn't work on the group, but it works on the individual.

          As you say, they don't believe it until it happens to them... so you make it happen to them. Force them to develop some empathy.

      • I often disagree with rsilvergun but I think you've totally misconstrued his point here. For many people, there would be a very deep sense of violation if such porn were made from their images. I'm not sure how I would feel if somebody made a deepfake porn using my images. But I'm old and not very beautiful so it's not very likely anyway. Many people also feel a deep sense of violation when their own sexually explicit images/videos get shared more widely than they have given consent. You will hear many
      • by AmiMoJo ( 196126 )

        Involuntary pornography is already illegal in many places, e.g. "up-skirt" photos, hidden cameras and so on.

        The only new part is that the images are synthetic, although not entirely. The AI requires training material, so the creator has to acquire photos of the victim. That means there is potential copyright infringement, and using images to create pornography without permission is also a crime in some jurisdictions.

        You are probably headed to jail if you start making money by creating (fake) porn of people

      • No one should be forced to appear in porn without their full consent either. That's the point.
    • Hell, if I were a certain right wing think tank trying to privatize the American education system, posting deep fakes of teachers to get them fired would be a nice side business.

      You're way overthinking this. Blackmail deepfakes leave too much potential for creating an evidence trail. Much easier to just plant a banned book, like the cops do with drugs.

      "Now let's all be good team players or we might find (air quotes) a copy of Gender Queer in your classroom."

      Oh hell, now I'm overthinking it too. I've known a few people who work in education in Florida. In so-called right-to-work states, they just let you go and don't really need a good reason. Too slow getting your kids locked d

    • by HBI ( 10338492 )
      I actually made an account just to write this response: I wonder if any of these deepfake porn makers would make a distasteful video of someone. You know, open sores, chancres, deformed nipples, outrageous body hair, amputee stumps, etc. Like something from Xerxes' harem in 300. I know a lot of people would probably pay to have these made of their exes.
  • I really wonder who is willing to spend money or effort to montage certain faces onto porn actors -- simply because faces are such an insignificant (and often not even in-frame) part of pornographic videos. If AI were used to manipulate what the genitals of porn actors look like in a movie, now that would be a somewhat more plausible investment.
  • Also, according to the prior entry on the same bathroom wall, Jenny gives good head. As they used to say back in my day.

    Not sure how that kind of trash talk is any more or any less of an insult than faking someone's face onto a porn video.

    Or mentally undressing a woman.

    Or having a sex dream.

    Maybe the sleazy part is doing it in a way that's open for all to see, instead of confining your perversions between your ears where they belong.

    Who the fuck knows. The liberal media occasionally inform me that making pr0n is somehow empowering. Is it still a violation if you've been involuntarily empowered?

    • "The liberal media occasionally inform me that making pr0n is somehow empowering. Is it still a violation if you've been involuntarily empowered?"

      I'm a liberal. Let me help you out with your apparent dilemma. Scenario One: Let's say you go out to a gay bar, and pick a guy up, and you both decide to make a video of your mutually enjoyable encounter and distribute it to an appreciative audience. That could very well be empowering.

      Now, let's consider another scenario. You have been unjustly imprisoned

  • by John.Banister ( 1291556 ) * on Thursday March 30, 2023 @07:08PM (#63412900) Homepage
    Have AIs generate all deepfake porn on the fly. Then when the customer stops watching it, it vanishes as never was, and there's no recording for anyone to get upset about.
    • by AmiMoJo ( 196126 )

      The law mostly doesn't work that way, which is why you can't avoid child porn charges by deleting the images after you get off, or using your video card to render a 3D image that goes away when you close the app.

      • The law has enough to do going after the people who record and post images like that to intentionally make someone else unhappy. Playing thought police for AI augmented thoughts may be within the purview of the law, but there's no budget for that.
  • by joe_frisch ( 1366229 ) on Thursday March 30, 2023 @09:36PM (#63413116)
    Deepfakes won't disappear just because they are illegal -- that never works -- they will just be rare. So what deepfakes exist will easily be confused with real images. But if deepfake porn is common, then having someone's face in a porn scene will become meaningless, like drawing a mustache on a poster. If there are real non-consensual video releases, the victims can just claim they are fakes, and likely be believed.

    Professional actors in porn will have the same issue as all professional actors: deepfakes will become a drain on their income, like counterfeit goods on real manufacturers. That issue isn't in any way specific to porn.

    Similarly fully faked porn videos might put a lot of sex workers out of business, but again that is a potential impact of AI that is much broader
    • I doubt the deepfakes will have much negative effect on porn actors' income. In fact, it might even go up. People don't care if things are real. They care about suspension of disbelief. Even if the whole thing is a deepfake, having a known person supposedly in the movie makes it more appealing. Now the actors don't have to waste time actually filming movies. They can just promote a whole library of deepfakes that look like the promoter.

      See OnlyFans as an example. The "creators" put up a few pictures a
