Giant AI Platform Introduces 'Bounties' For Deepfakes of Real People (404media.co)

An anonymous reader quotes a report from 404 Media: Civitai, an online marketplace for sharing AI models that enables the creation of nonconsensual sexual images of real people, has introduced a new feature that allows users to post "bounties." These bounties allow users to ask the Civitai community to create AI models that generate images of specific styles, compositions, or specific real people, and reward the best AI model that does so with a virtual currency users can buy with real money. As is common on the site, many of the bounties posted to Civitai since the feature was launched are focused on recreating the likeness of celebrities and social media influencers, almost exclusively women. But 404 Media has seen at least one bounty for a private person who has no significant public online presence.

"I am very afraid of what this can become; for years I have been facing problems with the misuse of my image, and this has certainly never crossed my mind," Michele Alves, an Instagram influencer who has a bounty on Civitai, told 404 Media. "I don't know what measures I could take, since the internet seems like a place out of control. The only thing I think about is how it could affect me mentally, because this is beyond hurtful." The news shows how increasingly easy-to-use text-to-image AI tools, the ability to easily create AI models of specific people, and a platform that monetizes the production of nonconsensual sexual images are making it possible to generate nonconsensual images of anyone, not just celebrities.

The bounty for a real person that 404 Media saw on Civitai did not include a name, but included a handful of images taken from her social media accounts. 404 Media was able to find this person's online accounts and confirm she was not a celebrity or social media influencer, but just a regular person with personal social media accounts and few followers. The person who posted the bounty claimed that the woman he wanted an AI model of was his wife, though her Facebook account said she was single. Other Civitai users also weren't buying that explanation. Despite those suspicions, someone did complete the bounty and created an AI model of the woman that any Civitai user can now download. Several non-sexual AI-generated images of her have been posted to the site.

This discussion has been archived. No new comments can be posted.

  • I think this is how artificial economies are started!
  • AI Tech Bros are fucking creeps? Never!

  • by FeelGood314 ( 2516288 ) on Monday November 13, 2023 @06:03PM (#64003437)
    Now we just have evidence of what they were thinking. However, we as a society need to get over making people feel ashamed if they are seen naked. The shame, or fear of being shamed, is a bigger problem than the pictures or videos themselves. If a picture of you naked is leaked or manufactured, or you consented to be in an X-rated movie, it shouldn't affect your standing in society in any way.
    We aren't going to defeat deepfakes, any more than we're going to stop people from imagining and fantasizing about others. What we can control is how we react.
    • It also creates plausible deniability, if real images leak it can be claimed they are just fakes. But really this is just the tip of the iceberg being shoved into public discourse, on all topics we will be asked to reject the evidence of our eyes and ears. Welcome to the disinformation age.
    • Now we just have evidence of what they were thinking. However, we as a society need to get over making people feel ashamed if they are seen naked. The shame, or fear of being shamed, is a bigger problem than the pictures or videos themselves. If a picture of you naked is leaked or manufactured, or you consented to be in an X-rated movie, it shouldn't affect your standing in society in any way.

      We aren't going to defeat deepfakes, any more than we're going to stop people from imagining and fantasizing about others. What we can control is how we react.

      So... your plan is to change human nature?

      • I think you're confusing "human nature" with "cultural norms". Cultural expectations change all the time. There was a time when women couldn't reveal their ankles without being shamed (and there are present-day cultures where that is still the case). There are also present-day cultures (and sub-cultures) where being naked is no big deal, or even expected. Even conservative western culture is fine with young children being naked up to a certain age. For children "human nature" is often a desire to shed their clothes. They aren't old enough to have learnt to be ashamed of their bodies.

        • I think you're confusing "human nature" with "cultural norms". Cultural expectations change all the time. There was a time when women couldn't reveal their ankles without being shamed (and there are present-day cultures where that is still the case). There are also present-day cultures (and sub-cultures) where being naked is no big deal, or even expected. Even conservative western culture is fine with young children being naked up to a certain age. For children "human nature" is often a desire to shed their clothes. They aren't old enough to have learnt to be ashamed of their bodies.

          Since the general trend in western culture is towards less covering (shorter shorts, crop tops) and less body shaming, it's certainly not outside of all possibility that eventually "clothing optional" becomes acceptable or even the norm (at least in hot climates).

          Not to say we should all be OK with deepfakes, but I think there will come a time when everything embarrassing or pornographic is considered fake "by default". At that point it'll be like email spam; most people won't even bother to look at it.

          If you take this through to its logical conclusion, the people who should be most worried about this are porn stars. It's not going to help their industry if people are choosing fakes over the real thing. In fact, it's possible all future porn will just be people in body suits recording poses for AI replacement.

          Cultural norms only change if it's normalized, meaning most people are affected. But making deepfaked nudes of another person is $#@ed up, as is looking at them (particularly if it's a non-celebrity). Since violations will always be rare and sporadic they'll never be normalized and will always feel like a huge violation.

          • The only cultural norm we have in relation to private fantasies is that of “don’t ask, don’t tell” and that will not change any time soon. As long as personally-generated deepfakes remain private, they are not messed up but are instead completely normal.

            People online fantasise over non-celebrities all the time, imagining in their mind's eye what they could do in their idealised fantasy version of real life. Literotica is proof of this, with short stories based upon real-life crushes.
            • The only cultural norm we have in relation to private fantasies is that of “don’t ask, don’t tell” and that will not change any time soon. As long as personally-generated deepfakes remain private, they are not messed up but are instead completely normal.

              Fantasizing about your crush is fine.

              Deepfaking a naked picture of them is an invasion of privacy and deeply messed up.

              • It's no different conceptually to someone taking a photo of a glamour/nude model and sticking a photo of their crush's head on it. That's something people (especially girls) used to do back when Polaroids were the in thing. As long as the result is never published for the general public to use in any way, and provided that the images were legally obtained and obtaining the original images did not itself constitute any kind of privacy breach, there's no issue with people doing that.

                Can we find such
    • by AmiMoJo ( 196126 )

      People get fired for doing sex work like OnlyFans or porn. People whose families are particularly prudish can get ostracised.

      Ask LGBTQ folk about the risks of coming out. In theory there's nothing wrong or embarrassing about being gay or trans, in practice it often leads to very negative consequences.

      Deep fakes required a source image, so you could easily debunk them by finding that source. With AI-generated images there is no source image, and as AI gets better they are going to get increasingly hard to debunk.

  • Civitai has models ranging from SFW to NSFW (and by default hides NSFW content).

    Next up, Walmart sells art supplies which enables the creation of nonconsensual sexual images of real people.
    • by ceoyoyo ( 59147 )

      They also seem to have introduced a feature for bounties for fine-tuned models and a few users have posted bounties for generators of likenesses of real people. Not quite what the headline says.

    • by Plugh ( 27537 )
      Bingo. I also wonder if these people know that there are at least half a dozen fully open source AI models that are in fact optimized for pr0n [betterwaifu.com].

      Download A1111 [github.com] and have fun!

    • Civitai has models ranging from SFW to NSFW (and by default hides NSFW content).

      Next up, Walmart sells art supplies which enables the creation of nonconsensual sexual images of real people.

      And if Walmart had a bulletin board next to those art supplies where folks could post requests for specific paintings, and those requests included hyper-realistic nude paintings of specific real people, and if Walmart did nothing to take down those specific requests, then you'd have a good point.

      And Walmart would have a huge scandal on its hands.

      If you create a marketplace you have some responsibility for the goods traded in that marketplace. This probably isn't as illegal as if they were trading meth in th

  • The concept of trading in human image rights, especially in the context of AI and platforms like Civitai, raises profound ethical and legal questions. At its core, it challenges the fundamental notion of personal autonomy and consent. When someone's likeness can be digitally replicated and manipulated without their permission, it undermines their control over their own image, an aspect deeply tied to personal identity and dignity.

    This issue becomes even more complex with the introduction of financial incentives.
    • And I mean that quite literally, if it's too much legal hassle here, they'll do it elsewhere where it isn't.

    • by Anonymous Coward

      >control over "their" own image

      control over imaginary property was always an illusion

      maybe we're better off with the indulgence, but it's always been a game of pretend that we indulged

      even the greatest wealths and corpos with the greatest threats of violence have struggled to approximate control, the only real control over a contagion is quarantine

      we'll keep chasing it if it's better for society (or at least IP holders) but then what i "own" is something hand-waved, not fundamental

  • ...are ubiquitous, there's no longer a need to share the tools' products.
    • Right... because your friend that can't figure out how to connect their laptop to their printer wouldn't have any problem generating their own deep fakes of Margot Robbie giving head?
  • What?!? (Score:4, Funny)

    by Locke2005 ( 849178 ) on Monday November 13, 2023 @09:34PM (#64003817)
    They're making sexually explicit deep fake images of famous celebrities? THAT'S DISGUSTING!

    So... where would I find those images?

  • I remember a time on /. where there were regular comments/jokes about when the Olsen twins would turn 18 and be allowed to do pornos or something.

    If this service had been around back then....

    I'm not actually sure if those images would be legal or not (not really something I'm comfortable researching on google).

  • Perhaps don't be an Instagram influencer.
  • Can you imagine what they would do if they saw you in person!? Most will probably run and hide.

  • I mean, at the point that deepfakes become perfect, then there's NO POINT in having shame for your early years on Onlyfans.

    "I'm afraid we can't give you the teaching job as we have seen your impressive Onlyfans account"

    "Oh that? Yeah, I've known about it but I can't get them to take it down, it's all deepfakes from a bunch of pics/video someone stole from my ancient facebook account. Not me."

    Not sure why women are crying over this.

    I think it's a little bit funny that women *seem* to want to be treated equally.
