Giant AI Platform Introduces 'Bounties' For Deepfakes of Real People (404media.co) 28
An anonymous reader quotes a report from 404 Media: Civitai, an online marketplace for sharing AI models that enables the creation of nonconsensual sexual images of real people, has introduced a new feature that allows users to post "bounties." These bounties allow users to ask the Civitai community to create AI models that generate images of specific styles, compositions, or real people, and to reward the best AI model that does so with a virtual currency users can buy with real money. As is common on the site, many of the bounties posted to Civitai since the feature was launched are focused on recreating the likeness of celebrities and social media influencers, almost exclusively women. But 404 Media has seen at least one bounty for a private person who has no significant public online presence.
"I am very afraid of what this can become, for years I have been facing problems with the misuse of my image and this has certainly never crossed my mind," Michele Alves, an Instagram influencer who has a bounty on Civitai, told 404 Media. "I don't know what measures I could take, since the internet seems like a place out of control. The only thing I think about is how it could affect me mentally because this is beyond hurtful." The news shows how increasingly easy to use text-to-image AI tools, the ability to easily create AI models of specific people, and a platform that monetizes the production of nonconsensual sexual images is making it possible to generate nonconsensual images of anyone, not just celebrities.
The bounty for a real person that 404 Media saw on Civitai did not include a name, but did include a handful of images taken from her social media accounts. 404 Media was able to find this person's online accounts and confirm she was not a celebrity or social media influencer, but just a regular person with personal social media accounts with few followers. The person who posted the bounty claimed that the woman he wanted an AI model of was his wife, though her Facebook account said she was single. Other Civitai users also weren't buying that explanation. Despite suspicions from these users, someone did complete the bounty and created an AI model of the woman that any Civitai user can now download. Several non-sexual AI-generated images of her have been posted to the site.
"I am very afraid of what this can become, for years I have been facing problems with the misuse of my image and this has certainly never crossed my mind," Michele Alves, an Instagram influencer who has a bounty on Civitai, told 404 Media. "I don't know what measures I could take, since the internet seems like a place out of control. The only thing I think about is how it could affect me mentally because this is beyond hurtful." The news shows how increasingly easy to use text-to-image AI tools, the ability to easily create AI models of specific people, and a platform that monetizes the production of nonconsensual sexual images is making it possible to generate nonconsensual images of anyone, not just celebrities.
The bounty of a real person that 404 Media saw on Civitai did not include a name, and included a handful of images that were taken from her social media accounts. 404 Media was able to find this person's online accounts and confirm they were not a celebrity or social media influencer, but just a regular person with personal social media accounts with few followers. The person who posted the bounty claimed that the woman he wanted an AI model of was his wife, though her Facebook account said she was single. Other Civitai users also weren't buying that explanation. Despite suspicions from these users, someone did complete the bounty and created an AI model of the woman that now any Civiai user can download. Several non-sexual AI generated images of her have been posted to the site.
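To illustrate the "increasingly easy-to-use" point above: a minimal, hypothetical sketch of the text-to-image workflow using the open-source diffusers library. The checkpoint name and API calls follow diffusers' public documentation and are not specific to Civitai.

```python
# Minimal text-to-image sketch using Hugging Face's diffusers library.
# Checkpoint name and calls follow the public diffusers docs; nothing
# here is Civitai-specific.
import torch
from diffusers import StableDiffusionPipeline

# Downloads a publicly hosted Stable Diffusion checkpoint (several GB).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs
).to("cuda")

# One line of text in, one image out -- this is the entire workflow.
image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```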
Artificial Economy (Score:2)
What!? (Score:2)
AI Tech Bros are fucking creeps? Never!
People have always been able to imagine you naked (Score:5, Interesting)
We aren't going to defeat deepfakes, just as we aren't going to stop people from imagining and fantasizing about others. What we can control is how we react.
Re: (Score:2)
Re: (Score:3)
Now we just have evidence of what they were thinking. However, we as a society need to get over making people feel ashamed if they are seen naked. The shame, or fear of being shamed, is a bigger problem than the pictures or videos themselves. If a picture of you naked is leaked or manufactured, or you consented to be in an X-rated movie, it shouldn't affect your standing in society in any way.
We aren't going to defeat deepfakes, just as we aren't going to stop people from imagining and fantasizing about others. What we can control is how we react.
So... your plan is to change human nature?
Re: (Score:3)
I think you're confusing "human nature" with "cultural norms". Cultural expectations change all the time. There was a time when women couldn't reveal their ankles without being shamed (and there are present-day cultures where that is still the case). There are also present-day cultures (and sub-cultures) where being naked is no big deal, or even expected. Even conservative western culture is fine with young children being naked up to a certain age. For children "human nature" is often a desire to shed their clothes.
Re: (Score:2)
I think you're confusing "human nature" with "cultural norms". Cultural expectations change all the time. There was a time when women couldn't reveal their ankles without being shamed (and there are present-day cultures where that is still the case). There are also present-day cultures (and sub-cultures) where being naked is no big deal, or even expected. Even conservative western culture is fine with young children being naked up to a certain age. For children "human nature" is often a desire to shed their clothes. They aren't old enough to have learnt to be ashamed of their bodies.
Since the general trend in western culture is towards less covering (shorter shorts, crop tops) and less body shaming, it's certainly not outside of all possibility that eventually "clothing optional" becomes acceptable or even the norm (at least in hot climates).
Not to say we should all be OK with deepfakes, but I think there will come a time when everything embarrassing or pornographic is considered fake "by default". At that point it'll be like email spam: most people won't even bother to look at it.
If you take this through to its logical conclusion, the people who should be most worried about this are porn stars. It's not going to help their industry if people are choosing fakes over the real thing. In fact it's possible all future porn will just be people in body suits recording poses for AI replacement.
Cultural norms only change if a behavior is normalized, meaning most people are affected. But making deepfaked nudes of another person is $#@ed up, as is looking at them (particularly if it's a non-celebrity). Since violations will always be rare and sporadic, they'll never be normalized and will always feel like a huge violation.
Internet cultural norms (Score:2)
People online fantasise over non-celebrities all the time, imagining in their mind's eye what they could do in their idealised fantasy version of real life. Literotica is proof of this, with short stories based upon real-life crushes.
Re: (Score:2)
The only cultural norm we have in relation to private fantasies is that of “don’t ask, don’t tell” and that will not change any time soon. As long as personally-generated deepfakes remain private, they are not messed up but are instead completely normal.
Fantasizing about your crush is fine.
Deepfaking a naked picture of them is an invasion of privacy and deeply messed up.
Creepy but not an invasion of privacy (Score:2)
Can we find such
Re: (Score:2)
People get fired for doing sex work like OnlyFans or porn. People whose families are particularly prudish can get ostracised.
Ask LGBTQ folk about the risks of coming out. In theory there's nothing wrong or embarrassing about being gay or trans, in practice it often leads to very negative consequences.
Deep fakes required a source image, so you could easily debunk them by finding that source. With AI-generated images there is no source image, and as AI gets better they are going to get increasingly hard to debunk.
Disingenuous article (Score:2, Troll)
Next up, Walmart sells art supplies which enables the creation of nonconsensual sexual images of real people.
Re: (Score:2)
They also seem to have introduced a bounty feature for fine-tuned models, and a few users have posted bounties for generators of likenesses of real people. Not quite what the headline says.
Re: (Score:1)
Download A1111 [github.com] and have fun!
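For context on what "have fun" means in practice: a hedged sketch of driving a locally running A1111 instance over its HTTP API. This assumes the webui was started with the --api flag; the endpoint and field names follow the project's published API and may change between versions.

```python
# Hypothetical call against a local AUTOMATIC1111 webui started with --api.
import base64
import requests

payload = {
    "prompt": "a watercolor painting of a lighthouse at dusk",
    "steps": 20,
    "width": 512,
    "height": 512,
}

resp = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
resp.raise_for_status()

# The response carries generated images as base64-encoded PNGs.
for i, img_b64 in enumerate(resp.json()["images"]):
    with open(f"txt2img_{i}.png", "wb") as f:
        f.write(base64.b64decode(img_b64))
```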
Re: (Score:3)
Civitai has models ranging from SFW to NSFW (and by default hides NSFW content).
Next up, Walmart sells art supplies which enables the creation of nonconsensual sexual images of real people.
And if Walmart had a bulletin board next to those art supplies where folks could post requests for specific paintings, and those requests included hyper-realistic nude paintings of specific real people, and if Walmart did nothing to take down those specific requests, then you'd have a good point.
And Walmart would have a huge scandal on its hands.
If you create a marketplace you have some responsibility for the goods traded in that marketplace. This probably isn't as illegal as if they were trading meth in the marketplace.
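On the SFW/NSFW filtering point above: Civitai also exposes a public REST API for browsing models, including an NSFW filter. A sketch, assuming the documented /api/v1/models endpoint and its parameter names, which may change:

```python
# Hypothetical query against Civitai's documented public model API,
# asking for SFW-tagged results only; parameter and field names are
# assumptions based on the public API docs.
import requests

resp = requests.get(
    "https://civitai.com/api/v1/models",
    params={"limit": 5, "nsfw": "false"},  # hide NSFW-tagged models
    timeout=30,
)
resp.raise_for_status()

for model in resp.json()["items"]:
    print(model["name"], model["type"])
```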
A Slippery Slope in the Digital Rights Arena (Score:2)
This issue becomes even more complex with the introduction of financial incentives.
First world problem (Score:2)
And I mean that quite literally: if it's too much legal hassle here, they'll do it elsewhere, where it isn't.
Re: (Score:1)
>control over "their" own image
control over imaginary property was always an illusion
maybe we're better off with the indulgence, but it's always been a game of pretend that we indulged
even the greatest wealths and corpos with the greatest threats of violence have struggled to approximate control, the only real control over a contagion is quarantine
we'll keep chasing it if it's better for society (or at least IP holders) but then what i "own" is something hand-waved, not fundamental
Once the tools to create deepfakes... (Score:2)
Re: (Score:2)
What?!? (Score:4, Funny)
So... where would I find those images?
Age (Score:2)
I remember a time on /. when there were regular comments/jokes about when the Olsen twins would turn 18 and be allowed to do pornos or something.
If this service had been around back then....
I'm not actually sure if those images would be legal or not (not really something I'm comfortable researching on Google).
Here's a thought (Score:1)
They are having sex with you virtually (Score:2)
Can you imagine what they would do if they saw you in person!? Most will probably run and hide.
This should be a godsend for modern women (Score:2)
I mean, at the point that deepfakes become perfect, then there's NO POINT in having shame for your early years on Onlyfans.
"I'm afraid we can't give you the teaching job as we have seen your impressive Onlyfans account"
"Oh that? Yeah, I've known about it but I can't get them to take it down, it's all deepfakes from a bunch of pics/video someone stole from my ancient facebook account. Not me."
Not sure why women are crying over this.
I think it's a little bit funny that women *seem* to want to be treated equally.