Microsoft Closes Loophole That Created Taylor Swift Deepfakes (404media.co)
An anonymous reader shares a report: Microsoft has introduced more protections to Designer, an AI text-to-image generation tool that people were using to make nonconsensual sexual images of celebrities. Microsoft made the changes after 404 Media reported that the AI-generated nude images of Taylor Swift that went viral last week came from 4chan and a Telegram channel where people were using Designer to make AI-generated images of celebrities.
"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson told us in an email on Friday. "Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring and abuse detection to mitigate misuse of the system and help create a safer environment for users."
"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson told us in an email on Friday. "Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring and abuse detection to mitigate misuse of the system and help create a safer environment for users."
Yeah because (Score:2)
Everybody dreams of seeing Taylor screwed by a 50-year-old trash puppet. The future of GenAI will be porn.
Re:Yeah because (Score:4, Insightful)
The future of GenAI will be open source self-hosted models making porn.
FTFY
Re: (Score:2)
What do you mean "the future"? Like 90% of the user-created models on CivitAI already are. It's IMHO really annoying, even with filters. Even most of the ones that aren't are "This is just a normal model, except the main thing it outputs is its horny author's conception of his perfect dream girl, but clothed!"
Sooner or later, all these people relying on online services are going to discover that they can run StableDiffusion on their own computers, effectively for free, for as many images as they want, or
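For what it's worth, here is a minimal sketch of what "running it on your own computer" looks like, assuming you have the Hugging Face diffusers and torch packages installed and a GPU with enough VRAM; the checkpoint name is just one example of a publicly hosted model, not a recommendation.

```python
# Minimal local text-to-image sketch using the open-source diffusers library.
# Assumes: `pip install diffusers transformers accelerate torch` and a CUDA GPU
# (drop the .to("cuda") call and the float16 dtype to run, slowly, on CPU).
import torch
from diffusers import StableDiffusionPipeline

# Download the checkpoint once; after that everything runs offline on local hardware.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example public checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Generate as many images as you want; no account, no per-image fee.
image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```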
Re: Yeah because (Score:2)
She's not bad looking, but it's that she's famous and 4chan is trolling.
Re: (Score:1)
"Is she really supposed to be that great looking? I don't see it.
Is it that she is thin?"
You can make her as thin and as big-titted as you want.
Try it on https://getimg.ai/ [getimg.ai]
Re: Yeah because (Score:2)
Kind of spam, to be honest. The service you linked requires access to a Google account to use. There's no way to ensure it doesn't harvest data from its users to train its model, or invisibly stamp generated images with the Google ID.
Re: (Score:2)
"Kind of spam to be honest. The service you linked requires access to a Google account to use. No way to ensure it doesn't harvest data from its users to train its model or invisibly stamps generated images with the Google ID."
That's why people have dozens of Google accounts, created with a VPN+TOR.
Remote services. (Score:3)
If it's Microsoft's servers, and users need a Microsoft account to access this, then Taylor could sue to try to force Microsoft to disclose whose Microsoft accounts have been generating those images.
Not your computer, not your privacy.
Re: (Score:3, Funny)
No doubt she'll sue when the hullabaloo dies down and she needs to milk the deepfake for a bit of extra publicity.
Re: (Score:1)
She's a billionaire.
Re: (Score:2)
It's not like billionaires ever screw people over for money: Jeff Bezos didn't exploit employees, and Google doubled down on the "don't be evil" thing. /s There is nothing in my experience that says that (in general) once you are rich, you stop trying to get as much money out of people as you can.
Re: (Score:1)
She's going to sue MS over a glitch that lasted how long until it was killed? Her damages are what? The lawsuit will cost her how much money and time vs MS's $5000/hr lawyers? It'll take how many years to go through court?
If anything she'll use it in some PR stunt to sell more concert tickets.
Re: (Score:1)
Ok, and those people are effectively sue-proof because they're broke-ass 4chan clowns, half of them out of legal reach in other countries anyway. What's the point?
When you're 4chan clown #42, Taylor Swift sues you for defamation or whatever, you don't even bother showing up because you're broke, and the court hands her a $25m default judgment, so what?
Neither she nor anyone on her legal team would bother. She's a celebrity, shit happens.
Does Taylor even own her own image rights or did s (Score:2)
Does Taylor even own her own image rights, or did she give them away as part of one of her contracts?
Re:Does Taylor even own her own image rights or di (Score:4, Interesting)
Given that everything publicly known about how she (and her parents, who are apparently her business managers) runs her business interests indicates they're extremely sharp, I'd guess no, she didn't.
Was that wrong? (Score:4, Interesting)
While I assume Ms Swift has a specific and firm opinion on the matter... Is using AI tools to create nudes actually 'wrong'? I can imagine Taylor Swift nude or engaged in lewd behavior any time I want. If I were artistically inclined I might be able to create text, drawings, or animations of my imaginings. Would you outlaw pencil and paper?
It seems to me the public distribution is the issue, not the original act of creating the image.
Re:Was that wrong? (Score:5, Insightful)
Not only the distribution, but the implication that Swift was a willing participant in the acts depicted in the fake images.
Yes, you could create artwork depicting the same things, and we wouldn't really know if they were completely fabricated or posed.
But photographs are generally assumed as a literal snapshot of reality, ie. "pics or it didn't happen". So what happens when photographs are lying?
It's still a real problem for the actual person being depicted in the images.
Re: (Score:2, Interesting)
Not only the distribution, but the implication that Swift was a willing participant in the acts depicted in the fake images.
Yes, you could create artwork depicting the same things, and we wouldn't really know if they were completely fabricated or posed.
But photographs are generally assumed as a literal snapshot of reality, ie. "pics or it didn't happen". So what happens when photographs are lying?
It's still a real problem for the actual person being depicted in the images.
Yeah, but photographs have lied for a long time. Double-exposure, various tricks with the development process, etc...
Ad-laden site warning: https://www.historyextra.com/p... [historyextra.com]
It's still a real problem for the actual person being depicted in the images.
How would I even know if someone had a personal porn collection on their computer with my face swapped in? I wouldn't.
Distributing it publicly (or even semi-privately) while claiming it's real could be damaging to my reputation, so sure--lawsuit there...but I couldn't care less if some random chick or dude halfway across the world was
Re: (Score:2)
Oh hell, if you're remotely decent at Photoshop (or Affinity Photo)...even before any AI additions to it, you could create extremely realistic fakes of any photo you wanted.
Still can.
Re: (Score:2)
I think it's time that we just stop believing pictures. Really, if AI can generate them so easily, then their status as proof must start coming into question.
I know it makes it harder to gather proof, but that is the reality of life. On the plus side, if someone does release images of you, just say they are fakes, and your reputation should be clean if people no longer believe photos or videos.
Re: (Score:1)
In what universe? A photo is a picture of something. What that something is, or what it might mean is up to the viewer.
Literally no photographer would agree with that statement.
Re: Was that wrong? (Score:1)
It's certainly rape adjacent.
Re: (Score:2)
Err...not even close to physical-penetration adjacent.
Unless we're about to modify the definition of things in the dictionary... again.
Re: (Score:3)
I can imagine Taylor Swift nude or engaged in lewd behavior any time I want. If I were artistically inclined I might be able to create text, drawings, or animations of my imaginings. Would you outlaw pencil and paper?
Your imagination does not involve Microsoft, specifically the tools that Microsoft owns. That is the only part where MS has a say. They have a right to say that their tool, Designer, is not to be used in this way, according to the terms of service they have spelled out with their users. I would guess you could use MS Paint to achieve the same result, but there are no such terms of service when it comes to Paint.
Re: (Score:2)
So if I had an image of you, you'd be ok with me creating a fake nude image of you and posting it on the internet for all to see?
Re: (Score:2)
Only so long as the OP was shown tied up and spitroasted by two fat Russians. Because clearly they wouldn't have an issue with that depiction.
Re: (Score:1)
I think it depends on if he's depicted enjoying it or not.
Re: (Score:2)
So if I had an image of you, you'd be ok with me creating a fake nude image of you and posting it on the internet for all to see?
I believe that is classified as a war crime. Intentional blinding is not allowed.
Geneva Convention: "It is prohibited to employ laser beams of a nature to cause permanent blindness [serious damage] against the eyesight of persons as a method of warfare."
While nudes of me are clearly not a laser, they would have the same effect, so should fall under the same rule.
ChatGPT is a programming language, or adjacent (Score:2)
Microsoft is allegedly trying to limit ChatGPT's ability to do whatever you tell it to do. The problem here is that it's designed to be very flexible about doing what you tell it to do, and that is actually one of its main features. That flexibility can't be removed, or you won't have a ChatGPT.
They can't "fix" this. People will immediately come up with work-arounds.
Re: (Score:2)
If your prompt starts pulling in certain parts of the language tree, the GenAI can reject the prompt. If you simply draw a circle around entire topics and say it cannot comment on them, then it works. The problem is when they want it to be able to comment on a topic but only with certain responses, e.g., factually true statements or non-porn statements. Image generation is trickier, but I bet it can be done for famous people by checking the output image to see whether it looks like anyone recognizable.
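To make the parent's two-layer idea concrete, here is a rough sketch: a crude topic blocklist on the prompt, plus a check of the output image against a gallery of known faces. This is purely illustrative and assumes the open-source face_recognition package; the blocklist, the tolerance value, and the helper names are guesses at how such a filter could work, not how Designer actually does it.

```python
# Illustrative two-layer filter: reject blocked-topic prompts up front,
# then flag outputs that resemble anyone in a known-faces gallery.
# Assumes: `pip install face_recognition` (which pulls in dlib).
import face_recognition

BLOCKED_TERMS = {"nude", "naked", "nsfw"}  # hypothetical topic blocklist

def prompt_allowed(prompt: str) -> bool:
    """Layer 1: refuse prompts that pull in blocked topics."""
    return set(prompt.lower().split()).isdisjoint(BLOCKED_TERMS)

def looks_like_known_person(image_path: str, known_encodings, tolerance: float = 0.6) -> bool:
    """Layer 2: does the generated image resemble anyone in the gallery?"""
    image = face_recognition.load_image_file(image_path)
    for encoding in face_recognition.face_encodings(image):
        if any(face_recognition.compare_faces(known_encodings, encoding, tolerance)):
            return True
    return False

# Usage sketch: block the request if either layer trips.
# gallery = [face_recognition.face_encodings(face_recognition.load_image_file(p))[0]
#            for p in ("celebrity1.jpg", "celebrity2.jpg")]
# if not prompt_allowed(user_prompt) or looks_like_known_person("output.png", gallery):
#     reject_request()
```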
Here's the real question (Score:1)
If people don't see the issue with this and keep asking what's the big deal, they wouldn't have an issue with a picture showing the orange criminal bending his knee and kissing the hand of his Russian handler, would they?
Re: (Score:1)
I'd bet most normal people (aka voters) would have a very big problem with any unauthorized AI fakes.
Historically, pr0n always wins (Score:3)
As an aside, while the adult entertainment sector often gets reviled in public by puritanical lawmakers in the US (and UK), it's interesting to observe that the same group doesn't seem to be vocally opposed to the unfettered implementation of such technology for military purposes, which, it could be argued, is where actual lives are often destroyed senselessly by its unregulated use.
Then again, this may perfectly fit the profiles of a majority of those constituents who elected them into office and gave them a mandate: "guns & bibles".
This being the planet we live on, there probably isn't much that can be done about it.
crap (Score:2)
I really wanted nudes of current day Gary Busey
Closed A Loophole (Score:1)
Re: (Score:1)
Sorry, no, and no, you should not block it.
Look, it's quite simple: make enough of these things and people will get bored and move on; limit the ability and people will try to figure out how to do it.
"Loophole", my ass... (Score:2)
This is more like "full of holes". Generative AI and chat AI cannot be controlled. You can filter some specific things, but the basic problem remains. Another reason why the approach is a dead end.