Microsoft AI

Microsoft Closes Loophole That Created Taylor Swift Deepfakes (404media.co)

An anonymous reader shares a report: Microsoft has introduced more protections to Designer, an AI text-to-image generation tool that people were using to make nonconsensual sexual images of celebrities. Microsoft made the changes after 404 Media reported that the AI-generated nude images of Taylor Swift that went viral last week came from 4chan and a Telegram channel where people were using Designer to make AI-generated images of celebrities.

"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson told us in an email on Friday. "Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring and abuse detection to mitigate misuse of the system and help create a safer environment for users."


Comments Filter:
  • Everybody dreams of seeing Taylor screwed by a 50-year-old trash puppet. The future of GenAI will be porn.

    • Re:Yeah because (Score:4, Insightful)

      by Plugh ( 27537 ) on Monday January 29, 2024 @11:59AM (#64197700) Homepage

      The future of GenAI will be open source self-hosted models making porn.

      FTFY

    • by Rei ( 128717 )

      What do you mean "the future"? Like 90% of the user-created models on CivitAI already are. It's IMHO really annoying, even with filters. Even most of the ones that aren't are "This is just a normal model, except the main thing it outputs is its horny author's conception of his perfect dream girl, but clothed!"

      Sooner or later, all these people relying on online services are going to discover that they can run Stable Diffusion on their own computers, effectively for free, for as many images as they want, or

  • by Dwedit ( 232252 ) on Monday January 29, 2024 @12:11PM (#64197754) Homepage

    If it's Microsoft's servers, and users need a Microsoft account to access this, then Taylor could sue to try to force Microsoft to disclose whose Microsoft accounts have been generating those images.

    Not your computer, not your privacy.

    • Re: (Score:3, Funny)

      No doubt she'll sue when the hullabaloo dies down and she needs to milk the deepfake for a bit of extra publicity.

      • She's a billionaire.

          • It's not like billionaires ever screw people over for money: Jeff Bezos didn't exploit employees, Google doubled down on the "don't be evil" thing. /s There is nothing in my experience that says (in general) once you are rich, you stop trying to get as much money out of people as you can.

          • She's going to sue MS over a glitch that lasted how long until it was killed? Her damages are what? The lawsuit will cost her how much money and time vs MS's $5000/hr lawyers? It'll take how many years to go through court?

            If anything she'll use it in some PR stunt to sell more concert tickets.

            • by Lehk228 ( 705449 )
              Suing MS would be silly; she would sue John Doe users #1-99, then subpoena MS and 4chan for user information, which is how just about any lawsuit involving anonymous internet users starts.
              • Ok, and those people are effectively sue-proof because they're broke-ass 4chan clowns, half of them out of legal reach in other countries anyway. What's the point?

                When you're 4chan clown #42 and Taylor Swift sues you for defamation or whatever, and you don't even bother showing up because you're broke, and the court awards her a $25M default judgment, so what?

                Neither she nor anyone on her legal team would bother. She's a celebrity; shit happens.

    • Welp, after you're done going after the Swift fakes, you have 20-25 years of Photoshop fakes to hunt down.
    • That would require Microsoft to have built in adequate audit logs. I am pretty sure that has not been done yet and will need to be retrofitted as an afterthought. And that is why they need risk management and we need regulation.
    • On what basis? Suing people for using a publicly available photo of a public figure and creating a parody? Sorry, Charlie, that's protected speech. See People v. Larry Flynt.
  • Does Taylor even own her own image rights, or did she give them away as part of one of her contracts?

  • Was that wrong? (Score:4, Interesting)

    by Baron_Yam ( 643147 ) on Monday January 29, 2024 @12:17PM (#64197784)

    While I assume Ms Swift has a specific and firm opinion on the matter... Is using AI tools to create nudes actually 'wrong'? I can imagine Taylor Swift nude or engaged in lewd behavior any time I want. If I were artistically inclined I might be able to create text, drawings, or animations of my imaginings. Would you outlaw pencil and paper?

    It seems to me the public distribution is the issue, not the original act of creating the image.

    • Re:Was that wrong? (Score:5, Insightful)

      by The Faywood Assassin ( 542375 ) <benyjr@@@yahoo...ca> on Monday January 29, 2024 @12:25PM (#64197814) Homepage

      Not only the distribution, but the implication that Swift was a willing participant in the acts depicted in the fake images.

      Yes, you could create artwork depicting the same things, and we wouldn't really know if they were completely fabricated or posed.

      But photographs are generally assumed to be a literal snapshot of reality, i.e. "pics or it didn't happen". So what happens when photographs lie?

      It's still a real problem for the actual person being depicted in the images.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        Not only the distribution, but the implication that Swift was a willing participant in the acts depicted in the fake images.

        Yes, you could create artwork depicting the same things, and we wouldn't really know if they were completely fabricated or posed.

        But photographs are generally assumed to be a literal snapshot of reality, i.e. "pics or it didn't happen". So what happens when photographs lie?

        It's still a real problem for the actual person being depicted in the images.

        Yeah, but photographs have lied for a long time. Double-exposure, various tricks with the development process, etc...
        Ad-laden site warning: https://www.historyextra.com/p... [historyextra.com]

        It's still a real problem for the actual person being depicted in the images.

        How would I even know if someone had a personal porn collection on their computer with my face swapped in? I wouldn't.
        Distributing it publicly (or even semi-privately) while claiming it's real could be damaging to my reputation, so sure--lawsuit there...but I couldn't care less if some random chick or dude halfway across the world was

        • Yeah, but photographs have lied for a long time. Double-exposure, various tricks with the development process, etc...

          Oh hell, if you're remotely decent at Photoshop (or Affinity Photo)...even before any AI additions to it, you could create extremely realistic fakes of any photo you wanted.

          Still can.

      • I think it's time that we just stop believing pictures. Really, if AI can generate them so easily, then their status as proof must come into question.

        I know that makes it harder to gather proof; however, that is the reality of life. On the plus side, if someone does release images of you, just say they are fakes; your reputation should stay clean if people no longer believe photos or videos.

      • "But photographs are generally assumed as a literal snapshot of reality, i"

        In what universe? A photo is a picture of something. What that something is, or what it might mean is up to the viewer.

        Literally no photographer would agree with that statement.
    • It's certainly rape adjacent.

      • It's certainly rape adjacent.

        Err... not even close to "physical penetration" adjacent.

        Unless we're about to modify the definition of things in the dictionary... again.

    • I can imagine Taylor Swift nude or engaged in lewd behavior any time I want. If I were artistically inclined I might be able to create text, drawings, or animations of my imaginings. Would you outlaw pencil and paper?

      Your imagination does not involve Microsoft, specifically the tools that Microsoft owns. That is the only part where MS has a say. They have a right to say that their tool, Designer, is not to be used in this way, according to the terms of service they have spelled out with their users. I would guess you could use MS Paint to achieve the same result, but those terms of service do not apply to Paint.

    • So if I had an image of you, you'd be ok with me creating a fake nude image of you and posting it on the internet for all to see?

      • So if I had an image of you, you'd be ok with me creating a fake nude image of you and posting it on the internet for all to see?

        Only so long as the OP was shown tied up and spitroasted by two fat Russians. Because clearly they wouldn't have an issue with that depiction.
      • So if I had an image of you, you'd be ok with me creating a fake nude image of you and posting it on the internet for all to see?

        I believe that is classified as a war crime. Intentional blinding is not allowed.
        Geneva Convention: "It is prohibited to employ laser beams of a nature to cause permanent blindness [serious damage] against the eyesight of persons as a method of warfare."
        While nudes of me are clearly not a laser, they would have the same effect, so they should fall under the same rule.

      • Does it raise the price of beer? Then what do I care?
    • Don't parody laws cover nude photos as well? Taylor Swift is a grown adult in her 30s, so this is a lot of nothing to me.
  • Microsoft is allegedly trying to limit ChatGPT's ability to do whatever you tell it to do. Your problem here is, it's designed to be very flexible about doing what you tell it to do, and that is actually one of its main features. That flexibility can't be removed, or you won't have a ChatGPT.

    They can't "fix" this. People will immediately come up with work-arounds.

    • If your prompt starts pulling in certain parts of the language tree, the GenAI can reject the prompt. If you simply draw a circle around entire topics and say it cannot comment on them, then it works. The problem is when they want it to be able to comment on a topic but only with certain responses, e.g., factually true statements or non-porn statements. Image generation is trickier, but I bet it can be done for famous people by checking the output image to see if it looks like anyone recognizable (a rough sketch of that two-stage idea is below).
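
      A minimal sketch of that two-stage idea in Python. Everything here is hypothetical: the blocked-term list, the embed_face() stub, the gallery, and the 0.8 similarity threshold are illustrative assumptions, not anything Microsoft has described about Designer.

```python
# Illustrative two-stage guardrail, NOT a real service's pipeline:
# stage 1 rejects prompts that touch blocked topics; stage 2 compares
# a face embedding of the generated image against a gallery of known
# public figures. embed_face() and the threshold are placeholders.
import math

BLOCKED_TERMS = {"nude", "nsfw", "undressed"}  # the "circle" around a topic

def prompt_allowed(prompt: str) -> bool:
    """Stage 1: crude keyword screen on the incoming prompt."""
    words = set(prompt.lower().split())
    return not (words & BLOCKED_TERMS)

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def embed_face(image_bytes: bytes) -> list[float]:
    """Placeholder: a real system would run a face-recognition model here."""
    raise NotImplementedError

# Hypothetical gallery: person name -> reference face embedding.
GALLERY: dict[str, list[float]] = {}

def output_allowed(image_bytes: bytes, threshold: float = 0.8) -> bool:
    """Stage 2: block output that looks like anyone in the gallery."""
    emb = embed_face(image_bytes)
    return all(cosine(emb, ref) < threshold for ref in GALLERY.values())
```

      Note the asymmetry the comment points at: the input-side circle around a topic is a cheap set lookup, while the output-side check needs a recognition model of its own, which is why filtering image generation is the harder half.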

  • If people don't see the issue with this and keep asking what's the big deal, they wouldn't have an issue with a picture showing the orange criminal bending his knee and kissing the hand of his Russian handler, would they?

  • by zuki ( 845560 ) on Monday January 29, 2024 @01:26PM (#64198068) Journal
    If the past 25 years of computing trends are any indication, it's clear that the porn industry has always been nimbly at the cutting edge (virtual chatrooms, live streaming, on-demand video delivery, etc.), and often among the first to leverage the power of any tech innovation because they immediately realize its potential. Why should AI be any different?

    As an aside, while the adult entertainment sector often gets reviled in public by puritanical lawmakers in the US (and UK), it's interesting to observe that the same group doesn't seem to vocally oppose the unfettered implementation of such technology for military purposes, which, it could be argued, is where actual lives are often senselessly destroyed by its unregulated use.

    Then again, this may perfectly fit the profiles of a majority of those constituents who elected them into office and gave them a mandate: "guns & bibles".

    This being the planet we live on, there probably isn't much that can be done about it.
  • I really wanted nudes of current day Gary Busey

  • Block words "Taylor Swift". Loophole fixed! That was easy.
    • And what if someone used "Swift Taylor", or an anagram like "ways to flirt"? (See the toy filter below.)

      Sorry, no, and no, you should not block it.

      Look, it is quite simple: make enough of these things and people will get bored and move on; limit the ability and people will just try to figure out how to get around it.
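
      A toy Python filter makes the point: the reordering and the "ways to flirt" anagram from the comment above both sail straight past a literal substring blocklist. (The blocklist and prompts are illustrative only.)

```python
# Toy demo of why a literal blocklist is brittle: trivial reordering
# and anagrams slip straight past a substring match.
BLOCKLIST = ["taylor swift"]

def naive_filter(prompt: str) -> bool:
    """Return True if the prompt passes the literal blocklist."""
    return not any(term in prompt.lower() for term in BLOCKLIST)

print(naive_filter("Taylor Swift at the beach"))   # False -- caught
print(naive_filter("Swift Taylor at the beach"))   # True  -- reordered, missed
print(naive_filter("ways to flirt at the beach"))  # True  -- anagram, missed
```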
  • This is more like "full of holes". Generative AI and chat AI cannot be controlled. You can filter some specific things, but the basic problem remains. Another reason why the approach is a dead end.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...