
Deepfake Porn Is Evolving To Give People Total Control Over Women's Bodies (vice.com) 301

samleecole shares a report from Motherboard: A lineup of female celebrities stand in front of you. Their faces move, smile, and blink as you move around them. They're fully nude, hairless, waiting for you to decide what you'll do to them as you peruse a menu of sex positions. This isn't just another deepfake porn video, or the kind of interactive, 3D-generated porn Motherboard reported on last month, but a hybrid of both which gives people even more control of women's virtual bodies. This new type of nonconsensual porn uses custom 3D models that can be articulated and animated, which are then made to look exactly like specific celebrities with deepfaked faces. Until recently, deepfake porn consisted of taking the face of a person -- usually a celebrity, almost always a woman -- and swapping it onto the face of an adult performer in an existing porn video. With this method, a user can make a 3D avatar with a generic face, capture footage of it performing any kind of sexual act, then run that video through an algorithm that swaps the generic face with a real person's.
This discussion has been archived. No new comments can be posted.

  • Sexists! (Score:3, Insightful)

    by Gay Boner Sex ( 5003585 ) on Friday December 06, 2019 @08:53PM (#59493498)
    What about 19 year old guys? Remember, 60% of the world population is female or gay!
    • by Mr. Dollar Ton ( 5495648 ) on Friday December 06, 2019 @11:30PM (#59493920)

      WTF, nobody has any control "over the women's bodies" whatsoever. The control is over some pixels.

      • by Tom ( 822 ) on Saturday December 07, 2019 @03:01AM (#59494358) Homepage Journal

        In a world where feminists are running amok against language because they slept through grammar class, I'm not surprised that the difference between pixels and meat is a tricky thing to comprehend.

        Most anti-feminists aren't any better, btw. It's a cringeworthy fight where you want to smack both sides because they're just so unbelievably stupid.

  • Individuals (Score:4, Insightful)

    by Captivale ( 6182564 ) on Friday December 06, 2019 @08:59PM (#59493510)

    You can always spot the agenda-driven propagandist 'journalist' because individual people don't exist to them. Deepfakes let anybody put any face on any footage, but the majority of the time it's WAHMEN so this is a WAHMEN'S RIGHTS ISSUE now. Won't somebody think of the WAHMEN? These poor WAHMENS are being virtually raped, which is much much worse than real rape because of blockchains and dark webs and shit. We need the UN to declare virtual porn to be a war crime. It's digital slavery. It's cyber murder. It's Photosluttery. We're going to keep making more and more scary terms for it until you bow down and give us more power to control you with.

    • Re:Individuals (Score:5, Insightful)

      by Mashiki ( 184564 ) <mashiki&gmail,com> on Friday December 06, 2019 @09:15PM (#59493562) Homepage

      It's the same type of people who complain about sexbots, and how they're vile and harmful to women. But in the same breath they start screeching about how vibrators are empowering. Then screech about 'female objectification' but ignore male objectification in the exact same way. I'm sure there's also overlap between those and the ones that claim disagreement is harassment, rape, sexual assault, misogyny (or internalized misogyny if you're female), the patriarchy and other nebulous bullshit.

      Of course it's also pretty easy to tell who's got the actual power in this situation. If anyone questions or speaks up about their motives, or isn't pure enough, out come the mobs to go after your job, family, and friends. And before someone goes "but...noooo that's not true!", ask yourself why they're the first ones to go after other women when they try setting up male domestic violence shelters, question the 1:6 rape lie, or the wage gap BS.

    • by skam240 ( 789197 )

      I just noticed your user number. Way to get paid to be a shithead foreign troll.

    • I'll admit I only read a little of the article and quickly skimmed the rest, but I didn't come away with the same impression you did.

      The vast majority of porn is of women so I am not surprised that usually it's women that are "deep-faked". It doesn't surprise me that they are usually celebrities too.

      I'm a complete unknown to the public at large and being a male, especially of my age, probably means it's unlikely anyone would create a deepfake of me. I would think it's kind of creepy though. And it really sh

  • Why should I care? (Score:5, Insightful)

    by alvinrod ( 889928 ) on Friday December 06, 2019 @08:59PM (#59493512)
    Why should anyone care about this? No person has been harmed in any way and the best summary of this whole thing would be "Programmer figures out how to make a computer do what people have been doing in their minds since time immemorial."

    It seems quite apparent that anyone using this is clearly aware that the actual individual isn't really performing any of these acts so it's hard to argue that any brand damage occurs either. I'd be far less worried about the people using celebrities for this than I would be about the guy who's decided to use his secretary or some other person that they actually interact with on a regular basis in the real world.
    • by backslashdot ( 95548 ) on Friday December 06, 2019 @09:05PM (#59493528)

      I don't think it should be illegal. However, if you knowingly distribute the deepfake video without clearly describing the modifications, then you should be subject to a civil (not criminal) lawsuit for defamation/slander and stuff like that.

    • Re: (Score:2, Insightful)

      So how would you feel if it was your wife's face? Or your daughter's face?
      Don't even play the "oh but that's the 'appeal to emotion' logical fallacy", because this is all about emotion, the human sense of what's right and what's wrong, and protecting people who matter to you -- and people protecting themselves.

      While we're at it: How would you feel if someone pasted your face onto one of these avatars and had degrading gay sex with it? "Oh, but you're not being harmed", you say. Or are you?

      Arguments
      • by K. S. Kyosuke ( 729550 ) on Friday December 06, 2019 @09:35PM (#59493626)

        Don't even play the "oh but that's the 'appeal to emotion' logical fallacy", because this is all about emotion

        I wonder what kind of fallacy it is to say that something is not a fallacy because in this one case the thing that would otherwise make it a fallacy doesn't apply, just because.

        "Oh, but you're not being harmed", you say. Or are you?

        Are you harmed when someone writes some erotica story about you? Especially when the story is not distributed at all?

        In any case, this moral panic will most likely extinguish itself when people realize that the infinite variety of possible synthetic faces completely dwarfs the finite set of faces of actual people.

        • Re: (Score:2, Informative)

          by skam240 ( 789197 )

          "Are you harmed when someone writes some erotica story about you? Especially when the story is not distributed at all?"

          You're confabulating two very different things here. Share it widely and you're likely guilty of defamation and possibly some mental anguish. Keep it private and it is exactly that, private.

            • I am fabricating memories? Where? In any case, what the summary mentions sounds very much like "keeping it private". You can't have that kind of control over a pre-generated video, therefore you have to have the video generated on demand, regardless of whether it's offline or online, speaking in video parlance.
            • by skam240 ( 789197 )

              I am fabricating memories?

              I have no idea what you're talking about but it's not what I'm talking about.

              Dispersal of a video of an individual doing something they didn't do, and that the individual finds defamatory, is in fact slander. The laws are already on the books.

              • Who exactly are you quoting? I can't find the quoted text.
                • by skam240 ( 789197 )

                  I accidentally quoted my own words. Your quote is just "I am fabricating memories?"; the rest is my own words. Formatting error, sorry about that.

          • "Are you harmed when someone writes some erotica story about you? Especially when the story is not distributed at all?" You're confabulating two very different things here. Share it widely and you're likely guilty of defamation and possibly some mental anguish. Keep it private and it is exactly that, private.

            Defamation is when you say untrue things about a person that will damage their reputation. If the erotica is clearly fictional (e.g. is not presented as an account of actual events) then it's not defamation. If anyone's reputation would be harmed by it, it would probably be the author's.

            • by skam240 ( 789197 )

              WTF are you even talking about? If there's a visual representation of you cheating on your significant other that didn't happen, how does that not fall under slander law?

              • Comment removed based on user account deletion
                • Comment removed based on user account deletion
                • by skam240 ( 789197 )

                  What utter nonsense. "We can't trust the scientific consensus on global warming because last year we had an overly cold winter where I live". Living life by individual experience is a truly stupid and ignorant way to live one's life but it's a great way to come to conclusions like "vaccines are dangerous", "global warming isn't real", or "the earth is flat".

      • So how would you feel if it was your wife's face? Or your daughter's face?

        How would I ever know unless someone told me (and who says they're telling the truth as opposed to just trying to ruffle my feathers like you're doing now) and why should I care? What's the difference between them using this service to upload a picture and having a wank to pure fantasy as opposed to imagining all of it in their head and having a wank to pure fantasy? Does visualizing something in a computer program make it more real? If that's the case I've got far more to worry about than someone using my

        • What happens in someone's imagination is not the same at all as someone distributing a porno over the internet to whoever wants to watch it.
      • by fafalone ( 633739 ) on Friday December 06, 2019 @10:03PM (#59493704)
        And you can't possibly imagine the unintended Orwellian outcomes of trying to legislate against, essentially, what people draw for themselves to fap to and the tools to create such images, and supporting that with what you admit is purely emotional reasoning? Yeah, it's gross if it's you or a loved one; so what, so is the unavoidable fact that people are picturing you/them in their head. You'll be the one eating your words when the unintended consequences of anti-deepfake laws come up.
      • by Torodung ( 31985 )

        How would you feel if someone pasted your face onto one of these avatars and had degrading gay sex with it?

        I think at this point in your post you are making wild assumptions about the OP's potential homophobia. You are probably arguing with a fantasy figure of your own design. You have total control over that fantasy figure's body and mind. You are guilty of the crime you are arguing against.

        Or... try a Xanax. You can get them by the fistful in my country.

      • by fazig ( 2909523 )
        You don't need to feel anything if someone does this with your face and makes money off it, uses it to defame you, or similar things. These things are already largely covered by personality rights.

        If it's done with someone else's face AND they gave permission (assuming that they can give permission) and you still have a problem with that, then you are the one having a problematic mental attitude. You might care for them, but they are still their own and not your property.
        Also where do you draw your lines he
      • How would you feel if someone pasted your face onto one of these avatars and had degrading gay sex with it? "Oh, but you're not being harmed", you say. Or are you?

        I wouldn't mind. I'd be flattered that someone thought I was attractive enough that they'd want to have virtual sex with my avatar.

      • >So how would you feel if it was your wifes' face? Or your daughters' face?
        Pretty good actually. I'd much rather some creeper was indulging their fantasy *without* involving my wife or daughter. And if the evidence around child porn is any indicator - access to such material actually reduces the risks of an assault. (Though we'll have to wait and see if that remains the case with personalized porn)

        > How would you feel if someone pasted your face onto one of these avatars and had degrading gay sex wi

    • by rtb61 ( 674572 )

      They are just pissed that the whole marketing delusion about pseudo-celebrities being able to sell crap products is collapsing. Why are those people's opinions inflated beyond all reason, when they are nothing but empty talking heads by profession? Because corporate mainstream media screamed at you over and over again, every hour of every day, that you have to listen to them when they lie to you.

      The dying pseudo-celebrity lashing out at reality, which makes of them nothing more than just

      • The created brand, the fake person, who is only as appealing as corporate main stream media scream they are appealing, ...

        That reminded me of this song: Any Kind of Pain [youtube.com]

        You are the girl
        Somebody invented
        In a grim little office
        On Madison Ave.

        They were specific
        They made you terrific:
        Red lips;
        Blue eyes;
        Blonde hair;
        Un-wise -
        You're All-American,
        And, darling, they said so

        That was the part I was reminded of. I forgot this line:

        And all the yuppie boys, they dream they will rape her

        So, I guess fantasizing about celebrities isn't exactly new, especially considering that song is over 30 years old.

    • The part these stupid articles get wrong is mostly that it's the use of the likenesses that should be getting people in trouble, not the tools. Sell them the gun, not the murder.

  • Terrible headline (Score:4, Insightful)

    by Immerman ( 2627577 ) on Friday December 06, 2019 @09:04PM (#59493524)

    This isn't giving anyone control over anybody's body. It's giving people greater options for creating fake videos, nothing more.

    If anything, as this becomes increasingly common (and let's be honest...) it could well be liberating. If you see a compromising video of someone, what are you going to assume? That they let some asshole record them and post it online, or that it's a fake created by someone with an axe to grind?

    I could care less about the porn uses; it's ultimately harmless. What I worry about is the effect on politics, as compromising non-sexual videos will become utterly unreliable. How are we supposed to hold politicians remotely accountable when even video evidence of appalling behavior is no longer trustworthy?

    • "Deepfake" images look like crap, and the videos are even worse. Disney spent tens of millions of dollars and employed dozens of the best graphic artists on the planet just to CGI Admiral Tarkin into a few seconds of Star Wars... and it was still obvious.
    • by kenh ( 9056 )

      What a complete craptastic headline.

      The women portrayed in the "fake porn" are computer-generated; the "victim" is not involved in the video.

      Wait, is this Jib-Jab [jibjab.com] 2.0?

    • It's also a bit suspicious that there have been a number of Virt-a-Mate moral-outrage stories published in the last couple of months that specifically mention Virt-a-Mate and nothing much else. Just Virt-a-Mate, available on Patreon. This could be one of the best viral marketing campaigns ever for Virt-a-Mate, which in case we haven't mentioned it you can buy on Patreon. What's the bet we'll see more stories like this in the future, telling us about Virt-a-Mate, starting at just $2/month on Patreon.

      That'

    • by Tom ( 822 )

      How are we supposed to hold politicians remotely accountable when even video evidence of appalling behavior is no longer trustworthy?

      Uh, by holding them accountable for their politics, not their sex lives ?

      Next I'll take "obvious answers for 200".

  • Porn (Score:5, Funny)

    by Jodka ( 520060 ) on Friday December 06, 2019 @09:12PM (#59493554)

    Porn is a complete waste of your time. Go find a real woman.

    Says the guy posting to Slashdot. On Friday night.

  • by kenh ( 9056 ) on Friday December 06, 2019 @09:25PM (#59493584) Homepage Journal

    By their own admission, the alleged "victim" is never involved:

    This new type of nonconsensual porn uses custom 3D models that can be articulated and animated, which are then made to look exactly like specific celebrities with deepfaked faces.

  • I'm not much interested in making porn videos of celebrities. Hollywood actress, porn actress, it's all the same to me.

    What I want to know is, can I use this technology to make a video of Justin Trudeau being sodomized by Donald Trump?

  • by Rick Schumann ( 4662797 ) on Friday December 06, 2019 @09:31PM (#59493612) Journal
    Would you want your face pasted onto some virtual avatar's body, performing sex acts?
    What about if it's your sister, wife, or daughter (or brother, husband, or son, for that matter)?
    If most people think that's wrong, then it's wrong.
    'Causing harm' to someone isn't limited to just physical harm. It can be emotional, reputational, or financial, or all the above.
    Theoretically: if someone made a 'deep fake' porn video of you, and it got around the internet, and you lost your job or were denied a job because of it, have you been harmed? Yes, you have, your reputation certainly has, and you might have been harmed financially as well.
    What if you're a public figure, say running for public office, and your rivals produce a deepfake porn of you and distribute it? Do you think that might affect the outcome of the election? I say it likely would, and that's definite harm to your professional, and perhaps personal reputation, and very likely your career.

    You see where I'm coming from with this, yes?
    Some may treat it as a laughing matter, just a joke; who would take it seriously? But it's not a laughing matter, any more than spreading slander around is a laughing matter.
    It all comes down to consent. If someone who is the object of a deepfake did not consent to having their image used that way, then they have been done harm just as surely as if someone physically attacked them.
    • by Anonymous Coward on Friday December 06, 2019 @09:48PM (#59493654)

      they have been done harm just as surely as if someone physically attacked them

      So what you're saying is that it is no worse to kill them and use their corpses as puppets in our porn videos than to alter a photo of them. What a relief!

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Friday December 06, 2019 @09:51PM (#59493668)
      Comment removed based on user account deletion
      • Easy Rebuttal. (Score:4, Informative)

        by skam240 ( 789197 ) on Friday December 06, 2019 @10:51PM (#59493822)

        No, I don't want that. But the distinction I make that you don't is that I don't think that just because I don't want people to do something -- even if I think it's immoral -- it should be illegal. I don't want people to say mean things about me on the internet, and I think it's immoral to cheat on your partners, but those things should not be illegal.

        Would you want people to be able to post fake videos of you that could cost you jobs, relationships, and the respect of your pears? No you wouldn't and that's the end of the argument. Slander is already illegal and deep fakes are nothing more or less.


        • Would you want people to be able to post fake videos of you that could cost you jobs, relationships, and the respect of your pears?

          I hate it when my fucking bowl of fruit doesn't respect me.

          Also, you really should look up what the legal definitions for slander, libel, and defamation are if you want to try to argue legal points with them. Here's a hint: Slander only involves making known false statements OUT LOUD with your voice, in a way specifically designed to hurt the reputation of another.

          • by skam240 ( 789197 )

            So you're saying me making a video of you getting rammed up your ass by your father that looked convincing wouldn't be slander?

            You're stupid.

            • by malkavian ( 9512 )

              It would be distasteful, yes. But if someone wanted to screw me over that badly, they'd just find other ways. What you're suggesting is that malice would suddenly appear from nowhere, where there had been none before.
              This simply isn't true.
              And if you did it for your own gratification, and I never knew, to be honest, I wouldn't really care. It's not something I'd worry about, because I have plenty of real, solid, present things to worry about and get on with in my life (and many things to be cheery about

        • Re:Easy Rebuttal. (Score:4, Insightful)

          by Corbets ( 169101 ) on Saturday December 07, 2019 @12:00AM (#59493986) Homepage

          No, I don't want that. But the distinction I make that you don't is that I don't think that just because I don't want people to do something -- even if I think it's immoral -- it should be illegal. I don't want people to say mean things about me on the internet, and I think it's immoral to cheat on your partners, but those things should not be illegal.

          Would you want people to be able to post fake videos of you that could cost you jobs, relationships, and the respect of your pears? No you wouldn't and that's the end of the argument. Slander is already illegal and deep fakes are nothing more or less.

          When someone says “and that’s the end of the argument” or similar, it becomes apparent that they are trying to preempt debate by claiming victory. To the rest of us, it’s clearly not the end of the argument, as you haven’t understood (nor rebutted in any way) the GP’s arguments about legislating based on morality.

          • by skam240 ( 789197 )

            When someone says "I can't caste doubt on your argument so I'll attack something else" it becomes apparent that they are trying to preempt debate by claiming victory.

            Fuck off and engage in a debate of ideas and not slander.

      • What I see is that you think we should legislate based on morality, which is relative and a very dangerous slope.

        While I agree that the hype about deepfakes is overblown, I can't agree with you here. All legislation is based on morality. Murder (for example) is illegal because it is morally wrong, not because it affects the GNP or something.

      • It's real fucking easy to just label everything I said as """fearmongering""" when you just clearly don't agree with any of it and want to ignore it, isn't it? But this is a REAL, EMOTIONAL ISSUE and it's NEW so we really don't have any rules regarding it yet now do we? So do we just go "Oh well wait and see, haha!" and maybe it becomes a nightmare, or do we put on the goddamned brakes NOW and figure out whether it's really okay or not? But sure go right ahead and keep your head stuck in the ground like tha
    • Would you want your face pasted onto some virtual avatar's body, performing sex acts?

      Why would I care?

      If most people think that's wrong, then it's wrong.

      Most people think you're wrong.

      'Causing harm' to someone isn't limited to just physical harm. It can be emotional, reputational, or financial, or all the above.

      If they're uploading porn with my face on it, and passing it off as me, that might do harm to my reputation. Or, it might enhance it. But if they're just fapping to it, then it does me no harm.

      It all comes down to consent. If someone who is the object of a deepfake did not consent to having their image used that way, then they have been done harm just as surely as if someone physically attacked them.

      That's a really dumb thing to say.

    • by malkavian ( 9512 )

      Some people think it's wrong. Some people think it's fine. If you don't like it, don't do it. Don't watch it.
      If, on the other hand, people start making money out of someone's image, that's lawyer land, or if they do it to harass someone, that's harassment or a load of other legal problems.
      If you make a vid like this of someone's wife, and then go and show it to them (or the wife), expect bad things to happen to you.
      So, if someone drew a lewd picture of someone, you'd say that they have suffered harm?

    • by Strill ( 6019874 )

      If someone wants to draw pictures of my mom 100 feet tall shoving the empire state building up her butt, I could care less. This is no different. You don't own the rights to someone else's creative works, and I see no reason why someone's right to artistic expression should be infringed here.

    • > It all comes down to consent. If someone who is the object of a deepfake did not consent to having their image used that way, then they have been done harm just as surely as if someone physically attacked them.

      What a load of crap.

      If Tom Cruise doesn't consent to me cutting his picture out of a magazine to set fire to, or to have a wank to, or to roll a blunt with, he has been done zero harm, and you are a moron if you think that is the same as me punching him in the nose.

      Similarly Katy Perry is done zero

  • So Vice is saying it's programmed to completely block being used against men. Good to know.

  • by frup ( 998325 ) on Friday December 06, 2019 @09:54PM (#59493674)

    There are comments above that portray the issue as victimless. It may seem so if you do not often leave the house. There are already women being killed around the world because of people shaming them on social media with deep fakes. Certain cultures practice honour killings and the women have far less freedom than their western counterparts. Through the distribution of deep fakes these women are in real danger.

    • Maybe try to stop honor killings instead of policing pixels. Oh wait, the ones doing the honor killings are a protected group.
    • So if someone somewhere were killed because of your comment, are you morally responsible or legally culpable in any way? I'm pretty damn sure you didn't intend for anyone to die because of what you wrote, but there could well be someone or even several someones being killed around the world because of the sentiments you've expressed online. Better retract that all just in case.

      Yeah that's all utterly stupid and I feel bad having written it, but it's just to illustrate why your argument is terrible. We co
    • Certain cultures practice honour killings and the women have far less freedom than their western counterparts.

      None of these are problems with any kind of porn. All of the issues you cited are problems with primitive cultures. That's where you need to focus your attention.

  • Easy (Score:5, Funny)

    by Jodka ( 520060 ) on Friday December 06, 2019 @10:03PM (#59493702)

    "This new type of nonconsensual porn..."

    Program the avatar to consent and you're good.

  • Of course (Score:2, Funny)

    by argStyopa ( 232550 )

    ...one COULD see this as FREEING women from their previous choices, allowing them to be/do whatever they want.

    "Um, we were going to hire you but a web search brought up this...staggering collection of brutal rape porn, bukkake, bestiality, and pedophilia. I mean, seriously, it's like you spend the majority of the last 15 years getting hammered at home, at work, in the grocery store, in the machine shop, at school...we're not sure you're management material."

    "Oh what? I don't know what you're talking about

  • Clickbait is an exercise/example of dissonance. Every issue raised in the article falls into an existing tort category. Bing's 1st result about licensing a likeness is...

    http://performermag.com/band-management/contracts-law/legal-pad-what-you-should-know-about-likeness-rights/

    VR's amplification of the commoditization of the female form, and the potential marginalization of women's issues relating to ownership/consent/autonomy, is unsurprising and merits feminist criticism.

    Any exploitation of misrepresentation will come with for
  • by JThundley ( 631154 ) on Friday December 06, 2019 @10:35PM (#59493778)

    Mark my words, this generation's elderly will be using VR and Deepfake technology to have sex with their dead partners.

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Friday December 06, 2019 @11:55PM (#59493968)
    Comment removed based on user account deletion
  • by mark-t ( 151149 ) <markt.nerdflat@com> on Friday December 06, 2019 @11:56PM (#59493972) Journal

    .... if deepfaking video is so good, then why did General Tarkin and a young Princess Leia in Rogue One look like they were made out of plastic?

    Or are you suggesting that Disney didn't have the money to pull off something convincing then that can be practically done on a desktop today?

  • bizarre (Score:5, Informative)

    by cascadingstylesheet ( 140919 ) on Saturday December 07, 2019 @12:24AM (#59494046) Journal

    but a hybrid of both which gives people even more control of women's virtual bodies.

    This is bizarre. What the heck are "women's virtual bodies"?

    This gives you control over a computer simulation, which presumably belongs to (or is licensed by) you, not some other person.

    This isn't Tron; people don't actually have "virtual bodies". That's ... not a thing.
