
China Bans Deepfakes Created Without Permission Or For Evil 33

China's Cyberspace Administration has issued guidelines on how to do deepfakes the right way. The Register reports: [T]he Cyberspace Administration (CAC) has issued regulations that prohibit their creation without the subject's permission, or to depict or utter anything that could be considered as counter to the national interest. Anything counter to socialist values falls under that description, as does any form of "Illegal and harmful information" or using AI-generated humans in an attempt to deceive or slander. But the rules also suggest China expects synthetic humans will be widely used. For instance, they allow use of deepfakes in applications such as chatbots. In such scenarios, deepfakes must be flagged as digital creations.

The document also envisages that deepfakes will be used by online publishers, which must take into account China's myriad other rules about acceptable online content. Including the one that censpored images of Winnie the Pooh online, as the beloved bear - as depicted by illustrator E. H. Shepard - was felt to resemble, and mock, China's president-for-probably-life Xi Jinping. The Register therefore suggests it will be a very, very brave Chinese developer that creates a photorealistic ursine chatbot or avatar.

The regulations also spell out how the creators of deepfakes -- who are termed "deep synthesis service providers" -- must take care that their AI/ML models and algorithms are accurate and regularly revised, and ensure the security of data they collect. The rules also include a requirement for registration of users -- including their real names. Because allowing an unknown person to mess with deepfakes would not do. The rules are pitched as ensuring that synthesis tech avoids the downsides and delivers benefits to China. Or, as Beijing puts it (albeit in translation), deepfakes must "Promote the healthy development of internet information services and maintain a good ecology of cyberspace." The regulations come into force on January 10, 2023.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Flownez ( 589611 ) on Monday December 12, 2022 @10:09PM (#63126162)
    Alan Alexander Milne and Ernest Howard Shepard would be first in line for posthumous convictions...
  • by UnknownSoldier ( 67820 ) on Monday December 12, 2022 @10:14PM (#63126168)

    > Including the one that censpored images of

    .. and you guys couldn't even do that. How fucking hard is it to fix the spelling mistakes in the original article when quoting it??

    Yeah, yeah, the editors have been a total joke for the past decade+.

  • by Petersko ( 564140 ) on Monday December 12, 2022 @10:18PM (#63126174)

    The morally sound and well-intentioned use cases for deep fakes are pretty few. The potential for illegal, immoral, and downright nefarious uses is vast. For every actor licensing their image, there are going to be a million women who don't know they've entered the porn collections of acquaintances, or the revenge porn of their ex-partners.

    Of course, there's no going back. But if this tech was never invented, the world would be better for it.

    • by joe_frisch ( 1366229 ) on Monday December 12, 2022 @10:47PM (#63126218)
      I agree that deep fakes are bad, but I see porn fakes as a relatively trivial problem compared to the damage caused by deep fakes of political and public figures. Imagine believable videos of a presidential candidate saying something believable but offensive to their base, or a statement by the president or head of the Fed that causes massive stock market swings, or panics, or whatever. It's easy to end up in a situation where the general public has no way to determine reality even if they put serious effort into it.
      Deep fake porn is offensive, but once people realize that it can be faked, it will cause a lot less harm -- eventually becoming a variant of cutting the head off of one picture and scotch-taping it to a naked body from Playboy.
      • by Petersko ( 564140 ) on Tuesday December 13, 2022 @03:17AM (#63126544)

        I agree. I mentioned revenge porn as the example because it's easy to be empathetic about it. But you're right - we're witnessing the death of facts. Anybody looking to dodge accountability will have the easy out of, "Wasn't me." And enough people will falsely paint innocents with fakes to make that defense unassailable. It is dystopian.

        • Re: (Score:3, Insightful)

          Honestly it's a tempest in a teapot. The New York Times won a Pulitzer for publishing straight-up genocide-denying propaganda in defense of the USSR during the Holodomor decades ago, and in modern times they routinely get caught committing egregious frauds, like the time they used Google image search pictures to make a completely fake list of children supposedly killed. It's not some kind of accident on their part, you don't get on Google and literally collect a dozen random pictures and then make up names f

      • by Mozai ( 3547 )

        ...or a statement by the president or head of the fed that causes massive stock market swings

        You mean like saying insulin medicine is now free? Didn't even need a deepfake photo that time...

    • Not really. I work for a VFX company dealing primarily in deepfake content for huge Hollywood movies. But the company has resources that AI pron enthusiasts don't, and that will be the differentiating factor, going forward.
      • Most deep fake porn can use ordinary actors and ordinary props. They really don't need the Millennium Falcon flying through the asteroid field. It's going to get very troubling when deepfakes can emulate children well.

        • There is a small upside to deepfake porn: plausible deniability. It's not much, but for some of the people who are victims of revenge porn, it could be a lifeline. Think of it as a silver lining. Still sucks, but...
    • The thing is the same type of ML model that can do a deepfake could also produce lifelike images of fossils and historical bodies.

  • So can we continue making them?
    Like that Keanu Reeves Bollywood deep fake was awesome.
    Or that Kim K. deep fake where she admits she genuinely loves the process of manipulating people for money, thanks to Spectre.

  • Oh pooh (Score:4, Funny)

    by Tablizer ( 95088 ) on Monday December 12, 2022 @11:31PM (#63126292) Journal
  • by HeadSoft ( 147914 ) on Tuesday December 13, 2022 @12:53AM (#63126398)

    Photos have always been subject to faking, since the invention of photography, so there's no reason to ever take any photo or video as real -- and there never has been. It's about time deep fakes came along, so the people who didn't already realize this get a clue.

    • A deep enough fake is indistinguishable from real. I think that's due to the universe wrapping around on itself.
    • Yes, people faked stuff since the invention of photography. But we're approaching the time when it will no longer be detectable - first, by normal folks with normal tools, but ultimately once the AIs are trained on avoiding detection by AIs, by anybody.

      That's new. Unprecedented, even. Eventually there will be no forms of evidence involving audio or video that will be acceptable.

    • Ah, Arthur Conan Doyle proclaiming that the pictures of tiny fairies in Cottingley were real. https://en.wikipedia.org/wiki/... [wikipedia.org]
      It's a very good example, because a lot of people mistakenly assumed the good Sir who wrote about Sherlock Holmes was himself highly logical, scientific, and used deductive reasoning, and therefore others were duped into believing this because of proof by authority. Also the Dunning-Kruger effect, in assuming that because you're a popular writer you're also very good at figuring out if phot

  • All hail the God Emperor Xi the Teddy Bear children's toy!

    Glory to his occupying forces!

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...