Adobe Creates Symbol To Encourage Tagging AI-Generated Content

Emilia David reports via The Verge: Adobe and other companies have created a symbol that can be attached to content alongside metadata establishing its provenance, including whether it was made with AI tools. The symbol, which Adobe calls an "icon of transparency," can be added via Adobe's photo and video editing platforms like Photoshop or Premiere and, eventually, Microsoft's Bing Image Generator. It will be added to the metadata of images, videos, and PDFs to announce who owns and created the data. When viewers look at a photo online, they can hover over the mark, and it will open a dropdown with information about its ownership, the AI tool used to make it, and other details about the media's production.

Adobe developed the symbol with other companies as part of the Coalition for Content Provenance and Authenticity (C2PA), a group that looks to create technical standards to certify the source and provenance of content. (It uses the initials "CR," which confusingly stands for content CRedentials, to avoid being confused with the icon for Creative Commons.) Other members of the C2PA include Arm, Intel, Microsoft, and Truepic. C2PA owns the trademark for the symbol. Andy Parsons, senior director of Adobe's Content Authenticity Initiative, tells The Verge that the symbol acts as a "nutrition label" of sorts, telling people the provenance of the media. The presence of the symbol is meant to encourage the tagging of AI-generated data, as Parsons said it creates more transparency into how content was created. While the small symbol is visible in the image, the information and the symbol are also embedded in the metadata, so it will not be Photoshopped out.
This discussion has been archived. No new comments can be posted.


  • by HBI ( 10338492 ) on Wednesday October 11, 2023 @10:11PM (#63919729)

    If a graphic designer would ever take credit for this, they should be ashamed of themselves. Then again, they probably just paid someone on fiverr.

    How about a little robot with the letters AI across the face? I mean, I'm no designer, but that would be more to the point.

    • How about "AI" instead of "CR" anyway? Looks *way* too similar to Creative Commons (which is confused with Closed Captioning in video sometimes, at least by me, ironically).

      • Content producers pay for their software and want to reduce their workload but don't want to get sued for fraud, so they are trying to pretend they're not lying by using this symbol instead of 'AI'.

        A court will probably be persuaded. Courts demand "the whole truth," but at trial they will accept any plausible excuse with a tenuous throughline; "not 100% lying" somehow becomes the standard.

        No wonder normal people are completely fed up with this system.

    • Pay?

      They probably used AI for it.

    • by Rei ( 128717 )

      This sort of comment is just ignorant.

      Raw AI generations are fine for meme-quality stuff, but simply insufficient for any sort of serious artistic or professional work. Getting things to a professional level takes a significant amount of manual human work. Calling that human work irrelevant just because AI work was also involved is just dumb.

      I fully support (optional) metadata to describe workflows. For *all* works. And I specify "workflows", because there's a long chain of steps involved in making any serious w

      • by HBI ( 10338492 )

        Not that I particularly said anything you attributed to me, but you're basically admitting that identifying ML or AI as even part of the product taints the entire product.

        That's an interesting viewpoint and one that will only change by attributing quality to something ML was involved in doing, rather than obfuscation.

        • by Rei ( 128717 )

          but you're basically admitting that identifying ML or AI as even part of the product taints the entire product.

          Yes, exactly, I said precisely that, right next to the part where I talked about how all velociraptors should learn to juggle onions.

    • this went through 8 weeks of design and focus groups creating the perfect symbol before an SVP scribbled something on a napkin and ordered them to use it.
  • NO AI (Score:2, Troll)

    by Dwedit ( 232252 )

    There's already that "NO AI" sign that was once all over Artstation. I think that's become the de-facto symbol for AI art.

    • by Rei ( 128717 )

      I love that thing. AI artists had endless fun using AI to remix it ;)

      The fact that people thought that symbol would "pollute AI training datasets and render AI image generators useless" just screams how ignorant they are of the process. My favourite part was when AI artists started making AI images where they faked having it been corrupted by the "NO AI" images, and anti-AI people started sharing them, believing them to be totally legit proof that their campaign was working (just weeks after it began).

      The

  • Cry me a river, i.e., stop sniveling.
  • If they're going to brand all their content as AI generated, why would I pay them to augment my workflow? I wouldn't be caught dead labelling content I made (using whatever tools I choose to use for it) as AI, on purpose.
  • Since it's from Adobe, what's the license fee - $5.99 per month or one easy annual fee of $71.88 (per user/workstation of course).

  • The US copyright office has formally ruled that AI generated images cannot be copyrighted, so "what exactly" is implied by "owner" here?

    Owner of what? Because it sure as fuck isn't a copyright.

    • Remember NFT ownership?
      Yeah...

    • by Rei ( 128717 ) on Thursday October 12, 2023 @07:47AM (#63920213) Homepage

      The actual ruling is that raw images cannot be copyrighted where there is no control over the positioning and form of the elements, but that copyright can be extended where there is significant selection or manual postprocessing work on the outputs.

    • Ownership and copyright are different things. You can own something yet grant others copyright (e.g., an actor "owns" his likeness, but typically grants movie studios copyright permission to use his likeness for advertising purposes and denies others). You can also have copyright permission without being the owner (e.g., any software published under the GPL that you didn't write).

      In the case of AI images, if you create such an image, technically you "own" it, but you can't copyright it, i.e., you can't de

  • by El_Muerte_TDS ( 592157 ) on Thursday October 12, 2023 @01:44AM (#63919929) Homepage

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    Why not use that one?

  • What a joke really, Adobe keep looking more and more like a clown company
  • by nicubunu ( 242346 ) on Thursday October 12, 2023 @03:18AM (#63919979) Homepage

    While the small symbol is visible in the image, the information and the symbol are also embedded in the metadata, so it will not be Photoshopped out.

    If you save without metadata, it will be Photoshopped out from there too.

    • by njen ( 859685 )
      Or quite simply: open a blank new file at the same resolution, flatten the image in the existing file, marquee-select all of the pixels, copy the selection, then paste into the new file. Metadata is not held in the copy buffer when copying the RGB value of each pixel in a selection.
    • by Rei ( 128717 )

      The problem is that most social media sites strip all metadata.

      IMHO, they need to stop doing that. There's some data that should be stripped (based on user preferences) for privacy reasons, but not all.

  • Because if someone creates something with AI, he sure as all hell wouldn't want to pretend it's been done without. There ain't no drawback to creating an image with the aid of AI, after all...

  • "Adobe releases new software to automatically strip metadata from image files to combat doxxing and stalking"
  • This content is good. That content is bad. Oh, and we can turn your software off.

    Can't imagine where that's headed.
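Several commenters above make the same point: the credential lives in metadata, so any tool that rewrites only the pixel data (or drops ancillary chunks on save) discards it. A minimal stdlib Python sketch illustrates this with a real PNG. The chunk names (IHDR, tEXt, IDAT, IEND) are genuine PNG concepts; the "Source" keyword and its value are a hypothetical stand-in for a C2PA-style credential, not the actual Content Credentials format.

```python
# Sketch: PNG provenance metadata rides in ancillary chunks (here, tEXt),
# and a "save only the image" pass that keeps just the critical chunks
# silently drops it -- the same effect as pasting raw pixels into a new file.
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def chunk(ctype: bytes, data: bytes) -> bytes:
    """Serialize one PNG chunk: length, type, data, CRC-32 of type+data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def make_tagged_png() -> bytes:
    """Build a valid 1x1 red PNG carrying a hypothetical provenance tag."""
    ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 2, 0, 0, 0)  # 1x1, 8-bit RGB
    idat = zlib.compress(b"\x00\xff\x00\x00")            # filter byte + RGB
    text = b"Source\x00AI-generated (hypothetical tag)"
    return (PNG_SIG + chunk(b"IHDR", ihdr) + chunk(b"tEXt", text)
            + chunk(b"IDAT", idat) + chunk(b"IEND", b""))

def strip_metadata(png: bytes) -> bytes:
    """Copy only the critical chunks, discarding tEXt/iTXt/etc."""
    out, pos = PNG_SIG, len(PNG_SIG)
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        end = pos + 12 + length  # 4 length + 4 type + data + 4 CRC
        if ctype in (b"IHDR", b"PLTE", b"IDAT", b"IEND"):
            out += png[pos:end]
        pos = end
    return out

tagged = make_tagged_png()
clean = strip_metadata(tagged)
print(b"tEXt" in tagged, b"tEXt" in clean)  # True False
```

The stripped file is still a valid PNG with identical pixels, which is why C2PA also supports out-of-band verification (matching a perceptual hash or signature against a registry) rather than relying on embedded metadata alone.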
