Leica Camera Has Built-In Defense Against Misleading AI, Costs $9,125

Scharon Harding reports via Ars Technica: On Thursday, Leica Camera released the first camera that can take pictures with automatically encrypted metadata and provide features such as an editing history. The company believes this system, called Content Credentials, will help photojournalists protect their work and prove authenticity in a world riddled with AI-manipulated content.

Leica's M11-P can store each captured image with Content Credentials, which is based on the Coalition for Content Provenance and Authenticity's (C2PA's) open standard and is being pushed by the Content Authenticity Initiative (CAI). Content Credentials, announced in October, includes encrypted metadata detailing where and when the photo was taken and with what camera and model. It also keeps track of edits and tools used for edits. When a photographer opts to use the feature, they'll see a Content Credentials logo in the camera's display, and images will be signed through the use of an algorithm.

The feature requires the camera to use a specialized chipset for storing digital certificates. Credentials can be verified via Leica's FOTOS app or on the Content Credentials website. Leica's announcement said: "Whenever someone subsequently edits that photo, the changes are recorded to an updated manifest, rebundled with the image, and updated in the Content Credentials database whenever it is reshared on social media. Users who find these images online can click on the CR icon in the [pictures'] corner to pull up all of this historical manifest information as well, providing a clear chain of provenance, presumably, all the way back to the original photographer." The M11-P's Content Credentials is an opt-in feature and can also be erased. As Ars has previously noted, an image edited with tools that don't support Content Credentials can also result in a gap in the image's provenance data.
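
To make the description above concrete, here is a minimal sketch of the general idea behind signed capture metadata: hash the image, bundle the hash with capture details and an edit history, and sign the bundle with a per-camera key. This is an illustration only, not the actual C2PA manifest format or Leica's implementation; the key handling, field names, and JSON encoding are assumptions.

```python
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Hypothetical per-camera signing key; in a real camera it would be generated
# inside a secure element and never leave the device.
camera_key = ec.generate_private_key(ec.SECP256R1())

def sign_capture(image_bytes: bytes, metadata: dict) -> dict:
    """Build and sign a minimal 'manifest' for one capture."""
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "metadata": metadata,  # e.g. time, location, camera make and model
        "edits": [],           # editing tools would append entries here
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    signature = camera_key.sign(payload, ec.ECDSA(hashes.SHA256()))
    return {"manifest": manifest, "signature": signature.hex()}

def verify_capture(bundle: dict, camera_public_key) -> bool:
    """Check that the manifest really was signed by the claimed camera key."""
    payload = json.dumps(bundle["manifest"], sort_keys=True).encode()
    try:
        camera_public_key.verify(bytes.fromhex(bundle["signature"]), payload,
                                 ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

# Sign some placeholder bytes and verify against the camera's public key.
bundle = sign_capture(b"<raw sensor data>", {"taken": "2023-10-26T12:00:00Z"})
print(verify_capture(bundle, camera_key.public_key()))  # True
```
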
  • by ctilsie242 ( 4841247 ) on Friday October 27, 2023 @08:56PM (#63960381)

    Canon had a camera which would do some cryptographic signing and have that stored in the EXIF data of the picture. How it did it, I'm not sure, but it was supposed to be good enough for forensic work.

    I wish more cameras had this as a feature. At the factory, they would have a secure enclave generate a key, the key would be signed by an intermediary CA for the model and batch, and the key would be used to sign data coming in. Ideally, if the camera takes a RAW and a JPEG for a smaller thumbnail, both would be signed. (A rough sketch of this issuance-and-signing step follows below.)

    Adding onto that would be photo editing tools in a closed appliance, which would also add signatures, so one could see how a picture went from the camera, to some basic fixing up, to compression into a web-friendly format, with all those transactions signed, ensuring a chain of custody for the picture. Maybe even allow software like Photoshop to be part of that chain, except requiring a hardware HSM to allow it to sign photos, for solid security. Appliance-wise, the appliance might just compress the RAW photo into something smaller for the web or thumbnails while adding its signature, to ensure there is a low chance of gross modifications.

    Not a 100% system, by far, but it would provide a path of assurance that doesn't exist right now.
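
    For illustration, here is a rough sketch of the issuance-and-signing chain described above, using Python's cryptography package: a hypothetical "model/batch" CA issues a certificate for a per-device key generated in the camera, and that device key then signs the RAW and JPEG outputs. The names, serial numbers, lifetimes, and curve choices are all made-up assumptions, not anything Canon or Leica actually does.

    ```python
    import datetime

    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    # Hypothetical intermediary CA key for one camera model/batch; in practice
    # it would itself chain up to a manufacturer root kept offline.
    batch_ca_key = ec.generate_private_key(ec.SECP256R1())
    batch_ca_name = x509.Name(
        [x509.NameAttribute(NameOID.COMMON_NAME, "ExampleCam Batch 42 CA")])

    # Per-device key, generated inside the camera's secure enclave at the factory.
    device_key = ec.generate_private_key(ec.SECP256R1())

    # The batch CA certifies the device's public key.
    device_cert = (
        x509.CertificateBuilder()
        .subject_name(x509.Name(
            [x509.NameAttribute(NameOID.COMMON_NAME, "ExampleCam SN 000123")]))
        .issuer_name(batch_ca_name)
        .public_key(device_key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.datetime.utcnow())
        .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=3650))
        .sign(batch_ca_key, hashes.SHA256())
    )

    # At capture time the device key signs both output files (placeholder bytes).
    raw_sig = device_key.sign(b"<RAW file bytes>", ec.ECDSA(hashes.SHA256()))
    jpeg_sig = device_key.sign(b"<JPEG file bytes>", ec.ECDSA(hashes.SHA256()))
    ```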

    • by Joce640k ( 829181 ) on Friday October 27, 2023 @09:04PM (#63960383) Homepage

      Canon had a camera which would do some cryptographic signing and have that stored in the EXIF data of the picture. How it did it, I'm not sure, but it was supposed to be good enough for forensic work.

      Right up until somebody removes the EXIF data.

      I'm sorry, but... if the "specialized chipset" in a camera can be defeated with Windows Paint then the whole thing is a farce.

      • It's valuable to be able to confirm, "This specific picture was taken by a camera - not generated by an AI". Being able to remove that data is not an issue, because you're also capable of not removing that data.
        • Being able to remove that data is not an issue, because you're also capable of not removing that data.

          You're also capable of dumping a set of valid credentials from a "secure" camera and then signing the image offline.

          You're also capable of hacking the camera's internals or potentially installing a softmod to allow intercepting and editing the image made by a "secure" camera before it's signed. (See also every video game console in existence.)

          You're also capable of just editing the photo, then putting it in front of a screen with a better resolution than a "secure" camera's image sensor, and getting an "authentic" photo of your edit.

          • Plenty of devices exist that have cryptographic security against unauthorised code execution and have yet to be cracked. Someone from Microsoft did a presentation about the security on the Xbox One (a device that has not been cracked), and a digital camera doesn't have to deal with some of the issues that the Xbox One does (like needing to trust code read from shiny plastic discs).

            • Plenty of devices exist that have cryptographic security against unauthorised code execution and have yet to be cracked. Someone from Microsoft did a presentation about the security on the Xbox One (a device that has not been cracked)

              The lack of a publicly known hack doesn't mean that there will never be one, nor that photos taken prior to the hack being made public (and thus the verdicts based on them) are safe. All it will take is one hacked "secure" camera being implicated as the most convincing reason behind a guilty verdict to erode public trust in the whole concept. To say nothing of the public backlash caused by the inevitable government revocation of the camera's "secure" status when it's used against them in court.

              a digital camera doesn't have to deal with some of the issues that the Xbox One does (like needing to trust code read from shiny plastic discs)

              A

          • Signing an image offline might require hacking multiple cryptographic systems (for example, one would be attached to the GPS chip) on each individual camera. Attacks made by hacking/altering the camera would need to be 100% reversible and not require any disassembly, otherwise the manufacturer could spot the problem during an inspection. Selling periodic validations, inspections, court testimony, etc., would be part of their revenue stream. Also, some subtle steganography could be added that is never actually
        • Could I just crop the photo and forge the metadata?

      • by kmoser ( 1469707 ) on Friday October 27, 2023 @10:32PM (#63960565)
        In that case, the image would be considered unauthenticated and thus subject to having been tampered with. So the system is working exactly the way it's supposed to.
        • Unless someone else has a Leica camera and then takes a photo of the photo. Now the edited image is signed once again.
        • And then we decide nothing can be trusted UNLESS it is done in a closed, DRM-laden, restrictive, expensive environment. That could have a very chilling effect on those outside the tiny elite who can afford, or want to deal with, it all.

          I see the utility in specialized cases. But I worry it could set some bad precedents as well.

        • Or it's just an image like every other on the internet. A consumer, whether a person or a machine, would have no indication of this tampering, because 99.9-something percent of the images on the internet do not have these EXIF tags. Starting around 2012, social media sites started stripping EXIF from images for privacy reasons, so anything passed via social media will lose this information.
      • by tlhIngan ( 30335 ) on Friday October 27, 2023 @10:37PM (#63960571)

        Right up until somebody removes the EXIF data.

        I'm sorry, but... if the "specialized chipset" in a camera can be defeated with Windows Paint then the whole thing is a farce.

        The point was not to defeat it by being un-alterable. The goal was to show "this is an original unaltered image from this camera".

        You can always edit the image afterwards. You just can't claim it to be an original image that was taken with the camera if you do it.

        So if a journalist takes a photo of a scene and people doubt its authenticity ("it was photoshopped!") the original image can be analyzed to show it's an authentic image taken with a camera and not altered.

        Also, many tools preserve EXIF, so even if it doesn't match, there's a pointer to a record showing that an original image likely exists, and you can use the hash to check that you're looking at the right image (you might take as many photos as you can, hoping one turns out all right, so many similar images can be produced).
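
        To illustrate the "use the hash" step, here is a tiny sketch that recomputes a candidate file's SHA-256 and compares it with the digest recorded alongside the original. The file name and recorded digest are placeholders; nothing here is specific to Leica's format.

        ```python
        import hashlib

        def sha256_of(path: str) -> str:
            """Stream the file so large RAW/JPEG files need not fit in memory."""
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        # Placeholder standing in for whatever digest the original record contains.
        recorded_digest = "<sha256 hex digest stored with the original>"
        print(sha256_of("candidate.jpg") == recorded_digest)
        ```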

      • Or just take a picture of the picture with a Leica and now your picture is authenticated. Shrug?
    • by Zurk ( 37028 )

      oh yeah. because taking a picture of a screen showing an AI generated picture is soo complicated.

    • It's far more likely that the firmware just contains the same private key for all cameras. I don't see any photographer generating TLS certificates on their camera (if it even has the capacity to do it quickly enough), transferring a CSR, waiting for a signed certificate, and then fiddling with the cert order to load the CA chain back into it.

      • by jonwil ( 467024 ) on Saturday October 28, 2023 @02:36AM (#63960963)

        Many devices have unique per-device keys generated and burnt in at manufacturing time (including the cellphone I am typing this on which has a unique IMEI number)

        • by guruevi ( 827432 )

          Serial numbers and even IMEI numbers are easy to change, and they still don't prove who owns the device. We're talking about cryptographic proof of ownership: you need to sign something, in real life or digitally, as an end user to lay claim to the device and whatever it produces.

    • How hard would it be for cell phone cameras to sign data right off the camera chip? This should have existed for years to protect against photoshopping.

    • Now all we have to do is get the lens, camera body, and software all using the blockchain.

    • by irp ( 260932 )

      In my 2004 Canon 20D (so nearly 19 years old, which I still use occasionally, having fun with macro but not enough to justify an expensive new camera), the function is described in the manual as:

      "C.Fn-18 Add original decision data - To verify whether the image is original, the Data Verification Kit DVK-E2 (optional) is required."

      It only works with the RAW format... but it has been reverse-engineered and can be forged: http://lclevy.free.fr/cr2/odd/ [lclevy.free.fr]

  • A Nikon Z9 costs US$4,000.00 less, has a far less expensive and broader selection of lenses, and for this purpose takes just as good a photo (most people can't tell the difference between shots taken with pro-level Nikon glass, Tamron glass, Canon glass, Sigma glass, or Leica/Zeiss glass). The Z9 has about the best and fastest autofocus on the market, massive dynamic range, and all the standard bells and whistles, etc. ... and for sure the competitors are not far behind. It's just a matter of time before these cam
  • Take a digital picture. Save it offline somewhere along with an md5sum of the file.

    $9100 saved.

    You're welcome (that's the nice way of saying TFTFY).

    Sure, it's not 100% automatic in verification. Yet. But imagine a web extension so
    http[s]://image.url.here can be looked up by http[s]://cksum.image.url.here.

    Cost = near 0.
    Benefit > 0.
    Ratio = juice is worth the squeeze.

    $9100? Just send one percent of that my way please. I will use it to further the
    image auth tech.
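
    As a toy version of the checksum-lookup idea above: publish a digest next to each image URL, then let a client re-hash what it downloaded and compare. The registry, URL, and digest value here are made up, and an actual lookup service (or the imagined browser extension) would still have to exist and be trusted.

    ```python
    import hashlib
    import urllib.request

    # Hypothetical registry mapping image URLs to published checksums.
    checksum_registry = {
        "https://example.org/photos/report.jpg":
            ("md5", "<hex digest published by the photographer>"),
    }

    def fetch_and_check(url: str) -> bool:
        """Download the image and compare its digest with the published one."""
        data = urllib.request.urlopen(url).read()
        algo, expected = checksum_registry[url]
        return hashlib.new(algo, data).hexdigest() == expected

    # fetch_and_check("https://example.org/photos/report.jpg")
    ```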

    • I do understand it is all software after the photograph is taken. The tricky part is to define a way that only cameras are allowed to sign such a photo, so that anybody else knows it was taken with a camera, and that this info is preserved through the photo development process.

      Corner cases like a camera taking a photo of an AI image need to be taken into account. It kind of reminds me of how early DVD rips were made: taking screenshots of what a software DVD player drew.

    • Sorry, your scheme is vulnerable to a man-in-the-middle attack.
    • Sorry, but that proves nothing: I can create an image via AI, take an md5sum of it, and store it offline too. The whole point here is that the image is signed by the camera, so it can be proven that it was taken by that very camera. Your md5 cannot do that.
      • Not quite. You can't create an image signed by a certificate [on the camera] that was itself signed by the camera manufacturer's certificate, and verifiers can check pictures against the camera manufacturer's certificate.

        If you can extract the camera's certificate and use it to sign pictures, all bets are off.
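
        For a concrete feel of that chain, here is a verifier-side sketch, assuming the manufacturer's CA certificate is obtained out of band (for example, shipped with the verification tool) and that ECDSA with SHA-256 is used throughout; the function and its parameters are illustrative, not any vendor's actual API.

        ```python
        from cryptography import x509
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import ec

        def verify_photo(image_bytes: bytes, image_sig: bytes,
                         device_cert_pem: bytes, manufacturer_cert_pem: bytes) -> bool:
            device_cert = x509.load_pem_x509_certificate(device_cert_pem)
            manufacturer_cert = x509.load_pem_x509_certificate(manufacturer_cert_pem)
            try:
                # 1. Was the device certificate really issued by the manufacturer?
                manufacturer_cert.public_key().verify(
                    device_cert.signature,
                    device_cert.tbs_certificate_bytes,
                    ec.ECDSA(device_cert.signature_hash_algorithm),
                )
                # 2. Was the image really signed by that device's key?
                device_cert.public_key().verify(
                    image_sig, image_bytes, ec.ECDSA(hashes.SHA256()))
                return True
            except InvalidSignature:
                return False
        ```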

          • You can try to extract the certificate all you want; they used an asymmetric key, so you need the private key, which probably only resides at Leica HQ.
  • So, it looks like if you can get some content credentials you can fake real photos by credentialing AI photos, or even present a clean credential on an edited file?

    Sounds like a juicy target.

  • WTF is this link https://fave.co/3FyEahd?xcust=... [fave.co] ?! Embedding tracking in TFA? Come on!!
  • ...is persuade all the AI image scrapers to adopt & use Leica's proprietary encrypted metadata system. Who do you think's gonna be first?
    • This isn't security by obscurity; the important part is the signing key, not the proprietary metadata system.
      • And what? If you can see the image, you can copy, scan, analyse, edit, etc., without any regard for the metadata. You're essentially relying on the good will of the public at large. So far, exactly how trustworthy have web scrapers & AI companies been?
        • Yes, but this is not about that; this is about presenting the unaltered image together with the certificate from the camera to prove that it wasn't altered.
          • ...but most photos get altered by the photographer or their editor(s). Many pro photographers store photos in RAW format to preserve depth & detail for editing & then publish them in JPEG. I can't see much demand for certifying (digitally signing) "unadulterated" photos.
            • Yes, but this functionality is there for the very rare case where you have to (or want to) prove that you took the photograph and not X, or that your editing didn't alter the image in a way that removed or added evidence (if the image is used in a court of law).
              • Ah OK, so for the not so rare cases where people get sued by Getty Images for using their own photos because Getty Images claims ownership over any photo that they think they can possibly get away with. That actually makes sense.
  • It's rather unfortunate that this kind of integrity tech comes along at a time when humans are so brainwashingly stupid and gullible that they'll believe damn near anything printed above or below the photo-with-integrity to convey the lie instead.

    Solve for the lack of credibility and integrity in reporting instead. De-monetize lying. Then you might have a point with tech like this. It's not the camera society doesn't trust. For good reason.

  • by RightwingNutjob ( 1302813 ) on Saturday October 28, 2023 @07:50AM (#63961329)

    During Stalin's time especially, most photographs published in books or newspapers had a soft airbrushed look to them, even if they were unaltered. The point was so that you couldn't tell if a photo had been doctored to remove an unperson because all photos looked like they were doctored, even if they weren't.

    In a similar vein, I'm announcing a betting pool for how long it will take to crack this system and generate fake signatures for obviously altered documents.

    I'm placing my money on a few months at most. Just because it's going to take some time to nondestructively open up one of these cameras and pull the firmware.

  • Moving the editing software into the camera (watch for an Adobe-branded version of this, so it won't have to be a clunky knockoff) is a weird, overpriced and probably unsuccessful approach to ensuring against surreptitious use of AI.

    What everybody else is going to do, and make money on, is gradually move AI elements into their cameras.

  • Step 1: Design a digital module for the physical camera that uses the same wire protocol and mimics the same signatures bit-for-bit. Step 2: Replace the original camera module with the malicious module, leaving the rest of the Leica intact. Step 3: AI bullshit. Step 4: Profit!
  • I hope this metadata gets packed compactly, otherwise it ruins the point of compressed image formats. Also, I don't believe this will ever take off.
