Leica Camera Has Built-In Defense Against Misleading AI, Costs $9,125
Scharon Harding reports via Ars Technica: On Thursday, Leica Camera released the first camera that can take pictures with automatically encrypted metadata and provide features such as an editing history. The company believes this system, called Content Credentials, will help photojournalists protect their work and prove authenticity in a world riddled with AI-manipulated content.
Leica's M11-P can store each captured image with Content Credentials, which is based on the Coalition for Content Provenance and Authenticity's (C2PA's) open standard and is being pushed by the Content Authenticity Initiative (CAI). Content Credentials, announced in October, includes encrypted metadata detailing where and when the photo was taken and with which camera make and model. It also keeps track of edits and the tools used for them. When a photographer opts to use the feature, they'll see a Content Credentials logo in the camera's display, and images will be cryptographically signed.
The feature requires the camera to use a specialized chipset for storing digital certificates. Credentials can be verified via Leica's FOTOS app or on the Content Credentials website. Leica's announcement said: "Whenever someone subsequently edits that photo, the changes are recorded to an updated manifest, rebundled with the image, and updated in the Content Credentials database whenever it is reshared on social media. Users who find these images online can click on the CR icon in the [pictures'] corner to pull up all of this historical manifest information as well, providing a clear chain of provenance, presumably, all the way back to the original photographer." Content Credentials is an opt-in feature on the M11-P, and the credentials can also be erased. As Ars has previously noted, editing an image with tools that don't support Content Credentials can leave a gap in the image's provenance data.
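For a sense of what "cryptographically signed" means here, the general idea is to hash the image, bundle the hash with the capture metadata into a manifest, and sign the bundle with a key held in the camera. The Python sketch below illustrates only that concept; the manifest fields, Ed25519 keys, and JSON encoding are assumptions, not Leica's or the C2PA's actual format.

```python
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In the real camera the private key would live in the tamper-resistant chipset;
# here it is generated in software purely for illustration.
camera_key = Ed25519PrivateKey.generate()

def sign_capture(image_bytes: bytes, metadata: dict) -> dict:
    """Bundle the image hash and capture metadata into a signed manifest."""
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "metadata": metadata,  # e.g. camera model, timestamp, location
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = camera_key.sign(payload).hex()
    return manifest

def verify_capture(image_bytes: bytes, manifest: dict, camera_public_key) -> bool:
    """Check the signature and that the manifest matches this exact image."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    try:
        camera_public_key.verify(bytes.fromhex(manifest["signature"]), payload)
    except InvalidSignature:
        return False
    return hashlib.sha256(image_bytes).hexdigest() == manifest["image_sha256"]

# Usage: any change to the image bytes or the metadata makes verification fail.
photo = b"...raw image bytes..."
record = sign_capture(photo, {"camera": "M11-P", "taken": "2023-10-26T12:00:00Z"})
assert verify_capture(photo, record, camera_key.public_key())
```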
Canon did this a while back... (Score:3)
Canon had a camera which would do some cryptographic signing and have that stored in the EXIF data of the picture. How it did it, I'm not sure, but it was supposed to be good enough for forensic work.
I wish more cameras had this as a feature. At the factory, they would have a secure enclave generate a key, the key would be signed by an intermediary CA for the model and batch, and the key would be used to sign data coming in. Ideally, if the camera takes a RAW and a JPEG for a smaller thumbnail, both would be signed.
Adding onto that would be photo editing tools in a closed appliance, which also add signatures, so one can see how a picture went from the camera, to some basic fixing up, to compression into a web-friendly format, with all those transactions signed, which ensures a chain of custody for the picture. Maybe even allow software like Photoshop to be part of that chain, except requiring a hardware HSM to allow it to sign photos, for solid security. Appliance-wise, the appliance might just compress the RAW photo into something smaller for the web or thumbnails while adding its signature to it, to ensure there is a low chance of gross modifications.
Not a 100% system, by far, but it would provide a path of assurance that doesn't exist right now.
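A hypothetical sketch of that chain-of-custody idea: each tool hashes the current image, references the previous link, and signs the result with its own key. The record layout and key handling here are invented purely for illustration.

```python
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def link_digest(record: dict) -> str:
    """Stable hash of a chain link, used to reference it from the next link."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def sign_step(actor_key, description: str, image_bytes: bytes, previous: dict = None) -> dict:
    """One link in the chain: hash the current image, point at the prior link, sign both."""
    record = {
        "step": description,  # e.g. "capture", "crop", "jpeg export"
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "previous": link_digest(previous) if previous else None,
    }
    record["signature"] = actor_key.sign(json.dumps(record, sort_keys=True).encode()).hex()
    return record

# Example chain: camera capture, then a signing appliance that compresses for the web.
camera_key = Ed25519PrivateKey.generate()
appliance_key = Ed25519PrivateKey.generate()

raw = b"...raw sensor data..."
capture_link = sign_step(camera_key, "capture", raw)

jpeg = b"...web-sized JPEG..."
export_link = sign_step(appliance_key, "jpeg export", jpeg, previous=capture_link)
```

Each later link commits to the hash of the one before it, so removing or reordering a step breaks the chain even though each actor only signs its own record.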
Re:Canon did this a while back... (Score:4, Insightful)
Canon had a camera which would do some cryptographic signing and have that stored in the EXIF data of the picture. How it did it, I'm not sure, but it was supposed to be good enough for forensic work.
Right up until somebody removes the EXIF data.
I'm sorry, but... if the "specialized chipset" in a camera can be defeated with Windows Paint then the whole thing is a farce.
Re: (Score:3)
Re: (Score:3)
Being able to remove that data is not an issue, because you're also capable of not removing that data.
You're also capable of dumping a set of valid credentials from a "secure" camera and then signing the image offline.
You're also capable of hacking the camera's internals or potentially installing a softmod to allow intercepting and editing the image made by a "secure" camera before it's signed. (See also every video game console in existence.)
You're also capable of just editing the photo, then putting it in front of a screen with a better resolution than a "secure" camera's image sensor, and getting an "authentic," signed photo of the fake.
Re: Canon did this a while back... (Score:2)
Plenty of devices exist that have cryptographic security against unauthorised code execution and have yet to be cracked. Someone from Microsoft did a presentation about the security of the Xbox One (a device that has not been cracked), and a digital camera doesn't have to deal with some of the issues that the Xbox One does (like needing to trust code read from shiny plastic discs).
Re: (Score:2)
Plenty of devices exist that have cryptographic security against unauthorised code execution and have yet to be cracked. Someone from Microsoft did a presentation about the security on the Xbox One (a device that has not been cracked)
The lack of a publicly known hack doesn't mean there will never be one, nor that photos taken before the hack becomes public (and the verdicts based on them) are safe. All it will take is one hacked "secure" camera implicated as the most convincing reason behind a guilty verdict to erode public trust in the whole concept. To say nothing of the public backlash caused by the inevitable government revocation of a camera's "secure" status when it is used against them in court.
a digital camera doesn't have to deal with some of the issues that the Xbox One does (like needing to trust code read from shiny plastic discs)
A
Re: (Score:2)
Re: (Score:2)
Could I just crop the photo and forge the metadata?
Re:Canon did this a while back... (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
And then we decide nothing can be trusted UNLESS it is done in a closed, DRM-laden, restrictive, expensive environment. That could have a very chilling effect on those outside the tiny elite who can afford, or want, to deal with it all.
I see the utility in specialized cases, but I worry it could set some bad precedents as well.
Re: (Score:2)
Re:Canon did this a while back... (Score:4, Informative)
The point was not to defeat it by being un-alterable. The goal was to show "this is an original unaltered image from this camera".
You can always edit the image afterwards. You just can't claim it to be an original image that was taken with the camera if you do it.
So if a journalist takes a photo of a scene and people doubt its authenticity ("it was photoshopped!") the original image can be analyzed to show it's an authentic image taken with a camera and not altered.
Also, many tools preserve EXIF, so even if it doesn't match, there's a pointer to a record showing that an original image likely exists, and you can use the hash to check that you're looking at the right image (you might take as many photos as you can and hope one turns out all right, so many similar images can be produced).
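To make that last point concrete, a small sketch of checking a recorded hash against a burst of similar frames (the file extension, directory name, and use of SHA-256 are assumptions for illustration):

```python
import hashlib
from pathlib import Path

def find_matching_original(shots_dir: str, recorded_sha256: str):
    """Among a burst of near-identical frames, find the one a signed record points at."""
    for path in sorted(Path(shots_dir).glob("*.dng")):
        if hashlib.sha256(path.read_bytes()).hexdigest() == recorded_sha256:
            return path
    return None

# e.g. find_matching_original("shoot_2023-10-26", "ab12...") -> the one frame whose
# bytes hash to the value recorded in the signed manifest, or None if it was altered.
```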
Re: Canon did this a while back... (Score:1)
Re: (Score:2)
Oh yeah, because taking a picture of a screen showing an AI-generated picture is so complicated.
Re: Canon did this a while back... (Score:3)
It's far more likely that the firmware just contains the same private key for all cameras. I don't see any photographer generating TLS certificates on their camera (if it even has the capacity to do it quickly enough), transferring a CSR, waiting for a signed certificate, and then fiddling with the cert order to load the CA chain back into it.
Re: Canon did this a while back... (Score:4, Informative)
Many devices have unique per-device keys generated and burnt in at manufacturing time (including the cellphone I am typing this on, which has a unique IMEI number).
Re: (Score:2)
Serial numbers and even IMEI numbers are easy to change, and they still don't prove who owns the device. We're talking about cryptographic proof of ownership: you need to sign something, in real life or digitally, as an end user to make the claim to the device and whatever it produces.
Re: Canon did this a while back... (Score:2)
How hard would it be for cell phone cameras to sign data right off the camera chip? This should have existed for years to protect against photoshopping.
Re: (Score:3)
Now all we have to do is get the lens, camera body, and software all using the blockchain.
Re: (Score:2)
In my 2004 (so nearly 19 years old!) Canon 20D, which I still use occasionally (having fun with macro, but not enough to get an expensive new camera), the function is (from the manual):
"C.Fn-18 Add original decision data - To verify whether the image is original, the Data Verification Kit DVK-E2 (optional) is required."
It only works with the raw format... but it has been reverse engineered and can be forged: http://lclevy.free.fr/cr2/odd/ [lclevy.free.fr]
Too Expensive: PhotoJournalists Won't Buy It (Score:2)
md5sum (Score:1)
Take a digital picture. Save it offline somewhere along with an md5sum of the file.
$9100 saved.
You're welcome (that's the nice way of saying TFTFY).
Sure, it's not 100% automatic in verification. Yet. But imagine a web extension so http[s]://image.url.here can be looked up by http[s]://cksum.image.url.here.
Cost = near 0.
Benefit > 0.
Ratio = juice is worth the squeeze.
$9100? Just send one percent of that my way please. I will use it to further the image auth tech.
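A minimal sketch of that proposal in Python, using MD5 to match the suggestion above (a collision-resistant hash such as SHA-256 would be the sounder choice today; the script name and filenames are placeholders):

```python
import hashlib
import sys

# Print a digest for each file given on the command line, e.g.
#   python checksum.py IMG_0001.JPG > IMG_0001.JPG.md5
# and stash the .md5 file somewhere offline alongside the original.
for name in sys.argv[1:]:
    with open(name, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    print(f"{digest}  {name}")  # same layout as md5sum's output
```

Note that a bare checksum only proves the file hasn't changed since the checksum was made; unlike a camera-held signing key, it says nothing about where the image came from.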
Re: (Score:3)
Corner cases like a camera taking a photo of an AI image should be taken into account. Kind of reminds me of how early DVD rips were made: taking screenshots of what a software DVD player drew.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Not quite. You can't create an image signed by a certificate [on the camera] that is itself signed by the camera manufacturer's certificate, and pictures are verified against the camera manufacturer's certificate.
If you can extract the camera's certificate and use it to sign pictures, all bets are off.
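A simplified sketch of that trust chain, using raw Ed25519 keys in place of real X.509 certificates (the key handling and function names are illustrative assumptions, not how any actual camera vendor implements it):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey, Ed25519PublicKey

def raw_bytes(public_key: Ed25519PublicKey) -> bytes:
    return public_key.public_bytes(serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# Manufacturer root key; its public half ships with the verification tool.
manufacturer_key = Ed25519PrivateKey.generate()

# At the factory: the manufacturer endorses this camera's public key.
camera_key = Ed25519PrivateKey.generate()
endorsement = manufacturer_key.sign(raw_bytes(camera_key.public_key()))

# In the field: the camera signs the hash of a photo.
image_hash = b"...sha256 of the photo..."
photo_signature = camera_key.sign(image_hash)

def verify_photo(manufacturer_pub, camera_pub, endorsement, image_hash, photo_signature) -> bool:
    """Trust flows from the manufacturer's key down to the individual photo."""
    try:
        # 1. Was this camera key really endorsed by the manufacturer?
        manufacturer_pub.verify(endorsement, raw_bytes(camera_pub))
        # 2. Did that camera key sign this image hash?
        camera_pub.verify(photo_signature, image_hash)
        return True
    except InvalidSignature:
        return False

assert verify_photo(manufacturer_key.public_key(), camera_key.public_key(),
                    endorsement, image_hash, photo_signature)
```

As the parent says, the scheme only holds as long as the camera's private key stays inside the camera; once it can be extracted, anything can be signed as "authentic."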
Re: (Score:2)
Looks like a market for Content Credentials? (Score:2)
So it looks like if you can get hold of some content credentials, you can fake real photos by credentialing AI photos, or even present a clean credential on an edited file?
Sounds like a juicy target.
WTF (Score:2)
Re: (Score:2)
Here's the direct link to Leica's page that TFA referenced through its tracker-laden link...
https://leicacamerausa.com/lei... [leicacamerausa.com]
Great! Now all they have to do... (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
A planet, responds in kind. (Score:2)
It's rather unfortunate that this kind of integrity tech comes along at a time when people are so thoroughly brainwashed and gullible that they'll believe damn near anything printed above or below the photo-with-integrity to convey the lie instead.
Solve for the lack of credibility and integrity in reporting instead. De-monetize lying. Then you might have a point with tech like this. It's not the camera that society doesn't trust. And for good reason.
Back in the old country (Score:3)
During Stalin's time especially, most photographs published in books or newspapers had a soft airbrushed look to them, even if they were unaltered. The point was so that you couldn't tell if a photo had been doctored to remove an unperson because all photos looked like they were doctored, even if they weren't.
In a similar vein, I'm announcing a betting pool for how long it will take to crack this system and generate fake signatures for obviously altered documents.
I'm placing my money on a few months at most, and only because it's going to take some time to nondestructively open up one of these cameras and pull the firmware.
So basically, built-in Lightroom (Score:2)
Moving the editing software into the camera (watch for an Adobe-branded version of this, so it won't have to be a clunky knockoff) is a weird, overpriced and probably unsuccessful approach to ensuring against surreptitious use of AI.
What everybody else is going to do, and make money on, is gradually move AI elements into their cameras.
"Real" Photographs are the new NFTs? (Score:2)
How to circumvent: (Score:2)
Title. (Score:2)