Thank you, this is fantastic to know! I think we have to normalize requiring this or similar standards for news; it would go a long way.
Ideally we would have a similar attestation from most people's cameras (on their smartphones), but that's a much harder problem, especially if you also want to support third-party camera apps.
More like I won't trust anything that doesn't come from a press photographer.
And what will make them more trustworthy?
It doesn't really matter if you can just take a photo of an AI image that's been printed out.
That will look like a photo of a printout though. Seems easier to just hack the hardware to get it to sign arbitrary images instead.
This sounds like a good idea on its face, but it will have the effect of both legitimizing altered photos and delegitimizing photos of actual events.
You will need camera DRM with a hardware security module all the way down to the image sensor, where the hardware is in the hands of the attacker. Even when that chain is unbroken, you'll need to detect all kinds of tricks where the incoming photons themselves are altered. In the simplest case: a photo of a photo.
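To make that limitation concrete, here's a minimal sketch (Python, using the `cryptography` package) of the kind of per-device signing such a scheme implies. The key handling and function names are assumptions for illustration, not any vendor's actual implementation: a per-camera ECDSA key signs the raw capture, and a verifier checks it against the device's public key. Note that the signature only covers the bytes, not what was in front of the lens.

```python
# Illustrative sketch only: a hypothetical per-camera signing scheme.
# In a real design the private key would live in tamper-resistant
# hardware; here it's just in process memory, which is exactly the
# weakness being discussed.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# "Factory provisioning": each camera gets a key pair; the vendor
# would publish the public half for verifiers.
device_key = ec.generate_private_key(ec.SECP256R1())
device_pub = device_key.public_key()

def sign_capture(raw_image: bytes) -> bytes:
    """What the sensor/ISP would do at capture time."""
    return device_key.sign(raw_image, ec.ECDSA(hashes.SHA256()))

def verify_capture(raw_image: bytes, signature: bytes) -> bool:
    """What a newsroom's verifier would do on receipt."""
    try:
        device_pub.verify(signature, raw_image, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

photo = b"...sensor readout bytes..."
sig = sign_capture(photo)
assert verify_capture(photo, sig)                # untouched capture verifies
assert not verify_capture(photo + b"edit", sig)  # any later edit breaks it
# But a genuine capture of a printed AI image verifies just as cleanly:
printout = b"...sensor readout of a printout..."
assert verify_capture(printout, sign_capture(printout))
```

The signature proves the bytes left that device unmodified; it says nothing about whether the photons came from a real scene or a screen, which is the photo-of-a-photo problem.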
If HDCP has taught us anything, it's that vendors of consumer products cannot implement such a secure chain at all, producing ridiculous security vulnerabilities for years. HDCP has been largely abandoned and has become mostly irrelevant, except perhaps for the criminal liability it places on 'breaking' it. Vendors are also pushed to rely on security by obscurity, which makes such vulnerabilities harder to find for researchers than for attackers.
If you have half of such a 'signed photos' system in place, it becomes easier to dismiss photos of actual events on the basis that they're unsigned. If a camera model, or a security chip shared by many models, turns out to be broken, or a new photo-of-a-photo trick becomes known, a huge number of photos produced before that become immediately suspect. And if you gatekeep (the proper implementations of) these features to professional or expensive models only, citizen journalism will be disincentivized.
But even more importantly: if you choose to rely on technical measures that are poorly understood by the general public (and that are likely to blow up in your face), you erode a social system of trust that is already in place: journalism. Although the rise of social media, illiteracy, and fascism tends to suggest otherwise, the journalistic chain of custody of photographic records mostly works fine. But only if we keep maintaining and teaching that system.
Then you can have a signed picture of a screen showing an AI image. And the government will have a secret version of OpenAI that has a camera signature.