C2PA Investigations

(tbray.org)

22 points | by zdw 6 days ago

6 comments

  • mzajc 2 days ago

    Besides the possibility of extracting the private key from the camera, doesn't this still leave the analog hole wide open? One could set up a high-DPI screen or similar in front of the lens, displaying anything they want, and have it captured with a genuine signature. This effort seems much more noble than DRM schemes, but ultimately it's the same fight against reality.
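
    A minimal sketch of the trust model at issue, in Python with the cryptography library (hypothetical names; this is not the actual C2PA manifest format or signing flow): a device-held key signs the captured bytes, so verification only proves the bytes came from that device, not that the scene in front of the lens was real.

      # Simplified illustration, not real C2PA: the "camera" signs whatever
      # bytes its sensor produced.
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import ec

      # Hypothetical device key pair; in a real camera the private key sits in
      # secure hardware, and the attacks are extracting it or staging the scene.
      device_private_key = ec.generate_private_key(ec.SECP256R1())
      device_public_key = device_private_key.public_key()

      def sign_capture(image_bytes: bytes) -> bytes:
          # Sign the raw capture, as the camera would at shutter time.
          return device_private_key.sign(image_bytes, ec.ECDSA(hashes.SHA256()))

      def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
          # True iff these exact bytes were signed by this device key.
          try:
              device_public_key.verify(signature, image_bytes,
                                       ec.ECDSA(hashes.SHA256()))
              return True
          except InvalidSignature:
              return False

      # The signature is equally valid whether the sensor saw a landscape or a
      # high-DPI screen displaying an AI-generated picture of one.
      capture = b"raw sensor data, whatever was in front of the lens"
      print(verify_capture(capture, sign_capture(capture)))  # True either way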

    • duskwuff 2 days ago

      It's worse than you think: there are official C2PA mobile apps which will apply C2PA tags to anything captured by a "camera".

      This includes external USB cameras, such as HDMI capture cards.

      You can probably see where this is going.

    • qingcharles 2 days ago

      I hadn't read your comment before writing mine, but you are absolutely correct.

      https://news.ycombinator.com/item?id=45421293

      There are other uses for C2PA, but detecting AI is a weak one.

    • burnto 2 days ago

      Good comparison with DRM. Perhaps it's fairer to frame C2PA as one authenticity factor alongside others, like creator identity or publication credibility?

  • qingcharles 2 days ago

    You can just generate an AI photo, print it, and hold it up in front of a camera; now it's "not" AI any longer, and it's a real photo taken with a real camera. I don't put any value in C2PA for AI detection.

    • duskwuff a day ago

      And vice versa: you can take a real photo, have an AI model make a minor change to it, and then claim that it's "a fake AI image".