Hacker News

Or perhaps camera manufacturers will start putting traces in. Anyone who makes surveillance systems (like stores use) should build in some end-to-end signing so they can say "this camera took that recording and we can see that it wasn't tampered with by...". (If anyone works for a surveillance company, please run with this!) With cryptographic signatures we can verify a lot of things, which sets the bar higher than just "someone took a picture".

Of course in a darkroom someone skilled could always make a fake photo - but the bar is a lot lower with AI.
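For the curious, here's a minimal sketch of what per-device signing could look like, using stdlib HMAC as a stand-in. A real camera would hold an asymmetric key in a secure element and publish only the public half; the device key and function names below are made up for illustration.

```python
import hashlib
import hmac

# Hypothetical per-device secret, provisioned at the factory.
# A real camera would sign with an asymmetric key kept in a secure
# element; symmetric HMAC here is just a stdlib stand-in.
DEVICE_KEY = b"example-device-key-0001"

def sign_recording(video_bytes: bytes) -> str:
    """Return a tag binding these exact bytes to this device."""
    digest = hashlib.sha256(video_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_recording(video_bytes: bytes, tag: str) -> bool:
    """Check the tag; any edit to the recording invalidates it."""
    return hmac.compare_digest(sign_recording(video_bytes), tag)

clip = b"\x00\x01frame-data..."
tag = sign_recording(clip)
assert verify_recording(clip, tag)                    # untouched clip verifies
assert not verify_recording(clip + b"x", tag)         # any edit fails
```

Note this only proves the bytes came from a device holding the key, which is exactly why the key-leakage and analog-hole objections below matter.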

I think the best proof of authenticity is already there, in the quirks each particular camera model has in its recording encoding. A forgery would likely show obvious differences in encoding when compared with a genuine video file from the same camera. It would be harder with an audio recorder, since you could just play the AI voice in the room and record it. Even then, an audio expert could probably show that the acoustics of the fakery don't match the room where it was supposedly recorded.
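A toy illustration of that kind of fingerprint comparison, assuming we've already extracted a few container/encoder fields from the files. Real forensics would parse actual container structures (e.g. MP4 boxes) and codec parameters; every field name below is made up.

```python
# Known-good encoder fingerprint for one hypothetical camera model.
REFERENCE = {
    "encoder": "AcmeCam H.264 v2.1",  # fictional encoder string
    "gop_size": 30,                   # keyframe interval
    "timestamp_step_ms": 33,          # frame timing granularity
    "audio_codec": "aac-lc",
}

def deviating_fields(candidate: dict, reference: dict) -> list[str]:
    """Return the fields where a candidate file's metadata deviates
    from the known-good fingerprint of the claimed camera model."""
    return [k for k, v in reference.items() if candidate.get(k) != v]

# A re-encoded forgery tends to carry a different tool's traces.
forged = dict(REFERENCE, encoder="ffmpeg 6.0", gop_size=250)
print(deviating_fields(forged, REFERENCE))  # ['encoder', 'gop_size']
```

The point is only that mismatches are cheap to detect; a careful forger who replicates every quirk would defeat this, so it raises the bar rather than settling the question.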

Cryptography doesn't really fix it. There are a zillion camera makers and all it takes is one of them to have poor security and leak the keys. Then the forger uses any of the cameras with extractable keys to sign their forgery.

Or they just point a camera at a high resolution playback of a forged video.

This also assumes you can trust all the camera makers, because by doing this you're implicitly giving them each authorization to produce forged videos. Recall that many of these companies are based in China.


> Or they just point a camera at a high resolution playback of a forged video.

Exactly, it's just like DRM or cheating in video games. Even if everything in software is blocked, there's always the analogue hole.


The point is the camera maker certifies in court, under penalty of perjury, that their cameras are not compromised and that the image came from their device. "Other camera systems are compromised, but ours is not...".

Cameras are IoT devices. The S in IoT stands for security. What good is having the company certify something that isn't true?

It seems like it would just be a mechanism to lend undeserved credibility to something that still isn't trustworthy.


Here the camera manufacturer is asserting under oath that they (not their competitors) are secure. Just because IoT is almost always insecure doesn't mean it always is; being the exception can let you charge a premium at times - but you'd better really be the exception.

Maybe go back to old-school analog, like they did in Altered Carbon.




