
Nothing can stop me from pointing a camera at my computer monitor and recording a verified video of a non-verified video.



Sure, but in that case your chain starts at your camera, which would be producing "pro"-looking footage straight off a sensor, a scenario that's highly unlikely (not to mention trivial tells like visible monitor or lens distortion). Analyzing the subsequent processing steps stored in the blockchain would likely show a very low probability of authentic work, since many of the expected steps would be missing. (There's still some small chance you could fool it, but would you want to waste that much time and processing power on it?)
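
To make that check concrete, here is a minimal Python sketch of walking such a chain: each record hash-links to the previous one and names its processing step, and the verifier flags chains with broken links or missing expected steps. The record format and step names are invented for illustration; real provenance schemes (C2PA manifests, for example) are structured differently.

    import hashlib

    # Hypothetical provenance record: each step names the tool stage,
    # carries a hash of its payload, and links to the previous record.
    # Step names and fields here are made up for illustration.
    REQUIRED_STEPS = {"capture", "debayer", "color-grade", "encode"}

    def verify_chain(chain):
        """chain: list of dicts with 'step', 'payload_hash', 'prev_hash'."""
        seen = set()
        prev = None
        for record in chain:
            # Any break in the hash links means the chain was tampered with
            # or assembled out of order.
            if prev is not None and record["prev_hash"] != prev:
                return False, "broken hash link"
            seen.add(record["step"])
            prev = hashlib.sha256(
                (record["step"] + record["payload_hash"]).encode()
            ).hexdigest()
        missing = REQUIRED_STEPS - seen
        if missing:
            return False, "missing expected steps: %s" % sorted(missing)
        return True, "chain plausible"

Note this only scores plausibility: a verifier can say a chain looks incomplete or implausible, not prove the footage depicts reality.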


This is whack-a-mole. If I can design deepfake software, it isn't much harder to design it to specifically anticipate the user filming the result. The user would input their monitor specs, their camera specs, etc., and the software would produce a weirdly distorted video that looks perfect when that camera films that monitor (a toy sketch of the idea follows after this comment).

Or, people just outright sell doctored cameras where you can intercept the input feed.

This isn't like adult content filtering, where all that matters is that kids can't get around it. You have to assume people who know what they're doing are going to attack your technical solution, and apply ingenuity in doing so. When the enemy consists of hackers, you can't ward them off with a hack!
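
For what it's worth, the pre-distortion attack described above is not exotic. Here is a toy Python/OpenCV sketch: given calibrated intrinsics and radial distortion coefficients for the target camera (obtainable with cv2.calibrateCamera and a checkerboard), warp each frame by the inverse of the distortion model before displaying it, so the lens's own distortion approximately cancels it when the monitor is re-filmed. All numbers and file names below are placeholders, and a serious attempt would also have to model the screen-to-sensor homography, moire, refresh timing, and color response.

    import cv2
    import numpy as np

    # Placeholder intrinsics and distortion coefficients (k1, k2, p1, p2, k3)
    # for the camera that will film the monitor.
    camera_matrix = np.array([[1200.0, 0.0, 960.0],
                              [0.0, 1200.0, 540.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.array([-0.25, 0.1, 0.0, 0.0, 0.0])

    # Stand-in for one generated frame.
    frame = np.full((1080, 1920, 3), 255, np.uint8)
    cv2.putText(frame, "fake", (800, 540),
                cv2.FONT_HERSHEY_SIMPLEX, 4, (0, 0, 0), 8)

    # cv2.undistort applies the inverse of the modeled lens distortion,
    # so displaying this pre-warped frame lets the physical lens's own
    # distortion approximately undo the warp in the re-filmed capture.
    pre_warped = cv2.undistort(frame, camera_matrix, dist_coeffs)
    cv2.imwrite("display_this.png", pre_warped)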


I believe that the usual response to this is that the devices include some kind of unmodifiable time and location stamping, although that argument spirals out of control pretty quickly.
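
Roughly, that stamping story assumes a key in the camera's secure element signs each frame hash together with time and location. A minimal sketch with the Python "cryptography" library, with invented field names and the obvious simplification that the private key is generated in software here rather than sealed in hardware:

    import hashlib, json, time
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # In a real device this key would live in a secure element and never
    # be extractable; generating it here is purely for illustration.
    device_key = ed25519.Ed25519PrivateKey.generate()

    def stamp_frame(frame_bytes, lat, lon):
        # Sign a claim binding the frame hash to a timestamp and GPS fix.
        claim = json.dumps({
            "frame_sha256": hashlib.sha256(frame_bytes).hexdigest(),
            "unix_time": int(time.time()),
            "gps": [lat, lon],
        }, sort_keys=True).encode()
        return claim, device_key.sign(claim)

    claim, sig = stamp_frame(b"...raw sensor data...", 37.77, -122.42)
    # Anyone holding the device's public key can check the stamp; this
    # raises InvalidSignature if the claim or signature was altered.
    device_key.public_key().verify(sig, claim)

And this is where it spirals: the signature only attests that this sensor produced these pixels at this time and place. Point the camera at a monitor and you still get a perfectly valid stamp.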



