
Photos of your dog are not going to trigger it. Someone would need to engineer 30 photos of your dog, each tweaked to hash to a particular value in the database, and then convince you to save them to your device and upload them to iCloud. And even then, some portion/abstraction of the dog photo would need to convince a human reviewer that they were looking at CSAM.

The more likely path to trouble is legal NSFW material that's been engineered to collide with database hashes.
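The reason such engineering is plausible at all is that perceptual hashes are designed to ignore small pixel-level changes, which also gives an attacker room to nudge an image toward a target hash. Here's a minimal sketch using a toy "average hash" — a deliberately simplified stand-in for Apple's NeuralHash, not the real algorithm — showing how a slightly perturbed image can produce an identical hash:

```python
# Toy perceptual "average hash" over an 8x8 grayscale image. This is an
# illustrative stand-in for NeuralHash, NOT Apple's actual algorithm.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255). Returns a 64-bit int.

    Each bit is 1 if the pixel is at or above the image's mean brightness.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# A synthetic 8x8 image: left half dark, right half bright.
img = [[30] * 4 + [220] * 4 for _ in range(8)]

# Nudge a few pixels -- the kind of imperceptible perturbation an attacker
# can apply freely while searching for a target hash value.
tweaked = [row[:] for row in img]
tweaked[0][0] += 10
tweaked[7][7] -= 10

print(hex(average_hash(img)))
print(average_hash(img) == average_hash(tweaked))  # True: hash unchanged
```

The same tolerance that keeps a resized or re-compressed photo matching its original is what collision-search attacks exploit against far more complex hashes like NeuralHash.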


