Hacker News

> obviously they will not mind Apple scanning their photos to prevent horrific crimes

But Apple scanning all phones for CSAM makes it look like you're guilty until proven otherwise. It's not even about preventing crimes -- Apple will only flag your device if it holds a photo that matches the CSAM database. In that sense, the most horrific part of the crime, i.e., the abuse of a child, has already happened.
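To make the "matches the database" point concrete, here is a minimal sketch of database-matching detection. It is purely illustrative: the hash set and function names are hypothetical, and a plain SHA-256 stands in for the perceptual hashes (e.g. Apple's NeuralHash) that real systems use to tolerate resizing and re-encoding. The key property is that only known, already-catalogued images can trigger a flag -- novel photo content is never matched.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery.
# Real systems ship perceptual hashes; SHA-256 is a stand-in here.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def is_flagged(photo_bytes: bytes) -> bool:
    """Flag a photo only if its hash appears in the known database."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES

print(is_flagged(b"known-image-bytes"))   # matches the database
print(is_flagged(b"my-vacation-photo"))   # never flagged
```

Note that this design can only detect copies of images that were already in circulation and catalogued -- which is exactly the commenter's point that the underlying abuse has, by definition, already occurred.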




All the other major services already scan your photos with similar algorithms; the only difference is that Apple is trying to do it in a much more robust and privacy-preserving way.





