
> There is a chance of false positives, so the human review step seems necessary...

You misunderstand the purpose of the human review by Apple.

The human review is not there because of false positives: the system is designed to have an extremely low rate of hits on images that aren't actually in the database, and the review invades your privacy regardless of who performs it.
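
(To make that design point concrete, here is a rough sketch of threshold-gated hash matching. Everything in it, the names, the SHA-256 stand-in for a perceptual hash, the threshold value, is an illustrative assumption, not Apple's actual NeuralHash/PSI implementation.)

    import hashlib
    from typing import Iterable

    KNOWN_BAD_HASHES: set = set()   # hash list supplied by the database provider
    REVIEW_THRESHOLD = 30           # illustrative gate before anything is surfaced

    def image_hash(image_bytes: bytes) -> str:
        # Placeholder: a real system uses a perceptual hash that survives resizing
        # and re-encoding; SHA-256 stands in here only so the sketch runs.
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(images: Iterable[bytes]) -> int:
        # A hit means membership in the known database, so a "false positive"
        # is a hash collision, which is rare by design (the point above).
        return sum(1 for img in images if image_hash(img) in KNOWN_BAD_HASHES)

    def flag_for_human_review(images: Iterable[bytes]) -> bool:
        # Only accounts exceeding the threshold are surfaced to reviewers.
        return count_matches(images) >= REVIEW_THRESHOLD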

The human review exists to legitimize an otherwise unlawful search via a loophole.

The US Government (directly, or through entities acting as its agents, like NCMEC) is barred from searching or inspecting your private communications without a warrant.

Apple, by virtue of your contractual relationship with them, is free to do so -- so long as they are not coerced into it by the government. When Apple reviews your communications and finds what they believe to be child porn, they're required to report it, and because the government is then merely repeating a search that Apple already (legally) performed, no warrant is required.

So, Apple "reviews" the hits because, per the courts, if they just sent automated matches without review that wouldn't be sufficient to avoid the need for a warrant.

The extra review step does not exist to protect your privacy: the review itself deprives you of your privacy. It exists to suppress your Fourth Amendment rights.




This is the part I am very concerned about. This is definitely a violation of Fourth Amendment rights, because images are viewed by humans off the device. What happened to purely on-device scanning?


It's one thing to be concerned about on-device scanning that identifies individuals violating the terms of service and refers them to law enforcement. It's a WHOLE other thing to have humans review images off the device.



