
Reporting the images to law enforcement is good. There should be a human in the loop to separate medical images from exploitative ones.

Perma-deleting his account on an automated accusation is bad. That should hinge on, at minimum, law enforcement's decision to charge a crime. [Edit: unless the criminality of the images is obvious - again, a human needs to be in the loop.]

> Reporting the images to law enforcement is good.

citation needed.

do these CSAM scanning things actually help reduce kid exploitation?

and if they do, is this the best use of our resources?


> There should be a human in the loop to separate medical images from exploitative ones.

No, there really should not. I would not want a Facebook employee looking at my pictures. I don't use their services, but the thought is still off-putting. The idea that these companies have to police content is what's wrong here.

There are other ways to get to offenders here. An environment that takes good care of kids will spot abuse, rather than some poor fellow who has to review private images.


Perma-deleting the account is destruction of evidence, so even if the criminality is obvious, an account lock makes more sense.

Even an account lock is probably a bad idea; it alerts the pedophile that they're under investigation, allowing them to destroy evidence, cut ties with co-conspirators, etc.

Best to let law enforcement deal with it. In this case, assuming it somehow went to trial, the jury would almost certainly acquit, and the account would be restored.

There is the matter of the accused losing access to the account while the case is active, though. That's potentially a big deal.
