
This only finds known pictures of child abuse, not new ones, and in particular it doesn't find the perpetrators or prevent the abuse.

But it creates an infrastructure for all other kinds of "criminal" data. I bet sooner or later governments will want to include other hashes to find the owners of specific files. Could be bomb-making manuals, could be flyers against a corrupt regime. The sky is the limit, and the road to hell is paved with good intentions.



> This only finds known pictures of child abuse not new ones

No, it finds images that have similar perceptual hashes, typically using the Hamming distance or another metric to calculate differences between hashes.

I've built products that do similar things. False positives abound with perceptual hashes, especially when you start fuzzy-matching them.
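
For intuition, here's a minimal sketch of an average hash with Hamming-distance matching. This toy is not Apple's NeuralHash; the grid size, pixel values, and threshold below are all illustrative assumptions:

    # Minimal sketch of a perceptual (average) hash with fuzzy matching.
    # NOT Apple's NeuralHash; all parameters here are illustrative.

    def average_hash(pixels):
        # pixels: flat list of grayscale values for a downscaled image
        # (e.g. 8x8 = 64 values). Each bit records whether a pixel is
        # brighter than the image's mean brightness.
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(h1, h2):
        # Number of differing bits between two hashes.
        return bin(h1 ^ h2).count("1")

    THRESHOLD = 10  # hypothetical; vendors tune this and don't publish it

    original = [30, 200, 180, 40] * 16   # stand-in 8x8 image
    edited = [32, 198, 183, 38] * 16     # slightly brightened copy

    d = hamming_distance(average_hash(original), average_hash(edited))
    print(d, d <= THRESHOLD)             # small distance -> "match"

The point is that matching is a threshold on a distance, not exact equality, which is exactly what makes unrelated images able to "match" too.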


But those are just variations of known pictures, so they can still be recognized if they're cropped or scaled. The hash of a genuinely new picture isn't similar to the hash of a known picture.


Perceptual hashes have collisions, and it is entirely possible for two pictures that have nothing to do with each other to have similar perceptual hashes. It's actually quite common.


Sure, collisions are possible; that's inherent to hashing. In general collisions are rare, but these are a different type of hash, and Apple is probably tweaking the parameters so the collisions stay manageable.

But if the database of hashes is big and the total number of unique photos across all iPhone users is also enormous, you will certainly find collisions.
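
A back-of-envelope, birthday-style estimate shows why scale matters. Every number here (hash width, database size, photo count, fuzzy-match radius) is an illustrative assumption, not a figure from Apple or NCMEC:

    import math

    hash_bits = 96                  # assumed hash width
    db_size = 1_000_000             # assumed number of known hashes
    photos = 1_000_000_000_000      # assumed photos scanned worldwide

    # Chance a single random photo exactly matches some DB hash,
    # if hashes were uniformly distributed:
    p_single = db_size / 2**hash_bits

    # Expected accidental exact matches across all photos:
    print(photos * p_single)        # ~1e-11: essentially never

    # Fuzzy matching within Hamming distance d turns each DB entry
    # into a ball of nearby hashes, inflating the rate accordingly:
    d = 10
    ball = sum(math.comb(hash_bits, i) for i in range(d + 1))
    print(photos * p_single * ball) # ~1e2: collisions become routine

Exact matching at this scale would be safe; it's the fuzzy-match radius that turns "astronomically unlikely" into "expected".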


It certainly does help find the perpetrators and prevent abuse. Unlike videos of many other kinks, CSAM distribution is not one-way from a small number of producers to a large number of consumers but is often based on sharing "their own" material.

When we arrest people for "consuming" known old CSAM, we often find new CSAM produced by the offenders themselves; a big part of busting online CSAM-sharing rings is not preventing the sharing itself but the fact that you catch a lot of actual abusers that way.


I would like to read more about your claims; the problem is that this subject is very dangerous. The only things I know about it come from articles that got popular, and some of those articles are about innocent people whose lives were destroyed because someone made a mistake (like misreading an IP address).

A nightmare scenario would be something like:

- a tech giant creates a secret algorithm, with secret parameters and a threshold that they can tweak at will

- bad guys reverse-engineer it or find a way to make an innocent-looking picture trigger the algorithm (see the sketch after this list)

- the above is used by idiots in chat groups or even DMs to troll you, like swatting in the US, or DDoS and the other shit some "gamers" pull to get revenge for losing a multiplayer game

- consequences: an innocent person loses their job, family, friends, health, etc.
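
As a toy illustration of that second bullet, here's how one could force a collision against the simple average hash sketched upthread. This crude loop makes visible pixel changes; the published collision attacks on neural hashes (researchers demonstrated NeuralHash collisions shortly after it was extracted from iOS in 2021) instead use gradient descent to keep the perturbation invisible:

    # Toy collision-forcing against the average hash sketched upthread.

    def average_hash_bits(pixels):
        # Same toy hash as before, returned as a tuple of bits.
        mean = sum(pixels) / len(pixels)
        return tuple(1 if p > mean else 0 for p in pixels)

    def force_hash(pixels, target, rounds=50):
        # Nudge pixels across the mean until the hash equals target.
        # Iterates because each nudge shifts the mean slightly.
        out = list(pixels)
        for _ in range(rounds):
            mean = sum(out) / len(out)
            done = True
            for i, want in enumerate(target):
                if (1 if out[i] > mean else 0) != want:
                    out[i] = mean + 2 if want else mean - 2
                    done = False
            if done:
                break
        return out

    innocent = [30, 200, 180, 40] * 16                  # harmless image
    target = average_hash_bits([90, 10, 220, 55] * 16)  # "flagged" hash

    crafted = force_hash(innocent, target)
    print(average_hash_bits(crafted) == target)  # True: forced collision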

I don't trust the giants' moderators either; they make mistakes or just don't do their job and randomly click stuff.



