Hacker News
If you post this image on Twitter, you're immediately suspended for 12 hours (twitter.com/alexhern)
50 points by mkeeter on June 8, 2022 | hide | past | favorite | 12 comments


If this were a hash collision with some abuse image, then... why can I still see it? I don't get how something can be bad enough to warrant an instant temp ban but not bad enough to hide from the public


presumably someone manually reviews the collided post and allows it


Might be slightly fuzzed to prevent it from triggering an auto-takedown


You'd presume abuse image prevention would use hashes that survive slight fuzzing.

https://inhope.org/EN/articles/what-is-image-hashing?locale=...
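The linked article describes perceptual hashing, which is designed to survive small edits. A minimal sketch of one such scheme, a difference hash (dHash), operating on a toy grayscale grid (a real implementation would first resize the actual image, e.g. with Pillow; the grid here is a stand-in):

```python
def dhash(pixels):
    # pixels: 8 rows of 9 grayscale values each.
    # Each bit records whether a pixel is brighter than its right neighbour,
    # so uniform brightness shifts leave the bits unchanged.
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    # Number of differing bits; a small distance counts as a "match".
    return bin(a ^ b).count("1")

# Toy "image": a diagonal gradient.
img = [[(r * 9 + c) % 256 for c in range(9)] for r in range(8)]
# Slightly "fuzzed" copy: every pixel brightened by 3. Neighbouring pixels
# keep their relative ordering, so the hash comes out identical.
fuzzed = [[min(255, p + 3) for p in row] for row in img]

assert hamming(dhash(img), dhash(fuzzed)) == 0
```

This is why simple fuzzing (brightness, compression) usually doesn't defeat perceptual matching, while a plain cryptographic hash would change completely.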


they appealed it and it got put back up


Supposing this is a hash collision, I don't really think Twitter is at fault here. Any solution you design has a false-positive rate. I am curious how they will resolve it, though.


Of course it's Twitter's fault.

These things should have a manual review.


Spoken like a person who does not consider the sheer volume Twitter (and companies like it) have to deal with.

The company would go bankrupt if they hired enough people to manually review all of the posts. They also can't just leave abusive images up... so this is the compromise.


It's so cute :)


huh? I haven't tried it...is it somehow fooling a porn filter or something?


One of the comments says it's a hash collision with some abusive material relating to doxxing.


I’ve always wondered about this: with hash tables, we always check that the key actually matches the target, precisely because of hash collisions. Shouldn’t the same be done here? On an image-hash match, either run an image similarity check or, if the original image isn’t available (it’s so bad it can’t be stored), use a secondary hash to double-check.
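The secondary-hash idea above can be sketched in a few lines. This is hypothetical (all names are made up, and the hash functions are just stand-ins): the blocklist stores a pair of independent hashes instead of the original image, and a primary-hash hit is treated only as a candidate until the secondary hash confirms it:

```python
import hashlib

def primary_hash(data: bytes) -> str:
    # Stand-in for the matching hash (in practice a perceptual hash).
    return hashlib.md5(data).hexdigest()

def secondary_hash(data: bytes) -> str:
    # Independent hash used only to double-check candidate matches.
    return hashlib.sha256(data).hexdigest()

# Blocklist maps primary hash -> expected secondary hash; the original
# image itself is never stored.
blocklist = {primary_hash(b"bad image"): secondary_hash(b"bad image")}

def is_flagged(data: bytes) -> bool:
    expected = blocklist.get(primary_hash(data))
    if expected is None:
        return False  # no primary match at all
    # Confirm with the second, independent hash before auto-actioning.
    return secondary_hash(data) == expected

assert is_flagged(b"bad image") is True
assert is_flagged(b"innocent image") is False
```

A false positive now requires a simultaneous collision in both hashes, which is astronomically less likely than a collision in either one alone.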



