Hobbling computer vision datasets against unauthorized use (unite.ai)
21 points by Hard_Space on Sept 17, 2021 | 4 comments



Lots of words for a bullshit approach.

In the end, this boils down to polling a web API to download the decryption keys. And at that point, there's nothing stopping you from just storing them.

So their main result is: Encrypted files can be distributed in a public torrent as long as you keep the keys private.
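
To make that concrete, here's a rough sketch (the key-server URL and key format are made up, not from the paper) of how a client could simply cache whatever keys the API hands out and decrypt everything offline from then on:

    # Hypothetical client: fetch decryption keys once, then reuse them forever.
    import json
    import os
    import urllib.request

    KEY_CACHE = "cached_keys.json"
    KEY_API = "https://example.com/dataset/keys"  # made-up key server

    def get_keys():
        # If we already stored the keys, just reuse them -- no API check can stop this.
        if os.path.exists(KEY_CACHE):
            with open(KEY_CACHE) as f:
                return json.load(f)
        # Otherwise fetch them once from the key server...
        with urllib.request.urlopen(KEY_API) as resp:
            keys = json.load(resp)
        # ...and persist them locally, defeating any later revocation.
        with open(KEY_CACHE, "w") as f:
            json.dump(keys, f)
        return keys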

BTW, their feature perturbation is most likely also specific to the architecture of the network being trained. Future architectures will be able to just use their "protected" images as-is.


So they have two versions of the dataset, one of which is completely pointless?

Just don't bother releasing your dataset. This is wasted effort.


If this takes off, it will become viable to store malware in pictures again. Perhaps that's the actual motive, as it would coerce people into using the copyrighted ones.


Hopefully someone will use this technology to allow people to protect their private image library from being used in computer vision applications.



