> ensure that nearly all of the next generation of CSAM will suffer from hash collisions with perfectly innocent images.
Even then, for that to affect _John Doe_, they would have to make 30+ images whose NeuralHash matches that of images already in _John Doe's_ iCloud account (30 being the approximate match threshold Apple said must be crossed before an account is flagged for human review).
I think that means they could target individuals, but only if they knew or could guess which photos the target has in their account.
They also might be able to target groups of individuals, say people who went on holiday to Paris. It would be interesting to see whether such people have enough overlap in the sets of NeuralHashes of the photos they took there; a rough sketch of how one might measure that follows below.
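To make that overlap question concrete, here is a minimal sketch (in Python, with hypothetical user names and truncated placeholder hash strings) of how one might check whether pairs of photo libraries share enough NeuralHashes to cross the ~30-match threshold. It assumes each library has already been reduced to a set of hash strings; computing real NeuralHashes is out of scope here.

```python
from itertools import combinations

THRESHOLD = 30  # approximate match count Apple cited before human review

def shared_hashes(a: set[str], b: set[str]) -> int:
    """Number of NeuralHashes two photo libraries have in common."""
    return len(a & b)

# Hypothetical per-user hash sets, e.g. extracted from holiday photos.
# Hash values are truncated placeholders, not real NeuralHashes.
libraries = {
    "alice": {"9a3f...", "bb01...", "c4d2..."},
    "bob":   {"bb01...", "c4d2...", "77e9..."},
}

for (u1, h1), (u2, h2) in combinations(libraries.items(), 2):
    n = shared_hashes(h1, h2)
    verdict = "enough" if n >= THRESHOLD else "not enough"
    print(f"{u1} vs {u2}: {n} shared hashes ({verdict} to cross the threshold)")
```

Two people photographing the same landmark produce similar but not identical images, so any real overlap would come from NeuralHash's perceptual tolerance rather than exact duplicates; this sketch only quantifies the overlap once the hashes are in hand.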