I know you're being coy to slam on Apple, but this already happens in the US __without__ Apple's (imo misguided) strategy, and it happens frequently.
Prosecutions over thumbnails, file-hash matches with only tenuous connections to individuals: crime tech has given law enforcement a new position from which to craft fanciful narratives in support of prosecuting just about anyone. I'm not claiming these have been *targeted* campaigns, but this already happens to US citizens very often.
Stingrays, Cellebrite, forensic "science", ShotSpotter: it's absolute chaos, and anyone can be a victim here.
My goal here isn't to say anything about Apple and its CSAM strategy, but to make very clear that the imagined scenario happens now, and happens frequently.
For that to happen, they'd have to be matched 30 times, Apple employees would have to mistakenly confirm each of those 30 matches in manual review, and the defense would have to fail to request the underlying images that matched the hashes during discovery.
"Our CSAM matching found 29 matches that are either confirmed or strongly suspected in your iCloud drive. But because Apple values your privacy, we are not turning this information over."
No. Expect that threshold to drop to 1, or very close to it, and soon. Because as much of a PR nightmare as this is for Apple right now, the above scenario is even worse.
You might be surprised to learn that many other major tech companies have already been using the same CSAM-matching technology for the last two years. How many of these situations have you heard of? I can't think of any.