> The algorithm detects the content whether it’s posted online or shared via a messaging app like WhatsApp.
That statement contradicts WhatsApp's official statement that all content is end-to-end encrypted: “WhatsApp ... we built end-to-end encryption into our app. When end-to-end encrypted, your messages, photos, videos, voice messages, documents, and calls are secured from falling into the wrong hands. Only you and the person you're communicating with can read what is sent, and nobody in between, not even WhatsApp.”
In this case, I believe WhatsApp more than I believe a technically illiterate writer on a government website who failed to understand some nuance. Maybe some WhatsApp users forwarded questionable content they'd received to the authorities, and then the algorithm scanned that content. That's very different from secret scanning happening within WhatsApp.
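To be concrete about what "end-to-end encrypted" means here: only the two endpoints hold the keys, so anything sitting in the middle sees nothing but ciphertext. A minimal sketch, assuming PyNaCl (the library choice and the variable names are mine, not anything WhatsApp publishes about its protocol):

```python
# Minimal sketch of end-to-end encryption, assuming PyNaCl (pip install pynacl).
# Only the sender's and recipient's devices hold the private keys; a relay in
# the middle handles ciphertext it cannot decrypt or scan.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()          # sender's device
bob_key = PrivateKey.generate()            # recipient's device

# Alice encrypts directly to Bob's public key on her own device.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"family photo bytes")

# The server only ever sees `ciphertext`; without bob_key it can't read it.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"family photo bytes"
```

Under that model, anything that inspects the photo would have to run on the device itself, before encryption or after decryption, which is a very different claim from WhatsApp or a server scanning messages in transit.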
My guess is that even if a photo is secure within WhatsApp, once it's displayed on your phone's screen it becomes available for Android to scan. Yes, that's a creepy thought, and presumably such a function would also be using quite a bit of the phone's battery to run continuously.
It's not clear, though, why all this stuff would be going to Switzerland in particular.
You might as well flag this, since there is no way to access the data sources referred to (and I am pretty sure quite a lot of actual information is lost in translation).
Source: I'm Swiss and read everything I could find; nothing even comes close to "apps like WhatsApp send pictures to the Swiss police". It's more in the direction of: pictures get sent by US law enforcement to Swiss law enforcement, and the volume has been growing over the last couple of years.
> In an effort to combat child pornography, providers like Facebook and Google automatically screen all photos for the presence of children and bare skin.
The presumption that childhood nudity needs to be screened in such a way is, itself, cringey and creepy and perhaps not without a hint of pedophilia.
> The algorithm detects the content whether it’s posted online or shared via a messaging app like WhatsApp.
OK, so what is "the algorithm?" Where can it be tested and audited?
And how do authorities acquire content sent in a WhatsApp message? We just went over this yesterday, didn't we? wcathcart, any response to this?
Our daughter had serious skin issues for years, and my wife and I shared numerous photographs of her.
The idea of a pack of ... auditing my personal messages and photos to my wife is abhorrent.
What we need is actual police doing actual police work, and actual judges putting actual pedophiles away for life. Not stalking normal people.
But, I guess cowardice is alive and well - normals don’t shoot cops and judges in the face, so they’re easy targets to satisfy the “just do something” vote.
They're not screening for kids and bare skin specifically. It's just a first heuristic to narrow down the pool of photos that conceivably might be child porn to something that human investigators can feasibly check.
Per the article, 90% of the kids-and-bare-skin photos received are found to be totally innocuous once a human looks at them; no action is taken, and the photos and associated personal info are removed from police computers.
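Concretely, the funnel described there looks something like this (a minimal sketch; the score, threshold, and review steps are placeholders I made up, not the actual system):

```python
# Two-stage funnel: an automated heuristic flags candidate photos, and only
# that flagged subset ever reaches a human reviewer. Per the article, roughly
# 90% of flagged photos are found innocuous and are then deleted.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Photo:
    data: bytes
    heuristic_score: float   # output of some automated "kids + bare skin" classifier, 0..1

def screen(photos: List[Photo],
           human_review: Callable[[Photo], bool],
           threshold: float = 0.8) -> List[Photo]:
    flagged = [p for p in photos if p.heuristic_score >= threshold]
    confirmed = []
    for photo in flagged:
        if human_review(photo):
            confirmed.append(photo)   # passed to investigators
        # else: photo and associated personal info are removed from the system
    return confirmed
```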
Except I'm not comfortable with my family's photographs being screened by a third party I don't know and have never met, and with this occurring explicitly without my knowledge.
So to stop child pornography, an all-seeing third party vets my family's photographs even when they're shared privately via WhatsApp or similar? That's fucked.
I'm not trying to blame you here, but since you're on this website I assume you're generally knowledgeable about what's happening. We've known about variations of this for so long.
There is a difference between knowing that Facebook posts are screened and thinking that private, "end-to-end encrypted" messages are being screened (if true).
> It's just a first heuristic to narrow down the pool of photos that conceivably might be child porn to something that human investigators can feasibly check.
Yes, but it's an invasion of privacy to even have someone looking at pictures that weren't shared publicly. Basically, what you've just said is that 90% of the time they are invading innocent people's privacy.
Child porn is really bad, but the right to privacy is also really important. I'm not convinced they've made the right decision here.
> The idea of a pack of ... auditing my personal messages
What fills in the blank left by your ellipsis is one of the real points to be made in light of this story.
We have to ask: who is fighting to have this job? In fact, who is this job tailor-made for in the first place?
It seems to me that there is no (or, at most, very little) legitimate security interest being served by this process. But the opportunity created for someone who enjoys looking at these sorts of pictures - and doing it under cover of law - is absolutely obvious.
Does this only apply to the Swiss government, or is Facebook sharing any pictures that hit their filters with law enforcement agencies of whatever country you happen to be living in?
I thought I'd seen some articles long ago saying Microsoft was doing this with everyone's OneDrive cloud storage, the one that's auto-installed with Windows 10(?) and added to Windows 8 if you install Office (still?):
that Microsoft's OneDrive system auto-detects pics with X% of skin showing, and when it does, all of your nudie (and semi-nudie) pics get sent to humans to look at automatically, and those humans can forward your pics to other people and agencies.
So I have always assumed that Google would be doing the same, as I recall some articles about them leading the way in detecting and censoring pics, and providing detectives with things like bed-sheet matches for hotels, and tracking phone numbers and pics of hookers who change cities (that one was a Google employee on a six-month leave or something?).
I have not seen anything about Apple getting into these things; it would be quite interesting if they do the same. It would mean all of the Fappening celeb victims had already been peeped on before the hackers released their photos to the public, right?
Are you 100% sure about "only check files which have been shared between users", though? I don't recall seeing that in the info I read some time ago.
"typically use hashes of files" - I don't think this changes anything as far as msoft employees and others looking at pics of your kids or wife /gf / bf, etc; if you have taken the photos or he/she did and sent them to you right? This would only remove some of the internet shared/saved photos from the pile right?
Privately taken pics saved on a Windows computer would, from what I understand, still be sent to human reviewers to look at.
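To make the hash question concrete, here's roughly how matching against a database of already-known images works (a toy sketch: real systems like PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, and the database entries here are made up):

```python
import hashlib

# Toy sketch of hash-based matching: an uploaded file's hash is looked up in a
# database of hashes of *previously identified* images. A freshly taken photo
# has never been hashed before, so it can't match; plain SHA-256 here is only
# a stand-in to show the lookup logic, not the actual algorithm.

KNOWN_BAD_HASHES = {
    # hypothetical entries, e.g. distributed by a clearinghouse such as NCMEC
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_image(file_bytes: bytes) -> bool:
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A brand-new family photo produces a hash nobody has seen before:
print(matches_known_image(b"bytes of a photo taken five minutes ago"))  # -> False
```

So hash matching by itself only catches re-circulated known images; whether anything beyond that (like skin-detection heuristics) gets applied to private files is the part that isn't clear from what I've read.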