Yes, but let's not conflate this specific use case with everyday usage.
I'm a gigantic privacy advocate and probably senselessly cautious about tons of technologies, but that's the result of a conscious choice. Set-and-forget encryption, or programs that advertise they can skip that first conscious step, are dangerous in my opinion/experience. They can give the wrong impression about security and lead to dangerous decisions/actions, or the lack thereof.
The concept behind thoughtless encryption is noble, but for all encryption models/schemes I know of, it has to be a deliberate and intentional decision, or else you end up with:
- Lost data due to key mismanagement (not a security failure per se, but extremely inconvenient)
- Incomplete or ineffective encryption
- Vehicles for intentional deception that allow bad actors to get you to share sensitive/personal data under the impression that it's protected
and much, much more.
Steve Yegge said it best: security and usability are constantly at odds, and from my perspective that's a good thing, to some degree. With so many FUD apps promising security, that friction is, for the time being, a very convenient litmus test for laypersons asking "how much should I trust this app?"
I know that it's popular on HN to shit on Telegram, but in my current country of residence, Telegram is the most popular messenger program...for normal, non-secretive messaging. It is extremely well known that you don't trust TG for secret conversations, illegal purchases, or any other illegal activity. Not even because of the security model of secret chats, but because of their discoverability. Basically the thought is "If you could find it without being a member of some ring of trust, so can the police". It's one reason why the conversations that frequently happen on HN about Telegram feel so misguided to me -- people whose conversations could put them at risk __aren't using the app for such conversations__. They're not using any such apps; they're either using disposable communications (burner phones, pen-and-paper convos, etc.) or arranging meetings in other ways.
Encryption/security has been commoditized by app and platform creators and packaged into a marketing tool. In most countries, I wouldn't trust Telegram or Signal with such conversations any more than I'd trust WhatsApp, Messenger, Messages, or whatever Google calls their chat app this month.
To wrap it back to the GP's comment, I get the complaint about Apple Notes not being E2E encrypted -- but, if you've got sensitive data that needs to be recorded, why are you cloud-syncing it in the first place with a company that has frequently been investigated/prodded by the US Government, and even more frequently probed/violated by data exfiltration companies that work directly with the aforementioned government?