>All chats use the same Signal protocol outlined in this whitepaper, regardless
of their end-to-end encryption status. The WhatsApp server has no access to
the client’s private keys, though if a business user delegates operation of their
Business API client to a vendor, that vendor will have access to their private
keys - including if that vendor is Facebook.
Not sure if the Facebook exception was there in the previous version.
Yeah. It is really clear that Facebook is finally getting around to just implementing this feature of effectively having "hosted clients" for companies to be able to more easily--and yes: less securely--build chat bots (a mechanism that I appreciate is maybe less than ideal to encourage, but frankly just doesn't feel that bad and certainly isn't a surprise: Facebook has been talking about this for a year or two now); and all of the "changes" this week have been directly because of this, including the Privacy Policy update... the key article about which even explicitly said:
> The move, the spokeswoman said, is part of a previously disclosed move to allow businesses to store and manage WhatsApp chats using Facebook's infrastructure. Users won't have to use WhatsApp to interact with the businesses and have the option of blocking the businesses. She said there will be no change in how WhatsApp shares data with Facebook for non-business chats and account data.
And yet, somehow, everyone is just in complete hysterics over all of this, claiming Facebook is evil and undermining the feeling of security people have in WhatsApp, with lots of talk of switching not only to reasonable alternatives like Signal, but also to less secure messaging protocols like Telegram (or, frankly, Matrix). People at my supposedly-smart privacy company--Orchid, building something akin to "incentivized Tor for general VPN use"--are even panicking about this news, and it is really frustrating how no one even seems to want to analyze this carefully... "bUt FaCeBoOk Is EvIl!!" :/.
Yes, everyone is in complete hysterics exactly because Facebook is evil (by the definition "harmful or tending to harm" (OED) or "morally reprehensible" (Merriam-Webster)). Just remember the recent(-ish) Oculus controversy, where they forced everyone who bought their hardware to sign in with Facebook and in some cases (soft-)bricked users' devices because their Facebook accounts did not have enough activity [1]. Especially because Palmer Luckey (founder of Oculus), when answering questions about the acquisition in 2014, said that Facebook would not do such a thing [0].
I personally am scared because the language being used here is not at all specific to the scenario mentioned here ("hosted clients"). I understand that anything more specific would probably be rejected by their legal team. I am afraid that some 5 years down the line they'll be able to do something worse without notifying users because the TOCs and privacy policies are written in this ambiguous language.
Regarding alternatives, I can't really speak on the security/privacy of any of them but from what I can gather, Matrix does have E2E-encryption functionality [2] so I'm not quite sure how it is less secure than Signal (provided you host your own server and/or have a reasonable degree of trust in the server-operator of your conversation-partner).
And when Facebook is doing something evil, I actively blast them for it; in particular, I have been extremely vocal with everyone I know about many aspects of the Oculus account issue, which I consider to be extremely evil when combined with their closed store model and DRM setup with developer account revocation (etc. I am somewhat famous for being a broken record on some topics, so I will try to avoid going into too much depth ;P).
Obviously, though, (but maybe not to you?!?) this is a completely unrelated issue to the WhatsApp "changes" this week: trying to use "Facebook is evil, so everything they do is evil" is not only ridiculously disingenuous--to the point of undermining the ability to make these kinds of arguments at all and still be taken seriously :(--but doesn't even satisfy basic questions like "ok, and do you also consistently use this frame with Apple and Google?" (both of whom are also evil to the point of being morally reprehensible).
As for Matrix: they do not have a solution for metadata yet, and even have gone so far as to claim that maybe they will never figure it out (due to being a federated system). Your metadata just ends up getting semi-permanently logged on various machines, and there is nothing you can do about it at this time. AFAIK, Signal has implemented solutions to this (even, I believe, fixing the subtle thing I used to complain about where their server technically had a temporary in-memory metadata log for rate limiting).
Facebook logs all metadata that is available from WhatsApp as well. I'd rather have my metadata on Matrix servers than on FB servers; at least it's not connected to my phone number, which is tied to my real identity. Also, Matrix doesn't upload my entire contact list to Facebook. If it's secure enough for the German military and the entire French government, it's certainly secure enough for me.
> Your metadata just ends up getting semi-permanently logged on various machines, and there is nothing you can do about it at this time.
Sealed sender means that an eavesdropper who can introspect into RAM inside Signal's AWS infrastructure is no better off than a network eavesdropper who passively sniffs ingress/egress.
That doesn't mean they can't build a reasonably accurate metadata database covering most people: people who communicate from a limited number of mobile IPs to a limited number of mobile IPs.
Signal is way better than Matrix, but let's not pretend it has totally solved the metadata problem.
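The sealed-sender property being discussed here can be sketched as a toy model. Everything below is illustrative: the repeating-key XOR "cipher" and the field names are stand-ins for demonstration, not Signal's actual design, which uses sender certificates and X25519-based encryption. The point is only that the sender's identity rides inside the encrypted payload, so the relay sees the recipient but never the sender.

```python
import json
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Stand-in symmetric "cipher" (repeating-key XOR) -- NOT real cryptography.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def seal(sender: str, recipient: str, body: str, recipient_key: bytes) -> dict:
    # The sender's identity travels *inside* the encrypted payload.
    inner = json.dumps({"from": sender, "body": body}).encode()
    return {"to": recipient, "payload": toy_encrypt(recipient_key, inner)}

def open_sealed(envelope: dict, recipient_key: bytes) -> dict:
    # XOR is its own inverse, so the same function decrypts.
    return json.loads(toy_encrypt(recipient_key, envelope["payload"]))

key = secrets.token_bytes(32)
env = seal("alice", "bob", "hi", key)
assert "from" not in env                      # the relay only sees the recipient
assert open_sealed(env, key)["from"] == "alice"
```

Even in this toy version, though, the network-level fact that some IP sent a blob to the server at some time still exists, which is exactly the traffic-analysis residue described above.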
Extremely evil was when an entire population was wiped off the earth in the industrial genocide of the Third Reich. Facebook or WhatsApp changing its TOS is irritating, but it is not "extremely evil". I just realised that this is the same absolute language that incited the violence we saw on Wednesday. If something is "extremely evil", then there are very few constraints, short of the Geneva Convention (and probably not even that), that you should feel bound by in your response. The point is that language matters, so enough with calling everything we disagree with "evil".
It was carefully explained to me that Facebook only wishes they could be as evil as Google is, now, or as Microsoft used to be able to be. Nowadays, even Microsoft and Russia wish they could afford to be as evil as Google; and even the spooks have had to outsource theirs.
(I use "evil" in the technical sense: not necessarily intending to exterminate humanity, but wanting to be able to -- or anything short of that -- if they did.)
Unfortunately, your interpretation of Facebook's motives requires trusting that they'll only do what their PR says they'll do, and not what they're able to do. Or, even if that is their current reasoning, one then has to trust that they won't take advantage of said ability in the future.
For many of us, Facebook's past actions are more than enough to prove that they do not deserve the benefit of the doubt in this case.
In addition, if this is indeed the backstory, then Facebook’s product management team failed miserably by not owning the story and instead deferring to anonymous internet commenters to explain their changes.
Remember when Facebook Security added SMS 2-factor verification and promised to never use the phone number for anything else, but then they were overridden and it was fed into the social graph, leading to their CISO resigning?
Thanks for posting this. I was considering making the jump to a new messenger but decided to wait and see what others had to say about the changes to the privacy policy and what it actually means from a privacy perspective. The use case for businesses to be able to use it for hosted clients (probably hosted, and with messages stored, by Facebook) makes sense, and doesn't seem as bad as it's been made out to be: we still get the same level of privacy we've always had in individual and group WhatsApp chats.
> but also to less secure messaging protocols like Telegram (or, frankly, Matrix)
Appreciate that Telegram doesn't have a good rep in the security community, but what's wrong with Matrix?
Also, this is off-topic, but I just wanted to say thank you for all the work you've done in the past with Cydia. I was a 1st gen iPhone user and got a lot of use out of services such as Cydia (in fact, I'm convinced the App Store was inspired by services like Cydia).
Matrix is pretty open about how it hasn't been able to do anything about metadata leakage (which they have even at some times claimed is somewhat inherent to its federated nature; I think that is an overstatement, but is something that even they seem to believe).
> Matrix does not protect metadata currently; server admins can see who you talk to & when (but not what). If you need this today, look at Ricochet or Vuvuzela etc.
> Protecting metadata is incompatible with bridging.
> However, in future peer-to-peer home servers could run clientside, tunnelling traffic over Tor and using anonymous store-and-forward servers (a la Pond).
Signal, in contrast, put a lot of effort into metadata reduction--critical as they are a single giant hosted relay service--and in the process (I am very sure) even fixed the issue I used to complain about wherein their server was technically keeping around a temporary-ish in-memory metadata log for rate limiting.
If you are going to switch to something, switch to Signal (...though I sadly can't in good faith ever really recommend anyone do that, due to how Signal has crippled the ability to do chat backups; more info on this in the other thread going on today re Signal/WhatsApp).
Those slides are from 2017. P2P Matrix was released in June 2020. A lot of work is being done on Dendrite, the latest commit was posted two hours ago as of this writing. From the GitHub page for Dendrite: "As of November 2020 we're at around 58% CS API coverage and 83% Federation coverage, though check CI for the latest numbers."
So, yes, for now the metadata leakage is a real issue. However, this is likely to change in the near future.
Thanks for the info. I was under the impression that you were claiming that Matrix is less secure than WhatsApp. If they both leak metadata, then they're roughly equal from a privacy perspective, no? I guess with WhatsApp you can't know the extent of the metadata leakage, but at least with Matrix you have the advantage of knowing precisely what data is leaked.
Not trying to push Matrix or anything (I've been using Signal for some time already anyway), but I thought I'd see what alternatives there are. The lack of chat backups is a real drawback, though since the Android version has a backup option, I'm hoping it's something they'll eventually implement?
Probably not, because that doesn't say WhatsApp will have the private keys, just that the vendor will. In fact, the next sentence you left out of the quote is:
> However, these private keys will still not be stored on the WhatsApp chat server.
That's in a different place (p. 11), but the gist is still the same. Even my quote includes "The WhatsApp server has no access to the client’s private keys".
I guess it wasn't clear, but I was trying to refute the claim implied by the Twitter post (by showing that the document still claims that WhatsApp servers don't have access to the private keys).
I have a question for the people who said Telegram is worse than WhatsApp in every possible way for privacy. Do you still hold this belief? At least Telegram is keeping its promise: if you start a secret chat, only you and your peer know the encryption keys.
> if you start a secret chat, only you and your peer know the encryption keys
The moment a product is "secure by exception" rather than "secure by default," a huge benefit of E2E encryption is immediately thrown out the window. Sometimes the simple knowledge of which conversations are secure, and which aren't, is more valuable than the content of those conversations.
Furthermore, when everything is E2E encrypted, mass surveillance of message content is essentially quashed.
Please stop bringing up this "never roll your own crypto" argument. It's a guideline, not a hard rule. Signal actually rolled their own crypto and aren't constantly criticized for that; on the contrary, Signal is praised for rolling its own crypto.
Don't get me wrong, the Telegram crypto can (and should) definitely be criticized. But please criticize that they use "bad crypto" or "strange crypto" or "unreviewed crypto", not that it's their own. (And of course, substantiate such claims with references that can be discussed.)
As far as I know, the Signal protocol was developed for TextSecure back in 2013. Noise actually references the Double Ratchet Algorithm by Moxie and Trevor Perrin as an inspiration. Not the other way around.
At some point in time, all now well-established cryptographers will have developed their first own cryptosystem, without already having established a good reputation. Whether or not someone develops a cryptosystem without being famous for cryptography work is simply not a good argument for discussing a cryptosystem. The properties of a cryptosystem are a good argument for discussing it.
> Don't get me wrong, the Telegram crypto can (and should) definitely be criticized. But please criticize that they use "bad crypto" or "strange crypto" or "unreviewed crypto", not that it's their own. (And of course, substantiate such claims with references that can be discussed.)
Instead of that, the people bringing it up can just provide you with a link to one of the many discussions of it.
All protocols are invented at some point. Telegram did a terrible job marketing this one, but it has been a long time now and the only issue I ever heard of was fixed some years ago. It's still not exactly pretty, but then look at TLS; I'm actually quite okay with MTProto.
The real issue is that MTProto's secret chats are never used. They aren't implemented in most clients for no apparent reason ("can't keep state for encryption keys!" is the usual excuse; dude, you keep my login token, what's the big deal here?) and if you try to use them, they don't sync between devices. One of Telegram's core selling points is a solid desktop experience.
You mean Signal, which was created by Moxie Marlinspike and other legit cryptographers and security researchers? Who rolled Telegram's crypto? No idea. Why should we trust them? No idea. I think I'll go with the people who have been contributing to the field for years and are highly respected.
I'd rather go with cryptanalysis and/or audits than big names. Both protocols are old enough now to have had ample opportunity.
And I can't tell if Moxie really means to improve the status quo or works for some three letter agency and builds just enough metadata opportunities into popular messengers and opportunistic encryption into WhatsApp to be helpful without being suspicious. To avoid redundancy, I posted these only yesterday and they include some of the reasons: https://news.ycombinator.com/item?id=25669531 and https://news.ycombinator.com/item?id=25669267
They don't cover everything unfortunately but I'm also getting annoyed with the ephemerality of HN. What's posted last week is forgotten and never looked at again. I can try to find old posts that cover it or type it all out again (and it's a big claim so very few people will even take the time to read a big comment with reasons in the middle of another thread). I'm also not denying he does good stuff, just that there are enough weird opinions (decentralization = evil, anybody but us = evil, bug bounties = evil...) that I carefully look at what he makes and would rather there were better alternatives than their central servers.
Signal is still the only realistic messenger to use for good security and usability, unfortunately. Wire is a good second but Signal is definitely more smooth and I'd still recommend that to the general public, with the asterisk that it's an American company and that they should try Matrix if they're feeling adventurous (Wire falling somewhere in the middle, at that point you might as well try Matrix).
> And I can't tell if Moxie really means to improve the status quo or works for some three letter agency and builds just enough metadata opportunities into popular messengers and opportunistic encryption into WhatsApp to be helpful without being suspicious.
Moxie is an anarchist (or near to it) and has been so for a long time. Secretly working for the NSA would be a stupendously long con.
Might not have been planned from the get-go. But let me quote myself from a sibling comment:
> it's more of a hyperbole than something I truly suspect. It's just that their opinions are in line with the hacker community 50% of the time, and in line with surveillance organisations the other 50% of the time. Of course, he always has some reason for having the opinion, it's all covered up just fine, so it could also be perfectly legit. It's just weird to argue both sides at the same time.
If he really is an anarchist, how does that square with the prohibition on forks using Signal's servers? Or the insistence that Google is the only place you should get the APK from? Shouldn't we all build from source, not trust a central distribution point? They argue both sides, and I find it hard to tell what they really believe in.
That said, I definitely see your point and, as said, he does plenty of things to improve the status quo. It's just his rejection of other things that would be even better.
> And I can't tell if Moxie really means to improve the status quo or works for some three letter agency and builds just enough metadata opportunities into popular messengers and opportunistic encryption into WhatsApp to be helpful without being suspicious.
As if Moxie having opinions that you don't agree with is evidence for some covert NSA operation or some such. What nonsense.
Yeah it's more of a hyperbole than something I truly suspect. It's just that their opinions are in line with the hacker community 50% of the time, and in line with surveillance organisations the other 50% of the time. Of course, he always has some reason for having the opinion, it's all covered up just fine, so it could also be perfectly legit. It's just weird to argue both sides at the same time.
Sorry, was still editing in a bit of context, please see the current version. If there is anything in particular feel free to ask, but the whole analysis is more of a submission of its own that I'm not sure I'm up to writing today.
No this is fine. I fully agree with your points and that's precisely why I'm not a fan of his. But yeah, Signal is the shiniest turd we have for secure messaging that's normie (as in not someone in tech) friendly.
Sure, they rolled their own encryption. But let's look at this problem from a different point of view. Every encryption protocol is invented by some group of people; Signal and Telegram are no exceptions. The only difference is that other encryption algorithms have been tested, audited, and verified by time. Keep in mind that every algorithm was rolled out by someone; you can't say there will never be a better encryption algorithm than all of those available today. So someone will roll their own crypto in the future anyway, and it might be more secure than everything we have now.
This is a false dichotomy. It isn't Telegram vs WhatsApp. Even if it was, the answer, in light of this new discovery, would be neither.
At this point, Matrix, Threema, and Signal are some of the more popular cross-platform solutions left to consider. Neither Telegram nor WhatsApp is an answer to any privacy question anyone may have.
Can anybody remember the story, I think it was a few years ago, when a journalist warned in an article that messaging apps like Whatsapp are vulnerable because they rely on a server for key exchange, and all the security researchers requested that the story should be retracted because it would lead people to use SMS which is even less secure? I may be misremembering some details.
Warning that it's vulnerable because it relies on a server for key exchange is like warning that water is wet and you shouldn't let it loose in your house to prevent water damage. It's correct, but redundant. The very definition of end-to-end encryption is not trusting the server, so you need to verify the exchanged keys. This is a requirement in Signal, Wire, Threema, Jami, Briar, Element/Matrix, Keybase, OTR, and all other protocols. If you don't do that, then yes, you rely on whoever owns (or "owned") the server.
What might need a warning is that the server can push new keys to your phone at any time and, unless you go into your security settings, you will never notice. Being warned of key changes is opt-in. That's why WhatsApp does, by default, opportunistic encryption.
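The opt-in key-change warning described above amounts to trust-on-first-use pinning: remember the fingerprint of a contact's key the first time you see it, and flag any later change. A minimal sketch, where the function names and the truncated SHA-256 fingerprint are illustrative assumptions, not any real app's actual scheme (Signal's "safety numbers" and WhatsApp's "security code" derive comparable values from the real identity keys):

```python
import hashlib

# contact -> fingerprint recorded on first use (trust on first use)
pinned = {}

def fingerprint(public_key: bytes) -> str:
    # Short hash of the key material; real apps use the identity keys here.
    return hashlib.sha256(public_key).hexdigest()[:16]

def check_key(contact: str, public_key: bytes) -> str:
    fp = fingerprint(public_key)
    if contact not in pinned:
        pinned[contact] = fp            # first contact: trust and pin
        return "pinned"
    if pinned[contact] != fp:
        # The server (or a phone swap, or a reinstall) handed us a new key.
        return "WARNING: key changed"
    return "ok"

assert check_key("bob", b"key-v1") == "pinned"
assert check_key("bob", b"key-v1") == "ok"
assert check_key("bob", b"key-v2") == "WARNING: key changed"
```

Whether that last warning is shown loudly by default, buried in settings, or suppressed entirely is exactly the policy difference being debated here.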
But Moxie was involved in the implementation and got only a few million for publishing that claim so no worries y'all.
Right. WhatsApp/iMessage etc end-to-end encryption is meaningless because a single firm can turn it off invisibly any time they like. In fact we only have their assurance that it even exists at all, given the difficulty of reverse engineering their protocols and checking everyone has the same clients.
I've felt very uncomfortable about the way Valley firms jumped on board the end-to-end bandwagon. The intentions are good and ones I wholeheartedly support, but the claims made for it are just not true. The WhatsApp paper is at least slightly less deceptive than it once was, and I guess that's progress of sorts, but the damage is done already. One day Facebook will discover some sort of burning reason why a WhatsApp user has to be decrypted, it will come out that this has been done, and trust will be irrevocably burned.
It doesn't mean they don't either. It's the removal of the previous claim that's worrying.
But honestly, it doesn't matter anyway, since WhatsApp is somehow able to back up all your data on Google Drive and restore it on a separate phone. How are they able to do that without backing up the private key?
The backups are unencrypted as highlighted in the UI (if I recall correctly). They re-generate the keys when you switch phones / re-install / clear data. That's when you get to see the "XYZ's security code changed" service message
I guess it makes sense. It doesn't make a difference whether you deliver the chest along with the key to its lock, or a chest with no lock at all.
> For example, if you use a data backup service integrated with our Services (like iCloud or Google Drive), they will receive information you share with them, such as your WhatsApp messages.
So if I understand correctly, when a business uses WhatsApp, to maintain E2EE WhatsApp must "emulate" as if all customer reps of that business were sending WhatsApp messages through one phone. To do that, all reps connect to what is essentially one WhatsApp instance in a Docker container. This container holds the private key. And if a business tells Facebook to host that container, this means Facebook has possession of the private key?
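If that reading is right, the trust boundary can be sketched as follows. All class and method names, and the XOR stand-in for decryption, are purely illustrative, not WhatsApp's real API; the point is only that the private key lives with whoever runs the client instance, while the relay only ever handles opaque blobs.

```python
class Relay:
    """Stand-in for WhatsApp's chat server: stores and forwards blobs only."""
    def __init__(self):
        self.log = []
    def forward(self, blob: bytes) -> bytes:
        self.log.append(blob)   # no key here, so nothing to decrypt with
        return blob

class HostedClient:
    """Stand-in for the Business API instance: it runs wherever the host
    (the business itself, a vendor, or Facebook) chooses to run it, and
    that host necessarily has access to this key."""
    def __init__(self, private_key: int):
        self.private_key = private_key
    def decrypt(self, blob: bytes) -> bytes:
        # Toy "decryption" (XOR with a key byte) -- NOT real cryptography.
        return bytes(b ^ self.private_key for b in blob)

client = HostedClient(private_key=0x42)
relay = Relay()
ciphertext = bytes(b ^ 0x42 for b in b"order #1234")
assert client.decrypt(relay.forward(ciphertext)) == b"order #1234"
assert not hasattr(relay, "private_key")   # the relay never holds the key
```

So yes: under this model, a business that delegates hosting of its client to Facebook is handing Facebook the machine that holds the key, even though the chat server itself still never sees it.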
The US usage patterns seem different, but where I am, everyone has WhatsApp and it's basically used as an SMS system that actually works. If you compare WhatsApp's security and features to SMS security and features, you stop caring.
If you really want trustable end to end encryption, there are other apps for that :)
Don't they _need_ the private keys to mine all your messages for data on which to base the in-chat ads? This shouldn't be a surprise after recent announcements.
Not necessarily. In the past there were also reports showing that, if the Facebook app is installed on the same device, it would have access to the decrypted messages (not sure for which platform, though).
Yikes. The bad news about WhatsApp just keeps pouring in.
For me personally I only ever used WhatsApp very lightly with a few work friends. After all of the recent news surrounding the app I sent a message saying I plan on leaving the app soon.
I wish it were easier to switch apps like this but it makes sense that they wouldn't want that to be the case.
After Facebook bought WhatsApp, I never for a moment believed it was secure.
I've mentioned on here before a conversation I had on WhatsApp, right after which I was presented with ads for the topic on Facebook. I've heard people have similar stories. WhatsApp isn't secure.
I've also heard many people claim this with Facebook and other platforms. I would love to see a study on this because I'm unsure of the evidence so far. Humans can make mistakes. They can forget typing something into Google, Facebook, etc. I can't even remember the Google searches I did yesterday!
Even then, a lot of the "I was talking about something with a friend and never even Googled it then got an ad for it" can be explained by Facebook leveraging their social graph to target ads if your friend googled something.
They have a social graph that indicates who your probable friends are, regardless of actual Facebook/Instagram/WhatsApp friend status, using Bluetooth and Wi-Fi identifiers based on physical closeness.
I'm very skeptical of WhatsApp's security, but I'm also very skeptical of these ad claims. We've seen a ton of those over the years, and these companies would have a lot to lose if they did that.
Chances are that you're either more predictable than you expected, or it's just random chance and correlation bias. Billions of people use these services, there have to be some freaky coincidences happening all the time. We need something a lot more solid than "I've heard people" to make any conclusion.
But the general point still holds, it's a closed source app made by a company that thrives on data mining, of course it should be considered insecure by default.
My story is: 2 years ago I was looking for an apartment. My friend, who is an agent, took me to an apartment. After the viewing, I messaged him on WhatsApp saying I liked the place but I wanted the landlord to put latches on the windows so it's child-safe.
After messaging him I went to Facebook. Scrolling the timeline. A minute later I have adverts for window latches and window grills for child safety.
I didn’t search Google or anything. I was shocked.
I always see claims like this but never with actual evidence or research done.
On Android, I can spoof the microphone access permission so that WhatsApp thinks it has access. I can then log whenever this permission is used while giving the app spoofed data. I have done this for many apps and, unsurprisingly, most try to access the microphone in the background over and over again. I have not done this experiment with WhatsApp yet, though.
What I use for this is XPrivacyLua by M66B, which also has support for scriptable hooks in Lua.
The WhatsApp client is closed source, so why is this news? Is this some kind of promise from the company that the client will not have access to private keys?
The "WhatsApp Encryption Overview" technical whitepaper [1] had the following text removed between revisions:
"At no time does the WhatsApp server have access to any of the client's private keys."
[1] https://scontent.whatsapp.net/v/t39.8562-34/122249142_469857...