
Here's how WhatsApp group messaging works: membership is maintained by the server. Clients of a group retrieve membership from the server, and clients encrypt all messages they send e2e to all group members.
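The model described above can be sketched in a few lines of Python. This is a toy illustration with made-up names, not WhatsApp's actual code (the real system uses Signal-protocol sessions, not the stand-in XOR cipher below); it just shows the trust split: the server holds membership, the clients hold the keys.

```python
# Toy sketch of the model above (illustrative names, not WhatsApp's code):
# the server stores only membership; clients fetch that list and encrypt
# each message end-to-end, once per member, so the server relays ciphertext
# it cannot read. The XOR "cipher" stands in for real Signal sessions.

class Server:
    def __init__(self):
        self.members = {}   # group_id -> set of member ids
        self.inbox = {}     # member id -> list of relayed ciphertexts

    def get_members(self, group_id):
        return set(self.members[group_id])

    def relay(self, member, ciphertext):
        self.inbox.setdefault(member, []).append(ciphertext)

class Session:
    """Stand-in for a pairwise e2e session."""
    def __init__(self, key: bytes):
        self.key = key

    def encrypt(self, data: bytes) -> bytes:
        return bytes(b ^ self.key[i % len(self.key)] for i, b in enumerate(data))

    decrypt = encrypt  # XOR is its own inverse

def send_group_message(server, group_id, plaintext, sessions):
    # Membership comes from the server: that is the trust boundary the
    # paper is about, since a hacked server can change this list.
    for member in server.get_members(group_id):
        server.relay(member, sessions[member].encrypt(plaintext))

server = Server()
server.members["friends"] = {"bob", "carol"}
sessions = {"bob": Session(b"k1"), "carol": Session(b"k2")}
send_group_message(server, "friends", b"hi all", sessions)
assert sessions["bob"].decrypt(server.inbox["bob"][0]) == b"hi all"
```

Note how an attacker who edits `server.members` receives future ciphertexts encrypted to a key they control, but has no keys for anything sent before the change, matching points 1 and 2 above.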

If someone hacks the WhatsApp server, they can obviously alter the group membership. If they add themselves to the group:

1. The attacker will not see any past messages to the group; those were e2e encrypted with keys the attacker doesn't have.

2. All group members will see that the attacker has joined. There is no way to suppress this message.

Given the alternatives, I think that's a pretty reasonable design decision, and I think this headline pretty substantially mischaracterizes the situation. I think it would be better if the server didn't have metadata visibility into group membership, but that's a largely unsolved problem, and it's unrelated to confidentiality of group messages.

In contrast, Telegram does no encryption at all for group messages, even though it advertises itself as an encrypted messenger, and even though Telegram users think that group chats are somehow secure. An attacker who compromises the Telegram server can, undetected, recover every message that was sent in the past and receive all messages transmitted in the future without anyone receiving any notification at all.

There's no way to publish an academic paper about that, though, because there's no "attack" to describe, because there's no encryption to begin with. Without a paper there will be no talks at conferences, which means there will be no inflammatory headlines like this one.

To me, this article reads as a better example of the problems with the security industry and the way security research is done today, because I think the lesson to anyone watching is clear: don't build security into your products, because that makes you a target for researchers, even if you make the right decisions, and regardless of whether their research is practically important or not. It's much more effective to be Telegram: just leave cryptography out of everything, except for your marketing.




> To me, this article reads as a better example of the problems with the security industry and the way security research is done today, because I think the lesson to anyone watching is clear: don't build security into your products, because that makes you a target for researchers, even if you make the right decisions, and regardless of whether their research is practically important or not. It's much more effective to be Telegram: just leave cryptography out of everything, except for your marketing.

Honestly, this paper would be fine if it were just an analysis. The shitty thing about it is rather the prepped, buzzy Wired article.

EDIT: I just noticed that Matthew Green published a blog post about this titled "Attack ...". That's really surprising :/


>EDIT: I just noticed that Matthew Green published a blog post about this titled "Attack ...". That's really surprising :/

How so? He consistently sensationalises his stuff.


> In contrast, Telegram does no encryption at all for group messages, even though it advertises itself as an encrypted messenger, and even though Telegram users think that group chats are somehow secure. An attacker who compromises the Telegram server can, undetected, recover every message that was sent in the past and receive all messages transmitted in the future without anyone receiving any notification at all.

I'm going to be honest, moxie. I'm a big fan of your work, and I basically agree with everything you've stated here as someone who works in the security industry. I don't particularly like Telegram, and I encourage use of Signal where possible. Just last week I was defending Signal on HN[1].

However, I think you shouldn't be bringing up Telegram here. The article does not mention Telegram by name, and I think that bringing it up here, as one of the developers of the Signal Protocol, distracts from your point. Holy war threads between Signal and Telegram bubble up on occasion on Hacker News, and people are basically aware of who you are. As an outside observer, bringing up Telegram in the way you did comes across as preternaturally defensive whataboutism.

I think you could have expressed your points about the security industry's disincentives (which are legitimate observations, in my opinion) without using Telegram as an example. But bringing up Telegram instantly shifts the focus away from Whatsapp, Signal and latent problems in the security industry; instead, it becomes the usual Signal vs Telegram circus. I don't think that's a particularly persuasive way to forward your points.

To reiterate: I agree with what you're saying, but I think that it's very likely your comment will be perceived in a way that you don't intend, to the detriment of persuasion.

___________________________

1. https://news.ycombinator.com/item?id=16064932


> I think you shouldn't be bringing up Telegram here. The article does not mention Telegram by name, and I think that bringing it up here, as one of the developers of the Signal Protocol, distracts from your point.

He quickly dismissed the idea that this vulnerability is a real one, and explained why. In the end it looks like a minor issue, blown out of proportion by this article.

The problem is precisely that this article does not mention Telegram even though it is in direct competition with Signal. If I didn't know better, I would assume from the article (and the paper) that Telegram is not subject to this vulnerability, and is probably "still" secure (if I thought it was before). Moxie addresses the issue, so this is not whataboutism; he just hints at what the article should have mentioned, that experts have been recommending Signal (and, after it, WhatsApp) over Telegram for ages, and that even though this recommendation could now take a hit, it probably won't budge with a vulnerability that small.

> Holy war threads between Signal and Telegram

"vim vs emacs" is a holy war; the fact that Signal is more secure than Telegram is not, when there is a consensus among experts about the question. IMHO, calling it such is misleading.


There's a fine line between being outspoken [1] about one's concerns and hammering the same points at every opportunity. Telegram's marketing is heavy with weasel words that multiple people -- journalists, tech experts, crypto experts -- have called out as probably empty posturing, and their implementation is shrouded in opacity in exactly the ways it shouldn't be. No one except laypeople believes Telegram clears the bar set by Signal, Matrix, or any of the systems OWS consulted on, but there are lots of laypeople: millions of them.

When you're in the industry, especially a leading innovator in the industry, it's infuriating to see an inferior product being recommended, one that you can credibly suspect doesn't even deliver on the promises, but in your attempts to discredit that product you'll sometimes come off as a crusading zealot, to the detriment of other content you've packaged with your commentary.

There was little need to call out Telegram by name in the post, because doing so instantly re-frames the conversation, and in a forum some of the conversation will continue down that new path, as it does now. That's a mistake in this format, and it does come off as a defensive misdirect made in the heat of argument. The place to reinvigorate this criticism in light of the new revelations is one's own personal -- or even professional -- blog, where you can start off on the high ground.

[1] https://hn.algolia.com/?query=by:moxie%20telegram&sort=byDat...


I don't think the link you posted helps your case that Moxie is "hammering the same points at every opportunity". The link shows that he mentions Telegram only a few times a year, often in response to a Telegram-specific article, with the last time before this one being two years ago.

Maybe it would seem that way for someone who's religiously following what Moxie says, but that's sort of like complaining of hearing "you should charge more" too often if you're religiously following patio11.

I also think he made a valid point in his most recent post, and mentioning Telegram added valuable context to his argument.


> "vim vs emacs" is a holy war; the fact that Signal is more secure than Telegram is not, when there is a consensus among experts about the question. IMHO, calling it such is misleading.

A holy war is determined by its propensity to raise "debates of attrition" in which both sides are so unyielding they may as well be (and sometimes are) ideological. Whether or not one side has a legitimate claim to superiority over the other is entirely orthogonal; such a debate is "holy" in nature because even if that superiority existed and was demonstrated, it would not be accepted. You cannot use reasoned expertise to decompose ideological adherence.

With respect to your other point:

> The problem is precisely that this article does not mention Telegram even though it is in direct competition with Signal. If I didn't know better, I would assume from the article (and the paper) that Telegram is not subject to this vulnerability, and is probably "still" secure (if I thought it was before). Moxie addresses the issue, so this is not whataboutism; he just hints at what the article should have mentioned, that experts have been recommending Signal (and, after it, WhatsApp) over Telegram for ages, and that even though this recommendation could now take a hit, it probably won't budge with a vulnerability that small.

I would have accepted this explanation, which is far more nuanced in presentation than the one we're discussing. You added all the context that would have safely negotiated those waters; but as stated, the comment does not achieve this purpose, in my opinion.


> I think it would be better if the server didn't have metadata visibility into group membership, but that's a largely unsolved problem, and it's unrelated to confidentiality of group messages.

Nitpick: Signal solves this problem just fine¹, by treating messages to a group as simple pairwise messages, encrypted similarly to pairwise messages, and sent separately to each member of the group. Group management is all done through these e2e-encrypted messages.

¹Signal also has a group messaging bug in that the app doesn't check that someone is a member of a group before accepting their group management commands, but that is trivial to fix.
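A minimal sketch of that design (assumed names, not Signal's actual code): group state lives only on clients, membership changes travel as ordinary pairwise e2e messages, and clients accept management commands only from current members, which is also the fix for the bug in the footnote.

```python
# Toy model of client-side group state (illustrative only). Updates are
# assumed to arrive already decrypted from a pairwise e2e session, so the
# server sees neither message contents nor group membership.

class Client:
    def __init__(self, name):
        self.name = name
        self.groups = {}  # group_id -> set of member names (local view)

    def create_group(self, group_id, members):
        self.groups[group_id] = set(members)

    def receive_group_update(self, sender, group_id, kind, member):
        members = self.groups.setdefault(group_id, set())
        # The footnote's fix: ignore management commands from non-members.
        if members and sender not in members:
            return
        if kind == "add":
            members.add(member)
        elif kind == "remove":
            members.discard(member)

alice = Client("alice")
alice.create_group("g", {"alice", "bob"})
alice.receive_group_update("bob", "g", "add", "carol")        # accepted
alice.receive_group_update("mallory", "g", "add", "mallory")  # ignored
assert alice.groups["g"] == {"alice", "bob", "carol"}
```

The trade-off, as the sibling comment notes, is that the server still sees the traffic pattern of N same-size messages to the same contacts, so the group is hidden only at the protocol layer, not the metadata layer.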


I'm guessing that moxie has a pretty good idea of how Signal works...


That doesn't hide the group in any meaningful way. All the same-size same-time messages to the same contacts, over and over, make the group clear.


"I think it would be better if the server didn't have metadata visibility into group membership, but that's a largely unsolved problem, and it's unrelated to confidentiality of group messages."

Although this is true, I guess it's not really related in this case.

As long as all the communication between peers is e2e encrypted, I think this situation can be solved by peers advertising the people they have invited to the group; clients can then refuse to do a key exchange with parties that were not announced beforehand.

Or the server can send new-member join messages by relaying an invitation message signed by the admin (or by whoever invited the member).


> As long as all the communication between peers is e2e encrypted, I think this situation can be solved by peers advertising the people they have invited to the group; clients can then refuse to do a key exchange with parties that were not announced beforehand.

This breaks group join links.


It does not: the admin will just announce the invite link, and clients will check whoever joins with that invite link.

Of course WhatsApp could then reuse that link, but there is already a warning about invite links in the WhatsApp help.


Details?


See https://faq.whatsapp.com/en/android/23776567/?category=52452... for the details of the feature. The identity of the new member isn't known at invite time.


It doesn't have to be, though: if you create a join link, you could also advertise its code to the other participants. When the new member joins via this invitation link, the code is recognized by everyone.

(I've been downvoted for saying that, but the solution works)


What public key do you suppose would be used to encrypt those messages without giving the server the ability to read them?


Messages are already encrypted in the group chat, but there is another problem I didn't think about (see the other comment).


As soon as the code is redeemed, the server has access to the code.


1. When you create a joining link, it creates a secret UUID.

2. This UUID is shared with the rest of the group.

3. If Alice joins the group, every UUID created is shared with Alice (except the one Alice used, if Alice joined via a joining link).

4. When Bob attempts to join the group via the group ID, and Bob does not present a known code, Bob is refused.

5. If Bob uses a known code, Bob is accepted and everyone deletes the code.

This does not prevent different participant views from being created, but that is already a problem in WA anyway.
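The scheme above can be sketched like this (illustrative only; in a real design the codes would be distributed to members over e2e messages, per steps 2-3, and each participant would run the same check):

```python
# Toy sketch of the invite-code scheme: codes are secrets known to group
# members, checked client-side on join (step 4) and deleted after a
# successful use (step 5). Names are illustrative only.

import uuid

class GroupView:
    """One participant's local view of the group."""
    def __init__(self, members):
        self.members = set(members)
        self.known_codes = set()

    def create_join_link(self):
        code = uuid.uuid4().hex         # step 1: secret code behind the link
        self.known_codes.add(code)      # steps 2-3: shared with members (e2e)
        return code

    def try_join(self, who, code):
        if code not in self.known_codes:
            return False                # step 4: unknown code -> refused
        self.known_codes.discard(code)  # step 5: everyone deletes the code
        self.members.add(who)
        return True

g = GroupView({"admin"})
code = g.create_join_link()
assert g.try_join("alice", code)       # known code: accepted
assert not g.try_join("bob", code)     # single-use: refused
```

As the reply below notes, making each code single-use is exactly the usability cost of step 5; dropping the `discard` line restores reusable links at the price of letting the server replay a redeemed code.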


(a) It means each code can only be redeemed once, which is a bit of a usability hit. (b) It still doesn't stop a malicious server, which can try to MITM the group join and proxy messages to the client.


If this is a usability problem then remove this line:

> and everyone deletes the code

I'm not sure I understand your attack in (b); the message is encrypted to the participants, so the server cannot read or MITM it.


> All group members will see that the attacker has joined. There is no way to suppress this message.

The only issue that I see here is that in a large group, members may not notice that the user joined. Past a certain group size, people ignore those messages.

However, most large groups are not going to be as privacy-sensitive, so it is not really an issue.


yup, security goes out the window in large group chats


Briefly, why is an alternative not to have the administrator sign the membership list and then let the server pass it around? I realize that this means you won't be able to directly join via a URL, but you could work around that either by having different types of groups or by having the admin verify each member after they apply to join via URL.
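The signed-membership-list idea could look roughly like this. It's a sketch under assumptions: HMAC stands in for a real asymmetric signature (you'd want something like Ed25519 so members can verify with the admin's public key, without holding the admin's secret), and the names are made up.

```python
# Sketch of an admin-signed membership list that the server merely relays.
# HMAC-SHA256 is a stand-in for an asymmetric signature scheme.

import hashlib, hmac, json

def sign_membership(admin_key: bytes, members: list) -> dict:
    payload = json.dumps(sorted(members)).encode()
    tag = hmac.new(admin_key, payload, hashlib.sha256).hexdigest()
    return {"members": sorted(members), "tag": tag}

def verify_membership(admin_key: bytes, doc: dict) -> bool:
    payload = json.dumps(sorted(doc["members"])).encode()
    expected = hmac.new(admin_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, doc["tag"])

doc = sign_membership(b"admin-secret", ["alice", "bob"])
assert verify_membership(b"admin-secret", doc)
doc["members"].append("mallory")                    # server tampering...
assert not verify_membership(b"admin-secret", doc)  # ...is detected
```

The server can still withhold updates or replay an old signed list, so a full design would also need some freshness mechanism (e.g. a version counter inside the signed payload).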

I feel like the article could have mentioned Telegram though, and I don't see why it couldn't have been mentioned in the paper too.


Thanks for the detailed response :)

I have a few questions:

- How does group messaging in Signal work?

- Does the server also hold group metadata?

- If there is a difference, why is there a difference?


> If someone hacks the WhatsApp server, they can obviously alter the group membership.

Of course an attacker can subscribe to the conversation if s/he owns the server, but that doesn't make it "obvious" that s/he can actually read messages' contents from that point onwards without any sort of confirmation from the chat's participants.


All the chat's participants get a notice that the new member joined. Every time they've ever joined a group, that's been the behavior. They've never gotten a "yes/no" dialog for new group members (it's hard to see how that would even work in practice). I think the behavior here is in fact pretty obvious.


The confirmation would be that the participant who invited them tells the other chat participants about it cryptographically (not through user interaction). That everyone rekeys (or whatever happens in the Signal protocol to allow future messages to be read by the new participant) on the server's word alone is what I would see as a flaw of too much trust in the server.

A notification in a busy group gets lost, and in the scenario of an attacker owning the server, they could easily time it to coincide with a busy period.


How do you know the closed-source app will show you all of the group members? Also, anything to add about Signal in relation to this paper?


A reminder that machine code is not a black box, it just takes a tad more work to look at it. Closed versus open source is not relevant here.


> and even though Telegram users think that group chats are somehow secure

How do you know what Telegram users think in that regard? Assumption? The mother of all f*ckups, they say.




> There's no way to publish an academic paper about that, though, because there's no "attack" to describe, because there's no encryption to begin with.

Then why does the Telegram FAQ state that there's "server-client encryption" for group chats? [0] "Secret Chats" supposedly even use e2e encryption. [1]

Note: I've never used Telegram (Signal does the job for me, thank you!) and I'm no coder, but your comment makes me wonder if I'm missing something here?

[0] https://telegram.org/faq#q-so-how-do-you-encrypt-data

[1] https://core.telegram.org/api/end-to-end


"server-client encryption" means that the data is only encrypted between the client and the server (i.e. the server can read all messages). "Secret Chats" are only between 2 clients and are end-to-end encrypted, which means the data is encrypted the whole way between the clients, and the server can't read any of it.


It’s likely that "server-client encryption" just means TLS to the server endpoint.


Ah right, so the data on the server itself isn't encrypted at all, which obviously is quite an issue if the server gets compromised.

Now I feel kinda stupid for asking the question; I guess the whole "There is no encryption" versus the Telegram FAQ saying "We have encryption!" threw me off. What's encrypted where is obviously the most important aspect.


Now you understand part of why Telegram alarms experts so much. No competent security engineer would claim a group chat was "encrypted" because it had TLS "server/client encryption". That's a property even AIM satisfied.


Imho part of the reason for that might be that Telegram has gotten the reputation of being the "encrypted terrorist IM app". [0] Especially in the mainstream media, I've seen countless news pieces which blame Telegram's encryption for supposedly making it impossible to track down terrorists. [1]

In contrast to that, Signal and WhatsApp don't get mentioned nearly as often, at least in regard to "encryption enables terrorists!" FUD.

[0] https://www.hsdl.org/c/extremism-and-encryption-terrorists-o...

[1] http://www.smh.com.au/technology/technology-news/telegram-th...


That's an important point. I think the thing about Telegram is that it would indeed cooperate with governments much less willingly than, say, WhatsApp or FB Messenger do, since they don't have much of a relationship with Western governments after all. However, that says nothing about whether it is theoretically secure against hacking, which is quite a different thing.


What does Telegram have to do with WhatsApp's problems?

The reasonableness of the WhatsApp design decision is ultimately to be determined by users. If users have no insight into the design then we can hardly say they have already decided on reasonableness. At best we can say they are ambivalent. (But if that were true, why would anyone responsible for the design care about the headline?)

Whether users get their insights into the design from WhatsApp, self-directed research, the work of "security researchers" or the media is perhaps an important issue.

If WhatsApp has made the "right" decisions then one would think they would be very forthcoming in subjecting them to review by users. If so, there would be very few surprises. WhatsApp could simply point to a detailed, public, technical document they released and say, "There it is. We tested or considered this before releasing the software and informed users about the risks, however remote. As such, nothing has been "discovered" by these researchers."

But I suspect if we went looking for this information we might only find marketing. Information promoting a "new feature", group chats.

If a WhatsApp/Facebook employee or contractor joins the chat is that considered an "attacker"? Example of a silly question perhaps, but it still needs to be answered, lest some "security researcher" and the media produce an undesired headline.

Anyone designing a messaging system today should be aware that some people are going to ask these types of questions, sooner or later. Millions of people will not ask them and will use the software willingly, but does that necessarily mean they do not care about these questions if someone else asks them? If yes, then headlines about "non-issues" should be of no concern.


>>To me, this article reads as a better example of the problems with the security industry and the way security research is done today, because I think the lesson to anyone watching is clear: don't build security into your products, because that makes you a target for researchers, even if you make the right decisions, and regardless of whether their research is practically important or not. It's much more effective to be Telegram: just leave cryptography out of everything, except for your marketing.

I am not sure your point is really reinforced by comparing a flaw with a bigger flaw. You admitted (regarding WhatsApp) that 'it would be better if...' and that it's an 'unsolved problem'. So why not focus as a community on solving that problem, instead of comparing it to a (in your opinion) bigger problem (meaning Telegram) to even out the score?

Additionally, Telegram did not leave cryptography out of everything. You might not agree with it, it might not be secure, it might not be available in group chats, but to say they have left it out completely isn't true [1], and you know that too.

[1] https://core.telegram.org/api/end-to-end



