Why Telegram is insecure (2015) (medium.com/thegrugq)
230 points by known on Jan 6, 2019 | 246 comments



So, here's the thing I'd like somebody to explain to me. Telegram's encryption spec is published, and the client is open source. This means you can verify that the server is following the spec by creating a clean-room client implementation. You could even create a greenfield server implementation if you wanted.

Telegram regularly has contests to break its encryption with a reward of 300K USD https://telegram.org/blog/cryptocontest

If it's not secure, then surely people would be cashing in on that sweet money. So, why is it that we constantly see articles talking about how insecure Telegram encryption is, but nobody is showing a proof of concept attack or collecting the prize?

Unless somebody puts their money where their mouth is and shows an actual exploit with code, it seems like pure FUD to me. On top of that, it appears that attacks on Telegram often come from people associated with Signal in some way. Signal is endorsed by NSA who have a history of promoting weak encryption that they have found backdoors into. I hope everybody still remembers this debacle https://golem.ph.utexas.edu/category/2014/10/new_evidence_of...


> the client is open source

Uh, no it’s not. The client (which happens to seemingly violate a bunch of open-source licenses) and the code posted to GitHub do not match; occasionally a source dump is posted online but there’s no indication how this relates to what they are shipping, as the released binaries differ and update much more frequently. The author is quite unresponsive about this: https://github.com/overtake/TelegramSwift/issues/163


Sorry, that's just not true.

I have telegram-desktop (on Linux), which is unofficially distributed and compiled on my laptop via the AUR[0]. F-Droid also has a version of Telegram which is compiled from source rather than taken from packages released by the Telegram group, although they apparently remove non-free parts[1].

I'm not sure about iOS as I don't currently have the means to verify the content of my App Store iOS installed version of telegram, but the fact is that there are (at least) 2 client implementations commonly used that are compiled from source.

[0]: https://git.archlinux.org/svntogit/community.git/tree/trunk/...

[1]: https://www.reddit.com/r/fdroid/comments/9gmkob/why_is_teleg...


> I'm not sure about iOS as I don't currently have the means to verify the content of my App Store iOS installed version of telegram

I’ll save you the work: Telegram has updated more often than this repository[1] has been committed to, so the code is perpetually out of date, if it ever shipped at all (apparently it doesn’t compile?). The author has mentioned[2] that they will open source their new client once it has been released, but that promise has not been kept.

1: https://github.com/peter-iakovlev/Telegram-iOS

2: https://github.com/peter-iakovlev/TelegramCore/issues/11


How is it not true?

The version on GitHub works, but the version on the Play Store differs a lot from it.


>and the code posted to GitHub do not match;

Well, I was challenging this assertion.

"Does not match" what thing exactly? My point was that people are using the non-packaged versions (I mean, I'm not FOSS mad, I'm just a user and I'm using the FOSS version almost by accident).

I would prefer the FOSS version to be 1:1 what is in the repo (or even be the upstream of the non-free version), but I see the situation as more like Chrome/Chromium. We're not shitting on Chrome for being "the most secure web browser" despite it having non-free elements.

Seemingly because large US tech companies deserve more respect than a guy who was essentially exiled from his country for not handing over his users' data?

Sounds like a double standard. Telegram's OSS edition works, so if they're sneaking things in it would have to be a backdoor, which would be INCREDIBLY damaging to their endeavour if it were found, and it would be relatively trivial to find too. So I'm not sure where this FUD is coming from.

Making the argument about the server not being FOSS is valid though, but then if the E2EE is good enough on the client then the server is basically a relay and can't pilfer anything except metadata. (which, if we're honest, is what governments are after anyway)


> I would prefer the FOSS version to be 1:1 what is in the repo (or even be the upstream of the non-free version), but I see the situation as more like Chrome/Chromium.

Chrome is not GPL.


You can literally build your own client. Full spec is open source and documented and it works. There are many implementations already.

You can also compile the official client yourself.


> You can literally build your own client.

I mean, I could, but the Telegram developers being so opaque about this doesn’t make me feel like this is a good use of my time.

> There are many implementations already.

As far as I can tell, the only implementations for the platforms I care about are the “official” ones (in the sense that they are placed under some random person’s GitHub account and linked to by the Telegram website).

> You can also compile the official client yourself.

I have never been able to get this to work. Either way, what I compile from the “sources” they have put online and what they are shipping are two different things.


> As far as I can tell, the only implementations for the platforms I care about are the “official” ones

Telethon (https://github.com/LonamiWebs/Telethon) is a decent up to date (with MTProto) independent clean-room client implementation in the form of a Python lib.

You might consider giving it a try.
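
For anyone curious what that looks like in practice, here's a minimal sketch (the api_id/api_hash values are placeholders you'd register at my.telegram.org; the first run prompts for a login code):

    import asyncio
    from telethon import TelegramClient

    # Placeholder credentials -- obtain real ones at my.telegram.org
    api_id, api_hash = 12345, "0123456789abcdef0123456789abcdef"

    async def main():
        # Telethon speaks MTProto directly; no official Telegram client involved
        async with TelegramClient("demo-session", api_id, api_hash) as client:
            me = await client.get_me()
            print("Logged in as", me.username or me.first_name)
            await client.send_message("me", "hello from a non-official client")

    asyncio.run(main())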


Hmm, I’ll take a look. I’m still not convinced that I would work on a better client, though.


They also have all developers share a single anonymous GitHub account, "john-preston". I can't understand the reason.


Perhaps it has something to do with:

" The founder of chat app Telegram has publicly claimed that feds pressured the company to weaken its encryption or install a backdoor.

He said, During our team's 1-week visit to the US last year we had two attempts to bribe our devs by US agencies + pressure on me from the FBI. "

https://www.theregister.co.uk/2017/06/14/telegram_boss_backd...


This is precisely the kind of FUD I'm talking about. The spec is public, there's code in GitHub that you can compile and run a working client. Whether updates are being dropped in a timely fashion or not does not change anything. It's sufficient to verify that the protocol is implemented faithfully.


> The spec is public

That’s good…

> there's code in GitHub that you can compile and run a working client

No, not really. The code pushed to GitHub doesn’t really compile and the author of the project is quite unhelpful in resolving these issues.

> Whether updates are being dropped in a timely fashion or not does not change anything.

It does: the project is licensed under the GPL, and in a sense "benefits" from having the name of the license associated with their product. But they are not compliant with software licenses; they're basically source-available, maybe, possibly, if you beg them enough, except the sources they publish cannot be verified to correspond to, and likely do not correspond to, what they are currently shipping. For a project whose entire goal should be increasing trust in their product's security, keeping the development process opaque and violating open source licenses (in addition to their adoption of custom crypto, which I will ignore here because I cannot comment on it) does not inspire trust at all. Plus it's just plain annoying.


This is not FUD. Telegram HAS NO ENCRYPTION. Telegram stores all messages on the server in plain-text mode. Even whatsapp doesn’t do that.

Which means we must trust Telegram developers to not sell out our data to corrupt law enforcement agencies.

The service is free. Which means they would need to get profitable. Which means they would parse the chats or are already doing it.


> Telegram stores all messages on the server in plain-text mode.

Sorry, do you have source for this claim?


1. All historical messages are always downloaded when you sign in. It is possible to implement this feature without storing messages on the server, but Telegram intentionally decided to use servers. You can verify this by logging out of all your sessions and logging in again: all messages will be downloaded from the server.

2. If you're asking whether the actual messages on the server are stored in plaintext, it doesn't matter. The keys for decryption are in memory, which means anyone with backdoors or server access can get the messages.


This API endpoint will get you all past messages (except for those encrypted by Secret Chats):

https://core.telegram.org/method/messages.getMessages
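
For illustration, here's roughly what calling it looks like through the Telethon library mentioned elsewhere in the thread (credentials are placeholders from my.telegram.org, and the message IDs are arbitrary):

    import asyncio
    from telethon import TelegramClient
    from telethon.tl.functions.messages import GetMessagesRequest
    from telethon.tl.types import InputMessageID

    api_id, api_hash = 12345, "0123456789abcdef0123456789abcdef"  # placeholders

    async def main():
        async with TelegramClient("session", api_id, api_hash) as client:
            # Fetch cloud-chat messages by ID straight from Telegram's servers
            result = await client(GetMessagesRequest(id=[InputMessageID(1), InputMessageID(2)]))
            for m in result.messages:
                print(m.id, getattr(m, "message", None))

    asyncio.run(main())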


>Telegram regularly has contests to break its encryption with a reward of 300K USD https://telegram.org/blog/cryptocontest

>If it's not secure, then surely people would be cashing in on that sweet money. So, why is it that we constantly see articles talking about how insecure Telegram encryption is, but nobody is showing a proof of concept attack or collecting the prize?

Regularly != 2 contests with limited time windows. Also, E2EE is not only about decrypting a message; e.g. signing messages as someone else isn't awarded. Also, you might need a lot of computational power. SHA-1, used in MTProto 1.0, for example, is practically pretty secure, but not against a well-funded attack.

But that aside, Telegram's encryption is probably good enough. But we already have standards that are good enough. Why risk it?

For example, from On the CCA (in)Security of MTProto[0]:

>Telegram is a popular messaging app which supports end-to-end encrypted communication. In Spring 2015 we performed an audit of Telegram's Android source code. This short paper summarizes our findings. Our main discovery is that the symmetric encryption scheme used in Telegram -- known as MTProto -- is not IND-CCA secure, since it is possible to turn any ciphertext into a different ciphertext that decrypts to the same message.

>We stress that this is a theoretical attack on the definition of security and we do not see any way of turning the attack into a full plaintext-recovery attack. At the same time, we see no reason why one should use a less secure encryption scheme when more secure (and at least as efficient) solutions exist.

>The take-home message (once again) is that well-studied, provably secure encryption schemes that achieve strong definitions of security (e.g., authenticated-encryption) are to be preferred to home-brewed encryption schemes.

And that aside, E2EE is not default and neither E2E group chats or E2E video calls are supported. This is the biggest security problem.

[0] https://dl.acm.org/citation.cfm?id=2994468
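
To make the quoted attack concrete, here is a toy scheme (emphatically not MTProto, just the same structural mistake: the integrity tag covers the message but not the random padding), where any ciphertext can be turned into a different ciphertext that decrypts to the same message:

    # Toy stream cipher built from SHA-256 in counter mode -- illustration only.
    import hashlib, os, struct

    def keystream(key, nonce, n):
        out, ctr = b"", 0
        while len(out) < n:
            out += hashlib.sha256(key + nonce + struct.pack(">Q", ctr)).digest()
            ctr += 1
        return out[:n]

    def encrypt(key, msg):
        nonce = os.urandom(16)
        padding = os.urandom(12)                     # random padding, NOT authenticated
        body = struct.pack(">I", len(msg)) + msg + padding
        ct = bytes(a ^ b for a, b in zip(body, keystream(key, nonce, len(body))))
        msg_key = hashlib.sha256(msg).digest()[:16]  # integrity check covers the message only
        return nonce + msg_key + ct

    def decrypt(key, blob):
        nonce, msg_key, ct = blob[:16], blob[16:32], blob[32:]
        body = bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))
        (length,) = struct.unpack(">I", body[:4])
        msg = body[4:4 + length]
        assert hashlib.sha256(msg).digest()[:16] == msg_key, "integrity failure"
        return msg

    key = os.urandom(32)
    c1 = encrypt(key, b"attack at dawn")
    c2 = bytearray(c1); c2[-1] ^= 0xFF                  # flip a bit in the padding
    assert bytes(c2) != c1                              # a genuinely different ciphertext...
    assert decrypt(key, bytes(c2)) == decrypt(key, c1)  # ...that decrypts to the same message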


MtProto has since been changed to satisfy IND-CCA requirements: https://core.telegram.org/techfaq#what-about-ind-cca


Yeah, I'm not sure about it. I'm not a security expert who can look at the protocol for some weeks and come to a conclusion. Believing them after they did things wrong is not very convincing. Give it some years and many eyes, and I might believe it. But until then?


The key point in all this is that nobody seems to have a working exploit to show. Traditionally, when people claim that something is insecure they have to provide tangible evidence of that being the case. I'm not aware of any published exploits that show MTProto to be insecure.

People also seem to conflate the fact that encryption is not on by default, or that it doesn't apply to stuff like voice chat, with questions about the quality of the protocol. This is a completely separate topic of discussion.


Operational security is part of security. The article is not "Why MTProto is insecure".

Traditionally when people claim a system is cryptographically secure it's after rigorous analysis. MTProto isn't scientifically interesting. Two short contests several years ago don't make it financially interesting. The groups with the most incentive to break it have the least incentive to publish.


Telegram's crypto contest is questionable (see "A Crypto Challenge for the Telegram Developers" https://hackerfall.com/story/a-crypto-challenge-for-the-tele... and https://news.ycombinator.com/item?id=6936539 for previous discussion)

> Signal is endorsed by NSA

Uh, what? This isn't true.


Can you elaborate on Signal being supported by the NSA? I’ve never come across anything suggesting this.

I’m not sure how Signal was supported financially between 2015 and 2018 — maybe out of Moxie’s own pocket from Whisper Systems acquisition by Twitter? — but right now its continued development is supported by a $50,000,000 donation[0] from Brian Acton of WhatsApp.

[0] https://signal.org/blog/signal-foundation/



That article (accurately) points to Signal's initial funding by the Open Technology Fund, a program that gives grants to internet freedom technologies that line up with US interests and is funded by Radio Free Asia, which is under the State Department. Radio Free Asia was previously CIA-funded, many many years before the creation of the Open Technology Fund. (Also, Radio Free Asia, both in the past and in the present, is working on problems where being secure against large governments is much more useful than having back doors.) It has never been NSA-affiliated (unless you count "being part of the US government" as NSA-affiliated).

The relationship between Signal and the State Department is solely one of funding, not of technical interaction, let alone technical endorsement.


Further: a number of open-source crypto projects received OTF funding. It was the policy of RFA to fund improvements and assessments in secure messaging and privacy technologies in general.


What a bizarre article!!

The author can’t seem to tell his story without tossing attacks at the NRA and guns - while admitting that there is a point to everyone having the same level of access to things the government might use against you (be it encryption or guns).

The author seems to embrace the unbelievable idea from the media that "Trump won, now you need to encrypt your email"... while that sentiment ignores that it was the Obama admin that was prosecuting whistleblowers more than anyone before, using the NSA to unmask citizens in political opposition, the IRS targeting, and that it was under Clinton that the "Internet freedom" policies were pushed... I feel like the article was gaslighting itself.

And then finally at the bottom, Signal is funded by a company that is associated with a government group. Which is concerning, no doubt.

Just what a strange and often seemingly self-arguing article! Maybe I just don’t understand the author at all.


Twitter owns Signal?? Can you please link some support!


Twitter doesn't own Signal. Twitter bought Moxie Marlinspike's past security company Whisper Systems. Signal is developed by Open Whisper Systems, which Moxie created subsequently (but which I believe uses some freely licensed code created by Whisper Systems).

Open Whisper Systems is an independent project, receiving grants and donations through the fiscal sponsorship of the Freedom of the Press Foundation. Last year, the Signal Foundation was created and funded by the aforementioned large donation from Brian Acton, which will take over this role once the IRS approves its 501(c)(3) charitable status.


Thank you - I truly cannot wait until we can donate to signal.


Thanks to the fiscal sponsorship arrangement I mentioned with the Freedom of the Press Foundation, you already can: https://freedom.press/crowdfunding/signal/

That money will, minus whatever portion the Freedom of the Press Foundation keeps to cover its overhead, be spent on Signal. And donations for Signal to this foundation are tax-deductible to the extent 501(c)(3) charitable donations usually are.

I have no connection with any of the people, projects, or organizations we've been discussing. But I am the president of another fiscal sponsor 501(c)(3) nonprofit in the free/open source software space, so I'm very familiar with this model.


The crypto is not the problem, the UX is the problem. Any time there is opportunity for the user to be confused, they will be. It's ultimately the user's responsibility to make sure their comms are secure, so it's probably prudent to choose an app that presents the cleanest, least confusing path.

Nobody (really) complains about PGP's crypto, but there are dozens of articles rightly criticizing its 80s-era "UX".


Personally, the main rub for me is "Telegram's end-to-end encryption is not turned on by default," which renders all of this null and void.


This[0] is why the challenge is bunk.

[0]: https://web.archive.org/web/20171213214126/https://moxie.org...


> If it's not secure, then surely people would be cashing in on that sweet money.

Because that's not how cryptography works.

Remember that there is no mathematical proof that working cryptography even exists. Cryptography relies on "one-way trap-door functions," i.e., functions that cannot be inverted unless you have the secret key, and the existence of one-way trap-door functions implies P ≠ NP. Because we have no proof of P ≠ NP and it may well turn out to be the case that P = NP (surprising as it might be), we have no proof that cryptography is even possible.

So, what do cryptographers actually mean? In practice they mostly mean that something has been subject to attacks for years and the public mathematical / cryptanalytic community hasn't found any cracks in the armor. We expect RSA to be secure because people have been working on efficient factoring since at least when we named the P vs. NP problem and arguably for centuries and we haven't made any real progress. We expect block ciphers and hashing algorithms to be secure because they're subject to public scrutiny.

The last time we had a contest for a hash function https://en.wikipedia.org/wiki/NIST_hash_function_competition, there were multiple good finalists, none of which were cracked in their entirety. They each were refined as people found attacks against artificially weakened versions of the functions, on the grounds that attacks get stronger over time. And even so only one was endorsed, and even so consensus is, as far as I can tell, to use a newly strengthened version of one of the finalists (BLAKE2) instead of the actual winner (Keccak). And certainly nobody would recommend using any of the other finalists, simply because the attention of the public research community is no longer on them, even though they haven't been publicly broken.
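
(Both are a standard-library call away in Python 3.6+ if you want to play with them side by side:)

    import hashlib

    data = b"hello world"
    print("BLAKE2b :", hashlib.blake2b(data).hexdigest())   # derived from finalist BLAKE
    print("SHA3-256:", hashlib.sha3_256(data).hexdigest())  # the competition winner, Keccak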

It is very easy to make up an encryption algorithm or cryptosystem, run a contest for a few months, and not find a winner. That falls far short of the standards professional cryptographers have for solid cryptography. As with hash functions, it may be the case that nobody has found a specific weakness in Telegram's crypto, but using it when it's not the focus of research / not built on the best primitives the research community has to offer is unnecessary risk.

> Signal is endorsed by NSA

Please provide evidence of this. As far as I know, this is a lie.


The state actors who have the serious requirements to break into Telegram have zero care about the bug bounty.

Moreover it is in their interest that the vulnerabilities remain in the wild.

Meanwhile, the bug bounty contests were limited-time, and run only a few times. Good for publicity, but hard for individual actors to capitalize upon (this kind of work is more likely to happen on-and-off over a long period).


Explaining this will require setting up a bit of terminology. I'm not going to comment on Telegram's security because I don't want to get involved in a Signal-Telegram flame war. But I can at least explain why the contest is a poor methodology for ascertaining the security of a cryptosystem.

To begin, when researchers publish or analyze cryptographic algorithms, they model the interaction of legitimate users and adversaries. Two legitimate users will engage in communication and an adversary will sit "in the middle", trying to break ciphertext confidentiality or authenticity (or both). Generally the adversary is allowed to tamper with or replay messages from Alice to Bob and vice versa. Adversaries are also given various capabilities for interacting with the cryptosystem and the messages therein, and if the cryptosystem is secure against an ever more capable adversary, we say it is "____-secure." Researchers model the interactions in this way to more closely approximate real world, repetitive usage instead of an isolated simulacrum.

So let's talk about cryptanalytic attacks. There are known plaintext, chosen plaintext, known ciphertext and chosen ciphertext attacks. The attacker always has a ciphertext, so known ciphertext is the lowest (least capable) form of cryptanalysis. A known plaintext attack gives the attacker a ciphertext and the corresponding plaintext. A chosen plaintext attack affords the attacker the ability to use an encryption oracle to encrypt arbitrary plaintexts and see the ciphertexts. And likewise, a chosen ciphertext attack (the most powerful) allows an adversary to decrypt arbitrary ciphertexts.
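
In code-shaped form, the game researchers play looks roughly like this (a generic sketch, not any particular library's API; enc/dec are the scheme under test and the adversary is any object exposing choose/guess):

    import os, secrets

    def ind_cca_trial(enc, dec, adversary):
        key = os.urandom(32)
        dec_oracle = lambda ct: dec(key, ct)          # chosen-ciphertext capability
        m0, m1 = adversary.choose(dec_oracle)         # adversary picks two equal-length messages
        b = secrets.randbits(1)
        challenge = enc(key, m0 if b == 0 else m1)
        # the adversary may keep decrypting anything except the challenge itself
        restricted = lambda ct: None if ct == challenge else dec_oracle(ct)
        guess = adversary.guess(challenge, restricted)
        return guess == b  # a secure scheme keeps this near a coin flip over many trials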

The Telegram contest requires you to provide the plaintext corresponding to a single ciphertext. You are not given access to real time encryption or decryption within a channel and you are not provided with the ability to attempt cryptanalysis on numerous messages. You can't even tamper with messages or try to replay them, because you have an isolated ciphertext to examine. Everything else is outside the scope of the contest.

That means the only thing you're allowed to do is a known ciphertext attack. A known ciphertext attack is extremely difficult to pull off even for weak cryptography and it does not resemble the structure of a real world attack. For example, legitimate attacks on TLS are not known ciphertext attacks.

That's why the Telegram contest is synthetic. Again, I'm not going to comment on its actual security. But if you want to do real cryptanalysis, you hire a team of cryptographers who didn't design the thing and try to break it with a more generous suite of capabilities that better resemble real world scenarios. The critiques from the cryptographic community should be examined on their own merits, and not judged according to whether or not they've successfully broken something under a known ciphertext attack.


I see a lot of people fixating on the contest here, however you don't have to go through the contest to publish an exploit. I'm pretty sure Intel didn't hold an exploit contest for Meltdown and Spectre, yet actual legitimate researchers found a vulnerability and published how it can be exploited along with a proof of concept implementation.

When I see the equivalent done for Telegram I'll start taking the claims seriously, until then it's just noise.


I'm not fixating on anything - you were the one who highlighted the contest and its lack of winners as the first part of your argument for Telegram's security:

> Telegram regularly has contests to break its encryption with a reward of 300K USD

> If it's not secure, then surely people would be cashing in on that sweet money.

Like I said in my first comment, all I'm doing is explaining why a contest requiring known ciphertext attacks is not evidence of cryptographic security. You originally spoke to this point, and I am responding about that point in particular because it's a topic I know intimately well.

You can double down on your second point about there being no published exploits; that's not something I can confidently speak to. I think there are reasonable counterpoints citing serious vulnerabilities are not always publicized, but I'll leave it to others to discuss that point.

It's not my goal to convince you that Telegram is insecure. What is important from my perspective is that you understand contests are in fact contrived and noisy, and should not be considered evidence of cryptographic security whatsoever.


Crypto is sometimes about feelings and maybes, not necessarily that an actual, working exploit trivially exists. Aspects of Telegram's crypto design and the UX do not give confidence.

To give a specific scenario to illustrate my point, the BEAST attack on TLS happened in 2011, but we knew that CBC was vulnerable to this specific attack since ~2002 (Vaudenay) and that malleability was a concern even earlier. At the time, however, for TLS it was considered impractical and only of academic concern. Had someone offered a sum of money to prove TLS hadn't accounted for this design flaw at the time, nobody would've been able to claim it. The modern reliance on and widespread adoption of ajax and client-side web apps made the attacker-controlled-plaintext scenario required for this exploit a reality.

We're paying much more attention now to concerns like constant time code, invalid curve attacks, IND-CCA2 etc not because attacks are obviously possible now, but because we can't always predict when they may become viable in the future. However, Telegram's choices seem to ignore much of the conventional, hard-earned crypto wisdom. Statements like this:

"Properties like IND-CCA are convenient for theoretical definitions and scientific inquiry, but they are not directly related to the actual security of communication."

are really concerning - IND-CCA is directly related to the security of communication. Its very definition is that, if provided with some known chosen ciphertext/plaintext pairs (ciphertext controlled by the attacker), can the attacker later distinguish two messages encrypted under the scheme. In plain English, schemes that do not have this property start to leak information to the attacker if there is a small partial compromise. There have been a whole bunch of recent issues (particularly with RSA) where the lack of CCA-security has enabled a possible attack.

Nobody else in the secure messaging space is trying to argue such security definitions are inconsequential. If the article is correct, it seems not only are they dismissive of it, but they've made no effort to fix it.

As for your last paragraph, I do not dispute your claim NSA have deliberately attempted to weaken encryption. Dual_EC_DRBG is direct evidence of this. However this is orthogonal to the issue at hand. So far as we know, there is no public evidence Signal and the NSA have collaborated (or of coercion) (your article is about general attempts to weaken cryptography, not Signal). Signal's protocol uses NIST Algorithms, but then so does Telegram. The curve choices in Signal's protocol were developed independently of NIST and NSA. I don't see any evidence Signal is endorsed by the NSA. If anything, the deployment of end-to-end encryption pioneered by Signal is concerning Five Eyes governments to the point they are legislating lawful interception requirements.

I stress, to avoid sounding like "FUD", that I'm not claiming telegram is insecure. I am, however, claiming Telegram's choices are concerning and that as a result I do not trust it as a secure messenger. They could fix this by reviewing their choices and being less dismissive of crypto research. I'm not associated with Signal in any way, I don't even use it. I use a competing product for secure messaging (and I'm not affiliated with them either).


Which messaging app do you recommend?


I don't use Signal mostly because nobody I know uses it, even though I know they have it installed. By pure usage statistics the winner would be WhatsApp. I also use Wire.

In terms of issues with each of these, Wire has occasional bugs and hiccups (UX wise) and would be unsuitable for large groups exchanging messages (every device implies an entirely new ratchet, users own multiple devices, and all Wire group chats are client-fan-out, meaning pairwise ratchets with all members. Users with more than one device, even if they communicate one-on-one, are effectively in a group session). However the group message problem implies trade-offs and doesn't have an easy solution, so this shouldn't be seen as game-ending criticism (there's an ongoing effort involving people from all three messengers to figure out group messaging in a safe and scalable way, called Messaging Layer Security). WhatsApp is much more feature stable, but compromises on client-fan-out to make group messaging simpler. I think Signal also does client fan-out. I think Signal were also a bit dismissive of the low order point issue in their protocol (https://research.kudelskisecurity.com/2017/04/25/should-ecdh...), but it's arguable whether this is an issue at all in their instantiation of DH (https://moderncrypto.org/mail-archive/curves/2017/000896.htm...). So in terms of cryptography, all three are state-of-the-art.

I haven't really used Threema, but for completeness I had a brief look at their crypto "whitepaper". In it they say:

"Due to the inherently asynchronous nature of mobile messengers, providing reliable Forward Secrecy on the end-to-end layer is difficult. Key negotiation for a new chat session would require the other party to be online before the first message can be sent. Experimental schemes like caching pre-generated temporary keys from the clients on the servers increase the server and protocol complexity, ... Due to these and the following considerations, Threema has implemented Forward Secrecy on the transport layer only..."

This means Threema's forward secrecy is based entirely on TLS and doesn't apply to the end-to-end exchange. Also, Signal, Wire and WhatsApp use variants of Signal Protocol and these do pre-keying. It's hardly experimental.

There are other considerations and arguments you can make, such as jurisdiction, use of phone numbers, openness of code, who the parent company is and whether they have a voracious love for hoarding data to rival the NSA, federation etc. I'll leave those. I was responding mostly to the "if you can't break it it's obviously secure" challenge.

tl;dr for a normal person against the average adversary or an only vaguely-interested well-if-we-can-collect-it-why-not NSA, any of Signal, WhatsApp or Wire would be fine.


I can't believe that anybody would recommend WhatsApp for any kind of secure messaging. The app is closed source, and there's absolutely no way to audit that it implements the protocol faithfully. Facebook can easily be spoofing keys, there's already a story regarding this type of vulnerability https://www.theguardian.com/technology/2017/jan/13/whatsapp-...

Say what you will about Telegram, but at least you're getting what's advertised. WhatsApp is a complete black box that's owned by a predatory company whose business is selling user data.


> Telegram links an account to a telephone number.

> If you want to use secure and private apps, I recommend: Signal Private Messenger: https://signal.org...

I feel it's wrong to criticise Telegram for using phone numbers, and then in the same breath recommend Signal.


Why do all messengers keep requiring phone numbers? Is it because of the problem of spam? Couldn't this easily be prevented by, for example, requiring consent before sending messages to new contacts? You could have it work two ways: 1) with a phone number connected, you can message unknown contacts; 2) without a phone number, you need prior consent from any new unknown contact before you can message them.
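
Something like this, in toy form (every name here is hypothetical, just to pin down the idea):

    def may_deliver(sender, recipient, message, db):
        """Toy delivery policy: consent-gated messaging for unknown contacts."""
        if db.has_consented(recipient, sender):
            return True                            # prior consent: deliver normally
        if sender.verified_phone_number is not None:
            return True                            # option 1: phone-verified senders may cold-message
        db.queue_consent_request(recipient, sender)
        return False                               # option 2: ask the recipient first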


There were many messengers that ran on mobile phones in the old days (the days when texting was pricey in many places in the world, so you'd think people would be extremely motivated to find an alternative to texting!) and none really caught on...

until WhatsApp appeared on the scene. The key innovation WhatsApp brought (the app was laughably insecure and probably overly simple, but that simplicity can't have been the reason it won; there were other really simple apps out there) was making your account ID equal to your phone number.

This gave WhatsApp the ability to skip the phase of setting up your 'network'; there'd be no need to ask your friends what their ICQ id is or whatever. WhatsApp would even simply tell you which of your friends had WhatsApp installed, immediately, without any consent or setting up your network required.

THAT sold. That simplicity. Yeah, you can (rightfully!) put quite a few question marks on the consent and authentication mechanism I laid out above, but it does lead to an app that is useful and understandable for a great many people (even if it is also not particularly secure or careful about your consent).

I'm sure all these messenger apps (Signal, iMessage, WhatsApp, and Telegram) know it and wouldn't dare walk away from phone numbers at this point.


Believe it or not, there was a J2ME app called Talkonaut that was wildly popular in the Middle East and used direct phone-number mapping back in the day. There is no chance that WhatsApp never heard of it. I made a few localisations for it back in my high school years.

Syria was a very big market for it.


Telegram supports usernames in addition to phone numbers [1]. iMessage can be used with just an email address.

[1] https://telegram.org/blog/usernames-and-secret-chats-v2


It just lets you pick a username in addition to the phone number. The phone number is still required. (Granted it is a plus that you can give your username to someone and they supposedly will not find out your phone number, but still.)


Someone should simply fork Signal and deploy an app without phone numbers, focused on privacy (i.e. options to exchange keys in person without any user account or phone number).


So you'd have the same account setup flow, but still on a single server that only a few can control.

I'd prefer to see people go for a federated system (e.g. XMPP with OMEMO for E2E crypto) or work on truly decentralized ones (that one is hard, and blockchain is not a trivial solution ;-) ) rather than propose yet another system where somebody else can control who you can talk to (even if they can't see what you're talking about) as a replacement for the previous system with the very same properties, just run by a different party.


At least you would have end-to-end encryption using a protocol regarded as secure. Decentralization would be the next step.


OMEMO is based on the Signal Protocol, and there's some discussion[0] about the effect of the changes.

So in a way this is Signal Protocol (the one "regarded as secure") with that "next step" of adding decentralization already taken.

[0] https://conversations.im/omemo/audit.pdf


Maybe you want Tox

https://tox.chat


Because:

1. WhatsApp was the first widely-used such app and it was phone number-based, so all the clones use the same system.

2. Non-technical people have a hard time (read: bordering on impossible) remembering their usernames, let alone passwords, and by using the phone number as both identification and authorization, the problem is sidestepped.

3. People have an existing address book of contacts (in the form of phone numbers) in their mobile phones, that can be used to pre-populate the app's buddy list.
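
Point 3 in practice is just matching the address book against the registration database, e.g. by (hashed) phone numbers; real apps add more privacy machinery (Signal's private contact discovery, for instance), but the basic idea is:

    import hashlib

    def discover_contacts(address_book_numbers, registered_hashes):
        """Toy contact discovery: which address-book numbers belong to registered users?"""
        matches = []
        for number in address_book_numbers:
            digest = hashlib.sha256(number.encode()).hexdigest()
            if digest in registered_hashes:
                matches.append(number)
        return matches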


Re 2 - Blackberry Messenger used an eight character alphanumeric ID, and was enormously popular in the UK (not sure about other countries however) - until Blackberries died a death at the hands of Apple and Android...


BBM was different since the PIN was baked into the device, so it was even more seamless than using your phone number.

The problem was that when you changed your device you couldn't migrate your PIN, so I think that contributed to the decline in usage as people moved to other BB devices or iOS/Android.

This only strengthens the argument for why the majority of chat apps use your phone number as your username.


Was BB ever really mainstream? I know it had loyal following, particularly for business users. But, at its peak it had 80M users globally. Most people still had basic phones then, and many of those with BlackBerries probably had an IT department supporting them.


BB was thoroughly mainstream in the UK.

https://www.theguardian.com/media/2011/aug/08/london-riots-f...

from 2010 / 2011

> Using BlackBerry handsets – the smartphone of choice for the majority (37%) of British teens, according to last week's Ofcom study – BBM allows users to send one-to-many messages to their network of contacts, who are connected by "BBM PINs". For many teens armed with a BlackBerry, BBM has replaced text messaging because it is free, instant and more part of a much larger community than regular SMS.


Re 2: no. People can and do remember; just ask for their mail address. Of course someone will post anecdotal evidence that they can't.


4. One of the more effective ways of enforcing 1 person 1 account


In whose interest is that?


> 2. Non-technical people have a hard time (read: bordering on impossible) remembering their usernames, let alone passwords, and by using the phone number as both identification and authorization, the problem is sidestepped.

In Signal it's a requirement and not a convenience: I would like to avoid using the number, and I'd be capable of configuring that, but I can't, because the apps don't allow it. Just the same as I have to use one for a new Google account etc.

The real reason is: both the spy agencies and the ad companies really like to be able to easily identify everybody.


Are you implying that signal works for a spy agency or an ad company?


Everybody should draw his own conclusions.

Some points to consider, which those interested in such topics have to work out for themselves (source: all Wikipedia):

“The Open Technology Fund (OTF) is a U.S. Government funded program created in 2012 at Radio Free Asia.”

The "Radio Free X" outlets (for different X) have a very long history of effectively being made and maintained by the CIA.

“Clinton's policy was "heavily influenced by the Internet activism that helped organize the green revolution in Iran in 2009 and other revolutions in the Arab world in 2010 and 2011".[3]”

“Notable projects that the OTF has supported include The Tor Project, Open Whisper Systems,..”

Such associations can at least (directly or indirectly) influence some design decisions.

The best rebuttal to points like mine would simply be to allow those who want to, and know how, to use the service completely anonymously, that is, without providing a phone number.


There are a lot more projects you can't support than Tor and Signal if you think accepting help from OTF taints them.


I don't consider anything automatically "tainted." I'm considering the reasons behind Signal's decision to insist on acquiring phone numbers, and I don't have any better explanation than that.

It is just that their insistence on phone numbers is what makes use of their product problematic for me.

And the "Open Technology Fund" is surely an interesting beast, worth knowing about, as are the stories about "Radio Free X" for some X.

Some fascinating insights on that topic:

https://www.theglobeandmail.com/arts/books-and-media/review-...


What does Radio Free Asia have to do with Signal? The only connection is that OTF, which is an RFA project, decided to spend money on open privacy and cryptography tools.

Signal has repeated the reason they use phone number identifiers ad nauseam. The goal of Signal is to make text messaging secure. Those are the roots of the project. You're confusing the application with the underlying technology, which Signal has published and many other projects have exploited.


> The goal of Signal is to make text messaging secure

For whom is that messaging more "secure" if phone numbers are a (technically completely unneeded) requirement that the producers of Signal insist on?


I don't think you understand. By "has its roots in text messaging", I mean literally SMS text messaging, the most popular messaging network on the planet. That is the goal of the project: to take the largest mainstream messaging network in the world and make it cryptographically secure. That network already has identifiers and a network of relationships: phone numbers and contact books. That's how Signal bootstraps itself.

If you don't and never would send an SMS text message, you are not Signal's model target user and never were.

Use something else; you have lots of choices. If not Signal, I recommend Wire, though I think you'll come out behind on the privacy tradeoffs.


"It has roots in SMS" is of course absolutely misleading, as Signal is technologically completely different, and on that level there is still nothing preventing the producers of Signal from allowing anybody to use some other form of identification.

So I still claim that the fact that they don't has some very strong reasons behind it, reasons that run completely counter to the security of their users.

And I still haven’t read here any argument that disproves that.


There's nothing to "disprove". The project has a different goal than you have. That apparently makes you angry. That's fine; snipe at it all you want, and pick a different messenger.

What you cannot get away with doing is making the argument that your disagreement with them is an indicator that they're compromised by the US Government. That's comically false. Snipe at them for things you can make actual arguments about (there are lots of those).


> That apparently makes you angry

"Apparently" because it's without any proof.

> What you cannot get away with doing is making the argument that your disagreement with them is an indicator that they're compromised by the US Government.

I'm not claiming that and I've never claimed that. I'm claiming only this:

1) There are no technical reasons for them to insist on their users giving them their phone numbers and contact data: the underlying technology would not be any more complex without that.

2) There was (is?) actual monetary support of Signal's development by an actual US spy organization (with decades of history behind these direct "spy" connections, some of it really amazing, see above). Which gives some additional context for the whole operation.

Specifically "tainted" and "compromised by" were always only your formulations. I'd just say "given the circumstances, there can be observed some non-technical reasons behind some design decisions."

Which would, I hope, be completely non-controversial if we were, for example, talking about Skype. Because for Skype we have the direct proofs from Snowden. And not having such proofs for Signal, we surely can say "we don't have proofs," but we also can't claim that "this time it's completely different." Even if Snowden himself talks nice (or something) about Signal.

And downvoting my comments stating both is not changing these facts.


I didn't see this comment until just now (it's buried 2 pages down in my comment history), let alone downvote it.


I think you might be incorrectly equating 'secure' with 'anonymous AND secure'.


If you have this level of distrust, why use their service and expose your IP address to it? At that point, better to set up your own server.


There are, of course, different points of tradeoffs for everything related to security. So yours is not an argument that supports Signal’s feature set.

Yes for “complete” security by all means use your own optical fiber network and your own crypto, with a touch of quantum cryptography custom made by Bruce Schneier et al just in case.

We're talking about something else here though.


I’m with you. The reasons given in this thread for requiring phone numbers are not convincing. Despite this huge and unmotivated barrier to usage, Signal enjoys strong support from HN and related communities, and I have no idea why.


> Why do all messengers keep requiring phone numbers?

Matrix/Riot.im and Wire don't. I think they're both good alternatives.


Riot.im is a damn pain to set up. I've been using computers for a quarter of a decade and I still get flummoxed by the UX.

Bridging Matrix/Riot to IRC/Telegram/Whatever seems to require some black magick beyond my capabilities.

On the other hand, even my grandmother can (and has) set up Telegram and WhatsApp.


> I've been using computers for a quarter of a decade

OT but I suppose you mean century?


Have you had a look lately? Plenty has happened in the last year (not only on the UX front)!

https://media.ccc.de/v/35c3-9400-matrix_the_current_status_a...

http://riot.im/experimental


Wire, only if you register first on a computer. From the phone, if you create an account you have to use your number.


They seem to have changed this now, you can register with an email address from mobile. It still asks for a phone number, but you can say no.


That has probably changed recently indeed.


Keybase is a great alternative for chats (no voice/video yet).

It is VC backed, doesn't require a phone number, everything is end-to-end encrypted and has apps for all major platforms (iOS, Android, Windows/Mac).


Not to mention it has a great filesystem behind it with integrated git. I use it as my main document store to replace dropbox and I use it on most of my projects in place of slack.


Agreed. My friends and I who kept in touch through Hangouts (gchat or whatever it is called this week), have moved to Keybase. It works fine, although the desktop chat app occasionally crashes on macOS, and the iOS app is not optimized for anything larger than an iPhone screen.


My answer, and some won't like it, but it's true...

Metadata is surveillance.


Ease of use


Because companies think that people are too stupid for exchanging usernames/handles. Obviously they are wrong.


In Telegram if I can convince the phone network that I have your phone number, Telegram will deliver all your Telegram messages (including historical ones) to me and I can read them. This is completely transparent to the other participants.

In Signal the remote Signal server will not deliver the messages, since anyway I can't decrypt them (there are no unencrypted messages in Signal) and it never keeps any that it has successfully delivered.

Now, if I convince the network that I have your phone number, how should your friends determine that I'm not you?

In Signal (and WhatsApp) this is designed into the UI. If you aren't sure if this is who you think it is, you can meet in person and compare numbers or QR codes to check there are no shenanigans going on. Many casual users never do this, but if something seems "off" you can verify.

But in Telegram there's no option like that.


In Telegram, too, you can see an image and text derived from the secret key for comparison. Get your facts right.


Yeah this doesn't do the same thing at all.

In Signal you have an identity that's verifiable, and that's what their verify step does. The keys constantly ratchet, and you can even both reset them from scratch, identity verification isn't affected.

In Telegram you set up a separate encrypted chat and the secret keys for it can be visualised. But if you create a different chat any steps you took to verify the identity don't carry over.
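
A rough sketch of why that works: the displayed safety number depends only on the long-term identity keys (and stable identifiers), not on the per-session ratchet keys, so ratcheting or recreating chats doesn't invalidate a verification. (Details below are illustrative, not Signal's exact derivation.)

    import hashlib

    def fingerprint(identity_pubkey: bytes, stable_id: str) -> str:
        digest = hashlib.sha512(identity_pubkey + stable_id.encode()).hexdigest()
        return " ".join(digest[i:i + 5] for i in range(0, 60, 5))

    def safety_number(my_key, my_id, their_key, their_id) -> str:
        # both sides compute the same value, independent of session/ratchet state
        return "  ".join(sorted([fingerprint(my_key, my_id), fingerprint(their_key, their_id)]))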


>This is completely transparent to the other participants

nitpick, but I think you mean 'opaque', as in, they can't see it.


Not sure what the policy is on plagiarism, but this post plagiarized my write up.

The structure is the same and the section headings are the same. It is blatant plagiarism.

https://link.medium.com/cWfUtKQjgT



Thanks. Should maybe put (2015) in the title as well? It is old and hasn’t been updated.

I stand by my assessment at the time, but I haven’t checked Telegram since then and don’t know what, if anything, has changed.


I think your post should highlight the main idea: Telegram stores all messages in plain-text on the server. Even whatsapp doesn’t do that.

Seems to me like the “end to end” was an afterthought. Just to claim that they can encrypt messages. While 99% of messages are stored right there, and we must trust telegram not to give that data to anyone. Why should we?


They added references a couple of minutes ago, I think very large chunks of their post should still be in quote blocks though. Great writeup btw, grugq https://gitlab.com/edu4rdshl/blog/commit/2fc4d85714d2bde3aca...


Thanks


I've had a couple of popular medium posts copied by a person looking to boost the visibility of his consulting company site. It's frustrating, but I think ultimately it does the thief more harm than good.



He did only after I publicly called him out[0]. And it is still plagiarism.

“Plagiarism is the "wrongful appropriation" and "stealing and publication" of another author's "language, thoughts, ideas, or expressions" and the representation of them as one's own original work.”[1]

[0] https://twitter.com/Edu4rdSHL/status/1081971696431706113

[1] https://en.m.wikipedia.org/wiki/Plagiarism


And the other parts of the post are copied verbatim from the MTProto paper.

Check out the "MTProto is not IND-CCA secure" section Eduard Toloza supposedly wrote: https://gitlab.com/edu4rdshl/blog/blob/9d490385d53cdf7048d16...

> Here I briefly present the two attacks that show that MTProto is not IND-CCA secure. We assume the reader to be familiar with the notion of IND-CCA security...

> Once again, I stress that the attacks are only of theoretical nature and we do not see a way of turning them into full-plaintext recovery. Yet, we believe that these attacks are yet another proof [https://eprint.iacr.org/2015/428.pdf] that designing your own crypto rarely is a good idea.

And compare it to Jakob Jakobsen and Claudio Orlandi's paper: https://eprint.iacr.org/2015/1177.pdf

The "author" just swapped out the "we" for "I". Aside from the text in bold at the bottom, there's no original content. It's plagiarism.


If you still believe this is plagiarism (after the citation edit), then your recourse is to notify GitLab, where this is hosted.

https://about.gitlab.com/handbook/dmca/


It is plagiarism, that is not debatable.

After contacting the author he has removed the plagiarized content.


Is it still plagiarism if the original author hides his writing behind a pseudonym?


Yes. Plagiarism is stealing another author's ideas, words, or expressions and representing them as one's own work.

And I’m pretty sure comments like this are against HN policy as they degrade the quality of discourse.


Is it still theft if the thief hides his actions behind an alias?


What I find interesting is that this article is very recent; however, its analysis of the crypto is not up to date. MTProto had some big changes quite a while ago already:

- SHA-256 is used instead of SHA-1;

- Padding bytes are involved in the computation of msg_key;

- msg_key depends not only on the message to be encrypted, but on a portion of auth_key as well (a rough sketch of the new derivation is below);

- 12..1024 padding bytes are used instead of 0..15 padding bytes in v.1.0.

See https://core.telegram.org/mtproto/description
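
A rough sketch of the new msg_key derivation as described at that link (offsets quoted from memory of the public spec; verify against the page before relying on it):

    import hashlib

    def msg_key_v2(auth_key: bytes, plaintext_with_padding: bytes, x: int = 0) -> bytes:
        # x = 0 for client->server messages, x = 8 for server->client (per the spec)
        msg_key_large = hashlib.sha256(auth_key[88 + x:88 + x + 32] + plaintext_with_padding).digest()
        return msg_key_large[8:8 + 16]   # middle 128 bits become msg_key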


None of these changes are big enough; it is still homebrew crypto. If there are issues with cryptography standards, you can be sure your OS (such as your Linux distribution) has lost CIA (confidentiality, integrity, availability) in one way or another anyway. Whereas with MTProto, if that is broken, only your Telegram chats lose CIA. Which raises the question: why not use standards?


They were, actually. MtProto v2 satisfies IND-CCA now as opposed to what the blog post claims: https://core.telegram.org/techfaq#what-about-ind-cca


So Signal and others are not "homebrew crypto"?

That criticism is fair a lot of times, but every higher level crypto construction is going to be unproven for a while until checked.

It's not like they were inventing their own hash function and stream cipher.


Signal is the best-studied multiparty secure messaging protocol; there are academic papers that provide formal analyses. Trevor and Moxie won the Levchin Prize at Real World Crypto for Signal Protocol; the Levchin steering committee is a "Who's Who" of cryptographers, as are the other winners of the prize.

No, Signal is not "homebrew crypto".


What would be a good definition of Homebrew crypto?

Sure, if I put some primitives together (even if I had a good knowledge of how to do it) in a closed product and nobody evaluates it (and I add a label like "military security") that's Homebrew, no questions.

But all systems are born "in secret" (at least for a short while). Unless the definition involves appeal to authority.


Obviously, the term is a straightforward appeal to authority.


Which is sometimes unjustly described as fallacious, though even the best can make mistakes.


Hopefully we agree on the authority here. But I jumped the gun on my response a little as well, because my argument isn't simply an appeal to authority; for instance, you can just go read the formal analyses of Signal Protocol and evaluate them for yourself. Maybe IEEE EuroS&P was wrong to accept the paper!


The fact that they spent so long using SHA-1 after it'd been deprecated is a red flag for their hand-rolled crypto. They had no legacy install base but were deploying new services while the rest of the industry was deprecating it – even the U.S. federal government was ahead on that.


Any reason you mention Signal and Threema but not Wire? Wire is objectively superior as far as I can tell.

For a quick (probably incomplete) comparison, Wire has proper multi-device support (not sure how Threema does that), open source and cross-platform clients and open source server (Threema does not appear to have a Linux client nor publishes sources as far as I can tell), it's free (allowing everyone to use it, even if you're in a country where it's hard to pay from), it has a proper profit model (unlike Telegram, and to a much lesser extent Signal), and it does not require a phone number.

The two major downsides of Wire are the battery usage due to all clients being Electron, and that few people know about it (hence my prompt to give them a mention). Not major stoppers, I'd say...


> not proxying through your phone which Signal used to do and maybe still does

This is what Whatsapp does. Signal never did and has real multi-device support.

> and the Signal server is closed source

It's not: https://github.com/signalapp/Signal-Server


> This is what Whatsapp does. Signal never did and has real multi-device support.

This is news to me. Curious then, why is it not possible to use Signal desktop without a phone?

https://support.signal.org/hc/en-us/articles/360008216551-In...

> Can I install Signal Desktop without a mobile device?

> Signal Desktop must link with either Signal Android or Signal iOS to be available for messaging.


You must link a client with a mobile device, but after that apparently you can use e.g. the Desktop client without your mobile phone turned on.


Although Signal is not closed-source, they tend to discontinue useful features of their product. For example, the RedPhone server code was removed from GitHub after it was discontinued.

Last time I checked, it is not really feasible to set up a standalone Signal instance. Is there any guide to deploying a standalone version of Signal on your own servers, or a successful example of it? If not, their approach may be more "look but don't touch" than free software. Also, decisions like the dependency on GCM and discontinuing features to increase development speed make Signal less reliable as an open source project.


I assume they removed it because they don't want anyone to use old insecure code. The RedPhone server hasn't been used for years now and was replaced with a better implementation.

You're right that there's no official support for running their server, but there are some unofficial guides on their community forums on how to set it up. Also, GCM is not a hard dependency: for some time now, on Android it will fall back to using only websockets (and no GCM) when Google Play Services is not installed. What other features have they removed without replacement?


I wish Signal did route through the phone and allowed sending regular texts from other devices.


But that's more of a feature request, or even a completely different product ("I would like to send SMS from my desktop"), than a design decision for a chat application's protocol.


Sure, it's a feature request, but a feature that would be much simpler if the messages were routed through the phone anyway.


Your message history is always synchronised between devices, it doesn't sound like rocket science to make the client detect when a message is meant to be sent as SMS. No need to make the design choice of routing all traffic through the phone for that.


Corrected, thanks!

Curious though, why does WhatsApp not do it if they use the same protocol anyway? Why bother with the phone as an annoying proxy?


Only the encryption layer of the protocol is the same, they are not really compatible otherwise.

I assume it's a design trade-off when you have E2EE and don't store any messages on the server. With Signal (and I assume Wire) you have to register each device, and each device needs to manage its keys. Every message is sent multiple times, once for each device, and each device has an independent message queue on the server. That's why you only get the messages from after you registered the device.

Whatsapp doesn't have to do any of that and can just keep the keys on one device.
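Rough sketch of what that per-device fan-out looks like (purely illustrative - the NaCl sealed-box primitive and the device names here are my own assumptions, not Signal's or Wire's actual code):

    # Illustrative sketch: with E2EE and no plaintext on the server, the sender
    # encrypts the same message once per registered recipient device, and each
    # device has its own queue of ciphertexts.
    from nacl.public import PrivateKey, SealedBox  # pip install pynacl

    # Hypothetical recipient with three registered devices, each with its own key pair.
    device_keys = {f"device-{i}": PrivateKey.generate() for i in range(3)}

    def fan_out(plaintext: bytes) -> dict:
        """Encrypt the message separately for every registered device."""
        return {
            device_id: SealedBox(key.public_key).encrypt(plaintext)
            for device_id, key in device_keys.items()
        }

    # The server only ever sees one opaque ciphertext per device queue.
    queues = fan_out(b"hello from my laptop")
    for device_id, ciphertext in queues.items():
        # Each device can decrypt only its own copy, with its own private key.
        assert SealedBox(device_keys[device_id]).decrypt(ciphertext) == b"hello from my laptop"

A device registered later has no queue and no key for the old ciphertexts, which is exactly why you only get messages from after registration.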


I like Wire and use it with some people. It allows signing up with email and supports up to three accounts on a device. But I do find it to be a bit slow, mainly when sending and downloading images, where it's painfully obvious.

Telegram is better on many counts and is a very nice app to use. Usability will usually beat other factors when it comes to mass adoption.

Signal is the last on my list (it’s slow and unreliable, setting up an account is a coin toss because the verification process wouldn’t work, can’t move chats to a new device, etc.).


I believe you can move chats between devices, they just have to be sourced from Signal and not SMS. SMS/MMS chats are what you may be talking about as immovable from Signal. Those are not shared between your Signal devices for security purposes, I imagine. You can back these up or move them yourself separately. Also, Signal device linking only works between one mobile device and multiple desktop clients. To move between mobile devices it requires a Signal chat backup and restore.


I have used and promoted Signal before, and I also made the mistake of making it my default SMS/MMS client. I'm on Android. Signal does allow you to make a backup file. Then you have to get that file over to a specific directory on the new phone BEFORE you install Signal. But the import rarely works. There are tons of issues on the Signal repo, and a lot of them are closed with won't-fix type answers. I myself am stuck with a history of some Signal, but mostly SMS, messages trapped on one phone, because the backup the Signal app creates fails halfway through importing on other Signal installs. Even if it was reliable, it's way too complicated for most people. People change phones every few years and they expect their data to follow them to the new phone.

There's also https://github.com/xeals/signal-back which actually uses the same code as Signal to essentially export a backup. If you check their issues page, you can see that most of the issues are problems reading the backup file. Since this tool is using the same code with just a command-line wrapper, it gives us more insight into how unreliable the backup and restore functionality of Signal is.


It’s just not possible to move Signal chats from one iOS device to another (on iOS, Signal and SMS are not connected). Signal prohibits backing up of the data to iTunes and to iCloud. So there’s no way to make this work. That’s how it has been, and Signal does this for “security reasons”.


Don't forget Wire having a proper browser app which has basically the same functionality as the other apps. You can just use browser for anything, and keep all account settings, make audio/video calls, etc.


Before moving to Electron, Signal was a Chrome app. I'm not sure if it's still available as such, however. A way to generalize between the two from the development side would be nice.


I keep checking out Wire every once in a while because I _want_ to use it, but as far as I'm concerned, it's not truly multi-device in a useful way because new devices will not see conversation history: https://support.wire.com/hc/en-us/articles/206998259-Why-can...

I was excited to hear they introduced history backup & restore a while ago, but to my dismay it doesn't work across platforms: https://support.wire.com/hc/en-us/articles/360000775549-How-...

> History can only be restored from a backup of the same platform. You will need a fresh login to restore your history from a backup.

So you can't for instance make a backup of your history on web and restore it on Android, or vice versa. Nor does it support automatically making encrypted backups to cloud storage, which makes the history backup & restore feature rather useless for my purposes.

In addition, just now, when I logged back into my wire account, I see the message "You haven’t used this device for a while. Some messages may not appear here." in place of where my actual messages should have been displayed, which is absolutely maddening, and there doesn't seem to be any interest in addressing the issue: https://github.com/wireapp/wire-desktop/issues/655

TL;DR: Wire is simply unable to handle message history in a sane manner. And that's a shame because I love pretty much every other aspect of Wire, but broken message history is a complete deal breaker as far as I'm concerned.


Be sure to hit them with at least one support ticket per missing feature. I would really love Wire to succeed (they're doing very well so far, security- and usability-wise) and they do respond to support tickets. So far, for me, nothing ever came of my bug reports (for example, `markdown preformatted text` still sometimes clashes with the url parser), but I expect that if multiple people report the same thing, they'll give it some priority.


Wire on iOS is a native app.


Still eats battery on my work phone, but this is my first iPhone and it's pretty much the only installed app, so I can't say if iPhones just have terrible battery management or if that's Wire. The iOS battery overview is about as good as Apple Maps on first launch, so that doesn't help me much.


Threema is proprietary and I think it's dangerous to recommend it as an alternative. Everything we trust should be open.

Telegram is not perfect but better than WhatsApp. Also, there are people who want convenience features like history sync - it won't help to tell them that it's insecure. In the end, at least Facebook doesn't have their data.

For the NSA etc. it's not that important where stuff is hosted. From the leaks we know they can do pretty much everything they want (expertise + huge money), so if you want something more secure you maybe shouldn't use one of the popular solutions at all ;-)


> Threema is proprietary and I think it's dangerous to recommend it as an alternative. Everything we trust should be open.

On the other hand, they have been audited [1], have transparency reports [2] and their servers are Swiss-based [3]. Yes, open source would be nice, but a relatively small company with a focused business model (not dependent on advertisement, data gathering, donations, or kind [bi|mi]llionaires) has its advantages too.

I've been using Threema for some years now and wouldn't want anything else.

[1]: https://threema.ch/en/faq/code_audit/ [2]: https://threema.ch/en/transparencyreport [3]: https://en.wikipedia.org/wiki/Threema#Privacy


> their servers are Swiss based

So is Crypto AG. Didn't help Iran keep its secrets from the NSA: https://en.wikipedia.org/wiki/Crypto_AG

[edit to add: not saying that Threema sells out like this, but "they're Swiss" is as meaningless for a security assessment as any other nationality]


>their servers are Swiss based

which doesn't mean anything.


Why not? Law, regulation and policies matter. The servers fall under the jurisdiction of the country where they are located.

There are certainly other options, but I think Switzerland is among the countries with quite reasonable data protection laws. Here's the text... https://www.admin.ch/opc/en/classified-compilation/19920153/...) or some statements which I found with a quick Google search: https://protonmail.com/blog/switzerland/ or https://swissmade.host/en/data-protection.


It doesn't really matter that much whether Facebook has your metadata or Telegram has your metadata. What's more important: WhatsApp's E2E is more secure than Telegram's self-built E2E, which needs to be enabled and doesn't support group chats.

Also, Telegram isn't that open source either: the server side is closed source and the client side still seems to be an undocumented, mirrored mess.

So for me at least, as I don't want anyone that easily to read my messages: many other solutions > WhatsApp > Telegram.


Isn't Telegram's open source situation strictly superior to WhatsApp's? WhatsApp has neither client-side nor server-side code available (and server-side doesn't mean much anyway - good luck verifying that it's the code which actually runs on some server).


Does anyone have experience with using Matrix [0] as an alternative to Telegram and Signal? Maybe even running your own homeserver which stores your history on a server you control and syncs that across all devices?

As far as I saw you can also bridge to other messaging services, even to WhatsApp [1] which is why it might be worth a try.

[0] https://matrix.org/blog/home/ [1] https://github.com/tulir/mautrix-whatsapp


I've been using Matrix with Riot since 2016 (then called Vector). Rooms aren't encrypted by default, but you can turn it on with the toggle of a button. One drawback I've noticed is that Riot uses a lot of battery on Android (probably due to me not using the GCM back end), and on Linux as well (probably due to it being an Electron app). Sadly there is no other Matrix client which is as featureful as Riot (for example, most of the others don't have E2E support), and so we're kinda stuck with it.

I also tried setting up a Matrix home server with Synapse, but I couldn't get it to work behind an nginx reverse proxy with Let's Encrypt, and decided to just settle with using the default home server with E2E enabled. Others have had more success with this however, so I must be doing something wrong.

The experience is pretty good, and my friends and I have tried the Slack, IRC and Telegram bridges and they seem to work pretty well (in fact the Telegram bridge was so fast that I thought my friend was using the native client when he was messaging me). Haven't tried WhatsApp yet, though.

I've also used Matrix and Riot to connect to large group chats on IRC and Matrix, and overall it's a more pleasant and user-friendly experience than using IRC, especially on mobile.


I'm currently self-hosting a Matrix homeserver! I got a few friends to join me and we use it to chat daily.

I can't really speak for its security. Afaik the protocol is almost considered stable, and I trust it for everyday communications — but I wouldn't trust it if dealing with a nation-state level adversary.

The only privacy pain point is that if you run a Synapse home server (the reference implementation), by default your voice and video calls are routed through Google's servers. You have to run your own TURN server alongside Synapse to prevent this. This is fine, but imho it's not well explained in the install guide.

Riot is... okay. It's great for power users, but imho it's quite obscure for the average chat app user; the UX isn't great yet. Perhaps with the redesign...

Lastly, bridges are the biggest reason why I want Matrix to succeed... but as of now, none of them are really useful or even usable. They're all in very early stages of development and need contributions.


>Lastly, bridges are the biggest reason why I want matrix to succeed... But as of now, none of them are really useful or even usable

Why do you say that? I know many projects using bridges successfully. For example the UBports [0] guys have bridges between their Telegram and Matrix rooms.

Besides FluffyChat [1] works wonderfully on UT phones, so that's a plus for them.

[0] https://ubports.com/ [1] https://christianpauly.github.io/fluffychat/


Yes, I do. If you have 39 minutes to spare I recommend you to watch this Ben Parsons talk at 35c3 https://media.ccc.de/v/35c3-9400-matrix_the_current_status_a...


Yes, I've been running my own homeserver for a while now.

I use it daily to communicate with three of my friends in a group chat. I use Riot on iOS, they use Riot on Android.

After the initial setup, I haven't had a single issue with it to be honest.

On a somewhat related note, it's hard to get people to switch to something new, but I was able to get my non-techie friends to switch by showing that we can use it to exchange high quality videos without any size limit (well, I actually have it set to 1 GiB, but I can change it). That's one of the biggest issues when communicating between Android and iOS. Sending videos over MMS sucks.


Yes, I've been using it alongside Signal and Telegram. It works very well. E2EE has been a work in progress, but it might have stabilized.


This post has some useful information, but the recommendation of other centralized platforms that rely on a phone number makes this post seem like the pile of FUD that Telegram has been enduring for so long. Contact theft, metadata, and reliance on a phone number for setting up or verifying an account are issues shared by the other recommendations too. [1] For someone who doesn't know much about security, it comes down to "who can be trusted", and that question applies to Signal and Threema too.

[1]: Edit. I wasn’t aware that Threema does not require phone numbers. Thanks to dbrgn for correcting me.


Threema does not rely on phone numbers. You can use it anonymously, and you can add other contacts without knowing their phone number.

More details can be found in the crypto whitepaper: https://threema.ch/press-files/2_documentation/cryptography_...


Telegram is popular because of its convenience, not its security. I've tried using WhatsApp and the app seemed so much inferior: no nicknames, a crippled desktop client, no separate contact list. Telegram has become something of a standard for formal chatting in my country (universities, businesses, journalists, and officials use it), even though it's blocked by ISPs (it still works fine most of the time). I think that's mainly because you don't have to expose your phone number to be contacted and you can easily use it on desktop (also bots - I know one big store chain that uses Telegram bots for managing your bonus programme).


Yes, because it's such a delight not to have a synced history across devices. When I was figuring out how to transfer WhatsApp history from my mother's Android to her new iPhone, I realized that was no longer possible unless you use some wacky paid tools, because one only syncs with Drive and the other with iCloud. Although I'd love to log in to Telegram with Authy.


The usability of Telegram is indeed awesome and I'm sure that's why it's so widely used (well, that and it not being owned by Facebook). But there are so many choices they could have made better.

So you want synchronised devices? Encrypted ("secret") group chats? Heck, encrypted group video calls? Wire.com has you covered.

You want chat history? Any of your old devices could transfer the encryption key via a QR code or by having you type it over. The server could store opaque blobs that only your devices can decrypt.
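Something like this, as a minimal sketch of the opaque-blob idea (assuming a symmetric key already shared between devices, e.g. via that QR transfer; none of this is any real client's code):

    # Sketch: client-side encrypted history sync. The server stores only an
    # opaque ciphertext blob; any device holding the shared key (transferred
    # out-of-band, e.g. shown as a QR code on the old device) can decrypt it.
    import json
    from cryptography.fernet import Fernet  # pip install cryptography

    shared_key = Fernet.generate_key()   # transferred device-to-device, never to the server
    f = Fernet(shared_key)

    history = [{"from": "alice", "text": "see you at 8"},
               {"from": "bob", "text": "ok!"}]

    blob = f.encrypt(json.dumps(history).encode())   # this is all the server ever sees
    # ...new device scans the QR code, downloads the blob, decrypts locally:
    restored = json.loads(f.decrypt(blob).decode())
    assert restored == history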

The only thing you can't do with current technology is server-side searching without server-side access to the plaintext (as far as I know, homomorphic encryption is not practical enough yet for that). But they could at least give you the option not to store your plaintexts and only have device-local searching. I'd just put a telegram-cli client on my server and have searches deferred there. Heck, media storage could be deferred there so I don't use hundreds of gigabytes (that's how large my account is) on their servers with no known profit model. The whole thing is just so shady.

Anyway, most of these features are easily possible, but instead of working on them, Telegram keeps perpetuating the insecure defaults it launched with. The desktop client never even supported encrypted chats. They make the encryption so cumbersome to use (mobile only, no device sync) that I use it only for a few exceptional cases.


I don’t understand why Telegram still doesn’t have E2E (“secret chats”) on the desktop. That’s such a pain for longer and hotter conversations when one needs to type a lot.


It's because they would have to implement multi-device sync. The last time they implemented crypto (publishing MTProto) people jumped all over it, so I can see why they might consider it a risky move to introduce more crypto that someone somewhere will definitely find an imperfection in (even if you use an existing protocol).

Not that I agree with it, of course. I think it's a good reason to stop using Telegram. But I'm pretty sure that's the internal reason. Unless the reason is "because then everyone would actually use it and we can't read 99% of chats anymore", but that's a little conspiratorial. Not entirely implausible, given the non-existent profit model, but still.


The App Store macOS version of Telegram offers end-to-end encrypted chats.


Thanks. I wasn't aware of that, and will look into it. But if it doesn't sync secret chats across devices, then again it's not of much value.


I guess because with each security-focused feature you suggest, they'd be losing users. And maybe Wire is better, but none of my contacts are there.


Adding to this, another well-known trade-off of E2EE is not being able to have server-side indexed full-text search. For example, sometimes it's necessary to look for some link you shared months ago, and keeping histories local and searching through them is just inefficient in space and speed.


Local history in plain text/SQLite is not that bad. I have 15 years of chat history including small traffic chat rooms, and it takes ~400MB.


Or just don't keep your chat history: never back it up, never miss it.

Info which I want to keep just gets sent via email.


Yeah, that probably works well for you, but a lot of people want chat histories. In general, telling people to give up features which they really want is not a recipe for product adoption.


Why don't we actually ask the question from a perspective of a user.

Do people want randomly scattered, unmaintained information containers which are not accessible via a unified interface, but are still vulnerable to theft and abuse?

Or is the desirable functionality an easy-to-use, secure, organized knowledge repository?

My answer is: As a user, I want to be the only one able to search my own information, and I want to do it conveniently and securely.


"Telegram provides a feature called "Links previews" that's available and on by default in not encrypted chats, anyways if you use a "Secret chat", Telegram app ask to you if you want to use "links previews" adversiting that it previews are done in the server side. How can they know what links are you writing? Can they read your messages still if you are using "Secret chats"? (Not sure but is a edge case)."

They probably do pattern matching in the app; why would they need to send anything to the servers for content previews?


Not verified, but I'd imagine the retrieval of the metadata (the cards/tags needed to show the image previews, etc.) is done by proxying via Telegram's servers, possibly to prevent semi-doxxing someone (revealing their real IP) just by sending them a message?


> Telegram provides a feature called "link previews" that's available and on by default in non-encrypted chats. Anyway, if you use a "secret chat", the Telegram app asks you whether you want to use link previews, warning that the previews are done on the server side. How can they know what links you are writing? Can they still read your messages if you are using "secret chats"? (Not sure, but it's an edge case.)

It seems reasonable to assume the Telegram client detects the links (using a regexp or something) and sends only those to the server side. Of course they see the links if you ask them to, but this doesn't mean they also see all the text.
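Something like this, presumably (just a guess at the mechanism; the regex and the idea of a separate preview endpoint are my assumptions):

    # Guess at the mechanism: the client extracts URLs locally, and only those
    # would be sent to a preview service, not the surrounding message text.
    import re

    URL_RE = re.compile(r"https?://\S+")

    def links_to_preview(message_text):
        """Return only the URLs found in the message; the rest stays local."""
        return URL_RE.findall(message_text)

    msg = "let's meet at 8, details here: https://example.com/party"
    print(links_to_preview(msg))   # ['https://example.com/party']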


Sadly switching to Signal, Wire, and other "technologically superior to Telegram" apps means a drop in user experience. For example, I tried switching to Wire recently but I found that setting up a new device means I wouldn't have access to my old messages.

I understand these apps don't want to upload messages to a server, but can't messages be synced between clients? Even if I had to scan a QR code on my phone to sync with my laptop and had to stay on the same Wi-Fi network until it was done, it could then sync historical messages. I have Telegram messages from 2015 when I had a different computer, laptop, phone, and lived in a different house.

This is also why I don't use secret chats on Telegram. As soon as you add too much encryption and security you have to trade off usability. Must it always be that way? If you have to choose between security and usability, people will always choose usability.


Those other more secure chat apps also treat desktop as an afterthought or worse, which is simply not acceptable for my use case. I live at my desk, not on my phone, and expect desktop to be treated as a first class citizen. At present, the only two services that do this are iMessage and Telegram, so I use iMessage with everybody with Apple stuff and Telegram for everybody else.


This is a pretty big issue for me as well.

In terms of user experience on desktop, all the other apps are complete garbage. Telegram has a native client for all major platforms, written in Qt, that has all the features the mobile app has (except for secret chats). Everything else feels slow and unresponsive and has poor features.


I checked out Threema because of this post (didn't know about it), and while searching I saw some people complaining about it being under the control of the Russian government? Does anyone know anything about this?

https://www.newsbtc.com/2017/10/04/crypviser-secure-messagin...


Threema does not fall under Russian jurisdiction, so no.

http://www.ewdn.com/2017/03/16/russia-adds-intrernational-me...

> Interviewed by Runet Echo, a Threema spokesperson said, however: “We operate under Swiss law and are neither allowed nor willing to provide any information about our users to foreign authorities.”


That seems suspiciously specific.

>foreign authorities

This reads as "but local authorities are no problem" to me.


Check out the transparency report for details: https://threema.ch/en/transparencyreport


This is just Crypviser spreading lies. No, being on that Russian register doesn't imply any intention to comply with any wiretapping / censorship requests; in fact, IIRC, it doesn't even necessarily imply any consent to being added to the register. Saying that Threema's being added to the register means it "moved under the control of the Russian government" is no different than adding their company name and address to some Excel file on your computer and then saying that Threema has moved under your control.


At the end, the author recommends Signal (https://signal.org/) and Threema (https://threema.ch/en). I've never heard of Threema before, but having more viable options in this space is definitely a good thing.


Signal requires a phone number just like Telegram does. Does that mean it is susceptible to the same attack as Telegram?


With Signal, if the number registers a new account like in this attack scenario, 1. the attacker cannot access old/pending messages (as they are always encrypted), and 2. the person on the other side of the conversation gets a warning that the keys have changed and thus needs to re-verify before continuing the conversation. So Signal is more secure on that front.
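The key-change warning boils down to pinning a contact's identity key and refusing to silently trust a new one. A toy illustration of the idea (not Signal's actual code; all names here are made up):

    # Toy illustration of a key-change warning: pin each contact's identity key
    # fingerprint locally; if the server later presents a different key (e.g.
    # after an attacker re-registers the phone number), warn instead of trusting.
    import hashlib

    pinned_fingerprints = {}   # contact -> hex fingerprint

    def fingerprint(identity_key):
        return hashlib.sha256(identity_key).hexdigest()[:16]

    def check_identity(contact, identity_key):
        fp = fingerprint(identity_key)
        known = pinned_fingerprints.get(contact)
        if known is None:
            pinned_fingerprints[contact] = fp   # trust on first use
            return True
        if known == fp:
            return True
        print(f"WARNING: {contact}'s safety number changed - re-verify before messaging")
        return False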


Telegram has two-factor support, which a user can enable at any time for extra security.


Also, there's a PIN code you can use to protect against your number being transferred to another device. The client app asks for it periodically so that you'll remember it.


Yes


Yes. But you don't get message history.


I thought you could get message history, but it is not automatic.


How about Wire as an alternative? I'm curious.


I've started using Wire recently, and while it has some minor warts, call quality has always been excellent. In addition, I really love being able to use it on my computer without having to check whether my phone has connectivity. I hate having to babysit my phone all the time, and Wire is just wonderful for that; it kind of brings me back to the MSN days.

This chart [1] compares several messaging apps' security standards; I'm not sure how accurate it is but maybe it can help you.

[1]: https://www.securemessagingapps.com/


I thought the best jurisdiction for privacy was Iceland? The solutions provided by your link are all USA/Switzerland/Germany.


Meh, jurisdiction is one thing, but if the trust model does not include the server (an E2E-encrypted chat app does not require a trusted server, by definition), it doesn't really matter that much. The most they can do is social network analysis (as in, who talks to whom), against which the only solution is using an anonymisation network like Tor (no jurisdiction is going to disallow its intelligence services from seeing whom an alleged terrorist is talking to).


We use Wire at work. It's sometimes a little buggy, but given the advantages (see my other comment in this thread: https://news.ycombinator.com/item?id=18838010), it's a definite recommendation. Full disclosure: my company is one of the independent auditors of Wire (though they did so before I joined them).


I believe Threema is more focused on business messaging these days. My employer recently qualified it as our official mobile messenger.

Providing an open-source client no longer seems to be a goal.

What all the other messengers should copy is their poll feature [0]. It can basically act as an integrated Doodle.

[0] https://threema.ch/en/blog/posts/threema-poll-feature


Threema puts more focus on business customers than in the past (it's more sustainable than one-time payments, and there are no external investors). But the business app is still fundamentally the same as the "private" app, just with management/configuration features added. Private users are not second-class citizens.

Regarding open source: unfortunately, the app isn't. But the crypto (based on NaCl by djb) is documented[1], has been audited[2], has been reverse engineered twice[3][4], and there is an alternative client implementation[5].

[1] https://threema.ch/press-files/2_documentation/cryptography_... [2] https://threema.ch/press-files/2_documentation/external_audi... [3] http://blog.jan-ahrens.eu/files/threema-protocol-analysis.pd... [4] https://media.ccc.de/v/33c3-8062-a_look_into_the_mobile_mess... [5] https://openmittsu.de/

(Disclaimer: Threema dev)


I think Threema is more well known in Germany.


> When registering an account with Telegram, the app helpfully uploads the entire Contacts database to Telegram's servers (optional on iOS).

The Telegram iOS client has started pushing dark patterns to get me to upload my contacts: it now shows a perpetual “badge” in the app’s main tab view (so that I can’t ever miss it), as if it has an error or alert it needs to tell me about. If I tap on it Telegram will helpfully tell me, on a visually broken page, that I should please allow Telegram to access and upload all my contacts to “seamlessly find all my friends” (like I needed this).


You may not need it, but the FSB sure does.


To summarize: there is a theoretical concern about the MTProto protocol used by Telegram.

Mentioning "nation states" is laughable and it seems everybody should be reminded that not only advertisers have access to "metadata" generated by their mobile devices.

To make life less convenient but more "secure", one could enable secret chats, set an explicit password, and not allow access to the Contacts on iOS.


A theoretical concern about MTProto that was already fixed over a year ago with MTProto 2.0.


> Mentioning "nation states" is laughable

In some countries, a "nation state" is pretty much every local police chief and prosecutor who likes their comfy and well-paying position a lot, doesn't like having too many loud democracy advocates around, and knows who to call in such cases.

People tend to forget that the state is basically evolved mafia.


I made a habit of using automatically disappearing messages (after a day or a week) in Signal. You have to manually save important messages, but typically this is not an issue.

The only downside is that the Signal servers may store automatically disappearing messages until all devices have received it, so you can't rely on messages being removed in due time.

It would be nice to have an e2e encrypted p2p messaging platform to overcome this issue.


> The only downside is that the Signal servers may store automatically disappearing messages until all devices have received it, so you can't rely on messages being removed in due time.

I don't understand your problem statement. Signal is a completely end-to-end encrypted chat platform. The messages stored on the servers are encrypted, and Signal (the platform or its employees) doesn't have the keys to decrypt them. So what's the issue if the messages remain on the servers until they're delivered to the recipients' devices? Even the non-disappearing messages don't stay on the Signal servers forever (unlike, say, Telegram).


So, Tox?


No way. Tox claims to keep users safe from governments and does not even have a threat model in the documentation.


XMPP: OMEMO, XEP-0313 (Message Archive Management), XEP-0280 (Message Carbons), done. Cross-device E2E-encrypted, synchronized messages. Local storage is not in-app encrypted, though, but that needs to be taken care of on the system level anyway in my opinion.


Where can I sign up?

Serious question, because I can't tell a few dozen friends and family members to run their own server, nor do I want to play customer support for all of them.


Also take a look at Quicksy [0]. It is done by the author of Conversations but allows phone number sign up, which simplifies things for non-tech people.

[0] https://quicksy.im/


Conversations (the best Android client out there) proposes using its author's server for a fee.

https://list.jabber.at has more servers.


What about PFS? Which apps would you recommend? Also, with XMPP you almost have to specify which apps to use. Since it is an open standard, just recommending the protocol means that someone could pick one of the "bad" apps (old and unmaintained, or actively against privacy) and the whole effort becomes pointless anyway.


For Android phones you need to get the app called Conversations, and for Linux/Windows/Mac desktops you need to get the app called Gajim and enable OMEMO encryption under plugins.

Conversations for Android is definitely the best XMPP client out there right now, so just make all your family and friends use it. And everything will be transparently flawless.


For those who recommend Matrix: could you maybe say what Matrix is supposed to provide over something like this? It sounds like everything I want in a messaging system, but as I understand it Matrix is supposed to be a kind of sequel to XMPP.


So storing data remotely is now a security nightmare? It doesn’t really help to exaggerate like this.


Not necessarily. But storing them in plain text on the server, with all history accessible just by hijacking a phone number (or a single text message), is a security nightmare, considering the alternatives. Don't you agree?


How do you know that it is stored in plain text on the server?


Because there is no end-to-end encryption (other than the very limited, optional one-on-one chats that only work between two mobile devices), which means the server can (and does) read and store all your messages. Getting access to your full chat history is as simple as intercepting a single text, because the server has access to your plain-text messages.

They might do some form of encryption to store the messages, but this is meaningless when they also have access to the keys and can decrypt any message whenever they want.


So in essence, like about any service that performs manipulations on your files? Like a storage service that creates thumbnails of your photos?


Telegram does not keep private keys only on the client. Instead, Telegram splits the decryption keys and stores the parts on multiple servers in several countries, making it legally very, very hard to process data requests. But this also means the founders, or just some developer with access to the servers, can read the messages, as they have access to the keys. The Signal app is more secure: it does not store private keys on the server. The following FAQ is from the Telegram site.

Q: Do you process data requests?

Secret chats use end-to-end encryption, thanks to which we don't have any data to disclose.

To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.

Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression. Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.

To this day, we have disclosed 0 bytes of user data to third parties, including governments.

### End of FAQ ###

Even the secret chats in Telegram are not really secret, because the protocol was custom-developed by amateur devs. It has not been well tested, and I have seen some articles on methods to attack it; I could not find the link just now.


Sounds more like this guy works for a competitor than a security firm.

No proof, and he apparently didn't even test his own theories.

And the whole "Telegram isn't secure because nation states can abuse their telcos to break 2fa" is, while accurate, also true for about 90% of the internet.

And what's with this method of disclosure? It's not ethical or responsible. It's plain defamatory.


Keybase.io is an alternative for secure chats. Does anyone know how it compares to Signal, Wire & Threema?


I think Keybase does chat well and their code is open source. One issue I see is that they seem to be in a perpetual beta phase, so who knows how much you can depend on their software. I also don't like the React Native based phone apps, as they don't feel native and lack polish. I also worry about them having joined the crypto train for no reason besides being backed by the Stellar foundation. The size of Stellar's stake in Keybase isn't clear and may influence Keybase's decision-making.

The chat clients do have a nifty feature for setting up time-based, self-destructing messages. And you can share git repos with your registered clients and peers. They also have a shared filesystem and team-based chat. You can also generate a PGP key pair for identity. The feature list is vast.

The main point of Keybase is also sound in my book, which is to have a network of proofs such that you can prove who you are in your correspondence on whatever social media or chat you use. You can even use only the encryption / decryption services of Keybase and send the encrypted payloads however you'd like.

I think with more polish on their clients and proven stability, Keybase will be a great service to depend on. But I do think they need more personnel and/or need to step up their social presence, because their last blog post was 200 days ago.


A bit of input on the beta worry:

By no means has our experience been flawless, but as a small business we have been running entirely on Keybase for around a year and a half now.

We use Keybase every day for team & partner chat, shared storage, and git repository hosting. Our email is on ProtonMail, A/V chat goes via Wire, and I am not entirely sure where we host our static websites. Keybase IPs are not fixed at the time of writing, so a CNAME solution would be our only option for root DNS hosting on Keybase, and we are not interested in that at the moment.

Again, we are just a small business, and we do indeed run frequent backups of repositories and KBFS to offline storage (though I am not sure I would want things any different with any other third-party host), but given daily heavy collaborative use for our size, we do not really have complaints about the beta-ness.

Primary feedback at this point is mostly feature requests - like support for git-lfs, static IPs for better DNS flexibility, minor client usability tweaks, and support for tablet displays. Thankfully the Keybase team is very active in community channels, so we have a good idea of the status of these things and rapid turnaround time for bugfixes.

Active clients in the team are on android, Docker, iOS, macOS, and Win7.


Keybase is nice indeed. I think what they are doing with their network of proofs is somewhat unique in this space and makes end-to-end crypto usable for the masses (unlike e.g. PGP). I agree their UI is not as slick as Slack's or Telegram's, but that's nothing they won't be able to fix with a bit of love and attention. Fixing this is expensive, however; hence their association with Stellar for funding.

It's worth pointing out that Telegram has recently funded itself with an ICO as well. They are very well funded and they have a huge user base.

IMHO investor/greed-driven walled gardens have long been a problem in this space, and they're also the reason email is still around: despite its many flaws, it has one feature that is incompatible with running a walled garden - it is federated and decentralized from the ground up. Anybody can run a server, and email clients that lock you into communicating only with users on one server went extinct in the nineties.

With chat, there seems to be a pattern of perpetually recreating walled gardens with a single all-powerful company at the center. This has been going on since the days of ICQ, AIM, Yahoo Messenger, MSN and all those other long-forgotten products that each did more or less the same thing and each tried to own your network. Strictly speaking, there's actually very little that differentiates modern chat apps from those early ones in terms of what they do.

Signal just got a capital injection from the WhatsApp founders. I like that Signal is set up as a foundation and that their stack is intentionally completely open source end to end, yet easy to get into for non-techie users. This breaks the pattern of centrally controlled, for-profit walled gardens as the only option. And this makes them somewhat unique in this space: they are now well funded, independent, and widely respected by people who care about communicating securely, and their mission is to keep it that way.

I'm not that eager to swap out WhatsApp for Telegram. Ultimately, I don't trust either of them to do the right thing. But arguably WhatsApp is somewhat more secure, because at least it uses Signal's crypto. I use Keybase because I find their setup intriguing. And of course I have Signal installed on principle, but it seems none of the people I normally chat with are on it. I hope they can fix that. But realistically, I seem stuck on WhatsApp, which I've never liked because it is such an inferior product compared to almost anything I've used for chat in the last 20 years (fugly UI, limited feature set, no decent desktop client, etc.). But world + dog seems to be on it, so there's that.


I hear you. The blog post about Stellar made me very nervous. https://keybase.io/blog/keybase-stellar

However, I get more nervous about using software without an obvious means of support for the developers, so it would be nice if they can fill some niche and make money, as long as they don't kill the current client.


Let me preface this by saying I work in security and use Signal. That said, the security industry can be incredibly obtuse at times. End users don't care. Full stop. Having the best encryption isn't a selling point to the masses. I'm making a chat app right now and I'm doing it exactly the same way as Telegram, unencrypted by default with an E2E private chat option. Why? Search. I don't need E2E. I use it based on principle. I do need to be able to search my conversations. As long as the chat provider is trusted, I'm happy to have my messages in plain text to be searchable and I think 99% of end users feel the same way.


Why can't search be done entirely client side?


It could, with IndexedDB or similar, but then you have a massive conversation history being stored on user devices. Space is a precious commodity on phones.
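It could look something like this on the client (a sketch with SQLite's FTS5 rather than IndexedDB, purely illustrative and assuming the bundled SQLite has FTS5 compiled in):

    # Sketch of purely client-side search: index decrypted messages locally in
    # an SQLite FTS5 full-text index; nothing ever leaves the device.
    import sqlite3

    db = sqlite3.connect("messages.db")  # hypothetical local store
    db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS msg_index USING fts5(sender, body)")

    def index_message(sender, body):
        db.execute("INSERT INTO msg_index (sender, body) VALUES (?, ?)", (sender, body))
        db.commit()

    def search(query):
        return db.execute(
            "SELECT sender, body FROM msg_index WHERE msg_index MATCH ?", (query,)
        ).fetchall()

    index_message("alice", "the flight lands at 22:40 on friday")
    print(search("flight"))   # [('alice', 'the flight lands at 22:40 on friday')]

Whether that index is worth the space on a phone is exactly the trade-off in question.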


For text? Nah. Most apps automatically store photos on your device, and that's fine. Mobile phones have several GB of storage. That is enough for hundreds of thousands of messages before it's even noticeable. Please reconsider your E2E policy and don't make the world a more insecure place.


The whole article is based on the fact that secret chats are not enabled by default. But I don't understand why that's a critical point: if you are so worried about privacy, just start a secret chat. Also, from what I can understand it's a design decision Telegram took. If secret chats were always on, then the private key would need to be transferred from device to device and to new devices, which is a security risk.

The point about the contacts is right, though. Also, I'm a strict no on closed-source privacy apps. Even though Telegram's code is open, the repo is handled very badly, with squashed commits pushed once every few months and issues disabled.


> But i dont understand why that's a critical point, if you are so worried about privacy just start a secret chat.

Usability is important. The more steps you put in the way of security, the less likely people will do the right thing.


Is there a standard checklist for messaging services that want to be secure?


EFF designed a way of scoring the security/privacy practices of IM applications. They've called it Secure Messaging Scorecard.

While it was useful, they've decided it was a fruitless attempt[0], and their arguments are very convincing to me. On the other hand, it's kind of shitty not to have a reputable resource to point to and say "you shouldn't be using this, it's at the bottom of this list". I appreciated the list, while it was a thing, more for its bottom than its top.

[0] https://www.eff.org/deeplinks/2018/03/secure-messaging-more-...


The most comprehensive comparison table I know is here: https://www.securemessagingapps.com/




I think we should always remember that users want usability too, and most of the time they aren't willing to surrender features for the extra gain in security. If users preferred security, a lot more people would be using PGP.


Telegram is insecure by nature because it's a proprietary service. It doesn't really matter much how it is coded or how it appears to be coded.

Not to diminish the value of the author's analysis, of course, but just to clarify a point: no proprietary service can be considered secure, no matter how good and well-intentioned its owner is.


Unless you're advocating for P2P chat, I don't see how it's any different for open source solutions. At some point you have to trust the people hosting the centralized servers, OSS or proprietary.


Of course; we need a decentralized solution at minimum, a distributed one at best.

Also, about trust: I can trust certain kinds of paid hosting/services a bit. For instance, companies that are in my country, under my country's law, can be trusted in the sense that I have a certain kind of legal protection and a clear signed contract. That does not stop them from doing things with my data that I can't know about, but I have few options. Against services hosted elsewhere in the world, with not-real contracts and zero formal fee, my possibility of action is essentially ZERO, so I can't even be protected by my country's law.


All these criticisms apply to Gmail, Slack, Hangouts, Skype, whatever.


None of these use "security" as a primary component of their advertising strategy.



