Telegram regularly has contests to break its encryption with a reward of 300K USD https://telegram.org/blog/cryptocontest
If it's not secure, then surely people would be cashing in on that sweet money. So, why is it that we constantly see articles talking about how insecure Telegram encryption is, but nobody is showing a proof of concept attack or collecting the prize?
Unless somebody puts their money where their mouth is and shows an actual exploit with code, it seems like pure FUD to me. On top of that, it appears that attacks on Telegram often come from people associated with Signal in some way. Signal is endorsed by the NSA, who have a history of promoting weak encryption that they have found backdoors into. I hope everybody still remembers this debacle https://golem.ph.utexas.edu/category/2014/10/new_evidence_of...
Uh, no it’s not. The client (which happens to seemingly violate a bunch of open-source licenses) and the code posted to GitHub do not match; occasionally a source dump is posted online but there’s no indication how this relates to what they are shipping, as the released binaries differ and update much more frequently. The author is quite unresponsive about this: https://github.com/overtake/TelegramSwift/issues/163
I have telegram-desktop (on Linux), which is unofficially distributed and compiled on my laptop via the AUR. F-Droid also has a version of Telegram that is compiled from source and not taken from packages released by the Telegram group, although they apparently remove the non-free parts.
I'm not sure about iOS, as I don't currently have the means to verify the contents of my App Store-installed version of Telegram, but the fact is that there are (at least) two commonly used client implementations that are compiled from source.
I’ll save you the work: Telegram has updated more often than this repository has been committed to, so the code is perpetually out of date, if it ever shipped at all (apparently it doesn’t compile?). The author has mentioned that they will open source their new client once it has been released, but that promise has not been kept.
The version on GitHub works, but the version on the Play Store differs a lot from it.
Well, I was challenging this assertion.
"Does not match" what thing exactly? My point was that people are using the non-packaged versions (I mean, I'm not FOSS mad, I'm just a user and I'm using the FOSS version almost by accident).
I would prefer the FOSS version to be 1:1 what is in the repo (or, even the upstream of the non-free version), but I see the situation as more chrome/chromium. We're not shitting on chrome for being "the most secure webbrowser" despite having non-free elements.
Seemingly because large US tech companies deserve more respect than a guy who was essentially exiled from his country for not handing over his users data?
Sounds like a double standard. Telegram's OSS edition works, so if they're sneaking things in it would have to be a backdoor, which would be INCREDIBLY damaging to their endeavour if it was found, and it would be relatively trivial to find too. So I'm not sure where this FUD is coming from.
Making the argument about the server not being FOSS is valid though, but then if the E2EE is good enough on the client then the server is basically a relay and can't pilfer anything except metadata. (which, if we're honest, is what governments are after anyway)
Chrome is not GPL.
You can also compile the official client yourself.
I mean, I could, but the Telegram developers being so opaque about this doesn’t make me feel like this is a good use of my time.
> There are many implementations already.
As far as I can tell, the only implementations for the platforms I care about are the “official” ones (in the sense that they are placed under some random person’s GitHub account and linked to by the Telegram website).
> You can also compile the official client yourself.
I have never been able to get this to work. Either way, what I compile from the “sources” they have put online and what they are shipping are two different things.
Telethon (https://github.com/LonamiWebs/Telethon) is a decent up to date (with MTProto) independent clean-room client implementation in the form of a Python lib.
You might consider giving it a try.
The founder of chat app Telegram has publicly claimed that feds pressured the company to weaken its encryption or install a backdoor.
He said, "During our team's 1-week visit to the US last year we had two attempts to bribe our devs by US agencies + pressure on me from the FBI."
> there's code in GitHub that you can compile and run a working client
No, not really. The code pushed to GitHub doesn’t really compile and the author of the project is quite unhelpful in resolving these issues.
> Whether updates are being dropped in a timely fashion or not does not change anything.
It does: the project is licensed under the GPL, and in a sense “benefits” from having the name of the license associated with their product. But they are not compliant with the license; they’re basically source-available, maybe, possibly, if you beg them enough, except the sources they publish also cannot be verified to correspond, and likely do not correspond, to what they are currently shipping. For a project whose entire goal should be increasing trust in their product’s security, keeping the development process opaque and violating open source licenses (in addition to their adoption of custom crypto, which I will ignore here because I cannot comment on it) does not inspire trust at all. Plus it’s just plain annoying.
Which means we must trust Telegram developers to not sell out our data to corrupt law enforcement agencies.
The service is free. Which means they would need to get profitable. Which means they would parse the chats or are already doing it.
Sorry, do you have source for this claim?
>If it's not secure, then surely people would be cashing in on that sweet money. So, why is it that we constantly see articles talking about how insecure Telegram encryption is, but nobody is showing a proof of concept attack or collecting the prize?
Regularly != two time-limited contests. Also, E2EE is not only about decrypting a message; e.g., signing messages as someone else isn't rewarded. Also, you might need a lot of computational power. SHA-1, used in MTProto 1.0, for example, is still practically pretty secure, but not against a well-funded attack.
But that aside, Telegram's encryption is probably good enough. But we already have standards that are good enough. Why risk it?
For example, from On the CCA (in)Security of MTProto:
>Telegram is a popular messaging app which supports end-to-end encrypted communication. In Spring 2015 we performed an audit of Telegram's Android source code. This short paper summarizes our findings. Our main discovery is that the symmetric encryption scheme used in Telegram -- known as MTProto -- is not IND-CCA secure, since it is possible to turn any ciphertext into a different ciphertext that decrypts to the same message.
>We stress that this is a theoretical attack on the definition of security and we do not see any way of turning the attack into a full plaintext-recovery attack. At the same time, we see no reason why one should use a less secure encryption scheme when more secure (and at least as efficient) solutions exist.
>The take-home message (once again) is that well-studied, provably secure encryption schemes that achieve strong definitions of security (e.g., authenticated-encryption) are to be preferred to home-brewed encryption schemes.
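The malleability the paper describes can be shown with a toy model (this is *not* MTProto; the cipher, length field, and padding layout below are all made up for illustration): if decryption ignores trailing padding, any ciphertext can be turned into a different ciphertext that decrypts to the same message, which is exactly what IND-CCA forbids.

```python
import os

# Toy scheme, NOT MTProto: encrypt XORs a keystream over (len || msg || padding).
# Because decryption reads the length field and discards the padding, flipping
# padding bits yields a *different* ciphertext for the *same* plaintext.

def keystream(key: bytes, n: int) -> bytes:
    # Hypothetical repeating-key keystream; insecure, for illustration only.
    return (key * (n // len(key) + 1))[:n]

def encrypt(key: bytes, msg: bytes) -> bytes:
    data = len(msg).to_bytes(2, "big") + msg + os.urandom(8)  # 8 padding bytes
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def decrypt(key: bytes, ct: bytes) -> bytes:
    data = bytes(a ^ b for a, b in zip(ct, keystream(key, len(ct))))
    n = int.from_bytes(data[:2], "big")
    return data[2:2 + n]  # padding is ignored

key = b"0123456789abcdef"
ct = encrypt(key, b"attack at dawn")
mauled = ct[:-1] + bytes([ct[-1] ^ 0xFF])  # flip bits in the padding
assert mauled != ct
assert decrypt(key, mauled) == b"attack at dawn"  # same plaintext: not IND-CCA
```

As the paper stresses, this kind of malleability doesn't recover plaintext by itself; it's evidence the scheme fails a standard definition that well-studied designs meet.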
And that aside, E2EE is not the default, and neither E2E group chats nor E2E video calls are supported. This is the biggest security problem.
People also seem to conflate the fact that encryption is not on by default, or that it doesn't apply to stuff like voice chat with questions about the quality of the protocol. This is a completely separate topic of discussion.
Traditionally when people claim a system is cryptographically secure it's after rigorous analysis. MTProto isn't scientifically interesting. Two short contests several years ago don't make it financially interesting. The groups with the most incentive to break it have the least incentive to publish.
> Signal is endorsed by NSA
Uh, what? This isn't true.
I’m not sure how Signal was supported financially between 2015 and 2018 — maybe out of Moxie’s own pocket, from the Whisper Systems acquisition by Twitter? — but right now its continued development is supported by a $50,000,000 donation from Brian Acton of WhatsApp.
The relationship between Signal and the State Department is solely one of funding, not of technical interaction, let alone technical endorsement.
The author can’t seem to tell his story without tossing attacks at the NRA and guns - while admitting that there is a point to everyone having the same level of access to things the government might use against you (be it encryption or guns).
The author seems to embrace the unbelievable idea from the media that “Trump won, now you need to encrypt your email”... while that sentiment ignores that it was the Obama admin that prosecuted whistleblowers more than anyone before it, used the NSA to unmask citizens in political opposition, ran the IRS targeting, and, under Clinton, pushed the “Internet freedom” policies... I feel like the article was gaslighting itself.
And then finally at the bottom, Signal is funded by a company that is associated with a government group. Which is concerning, no doubt.
What a strange and often seemingly self-arguing article! Maybe I just don’t understand the author at all.
Open Whisper Systems is an independent project, receiving grants and donations through the fiscal sponsorship of the Freedom of the Press Foundation. Last year, the Signal Foundation was created and funded by the aforementioned large donation from Brian Acton, which will take over this role once the IRS approves its 501(c)(3) charitable status.
That money will, minus whatever portion the Freedom of the Press Foundation keeps to cover its overhead, be spent on Signal. And donations for Signal to this foundation are tax-deductible to the extent 501(c)(3) charitable donations usually are.
I have no connection with any of the people, projects, or organizations we've been discussing. But I am the president of another fiscal sponsor 501(c)(3) nonprofit in the free/open source software space, so I'm very familiar with this model.
Nobody (really) complains about PGP's crypto, but there are dozens of articles rightly criticizing its 80s-era "UX".
Because that's not how cryptography works.
Remember that there is no mathematical proof that working cryptography even exists. Cryptography relies on "one-way trap-door functions," i.e., functions that cannot be inverted unless you have the secret key, and the existence of one-way trap-door functions implies P ≠ NP. Because we have no proof of P ≠ NP and it may well turn out to be the case that P = NP (surprising as it might be), we have no proof that cryptography is even possible.
So, what do cryptographers actually mean? In practice they mostly mean that something has been subject to attacks for years and the public mathematical / cryptanalytic community hasn't found any cracks in the armor. We expect RSA to be secure because people have been working on efficient factoring since at least when we named the P vs. NP problem and arguably for centuries and we haven't made any real progress. We expect block ciphers and hashing algorithms to be secure because they're subject to public scrutiny.
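The asymmetry behind "one-way" functions can be shown in miniature (a toy sketch: the digest is deliberately truncated to 16 bits so the search finishes; nothing here reflects any real attack):

```python
import hashlib

# Computing a hash is instant; the only known way to invert it is brute-force
# search. With a full 256-bit digest, the same loop would need ~2^255 attempts
# on average, which is why we *believe* (but cannot prove) this is one-way.

def h(msg: bytes) -> bytes:
    return hashlib.sha256(msg).digest()[:2]  # truncated to 16 bits on purpose

target = h(b"1234")  # pretend we only know the digest, not the input

found = None
for i in range(10000):  # brute force over all 4-digit strings
    candidate = f"{i:04d}".encode()
    if h(candidate) == target:
        found = candidate
        break

assert found is not None and h(found) == target
```

The forward direction takes microseconds; the inverse direction scales exponentially with digest size, and no proof exists that a shortcut is impossible.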
The last time we had a contest for a hash function https://en.wikipedia.org/wiki/NIST_hash_function_competition, there were multiple good finalists, none of which were cracked in their entirety. They each were refined as people found attacks against artificially weakened versions of the functions, on the grounds that attacks get stronger over time. And even so only one was endorsed, and even so consensus is, as far as I can tell, to use a refined version of one of the finalists (BLAKE, as BLAKE2) instead of the actual winner (Keccak). And certainly nobody would recommend using any of the other finalists, simply because the attention of the public research community is no longer on them, even though they haven't been publicly broken.
It is very easy to make up an encryption algorithm or cryptosystem, run a contest for a few months, and not find a winner. That falls far short of the standards professional cryptographers have for solid cryptography. As with hash functions, it may be the case that nobody has found a specific weakness in Telegram's crypto, but using it when it's not the focus of research / not built on the best primitives the research community has to offer is unnecessary risk.
Please provide evidence of this. As far as I know, this is a lie.
Moreover it is in their interest that the vulnerabilities remain in the wild.
Meanwhile, the bug bounty contests were time-limited and ran only a couple of times. Good for publicity, but hard for individual actors to capitalize on (they are more likely to take on-and-off work over a long period).
To begin, when researchers publish or analyze cryptographic algorithms, they model the interaction of legitimate users and adversaries. Two legitimate users will engage in communication and an adversary will sit "in the middle", trying to break ciphertext confidentiality or authenticity (or both). Generally the adversary is allowed to tamper with or replay messages from Alice to Bob and vice versa. Adversaries are also given various capabilities for interacting with the cryptosystem and messages therein, and if the cryptosystem is secure against an ever more capable adversary, we say it is "____-secure." Researchers model the interactions in this way to more closely approximate real-world, repetitive usage instead of an isolated simulacrum.
So let's talk about cryptanalytic attacks. There are known-plaintext, chosen-plaintext, known-ciphertext and chosen-ciphertext attacks. The attacker always has a ciphertext, so known-ciphertext is the lowest (least capable) level of cryptanalysis. A known-plaintext attack gives the attacker a ciphertext and its corresponding plaintext. A chosen-plaintext attack affords the attacker the ability to use an encryption oracle to encrypt arbitrary plaintexts and see the ciphertexts. And likewise, a chosen-ciphertext attack (the most powerful) allows an adversary to decrypt arbitrary ciphertexts.
The Telegram contest requires you to provide the plaintext corresponding to a single ciphertext. You are not given access to real time encryption or decryption within a channel and you are not provided with the ability to attempt cryptanalysis on numerous messages. You can't even tamper with messages or try to replay them, because you have an isolated ciphertext to examine. Everything else is outside the scope of the contest.
That means the only thing you're allowed to do is a known ciphertext attack. A known ciphertext attack is extremely difficult to pull off even for weak cryptography and it does not resemble the structure of a real world attack. For example, legitimate attacks on TLS are not known ciphertext attacks.
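The gap between these settings can be made concrete with a toy cipher (a hypothetical one-time XOR key; this is not any real protocol, and the point is only what the contest setting withholds from the attacker):

```python
import os

# Toy XOR cipher with one random 16-byte key, used once.
KEY = os.urandom(16)

def encrypt(pt: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(pt, KEY))

ct = encrypt(b"secret msg here!")

# Known-ciphertext setting (the contest): one ciphertext, no oracle access.
# Every 16-byte plaintext is equally consistent with ct, so even this trivially
# weak cipher cannot be broken from ct alone.

# Chosen-plaintext setting: a single oracle query recovers the keystream.
keystream = encrypt(bytes(16))  # ask the oracle to encrypt sixteen zero bytes
recovered = bytes(a ^ b for a, b in zip(ct, keystream))
assert recovered == b"secret msg here!"
```

A one-time XOR is about as weak as ciphers get, yet it survives the known-ciphertext setting perfectly; one oracle query demolishes it. That is why a contest restricted to the weakest model proves very little.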
That's why the Telegram contest is synthetic. Again, I'm not going to comment on its actual security. But if you want to do real cryptanalysis, you hire a team of cryptographers who didn't design the thing and try to break it with a more generous suite of capabilities that better resemble real world scenarios. The critiques from the cryptographic community should be examined on their own merits, and not judged according to whether or not they've successfully broken something under a known ciphertext attack.
When I see the equivalent done for Telegram I'll start taking the claims seriously, until then it's just noise.
> Telegram regularly has contests to break its encryption with a reward of 300K USD
> If it's not secure, then surely people would be cashing in on that sweet money.
Like I said in my first comment, all I'm doing is explaining why a contest requiring known ciphertext attacks is not evidence of cryptographic security. You originally spoke to this point, and I am responding about that point in particular because it's a topic I know intimately well.
You can double down on your second point about there being no published exploits; that's not something I can confidently speak to. I think there are reasonable counterpoints citing serious vulnerabilities are not always publicized, but I'll leave it to others to discuss that point.
It's not my goal to convince you that Telegram is insecure. What is important from my perspective is that you understand contests are in fact contrived and noisy, and should not be considered evidence of cryptographic security whatsoever.
To give a specific scenario to illustrate my point, the BEAST attack on TLS happened in 2011, but we knew that CBC was vulnerable to this specific attack since ~2002 (Vaudenay) and that malleability was a concern even earlier. At the time, however, for TLS it was considered impractical and only of academic concern. Had someone offered a sum of money to prove TLS hadn't accounted for this design flaw at the time, nobody would've been able to claim it. The modern reliance on and widespread adoption of AJAX and client-side web apps made the attacker-controlled-plaintext scenario required for this exploit a reality.
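The underlying CBC malleability is easy to demonstrate with a toy construction (the "block cipher" here is a hypothetical 4-round Feistel network built from SHA-256, purely so the example is self-contained; the bit-flipping behavior is a property of CBC *mode*, not of this particular toy cipher):

```python
import hashlib
import os

BS = 8  # block size in bytes

def f(half: bytes, key: bytes, rnd: int) -> bytes:
    # Round function: hash of key, round number, and half-block.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:BS // 2]

def enc_block(b: bytes, key: bytes) -> bytes:
    l, r = b[:4], b[4:]
    for rnd in range(4):
        l, r = r, bytes(x ^ y for x, y in zip(l, f(r, key, rnd)))
    return l + r

def dec_block(b: bytes, key: bytes) -> bytes:
    l, r = b[:4], b[4:]
    for rnd in reversed(range(4)):
        l, r = bytes(x ^ y for x, y in zip(r, f(l, key, rnd))), l
    return l + r

def cbc_encrypt(key: bytes, iv: bytes, pt: bytes) -> bytes:
    out, prev = b"", iv
    for i in range(0, len(pt), BS):
        prev = enc_block(bytes(x ^ y for x, y in zip(pt[i:i + BS], prev)), key)
        out += prev
    return out

def cbc_decrypt(key: bytes, iv: bytes, ct: bytes) -> bytes:
    out, prev = b"", iv
    for i in range(0, len(ct), BS):
        blk = ct[i:i + BS]
        out += bytes(x ^ y for x, y in zip(dec_block(blk, key), prev))
        prev = blk
    return out

key, iv = os.urandom(16), os.urandom(BS)
ct = cbc_encrypt(key, iv, b"pay alice 0100$!")  # two 8-byte blocks
assert cbc_decrypt(key, iv, ct) == b"pay alice 0100$!"

# Without authentication, flipping one bit of ciphertext block 1 flips the
# same bit of *plaintext* block 2 (while garbling block 1): the amount changes.
mauled = ct[:3] + bytes([ct[3] ^ (ord("1") ^ ord("9"))]) + ct[4:]
pt2 = cbc_decrypt(key, iv, mauled)
assert pt2[8:] == b"e 0900$!"  # "0100" became "0900" in the second block
```

Known since the early 2000s, considered academic, then suddenly practical once attacker-controlled plaintext became routine on the web: that trajectory is the whole argument.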
We're paying much more attention now to concerns like constant time code, invalid curve attacks, IND-CCA2 etc not because attacks are obviously possible now, but because we can't always predict when they may become viable in the future. However, Telegram's choices seem to ignore much of the conventional, hard-earned crypto wisdom. Statements like this:
"Properties like IND-CCA are convenient for theoretical definitions and scientific inquiry, but they are not directly related to the actual security of communication."
are really concerning - IND-CCA is directly related to the security of communication. Its very definition is that, given some chosen ciphertext/plaintext pairs (ciphertexts controlled by the attacker), the attacker must not later be able to distinguish two messages encrypted under the scheme. In plain English, schemes that lack this property start to leak information to the attacker after even a small partial compromise. There's been a whole bunch of recent issues (particularly with RSA) where the lack of CCA security has enabled a practical attack.
Nobody else in the secure messaging space is trying to argue such security definitions are inconsequential. If the article is correct, it seems not only are they dismissive of it, but they've made no effort to fix it.
As for your last paragraph, I do not dispute your claim that the NSA has deliberately attempted to weaken encryption; Dual_EC_DRBG is direct evidence of this. However, this is orthogonal to the issue at hand. So far as we know, there is no public evidence of collaboration (or coercion) between Signal and the NSA (your article is about general attempts to weaken cryptography, not Signal). Signal's protocol uses NIST algorithms, but then so does Telegram. The curve choices in Signal's protocol were developed independently of NIST and the NSA. I don't see any evidence Signal is endorsed by the NSA. If anything, the deployment of end-to-end encryption pioneered by Signal is concerning Five Eyes governments to the point that they are legislating lawful-interception requirements.
I stress, to avoid sounding like "FUD", that I'm not claiming telegram is insecure. I am, however, claiming Telegram's choices are concerning and that as a result I do not trust it as a secure messenger. They could fix this by reviewing their choices and being less dismissive of crypto research. I'm not associated with Signal in any way, I don't even use it. I use a competing product for secure messaging (and I'm not affiliated with them either).
In terms of issues with each of these, Wire has occasional bugs and hiccups (UX-wise) and would be unsuitable for large groups exchanging messages (every device implies an entirely new ratchet, users own multiple devices, and all Wire group chats are client-fan-out, meaning pairwise ratchets with all members. Users with more than one device, even if they communicate one-on-one, are effectively in a group session). However the group message problem implies trade-offs and doesn't have an easy solution, so this shouldn't be seen as game-ending criticism (there's an ongoing effort involving people from all three messengers to figure out group messaging in a safe and scalable way, called Messaging Layer Security). WhatsApp is much more feature-stable, but compromises on client-fan-out to make group messaging simpler. I think Signal also does client fan-out. I think Signal were also a bit dismissive of the low order point issue in their protocol (https://research.kudelskisecurity.com/2017/04/25/should-ecdh...), but it's arguable whether this is an issue at all in their instantiation of DH (https://moderncrypto.org/mail-archive/curves/2017/000896.htm...). So in terms of cryptography, all three are state-of-the-art.
I haven't really used Threema, but for completeness I had a brief look at their crypto "whitepaper". In it they say:
"Due to the inherently asynchronous nature of mobile messengers, providing reliable Forward Secrecy on the end-to-end layer is difficult. Key negotiation for a new chat session would require the other party to be online before the first message can be sent. Experimental schemes like caching pre-generated temporary keys from the clients on the servers increase the server and protocol complexity, ... Due to these and the following considerations, Threema has implemented Forward Secrecy on the transport layer only..."
This means Threema's forward secrecy is based entirely on TLS and doesn't apply to the end-to-end exchange. Also, Signal, Wire and WhatsApp use variants of Signal Protocol and these do pre-keying. It's hardly experimental.
There are other considerations and arguments you can make, such as jurisdiction, use of phone numbers, openness of code, who the parent company is and whether they have a voracious love for hoarding data to rival the NSA, federation etc. I'll leave those. I was responding mostly to the "if you can't break it it's obviously secure" challenge.
tl;dr for a normal person against the average adversary or an only vaguely-interested well-if-we-can-collect-it-why-not NSA, any of Signal, WhatsApp or Wire would be fine.
Say what you will about Telegram, but at least you're getting what's advertised. WhatsApp is a complete black box that's owned by a predatory company whose business is selling user data.
> If you want to use secure and private apps, I recommend: Signal Private Messenger: https://signal.org...
I feel it's wrong to criticise Telegram for using phone numbers, and then in the same breath recommend Signal.
...until WhatsApp appeared on the scene. The key innovation WhatsApp brought — because the app was laughably insecure and probably overly simple (but that simplicity can't have been the reason it won; there were other really simple apps out there) — was making your account ID equal to your phone number.
This gave WhatsApp the ability to skip the phase of setting up your 'network'; there'd be no need to ask your friends what their ICQ ID is or whatever. WhatsApp would even simply tell you which of your friends had WhatsApp installed, immediately, without any consent or network setup required.
THAT sold. That simplicity. Yeah, you can (rightfully!) put quite a few question marks on the consent and authentication mechanism I laid out above, but it does lead to an app that is useful and understandable for a great many people (even if it is also not particularly secure or careful about your consent).
I'm sure all these messenger apps (signal, imessage, whatsapp, and telegram) know it and wouldn't dare walk away from phone numbers at this point.
Syria was a very big market for it.
I'd prefer to see people go for a federated system (eg XMPP with OMEMO for E2E crypto) or work on truly decentralized ones (that one is hard, and blockchain is not a trivial solution ;-) ) rather than propose yet another system where somebody else can control who you can talk to (even if they can't see what you're talking about) as a replacement for the previous system with the very same properties, just ran by a different party.
So in a way this is Signal Protocol (the one "regarded as secure") with that "next step" of adding decentralization already taken.
1. WhatsApp was the first widely-used such app and it was phone number-based, so all the clones use the same system.
2. Non-technical people have a hard time (read: bordering on impossible) remembering their usernames, let alone passwords, and by using the phone number as both identification and authorization, the problem is sidestepped.
3. People have an existing address book of contacts (in the form of phone numbers) in their mobile phones, that can be used to pre-populate the app's buddy list.
The problem was that when you changed your device you couldn't migrate your PIN, so I think it contributed to the decline in usage as people moved to other BB devices or iOS/Android.
This only strengthens the argument for why the majority of chat apps use a phone number as your username.
from 2010 / 2011
> Using BlackBerry handsets – the smartphone of choice for the majority (37%) of British teens, according to last week's Ofcom study – BBM allows users to send one-to-many messages to their network of contacts, who are connected by "BBM PINs". For many teens armed with a BlackBerry, BBM has replaced text messaging because it is free, instant and more part of a much larger community than regular SMS.
In Signal it's a requirement, not a convenience: I would like to avoid using a phone number, and I'd be capable of configuring that, but I can't, because the apps don't allow it. Just the same as having to provide one for a new Google account, etc.
The real reason is: both the spy agencies and the ad companies really like to be able to easily identify everybody.
Some topics to consider, which those interested in such topics have to clear with themselves (source: all Wikipedia):
“The Open Technology Fund (OTF) is a U.S. Government funded program created in 2012 at Radio Free Asia.”
“Radio Free X” (for various X) have a very long history of effectively being made and maintained by the CIA.
“Clinton's policy was "heavily influenced by the Internet activism that helped organize the green revolution in Iran in 2009 and other revolutions in the Arab world in 2010 and 2011".”
“Notable projects that the OTF has supported include The Tor Project, Open Whisper Systems,..”
Such associations can at least (directly or indirectly) influence some design decisions.
The best proof against points like mine would simply be to allow those who want to, and know how to, really use the service completely anonymously, that is, especially without providing a phone number.
It is just that their insisting on phone numbers is the thing that, for me, makes use of their product problematic.
And the "Open Technology Fund" is surely an interesting beast, worth knowing about, as are the stories about "Radio Free X" for various X.
Some fascinating insights on that topic:
Signal has repeated the reason they use phone number identifiers ad nauseam. The goal of Signal is to make text messaging secure. Those are the roots of the project. You're confusing the application with the underlying technology, which Signal has published and many other projects have exploited.
For whom is that messaging more “secure” if phone numbers are a (technically completely unneeded) requirement insisted upon by the producers of Signal?
If you don't and never would send an SMS text message, you are not Signal's model target user and never were.
Use something else; you have lots of choices. If not Signal, I recommend Wire, though I think you'll come out behind on the privacy tradeoffs.
So I still claim that the fact they don’t has some very strong reasons that are completely opposite to the security of their users.
And I still haven’t read here any argument that disproves that.
What you cannot get away with doing is making the argument that your disagreement with them is an indicator that they're compromised by the US Government. That's comically false. Snipe at them for things you can make actual arguments about (there are lots of those).
"Apparently" because it's without any proof.
> What you cannot get away with doing is making the argument that your disagreement with them is an indicator that they're compromised by the US Government.
I'm not claiming that and I've never claimed that. I'm claiming only this:
1) There are no technical reasons for them insisting on their users giving them the users' phone numbers and contact data: the underlying technology would not be any more complex without that.
2) There was (is?) actual monetary support of Signal development by an actual US spy-connected organization (with decades of historical support for such direct "spy" connections, some of it really amazing; see above). Which gives some additional context for the whole operation.
Specifically, "tainted" and "compromised by" were always only your formulations. I'd just say "given the circumstances, some non-technical reasons behind some design decisions can be observed."
Which would, I hope, be completely non-controversial if we were, for example, talking about Skype, because for Skype we have direct proof from Snowden. Not having such proof for Signal, we can surely say "we don't have proof," but we also can't claim that "this time it's completely different," even if Snowden himself talks nicely (or something) about Signal.
And downvoting my comments stating both is not changing these facts.
Yes for “complete” security by all means use your own optical fiber network and your own crypto, with a touch of quantum cryptography custom made by Bruce Schneier et al just in case.
We’re talking about something else here, though.
Matrix/Riot.im and Wire don't. I think they're both good alternatives.
Bridging Matrix/Riot to IRC/Telegram/Whatever seems to require some black magick beyond my capabilities.
On the other hand, even my grandmother can (and has) set up Telegram and WhatsApp.
OT but I suppose you mean century?
It is VC backed, doesn't require a phone number, everything is end-to-end encrypted and has apps for all major platforms (iOS, Android, Windows/Mac).
Metadata is surveillance.
In Signal the remote server can't read the messages it delivers (there are no unencrypted messages in Signal), and it never keeps any that it has successfully delivered.
Now, if I convince the network that I have your phone number, how should your friends determine that I'm not you?
In Signal (and WhatsApp) this is designed into the UI. If you aren't sure if this is who you think it is, you can meet in person and compare numbers or QR codes to check there are no shenanigans going on. Many casual users never do this, but if something seems "off" you can verify.
But in Telegram there's no option like that.
In Signal you have an identity that's verifiable, and that's what their verify step does. The keys constantly ratchet, and you can even both reset them from scratch, identity verification isn't affected.
In Telegram you set up a separate encrypted chat and the secret keys for it can be visualised. But if you create a different chat any steps you took to verify the identity don't carry over.
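The "verification carries over" property follows from deriving the fingerprint from long-term identity keys only. A hedged sketch (this is *not* Signal's actual safety-number algorithm, and the key values are made up):

```python
import hashlib

# A long-term "safety number" derived from both parties' identity keys alone,
# so it is unaffected by ratchet steps or session resets.

def safety_number(id_key_a: bytes, id_key_b: bytes) -> str:
    # Sort so both sides compute the same value regardless of direction.
    material = b"".join(sorted([id_key_a, id_key_b]))
    digest = hashlib.sha512(material).digest()
    num = int.from_bytes(digest[:20], "big") % 10**30
    s = f"{num:030d}"
    return " ".join(s[i:i + 5] for i in range(0, 30, 5))  # digit groups, UI-style

alice_id, bob_id = b"alice-identity-key", b"bob-identity-key"
# Ephemeral/ratchet keys play no part, so verifying once suffices even after
# new sessions are created.
assert safety_number(alice_id, bob_id) == safety_number(bob_id, alice_id)
```

Contrast this with a per-chat key visualization: anything derived from session material has to be re-verified every time a new session is created.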
nitpick, but I think you mean 'opaque', as in, they can't see it.
The structure is the same and the section headings are the same. It is blatant plagiarism.
At present the OP is nearly blank anyhow.
I stand by my assessment at the time, but I haven’t checked Telegram since then and don’t know what, if anything, has changed.
Seems to me like the “end-to-end” part was an afterthought, just so they can claim that they encrypt messages, while 99% of messages are stored right there and we must trust Telegram not to give that data to anyone. Why should we?
https://medium.com/@thegrugq/operational-telegram-cbbaadb901... by https://twitter.com/thegrugq
“Plagiarism is the "wrongful appropriation" and "stealing and publication" of another author's "language, thoughts, ideas, or expressions" and the representation of them as one's own original work.”
Check out the "MTProto is not IND-CCA secure" section Eduard Toloza supposedly wrote: https://gitlab.com/edu4rdshl/blog/blob/9d490385d53cdf7048d16...
> Here I briefly present the two attacks that show that MTProto is not IND-CCA secure. We assume the reader to be familiar with the notion of IND-CCA security...
> Once again, I stress that the attacks are only of theoretical nature and we do not see a way of turning them into full-plaintext recovery. Yet, we believe that these attacks are yet another proof [https://eprint.iacr.org/2015/428.pdf] that designing your own crypto rarely is a good idea.
And compare it to Jakob Jakobsen and Claudio Orlandi's paper: https://eprint.iacr.org/2015/1177.pdf
The "author" just swapped out the "we" for "I". Aside from the text in bold at the bottom, there's no original content. It's plagiarism.
After I contacted the author, he removed the plagiarized content.
And I’m pretty sure comments like this are against HN policy as they degrade the quality of discourse.
- SHA-256 is used instead of SHA-1;
- Padding bytes are involved in the computation of msg_key;
- msg_key depends not only on the message to be encrypted, but on a portion of auth_key as well;
- 12..1024 padding bytes are used instead of 0..15 padding bytes in v.1.0.
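The msg_key change in particular can be sketched from Telegram's published MTProto 2.0 description (simplified, not a full implementation):

```python
import hashlib
import os

# Sketch of MTProto 2.0 msg_key derivation as Telegram describes it:
# SHA-256 over a 32-byte fragment of auth_key plus the plaintext
# *including* its random padding; msg_key is the middle 128 bits.
# x is 0 for client->server messages and 8 for server->client.
def msg_key_v2(auth_key: bytes, plaintext: bytes, x: int = 0) -> bytes:
    # 12..1024 random padding bytes, total length divisible by 16
    pad_len = 12 + (16 - (len(plaintext) + 12) % 16) % 16
    padded = plaintext + os.urandom(pad_len)
    msg_key_large = hashlib.sha256(auth_key[88 + x:88 + x + 32] + padded).digest()
    return msg_key_large[8:24]
```

Compare v1.0, where msg_key was a SHA-1 of the unpadded plaintext only, so it depended neither on the padding nor on the auth_key.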
That criticism is often fair, but every higher-level crypto construction is going to be unproven for a while until it's been checked.
It's not like they were inventing their own hash function and stream cypher.
No, Signal is not "homebrew crypto".
Sure, if I put some primitives together (even with good knowledge of how to do it) in a closed product, nobody evaluates it, and I add a label like "military security", then that's homebrew, no question.
But all systems are born "in secret" (at least for a short while). Unless the definition involves appeal to authority.
For a quick (probably incomplete) comparison, Wire has proper multi-device support (not sure how Threema does that), open source and cross-platform clients and open source server (Threema does not appear to have a Linux client nor publishes sources as far as I can tell), it's free (allowing everyone to use it, even if you're in a country where it's hard to pay from), it has a proper profit model (unlike Telegram, and to a much lesser extent Signal), and it does not require a phone number.
The two major downsides of Wire are the battery usage due to all clients being Electron, and that few people know about it (hence my prompt to give them a mention). Not showstoppers I'd say...
This is what Whatsapp does. Signal never did and has real multi-device support.
> and the Signal server is closed source
It's not: https://github.com/signalapp/Signal-Server
This is news to me. Curious then, why is it not possible to use Signal desktop without a phone?
> Can I install Signal Desktop without a mobile device?
> Signal Desktop must link with either Signal Android or Signal iOS to be available for messaging.
Last time I checked, it is not really feasible to set up a standalone Signal instance. Is there any guide to deploying a standalone version of Signal on your own servers, or a successful example of it? If not, their approach may be more "look but don't touch" than free software. Also, decisions like the dependency on GCM and discontinuing features to increase development speed make Signal less reliable as an open source project.
You're right, there's no official support for self-hosting their server, but there are some unofficial guides on their community forums on how to set it up. Also, GCM is not a hard dependency: for some time now, Android falls back to using only websockets (and no GCM) when Google Play Services is not installed. What other features have they removed without replacement?
Curious though, why does WhatsApp not do it if they use the same protocol anyway? Why bother with the phone as an annoying proxy?
I assume it's a design trade-off when you have E2EE and don't store any messages on the server. With Signal (and I assume Wire) you have to register each device, and each device needs to manage its keys. Every message is sent multiple times, once for each device, and each device has an independent message queue on the server. That's why you only get the messages from after you registered the device.
Whatsapp doesn't have to do any of that and can just keep the keys on one device.
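A minimal illustration of that per-device fan-out (hypothetical names, not Signal's actual code):

```python
# The same plaintext is encrypted once per registered device, and the
# server keeps an independent queue of ciphertexts per device. A device
# registered later simply starts with an empty queue, which is why it
# never sees older messages.
def fan_out(plaintext: str, device_encryptors: dict) -> dict:
    queues = {}
    for device_id, encrypt in device_encryptors.items():
        # one separate ciphertext (and server-side queue) per device
        queues[device_id] = encrypt(plaintext)
    return queues
```

WhatsApp's phone-as-proxy model avoids all of this bookkeeping at the cost of requiring the phone to be online.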
Telegram is better on many counts and is a very nice app to use. Usability will usually beat other factors when it comes to mass adoption.
Signal is the last on my list (it’s slow and unreliable, setting up an account is a coin toss because the verification process wouldn’t work, can’t move chats to a new device, etc.).
There's also https://github.com/xeals/signal-back which actually uses the same code as signal to essentially export a backup. If you check their issues page, you can see that most of the issues are problems reading the backup file. Since this tool is using the same code with just a command line wrapper, it gives us more insight into how unreliable the backup and restore functionality of signal is.
I was excited to hear they introduced history backup & restore a while ago, but to my dismay it doesn't work across platforms: https://support.wire.com/hc/en-us/articles/360000775549-How-...
> History can only be restored from a backup of the same platform. You will need a fresh login to restore your history from a backup.
So you can't for instance make a backup of your history on web and restore it on Android, or vice versa. Nor does it support automatically making encrypted backups to cloud storage, which makes the history backup & restore feature rather useless for my purposes.
In addition, just now, when I logged back into my wire account, I see the message "You haven’t used this device for a while. Some messages may not appear here." in place of where my actual messages should have been displayed, which is absolutely maddening, and there doesn't seem to be any interest in addressing the issue: https://github.com/wireapp/wire-desktop/issues/655
TL;DR: Wire is simply unable to handle message history in a sane manner. And that's a shame because I love pretty much every other aspect of Wire, but broken message history is a complete deal breaker as far as I'm concerned.
Telegram is not perfect but better than WhatsApp. Also, there are people who want convenience like history sync; it won't help telling them that it's insecure. In the end, at least Facebook doesn't have their data.
For NSA etc. it's not that important where stuff is. From the leaks we know they can do everything they want (expertise + huge money) so if you want something more secure you maybe shouldn't use one of the popular solutions at all ;-)
On the other hand, they have been audited, publish transparency reports, and their servers are Swiss-based. Yes, open source would be nice, but a relatively small company with a focused business model (not dependent on advertisement, data-gathering, donations, or kind [bi|mi]llionaires) has its advantages too.
I've been using Threema for some years now and wouldn't want anything else.
So is Crypto AG. Didn't help Iran keep its secrets from the NSA: https://en.wikipedia.org/wiki/Crypto_AG
[edit to add: not saying that Threema sells out like this, but "they're Swiss" is as meaningless for a security assessment as any other nationality]
which doesn't mean anything.
There are certainly other options, but I think Switzerland is among the countries with quite reasonable data protection laws. Here's the text (https://www.admin.ch/opc/en/classified-compilation/19920153/...) and some statements I found with a quick Google search: https://protonmail.com/blog/switzerland/ or https://swissmade.host/en/data-protection.
Also, Telegram isn't that open source either: server side is closed source and the client side seems to be still an undocumented, mirrored mess.
So for me at least, as I don't want anyone that easily to read my messages: many other solutions > WhatsApp > Telegram.
As far as I saw, you can also bridge to other messaging services, even to WhatsApp, which is why it might be worth a try.
I also tried setting up a matrix home server with synapse, however I couldn't get it to work with an nginx reverse proxy and let's encrypt, and decided to just settle with using the default home server with E2E enabled. Others have had more success with this however, so I must be doing something wrong.
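For anyone in the same boat, the reverse-proxy part usually comes down to something like the following nginx block (hostnames, cert paths, and the default synapse port 8008 are assumptions; check the official synapse reverse-proxy documentation for your version):

```nginx
server {
    listen 443 ssl;
    server_name matrix.example.com;

    # certificates as issued by certbot / Let's Encrypt
    ssl_certificate     /etc/letsencrypt/live/matrix.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/matrix.example.com/privkey.pem;

    # forward Matrix client/federation API traffic to synapse
    location /_matrix {
        proxy_pass http://127.0.0.1:8008;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header Host $host;
    }
}
```

A common gotcha is federation expecting port 8448 unless you publish a .well-known or SRV record, so that's worth checking if clients connect but federation fails.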
The experience is pretty good, and my friends and I have tried the Slack, IRC and Telegram bridges and they seem to work pretty well (in fact, the Telegram bridge was so fast that I thought my friend was using the native client when he was messaging me). Haven't tried WhatsApp yet, though.
I've also used Matrix and Riot to connect to large group chats on IRC and Matrix, and overall it's a more pleasant and user-friendly experience than using IRC, especially on mobile.
I can't really speak for its security. Afaik the protocol is almost considered stable, and I trust it for everyday communications — but I wouldn't trust it if dealing with a nation-state level adversary.
The only privacy pain point is that if you run a synapse home server (the reference implementation), by default your voice and video chats are routed through Google's servers. You have to run your own TURN server alongside synapse to prevent this. That's fine, but imho it's not well explained in the install guide.
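For reference, pointing synapse at your own coturn instance comes down to a few homeserver.yaml keys, roughly like this (values are placeholders; the shared secret must match the static-auth-secret in your coturn config):

```yaml
# route voice/video relay through your own TURN server
turn_uris:
  - "turn:turn.example.com:3478?transport=udp"
  - "turn:turn.example.com:3478?transport=tcp"
turn_shared_secret: "same-secret-as-coturn-static-auth-secret"
turn_user_lifetime: 86400000
```

With this in place, calls no longer fall back to the default (Google-hosted) TURN servers.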
Riot is... okay. It's great for power users, but imho it's quite obscure for the average chat app user; the UX isn't great yet. Perhaps with the redesign...
Lastly, bridges are the biggest reason why I want matrix to succeed... But as of now, none of them are really useful or even usable. They're all in very early stages of development and need contributions.
Why do you say that? I know many projects using bridges successfully. For example, the UBports guys have bridges between their Telegram and Matrix rooms.
Besides, FluffyChat works wonderfully on UT phones, so that's a plus for them.
I use it daily to communicate with three of my friends in a group chat. I use Riot on iOS, they use Riot on Android.
After the initial setup, I haven't had a single issue with it to be honest.
On a somewhat related note, it's hard to get people to switch to something new, but I was able to get my non-techie friends to switch by showing that we can use it to exchange high quality videos without any size limit (well, I actually have it set to 1 GiB, but I can change it). That's one of the biggest issues when communicating between Android and iOS. Sending videos over MMS sucks.
Edit: I wasn't aware that Threema does not require phone numbers. Thanks to dbrgn for correcting me.
More details can be found in the crypto whitepaper: https://threema.ch/press-files/2_documentation/cryptography_...
So you want synchronised devices? Encrypted ("secret") group chats? Heck, encrypted group video calls? Wire.com has you covered.
You want chat history? Any of your old devices could transfer the encryption key via a QR code or by having you type it over. The server could store opaque blobs that only your devices can decrypt.
The only thing you can't do with current technology is server-side searching without server-side access to the plaintext (as far as I know, homomorphic encryption is not practical enough yet for that). But they could at least give you the option not to store your plaintexts and only have device-local searching. I'd just put a telegram-cli client on my server and have searches deferred there. Heck, media storage could be deferred there so I don't use hundreds of gigabytes (that's how large my account is) on their servers with no known profit model. The whole thing is just so shady.
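The opaque-blob idea from above is simple to sketch (toy cipher for illustration only; a real client would use an AEAD such as AES-GCM). The shape of the scheme is what matters: the key is derived from a secret shared device-to-device (e.g. via that QR code) and never reaches the server, which stores only ciphertext:

```python
import hashlib

# derive the backup key from the secret the devices exchanged directly
def derive_key(qr_secret: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", qr_secret, b"history-backup", 100_000)

# toy stream cipher: hash-counter keystream XORed over the data;
# applying it twice decrypts. Do NOT use this in a real product.
def toy_encrypt(key: bytes, data: bytes) -> bytes:
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out += bytes(a ^ b for a, b in zip(data[i:i + 32], block))
    return bytes(out)
```

The server's API in such a scheme only ever moves opaque bytes around; it cannot search, read, or hand over plaintext.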
Anyway, most of the features are easily possible, but instead of working on them, Telegram keeps perpetuating the insecure defaults it launched with. The desktop client never even supported encrypted chats. They make it so cumbersome to use the encryption (mobile only, no device sync) that I use it only for a few exceptional cases.
Not that I agree with it, of course. I think it's a good reason to stop using Telegram. But I'm pretty sure that's the internal reason. Unless the reason is "because then everyone would actually use it and we can't read 99% of chats anymore", but that's a little conspiratorial. Not entirely implausible, given the non-existent profit model, but still.
Info which I want to keep just gets sent via email.
Do people want randomly scattered, unmaintained information containers which are not accessible via a unified interface, but are still vulnerable to theft and abuse?
Or is the desirable functionality an easy-to-use, secure, organized knowledge repository?
My answer is:
As a user, I want to be the only one able to search my own information, and I want to do it conveniently and securely.
They probably do pattern matching on the app, why would they need to send anything to the servers for content previews?
It feels reasonable to assume the Telegram client detects the links (using a regexp or something) and sends them to the server side. Of course they see the links if you ask them to but this doesn't mean they also see all the text.
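Something as simple as this client-side extraction would suffice (a guess at the approach, not Telegram's actual code):

```python
import re

# Detect URLs locally, then ask the server to fetch previews for those
# URLs only. The rest of the message text never needs to leave the
# device for this feature to work.
URL_RE = re.compile(r"https?://\S+")

def urls_for_preview(message: str) -> list:
    return URL_RE.findall(message)
```

So seeing a preview appear is not, by itself, evidence that full message text is being shipped to the server.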
I understand these apps don't want to upload messages to a server, but can't messages be synced between clients? Even if I had to scan a QR code on my phone to sync with my laptop and had to stay on the same Wi-Fi network until it was done, it could then sync historical messages. I have Telegram messages from 2015 when I had a different computer, laptop, phone, and lived in a different house.
This is also why I don't use the secret chat on Telegram. As soon as you add too much encryption and security you have to trade off usability. Must it always be that way? If you have to choose between security and usability, people will always choose usability.
> Interviewed by Runet Echo, a Threema spokesperson said, however: “We operate under Swiss law and are neither allowed nor willing to provide any information about our users to foreign authorities.”
This reads as "but local authorities are no problem" to me.
This chart compares several messaging apps' security standards; I'm not sure how accurate it is but maybe it can help you.
Providing an open-source client no longer seems to be a goal.
What all the other messengers should copy is their poll feature. It can basically act as an integrated Doodle.
Regarding open source, the app isn't unfortunately. But the crypto (based on NaCl by djb) is documented, has been audited, has been reverse engineered twice and there is an alternative client implementation.
(Disclaimer: Threema dev)
The Telegram iOS client has started pushing dark patterns to get me to upload my contacts: it now shows a perpetual “badge” in the app’s main tab view (so that I can’t ever miss it), as if it has an error or alert it needs to tell me about. If I tap on it Telegram will helpfully tell me, on a visually broken page, that I should please allow Telegram to access and upload all my contacts to “seamlessly find all my friends” (like I needed this).
Mentioning "nation states" is laughable and it seems everybody should be reminded that not only advertisers have access to "metadata" generated by their mobile devices.
To make life less convenient but more "secure", one could enable secret chats, set an explicit password, and not allow access to Contacts on iOS.
In some countries, a “nation state“ is pretty much every local police chief and prosecutor who like their comfy and paying positions a lot, don't like having too many loud democracy advocates around, and know who to call in such cases.
People tend to forget that the state is basically evolved mafia.
The only downside is that the Signal servers may store automatically disappearing messages until all devices have received them, so you can't rely on messages being removed in due time.
It would be nice to have an e2e encrypted p2p messaging platform to overcome this issue.
I don’t understand your problem statement. Signal is completely an end-to-end encrypted chat platform. Those messages stored on the servers are encrypted and Signal (the platform or employees) doesn’t have the keys to decrypt the messages. So what’s the issue if the messages remain on the servers until they’re delivered to the recipients’ devices? Even the non-disappearing messages don’t stay on the Signal servers forever (unlike say, Telegram).
Serious question, because I can't tell a few dozen friends and family to run their own server, nor do I want to play customer support for all of them.
https://list.jabber.at has more servers.
Conversations for Android is definitely the best XMPP client out there right now, so just make all your family and friends use it. And everything will be transparently flawless.
They might do some form of encryption to store the messages, but this is meaningless when they also have access to the keys and can decrypt any message whenever they want.
Q: Do you process data requests?
Secret chats use end-to-end encryption, thanks to which we don't have any data to disclose.
To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.
Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression. Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.
To this day, we have disclosed 0 bytes of user data to third parties, including governments.
#### End of FAQ ####
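The FAQ's claim that decryption keys are "split into parts" across jurisdictions is unverifiable from the outside, but the general idea is easy to illustrate with a minimal n-of-n XOR split, where any single share reveals nothing about the key:

```python
import os

# Telegram doesn't publish its key-splitting scheme; this is just a
# minimal n-of-n illustration. Each share is uniformly random on its
# own, so a single holder (jurisdiction) learns nothing, and every
# share is required to reconstruct the key.
def split_key(key: bytes, n: int) -> list:
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine(shares: list) -> bytes:
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key
```

Of course, none of this helps if a single operator can run combine() whenever it wants to serve messages, which is exactly the trust question at issue.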
Even the secret chats in Telegram are not secret, because the protocol was custom-developed by amateur devs. It is not well tested, and I have seen some articles on attack methods, but I could not find the link right now.
No proof, and he apparently didn't even test his own theories.
And the whole "Telegram isn't secure because nation states can abuse their telcos to break 2fa" is, while accurate, also true for about 90% of the internet.
And what's with this method of disclosure? It's not ethical or responsible. It's plain defamatory.
The chat clients do have a nifty feature of setting up a time based, self destroying message. And you can share git repos within your registered clients and peers. They also have a shared filesystem and team based chat. You can also generate a PGP key pair for identity. The feature list is vast.
The main point of Keybase is also sound in my book, which is to have a network of proof such that you can prove who you are in your correspondence in whatever social media or chat you use. You can even use the encryption / decryption services of Keybase only and send the encrypted payloads however you'd like.
I think with more polish on their clients and proven stability that Keybase will be a great service to depend on. But I do think they need more personnel and/or they need to step up on their social aspect because their last blog post was 200 days ago.
By no means has our experience been flawless, but as a small business we have been running entirely on Keybase for around a year and a half now.
We use Keybase every day for team & partner chat, shared storage, and git repository hosting. Our email is on ProtonMail, A/V chat via Wire, and I am not entirely sure where we host our static websites. Keybase IPs are not fixed at the time of writing, so a CNAME solution would be our only option for root DNS hosting on Keybase, so we are not interested in that at the moment.
Again, we are just a small business and we do indeed run frequent backups of repositories and KBFS to offline storage (though I am not sure I would want things different with any other third party host), but given daily heavy collaborative use for our size, we do not really have complaints on beta-ness.
Primary feedback at this point is mostly feature requests - like support for git-lfs, static IPs for better DNS flexibility, minor client usability tweaks, and support for tablet displays. Thankfully the Keybase team is very active in community channels so we have a good idea on the status of these things and rapid turnaround time for bugfixes.
Active clients in the team are on android, Docker, iOS, macOS, and Win7.
It's worth pointing out that Telegram has recently funded itself with an ICO as well. They are very well funded and they have a huge user base.
IMHO investor/greed-driven walled gardens have long been a problem in this space. It's also the reason email is still around: despite its many flaws, it has one feature that is incompatible with running a walled garden, namely that it is federated and decentralized from the ground up. Anybody can run a server, and email clients that lock you into communicating only with users on one server went extinct in the nineties.
With chat, it seems there is a pattern of perpetually recreating walled gardens with a single all powerful company at the center. This has been going on since the days of ICQ, AIM, Yahoo messenger, MSN and all those other long forgotten products that each did more or less the same thing and each tried to own your network. Strictly speaking, there's actually very little that differentiates modern day chat apps from those early chat apps in terms of what they do.
Signal just got a capital injection from the WhatsApp founders. I like that Signal is set up as a foundation, and that their stack is intentionally completely open source end to end yet easy to get into for non-techie users. This breaks the pattern of centrally controlled, for-profit walled gardens as the only option. And this makes them somewhat unique in this space: they are now well funded, independent, and widely respected by people who care about communicating securely, and their mission is keeping it like that.
I'm not that eager to swap out Whatsapp for Telegram. Ultimately, I don't trust either of them to do the right thing. But, arguably Whatsapp is somewhat more secure because at least it uses Signal's crypto. I use Keybase because I find their setup intriguing. And of course, I have Signal installed on principle but it seems none of the people I normally chat with are on it. I hope that they can fix that. But realistically, I seem stuck on Whatsapp, which I've never liked because it is such an inferior product compared to almost anything I've used for chat in the last 20 years (fugly UI, limited feature set, no decent desktop client, etc.). But world + dog seems to be on it so there's that.
However, I get more nervous about using software without an obvious means of support for the developers, so it would be nice if they can fill some niche and make money, as long as they don't kill the current client.
The point about the contacts is right, though. Also, I am a strict no on closed-source privacy apps. Even though Telegram's code is open, the repo is handled very badly, with squashed commits pushed once every few months and issues disabled.
Usability is important. The more steps you put in the way of security, the less likely people will do the right thing.
While it was useful, they've decided it was a fruitless attempt and their arguments are very convincing to me. On the other hand, it's kind of shitty not to have a reputable resource to point to and say "you shouldn't be using this, it's at the bottom of this list". I appreciated the list while it was a thing more for its bottom than its top.
Not to reduce the value of the author's analysis of course, but just to clarify a point: no proprietary service can be considered secure, no matter how good and well-intentioned its proprietor is.
Also, about trust: I can trust certain kinds of paid hosting/services a bit. For instance, companies that are in my country, under my country's law, can be trusted in the sense that I have a certain kind of legal protection and a clear signed contract. It does not stop them doing things I can't know about with my data, but I have few options. Against services hosted elsewhere in the world, with "not-real-contracts" and zero formal fee, my possibility of action is essentially ZERO, so I can't even be protected by my country's law.