Another thing that wasn't pointed out: Du Rove said "Signal messages have been exploited against them in US courts or media."
This would be the same for Telegram as well, if someone has your phone. I believe Signal can lock the client behind a PIN/passphrase, and its local database is encrypted.
The other part that Du Rove conveniently left out: Signal went up against the US courts and won [0]. When subpoenaed to hand over all user information, they gave them everything they had: the Unix timestamp of when the account was created, and the date the account last connected to the Signal service. That was in late 2021. I'm really curious what Telegram has told the FSB.
Telegram, IIRC, moved its lead developers to Dubai specifically because the FSB was demanding info from them, so you could argue that's an unfounded concern.
The bigger problem with Telegram is that its defaults are insecure: end-to-end encryption is opt-in (unlike Signal, where it's always on), you have to manually start a secret chat, and I think it's not available for all chats or on all clients. And to my knowledge, Telegram will outright cooperate with law enforcement agencies and just hand over unencrypted communications. I'd personally argue that's a security dark pattern: make privacy a big selling point, but then don't activate the security by default.
> security dark pattern - make privacy a big selling point, but then don't activate the security by default
Pretty similar to WhatsApp. They boast end-to-end encryption, but business accounts (which all business chats are now) use Facebook's server-held keys, so that the business can give several other clients access to answer customers. They still call it end-to-end encryption, and this was reportedly the last compromise the original founder accepted before leaving, with a lot of money still on the table.
Good for you; you probably don't live in a country under digital colonialism, where the government allowed Facebook et al. to have internet providers saddle the population with absurdly low and expensive data caps, and then "zero-rate" things like Facebook, WhatsApp, and one music app.
In most of the global south, 100% of businesses have a WhatsApp number. In those places it has pretty much replaced the telephone, and the green WhatsApp icon next to a number on a storefront is now to the current young generation what the black telephone outline was to us.
And if you own a business, or are self-employed, it is even worse: you live by that app.
The lack of encryption between myself and a business is less offensive than replacing an open standard (plain old telephone systems, eMail) with a proprietary and closed one, backed by a single, private corporation
I completely agree with your sentiment, but I will also say this.
As an expat, this feature has enabled me to transact with locals from the convenience of my phone, even though I don't have a local line and won't bother to get a local SIM card, nor do I want to juggle a US SIM and a local one.
It also makes me very effective when requesting services on demand, cutting through hold times, dropped calls, and needless chitchat.
I have many bad things to say about WA, but making living more difficult in a foreign country is not one of them.
Telephony is an artificial monopoly. There's not a single reason why you couldn't call everyone around the globe for mostly free, as WhatsApp proves. Yet here we are.
> Telegram iirc moved it's lead developers to Dubai specifically because the FSB was demanding info from them, so you could argue that's an unfounded concern.
I'd argue it's not giving us any certainty. They could've moved away to escape. They could've moved away to a nice FSB-sponsored location while making good publicity. Ideally the tech should be good enough for this issue to not matter.
But they gave the FSB the info it asked for: the vk.com website (a Facebook clone which, at the time, held far more user data than Telegram did). They could have deleted the data, but no, they handed it over to the FSB.
I will point out, in their defence, they handed it over to an organisation that has a habit of assisting people in learning how to fly from windows. This isn't to say Telegram is secure but that it's unlikely they "could have deleted the data" and remained alive.
The FSB mostly wanted to prevent people from organizing, and that would serve it well. They already had another popular service (odnoklassniki.ru) to direct people to.
vk.com and Telegram have nothing in common except the founder. Durov was forced to sell his stake in vk.com, and Telegram's development started after that, as a response.
You conveniently forgot the second part of that comment. Durov was forced out of the country and had to sell vk.com for peanuts because of his refusal to cooperate with the government. He is still angry at the country at large (not just the government) and, for example, refused to add a Russian translation for years, despite it having absolutely nothing to do with Putin.
Since he is Russian in origin, it's okay to throw baseless accusations at him and spout nonsense like "maybe they're FSB agents" or "maybe they hired an FSB agent without knowing it". You see it here everywhere, and HN is one of the better sites in that regard. Well, maybe Signal has hired an NSA agent and doesn't know about it either? How does that sound?
> Durov was forced out of the country and had to sell vk.com for peanuts because of his refusal to cooperate with the government.
It wouldn't be the first time a cover story was ever used.
> Well, maybe Signal has hired an NSA agent and doesn't know about it either? How does that sound?
You should presume they're trying. I, frankly, presume they've succeeded, either in placing an agent or by compromising something, in virtually every prominent messaging platform.
> You can easily activated it but you aren't burdened by it for 99% of the time when e2e encryption is not needed.
So, in those 1% of the cases when you actually need it, you're instantly flagging yourself as doing something fishy? Because if it ever comes down to it, good luck proving otherwise in a court.
That's like the whole point of why it should be on by default. Not because me making dinner plans is something super-secret that needs to be e2e-encrypted, but because those two scenarios need to be indistinguishable from each other for e2e to be effective.
Yes. Additionally you are at bare minimum signalling that the metadata of the encrypted comms is worth further analysis.
For exactly the same reason if you have a paper shredder, you don't only shred confidential material, you shred a bunch of junk as well to make it harder to find which pieces to reconstruct.
The "win" in this case was that they had to fight to be allowed to disclose what they had provided.
As nice as it would be to not have to provide that information, Signal proved that the only information they have to give is largely useless to law enforcement.
So they lost and have to give up the information/metadata they have. It's just good news that it wasn't much. But that's not guaranteed to always be the case.
No, you're misunderstanding the situation. In some cases companies can't legally refuse to provide information in the jurisdiction they operate in, especially when there's a court order. Every company is subject to someone's jurisdiction.
Signal prepared ahead for that eventuality and designed its service so that the servers received and stored only the absolute bare minimum of information, and, most importantly, didn't preserve the metadata of chats and calls. This is unlike Meta, for instance, which does keep that data for WhatsApp even though the chats themselves are encrypted.
That means that what they provided in that particular subpoena is all they can provide from their server records for any user of Signal, not that it's all that was available in that particular case.
It would've been even more anonymous if not for the phone number requirement, but I can understand why they made that trade off and given the lack of metadata it's not that useful to law enforcement/surveillance agencies in any case.
I believe they store your phone number hashed, not in plaintext. They also recently rolled out usernames and an option to not expose your phone number to anyone, which addressed some privacy concerns.
I don't know how they store phone numbers, but hashing wouldn't help much: the search space of valid phone numbers is trivially small, so the hashes can be brute-forced.
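To illustrate how small that search space is, here's a toy sketch (the number, prefix, and SHA-256 storage scheme are all made up for illustration): an attacker who obtains a hash just enumerates every candidate number.

```python
import hashlib

# Toy illustration: even if a service stored phone numbers as SHA-256
# hashes, the space of valid numbers is so small that every candidate
# can be hashed and compared. The number and range here are made up.

def hash_number(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

stored = hash_number("+15551234567")  # what a server might hold

def crack(stored_hash: str, prefix: str, candidates: range):
    # Enumerate the whole (tiny) search space.
    for n in candidates:
        guess = f"{prefix}{n:07d}"
        if hash_number(guess) == stored_hash:
            return guess
    return None

recovered = crack(stored, "+1555", range(0, 10_000_000))
print(recovered)  # → +15551234567
```

Ten million SHA-256 hashes take seconds on a laptop; the entire global numbering plan is only a few orders of magnitude more, so hashing phone numbers offers essentially no protection.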
The usernames thing does not address any interesting concern. I don't care to show my friends who I chat with my name and phone. They already know these after all.
What I care about is not having to give Signal (the company) my phone number. That's not something they should have.
Durov travels freely to and from Russia and several of their employees are still based in Russia. So yeah, the FSB have leverage if they need to use it.
As I stated in a sister comment, Dubai is marginally better, but not significantly better. If it's the same original developers, they could be squeezed through their family.
Same goes for Signal devs, or any devs really. You're only stating the obvious: humans can be forced and coerced given enough motivation and resources.
Singling out Telegram, or Signal, or any other service's devs is not advancing any argument forward.
There is more reason to be concerned about Telegram than most other similar services.
Partly because it’s insecure by default, which makes a large percentage of conversations vulnerable.
And also because the team behind it is very susceptible to pressure from the Russian government, which is especially bad when it comes to these things. Even if some of them are based out of Dubai now, it doesn’t mean that they aren’t still at risk of coercion, either directly or through for example threats against family members who remain in the country.
If you don’t trust Russia, which you shouldn’t, then don’t trust Telegram with anything sensitive.
Can we trust some more than others without trusting anyone completely?
I for one trust that there are more Americans who would say no to the NSA when they have a legal basis for doing so than there are Russians saying no to the FSB.
The state of the rule of law is certainly not great anywhere in the world right now. But it's far worse in some places than in others. The difference still matters to some degree.
No, we cannot, so your comment and others come across as picking favorites. As an Eastern European I'll state that I distrust the USA as much as I distrust Russia, China, and North Korea. All nations with ambitions use every dirty trick known to man.
I just found the mention of the state confusing because surely this app is equally untrustworthy regardless of its origin. American companies can't be trusted either—they've long since indicated a willingness to work with law enforcement and intelligence. E2e encryption is the only meaningful answer.
Yeah I agree with that, I am always baffled why people single out Russia as if it's the Cold War; none of the major powers can be trusted at all, some are just more covert than others but I guess "out of sight, out of mind" applies in full force even for a seemingly less-biased audience like HN.
As for E2E encryption, eh, this topic has been beaten to death many times. Its UX is difficult so many apps like WhatsApp and Telegram do UX tradeoffs, quite successfully so I'd say.
Death is death, whether it's brutal or not, your dead corpse will not care.
Still no idea what your argument is. Or that of the grand-parent comment. "Oh look, Russia is poisoning people!" Yeah, we knew that ever since the Cold War. Show me a nation that doesn't persecute people it doesn't like.
They don't have to be friends to turn a blind eye.
If Dubai had to pick between letting some nobody foreign national living on their soil get squeezed by a foreign secret police, or pissing off the Russians, what do you think they would do?
(This isn't a knock on Dubai specifically, substitute them for almost any non-NATO country in the world).
I mentioned it because people very easily single out Russia and I think that's quite the outdated sentiment. I'd start with USA, North Korea, China, Russia and Iran -- again, just as a start, and I am sure there are many others that can't be trusted.
Where Telegram's devs and business resides hardly matters IMO. If somebody truly wants to put them under their boot they'll find a way, and that goes for probably half the countries on this planet.
You didn't "mention" it, you implied that I believed this, when it wasn't even part of the discussion. You can believe whatever you want, but you really shouldn't accuse others of such things.
Also: this whole topic is about a Russian company/app and Russian citizens living in Dubai. What other country would we be talking about here, exactly?
On Android and iOS it uses the OS keystore; on desktop it doesn't. It really should there as well: Windows, macOS, and Linux (through the freedesktop Secret Service standard) all have APIs for that, so there really isn't much excuse. Desktop Signal has always had terrible security, unfortunately.
Telegram's encryption is off most of the time; they have server-side access to messages. The optional E2E is annoying to use and isn't even available on every platform: Telegram Desktop, AFAIK, still has no E2E support (and has a very brittle software architecture). You also can't register Telegram accounts with the open-source client anymore.
This should be a non-discussion.
MG is implying that just because other messengers like WhatsApp use Signal's encryption scheme, that doesn't make them more trustworthy.
Yes, you can verify in a binary whether things are implemented well. But if a vendor controls the update channel or beta-rollout features, it's fairly easy to hide targeted features. Wasn't WhatsApp caught exfiltrating chats through channels that bypass E2E?
Btw, there is no Signal on F-Droid, but nowadays there is a third-party build accepted by upstream.
You can also separate the software and infrastructure vendors; look at Molly.im.
Better to bring non-tech folks to Signal than to other messengers that do the same but are less protected.
Both services are relatively insecure in that they require phone-number authentication. In the EU, at least, the number can always be traced back to you unless you buy specific burner phones.
The level of encryption isn't as important anymore at that point. It is less probable you get into problems by using a service that doesn't know your identity.
> but your phone number isn't visible to anyone you chat with.
That's irrelevant: the phone number is known to Signal and can be requested by law enforcement. And since it's been made pretty much impossible to buy a SIM in the EU without showing identification [0], this allows law enforcement to link the account to you.
[0] IIRC the Netherlands is the only country left where you can buy SIMs without ID.
> [0] IIRC the Netherlands is the only country left where you can buy SIMs without ID.
As far as I know, in Romania you can still buy and activate a prepaid SIM card without having to show your ID. There was an attempt a few years ago to make it mandatory to tie the phone number to an ID, but it was overruled by the Constitutional Court.
> Law enforcement asks signal if they have an account for a phone number, signal saying "yes, here's when they created it".
Law enforcement says that the suspect chatted with some username/told people to contact him by his Signal username, then they go to Signal and request the linked phone number, which is then linked to the ID shown when the card was bought.
This only works as long as the username is active/unchanged. It would probably be better if usernames were never linkable to phone numbers, but if your threat model requires a persistent, non-ephemeral username to remain anonymous when targeted by law enforcement that has access to your telecom records and warrants... that's going to require a pretty high level of opsec.
The UX on usernames in Signal might be non-ideal. It might be helpful to have a toggle that regularly cycles your username if that's important for your threat model.
You can buy "anonymous number" on fragment without using any client and without providing any personal information and use it as much as you can
When signal becomes at least remotely as popular as telegram it will implement same protection to fight against spammers because you can't have free unrestricted registrations and don't drown in spam
Telegram currently makes it as accessible as possible: either use it freely but register using phone number and official app or pay and use anonymously as you want
I just looked at the fragment.com site to see how much such a number costs.
The lowest possible bid you can currently make (and that is for an auction with six days still to go, so probably not even the final price) is over $100.
That is an unacceptable price for basic privacy.
Signal is already extremely popular. Its default anti-spam is that unless you're in the recipient's local contact list, your message arrives behind an allow/deny prompt. They also require a confirmed phone number and throttle registrations.
* enough valid accounts. If the instance gets popular, you're going to need hundreds of them to get past rate limits according to the announcements some time ago (maybe the rate limits have changed though)
Signal's definition of "reproducible" meant for quite a while "download this binary docker image and build Signal inside of it". I don't know if that has changed since.
Signal rejects F-Droid for a different reason, though: They only want to distribute through channels where they get download statistics and control update rollouts.
I'm not sure what sort of "control" they have over the Play Store compared to F-Droid, but I'd rather have a trusted third party do the building transparently and verifiably.
F-Droid uses a package maintainer-esque process where the maintainers of F-Droid can intervene and prevent an update to an app from reaching users if it's deemed to be malicious or to add anti-features.
It's particularly needed on mobile, since popular apps, even ones that were originally FOSS, get sold to scummy publishers who fill them with ads and subscription schemes (often called anti-features, since removing them could be seen as a feature in itself), ruining the original. You can't really trust mobile app devs, because the track record is downright awful. Recently that happened with the "Simple" collection of apps: the Play Store versions got filled with junk, but the F-Droid maintainer froze the version and marked the apps as outdated, since nobody could conceivably want the new versions.
Of course, that sits poorly with developers who (a) don't want to deal with third parties in their distribution chain rejecting their updates, or (b) are planning to add anti-features to their apps later down the line. With Signal, I'm going to guess it's mainly (a); the Play Store's checks and balances are much less invasive than the sort of thing an F-Droid maintainer might check for. (As I understand it, Google Play's checks are mostly anti-exploit and keyword scans.)
> where the maintainers of F-Droid can intervene and prevent an update to an app from reaching users if it's deemed to be malicious
That sounds like a feature you want when using FOSS.
Imagine if distros hadn't been able to intervene quickly and the malicious xz were still being deployed through their channels just because the authors wanted it.
Oh yeah, it's an absolutely wonderful feature. F-Droid is pretty much the main app store I'd recommend for getting "the basics" if you're ever in the unfortunate position of having to manage family members' mobile devices. Having a maintainer on the lookout gives so much peace of mind. Not suddenly having the gallery app turn into a data-collection machine that baits less tech-savvy people into vaguely defined subscriptions is a value that's too good to pass up.
FOSS isn't really the important part for me there; it's nice, but the real value is that F-Droid is pretty much the only app store that recognizes that the relationship between mobile devs and mobile customers has to be treated as far more adversarial than on any other platform, given mobile devs' poor track record, and it empowers users to deal with that in a way that restores some degree of trust.
It's a fucking shame there's no equivalent on iOS where you can just say "yeah, what you find here can be trusted" and not have it get polluted a year down the line. Apple used to somewhat police the App Store back in the early 2010s for similar peace of mind, but that's not the case anymore.
> With signal, I'm gonna guess it's mainly a; the Play Stores checks and balances are much less invasive than the sort of thing an F-Droid maintainer might check for. (As I understand it, Google Plays checks mostly are anti-exploit and keyword scans.)
It might have been b as well – Signal did keep their server code proprietary for many months to add their custom cryptocurrency to it, and added this cryptocurrency for microtransactions into the app as well. There may be many more features like this planned, some of which F-Droid might oppose.
F-Droid with reproducible builds signed by both parties seems the best of both worlds to me, so I don't understand why Signal is so stubborn about this.
> This way F-Droid could potentially insert a backdoor in an update.
Google requires app developers on the Play Store to hand over signing keys, which would enable Google to insert backdoors into any release. I can't trust anything on the Play Store for this reason. There is no way to tell which apps have been backdoored by Google for whatever reason (the usual reason being an NSL).
Telegram were claiming they were more secure even when they had their own home-rolled crypto. Security is not Telegram's strong point and it never was.
Of course. But the history of the Signal protocol and implementation traces back 20 years. It's good enough that Facebook, WhatsApp, and Skype use it for E2EE messages. Telegram's traces back 10 years, the first version was very bad, and both versions have had a lot of scrutiny for weird design decisions.
Crypto schemes which get broken usually follow a pattern of "something smells wrong", "we have weakened it a little bit", "we have weakened it a little bit more", "this is now completely broken", "my god why are you still using MD5, it's 2017".
We're in the "something smells wrong" or "we have weakened it a little bit" phase for MTProto2, depending on how you view it.
> But the history of the Signal protocol and implementation traces back 20 years
Are you sure about that? TextSecure was created closer to 10 years ago than 20. Twenty years ago we did not have smartphones.
As I remember, TextSecure started with SMS (but that was not the Signal protocol) and added "internet" messages right after WhatsApp got bought (which was about when Telegram was started).
I love the Signal protocol, but I would say it's more 10 years old (like Telegram). Or am I missing something?
Signal/TextSecure (/DRA/Axolotl) has a pretty strong throughline from the "off-the-record" protocol (OTR) from 2004/2005. Signal themselves describes TextSecure as a derivative of OTR (https://signal.org/blog/simplifying-otr-deniability/).
It's close enough that if, say, a novel attack against OTR were discovered today, the first thing I'd want to know is if there are any implications against Signal.
Don't forget that Signal is the name of the app, and "Signal Protocol" is the name of the E2EE protocol. The parent was talking about the Signal Protocol.
The fact that Facebook, WhatsApp, etc. use the Signal Protocol kind of shows that it is an accepted standard. But of course there are many reasons to use Signal (the App) instead of those apps, for instance:
- The Signal App is open source. You can check the protocol implementation before you use it. For Facebook, WhatsApp and Skype, you have to trust them (or some audits).
- E2EE is only one part: it ensures that nobody except the recipient can read the content of your messages. But there is a whole story around the metadata. The metadata say who writes to whom, and when. It essentially helps build a social graph. Facebook is very interested in this social graph. It would appear that the Signal Foundation is not. And even if it is not perfect, Signal does a lot to try to minimize the amount of metadata it has access to (and quite obviously Facebook has a huge incentive not to do that).
This said, IMHO it is still a lot better to use WhatsApp than to use Telegram, because at least you benefit from a good E2EE.
It's only the protocol for their E2EE chats. There are two big caveats:
- Facebook and Skype E2EE messages are optional, and people rarely use that option, and
- Those apps collect a huge amount of data outside the contents of the messages.
Still, I think mentioning the greatest data-collection projects in human history in the same sentence as Signal, which is supposed to fight exactly that, is not a good look.
Sort of, but it's heavily peer reviewed and generally regarded as very good.
I really dislike the "hand rolled is bad" meme. Someone rolled all crypto. The questions are "who is doing the rolling," "do they know what they are doing," and "was it peer reviewed or directly and faithfully built from a peer reviewed design?"
Crypto is notoriously easy to get wrong, even if you know a lot about it - and most people do not. Secondly, proving something secure is pretty hard as well. If the crypto isn't a bog-standard algorithm in a well-known and reviewed implementation, assuming it to be insecure is a pretty good rule of thumb.
The people who take this advice are people who have respect for the difficulty of things like crypto and should be the ones implementing it, or at least on-ramped into learning how to do so.
The sorts of people who ship bad crypto because they don't bother to learn anything about the field are going to ignore this advice.
So I think as a strategy for fighting bad crypto it's neutral or maybe even net-negative by discouraging the right people from learning crypto and having no effect on overconfident fools.
Someone (Bruce Schneier?) said that the best way to get into actually inventing/implementing crypto is to first get handy inventing attacks / hacking into other algorithms and tools.
Telegram had some weird primitives which they said we should trust because they were made by their top team of mathematicians. Signal builds on widely used crypto primitives even if their protocol is their own (vetted by actual cryptographers though)
To add to the idea that crypto is hard, it is not just hard in the same way that, say, making a physics engine is hard. It is hard because there is no telltale sign you did it wrong.
All crypto algorithms, even weak ones, output what looks like random bytes that can be deciphered back into the original plaintext. Just by looking at the output, there is no way to differentiate secure from insecure crypto. Contrast that with a physics engine: it is hard to get right, but at least if you did it wrong, it tends to be obvious.
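To make that concrete, here's a toy cipher (not real crypto, purely for illustration): repeating-key XOR. Its output is as unreadable to the eye as a strong cipher's, yet a single known plaintext/ciphertext pair recovers the entire key.

```python
import os

# Toy cipher (NOT real crypto): repeating-key XOR. Its ciphertext looks
# just as "random" as the output of a strong cipher, yet one known
# plaintext/ciphertext pair gives away the entire key.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
plaintext = b"attack at dawn, then regroup east"
ciphertext = xor_cipher(plaintext, key)
print(ciphertext.hex())  # looks like noise; no telltale sign of weakness

# Known-plaintext attack: XORing ciphertext with plaintext yields the key.
recovered_key = xor_cipher(ciphertext[:16], plaintext[:16])
assert recovered_key == key

# Every other message encrypted under the same key now falls instantly.
other = xor_cipher(b"a different secret message", key)
print(xor_cipher(other, recovered_key))  # → b'a different secret message'
```

The point: nothing about the ciphertext itself reveals that this scheme is hopelessly weak. Only analysis of the algorithm does.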
Also, like everything security-related, it is adversarial. You may have some of the smartest and most resourceful people on the planet working to break your thing. It is worse than even critical systems: aircraft engine control is critical, and people may die if it goes wrong, so robustness and correctness are crucial, but at least pilots won't go out of their way to break it.
It's inherently risky – cryptography is hard and building secure software is hard, so starting it from scratch rather than re-using well-vetted code increases the risk unnecessarily.
It's not inherently broken, but it's sufficiently risky that it may be fair to assume it is broken. History has proven that software that's not known to be secure is typically insecure when it gets to the really hard crypto implementation. I think it's fair therefore to approximate it as "inherently insecure".
It’s insecure if done by your average full stack developer, that barely passed high school math. That’s why the usual mantra. It’s waaay different when done by math experts specialized in this topic, as is the case with telegram.
Math experts are not necessarily good cryptographers, and authors of MTProto were not renowned cryptographers (unlike with Signal).
Most people in the cryptographic community agree that the Signal protocol is well-designed, is widely believed to be secure, and the authors react openly and swiftly to potential issues. Meanwhile, a lot of the MTProto crypto is just weird (that is, it does not follow standard practices of the field, without strong reasons to do so), and many cryptographers treat it with suspicion.
In the case of Telegram, MTProto’s implementation leaves a lot to be desired[1][2].
Additionally, home-rolled crypto does not usually get the kind of security review from the cryptography community that established designs do, which makes it very likely that bugs of all kinds exist.
Because doing crypto properly is VERY hard. You might think you've achieved ultimate security, and one year later someone will defeat it (or cryptanalyze it within practical limits, which boils down to the same thing), because you forgot a detail. Even just implementing a crypto algorithm in a way that doesn't leak information is very complex, which is why most applications use well-established crypto libraries.
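A classic example of such a forgotten detail, sketched in Python: verifying a MAC tag with `==`. The comparison short-circuits at the first differing byte, so response timing leaks where the mismatch is, which is exactly why the standard library ships a constant-time alternative.

```python
import hashlib
import hmac

# Verifying a MAC tag with == short-circuits at the first differing
# byte, so an attacker measuring response times can learn how much of
# a forged tag is correct. hmac.compare_digest avoids that leak.

secret_key = b"server-side-secret"  # made-up key for illustration

def make_tag(message: bytes) -> bytes:
    return hmac.new(secret_key, message, hashlib.sha256).digest()

def naive_verify(message: bytes, tag: bytes) -> bool:
    return make_tag(message) == tag  # timing leaks the mismatch position

def safe_verify(message: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(make_tag(message), tag)  # constant-time

msg = b"transfer 10 EUR"
tag = make_tag(msg)
print(naive_verify(msg, tag), safe_verify(msg, tag))  # both accept a valid tag
print(naive_verify(msg, b"\x00" * 32), safe_verify(msg, b"\x00" * 32))  # both reject
```

Both functions return identical results; the difference is invisible in the output, which is exactly the point being made above, and it is why even a "correct" implementation can be broken in practice.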
We explain this under the heading "A Somewhat Opinionated Discussion" here: https://mtpsym.github.io/ which is our security analysis of MTProto's symmetric cryptography.
> An alarming number of important people I’ve spoken to remarked that their “private” Signal messages had been exploited against them in US courts or media.
Any sources for this except the private testimony of a Signal competitor talking about his important friends? (ETA: Or is it when the court/media obtains your unlocked phone, in which case Telegram won't protect you either...)
My guess would be that their phone was taken from them, unlocked, and their messages were accessed that way.
I know several large IT orgs that have done this when Legal got involved. Literally using a 2nd phone to take pictures of Signal chats on the phone in question.
Given the location of Telegram's servers (Dubai), and the nature of the government (neutral dictatorship) and the lack of encryption, my default assumption would be that not only are they selling access to your data to major governments, they've probably even streamlined the bidding process.
Yep. The magic of "you could turn on encryption" is that nearly all people using it won't.
"Ah, but if you need encryption then you'll..." - well, two things now. Suddenly you're the person who has encryption switched on. And also more likely, someone they talk to will forget to switch it on and just blab everything into cleartext anyway.
The entire importance of Signal's model is that it is always encrypted. It's why LetsEncrypt is also important: to have effective security you need to be able to hide in the crowd. If encryption usage is rare, then who's using it itself (or suddenly starts using it) becomes an extremely valuable datapoint.
(so I'd add: Telegram absolutely sell timeline details of which user accounts change their frequency of encrypted chat usage).
Last I used Telegram, creating an E2EE chat with someone added an encrypted chat in addition to the existing unencrypted chat. This means that if you're not careful about which of the two chats with the same person you're typing in, it's easy to accidentally send data unencrypted.
I'd guess this is because Telegram's E2EE chats aren't multi-device capable, so it's necessary to keep unencrypted chats usable while running Telegram on something other than the phone that holds the E2EE keys.
Both Russians and Ukrainians use Telegram, including for confidential messaging with their agents on foreign territory. So that's proof enough for me that it's safe enough.
> Ukrainian artillery targets Russian soldiers by pinpointing their phone signals. Despite the deadly results, Russian troops keep defying a ban on cellphone use near the front.
It's not about ordinary soldiers. It's about special services agents contacting their "partisan" agents, while the other side's special services try to catch them. They're supposed to apply the best security possible in the given circumstances.
If you claim that neither Russian, nor Ukrainian special services are competent, I'd disagree with you.
I mean, the Russian Ministry of Defense admits it.
> “It is already clear that the main reason of what took place included the massive use, contrary to the ban, of personal mobile phones in the range of enemy weapons,” the Russian Defense Ministry said in a statement. The cellphone data allowed Ukraine, it said, to “determine the coordinates of the location of military service members to inflict a rocket strike.”
"including confidential messaging with their agents on the foreign territory"
Possible, as many ridiculous things have happened around the whole war. (Recently, German generals on a video chat were targeted by the Russians; it wasn't too hard, as they did not use any encryption at all.)
Sources would be nice though.
But it really would not be a reason for me to trust telegrams security.
Rather a confirmation again, that also secret services can show great incompetence.
It can use encryption. But they chose not to, probably out of laziness. Which is bad for normal people, worse for generals who should lead by example, and ridiculous for generals with a background in IT who really should know better. But as far as I know there were no real consequences, so apparently it was not such a big deal.
I would guess that those two would turn encryption on?
IDK, the whole anti-Signal post really makes me suspicious of Telegram, whereas I wasn't really before. Are they trying to be the universal honeypot for agencies?
There are multiple layers where interception can happen:
1) On-screen keyboard - by default most phones do send what is being typed - a lot of phones also have 3rd party keyboards of doubtful origin preinstalled
2) "Enable backup" scam - on starting an app (like Google Photos or WhatsApp), chances are you or your wife accidentally presses "OK" on a pop-up message
3) Hardware drivers - non open source binary blobs with back doors
4) Operating system - you basically don't know what information is logged and sent back to phone's vendor
>by default most phones do send what is being typed
That's extraordinary if true. Do you have anything to back it up, though? Even Google (!) wasn't brazen enough to log everything typed on Gboard, they implemented federated learning.
- "I don't like where one of their board worked" (find someone high up in the cryptography ecosystem who hasn't been involved in this sort of thing somewhere in their career)
- "I don't like where their funding comes from" (US govt regularly funds secure software because they depend on it for their own operations, see: Tor)
- "An alarming number of people think their chats were leaked". It's easy to state things without sources. Also an alarming number of people think Facebook listens to them through their phones' mic. People are bad at opsec. Not news.
- "No reproducible builds. They closed a GitHub request from the community." Well, except the Android build is reproducible, and they explicitly state on that closed issue that they don't take feature requests via GitHub and asked the reporter to raise it in the proper channel.
- "Telegram is the only service with reproducible builds". Telegram barely has encrypted chats, reproduce all you like, that doesn't make the chats secure. Signal has E2E encryption and verifiable builds for Android, that's a strictly better security position.
I think Signal could stand to gain popularity by either prioritizing overall niceness and polish in their clients (especially on desktop) or by allowing third parties to build clients which prioritize those things.
iMessage, Telegram, and Signal all get usage from me, with the vast majority of that usage weighted heavily on the former two because that’s where most people in my circles are. When comparing user experience between the three, it’s easy to see why.
Allowing third party clients that provide identity verification signatures would be totally excellent, and I would support that.
Signal does this today by verifying phone numbers themselves, so they’d have to continue doing so centrally; “never trust the client” applies to their own client just as much as anyone else’s, and “allow unverified users to initiate contact with strangers” is the spam vector infecting all modern telephony (thus STIR/SHAKEN).
So with that need resolved, the biggest risk of third party clients would be intentionally compromised code within an attractive wrapper — but the only way to defend against that is to not allow third party clients at all.
So.. I guess I no longer support third-party clients, having worked through the timelines of what will occur. Ah well.
Durov's exile and distancing from Russia after the VK takeover may be just for show and for selling Telegram as 'the dissident app'. It is popular, easy to use and insecure.
They want to do this because they want more traction for their blockchain: TRON, which, IIRC, is the payment method for ads, usernames and "stuff" inside Telegram.
However Du Rove is right about a bunch of things:
- Signal clients suck, especially the desktop one, where they ship (or used to ship) pre-built binaries such as their own lib: https://github.com/signalapp/ringrtc
- Also you can't have Signal without the Google Play Store
- Signal clients suck in usability. I wish I had the Telegram clients (Android, and the Qt desktop one) instead of this Electron garbage. Telegram clients are super-duper-awesome
- I would say that removing the phone number requirement is their #1 request, yet they take so much time to address it, especially when they complain about the cost of phone number validation SMS
- BTW, Telegram is implementing a very nice crowd-sourced SMS validation idea, where they use their users' phone numbers to send the validation SMS
- They have a very questionable crypto integration with MobileCoin, which has obscure value: it depends on Intel SGX and is 95% pre-mined
You can use Signal without the Play Store. Download the apk from Signal's website and it will use a background connection to receive calls and notifications. The downside is that it's heavier on the battery.
They seem to want everyone to use the official app. That's supposed to be fine because of the reproducible builds.
In any case, if we don't trust the official client/app/build, then I'm not sure if a F-Droid release would help that much. If we think there's something funny with the official apk, then we can't trust any part of Signal.
It feels like any platform that allows for one-way initiation of a conversation is bound to increase in spam as the platform grows in usage (phone calls, email, SMS, various social media, various messengers, etc.).
Do any platforms require that both parties add one another? (And/or allow for restricting an account to such a mode)
e.g. if user123 and user789 wish to communicate, then user123 must add/contact user789 AND user789 must add/contact user123. Until both do so, then nothing happens.
It's more work to legitimately establish contact with someone, but that seems like it pales in comparison to the effort produced by spam/scams.
Same thing with verifying identities. In order to actually establish proper contact with someone, you need to communicate with them via some outside means (ideally in person) in order to establish the connection. Requiring both parties to enter/scan some ID/code/whatever seems like it would only facilitate proper verification (though not guarantee it, of course).
I'm sure that I'm missing something, though. I assume I'm just not familiar enough with these platforms and that some/all of them provide such a feature. It's just odd to me that spam sounds like such a problem when it feels like the above solution would be highly effective and simple to include.
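The mutual-add rule described above can be sketched in a few lines. This is a hypothetical model, not any real platform's API; the class and names are purely illustrative:

```python
class ContactGraph:
    """Hypothetical mutual-consent contact model: messages are only
    deliverable once BOTH parties have added each other."""

    def __init__(self):
        # Set of one-directional add requests, stored as (adder, added) pairs.
        self.requests = set()

    def add_contact(self, user, other):
        self.requests.add((user, other))

    def can_message(self, a, b):
        # Delivery requires the add request in both directions.
        return (a, b) in self.requests and (b, a) in self.requests


g = ContactGraph()
g.add_contact("user123", "user789")
assert not g.can_message("user123", "user789")  # one-sided: nothing happens
g.add_contact("user789", "user123")
assert g.can_message("user123", "user789")      # mutual: contact established
```

A spammer in this model can fill people's pending-request lists but can never push message content to anyone who hasn't opted in, which is the property the comment is after.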
I am in a Signal group which has an invite link discoverable on public internet (it's a local OpenStreetMap group). From time to time, a bot joins and proceeds to spam the group's members one-on-one.
The same happens on the Telegram OSM group. Now, the easy and 99% effective mitigation is to make a "bridge group" where you need to click something to join the real deal but changing that would invalidate any existing links.
Isn't it an expected issue with popular services, particularly ones with proper e2e encryption?
Things like WhatsApp and iMessage get scam messages too, and the less visibility the operators have for contents of messages the harder it is to proactively filter out spam.
Not being in either Telegram/Signal camp I see a lot of tribalism in the comments.
It seems that any arguments for/against either one end up in politics.
Like I understand that Telegram is probably not very secure, but seeing what proponents of Signal are saying doesn't really make me trust Signal either.
It is political. As I mentioned elsewhere in HN, Telegram is now being promoted in the US by the political-right there because they have lost trust in US BigTech social media platforms who, they believe, are "unjustly" censoring them on their platform. That is why the right-leaning media are now heavily promoting Telegram ( https://www.youtube.com/watch?v=1Ut6RouSs0w ) and bashing other platforms ( https://www.city-journal.org/article/signals-katherine-maher... ).
You mean the BigTech social media platforms that use Signal's protocol for messaging? Wow, I wonder why the people who don't trust BigTech social media don't choose Signal. That is actually insane.
If it is indeed political don't try to bring some kind of technological merit into this, it makes you look really dishonest.
You missed my point - I am agreeing with you that the arguments for or against both end up political, because politics is the reason both are being promoted. Technically both products are equivalent. And both products may also be sharing data with government(s). The right- in the US just hope that Telegram isn't sharing it with the US government. :)
I've always at least strongly suspected that Telegram is a FSB honeypot.
It's insecure by default so I guess it could be an everyone-honeypot. I'll keep using Signal for my secure messaging thank you very much. Honestly I trust Apple iMessage encryption more than Telegram.
If you follow the discourse, the crypto quality is no longer brought up in factual Telegram-to-Signal comparisons, except as low-effort swipes at Telegram's general credibility.
Because saying "AES" is enough to talk about encryption ? Nothing else is involved ? Because if we're going in this direction everyone should just use XORs for encrypting and everything would be fine, and the rest would be implementation details.
Um, yes. When the “place that some people don’t like” is all sorts of CIA-connected NGOs and you’re a member of group defined by its paranoia about privacy, then absolutely this becomes disqualifying.
AFAICT Signal is collateral damage in this disinformation campaign. The original attack seems to be aimed at the CEO of NPR, coming from an assortment of right wing (and some Russian-aligned) voices. She happens to also be on the board of Signal which, through the prism of conspiracy theory, now extends their crusade. Given that Telegram is commonly understood to be aligned with the Russian government, this maps neatly on the US/left vs Russia/right axis through which such people already understand the world.
There seems to be a concerted effort to discredit Matthew's claims. Even here on HN. I find this suspicious. The Signal protocol has been heavily audited by many different people from many different countries. It's usually found to be sound. The telegram protocol has been found to have issues that are, if not malicious, amateur level mistakes.
Once again, this is not my opinion. This is the result of independent auditors who have no affiliation with either the USA or Russia.
There are positives to the UI of Telegram, there are negatives to the UI of Signal. None of these has much to do with the underlying protocol of either.
Personally I'd rather we all put our collective efforts into something like the protocol suggested by Matrix, but if only given the choice of Telegram or Signal, I'd avoid Telegram like the plague. They are either malicious or amateur. Either one isn't a good choice for security.
You can have a secure, verified protocol but an insecure implementation of the protocol (the app). Note though that I'm not saying that Signal the app is insecure. However, I do think that Signal can certainly do more to make itself more transparent and to accommodate libre third-party implementations of their protocol.
> The telegram protocol has been found to have issues that are, if not malicious, amateur level mistakes.
Please provide evidence of such issues. Because at most, the issues with MTProto were at the level of "we are not familiar with this, but seems ok". Which seem to be inflated by Signal activists into maliciousness.
> The meaning of "bear's service" originally comes from a fable about a man and a bear. The bear wanted to help the man by killing a gnat which sat on his forehead. As a result both the gnat and the man died.
Basically, by being proactive you do more damage than if you hadn't done anything.
Replying to this, as I can't reply to your down-thread reply for some reason.
What if the gnat isn't a gnat? What if the gnat is another man who now knows the communications of the first man? I'm not saying the Bear should kill both, but I'm pointing out that the analogy falls apart when the gnat isn't just a mildly annoying third party.
Ok, apparently I can now reply to this comment... Weird HN delays aside.
I don't care if the people who can decrypt Telegram chats are allied with any one side or another. I believe the idea of "Class enemy" to be abhorrent, and the moral / social threats of "the overall impact" to be negligible when compared to the fact that using compromised communications platforms will inevitably lead to greater problems than the act of calling them out.
This is the equivalent of "You'll keep quiet if you know what's good for you".
If Telegram is broken, certain people need to stop using it. The socio-political climate of the areas most likely to be using Telegram just makes this more urgent. This applies independent of if / how / why it's broken, and who, if anyone, may benefit from this.
Or a Polish one. (I guess the expression will be popular across Eastern Europe)
It's funny to see the basic cultural stuff float to the surface in comments like that. Like when there was a large number of "American" accounts some time ago on Twitter responding to financial news, but putting USD after the numbers... (To be clear, I'm not suggesting anything specific about the author here, just that sometimes you see enough opinions about something with the origin "leaking" through the side channel and wonder how organic it is)
> Recently, in [MV21 ] MTProto 2.0 (the current version) was proven secure in a symbolic model, but assuming ideal building blocks and abstracting away all implementation/primitive details.
Translation: it is secure, except for bugs, if any.
That's a generous translation! They were shown to be double-encrypting, using nonces where they weren't required, and generally making a bunch of mistakes that would be fine if they were writing a student level implementation of a secure messenger protocol, but not one that went on to be tacitly endorsed by a bunch of nation states!
It's like a clunkier version of the backdoor in Dual EC DRBG. When problems like this are found, you can either assume deliberate malice (as in the case of NIST) or accidental incompetence. Either should be immediate grounds for not using the software. This isn't Flappy Bird. This is meant to be secure comms. The "This Is Fine" mentality doesn't cut it.
Eh, split any important message into pieces, put a piece each in Signal, WhatsApp, Telegram, Threema, Line, and then the Americans, Russians, Swiss, and Koreans will each have some parts, but if you're lucky, nobody has all...
- "Elon Musk said so", which does not matter.
- Signal attachments can be viewed by an attacker with local access to the client. This is not Signal's job to protect against.
- Signal offers an optional `--no-sandbox` flag which only has security implications if enabled, on Linux.
- Weaknesses in sealed sender. This is the only one that might be an actual problem (two theoretical and one empirical attack, but the latter comes from an 18 page paper that I have not read). But this does not compromise the integrity of the chats, and is not something Telegram improves on.
Given how the poster described the optional `--no-sandbox` flag as "no sandbox on Linux", it's clear that they don't understand anything they're sharing, and they just want to spread FUD.
---
edit: Per discussion below, I was wrong about the `--no-sandbox` flag. It's enabled by default. I take back my insult: it was I who did not understand the linked issue.
I still stand by Signal > Telegram. The risk here is that an attacker could figure out how to abuse Signal to run arbitrary Javascript, e.g. through a specially crafted message.
>Given how the posted described the optional `--no-sandbox` flag as "no sandbox on Linux", it's clear that they don't understand anything they're sharing, and they just want to spread FUD.
Could you elaborate as you seem to be more "knowledgeable". This flag is clear at what it does and shouldn't be shipped into production. https://no-sandbox.io/
You're right. It seems I am eating my words on that item, the `--no-sandbox` flag does seem to be on in most Linux installs. From context and search, it looks necessary for it to work on Debian.
Can confirm with `cat /usr/share/applications/signal-desktop.desktop`.
This still would require a pretty sophisticated attack to take advantage of, but I wouldn't rule it out as an attack surface. (We regularly see iPhone exploits that attack font and image rendering, after all.)
Let me set a few things straight: Telegram is, for the most part, TikTok for people who don't mind putting some effort into reading on a few odd occasions. Saying that I have a lot of Ukrainian friends would be an understatement, and they are the only reason I have Telegram; all of them favor it, which, all things considered, is a grave mistake. In practice, Telegram is far more closely related to TikTok and Twitter than to a messaging app, and by extension it is heavily used to spread misinformation: Telegram channels are ultimately under the complete control of their admins, who have ultimate authority, with no way of doing anything about it. Twitter was forced to put some effort into this through Community Notes, but that hasn't even made a dent: it literally takes two Google searches to find tens of thousands of bot accounts spreading misinformation. In that regard, Telegram is much worse, since it's an infinite source of cognitive dissonance: people are willfully joining echo chambers which are openly advertised as such.
I am really glad that Telegram is nowhere near as big in Western countries as it is in Eastern Europe. It pains me to say this, but even to this day, we Eastern Europeans are far more susceptible to propaganda than the Western world, although, for a million and one reasons, it seems to have a huge effect on the Western world as well. In that sense, Telegram is an active contributor.
10/10 times I'll sit firmly behind Signal, despite the many shortcomings: there is no developer integration; if you want to create a Signal account for your own personal bots or whatever, you can, but only through a hacky repo on GitHub.
Yes, the people behind telegram know all this very well and they don't like the fact that people who are aware of it as well are favoring signal infinitely more than telegram.
And that's good, that's their strength. I use it to read information from all sides of the conflict and decide for myself what's "disinformation" and what's not. A grown person doesn't need a gatekeeper that pushes their own interests and shuts up anyone daring to contradict them.
Oh yeah, "both sides". Sure... Wanna ask the two orphans living at my cousin's where their parents are and who killed them? How many thousands of such examples do you need? I'm sure as hell I can supply you with a sufficient amount, even worse than straight up shooting a child's parents in front of their eyes.
That would be a good argument if it wasn't for the thousands of videos of men, women and children getting raped and killed and russians gloating in the comment sections. Something which telegram is notorious for.
I really like Telegram from a UI/UX standpoint (so much better than Signal), but Pavel Durov is such a sketchy character that it's starting to turn me off. How can he be touting about being secure when they still haven't implemented end-to-end encryption by default? Also so many other things if you follow Durov's channel.
(I use Telegram and Signal each for about 45% of my messaging, even though I'm in Europe where WhatsApp is so prevalent.)
You can use the above link. Otherwise, you will have to log in, unfortunately. Earlier, we could have used a nitter instance, but all of them have been blocked.
You will eventually revise your opinion once you find your chat logs 20 years later in some randomly occurring IRC logs, because that one guy was using an IRC bridge.
You cannot critique missing guaranteed end-to-end encryption when Matrix effectively cannot guarantee it either.
> You will eventually revise your opinion once you find your chat logs 20 years later in some randomly occuring IRC logs because that one guy was using an IRC bridge.
E2EE cannot prevent the receivers from sharing the message (they are one of the "end"s in "end-to-end encryption", after all). The same thing could happen because one person in the group chat ends up getting some ransomware on his phone; E2EE cannot prevent that.
Not a huge fan of Signal (phone number requirement [0], crypto push a while ago), but there are worlds between those two, and every time the Telegram CEO makes a post it looks more like a scam than before.
[0]: Yeah, might be changing or has already. Now, after ages.
> phone number requirement. Yeah, might be changing or has already. Now, after ages.
A phone number is still required for registration. As of a few weeks ago, it's not necessarily communicated to your contacts anymore, which solves a few concerns (but not all).
> crypto push a while ago
I was worried about this, but I use Signal daily and I haven't even noticed anything in the UI about this, it seems like a non event in the end.
In theory somebody could just make a client that takes your message, generates a random string, XORs your message by that, and sends the XOR via Signal and the rest via Telegram.
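That splitting scheme is essentially a one-time pad. A minimal sketch in Python, purely illustrative of the idea (not a recommendation, and the function names are my own):

```python
import secrets

def split_message(message: bytes) -> tuple[bytes, bytes]:
    # One-time pad: XOR the message with uniformly random bytes.
    # Either share alone is statistically indistinguishable from noise.
    pad = secrets.token_bytes(len(message))
    cipher = bytes(m ^ p for m, p in zip(message, pad))
    return pad, cipher  # e.g. send one via Signal, the other via Telegram

def recombine(pad: bytes, cipher: bytes) -> bytes:
    # XOR is self-inverse: pad ^ (message ^ pad) == message.
    return bytes(p ^ c for p, c in zip(pad, cipher))

pad, cipher = split_message(b"meet at noon")
assert recombine(pad, cipher) == b"meet at noon"
```

Of course, both transports still leak metadata (who talked to whom, when, and message length), and an adversary who can read both channels recovers everything.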
Which big centralized messenger operator can be more trusted to run the SW they say they are running is always a contentious shifting argument. If you host your own or use a small hoster, these arguments about who might have been compromised or compelled are not relevant. Decentralized and federated protocols such as Simplex.chat, XMPP, Nextcloud Talk, Matrix, Session, and Delta Chat eliminate this concern.
As far as we can tell, they are both insecure: Telegram is closed source and Signal published their source but basically forces users to use the Google Play version which lags behind the OS version and you can never be 100% sure what it does, not to mention things like SGX.
What do you mean by SGX? SGX, even if it's fatally flawed, won't be worse than not using SGX. That's the worst case - they added a broken sandbox. Best case - they added a working one.
If you really have secrets which you don't wanna share, then you should not trust any of these services. Develop your own service or stick to PGP mails.
This just seems like a knee-jerk reaction to Durov promoting his own platform as usual, what does Elon Musk have to do with this for example? Is there any evidence that the authorities have ever had access to private conversations? At the end of the day, the issue comes down to the fact that Telegram is such a superior messaging app compared to anything else.
Du Rove made the original post on May 8th 7:31, judge it yourself if it is 'pretty intense'.
Boasting that end-to-end encryption is absolutely safe is obscurantism. If you want the most security in transmission, share your GPG public keys face-to-face.
Du Rove made this post after the owner of Twitter forwarded an article about Signal. This is the same Elon Musk who has turned Twitter into an easy-to-surveil platform, and who used to be very direct but has since learned to play it smart.
I strongly recommend reading the original post yourself first.
Here's some global context from the past week or so. I'm just piecing this together, maybe someone more informed can comment:
- The Polish spy chief is warning that Russia intends to invade a NATO state in the near future [0]
- Poland is strengthening its border with Belarus [1]
- Germany is considering conscripting all 18 year olds in the face of what it perceives as Russian aggression [2]
- Russia warns of "enormous danger" if NATO troops are sent to Ukraine [3]
- Russia threatens to use "special ammunition" against NATO [4]
Headlines were starting to read like this immediately before the Russian invasion of Ukraine. We saw troop buildups and threats for a while before any action was taken.
Amidst all this, there's a sudden push to move people off of Signal and onto the Russian-developed Telegram, which is widely regarded as less secure and is not even encrypted by default.
Telegram now operates out of the UAE, which has long been a partner with Russia. Wikipedia has this to say about the strengthening of UAE-Russia relations since the invasion of Ukraine:
> trade between the two nations strengthened with many Russians relocating to the UAE to invest in real estate, business, or "escape financial restrictions in Europe". Trade between the two countries has doubled to $5 billion since 2020 and there are approximately 4,000 companies with Russian roots that are operating within the country.
So, my take here is that this push toward Telegram smells pretty bad given the timing. Telegram has always had kind of a smell about it, given that it rolled its own crypto, and given Durov's involvement with the VK social network, which was, in Durov's view (again according to Wikipedia), taken over by Putin's faction.
Personally, I like Signal. I have some of the same concerns folks here have brought up. But it's been well vetted by experts and is highly regarded by people I trust. That doesn't mean you have to like Signal. Its crypto improvements have been spread to other apps, and many people are probably just fine using something like iMessage.
And while I don't know anything about Durov or his motives, I have yet to see any successful cryptography app anywhere in the world that didn't eventually have to compromise with a government. And Russia seems especially good at applying pressure, with a history of institutional tips and tricks that go back at least to the Soviet secret police, and possibly even further back to the Tsars.
As much as I think we can't objectively trust the US government in all matters, I think we can generally trust the cryptography experts. They tend to be skeptical of all governments when it comes to cryptography, even in democracies.
So that's my two cents. I wouldn't switch from Signal to Telegram. If you're on Telegram and you have especially interesting activity (like you fight in a war), then you should probably assume that Russia can see everything you're doing. That may change if Telegram gets the sort of robust cryptographic scrutiny Signal has. If you're not warring, you're not doing exotic fancy crimes, and you live in a democracy, you're probably fine with either app but possibly a little better off with Signal or iMessage.
Do use Telegram as a news source, to subscribe to channels
Do not, for the love of all that is holy, use it for any communications.
It is not secure and most likely has been hacked up the wazoo by the FSB.
"With assistance from Elon Musk" is a pretty big accusation. I held off replying until I read your whole thread, and then you didn't mention that at all. What the hell?
Seriously, what the heck has Elon Musk to do with this? Unless we also want to debate what we all think of Elon Musk when we talk about chat protocols?
You can download Telegram and many forked clients from F-Droid. All the builds are from source code, so you know the source code is up-to-date.
Any distro can have Telegram clients, both official and third-party, in their repository.
Compared to this
1. You cannot download Signal from F-Droid. You need to download it from the Google Play Store. The released source code has lagged behind the version on the Google Play store by long periods of time many times. One example was when they implemented cryptocurrency payments, pushed the update to everyone but no one could inspect the source code.
2. Signal has sent legal threats to repositories that package Signal. The repos either need to confuse users by offering the client under other package names or remove it.
3. They also send baseless threats to forks that use their server. Combined with their lack of federation, this results in people having to use multiple apps from different sources with a much larger attack surface.
4. They beg for donations in the app even though they made an app with payments and cryptocurrency integration with an obscure coin (which they were involved with and had ample opportunity to hoard before ever announcing it as a feature in Signal).
5. They claim to have privacy features that other messengers lack, but these features are based on known-to-be-broken technologies like Intel SGX.
> pushed the update to everyone but no one could inspect the source code.
That was for the server code, which you shouldn't care about from a security standpoint for an E2EE messenger such as Signal. AFAIK that was not the case for the clients.
Regarding your other points, they have reasons that have been discussed elsewhere[0] to avoid federation, notably a lot of the progress on the Signal protocol would be way harder in a federated setting. There's no other messenger that has the same usability ("my grandfather can use it and won't have problems using it afterwards") while being at this level security-wise.
> Factually incorrect, just go to https://signal.org/android/apk/ (and the apk will then update itself) or build it yourself.
That page tells me that the safest way is to have a Google account, with Google Play Services installed on my phone, and to download it from the Google Play Store.
It then gives me an APK link after saying "Danger zone" and "most users should not do this".
If the app developer tells me it's dangerous and I shouldn't do it, can you even expect users to do this?
I don't know how to verify the SHA fingerprint without Googling (I know how it works, just don't do it often to know the exact openssl or equivalent command).
If I'm downloading the APK directly on the phone, there's a lot that's not under Signal's control that could happen.
What if I'm directly under attack, and I'm trying to move to Signal? The attacker could MITM the connection and intercept the download.
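For what it's worth, the fingerprint check amounts to hashing the downloaded file and comparing against the value published on the download page. A minimal Python sketch; the filename and the expected value are placeholders, not Signal's real published hash:

```python
import hashlib

def sha256_of(path: str) -> str:
    # Hash the file in chunks so large APKs don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the fingerprint published on the download page, e.g.:
# assert sha256_of("Signal.apk") == "<hex value from signal.org>"
```

Note this only protects the download itself; it does nothing against a compromised publisher, which is what reproducible builds address.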
I think that's a fair warning to show a user, because indeed most users will likely want to install apps through Play Store, that'll reduce/remove supply chain risks. Users who know enough about APKs would be able to verify the hash, or build it themselves.
Even if I download the APK, I still have to accept a similar warning when installing it on my phone.
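For anyone in the same boat as the commenter above: the fingerprint check itself is simple once you know what to compare. A hedged Python sketch (the file name and the "published" value are placeholders, not Signal's actual artifact or hash):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Throwaway stand-in file so the snippet is self-contained; in practice
# this would be the APK you downloaded.
with open("demo.apk", "wb") as f:
    f.write(b"not a real apk")

# In practice this value comes from the vendor's download page,
# fetched over a separate trusted channel.
published_fingerprint = hashlib.sha256(b"not a real apk").hexdigest()

assert sha256_of("demo.apk") == published_fingerprint
```

The same comparison is what `sha256sum <file>` plus an eyeball check does; the point is that the expected value must come from somewhere the attacker can't also MITM.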
> If the app developer tells me it's dangerous and I shouldn't do it, can you even expect users to do this?
If you care about reproducible builds and avoiding trusting Google, you're already in the class of not-most-users.
Signal seems to have usually taken a pragmatic stance of defaults mattering.
Afaicr, that was the argument for linking to phone numbers (it allowed for more lazy users to use it) and encryption by default (few turn on opt-in-encryption).
And it seems accurate to say 'for most users, who don't know what they're doing and don't want to play personal-IT-department, using the Google Play store is more safe and secure.'
Google Play, for all its failings, IS the safest way to get an APK right now.
F-Droid, as much as I love the open platform, does not provide any security guarantees about what you're downloading. It is a volunteer-run project and does not have the extensive security policies and practices that Google has. From https://f-droid.org/en/about/
> Although every effort is made to ensure that everything in the repository is safe to install, you use it AT YOUR OWN RISK.
Likewise, downloading and side-loading it from their website requires you to disable some security guarantees by doing things like enabling developer mode.
You know what you're doing, so you can ignore those warnings. That seems like a much better alternative to endorsing direct APK downloads from websites for non-tech-literate users.
Do you disagree? The main issue I see with sideloads is that you don’t get automatic updates. I’d do that for an app I built myself, but not if the app is in the Play Store.
With Signal you actually do. The APK from their website isn't the same as the one from Google Play: it has an auto-updater (it will prompt you with a notification when a new version comes out, which you can tap to install) and doesn't come with the FCM push notification system.
> 2. Signal has sent legal threats to repositories that package Signal. The repos either need to confuse users by offering the client under other package names or remove it.
Not that I really want to defend Signal (XMPP FTW!), but the legal threats were about using the Signal name, not making an unofficial client per se. I know a bit about it because I develop an alternative signal client (a signal-XMPP gateway to be more accurate). That said, they don't help 3rd party client devs at all.
I can understand that if they didn't compile it themselves they don't want 3rd parties using the 'Signal' name.
The name is what their reputation is staked on, and if a third party compiled it they have no idea if malware is secretly packaged in there too.
Having said that, the smart move is to dedicate a few engineer hours to packaging it for every linux distribution and every app store, even the smallish ones, to prevent others trying to 'be helpful' and requiring you to send a takedown.
> the legal threats were about using the Signal name
that's just misleading misdirection.
Firefox has the same issue with its trademarked name; that's why the source builds under other names and the branding is added later on.
Signal ties the branding to the code, so it is impossible to build from the canonical source without triggering the branding issue.
So, in practice, it is a convoluted way to annoy anyone releasing from source. And as we know, actually using open-source software without a "distro" is insanity. You cannot trust thousands of devs. You trust the distro, the distro trusts tens of package maintainers, the package maintainers trust tens of devs, and everyone is happy. I trust F-Droid just fine. But I don't trust a person publishing every APK on a random site, like Signal does.
Telegram FOSS clients exist only because unpaid volunteers take Telegram's messy mix of open and closed parts, rip the closed parts out, and replace them.
The Telegram organisation is notoriously late to release the source code for its current release. If they do, it's a giant squashed commit without a proper changelog.
These releases must then be first wrangled by volunteers to be well buildable.
The Telegram Org itself gives no support to volunteers at all.
You can't register with Foss builds. Only official binaries.
Nowadays a lot of features are premium only. You can only get premium with official binaries. That part is closed.
Telegram has fully reproducible builds and is not that complicated to build, no issues there. They even have a guide on how to build & verify. [0] No need to wrangle or modify, generally builds as is (at least from my experience).
Granted, yes, the version commits are squashed like you said. [1] However, I haven't seen the source release lag behind store releases; any sources on that?
A couple months ago I actually verified a build of Telegram on my friend's phone as he thought something might be off and didn't have any issues there (the build matched).
This doesn't affect the user that downloads these from distro repos or F-Droid because every single update they get comes from the source code. There is never a lag even for 1 second because without the source code there are no builds.
Pretty much all the packages on Linux repos come from package maintainers taking upstream source code, removing parts they don't like and then building that. This is a normal part of packaging and building open-source apps.
Yes, and that's why users sometimes spend months on old builds.
Also which distro packages Telegram?
Fedora doesn't.
Debian does but at times it was so old the client crashed from receiving server comms because it wasn't fully compatible. It actually crashed as in segfault.
All true. But where is the source of the Telegram server? It's simply not open source! What are they actually doing with our messages? Only they know. And they can read them, because by default there's no E2E encryption.
Would it matter if the server was open source? You'd have no proof that it's what they run on the actual server anyway, nor can you use a custom server.
It can matter if you can trust them to do the proper thing, i.e. if you assume they are not a malicious entity. In this case, checking the server source code can give experts insights about possible security risks.
If you assume they are malicious, (a) I wouldn't use their product in the first place, and (b) of course they can do whatever they want independently of the published code.
This comment sounds like astroturfing to me. Telegram doesn't even provide secure chats by default. Everybody I've talked to was unaware that the chat was unencrypted until I pointed it out. That's before considering that 1) I don't consider the company behind Telegram in any way trustworthy, 2) the servers for Telegram are closed source and it's unclear what's running there. Signal's server code is open source and the github is actively updated. We also know who works on Signal, as well as their credentials, and they're reasonably trustworthy compared to alternatives.
> The released source code has lagged behind the version on the Google Play store by long periods of time many times.
Seems like FUD. This took me 30 seconds to check just now:
-Telegram's android source code git hasn't had a tagged release in more than two months and is several versions behind the android app (10.12.0 vs 10.9.1)[1]
-Signal's android source has a tagged release two days ago that is two releases ahead of the stable version on google's app store, and also lists the tagged release for the version that is on the app store.[2]
Telegram rolled their own crypto and is used for a lot of intelligence operations like monitoring dissident groups, promoting propaganda, recruiting agents, etc. That probably explains the push to discredit more private apps like Signal.
Researchers of Telegram's protocol have said in some ways it's weaker than TLS.
“Rolling your own crypto” is discouraged for programmers, not for field experts. It’s not your average joe’s first try at encryption writing a caesar cypher…
It's discouraged for field experts too. In practice, real crypto schemes go through several rounds of analysis by multiple teams of experts, often working against each other. It's unusual these days for a single company to come up with a custom crypto scheme. It was probably more usual toward the beginning of cryptography.
For example of this sort of vetting, take a look at the standardization around AES or the post-quantum schemes.
In crypto you're almost always relying on hardness assumptions that aren't provable yet. So you need to guard against things like accidentally having chosen the wrong constants that collapse a problem's hardness. Or, more mundanely, making a seemingly reasonable engineering choice that is known to weaken the protocol and which would be caught by a big org with a thorough review, but which a startup may not catch.
Can these alternative telegram clients be sure the keys the server sent are from the other client, with no MITM? (Honest question, I don't know the answer.)
Pure smear comment. Signal was and is the choice of personal messaging app for anyone I know who has ever worked in security or intelligence. That should say it all. Aside from apple, who did it because of them, it has set the gold standard for e2e chat. People moan about the phone number and “metadata” when in reality all this can be used for is to say yes x has a signal account and this is when they last used it. That’s it. It’s effectively useless to anyone. People moan about it leveraging the local social graph of the device it’s a necessary convenience for the adoption of any modern chat app. They go into great detail about how it is and isn’t used in a way that it cannot be used/viewed by others.
Frankly I'd bet half the people smearing it have X and Facebook apps installed on their phones and really aren't serious people.
If I wanted to smear off topic I'd point out that Telegram, along with the usual suspects, is a gold mine for intelligence gathering, from what I've heard.
I don't know about Telegram being nasty towards Signal but Signal brought this upon themselves.
Metadata are more important than the content of the messages and yet Signal has always been about knowing your phone number, with handwaving when the subject is mentioned.
Session, a Signal fork, had its tagline right: "Share encrypted messages, not metadata".
Signal is a metadata exchanging app and it's about collecting your phone number and everybody else' phone number.
Now I don't think Telegram needs to "attack" Signal: Telegram is immensely more successful and has reached take-off velocity.
To me Telegram is going after WhatsApp, not Signal.
You are speaking of metadata as if all metadata is equal. Signal does collect phone numbers (though, since usernames were introduced [1], this can be made opt-in from now on), but not your contacts or social graph, nor many other relevant kinds of metadata [2]. All they can gather from this is when the account for a given phone number registered with Signal's services and its last connection to the server [3].
So, if you can call "metadata exchanging app" an app that simply has a list of numbers registered to the service, without any metadata assigned to them except their last access, the same label could be assigned to a much larger number of services.
It may not be anonymous, but it can hardly be disregarded as private.
>but not the contacts or social graph, neither many other relevant metadata [2].
Assuming you trust them (notice all your links point to signal.org own publications). Most of the privacy people are cautious/paranoid and assume that everything that can be collected is collected. Even assuming a lack of malicious intent, what's stopping NSA from hacking into Signal's infrastructure and logging who's talking to who along with timestamps? That's not to say I don't trust signal (it's the best mainstream solution right now), but it could do better to hide metadata from the protocol.
> Even assuming a lack of malicious intent, what's stopping NSA from hacking into Signal's infrastructure and logging who's talking to who along with timestamps?
Sealed Sender, the second link in the comment you've replied to. The indicator is off by default, but you can enable it under Settings → Privacy → Advanced. If I remember correctly, it doesn't work for the very first message you exchange with someone, but then it turns on and remains on.
In layman terms, it turns "from A; to B; content: <encrypted>" into "to B; content: <encrypted>". Their infrastructure doesn't need to know the "from" part to serve its purpose, so they strip it away.
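To make that layman's description concrete, here is a minimal sketch of the sealed-sender idea (not Signal's actual wire format, and `fake_encrypt` is obviously a placeholder for real E2EE): the sender identity travels inside the encrypted payload, so the server-visible envelope only carries the recipient.

```python
import json

def fake_encrypt(data: str) -> str:
    # Placeholder for the real end-to-end encryption step; reversing a
    # string is of course not cryptography, it just makes the payload
    # opaque for this demo.
    return data[::-1]

def seal(sender: str, recipient: str, plaintext: str) -> dict:
    # The "from" field goes inside the opaque blob, not the envelope.
    inner = json.dumps({"from": sender, "content": plaintext})
    return {"to": recipient, "payload": fake_encrypt(inner)}

def open_sealed(envelope: dict) -> dict:
    # Only the recipient, who can decrypt, learns who the sender is.
    return json.loads(fake_encrypt(envelope["payload"]))

env = seal("A", "B", "hello")
assert "from" not in env                # the server never sees the sender
assert open_sealed(env)["from"] == "A"  # the recipient recovers it
```

A server that never receives the sender field has nothing to log or hand over, which is the point of the design.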
If it was the other way around, they'd have to give that info to the (US) court. Same as any other US-based business, it's not optional, they can't ignore such requests, they can't lie, otherwise they'd be placing themselves in legal troubles for a random nobody that happens to be using their product. So, when I see this page, I fully believe them: https://signal.org/bigbrother/. If I didn't, my first step would be to look up those court cases from alternate sources.
The point is that you don't have to trust them because the client (where the relevant cryptography is performed) is open source and the fact that my links point to signal.org is completely irrelevant, those blog posts are just ways to advertise facts that are freely verifiable. You can read the source code to check the implementation of sealed senders or how the social graph is handled.
The NSA can hack into Signal's infrastructure, and what they will be able to gather is the same information provided by Signal in reply to subpoenas (the whole list is here: https://signal.org/bigbrother/), because everything else is end-to-end encrypted.
Not sure if metadata is more important but ok. Signal has launched support to use nicknames instead of phone numbers although it's true this took a long time.
I think Telegram is very wide-spread for running large group chats and communities, a use case that Signal is not interested in. Personally, I want my chats end to end encrypted and I'm grateful to Signal for pioneering this and inspiring Whatsapp, facebook messenger and others to adopt the same.
> Signal has launched support to use nicknames instead of phone numbers
You will still need a phone number to sign up for Signal. Signal still knows your phone number, you just hide it from your contacts. To me this only makes it even more suspicious.
Yeah, basically both super duper encrypted privacy oriented services want your phone number.
Sorry, but that's not privacy. I don't care what they do to encrypt your messages, they are still tied to me, which makes the super duper encryption pointless.
I agree with you that Telegram does not even have E2EE and that's bad.
But in this thread, GP was just talking about metadata. The goalpost here is metadata. GP particularly mentioned that Signal "fixed" the phone number issue and I just want to note that currently Signal isn't any better than Telegram in this aspect.
When they-them-those know who you are, knowledge of your full attack surface is the better way to compromise you, because it makes the first step of the compromise look like no compromise at all. (The attack surface is broader than people generally consider, as it should include over-the-shoulder attacks, XKCD's wrench attack, etc.)
The success of such tactics is easy to see in the many, many comments right in this thread telling us Signal protects metadata because usernames are now a feature. Guys, Signal still has the metadata: the *services* are the topic of discussion here, not other users.
Not sure if you've got your sequence of events straight. End-to-end encryption was added to Whatsapp after it was bought by Facebook, before the co-founder of Whatsapp left to found Signal.
It's not as simple as Signal inspiring Facebook and Whatsapp, the sequence of events happened in reverse order.
Whatsapp started encrypting messages after significant security issues in ~2012
It was purchased by Facebook under initially some administration separation terms in ~2014?
In 2016 it added e2e encryption. If I recall this was controversial because it limits fb ad potential on users.
I guess what I'm trying to say is that the timeline seems to me to still be pointing towards <<e2e came from Whatsapp not FB as an initiative, even if FB owned Whatsapp at the time.>>
Again, you're not factually wrong, but I hope my restating of the timeline makes it clearer why I think your reverse-order point doesn't tell the same story.
Signal is still a significant improvement in security and privacy over SMS, Telegram, Discord, X, Whatsapp... It achieved the level of privacy that solutions like PGP tried and mostly failed to achieve for decades. Being tied to a phone number was part of the convenience of their solution. Allowing for nicknames now, might improve on the metadata leakage problem slightly.
I'm sure reactionists will immediately drop Signal because Elon the great said they should, without considering that it still might be their best solution to communicate privately with their friends and family. But X makes *all* of its money from collecting a lot more than metadata from its users, Tesla collects driving data and metadata from all of its customers, and Grok trains its AI on all of the data collected from X, Tesla, and other sources without asking if users want to opt out of those training datasets. So I'm not sure Elon has a leg to stand on in this conversation.
Did it really, or did he just jump on the Signal train after it had already left the station? Can you tell the difference? I'm sure he drove some users to the platform, but the technology and the platform have mostly spoken for themselves.
But Signal specifically used phone numbers to leverage the already existing social graph on your phone. The numbers were never transferred or stored by Signal. You can literally see what information they gave when they were subpoenaed: https://signal.org/bigbrother/central-california-grand-jury/
If you read the PDF, the phone numbers are in the subpoena as the key for what's being requested, so yes they clearly were stored in a way Signal can access.
Your account phone number, yes. But, most importantly, Signal doesn't store your social network in its servers in a way that it could give authorities the phone numbers of all the people you communicate with. Or, worse, the times and dates of those conversations.
So all it can tell authorities is that a person with x phone number uses Signal and still uses it.
As of a few weeks ago, you can hide your phone number on Signal, and it's even the default, even for existing accounts. You can even opt in to disable discovery by someone who already knows your number.[0]
> To me Telegram is going after WhatsApp, not Signal.
They mention Signal by name in the referenced post.
It's super complicated to use.
And you need an existing Telegram account to actually handle that cryptocurrency to buy these pseudo-numbers outside of the telephone namespace.
Guess what you need to register those. An actual working phone number.
I'm not sure what the behaviour is now but certainly the default a while back was that anytime someone in your contacts joined Signal you would get a message. Imo this was a crazy behaviour that immediately told you something about certain people in your contacts in a very visible way (that they were on Signal). I couldn't tell from the settings whether this was now off by default.
Telegram has done and may still (I don't know personally) do the exact same thing. Stated noncombatively and without assumption about what argument you may or may not be making, but seems relevant to mention in this context. Astonishingly bad behaviour no matter which app!
> Telegram doesn't enable E2E out of the box because it wanted users to compare some image matches in person (akin to the idea of PGP sign parties I suppose?)
That makes sense only in the context of e2ee to avoid MITM attacks, unless I'm mistaken.
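Right, the "compare some image matches in person" step is out-of-band key verification, and it answers the MITM question asked above. A sketch of the general idea (this is not Signal's actual safety-number algorithm, just an illustration under simplified assumptions): both parties derive the same short fingerprint from the pair of public keys and compare it in person.

```python
import hashlib

def fingerprint(key_a: bytes, key_b: bytes) -> str:
    """Derive a short, human-comparable fingerprint from two public keys."""
    # Sort the keys so both sides compute the identical value regardless
    # of which key they list first.
    digest = hashlib.sha256(b"".join(sorted([key_a, key_b]))).hexdigest()
    # Chunk the hex digest into short groups that are easy to read aloud.
    return " ".join(digest[i:i + 4] for i in range(0, 20, 4))

# Placeholder key material; real clients would use the identity keys
# exchanged during session setup.
alice_view = fingerprint(b"alice-pubkey", b"bob-pubkey")
bob_view = fingerprint(b"bob-pubkey", b"alice-pubkey")

# Matching values mean the server did not substitute a key in transit.
assert alice_view == bob_view
```

If a MITM had swapped in its own key, each side would be hashing a different key pair and the fingerprints would not match, which is exactly what the in-person comparison catches.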
It would make more sense if it were about reproducible builds. Shouldn't we all just switch to Matrix clients, which use the Olm and Megolm cryptographic ratchets and don't rely on one server/one entity?
To follow on, it turned out not only did that stunningly milquetoast comment I made get flagged, but after I made it someone tried to DDoS everything associated with me.
I mean, do these people not know their Shakespeare?
It's completely fair to criticise Signal for its weird decisions (like that time they stopped publishing part of their code for a while to surprise everyone with a cryptocurrency scheme).
However, when this criticism comes from an insecure competitor that was forced to pay back immense amounts of money for misleading investors about crypto, I wouldn't take that at face value.
Signal is mostly fine with some weird/bad decisions. Telegram is worse in every single way.
All of that said, I've never seen any of the cryptocurrency shit show up in my phone. Is it geofenced or something?
It's exactly as hidden/opt-in as the Stripe crypto settings, where Stripe was completely shit on in 20 comments about scams, scamming, and scam currencies.
Downvoters hate having their double standard revealed.
This issue is sadly all just identity politics. Telegram is frequently associated with fringe groups, conspiracy theorists, anti-vaxxers, and "the right". Signal is pushed by the sort of lefty-liberals who quit Twitter, by journalists, and more associated with the mainstream media.
This is not to group everyone, I realise there are communities that cross that divide, and this is no judgement of people using either. But I think this divide will continue as the political trends continue. Both sides believe they are right, both more than they perhaps should given the evidence. That said, I'll stick with the one the cryptography/security nerds are using, not the one they think is a honeypot.
I've used it since 2014 and have found it very performant (more so than Signal, WhatsApp, Messenger, or Viber).
Also, it adds many useful features that other messengers didn't always have and many still don't have, for example Saved Messages, Scheduled Messages, Spoiler Messages, Reply to Message, message formatting (bold, monospace, etc), just to name a few off the top of my head.
Signal has "Scheduled Messages, Spoiler Messages, Reply to Message, message formatting (bold, monospace, etc)". I don't know what Saved Messages are, maybe it doesn't have those.
I don't get this. I'm in the EU and nearly everybody I know has Telegram.
Telegram has a huge advantage versus WhatsApp: it's not Meta. Then the Telegram UI is really excellent.
When you tell people all your friends and family are using it and that's it's not from Facebook, they usually install it on the spot. Then they're hooked.
I don't mind WhatsApp being Meta but Telegram is more lightweight and UI is far superior (for instance, ability to edit messages). Unfortunately, most people still use WhatsApp, you can't really avoid using it.
"Same in the EU" - but you're actually making the opposite statement to the GP (GP said "everyone I know uses Telegram" and you said "nobody uses Telegram").
Oh and I also use it for messaging sometimes. But my main use case is participating in various groups, like in a forum way. And my peer group does the same and I have not met a single person that uses telegram mainly for messaging. Most also have signal or whatsapp for that.
For Russian language content it feels a lot like pre-enshittification internet. You get blogs on all possible topics without ads or "Algorithm". Just read what you subscribed to, in whatever order you want.
I would never trust it with any confidential information though.
This is the most out of touch comment. Everybody is running after telegram, they are innovating all the time. If you want to see what Whatsapp will look like in a year, use telegram now.
Life will be much more boring if we cannot find humour even in the most boring things.
So, it's good that the personal involvement of the illustrious Elon turns even obvious political influence operations into a circus with talking horses and scary clowns.
It's not just opt-in, it's a non-default option you have to actively seek out and enable with every new conversation you start. So yes, by default, without additional steps taken, telegram is not e2e encrypted.
I'd guess telegram can be secure if used correctly but the fact that their desktop client doesn't support secret chats at all feels weird. It has been one of the most requested features but they seem to have no interest in implementing it and have closed the issue on github.
The non-standard crypto was also problematic, at least initially. Furthermore, as outlined, the claims on reproducible builds vis-a-vis Signal are debatable - both provide them on Android, neither satisfactorily on iOS.
They can't be compared because Signal's criticism of Telegram is legitimate and warranted, Telegram's criticism of Signal isn't. Telegram isn't even an encrypted messenger.
I don't trust either side and having a cryptography expert located in Baltimore, MD trying to prove that the other side is wrong seems just as off as a Russian owner trying to prove the opposite.
In the end it doesn't matter if you are using a smart phone from Apple or Google as your soft-keyboard is such an easy target there is no need to decrypt anything.
Yes, afaik the whole point of all this tech is that it is compromised by design and intended to allow agencies/governments/corporations fine grained access to each individual.
I use both these apps fwiw. I'm under no illusions that anything is really private online.
Stating that Telegram is unencrypted is incorrect. It offers optional end-to-end encryption; however, by default, it uses encryption in transit. Of course, there is a trade-off between convenience and encryption, and having access to all messages on all devices is beneficial.
However, technicalities are not the point: both Ukrainians and Russians trust Telegram—despite being at war. Telegram has managed to distribute its servers and legal presence across multiple countries, making it challenging for courts to track. This provides a level of security that American-based entities cannot offer.
There is a great discussion by Pinboard on why telegram is more safe, and it is preferred by activists in Hong Kong: https://twitter.com/Pinboard/status/1474096410383421452 "There's a disconnect between critiques of Telegram and its practical use that have made me uneasy about joining technical pile-ons around how it's not really encrypted messaging. Let me use the example of Telegram use in the Hong Kong protests..."
If anything, all this animosity toward telegram has always been a bit suspicious to me. But anyway, assume your cellphone to be extremely unsafe.
Optional E2E encryption is effectively not encryption. The default encryption in transit is useless if you don't trust the server. The argument about convenience is ridiculous. Whatsapp provides better default encryption than Telegram. The (probably deliberate) flaw in Whatsapp encryption is the default backup method, but E2E encryption is enabled by default with no loss in convenience. Telegram is not encrypted by default, and their encryption scheme has been shown to have rookie level mistakes in it. Just because it's "preferred by activists" doesn't mean said activists have any idea about secure communications.
I'm not saying that you should jump on Signal (or anything else). I'm saying Telegram is almost certainly broken. Maybe maliciously, maybe accidentally, but almost certainly.
For reference, I don't use either Signal or Telegram anymore, but Telegram sets off so many alarms I'd steer clear of it.
Having in-transit encryption in your communications software is kindergarten level stuff. It's the most minimum of hurdles to pass, so it's not worth mentioning anymore. Thus encryption always refers to E2E encryption in these discussions.
In my opinion this has started as part of Rufo's campaign against Katherine Maher (see https://news.ycombinator.com/item?id=40341993), then Dorsey and Musk boosted that article because it aligns with their political views. Durov decided to add Telegram vs Signal angle in his post.
This seems organic to me. I was a security researcher, and for years I've been telling anyone who would listen that Telegram is not as secure as their marketing says it is, while Signal is.
The reasons why are already pretty well listed in the thread above. Telegram's E2EE is hand-rolled and not the default. Signal's E2EE is always on, and it's _the_ industry standard protocol. (Outside of iMessage, I believe the Signal protocol is used on every well-adopted messaging service which offers E2EE chats.)
People also aren't aware that phone numbers and usernames are tied on Telegram. When a former friend of mine joined Telegram, I searched his username and found his _very_ explicit Reddit account. I'm surprised this identity-compromise issue isn't mentioned more often.
You can add me to the list. There is no good reason to pick Telegram over Signal, unless you don't care about security. It DOES have more sticker packs.
>People also aren't aware that phone numbers and usernames are tied on Telegram.
But you can, under Privacy & Security, switch Phone number visibility to "nobody".
You can also change your username anytime you want to. A new feature called "anonymous numbers" allows you to purchase and use virtual numbers (they start with +888).
I think the bigger problem here is that Telegram does not have E2E encryption enabled by default, which is definitely suspect.
That was my impression too, that this was more of a thread to slander Telegram than anything.
The main leg that Signal has to stand on is it uses standard encryption, but it has all kinds of shady components like it used to require sharing phone number to contact someone, and the cofounder Moxie launched some MOB crypto scam which went to 0 and he has now quit the project too.
As I recall, they went out of their way to hide that they were working on that shitcoin integration as well: Signal's open source releases went dark for a year or so without explanation, and then it turned out to be because they didn't want people to know about MobileCoin. Compromising the transparency of the project to obfuscate the development of a feature that they surely knew would be unpopular isn't a good look.
Perhaps you're right, and all of them have the "greater good" intentions, but it's ridiculous how their "regular reminders" popped up in the same 24h interval
It's getting harder and harder to tell because bot activity has gotten so good, but Matthew Green has been around a while and is a genuine old school crypto dude. There is a group of people who just believes that crypto and privacy are good things and want to promote them.
The reason it gets harder is because you can spin up a handful of "expert" accounts shilling for this or that privacy VPN or bitcoin scam etc. So it's hard to just pull up a list of statements and know whether it has any weight. In this case, Matthew Green has a lot of weight because I've followed him for a while and I know what he's about.
yes, but that's the point: it's not a technical problem, it's an institutional problem. Facebook is pure surveillance capitalism. They live by scooping your data. E2EE is hardly a concern or a solution.
While metadata can leak a lot about conversations, it doesn't leak nearly as much as plain-text data of conversations. I've argued for years that companies have an incentive to do E2EE on private messages so they don't have to be held liable or have to get involved in a lot of investigations if they don't have any access to the info. Telegram has access to the plain-text data of the conversations, as far as I know. Signal, WhatsApp, and Messenger (more and more), seem to not have much, if any, access to the plain-text data of conversations.
But the Meta companies are lying about E2EE, I don't know? Signal has seemed to me to be the company (org actually, nonprofit) that cares the most about privacy in terms of intentions and implementation.
Facebook actually has had optional E2EE with the Signal protocol since at least 2016 (in my experience), as "secret chats". This puts it on a better security standing than Telegram.
Yes, but Facebook (and others) uses the Signal protocol in its optional E2EE chats, because it has withstood the test of time. But Telegram uses its custom protocol (MTProto2) in its optional E2EE chats, which has a host of problems and has not withstood the same weathering.
It's ultimately a distinction without a difference, as it is an appeal to the morality of the corporation behind the product, which can change based on their incentives. E2EE protects against that.
[0]: https://signal.org/bigbrother/cd-california-grand-jury/