> The American Civil Liberties Union announced Tuesday that Open Whisper Systems (OWS), the company behind popular encrypted messaging app Signal, was subpoenaed earlier this year by a federal grand jury in the Eastern District of Virginia to hand over a slew of information—"subscriber name, addresses, telephone numbers, email addresses, method of payment"—on two of its users.
> ... “The only information responsive to the subpoena held by OWS is the time of account creation and the date of the last connection to Signal servers,” Kaufman continued, also pointing out that the company did in fact hand over this data.
Yes, it's not strong proof, but it should be taken into account when comparing the goals and motivations of organizations developing various other communicators.
The organization behind your communicator app could be in the business of gathering data about you and selling it in various forms (Facebook, Google), in the business of selling hardware and add-on software services (Apple), or in the non-business of trying to provide you with private communications.
The Signal app itself is open source, as are various pieces of the tech stack. You can audit for yourself what is being sent and how their protocols work. The protocol itself has won awards for its security and elegance.
There are a lot of good things to say about Signal, and you can easily find them all. They have added some annoying or less-than-ideal features that are opt-out instead of opt-in, but they're not sacrificing privacy for them.
Your phone is running an APK, which is a bunch of signed code. You don't have the keys to sign such an APK yourself, but you can get tools that will tell you exactly what's inside the one you have.
I believe the Java source on GitHub is designed to support a reproducible build, where you get the exact same Java binaries out as Signal's own builders did; you can then compare those to confirm that the Java code in your APK matches a specific Git checkout.
The media files (e.g. images, labels) are just straight binary copies so that's easy enough.
However, there is native code to make stuff like video calls work, and when I last paid attention there was no reproducible build for that component. So you could imagine that the native video-call code is actually a secret backdoor or something.
Still, it's years ahead of anything else that actually has users.
The server is almost entirely closed source. There's an open-source server you can host, but it's widely believed to be nowhere close to the one they actually run.
I think only Matrix is fully open and p2p.
> should be taken into account when comparing the goals and motivations of organizations developing various other communicators.
Business or funding models can change for both for-profit and non-profit organizations. Especially as people move to options that are believed to have better user privacy, the fact that an organization does not sell or monetize collected user data today does not indicate what it will do tomorrow.
Unless users have strong evidence that companies are not collecting and/or monetizing this information (which, as the OP pointed out, there is for Signal thanks to the subpoena), the "billboard" approach of promising user privacy via marketing and PR is shallow at best, for non-profits as well as for-profit companies.
That said, at the end of the day you have to trust SOMEONE if you want to use digital communications. And there's certainly a difference between Facebook and GPG email encryption.
It's just a matter of balancing convenience and privacy for your personal use case.
I'm asking specifically because I remember Private Internet Access, a VPN provider, also being tested in court in the past, and because of this I've chosen to trust them despite their falling under Five Eyes jurisdiction.
In short, the ACLU helped them lift the gag order, and the blog post itself shows the legal documents. The documents show exactly the data returned (account creation and last access, in Unix milliseconds). Only the phone numbers are still redacted.
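For anyone reading those documents themselves, Unix millisecond timestamps are easy to sanity-check. A minimal sketch (the timestamp value below is made up for illustration, not taken from the filings):

```python
from datetime import datetime, timezone

def from_unix_millis(ms: int) -> datetime:
    """Convert a Unix timestamp in milliseconds to an aware UTC datetime."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

# Hypothetical value, not one from the actual court documents:
print(from_unix_millis(1464716400000))  # a moment in mid-2016
```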
I didn't try other servers/cities to get more information.
My default choice is Sweden, since they have the most lax copyright laws in the world, so subpoenaing any Swedish server is going to be tough.
They also offered me an unbeatable discount.
This is a fairly common practice and my understanding is that every major and most minor VPN services offer this.
Mullvad also does this. Here's their server status page, so you can see your options: https://mullvad.net/en/servers/
Source: Another convert from PIA to Mullvad post Kape.
Unlikely. Signal and the ACLU were the ones who filed suit to allow them to disclose the terms of the warrant in the first place.
It would be an incredibly expensive and risky move for them to do so if they knew that the judge could force them to reveal that they've been turning over more detailed user data in secret.
Not to mention that it would have amounted to perjury.
I chose NordVPN because you can connect through any server in the world, and they offered me a good discount.
I also read somewhere that their CEO did something compromising to bring in more profits.
Maybe I will think about buying Mullvad subscription.
That describes literally most of the VPN service offerings out there.
Can they keep the money?
Is there not a suspicion that Google, another US-based corporation, may have some agreement with American national security to supply malicious APKs to individual targets via the Play Store? Having Signal’s signing key would allow the state to present that custom-targeted APK as an ordinary Signal version update.
If some government wants to get you they will get you, probably via your operating system... Signal won't help you. If that's your concern then you gotta stay off the internet to be honest!
Telegram, for all intents and purposes, is about as secure as using Facebook. The best you can do with Telegram is hope they don't sell out or get compromised at some point in the future, because all your private communications are stored on their servers forever. Telegram does have "secret chats", which, from what I can gather, don't even work for group chats, only one-to-one messages.
My general advice is to treat Telegram like a new Facebook if you have to use it: assume everything may be read by everyone, and don't treat it like it's private and secure.
For "text messaging" friends and family use Signal. Everything is end-to-end encrypted by default, so you know nobody is collecting your data.
The way I've been presenting it to people is that Telegram can look at more data than WhatsApp can. But WhatsApp will use the data they have more than Telegram will. That's the tradeoff.
And yes, obviously Signal is more secure than both of them but I've been steering non-techies to Telegram because of usability, backups, cross-device history etc. As usual, everything is a tradeoff, but if people were happily using WhatsApp up until now, and also use Gmail, Telegram is not worse than those.
I think you should look at the odds of that "can" turning into a "will" over time. After an acquisition, or a change in business fortunes, or a change in leadership...
According to his own numbers, Durov's entire net worth can only sustain Telegram for about a decade.
> A project of our size needs at least a few hundred million dollars per year to keep going.
Is it just me, or does on the order of $1 per user sound like a lot?
I’d estimate I cost Telegram over $10/yr.
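For scale, a back-of-the-envelope check. Both numbers here are assumptions: the cost figure is a midpoint reading of the "few hundred million dollars per year" quoted above, and the user count is Telegram's publicly reported roughly 500M monthly actives around early 2021:

```python
annual_cost = 300e6  # USD per year (assumed midpoint of "a few hundred million")
users = 500e6        # monthly active users (assumed, early-2021 public figure)
print(f"${annual_cost / users:.2f} per user per year")  # prints "$0.60 per user per year"
```

So "on the order of $1 per user" is roughly consistent with Durov's own figure.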
But what I do like about Telegram is their good user experience and Bot API developer experience. It's soooooooooo fucking good I'm telling you. It just works, be it on web, mobile, and desktop.
At this point who the fuck knows if Durov can be trusted (hell we all wish, right, no harm in that). But regardless of that, at the end of the day I'd be willing to admit he's a fucking genius when it comes to Telegram's UX and DX.
It's a threat model decision. If you're someone who wants privacy from the US or other Western governments (think Antifa on the left side, or corona-deniers, qanons and other conspiracy nuts on the right side), Telegram is the best option since the Western governments can't hold them accountable. If you're a Russian or Chinese dissident, or opposition in countries aligned with them (e.g. Serbia) Whatsapp and Facebook are your best bet.
That's not true of Telegram/WhatsApp/etc...
I think it's wise to remember that what happens on the other "end" is outside of your control.
If the other person in the conversation stores chat backups unencrypted you're still at risk, and there's not much you can do about it.
If someone can read it, then they can copy it.
Most people will not archive all the texts they get in the moment; it's only after some fallout or event happens that there's motivation to dig up old messages.
Wickr does "screenshot notification" somehow, so now I occasionally get sent photos taken of phones showing "private" Wickr messages...
As others are pointing out, the reason a lot of people trust Telegram and Signal is that corrupt governments don't!
The few times the "kimono" has been opened, both Telegram and Signal were doing what they said they were doing -- even if they have different approaches.
This isn't like Facebook, who just lie and have been caught doing so.
What we do know is that programs like Telegram have to store data about users on their servers, by design. A big difference between the two projects is that Signal is carefully designed to minimize the amount of data the service needs to operate; it's why identifiers are phone numbers --- so it can piggyback on your already-existing contact lists, which are kept on your phone.
By contrast, other services store, in effect, a durable list of every person you communicate with, usually indexed in a plaintext database.
Yes. Ultimately we have no choice but to trust trust itself.[a] That said, if the OP were a non-technical friend asking me the same question, I would respond more or less like this:
"Of all the widely used messaging services, Signal is the only one known to be designed to minimize the amount of user data needed to operate, and all indications are that they are operating as designed[b], so Signal is likely your best choice today if privacy is your main concern."
edit: after a quick algolia search, it has indeed been posted much more this year than years before.
Does that help in any way to verify that they do not store data on their servers?
Someone please correct me if I'm wrong.
Edit: That being said, I believe they could still record IPs, as well as the destination and timestamps of each message.
It protects Signal from hackers or a malicious datacenter provider at best.
> SGX enclaves also support a feature called remote attestation. Remote attestation provides a cryptographic guarantee of the code that is running in a remote enclave over a network.
> Originally designed for DRM applications, most SGX examples imagine an SGX enclave running on a client. This would allow a server to stream media content to a client enclave with the assurance that the client software requesting the media is the “authentic” software that will play the media only once, instead of custom software that reverse engineered the network API call and will publish the media as a torrent instead.
On the contrary, if you answer the same to WhatsApp, it plainly refuses to work. But it has actually created an account on their servers, and from then on you appear to your contacts who do use WhatsApp as another WhatsApp user, which invites them to write to you there even though you cannot receive their messages. To fix this, you have to find the option in WhatsApp to delete your account.
Is (b) achievable by all users who have this concern?
As of mid-2016, and trusted as much as you feel like trusting something attested in a court of law, Signal stores: a bool (is this phone number a user) and two ints (epoch of signup, epoch of last transmission).
See the full explanation at https://signal.org/blog/private-contact-discovery/ (starts part way down, with "trust but verify"). Or check the client source code yourself!
Telegram: I dunno; they're closed source, don't encrypt by default, and have shady ownership. I don't trust them at all, personally.
Also, the whole security model hinges on trusting Intel not to give its private keys to anyone, which is a big ask for any company. The NSA/CIA could likely get those keys legally via a FISA court order, or illegally via hacking and/or an insider.
 - https://arstechnica.com/information-technology/2020/03/hacke...
 - https://www.theregister.com/2020/06/10/intel_patches_sgx_aga...
The answer is that Signal includes an industry-leading attestation process using CPU security features.
It's true that if the CPU manufacturer is compromised that would compromise anything running on it, including attestation. But that's not really to do with Signal's implementation, and it is out of scope of the question.
> do we really trust signal? cause i see zero reason to.
Here's a reason: I use it every day and I'm not dead yet.
And it is convenient, because you can just switch your smartphone and still access all your chat history, without having to manually back up and restore.
But Telegram in general does not have a business model yet, so just assume that one day they will want (or have) to cash out.
Signal on the other hand is a non-profit foundation and pretty open on what they are doing. That creates trust for me.
Telegram's "secret chats" and Signal chats are end-to-end encrypted. The servers may still store metadata, and there is no way to tell whether they do other than joining them or letting a trusted third party verify it.
To check that end-to-end-encrypted message content cannot be decrypted via backdoors on their servers, you need to ensure they use proven encryption schemes and that the client encryption actually corresponds to those algorithms.
These are three very different companies with very different security processes and trust profiles.
In the case of Signal: if you trust that the source code they distribute is the same as the app available in the Play Store, then it's pretty easy to verify that the messaging data is end-to-end encrypted in a way that prevents Signal from having much metadata that they even could store. With "sealed sender", they don't even know who's talking to whom: https://signal.org/blog/sealed-sender/
There's the possibility that Signal could ship a different app in the Play Store, but that would require active malice to do in a way that would not be trivial to discover, and at some point you do have to trust someone. It's not impossible, but it's hard to imagine a world in which Signal is compromised but other links in the chain aren't. Quite frankly, there are far more easily corruptible or hackable links in the hardware/software stack that you use, so Signal would make a pretty inefficient target for someone who wants monetizable data.
i.e., an accidental divergence between the two would be more conspicuous
The point is that even if Signal permanently stored everything you ever sent them, then they wouldn't be able to read it.
- You can build the client yourself per Signal's reproducible builds, so actually, they could not ship an app different from the published source without it being immediately detectable
- You can validate the source code does not send any unencrypted data to Signal
- You can validate that your private keys used for encryption are stored locally on the device and not transmitted to Signal
Theoretically, anyone who has the corresponding private key could decrypt the message. So if your contact uses an unofficial client which does share their private key with a third party, then that third party could decrypt that message. However, by that point the app creator has compromised the device anyway, and could do something as naive as take screenshots of all of the messages in the background after Signal has done the work of ensuring the secure transmission of the message.
Note, I haven't actually done all that, because I do trust Signal. But I could if I wanted to. And obviously, this assumes that all the cryptographic standards used in Signal are still unbroken - but if they were, then you're screwed either way.
And, of course, one has the option to build the source and install that.
1. the protocol between client and server is set up in such a way that, even if Signal wanted to store interesting information, they could not access anything interesting (for example, messages), so they don't store anything since it would be useless
2. the app implements the protocol faithfully and this has been checked by people perusing the source code
3. the binary downloaded from the app/play store phone is compiled from the sources listed on github
I get how it might be done in theory but real life is complicated. Has anyone attempted to do this?
You compare the result you get from compiling the app yourself with what you downloaded from the Play Store. For iOS this might be harder.
This is mentioned elsewhere, but the answer is: reproducible builds.
You can take the Signal client source (which is available on Github), build an APK or whatnot yourself, then get the SHA256 hash or whatever and compare that to the artifact downloaded from the app store and validate that it's the same.
Has anyone done it? No idea!
You can though, decompile both your version and the app store version and compare them that way.
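To make the "compare them" step concrete: an APK is just a zip archive, and a naive whole-file SHA-256 will never match, because the store copy carries signing data under META-INF/. A rough sketch of an entry-by-entry comparison that skips the signature files (the function names are mine; Signal's official verification instructions use their own Docker-based build environment plus a comparison script, so treat this as the general idea only):

```python
import hashlib
import zipfile

def apk_digests(path: str) -> dict:
    """Map each zip entry name to the SHA-256 of its contents,
    skipping the signature files under META-INF/."""
    with zipfile.ZipFile(path) as z:
        return {
            name: hashlib.sha256(z.read(name)).hexdigest()
            for name in z.namelist()
            if not name.startswith("META-INF/")
        }

def diff_apks(a: str, b: str) -> list:
    """Entry names missing from one APK or differing in content."""
    da, db = apk_digests(a), apk_digests(b)
    return sorted(n for n in da.keys() | db.keys() if da.get(n) != db.get(n))
```

An empty diff means every non-signature file in your self-built APK byte-matches the store copy.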
1. downloading the binary
2. jailbreaking the phone to extract the binary (pretty sure this is necessary on iOS)
3. check the version of the binary, then compile the original sources of the version
4. ??? compare the two binaries; this is likely the most difficult part, as they won't be identical because of things like code signing (and build flags, timestamps, ...)
I know no one who does this.
There is a large group of people who do this sort of research, and some fraction of them do this research and actually talk about it or publish papers. If you could find a deliberate weakness in the security of an app like what we are talking about (or WhatsApp or iMessages) then you have just printed your own golden ticket to whatever mobile cybersecurity job you want for the next decade or two, so there is a bit of an incentive to publish if something like this was discovered...
My concern is not about my data being stored on their servers. My concern is about marketing data being sold to third parties in order to target advertising at me, just as when you leave "third party cookies" active in your browser. That is creepy and invasive. Would Zuckerberg ever do such a thing?
Now, concerning the other servers, it may be a problem, just like Gmail is a problem for everyone running their own email server. However, it's a much smaller problem and at least theoretically everyone who cares can escape the walled garden.
You can create a server for all your friends and you will always know exactly which information is shared with others and which isn't.
For Signal there is an open issue here for iOS 
and some documentation for Android 
Some nice work on this has already been done by Telegram
- read the source code and are satisfied that it's secure
- compiled that version of the code
- installed it on your mobile or desktop
You're still only as secure as the client on the other side of the conversation.
If that one is compromised (has not gone through the steps above), it could very well be sending all messages in clear text to a malicious party.
That doesn't mean I blindly trust them, only that despite seeing potential for abuse I judge that they have more incentive to be telling the truth than not.
Also check the comment by user faitswulff where they mention how they have been subpoenaed "and could only supply account creation time and last connection time".
>> There's no technical solution in any technology for preventing the other side being compromised, as far as I can see.
I don't know Matrix, but I can guarantee that it doesn't solve the problem of a compromised client obtaining the messages willingly sent to it.
If the NSA did have it backdoored somehow through the OS, it's a good bet they'd force LE agencies to use parallel construction to keep that information top secret.
That is why we really need open source hardware and OS's. A good (or even functional) open linux phone can't come fast enough.
* Magical amulets?
* Fake your own death, move into a submarine?
* YOU’RE STILL GONNA BE MOSSAD’ED UPON
The Google Play Services app/package? Heh...
It very much does matter if the server is malicious.
Please don't spread this harmful meme.
It is hard enough to get my parents to use a secure messenger. If I told them they needed to do a key verification process for every person they ever communicate with... they'd just go back to facebook messenger or sms.
I think it is completely reasonable for somebody to say "I don't care enough to worry about validating public keys" while also educating people like journalists about how to do that correctly.
Signal also offers to label contacts whose authenticity you could verify by another means.
Doing a video call with the contact can be a simple way to clear doubts, even if it is not a proper different channel.
Signal does have the capability to have a verification phrase displayed, which is generated from the session key. Reading that off can make the video more difficult to MITM, because then they'd have to morph the audio to match the phrase, and if it's done after the video is setup, morph the video as well. Not impossible, but difficult.
This uses the fact that the client on each side is open source and inspectable, so that each side knows that they sent only the public key that they generated on their own device.
PS: to answer your last sentence, Signal allows you to flag specifically contacts that you managed to verify. Which is technically equivalent to say that you verified that the public key is theirs.
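To illustrate the shape of the verification-phrase idea above: both sides hash the same key material in a canonical order, so they must arrive at the same short code, and a MITM who substituted either key would change it. This is a toy, not Signal's actual safety-number algorithm (which iterates a hash over both parties' identity keys to produce a 60-digit number); the function name and group sizes here are made up:

```python
import hashlib

def verification_digits(pub_a: bytes, pub_b: bytes, groups: int = 6) -> str:
    """A deterministic short code both parties can compute and read aloud."""
    # Sort so both sides hash the two keys in the same order.
    material = b"".join(sorted([pub_a, pub_b]))
    digest = hashlib.sha256(material).digest()
    # Turn successive 4-byte chunks into 5-digit groups.
    chunks = [int.from_bytes(digest[i * 4:(i + 1) * 4], "big") % 100000
              for i in range(groups)]
    return " ".join(f"{c:05d}" for c in chunks)

# Both sides compute the same code regardless of argument order:
assert verification_digits(b"key-A", b"key-B") == verification_digits(b"key-B", b"key-A")
```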
Indeed it is far from straightforward that merely doing a video call suffices to check the keys.
Signal famously uses a special protocol for secure key sharing through the server, which I have not studied.
But as said by another comment, there is no way around verifying explicitly the public key using an independent channel.
Metadata can be as damning as the actual message data, and in a lot of places you don't want the authorities to know that you are even communicating at all.
Can you elaborate on this? That's exactly what I'd expect of an app I compiled from source.
To be secure you would HAVE to build and install it from source.
But then again your OS could possibly inject code to get the keys. Or a keystroke logger may have been installed.
Edit: see discussion here: https://news.ycombinator.com/item?id=25690036
not allowed to use VPNs because of national security issues.
"social media misuse"
i remember last year this word "misuse" was used so much, which translates to criticizing the ruling dictator government. it still is.
here, a whatsapp group needs to be "registered" with the police.
and lastly more recently,
this is a reason why i never signed up for whatsapp, haven't joined signal, don't tweet or post on facebook. why? because of PII.
the danger is real and i am living it. people better realize it.
If the main danger is police scanning the phone for compromising material (without police spyware on it), then there are some ways to deal with it technically, by using services that don't leave a trace. Telegram for example has a "secret chat" function which won't save the messages, meaning someone scanning your phone later won't find them.
(which I heard is also a main reason for many people to join Telegram, because then they can chat with their affairs and not have their wives read it)
Then there are simply private tabs in Chrome or FF, from which you can use chat services without a trace (if the chat services are not cooperating with the police, or are decentralised by default; in that scenario I think I would use Matrix).
Anyway, you live in Kashmir?
I know of the conflict mainly from reading Shalimar the Clown, by Rushdie. Just curious about your opinion, if you know the book. I heard it was not well received in Kashmir itself?
I think it was very well written, but I don't know how accurate it is.
8 months or so ago I was stopped because it looked like I was "recording a video" on my phone, when actually I was. Took a slow turn, double-pressed the power button, and made a Pikachu face that I wasn't. Still, a couple guys around helped, or I was history.
No, I haven't read Rushdie. There's that whole demon-verses thing around him; he isn't liked.
The problem with Telegram, as with WhatsApp and Signal, is phone numbers. India has had a network analyzer at the ISP level for like 6-7 years, called "Netra", so all unencrypted traffic goes through it. Same for all encrypted traffic. This is the reason why I stopped using Tor: my traffic would show up differently from the rest, and that gets them suspicious quickly.
There is a lot of text written on the conflict, which actually goes back more than 500 years. Kashmir has been under foreign oppressive occupation for over 500 years continuously, and even today is split among 3 nations. It's not like the occupation won't affect the people.
I am trying to get people I know onto Matrix because there is no PII; I'm waiting for Dendrite to come out of beta so that I can set up my own server and such.
The joys of living in an open air prison.
Well, he did make many people aware of the conflict, which created attention, which results in Indian police having to give interviews to The Intercept, for example, which helps in some ways. Things would probably be darker without it.
What is your opinion on a political solution?
Do you think independence would work out (if your big neighbours would let you)?
My understanding is that the Kashmiri population is itself divided?
Anyway, hope you stay safe.
from what i have observed, and from facts, at least on the indian-occupied side, india spent like millions to convince entire generations about a local hero who happened to be pro-india.
this was a couple years ago.
yeah, so it's not like people are not protesting; we just don't see a point in conventional protests. kashmir has been fighting foreign invaders for over 500 years, so it's kinda in the system to oppose.
as a kashmiri, i see no alternative other than complete independence. there is no other way forward, whether that takes 20 years or 500. doesn't matter.
personally i think all three of these nations have to let go of kashmir, not for some altruistic reason but because of CPEC. india and pakistan "need" to pass through kashmir to access it. china cannot afford to jeopardize their project because of indo-pak squabbles; for them business is king, and the sooner things settle down, the easier.
for us kashmiris, we could practically live off of port fees for all the goods passing through our borders so yeah, i am hopeful
I would be more afraid that those big powers next to you don't want to let you go, exactly for that reason. They want to use your land.
But yeah, I hope that they can settle for that. Leave you in the middle. A buffer between them.
Well a man can dream. Right?
I don't think that actually detracts from the gist of your point, but I just wanted to point it out in case anyone was interested.
Just try the following trick:
- in a private browser window, open web.telegram.org
- enter your phone number, receive the code
- turn on flight mode on your phone
- now enter the identification code in your browser
- access your whole chat history while your smartphone can definitely not act as a source
Even WhatsApp is better in this regard.
Article in German: https://www.heise.de/hintergrund/Telegram-Chat-der-sichere-D...
That said, Signal does apparently support reproducible builds so that people can check that the apk matches what’s on GitHub (though this is more of a way to detect malfeasance on Signal’s part rather than Google’s)
Ah, right, there's also that.
(I work at Google, but not on Android)
As far as promising not to store your metadata, or promising not to deliberately give the gif service information about your account because they hate you, or promising not to store your contacts when you search for other friends with Signal, then yeah you have to just take their word for it. Though, they may over time look for ways to put some of those guarantees on the client side as well with some clever engineering, so you could prove it.
Which does not prove that, if the app sends your phone number or location (for example), they don't save it in a database.
Indeed, interesting question
CLI prototype. Can be generalized into a nice phone app.
After that, what data do you care about? Neither Signal nor Telegram is intended to provide complete anonymity. That is a much harder problem. For Mozilla, that would involve Tor. I don't think that Mozilla really has "servers" in the sense you mean.
What about Mozilla? What could they store?
(1) All Signal messaging is E2EE; (2) they don't store messages on their servers; (3) the client code is open source, and it seems like a good portion of the server code is open source.
Where I think Signal could go further on being the most secure, useful, and privacy-conscious messaging app/company in the world:
1. Open source ALL of the server code. They have something called Signal-Server (https://github.com/signalapp/Signal-Server) on their Github, but it's unclear if this is the server they use, or simply a server one could theoretically use to run a private Signal server.
2. Open source all server-side services/infrastructure code that doesn't compromise security in some way.
3. Better features. Signal is currently the most secure and privacy-conscious of the messaging apps, but solidly the worst overall user experience. It's not that it's bad, it's just that the other apps are much better. People like gifs and Giphy and emojis and a fast-feeling interface. This is important, because it's hard to be a privacy-conscious individual when all your friends want to text on other apps. At least in my social circle, Signal is still the thing that people jump over to when they want to be extra super sure they're not leaving a paper trail, but not the default messaging app they use.
4. Introduce a user-supported business model. This probably makes a lot of people uneasy, and while I appreciate the current grant and donation-based business model (the Wikipedia model), that model comes at great cost of efficiency. By operating effectively as a non-profit, you are inherently in a less competitive position relative to your competitors (the best product and engineering people are more likely to go competitors who can pay more), and you're persistently in fund-raising mode (again, see: Wikipedia). There are lots of ways to skin this cat, maybe the easiest is to ask power users to pay like $5/mo. Or just give people the option to pay with absolutely zero obligation. Some non-zero cohort would inevitably take them up on this.
Most of these suggestions, of course, especially 1-3, are very very hard and come at an enormous cost. Building in public as an open source business seems to massively slow things down and introduce a huge amount of community-management overhead. That said, I'm sure there are ways to manage/mitigate those costs.