Ask HN: How do we know Signal or Telegram don't store our data on their servers?
264 points by smoqadam 10 months ago | 236 comments
I'm just curious: how can we trust companies such as Signal, Telegram, and Mozilla when they claim they don't store and sell our data?

Thank you

Signal’s claim to fame here is that they were subpoenaed in 2016 and could only supply account creation and last connection times:

> The American Civil Liberties Union announced Tuesday that Open Whisper Systems (OWS), the company behind popular encrypted messaging app Signal, was subpoenaed earlier this year by a federal grand jury in the Eastern District of Virginia to hand over a slew of information—"subscriber name, addresses, telephone numbers, email addresses, method of payment"—on two of its users.

> ... “The only information responsive to the subpoena held by OWS is the time of account creation and the date of the last connection to Signal servers,” Kaufman continued, also pointing out that the company did in fact hand over this data.


I think this discussion should also mention that Signal is a non-profit organization, dedicated to enabling secure and private communications.

Yes, it's not strong proof, but it should be taken into account when comparing the goals and motivations of organizations developing various other communicators.

The organization behind your communicator app could be in the business of gathering data about you and selling it in various forms (Facebook, Google), in the business of selling hardware and add-on software services (Apple), or in the non-business of trying to provide you with private communications.

I prefer to look at the history of who founded and continues to run the Signal Foundation... Moxie Marlinspike. Moxie has a long history of improving security in all kinds of tech and fighting for privacy.

The Signal app itself is open source, as are various pieces of the tech stack. You can audit for yourself what is being sent and how their protocols work. The protocol itself has won awards for its security and elegance.

There are a lot of good things to say about Signal, and you can easily find them all. They have made some annoying or less-than-ideal features opt-out instead of opt-in, but they're not sacrificing privacy for them.

Curious. Is there an easy way to validate the code running on my phone is exactly the same code available on Github (here: https://github.com/signalapp) ?

"Easy" is a moveable feast.

Your phone is running an APK, which is a bunch of signed code. You don't have the keys to sign such an APK yourself, but you can get tools that will tell you exactly what's inside the one you have.

I believe the Java source on GitHub is designed to support a reproducible build: you get the exact same Java binaries out as Signal's own builders did, so you can compare them to confirm the Java code in your APK matches a specific Git checkout.

The media files (e.g. images, labels) are just straight binary copies so that's easy enough.

However there is native code to make stuff like video calls work, and when I last paid attention there was no reproducible build for that component. So you could imagine that somehow the native video call code is actually a secret backdoor or something.

The source has a script that builds in a Docker container from a bunch of previously built binaries, allegedly built with keys that are secret, and then just outputs "the APKs are the same", and you have to believe that ¯\_(ツ)_/¯
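Conceptually, the verification boils down to building the APK yourself and comparing archive contents while ignoring the signature files that only Signal's secret keys can produce. A simplified Python sketch of that comparison (not Signal's actual apkdiff tool, which handles more edge cases):

```python
import zipfile

def apk_contents_match(apk_a: str, apk_b: str) -> bool:
    """Compare two APKs entry by entry, skipping META-INF/ where the
    signature files live (only the official signer can reproduce those)."""
    def entries(path):
        with zipfile.ZipFile(path) as z:
            return {
                name: z.read(name)
                for name in z.namelist()
                if not name.startswith("META-INF/")
            }
    return entries(apk_a) == entries(apk_b)
```

If the locally built APK and the Play Store APK agree on everything outside META-INF/, the shipped code matches the source you built from.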

Still, it's years ahead of anything else that actually has users. The server is mostly closed source. There's an open-source server you can host yourself, but it's widely believed to be nowhere close to the one they actually run.

I think only Matrix is fully open and P2P.

For anyone looking for the script: https://signal.org/blog/reproducible-android/

I don't believe so directly, but you can build it yourself and put it on your phone. You'll still be able to use your account and their service.

Someone should file a PR with a git hash (and some form of proof) of the currently running app?

What could possibly constitute suitable proof? Anyone tampering with the app can edit it to show the same information.

> a non-profit organization

> should be taken into account when comparing the goals and motivations of organizations developing various other communicators.

Business or funding models can change for both for- and non-profit organizations. Especially as people move to options that are believed to have better user-privacy, the idea that they do not sell/monetize collected user data today does not indicate what they will do tomorrow.

Unless users have strong evidence that companies are not collecting and/or monetizing this information (which as the OP pointed out, there is for Signal as found in a subpoena), the "billboard" approach towards promising user privacy via marketing and PR is a shallow one at best for non-profits as well as for-profit companies.

Whenever I see an ad flaunting privacy guarantees, I ask myself "How would a honeypot for gathering users' information be advertised?" Exactly the same way.

That said, at the end of the day you have to trust SOMEONE if you want to use digital communications. And there's certainly a difference between Facebook and GPG email encryption.

It's just a matter of balancing convenience and privacy for your personal use case.

Is it possible that they could in fact produce this data but were prevented from publicly saying so due to a gag order?

I'm asking specifically because I remember Private Internet Access, a VPN provider, also being tested in court in the past [1], and because of this I've chosen to trust them despite them falling under Five Eyes jurisdiction.

[1] https://torrentfreak.com/private-internet-access-no-logging-...

Signal has blogged all the answers to these questions.


In short, the ACLU helped them to lift the gag order, and the blog itself shows the legal documents. The documents show exactly the data returned (Account creation and last access in Unix millis). Only the phone numbers are still redacted.

PIA used to be my go-to, but I immediately ceased using PIA after the 2019 acquisition by Kape Technologies, which has a rather foul track record.

Thanks for the heads up. Really excited to see a lot of folks here agree on Mullvad as a good alternative.

What did you switch to?

I’ve been using Mullvad for the past few years and I’ve no complaints. The fact that the recent Mozilla VPN is based on Mullvad makes me more confident in my decision.

I switched from PIA to Mullvad too for that reason and have absolutely no complaints. Wish I had gone with them first!

Mullvad has contributed to WireGuard, which gives me confidence in their service. Also, the experience of creating an account without giving my name and email address is the best. The only thing left is that the billing entry (I use a credit card) has the prefix VPN*

Do you get decent speeds from Mullvad? Friends were reporting that they moved back to PIA due to worse speeds on Mullvad. That and the lack of a chrome extension (which is occasionally useful) has prevented me from switching away from PIA even if I'm unhappy about being in business with Karpeles and Kape.

I just fired it up and connected to an endpoint in my city. I have a 600 Mbps download pipe and hit 150 Mbps with default settings and about 275 Mbps with WireGuard selected in the Mullvad app. Switching to TCP in the Mullvad app didn't change my result enough to notice.

I didn't try other servers/cities to get more information.

I am getting great speed out of Mullvad, usable for everything except latency-critical gaming. Even video streaming usually works fine. I would say I get approximately 3/4 of my normal speed when using the VPN.

Did you consider NordVPN? I like the fact that I get to login from anywhere in the world.

My default choice is Sweden since they have the most lax copyright laws in the world, so subpoenaing any Swedish server is gonna be tough.

They also offered me an irresistible discount.

> I like the fact that I get to login from anywhere in the world.

This is a fairly common practice and my understanding is that every major and most minor VPN services offer this.

I never knew. Is it available on PIA and Mullvad?

It was on PIA, when I used them (stopped when I heard about the sale to Kape), so I'd assume it still is.

Mullvad also does this. Here's their server status page, so you can see your options: https://mullvad.net/en/servers/

Yes. It is available on both.

Source: Another convert from PIA to Mullvad post Kape.

NordVPN may have good intentions but they were hacked.


Honestly, hack is a big word for what it really was... no user data stolen and they closed the server. I still use it and would recommend it, just for the speeds and the easy interface.

Any thoughts on airvpn.org?

I can recommend protonVPN

Like many others here, Mullvad. I've also been experimenting with ProtonVPN because it was offered as part of a bundle with ProtonMail.

AlgoVPN is really so easy that it's hard for me to justify using anything else.

not the GP but I switched from PIA to Mullvad

> Is it possible that they could in fact produce this data but were prevented from publicly saying so due to a gag order?

Unlikely. Signal and the ACLU were the ones who filed suit to allow them to disclose the terms of the warrant in the first place.

It would be an incredibly expensive and risky move for them to do so if they knew that the judge could force them to reveal that they've been turning over more detailed user data in secret.

Not to mention that it would have amounted to perjury.

They were bought by an adware tech company last year AFTER the events of the article you linked. I would suggest mullvad as a good alternative. I've had better speed and as good ease of use.

That’s one point in favor of PIA.

I chose NordVPN coz we can access from any server in the world and they have offered me good discount

Are you aware of the controversies around NordVPN?


Thanks for highlighting this.

I also read somewhere their CEO did something compromising to churn out more profits.

Maybe I will think about buying Mullvad subscription.

>I chose NordVPN coz we can access from any server in the world and they have offered me good discount

That describes literally most of the VPN service offerings out there.

"And Mr. Musk's endorsement of Signal last week sent publicly traded shares of Signal Advance Inc., a small medical device maker, soaring from a roughly $50 million market value to more than $3 billion. (The company has no relation to the messaging app.)"



Can they keep the money?

That's not how the share market works ...

It is if they can sell (their shares) quick enough!

Signal may have only supplied that metadata at the time. But what I am concerned about is that if Signal is US-based, couldn’t the state demand Signal’s app signing key via a NSL, and couldn’t that signing key then be used for targeted attacks by which someone of interest gets a Signal app upgrade that is malicious (while everyone else gets the non-malicious app)? I admit to being somewhat unfamiliar with Android distribution through the Play Store, so if this is unfeasible, help me understand why.

Yes. But if you specifically are targeted by organisations capable of issuing NSLs, you're completely hosed already. (And they're just as likely, if not more so, to have done that to your OS instead of just the Signal app.)

Technically they could get the signing key, but they can't force Signal to publish it via the store. Users would have to download an .apk file and install it directly. At that point there is no reason to have the signing key at all, as the phone will recognize a sideload as a third-party install. As far as I know, the government cannot compel a company to do something like update an app.

> but they can't force Signal to publish it via the store

Is there not a suspicion that Google, another US-based corporation, may have some agreement with American national security to supply malicious APKs to individual targets via the Play Store? Having Signal’s signing key would allow the state to present that custom-targeted APK as an ordinary Signal version update.

While I'm not saying Google hasn't done something like this (I have no proof either way) there's a strong legal argument to be made that forcing a company to produce binaries is compelled speech which goes against the first amendment.

It's more about preventing companies like Facebook getting their hands on everyone's data and abusing it, as well as preventing organizations like Signal themselves from using/abusing this data. We won't ever truly know if Signal's data makes its way into the hands of government security agencies, but I would say it more than likely does, or it will if they want it to in the future.

If some government wants to get you they will get you, probably via your operating system... Signal won't help you. If that's your concern then you gotta stay off the internet to be honest!

This, more than anything, is why I trust them and recommend them to others.

It's important to note that Telegram does store all your data by default as they do not enable E2EE for everything like Signal does. So if you're under the assumption that they don't, this is incorrect.

Telegram, for all intents and purposes, is about as secure as using Facebook. The best you can do with Telegram is hope they don't sell out or get compromised at some point in the future, because all your private communications are stored on their servers forever. Telegram does have "secret chats", which from what I can gather, don't even work for group chats, only one-to-one messages.

My general advice is to treat Telegram like a new Facebook if you have to use it: assume everything may be read by everyone, and don't treat it like it's private and secure.

For "text messaging" friends and family use Signal. Everything is end-to-end encrypted by default, so you know nobody is collecting your data.

In a way, yes, Telegram is even less secure than WhatsApp for this.

The way I've been presenting it to people is that Telegram can look at more data than WhatsApp can. But WhatsApp will use the data they have more than Telegram will. That's the tradeoff.

And yes, obviously Signal is more secure than both of them but I've been steering non-techies to Telegram because of usability, backups, cross-device history etc. As usual, everything is a tradeoff, but if people were happily using WhatsApp up until now, and also use Gmail, Telegram is not worse than those.

> The way I've been presenting it to people is that Telegram can look at more data than WhatsApp can. But WhatsApp will use the data they have more than Telegram will. That's the tradeoff.

I think you should look at the odds of that "can" turning into a "will" over time. After an acquisition, or a change in business fortunes, or a change in leadership...

It will happen. So far Telegram has developed without consideration for revenue, but that cannot last much longer.

According to his own numbers [1], Durov's entire net worth [2] can only sustain Telegram for about a decade.

[1] https://t.me/s/durov/142 [2] https://www.forbes.com/profile/pavel-durov/

Durov recently announced they will start monetizing through ads, but only in "channels" of people with huge subscriber counts (which generate a lot of costs). Channels are 1:N public broadcasts, a bit like Twitter.

That's how it all starts. And how will they know what ads to show you? By mining your conversations.

No. He confirmed that the ads will be generic. Not targeted to people based on their data.

> 500 million active users

> A project of our size needs at least a few hundred million dollars per year to keep going.

Is it just me, or does on the order of $1 per user sound like a lot?
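The back-of-envelope arithmetic, assuming a $300M midpoint for "a few hundred million dollars per year":

```python
users = 500_000_000          # "500 million active users"
annual_cost = 300_000_000    # assumed midpoint of "a few hundred million dollars"
cost_per_user = annual_cost / users  # 0.6 dollars per user per year
```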

$1 per user/year sounds about right. I use Telegram more than any other chat app. People send lots of media (and even large files) and Telegram archives them forever. One Telegram group I’m in is 6 years old and has 500+ VIDEOS (and 10,000s of images) permanently archived in it.

I’d estimate I cost over $10/yr to telegram.

I could imagine a backup solution being built on top of that free storage. Uploading encrypted blobs of data.

It's true, and it's part of the consideration. But the current mass emigration from WhatsApp seems to show that it's not as hard to jump platform as everyone thought it would be.

What I don't get is that people are trusting unverifiable builds of Signal, Telegram, WhatsApp, etc. as "secure" in their E2EE implementations, when the binaries we install on our phones aren't even verifiable against the code or compilable by ourselves.

But what I do like about Telegram is their good user experience and Bot API developer experience. It's soooooooooo fucking good I'm telling you. It just works, be it on web, mobile, and desktop.

At this point who the fuck knows if Durov can be trusted (hell we all wish, right, no harm in that). But regardless of that, at the end of the day I'd be willing to admit he's a fucking genius when it comes to Telegram's UX and DX.

Signal does have reproducible builds: https://signal.org/blog/reproducible-android/

Sir that's a big Today I Learned. Thank you.

For telegram, you can use the F-Droid builds, which I'd rate as one of the most trustable sources for android apps.

Telegram claims to have reproducible builds for Android and iOS. More information at https://core.telegram.org/reproducible-builds

> At this point who the fuck knows if Durov can be trusted (hell we all wish, right, no harm in that).

It's a threat model decision. If you're someone who wants privacy from the US or other Western governments (think Antifa on the left side, or corona-deniers, qanons and other conspiracy nuts on the right side), Telegram is the best option since the Western governments can't hold them accountable. If you're a Russian or Chinese dissident, or opposition in countries aligned with them (e.g. Serbia) Whatsapp and Facebook are your best bet.

There are many anti-fascists in Russia too. In general, anti-fascists face repression from every nation state.

Isn’t Telegram now based in Dubai, an emirate within a country that largely allies with the West?

Horcrux Encrypted Messaging combines multiple messaging options to protect you from all sides.


FWIW, I personally trust the "unverifiable" Signal build I get from the app store more than I trust the OS my phone runs.

That's not true of Telegram/WhatsApp/etc...

> Everything is end-to-end encrypted by default, so you know nobody is collecting your data.

I think it's wise to remember that what happens on the other "end" is outside of your control.

If the other person in the conversation stores chat backups unencrypted you're still at risk, and there's not much you can do about it.

The threat model assumes the other party is trusted. If not, in the limit they can film their screen and broadcast it to the internet, so there's nothing you can do about that.

I believe you have self-destruct timers in Signal. Perhaps those help.

Snapchat was based around that and people still copied content.

If someone can read it, then they can copy it.

It more so prevents people from retroactively going back and scraping data they wouldn't have grabbed in the moment. Many of my chats expire all messages after 24 hours because of this.

Most people will not archive all the texts they get in the moment; it's only after some fallout or event happens that there's motivation to dig up old messages.


Wickr does "screenshot notification" somehow, so now I occasionally get sent photos taken of phones showing "private" Wickr messages...

In order for the message to be readable by the other party, you fundamentally have to trust the other party. A self-destruct timer doesn't really help that aspect (which is why I don't use them).

You're right to point out what is and isn't in the user's control.

As others are pointing out, the reason a lot of people trust Telegram and Signal is that corrupt governments don't!

The few times the "kimono" has been opened, both Telegram and Signal were doing what they said they were doing, even if they have different approaches.

This isn't like Facebook who just lie and have been caught doing so.

Yep, and also keep in mind that at the FBI/DOJ Capitol breach presser yesterday, the FBI dude basically said "it's hard to tell who is shit posting and who isn't so it takes some elbow grease", which I take to mean that it's OK to shit post. Just, you know, don't use computers for anything you want to keep secret.

Yea, I believe telegram "secret chats" are E2EE and also have auto-destruct capabilities.

The secret-chats feature is almost useless. It only works on phones, only in direct messages, and it's not the default. I doubt 1% of Telegram users use it.

We don't know that Signal doesn't store data about users on its servers. Even the source code can't tell us that, because we don't run the servers.

What we do know is that programs like Telegram have to store data about users on their servers, by design. A big difference between the two projects is that Signal is carefully designed to minimize the amount of data the service needs to operate; it's why identifiers are phone numbers --- so it can piggyback on your already-existing contact lists, which are kept on your phone.

By contrast, other services store, in effect, a durable list of every person you communicate with, usually indexed in a plaintext database.

> We don't know that Signal doesn't store data about users on its servers. Even the source code can't tell us that, because we don't run the servers.

Yes. Ultimately we have no choice but to trust trust itself.[a] That said, if the OP were a non-technical friend asking me the same question, I would respond more or less like this:

"Of all the widely used messaging services, Signal is the only one known to be designed to minimize the amount of user data needed to operate, and all indications are that they are operating as designed[b], so Signal is likely your best choice today if privacy is your main concern."

[a] http://users.ece.cmu.edu/~ganger/712.fall02/papers/p761-thom...

[b] https://news.ycombinator.com/item?id=25764526

I've seen [a] on HN three times in the last week. Don't know if it's recency bias, Baader-Meinhof, a legitimate increase in common popularity/knowledge, or a sign of the times.

edit: after a quick algolia search, it has indeed been posted much more this year than years before.

It's been relevant recently due to the SolarWinds supply chain hack, too, since the implant was inserted into the build process, so I've been seeing it a lot more too. It wasn't used to infect a compiler, but still makes people think of Trusting Trust.

Signal has reproducible builds for Android. https://signal.org/blog/reproducible-android/

Does that help in any way to verify that they do not store data on their servers?

My understanding: If you verify the safety numbers in person, then I believe you can be confident that it's E2E encrypted for that conversation. If the safety numbers are different, then there could be a nefarious actor listening in.

Someone please correct me if I'm wrong.

Edit: That being said, I believe they could still record IPs, as well as the destination and timestamps of each message.
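For intuition: a safety number is derived from both parties' public identity keys, so each phone can compute it independently and the two displays should agree. A toy sketch of the idea (Signal's real derivation uses iterated SHA-512 and a different encoding, so treat the details here as illustrative):

```python
import hashlib

def toy_safety_number(my_identity_key: bytes, their_identity_key: bytes) -> str:
    """Both parties sort the two public keys so they derive the same value,
    then hash; a MITM substituting either key changes the number."""
    a, b = sorted([my_identity_key, their_identity_key])
    digest = hashlib.sha512(a + b).hexdigest()
    # Render as twelve groups of 5 digits, echoing Signal's 60-digit display
    digits = str(int(digest, 16))[:60]
    return " ".join(digits[i:i + 5] for i in range(0, 60, 5))
```

If Alice and Bob see the same 60 digits when verifying in person, no third key was swapped in between them.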

If they were storing that it would have been produced when they were forced to produce all data relevant to the case.

Agreed. Just pointing out what information they have access to if they wanted to start logging as much as they could.

Sadly I don't see any way to prove that over time except through periodic court orders :)

It only helps verify what data the client sends to their servers, not what fraction of that data is stored on their servers. They could be (but probably aren't; see other comments) storing e.g. information about how often you connect and the volume of data that passes through their servers.

We don't really know it, but there is some assurance the server is running the code they say it is because of Intel SGX.


I just had a skim over the post and it seems to be saying that it allows them to process user data without the OS having access to it. This does nothing at all for letting me verify what is running on their server or that they are even using this SGX feature at all.

It protects signal from hackers or a malicious datacenter provider at best.

I don't think you skimmed it very carefully.

> SGX enclaves also support a feature called remote attestation. Remote attestation provides a cryptographic guarantee of the code that is running in a remote enclave over a network.

> Originally designed for DRM applications, most SGX examples imagine an SGX enclave running on a client. This would allow a server to stream media content to a client enclave with the assurance that the client software requesting the media is the “authentic” software that will play the media only once, instead of custom software that reverse engineered the network API call and will publish the media as a torrent instead.
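The flow the quote describes can be caricatured as follows: the CPU signs a measurement (hash) of the loaded enclave code, and the client refuses to proceed unless that measurement matches the hash of the code it has audited. A deliberately simplified sketch; real SGX attestation goes through Intel's attestation infrastructure rather than a shared HMAC key:

```python
import hashlib
import hmac

CPU_SECRET = b"burned-in-attestation-key"  # stand-in for the CPU's key material

def enclave_quote(enclave_code: bytes) -> tuple:
    """The CPU measures (hashes) the enclave and signs the measurement."""
    measurement = hashlib.sha256(enclave_code).digest()
    signature = hmac.new(CPU_SECRET, measurement, "sha256").digest()
    return measurement, signature

def client_verifies(quote: tuple, expected_source: bytes) -> bool:
    """Client checks the signature AND that the measurement equals the hash
    of the code it audited (e.g. a reproducible open-source build)."""
    measurement, signature = quote
    ok_sig = hmac.compare_digest(
        signature, hmac.new(CPU_SECRET, measurement, "sha256").digest())
    return ok_sig and measurement == hashlib.sha256(expected_source).digest()
```

The point being made upthread still stands: this only shifts the root of trust from Signal's operators to the CPU vendor.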

I can't trust any company that has to read my contact list, PERIOD! It's not something anyone should have to share, ever.

Signal does not have to access your contacts. It does ask for contact access permission, to show in the app the names that you have set for your contacts. But you can just answer no and everything works.

On the contrary, if you answer the same to WhatsApp, it plainly refuses to work. But it has actually created an account on their servers, and from then on you appear to your contacts who do use WhatsApp as another WhatsApp user, which invites them to write to you there even though you cannot receive their messages. To fix this, you have to find the option in WhatsApp to delete your account.


Signal 1 WhatsApp 0

That's the point. They don't.

I don't want to speak for the parent commenter, but I think the concern is that the local app could be exfiltrating the contact list (and then by the exact same logic, message content as well) in some side channel unrelated to anything seen in the published source code, unless (a) the user builds the apk from published source code themselves, or (b) if there's some way to prove that the apk received via the Play Store is identical to one built from that source code.

Is (b) achievable by all users who have this concern?

For the most part, and for Android users, b is achievable : https://signal.org/blog/reproducible-android/

Signal isn't a company. They're a non-profit. In addition, as others have mentioned, Signal works without giving them permissions to read your contacts.

Signal and the ACLU sued and were granted permission to release sealed warrant data from a previous law enforcement request for user data.

As of mid-2016, and trusted as much as you feel like trusting something attested in a court of law, Signal stores: a bool (is this phone number a user) and two ints (epoch of signup, epoch of last transmission).


Signal: operations that involve sending your contacts (like contact discovery) use a pattern Signal invented where the client can validate the software running on the server. The server runs inside the SGX secure enclave. Before your client sends any data, it performs remote attestation on the running server code to ensure it matches the published open source code.

See the full explanation at https://signal.org/blog/private-contact-discovery/ (starts part way down, with "trust but verify"). Or check the client source code yourself!

Telegram: I dunno; they're closed source, don't encrypt by default, and have shady ownership. I don't trust them at all, personally.
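To see why the enclave trick matters for contact discovery at all: the naive alternative is to upload hashes of your contacts' phone numbers and intersect them server-side, but phone numbers have so little entropy that the server can brute-force the hashes — the linked blog post walks through exactly this problem. A toy illustration with hypothetical numbers:

```python
import hashlib

def h(number: str) -> str:
    # Truncated hash, as in naive hash-based contact discovery schemes
    return hashlib.sha256(number.encode()).hexdigest()[:16]

# Naive scheme: client uploads hashes, server intersects with its user list.
registered = {h(n) for n in ["+15550001234", "+15550005678"]}
uploaded = [h("+15550005678"), h("+15559990000")]
matches = [x for x in uploaded if x in registered]  # server learns who matched

# The flaw: the phone-number space is tiny, so the server can hash every
# possible number and invert anything the client uploaded.
rainbow = {h(f"+1555000{i:04d}"): f"+1555000{i:04d}" for i in range(10_000)}
```

This is why hashing alone doesn't protect the contact list, and why Signal resorted to hardware attestation instead.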

Keep in mind that SGX is not as secure as advertised[1][2].

Also, the whole security model hinges on trusting Intel not to give its private keys to anyone, which is a big ask for any company. The NSA/CIA could likely get those keys legally via a FISA court order or illegally via hacking and/or an insider.

[1] - https://arstechnica.com/information-technology/2020/03/hacke...

[2] - https://www.theregister.com/2020/06/10/intel_patches_sgx_aga...

Sure, but the question wasn't "does the NSA have access to data", it was "how do we know that information isn't stored."

The answer is that signal includes an industry-leading attestation process using CPU security features.

It's true that if the CPU manufacturer is compromised that would compromise anything running on it, including attestation. But that's not really to do with Signal's implementation, and it is out of scope of the question.


While not directly about data storage, I loved this tweet [1] from Edward Snowden this week:

> do we really trust signal? cause i see zero reason to.

Here's a reason: I use it every day and I'm not dead yet.

[1] https://twitter.com/Snowden/status/1347217810368442368?s=20

I like this reason because you don't need to know anything about tech at all to be able to understand this. You also don't need to trust or like Snowden. If you view Snowden as a hero or a traitor, it changes nothing. All you need to trust is that he's got no reason to lie about using Signal, and neither does Elon Musk.

Unless he's an NSA operative who's promoting a honeypot.

That's a stupid reason. He's not dead (or at least imprisoned) yet because USA doesn't have an extradition treaty with Russia. What does a chat app have to do with it?

Why would he need to be extradited to the USA in order to be killed?

If the US government wanted to send a hit squad to Russia to take him out, the fact that he uses Signal instead of WhatsApp isn't what would be holding them back.

One could read into Snowden's tweet that he has sent/received messages using Signal that would with certainty have pushed the US government into wanting to send that hit squad.

You can read into it however you want, but the fact is that USA isn't going to essentially invade Russia over him.

Well, in the case of Telegram, you can trust that they store your data on their servers, because they say so.

And it is convenient, because you can just switch smartphones and still access all your chat history without having to manually back up and restore.

But Telegram in general does not have a business model yet, so just assume that one day they will want (or have) to cash out.

Signal, on the other hand, is a non-profit foundation and pretty open about what they are doing. That creates trust for me.

Durov wrote his thoughts on monetization lately here: https://t.me/durov/142

IIRC the business model was going to be the GRAM coin/TON network, which offered P2P transactions, Dropbox-style storage, etc.

I thought that got shut down by the US government for being a scam.

Telegram stores your message content in their cloud for "cloud chats" (the default), as those are not end-to-end encrypted.

Telegram's "secret chats" and Signal chats are end-to-end encrypted. The servers may still store metadata, and there is no way to tell whether they do other than joining them or letting a trusted third party verify it.

To check that E2E-encrypted message content cannot be decrypted via backdoors on their servers, you need to ensure they use proven encryption schemes and that the client's encryption actually corresponds to those algorithms.

> I'm just curious how we trust companies such as Signal, Telegram, Mozilla, that claim they don't store and sell our data?

These are three very different companies with very different security processes and trust profiles.

In the case of Signal: if you trust that the source code they distribute is the same as the app available in the Play Store, then it's pretty easy to verify that the messaging data is end-to-end encrypted in a way that prevents Signal from having much metadata that they even could store. With "sealed sender", they don't even know who's talking to whom: https://signal.org/blog/sealed-sender/

There's the possibility that Signal could ship a different app in the Play store, but that would require active malice to do in a way that would not be trivial to discover[0], and at some point you do have to trust someone. It's not impossible, but it's hard to imagine a world in which Signal is compromised but other links in the chain aren't, because quite frankly, there are far more easily corruptible or hackable links in the hardware/software stack that you use, so Signal would make a pretty inefficient target for someone who wants monetizable data.

[0] ie, an accidental divergence between the two would be more conspicuous

I had to scroll down a shockingly long way to find this.

The point is that even if Signal permanently stored everything you ever sent them, then they wouldn't be able to read it.

- You can build the client yourself per Signal's reproducible builds, so actually they could not ship an app different from the published source without it being immediately detectable

- You can validate the source code does not send any unencrypted data to Signal

- You can validate that your private keys used for encryption are stored locally on the device and not transmitted to Signal

Theoretically, anyone who has the corresponding private key could decrypt the message. So if your contact uses an unofficial client which shares their private key with a third party, then that third party could decrypt the message. However, by that point the app creator has compromised the device anyway, and could do something as naive as take screenshots of all the messages in the background after Signal has done the work of ensuring secure transmission.

Note, I haven't actually done all that, because I do trust Signal. But I could if I wanted to. And obviously, this assumes that all the cryptographic standards used in Signal are still unbroken - but if they were, then you're screwed either way.

> trust that the source code they distribute is the same as the app available in the Play Store

And, of course, one has the option to build the source and install that.

Telegram has reproducible client builds.

you don't know what they know, because you can restore that metadata

In the case of Signal, I imagine people assume all of the following:

1. the protocol between client and server is set up in such a way that even if Signal wanted to store interesting information (for example, messages), they could not access it, so they don't store anything since it would be useless

2. the app implements the protocol faithfully and this has been checked by people perusing the source code

3. the binary downloaded from the App Store/Play Store is compiled from the sources listed on GitHub

This might seem like a naive question, but how is it possible to verify that 3 is true?

I get how it might be done in theory but real life is complicated. Has anyone attempted to do this?

Signal has reproducible builds for at least Android: https://github.com/signalapp/Signal-Android/tree/master/repr...

You compare the result you get from compiling the app yourself with what you downloaded from the Play Store. For iOS this might be harder.
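The comparison step itself is just hashing both artifacts and checking for equality. A minimal sketch (the APK paths in the comment are hypothetical; Signal's instructions have you produce the build in a controlled environment first so it comes out bit-identical):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA-256 digest of a file, read in chunks so large APKs are fine."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: compare your own build against the APK pulled off
# a device that installed Signal from the Play Store.
#   sha256_of("Signal-play-release-self-built.apk") == \
#       sha256_of("Signal-play-release-from-device.apk")
```

Note that a naive whole-file hash can differ just because of store signing, which is why Signal's repo ships a comparison script rather than telling you to diff raw hashes.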

> I get how it might be done in theory but real life is complicated. Has anyone attempted to do this?

This is mentioned elsewhere, but the answer is: reproducible builds.

You can take the Signal client source (which is available on Github), build an APK or whatnot yourself, then get the SHA256 hash or whatever and compare that to the artifact downloaded from the app store and validate that it's the same.

Has anyone done it? No idea!

The signatures on app store apks mean that doesn't work.

You can, though, decompile both your version and the app store version and compare them that way.

Any reason you can't just zero out the signatures and then hash?
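In spirit, yes. Since APKs are zip archives, the cleaner version of "zero out the signatures" is to compare every archive entry except the signing files under META-INF/. A rough sketch (entry names follow the usual APK layout; real tools such as the comparison script in Signal's repo do a more careful content diff):

```python
import zipfile

SIGNING_PREFIX = "META-INF/"  # certificates and signed manifests live here

def payload_checksums(apk_path: str) -> dict:
    """Map every non-signing entry name to its stored CRC-32 checksum."""
    with zipfile.ZipFile(apk_path) as z:
        return {
            info.filename: info.CRC
            for info in z.infolist()
            if not info.filename.startswith(SIGNING_PREFIX)
        }

def same_ignoring_signatures(apk_a: str, apk_b: str) -> bool:
    """True if both APKs agree on all contents outside META-INF/."""
    return payload_checksums(apk_a) == payload_checksums(apk_b)
```

Comparing per-entry checksums instead of re-zipping also sidesteps differences in zip metadata (entry order, timestamps, and signature blocks that live outside the entries entirely).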

I imagine it's non-trivial. I think it would involve (in the case of iOS):

1. downloading the binary

2. jailbreaking the phone to extract the binary (pretty sure this is necessary on iOS)

3. check the version of the binary, then compile the original sources of the version

4. ??? compare the two binaries; this is likely the most difficult part, as they won't be identical because of things like codesigning (and build flags, timestamps, ...)

I know no one who does this.

You don't compare builds because you probably don't actually have sources. What you do is use a special iPhone (a Security Research Device) that Apple grants some researchers or you use an emulator like the one from Corellium (to whom Apple recently lost a lawsuit over this emulator) to probe and step through the code. Find the key sections that do the real crypto work and make sure that they do what they are supposed to do and that they are getting the correct inputs.

There is a large group of people who do this sort of research, and some fraction of them do this research and actually talk about it or publish papers. If you could find a deliberate weakness in the security of an app like what we are talking about (or WhatsApp or iMessages) then you have just printed your own golden ticket to whatever mobile cybersecurity job you want for the next decade or two, so there is a bit of an incentive to publish if something like this was discovered...

> You don't compare builds because you probably don't actually have sources


If you don't want to trust anyone but to verify instead, consider running Matrix with your own server. In this case you can still talk to anyone else on Matrix, because it's federated.

I'm not familiar enough with Matrix, but I'm curious about how/if it can protect against malicious federated server operators? Does that just boil down to needing to trust they are not running modified code?

Your own Matrix server does not spread any information unless you allow it or unless you message other servers.

Now, concerning the other servers, it may be a problem, just like Gmail is a problem for everyone running their own email server. However, it's a much smaller problem and at least theoretically everyone who cares can escape the walled garden.

You can create a server for all your friends and you will always know exactly which information is shared with others and which isn't.

Telegram is very clear that they do store our stuff on their servers. And in clear text unless you choose end-to-end encryption.

My concern is not about my data being stored on their servers. My concern is about marketing data being sold to third parties in order to target advertising at me, just as when you leave "third party cookies" active in your browser. That is creepy and invasive. Would Zuckerberg ever do such a thing?

Signal is open source; you and anyone else can inspect the code. You can then build it from source and install it on your Android device directly, avoiding the Play Store.

That is a somewhat misleading statement: you may know what's on your device, but you don't know what is happening on the servers.

It's end-to-end encrypted chat. They could store the encrypted messages, sure. I think the biggest fear people should have with Signal is the client-side encryption.

Right! The keyword here is "reproducible builds". Basically, once there is documentation about how to produce the release build, you can do it yourself and compare the resulting hash with the build distributed in the store. Generally speaking it does not come for free, but once you find a way (e.g. for iOS, compiling with a specific Xcode version on a specific OS with some adjusted config) it is kind of doable (except that Apple encrypts your build server-side for DRM purposes, so you'll need a jailbroken phone to do something about it).

For Signal there is an open issue here for iOS [1] and some documentation for Android [2]

Some nice work on this has already been done by Telegram: https://core.telegram.org/reproducible-builds

[1] https://github.com/signalapp/Signal-iOS/issues/641

[2] https://github.com/signalapp/Signal-Android/blob/fab24bcd1e5...

This has nothing to do with the comment you replied to, as you have no idea what software is running on their server, so what would it even mean to reproduce it in the first place? The correct answer is merely "the server never received much in the first place so it doesn't matter as much if they stored all of it".

Right, I think I messed up the reply while I was reading other comments.

Because of public-key crypto, it doesn't matter if the servers are malicious.

Assuming you have:

- read the source code and are satisfied that it's secure

- compiled that version of the code

- installed it on your mobile or desktop

You're still only as secure as the client on the other side of the conversation.

If that one is compromised (has not gone through the steps above), it could very well be sending all messages in clear text to a malicious party.

Edit: formatting

Ok, sure. But what do you propose? It's still a much better situation than what we have with Whatsapp. Is there something that the Signal Foundation could do to alleviate that concern you have? There's no technical solution in any technology for preventing the other side being compromised, as far as I can see.

At present I'm choosing to trust Signal.

That doesn't mean I blindly trust them, only that despite seeing potential for abuse I judge that they have more incentive to be telling the truth than not.

Also check the comment by user faitswulff where they mention how they have been subpoenaed "and could only supply account creation time and last connection time".

Matrix since you can self-host and have control while still being able to communicate to other people on it through federation

>>> You're still only as secure as the client on the other side of the conversation. If that one is compromised ... it could very well be sending all messages in clear text to a malicious party.

>> There's no technical solution in any technology for preventing the other side being compromised, as far as I can see.

I don't know Matrix, but I can guarantee that it doesn't solve the problem of a compromised client obtaining the messages willingly sent to it.

Yeah and since you have the possibility of dealing with state actors with deep pockets, you have to wonder if Android or iOS doesn't have the ability to copy your private keys and send those off somewhere for storage. Because of signal's popularity, it feels pretty possible to me.

If the NSA did have it backdoored somehow through the OS, it's a good bet they'd force LE agencies to use parallel construction to keep that information top secret.

That is why we really need open source hardware and OS's. A good (or even functional) open linux phone can't come fast enough.

If your adversary is state actors with deep pockets or the NSA, you've lost already. No amount of opsec cosplay is going to save you.

Your solution?

* Magical amulets?

* Fake your own death, move into a submarine?



AOSP (Vanilla, GrapheneOS, CalyxOS) doesn't have this capability.

The Google Play Services app/package? Heh...

The server can MITM the public keys, providing you with a key from the server instead of the key from your conversation partner.

It very much does matter if the server is malicious.

If you are paranoid, you can do public key verification through another channel. People with high risk profiles should do this.

Key authentication is not for the "paranoid" or simply those with "high risk profiles", otherwise every web browser in the universe wouldn't do it by default on every single connection to every single website. It is a normal, routine thing that is expected in all modern secure communications systems.

Please don't spread this harmful meme.

We've got certificate authorities to centralize trust for server public keys. And those require trusting organizations that lots of people don't want to trust. We don't have an equivalent system for individuals. There is no trivial push-button key verification process for peer-to-peer communications. Key signing parties suck and never worked. Key validation for things like Signal is nicely automated if you are physically near the other person. But beyond that it is tricky.

It is hard enough to get my parents to use a secure messenger. If I told them they needed to do a key verification process for every person they ever communicate with... they'd just go back to facebook messenger or sms.

I think it is completely reasonable for somebody to say "I don't care enough to worry about validating public keys" while also educating people like journalists about how to do that correctly.

Not if the keys are generated by the client.

Signal also lets you label contacts whose authenticity you have verified in another way.

Doing a video call with the contact can be a simple way to clear doubts, even if it is not a proper different channel.

Video calls alone won't stop a MITM attack. They would just send both video streams along, and record both sides.

Signal does have the capability to have a verification phrase displayed, which is generated from the session key. Reading that off can make the video more difficult to MITM, because then they'd have to morph the audio to match the phrase, and if it's done after the video is setup, morph the video as well. Not impossible, but difficult.

This is false. A video call will not prevent or detect MITM. You may be suggesting that a video call is used to authenticate the key, which is certainly a step in the right direction, but I don't think Signal supports this.

It will, because it will prove (or give you a lot of confidence) that the agent who sent you their public key is your legit correspondent.

This uses the fact that the client on each side is open source and inspectable, so that each side knows that they sent only the public key that they generated on their own device.

PS: to answer your last sentence, Signal allows you to specifically flag contacts that you managed to verify, which is technically equivalent to saying that you verified that the public key is theirs.

Yes, but it doesn't support doing that whilst in a video call with them.


Indeed it is far from straightforward that merely doing a video call suffices to check the keys.

Signal famously uses a special protocol for secure key sharing through the server, which I have not studied.

But as another comment said, there is no way around verifying the public key explicitly using an independent channel.
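The shape of that out-of-band check can be sketched in a few lines. This is not Signal's actual safety-number format (which is a specific iterated-hash construction); it only illustrates the idea that both parties derive the same short string from the two identity keys and compare it over an independent channel:

```python
import hashlib

def comparison_string(my_identity_key: bytes, their_identity_key: bytes) -> str:
    """Derive a short, human-readable string from both public identity keys.

    Sorting the keys first means both parties compute the identical
    string, so they can read it aloud over a call or compare in person.
    If a server substituted a key (MITM), the strings would not match.
    """
    material = b"|".join(sorted([my_identity_key, their_identity_key]))
    digest = hashlib.sha512(material).digest()
    number = int.from_bytes(digest, "big") % 10**30  # keep 30 digits
    text = f"{number:030d}"
    return " ".join(text[i:i + 5] for i in range(0, 30, 5))
```

Each side feeds in its own key plus the key it received for the other party; a mismatch means someone in the middle handed out a different key.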

not true with respect to metadata

which I don't think is as concerning, but is there a particular piece of metadata that concerns you?

I do think it is a valid concern. Over the years, various sources reported that intelligence agencies mostly use metadata (who's talking to whom, i.e. the social network) in their analysis because message content is harder to parse and understand (and, outside of email traffic, harder to obtain in the first place).


Metadata can be as damning as the actual message data, and in a lot of places you don't want the authorities to know that you are even communicating at all.

Certain parts of the world, people get bombed on metadata alone.

While this is a great way to build trust, there is obviously no way to confirm the App Store version is the same as one built from their public source. In fact, due to the way Apple optimizes apps for each device, this becomes even harder. Furthermore, just because you compile it from source and put it on your phone does not mean that you can reasonably stay aware of or understand all the internal workings that happen inside the app.

I know that developers can post LLVM bitcode to the App Store instead of a binary, which allows Apple to recompile it for architectural changes. I'd be surprised if Apple optimized per device. Creating separate builds with optimizations for different iPhone models would make more sense. Do you have more details on that?

I think he meant "iPhone models" when he said "devices". I'm not the OP, but they definitely do optimize per model, not per individual device.

Per model is what I meant.

> Furthermore, just because you compile it from source and put it on your phone does not mean that you can reasonably stay aware of or understand all the internal workings that happen inside the app.

Can you elaborate on this? That's exactly what I'd expect of an app I compiled from source.

Are you going to read every commit and fully explore the entire app to know your messages and encryption are being handled securely? And keep doing this every time you update? If so, you have more time than I do. :-)

As much as it is open source, there's no way to know for sure that the software running on their servers is the same as what's published on GitHub.

The software on the server doesn't matter, as long as the encryption is solid on the device. That's the whole point: the device handles all encryption/decryption, so the server can't understand any of the data coming and going. The reason they don't store the data is that it would be pointless. The GitHub repos contain the client source code which, for most platforms, can be verified.

> You can then build it from source

To be secure you would HAVE to build and install it from source.

But then again your OS could possibly inject code to get the keys. Or a keystroke logger may have been installed.

So how do I build the Signal server and confirm it's identical to the one they're running?

It doesn't matter if it's not. Security should depend completely on the clients. Public-key crypto allows private communication through insecure channels.
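That claim is easy to demonstrate with textbook Diffie-Hellman (a toy sketch with illustrative parameters, not Signal's actual X3DH/Double Ratchet): the server relays only public values, and the shared secret is computable on each end but never transmitted. It also shows the caveat raised elsewhere in the thread: an actively malicious server could substitute its own public values, which is exactly what out-of-band key verification guards against.

```python
import hashlib
import secrets

# Toy finite-field Diffie-Hellman parameters for illustration only;
# real systems use standardized groups or elliptic curves.
P = 2**255 - 19  # a large prime modulus
G = 2

def keypair():
    private = secrets.randbelow(P - 3) + 2  # kept on the device
    public = pow(G, private, P)             # all the server ever sees
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# An honest server relays alice_pub and bob_pub verbatim. Each side
# combines the other's public value with its own private value:
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared  # same secret, never on the wire

# Hash the shared value down to a symmetric message key.
message_key = hashlib.sha256(alice_shared.to_bytes(32, "big")).digest()
```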

No, it does. In some cases (think dictatorship) you not only want the secret police not to read your messages - you don't want them to know at all who you are talking to (and how often, and when!). Otherwise you might all go to jail (or worse) if they are after one contact of yours. And then you can try to feel safe knowing that they don't have your encryption password.


Session is a signal-derived app that attempts to mitigate these types of information leaks.


Edit: see discussion here: https://news.ycombinator.com/item?id=25690036

here https://www.deccanherald.com/national/north-and-central/jk-a...

not allowed to use VPNs because of national security issues.


"social media muisuse"

i remember last year this word was so much used, "misuse" which translates to criticizing the ruiling dictator government. it still is,


here. a whatsapp group needs to be "registered" with police.

and lastly more recently, https://theintercept.com/2020/12/06/kashmir-social-media-pol...

this is a reason why i never signed up for whatsapp, haven't joined signal, and don't tweet or post on facebook. Why? Because of PII.

the danger is real and i am living it. people better realize it

Well, if a government goes authoritarian, then it does not matter much what service you use, since you have to assume your phone has spyware on it.

If the main danger is police scanning the phone for compromising material (without police spyware on it), then there are some ways to deal with it technically, by using services that don't leave a trace. Telegram for example has a "secret chat" function which won't save the messages, meaning someone scanning your phone later won't find them.

(which I heard is also a main reason for many people to join Telegram, because then they can chat with their affairs and not have their wives read it)

Then there are simply private tabs in Chrome or Firefox, from which you can use chat services without a trace (if the chat services are not cooperating with the police, or are decentralised by default; I think in that scenario I would use Matrix).

Anyway, you live in kashmir?

I know of the conflict mainly from reading Shalimar the Clown by Rushdie. Just curious about your opinion, if you know the book. I heard it was not well received in Kashmir itself? I think it was very well written, but I don't know how accurate it is.

Yeah. Just last year I had to teach neighbours and such to use a launcher like Evie, which lets you hide apps. Many were stopped during random street checks, and that saved their Turkey. Heh.

8 months or so ago I was stopped because it looked like I was "recording a video" on my phone, when actually I was. Took a slow turn, double-pressed the power button, and made a Pikachu face as if I wasn't. Still, a couple of guys around helped, or I was history.

No, I haven't read Rushdie. He has that whole demon-verses thing around him; he isn't liked.

The problem with Telegram, as with WhatsApp and Signal, is phone numbers. India has had this network analyzer at the ISP level for like 6-7 years, called "Netra". So all unencrypted traffic goes through it, and the same for all encrypted traffic. This is the reason why I stopped using Tor: my traffic would show up differently from the rest, and that gets them suspicious quickly.

There is a lot of text written on the conflict, which actually is more than 500 years old. Kashmir has been under foreign oppressive occupation for over 500 years constantly, and even today is under 3 nations. It's not like the occupation doesn't affect the people.

I am trying to get people I know onto Matrix because there is no PII; I'm waiting for Dendrite to come out of beta so that I can set up my own server.

The joys of living in an open air prison.

"No. I havent read Rushdie. It has that whole demon verse thing around him, he isnt liked"

Well, he did make many people aware of the conflict, which created attention, which results in Indian police having to give interviews to The Intercept, for example, which helps in some ways. Things would probably be darker without that.

What is your opinion on a political solution?

Do you think independence would work out (if your big neighbours would let you)?

My understanding is that the Kashmiri population itself is divided?

Anyway, hope you stay safe.

i "think" i am safe. Recently a prominent lawyer was assassinated in broad daylight, that has gotten me scared. other than that this place is actually safer for non combatants than rest of the subcontinent and i can say that with authority. the governments otoh are not safe for people.

from what i have observed, and from facts, at least on the indian-occupied side, india spent millions to convince entire generations about a local hero who happened to be pro-india.


this was a couple years ago.

yeah, so it's not like people are not protesting; we just don't see a point in conventional protests. kashmir has been fighting foreign invaders for over 500 years, so it's kind of in the system to oppose.

as a kashmiri, i see no alternative other than complete independence. there is no other way forward; whether that takes 20 years or 500 doesn't matter.

personally i think all three of these nations have to let go of kashmir, not because of some altruistic reason but because of CPEC. india and pakistan "need" to pass through kashmir to access it. china cannot afford to jeopardize their project because of indo-pak squabbles. for them business is king, and the sooner things settle down, the easier.

for us kashmiris, we could practically live off the port fees for all the goods passing through our borders, so yeah, i am hopeful.

"personally i think all three of these nations have to let go of kashmir, not because of some altruistic reason but because of CPEC. india and pakistan "need" to pass through kashmir to access it. china cannot afford jeopardizing their project because of indo-pak squabbles. for them business is king and the sooner things settle down, the easier."

I would be more afraid, that those big powers next to you don't want to let you go exactly for that reason. They want to use your land.

But yeah, I hope that they can settle for that. Leave you in the middle. A buffer between them.

Good, but currently the land is divided between the 3 and there is fighting. Unless the status quo changes, the fighting will continue. It's not like any of the three can force its way onto the others' territory and the other two will stay quiet. Either all can have it like now and keep fighting, or all let go and no one fights.

Well a man can dream. Right?

My sympathies are with Kashmiris, but you have to admit that Mirwaiz, who sees a Qadiani behind every corner, and Syed Ali Shah Geelani are not good PR for you.

That's not going to be true of metadata, though. A malicious server could keep a lot of valuable metadata about you.

Other than literally showing up at somebody's datacenter and taking their machines away, you won't be able to do this with any service, ever.

And how does this prove that they don't save what you send them?

what use are my encrypted messages to them?

Building social graphs and activity logs, for one.

1) Metadata. 2) Decryption in case of a security breach or more advanced computing.

Hello, future? Yes, this is Richard Stallman calling from the 1980s. I think what you might be looking for is the source code to the entire software stack.

Perhaps a little pedantic, but I don't think this is technically correct: Stallman and the GPL wouldn't really apply here, since it's not the client-side application code being questioned regarding trust, but rather the intermediary code on Signal's own servers. In that case, the Affero GPL would be the answer, and that license was first published in 2007, some 18 years after the initial GPL release.

I don't think that actually detracts from the gist of your point, but I just wanted to point it out in case anyone was interested.

What prevents Google from replacing Signal in the Android app store with their own custom, backdoored version? Can we check a hash or something? Does the Signal Foundation do that on a regular basis?

If Google wanted to read your messages and were willing to use malware to do it, there’s little to stop them on Android. Even if Signal checked the apk regularly, there’s no guarantee that the apk served to them is the same one served to everyone else. They could also push an update to the OS that recognizes the Signal apk and applies a patch after downloading but before installing.

That said, Signal does apparently support reproducible builds so that people can check that the APK matches what's on GitHub (though this is more a way to detect malfeasance on Signal's part than on Google's).


> They could also push an update to the OS that recognizes the Signal apk and applies a patch after downloading but before installing.

Ah, right, there's also that.

Never forget to reflect on trusting trust, of course:


You can verify the installed version on Android against the GitHub repo version. Yes, you can essentially check a hash.

Signal is signed with a key that's held by Signal, not Google. Android won't install app updates unless they're signed with the same key as the currently installed version.

(I work at Google, but not on Android)

There's nothing stopping Google from silently pushing a keylogger to your phone and recording every single thing you do. They don't need to hijack Signal or anything else for that. By using your phone you are implicitly trusting Google, the manufacturer and several other parties.

Nothing, because they can also just read it off of your phone and don't need to 'break' Signal.

Telegram stores all data on its own servers.

Just try the following trick:

- in a private Browser window open web.telegram.org

- enter your phone number, receive the code

- turn on flight mode on your phone

- now enter the identification code in your browser

- access your whole chat history, even though your smartphone definitely cannot be acting as the source

Even WhatsApp is better in this regard.

Article in German: https://www.heise.de/hintergrund/Telegram-Chat-der-sichere-D...

What's this supposed to prove?

That it stores all of your messages on a server, and not on your phone. Your phone being in flight mode disconnects it from the internet, but you are still able to browse your messages.

You don't need to 'prove' it. That's their selling point: access your messages on every platform.

Isn't it as they describe it? Unless you created e2e chats specifically (Secret Chats).

If you trust the source code of the software you're running, you can at least get a sense of what data they're getting in the first place. You know, at least, that they're not getting the content of your communications if you verify safety numbers. You can also prove that they're not getting the contents of the gifs you're grabbing for your conversation, because the client makes a secure connection to the gif service using Signal's servers as a proxy.

As far as promising not to store your metadata, or promising not to deliberately give the gif service information about your account because they hate you, or promising not to store your contacts when you search for other friends with Signal, then yeah you have to just take their word for it. Though, they may over time look for ways to put some of those guarantees on the client side as well with some clever engineering, so you could prove it.

Telegram stores your data on its servers by default. You have the option of removing single messages or conversations, but there's no way of knowing if they really do remove them. Also, if you remove your account without removing your conversations first, they stay there forever (others can still see the messages).

At first I wanted to write about client-side app verification: that we can prove that the app we have and the compiled open-source app are the same.

But that does not prove that, if the app sends your phone number or location (for example), they don't save it in a database.

Indeed, interesting question

As long as the data is stored on the server and only the transmission is encrypted, we will never know for sure, especially in cases where the company "controls" the encryption and decryption.

If this is a concern for you, consider using the Signal protocol without a server.

A CLI prototype; it can be generalized into a nice phone app.

https://github.com/adsharma/zre_raft https://twitter.com/arundsharma/status/1348718596415918080

Which data? We can be somewhat sure that they don't have access to the content of Signal or Telegram secret chats as long as we have verified the identity of our contacts.

After that, what data do you care about? Neither Signal nor Telegram is intended to provide complete anonymity. That is a much harder problem. For Mozilla that would involve Tor. I don't think that Mozilla really has "servers" in the sense you mean.

Telegram is based in the UAE. The UAE is known for its very strict monitoring of its citizens; not until recently did they allow FaceTime to work there. I honestly doubt that Telegram doesn't give the UAE government access whenever it's needed; it's a monarchy.

You don’t. Everything is about trust.

Yes, and I trust Durov.

If you wish to be more certain, use something open-source. For instance, Matrix has many clients made by different teams, in the open, and several of these are part of e.g. Debian, so you should be able to find at least one you can trust.

What about Mozilla? What could they store?

There's a lot about Signal in particular that they get right. AFAIK:

(1) All Signal messaging is E2EE; (2) they don't store messages on their servers; (3) the client code is open source, and it seems like a good portion of the server code is open source.

Where I think Signal could go further on being the most secure, useful, and privacy-conscious messaging app/company in the world:

1. Open source ALL of the server code. They have something called Signal-Server (https://github.com/signalapp/Signal-Server) on their Github, but it's unclear if this is the server they use, or simply a server one could theoretically use to run a private Signal server.

2. Open source all server-side services/infrastructure code that doesn't compromise security in some way.

3. Better features. Signal is currently the most secure and privacy-conscious of the messaging apps, but solidly the worst overall user experience. It's not that it's bad, it's just that the other apps are much better. People like gifs and giphy and emojis and a fast-feeling interface. This is important, because it's hard to be a privacy-conscious individual when all your friends want to text on other apps. At least in my social circle, Signal is still the thing that people jump over to when they want to be extra super sure they're not leaving a paper trail, but not the default messaging app they use.

4. Introduce a user-supported business model. This probably makes a lot of people uneasy, and while I appreciate the current grant and donation-based business model (the Wikipedia model), that model comes at a great cost in efficiency. By operating effectively as a non-profit, you are inherently in a less competitive position relative to your competitors (the best product and engineering people are more likely to go to competitors who can pay more), and you're persistently in fund-raising mode (again, see: Wikipedia). There are lots of ways to skin this cat; maybe the easiest is to ask power users to pay like $5/mo. Or just give people the option to pay with absolutely zero obligation. Some non-zero cohort would inevitably take them up on this.

Most of these suggestions, of course, especially 1-3, are very very hard and come at an enormous cost. Building in public as an open source business seems to massively slow things down and introduce a huge amount of community-management overhead. That said, I'm sure there are ways to manage/mitigate those costs.

It's cool that Signal is open source; that's a big source of confidence for me. Is there some way for me to verify that the app in the app store is actually built from the OSS code on GitHub?
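One answer is reproducible builds: Signal publishes instructions for rebuilding the Android APK from source in a pinned toolchain, and if the build is truly reproducible, your locally built APK should match the store one byte for byte (after stripping the store's signature). A hedged sketch of the final comparison step; the two APK files here are placeholders created so the snippet runs as-is, in practice they'd come from your own build and `adb pull`:

```shell
# Placeholders standing in for a locally built APK and the one pulled
# off the phone; real verification follows the reproducible-builds
# instructions in the Signal-Android repo.
printf 'apk-bytes' > my-build.apk
printf 'apk-bytes' > store.apk

# If the build is reproducible, the two files are byte-identical.
if cmp --silent my-build.apk store.apk; then
    echo "builds match"
else
    echo "builds differ"
fi
```

Of course this only tells you about the APK on your device; it says nothing about what the servers do with the ciphertext and metadata they relay.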

There's no need to trust them. You assume they log everything that your device sends them, as well as the time, IP address, etc. and infer all they can from it. Then you act accordingly. You can apply similar conservative assumptions to your device and the programs it runs, but for practical purposes you may want to relax them somewhat.

For that matter, what is Zoom's architecture and dataflow?

Use Matrix or Keybase and self-host. I'm shocked at how much people still trust these shady companies.

Get hired and sign an NDA.

The same way that we trust that Apple doesn't log our keystrokes and send them back to the mothership.

The Signal Foundation was co-founded by one of the founders of WhatsApp, which was sold to Facebook. Is there a guarantee that Signal won't meet the same fate?

Yes, Signal is a nonprofit charity.

One reason you can believe the claim is that there's no real market for personal data, despite the folk belief that everyone's data is somehow worth a fortune.

Not everyone's personal data is worth a fortune. For specific persons of interest, however, their personal data IS worth a fortune.

"No real market" is an easily disproved claim. While any one person's data isn't worth a fortune, in bulk it's worth an imperial butt load. Attention is what you're able to sell; that's advertising and sales. E.g., do you think people associated with right-wing militias would be more or less receptive to advertisements for prepping gear? How big is that market?
