This is fantastic! I also love that there is the QR code generator. It'll make connecting easier.
I hope moving forward we can have multiple usernames and profiles. This would greatly increase privacy, since we may have different identities in different social groups. Even on HN a lot of us have multiple personas. I find one of the big challenges is actually handling these different identities, since most software assumes you only have one, even though multiple accounts are common on social media like Twitter or Instagram. But Bitwarden still doesn't know how to differentiate Microsoft logins lol
Edit: I'd love in the future to also see things like self-destructing or one-time links. I don't think these should be hard to implement, especially if one can have multiple usernames. Surely a limit of, say, 3 usernames per number would be fine, right? Personally I wouldn't be upset if multiple names became a premium feature, but I'd strongly prefer it wasn't. I get that Signal still needs money (https://news.ycombinator.com/item?id=39446053)
> But bitwarden still doesn't know how to differentiate microsoft logins
To be fair to Bitwarden even Microsoft doesn't know how to differentiate between multiple Microsoft logins. As of at least a year ago, you can technically have different logins with the same username/email identifier, and different login prompts will behave differently.
Also worth mentioning that some of those are connected and some are not. For example, I have a personal account (which I did not create but which appeared magically at some point; it behaves as totally separate), a work account (main work tenant), and three guest work tenants that share the password but don't share the 2FA. For some apps you choose the tenant, but not for all.
Oh yeah, it was more a joke than anything. Microsoft is just creating such a shitty environment. I can be logging in from my company portal, where they already know the identifier, yet I still have to add @company.com. I mean, I've got one for my job, one for my university, one for conferences (CMT), and I swear I'm forgetting 30 others that I only use once in a blue moon.
They're also real shady with YubiKeys. You can't set them as the default, but you can set "security key." So the process ends up being: it assumes you want to use Hello (which breaks my Outlook... wtf), click use another device, security key, click next, then finally type in your credentials. The Next part makes me real suspicious, since all the other dialogs go to the next page without clicking Next. Why just this page? It's some weird dark pattern bs.
I'd call it malicious, but I think maliciousness requires intent. A chicken running around with its head cut off isn't really malicious if it runs into you.
Indeed. With an incoming Teams meeting invite, it should be determinable from the sender's context which account should be used for the meeting. Instead there are 2 minutes of waiting, and what seems like pot luck with the account.
Telegram has had all of these features for a while… too bad it isn't as secure as signal or it'd be perfect, since it's also written in a real GUI toolkit and present in distribution repositories.
I do wonder how telegram and signal are planning to finance it long term. Telegram is adding absurd paid features like exclusive animations, which won't earn nearly enough to cover the costs.
I wonder how Signal plans to keep the servers up, since they hate federation so much.
Telegram and Signal solve very different types of privacy issues.
Telegram is good, as you mention, for being relatively private in groups/chats/channels without needing to expose either your phone number or even a nickname (unless you live in an autocratic country — will come to this later).
But it comes with costs. First, their person-to-person communication is not e2e encrypted by default. Not to mention that comments and group chats are not e2e encrypted at all, unlike, say, WhatsApp.
Second, the Telegram API. It gives away too much information. You can do a lot with it: read history, track changes of usernames, etc. For example, it is quite easy to obtain an internal user ID, and there are black-market services and databases that promise to connect that ID with a phone number if that account ever had its privacy settings switched off in the past.
They claim to scrape all accounts and pair the ID with the phone number for those whose privacy settings were set poorly. Even if you change the settings later, your internal ID and that scrape will stay around forever.
Third, Telegram has been funded by the Russian government since Durov ran into issues with the SEC. He raised money from Russian state-owned banks like VTB, issued bonds which are traded on the Saint Petersburg stock exchange, and even took some money directly from the Russian government through a Qatari proxy company. Not to mention that there are cases where TG was involved in criminal charges against people (the most famous being the Ryanair plane forced to land in Minsk to arrest a critic of Lukashenko), and the company has never directly addressed or explained how exactly those people were caught, or how it protects against “SIM card replacement” cases (Signal at least informs me every time my peer logs in on a new device).
Choosing between Signal (with, AFAIK, no known cases of charges in dictatorships like Russia, and funded by a non-profit) and TG (without default e2e encryption, with a public API and Russian state funding) is quite obvious for me.
Adding to this “lucky coincidence”: it was unbanned exactly when Durov got into trouble with the SEC and raised Russian state money to solve his problems. Around the same time almost all official Russian institutions opened TG accounts, and representatives of the Russian Parliament (if we can call that silly thing a parliament) were saying things like “we solved all problems with them”.
When the war started and Russia banned a lot of services like FB, they created a list of communication platforms whose loyalty and cooperation with the Russian government they had questions about. TG was not on that list, and throughout the whole war the only issue was about Telegraph — the supplementary platform for publishing long notes. AFAIK there have been zero questions or criticisms of TG in those two years.
I didn't know a lot of this. I thought Telegram was mostly funded through Durov's Bitcoin and VK money? It feels strange that he'd be so "in bed" with the Russian govt when the whole reason he left was because of his staunch opposition to taking down Navalny's VK page.
But I haven't done extensive reading on this.
Durov was indeed in opposition to the Russian govt for some time, and TG was banned in Russia for some time.
But then the “SEC incident” happened. He and his brother wanted to build TON and fund it with a kind of ICO (without naming it an ICO). The SEC decided enough was enough and blocked the launch of TON, charging Durov with selling unregistered securities.
In the end the issue was settled: Durov returned all the money and settled with the SEC, but it shrank his finances by a lot and he ran out of money for TG.
Then he was seen in Russia and issued bonds for $1 bln. According to the Russian financial press [1], the bonds were underwritten by Russian banks closely affiliated with the government or directly state-owned (all of them are on the sanctions list now), and some money was even invested by the Russian Direct Investment Fund [2]. Last summer he again issued bonds for TG, for $270 mln.
You can buy TG bonds on the SPB stock exchange, where they were listed two weeks after issuance [3].
Surprisingly (repeating my comment below), around the same time the Russian govt withdrew all its claims against Telegram and started using it as an official communication channel.
Not to mention that other “transformations” happened, like Durov publicly denouncing the US, declaring it a “police state” [4].
Durov personally blocked Navalny channels in Telegram during 2021 elections - https://www.rferl.org/a/telegram-navalny-smart-voting/314662... even though "technically" as a foreign legal entity they had no obligation to follow orders of Russian censorship agencies. Also, if you look up the results of court decisions in Russia, Telegram leads by a significant margin among other messengers. Yes, of course, it is the most popular messenger in Russia, but it is designed from the ground up to tie and control the circle of communication to specific people as precisely as possible.
Dictatorship exists in various forms. Russia has democracy, though in bad shape. There are various flavours of democracy. But what about a total dictatorship like China, which has no opposition, or the many countries with theocratic monarchies?
It's really easy to tell the difference between a democracy and a fake democracy. Democracies are messy; people never agree. Anywhere that gets consistent landslides for one person or party is not a democracy.
Take for example France vs Russia. In the 2022 election, only ~30% of voters wanted Macron as President in the first round. In the second round, where only two options remained, only 58%.
Without any serious opposition (given the murder of Boris Nemtsov and the jailing/deregistration of Alexei Navalny), the 2018 election was again a landslide for Putin, with 76.69% of the vote.
There are of course other easy ways to tell, but this serves as a pretty easy heuristic.
This is, of course, a gross simplification of everything that makes up a democracy. For example, the US is at best a flawed democracy because of all the lobbying, money and gerrymandering (and things like the Electoral College).
Disclaimer: Not American, I'm a Kiwi, so outsiders view of US politics.
Bullshit. Russia has no democracy, even in the minds of its citizens, to say nothing of the government. It never had democracy and may never have it. At least, not while Russia exists in its current shape or form.
My bet is that they have a chance at democracy only when Russia becomes a set of little independent states, as Russia, in a nutshell, is just Muscovy having occupied other sovereign states. Exactly like they're trying with Ukraine now, again. Again, as the previous time in 1918, when Russia ‘incorporated’ other states into what we know as the USSR.
Don’t worry, telegram is now gatekeeping certain privacy settings behind the premium subscription like it’s 2003.
They also make it difficult to hide your pseudo identity from your phone contacts. I’ve had all the “discover contacts” settings turned off, and simply reinstalling the app caused people to be given my username without my consent. Settings somehow magically switched themselves back on and I couldn’t turn them off until after the damage was done.
There was no confirmation prompt. Pretty sure this happened to me more than once.
i've been using Telegram on and off since 2015 or so, and i've never shared my contacts. never! re-installing Telegram has never changed that setting.
The real problem with cellphones is that a lot of privacy-threatening issues are literally one fat finger away. And clearly, that's a feature, not a bug. That's why I prefer to work and message on my laptop anyway.
but again, Telegram has been, in many practical ways, much more privacy-oriented than all the other messengers, exactly because you don't have to share your phone number to participate in groups and chats.
For example, you now can’t restrict who can send you a message unless you have premium. Also, they added a “feature” where premium users can bypass non-premium users’ “last seen and online” privacy setting, and TG will show that info regardless of your choice unless you are premium too.
You're significantly misunderstanding the changes.
> now you can’t restrict who can send you a message unless you have a premium.
And before that you just weren't able to restrict that at all; there was no such feature. They didn't remove this feature for free users - it never existed. They just added it now, only for paid users.
> premium users can bypass non-premium users privacy setting “last seen and online”
That is absolutely not what the feature is. If you hide YOUR OWN last seen time, you won't be able to see last seen time of other users, even when they have it public. Now, premium users will be able to see public last seen times of other people if they hide their own. But they obviously still can't see last seen time of people who set it to private, that would've been very dumb.
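If it helps, the rule as described boils down to something like this (a rough sketch of my reading of it, not Telegram's actual code):

    def can_see_last_seen(viewer_is_premium: bool,
                          viewer_hides_own: bool,
                          target_shares_last_seen: bool) -> bool:
        # Someone who set "last seen" to private stays hidden from everyone.
        if not target_shares_last_seen:
            return False
        # Free users who hide their own "last seen" can't see anyone else's
        # (the reciprocity rule); premium users are exempt from that rule.
        if viewer_hides_own and not viewer_is_premium:
            return False
        return True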
Thanks for the clarification on last seen, I certainly misread it. About messages: hm, I was sure it existed before but maybe again my brain just lags.
As someone who for some time created and moderated a fairly popular chat (200+ people) for anti-war Russians, I have a very long and complicated history with this service, and a lot of grey-zone stories where it is hard to tell whether something was a mistake by users or a leak from the service.
Hence I have somewhat low expectations and overreact to their recent changes.
I have three Telegram channels with a few hundred subscribers each, and I also use the service daily, as I'm Russian as well.
I generally agree with you that Durov makes a lot of incredibly stupid decisions. I think pretty much everyone in the "Telegram community" (eg. channel administrators, bot/client developers, etc.) would agree that the changes Telegram is introducing are often bad.
The issue, though, is that there isn't any alternative right now - Telegram is the best messenger out there in terms of general usage. So while I do hate what they're doing sometimes, I still use the product and even pay for Telegram Premium. It's bad enough to be mildly annoying, but not bad enough to actually make people leave the platform.
Edit: just as I was writing this, Telegram introduced a new feature. I'm not sure if I love it or hate it to be honest, it's a smart way for them to save money, but it is pretty weird: https://t.me/tginfo/3942
If you consider Telegram as a product to be a logical continuation of the VK message system, then all of these "features" existed.
Restricting incoming messages existed (cloned from Facebook, as usual).
Restricting of "last seen and online" existed in third-party clients. Later on VK started to actively destroy this functionality, by moving manual "is online" management from designated API into all data-fetching APIs.
Not to mention that VK and Telegram are now actively fighting third-party clients. In what world would they not fight Ninjagram/AyuGram/Plus Messenger/other forks, which let you add multiple accounts, hide your online/read status (to some extent), show message editing history, and so on?
> And before that you just weren't able to restrict that at all
This is a really basic security feature though that every single platform should support. If Telegram didn't support messaging restrictions before, that doesn't mean they're not currently gating a basic privacy/safety feature behind a paywall. It just means they should be embarrassed that they used to be doing something even worse, ie not even offering a basic privacy/safety feature at all.
Correct that this would not technically count as removing a feature, but I feel like that's possibly a distinction without a difference. I'm not coming out of reading this explanation feeling more charitable about Telegram's security or willingness to gate off security features. It's a bad look for a company to put basic blocklists behind a paywall, that is not a company I trust not to start degrading security for free users.
How is message restriction a "basic privacy/safety" feature? It's at most a basic "anti-annoyance" feature, I'm not sure what security you gain from preventing everyone from messaging you. The ability to block users was always there and it still there for free.
> It's at most a basic "anti-annoyance" feature, I'm not sure what security you gain from preventing everyone from messaging you.
This could be a long conversation. The short version is there are plenty of articles online by marginalized groups talking about the consequences of having no ability to block arbitrary groups from harassing them online. If someone is calling that "just an annoyance" they've likely never been the target of an extended public harassment campaign.
A slightly longer answer is that the consequences to privacy and security are in a practical sense -- in the sense that someone coming into my house is a violation of my security and privacy. Privacy is not just about hiding information, it's also about why we hide information. It's about the ability to be private; to not be forced to constantly listen to a bunch of people shout at you. Similarly, security exists for a reason, we have security in our homes in the sense that people can't just walk into them and start yelling at us and harassing us. And DMs should be thought of as analogous.
Your DMs are not secure if you have no way to turn them off or restrict them.
> The ability to block users was always there and it still there for free.
If you recognize that is important to privacy and security to be able to block individual users, it's not too hard to recognize that the requirement to individually block users leaves a huge gaping hole in security for a network that supports open registrations.
I use disposable email addresses rather than just blocking individual spammers in my email client. The reason is because there are a near-infinite number of spammers and blocking them one-by-one is ineffective. Being able to turn off a leaked email address is much more valuable to me. It's something that actually cuts down on spam.
And the same is true on social media -- being able to go private and turn off messages or restrict messages to certain subgroups is critically important for people who are stuck in the middle of public harassment campaigns.
----
Regardless, the lack of a feature that is pretty much standardized across most other platforms, and that is pretty widely recognized as a safety feature -- it doesn't make me feel better about Telegram's willingness to gate these kinds of features behind paywalls.
You're saying that the ability to block users is free, but there is no bright line between blocking users and setting general messaging restrictions. That is the same category of safety feature. There's no reason to believe that Telegram wouldn't make blocking users into a paid feature in the future, especially since it has demonstrated that blocking/moderation/lockdown features are something it is willing to monetize.
I don't get why people who are so paranoid about someone associating their Telegram handle with their phone number simply don't go and grab a burner SIM at Tesco.
I mean, I'm all down with the idea of tech companies respecting our privacy. But here we are, complaining that corporations that are at least trying (and that have been operating at a loss since their inception, for our convenience) aren't giving us "Snowden hiding in Russia" levels of security out of the box, for free, just because we deserve it. All while we could easily implement it ourselves for like $8 and with no online trace whatsoever.
It's like, Tails Linux exists, but FUCK GOOGLE for forcing me to Ctrl+Shift+Delete in Chrome if I want to erase a cookie. I'm so significant and certainly not a criminal, why do they hate me so much??
It's not always that simple. In many countries, like Brazil, you need a valid ID document to buy a SIM card, and the number is then and always linked to your government ID. This is the case for quite a few relatively free countries as a means to fraud prevention (not that it's particularly effective though).
Why is there such a pervasive crowd of people who chalk this kind of thing up to a lost cause? From my perspective, if it can be done, we should, we must, do it. Is there something special about hiding something from a government that's qualitatively different from hiding it from any other criminal? That they can levy greater amounts of violence? Isn't that even greater justification for privacy?
I'm fully in agreement; we have policies around warrants, etc that have been long-running and should in general treat the government as a quasi-malicious actor.
However, just because the government forces something for them doesn't mean we should just give up entirely for everything - the fact that the government knows your SIM purchase doesn't mean that random users on HN should be able to find it.
> I don't get why people who are so paranoid about someone associating their Telegram handle with their phone number simply don't go and grab a burner SIM at Tesco.
I could not hate the phone number requirement more, and it's one of the main reasons why I don't use these applications.
With one exception: I have an overseas friend who only communicates through WhatsApp. For him, I did go out and get a burner phone for this purpose. But the friction level of doing that is unnecessarily high and I doubt I'd do it for anyone else.
I hadn’t used a burner in years; last year my phone broke on a trip and I just wanted to grab a phone to get me through the week. I can say it’s not like it used to be! You can’t just grab one at the gas station and pop the SIM in a phone. You’ve got to give ID, sign up for accounts, etc.
It depends on the country. You can buy a SIM card at an Oxxo in Mexico like you would buy a bag of Doritos. I did exactly that last year.
Having said that, if you leave the country I am pretty sure that SIM card and number would be deactivated after a few months if it never connects again. I am not sure how quickly a number can be reused.
Telegram isn't a messaging service. It's a social network with a messenger UI. Quite ingenious, if you'd ask me, but a social network and a private messenger can't really be reconciled into a single product.
I think "social" in this context refers to frictionless friend finding, not stickers. Good privacy involves a certain level of friction, with PGP verification being a classic example of the UX problem space.
I kick in $5 a month because that's about what I figure self-hosting a messaging service would cost me. I don't want the hassle of self-hosting and I trust Signal more than the other remote hosted options.
I’m not who you replied to, but I agree with his sentiment about signal being superior to telegram in terms of security (or more specifically, privacy).
For me, there’s two big reasons for this:
Signal chats are E2E at all times, while Telegram is only E2E when you explicitly create a “secret chat” with whoever you’re conversing with. I don’t fault Telegram too much for this, because they still provide the option to use E2E for everything, but Signal gets brownie points in my book because they just do it by default without getting in the way of the User.
Secondly, as far as I know, Telegram uses their own in house encryption techniques as opposed to industry standards. I am not at all knowledgeable about encryption or cryptography— I only know what’s required of me in my job (basically the bare minimum), and so I don’t actually know whether this is anything of serious concern. It could very well be that Telegram’s encryption techniques are just as effective as the established norms, but I do see the general consensus trending towards “roll your own encryption = bad, use established norms = good”, which is primarily what I am basing my opinion on here.
To further detract from my own point, it actually seems like Telegram might be using “established norms” for encryption nowadays anyways [1], although I couldn’t really tell from the brief description I read on Wikipedia.
Overall, I think Telegram is perceived as being less secure than Signal primarily because of the reputation Telegram has for implementing their own in house encryption techniques, even if they don’t use those techniques anymore— their name has become associated with their known history of using ad hoc encryption.
Chats are not e2e encrypted by default, they are just encrypted in transit. However this allows chats to be synced across many devices, so it is very very convenient.
Telegram has e2e encrypted chats but only on mobile and not on desktop for some reason.
telegram is e2ee only for secret chats, all other chats & group chats are not e2ee (which means telegram can access their content at will on the servers)
Syncing chats across devices is possible with e2ee; even Signal has this. There's just one edge case that's poorly implemented: the initial sync of chat history, which AFAIK they haven't fixed yet. But all messages sent after setting up a new device stay in sync, as far as I know.
I don't want to be too dismissive of Matrix, but I also see these types of comments as missing what problem Signal is actually addressing: security for the masses. There's no way I'm getting my grandma on Matrix, and you're delusional if you think she can set up a server. But it isn't hard to get my grandma on Signal, and that's a much better security feature than federation or even not having phone numbers. If I want extreme security, you're right that there are better tools. But my threat model isn't trying to avoid nation state actors; it's mostly about avoiding mass surveillance, surveillance capitalism, and probably most importantly: sending a message to the gov to fuck off with all this spying. At the end of the day, there's no other app that's even close to fulfilling those needs.
I didn't realize my comment rose to the top. When I had written this I had also written this comment[0] which was the grandchild of the top comment at the time. It has a bit more details on my thoughts/reservations of federation. tldr is mostly about avoiding centralization. This remains an open problem and I think it is far too easily dismissed. But federation isn't solving the problems people want it to if it's federated like email and web browsers. That's just mostly centralization with all the headaches of federation.
And to anyone complaining about lack of federation, what's stopping you from running your own Signal server? Sure, it won't connect to the official channel, but is that a roadblock? Even Matrix started with one server. This is a serious question, is there something preventing this? Because if the major problem with Signal is lack of federation, I don't see why this is not solvable building off of Signal and not needing to create a completely different program. Who knows, if it becomes successful why wouldn't Signal allow a bridge or why can't apps like Molly allow access to both the official and federated networks?
Oh, I agree completely with everything in the top paragraph, and I certainly have seen a natural trend towards central nodes/relays in all the federated networks I can think of. I think the appeal is that for the average user its about as good security as anything else available, and it has the option to work off the centralized network.
lol, I can barely get my grandma to text. My parents don't even get Signal. Most of it is willpower though; no one gives a fuck. In fact, most of the people in my CS grad program think both are too hard to use and don't see the point of using encrypted messengers. Even people studying security aren't using Signal. Yes, I think it's odd too.
The willpower bit can be countered with your own : refuse to use the software you deem harmful. (It's easier to never start a bad habit than to stop it.)
> Note that even once these features reach everyone, both you and the people you are chatting with on Signal will need to be using the most updated version of the app to take advantage of them.
> Each version of the Signal app expires after about 90 days, after which people on the older version will need to update to the latest version of Signal. This means that in about 90 days, your phone number privacy settings will be honored by everyone using an official Signal app.
Which is also an example of a challenge for open ecosystems where everyone can create apps.
I understand that it doesn't outweigh the benefits to everyone, but it is a valid reason.
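For what it's worth, the mechanics behind that expiry are simple enough to sketch. This is illustrative only, not Signal's actual code, and assumes a build timestamp baked into the app at compile time:

    import time
    from typing import Optional

    BUILD_TIMESTAMP = 1_707_000_000        # hypothetical epoch seconds, set at build time
    EXPIRY_SECONDS = 90 * 24 * 60 * 60     # roughly the 90-day window mentioned above

    def build_is_expired(now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        return now - BUILD_TIMESTAMP > EXPIRY_SECONDS

    if build_is_expired():
        # An expired client would refuse to talk to the servers and ask for an update.
        print("This build has expired; please update Signal to keep messaging.")

Presumably the point is that the servers can then assume every connecting official client is at most ~90 days behind on protocol changes.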
The apps and most of the backend are open source too, not just the protocol.
The important distinction is that it's not decentralized like XMPP or email, which is a conscious decision: it would become very difficult to change it to add new features and they'd be left behind by closed-source competitors (see: XMPP).
I see a ton of wishful thinking and FUD on the side of Signal in claiming that: XMPP is alive and kicking, has all the features one needs, runs everywhere, at scale, offers the same or better crypto, better privacy, better resilience, and is more sustainable. When Signal inevitably fails/turns against its users/enshittifies itself or gets acquired, all federated and P2P protocols will keep on going. For decades. That's the kind of communications system we should be demanding in the present era, nothing less.
XMPP is underrated. A lot of people are imagining Pidgin in 2011, but the protocol has been extended, the actively developed clients are good, and it avoids the heavier parts of Matrix (both client- and server-side). I wouldn't be surprised if Slack's replacement, when Salesforce inevitably fucks it up, ends up being XMPP-based rather than Matrix-based.
"The protocol has been extended" has been XMPP's theme for decades, and also its problem. Name your favorite client, it probably won't have several extensions, and a lot of useful things require support on both ends plus the server. Lots of things that should be ubiquitous are not, including s2s auth. There needed to be more structure, like AIM back then or Signal now. Also the XML stuff is a nightmare.
Even if Google Talk had kept XMPP, they weren't going to save it, because nobody used Google Talk. Facebook was by far the biggest XMPP-supporting platform (though it wasn't federated), and they stopped, probably because they didn't see enough clients using it. Even Slack supported XMPP for a while; did you use that?
Is it really a wish if it's already come true? I can't name a single person who uses XMPP. If a federated chat protocol ever wins, it'll probably be something more modern like Matrix. At least there's email too.
> My understanding is that Signal (the app) is private, not anonymous, centralized, and closed.
You are right about that.
There used to be an open source build called LibreSignal
Moxie Marlinspike made it clear [1]: You may inspect the code. You are even allowed to compile it. You are not allowed to connect your self-compiled client to our message servers. We are not interested in a federated protocol. Make sure your fork creates its own bubble that does not overlap with Open Whisper Systems. Stop using the name Signal.
You can run Signal app forks against the Signal server. Molly is a popular one. You just can't create new servers. I wish you could, but I get the reasoning of not wanting honeypots. But that doesn't stop you from running your own network of Signal servers, so I don't see anything stopping anyone. I mean, Mullvad runs their own stuff and I don't see half the complaints about them. I've always been curious why Signal is so unique here. If 1/100th of the people raising these concerns developed an open community of Signal servers, I'm sure we'd have a viable alternative network. What's stopping everyone?
One of the big lessons from Twitter and Reddit was that third-party apps are tolerated or even encouraged until they are not. Unlike, for example, Discord, I haven't seen any indication that third-party clients are causing account bans, yet.
The status of open source, privacy respecting messaging apps looks really healthy to me, compared to where we've been over the past 30+ years (thinking starting with ICQ.) Signal was a big leap toward getting average people using much more secure messaging, although it is pretty clear even most 'tech' people don't grasp what is going on or why it is important to be able to use e2ee separate from a combined client+server provider.
Yes, but my argument is more in the realms of "why are there no projects to create an open network using the existing architecture" not "we shouldn't have an open network and completely rely on Signal forever."
I'd still appreciate a source. There are things I'm aware of that I think could be confused with this, but I've seen no indication of them actually wanting to ban or even caring about forks. Only about the servers.
As far as I can tell, Signal's policy is more "Do what you want, but server costs are high so we don't want to pay for your product. But if you do, here's all the code to give you a start." That's a very different policy from blacklisting.
And as I keep asking others, what's stopping everyone from making a federated Signal? If you can use the same account on both the production/official server and the staging server, why can't you on the production server __and__ a community federated server?
And if they ban you from the production server, so what? Now you're on par with literally every other federated service. Like what is Signal going to do? Stop open sourcing code? That'd be like trying to kill a mosquito by stabbing yourself in the heart. If they're willing to do that, I'd rather it be sooner than later anyways.
So I want a source because I just don't get what you all are complaining about. Is it just that someone else didn't make the thing you want? Sure, I get frustrated, but the comments more come off as Signal being nefarious and I just don't see Signal acting in any way malicious. In fact, hosting links to forks and being a common place for those forks to discuss seems like they are actively supporting them.
Who is complaining? This is confusing. The whole idea behind Signal is to compete with the mainstream, as well as with the federated ecosystems, by having a single implementation of both client and server. I believe the argument is that only by moving faster is it possible to compete with the more mainstream commercial messengers for the masses and still have reasonable cryptography.
Moxie wrote several articles about this and expanded on the idea in his conference talks. You are very welcome to take the code and write your own messaging system, but do not connect to Signal's servers, because that costs them money and they will need to take action sooner or later.
They were very clear that LibreSignal had no future. They have also been very clear that they discourage any non-official distribution of builds. They have repeatedly told the F-Droid project that they will not publish using their reproducible build system, and any user doing the same will be kindly asked to take down their copy. The F-Droid project has complied.
This seems to be a strange thing to discuss. If the above links are representative, it may be a popular subject among a subset of users, which seems misguided. Signal does not wish to be XMPP or Matrix, and neither should they. It must be their right to decide. There are so many chat software projects; if you don't agree with the goals of one of them, your energy is better spent elsewhere.
Lots of people are complaining. It's why Moxie wrote those articles. It's why there are so many comments bringing up Matrix and others. People were even doing this before Matrix was E2EE! So yeah, I'm tired of hearing it, so I'm calling people's complaints out. If you don't like it, fix it. It's HN and people are devs here.
Moxie's post looks solid, but there is a counter example: bitcoin nodes. They are a very loose federation of nodes that go through regular upgrades in the protocol. So it is possible.
But yes, it's also very hard. The bitcoin protocol didn't start out that way. It took a lot of knocks and bruises to get to the point they could upgrade all the servers in the federation.
Interestingly, the method bitcoin came up with allows protocol changes to fail, meaning the bulk of the federation never takes them up. Everyone gets a vote, and it only succeeds if the bulk of the federation upgrades. Perhaps from Moxie's point of view that's unacceptable, as it means he is no longer the dictator of the protocol.
Nonetheless, it is possible to design a protocol so it can be upgraded relatively quickly. Even if you don't add "quick transition" features to a protocol, transitions can still happen. IPv6 will replace IPv4. But as Moxie says, it's painfully slow.
The author is no longer CEO, though, and there are a lot of "I" statements in the post. Is it still accurate? Has the current CEO made any comment on it?
It's a great encapsulation of why Signal is not federated, and, unless you find the current CEO stating otherwise, is unlikely to change. Changes like the one detailed in the link simply wouldn't be possible to roll out efficiently in a federated ecosystem.
Signal has consistently focused on helping /most/ users do what they want with the app without sacrificing security. This change - away from requiring phone numbers - helps plug one of the biggest criticisms, both on the security and product side. Nothing about their mission requires federation, so I respect that they haven't sacrificed their mission in order to do it.
I tested Matrix in 2021 and found the experience pretty darn awful outside the main client. And at a cursory glance the ecosystem is still pretty much controlled by the matrix.org folks. When I was using it, there were a lot of accusations that Synapse did not follow the specification and that server implementers had to reverse-engineer what Synapse did to be able to federate.
And talking about that: does federation work properly yet? I used a third party provider and it made my life miserable.
I am all for federation, but in my experience the "federated" part of matrix was a lot worse than the jabber one they want to replace.
It's not [attempting to be an open ecosystem]. Their ToS used to forbid using third party clients. I don't think this has changed. They haven't banned anyone for using third party clients (to the best of my knowledge), but they're openly against an open ecosystem.
It's private, centralised and the network is closed (e.g.: non-federated), but the source code is public and open source. I think that for the server implementation they do code dumps every once in a while, rather than continuously keep it public.
I wish it were more obvious that Signal expires its apps every 90 days.
My mom couldn't receive signal calls on the backup phone I gave her. I had disabled auto-updates since apps break UI sometimes and she gets confused by things moving around.
When I visited, I opened the signal app and was told I had to update.
I have been bitten by this in the past. At least now they give warnings in-app that the app will expire soon. But if you don't use the app regularly, you wouldn't even know. Also, I'm not aware of any other apps that die in this way, so it's not like people are in the habit of periodically checking the app to make sure they're still on a version that can receive incoming messages.
This has more sinister implications in some places. For example, Apple app store in Russia can get banned at any time. So if I understand this correctly, if that happens, Signal will stop working for all iPhone owners in Russia in 6 months. And guess where you really need something like Signal?
It's patently unforgivable that a message would not be delivered because the client is out of date.
The Signal team is incredibly clueless and arrogant toward its userbase. It seems to simply not have occurred to them that many people rarely/never have wifi, may not be on AC power when they are on wifi which means the phone may not check for / apply updates, etc.
In the US, cellular is often expensive and slow.
In underdeveloped countries where software like Signal could be really important, all this is even more true.
We get shit crammed down our throats to protect the most obscure edge cases for the smallest percentage of the most vulnerable users - such as not being able to sync messages between devices - but then they pull shit like this which has a huge impact for people in rural areas and underdeveloped countries?
> Delivering a message to a client which is known to be less secure than the sender expected it to be is unforgivable.
That is inconsistent with the threat model of a messaging system!
Inherently, a messaging system will deliver a plaintext copy of the message to the recipient(s). Wouldn't be much of a messaging system otherwise.
Once you sent something and it was delivered in plaintext to the recipient, the information disclosure risk is completely out of your control (and out of control of the application in use). The recipient is free to leak it however they wish.
If you don't trust the recipient to keep it private, don't send it.
> That is inconsistent with the threat model of a messaging system!
I disagree, the worst thing that a messaging system that aims to be "private" can do is to actually not be private. Sending to a known-insecure client is a violation of, like, the one thing signal claims to do.
> If you don't trust the recipient to keep it private, don't send it.
My threat model is some combination of "third party actors who I don't trust" and "second parties who I trust but who are non-experts"[1]. I would like Signal to protect me from the first (by not delivering things to known-insecure clients that can be middlemanned or otherwise discovered) and the second, by having privacy-respecting and mistake-preventing defaults. Things like disappearing messages and such. Keeping my trusted-but-nonexpert peers from making mistakes that can harm either of us in the future is a key part of my threat model.
For example, disappearing messages prevent me from being harmed by my friend, who I trust to discuss things with, not having a lockscreen password and getting a warrant served on them by the police. An outdated or third party client that lets you keep them forever, even if well intentioned, can break that aspect of the threat model. And yes, a peer who is actually nefarious can still do that, but that's not my threat model. I think my friends aren't privacy experts; I don't think they're feds.
[1]: This is, for example, the reason that I think PGP is not a good tool. Even if I do everything right, a well meaning peer who is using the PGP application can unintentionally leak my plaintext when they don't mean to, because of the tool's sharp edges.
But you don't know, at the time of sending, which version of the client will show up to retrieve it. Otherwise both clients would need to be connected at the same time before you were allowed to send.
Just curious, since I'm not really active in this space, but wouldn't the threat model of most concern be that an external actor breaks (maybe an outdated version of) the app or protocol? This would leak data without you or the recipient being any the wiser. It seems like that's the threat the app-expiry policy is intended to address.
You could update the protocol version if and when a protocol weakness is discovered and then stop talking the previous protocol version after a transition period.
No need to continuously expire apps in the absence of a protocol breach.
What if there's a vulnerability in the app itself?
I have no idea if that's what they're concerned about - they may just be being arseholes in this case - but from the outside it seems like a legit reason to build in the capability for app expiration.
If the app has to be updated on a 90 day schedule, then it's likely that most of those updates aren't making anything more secure. So it's not "known" that someone running last quarter's version is less secure than the sender expects.
I think this is the tradeoff that Signal makes versus the messenger most similar to it, WhatsApp. Though of course everyone in a group chat must pick one or the other, so it's not much of a free choice. (My friend group in the bay area is entirely on Signal, for example, though I also have a WhatsApp account.)
> In the US, cellular is often expensive and slow.
Mint will sell you a plan with 5GB of data for $15/mo. It's not that expensive to have a basic cellular plan. And that's assuming you're not poor enough to have your cellular plan almost entirely subsidized. And also assuming you're pretty much never anywhere with wifi.
In the vast majority of markets in the US it'll take a minute or less to download; it'll probably take more time unpacking and installing on your device.
Sure, but the thing I was responding to was "in the US".
There are cheaper per-gig plans in the US. Visible has unlimited plans for $30/mo, which is cheaper per-gig if you use a lot but more expensive if you're using less than 5GB anyway. And if 200MB/yr currently seems like an expensive amount of data to you, you're probably already using less than 5GB a month.
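Just to put rough numbers on the per-gig comparison (illustrative math using the prices quoted above; real plans and caps obviously vary):

    mint_price, mint_cap_gb = 15, 5           # $15/mo for 5 GB
    visible_price = 30                        # $30/mo, "unlimited"

    mint_per_gb = mint_price / mint_cap_gb    # $3.00/GB if you use the full 5 GB

    # Visible only becomes cheaper per gig once usage passes this point.
    break_even_gb = visible_price / mint_per_gb   # 10 GB/mo

    print(f"Mint: ${mint_per_gb:.2f}/GB at full usage; "
          f"Visible wins per-GB above {break_even_gb:.0f} GB/mo")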
Yup, I was on an international trip with hardly any data allowance when all of a sudden my messages stopped sending, and I couldn't receive any new ones... That'll never happen with SMS. I love Signal, but some of their product decisions have been questionable.
Their decisions seem right for the use case of a secure messaging app, but I don't care about that use case and would rather use a non-e2ee app that'll be reliable, not lock me out, and work seamlessly across devices. Also, for those who truly care about e2ee, it's pointless if you aren't checking all the safety numbers out-of-band.
Yes, this is a compromise on the CIA triad. It prefers integrity and confidentiality over availability.
That is a fine decision to make for a security-minded app, but Signal has always presented itself as a full alternative to SMS and other messaging systems, where availability is prioritized over confidentiality and integrity. That should really be made clearer so that users are making an informed decision. They could also do wonders for the user experience by having the app inform the user of the problem and how to remedy it.
Yeah, but I wouldn't call SMS super available either since it relies a lot on the ends too. Had a lot of those drop when I traveled. Something like Facebook Messenger has a whole server storing messages, so it's solid, you'll receive them later even if your phone breaks.
The way they say "privacy settings will be honored by everyone using an official Signal app." kinda suggests they're gonna let third parties keep getting this info...
They won't. It'll be similar to message timers or delete for everyone. You can revoke sharing your number and it will be hidden in official apps but third party apps won't magically forget the number that was previously shared. However if you choose not to share your number from the start, no one will be able to see your number.
This is a common, but terrible argument. Anyone can (mis)use, make, or weaponise technology given enough time and funding. Following this reasoning to its logical extreme, nobody should ever do anything.
The problem something like this solves is to raise the bar somewhat and discourage a fraction of those who would.
It's not a big expensive task to look at what data an app is sending and receiving. Anyone with minimal reverse-engineering skill can learn how to intercept HTTPS to/from their own phone in 5 minutes. Signal uses some other protocol, but that's also doable; and it's open source anyway.
The conclusion isn't that Signal should be closed-source, it's that Signal's servers should not trust the clients not to be tampered with. So after 90 days, they will remove phone numbers from the protocol for users who have hidden them, breaking old clients, which is fine. What is the alternative solution you're thinking of?
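To make the "look at what your own phone is sending" part concrete, here is a minimal sketch using mitmproxy as an off-the-shelf intercepting proxy. The file name is made up; you point the phone's Wi-Fi proxy at the machine running it and install the mitmproxy CA certificate so HTTPS can be decrypted (apps that pin their certificates need more work, and Signal speaks its own protocol anyway, which is part of the point above):

    # log_hosts.py -- run with: mitmdump -s log_hosts.py
    # mitmproxy listens on port 8080 by default; set the phone's proxy to this
    # machine and install the CA cert from mitm.it to inspect HTTPS traffic.
    from mitmproxy import http

    class LogRequests:
        def request(self, flow: http.HTTPFlow) -> None:
            # Print every outgoing request as it passes through the proxy.
            print(flow.request.method, flow.request.pretty_url)

    addons = [LogRequests()]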
I mean, if WhatsApp said this about the privacy of messages, Signal would be running billboard ads about how they don't care about privacy and look at how much better Signal is, right? This is the company that goes out of their way to pile on advanced encryption and insists on using dangerous secure enclaves to get this kind of thing right... until they are asked the hide phone numbers, at which point they are selling people a false bill of goods that WILL confuse someone into giving their phone number to someone who they really shouldn't have. It isn't as if it is somehow impossible to hide anyone's number at the protocol level: hell... even Snapchat does this, right?
I like the idea, but they should have called it something else instead of ‚usename‘. Maybe ‚connection string‘ or ‚discovery phrase‘. Right now they have to explain at length in what ways it’s different from regular usernames.
European quotation marks commonly have the left one down low and the right one up high. The same applies for single quotes. But using comma-backtick is deeply unorthodox.
Interestingly, the author does not follow this convention on his personal site (first link in profile) … opting for the ‘single quote’ form instead.
To give a definite answer to the discussion below: it seems Czech, Slovak, German, Slovenian and Croatian sometimes use this format. Here is an authoritative source, the EU publications office:
Both are wrong. tcmb didn’t use ‚comma-apostrophe’ – they opened with , U+201A SINGLE LOW-9 QUOTATION MARK (not U+002C COMMA) and closed with ‘ U+2018 LEFT SINGLE QUOTATION MARK (otherwise known as an open single quotation mark).
Sorry I was quoting nsxwolf. But now that you point it out, I can see the difference. It's subtle so I'll copy paste so others can see.
tcmb: ‚usename‘
nsxwolf: ,comma-backtick`
stavros: ‚comma-apostrophe‘
godelski: ,comma-apostrophe'
Though while copy pasting I see tcmb and stavros as having the same character which is different from the longer character you pasted. Seems my clipboard doesn't like that character. I also seem to have crashed OSX's emoji and symbol tray. No longer pops up if I press the button (bottom left) or select from firefox but got it back by opening safari.
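For what it's worth, a quick way to see exactly what your clipboard pasted is to ask Python's Unicode database (a minimal sketch; swap in whichever glyphs you're comparing):

    import unicodedata

    for ch in ["‚", "‘", ",", "'", "`"]:
        print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}  {ch!r}")

That prints SINGLE LOW-9 QUOTATION MARK, LEFT SINGLE QUOTATION MARK, COMMA, APOSTROPHE and GRAVE ACCENT respectively, which is the whole distinction being argued about above.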
Fuck man, I do not envy you people working on ligatures. Or timezones. I'm always impressed by these random rabbitholes and complexities in things that always look very simple. It's beautiful in a weird way.
Wow this is like the most HN thread I've ever seen, I love it! It's almost like a punctuation version of "Who's on first?"
Everybody's arguing, then finally all is revealed, and I learned a ton of stuff along the way about German quotation marks and the subtle difference between backticks and opening curly quotes, and low quotation marks and commas, in the Verdana font!
(If this had been a serif font with actual curly quotes the differences would have been much more obvious...)
HellDivers 2 LFG right now is all about sharing friend codes... you can get a ton of them on Discord or Reddit... but then over time you end up with a kind of community-maintained, distributed DNS system of friend codes.
Six degrees will still exist.
(The funny, weird thing is that with HD2's server issues due to demand, one way to harvest these would be to create a fake LFG host game and have tons and tons of accounts bang against your HellDiver-pot, and scrape whatever you can from that.)
---
OK - I actually went down this hole the other day... you look at the Reddit LFG threads on the Helldivers subreddit - or the Discord...
So on Reddit, you just put .json at the end of the thread URL and download the entire thread as JSON. Now you have their Reddit ID, location, play style, and other details, AND their HD2 friend code...
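To illustrate, something like this works (a rough sketch; the thread URL is a placeholder, and Reddit rate-limits anonymous requests, so a descriptive User-Agent helps):

    import requests

    # Placeholder URL: append ".json" to any Reddit thread URL.
    url = "https://www.reddit.com/r/Helldivers/comments/EXAMPLE_ID/lfg_thread.json"
    resp = requests.get(url, headers={"User-Agent": "lfg-scrape-demo/0.1"}, timeout=10)
    resp.raise_for_status()

    listing = resp.json()
    # listing[0] is the post itself; listing[1] holds the comment tree,
    # whose bodies are where friend codes and other details get shared.
    for child in listing[1]["data"]["children"]:
        body = child["data"].get("body", "")
        print(body[:80])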
But since they can individually generate random friend codes on any game/system that allows it... you have a breadcrumb trail (given enough attention span to correlate all the shared info between these friend codes and the data received)...
Still - even with random friend codes - six degrees is still achievable, easily.
---
I deeply hope they do a tech talk post-mortem on this launch success spiral - it's fascinating...
But one thing I am really interested in: this is based on the Autodesk engine. I know they co-dev-dog-fooded it, but I hadn't really known of this engine at all... and what little I do know is that it's amazing...
But I'd really like to know more about the arch and overall traffic flows etc of this game.
It's beautiful to see "problems" like this explode in like ~2 weeks.
What do internet traffic graphs look like since growth, per carrier?
The developers' last game had an all-time peak of 7,000 users. They planned for a worst-case scenario of 250,000 users for the sequel, expecting more realistically around 50,000 users.
They're currently at 394,686 players on steam alone - not including Playstation players. The servers are doing their best right now.
Sorry, I don't quite understand this in the context of "friend code" vs "matchmaking". Are you saying that friend codes bypass their servers, allowing peer-to-peer play even when the servers are overloaded (the way direct IP addresses used to do in old PC games)?
I apologize for not asking a clearer question. I was actually just interested in buying the game, but only if it has public matchmaking built-in for finding anonymous pick-up groups, instead of needing an external Discord server to swap friend codes on.
I get that, but how does that help with overloaded servers? Unless the friend code is actually an IP address for peer to peer networking (which is rare in online games because of cheating), it still has to go through some central server.
At a minimum, the server would connect the tokens to players in a database. But usually I think they do more than that, such as hosting lobbies, punching through NAT, and in many cases, actually hosting the games themselves and being the authority for all the state.
In that case I don't see how tokens would save any load over matchmaking.
My reaction to the article was that they're using a lot of words to explain this change. That suggested to me that maybe they aren't being completely candid.
I've never used Signal, because (a) I don't want to rely on a smartphone, and (b) I don't want to use my phone-number as my ID, because it's traceable. I can't work out from the TFA verbiage whether this change addresses my concerns or not. That in itself is concerning, to me.
They also missed the opportunity, as they have many times over the years, to actually make it something like 'Hide My Number' in the true sense, after spending years sitting on this feature. That would have been a true case of "caring for privacy". This is just a lazy (too lazy!) copy from Telegram (with one good thing, though: getting rid of username vanity).
Unfortunately, spam exists, and phone verification is one of the least-bad ways to ensure that the user is a real person (there are other options, but it really is one with many advantages).
Given that Signal does not have access (by design) to much information about their users when they use the service, they can't really fight spam once accounts are created. You could do spam detection on the client and privacy-preserving voting in order to ban spammers, but the UX would be very poor and that opens a whole new can of worms.
This reasoning doesn't make sense to me. A spammer can make an account, but how would they contact me if they don't know my account handle?
Even if that leaks, the handle should be changeable, and the spam issue could be completely mitigated by having a tab for first time "message requests" separate from the normal inbox.
I can't take a private messenger seriously when they require an identifier that's linked to your government-issued ID in many parts of the world.
> I can't take a private messenger seriously when they require an identifier that's linked to your government-issued ID in many parts of the world.
Well that's a whole separate rabbit hole.
Governments shouldn't be requiring something as simple as a SIM card and phone number to be directly linked to a government ID. The right to privacy is a hell of a thing and the only reason a government would require this is to be able to spy on or track everyone.
There is absolutely no way you can connect to the Internet in Switzerland without a chain of custody of your identity. And in Germany you need to show your ID card or passport to obtain a phone number. Your phone number is required for SMS verification even for free in-store wifi in shopping centers and grocery stores. There is no exception to this except the rare router whose owner accidentally left it unsecured.
I have to say, Germany doesn't surprise me too much there. The culture in general is in favor of strict rule of law and heavy regulation. I don't mean that disparagingly at all to be clear, that isn't my cup of tea but I don't live there and really don't mind at all how another culture or society prefers to run things.
Switzerland surprises me a bit there though. Presumably the people approved that, I had a Swiss friend while living in the Netherlands and was surprised by how frequently they vote on seemingly minor regulations. I very much appreciate it honestly, both as a much more democratic system and as a way of making sure the government is both slower moving and checked by the people. I have to assume the Swiss voted to allow such a regulation, curious if you know more about how that was actually legislated though.
It’s true that governments shouldn’t require ID to be linked to phone numbers, but in much of the world (likely most of the world’s population), they do.
I guess I'm just not sure where that path of thought is supposed to lead. Sure we can recognize that many countries require this today, but we don't have to accept it.
If we stop caring as soon as enough governments grab more power they'll just keep doing it. We don't have to overthrow the government for something as simple as ID requirements for a SIM card, we just need to stop using them when we don't accept the premise.
Yeah, I assume that part of rejecting that would include not utilizing a chat service that requires one’s account to be tied to their government ID (phone number).
Right. Cell phones have become pervasive in many societies and it is expected that everyone has one at this point, but we can actually survive without them.
Granted that can be harder in some countries that, for example, use phones to pay for everything and don't use cash. That's just another example where people could have refused though, we can still survive without it even if life becomes less convenient.
You're right but it's Signal's mission to provide private messaging in the face of government overreach.
Even if they have a good reason for the paywall, it's so bizarre that they don't ask for a $2-$5 donation via their own cryptocurrency MobileCoin as an alternative to providing a phone number.
That's a self-imposed problem. If they used a semi-random account handle (e.g. chosen nickname + 4 digits) there would be nothing to enumerate, and remaining spam could be filtered out with a "message requests" feature.
I've received hundreds of spam messages on Facebook, but I only found out about them years later when I clicked on "message request" tab by accident, it's extremely effective.
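A minimal sketch of that "nickname plus random digits" idea, purely illustrative (the handle format and suffix length are assumptions, not any service's actual scheme):

```python
import secrets

def make_handle(nickname: str) -> str:
    """Nickname plus a random 4-digit suffix, e.g. 'kim.4821'.

    With 10^4 suffixes per nickname (more if you lengthen the suffix),
    blind enumeration mostly hits handles that don't exist, and a
    "message requests" tab catches whatever gets through.
    """
    return f"{nickname}.{secrets.randbelow(10_000):04d}"

print(make_handle("kim"))   # e.g. kim.0372
```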
That problem would be mostly solved if they didn’t use the phone number = account model. If you have public usernames that are sufficiently complex, spammers would spend the vast majority of their time shouting into the void. Presumably, seeing an account spamming messages to recipients that don’t exist would be a strong indicator of an account that should be closed.
They're resilient to spam, but often impossible to recover.
I had a spare SIM card that friends and family use when visiting from abroad. It's been unused for 90 days and has been deactivated. The number is lost, and irrecoverable. A friend had created a (second) Signal account with this number and can no longer log into new devices.
As a more mundane example: If I accidentally drop my phone into a river, the SIM is gone forever, and so is that line.
Sure, you can have a contract line which allows recovery. Depending on where you live, these can be several times more expensive than a regular pre-paid line.
You don't need a "contract" for recovery, just an account.
E.g. in the US, Mint Mobile is $15/mo. and is prepaid in the sense that you buy blocks of months at a time. But if you lose your SIM they'll still send you another one with the same phone number.
So no, if you lose your SIM you don't necessarily lose your number, even if it's prepaid. That only happens if you're buying your SIM as an "anonymous" one-off purchase, which is not what most people do these days. Not to mention the increasing prevalence of eSIMs.
"Often impossible"? Not my experience at all. Maybe it would be more problematic with prepaid SIMs, but why would a monthly billed account get deactivated?
You lose your SIM? You go to a branch, verify your identity, and get a fresh new SIM for your line. There's no more straightforward and surefire way to recover any other type of account as of today.
No. I believe that it's way easier to recover your phone number when you lose your SIM or change your phone (eSIM). When you lose your email password and the recovery code at the same time, your email is gone, forever. That's a huge difference, IMO.
Yeah, it may not work if you buy a prepaid SIM and then don't use it for a long time, but with billed plans losing your number is practically impossible. I've been using the same cellphone number for the last 23 years. When my phone was stolen, I was able to reactivate my line the same day on another phone.
Well, "nobody" is a bit of an overstatement when thousands, if not millions, of people actually do run their own (or partial) email stack. If you don't own your email it's just as bad as a phone number, but no worse.
In general, emails don't just get removed and given out again if you don't interact with them for a few months. Phone numbers usually do, after 6-12 months, or even after 3 months with some providers.
You can prepay a domain for 10 years or more and set up an email server whenever you need it, so you essentially have full control long term.
And don't get me started on SIM cloning and SIM-swapping attacks. Those problems are already pretty much solved for email with DMARC and friends.
I said "nobody does" because nobody can. It's a practical impossibility to run your own email today. It doesn't even matter if you have the necessary advanced technical skills. It's just not possible. https://cfenollosa.com/blog/after-self-hosting-my-email-for-...
As the owner of the same cell phone number for the last 23 years: proposing hosting your own email as an alternative to SIM cards that are available at every corner doesn't sound feasible to me at all. I don't even like cell operators, but that's just the way it is.
That's one anecdotal experience and simply not the ultimate truth.
I've had an email startup (since sold, but still running), seen all of the issues mentioned, and know very well how painful this can be, but it's far from impossible. There is also still a market for clean IPs for company infrastructure.
Maybe I shouldn't say email is much better. But phone numbers are definitely a very weak choice for any kind of security.
Apart from spam, a phone number is also one of the few unique identifiers, which is valuable for, among other things, identifying you across channels and showing you ads.
It is easy to create a new email, but not so easy to create and keep a new phone-number.
I've been a Signal beta tester on iOS for as long as I can remember, knowing that they were going to introduce usernames, and I wanted to get my (relatively common) name as my username. Now they've finally introduced it, but they require it to end in at least 2 digits, "a choice intended to help keep usernames egalitarian and minimize spoofing".
Edit: this is not actually a serious problem for me, don't worry! Rather, I think it's funny. And honestly I kind of like having the numbers required, it's a good idea. It does remove a lot of the vanity from usernames.
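Out of curiosity, a sketch of what that rule sounds like. Only the "must end in at least two digits" part comes from the wording quoted above; the dot separator and allowed characters are guesses for illustration:

```python
import re

# Hypothetical validator: a nickname, a dot, then a numeric suffix of 2+ digits.
USERNAME_RE = re.compile(r"^[a-z_][a-z0-9_]{2,}\.\d{2,}$", re.IGNORECASE)

for candidate in ("alice", "alice.7", "alice.42", "alice.2024"):
    print(f"{candidate:12} -> {bool(USERNAME_RE.match(candidate))}")
```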
It's a brilliant design choice. At first I was like "What?" and now the more I think about it, the more I realize it is an absolute genius move.
People need to get trained out of (even informally) assuming they can identify someone because their username looks familiar, and this is a great way to do it.
> more or less completely eliminates “vanity names” and the “value”
With notable exceptions, I'm sure, being username69 and username420 and a few others (a similar phenomenon happened in Magic: The Gathering when they introduced limited-edition 500-card print runs with the serial number stamped on them, and the only ones that really sell or command a good price are 1, 69, 420 and 500).
I can't wait to talk to elonmusk420! I'm sure it'll be the real Elon. His online antics are such that anyone with that username will instantly trigger Poe's Law. Getting rid of phone numbers as identifiers is a good idea, but I think it would be better to just assign user IDs or generate hashes based on user input or something.
> generate hashes based on user inputs or something.
Because friend codes were so popular on Nintendo.
Hey add me real quick, my id is 12716472-83647281746-8172649! Or use the hash code, 0x28A56ED9! Super easy to remember, way better than giantrobot22 or vel0city66.
Given nintendo's user base includes a LOT of children who are very young, the long codes may have been a feature, not a bug - the equivalent of a child latch - to slow down/discourage young users from adding people themselves so their parents have a better idea of who they are interacting with.
I expect it's more a combination of several factors:
- if we don't have usernames we don't have to deal with obscene usernames, trademarked usernames, impersonation claims, and similar
- if we don't have usernames and our generated friend codes aren't guessable, we don't have to worry about people getting random unexpected friend requests from people they don't know
Don't get me wrong I get there were intentional reasons for it in regard to friend codes and I don't necessarily fully mind with that in mind in that use case. I do kind of wish there was an "I'm 13/18+, let's take the training wheels off" feature though.
The issue there is "veI0city66". Depending on the font that capital "I" might look identical to a lower case "l". A hash with an alphabet that doesn't include homoglyphs would reduce ambiguity.
There's also the "weedlordbonerhitler69" issue. A user name that seemed hilarious at 16 likely seems less hilarious at 26.
If users were identified by a hash derived from an input user name, you could type in "weedlordbonerhitler69" and what would be displayed client-side is the hash. The contact-add UI could simply resolve the input username to that UID. So you could give out either the UID or the username and another user could still add you.
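A quick sketch of that idea: derive a short ID from a hash of the chosen name and render it in Crockford's base32 alphabet, which deliberately drops I, L, O and U to avoid homoglyph mix-ups. Purely illustrative; this is not how Signal builds identifiers:

```python
import hashlib

CROCKFORD = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"  # no I, L, O, U

def short_id(username: str, length: int = 8) -> str:
    """Deterministic, homoglyph-free code derived from the typed-in name."""
    digest = int.from_bytes(hashlib.sha256(username.lower().encode()).digest(), "big")
    chars = []
    for _ in range(length):
        digest, rem = divmod(digest, 32)
        chars.append(CROCKFORD[rem])
    return "".join(chars)

print(short_id("vel0city66"))   # same input always yields the same unambiguous code
```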
> The issue there is "veI0city66". Depending on the font that capital "I" might look identical to a lower case "l". A hash with an alphabet that doesn't include homoglyphs would reduce ambiguity.
They're not going to get mixed up typing it in from me telling them the name verbally. And even then, you validate the user afterwards; another feature of Signal is in-person/out-of-band verification of the endpoints. So start the conversation, then verify through a channel you otherwise trust.
> There's also the "weedlordbonerhitler69" issue. A user name that seemed hilarious at 16 likely seems less hilarious at 26.
And with their setup you can change it at any time, so once again not really an issue.
Usernames are only used for the initial connection, so "getting" a username doesn't really gain you anything other than the "username" you give to people who don't already have you as a contact: "a username is not the profile name that’s displayed in chats, it’s not a permanent handle, and not visible to the people you are chatting with in Signal"
I'd settle for full sync of chats between my own devices. If I can sync between my laptop and my phone, that's sufficient, since I already back up my laptop.
I don't want backups for IM. I don't want my counter-parties to have backups for e2e encrypted IM. I don't want IM to last. Why record every conversation on your permanent record? It's nuts.
For me, having a searchable record of everything said defeats the whole purpose of IM and e2e encryption. I'm sure the NSA likes it.
The lack of any kind of backup/export for iOS is the main thing keeping me from recommending Signal.
Sadly, from what I’ve seen in similar threads online, it seems the devs are opposed to backups in principle (they believe that chats should be ephemeral and backing up is antithetical to this).
I'll take it. Even offline backups would be an improvement.
For people worried about not having consented to other people's backups: they could implement ephemeral-only chats, or backup-excluded chats where both parties have to agree to changes.
If I understand correctly it’ll still not be possible to create an account without entering a phone number?
For me this is a requirement before I'd call a service private, because in Germany at least every phone number is connected with a person's identity. To get a phone number you need to link it to an identity using an ID card.
Here in Thailand it's the same, but phone numbers get recycled and expire very aggressively. I just got a new phone number, and I can log in to many platforms belonging to some 20-year-old guy who really likes PC gaming.
Phone numbers should NEVER have become an ID. It's incredibly hypocritical of Signal to claim a "privacy focus" when the lowest layer of the system is literally the least secure identification method we have.
I had two SIM cards dedicated to online crap - one for important stuff like banking, another for social media and such.
Both have expired after ≈ 3 months of inactivity, when my 2-week trip unexpectedly took 4 months. Those SIM cards weren't physically inserted into my phone; I used to do that once a month to call someone and get billed a few cents so they would remain active, until that trip.
There's no way to get those phone numbers back, and it's been an enormous pain in the dick. I hate this fucking system, but I hate the fact that fucking everything requires a phone number even more.
> in Germany at least every phone number is connected with a person's identity. To get a phone number you need to link it to an identity using an ID card
Personally, I am totally baffled by this.
Due in large part to C3's positive influence, Germany is at the forefront of privacy issues and legislation in so many areas, except for this one, which ends up being a massive backdoor in the whole edifice. Okay, we can't ask for a copy of your identification card... we'll just use a telephone number or SIM code or something trivially tied back to your IMSI (like an app store account or IMEI) instead. Because of the absurd 2017 law, these are equivalent to your government ID card.
I really don't understand why Germans put up with this while simultaneously pushing so hard for positive changes in every other aspect of online privacy. Especially when so many other developed Western countries do not tie SIM cards to identities: Netherlands, Denmark, Finland, Iceland, Ireland, US, UK, Canada, and many many others.
It's like a giant `sudo gimme-your-identity` backdoor in all the other data collection protections. And nobody seems to care about closing the backdoor.
It wasn't always like this - the requirement to give your ID to get a SIM card, as you noted, was only introduced in 2017 (though it certainly feels way longer ago for me).
Anyways - why does nobody care?
Simple: most don't feel this is an issue.
Some may even say that they "don't have anything to hide", and so privacy erodes bit by bit; by the time someone notices "ok, this may become a problem", it'll be too late :(
Sure, but what's incredibly weird is that many Germans do feel that almost all other digital privacy matters are an issue. It baffles me that they treat this one particular issue differently for some reason.
I wonder if this is some kind of mass-psychology exploit, like it doesn't occur to your average nontechnical person that the ID requirement makes your Apple app store account, and every app you use it to install, equivalent to your government photo ID.
On the flip side, SMS fraud is almost nonexistent from German mobile numbers, which is why scammers just send from other countries to German mobile phone owners. Mostly from France.
Yes yes, of course; there are root causes and proximal causes. You are correct about the root cause, which is the reason why Germans in general care about these things.
C3 is the catalyst that turned that caring into actual tangible results. Or at least a big part of the catalyst. Their level of political effectiveness is extremely unusual in the hacker world. I'm glad it has been a force for positive change.
That said, it has limits. And I have heard rumblings before about the telecom giants (DT) being an insurmountable political obstacle. So hacker culture has more political influence in Germany than elsewhere, as long as it doesn't upset the telecom giants.
This is a fundamentally different problem for a fundamentally different audience.
If we take the privacy issue, it can be divided into 3 segments:
* Privacy of user data. The basic level. When you use Google or Apple, they collect data. Even if you minimize all the settings, data is still collected. This data is used to train models, and the models are used to sell ads, target you, or do anything else you have no clue about (like reselling it to hundreds of "partners").
* Privacy against undesired identification. The next layer of privacy. When you want to have some personal life online without sharing much about yourself. Like Reddit, anonymous forums, or Telegram (to some degree).
* Privacy against governments. The ultimate boss of privacy. When you want to hide your identity from every government in the world.
Signal was perfect at the first layer, strong but not perfect at the 3rd (e2e encryption, no data collection so there is nothing to hand over to governments seeking data, good privacy settings, and it always tells you if your peer logged in on a new device, which protects against a government working with telecom companies and using an SMS code to make a new login), and almost absent at the 2nd, because they had no public features except group chats, where you share your number.
Now, in one move, they close the gap at the 2nd layer (you can hide your phone number and stay effectively anonymous) and strengthen their position at the 3rd, leaving one last piece open: a government will still know that you have some Signal account.
As for me, this setup solves 99.999% of cases for regular people in democratic and semi-democratic countries and addresses the most fundamental one: privacy of data and actions online.
Yes, it is not perfect, but the barrier for a government to spy on me is so high that I can reasonably believe that in most cases you should never have to worry about being spied on, especially if you live somewhere other than, say, Iran or Russia.
The only scenario, in my view, where you would want a login without a phone (with all the sacrifices that implies: spam accounts, lower quality of peers, and the usual troll fiesta in such places) is when you want to do something you never want traced back to you in your current country.
But in this case, IMO, Signal is the last worry you usually have on your mind and there are a lot of specialized services and protocols to address your need.
1, 2, and partly 3 were already fixed by the Signal FOSS fork back then, but Moxie and his army of lawyers decided to send out multiple cease-and-desist letters against those projects. Which, in turn, makes Signal not really open source, no matter what the claims are. If they don't hold up their end of the license and instead hide behind their proprietary (and closed-to-use) infrastructure, then I'd argue they are no better than Telegram or WhatsApp. Signal's backup problem is another story, which would blow up my comment too much.
Because of the points you mention, I would never recommend Signal, and would rather point to Briar as a messenger and group/broadcast platform. Currently it's still a little painful to use; QR codes, for example, would already help a lot with the connection and discovery/handshake process.
But it has huge potential as both a messenger and a federated and decentralized platform.
Yeah, not sure how that happened, but that link wasn't exactly what I was going for. If you scroll down far enough from there you'd find the parts I tried to point you to, but try this link instead:
https://news.ycombinator.com/threads?id=autoexec&next=394457...
Just to be safe here's a copy/paste with the details:
This has been true for many years now. At the time it caused a major uproar among the userbase (myself included) whose concerns were almost entirely ignored. Their misleading communication at the time caused a lot of confusion, but if you didn't know that Signal was collecting this data that should tell you everything you need to know about how trustworthy they are.
Note that the "solution" of disabling pins mentioned at the end of that last article was later shown to not prevent the collection and storage of user data. It was just giving users a false sense of security. To this day there is no way to opt out of the data collection.
My personal feeling is that Signal is compromised, and the fact that the very first sentence of their privacy policy is a lie, and that they refuse to update it to detail their new data collection, is a big fat dead canary warning people to find a new solution for secure communication. Other very questionable Signal moves, which make me wonder whether it wasn't an effort to drive people away from the platform as loudly as they were allowed to, include killing off one of the most popular features (getting both secured messages and insecure SMS/MMS in the same app) and introducing weird crypto shit nobody was asking for.
Just use Wire (wire.com). A true end-to-end encrypted multi-device messenger, open source, federated and based on MLS. All you need is an email address; no phone number required. And it's based in Europe. They allow building your own clients (with some stipulations) and seem to solve everyone's issues with Signal here.
I think it is a holdover from the Text Secure days. And like others say, it's a different problem.
But for solutions, can't you just buy a voip number? You just need it for registration and then can dump it. I'm sure you can buy one with cash or zcash if you're really paranoid.
While in the US I don't have to show my gov ID to get a phone number, I don't know anyone who buys a phone with cash except international students. So practically everyone is identifiable anyways. But I'm not sure this is a deal breaker since all I'm leaking is that I have registered a Signal account. AFAIK Signal only has logs of an account existing and last online with 24hr resolution (which avoids many collision deanonymization methods). Even paying with cash is hard as I'm probably caught on camera (but these usually get flushed).
So I'm legitimately curious, why is this a dealbreaker? It doesn't seem like a concern for the vast majority of people, and the problem Signal is solving is secure communication for the masses, not the most secure method possible with unbounded complexity. It's being as secure as possible while being similar in complexity to the average messenger.
> But for solutions, can't you just buy a voip number?
No, how would my uncle in the countryside of Vietnam do that? He doesn't have a credit card -- not many here do. He doesn't speak English -- can you find a website that sells voip numbers in Vietnamese? Buying a voip number from a provider in Vietnam has the same exact KYC requirements as buying a SIM, so it is still tied to your government ID and registered forever.
Also, buying a VoIP number for one month costs something like $10 from a quick Google search. Average salaries are around $1.50/hour. Nobody is going to pay an entire day's salary for a VoIP number they'll throw away after a month just so they can register anonymously for chat.
So no, you can't "just" buy a VoIP number unless you're a rich Westerner. But who needs privacy more: people in liberal democracies, or people in places like Vietnam (literally an authoritarian country where people are routinely imprisoned for speaking against the government)?
> I don't know anyone who buys a phone with cash except international students.
Everyone buys a phone with cash here because few people have credit cards, since there is no such thing as "credit ratings" and it is easy for people to disappear from their debts. There are more people in Vietnam than any country in Europe. We all use smartphones and messenger apps here, too.
Briar ('droid only), SimpleX, and Session; optionally with a cheap VPN like Mullvad or Proton to ameliorate anonymity issues in the p2p voice/video features.
Indeed. Even most technical people don't have experience setting up VOIP stuff. And needing some techie's intervention just to create an account is not beneficial for a company's user base. Calling this a non-issue is being ignorant about how usability works and influences user engagement.
Why do you need a German phone number? Many countries let anyone have a phone number, with no proof of address or other identifying information. Just use one of those numbers instead. One example service is https://jmp.chat/ but there are many others.
Basically everything is VoIP these days (VoLTE, etc.). Online services sometimes have secret lists of phone numbers they don't like, but "voipness" isn't a silver bullet for determining whether or not a service will like your number or not. The JMP people wrote more about this at https://blog.jmp.chat/b/2022-sms-account-verification
Anyway, this thread is about Signal, and JMP numbers work with Signal, which is why I suggested it.
This is not correct. Go to a phone booth, get Signal, never need the phone number again. Any phone will do. Get a phone number from a different country online and without identity check, who cares, you will never need it again.
"amenity:phone", "amenity:telephone" or "amenity=telephone" (no other filters) returns the same 2 matches.
EDIT:
Belgacom started removing them in 2013 in Brussels, and the rest of country followed suit. The Belgian regulator found it unnecessary to require them with the ubiquity of mobile phones.
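For anyone who wants to reproduce that kind of search: the relevant OSM tag is amenity=telephone, and a query against the public Overpass API could look like the sketch below (the bounding box is a rough approximation of Brussels; adjust to taste):

```python
import requests

# Overpass QL: all nodes tagged amenity=telephone inside a (south, west, north, east) bbox.
query = """
[out:json][timeout:25];
node["amenity"="telephone"](50.79,4.31,50.91,4.44);
out;
"""
resp = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
resp.raise_for_status()
print(len(resp.json()["elements"]), "payphone nodes tagged in this area")
```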
Well if you lost your only credential and it’s a secure solution, it’s gone. You must set it up from scratch again.
Since we're discussing not providing your phone number out of privacy/security concerns, I assume that "registration lock" and a PIN are on the table, which would anyway block you from registering again using the same number after losing your phone.
Hence, the situation is the same as with your mobile phone number: no backup, no luck.
Not if you set a PIN, no. But I think the next bloke can't use that booth to create a Signal account anymore. I don't think we'll run out of booths though, considering how rare the use case is ;)
Partially off-topic: I've always found this German requirement baffling. In the Netherlands you can just buy a SIM card at a supermarket and pay cash. No identity, nothing.
This is the case in most countries these days. There are very few places left where you can get a mobile phone number without identifying yourself at some point.
I used to care, but at this point it’s obvious that taking a phone number is by far the most effective anti spam and anti trolling method in existence.
There was a forum that required a non-free email account to sign up, and it seemed to have no issues with spam accounts despite having tens of thousands of members for more than 10 years. In that use case, the non-free aspect of sign-up seemed to be the threshold that kept spammers out, rather than the fact that such an email account could (with the relevant authority) be traced back to a real identity.
I'd be curious if there is a study that has looked into the thresholds for different use cases at which spam account creation drops to negligible amounts and how much price vs anonymity vs difficulty factors into it.
> ... but then Signal wouldn't have your phone number either. What they need it for is ... dubious if you ask me.
The reasons they need it aren't really that dubious to me: they want to create a service that actual people will actually use, not just weird privacy geeks who never gave up on PGP. Using phone numbers allows for the kind of user discovery that most people expect in 2024, and requiring them inserts a barrier to mass account creation that can keep spam accounts down to a manageable level (especially given the whole point is that they can't do content-based spam filtering in the way that makes email manageable).
Personally, my understanding is they've always been trying to develop the maximally private usable chat app, which requires some compromises from the theoretically maximally private chat app.
Yeah, privacy is weird and cringe! Let's call 'em "privacy-bros" or maybe "encryption-bros" to signify that they are low status (I don't want to be like them, ew!)
> I think the remark is more about these sort of rhetorical tactics which permeate every topic. It is a fair remark.
It's not a fair remark though; all it did was twist what I said into an inflammatory derailment.
The point is there are a lot of (usually technical) people who are too focused on one aspect but are missing the bigger picture. If you follow them, you'll probably get a communication app that only those people can or will use, which has deal breakers for mass-market adoption. And once that happens, those people probably won't use it either, since they want to communicate outside their group.
Both his and your comments come off as inflammatory derailment to me. That's how it reads, I'm not ascribing malintent. People didn't use to talk like this, I hope you reconsider.
"not just weird privacy geeks who never gave up on PGP." is simply not conducive towards making your point. You can make your (otherwise solid) point and even win the argument on merit without this sort of thing.
The main selling point of Signal is privacy. That's basically the only reason it exists - without it, why not just use WhatsApp, Messenger, Snapchat, etc?
What is the usability concern for no longer needing a phone number?
> Using phone numbers allows for the kind of user discovery that most people expect in 2024
Do people really expect to still exchange phone numbers ?
Fundamentally I don't want people to call me or SMS me (that's for spam only), most messaging services allow contact exchange through a QR code inside the app, and if everything else fails an email address is the most stable fallback.
In many countries SMS was crazy expensive, unreliable, or walled off to death (you couldn't message people on other carriers...), and it had no traction in the first place.
Then phone calls are also crazy expensive: looking at phone plans right now, the main focus is the data allowance. Call options either only allow super short conversations for a flat fee (less than 5 min per call, for a 25% increase in the monthly plan) or give you 30 minutes to an hour of calls for double to triple the price of the plan.
Moving to an alternative is just the normal course given these incentives, and that's what people did in droves (looking at Japan for instance)
You can now hide your phone number, according to the blog post.
[...] Selecting “Nobody” means that if someone enters your phone number on Signal, they will not be able to message or call you, or even see that you’re on Signal. And anyone you’re chatting with on Signal will not see your phone number as part of your Profile Details page – this is true even if your number is saved in their phone’s contacts. Keep in mind that selecting “Nobody” can make it harder for people to find you on Signal.
Well, to link with recent news, do you think talking with the late Alexey Navalny over Signal would protect you from the Russian police? They'd still be able to see that you talked to him.
And then what's the point of the super duper encryption?
In Signal, probably not. Signal has sealed sender functionality that hides a significant amount of metadata from passive observers and from active examination after the fact: https://signal.org/blog/sealed-sender/
What the Russian police would be able to see is that, within a given certificate-rotation period, at most X people communicated with Navalny.
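For intuition, a toy illustration of the sealed-sender idea, emphatically not Signal's actual protocol (which uses the recipient's identity keys, sender certificates and delivery tokens). Here asymmetric crypto is stubbed out with a shared Fernet key just so the sketch runs end to end:

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

recipient_key = Fernet.generate_key()   # stand-in for the recipient's real key material
box = Fernet(recipient_key)

def seal(sender_id: str, body: str) -> bytes:
    # The sender's identity travels inside the encrypted payload.
    return box.encrypt(json.dumps({"sender": sender_id, "body": body}).encode())

def envelope(recipient_id: str, sealed: bytes) -> dict:
    # This is all the routing server gets to see: a recipient and an opaque blob.
    return {"to": recipient_id, "payload": sealed}

env = envelope("recipient-uuid", seal("sender-uuid", "hello"))
print(env["to"])                                 # visible to the server
print(json.loads(box.decrypt(env["payload"])))   # only the recipient can open this
```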
Signal does not know who you correspond with. The only information they keep is the account creation timestamp, and the date that the account last connected to the Signal service.
You may have confused this information with WhatsApp which indeed keeps a lot of metadata on each user.
Signal absolutely knows who you correspond with. How could they otherwise route your chat messages?
They promise to throw this information away, which is nice but not possible to verify.
They also employ a roundabout way of encrypting this data, but as they rightly point out in their article that describes the scheme, encrypting or hashing phone numbers is not safe from a malicious attacker. The space of all possible phone numbers is so small that it could be brute forced in the blink of an eye.
You place all your trust in Signal (and Google/Apple) when you use them. That may be better than the alternatives, but it's still something we should be honest about.
That said, keep in mind that Signal and Google/Apple can also trivially backdoor your software, so unless you take specific precautions against that, the details of their middleman protection isn't terribly important.
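To put a rough number on the "brute forced in the blink of an eye" point above, a quick back-of-the-envelope benchmark. The 10-digit numbering plan and plain SHA-256 are simplifying assumptions; salting or truncating the hashes doesn't change the conclusion much, because the input space stays tiny:

```python
import hashlib, itertools, time

SAMPLE = 1_000_000
start = time.perf_counter()
for n in itertools.islice(itertools.count(2_000_000_000), SAMPLE):
    hashlib.sha256(f"+1{n}".encode()).digest()
rate = SAMPLE / (time.perf_counter() - start)

# At most 10^10 ten-digit numbers exist; far fewer once valid prefixes are known.
print(f"{rate:,.0f} hashes/s on one core; "
      f"full 10^10 space in roughly {1e10 / rate / 3600:.1f} core-hours")
```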
I guess you are right. It's trust-based. For an actual obfuscation Signal would need to implement something like onion routing, right? I think Session does it.
Well, TIL. That does not refute my comment, though. Signal still does not know who you chat with. It's the cloud provider who might log the IP address of the sender. Identifying the person based on that information alone would be non-trivial if not simply impossible.
> They'd still be able to see that you talked to him.
Signal has no access to metadata, including participants in a conversation. All they know is the date of account creation and the date of the last connection.
However, if they got access to Navalny's phone, then of course they can see everything Navalny can.
Even encrypted data is not irrelevant. The frequency of messages is relevant, as is how many messages are sent and how quickly; the total packet size can be revealing if they aren't heavily padding the data. There is a lot you can learn just from the traffic. Total obfuscation is ideal.
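Padding is the usual mitigation for that size side channel; a minimal sketch, with an arbitrary bucket size and ISO 7816-4-style padding (real protocols pick their own scheme):

```python
def pad_to_bucket(plaintext: bytes, bucket: int = 160) -> bytes:
    """Pad to the next multiple of `bucket` bytes (0x80 marker, then zeros),
    so ciphertext length only leaks a coarse size class."""
    padded = plaintext + b"\x80"
    return padded + b"\x00" * (-len(padded) % bucket)

def unpad(padded: bytes) -> bytes:
    return padded[:padded.rindex(b"\x80")]

msg = b"see you at 7"
print(len(msg), "->", len(pad_to_bucket(msg)))   # 12 -> 160: short and long notes look alike
assert unpad(pad_to_bucket(msg)) == msg
```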
If you are worried of an adversary that is using numerical analysis on the frequency of messages to somehow undermine you, I’d recommend not using a smartphone or internet connected device. And perhaps medication.
We don’t insult each other here. Take the cheap potshots to Reddit.
>Why worry about nation-state level attacks when you can simply be hit over the head with a mallet until you give up your password?
Yes, that would be the point of obfuscation, as opposed to just encryption. End to end encryption does not prevent the $5 wrench attack, obfuscation does.
If a person is a member of a terrorist network - or friends with someone who is - the fact that a warrant could force Signal to expose that link could mean that a court is then more likely to approve increased surveillance of your (non-Signal) communications because of that link.
On the other hand if you are a woman on Tinder and using Signal to communicate with matches, this doesn't expose you to the person you have just matched with adding your number to their phone book, uploading it to LinkedIn and then finding where you work (which is what you can do with a phone number).
My feeling is this is a reasonable compromise, but it is important people understand what it does and doesn't protect you from.
The claim (which generally I'm inclined to believe) is that requiring a phone number drastically increases the cost to sending spam. That in turn drastically reduces the spam amount.
If you're worried about Signal's hosting provider seeing your device's IP address, use a proxy. Personally, I'm not, because there's no trivial way to go from "Here's some IP traffic" to "this human had a conversation with this human".
What they need it for is simply that it's the way the system has always worked, because Signal started life as an encrypted replacement for SMS. The point was that you could switch from the standard SMS app you were already using over to Signal (which was called "TextSecure" at the time) without having to change your habits, because sending messages to people's phone numbers was simply what people did then. There's nothing nefarious about it.
Signal has spam issues even with the phone number requirement, as I've experienced lately (though nothing on the scale of Twitter). I dread to think what the spam would be like without the requirement of a phone number.
At least now you can solve the existing spam problem if you want: disallow people from using your number to message you in the privacy settings, and randomize your username after anyone new adds you. That way your username is like a one-time password for adding you, kind of like what lots of people here wish existed for phone calls.
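A tiny sketch of that "rotate the username after each new contact" idea; hypothetical client-side behavior, not something Signal automates today:

```python
import secrets

class Profile:
    """The handle acts like a one-time invite code: existing chats keep working,
    but the old handle stops resolving once someone new has used it."""

    def __init__(self, nickname: str):
        self.nickname = nickname
        self.contacts = set()
        self.rotate()

    def rotate(self) -> None:
        self.handle = f"{self.nickname}.{secrets.randbelow(10_000):04d}"

    def accept(self, new_contact: str) -> None:
        self.contacts.add(new_contact)
        self.rotate()

me = Profile("kim")
old = me.handle
me.accept("alice")
print(old, "->", me.handle)   # the handle alice used no longer works for anyone else
```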
Just like you haven't received any communication from anyone about any topic other than talking about Matrix. It's not that Matrix has a magic formula, it's that a fraction of a fraction of a percent of people care even an iota about it.
They could collect a small amount in cryptocurrency to prove the user is not a spammer. Telegram tried this, but the price for not providing a phone number was too high. Does that mean knowing a user's number is so valuable?
It strikes me as hopelessly naive to think that keeping a personal phone number private is the only reason a user would want to be able to sign up for a service completely anonymously. The question is not whether knowing a user's number is worth $X, the question is whether _anonymous access to your platform_ is worth $X; a question that applies equally to both innocent good-faith users and to spammers/phishers/etc. If your platform is actually worth anything, $X is not going to be a small amount.
And yet many people seem to earnestly believe that a tiny token fee will be enough to deter spam, despite clear evidence to the contrary (see for instance how Twitter's "verification" fee has completely failed to stop bots from overrunning the platform, many of which proudly display their blue checks).
I could certainly point out the differences, but the fact that you yourself aren’t acknowledging them indicates to me that you’re throwing intellectual integrity out the window because this product doesn’t work in the way that you want it to work. Engineering is about tradeoffs, and not every company serves to build something that does exactly what YOU want it to. I prefer Signal the way it is. I understand the tradeoffs.
They are not usernames, so why do they call them that? They are more like disposable per-conversation identifiers.
"Usernames in Signal do not function like usernames on social media platforms. Signal usernames are not logins or handles that you’ll be known by on the app – they’re simply a quick way to connect without sharing a phone number."
Also, this is still not the feature Signal users actually want: being able to sign up for Signal with a username instead of a phone number.
This new "feature" does very little to make signal more secure or private.
It does, because instead of having to share your phone number with Signal plus all your contacts, you can share it with Signal only. It is an improvement. It doesn't address the case where you are not willing to share your phone number with Signal, but it addresses the case where you tolerate that but would like to talk with someone with whom you'd rather not share your number.
I hope it will allow creating groups without forcing members to have their phone numbers shared with everyone.
That was my first thought too. It's stupidly confusing to call something that acts nothing like a username a username. They clearly know that given the number of times they clarify how they work. Here's another:
> Note that a username is not the profile name that’s displayed in chats, it’s not a permanent handle, and not visible to the people you are chatting with in Signal. A username is simply a way to initiate contact on Signal without sharing your phone number.
It's absolutely a username. It can be changed arbitrarily whenever you like, and you'll probably in the future be able to have more than one name for the same underlying account, but it's still a username.
Other services do this too. For instance, you can sign up for some services with an email, and that's what you use to sign in, and you might be able to find other people by email if they let you, but you don't necessarily get shown someone's email on their profile, just the display name in their profile. And (in a well-designed service) you can change your email address at any time.
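To make the distinction concrete, a rough sketch of the separation being described: a stable internal account ID that conversations actually bind to, a changeable discovery username, and a profile name shown in chats. The field names are illustrative guesses, not Signal's schema:

```python
from dataclasses import dataclass, field
from typing import Optional
import uuid

@dataclass
class Account:
    account_id: uuid.UUID = field(default_factory=uuid.uuid4)  # what conversations bind to
    username: Optional[str] = None   # discovery handle; can be changed or removed at will
    profile_name: str = ""           # what chat partners actually see

acct = Account(username="kim.42", profile_name="Kim")
acct.username = "kim.77"             # existing conversations are unaffected
print(acct.account_id, acct.username, acct.profile_name)
```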
Because a regular person, given an identifier for something that isn't a number, is going to call it a username.
Later explaining "you can have multiple usernames" is easier than trying to undo that conception. People are familiar with it. Your username is how you identify yourself on the computer in every context where it's not obviously your phone number.
> Also, this is not finally the feature Signal users actually want - not having to sign up for Signal with a phone number and using a username instead.
Agreed. I don't own a phone of any kind, and would love to use Signal, but alas I can't because you need a telephone number, or a level 65 Necromancer to do the magic to sign up without it.
Is it? On Twitter and Discord people see a different name than my username. Usernames tend to be more for connecting, and display names for identification. While I get the argument, I don't see why this is a big deal.
Doesn't seem "disposable per-conversation" in my reading of the announcement. Seems like a permanent username that just doesn't get featured in the conversation.
>Your profile name remains whatever you set it to.
It's not really permanent - you can change it as much as you want. Once someone has established a connection with you via your username once, that connection will still exist even if you change your username.
Maybe they're not necessarily going for "all humans that exist everywhere under any circumstances" but instead "humans likely to have access to a phone number which can sometimes receive SMS."
Not every app needs to cater to every single human and potential use case on the planet.
A spam account is a fake account that sends spam, like Bitcoin bullshit. Platforms like Signal, WhatsApp, Telegram, and others have an issue since you can just message literally every possible number. One way Signal handles this is by not revealing that you even have an account unless you accept the message. There's also rate limiting and other stuff going on. But I'm pretty sure you know what a spam account is. If you really don't, I'd love to learn how you use the Internet, because I'd love to learn how to avoid these accounts. Twitter and Gmail love to connect me with them.
How does Signal know that account X is sending Bitcoin bullshit if the messages are encrypted? Also, I see you have a Keybase account; Keybase doesn't use phone numbers, so how do they solve "spam accounts"?
> How does signal know that account X is sending Bitcoin bullshit if the messages are encrypted?
They don't. That's not what I intended to say, sorry for the miscommunication. It's just a common spam bot I see on things like Facebook, Insta, Twitter, TikTok, Reddit, email, etc. But Signal can stop you from sending 100 messages a second. There are other ways to fight spam without needing to know the users or the contents of the messages. A lot can be done with the minimal metadata that's required to perform communication.
> Keybase doesn't use phone numbers, how do they solve "spam accounts" ?
I don't know but I'm not a security expert. So you probably shouldn't be asking me. But if you got any questions about ML I'm qualified to answer some of those.
I'm pretty sure a big reason Signal uses phone numbers is just because they built on Text Secure. It is also aimed at an audience less technical than Keybase's target audience. I mean, Keybase is free and private, but everyone still uses Slack or Discord. FWIW, Signal does write blog posts about these things, so if you'd like to learn more I suggest reading those while you wait for someone much more qualified to answer your question. I think you'll get it answered much faster if you're less assertive. Or you could go the other way and try the old tactic of confidently saying something outlandish and waiting for people to correct you. But I think that's a harder way to get answers to a specific question. Your call though.
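For what it's worth, the "stop you from sending 100 messages a second" kind of control mentioned above is typically a token bucket; a minimal sketch with made-up limits, not Signal's actual values:

```python
import time

class TokenBucket:
    """Allow `rate` messages per second on average, with bursts up to `burst`."""

    def __init__(self, rate: float, burst: int):
        self.rate, self.burst = rate, burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=1.0, burst=5)
sent = sum(bucket.allow() for _ in range(100))   # try to blast 100 messages at once
print(f"{sent} of 100 allowed")                  # only the burst gets through
```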
If I'm reading this correctly, this also means that a person that already has my phone number in their contacts will necessarily be able to link my number to my username after they have scanned my QR code.
If you set your privacy to nobody and someone saves your phone number, to them it will appear that you do not have a signal account, even if they start chatting with you via your handle.
By "link" I mean they immediately know what person the username belongs to iff they already had that person's phone number because the chat that is initialized after they scan the QR code is just the old chat being continued.
But if they have my number, why would I be worried that they know my username? The username is there so I can avoid sharing my number, not the other way around.
Ah, that's what you mean. Yeah, if you want to be anonymous to Signal itself, I don't think that's possible. If you want to be anonymous to people, I think you can delete and recreate your account. I think that might do the trick.
First, it is a mistake to call these usernames. Second, it's a big mistake because this is a cool feature.
It's interesting to compare this feature to Session, where you also have randomized identifiers, but they identify you globally, and there's no way to give someone a handle to you that isn't linkable to other conversations. It sounds like Signal now offers that, which is actually the first time I've been intrigued by Signal.
Agreed. It’s ridiculous that they’re even calling this feature usernames, since you still need a phone number, thus completely defeating the purpose of a “username”.
For most services to sign up, you also need an email address. This is also to help you recover your account in case you lost your password. A phone number can be used for this purpose too. Now you can share your Signal account with someone without sharing your phone number. Like you can share your Facebook username without sharing your email address.
Heh, I donate monthly to the Signal foundation but still get the occasional notification in the app to do so. In some sense, I am paying them anonymously :D
Whatsapp added this recently and it is very convenient. You can link a companion device in the same manner you sign into WhatsApp web.
A kind of hacky workaround (that I used to use for Signal, WhatsApp and others) is to set up a server with Matrix bridges running and bridge your Signal, WhatsApp, etc., so you can install a single Matrix client on all your devices.
But as most apps do support multiple devices these days, bar Signal, it doesn't feel like it's worth the effort. And I seem to remember the Signal bridge in particular being a little buggy.
I'm sure it will become possible soon. The code is already there on iOS, as the app also works on iPad, but it's hidden behind an internal feature flag [0]. Same with Android [1]. If your second device is an Android, you can already use it now with [Molly](https://github.com/mollyim/mollyim-android).
Also, WhatsApp recently added this feature, so the expectation from potential new users who switch is now there.
Would signing into Signal on a work device not negate most of the security benefits of using Signal? Genuine question; I am only vaguely familiar with Signal.
The interesting thing is that it is possible to share the account on multiple devices, as long as only one of those is a phone. You can sign in to and chat from that account just fine on the desktop app, even if your phone is off.
(I guess theoretically you could run something like PostmarketOS on a phone to run the desktop app, but you know what I mean.)
That's useful but not quite sufficient for this use case, though. The different devices currently have no way to sync chat history, so you'd lose all your old chats.
What I'd love to have is the ability to connect my phone and my laptop to the same Signal account, have them automatically sync chat history between each other, and then in the future if I add a new phone (e.g. because I've upgraded) my phone can sync from my laptop and get all of my message history.
My current work-around is just to use a group chat and have both work and personal accounts part of the chat. Fortunately, I only need to be able to chat with a few people (family) while off with the work phone so this isn't that big of a hassle, but it's something I wish I didn't have to do.
Unfortunately I don't. If I were to guess, I'd expect it's just a matter of the engineering hours that would need to be invested not being worth it at this time, given how few people they expect to need it.
Yeah, this is still my top requested feature. I have two phones; one has a data-only SIM. I just want to be able to use Signal on both of them, just like I can on my Mac and PC.
I like the concept of Signal usernames not being public either, and that they’re only a means to tell others how to find and contact them. I can’t wait for this to be rolled out.
It’s not clear to me if it’d be possible to prevent the “contact joined Signal” messages if someone else has the phone number in one’s contacts. That would be a huge thing.
For a little more historical context, with this change Signal has now solved the problems that became widespread during the protests in Hong Kong in 2019 — someone else (authorities) adding random phone numbers to their contacts list, opening a chat app (such as Signal or Telegram) and finding if that person uses that app. Telegram solved this swiftly by adding more privacy controls, [1] while Signal had other priorities.
There are new phone number privacy settings for "Who can see my number?" and "Who can find me by number?", both with "Everyone" and "Nobody" options. I assume disabling both should stop those "contact joined Signal" messages from going out, although I'm not sure if you can set them quickly enough after registering.
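A toy sketch of how those two settings could gate discovery; the setting names and the exact behavior are my reading of the wording above, not Signal's implementation:

```python
def can_find_by_number(settings: dict, caller_knows_number: bool) -> bool:
    # "Who can find me by number?" = Nobody means a number lookup resolves to nothing.
    return caller_knows_number and settings.get("find_me_by_number") == "Everyone"

def number_shown_in_chat(settings: dict) -> bool:
    # "Who can see my number?" = Nobody hides it from profile details in chats.
    return settings.get("see_my_number") == "Everyone"

locked_down = {"find_me_by_number": "Nobody", "see_my_number": "Nobody"}
print(can_find_by_number(locked_down, caller_knows_number=True))   # False
print(number_shown_in_chat(locked_down))                           # False
```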
> People who have your number saved in their phone’s contacts will still see your phone number since they already know it.
I know this seems great and groundbreaking, and that it (and more) was already there in Telegram for years.
This is just unfortunate if it has been implemented like Telegram, and it seems it has.
I should be able to dictate that if I initiated communication to a username, or from my username, my phone number is not linked to it, even though the other person has my phone number saved in their address book. Having my number doesn't mean they are a friend, and even if they are, I might not want them to know it's me, or might want to keep that chat under the username only.
I will try to access the beta (pretty sure it’d be full by now) and test how it goes but I hope it has not been implemented like Telegram after taking all these years.
Though I like that they have essentially nuked vanity username rush and grab in the bud. Kudos.
> This is just unfortunate if it has been implemented like Telegram, and it seems it has.
Yes, agreed. This doesn't stop an adversary who knows your phone number and identity (such as a surveillance state) from linking communications under your username with your real identity.
It just means that people don't need to give their phone number to someone just so they can communicate via Signal.
I think this can lead to people having a false sense of security.
Signal is one of the great undertakings of our time. And it's one of the last bastions of internet freedom.
A free-to-use global communications platform that doesn't censor, respects user privacy from the ground-up, and is run by a non-profit foundation that is faithfully dedicated to its mission. https://signal.org/bigbrother/.
We should support it. If you haven't already, then consider signing up for a recurring donation to the Signal Foundation. I try to give what I can afford, because I believe that digital freedom is essential for the progress of all humankind, https://signal.org/donate/
Without such projects, our civilization will stagnate and die in darkness.
Yeah, nah, it might be fashionable, but I'm not 100% convinced that it's not an operation intended to be a lightning rod for "private" communication.
Given how tightly they control development, disallow third-party clients, disallow federation, disallow self-hosted servers, have a history of disallowing use without Google Play, and have hidden major development work from the public (MobileCoin) despite being open source, etc.
The idea that it's a great undertaking of our time is so bombastic that it's guaranteed to be false, even if you truly believe that they are completely altruistic (which I'm willing to believe, but it doesn't come easy to me based on the above).
"What's better"? Matrix. Which seeks to solve all of my points, the only thing lacking is market share which honestly is partially caused by these "easy to use" services which trade off everything else, which also consumes developer mind-share even if you're unwilling to acknowledge that. (devs are motivated to solve issues for friends, family and themselves if they are exposed more frequently to systems and services that are sub-par).
I think this is a false dilemma; you can have the high-quality implementations and be more open.
I've criticized Matrix before for their "protocol-first" approach and "too neutral" stance towards clients (which they've changed somewhat it seems; previously [1] was a table of clients with no clue what to choose, now it at least has "featured clients"). I feel they repeated the same mistakes as XMPP, which has not improved their client list.[2] Protocol nerds will say that's a good thing, but all it really does is ensure your protocol remains marginal because most people just get confused. People choose software, not protocols.
But you can write a high-quality client and a specification and allow people to write their own apps. IMHO Signal is needlessly restrictive. Sure, focus on your own implementation and the quality of that first. 100% the right decision. But there's no reason to not at least allow some things down the line. Signal is just a few months shy of their tenth birthday – they're well past the "ensure the quality of our official client"-phase.
As soon as you do that though, it becomes a nightmare to adjust anything about the protocol, and you end up with incompatible clients. So you can use the app perfectly with friend 1 that has the official app, but with friend 2 who uses one client, sending photos doesn't work, and with friend 3 voice calls don't work, and adding friend 4 to a group chat somehow breaks it entirely for everyone.
Friend 2 insists on using their client because it has dark mode, and for the average user, what they see isn't "friend 2 is extra and has a broken client", they see "that app fails to send pictures about a quarter of the time, let's use whatsapp".
At the end of the day, the problem with this model is that it expects free labor to take over the next part. Which might work for a little bit -- until it doesn't. Then you have the situation we're currently in where everything related to matrix is mediocre.
I don't know if there is a straightforward correlation. I agree that my first Matrix experience was also not that satisfactory, but my university switched from XMPP to Matrix. I really liked Conversations and Quicksy; they just worked for me out of the box, even with the OTR stuff. However, it seems that there was not enough development on the server side, which I guess led to the switch by our computing center. Also, the whole German health system, as well as the army, is switching to Matrix. I still think it is completely over-engineered, but it has decent momentum.
Easy to use is important and it's a shame that you're downplaying that. More accessible than PGP/OTR? Sure. But maybe by a hair's width of an alligator's back.
If I am working with a source who gets frustrated by the impenetrability of communicating with me because I insist they use matrix while they're not technical and likely impatient, then that person will be much more likely to use a fallback method such as SMS or email, and they'll do it without warning. It's legal risk, period. My job is to make sure that they can share information with me as easily as possible and during a particularly sensitive period of that person's life, usually. Matrix, as a sibling post highlighted well, is too difficult for this use-case. That is an enormous failure for a use-case of sensitive information sharing.
I really like the idea of federation, but I haven't seen it be successful in practice. I can't think of a federated service that isn't also highly centralized. This was a big problem for cryptocurrencies and it's not like email isn't almost all Microsoft or Google. Mastodon has been struggling as well.
While I think there are better services to be private and secure from a technical perspective, there's one killer security and privacy feature that Signal has that no one else does: usability. It's pretty hard to get my grandma onto Matrix, but it isn't hard to get her on Signal. The truth of the matter is that you can't have private and secure conversations if there is no one on the other side. So while I really do like Matrix and the like, I think of them as more alpha or beta type projects. I don't find the bashing of Signal helpful (like we also do with Firefox) because all it does is create noise for people who don't understand that the bashing is coming from a nuanced and biased point of view (we're mostly highly tech literate here on HN, it is a bubble, but people who aren't still read our comments). At the end of the day, if we aren't getting 1-click server installs (or literally everyone is a host), federated systems are going to become highly centralized at some point. PGP always failed because the easiest way to hack a PGP email was to reply that you couldn't decrypt it. It wasn't appropriate for the masses even when it wasn't difficult to use. Don't get me wrong, I love Matrix, but it's got a long way to go to get mass adoption.
Fwiw, I remember a user a while back offering a bounty for a decentralized pathway in Signal[0]. The idea was to create an AirDrop-like system to help with things like local file sharing, but then extend the project to create a mesh network. Seems like a reasonable idea to me. I think it may be more advantageous to try to push Signal in the right direction than to rebuild from scratch. I'd highly encourage people with other opinions to participate in the Signal community, because it is a crazy echo chamber in there and for some reason the devs treat it as a strong signal.
I agree with all this, but only to a certain extent. The big disadvantage of a centralized system is the ability to control an entire ecosystem. The same reason we dislike monopolies. It's because monopolies of any kind have the ability to abuse their power, though that doesn't mean they do. I mean browsers are "decentralized" and that doesn't stop Google from exerting significant control, especially considering most browsers are chromium (I find it weird people say to fight Chrome by switching to a different color of Chrome).
Like I said, I'm all for Signal becoming federated. It's why I dropped that link to the airdrop feature request. I'd also be in favor of people running their own servers. I mean the server code is available, you just can't connect it with the main network. So as far as I see it, there's nothing stopping this from happening. I see a lot of people complaining but I'm not aware of any major roadblocks. That doesn't mean there aren't any, but I'm just not aware of any. And fwiw, there are alternative Signal clients like Molly[0]. So at least the app can be disjoint from the official ecosystem.
Signal has said that they don't want a decentralized network until they have settled on their standard and implementation, as they see decentralized federation as what has prevented email from modernizing. I'm assuming they will never get to the point where they feel Signal is stable enough to decentralize.
I'm not sure why people keep responding with this. Signal doesn't want to work on the federation problem, sure, I'm well aware. But everything is open source. We're on a forum of hackers, makers, and programmers. So what is in the way? People keep saying "Signal this", "Signal that", but what are they gonna do, stop open-sourcing the code? Ruin their entire business model? I doubt it. The code is open, so seriously, someone tell me what's stopping you all from creating a federated version?
XMPP cries in a corner. I wish XMPP had more accessible (to the general public) desktop clients. Conversations is great, but speaking from experience, people aren't going to want to use Gajim because it looks like it's ten years old (even though that's a good thing ;)). XMPP needs better clients in general. The last time I used Profanity it had very annoying bugs around sending and saving OMEMO-encrypted files.
In a world where iOS users won't install another free app from the App Store because they already use iMessage, Matrix is like asking your friends to perform calculus just to talk to you.
Sure, but I don't see whatsapp/telegram as worse realistically if you've already lost at that level.
Signal is very much in the same area of: "trust us".
With a caveat that they also say: "here's a bunch of information on why you should, but you can't really verify any of it, and we have proven bad faith before; also, we have an army of people who will pile on if you call us out for not being actually verified, so just trust us: we are the secure messenger, and all those scary things are just so we are easy to use".
I read somewhere here that, in the case of WhatsApp, more metadata is shared with Meta, and that Telegram doesn't have E2EE by default for groups.
Didn't check though.
You're correct. There are more security features with Signal too, like the server stuff. It's true that they don't update the code enough, but the parent is being overly critical. It's not like WhatsApp is giving us access to the server in any form, so it's not a fair comparison. (Edit: Also, the app can be built from source and you can verify that the communication isn't happening in a way where the server could decrypt it. So it's not too big a deal that the server isn't perfectly up to date on public commits.)
To their point, there are benefits to federated systems. But I've yet to see a federated system have moderate to large usage without becoming centralized. Think email. And until this problem can be solved you're still left with a "trust us" problem. There's no trustless system out there, yet. But hopefully it comes in the future. In the meantime, signal is the best if you also want to communicate with anyone that can't tell you if a stack is FIFO or LIFO (or even know those acronyms).
Definitely not true. Facebook literally censors private conversations. You simply can't send certain text strings to your friends. That is far more dangerous than relying on a third party that claims to be protecting your privacy. Especially since all signs point to them being honest.
I don't know about WhatsApp (but I also didn't mention WhatsApp), but go to FB Messenger right now, open up a conversation with yourself, and try to send a message containing the string "thedonald.win". You'll get an error message saying "Couldn't send", with no further explanation. The list of banned strings used to be longer, but they've unbanned a lot of them since the election ended.
To be clear, this is in private conversations. Not just posting publicly on Facebook or w/e.
Funny enough the best way I found to convince iOS users to talk to me on signal is by telling them it's like iMessage but cross platform. Sure there are differences but most people aren't using those features. I do think signal could really benefit by just linking signalstickers.com into the app since that's the biggest complaint I actually get.
We really should convince Moxie Marlinspike to push the implementation of an out-of-the-box working bridge between the Signal client and the Matrix network. With e2e encryption, of course.
I think we're definitely approaching the time when Signal / WhatsApp / Facebook Messenger / Google Messages / Matrix / etc will all become at least somewhat interoperable, and it's gonna happen very fast (~Q3), mostly because the EU's Digital Markets Act is basically forcing them to. (Well okay, only Meta-owned platforms are forced to.)
> It is unlikely that we will ever federate with any servers outside of our control again, it makes changes really difficult.
> ... I understand that federation and defined protocols that third parties can develop clients for are great and important ideas, but unfortunately they no longer have a place in the modern world. ...
Signal has its problems, some of them severe. It's also buying "us" much needed time to build out federated and self-hosted chat platforms.
I truly believe they are altruistic, although it is unrealistic to expect that to last forever.
By the way, some of the claims you made about their "bad actions" are actually false. And Matrix is still incredibly annoying to work with for "normies" and only recently got first-class E2EE and retention policy, both things needed for a secure chat experience. And btw, those things aren't deeply supported in the ecosystem, and it also doesn't have client feature-flag alerting (to allow well-intentioned clients to report, de facto, that they don't support certain security features).
I do think Matrix (or something like it) is the future, but it's certainly not the present.
Matrix?! As someone who runs his own Matrix homeserver: oh, man, no way. Matrix is super fiddly, unreliable, and user-unfriendly (and I say this as someone who has at times agreed that Signal can be user-unfriendly).
Matrix also is just not particularly private. Servers control and know far too much about users, and pretty much no mainstream client enables E2E encryption by default. Matrix is an impressive piece of technology, but it has a long way to go before it's as usable for an average mobile phone user as Signal is.
Just because a project is open source doesn't mean everything the team works on or releases will be in the public eye, nor does it even imply that it has to be open source as well.
I agree about the passing utility of Signal [0], but Matrix (which I do use) is a barely adequate dumpster fire. They spent all this effort developing a generic synchronization protocol, yet didn't include native encryption in 2014 and had to bolt it on as an afterthought? And the last time I tried to find a native client, it seemed like they were all still using web engines for rendering (inherently slow and insecure), presumably because the markup is too complex to make straightforward native apps.
[0] I don't even use Signal. My tack is to isolate and contain my "mobile phone" device as much as possible (when I'm home it generally stays next to the door on a charger). Whereas Signal has been designed around that single device as a critical part of my life. When I can sign up using only a username, and use Signal from a native client or web browser without any sort of Android device in the picture, then I'll be interested.
The license in the repo says otherwise, and the license is what governs your use and modification and redistribution of the client app, not their indignation.
Forks are a natural consequence of releasing free software. This is the life they chose.
Also, free software isn’t a product.
The ToS is the only thing that governs end users connecting to the API, and it doesn’t deny end users the use of third party clients. Also, even if it did, that would be insane, like Google saying you can’t even load google.com when browsing with Firefox. It would be pretty much without precedent on the web, and bonkers.
The GPL is the only thing that governs developers’ use of the client codebase. The GPL of course allows forking and modification and redistribution.
Such forks and redistributions obviously cannot use Signal’s trademarks, so LibreSignal was dumb to do so. Ultimately the feelings of the Signal team don’t matter here - only the license under which they officially released the code. You can’t be more explicit about permitted uses than that.
You can’t be open source but then claim you don’t want forks. It’s one or the other.
> The ToS is the only thing that governs end users connecting to the API, and it doesn’t deny end users the use of third party clients.
"You must not (or assist others to) access, use, modify, distribute, transfer, or exploit our Services in unauthorized manners" [1]
By my reading, the ToS does deny the use of third party clients. Someone could try to argue that a third party is using the services in the same manner as the authorized first party client, therefore it doesn't break the ToS; but since the company's leadership have said that's not OK (causing the mentioned client to stop being updated), I'd assume that if that argument worked in court, they'd just change the ToS to be more explicit about stopping it.
Who would you take to court? LibreSignal is simply distributing software, it's the users who are potentially breaking Signal's ToS by connecting to their servers using unauthorized clients.
This is like attempting to sue qBittorrent for copyright infringement.
Wow, I never really followed Signal's anti-federation drama that closely, but reading that thread is nuts. The LibreSignal folks just don't get it, despite Moxie's clear (at least to me) and plain language. The entitlement there is mind-boggling.
> And it's one of the last bastions of internet freedom.
I don't want to be too negative on Signal since they do some good work and I do use it.
But freedom? No. It is another completely proprietary platform. A better one, but still proprietary, so the antithesis of internet freedom.
For example, just earlier this month the Signal client overnight stopped working on my old Mac because they decided to no longer support older OSX releases. So I can no longer use it on that machine, my primary desktop.
If Signal was in any way open or free (as in freedom) I'd just compile my own client to speak an open protocol and be back in business. But no, Signal is just a proprietary service with a proprietary client.
>If Signal was in any way open or free (as in freedom) I'd just compile my own client to speak an open protocol and be back in business. But no, Signal is just a proprietary service with a proprietary client.
Isn't the source code available? What's preventing you from compiling your own copy?
The server is centralized -- you might be able to stand up your own but it doesn't matter because you can't use it to talk to anyone else who isn't using your custom built app that uses your server
In other words you're complaining that it's not federated? That point has been relitigated in other parts of this thread so I don't want to go down that path. More to the point, I don't think that's what the parent post is talking about. He's complaining how he can't run signal on his outdated machine, not that he can't run his own private server.
As far as I'm aware, everything is open[0]. Only issue I know of is that the server code isn't consistently up to date and you can't run your own. But you can compile the app and desktop clients yourself. I guess there's also the issue of reproducible builds but AFAIK this is a play store issue and doesn't seem that problematic since you can compile from source. I mean they even have a commit from 4 days ago for the Android app.
I believe what the grandparent comment meant was that you can't run a server that participates in the public network, not that you can't run a private server. That was my prior understanding, at least.
I might very well be wrong, and if so, someone please correct me.
That is correct. I should have been clearer in my distinction. You can run your own server but that server won't connect to the official Signal network. You're completely fine to run your own[0]. FWIW I've seen other software roll their own servers and use the Signal protocol. I mean WhatsApp uses the Signal protocol but I think they've diverged a lot since.
[0] There's always talk about the big deal breaker for Signal being that it isn't federated. So I've always wondered why this passion isn't used to generate a federated Signal network and is more focused on Matrix (who only recently started being E2EE). I don't know how these things work, I'm not that kind of programmer, but I can't see why you couldn't modify the server code to work in a federated fashion and edit the app code to be able to connect to both? I'm actually interested to know why if someone actually has an answer.
From my understanding, they're not a fan of it (not sure if it's officially against their TOS or not) but they don't go out of their way to stop them. At least as long as you don't use the Signal name and make it clear you're not an official app.
Even in this blog post about usernames, they clearly make sure to mention them: "This means that in about 90 days, your phone number privacy settings will be honored by everyone using an official Signal app."
How old an OSX are we talking? Is it older than what the current Xcode with Sonoma supports? If it's that, then you have your answer. If you want to daily-drive an older machine, Linux or even Windows should be fine, but this is not really the way with Apple hardware - if it were, Xcode would make this easier for the developer. For reference, you can still build for Windows Vista using the current Windows 10 SDK - I haven't tried the Windows 11 SDK, so not sure how things are there.
> We should support it. If you haven't already, then consider signing up for a recurring donation to the Signal Foundation.
I always like to remind people that you can also donate through your employer and many will match. This is a great way to multiply your donation and everybody wins. Your org is going to donate x amount a year anyways and so might as well "vote" on where some of this money goes.
It encrypts your metadata (the most important data) and doesn't use it to manipulate you. It's a non-profit. And now you can use it without exposing your phone number to other users.
Again: Metadata. WhatsApp records a timestamp of every message you send/receive, and who the other party is. Signal only records two pieces of metadata: timestamp of when you signed up, timestamp of the last time you sent a message.
Whatsapp only e2e encrypts message contents. The only thing Signal knows about you at any given time is the time of account creation and the date of your account’s last connection to Signal servers. That's tied to your phone number. They don't know who you chat with, the contents of those messages, your phone contacts, anything.
I'd get a chuckle out of comparing that with the privacy of Whatsapp.
My 2¢, as someone who tried using WhatsApp once and ran away screaming:
WhatsApp requires you to give it access to all your contacts (your entire address book) in order to use it at all. This information is uploaded straight to Facebook’s servers where they’ll inevitably use it to place your WhatsApp account in a social graph so they know who you are based on your contacts. I found this to be unacceptable so I uninstalled it.
Even if all the other things sibling posters mentioned didn't exist, the simple fact that Whatsapp is owned by Meta and Signal is not... well, that'd be enough for me.
1. Facebook owns WhatsApp and uses it to collect data about people, such as who they communicate with, how and when. They also know about many of the websites you visit and what you do there. They know everything you do on Facebook, Facebook Messenger and Instagram. They buy mountains of data about us from other sources. By analysing all of that data they can probably do a reasonable job at guessing the content of your WhatsApp messages.
2. WhatsApp tries to get every user to accept the option to backup messages and photos to Google Drive, where they sit unencrypted and accessible by Google. Even if you reject that option yourself, your correspondents are likely to have enabled it (if only just to stop WhatsApp from nagging about it) and so your messages are available for Google to read. Example of why this can be bad: https://www.vice.com/en/article/zm8q43/paul-manafort-icloud-...
3. Google Photos asks WhatsApp users if they'd like it to back up their WhatsApp photos. Even if you reject that option, your correspondents may have enabled it and so your photos are stored online unencrypted and accessible by Google.
4. Why should we limit what Google and Facebook know about us? Google and Facebook influence our behaviour for the benefit of their paying customers. Their computer systems are too powerful for our minds. They work against us, not for us. Companies like Facebook will come to be seen like tobacco companies, except that the harm is as from mind altering drugs. There is a documentary on Netflix called The Social Dilemma which explains this well. The polarisation of societies and the spread of conspiracy theories are some of the effects. The only defence is to disengage.
While I am thankful that Signal exists and is considerate of privacy concerns, I don't think their decisions are always right.
For instance, I would love to see pictures sent to me by my spouse automatically saved to the camera roll. Signal has no option for this because it could put the privacy of me and the sender in jeopardy.
I actually like it this way. Occasionally (not always, which is even more confusing), images from random WhatsApp conversations end up in the Android equivalent of my camera roll, and it annoys me to no end.
My camera roll is for photos that I have taken. If I want to put something from someone else in there, that's a decision I will pro-actively make. Other apps shouldn't be doing that for me.
WhatsApp has this feature and it drives me nuts. My roll is full of crap people (especially chat groups) send me and I have to clean it up every now and then. I surely hope Signal doesn't do this and keeps the current approach of allowing users the option to download the images they want, when they want.
They have a community forum with a feature request system. Though I'll admit it's a big echo chamber there. But every new user adds a new voice and I can't see how that isn't a good thing.
Fwiw, I want this feature too. And others. I've submitted feature requests in the past. I even asked that usernames add QR codes and links. I'm not sure if I was heard, but hey, the feature is there and even some of the echo people were against it.
They need to actually listen to users. Signal needs to support SMS, they need to support backups, they need to support easily migrating to new devices. I don't care if it makes me slightly less secure; make it a checkbox in the client where I agree that if I enable these features, I'm a moron, because some nation state could abuse them.
Otherwise, it'll always be niche. I'm never getting non-technical friends and family to adopt a messaging app that isn't unified for SMS and secure messaging. When they say "users might not know they're sending insecure SMS messages" - fine, you own the client. Make the client bright red with a flashing "INSECURE MESSAGES" across it for all I care. It's not hard to inform a user in 2024 that they are sending a less secure message.
Signal has so many footguns that I stopped recommending it. I know more than one person who lost all their messages and pictures when they switched phones.
> I'm never getting non-technical friends and family to adopt a messaging app that isn't unified for SMS and secure messaging
Er, what? So no one you know uses Whatsapp, FB Messenger, Telegram, Google Talk, or anything else? I suppose it's possible that's true, but even if so, you and the people you know do not represent the common-case user.
That's correct, but so what? So does Tor. The US isn't a single unified entity. They get some funding from groups that promote encryption. The government still wants encryption for its own people and for people in authoritarian countries (it's hard for normal people to overturn an authoritarian government when all communications are watched; no need to discuss the CIA). But also remember there are plenty of US government groups that attack Signal too. Just saying "US funded" isn't strong enough on its own. The government has its hands in everything, so it's too noisy. You'd need to make an argument about Signal's dependency on that money, and there isn't one. Records are public btw; they are a nonprofit.
I couldn't believe it when I first signed up for Signal and people who had my number were * sent notifications * that I had just signed up. This could've included people I had blocked on my phone.
Same. One included an unstable individual who I was happy had forgotten me. Suddenly he messages me out of nowhere -- "Oh hey, you still exist! And you just installed Signal.... hmm, given what day it is, I'm guessing you're at such-and-such event?"
I think the Signal devs hadn't thought this through at all and just blindly copied what Telegram was already doing thinking it must be cool and trendy with the masses, without understanding their core user base at all.
Same with prioritizing stories, stickers and crypto payments as core features of Signal when that's not what most of their users care for. Meanwhile there's still no official way to port your existing chat history on PC and iOS to your new device, or support for Android tablets. Obviously, stickers are more important.
Signal (and Signal's phone number model) predates Telegram. It was designed as an SMS and WhatsApp replacement; that is, it was originally designed to replace insecure phone-number-addressed systems.
Obviously, the cryptographic guarantees of the two systems aren't even close to comparable.
They're messengers. They have messenger features. The details of how those features are implemented is what matter. Last I checked, Telegram doesn't even have encrypted group messaging, and it has a serverside database of who's talking to who.
I don't know what "feature" you're talking about not existing until 2014, but before Open Whisper Systems, the thing we call Signal was "TextSecure", a literal SMS replacement.
This is true. At every point where Telegram and Signal had the choice between being a pleasant messenger experience or being secure and private, each made decisions consistent with all their previous decisions.
Forcing you to use your phone number, and then, the second you create your account, going behind your back and spamming everyone with the news that you just did so, is neither private nor something many would associate with secure.
I guess something doesn't have to be secure if you can pretend it is public.
Of course Signal has carefully designed their goals to allow them to do that, but it is still a straight-up asshole move in a context where they should be seeking trust.
Absolutely mind bending.
This is a great improvement, but they have already proven they can't be trusted with anyone's phone number so it is a damn shame they still won't allow you to create an account without one.
It is a decent service otherwise, but my fricking god I hope they at some point realize the harm they've done.
Up until today I've been ashamed of suggesting signal. Hopefully that will change with this feature.
My general experience in discussing this over the last 10 years is that nerds like us generally find it absolutely mindbending when privacy services make decisions in the interests of ordinary people, such as using the phone-number-based addressing ordinary people already use in order to minimize serverside metadata. But I think it mostly just speaks to how carefully people aren't thinking about the project's goals, and the fixation they have on their own goals. A lot of people are just super angry they can't write their own TUI for Signal.
Having to share your phone number does not meaningfully affect security and privacy. Being able to sign up without a phone number enables anonymity. Anonymity and privacy are related, to be sure, but anonymity is not required for privacy.
I think it's a mischaracterization to say that they spam "everyone" when you create an account. They only tell others who a) have you in their contact lists, and b) have an account with Signal too. I agree, though, that they should be more transparent about this, and require that you opt in to this behavior.
Personally, though, I don't mind it; for the most part this is how I've discovered other contacts on Signal, and vice versa. But I can understand why it makes some people uncomfortable.
What I find "absolutely mind bending" is that this is such a big deal-breaker for people such as yourself. While I wouldn't call it a nothingburger, it's -- to me -- at most a simple error in assuming what people are comfortable with.
Edit: I re-read what a few others had said upthread about how indiscriminate this new-user notification is. The examples of notifications being sent to users that had been blocked via the phone's built-in call/SMS blocking features are especially chilling. There's really no excuse for that, but still, to me this automatic notification is a feature developed in good faith, with good intentions, not some nefarious privacy invasion. They should be taken to task for its failings, but dismissing the entire platform over it seems a bit over the top.
> Having to share your phone number does not meaningfully affect security and privacy.
Of course it does, way more than "meaningfully". I actually wonder if I got your message right, given that I am not a native speaker of English.
Do you mean that if your phone number is public (and you are known to be the owner) you will not get creepy calls, have it listed as a free pizza delivery line for calls after 23:00, have it blacklisted, ....
I must have understood wrong.
Otherwise what is the number of your president/prime minister? Or the CEO of Google/Apple/... I do not think they are public.
TextSecure and RedPhone did not upload your contacts to the cloud. No need to be a security expert to know that it's unwise to leak user state to contacts. In fact, TextSecure (now Silence) was the first SMS app to have different colors for each contact to help the user avoid mistakenly messaging the wrong person.
Nothing about Signal is haphazardly borrowed from Telegram. The feature we're discussing was chosen to help Signal to grow from a few thousand users to 50M+ without needing to build a social graph on Signal servers.
This mechanism may not be ideal for all users, and it's possible that Signal has now outgrown it, but without it, there would be no Signal as we know it today.
>The feature we're discussing was chosen to help Signal to grow from a few thousand users to 50M+ without needing to build a social graph on Signal servers.
How did THAT feature help Signal grow?
You only receive that spammy message if you already have Signal installed and your contact already has it too.
Signal grew a lot in 2021 (in Europe) because of the pandemonium created by Meta when they announced a change in the WhatsApp privacy policy, so everyone rushed to install Signal, but the initial surge was short-lived.
Moving the clocks forward to today, looking at my extended network of family, friends and acquaintances, almost everyone has Signal installed, but most don't use it anymore as it's too frustrating and feels dead, so everything is still on WhatsApp, especially groups. All the Signal groups I have, originally meant to replace the WhatsApp groups, slowly died out and people stopped posting on them or following them, defaulting instead back to the WhatsApp groups.
You don't fix this lack of retention with stickers and spammy messages.
Stickers are more important because, just like every other tech company, growth is the only way to stay in business. You can't just run a business on delivering a good product to your customers anymore. You have to grow constantly, which means bringing in new customers which, by definition, aren't part of the core user base. It's gross and depressing and it enshittifies everything.
>You can't just run a business on delivering a good product to your customers anymore.
Who said Signal was a good product to begin with? And who thought adding stickers would improve market share?
Casual users value UX, porting their chat history, and VoIP calling vastly more than they value E2E encryption. You can't talk about growth when you fail to deliver on these fronts first. That's how Telegram and WhatsApp rule the market.
Adding stickers won't move the userbase needle when you already lost your potential users at the lack of chat history and UX.
That's not the point. The point is whether stickers make people love Signal. Stickers are popular on other platforms as well, but because those platforms are popular, not because they have stickers.
What fantasy land are you posting from? Signal has 40 million users as of 2022 (this was the first stat I found on a quick DDG search, which is all the effort your post deserves).
Also: "Who said Signal was a good product to begin with?" LOL. Just read the comments on this link bro.
How does Signal count its active userbase? Like I said, me and almost everyone else I know have it installed but don't regularly use it, because most people don't really like it versus the established Telegram and WhatsApp.
Signal is known to store two points of data per (hashed) phone number: the first login date, and the most recent login date. The second point is sufficient to get a user count.
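To make the point concrete, here's a rough sketch of how a user count could be derived from just those two timestamps. The record layout, the 30-day window, and the field names are assumptions for illustration, not Signal's actual schema or definition of "active".

    from datetime import datetime, timedelta

    # Hypothetical records: (hashed_phone, first_seen, last_seen).
    # Purely illustrative data, not Signal's schema.
    records = [
        ("a1b2...", datetime(2021, 1, 5), datetime(2024, 2, 20)),
        ("c3d4...", datetime(2020, 6, 1), datetime(2022, 11, 3)),
        ("e5f6...", datetime(2024, 2, 1), datetime(2024, 2, 26)),
    ]

    def monthly_active(records, now, window_days=30):
        """Count accounts whose last connection falls inside the window."""
        cutoff = now - timedelta(days=window_days)
        return sum(1 for _, _, last_seen in records if last_seen >= cutoff)

    print(monthly_active(records, now=datetime(2024, 2, 27)))  # -> 2

Whether "connected in the last 30 days" should count as active is exactly what the next comment disputes.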
Having a "most recent login" doesn't prove someone is an active user. I use it about once every two days, am I also an active users? Compare that to WhatsApp which most people use multiple times a day or even multiple times per hour, and you get the picture of how popular or lack thereof Signal is by comparison.
Like I said, a lot of people have Signal, but very few use it as their primary messenger on a regular basis; it's more of a "it's just there in case one of those tech nerds who told me to install it decides to message me on it".
Yes, that is definitely a non-standard definition of "active user". It's not really a relative term - if you're signed in and sending/receiving messages, you're an active user.
I was all excited about Signal, but rarely use it because of this very feature. Once it started sending me notices about other users, I was extremely not happy. I was very hesitant since one of the first things it did was ask for access to contacts. I'm still pissed at myself for allowing it.
Hi there, engineer on the Signal Android app here. Just an FYI that the notifications are generated on the receiving client by detecting that one of their contacts newly showed up as a registered user -- they're not "sent out" by you when you register or anything. Also, these notifications have defaulted to being disabled for the last 1.5 years or so. So only people who go into their settings to manually turn them on should be seeing them at this point.
That said, the complaint around this is usually that people don't want others to know that they use Signal. And unfortunately there was no way to _really_ do that (until now), because if you open your chat list, you'll see all of your registered contacts. But in the 7.0 release, we added the ability to hide yourself from being discoverable by phone number at all. So for people who don't want anyone else to know that their phone number is registered with Signal, they now have that option.
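For readers curious what "generated on the receiving client" means in practice, here's a minimal sketch of the pattern described above: the app keeps the set of contacts it last saw as registered and diffs it against the result of the next contact-discovery run. Names and structure are illustrative, not Signal's actual code.

    def find_newly_registered(previously_registered: set[str],
                              currently_registered: set[str]) -> set[str]:
        """Contacts registered now that weren't last time we checked."""
        return currently_registered - previously_registered

    before = {"+15550001", "+15550002"}
    after = {"+15550001", "+15550002", "+15550003"}

    for contact in find_newly_registered(before, after):
        # Only surfaced if the (now off-by-default) notification setting is on.
        print(f"{contact} is on Signal!")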
How come it wasn't the default right from the start?
How can a privacy oriented company not see the privacy implication of this? Sometimes, you want to be forgotten by some people, and Signal is telling them you are still there and active on that number. I remember reading a story about someone getting into real trouble for that.
Without "usernames", the proper way to handle it would have been to not let anyone know you are on signal when they look up your number. To get into contact, send a message, then the recipient will receive a notification with the message and an option to rely. If the recipient doesn't respond, from the sender point of view, it should be as if the account didn't exist.
I personally don't have a problem with this feature, and it's actually how I discovered Signal use among many of my friends.
But I think it's inexcusable that these sorts of notifications could essentially allow someone to circumvent blocking done by one of their contacts. If I've blocked someone via my phone's default contact blocking mechanism, and then I join Signal, and that person is already on Signal, they should not suddenly be able to contact me... and even be explicitly invited to do so on their end!
I wouldn't be surprised, though, if neither Android nor iOS gives regular apps access to the blocked contacts list. So I'm not really sure how an app like Signal could solve this problem.
After I realized this happened to me, I uninstalled Signal. But because of the way Signal jumps in and replaces normal SMS, I found out later that Signal users were no longer sending/receiving plain text messages to/from me properly. I forget the details, but it was really frustrating: first it ate my contact list and contacted them, then after I uninstalled it, it held those contacts hostage, breaking comms with them because those users didn't know they were still signaling me rather than using a normal text message. I text, they reply with Signal, I can't ask them to uninstall their app, so now if I don't reinstall the app myself or borrow a friend's phone to try and reconfigure it, then I guess we're now out of touch forever? It's not privacy-friendly to replace or hide built-in functionality; it's just an attempt to coerce people and to bolster your user numbers.
>now if I don’t reinstall the app myself or borrow a friends phone to try and reconfigure it then I guess we’re now out of touch forever? It’s not privacy-friendly to replace or hide built in functionality, it’s just an attempt to coerce people and to bolster your user numbers.
yeah, you need to authenticate to delete the account (aka deregister). How else would they verify that you are the owner of the account you want to delete?
So because they elected to blur the line between their own opt in service and a built in service, I have to jump through extra hoops to properly opt out and get my comms back up? That’s if you even realize any of this is happening. Whether it’s down to design or to negligence, that’s a pretty hostile user experience and it feels deliberate, especially since they pawed through my Contacts to “help” me into this position. I felt disrespected and no longer very confident in their stated values/mission. Hard to use or recommend after something like that
It would be interesting to know whether signal decided to fix the awful UX I’m describing or if the android/iOS app stores noticed the abuse and disallowed it
> We've discussed at length why this is not possible, but if you have more thoughts then please visit the forums. Please try not to open duplicate issues in the future, even if you feel like something is important.
The list of phone numbers with signal accounts is basically public. It kind of has to be. When a new number gets added and it matches someone in your address book, your app will tell you that one of your contacts has joined. People have always had the ability to turn off that feature, but that's not what the feature request seems to be asking.
People seem to be asking for a way they can join Signal without their number showing up in the registry of Signal users. This is why it's "not possible".
edit: This may have changed today. I'm now seeing an option that lets me hide my number from the registry. This means that even someone with my phone number will not be able to message me on Signal, which seems like a good deal to me.
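Here's a sketch of what that new option implies for discovery, under assumptions about the data shapes involved; it isn't Signal's real API or schema.

    # Hypothetical registry keyed by phone number.
    accounts = {
        "+15550001": {"discoverable_by_phone_number": True},
        "+15550002": {"discoverable_by_phone_number": False},  # opted out
    }

    def discover(address_book: list[str]) -> list[str]:
        """Return only contacts that are registered AND discoverable."""
        return [
            number for number in address_book
            if accounts.get(number, {}).get("discoverable_by_phone_number")
        ]

    print(discover(["+15550001", "+15550002", "+15550003"]))  # ['+15550001']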
Yes, this drove at least two people I know and had encouraged to use it off the platform. When people see this they also think that Signal snooped their contacts. Very bad.
This and the iPad "We'll remind you later" iPad notification nag are significant problems. I am a big supporter of Signal, but it's certainly hostile to those escaping an abusive situation. Usernames are a step in the right direction at least.
I know some people defend Signal out of ignorance or loyalty, but I suspect there are some paid shills for Signal now. I don’t see how anyone with a bit of security awareness (which is the reason to use Signal instead of whatsapp) can justify using a phone number as an ID in 2024..
> Note that if provided with the plaintext of a username known to be in use, Signal can connect that username to the Signal account that the username is currently associated with. However, once a username has been changed or deleted, it can no longer be associated with a Signal account.
The "no longer associated", I will need to get Signal word for that, right. (You cannot cryptographically prove something was deleted, right.)
You shouldn't need to cryptographically prove that an old username is unavailable. You should be able to simply send a request to Signal servers asking if it's available and receive "no" as a response.
You'd have to take their word that this wouldn't change, though.
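Conceptually, the claim rests on the server only keeping a current mapping from username to account; once that row is gone, the only question it can answer is "is this name free?". A toy sketch of that idea (an assumption about the design, not Signal's implementation):

    # Hypothetical mapping; Signal actually stores a hashed form.
    username_to_account = {"alice.01": "ACCT-1234"}

    def is_available(username: str) -> bool:
        return username not in username_to_account

    def release(username: str) -> None:
        username_to_account.pop(username, None)  # the old association is gone

    print(is_available("alice.01"))  # False
    release("alice.01")
    print(is_available("alice.01"))  # True, and nothing left to associate

As the comment above notes, you still have to take their word that the row is actually deleted.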
I just donated the minimum amount to Signal through the app (~$3), I encourage all other users to do the same, because every time a Signal article is posted it’s a reminder how dystopian IM would be if there was no realistic, privacy respecting option for ”normal people”.
It’s probably the only piece of privacy friendly software I’ve recommended to older relatives that actually stuck. It’s not fancy, but it’s solid, simple and does what it’s supposed to.
How is it a “humble brag” that I donated three bucks to an open source project? The only reality here is that you’re acting like a fundamentally unpleasant person
> Your username is not stored in plaintext, meaning that Signal cannot easily see or produce the usernames of given accounts. [Footnote: Usernames in Signal are protected using a custom Ristretto 25519 hashing algorithm and zero-knowledge proofs. Signal can’t easily see or produce the username if given the phone number of a Signal account. Note that if provided with the plaintext of a username known to be in use, Signal can connect that username to the Signal account that the username is currently associated with.]
(emphasis mine)
Couldn't Signal just brute-force all possible usernames in order to connect them with their accounts?
All in all, it seems usernames are just as public as anywhere else and the encryption part sounds like snake oil. Ok, maybe once more they try to protect the username table (or its equivalent in the zero-knowledge proof algo) from getting probed too often by means of an Intel SGX enclave or something, but I wouldn't want to trust SGX either.
I don't agree with the 200-bit estimate. Usernames will typically not be random and will have much less entropy.
Either way, I was not talking about brute-forcing a single username. What I suggested was that Signal could loop over the space of all possible usernames. Every other name would be a hit (i.e. exist) and reveal the account ID, possibly even the phone number, of that user.
Hell, couldn't regular users do the same? The blog post at least doesn't mention anything about rate limits when probing usernames.
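To illustrate the enumeration concern: low-entropy usernames are cheap to guess regardless of how they're stored, if lookups aren't rate limited. In the sketch below a plain SHA-256 table stands in for Signal's actual Ristretto-based construction (which this is not), and the username format is only an approximation.

    import hashlib
    from itertools import product

    def username_hash(username: str) -> str:
        return hashlib.sha256(username.encode()).hexdigest()

    # Pretend registry: hash -> account id (illustrative only).
    registry = {username_hash("alice.23"): "ACCT-1234"}

    common_names = ["alice", "bob", "carol"]  # a real attacker would use a large wordlist
    suffixes = ["".join(d) for d in product("0123456789", repeat=2)]  # the mandatory digits

    for name, suffix in product(common_names, suffixes):
        guess = f"{name}.{suffix}"
        account = registry.get(username_hash(guess))
        if account:
            print(f"{guess} -> {account}")  # alice.23 -> ACCT-1234

Three names times 100 suffixes is only 300 lookups; whether this is practical against the real service depends entirely on rate limiting, which the blog post doesn't describe.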
> A username on Signal (unlike a profile name) must be unique and must have two or more numbers at the end of it; a choice intended to help keep usernames egalitarian and minimize spoofing.
Interesting choice. I’m guessing most people might just use the last two digits from their birth year, to make it easy to remember.
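For what it's worth, the stated rule is easy to express as a validation check. The separator, character set, and length limits below are guesses for illustration, not Signal's published spec; only the "two or more numbers at the end" part comes from the post.

    import re

    # Assumed shape: a word part, a dot, then two or more digits.
    USERNAME_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]{2,31}\.[0-9]{2,}$")

    for candidate in ["alice.01", "alice.1", "alice.1984", "alice"]:
        print(candidate, bool(USERNAME_RE.match(candidate)))
    # alice.01 True, alice.1 False, alice.1984 True, alice False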
When they announced usernames, I thought I would be able to install Signal on my TV desktop (Linux) and send / receive messages to and from it (links, files, etc).
Now that I know it still needs a phone number, I assume the number will need to be unique, so my use case fails.
For the record, I am still a happy Signal user and a monthly supporter, thank you very much.
Just hair-splitting obviously, but I don't think it's really a contact; it's just what shows as the recipient when you send something to your own number.
Most of the use-cases for requiring a phone number to sign up for a service (e.g. Twitter, Signal) seem to be about avoiding spam. At least allegedly!
What alternatives could be used instead -- something that is easily accessible/available to the general public but not easy to obtain in bulk for creating mass accounts?
Instead of heavily limiting account creation, Discord for example limits the possibility to message users outside of your network by default. Only people you have added as friend or you share a server with are allowed to message you by default.
For Signal that would be harder to implement since it's more focused on 1-on-1 chats instead of groups; maybe if spam gets out of hand they could use a grey-listing approach like Instagram does, where users outside your network get moved to the "message requests" inbox by default.
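A rough sketch of that gating rule, with all names hypothetical: messages from inside your network go to the normal inbox, everything else lands in a requests folder instead of being blocked outright.

    from dataclasses import dataclass, field

    @dataclass
    class Inbox:
        contacts: set = field(default_factory=set)
        shared_groups: dict = field(default_factory=dict)  # group name -> members
        primary: list = field(default_factory=list)
        requests: list = field(default_factory=list)

        def deliver(self, sender: str, message: str) -> None:
            in_network = sender in self.contacts or any(
                sender in members for members in self.shared_groups.values()
            )
            (self.primary if in_network else self.requests).append((sender, message))

    inbox = Inbox(contacts={"alice"}, shared_groups={"hiking": {"bob"}})
    inbox.deliver("alice", "hi")      # contact -> primary
    inbox.deliver("bob", "hey")       # shared group -> primary
    inbox.deliver("mallory", "spam")  # stranger -> requests
    print(len(inbox.primary), len(inbox.requests))  # 2 1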
Discord, while overall better than Telegram for privacy, will flag your IP / device / identity and require a phone number for new accounts if you do something like use a message archiver to back up conversations. Took me years to get the block removed (but not for my work account). It was a privacy nightmare for me, and when I had to get an account for work I had to sign up for an additional cell phone service, which has cost me thousands to this day.
I’m still nervous about making new accounts in case it triggers some process to lock me out of my one account that I don’t have a phone number for. I couldn’t join the baldurs gate 3 discord to find people to play the game with because it required a phone number on the account, which I was already forced to use for my work account.
On the other hand, I’m glad they actually do enforce their rules, unlike Telegram (which is a haven for scammers, pedos, radical communists, open market drug dealers, and terrorists, not to mention the soul-depleting interactions I’ve had overall with chat rooms there)
Telegram can’t be trusted to not broadcast your pseudonym to whatever work or other contacts you might have saved on your phone.
Telegram doesn’t allow for changing one’s name per group chat like Discord does, so if you want to be known by a certain name only in a certain place, you need another account, which means you need another phone number. That’s not privacy.
As far as message contents go, people can click two buttons to export the entire chat. Or even delete the entire history. Then they’ll have the history and you won’t, and malicious actors (which are everywhere on Telegram) will be able to take your words out of context and use them against you. That’s not privacy.
On Discord at least you have some protections against that by having hoops to jump through to export messages (risking account ban or account creation limitation), helping keep people honest. The account age is also visible in plain sight, and that is very much a data point that people use to ward off potential bad actors.
Another dimension of privacy is I might want to read a message without knowing the sender read it. You may or may not care about that.
If you want to sell drugs and remain private, Telegram might be less likely to report you to the government.
Privacy is not about what features your app might have, it’s about what the user ultimately experiences, and the specific risks they take. The threat model suited for the average person is much more unfavorable on Telegram than on Discord.
It’s a more complicated situation than being able to say “we can do end-to-end encryption”.
> Telegram can’t be trusted to not broadcast your pseudonym to whatever work or other contacts you might have saved on your phone.
Disable contact sync; you don't need it for Telegram to work.
> Another dimension of privacy is I might want to read a message without knowing the sender read it. You may or may not care about that.
Long press on chat
> Then they’ll have the history and you won’t
You have a button to "nuke" chat for both of you from server. Then they'll have no proof except their words that "I didn't make this up, I honestly exported chat". Honestly, I don't really get this point. Can you elaborate a bit? What is the problem with exporting?
> If you want to sell drugs and remain private, Telegram might be less likely to report you to the government.
And with Telegram you can register an anonymous "number" on the blockchain via Tor and not even care about Telegram reporting you to the government. Is this intended to be a point against Telegram?
> threat model suited for the average person is much more unfavorable on Telegram than on Discord
Discord has a ToS that allows it to read your private chats and ban you for privately discussing things that are against their code of conduct. This makes any sentence with the words "Discord" and "private" a complete joke. It's a non-private messenger that never even pretended to be private. Maybe somewhat private if your main threat is random internet trolls.
If anyone wants to help me add the thinnest layer of security possible to the signal desktop app please reach out to me. It needs the option to use a pin to unlock, like, yesterday.
As it stands, if you let someone use your computer and you have signal desktop, they can see all your E2E texts. Desktop computer sharing is much more common than the devs acknowledge. Also there have been several high profile cases of federal agents squatting on a confiscated laptop, keeping it awake and eavesdropping on signal group chats without the other participants’ knowledge. See the evidence in the FTX trial as a recent example.
I'm a python programmer, and I have zero experience changing the internals of an electron app, but this is a big deal to me.
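For what it's worth, the "thinnest possible layer" part is straightforward on its own: a salted, slow hash of the PIN checked before showing the UI. This is a generic sketch, not Signal Desktop code, and (as the replies below note) it wouldn't protect the message database on disk, only casual use of an unlocked shared machine.

    import getpass
    import hashlib
    import hmac
    import os

    def hash_pin(pin: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
        """Derive a slow, salted hash of the PIN (PBKDF2-HMAC-SHA256)."""
        salt = salt or os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)
        return salt, digest

    def verify_pin(pin: str, salt: bytes, expected: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)
        return hmac.compare_digest(candidate, expected)

    if __name__ == "__main__":
        salt, stored = hash_pin(getpass.getpass("Set PIN: "))
        while not verify_pin(getpass.getpass("Unlock: "), salt, stored):
            print("Wrong PIN")
        print("Unlocked")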
If the feds have your device, they have everything, regardless of how hard you try to lock it down. It's not worth even considering how to keep them out, because you're simply not going to.
Also consider that a sufficiently motivated private threat actor is likely going to break a pin (there's not enough entropy there), or they'll just hit you with a $10 Harbor Freight pipe wrench until you tell them the pin.
For everything else, bitlocker, LUKS, or equivalent is more than sufficient and battle tested for those uses. Yes there are ways of breaking both, conditional on XYZ, etc, but, they're good enough. It does force you to multiboot, but that's good practice anyway, no reason someone using your computer should be using your root partition in 2024.
Ugh. I shouldn't have even mentioned the feds. This isn't threat actor level stuff. The thing that bothers me is that if I let a non-technical user onto my computer to do something like write an essay for school, they might stumble upon some messages that should have been private. There's zero protection. That's why I called adding a pin the thinnest possible level of security.
It might not please you to learn that Signal Desktop stores your messages in a trivially-read SQLite database. But it may dissuade you from trying to lock the client with a pin.
There is a universal way of fixing this for all your desktop apps: lock your computer. It works similar to locking your phone: when it's locked, you have to first unlock it with a password or something in order to start using the device and its apps again. As long as it's locked, all your data is protected.
Personal computers are big. They don't fit in your pocket. They don't lock when you hit a small button on the side. They are often shared. Your argument about locking down the whole computer is brought up every time someone wants this feature. The reality is, we want signal to be an application that anyone can use. Not everyone is a single male with enough income to have their own private computer.
So what about your mail client, team chat app, browser (with all your accounts logged in to your sites), terminal, and all your files and network mounts?
I would rather make this a feature of your desktop environment / window manager. Then you have this functionality for all apps, and the apps themselves don't have to make that functionality.
Edit: actually maybe what you're looking for is to have multiple accounts on one computer. Then every user has their own desktop environment with their own apps and data and apps are not shared among users.
Signal originally marketed itself as an application to replace sms texting. People put more personal information into their sms chats than they do into a discord chat or email typically. Every other chat application I use on the desktop has the option to log out.
I'm not going to argue the justification for this functionality with you all any more, and it is a little strange and suspicious that there are so many of you all opposed to a simple option being added to an app. It's a little insulting that you and other commenters think that multiple accounts on a computer and locking a computer with a keyboard shortcut are novel concepts that need to be explained.
If anyone is reading this who would like to help me instead of arguing against the implementation, my email is in my profile.
If you want the functionality inside Signal, then only the Signal devs will be able to help you. If you want a solution outside of that, look into desktop environments that have a feature to lock a specific app. Or use multiple accounts; that is what they're designed for, I would say.
Computers have keyboard shortcuts for locking. Some even have dedicated/configurable single buttons for locking, so they are just as easy to lock as phones are IMO.
So basically copying Telegram's approach. That being said, why does Signal still require a phone number in the first place? Exactly: because when needed, it can be used to link back to your real identity. It has nothing to do with spam or anything; Signal isn't a social media platform with public posts and whatnot, it is a messaging app.
> We use third-party services to send a registration code via SMS or voice call in order to verify that the person in possession of a given phone number actually intended to sign up for a Signal account. This is a critical step in helping to prevent spam accounts from signing up for the service and rendering it completely unusable—a non-trivial problem for any popular messaging app.
I'm not sure why you need to assume that it will be linked back to your real identity; I haven't seen anything that indicates any motivation to do something like that. I'm all for being cautious, but being overly cynical can lead to letting the perfect be the enemy of the good.
For the spam part, I commented below on how that doesn't work, and how it doesn't even make sense for a messaging app.
> I'm not sure why you need to assume that it will be linked back to your real identity;
I’m not assuming: only North America (edit: and some European countries) doesn’t require an ID for a phone number (1), and even here, you would use it in other services that are linked to your real ID, like banks or paying the phone bill online. The concept simply boils down to: as soon as you find an account’s phone number, it’s game over for that privacy.
> The concept simply boils down to: as soon as you find an account’s phone number, it’s game over for that privacy
You completely misunderstand what kind of privacy Signal aims to achieve. Signal protects you from eavesdropping and data hoarding, two major privacy issues with solutions like Facebook Messenger for example.
They do not and have never claimed to offer a service where “privacy” means nobody knows who anyone is, it isn’t Tor and I wouldn’t want it to be.
If you don’t like the goals and design choices of Signal, just use another service.
There are benefits of the choices they’ve made, namely ensuring that most users of the service are “real people”, which I think is great. It’s not a social network, it’s a messaging app between friends that solves issues presented by alternatives like SMS or Instagram; that’s it.
It's a lot less like data hoarding than keeping a separate copy of your social graph. What is an adversary going to do with a list of phone numbers that are known to have signal accounts and nothing else?
Hoarding =/= collecting the bare necessities. Signal needs one piece of data to distinguish users from each other, and collects that. Hoarding would be to collect (significantly) more pieces of identifying data, more than needed to distinguish users. Signal does not appear to be doing that.
Because they don’t know anything except the phone number, so all they have is a list of phone numbers that people may or may not use. Quite different from Facebook reading everything you send, for example.
And how do these black markets connect the phone numbers to names? I guess from data collected from more insecure sources. So I think Signal is being responsible with their data.
Also, you need some way to log in to your account. So you need an identifier and some way to validate that you are the owner of that identity. And next to that you want to prevent spam. So I think the choice to use a phone number as an identifier for a text-messaging app that is meant to be a secure replacement of SMS is not that weird.
But let's say they are data hoarding our phone numbers, and they can get other details about us through the black market because we use other more insecure services where we suddenly don't seem to care about privacy. Then what do you think Signal does with this data? They can't resell it because they don't have anything unique, they actually need to invest money to link their database of just phone numbers to something else. And then? What malicious things will they be able to do?
Ok, now you have a list of people's names and you know they have signal installed. Google and Apple also have this (presuming you installed it via a mobile app store). Your carrier has this (from the IP addresses on your messages).
What have you gained? What does the attack look like?
> On the opposite end of the spectrum, users who want to live on the edge can enable an optional setting that allows them to receive incoming “sealed sender” messages from non-contacts and people with whom they haven’t shared their profile or delivery token. This comes at the increased risk of abuse, but allows for every incoming message to be sent with “sealed sender,” without requiring any normal message traffic to first discover a profile key.
By default, the first message between someone and you clearly identifies who is communicating with whom. That's enough.
We don't know whether an intelligence agency is listening in on their servers and logging this data.
Assuming an eavesdropper that can defeat TLS or is listening via DMA attacks on the Signal servers,
- you can log initial signup or login, which allows you to connect user id and phone number
- you can log the first time a chat is created, which allows you to build a social graph of which person is connected to which other people
- even with sealed sender, you still know the identity of the receiver and the IP address of the sender, which is often enough to figure out who is in contact with whom
This would be enough dragnet surveillance to automatically figure out the contacts of people you've already identified as threats. You'd also have enough evidence to get a sealed court order to do targeted surveillance on these people.
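To make the last step concrete, here's a minimal sketch of that kind of graph-building from logged events. The tuples and field names are hypothetical; the point is only that, once such logs exist, expanding outward from an already-identified target is a trivial graph walk.

    from collections import defaultdict

    # Hypothetical logged events: (sender_ip, recipient_id, timestamp).
    events = [
        ("203.0.113.7", "user-A", "2024-02-20T10:00Z"),
        ("203.0.113.7", "user-B", "2024-02-20T10:05Z"),
        ("198.51.100.9", "user-B", "2024-02-21T09:00Z"),
    ]

    graph = defaultdict(set)
    for sender, recipient, _ in events:
        graph[sender].add(recipient)
        graph[recipient].add(sender)

    target = "user-A"
    print(graph[target])  # {'203.0.113.7'}: the endpoint that contacted the target
    # One more hop: everyone else that endpoint talked to.
    print({c for n in graph[target] for c in graph[n]} - {target})  # {'user-B'}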