1. You should look into what other messengers do with sender/receiver pairs information. One very popular competing messenger logs pairs permanently, serverside, in order to make UI features work.
2. One of the least popular attributes of Signal (on Hacker News, at least) is its lack of federation and ability to interoperate with third-party clients. This feature is a pretty crystalline example of the kind of protocol change you can make when you control all the mainstream clients, and that would be an absolute nightmare for a protocol where you didn't.
Personally I would be fine if they were in control of the only implementation but provided libraries that could be used to create bindings and gateways for other clients and protocols. My main problem with Signal is that their desktop app manages to be slower and clunkier than the average Electron app, which is quite a feat on its own.
If they provided a basic shared library, or maybe even a standalone "headless" binary client you could interface with through some socket protocol, it would let me write a gateway to use irssi or whatever not-completely-trash and actually configurable client instead of that glorified webview that manages to be worse than an actual webpage (I still haven't figured out how to change the spellchecker settings in the edit box, since right-clicking doesn't offer the option to change the dictionary like in a real web browser). Also, I've just started it and it already uses 158 MB of resident memory, but that's business as usual for the garbage fire that is Electron.
As far as I can tell that wouldn't make it harder for them to develop and improve the protocol since they would still completely control that bit of the code.
Frankly if it's just that nobody did it I might seriously consider adding Rust bindings and making a basic IRC gateway or something like that.
I would be very shocked if he would be any nicer about someone writing an entirely new app from scratch that uses their protocol and interoperates with their servers. If he were, Signal on F-Droid wouldn't have been an issue.
However you can't label this "Better Signal" (they own a trademark on the Signal name) or connect it to Signal's network.
Which is precisely what this thread is talking about.
What I had in mind was to make a gateway that would act as an IRC server, not client. Basically I'd run it alongside my usual IRC client and use it to access Signal discussions using the same interface that I already use to browse IRC. The IRC connection would be local and completely trusted so as far as I understand it nothing would be "destroyed". Alternatively I could just write an irssi plugin that would directly connect to Signal for instance.
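A minimal sketch of that gateway idea. Only the local, IRC-facing half is shown; the Signal side is entirely hypothetical (there is no public API assumed here), and a real gateway would hand `PRIVMSG` payloads off to it instead of dropping them:

```python
import socketserver

def irc_reply(line):
    """Map one inbound IRC command to the gateway's reply, if any.

    Only a tiny subset of the protocol is handled. In a real gateway,
    PRIVMSG targets (e.g. '#signal/<contact>') would be mapped to
    Signal conversations by a hypothetical backend, not handled here.
    """
    if line.startswith("NICK "):
        nick = line.split(maxsplit=1)[1]
        return f":gateway 001 {nick} :Welcome to the local Signal gateway\r\n"
    if line.startswith("PING"):
        token = line.split(maxsplit=1)[1] if " " in line else "gateway"
        return f"PONG {token}\r\n"
    if line.startswith("PRIVMSG "):
        # Would be forwarded to the (hypothetical) Signal backend.
        return None
    return None

class GatewayHandler(socketserver.StreamRequestHandler):
    """Accepts a local, trusted IRC client connection (e.g. irssi)."""
    def handle(self):
        for raw in self.rfile:
            reply = irc_reply(raw.decode("utf-8", "replace").strip())
            if reply:
                self.wfile.write(reply.encode())

# To run it locally (bound to loopback only, since the link is trusted):
#   socketserver.ThreadingTCPServer(("127.0.0.1", 6667), GatewayHandler).serve_forever()
```

Since the IRC connection never leaves the machine, all the actual cryptography would still live in whatever Signal-sanctioned library sits behind the gateway.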
Basically what I want is to use Signal as it is today but with a different UI. I don't want to tweak the protocol or anything like that and I'm perfectly happy using a Signal-sanctioned implementation of the protocol.
I suppose they could forbid non-official clients to connect to the network (although I'm not sure why they would care) but then since as far as I know the server-side software is not available in any form it's not like I can rehost it anyway.
In practice I suspect that if you just build this and use it, that will actually work. If you make a big song and dance, I suspect Moxie will tell you this is not allowed.
Irssi's codebase is probably a tiny, tiny fraction of the size of the Electron framework, and it's super stable these days. Meanwhile, didn't Signal suffer an HTML tag injection vulnerability a while ago? But anyway, that's beside the point.
>In practice I suspect that if you just build this and use it, that will actually work. If you make a big song and dance, I suspect Moxie will tell you this is not allowed.
Okay I see. I hope eventually they'll open the door a little so that we'll be able to build around the application in broad daylight. But hey, that's their project and they give it out for free so who am I to complain? Maybe one day I'll seriously consider implementing Simias Comms 1000 (tm).
Thank you for the feedback.
This feature is an example of what happens when you don't solve the metadata problem properly in the first place. It doesn't really work that well, but unless Signal starts routing through something like Tor (a system they don't and can't control), they are stuck with half measures.
At some point in the future maybe the roles will reverse there, and interoperability will be a bigger issue. For now, I think Signal's userbase is benefiting from the decision. If it became necessary, presumably they can alter their stance in the future.
How so? Are things like OMEMO different cryptographically from Signal protocol?
All the open/interoperable solutions that I found were sufficiently broken that I no longer use them, since it felt like I was spending more time debugging them than actually communicating.
I wish you'd name things and explain what exactly they are doing.
Putting "user experience" anywhere near PGP/GPG makes most gpg users immediately gag.
gpg offered a solution to encrypting email. No one uses it because it's unusable.
gpg offered a solution to releasing signatures beside releases with the idea that users could verify them through the web of trust.
The fact that exactly zero people notice when the person signing changes, or when a technical error results in an invalid signature, shows that no one actually verifies signatures.
XMPP, when it needed e2e encryption, could not use pgp/gpg because they're effectively unusable for that job.
gpg's adoption rate is not great, and any adoption it has is entirely in-spite of its bad UX, not because of it.
Every time I get Fedora updates they're GPG signed, if you compromise a mirror and shove crap there it isn't signed and so the system will just try a different mirror. If you try compromising the metadata, that's signed too.
Instead of my bazillion of web site passwords being "in the Clown" as seems to be the popular style these days, or in some fool's hand-built database, they're in Donenfeld's password store, which uses GPG to encrypt text files with passwords in them.
On the other hand, the point of the feature is to partially compensate for a privacy weakness that exists in part because of the lack of federation/decentralization.
If you’re trying to have truly secure communications with someone, having a single party that could theoretically be logging the sender and receiver identity (as easily personally identifiable phone numbers!), timestamp, and sender/receiver IP addresses of every single message is a serious weakness. There’s a reason people on this forum were up in arms over the NSA’s metadata collection program even though it was limited to the same type of information. As an example – while I’m on the subject of the NSA – if Signal had been around in 2013 and Edward Snowden had used it to provide his leaks to journalists, the metadata of who he was contacting would have been almost as dangerous to him, should the NSA have obtained it, as the actual contents of the messages. For a somewhat less dramatic example, the same would apply to the various Trump administration officials who have been leaking to journalists, apparently using encrypted messaging apps including Signal.
Of course, this is a hard problem in general and decentralization wouldn’t be the whole solution. Indeed, it wouldn’t even be the start of the solution. In the above examples, the most important first step Signal could take to protect the leakers’ privacy would not be any kind of decentralization, but simply allowing users to create accounts without a phone number, allowing them to communicate with an identity not trivially linked to their real-life identity. Ideally they would then access these accounts only through Tor, disable push notifications to avoid leaking identity through push tokens, and avoid accessing them at the same time and through the same pipe as their normal accounts (if applicable), to prevent the server from trivially correlating accounts using traffic analysis; the second step Signal could take would be building those measures into the app itself, since they’re very easy to screw up if done by hand.
However, this scheme, even if implemented well, would still have some weaknesses. For one, the anonymization Tor provides can theoretically be compromised, especially with the funds of the kinds of adversaries that would be in a position to compromise the Signal central servers. More importantly, even if the adversary can’t directly identify the parties communicating, they can still build a web of who’s contacting who, when, and how often, information that can be a source of important insights – including the potential to lead to deanonymization.
And then there’s the case of “everyone else”: people who aren’t going to take drastic measures to hide their identities, who probably don’t need to, but would still like their communications to stay private, including metadata. To stick with the same scenario, consider the internal communications among the journalists at a news organization after something has been leaked to them (or really, in general). From the types of people communicating with each other and the amount of communication, you might be at least able to guess, say, when a major bombshell is about to drop, as well as who exactly is involved. As another example, consider the anonymous op-ed from an official in the Trump administration that the New York Times published last month. The paper has openly acknowledged three top employees as knowing the identity of the author, but learning the names and roles of lower-level employees involved could provide clues as to who the author is. (Of course, the New York Times is largely shielded by legal protections and so probably doesn’t have to take extreme technical measures, but that’s not the point.)
That’s where decentralization comes in. If the New York Times could host its own Signal server, they could attempt to protect that metadata themselves rather than relying on a third party. First consider the case of internal communications. They would still have to worry about an adversary performing traffic analysis at the transport layer, but for internal communications, this is no worse with a private server than with a central one, and sometimes better (if the transport path is trusted); and in general, traffic analysis doesn’t leak as much metadata and is easier to mitigate. Then there’s the risk of their own server being compromised. Admittedly, when it comes to some types of attacks (hacking, as opposed to legal threats), for the Times and most similar organizations, having their own server would probably be a regression rather than an improvement, since their IT staff doesn’t possess the same level of security expertise as Open Whisper Systems. But that isn’t the case for every organization that has secrets worth keeping, and of course hosting your own server would always be an option, not a requirement. On the other hand, they would at least be able to force adversaries to target and actively attack their organization individually, rather than hacking one centralized service and getting everything. In practice I think that would be a significant deterrent, especially for smaller organizations.
What about communications with a leaker? Well, there are a few possibilities, depending on the type of decentralization. In all cases, the news organization could have some sort of mechanism to give leakers accounts on their server (or better, a separate server set up for the purpose). If the system were based on federation, the leaker could alternately communicate with them from an account on a server that they trusted. Obtaining a trusted server wouldn’t be a difficult task for technical users like Snowden, but would be more so for others. I’d prefer more of a peer-to-peer design where a cell phone or laptop could be its own peer and communicate directly with someone else on their server, without specifically needing to have an account on that server.
In any case, traffic analysis would be much more of a threat for a leaker than with a central server, since knowledge that they were connecting to a news organization’s private server would be much more revealing than merely knowledge they were using Signal. I admit that’s sort of an inherent disadvantage to decentralized designs – but it’s one that can be mitigated using Tor or other types of proxy setups. It’s much harder for a user to mitigate the risks of a central server.
Edit: And for the record, the new sealed sender feature will reduce the risk of metadata collection, but a compromised server probably wouldn't find it hard to correlate a user's anonymous message-sending connections with their main authenticated connection. In many cases, IP address would be enough; in others, you could look at the pattern of messages being sent back and forth in a conversation.
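A toy illustration of that correlation risk (the numbers and data model here are made up, not Signal's): even when messages arrive without sender identity, a server logging activity timestamps can rank which authenticated account was co-active with the "anonymous" sends:

```python
def correlate(anon_times, account_times, window=2.0):
    """Rank accounts by how often their authenticated activity
    coincides (within `window` seconds) with anonymous sends.

    anon_times: timestamps of sealed/anonymous sending connections.
    account_times: {account_id: [timestamps of authenticated activity]}.
    """
    scores = {}
    for acct, times in account_times.items():
        hits = sum(
            any(abs(a - t) <= window for t in times) for a in anon_times
        )
        scores[acct] = hits / len(anon_times)
    # Accounts active at nearly every anonymous send are likely matches.
    return sorted(scores, key=scores.get, reverse=True)

anon = [10.0, 55.2, 90.1]
accounts = {
    "alice": [9.8, 55.0, 90.3],   # consistently co-active with the sends
    "bob": [300.0, 410.5],        # unrelated activity pattern
}
print(correlate(anon, accounts)[0])  # "alice" ranks first
```

Real-world correlation would of course use richer signals (IP addresses, conversation back-and-forth patterns), but even this crude timing heuristic shows why sealed sender alone doesn't stop a compromised server.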
If the feature is well-documented, and you announce it well-ahead of final release, why? Anyone who is actively maintaining their client can add the feature. Those who aren't will either be forked or die. Nobody said it was on the core product to update third party clients. See: AIM/ICQ and trillian.
BBM back in the day worked great with their unique "PINs", that could be shared by QR code, and I could reject an "add" request.
The answer is always the same:
Phone numbers bootstrap a workable social network for ordinary users. Signal's goal is to transform all ordinary messaging into secure messaging. Not elite secure messaging. All messaging. The most popular messaging application in the world uses phone numbers for identifiers (as, obviously, does SMS). That's the goal they've set for themselves.
The simple answer for a lot of questions about Signal is that they aren't trying to solve every problem, or even many of HN's problems.
If phone numbers are super-problematic for you, use Wire. Consider carefully the privacy tradeoff you'll be making, though.
What it doesn't answer is: why can't Signal also provide an option to add a contact using something other than a phone number? Bold, italics, and double-underline on the also. Is it simply a question of finite developer time?
I have multiple specific segments of the population in mind who would benefit greatly from the secure private messaging Signal provides but for whom exchanging phone numbers is a total non-starter. "Ordinary", non-technical, non-"elite secure" people. I hate the framing of this problem as some sort of niche techie elitism. That's a dodge that reflects a social blind spot of its own.
By way of analogy: In every HN thread about Firefox there are dozens of comments by people who say that they just couldn't use Firefox because page loads are so awfully slow.
Consider that for a second: HN readers are willing to give their entire browsing history to Google in return for at the very worst a few dozen microseconds of load time saved.
The equivalent here is, that for non telephone number identifiers you have to create a whole UI for actually adding/discovering people. That's the equivalent of the few microseconds. You can argue that it's not a big deal, but it's exactly the type of friction in the ecosystem that hinders adoption ("I already have your phone number, why do I need another number to contact you? Know what, I'll just send you an SMS.").
And it's exactly the type of UI/UX problem that prevented encrypted email adoption ("Download a new program and some sort of key for you? I'll just send you an email directly, I'll figure this out later...").
I don’t want to be discovered. I want to give you my pseudonymous pointer and you can ping me to connect.
Using phone numbers to ‘discover’ doesn’t solve the problem you say prevented encrypted email adoption, that’s a next step in the dance.
There are exactly two pseudonymous identifiers that have "made it", both associated with technological revolutions: one is phone numbers, the other is email. You put them on business cards and save them in your contacts.
This is strong evidence that this is a hard problem. No other identifiers have been successful long term.
And yes, this absolutely prevents encrypted email adoption. Imagine if switching to encrypted email were as easy as installing Signal (back in the SMS-fallback days).
"Hey use this email client, it automatically detects when the person you're mailing also has an encryption capable client and then all your communication with that person is encrypted."
"Hey use this EMail client, and if you find out that someone you know is also using it, you can go to this menu item here and search this database in order to find out how to email them securely, and if that doesn't work just email them normally to get a public key from them, before you send them the document you wanted to send."
No guarantee that the former would work, but it would have a fighting chance.
Of course such a solution isn't possible, partly due to usage patterns that email has that would break. But that's why I'm relatively forgiving of Signal's strict stance on shooting down these types of feature requests: they actually have massively improved secure communication for more than a billion people.
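The opportunistic model in the first quote can be sketched roughly like this (everything here is a stand-in: `KEY_DIRECTORY` and `encrypt_for` are hypothetical placeholders, not any real keyserver or crypto API):

```python
# Hypothetical sketch: the client silently checks a key directory for
# the recipient and encrypts when possible, falling back to plaintext,
# exactly like Signal's old encrypted-SMS fallback behavior.

KEY_DIRECTORY = {"alice@example.org": "alice-public-key"}

def encrypt_for(key, body):
    # Placeholder for real public-key encryption.
    return f"<encrypted with {key}: {len(body)} bytes>"

def send(recipient, body):
    """Encrypt if the recipient is discoverable, otherwise fall back."""
    key = KEY_DIRECTORY.get(recipient)
    if key is not None:
        return ("encrypted", encrypt_for(key, body))
    return ("plaintext", body)

print(send("alice@example.org", "hi")[0])  # encrypted
print(send("bob@example.org", "hi")[0])    # plaintext
```

The whole point of the comparison above is that the user never touches the directory lookup or the key: the fallback decision happens without any menu item.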
On the contrary, this is precisely the discovery many users do not want. 1. Someone you know can discover you’re using this channel. 2. You can’t use the same channel both overtly and with deniability. I should be able to have as many faces for speech as I choose.
If you need deniable encryption today use WhatsApp. Done.
Use the phone number just to prevent spam (though it can't prevent one user having multiple accounts using the same phone number).
This will serve two use cases:
1. People who need simple private IM like WhatsApp can continue using it without asking for A or without giving it any heed.
2. Or you go for A and you are communicating w/o any real world identity attached to your messages.
1. Use the device's address book (phone numbers).
2. Use Facebook Connect (FB id).
3. Store the entire social graph on the server (custom identifiers).
I think #3 is what every messenger that offers non-phone identifiers does (Snapchat, Twitter, Telegram, Wire, Viber, etc).
The reason is simple: if someone does manage to create a social network by slowly discovering a bunch of usernames from their friends, but then they reinstall the app or get a new phone, it would be pretty unusable if that entire social graph was just... gone. It's bad enough to have to create this social graph from scratch, but to do it every time you reinstall, lose your phone, or get a new device?
The consequence is that many people advocating for this feature (or using other messengers because of it) probably don't understand what it is that they're really advocating for or getting themselves into.
Right now Signal is much more "private" than any other messenger, if you measure that by how much Signal knows about you (timestamp of account creation is the only thing iirc). By supporting a custom identifier, they would have to store your entire social graph, like other less private messengers.
On iPhone, after a hardware upgrade w/ restore, Threema offers to restore your client side graph from a client side backup.
Maybe. Or it could be because they don't want to. People equipped to solve one problem may not want to solve another, related problem. It might not be a priority, interest, or motivation for them. That's not apathy; that's decision-making.
(1) allocating developer time and organizational priorities; or,
(2) a technical incompatibility with Signal's existing, phone number-based model?
Based on your response I'm inferring (1), but I'm frustrated that this is never directly answered when the topic comes up.
I seriously don't understand what people expect in these discussions, though. The situation is straightforward. A small but vocal minority of Signal's user base wants non-phone-number identification. Signal hasn't prioritized that feature. Put up, or use a different messenger. How is this complicated?
Don't pick horrible messengers, like ones where encryption isn't enabled by default, or doesn't even exist for group messages, or isn't built on a protocol anyone understands or has reviewed. But even with that constraint, you have options.
Have you done a survey before arriving at this conclusion? If so, I'd love to take a look at it. Otherwise, I can't see how you can make this assertion; anything can be dismissed as a "small but vocal minority".
As a previous signal user, I stopped using signal because I discovered this issue.
(Also, it's kinda funny that Signal has reimplemented the iMessage problem -- you have to unregister your phone number on their website so people can SMS you again instead of continuing to send you Signal messages unintentionally.)
(There are also other problems that have bubbled up in Signal in the past year since I stopped using it -- I've heard there's an auto-backup process that takes more than an hour and makes your phone unusable and you cannot change when it happens.)
More importantly, I'm still very interested in seeing the data behind your vocal-minority assertion.
Comparing those two userbases doesn't make much sense to me.
I think it's really you who is making the extraordinary claim here and you should be providing the evidence. Signal's goal, as has been repeated many times in these threads is to replace SMS and other less-secure forms of messaging for as many users as possible, not to cater to somewhat off-the-beaten-path concerns over phone numbers. They are aiming for users of messaging. And they are making the very reasonable inference that most of those users are not hung up on identifying themselves with a phone number.
Additionally, if Signal's current users cared about the phone number thing that much, they wouldn't be using Signal to begin with. Where's your survey data that says Signal users just can't stand the phone-number-as-id thing?
What I was trying to convey is that I would expect people who don't understand or care enough about their personal privacy would use more popular and mainstream messengers. Whether people who care about privacy consider phone numbers private or not is something we need data to determine.
> Where's your survey data that says Signal users just can't stand the phone-number-as-id thing?
That's exactly what I'm asking for. At this point, we're both speculating. None of us can make solid claims about either userbase without providing evidence. tptacek most certainly can't claim " a small but vocal minority of Signal's user base wants non-phone-number identification" without providing evidence either, which is the original objection that started this thread.
Don't expect people to take your reasons for dismissing their concerns seriously when you base them on your personal perceptions or beliefs instead of facts, and don't make unsubstantiated claims to discredit their concerns.
But they ‘know’ the phone number is a non starter because that’s how other social networks outed them. So they will also often end up using a string of bad tools as an awkward way to keep several worlds separate.
Most won’t admit it, but this second reason is why they, as iPhone users, aren’t in encrypted Messages for those chats. Neither end wants the chats in their phone number world.
While you can argue this use case is for hiding that one even participates in chats others might frown on, that to me sounds like a use case that matters, since today, most of the world doesn’t agree what should be frowned upon.
// This is not a technical assessment, it’s from the non-technical users’ point of view, how they think it’s working or not working. Same folks assume phone number is more identifying than finding an app that doesn’t need phone number because it can accept their FB log in. It’s a pretty rough world for non-techies.
Yes, it is some, but it's not a ton.
Yes, it is – and it turns out that Go 2 will be adding generics.
Before that, you could have said "If generics are super-important to you, use Rust." But it turns out you can have both Go and generics, and (if you prefer Go over Rust for other reasons) that's even better than having to pick one or the other. The designers made a choice to omit generics in the initial version due to difficulties reconciling it with their goals of simplicity and ease of use, but (unlike some posters on forums like this) they never claimed that generics were fundamentally bad or that Go would never add them. Now that they have had time to think about how to work generics into the design without compromising their other goals, they are planning to do so.
I hope Signal does something similar.
It's easy to imagine that adding an alternative UX in Signal for non-phone-based contacts might complicate user flows. Creating a user flow for alternative contact discovery without hurting the experience for existing users is not simple, and the difficulty is easy to underestimate.
Go and Signal both chose a set of trade-offs that considers both technical and human elements. I think this irks people who don't understand these trade-offs, since the human side of a trade-off is harder to define or evaluate the importance of. Ergonomics plays a huge role in the ultimate value of Go and Signal to society.
I personally know at least one person who can't use Signal because they don't have a smartphone to install it on (but could install it on the desktop, if the desktop app didn't require the phone app to be configured first). I know two more people who can't use it because they use tablets, and Signal still hasn't released a tablet version, even one that works the same as the desktop one.
It's weird to me to see people downplaying this; if they succeed, it will be a monumental achievement, surpassing SSL/TLS in impact to communications.
If you don't have a smartphone you can install apps on, then perhaps a company making a smartphone app (which happens to integrate with a desktop app as well) isn't for you. Same for "tablet users".
FWIW, you don't need to do _too_ much hoop jumping to get a non-phone device running Signal - I mainly use it on a dedicated iPod Touch, and I've got the iPhone app running on an iPad as well - in old-school 2x ugly-mode, but it works. (You do need to be aware of the risks around whatever phone number you use to bootstrap your way in possibly being reallocated. You might not want to use a 30 day burner SIM or a temporary Twilio number if you might have a targeted enough attack to impersonate you by hoovering up temporary mobile numbers...)
They recently added an optional "Registration Lock" feature to address this: https://support.signal.org/hc/en-us/articles/360007059792-Re...
And I don't think that tablets are an "edge case". I mean, seriously? We're talking about millions of devices on the market. They may not be as popular as phones, but they are popular, and people use them. So when a guy uses, say, iMessage today, because it works across his iPhone and his iPad, I can't really pitch Signal to him. And that, again, has network effects.
Skype almost managed to replace the legacy phone network, but fell out of favor. If there was interoperability between clients like Signal, Whatsapp, Line, Wechat, it would take over a multi-billion dollar industry. Instead, my bank is trying to use some proprietary video conference system to schedule meetings with me.
Sounds like a broken smartphone platform to me.
I'm not being glib: I really do believe that I should be easily able to replace any app on my phone. Android, while imperfect, is closer to that goal than is iOS, and so I use it, and recommend it to others.
The nice thing about phone numbers is you put them into a non app specific address book, so your friends list is portable.
Then go see what other messengers do to provide the same UX.
Depending on the messenger you pick, you will likely not have to give up your phone number, but you will leave the messenger operator with a log of everyone you've communicated with.
Signal's goal is to make all the messaging in the world that currently uses phone numbers as identifiers --- which, by a long ways, is most messaging in the world --- cryptographically secure.
Hope that helps. You are welcome to have and to advocate for different goals. Please don't pretend your goals are Signal's, though.
This is a better explanation. However, these goals do not protect privacy. If Signal's main goal is to protect privacy, then they need to change their secondary goals to accommodate it.
Claiming Signal doesn't protect privacy because of phone numbers is an opinion, given that you haven't qualified your argument.
Finally, their goal is not predicated on what you claim just because you claim it. You're effectively constructing a false argument so you can justify your position.
Secure systems are not built on trust. They're built with math and with facts.
Their goal isn't based on what tptacek said just because tptacek said it, either. If I'm wrong and privacy isn't their goal, well that speaks volumes on its own.
There are two groups of people (among others) Signal clearly doesn't aim to serve:
1. People that are very sensitive about, and only have access to, their one phone number.
2. People who want to sysadmin their phones.
I have perhaps more sympathy for people in group (1), but, unlike you, in neither case do I think the mismatch is a great moral dilemma. I do, however, believe that promoting inferior and untested cryptography is immoral.
The two of us are far apart on these issues and perhaps we should just agree not to engage on them.
I have F-droid installed as a system app and the only method for installing apps on my device. It is an AOSP device without Google play services or anything proprietary save for the minimum blobs required to allow the device to boot and communicate with the cell networks.
I don't enable root, as normally assumed of users that don't run stock. Doing so on Android is a well known terrible security idea. This is a hardened personal device where I choose to opt out of Google tracking, and backdoors like SprintDM.apk that Google bundles with their stock OS.
In order to install Signal without Google Play I would need to turn on unverified sources on my phone and open myself up to Man-in-the-Disk-style attacks, among other security issues.
It's ridiculous that a company that champions itself as an advocate for security and privacy refuses to support users like me who opt out of proprietary software and the tracking systems that come with it.
Moxie not only said he will never support third party signed installation methods like F-Droid but has been actively hostile to those trying to do this for him.
Moxie suggesting that people who simply want a secure installation method for third-party-signed/verified binaries on security-optimized Android devices should fork and make their own private network is irresponsible and does not inspire trust.
What you achieve with custom ROMs and custom app stores is customization capability and nothing more. If you believe you're achieving next-level security or privacy because you don't have Google installed, you're kidding yourself. Yes, you may be leaking (at first glance) less data to advertisers, but you've opened yourself up to a whole different class of attacks that could compromise all the data on your phone, not just what Google and the Android platform allow to be shared.
A perfect example is the "LibreSignal" project you mentioned -- what kind of joke was that? The project was abandoned because it didn't get Moxie's blessing? That's a really strong sign of commitment to the cause. I'm sure that LibreSignal has more than zero active users; what do you think their security/privacy level is currently?
So you're doubling down on the lie I called you out on the first time around, then?
That goal is fundamentally broken, though, because phone numbers aren’t cryptographically secure. One can use any exploit that allows one to take over a phone number to take over someone’s Signal identity. Yes, all of the target's Signal contacts will receive a message stating that his keys have changed — but they are used to that.
Signal's got some awesome crypto, but it also has some intriguing holes.
I'm not so sure. A lot of the objections to Signal using phone numbers seem similar to objecting to SSL/TLS because it's not Tor.
The type and amount of privacy a user wants and what you can realistically achieve beyond that depend on your market and threat model.
Thinking Privacy is binary is like thinking IT Security is binary; until it's 100% it doesn't exist. That kind of thinking doesn't allow thinking in incremental improvements.
If they catered to what some people want (no phone numbers and a federated network), then the regular user would have different options to use Signal. Which one is the correct one? Are they all the same? No. If you decide to develop your own client (like the LibreSignal example), do you trust that the client is secure? If the end application has vulnerabilities, then the communication privacy is compromised. That's why I say they're intertwined. Even Signal suffered from this same thing with the Desktop client. It is not an easy problem to solve, and that's why Signal does not want to have random people creating custom client apps and having them associated with the project, as it could confuse non-technical users.
Let's say a journalist is targeted by a sophisticated attacker. The attackers want everything on the phone, why just calls and messaging? They won't even attack the protocol; they'll first try putting a RAT in place, which will have access to everything. Signal does not promise to protect your communication after your phone is compromised (which only makes sense), but now the attackers don't just have access to your messages but also to your contacts. They now know the journalist's sources and contacts by the phone numbers they used.
This approach enables "easy" mode for casual users who prefer phone number registration, while supporting additional privacy for others.
If IETF efforts to standardize E2E messaging protocols can lead to interoperability between clients, we can reduce the influence of social network inertia on messenger client selection.
But I want to use Signal for cross-platform messaging.
Can't use Facebook without a verifiable phone number either now.
Phone authentication is better than nothing and lowers barriers to adoption.
Perhaps when manually verifying an identity via the QR code add an option to generate a new id not tied to the phone number.
The Signal servers can't determine cryptographically that the message originates from Device A. But it is certainly from Device A, because this isn't a peer-to-peer protocol.
It seems to me like what you'd need to make this work is some sort of intermediate layer, a bit like onion routing, that would have messages arrive at the Signal servers without basically giving everything away in the source IP field.
With general use of NAT there's an N-to-one mapping of identities to IP addresses, sure, but this seems technically true while in many cases erasing any benefit entirely.
If later they get a request from the NSA to look at their database, the undelivered messages can no longer be traced back to the sender.
Previously, an undelivered message would have to sit on Signal's servers with the sender's metadata in cleartext. Now, it doesn't.
I think it prevents their servers from correlating my identity and my IP address etc., but since I want replies and I'm asking the server about replies, doesn't that operation tell the server what my identity is anyway?
(There are some comments here talking about anonymous messages, but that doesn't sound right since the phone number is apparently kept in the encrypted, inner envelope, and also how would you route replies if you didn't have an identity of some sort for the sender?)
So suppose an attacker sees two packets:
Encrypted("Indian for dinner?"): Their username, your address.
Encrypted("Sure, sounds good."): Your username, their address.
From this they could be reasonably sure you two are talking but it's less data than they had before and now that attacker needs to either know you're using the source address or see both messages to get the full picture.
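The observation above can be sketched as a toy model. The field names and structure here are illustrative, not Signal's actual wire format; the point is that each packet carries no cleartext sender field, so naming the sender requires outside knowledge such as an IP-to-user mapping:

```python
# Toy model of the two observed packets: with sealed sender, a packet
# exposes the recipient's delivery identity and the sender's network
# address, but no cleartext sender identity.
msg_a = {"recipient": "their-username", "src_ip": "your-ip", "body": "<encrypted>"}
msg_b = {"recipient": "your-username", "src_ip": "their-ip", "body": "<encrypted>"}

def cleartext_sender(msg: dict):
    # There is simply no sender field in the packet to read.
    return msg.get("sender")

# Linking the pair requires outside knowledge the attacker must bring:
# a mapping from source IPs to users (a hypothetical assumption here).
ip_to_user = {"your-ip": "your-username"}
inferred_sender = ip_to_user.get(msg_a["src_ip"])

print(cleartext_sender(msg_a))  # None: nothing in the packet itself
print(inferred_sender)          # identified only via the IP mapping
```

Without that mapping, or without seeing both directions of traffic, the attacker is left with an IP and a recipient, which is strictly less than the pre-sealed-sender picture.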
Suppose Alice is asking about Indian and Bob replies that it sounds good.
A passive attacker sees TCP/IP packets between Alice's IP address and Signal's and between Bob's IP address and Signal's. They get no other information from this beyond that Alice and Bob use Signal (and perhaps not even that if Signal shares IP addresses with other services)
The Sealed Sender feature makes no difference in that layer
If an attacker has control of Signal's servers (or perhaps Signal's server admins are secretly bad guys) ordinarily they would be able to see the sender and recipient of every message.
With Sealed Sender, the servers don't know the Sender any more, only that the Sender seems to be someone permitted to send messages to this recipient.
You could try to correlate IP addresses to Signal users, but you have absolutely no guarantee that they're correlated, much less that there's a nice 1:1 correspondence.
In a mundane example, Alice proposes Indian food then leaves her office, her phone disconnects from WiFi and goes to a 3G network. The reply from Bob is picked up by Alice using a completely different IP address, because she isn't in the office any more.
It's also odd that the tokens are 96 bits. Very odd, really.
(This is my imagination and it might be completely wrong so please feel free to correct me)
they mentioned spam problem which leads me to believe that if enabled, the sending client will encrypt the whole message using the recipient's public key and put all metadata other than recipient's identifier inside this bigger encrypted envelope. The receiver opens this envelope with its key and opens the smaller box inside which contains the sender's metadata.
What we lose now with this is the server does not have much insight into who is sending messages (by design). This means if you allow sealed sender from everyone, someone could send you a lot of messages which you may not like.
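That nested-envelope guess can be sketched roughly as below. Everything here is illustrative: the field names are mine, and the XOR "cipher" is a stand-in for the real authenticated encryption Signal uses, included only so the sketch runs self-contained:

```python
import hashlib
import json

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stream "cipher" for illustration only -- NOT real crypto.
    stream = hashlib.sha256(key).digest()
    stream = (stream * (len(plaintext) // len(stream) + 1))[:len(plaintext)]
    return bytes(a ^ b for a, b in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

def seal(sender_id: str, recipient_id: str, body: str, recipient_key: bytes) -> dict:
    # Inner envelope carries the sender metadata and message; only the
    # recipient identifier stays outside, so the server can still route.
    inner = json.dumps({"from": sender_id, "body": body}).encode()
    return {"to": recipient_id, "sealed": toy_encrypt(recipient_key, inner).hex()}

def open_sealed(envelope: dict, recipient_key: bytes) -> dict:
    # The receiver opens the outer box with their key and finds the
    # sender metadata inside.
    return json.loads(toy_decrypt(recipient_key, bytes.fromhex(envelope["sealed"])))

bob_key = b"bob-profile-derived-key"  # hypothetical recipient-side key
env = seal("alice", "bob", "Indian for dinner?", bob_key)
print(sorted(env.keys()))        # ['sealed', 'to'] -- no 'from' for the server
print(open_sealed(env, bob_key)["from"])  # alice, visible only to Bob
```

This also makes the spam concern concrete: since the server never sees a `from` field, it can't rate-limit or filter by sender on your behalf.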
Yes, their servers can see your IP but if they discard it immediately you're safer if the NSA or another government agency reaches out to them.
It's been a rallying cry/common complaint by those who are technically inclined and privacy conscious for years now; surprised OWS would choose to give credence to the problem.
There is one privacy issue people put pressure on Signal about, and that's the use of phone numbers. Apart from that, Signal has always led the field on privacy issues.
The post about Giphy integration was a great piece of writing about an interesting technical challenge, and Signal's solution to the problem (a TLS proxy run by them, with a client that only accepts Giphy's TLS certificate, making queries quasi-anonymous) is not a bad way to go about solving things.
The reasons for avoiding Signal (metadata leakage, mandatory phone number usage, questionable 3rd party dependencies, etc) are valid, despite how I often argue to the contrary in favor of getting as many people on Signal as possible and using it for daily communication. Talking common sense to this demographic is hard though, in the context of basic security concerns persisting year after year.
On the flipside, the phone number as identifier issue has caused apps like Kik (super popular among the gay community, despite shit security), Wickr and Wire to become popular among the non-tech demographics, which is extremely disheartening.
It is totally OK if you are extremely worried about hypothetical scenarios where the phone number you used to register to the Signal network can be correlated to your physical location and then a gas station camera filmed you and then all is lost; but I want to believe that really at risk people are smarter than that, and just get a burner phone and even pay a homeless person a few bucks to buy it for them.
There are also ways to get a phone number through the Internet, so you don't even have to go to a physical location to buy it.
I think that's why Signal isn't prioritizing this right now. Phone numbers can be a problem? Yes. Is it hard to get a fake phone number that is not traceable to you? Not really. Next problem please.
I think Signal is achieving the goal of being the default go-to secure messenger. I'm sure, even technical people who like to nerd out on alternatives, faced with a real world risky situation when they have to communicate with a non-technical person, would recommend Signal without a second thought.
I want to use it, but I haven't been able to get it to work reliably, so I haven't started trying to convert people.
About having to click on images to download them: it should automatically download pictures unless the sender isn't in your contacts. Auto-downloading on cellular networks might be disabled by default, though.
To say Signal is "unreliable" is bull shi*. It's a fantastic product and service that I would gladly pay for but am glad it's free. In the meantime I'll continue to donate as Signal has been very reliable in my years of use.
I have been using it for 3 years and struggle to recommend it to people because I constantly have reliability issues, including messages delayed for hours, bugs where contacts get in an unusable state and other little weird things.
WhatsApp doesn't have these issues, so even if it's due to not being online 100% of the time, Signal should deal with it.
Can you provide any issue you've submitted from years ago that hasn't been addressed? Please post it, I'd like to see.
Granted, a lot of these are probably outdated. However, given that there are literally >1000 issues closed by a bot, there must be some that are still valid but that no one has actually looked at.
It won't prevent correlation attacks, but it does make metadata attacks in general harder and less confident. An improvement is an improvement even if it doesn't completely solve a problem.
Signal could probably force TOR connectivity (forcing websockets for those connections), which would conceal the sender even more, but would not stop correlation attacks.
This also wasn't made to stop correlation attacks. This is a measure to reduce metadata being stored on the server. Messages stored on the server won't have a "from" label that can be read by the server.
As James Mickens teaches us: "YOU'RE STILL GONNA GET MOSSAD'ED UPON"
> To prevent abuse, clients derive a 96-bit delivery token from their profile key and register it with the service. The service requires clients to prove knowledge of the delivery token for a user in order to transmit “sealed sender” messages to that user.
> Additionally, blocking a user who has access to a profile key will trigger a profile key rotation.
People who don't know you can't use the new sender privacy beta feature, but if you are willing to risk spam then you can allow everyone to use it:
> users who want to live on the edge can enable an optional setting that allows them to receive incoming “sealed sender” messages from non-contacts and people with whom they haven’t shared their profile or delivery token. This comes at the increased risk of abuse, but allows for every incoming message to be sent with “sealed sender,” without requiring any normal message traffic to first discover a profile key.
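A sketch of how such a 96-bit token could be derived from the profile key. The label and the choice of HMAC-SHA256 are my assumptions, not Signal's documented KDF; the sketch just shows why rotating the profile key (e.g. after blocking someone) also invalidates the delivery token that person had learned:

```python
import hashlib
import hmac

def derive_delivery_token(profile_key: bytes) -> bytes:
    # Hypothetical derivation: HMAC-SHA256 under the profile key with a
    # fixed label, truncated to 96 bits (12 bytes). Because the token is
    # deterministic in the profile key, anyone holding the key can
    # recompute it, and rotating the key rotates the token.
    return hmac.new(profile_key, b"delivery-token", hashlib.sha256).digest()[:12]

old_key = bytes(32)                          # key shared before the block
new_key = hashlib.sha256(old_key).digest()   # stand-in for a fresh random key

print(len(derive_delivery_token(old_key)) * 8)                        # 96
print(derive_delivery_token(old_key) == derive_delivery_token(new_key))  # False
```

Truncating a MAC this way is a standard construction; 96 bits is plenty to make tokens unguessable while keeping them compact, though the post doesn't say why that exact length was chosen.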
It's not. But if you want something fixed/looked into, your best bet is emailing the admins. They're super-responsive.