Large-Scale Abuse of Contact Discovery in Mobile Messengers [pdf] (ndss-symposium.org)
410 points by weinzierl 27 days ago | 201 comments



Wire (from the creators of Skype) does not mandate a mobile phone number (SIM cards are tied to government identity in many countries). Only an email address is required to open a free account. Nor does Wire mandate uploading your phone's address book, with its personal social graph of contacts. Free for consumers, with a paid teams offering for enterprises and an optional on-prem server. Open-source clients and server. Cross-device history if the device logs in within a few weeks of the sent message. Basic export/import for moving your device's message archive to a new device of the same type.

https://wire.com/download/

They are contributing to IETF MLS for end-to-end encrypted group messaging: https://datatracker.ietf.org/wg/mls/about/


Wire is massively underrated in general. It's got a slick UI that's easy for non-techies, it's got native clients on all major platforms, and it has everything you really need from an e2e IM without the fluff. I'm surprised it doesn't come up more in these discussions, and people just "settle" for Signal or another service that needs your phone number etc.


Agreed. I've been using it for years, and much to my surprise I've been able to convert most of my extended family into using it. The fact that our parents (60s-70s) are using it successfully to post and share pictures as well is, I think, a good sign of its accessibility. The rich media, good voice/video support, and ease of making multi-user discussions are all excellent. The persistence across devices is excellent, and I love that it's just as easy to use on the desktop as it is on mobile.

I do think there are some improvements that can be made, such as better visibility into how to sign up without a phone number (I think a phone number is still the default on the phone app) and a more visible download option on their website (the free version is buried under "Resources" -> "Downloads"). You can make backups, but there's no easy method to do a plain text export. I get the feeling sometimes messages get "stuck", and there have been issues in the past with notifications not being sent or push notifications not getting through certain Android sleep states. Sometimes I'll edit a message just to "resend" it such that it's delivered.

Overall though, it's still my secure messenger of choice by far. Glad to see it discussed here.


The clients are not actually native, at least on desktop it's just Electron and the mobile clients (Android, iOS) don't feel fast either, but frankly rough edges like these are my only real complaint.

Features are available and work everywhere (unlike Signal, which has a dumbed-down desktop client and no web client at all), it does everything you generally need, and the search is actually superb (better than Telegram even, since tg only does word matching while Wire can do symbol and arbitrary string matching and is also very fast), although it's limited to one chat, so you need to know which chat contains what you're looking for. Meanwhile Element (Matrix) goes "can't search encrypted chats"... useless if you want to communicate something non-ephemeral; you'd need to switch to PGP-encrypted email with all its problems, or another chat service.

Compared to all the alternatives, Wire with all its faults is the best encrypted messenger. I would recommend it to everyone aside from the network effect: Signal clearly has more users (while being worse on features and privacy). Because it's useless to be alone on a messenger and because it's still a step forwards from the status quo, most of the time I end up recommending subpar solutions.


> dumbed-down desktop client

That used to be true, but now the desktop client has almost all the features of the mobile clients. Which feature do you miss there?

> Signal clearly has more users (while being worse on privacy)

I assume you're talking about the phone number requirement? This is fair criticism, but what about the rest? Signal leaks way less metadata than Wire, which is more similar to WhatsApp in that regard.

See also this comment: https://news.ycombinator.com/item?id=14069674

Some things might have changed, but it's not clear to which extent.


For metadata you just have to trust them. Sealed sender doesn't solve that[1], even if it's better than nothing. It's also better than nothing to require no phone number in the first place (Wire), or to require only a payment (like Threema does), which is roughly as hard to make anonymous as getting an anonymous phone number. But either way, they can track everything you send; it's a matter of wanting to. The only way to avoid that is by not sending personal data to semi-/untrusted parties at all (Matrix).

Since centralized services are currently also the convenient ones and neither Wire nor Signal show any sign of wanting to use a decentralized protocol, those that can't be bothered to use inconvenient services have to trust that their centralized service is ethical about what they do with your data.

The privacy differences are very minimal by comparison when they're all missing one basic feature or another, like message editing (Signal), a desktop client (Threema[2]), cross-chat searching (Wire), searching a chat at all (Element), any usable encryption (Telegram), etc. If you're going to use an end-to-end encrypted messenger where the server can't read any contents, the privacy differences are just not that large when you trust them all equally. The only thing that you can objectively compare is what the client sends, since that's something they can minimize and you can actually check.

[1] "clients derive a 96-bit delivery token from their profile key and register it with the service. The service requires clients to prove knowledge of the delivery token for a user in order to transmit “sealed sender” messages to that user". It's a bit opaque so in regular English: my Signal client sends something like deliveryKey = H(profileKey, currentTime) to Signal, where my profileKey is something only my contacts and I know. Great, so my contacts only specify the deliveryKey and not who's sending, so you send anonymously! But wait, those contacts connect to Signal with an IP address and are doing other things like updating their profile or registering their delivery keys for their account from that same IP address. 1+1=2 and you know who is sending messages to whom.

[2] They have it, but your phone plays Chinese Whispers with your computer and every time your laptop or phone reconnects to wifi/mobile data you need to open the app on your phone and navigate two menus to re-enable it.
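To make [1] concrete, here is a minimal Kotlin sketch of the delivery-token idea. HMAC-SHA256, the fixed label and the truncation to 96 bits are my assumptions for illustration; Signal's actual sealed-sender derivation differs in its details.

    import javax.crypto.Mac
    import javax.crypto.spec.SecretKeySpec

    // Hypothetical sketch: derive a short "delivery token" from a secret profile key.
    // Anyone who already knows profileKey (i.e. my contacts) can compute the token and
    // thereby prove "I may send sealed messages to this user" without naming themselves.
    fun deliveryToken(profileKey: ByteArray): ByteArray {
        val mac = Mac.getInstance("HmacSHA256")
        mac.init(SecretKeySpec(profileKey, "HmacSHA256"))
        val full = mac.doFinal("delivery-token".toByteArray())
        return full.copyOf(12) // truncate to 96 bits
    }

Whatever the exact construction, the objection above stands: the token hides the sender inside the payload, but the sender still connects from an IP address that the server can correlate with that sender's other authenticated requests.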


> The only way to avoid that is by not sending personal data to semi-/untrusted parties at all (Matrix).

With Matrix you still end up trusting the server sysadmin (both in terms of ethics and technical abilities); it's not as if it solves the problem for mass communication, where at least one user has to rely on a third-party instance.

> those contacts connect to Signal with an IP address and are doing other things like updating their profile or registering their delivery keys for their account from that same IP address.

Correct me if I'm wrong, but I think that happens within SGX. This in itself is its own can of worms, but assuming it's secure, I don't think you can do this IP/account association that easily. At least it seems so easy to work around that I would expect it to be like this (given that they already use SGX for other things). Now of course SGX is not secure, but it still makes the attacks more expensive and complex than they would be otherwise. If you have access to a Wire server, it would take you minutes to get all the group memberships of one user. If you have access to a Signal server, you'd need to log connections over some time and do some statistical analysis to extract useful information. Not impossible, but far more costly and less reliable.

In terms of privacy Wire might be better with respect to the phone number requirement, but otherwise Signal has an edge.

The IP address is also something you can solve independently (VPN / Tor), so it makes sense not to solve it within Signal. But it would be nice to have some integration with Tor eventually.


> SGX. This in itself is its own can of worms, but assuming it's secure

It likely doesn't really add much security. Anyone capable of running processes on the chip hosting the SGX enclave can probably run a side-channel attack to recover the necessary keys. I was very disappointed that Signal took that approach.


Element does encrypted search fine on desktop, fwiw.


The Wire iOS native Arm client is fast and can be made officially available for M1 Arm-based Macs. Looking forward to that, as the Slack iOS Arm client is way faster on my M1 MacBook than the memory-hogging desktop Slack.


Did you use unofficial means to install Slack iOS on M1 (which is now patched if I remember right) or is there still a way?


I did it before the patch, but the enforcement is now being done by FairPlay DRM. There are reports that you can extract the iOS app from a jailbroken iPhone, with DRM removed. Then the resulting .ipa will work on M1.


Element can search encrypted chats, just not yet in the browser


I also like wire and just by chance found it to message my kids on their iDevices that don’t have a phone number like iPads and iPods. It works great and has all the main features. I’m even happier now that you guys inform me it’s secure also.


If they’re Apple devices, why don’t you just use iMessage?


This is purely speculation into their motivations on my part, but it might have to do with Apple backing up encryption keys to iCloud.


> I’m surprised it doesn’t come up more in these discussions and people just “settle” for Signal

Not needing a phone number is nice, but last I checked Wire does little when it comes to metadata, while Signal is more or less the state of the art in that regard.

The phone number requirement itself is supposed to be eventually dropped in Signal (although I'll admit it's taking quite some time, and with the spam issues it might take some more).


Matrix tools like Element are decentralised, which is preferred; Wire is not.

The company keeps a list of all the users you contact until you delete your account.

Source: https://archive.fo/ARZe4#im


It is federated, not decentralized. You need to use a server, which will have access to your contacts and the rest of the metadata, such as how often you talk to them etc. (and all message content that is not E2EE). You are only safe from third parties if both you and the people you talk to run your own servers.


That's changing. Dendrite [1] is the official next-generation server implementation and optionally supports decentralization.

But due to the usability constraints of truly decentralized systems, I think most users are better off with federation.

[1] https://github.com/matrix-org/dendrite


mind explaining how Matrix is not decentralized?


As I understand it, the consensus is to call multi-server systems "federated" and peer-to-peer systems "decentralized". See https://en.wikipedia.org/wiki/Decentralized_computing for references.


P2P is a form of decentralization. Your resource doesn't back up your claim.

E.g. it states: "A collection of decentralized computers systems are components of a larger computer network, held together by local stations of equal importance and capability. These systems are capable of running independently of each other." This is true for Matrix, with homeservers being the decentralized computers and the network of homeservers being the larger computer network.


"Decentralization" referring to P2P is exactly what I said...?

If you are going to pick a tiny part of the page, you might find that it's not sufficient. In the sense of your chosen quote, Twitter is decentralized, since it runs on multiple computers organized in a network. Those are capable of running independently of each other, to a degree, if Twitter Inc did their High Availability work correctly.

It is not decentralized from a user's perspective though, since the software they use on their devices cannot run independently of the remote systems of twitter.com.

I don't know where that leaves us, but then again I don't know what you were trying to argue either.


Just because P2P is a form of decentralization, it does not need to be the only form of decentralization. Federation is imho another form of decentralization.

Yeah, and I agree there are different types of decentralization: e.g. physical, authority, ... or combinations

Each incarnation with different goals


Sure, this is not the only possible way those words could be used, which is why I mentioned a "consensus". Many words can be used in ways that are "technically correct" but confusing. Like I said, by some definition Twitter is "decentralized", however it is not what anyone ever means by "decentralized".


What they mean is that there is a client/server separation. Truly decentralised systems don't have this distinction and only have communicating nodes. Though this is coming to Matrix too.


Seems arbitrary as clearly Matrix is decentralized despite having client/server separation.


If you think about it in terms of network topology, you'll see it's not completely arbitrary. In fact, the long-term plan for Matrix is to explicitly support the fully decentralized topology by using Dendrite, where a client wouldn't need to connect to a dedicated server at all. Instead, each client is also a server.


I'm amazed that they still don't have any kind of 2FA after nearly four years.

https://github.com/wireapp/wire/issues/85


Perhaps this comment is why my email has blown up over the past day. It's interesting to see this issue pick up steam every now and then.


I didn't log in to Wire for 3 months, and "for my security" messages that were sent to me during that time were just... lost. I think my history was deleted too. This happened 2 or 3 years ago, but it made me just switch to something else (Telegram).


So you want them to just hold on to your messages on their servers indefinitely? I realize this is the norm nowadays, but is this really what you actually want?


Wire is not for people who leave messages unread for 3 months. Most users can deal with that. And once a message is read, you can archive your messages to a file (and then put it on a USB drive). For most users, why use anything else?

For those who like to leave their messages unread on the server for up to a year, go with Signal or Telegram.


In a flawed world that deals perfectly with its flaws, infinite storage would be a service you could optionally buy from an open market of third parties, alongside optional identity mediation (for those who definitely don't want their identity bound to a device or SIM or some other PID known to the core network). Bonus points if the technological and organizational interface between third party and core network is paid; I believe that this would be one of the least bad ways imaginable of funding the core network.


Yes? I want them to hold on to my messages in an encrypted form for all time. The NSA has probably logged them anyway. I'm done with managing my own backups; my state should be persisted "in the cloud" with no effort from me.


I think it should be an option, at least.


For those who want end-to-end encrypted messages, it's a feature that the server doesn't have a persistent archive of message history. Wire messages are on the server for a few weeks, long enough to relay those messages to transiently offline devices. Telegram is great at what it does, different use case from Wire.


I'm actually fine with the wire approach.

But it would be nice if the sender could be notified that the message was never delivered.


Each Wire message has a user-visible status: Failed, Sent or Delivered.

It's not obvious, but if status never changes from "Sent", then it wasn't "Delivered".


There's a new icon now as well, an "eyeball" that tells you if the message was viewed. This read receipt can be enabled in the options and works only if both parties enable it.


Matrix does have end-to-end encrypted persistent history.


Yes, it's possible. Depending on your threat model, it may or may not be a good idea to have a long-term archive of encrypted data be subject to legal process. Matrix has plans to support IETF MLS, which is a necessary precursor to messenger interoperability.


Matrix lets you configure the history retention on a per-server or per-room basis, fwiw: https://github.com/matrix-org/synapse/blob/develop/docs/mess.... From a metadata perspective, we're working on P2P to avoid metadata accumulating on servers. In terms of the OP here (address-book based contact discovery), https://github.com/matrix-org/matrix-doc/blob/hs/hash-identi... is how Matrix optionally implements it while trying to preserve privacy.


Yeah, I guess it’s just a different use case, but it was a behavior I didn’t expect (maybe I didn’t read the “fine print”) and it turned me off using it for long-term stuff. It’s a pity because I liked the interface and features.


This is one of those replies that should be put in some kind of HN canon. It perfectly shows why there are so few privacy or security respecting options. They did the correct thing for security and you switched.

As I've observed for a long time: UX is more powerful than anything else except maybe cost, and even then one driver for user preference for "free" apps is not having to dig out a card... so cost is also UX.


That’s a bit unfair. I value privacy for some things but for other things I value more not losing my message history. Telegram is not as secure as other options by default, but for me it strikes a good balance between convenience/usability and privacy, as I can optionally open a self-destroying secret chat when I need it.


With Matrix you get privacy with message history.


Thanks for the suggestion, I will try it out.


FWIW Telegram deletes your account if you are away for a year, and you cannot disable it.


Uh, just FYI, this is also a feature on Telegram that can't be disabled (only extended up to a year).


Threema also does not require your phone number.


It requires payment, so your Google account or your PayPal/credit card is on record.


You can pay by wire transfer, Mastercard, Visa, PayPal or even Bitcoin[1], so there is at least one anonymous payment option available.

1: https://shop.threema.ch/terms


Who said Bitcoin is anonymous?


There is no such thing as an anonymous transaction, for a fully informed definition of anonymous.

Bitcoin isn't... neither is cash in hand. Neither is a drop. Someone knows.

A discussion of 'anonymity' in this context is one of increasing the difficulty of discovery, not thinking that the discovery is impossible. If a major world government is after you, good luck with "anonymous"


Monero is.


So you think if a notorious terrorist or whatever moved millions of dollars through Monero to fund a terror attack, American or other intelligence agencies wouldn't be able to identify the transaction?

It could be done truly anonymously when up against the full weight and might of US, Western, Israeli etc intelligence budgets and methods?


Well the sender, receiver and the amounts are all hidden, so yes.

I should still note that it isn't a magic bullet, and that things you do in connection to the monero blockchain can obviously still give the authorities an idea of what you are doing.


Would it still be hidden if a government identified servers hosting the network, seized them and forensically analyzed them including having access to potential zero days in encryption tools, backdoors in algorithms, and supercomputers?

Etc etc. My point is that anonymity is an ideal, not an absolutist reality.


You could convert Monero or Zcash to Bitcoin at an exchange before paying. I don't know which exchanges currently allow you to do that without verifying your identity, though.


Doesn't Monero focus on differential privacy? Meaning you have plausible deniability, not anonymity. Whereas with something like ZCash you have to trust the seed. Did things change?


If you can acquire BTC anonymously, then you can pay anonymously.


If you can acquire BTC anonymously, then you can acquire prepaid virtual credit cards anonymously. Most localbitcoins exchangers will happily do bank transfers for you without asking any questions either.

There's not many kinds of payments which aren't fairly easy to do anonymously.


Any recommendation on prepaid virtual credit cards which are accepted in EU?


Sorry, I have none. I never bothered to try acquiring de-personalised crypto, as I never needed any. The VCC idea sounds neat, but it's practically transparent for state actors. I think the easiest and safest option is directly exchanging cash for crypto at a crypto-ATM, if you can find any where you live. Downsides, you usually get a relatively bad exchange rate and they are often hard to find, if not forbidden. Or you buy privately, preferably from someone you trust not to cheat you, but you'd first have to find that person. The next-best thing might be to buy a relatively private coin like ZCash or Monero and exchange that for whatever else. But don't take that as advice, I'm not an expert. It's probably best you do your own digging.


Ezzocard has many. Exchange rates are bad, but it's a small price to pay for privacy.

These guys must be making insane amounts of money, would love to build a competitor.


I forgot this fact. One could create a new Google account and buy a gift card.

In any case, doesn't this only link your personal data to the ownership of a Threema app usage license and not to the content (user ID) inside the app?

It's definitely different to entering your phone number into the app.


FWIW, your threema identity isn't tied to your license key.


How can you get push notifications on an Android device without using Google Services?

You can just get a gift card and pay in cash for it.


I can't seem to find any information about their free version on that page.


https://app.wire.com/auth/?hl=en#register

It's a bit hard to find; it looks like they've given up trying to compete for non-business users, but the client has a registration form open to everyone.

I think they'll keep supporting this because inviting those 'guest' accounts into rooms of business users is a big feature. We regularly collaborate with people via Wire (customers or freelancers, who can just use their personal Wire accounts) because it's the easiest way to collaborate without forfeiting encryption or features.


I love Wire, but this is one of my biggest pet peeves right now. I can't tell someone to just "go download Wire" on the desktop without giving them navigation instructions ("Resources" menu at the top, then "Downloads") because there's so much focus on the paid products.

Not a pro move in my opinion.


Every time I open the Snapchat Android app it prompts me with a Snapchat-styled (not the system) dialog to share my contacts. Every time I hit "Don't allow". Every time it prompts me again.

This is an inexcusable dark pattern. Two things need to happen:

1. The operating system needs to provide a "screw you, never" option for any permissions.

2. We as engineers need to say "screw you, never" to requests to implement behavior like this. Sure, this could be a bug, but I see the same behavior with Venmo and location access.

Personally I'm rather disillusioned with where we've found ourselves. This sort of adversarial relationship in which people are property of a platform and treated as such is winning.

Edit: Venmo had been set to "Only while using the app" and was prompting to enable location services on the device, not for permission. That's my own fault.


The "screw you, never" option is to delete your Snapchat and Venmo accounts, and delete those apps from your device. This is what I did.


Indeed, that's what I should do as an individual and try to drag along as much of my social graph as I can. There probably isn't any salvaging platforms built on data collection as a business model.


>We as engineers need to say "screw you, never"

The engineers have spoken. They say, "I'm getting paid too much to care."

Or they look at it from my perspective- who cares? According to you this is an issue but from another perspective there are clueless users who accidentally denied the permission and are grateful later. Can we really afford to have every engineer constantly objecting to the slightest subjective interpretation of what makes a dark pattern?

That's not for me to decide.


The system does have a "screw you, never" option for all permissions.

The issue is that Snapchat (in your case, as I don't have this happen on v11.23.3.36) is told that they won't get the permission and won't be able to ask for it either.

And so they perform their own inhouse permission request to you.

There is nothing that can be done from the system's point of view for that.


> There is nothing that can be done from the system's point of view for that.

They can ban the app from app stores for using any non-system interface to request permissions, like they do for payments.


I am not agreeing with it, but they are showing a piece of custom UI that probably links into the settings for you to make changes.

That is a very normal practice on Android. The fact that Snapchat decides that pestering the user for the permission is acceptable I find to be a very strange design decision.


It's not that strange, it's just a dark pattern. But it does touch on a specific problem with permissions interfaces.

If you have an interface that looks like "requestPermissions(...)" or something like that and have a case where users can reject permissions forever without nags, then how do you handle users who inadvertently deny permissions forever? One solution is to just nag forever, if you can.
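For reference, a minimal Kotlin sketch of the standard androidx flow for READ_CONTACTS (an illustration, not any particular app's code); the ambiguity in the last branch is exactly the platform quirk being described:

    import android.Manifest
    import android.app.Activity
    import android.content.pm.PackageManager
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    // Standard runtime-permission flow. Before the first request AND after a
    // permanent denial, shouldShowRequestPermissionRationale() returns false,
    // so the app cannot cleanly tell "never asked" apart from "denied forever"
    // without keeping its own state, which is what tempts apps into custom nags.
    fun maybeRequestContacts(activity: Activity, requestCode: Int) {
        val perm = Manifest.permission.READ_CONTACTS
        when {
            ContextCompat.checkSelfPermission(activity, perm) ==
                PackageManager.PERMISSION_GRANTED -> {
                // Already granted: proceed with contact access.
            }
            ActivityCompat.shouldShowRequestPermissionRationale(activity, perm) -> {
                // Denied once, but not permanently: explain why, then ask again.
                ActivityCompat.requestPermissions(activity, arrayOf(perm), requestCode)
            }
            else -> {
                // Either never asked or "denied forever"; in the latter case the
                // request is effectively a no-op and the callback just reports DENIED.
                ActivityCompat.requestPermissions(activity, arrayOf(perm), requestCode)
            }
        }
    }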

Another workaround is to have users enable permissions manually through settings, but then you need to make errors loud and obvious with workaround instructions, and people interpret the issue as a bug in your code rather than a feature of the platform.

This is a problem on MacOS, just as an example.


The system could supply an empty contact list.


> v11.23.3.36

v11.25.0.29 Beta for me

> in house permission request

That's what I had feared. I'd expand the scope of "the system" to include Play store rules.


It would change the ecosystem, for the better, if Google would uphold and enforce their policies as fiercely as Apple does. It would surely reduce the number of dark-pattern predatory apps on the marketplace, and overall just pull the quality up a few notches.

Why they haven't done so already is a mystery to me.


I get annoyed by the web version of this for push notifications - mostly the ones that copy the specific browser UA style.


This dark pattern is called the ratchet. https://news.ycombinator.com/item?id=16689663


There need to be two lists of contacts.

One which I allow to be shared with apps, and another which holds the contacts I use with my dialer.

People don't need their messenger apps knowing the phone number of their doctor.


An alternative would be to make it similar to the iOS photo gallery permissions:

When an app requests permission to all photos the user gets the option to share only a subset of photos with the app. (This subset can be different for each app.)


While I agree with you, the current way this functionality works is really terrible.


It's indeed terrible in apps that reimplement the photo-selecting feature, but in simpler apps that just use the default picker it works quite well and is transparent.

Popular apps like WhatsApp, Twitter, Facebook, Telegram, etc, could just fall back to the default picker when full access is not available.


Agree - it's a great concept but seems to involve an inordinate number of clicks (touches?) all over the screen


Don't forget that on Android, Google will sync all contacts with their servers without the user even installing an application (the Play Store has access to all contacts and will sync them with the Google account when logging into the Play Store... which is inevitable on Android).

The protection must be set higher up (the law, I guess) since the operating system has become spyware.


To that point, I find it entirely un-amusing that Google is presenting me with a notification on my phone that says: "Account action required: add your birthday" %@$#

A piece of info they very likely can derive, but would really appreciate if I'd help them out and confirm it. Screw them.


On iOS, if one does not grant address book permission for the app:

* Telegram uses an "internal" contact list to which one can add contacts via the desktop client, and it then works as expected.

* WhatsApp lets the user freely initiate contact by phone number, but then only shows the number (no name).

Don't know about Signal.


WhatsApp on iOS cripples user experience without access to OS contact list. You cannot create groups, and cannot initiate a chat with anyone, even by phone number. It works if someone else messages you first or adds you to a group.


> cannot initiate a chat with anyone, even by phone number

As stated before, I can initiate a convo by number just fine. It's just that I can't edit the identifier/handle and so it's always shown by number. Thankfully most ppl have set their profile pic public, but it gets messy pretty fast.


I can’t, care to share the way you do that purely within WhatsApp for iOS without a jailbreak?


in the "Chats" tab there is an "edit" icon in the top right (rectangle with a pen in it).


Well I’ll be darned. They must have changed that recently or I didn’t update WhatsApp for a while, it definitely used to lead to a non-ideal state screen fairly recently. Props for the pointer!


https://wa.me/phonenumber

Very intuitive, I know.


For people as confused as I was about the above link - after spending 5 minutes trying to figure out why visiting it gives me a 404 page, I eventually got out my phone and tried pushing that link to the Android activity system, and then I discovered: it's meant to be wa.me/[an actual phone number goes here] - like wa.me/12345678912345.

Not great indeed.


If you type https://wa.me/your_own_phone_number (replace your_own_phone_number with your number) you can get into a chat with yourself. Useful for misanthropes, introverts, or for having a place to store information.


404 for me.


I can't imagine the use case where you want to keep your contacts from WhatsApp for privacy, but continue using a Facebook service.


Because WhatsApp has a sound e2e implementation enabled by default, is more trustworthy than Telegram, and is less liable to be influenced by the Russian government? Keeping contacts out of it is just hygienic.


As far as I can tell, on Android, WhatsApp requires adding a person to your contacts before you can message them at all. I find this very annoying when I'm going to be messaging someone for a brief period only. If there's a way to not do this, I'd like to know it.


You can message a person without adding them to your contacts, if you know their phone number.

https://wa.me/their-phone-number-in-international-format . Type this link in the browser or in some chat. Long press, click open and it will open in WhatsApp by default.

You can start a chat with yourself in WhatsApp, to send these links to yourself and easily click on them (type wa.me/your-phone-number-in-international-format, click and send any message).


Thanks, I'll remember that. It's an awkward workaround compared to pasting their number from an email into the app, but it'll have to do I guess.

Here it's pretty common for people to, for example, put a notice up in my apartment foyer "does anyone have xyz that I can borrow, please app me at +123456789" and having to create a contact is an annoying bit of overhead.


Or the number of the VD clinic, or the number of an oncology ward, or the number of the pawnbroker, ...


In time I expect all OSes (mobile and desktop) will provide a "give false data" option. So Sandboxed+false inputs, sort of a digital Descartes deceiver.

Because you know the slimy app developers will make their apps refuse to work if you don't hand over your full contact list. And people will just accept that.

So the end game is a completely adversarial relationship, even on your own device.


> In time I expect all OSes (mobile and desktop)

How much time? This has been a thing in one form or another since j2me. Some j2me platforms actually supported this kind of behavior, but that was all lost once Android and iOS came along. Same with fine-grained permissions over network access (eg, user having complete control over what networks/etc an app can access).

We /had/ all of this in the days of BlackBerry, and lost it. Nobody wants to give it back now.


That's a good question. We know exactly what technical solutions are needed to counter this adversarial activity.

What social tipping point needs to happen for those to be implemented? How can we lower the social bar for that to happen?

I have no clue. I'm almost always wrong on the social side of things.


I didn't know that (Blackberry). I would like to know about it, if you have any links.


All current info is about BB OS 10, which did not provide this policy; but I can attest to it as a former owner and app developer for their Java-based OS (through 7.1). Whenever an application attempted to access a network, you would be given a prompt to approve/always approve/deny; and you had pretty fine-grained controls beyond that.

This link has some screenshots and gives an idea of the capabilities. Adblock recommended:

https://crackberry.com/blackberry-101-application-permission...


It would be much smarter if your contacts could choose to allow you to share or not share their details.


While I agree with the sentiment, I am not so sure how well this could be implemented in practice.

I feel it would be better to disconnect the more static portions of a contact (in this case the phone number) from the contact on the chat network, and use something a bit more transient, like an email address.

And while yes, primary email addresses are even more static to our identity than phone numbers, having the option of using a single address for each chat network would disconnect this contact graph somewhat.

Going further, if each platform (iOS/Android) made more granular portions of the contact queryable rather than handing over the whole contact, applications could simply request just the "chat network identity" portion of the contact (never to receive the phone number, country, home address etc etc).


I don't know about other Android phones, but Samsung has a Secure Folder with a separate list of Contacts.


The reason this abuse of privacy is so widespread is because there are no bad consequences for the perpetrators.

Governments don't enforce privacy acts in this circumstance.

Users just roll their eyes, knowing there is no way for them to complain except through boycotts, which are difficult to organise and might not work unless coordinated on a massive scale, which has never been tried.

And so it goes.


And yet people are still bitching about the GDPR. The only problem with the GDPR is the lack of enforcement.

However, it isn't really surprising considering a large chunk of this very community makes their money off large-scale stalking and the same unethical things they complain about.


The only people really bitching about GDPR are software engineers and lawyers at companies whose privacy practices are still questionable after all the time given to clean up their act. That's why, if you look at HN comments, everyone seems to hate it. Sample bias. None of the remaining 7 billion people on the planet really know much about it. I bet if you summarize the regulation and describe it to a random person on the street, they'll nod and think "sure it's kind of a good idea" and forget about it in the next 10 seconds.


People will complain about the meta-information leakage of email but then turn around and suggest we should instead use something based on your phone number[1]. I guess this is a reminder that phone number based contact discovery has issues as well ... perhaps worse ones in practice.

With email the server operators know who is talking to who but do not necessarily know who any of those people are. Email clients will not show who has you in their contact list. There is no practical way to enumerate every active email address in use.

[1] https://latacora.micro.blog/2020/02/19/stop-using-encrypted....


The paper claims that stricter rate limits are a possible solution to this issue, and that with stricter limits in place "crawling entire countries would only be feasible for very powerful attackers". I don't think I agree. Take Signal: the authors managed to crawl all US phone numbers in 25 days, using 100 accounts. Their proposed stricter rate limits force an approx 50x slowdown on an attacker (Table V), which seems to imply that over the same crawl period an attacker would require 5000 accounts. If we assume that virtual SMS numbers are around $5 each, then the attack now costs about $25k, which is about 0.001% of the GDP of East Timor.
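Spelled out as a quick back-of-envelope sketch (every input is an assumption from this comment, not a figure re-checked against the paper):

    // Rough cost estimate for crawling under the proposed stricter rate limits.
    fun main() {
        val baselineAccounts = 100      // accounts that crawled all US numbers in 25 days
        val slowdownFactor = 50         // approximate slowdown from the stricter limits (Table V)
        val pricePerNumberUsd = 5       // assumed price of one virtual SMS number

        val accountsNeeded = baselineAccounts * slowdownFactor     // 5,000 accounts
        val attackCostUsd = accountsNeeded * pricePerNumberUsd     // ~$25,000
        println("accounts needed: $accountsNeeded, approx. cost (USD): $attackCostUsd")
    }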

They also propose a global salt as a mitigation. I'm a little confused there too, because wouldn't the salt need to be present in the endpoint application? If so, it would be trivial to extract.

Their proposal of using a key-stretching hash algorithm (e.g. Argon2) seems reasonable? Albeit at a significant increase in cost on the server side.


From the paper:

> Signal acknowledged the issue of enumeration attacks as not fully preventable,

So rate-limiting is fine as long as you don't hurt user experience, e.g. you can still message your contacts within 1min if you have around 500 contacts. It's also nice to lower the load on the servers. But there's no real fix as long as phone numbers are used.

And it's fine, given that Signal basically leaks one bit of information: whether a phone number has a Signal account or not.

...of course, assuming that the account owner doesn't accept unsolicited messages (and thus shares their profile, with their picture and "About" field).


It didn't take long after the first social graph was created to realize that your contacts define you as much or even more than your other indicators do. That's why so many companies are gunning for this information.


A quote I saw on the Internet long ago went, "You're the average of the five people you spend the most time with". I used to interpret it only in its original, prescriptive sense: if you want to become a different person, make appropriate changes to your social life.

It took me way too long to realize it's even more applicable in the descriptive sense: the people you spend most of your time with are a good statistical predictor of who you are.


"Interestingly, if the number provided by Hushed was previously registered by another user, the WhatsApp account is "inherited", including group memberships. A non-negligible percentage of the accounts we registered had been in active use, with personal and/or group messages arriving after account takeover."

Did not know that taking over a WhatsApp account is that easy.


You can set a PIN. I think the account would then only be reset eventually if someone registers the same number, fails to provide the correct PIN, and the previous owner of the phone number hasn't used WhatsApp in some time.


The most interesting section for me was "Exposed User Data". My takeaway is that an attacker can know whether my phone number is registered with Signal and can receive voice and video calls. They also get my encrypted name and profile picture but would need my explicit consent for that.


That's my take as well, and I think that's already known by most users. So nothing new for Signal users.

For WhatsApp it would be nice to change the defaults so that no information (other than the fact that the number is registered) is shared without any interaction.


I think the information who is a contact of whom should not reside with the provider, but be distributed among the peers.

I had a proof of concept of something similar working a couple years ago. I was writing a file-sharing app, and the goal was to piggy-back on the existing social graph that you had from Facebook, Skype and so on. I could not register as a proper Facebook app, since that required having a domain, and I would be legally catchable in case somebody used my app to share copyrighted or illegal stuff.

Back then, Facebook Messenger was based on XMPP/Jabber, which allows sending custom stanzas (secret messages that are not shown to the user). My app would ask for your login, then send a message to every contact, and if it got a reply from another instance, it would perform a handshake and exchange keys. This also worked over (old) Skype: although it didn't support XMPP, you could do something similar with the Skype API. (Unfortunately, a few months later almost every messaging service that allowed such a trick stopped it...)

Now, this trick doesn't help if you are trying to set up a new social graph in the first place. But I think with this "send an invisible message to a potential contact's app" primitive, you could build a list of mutuals securely. (Other caveats apply, you'd let somebody know that you have their number for example...)


> I think the information who is a contact of whom should not reside with the provider, but be distributed among the peers.

Signal partially solves this with SGX. Partially because SGX will probably never be fully secure.


An even bigger problem: if you set up an Android phone without a Google account and need to access the Play Store (which is impossible to do without nowadays), Google will force you to sign into a Google account at the OS level and will suck in all your device's contacts -- with no opt-out. You can disable this sync "feature", but *only afterwards*, once Google has collected all your contacts (phone numbers, addresses, emails, birthdays, etc).

Explanation: the toggle to opt out is made available only after you log into Google, and you must navigate multiple screens, which gives Google ample time to collect hundreds of contact details. Of course, it is not possible to turn on airplane mode during this procedure, since the login requires an Internet connection.

This is 100% against the EU GDPR. I've submitted a complaint to my local privacy regulator (the French CNIL) but never heard back.

This probably impacts 200+ million EU citizens (since 2/3 of the population must be using an Android phone). I can't imagine a more massive data collection program: each user probably has more than 50 people on their device, so the total number of people affected exceeds my imagination.

How can Google get such a pass?


I use Aurora for the Playstore. It's buggy but does the trick for the few apps not in FDroid. No account needed as they provide a service for shared credentials.


I feel you. Unfortunately, I don't think the GDPR will be of much help in that case. A fine of up to 1% of their revenue will hardly make a dent - it's just another tax. Misbehaviour will still be worth it.

Actually the GDPR is of great help to them, by keeping smaller competitors from using similar techniques. For those, 1% of revenue is a lot.


Happened with Australia's real time payment system too.

https://www.itnews.com.au/news/monitoring-fail-allowed-westp...


I didn't know that. Thanks.

I suppose the consequences for Westpac are... nothing.


As far as I know, no. I suspect that it wasn't just them.


The authors have an implementation of an improvement on these messengers' contact discovery algorithms at https://contact-discovery.github.io/


That's an interesting project / paper[1]. I don't have the time to dive deeply into it and understand how this works, but at a glance it seems better than what I was working on a few months ago[2]. Does anyone else know more about this, since the paper is really written for the in-crowd? The video[3] helps a little for a very high-level overview (to give some context before starting to read), but doesn't explain how it works or what properties their algorithm has.

[1] https://eprint.iacr.org/2019/517

[2] This would do partial hash matching against a database on the server (similar to HIBP) and then do an interactive session with each of the matches, basically alternating sending the next binary bit of the phone number until either party gets it wrong or the value is fully matched (see the sketch below). The todo is to check whether there are parameters that make it both scale and protect privacy.

[3] https://dro.pm/a.webm/preview (link works for 18 hours after posting) or if you've already accepted the Google terms of service: https://www.youtube.com/watch?v=4vgKHmNaAAw
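As a rough illustration of the first phase of [2] (the hash-prefix lookup, similar to HIBP's k-anonymity range queries), here is a Kotlin sketch; SHA-256 and the 5-hex-character prefix length are assumed parameters, not something specified above:

    import java.security.MessageDigest

    // The client hashes a phone number locally and reveals only a short prefix of
    // the hash; the server answers with every stored hash sharing that prefix, and
    // the interactive bit-by-bit confirmation from [2] then runs only against those
    // candidates.
    fun contactQueryPrefix(phoneNumber: String, prefixHexChars: Int = 5): String {
        val digest = MessageDigest.getInstance("SHA-256").digest(phoneNumber.toByteArray())
        val hex = digest.joinToString("") { "%02x".format(it) }
        return hex.take(prefixHexChars)
    }

Whether such parameters can simultaneously scale and resist enumeration is exactly the open question mentioned in [2].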


Nice to see Telegram imposed strict limits for contact discovery. They were only able to scrape 100k numbers over 20 days.

Strange that they keep and return metadata for non-registered numbers though.



100% of Signal scraped - ugh


They have been too busy with integrating crypto payments instead of fixing long-standing issues or shipping planned features (allowing registration without a phone number). But - to Signal's defense - while you can scrape the phone numbers, there is not much you can gain from it; you can only tell that a specific number is a Signal user. And sometimes you can see the username (if the user chooses not to encrypt it).


Given the re-use and re-issuance of phone numbers in very busy US area codes (like 212, 202, etc) it also only allows discovery that at some unknown point in the past, that number has been registered as a Signal user.


Just as insecure as Whatsapp, just far fewer users.


> Just as insecure as Whatsapp

From the paper:

> With its focus on privacy, Signal excels in exposing almost no information about registered users, apart from their phone number. In contrast, WhatsApp exposes profile pictures and the About text for registered numbers, and requires users to opt-out of sharing this data by changing the default settings.


In other parts of the world, users share their contact books to get access to what others shared; see Bellingcat's extensive use of these phone contact book apps, documented here in 2019:

https://www.bellingcat.com/resources/how-tos/2019/04/08/usin...


I will criticize how Contacts are implemented on Android for this.

For example, I don't want any person who I interact with once or twice a month to have access to my WhatsApp or any other social media app. But I can't do this in Android because once you add a contact, every damn app has access to that contact list. It's full access or no access if the app uses permissions. I need something where I can label contacts so they don't appear in the main contact list.


Android Work Mode has this contact-separating feature: https://f-droid.org/packages/net.typeblog.shelter/


That is not correct.

In Android you have two ways of accessing most things: full access, or using the system to access one entry.

You should blame WhatsApp for not supporting the second method.


The fact that an app can choose to "support" a method is a flaw. The app should be completely oblivious to whether what it's seeing is the full list of accounts or a carefully selected one.


My point remains valid. If the OS is handling contacts, it should have some sort of control over what's getting shared. There's no point in providing alternate ways if an app can access the full list of contacts.


That is a very odd takeaway.

The first method is good to have for apps you trust, and requires permission from the user.

https://developer.android.com/guide/topics/permissions/overv...

The second method is what you want with apps you don't yet fully trust or for some other reason don't want to give direct access to your contacts.

https://developer.android.com/training/permissions/evaluatin...


Why would an app developer ever think their own app should use the method reserved for untrustworthy apps?

All incentives suggest the developer should prevent using the app without full permissions being given (basically force the user into giving permission) and only implement the first method.


Because it would be easier for the users and allow some customization you can't otherwise have?

Sure it can be abused, but there are also legit use cases


While I get where you are coming from, giving that option is a design flaw.

As Android developers, we should only be given one pipe to consume from. The system should then present the user with the ability to choose if that pipe for this particular application is going to be:

"All contacts" / "A subset of contacts" / "No contacts

And even better yet just "A subset of the values for a subset of contacts"


You need a migration path and backwards compatibility. They can't kill all the apps which used the old system.


They can just lie to the old apps. Tell them they're getting the full list when the API is called.


Privacy Guard in Cyanogenmod used to do this I think, at least to fake the list to be empty. It somehow still broke a tiny number of apps (unintentionally, i.e. the app owners didn't purposefully add code to annoy those users) so there seems to have been some flaw between 'empty list with permission granted' and 'empty list with permission not granted'. Regardless, I'm not sure why this didn't become mainline Android aside from that it would come with no benefits to the main developer of Android.


For backwards compatibility, they could just put in some nonsense entries. If the program can figure out those entries are nonsense, it's no longer out of date and it's on them.

Why they didn't do this by default is anyone's guess. Maybe they like app devs more than users. More technology should lie on users' behalf.


Sure sure, I was just saying that this sort of thing already existed in the past (in Cyanogenmod, and that it worked well for 99% of apps even without providing wrong data) but other vendors or even mainline Android never picked it up unfortunately.


What if the app caches some (or fragments of) entries for some good reason? If you're unaware you're getting only one requested contact each time, you may be triggering edge-case behaviour. What if you have reasonable time-out behaviour which now triggers each time while the user chooses which contact to expose to the app? What if the contact list is polled in the background, or when you receive a message, to match it up with the right source?


"Yep, this user has exactly one contact, and gee, it happens to be the one they're calling now, how fortuitous! They had a different single contact yesterday, but apparently they've deleted that one and added this one."


If the devs update the code to respond to fake entries, then the program is no longer out of date and thus no longer in need of backwards compatibility.


Ideally true. The issue is that the common person expects things to 'just work', which means reading/allowing contacts.


Imagine my surprise that one can actually give up MORE revealing information about yourself and your contacts just by installing BOTH the Telegram and Signal apps.

Signal: when you’re most concerned about privacy of your message content.

Telegram: when you’re most concerned about association with contacts.

WhatsApp: when you’re most concerned about losing your ability to reach out and contact someone.

Nothing is absolute.

But don’t tell our politicians and lobbyists that. Oh wait.


Matrix: when you are most concerned about losing your message history.


which is probably the best sweet spot any privacy fan can aim for.


Your contact list seems like a pretty unique fingerprint with high entropy. It also persists across devices and accounts through common synchronization mechanisms.

How many people in the world share the same contact list as you?


I was wondering what would happen if I added all the phone numbers from the Facebook leak to my contacts. Did anybody try something like this? What is the upper contact limit?


> What is the upper contact limit?

answer is in the paper


This scientifically-looking paper could have been written by Captain Obvious himself. It is beyond obvious that contact discovery in any major messenger or social network is facilitated by uploading all contacts from the user’s address book, with all the implied drawbacks.

If users' behaviour has shown us anything, it's that they love it. And for all the dangers of their privacy loss, they happily trade it for the convenience of finding the people they know.


"They love it" but they often aren't given the choice, nor are they fully aware of the consequences (I expect many would choose not to accept if the findings in this paper were presented to them in a clear understandable way).

The average person barely knows what a server is. They install e.g. WhatsApp on their phone, and they are likely to think that the app on their phone is doing the work of telling them who else is on WhatsApp. They are not likely to think that their contact list is being scraped and uploaded and stored on someone else's computer in a warehouse, and then profiled for advertising purposes and then exposed to strangers via an API.

The average person may love the convenience, but the average person does not understand how it is implemented or the consequences of using such a service (as described in this paper).


In my experience, even after having made fully sure that they understand the risks involved, "the average person" will happily switch back to a walled garden IM platform the moment the next stupid feature comes in.

Last time it was the (I think WhatsApp?) feature that allows you, when replying to a message with an attached picture, to highlight a portion of the attached picture. That's it. There was no network effect at this point whatsoever. This pseudo-feature was enough for an adult person to decide to switch back to WhatsApp and fuck my and everyone's privacy. THEN the network effect kicks in in favor of WhatsApp, because of course WhatsApp is a walled garden, so everyone is forced to switch to WhatsApp.

I have seen this already happen several times network-wide and I will see it happen again. Non-walled-garden IM networks are just set up to lose.


It doesn’t show that we love it. I hate it. But it’s the cost of entry if you want to communicate with a group on any of those platforms (which I refuse to do outside of Signal). I didn’t love giving Signal my contacts list, but I did it.


> This scientifically-looking paper could have been written by Captain Obvious himself

Why didn’t you write it then? It’s easy to dismiss the work of others, not so easy to do the work yourself.


> It is beyond obvious that contact discovery in any major messenger or social network is facilitated by uploading all contacts from the user’s address book, with all the implied drawbacks.

Mass uploading of contacts should be limited, like Telegram rightfully implemented. Signal should do the same.

Also, for Signal you have to give it the list of your contacts, and you don't have to with Telegram (and I didn't).


With Signal, it means your friends can rat you out by them sharing their address book.


Yeah, associativity remains an issue via Signal, but the encrypted-at-rest message content can be timed out at your interval.


Actually, the general love for Signal on HN is puzzling to me. It's nothing more than yet another centralised silo not owned by the users. End-to-end encryption? I'd take federation over it any day.


There is always the Matrix app.

Some may argue that Matrix still has a centralized server, by virtue of seeding your group info somewhere. But this seeding can be done via paper only, so it is still a truly decentralized messaging server.


No, there is always XMPP. Matrix is just an app, and we need a federated protocol. I think that Matrix will never have an alternative server implementation made by a competing party, which makes its main selling point void.


Matrix is not an app. It is a protocol. You can find the specification here: https://matrix.org/docs/spec/

There are also multiple client and server implementations already. You can find them here: https://matrix.org/docs/projects/try-matrix-now/

There are also at least two companies offering homeserver hosting: https://matrix.org/hosting/


It is not a federated protocol. It is an app that has some internally developed protocol, which makes it hardly more than an app, really.

As long as one for-profit company decides how it changes and evolves, it's nothing more than that.


Maybe I'm missing something.

What do you understand by federated protocol?

As I understand it, Matrix seems to be an open protocol that supports federation.

The open protocol part is evident from the extensive documentation of the protocol specification that I linked in my previous message and by the fact that anyone can propose a change in the spec: https://spec.matrix.org/unstable/proposals/

You can see how the protocol supports federation here: https://matrix.org/docs/spec/server_server/r0.1.4

As for the organization governing the protocol, there is The Matrix.org Foundation: https://matrix.org/foundation/

In the foundation page it states it is "a non-profit UK Community Interest Company, incorporated to act as the neutral guardian of the standard on behalf of the whole Matrix community"


By an open federated protocol I understand the likes of email or XMPP, or TCP, for that matter. Standardized and developed by an independent entity, for better or worse. Where the power of any single developer is checked by other developers and the standards body. Currently, matrix.org's owners can unilaterally change the protocol in any way they like, upgrading their server that hosts the vast majority of users, and all the other independent implementations would be left in the dust.

Until this is possible, it is not really a protocol, it's more like a private API available on multiple instances.


So if all goes well, it will become an "open federated protocol", according to your definition, in a few years when it is more stable, mature and multiple interests (companies) are governing its direction?

Sounds like a fair position to have.


I wouldn't hold my breath for it to happen. Why would the current owner relinquish control to others to govern its direction?


Matrix is not an app. Matrix is an open protocol for decentralized communication that works through federation.

Further, there is an alternative server implementation: Conduit.

What main selling point are you talking about?


If you make a device (say, wireless walkie-talkie) that can communicate with other devices of this type, it is not yet an open standard protocol. It's just your proprietary thingy that you do with some communication properties.

Same thing here. It's a product of one commercial company, which fully decides how it works.

Conduit is not finished, and, given the monolithic nature of the Matrix protocol (as opposed to XMPP, by the way), it will likely never be finished. Even its GitHub page says in big letters: DO NOT RELY ON IT.


> If you make a device (say, wireless walkie-talkie) that can communicate with other devices of this type, it is not yet an open standard protocol. It's just your proprietary thingy that you do with some communication properties.

True, but this isn't the case. The device you are talking about is Element, which uses the protocol. Here you can find the protocol: https://spec.matrix.org/unstable/

It is an open standard.

> Same thing here. It's a product of one commercial company, which fully decides how it works.

You, again, conflate Matrix with Element, which btw. does not fully decide how it works. Read more about that here: https://matrix.org/foundation/

> Conduit is not finished, and, given the monolytic nature of matrix protocol (as opposed to XMPP, by the way) it will likely never be finished. Even on it's GitHub page it writes with big big letters: DO NOT RELY ON IT.

Conduit is a server of a competing entity. I didn't claim it was finished or will ever be finished in the way that there won't be any development any more.


> It is an open standard.

What's its RFC number? Which body governs the development of this 'protocol'?


Open Standards don't require RFC numbers.

It's governed by the Matrix Foundation, which I have linked before


Manyverse (Sweden) app does Matrix well.

But its design intent isn't FEDERATION, not at all.


Is there a "scientifically-looking" paper that shows that users love it?


I have thought for a while about creating a contacts app for the phone that had much more privacy, almost like a Signal but specifically for the address book.

Any thoughts on this?


The practical consequence of this is that your phone number + name is public information. Almost as if it were listed in a phone book. That was pretty much the case already so it's not really a new threat. Of course people with unlisted numbers will be a bit annoyed by this. Likewise, your email address probably is part of numerous databases, including those owned by spammers/scammers.

I'm not saying this is good. But merely that assuming otherwise was always a bit naive. Now in terms of GDPR and similar laws in the US, leaking information via a scrapable API is of course still a problem.


> The practical consequence of this is that your phone number + name is public information.

From the paper:

> With its focus on privacy, Signal excels in exposing almost no information about registered users, apart from their phone number. In contrast, WhatsApp exposes profile pictures and the About text for registered numbers, and requires users to opt-out of sharing this data by changing the default settings.

So if you only use Signal you don't leak the connection between your profile (which could contain your name) and your phone number, at least not until you accept a message from an attacker.


My name/number is already pretty much public information, you can find it in various public registries if you search.

But who I have in my contact list is _not_ public information. You can infer a lot from that.

And the idiots who create these social apps always assume you want to "connect" with whoever is in your contact list.

Even if you added their number so you know to _not_ pick up when they call.


Yeah ... no. I'm not sure what exactly gets shared with these apps, whether it's just the numbers or also the rest of the contact information. If names are being shared, that's a seriously fucked up nono. These parties are getting insights into our lives that are way beyond what is reasonable. Imagine I have 555-12345 in my contact list as "Wife" and someone else has the same number as "Girlfriend".


Paywalled



