Path uploads your entire iPhone address book to its servers (mclov.in)
834 points by iamclovin on Feb 7, 2012 | 267 comments

I find it mind blowing that (in the comments of the blog post) someone asked the Path CEO:

> Why wasn't this [sending all the contacts to your servers without users knowing] an opt-in situation to begin with? Isn't that against Apple's own T&Cs?

and the Path CEO replied:

> This is currently the industry best practice and the App Store guidelines do not specifically discuss contact information. However, as mentioned, we believe users need further transparency on how this works, so we've been proactively addressing this.

Really guys? REALLY? This is why developers need explicit guidelines, because, as they just demonstrated, if there are no guidelines companies default to whatever exploits the end user! (Incidentally, it's unfair to pick on Path too much, as almost all social networking applications do exactly the same thing.)

I actually cringed when I read this "however, as mentioned, we believe users need further transparency on how this works" ... which is why it took someone running a proxy and writing a blog post for you to suddenly be transparent about it. Mind blowing. Why even say that?

Btw, times like this? You destroy any and all credibility when you say you are trying to build a company that is built to last or one that is going to follow in the footsteps of Apple.

Apple would never do this to their users.

(do not make this a discussion about the evil and good sides of Apple. Apple has repeatedly not bowed to companies desires for owning contact information and I expect they will fix this contact hole in the near future.)

It's sad because I respect Path and their love of design. But design isn't just about how it looks. It needs to resonate through the entire vision, company, product, and how you treat people.

Mind-blowing level of arrogance. Path just ensured that I will never use their product and that I will actively discourage all my friends, colleagues, co-workers, and users that I support (who number 100 or so) from ever using Path, too.

"This is currently the industry best practice"? That's the biggest bullshit line I have ever heard. No, it's most certainly NOT a "best practice", and even if it were, it shouldn't be, and as a CEO, you're supposed to be bright enough to know this. And if you don't know this, you're supposed to be bright enough to make up a better excuse when you get caught. Hint: This ain't it.

  >> No, it's most certainly NOT a "best practice"
Apparently they meant to say 'industry lowest common denominator'.

Wait: What about MY INFORMATION if I've never installed Path? If someone I know with my contact information installs Path, does that mean that my information is stored on their servers?

How can I remove my information if I've never installed Path before? It doesn't seem right that my contact information, which I have kept private, is now on their servers because someone I know uploaded it. Do I not have a right to keep that information private?

This would make Path and other companies that upload the entire contacts database prime candidates for hackers and government agencies that want non-Facebook information about people, given a name, phone number, or email address.

Clearly there are a lot of WTFs going on at Path, but this isn't one of them.

> Do I not have a right to keep that information private?

But you didn't. You gave it to someone else. It's not your information any more.

Information about you is not information you own.

Privacy and anti-spam laws in various jurisdictions cover what an organisation can do with information they collect about private individuals, but that has nothing to do with ownership.

In the EU, the third party would be using personal data for other than the reason it was collected, so it is illegal.

"Do I not have a right to keep that information private?"

Generally no. I mean, anyone can put their in-laws' information on their blog. It's a dick move but not generally illegal (if you're putting the person in danger, like a battered spouse or someone in witness protection, there may be problems, IANAL).

I get the outrage that they didn't hash everything, but the righteous indignation over a social network trying its best to let people know when their friends sign up seems overblown.

Moral of the story: don't be shocked when social networks don't follow best practices for privacy. Also foxes like chickens.

If they're smart they'll revamp their system to work like this:

edit: (0) we get your permission /edit

(1) we check for your contacts in our database (hashing your contacts).

(2) we let you know if any matches are found.

(3) we throw away all your data afterwards.

They'll generate a few fewer matches this way but since they're going for stronger ties it shouldn't really be an issue.
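The hash-match-discard flow above could be sketched like this (a minimal illustration, not Path's actual code; the function and variable names are hypothetical):

```python
import hashlib

def find_matches(contacts, known_hashes):
    """Hash each contact identifier locally, then check the hashes
    against the server-side set. Only hashes ever leave the device,
    and nothing is retained afterwards."""
    matches = []
    for contact in contacts:
        digest = hashlib.sha256(contact.encode("utf-8")).hexdigest()
        if digest in known_hashes:
            matches.append(digest)
    return matches

# The server only ever sees hashes of its own registered users.
known = {hashlib.sha256(b"+15551234567").hexdigest()}
found = find_matches(["+15551234567", "+15559876543"], known)
print(len(found))  # one match found
```

As the comment notes, this loses fuzzy matches (a number formatted differently hashes differently), but for a network built on strong ties that trade-off seems acceptable.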

This is an important point, but it's an issue that would arise even if Path allowed voluntary opt-in for contact scraping.

> This is why developers need explicit guidelines,

No, this is a matter of security. Apps should not be able to access user data without explicit permission. It's not something you can rely on guidelines for.

> Apple would never do this to their users.

You're being way too generous to Apple here. They are the ones who provide the API. I've used other phones and their APIs never freely provided my data to apps. Honestly, if I knew the iPhone worked like that I wouldn't have bought it.

I think you're assuming too much about the API. All it does is provide access to the address book so you can do things like create an "invite a friend" dialog; it's not meant to be used for bulk uploading to remote servers.

Yeah, I only came to the comments to express my utter disgust and anger at the claim anything about this is "currently the industry best practice". What a load.

Wait, on second thought. Maybe he is right. Hopefully he will post a comprehensive list of all other companies he is aware of that engage in this practice in order to show his good will in stating it is an industry best practice according to his personal knowledge, and not that he is merely a compulsive liar. I look forward to Mr. Morin's follow up with the list.

While I still support Path, the best PR move they could do right now is to pro-actively wipe all non-members' contact info from their servers, and then fast-track approval of the new "opt-in" version to the App Store, so that users can re-upload.

Played right, this episode could actually give them free publicity. Companies like Facebook and Zynga have been embroiled in far worse controversies, and they've all blown over.

That's not a PR move, that's what you do while crossing your fingers that state attorneys general and the FTC don't come after you.

I've just:

  1) saved their Privacy Policy and Terms of Use
  2) requested a complete deletion of our family's account
  3) requested deletion of any/all stored information
  4) considering contacting our lawyer
As I emailed to Path's support, our 3-4 year old children's schools, bus companies, physicians, pharmacies and our family lawyer were in that contact list - that's an insane, willful, and quite unexpected violation of our privacy.

Worse, it could have easily been solved by adding an entry to their Privacy Policy (under the "What Personal Information Do We Collect?" section) and/or a simple dialog prompt.


As I emailed to Path's support, our 3-4 year old children's schools, bus companies, physicians, pharmacies and our family lawyer were in that contact list

Ok, I'm going to pick on you for a second.

Hold the downvotes everyone! Let me explain.

This seems like a bit of a knee-jerk reaction akin to "think of the children!" or the whole child porn scare-mongering that politicians engage in that we on HN are always criticizing. I recognize that Path screwed up, big-time, but I'm unclear on why them having the information you cited, along with dozens or hundreds of other contacts from your address book, for millions of users, constitutes some kind of terrible threat to your children. I mean, their schools, their bus companies? How is that even remotely useful information to anyone?

I think there's plenty to criticize here from just the high-level perspective of "they used my contacts without my permission", without using the think-of-the-children scare-mongering tactic. But maybe there's a specific threat in mind that I'm not thinking of?

Anyway, just thought your response was a little over the top, and more informed by emotion than reason.

Ok, now everyone can downvote :)

Having all that information (school, doctor, lawyer, pest control company, health insurer, employer, credit card company, ...) about one person or a family, together in one place, is a social-engineering / identity-theft cornucopia. Imagine if Path had a data breach resulting in this contacts database floating around the internet.

Now most people's response to that kind of threat is to think "I'm just nobody important, no one would ever go to the trouble of using this information to impersonate me or otherwise make my life difficult." Probably you are underestimating one or more of: (a) your importance, meaning how much money someone stands to gain by impersonating you, (b) the gullibility/apathy of customer service reps at the companies you interact with, or possibly (c) the amount of free time and/or perversity of someone who will fuck with you just for the lulz.


One of my kids has special needs. This means he rides a certain bus and goes to a certain school. It would be trivial to uniquely identify him for the rest of his LIFE with only the information contained in my contacts list.

So now, without consent, this "private" "friends and family"-based app I installed on my phone, plus its company, plus any other company they choose to do business with, or any entity that acquires them in perpetuity, or any data-mining, social-profiling, or credit bureau, can start building far-reaching and long-lasting profiles of a four-year-old little boy who needs extra help.

What part of that confuses you?

p.s. this could have been avoided with a dozen lines of code via a dialog box.

Actually there is a simple solution for your problem. Don't use social apps. Especially not if they are free!

Do you also buy snake oil if it comes with a document using lots of difficult sounding words but ends saying it cures everything?

> I'm unclear on why them having the information you cited

First of all, my wife and I actually read and attempted to analyze Path's Terms and Privacy Policy before joining. They did not in ANY WAY have our permission, either implicit or explicit, to collect private information about our children, who are 3 and 4 years old.

> along with dozens or hundreds of other contacts from your address book

From path.com/about

  Path should be private by default. Forever. You should 
  always be in control of your information and experience.
I was never once asked, never agreed, and never gave consent to allow anyone to collect sensitive information about where our children are schooled, what buses they ride, where they receive medical treatment, or OTHER PLACES I LEFT OUT OF THE ORIGINAL LIST BECAUSE THEY ARE PRIVATE TO MY FAMILY. :)

> for millions of users

"kill one, it's murder - kill 1,000,000 it's a statistic" - this isn't about your children - it's about mine. ;)

> constitutes some kind of terrible threat to your children

Where did I say this was a "terrible threat" to my children? Maybe it is, maybe it isn't - bottom line is we did not consent to it. And perhaps we just want to protect our underage children from having behavioral profiles or credit risk assessments built up on them before they reach kindergarten.

Interestingly enough, according to Path it is VERY reasonable that I should protect my children's information:

  We take reasonable measures to protect your personal information 
  in an effort to prevent loss, misuse and unauthorized access, disclosure, 
  alteration and destruction. Please be aware, however, that despite our efforts, 
  no security measures are perfect or impenetrable and no method of data 
  transmission can be guaranteed against any interception or other type of misuse.
Combined with:

  (You)...accept all risks of unauthorized access to the Registration Data and any other information you provide to us.
My risk, right?

> But maybe there's a specific threat in mind that I'm not thinking of?

Yes, there is. And I acknowledge that you might live in a world where you have no problem allowing anyone in the world to know any detail they can illicitly sneak out of your phone about you, your family, and your friends - but most of the rest of us don't.

For fuck's sake a UIKit dialog box and handler code is less than a dozen lines of code and then NONE OF THIS WOULD BE AN ISSUE.

> Anyway, just thought your response was a little over the top, and more informed by emotion than reason.

I'm curious, do you have a spouse or children?

> They did not in ANY WAY have our permission, either implicit or explicit, to collect private information about our children, who are 3 and 4 years old.

What are you talking about? Do you expect them to perform complex data analysis to figure out that certain contacts are young children, and then explicitly ask permission to share those? Or do you expect them to preemptively ask for any potential sensitive contact information? "Can we use your children's information?" "Can we use your in-laws' information?" "Can we use the address of the President's safehouse?" Etc.

> What are you talking about? Do you expect them to perform complex data analysis to figure out that certain contacts are young children, and then explicitly ask permission to share those? Or do you expect them to preemptively ask for any potential sensitive contact information? "Can we use your children's information?" "Can we use your in-laws' information?" "Can we use the address of the President's safehouse?" Etc.

Just a "Can we upload your entire address book?" would have worked. Or perhaps listing "Your entire address book" in the "What personal information do we collect?" section of their Privacy Policy.

That still wouldn't be specific permission to share children's information specifically, which is what it seemed like you were requesting.

No, but giving him the information would have informed him sufficiently so that he could have decided whether he wanted to (a) not use the app or (b) delete sensitive contacts before using it.

I think you're spot on here mash but I have a disconcerting question. How do you intend to handle this situation with every other app you, and presumably your wife, have ever downloaded? Specifically those that may not be as 'transparent' as Path?

I ask because we would be foolish to think the developers of some lower-quality apps haven't, or won't, exploit this for their own monetary gain.

> How do you intend to handle this situation with every other app you, and presumably your wife, have ever downloaded?

Not sure yet. Path is actually the first (and will certainly be the last) social network I've ever joined - and it was precisely because it was supposed to be private and they had a pretty reasonable privacy policy. I remember something of this nature after the App Store was first released but had honestly thought it was a fixed issue.

On our lap/desktops we use prompting firewalls and on occasion will even watch suspicious apps or behaviors, if you will, where on iOS this is much harder.

I have an idle FreeBSD box and may start mitm'ing like OP did, but seriously, poring through the kind of output a home network produces doesn't sound like fun at all, and I already know that going back to a dumb phone would probably be just as easy.

I was worried that would be the response. Not that I think it's a bad idea, it's just such a substantial shift from what I'm used to.

I would be curious for someone to do this with other apps. Even those that aren't social networks. I have a strong inkling that most of the top free apps are doing this without any of us knowing.

> I'm curious, do you have a spouse or children?

What kind of argument is this? So if he doesn't have a spouse or children, he can't be right? What kind of populist are you?

Seems to be an ad misericordiam argument. It's bad they share private information of people in your contact list without your or their permission. But adding children in the mix is just used to add effect to your argument.

Don't really like this kind of argumentation.

considering contacting our lawyer

What do you expect to achieve with this step?

To get his money back, of course.

To get perspective, actually. Most lawyers are wicked smart and it sucks you aren't in a position to have such a valuable resource available in your own life. HTH.

Lawyer? God, get a fucking grip. No wonder companies treat their users like morons.

Because asking for advice from those wiser than oneself clearly makes one a moron.

Yes, good point. And regarding state attorneys general, how is this not data theft? It seems to go far beyond privacy issues; the program is, in every way that matters, a trojan that steals personal data. I can't see how it could not be considered so, given the details of what was discovered.

If I were an evil-state-attorney-general, I'd be calling up Path and saying "Here's a list of names (unsaid - of suspected drug dealers), please forward all of their details and contacts, and the details and contacts of anyone who lists them as a contact. Thanks"

Yeah - or I hire a private eye to spy on my wife and he pays off a path DBA. Don't people always complain on HN that they don't get enough kaching?

If I were involved in this (and I'm not, I just think transparency - not privacy - matters) I would want the CEO and CTO of Path to create a video that is displayed to all relevant users in their mobile app. The first thing they do is apologise, they explain in plain words what people are up in arms about, the CTO reiterates that a) this was dumb and a poor choice but we are all human, b) what this means (eg: we did this not for our value but to deliver the best experience by matching you to your friends effortlessly) and c) why this matters on a macro scale for the industry.

I would respect a company that did this because they are not only addressing users that are aware of it but also users that are not aware (but are affected.)

Wiping data is fine but it feels like it doesn't solve the crux of this problem -- communication and transparency. Companies make mistakes and they can fix them, sure, but communicating about them? that's much cooler. (I suspect this is overkill unless mainstream news catches on this - which seems unlikely)

I wouldn't be surprised if there is an engineer there who voiced concerns, but whether they still work there or not would be an open question. Wherever they are, they should be found and put in charge of development.

Ethical engineer: "I've got a problem with doing this, we're storing personal info without permission. Shouldn't we at least have something that lets our users know?"

Ambivalent boss: "I don't think it's a problem, who's going to notice anyway?"

Not that the ethical engineer will get anything more than personal vindication for actually giving a shit.

I would want the CEO to:

1) Immediately delete all of the non-user data

2) Send an apology e-mail to each Path user explaining the situation

3) Write, by hand, a corresponding apology letter for each Path user

4) Hold a townhall-style meeting in which members of the public can ask him questions

5) Pay, out of pocket, the travel expenses of anyone who attends the townhall meeting

6) Wear an indicator of shame (large necklace or a sign) for as long as he is CEO of the company

Seppuku basically

7) Commit seppuku

If “Apple would never do this to their users”, then how is it that Apple provided the API which Path used to do this to their users, without requiring the users to give the app permission (as they do with, say, allowing an app access to a user's location)?

Because it requires two APIs, both of which have legitimate uses:

1.) Get the user's address book and 2.) upload it to a server.

Installing an application implies a higher level of trust than a web application. You can't prompt the user for every API that might have a nefarious use. Location data is also much more sensitive so it makes sense to prompt the user for that.

From the traction this story is getting, it sure looks like address book information is considered sensitive by a lot of people. Possibly on par with location data.

It's sensitive depending on what you're going to do with it. If you're a native app and you want to access it so that you can show me my address book in some unique way, then I don't want to be bothered giving permission. If you're a native app that's just a front end to some social network and you're going to shuttle it off to some big database in the sky, then maybe not.

The problem is that this isn't easily enforceable at the API level without the user having to make decisions. The right level of enforcement is at the app review stage.

> The problem is that this isn't easily enforceable at the API level without the user having to make decisions.

It's not enforceable even WITH the user having to make decisions. The user cannot allow the app to upload one kind of data and disallow another (the address book). You can only allow ANY upload or no upload at all.

That's certainly Apple's intentional business decision, though. It's simple to provide a way for global abook perms as well as for individual apps.

I guess it's more like

1) Get the user's address book 2) upload _something_ to a server.

A user could give permission to both.

Yeah, but then using apps would quickly descend into a horrible mess of deny/accept, confusing and scaring the user. The pop-up hell of windows would pale in comparison.

You'd have solved the problem, but created a horrible user experience instead.

Maybe I wasn't clear enough. I wanted to say that it's not possible to solve the problem by asking the user's permission, because the API does not allow you to ask for permission to upload specific data (the address book). So there is no way to prevent an app from uploading your address book without preventing it from uploading anything at all.

Your message was not lost on me. The mind blowing part (to me) is that it takes years before some average Joe (not necessarily security "expert" by profession) decides to take a look at the logs to see what is REALLY going on behind the curtains - revealing something huge like this. Like you write, unless explicitly stated, companies will default to whatever is in their best interests, which is why Facebook going public should be a worrying thing for those users. As someone wrote, "if you are getting something for free you are not the customer, you are the product."

I've noticed a pervasive attitude throughout the SF social app community that your app is at a disadvantage if it doesn't use all of the (potentially dirty) tricks that other apps use -- especially in a crowded space. If your app is the only one that doesn't do automatic friend discovery, or post to the Facebook news feed, your growth coefficient is going to suffer. Of course if you're the first to be found out doing these tricks, the backlash can hurt more than it helps. It's a gamble, and although the HN community is (rightly) in uproar, Joe average user likely won't care that his address book was uploaded unless he's explicitly told to be upset about it, or unless someone compromises Path's servers and he's personally hurt by it.

> we believe users need further transparency on how this works, so we've been proactively addressing this

I feel like shooting someone every time I see them (or for that matter, anyone else) doing things 'proactively' (at least three times in the comments of original blog post). My BS meter goes all red on that. What does 'proactively addressing issue of transparency' mean? Even the sentence itself is not transparent.

I seriously wonder why so many companies communicate using language like that. Is it because of law? If instead, he'd say "well, we really screwed that one up and we want to apologize; right now we're trying to figure out how to fix those issues, please be patient" - could that get them sued, or what?

The funny thing is that proactively means exactly the opposite: by their own initiative, instead of waiting for someone to find out using a proxy.

Well, maybe I'm missing something here, but I really think it's mind blowing HN-readers are only now realizing this is happening with these kind of apps.

And yeah, if you don't do this (everybody else does AFAIK) you're left with a disadvantage in hooking you up to your friends who also use the service.

I actually think the CEO's response is not that bad.

> so we've been proactively addressing this.

"Proactively" doesn't mean acting after you've been caught.

Actually, it means exactly the opposite.

The problem is any site that decides to grab this data gets an advantage over any site that does not, and the regular users simply don't care enough.

About four years ago, a new trend started emerging. Sites would ask users for their Gmail passwords and scrape all of their contacts to invite them to the service. I remember this because I was at a company where I refused to implement a feature that requested a user to enter their Gmail password. They got someone else to do it. The issue was that we had competitors using this tactic and they were gaining a lot of users. Unless you have a way to level the playing field, you'll end up just punishing the ethical companies.

The CEO's comment is some grade A bullshit. Obviously they realized this was an issue before they got caught, but if they really thought that it was "important that users clearly understand it" the opt-in would have been in version 1.0, not 2.0.6.

Apple would never do this to their users.

Apple makes apps prompt me every time they want to know my location or send me push notifications, but they don't require it for the contacts info.

How is managing push notifications more important than leaking private contact info?

I don't want to rag on you, but the answer to this is really, really obvious--Path certainly screwed up, but that's no reason to lose your head and start making silly claims.

iOS doesn't know what's being uploaded by an app. It can't know. They could ask every time an application wants to access your contacts (which, I think, would really suck for UX, and it'd be a context-free question without indication of what the data would be used for), but after that? There is no practical way to know that that data is being sent over the wire to somebody.

Ok, how about asking the first time?

So...what, exactly? "This thing wants to use your contacts." It's a social network. It can be expected to want to use your contacts. It has no bearing on how Apple is supposed to avoid letting Path package up your contacts and send them to Path's servers.

Apple would never do this to their users.

Perhaps not, but remember that Apple are supposed to have approved all Apps on the AppStore. It's supposed to be for user benefit, to prevent malware, viruses and bad applications. However this app was approved by Apple. What, exactly, is the point of the AppStore approval/walled garden approach if this is acceptable?

But Apple did do this to their users. They are just as culpable as Path, if not more so. If you are going to provide a platform for app distribution, it is your, and only your responsibility to ensure that private information is not abused if you create the illusion that user information is safe.

Not sure why people are down-voting this, but how about offering a counterargument if you disagree instead of just clicking an arrow like a Rhesus Monkey?

Facts are stubborn things.

This data was never sent to Apple servers.

This is false. They do not send a recorded log of your movements to Apple; however, they do send GPS + WLAN BSSID correlation data back to Apple. [1] They claim the process is anonymized, but there are very powerful deanonymization techniques that can be applied to large data sets. [2][3][4]

I live in almost the middle of nowhere. I guarantee nothing like Google Maps has ever passed this way to map my Wi-Fi access point's BSSID onto a physical location, yet the week a member of my family got an iPhone, plugging the BSSID into a location API gave the exact location of my house...





That's circumstantial evidence at best.

Here's more anecdotal evidence to suggest that most people don't know everything: when Samy Kamkar[0] first demonstrated geolocation via BSSIDs, I tried out every wireless router in my house, including one that had not been plugged into a wall in over 4 years and never at my current residence, long before Google started wardriving for street maps and well before the first iPhone came out. He was able to accurately map it to my old residence. That means that sometime before December of 2006, someone or something was able to snatch my BSSID from someplace, accurately note its physical location in the world, and store that away in some database that was used almost 4 years later. I can guarantee you it was not Apple, and I'd be damned surprised if it was Google at that time.

[0] http://samy.pl/mapxss/

Not surprising. There's at least one collection project that had already been running several years at that time: http://wigle.net/

It's hardly dismissable as circumstantial evidence when Apple themselves have said they do it.

The data was stored on the phone, not sent to Apple, and certainly not to advertisers.

The data was used for GPS assistance -- it was a cache that triangulated your location from cell phone towers to help get a faster GPS lock (and to find your location without GPS if you’re getting bad GPS signal).

If you're concerned about the police finding out your movements, they can get such information from the telcos themselves with just your cell number, whereas to use those stored GPS logs they would need physical access to your iPhone.

Dave Morin, Path's CEO just responded in a comment: http://mclov.in/2012/02/08/path-uploads-your-entire-address-...

>Arun, thanks for pointing this out. We actually think this is an important conversation and take this very seriously. We upload the address book to our servers in order to help the user find and connect to their friends and family on Path quickly and efficiently as well as to notify them when friends and family join Path. Nothing more.

>We believe that this type of friend finding & matching is important to the industry and that it is important that users clearly understand it, so we proactively rolled out an opt-in for this on our Android client a few weeks ago and are rolling out the opt-in for this in 2.0.6 of our iOS Client, pending App Store approval.

edit: Morin responds to a response http://mclov.in/2012/02/08/path-uploads-your-entire-address-...

To the suggestion that they just hash the addressbook entries:

> 1. This is a good alternative solution which we'll look into. Thanks for the idea.

>we proactively rolled out an opt-in for this on our Android client a few weeks ago and are rolling out the opt-in for this in 2.0.6 of our iOS Client, pending App Store approval.

"Proactively?" How do you get into the Social Networking business and not see this issue coming before the first line of code is written?

[re: hashing] >This is a good alternative solution which we'll look into. Thanks for the idea.

Again, no. That no competent system design talent/time was dedicated to this process is a damning critique of your organization's ability to be trusted to safeguard user data.

I think the simplest explanation is that he's playing dumb.

He almost certainly is either playing dumb or is dumb. If you're not dumb, you have to play dumb, because otherwise you'll be crucified.

Playing dumb. Hashing the information is such an obvious choice that there's really no plausible explanation for the developers not to have considered it. They probably just figured "everyone else is doing this, so what's the harm?"

It's been a long time since Plaxo.

> How do you get into the Social Networking business and not see this issue coming before the first line of code is written?

"It is difficult to get a man to understand something, when his salary depends upon his not understanding it." -- Upton Sinclair (http://en.wikiquote.org/wiki/Upton_Sinclair)

> "Proactively?" How do you get [...]

Was about to say exactly the same. The only thing I can add here is that if this hadn't made headlines, no one would have thought of opt-outs.

If all they care about is matching users up, couldn't they just use hashes of the relevant data?

EDIT: possibly even better, they could use a Bloom filter, similarly to how Chrome uses them to filter malicious websites without sharing your entire browsing history with Google.
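For illustration, here is a minimal Bloom filter sketch in Python (hypothetical and much simpler than Chrome's actual Safe Browsing scheme): the client could download a filter built from hashed member identifiers and test its contacts locally, uploading only probable matches for server-side confirmation.

```python
import hashlib

class BloomFilter:
    """A minimal Bloom filter: k hash positions per item over a fixed bit array."""

    def __init__(self, size_bits=1 << 20, num_hashes=7):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: str):
        # Derive k positions by hashing the item with a per-hash prefix.
        for i in range(self.num_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item: str):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: str):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

members = BloomFilter()
members.add("+15551234567")
print("+15551234567" in members)  # True
print("+15559999999" in members)  # almost certainly False (false positives possible)
```

Bloom filters admit false positives (tunable via the bit-array size and hash count) but never false negatives, so a local hit would still need server confirmation; the upside is the full member list never leaves the server, and the full address book never leaves the phone.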

You really don't need to upload the address book for that. Sending just a hash of each phone number, address, name, and email would be enough to do the matching.

Hashing phone numbers doesn't do much since the space is so small.

Not that small. It's comparable to a weak password. There are about 5 billion active phone numbers in the world [1].

Besides, a small search space can only be searched quickly if it takes little time to hash a phone number. Doing a few billion MD5 sums is not so difficult. If the hashes are computed with an expensive bcrypt, then it's just a matter of increasing the number of iterations to make brute-force attacks infeasible.

Edit: I realize that the hashes can't be salted (because different phones must produce the same hashes for the same phone numbers), so a rainbow table can be created for the entire database.

[1] http://www.cbsnews.com/stories/2010/02/15/business/main62097...
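To make the small-search-space point concrete, here is a sketch (with a hypothetical number and prefix) of how quickly an unsalted fast hash of a phone number falls to enumeration once an attacker guesses the numbering-plan prefix:

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# An "uploaded" contact hash, as a server (or attacker with a dump) would hold it.
leaked = md5_hex("+15551234567")

# If the attacker knows or guesses the country/area/exchange prefix,
# only the last four digits remain: 10,000 candidates.
for line in range(10000):
    candidate = f"+1555123{line:04d}"
    if md5_hex(candidate) == leaked:
        print("recovered:", candidate)  # recovered: +15551234567
        break
```

Ten thousand MD5 operations complete in well under a second; even the full ~5 billion active numbers are within easy reach of commodity hardware, which is why the later comments turn to expensive hashes and pairwise constructions.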

The client could do 'signed' hashes using the local phone number and the friend number (sending the server both the local:friend pair and the friend:local pair).

That wouldn't really stop anybody from reversing the hashes, but it would make a global rainbow table useless.

It would make reversing the hashes substantially harder for any given hash function, though, right? Thanks very much for this idea. I'd thought about tracking social connections by sending hashes (on an explicit and opt-in basis) for my research app, Mappiness[1], but gave up the idea mainly because hashing seemed so hopelessly weak. But I think this + bcrypt might make it workable.

1. mappiness.org.uk

At that point it's also useless for matching.

It wouldn't be a strong signature, it would simply be the other half of the number pair. Numbers A and B both have easy access to A:B and B:A.

The hashes for a given user could still be attacked using their phone number, but a global table wouldn't work.

That's clever. You can then even improve the algorithm by only sending the hash of A:B for every phone number, where A < B (numerically). Then you don't have to worry about whether it's Friend:Local or Local:Friend.
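A minimal sketch of that refinement, assuming numbers are already canonicalized to digits-only form (the hash function choice here is illustrative; the thread suggests a slow hash like bcrypt in practice):

```python
import hashlib

def pair_token(num_a: str, num_b: str) -> str:
    # Canonical order: hash "A:B" where A < B numerically, so both
    # parties derive the same token regardless of direction, and a
    # single global rainbow table is useless.
    a, b = sorted((num_a, num_b), key=int)
    return hashlib.sha256(f"{a}:{b}".encode()).hexdigest()

# Both sides compute the same token:
mine   = pair_token("15551230001", "15551239999")
theirs = pair_token("15551239999", "15551230001")
print(mine == theirs)  # True
```

A targeted attacker who knows one user's number can still brute-force that user's pair tokens, but the work no longer amortizes across the whole database.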

Key strengthening can help. If you do a bcrypt-style hash and set the cost so as to take one second on a modern CPU, brute-forcing each phone number would take about 57,000 days :)

I would be more comfortable with this than giving them my entire address book, anyway.

But how long would it take to bcrypt your entire contact list on an iPhone? (No idea, but it might be too long.)

PBKDF2 with a high iteration count.
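Python's standard library exposes PBKDF2 directly, for example. This sketch assumes a single application-wide salt (as the earlier comment notes, per-user salts would break cross-user matching), so the iteration count has to carry the cost:

```python
import hashlib

# Assumption: one shared, app-wide salt so all clients produce matching hashes.
APP_SALT = b"example-app-wide-constant"

def slow_hash(phone: str, iterations: int = 200_000) -> str:
    """Key-strengthened hash of a canonicalized phone number."""
    return hashlib.pbkdf2_hmac("sha256", phone.encode(), APP_SALT, iterations).hex()

# Deterministic across devices, but each guess now costs ~200k HMAC rounds.
print(slow_hash("+15551234567")[:16])
```

The iteration count is the knob: hashing a few hundred contacts once per device is cheap, while enumerating billions of candidate numbers at the same cost per guess is not.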

Hashing doesn't let you match Kate, Katie, Katherine as the same person like Facebook does, however.

Normalization could be done on the client, however I don't think Path matches users by name anyway.

I'm sure if the Path guys had thought of that, they would be in a less bad place right now :)

I actually think that Dave's response was pretty lame. That was a typical PR/B2C response, neglecting that he is talking to techies here.

Everybody knows "why" Path is doing this and the response should have been more of "why this way".

As an engineer at a different social network -- not Facebook, but not small, about 30 million users -- he's right. That functionality is important. But we do it in a way that preserves privacy. IMO this is just sloppy...

Are you in a position to elaborate on your method? Does it involve hash comparisons?

I understand wanting this from a usability perspective, but is there some way to opt-out if our data is already on their servers? I, for example, in just browsing around the app out of curiosity ended up on the "find my friends" screen and without warning Path now has all of my contact data.

I have yet to open the app again.

This appears to be a sound response to a sensitive issue. Certainly handled far better than some others have handled their PR (debacles) recently.

You should not have been downvoted for your opinion on this, but I have to respectfully disagree. There MUST have been somebody at some point who mentioned that they were storing the details of non-users and making a massive database of connections without authorization, and as the CEO he must have been aware of this, and as the CEO he made a bad decision to go ahead and do it anyway.

He didn't even respond that they were checking your address book against their database for matches, making those connections, and dumping the rest of the data. He actually confirmed that they are storing non-user data in the hope of one day making a connection. But if that were correct, the new user would make that connection when they signed up. You don't need two independent sources to make the connection through the address book.

I'll admit I only really superficially followed this through HN, and it seems you're more informed than I am - but my point was this was handled FAR better than AirBnB's debacle.

Hopefully this feature will not just prevent the Address Book from being uploaded but also remove any data already stored on their servers.

This is actually nothing new. A lot of apps have been doing this for a very long time. However, it is one of the best kept secrets in our space. I kind of have a feeling no one talks about it because they don't want word to get out. Can you imagine the scandal if this made it on the front page of CNN or Drudge?

Ever since I learned this was possible, I've been very careful about which apps I download, and actually have downloaded very few since, as a result. There are a lot of random iPhone developers that I really don't think need to have access to my entire contact list.

This sort of behaviour is almost certainly illegal in the European Union. You may not store personal information unless you have a clear and legitimate reason to store it. If you are affected by this, you should contact your local data protection office.

This was news to me, so I went ahead and submitted a tip to both CNN and Drudge. I hope they pick it up.

If no explicit permission is given by the user, how is this practice not illegal?

Apple doesn't prompt the user to ask for permission when the APIs are used (like what happens with location), so this is the desired behavior. It's very simple:

> 17.1 Apps cannot transmit data about a user without obtaining the user's prior permission and providing the user with access to information about how and where the data will be used

It's likely that this app will be pulled from the App Store within the next few hours.

All that means is that it has to be mentioned in a very long terms of service somewhere. If Apple cared about address book information like they currently do for location data they would make the API query the user.

I noticed that Path did this a few weeks ago when I initially installed it. My reaction was much the same: WTF?! I proceeded to file a bug report with Apple that the API should prompt for access just like the Core Location API does (somebody having ALL my contacts' info is more important to me than an app knowing where I currently am). My bug was closed as a duplicate; hopefully a change is in the works.

Yep, which makes the address book a hack for (high-latency, obviously) cross-app communication. E.g., last I knew, TextExpander added an entry to your address book with your abbreviations, so that other apps e.g. Simplenote can use those abbreviations as you type. Very well intentioned hack, which shouldn’t be necessary, but is, using an API that really just shouldn’t be open…


I was operating under the assumption that this was not possible, as I'm sure many other people were. What sort of imbecile at Apple decided that allowing apps to do that was even remotely acceptable to phone owners?

Which apps do that? Do you have a list? Can anything be done about the data after-the-fact?

Facebook, Foursquare, Twitter, basically any app that allows you to "search my address book for friends" will do this.

All these services require either a email or phone number to sign up, so to search for friends who have also signed up for the service, you need to compare two data sets: emails or phone numbers of users you already have, and those in the person's address book.

You obviously wouldn't download your entire database of users contact information to the phone to compare the data sets, so you send the data set up to the server.

The addresses from the user's address book should be hashed before sending to the server and compared to hashed addresses on the server. Then only positive matches are registered, and the server doesn't see more private information than it needs.

Hashing data from the address book doesn't work, because people write the same addresses and even phone numbers in many different ways. Normalizing on the client is not really an option either, because decent normalization requires a lot of data - not practical to send it all to each client.

Phone numbers are easy to canonicalize: convert to international form.

Email addresses can be effectively canonicalized by lower casing. Not many mail servers are case sensitive these days. Additionally, for the local part, you can generally strip off anything after a "+", and with gmail, you can drop any period in the local part. (Granted, it's not perfect-- so make sure that's not a security concern.)

These techniques have been working fine so far in my app for my "Find My Friends" feature.
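A sketch of those rules (the gmail-specific handling and the default-country assumption are simplifications; a production app would use a full parser such as libphonenumber):

```python
import re

def canon_email(addr: str) -> str:
    """Lowercase, strip a '+tag' suffix from the local part, drop gmail dots."""
    local, _, domain = addr.strip().lower().partition("@")
    local = local.split("+", 1)[0]
    if domain in ("gmail.com", "googlemail.com"):
        local = local.replace(".", "")
    return f"{local}@{domain}"

def canon_phone(number: str, default_country: str = "1") -> str:
    """Reduce to international, digits-only form (assumes 10-digit local
    numbers belong to default_country -- a simplification)."""
    digits = re.sub(r"\D", "", number)
    if not number.strip().startswith("+") and len(digits) == 10:
        digits = default_country + digits
    return "+" + digits

print(canon_email("Kate.Smith+path@Gmail.com"))  # katesmith@gmail.com
print(canon_phone("(555) 123-4567"))             # +15551234567
```

Run on the client before hashing, this gives both sides of a match the same canonical string, which is the property the hashing schemes above depend on.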

If so many apps do this, why the sudden uproar over Path doing the same? I'm not condoning it, just curious why it still happens, and so frequently. It seems to me that the "industry best practices" actually need to become best practices.

I do believe WhatsApp also sends your address book, to what extent though I don't know.

I haven't MITM'd it to confirm, but it appears from using the app that the Twitter iOS app does this as well.

Honest question: Isn't this within the kind of behavior that AppStore reviews are supposed to prevent, at least if there isn't an app specific functional explanation for it? Does Apple have a list of what kind of behavior like this is tolerated or does word just get out about what they don't reject?

Two 'by review' app stores I've had experience with are iTunes App Store and Amazon App Store. Here's what I've seen:

- iOS app review is very minimal. For the initial submission, they'll play around with the app for ~5 minutes. I've had updates approved without the app even being launched, and other times it's approved with simply logging in and launching the app on different devices. They are mainly concerned about policies, private APIs, etc. Things get stricter when you submit in-app purchases, but again those are more administrative than functional. So, I don't think they would ever catch something like this.

- Amazon's testing is insanely detailed compared to Apple's (at least, for the first submission - I haven't submitted updates yet). They tested the app on several Android devices, and also were looking at data over the wire using, presumably, a client proxy. They will reject the app if you send up passwords/usernames without using SSL, for instance. They hit all the menu buttons and try most features. And they review all permissions your app needs.

Well, since you only ever submit the compiled application binary to Apple, it'd be pretty darn hard for them to detect behaviour like this. Especially if the code to do so is obfuscated, and/or the data is smuggled out via SSL (or worse, piggy-backed steganography-style onto other data).

Sometimes it's tempting to speculate that the real purpose of the app store review team is just to ensure developers aren't trying to access Private Frameworks (i.e. non-public APIs) or to upsell the customer while bypassing the 30% Apple tax.

Pulling contact data requires API calls that can be detected in the compiled binary (this is one way that Apple detects calls to unpublished APIs).

That said, it's humorous how a blatant abuse of trust such as this gets through unscathed but god help you if you try to access the iPod library the wrong way!

Well, the app could have legitimate reasons for linking to the required API (such as pretending to only use it after obtaining user confirmation), but then you could add additional obfuscated calls to the same API without prompting the user. So that wouldn't really help.

Perhaps, however when calls like this are noted additional scrutiny of the application could be applied to ensure they are not abused (such as using a proxy in the way described by the parent).

There are other actions allowed by the SDK that seem to have little non-nefarious use, such as the ability to hide the fact that an application is transmitting and receiving data (the network "spinner" can be disabled by the application); as others have mentioned, it's interesting that some API calls require authorization from the user while others do not.

A postdoc in my lab published an academic paper that did exactly this: automated static analysis of iOS compiled binaries for privacy violations.

As far as I know Apple was not interested.

Here's the paper if you want to take a look: http://seclab.cs.ucsb.edu/media/uploads/papers/egele-ndss11....

Interesting. Quick question, how would you deal with things that call APIs via, for example, NSSelectorFromString, where the String is built in an obfuscated way?

(I'll go back and read the paper in more detail soon)

As I remember, the analysis doesn't handle calls that can't be determined statically.

So the analysis would fail to determine the method and class of a obfuscated string.

I've received a rejection for using a "private" ivar (it was actually a framework doing it).

The ivar was in a public header, and was not marked @private, which is the only correct way to designate an ivar as private in Objective-C. Putting a comment above it saying "this is private" (which they did) doesn't count. It's protected, by definition.

NSActionCell.h, I think.

Eh, I don't think you're quite right here. @private means "Only accessible by this class and its instances, not parent, sibling or child classes." What Apple means by "private" in that case, though, is "Only for use by Apple, not outside vendors." If NSActionCell has private subclasses that need the variable, marking it @private would be flat-out wrong.

No, the correct way to do it in that case would be to mark the ivar as @private, and have a private category on the class with a @property definition for that ivar (or just getter/setter methods). Leaving the ivar as protected and relying on a header file comment is just sloppy. Protected implies that any subclass can use it, not just Apple-blessed subclasses.

Apple would simply tell their SSL library to dump the raw data - I mean, they wrote it (or at least have source access to it) and have absolute control of the devices used to test. Nothing hard at all.

They could do the same thing this guy did in an automated way (seed the device with unique data, sniff traffic for that data), but as you said there are many ways to obfuscate it.

Certainly. An even easier way is to have the app call home to a web service that returns "stealUserData: false" until the app is approved, after which you switch the web service response over to "stealUserData: true"....

Yes, that's the official line. There are numerous examples of bad behaviour going live, though.

The app explanation for it will be "Path can hook into your address book" - presumably for sending invites or messages to friends. However, at this point the cat's out of the bag and Path can do what they like with this data (albeit against App Store policy).

The problem is surely one of governance - it must be that the app reviewers simply don't (whether through sheer volume of apps they have to review, or lack of ability) see what's being posted, and where.

What's more if Path used https and a CA, would we ever have found out what was being posted short of live debugging?

The address book is uploaded using TLS/SSL and the author used mitmproxy.

D'oh. Would this man-in-the-middle attack have worked if Path validated against a CA or stored cert and only submitted the data when it was sure it wasn't being snooped on?

I've come across the latter, but it's not a difficult thing to get around if you're willing to play with the binary. You might be able to recognize the stored cert and sub it out with your own, or you can just ensure the branch that validates it never runs.

Presumably Apple could demand the ability to change the certificate an app validated against for testing purposes, if Apple cared enough to do that.

Nope. Turns out Siri was (at least originally, not sure if it still is) vulnerable to the same attack.

Honest answer: This is the kind of behavior that justifies the expense of writing multiple native versions of an app rather than just developing a single website accessible from any browser but having limited access to data stored on the users' computing device.

Our app was recently rejected specifically for this reason, though we had a "skip" button, contacts weren't just automatically 'farmed'. So we had to add a popup with explicit allow/deny buttons and then the app passed subsequent reviews.

I'm beginning to understand what Richard Stallman has been saying all these years. Although I don't like the guy on a personal level, this incident proves him completely right: running closed-source software can compromise your rights (the right to privacy, in this case).

I also want to thank the author of this post for discovering this! I wanted to try Path some time ago; now I can safely avoid it without regret.

Open source software can collect exactly the same information on you.

There was a furor recently when it was revealed that OS X and Windows collect data on which access points you have associated with. What was omitted was that Linux does exactly the same thing: the wireless subsystem has a debug print (at a debug level turned on in all major distributions) that will log the MAC address of the AP you just associated with.

It's still there, afaik.

I think I'm missing something.

You think people should be upset because a Linux computer knows the MAC address of the AP you are associated with? If that is a problem, then imagine what people will think when they realize that the computer knows what keys you press on the keyboard (!!??)

There is only a problem if the operating system shares information with 3rd parties without authorization.

The furor (at least, the one I saw) was that the devices stored this information and would potentially let others look at it later.

It shouldn't be upsetting that your computer knows what keys you're pressing or what network you're associating with. Recording that information permanently could be bad.

If you have an Ubuntu laptop with wireless handy, run the following command:

sudo grep AssocResp /var/log/syslog

I think this is Apple's problem really. Path is just one of many apps that probably do this without asking you.

Ideally the OS should prompt you if an app wants access to your address book, just like it does for location.

Android apps must explicitly request a READ_CONTACTS permission. But even there, no one actually reads those permissions lists, and apps routinely ask for far more than they need. User authorization is a very weak security mechanism in the consumer space.

Like FB apps, even legit Android apps ask for the moon, with no option to dole out granular permissions.

"The Weather Channel" is a default icon suggesting a free download on the Kindle Fire.

It asks for:

    Set the wallpaper
    Send SMS messages
    Write to external storage
    Access info about Wi-Fi networks
    Access coarse location
    Initiate a phone call without going through the Dialer user interface for the user to confirm the call being placed
    Write (but not read) calendar data
    Read calendar data
    Required to be able to access the camera device
    Open network sockets
    Access fine GPS location
    Access vibration feature
    Access info about networks
    Record audio

I haven't installed it, so I have no idea why it should be able to silently dial out without my permission or send SMS messages.

If legit apps are demanding all this, then a Chinese weather app dialing those toll numbers in the Caribbean could do the same.

FWIW, if you have a rooted Android phone, you can install an app called "LBE Privacy Guard". It lets you install apps which require permission to send SMS, make calls, read contacts, access the network and a bunch of other things, but then prompts you when an app tries to do any of these things and lets you block/allow it temporarily/permanently.

CyanogenMod allows the user to remove specific permissions from specific apps. If more users used CyanogenMod, more app developers would become compatible.

From what I have seen, you can only remove those permissions "late". That is, you have to blacklist permissions; you cannot deny them up front. From my understanding this would not protect me fully, since apps could do their thing before I disabled them.

As I understand it: they can't.

If you do not open them manually or restart your phone (if they have the RECEIVE_BOOT_COMPLETED permission), they are not executing. You can install them and revoke certain permissions before they run for the first time.


I also remember seeing that permissions were reset on reboot, but that might have been some other setup, not CM.

I use cyanogenmod and didn't know about this, nor do I think I or most folks will ever remember to do such things.

Why does a weather channel app require recording audio?

To allow them to monitor tornadoes in your area, obviously.

With FB apps, you can go into your App settings and revoke individual permissions that you don't want to give. They still have to be granted to auth the app, but at least you can clean it up very quickly and easily.

Sure, but if they're doing their job, by the time you can navigate there, they've already pulled your current data.

Where is this setting? This is the second time I've heard someone say you could do that but I cannot find any options like that. Settings just has Refresh interval and options to configure notifications.

Asking upfront is also a problem. Asking on demand is much more annoying to the user, but also makes them think about what the app is asking for - as opposed to a list of permissions at install time, which are skimmed over and then forgotten.

And Path has a precedent around asking more permissions than necessary: https://skitch.com/timothee/g911q/skitched-20120207-135815

For their Facebook Connect permissions, they ask for all the permissions… (that was true at the beginning of November; not sure they've changed it since)

> But even there, no one actually reads those permissions lists, and apps routinely ask for far more than they need.

Lots of people do read those permission lists, and they are one of the most commonly referenced complaints in app reviews. A firestorm arose when an Angry Birds update inexplicably added the ability to send SMS messages.

Further it focuses a spotlight when an app does request a permission that seems out of place. Ideally when Google evaluates app for their "staff's picks" (the "optional curation") they consider threat surface area.

You'd think people would learn. I mean, this is the original scandalous practice in mobile apps. See this from 2008:


I'm at a loss as to how this is surprising anyone. How did people think that these apps found other users you know? This is built to support: A) finding existing people on the service, and B) (theoretically) sending you notifications if a friend joins. If you want those features (and it seems that users do), this is the only way to do it. Admittedly, most apps are more explicit about it with a "find friends from address book" prompt, but if you want to lower the friction as much as possible, this is the way to do it.

So does Facebook, as shown 107 days ago, and continuing today. http://news.ycombinator.com/item?id=3145857

Yeah, I've noticed that too. After my first sync with Yahoo! Address book years ago, I ended up getting Facebook suggestions to people I didn't really know but who were in my Yahoo! address book.

A few years ago, at a “Facebook developer garage” event, I personally asked Dave Morin (Path Founder and CEO) a very similar question to the one in today's news. At the time, he was in charge of the Facebook developer platform, having not yet left Facebook to start his own social network. I asked him about the amount and variety of information Facebook gave freely to applications using their platform (there were far fewer privacy controls at the time).

I also asked about whether and how Facebook intended to enforce their platform terms of service, which essentially said apps could use such information temporarily, but that they must discard it no later than 24 hours after a user's most recent use of an application.

I remember that in answering those questions, he essentially said that his preferred approach was not to try and make violations of those terms difficult or impossible through technical means. His inclination was to give apps the benefit of the doubt, and deal with troublemakers if and when issues arise. He also relayed a story about his college days, in which he said that his study of the workings of government was better preparation for his web career than anything directly related to technology.

One can fuel a lot of user engagement by scraping the address book and notifying users every time one of their contacts signs up.

The "Beluga" app did this, without user permission or warning, and it boomed ahead of competition that did not. "Kik" did something similar. "Industry best practice" indeed.

Sadly, it's a winning strategy, and will continue to be until someone fixes the rules of the game.

Just a quick note to also point out, regarding this from the CEO: "if you'd like your account deleted, including all data, we're happy to do this as well."

I emailed to have my Path account deleted a few weeks ago and was told it had been 'deactivated'. After querying this, it was confirmed that they did not yet have the functionality to delete your data, only hide it. Worrying that he said they can.

This sounds like a wonderful Cydia / iOS Jailbreak app opportunity. MobileSubstrate allows easily hooking system methods. An app which replaces the Address Book API with something returning empty data for all non-system apps seems pretty easy and quite urgent.

Morin and company need to provide an "opt-out and wipe all of my contact data now" option if they don't want legal action and backlash, as well. Simply making the app require opt-in to share this data in the future isn't nearly enough (and, especially in the EU, isn't legal).

Update: I'm working on a MobileSubstrate tweak to neuter AB* functions in non-Apple apps, and it's now possible to get your information wiped from Path... by emailing service@path.com.

From the Wikipedia entry [http://en.wikipedia.org/wiki/Path_%28social_network%29]:

"Contacts are suggested from among persons in a user's electronic address book, as well as people with whom the user is communicating by email."

It's been there for over a year. http://en.wikipedia.org/w/index.php?title=Path_%28social_net...

Though it's quite a difference whether the contacts are checked client-side or all sent over to THEIR server including all (unnecessary) info.

How do you propose to check them client side? :) You still have to send each contact over to the server...

As Matt Gemmell proposed: send over the hash codes of the email addresses or whatever else needs to be compared.

Yeah, but you still have to store the hashes server side in the case where you want to notify people when their friends join (which is how Path was using the data).

Why am I not surprised - this is from a Facebook alum, after all. Uninstalled Path; kind of a dealbreaker, since its whole angle is privacy and the CEO can't even get this one basic thing right.

I e-mailed Path and they replied. The only thing I am worried about is how to verify my information is actually wiped out. And what about all my other friends who have me in their address books? How do I get rid of that?

Zack S. FEB 08, 2012 | 05:19PM PST Hi Jeff,

Thanks for getting in touch with us! I have erased your contacts and their information from our servers.

On behalf of the team, I’d like to apologize for any privacy concerns that you may have had. Our current release of Path for Android requests permission to access your address book. In the next iOS release, we will have this same permission request added.

Until the update is released for iOS, selecting “Add Friends” will display the names of contacts that you have stored on your phone. But now that you’ve opted out of contact uploading, we will never re-store this data on our servers.

Please let me know if there is anything else I can do to help you. I’m more than happy to address any further questions or concerns that you may have.

Best, Zack

So I have read the responses and it seems that there are a few schools of thought here and I just want to make sure that I understand the possible solutions.

Per user Steko is this the ultimate solution to the problem -

(0) we get your permission (is this in the EULA, the in-app screen, or the privacy page of the app?)

(1) we check for your contacts in our database (by hashing your contacts). The method of hashing is yet to be determined, as is what info to hash and match -- if anything other than the email address, maybe the phone number.

(2) we let you know if any matches are found.

(3) we throw away all your data afterwards.

My question is - do you go through steps 1,2,3 each time that you boot up the application or click the add connections button. Compare the hash, report on the matches and dump the rest? Rinse and repeat?

Is the issue more the keeping the address book for later matching, or the passing it in the clear part?

If you were going to have an opt-in or disclosure what would you want it to say?
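The steps above can be sketched as a hash-match-and-discard flow (names, identifiers, and structure are illustrative, not Path's actual API):

```python
import hashlib

def h(identifier: str) -> str:
    """Hash of a canonicalized identifier (a slow hash in practice)."""
    return hashlib.sha256(identifier.encode()).hexdigest()

# Server side: hashes of identifiers belonging to existing members only.
member_hashes = {h("+15551230001"): "alice", h("+15551230002"): "bob"}

def find_friends(contact_hashes):
    # Step (1)/(2): intersect uploaded hashes with member hashes, report matches.
    matches = [member_hashes[c] for c in contact_hashes if c in member_hashes]
    # Step (3): non-matching hashes are simply not retained anywhere.
    return matches

# Client side: hash the address book before upload, per step (1).
address_book = ["+15551230001", "+15551239999"]
print(find_friends([h(n) for n in address_book]))  # ['alice']
```

Under this design the whole flow would indeed run each time the user taps "add connections" (compare, report, discard); the trade-off versus Path's approach is that the server cannot notify you later when a contact joins, since it kept nothing to match against.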

Combine this with the fact that sometimes syncing your iPhone with a corporate server brings the whole company address book onto the phone. They must have a lot of contacts stored.

It would be nice to go a single week without seeing how utterly complete the notion of privacy has been destroyed.

The responsibility rests in large part on the shoulders of the geek community -- the enablers. I find this entire thread surreal. These were the obvious issues that were front and center way back when chat servers showed up. Some have been raising this issue both publicly and privately since the early '90s, if not earlier, and were marginalized precisely for being bad-news bears.

Here is to RMS and his kind.

At this point, if you want a solution, you need to contact your representative and demand data and electronic privacy laws like those written into the constitution of Switzerland.

Here's a question: was there a concept of privacy 100 years ago? Or 500? Whenever someone had a baby, or bought a cow, or cheated on their spouse, didn't everyone in town know about it? Did they ask people's permission when the first telephone book was published?

Or was the first response, "hey, that's an invasion of my privacy!" I doubt anyone said that before the 1950s.

I think privacy is an invention of the late 20th century. I am truly curious if any real notion of "invasion of privacy" existed for most of man's history.

This is a patently absurd notion.

I haven't heard an assertion so patently foolish and ill-considered since the Path CEO claimed that uploading every user's "little black book" onto the Path servers without permission or notification was an "industry standard best practice."

What a bunch of hogwash.

Your comment seems trollishly silly, but... the internet and residential electricity are also both inventions of the 20th century - I guess we could destroy those too without bothering you?

Can someone explain to me exactly how I could be harmed by this? My contact list is just a list of names and phone numbers of people I contact. Even if I had an escort service in there or something, I don't think anyone on Path's end is individually looking through the data.

An employee at Path might very well decide to start looking through that data. There have been other cases where employees gave in to temptation to access someone's data. Imagine for example if a celebrity is involved and someone decides to leak their address book.

Now one would hope that employees wouldn't have unrestricted access to this data, but one would also hope Path wouldn't do this in the first place. The fact that they collect all this information in the first place, unnecessarily and without consent, does not inspire much confidence in their internal safeguards for access to this data.

Also, if anything were to happen to the company, it's hard to know what hands all that data will end up in.

I don't know any of the Path employees personally, why would they decide to go after me? The possibility seems rather remote.

Maybe not you personally, but think of the News of the World scandal going on right now. People's mobile contact info is valuable to a number of organizations in ways not immediately apparent.

Also, for apps with private messaging systems like Path, I'd be far more concerned about rogue employees looking through the messages I send than contact list data. I'm far more likely to have sensitive information in there.

Don't forget about the possibility of Path's data being stolen. Your contacts probably contain enough info for a criminal to carry out a pretty good phishing expedition, for example.

My Facebook friends list is already public. Seems like a motivated criminal already has more than enough info to carry out a phishing expedition. I've never really seen targeted, personalized phishing though, most phishing seems to be broad and generic "enter your bank info here" style.

This common travel scam relies on knowing of at least a pair of friends:


More details would probably make it work better, too.

"I don't think anyone on Path's end is individually looking through the data."

Oh, really? You'd be surprised. http://gawker.com/5637234/

The question isn't whether or not they are, it's the very possibility of them being able to.

But being able to do what precisely?

The fact that address-book upload should be opt-in is obvious, and was stated so many times here it was boring. But there's also the other side: I, and quite a few people I know, have good reasons to want an opt-out from being discoverable this way. If someone knows my email address, let them send me an email with an invitation code. They shouldn't even know I'm signed up until I accept.

Though I also don't really think it's something private companies should solve. Now, I can of course avoid services that let me be too easily findable, but the proper solution is to make said opt-out required by law. Otherwise it's just not beneficial for the company to provide it.

I don't have a problem with this as long as they ask permission up front before doing so. I don't recall having been presented with that question myself though.

Disappointed in Path, especially since their focus was on a more private, tightly knit social network.

So I download an IM app that automatically finds your friends based on your phone directory. I launch it, and scrolling through my friends list I see my mom. Some contacts later, I see the real name of a hooker. Both my mom and the hooker are on this IM platform... just a click away from chatting with me under the same identity. This can be more than creepy; fortunately this is a made-up example ;)

I thought about this with WhatsApp. This is scary because, while we are used to having multiple emails for different parts of our lives, juggling multiple phone numbers is still a chore despite services like Google Voice.

I feel like we're missing part of this story. :)

It's a double-edged sword. Many users like that they don't have to create a username, remember a password, confirm an account (usually), etc., and that all of their friends who've installed the app 'automagically' appear in it. WhatsApp has something like 90% of smartphone users in Sweden.

Phone numbers cost money, and multiple emails are usually a chore still.

I'm not sure about the hooker part, but when I finally ran through the Facebook mobile app I discovered some of my "friends" have been using aliases.

I don't know if any of you remember, but this is why Guido van Rossum quit using Twitter. The official Twitter client for Android uploads your entire contact book without showing more than a notification stating 'find your friends' or similar; you click this notification and by that time it's already too late. More on that here:


I wrote a MobileSubstrate (jailbreak only, sorry!) tweak to block the use of ABAddressBookCopyArrayOfAllPeople, the most common method of stealing contacts in this manner.

It's rough around the edges, but check it out: http://news.ycombinator.com/item?id=3564968

It should be available in the BigBoss repository as "Address Book Privacy" sometime tomorrow.

Even if Path buried this disclosure deep in a TOS page, would anyone read it? I just posted a startup idea I have to generate easy-to-read summaries from website TOS pages: http://clearsignal.posterous.com/do-we-value-our-laundry-mor...

Call me crazy, but I prefer it when companies do this. If I'm interested in using their service, then I'd be happy to be alerted when my friends sign up for it.

That being said, I wholeheartedly agree it should be opt-in (or at least have an opt-out) for people who are concerned about their personal data.

Well that's the whole problem isn't it?

If they had asked up front for permission this article would not have been written.

Has anyone looked at Path's privacy policy?

Do they explicitly state what personal information they upload to their servers, what they use it for, and how long they retain it?

If not then they're breaking the law in many countries, regardless of what Apple's current developer guidelines happen to be.

Dave Morin's (Path CEO) response:

Arun, thanks for pointing this out. We actually think this is an important conversation and take this very seriously. We upload the address book to our servers in order to help the user find and connect to their friends and family on Path quickly and efficiently as well as to notify them when friends and family join Path. Nothing more.

We believe that this type of friend finding & matching is important to the industry and that it is important that users clearly understand it, so we proactively rolled out an opt-in for this on our Android client a few weeks ago and are rolling out the opt-in for this in 2.0.6 of our iOS Client, pending App Store approval.

Dave Morin Co-Founder and CEO of Path

This is an accident waiting to happen. Whoever does this is doing it wrong. The case was well-made here by Colin Percival (the tarsnap guy) in his blog: "Playing chicken with cat.jpg" http://www.daemonology.net/blog/2012-01-19-playing-chicken-w...

>> "The answer isn't for (any company) to prove that they can be trusted; the answer is to ensure that their customers don't need to trust them ... The best way to avoid privacy breaches is not to formulate a detailed privacy policy; it's to reduce your capabilities so that you're unable to violate anyone's privacy"

In our Q platform, we specifically upload only the hashes of the address book. There is absolutely no need to have the actual email or phone number of people in order to find "who is on the service". However, when you INVITE people, we specifically download the full email address because we send them an invitation ourselves.

This is just one out of 100 things that our platform does while solving the usual stuff of apps: user signups, importing address books, invites, etc. However, we applied for a patent on some of the stuff we do. Even though I personally don't like patents, it's the thing to do in the current environment. Going to write a blog post about it soon.

So my entire address book is on Path's servers right now?

Well shit. How do I get it off their servers?

One-way hashing the phone numbers and emails would at least be a good solution to alleviate the privacy concerns, while still allowing you to connect with your friends on Path.

One-way hashing phone numbers cannot really be one-way unless you are using an inherently slow hash function.

Without one, generating a table of, say, 10^10 hashes is within range of almost everyone (especially on a GPU). Even at a slow 1 ms per input, the whole table takes only 10M seconds, or about 4 months.
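To make the point concrete, here is a toy sketch (with a made-up number and a deliberately tiny keyspace so it runs instantly) of reversing a fast unsalted hash of a phone number by plain enumeration; scaling the same loop to 10^10 full numbers is exactly the arithmetic above:

```python
import hashlib

def sha256_hex(s):
    return hashlib.sha256(s.encode()).hexdigest()

# The digest a service might store instead of the raw number.
leaked = sha256_hex("+15550123")

def crack(target, prefix="+1555", digits=4):
    # Phone numbers come from a small, structured keyspace, so a fast
    # hash offers no real protection: just try every candidate.
    for n in range(10 ** digits):
        candidate = f"{prefix}{n:0{digits}d}"
        if sha256_hex(candidate) == target:
            return candidate
    return None

print(crack(leaked))  # "+15550123"
```

Salting doesn't help much here either, since the salt must be shared with every client doing the matching; only a deliberately expensive hash raises the cost, and as noted above, even 1 ms per guess only buys months.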

Does this mean that the standard HTTPS stack on the iPhone is insecure? Shouldn't certificate verification fail when it attempts to send data via the mitmproxy?

Actually I think you can manually add SSL certs to the iPhone, so just add your own cert and the iPhone will trust your MITM.
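Right, and that's exactly how mitmproxy-style interception works: once its CA certificate is installed on the device, the system trust store accepts the proxy's forged certificates, so standard verification passes. An app that wants to resist even that can pin the server's certificate. A rough sketch of the idea in Python (the pinned fingerprint is a placeholder, and a real app would pin inside its TLS stack rather than reconnecting like this):

```python
import hashlib
import socket
import ssl

def cert_fingerprint(der_bytes):
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_bytes).hexdigest()

def fetch_server_fingerprint(host, port=443):
    """Connect and return the fingerprint of the server's leaf certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return cert_fingerprint(tls.getpeercert(binary_form=True))

# The app ships a known-good fingerprint. A MITM proxy must substitute
# its own certificate, which will not match the pin -- even when the
# proxy's CA has been installed on the device and trust checks pass.
def connection_is_pinned(host, expected_fingerprint):
    return fetch_server_fingerprint(host) == expected_fingerprint
```

The trade-off is operational: rotating the server certificate then requires shipping an app update, which is why pinning is usually done against the public key or an intermediate CA instead of the leaf cert.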

I feel like this is [unfortunately] a regular practice of app makers nowadays. I'd love to abide by "let the industry govern itself" but I don't think that's realistic. I've seen so many apps that have abusive and opaque permissions.

Is there any regulation to protect consumers here? If not, are any legislators drafting any? Would the FTC step in, or does this only happen when a giant like MSFT/GOOG/FB makes a misstep?

> industry best practice

Did he say that with a straight face? Heard a lot of corporate BS in my time but this takes the cake.

This is Apple's fault for allowing all apps access to the address book. But there is a deeper issue here: trust. Just because I leave my office unlocked doesn't mean my colleagues can steal from it.

I love this app and had great hopes for it but trust is a limited commodity and Path just lost mine.

There are a dozen or so people right in this discussion repeating it with a straight face: that this behavior is normal and acceptable.

I know for a fact it is illegal in Europe, know for a fact it is a violation of their contract with Apple, and I am almost positive it is criminal in the US as well.

Therefore the names and affiliations of the engineers here who are claiming data theft of private information is normal are very interesting to me, and I am noting them carefully, as should we all.

Albeit too late, after reading this I uninstalled Path from my Android. I did not buy into this.

1. I just changed my phone #.

2. I notified all of my contacts to change their phone #s.

3. I contacted both Apple and my state senator.

I am outraged by this scandal, and I still can't bring myself to believe that Path has been collecting this sensitive personal information. My 6-month-old's pediatrician's # is in my phone. If this were EVER exposed or shared with a 3rd party, I can only imagine what kind of damage could occur. Path should suffer for this. I forgive Apple for secretly tracking my iPhone's location for a year, but I DO NOT FORGIVE PATH. Not this time. This went too far. Dave Morin should know better. I bet an engineer voiced that this felt morally wrong, and Path just fired him. This is just wrong. A defining moment in our industry. We need to stand united on this issue, and just try to move forward.

Your child's pediatrician's phone number will somehow cause inconceivable damage if it gets out? I bet calling up all of the pediatricians in your town phishing for this info would be much more productive than worrying about it being stolen from a Path database.

I think you need to crank the irony up a little further, some people aren't getting it.

You can contact Path and they will remove the data for you.

I always wondered what the purpose of Path really was, given that it offered little over and above Facebook itself (apart from an arguably nicer UI). It would seem the purpose is to data-mine users' handsets.

It's worth noting that the AddressBook API dates back to mid 2008:

"The Address Book framework provides access to a centralized contacts database, called the Address Book database, that stores a user’s contacts."

It has been there since iOS 2.0

I didn't know HTTPS requests could be traced so easily from a proxy. I was planning to start coding an authentication endpoint with SSL, but obviously it is traceable that quickly. Is there no way to avoid that?

Quora best handles this situation. There can be a lot of benefit for the user to have the contact lust on the server, but it needs to be (1) transparent, (2) obvious, and (3) come with a delete button.

I'd like to keep my contact lust private, thank you.

Moral of the story: don't target techies as your end users. They'll just look under your hood to make sure you're not doing anything embarrassing like this, or passing back clear-text passwords in JSON.

That's not the moral of the story; techies are great early-adopters and end users for that reason.

The moral is to treat customers' privacy with the utmost respect.

Anyone know if they have an Android app? Does that share this feature?

Android asks for your permission about what information you want to give the app. iOS simply gives an app whatever it asks for, without asking you first.

Interesting. What else can an iOS app get access to without permission? location? browser history? other installed apps list? emails? notes? pics? vids? music list? podcast list? itunes username?

Location: Permission is asked for

Browser History: There is no way to communicate directly with what Mobile Safari stores.

Other installed apps list: Apps are sandboxed, so it is impossible to know what else is installed. If you've developed one of the other apps you can share the same App ID, which gives you access to the same storage space, so you could create a flag to indicate one of your apps has been installed. Some apps respond to certain protocols, so you can ask iOS if a given protocol will be handled; if it returns yes, then you know the app is installed. Again, because they are sandboxed you really can't do anything harmful, and responding to the protocol only allows the other app to receive information, not expose it.

Emails: No, the only way you can do anything with email is prompt the user to compose an email.

Notes: Same as Mobile Safari.

Pics: You can display a popup to the user that asks them to select an image from their camera roll/iPhoto and if they select a photo you then get a reference to an object that represents the photo. You can't just search their camera roll.

EDIT: rbritton points out that with the AssetLibrary framework you can actually search through all pics/videos and for some reason it gives a location access prompt when you do. http://news.ycombinator.com/item?id=3563336

Vids: Same as pics.

Music List: You can get a list of every song in the user's library without asking for permission:

Podcast list: Same as music list.

iTunes Username: To my knowledge there is no way to access this but I've never been asked to so I really haven't spent time looking. In theory because you can access the Address Book you could make a best guess at which contact is the user and then assume one of their emails is their iTunes username.

You can access the Picture/Video library since iOS 4. It does prompt at least once for location access (apparently since they can contain GPS metadata), but it does not mention anything about why it's asking for that location access.


Really? It asks for location access to get access to your asset library? That is pretty stupid.

Thanks for the heads up. I was unaware of the AssetLibrary framework.

This is because the photos contain GPS data about where they were taken.

Ah that would make sense. So that would mean that if you turn off location services any app could access your asset library without any prompt then. Good to know.

I can say for sure that iTunes username is not exposed. In fact, they won't even provide an opaque user ID, which makes correlating purchases through in-app purchase with server-side user accounts very frustrating. You can sort of fake it by correlating with a device UDID, but that is leaky and has a lot of edge cases.

Furthermore, they've officially deprecated the UDID as of iOS 5, which means it'll be going away sometime after that.

In theory, all iOS apps are supposed to be reviewed by Apple and not allowed if they do Bad Things™. Walled garden and all that.

In theory.

So my choice is easy, but for the life of me I can't figure out how to delete my Path account. Both online and mobile interfaces appear to be missing this function. Help?

I was thinking of using Path as a personal diary. But not anymore. Just deleted the Path app. I would suggest everyone do the same. A lesson for Path and others.

You know... I think Google uploaded my entire contacts list to its servers, and I don't recall being informed very clearly about that either.
