We are sorry (path.com)
533 points by revorad on Feb 8, 2012 | 211 comments

Only problem is: It was not a mistake.

They did this only to cover their asses, and that has been the only concern they've ever had. That they had already tried to push out an opt-in version was, of course, only out of fear of exactly what just happened.

I'm sorry, I'm all for public apologies, and I truly believe that it is in times like these that companies have a chance to really prove themselves and turn a mishap into something positive (and come out stronger than ever before). They have tried to do that, and for that I give them credit.

But. It was not a mistake. And this sentence really shows why: "Through the feedback we’ve received from all of you, we now understand that the way we had designed our ‘Add Friends’ feature was wrong."

They did it deliberately; there was no mistake anywhere in implementing this, nor in their intentions. If they honestly didn't understand that what they did was wrong, they don't deserve to be trusted again, not ever. And if they did understand that it was unethical, which they undoubtedly did, it is even worse.

Their trust is not worth anything more than what they think they can get away with. The only thing that is different today from yesterday is that they think they can get away with less.

This desperately highlights why both Android and iOS need a way to spoof contacts for apps (return an empty list). Some Android developers have solved this by having two apps in the market, one "private" version that requires fewer permissions. But that's a kludge (one that I really appreciate) that almost no one uses.
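The spoofing idea described above (return an empty list when access is denied) can be sketched in a few lines. This is a hypothetical illustration in Python, not any platform's actual API:

```python
# Hypothetical contact-spoofing gate: if the user denies access, the app
# receives an empty list instead of an error, so code written against the
# old always-allowed behaviour keeps working.

def get_contacts(real_contacts, permission_granted):
    """Return the user's contacts only if permission was granted.

    Returning an empty list (rather than raising) keeps older apps happy:
    a denial is indistinguishable from an empty address book.
    """
    if not permission_granted:
        return []  # spoofed: the app cannot tell denial from an empty book
    return list(real_contacts)

contacts = [{"name": "Alice", "phone": "+15551234567"}]
print(get_contacts(contacts, permission_granted=False))  # []
print(get_contacts(contacts, permission_granted=True))   # full list
```

The design choice is that denial degrades gracefully: apps not written for the permission simply see "no contacts" instead of crashing.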

You seem to define a mistake as 'an error in the code'. However, their use of the term to mean 'an error in judgement', or 'an error in valuing users' privacy', is also valid. When someone does something immoral, saying that it was a mistake because they didn't think it was immoral at the time (but now realise it is) can, in many situations, be a good response.

Of course, there are some things for which no amount of apology could ever bring about true forgiveness (think godaddy). I personally don't feel this is one of those times, but everyone is entitled to their opinion.

EDIT: Switching "amoral" to "immoral"

Sorry to pick on semantics, but wanted to clarify this because I initially was confused when reading your post: you meant "immoral," correct? "Amoral" does not mean "morally wrong," but rather refers to things which are morally agnostic. It actually seems to me that Path believed their actions were amoral, that is, not registering anywhere on the moral spectrum (neither right nor wrong).

Well, as long as we're picking on semantics, that's not what amoral means. Amoral means lacking _regard for_ morality. You use it to describe a person or other entity that is unconcerned with the morality of their actions. Which fits Path pretty well.

Both uses of "amoral" are correct.


"Amorality, the absence of morality; for example, a stone, a chair, or the sky may be considered amoral",


"not involving questions of right or wrong; without moral quality; neither moral nor immoral."

I think Wikipedia is in the wrong here. In fact, if you read the article linked from the disambiguation page, it strongly supports the definition shown in all dictionaries, and makes no mention of rocks or chairs being 'amoral'. Regardless, none of this bears out the idea of 'amoral' meaning morally agnostic, i.e. actions having no moral component, and your initial semantic nitpick was itself incorrect.

If by "all dictionaries" you mean no dictionary I've ever read, nor the one quoted for you right here. You're wrong.

Dictionary.com: having no moral standards, restraints, or principles; unaware of or indifferent to questions of right or wrong: a completely amoral person.

Merriam-Webster: lacking moral sensibility <infants are amoral> | being outside or beyond the moral order or a particular code of morals <amoral customs>

Apple dictionary: lacking a moral sense; unconcerned with the rightness or wrongness of something : an amoral attitude to sex.

thefreedictionary.com: Lacking moral sensibility; not caring about right and wrong.

Admittedly, some of these do mention definitions along the lines of "having no moral component", so it looks like everyone was wrong. Hurray!

I suspect, if you probe more deeply, that some of the Path developers were familiar with how this problem is normally solved and just copied a common design pattern. A large number of iOS applications supposedly upload the contact list to make it easier to find friends server-side - I further suspect that many, many of the popular social apps do this.

Hopefully at least six good things will come out of this:

  1) Social apps immediately remove the "upload the contact list" code from 
     their apps.
  2) Social apps come up with a more privacy-clueful way of searching for 
     your friends.
  3) Social apps (all apps, ideally) focus more on user privacy.
  4) Apple requires permission to be granted before allowing an app to read 
     your contact list.
  5) Apple is more explicit about what app developers are _not_ allowed to do 
     when transmitting information off the iOS device.
  6) The app review process adds a check to see if certain user-private fields 
     are accessed (Contacts, Photos) - and ensures (through audit, or 
     confirming with the developers) that private information is not 
     being uploaded without opt-in.

If some or all of these things happen, then I'm actually happy what Path did was publicized. They've deleted 100% of the contact information from their servers - people now have to opt in to add it back.
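One more privacy-clueful friend search, for example, would be uploading only salted hashes of normalized phone numbers instead of the raw contact data, and matching hashes server-side. A hypothetical Python sketch (note that plain hashes of phone numbers are brute-forceable, since the number space is small, so this reduces rather than eliminates the exposure; the salt and helper names here are made up for illustration):

```python
# Sketch: the client uploads only salted hashes of normalized phone numbers;
# the server matches them against hashes of its registered users' numbers.
import hashlib

SALT = b"app-wide-salt"  # hypothetical; a real design needs more care here


def hash_number(phone: str) -> str:
    """Normalize a phone number to its digits, then hash it with a salt."""
    normalized = "".join(ch for ch in phone if ch.isdigit())
    return hashlib.sha256(SALT + normalized.encode()).hexdigest()


def find_friends(contact_numbers, server_index):
    """server_index maps hash -> user id, built from registered users."""
    return [server_index[h]
            for h in map(hash_number, contact_numbers)
            if h in server_index]


# Server side: one registered user, indexed by the hash of her number.
server_index = {hash_number("+1 555 123 4567"): "alice"}

# Client side: two address-book entries, in different formats.
print(find_friends(["1 (555) 123-4567", "555-999-0000"], server_index))
# -> ['alice']
```

Normalizing before hashing is what lets "+1 555 123 4567" and "1 (555) 123-4567" match; the server never sees a raw number, only its hash.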

It seems to me to be more a case of developers taking the easiest option, rather than spending some time considering a more secure, less creepy way of doing what they wanted to do.

I could not agree more. I think they've set a reasonable precedent for dealing with such oversights.

The fact is, there are innumerable factors that affect people's decision-making in situations like this. The ability to decide whether or not a course of action is ethical is greatly affected by what your competitors are doing, groupthink, incentives, time pressures, etc. I could keep going. And yes, your ability to "get away with it" is also a factor.

Now you may say, "Who cares what the factors are? A wrong decision is a wrong decision." And you're right. However, as a practical person who wants to see real change come about, I cannot be satisfied with the run-of-the-mill, "They did it because they're evil and untrustworthy" response.

People are rarely inherently evil. I find it hard to believe that this group of engineers is really so different from you and me. It's likely that all of us grew up in similar environments, have gone through similar experiences, and possess similar moral beliefs. So aren't you the least bit curious why they're capable of making a decision you could never imagine yourself making? I think simply dismissing them as untrustworthy is an irresponsible and short-sighted reaction. Human beings are more complex than that.

A lot of teachers believed that the only students who cheat are the dishonest ones. Well, some clever psychologists came along and -- lo and behold -- they showed that under the right circumstances, you can convince almost any student to cheat. That's the nature of humans.

Like it or not, we react to situations much more than to our personal moral codes. No amount of shaming greedy bankers, book-padding executives, dishonest politicians, privacy-invading programmers, etc is going to work. If we want to effect real change, we need to change the systems that allow for and incentivize this type of behavior.

I highly recommend reading up on basic human psychology. Influence (by Robert Cialdini) is a good place to start. Charlie Munger's writings, although unorthodox, are also great.

>The ability to decide whether or not a course of action is ethical is greatly affected by what your competitors are doing, groupthink, incentives, time pressures, etc.

No, it's not. You do not kill a person over any of these. You do not kill someone's trust in you over any of these.

>I cannot be satisfied with the run-of-the-mill, "They did it because they're evil and untrustworthy" response.

Then how about: they are a shitty, crappy company, unconcerned with the ethics of the matter and more concerned with what they can get away with. You know that they must have spent considerable time and effort to enable their app and service to steal all that contact data in the first place, right?

>I think simply dismissing them as untrustworthy people is an irresponsible and short-sighted reaction. Human beings are more complex than that.

Irresponsible, irresponsible? What shit are you smoking, chief? I have zero responsibility for their actions, or the public outrage against it, or my own reaction to crap. Let them rot in hell for all I care.

>convince almost any student to cheat. That's the nature of humans.

I am alarmed; you are now equating cheating under the right circumstances to planned and intentional thieving under business as usual.

>I highly recommend reading up on basic human psychology

and I highly recommend some common sense.

  No, its not. You do not kill a person over any of
  these... they are shitty crappy company who are
  unconcerned about ethical matters of things
You're oversimplifying human behavior. It's not as simple as "bad people do bad things." There are COUNTLESS examples in which large groups of decent people have acted in horrifying, deplorable ways. And psychologists know enough to reproduce this type of behavior in a lab. Read about the Milgram experiments, in which researchers were able to convince average American citizens to administer what they believed were painful electric shocks to a stranger, simply because an authority figure told them to.

  I am alarmed, you are now equating cheating under
  the right circumstances, to planned and intentional
  thieving under business as usual.
Circumstances are circumstances, whether we're talking about business or school. In this particular circumstance, you have Path participating in a market where "the police" (Apple) simply allows this behavior to go on. And where there's tremendous social proof, because everybody else is doing it. And where there is tremendous groupthink, because the only people they consulted were themselves. And where there was tremendous incentive, because they want their company to be successful. And where they can rationalize their decision by saying, "Well, we won't use the data for anything bad," without any oversight. All the ducks are in a row. It's just the type of perfectly disastrous environment that could entice even the most noble of people to make bad decisions.

  irresponsible, irresponsible? What shit are you
  smoking chief? I have zero responsibility for their
  actions, or the pubic outrage against it, or my own
  reaction to crap. Let them rot in hell for all I care.
It's simple: either you care more about verbally abusing people who behave poorly, or you care more about preventing poor behavior in the future. If you claim to belong to the former group, then fine, keep doing what you're doing. But it never fixed anything in the past, and it won't in the future. If we want to bring about real change, then we're going to have to concentrate on the immoral systems that allow and incentivize bad behavior.

>You're oversimplifying human behavior

The facts in this case are simple. The judgement is clear. You seem to be justifying their actions. There is no moral justification.

>type of perfectly disastrous environment that could entice even the most noble of people to make bad decisions

See: because of this incident we can now clearly tell which companies are noble and which were only pretending to be. "Ducks in a row" is not a moral argument.

>Either you care more about verbally abusing people who behave poorly, or you care more about preventing poor behavior in the future

I am sorry, it is not an either-or, and not the way you put it either. You admonish people for __bad__ behaviour because you care about preventing it in the future.

  You seem to be justifying their actions.
  There is no moral justification.
You can't simply assert that you are right and I am wrong. I gave you clear examples under which normal people can be influenced to do bad things. If you don't think that's possible, then cite errors in the evidence. But if you're going to simply ignore the evidence, I can't take your responses seriously. There's no point in continuing.

  because of this incident we can now clearly
  tell which companies are noble and which were
  pretending to be so.
  You admonish people for __bad__ behaviour because
  you care about preventing it in the future.
Humanity has been admonishing the immoral behavior of companies/politicians/etc. for millennia, and yet it still continues to this day. Appealing to morality does not work, has never worked, and never will work. Unless you fix the system, you are accomplishing nothing in the long run. What you're doing is the equivalent of blowing on a pot of boiling water to try to cool it off. Sure, it may get a degree or two colder for a few seconds. But unless you take the pot off the fire, the water will keep boiling.

> if they honestly didn't understand that what they did was wrong they don't deserve to be trusted again, not never. And if they did understand that it was unethical, which they undoubtedly did, it is even worse.

This is precisely my reaction to Facebook's Beacon. I decided that they were either completely inept or amoral. In either case I don't trust them.

You are using a very fine definition of "mistake", more like a synonym for "bug" or "defect".

But it's also possible for management to make a mistaken decision, and that's what Path means here.

I, like many others in this thread, think this is a fairly gross over-reaction. As engineers, we're trained to dig up problems and create solutions -- and a big part of this process is understanding what data can be made available to you and how you can use it to make your product better. I sincerely believe that they saw the immense potential of having this information available to them and ran with it under the excitement-induced delusion to the effect of, "who could be unhappy with this when it brings so much value to the table?"

I think their flaw was either in not polling their user base beforehand or in not making it opt-in to begin with, but I also think that this oversight happened because they truly believed in the usefulness of what they were doing.

Then again, I still believe that Google isn't trying to be evil (nor do I really think they ARE particularly evil for the time being), so take my opinion with a grain of salt.

Yeah this is like BP making excuses for the oil spill. Anyway, it's every iPhone owner's fault that they give away their privacy that easily without even caring. On the other hand it's Path's fault that they seem to have done this without a real plan (what are we keeping the data for? what are we going to do in case of "PR nightmare"?).

You are right. And I am not touching Path with a ten-foot pole. Not in hell. What annoys me is that Apple has caused a few hours of my life to be wasted on this shit, and has me deeply worried about what other crap is uploading all my contact information into some hush-hush "secure" database. And also that they allowed this free-advertisement-seeking shit company to get through their fabled review system. Heck, I want an apology from Tim Cook, and maybe one from the heavens where a visionary soul probably rests forever now.

If the going gets so bad, I will soon end up using the clamshell that I put up with for all those years before the eventful day I fell in love with an "are you getting it?" product. Makes one wonder what all those jerks making the rounds on SOPA and PIPA are doing to protect us from these Path-like shit makers.

You do know that many other social apps do the same thing as Path, right?

Doesn't this illustrate once again that community policing always works better than corporate policing?

This illustrates, rather, that you should think twice before using a service you do not know much about. You can always make rules, and there will always be ways around them. Personal responsibility and awareness are what make the difference in the end. Would you eat just anything given to you while ignoring where it comes from and how it's made?

> Only problem is: It was not a mistake.

I don't think the developers behind the product were thinking about it in a bad way when they did it. It was probably more practical to do it that way at the time and they didn't give it more thought; it's not as if they would ever have considered selling that information.

I get that now it's a big deal since it became a huge product. I wouldn't call it a mistake though; the problem of personal information on the internet is pretty recent (Facebook, Google+...) and developers don't really know how to deal with it yet.

I guess the more we see problems like that, the more developers will educate themselves on the matter.

> Only problem is: It was not a mistake. They did this only to cover their asses and that has been the only concern they've ever had.

Well, you can make a mistake intentionally.

As in: "I intentionally opted for course A, and I realize it was a mistake".

True. What I meant was that they try to make it sound (at least to me) like the act of stealing their users' contacts was a mistake. They do this by saying: "We believe you should have control when it comes to sharing your personal information," etc. It makes it sound as if it was somehow a mistake for those beliefs to be violated. It wasn't.

And if you ask me, that breach of their users' trust is not something that you can just turn around. If they didn't understand that their users might get upset, that only makes it worse (when it comes to trusting them).

The action that Path says they've taken - deleting all the contacts they have collected so far - moves me toward forgiving them, 'cos that action is the only thing they have going for the contact collection being a "mistake".

"Privacy empathy" (no .. pun not intended I swear) seems hard to come by these days.

I see it as "We made a wrong decision"

Ironically, this is exactly the kind of apologetic but side-stepping rhetoric used by the still-serving German president Christian Wulff, so I call shenanigans: nothing but modern rhetoric and modern PR management. This is very much like catching someone red-handed, with their hand still on the murder weapon stuck in the body... and then they make a public apology along the lines of: "Through the feedback I've received from all of you, I now understand that the way I handled this dispute over $20 million, which is clearly mine by the way, was wrong. But I deeply care about ethics and human life, and I honestly believe that body should be allowed to live, so as a clear signal of my commitment to human rights, I will immediately retract my serrated blade from their chest."

> They did it deliberately, there was not a mistake anywhere when implementing this

Exactly - that's why those "oh, I realize that now and really want you all to understand my deeeep commitment to the exact opposite moral values of what I actually did" apologies make me so sick. They completely side-step the fact that it was done deliberately, 100% on purpose, and they basically cover that up by trying their hardest to sweep it under the rug as an "oopsy-daisy!" and let users feel as if thousands of phonebooks beamed themselves totally magically onto their servers and they really had no idea that was happening!

You simply cannot be so detached from reality that you do not worry about reading people's phone books like that.

Want to apologize and really speak through actions? Dave Morin, Co-Founder and CEO, step down immediately because you have deliberately violated human rights and now you are just trying to get away with it, IMHO. And as CEO, you are ultimately responsible.

Human rights? If I'm correctly understanding the issue, their software monitored your contact list so they could notify you when one of your contacts joined the service. You seem ready to throw him before the International Court.

That they've wiped their user data and are giving people the opportunity to use their product in a setting with opt-in sharing seems to demonstrate to me, at least, that they still believe that hosting your contact information would add value to their product, but they now realize that concerns regarding privacy are significant enough to warrant using the product without this feature. To reference a parallel thread, I don't think this is a reflection of morality/amorality/immorality, but rather that this never registered in their engineering oriented brains.

IMHO privacy should become a human right in these surveillance-ridden times, but maybe "human rights" was getting ahead of things - so replace it with "privacy": if you ask me, it was a huge intrusion into the privacy of the users.

Sure, privacy should become a human right. I personally am a very private person. What you need to understand is we live in a world where there is no such thing as privacy. We need to clearly define what is right and what is wrong. What needs to be opt-in and what needs to be opt-out. What we, as a community, need to do is set a standard. We need to establish boundaries so that we can regain our privacy.

Outside of establishing boundaries there needs to be a way to deal with those who break the rules. Sending CEOs straight to the slaughter house doesn't accomplish anything. Companies need an opportunity to react and do the right thing. Especially when intentions were good, and the reaction from the Company is as responsible as Path's.

+1 Path owned their mistake.
+1 Deleted all the data.
+1 Fixed the mistake by publishing an opt-in feature.

They did everything they could to right their wrong.

You also need to realize they didn't commit murder. They weren't 'caught red handed with the murder weapon'. They had some digital data, and then deleted it. It's not like they raped and murdered your wife and family. They didn't commit genocide. They made a minor mistake and fixed it.

If this was such a minor little mistake, why is Congress suddenly dealing with it?


Key paragraph: "We believe you should have control when it comes to sharing your personal information. We also believe that actions speak louder than words. So, as a clear signal of our commitment to your privacy, we’ve deleted the entire collection of user uploaded contact information from our servers. Your trust matters to us and we want you to feel completely in control of your information on Path."

Great save for a bad mistake.

I would bold it if I were them. It's a nicely written message, but it reads like a lot of other PR apologies and it's easy to skim over, buried as it is in the 5th paragraph.

Sometimes you need to make actions speak louder than words. :)

>We are deeply sorry if you were uncomfortable with how our application used your phone contacts

Better would have been 'we are sorry we misused your phone contacts', rather than trying to make the users responsible by invoking their feelings.

Aside: interesting how the concept of theft seems meaningless when applied to copyrighted material, but meaningful when applied to private data.

I don't think that you should assume that they are sorry that they "misused your phone contacts". This, like a lot of companies' efforts, is emblematic of their efforts to find out what people's (ever-expanding) comfort zone is when it comes to giving up their privacy. They (Path) are not looking at this as a philosophical failure (which would be cause for the apology you put forth)...they simply see it as an A/B test result ('sorry about making you uncomfortable').

I would bold it too. That's at the heart of how sorry they really are. If they didn't delete the info, it would be PR blah.

I think you mean "make your words as bold as your actions." :)

I almost wanted to bold it even in the quote. I actually missed it the first time I skimmed the post.

How does that apparent belief square with their actions though? If they really believed that you should have control over your personal information, and that your trust mattered, they would never have uploaded it without user consent.

It's not a "great save", it's a piece of PR flak arse-covering.

I completely agree. The engineers knew what they were doing when they designed the app to upload all my info. This behavior should be illegal. I do not care what their BS press release says - they are just covering their a. I will never trust them ever again </rant>

This behaviour is illegal in the UK.

Great save for a bad mistake.

Still leaves me with a shitty feeling. Basically this boils down to "sorry we got caught", they knew what they were doing.

Not to single out Path, a massive number of apps are guilty of this behavior.

>Not to single out Path, a massive number of apps are guilty of this behavior.

And so is Apple. I am alarmed that any two-bit app can access and upload all my personal contact information for any use it wants.

Hardly. It's good PR maybe.

Why would I not trust them with my contact info, but trust that they have actually deleted it and that there are no copies?

Also, these are developers; there are copies, it's a near certainty.

"we’ve deleted the entire collection of user uploaded contact information from our servers"

Since we're talking about a fantastic breach of trust, I'd like clarification that all copies of all uploaded contact information have been deleted from all servers (even ones that one could argue are other people's), and from all backup media, and further that no effort will ever be made to try to recover this information.

Because, I'm sorry, but anybody who thinks its OK to violate someone's privacy like this is at best someone who is able to easily justify unethical behavior because they think their business might depend on it and at worst a sociopath.

There is not a human on earth who would not object if someone else picked up their phone and started looking through the contacts.

Key giveaway to their untrustworthiness still: "and we want you to feel completely in control of your information on Path"

TO FEEL, as in "we still don't give a shit whether you actually _ARE_ completely in control of your information"

"Great save for a bad mistake."

I'm not so sure.

The updated iPhone app does the right thing.

What do the apps not updated to the latest version do? Do they re-upload the contacts? If they do, what does the server do with the data?

But which mistake?

I assume that you are not aware of the FTC fine involved if they kept the data, right?

It's almost as bad as what Zynga pulled in its first year of operations.

I completely agree. I'm happy they actually took action and didn't just say sorry.

They should prove it by publishing the collection they deleted, otherwise how could we know? :P

Honestly, I keep hoping Apple adds a permission check for the contact list (like they do for GPS location). If the user says no, they should just return a blank contact list (to keep old apps happy that aren't expecting the call to fail).

It's crazy that they haven't added this already. Facebook needs to get my permission to find out where I am, but not to scrape a hundred names, phone numbers, and addresses out of my phone? Bizarre.

I realise you might be using Facebook as an example in a theoretical sense (i.e., that Apple believes protecting your location is more important than protecting your contact database), but in case you weren't: Facebook's "find friends" feature does give you an explanation of what is going to happen and asks you to confirm.

Here's the explanation:

"If you enable this feature, all contacts from your device (name, email address, phone number) will be sent to Facebook and be subject to Facebook's Privacy Policy, and your friends' profile photos and other info from Facebook will be added to your iPhone address book. Please make sure your friends are comfortable with any use you make of their information."

Facebook is trying to push the responsibility for the privacy of your friends to you with this, and by doing so they are violating EU privacy laws.

See: http://en.wikipedia.org/wiki/Data_Protection_Directive

This is one of the few areas where the EU is (still...) ahead of the rest of the world. Facebook should not be able to collect data on your friends even at your request unless your friends explicitly consent to this.

Clearly your friends have no business passing on your data and Facebook has no business collecting it. "Make sure your friends are comfortable" is no excuse for Facebook to go ahead and break the law.

Your neighbours to the north also have laws like this that are on par with the DPD. The EU treats PIPEDA as essentially an implementation of the DPD so that DPD compliant orgs can share data with Canadian businesses.

  > Your neighbours to the north
Confused me a bit, b/c I don't think that jacquesm is from the US. I was thinking 'neighbors to the north' meant Scandinavia or Iceland.

Yes, Facebook was just an arbitrary company/app to use as an example. Nice that they exercise some restraint, though.

With the recent news, I'm confident they'll do just that in the next iteration. I also submitted a bug report to http://bugreporter.apple.com/, because I really believe it's the way it should be (even though it restricts me as a developer).

I also submitted another bug report (15th time, I believe) about iOS 5's stupid lack of support for audiobook chapters and podcasts...

Totally agree. Same thing before letting apps dump the entire iPod library, while they're at it.


So what changed really?

Yesterday morning Path thought it was perfectly OK to scrape users' address books behind their backs, and now they have suddenly acquired a moral backbone and ethics? Please give me a break. What they did today is the only sensible thing there was to try to save the company, so they did it. But should they be commended for that? Hell, no. Would you commend a landlord for dismounting a hidden camera in your bathroom? Doubt it.

The fish rots from the head. The company is still under the exact same management it was yesterday morning. Nothing's changed. I wish Path a slow, painful and very public demise to serve as a dire warning to others in similar positions.

They still think it's okay. It's the users' fault this is a problem. Read carefully:

- users brought to light an issue

- we now understand that the way we had designed... was wrong

- we are deeply sorry if you were uncomfortable

Not sorry. Sorry if and only if you took it wrong.

- We want you to feel completely in control of your information on Path.

You won't be in control, but we want you to "feel" you are.


- stored securely on our servers using industry standard firewall technology

Hmm. My firewall doesn't store data.

- We hope this update clears up any confusion

It's not us, it's you. Stop being confused.

So many people are fooled by these weasel words, I am surprised.

I agree with you 100%, and have no idea why your comment is not at the top.

This is a non-apology apology. It's a "you caught us, and we don't want our company to die" apology.

If they actually cared about your privacy, they wouldn't have stored all your personal data on their servers without your permission to begin with.

> Would you commend a landlord for dismounting a hidden camera in your bathroom? Doubt it.

Whoa, whoa, nice analogy. Except what Path did is even worse: they also retain(ed?) your bathroom-activity photos forever and can do what they please with them.

Surprise: an actual apology, followed by an explanation and how they're going to do it slightly better in future, plus a remedy of sorts.

Better than ATT, VZW, MS, TW, Comcast, or any national US bank.

The fact that they've already deleted all user address book data, and have an updated version of the app available today with a privacy option, is a big deal. I don't know how they managed to get an update to the app approved so quickly (24-48 hours?), they must have worked directly with Apple. A good sign, either way.

IIRC, Path's CEO commented on the post that initially reported this behaviour saying that an update had been submitted that would make it an optional feature prior to the report.

Any dev can appeal to Apple to expedite a review when there is a good reason. I've done it several times and they generally have it reviewed within a few hours and out within 24. It's good that they've done this though and deleted all the data.

Apple is getting a lot better with their approval process, I've had updates approved in 24-48h without doing anything special.

You are entitled to a number of expedited reviews per app

There is no proof that they really deleted all user address book data.

You can't prove a negative. You can only prove the existence of certain data on a particular server; you cannot prove that a company does not have certain data unless you are prepared - and they are willing - to give you full access to audit each and every byte on their systems and to wipe any parts that they can't explain and you can't find a way to decrypt.

Clearly that is not practical, so we'll have to take them at their word. As it stands, I think that if Path is found to be lying about this, it will come back to haunt them big time.

It probably would help if they had an outside auditor to verify the actions that were taken. Still wouldn't be final proof to anyone who believes that they might still be hiding something but is a step further than just saying "trust us".

This is a welcome move from Path. However, "industry standard firewall technology" is gibberish.

Translates as "We don't use encryption because it would cost too much, but we have ACLs so only our staff can look at your personal data"

Hmm, I read it more as "We actually have no idea what we're doing with any of this stuff and we're not giving you any more reasons to trust us with ANY data".

Written by their CEO => icing on the cake.

Agreed. That specific sentence reminded me all too much of snake-oil and charlatanry, and it coloured my reading of the rest of the apology.

Yep that sentence, and ones like it always worry me when I see them in companies' documentation about security. If the best thing you can say about your system and application security programme is that you use a firewall, that wouldn't fill me with a lot of confidence...

Yeah, that was kinda jarring, especially keeping their earlier claim about how silently stealing the address book was "industry best practice" fresh in mind.

Dave Morin, 2010:

Path does not retain or store any of your information in any way.

Source: http://gawker.com/5883549/dont-forgive-path-the-creepy-iphon...

I was very critical of Path yesterday. Their initial response didn't really address the issue and was basically an excuse. But this has restored my faith. I never believed they were doing anything malicious with the data but the fact that they bill themselves as a trusted/private social network leads me to want to hold them to a higher standard.

The big thing in this apology is that they have deleted all the data. That was a good move and shows they listened to complaints. The app update is also smart. Hopefully they will implement a better friend finding system soon (maybe using the hashing ideas put forward in yesterday's HN thread).

It reads like standard PR damage limitation, but it ticks all the right boxes:

* They've admitted responsibility.

* They've shown they understand why they were wrong.

* They've explained what they've done to put it right now.

* They've explained how they intend to proceed in the future.

"If you accept and later decide you would like to revoke this access, please send an email to service@path.com and we will promptly see to it that your contact information is removed."

My only qualm is that you can't revoke the permission from within the app. The opt-out should be as easy as the opt-in.

I suspect this is because technically it would be a PITA to allow users to allow/revoke at their own discretion.

While I agree that it would be nice from the user's point of view, the impact of pulling data from the kind of analysis I'd expect them to be doing is going to be a data analyst's worst nightmare (i.e. holes in your data set can sporadically appear, so nothing is concrete and all analysis must be reverse-justifiable). If you can reduce the frequency this happens but still give users the option, this seems like the best of both worlds.

It's not a perfect solution, but I don't understand why Path doesn't hash the contact details before uploading them, and check against the hashes. You can still infer all kinds of social graph information, of course, but they're at least not consuming raw contact details.

Preface: I will be joining Path this summer, but I do not speak for the company in any way, nor have I spoken with them about the situation. This is a purely technical reply...

You can't guarantee a unique hash. When you hash users' data there is the possibility of collision; this probability grows with every new user. Without identifying data of some sort, it's difficult (impossible?) to get the exact user.

That is incorrect. SHA1 still has no known collisions despite years of research and computing power dedicated to finding just one collision.

Edit: Furthermore since the set of valid emails and phone numbers is a very restricted set of input, it is extremely likely that there are literally no two valid email/phone numbers that SHA1 hash to the same value.

I agree that it's practically not a concern, but the local part of an email address[0] is up to 64 characters in an alphabet of size 72, and the domain part is 253+ characters in an alphabet of size 38, giving the valid email space a size of greater than 3e519, which is enough to guarantee collisions in SHA-512 and all of the SHA-3 finalists.

[0] http://tools.ietf.org/html/rfc3696

If the set of data did contain 3e519 entries then yes, it certainly would generate collisions. However, if you look at a more restrictive set of data, let's say 5 emails per person alive, then you're looking at about 2^35 email addresses, which could easily be hashed by MD5 without a significant chance of collision.
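The back-of-the-envelope above can be checked with the standard birthday bound: hashing n random items into a b-bit digest collides with probability roughly n^2 / 2^(b+1). A quick sketch under those same assumptions (2^35 addresses, MD5's 128-bit digest):

```python
def collision_probability(n: int, bits: int) -> float:
    """Birthday-bound approximation: probability of at least one
    collision when hashing n random items into a bits-bit digest."""
    return n * n / 2 ** (bits + 1)

# ~2^35 email addresses (roughly 5 per person alive) into 128 bits:
p = collision_probability(2 ** 35, 128)
print(p)  # ~1.7e-18, i.e. effectively zero
```

So even MD5's digest size is nowhere near the bottleneck here; the realistic input set is tiny compared to the hash space.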

Instead of an MD5 hash they could just as easily upload a Bloom filter, which would expose even less data and compress it significantly; however, it would be more computationally expensive to generate matches that way vs. hashing.
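A minimal sketch of the Bloom-filter idea (the sizes, hash construction, and phone-number format are illustrative assumptions, not anything Path actually shipped):

```python
import hashlib

class BloomFilter:
    """Tiny Bloom filter: k hash positions over an m-bit array.
    False positives are possible; false negatives are not."""

    def __init__(self, m_bits: int = 1 << 20, k: int = 4):
        self.m = m_bits
        self.k = k
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: str):
        # Derive k positions by hashing the item with k different prefixes.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add("+15555555555")
print("+15555555555" in bf)   # True
print("+15555550000" in bf)   # almost certainly False (false positives possible in principle)
```

The client could upload a filter like this instead of raw contacts; the server then tests its own users' numbers against it, learning only probabilistic membership rather than the full address book.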

That's irrelevant, since this isn't crypto.

This is matching user email addresses so they can spam you and your friends and grow their company on the back of dodgy practices.

That's a fair assessment.

It doesn't matter. The purpose of the hash isn't to uniquely identify users, it's to narrow the list of users that need to be sent down to the phone. If Path could send their entire user database to the phone, they wouldn't need to send the contacts to their server.

Replying to Me1000, with hashes it would still be trivial to create a system that determines when a friend's signed up, and alerts you. That doesn't require a full address book entry - or any contact details at all beyond an identifier. It's not like Path is actually notifying using any of the contact details it stores, which means it's either representative of wasteful coding on their side, or of something else going on.

The main purpose is to send you a push notification when your friend joins, since a push notification can't execute client side code... that wouldn't work.

You can guarantee a unique hash for all practical purposes. If you manage to find a collision in a robust cryptographic hash then the world of computing will have far more important things to worry about than a social network getting slightly confused.

Well, this isn't quite right--the domain of the hash function is a numerical representation of the user, maybe a 64-bit int, so it's obvious that you can engineer a non-colliding hash (trivially: the hash function is XOR). What's more interesting is whether there's a hash function such that Path can't infer the social graph from user requests without the user's permission. It seems to me a hash would be both one-way (obviously) and dense (so that a randomly generated request from a user would have a good chance of matching another user).

I can't think of a hash function without a good public-key infrastructure, which is obviously beyond Path's remit. Anyone aware of a solution to this?

You could just use SHA-256. If it does collide then Path's malfunctioning will be the last of our concerns. :-)

"If you accept and later decide you would like to revoke this access, please send an email to service@path.com and we will promptly see to it that your contact information is removed."

There it is. If you have a button that stores all contact information, why can't you add a button that removes all my contact information? Of course, then more people would click it. Just a stunt, nothing more.

"It is also stored securely on our servers using industry standard firewall technology."

Undoubtedly in plaintext. Having "industry standard firewall technology" didn't do jack for Zappos, why would Path's data be any more secure?

The fact that you have to email Path to "revoke access" is still unacceptable.

This information should never be stored on Path's servers. In the best case they should be storing hashes of the information, and before people object that there can be collisions: so what? The number of people who would be presented with a friend they don't know will be minuscule versus the number of people whose personal information is stored in plaintext in a database somewhere.

The idea that when someone signs up for Path and is instantly recommended to friend someone else because that person shared their personal information is scary.

Making this opt-in gives people the illusion of control when one of their tech illiterate friends who always clicks accept has already given out all of this information.

This place is a privacy disaster waiting to happen. I smell an "ideas guy".

I'm glad they decided to nuke the data, but can you really trust them again? This only happened because someone was hacking a project and ended up tracing what Path is doing to his address book and then blogged about it, getting enough attention and momentum to end up forcing Path to take this action.

But can you really trust a company like this in the future?

I think Dave Winer is right. One should treat others' data as one would like others to treat their own.


Good for the most part, but does anyone feel like they deliberately left out what it was they're apologizing for?

I can imagine a user unaware of the recent event stumbling across this article and leaving confused about what wrong was committed. They sort of just assume you knew what happened, instead of explicitly explaining what they'd been doing.

But, they're taking steps to resolve the issue, apparently; so good on them.

"We made a mistake. Over the last couple of days users brought to light an issue concerning how we handle your personal information on Path, specifically the transmission and storage of your phone contacts."

Dave explained the issue well enough in the first paragraph.

I don't think that explains anything to someone not familiar with the scenario. It doesn't say how they handled transmission and storage of phone contacts, just that they did it in a bad way.

I don't think it's intentional, though. When writing this I doubt the audience in their minds were the people who don't know about the issue.

Given that they've removed all the data and updated the app, I'm not sure it's necessary that they give highly granular details as part of the apology.

That was a deliberate mistake. At first they said it wasn't a big deal (just like Airbnb did), but then, when they saw social media catching fire, they apologized. Better if they had not done it at all, but good that they took measures.

If you put yourself in a user's shoes that doesn't know what the issue was then that is still generic. As a user who doesn't know the story I'd be wondering:

- Did they get hacked and now some unknown party may have the contents of my address book?

- Were they selling my information to others?

- Did something happen with storage that mixed up or deleted information?

- Was my data being transmitted in the clear?

- Was my data being transmitted without my knowledge or approval?

Two of those things did happen but the user doesn't know for sure. To be fair though, I think their statement was enough. They really don't have to go into more details unless the situation calls for it and it doesn't right now. Those who know get the apology they deserve and those who don't continue using Path as if nothing ever happened. Win win.

Paragraph four, which answers questions 2 and 4 in your list and suggests that the answer to 1 and 3 is "No":

"In the interest of complete transparency we want to clarify that the use of this information is limited to improving the quality of friend suggestions when you use the ‘Add Friends’ feature and to notify you when one of your contacts joins Path––nothing else. We always transmit this and any other information you share on Path to our servers over an encrypted connection. It is also stored securely on our servers using industry standard firewall technology."

The actual problem was number 5, and they tell you exactly how they are fixing this: by deleting all existing data and letting people opt in to sharing it.

Actually, in the blog post by the guy who discovered that, he said he was able to read the data - meaning that it was transmitted NOT encrypted (please correct me if I am wrong).

Also, I hope that their "industry standard" firewall is better than their "industry best practices" data sharing practices.

Speaking as someone who has never heard of path before today I have no idea what they are apologizing for, and I'm scanning the HN comments hoping someone will list some background.

For the benefit of anyone else who is confused: http://mclov.in/2012/02/08/path-uploads-your-entire-address-...

Isn't it unfair that Path gets all this press for making a mistake and apologizing? What about all the apps that didn't make this mistake?

Sure, but on the other hand press isn't a currency system for rewarding good work. The other apps should have a competitive advantage by not making this mistake.

They're getting press because it's relevant to people like me, current users of Path who started freaking out over it...

Give credit where credit is due. Zynga would never in a million years do this. Facebook probably wouldn't, either.

Dave's message is straightforward and sincere.

Facebook actually asks for your permission before sucking up your entire address book.

Even if that weren't the case, "better than Facebook" is a pretty low bar. "Worse than Facebook" is way, way out of bounds.

I have a question about how they store the contacts. Can't they encrypt each of the phone numbers before they get sent to the server? This way there's no breach of privacy and the friend suggestion feature still works for everyone.

That wouldn't add any real protection. Phone numbers are a very small set (100 million possible in the U.S. and Canada). A rainbow table of all possible combinations can be created in only a week or two.

However, phonenumber+userid creates lots of nice unique hashes.
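The rainbow-table point is easy to demonstrate: an unsalted hash of a phone number falls to plain enumeration. A toy sketch (the number and the deliberately tiny search range are made up; the full number space is only a few orders of magnitude larger):

```python
import hashlib

def sha1_hex(s: str) -> str:
    return hashlib.sha1(s.encode()).hexdigest()

# Suppose a leaked database contains this hash of some 10-digit number.
leaked = sha1_hex("5555551234")

# An attacker who knows the number format just walks the space.
# (Only 100k candidates here for speed; a full precomputed lookup
# table over every plausible number is entirely practical.)
recovered = None
for n in range(5555500000, 5555600000):
    if sha1_hex(str(n)) == leaked:
        recovered = str(n)
        break
print(recovered)  # 5555551234
```

This is why hashing alone is not "anonymization" for low-entropy inputs like phone numbers.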

If they are going to hash the data, they should salt it (and possibly use key strengthening a la bcrypt, etc).

Hashing phonenumber+userid does absolutely nothing for them, though.

The purpose of uploading your contacts is so that if Jack's phone number is (555) 555-5555, and Sam uploads a contact list saying that he is friends with a guy whose phone number is (555) 555-5555, Path can match up those two phone numbers (or hashed versions of them) and tell Sam that Jack is a member. That match-up doesn't work if the phone number is stored as (a hashed version of) 5555555555jack and 5555555555sam.
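To make that concrete, here is a toy sketch of why the match-up works with bare hashes and breaks once a per-user id is mixed in (all names and numbers are invented):

```python
import hashlib

def h(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

jack_number = "5555555555"
sams_contacts = ["5555555555", "5551234567"]

# Bare hash: the server can join Sam's hashed contacts against the
# hashed numbers of registered users.
server_side = {h(jack_number): "Jack"}
matches = [server_side[h(c)] for c in sams_contacts if h(c) in server_side]
print(matches)  # ['Jack']

# Mixing the uploader's user id into the hash destroys the join:
# Sam and Jack would derive different digests for the same number.
assert h(jack_number + "jack") != h(jack_number + "sam")
```

So salting per-user protects the numbers but defeats the entire matching feature; any salt would have to be global, which brings back the enumeration problem.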

They could take the phone numbers, sort them and hash them together. So if Sam is 5 and Jack is 6, they both upload the hashed social relationship 56 to the system and it can match them up.

It wouldn't keep someone with access from checking if a social relationship existed in the database, but it should make recovering phone numbers and the like from the hashes quite a lot harder.
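That scheme amounts to hashing the sorted pair of numbers, so both sides derive the same relationship token regardless of whose phone computed it. A toy sketch (numbers invented):

```python
import hashlib

def relationship_hash(num_a: str, num_b: str) -> str:
    # Sort so both parties compute the same token regardless of
    # which direction the relationship is uploaded from.
    first, second = sorted((num_a, num_b))
    return hashlib.sha256(f"{first}|{second}".encode()).hexdigest()

# Sam has Jack in his contacts, and vice versa:
token_from_sam = relationship_hash("5555555555", "6666666666")
token_from_jack = relationship_hash("6666666666", "5555555555")
print(token_from_sam == token_from_jack)  # True
```

The server can then match the two uploads on the token without ever seeing either raw number, at the cost of only finding mutual contacts.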

That only works though if you have both Jack & Sam's phone books (and they have each other in their books) so the hit rate would go down, possibly significantly.

But if you do this then you can't match phone #s in a user's phone book against phone numbers in your database.

Do you mean hash instead of encrypt?

How would one carry out the friend suggestion feature with encrypted phone numbers?

A cryptographic hash of a phone number on their server should match a cryptographic hash of a phone number in a contact list on a phone. The app sends the hash to the server, and the server looks up users via the hash and responds with user data for matches.

To be honest this should be a third party service, since it sounds like every major social networking app is doing the same exact thing.

In my opinion, giving out your number, along with the hash of each phone number in your address book to an authority with millions of such hashes isn't appreciably better than giving them in plaintext.

(Hi Dan?)

But you wouldn't give out your number. I haven't completely thought it through but the service provider would provide an api for common platforms. All it would do is 2-way encrypt contact numbers (SSL?). Then the service would do a basic lookup using the encrypted data as a key. If there is a hit for this particular platform it'll return the platform specific data (in this case, like a path specific user id).

Of course the other side would be maintaining users in this service, which again is pretty straight forward.

(Hi David?... I'm the OTHER DJB, probably not the one you are thinking of)

Instead of comparing the phone numbers, you'd compare the hashed phone numbers.

So now they have admitted what they have done, is someone in the UK going to prosecute them, I wonder?

It seems they may have broken the data protection act in more than one way.

* First, they collected personal data about UK citizens without their permission (as a 3rd party cannot give that permission),

* Secondly, personal information was kept for longer than is necessary (it should have been deleted after it was used)

* Thirdly, they allowed personal data to leave the EU.

Note that personal data includes name and address, telephone number or Email address.


This may be a calculated risk. Hopefully they did more due diligence, but a quick search shows a Dec 2011 case [1] where an estate business was prosecuted for collecting info, but was fined a small amount (< £1,000). The extenuating circumstances were that they had already complied with the law by the time the case was heard by the court. [1]: http://www.ico.gov.uk/news/latest_news/2011/estate-agent-pro...

But they are still not complying. They are allowing people to opt-in on behalf of their friends which strictly speaking is against the rules.

As someone who is always concerned about my own privacy and the privacy of people who trust our company with their data, I am very pleased to see that, when things do go wrong, honesty is appreciated.

While I don't think it's ever acceptable to make this kind of mistake, we should also encourage companies to be upfront and honest about what went wrong and what they're going to do to make things better when issues come up.

This is a positive step forward for this company and for tech companies as a whole. Having said that, maybe I would feel differently if I actually used this app?

Might be worth pointing out that presumably Path's business model is still to collect as much data about their users as possible and sell it to advertisers.

"In the interest of complete transparency we want to clarify that the use of this information is limited to improving the quality of friend suggestions when you use the ‘Add Friends’ feature and to notify you when one of your contacts joins Path––nothing else." Is "complete transparency" == me trusting you just because you say so?

I don't understand how they could think it's not a terrible thing, and not feel dirty for doing it just because they can. It's less about making mistakes and more about developer greed. They could have MD5'd the email addresses before sending them, or used any number of simple-to-implement techniques that would still have retained the original feature. Stop telling us it was a mistake! This was a premeditated privacy breach, 1st degree. They decided to hedge their bets and profit from their users through blatant disregard for their privacy -- the users are why they have a job and VC in the first place. In the end, all this is just going to degrade the user experience with unnecessary confirmation dialogs and prompts, because a few lousy startups couldn't keep it in their pants. These companies do not deserve forgiveness; they deserve a class action lawsuit.

It is of course stupid that iOS doesn't sandbox the address book and require apps to ask permission to access it. Apple needs to fix this. But it is not enough.

I keep a lot of data about people in my address book in addition to phone numbers and email addresses: birthdate, names of children and spouses, residential and work postal and physical addresses, gift ideas, group affiliations, etc.

I am happy to click "OK" if an app asks for essentially the social graph information that I've already exposed through Twitter and Facebook. I don't want an app to have the other data I've curated. Even if you can trust the app vendor to not be evil, you can't guarantee they won't leak the data through incompetence.

So while Apple really should require permission for apps to get access to the address book, we really need a new model more sophisticated than all or nothing.

It's a step in the right direction, but doesn't clear up all of the confusion. I can't update to 2.0.6 (it's not an option on my device, a 4th gen iPod touch running 2.0.5). In addition, how will adding friends work going forward -- Facebook Connect, or manual searches by name?

Will hashing be implemented?

If you can run Path 2.0.5, you can run Path 2.0.6. I think you're confused because the AppStore hasn't actually updated yet to show 2.0.6. Try again later today.

They said everything will work exactly like before if you opt-in. So the second you click "accept" they have your entire address book until you email them and say please delete it.

This is smoke and mirrors and makes it sound like they've done a good deed.

Nobody seems to be talking about the obvious issue here: they are essentially asking us to trust them again when they tell us they have deleted everyone's contact info from their servers. "We fucked up. But we fixed it, trust us." Am I the only one that finds this odd?

Nice to see a transparent and timely response to this issue. I get the feeling that the startup world learned some serious lessons in crisis management after seeing the Airbnb nightmare unfold. At the end of the day, the customers/users are the real winners here.

we are sorry you found out what we were doing and couldn't do much other than apologize about it.

Yeah. They should just shut down completely.


Path is one of the most well-managed apps I have had the privilege of using to date. Their response to this 'scandal' was as close to flawless as it gets.

Not only did they take full responsibility for what they did and apologize instead of making excuses, they deleted all the data people were concerned about, wrote a well-worded blog post about it that hit the top of hacker news within a couple hours, AND pushed a fix for the issue to the app store all within less than a day of the concern becoming public.

Path's attention to detail not only in the gorgeous design and user experience of their app, but in the way they handle PR crises like this one only makes me trust them more. Well done Path, well done.

It's very surprising that no one involved with implementing the feature thought they were doing something wrong.

Only when someone caught them "in a compromising position" did they say sorry.

It's like Bill Belichick saying "I misinterpreted the rule" :)

Path should come up with a Privacy Protection Program that commits them not to repeat their mistake. It's too easy to do something and ask for forgiveness later. That would distinguish them from Facebook's way of doing things.

They could set up a company in the European Union, and hence be subject to EU data protection law, which is stronger than the USA's. They would then be risking fines and court orders for things like this. It would show that they don't expect it to happen again.

It would be a bit of a bureaucratic pain in the ass though.

Companies make mistakes. In the rush to develop great products decisions are made rather quickly. When your intentions are pure, the fact that you might be doing something wrong simply doesn't cross your mind.

Unfortunately, these things happen.

What you have to do now is look at how Path reacted. The second the article exposing their mistake was published Path became very open and honest. Above that they offered reassurance to their users, deleted the data (I never expected that), and pushed a feature to opt-in to sharing your private data.

In my opinion they couldn't have handled this any better. For that reason, I give Path all the trust and respect in the world.

Call me a cynic, but suggesting that we shouldn't be concerned by saying "It is also stored securely on our servers using industry standard firewall technology" seems somewhat naïve. As if that means our data was adequately protected from prying eyes...

I'd be impressed if they'd turned round and said "We realise it looks like we were trying to expand our business off the back of your private data, and have therefore decided that in our next release we will stop uploading user's contact details altogether. We'll make our social network so compelling that it'll go viral without abusing your privacy."

The photo of Dave Morin and the words "sincerely" felt like a bit of a mismatch.

A proper apology. Unlike what Google did with their fiasco in Kenya a few weeks ago, the company actually did away with any benefits they derived from the bad conduct. Google simply apologized (great because it's free) yet didn't mention deleting all data scraped, deleting all contact information collected for the businesses, and cancelling all orders for hosting and other such services. I presume they must have maintained all profits generated by their conduct. It's good to see that Path at least understand what it actually means to be contrite.

A private social network seems almost like an oxymoron. To create a large social network, a company has to tap into users' existing social networks and try to get their friends/contacts to join as well. And then, to increase engagement or stickiness, you have to keep reminding them to come back. Otherwise the social network might not grow as fast as founders/investors would like, which most likely affects revenue... It's a tough place to be, really. I don't condone their action. At a minimum, they should've gotten users' permission first.

How do you know they did delete the data?

Do you honestly believe they are sorry and they deleted your data just because they said so?

I personally doubt it. The data is valuable for the company, and it would be foolish (from their perspective) to delete it. Somebody had to write and test the code that uploads all your contacts.

I find it hard to believe that you have access to all the data, see what is coming in, and then, only when you're caught, discover that "oops, we made a mistake; our implementation sucked."

The thief is sorry to get caught.

I like the apology. They are doing the right things as well by deleting all of the existing data, but it is a lesson to all companies playing in the business-to-consumer space: have clear and easy-to-read privacy policies and get explicit consent from users before you collect their data.

After reading the post, it is apparent that Path did nothing wrong except poorly communicating their procedures and policies.

"""We are deeply sorry if you were uncomfortable with how our application used your phone contacts."""

Or do they mean they are sorry you found out about it?

They really need to update their privacy policy, which is currently mostly generic nonsense. https://path.com/privacy

Regardless of whether they throw up a confirmation prompt, their privacy policy needs to clearly describe what information is scraped from your phone, how it's used, and how long it's retained.

There was a great discussion about hashing strategies as an alternative to storing all of this contact info. Did any specific code/examples follow?

It'd be great to see a new "best practice" emerge from this discovery. If it's easy to use, everyone building an app will just default to comparing hashes vs. matching phone numbers.

Perhaps I'm too cynical but as I subvocalized this, I added "now" to the end of every sentence.

But there are all sorts of sentences in that post that don't make sense with "now" tacked onto the end of them, wouldn't that bother you?

I see a trend of pushing the envelope of what is admissible. If users don't like it they are quick to apologize (lusers..., we'll iterate over this...)

If they do it often enough, in the end one of them will even claim they're using the "standard industry practice"

"Standard industry practice" in the second paragraph. I can't believe it:


I like the "sorry if" comment ...

Yeah, I don't consider an apology that blames ME an apology. You screwed up, full stop. You're shirking your responsibility if you're talking about my response to your mistake.

I wonder how many backups of their database still have remnants of this contact info?

Bah, "We are deeply sorry if you were uncomfortable with how our application used your phone contacts." Is still a non-apology. We're sorry that YOU feel this way. Not, we're sorry that WE screwed up.

We are sorry.

We are guilty. We took your contacts...and no, you can't have them back.

I was hoping to see that they would just drop the entire database, and then implement hashing from here on out. Otherwise, the apology feels sincere and I appreciate it.

"So, as a clear signal of our commitment to your privacy, we’ve deleted the entire collection of user uploaded contact information from our servers."

That sounds like exactly what you were hoping for.

Except for the "and then implement hashing from here on out." part.

So, they haven't changed their implementation, they've just added the ability to opt out of the poor implementation.

But hashing doesn't add any protection in this case. There are a very limited number of phone numbers in North America and so those hashes can be pre-computed and rainbow-tabled in a short, reasonable timeframe.

Is this true if they were salted with a very long phrase?

But the app would need to contain the salt in order to send it to Path's servers hashed and salted. So a hacker could decompile the app to determine the salt.

So you can opt out then, since they're not doing enough to address your concerns. They're being upfront about it, though, and putting that choice in your hands.

> putting that choice in your hands

They are putting a false choice in your hands that they hope will lead to the status quo while still giving a show of making good on this issue. They could, through sophisticated hashing and matching algorithms, do the user matching without ever learning your contact details. But they aren't bothering to do that. Instead they are just planting a checkbox in front of the user before they go and violate their privacy, and they hope that the vast majority of users will just check it and they'll only lose data from a minority of privacy nuts. Which means Path will end up exactly where they would have been anyway - with a giant database of personally sensitive information sitting unencrypted on their servers, waiting to be exploited, abused or leaked.

I'm SORRY to post this but I just couldn't resist: http://www.youtube.com/watch?v=BeP6CpUnfc0

(This will be an unpopular opinion)

The community response to this is ridiculous. Off the top of my head, I can't think of any other company that has responded to community criticism within a day or two with a policy reversal, a software change, and a deletion of offending data.

Guys and gals, stop picking on Path. They are AWESOME. They deleted your data and changed their app so it would never happen again. Try that with Facebook.

As a nerd, on some level I too lament that they didn't fix this with a cryptographic hash and a bloom filter, but come on, as businesses go, this is top notch.
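For the curious, here is roughly what the "hash plus Bloom filter" idea means (an illustrative sketch with made-up numbers, not anything Path shipped): the server publishes a Bloom filter of its users' hashed numbers, the client tests its contacts locally, and only probable matches ever leave the phone. False positives are possible; missed matches are not.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter for membership tests (illustrative only)."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = bytearray(size)

    def _positions(self, item):
        # Derive k bit positions by hashing the item with k different prefixes.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(("%d:%s" % (i, item)).encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

# Server side: publish a filter of registered users' numbers.
server_filter = BloomFilter()
for number in ["2065550137", "4155550198"]:
    server_filter.add(number)

# Client side: test the address book locally; upload only candidate matches.
address_book = ["2065550137", "2125550000"]
candidates = [n for n in address_book if n in server_filter]
```

The server never sees the full address book, only the (occasionally false-positive) candidates.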

So if I do something bad and then when someone finds out I reverse it, am I instantly forgiven and innocent? And anyone with half a brain can figure out that people would get pissed about this, which means they took the risk to do this secretly. They did not do this accidentally.

Occam's razor disagrees with you. It is perfectly plausible that some engineer with little interest in cryptography/privacy implemented the most obvious solution.

Also, yes, I typically forgive people when they reverse their actions and ask for forgiveness. It makes for good relationships.

Forgive people, yes. Companies? Eh, plenty of other fish in the sea.

A sincere apology always adds value to this world. And I guess, it's sincere, even if there are flaws in it here and there. Just my perception ...

Trust - difficult to earn, easy to lose. Let us see whether their users give them a second chance. At least they are open and honest about it.

I still don't understand how someone, maybe even a friend, can allow Path to access/store my information from his/her contact book.

Not bad, but could have done without the "As we continue to expand and grow we will make some mistakes along the way." sentence.

How do you know they actually deleted it all? I'm not saying they didn't, but at some point I expected someone to say "pics or it didn't really happen." Many people get so emotionally attached to companies, apps, etc. that it hinders their ability to even ask whether there is evidence or not.

Is anyone cataloging public apologies like this for future reference by start ups and other online businesses?

The right thing to do. However, their 2nd paragraph should have been the one starting with "We believe you should have control when it comes to sharing your personal information..."

The rest of it is a repeat of yesterday and is really not necessary.

I do want to know how I can back up my Path to an S3 or Dropbox account. Does anyone know if they support this?

Are you guys buying into their PR BS? They knew exactly what they were doing!

I wonder if when Path 'deleted' the data, they shredded the hard disks too?


That's a good point, and then there are of course back-ups to be considered. Deleting data is surprisingly hard, but fortunately for path this is 'bulk' so that makes it a little bit easier.

Making sure you really lose a single record is a lot more expensive because then you have to selectively remove it from your spinning back-ups as well, in this case you can just wipe the back-ups of the file by opening the file for 'update' and overwriting it with random data.

Tapes are a bit harder again...
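The "open for 'update' and overwrite with random data" step above looks something like this (a sketch under the stated assumptions; it does not defeat filesystem journaling, copy-on-write, or SSD wear-leveling):

```python
import os

def shred(path):
    """Overwrite a file in place with random bytes, then unlink it.

    Opening with "r+b" updates the existing blocks rather than
    truncating and reallocating, which is the whole point.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))
        f.flush()
        os.fsync(f.fileno())  # push the random data to disk before deleting
    os.remove(path)
```

For a single record inside a large backup file you would seek to its offset and overwrite just that range, which is the expensive, fiddly case described above.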

Sorry doesn't make it OK to start with.

Dear Path, go FK yourself. BYE.

I have a really IMPORTANT question...

Did you know that Path, by default and always, does not store Android phonebook address entries on their servers?

In fact, it's against standard Android dev practices, to the point where it's prohibited by Google..

So when Path learned that while building the Android app, why did they continue to insist that doing the same thing on the iPhone was right?

Now I would not say the Path CEO is directly lying, but it stinks pretty bad..

"we’ve deleted the entire collection of user uploaded contact information from our servers. " -- I doubt it based on how he define "delete".

Seems like they did the right thing. Kudos to a company that reacts appropriately.

They still didn't respond to my email that I sent >24 hours ago.

Note to PR dicks: never include a mission statement in an apology if that very mission statement is the reason you were hired to write an apology.

Note to app builders: never hire a PR firm to do your dirty work.

I have to say, of all the comments here (and the large volume of downvoted ones), I don't see the problem with jsavimbi's comment. It does seem very strange to say "We are all about user choice and privacy" while apologizing for violating that credo.

Hahahhaahah what a fucking joke

I'm kind of sick of this "let's revolt against everybody using my data" mentality. They don't persist your contact data to their server. What exactly is it that you're afraid of?

Moreover, how on earth did you think the "Add Friends" feature worked? I'm assuming at least some of you program software, and you should know that data doesn't just appear out of nowhere. Do you really expect a software startup to move every piece of data sorting & analyzing to the client side that has potential to piss off its userbase?

I understand that it's easy to just encrypt the information, or some other X remedy. I'm just saying there's a line between a software mistake and the let's-grab-the-pitchforks rhetoric that inevitably stems from stories like this.

If they aren't persisting your contact data to their server, how can they now say they deleted it all from their server? I was under the impression that storing your contact data was exactly what bothered people. Am I mistaken?

I agree in general, but they do actually store your data on the servers.

Oh. Nevermind, then.

The issues are that they sent address books in plain text and without opt-in. Two fundamental, I'd-sack-him-if-I-employed-him mistakes.

Moreover, how on earth did you think the "Add Friends" feature worked?

Unless all their userbase is composed of programmers or at least IT people, it's completely unreasonable to expect them to know that. I think they should, for their own sake, but as an app developer you can't assume they do.
