To what extent is an organisation liable when they get security wrong? (troyhunt.com)
26 points by mikeevans 1260 days ago | 13 comments



It would only be healthy for the industry if organizations were liable for losing sensitive information. If there is no economic risk, no serious amount of budget will ever be spent on security. It is basic economics.

There is always the opinion that liability might threaten innovation. That can, however, be mitigated by storing less sensitive information on one's servers, or by ordinary insurance if such information is critical to have.


That would be the right policy, IMHO, if it were defined in the right way.

The crucial distinction is between software providers and businesses storing data related to customers. If software providers were liable, there would be a devastating effect on the business world and society: the only way to obtain software would be from giant corporations, and every coder would have to get licensed and buy insurance.

So let's not go there - but businesses selling products or services to the public and storing customer data should not be able to opt out of liability by putting some fine print in their boilerplate terms. Instead they should have to pay the full cost of repairing "identity theft" (so-called; it is actually impersonation) for every customer whose data they lose control of. (Think of Target, TJX etc. rather than Snapchat - i.e. cases where the attackers get personal details and maybe CC data.)

This would establish the right incentives, but it would not impair business because with competent staff and best practices, it is possible to get good-enough security for the purpose.


I've been wondering the same thing about Snapchat. Acknowledging a security risk publicly and then not addressing it sounds like negligent conduct. Then again, I'm not a lawyer, and I'd guess if there was money to be made in a class action lawsuit someone would have started those wheels turning already. It does raise a concerning proposition: if any of us can be held liable for "leaving the windows open", who gets to decide when the windows are left open or not? Having to launch each and every app with a whole slew of security features before opening it to the public could end up stifling innovation.


Given that the courts sided with AT&T and not weev, I'm guessing a class action lawsuit wouldn't get very far. The problem is that people who don't really understand this stuff will be asked to set up guidelines on "leaving the window open." I could easily see a "solution" that makes things worse.


A responsible CEO is going to have to think very hard about publicly accepting responsibility for an event like this. Accepting responsibility publicly puts you in a difficult situation if there is a lawsuit.

I feel like the writer is a bit naive about the realities of "insecure" software in writing this article, beyond just being unaware of why a CEO has to frame the discussion as abuser-centric.

Writing secure software is not easy. Putting liability in place for writing insecure code would open every software company in the world to the risk of lawsuits, and there would never be any real resolution. The best in the world still get security wrong, and on top of that you are architecting solutions on top of layers and layers of abstraction. An attack against a lower layer is usually how you end up getting hacked.

You might as well write a law that says everyone gets free unicorns.


I mean, let's put this in perspective on the whole Snapchat issue: they lost a list of usernames... which could theoretically be matched to the person's other usernames from other sites.

It's just not crazy useful... you can do the same thing by scraping Facebook or LinkedIn without even having to "hack" them. There are companies that expose as much through open APIs.


As was pointed out in a thread when this first happened, there's more important and valuable information in a phone book than what was leaked here.

If we're talking about the content, then I feel it's a pointless argument. If we're saying that it should have been avoided regardless of content, I agree, but the media attention and the severity with which it's been advertised are probably a bit overboard.


A few points that might help put this into context (oversimplifying a bit):

1. In general, negligence law works to prevent people and companies from imposing the costs of their actions on others, that is, externalizing those costs, while retaining the benefits.

For example, suppose that a pizza delivery guy drives too fast. Perhaps he wants to make more deliveries during his shift and thereby earn more money. Whatever the reason, suppose also that the pizza guy hits a parked car with no one in it (to simplify the example).

In that situation, the pizza guy (and his employer, but that's another discussion) likely will have to pay to have the other car fixed, and for a rental car for the other car's owner. The rationale is that the pizza guy and his employer shouldn't be able to retain the benefits from his fast driving while making others, i.e., the owner of the car he hit, bear the resulting costs.

(That, incidentally, is why drivers in many jurisdictions are required by law to carry liability insurance -- so that if a driver does negligently get into an accident, there will be a pre-established pool of money that can be tapped to pay for the resulting damage, even if the driver himself happens to be broke at the time.)

--------

2. Negligence can be loosely paraphrased as a failure to use due care when there's a duty to do so. In any given case, it might be debatable whether a duty of care existed, and if so, whether the defendant complied with that duty. In assessing these questions, courts generally look at, for example:

+ the likelihood and magnitude of the potential loss from the conduct in question (i.e., the expected loss);

+ the incremental cost of additional measures to prevent the expected loss;

+ in the case of a business, whether that incremental cost can be amortized across the business's customer base by the business's buying insurance (or self-insuring) and then increasing its price accordingly;

+ which party is in a better position to take measures to prevent the expected loss, and/or to bear the loss if it comes to pass.

For the mathematically-minded: A famous case studied by all U.S. law students is United States v. Carroll Towing Co. [1], where the opinion was written by the legendary judge Learned Hand (yes, that was his name). Judge Hand put it in algebraic terms:

"Since there are occasions when every vessel will break from her moorings, and since, if she does, she becomes a menace to those about her; the owner’s duty, as in other similar situations, to provide against resulting injuries is a function of three variables:

"(1) The probability that she will break away;

"(2) the gravity of the resulting injury, if she does;

"(3) the burden of adequate precautions.

"Possibly it serves to bring this notion into relief to state it in algebraic terms: if the probability be called P; the injury, L; and the burden, B; liability depends upon whether B is less than L multiplied by P: i.e., whether B < PL."
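Hand's formulation really is just a cost-benefit comparison, so it can be sketched as a tiny function. (This is only an illustration of the quoted passage; the variable names and the dollar figures in the example are my own, not from the case.)

```python
def negligent_under_hand_formula(p: float, injury: float, burden: float) -> bool:
    """Hand formula from United States v. Carroll Towing:
    the defendant breaches a duty of care when the burden of
    adequate precautions (B) is less than the probability of
    harm (P) times the gravity of the resulting injury (L),
    i.e. when B < P * L."""
    return burden < p * injury

# Hypothetical: a $1,000 precaution against a 10% chance of a $50,000 loss.
# B = 1,000 and P * L = 5,000, so skipping the precaution would be negligent.
print(negligent_under_hand_formula(p=0.10, injury=50_000, burden=1_000))   # True

# If the same precaution instead cost $10,000, B > P * L and the
# formula would not find negligence for skipping it.
print(negligent_under_hand_formula(p=0.10, injury=50_000, burden=10_000))  # False
```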

--------

3. So how does this apply to criminal hacker break-ins? Suppose that: (A) a company fails to use "due care," whatever that means, by way of security precautions; and (B) as a result, third parties are damaged in ways that "reasonable people" would have foreseen. In that situation, it's not hard to imagine that the company might well be held liable for such damage.

As a practical matter, in a negligence trial, the plaintiff's lawyers will often think up additional precautions that the defendant supposedly could have taken without undue cost or burden. The defendant's lawyers are then in the position of having to convince the judge or jury that the cost or burden would indeed have been "undue." That can be an uphill battle, especially when the plaintiff is a sympathetic sort and the damage is something that judges and jurors can identify with.

[1] http://en.wikipedia.org/wiki/United_States_v._Carroll_Towing


I believe this area of the law falls under tort. In a tort case, it's up to the company to have performed reasonably. The biggest difficulty the courts have is setting the bar for what is reasonable. Much of the time, reasonable is defined by what similar firms (the latest and greatest) are doing. One of the most famous tort cases is the McDonald's case, where a woman sued the company because she burned herself with their coffee. She ended up winning the case, and now McDonald's has to label every coffee cup with "Caution: Contents Hot." I'm not sure how this relates to getting a ticket for not closing your window, but it definitely relates to the Snapchat security vulnerability.

http://en.wikipedia.org/wiki/Tort


The answer in the US for a stock brokerage is "categorically no regulatory-based liability". If there is nothing for a fiduciary that handles your money, there is no hope for customers of any other kind of company. I know this for a fact from first-hand experience witnessing someone's large 6-figure retirement account at Dxxxxxs (name disambiguation available upon request, post an email here and I'll send it to you, but most can guess who it is after this hint: BNY) get zeroed out from an information security attack, and helping with their subsequent attempts to recover funds.

The fiduciary admitted they do not suspect the victim drained the account on their own, and admitted there was no laxity on the victim's part they are aware of (no claim of password has too little entropy, for example). We suspect only a few accounts were hit, and all were drained due to an information security breach of some kind (the victim has some ties to sources who work on Wall Street who indicated they've heard about the breach).

The fiduciary proactively notified the victim of unusual account activity. That was the ONLY positive reaction to date (14 months ago now as of the time of this writing). The fiduciary then proceeded to:

* Lock out the victim from their own online account. Even read-only access to review historical transactions was denied.

* Mark the account "under internal security review", refuse any further discussion pending results of review.

* Make no effort to communicate with victim (my guess is they are likely expecting to lawyer up and go to litigation).

* Refuse at the local branch office to print off historical transactions (victim had switched to paperless a long time ago).

List of agencies contacted and who have refused to assist due to lack of jurisdiction:

* local police

* state banking commission

* state financial regulatory commission

* FBI

* SEC

* FINRA

* CFPB

I am waiting for the victim to give up all hope and give me permission to go, politely, through my attorney and, if rebuffed, take this public to any financial news reporter who wants the detailed information (account numbers, what little paper trail exists, etc.), while giving the fiduciary a fixed amount of time to resolve it. I might also involve the IRS after consultation with a tax attorney; there are some interesting tax laws that might be relevant, from some basic research I did.


I think the "leaving the windows open" argument is wrong. As far as what's right and wrong is concerned, I should be able to leave the door wide open if I so choose. It's my car. If someone comes along and steals my car, however, I would call that wrong and the thief is at fault. Obviously, I don't want my car stolen (because I need its utility) so I take preventative insurance measures to avoid the potential burden of losing use of my car, talking to the police, and hunting down the thief. But, imho, that's my choice and I don't think I'm actively causing others harm by leaving my windows down.

To add some genderism to the fire: if a woman chooses to wear clothes that some would consider "provocative," is it her fault that she gets raped because she was inviting undue male attention? Is she simply "asking for it"? I don't think so. And I think it's a human's right to freely express him or herself as s/he chooses. If I want to leave the window to my car down, that's up to me, the owner of the car. If someone steals my car, that's their fault.

As far as software security is concerned, I think there are definitely reasonable and unreasonable steps that can be taken in the development of it. But I don't expect the developers of the software I use to be on the edge of their seats watching the internet for the every single security exploit that pops up so that they can instantly apply a patch. I do expect them to take reasonable steps to reasonably secure the software, and when something becomes obvious, to deal with it. When I entrust the storage of my personal details with a website, I have certain expectations about how that information is handled, but I don't expect it to be 100% impervious to attack.

I also think there lies some responsibility with the user to choose to use software they trust (trust being a spectrum, not a binary distinction). For example, I wouldn't expect a lot of security from some kind of seedy porn software, and would actually expect the software to actively compromise my system. If I chose to install such software, I think I'd be partly liable for installing something that is obviously insecure. But I have different expectations from my banking software. I think those expectations, which are relative to particular industries and markets, and which are fuzzy and ill-defined, are partly a user's responsibility (but not wholly... I still expect my bank to reimburse me if someone breaks into their database and steals my money).

My point is that liability in software security, to me, is a fluctuating grey area, and that the areas are defined differently for different kinds of software which should, at least partially, be apparent (and avoidable) to the end user.

I'm not trying to defend Snapchat or say that you should leave your car unlocked, just that I think these kinds of issues are not black and white, nor that one party is 100% wrong and another party is 100% right.


Would the police fine a convertible owner who leaves the top down? Interesting concept, and incredibly flawed.


I know liability and laws are all the rage, but here is a different take:

I'm disturbed by the car == Snapchat software analogy. They didn't get the security wrong, either. Whatever people can do to abuse the system, at least the system exists in the first place. I fear that if you try to regulate software in terms of right and wrong, or in terms of open car windows etc., you will end up with goofy standards that will stifle innovation. I'm sure someone could mention one of those standards.

The software is just doing what it does. You could call this "attack" a feature for power users.



