
Should developers be sued for security holes? - fwdbureau
http://www.techrepublic.com/blog/european-technology/should-developers-be-sued-for-security-holes/1109
======
tptacek
Do you think we currently understand every mechanism by which software running
in adversarial conditions (a web server with anonymous users, an operating
system running a freshly downloaded application, that application handling
random files off the Internet, &c) can be subverted? I don't; I've learned new
bug classes within the last year, and I'm nowhere close to the best in my
field.

If you're not confident that you understand all the avenues by which software
can be compromised by attackers, how could you possibly be at ease with the
idea that the law would presume vendors could secure their code before
shipping it?

You may think, "I don't expect vendors to be at the forefront of security
research," (which would be good, because even the vendors who want to be there
are having a devil of a time trying to fill the headcount to do it) "but let's
not excuse vendors who ship flaws readily apparent when their product was
being developed". But who's to say what is and isn't readily apparent? How'd
you do on the Stripe CTF? Last I checked, only 8 people got the last flag. But
nothing in the Stripe CTF would have made the cut for a Black Hat
presentation. You think a jury and a court of law would do a better job of
judging this?

~~~
imgabe
In the absence of some codified standard, it's not going to be possible to
adjudicate a lawsuit. Do we understand every mechanism by which a building can
catch on fire? No. But we still have a National Electrical Code that
specifies,
for example, what size wire to use for a given load. If you install wire
that's too small and it overheats and causes an electrical fire, you can be
sued.

Likewise, I think that if your app contains some known vulnerability, like
failing to sanitize your database inputs, there should be a way to hold you
legally accountable for that.
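
For illustration, this particular flaw is unambiguous in code. Here is a
minimal sketch in Python using the standard-library sqlite3 module (the table
and queries are hypothetical):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

    def find_user_vulnerable(name):
        # Negligent: user input is spliced into the SQL string, so a
        # "name" like the one below rewrites the query itself.
        query = "SELECT * FROM users WHERE name = '%s'" % name
        return conn.execute(query).fetchall()

    def find_user_safe(name):
        # Standard practice: a parameterized query keeps input as data.
        return conn.execute(
            "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

    print(find_user_vulnerable("' OR '1'='1"))  # every row comes back
    print(find_user_safe("' OR '1'='1"))        # no rows: []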

~~~
mechanical_fish
Before we pursue this metaphor too far, some observations about the National
Electrical Code.

One is that software is _many orders of magnitude_ more complex than an
electrical delivery system. People have spent decades trying to figure out how
to construct secure software by bolting together "secure components" in a
simple way. Let's be charitable and just say: That work continues, and will
continue throughout my lifetime. It's a much harder problem. Electricity is
easy.

The other is that even the electrical code doesn't provide strong guarantees
against malicious attacks by hostile humans. That's not in the spec. There's
no armor on the wires that come into my house, no alarms that would go off if
a ninja with a saw started cutting down a crucial utility pole in a DoS attack
on my power, no formal procedures for screening the wiring in my walls for
wiretapping devices. The electrical code doesn't even mandate a backup battery
or generator, let alone that said generator should be tamper-proof.

Similarly, the fire code doesn't specify that my smoke detectors should have
locks so that those ninjas can't easily remove the batteries before setting
off a gasoline bomb in my living room late at night. There isn't even a
gasoline-fume detector in my house. The windows aren't armored. This place is
not defensible!

Of course, there are places in the world that are built at great expense to
withstand attacks by armed bandits or trained spies. But if we wrote our
building codes to incorporate such measures, what would happen is just what we
see happening with software: People would line up to sign waivers and
variances, so that they could just build a simple inexpensive house and get on
with their lives.

~~~
imgabe
FWIW, the Life Safety Code (separate from, but related to, the NEC) does
mandate
emergency power for certain systems (fire pump, egress lighting) in public
buildings (not homes). The NEC doesn't address attacks by hostile agents
because that's not its purpose. It's to prevent people from getting
electrocuted and to prevent electrical fires from starting. That's it.

My point is not that a code is going to provide 100% assurance from all
possible forms of attack. It's quite the opposite. The code simply spells out
how certain known failure modes are avoided. It's not a guarantee that nothing
ever will go wrong. It's basically a list of specific things that have gone
wrong in the past, and what things should be done to prevent them.

The point is to establish exactly what "reasonable measures" are for the
purpose of determining liability, not to spell out a method for a fail-proof
system. If you present yourself as a competent developer and then build someone
a system that passes user input directly to the database and stores passwords
in plaintext, you should be held accountable for damages resulting from a
security breach that made use of those holes.
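
The plaintext-password half of that example is just as mechanical to avoid. A
minimal sketch using only Python's standard library (the function names and
scrypt parameters are illustrative, not a recommendation):

    import hashlib
    import hmac
    import os

    def hash_password(password):
        # Store a salted, memory-hard digest, never the password itself.
        salt = os.urandom(16)
        digest = hashlib.scrypt(password.encode(), salt=salt,
                                n=2**14, r=8, p=1)
        return salt, digest

    def verify_password(password, salt, digest):
        candidate = hashlib.scrypt(password.encode(), salt=salt,
                                   n=2**14, r=8, p=1)
        # Constant-time comparison avoids leaking a timing signal.
        return hmac.compare_digest(candidate, digest)

    salt, digest = hash_password("hunter2")
    print(verify_password("hunter2", salt, digest))  # True
    print(verify_password("wrong", salt, digest))    # False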

------
lutusp
> If you’re poisoned by a burger you can sue the restaurant that sold it - so
> why can’t you take a software developer to court if their negligent coding
> lets hackers empty your bank account?

Whoever came up with this example doesn't know much about law (and IANAL). To
prevail in a court of law, the hamburger plaintiff would need to show a
violation of prevailing standards of behavior, for example, health codes. But
if a threat is unknown at the time of the injury, the defendant is generally
held blameless.

An example is Legionnaires' disease, which was ultimately traced to a decision
to turn down water heater thermostats to save energy during the 1970s oil
crisis. The hotel operators could hardly be held accountable for "negligence"
in a case like this, where they had no possible way to anticipate a side
effect of a reasonable decision.

It should be the same with software -- if a developer writes code in good
faith and meets prevailing quality standards, he shouldn't be held to account
for an exploit that arises later.

If the developer worked in collusion with hackers, that would be different,
but it's not what's being discussed.

~~~
pseudonym
The key word there is "prevailing quality standards". If a developer writes
something, in good faith, that's vulnerable to SQL injection, or stores
passwords in plaintext on a server, is that negligence?

~~~
rprasad
Maybe; it depends. What does the app do? If it's just a photo-sharing
application, or a weekend project, then no. There's no expectation for the
former to be secure, or for the latter to be a full-fledged product that meets
standards.

On the other hand, if it is a file-sharing SaaS targeting small businesses,
the failure to handle SQL injection or to store passwords properly would be
negligence.

~~~
tptacek
I think it would be very easy to convince a jury under a straightforward
liability framework that "failure to handle SQL injection is negligent". Which
is unfortunate, because "failure to handle SQL injection" is by itself a
mostly meaningless statement. Most SQL Injection flaws are indeed very dumb,
very obvious bugs. But there are bugs that end up vectoring to SQL injection
that are not obvious at all.
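
One illustration of a less obvious class (not necessarily the bugs meant
above) is second-order injection, where every query touching raw user input
is parameterized, but a value read back out of the database is later trusted.
A minimal, hypothetical Python sketch:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (username TEXT)")

    # The signup path is properly parameterized, so the hostile name
    # is stored inertly; nothing bad has happened yet.
    conn.execute("INSERT INTO users VALUES (?)", ("' OR '1'='1",))
    conn.execute("INSERT INTO users VALUES ('admin')")

    def delete_account(username):
        # Second-order flaw: the username came back out of our own
        # database, so it feels trusted, and is spliced into the SQL.
        conn.execute("DELETE FROM users WHERE username = '%s'" % username)

    # Later, a cleanup job fetches a stored name and deletes that account:
    (stored_name,) = conn.execute(
        "SELECT username FROM users WHERE username != 'admin'").fetchone()
    delete_account(stored_name)

    # The hostile stored value widened the WHERE clause to every row:
    print(conn.execute("SELECT * FROM users").fetchall())  # []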

------
bradleyland
This article actually does a pretty good job of focusing on what really drives
lawsuits: negligence. It's easy to establish negligence in food safety,
because there is consensus on what represents "reasonable food safety"
procedures. This will be incredibly difficult for software because of the wide
variety in types of software.

If legislation passes, you can bet it will be bad. I hate to be cynical, but
I've yet to see a technology law that was well written. Large software vendors
will see this as an opportunity to hamper smaller software developers by
lobbying for ridiculous requirements that have little to do with _actual_
security and everything to do with codifying expensive processes that big
businesses already do.

~~~
rprasad
 _there is consensus on what represents "reasonable food safety" procedures._

There really isn't. Food safety laws vary from state to state, and from
country to country. The only real common ground is that refrigeration of food
is required.

 _This will be incredibly difficult for software because of the wide variety
in types of software._

Not really. There are more varieties of food than there are of software.
Facebook, Twitter, and Groupon are all really just variations of CRUD
applications for which we already have best practices.

 _Large software vendors will see this as an opportunity to hamper smaller
software developers by lobbying for ridiculous requirements that have little
to do with actual security and everything to do with codifying expensive
processes that big businesses already do._

Well, yes. Such is life. Once an industry starts to become mainstream, tort
liability starts to rear its ugly head. It is the third immutable law of life
(after death and taxes).

~~~
bradleyland
> There really isn't. Food safety laws vary from state to state, and from
> country to country. The only real common ground is that refrigeration of
> food is required.

There isn't consensus in the language of the law, but there is consensus in
the principles. Your example that refrigeration is required is an illustration
of that point. Rather than focus on the details of the law, we can recognize
that all food safety laws make some requirement related to the maintenance of
temperature. Taken further, this is based on the principle, established by the
available science, that food pathogens grow more rapidly in certain temperature
ranges. There are many principles similar to this, upon which food safety laws
are based, and these principles are the consensus.

The principle is what matters to this discussion, because that is what the
laws will be based upon. I would expect that "software safety" laws will
follow a similar tack. Even if I were to take the most cynical view possible,
it doesn't seem plausible that legislators would attempt to write software
safety laws for Java, C, C#.NET, Python, Ruby, Clojure, etc. Rather, they
would attempt to establish principles, upon which process would be built.

------
rogerbinns
No problem. I'll sell two versions of the software. The first will be as you
get today with no capability of suing me.

The second will be a lot more expensive, but you can sue. The accompanying
documentation will list exactly how you can use the product; anything outside
of that voids the warranty. It will list an exact certified environment you
can use it in (e.g., only certain Intel processors), specific versions of
operating systems, firewall configurations, and which other components may be
installed (any more void the warranty). Essentially, the product will be
useless.

The problem with the "suing the developers" approach is who pays. It will
always come at a cost including increased development times, more restrictive
usage modes, less functionality, insurance, lawyers, court time etc.
Ultimately people buying the software will have to pay for those, and it
doesn't really benefit them.

At the moment people can put their money wherever they want, and they can
choose the level of security, timeliness, access to code etc. I don't see the
problem with that.

------
lucisferre
Perhaps, but if there is liability on that code, you can be damn sure I'll be
charging lawyers' rates for writing it, and it's going to take at least twice
as long to produce.

In the end, most industries that carry liabilities like this are also
required to carry errors and omissions insurance or something similar. That
cost will end up having to be factored into the price of the software.

------
SeoxyS
If you expect 100% reliability from a piece of software, that ought to be in
your contract with the vendor. You ought to have a provision describing what
happens if the vendor fucked up. And the only time it is okay to sue is if the
vendor then does not abide by that contract and its described penalty for
less-than-100% performance.

The US has a sue-happy culture, which I believe is the cause of much
ridiculousness in life around here. Somebody finds a way to hurt himself that
nobody else has thought of, and sues, which leads to everybody patching that
ridiculous hole and making everybody jump through hoops to avoid future
lawsuits. Ad nauseam. (The same mentality caused the rule mandating we take
off our shoes at airports…)

------
geebee
Are there any other situations where a completely unlicensed trade would be
held to any standard other than caveat emptor?

Would regulating and standardizing software to the point where we could apply
a standard of competence and sue people over it even improve software
security?
If it did, would it be worth the potential chilling effect on innovation?

One thing I really can't stand about this: professional associations like the
AMA or ABA (which I consider to be cartel-like in many ways) do at least
enable their practitioners to stand up to clients and employers. There are
even laws around who is allowed to directly employ many licensed
professionals.

Sounds like some lawmakers would like to deny any professional stature to
developers other than the right to be sued, as individuals, for their work.

For the record, I'm generally opposed to creating a developers' cartel.
Anyone can read a book on PHP and hang out a shingle as a developer, which is
fine by me. But it would be particularly offensive to be subjected only to the
liabilities of professional recognition with none of the perks.

~~~
rprasad
_Sounds like some lawmakers would like to deny any professional stature to
developers other than the right to be sued, as individuals, for their work._

Actually, one of the major features of being a "profession" is that the
practitioner is held liable for the quality of their work (as measured against
their adherence to pre-established standards). This is why teaching, business,
and programming are not considered "professions": they lack clearly defined
standards for proper work conduct, and liability for their work product.

~~~
geebee
Here's the definition of malpractice from the free legal dictionary (IANAL, of
course)...

(<http://legal-dictionary.thefreedictionary.com/malpractice>)

"The failure to meet a standard of care or standard of conduct that is
recognized by a profession reaches the level of malpractice when a client or
patient is injured or damaged because of error."

...

"Negligence is conduct that falls below the legally established standard for
the protection of others against unreasonable risk of harm. Under negligence
law a person must violate a reasonable standard of care."

So you're right, it's not so much quality of work as failure to meet minimum
and pre-established standards of care. I suppose this might work with software
- not so much "your code was low quality" as "you clearly shipped one of the
OWASP top 10 security vulnerabilities." Perhaps completely failing to validate
input would be malpractice, whereas writing crappy code to do it wouldn't.
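
For illustration, that distinction is easy to draw in code. A minimal,
hypothetical Python sketch contrasting no validation at all with an
imperfect-but-present allowlist check (the pattern itself is illustrative):

    import re

    USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{1,32}$")

    def set_username_unvalidated(raw):
        # The hypothetical "malpractice" case: input accepted as-is.
        return raw

    def set_username_validated(raw):
        # An imperfect-but-present standard of care: allowlist check.
        if not USERNAME_RE.match(raw):
            raise ValueError("invalid username: %r" % raw)
        return raw

    print(set_username_validated("alice_99"))  # accepted
    try:
        set_username_validated("'; DROP TABLE users--")
    except ValueError as err:
        print("rejected:", err)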

But this is all by the wayside; it's not really related to my main point -
which is that I don't _think_ there's any precedent for holding a practitioner
liable for professional "malpractice" in the absence of a profession that sets
standards (and controls the right to practice).

And, as I said above, I tend to be very suspicious of professional
associations. I'm not saying there should be no regulation of who is allowed
to be a medical care provider, but I do think the AMA (along with many other
professional associations) shows extremely cartel-like behavior that can be
very damaging.

------
splamco
Of course developers should be held accountable for security flaws in their
code - right after we hold bankers accountable for blowing up the economy and
plundering the world.

------
sseveran
I do think that at this point, code shipped by people who consider themselves
professionals should be free of the common classes of problems (SQL injection,
buffer overflows, integer overflows, etc.). There is no real excuse for
shipping one in this day and age. I am less convinced that legal liability
would cause a meaningful decrease in vulnerabilities. I do think that, as in
healthcare, liability would cause less to get done by more people. Developers
would evolve techniques for passing the buck on down to someone else.
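
As a hedged illustration of the overflow class: in a memory-safe language the
overflow itself can't happen, but the analogous parser mistake, trusting an
attacker-controlled length field, is easy to write. A minimal Python sketch
with an invented wire format:

    import struct

    def parse_message(buf):
        # Invented format: 4-byte big-endian length, then the payload.
        if len(buf) < 4:
            raise ValueError("truncated header")
        (claimed_len,) = struct.unpack(">I", buf[:4])
        payload = buf[4:4 + claimed_len]
        # The check a careless C parser omits before copying
        # claimed_len bytes into a fixed-size buffer:
        if len(payload) != claimed_len:
            raise ValueError("length field exceeds actual payload")
        return payload

    print(parse_message(b"\x00\x00\x00\x05hello"))  # b'hello'
    # A claimed length of 0xFFFFFFFF raises instead of over-reading:
    # parse_message(b"\xff\xff\xff\xffhello")  -> ValueError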

If legislation were passed, the big boys (Microsoft, Google, Oracle, etc.)
would unintentionally shape it so that startups would bear most of the
liability for not having "adequate security procedures" like an in-house
security team, certain tools, etc. And that would be bad in the long run.

------
jmags
If one would have to be an idiot to write server software (because one would
be liable for things beyond her or his control), only idiots will write
servers. If this sounds glib, it's only because the suggested regulations here
are so utterly insane.

------
rasur
<devils advocate>Yes - it will encourage them to be more diligent in their
work.</devils advocate>

~~~
s_henry_paulson
<da>It would discourage anyone from writing a library or component that could
be used in other software</da>

~~~
rprasad
The same way that people are discouraged from making parts for cars, or
ingredients for foodstuffs, or...

What silly argument were we making again?

------
grabeh
In the case of consumers, certain clauses restricting consumer remedies would
be unlawful due to legislation protecting consumers in any event.

This includes clauses removing implied terms, like the implied duty to
exercise reasonable care and skill in the provision of a service. A consumer
could argue that a developer failed in this duty by providing software with an
avoidable defect. As the article identifies, the problem is that it would be
difficult to identify what is truly avoidable. Due to the complexity of
software and the ingenuity of hackers (for want of a better word), certain
flaws will always be present. There is also the issue of contributory factors,
like the user's own failure to install updates.

In reality, I think the main barrier to consumers bringing an action would be
the cost of litigation, not necessarily the terms of a EULA.

------
givan
Should architects be sued when burglars get into your house?

~~~
bradleyland
If one could show that:

A) There was a reasonable expectation that the architect was responsible for
the security of the home under a design agreement.

and

B) That the architect was negligent in their duty to provide said security.

Then yes, they could be sued.

You're focusing on the wrong aspect of the argument. You can already be sued
for any number of things related to failing to meet an agreement.

The core question here is not whether software developers should be liable for
security issues arising from their software, but whether or not software
companies should be able to disclaim liability in such broad ways in their
EULA language.

------
dllthomas
There is a whole lot about a specific deployment that generally isn't known
during development (unless the work is contracted for that specific task) but
that is highly relevant to both liability and threats. If I build a
tic-tac-toe app, someone uses it to land 747s, and terrorists turn it into a
fireball using an obscure timing attack, that's not my fault. Liability should
rest with whoever deployed the system, and if they're not comfortable with
that, they can pay to have the developers or publishers (or insurance
companies) adopt more of it.

The real problem is that EULAs are impenetrable walls of legalese, half of
which is unenforceable in any given jurisdiction, and so they go unread, and
so there's no pressure on companies to do anything but waive everything they
can get away with waiving.

------
kosmogo
If a restaurant poisons you, you sue the restaurant owner, and the owner can
sue the cook; but I'm not sure you can, or would, sue the cook directly, since
that would suppose you know who cooked your meal, I guess.

Also, I believe that if a new branch of the software industry developed in
which every library, OS, patch, update, firmware, and framework were certified
as "developed without any negligence, for any use," only the defense and
nuclear industries would be able to afford it. Actually, I guess they already
did that for their own use.

So the wish to have software certified free of negligence (for whatever use
the customer puts it to) is probably too expensive, and customers would just
keep buying what they buy today if given the choice.

------
sslayer
So to sidestep this issue, a developer could ship a product configured to be
completely locked down, with little to no access to the network, filesystem,
or kernel, and state that it is only warranted secure in the "recommended"
configuration. Any configuration opening up access would be at the user's own
risk.
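
A hedged sketch of what such a default-deny "recommended configuration" might
look like; the option names and warning text here are entirely hypothetical:

    import warnings

    # Everything is off until the user explicitly opens it up.
    RECOMMENDED = {
        "network_access": False,
        "filesystem_access": False,
        "load_native_extensions": False,
    }

    def make_config(**overrides):
        config = dict(RECOMMENDED)
        for key, value in overrides.items():
            if key not in RECOMMENDED:
                raise KeyError("unknown option: %s" % key)
            if value:
                # Opening access leaves the vendor-warranted setup.
                warnings.warn("enabling %r is at the user's own risk" % key)
            config[key] = value
        return config

    safe = make_config()                      # fully locked down
    risky = make_config(network_access=True)  # warns: at your own risk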

------
benmmurphy
It's a really bad idea unless they properly grandfather it. People have sold
software under the assumption of limited liability, and if they are now
suddenly held liable, it could financially ruin some companies, or at least
end up being a massive transfer of wealth from software companies to
consumers.

------
zhouyisu
Software would need to be sold at 1,000x to 10,000x its current price to
absorb the cost of a law like this.

------
DenisM
Yes, but only if you want all software development to move offshore.

------
rprasad
Yes. Lawyers need work.

But joking aside, developers do need to be held responsible for their work
product. Every other industry that makes things is held responsible for the
safety/workability of its output, and many of those products (e.g., cars and
planes) are far more complicated than software. [There are some narrow
exceptions, e.g., vaccines, where the industry is shielded from liability
because alternative mechanisms provide for consumer redress.]

This does not mean strict liability (i.e., liability without actual fault),
but it does mean that negligence should be on the table.

What would negligence liability require? Merely that developers follow best
practices regarding their product. Best practices already exist for the most
common types of projects; where none exist, following standard conventions is
usually sufficient.

Also, users can waive the developer's liability for simple negligence (but
generally not for gross negligence or intentional misconduct), so this is
unlikely to impact most developers.

~~~
SkyMarshal
Cars and planes aren't under constant bombardment from invisible human
adversaries all over the world. They're required to withstand failure in the
face of known natural conditions and flawed workmanship, but not constant
attempts by human adversaries to break them.

And an example that proves the rule: we don't sue car manufacturers when
their lock system fails to prevent a thief from stealing the radio or the car.

And, should we hold military vehicle and fighter aircraft manufacturers
legally responsible for damage to their products by enemies shooting at them?

If a delivered software product doesn't meet a contractually agreed-upon
spec, then there is legal recourse for that. Maybe software contracts should
specify exactly what penetration attempts the software has been designed to
withstand, in very specific, not general, terms.

