
Would You Trust Your Life To Your Code? - fogus
http://www.basilv.com/psd/blog/2009/would-you-trust-your-life-to-your-code
======
rm-rf
The author brings up a fundamental difference between traditional engineering
and software engineering. In many fields, engineers sign on the dotted line
and assume professional and financial liability for the correctness of their
design. They have to think about things like warranty repairs, recalls,
product liability lawsuits and dead citizens. They tend to design very
conservatively.

How many software engineers are willing to make their careers dependent on the
correctness of their code?

There probably are some who are. Unfortunately, I've never met them or had the
privilege of hosting their applications.

~~~
wallflower
> How many software engineers are willing to make their careers dependent on
> the correctness of their code?

Proving program correctness is a very involved, expensive, and rigorous
process. That being said, comprehensive unit tests are a good investment in
demonstrating that code works as it should. Software cannot be engineered like
a bridge.

<http://en.wikipedia.org/wiki/Formal_methods>
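To make the point concrete, here is a minimal sketch of what such a
contract-pinning test suite looks like. The clamp() helper and its contract
are invented for illustration, not taken from the thread:

```python
import unittest

# Hypothetical module under test: a clamp() helper with a small,
# explicitly stated contract.
def clamp(value, lo, hi):
    """Return value limited to the closed interval [lo, hi]."""
    if lo > hi:
        raise ValueError("lo must not exceed hi")
    return max(lo, min(value, hi))

class ClampContract(unittest.TestCase):
    # Each test documents one clause of the contract.
    def test_within_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range_is_raised_to_lo(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range_is_lowered_to_hi(self):
        self.assertEqual(clamp(42, 0, 10), 10)

    def test_inverted_bounds_are_rejected(self):
        with self.assertRaises(ValueError):
            clamp(5, 10, 0)

if __name__ == "__main__":
    unittest.main(exit=False)
```

None of this proves the code correct in the formal-methods sense; it only
makes the contract explicit and mechanically checkable.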

~~~
mfukar
So, do you believe that proving the structural integrity of an engineer's work
is less expensive or difficult?

No, structures are simply built according to several well-established,
well-studied principles, principles that software engineers not only lack but
are not interested in pursuing.

~~~
davidw
People have been building bridges for a lot, lot longer than they have been
building software.

Also, you can go on all you want about how to engineer quality stuff, but if
you ignore the economics of the situation, you'll end up pricing yourself out
of the market by an order of magnitude in many fields. For instance, the web
or mobile phone stuff I have been working on lately: it's important that it
mostly works, and is reasonably priced. The cost of making sure it never, ever
has any downtime or ever fails would simply not be worth it to my customers
and clients.

~~~
wallflower
> if you ignore the economics of the situation

The classic software development triangle.

"The iron triangle refers to the concept that of the _three_ critical factors
_scope_ , _cost_ , and _time_ at least one must vary otherwise the quality of
the work suffers. Nobody wants a poor quality system, otherwise why build it?
Therefore the implication is that at least one of the three vertexes must be
allowed to vary. The problem is that when you try to define the exact level of
quality, the exact cost, the exact schedule, and the exact scope to be
delivered you virtually guarantee failure because there is no room for a
project team to maneuver.

Software development projects often fail because the organization sets
unrealistic goals for the "iron triangle" of software development:

    
    
  * Scope (what must be built)
  * Schedule (when it must be built by)
  * Resources (how much it must cost)"

<http://www.ambysoft.com/essays/brokenTriangle.html>

~~~
davidw
Or the quick version: "fast, cheap, good: pick two".

------
PeterWilson
You hear this BS from time to time. Usually from some organization, or
individual, that wants to make computing a "Profession". They usually neglect
to mention the thousands of bridges, buildings, and other structures that
collapse every year, or the occasional city that drowns. How many automobile
or baby cot recalls were there this year? I suspect that Engineering is no
more reliable than computing.

~~~
wallflower
I studied civil engineering in school. All of my civil engineering friends are
now licensed professional engineers (most, if not, all of them structural -
designing and inspecting bridges).

What that means, besides having a really nice seal embosser, is that when you
sign off on a design (embossing it with your professional seal), you take
responsibility for that design. You are professionally liable for the failure
of your design.

As a side note, to take the Professional Engineering exam in certain U.S.
states, you have to submit a stack of your actual engineering work notes,
roughly 4" thick, from your 3 or 4 years of professional work experience (a
prerequisite to sit for the exam). The year you sit for the P.E. exam, you
pretty much study for it like a part-time job (companies support you because
they know how important it is).

It is not just about reliability, it is about ethics (the Citibank building
case is studied) and a sense of professional responsibility. Real professions
are very regulated and self-regulating (belonging to a tribe) at the same
time.

When I heard my sister was dating a programmer (my future brother-in-law), I
had her ask him if he thought software could be engineered. And he gave me the
correct response. Having been trained in structural engineering, I hate...
really, really _abhor_ the term software engineer; I prefer software
developer.

~~~
allenp
So I have to ask then (and this is really in the spirit of discussion not
trolling) - is it that software cannot currently be engineered, or is it that
software can never be engineered?

I ask because I feel like there are examples of "bridge-worthy" software,
mostly out of NASA, where the code tightly fits the hardware and the defect
rate is on the order of one bug per million lines of code.

~~~
wallflower
If the scope is very narrow, software can be engineered to be correct.
However, I think it is better to work towards organic, self-healing software
systems like Google's (which leaves failed cluster nodes in place rather than
bothering to find and remove them).

Another reason software cannot be engineered in a cost-effective manner is
that most engineering rests on inviolable physical rules, like the
gravitational constant g. Those rules are what make it possible to develop
software (irony, yes), such as FEA packages like ANSYS, that helps engineers
engineer correctly (there is always a "factor of safety"). My hat is off to
legends like John Roebling, who designed the Brooklyn Bridge before computer
aid.

The pain and cost of engineering software to structural engineering standards
(ISO 9000 is a joke; NASA has its own internal standards) usually exceeds the
lifetime benefit of correctly engineering that software. A good rule of thumb
is to ask whether someone's life will be jeopardized if the software fails:
dialysis machines, automotive control systems, nuclear control rods, 747s,
spaceships (which makes me wonder, actually, how "correct" Burt Rutan's X
Prize spaceship software may be).

~~~
hxa7241
> most engineering is based on inviolable physical rules

Don't rules such as those of algorithmic complexity have the same status and
position? They are as objectively certain -- more so, in fact, since they are
purely logical. At its base, software is built on bits and operations, which
are entirely determinate. The ramifications might not currently be fully
understood or exploited (software is only about 50 years old!), but the
potential is there.

As an example: with CPUs like the Z80 it was possible to count clock cycles of
instructions, and determine upper/lower/average running times of code. That
would be equivalent to determining resultant forces across a whole physical
structure. Today, CPU makers have made things difficult, but that is
completely contingent: it would in principle be possible to predict
performance, in an engineering fashion, for many purposes.
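For instance, a sketch of that style of reasoning. The DJNZ and LD timings
below are the documented Z80 T-state counts; the loop itself and the 4 MHz
clock are illustrative assumptions:

```python
# Worst-case timing of a classic Z80 delay loop, computed from the
# documented T-state counts rather than measured:
#
#         LD  B, n     ; 7 T-states
#   loop: DJNZ loop    ; 13 T-states when the branch is taken,
#                      ;  8 T-states on the final, not-taken pass
T_LD_B_N = 7
T_DJNZ_TAKEN = 13
T_DJNZ_NOT_TAKEN = 8

def delay_loop_tstates(n):
    """Exact T-state count for LD B,n followed by a DJNZ-to-itself loop."""
    return T_LD_B_N + (n - 1) * T_DJNZ_TAKEN + T_DJNZ_NOT_TAKEN

def seconds(tstates, clock_hz=4_000_000):
    """Wall-clock time at a given clock rate (4 MHz was typical)."""
    return tstates / clock_hz

cycles = delay_loop_tstates(10)   # 7 + 9*13 + 8 = 132 T-states
print(cycles, seconds(cycles))
```

The result is a hard bound derived from the datasheet, the software analogue
of resolving forces across a structure.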

~~~
wallflower
In the homework we did for engineering classes, we could make simplifications
without fundamentally altering the problem. For example, a continuous load on
a structure can be modeled as a single force vector, because g, the
gravitational acceleration, always acts downwards. You can't do that in
software with loosely-coupled, autonomous modules. These modules do not obey
anything beyond an implicit contract that can be broken easily; unit tests
only help to clarify the contract for a module or between modules.

In structural engineering there are a lot of unknowns, but they can be modeled
across the entire range because there are many known constants: the
load-bearing capacity of the beam, the moment (the twisting force acting on
the beam). Virtual abstractions like software do not obey physical rules,
because they are virtual. You can create a Castle in the Air in your program
world if you like. Hard physical reality has real constraints; virtual reality
has flexible constraints. Physical reality is one of the reasons building
construction is a solved problem. The irony is that, with sophisticated
software, architects like S. Calatrava and F. Gehry can begin to create
almost-Castles in the Air.
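The simplification described above can be made concrete in a few lines; the 2
kN/m load and 10 m span below are invented numbers for illustration:

```python
# A uniformly distributed load w (N/m) over span L (m) is equivalent,
# for support-reaction purposes, to one resultant force w*L acting at
# the midpoint L/2, precisely because gravity's direction is constant.
def resultant_of_uniform_load(w, length):
    """Return (magnitude in N, position of the line of action in m)."""
    return w * length, length / 2.0

# Example: 2 kN/m over a 10 m simply supported beam.
force, position = resultant_of_uniform_load(2000.0, 10.0)

# By symmetry, each support then carries half the resultant.
reaction_each = force / 2.0
print(force, position, reaction_each)
```

Nothing in a loosely-coupled software system hands you a constant like g that
licenses this kind of collapse of a continuous problem into a single number.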

------
tryke
Technically, yes, but I imagine it's very simple code. Most skydiving rigs
these days have an Automatic Activation Device (such as the CYPRES:
<http://www.cypres-2.com/>). You're not supposed to rely on it, but it can
save your life if you're knocked unconscious during freefall.

The thing is run by a small microcontroller that monitors your altitude and
descent rate. If your altitude is less than 750 feet and you're still in
freefall, it pulls the chute for you.
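That firing rule can be sketched in a few lines; the descent-rate threshold
below is an invented placeholder, not the device's real calibration:

```python
# Hypothetical sketch of an AAD firing rule, following the description
# above: fire if below the activation altitude while still in freefall.
# Thresholds are placeholders, not the CYPRES's actual values.
ACTIVATION_ALTITUDE_FT = 750
FREEFALL_SPEED_FTPS = 115   # placeholder: well above canopy descent rates

def should_fire(altitude_ft, descent_rate_ftps):
    """Return True when the reserve should be deployed."""
    in_freefall = descent_rate_ftps >= FREEFALL_SPEED_FTPS
    return altitude_ft < ACTIVATION_ALTITUDE_FT and in_freefall

# Unconscious jumper at 600 ft, falling ~176 ft/s (terminal velocity): fire.
print(should_fire(600, 176))   # True
# Open canopy at 600 ft, descending ~20 ft/s: do not fire.
print(should_fire(600, 20))    # False
```

The real device adds barometric filtering and calibration, but the decision
logic itself really can be this small.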

Now, would I trust my life to code running on a desktop OS? Hell no! I've seen
too many CVEs for that to happen.

------
geebee
No, I wouldn't trust my life to my code, because I don't need to. Quality
matters, and defects can have serious financial consequences (losing data, for
instance, can have a serious and measurable impact). But in the trade-off between
innovation and risk, I should probably position myself fairly aggressively.
Like most programmers, I'm in a position where it's probably better for me to
make and recover from mistakes than to avoid mistakes altogether.

This is why I think the "software is like building a bridge" analogy is so
inappropriate. If you're going to hold software accountable for its relative
lack of reliability, you should also acknowledge that the innovation from this
field has been astounding.

Would licensing help things? Would the space shuttle control system software
become very reliable if only those software "engineers" were licensed? I
seriously, seriously doubt it.

But I do think that "software engineers" would slowly succeed in choking off a
competitive and free environment. In a nightmare scenario, you'd have to major
in "software engineering" instead of math or physics to be legally allowed to
write code, self study would be banned, and something like Ruby on Rails would
be illegal because the people who wrote the EJB specs don't like it.

Look at the activities of the ABA or AMA. That nightmare scenario isn't as
impossible as you might think. And I'm pretty sure that it would hurt
innovation in software severely, in exchange for a safety and security that 1)
it wouldn't deliver anyway, and 2) we don't need in the first place.

------
mseebach
> So let me ask a related question. Would you trust your financial assets to
> your code? How much would you wager that your code is correct?

Anyone who runs a business that's based on code, whether as an ASP or a
consultant, does this on a daily basis. I trust my code to create value for my
client, so my client will pay me, so I can pay my rent.

------
mburney
IMO one of the great things about exploratory programming is that you don't
have to trust your life to your code. It may annoy a lot of users, but the
trade-off (being able to explore and experiment) is worth it.

------
dschobel
Funny story (and by funny I mean terrifying).

My graduate advisor did some consulting for one of the major airplane
manufacturers. He said looking at their systems code put him off flying for
six months.

------
RiderOfGiraffes
I am reminded of this story: <http://catless.ncl.ac.uk/Risks/17.06.html#subj7>

------
edw519
Interesting question. I was just thinking about this yesterday, so I'm glad
you brought it up...

 _Business software, in particular, is often financially-critical (failure of
the software leads to loss of money) rather than life-critical._

This may have been true at one time, but not any more. As a business
programmer, I've worked on quite a few things where there is much more at
stake than just money. Just a few of them:

    
    
      - distribution of mission critical airline parts with linked certifications
      - scheduling & routing of ambulances and firetrucks
      - scheduling & routing of trucks carrying time-sensitive medical supplies
      - clean-room quality control of medical devices
      - distribution of pharmaceutical formularies
      - medical claims processing & adjudication
      - formulas & recipes for large batch food processing
      - medical demographic databases of allergies
      - certification of automotive safety devices, including airbags
      - building contractor specifications, including electrical & plumbing
      - clinic scheduling
    

Just because something won't hurt you immediately doesn't mean that it can't
hurt you _eventually_. You can see from my examples that so much of what we
program affects the welfare of many, even if indirectly.

We really have reached the point where software QA is just as important as
engineering QA. We programmers aren't the only link in the chain, but we are
an important one.

I have looked at horrendous enterprise code that supported critical health and
safety issues and thought, "Do you really want to get on that plane?" or "Are
you sure you want to take that pill?" (Hopefully QA catches most of the
potential culprits.)

Thanks for getting us to think about it a little more. This sort of thing
should always be on any good developer's mind.

