

The one sure way to advance software engineering - edw519
http://bertrandmeyer.com/2009/08/21/the-one-sure-way-to-advance-software-engineering/

======
dkarl
_But if anyone is looking for one practical, low-tech idea that has an iron-
clad guarantee of improving software engineering, here it is: pass a law that
requires extensive professional analysis of any large software failure._

This is his conclusion, and it's bullshit. Most software engineering failures
result from the violation of known principles; basically, calculated risk-
taking in a cost-sensitive industry where lives (usually) aren't at stake.
Accumulating more and more engineering knowledge won't change that.

Really, just open your ears on any software engineering project and you'll
hear that "accepted principles" are routinely ignored. A month ago I heard
someone say, quite upset, "We're supposed to be in testing right now, but
we're still in development!" because QA uncovered a bunch of bugs that needed
to be fixed by developers. Clearly that person sees testing as a joke, a pious
fraud, that isn't actually supposed to catch any bugs. Usually most bugs
missed in development are released to production and fixed later. Catching
more bugs than usual in QA, thanks to an enthusiastic and zealous QA engineer,
was a blow to the schedule and will probably result in the project being
perceived less favorably than if QA had done a sloppy job. Bertrand Meyer is
kidding himself if he thinks engineering knowledge is the cure for this kind
of disease.

EDIT: Stepping back, I guess the cost and embarrassment of a mandatory
investigation would serve as a deterrent. But the knowledge gleaned from such
investigations would be depressingly mundane.

~~~
fauigerzigerk
You're right, but apart from health- and safety-related cases, quality is an
economic issue, and QA has to be balanced against other economic factors.

------
gabrielroth
I was right there with him until the words 'pass a law'. I'm hardly a
libertarian, but it seems pretty clear even to me that any such law would (1)
raise serious and legitimate privacy concerns, and (2) be rife with unintended
consequences.

I could imagine such a law applying only to software projects funded with
public money, though. There are enough of those to advance the state of
knowledge a fair amount....

~~~
PotatoEngineer
The tricky bit is that once you have a law in place for public projects, the
people who pushed for that first law will probably try to extend it into the
private sector. So even having that first law (which is a good one!) paves the
way for more invasive laws later.

~~~
timwiseman
It does pose something of a "slippery slope" problem, but there are ways of
handling that.

One step in a direction does not always lead to the next, especially if there
are definite plans at the beginning to avoid it.

~~~
gloob
_One step in a direction does not always lead to the next_

Perhaps, but choreographing 50 million people (plus or minus an arbitrary
number, depending on your country) is pretty tough, even on a good day.

~~~
timwiseman
True, and quite well phrased.

Still, there are ways of dealing with slippery slopes. Knowing that something
is a slippery slope means "approach with caution" not "stop".

------
tilly
He's right that the period vs comma is likely wrong. But it is a legend based
on a real event.

For details see <http://en.wikipedia.org/wiki/Mariner_1> or
<http://catless.ncl.ac.uk/Risks/5.65.html#subj1> for more details. Here is a
summary. The Mariner 1 probe was destroyed because it veered off course
shortly after launch on July 22, 1963. By July 28 the NY Times was reporting
that the cause was due to a missing hyphen. NASA submitted a report to
Congress the next year that confirms this.

The comma-vs-period version likely arose from misremembering which typo
caused the problem, or possibly from confusing it with another incident.
(There are anecdotal reports of such a bug being caught in Project Mercury;
whether they are true is a different story.)
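For the curious, the canonical form of the legend is a one-character FORTRAN
typo. A rough modern analogue (my illustration only, not the actual flight
code) in Python:

  # In the legend, FORTRAN parsed "DO 10 I=1.10" as the assignment
  # "DO10I = 1.10" rather than the loop "DO 10 I=1,10", because blanks
  # in identifiers were insignificant. An analogous silent one-character
  # change in Python:
  bounds = 1, 10   # comma: the tuple (1, 10) -- the intended loop endpoints
  bounds = 1.10    # period: the float 1.1 -- equally legal, silently different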

------
10ren
Planes use software.

After a plane crash, I'm pretty sure that software errors would be
investigated along with everything else. How has that worked out for software?

~~~
mynameishere
I was going to say that. Further, the comparison between physical systems and
software is mostly bogus. When software screws up it often _tells you_ what's
wrong. Little investigation is ever needed. If the epoxy on an airplane is too
weak, it doesn't log heap dumps and stack traces.
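A toy sketch of that point (a hypothetical example, not anything from the
thread): an uncaught error in a program reports its own failure site.

  import traceback

  def parse_config(text):
      # naive "key=value" parser, invented purely for illustration
      return dict(line.split("=") for line in text.splitlines())

  try:
      parse_config("port=8080\nbadline")   # malformed input
  except ValueError:
      # prints file, line number, and call chain -- the "investigation
      # report" comes for free
      traceback.print_exc()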

------
gaius
Consider the Eurofighter Typhoon. The RAF issued the original specs for it in
1972 (probably before anyone who will actually fly one "for real" was born).
The RAF is only just now taking delivery of them. Can you even imagine
building a web app on those timescales?

~~~
spitfire
The lesson of the Eurofighter isn't to avoid software engineering; it's not
to develop software with a multi-supplier, multi-national bureaucracy.

Even then, the Eurofighter turned out to be more capable than both the F-22
and the F-35.

------
billswift
There was something I read in the past year about the conflict between
recovery and forensics: preserving the evidence for an investigation prevents
a quick and smooth recovery, and conversely, quickly recovering from a problem
destroys evidence. So real-world systems that have to keep working either need
a complete backup system (ridiculously expensive) or they need to stay down
until the investigation is completed (which could be even worse, especially if
a gov't agency with a stronger sense of CYA than urgency is doing it).
Investigating airplane crashes doesn't have this problem, since the plane
isn't going anywhere. I've spent the last ten minutes googling for the source
and can't find it.

------
Goladus
This will not work the same way with software. The scope is far too large and
dynamic: software is much more diverse and complex, and it advances much more
rapidly.

Airplanes have been solving the same basic problem for decades, getting
incrementally better at it year by year. A flight simulator running on a
desktop computer is one piece of software that could improve the same way.
Meanwhile, new software solving new problems is released all the time.
Designing laws to account for that is difficult and very dangerous.

The FAA model might work well for specific software systems but is no silver
bullet.

------
edw519
The aerospace-IT analogy is an interesting one. I have done programming in
aerospace environments and am always amazed that every single screw, nut, and
bolt must be redundantly inspected, QA'ed, and certified, while "QA" as it
applies to software checks source-code indenting and not much else. We have a
long way to go before application software is as bulletproof as flying.

~~~
spitfire
<http://libre.adacore.com/libre/>

Not really that far, actually. The tools are already there and available to
use; we just have to build a culture of quality.

Which is the real problem. The technology industry has a culture of cowboys
and constantly forging ahead rather than sober introspection (masturbatory
blog posts aside).
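As a concrete (and admittedly hypothetical) illustration of "the tools are
already there": the contract-style properties that SPARK proves statically
can at least be approximated at runtime with assertions in a mainstream
language:

  # Runtime contract checks -- a weak stand-in for what SPARK/Ada can
  # verify statically, sketched here with an invented withdraw() example.
  def withdraw(balance: int, amount: int) -> int:
      assert amount >= 0, "precondition: amount must be non-negative"
      assert amount <= balance, "precondition: insufficient funds"
      new_balance = balance - amount
      assert 0 <= new_balance <= balance, "postcondition violated"
      return new_balance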

My background in aviation has had a huge effect on my life in technology.
Every pilot has been in a "situation" at some point in their flying career,
some more than once. When we do our hangar flying, we swap stories about how
to safely get down every time. This just doesn't happen in technology. It
should.

~~~
miloshh
A culture of quality has a significant cost. That cost is justified if lives
are at stake, but that's pretty much the only case where it is. Grocery stores
don't independently check each apple three times for rot, either. :)

~~~
spitfire
Implicit in what you're saying is that you're willing to suffer the costs if
your software fails.

For me, if my bank's software decides to munge some data, that's not cool. If
my $2K laptop crashes, that's not cool.

More importantly, a culture of quality mainly has up-front costs; after that,
maintaining it is a small marginal cost with significant savings from reduced
defects.

~~~
nostrademons
Some software industries - aviation and healthcare particularly, but also to
a lesser extent finance - _are_ regulated, and have strong cultures of
quality. And it works: how often are plane crashes due to "pilot error" vs.
"software error"?

For most things, though, that's just plain unnecessary. If my free web browser
crashes - well, I'm a little annoyed, but I suck it up and restart the
program. And I'd much rather use that free browser that crashes than wait
until they can make it perfect and crash-free.

There's an opportunity cost to everything. Time spent making sure your program
_never_ crashes is time not spent on solving new problems, and it's likely
those new problems are a lot more pressing than "I need to reboot Firefox
every 2-3 days because it has such horrible memory leaks."

~~~
spitfire
I am not at all sympathetic to your world view. You're advocating that we
make bad software on purpose. Not only that, but you're advocating that we
push the burden of defects onto users as an externality.

This to me is unprofessional. We have the technology and skills to build
reliable software, without significant additional cost. To do anything else
should be criminal.

I'll use an analogy. You wouldn't date a fat, ugly, dull girl if you knew with
even the most trivial effort you could date an attractive girl who's great at
conversation and loves to discuss René Magritte and Douglas Hofstadter, would
you? No. So why accept the same in software?

EDIT: I should note that here in B.C., Canada, "Software Engineer" is a
certified title. You may not legally call yourself a software engineer unless
you are licensed.

<http://www.apeg.bc.ca/>

~~~
nostrademons
Eh, you don't have to be sympathetic to my worldview. You almost certainly use
software I've written anyway. ;-)

I'd strongly disagree that building reliable software can be done without
significant additional cost. I spend perhaps 2/3 of my time writing tests, and
then development time (on top of that) is probably doubled by code reviews.
The benefit of this is that I have enough confidence in my (and others') code
that I can freely improve it without worrying too much about breaking things.
The downside is that I've spent 6x as long to implement a feature as I
otherwise would've. That's a cost. I pay it because a.) I have to and b.) the
business importance of the code I write really does justify it.
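(A back-of-envelope check of that 6x figure, sketched with assumed numbers:)

  # If tests take ~2/3 of pre-review time, 1 unit of feature code implies
  # 2 units of tests; code review then roughly doubles the total.
  feature = 1.0
  tests = 2.0 * feature            # tests: ~2/3 of pre-review effort
  pre_review = feature + tests     # 3.0
  total = pre_review * 2           # review doubles it
  print(total / feature)           # 6.0 -> "6x as long"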

I would not pay that cost if I were hacking up a startup with zero users and
iterating until I had a useful product. And I don't pay the cost when I'm
prototyping new ideas within my employer - most of my work this week has been
straight hackety-hack with no tests, because I'm doing a demo for something
new.

"You wouldn't date a fat, ugly, dull girl if you knew with even the most
trivial effort you could date an attractive girl who's great at conversation
and loves to discuss René Magritte and Douglas Hofstadter, would you?"

Haven't you heard of "Smart. Pretty. Nice. Pick two"? You generally _can't_
have everything - not unless you happen to be a debonair billionaire with
Brad Pitt's body. (The fact that such a person doesn't exist might be further
evidence that you can't have everything...) Most people date a girl who has
the qualities that they care about, and they don't worry so much if she's not
perfect in other ways.

------
psyklic
Good point; however, the basic construction of airplanes has remained the
same since 1967. Software, unfortunately, changes constantly. Software
engineers have become very talented at what does stay constant (just as
aerospace engineers have) -- e.g. languages -- but it is likely the
always-evolving libraries and paradigms that cause most problems.

