

Edsger W. Dijkstra: Answers to questions from students of Software Engineering - RiderOfGiraffes
http://www.cs.utexas.edu/~EWD/transcriptions/EWD13xx/EWD1305.html

======
10ren
I love Dijkstra. I share his valuing of things that make sense, and the dream
that all things can be understood - and his humble acceptance that we have to
break down problems to fit inside his, my, and your human brains.

But you can see where Alan Kay's "nano-Dijkstras" joke comes from, e.g. "the
horrendous mess produced by a group of incompetent, unorganized programmers",
referring to your complex project.

Industry - and you - and I - have accepted that things don't always work
perfectly. Industry pays people to fix things, rather than make them perfect
in the first place. Partly it's an acceptance of human error; but it's also
an acceptance that you'll need to change the software even if it's perfect,
because the world keeps changing. "Iterative development" elevates this to a
dogma.

Here's a thought experiment: imagine you need a library (e.g. for network
protocols; or a fancy GUI; or to parse EDI transactions). You have a choice
of two libraries. Both have been used extensively by others, and both work
well when you try them out. One is proven (in Dijkstra's sense) and one is
not.

1. How much _more_ will you pay for the proven one? 10%? x2? x10? x100?

2. Now consider how much extra work was required to rigorously build the
proven one. (Of course, if the builder of the component also had access to
proven sub-components, this would be easier.)

[You can imagine doing this by building it yourself if you like - using
dollars just makes it clearer.]

~~~
hvs
I'm not sure why you got downvoted for this as I think it is a fair
assessment. I think it assumes that the current state of development will
continue on its current path, though. The increasing complexity of
applications demands that we come up with better models for development that
allow us to think about problems more abstractly. One could make the argument
that this is only truly possible with formally verified applications.

~~~
10ren
Historically, we SOTSOG (Stand On The Shoulders Of Giants) - using
abstractions, so complexity is pushed down to a lower layer (language,
libraries, OS). I think this will continue to be how we cope with complexity
- one specific case at a time, rather than via a new general way to address
all complexity issues (which sounds like magic).

If the rate of complexity increase itself increases (i.e. accelerates), then
we might need a different methodology altogether, such as the formal
verification you suggest. However, we have had the need for better ways to
handle complexity since before Fred Brooks's Mythical Man-Month (the IBM
System/360), and we've had the tool of formal verification techniques for a
long time. It hasn't happened yet, suggesting there is some problem with it.

In my admittedly very limited experience of formal verification, it makes
problems _more_ complex, rather than less (this is why Dijkstra says that you
can't prove software the way it is currently written - you have to write it in
a simpler way, with clearer interactions between modules and so on). Its
benefit is that you know it's 100% correct (assuming your assumptions are
correct, your understanding of the real-world problem and of the interactions
with other components is correct, your proof is correct, and any
proof-automation tools you use are also 100% correct). OK: it gives you a
higher probability that it is correct.
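
As an aside, here is a minimal sketch of what a machine-checked proof looks
like - in Lean 4, my own choice of illustration, not a tool anyone here has
named. Even this one-liner rides on the trust chain above: the statement has
to be the property you actually care about, and the checker itself has to be
correct.

    -- Lean 4: a machine-checked proof that addition on naturals commutes.
    -- The kernel checks the proof, but you still have to trust (a) that
    -- the statement matches what you meant, and (b) the checker itself.
    theorem add_commutes (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b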

Ah, this looks like the opposite cause and effect from your suggestion (that
formal methods are a way to handle complexity), and instead that trying to
formally verify applications _forces_ us to write them in a simpler way, which
is... simpler. Or, we could just write them in a simpler way, and not worry
about the formal verification part. It still leaves the problem: what is this
simpler way of writing apps? Isn't the only way to do this by
dividing it into abstractions with well-defined behaviours and clear
interfaces? In other words: _Historically, we SOTSOG - using abstractions, so
complexity is pushed down to a lower layer (language, libraries, OS)._

PS: I would love it if there was a new general way to approach complexity. But
as Fred said, "no silver bullet".

 _"better models for development that allow us to think about problems more
abstractly"_

That's possible, I guess; but I don't think we lack the ability to think
about problems abstractly. It's finding the _right_ abstraction that is
tricky, and this amounts to a search through the space of abstractions. That
sounds straightforward to automate, except that the search space increases
exponentially with the complexity of the problem. So we need a heuristic to
guide that search.
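
To sketch what I mean (everything named here is hypothetical, purely for
illustration):

    # Hypothetical sketch: best-first search over a space of candidate
    # abstractions, ordered by a heuristic (lower score = more promising).
    # Candidates are assumed hashable so we can track what we've seen.
    import heapq

    def find_abstraction(start, neighbours, is_good, heuristic, budget=10000):
        frontier = [(heuristic(start), 0, start)]  # (score, tie-break, candidate)
        seen, tick = {start}, 1
        while frontier and budget > 0:
            _, _, cand = heapq.heappop(frontier)
            if is_good(cand):
                return cand
            for nxt in neighbours(cand):  # refine or combine abstractions
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(frontier, (heuristic(nxt), tick, nxt))
                    tick += 1
            budget -= 1
        return None  # the exponential space outran our budget

With a poor heuristic this degenerates into brute force over an exponential
space; everything hinges on `heuristic`.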

So far, the best heuristic is _us_. If we can automate this guide through the
search space - the ability to make connections, have intuitions and insights -
it would be a huge leap towards Strong AI. My Masters was an approach to this,
and the start of my PhD. Quite possible, I think, but hard.

~~~
hvs
This is a well-put argument and I won't attempt to go into all of your points
more than to say that I pretty much agree, and I step back somewhat from my
argument that formal analysis necessarily helps abstraction.

I guess my point was that in order for larger and larger abstractions to be
implemented, there needs to be trust in those abstractions. One problem that
you point out is that the _right_ abstractions can be difficult to come up
with, and that will not be improved by formal analysis. The other problem is
that as you build greater and greater abstractions on top of each other,
"holes" in those abstractions can cause odd problems in the system. Here,
formal verification can at least provide the level of trust that the
abstraction works exactly how you expect it to, so you can implement your
system on top of it without concern for odd effects.
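
A tiny, familiar example of such a "hole" (my example, nothing deep):
floating point stands in for the real numbers, and the substitution leaks.

    # The "numbers" abstraction leaks: binary floating point only
    # approximates the reals, so expected algebraic laws quietly fail.
    a, b, c = 0.1, 0.2, 0.3
    print(a + b == c)                   # False: 0.1 + 0.2 -> 0.30000000000000004
    print((a + b) + c == a + (b + c))   # False: associativity fails here too

Build a system on the assumption of real-number arithmetic and this hole
surfaces far from its cause - exactly the odd effects in question.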

One analogy could be with the computer hardware on which software runs. It
goes without saying that while software is often expected to have flaws, the
hardware is essentially assumed to be flawless for all intents and purposes.
This isn't a perfect analogy because you have firmware, etc., but hopefully
it illustrates my point. You don't worry about the processor not correctly
implementing

    
    
      add ax, bx    ; ax := ax + bx (16-bit integer add)
    

It is taken as a given. At almost no point in software development today can
you make that assumption about any code.

Maybe formal analysis isn't the solution to this particular problem. I do
think it is important to research the topic as fully as we can to figure out
if it is or isn't.

~~~
10ren
"It is by logic we prove, it is by intuition that we invent" - _Henri
Poincaré_

I emphasized the invention, but you're quite right that we also need the
proving. I agree with your hardware example (things like the Pentium bug are
rare...), but I think we similarly rely on other layers (OS, languages,
libraries), though I concede with eroding confidence as we rise.

For hardware, it's also that we have more trust in the common pathways
(Dijkstra wouldn't like it - it _all_ should be perfect. I can relate to this,
but that's not how it is - today, anyway). Popular hardware is used
extensively, and very importantly, I think it tends to be designed to be
pretty flat, in that almost all of it gets exercised by almost all uses (i.e.
there are few rare pathways). So flaws will show up quickly.

We also don't worry about the language not correctly implementing

    
    
       a+b
    

So I think you _can_ make that assumption about code, if the code is well
exercised (e.g. a fundamental part of the language runtime).
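
"Well exercised" can even be made slightly systematic. A throwaway sketch
(mine, and it's massed evidence rather than proof):

    # Not a proof - just massed evidence, which is what "well exercised"
    # buys you. Python ints are arbitrary precision, so these hold exactly.
    import random

    for _ in range(1_000_000):
        a, b, c = (random.randint(-10**9, 10**9) for _ in range(3))
        assert a + b == b + a              # commutativity
        assert (a + b) + c == a + (b + c)  # associativity

A fundamental operation in a popular runtime gets the equivalent of this,
for free, millions of times a day.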

Concessions: Dr. Gosling apparently did write some (informal) proofs for Java,
but they were for the byte-code verifier, not for the typical language
features. And I've heard that Intel does some formal verification :-).

I agree that formal analysis is a worthwhile endeavour. I just wish they would
do more work on interesting, useful problems. It used to be that computer
scientists would invent something incredibly cool (like Boyer-Moore substring
matching, or arithmetic coding), and _then_ also prove it (well, really, prove
some specific qualities about it). But these days, there seems to be a lot of
proving things just because they can be proven - not because they are
worth proving. It's quite... insular.

They focus on proof, not invention.

------
hvs
I'm a huge fan of Dijkstra, for obvious reasons, and I won't trot out the
usual argument that "he's living in an ivory tower" or that "formal analysis
is impossible in large applications", but his attitude towards "Software
Engineering" does become grating after reading many of his articles. One of
the primary reasons that formal analysis is not widely used in software
development today (besides the fact that it is often very difficult) is not
due to the stupidity or laziness of "software engineers". It is due to the
radical failing of computer scientists both to teach the concept effectively
and to develop tools to make it practical. On top of that, the widely used
languages in the industry today are just _now_ beginning to allow for the type
of high-level formal analysis that Dijkstra promotes. Functional languages are
finally starting to make inroads into the industry in the form of Haskell, F#,
Erlang; and even imperative languages such as C# are adopting functional
attributes. But we are still a long way from the regular use of formal
analysis (if it ever actually occurs) because the universities are not
providing much, if any, guidance in this area to the industry.

Formal analysis is a very hard -- maybe impossible -- problem for anything but
toy applications. And I think it is a worthy goal to attempt to achieve, but
it will not be well served by denigrating the work of the many dedicated and
intelligent "Software Engineers".

~~~
spitfire
Formal analysis is a fact of life in many industries. Aviation,
rail/transportation, medical systems and some power systems.

All these industries have one thing in common - a lot of money to throw
around. So the technology and techniques are there; they just have to be
brought down to the masses. There are signs of that happening; it may just
take more time than you expect.

~~~
hvs
I don't think it'll take more time than _I_ expect, but I take your point.

Like I said, I think formal analysis of software development is a laudable
goal that should be pursued. But in many ways, software is so much more
complex than other engineering fields that it becomes increasingly difficult
if the development is not carefully controlled (something that Dijkstra is a
big believer in). I know that Dijkstra recognizes that it is not an easy
problem to solve, and that he feels that computer scientists have not
adequately researched this area. My main point was more about his general
attitude towards computer engineers than about his actual argument.

He addresses this well in this EWD Manuscript:

<http://www.cs.utexas.edu/~EWD/transcriptions/EWD13xx/EWD1304.html>

~~~
spitfire
I agree with you entirely. A huge problem is in mindset and culture. I've
mentioned formal methods here on HN before and been essentially shouted down,
which seems odd considering the hacker-above-the-commoners ethos HN wears on
its sleeve.

But I digress. I think that in the next few years, as functional languages
become more mainstream, we'll see a huge uptake in formal methods. Not
end-to-end, mind you, but of the low-level prover-type tools.
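
To make "prover-type tools" concrete with a deliberately lightweight
stand-in (executable contracts rather than a real prover such as SPARK; the
function is my own toy example):

    # A lightweight stand-in for prover-style annotations: executable
    # pre/postconditions. A real prover discharges these statically, for
    # all inputs; here they are merely checked at runtime.
    def integer_sqrt(n: int) -> int:
        assert n >= 0, "precondition: n must be non-negative"
        r = 0
        while (r + 1) * (r + 1) <= n:
            r += 1
        assert r * r <= n < (r + 1) * (r + 1), "postcondition"
        return r

That annotation style is the low end of the spectrum; a full design written
in a formal notation first is the high end.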

Formal project planning and design languages still take too much training to
be widely accepted. Can you imagine a Ruby hacker analyzing and writing up his
code in Z notation before he codes? Neither can I. But any amount of progress
on this front is a good thing.

~~~
10ren
BTW: it's not so odd if you think of a hacker as an expert tinkerer. Tinkering
is the opposite of upfront design.

------
stefano
My favourite one:

    
    
      "It is not the task of the University to offer what 
       society asks for, but to give what society needs.
       
      [The things society asks for are generally understood,
       and you don't need a University for that; the university
       has to offer what no one else can provide.]"
    

The sad thing is that CS courses tend to go in the opposite direction.

------
abalashov
_No, I'm afraid that computer science has suffered from the popularity of the
Internet. It has attracted an increasing —not to say: overwhelming!— number of
students with very little scientific inclination and in research it has only
strengthened the prevailing (and somewhat vulgar) obsession with speed and
capacity._

Hallelujah, old friend - speak the word.

------
RiderOfGiraffes
Dijkstra has been mentioned before on HN: <http://searchyc.com/ewd>

~~~
xtho
So were "answers" and "software engineering". =8-O

~~~
10ren
GP is the submitter. He's linking a way to locate other articles on HN about
Dijkstra (by using "ewd", which always appears in the URL for Dijkstra's
site).

~~~
xtho
You're right. My apologies.

------
tpyo
What does the computer industry need saving from?

