
The Humble Programmer (1972) - tareqak
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EWD340.html
======
stiff
Has anyone here actually learned what Dijkstra advocates in almost all the
EWDs and used it to any benefit? Basically, almost all of his essays in one
way or another end up being about the need to prove programs correct using
formal logic, and specifically about a particular approach to correctness
proofs he devised. In fact he even advocates starting from a formal logical
specification of a problem and then deriving the program by purely syntactic
transformations.

I tried to read "A Discipline of Programming", where he explains his
approach, but it was barely understandable, and it takes him 300 pages to get
to the point of developing simple algorithms of the type you meet in the
first chapter of an algorithms textbook. It could have been the translation
(I didn't read the English original), but I doubt it, because I have never
read anything technical from him that I actually found interesting. I am
afraid his essays are liked because of the general sentiment for "more
rigour" in programming, whatever that would mean, and not because of any
understanding of what precisely he advocates or the merits of his techniques.
The living proof is a comment in this thread about how Dijkstra sheds light
on the value of TDD...

So, if you upvote his articles, what precisely have you learned from Dijkstra?

~~~
btilly
You need to put it in historical perspective. Essays like this one were
written at the dawn of the "structured programming" debate. If you're
programming in a language and codebase where goto is rare or non-existent,
then Dijkstra's ideas have had an impact on your environment.

More precisely, he was always a champion of the idea of having programs
express things in a direct and understandable way, which makes comprehension
straightforward. And a champion of getting there by removing unneeded
features from languages.

This was a necessary overcorrection to now-forgotten excesses. But today it is
clear that he went too far. As an example, I would point to Go as a language
which is heavily influenced by Dijkstra's ideas on the value of simplicity in
language design. However, the early loop exit implied by break exists.
Multiple return values are used to separate out data (the result of a
calculation) from metadata (the presence and details of errors). We use
features beyond what Dijkstra considered wise.

~~~
stiff
There are hundreds of EWDs, and I haven't seen a single one where Dijkstra
talks about making programming languages simple in order to make
comprehension straightforward; that is something modern programming "gurus"
are more likely to say. The reason Dijkstra wants simple programming
languages, and most EWDs are more or less about this, is that he wants to
_prove theorems about programs_, which is possible only with a simple
language. Not only that: as I said, he wants to _reduce program development
to well-defined transformations on a specification of a problem in the
language of mathematical logic_. His stance is very nicely summarized here:

[http://en.wikipedia.org/wiki/The_Cruelty_of_Really_Teaching_...](http://en.wikipedia.org/wiki/The_Cruelty_of_Really_Teaching_Computer_Science)

To really understand Dijkstra, however, you have to understand some
mathematics and mathematical logic; you cannot look at his work from a pure
software-engineering background, because then you will only take away
platitudes from what he says without actually understanding him. If you have
some background, you can read about his real technical ideas:

[http://en.wikipedia.org/wiki/Program_derivation](http://en.wikipedia.org/wiki/Program_derivation)

[http://en.wikipedia.org/wiki/Guarded_Command_Language](http://en.wikipedia.org/wiki/Guarded_Command_Language)

[http://en.wikipedia.org/wiki/Predicate_transformer_semantics](http://en.wikipedia.org/wiki/Predicate_transformer_semantics)
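
To give a flavour of what these formalisms look like, here is a minimal
sketch of my own (not taken from any particular EWD): the weakest-precondition
rule for assignment, and a guarded-command program checked against a
postcondition.

```
Assignment rule:  wp("x := E", P)  =  P with every free x replaced by E
Example:          wp("x := x + 1", x > 0)  =  (x + 1 > 0)  =  (x > -1)

Guarded commands (maximum of x and y):

    if x >= y -> m := x
    [] y >= x -> m := y
    fi

Checking wp(IF, m = max(x, y)): each branch establishes the
postcondition under its guard, and the guards are exhaustive, so the
program is correct for all x, y. Program derivation runs this
calculation in reverse, from the postcondition toward the program.
```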

This is the area Dijkstra worked on most during his research career. These
ideas actually go back centuries before anyone ever thought about structured
programming. For hundreds of years there has been a fundamental debate about
whether all kinds of reasoning can be reduced to some kind of calculus with
well-specified rules, which would be equivalent to being able to construct an
(at least theoretical) machine for automating it. This is where computers
come from; it goes back to Charles Babbage, to the works of Leibniz, Boole,
Turing, etc.

Dijkstra is in the same historical tradition. The program Dijkstra had for
Computer Science, as I understand it, is similar to the program Hilbert had
for mathematics with what is now called formalism, from which the work of
Goedel and Turing sprang. When programming was still mainly done by
mathematicians this was a lively research area: various systems for proving
correctness were invented, along with various ways of deriving programs, and
there was a very heated debate on when proofs are needed and to what extent
ordinary programmers have to master them. You can find loads of books and
monographs written in the 70s and 80s about this. As far as research goes,
program derivation now seems to be an almost dead topic, but there are still
people today championing it in some alternative form. Richard Bird wrote a
book called "Pearls of Functional Algorithm Design" in which he derives
algorithms using algebraic properties:

[http://www.cs.ox.ac.uk/people/richard.bird/](http://www.cs.ox.ac.uk/people/richard.bird/)

Alexander Stepanov has been advocating something similar and wrote a book
called "Elements of Programming":

[http://www.elementsofprogramming.com/](http://www.elementsofprogramming.com/)

[https://www.youtube.com/watch?v=Ih9gpJga4Vc](https://www.youtube.com/watch?v=Ih9gpJga4Vc)

~~~
btilly
_There are hundreds of EWDs and I haven't seen a single one where Dijkstra
would talk about making programming languages simple to make comprehension
straightforward..._

If that's the case then you need to read more closely. This discussion is
sparked by
[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EWD340.html).
I'll focus on just two statements from this one which demonstrate the theme
that I said was there:

 _Finally, although the subject is not a pleasant one, I must mention PL/1, a
programming language for which the defining documentation is of a frightening
size and complexity. Using PL/1 must be like flying a plane with 7000 buttons,
switches and handles to manipulate in the cockpit. I absolutely fail to see
how we can keep our growing programs firmly within our intellectual grip when
by its sheer baroqueness the programming language —our basic tool, mind you!—
already escapes our intellectual control._

It would be hard to find a clearer advocacy of simplicity in programming
languages.

But a clearer statement of his priorities - and why they are priorities - can
be found here:

 _A study of program structure had revealed that programs —even alternative
programs for the same task and with the same mathematical content— can differ
tremendously in their intellectual manageability. A number of rules have been
discovered, violation of which will either seriously impair or totally destroy
the intellectual manageability of the program. These rules are of two kinds.
Those of the first kind are easily imposed mechanically, viz. by a suitably
chosen programming language. Examples are the exclusion of goto-statements and
of procedures with more than one output parameter. For those of the second
kind I at least —but that may be due to lack of competence on my side— see no
way of imposing them mechanically, as it seems to need some sort of automatic
theorem prover for which I have no existence proof. Therefore, for the time
being and perhaps forever, the rules of the second kind present themselves as
elements of discipline required from the programmer..._

Your complaint is essentially that I focused on rules of the first kind
described, when Dijkstra did a lot of work on rules of the second kind. If I
were truly doing that, then I'd be failing to understand him exactly as badly
as you are when you say he was only interested in rules of the second kind,
ignoring the fact that his largest concrete impact came from
[http://www.u.arizona.edu/~rubinson/copyright_violations/Go_T...](http://www.u.arizona.edu/~rubinson/copyright_violations/Go_To_Considered_Harmful.html).

I recognize and value his work in both areas. However the question that was
asked was about his impact on current programming practice. And there is no
question that his ideas on structured programming have had more impact than
his ideas on provably correct software.

~~~
stiff
I try to read this paper from the perspective of his overall work and writing,
and I have read through a lot of the EWDs. There is an abundance of evidence
that his #1 concern was making programming more mathematical. Hence I
interpret everything you are quoting as proposals motivated by the need he
perceived to prove programs correct. That the industry trivialized his ideas
to "don't use goto" does not mean this was Dijkstra's great point. This is in
fact excellently summarized in the first link I posted. Meanwhile you make it
sound like he was interested in contributing to software engineering as it is
done today, which is not the case; he seemed to really despise a lot of the
things that are now software-engineering best practices, like unit tests.

What I actually hoped for was to hear from someone who really learned, to
some extent, the way of writing programs Dijkstra advocated, and about their
experience with it. I can't say I really understand what those derivations
look like in practice.

~~~
btilly
_I try to read this paper from the perspective of his overall work and
writing, and I have read through a lot of the EWDs._

In other words you're missing the plain meaning of the sections that I quoted
because you try to consider it in the light of
[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EWD1036.html)
which was written a decade and a half later? Pardon me, but I won't be
emulating your example.

The focus of Dijkstra's research career was how to make correct software. As
he himself would claim, there are two halves to this process. The first is to
limit ourselves to forms of writing code which are easy to reason about. The
second is to actually perform that reasoning.

The part of his proposal that actually had an impact is the part which says
that we need to focus on methods of expressing ourselves that are easy to
reason about. The part that did not have a direct impact is the part which
says that we need to perform that formal reasoning. The fundamental reason why
not is that Dijkstra's reasoning assumed the existence of a consistent,
unchanging specification. The real world does not work that way - computers
exist to do what humans ask. And humans do not always ask for things that make
sense.

This is not to say that this is the impact that he wanted to have - it is
clearly not - but it is the impact that he did have.

However some of his other ideas have indeed found their way into practice,
albeit in a muted way that he would have objected to. For example take unit
tests, since you brought them up. He was against tests as part of including QA
as an integral part of the programming process - if the programmer performed
properly then that should not be needed. (Nice theory, fails in practice.)
However today, well-designed unit tests do serve as a limited form of
specifying exactly what a given piece of code is supposed to do, and verifying
that it in fact does that. (I'm sure that he would say limited and inadequate.
But it is better than nothing.)
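
As a tiny sketch of that "limited form of specification" in practice (the
function, its name, and the test cases are all invented for this
illustration):

```python
# Hypothetical example: a unit test as a partial, executable
# specification. The function and tests are invented for illustration.

def clamp(value, low, high):
    """Return value restricted to the closed interval [low, high]."""
    return max(low, min(value, high))

def test_clamp():
    # The test pins down what clamp must do on representative inputs --
    # a much weaker claim than a proof over all inputs, which is exactly
    # the inadequacy Dijkstra would point out.
    assert clamp(5, 0, 10) == 5    # inside the interval
    assert clamp(-3, 0, 10) == 0   # below: clamped to low
    assert clamp(42, 0, 10) == 10  # above: clamped to high

test_clamp()
```

The test documents intent and checks it mechanically, but only at three
points of the input space; the proof-minded objection writes itself.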

~~~
stiff
 _In other words you're missing the plain meaning of the sections that I
quoted because you try to consider it in the light of
[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EWD1036.html),
which was written a decade and a half later?_

This clearly shows you haven't actually studied his writings, because he had
been raising the same points for many years in almost every EWD. The OP is
EWD340, so here is a sample from before that period:

[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EWD302.html)

[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EWD303.html)

[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EWD317.html)

For example from EWD317:

 _If we take the existence of the impressive body of Mathematics as the
experimental evidence for the opinion that for the human mind the mathematical
method is indeed the most effective way to come to grips with complexity, we
have no choice any longer: we should reshape our field of programming in such
a way that the mathematician's methods become equally applicable to our
programming problems, for there are no other means. It is my personal hope and
expectation that in the years to come programming will become more and more an
activity of mathematical nature._

For a programming language to be simple in the sense of making it possible to
prove things about programs is a completely different thing from being simple
in the sense of being easy to understand informally. Yes, bastardized
versions of his ideas did make it into the mainstream; I doubt it was always
even because of his direct influence, and that's what the wiki article I
posted three comments earlier is about.

------
killahpriest
tl;dr Programming has been ignored because hardware has a much more visible
payoff. The impact of computers and the innovation in hardware will "be but a
ripple on the surface of our culture, compared with the much more profound
influence they will have in their capacity of intellectual challenge without
precedent in the cultural history of mankind." That is, the intellectual
impact of programming is more significant than the impact made by innovation
on the hardware side. At least that's what I think Dijkstra is saying.

Some gems:

Test driven development, 1972.

 _Today a usual technique is to make a program and then to test it. But:
program testing can be a very effective way to show the presence of bugs, but
is hopelessly inadequate for showing their absence. The only effective way to
raise the confidence level of a program significantly is to give a convincing
proof of its correctness. But one should not first make the program and then
prove its correctness, because then the requirement of providing the proof
would only increase the poor programmer’s burden. On the contrary: the
programmer should let correctness proof and program grow hand in hand._
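
A minimal sketch of what "correctness proof and program grow hand in hand"
can look like in ordinary code (my own illustration, using an informal loop
invariant rather than Dijkstra's full formal apparatus):

```python
def total(xs):
    """Sum a list, with the correctness argument carried alongside."""
    s = 0
    i = 0
    # Invariant: s == sum(xs[0:i]). Holds initially (s = 0, i = 0).
    while i < len(xs):
        s = s + xs[i]
        i = i + 1
        # Invariant restored: s again equals sum(xs[0:i]).
    # At exit i == len(xs), so the invariant gives s == sum(xs).
    return s
```

The invariant is written before and with the loop body, not bolted on
afterwards; that ordering is the point of the quoted passage.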

For loops have brain damaged us.

 _Another lesson we should have learned from the recent past is that the
development of “richer” or “more powerful” programming languages was a mistake
in the sense that these baroque monstrosities, these conglomerations of
idiosyncrasies, are really unmanageable, both mechanically and mentally. I see
a great future for very systematic and very modest programming languages. When
I say “modest”, I mean that, for instance, not only ALGOL 60’s “for clause”,
but even FORTRAN’s “DO loop” may find themselves thrown out as being too
baroque. I have run a little programming experiment with really experienced
volunteers, but something quite unintended and quite unexpected turned up.
None of my volunteers found the obvious and most elegant solution. Upon closer
analysis this turned out to have a common source: their notion of repetition
was so tightly connected to the idea of an associated controlled variable to
be stepped up, that they were mentally blocked from seeing the obvious. Their
solutions were less efficient, needlessly hard to understand, and it took them
a very long time to find them._

~~~
jmilloy
>I have run a little programming experiment

Does anyone know what these test problems and solutions were? Or have similar
examples?

~~~
jloughry
The thing that comes to mind is the C idiom for copying strings:

while (*dest++ = *src++);

"...their notion of repetition was so tightly connected to the idea of an
associated controlled variable to be stepped up, that they were mentally
blocked from seeing the obvious."

~~~
jmilloy
I'm not sure I agree; aren't dest and src still variables that are being
stepped up? Maybe they are not "controlled variables"; I don't know what that
means. This is a task that is eminently suited to repetition; any loop makes
sense. Plus, the article mentions FORTRAN's DO loop (not just for loops
incrementing a counter).

I was expecting an example where a loop worked, but a simpler mathematical
solution exists without a loop.
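
One standard example of the kind being asked for here (my own illustration;
the EWD does not say what the experiment's actual problem was) is summing the
first n integers: the loop works, but a closed form makes the repetition
disappear entirely.

```python
def sum_with_loop(n):
    # The "controlled variable" habit: step a counter and accumulate.
    s = 0
    for i in range(1, n + 1):
        s += i
    return s

def sum_closed_form(n):
    # Gauss's closed form: no repetition at all.
    return n * (n + 1) // 2
```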

Anyone else?

------
andrewflnr
I find his point about the "economic need" for programming to be more
efficient interesting: at the time, software was about as expensive as
hardware, and hardware was about to get drastically less expensive, and so

    
    
      If software development were to continue to be the same clumsy and
      expensive process as it is now, things would get completely out of
      balance. You cannot expect society to accept this, and therefore we
      must learn to program an order of magnitude more effectively.
    

And yet, society has accepted it. It's now a truism that programmers cost more
than hardware. Then again, it doesn't seem like his hoped-for revolution has
occurred, either, so I guess he hasn't really been disproven and is merely
guilty of being too optimistic.

~~~
dmethvin
These are just engineering tradeoffs. Sure you need to weigh them carefully,
but the fact that we spend more on software than hardware does not indicate a
horrible failure in the industry.

For example, processing power, communication speed, and storage have improved
by orders of magnitude, but battery capacity is perhaps better by a single
factor of ten. Much of the advance in battery _life_ comes from reduced power
consumption by other components. Much of the advance in software development
has come from creating inefficient abstractions that let us create things
faster but squander CPU, network, and battery life.

Most companies today are willing to burn up some hardware performance in order
to reduce software development time. If they're burning too much, they'll fix
it or they'll go out of business. Premature optimization is still not a good
idea.

~~~
andrewflnr
I guess my post sounds more pessimistic than I meant it to be. I don't think
it indicates any horrible failure, I just find funny the exact way that
Dijkstra's vision failed to come to pass. Honestly, as a programmer, I kind of
like it that hardware is cheap and programming is expensive. ;)

------
ColinWright
An old friend, so many postings, so little discussion. Given the usual high
standard of discussion here on HN, it's a shame that there's been so little
about this popular submission. FWIW, I've upvoted this, because I really do
want to see a balanced discussion about it from an up-to-date viewpoint.

It was the Turing Award Lecture in 1972 - it's over 40 years old. Printed in
"Classics in Software Engineering" by Yourdon Press, 1979, ISBN 0917072146.

HTML:
[http://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340....](http://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340.html)

PDF:
[http://www.cs.utexas.edu/~EWD/ewd03xx/EWD340.PDF](http://www.cs.utexas.edu/~EWD/ewd03xx/EWD340.PDF)

Here are some of the previous submissions here on HN:

[https://news.ycombinator.com/item?id=86288](https://news.ycombinator.com/item?id=86288)

[https://news.ycombinator.com/item?id=109724](https://news.ycombinator.com/item?id=109724)

[https://news.ycombinator.com/item?id=126638](https://news.ycombinator.com/item?id=126638)

[https://news.ycombinator.com/item?id=135111](https://news.ycombinator.com/item?id=135111)

[https://news.ycombinator.com/item?id=156505](https://news.ycombinator.com/item?id=156505)

[https://news.ycombinator.com/item?id=449806](https://news.ycombinator.com/item?id=449806)

[https://news.ycombinator.com/item?id=1179277](https://news.ycombinator.com/item?id=1179277)

[https://news.ycombinator.com/item?id=1649246](https://news.ycombinator.com/item?id=1649246)

[https://news.ycombinator.com/item?id=1672262](https://news.ycombinator.com/item?id=1672262)

[https://news.ycombinator.com/item?id=1799296](https://news.ycombinator.com/item?id=1799296)
<- 3 comments

[https://news.ycombinator.com/item?id=1894784](https://news.ycombinator.com/item?id=1894784)
<- 8 comments

[https://news.ycombinator.com/item?id=2011732](https://news.ycombinator.com/item?id=2011732)

[https://news.ycombinator.com/item?id=3060828](https://news.ycombinator.com/item?id=3060828)

[https://news.ycombinator.com/item?id=5035560](https://news.ycombinator.com/item?id=5035560)

[https://news.ycombinator.com/item?id=5266220](https://news.ycombinator.com/item?id=5266220)

[https://news.ycombinator.com/item?id=6112467](https://news.ycombinator.com/item?id=6112467)
(This item)

~~~
pavs
I dunno about others, but I am a slow reader, and this will probably take me
about 20 minutes to finish reading. I have a general rule on sites like HN and
Reddit: if it takes more than 10 minutes to read something, I am probably not
going to read it (unless it's extremely interesting, important, or relevant to
the present time based on the comments).

I spend about an hour on HN a day, in small bouts. Most of the time I will
skim over titles to see if there is anything interesting; if there is, and
it's a long read (like this one), I will save it to "Pocket" to read later. I
will probably get around to reading this a few weeks or even a month from now.

I suspect very few people will read it on the spot, which is why even
interesting submissions like this don't get a lot of discussion.

~~~
prawks
Ditto. This sounds like a great article, but it looks immensely dense. It's
most definitely on my "to-read" list, but that list rarely actually gets read
through...

Also, although the author carries immense weight, and I'm sure the ideas put
forth in the article are fantastic, what benefit will I get out of the article
other than "hmmm, well that was interesting"? It's not a tutorial on a tool I
can put into practice, or necessarily any sort of technique I can use in my
day-to-day life (at least before reading, it seems this way). It looks like a
philosophical article about the art of programming and being a programmer.

I love these kinds of articles, but their return on the large time investment
required is ultimately not very large. This is in my relatively limited
experience, of course.

~~~
chaddeshon
These are two of the most depressing comments I've ever read, not just on
Hacker News, but on the whole internet. So much so that I wonder if they were
written with a sarcasm that escapes me.

They represent a philosophy that is often referenced, but always as an
unintentional consequence of our internal laziness -- a laziness that should
be fought against. Here, however, it is being presented as a conscious choice.
Indeed a right, proper, and good choice.

The commenters seem to be arguing that reading is only worth the time if the
content has been distilled to its basic facts, and further that those facts
need to be immediately actionable. Have we no room for soul? Do we lack the
energy to take general concepts and apply them to new areas in new ways?

When we break a larger piece of writing down and extract just the main
theses, we make it easier and quicker to understand, but we also neuter and
even change its meaning. Sometimes what we learn or what we experience is
subtle. Sometimes writing doesn't give us a todo list, but instead it
ever-so-gently shades and nudges all our todo lists.

~~~
pavs
> The commenters seem to be arguing that reading is only worth the time if the
> content has been distilled to its basic facts, and further that those facts
> need to be immediately actionable. Have we no room for soul? Do we lack the
> energy to take general concepts and apply them to new areas in new ways?

You are reading too much into it. I (and probably the person you replied to)
have nothing against long-form articles, as a matter of fact I prefer them.

On a typical day, I will spend 1-2 hours on a book, I will read many smaller
articles here on HN and on Reddit, I will also check out my RSS reader, I will
read work-related email, I will do actual work, I will train for my marathon
(alternate days of running and weights) which takes about 2 hours, spend time
with my family, socialize, and hopefully get some sleep too. It's all about
how you manage your time, not about distilling long-forms into bite-size
chunks.

If given a choice between reading a long-form article online or reading a
book, I will read a book.

There is only so much time in a day. There is so much to do. I save long
articles like this for my lazy or slow days.

~~~
chaddeshon
Fair enough. Obviously I don't know your situation, and I directed my rant too
much at your specific comment rather than at the general issue I really wanted
to speak to.

------
virtualwhys
Enlightening, thanks for posting.

"[LISP] has assisted a number of our most gifted fellow humans in thinking
previously impossible thoughts.", that's pretty profound ;-)

------
michaelwww
"[The speaker] managed to ask for the addition of about fifty new “features”,
little supposing that the main source of his problems could very well be that
it contained already far too many “features”. The speaker displayed all the
depressing symptoms of addiction, reduced as he was to the state of mental
stagnation in which he could only ask for more, more, more... "

LOL

------
mathattack
Has anyone here studied under Dijkstra?

I only know of one programmer who did - one of the best hackers I've ever met,
who eventually dropped out of UT to program full time. I never asked about his
opinions of the professor, and anything I could write would be second hand.

I had a professor out of UT for my programming languages class. When asked,
"How do you go about debugging interpreters?" he responded, "I don't debug. I
prove every line of code correct before I type it."

Was Dijkstra inspiring? Or just a pedantic fuddy-duddy?

------
kostyakow
This article feels like it's from a completely different era -- an early age
of computers with wild discoveries and unexplored frontiers.

But in reality it wasn't that long ago and a lot of the early computer
pioneers are still alive. Computing is still a really young field!

------
k4rtik
One of my professors, Prof. Vineeth K Paleri, at NIT Calicut
([http://cse.nitc.ac.in](http://cse.nitc.ac.in)), asks his students to read
this article by Dijkstra in their first class with him. It was an interesting
read indeed.

------
emmelaich
What's interesting to me is that Dijkstra (and Wirth, for that matter) either
didn't particularly like Lisp or was very late to appreciate it.

~~~
kostyakow
>The third project I would not like to leave unmentioned is LISP, a
fascinating enterprise of a completely different nature. With a few very basic
principles at its foundation, it has shown a remarkable stability. Besides
that, LISP has been the carrier for a considerable number of in a sense our
most sophisticated computer applications. LISP has jokingly been described as
“the most intelligent way to misuse a computer”. I think that description a
great compliment because it transmits the full flavour of liberation: it has
assisted a number of our most gifted fellow humans in thinking previously
impossible thoughts.

------
Ashuu
A really interesting read! I am glad I read this very long article. Worth the
time!

------
karangoeluw
A tldr version please?

~~~
Zoepfli
Since "tldr" means "too long didn't read", maybe you need some of that
humbleness yourself?

~~~
RivieraKid
It has nothing to do with humbleness.

