
Dijkstra: On the cruelty of really teaching computer science (1988) [pdf] - jackhammer2022
http://www.cs.utexas.edu/~EWD/ewd10xx/EWD1036.PDF
======
ColinWright
It's interesting to see how attitudes to this EWD have changed over the years.
Here are some of the previous submissions with comments:

[https://news.ycombinator.com/item?id=43978](https://news.ycombinator.com/item?id=43978)
: A few comments

[https://news.ycombinator.com/item?id=1666445](https://news.ycombinator.com/item?id=1666445)
: Many comments

[https://news.ycombinator.com/item?id=3553983](https://news.ycombinator.com/item?id=3553983)
: Many comments

In the earlier threads there are two main voices: "Where can I learn more
about this," and "It's useful to see the different perspective." Both say that
there are many nuggets of wisdom to be mined, even when they don't agree.
There are also a few who dissent and say it's irrelevant, impractical, and
inapplicable.

Even in 3553983, where the top comment says it's largely irrelevant, there is a
useful dialogue discussing the ideas and how they can be made to work, or at
least how we can learn from them in today's context.

In contrast, in this thread so far we've mostly had comments about the
handwriting, and complete dismissal of the ideas. So far there's very little
attempt to say: "OK, times have changed, but can we learn something from
this?"

So I ask - can we? Are there nuggets to be mined? Or are you convinced that
you really do know better and have nothing to learn from it?

I'm still learning.

========

For reference, other submissions, mostly without comments:

[https://news.ycombinator.com/item?id=6701607](https://news.ycombinator.com/item?id=6701607)

[https://news.ycombinator.com/item?id=2122826](https://news.ycombinator.com/item?id=2122826)

[https://news.ycombinator.com/item?id=2090256](https://news.ycombinator.com/item?id=2090256)

[https://news.ycombinator.com/item?id=1989473](https://news.ycombinator.com/item?id=1989473)

[https://news.ycombinator.com/item?id=383210](https://news.ycombinator.com/item?id=383210)

~~~
crististm
The audience is self-selected. The ones that don't know any better see only a
condescending tone instead of the material. They don't get the difference
between "lines spent" and "lines produced", and they don't know that computers
were used to "compute", so the formalism was absolutely necessary to trust the
result.

What we write today is WoW and similar stuff; we don't need formalism to prove
it doesn't crash - because, you know, who cares about that?

Dijkstra comes from a time when you could reuse a piece of software and build
upon it because you could understand it. Today nobody does that, because the
software is "disposable". Who cares about proving a piece of future garbage?

~~~
the_af
Agreed. In a way, it's like the "Blub paradox", only applied to formal methods
instead of programming languages. Those who aren't familiar with them don't
see the point and think it's merely "mathy" nonsense.

------
GuiA
This EWD is mind-blowing in its beauty. It's 30 pages, but 30 pages that you
won't regret reading.

I love computer science, and teaching it is some of the most fun I've ever
had. With a lot of practice, you can get concepts such as computability and
Turing machines across to the non-mathematically initiated within the duration
of a party, and it's greatly satisfying. (Yes, I am a lot of fun to be
around.)

The rest of the EWD notes are highly recommended. (Also, it's weird to see an
academic write in script.) Side question: do any other HNers keep notes in
written form? I still do, using a fountain pen, and am often poked fun at by
colleagues :(

~~~
ics
Re: your side question, sure![0] I would love to use a fountain pen primarily,
but the nib I want is out of my budget for now. I do have a couple of Muji
fountain pens, though, which are great for the price - just far too thick for
me to use regularly.

[0]:
[http://academic.ianclarksmith.com/Coursera/DesignArtifacts/c...](http://academic.ianclarksmith.com/Coursera/DesignArtifacts/content/demo1.jpg)
(Uniball .28 and mechanical pencil on blank 4x6 card)

By now I probably have thousands of these scattered throughout books (I use
them as bookmarks while taking notes) or in boxes. They're not particularly
well sorted except for being vaguely chronological. I have a decent stash of
notebooks as well but haven't used them since it was a requirement for school
except for a single pocket notebook that gets filled roughly once a year.

------
beala
Transcription:
[http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036...](http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html)

------
randomsearch
Who is Dijkstra to espouse about the best way of teaching mathematics to
primary school children? I think it would be equivalent to a researcher in
education deciding they know the best way to design a calculus.

The manner in which Dijkstra writes about this is very off-putting - it sounds
arrogant, one-sided, and dismissive. In fact he comes across as ignorant,
which of course he is, in this area.

Worryingly, I see a lot of similar writing in CS blogs. Often authors present
as fact their opinions on a topic that they are not knowledgeable about.

When I read something like that, I look for the balance in their argument. If
they say "well, what are the good sides to teaching by analogy? When does it
work best? What is the evidence that teaching basic arithmetic is done well?"
then I begin to take them seriously.

[edit: typo]

~~~
Jtsummers
Where in this essay does he "espouse about the best way of teaching
mathematics to primary school children"? It seems to be about CS education,
not primary school math.

~~~
randomsearch
Page 5.

------
micro_cam
As someone who focused on proof-based mathematics in college and is largely a
self-taught programmer, I think this proposal has merit. Being forced to
confront abstraction and formalism without any crutches as a freshman in
college changed and expanded the way I thought. Learning to write down a
correct proof without an external way of verifying it is an experience
everyone should have.

However, I think this needs to be coupled with an introduction to the concrete
at a younger age. In mathematics we teach arithmetic to young children, then
step up the ladder of abstraction through high school algebra, simple proofs,
and linear algebra before eventually exposing them to abstract algebra (i.e.
group theory etc.). Teaching abstract algebra first would make theoretical
sense, but the mind tends to need to know at least one example of something
before it accepts the abstraction.

I imagine a playful introduction to programming for young children, coupled
with a strong course in formalism for young adults, could produce some great
programmers. Though, as with the proof-based math series I took, I'm sure
there would also be loads of freshman dropouts/transfers to more applied
courses.

I also agree with others that insight rarely comes from pure formalism. The
point of learning these things is to expand the way your mind works, allowing
you to think about abstract objects and to verify insight when it does come.

For me Charlie Parker sums up the necessity of a formal education even if it
is not explicitly used in practice:

"Master your instrument, master the music & then forget all that & just play."

~~~
mathattack
For me Charlie Parker sums up the necessity of a formal education even if it
is not explicitly used in practice: "Master your instrument, master the music
& then forget all that & just play."

How very true across so many aspects of life!

Science doesn't happen through formalism. People learn the body of material
that's been proven, then they get new intuitions, then afterwards they try to
formally prove their insight. This isn't entirely inconsistent with EWD
either. A lot of his point is that programmers need to know how to think
symbolically, and if you truly prove the correctness of things, you can build
the base of knowledge on top of it. If not, you're frequently wondering why
things don't work.

------
kaivi
A wonderful read from a badass scientist. Here is a good nugget of wisdom:

 _Software engineering, of course, presents itself as another worthy cause,
but that is eyewash: if you carefully read its literature and analyse what its
devotees actually do, you will discover that software engineering has accepted
as its charter "How to program if you cannot"._

------
elwell

      My point today is that, if we wish to count lines of code,
      we should not regard them as "lines produced" but as "lines
      spent"

------
stiff
I think people are attracted more to the form of the EWDs and to their
melancholic tone than to the content itself, which is confirmed by the huge
popularity of the EWDs as compared to "A Discipline of Programming", the
technical exposition of what Dijkstra is really proposing in the EWDs.

Like almost all of Dijkstra's writing, this piece is a plea for teaching
formal methods. It's nicely written, I admit, but it is also unbelievably smug
and disconnected from reality. Even in physics or mathematics, which are
obviously much more mathematically tractable than computer science, people do
not do significant new things by using only formal methods. In fact, the
technical achievement Dijkstra is best known for, Dijkstra's algorithm, was
not invented using formal methods; like most discoveries, it occurred in a
flash of insight after a long period of thinking, as he said in an interview.
No matter how much one loves rigour, the dream of formalizing all reasoning
has turned out to be a pipe dream, as evidenced for example by the failure of
Hilbert's program, and people everywhere go about their business as usual.

Mathematics is not, after all, just the manipulation of strings of symbols,
and neither is computer science, even if that is exactly all computers do. We
humans are complex beings, largely driven by emotions, not consciously aware
of all the reasoning processes going on inside us, and we often reason in
"fuzzy" ways (frequently to good effect) rather than executing chains of
logical inferences like machines. In fact, people are born with very different
internal worlds: some are really good at symbol manipulation, algebra, and
language, and their daily thinking consists of "talking to themselves", while
others naturally think by imagining pictures and having various loose sensory
impressions - among the latter, for example, the great mathematician Poincaré.
Was Poincaré in need of someone to "really teach him mathematics"?
Unfortunately, I think Dijkstra assumed everyone is identical to him.

Before preaching this too much, look at the practical implementations of his
ideas - for example, watch him lecturing on a technical topic:

[http://www.youtube.com/watch?v=qNCAFcAbSTg](http://www.youtube.com/watch?v=qNCAFcAbSTg)

or read "A Discipline of Programming". As nice as the EWD prose is, I honestly
can't stand the man talking about anything technical for more than 5 minutes.
The more concrete the EWDs get, the worse they are.

For some perspective on how science really gets done, as opposed to people's
romantic images of how it is done, I recommend "The Sleepwalkers" by Arthur
Koestler, about the Copernican revolution, and "The psychology of mathematical
invention" by Hadamard.

~~~
marcosdumay
I got the impression that we read completely different articles, because
nowhere in it do I see EWD saying that mathematicians or scientists should
think by means of formal methods... That doesn't even make sense, since formal
methods are a way not to think.

What he states is that a computer is a "formal method applying machine" (a
conclusion you seem to share), and that to use it competently, one must know
formal methods. And, honestly, I can't imagine how one can disagree with that.
(That was the fate of EWD: he saw all the patently obvious problems of
computing science that somehow everybody was ignoring, and communicated them,
only to be called a crazy radical at first, and told "duh, of course it's
right, tell me something not obvious next time" later.)

~~~
stiff
Dijkstra advocates creating programs by doing formal transformations of
specifications written in mathematical logic, and computers just do boolean
algebra on strings of bits, so it's not that Dijkstra advocates formal methods
because they are somehow essential to how computation is done in the computer
- they are not, especially in the form proposed by Dijkstra.

He is proposing formal methods as a way of thinking about developing computer
programs; this is clear from reading "A Discipline of Programming". You cannot
say "formal methods are a way not to think", because the theoretical existence
of a sequence of transformations giving you the final program does not tell
you how to find those transformations, so some thinking on the part of the
programmer is still needed. Of course you can come up with a program by a
way of thinking completely uninformed by the formal specification, and then
formalize it, but this is not what Dijkstra is speaking of in this article.
What I am saying in my comment is that I do not believe many people benefit
that much from a formal approach to program development, just as few new
theorems in mathematics outside of mathematical logic were discovered using
the tools of mathematical logic. Even the proofs themselves in mathematics
aren't done in a completely formal way.

~~~
Jtsummers
> What I am saying in the comment is that I do not believe many people benefit
> that much from a formal approach to program development

A lot of systems would benefit from formal (or more formal) methods. I worked
in aviation software, where formal methods would have saved us a great deal of
trouble, especially since the code size is often relatively small (10k-200k
LOC) and, because we were dealing with other systems, the IO spec was
reasonably complete. Sure, there would have been issues, but feature creep was
not one of them (except on one project). Feature creep makes formalisms
difficult, but only when you're looking at whole-program formalisms. So
safety-critical systems certainly benefit.

What else might? Medical systems (see Therac-25, where a faulty UI led to
deaths). Network code, especially fundamentals like routers, switches, and
core services like DNS that so much depends on. Cryptographic and other
security systems. Compilers, obviously; like network code, they're too
ubiquitous to be left to error-prone ad hoc approaches. Anything handling
money. Anything handling privacy (ensuring compliance with HIPAA and European
privacy rights). Software handling production automation or managing
warehouses/inventory; failures there mean lost production time and wasted
inventory, so there are real money and productivity gains to be had from
software that works.

If you accept formal methods on the small scale (essentially the promise of
functional programming a la Haskell and the ML family), you can be confident
in composing large systems out of smaller, formally verified systems.
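The "formal methods on the small scale" idea can be sketched in Python (my
own hypothetical illustration, not anything from the thread): each small unit
states an explicit contract, checked at its boundary, so compositions of
checked units inherit the guarantees of their parts.

```python
# Sketch: lightweight contracts as a small-scale stand-in for formal
# verification. Each unit states pre/postconditions; compositions of
# checked units inherit those guarantees. (Illustrative only.)

def requires_sorted(xs: list) -> list:
    """Precondition check: input must be sorted ascending."""
    assert all(a <= b for a, b in zip(xs, xs[1:])), "input not sorted"
    return xs

def merge(xs: list, ys: list) -> list:
    """Merge two sorted lists; postcondition: result is sorted."""
    xs, ys = requires_sorted(xs), requires_sorted(ys)
    out, i, j = [], 0, 0
    while i < len(xs) and j < len(ys):
        if xs[i] <= ys[j]:
            out.append(xs[i]); i += 1
        else:
            out.append(ys[j]); j += 1
    out.extend(xs[i:]); out.extend(ys[j:])
    assert all(a <= b for a, b in zip(out, out[1:]))  # postcondition
    return out

# Because merge's postcondition matches its own precondition, calls
# compose safely: merge(merge(a, b), c) is checked at every boundary.
print(merge(merge([1, 4], [2, 3]), [0, 5]))  # → [0, 1, 2, 3, 4, 5]
```

A type system like Haskell's moves such checks to compile time, which is what
makes the composition argument above attractive.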

------
zeeboo
His page numbers are zero-indexed. Cute. I would have loved to take the
introductory computer science course he proposes.

~~~
chas
You might also be interested in CMU's introductory curriculum in the same
vein, which is somewhat less extreme and has a particular focus on parallelism
as the standard case of computing:
[http://existentialtype.wordpress.com/2012/08/17/intro-curriculum-update/](http://existentialtype.wordpress.com/2012/08/17/intro-curriculum-update/)

------
10098
I know Dijkstra was a genius and all, but his papers are just so unpleasant.
First, I hate his verbose style, and second, I hate his attitude.

"The effort of using machines to mimic the human mind has always struck me as
rather silly. I would rather use them to mimic something better."

Jesus, could he be any more smug?

~~~
marcosdumay
> Jesus, could he be any more smug?

Funny that I think that statement is extremely humble.

~~~
scaramanga
Indeed, the very idea that your fly-by-the-seat-of-your-pants intuition,
memory, and senses cannot be trusted is the foundational sentiment that
brought us science.

------
Pitarou
How can a human write that much good prose without scribbling things out?

~~~
rcthompson
Presumably this is a final draft.

------
ics
Was this post inspired by this post
([http://www.reddit.com/r/compsci/comments/1rxfze/if_you_liked...](http://www.reddit.com/r/compsci/comments/1rxfze/if_you_liked_ew_dijkstras_paper_on_the_cruelty_of/))
on r/compsci by any chance? I didn't read much of the discussion there but the
other links are interesting and things which I hadn't seen before.

~~~
10098
I think it's the opposite, i.e. the reddit post was inspired by the paper. I
read the paper yesterday, but I couldn't have found the link on HN, because it
only appeared here today - so I must have originally found it on reddit. Now
I'm confused...

------
elwell
Interestingly, I found the handwriting to be more readable than most computer
fonts; that is, very visually consumable.

~~~
sgloutnikov
That was the first thing that jumped out at me. Amazing handwriting!

~~~
cpg
His handwriting looks beautiful!

I loved it and emailed Prof. Dijkstra during the brief time that he _finally_
had email (I got a graduate degree from his department and attended some of
his classes). I asked him for permission to one day create a font from his
handwriting, not really expecting a response.

However, he did eventually reply, granting me the permission to do so.

I commissioned the font and it worked nicely. It was distributed briefly and
may still be out there (I see one or two copies floating around).

My original copy may be in old backups. One day I will find it and update it!

Edit: found the email reply! He retired in 11/1999 and passed away in 8/2002

    
    
      Austin, Tuesday 11 April 2000
    
      Dear Dr. Puchol,
    
            If a competent designer can use my handwriting as a source of
      inspiration, that is fine with me. Thank you for asking my permission,
      which is granted with pleasure.
    
            With my greetings and best wishes,
                            yours ever,          Edsger W.Dijkstra

------
mrcactu5
I am glad he was able to articulate why some types of change are accepted and
other changes rejected with hostility -- that some new things can be
interpreted through analogies with previous experience.

It is the difference between "change" and _change_, "thinking" and
_thinking_.

