

Edsger W. Dijkstra - How do we tell truths that might hurt? - dhotson
http://www.cs.virginia.edu/~evans/cs655/readings/ewd498.html

======
wzdd
The problem with discussions about this hateful list is that discussion of the
interesting, timeless comments is drowned out by arguments about whether BASIC
still warps minds (or whether it ever did), whether physicists should share
FORTRAN, what we could possibly learn from APL's mistakes, what made PL/I so
bad, and what company might constitute a modern-day IBM.

Many of the comments on the list are out of date. Many of the rest are such
unravellable apothegms, incomprehensible without a context that most of us
don't have, that trying to apply them to any modern-day scenario in the way
that Dijkstra intended is an exercise more in creativity and personal bias
than in interpretation.

Most of the discussion in this comment section is either about the comments
which now appear trollish (such as the BASIC one) or about Dijkstra's software
design philosophy, which is only _extremely tangentially_ related to this
list.

------
cafard
With all due respect to EWD, and the wonderful things he did, I believe that
he let himself get carried away with his gift for the glib aphorism. I don't
think that those who post them do Dijkstra's reputation any favors. It is as if
one were to fill a biography of Churchill with his clever quips and omit 1940.

~~~
microtherion
The difference being that, once you set aside Dijkstra's glib aphorisms, there
may not BE a 1940 to talk about.

~~~
philwelch
You're talking about the inventor of the shortest-path algorithm and the
semaphore.
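
For anyone who only knows the name: a minimal sketch of the shortest-path algorithm in Python, a standard textbook rendering with a binary heap rather than Dijkstra's original 1959 formulation (the example graph is made up):

    import heapq

    def dijkstra(graph, source):
        # graph: {node: [(neighbor, non_negative_weight), ...]}
        dist = {source: 0}
        heap = [(0, source)]                       # (best distance so far, node)
        while heap:
            d, node = heapq.heappop(heap)
            if d > dist.get(node, float("inf")):   # stale queue entry; skip
                continue
            for neighbor, weight in graph.get(node, []):
                nd = d + weight
                if nd < dist.get(neighbor, float("inf")):
                    dist[neighbor] = nd            # found a shorter route
                    heapq.heappush(heap, (nd, neighbor))
        return dist

    example = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2), ('d', 5)], 'c': [('d', 1)]}
    print(dijkstra(example, 'a'))                  # {'a': 0, 'b': 1, 'c': 3, 'd': 4}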

~~~
microtherion
Obviously, mutual exclusion is widely used. I'm not so sure that semaphores as
such are used as widely (and certainly not with functions named P() and V()).
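
For context, P and V were Dijkstra's names for the wait and signal operations. A minimal Python sketch of that convention, layered on the standard threading module purely for illustration (the class name is mine):

    import threading

    class PV:
        """Counting semaphore with Dijkstra's P/V naming."""
        def __init__(self, initial=1):
            self._sem = threading.Semaphore(initial)

        def P(self):           # wait: decrement, blocking while the count is zero
            self._sem.acquire()

        def V(self):           # signal: increment, waking one waiter if any
            self._sem.release()

    mutex = PV(1)              # binary semaphore used for mutual exclusion
    mutex.P()
    # ... critical section ...
    mutex.V()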

------
wglb
So those of us who read Dijkstra when it was fresh may be more mindful than
others about some of the other things that he did. Using the methods that he
espoused, he and a small team wrote a compiler which, during its lifetime, had
a total of four errors.

There are practices that he taught that most of us, even after all our million
lines of code, still have not mastered. This is particularly visible from the
vantage point of the security industry.

------
JonnieCache
The thing that always strikes me when I read Dijkstra's writings on languages
is that in the context of present-day discourse, he comes across as a massive
troll. He rarely gives any in-place justification for his outrageous use of
insulting and attacking language.

I realise some of this is explained by the fact that a lot of this stuff was
written for consumption by people who were already aware of him and his ideas,
other people at various institutions with him. However, this raises the question:
if your target audience is already so aware of your unusual ideas that you do
not feel the need to provide any kind of context or justification, why moan so
loudly at all? I'd love to see some of his colleagues' reactions at the time to
some of these tracts.

I guess when you're that far into the right-hand tail of the intelligence bell
curve, you're going to fall into the outer quartiles for quite a lot of
other psychometrics as well.

~~~
aristus
He was a troll back then, too. Alan Kay used to joke that arrogance in
computer science is measured in nanodijkstras. EWD could back it up because he
was a genius, but it didn't win him friends.

Either way, some of the aphorisms are less defensible now. 35 years later, if
something so broadly useful as programming _remains_ "one of the most
difficult branches of applied mathematics", then I think we have failed our
responsibility to move the science forward.

------
mduerksen
Ignore a common misbelief, or actively fight against it?

I would say: become successful while ignoring it, and _then_ openly "admit"
that you don't share that conventional wisdom.

It can be very tedious, unrewarding, or outright impossible to convince people
who just parrot common misconceptions. Or, as Proverbs 27:22 pithily says: _Though
thou shouldest bray a fool in a mortar among wheat with a pestle, yet will not
his foolishness depart from him._

It's much easier when you have success on your side. I would guess that's the
main reason why nobody talks about the majority of Dijkstra's list anymore:
There are overwhelmingly many examples of people who succeeded by ignoring or
opposing them. Waterfall software development isn't on the decline because smart people
won an argument, but because even big companies realize that the most
successful software today wasn't made that way.

------
gvb
_Many companies that have made themselves dependent on IBM-equipment (and in
doing so have sold their soul to the devil) will collapse under the sheer
weight of the unmastered complexity of their data processing systems._

Interestingly, this still applies except s/IBM-equipment/Microsoft software/.
Further, I have not seen any companies collapse due to the unmastered
complexity (IBM or Microsoft), but I _do_ see a lot of people that have no
clue what they are doing; they just do the same steps over and over because
someone in the dim dark past discovered that those steps formed a path through
the maze of complexity. The Office equivalent of "Up Up Down Down Left Right
Left Right A B Select Start".

~~~
dabent
I'm pretty sure entire companies don't collapse, but projects are canceled and
companies are limited in what they can do by relying on overly-complex
solutions. I know that from experience.

Of course, that's one thing that can give a startup an advantage against a
giant, provided the startup doesn't fall into the same trap.

------
hasenj
I don't find anything in this list relevant (except maybe the natural language
programming bit).

I think the most "uncomfortable" truth is the following:

* Bug-free/fault-free software is impossible

Continuous/iterative development is very effective at minimizing the number
and impact of bugs, but this is only relevant for web applications and for
applications with a decent update mechanism (e.g. Chrome, Apple's App Store,
Ubuntu's PPAs, etc.).

~~~
derleth
> Bug-free/fault-free software is impossible

This is an absolute if you don't have the luxury of formal standards that stop
evolving at a defined point in time.

It also means that you get to define 'bug-free' as 'implements the standard to
the letter', not 'does what every single user expects all of the time, even
when those expectations contradict and are insane.'

~~~
stcredzero
_This is an absolute if you_

...exclude certain conditions as unrealistic, or take the particular common
case as universal.

 _It also means that you get to define 'bug-free' as 'implements the standard
to the letter',_

When the specs are engineering specs, this can sometimes be usefully close to
true.

 _not 'does what every single user expects all of the time, even when those
expectations contradict and are insane.'_

Formal systems will usually tell you when specs are outright contradictory.
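
As a toy illustration (plain Python, two invented requirements): this is the kind of mechanical check such a system performs, done here by brute force over a small domain rather than symbolically.

    # Invented requirements for a hypothetical input value:
    #   R1: the value must be strictly less than 10
    #   R2: the value must be at least 10
    r1 = lambda x: x < 10
    r2 = lambda x: x >= 10

    # A real solver would decide this symbolically; exhaustive search over a
    # small finite domain is enough to show the idea.
    satisfiable = any(r1(x) and r2(x) for x in range(-1000, 1000))
    print("spec is", "satisfiable" if satisfiable else "contradictory")
    # -> spec is contradictory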

~~~
derleth
> Formal systems will usually tell you when specs are outright contradictory.

Right, and I'm sure they're useful when designing avionics software. But how
many web browsers have specifications, let alone specifications that can be
subjected to that kind of analysis? Doesn't every user have a potentially
self-contradictory and possibly insane 'specification' for 'web browser' in
their head, and don't they think it's a bug when that 'specification' is not
adhered to?

Maybe I should have just said there are two kinds of software: Software that
is an implementation of a mostly-stable specification, like a POSIX-compliant
OS or the aforementioned avionics software, and software for which no
specification, or at least no stable specification, exists, such as web
browsers and text editors.

------
meric
"It is practically impossible to teach good programming to students that have
had a prior exposure to BASIC: as potential programmers they are mentally
mutilated beyond hope of regeneration."

BASIC was the first programming language I was exposed to when I started
learning to program 5-7 years ago... I would hope today this truth is false.

It did, however, take me 2 years to learn something other than BASIC, plus
another year of writing BASIC in Java...

~~~
andreyf
Have you tried learning Scheme? My first language was C, which I learned very
much as if it were BASIC (plus pointer arithmetic), and I related very much to
Dijkstra's observation when a couple of math major friends and I started
learning about continuations in college.

Either my friends were unequivocally smarter than me, or my knowledge and
experience with C/Java/Python really got in the way of my understanding. The
difference in how quickly we grasped the material was really clear: my mental
model was that of saving and recalling the call stack, while theirs somehow
became intuitive in a way I still don't fully grasp.
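
To make the contrast concrete, a toy continuation-passing sketch in Python (my own example, not from the course): the "rest of the computation" becomes an explicit function value instead of living implicitly on the call stack.

    # Direct style: the call stack implicitly remembers what happens next.
    def add1_then_double(x):
        return (x + 1) * 2

    # Continuation-passing style: "what happens next" is an explicit function k.
    def add1_cps(x, k):
        return k(x + 1)

    def double_cps(x, k):
        return k(x * 2)

    def add1_then_double_cps(x, k):
        # The continuation is an ordinary value: it could be stored,
        # invoked later, or invoked more than once.
        return add1_cps(x, lambda v: double_cps(v, k))

    print(add1_then_double(20))                   # 42
    print(add1_then_double_cps(20, lambda v: v))  # 42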

~~~
xiongchiamiov
I've heard that introductory courses that use Scheme go over pretty well, but
most people's heads explode when they take Programming Languages I (in Scheme)
at my university. It seems that after 3 or 4 years of Java, C and C++, their
minds have formed certain expectations of what a programming language is
supposed to be like (which is helpful when learning a new language that shares
those expectations).

------
bartwe
I find it hard to believe in the way Dijkstra seems to want to develop
software. To me he seems to believe in the existence of faultless, consistent
and complete specifications. I don't believe that the specifications and
requirements of sufficiently complex software can have any of these qualities
in real life, in part because of the humans that interface with the systems.

~~~
stuhacking
I agree. There's plenty of literature supporting almost the opposite of what
Dijkstra preaches. While it's good to aim for formal specification, in
practice you're likely to overdesign, deliver late and potentially find
yourself unable to react quickly to changing requirements. The key is to
deliver a working piece of software because handing your customer a page of
beautiful mathematical notation is not going to solve their problem.

(I'm aware of the danger of referencing Worse is Better... but I'll do it
anyway :-) ) <http://www.dreamsongs.com/WorseIsBetter.html>

<http://ravimohan.blogspot.com/2007/04/learning-from-sudoku-solvers.html>

In addition, the books 'Hackers' by Steven Levy and 'Coders at Work' by Peter
Seibel describe various scenarios where a group of hackers had to get
something done in a short space of time. Almost invariably, they aimed for the
final product.

------
anonymous
> The use of anthropomorphic terminology when dealing with computing systems
> is a symptom of professional immaturity.

He makes it sound as if this were a bad thing. See
<http://www.catb.org/jargon/html/anthropomorphization.html>

------
edw519
The first time I read this, I laughed.

The second time, I cried.

Sure, there are some pretty funny (and clever) thoughts here. The quote about
COBOL is priceless.

But this...

 _It is practically impossible to teach good programming to students that have
had a prior exposure to BASIC: as potential programmers they are mentally
mutilated beyond hope of regeneration._

FWIW, I have written over 1 million lines of BASIC for over 100 customers,
most of it still in production, and all of it doing important work producing
goods, services, and jobs in so many uncool industries we couldn't live
without.

Maybe I'm an outlier, but I have gone on to learn algorithms, dynamic
programming, database theory, client/server, and web development. I believe
the elegant simplicity of BASIC and database theory, although limited in
application, has provided an excellent base upon which to build.

I know that ewd is a giant to be respected, but I think it's a red flag when a
teacher mutters "practically impossible to teach", even in jest. IMHO, that
says more about the teacher than the student.

Posts like this are great for a laugh, but when you stop to think about it,
all they really do is further amplify the perception of a huge gulf between
theory and practice. Academics whine while those of us in the trenches are too
busy to notice because our sleeves are rolled up while we build that which
must be built.

~~~
Peaker
The BASIC language has evolved a lot.

I think he's referring to very old BASIC, which used numbered lines and a whole
lot of GOTOs, and did not support subroutines (or did it? Correct me...), etc.

~~~
nollidge
BASIC has always supported subroutines, albeit in about the most primitive way
possible: the GOSUB command jumps to a particular line number, then the RETURN
command jumps back to the line after the GOSUB.
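
Roughly like this (a Python sketch of the control flow, purely illustrative since the original is line-numbered BASIC, not Python): GOSUB pushes the next line number onto a return stack, RETURN pops it, and all data lives in globals because there are no parameters or local variables.

    # Numbered "lines" of a toy program; GOSUB 100 computes Y = X * X.
    program = {
        10: "X = 3",
        20: "GOSUB 100",    # jump to line 100, remembering the line after this one
        30: "PRINT Y",
        40: "END",
        100: "Y = X * X",   # the "subroutine": no arguments, only global variables
        110: "RETURN",      # jump back to the line saved by the matching GOSUB
    }

    variables, return_stack, pc = {}, [], 10
    while program[pc] != "END":
        stmt = program[pc]
        next_pc = min((n for n in program if n > pc), default=None)  # fall through
        if stmt.startswith("GOSUB"):
            return_stack.append(next_pc)          # remember where to come back to
            next_pc = int(stmt.split()[1])
        elif stmt == "RETURN":
            next_pc = return_stack.pop()
        elif stmt.startswith("PRINT"):
            print(variables[stmt.split()[1]])     # prints 9
        else:                                     # assignment, e.g. "Y = X * X"
            name, expr = stmt.split("=", 1)
            variables[name.strip()] = eval(expr.strip(), {}, variables)
        pc = next_pc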

~~~
bitwize
ANSI BASIC has supported SUBs and FUNCTIONs going quite a ways back. Microsoft
just didn't bother to support that standard except in its commercial
compilers, and later beginning with QBasic.

Some non-Microsoft microcomputer BASICs (for example Extended BASIC on
TI-99/4A) support them to some extent.

~~~
Someone
That may be the case, but the oldest ANSI BASIC standard that Wikipedia
mentions is the one for Minimal BASIC, from 1978. This text is from 1975, and
I doubt that that standard had anything like what we nowadays call functions.

Typical BASICs of 1975 had GOSUB/RETURN, but no functions (= no arguments, no
return values). Apart from loop variables, all variables were global.

Commodore Basic had a "def fn" command, but it only supported single-line (=
single-statement) functions.

Add in the fact that one had to 'name' subroutines by line number, and
compare this with the Lisps and Fortrans of ten years earlier, and Dijkstra's
position becomes quite defensible.

~~~
stcredzero
_Commodore Basic had a "def fn" command, but it only supported single-line (=
single-statement) functions._

Sounds vaguely familiar!

------
mwg66
Plenty of Dijkstra's points remain relevant today. My favourite (and I
completely agree):

"Besides a mathematical inclination, an exceptionally good mastery of one's
native tongue is the most vital asset of a competent programmer."

~~~
nathanb
But even this one seems to be disproven by example. Some of the best
programmers I know are exceptionally slovenly in their written language. I
strongly suspect that this list of "uncomfortable truths" is intended to
illustrate that a competent programmer looks, for all intents and purposes,
like ewd himself.

~~~
mwg66
I disagree. I seldom meet a talented programmer who isn't able to articulate
themselves very well.

~~~
nathanb
There's a difference between "able to articulate well" and "exceptional
mastery of one's native tongue".

~~~
mwg66
Indeed. But I can't help but think that is the point he was making.

------
stcredzero
_By claiming that they can contribute to software engineering, the soft
scientists make themselves even more ridiculous._

I might get voted down for taking him (and the post) dead-on, but I think
Dijkstra is just plain wrong here. Granted, this was written in 1975, but I
think a lot of thinking about human interface design has tremendously
benefited various computer fields. That said, there is a subset of "soft
scientists" for which this may be true.

------
chanux
OK. Can we build a modern day list of 'truths that might hurt'?

~~~
derleth
> Can we build a modern day list of 'truths that might hurt'?

Sure, if you can do it yourself.

Any collaborative process would just be opening the floodgates to trolling
beyond all reason.

------
jonpaul
His opening thesis starts out strong. But his language bashing is too myopic.
It's a shame to read some of these points from such a well respected computer
scientist. I guess it just shows that we're all capable of being human.

------
zdw
_Besides a mathematical inclination, an exceptionally good mastery of one's
native tongue is the most vital asset of a competent programmer._

Wow, prior art for what 37signals has been crowing about for the last year or so.

------
cybernytrix
I wonder what he would have said about C++!

------
Deprecated
Dijkstra (in 1975) strongly dislikes FORTRAN, BASIC, APL, PL/I, and COBOL.
What languages did he like? Pascal?

~~~
lmkg
Lisp. Dijkstra was the original Smug Lisp Weenie.

> _Lisp has jokingly been called "the most intelligent way to misuse a
> computer". I think that description is a great compliment because it
> transmits the full flavor of liberation: it has assisted a number of our
> most gifted fellow humans in thinking previously impossible thoughts._

~~~
fmw
_LISP's syntax is so atrocious that I never understood its popularity. LISP's
possibility to introduce higher-order functions was mentioned several times in
its defence, but now I come to think of it, that could be done in ALGOL60 as
well. My current guess is that LISP's popularity in the USA is related to
FORTRAN's shortcomings._

Source: <http://www.cs.utexas.edu/users/EWD/transcriptions/EWD07xx/EWD798.html>

The memo you're quoting[1] is a lot milder about Lisp, but doesn't come across
as the ramblings of a "Smug Lisp Weenie" (disclaimer: I'm currently having a
lot of fun playing around with Clojure, so I might not be entirely objective).

[1] <http://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/EWD1284.html>

------
olalonde
Please people, put the year in the title when you submit older stuff! (1975 in
this case)

~~~
CodeMage
It's _Dijkstra_. I can understand complaining about not knowing the date of a
post by Yegge, but did you really expect anything written by Dijkstra to be
dated newer than 2002?

~~~
olalonde
No, but there's a big difference between 1975 and 2002. The fact that the
author is famous or dead is irrelevant here.

When I read an article submitted on HN, I can safely assume it was written
within the past few months. If it isn't the case, I can't make assumptions
about its year of publication (OK, I can assume it was written within the
author's lifetime but that doesn't really help).

