
How do we tell truths that might hurt? (E. Dijkstra, 1975) - Luyt
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EWD498.html
======
rauljara
> How do we tell truths that might hurt?

Well, if you are attempting to influence someone like me, not like this. I
think Dijkstra's great reputation and 37 years of programming experience are
enough for anyone reading this today to take all of his statements as
basically true; but if someone published a similar list dissing all your
favorite programming languages... well, we've all witnessed internet flame
wars.

The basic problem here is that the list is nothing but assertions. Dijkstra
is 'right' about almost all of them, but there's no proof, or even an attempt
to argue the point. If you didn't already agree with him, or, god forbid,
actually used one of the languages, what you'd hear is Dijkstra saying you
are stupid for using that language.

And humans have a very natural tendency, when someone calls them stupid, to
assert angrily that they are not. The argument then becomes a form of "no,
I'm not stupid for liking Apple products, you're stupid for liking Android."
Which entirely misses the point of what Dijkstra is doing.

In my experience, the best way to get someone to change their mind is to show
them a better way, and show them how and why it is better without ever
insulting the old way or implying people who follow it are stupid.

Easier said than done, of course. But the insults are unnecessary and, more
importantly, distracting. They tend to be more about the ego of the person
hurling them than anything else. And stuff is already hard enough to figure
out even without egos getting involved.

~~~
tikhonj
That's entirely fair.

The most disheartening thing is that what you described _is_ a good way to
convince somebody, and even that often doesn't work. (As, I'm sure, many
advocates of functional programming know well.)

If your goal is to convince somebody of something, this rational approach may
be the _best_, in some sense, but it probably isn't the most efficient.
Instead, I imagine a combination of _some_ good points with a bunch of tactics
borrowed from advertisers and behavioral psychologists is the most likely to
actually convince people. I would love to believe that programmers are all
rational and listen only to reason, but I've too often found that not to be
the case.

~~~
pretoriusB
Well, if you could absolutely convince anybody of anything, that would be
akin to hypnotism and would amount to fascism.

So, let's assume that you only want to convince people of things that are
indeed true.

Then things go awry: how do you know what's true? Maybe it's not, and they
are rightfully not convinced. Maybe there is more than one way the thing can
be seen. Etc.

Take something that is "obvious" for some people, like "PHP is a bad
language".

It's not at all obvious to me that even that one is universally true. One
could answer, e.g.: all languages have sore points, so it's no big deal; if
the reversal of needle-and-haystack arguments is your biggest worry about a
library, then it's pretty much fine; I can easily find programmers, and
that's what matters to me; it has never failed my projects; it works for
Facebook; it has tools and communities I cannot find elsewhere, like
WordPress; etc., etc.

And PHP is as clear-cut a case as it gets. Fighting Fortran? Not so much.

------
borplk
Breaking news: our computing technology is not perfect, and neither is our
world.

He dismisses and complains about every piece of popular technology back then
as if the world is about to collapse because _hey! look, these apes invented
COBOL and FORTRAN_.

What did he do? Did our world come to an end because of how terrible COBOL
was? No. We learned some lessons and we paid the price, and now we're doing
better; this is how it works.

In every piece I read from Dijkstra, it seems he is having a hard time
accepting the imperfect aspects of our computing technology. He wants the
perfection of formal proofs to be everywhere, in every program.

~~~
tluyben2
He exaggerated his points because he saw the big picture of what would become
reality: untrained people writing buggy, horrible software. He knew that
would happen and tried to use his position to do something about it.

And are we doing better? Computers were doing important things back then, but
now they are almost driving our cars; they are flying our planes and giving
us radiation therapy. All that, usually, without formal proof and with some
kind of 'but we use unit tests, so...' attitude. Outsourced if possible,
because then it's cheap.

Of course, humans generally make more mistakes than buggy software does;
however, his point was that we have ways to almost prevent this by properly
training software developers and writing proper software. You cannot prevent
everything, as he suggested we could (but, again, he exaggerated to make a
point); you can make things better by not calling people who took some online
course "programmers" and by creating open and robust systems.

If you are young enough, I would be willing to bet that at some point in your
life you will be seriously hurt, or even die, from the consequences of not
taking this seriously enough. You are probably losing money to software bugs
already; besides rounding errors, you (probably) have no idea how many bugs
there are in the software that processes your insurance, taxes, etc., all
written in that evil COBOL by untrained monkeys. I know this for a fact in
the Netherlands, as I have seen it up close; it's probably true everywhere.

~~~
borplk
Hmm...makes sense. I see your point and indeed I do agree.

However, again, why didn't he do something practical about the perfect
programming languages he seems to talk about?

Whatever technology you name, it is always possible to dismiss it by pointing
to another level of perfection, but what is that beyond imagination?

We could perform our surgeries in this and that way...

We could build our cities and buildings in this superior way...

------
johncoogan
As someone born more than a decade after this was written, I can't say I
understand its context. Would anyone care to elaborate for me?

Specifically, I'd like to know what opinions about Fortran, Basic, Cobol, PL/I
and APL were common at the time.

Also, how was this received?

~~~
btilly
The easiest way to update it is to replace those with the closest current
equivalents.

Fortran = C, Basic = PHP, Cobol = Java, PL/I = Perl, and there is nothing on
the landscape that is really equivalent to APL.

So let me say more about APL.

APL basically started as math notation turned into a programming language. In
order to type it you needed a special keyboard for all of the extra symbols
you needed. It evolved into an incredibly compact and versatile language, with
a well-deserved reputation for being hard to read. In fact, the classic
challenge among APL programmers was that one would write a program in a
single line, and the other would try to figure out what it did!

The Wikipedia article on the language is at
<http://en.wikipedia.org/wiki/APL_(programming_language)>. They offer the
following example of an implementation of Conway's game of life in APL:

life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}
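
For anyone who doesn't read APL, here is a rough, unoptimized Python sketch
of what that one-liner computes (a hypothetical translation for illustration,
not how an APL programmer would work): a cell is alive in the next generation
iff the sum of its 3x3 neighborhood, including the cell itself, is 3, or is 4
while the cell is currently alive, with the grid wrapping around at the edges
as APL's rotate operators do.

```python
# Rough translation of the APL Game of Life one-liner into plain Python.
# next cell = (3x3 neighborhood sum == 3) or (cell alive and sum == 4),
# with the grid treated as a torus (edges wrap), matching the rotates.

def life(grid):
    rows, cols = len(grid), len(grid[0])

    def nbhd(r, c):
        # Sum of the 3x3 neighborhood centered on (r, c), self included.
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1))

    return [[1 if nbhd(r, c) == 3 or (grid[r][c] and nbhd(r, c) == 4) else 0
             for c in range(cols)]
            for r in range(rows)]
```

For example, a three-cell blinker oscillates between horizontal and vertical
with period two, as expected.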

 _Update:_ Apparently one or more of my comparisons are offending people. If
you'd respond indicating which comparison offends you, I'll explain why I made
the comparison.

~~~
eropple
The comparisons you've made are flippant ones without much serious weight to
them. I didn't downvote you, but, for example, COBOL bears no resemblance to
Java beyond "it's used for business applications." Unlike COBOL, Java has a
thriving non-business ecosystem and is a suitable tool for a lot of varied
tasks. And, perhaps more importantly, it is regarded as such--the opinions
held of COBOL at the time do not look remotely similar to the opinions about
Java today.

I can't speak to PL/I : Perl so I won't try, but FORTRAN : C is likewise
unfair. BASIC : PHP is closer, but still denies the credit (yes, there's some)
that the PHP language, core library, and community deserve.

(EDIT: I will totally agree with you that APL is cool, though. ;-) )

~~~
btilly
I said closest equivalents, not close. And, in terms of understanding a 1975
comment with 2012 language comparisons, I challenge you to do better.

As for Java, it, like COBOL, was intentionally designed from the start to
subtly restrict programmers so that large groups could work together on
extremely portable programs. Both became popular in lots of "boring business
applications" that will live forever. And if you think that current opinions
about Java are dissimilar from old opinions about COBOL, you need to get out
of your echo chamber and learn what adherents of various scripting languages
think about Java. If you're tempted to retort that there are more Java
programmers than scripting programmers, and Java programmers like the
language, a 1975 COBOL programmer could justifiably have said the same thing.

The Perl vs PL/I comparison is not original to me. A bit over a decade ago MJD
took a 1967 description of PL/I, did word substitution, and came up with
<http://perl.plover.com/perl67.html>. The description fits - both languages
combined several previous ones in an idiosyncratic way, with a strong emphasis
on making it easy to get stuff done. The result in both cases was a language
which made it easy for "sysadmin types" to solve real problems, with solutions
that in time could become problems of their own.

The FORTRAN to C comparison is, admittedly, unfair as languages. But both have
the characteristic of being low-level languages that are not (by current
standards) very expressive, with large established code bases, riddled with
repetitive mistakes that cause problems.

You admit that BASIC to PHP is better. If you think that I have failed to
give the PHP language, core library, and community the full iota of respect
that they deserve, then I likewise believe that you fail to give Kemeny and
Kurtz full respect for deciding in the early 60s that computing was going to
be a universal right, and that there needed to be a programming language for
non-programmers that would let them unleash that potential, and then for
designing a language that successfully served that purpose for decades.

~~~
MaysonL
_And, in terms of understanding a 1975 comment with 2012 language comparisons,
I challenge you to do better._

How about simply: The web is broken, Windows is almost as bad, Unix is getting
awfully long in the tooth, and Javascript is braindead.

~~~
sixbrx
And that productive people continue to get amazing stuff done with all of
these technologies, and that starting over from scratch in any of these areas
would be far harder than critics would like to admit.

------
millstone
> Programming is one of the most difficult branches of applied mathematics;
> the poorer mathematicians had better remain pure mathematicians.

This is completely backwards. The majority of papers about programming are
accessible to most good programmers, perhaps after a little study. The
majority of pure mathematics papers are inaccessible to almost all
mathematicians.

What programming achievement is comparable in difficulty, complexity, or scope
to the Classification of Finite Simple Groups or the Poincaré conjecture?

~~~
btilly
You seem to be under the impression that being difficult to understand is a
sign that you're working on difficult stuff. I disagree. It more often means
that you've got so much jargon and such poor expository skills that you're not
understandable. See
<http://bentilly.blogspot.com/2009/11/why-i-left-math.html> for more on my
perspective about math in particular.

 _What programming achievement is comparable in difficulty, complexity, or
scope to the Classification of Finite Simple Groups or the Poincaré
conjecture?_

Let's compare the Classification of Finite Simple Groups with the Linux
kernel.

The classification spans tens of thousands of pages written over decades by
about 100 mathematicians. In the end, large key papers were so poorly
reviewed that some of the most pivotal people in producing the classification
came to the opinion that the whole could not truly be trusted, and began an
attempt to redo the entire proof in a more understandable and verifiable
form.

The Linux kernel is about 15 million lines of code from a few thousand
developers (most of whom, admittedly, only contributed one patch) over a
period of 20 years, and it is widely tested in the real world on everything
from phones to massive compute clusters.

I agree, they are not comparable achievements. The Linux kernel is bigger,
involved more people, and we have more assurance that it actually works.

~~~
omra
I think the entire statement is silly in that it attempts to compare two
entirely different branches of mathematics; it would be almost like arguing
over physics versus biology. Physics is the study of matter and energy.
Biology is the study of life. Any attempt to assign some "order" to them or
declare one "harder" than the other is subjective and pointless.

However, there are some factual problems with your reply which I want to
address. The Classification of Finite Simple Groups was a large endeavor
spanning over an entire century. It was not made "by about 100
mathematicians".

The fact that the current proof was not holding up is not sufficient to
conclude that the result was a failure or that pure mathematics is somehow
worse than computer science. It would be similar to writing some code and
then suddenly realizing that you have to rewrite it to get acceptable
performance or to stop it from becoming spaghetti code. Neither of these
reflects badly on the field of computer science.

The errors you refer to are not much different from bugs, which of course
are inevitable.

~~~
btilly
Let me address the claim of factual errors.

I pulled my estimate of the effort put into the classification from the second
paragraph of
[http://en.wikipedia.org/wiki/Classification_of_finite_simple...](http://en.wikipedia.org/wiki/Classification_of_finite_simple_groups).
Do you have a better source that I should have used? If so, then cite it and
update Wikipedia.

I pulled my estimate of the effort for the Linux kernel from
[https://www.linux.com/learn/tutorials/560928-counting-contri...](https://www.linux.com/learn/tutorials/560928-counting-contributions-who-wrote-linux-32)
and extrapolated from the fact that if 1,316 developers had patches in one
major kernel release, then over all versions we probably had thousands of
developers contributing code, but probably not tens of thousands.

The facts that I presented are the best that I have available.

On the overall question of the comparison, I did not choose what should be
compared; I merely compared them by the most convenient criteria that I
could, and came to the opposite conclusion from the previous poster.

~~~
omra
I've already counted 90+ authors and contributors from a _brief_ history of
finite simple groups, using [1]. This includes "et al.", which, following APA
style guidelines, I have taken to indicate at least six authors. I also could
have counted more using [2], but I believe this is sufficient.

I feel that the statement that "only 100" mathematicians worked on the
classification of finite groups is disingenuous. There is clearly a disparity
between contributing a kernel patch and dedicating your entire life to
researching a mathematical topic.

As for the Linux kernel, I would like to point out that the original paper [3]
states that 7,944 developers have contributed since 2.6.11 (which is more than
your estimate, but important regardless). A paragraph later on is a little
more telling:

>[D]espite the large number of individual developers, there is still a
relatively small number who are doing the majority of the work. In any given
development cycle, approximately 1/3 of the developers involved contribute
exactly one patch. Over the past 5.5 years, the top 10 individual developers
have contributed 9% of the total changes and the top 30 developers have
contributed just over 20% of the total.

That is to say, 30 developers have made 20% of Linux. This is not unlike the
classification of finite groups, where a small number of people have done the
large majority of the work. It would be insane to say only 100 people have
ever looked at finite groups. Indeed, proving something is exponentially
harder than learning it. Similarly, you would not say that only 7,000 people
use Linux just because 7,000 have hacked on the kernel.

As for your last comment, regarding how you did not choose the topic, I
agree. But I think this entire debate is silly: neither side is more
important than the other, more difficult than the other, mutually exclusive,
or so on. I do not agree with Dijkstra on this point, and I feel that his
fame is no reason to excuse him for it.

[1]
[http://www.ams.org/journals/bull/2001-38-03/S0273-0979-01-00...](http://www.ams.org/journals/bull/2001-38-03/S0273-0979-01-00909-0/S0273-0979-01-00909-0.pdf)

[2] A History of Finite Simple Groups, by FCC Doherty.

[3] <http://go.linuxfoundation.org/who-writes-linux-2012>

~~~
btilly
I would assume that "et al." hides a lot of the same people contributing to
multiple papers. The Wikipedia estimate is likely wrong, but I don't think by
an order of magnitude.

I agree with you on the Linux kernel. I even noted that about half only
contributed one patch.

I suspect that I disagree with you about what Dijkstra's point was. When
you're programming, there is nowhere to hide from the fact that your code
does not run. By contrast, in pure math a surprising amount of sloppiness can
hide in the fact that minor papers do not always get a completely rigorous
review. Therefore, if you're inclined to that form of sloppiness, you're
better off in pure math. (Which is not to say that all people in pure math
are sloppy in that way.)

I don't think he's right - I've met too many programmers and bad computer
scientists to have illusions that all are capable - but he does have a point.

------
leoh
What sort of languages did Dijkstra really like?

~~~
enduser
He didn't use computers much, other than for email and browsing the web. Most
of his writing was composed by hand using a fountain pen. He did work on ALGOL
60 and it is believed that he approved of the language.

Source: Wikipedia

------
aeflash
I wonder what uncomfortable truths everyone is ignoring in today's
programming world.

~~~
jlouis
HTML... when PostScript was way better.

The many years spent with C and C++ when there are better alternatives around.

Microsoft Windows, OSX, Linux when Plan9 exists.

OOP.

NoSQL and its total misunderstanding of the power of having a declarative
query language.

Cache coherency, ordering guarantees, where we should opt for less.

~~~
arethuza
Out of interest, have you written much interactive PostScript?

~~~
tluyben2
No, but they could've extended that instead of creating a new monster?

~~~
rfurmani
I've done a fair bit of development inside Postscript (way back when it was
one of my favorite languages) and its problem in this context is that it /is/
a programming language, and one that would have to be effectively sandboxed.
HTML, on the other hand, is purely a markup language; it is extensible, can
be reflowed, and is much more natural for writing documents by hand.

------
peter_l_downs

        Besides a mathematical inclination, an exceptionally good
        mastery of one's native tongue is the most vital asset of
        a competent programmer.
    

Surprisingly true.

~~~
sb
I love that one, and it always surprises me when I meet people who can't
express themselves properly. There are some earlier EWDs where he meets a lot
with German computer scientists (in particular one of the founding fathers of
the field there, Friedrich L. Bauer, whom he calls "Uncle Fritz"), and it
seems that he had a good understanding of German, too.

There is also an interesting video of Dijkstra from a Dutch television
station
(<http://www.cs.utexas.edu/~EWD/video-audio/NoorderlichtVideo.html>).

------
cafard
If you are Dijkstra, with a zest that becomes its own attraction. He did a
great deal of important work, but seems too often to be remembered for
stinging epigrams.

------
pretoriusB
For a supposedly hard-core science guy, he was not above name-calling and BS
accusations based on pure opinion.

~~~
rwallace
Indeed, looking back over that list, I'm surprised by the fact that almost
everything on it is - and was at the time - simply wrong.

