
On the cruelty of really teaching computer science (1988) - hacktheplanets
https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html
======
vezzy-fnord
Previously on Hacker News (in no particular order):

[https://news.ycombinator.com/item?id=8909529](https://news.ycombinator.com/item?id=8909529)

[https://news.ycombinator.com/item?id=8872085](https://news.ycombinator.com/item?id=8872085)

[https://news.ycombinator.com/item?id=2122826](https://news.ycombinator.com/item?id=2122826)

[https://news.ycombinator.com/item?id=8368788](https://news.ycombinator.com/item?id=8368788)

[https://news.ycombinator.com/item?id=6701607](https://news.ycombinator.com/item?id=6701607)

[https://news.ycombinator.com/item?id=6838434](https://news.ycombinator.com/item?id=6838434)

[https://news.ycombinator.com/item?id=3553983](https://news.ycombinator.com/item?id=3553983)

[https://news.ycombinator.com/item?id=1666445](https://news.ycombinator.com/item?id=1666445)

[https://news.ycombinator.com/item?id=43978](https://news.ycombinator.com/item?id=43978)

[https://news.ycombinator.com/item?id=2090256](https://news.ycombinator.com/item?id=2090256)

[https://news.ycombinator.com/item?id=1989473](https://news.ycombinator.com/item?id=1989473)

~~~
AdieuToLogic
If only we could remember history...

------
dkarapetyan
Most of Dijkstra's writing is overly melodramatic, and this is no exception.
Now that computing is even more ubiquitous, his elitist attitude about
programmers is even less relevant. Driving a car does not require a degree in
mechanical engineering and thermodynamics; similarly, writing programs should
not require a degree in mathematics and logic. Also, reasoning by analogy and
intuition is one of the better ways to solve problems. If it is good enough
for people like Einstein and Feynman, then it is good enough for me, and
Dijkstra in this case is just plain wrong.

------
Ologn
> Carl Friedrich Gauss, the Prince of Mathematicians but also somewhat of a
> coward, was certainly aware of the fate of Galileo —and could probably have
> predicted the calumniation of Einstein— when he decided to suppress his
> discovery of non-Euclidean geometry, thus leaving it to Bolyai and
> Lobatchewsky to receive the flak.

There's a long history of mathematicians and scientists suppressing their
results.

Copernicus didn't publish about heliocentrism until he was on his deathbed.

Newton started developing calculus in 1665 but didn't publish his theories and
methods for calculus until 1693.

Darwin didn't publish until 1858, when Alfred Wallace had begun to rediscover
Darwin's theories. Darwin had already been theorizing about natural selection,
privately and among naturalist friends, since he returned from the Beagle in
1836.

I am not a physicist, but my understanding is that Einstein's formulas showed
him that the universe was expanding. He didn't trust what his own formulas,
which were radical at the time, were showing him, so he modified his equations
so that the universe would be stable - adding a cosmological constant.
Observations by astronomers like Hubble later showed him that his original
formulas were correct (my understanding of the field equations is shallow, so
I may have mischaracterized this).

~~~
pdonis
_> Einstein's formulas showed him that the universe was expanding._

More precisely, they showed him that there was no solution describing a static
universe; the universe would have to be either expanding or contracting.
Nothing in the math gave any reason to favor expansion over contraction; it's
just that, since we now know the universe is expanding, that's the possibility
we always focus on when talking about this episode.

 _> He didn't trust what his own formulas, which were radical at the time,
were showing him_

I don't think it was that he didn't trust the formulas; he had already
explored other solutions that were giving correct predictions. It was just
that he thought the universe should be static, so he didn't like the fact that
there wasn't a static solution to his original equations.

 _> Observations by astronomers like Hubble later showed him that his original
formulas were correct_

More precisely, they showed that the universe was expanding, so there was no
need to insist on having a static solution to the equations. But, as FreeFull
pointed out, we now know that there is a nonzero cosmological constant, so
Einstein's original equations, without that constant, weren't quite correct
anyway.

------
serve_yay
"It is probably more illuminating to go a little bit further back, to the
Middle Ages. One of its characteristics was that 'reasoning by analogy' was
rampant; another characteristic was almost total intellectual stagnation, and
we now see why the two go together. A reason for mentioning this is to point
out that, by developing a keen ear for unwarranted analogies, one can detect a
lot of medieval thinking today."

Nice.

------
millstone
This speech was so quintessentially Dijkstra. Here's where it got
_super_-Dijkstra:

> Right from the beginning, and all through the course, we stress that the
> programmer's task is not just to write down a program, but that his main
> task is to give a formal proof that the program he proposes meets the
> equally formal functional specification.

And so we produce programmers who are helpless without formal functional
specifications, and are therefore incapable of writing any software in the
real world, with conflicting, underspecified, and post-hoc requirements.

But more importantly: this sucks all of the joy out of it! Who here got into
programming out of a passion for translating formal functional specifications
into formal proofs? How can any of the _magic_ of software survive such a
mindset?

~~~
agumonkey
Honest question: isn't it a historical bias? Maybe in his days programming
projects were devoted to huge, complex and costly problems that were always
pretty strictly defined, unlike the late 70s ~Apple/Microcomputer days?

> But more importantly: this sucks all of the joy out of it! Who here got into
> programming out of a passion for translating formal functional
> specifications into formal proofs? How can any of the magic of software
> survive such a mindset?

Sure, but I do like the bliss of watching a tiny piece of code and knowing
it's sound by induction.
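
For instance (a minimal sketch of my own, not from the article): the classic
recursive list sum, where the induction argument follows the shape of the code
exactly:

    def total(xs):
        # Base case: the sum of the empty list is 0.
        if not xs:
            return 0
        # Inductive step: assume total(xs[1:]) is the sum of the tail;
        # adding the head then gives the sum of the whole list.
        return xs[0] + total(xs[1:])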

~~~
AnimalMuppet
> maybe in his days programming projects were devoted to huge, complex and
> costly problems that were always pretty strictly defined, unlike the late
> 70s ~Apple/Microcomputer days?

Huge? Not by today's standards. Complex? Again, not by today's standards, but
they didn't have today's libraries to build on. Costly? Definitely.

Pretty strictly defined? They tried to be, but the gaps always showed.
Certainly they were _not_ strictly defined by Dijkstra's standards.

------
galaxyLogic
I think Dijkstra's viewpoint is a bit "elitist" here. He talks about what the
best things for students to learn are (formal methods, etc.) instead of
talking about what the best ways to teach them would be. He takes the
viewpoint of the professor, unsurprisingly, not the viewpoint of the student.

~~~
JadeNB
> I think Dijkstra's viewpoint is a bit "elitist" here.

I don't think that there's any need for scare quotes; Dijkstra was
consistently and unapologetically elitist, and probably would (and maybe even
did?) say as much himself.

------
tbrownaw
If you have a hammer, everything looks like a nail. If you're a mathematician,
everything looks like a math problem.

.

We're not really that special.

.

.

 _The programmer is in the unique position that his is the only discipline and
profession in which such a gigantic ratio, which totally baffles our
imagination, has to be bridged by a single technology. He has to be able to
think in terms of conceptual hierarchies that are much deeper than a single
mind ever needed to face before._

I disagree.

A city planner does not need to know or care how to make bricks. An
application developer does not need to know or care how HashMap<> internals
work (well, not after they pass the interview at least).

An identical program can run on 1MB of data or 1TB of data, without changing
it or adding layers of abstraction. A city with 4x as many people is not just
4 1x cities stuck together.
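
To make the first point concrete (an illustrative sketch with a hypothetical
count_lines helper of my own, not anything from the article): a line counter
that streams its input neither changes nor gains an abstraction layer when the
file grows from 1MB to 1TB:

    def count_lines(path):
        # Stream the file one line at a time; memory use stays constant
        # whether the file is 1MB or 1TB.
        n = 0
        with open(path) as f:
            for _ in f:
                n += 1
        return n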

Complexity and the number of abstraction levels _do not_ necessarily scale
with system size in elementary particles.

.

 _To this I should add that, to the extent that we view ourselves as
mechanisms, we view ourselves primarily as analogue devices: if we push a
little harder we expect to do a little better. Very often the behaviour is not
only a continuous but even a monotonic function: [...]_

Of interest is that while this view may be _popular_, I think it's beginning
to become apparent that it's not entirely _accurate_. Besides better-known
effects like overtraining, there's also reduced productivity with too-high
work hours, reduced creativity in the face of excessive pressure, etc. And
it's not just the "monotonic" part that's questionable -- things like burnout
and the traditional "mid-life crisis" and mental breakdowns hint that we're
not even particularly continuous.

 _In the discrete world of computing, there is no meaningful metric in which
"small" changes and "small" effects go hand in hand, and there never will be._

This also applies to complex mechanical systems, and to anything with DNA.
Whatever mutations cause cancer can't be that big a change, or they wouldn't
happen so often.
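
A one-character illustration of Dijkstra's point (a hypothetical sketch, not
from the comment): in a standard binary search, turning the single '-' below
into a '+' makes the loop run forever on inputs like search([1, 3], 0), so the
"small" edit has an unboundedly large effect:

    def search(xs, target):
        # Standard binary search over a sorted list; returns an index or -1.
        lo, hi = 0, len(xs) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if xs[mid] == target:
                return mid
            elif xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1  # change this '-' to '+' and search([1, 3], 0) loops forever
        return -1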

.

 _The nice thing of this simple change of vocabulary is that it has such a
profound effect: while, before, a program with only one bug used to be "almost
correct", afterwards a program with an error is just "wrong" (because in
error)._

This is very silly. Not all bugs/errors render a program wholly useless.

.

 _[Don't anthropomorphise programs/computers.] ... ...deal with all
computations possible under control of a given program by ignoring them and
working with the program. We must learn to work with program texts while
(temporarily) ignoring that they admit the interpretation of executable code._

This depends entirely on what you're doing. Are you trying to study
computation using some program as an example, or are you trying to accomplish
some real-world effect with that program?

 _A programming language, with its formal syntax and with the proof rules that
define its semantics, is a formal system for which program execution provides
only a model. It is well-known that formal systems should be dealt with in
their own right, and not in terms of a specific model. And, again, the
corollary is that we should reason about programs without even mentioning
their possible "behaviours"._

Since the current context is the evils of anthropomorphization, for what
assumed use is "formal system" a better (more useful) model than "intentional
actor"? Insistence on formality becomes especially weak when the system
_can't_ be formally defined, either because the behavior depends too much on
(unknown) input data or because the formal spec is intractable.

.

.

 _Right from the beginning, and all through the course, we stress that the
programmer's task is not just to write down a program, but that his main task
is to give a formal proof that the program he proposes meets the equally
formal functional specification._

We've since discovered that all this will accomplish is to pass the buck to
the people who wrote the invariably-buggy (I mean, erroneous) spec. Which,
since it's fully formal, could just be mechanically translated into a program
anyway.
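
A rough illustration of that last point (a hypothetical sketch using sorting
as the task, not something from the thread): the "equally formal functional
specification" is itself code, so any mistake in it is simply a bug that has
moved upstream:

    from collections import Counter

    def is_sorted(ys):
        # Every adjacent pair is in order.
        return all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1))

    def is_permutation(xs, ys):
        # Same elements with the same multiplicities.
        return Counter(xs) == Counter(ys)

    def meets_sort_spec(xs, ys):
        # The formal spec of sorting, written as an executable predicate:
        # ys is ordered and is a rearrangement of xs. An error here is
        # still a bug; it has just moved from the program into the spec.
        return is_sorted(ys) and is_permutation(xs, ys)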

.

.

I think some of the things I learned were pretty good (and in a pretty good
order): use of mathematical induction to prove that recursive algorithms work
(introductory); complexity analysis (same introductory course); lambda
calculus and compilers (4xx level). That's certainly better than trying to
focus on formal methods, when all they can accomplish is to _move_ the
informality. I'd have also benefited from something on creating and refining
models, or turning a problem into a spec (because really, building the right
thing is always harder than building the thing right). For CS rather than
CompEng, I'd also suggest more depth on discrete math than I took, and
whatever kind of math is appropriate for distributed systems (queues and
graphs and flow rates and stuff).

~~~
AnimalMuppet
> This depends entirely on what you're doing. Are you trying to study
> computation using some program as an example, or are you trying to
> accomplish some real-world effect with that program?

I don't think Dijkstra was actually interested in producing programs that
accomplished some real-world effect.

> That's certainly better than trying to focus on formal methods, when all
> they can accomplish is to _move_ the informality.

Correct, and very insightful. (Roger Penrose made the same point in "The
Emperor's New Mind" - at some point, it all starts with an informal idea of
what the program is supposed to do.)

------
ForcesOfOdin
This is a good read.

