
Computer Science vs. Computing Science (1999) - tosh
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/EWD1284.html
======
AnimalMuppet
> The latter, not surprisingly, turned programming from an intuitive activity
> into a formal one in which the program is derived by symbolic manipulation
> from its functional specification and the chosen form of correctness proof.

> The idea of a formal design discipline is often rejected on account of vague
> cultural/philosophical condemnations such as “stifling creativity”; this is
> more pronounced in the Anglo-Saxon world where a romantic vision of “the
> humanities” in fact idealizes technical incompetence. Another aspect of that
> same trait is the cult of iterative design.

> Industry suffers from the managerial dogma that for the sake of stability
> and continuity, the company should be independent of the competence of
> individual employees. Hence industry rejects any methodological proposal
> that can be viewed as making intellectual demands on its work force. Since
> in the US the influence of industry is more pervasive than elsewhere, the
> above dogma hurts American computing science most. The moral of this sad
> part of the story is that as long as computing science is not allowed to
> save the computer industry, we had better see to it that the computer
> industry does not kill computing science.

Typical Dijkstra. You do it his way or you're not only wrong, but also stupid.
But "his way" requires a _correct_ formal specification that you can operate
on using mathematical logic. We have learned, from bitter experience, that
that's asking quite a lot, and even when you can get it, it's a recipe for
building systems that are obsolete on the day they are completed. Yet Dijkstra
is content to pronounce the resulting system "correct". That approach may cut
it in academia, but in the real world it has been tried and found sorely
wanting.
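
(For concreteness, here is a minimal sketch, my own illustration and not from
the article, of what a specification you can "operate on" looks like in
practice. Python assertions stand in for Dijkstra's notation, and the function
name isqrt is just an example; this is a pale shadow of a real derivation, but
it shows the shape: precondition, invariant, and postcondition are written down
first, and the loop body is chosen to preserve the invariant.)

    # Specification: given n >= 0, compute the largest integer r with r*r <= n.
    # The invariant and postcondition are stated up front; the loop body is
    # chosen so that they provably hold.

    def isqrt(n: int) -> int:
        assert n >= 0                          # precondition

        r = 0
        # Invariant: r*r <= n  (true initially; each step must preserve it)
        while (r + 1) * (r + 1) <= n:
            r += 1
        # Postcondition: r*r <= n < (r+1)*(r+1)
        assert r * r <= n < (r + 1) * (r + 1)
        return r

    # e.g. isqrt(10) == 3, since 3*3 <= 10 < 4*4
    assert isqrt(10) == 3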

Then you get to aspects that really are a combination of art and science.
Would Dijkstra's approach have produced, say, the iPhone user interface? I
strongly doubt it.

~~~
rramadass
I used to think like you, until one day I read D. L. Parnas's paper "A Rational
Design Process: How and Why to Fake It"
([http://users.ece.utexas.edu/~perry/education/SE-Intro/fakeit...](http://users.ece.utexas.edu/~perry/education/SE-Intro/fakeit.pdf)).
I could then understand Dijkstra's point of view. He was advocating for rigor
in programming, and arguing that the very effort required to hew to his ideal
would result in better software. That is the gist of his argument.

Unfortunately, humans are heuristic, pattern-matching, infinitely adaptable
organisms, and hence absolute rigor is very difficult for us. The fact that
programming languages have evolved to the point where one can easily code up
any incoherent stream of consciousness and still have the system as a whole
work in some fashion has contributed to both the boom (anybody can contribute)
and the mess (quality sorely lacking) in the industry.

PS: Never ever dismiss pioneers like Dijkstra, Hoare, Wirth and others
offhand. They built up the Fundamental Theory, Science and Tools which
directly led to today's Computer Industry and its effect on Society.

------
omazurov
_> In the early days, John von Neumann has looked at the problem of
constructing a reliable computer out of unreliable components but the study
stopped as von Neumann died and transistors replaced the unreliable valves. We
are now faced with this problem on a global scale and it is not a management
problem but a scientific challenge._

Wow, John von Neumann is my man! I need to get more exposure to his
intellectual legacy. I couldn't articulate why my POC of parallel computation
surviving asynchronous loss of threads [0] would be of any interest (I don't
see any practical value in it myself and treat it as a piece of art). I think
the man would find more than two words to respond to it.

[0]
[https://github.com/OlegMazurov/Robusta](https://github.com/OlegMazurov/Robusta)

~~~
procgen
You might be interested in David Ackley's work on robust-first, best-effort
computing. Here's a relevant paper:
[https://www.usenix.org/legacy/events/hotos11/tech/final_file...](https://www.usenix.org/legacy/events/hotos11/tech/final_files/Ackley.pdf)

He's working on a pertinent hardware project called the T2 Tile and is
documenting it all on YouTube:
[https://www.youtube.com/channel/UC1M91QuLZfCzHjBMEKvIc-A](https://www.youtube.com/channel/UC1M91QuLZfCzHjBMEKvIc-A)

~~~
omazurov
Thank you so much! From a first look, it's right on the money.

 _> Our strategy is to entice smart people to play with programming robust,
indefinitely scalable computations by lowering barriers to entry, and as
idioms, motifs, and best practices emerge from such explorations, we believe
opportunities for wise optimization will arise._

I believe I have unknowingly contributed to that strategy.

------
AnimalMuppet
> LISP had its serious shortcomings: what became known as “shallow binding”
> (and created a hacker’s paradise) was an ordinary design mistake; also its
> promotion of the idea that a programming language should be able to
> formulate its own interpreter (which then could be used as the language’s
> definition) has caused a lot of confusion because the incestuous idea of
> self-definition was fundamentally flawed.

Would any Lisp expert care to comment on this?

~~~
olooney
Modern terminology is "dynamic" vs. "lexical" scope[1][2]. Lexical scope makes
it very easy to reason about closures[3]. Dynamic scoping is slightly easier
for language authors to implement. Terrible languages like ColdFusion[4] and
MUMPS[5] use dynamic scoping, although lexical scoping constructs were bolted
on later. LISP has both[6], or at least it does now.

Dynamic scope makes it possible to do certain kinds of code re-use that are
not possible with lexical scope[7]; these are often called "macros", but what
exactly is meant by that varies from language to language. In general, only an
"unhygienic" macro requires dynamic scope to implement, while hygienic macros
generally require a pre-processor but no access to dynamic scope[8].

Because dynamic scope leads to all kinds of unexpected side effects, makes
closures difficult to implement, and is only really useful for writing the
worst and least useful kinds of macros, the programming language world has
come down hard in favor of lexical scoping, which is now ubiquitous: almost
every modern language you've heard of uses lexical scope exclusively.
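
A rough sketch of the difference, in Python rather than LISP (Python is
lexically scoped; the dynamic_env / dynamic_let helpers below are purely my
own illustration of dynamic binding, not a standard API):

    import contextlib

    # --- Lexical scope: a closure resolves 'x' where the function was DEFINED.
    def make_adder(x):
        def add(y):
            return x + y          # 'x' is make_adder's x, captured at definition
        return add

    add5 = make_adder(5)
    x = 100                       # a global 'x' has no effect on add5
    assert add5(1) == 6

    # --- Dynamic scope (emulated): a free variable resolves to whoever most
    # recently bound it on the call stack, as in early LISPs' shallow binding.
    dynamic_env = {}
    _MISSING = object()

    @contextlib.contextmanager
    def dynamic_let(name, value):
        # Bind 'name' for the duration of the with-block, then restore it.
        old = dynamic_env.get(name, _MISSING)
        dynamic_env[name] = value
        try:
            yield
        finally:
            if old is _MISSING:
                del dynamic_env[name]
            else:
                dynamic_env[name] = old

    def add_dynamic(y):
        return dynamic_env["x"] + y   # 'x' depends on the caller, not the definition

    with dynamic_let("x", 5):
        assert add_dynamic(1) == 6
    with dynamic_let("x", 100):
        assert add_dynamic(1) == 101  # same function, different answer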

[1]:
[https://en.wikipedia.org/wiki/Scope_(computer_science)](https://en.wikipedia.org/wiki/Scope_\(computer_science\))

[2]:
[https://stackoverflow.com/questions/1047454/what-is-lexical-...](https://stackoverflow.com/questions/1047454/what-is-lexical-scope)

[3]:
[https://developer.mozilla.org/en-US/docs/Web/JavaScript/Clos...](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Closures)

[4]:
[https://en.wikipedia.org/wiki/ColdFusion_Markup_Language](https://en.wikipedia.org/wiki/ColdFusion_Markup_Language)

[5]:
[https://en.wikipedia.org/wiki/MUMPS](https://en.wikipedia.org/wiki/MUMPS)

[6]:
[https://www.emacswiki.org/emacs/DynamicBindingVsLexicalBindi...](https://www.emacswiki.org/emacs/DynamicBindingVsLexicalBinding)

[7]:
[https://en.wikipedia.org/wiki/Scope_(computer_science)#Macro...](https://en.wikipedia.org/wiki/Scope_\(computer_science\)#Macro_expansion)

[8]:
[https://en.wikipedia.org/wiki/Hygienic_macro](https://en.wikipedia.org/wiki/Hygienic_macro)

~~~
AnimalMuppet
So, if I understand all this correctly, Lisp originally had dynamic scope only
(which was Dijkstra's criticism), and later came to rely mostly on lexical
scoping?

And, would you also comment on Dijkstra's "self-definition" comment?

~~~
olooney
The self-definition comment is a reference to the common LISP exercise of
writing a "meta-circular evaluator"[1] - that is, writing a LISP interpreter
in LISP. This is famously done[3] in "Structure and Interpretation of Computer
Programs", a very popular textbook and MIT course that uses LISP as a teaching
language[2]. It can be found, for example, in lecture 7 of the SICP lecture
series on YouTube[4].

Dijkstra's comment on this is quite acerbic, isn't it? Here is the full quote
for easy reference:

> [LISP's] promotion of the idea that a programming language should be able to
> formulate its own interpreter (which then could be used as the language’s
> definition) has caused a lot of confusion because the incestuous idea of
> self-definition was fundamentally flawed.

I agree to the extent that writing a LISP interpreter in LISP is indeed often
a very confusing exercise for beginners, partly because writing any
interpreter (or compiler) is fairly arcane for a beginner, partly because
keeping track of which parts are in the implementation language and which
parts are in the target language becomes harder, and also because it's easy to
"cheat" and rely on the host language in inappropriate ways. As a first
exercise, I would recommend that a student use a language they already know
well to implement a moderately simple language. But on the whole, the exercise
of writing an interpreter in its own language - or alternatively writing a
self-hosting compiler[5] - is rewarding and does teach you a great deal about
how programs work.
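
If you want to see the shape of the exercise without the self-referential
part, here is a rough sketch of a tiny LISP-like evaluator written in Python -
the "language you already know well" variant recommended above. It is purely
my own illustration (the evaluate function and GLOBAL_ENV are made-up names)
and it omits almost everything a real interpreter needs:

    import operator

    GLOBAL_ENV = {
        "+": operator.add, "-": operator.sub,
        "*": operator.mul, "<": operator.lt,
    }

    def evaluate(expr, env):
        if isinstance(expr, (int, float)):      # self-evaluating literal
            return expr
        if isinstance(expr, str):               # variable reference
            return env[expr]
        op, *args = expr                        # a list is a form to apply
        if op == "if":                          # special form: (if test then else)
            test, then, alt = args
            return evaluate(then if evaluate(test, env) else alt, env)
        if op == "lambda":                      # special form: (lambda (params) body)
            params, body = args
            return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
        fn = evaluate(op, env)                  # ordinary application
        return fn(*(evaluate(a, env) for a in args))

    # ((lambda (n) (if (< n 2) 1 (* n 2))) 5)  =>  10
    program = [["lambda", ["n"], ["if", ["<", "n", 2], 1, ["*", "n", 2]]], 5]
    assert evaluate(program, GLOBAL_ENV) == 10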

I also can't agree that the so-called "incestuous idea of self-definition" is
"fundamentally flawed": it is in fact fairly interesting and related to
important meta-mathematical concepts like Gödel numbering[6], reductions[7],
and the Church-Turing thesis[8]. Writing a meta-circular evaluator seems like
a reasonably concrete way to start building intuition toward these important
ideas. Perhaps Dijkstra had in mind such problems as Curry's Paradox[9] or
Russell's Paradox[10], both of which are paradoxes arising from
self-reference? In my view, a meta-circular evaluator is much closer in spirit
to Gödel numbering (which _is_ valid and useful) than to these paradoxes.

[1]:
[https://en.wikipedia.org/wiki/Meta-circular_evaluator](https://en.wikipedia.org/wiki/Meta-circular_evaluator)

[2]:
[https://en.wikipedia.org/wiki/Structure_and_Interpretation_o...](https://en.wikipedia.org/wiki/Structure_and_Interpretation_of_Computer_Programs)

[3]:
[http://www.sicpdistilled.com/section/4.1/](http://www.sicpdistilled.com/section/4.1/)

[4]:
[https://www.youtube.com/watch?v=0m6hoOelZH8](https://www.youtube.com/watch?v=0m6hoOelZH8)

[5]:
[https://en.wikipedia.org/wiki/Self-hosting_(compilers)](https://en.wikipedia.org/wiki/Self-hosting_\(compilers\))

[6]:
[https://en.wikipedia.org/wiki/G%C3%B6del_numbering](https://en.wikipedia.org/wiki/G%C3%B6del_numbering)

[7]:
[https://en.wikipedia.org/wiki/Reduction_(complexity)](https://en.wikipedia.org/wiki/Reduction_\(complexity\))

[8]:
[https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis](https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis)

[9]:
[https://en.wikipedia.org/wiki/Curry%27s_paradox](https://en.wikipedia.org/wiki/Curry%27s_paradox)

[10]:
[https://en.wikipedia.org/wiki/Russell%27s_paradox](https://en.wikipedia.org/wiki/Russell%27s_paradox)

------
the_af
Interesting as expected of Dijkstra. I found this part funny:

> _" That name ['programming languages'] was unfortunate because very soon the
> analogy with natural languages became more misleading than illuminating. It
> strengthened the school of thought that tried to view programming primarily
> as a communication problem, it invited the psychologists in, who had nothing
> to contribute, and it seriously delayed the recognition of the benefits of
> viewing programs as formulae to be derived. It was the time of Edmund C.
> Berkeley’s “Giant Brains or Machines that Think”, it was the time of rampant
> anthropomorphism that would lead to the false hope of solving the
> programming problem by the verbosity of COBOL and would seduce Grace M.
> Hopper to write a paper titled “The Education of a Computer”. Regrettably
> and amazingly, the habit lingers on: it is quite easy to infuriate computing
> scientists by pointing out that anthropomorphizing inanimate objects is in
> science a symptom of professional immaturity."_

This school of thought that Dijkstra decried is the one that eventually
triumphed in our industry, and nowadays it is widely held that a general-
purpose programming language "succeeds" primarily when it enables
communication with/between programmers. Armies of consultants and bloggers and
Uncle Bobs make a living out of this idea. The whole "languages are first for
programmers to read, and second for computers to run" shtick.

I'm not sure what my own position is, but we do know what Dijkstra's was, and
that he would not be happy today.

~~~
AnimalMuppet
It's incongruous that Dijkstra, in a paragraph decrying anthropomorphism,
anthropomorphizes anthropomorphism (by saying that it seduced Hopper).

~~~
Jtsummers
I saw that and suspected it was intended as humor on his part. That or he
actually missed it when writing that letter.

------
shmerl
Computing Science is a much better name. I prefer to use that term over
"Computer Science".

~~~
mhh__
On the one hand yes, but I think Computer Science (as a superset) is the
better name, due to the added focus on learning how computers "actually" work.

The focus should be on computing, but understanding the machine is very
important.

~~~
User23
I agree with your observations, but reached the opposite conclusion and prefer
Computing Science as a name. I take this as evidence that it’s not really
important what we call it so long as we’re clear on what’s being studied.

------
Mountain_Skies
> would seduce Grace M. Hopper to write a paper titled “The Education of a
> Computer”.

I'm one of the least likely people to start screaming sexism at the drop of a
hat, but does this turn of phrase seem a bit, well, sexist?

On the nomenclature topic, it's interesting to note that Georgia Tech and many
others have a School of Computing, so it's not that unusual a term.

~~~
the_af
I don't see it. What part seems sexist to you? The sentence is relevant
because Grace Hopper's work was the precursor to COBOL, and COBOL is the
embodiment of what Dijkstra was complaining about.

~~~
oarabbus_
It's a bit odd to use the word "seduce" there.

~~~
the_af
Fair enough. It doesn't seem odd to me (one is seduced by dangerous but
superficially appealing ideas, and this has little to do with one's sex), and
more importantly, look at the whole sentence:

> _" It was the time of Edmund C. Berkeley’s “Giant Brains or Machines that
> Think”, it was the time of rampant anthropomorphism that would lead to the
> false hope of solving the programming problem by the verbosity of COBOL and
> would seduce Grace M. Hopper to write a paper titled “The Education of a
> Computer”. Regrettably and amazingly, the habit lingers on: it is quite easy
> to infuriate computing scientists by pointing out that anthropomorphizing
> inanimate objects is in science a symptom of professional immaturity._"

As another comment here points out, this use of "seduce" is akin to "being
seduced by the Dark Side of the Force". Anakin Skywalker was a man!

Also, this was Dijkstra's (slightly vitriolic) way of speaking. If you've read
other [0] stuff [1] by him [2], you'll instantly recognize his style. And he
isn't singling Grace out either; in any case, he seems to refer to her as
another computer scientist, sharing the "professional immaturity" with Edmund
Berkeley, a man.

[0] "Go To Statement Considered Harmful"
[http://www.u.arizona.edu/~rubinson/copyright_violations/Go_T...](http://www.u.arizona.edu/~rubinson/copyright_violations/Go_To_Considered_Harmful.html)

[1] "On the cruelty of really teaching computing science"
[https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD103...](https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html)

[2] "The use of COBOL cripples the mind; its teaching should, therefore, be
regarded as a criminal offense."

"FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly
inadequate for whatever computer application you have in mind today: it is now
too clumsy, too risky, and too expensive to use."

"It is practically impossible to teach good programming to students that have
had a prior exposure to BASIC: as potential programmers they are mentally
mutilated beyond hope of regeneration."

[https://en.wikiquote.org/wiki/Edsger_W._Dijkstra](https://en.wikiquote.org/wiki/Edsger_W._Dijkstra)

~~~
oarabbus_
Good points - I don't think this is a case of egregious sexism or anything.

But I think greed can seduce a person, I think power can seduce a person, I
think fame can seduce a person.

I've never heard "false hope" or "rampant anthropomorphism" described as
seductive, though.

