

Why I Love Computer Science - Jebdm
http://www.cs.caltech.edu/~mvanier/hacking/rants/computer_science.html

======
cchooper
I think this guy has mathematics totally wrong. Maths just doesn't work today
the way it did in Newton's time, and even back then people weren't satisfied
with Newton's proofs, but they lacked an alternative so they had to use them
anyway. If the author had his way, we would have refused to accept Newtonian
physics for two centuries! Could you imagine the damage that would have done?

 _> [A]ny realistic mathematical proof will leave out a great many steps,
which are considered to be the "required background knowledge"_

Computer science papers are different how? Computer science != programs!

 _> [T]he inference rules are not always specified accurately, or are not
believable if they are. This is why you will sometimes read a mathematical
proof and say "I understand the reasoning, but I just don't buy it"_

I think this guy just isn't too hot at mathematics. Omitting a trivial step
(or domain specific knowledge) is _not_ a lack of rigour, but a courtesy to
the reader. The details can always be filled in cleanly. If you ever see a
modern mathematical proof that is accepted by all mathematicians, but you
don't "buy it", then I can assure you that it's _you_ that's at fault, not the
proof.

Oh, and computer science papers never leave out trivial steps or assume domain
knowledge? Not the papers I've read. Once again, CS != programming!

 _> This is reminiscent of Whitehead and Russell's Principia Mathematica where
hundreds of pages pass until the authors can prove that 1 + 1 = 2._

Surely _Principia_ is a _reductio ad absurdum_ of the argument that everyone
should always spell out all the steps!

~~~
tokenadult
_I think this guy just isn't too hot at mathematics. Omitting a trivial step
(or domain specific knowledge) is not a lack of rigour, but a courtesy to the
reader. The details can always be filled in cleanly._

A very astute comment. In principle, mathematical proofs are supposed to be
every bit as completely described as computer programs, but perhaps not as
explicitly expressed. Edsger Dijkstra used to refer to mathematical proofs as
a model for computer programming.

<http://en.wikipedia.org/wiki/Program_derivation>
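The program-derivation idea behind that link can be shown in miniature: start from a specification, pick a loop invariant, and let the invariant dictate the loop body. A minimal sketch in Python (my own illustration, not Dijkstra's notation), with the invariant carried along as a runtime assertion:

```python
# Spec: derived_sum(xs) == sum of all elements of xs.
# The loop invariant (total == sum(xs[:i])) is what the loop body is
# "derived" from; here it is also checked at runtime.

def derived_sum(xs):
    i, total = 0, 0
    while i < len(xs):
        assert total == sum(xs[:i])  # invariant holds at the loop head
        total += xs[i]
        i += 1
    # invariant + exit condition (i == len(xs)) together give the spec
    assert total == sum(xs)
    return total
```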

~~~
derefr
I'm not a mathematician, but I can imagine some sort of "library of
derivation" whereby, in order to prove something based on, say, arithmetic,
you _include_ (the language directive, not the verb) the specification of
arithmetic derived in Principia Mathematica. Then the proving system follows
your logic down to its logic, and then to the logic they relied on, and so on,
until everything is completely explicit. Basically, formal execution of
bibliographies.
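Something close to this already exists in proof assistants such as Coq, Isabelle, and Lean, where importing a library pulls in machine-checked derivations all the way down to the axioms. A minimal sketch in Lean 4 (no external libraries needed): the famous Principia result checks in one line, because the kernel unfolds the definition of addition on the natural numbers.

```lean
-- Lean 4: the kernel checks this down to the definition of Nat.add;
-- `rfl` means both sides reduce to the same normal form.
theorem one_plus_one : 1 + 1 = 2 := rfl
```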

------
randomwalker
Quick note: many of you are saying that this guy is totally out of touch with
how math works, and that formal derivation of proofs is just fantasy. It's
true that that's an extreme view, but it is in fact one that is held by some
top mathematicians, even if a small minority. There are people who have built
up research programs out of trying to make it a reality. See, for instance,
Doron Zeilberger, who apparently describes himself as an "ultrafinitist":

<http://en.wikipedia.org/wiki/Doron_Zeilberger>

<http://www.math.rutgers.edu/~zeilberg/OPINIONS.html>

I'm not saying I agree with that, just that it's not lunacy. Carry on.

~~~
jsyedidia
I agree. The original post is echoing many of the views expressed at more
length by Sussman and Wisdom in their book "Structure and Interpretation of
Classical Mechanics." There they showed that a computational expression of
classical mechanics was much more explicit and rigorous than the standard
mathematical or physical treatments. They also have a similar long paper on
differential geometry from a computational perspective:
[http://groups.csail.mit.edu/mac/users/wisdom/AIM-2005-003.pd...](http://groups.csail.mit.edu/mac/users/wisdom/AIM-2005-003.pdf)
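To make that concrete, here is a minimal Python sketch (my own illustration, not their Scheme code) of the variational principle made explicit: the action becomes a function you can literally evaluate, and for a free particle the straight-line path between fixed endpoints yields a smaller action than a wiggled path with the same endpoints.

```python
import math

def action(L, path, t0, t1, n=1000):
    """Approximate S = integral of L(t, q, qdot) dt along `path`."""
    dt = (t1 - t0) / n
    s = 0.0
    for i in range(n):
        t = t0 + i * dt
        q = path(t)
        qdot = (path(t + dt) - path(t)) / dt  # forward difference
        s += L(t, q, qdot) * dt
    return s

m = 1.0
L_free = lambda t, q, qdot: 0.5 * m * qdot**2      # free-particle Lagrangian

straight = lambda t: t                              # q(0)=0, q(1)=1
wiggly = lambda t: t + 0.1 * math.sin(math.pi * t)  # same endpoints

S_straight = action(L_free, straight, 0.0, 1.0)
S_wiggly = action(L_free, wiggly, 0.0, 1.0)
assert S_straight < S_wiggly  # the physical path minimizes the action
```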

~~~
cchooper
Their point in SICM was that traditional notation is not as explicit as their
computational notation. That's because traditional notation is designed
to be convenient rather than explicit. They are right, but this does _not_
mean that physics lacks rigour. More explicit notations were always available,
and physicists, being more intelligent than computers, were capable of
resolving the ambiguities to reveal the fully rigorous structure underneath.

There's a difference between using a convenient notation and lacking rigour.
If you are always capable of discerning the true formal equations behind a
convenient notation then everything is OK. If you can't, or people can't agree
on what the formal meaning should be, then there is indeed a problem. The fact
that the authors could derive a computational notation that no physicist would
disagree with is _proof_ that a lack of rigour never existed in the first
place.

~~~
jsyedidia
You write that "More explicit notations were always available and physicists,
being more intelligent than computers, were capable of resolving the
ambiguities to reveal the fully rigorous structure underneath."

I doubt that's true. For example, here's what Piet Hut, now a professor of
physics at the Institute for Advanced Study in Princeton, writes about his
experiences with classical mechanics as an undergraduate in his review of SICM
available at <http://www.ids.ias.edu/~piet/publ/other/sicm.html> :

"Soon I went through the library in search of books on the variational
principle in classical mechanics. I found several heavy tomes, borrowed them
all, and started on the one that looked most attractive. Alas, it didn't take
long for me to realize that there was quite a bit of hand-waving involved.
There was no clear definition of the procedure used for computing path
integrals, let alone for the operations of differentiating them in various
ways, by using partial derivatives and/or using an ordinary derivative along a
particular path. And when and why the end points of the various paths had to
be considered fixed or open to variation also was unclear, contributing to the
overall confusion.

Working through the canned exercises was not very difficult, and from an
instrumental point of view, my book was quite clear, as long as the reader
would stick to simple examples. But the ambiguity of the presentation
frustrated me, and I started scanning through other, even more detailed books.
Alas, nowhere did I find the clarity that I desired, and after a few months I
simply gave up. Like generations of students before me, I reluctantly accepted
the dictum that `you should not try to understand quantum mechanics, since
that will lead you astray for doing physics', and going even further, I also
gave up trying to really understand classical mechanics! Psychological defense
mechanisms turned my bitter sense of disappointment into a dull sense of
disenchantment."

~~~
cchooper
I actually agree with Sussman and Wisdom (and the author of this article) that
physics is taught in a very sloppy manner, and that the computational notation
has huge advantages. But that's not the point I (or they) were making.
Lagrangian and Hamiltonian mechanics were perfectly rigorous before Sussman
and Wisdom came along, even if they weren't taught very well. If mechanics had
really been deep-down sloppy, then Sussman and Wisdom would now be heralded
alongside Newton and Einstein for their incredible contribution to science!
Rather than just creating a new notation, they would have advanced science
incalculably, turning vague and untestable theories into hard empirical
science for the first time!

But that's obviously not what happened. Physicists were always capable of
making exact calculations and predictions from Lagrangian mechanics. SICM
contains no new theories, theorems or proofs, just a more explicit way of
representing old ones. Nothing new was discovered and no old notions were
clarified. Instead, they just found a better way of _teaching_ mechanics, one
that didn't rely on the implicit knowledge that masters of the subject already
possessed. They were only capable of doing this because Lagrangian and
Hamiltonian mechanics were well-defined in the first place. If they hadn't
been then they would have had to advance a new theory of mechanics to replace
them, rather than just re-presenting an old one.

------
Herring
This disdain for fields without rigor reminds me of the general attitude
towards people using Excel macros or VB. I understand it's fun, and I used to
feel the same way. But at some point you just have to stop playing with your
tools and get some real work done.

------
glymor
A compiled computer program is certainly a rigorous description of something.
God knows what; certainly the programmer doesn't.

This is just funny:

 _"I think computer science has a tremendous amount to offer the fields of
logic and mathematics. Specifically, I think that requiring all formulas to be
executable by a finite, deterministic system (a computer program) could lead
to a great increase in the level of rigor of these fields, would make it
easier for students to learn existing results, would make it easier for
practitioners to develop new results, and might possibly suggest whole new
approaches..._

Also, he keeps referring to computers as finite. I can only assume he means in
the physical sense that there is a non-infinite number of atoms making up his
CPU.

I suppose I should critique his argument seriously, but it's just too far away
from any actual reality.

~~~
nslater
What with his comment about being able to eventually understand sociology
using physics, I think this essay is easily 3 Cuil.

~~~
hc
What does that mean?

------
pixcavator
The dichotomy he puts forward - algorithms (CS) vs. proofs (math) - is totally
made up.

~~~
Dilpil
The dichotomy of CS vs Math is tenuous as well.

------
jmorin007
Links down, here's the cached version:
[http://209.85.173.132/search?q=cache:YchvBt7eH24J:www.cs.cal...](http://209.85.173.132/search?q=cache:YchvBt7eH24J:www.cs.caltech.edu/~mvanier/hacking/rants/computer_science.html+~mvanier/hacking/rants/computer_science.html&hl=en&ct=clnk&cd=1&gl=us&client=safari)

------
nslater
So much fail.

------
dhughes
"I really loathe vagueness in science (or, for that matter, in philosophy and
argumentation in general)."

Is he a computer: true or false, no other answer? I think he's fooling
himself. What happens if quantum computing becomes common and a result can be
a 1, a 0, or a 'maybe'?

