

Notation And Thinking - Randy00
http://rjlipton.wordpress.com/2010/11/30/notation-and-thinking

======
andrewl
Numerals themselves are also notation, and I've always liked this from Alfred
North Whitehead:

"By relieving the brain of all unnecessary work, a good notation sets it free
to concentrate on more advanced problems, and, in effect, increases the mental
power of the race. Before the introduction of the Arabic notation,
multiplication was difficult, and the division even of integers called into
play the highest mathematical faculties. Probably nothing in the modern world
would have more astonished a Greek mathematician than to learn that ... a
large proportion of the population of Western Europe could perform the
operation of division for the largest numbers. This fact would have seemed to
him a sheer impossibility ... Our modern power of easy reckoning with decimal
fractions is the almost miraculous result of the gradual discovery of a
perfect notation. [...] By the aid of symbolism, we can make transitions in
reasoning almost mechanically, by the eye, which otherwise would call into
play the higher faculties of the brain. [...] It is a profoundly erroneous
truism, repeated by all copy-books and by eminent people when they are making
speeches, that we should cultivate the habit of thinking of what we are doing.
The precise opposite is the case. Civilisation advances by extending the
number of important operations which we can perform without thinking about
them. Operations of thought are like cavalry charges in a battle -- they are
strictly limited in number, they require fresh horses, and must only be made
at decisive moments."

-- from An Introduction to Mathematics, 1911

~~~
scott_s
Amazing - that's the same argument people make regarding higher-level
abstractions in programming languages. For the first time, I've seen the
connection between how we have evolved ways of expressing computation for
computers and how we have evolved ways of expressing it for our own minds.

Of course, I have known for a long time that math is a language unto itself,
and that, say, the Arabic numeral system is superior in most ways to the Roman
numeral system. But I never before saw that relationship on the same spectrum
as programming languages.

------
kragen
It's disappointing that this whole article ends without ever mentioning Ken
Iverson's famous Turing Award Lecture, which is entitled "Notation as a tool
of thought": <http://www.jsoftware.com/papers/tot.htm>

Perhaps the most interesting modern controversy about notation for programs is
the one about point-free style in, for example, Haskell; advocates claim that
it makes refactoring ("calculating") much easier, while opponents claim that
it makes programs hard to understand. They may both be right. A fun paper on
this from 1994 is "An Introduction to the Bird-Meertens Formalism",
<http://www.comlab.ox.ac.uk/people/jeremy.gibbons/publications/nzfpdc-squiggol.ps.gz>
-- it shows a step-by-step transformation from a one-line
program that takes cubic time, but is obviously correct, to a one-line program
that takes linear time. It occurs to me that 1994 is rather a long time ago
now, though. What's the current status of this line of work?
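
For concreteness, here is roughly what that style of calculation looks like in
Haskell, using the classic maximum segment sum problem as a stand-in (my
illustration; the paper's running example may differ):

      -- Sketch only: the maximum-segment-sum example, assumed here as an
      -- illustration of point-free style and program calculation.
      import Data.List (inits, tails)

      -- Obviously correct, point-free specification: every contiguous segment
      -- is a prefix of some suffix; sum them all, take the largest. Cubic time.
      mssSpec :: [Int] -> Int
      mssSpec = maximum . map sum . concatMap inits . tails

      -- The kind of linear-time program such a calculation derives: one fold
      -- tracking the best segment starting at the current element, and the
      -- best segment seen anywhere.
      mssFast :: [Int] -> Int
      mssFast = snd . foldr step (0, 0)
        where
          step x (here, best) =
            let here' = max 0 (x + here)
            in  (here', max best here')

Both compute the same thing; the appeal of the formalism is that the second
can be derived from the first by equational rewriting.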

------
dangrover
For some reason, math notation has always felt less formal and rigorous to me
than any programming language. It seems very "analog" and vague, whereas you
can always run a program to get a result.

~~~
ced
_For some reason, math notation has always felt less formal and rigorous to me
than any programming language._

Think of how liberating it would be if you could just write pseudo-code, and
the computer would figure out what you meant, instead of complaining about a
missing semi-colon. That's how mathematics is typically written: for other
mathematicians to read. But the underlying meaning conveyed is usually very,
very precise, more so than in a typical program (which may have bugs in spite
of being "formal and rigorous").

 _you can always run a program to get a result_

That's the distinction between _procedural_ and _declarative_ information.
Most lower animals, and most programming languages, are firmly in the
procedural camp. Perhaps the biggest breakthrough in Homo sapiens was the
ability to do declarative reasoning, which is much more flexible and powerful.

There are a few baby steps in declarative programming languages, like Prolog
and Mathematica, but the revolution has yet to happen IMO.
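
A toy way to see the contrast in code (my own example, in Haskell): the
declarative version states what a square root is, as a property; the
procedural version spells out how to compute one.

      -- Toy contrast. isSqrtOf is declarative: it states the property a square
      -- root should satisfy (to a tolerance), but gives no recipe for finding one.
      isSqrtOf :: Double -> Double -> Bool
      y `isSqrtOf` x = abs (y * y - x) <= 1e-9 * (abs x + 1)

      -- newtonSqrt is procedural: a concrete sequence of steps (Newton's rule,
      -- for non-negative x) that produces a value satisfying the property.
      newtonSqrt :: Double -> Double
      newtonSqrt x = go (max 1 x)
        where
          go guess
            | guess `isSqrtOf` x = guess
            | otherwise          = go ((guess + x / guess) / 2)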

~~~
dangrover
Well, if by pseudo-code you mean super-terse language fitted for the problem
at hand, then yes, that's much better than having to remember semicolons and
stuff.

But not if you mean not having to fully specify the details of new ideas
you're expressing in your chosen language/notation (e.g. the mythical
"powerpoint compiler" that turns ideas into code).

If your program doesn't compile, or has a fatal flaw in its logic -- whether
it's written declaratively or procedurally -- you have to fix it. You don't
get to just say "Oh, this is an exercise for the reader, fuck you."

I guess my problem is really with how it's taught more than with the actual
subject matter.

Math, at its best, is a formal system just like anything else. You build
understanding through axioms and proving things and you can always test an
idea by trying it out.

At least, so I hear.

I've never actually had it taught that way, and have never been able to make
myself think of it that way.

In my studies, mathematicians usually prove things more by fiat and
intimidation and hand-waving, not by actually working through a system and its
rules. It's hard to actually trace reasoning down the entire "stack" like you
would when working in code.

The only GOOD math-y books I've read are Sipser's Theory of Computation and
GEB. The tone of these is "Here's how shit is, and here's why Y makes sense if
you accept X", not "Here's the problems you'll be doing on the test, and you
have to practice them so you can solve them with pen and paper in less than 2
minutes each".

When teaching introductory programming to newbies, part of the goal is to get
people to think of it as a consistent formal system within which you can do
anything you want to reach your goals. If something doesn't compile or you find
a bug, you work through and test your thinking at every step to see what the
problem is.

But math is usually taught as a set of heuristics for solving pre-defined
types of problems. You have to just take it on faith half the time. I wish I
could have some kind of epiphany and suddenly understand math, but it's never
going to happen.

~~~
tel
I'm of the opinion that math is something between the two, really. At the end
of the day, understanding math lies both in being able to construct and follow
rigorous formal declarative proofs and being able to visualize and abstract
what those rules have unearthed. Generally, conversation switches rapidly
between those two modes, using each one where it is strongest to move the
discourse along.

I won't argue that math is often taught very poorly, but I also want to bring
up that most people _only want_ unconnected heuristics. Number theory and
algebra are far simpler and more beautiful than add/subtract/multiply/divide,
but they're also not very useful in a grocery store. Most people are script
kiddie mathematicians at best.

The final thing I want to mention is that much of math is discussed at a level
that is very far from "formal" but still rigorous and constructive. At this
level there is exploration, whereas the formalities exist mainly to fill in
the gaps and pin down the edge cases. Rigor seems to me to have less value
in formal language manipulation and more in complete understanding of the
assumptions and conditions required to uphold your theories.

So yeah, math can be formalized, but a lot of it exists as a notational
convenience sitting upon _oceans_ of convention that let us rapidly invoke and
discuss abstract notions. This is why nearly every paragraph in math begins with
affixing some temporary, useful notation to the abstract concepts you've
introduced.

------
jessriedel
> The power of the [Dirac bra-ket] notation is that [it] can be a symbol,
> expression, or even words that describe the state values.

No, if this were true, then it wouldn't make any difference what the
combination of braces, brackets, parentheses, and pipes was. All that would
matter is that you could label a vector by an arbitrary word. That's hardly
the importance of the notation.

Rather, the asymmetry of the notation for a vector

|x>

and its mirror image, the dual vector

<y|

emphasizes the canonical duality isomorphism on inner product spaces. The
"hanging" pipe on a vector reminds one that a dual vector must be attached
(possibly sandwiching operators) in order to form a scalar.
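
Spelled out (my own rendering in LaTeX, to make the bookkeeping explicit):

      % The bookkeeping the asymmetric notation enforces:
      |x\rangle                   % a ket: a vector, with its "hanging" pipe
      \langle y|                  % a bra: the corresponding dual vector
      \langle y | x \rangle       % bra meets ket: the inner product, a scalar
      \langle y | A | x \rangle   % an operator sandwiched in between: still a scalar
      A |x\rangle                 % no bra attached yet: still a vector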

As the author later notes, there are other identities which become trivial
when using the notation.

Incidentally, it's interesting that Einstein notation (which the author only
briefly mentions) is basically a more powerful, though less suggestive,
version of bra-ket notation. You can represent Dirac vectors in
Einstein notation as variables with a superscript letter and Dirac dual
vectors as variables with a subscript letter. Operators then always have one
superscript and one subscript. The power of Einstein notation is that it
handles higher-rank tensors than Dirac notation can.
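
Roughly, the dictionary between the two notations (my summary of the
correspondence, written with LaTeX indices):

      % Dirac                      Einstein (index) notation
      %   ket  |x>          <->    x^a                 (one upper index: a vector)
      %   bra  <y|          <->    y_a                 (one lower index: a dual vector)
      %   operator A        <->    A^a{}_b             (one upper, one lower index)
      %   <y|A|x>           <->    y_a A^a{}_b x^b     (all indices contracted: a scalar)
      % A higher-rank tensor such as R^a{}_{bcd} has no tidy bra-ket counterpart.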

The GR textbook by Wald has a good discussion of this, and Penrose writes
poetically about it (and his preferred graphical notation) in "The Road to
Reality".

------
gfodor
The interesting thing to note is that most of this discussion around notation
is being applied to mathematics. Attempts to provide higher-level notation
(beyond ASCII) for programming were apparently given up on a long time ago.
It's a shame.

~~~
maxwell
The problem with non-ASCII is that it's harder to read, because the
pronunciation (even if only mental) is ambiguous. Imagine if you don't know
how to pronounce Greek letters, for instance. I couldn't begin to pronounce
APL, and thus would have to remember shapes.

I think this is the reason everyone has an issue with Lisp's prefix notation
for arithmetic. They read

      (+ 1 2 3)

as "plus one two three" and it sounds weird. I wonder if a Lisp would sell if
you only allowed alphanumeric function names, e.g.

      (sum 1 2 3)

(Not to mention that "Lisp" is the worst name ever.)

It's just a hunch, but I suspect the root cause of European imperialistic
domination was that they had a more accessible writing notation. It could even
be that the difficulty of Chinese writing actually produced "more Edisons",
but likewise prevented diffusion. And if James Burke has taught me anything,
it's that communication is way more important than inspiration _or_
perspiration.

It's exactly like Lisp. Just as it took Europe centuries to catch up with
Chinese technology, it seems likely that it'll take Algol-style langs a century
to
catch up with Lisp concepts.

~~~
jessriedel
But your argument could be applied to mathematics, where powerful, non-ASCII
symbols are very useful. What's different about programming?

~~~
maxwell
Nothing. I think non-ASCII symbols are a net loss in mathematics too. While
they're powerful (taking succinctness to be equal to power), they come at the
expense of accessibility. With great power comes great power consumption;
there's a reason most people don't have jet engines under their hood or read
mathematics: doing so is very expensive. I believe the cognitive cost
outweighs the expressive benefit.

------
splat
Tangentially related but interesting nonetheless:

<http://en.wikipedia.org/wiki/Abuse_of_notation>

------
amichail
It's amazing how many people don't know proper asymptotic notation and use big
O for everything.

It doesn't help that research papers use the wrong asymptotic notation.

~~~
scott_s
I suspect the problem is not that most people do not know the proper
_notation_ , but that they are unfamiliar with the _concepts_ behind big omega
and big theta.
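
For reference, the standard definitions (as stated in most algorithms texts):

      f(n) = O(g(n))      \iff \exists\, c, n_0 > 0 :\; f(n) \le c \, g(n) \text{ for all } n \ge n_0
      f(n) = \Omega(g(n)) \iff \exists\, c, n_0 > 0 :\; f(n) \ge c \, g(n) \text{ for all } n \ge n_0
      f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))

Big O alone is only an upper bound, so "merge sort is O(n log n)" is true but
weaker than the tight claim that its running time is \Theta(n \log n).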

------
thehotdon
Notation is a solved problem. Math got it wrong, Lisp got it right.

