Notation as a Tool of Thought (1979) [pdf] (toronto.edu)
150 points by tosh on April 16, 2018 | 13 comments



This is one of my favorite papers and a bigger influence on my coding than anything else I've read. It's hard to really take advantage of these ideas without a language designed for them, but I have made an attempt to document a style guide for Python that tries to capture at least some of the feel: https://github.com/fastai/fastai/blob/master/docs/style.md

(NB: this approach to coding in Python won't suit most people, for many reasons - I wrote it because I had a lot of requests to document my approach, not because I want anyone else to do the same thing. But if you're curious about ways to lay out code that are rather different to PEP8, do check it out!)
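To give a rough sense of what "rather different to PEP8" can look like, here is a toy contrast of my own, not an excerpt from the guide, just to convey the denser, more notation-like layout:

    # PEP8-ish layout: one step per line, long descriptive names.
    def normalize_tensor(input_tensor, channel_mean, channel_std):
        centered = input_tensor - channel_mean
        return centered / channel_std

    # Denser layout in the spirit described above: short names for short scopes,
    # the whole idea visible as a single line, related definitions kept adjacent.
    def normalize(x, m, s): return (x - m) / s
    def denormalize(x, m, s): return x * s + m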


Nicholas Cooke in "Handbook of Musical Analysis" says something that has always influenced me very deeply:

> All notation is analysis

Obviously he's talking in the specific context of musical notation, but it seems true in other fields too. Choosing how to notate seems a very important analytical decision and certain forms of notation help or hinder analysis.

Feynman diagrams for example famously help to understand the maths underlying particle interactions.

Roman numerals (for a different example) make all kinds of arithmetic much harder than in Arabic notation.


> Roman numerals (for a different example) make all kinds of arithmetic much harder than in Arabic notation.

Definitely not for simple addition or subtraction, the kind you're likely to do every day haggling over prices or counting things: Roman numerals work visually!

What is I + II? You just write them together -> III

What is VVVV - V? Take just one V away -> VVV.

Knowing that IIIII = V or X = VV = IIIIIIIIII only lets you express things as a shorthand. Also, some numerals allow for subtractive notation, so X - I = IX (take I from X).
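A minimal Python sketch of that additive, tally-like reading (my own toy code, covering only the concatenative interpretation, not subtractive forms like IX):

    # Toy additive reading of numerals: III -> 3, VVV -> 15, VVVV -> 20.
    VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

    def value(numeral):
        return sum(VALUES[ch] for ch in numeral)

    def add(a, b):
        # "You just write them together": concatenate, then order symbols largest-first.
        return "".join(sorted(a + b, key=lambda ch: -VALUES[ch]))

    print(add("I", "II"))               # III
    print(value("VVVV") - value("V"))   # 15, i.e. VVV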


> Roman numerals make all kinds of arithmetic much harder than in Arabic notation

Sure. But of course they made arithmetic easier than it was before ('tallying' I think, i.e. IIIIIII....)

It's interesting how notation and language, which arise to foster communication between different people, become the very things that make it possible for a lone individual to think. This is why I suspect that a single artificial person (an AI) couldn't be created in isolation.


"The preceding sections have attempted to develop the thesis that the properties of executability and universality associated with programming languages can be combined, in a single language, with the well-known properties of mathematical notation which make it such an effective tool of thought" (378).

I've been working through "Seven Sketches in Compositionality" [1], posted here a few weeks ago, and thinking about how much math relies on diagrams which are somewhere between drawing and writing. In research meetings, I often see people reasoning with diagrams and mathematical notation. In the vein of this article, I wonder whether it would also be possible to formalize mathematical pictographs to the point where they could be computable: a not-strictly-textual programming language. IPython notebooks often toggle between symbolic expressions and their implementation in code.
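As a small illustration of that toggle (my own example, assuming SymPy in a notebook-style workflow): write the expression symbolically, then turn it into ordinary executable code.

    import sympy as sp

    x = sp.symbols("x")
    expr = sp.diff(sp.sin(x) * sp.exp(x), x)   # symbolic: exp(x)*sin(x) + exp(x)*cos(x)
    f = sp.lambdify(x, expr, "math")           # compile the symbolic form into a Python function
    print(expr)
    print(f(0.0))                              # 1.0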

One thing that might be missing is context. Diagrams are indexical (pointing to contextual meaning) even more than text, often illustrating a problem that has previously been defined. This feels to me like a potentially fruitful design problem.

[1] https://news.ycombinator.com/item?id=16677395


Diagrams are a mainstay in architecture because they index multiple interpretations, which lends itself to creative work. Computing with them would be a strange kind of computing: it would need a stage that generates multiple semantics for the same symbol, and it would have to execute at least a couple of them.


Looking further back to the days before cheap writing, I recently enjoyed reading this paper by Netz about the Ancient Greek use of counters as a tool of thought:

http://articles.adsabs.harvard.edu/cgi-bin/nph-iarticle_quer...

https://scholar.google.com/scholar?cluster=86099607799623165...

It explains how Greeks began using various sorts of counters or tokens for currency, policy making, court judgments, food distribution, recreation, etc., and demonstrates that Greek numeracy was quite varied and sophisticated while remaining concrete/tangible.


Hmm, in that sense an abacus was/is certainly a tool of thought, and a notation system at that.


I’ve been meaning to read this for some time. Thanks for reminding me. What else is there like this, in book, paper, or video form?


Me too; I have had the tab open for months, or years, at this point.

http://prog21.dadgum.com/114.html - "Papers from the lost culture of Array Languages" - has links to this and a couple more interesting sounding things.


I've read and re-read "Notation as a Tool of Thought" ever since I came across it; it's one of my favorite papers.

Thank you for the link to this other paper; I can branch out from it to discover related works. I just found this one, which I'll chew on: "Language as an intellectual tool: From hieroglyphics to APL" [0].

[0] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.86....


You might like his work "Elementary Algebra:"

http://www.softwarepreservation.org/projects/apl/Papers/Elem...


This is a pretty cool GitHub repository on notation:

https://github.com/hypotext/notation



