
Legible Mathematics: Sketches of interactive arithmetic for programming (2014) - throwaway3157
http://glench.com/LegibleMathematics/
======
taliesinb
I always enjoy seeing new attempts at visualizing and explaining function
composition. Here’s something I played around with a few years ago along
those lines: [https://taliesin.ai/posts/function-trees/](https://taliesin.ai/posts/function-trees/)

------
diffeomorphism
So the only take-away I get is that the author does not like to use brackets
and really likes writing fractions.

Unfortunately, this works very badly for long fractions, nested fractions,
or anything more complicated than toy examples.

This pops up from time to time with LaTeX: why shouldn't a/b just give you a
fraction, instead of having to write \frac{a}{b}? The problem is that this
makes one easy case marginally shorter, but makes something like
\frac{a+b}{c+\frac{d}{3}} much harder.

~~~
andrepd
{a+b}/{c+d/3}

Still looks much better

~~~
diffeomorphism
Hm, reading that, it is not immediately obvious whether you mean (c+d)/3 or
c + (d/3). At that point I think it should be straightforward to implement
{}/{} as syntactic sugar for \frac{}{}.
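A minimal sketch of that sugar as an ordinary LaTeX macro (the name \f is
made up here; making a bare / grouping-aware would require active characters,
which is why it isn't usually done):

```latex
% Hypothetical shorthand: \f{num}{den} expands to \frac{num}{den}.
\newcommand{\f}[2]{\frac{#1}{#2}}
% Usage: $\f{a+b}{c+\f{d}{3}}$ versus $\frac{a+b}{c+\frac{d}{3}}$
```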

Mathematical notation is basically determined entirely by consensus, and
people come up with new notation all the time, whenever they need it. In
particular, to convince someone that your notation is better, you have to
show how hard stuff becomes easy, not how easy stuff becomes even easier.

~~~
abdullahkhalids
TeX already has the {} \over {} primitive, which does what you are
describing. It is not typically used.

[https://tex.stackexchange.com/questions/73822/what-is-the-
di...](https://tex.stackexchange.com/questions/73822/what-is-the-difference-
between-over-and-frac)

------
xixixao
Mathematical notation has developed over centuries for __thinking about and
solving problems on paper__. Although it is taught in schools and hence
familiar, I think it’s not at all a good tool for the significantly
different digital medium.

Another reason is that the notation is mostly a tool for __oneself__ to think
in, but is fairly bad at communicating (think about the level of expertise
required to read a math paper vs reading someone’s code).

Here’s a concrete difference to support these claims. Using math notation we
often write large expressions which include many meaningful subexpressions and
complex operations. In code we prefer to use lots of names (variables) and
simple expressions.
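For instance (illustrative formula, not from the article):

```python
a, b = 3.0, 2.0

# "Math style": one large expression, with meaningful subexpressions inlined.
math_style = (a + b) * (a - b) / (2 * a * b + 1)

# "Code style": lots of names, simple expressions.
total = a + b
difference = a - b
denominator = 2 * a * b + 1
code_style = total * difference / denominator

assert math_style == code_style
```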

~~~
Grustaf
Reading math is hard because math is hard. The notation is incredibly
powerful, intuitive and flexible. In all my years hanging around
mathematicians I’ve never heard anyone complain about notation. After all, you
are always free to use your own notation if you want, and if others like it
they will follow.

~~~
munificent
_> In all my years hanging around mathematicians I’ve never heard anyone
complain about notation._

There's likely to be some survivorship bias at play here.

~~~
Grustaf
No, since anyone is free to invent and use whatever notation they prefer.
That’s how mathematical notation has evolved so far. People come up with new
notation all the time.

~~~
ivanbakel
"Free" here is a bit abstract. In what way is an undergraduate mathematician
free to use their own notation, as opposed to the notation of their lecturer,
textbooks, and examination papers? You can think in your own terms, but
mathematics has to be taught and communicated with some kind of consensus -
and unless you're lucky enough to be the first in the field, that consensus
has preceded you.

~~~
Grustaf
Mathematics is completely meritocratic. If you invent a notation that is
better than some existing notation, people WILL use it, without asking for
your credentials.

~~~
ivanbakel
I think that's a naive statement, and it's also not really relevant to the
original point being made, which is that existing mathematical notation can be
difficult to grok.

Inventing new notation does not help with existing material - and that
material is what makes mathematicians. People who can put up with old
notation, especially if it is bad, are much more likely to become
mathematicians: that gives a survivorship bias.

Besides which, you are welcome to invent your own notation, but good luck
trying to get anyone to adopt it, particularly if the mathematical field
involved is studied enough. Inertia is very strong, even in very abstract
disciplines - aren't physicists still treating current as flowing the wrong
direction?

------
lwb
Anyone interested in this stuff who hasn't read it yet should check out Bret
Victor's Kill Math series of essays and explorable explanations:
[http://worrydream.com/#!/KillMath](http://worrydream.com/#!/KillMath)

------
andrewla
This dovetails to some degree with thoughts that I've been having about
language and expression design.

Associativity and operator precedence are ideas that made sense as a shorthand
in written mathematics (and have deep roots in the structure of arithmetic)
but don't actually add a lot of value in programming languages, except to put
a huge mental burden on developers to unpack the precedence.

My thought is just to remove precedence and associativity entirely -- instead
of "a + b + c/2" you have to make it explicit -- "a + (b + (c/2))". The only
problem that I see is that there are two ways of writing this, you could also
write "(a + b) + (c/2)". In my experience, though, especially in programming
(though not always in financial programming), there are almost always
semantics associated with those intermediate values which makes one of the two
groupings preferable.

Adding internal separators to numeric constants makes a lot of sense as well
(both before and after the decimal point). Ideally this would be enforced at
some arbitrary cultural level (e.g. using the US-standard groupings of three
with commas), and would be required -- "2000" is invalid and "20,00" is
invalid (so Python 3's approach of allowing arbitrary _'s in numbers is not
really great, because it doesn't protect against typos). Separators to the
right of the decimal point are not really common, but it would be nice to
pick a sensible enforced convention there too, like using "," to the left
and "_" to the right, once again mandatory and required to support groupings
of three.
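The proposed rule is easy to pin down as a regular expression. A sketch (the
"," / "_" split is the suggestion above, not any existing standard, and the
function name is made up):

```python
import re

# Mandatory "," every three digits left of the decimal point,
# mandatory "_" every three digits to the right of it.
NUMBER = re.compile(r"^\d{1,3}(?:,\d{3})*(?:\.(?:\d{3}_)*\d{1,3})?$")

def is_valid_literal(s: str) -> bool:
    """Accept "2,000" or "0.123_456"; reject "2000" and "20,00"."""
    return NUMBER.fullmatch(s) is not None

assert is_valid_literal("2,000")
assert not is_valid_literal("2000")
assert not is_valid_literal("20,00")
```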

It would probably also be desirable to have a convention for hexadecimal
numbers and other supported bases -- binary literals in particular are
missing from many languages but are very natural for programming, if we had
a way of using delimiters -- 0b0100_0000 is more readable and meaningful
than 0x40, but 0b0010_0100_0011_0010_0000_0010_0010 is not so good (quick,
how many bytes did I just type?).
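Python 3.6+ already accepts both literals above, though it does not enforce
the grouping:

```python
# Grouping by four makes nibbles line up with hex digits.
flag = 0b0100_0000
assert flag == 0x40 == 64

# The long literal from the comment: 7 nibbles, i.e. 28 bits = 3.5 bytes.
word = 0b0010_0100_0011_0010_0000_0010_0010
assert word == 0x2432022
```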

~~~
simongray
> My thought is just to remove precedence and associativity entirely --
> instead of "a + b + c/2" you have to make it explicit -- "a + (b + (c/2))".
> The only problem that I see is that there are two ways of writing this, you
> could also write "(a + b) + (c/2)".

Or how about... (+ a b (/ c 2))

~~~
andrewla
I think S-expressions are annoying in that they tend to infect a language. If
you're not careful, every function invocation becomes an s-expr, and you're
left with the idea that + is no different from any other function, so we could
write things like "plus(a, b, divide(c,2))" if we wanted to make a functional
notation. And S-exprs don't save us from associativity: here you've made +
variadic, which is fine, I guess, but what does (- a b c) mean, or (/ a b
c)? That sort of asymmetry is a burden that isn't worth dealing with, in my
mind.
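For what it's worth, most Lisps answer this by left-folding: (- a b c) is
(a - b) - c, and (/ a b c) is (a / b) / c. A sketch of that convention in
Python (function names made up; the single-argument cases, negation and
reciprocal, are omitted for brevity):

```python
from functools import reduce
import operator

def minus(*args):
    """Lisp-style variadic subtraction: left fold with -."""
    return reduce(operator.sub, args)

def divide(*args):
    """Lisp-style variadic division: left fold with /."""
    return reduce(operator.truediv, args)

assert minus(10, 3, 2) == 5        # (10 - 3) - 2
assert divide(12, 3, 2) == 2.0     # (12 / 3) / 2
```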

One thing that I think imperative languages bring to the table is naming
intermediate values through assignment, so we can do things like

    
    
        base_index = root_index + offset
        midpoint = length / 2
        final_index = base_index + midpoint
    

It's rarely useful to be so verbose, but it's nice to have the ability.

~~~
BalinKing
(Assuming I'm understanding correctly) I'm honestly curious why `+` _should_
be different from any other function. (Also, AFAIK the ability to name
intermediate values is totally separate from imperativeness; see Haskell's
`let` form, for example.)

------
contravariant
While I can see where the author is coming from, I have a couple of points
where I disagree.

1. I've yet to see an automatic formatting system that's not extremely
painful to work with.

2. Mathematical notation hasn't just been developed over hundreds of years;
it is continuing to develop.

3. Units aren't labels.

------
maire
I would like a computer system that can do algebraic reductions. This is the
only thing that forced me to go back to pen and paper.

This looks like it does substitutions but I didn't see reductions.

~~~
jfarmer
This sort of thing is called a Computer Algebra System (CAS) and there are
good free ones.

[https://www.sagemath.org/](https://www.sagemath.org/)

[http://maxima.sourceforge.net/](http://maxima.sourceforge.net/)

[https://www.sympy.org/en/index.html](https://www.sympy.org/en/index.html)
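For example, SymPy does exactly the kind of algebraic reduction asked about
(a minimal sketch, assuming SymPy is installed):

```python
from sympy import symbols, simplify, expand

x = symbols("x")

# Reduction: cancel the common factor (x - 1).
assert simplify((x**2 - 1) / (x - 1)) == x + 1

# Expansion in the other direction.
assert expand((x + 1)**2) == x**2 + 2*x + 1
```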

------
petschge
A big downside of that proposal is that everything takes up much more space,
which makes it much harder to keep track of what else is going on. In toy
problems it is fine, but in real problems having the answer to "did the code
already check for i == j" visible at the same time is really important. Let alone
context such as "is this the reference implementation for single core
validation or the vectorized function" or "is the radius at timestep n or
n+1/2".

------
jedieaston
This looks a heck of a lot like Desmos without the graph.

([https://desmos.com/calculator](https://desmos.com/calculator))

~~~
ivanbakel
My thought as well - it's jarring to get used to at first, but Desmos' way
of typing in equations, where keypresses act more like operations on the
equation than raw text input, gives a natural, easy way to write good
equations.

At the very least, typing / automatically makes a fraction with separate top
and bottom input areas - which means that brackets for numerator and
denominator aren't necessary.

------
evanb
If the labels are supposed to indicate units, they should at least indicate
the RIGHT units! rent, electric, and internet are dollars/month. Otherwise the
12480 in the example is dollars*months, clearly not the right units.

------
dang
Looks like a follow-up to yesterday's thread here:
[https://news.ycombinator.com/item?id=22399516](https://news.ycombinator.com/item?id=22399516)

------
Papirola
The post is from 2014. I wonder what progress was made since.

~~~
atomoton
Python 3 allows you to put underscores in numbers for legibility:

    a = 100_000
    b = 1_000_000

~~~
Someone
That started at least with Ada, in 1979
([https://softwareengineering.stackexchange.com/a/403959](https://softwareengineering.stackexchange.com/a/403959)).

Traditional FORTRAN ignores all spaces, so one can write a million as _1 000
000_ there. C++14 allows apostrophes
([https://en.wikipedia.org/wiki/Decimal_separator#Data_versus_...](https://en.wikipedia.org/wiki/Decimal_separator#Data_versus_mask))

------
guramarx11
The first example feels deliberately complicated; the gravitational constant
can be abbreviated to G = 6.67x10^-11, giving (G * M1 * M2 / Rad^2).
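That abbreviated form is a one-liner in code (illustrative sketch; names and
test values are made up, not from the article):

```python
G = 6.67e-11  # gravitational constant, m^3 kg^-1 s^-2

def force(m1, m2, r):
    """Newtonian attraction between two point masses m1, m2 at distance r."""
    return G * m1 * m2 / r**2

# Unit masses one metre apart attract with a force of G newtons.
assert abs(force(1.0, 1.0, 1.0) - 6.67e-11) < 1e-15
```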

------
j88439h84
Observable.js is like this

------
jxy
I guess it is useful if you are teaching kindergarteners. For high schoolers
we write

    for (i = 0; i < N; ++i) {
        for (j = 0; j < N; ++j) {
            force = G * mass1[i] * mass2[j] / pow(radius(i, j), 2);
            // do something with the force
        }
    }

