Hacker News
How should logarithms be taught? (gowers.wordpress.com)
34 points by raviparikh 8 days ago | 15 comments

I'm imagining a story like Gauss quickly calculating the 100th triangular number. One time in my primary school maths class (I was 10) we were asked to find out by hand how many doublings it took to get above 1,000,000. I imagine if we were asked to do that at a slightly older age, and afterwards taught how to use logs and log tables, we could get the answer in a short amount of time:

log(2) = 0.3... log(1000000) = 6

6 / 0.3 = 20

I think that would straight away solidify the usefulness of logs in our minds.
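The arithmetic above can be checked with a few lines of Python, using base-10 logs as in the comment:

```python
import math

# how many doublings of 1 does it take to exceed 1,000,000?
n = math.log10(1_000_000) / math.log10(2)  # = 6 / 0.301... ≈ 19.93
doublings = math.ceil(n)                   # round up to a whole doubling
assert 2 ** doublings > 1_000_000 > 2 ** (doublings - 1)
```

The rounded table values (0.3 for log 2) give exactly 20, which happens to match the exact answer.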

I'm not sure that the view that "mathematicians reason syntactically and not semantically" is that unpopular. After all, much of modern mathematics is concerned with how structures are made up of interactions between elements, not with the "nature" of the elements themselves.

In that sense, the fact that log(a) + log(b) = log(ab) can be viewed as a homomorphism from a multiplicative group to an additive one (with the exponential function as the inverse map).
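A quick numeric sketch of that homomorphism and its inverse, using Python's math module:

```python
import math

a, b = 3.7, 42.0
# log turns multiplication into addition (a homomorphism)...
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))
# ...and exp is the inverse map back to the multiplicative side
assert math.isclose(math.exp(math.log(a)), a)
```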

But I also think it can be illuminating to see that this is not the only "definition" of logarithms, and that there are equivalent definitions. That's precisely the beauty of mathematics, that you can define a number of things and then show them to be exactly the same. There are a number of different ways to define e.g. the exponential function and each of them highlights a different aspect and is interesting to mathematicians working in different disciplines (e.g. the exponential function is also the unique solution to the IVP y'=y with y(0)=1). I don't know if it's possible to teach something like that to children, but it does seem like we're not even trying right now.
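As a sketch of that "many equivalent definitions" point, two classical routes to e can be compared numerically (the compound-interest limit and the Taylor series of the exponential at 1):

```python
import math

# limit definition: (1 + 1/n)^n -> e as n -> infinity
limit = (1 + 1 / 1_000_000) ** 1_000_000
# series definition: e = sum of 1/k! for k = 0, 1, 2, ...
series = sum(1 / math.factorial(k) for k in range(20))
```

The limit converges slowly (error on the order of e/2n), while the series converges extremely fast; both agree with math.e.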

My introduction to exponentials and logarithms was via growth rates. My dad used baby fish as an example and I grokked it in a fifteen-minute ride. Perhaps not very suitable for the more advanced topics (changes of base, the why of e), but you can teach this to any ten-year-old.

Now, typing this, I have a feeling it goes to show I have a very partial understanding of logarithms: I think of them as transforming between numbers on an exponential curve and exponents, where the exponent is something like periods in a discrete process. So the OP's point about the distinction between syntactic and semantic thinking is very real.

When my son (now 11) was born, I told a friend that I had this funny goal to teach him to mentally calculate natural logarithms by the time he was in 4th grade. Time flies, and before I knew it he was in the last week of 4th grade. So I taught him just that. Mechanically. He is now in 6th grade (first day of middle school yesterday, big day). He has forgotten how to do these logarithms, and he never really understood them, obviously. But he found the trick entertaining, and I hope that when he gets to learn them in high school he'll find them to be old friends. I don't know, we'll see.

In any case, calculating natural logs to 2 decimals of accuracy is a cool nerdy trick. Everyone has a smartphone: ask them to start the calculator app, run it in extended mode (tilt the phone horizontally), and press the rand button. For example, right now I got 0.874. Announce that you'll calculate the natural log with 2 exact decimals or thereabouts. You start with 0.9, which has log(0.9) = log(9/10) = log(9) - log(10) = 2log(3) - log(10). You have memorized a short table of natural logs of the integers from 1 to 10, and know that log(3) ≈ 1.1 and log(10) ≈ 2.3, so what you have so far is -0.1. Then you notice that 8.74 = 9 - 0.26, so you are roughly 2.6% of 9 below 9. More precisely it's 0.26 divided by 9, which is very close to 2.6% multiplied by 1.1, and you know the multiplication-by-11 trick (add the digits and put the result in the middle), so all in all 8.74 is about 2.9% less than 9. You subtract this from the logarithm, but first round it to a whole percentage: subtract 3% from -10% and you end up with -0.13. The actual value is -0.1347.

In other words, you use two formulas: log(ab) = log(a) + log(b), and log(1+x) ≈ x for small x (the first term of the Taylor series, which for our purposes is essentially exact when x is a single-digit percentage).
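A minimal Python sketch of the trick, hardcoding the 0.874 example above and the memorized values ln(3) ≈ 1.1 and ln(10) ≈ 2.3:

```python
import math

# memorized anchors (natural logs): ln(3) ≈ 1.1, ln(10) ≈ 2.3
ln_09 = 2 * 1.1 - 2.3          # ln(0.9) = 2 ln 3 - ln 10 ≈ -0.1
eps = (0.9 - 0.874) / 0.9      # 0.874 is about 2.9% below 0.9
approx = ln_09 - eps           # ln(1 - eps) ≈ -eps for small eps
```

The result, about -0.129, lands within a hundredth of the true ln(0.874) ≈ -0.1347, matching the "2 decimals or thereabouts" claim.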

Kids can learn these manipulations mechanically at an early age, just like they can learn simple magical tricks. And they are having fun with it. Later on, they don't get confused by the formula log(ab) = log(a)+log(b) because they have used it and are familiar with it.

PS. By the way, I once heard the apocryphal story that Gauss could calculate any natural logarithm in his head to 5 exact decimals. I did not find (or look for) confirmation in the literature, but I find it believable. If you teach your kid the trick with logs to 2 decimals, and they have already learned to sum the numbers from 1 to 100, you can tell them they are a mini-Gauss.

Logarithms can probably be taught using logarithmic Cuisenaire blocks, which might be presented along with suitably scaled graph paper. (You remember Cuisenaire blocks from primary school - they come in lengths of 1, 2, 3... and they're all different colours.) I haven't tried this in teaching yet, but it's the next thing I'll try. P.S. The primes assume their proper importance in arithmetic using logarithmic blocks.

Two words: slide rule

The first time I grokked what logarithms are good for was when I needed to write nice graph-drawing code that automatically snapped endpoints to the nearest round value. When you have data in the range 0:1.0, you want the min and max of your graph to snap to 0 and 1. When your data is 3:360, the graph should snap to 0 and 400.

Conclusion: the best way is to show real-world examples of usage that the learner can identify with.
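As an illustrative sketch (not the commenter's actual code), log10 picks the snapping step; `snap_limits` here is a hypothetical helper:

```python
import math

def snap_limits(lo, hi):
    # snap axis limits outward to multiples of the largest
    # power of 10 that fits within the data range
    step = 10 ** math.floor(math.log10(hi - lo))
    return (math.floor(lo / step) * step, math.ceil(hi / step) * step)

snap_limits(3, 360)   # step 100 -> (0, 400)
snap_limits(0, 1.0)   # step 1   -> (0, 1)
```

This reproduces both examples from the comment: 3:360 snaps to 0 and 400, and 0:1.0 snaps to 0 and 1.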

Talking about plotting: in log-linear plots[0], exponentials appear as straight lines. In log-log plots[1], functions of the form f(x) = ax^k (monomials) appear as straight lines.

For me, what clicked is that the log of a number in base 10 measures the length of that number in decimal (as in, the number of digits, off by at most one: a positive integer n has floor(log10(n)) + 1 digits).

And more generally, a log is an exponent: log base b of x is the exponent to which b must be raised to produce x. For example, ln(5) is the exponent to which e must be raised to get 5.
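A quick check of the digit-count intuition in Python:

```python
import math

# floor(log10(n)) + 1 equals the number of decimal digits of n
for n in (5, 42, 999, 1000, 123456):
    digits = math.floor(math.log10(n)) + 1
    assert digits == len(str(n))
```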

[0] https://en.wikipedia.org/wiki/Semi-log_plot

[1] https://en.wikipedia.org/wiki/Log%E2%80%93log_plot


"6. (Mar's Law) Everything is linear if plotted log-log with a fat magic marker."

via http://spacecraft.ssl.umd.edu/old_site/academics/akins_laws.....

It's deeper than it sounds. Much of the data derived from physical phenomena follows some polynomial. On a log-log plot this looks like a squiggly but essentially straight line, and the fat magic marker smooths out the noise from the lower-order terms :).

Via the "triangle of power": http://bekawestberg.me/blog/triangle-of-power/

Really, courses should revise notation.

Preferable notation (with a slight correction: in the infinite limit it's 1, not e): https://news.ycombinator.com/item?id=25278021

Firstly, we should teach that algorithms have nothing to do with logarithms.

I’m in a corporate non-tech company and have, on multiple occasions, encountered someone who talks about algorithms as if they are from math class in high school. Pretty sure they are thinking of logarithms.

As one shape of a curve; in a notebook that demonstrates multiple methods of curve fitting with and without a logarithmic transform.

Logarithm: https://simple.wikipedia.org/wiki/Logarithm ; https://en.wikipedia.org/wiki/Logarithm :

> In mathematics, the logarithm is the inverse function to exponentiation. That means the logarithm of a given number x is the exponent to which another fixed number, the base b, must be raised, to produce that number x.
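That inverse relationship is easy to check numerically (a sketch in Python's stdlib):

```python
import math

b, y = 10, 2.5
x = b ** y
# log_b(b^y) recovers the exponent y, and b^(log_b x) recovers x
assert math.isclose(math.log(x, b), y)
assert math.isclose(b ** math.log(x, b), x)
```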

List of logarithmic identities: https://en.wikipedia.org/wiki/List_of_logarithmic_identities

List of integrals of logarithmic functions: https://en.wikipedia.org/wiki/List_of_integrals_of_logarithm...

As functions in a math library or a CAS that should implement the correct axioms correctly:

Sympy Docs > Functions > Contents: https://docs.sympy.org/latest/modules/functions/index.html#c...

sympy.functions.elementary.exponential. log(x, base) == log(x)/log(base), exp(), LambertW(), exp_polar() https://docs.sympy.org/latest/modules/functions/elementary.h...

"Exponential, Logarithmic and Trigonometric Integrals" sympy.functions.special.error_functions. Ei: exponential integral, li: logarithmic integral, Li: offset logarithmic integral https://docs.sympy.org/latest/modules/functions/special.html...

numpy.log. log() base e, log2(), log10(), log1p(x) == log(1 + x) https://numpy.org/doc/stable/reference/generated/numpy.log.h...

numpy.exp. exp(), expm1(x) == exp(x) - 1, exp2(x) == 2**x https://numpy.org/doc/stable/reference/generated/numpy.exp.h...

Khan Academy > Algebra 2 > Unit: Logarithms: https://www.khanacademy.org/math/algebra2/x2ec2f6f830c9fb89:...

Khan Academy > Algebra (all content) > Unit: Exponential & logarithmic functions https://www.khanacademy.org/math/algebra-home/alg-exp-and-lo...

3blue1brown: "Logarithm Fundamentals | Lockdown math ep. 6", "What makes the natural log "natural"? | Lockdown math ep. 7" https://www.youtube.com/playlist?list=PLZHQObOWTQDP5CVelJJ1b...

Feynman Lectures 22-6: Algebra > Imaginary Exponents: https://www.feynmanlectures.caltech.edu/I_22.html#Ch22-S6

Power law functions: https://en.wikipedia.org/wiki/Power_law#Power-law_functions

In a two-body problem, of the 4-5 fundamental interactions: Gravity, Electroweak interaction, Strong interaction, Higgs interaction, a fifth force; which have constant exponential terms in their symbolic field descriptions? https://en.wikipedia.org/wiki/Fundamental_interaction#The_in...

Natural logs in natural systems:

Growth curve (biology) > Exponential growth: https://en.wikipedia.org/wiki/Growth_curve_(biology)#Exponen...

Basic reproduction number: https://en.wikipedia.org/wiki/Basic_reproduction_number

(... Growth hacking; awesome-growth-hacking: https://github.com/bekatom/awesome-growth-hacking )

Metcalfe's law: https://en.wikipedia.org/wiki/Metcalfe%27s_law

Moore's law; doubling time: https://en.wikipedia.org/wiki/Moore's_law

A block reward halving is a doubling of difficulty. What block reward difficulty schedule would be a sufficient inverse of Moore's law?

A few queries:

logarithm cheatsheet https://www.google.com/search?q=logarithm+cheatsheet

logarithm on pinterest https://www.pinterest.com/search/pins/?q=logarithm

logarithm common core worksheet https://www.google.com/search?q=logarithm+common+core+worksh...

logarithm common core autograded exercise (... Khan Academy randomizes from a parametrized (?) test bank for unlimited retakes for Mastery Learning) https://www.google.com/search?q=logarithm+common+core+autogr...

If only I had started my math career with a binder of notebooks or at least 3-hole-punched notes.

- [ ] Create a git repo with an environment.yml that contains e.g. `mamba install -y jupyter-book jupytext jupyter_contrib_extensions jupyterlab-git nbdime jupyter_console pandas matplotlib sympy altair requests-html`, build a container from said repo with repo2docker, and git commit and push changes made from within the JupyterLab instance that repo2docker layers on top of your reproducible software dependency requirement specification ("REES"). {bash/zsh, git, docker, repo2docker, jupyter, [MyST] markdown and $$ mathTeX $$; Google Colab, Kaggle Kernels, ml-workspace, JupyterLite}

"How I'm able to take notes in mathematics lectures using LaTeX and Vim" https://news.ycombinator.com/item?id=19448678

Here's something like MyST Markdown or Rmarkdown for Jupyter-Book and/or jupytext:

## Log functions

Log functions in the {PyData} community

### LaTeX

#### sympy2latex

What e.g. sympy2latex parses that LaTeX into, in terms of symbolic objects in an expression tree:

### numpy

see above

### scipy

### sympy

see above

### sagemath

### statsmodels

### TensorFlow

### PyTorch

## Logarithmic and exponential computational complexity

- Docs: https://www.bigocheatsheet.com/

- [ ] DOC: Rank these with O(1) first: O(n log n), O(log n), O(1), O(n), O(2^n) +growthcurve +exponential
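A sketch of that ranking exercise, evaluating each class at an arbitrary sample size (n = 20 here) and sorting by cost:

```python
import math

n = 20  # arbitrary sample size
costs = {
    "O(1)": 1,
    "O(log n)": math.log2(n),
    "O(n)": n,
    "O(n log n)": n * math.log2(n),
    "O(2^n)": 2 ** n,
}
ranked = sorted(costs, key=costs.get)  # smallest cost first
```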

## Combinatorics, log, exp, and Shannon classical entropy and classical Boolean bits

https://www.google.com/search?q=formula+for+entropy :

Entropy > Statistical mechanics: https://en.wikipedia.org/wiki/Entropy#Statistical_mechanics

SI unit for [ ] entropy: joules per kelvin (J·K⁻¹)
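A minimal sketch of Shannon entropy in bits, built on log2:

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), measured in classical bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

shannon_entropy([0.5, 0.5])  # a fair coin carries exactly 1 bit
shannon_entropy([1.0])       # a certain outcome carries 0 bits
```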


In terms of specifying tasks for myself in order to learn logarithms, I could use e.g. todo.txt markup to specify tasks with [project and concept] labels and contexts; but todo.txt doesn't support nesting, so instead here are markdown checkboxes with todo.txt-style labels and contexts (and code labels, since it's software math):

  - [ ] Read the Logarithms wikipedia page <url> and take +notes +math +logarithms @workstation
    - [o] Read
    - [x] BLD: mathrepo: generate from cookiecutter or nbdev
    - [ ] DOC: mathrepo: logarithm notes
    - [ ] DOC,ART: mathrepo: create exponential and logarithmic charts +logarithms @workstation
    - [ ] ENH,TST,DOC: mathrepo: logarithms with stdlib math, numpy, sympy (and *pytest* or at least `assert` assertion expressions)
    - [ ] ENH,TST,DOC: mathrepo: logarithms and exponents with NN libraries (and *pytest*)
Math (and logic; ultimately thermodynamics) transcends disciplines. Not to bikeshed - a name can be sed-replaced later, but it's worth choosing a good variable name now - is 'mathrepo' the best scope for this project? Smaller dependency sets (i.e. a simpler environment.yml) seem to result in fewer version conflicts. `conda env export --from-history; mamba env export --from-history; pip freeze; pipenv -h; poetry -h`

### LaTeX

  $$ \log_{b} x = y \iff b^y = x $$
  $$ 2^3 = 8 $$
  $$ \log_{2} 8 = 3 $$
  $$ \ln e = 1 $$
  $$ \log_b(xy)=\log_b(x)+\log_b(y) $$

  $$ \begin{align}
  \textit{(1) } \log_b(xy) & = \log_b(x)+\log_b(y)
  \end{align} $$
Source: https://en.wikipedia.org/w/index.php?title=List_of_logarithm...

#### sympy2latex

What e.g. sympy2latex parses that LaTeX into, in terms of symbolic objects in an expression tree:

  # install:
  #   !python -m pip install antlr4-python3-runtime sympy
  #   !mamba install -y -q antlr-python-runtime sympy
  from sympy.parsing.latex import parse_latex

  def displaylatexexpr(latex):
      expr = parse_latex(latex)
      return expr

  displaylatexexpr(r'\log_{2} 8')
  # log(8, 2)
  displaylatexexpr(r'\log_{2} 8 = 3')
  # Eq(log(8, 2), 3)
  displaylatexexpr(r'\log_b(xy) = \log_b(x)+\log_b(y)')
  # Eq(log(x*y, b), log(x, b) + log(y, b))
  displaylatexexpr(r'\log_{b} (xy) = \log_{b}(x)+\log_{b}(y)')
  # Eq(log(x*y, b), log(x, b) + log(y, b))
  displaylatexexpr(r'\log_{2} (xy) = \log_{2}(x)+\log_{2}(y)')
  # Eq(log(x*y, 2), log(x, 2) + log(y, 2))

### python standard library



math. exp(x), expm1(x), log(x[, base]), log1p(x), log2(x), log10(x), pow(x, y) : float; math.sqrt(x) == math.pow(x, 0.5)
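A quick sanity check of those stdlib functions (note that math.log takes the base as an optional positional argument, not a keyword):

```python
import math

# log with an explicit base, and sqrt as a half power
assert math.isclose(math.log(8, 2), 3)
assert math.isclose(math.sqrt(2), math.pow(2, 0.5))
# expm1/log1p stay accurate where exp(x) - 1 and log(1 + x) lose precision
x = 1e-12
assert math.isclose(math.expm1(x), x)
assert math.isclose(math.log1p(x), x)
```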

### scipy

https://docs.scipy.org/doc/scipy/reference/generated/scipy.s... scipy.special. xlog1py()


### sagemath


### statsmodels

### TensorFlow

https://www.tensorflow.org/api_docs/python/tf/math tf.math. log(), log1p(), log_sigmoid(), exp(), expm1()

SmoothReLU ("softplus") is a smooth version of the ReLU activation function defined with ln: softplus(x) = ln(1 + e^x): https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#So...

E.g. Softmax & LogSumExp also include natural logarithms in their definitions: https://en.wikipedia.org/wiki/Softmax_function
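A minimal sketch of the log-sum-exp trick and a softmax built on it (plain Python for illustration, not the TF/PyTorch implementations):

```python
import math

def log_sum_exp(xs):
    # shift by the max so exp() never overflows
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def softmax(xs):
    lse = log_sum_exp(xs)
    return [math.exp(x - lse) for x in xs]

sum(softmax([1.0, 2.0, 3.0]))   # probabilities sum to 1
log_sum_exp([1000.0, 1000.0])   # = 1000 + ln 2, with no overflow
```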

### PyTorch

https://pytorch.org/docs/stable/generated/torch.log.html torch. log(), log10(), log1p(), log2(), exp(), exp2(), expm1(); logaddexp() , logaddexp2(), logsumexp(), torch.special.xlog1py()


Regarding this learning process and these tools: now I have a few replies to myself (!) in not-quite-markdown with various headings. I should consolidate this information into a [MyST] Markdown Jupyter notebook and rework the whole thing. If this had been decent markdown from the start, I'd have less markup work to do to create a ScholarlyArticle / Notebook.
