How should logarithms be taught? 34 points by raviparikh 8 days ago | 15 comments

 I'm imagining a story like Gauss quickly calculating the 100th triangular number. One time in my primary school maths class (I was 10) we were asked to find out by hand how many doublings it took to get above 1,000,000. I imagine if we had been asked to do that at a slightly older age, and afterwards taught how to use logs and log tables, we could get the answer in a short amount of time: log(2) = 0.3..., log(1000000) = 6, 6 / 0.3 = 20. I think that would straight away solidify the usefulness of logs in our minds.
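The calculation the commenter describes can be sketched in a few lines of Python (assuming base-10 logs, as with a log table):

```python
import math

# Doublings needed to exceed 1,000,000:
# 2^n > 10^6  =>  n > log10(10^6) / log10(2) = 6 / 0.301... ≈ 19.93, so n = 20.
n = math.ceil(math.log10(1_000_000) / math.log10(2))
print(n)  # 20

# Sanity check by direct doubling:
assert 2 ** n > 1_000_000 and 2 ** (n - 1) <= 1_000_000
```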
 I'm not sure that the view that "mathematicians reason syntactically and not semantically" is that unpopular. After all, a lot of modern mathematics is concerned with how structures are made up of interactions between elements, not with the "nature" of the elements themselves. In that sense, the fact that log(a) + log(b) = log(ab) can be viewed as a homomorphism from a multiplicative group to an additive one (with the exponential function as the inverse map).

But I also think it can be illuminating to see that this is not the only "definition" of logarithms, and that there are equivalent definitions. That's precisely the beauty of mathematics: you can define a number of things and then show them to be exactly the same. There are a number of different ways to define e.g. the exponential function, and each of them highlights a different aspect and is interesting to mathematicians working in different disciplines (e.g. the exponential function is also the unique solution to the IVP y' = y with y(0) = 1). I don't know if it's possible to teach something like that to children, but it does seem like we're not even trying right now.
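The homomorphism point can be checked numerically with the standard library alone, which might itself be a nice classroom exercise:

```python
import math

# log turns multiplication into addition: log(ab) = log(a) + log(b),
# a homomorphism from the positive reals under * to the reals under +.
a, b = 3.7, 42.0
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))

# exp is the inverse map, turning addition back into multiplication:
assert math.isclose(math.exp(math.log(a) + math.log(b)), a * b)
assert math.isclose(math.exp(math.log(a)), a)
```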
 My introduction to exponentials and logarithms was via growth rates. My dad used baby fish as an example and I grokked it in a fifteen-minute ride. Perhaps not very suitable for the more advanced topics (changes in base, the why of e), but you can teach this to any ten-year-old.

Somehow, now that I'm typing this, I have a feeling it goes to show I have a very partial understanding of logarithms: I see them as transforming between numbers on an exponential curve and exponents, where the exponent is something I think of as periods in a discrete process. So OP's point about the distinction between syntactic and semantic thinking is very real.
 Logarithms can probably be taught using logarithmic Cuisenaire blocks, which might be presented along with suitably scaled graph paper. (You remember Cuisenaire blocks from primary school - they come in lengths of 1, 2, 3... and they're all different colours.) I haven't tried this yet in teaching, but it's the next thing I'll try. P.S. The primes assume their proper importance in arithmetic using logarithmic blocks.
 Two words: slide rule
 The first time I grokked what logarithms are good for was when I needed to write nice graph-drawing code which automatically snapped endpoints to the nearest round value. When your data is in the range 0 to 1.0, you want the min and max of your graph to snap to 0 and 1. When your data is in the range 3 to 360, the graph should snap to 0 and 400.

Conclusion: the best way is to show real-world examples of usage that the learner can identify with.
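One common way to implement this kind of snapping (a sketch, not necessarily what the commenter wrote, and `nice_bounds` is a hypothetical name) uses log10 to find the data's order of magnitude, then rounds the endpoints outward to that step:

```python
import math

def nice_bounds(lo: float, hi: float) -> tuple[float, float]:
    """Snap [lo, hi] outward to 'round' values at the data's order of magnitude.

    Assumes hi > lo (log10 of a non-positive span is undefined).
    """
    span = hi - lo
    step = 10.0 ** math.floor(math.log10(span))  # e.g. span 357 -> step 100
    return (math.floor(lo / step) * step, math.ceil(hi / step) * step)

print(nice_bounds(0.0, 1.0))    # (0.0, 1.0)
print(nice_bounds(3.0, 360.0))  # (0.0, 400.0)
```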
 Talking about plotting: in log-linear plots[0], exponentials appear as straight lines. In log-log plots[1], functions of the form f(x) = ax^k (monomials) appear as straight lines.

For me, I think what clicked is that the log of a number in base 10 measures the length of that number in decimal (as in, the number of digits, possibly off by one).

More generally, the log is the exponent to which the base must be raised to produce the number. For example, ln(5) is the exponent to which e must be raised to get 5.
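The "length of the number" observation is exactly floor(log10(n)) + 1 for a positive integer n, which a short check confirms (float precision can break this for astronomically large n, but it holds comfortably in this range):

```python
import math

# The number of decimal digits of a positive integer n is floor(log10(n)) + 1,
# which is why log10 "measures the length" of a number.
for n in (7, 42, 999, 1000, 123456):
    digits = math.floor(math.log10(n)) + 1
    assert digits == len(str(n)), (n, digits)
```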
 Related: "6. (Mar's Law) Everything is linear if plotted log-log with a fat magic marker."

It's deeper than it sounds. Much of the data derived from physical phenomena follows some polynomial. On a log-log plot, this looks like a squiggly but nearly straight line, and the magic marker smooths out the lower-order-term noise :).
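The reason the line is straight: for f(x) = a·x^k, taking logs gives log f = log a + k·log x, so the slope on a log-log plot is the exponent k. A minimal sketch of reading the exponent off the slope:

```python
import math

# f(x) = a * x^k appears as a straight line in log-log coordinates,
# with slope equal to the exponent k.
a, k = 3.0, 2.5
f = lambda x: a * x ** k

x1, x2 = 10.0, 1000.0
slope = (math.log(f(x2)) - math.log(f(x1))) / (math.log(x2) - math.log(x1))
print(round(slope, 6))  # 2.5
```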
 Via the "triangle of power": http://bekawestberg.me/blog/triangle-of-power/

Really, courses should revise notation.
 Preferable notation (slight correction): https://news.ycombinator.com/item?id=25278021
 Firstly, we should teach that algorithms have nothing to do with logarithms.

I'm in a corporate, non-tech company and have, on multiple occasions, encountered someone who talks about algorithms as if they were from math class in high school. I'm pretty sure they're thinking of logarithms.
 As one shape of a curve; in a notebook that demonstrates multiple methods of curve fitting with and without a logarithmic transform.

> In mathematics, the logarithm is the inverse function to exponentiation. That means the logarithm of a given number x is the exponent to which another fixed number, the base b, must be raised, to produce that number x.

List of logarithmic identities: https://en.wikipedia.org/wiki/List_of_logarithmic_identities

List of integrals of logarithmic functions: https://en.wikipedia.org/wiki/List_of_integrals_of_logarithm...

As functions in a math library or a CAS that should implement the correct axioms correctly:

SymPy Docs > Functions > Contents: https://docs.sympy.org/latest/modules/functions/index.html#c...

sympy.functions.elementary.exponential: log(x, base=e) == log(x)/log(e), exp(), LambertW(), exp_polar(): https://docs.sympy.org/latest/modules/functions/elementary.h...

"Exponential, Logarithmic and Trigonometric Integrals", sympy.functions.special.error_functions: Ei (exponential integral), li (logarithmic integral), Li (offset logarithmic integral): https://docs.sympy.org/latest/modules/functions/special.html...

numpy.log: log() base e, log2(), log10(), log1p(x) == log(1 + x): https://numpy.org/doc/stable/reference/generated/numpy.log.h...

numpy.exp: exp(), expm1(x) == exp(x) - 1, exp2(x) == 2**x: https://numpy.org/doc/stable/reference/generated/numpy.exp.h...

Khan Academy > Algebra 2 > Unit: Logarithms: https://www.khanacademy.org/math/algebra2/x2ec2f6f830c9fb89:...

Khan Academy > Algebra (all content) > Unit: Exponential & logarithmic functions: https://www.khanacademy.org/math/algebra-home/alg-exp-and-lo...

3blue1brown: "Logarithm Fundamentals | Lockdown math ep. 6", "What makes the natural log "natural"? | Lockdown math ep. 7": https://www.youtube.com/playlist?list=PLZHQObOWTQDP5CVelJJ1b...

Feynman Lectures 22-6: Algebra > Imaginary Exponents: https://www.feynmanlectures.caltech.edu/I_22.html#Ch22-S6

Power-law functions: https://en.wikipedia.org/wiki/Power_law#Power-law_functions

In a two-body problem, of the 4-5 fundamental interactions (gravity, electroweak interaction, strong interaction, Higgs interaction, a fifth force), which have constant exponential terms in their symbolic field descriptions? https://en.wikipedia.org/wiki/Fundamental_interaction#The_in...

Natural logs in natural systems:

Growth curve (biology) > Exponential growth: https://en.wikipedia.org/wiki/Growth_curve_(biology)#Exponen...

Basic reproduction number: https://en.wikipedia.org/wiki/Basic_reproduction_number

(... Growth hacking; awesome-growth-hacking: https://github.com/bekatom/awesome-growth-hacking )

Metcalfe's law: https://en.wikipedia.org/wiki/Metcalfe%27s_law

Moore's law; doubling time: https://en.wikipedia.org/wiki/Moore's_law

A block reward halving is a doubling of difficulty. What block reward difficulty schedule would be a sufficient inverse of Moore's law?

A few queries:

logarithm cheatsheet: https://www.google.com/search?q=logarithm+cheatsheet

logarithm on pinterest: https://www.pinterest.com/search/pins/?q=logarithm

logarithm common core worksheet: https://www.google.com/search?q=logarithm+common+core+worksh...

logarithm common core autograded exercise (... Khan Academy randomizes from a parametrized (?) test bank for unlimited retakes for Mastery Learning): https://www.google.com/search?q=logarithm+common+core+autogr...
 If only I had started my math career with a binder of notebooks, or at least 3-hole-punched notes.

- [ ] Create a git repo with an environment.yml that contains e.g. `mamba install -y jupyter-book jupytext jupyter_contrib_extensions jupyterlab-git nbdime jupyter_console pandas matplotlib sympy altair requests-html`, build a container from said repo with repo2docker, and git commit and push changes made from within the JupyterLab instance that repo2docker layers on top of your reproducible software dependency requirement specification ("REES"). {bash/zsh, git, docker, repo2docker, jupyter, [MyST] markdown and $$mathTeX$$; Google Colab, Kaggle Kernels, ml-workspace, JupyterLite}

"How I'm able to take notes in mathematics lectures using LaTeX and Vim": https://news.ycombinator.com/item?id=19448678

Here's something like MyST Markdown or Rmarkdown for Jupyter-Book and/or jupytext:

## Log functions

Log functions in the {PyData} community

### LaTeX

#### sympy2latex

What e.g. sympy2latex parses that LaTeX into, in terms of symbolic objects in an expression tree:

### numpy

see above

### scipy

### sympy

see above

### sagemath

### statsmodels

### TensorFlow

### PyTorch

## Logarithmic and exponential computational complexity

- [ ] DOC: Rank these with O(1) first: O(n log n), O(log n), O(1), O(n), O(n^2) +growthcurve +exponential

## Combinatorics, log, exp, and Shannon classical entropy and classical Boolean bits

S = k_B \ln \Omega

Entropy > Statistical mechanics: https://en.wikipedia.org/wiki/Entropy#Statistical_mechanics

SI unit for entropy: joules per kelvin (J/K)

***

In terms of specifying tasks for myself in order to learn {Logarithms,}, I could use e.g. todo.txt markup to specify tasks with [project and concept] labels and contexts; but todo.txt doesn't support nested lists like markdown checkboxes with todo.txt markup and/or codelabels (if it's software math) do:

- [ ] Read the Logarithms wikipedia page and take +notes +math +logarithms @workstation
- [o] Read
- [x] BLD: mathrepo: generate from cookiecutter or nbdev
- [ ] DOC: mathrepo: logarithm notes
- [ ] DOC,ART: mathrepo: create exponential and logarithmic charts +logarithms @workstation
- [ ] ENH,TST,DOC: mathrepo: logarithms with stdlib math, numpy, sympy (and *pytest* or at least assert expressions)
- [ ] ENH,TST,DOC: mathrepo: logarithms and exponents with NN libraries (and *pytest*)

Math (and logic; ultimately thermodynamics) transcends disciplines. Not to bikeshed - worrying about a name that can be sed-replaced later - but choose a good variable name now: is 'mathrepo' the best scope for this project? Smaller dependency sets (i.e. a simpler environment.yml) seem to result in fewer version conflicts. `conda env export --from-history; mamba env export --from-history; pip freeze; pipenv -h; poetry -h`
 ### LaTeX

$$\log_b x = y \iff b^y = x$$

$$2^3 = 8$$

$$\log_{2} 8 = 3$$

$$\ln e = 1$$

$$\log_b(xy)=\log_b(x)+\log_b(y)$$

\begin{align} \textit{(1) } \log_b(xy) & = \log_b(x)+\log_b(y) \end{align}

Sources: https://en.wikipedia.org/w/index.php?title=List_of_logarithm...

#### sympy2latex

What e.g. sympy2latex parses that LaTeX into, in terms of symbolic objects in an expression tree (note: the LaTeX strings need to be raw strings so Python doesn't eat the backslashes, and display() is the IPython/Jupyter builtin):

```python
# install:
# !python -m pip install antlr4-python3-runtime sympy
# !mamba install -y -q antlr-python-runtime sympy
from sympy.parsing.latex import parse_latex

def displaylatexexpr(latex):
    expr = parse_latex(latex)
    display(str(expr))  # display() is available inside IPython/Jupyter
    display(expr)
    return expr

displaylatexexpr(r'\log_{2} 8')      # 'log(8, 2)'
displaylatexexpr(r'\log_{2} 8 = 3')  # 'Eq(log(8, 2), 3)'
displaylatexexpr(r'\log_b(xy) = \log_b(x)+\log_b(y)')         # 'Eq(log(x*y, b), log(x, b) + log(y, b))'
displaylatexexpr(r'\log_{b} (xy) = \log_{b}(x)+\log_{b}(y)')  # 'Eq(log(x*y, b), log(x, b) + log(y, b))'
displaylatexexpr(r'\log_{2} (xy) = \log_{2}(x)+\log_{2}(y)')  # 'Eq(log(x*y, 2), log(x, 2) + log(y, 2))'
```

### python standard library

https://docs.python.org/3/library/operator.html#operator.pow

https://docs.python.org/3/library/math.html#power-and-logari...

math: exp(x), expm1(x), log(x, base=e), log1p(x), log2(x), log10(x), pow(x, y) -> float; sqrt(x) == pow(x, 1/2)

### scipy

scipy.special.xlog1py(): https://docs.scipy.org/doc/scipy/reference/generated/scipy.s...

https://docs.scipy.org/doc/scipy/reference/generated/scipy.s...

### sagemath

https://doc.sagemath.org/html/en/reference/functions/sage/fu...

### statsmodels

### TensorFlow

tf.math: log(), log1p(), log_sigmoid(), exp(), expm1(): https://www.tensorflow.org/api_docs/python/tf/math

https://keras.io/api/layers/activations/

SmoothReLU ("softplus") adds ln to the ReLU activation function, for example: https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#So...

E.g. Softmax & LogSumExp also include natural logarithms in their definitions: https://en.wikipedia.org/wiki/Softmax_function

### PyTorch

torch: log(), log10(), log1p(), log2(), exp(), exp2(), expm1(); logaddexp(), logaddexp2(), logsumexp(), torch.special.xlog1py(): https://pytorch.org/docs/stable/generated/torch.log.html

***

Regarding this learning process and these tools: now I have a few replies to myself (!) in not-quite-markdown and with various headings. I should consolidate this information into a [MyST] markdown Jupyter notebook and re-read the whole thing. If this was decent markdown from the start, I'd have less markup work to do to create a ScholarlyArticle / Notebook.