
Do Teachers Need to Include the History of Mathematics in Their Teaching? (2003) - lainon
https://www.researchgate.net/publication/281223989_Do_teachers_need_to_incorporate_the_history_of_mathematics_in_their_teaching
======
lordnacho
Yes, yes, a hundred times yes.

My math teacher taught me not only calculus, but also when and who got up to
what. It doesn't have to take a long time, but a bit of context helps a lot.
Euler went to Russia and yada yada bridges graphs etc.

Same goes for all science disciplines. You need to have a rough idea that
Darwin worked in the 19th century, that much of thermo came about in the late
19th century, that quantum is a 20th century thing. You need to know what
people were wondering about, and what experiments they came up with.

I've been listening to a lot of audio courses lately, and those little nuggets
really help to understand things.

The point of the little stories such as how Watson and Crick came up with the
double helix is to help recall. It's hard to remember dry facts, much easier
to remember stories. People are kinda built that way.

~~~
dannypgh
I strongly believe math should be taught with the science that motivated its
development: calculus should be taught with classical mechanics. But I'm less
sure the history of science is worthwhile.

The problem is opportunity cost. Schools (high school, intro college physics)
spend a great deal of time discussing previous models of the atom. What's the
value of teaching Thomson's plum pudding model of the atom, really? They could
start with the current model and list all of the observations/experiments that
have shown the model to be useful. Previous models could be relegated to an
appendix, or a history class.

This would free up time to more comprehensively discuss 20th and 21st century
physics, which are sorely neglected at this level.

~~~
pkaeding
I think there is value in teaching that there have been prior models (but more
energy should be spent on the current thinking) because it shows that good
scientists are constantly researching and revising their ideas as they get
more information. Science is never done.

~~~
dannypgh
To be clear, I agree there's value; I just don't think the value is as high as
the other things that could be taught in that time.

I think appendices are really a good solution here. Those who are curious can
read them; I know I would have. Those who aren't can stick to focusing on the
current model.

~~~
pkaeding
I'm not sure I agree. I think that understanding that knowledge is never done,
and that science is about exploring new ideas and questioning our current
understanding, is much more important to the layperson than knowing how atoms
are structured.

I also think that couching science as a journey--a mystery to be solved, with
clues and red herrings along the way--will help to get students interested in
learning the details. (This is all just my gut instinct; I have no experience
in science education, so I might not know what I'm talking about).

------
Animats
Probably not. It's interesting to plow through Newman's four-volume "The World
of Mathematics" as a grad student, but inflicting mathematical history on
ordinary school kids is cruel. They don't need that much math.

Unless you're a mathematician, math should be viewed as a tool, like a lathe.
You don't need to know the history of the lathe, and how Maudslay made it a
precision machine tool. (His original lathe is in the Kensington Science
Museum. It's one of those historic artifacts that looks very different from
its predecessors, while its successors look a lot like it.)

Few people need to know how to build up mathematics from minimal axioms.
Nobody should have to struggle through Whitehead and Russell below the PhD
level. We have power tools for that now. The original Boyer-Moore theorem
prover from 1992 can build up constructive number theory from the axioms in
under a minute.[1] I fixed it up recently to run on GNU Common Lisp and put it
on Github, so it's runnable on modern machines.

There's certainly no excuse for inflicting Newton's notation for calculus on
kids. It's not even clear that classical geometry proof approaches are that
useful.

[1] [https://github.com/John-Nagle/nqthm](https://github.com/John-Nagle/nqthm)

~~~
theoh
To generalize a bit from your comment: basic education probably shouldn't
follow a strict "ontogeny recapitulates phylogeny" model
([https://en.wikipedia.org/wiki/Recapitulation_theory](https://en.wikipedia.org/wiki/Recapitulation_theory))

Of course it's very important to study the history, but probably not in the
first presentation. But at the same time, it's definitely true, for most
subjects, that historical references often help to motivate less committed
students and enliven the experience.

On Maudslay's role in the development of machine tools, this paper looks
interesting:

FT Evans "The Maudslay Touch: Henry Maudslay, Product of the Past and Maker of
the Future"
[http://www.tandfonline.com/doi/abs/10.1179/tns.1994.007?jour...](http://www.tandfonline.com/doi/abs/10.1179/tns.1994.007?journalCode=yhet19)

~~~
jacobolus
On the other hand, often our tools have somewhat peculiar forms. For example,
“trigonometry” as commonly taught in high schools and used in
science/engineering/mathematics uses bizarre names, terrible notation
conventions, and a giant pile of unmotivated formulas to memorize.

Teaching a bit of history alongside helps students understand why it takes
that particular form.

The word “sine” comes from a weird Latinization of an Indian word for “half a
bowstring”. Draw a picture of a circle with a vertically oriented chord (the
word “chord” also implies a bowstring), and a student will have a much easier
time remembering what the sine is. Likewise tangent (Latin for touching) and
secant (Latin for cutting) make more sense if you think about the meanings of
the words. Cosine means the sine of the complementary angle, etc.

The reason we call inverses “arcsine”, etc. is because originally these were
written as quasi-sentences, and the concept of a mathematical function was not
well developed. So sin⁻¹ _x_ would be expressed as something like: arc (sin. =
_x_ ). That is, the length of the arc whose sine is _x_. This form was
cumbersome so later got shortened to arcsin _x_.

The origins of trigonometry are in astronomical measurement, which is why we
have 360° in a circle (each degree is roughly one day of movement (365
days/year), rounded to a nearby highly composite number); the subdivisions
come from the Sumerian/Babylonian numerical tradition, which used a base-sixty
number system. Hence “first minutes”, “second minutes”, “third minutes”, etc.
of a degree. “Minute” (Latin for small) implies 1/60 of the larger unit.

The reason trigonometry focuses on learning a big pile of formulas is because
before the era of electronic calculators, people needed to do all computations
by hand, or by interpolating in pre-computed lookup tables. The goal of
“trigonometry” is to take a given problem and convert it to a form with the
easiest hand computation and the fewest table lookups possible, so that the
mechanical work can be handed off to a team of human computers who can go
through the laborious arithmetic. Memorizing trigonometry formulas is a way to
cut the work done by the human computers to a small fraction of what it might
take for the original problem as posed.
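As a concrete illustration (my own example, not from the comment above), the old "prosthaphaeresis" trick used a product-to-sum identity to trade a multiplication for two table lookups and an addition:

```python
import math

# Illustrative sketch: the product-to-sum identity
#     cos(A) * cos(B) = (cos(A - B) + cos(A + B)) / 2
# lets a multiplication be replaced by two table lookups and an
# addition -- the kind of hand-computation saving these formulas encode.

A, B = math.radians(37), math.radians(52)
direct = math.cos(A) * math.cos(B)
via_tables = (math.cos(A - B) + math.cos(A + B)) / 2
print(abs(direct - via_tables) < 1e-12)  # True
```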

Trigonometry was important in science/engineering because until recently the
abstract vector concept and idea of combining simple single-number parts into
“complexes” were not well developed. People solved problems by breaking them
into coordinates and discrete lengths and angles. Solving triangles and
converting between polar/cartesian coordinates were important steps in almost
any 2-dimensional problem.

If we really wanted to avoid the “ontogeny recapitulates phylogeny” model, we
would scrap the current form of trigonometry (certainly not spend 4+ months
exclusively focusing on it) and set the high-level ideas on a more logical
foundation which was easier to learn and reason about, ditching the parts now
anachronistic in an electronic computer age. We would give students harder
problems to solve and fewer formulas to memorize. But that could leave
students unfamiliar with the existing language commonly used in the existing
literature, so to some extent we’re stuck by our history.
[http://geocalc.clas.asu.edu/pdf/OerstedMedalLecture.pdf](http://geocalc.clas.asu.edu/pdf/OerstedMedalLecture.pdf)

~~~
uryga
> The reason we call inverses “arcsine”, etc. is because originally _these
> were written as quasi-sentences_ , and the concept of a mathematical
> function was not well developed. So sin⁻¹ x would be expressed as something
> like: arc (sin. = x).

Any pointers to resources about those "quasi-sentences"? I'm interested in
language, broadly speaking, so info about how mathematical notation evolved is
interesting to me. The rest of your comment is great too; I'd love to read a
book about stuff like this if there is one!

~~~
jacobolus
The most comprehensive source about this kind of thing is Cajori’s _History of
Mathematical Notations_. You should be able to find a used copy of both
volumes for a reasonable price.

~~~
_asummers
There's a Dover publishing of both volumes in one on Amazon.

[https://smile.amazon.com/History-Mathematical-Notations-Dove...](https://smile.amazon.com/History-Mathematical-Notations-Dover-Mathematics/dp/0486677664/ref=pd_cp_14_1?_encoding=UTF8&psc=1&refRID=0W3D09QH7T4RNGMT2HMF)

------
japhyr
I teach high school math, and I incorporate math history informally. I don't
teach lessons specifically about math history, but I often use it as context
for whatever I'm teaching.

For example, if I'm showing students how to find the area of a circle, I ask
them if they know where pi comes from. Many students have no idea that people
had to discover the value of pi, and how it can be used in formulas. To many
students, pi and formulas are just things that have been around forever, that
they have to learn in school. I draw a square around a circle and ask what the
area of a square is. I draw a pentagon and a hexagon, and ask them what will
happen if we keep adding sides. Students spend most of their time focusing on
the practical aspects of math, but they come away with an understanding that
math has been a human endeavor of discovery, and that many of the pieces fit
together in beautiful and surprising ways.

When we have a little time at the end of class, or during transitions, I pull
up little snippets of math to show them. Math videos are great; for example, I
love showing students that ∞ + ∞ = ∞. [0]

There are lots of little things we can do to make math more alive for
students, and sharing some math history is certainly one of them.

[0]
[https://www.youtube.com/watch?v=faQBrAQ87l4](https://www.youtube.com/watch?v=faQBrAQ87l4)

------
Sniffnoy
Short answer: It varies.

Sometimes in math the history provides helpful context and motivation, and
when people leave it out it makes things confusing. I'm not going to elaborate
on this because I assume most people here already agree with this!

But sometimes the modern way is so much cleaner and better, so that even if
you do want to learn the history, everything will probably be easier to
understand if you learn the modern way first and know in advance what truth it
is that they were working their way towards. Sometimes the historical way is
just awful.

(When I took representation theory in college, the professor thought it would
be funny to at one point show us the _original_ definition of an irreducible
character. Nobody should ever have to learn representation theory in such a
way!)

------
laddng
[https://betterexplained.com/articles/developing-your-intuiti...](https://betterexplained.com/articles/developing-your-intuition-for-math/)

This website does a really good job of explaining complex calculus ideas to
me. One of them is 'e', whose history and origins help you understand why it
behaves the way it does. Without this context, 'e' is just some arbitrary
number that you have to memorize. This alone is why I believe the history of
math should definitely be taught.

Finance is another area where history should be taught, since ideas like
continuously compounded interest were a relatively recent development, and
their history explains why anyone wanted to compound continuously.

~~~
kalid
Kalid from BetterExplained here, glad you're enjoying it!

I often go to the history of a concept for both historical appreciation and,
practically, to understand it better. We often study Calculus without looking
at what Archimedes was able to do -- break shapes into smaller parts -- and
without really seeing how the notion of infinitesimals comes into play.
(Epsilon/delta definitions don't give the same insight.)

The natural log was discovered before e -- why would that be? Sine, cosine,
trig functions -- they started as measurements of triangles, evolved into
analytic definitions of their own -- why does this progression make sense?

In my mind, truly understanding a concept means you understood the path it
took to the current state.

~~~
laddng
Oh wow, this is why I love Hacker News - thank you for helping me overcome my
mathematical anxieties in college as I was working on my Computer Science
major. Your site really helped me gain an appreciation for math since I had
terrible teachers as a kid and led me to believe that I could never 'get it'.
Would love to see some more linear algebra materials as well

~~~
kalid
That's awesome to hear, thanks (I started the site to help other students).

Yeah, my only linear algebra content is:

General Overview:

[https://betterexplained.com/articles/linear-algebra-guide/](https://betterexplained.com/articles/linear-algebra-guide/)

Matrix Multiplication for Programmers:

[https://betterexplained.com/articles/matrix-multiplication/](https://betterexplained.com/articles/matrix-multiplication/)

Hoping to flesh it out over time.

~~~
chillee
Out of curiosity, how is the content generated? Do you write it all yourself,
or are there other contributors?

I've always thought that betterexplained was a great resource, and was
wondering whether there was any way I could contribute.

~~~
kalid
Just saw this now :). I write it all myself, but sometimes get pointed at
resources by others.

[https://aha.betterexplained.com](https://aha.betterexplained.com) - a forum
to discuss ideas. It's in the background currently, but I'd like to make it
into more of a public place.

If you like, shoot me an email, I'm putting together a list of people who
might want to be contributors. Thanks!

[https://betterexplained.com/contact](https://betterexplained.com/contact)

------
ezequiel-garzon
Euclid: if 2^n-1 is prime, then (2^n-1) * 2^(n-1) is the sum of all its proper
divisors (and clearly even).

Euler: If an even number is the sum of all its proper divisors, then it has
the form (2^n-1) * 2^(n-1), where 2^n-1 is prime.

You don't need to give a full history lesson every time, but if you omit the
people that came up with this, and the fact that it happened about two
thousand years apart, you're needlessly ditching precious magic. Some
historical gems take away little class time and make mathematics more humane.
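The correspondence is easy to check by brute force for small n (an illustrative sketch of my own, not part of the comment):

```python
def is_prime(k):
    """Trial-division primality test, fine for small k."""
    if k < 2:
        return False
    return all(k % d for d in range(2, int(k ** 0.5) + 1))

def proper_divisor_sum(m):
    """Sum of the proper divisors of m (divisors less than m)."""
    return sum(d for d in range(1, m) if m % d == 0)

# Euclid's direction: whenever 2^n - 1 is prime, (2^n - 1) * 2^(n-1)
# equals the sum of its proper divisors, i.e. it is perfect.
for n in range(2, 8):
    mersenne = 2 ** n - 1
    if is_prime(mersenne):
        perfect = mersenne * 2 ** (n - 1)
        assert proper_divisor_sum(perfect) == perfect
        print(n, perfect)  # n = 2, 3, 5, 7 give 6, 28, 496, 8128
```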

~~~
bikenaga
I mention this when I teach number theory - and the fact that the proof in
Euclid is _entirely in words_. People complain about mathematical notation
sometimes, but the imperfect notation we have is better than just words for
everything. (We could use more pictures, however.)

Likewise, if I define normal subgroups, then simple groups - and it would be a
shame, at that point, not to mention the classification of the finite simple
groups.

I think from the students' reactions that these things are interesting to
them. People interest people.

So I mention Euclid, and maybe I take 1 minute. The finite simple groups,
maybe 2 minutes. The class is 50 minutes long. Is that "teaching the history
of math"? Really, the original question is ill-posed. Let us say I have a
50-minute class in a content course. How much of that time do I have to spend
talking about history before I'm "teaching the history of math"? One minute?
Ten minutes? Do I have to give an assignment on history?

But a little bit of history, or culture, or a random story - I think that's
part of learning the subject, broadly understood (as you said, _humanely_
understood). Nothing but definition-theorem-proof would be pretty deadly.

------
analog31
I'm certainly not a real mathematician, though I was a math major in college.
But I use math all the time. Meanwhile, my kids are taking math in school.

Practically everything I do with math, is done at the computer. When I derive
something by hand, it's with the knowledge that I'm just doing it for
nostalgia's sake. I could, and probably should, use Jupyter / Python / Maxima
for everything. And I'd enjoy learning how to use even more interesting tools
such as a proof assistant, even if it would be purely recreational at this
point in my career.

Meanwhile, in their high school math classes, my kids will never touch a
computer. Everything is done by hand, with occasional use of a graphing
calculator (what an archaic device).

In a weird sense, not only are they learning history, but _the entire
curriculum is history._

I can't say if this is good or bad. Whatever I learned in high school must
have paved the way for me to pick up more modern techniques fairly readily.
Math really came alive for me when I began to learn abstract math, and was
simultaneously introduced to computation at the front end of the microcomputer
revolution. That's what made me want to be a math major.

~~~
joeberon
When you say you should use a computer, you mean for arithmetic and
calculations right?

Also calculators are hardly archaic. If I want to calculate something quickly
I'll always go for my Casio FX-83GT, since I can type it much faster in there.
They are archaic in the sense that the number of terms you can have can be
limiting though...

I think a good grasp of arithmetic is incredibly helpful in the real world,
and is something that academics (like me, as a physicist) often lack, whereas
"regular people" are much better at it. I also rarely bother with change, I
pay with card when I can...

I think doing anything more complicated than basic calculations on paper is
pointless, though.

~~~
analog31
I have a calculator too, though not a graphing one, and I use it for a similar
reason: The keypad is convenient. But if I need to graph something, or do a
repetitive calculation, then I'll turn to Jupyter. In fact, I have it on my
tablet.

I graduated from high school just as graphing calculators were introduced, so
it never became part of my experience.

What seems unfortunate about the graphing calculator is that its special
symbiosis with K-12 math teaching limits the development of both. You can't
add features to the calculator, or offer a free alternative as a phone app,
without facilitating "cheating," and the textbooks can't introduce lessons
requiring computational power beyond the capabilities of the calculator.

Not to mention, the TI monopoly: Every family has to shell out for one of
those things.

------
Jyaif
Not just mathematics, all disciplines should contain a healthy dose of
history.

I think that explaining the thought process is even more important than the
results.

~~~
taeric
There is a difference in history and thought process.

My view is that school should help kids find their own thought process. It
will likely be similar to successful thought processes. It doesn't have to
be, though.

The most instructive thing I have ever heard was that Feynman made his own
notations for learning math tricks in high school. For some reason, this
really made me regret not trying new ways of doing things.

There was an article recently on the importance of notation. Making your own
goes a long way to understanding others.

------
rastaaaaaa
While studying CS in my first year at university our maths professor spent the
first week teaching us about ancient number systems and for the first few
weeks taught us how to count, add, and subtract using Egyptian (base 10) and
Babylonian (base 60) numerals.

We were given some historical context for those number systems and this was
the perfect way to lead into teaching us binary, octal and hexadecimal
arithmetic.

For those of us who have always been interested in computers it mightn't have
been that useful, but for the people who "sorta fell into this course" it was
a great way to learn those concepts without it just being "that binary thing".
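The link from Babylonian numerals to binary and hex is just positional notation with a different base; a minimal sketch of my own (not from the comment):

```python
def to_base(n, base):
    """Digits of a non-negative integer in the given base, most
    significant first, via repeated division -- the same procedure
    whether the base is 60, 16, or 2."""
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return digits[::-1] or [0]

print(to_base(1994, 60))  # Babylonian-style: [33, 14]  (33*60 + 14)
print(to_base(1994, 16))  # hexadecimal: [7, 12, 10], i.e. 0x7CA
print(to_base(1994, 2))   # binary: [1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 0]
```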

------
yequalsx
I try to incorporate a little bit of history into my mathematics courses.

For instance, why do we rationalize denominators? That is, why is 1/sqrt(2)
traditionally considered bad form? There is a historical reason for this that
few students today know about. I think understanding this history puts things
in context and makes the subject less about arbitrary rules.

Here's something that puts basic algebra into perspective. Every equation that
we teach you to solve by hand is reducible to either a quadratic or linear
equation. The rules we teach are all about transforming expressions/equations
into quadratic or linear form. There is a purpose. It's not random.

Does it help? I don't know.

~~~
waterhouse
> rationalize denominators

Hah. Ah, yes. Rationalizing denominators is sometimes useful for calculations
and sometimes counterproductive. I think numbers of the form 1/√n are almost
always simplest and best understood as 1/√n rather than √n/n. (Then you have
symmetry in statements like "The diagonal of a square is √2 times its
side"/"The side of a square is 1/√2 times its diagonal".) Probably the same
for a/√n or 1/b√n. If you're about to add it to another such number and you
need a common denominator, like 1/√2 + 1/√3, then you can turn it into 3√2/6 +
2√3/6; on the other hand, if you were to multiply 1/√2 and 1/√3, it's most
sensible to keep them that way and get 1/√6; if you then needed to square it,
1/6 is simpler than 6/36. I do believe that teachers' insistence on
rationalizing denominators regardless of context was an instance of cargo-
culting—following an arbitrary rule without understanding where it came from
or when it was appropriate.

(And, incidentally, any denominator of the form "a ± b√c" _should_ almost
certainly always be rationalized, and that is a more difficult and valuable
trick.)

Edit: By the way, I would be interested to know what "historical reason" you
have in mind. I have a feeling that it's of the same form as "at one point,
mathematicians were sort of embarrassed by the idea of negative numbers, which
were _obviously_ not _real_ , so they would prefer forms like x + 3 = 0 over x
= -3". That's the only reason I can think of for _always_ preferring that
form. Was your comment about converting things to linear and quadratic form
meant to apply to this? I hope you wouldn't assert that it was _always_ to be
preferred. (If a^2 + b^2 = c^2, tell me whether "a = 1/√5, c = 1/√2" yields
simpler calculations than the alternative.) But my teachers did not
communicate anything so nuanced—points taken off for any final answer anywhere
with square roots in the denominator—and I don't think it was communicated to
them, either. (At least one of them was led to assert that √5/5 was, in
itself, "simpler" than 1/√5.)

On the original subject of this thread, I might say that, whether or not it's
directly passed down to students, the background of mathematics should be
incorporated into what is taught to _teachers_ , because otherwise a majority
of them will be ignorant, and will come off to intelligent students as blindly
following and enforcing arbitrary rules that they don't understand.

~~~
bikenaga
I've always assumed the "historical reason" was that when people had to do
computations by hand, it was easier to find the decimal value of 1/2^(1/2)
[how did you make those square root symbols?] by dividing 2^(1/2) by 2 than by
dividing 1 by 2^(1/2) (assuming that you knew the decimal expansion of 2^(1/2)
). With calculators and computers, that isn't important. Most of the math
teachers I know would not penalize anyone for leaving an answer as
"1/2^(1/2)".

But the _idea_ of rationalizing is important - for example, so that you can
express 1/[a + b * 2^(1/2)] in the form p + q * 2^(1/2), where a, b, p, q are
rational.
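That idea is mechanical enough to sketch in code (my own illustration; the function name is made up): multiplying numerator and denominator by the conjugate a - b * 2^(1/2) turns the denominator into the rational number a^2 - 2b^2.

```python
from fractions import Fraction

def rationalize(a, b):
    """Return rationals (p, q) with 1/(a + b*sqrt(2)) == p + q*sqrt(2),
    by multiplying through by the conjugate a - b*sqrt(2)."""
    denom = Fraction(a) ** 2 - 2 * Fraction(b) ** 2
    if denom == 0:
        raise ValueError("a + b*sqrt(2) is zero")
    return Fraction(a) / denom, -Fraction(b) / denom

p, q = rationalize(3, 1)
print(p, q)  # 3/7 -1/7, i.e. 1/(3 + sqrt(2)) = 3/7 - (1/7)*sqrt(2)
```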

~~~
waterhouse
Square root symbols: option-v on Macs. Don't know about other operating
systems. ± is option-+ (aka option-shift-=). I just did option-[every key],
and option-shift-[every key], at various points in the past, and learned the
symbols that I liked.

------
jvvw
The danger is that incorporating history is very easy to do badly. I
remember various science lessons at school which were made duller and more
confusing by the teacher taking a historical approach.

We have better mathematical notation, explanations, etc. these days that make
various topics far easier to understand. Try reading most old mathematics
books and it's a tough experience. It's certainly also possible to have a good
understanding of mathematics without really knowing much about the history
e.g. I understand Galois Theory but I really don't know much at all about the
history of trying to solve polynomials or indeed about Galois himself other
than he went and got himself killed in a duel. In fact when I was younger, too
much history might have put me off the subject altogether. I'm old enough now
to appreciate history but I wasn't when I was at school.

Instead, I think a better approach is to identify parts of mathematics that
teachers struggle to teach either in terms of concepts or motivation. Then one
can look more carefully at those and see whether examples from history (or
indeed other contexts) might help, rather than necessarily using history as
the starting point. I think the article may be in accord here, but it is so
easy for the message to get interpreted differently by teachers.

------
vladsotirov
History is a convenient and readily available proxy for shifting content focus
away from mathematical facts and toward mathematical processes such as
problem-solving, proof-writing, problem-posing, abstraction, theorizing.

History's appeal I think lies in the concrete, engaging narratives involving
the struggles, dreams, and failures of actual people going through those
processes. In my experience, differences in mathematical inclination are
correlated with the ability to perceive the abstract mathematical processes
as engaging narratives in and of themselves.

Professional mathematicians, as far as I've witnessed them speak to one
another, tend to describe, e.g., a sequence of algebraic manipulations to
solve a problem as a journey taken by known facts, during which they grow,
combine, and ultimately transform into new knowledge. I myself have always
considered numbers, variables, and other abstract concepts to be my friends,
and as with any friends I care about their relationships, their states of
being, and so on.

This of course all raises the question of what mathematical facts and
processes should be part of the curriculum in the first place, i.e. what is
it that we would like to teach better using history?

------
mcbits
I'm skeptical that adding more history to mathematics education would increase
motivation. If mathematics has a contender as the most boring and dreaded
subject, it's history. For motivation, tie in pop cultural references and show
applications that students actually find interesting. But historical context
could be useful on other fronts. E.g., discussing the arguments from
historical debates probably helps to solidify certain ideas.

------
adpoe
Just speaking personally: for me, learning the context and history of
mathematics would have made a _huge_ difference in my motivation to learn it.

I always _hated_ math growing up (despite being very good at it now); I didn't
sit down and learn it well until I got serious about learning CS, in my 20s.

Our teachers did not contextualize _why_ we were learning this stuff _at all_.
It was more like: here, sit still and spend an hour drawing lines on grids and
arbitrarily shuffling X's and Y's. Boring... And for me, it seemed so far
removed from the reality of my day-to-day life at that age, that I just
couldn't see why learning this "boring" subject was at all relevant.

In reality, math is one of the most fascinating subjects anybody could learn.
But unless you know why and how it's used--there's no motivation to sit down
and plug through tedious exercises about seemingly trivial subject matter. (I
would have rather been playing video games. Or climbing a tree. Causing
trouble. Or, yes, even _reading_.)

This isn't the only missing link in math education in the US -- but it's one
of them.

------
bsaul
The fact that mathematical definitions are a _construction_ and not something
that's absolutely obvious, and also that notations have _evolved_ and so are
a bit arbitrary as well, is absolutely crucial to demystifying mathematics. I
think the reason mathematicians still wonder if it's necessary is that it
would only show that math is a human construction and not a revelation from
heaven, and so would make math more mundane.

------
indexerror
Absolutely, yes.

1. It's much more interesting to study something when you have a historical
perspective that you can relate to later on in the course.

2. It makes you comfortable with the idea that such developments in
mathematics were made by fellow humans, and that you can also do such things
if you put in the effort. This might sound like a small addition, but it's
instrumental in developing that sense in young kids.

------
fmap
It depends. There are some topics for which the historical context provides
the perfect motivation. Which problems were considered important and why? How
does this theory solve these concrete problems? It is much easier to motivate
students to learn about abstract theories if you can clearly explain their
usefulness ahead of time.

However, there are some mathematical theories, e.g. Topos theory, whose
historical context is so convoluted that it's just going to confuse students.
I'm speaking from personal experience here... Historically, Topos theory was
developed in the context of algebraic geometry. This context is not (directly)
useful to you if you want to apply these ideas to logic. If you approach the
topic from order theory instead (which is an application that came much later
historically!), you get a very smooth explanation where every step follows
from what you did previously instead of magically teleporting in place from
disparate areas of mathematics...

------
jankotek
Absolutely not! 90% of the educated population can't handle basic math (such
as the rule of proportions). History would dilute math even more. But it
would make it easier for teachers and students to 'pass'.

~~~
jakobegger
I agree. The fact that high school students struggle with the epsilon-delta
definition of limits is not because they lack historical context. It is just
far beyond what most people can grasp.

Most of my school mates had zero understanding of the mathematics we learned.
They memorised how to compute the distance between a point and a sphere, a
line and a sphere, a plane and a sphere, and then they'd solve one of these
problems for the exam, get a good grade, and forget everything immediately
afterwards. At no point have they understood what a scalar product is and why
they are using it.

If you'd really want to improve mathematics education, you should focus a lot
more on the basics, and teach advanced subjects only to the handful of
students that are interested in them.

~~~
im3w1l
I don't know about American conditions, but I remember that many struggled
with epsilon delta because it was their first encounter with formal logic.
"For every... there exists a..." took some getting used to for them.
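For reference, the alternating quantifiers in question, written out in LaTeX:

```latex
% The epsilon-delta definition of the limit \lim_{x \to a} f(x) = L:
% "for every epsilon ... there exists a delta ..."
\forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x :
\quad 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon
```

The difficulty is exactly the nested "for every ... there exists ..." structure: the choice of delta is allowed to depend on epsilon, which is a pattern of reasoning most students meet here for the first time.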

------
night_heron
Yes. Probably the most inspiring feature of my first number theory course was
a healthy dose of history: about Gauss, Sophie Germain, Riemann, Hardy,
Ramanujan. It was helpful, psychologically, to see the process that gave rise
to number theory, instead of viewing its ideas as God-given.

------
js8
Not sure about mathematics. The history of math is interesting, but I have
never found it super enlightening when it comes to actual concepts. Although
it's probably good to understand some of it, because it helps to underline the
reasons why various definitions were introduced (for example, the definition
of a continuous function and, later, topology).

However, whenever I talk to other people (contrarians) about global warming, I
very much recommend Weart's History of Global Warming as the only book to
read. I think in this case, understanding the history of the theory, the
timeline, and how convoluted the path of discovery was helps to break the
silly conspiracy theories about climate change.

------
iammiles
I didn't develop a love for mathematics until I picked up Mathematics for the
Nonmathematician at my local library. Having some historical context for what
we were learning would have had a far more profound impact on me during high
school.

------
danso
Maybe not for math, but I think a case could definitely be made for
programming. If I had understood the motivation behind Unicode and its full
implications, I would have spent much less time suffering in confusion about
encoding errors.

~~~
zerofan
Unicode is a terrible mess. I remember reading about it with enthusiasm in the
nineties, but it didn't take long to realize the committee of designers simply
punted on every difficult decision instead of making a stand. They dumped
everything on the programmers. At this point, I think it would be less work to
convert 7 billion people to using ascii than it would to enable 10 million
programmers to use that pile of a standard robustly. History lessons won't
help us here.

~~~
js8
> committee of designers simply punted on every difficult decision instead of
> making a stand

Are you sure that was a bad thing? If they had made a stand, we would now have
to use the (arguably worse) UCS-2 or UCS-4 encodings instead of UTF-8 (which
won de facto).

~~~
zerofan
UTF-8 is a nifty compression scheme - like a Huffman encoding that doesn't
require a table to implement, but the committee doesn't deserve any credit for
that (unless Thompson and Pike were on the committee, which I doubt).

Besides, the reason most of us actually like UTF-8 is because it leaves ascii
alone (which is all I ever use) while pretending to handle the general case.
It doesn't help end users or programmers deal with any of the nonsense around
multiple ways to encode glyphs (combining codes vs accented codes), deal with
surrogates (yes, people encode surrogates in UTF-8), lexical sorting, or
anything else. I'll bet there are dozens of incompatible ways strings are
UTF-8 encoded in the real world, each of them a bug for interoperability, and
all of that blame falls on Unicode being a terrible standard.

So yes, I'm sure.
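As a concrete illustration of the combining-vs-precomposed issue mentioned above, here is a quick sketch using only the Python standard library: the same glyph "é" has two distinct, valid UTF-8 byte sequences.

```python
# Two valid representations of the same glyph "é".
import unicodedata

precomposed = "\u00e9"    # U+00E9 LATIN SMALL LETTER E WITH ACUTE
combining = "e\u0301"     # 'e' followed by U+0301 COMBINING ACUTE ACCENT

print(precomposed == combining)     # False: different code point sequences
print(precomposed.encode("utf-8"))  # b'\xc3\xa9'
print(combining.encode("utf-8"))    # b'e\xcc\x81'

# Unicode normalization (NFC here) is what makes them compare equal.
print(unicodedata.normalize("NFC", combining) == precomposed)  # True
```

The standard's answer is that applications should normalize before comparing, which is precisely the kind of burden on programmers being complained about here.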

------
alexryan
Physics and chemistry are usually taught from a historical perspective, and I
found this to be very helpful because it fires the imagination. We humans seem
to love learning through stories. The hero strives to solve an important
problem, fails a few times, and eventually succeeds. Learning in this fashion
seems to enable the learner to better integrate new knowledge into mental
representations that are more likely to be recalled at the appropriate time
when solving our own problems. Math is unfortunately not taught in this
fashion. IMHO this is why so many kids have the same complaint: "when am I
ever going to use this?"

------
brownbat
I really liked the third bullet:

Historical problems can help develop students’ mathematical thinking.

Seems really useful to know the problems humanity was struggling with at the
time a tool or its notation was developed. Trigonometric functions were
nonsense to me until I learned more about astronomy. Once I saw all the
problems you could solve with them, everything clicked into place.

------
rebootthesystem
As an expansion of the topic, I've always had a problem with a variety of
subjects being taught without context. Even history itself is taught without
context.

For example, in the US, when you learn US history it is done in almost
complete isolation of what might have been going on elsewhere on the planet.
You get a bit of what was going on in England but that's it.

Many topics, from math to physics, chemistry, geography, and even history
itself, would be so much more interesting if they were taught with an
underlying foundation of relevant world history to give them context.

Even woodworking benefits from understanding how and why people were using
certain designs and joints at different times. How did the nail come about?
The screw? Various tools, etc.

And one of my favorites, the number zero throughout history.

------
jacobolus
“Need”? Depends on the level, and maybe not. It can certainly be helpful
though. I particularly recommend in primary school teaching about the history
of counting boards / abacuses, the history of trigonometry, and most
importantly the history of logarithms (every high school student should learn
rudimentary use of a slide rule). All three will help clarify why particular
tools and notations became standard, and give insight into pre-electronic-
computer science and engineering more generally.

For anyone interested in the history of mathematics per se at the
undergraduate level, I recommend Stillwell’s book,
[https://amzn.com/144196052X](https://amzn.com/144196052X)

More important than “mathematical history” is to teach students some measure
of physics/engineering alongside the mathematics, to help motivate concepts.

------
wwarner
no. the math curriculum is too crowded already. i liked reading dirk struik,
but the selfishness of mathematicians and the patrons they served is only
entertainment in the end. want to update the curriculum? then replace some
geometry and calculus with statistics.

------
dave_sullivan
I strongly believe they should.

I've learned much more about math by looking at its history. Then again, I've
simultaneously had to learn a lot of applied math quickly, so I have skipped
the historical perspective on many things too.

I think the way I learn best is simply try to apply a concept first and
struggle until I become deeply frustrated. Then I backtrack and try to learn
the fundamentals that influence the most current tools and methods of
application.

Part of understanding these fundamentals is understanding the history of an
idea and how it evolved.

------
kwhitefoot
Not sure the history in itself is useful but starting from the beginning is
probably a good idea.

A solid grounding in logic should come first. Then we can skip quite a few of
the blind alleys because there simply isn't time to cover every byway from
Pythagoras onwards.

But always the foundations must be solidly built or we end up with people who
can crank the handle on the algorithm they have been taught but can't
understand what to do when it doesn't apply.

------
szul
I never truly understood the quadratic formula until I learned about the
history of the equation and its discoverer. History of Mathematics should be a
fundamental course.

------
Clubber
I think the history of anything is important. It helps the student learn _why_
things are significant, what they changed, and how things were before it
occurred.

------
floki999
Absolutely yes! Without history students are given an often misleading view of
how math constructs evolve. Not only does historical context help some
students learn and accept certain math concepts (helping answer the 'why?'),
but it also provides insight as to how math research is actually carried out.

------
zeahfj
I think the history is important for expectation management alone. It took a
world of very smart people many hundreds of years to get to where we are today
and there are still many open questions. It's probably ok that you as a
student don't understand it all at once.

------
rokosbasilisk
Yes, or at least within the context of discovery. Math never clicked with me
until I approached it this way.

------
nonbel
Does anyone have a good ref that reviews the history of dividing by zero? The
current way of dealing with it just seems so ad hoc and dissatisfying... there
has to be a long controversial history but searching around I find very little
that isn't repeating the same stuff.

~~~
bikenaga
I'm a mathematician and I'm not aware that there was a "long controversial
history". (I'll ask a colleague who teaches history of math and see if she
knows anything.) I have a lot of students who plan to teach, and I tell them
that their students (and other people) will probably ask why "you can't divide
by zero". It's important to explain what is meant by "can't". (Does it mean no
one knows how to do it? Or that some mathematical authorities issued a decree?
Of course, it's nothing like that. Those kinds of misunderstandings can arise
because people don't understand how math is done by mathematicians, because it
is done in ways that are different from the way we do things in everyday
life.)

Anyway, the explanation is simple and quick enough that I can do it in any
class where I'm discussing number systems (e.g. linear algebra, number theory,
abstract algebra). The "tl;dr" is that if you want to "divide by 0" you will
have to give up something else, and none of the things you have to choose from
are things you'd want to give up.

In more detail, suppose you could "divide by 0". Division is defined as
multiplying by the multiplicative inverse. (If you don't like that definition,
you have to explain what you'll substitute as the definition of division - and
note that mathematicians want a definition that extends smoothly to "number
systems" that may be very unfamiliar.)

So saying you can divide by 0 is the same as saying that 0 has a
multiplicative inverse - call it 0^(-1). By definition of multiplicative
inverse, 0 * 0^(-1) = 1.

On the other hand, in any reasonable number system (specifically, in any
ring), 0 * x = 0 for any x. The proof is easy - it uses the definition of "0",
the definition of additive inverse, the distributive law, and associativity of
addition. (Try it!) Therefore, 0 * 0^(-1) = 0, so 0 = 1.

If this doesn't seem enough of a contradiction, just note that it follows from
this (and the definition of "1") that x = 0 for all x. So the only "number" in
the whole world is 0. Well, that makes life simple, but not very interesting.

So: If you want to "divide by 0", you're going to have to give up one of those
algebraic axioms I mentioned. Which one would you give up? Associativity of
addition? The distributive law?

I think it's really important to explain (particularly to kids learning math)
that math is _not_ a bunch of arbitrary rules. "Not dividing by 0" is not an
arbitrary rule - it's a matter of making a trade-off.
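For readers who want the "Try it!" worked out, here is one standard way to show that 0 * x = 0 in any ring, in LaTeX:

```latex
% In any ring, 0 \cdot x = 0:
\begin{align*}
0 \cdot x &= (0 + 0) \cdot x       && \text{definition of } 0 \\
          &= 0 \cdot x + 0 \cdot x && \text{distributive law}
\end{align*}
% Add -(0 \cdot x) to both sides and use associativity of addition:
\begin{align*}
0 &= 0 \cdot x + \bigl(-(0 \cdot x)\bigr)
   = \bigl(0 \cdot x + 0 \cdot x\bigr) + \bigl(-(0 \cdot x)\bigr) \\
  &= 0 \cdot x + \Bigl(0 \cdot x + \bigl(-(0 \cdot x)\bigr)\Bigr)
   = 0 \cdot x + 0
   = 0 \cdot x
\end{align*}
```

Combined with 0 * 0^(-1) = 1, this forces 0 = 1, which is the contradiction described above.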

~~~
gizmo686
>So: If you want to "divide by 0", you're going to have to give up one of
those algebraic axioms I mentioned. Which one would you give up? Associativity
of addition? The distributive law?

Just to expand on this, mathematicians have done this, in several ways. For
example, the projectively extended real line and the Riemann sphere add "∞" to
the real or complex numbers respectively, such that 1/0 = ∞. Note that 0/0
remains undefined, ∞ = -∞, 0·∞ is undefined, and ∞ + ∞ is undefined (I am
probably missing other "oddities" of these constructions).

There is also a more general way of defining division by 0 that avoids these
undefined cases: wheels.

As bikenaga mentions, the standard definition of a "reasonable" number system
that involves addition and multiplication is a ring. In general, division by
anything is not defined because elements are not guaranteed to have
multiplicative inverses [0]. For example, the integers form a ring, but 5/3 is
not defined in the integers.

If you add the following two properties to a ring, you get an integral domain:

1) Commutativity: xy = yx

2) If xy = 0 then x = 0 or y = 0.

Again, the integers are an example. For a ring that is not an integral domain,
consider the integers mod 4, where 2 * 2 = 0.

Once you have an integral domain, there is a standard way of defining division
by any non-zero element: fractions. Informally, we say that x^(-1) is the
fraction 1/x, and a/b = x/y iff ay = bx. Addition and multiplication of
fractions are defined as you learned in grade school. [1] As you would expect,
applying this approach to the integers gives you the rational numbers. More
formally, we define the fraction x/y as the ordered pair (x,y).
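A tiny sketch of the cross-multiplication equality test from this construction (the helper name is mine; Python's stdlib Fraction implements the same idea):

```python
# Field-of-fractions equality over the integers:
# a/b = x/y iff a*y = b*x (with b, y nonzero).
from fractions import Fraction

def frac_eq(a, b, x, y):
    # The cross-multiplication test from the construction above.
    return a * y == b * x

print(frac_eq(2, 4, 1, 2))               # True: 2/4 = 1/2
print(Fraction(2, 4) == Fraction(1, 2))  # True: stdlib agrees
```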

To define a wheel, we modify the above construction slightly. Specifically, we
say that a/b = x/y iff there exist s, s' such that (sa,sb) = (s'x,s'y) or
sa/sb = s'x/s'y [2].

Addition and multiplication remain unchanged, but we define a new operation
for taking inverses: /(x,y) = (y,x). That is to say that, to take the
"inverse" of an element, you swap the numerator and denominator.

In this system, we define 0 = (0,1) = 0/1 and 1 = (1,1) = 1/1.

Division by 0 is now a simple matter: 0/0 = (0,1)/(0,1) = (0,1)(1,0) = (0,0)

Notice that, under this construction, (0,0) is not the zero element; (0,1) is.
Further, the equation

(0,0) + x = (0,1) has no solution.

If you keep poking at this structure, I am sure that you can find other bad
things that happen.
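A minimal sketch of these wheel operations over integer pairs (the helper names are mine; equality here compares raw pairs rather than the full s, s' relation):

```python
# Wheel-style arithmetic on ordered pairs (numerator, denominator).

def w_add(p, q):
    # (a,b) + (c,d) = (a*d + b*c, b*d), as for ordinary fractions
    a, b = p
    c, d = q
    return (a * d + b * c, b * d)

def w_mul(p, q):
    # (a,b) * (c,d) = (a*c, b*d)
    a, b = p
    c, d = q
    return (a * c, b * d)

def w_inv(p):
    # /(a,b) = (b,a): swap numerator and denominator, even when a == 0
    a, b = p
    return (b, a)

ZERO = (0, 1)
ONE = (1, 1)

# "0/0" is 0 * /0 = (0,1) * (1,0) = (0,0): a new element, not the zero element
zero_over_zero = w_mul(ZERO, w_inv(ZERO))
print(zero_over_zero)  # (0, 0)

# (0,0) absorbs addition, so (0,0) + x = (0,1) indeed has no solution
print(w_add(zero_over_zero, (5, 7)))  # (0, 0)
```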

[0] In fact, depending on who you ask, a ring is not even required to have a
multiplicative identity (eg. 1).

[1] This construction gives you a structure known as a field, which is, in my
opinion, the point where most non-mathematicians would start to consider the
algebraic structure a reasonable number system.

[2] Under this construction, we can also loosen the requirements of the
underlying ring. Specifically, any commutative ring will do. We do not require
that xy=0 implies x=0 or y=0.

~~~
bikenaga
The stuff on wheels is interesting! I found the paper by Carlstrom
([http://www2.math.su.se/reports/2001/11/](http://www2.math.su.se/reports/2001/11/))
and it alludes to applications, though I didn't see any specifics about
applications (to computer science, at least). Thank you for the pointer.

~~~
gizmo686
If you avoid division by 0, wheels revert back to normal fractions [0]. This
should mean that you can use them as a drop-in replacement for a rational
number datatype. Doing so should allow you to defer checking for division by
zero, possibly moving the check outside of a tight loop. Granted, this should
also be doable as an optimization of normal rational arithmetic.

The only use cases I can think of for wheels amount to them being a principled
way of adding NaN to the number system. Of course, if history is anything to
go by, a hundred years from now someone may look back on this comment the same
way we look back on people calling sqrt(-1) "imaginary".

[0] At least under the explicit construction presented.

~~~
nonbel
This conversation may have moved well above my head, but I mean start with 0/0
= "undefined". I don't see why that should be the case.

    
    
      x   = 0/0
      x*0 = 0
      
      1*0 = 0
      2*0 = 0
      ...
    

From this we see that it isn't really that x is undefinable, rather it can be
any value at all. There is apparently no issue with an equation having _two_
equally valid solutions (eg quadratic formula), so at what point are there too
many?

~~~
bikenaga
I think I see what you're asking. I think the answer is: Before we start
talking, you have to tell me what _all_ the rules are. So when you say "start
with 0/0 = 'undefined'", what is the ambient number system? The proof I gave
earlier showed the number system can't be a ring - so it's not the real
numbers, the integers, the rationals ... at least not by the standard
definitions of those number systems. When you do math, you don't make up the
rules as you go.

As far as the example you gave goes, you start by _assuming_ 0/0 is defined.
But if you're _trying to show 0/0 is defined_, you're assuming what you want
to prove. The logic isn't correct.

Note that giving 0/0 the name "x" doesn't do anything. Simply naming something
doesn't establish any fact. It just makes "x" shorthand for "0/0".

Anyway, observing that 0/0 * 0 = 0 but also 1 * 0 = 0, 2 * 0 = 0, and so on
doesn't establish any necessary connection between "0/0" and 1, 2, ... You
wouldn't conclude from "1 * 0 = 0" and "2 * 0 = 0" that "1 = 2", or that "1
could be 2", for instance. So nothing has happened. But what _could_ happen?
Remember that you started by assuming that "0/0" was defined. "Assume" in math
means you've assumed it's true. In that case, you're done, right? Its
"definedness" isn't probabilistic. And if starting with that assumption you
_did_ find out something true, it doesn't follow that the assumption is
"independently" true. (The truth of "if P, then Q" and the truth of "Q" do not
together imply the truth of "P".)

You might want to look at the post on wheels higher up this thread. It shows
what you _could_ do - namely, use a different set of rules.

This may be more than you wanted to know ...

------
kutkloon7
Meh. I feel like American math education is seriously lacking, and I doubt
that adding stuff to the common core would do any good.

------
jaclaz
Can anyone add [2003]?

------
fbreduc
absolutely

------
finid
What's next?

Do Doctors Need to be Taught the History of Medicine?

~~~
jaclaz
Certainly yes (within limits), and as many people said above that applies to
all fields. Sometimes knowing how/by whom/when something was discovered or
entered into practice gives some perspective on what you are doing today.

------
repomannwp
The best class I took in college (a million years ago) was "Men of
Mathematics". Of course, today that course would be impossible to teach, as
the SJWs would protest the math dept, etc. The text used for the course was a
book of the same name by E. T. Bell.

