That’s not a progression I am familiar with. In trigonometry and precalculus I learned Series then Limits, then in calculus derivative first then integrals. There are definitely new directions to go with Series and Limits after learning integrals, but the concepts are prerequisites to calculus (and arguably trigonometry).
That's the historical progression, not the teaching progression. That's the point. Integrals were conceived over 2000 years ago. Derivatives came Newton-ish. Formalism with limits came last.
There was another step: Leibniz's pre-limits formulation of calculus used infinitesimal and infinite quantities, which modern nonstandard analysis formalizes as the hyperreal numbers. This version is conceptually simpler and, for many people, more intuitive than the limits-based explanation. It was the standard approach for many years.
Infinitesimal calculus was discarded in late-19th-century pedagogy in favor of limits because no rigorous proofs yet existed of its general correctness, whereas such proofs had been formulated for the limits version. It was suspected that while infinitesimals were a nice intuition, they lacked rigor, which might lead to unknown errors.
It was eventually rigorously proven in the 1960s that the infinitesimals approach is indeed as complete and correct as the limits version, but by then standard pedagogy had become entrenched in limits.
I understand that nonstandard analysis is a firm footing on which to build calculus, but is it totally equivalent to real analysis?
I loved real analysis in college; for example, I found the Cantor set kind of crazy. Does that "exist" in nonstandard analysis? Or are there other mind-bending implications?
> I understand that nonstandard analysis is a firm footing on which to build calculus, but is it totally equivalent to real analysis?
This is a bit of a philosophical question which hinges on what the word "equivalent" means. I disagree with the other comment. It is not totally equivalent, or even at all equivalent.
I think about math in terms of set theory. You create sets, like the real numbers, the integers, or otherwise. You can add other elements to those sets. For example:
* You can add a variety of infinities (countable, uncountable, positive, negative, etc.). You have a self-consistent system.
* You can add imaginary numbers and think of the whole thing as a ball (the Riemann sphere), with zero on one end and infinity on the other. That gives you complex analysis, which is self-consistent as well.
All of those are helpful, useful, and powerful, and generally lead to the same place where they overlap, but they're not very compatible with each other, either formally or intuitively.
I view the two formulations of calculus as similar. Even if they lead to the same results, that's not the same as saying they're equivalent. They're different formalisms.
Integral calculus before differential calculus is how it is done in Apostol's "Calculus". Here are the first several chapter titles to give an idea of what is covered when.
Introduction
The Concepts of Integral Calculus
Some Applications of Integration
Continuous Functions
Differential Calculus
The Relation Between Integration and Differentiation
The Logarithm, The Exponential, and The Inverse Trigonometric Functions
Polynomial Approximations to Functions
Introduction to Differential Equations
Complex Numbers
Sequences, Infinite Series, Improper Integrals
Sequences and Series of Functions
(the rest of volume I is vector algebra and calculus and linear algebra).
That book was widely used at Caltech, MIT, and several other top schools for decades.
Starting out with integration makes so much more sense, and it's way more graspable from a geometric standpoint (what's the area under this line?) as opposed to the weirdness (and relative abstraction) of the derivative.
Derivatives have nice physical problems to solve like "find the maximum" and "find the relationship between these two variables" which you can find by measuring a linear slope a lot of times. Learning derivatives starts with linear regression, which you do when learning about lines on a Cartesian plane.
By the time you know about drawing lines on planes to find the integral area underneath, you've already figured out the derivative, so you know what direction to move your pen in next while drawing the line.
> problems to solve like "find the maximum" and "find the relationship between these two variables"
I think these are actually pretty abstract, so it's hard to grasp exactly what a derivative is until maybe Calc II or III -- imo, the main problem is that it just seems like mathematical black magic: you need to define a limit, you need to use that definition to subtract two slopes, the complexity of which magically kind of vanishes, and you end up with a "method" for spitting out derivatives. Whereas finding the area under a curve seems like a common-sense thing to do: you take smaller and smaller slices of a thing and add them all up.
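That "smaller and smaller slices" intuition is easy to check numerically. A minimal sketch (function names are my own), where left-endpoint rectangles under f(x) = x^2 on [0, 1] approach the exact area of 1/3 as the slices shrink:

```python
def riemann_area(f, a, b, n):
    """Sum n rectangles of width (b - a)/n, sampled at left endpoints."""
    width = (b - a) / n
    return sum(f(a + i * width) * width for i in range(n))

f = lambda x: x * x
for n in (10, 100, 1000):
    print(n, riemann_area(f, 0.0, 1.0, n))  # approaches 1/3 as n grows
```

No limit machinery is needed up front; the convergence is visible just by increasing n.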
I do agree that if you really learn anti-derivatives (which, by all accounts, is harder), you basically get derivatives for free.
I totally understand your position, and I completely agree that integration is way more complex, often doesn't work, and very quickly gets you in the Gamma function danger zone, but I still think it should be taught first (maybe just looking at elementary functions) because it's way more intuitive in my opinion.
There's a reason it was developed first, historically speaking.
Freshmen calculus I does not have to immediately jump into mechanically difficult integrals. You can leave those for Calculus II and concentrate on the fundamentals of integration and differentiation in Calc I.
You completely missed my point. I mean the actual mathematical analysis concepts involved in integration (measure theory, Riemann integration, principal value) are far more intricate than the concepts involved in differentiation.
This is the same progression that I learned, but I am not sure it is the best one. I wonder if there has been any rigorous scientific study on the pedagogical benefits one way or another?
Why do you think we don't? Different teachers, institutions, and educators have used differing curricula, systems, and styles of education for all of history. In modern times, in civilized countries (and states), they are even required to update their knowledge over time and interact with other educators and experts to improve their techniques, knowledge, and systems. Large government education bodies regularly try to put a scientific spin on the process, but it's often clear they aren't being rigorous or honest with themselves.
If your local educators are not required to participate in this system of continuing education and improving techniques, that's a local government decision and you should seek to fix that.
Fixing it is often hard in the States because there is a not insignificant number of Americans who see schools as undermining their authority and see education as a useless and undesirable endeavor, often as a "radicalizing" and "socialist" influence and institution, so they fight back against any improvement of schooling. There are states in the union that allow pretty much anyone, with no qualifications or certifications, to become "educators", and other states that require a master's degree plus 12 credits of coursework every few years to remain an educator. Outcomes imply a correlation between those requirements and the results, and yet the states with much lower bars of entry do not raise them. Education in the States is a political issue, for some stupid reason. There are people who believe, for whatever reason, that middle schoolers are being taught "Critical Race Theory", as if that weren't a college-level course that the vast majority of humanities majors don't even encounter.
Different strokes for different folks and all that but the concept of infinity doesn't really bother me that much because it doesn't really exist to me as a number more than it is a fact of the unbounded-ness of some set with a total ordering. It is more a statement of potential rather than actual existence, that given an infinite set, you can always find a "larger" element, which to me is just a result of the way it is defined.
Infinitesimals are a bit more difficult for me, because physics (the real world) doesn't actually need true continuity: many continuous quantities in physics (fluid densities, temperatures, etc.) aren't really continuous at scales small compared to the problem size, yet that isn't needed to model such systems well. There are a few pathological examples mathematicians have come up with that do require actual continuity, which is interesting and fascinating, but most physical systems do not need real, actual continuity.
On the same token, I guess (countable) infinite sets aren't really needed either.
I think that's why it's so handy to start with epsilon-delta limits, or even better, sequence limits. Once you can trade "infinitely" for "arbitrarily" and see when and where the substitution works, you're golden. Of course there are other things in math which might rightfully be called "infinity", but they don't come up in high school calculus.
While it’s still amazing which videos I can still find from 2005 or 2006, there’s not a week that goes by where I’m not kicking myself for not being quicker to the trigger to download something.
My del.icio.us bookmarks go back to 2003. The web of course hasn't gone anywhere, but a significant percentage of those links are either gone or no longer reachable at their original address.
First, my mind doesn't work at the speed of that presentation, so I downloaded it in NewPipe and played it back at about 0.7x its original speed in VLC. After that it was fine.
I can't say I was always good at math as my marks varied all over the place but I never really had problems with calculus. Sure, some of its mechanics were a pain—awkward integrals for instance—but the concepts usually made sense to me.
As I've mentioned on HN previously, I'm a strong believer in teaching a simplified calculus to kids at a young age (in primary school). Early familiarization and appreciation of concepts such as limits and rates of change before they enter high school I reckon would greatly ease their burden of learning the subject.
This needn't be complicated, introducing calculus to kids with, say, versions of Zeno's dichotomy paradox can be easy and lots of fun for them—especially so when they're asked to solve the seemingly intractable problem of the frog in the center of a pool who never makes it to the bank as each successive jump he makes is only half the distance of the previous one.
Kids can clearly see the poor frog never makes it to the bank given those rules but their eyes light up when you tell them there's a 'nice' branch of math that can save him called calculus which says his feet are too big for him to drown.
Thus we've introduced calculus as a friendly, helpful subject at a young age and not the scary, frightening subject it's so often portrayed to be. In this example we've introduced them to the concept of infinite series (and infinity), limits, and how calculus can solve problems that cannot be solved by simple arithmetic alone.
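For the curious, the frog's predicament is a geometric series, and a few lines of code (names my own) show the partial sums creeping toward the bank at distance 1 without ever reaching it:

```python
# The frog's jumps form the geometric series 1/2 + 1/4 + 1/8 + ...
def frog_position(jumps):
    """Total distance covered after the given number of halving jumps."""
    return sum(0.5 ** k for k in range(1, jumps + 1))

for jumps in (1, 2, 5, 10, 20):
    print(jumps, frog_position(jumps))
# Every partial sum is below 1, but the limit of the series is exactly 1.
```

This is the same picture kids get from the story: finitely many jumps never arrive, yet the limit does.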
I think one power of calculus (and everything else) is that you don't need the history. The same way I don't need the history of the hammer to use a nail. What drives my interest is the examples. At the end, the book author talks about students not paying attention until the teacher starts talking about examples. To me, that suggests you should first contextualize and create a need for the tool, then describe the tool. Not to start introducing names of dead people, no matter how great they were. (At least where I grew up, this will always devolve into the curriculum requiring you to know those names and dates, draining focus from what actually matters.)
I disagree. The biggest thing that can help you learn something complex is by introducing it as it was discovered, and making someone feel like they could have come up with it on their own.
I don’t see why the history of discovery of something should be a particularly good way to rediscover it.
As a concrete example, if someone asks “how do we know the Earth is round” or “how do we know the planets orbit the sun”, one could go through ways to figure this out with math and minimal technology. But one could also point out that we have spaceships and probes and have actual pictures! You can see the Earth from a satellite or the ISS or the moon, and you can see the planets from space!
I agree that a process of rediscovery can be a useful teaching technique, but that doesn’t mean that rediscovering something the way it was first known to have been discovered is the right approach.
Another concrete example of the conflict is classical vs quantum physics which is nearly exclusively taught as the historical story of the quantum revolution or whatever you call it around 1900.
It would be trivial to skip the historical story and just "yo here's a FET transistor, a solar cell, and a vacuum tube, and the math just works, so get used to it" but nobody is ever taught that way.
Would be interesting for someone to attempt to write a non-historical physics textbook.
On the other hand, in IT, this style of learning is considered normal. "This is how the kool kids do things today. The past does not matter and probably didn't exist anyway. The end."
>On the other hand, in IT, this style of learning is considered normal. "This is how the kool kids do things today. The past does not matter and probably didn't exist anyway. The end."
And yet nand2tetris is one of the most common suggestions for people who actually want to figure out how computers work. Because you actually are posed with problems that real world engineers used to struggle with, and are allowed to play with the problem before finding out the answer. It builds a much deeper understanding than "it works like this because it is"
A downside to the modern approach is that "plug the equations into a computer and let it churn for a bit" plays a big role, I think. This seems a bit annoying from a teaching point of view:
* “here’s what the computer came up with” doesn’t really build as much intuition as doing the equations out by hand
* there’s enough material in “how to plug it into a computer and make it run quickly” for a whole degree
> Would be interesting for someone to attempt to write a non-historical physics textbook.
I recall some prominent physicist saying they've never read Einstein's original papers on relativity; and they don't have any real desire to, since we understand it much better now, and have better examples, notations, etc.
Most physicists have never read them, fwiw, as interesting as they may be for their historical role. Most classic papers go unread, and it’s not a bad thing pedagogically. How things get sorted out is usually a huge mess and following the real history can be an enormous distraction (and can confuse and misdirect students).
While that is true, I would highly encourage everyone to read the original EPR paper (Einstein, Podolsky, Rosen); it is so amazingly clear in its writing that it can be understood even by laymen.
I've been interested for awhile in a distinction between math and philosophy here. Just descriptively speaking, we don't tend to teach math from original source (at least, not well-established math like you'd see in high school and early college). We don't teach calculus out of Newton/Leibniz/Cauchy/etc, we don't teach Fourier analysis out of Fourier, etc. Maybe geometry students will work from the Elements, but I certainly didn't.
But in philosophy, students do read everyone from Plato to Rawls in the original. There are (lots of) supplementary texts, of course, but they're companions.
I think that shows an interesting divide that the way we actually think about philosophy is not just a catalogue of arguments and positions one could take, but that the famous works are things worth reading unto themselves. There are a few math papers like that ("God Created the Integers" tries to anthologize them) but they are the exception rather than the rule.
“How do we know the world is round” is an interesting example to use. Eratosthenes got an estimate for the radius of the Earth that was pretty close using, I dunno, sticks and his eyeball or whatever ancient Greeks had.
The history is useful there, I think. First off, it is motivating — that’s a pretty big question, really shows what math can accomplish without much hardware. Second, it provides some sort of useful constraints if you are trying to work out the computation yourself. In a textbook the problems are always a bit artificial, some info will be given, some hidden. “Anything an Ancient Greek would have” is a bit more real-world, you can see where the constraints come from.
Eratosthenes did it by comparing the angle of the shadow cast by a stick on the summer solstice at noon in two cities, Alexandria and Syene (now Aswan), which are about 800 km apart. What was special about Syene was that it is so close to the Tropic of Cancer that the sun would hit the bottoms of wells on the solstice, meaning that he only needed one stick, in Alexandria, and the distance between the cities to do his estimate. His error was about 2%.
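For anyone who wants to replay the estimate, the arithmetic is a one-liner (the function name and the rounded inputs of 7.2 degrees and 800 km are mine):

```python
# The shadow angle at Alexandria is the fraction of a full circle
# separating the two cities, so circumference = distance * (360 / angle).
def circumference_km(shadow_angle_deg, distance_km):
    return distance_km * 360.0 / shadow_angle_deg

print(circumference_km(7.2, 800))  # ~40000 km; the modern value is ~40075 km
```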
He's also famous for the sieve of Eratosthenes for finding prime numbers, and for figuring out the Earth's axial tilt.
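The sieve is still the standard first example of an efficient prime-finding algorithm; a straightforward version (variable names mine):

```python
def sieve(n):
    """Sieve of Eratosthenes: return all primes up to and including n."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Cross off every multiple of p, starting at p*p (smaller
            # multiples were already crossed off by smaller primes).
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]

print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```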
Darn, at some point I started imagining he had a protocol for synchronizing clocks over a distance, assuming the difference was in longitude (e.g. symmetric NTP over carrier pigeon).
Yea, recalling the book of that name might have contributed to my drift of recollection. The longitude problem was solved earlier on land (but not at sea) in terms of clock synchronization.
You can get a vaguely credible estimate by standing on a (small!) hill and measuring the distance to the horizon. Works better if you have a body of water handy, because bodies of water are nice and flat.
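Concretely, for an eye height h much smaller than Earth's radius R, the horizon distance is roughly d = sqrt(2*R*h), so turning it around, measuring d from a known height gives R = d^2 / (2*h). A quick sketch (the 100 m hill and 35.7 km horizon figure are illustrative values of mine):

```python
# Estimate Earth's radius from the observed horizon distance at a
# known eye height, using the small-height approximation d ~ sqrt(2*R*h).
def earth_radius_estimate(horizon_distance_m, height_m):
    return horizon_distance_m ** 2 / (2 * height_m)

# From a 100 m hill the horizon is roughly 35.7 km away:
print(earth_radius_estimate(35_700, 100) / 1000, "km")  # ~6372 km
```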
I think it can depend on the particular discovery whether we should teach "here's how the first person figured it out" or skip right to "here's how a later person found a much better, easier to understand approach".
Things were usually discovered in a confusing way that kind of made sense at the time, but had false starts and conceptual limitations driven by people still being in an older mindset when they were trying to discover it. A good teacher sands off those edges, leading to an oversimplified history, but not starting from the firm conceptual footing we only got later on still makes things overly difficult to grasp. Good teaching builds on concepts, even if history took a messier route.
>> Things were usually discovered in a confusing way that kind of made sense at the time, but had false starts and conceptual limitations driven by people still being in an older mindset when they were trying to discover it
To know this gives us some perspective on our own biases/limitations and not take things as gospel.
The way calculus was discovered was via highly geometric methods that are largely impenetrable to readers today. So the historical development of calculus needs to be translated into modern notation and concepts, at which point it basically becomes examples.
While calculus is actually easy to develop and motivate, the way we "discover" it today is nowhere close to how it was originally discovered.
In my opinion, calculus is most easily motivated via physical examples. Here is a curve, it represents some information, and what are some things we'd like to know about this curve and thus the information it represents? What are some things we can do with that information? Such motivation can be more abstract and not rely on physical examples, but I think the physical examples help tie it to the so-called real world and somewhat mirrors, in spirit, how and why calculus was originally developed.
That's not the biggest thing that can help you learn something complex. Practice is the biggest thing. And there are many variations of what motivates people to practice.
The comparison between a hammer and math doesn't make sense; math is much more than a tool. Personally, I'm able to learn best when I understand the context and the "why" questions that people had which led them to the answer. I know other people learn differently, so to each their own. Also, I agree with you that the traditionally-taught history is awful because it lacks relevance. It's not about "how is this useful for me" or even "why should I care" but rather "why did they decide to act that way?"
I think the dual role of means (tool) and end in itself is part of the reason mathematics is so difficult to motivate. There is this constant tension within the pedagogy between the desires to teach the beauty and the practicality of mathematics.
From what I’ve seen, the beauty side tends to come from mathematics teachers and professors who have developed a passion for the subject and genuinely want to share that passion with their students. The practicality side, on the other hand, seems to be driven by industry, administrators, lawmakers, parents, and others involved in the curriculum-setting process.
Ultimately, both sides struggle mightily for students' attention, and a typical result is a collective sigh from the class. If we're to make any progress, I think we need to recognize that student motivation comes from within. Some students are genuinely attracted to the beauty of mathematics, while others are interested in different subjects (science and engineering) for which mathematics is only a prerequisite. Perhaps one approach for dealing with this disparity might be to offer students a choice of projects with differing mathematical styles (proof vs. calculation).
Another thing that makes calculus hard to teach today (and not unrelated to the points being made) is that while a couple generations ago, every calculus student also took calc-based physics, that is no longer true today. With the way we compartmentalize teaching today, it's hard not to teach entry-level math abstractly, and that's only going to work for a small subset of students.
Learning names for the sake of learning names is never helpful. However the history of mathematics can help to contextualise the material and its motivation. For example, group theory has myriad modern applications. It can be (and usually is) presented in the standard mathematically-polished form, beginning with the axioms and perhaps using matrix groups or simple geometric symmetries as examples. For many learners this is a perfectly sensible way of approaching it and they wouldn't be interested to know more, yet I found it rather clinical and dry.
As I'm sure you know, group theory arose as a novel yet natural way of investigating the conditions under which polynomials can have closed-form solutions, and when Galois began to sniff this out it allowed him to get to grips with the impossibility of a closed-form solution to the general quintic polynomial: a problem which had bedeviled mathematicians for centuries, solved by a theory which crystallised a more structural way of approaching abstract mathematics. Even though Galois theory is a relatively niche topic within the broad context of academia, for me it draws out the true character of group theory in a way that matrix groups don't, and neither do contrived examples formed from the natural numbers under various quotients. And it clearly solves a problem that required a new way of thinking.
There's certainly a balance to be struck here. If you fixate on historical origins then you aren't engaging with the reasons why the topic is still relevant today, so you risk missing the point. It also slows down the pace at which you can absorb the tools needed for applications. But if you only engage with the modern approach you risk building a disconnected, lifeless archipelago of knowledge, unable to see the beautiful links that unify so much of mathematics.
I totally second this. As someone who has had a tortuous relationship with mathematics, I always loved the beauty of some concepts but, past a certain point, got lost, mainly because the underlying knowledge base was never really understood. Getting to know some historical facts would have helped me be more critical of myself (rather than just copying and pasting formulas).
Years later I surprised myself spending time on new math problems thanks to YouTube math content (3Blue1Brown, Michael Penn, MindYourDecisions, etc.), which made me reboot my knowledge.
The challenge here is how it is instilled during your schooling.
From what I remember, complex numbers were introduced to me as a rule that needed to be acknowledged (with no incentive to look further).
The Welch Labs video below made me reconsider my own shortcomings.
While I agree with your sentiment, studying abstract algebra at the undergraduate level we never got more than a broad sketch of a proof of the general insolubility of the quintic. It more seemed an opportunity for the professor to tell the life story of Galois -- mathematical genius and political firebrand killed in a duel over a woman at age 20.
That being said, it is difficult to find a motivating example for much of elementary abstract algebra -- which is doubly hard because, for many students, abstract algebra is a new level of abstraction, math without numbers, and it takes something to make that leap.
As others have argued, one can learn mathematics without its history, but the history is very helpful for motivating many of its abstractions. Good examples of this are some of the textbooks of David Bressoud. In "A Radical Approach to Real Analysis", concepts are introduced in historical order: the book starts with the heat equation and its solution by Fourier series (which was rejected at the time), in contrast to most books, which start with the real numbers and limits.
I think something quite different than you, but that isn't important. The important thing is that different people learn differently and are motivated in different ways. Curricula should take this into account and accommodate different learning styles. Unfortunately, they usually don't.
If you don't mind sharing, what grade-level, decade, location were you taught this?
The key part about the pizza example is the demonstration of mathematical thinking completely disparate from what I experienced at a Catholic high-school in Chicago during the 2000s.
The diameter of the universe isn't even close to the highest number one could use for a real thing. Try the number of atoms in the universe... but I doubt such a threshold exists
Let me know, not if anyone here can do Calculus (that is ordinary; who can't?), but rather whether you actually personally use Calculus, and please describe generally an example of needing Calculus to figure something out in the last 5 years.
I think my point is that the history of mathematics is exponentially more interesting than mathematics. We like mathematics, but we like stories far more than we like whatever mathematics is.
I do calculations on Maxwell's equations all the time; that's calculus. Doing CAD, or any finite element calculation -> calculus. Modern engineering would not be possible without calculus. Just because software engineers mostly use linear algebra doesn't mean that most other technical disciplines are the same.
> Modern engineering would not be possible without calculus.
Was not my question. Nearly everyone uses computers. Computers would not be possible without electronics. How many computer users work with Maxwell's Equations or have even heard of Kirchhoff's circuit laws? A very tiny percentage, undoubtedly.
The reason I asked was the Penn State professor's epiphany that no engineer among the many he asked actually used calculus.
How many programmers use linked lists? I just use the list type in python, it's just "list" not a "linked list". I don't have to deal with individual nodes or pointers.
I don't have to learn sorting algorithms, I just use list.sort or sorted.
In my personal life, if I talk about the economy with someone, or listen to an expert talk about it, I use the conceptual framework of calculus to evaluate the relations between quantities. For instance, understanding changes in demand of various products in the face of inflation based on the price sensitivity of those products.
At least if you've actually learned the concept of derivatives and integrals, it wouldn't be that hard to refresh enough for a given use case. I'd agree that most of us that have taken a year of university calculus don't know calculus.
To your second point, I would disagree: we learn maths that we may likely never directly use not because of historical fascination, but because progressively expanding students' mathematical knowledge serves the purpose of meta-learning, learning to learn.
As to your question I've touched on a little for manipulation of some of the classic equations in ecology and population biology, growth curves, population growth, etc. Barely needed to crack an introductory textbook to do it though. Had I learned real calculus and maybe differential equations there is some really interesting modelling I'd be able to do in this space, but instead I'm stuck seeing the diffuse shadow of the idea on the wall of Plato's cave. That's not a wistful lament though because to your point, time spent boning up on linear algebra instead of calculus would provide at least an order of magnitude more utility for my work.
> To your second point, I would disagree that we learn maths that we may likely never directly use not because of historical fascination
That was not my point. No one learns math because of historical fascination. The point was simply the stories about the math are more interesting than the math itself, which isn't any stretch of the imagination.
That's comparing a tool to an insight though, like sure 'grammar' isn't as interesting as a poem. Math is interesting because despite being axiomatic there are all of these emergent things it can correctly predict about the world. To mix metaphors, there are very interesting stories that are at least in part in the grammar of math.
I did a module for calculating financial figures and values in work. Knowing calculus was very useful - both to understand what I am doing and to figure simplifications where possible.
Imo, the history of mathematics is not interesting unless you are interested in mathematics itself too. It is the sort of thing that people who are good at math and interested in it care about, but it is otherwise completely uninteresting to anybody else.
Unless you turn it into history tabloid, which you can do in pretty much any field, but then it will lose any meaning. At that point, you might as well cover Paris Hilton or Kanye West.
Do tools that use calculus count? If so, switch-mode power supply design uses quite a bit, mostly differential equations. I let the CAD software do the numeric simulations, but there's always some hand calculation needed to pick initial component values.
Well, the answer should be yes. Otherwise you would have to conclude very few people use arithmetic in their work which seems peculiar. A tool based on calculus could be rather abstracted from the math though and the user not even know they are using calculus.
I use calculus quite a lot, basically any time I need to go beyond an existing solution I could look up in a textbook. That is, all the interesting problems! For example in writing this article, especially details in footnotes 4 and 13: https://jbconsulting.substack.com/p/voiding-the-warranty-how...
When I want to write or tune a Kalman filter or some other control system, I need to work out the differential equations that govern a system, often from first principles if the system isn't a simple textbook example like an inverted pendulum. Sometimes I need to design that system myself or find out what change to its dynamics will fix a noise problem (eg. add a capacitor here, etc), and without an understanding of calculus I would be searching in the dark or stuck trying to look up known solutions in textbooks.
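For readers who haven't met one, even a toy one-dimensional Kalman filter shows the basic blend-by-uncertainty update. This is far simpler than the differential-equation models described above; all names, the constant-signal setup, and the noise values are illustrative choices of mine:

```python
# Minimal 1-D Kalman filter sketch: estimate a constant true value
# from noisy measurements, with no process noise.
def kalman_1d(measurements, meas_var, init_est=0.0, init_var=1000.0):
    est, var = init_est, init_var
    for z in measurements:
        # Blend the current estimate and the new measurement,
        # weighted by their respective uncertainties.
        gain = var / (var + meas_var)
        est = est + gain * (z - est)
        var = (1 - gain) * var
    return est, var

est, var = kalman_1d([5.1, 4.8, 5.3, 4.9, 5.0], meas_var=0.25)
print(est, var)  # estimate converges toward 5, variance shrinks
```

A real filter adds a prediction step driven by the system's dynamics, which is where the differential equations come in.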
It also comes up a lot when I want to do probability or statistical calculations, because I end up needing to integrate various functions or products of functions to find means / medians / covariances / etc. Sometimes I can get away with leaning on Mathematica to do those integrals for me, but sometimes I can't, like when one of those functions is an arbitrary input like f(x). Even when I can use Mathematica to perform the integral, I need to know calculus for that to be any clearer than a magic spell. Without knowing calculus, how would I ever guess that I can solve my problem by typing "ln(x0-a)/arctan(x + a)" into wolfram alpha and then plotting the result?
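The mean/covariance-by-integration idea can be sketched in a few lines, assuming a made-up pdf (f(x) = 2x on [0, 1], not anything from the comment) and a plain midpoint rule in place of Mathematica:

```python
# Compute E[X] and Var[X] of a distribution by numerically integrating
# x*f(x) and x^2*f(x). The pdf is an illustrative example: f(x) = 2x on [0,1].

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

pdf = lambda x: 2 * x                                  # integrates to 1 on [0, 1]
mean = integrate(lambda x: x * pdf(x), 0.0, 1.0)       # exact answer: 2/3
var = integrate(lambda x: x * x * pdf(x), 0.0, 1.0) - mean ** 2   # exact: 1/18

print(mean, var)
```

The point stands either way: whether the integral is done by Mathematica, by hand, or by a crude sum like this, you need to know what quantity to integrate in the first place.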
Recently I needed to work backwards: find a function that satisfies certain properties, like being smooth and monotonic, zero at the origin, having a certain area within certain bounds, having a certain covariance, nowhere going to infinity, etc. This sounds very abstract but was the key to cracking a concrete signal processing problem.
And then there's all the optimization work, where I need to write numerical code that will look for the maxima or minima of some unknown function, to eg. design the best possible motor you can pack into a box of certain size and weight. 3rd-party libraries are certainly helpful but I usually find I need to supplement them with domain logic to keep it from getting stuck in a local optimum or rule out physically impossible search spaces.
A few months ago I needed to write a finite-element solver to handle soft, thin woven fabrics, where the material can support tension but not compression stresses and the material properties are different depending on the axis of the weave. I would have been at a total loss to get that done without calculus.
Even when I am pulling something out of a textbook, most engineering textbooks past the first-year-undergraduate level will presume some familiarity with calculus.
Not to mention properly thinking about current issues like inflation, CO2 emissions, etc, where the discussion bounces back and forth between integral and derivative quantities.
I'm not even a "math guy". I didn't do well in math in school, and I always struggle a lot to learn new math until I have an application in mind that I need it for. It's only because of how often I need it for my projects that I've ended up learning a lot of math, not the other way around.
Heh, ditto. I don't think I got any new info that I didn't get in high school calculus, but it is a well-done review of things I hadn't seen in a couple of decades.
I got calculus in college, from mathematicians. :-(
We spent about 15 minutes (well, ok, one lecture) on series, then a couple weeks on limits, then limits and differential calculus, then the rest of differential calc. (throwing the limit stuff away), then limits and integral calc., and finally the rest of calculus.
Limits are a horrible way to learn calculus if you're trying to understand it at the same time as doing the algebra. :-)
My argument has always been that it was a huge mistake to have unmotivated math experts (it's remedial, annoying work for them, and they have probably never struggled with math in their lives) teach entry-level math to lots of people who don't "intuitively" get things the way the teacher did 20 years ago. These professors never seem to understand what it's like to NOT understand calculus.
The anecdote I always reach for: find someone who has been taught a higher-level math subject both by a university professor and by an adjunct or local community college professor who might not have higher-level knowledge of math. Many people prefer the "less expert" teacher.
Consider if you have been in software development for a long time, and how hard it is to remember what NOT understanding a variable was like.
Yea, it has gotten worse in the US thanks to the common core bullshit. My daughter’s math education has been eroded with shitty ed tech apps like the ones from MHM[0]. I’ve been able to support her but there’s many kids that are not so fortunate.
Most of the hate seems to stem from the changes in mathematics teaching. Instead of focusing on rote memorization, the curriculum wants students to understand the concepts, thus it teaches more ways to tackle a smaller pool of topics. I believe this idea comes from the way that mathematics is taught in countries whose students outperform Americans. A friend of mine who was schooled in eastern Asia claims his daughter is learning the same methods he did growing up.
FWIW, I learned some cool techniques that I didn't know about when my friend showed me his daughter's homework. But I can see how parents of fifth graders who aren't able to help their kids with their homework because they don't understand it / never learned that way would be angry.
States all had their own standards, which were largely the same, but just kind of different enough. Prior to Common Core, private companies sprang up to fill the need of mapping all of the different state standards to a common standard.
Modern "teaching" practices are just pretending to teach concepts; they're mostly based on the expectation that the students will somehow "rediscover" everything on their own. You can't truly understand concepts without doing quite a bit of solid practice. Many teachers will fail to provide the needed guidance, if only because they often never properly learned the subject themselves; and "Common Core" ends up being used as an excuse for substandard teaching.
It could be worse but one thing that constantly bothers me is the stupid new vocabulary. One of the worst examples: "number sentence". It means what you, an adult, might guess it does (well, it means one of the things you might guess it means) but why use it at all? They introduce it before the kids have a decent idea of what a normal sentence (what, a "word sentence", I suppose?) is. Many of the symbols in them aren't numbers (+, -, =). They don't really act like sentences or serve the same purpose. It's misleading as hell and the kids don't even have the context that might, even hypothetically, make it useful for scaffolding, when they introduce it in kindergarten.
They do this for lots of stuff. There's a whole Common Core Math jargon the motivation for which I just can't grasp.
Also, much of the approach is effectively what you do with kids who are struggling in math—you shower them with techniques and explanations for the same thing, hoping one will stick. The "show 5 different ways how you could have solved this" worksheets drive gifted kids, at least, batty, and that's most of the work. "I fucking solved it, I've already shown you a hundred times I understand this other technique (which I'll never use because I also know several better ones), why are you still bothering me?" They'll have them do two problems with all that extra stuff, in the time and space the kids could have practiced ten problems. It's making my kids hate math in early elementary. Wonderful. Just wonderful.
[EDIT] To illustrate by example why I find "number sentence" in particular so deeply stupid: would anyone think a good approach to introducing sentences to very young children, in language classes, might be to label them "word equations" or "word formulas" or "word algorithms" or anything like that? Would anyone think that would be an improvement? God no, it's plainly a bad idea—but that's exactly what someone, somewhere, decided to do, in reverse.
> One of the worst examples: "number sentence". It means what you, an adult, might guess it does (well, it means one of the things you might guess it means)
Specifically, it means “equation or inequality”.
> They don't really act like sentences or serve the same purpose.
They act exactly like sentences, and serve exactly the same purpose, because they are declarative sentences about the relationships between numbers.
> It's misleading as hell and the kids don't even have the context that might, even hypothetically, make it useful for scaffolding, when they introduce it in kindergarten.
The idea is that they are mutually reinforcing, not sequential, concepts.
It also, as a sibling comment notes, has nothing to do with Common Core; it's an orthogonal development in pedagogical approach to the Common Core standards.
> The "show 5 different ways how you could have solved this" worksheets
... also have less to do with the Common Core standards than with lazy development of the particular materials (demonstrating proficiency with different articulations of operations and ways of solving problems is part of Common Core, to an extent, but even where it is, doing it by using them all on the same problem, in the same assignment, and doing that for a substantial number of problems on one assignment, is just bad exercise design.)
> To illustrate by example
What you offer there is an analogy, not an example. Analogies don't really act like examples and don’t serve the same purpose.
> They act exactly like sentences, and serve exactly the same purpose, because they are declarative sentences about the relationships between numbers.
"Exactly" is simply incorrect in both occurrences, here.
> It also, as a sibling comment notes, has nothing to do with Common Core; it's an orthogonal development in pedagogical approach to the Common Core standards.
All this hit alongside Common Core. Ask teachers and they won't quibble with lumping this practice in with Common Core, since that's how it's expressed in practice, though some might be aware that it's not in the standards. Source: I know a lot of teachers.
> What you offer there is an analogy, not an example.
It's both, so you're technically wrong, which I gather you think is the worst kind of wrong. Analogy's probably the better word here, though, sure, if I was only going to use one.
> Analogies don't really act like examples and don’t serve the same purpose.
They act exactly like them. Using your sense of "exactly" from above.
Well, if you take Montague semantics seriously then a "word sentence" is precisely a very broad generalization of a "number sentence", mostly differing by modality (accounting for "possible worlds", time, epistemic state, indexicals, attitudes etc. etc.) The basic "algorithmic" approach to language and words was already known to the ancient Indian grammarian Pānini. It's incredibly sad how few in the supposedly advanced West are aware of the countless innovations India brought to so many fields of science, art and culture.
If something isn't part of the formal Common Core standard, but is part of the "reference implementation," then practically speaking calling it part of Common Core is usefully descriptive.
We use this kind of synecdoche all the time when we do things like say Blub is a high performance programming language. If you want to be pedantic it's actually the Blub language, the Blub compiler, and the Blub runtime as a whole that determine whether or not Blub programs execute performantly. Sure it's useful to be aware of the distinction, but it's annoying and even disruptive to get caught up on it when everyone knows what is meant.
Number sentences existed prior to Common Core, so saying that number sentences is part of Common Core is not usefully descriptive and simply serves to confuse the issue.
Most people's complaints stem from assuming their local schools implementation of common core, and who they purchased the teaching materials from === "Common Core"
There are some SERIOUSLY garbage workbooks available for Common Core curricula. Some of them are obviously trying to touch on the understanding that Common Core is trying to push, but seem to have been written by someone who never understood that concept themselves, so they are basically nonsense word salad.
I remember when I was a kid, our state pushed a different direction in math education about helping students understand how to turn situations and problems into a mathematical equation to solve. Local parents decried the situation as "New Math" (importantly this stuff was distinct from the actual 70s era "New Math"), and that it was terrible and blah blah blah the same thing that happens any time you have poorly educated parents struggling to understand their children's educational material.
Of course, we children did absurdly well in educational terms, literally a stand-out class of students in terms of educational outcomes, maturity, flexibility, and standardized test scores. Unfortunately it would be difficult to tell what the heck caused the improvement, because MANY weird things happened to my class (we were called the "guinea pig class": we had second-language schooling from first grade, our middle school scheduling was changed up to give us more specialized teachers and more classroom variety, and we always got brand new curricula to test out, along with just plain notable demographic changes happening in the area at the time).