When I started a gamified music discovery company, I actually ended up using DiffEq to define a scoring algorithm that would produce continuously varying point values based on time series input data. I had to relearn how to do it, but the concepts made far more sense with a real application.
That said, even if you need a numerical solution it will still often require a lot of simplifications in order to be tractable. Multiphase fluid flow, for instance, relies on tons of physics simplifications and empirical correlations in order to make numerical techniques viable.
The problem with the analytical approach to differential equations is that it doesn't scale well, and you don't know beforehand whether the approach will work, so you might as well use the numerical approach from the start.
What I mean is that typically an electrical engineer will convert L and C elements to complex impedances (which depend on the frequency through s), and will then compute as though the elements are ordinary resistances. The expression "d/dt" isn't used in the entire analysis.
> the phasor transform thus allows the analysis (calculation) of the AC steady state of RLC circuits by solving simple algebraic equations (albeit with complex coefficients) in the phasor domain instead of solving differential equations (with real coefficients) in the time domain
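As a concrete sketch of that algebra-instead-of-calculus idea (my own illustration, not from the comment above): in the phasor domain a series RLC circuit is just a sum of complex impedances, Z = R + jωL + 1/(jωC), with no d/dt anywhere.

```python
# Treating L and C as complex impedances turns a series RLC circuit
# into arithmetic on complex numbers. Component values are made up.
import cmath

def series_rlc_impedance(R, L, C, omega):
    """Z = R + jwL + 1/(jwC) for a series RLC circuit at angular frequency omega."""
    return R + 1j * omega * L + 1 / (1j * omega * C)

# Example: 100 ohm, 10 mH, 1 uF driven at the resonant frequency
# omega0 = 1/sqrt(LC), where the L and C terms cancel exactly and the
# impedance is purely resistive (zero phase).
R, L, C = 100.0, 10e-3, 1e-6
omega0 = 1 / (L * C) ** 0.5
Z = series_rlc_impedance(R, L, C, omega0)
print(abs(Z), cmath.phase(Z))   # ~100.0 ohms, ~0 rad at resonance
```

The same trick works for any network of R, L, C elements: replace each element by its impedance and apply the ordinary series/parallel resistor rules with complex arithmetic.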
You can show that by generalizing calculus so the values are functions rather than real numbers, then trying to find a max/min using the functional version of dy/dx = 0, you end up with an ODE (viz. the Euler-Lagrange equation).
This also motivates Lagrange multipliers which are usually taught around the same time as ODEs. They are similar to the Hamiltonian, which is a synonym for energy and is derived from the Euler-Lagrange equations of a system.
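For readers who haven't seen the variational step being described, here is the standard textbook form of it (generic calculus-of-variations material, not specific to this thread): making a functional stationary over paths turns "set the derivative to zero" into an ODE.

```latex
% Stationarity of J[y] = \int L(t, y, y')\, dt under variations of the
% path y(t) yields the Euler-Lagrange equation:
\frac{\partial L}{\partial y} - \frac{d}{dt}\frac{\partial L}{\partial y'} = 0
% For the mechanics Lagrangian L = \tfrac{1}{2} m \dot{y}^2 - V(y)
% this becomes
m \ddot{y} = -V'(y)
% i.e. Newton's second law, recovered as an ordinary differential equation.
```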
Of course you would gloss over most of this mechanics material in a single lecture (60 min). But now you've motivated ODEs and given the students a reason to solve ODEs with constant coefficients.
In case you aren't aware (and I wasn't until I trained to be a teacher, so this isn't meant to be condescending), there are alternative methods of teaching besides abstraction-first. See https://en.m.wikipedia.org/wiki/Inquiry-based_learning
10. TEACH CONCEPTS, NOT TRICKS
What can we expect students to get out of an elementary course in differential equations? I reject the “bag of tricks” answer to this question. A course taught as a bag of tricks is devoid of educational value. One year later, the students will forget the tricks, most of which are useless anyway. The bag of tricks mentality is, in my opinion, a defeatist mentality, and the justifications I have heard of it, citing poor preparation of the students, their unwillingness to learn, and the possibility of assigning clever problem sets, are lazy ways out.
In an elementary course in differential equations, students should learn a few basic concepts that they will remember for the rest of their lives, such as the universal occurrence of the exponential function, stability, the relationship between trajectories and integrals of systems, phase plane analysis, the manipulation of the Laplace transform, perhaps even the fascinating relationship between partial fraction decompositions and convolutions via Laplace transforms. Who cares whether the students become skilled at working out tricky problems? What matters is their getting a feeling for the importance of the subject, their coming out of the course with the conviction of the inevitability of differential equations, and with enhanced faith in the power of mathematics. These objectives are better achieved by stretching the students’ minds to the utmost limits of cultural breadth of which they are capable, and by pitching the material at a level that is just a little higher than they can reach.
We are kidding ourselves if we believe that the purpose of undergraduate teaching is the transmission of information. Information is an accidental feature of an elementary course in differential equations; such information can nowadays be gotten in much better ways than sitting in a classroom.
A teacher of undergraduate courses belongs in a class with P.R. men, with entertainers, with propagandists, with preachers, with magicians, with gurus. Such a teacher will be successful if at the end of the course every one of his or her students feels they have taken “a good course,” even though they may not quite be able to pin down anything specific they have learned in the course.
Or maybe it was Rota again, who knows?
Edit: nope, Rota again. Lesson one in your link :)
his book on ODE is otherworldly
the quote maybe comes from his pde book
Unfortunately it seems to be more about the score than about the learning.
I found ODEs interesting only later, when they came up in Linear Algebra, which can lead on to differential algebra. But there are different approaches, and no single one is the true one, so what should an instructor do? You need calculus for ODEs; that much is true, I guess.
Growth in predator/prey populations is interesting, too, plotting fox population against hare population. Finally, graphs can go in circles, not just in one direction along one axis.
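Those fox-over-hare cycles fall out of the classic Lotka-Volterra equations. A minimal sketch (parameter values are made up for illustration; the integrator is plain forward Euler, which is fine for a picture but slowly drifts off the true closed orbit):

```python
# Toy predator-prey (Lotka-Volterra) simulation:
#   dH/dt = alpha*H - beta*H*F      (hares)
#   dF/dt = delta*H*F - gamma*F     (foxes)
alpha, beta = 1.0, 0.1      # hare birth rate, predation rate
delta, gamma = 0.075, 1.5   # fox growth per hare eaten, fox death rate

h, f = 10.0, 5.0            # initial hare and fox populations
dt, steps = 0.001, 20000    # integrate to t = 20 (several cycles)

hares, foxes = [h], [f]
for _ in range(steps):
    dh = alpha * h - beta * h * f
    df = delta * h * f - gamma * f
    h, f = h + dt * dh, f + dt * df
    hares.append(h)
    foxes.append(f)

# Plotting foxes against hares traces a closed loop around the
# equilibrium point (gamma/delta, alpha/beta) = (20, 10).
print(max(hares), max(foxes))
```

Both populations overshoot their equilibrium values and circle back, which is exactly the "graphs can go in circles" behavior.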
Relevant to a recent Joe Rogan podcast with Neil deGrasse Tyson on how many teachers in your life inspired you. It's not so much about the transfer of information, especially today; it's about the excitement and inspiration around the topics you are presenting.
Basically the equivalent of, if the course makes you feel good, then the course is good. Not something that I'd advocate.
By the way, a differential equation is simply an equation with a derivative in it. If you can't recognize that, then you didn't go far enough in math.
And I think it's evidence that the differential equations are not taught in a way that is beneficial for comp sci students (and other types of students too, but I can't speak to that). I took other classes that have not been applicable to my career after graduation - things like finite state machines, computability, and complexity theory. But I still remember a lot from those classes - due to their focus on fundamental ideas and proving things.
And of course my opinion is in the minority, because nearly everybody under the sun complains about how useless college is and how things should be taught with more application, without realizing that things are taught minus application for a reason (so that you can apply things generally instead of specifically), and that many of the hot technologies are just re-purposed PhD research.
Also, the effort many students give to college is less than average (at least from personal experience going through a private engineering school) and probably for most college students. So their complaints are really just the result of laziness and lack of responsibility more than anything.
A minority opinion does not make it invalid or worth less, unless you have evidence to discredit it.
My point wasn't that FSMs are useless. My point was that despite the fact that I personally have never needed to convert an NFA to a DFA in my professional career or program a Turing machine, I still have a deep appreciation for those courses because they fundamentally changed the way I think about computation.
> I've also solved a complex logic problem a senior engineer couldn't solve by implementing K-maps.
While you are clearly very proud of this fact, I'm not sure why that's relevant here?
> everybody under the sun complains about how useless college is and how things should be taught with more application without realizing that things are taught minus application for a reason (so that you can apply things generally instead of specifically)
This is basically the exact opposite of my complaint. I was complaining that differential equation courses essentially focus on teaching a bag of tricks for solving specific types of equations. I'm sure that behind each of those tricks there is a very fascinating how and why that - upon deeper exploration - may have changed the way I think about numbers. But that certainly was not the focus of the class that I took.
> the effort many students give to college is less than average
So you're saying in a given population, many of its members will be less than average? Very insightful. If only I had gone further in math maybe I would be capable of such insights, too. :)
> So their complaints are really just the result of laziness and lack of responsibility more than anything.
Be careful with this line of thinking. You could say the same thing to discredit any attempt to improve the way a course is taught. But surely you must agree there is room for improvement, right?
> A minority opinion does not make it invalid or worth less, unless you have evidence to discredit it.
Given that you made no attempt to substantiate your opinion - it seems to me that the logical thing to do here is to side with the majority.
No, but your general point was that classes aren't taught in a beneficial manner, when in fact they are, and are often directly applicable to what you're doing. It may be abstracted away to the point where you think you aren't using them, but you very much are. And this illustrates the point of not needing to know the how and why something works, and why it's often less beneficial to do so, because knowing more detail often leads to more confusion.
>While you're clearly proud of this fact, I'm not sure why that's relevant here?
It was simply an example of a direct application of something I learned in college and is thus evidence that what you learn is useful and applicable. I mentioned that it helped a senior engineer because learning such abstractions can make experience irrelevant. You were the one that seemed to bring pride into the picture.
> This is basically the exact opposite of my complaint. I was complaining that differential equation courses essentially focus on teaching a bag of tricks for solving specific types of equations.
So why even mention that things like differential equations are taught in an inadequate manner?
Yes, it's quite possible that student effort does not follow a normal distribution. All I can say is, from my experience, it was high. Do I have data to back this up? Not rigorously, but given the numerous complaints about how college isn't teaching things right, and that professors and universities aren't stupid, I'm just assuming it's a lack of effort, priority, or skill.
>But surely you must agree there is room for improvement, right?
Be careful with this line of thinking. You could say the same thing to assume improvement must be made in the way a course is taught, when it's really some other, unmentioned factor that's the cause of people's complaints (i.e. effort, professor, lack of skill, etc.).
> Given that you made no attempt to substantiate your opinion - it seems to me that the logical thing to do here is to side with the majority.
You haven't done the same either. And the most logical thing is to abstain from siding with any position until more evidence is presented. After all, the majority of people believe in a religion as taught from selected texts, texts that contain numerous factually incorrect statements.
This is why I just dropped my DiffEq class. It was optional anyway, but when I go to a university-level math class I expect insight, not rote memorization.
Then you hear students asking, "when am I ever going to use this?"
My good teachers, on the other hand, always tied what we were doing into a larger scheme. If there were similarities or other relations between concepts, they'd be pointed out. If someone wasn't 'getting it', they had other ways of looking at it at hand, would sometimes give alternate methods of doing the same thing, etc.
In short, one tells you to memorize in a vacuum for no good reason. The other helps you learn.
I didn't get calculus. It was a disaster for me.
Now I'm a mathematical epidemiologist. Why? Because someone introduced me to the grander scheme of things.
I remember my undergraduate mechanical vibrations class. Every exam was basically a test of how well you could do the Laplace transform on some linear ODEs. I memorized the most common transforms, so this became fairly straightforward and fast for me, but it was obvious the other students were struggling.
If I had a problem that wasn't solvable with the Laplace transform, say a linear ODE with variable coefficients, I'd likely take longer to do the exam, but those never appeared in the class.
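For readers who haven't seen the mechanics, here is the kind of exam problem being described, in its simplest undamped form (a generic textbook example, not one from that class):

```latex
% Undamped vibration: m x'' + k x = 0, with x(0) = x_0, x'(0) = 0.
% Transform term by term using L{x''} = s^2 X(s) - s x(0) - x'(0):
m \left( s^2 X(s) - s x_0 \right) + k X(s) = 0
\quad\Longrightarrow\quad
X(s) = \frac{x_0\, s}{s^2 + k/m}
% Invert using the memorized pair L{cos(\omega t)} = s / (s^2 + \omega^2):
x(t) = x_0 \cos\!\left(\sqrt{k/m}\, t\right)
```

Once the common transform pairs are memorized, every such exam reduces to algebra on X(s) plus a table lookup, which is exactly why the pattern becomes fast.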
E.g. in finding faster ways to do Ray/surface intersections (if we're talking about actually industrially useful geometry like all kinds of splines and not just triangle meshes), differential geometry is essential - even with triangle meshes you can apply it in normal and curvature estimation. Differential equations and integrating them enter the picture if you want to find the shortest way from one surface point to another along the arbitrarily shaped surface.
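A hypothetical minimal example of a ray/surface intersection, in the one case where it stays closed-form: a sphere, where substituting the ray into the surface equation gives a quadratic. Spline surfaces need iterative numerical root-finding instead.

```python
# Ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
# With d normalized, this is t^2 + b*t + c_coef = 0.
import math

def ray_sphere(origin, direction, center, radius):
    """Return smallest t >= 0 where the ray hits the sphere, or None on a miss.
    `direction` is assumed to be a unit vector."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c          # a = 1 since direction is unit length
    if disc < 0:
        return None               # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    if t < 0:                     # nearer root is behind the ray origin
        t = (-b + math.sqrt(disc)) / 2
    return t if t >= 0 else None

# Ray from the origin along +z hits a unit sphere centered at (0, 0, 5):
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```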
With simulations differential equations are everywhere because any physical system as a function of space (and time) is a collection of differential equations that you need to solve.
Earth mover's distance and the Wasserstein metric have recently gotten attention again; their original relevance was the Monge-Ampère problem: how to redistribute a continuous distribution of 'heaps' of some kind into a distribution of 'sinks' with the least total distance moved. In two dimensions, that is a nonlinear partial differential equation to solve.
We need numerical methods and nonlinear optimization to solve such problems, and CS is part of doing that quickly, because there are no general closed-form solutions for most systems of differential equations, so we need algorithms to solve them approximately.
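A sketch of the one special case that needs no PDE or optimizer (my own illustration): in 1-D, for two histograms of equal mass on the same evenly spaced bins, the optimal transport cost is just the accumulated difference of the running totals. The 2-D version is where the Monge-Ampère machinery comes in.

```python
# 1-D earth mover's distance between two equal-mass histograms on
# unit-spaced bins: track how much "dirt" must be carried past each
# bin boundary and sum the absolute amounts.
def emd_1d(p, q):
    assert len(p) == len(q)
    carry, total = 0.0, 0.0
    for a, b in zip(p, q):
        carry += a - b          # dirt carried past this bin boundary
        total += abs(carry)
    return total

print(emd_1d([3, 0, 0], [0, 0, 3]))  # 3 units moved 2 bins each -> 6.0
```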
As I remember, population ebb-and-flow based upon available resources is something it was especially good at estimating and tracking over time, so I would imagine that in any of the Sims-type games it would be quite useful.
It turns out they are damn good at estimating many things over time, so with a little bit of research and the right game, it's not hard to see how the two could work well together.
For a lot of things in life we plot graphs. Your software might do it transparently to you, but how it works internally is using mathematical concepts. It is good to know it and I really enjoyed learning it at college, as it opens your mind about how things work. But I don't believe it is a must to know.
After 20 years of engineering--in almost every case--numerical methods have been the only way forward. In hindsight, a year-and-a-half long course to convey the fundamentals seems excessive.
You can't actually _understand_ numerical methods without a fairly deep grounding in analytical methods.
The real problem here is a lack of context. Engineering and most science curriculums take a "short-cut" through mathematical education. They try to teach just enough math to get through the major coursework. As a result you end up with students who feel it's all just one big memorization trick.
How so? My experience has been that the "physical world" is where the symbolic approach completely breaks down.
> Engineering and most science curriculums take a "short-cut" through mathematical education.
The only people taking more math than scientists and engineers are mathematicians. A year-and-a-half course to cover the limit, tangent-at-a-point, functions of tangent-at-a-point, area-under-the-curve, and generalizing all of that to higher dimensions doesn't seem like much of a "short-cut" if you ask me.
edit: IMHO, many of those "gotchas" are much more interesting than the fundamentals of calculus.
This is just my favorite example, but it illustrates how understanding the fundamentals also explains the gotchas. Just getting a feel for them through experience is again just black magic by building up a table of what to use when without the generalizing principle behind it.
Even so, I spent a year of my time--and God knows how much of other people's money--grinding out the mathematical equivalent of crossword puzzles so I could get my job certificate--just like every other engineer.
Use that same time to apply the fundamentals to numerical methods, and you get to go in far more interesting directions--like symplectic integrals, or chaos theory.
It's no more busywork than being able to multiply two single digit numbers in your head. Whether it's useful to your job really depends on the job. I had a job once in the engineering industry. When we were in meetings discussing projects, if you could not do those types of analyses (e.g. asymptotic behavior of certain Calc II type integrals) in your head, you would not know what's going on. Sure, everyone could explicitly show all the steps for your benefit, but you'd be slowing everyone down.
> "physical world" is where the symbolic approach completely breaks down.
One needs to be able to solve problems that have all but the most essential details stripped out in order to develop a sense of how physical law actually works. Many times that is even "good enough" to get to a solution.
The best way to do that is through analytic methods, which give not only "an answer" but also tell you important features of the answer. These analytic solutions have "handles" you can use to ask "what-if" questions -- eg zero's in the denominator to indicate poles, behavior of the system as you take certain limits, geometric aspects such as symmetry, patterns in recurrence relations, etc, etc, etc..
Something most people in the STEM fields refuse to acknowledge is that throwing away information complicates things just as often as it simplifies them.
I say that if the ball doesn't bounce forever, the equation should reflect that.
OTOH, if you study physics at an advanced level, it's rather shocking how effectively all that analysis models the world, despite throwing away a lot of information. Try studying solid state physics. It's crazy the number of assumptions they make, and yet the theory still produces very accurate results.
There's a reason Eugene Wigner penned an essay with the title "The Unreasonable Effectiveness of Mathematics in the Natural Sciences".
The material in a physics 101 course is just the barest minimum and it goes beautifully hand-in-hand with calculus 101.
Whether numerical methods are viewed as the primary way forward is a bit of a self-fulfilling prophecy. If you don't think analytical solutions end up being useful, you probably won't put in the work needed to generate them in the first place, so you never see the value.
Even if you go all in with numerical methods, you need to test your code. This requires an exact solution and knowledge of the convergence rate of the numerical scheme. The exact solution can be for a special case that is easy to solve. You might need multiple exact solutions to cover all the physics. You can also use techniques like the method of manufactured solutions, but if you don't like analytical methods you'd probably hate that.
You need to check if the empirical convergence rate matches the theoretical one. In practice this is rarely done, but it's essential towards eliminating bugs. So you can't entirely avoid exact solutions if you want to do purely numerics right. This was not covered in my first differential equations class, unfortunately, but I think it's an essential topic.
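A minimal version of that convergence check (my own sketch, using the simplest possible test equation): integrate y' = -y, y(0) = 1, whose exact solution is e^{-t}, with forward Euler at step h and h/2, and confirm the observed order log2(err_h / err_{h/2}) is close to 1, the theoretical order of the scheme.

```python
# Empirical convergence-rate check for forward Euler on y' = -y.
import math

def euler_error(h, t_end=1.0):
    """Error at t_end of forward Euler on y' = -y, y(0) = 1."""
    y, n = 1.0, round(t_end / h)
    for _ in range(n):
        y += h * (-y)
    return abs(y - math.exp(-t_end))

e1, e2 = euler_error(0.01), euler_error(0.005)
order = math.log2(e1 / e2)
print(order)   # close to 1; a bug in the stepper would show up here
```

The same halve-the-step pattern works for any scheme against any known exact (or manufactured) solution; a second-order method should report an order near 2.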
Exact solutions are often impossible, but less so than most people believe. I've produced exact solutions many times to equations people thought required numerics. The exact solutions are very valuable by themselves, as they can be evaluated much faster than numerical solutions in most cases and let you see the structure of the solution. I think you should always try hard to find an analytical exact or approximate solution. It might be rare that you can do it, but the value is large, and if we stopped teaching these methods it would become much rarer.
As for the problem you mentioned in another post of "pounding the square peg of law into the round hole of analytic methods", you should learn about approximate analytical solutions, which give you a lot more flexibility. You still ultimately have the same problem, though.
I can't decide whether to continue reading the Advanced Engineering Mathematics book or to learn the topics it contains via Paul's Notes and other resources. My worry is that the reason Paul's Notes seem clearer is simply that they're more superficial.
What are some other good learning resources for advanced engineering undergrad math?
Those are fringe subjects, completely irrelevant for the modern usage of differential equations. They are useful only in computer algebra, when you want to implement differential Galois theory. In practice you want to understand the overall behavior of your system (qualitative theory) or compute particular solutions numerically (using numerical methods, which are more precise than evaluating the expression of the exact solution).
You'd do much better with a qualitative book about differential equations (e.g., Arnold), about numerical analysis, or about dynamical systems (e.g. Strogatz).
I highly recommend Numerical Linear Algebra by Trefethen. It gives very detailed descriptions of particular interpretations of the singular value decomposition and eigenvalues, it works out detailed algorithms for LU factorization, eigenvalue/eigenvector decomposition, QR factorization etc. If you know basic linear algebra, the book is a pleasure to read through. For this crowd of people it is also very practical.
I don't know a good resource for probability... it is a much more diverse subject than linear algebra (which is a very small, very detailed subset of algebra).
these are the best notes ever if you missed class or a concept.
https://news.ycombinator.com/item?id=18182657 is an example ITT WRT computer game writing.