I've done these classes. It's typically 150 hours per class, and it's not something you do after coming home exhausted from work, either. After those 150 hours you'll have a basic understanding of the topic. You won't be an expert by any means; that will require more exposure, more time.
The lectures themselves are not that useful, I find. The lecturers are mostly useful in guiding you along, telling you which aspects of the theory to focus on and weeding through the study material to deliver you the best bits. The problem sets are indispensable. Exams make sure you actually know the basics in depth instead of just knowing about them.
My advice: enrol part-time, take one class at a time, catch up on the lectures and do the problem sets and the homework over the weekend.
Interesting; anecdotally, I went to three different universities and they all barred this from happening without an explicit override from a professor or, occasionally, an adviser. Just curious: what experience do you have that makes you say this?
Also, at least when I was an undergrad, the Banner student system (from Ellucian) had no mechanism in place for barring you from registering for classes. This was frequently a point of discussion among the professors I TA'd for.
As one anecdote, they once told me I needed to retake intro physics. On the pretest given on the first day of class, I came within one problem of a perfect score. Didn't make a lick of difference - their syllabus differed from the last university in the most minor of ways, and despite the fact that the class never actually covered even 50% of the stuff it claimed to on the syllabus, I was made to retake the entire sequence anyway.
Although I think your situation was pretty special as well, transferring universities is usually incredibly annoying and filled with road bumps. I've found there's a lot more leniency given to students who remain within the same university.
The way to get around this isn't by taking pretests (which don't mean much); it's by sitting the final exams. In some institutions you'll be able to do this without (full?) course fees if you're attempting to demonstrate equivalence.
I failed to mention I'd also already spent three years as a physics major and had already taken classical mechanics, electrodynamics, and quantum mechanics - so being told to retake the intro physics sequence was quite silly indeed.
It's a bit of a pain, but the point wasn't that they should give you some questions from the old exam but that you should actually sit the new one, under exam conditions. That resolves the problem for everyone without you having to spend the time repeating lectures etc. In the best case you don't pay full rate either.
Do they accept credits from any other institutions? Isn't there a national agreement on accepting credits for certified degrees?
Then find a lower-tier university. In any densely populated region there will be at least one that's happy to take your tuition money to let you enroll in a non-degree-earning course of study.
You'll only have an issue if you want a degree which, BTW, is typically intended to be broader than a narrow course of study in a particular area of expertise.
But, I went back and studied Real Analysis, Measure Theory, Combinatorics, Topology, and Stochastic Calculus.
I have found, though, that while I have a decent grasp of the concepts, my understanding and ability to solve problems aren't as strong as those of the math grad students who studied these topics deeply.
I have found the knowledge useful, but I would agree that it would be hard to achieve the same results.
Lots of mathematicians switch to a different subfield within their careers. And they do so by self studying, obviously.
If your compsci or physics undergrad provided you with a decent degree of mathematical maturity, it should be doable. The problem here is that compsci is still young, and there is a lot of variability. So diving into differential forms after attending a Java school sounds like a bad idea.
Humans are capable of great things, don't discourage someone from trying.
Assuming you can only invest, let's say, 5 hours on the weekend, that's 30 weeks per class, summing up to ~1.5 classes per year.
I'd say that if someone can keep that up for 10 years, their understanding of math will be far above that of the average population...
And I think the latter category is more important than people realize. When I was a high school student, I benefitted greatly from 't Hooft's theorist.html (like this, but for physics, and put together by one of the Greats of the field). It's part of what got me really interested in physics, it was a whole lot of fun, and it actually did a pretty good job of preparing me for graduate-level coursework. Eventually I left physics and math for CS (I'm in PL, so this is even less drastic of a career change than one might think), but I still have warm memories of working through 't Hooft's guide and checking off topics as I finished the problems in each textbook.
I don't think that the author of the post you're responding to is making any sort of normative claim.
> Sometimes the issue isn't time - it's money and access to schooling
I think the author is suggesting that if you have that kind of free time, then you either a) must be independently wealthy, or else that b) trading some of that free time for cash you can use to buy credit hours is a net positive because the guidance is worth it (again, assuming your goal is to learn math as opposed to eg obtain a uni degree).
I agree that's not necessarily a good assumption globally, but it's probably a decent assumption for almost everyone in the west, both in terms of self-learning capabilities and in terms of available funds.
Not at all; I meant in the form of a loan, where free time is time you would otherwise spend retired.
And I was also assuming that you already have a decent job that allows you to work 40 hours or so and make a comfortable enough living to consider spending your free time learning math.
Frankly, I can't possibly imagine trying to do something like learn advanced mathematics while also struggling with un- or under-employment. I grant that there are probably people far more motivated and resilient than I assume :-)
I'll pass on a world where policy and education only deals with the common case.
I interpreted this thread as being about advice to a typical person interested in learning math who exists in the society we (specifically, Americans) live in today. For the exceptional, the appropriate advice is of course different.
I think math, unlike CS, is not a "skill" you use in your everyday work. Whoever does math/science has to dedicate much of their life to it. CS you can learn on the internet, via MOOCs, or just by reading books about it. Math, on the other hand, simply doesn't work like that.
To be clear, by math I mean "advanced" mathematics, as the title indicates. Elementary (and intermediate) math you have to learn no matter what.
Going to university part time eats up more time than reading a textbook at home or wherever you are in your spare time. It also locks you in to the specific pace that the class is at which is not optimal for most people, because people learn different things at different paces. There is also the cost of university to consider.
I learn different things at different speeds; a course makes me go at a speed I don't enjoy. This is perfect.
And trigonometry is taught in high schools? It's something most children pick up; I don't think it's a fair comparison to, say, measure theory.
You're missing the point, too: I was taught trig in high school, and I remember SOHCAHTOA, etc. - but until I looked at lines drawn on a unit circle and worked out what secant, sine, tangent and their corresponding ratios meant, I didn't really understand trig.
I wonder if it's even possible. Learning maths requires much work, time and dedication. Doing so alone must be very difficult.
There are several things universities provide that are hard to replicate alone: a degree (which gives you access to a job), motivation, a learning environment, and "peace of mind".
What I mean by peace of mind is that, when you're a student, your job is to study, that's what you're expected to do and normally your degree will give you access to a job (esp. if your university is reputable).
Now suppose someone learns advanced maths on their own. There's a huge opportunity cost: not only does it take a lot of time, but the few lucrative jobs that make use of maths are mostly in finance. I suspect financial institutions are very conservative and rarely recruit someone without a proper academic background.
Another thing about learning alone is that your job is twofold: you must be teacher and student at the same time. You need to find the material, impose a pace on yourself, decide when it's OK to move on, etc. That may be fine when you want to learn a new technique in a field you already know, but for something as broad as "learning advanced mathematics" it seems impossible.
I graduated several years ago with a BS in Computer Science, with a focus on networking. During that time, I held 3 part-time jobs while also being a math tutor. Almost none of the math I use today as a physics developer was learned in school. I also never had that peace of mind you mentioned, because I was constantly juggling several things at once while going to school. My knowledge of advanced mathematics at the time of my graduation was pretty nonexistent. I think the most advanced math I had was Algebra 2 or something like that, and the professors just basically read verbatim from the book.
A few years after Uni, I started teaching myself Calc, Trig, Vector maths, Diff Eq and Physics strictly from what I found on various sites, software and books. Because of that, I ended up getting a physics simulation developer position at a software company. Because, in my company's view, being able to teach yourself all that math is much more impressive than being taught it at a University.
I hated math during high school and college, but since then I've found that I absolutely love math, and I will never stop trying to learn or do new things. My degree was two small lines on my CV, while about 50% of what I had on my CV was learned in my free time, by myself.
So learning math without a College or University is totally possible, and in my situation, worked way better. Sites like Khan Academy, Wolfram, Youtube, etc. all give you the resources and leave it up to you to progress at your own pace, for free.
Right, but the math tagged as 'advanced' in the article is fairly applied.
I consider the math I do at work to be somewhat advanced. Statics, dynamics, a touch of thermodynamics, etc. But if you're talking quantum mechanics or NASA JPL-level math, then yeah, I totally agree those topics would definitely be better learned in a proper environment.
Here I talk from experience. I remember reading books on some of these subjects and understanding a few things without really getting a grasp of what they were talking about. A lot of the time, the problem is that you don't know what is missing from your knowledge. You need a clear roadmap, you need relationships, you need to solve a lot of questions, you need to do exams and, most importantly, you need to test your knowledge. I cannot even count how many times I thought I understood some theorem, only to do some exercise and see that I had absolutely no idea. Sometimes you notice yourself; sometimes you do it so badly that you don't even notice it is incorrect.
And, for these subjects, the material on the Internet starts to diminish and become less accessible (more oriented to professional mathematicians than to learners). Khan Academy does not have advanced courses, the definitions on Wolfram or Wikipedia are only useful if you already have a grasp of the subject (see for example https://en.wikipedia.org/wiki/Measure_(mathematics)#Definiti... - What is important? What are the critical aspects? Which are the subtle parts of the definition that you must read carefully?), and on YouTube you may find lectures, but usually they're like the books: you will be lucky if it's not a succession of theorems and definitions, and you still lack the possibility of checking and testing your knowledge.
So, while some parts of math can be learned independently, I don't think that advanced mathematics can be. Myself, only after 5 years of mathematics am I somewhat comfortable studying subjects by myself, and it's still hard.
As you say, though, you need to solve a lot of questions (which I interpret to mean "do a lot of exercises" or "do a lot of problem sets") to understand something. Reading a textbook without doing exercises is minimally useful, although it can help with the "roadmap"/"relationships" thing. Wikipedia is usually a pretty good roadmap, too, although it varies by field.
But you can also read textbooks and do exercises. This depends on the existence of, and access to, sufficient textbooks and exercises, but Library Genesis has recently extended that kind of access to most of the world. Taking functional analysis as your example, the 1978 edition of Kreyszig is on there, and it averages about two exercises per page, and has answers to the odd-numbered ones in the back. This quantity of exercises seems like it would probably be overkill if you were taking a class in functional analysis and could therefore visit the professor during office hours to clear up your doubts, but it seems like it would be ideal for self-study. And if two exercises per page isn't enough, you can get more exercises out of a different textbook, like Maddox (1970 edition on libgen) and Conway (first and second editions on libgen). You can find textbooks on scholar.google.com by searching for the names of general topics and then looking for "related articles" with thousands of citations, because for some reason people like to cite their textbooks.
Unless you can find a desperate adjunct math faculty member looking to make some extra bucks on the side or something, it's true that comparing your answers to the exercises to those given isn't as good as having a TA actually correct your homework. But it's usually good enough.
(Of course you should only download these books if that wouldn't be a violation of copyright, for example, if their authors granted libgen permission to redistribute them or you live in a country not party to the Berne Convention.)
Progress will be slow. But I think the key thing here is to start with low expectations: expect that you'll manage to read about 15 pages a week and understand half of them. I don't think you have to be a Terence-Tao-level genius.
I got 800 on the 1980s-era math SATs, came in third in the Portland OR area in a math contest in high school, and did OK at Caltech (not in a math major), but I'm no Terry Tao, and I very much doubt I'd've been anything very special in a good math undergrad program. Some years after graduation, I found it challenging but doable to get my mind around a fair fraction of an abstract-algebra-for-math-sophomores textbook, including a reasonable amount of group theory (enough to formalize a significant amount of the proof of the Sylow theorems as an exercise in HOL Light, and also various parts of the basics of how to get to the famous result on the impossibility of a closed-form solution for the roots of a quintic).
From what I've seen of real analysis and measure theory (a real analysis course in grad school motivated by practical path integral Monte Carlo calculations, plus various skimming of texts over the years), it'd be similarly manageable to self-learn it.
One problem is that some math topics tend to be poorly treated for self-learning, not because they are insanely difficult but because the author seems never to have stepped back and carefully figured out how to express what is going on in a precise, self-contained way, just relying (I guess) on a lot of informal backup from a teaching assistant explaining things behind the scenes. On a small scale, some important bit of notation or terminology can be left undefined, which is usually not too bad with modern search engines but was a potential PITA before that.

On a larger scale, I found the treatment of basic category theory in several introductory abstract algebra texts seemed prone to this kind of sloppiness, not taking adequate care to ground definitions and concepts in terms of definitions and concepts that a self-studying student could be expected to know. That's harder to solve with a search engine, tending to lead into a tangle of much more category theory and abstraction than one needs to know for the purpose at hand.

My impression is that mathematicians are worse at this than they need to be, in particular worse than physicists: various things in quantum mechanics seem as nontrivial and slippery as category theory to me, but the physicists seem to be better at introducing and grounding them. (Admittedly, though, physicists can ground it in a series of motivating concrete experiments, which is an aid to keeping their arguments straight that the mathematicians have to do without.)
I have been much more motivated to study CS-related and machine-learning-related stuff than pure math, and I have been about as motivated to self-study other things (like electronics and history) as pure math, so I have probably put only a handful of man-months into math over the years. If I had put several man-years into it, it seems possible that I could have made progress at a useful fraction of the speed of progress I'd expect from taking college math courses in the usual way.
I think it would be particularly manageable to get up to speed on particular applications by self-study: not an overview of group theory in the abstract, but learning the part of group theory needed to understand the famous proof about roots of the quintic, or something hairier like (some manageable-size fraction of) the proof of the classification of finite simple groups. Still not easy, likely a level harder than teaching oneself programming, but not an incredible intellectual tour de force.
"Myself, only after 5 years of mathematics I'm somehow comfortable to study subjects by myself, and it's still hard."
Serious math seems to be reasonably difficult, self-study or not. Even people taking college courses in the ordinary way are seldom able to coast, right?
Any advice on how to use those textbooks the best way?
Same problem in programming and machine learning - people need a little hand holding in the form of a sequence of problems to solve that would never be either too difficult or too easy. Examples usually jump from Todo MVC to full apps, in one step, or in ML, from a simple MNIST example (or even the minuscule Iris dataset) to double LSTM with memory and attention. Where are the intermediary nice problems to learn on?
When I was learning math in school and high school there were loads of gradual problems to solve, but at university suddenly there was just theory and almost no useful problems to practice on.
Most unis I know of (I'm in the US) require those courses to be taken as part of your undergrad before you can attain the CS degree. Furthermore, with the prevalence of AP courses at the high school level, many students enter college already having taken some, possibly all of those courses.
How could you get a BS in CS without taking calculus courses? Which school did you go to?
(Although maybe this will change with machine learning.)
> By the end of that summer of 1983, Richard had completed his analysis of the behavior of the router, and much to our surprise and amusement, he presented his answer in the form of a set of partial differential equations. To a physicist this may seem natural, but to a computer designer, treating a set of boolean circuits as a continuous, differentiable system is a bit strange. Feynman's router equations were in terms of variables representing continuous quantities such as "the average number of 1 bits in a message address." I was much more accustomed to seeing analysis in terms of inductive proof and case analysis than taking the derivative of "the number of 1's" with respect to time. Our discrete analysis said we needed seven buffers per chip; Feynman's equations suggested that we only needed five. We decided to play it safe and ignore Feynman.
>
> The decision to ignore Feynman's analysis was made in September, but by next spring we were up against a wall. The chips that we had designed were slightly too big to manufacture and the only way to solve the problem was to cut the number of buffers per chip back to five. Since Feynman's equations claimed we could do this safely, his unconventional methods of analysis started looking better and better to us. We decided to go ahead and make the chips with the smaller number of buffers.
>
> Fortunately, he was right. When we put together the chips the machine worked.
It's true that in a lot of cases, deeply understanding discrete numerical algorithms is a lot easier if you can analyze the continuous versions, which of course cannot be executed directly. But you can get really far with just the discrete versions, and you can understand useful things about the continuous versions without knowing what a derivative or an integral is.
And I don't just mean that you can use Unity or Pure Data to wire together pre-existing algorithms and get interesting results, although that's true too. You don't even need to understand any calculus to write a ray-tracer from scratch like http://canonical.org/~kragen/sw/aspmisc/my-very-first-raytra..., which is four pages of C.
You could maybe argue that it's using square roots, and calculating square roots efficiently requires using Newton's method or something more sophisticated. But Heron of Alexandria described "Newton's" method 2000 years ago, although he hadn't generalized it to finding zeroes of arbitrary analytic functions, perhaps because he didn't have concepts of zero or functions.
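For concreteness, here's a minimal sketch of Heron's square-root method in Python (the function name and tolerance are my own choices, not taken from anything referenced in this thread):

```python
def heron_sqrt(a, tolerance=1e-12):
    """Approximate sqrt(a) for a > 0 by Heron's method: repeatedly
    average a guess x with a/x. This happens to be exactly Newton's
    method applied to f(x) = x**2 - a, though Heron described it
    without any calculus."""
    x = a if a > 1 else 1.0  # any positive starting guess converges
    while abs(x * x - a) > tolerance * a:
        x = (x + a / x) / 2
    return x
```

The averaging step is the whole algorithm: if x overshoots the root, a/x undershoots it, so their mean is a better guess.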
You could argue that it's using the pow() function, but it's using it to take the 64th power of a dot product in order to get specular reflections. People were taking integer powers of things quite a long time ago.
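Taking an integer power like that needs nothing beyond repeated multiplication. A rough sketch of exponentiation by squaring (the name `int_pow` is mine, and this isn't claiming to be how any particular pow() implementation works):

```python
def int_pow(x, n):
    """Compute x**n for non-negative integer n using only
    multiplication, by repeated squaring. For the 64th power in a
    specular-reflection term, that's just six squarings."""
    result = 1.0
    while n > 0:
        if n & 1:        # odd exponent: fold one factor of x in
            result *= x
        x *= x           # square the base
        n >>= 1          # halve the exponent
    return result
```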
Even using computers for really analytic things, like finding zeroes of arbitrary analytic functions, can be done with just a minimal, even intuitive, notion of continuity.
Alan Kay's favorite demo of using computers to build human-comprehensible models of things is to take a video of a falling ball and then make a discrete-time model of the ball's position. A continuous-time model really does require calculus, and famously this is one of the things calculus was invented for; a discrete-time model requires the finite difference operator (and maybe its sort-of inverse, the prefix sum). Mathematics for the Million starts out with finite difference operators in its first chapter or two. You don't even need to know how to multiply and divide to compute finite differences, although a little algebra will get you a lot farther with them. A deep understanding of the umbral calculus may be inspirational and broadening in this context, and may even help you debug your programs, but you can get by without it.
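A tiny sketch of what that discrete-time model looks like, with made-up heights that follow h = 100 - 5t² exactly (real video data would be noisy):

```python
from itertools import accumulate

# Sampled heights of a "falling ball" at t = 0, 1, 2, 3, 4.
heights = [100, 95, 80, 55, 20]

first_diff = [b - a for a, b in zip(heights, heights[1:])]
second_diff = [b - a for a, b in zip(first_diff, first_diff[1:])]

print(first_diff)    # [-5, -15, -25, -35]: "velocity" changes linearly
print(second_diff)   # [-10, -10, -10]: constant, i.e. uniform acceleration

# The prefix sum is the (sort-of) inverse of the difference operator:
# accumulating the first differences from the starting height
# reconstructs the original samples.
assert list(accumulate([heights[0]] + first_diff)) == heights
```

No multiplication or division is needed to see that the second difference is constant, which is the discrete analogue of constant acceleration.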
I agree that calculus is really powerful in extending the abilities of computers to model things, but I think you're overstating how fundamental it is.
A programmer equipped with a bit of calculus is so much more powerful than a programmer without. It's like two people climbing out of a canyon: both the one with training and equipment and the rookie with bare hands will probably reach the top, but it takes the better-equipped person less time, and they're already tackling other interesting problems when the other finally arrives.
Humans have a limited time on this planet. Really, learning calculus formally is one of the most efficient and painless boosters for productivity when creating new bicycles of the mind. It's not the only one, and it's not necessary, as you pointed out, but compared to its utility it's so cheap to acquire that I can't really see any reason not to force it on people. This is still just my opinion; I don't have sufficient practical didactic chops to even anecdotally demonstrate it.
I totally agree that (integral and differential) calculus is a massive mental productivity booster. I'm not very convinced of the utility of schooling in acquiring that ability, because I've known far too many people who passed their calculus classes and then forgot everything, probably because they stopped using it. I've forgotten a substantial amount of calculus myself due to disuse. But I agree that schooling can work.
But I wasn't arguing against schooling, even though our current methods of schooling are clearly achieving very poor results, because they're clearly a lot better than nothing.
I was arguing that, for programming, the schooling should be directed at the things that increase your power the most. Two semesters of proving limits and finding closed-form integrals of algebraic expressions aren't it. Hopefully those classes will teach you about parametric functions, Newton's method, and Taylor series, but you can get through those classes without ever hearing about vectors (much less vector spaces and the meaning of linearity), Lambertian reflection, Nyquist frequencies, Fourier transforms, convolution, difference equations, recurrence relations, probability distributions, GF(2ⁿ) and GF(2)ⁿ, lattices (in the order-theory sense), numerical approximation with Chebyshev polynomials, coding theory, or even asymptotic notation.
In many cases, understanding the continuous case of a problem is easier than understanding the discrete case; but in other cases, the discrete case is easier, and trying to understand it as an approximation to the continuous case can be actively misleading. You may end up doing scale-space representation of signals with a sampled Gaussian, for example, or trying to use the Laplace transform instead of the Z-transform on discrete signals.
If you really want to get into arguing by way of stupid metaphors, I'd say that when you're climbing the wall of a canyon, a lightweight kayak will be of minimal help, though it may shield you from the occasional falling rock.
But I don't know, maybe you've had different experiences where integral and differential calculus were a lot more valuable than the stuff I mentioned above.
It's true I don't need that stuff in my daily work that much. But I recognise a lot of problems I might meet are trivial with some applied calculus. Like Newton iteration, which you mentioned.
You definitely don't need calculus to transform between spherical and Cartesian coordinates. I mean I'm pretty sure Descartes did that half a century before calculus was invented. You do need trigonometry, which is about a thousand years older.
Newton iteration is a bit dangerous; it can give you an arbitrary answer, and it may not converge. In cases where you think you might need Newton iteration, I'd like to suggest that you try interval arithmetic (see http://canonical.org/~kragen/sw/aspmisc/intervalgraph), which is guaranteed to converge and will give you all the answers but is too slow in high dimensionality, or gradient descent, which does kind of require that you know calculus to understand and works in more cases than Newton iteration, although more slowly.
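A minimal illustration of both behaviors (the `newton` helper and the cube-root example are my own toy constructions, not taken from the linked code):

```python
import math

def newton(f, df, x, steps=6):
    """Plain Newton iteration: x <- x - f(x)/df(x). Deliberately has
    no convergence check, so the failure mode below stays visible."""
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

# Well-behaved case: the root of x**2 - 2, starting from x = 1,
# converges to sqrt(2) in a handful of steps.
good = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)

# Pathological case: for f(x) = x**(1/3) the update is x - 3x = -2x,
# so each step doubles the iterate's magnitude instead of converging,
# even though the root at 0 is perfectly well defined.
f = lambda x: math.copysign(abs(x) ** (1 / 3), x)
df = lambda x: (1 / 3) * abs(x) ** (-2 / 3)
bad = newton(f, df, 1.0)
```

Starting from 1.0, the pathological iterates run 1, -2, 4, -8, ... off to infinity, which is the kind of silent divergence that makes plain Newton iteration risky without safeguards.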
Calculus does usually build some mathematical maturity for those who haven't encountered it. And it's useful as an introduction to sequences and series, and for anyone interested in numerical analysis or physics simulation (e.g., computational science, modeling, game engine development, etc.).
Not to mention having it is useful if you find that you'd rather do computer engineering or EE halfway through your undergrad career (though this last point is tangential at best).
I do wish linear algebra was a more commonly required course in CS programs.
I spent a year in a community college making up for what I should have learned in high school: basic math up through advanced algebra. Sure, I applied myself more, but the teachers, and even the textbooks, seemed better?
Once I learned the basics, it made math enjoyable, and I didn't fear courses that were heavy in math.
By the way, most medical doctors never sat in a calculus course. Here in the U.S., there have always been two calibers of physics courses: the hard and the easy. The easy physics courses don't require calculus; the hard ones do. Most med students took the easy courses and aced them. It's all about the GPA when trying to pretty yourself up for med school.
I worried way too much about grades in college. I look back and wish I took the courses I was interested in.
My interests are completely different as I've aged. It's tough in college because so much rides on getting into that certain graduate program, or professional school-- graduating, and getting a Job.
Heck, even last year I talked to another university to look into electrical engineering, and only 1, ONE, class would transfer. All the others wouldn't count because, since my school was private, its curriculum was different from most. That's not something they put in brochures.
I'd point you and other HNers to Srinivasa Ramanujan. He was self-taught but... he was wrong. He had a brilliant mind, but due to being self-taught, he made some critical mistakes.
Being self-taught can easily lead the learner into some critical mistakes. Eventually they may be corrected (and at what 'cost' to an organization or business or those involved), but it's a more efficient use of someone's time to just learn from another. I'm not saying everyone needs a university degree. I'm saying that everyone needs a teacher. Everyone. Why? Because instead of 'the blind leading the blind' (you as a 'blind' teacher, leading you as a 'blind' learner), you have the efficiency of being led by a mentor of some kind who can steer you away from faulty concepts that may creep in.
It's great that we now have more free/cheap materials than ever before at our disposal but without a mentor or some kind of peer-review, we could be misapplying concepts.
Also, to comment on something you specifically said:
> Because in my companies view, being able to teach yourself all that math is much more impressive than being taught from a University.
Yes, it's 'impressive', but... most don't learn this way, which is why it's 'impressive'. Also, being self-taught, how do you truly verify that what you understand mathematically is accurate and solid? You might be able to, and I'm not going to fault you, but learning concepts is one thing and applying them is even more challenging. It's one thing to be 'impressive'; it's a whole other thing to have mastery over a topic. And I'm a firm believer that mastery is mostly achieved with peer/mentor feedback.
I applaud you but let's not steer others to just teach themselves, without help from others. Let's encourage self taught and peer feedback. It's not one or the other, it's both.
 - https://www.youtube.com/watch?v=jcKRGpMiVTw
 - I searched for 30 minutes to find this article, which I read, about the current environment of mathematical research. Namely, it stated that a lot of research is being published that is NOT peer-reviewed because there aren't enough skilled* Mathematicians to review the work; that it's a 'dirty little secret' in the industry that "known" Mathematicians get a pass (published w/o review) while many others trying new, groundbreaking ideas couldn't get their research peer-reviewed. And given the university culture of publishing NEW research rather than reviewing, it's understandable how this environment was created. Namely, Einstein gets the fame, but it took numerous people peer-reviewing his work before it was accepted.
 - I know this article exists. It's one of the reasons why I'm becoming a Mathematician. I read it in the past 2-3 years. It was a major site (NewScientist or something that focuses on emerging research). If you can find it, I'd be very grateful. I'm now* using Zotero to save all my findings, so hopefully when I quote something I'll have a source. ;)
*(edited) - original said 'not'. I meant 'now I'm using Zotero'. ;). original said 'skill', I meant 'skilled'
>Being self taught can easily lead the learner to some critical mistakes. ...
I'm really glad you brought that up. There have been countless times when I was working on some formula which looked good to me, and even had correct results (some of the time), only to find that it was completely backwards when someone else looked at it. It's essentially like learning to program versus learning to program correctly. I can't tell you how many times I pronounce words incorrectly because I have only read them and never heard someone say them. Also embarrassing.
I have actually read that same thing as your footnote, and I want to say I saw it here on HN but cannot remember when. It was pretty interesting, and I can totally see how not learning math the proper way can cause a lot of issues in research. In my case, doing physics for simulations, it's not as pronounced because it's a small user base, but at a larger scale I would be terrified of publishing my work for this exact reason.
And I by no means intend to convince others to learn this on their own. I would actually suggest doing it the standard way, because it was much more difficult and time-consuming trying to learn this stuff on your own. Especially since I had no real person to talk to about it. I kinda wish I could go back and change majors.
I think many people forget that THIS is what a Scientist is. Someone subjected to their peers. This humble way of looking at things (that our work isn't accepted until it's verified/peer-reviewed), is our way of life. It's a shame to me that the current culture has a massive backlog of research, without peer review.
I'm grateful for your reply as it will give others insight into the 'less trodden path' of trying things yourself. It worked for you, so that should motivate others. And hopefully I added to the conversation by encouraging others to seek out peers/mentors, since that will accelerate their learning.
From a practical perspective it definitely is. I've picked up a fair amount of graph theory and with nothing but extreme persistence have grokked and used some fairly advanced stuff (2nd-year dropout). It was, however, work-related. Just don't ask me to prove anything.
> the few lucrative jobs that make use of maths are in finance
There is also competency on the table here. Graph theory crops up day-to-day with the business software work I'm doing (three separate deliverables). Calculus is used to a point of absurdity in game development - e.g. the Fresnel term. Machine learning? Calculus, linear algebra, tensors. Profiling? A basic understanding of statistics. Compilers? Category theory, graph theory. Physics engines? ODEs.
It's extremely valuable to know this stuff.
I don't mean to offend, but being able to prove things is generally the main focus of advanced mathematics. If you can't prove what you know, or at least have a rough outline of a proof you could construct after referring to something, you haven't learned it in the same way those at a university have.
> If you can't prove what you know
Does the Bayesian vs. Frequentist debate hold back working statisticians?
Or does the debate wrt constructivism ( https://en.wikipedia.org/wiki/Constructivism_(mathematics) ) hold back math in general?
I admit I'm a bit out of my depth on this point though...
As for Bayesian vs. Frequentist, it's another vim vs. emacs style debate most of the time - which is most appropriate to use, as opposed to which is right and which is wrong. Quite a lot of the time, it just doesn't matter.
I'd say being able to use advanced math, even for engineering, is a suitable definition for "having studied advanced mathematics".
'versus "mathematicians" in the sense that they can use advanced math.'
Mathematicians are people who create new mathematics, not people who use mathematics.
I feel the term is often used more broadly.
In any case, the point stands - The need for persons who can construct mathematical proofs, versus those who simply need to derive the correct numerical result, is very different.
As such, most classes that teach calculus are for practical, applied purposes - for students who don't need to "prove what they know" beyond demonstrating procedural competence.
Compare perhaps "composer" to "musician": they are both involved in music but operating on different axes. Most people would agree that there isn't a strict superset relationship, and there is overlap. There are skilled composers who are lousy musicians, and vice versa. There are a few people who are top-rate at both. However, it is very useful to have the distinction between creating and performing.
It's much the same with mathematicians.
If we're relating to how people use words, it's not "much the same with mathematicians"
That rather depends on the field. For engineers, the main focus of advanced mathematics is to be able to apply it to real world problems.
With a few, minimal yet illustrative examples (a la katas), plenty of diagrams/illustrations and other mental aids, and no rigour - not a single bit of set theory!
The proofs and rigor can come later...
Incidentally, I'm a dev in finance, looking to move into quant dev. I have a math degree (completed 2007) and I'm doing a "CQF" to catch up with the relevant quant knowledge. I think cryptography and stats/data analytics might also be good areas for a mathy dev.
For example, consider learning vectors without the spatial/Cartesian visualisation as an aid. Or geometry without the visuals.
An "intuition" wrt skill can only come from experience - repeated exercises and practise. But before that, another kind of "intuition" can come from a useful mental model. Maybe at some stage working mathematicians stop using these models, but I reckon:
- They helped to learn the subject, in the early stages.
- They help in simple cases.
- They are not simply abandoned, but replaced with more powerful mental models.
This only works with visuals due to the relative simplicity of the topic, and simple visuals such as this are commonplace in modern textbooks and lectures. This, for example, is a visualization describing one-way functions with hardcore predicates from a lecture.
However, these visualizations fall apart rapidly as you ascend the mathematical ladder of abstraction. Mathematical nomenclature becomes overburdened by assumptions and, without proper rigor, becomes incredibly difficult and long-winded to explain. This is why newcomers find it impossible to pierce high-level mathematics: each rung of the mathematical ladder builds upon the last. How would you suggest a visualization that is useful for the Kelvin-Helmholtz instability, for example? You can look at all the visuals and simulations you'd like on Wikipedia, but unless you're a mathematical savant you'll have to dig deep into mathematical rigor, borrowing work done by giants in the past. There's really no easy shortcut to this.
> But before that another kind of "intuition" can come from a useful mental model
This mental model can be just as unhelpful as helpful. It is notoriously hard to fix false preconceived notions, and someone who develops an "intuition" that only applies at a basic level can easily be led astray, a la the Dunning-Kruger effect. Beginning tabula rasa is often the path of least resistance: once someone learns something /properly/ the first time, they're more likely to apply it correctly, rather than trying to apply a model that falls apart at higher abstractions. You can't really jump rungs in the math ladder, or even stave it off as a form of debt, telling yourself you'll learn it later.
My current idea is that math could be taught as a language and as a critical thinking class. A condensed class would look like: 'this is an equation... here is what we can do with it (derivatives, areas/3D/4D, etc.)... but... 99.99% of you won't need to know it this way. You need to use math in a way that indirectly teaches you how to creatively look at problems in life.'
I'm not sure why everyone is forced to learn math without knowing WHY they are forced to know it. Creative problem solving is one of the best takeaways, imho, for the masses.
As for Adv. Math... I think it's not effective for most people's career paths and the skillset they will require in the real world.
Terence Tao put it this way:
»The point of rigour is not to destroy all intuition; instead, it should be used to destroy bad intuition while clarifying and elevating good intuition. It is only with a combination of both rigorous formalism and good intuition that one can tackle complex mathematical problems; one needs the former to correctly deal with the fine details, and the latter to correctly deal with the big picture.«
I agree with Terence Tao's sentiments.
Math, for the masses, is a great way to abstractly teach the masses how to critically think about things. Math, for the masses, shouldn't get bogged down in the rigour. But if one were to go on to Adv Math, then yes, rigour is needed and demanded of the mathematician.
I disagree. I think maths are intrinsically complex. Some results may have intuitive geometric interpretations but if you want to understand the whole edifice, there's no shortcut, you have to absorb tons of theories.
Take probability theory and statistics: you can always see them as a set of recipes, but if you really want to make sense of them, you need to study maths for a few years.
So you can compare current mathematics to a certain programming language. Let's say it's like FORTRAN. There might be C++ for the same concepts, there might be Python, Smalltalk, Prolog or Haskell for the same concepts, but everything you read is in FORTRAN. And very few people like, or are capable of, reading FORTRAN.
The theorem is "there's no injective function whose codomain is smaller than its domain". It's not stated this way because mathematicians are snobs or to impress students! Abstraction is the very nature of mathematics.
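For a concrete feel for that abstract statement (it's just the pigeonhole principle in function clothing), here's a minimal brute-force sketch: enumerate every function from a 3-element domain to a 2-element codomain and confirm none is injective. The variable names and the tiny sizes are my own illustration, not anything from the thread.

```python
from itertools import product

def is_injective(mapping):
    """A function is injective iff no two domain elements share an image."""
    return len(set(mapping.values())) == len(mapping)

domain = [0, 1, 2]        # |domain| = 3
codomain = ["a", "b"]     # |codomain| = 2 < 3

# Enumerate every function domain -> codomain (2^3 = 8 of them).
all_functions = [dict(zip(domain, images))
                 for images in product(codomain, repeat=len(domain))]

# The theorem says none of them can be injective.
assert not any(is_injective(f) for f in all_functions)
print(f"checked {len(all_functions)} functions, none injective")
```

Of course the abstract statement covers infinite sets too, which no brute-force check can; that gap is exactly why the general, abstract formulation earns its keep.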
"Abstraction in mathematics is the process of extracting the underlying essence of a mathematical concept, removing any dependence on real world objects with which it might originally have been connected, and generalizing it so that it has wider applications or matching among other abstract descriptions of equivalent phenomena."
Surely that's because of history; we don't teach it that way because then you lose the links that were [much] later found with other areas of maths -- isn't it the linking into different areas that provides all the power? We want current students to understand a far wider curriculum and realise the links that come out of those abstractions, no?
I guess it's like whether you teach grammar to language students or hope that through language use they'll derive their own abstractions that allow them to understand the grammar sufficiently to say things that they've never heard before.
From a history perspective we probably don't know how they came up with the idea, even if their journals (!) had a specific derivation of a proof then that wouldn't mean that was their initial direction of travel necessarily.
Also, yes, introducing the simplest version of a concept using examples before the most general version is a good thing. This is a recommendation commonly made in mathematics exposition. For instance Arnold, a Russian mathematician known for his insistence on examples, introduces groups as a bunch of permutations closed under composition, and a manifold as a smooth subset of R^n.
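Arnold's "permutations closed under composition" framing is concrete enough to check by machine. A toy sketch (my own construction, just illustrating the idea): take all permutations of {0, 1, 2} as tuples and verify closure, identity, and inverses directly.

```python
from itertools import permutations

# Arnold-style concrete "group": the set S_3 of all permutations of {0,1,2},
# each represented as a tuple p where p[i] is the image of i.
perms = set(permutations(range(3)))  # 6 elements

def compose(p, q):
    """(p . q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(len(q)))

# Closure: composing any two members lands back inside the set.
assert all(compose(p, q) in perms for p in perms for q in perms)

# In this concrete setting, identity and inverses come along for free.
identity = (0, 1, 2)
assert identity in perms
assert all(any(compose(p, q) == identity for q in perms) for p in perms)

print("S_3 is closed under composition:", len(perms), "elements")
```

The pedagogical point survives the translation: nothing here mentions the abstract group axioms, yet the structure they capture is all present and verifiable.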
There are situations when the abstract definition itself has value, even for expository purposes. For instance, the abstract notion of a group or manifold or vector space helps one to understand which constructions are manifestly invariant under different coordinates. Linear algebra is all about understanding this point.
The same point appears in programming, where the value of an abstract interface, which can be introduced by a concrete example, lies in the generality with which it deals with different examples. See Functor (Mappable), Monad, or Foldable in Haskell. A more common example is the Iterable interface, which can be illustrated via a list, but the value lies in the fact that the interface applies to many data structures.
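A quick sketch of that Iterable point in Python (my own toy example, not from the thread): write a function once against the iterable protocol and it works unchanged across several concrete structures.

```python
from typing import Iterable

def total(xs: Iterable[float]) -> float:
    """Written once against the abstract iterable protocol."""
    acc = 0.0
    for x in xs:
        acc += x
    return acc

# ...and it works unchanged across many concrete data structures.
print(total([1, 2, 3]))                # list       -> 6.0
print(total((1.5, 2.5)))               # tuple      -> 4.0
print(total({10: "a", 20: "b"}))       # dict keys  -> 30.0
print(total(x * x for x in range(4)))  # generator  -> 14.0
```

You could introduce `total` with just the list example; the payoff of the abstraction only shows when the other three lines come for free.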
Two more points: sometimes a concept is unsatisfactory because mathematicians haven't achieved a good understanding of it yet; it's just what was needed to solve some previous problem. Often later concepts (which one learns later in one's education, or which are newly discovered in research) clarify older unsatisfactory ones.
Also, the aha insight by which a seemingly abstruse concept becomes clear often depends on past work that has helped one internalize the details. After the insight, just a couple of words can stand for long statements. For instance, the word 'manifold' stands for what would be a complicated notion for 19th-century geometers, or, for a simpler example, 'local isomorphism' stands for a statement like the inverse function theorem. But if one goes to a new student and repeats the insight, they may not get it, as a certain amount of background work needs to be done first.
Famously, for example, Bertrand Russell and Alfred Whitehead proved in Volume II of their Principia Mathematica, using theorem 54.43 from page 379 of Volume I, that 1+1=2 (adding that "the above proposition is occasionally useful").
Now, that is clearly obvious to everyone, and yet what Russell and Whitehead achieved in the intervening 400+ pages was more than just obfuscation.
Try to use mathematics to describe an artistic work. Or even precise muscular movement of a human arm in a ballet in its wholeness. Good luck with that!
See also Hempel's raven paradox.
One of Erdős's quirky notions was THE BOOK, in which God collected the most elegant and wonderful mathematical proofs. He said "You don't have to believe in God, but you should believe in THE BOOK."
The book above collects some wonderful proofs that could have made it into THE BOOK.
Learning math outside of high school has also helped me identify 'snake oil' statistics in my industry and challenge the validity of information I've been presented as fact.
Now not all professors are great at this, but I would say that a great many would love nothing more than to talk about the things that they know very well.
Maybe the authors of the papers that I read aren't always that pedagogical, and I get totally lost when someone tosses in a variable only to half-heartedly define what it is a page later.
I think it's mostly due to that I suck at math, and need to figure out obvious things on my own - but perhaps also due to my programmer-view of the world were you typically define things before you use them...
But learning on your own is probably hard. I got irritated once when I needed some not totally trivial transformations for a GIS application. I spent some evenings repeating from my old books, but it was unfathomably boring, so I gave up as soon as I got my transformation working :-)
John Baez's recommendations: http://math.ucr.edu/home/baez/books.html
For theoretical physics 't Hooft's recommendations: http://www.staff.science.uu.nl/~gadda001/goodtheorist/
I don't care if I need 2 lifetimes to learn advanced mathematics. I might not even scratch it. It's the journey that counts to me - if I can learn one new tool, one new perspective of looking at problems and the world, I'm a very happy man.
The only person who loses out is my poor wife who must listen to my excitement and then has to go lie down for a bit because it's too much to digest.
Previously discussed on HN:
On the other hand, books written for mainstream math majors and graduate students are not necessarily the ones best suited for an autodidact. Perhaps the author of this post has selected those that are more appropriate, but I can't judge. Also, Springer is a great name in math books and you generally don't go too far wrong by sticking with them, but I've never seen their undergraduate series before. Perhaps they're more common in the UK than the US?
book that's more popular
Don't underestimate how much a strategically placed typo can confuse a learner.
Rule of thumb: avoid first edition maths books.
I can't speak for all universities over here, but my undergrad had a single textbook from what I recall. The rest were all printed notes or simply lecturers writing with astonishing speed on the blackboards.
There was certainly extra reading we could do, and I'm sure someone did. But most of the learning was from attending lectures and watching someone go through things step by step.
I'd be interested to know if other courses are more textbook based - although I know which I would choose.
The Springer SUMS series (which appears on this list a couple of times) is very nice for autodidacts, so your conjecture could well be correct.
On the other hand, if you have the time (and ability) to learn some of this material on your own, for a purpose other than competing for a highly paid job as a mathematician, great.
I was astounded by this: https://acko.net/blog/how-to-fold-a-julia-fractal/
Amazing intro to complex numbers.
They replied "I'd be equally happy if you made a charitable donation this holiday to someone or some organisation that can use it more."
I don't recall ever asking a lecturer anything. Whenever I'm stuck I like to double down and stare at the sheet until I make some progress.
And unlike you, I most definitely don't feel like a genius in class lol.
It’s great that he’s making the effort, but I’m not sure this is the most useful resource for a typical autodidact.
There's also programming. That's a whole can of worms in itself. There's both theory and practice, where I'd say the practice is far more important than it seems. You really have to have bashed your head against a wall (of your own making) to appreciate how to code in a sensible, maintainable way.
I'm already a dev, but trying to catch up wrt the math at the moment :-O
I'm finding there are a whole bunch of skills unrelated to most dev concerns. Looking at this: http://quantjob.blogspot.com/2011/12/how-to-avoid-quantdevel...
I think a lot of dev skills are "housekeeping" - VC (version control), commenting, testing, agile, automation, standards etc.
The quant dev stuff seems to be a lot more concerned with performance, correctness and accuracy, and the last two in particular are somewhat specialist dev skills, I think - I've found code derived from mathematical equations to be a little different from other code.
If anything it's knowing the plumbing that makes you productive as a dev, of any kind. You just can't get around understanding how branching works, or having some unit tests.
Very little of the work ends up being the bit you think you're there for. I suspect it's the same in many industries. My parents ran a restaurant, and there's a lot of cooking, but there's also a lot of driving to the wholesaler, picking out vegetables, cleaning surfaces before and after a day, doing the plates, accounts, and so forth.
Is this true of quant devs? Seems the focus is different - regular devs aren't as rare, so maybe the most important thing is producing a viable POC?
As for quants not knowing VC - did they come from a dev background?
No, and that was the problem. The more you venture into this field, the more you realise how much of it is coding. New idea about how price series X relates to Y? No use unless you can pull the data, do the transformations yourself.
Another critical problem is that when you're unproductive, you make contortions to make your results "real". You make rationalizations that don't hold up, because OMG it's a lot of work to test some more ideas.
But once this stuff clicks it becomes very easy to teach yourself. I've been learning stuff like quantum algorithms, network analysis, etc.
Do you think this would work well? Obviously it costs money but I'm guessing the rate wouldn't need to be too high to make it worth their time. They wouldn't need to do any preparation, just have a good grounding in the language of maths.
In 2010, I was very interested in the foundations of mathematics, an extremely abstract branch of math:
In particular I spent huge amount of time on:
https://en.wikipedia.org/wiki/Nicolas_Bourbaki (Set theory)
What attracted me is that these books don't require any specific knowledge of classical math, i.e. they are self-contained.
It was fun and ... quite the experience to delve into a highly abstract view of all of math.
The big problem is that while I read that for more than a year, I had no experience in problem solving and just ignored the exercises (thinking that the concepts were everything). As a result, my entire knowledge has completely evaporated and I literally can't solve any of the exercises.
After that year, I dropped math till recently.
Now, I have a completely different approach. I'm learning elementary olympiad-style math and, most importantly, solving problems all the time. Currently, I'm into a series of books:
These books are made for math olympiad preparation. While I'm solving exercises, I can feel how solid my knowledge is.
So if you want to learn advanced mathematics, learn elementary olympiad-style math first. It will give you solid background to start learning advanced math (not just knowledge background but most importantly problem solving skills).
I recommend (surprise surprise) programming. Implement fast fourier transform in C and then Common Lisp. Write a finite difference PDE solver. Try solving actual problems to motivate you. Signals analysis can be a fun way to exercise your knowledge. Try analyzing your favorite songs and figuring out what makes them sound the way they do. Maybe implement some audio filters. If that's not your cup of tea, write physics or chemistry simulations instead. Then use OpenGL to visualize them. Then make them interactive.
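In the spirit of the suggestion above (the thread recommends C and Common Lisp; Python is used here only to keep the sketch short), a first step toward "implement the Fourier transform and analyze signals" might look like this: a naive O(n^2) DFT applied to a synthetic tone, finding its dominant frequency. The sample rate and frequency values are arbitrary illustration choices.

```python
import cmath
import math

def dft(signal):
    """Naive O(n^2) discrete Fourier transform; a first step before a real FFT."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A pure 5-cycles-per-window sine, 64 samples...
n, freq = 64, 5
signal = [math.sin(2 * math.pi * freq * t / n) for t in range(n)]

spectrum = dft(signal)
# ...should have its strongest bin at k = 5 (ignoring the mirrored half).
peak = max(range(n // 2), key=lambda k: abs(spectrum[k]))
print(peak)  # 5
```

Once the naive version matches your expectations on known inputs like this, porting it to C and then optimizing it into a genuine FFT (Cooley-Tukey) becomes a well-motivated exercise rather than a blind transcription.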
I can go on and on, but I'll just leave two book recommendations for those who might enjoy programming advanced mathematics.
Structure and Interpretation of Classical Mechanics
Functional Differential Geometry
I'd pay handsomely for a personal tutor / teacher.
You might still get some exposure to discrete mathematics once in a while. Statistics is always there to help you, some people avoid it, some others embrace it.
how do you survive as a software engineer without data science?
I love this sentence. I'm not sure it is true, but it is nonetheless a great sentence.
Net: What I found was not "hot" but ice cold.
In contrast, early in my career around DC, for applied math for US national security and NASA, in one two week period I went on seven interviews and got five offers. In four years, my annual salary increased by a factor of 4 to six times what a new, high end Camaro cost. That was "hot".
When I went for my Ph.D. in applied math, I'd read E. O. Thorp, who had basically an early but essentially correct version of the Black-Scholes option pricing model. In the back of his book, he mentioned measure theory. So, I dug into Royden's Real Analysis, and in grad school I got a really good background in measure theory, probability, and stochastic processes from a star student of E. Cinlar, long in just those topics and the mathematics of operations research and mathematical finance at Princeton.
In more detail, about 1992 to 2000, after my Ph.D., I tried to get into finance in NYC as a quant. My Ph.D. dissertation research was in stochastic optimal control, with careful attention to measure theory and the relatively obscure topic of measurable selection and with a lot of attention to real world modeling, algorithms, and software. I had a good background in multivariate statistics and time series techniques, an especially good background in advanced linear algebra and numerical linear algebra (e.g., numerically exact matrix inversion using only double precision machine arithmetic and based on number theory and the Chinese remainder theorem), double precision inner product accumulation and iterative improvement, etc.
So, I sent nicely formatted resume copies, in total 1000+.
I have held US Federal Government security clearances at least as high as Secret; never arrested; never sued; never charged with worse than minor traffic violations; never bankrupt; good credit; physically normal; healthy; never used illegal drugs or used legal drugs illegally; married only once and never divorced; etc.
(1) I got an interview at Morgan Stanley, but all they wanted was software development on IBM mainframes (where I had a good background at the time) with no interest in anything mathematical.
(2) I got a lunch with some guy who claimed to be recruiting for Goldman Sachs, but, except for the free lunch and what I had to pay for parking in Manhattan, that went nowhere.
(3) I had a good background in optimization, published a nice paper in JOTA that solved a problem stated but not solved in the famous paper in mathematical economics by Arrow, Hurwicz, and Uzawa.
So, for mathematical finance, I got a reference to Duffie,
Dynamic Asset Pricing Theory,
Princeton University Press,
Princeton, New Jersey,
and dug in: The first chapters were heavily about the Kuhn-Tucker conditions, that is, the topic of my JOTA paper. By the end of the chapter, I'd found counterexamples for every significant statement in the first one or two (IIRC) chapters. I had to conclude that Duffie was not a good reference for anything good!
(4) Headhunters: I tried them, especially the ones claiming to be interested in technical work, computing, etc. They were from unresponsive down to insulting. It wasn't clear they had any recruiting requests.
(5) In those days, getting names and mailing addresses of hedge funds was not so easy. But I did get some lists and mailed to them. Got next to nothing back. I didn't hear about James Simons until well after year 2000.
(6) Right, there was Black-Scholes. Well, of course, that was Fisher Black at Goldman Sachs. So I wrote him and enclosed a copy of my resume. I got a nice letter back from him saying that he saw no roles for applied mathematics or optimization on Wall Street.
So, I gave up on trying to be a quant on Wall Street!
So that was 1992-2000, 8 years, 1000+ resume copies, and zip, zilch, and zero results.
Curious that the OP thinks that 1997 was a "hot" year for applied math on Wall Street.
Now I'm an entrepreneur, doing a startup based on some applied math I derived, computing, and the Internet! To heck with Wall Street: If my startup is at all successful, I will make much more money than I could have on Wall Street. And I don't have to live in or near the southern tip of Manhattan and, instead, live 70 miles north of there in the much nicer suburbs!
Lesson: Take the OP with several pounds of salt!
I also just left without reading it, due to the ads.
I'm actually in the process of overhauling the design of the site, particularly with regards to mobile, as the current Bootstrap-derived design pushes all sidebar content to the top on mobile/tablet.
That said, I'm not sure of the value of linking to part 3, unless the idea is it is the latest and already links back to parts 1 & 2.
Each part of the series is designed to cover the "typical" modules on a UK four-year undergraduate Masters of Mathematics degree.
However, it can be challenging in the latter two years to include a broad enough set of modules to cover all interests, so I have had to stick to those "core" modules likely to be found in many degrees, as well as those more specifically related to quant finance and machine learning.
However, it is my hope that individuals will be motivated to look at other areas as well, even if they're not directly related to career paths!
It seems as of late, especially since 2013, there is huge demand for learning complicated mathematics, coding, and trading algorithms. It's like the AP math class of high school, but as of 2013 expanded to include almost everyone, not just a dozen students lol. This recent obsession with math and finance is described in more detail in . People observe and read headlines about high-IQ founders, venture capitalists, and coders making tons of money in Web 2.0 (Uber, Pinterest, Snapchat, Dropbox, etc.); STEM people getting tons of prestige, status, and global notoriety for their findings (arXiv physics and math papers frequently go viral); and how the economy, especially as of 2008, rewards intellectualism and STEM in terms of higher wages and surging asset prices - stocks (the S&P 500 has nearly tripled since the 2009 bottom), Web 2.0 valuations (Snapchat is worth $15 billion, on its way to $50 billion), and real estate (Palo Alto home prices have doubled since 2011). Understandably, many people want a piece of the wealth pie. They see that intellect - which includes STEM, finance, and also quantitative finance - is the path to both riches and social status (as embodied by wealthy geniuses like Musk, Thiel, Zuckerberg, Shkreli), which is why there is so much interest in these technical, difficult subjects, unlike decades ago when only a handful of people were interested.
But another question is: does algorithmic trading work? I don't know for sure, but I think a lot of money is made in market making (Citadel comes to mind), which tends to fall under the umbrella of algorithmic trading - the two are closely related. And the math involved has much less to do with differential geometry and number theory and more to do with statistics and linear algebra (such as analyzing correlations between data). This involves a lot of trading and paying constant attention to order books - it's a full-time job. I don't think it's as glamorous as many think it is, and I'm not sure the returns are worth the effort. There are simpler methods, based on mathematics such as ETF decay, that can also generate very good returns and don't require full-time trading. Here is one: http://greyenlightenment.com/post-2008-wealth-creation-guide...
Obsession with math? Where do you live where people have math obsessions? Where I live, STEM graduates are a massive minority (we had an HN article on the front page about this very topic just yesterday). Mathematics is some sort of taboo magic in the eyes of 99.99% of humans, not something anybody studies to gain prestige, status or global notoriety. Your average citizen can't even name one mathematician.
> People observe, read headlines about high-IQ founders, venture capitalists, and coders making tons of money in Web 2.0 (Uber, Pinterest, Snapchat, Dropbox, etc.); STEM people getting tons of prestige, status, and global notoriety for their findings (arXiv physics and math papers frequently go viral); and how the economy, especially as of 2008, rewards intellectualism and STEM in terms of higher wages and surging asset prices (like stocks (the S&P 500 has nearly tripled since the 2009 bottom), web 2.0 valuations (Snapchat is worth $15 billion, on its way to $50 billion), and real estate (Palo Alto home prices have doubled since 2011)), and, understandably, many people want a piece of the wealth pie. They see that intellect - which includes STEM, finance, and also quantitative finance - is the path to both riches and social status (as embodied by wealthy geniuses like Musk, Thiel, Zuckerberg, Shkreli)....
There is NOTHING genius about narcissistic photo-sharing websites or Snapchat. These are just illusory innovations, a fool's paradise for the masses. Not only that, but the "social status" of these founders you mention has rarely left the confines of the tech world anyway; if you want social status in our society, go to acting school and move to Hollywood. I mean..... this world you mention where STEM students gain so much prestige and math papers go.... viral? Where is this world? What planet are you posting from? Which galaxy is it located in? Is this post of yours real life or am I dreaming?
The author is called Michael Halls-Moore. Why would a person called Michael Halls-Moore write a sentence like this? I'm genuinely curious, coz I'd assume this is a native English speaker.