I started a Math degree after 16 years of programming without any Math beyond high school (the highest being high school calculus). Most of my work as a software developer didn't require any "higher" Maths.
Once I began studying math, including Modern Algebra, Analysis, Graph Theory, Category Theory, etc., I realized I understood many topics on an informal level, in a non-rigorous sort of way, through programming. I had a good sense of major algorithms and data structures as well as their running times. Once I did have more math under my belt, things did become easier, and I started to see connections and commonality between problems across different domains, i.e. more than one way to skin a cat.
Part of the reason I began studying math is that I felt it was my limiting factor. The range of problems I could tackle as a programmer was limited by math. It turns out this was partly true.
The biggest misconception is that in Math there is one "correct" answer. This is almost never the case. Some of the most interesting solutions in Computer Science come directly from Math topics that were once considered "abstract". Likewise, some of the most interesting problems are solved through approximation algorithms of seemingly intractable problems, often requiring a bit of "hacking" and real world experience beyond what you'd get from a formal education in Math or Computer Science.
Some things I'd add:
1) Math is fun! If you have the aptitude and disposition to enjoy writing software you'll love working out math problems. They're little nuggets of mental stimulation that you can work on with just some paper, a pencil, and maybe a pocket calculator.
2) You're spot on about an experienced programmer already having an intuitive but non-rigorous understanding of many concepts. It's mostly a matter of learning to read and write comfortably using the notation, which is really similar to learning the syntax and semantics of a big computer language with poor reference material.
3) You really have to have basic math down. This means going back and re-learning stuff like using FOIL to multiply two binomials, or dividing by a fraction by multiplying by its reciprocal.
4) Calculus and Linear Algebra are the father and mother of applied math. You'll save yourself a ton of grief if you learn them first (and I mean really learn them; maybe you took a calculus class in college, but can you apply the chain rule right now, as in the quick example below?). I'm currently learning Linear Algebra, which is something I should have done years ago. Part of the problem with self-teaching is getting things out of order.
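As a quick self-test of what I mean by "really learn" (my example, not from any particular course): the chain rule says d/dx f(g(x)) = f'(g(x)) * g'(x), so

    d/dx sin(x^2) = cos(x^2) * 2x

If that takes more than a few seconds to recall, it's worth the review.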
Theorem: Let x be an integer. Then x^2 is even if and only if x is even.
It seems so simple, and I think it would be accessible to anyone who had completed high school algebra, but I found that even having done those calculus and linear algebra courses, I had no idea how to go about actually PROVING this! The book, however, goes through the thought process step by step, teaching the skills needed to be able to understand the real math books like Rudin.
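For anyone curious, here's a sketch of the standard argument (one common route; the book's own walkthrough is more detailed). Prove each direction, using the contrapositive for the harder one:

    If x is even:  x = 2k      =>  x^2 = 4k^2 = 2(2k^2), which is even.
    If x is odd:   x = 2k + 1  =>  x^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1, which is odd.

The second line is the contrapositive of "x^2 even implies x even", and together the two lines give the "if and only if".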
Even more than the chain rule, Taylor series approximations are what I constantly see applied in computer science and applied math.
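To make that concrete, here's a minimal sketch (my own toy example): approximating exp(x) with a truncated Taylor series about 0.

    # Approximate exp(x) by its first `terms` Taylor terms about 0;
    # the cutoff of 10 terms is arbitrary.
    import math

    def exp_taylor(x, terms=10):
        return sum(x**n / math.factorial(n) for n in range(terms))

    print(exp_taylor(1.0), math.exp(1.0))  # 2.7182815... vs 2.7182818...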
Linear algebra certainly has applications in some of the above. But I don't think that calculus & linear algebra can be fairly described as "father & mother" to these areas. (Am I wrong? I could be missing some connections; I'm not a mathematician.)
I would say graph theory is part of combinatorics, and set theory is part of logic.
Category theory was born out of trying to abstract the relationships between different objects in abstract algebra, so is kind of the child of abstract algebra and logic. I think it's fair to say the parents of abstract algebra are combinatorics and linear algebra.
Number theory at an elementary level is combinatorics, but at higher levels branches into analytic number theory (Calculus) and algebraic number theory ((Linear) Algebra).
> You'll save yourself a ton of grief if you learn them first
Baby Rudin and Axler are used currently by Harvard Math 55 to teach those subjects. Rudin might not be very didactic (I would be happy to hear about alternatives), but Axler is a fantastic choice.
For a quick intro to Lebesgue integration you can read the beginning of Rudin's "Real and Complex Analysis" or Halsey Royden's "Real Analysis".
I haven't read Axler's book. I liked Hoffman and Kunze's "Linear Algebra".
It's fun because it's incredibly rewarding! The elation of the "a-ha!" moment in math is second to none.
> 4) Calculus and Linear Algebra ...
Though I wouldn't get too caught up in the rigor of analysis or vector spaces right away. If you are self-studying, just spend enough time to feel confident computing and manipulating integrals, differentiation, and matrix math.
Then find a good intro to discrete math textbook covering a wide range of topics: number theory, graph theory, logic, set theory, etc, and learn how to write a "good" proof. This will open up a number of mathematical doors.
This has been my biggest realization as I started learning more math. What before seemed very arbitrary and unrelated becomes much more interesting and exciting once you have a bit of background. Unfortunately, I don't know of any way to get people to see the fun in math until they already know quite a lot of it... this was my experience at least, and seems to be pretty common among people who didn't gravitate towards math immediately.
I haven't really found any real world applications of the concepts I've learned, aside from having to hold a meticulously constructed symbolic reasoning world inside my head for a really long time without observational reality confirming its correctness as a model to describe all things. This makes me pretty good at programming things that are incompletely described, I think, but also explains why Tarski said he was the only sane logician.
I never really hear about autodidacts talking about their experience. It can be really rough most of the time. I literally think it's just luck that I stumble across the right words. I also think it's luck when I manage to understand things and make a connection between them. I have managed to connect such disparate symbols together and maintain that connection strongly for long periods of time (with absolute conviction), that it all really seems like magic when it does work. But, giants, shoulders, yada yada.
Out of curiosity, could you give an example?
I use computer science to explain psychology, in a way that makes the person being judged correct, instead of requiring their behavior to be altered based on personal opinion.
Imagine you have two conflicting sets of data from observation in your mind, and you have to process this data quickly. Taking an arbitrary and insufficient amount of data is selective and results in bias. Over time this results in contradiction even though both instances of inference are correct with regards to the logical model they rely on, and the data fed to the model. Now imagine that you received this data because, over a short period of time, you have experienced such a wide range of life events that your observations allow you to collect both sets of data simultaneously and with correctness. Both data models model the world correctly, but when separated into distinct models of 'knowing things' instead of 'one confusing mass of data', you get contradiction.
So imagine someone endures trauma in their life, and has their mind molded in a specific way based on the current state of psychology, because over time the thoughts in the patient's mind are gradually transformed into the thoughts in the therapist's mind. Psychology did not experience the trauma, so how can psychology have an opinion on the consequences of bad things happening?
Making inferences adds to data and alters future data models and inferences. How people are judged while they are being 'helped' affects whether that help harms or helps them. I was in a group therapy for victims of domestic abuse and my "counselor" told me that she hated people like me.
Would you mind expanding on this? I've often thought about going back to school for a math degree, or at least for the core degree courses.
Did functional programming, particularly in a pure language like Haskell, become easier to reason about once you had studied Category Theory in depth?
Well written...and I'd argue the same is for both Computer Science and software engineering in general. When teaching beginners, it still astonishes/annoys me how many students tell me, "My program didn't work"...as if there was just one reason why it didn't work, as opposed to hundreds of possible reasons.
Examples: there's almost no math running Hacker News. There's no math in programming most blogs. There's no math in most apps. There's no math in most text editors. etc etc etc. Most programs don't need anything more than arithmetic.
I'm not saying math won't help with lots of problems. Like you said, you found it limiting at some point. But you managed 16 years as a programmer without much math. I'm in a similar boat. I've shipped 17 commercial games, written 6 game engines, worked on Chrome for 5 years. My math sucks. Would I be better if my math was better? Of course! But that I've been productive without much math knowledge shows, as at least one data point, that you don't have to be good at math to program.
I am much in agreement with this. I know that the guidance counselor on staff when I was in high school would heavily steer people away from going into computer science/programming if they hadn't completed the entire catalog of math classes available at our school. Her assumption was that you needed to be some sort of math wizard in order to be successful in a CS or programming degree.
On top of that, lots of game engines aren't 3D. 2D Mario? No serious math in there, at least not the SNES/NES ones. The 2D Zeldas? Even less. 2D Metroid? Probably less than Mario. Those games didn't use real maths for physics, which is about the only place they could possibly use anything more than basic arithmetic.
I started studying math intensely (doing every exercise in books, etc) when I realized the same thing: math was a limiting factor for my programming ability. Michael Abrash hints about this in some article, and I sneered at it until I realized it was true.
I considered going back to do a math degree but the amount of hassle involved, as well as other life changes required, made that impossible.
I would like to know how it worked out for you. Are you glad you did a degree program? Do you feel you met people and made connections that were valuable, that couldn't be made by an autodidact?
I did try to learn math as an autodidact before I began the degree. The more abstract the math, the harder it was for me to self-study. It was inefficient at best. At worst, I'd hit a wall and not have anyone to reach out to.
The other thing is that in Math you are often dealing with the same question but with very different objects or variable types. The same question, asked where your numbers could be real or complex, integers or finite fields, vector spaces or topological spaces, etc., changes what the "correct" answer to the question might be.
While this is true, I've always found applying mathematical analyses such as algebraic reasoning to computation admits a single, minimal, canonical solution in the end.
Edit: Calling it a mini-TAOCP of most of the maths needed for physics/EE work might be a bit of a stretch, but I've yet to see another maths text that does better as a highly readable, self-contained and compact reference.
Edit2: I moved house once and thought I'd lost my copy from university. I eventually found it, and yes, I have two copies... It's that important to me for brushing off the things I've forgotten :)
The book has many worked examples, and the extensive end-of-section questions have the answers in the back of the book (for every 2nd question). This means you can learn by "reading then doing", and see if you have got the answers right - something many textbooks lack.
When I try to learn from other technical books, I often find myself thinking "I wish they'd written this in the same style as Boas".
It got me wondering...suppose there were a website for autodidacts in math and similar topics? Something where people could post and discuss their answers to exercises. It'd solve the whole problem.
Would textbook publishers sue?
Or they could just trust the students. At my university the honor code is such a big deal that they let students take closed-book tests at home.
Maybe DennisP's idea could do the same thing - only post answers to the odd-numbered questions. Of course, DennisP's scheme would only work for books that actually have decent end-of-section questions, unless people made up extra questions as well ...
* I say much rather than most or all since it's focused on asymptotics, recurrences, number theory. Modern theoretical Computer Science draws on a much wider variety of mathematical methods.
I honestly believe math language is seriously outdated. It's like using COBOL to express everything. Yes, you can do that, but would you really want to, given a choice? The most trivial things are so insanely complicated in math it's unbelievable (try to describe geometrical objects with the current formal language of math if you do computer vision), yet there is very little work on developing a better formal language for math.

It's like with Turing machines and complexity theory: who is going to move around a tape in the real world besides some specialized biological systems, not to mention magical 'oracles'? Those abstractions were useful in their day and bore fruit, but why do we still stick to them and just widen the gulf between a more and more closed-unto-itself theory and reality? Yes, it's great that some theory is super cool, but what do we do when we find in 20-30 years that the set of objects satisfying this omnipotent theory is empty? And when somebody like Mochizuki invents his own formal language to solve some cool problem like the ABC conjecture, we all hate him, refusing to read the proof because it doesn't follow our outdated formal ways...
All that said, I think there are a number of misconceptions here.
First, mathematicians invent new languages all the time. That's the point of definitions, otherwise we'd be using sets to describe everything. The problem is, you first have to understand the concept well in order to apply suitable definitions. Think probability before Kolmogorov.
Second, Turing machines are a formalism to introduce you to the theory of computation because they are the simplest (or close to it) thing that can compute in the current sense of the word. Once you learn how TMs work, pretty much everyone just accepts them as a given and deals at a higher level.
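For anyone who hasn't seen one, here's a toy machine (my own example): a rule table of (state, symbol) -> (write, move, next state) plus an unbounded tape really is all there is to it. This one increments a binary number stored least-significant-bit first.

    from collections import defaultdict

    def run(rules, tape, state="carry", pos=0):
        cells = defaultdict(lambda: "_", enumerate(tape))  # blank cells read "_"
        while state != "halt":
            write, move, state = rules[(state, cells[pos])]
            cells[pos] = write
            pos += move
        return "".join(cells[i] for i in sorted(cells)).rstrip("_")

    rules = {
        ("carry", "1"): ("0", +1, "carry"),  # ripple the carry along
        ("carry", "0"): ("1", 0, "halt"),    # absorb the carry
        ("carry", "_"): ("1", 0, "halt"),    # grew a new high bit
    }
    print(run(rules, "11"))  # "001", i.e. 3 + 1 = 4, least significant bit first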
Third people are trying to read Mochizuki's proof, but it is very hard. He basically invented his own way of doing things and so to understand his proof you first need to understand his methods. It's understandable that professional mathematicians with their own careers and areas of research find it hard to read the ~1000 pages of dense mathematics (proof + prior papers) to understand what is going on. Most people probably haven't read 1000 pages of math in their life, and it takes a while to come to terms with it no matter how smart you are.
I skimmed SICM (Structure and Interpretation of Classical Mechanics) just to get an idea of how they represented the Lagrange equations in Scheme and went from there.
The same is true of mathematics. It might be hard to get everything precise, but that's not the point of mathematics. Nor is the point to be close to reality.
A more elegant model of computation is the lambda calculus. It's also very bare-bones, but it's easy to imagine writing real programs in it. Functional programming languages, at their core, are just lambda calculi with some syntactic sugar.
It's practical, but it's also a good foundational model. It's easy to reason about, due in part to the fact that it models familiar math (partial functions).
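As a tiny illustration of that claim (my own sketch, in Python rather than a pure functional language): Church numerals encode numbers as functions, and arithmetic falls out of function application alone.

    # A Church numeral n is the function that applies f to x exactly n times.
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    to_int = lambda n: n(lambda k: k + 1)(0)  # decode back to a Python int

    two = succ(succ(zero))
    three = succ(two)
    print(to_int(add(two)(three)))  # 5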
Not to mention there are some issues with formal logic that might cause you problems (hint: why do medical doctors use counter-factuals and not mathematical logic?)
"Beware of bugs in the above code; I have only proved it correct, not tried it" -- Donald E. Knuth
I thought it was just an advertisement for Computer Concrete Roman (Knuth's other font family) and the Zapf(?) Euler math fonts.
^ Yes, that Ron Graham.
^^ Yes, that Don Knuth.
^^^ I don't recognize Patashnik. Sorry.
The thing I have needed all the time is statistics and prob. theory, that keeps coming up literally everywhere. Calculus - not so much. If you need something, you can always re-learn it quickly. (For example, I learned quite a lot of linear algebra, but haven't used it for ~10 years, so when I had to write some 3D gfx/shader code, I had to spend like two days on quaternions, etc.)
Being an engineer means using technology, science, and mathematics to solve problems. Well, in many cases you don't need math to solve these problems. The word itself has no root in math either; it's based on the Latin for devise/contrive. Sure, it was used for builders at first, where math was important, but context changes.
Also, Software Engineer is wide and reaching: a 3D GFX Software Engineer will need some heavy mathematics to do his job and do it well; a Web Software Engineer, not so much, but he'll need to know a wide range of things the GFX guy doesn't (HTTP, network protocols, various languages, server technology, database technology, browser knowledge, ...).
It's not about being good at mathematics, it's about knowing as much as possible about the domain of knowledge your role entails, and surprise, not every one of these requires deep math knowledge and understanding.
Your attitude is pure and simple elitism.
That's called practicing engineering. An engineer is a professional practitioner of engineering. You can practice engineering all you want, but if you're not a professional (having received an engineering degree from a certified university), it's dishonest to call yourself one. Honestly, it is elitist. But those of us who obtained our degrees worked our asses off. And I personally hate when people abuse the term to mean anything that took skill. "Candy cane engineer". "Beats engineer". "Drink mixing engineer". It's linguistic inflation to make yourself sound more important/skilled.
But my years of engineering experience have shown me that the title "Engineer" is really more about how you approach problems and what you do to solve them and less about degree credentials. Now, the following example is in the context of electrical engineering on airplanes, not computer programming, but I think it holds true.
One of the best people I work with does not have a college degree. But over years of self-study and real world experience, he has taught himself electronics, some computer programming, and enough mathematics to get by. And when there is a technical problem to be solved on one of our airplanes, he will chase after it relentlessly, and smartly, until it is solved. His system designs are clean and well thought through. He has taught me much about designing for real world implementation. Is he not an engineer? He does more than many of my coworkers who are degree holding EEs. I am not afraid to call him an engineer, because he has earned the title in a different way.
However, I still think it's wrong for people who don't exhibit these qualities to call themselves an engineer. If you make sick beats on your macbook, that's great. But don't call yourself a beat mix engineer.
Edit: Thought of some reasons
It's similar to the "doctor" title. You can be the world's greatest doctor. Self-taught, you can do everything from intubation to surgery. However, you're still not a doctor. You practice medicine. Why? There are things that you can only learn from someone who is more skilled than you, and who is skilled at teaching. That's what a professor is (simple definition). They are an authority on their topic and are the best place to learn from. They teach things that books don't cover. They have experience. They can tell you when you're wrong, and unlike a book can teach you the most current standards and techniques.
Another point is the completeness of education. Your coworker, does he know vector calculus? Linear algebra? The forward-active voltage for a BJT? Maybe. But there's no guarantee he does. A degree from a certified university guarantees that you know the salient points of your field (not always true, but for my argument it is). If you don't have a degree, there's no guarantee. And this knowledge is important.
And as for experience, well, in the case of a CS student wishing to enter industry there's a good chance that the majority of your professors never even worked in industry. So if you're looking for people with experience to learn from, well, then you're in quite an unfortunate situation.
This led me to conclude that a degree offers no such guarantee that someone knows something. It offers a guarantee that someone was introduced to a number of concepts and demonstrated an understanding (or knack for cheating, cramming, what have you) good enough to pass and move forward. This is why I shudder at the thought of hiring old classmates who had to be hand-held through their 4 (or more) years of university, I know better despite what their degree might say.
Which is why people who earned a degree...earned a degree, that's it. As far as I can tell they have no right to call themselves an engineer until they begin to practice engineering and practice it well enough to demonstrate the value of their thinking.
But the best professors I've had (and I've had a bunch) were the ones that really did combine both. (And frankly, I can forgive a lot of poor teaching in return for a "well, this is technically true, but no one really does it that way; they use this shortcut....")
As for a university education not providing immediately applicable industrial experience, well, that's kinda not the point of it. Sort of the difference between passing the FE exam and being a PE.
I agree with you, for the purposes of your argument, that degree should serve as a guarantee. It is an important signifier of mastered domain knowledge, and more importantly, a signifier of the ability to master new domains.
EDIT: That last sentence was a run-on, fixed it.
So basically, don't call yourself an engineer unless you're willing to sign off on something, and be legally bound by it. This implies a strong background in problem solving and structured design processes, to remove as much risk (both personal and to the public) as possible, which is also vital to engineering.
>So basically, don't call yourself an engineer unless you're willing to sign off on something, and be legally bound by it.
I'm sorry to say it (not really); but for a number of reasons, some good and some bad, the title has been co-opted, and there is no going back.
>This implies a strong background in problem solving and structured design processes
As far as that goes, I've met a number of pedigreed folks who can't engineer their way out of a wet paper bag.
To some extent this happens in Australia as well, although there is a movement both to require things to be signed off by a PE (or CPEng here), and to have those engineers provide documented supervision of the work they sign off on.
> As far as that goes, I've met a number of pedigreed folks who can't engineer their way out of a wet paper bag.
No argument there, certification is never proof positive of competence. I've met very good engineers who aren't Engineers with a capital E, and very bad Engineers who knew enough to fool a test board, but not much more.
The existence of these licensing schemes is far from perfect, but better than nothing IMO. Applying the concept to general purpose software is another discussion entirely!
If he's as good, and as experienced as you say, he should be able to just do the legal/ethics stuff and get a license. Unfortunately, the professional associations are streamlined for people who take the usual path through university. At least in my jurisdiction, it is technically possible to have the experience counted, rather than the degree, but it's much harder.
The system is clearly not perfect, but when you ask the average engineer whether it's ok to do things like lie about their experience, you will get very different answers than if you asked the general public. The gatekeeper is doing a real job, even if they don't do it perfectly.
I applaud the effort to learn the topics of "statistics, probability, and linear algebra", but these would have been relatively fundamental courses in most software/computer/electrical engineering curricula that I've known about, and most definitely a prerequisite to calling oneself an engineer.
Don't go near any dark alleys.
Definitely semantics. For example, I have a BS/MS in applied math from a good engineering school. I am a software engineer mainly working with ECEs, physics, and other math guys, who are all "software engineers".
I see where you're coming from about being a SW eng without knowing a lot of math - but there are other majors - math/physics/stats etc that will be very math heavy and not "engineering degrees".
>When you get an engineering degree from a certified university, you are an engineer.
That's not true. When you get a degree you have an engineering degree. When you get hired and employed as an engineer, you are an engineer. You have an engineering degree and I have a mathematics degree. Our employers hire people for engineer positions and call them such. If I were a professor of math, I could call myself a professor. If I were a mathematician at the NSA, a Mathematician. But my employer calls me an Engineer. The degree does not do that.
And I'm not merely referring to software. The above is true for my company's 20+ different engineering positions across all disciplines they hire for (Aerospace, mechanical, electrical, materials, software, etc).
The other reason, of course, is that the "software engineer" term comes from a group of people who really wanted the respect that comes with "engineer" but realized that the big three don't get very far, software-wise. (And coincidentally didn't want to do all that icky math stuff. Not to mention much of the icky programming stuff.)
A filter / weedout system was required because of too many students, so it's turned into something else entirely, and now an engineering degree often means nothing other than having passed the weedout math classes. It really shows in some new grads who don't have any actual engineering skills but are really good at calc problems.
Lived and breathed math for 5 years in college before I could look myself in the mirror and call myself a software engineer.
Not really. Mostly just algorithms and procedures of some math concepts(Calculus and Linear Algebra) which roughly corresponds to the first year of North American math major.
In contrast, I know engineers who live and breathe PDEs and tweak compilers to solve them faster.
I doubt my CS professors would be able to solve a PDE..
Beyond the basics, not even many math professors can do that. Math is too vast and people specialize. Strong algebraic geometers are not necessarily strong analysts or algebraists or logicians.
1) No such thing as a universal mathematician in this day and age.
2) Engineer's PDEs(algorithms) are not the same as mathematician's PDEs(theory). Same as comparing a student in China who learned English to communicate with English speakers to English majors from English speaking countries.
So yes, if you take a Computer Science major with a focus on Software Engineering, you will not learn enough math to minor in math "for free". If you pick a more mathematical subfield to focus on, you should probably declare a math minor for the one or two additional courses it will take you.
I made a truly stupid choice: I graduated in 7 semesters with a Comp Sci degree concentrated on PL theory without picking up the additional courses for a math minor. As a result, I'm "condemned" to learn that material independently later on. "Luckily", the Technion required me to do extra coursework for my MSc, so I've had to buck up and learn more theory.
Now get off my lawn until I'm done with my highly theoretical machine learning exam ;-)!
If you want to do well at machine learning...electrical engineering is probably a better choice; the maths learned in EE overlaps fairly well with what is needed to do ML. PL theory is quite niche, even for PL researchers.
Besides, I never heard of math majors taking cryptography that early. Usually, Intro to Real Analysis, Abstract Algebra and Abstract Linear Algebra come first.
Really glad to see I'm not the only one.
I don't have any recommendations for linear algebra, but for stats and probability (which I always found intimidating in the past), Allen Downey's "Think Stats" and "Think Bayes" did the trick.
For programmers with less math experience it serves as a perfect transition with which to then go into more theoretical aspects of LA.
He says: "Don't try to read it all. It's a map... It's there to help you keep track of where you are and where we're going." Every professor should do that for their course. And every department should put up a big poster with something similar: these are the subject areas you will study, how they relate to each other, and the courses that cover those areas; if you choose this specialization these are the areas you'll focus on. Put out a mind-map of the subject area that relates to the available courses—help students start building Elon Musk's mental-hyperloop / semantic-tree.
available on the course website: http://cs.brown.edu/courses/cs053/current/graphical-outline....
I'm at the very start of what I hope might be a similar journey, and have signed up for a Coursera "Introduction to Mathematical Thinking" course. I'm hoping it might give me some insight to build on. The course starts in about ten days, so apprehension hasn't kicked in yet.
I have significant trouble with doing this with mathematics, though. Programming languages (at least the ones I know) just don't seem to be good for expressing things like identities and invariants. Anyone have any suggestions on how to handle such things?
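One partial suggestion (a sketch, not a complete answer): property-based testing tools let you state an identity once and have it checked against many random inputs. With the Python Hypothesis library and a toy identity (reversing a list twice gives back the list), it looks like this:

    # Run with pytest; requires `pip install hypothesis`.
    from hypothesis import given, strategies as st

    @given(st.lists(st.integers()))
    def test_reverse_twice_is_identity(xs):
        assert list(reversed(list(reversed(xs)))) == xs

It's not a proof, but it does make invariants first-class citizens in ordinary code.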
One of the masterpieces that has gone the opposite direction is:
Mathematics: Its Content, Methods and Meaning (three volumes bound as one) by A. D. Aleksandrov, A. N. Kolmogorov, M. A. Lavrent'ev (18 authors total)
This book is a really good companion for autodidacts. It's basically an overview of mathematics.
There are some great textbooks translated from Russian. Analysis by Kolmogorov, (rigorous) Linear Algebra by Shilov, Complex Analysis by Markushevich to name a few.
The book covers too much to be thorough. Each chapter gives a good introduction to the subject matter and ends with a list of suggested reading.
I always read the relevant parts from this book before going deeper. Not everyone is going to delve into non-Euclidean geometry, functional analysis and topology.
Furthermore, I don't think the typical self-studying engineer on Hacker News wants to learn math through a rigorous introduction to analysis. You can get good working knowledge and intuition without knowing what a delta-epsilon proof is.
How to Think About Analysis by Lara Alcock.
Understanding Analysis by Stephen Abbot.
Mathematical Analysis and Proof by David Stirling.
Numbers and Functions: Steps into Analysis by Burn.
Analysis: With an Introduction to Proof by Steven Lay.
A First Course in Mathematical Analysis by David Brannan.
How do you know that?
A UNESCO study of translations between Swedish, Chinese, Hindi, Arabic, French, German and English over a decade showed that 104,000 of the 132,000 translations made between all those languages were translations from English.
Yeah, no. Usually good books get translated, period.
From Homer and the Bible to Pascal, Leibniz, Dostoyevsky, Gödel and Einstein, good books fly the other way around all the time.
Great books certainly get translated in every direction. But for merely good books, I wouldn't be surprised if readers outside the US consumed more books translated from English than readers in the US consume books translated from other languages.
I don't think so for the simple reason that academic books in English usually aren't translated because people in countries outside of the US can read English. Even academic books that do not have any native English speaking authors are usually written in English. Books are more likely to be translated to English than from English, because translating to English multiplies the size of the audience many times, whereas the other way around does not.
Considering that the average reader reads more in Europe than in the US and that there is a very dynamic domestic industry in many of these countries, I would say the opposite.
And I didn't even take India and China into account...
Mathematical Proofs: A Transition to Advanced Mathematics by Chartrand and others.
How to Prove It: A Structured Approach by Velleman.
Learning to Reason: An Introduction to Logic, Sets, and Relations by Nancy Rodgers.
Elementary Discrete Math books make for great intro to proofs and math thought:
Discrete Mathematics with Applications by Epp.
Mathematics: A Discrete Introduction by Scheinerman.
There are tons of terrific introductory books on other subjects in math if anyone's interested.
I'd add a few books to your list:
Graph Theory by W. T. Tufte: http://www.chapters.indigo.ca/en-ca/books/product/9780521794...
On Numbers and Games by John H. Conway: http://www.amazon.com/On-Numbers-Games-John-Conway/dp/156881...
And for a good re-introduction to geometry and its practical application to computer graphics: http://www.amazon.com/Primer-Graphics-Development-Wordware-L...
I think maths is amazing. It's a useful tool to have in one's arsenal regardless of your profession or walk in life. It's also not the sole domain of the highly educated elite. Anyone can do it.
It's not important to be "correct" all of the time so much as it is to be curious and willing to learn -- and willing to share.
Keep at it! There's so many cool things you can do as you learn more!
The author for Graph Theory is W. T. Tutte, not Tufte, and much appreciate the link which helped clear up my initial confusion.
I enrolled in Coursera's Discrete Optimization about a year ago. It was way beyond my ability [No SoA], but I learned a hell of a lot about computing and solving hard computational problems. It also suggested that mathematics might be important when grappling with interesting problems.
For fun I watched a number of Strang's OpenCourseWare videos on matrices. I hit Chapter 1 of TAoCP. I treated it like programming - it was OK to only partially understand. I took an Algorithms MOOC or two. I actually enjoyed following the analysis, even though formal analysis is not something I would do for pleasure.
The turning point though was when Lamport's "Thinking for Programmers" [Lamport: Thinking] hit HN. It threw down the gauntlet. Specification is a prerequisite for a commitment to getting it right, where the it is computing. And when the it is computing, we are talking about math whether we like it or not.
It's not that the math and specification replaces testing. It's that tests are ad hoc without a mathematical understanding of the computation.
[Lamport: Thinking] http://channel9.msdn.com/Events/Build/2014/3-642
At least in my home country, a Software Engineer job title implies a CS degree with at least two years full of math.
But besides that, I see a deeper motivation in the article. The author says
"My dream is to learn the statistics, probability, and linear algebra needed to really understand machine learning and computer vision...I need a solid foundation so that I can truly understand what's going on: why something works, when it won’t work, and what to do differently if it doesn’t."
I contend that even many who have formally studied CS and math probably don't truly understand these math tools, that is, if they are using them in the first place. Intuition in math takes time to build up, and requires considerable mental effort.
By analogy, a business school IT degree is two years of CS plus a bunch of biz classes instead of compilers and automata theory (it varies, huge simplification, etc). However, it's possible to study CS at higher levels for immensely longer than the two years an IT grad will get.
Now with Bologna reducing degrees to 3 years, the title requires a 3-year Bologna degree + a 2-year master's as a workaround to keep the old 5 years.
And, "validated university degrees" or not, I doubt a fizz buzz interview would have much different results in your country as opposed to the US:
If you required prior knowledge of Fizz Buzz to solve it successfully, you should absolutely never be hired to program anything.
Companies choose employees, but employees can also choose companies which can recognize the value someone brings in, beyond a few programming exercises done on a sheet of paper in a 1-hour interview.
As for Fizz Buzz, never bother coding it. I see no value.
I've never personally had a FizzBuzz test, but I also have no doubts about my ability to solve it in my sleep.
I'm against trick interviews or algorithms quizzes, but FizzBuzz is again just establishing a ridiculously low baseline that you know the most essential aspects of programming. Someone a few weeks into CS 101 should be able to do it, so any professional developer who can't deserves to be immediately laughed out of the room.
FizzBuzz is only a "dummy test" in the sense that only dummies will fail it.
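For reference, one straightforward version (any equivalent is fine):

    # Multiples of 3 print "Fizz", of 5 print "Buzz", of both print
    # "FizzBuzz"; everything else prints the number itself.
    for i in range(1, 101):
        out = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
        print(out or i)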
The most interesting side effect I've noticed from doing this is a dramatic improvement in my ability to focus. I had been suffering from a general scatterbrained, distracted feeling for a while before I started. I think it was due to the way I consume content online, trying to follow too many interests at once. I set aside about an hour a night to work on these math courses and within a couple of weeks I noticed that my focus was greatly improved. For that reason alone I'm counting this project as a major win, regardless of whether I master these topics or not. Having something to study seems to be extremely valuable for general mental health, at least for me.
It reads as a high level false-historical account of how math might have been developed, in its entirety, if it unfolded in the most direct and logical way from the human experiences we all share (guided by understanding of where mathematics has ended up in the modern age).
The false-historical exposé is fantastic. The goal becomes less to express things as accurately and comprehensively as modernly possible—which you'll experience as you take on graduate level books or modern papers—and more to demonstrate large-scope intuition for why whole fields evolved as they did and how they are all intermingled.
I need that person to say: "I don't understand it". Even if I don't understand it myself, as I try explaining it to my study partner, it starts clicking in my brain. I suddenly start to understand these parts as I'm explaining it. Hands down the best way for me to learn: explain to others.
(And first recommendation by Cosma Shalizi: http://vserver1.cscs.lsa.umich.edu/~crshalizi/notabene/infor...)
I studied Software Engineering, and am a certified engineer. I sat next to the Aerospace, Chemical and Mechanical engineers for all 4 years of "Engineering Math". (They did crazy physics and chem that I didn't)
How can you call yourself a "Software Engineer" without having studied a huge amount of math?
I decided to "scratch an itch" in open source parlance and start a study group I call colearning. Colearning provides a place for people doing remote education (or working on their own projects or writing) to have structured time to study and move forward towards their goals. It helps provide the positive pressure that a real physical class provides.
Our colearning group currently meets at Borderlands Books in San Francisco on Sundays from 11 am to 4 pm. Message me and you can come join!
I've done this to help provide the structure I need to keep learning; it's already helped me advance greatly in my own studies as I make forward progress every week.
Years ago I also identified a need for community while being self employed and started the coworking movement.
I feel like there are so many different directions to take, but which ones are the most applicable to practical Software Development? Particularly in the realm of Web Development?
Editing my question to ask what are the most applicable math subjects regarding the _PRACTICAL_ applications of software development; hopefully that clears it up a bit.
Sounds like Linear Algebra would be good to learn.
Have you done any study of Graph Theory? I've found that a surprisingly large number of problems in computing can be morphed into graphs. I was working on an issue last week (in a web app) that turned out to be a variant of Exact Cover, which straight away gave me a large body of literature / algorithms to pull from - and told me I was in a slightly dark place :). Graphs and the algorithms around them feel very close to the work we do with computers, so it will probably be pretty approachable given your background.
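To show the shape of the thing for anyone unfamiliar, here's a naive backtracking solver for Exact Cover (the instance is a made-up toy; a real implementation would reach for Knuth's Algorithm X / dancing links):

    # Find named subsets covering every element of the universe exactly once.
    def exact_cover(universe, subsets, chosen=()):
        if not universe:
            return chosen
        x = next(iter(universe))              # pick any uncovered element
        for name, s in subsets.items():
            if x in s and s <= universe:      # s covers x without overlap
                result = exact_cover(universe - s, subsets, chosen + (name,))
                if result is not None:
                    return result
        return None

    universe = {1, 2, 3, 4}
    subsets = {"A": {1, 2}, "B": {3}, "C": {3, 4}, "D": {2, 3}}
    print(exact_cover(universe, subsets))  # ('A', 'C')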
My advice would be to not worry too much about how applicable it will be to web development and just pick something to study. Ultimately, it will end up being useful in time. And maths is just beautiful.
You might get something interesting out of learning category theory and Haskell.
The lectures are quite clear and easy to follow. I did do all this math at university, but that was over 20 years ago and I had forgotten a lot, so for me this course is perfect. But I think it can be quite useful even if the concepts are new to you.
Another thing that can help a lot is using a computer algebra system to solve tedious arithmetic steps or to verify one's calculations.
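For example, a small sketch with the SymPy library (the function here is arbitrary): you can check a hand-computed derivative symbolically, then integrate to get back where you started.

    import sympy as sp

    x = sp.symbols('x')
    expr = sp.sin(x) * sp.exp(x)
    derivative = sp.diff(expr, x)        # the machine does the product rule
    print(derivative)                    # exp(x)*sin(x) + exp(x)*cos(x)
    print(sp.integrate(derivative, x))   # recovers exp(x)*sin(x)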
My understanding was that to call yourself an engineer one must be capable of building robust and efficient things; whether that thing is a plane, a bridge, complex software, or even a website matters not, so long as it is robust and efficient.
The degree to which you understand mathematics is useful only in how greatly it can enable you to build something robust and efficient. Thus, if you know more linear algebra than 90% of all mere mortals, but write inefficient programs, then at best you are a bad Software Engineer, at worst you're simply not a Software Engineer.
For that reason, at my current job I was asked to replace a man who earned his PhD in Computer Science from Georgia Tech, whose job was to build an maintain an API in Go, despite the fact that I'm a college dropout. Smart man, with a far better understanding of mathematics, but ultimately I could build things more robustly and efficiently. I had stronger engineering skills.
Going back to OP, right on man. I personally didn't enjoy math until I learned it using my own rules. By my own rules, I mean I shunned the established method of repetition until you memorize the process, instead opting to understand the underlying concept and keep my thoughts written down somewhere for future reference, knowing all too well by now that memorization is futile given how frequently I have to reference Stack Overflow.
Of course, it's that remaining 1% that will fuck you if you don't have the maths chops. And this is as a consulting engineer, which arguably uses the least maths. Once you get into actual design or analysis, that number becomes a sliding scale in the other direction very quickly.
> build an maintain an API in Go
Unfortunately, a much larger proportion of engineering jobs require mathematics than you make it seem. Software engineers seem to forget that engineering includes designing planes, bridges, cars, lasers, electrical circuits, materials, and a million other things that require you know mechanics, statics, statistics, logic, linear algebra, differential equations, etc.
The point is, the engineer who designs the plane only needs to know enough math to design said plane. The engineer who designs the bridge? The same. Theoretically, they're supposed to understand a great deal. Realistically, the degree to which technology automates the mundane task of calculation could very well mean that their daily application and actual understanding of mathematics is overestimated anywhere from slightly to greatly. This will be especially true as time goes on and the tools of the trade become even more advanced.
To design a web application that operates efficiently in terms of cost, response time, and required maintenance, along with a number of other variables, requires good software engineering. Once upon a time, this required strong mathematical aptitude. Now, it requires more of an understanding of the language used for the design of the application itself, an ability to justify why one approach should be faster through algorithm analysis and efficient testing, logical thinking, and so on. It certainly doesn't require the mathematical aptitude to design your own cryptography protocol when you can simply use a library demonstrated to be safe and reliable, although an understanding of the underlying concepts of how that protocol works might be nice, if only to understand why you shouldn't roll your own.
It's not too far off to imagine a near future where the average <insert engineer here> functionally requires the same degree of mathematical aptitude as today's current web developer.
Are there any projects you maintain to keep your knowledge fresh?
So far it tracks the matching textbook pretty closely. The first little set of lectures covered the first 62 pages.
Computer science is a computer program when you abstract away all context, so that it is just a bunch of symbols.

Statistics is mathematics applied to "real world" data, and the art of turning that data into a form suitable for a computer.
Actually, from Sun Tzu's "Art of War", he puts it this way:

In respect of military method, we have:
firstly, Measurement;
secondly, Estimation of quantity;
thirdly, Calculation;
fourthly, Balancing of chances;
and fifthly, Victory.
I love math.
> Why not quit my job and go back to school? Well, that's not really for me. Reading books at my own pace lets me try a subject out without fully committing to it and making it a necessity that I find work based off of it.
I respond here in three parts, the first general, the second more specific and about linear algebra, and the third about statistics.
Really, in school, you still have to learn the stuff. And, there a class might help but won't be enough; so, right, again, during the course and in the hours not in the class, you still have to study and learn the stuff. Or, as they say for the class, "mathematics is not a spectator sport". That means you still have to study the stuff and get it between your ears on your own.
Eventually you can conclude that mostly college and its courses are more for certification than education. For the education, that's heavily up to you.

But there are some dangers in self learning: Not all the ideas and learning materials are good; the really good ones are only a small fraction of all the ones you will likely encounter.
So, in praise of college and profs, they can (1) get you on a good track with good ideas and learning materials and (2) get you unstuck and keep you on track. In such study, it's possible to do too much or too little -- from some experts you can see about what is the right amount to do.

I said "college"; revise that to read "one of the world's best research universities", say, in the US, 1-2 dozen.
The ideas and materials you will see in such a university really do stand to be better than nearly all you will encounter otherwise. Such learning is where there's not much substitute for quality. But, still, just to learn the material, you don't really have to enroll in such a university; instead, just borrow from their course descriptions and materials. Indeed, some of the best such universities are working hard to make their learning materials available to all for free over the Internet. Why? Because those universities want to concentrate on pushing forward with research.
Here's a way: Show up at such a university and the appropriate department, in your case, applied math, mathematical sciences, operations research, statistics, whatever. Maybe show up at some public department seminars. Talk with some of the students. Say you have a career going, that for your career you are interested in what the department is doing, and that you want to learn more about the program and the mathematical content. So, get some of the students to talk and explain.
Then for some of the courses you are interested in, see who the profs are and look at the course materials, say, texts, handouts, on-line files, etc.

Then after looking at the materials and, say, having made some progress with them, try to get 15 minutes to chat with a prof.

Then take what you just got from that department for free -- broad directions, what they regard as more/less important, texts, course materials, etc. -- and go off and study on your own. When you think that maybe you have some course studied well, try to get a copy of the course final exam or Ph.D. qualifying exam, work through it, and, if it appears you did well, ask a prof to check your solutions to a few of the most difficult questions. If your solutions look good, then you will start to look good and may get asked if you would like to apply as a student in the department. So, here you and the prof and department are interviewing each other.
Such things worked for me: (A) In my career I kept running into the work of John Tukey at Princeton and Bell Labs. So, it was stepwise regression, exploratory data analysis, power spectral estimation, convergence and uniformity in topology, his statement equivalent to the axiom of choice, etc. So, I wrote him at Princeton asking about graduate study, mentioned those topics, and got back a nice letter from the department Chair basically inviting me to apply.
(B) I applied to Cornell and got rejected. But largely independently I visited a prof there to discuss optimization, asked about being a grad student, and soon got another letter accepting me to grad study.

(C) At least at one time, the Web site of the Princeton math department said, "Graduate courses are introductions to research by experts in their fields. No courses are given for preparation for the qualifying exams. Students are expected to prepare for the qualifying exams on their own." or some such.
Lesson: In grad school at Princeton, you are still expected to learn the qualifying exam materials on your own. Well, to do that, you don't have to be in the high rent area around Princeton, NJ.
When I did go to grad school and got my Ph.D., what really saved my tail feathers was what I had done and did on my own as independent study. Some of the courses helped in providing high quality directions and materials, but the real work was nearly all independent.
And, one of the crucial inflection points was when I took a problem in a course, one not solved in the course, did some research, and found a solution. The solution was novel, and word spread around the department quickly. My halo got a high polish, and that greatly eased my path through my Ph.D. That is, I had proven results on the most important research academic and Ph.D. bottom line -- I'd done good, novel, "new, correct, significant" research. Then my hair cut or lack thereof, sloppy hand writing, occasional upchuck at some bad course material, etc. no longer mattered.

My research looked publishable, and it was -- I did publish it later, in one of the best journals, easily, no revisions. So, lesson: That inflection point came from independent work.
So, don't feel that your independent work is inferior to enrolling as a student. Instead, in the best universities, as a student, nearly all the work is for you to do independently anyway.

Still, I'd repeat -- try to pick the brains, for free, without hurting your present career, of the courses, profs, and materials at a world class department in a world class research university.

Why a research university? World class research is a very high bar and some of the best evidence of good expertise, insight, and judgment in the field -- and you don't want the opposites.
After freshman calculus, it would be good to start with abstract algebra. There you will get handy with (relatively simple versions of, and, thus, a good place to start -- and, I have to say, put you ahead of a surprisingly large fraction of the best chaired professors of computer science) theorems, proofs, sets, axioms, groups (once I published a paper in statistical hypothesis tests where the core of the math used some group theory), rings, fields, the integers, rationals, reals, and complex numbers and their leading properties, some important algorithms, e.g., Euclidean greatest common divisor (also the way to find multiplicative inverses in the finite field of the integers modulo a prime number and, thus, the core of a super cute way to do numerical matrix inversion exactly using only short precision arithmetic), number theory and prime numbers (crucial in cryptography), vector spaces (the core of multi-variate statistics and more), and some of the classic results. Some of this material is finite mathematics at times of high interest in computing -- e.g., error correcting coding. For such a course, a good teaching math department would be good. Have a good prof read and correct your early efforts at writing proofs -- it could help you a lot.
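To illustrate the Euclidean GCD remark above, a quick sketch (my own) of the extended version, which yields x and y with a*x + b*y = gcd(a, b) and hence multiplicative inverses modulo a prime:

    def extended_gcd(a, b):
        if b == 0:
            return a, 1, 0
        g, x, y = extended_gcd(b, a % b)
        return g, y, x - (a // b) * y

    def mod_inverse(a, p):
        g, x, _ = extended_gcd(a, p)
        assert g == 1, "a must be coprime to p"
        return x % p

    print(mod_inverse(3, 7))  # 5, since 3 * 5 = 15 = 2*7 + 1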
But starting with linear algebra is also good. There are lots of good books. The grand classic, best as a second text, is P. Halmos, Finite Dimensional Vector Spaces. He wrote this in 1942 after he had gotten his Ph.D. from J. Doob (long the best guy in the US in stochastic processes) and was an assistant to von Neumann at the Institute for Advanced Study (for much of the 20th century a good candidate for the best mathematician in the world). The book is really a finite dimensional introduction to von Neumann's Hilbert space theory.
von Neumann is the guy on the right. The guy on the left is S. Ulam (he has a cute result the French mathematician LeCam once called tightness, which I used once). The guy in the middle is just a physicist! Of course, in that picture they were working up ways to save 1 million US casualties in the Pacific in WWII and were astoundingly successful. Ulam is best known for the Teller-Ulam configuration, which in its first test yielded an energy of 15 million tons of TNT. There are rumors that von Neumann worked out the geometry for the US W-88, 475 kilotons in a small package.

Von Neumann also has a really nice book on quantum mechanics, the first half of which is a totally sweetheart introduction to Hilbert space theory.
Of course, Ulam was an early user of Monte Carlo simulation, still important.

Other linear algebra authors include G. Strang, E. Nering, Hoffman and Kunze, R. Bellman, B. Noble, R. Horn. Also for numerical linear algebra, e.g., G. Forsythe and C. Moler, the LINPACK materials, etc. There are free, on-line PDF versions for some of these. Since the subject has not changed much since Halmos in 1942, you don't necessarily need the latest paper copy at $100+!
For statistics, that is a messy field. It has too many introductory texts that over simplify the subject and not enough well done intermediate or advanced texts.
Also the subject has essentially a lie: They explain that a random variable has a distribution. Right, it does. Then they mention some common distributions, especially Gaussian, exponential, Poisson, multinomial, and uniform. Then the lie: The suggestion is that in practice we collect data and try to find the distribution. Nope: Mostly not. Mostly in practice, we can't find the distribution, not even of one random variable and much less likely for the joint distribution of several random variables (that is, of a vector valued random variable). Or, to estimate the distribution of a vector valued random variable commonly would encounter the curse of dimensionality and require really big big data. Instead, usually we use limit theorems, techniques that don't need the distribution, or in some cases make, say, a Gaussian assumption and get a first-cut approximation.
Early in my career I did a lot in applied statistics but later concluded I'd done a lot of slogging through a muddy swamp of low grade material.
A clean and powerful first cut approach to
statistics is just via a good background
in probability: With this approach, for
statistics, you take some data, regard
that as values of some random variables
with some useful properties, stuff the
data into some computations, and get out
data that you regard as the values of some
more random variables which are the
statistics. The big deal is what
properties the output random variables
have -- maybe they are unbiased, minimum
variance, Gaussian, maximum likelihood
estimates of something, etc.
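As a small illustration of that data in,
statistics out view -- my own sketch,
assuming numpy: push simulated data
through the sample mean and the
Bessel-corrected sample variance and
check empirically that both are
unbiased.

    import numpy as np

    rng = np.random.default_rng(0)
    true_mean, true_var, n, reps = 3.0, 4.0, 10, 100_000

    means, variances = [], []
    for _ in range(reps):
        x = rng.normal(true_mean, np.sqrt(true_var), size=n)
        means.append(x.mean())           # the statistic "sample mean"
        variances.append(x.var(ddof=1))  # Bessel-corrected sample variance

    # The two statistics are themselves random variables;
    # averaging over many replications approximates their
    # expectations.
    print(np.mean(means))      # ~3.0 -- sample mean is unbiased
    print(np.mean(variances))  # ~4.0 -- ddof=1 makes it unbiased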
For this work you will want to know the
classic limit theorems of probability
theory -- weak and strong laws of large
numbers, elementary and advanced
(Lindeberg-Feller) versions of the central
limit theorem, the law of the iterated
logarithm (and its astounding application
to an envelope of Brownian motion), and
martingales and the martingale convergence
theorem ("the most powerful limit theorem
in mathematics" -- it's possible to have
making applications of that result much of
a successful academic career). And,
generally beyond the elementary statistics
books, you will want to understand
sufficient statistics (and the
astounding fact that, for the Gaussian,
sample mean and variance are sufficient
with generalizations to the exponential
family) and, also, U-statistics where
the order of the input data makes no
difference (and order statistics are
always sufficient). Sufficient statistics
is really from (a classic paper by Halmos
and Savage and) the Radon-Nikodym theorem
(with a famous, very clever, cute proof by
von Neumann), and that result is in, say,
the first half of W. Rudin, Real and
Complex Analysis (with von Neumann's
proof).
Also with the Radon-Nikodym theorem, can
quickly do the Hahn decomposition and,
then, knock off a very general proof of
the Neyman-Pearson result in statistics.
How 'bout that!
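And the elementary central limit theorem
is easy to watch happen -- a quick
simulation sketch, again assuming numpy:
standardized sums of i.i.d. uniforms
quickly look Gaussian.

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 30, 200_000

    # Sum n i.i.d. Uniform(0,1) draws, then standardize:
    # subtract the mean n/2, divide by sqrt(n/12).
    s = rng.uniform(size=(reps, n)).sum(axis=1)
    z = (s - n / 2) / np.sqrt(n / 12)

    # Compare tail probabilities with the standard Gaussian.
    for t in (1.0, 2.0):
        print(t, (z > t).mean())  # ~0.1587 at t=1, ~0.0228 at t=2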
Thus, to some extent to do well with
statistics, both for now and for the
future, especially if you want to do some
work that is original, you will need much
of the rest of a good ugrad major in math
and the courses of a Master's in selected
topics in pure/applied math.
So, for such study, sure, at one time
Harvard's famous Math 55 used the Halmos
text above along with W. Rudin,
Principles of Mathematical Analysis
(calculus done very carefully and a good
foundation for more), and Spivak,
Calculus on Manifolds, e.g., for people
interested in more modern approaches to
relativity theory (but Cartan's book is
available in English now). It may be that
you are not interested in relativity
theory or the rest of mathematical physics
-- fine, and that can help you set aside
the Spivak.
Then, Royden, Real Analysis and the
first half of Rudin's R&CA as above,
along with any of several alternatives,
cover measure theory and the beginnings of
functional analysis. Measure theory does
calculus again and in a more powerful way
-- in freshman calculus, you integrate a
continuous function defined on a closed
interval of finite length, but in measure
theory you get much more generality,
e.g., the indicator function of the
rationals on [0,1] has Lebesgue integral
0 but no Riemann integral at all.
And measure theory also provides the
axiomatic foundation for modern
probability theory and of random
variables. Seeing that definition of a
random variable is a real eye opener, for
me a life-changing event: You get a level
of understanding of randomness that cuts
out and tosses into the dumpster or bit
bucket nearly all the elementary and
popular (horribly confused) treatments of
randomness.
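(For the curious, the standard
definition, nothing special here: Fix a
probability space (Omega, F, P). Then a
real random variable is just a
measurable function X: Omega -> R, its
distribution is the measure P(X in A)
for Borel sets A, and its expectation
E[X] is the measure theory integral of
X with respect to P. No coins, dice, or
hand waving required.)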
Functional analysis? Well, in linear
algebra you get comfortable with vector
spaces. So, for positive integer n and
the set of real numbers R, you get happy
in the n-dimensional vector space R^n.
But, also be sure to see the axioms of a
vector space where R^n is just the leading
example. You want the axioms right away
for, say, the (affine) vector subspace of
R^n that is the set of all solutions of a
system of linear equations (if Ax1 = b
and Ax2 = b, then A(x1 - x2) = 0, so the
solution set is a translate of the null
space of A). How 'bout that!
Then in functional analysis, you work
with functions and where each function is
regarded as a point in a vector space.
The nicest such vector space is Hilbert
space which has an inner product
(essentially the same as angle or in
probability covariance and in statistics
correlation) and gives a metric in which
the space is complete -- that is, as in
the real numbers but not in the rationals,
a sequence that appears to converge really
has something to converge to. Then wonder
of wonders (really, mostly due just to the
Minkowski inequality), the set of all real
valued random variables X such that the
expectation (measure theory integral)
E[X^2] is finite is a Hilbert space,
right, is complete. Amazing, but true.
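A tiny illustration of that inner
product view -- my sketch, assuming
numpy: for mean-zero random variables,
E[XY] acts as the inner product, so the
correlation is exactly the cosine of the
angle between X and Y as vectors.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000

    # Two correlated, mean-zero random variables.
    u, v = rng.standard_normal((2, n))
    x = u
    y = 0.6 * u + 0.8 * v  # Var(y) = 0.36 + 0.64 = 1

    inner = np.mean(x * y)  # estimates E[XY]
    cos_angle = inner / np.sqrt(np.mean(x * x) * np.mean(y * y))
    print(inner, cos_angle)  # both ~0.6 -- the correlation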
Then in Hilbert space, you get to see how
to approximate one function by others.
So, in particular, you get to see how to
approximate a random variable you don't
have by ones you do have -- you might
call that statistical estimation, and
you would be right.
Then can drag out the Hahn-Banach result
and do projections, that is, least
squares, that is, in an important sense
(from a classic random variable
convergence result you should be sure to
learn), best possible linear
approximations. And maybe such an
approximation is the ad targeting that
makes you the most money.
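Here is projection as least squares in
code -- my sketch, assuming numpy, not
anyone's production method: project y
onto the span of the columns of X; the
residual comes out orthogonal to every
column, and that right angle is the
geometry behind regression.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 500

    # Columns of X span the subspace we project onto.
    X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
    y = X @ np.array([1.0, 2.0, -3.0]) + rng.standard_normal(n)

    # Least squares = orthogonal projection onto span(X).
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    residual = y - y_hat

    print(beta)            # ~[1, 2, -3]
    print(X.T @ residual)  # ~0 -- residual orthogonal to the columns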
So, that projection is a baby version of
regression analysis. There's a problem
here: The usual treatments of regression
analysis make a long list of assumptions
that look essentially impossible in
practice to verify or satisfy and, thus,
leave one with what look like unjustified
conclusions.
Nope: Just do the derivations yourself
with fewer assumptions and get fewer
results but still often enough in
practice. And they are still solid.
For the usual text derivations, by
assuming so much, they get much more,
especially lots of confidence intervals.
In practice often you can use those
confidence interval results as first-cut,
rough measures of goodness of fit.
But the idea of just a projection can
give you a lot. In particular there is an
easy, sweetheart way around the onerous,
hideous, hated overfitting -- it seems
silly that having too many explanatory
variables hurts, and it shouldn't hurt
and doesn't have to!
And the now popular practice in machine
learning of just fitting with learning
data and then verifying with test
data, with some more considerations which
are also appropriate, can also be solid
with even fewer assumptions.
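A minimal sketch of that fit-then-verify
practice, assuming scikit-learn (any
fitting procedure works the same way):
hold out test data the fit never sees
and judge the model only on that.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    X = rng.standard_normal((1000, 5))
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.standard_normal(1000)

    # Fit on one part of the data, verify on the held-out part.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    model = LinearRegression().fit(X_tr, y_tr)

    # Honest score: computed only on data the fit never saw.
    print(model.score(X_te, y_te))  # R^2 on the test set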
Go for it!
All praise welcome!
> I'd be interested to see what else you've written.
For some more math, more technical:
- a few universities have put (many/most of) their lecture notes and student notes up: http://www.maths.cam.ac.uk/studentreps/res/notes.html as well as study guides: http://www.maths.cam.ac.uk/undergrad/studyskills/text.pdf
- books about how to think like a mathematician: Keith Devlin, Kolmogorov/Alexandrov et al did 2 Dover books, and Houston: http://www.amazon.com/How-Think-Like-Mathematician-Undergrad... and http://www.amazon.com/How-Study-as-Mathematics-Major/dp/0199... and Ellenberg: http://www.amazon.com/How-Not-Be-Wrong-Mathematical/dp/15942...
- Concrete Mathematics by Graham, Knuth, and Patashnik; Street-Fighting Mathematics by Mahajan and his newer, freely available: http://mitpress.mit.edu/books/art-insight-science-and-engine...
- this machine learning/data science list: http://www.reddit.com/r/MachineLearning/comments/1jeawf/mach...
- Cal newport blog: http://calnewport.com/blog/2012/10/26/mastering-linear-algeb...
- besides Dover, Schaum's Outlines are a good cheap resource abundantly available in used bookstores (though there are in fact some typo-ridden ones too)
The best general advice I've seen is the same as what they tell you in college: form study groups and make commitments to regular discussion. Stronger students strengthen their understanding by tutoring others at the whiteboard. There are lots of machine learning and data-sciencey meetups and informal groups springing up, e.g. http://machine-learning.meetup.com/
The question invalidates anything he has to say on the subject and makes me question his claim of being a software engineer. No self-respecting person in that position would ever need to ask that question or write an article about it. Nor is it worth wasting anyone's time to read past the first paragraph.
"ill advanced shit"? I know, I know, it's a quote and probably intended to be funny. Heck, it might be.
But I swear, if I start hearing people say that without an awful lot of irony, I'm going to be kicking people in the testicles.