How I faced my fears and learned to be good at math (niemanlab.org)
224 points by mxfh on Nov 13, 2013 | 163 comments



I hit a wall in college where math just stopped being something I intuitively "got". I'm sure that given enough time and motivation, I could have continued being "good at math" even at the higher levels, but these were luxuries I did not have, given everything else on my plate at the time.

My biggest problem with math, especially once I got into academia, was how it was taught. So many professors would scribble what seemed like nonsense on the board (symbols that change from professor to professor, or even from lecture to lecture) and then go on to say things like "...and the proof is trivial" or "...it obviously follows that...", and I'd sit there wanting to shout "NO, NO it's not obvious!"

Finally, I'd find a tutor to explain to me what it was I was missing, and it really WAS obvious. If only it had been taught that way in the first place!

Admittedly, not everyone has the same learning style, but the classes I took seemed really tailored towards the students who already had the intuition that I lacked.


I agree, college mathematics education is usually horrible, especially at the start. I remember attempting to read my calculus 1 textbook and not understanding the thing at all. About 3 or 4 classes later, in discrete math, I realized why: the book relied on fairly foundational mathematical terms and concepts such as sets, proof by induction and so on, so it was small wonder that it was incomprehensible to my grade 12 mathematical education.

I confronted a professor about why they have it backwards and don't teach a discrete math course or a similar foundational course FIRST so people can actually read their textbooks, or at least let people take that path. They basically said that since it's not relevant to many majors, and it's harder for most people since they don't have the 'mathematical mind', they do it in that backwards way. The professors, being Math PhDs, don't adjust any of their classes enough for the lack of foundational knowledge. It frustrated me very much. I don't think it's a big mystery why you probably hit that college wall when most college curriculums are set up that way.


I think this is due to the fact that many, many years ago logic used to be taught in high school, and all of those concepts would have seemed significantly less foreign.


You have hit upon my biggest gripe with mathematicians: their love of 'notation', or more precisely their love of writing in symbology that makes mathematics seem more arcane than it actually is.

Whenever I've run up against "impenetrable" math I ask "So how would you use this?", and connecting it to the real world has helped tremendously.


Mathematical notation is a must. Math in "plain English" would be as big a nightmare as programming in plain English. Natural language is so ambiguous that your "plain English" would turn into legalese every time you had to explain something unambiguously to someone who didn't already know it.

The problem isn't the notation per se, it's that teachers don't spend nearly enough time explaining the notation itself. It's a foreign language that they are so skilled at that they don't understand how unfamiliar it is to their students.

I was well into a physics major before I really stopped and carefully considered the many different notations used to represent derivatives (dx/dy, f'(x), y', y-dot, Dsub-yX, etc.). I realized that I had developed separate context-specific bodies of calculus knowledge/skill from different fields with different notations and approaches, and that these were all the same thing. Different techniques in different contexts were an artifact of different styles of notation, not differences in the actual math.

I can understand Italian to some extent as a side-effect of my study of Spanish. If I took an electronics class in Italian, I would understand some of the concepts and misunderstand others. Would my troubles to understand certain electronics concepts be due to trouble with electronics or trouble with Italian? Who knows? Both types of misunderstanding would compound each other.

If teachers spent more time carefully teaching this foreign math language before (and while, and after) using it to teach math, I'd guess a lot of students' "math problems" would magically disappear.


The problem isn't with using notation. Lots of people on HN are programmers, we understand the value and necessity of unambiguous artificial languages.

The first problem is that the notation is usually "the first symbol that popped into some random genius's head 200 years ago". And once the notation is set, it's set, no matter how poor it is, or how many other places it's already in use etc etc. Then, as you note, sometimes there are multiple notations. Ugh.

The second problem is closely related to the first. Mathematical notation is write-optimised. This makes sense because of the long history. But that doesn't change that write-optimised languages are harder to read and understand, even for experts, than read-optimised languages.

In a programming context if you today reduce all your variable names to single Latin letters and all your function names to single Greek letters, you will be widely mocked and reviled. In maths it's Just How Things Are Done.

I guess what I'm saying is: the curse of mathematical notation is pen and paper. The boundaries of QWERTY liberated (almost all) programming languages from the curse.


Good maths proofs proceed step by step, with intermediate goals written in clear English (or French, for me). And to go from step to step, you don't need explicit, long names, just as you don't name your loop index "index_over_collection" but just i, because your function will only be a few lines long.

I actually really like mathematical notation. Once you get used to it, the terseness can make things a lot clearer than natural language.

(If you've ever tried to read old mathematical articles / books that are a few centuries old, you'll understand the power of mathematical notation. Check out "God Created The Integers" or "On the Shoulders Of Giants" by Stephen Hawking if you're curious: these are two books that provide excerpts of highly influential works from earlier mathematicians and physicists, respectively)


> In a programming context if you today reduce all your variable names to single latin letters and all your function names to single greek letters, you will be widely mocked and reviled.

Serious question. What do you think is the general impression of APL programmers? =)


I thought about mentioning APL, but sometimes the correctness of an argument is best served by inexactitude.

I already compromised by adding "today" and "almost all" as qualifiers.


What would you use other than symbols? Describing everything with words? For example:

When given two numbers, if one wishes to find the quantities that give 0 when the second of the numbers is added to the product of the quantity and the first number and the quantity multiplied by itself, one should negate the first number and then either add or subtract the square root of the sum of the square of the first number minus four times the second number, and divide this summation by two.

That's just the quadratic formula in disguise:

Let b and c be real [or complex] numbers, then

   x^2 + b x + c = 0
implies

   x = (-b +- sqrt(b^2 - 4c)) / 2

Clearly the latter is easier to understand and digest; the same holds for higher mathematics. In fact, there are multiple interpretations of the text (admittedly I just wrote that now, and I'm not the best writer), while the symbolic mathematics itself is essentially unambiguous (given some background in symbolic algebra).
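Since this thread is full of programmers: a quick sketch of that same formula as code (my own illustration in Python, using cmath so complex roots work too):

    import cmath

    def roots(b, c):
        # Roots of the monic quadratic x^2 + b*x + c = 0, per the formula above.
        d = cmath.sqrt(b * b - 4 * c)
        return (-b + d) / 2, (-b - d) / 2

    print(roots(-3, 2))  # ((2+0j), (1+0j)) since x^2 - 3x + 2 = (x - 1)(x - 2)

Try reading the function body aloud in plain English and the point about notation makes itself.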

(Saying "let's do maths without the symbols" is a little like saying "let's do programming without special languages"... it is very very hard to make it work.)


Ah yes - I used to love reading Euclid's proofs [1]. Wonderful descriptions. He used such poetic phrases too - things like describing a line as 'a length without breadth.'

As wonderful as it is, mathematics needs notation, and lots of it. You can express incredibly complex ideas in mathematics, totally unambiguously, through a collection of symbols. Not to mention that they're universally recognised.

The reality is that mathematics is 100% about thought. You'll struggle to put together the concepts in your mind long before the notation is the real issue. Once you have a clear picture of the abstract space you can use the notation you've learned to communicate the world you've created to others. What could be more wonderful?

[1] http://aleph0.clarku.edu/~djoyce/java/elements/bookVI/propVI...


Agreed. Plus, the current notation is a result of multiple iterations by some brilliant people. If you look at older attempts at formal mathematical notation, some of it is laughably bad in comparison. We are standing on the shoulders of giants :)


Pictures are worth a thousand words! Now let me put the quadratic formula in disguise as well.

Suppose we wish to make a rectangle with a given area and perimeter. This is an interesting problem! Does the number of possible answers depend on the specific area and perimeter? Certainly! It all comes down to thinking about squares, since squares maximize the area given a fixed perimeter.

If the area of a square with the given perimeter is LESS than the desired area, then there's no way we can make such a rectangle. If the area of the square is equal to the desired area, then that's our only answer! Now, how about if our square's area is larger than the desired area? We'll get two possible different lengths of a given side of the rectangle - one representing the rectangle's width, and the other representing its height. Or we could also think of them as two different rectangles - a tall one, and its rotation by a quarter turn (which makes it wide).

By symmetry, we know that the difference between the square's side length and the shorter side will be the same as the difference between the square's side length and the longer side. How large is that difference? Exactly enough to diminish our shape's area from the square's area to the desired area. And that difference in length is simply the square root of the difference between the square's area and the desired area!

It would have been better with pictures :). Anyway, the quadratic formula is probably the greatest mistake in all of mathematics education. Somehow we use the word "quadratic" and even the phrase "complete the square," but never have I seen someone actually draw said square!

While I think notation is often great for expressing ideas concisely and precisely, I think an excess of notation is not a good way to communicate concepts. Nobody should memorize the quadratic formula! We should instead understand how to think about areas and lengths, and then we can solve the problems in quadrature that we want.
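A minimal sketch of that picture in code (my own framing, not the parent's: given a desired area and perimeter, find the two side lengths):

    import math

    def rectangle_sides(area, perimeter):
        # The "think about the square" argument above, as code.
        s = perimeter / 4              # side of the square with the same perimeter
        if s * s < area:               # the square already falls short: no rectangle exists
            return None
        d = math.sqrt(s * s - area)    # how far each side differs from the square's side
        return s - d, s + d            # the short side and the long side

    print(rectangle_sides(12, 14))     # (3.0, 4.0): a 3-by-4 rectangle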


I agree with you, but not everything can be done pictorially.

FWIW, an animation of the quadratic formula/completing the square: http://en.wikipedia.org/wiki/File:Completing_the_square.gif


With more notation, some readability is lost again:

Let x, b, c \epsilon C:

x^2 + b x + c = 0 => x = (-b +- sqrt(b^2 - 4c)) / 2


Well, this is highly tangential, but if you're talking about LaTeX formatting, \epsilon is not generally what you want for the "element of" symbol. Instead, use \in. The biggest difference is that \in is a binary operator and has the appropriate spacing.

Example of the difference: http://i.imgur.com/gwAqirx.png

generated by code:

  \textbackslash{}epsilon: $\epsilon$ \\ 
  Example: $x \epsilon \mathbb{R}$
  
  \textbackslash{}in: $\in$ \\ 
  Example: $x \in \mathbb{R}$
If you use \mathbin{\epsilon} instead, you'll get proper binary operator spacing, but you'll still get odd looks from people who are accustomed to \in. Admittedly, the symbol did historically begin as an epsilon, but that notation died off a while ago.
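For completeness, here is roughly how the statement from upthread might be typeset with \in (a small sketch of my own; assumes amssymb for \mathbb, as in the example above):

    % The statement from upthread, with \in rather than \epsilon:
    Let $x, b, c \in \mathbb{C}$. Then
    \[ x^2 + bx + c = 0 \Rightarrow x = \frac{-b \pm \sqrt{b^2 - 4c}}{2}. \]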


Notation is not a bad thing. What is bad is an inconsistent, ethereal form of notation.

Computer science is also essentially about notation and vocabulary, but we have to make our notation understandable to the computer, which is a much higher standard than what mathematicians have to adhere to.

We are in a field that demands a much higher level of rigor than mathematicians are accustomed to, as much as they'd hate to hear it.


Mathematicians have a relatively fluid relationship with notation. Since higher math is constantly introducing new abstractions, or applying new techniques to old abstractions, new notation is introduced, or old notation is reused in a slightly different way, in a large percentage of influential papers. I think mathematicians would happily admit that.

That said, it does bring me back to a math class where a professor, after realizing he needed to introduce a subscript, to a subscript, to a subscript of something that had both a subscript and superscript already, made a comment along the lines of "please excuse my poor notation."


I certainly agree that the problem is inconsistency. With a straight-up notation handbook and an agreed-upon "language" I would have been happy to get over the hump once, maybe twice. But when I spend 15 minutes reassuring myself that the notation in this proof/formula/paper is just a variation of another, equivalent notation, it makes me irritated, and that interferes with my appreciation of the concepts being presented. Do it enough and I just throw the paper out.


There was some point in my education where I was taking three classes, each with their own definition of phi. It drove me nuts.

Coq notation, lisp-style notation, even python-style notation- anything would be better.


I remember taking three classes: informal logic (philosophy) which ended up talking about formal logic anyway, electrical engineering, and a math class. It made the logic class really easy, already knowing it from EE. They all had different notation for 'implies', 'not', 'and', 'or', etc. You could argue that the computer languages have a 4th notation, but I won't.


You're confusing pedantry with rigor. The fact that a compiler will complain at you if you misspell 'continue' but a mathematical proof will keep going just fine doesn't mean that programming is more 'rigorous' than mathematics.


I'll disagree that it's the use or non-use of notation. I had the great joy of taking an Intro to (mathematical) Logic class with Dr. Richard Vesley (who had Kleene as his Ph.D. advisor!). By that point, I'd had many math classes with instructors all over the bell curve.

Dr. Vesley blew them all away. There was notation, but the real clincher was his absolute clarity of communication. He covered a lot of material, but the pace never felt rushed. In fact, it was so calm and so clear it was refreshing, more like meditation by a babbling brook. I wasn't the only one to feel that way -- the whole class seemed to have a similar experience.

Related to the story in TFA, a friend of mine with a towering math background said a few years ago, "I remember when math was easy -- back when I had time." Math that we've learned and mastered is "easy", but new areas of math can require a LOT of mental energy to gain traction in.


University math departments really should reconsider their approach to real world applications. Most math professors don't seem to consider it a part of the curriculum, and that can be really detrimental.

My largest college regret was blowing off linear algebra: it was an annoying class taught in an annoying way (handwritten homework showing your work for each step of matrix multiplication, no proofs, etc.). I blew it off because there were no applications of it in anything I cared about.

A semester later, it showed up somewhere in every single advanced computer science class. Really wish there had been a proof-based linear algebra class that showed up later in the curriculum so by the time we reached it we knew it had value.


I think you need to go through that with linear algebra, though. There's a stage early on where you just have to multiply matrix after matrix until it's second nature. Shame your course wasn't taught in a compelling way and it put you off. Having said that though - I had no idea how useful linear algebra was until much later in life.


Same exact story here. Linear algebra seemed like a tautology, I was ranting about it until I started using it in CS...


There are plenty of worthwhile subjects in Math which have no concrete application to the real world. One needs to be able to understand these abstract ideas for what they are, not merely formality surrounding a simple real life phenomenon.


Yeah I totally agree. I took one of the self-driving-car programming classes online and found that while the instructor was (obviously) really smart, he sucked at programming. He tried to write out mathematical equations in python rather than structure the code in a way that simply described what was going on. At the end of each mini lesson I'd refactor his code so that it made sense (mostly doing small stuff like changing variable names from their corresponding mathematical symbols to words describing what they actually were, or factoring blocks into methods). Eventually I gave up not because I couldn't understand the domain of self-driving cars (he was great at explaining that stuff), but because I couldn't keep up with the mathematical syntax (and the online course kept wiping my code and resetting with his, which was extremely frustrating).


If you do not learn the mathematical structure it is going to be hard to do anything further in the field after the course. Why not just bite the bullet and properly learn the pre-reqs?

With a basic course in linear algebra (such as Gilbert Strang's on MIT OpenCourseWare) and potentially some intro calculus you should fly through that course.


Are you talking about Sebastian Thrun's AI class on Udacity [1]? I haven't yet taken it, but I have on my todo list.

[1] https://www.udacity.com/course/cs373


Yes, he is.


Reminds me of my friend who once said, "I understand calculus but whenever I see the little snake (the integral symbol) I don't know what to do."

But following up on your comment, the difficulty of math is that it is built on foundations. If you miss something because you were distracted, that hole is going to be an impediment over and over, and it will create more holes until it is very difficult to make progress.


Structure and Interpretation of Classical Mechanics by Sussman and Wisdom explores some of the issues surrounding mathematical notation. Their primary thesis seems to be that by using uniform notation, s-expressions in this case, we can better understand and reason about mathematical concepts than what using standard math notation permits.


> Sussman explores some of the issues surrounding mathematical notation

see this video, starting at 8mins: http://www.infoq.com/presentations/Expression-of-Ideas


> and I'd sit there wanting to shout "NO, NO it's not obvious!"

Toward the end of my degree program, I became the annoying guy in class who would do exactly that. I remember one time in particular (I think the topic was something on wavelets, which I barely remember now anyway) when I stopped the professor and said "Can you explain all of that over again, from the beginning?"

Worked great for me, but I'm not sure what the rest of the class thought of it. At the time I just assumed they were as lost as me and would appreciate it, but that may well have not been the case.


I bet a lot of them were lost. If I were in your class, I probably would not have been bothered by your question.

The questions that bother me are the type that stroke the ego of the person asking because they already know the answer.


Good for you. Didn't work for me at university. We had one lecturer who was great and would adjust his material on his OHP transparencies if questions were asked and explain in detail. Unfortunately, questioning others or asking for clarification would result in either being summoned by your tutor and getting a bollocking or being asked to leave instantly for "not reading the material".

I quit after the first year. Best thing I ever did.


>> wanting to shout "NO, NO it's not obvious!"

Usually "obviously", "trivial" and such are used to point out that: "this should be obvious/trivial by now", if not: you are getting behind/need to study more/be better prepared before class/....


That is how I read these signals, and sometimes it was true. I'd study a bit more, meet with the class's TA or a university-provided tutor to go over the material, and I'd be fine.

But even then, in many cases, the professor was simply expecting us to have made an intuitive leap. And those of us who hadn't made it were left behind, with no explanation as to why or what it was we needed to understand.


you know you've stepped into the woods when you stop hearing "this is obvious" in math classes...


I was an art major in a mostly engineering school. I always had trouble with math but loved learning it. I also hit a wall in college like you. The first day of class the teacher asked how many engineering students there were in the class. Just about everyone raised their hand. This was her sign that she could teach fast. I knew I was screwed immediately. I wish she had asked how many art students there were so we could go slower.

I remember the math stopped making sense. The teacher would do exactly as you described, saying things like "...it obviously follows that...", etc. A girl who sat next to me would try to explain but was no less clear than the teacher. All the engineering students just got it.

My grades started high and then rapidly fell each week until I hit a string of zeros for a month. I was too proud to ask for a drop but eventually did, though only after skipping a month of classes. The teacher was kind enough to understand that I was trying but my effort was for naught. She gave me the drop.

I've never pursued math any further, having felt defeated.


I think that's part of what the author is trying to approach in the article. I could be wrong, but the engineering students were probably immersed in math way more than you were, so concepts seemed natural to them only because they'd banged their heads against them more often.

I run into this a lot when people talk to me about programming and I get something faster than they do. I've spent a lot of time reading book after book, listening to podcasts, learning new languages, and studying new concepts, so it can be really easy for me to fit new information or ideas into some context and get them. I don't think that I'm necessarily smarter for it. I think I've just had a passion for it, so I never get tired of reading the articles and absorbing the material.

Now I'm personally running into the place where there are a lot of things I've wanted to learn for awhile, but my weak background in math hinders me (machine learning, more advanced algorithm analysis, signal processing, machine vision, etc).

I'm working my way through a calc book. I don't think I could have done it in your position either, though. I'd have psyched myself out. I've got to learn it on my own, with my own rhythm.

Kind of rambly, but you should find something related to what you like and maybe jump back in. Just find a learning mechanism that's suited to your background :)


I got As in almost all my math classes without really understanding anything. It was like a train I couldn't stop. I couldn't ever slow down for a month and say "I'm going to work on trigonometry this whole month so I can know what I'm doing when I do integration." I just had to memorize the steps to get the right answer, get my A, forget it all, and wait for whatever was coming next.

Years later I went straight back to the beginning and figured everything out, starting with Serge Lang's Basic Mathematics. I didn't even know where the Pythagorean Theorem came from, and when I learned it the second time around, it was damn beautiful.

My advice is to go back all the way to the beginning and get a book written by a real mathematician. I.M. Gelfand's Algebra and Trigonometry were truly enlightening.


Thanks for the book suggestion. I, too, would like to go through and replace the "method" regions with "conceptual" understanding.


In that vein, may I also suggest Ordinary Differential Equations by V.I. Arnol'd.


> So many professors would [...] say things like "...and the proof is trivial" or "...it obviously follows that...", and I'd sit there wanting to shout "NO, NO it's not obvious!"

Agreed. What makes a great teacher is not the depth of their knowledge of the subject matter, but how well they are able to put themselves in their students' shoes and overcome the "Curse of Knowledge"[1]. Unfortunately for students, university professors are usually hired for the former (knowledge & research) and not the latter (real teaching ability).

[1] http://en.wikipedia.org/wiki/Curse_of_knowledge


I really felt college-level math (abstract algebra, groups, monoids and such) was abstract painting in hieroglyphics, until I made a full turn into programming, where you start to speak about abstract patterns that make absolutely no sense at first (iterators, monads). Then I felt that this way of thinking was about finding your own solutions by being "mathematical" in the way you model your problem. Abstract algebra started to feel like 'math + generics' and I went back to re-reading my college textbooks. I still don't understand more than 15%, but I feel I have a chance.

I agree about the way it's taught: teachers either forgot their own learning process, or they're all very advanced brains aiming at younger advanced brains that can unfold the possible applications behind the abstractions.


I know that my personal experience won't be of much help to lots of HNers who are past the high school / early college age, but I can say that in my case my love and understanding of Maths was inspired by two great teachers: my high school Maths teacher and my Calculus prof in my first year of college.

Maths is not about arithmetic computations or getting the "exact" answer; it is realizing that things like convergent series or the Real numbers are extraordinary things, almost magical, as in you somehow get the sense they all come from a different Universe. Sometimes one is lucky enough to have these things revealed to him, as happened for me.


This resonates with me- particularly the 'it obviously follows', (huge jump in complexity and then --->) 'so we already know that x, so obviously- y'.

I think a lot of teachers aren't cognizant of the fact that what is obvious to them isn't automatically obvious to students. I had what I would say is at best a mediocre HS math teacher and completely tuned out. It was my fault in the end, but the teacher didn't help.

I'm now trying to learn a bunch of things that require math and mathematic theory when you get to higher levels. So - learning math is what I have to do. It's kind of fun, and yes it is hard work. Like, brain-hurting hard work.


I usually just raise my hand and ask the professor to elaborate. If it's genuinely time intensive, they'd ask me to stop by during office hours.


> I hit a wall in college where math just stopped being something I intuitively "got".

That's the college experience. Everyone who goes to that school is smart, and the professors understand that, so they teach at that level. All that remains is hard work.


One thing that worked well for my wife was a series of games from Nintendo called Fire Emblem.

This isn't a series of educational games: it's actually turn-based strategy. But the mechanics are all derived from very simple arithmetic, and although they don't give you the actual formulas, they give you, up front, every single number that goes into those formulas. Use a FAQ to get the small number of formulas involved, and the randomness all but vanishes: for any given unit on the field, you can always tell exactly which other units can attack it, how many times those units can try to attack, how much damage they'll do if they hit (and the exact odds of them hitting), how much damage you will do in response if your counterattack hits, and so on. The game will do this for you, but only for units that are directly in range at any given moment. With the numbers, you can calculate for any unit on the field, and that lets you start thinking multiple moves ahead.
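To give a flavour of the kind of arithmetic involved, here's a tiny sketch (the formula and numbers are made up for illustration, not Fire Emblem's actual mechanics, which vary by game):

    def expected_damage(strength, weapon_might, enemy_defense, hit_chance, strikes=1):
        # Hypothetical damage formula: attack power minus defense, scaled by hit odds.
        damage_per_hit = max(strength + weapon_might - enemy_defense, 0)
        return strikes * hit_chance * damage_per_hit

    # Can this unit be expected to finish off a 20 HP enemy this turn?
    print(expected_damage(12, 5, 7, 0.85, strikes=2))  # 17.0 -- probably not

Once every input is visible, planning a turn is just this sort of arithmetic repeated across the map.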

The end result is that if you work out the math in your head, you can Neo your way through the games, and this is exactly what my wife did. I have never been the math-head in the family, but before she started this, I was still handier with numbers than she was: now it's the other way around. I should look into this myself.


Great suggestion! Just for fun and a share: my parents didn't allow me to have a video game system as a kid, but they did allow me to have computer games for my Commodore 64(!).

My personal favorite was "Algebra Dragon" in which you slayed dragons by solving equations. To this day I'm convinced it helped my mindset towards math.


Some people are very good at chess. Similar idea. Are these people also good at math?

"One thing that worked well for my wife was a series of games from Nintendo called Fire Emblem."

So are you trying to make a scientific claim? This is starting to seem like an anecdote supporting brain training games.


No, that isn't remotely his claim.

What happened with his wife was that she became motivated to learn number crunching. The game provided or increased a motivation to become comfortable working with numbers. That's all that happened.

Brain training has nothing to do with learning math. It just uses arithmetic the same way cross training uses a mountain trail.


What are you even arguing about?


There are a lot of feel good nonsense posts in this thread. The parent gave no indication he was referring to motivation. It sounded like he was saying his wife suddenly became better at whatever he considers math to be by just playing a video game. That's basically claiming that brain training works. I may have a slightly different definition of "brain training", but it doesn't really matter because it's just as unscientific. There is no evidence to support the idea that playing video games somehow makes you better at "math" (where "math" is referring to those activities that require deductive thought and understanding of mathematical structures, not number crunching).


Not even so much "brain training works" as "practice makes perfect". It was a fun way to practice, and I thought it might be helpful for other people looking for a way to brush up on the basics.


She's practicing number crunching and maybe some other cognitive tasks that probably can't be related to mathematical thought, unless you squint really hard. I can identify at least one cognitive task: holding configurations in your head (I have played Fire Emblem). From my experience, I would say that this has nothing to do with the type of thought that goes into mathematics. Like I said in my first post: good chess players can do this well. Does that mean good chess players are also good at some part of math?

And, you know, one would have to show that there is some "mathematical thought" that can be trained to begin with. I'm not entirely sure there is.

I don't disagree that "practice makes perfect". I disagree with the statement "practice in fire emblem makes perfect in math (not number crunching)".

People want to share stories. I get it. But if we're not being rigorous about it, then we're just fooling ourselves. And then the conversation devolves into a circlejerk where everyone thinks they're brilliant.


We were all newbies once, and newbies have to start somewhere, even if it is not the most "pure" of beginnings. The kind of proof-snobbery you're displaying here likely bears a large part of the blame for driving people away from math in the first place.

Yes, there is more to math than number-crunching, but to claim that number-crunching is not math is to forget the very roots of the discipline. This is where it begins, and it's where people who have been out of practice for a while return. Give it some respect.


I aced all my high school math classes including calc. Because my school was small and rural they didn't have any more classes for me. So senior year I had no math.

Got to college, took a math placement exam and bombed out, so upset. Then as I was leaving knocked a chair down and everyone stared at me.

I ended up with a BFA in Drama.

Fast forward 10 years and like the author I worked my ass off to get into a top MBA program and not only that, major in finance.

So yeah, it can be done. Hard work, and not accepting the bullshit line "oh I'm not good at math." And without attacking my own gender, women tend to be let off the hook more easily with this excuse, as if we accept that girls can't do math.

Fuck that.

This post rocked. Thanks to the HN community for bringing it to my attention.


What was your path to an MBA? I'm assuming a degree in drama doesn't exactly prepare you for an MBA in finance. What happened in between that gave you the motivation and whatever prereqs necessary to even be considered for a spot?


There are no real academic requirements for an MBA program other than you have an undergrad degree and did well in it. Prerequisites would be nil. On that note, most adcoms likely value work experience more than undergrad experience.

You don't jump right into advanced topics, most streams give you an intro sequence first assuming you know little to nothing about the topic.

That said, coursework is like < 50% of the value of an MBA; it really isn't the point of the degree.

Plenty of smart but somewhat wayward humanities/soc sci majors end up graduating into a random office job, find out that surprise! they perform well and enjoy business, and proceed to do an MBA.

Disclaimer: I'm not an MBA, this is second-hand from MBA'd friends and acquaintances.


It's true that there are intro sequences in some programs, but due to Cornell Johnson School's unique structure, we did an entire semester of accounting in six weeks; ditto for finance. We'd covered everything I learned in a semester of accounting within the first 4 days.

Agree about coursework not being the point; networking is. But coursework, to a point, is a factor for many things: career changing, or in my case, burrowing into math and finance so deeply you can hold your own with the best and brightest in the world. (For example, you learn to recognize when you are not the smartest person in the room, so it's time to shut up and let other people teach you.)


Thanks for asking.

1) Spent 2 years in an MFA program in Stage Management at Rutgers. I have dozens of blog posts in me about what theatre teaches you that is directly applicable to business, especially the collaborative nature of software development.

2) I burned out. I saw how Tony-award-winning lighting designers lived and realized that my desire to make art wasn't greater than my desire for health care, vacation and a nice place to live.

3) I took a year off to be an admin assistant. This was at the dawn of computers at every desk. Because I'd built my first computer at the age of 10, and taught myself to program in BASIC in junior high, I was very comfortable with computers and troubleshooting. I quickly morphed into the office computer guru: installed network cards, taught the CEO of the mergers and acquisitions firm how to use a mouse (!), etc. As a fun aside, one of my tasks was to print out my boss's mail for him. :P

4) Fortunately this same boss took a shine to me (in a not creepy way) and taught me a lot about finance, balance sheets, and how deals got done. He said to me one day, "You're smart enough to be on the other side of the desk, go do something with your life."

5) I took a job working for a company that sold equipment to the entertainment industry, turned around my first division of the company at the age of 25 (just as the internet rose), then took over a sales territory and tripled it. (That, along with my sales experience, got me fantastic references.)

6) About this time I got encouragement to go to business school since it was obvious I had a knack for it. I got good advice to not bother unless it was one of the best.

7) I took 3 classes to prove my status and A+ed all of them: microeconomics, business math, and accounting. University of Connecticut night school, while working full time.

8) I studied long and hard for the GMAT and got a great score. (Again, I was just before the online practice questions. I used to sneak the test booklets out of Kaplan and photocopy them so I could work problem sets on the plane.) I scored high enough on the GMAT that when I graduated in the horrific post-9/11 economy I got a job at Kaplan teaching GMAT classes here in SF.

9) I wrote good essays about the future of tech and its impact on the entertainment industry, which have all since come to pass (i.e. content proliferation, the rising value of top talent).

10) Accepted to Cornell Johnson School at the age of 30 (top 10 school).

11) If I may humblebrag: in spite of the fact that I majored in finance so that I could improve my math skills, which cost me GPA, I still finished in the top third of my class.

Was luck a factor? for sure. But so was my persistence, drive, and commitment. Going through this process taught me a lot about those things. About never giving up. About working hard. I came from a dirt poor family and the internet didn't exist, so there was no one to really mentor me early on and I had to fix the mistakes of doubt and lack of self confidence and ignorance I'd made as a young woman.

People ask me how I got the courage to start a company. After what I just described, after pulling myself out of the hole of poverty and debt, this is a cakewalk.


Thanks for the response and congrats on all your success! Your story is very inspirational.


MBA programs not in the top 50 or so will let pretty much anyone in, even people with zero work experience.


For whatever reason, I hit a poor grades stretch in math for my first three years of high school. It was just boring, and high school had distractions. My school almost blocked me from taking calculus. When I hit Calculus, I had my "Aha, I get it now" moment, and had great grades in math ever since.

Some students don't need to be motivated to work hard. Others do. Some in the latter camp are led to believe that they're not good at math, when the reality is that they're just not motivated for it. I'm glad that I ultimately found my motivation.

The OP seems to have found this moment too.


>Some students don't need to be motivated to work hard. Others do. Some in the latter camp are led to believe that they're not good at math, when the reality is that they're just not motivated for it. I'm glad that I ultimately found my motivation.

Approaching it without other academic pressures, and with plenty of confidence from being successful in other areas, also helps. Plus the article gave no hint that he was taking a full schedule. Personally, I would likely get bad math grades if I had to go back and study math alongside everything else.

Perhaps the students who "didn't get" or were "bad at" math just need to be given some time to focus only on math to raise their confidence and abilities. I had similar results to the author when all I was doing was working and doing some math during the day. I also barely graduated high school.


I'm currently going back and learning linear algebra from the excellent MIT OCW course [1]. I somehow didn't have to learn linear algebra in the course of a CS degree at a big state school. Focusing solely on this one subject has allowed me to grasp it at a much deeper level than just learning enough to pass a test.

I'm hoping to move on to statistics, which I slept through because I was an immature shit-head my sophomore year.

Then a review of multivariable and diff-eq, and the end goal is to do the full Stanford Machine Learning class, taught by Andrew Ng [2]. I also really want to tackle "Underactuated Robotics", which is another MIT OCW course.[3]

[1] http://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebr...

[2] http://see.stanford.edu/see/courseinfo.aspx?coll=348ca38a-3a...

[3] http://ocw.mit.edu/courses/electrical-engineering-and-comput...


It's amazing how motivation increases once you know what it's useful for. :-)


Like the OP, I also went to journalism school and can confirm that, while Statistics 101 is sometimes a highly recommended elective, math is seen as not needed, especially for those who want to tell "stories with impact."

I would've agreed back then as a student, but I also happened to be studying computer engineering so I took math for granted. In the professional world, it's astonishing how hard it is to explain ratios and basic enumeration to those who didn't try math, and how that greatly affects the range of story ideas you can conceive of.

And I say that as someone who still has to look up the quadratic equation...something virtually all college grads learned at least in high school. But there's a huge chasm between knowing that the quadratic equation exists and is applicable and not remembering that it exists at all.


> [I do not] carry such information in my mind since it is readily available in books. ...The value of a college education is not the learning of many facts but the training of the mind to think.

-- Albert Einstein

Source: http://en.wikiquote.org/wiki/Albert_Einstein

The above quote seems to be the genesis of the folk trope, "Never commit to memory that which can be looked up in a book." To me, what you describe above is why people think they are "bad at math," because they cannot commit to memory long equations with foreign-looking symbols. Even the function sheets provided during some math exams are needlessly obtuse, for the sake of proving whether or not somebody can remember exactly what the different constants or inputs into the functions are. It would be like encountering a function in somebody's code that had cryptically-named parameters without comments or documentation.


"The difference between good at math and bad at math is hard work. It’s trying. It’s trying hard. It’s trying harder than you’ve ever tried before."

While that's certainly true, when people say they are "bad at maths" they usually mean exactly that: they have to put in a lot more effort (= trying harder) to reach the same level of math skill as the "gifted" guys.


I was one of the "gifted" math students in high school, and I highly doubt anyone "bad at math" put in nearly the effort I did.

The ease of doing it in class or breezing through homework assignments was backed by hours a night of looking at high level concepts and talking about math with people more advanced than me, including teachers during lunch breaks and after school - a habit I'd had since late elementary school.

At the end of the day, they weren't "bad at math"; I just put a lot more effort into my practice - and it showed on game day, as it were.

Just like the guy in my school who was "gifted" at basketball spent hours a night practicing since early elementary school.


I was one of the gifted math students in high school, and I put in a very little amount of work.

In college, I had to put in a lot more work, but the math still came somewhat naturally to me. I'm pretty sure I still had to put in less work than some of the other college students to reach the same level of understanding, though.

I neither put in as much work, nor reached the same level of understanding, as the 'gifted' college students, though.


This is generally the case; however, for those truly gifted at math, high-school math, and for the most part college math, is fully understandable by intuition alone.


Except... just because you haven't observed it doesn't mean it isn't true. And just because you believe you're gifted doesn't mean you are. It's really easy to be a big fish in a small pond. It's also possible to think you're a big fish in a big pond when you swim in a pack with smaller fish.

edit: I just realized you said high school.


I also put "gifted" in quotes for similar reasons.

I picked high school though because most of the comparisons about being "bad at math" start around then. The reality is that I just put in more work than many people and was a class or two ahead of the average student up through the start of university, where I went on to study math (and was at a similar starting point with most other math majors at my particular school).

At the end of the day, I'm just an average fish in the math pond, but it's easy to think you're not as good if you don't see the work backing my talent in high school and base your comparisons for life on that (or even the first year of undergrad).


There is something to this, but I think it is very easy to discount the amount of work that others put in. Also, these things are cumulative, so someone who has put in more work earlier in their schooling will have an easier time learning the new material than someone who has coasted by.


Jim Fowler's calc II course on Coursera[1] is wrapping up right now, and I think he's the most engaging math professor I've ever had the pleasure of learning from. Highly recommend.

He's also a maintainer of an OSS MOOC platform, MOOCulus[2], built with Rails.

[1]: https://class.coursera.org/sequence-001/class [2]: https://github.com/ASCTech/mooculus


Even if you're not interested in doing the course work (though really, it's only six homeworks) everyone should take a second to watch some of his videos... definitely some of the most entertaining, engaging, and clear math lectures I've ever seen. I also highly recommend it.


(I'm Jim Fowler.) Thank you---I certainly appreciate it.

The Calculus One videos are available at http://youtube.com/kisonecat and the new Calculus Two: Sequences and Series videos will be posted soon.


I feel this applies to programming, too. I do not buy that some people can "just program" and others cannot.


This. Whenever someone tells me programming is some kind of magic, I tell them to write me a note with the steps for making pancakes. That's the most basic form of programming I can think of. That opens their eyes.


Well people have varying abilities towards algorithmic thinking, abstraction/generalization, spatial reasoning, creativity, focus and attention to detail, etc.


Very few people invent math or learn programming on their own without any help, but there are huge differences in innate ability.

I don't agree with the author's claim "The difference between good at math and bad at math is hard work."

I have an M.Sc. in math, but never worked hard at it. I went to the lectures, paid attention to what was said, and attended most of the exercise classes, but, except for the first few weeks, never prepared for either of them. That left plenty of time to spend on other things, such as sport and reading The Art of Computer Programming, Compute! Magazine, Byte, etc. (Partly because of that, I am not that good at math.)

IMO, it is the same with about everything else. For example, there are people who just throw a baseball well, and there are those who can learn to throw a decent one through hard work.


I think you would at least need to take a fairly broad view of "innate" that extends past the literal moment of embryo fertilization. How someone's raised in childhood can have a huge impact on what seems innate by the time they get to their teenage years. I've been "good with computers" since a young age, for example, but I also had an Apple //c in my house since I was 3 years old, and parents who encouraged me to use it. I don't think my teenage computer proficiency would have been the same if that had not been the case.


I think programming is much easier to "learn on your own". I would define "learn on your own" as without an instructor - so with books, tutorials, etc.

Two reasons:

1. The desire to create something or solve a real problem

2. The interpreter


> [The claim that] "bad at math" was a thing — probably even genetic... was all a lie.

I disagree. Math ability is highly genetic: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2913421


At first, I was terrible at maths at University and indeed had to retake a year due to my awful attitude. Oddly enough after almost flunking out completely I came to really enjoy the more abstract mathematical aspects of CS (e.g. lambda calculus) and even in the mainstream maths classes I ended up getting rather splendid marks which eventually led to me getting a 1st.

At a post-grad level I then ended up working in an Electrical Engineering department with Control Engineers - rather ironic for someone who started off almost failing because of maths...

Maths went from something I had no interest in, and therefore did terribly, to something I loved.... maybe it was age/maturity or just plain getting a scare!

[NB I'm very happy we had the UK style degree grading system rather than GPAs]


I know he mentions people doing their own googling, but I'd like to recommend this coursera course: https://www.coursera.org/course/maththink

It's one of my favorite MOOCs of all time, a fantastic intro to mathematical thinking.


I think the author is making an excellent point about needing better math in journalism. An article in the NY Times yesterday about online shopping in China had this line (pulled from a press release): "Tmall.com, one of Alibaba’s shopping sites, said Chinese bought...two million pairs of underpants, which if linked together would stretch 1,800 miles..." Why should I believe the rest of the article when the author is quoting someone saying that underpants are 57 inches wide?


To cross check your claim about the author, I did:

1800 miles ~ 2000 miles; 5280 ft/mile ~ 5000 ft/mile, so roughly 2000 * 5000 = 10M ft. Then 10M ft / 2M underpants = 5 ft/underpant, and 12 * 5 = 60 in/underpant, which is close to 57, so I trust your statement :)
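And the exact version, as a quick Python check (my own addition; numbers from the parent comments):

    miles = 1800
    pairs = 2000000
    inches_total = miles * 5280 * 12   # miles -> feet -> inches
    print(inches_total / pairs)        # 57.024 inches per pair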

I think this sort of "back of the envelope" calculation is something that just comes by practice/habit and is not some innate ability. I love this article by Jon Bentley on the topic: http://www.csie.fju.edu.tw/~yeh/research/papers/os-reading-l...


Sorry, but you Americans need to get rid of that imperial system: 1800 miles ~ 2000 km = 2,000,000 m. Each pair is around a metre. Simple, isn't it? Okay, admittedly, 1800 miles is definitely more than 2000 km...


This is great and needs to be said more often. I feel as though I am the opposite. I absolutely love math, but I am terrible at it. I have taken tons of math, both formally in undergrad (up through diff eq) and on the side on my own. I struggled like hell through Calc 2 because I just couldn't visualize the transformations. I still feel like I want to take two years off and just start over from scratch.


I have the exact opposite problem in math courses. I can visualize the transformations and what the math means, I can know the exact approach to take to solve a problem, but I can't seem to get all the way through without making stupid mistakes. "Partial Credit" for showing work never really helped me much.


You are good at math and bad at calculation. There are branches of math where calculation is not so important... also, use a computer for the calculations :)


What branches of math are good for people who are bad at calculation?


Abstract algebra, number theory, geometry, logic, topology, graph theory - off the top of my head.


Use Mathematica.. eliminate algebraic mistakes.


I never really got math when I was in school, but after years of programming, for some reason I've become good at it.


Good for you!

"The difference between good at math and bad at math is hard work. It’s trying. It’s trying hard. It’s trying harder than you’ve ever tried before. That’s it."

I love this quote. I think that nails it. When you see someone run a marathon, you don't think they are just naturally good at exercise, you recognize that they've put in a lot of hard work.

I think the value of hard work is underappreciated. Thanks for writing this.


I think the main problem is kids aren't taught abstract problem solving, and don't learn how to process the deep concepts that you find in math. This is largely due to our rigid, rote memorization math teaching methods (here is formula x, here are the types of problems formula x can be applied to, now do this practice set with 100 of them, and so on).

The majority of math currently taught K-12 is also largely useless to most students who don't go into hard sciences. This time would be better spent learning math as an art. They could struggle with problems without knowing the formulas ahead of time. They could modify the axioms and see where that leads. And so on, working their way through the different areas of math, and actually internalizing the concepts. In high school, the curriculum would include a life-skills arithmetic/statistics class, to ensure that day-to-day practical math skills are learned.

There was a really good paper on HN a few months ago about math as an art form, which influenced my thinking in this regard.


I agree completely. I'm working on a curriculum just like this :)


Working hard at math might be necessary, but it doesn't have to be painful! I have found learning about the history of math has me puzzling over difficult concepts on my own time because it is fun.

The book I am reading (Journey Through Genius) is really good, but not very advanced math. I would love to find some self-study math courses that approach math not as a bunch of symbol-pushing but as an art with a history.


I've thought a lot about this and haven't found many good resources. I finally gave up and started reading history of math books. Unfortunately, most of them are aimed at grad students in history or grad students in math.

It has frustrated me enough that I've started developing my own versions. I taught a history of ancient mathematics class to homeschoolers locally and now I'm writing a book that works through Euclid while placing everything in historical context and focusing on the story of its development.


I found Lancelot Hogben's "Mathematics for the Million", originally published in 1937, to combine history and biography, concepts and calculations, explanations and illustrations, in a very engaging manner.


Sounds cool!


Absolutely have to recommend Paul Lockhart's Measurement [1]! Journey Through Genius is a great book as well.

[1] http://www.hup.harvard.edu/catalog.php?isbn=9780674057555


There are some great books to build up some intuition and rigor in all branches of math, with no prior knowledge:

* How to Prove It (Velleman)

* Algebra; Trigonometry; Functions and Graphs; The Method of Coordinates (Gelfand)

* Geometry (Kiselev)

* Calculus Made Easy (Thompson)

* How to Count without Counting (Niven)

* Introduction to Probability Theory (Hoel)

* The Little Schemer (Friedman)

Then proceed to more advanced texts like:

* Naive Set Theory (Halmos)

* Linear Algebra Done Right (Axler)

* Geometry Revisited (Coxeter)

* Infinitesimal Calculus (Keisler)

* Concrete Mathematics (Graham)

* Information Theory, Inference and Learning (MacKay)

* SICP (Abelson)


All of these are great. I'd like to add, maybe not for everyone, Calculus by Spivak. For me, it was the calculus book for which I had been looking for a long time.


I always thought of myself as good at math, but I had an unexpectedly hard time in required freshman calculus courses en route to a CS degree. I think the source of all of these paradoxes is that math is too broad a thing to be simply good or bad at. The objective in first year calculus seemed to be to practice basic techniques until they became rote. If you were an engineer, you'd have much more calculus to do and the early stuff was like learning to tie your shoes. I didn't encounter any really mind-bending concepts (I actually had to retake some calculus I originally took in high school and placed out of); the problem was that I couldn't do the problems fast enough on the tests, and I had to devote way more energy than expected to practicing doing them fast, which felt sort of like learning to play a musical instrument -- training muscle memory so the mind didn't have to be involved. I figured that calculus wasn't for me and I stopped after I met the degree requirements.

My point is that different scenarios involve different "math." For engineering, math is a hill to climb on the way to the interesting part. For CS, math is not about learning math per se; it's about learning to prove things and think rigorously. For math majors, it's about deep concepts in math itself. You could be good at proofs and bad at adding numbers in your head, and be good and bad at "math" at the same time.

Dividing people into good at math and bad at math is pretty meaningless. If you were truly bad at all math, you could probably not function in any job, because everyone needs to do something that could be called math. On the other hand, hardly anyone is just plain good at math, because you would have to be good at pretty much every hard subject that exists (statistics, economics, physics, and so on). If anyone is that smart, there can't be more than a couple of them.


My wife has dyscalculia[1]. She might take offense at the statement that "Bad at math" is a lie you tell yourself to make failure at math hurt less.

1. http://en.wikipedia.org/wiki/Dyscalculia


So she's not "bad at math", she has dyscalculia. Like someone who's wheelchair-bound with a developmental disability isn't "bad at walking".


> Estimates of the prevalence of dyscalculia range between 3 and 6% of the population.

I think this article is targeting a much larger audience than the small percentage of the population that has genuine problems with math.


All the author needs to do is reword a bit: "Bad at math" is a lie that some people tell themselves to make failure at math hurt less.


Indeed, that's what I was going for.


Did you consider asking your wife whether or not she takes offense and whether or not she wants you to do it on her behalf?


Part of what makes (re)learning basic math difficult is the lack of math textbooks written for adults. Most math books are either too advanced (they assume prior knowledge) or too basic (they assume the reader is an idiot). In both cases, an otherwise intelligent adult with some gaps in their knowledge will get discouraged.

I wrote a math textbook which starts off from the very basics (numbers, equations, functions) and proceeds all the way to university level topics like calculus and mechanics.

Check out the "No bullshit guide to math and physics:" http://minireference.com


This is what I tend to struggle with when reading maths textbooks.

Example from your page:

Suppose that you monitor the file size during the entire download and observe that it is described by the function:

f(t) = 0.002t^2 [MB].

- 'Suppose that ... it is described by the function' - Where does this function come from? Is it derived from something else?


I understand your unease. In this case I totally pulled this equation f(t)=0.002t^2 out of nowhere. It is just "some" function that satisfies f(0)=0 and f(600sec)=720MB.

I basically skipped (invented) the modelling step. In practice you would observe the download size f as a function of time t, and then ask yourself "which function f(t) correctly describes what I see." I wanted to describe a download that gets faster and faster with time, so I picked a quadratic function f(t)=At^2 as a template, then chose A=0.002 so the function matches the problem statement.

Suppose instead I wanted the download rate to be uniform (constant download speed); then the function describing the file size would be of the form f(t) = mt + b. More specifically, the function would be f(t) = (720/600)t + 0 = 1.2t. (You can check that f(0)=0 and f(600)=720, as in the problem setting.)

In general, once you become familiar with the main function families f(t)=mt+b, f(t)=at^2+bt+c, f(t)=Ae^(kt), f(t)=ln(t), f(t)=A sin(kt-ϕ), etc., you will be able to describe real-world phenomena using one of these equations. This ability to "model" the real world through mathematical equations is one of the super-powers that comes with math knowledge, but my download example doesn't quite manage to communicate that. I will have to rework it...
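In case a concrete check helps, here is a minimal Python sketch (my own, not from the book) of the two hypothetical download models above: the quadratic f(t) = 0.002t^2 and the linear f(t) = 1.2t, both constrained to start at 0 MB and reach 720 MB after 600 seconds:

    # Two candidate models for file size (in MB) as a function of time (in seconds),
    # both constrained to start empty and reach 720 MB at t = 600 s.

    def quadratic_model(t, A=0.002):
        """Download that speeds up over time: f(t) = A * t**2."""
        return A * t**2

    def linear_model(t, m=720 / 600):
        """Download at constant speed: f(t) = m * t."""
        return m * t

    for f in (quadratic_model, linear_model):
        assert abs(f(0) - 0) < 1e-9       # file is empty at t = 0
        assert abs(f(600) - 720) < 1e-9   # 720 MB downloaded at t = 600 s
        print(f.__name__, [round(f(t), 1) for t in (0, 150, 300, 450, 600)])

Running it prints 0.0, 45.0, 180.0, 405.0, 720.0 for the quadratic model and 0.0, 180.0, 360.0, 540.0, 720.0 for the linear one, which shows the "gets faster with time" versus "constant speed" difference at a glance.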

For a slightly better example of "modelling" using equations, check out page 26 in the PDF preview: http://cnd.mcgill.ca/~ivan/miniref/miniref_v4_preview.pdf#pa...


I'm especially interested to hear from HN readers about good methods for teaching math to children.

My son is 3, so at the moment we're just counting everything and making little groups to add them together.


I have a 4yo and 6yo. The 4yo can add 2 to some numbers up to about 20. The 6yo can add any single digit to any number below 100. Most of the time. They both can read pretty well. We practice both math and reading every 'school' night for about 15 minutes.

The only goal is to familiarize them with doing stuff with numbers. The 6yo's kindergarten class has heavy emphasis on math, so I try not to get in the way of what they are doing. I avoid confusion by just staying away from the same stuff (at first, I thought I'd reinforce, but that turned out to be a bad idea. The 6yo needs a disconnect from what's happening in school, not a reminder. It's tiring).

The one thing I can confidently say is that what works changes as they pick up more concepts, and have more influences on learning (ie, other parent, school, other kids). My kids are not especially smart or gifted. The trick is that we stay just at the border of what they can do. We try new things, which fuels their confidence when they get it. We practice old things, which fuels their confidence with little cost. We stay focused and brief.

I try to pick up as much information and as many pedagogical tricks as I can. But they only work, if they work, for brief points in time. Kid thinking changes much faster than adult thinking.


This is just my opinion and intuition (I have no experience with it), but I have the idea that establishing that numbers are an abstraction is probably more important than adding them together. I suspect that rote practice of arithmetic is mostly just an easy and transmissible way to help establish this concept. It sort of follows that people who have trouble with things after arithmetic are often missing this abstraction: plenty of people who think they are bad at math can tell you how long it will take to drive somewhere (basic algebra); they just don't like multiplying.

A concrete way of demonstrating this would be to point out that several same size groups of objects have something in common.


I wouldn't focus on "math" so much as logic at that age. This link may be useful to you - http://www.scholastic.com/teachers/article/ages-stages-helpi... (my source: my wife, who is a Reading Corps member that also assists in teaching math to 4 year olds)


I have been researching this for a couple of years and can recommend a couple of books that I thought were particularly good:

* Good questions for math teaching (http://amzn.com/0941355519)

* Young Mathematicians at Work (http://amzn.com/032500353X)

* Number Sense Routines (http://amzn.com/1571107908)

* Dr Wright's Kitchen Table Math book (there are three) - http://amzn.com/0982921128

I haven't finished these next two, but they look promising:

* Fostering Geometric Thinking (http://amzn.com/0325011486)

* Fostering Algebraic Thinking (http://amzn.com/0325001545)

I also like the approach of the Art of Problem Solving curriculum, though it doesn't currently have elementary school material. I haven't used it on my own kids yet because they're still too young, but I did buy the set and I think it looks good. If you read "Good Questions for Math Teaching" you will probably modify the way the problems are presented to make them more open-ended, but I like that topics are introduced with a problem that is later explained, instead of explaining then drilling.

http://www.artofproblemsolving.com/Store/curriculum.php

Finally, I thought that this online course was a nice introduction to the approach. It's a pretty short course, but I just put the audio on an mp3 player so I could listen while working on other things and there were only a couple of places where I couldn't tell what was happening in the video.

* https://class.stanford.edu/courses/Education/EDUC115N/How_to...

For more general books about early childhood education, I really liked

* "Engaging Children's Minds - the Project Approach" (http://amzn.com/1567505015)

* Making Thinking Visible (http://amzn.com/047091551X)

* Learning Intelligence: Cognitive acceleration...(http://amzn.com/0335211364)

Sadly, most of these books are pretty pricey for the page count, but I thought the material was quite good. If you were to get only two, I'd say go with "Young Mathematicians at Work" and "Good Questions for Math Teaching" because they will probably give you the quickest jump start, especially if you can also take the "How to Learn Math" course.

If you're interested in where my evidence for the approach outlined here comes from, my main sources are the following books:

* Effectiveness In Learning (cognitive load theory) http://amzn.com/0787977284

* Visible Learning (synthesis of 800 meta-analyses) - http://amzn.com/0415476186

Edited because I put some of them in the wrong spots and forgot links...


The curriculum for Math 101 looks like something that would be taught in the ninth grade in a reasonable school district. http://bulletin.unl.edu/undergraduate/courses/MATH/101

It is alarming (for our country) that this needs to be offered at the university level, and even more alarming that most students are (according to the OP) failing it.


Math didn't really "stick" for me until Calc 2. I'm not sure why--perhaps just maturity?--but at that point it really clicked.


The thing about math is that you don't know what you're missing until you learn more. It sometimes lets you solve hard problems in an incredibly obvious and simple way, because you knew a mathematical solution existed. Ignorance of the existence of that solution would have led you to build a complicated and clunky chunk of code.

I recently was able to apply a few math techniques in a project to solve a problem elegantly[1][2]. It is useful in my career as a software developer.

[1] https://github.com/SGrondin/map-reporting/blob/master/src/ge... [2] https://github.com/SGrondin/map-reporting


I thought I was bad at math, but only compared to the people in my high school calculus class: one is now an econ prof at a top university, another an MD/PhD, and they weren't even the best students. The best student was so delighted to solve a problem that he would bust out laughing (he was one of the 2 top math students in the whole state, according to competition results). You never forget seeing somebody who is that entertained by studying math.

________

As with learning to program or play piano/violin, you can:

- take private lessons

- take group classes

- join some kind of study group or peer teaching. I've seen meetups and Google Groups for that, and here's a subreddit for Bishop's ML text, which takes a decent amount of concentration: http://www.reddit.com/r/mlstudy


My math career turned a corner when I started exercising in college. I would take flash cards of all the formulas I needed to memorize and just burn through them on my runs. I found that that was one of the best situations for me to learn the tools I needed to approach problems.

It was kind of by accident. I knew I needed to memorize the formulas, but I couldn't find a situation where I could sit down and do them without becoming anxious and distracting myself.

On a run, there is no alternative. The eventual runner's high helped me memorize things even more, and I found I'd associate the mental high that comes with working on a math problem for a long time with a runner's high.

I'm super excited to share this wisdom with my kids. I wish I had known this secret when I was in high school.


It's so sad how many things are taught so terribly in formal education, turning many people away from things like literature, math, and science. One of the things I'm looking forward to with the proliferation of mobile computing devices is the opportunity for interactive, self-directed learning through educational applications. Not only will it allow people to learn at their own pace and take the time to focus on the aspects they're having a hard time getting (rather than being driven over like a speed bump, the way conventional learning works), but it will also allow for a diversity of teaching/learning styles.

I have a feeling that in a century people will look back at formal education today and view it as barbaric and ineffective.


The number of us who look at the school system today and see it as barbaric and ineffective is growing rapidly...


I told myself I was bad at math all the way through high school. I realize now I was just lazy. Thankfully, through game development I discovered how cool math can be when you actually do stuff with it.

Now I find myself really enjoying differential equations.


>Journalism's problems aren't with journalism.

Journalism has many problems, but make no mistake, this is certainly the biggest. Journalism hasn't been a respectable profession for at least a hundred years. Sure, there are examples of good journalism, but sadly they are the exception. The overwhelming majority of journalism today is either meaningless pop-culture drivel or hyper-politicized, sensationalist propaganda.

It's not even the journalism industry's fault. It's just a sad fact that unbiased, factual articles are incredibly hard to write, and nearly impossible to make interesting to the average person.


The problem is that journalism, as a bias-controlled profession, is only just one hundred years old.

The National Press Club, founded in 1908, is seen as the beginning of professionalism in journalism. They were built upon promulgating a set of ethics and standards for journalism, including a focus on the facts, accuracy, and bias-control.

Before that there was very little focus on what we consider bias-control. In the early years of the United States up through the 1890s, newspapers were most often owned by political parties and were explicitly for slinging innuendo, suggestion, rumor, and scandal at the political opponents of that party. So if you think it's a new problem or that it's bad now, history has shown that it has been worse.


I didn't say that it was a new phenomenon; that's why I said it's been journalism's biggest problem for AT LEAST 100 years.


I've been wanting to write something similar for a long time. I left high school feeling very "bad at math", and now I'm a CS major with a math minor.

I think the biggest problem in my approach to learning was my refusal to practice. From other classes, I was used to reading the textbook and understanding immediately. That meant to study for math exams, I would read the chapters, try to learn principles, and memorize some formulas.

I later realized that I would never get any better without doing practice problems - and lots of them. The best advice I've gotten is to start with the problem, and then learn.


I find this more difficult to do as I progress. My calculus book was full of difficult example problems so that I was prepared to tackle the problem sets whereas my number theory book has some proofs and trivial examples, leaving me almost completely unprepared for the problem sets without hours and hours of struggle.


I just sent this link to a friend of mine. He just crammed enough high school pre-algebra to get his GED. He is terrified of math, but realizes he has to learn it. I worry for him if he doesn't. He's 36.


Thanks for this post! I'm 36 years old and now I'm even more pumped to get better at programming and start reviewing my MATH.

I just wanted to share with you guys that this morning I felt really bad on my way to work (I'm an iOS developer). I told myself that if only I were really good at math, maybe I could have used a lot more algorithms in the apps I was making. All I could think of until I got to the office was math math math math. And I admit I was envious of other people who are really good at math.

Now I'm not envious anymore! Thank you Mr. Matt Waite!


This article confuses statistics with (I assume) pure math. He talks about the use of math by journalists in data analysis and visualisation, when that's purely the realm of the statistician, who need not necessarily know any calculus. Similarly, I've met super-intelligent pure math PhD candidates who are working at the frontier of their (very esoteric) field but wouldn't know where to start with applying "data science" stats/ML techniques to some interesting slab of data.


If you think a statistician need not know calculus, I sure as hell hope you're not a statistician :) This is a central concept in stats: http://en.wikipedia.org/wiki/Maximum_likelihood_estimation


Your assumption being that I don't know calculus? Anyway, I stand by my original statement: if you're working in data analysis (even as a frequentist) you should be able to see the truth in it. "Data scientists" aren't deriving MLEs for wacky distributions, and using the mean to parametrise a Gaussian doesn't require calculus. Thanks for the link, I guess?


If you read any classic text on statistics like Feller you can't get past the first chapter without hitting calculus. Any statistician will be very, very familiar with the Central Limit Theorem, for example.


Yeah, of course they will. That doesn't mean they spend their days explicitly enumerating limits of functions, and limits are pre-calc anyway. The point isn't that calculus isn't /integral/ to mathematics, just that it's not _necessarily_ (I did say necessarily) the workhorse of the applied statistician.


No, I'm disagreeing with the statement that you don't need calculus as a statistician. You absolutely do. The expected value is an integral, and is the very first concept in stats / data analysis / whatever you want to call it.
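To make that concrete, here is a minimal sketch in standard textbook notation (not taken from the article), for a continuous random variable X with density f(x) and a log-likelihood ℓ(θ); both definitions are calculus from the first line:

    % Expected value of a continuous random variable is an integral.
    \mathbb{E}[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx

    % Maximum likelihood estimation: differentiate the log-likelihood and set it to zero.
    \hat{\theta} = \arg\max_{\theta} \ell(\theta),
    \qquad
    \left. \frac{d\ell(\theta)}{d\theta} \right|_{\theta = \hat{\theta}} = 0

Even the simplest example, estimating the mean of a Gaussian, comes out of setting that derivative to zero.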


It's a really great and inspiring article, and I'm thinking of taking a maths course myself. After being a straight-A student in school, I graduated from a Linguistics department, and here I am thinking I can barely add up simple sums in my head. I guess the disadvantage of not knowing maths is thinking differently from the tech guys I need to communicate with a lot. The thing that could bridge that gap in understanding should be mathematical thinking. So thank you for the kick to get the ball rolling.


Basic arithmetic isn't math in this context; sitting through a multivariate calculus MOOC isn't going to help you with addition.


Can anyone here recommend a good resource for learning discrete/computer math? I'm a self taught developer doing well in the CRUD/Enterprise arena. But whenever I try to delve into books about things like algorithm analysis I usually have to tap out after the first few chapters, which are usually heavy with mathematical explanations and proofs. I need to dedicate some time into learning the basic comp sci math foundation.


My high school math teacher used to say math is like handcraft. I'm a lousy student, but if you have the discipline and will to learn something, I'd look at where you tap out in the chapter and try to learn that. Math is built on a lot of concepts, and you have to solve the problem sets in order to grasp them. A lot of books are very dense and theoretical. Start small. Read the college textbooks for the algorithm courses and try to solve the problem sets on your own. If you are stuck, join some communities. There are subreddits and forums dedicated to math. If you ask there kindly, with a concrete problem and your efforts, you'll get a helpful answer. That's the only thing that works for me: solve the problem sets on your own. Everything will be easier. And don't take shortcuts. That's at least my experience.

There is http://en.wikipedia.org/wiki/Introduction_to_Algorithms, which is pretty readable.

And there are a lot of MOOCs:

https://www.coursera.org/course/algs4partI

https://www.coursera.org/course/algs4partII

https://www.coursera.org/course/algo

https://www.coursera.org/course/algo2

If you work yourself up from there and solve one problem after another you'll be quite good at these things in a few months time.


You could always try the backdoor approach. Maybe try reading a few recreational books on computing such as The Tinkertoy Computer and The New Turing Omnibus to familiarize yourself with some of the concepts then perhaps swing back to the hard stuff with a proper textbook like Concrete Mathematics by Graham, Knuth, and Patashnik.


>Young man, in mathematics you don't understand things. You just get used to them.

John von Neumann

The only way to get used to things is to work with them regularly!


I feel like a tool for saying this, but getting a B in a calculus class doesn't qualify as "good at math." I was hoping this would be a story about someone bad at math ending up majoring in it. If the author had to put this much effort into it, well, no, they're probably not that good at math.


You should.

C is the average grade. B is above average. A is excellent. You think he's not "good" because he's above average while learning calculus at 37?


I see your point, but I see a world of difference between "good at math" as you've used it here and as it's used in the article.

When you (or I) hear "I'm not good at math", you presumably take that to mean "it doesn't come easily, and I certainly wouldn't want to study it full time". But I think that when a lot of people hear the words "good at math", they take it as a statement about just basic competence. People saying "I'm not good at math" all too often mean "numbers scare me" or "I would never be capable of understanding that at all". And I think that's what this article is responding to.


There's a foolproof way to learn math -- find somebody to coach you through enough problems until you "get it".

Alternatively -- and this is much more common -- self-coach. But then you're in the hard work territory that the author speaks of, since you'll be banging your way through a number of walls headfirst.


> I sat in the front row. I asked questions non-stop. I did all the homework. I did extra practice problems. I raised my hand to answer questions so much the instructor asked me to stop.

Oh no, you're THAT person.

https://twitter.com/MatureAge


I used to think I was 'bad at spelling'; I realize now it is the same thing.

I was 'good at math' but 'bad at spelling', and that seemed normal and natural. Except I found out it is just as much a lie as the opposite thought the author had.


It's funny, because I don't think it's the same. And I think this is a good example of why I don't agree with the article.

Even in 2nd and 3rd grade (American elementary school system), any time we had a "spelling test" I got 100%. The first time I looked at any word, I remembered how to spell it perfectly. To this day I still have that ability. I can read a 5,000-page essay and immediately notice every typo and grammar misuse; and when writing anything myself, the only reason I ever misspell a word is an accidental typo, not a result of forgetting the correct spelling.

I saw many classmates struggle with trying to remember the correct spelling of various words, and I never had to do any of that.

However, in contrast, I've always had to put in much, much more effort than some of my peers to properly grasp math concepts. Some people could basically be introduced to a concept once and get it. I wonder if this may be due to dyscalculia on my part, because the math I struggle with the most is basic arithmetic and algebra; I'm generally OK with the more advanced concepts, except when I have to do the real number crunching.


TL;DR: Study harder


Wow! That is great! Math is one of my fears too and I wish I had done the same when I was still in school.. (",)



