It's OK to suck at math; you won't be using much anyway.
I'm not sure how this stereotype got instilled in
me, but I'm slightly embarrassed to admit that I was
surprised to discover how little math is involved in
day-to-day programming, ...
It's OK not to have a PhD, and it's OK not to have a degree, but at least knowing about, and having a good, intuitive grasp of, what goes on in topology, linear algebra, analysis, graph theory, game theory, logic and more is immeasurably useful.
Unless you're doing the kind of coding that really doesn't require design or algorithms or data structures or communications or database work.
I've never used the stuff I did in my PhD, but time and time again, continuously (and I use the word correctly; I don't mean continually), I've used the stuff I learned along the way that enabled me to do my PhD.
Learn more math. It will be useful, even if only to give you the option of rigorous reasoning.
I surprised a fellow developer in the Dark Mists MUD by converting a table into a simple linear formula ("What should Y be when X is at this point in the range?"); this is grade school geometry, right? I just needed the formula for a line!
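The grade-school formula in question is just the line through two known points; a quick sketch (the variable names and the example table are mine, not from the original anecdote):

```python
def lerp(x, x0, y0, x1, y1):
    """Linear interpolation: y on the line through (x0, y0) and (x1, y1)."""
    return y0 + (x - x0) * (y1 - y0) / (x1 - x0)

# A lookup table mapping 0 -> 10 and 100 -> 50 collapses into one formula:
# a quarter of the way along x, y is a quarter of the way from 10 to 50.
print(lerp(25, 0, 10, 100, 50))  # 20.0
```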
In my regular work, it is surprising how many times people do not understand the need for variance or standard deviation when they are looking for quality in averages of their data sets.
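A minimal illustration of the point, with made-up numbers: two datasets can share an average while telling very different quality stories.

```python
import statistics

# Hypothetical measurements: same mean, wildly different consistency.
consistent = [9.9, 10.0, 10.1, 10.0, 10.0]
erratic = [2.0, 18.0, 5.0, 15.0, 10.0]

print(statistics.mean(consistent), statistics.mean(erratic))    # both ~10.0
print(statistics.stdev(consistent), statistics.stdev(erratic))  # ~0.07 vs ~6.7
```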
So web programming? If you have to put up charts of large data sets, things will get hairy without at least knowing how to interpolate the data points. A slow website may be a result of brute-force simplistic approaches in the code. Some of the packages we use (such as jQuery) require some good math knowledge to implement (example: drag-and-drop). We certainly cannot generalize that in web programming, it is acceptable to not know math.
You can get away with not using math in our industry if you leave it to others to do it for you.
Many people grab a scientific toolkit (SciPy and the like), but understanding what a Gaussian filter is, versus merely calling a function for it, does an immeasurable amount of good any time you need to decide what your particular dataset needs.
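The difference is roughly this: knowing that a Gaussian filter is just a weighted average with bell-curve weights tells you when it's the wrong tool (say, when edges must be preserved). A hand-rolled sketch of the 1-D case, no toolkit required (edge handling by clamping is one arbitrary choice among several):

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete 1-D Gaussian: sample exp(-x^2 / (2*sigma^2)), normalise to sum 1."""
    weights = [math.exp(-(x * x) / (2 * sigma * sigma))
               for x in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def smooth(signal, sigma=1.0, radius=2):
    """Convolve the signal with the kernel, clamping indices at the edges."""
    kernel = gaussian_kernel(sigma, radius)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - radius, 0), len(signal) - 1)
            acc += w * signal[j]
        out.append(acc)
    return out

noisy = [1, 1, 10, 1, 1]   # a lone spike
print(smooth(noisy))        # the spike is flattened, its neighbours pulled up
```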
The other thing is taste. If you are quite good at math, you will like the jobs where math is involved. Unfortunately, this can hurt if you only have some mundane programming job. I certainly cannot use my theoretical knowledge at my job; I can only use it in side projects. I know a mathematician with orders of magnitude more math knowledge than me: his taste is so abstract that even tasks I find exciting are boring to him, and only theoretical math research excites him. He is a research mathematician; he doesn't write any programs at all, and his research is not really related to practical things.
I know programmers who have less math knowledge than me but are more passionate about their day job, and perhaps for that reason they are even more productive at it (their day job is the same as mine). Passion is more important than knowledge.
So if you want to be a good and happy web developer, I think there IS such a thing as too much math knowledge.
I don't know what kind of programming the author does, but if I were sending such a letter to myself, I would absolutely not say 'It's OK to suck at math'.
I'd be much more likely to tell myself:
"Don't worry so much about learning about hardware specifics - spend lots of time on abstract math, it'll come in handy all sorts of places later".
As commented in another reply:
... Never used more than a trivial amount of the math
content, but the math processes of thought have been ...
How many people are crap at something, realise they can get along without it, and thereby conclude that it's useless? Most. It's like being incompetent and unaware of it. It's an unknown unknown.
The reverse is also true. If you know a lot of math, you don't really know how you might have gotten along as a programmer without it.
You might know that you would have struggled at the mathy programming you have done, but you wouldn't necessarily know if you would have excelled at other kinds of programming despite a lack of math depth.
I think your assumption is that people think like:
I like programming.
I'm bad at math.
Math doesn't help my programming anyways.
Instead I think people think like:
I like programming
I'm bad at math, so I don't like it.
I'm going to look at other fields to improve my programming.
I think the problem is, and the author hints at it, that people tend to over-glorify the importance of math at the expense of drowning out other fields, like prose.
In some fields like web programming, I would say that yes, my time is better spent learning how to write understandable code instead of how to solve complex equations. As a result, I probably won't ever program video games, but that's OK; games take a lot of math, and I wouldn't enjoy that anyway. It's not that I'm lazy and just don't want to get better at programming. I just want to get better at programming in the ways I enjoy most.
With infinite time, I would certainly do both.
I would say that yes, my time is better spent learning
how to write understandable code instead of how to
solve complex equations.
I don't solve complex equations, and yet I use math every day.
The problem I see is that people who might be really really good at advanced math are often completely turned off by high school math and the teaching thereof. Hence they never get the chance to see the really fun stuff that turns out to be useful in unexpected ways. Even more, I'd guess that the really good programmers who hate math are in that group.
I would so like to be able to test that hypothesis. There's a Ph.D. waiting in it.
That said, to re-summarize my point tersely: If someone doesn't like a particular field, I still feel that they can be successful programmers by focusing on fields that they do like which are also useful.
Can you explain what you mean by this?
In my experience, most people who go through math classes at a university, not to mention classes in high school, are mostly "blindly" following a template of how to do induction. They don't really understand the principle behind it, why it's a valid proof technique, and so on. Put another way, if I used induction incorrectly, I could probably still trick them into thinking it's a valid proof because it looks like induction.
The same goes for a lot of things in mathematics. The same also goes for recursion - most people don't really understand recursion (unless it's used at a "simple" level, like tree recursion).
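One way to see that recursion really is induction: a base case, plus a step that trusts the smaller problem is already solved. A small sketch (the nested-list "tree" is my own toy example, not from the comment):

```python
def depth(tree):
    """Depth of a nested-list 'tree'.
    Base case: a leaf (non-list) has depth 0.
    Inductive step: a node is one deeper than its deepest child."""
    if not isinstance(tree, list):
        return 0
    return 1 + max(depth(child) for child in tree)

print(depth([1, [2, [3]], 4]))  # 3
```

The proof that `depth` is correct is literally an induction on the structure of the tree, which is why faking either one fools the same audience.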
You mean like this one?
My college calculus professor used this example when she was teaching induction.
One of the best software engineers I know has a law degree and says exactly the same thing. So I suspect that the same is likely to be true for anybody else who's gotten a PhD or other advanced degree.
I would have thought they would be more useful for the sorts of things that programmers are bad at, like dealing with, or in, lies and half-truths, and not so useful for the 'long chains of precise reasoning' type of problems.
Of course, you might equally say that being able to handle the political side of life 'makes you a better programmer', but I don't think you'd mean it in the same way.
Ideally you'd want both.
and yes, I do mean that being able to handle the social side of life (including politics) makes you a better programmer in the same way.
agreed that you want both.
There can be many explanations for this, but the simplest is that when you have a tool in your tool chest, you tend to use it, and when you use a tool, you see opportunities to use it everywhere. Whereas if you don't have a tool, you don't see the opportunities it affords, so you don't know what you're missing.
Even if my conjecture is true, this doesn't mean that mathematics is more important than some other experience a programmer might bring to their career, so I am not suggesting that the author is wrong when suggesting it isn't important. Perhaps it's useful but less important than some other skills.
I'm not a mathematician, so I don't know, it's all blub to me.
The moment you reason about the control flow in your program, or even when you figure out that a for i=1 to 10 loop will terminate, you are doing math.
If the problems aren't hard, the math isn't hard, and you can even get away with some trial and error math (aka debugging). However, for harder problems such as a file system's source code, an encryption library, or a multi-threaded program, tweaking the code until it 'seems to be robust' is not the way to go. For those, you need good maths skills.
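For contrast, a small-scale sketch of what "reasoning instead of tweaking" looks like: binary search with its invariant and termination argument spelled out in comments, so you know it can't spin forever without ever running it.

```python
def binary_search(xs, target):
    """Index of target in sorted xs, or -1.
    Termination: hi - lo is a non-negative integer that strictly
    decreases every iteration, so the loop must stop."""
    lo, hi = 0, len(xs)
    while lo < hi:              # invariant: target, if present, is in xs[lo:hi]
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1        # interval shrinks by at least 1
        else:
            hi = mid            # interval shrinks by at least 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```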
my grandfather worked on AT&T System UNIX with what was basically an 8th grade education.
if you read "Secrets of Consulting" there's a few rules that my grandfather laid out; one of them is "If you're doing math above elementary level, you're doing it wrong".
I've done many very powerful things with that rule.
An anecdote. I once noticed in a web programming context that someone was doing the same calculations over and over. The way the code was organised made it necessary, results couldn't be cached, nor could they be precomputed.
I showed that a sub-class of the operations were commutative (that took several pages, and then a 2 day meeting with the chief designer) which allowed some of the loop/function-call orderings to be reversed. Compute time went down by a factor of 100, and scalability was achieved.
The problem reminded me of some stuff I'd done in ring theory classes, and had something in common with the idea of groups acting on a topological space. The math wasn't actually directly useful - it just brought it all into focus and gave me a way to think.
Extending the mapping analogy, I built a 3D terrain system with a simple mesh overlay, and drew a 3D route inside of the system. I had to map the route onto the terrain, and to do that I needed to understand a good amount of vector manipulation (cross products amongst others). Maybe this is 8th grade math in Russia, but it certainly isn't taught until College here in the US.
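For anyone who hasn't met it, the cross product mentioned above is only a few lines once you know the formula; it gives a vector perpendicular to two others, which is exactly what surface normals on a terrain mesh require (the mesh details themselves are omitted here):

```python
def cross(a, b):
    """Cross product of two 3-D vectors (tuples)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Two edges of a flat triangle in the XY plane give a normal pointing up Z:
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```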
So, yes writing a program itself doesn't require math, but many of the cool domains require math.
The cult of the Ph.D. is strong, but you don't actually need some kind of license to think logically about abstract concepts, any more than a musician needs a formal music education or a cat needs a formal gymnastics education.
However, there are a couple of problems with indrora's comment. First, Jerry Weinberg has a Ph.D.; that's a little beyond 8th grade. I also remember hearing him talk about majoring in physics as an undergrad because computer science didn't exist yet. And I've never heard of him working on Unix. He was an IBM guy after all.
More importantly, I doubt that Weinberg ever objected to using advanced math to solve technical problems, including programming problems, when appropriate. What he repeatedly (and rightly) has objected to many times is complex models and metrics that purport to quantify human behavior when in reality they're not measuring anything significant. Weinberg regards this as a form of escapism (my word not his), a way of hiding behind technicality to avoid facing human situations. He wrote a whole book, in fact, called First Order Measurement arguing that simple measurements -- including just plain personal observation -- are the best way to monitor complex systems like a software project.
Weinberg was the great humanizer of the software industry. You can see his huge influence on Agile (the good parts, not the slick parts, which I heard him denounce as early as 2004). I think it took enormous courage for him to talk about Virginia Satir in software circles - or would have, if software people had any idea what Satir did.
As for math, it was one of my majors as an undergraduate, and I am a member of Pi Mu Epsilon, and have published a number of mathematical papers, though not recently.
But, I'm a hacker at heart, always have been, always will be. I use math only when it's helpful to solve problems. That's true of any tool I use. Why? Because the higher the math or esoteric the tool, the more you diminish the number of other people who can understand what you're doing.
Indeed, that's why I'm now writing fiction that helps people grasp a few mathematical and software concepts. For instance, my mystery novel, "Freshman Murders," has a team of math geniuses as the "detective." And I still write techie stuff, when I have something to say.
- Gerald M. Weinberg <http://www.geraldmweinberg.com>
there is a strong correlation between the programmer's exposure to mathematics and his belief in its importance.
+ What is modulo arithmetic?
+ If p is prime and 1 <= a < p, why is a^(p-1)-1 a multiple of p?
+ If a prime is congruent to 3 mod 4, why is it never the sum of two squares?
+ If a prime is congruent to 1 mod 4, why is it always the sum of two squares?
+ Find examples of why path-wise connected is stronger than "not disconnected".
+ Show that in 2D if you consider parallel lines to converge at infinity, and that they all converge to the same point, then what you have is a 2-sphere.
+ Show that in 2D if you consider parallel lines to converge at infinity, and that non-parallel lines converge to different points at infinity, then what you have is a Moebius strip with its edge glued to a disk. (projective plane, or RP2)
+ Understand why the sum 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + ... diverges (and what that means)
+ Understand why the sum 1/2 + 1/4 + 1/8 + 1/16 + ... converges, what that means, and what it converges to.
++ Note: most undergraduate mathematicians get this wrong.
There's a bunch of stuff here, all straight off the top of my head. It's not necessarily good advice, but these are a few things I found interesting when I was 12 or 13.
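Several of these can at least be checked with a few lines of code, which is a nice way in for a programmer (the proofs, of course, are the interesting part):

```python
# Fermat's little theorem: for prime p and 1 <= a < p,
# a^(p-1) - 1 is a multiple of p, i.e. a^(p-1) = 1 (mod p).
for p in [5, 7, 13]:
    assert all(pow(a, p - 1, p) == 1 for a in range(1, p))

# The harmonic tail 1/2 + 1/3 + 1/4 + ... grows without bound (very slowly),
# while the geometric sum 1/2 + 1/4 + 1/8 + ... closes in on exactly 1.
harmonic = sum(1 / n for n in range(2, 100000))
geometric = sum(1 / 2 ** n for n in range(1, 60))
print(harmonic, geometric)  # harmonic keeps growing as the range grows; geometric is ~1
```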
My own take on this is that many programmers like to believe their chosen subset of the field (in the case of "learn math" articles, often people who work in game development or scientific computing) is the only "real" programming, and that everyone who chooses another subset is deluded, stupid or otherwise inferior.
Any time you combine previously separate ifs and use De Morgan's laws, that's basic mathematical logic at work.
True, in programming you need to think rigorously and abstractly, and the same's true in math. However to be a great programmer you also need systems thinking, the engineering skills to use components that don't always work as advertised, understanding of social dynamics [both how software is constructed and how it gets used], etc. etc. Math doesn't give you any of these.
So it seems to me that there are other ways to learn rigor and abstraction that are at least as good a preparation for programming as math: different branches of science, law, operations research, user experience, etc. etc.
He has deep mathematical talent.
Ken Thompson also has deep analytical abilities as demonstrated by his contributions to computer chess: https://chessprogramming.wikispaces.com/Ken+Thompson
Guido van Rossum holds a master's degree in Math and CS.
I'd say that it is common sense that mathematical talent is correlated with coding talent, provided that the person receives substantial training in coding. Generally speaking, everyone is a terrible coder for the first two years; it doesn't matter if you are a math genius, you still need to learn how to crank out good code.
The real question is: is there extra correlation between math talent and programming talent beyond both being correlated with g, the general intelligence factor?
It's not the math, it's not the advanced concepts. It's the ability to think in abstract terms, about abstract objects, rigorously sometimes, intuitively sometimes.
It's not the direct benefits. Knowing topology or analysis is unlikely ever to help you. Being able to do topology and analysis probably will, in subtle, unexpected and most often unnoticed ways.
Every single advanced athlete does progressive weight training. Why? Certainly not because they want to be able to lift weights. No, it's because of the side effects. Similarly, doing advanced math trains your mind in ways that are relevant to programming and unobtainable elsewhere.
I won't convince people who are already convinced that math is useless, and most people who haven't done any advanced math will see that they don't appear to use it, and thereby believe that it's useless.
It's all blub. Obviously.
In my experience, the single thing mathematical thinking helps with most is going meta: reasoning about the program as if the program were data. It is not a coincidence that smart people with mathematical talent want to go meta: they love to write frameworks, new programming languages, compilers, and (graphics, database) engines.
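A concrete (and admittedly Python-specific) taste of "program as data": parsing source code into a syntax tree and querying it like any other data structure. The snippet being analysed is an invented example.

```python
import ast

# Source code is just a string until you parse it -- then it's a tree
# you can walk, count, and transform like any other data.
source = """
def f(x): return x + 1
def g(y): return f(y) * 2
"""

tree = ast.parse(source)
functions = [node.name for node in ast.walk(tree)
             if isinstance(node, ast.FunctionDef)]
print(functions)  # ['f', 'g']
```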
But most programmers have to write good old application logic (so-called business logic) using languages, frameworks, engines, and technologies made by other people.
I'll go further: most people have to use bad programming languages, languages that were not their choice. Most people have to maintain bad-quality code created by other people.
For these tasks mathematical thinking is really secondary. You need other skills: a very good memory, the ability to find your way around a complex mess, and so on. If you are mathematically 'smart', you will find patterns, you will get bored, you will go meta, and they will need to find someone else for the mundane task.
Tool ... good. Tool with two thousand years of history behind it, very good!
However, I haven't noticed a similar degree of correlation between programming and Egyptian history... Knuth was big on literate programming, so maybe English literature could be a good tool too.
> The people who espouse otherwise either haven't studied enough math, or more usually, haven't done that much programming.
Dijkstra? Hoare? Knuth?
It is absolutely crucial to pick a first language with a deep, expressive vocabulary for this reason, and all programming languages are not created equal.
Not really. I started with BASIC on a ZX Spectrum 48, and I actually used it to write a real-world program that was used by real users; if you're curious, it was a pop quiz for biology classes. Yes, it's probably the worst code I've ever seen or written. Yes, it had all sorts of problems and bad practices: every one you could readily think of, and then some that would require you to be very imaginative. No, it didn't cripple my development as a programmer.
The problem is not in which language you pick as your first language. The problems start when you start believing that's all there is. Some of the most important things I've learned were things I encountered when learning a new, more powerful programming language, precisely because they were impossible (or extremely difficult, or too messy) to express using the old language. They challenged my understanding of programming and helped me develop even further as a programmer.
In the same way, if you want to learn OO, learn Java rather than C++, where the same concepts seem very much more complex. And that's probably true even if you want to program in C++ eventually!
Probably better to use python than either, because that's got the least accidental complexity of anything and can express a wide range of ideas, as far as I can tell.
My favourite language at the moment is clojure, but actually some of its nice features like pervasive immutability would make things hard for a beginner. Better to learn how to do everything and then decide which powers you want to not use.
So I guess my ideal beginner's computer today would be a modern ZX Spectrum running a version of python with a lisp syntax (for macros). pg's original vision for arc sounds like a good spec.
And of course an essential requirement would be a printed user manual as friendly and straightforward as the ZX BASIC manual or the TRS80 one.
This is, IMO, so wrong on many levels. My native tongue is Spanish, but I'm also fluent in English, French and German; however I find myself thinking in all of them with the exception of German (I guess I'm not as fluent as I thought I was in it ;)). In the same way, I started with C when I was around 12 years old, and I've had no problem thinking in C, Java, Ruby or Python (which are the languages I use mainly). I think as long as you understand the language itself you won't have to think back to a different one to process your thoughts.
And don't get me started on the math. Math is extremely important in programming. I have a friend who shares the thought that math is not important, and his code is hideous (OK, he uses ONLY PHP, as in his opinion all other languages are complicated and inflexible). Math can help you accomplish a plethora of things more easily, faster, and more productively.
'Languages differ not in what you can say, but in what you must say" -- Roman Jakobson
And yet every programmer with proficiency in several languages knows that some problems are easier solved in Python, others in C++, and others in AWK. Further, every polyglot I know will shift language according to nuances in the thoughts they're trying to express, because some languages have more/better/finer distinctions about different things.
In Spanish the words for "to wait" and "to hope" are the same. In some contexts the words for "even" and "same" are both translated into French as "même" (as in "même chose" - "same thing"). Perhaps these conflations of words show a conflation of concepts. Yes, the differences in concepts exist in the minds of the speakers, but the language to express them precisely isn't always there.
There's a Spanish joke: "¿Cómo se llama a un ascensor en Inglés?" - "Con su dedo." It doesn't translate into English - you need to read the Spanish. Again, vocabulary conflation.
SWH (the Sapir-Whorf hypothesis) has been debated here on HN before, many times, and no doubt it will be again, with the research-based linguists saying there's no difference in the language chosen, and the polyglots snorting derisively.
I've repeated this comment in a new submission since I think the subject deserves a discussion of its own:
Also, the joke does have a translation to English: "How do you call an elevator in Spanish?" "With your finger." Expressing yourself is a game of context and expressing an idea is not necessarily a verbatim translation from one language to another.
On another note, on the "to wait" and "to hope" comment, you are right, depending on the mannerisms of the people around you. I could very well say "tengo la esperanza de" is as much a translation as "espero que". Again, proof that context and semantics are more important than the language choices themselves.
"How do you call an elevator in English?" "With your finger."
And no-one is claiming that you can't do closures in C, but I am claiming that C-only programmers don't usually think in those terms, because the language makes them more complicated than they need to be.
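For reference, what "thinking in closures" looks like where the language supports them directly; in C the same thing needs a hand-rolled struct carrying the captured state plus a function pointer.

```python
def make_counter():
    """Return a function that remembers its own private count."""
    count = 0
    def next_value():
        nonlocal count   # capture and mutate the enclosing variable
        count += 1
        return count
    return next_value

tick = make_counter()
print(tick(), tick(), tick())  # 1 2 3
```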
I think the point is that you can only readily think in terms of what you know. Now, if you're fluent in English, French, German, and Spanish, it's almost a given that you'll be able to think in terms of any of the prominent concepts in those languages. I think the author was really saying that, if you learn a language with a broad range of concepts first, it can only help you as you progress, because you'll be able to actively use those concepts.
For example, I know a number of imperative languages, but I've only recently experimented with functional languages like Lisp. The conceptual shift is fascinating, and something I wouldn't have quite grasped if I constrained myself to imperative languages.
And math? I can't help but use it. When people ask for estimates, I always do some math based on known quantities rather than just guessing. I keep track of how far off my estimates were to apply later corrections and reason backwards to figure out which of my assumptions were wrong (leading to better estimates in the future). I've done merge sort by hand on stacks of paper (and found the speedup when you have a big gap between two merged piles and can move an entire chunk to the sorted pile).
And that's not even counting my job, where I end up doing lots of geometry. Knowing the formulas for mutually tangent circles, or the formula to find the two intersection points of two circles, is surprisingly useful (the case where you have zero or infinitely many intersection points just doesn't come up, and even a single point of multiplicity two is rare).
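One standard construction for the two-intersection case (degenerate cases deliberately omitted, since as noted they rarely come up): find the foot of the common chord on the line between the centres, then step half a chord-length perpendicular to it.

```python
import math

def circle_intersections(x0, y0, r0, x1, y1, r1):
    """The two intersection points of two properly intersecting circles."""
    d = math.hypot(x1 - x0, y1 - y0)
    a = (d * d + r0 * r0 - r1 * r1) / (2 * d)  # centre 0 to the chord's foot
    h = math.sqrt(r0 * r0 - a * a)             # half the chord length
    mx = x0 + a * (x1 - x0) / d                # foot of the chord
    my = y0 + a * (y1 - y0) / d
    return ((mx + h * (y1 - y0) / d, my - h * (x1 - x0) / d),
            (mx - h * (y1 - y0) / d, my + h * (x1 - x0) / d))

# Unit circles centred at (0, 0) and (1, 0) meet at x = 0.5, y = +/- sqrt(3)/2:
print(circle_intersections(0, 0, 1, 1, 0, 1))
```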
I wish I knew more math. I know more than most people and I even have a degree in it, but there's so much great stuff like linear algebra, statistics, graph theory, and game theory that I never feel like I know enough about any of them.
I am sorry, but this is bullshit. I am glad I no longer visit this crappy forum, filled with linguists turned Ruby devs masquerading as software engineers and computer scientists.
If you have an idea and you can make it run on a machine, that means you can convert it into machine code, which is what everything finally gets converted into anyway. Also, if this guy knew math, he'd know this is called the Turing principle: as long as the language is Turing complete, it makes no difference.
As someone said in another comment:
"especially for a web developer" is the important part.
As a fellow web developer, I agree. I'd certainly like
to get into the mathier side of programming someday, but
it's rarely necessary for my day-to-day operations.
Why limit your options?
I suspect that focusing on mathy programming jobs would tend to limit your job options, at least in most regions of the US.
It's not doing math that limits your options. I've never seen a programming job that wouldn't benefit from the programmers knowing more math. Every programming job I've seen has benefited from more math. And I've seen a lot of programming jobs, including ATC, weather forecasting, financial, CRUD, web scraping, automotive and more. Even gathering data from a database and presenting it in a template has occasionally benefited from a bit of obscure math.
It's just that people who think math is arithmetic, formulas, equations, calculations and calculus don't realize what they're missing, and just how widely applicable it is.
This is true until you learn enough, then it's not true any more.
I used to work with a guy who learned FoxPro (an old database program). Every problem we had on the team, he would look around and say, "Hey, you know what? We could solve this very easily if we only used FoxPro"
Everybody is like that. We just don't see it. So the author is correct as far as he takes it.
The difference is, once you've been exposed to a bunch of languages and problem-solving paradigms, you start to see patterns. And once you see the patterns, the languages aren't that important anymore, at least in the way you approach solutions. The coding itself may be more or less gnarly, but the way you go about fixing the problem is based on principles, not language constructs.
This is like when you learn a word processor. The first time you use one, you begin thinking of all word processing problems through this one lens. But once you learn 3 or 4 word processors, it's all kind of the same. Programming is just like that, only it takes about ten times longer.
I'd also note that no matter what tool or problem you have, if you keep refactoring and clarifying your code, you always end up in mostly the same spot -- some kind of DSL-ish place. That's true no matter what language you're using.
So yeah, he's right. But not really. Instead of focusing on which language you start with, focus on broadening your programming experience. Got enough language zealots in the world. No need to make more.
"Look, I'm sorry to be the one to break this to you, but if you have difficulty with any programming concept, you must not be a supergenius. You're just an ordinary genius at best. I'm sorry. Life isn't always fair.
Of course, I say this as someone who hasn't yet tried to learn Haskell. On the other hand, I know someone who competes at the national level and I never saw him have trouble with anything including Haskell, so...
The sad truth is that there are some people for whom programming comes as naturally as thinking, with code formed as easily as thoughts; and if it takes an effort to understand any aspect of programming, you have just learned that you are not one of those people. Alas."
Programming is too broad an umbrella of a term for any meaningful generalizations to be made about it. It ranges from artisan-type craft work to engineering to pure mathematical theorizing. You can't tell the people doing machine learning, signal processing or physics engines that they won't need math. That is just pure silliness. It is very context specific.
This person misses the point of math. It's not about arithmetic, it's not about calculus or group theory or combinatorics, or even numbers or proofs. No, those are by-products.
The true point of maths is distilling observations of past patterns into their essence and core attributes, packaging them, and then using them to lay the foundation for thinking more complex and less intuitive thoughts - not unlike libraries in programming. Look at the history of the development of mathematics: shoulders of giants and all that. Abstraction.
Maths is a creative endeavour, and if we had more memory, faster thinking speeds and less fallible intuitions, I dare say our maths would not be so 'advanced' and abstract. A lot of it would be built into everyday thinking. Maybe this is related to why many programmers think they don't need maths: the mathematical principles behind why what they do works don't necessarily need to be made apparent in order to get by. Or perhaps this is just a reflection of the youth of the field.
But I do believe that our maths is so advanced only because we are so stupid!
I'd also add that people online can often exacerbate the problem. "What, you don't know how to write an O(n) gnome sort!? Dude, you are not a programmer." It's really easy to forget that not everyone is an elite CS programmer who graduated with a 4.0. And not everyone needs to be. Learning is a process, and one we should help each other with.
That being said, I'm one of those totally wrong people who disagrees with:
You're lucky to have stumbled across a very good first language, but you're likely to encounter developers who will tell you that this decision doesn't matter. These people are wrong, plain and simple.
You won't know what's good and bad about your first language until you try a few others anyway. So don't sweat it. I'd say, avoid zealotry and dogma. Program in as many languages as you can handle.
It all comes down to the question "How do you solve a problem you don't know how to solve?" A lot of the time, hints for approaching such problems are given in certain areas of computer science and mathematics. If you don't have the tools (computer science and math), your life can get significantly harder for a particular subset of problems, which probably align pretty well with the problems you realize you don't know how to solve.
Programming is such a wide field. There are web designers who fit rectangles together all day long, and then there are the people who write Mathematica.
I would say "Languages do matter. All of them." Now, I certainly don't mean iterations on a single language (though experiencing that evolution first-hand can be very instructive), but insofar as Java 1.3 and C# 1.0 were nearly identical languages, they were still worth learning even at that time because experiencing the vastly different ecosystems completely cuts through all of the dogmatism that every language has.