Programming is hard (bryanwoods4e.com)
113 points by delano on Oct 20, 2010 | 95 comments

OK, here's a really biased point of view. The author says:

    It's OK to suck at math; you won't be using much anyway. 

    I'm not sure how this stereotype got instilled in
    me, but I'm slightly embarrassed to admit that I was
    surprised to discover how little math is involved in
    day-to-day programming, ...
That depends. The amazing thing is how knowing math can help in so many unexpected, tangential, apparently unconnected ways. Time and time again the guys I work with do stuff that would be easier if they knew more math. When I pair with them they frequently say that they wish they knew more math.

It's OK not to have a PhD, it's OK not to have a degree, but at least knowing about, and having a good, intuitive grasp of, what goes on in topology, linear algebra, analysis, graph theory, game theory, logic and more is immeasurably useful.

Unless you're doing the kind of coding that really doesn't require design or algorithms or data structures or communications or database work.

I've never used the stuff I did in my PhD, but time and time again, continuously (and I use the word correctly, I don't mean continually) I've used the stuff I learned on the way that enabled me to do my PhD.

Learn more math. It will be useful, even if only to give you the option of rigorous reasoning.

I agree with you. I've found many reasons to know math during my career and in my programming hobbies. It's hard for me to take the statement seriously. We're not even talking about calculus here...

I surprised a fellow developer in the Dark Mists MUD by converting a table into a simple linear formula ("What should Y be when X is at this point in the range?"); this is grade school geometry, right? I just needed the formula for a line!
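For what it's worth, the table-to-formula trick is just the two-point form of a line. A minimal sketch (the endpoint values here are invented for illustration):

```python
# Recover y = m*x + c from two endpoints of a lookup table.
def line_through(x0, y0, x1, y1):
    m = (y1 - y0) / (x1 - x0)   # slope: rise over run
    c = y0 - m * x0             # intercept, from either endpoint
    return lambda x: m * x + c

# Hypothetical table endpoints (1, 10) and (5, 50).
y = line_through(1, 10, 5, 50)
```

Any interior table entry then falls out of `y(x)` instead of a lookup.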

In my regular work, it is surprising how many times people do not understand the need for variance or standard deviation when they are looking for quality in averages of their data sets.
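A two-line illustration of why averages alone mislead (the numbers are made up):

```python
import statistics

# Two datasets with the same mean but wildly different consistency.
steady = [50, 50, 50, 50]
swings = [0, 100, 0, 100]

same_mean = statistics.mean(steady) == statistics.mean(swings)  # both 50
spread_steady = statistics.pstdev(steady)  # 0.0
spread_swings = statistics.pstdev(swings)  # 50.0
```

The average says the two processes are identical; the standard deviation is what reveals one of them is swinging wildly.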

So web programming? If you have to put up charts of large data sets, things will get hairy without at least knowing how to interpolate the data points. A slow website may be a result of brute-force simplistic approaches in the code. Some of the packages we use (such as jQuery) require some good math knowledge to implement (example: drag-and-drop). We certainly cannot generalize that in web programming, it is acceptable to not know math.

You can get away with not using math in our industry if you leave it to others to do it for you.

Spot on. You may not need math if you are working on very very standard boring things, or on some system level stuff. However, the moment you need to work with real data and do cool things with it, you are often left SOL without the ability to figure out the math.

Many people grab a scientific toolkit (SciPy and the like), but understanding what a Gaussian filter is, versus merely calling a function for it, does an immeasurable amount of good any time you need to decide what your particular dataset needs.
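To make that concrete, here is roughly what a 1-D Gaussian filter does underneath the library call. This is a hand-rolled sketch, not SciPy's implementation:

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete Gaussian weights, normalised to sum to 1."""
    w = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    total = sum(w)
    return [x / total for x in w]

def gaussian_filter_1d(signal, sigma=1.0, radius=2):
    """Convolve `signal` with the kernel, clamping indices at the edges."""
    k = gaussian_kernel(sigma, radius)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, wj in enumerate(k):
            idx = min(max(i + j - radius, 0), len(signal) - 1)
            acc += wj * signal[idx]
        out.append(acc)
    return out
```

Knowing this much lets you answer questions a black-box call can't: how `sigma` trades noise suppression against smearing, and why the kernel must be normalised so a flat signal passes through unchanged.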

I have been a programmer since I was a child, and I was also good at math; I won competitions, etc. But interestingly I was only interested in, and good at, discrete math. I loved mathematical logic, set theory, number theory, graph theory, algorithms, solving tricky logical problems. But I was not good at calculus. I did not understand why complex function theory is relevant and why I had to learn so much calculus. I now know some calculus, of course, and I no longer hate it. But I still think that some parts of math are not very crucial to a programmer, while other parts can help the programmer's abstract thinking. So I would change the kind of math which is taught to the average programmer.

The other thing is taste. If you are quite good at math you will like those jobs where math is involved. Unfortunately this can hurt if you only have some mundane programming job. I certainly cannot use my theoretical knowledge at my job; I can only use it in side projects. I know a mathematician with orders of magnitude more math knowledge than me: his taste is so abstract that even tasks which I find exciting are boring to him. He is excited only by theoretical math research. He is a research mathematician, he doesn't create any programs at all, and his research is not really related to practical things.

I know programmers who have less math knowledge than me but are more passionate about their day job than me, and perhaps they are even more productive for this reason at their day job (which is the same as mine). Passion is more important than knowledge.

So if you want to be a good and happy web developer I think there IS something like too much math knowledge.

Interesting. I'm also very discrete-biased in my math background, perhaps because it lines up neatly with problems in digital computing. I never did well with calculus, but interestingly enough, my interest is more awakened now that I'm writing code for a software synthesizer, which has to approximate continuous functions all the time (as sample buffers), and deal with the various distortions from said approximation that are taken care of "for free" in an analog environment.
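A minimal sketch of that continuous-to-discrete step (the rate and frequencies are just example values):

```python
import math

RATE = 44_100  # samples per second, a common audio rate

def sine_buffer(freq_hz, n, rate=RATE):
    """Approximate the continuous sin(2*pi*f*t) by sampling at discrete times."""
    return [math.sin(2 * math.pi * freq_hz * i / rate) for i in range(n)]

buf = sine_buffer(440.0, 1024)

# One of the "distortions from said approximation": any tone above rate/2
# (the Nyquist limit) aliases. A tone at rate + 440 Hz produces the same
# sample values as the 440 Hz one -- they are indistinguishable after sampling.
alias = sine_buffer(RATE + 440.0, 1024)
```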

This is framed as a letter sent back through time to an earlier self.

I don't know what kind of programming the author does, but if I were sending such a letter to myself, I would absolutely not say 'It's OK to suck at math'.

I'd be much more likely to tell myself: "Don't worry so much about learning about hardware specifics - spend lots of time on abstract math, it'll come in handy all sorts of places later".

He offers a clue: "[...] I was surprised to discover how little math is involved in day-to-day programming, especially for a web developer."

Exactly, but if he doesn't know much math, and he does suck at math, and he doesn't think he uses it much in his day-to-day programming, he will think it's pretty useless. And he'd be wrong.

As commented in another reply:

    ... Never used more than a trivial amount of the math
    content, but the math processes of thought have been
    invaluable.
In spades.

How many people are crap at something, realise they can get along without it, and thereby conclude that it's useless? Most. It's like being incompetent and unaware of it. It's an unknown unknown.

"How many people are crap at something, realise they can get along without it, and thereby conclude that it's useless? Most. It's like being incompetent and unaware of it. It's an unknown unknown."

The reverse is also true. If you know a lot of math, you don't really know how you might have gotten along as a programmer without it.

You might know that you would have struggled at the mathy programming you have done, but you wouldn't necessarily know if you would have excelled at other kinds of programming despite a lack of math depth.

I was pretty good at math in school, but I don't like math. Therefore I haven't focused on it.

I think your assumption is that people think like: I like programming. I'm bad at math. Math doesn't help my programming anyways.

Instead I think people think like: I like programming. I'm bad at math, so I don't like it. I'm going to look at other fields to improve my programming.

I think the problem, and the author hints at it, is that people tend to over-glorify the importance of math at the expense of drowning out other fields, like prose.

In some fields like web programming, I would say that yes, my time is better spent learning how to write understandable code instead of how to solve complex equations. As a result, I probably won't ever program video games, but that's OK: games take a lot of math, and I wouldn't enjoy that anyway. It's not that I'm lazy and just don't want to get better at programming. I just want to get better at programming in the ways that I enjoy the most.

With infinite time, I would certainly do both.

And here we are again - same misconception:

    I would say that yes, my time is better spent learning
    how to write understandable code instead of how to
    solve complex equations.
Math isn't about solving complex equations. It's a common misconception, totally wrong, but very understandable for those whose experience of math is limited to calculus or pre-calculus.

I don't solve complex equations, and yet I use math every day.

The problem I see is that people who might be really really good at advanced math are often completely turned off by high school math and the teaching thereof. Hence they never get the chance to see the really fun stuff that turns out to be useful in unexpected ways. Even more, I'd guess that the really good programmers who hate math are in that group.

I would so like to be able to test that hypothesis. There's a Ph.D. waiting in it.

This response is frustrating because I feel like you didn't read my comment, but instead homed in on a single phrase, "complex equations". Even in the context of that sentence it's a moot point. If you'd like, replace "solve complex equations" with what you consider to be the best, most useful, most fun part of math, and it does not change my point. I'm sorry I chose an example that seems to have offended you, but I do understand that there's more to math than solving complex equations.

That said, to re-summarize my point tersely: If someone doesn't like a particular field, I still feel that they can be successful programmers by focusing on fields that they do like which are also useful.

I did enough math at college that complex data structures and algorithms were not at all scary. Recursion just reminded me of the infinite descending spiral of the ideals in Noetherian rings :) And working through the proofs of the equivalence of a dozen different versions of the axiom of choice helped me learn to reason about my programs. Never used more than a trivial amount of the math content, but the math processes of thought have been invaluable.

Recursion has always been my favorite example of how learning math helps understand programming. If you've ever proven something by induction, you should have no problem understanding recursion.
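The correspondence is direct: the base case of a recursive function is the P(0) step of an induction proof, and the recursive call plays the role of the inductive hypothesis. A minimal example:

```python
def factorial(n):
    if n == 0:
        return 1                    # base case: 0! = 1, like proving P(0)
    return n * factorial(n - 1)     # inductive step: assume (n-1)! is
                                    # computed correctly, then n! = n * (n-1)!
```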

Honestly, I think most people who "prove" things using induction don't really understand what they're doing. Then again, most people don't really understand recursion either.

This sounds more like a case of people not understanding what proving by induction is (and as a consequence, not doing so) rather than them proving by induction and not understanding what they are doing.

Honestly, I think most people who "prove" things using induction don't really understand what they're doing

Can you explain what you mean by this?


In my experience, most people who go through math classes at a University, not to mention classes in High School, are mostly "blindly" following a template of how to do induction. They don't really understand the principle behind it, why it's a valid proof technique, etc. Put another way, if I used induction incorrectly, I could probably still trick them into thinking it's a valid proof because it looks like induction.

The same goes for a lot of things in Mathematics. The same also goes for recursion - most people don't really understand recursion (unless it's used at a "simple" level, like in Tree Recursion).

> if I used induction incorrectly, I could probably still trick them into thinking it's a valid proof because it looks like induction.

You mean like this one?


My college calculus professor used this example when she was teaching induction.

Beautiful. Thank you.

That's a perfect demonstration of the difference between epistemic induction (unprovable) and mathematical induction (provable).

Er, actually it's not. It is mathematical induction, but with a broken case.
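For readers without the link: a classic specimen of induction with a broken case (offered here as a hypothetical illustration, not necessarily the one linked above) is the "all horses are the same colour" argument:

```latex
\textbf{Claim.} Any set of $n$ horses contains horses of only one colour.

\textbf{Base case.} $n = 1$: a single horse trivially matches itself.

\textbf{Step.} Given $n+1$ horses, remove one: the remaining $n$ share a
colour by hypothesis. Remove a different one: those $n$ also share a colour.
The two subsets overlap, so all $n+1$ horses share a colour.

\textbf{Flaw.} At $n+1 = 2$ the two subsets are disjoint, so the step fails
for exactly one value of $n$ --- and the whole chain of implications
collapses there.
```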

> I've never used the stuff I did in my PhD, but time and time again, continuously (and I use the word correctly, I don't mean continually) I've used the stuff I learned on the way that enabled me to do my PhD.

One of the best software engineers I know has a law degree and says exactly the same thing. So I suspect that the same is likely to be true for anybody else who's gotten a PhD or other advanced degree.

But law, maths, computer science, and weirdly, classics all appeal to the same minds. Are there examples of people with English Literature or History degrees who say the same?

I would have thought they would be more useful for the sorts of things that programmers are bad at, like dealing with, or in, lies and half-truths. And not be so good for the 'long chains of precise reasoning' type problems.

Of course, you might equally say that being able to handle the political side of life 'makes you a better programmer', but I don't think you'd mean it in the same way.

Ideally you'd want both.

there's a big difference between "appeal to the same kind of mind" and "you should study this subject to become a better programmer"

and yes, I do mean that being able to handle the social side of life (including politics) makes you a better programmer in the same way.

agreed that you want both.

For some odd reason, whenever you ask whether math is important to programming, there is a strong correlation between the depth of a programmer's mathematical training and his belief in its importance.

There can be many explanations for this, but the simplest is that when you have a tool in your tool chest, you tend to use it, and when you use a tool, you see opportunities to use it everywhere. Whereas if you don't have a tool, you don't see the opportunities it affords, so you don't know what you're missing.

Even if my conjecture is true, this doesn't mean that mathematics is more important than some other experience a programmer might bring to their career, so I am not suggesting that the author is wrong when suggesting it isn't important. Perhaps it's useful but less important than some other skills.

I'm not a mathematician, so I don't know, it's all blub to me.

There also is a huge difference between the two groups with respect to what they think math actually is. Many lay persons do not know that logic and discrete mathematics are math.

The moment you reason about the control flow in your program, or even when you figure out that a for i=1 to 10 loop will terminate, you are doing math.
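Even the for-loop claim has a proof shape: exhibit a natural-number measure that strictly decreases on every pass. A sketch of that informal argument, executed as assertions:

```python
def run_loop():
    """The measure 10 - i is a non-negative integer that strictly decreases
    each iteration, so the loop must terminate."""
    i, iterations = 1, 0
    while i <= 10:
        measure_before = 10 - i
        i += 1
        iterations += 1
        assert 10 - i < measure_before  # the measure really went down
    return iterations
```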

If the problems aren't hard, the math isn't hard, and you can even get away with some trial and error math (aka debugging). However, for harder problems such as a file system's source code, an encryption library, or a multi-threaded program, tweaking the code until it 'seems to be robust' is not the way to go. For those, you need good maths skills.

I'll argue this.

My grandfather worked on AT&T System UNIX with what was basically an 8th grade education.

If you read "Secrets of Consulting" there are a few rules that my grandfather laid out; one of them is "If you're doing math above elementary level, you're doing it wrong".

I've done many very powerful things with that rule.

It's perhaps true that if you're doing arithmetic and calculations above the elementary level it's a sign that there's something wrong. But if you observe and prove that two operations commute and that doing them the other way round allows various advantages such as check-pointing or extraction of common sub-expression computation, it's slightly less clear.

An anecdote. I once noticed in a web programming context that someone was doing the same calculations over and over. The way the code was organised made it necessary, results couldn't be cached, nor could they be precomputed.

I showed that a sub-class of the operations were commutative (that took several pages, and then a 2 day meeting with the chief designer) which allowed some of the loop/function-call orderings to be reversed. Compute time went down by a factor of 100, and scalability was achieved.

The problem reminded me of some stuff I'd done in ring theory classes, and had something in common with the idea of groups acting on a topological space. The math wasn't actually directly useful - it just brought it all into focus and gave me a way to think.
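The shape of that optimisation, reduced to a toy (the operations and numbers are invented; the real case took several pages of proof):

```python
# Two linear maps commute, so their order may be swapped -- which is what
# lets a hot loop be reorganised, or a composite factor be precomputed.
def tax(p):    return p * 1.08   # hypothetical per-item operation
def to_eur(p): return p * 0.92   # hypothetical currency conversion

prices = [10.0, 20.0, 30.0]
one_way   = [to_eur(tax(p)) for p in prices]
other_way = [tax(to_eur(p)) for p in prices]

# Because the maps commute, both orders agree, so the composition can be
# collapsed into a single cached factor applied once per item.
factor = 1.08 * 0.92
cached = [p * factor for p in prices]
```

Proving the commutativity is the math; the factor-of-100 speedup is the engineering payoff.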

Basic math is all you need to program, but many times the cool things you can do with programming will need strong math. For example, just working with mapping data, I need to be able to think pretty extensively about functions for changing coordinate systems, figuring out which tiles I need for a specific zoom level, etc. Of course the actual math itself is basic trig; the abstract concepts are much more difficult, and I'd argue those are math-related.
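For instance, the zoom-level tile lookup is usually the standard Web Mercator ("slippy map") formula. A sketch, assuming that tiling scheme:

```python
import math

def tile_for(lat_deg, lon_deg, zoom):
    """Tile indices (x, y) containing a point, on the 2**zoom x 2**zoom grid."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi)
            / 2.0 * n)
    return x, y
```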

Extending the mapping analogy, I built a 3D terrain system with a simple mesh overlay, and drew a 3D route inside of the system. I had to map the route onto the terrain, and to do that I needed to understand a good amount of vector manipulation (cross products amongst others). Maybe this is 8th grade math in Russia, but it certainly isn't taught until College here in the US.
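The cross product in question, for anyone who stopped math before vectors: from two edge vectors of a terrain triangle it yields the surface normal needed to drape a route onto the mesh. A minimal version:

```python
def cross(u, v):
    """Right-handed cross product of two 3-vectors, given as tuples."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

# Triangle edges along +x and +y give a normal pointing up the z axis.
normal = cross((1, 0, 0), (0, 1, 0))
```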

So, yes writing a program itself doesn't require math, but many of the cool domains require math.

Just because someone has no formal training in math, and doesn't recognize that what they're doing could be called math, (or strategically refuses to apply the word 'math' to what they do, for sound pedagogical reasons) doesn't mean that they aren't doing math.

The cult of the Ph.D. is strong, but you don't actually need some kind of license to think logically about abstract concepts, any more than a musician needs a formal music education or a cat needs a formal gymnastics education.

Your grandfather is Gerald Weinberg? His books have been a big influence on me and I recommend them to lots of people!

Indeed. He has had a massive influence on the profession and on me personally. Secrets of Consulting is a great book and something every software person should read. It's not just about consulting. It's much more useful than that, and so entertaining that it's surprising how profound it also is.

However, there are a couple of problems with indrora's comment. First, Jerry Weinberg has a Ph.D.; that's a little beyond 8th grade. I also remember hearing him talk about majoring in physics as an undergrad because computer science didn't exist yet. And I've never heard of him working on Unix. He was an IBM guy after all.

More importantly, I doubt that Weinberg ever objected to using advanced math to solve technical problems, including programming problems, when appropriate. What he repeatedly (and rightly) has objected to many times is complex models and metrics that purport to quantify human behavior when in reality they're not measuring anything significant. Weinberg regards this as a form of escapism (my word not his), a way of hiding behind technicality to avoid facing human situations. He wrote a whole book, in fact, called First Order Measurement arguing that simple measurements -- including just plain personal observation -- are the best way to monitor complex systems like a software project.

Weinberg was the great humanizer of the software industry. You can see his influence hugely on Agile (the good parts, not the slick parts, which I heard him denounce as early as 2004). I think it took enormous courage for him to talk about Virginia Satir in software circles - or would have, if software people had any idea what Satir did.

I probably knew indrora's grandfather, for though I worked for IBM, I consulted with Bell Labs and other parts of ATT for many years, in IBM and as an independent. In fact, I was twice the Bell Labs Distinguished Lecturer, touring all the labs--helping out with C and Unix.

As for math, it was one of my majors as an undergraduate, and I am a member of Pi Mu Epsilon, and have published a number of mathematical papers, though not recently.

But, I'm a hacker at heart, always have been, always will be. I use math only when it's helpful to solve problems. That's true of any tool I use. Why? Because the higher the math or esoteric the tool, the more you diminish the number of other people who can understand what you're doing.

Indeed, that's why I'm now writing fiction that helps people grasp a few mathematical and software concepts. For instance, my mystery novel, "Freshman Murders," has a team of math geniuses as the "detective." And I still write techie stuff, when I have something to say.

- Gerald M. Weinberg <http://www.geraldmweinberg.com>

In my experience as someone with a lot of math in my toolkit, many, perhaps most, of the people I work with end up wishing they had similar skills. It's not just me saying math is useful, it's the people who have seen it used.

So perhaps:

there is a strong correlation between the programmer's exposure to mathematics and his belief in its importance.


I agree with that, but it's hard to believe something is important if you have no concept of it. (People know that "math" exists, but until they've studied specific areas, they probably will have never been exposed to those concepts or ways of thinking.)

I'm still a full-time student. What math should I learn?

What year? What experience do you have? What math have you done? What are you interested in?

Number theory.

+ What is modulo arithmetic?

+ If p is prime and 1 <= a < p, why is a^(p-1)-1 a multiple of p?

+ If a prime is congruent to 3 mod 4, why is it never the sum of two squares?

+ If a prime is congruent to 1 mod 4, why is it always the sum of two squares?


+ Find examples of why path-wise connected is stronger than "not disconnected".

+ Show that in 2D if you consider parallel lines to converge at infinity, and that they all converge to the same point, then what you have is a 2-sphere.

+ Show that in 2D if you consider parallel lines to converge at infinity, and that non-parallel lines converge to different points at infinity, then what you have is a Moebius strip with its edge glued to a disk. (projective plane, or RP2)


+ Understand why the sum 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + ... diverges (and what that means)

+ Understand why the sum 1/2 + 1/4 + 1/8 + 1/16 + ... converges, what that means, and what it converges to.

++ Note: most undergraduate mathematicians get this wrong.

There's a bunch of stuff, and this is all straight from the top of my head. It's not necessarily good advice, but they are a few things I found interesting when I was 12 or 13.
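Several of the number-theory items above are a few lines of brute force to explore. A sketch for small primes only; these checks are exhaustive searches, not proofs:

```python
def fermat_holds(p):
    """Fermat's little theorem: a**(p-1) % p == 1 for every 1 <= a < p."""
    return all(pow(a, p - 1, p) == 1 for a in range(1, p))

def sum_of_two_squares(p):
    """Is p expressible as a*a + b*b for non-negative integers a <= b?"""
    return any(a * a + b * b == p
               for a in range(p) for b in range(a, p))
```

Watching `sum_of_two_squares` say yes for 5, 13, 17 (all 1 mod 4) and no for 3, 7, 11 (all 3 mod 4) is exactly the invitation to ask why.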

Totally off topic: have you read PopCo by Scarlett Thomas? If not, you must.

Noted - on my "find to read" list.

Math--the fun kind, at least--consists entirely of thinking about abstract logical objects. A lot of programming is also this. They usually don't line up, except in cases like Haskell's type system, but the brainpower you develop in one is useful for the other.

True. But when people write "programmers should know math" articles, what they're almost always actually saying is "I took linear algebra in college, therefore all programmers should have taken linear algebra in college", not "programmers should be able to engage in abstract logical reasoning".

My own take on this is that many programmers like to believe their chosen subset of the field (in the case of "learn math" articles, often people who work in game development or scientific computing) is the only "real" programming, and that everyone who chooses another subset is deluded, stupid or otherwise inferior.

More specifically, there's so much more to math than the 'boring' number manipulation long division stuff that everyone hated in 4th grade.

Any time you combine previously separate ifs and use De Morgan's law, that's basic mathematical logic at work.
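The refactoring in question, spelled out (the predicate names are invented):

```python
# Before: two nested guards.
def should_run_nested(logged_in, has_quota):
    if not logged_in:
        return False
    if not has_quota:
        return False
    return True

# After: one guard, combined via De Morgan's law:
#   (not a) or (not b)  ==  not (a and b)
def should_run_combined(logged_in, has_quota):
    if not (logged_in and has_quota):
        return False
    return True
```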

I agree about this correlation in general -- although personally I'm a counterexample: a degree in applied math, and a strong belief that it's not particularly important to programming.

True, in programming you need to think rigorously and abstractly, and the same's true in math. However to be a great programmer you also need systems thinking, the engineering skills to use components that don't always work as advertised, understanding of social dynamics [both how software is constructed and how it gets used], etc. etc. Math doesn't give you any of these.

So it seems to me that there are other ways to learn rigor and abstraction that are at least as good a preparation for programming as math: different branches of science, law, operations research, user experience, etc. etc.

There are parts of linguistics that also apply directly to programming, particularly when working with compilers and parsing.

Excellent point

Please see Dennis M. Ritchie's short bio: http://cm.bell-labs.com/cm/cs/who/dmr/bigbio1st.html

He has deep mathematical talent.

Ken Thompson also has deep analytical abilities as demonstrated by his contributions to computer chess: https://chessprogramming.wikispaces.com/Ken+Thompson

Guido van Rossum holds a master's degree in Math and CS.

I'd say that it is common sense that mathematical talent is correlated with coding talent, provided that the person receives substantial training in coding. Generally speaking, everyone is a terrible coder for the first 2 years; it doesn't matter if you are a math genius, you still need to learn how to crank out good code.

The real question is: is there extra correlation between math talent and programming talent beyond both being correlated with g, the general intelligence factor.

See: http://en.wikipedia.org/wiki/General_intelligence_factor

Just another possible explanation - cognitive dissonance. You work hard to "gain" a tool, you're going to think it was worth it.

The importance of math in programming is directly related to the domain you are working in. The strong logic background math gives you is important for anyone who programs, but advanced concepts just don't matter that much to most programmers.

<fx: sighs and starts rant and possibly excessive claims unsupported by objective evidence>

It's not the math, it's not the advanced concepts. It's the ability to think in abstract terms, about abstract objects, rigorously sometimes, intuitively sometimes.

It's not the direct benefits. Knowing topology or analysis is unlikely ever to help you. Being able to do topology and analysis probably will, in subtle, unexpected and most often unnoticed ways.

Every single advanced athlete does progressive weight training. Why? Certainly not because they want to be able to lift weights. No, it's because of the side-effects. Similarly, doing advanced math trains your mind in ways that are relevant to programming, and unobtainable elsewhere.

I won't convince people who are already convinced that math is useless, and most people who haven't done any advanced math will see that they don't appear to use it, and thereby believe that it's useless.

It's all blub. Obviously.

I agree with you generally, but how much mathematical thinking will help you still depends extremely on the kind of job you are doing.

In my experience the single thing in which mathematical thinking helps you most is going meta: when you reason about the program as if the program were data. It is not a coincidence that smart people with mathematical talent want to go meta: they love to write frameworks, new programming languages, compilers, and (graphical, database) engines.

But most of the programmers have to write good old application logic (so called business logic) using languages, frameworks, engines, technologies made by other people.

I go further: most people have to use bad programming languages, programming languages which are not their choice. Most people have to maintain bad quality code created by other people.

For these tasks mathematical thinking is really secondary. You have to have other skills: very good memory, finding your way in a complex mess, etc. If you are mathematically 'smart', you will find patterns, you will be bored, you will go meta, and they will need to find another guy for the mundane task.

Yes, math is one tool in a tool chest.

Tool ... good. Tool with two thousand years of history behind it, very good!

However, I haven't noticed a similar degree of correlation between programming and Egyptian history... Knuth was big on literate programming, so maybe English literature could be a good tool too.

The inverse is actually true... the deeper your mathematical training is, the more you realize how unimportant it is in most programming. The people who espouse otherwise either haven't studied enough math, or more usually, haven't done that much programming.

> The inverse is actually true..


> The people who espouse otherwise either haven't studied enough math, or more usually, haven't done that much programming.

Dijkstra? Hoare? Knuth?

Well, my hypothesis is that whatever one's assessment of the value of mathematics to programming, one is likely to find that the assessment correlates with the strength of one's own mathematical background ;)

Or they're doing programming in a domain that involves math a fair amount of the time.

(Like robots.)

While I agree with the general idea behind the article, there are lots of specific details I disagree with. For example:

    It is absolutely crucial to pick a first language with a deep,
    expressive vocabulary for this reason, and all programming
    languages are not created equal.

Not really. I started with BASIC, on ZX Spectrum 48, and I actually used it to write a real-world program which was used by real users; if you're curious, it was a pop quiz for biology classes. Yes, it's probably the worst code I've ever seen or written. Yes, it had all sorts of problems and bad practices you could readily think of and then some that would require you to be very imaginative. No, it didn't cripple my development as a programmer.

The problem is not in which language you pick as your first language. The problems start when you start believing that's all there is. Some of the most important things I've learned were things I encountered when learning a new, more powerful programming language, precisely because they were impossible (or extremely difficult, or too messy) to express using the old language. They challenged my understanding of programming and helped me develop even further as a programmer.

Strong disagreement. My first language was also Spectrum BASIC (I'm not dissing it, by the way, a mighty engineering achievement and my life would have been much poorer without it), but it's not a language I'd expose a child to today, because you need to deal with all sorts of accidental complexity to see some of the deep ideas. On the other hand, the Spectrum environment was almost perfect considering the hardware available.

In the same way, if you want to learn OO, learn Java rather than C++, where the same concepts seem very much more complex. And that's probably true even if you want to program in C++ eventually!

Probably better to use python than either, because that's got the least accidental complexity of anything and can express a wide range of ideas, as far as I can tell.

My favourite language at the moment is clojure, but actually some of its nice features like pervasive immutability would make things hard for a beginner. Better to learn how to do everything and then decide which powers you want to not use.

So I guess my ideal beginner's computer today would be a modern ZX Spectrum running a version of python with a lisp syntax (for macros). pg's original vision for arc sounds like a good spec.

And of course an essential requirement would be a printed user manual as friendly and straightforward as the ZX BASIC manual or the TRS80 one.

"As a linguistics major, you're no stranger to the idea that a person is only capable of having thoughts and ideas that can be expressed in their language."

This is, IMO, so wrong on many levels. My native tongue is Spanish, but I'm also fluent in English, French and German; however I find myself thinking in all of them with the exception of German (I guess I'm not as fluent as I thought I was in it ;)). In the same way, I started with C when I was around 12 years old, and I've had no problem thinking in C, Java, Ruby or Python (which are the languages I use mainly). I think as long as you understand the language itself you won't have to think back to a different one to process your thoughts.

And don't get me started on the math. Math is extremely important in programming. I have a friend who shares the thought that math is not important, and his code is hideous (OK, he uses ONLY PHP, as in his opinion all other languages are complicated and inflexible). Math can help you accomplish a plethora of things more easily, faster, and more productively.

The idea that language limits what you can think has been obsoleted for quite some time, see http://www.nytimes.com/2010/08/29/magazine/29language-t.html for example.

"Languages differ not in what you can say, but in what you must say" -- Roman Jakobson

Many linguists claim that the Sapir-Whorf hypothesis (SWH) has been soundly disproved. They have bags of evidence to show that the usual, strong interpretation is false.

And yet every programmer with proficiency in several languages knows that some problems are easier solved in Python, others in C++, and others in AWK. Further, every polyglot I know will shift language according to nuances in the thoughts they're trying to express, because some languages have more/better/finer distinctions about different things.

In Spanish the words for "to wait" and "to hope" are the same. In some contexts the English words "even" and "same" are both translated into French as "même" (as in "même chose" - "same thing"). Perhaps these conflations of words show a conflation of concepts. Yes, the differences in concepts exist in the minds of the speakers, but the language to express them precisely isn't always there.

There's a Spanish joke: "¿Cómo se llama a un ascensor en Inglés?" - "Con su dedo." It doesn't translate into English - you need to read the Spanish. Again, vocabulary conflation.

SWH has been debated here on HN before, many times, and no doubt it will be again, with the research-based linguists saying there's no difference in the language chosen, and the polyglots snorting derisively.

I've repeated this comment in a new submission since I think the subject deserves a discussion of its own:


I agree that some problems are more easily solved in different languages, the same way every polyglot shifts languages depending on what he is trying to convey; nevertheless I think this is actually a by-product of context, not necessarily of a need to use a particular language. I generally think of a broad idea and probably pseudocode (which can have any syntax I want or create on the fly), which then morphs into a programming language either as I express the complex idea or as I add detail to the scope of the solution.

Also, the joke does have a translation to English: "How do you call an elevator in Spanish?" "With your finger." Expressing yourself is a game of context and expressing an idea is not necessarily a verbatim translation from one language to another.

On another note, on the "to wait" and "to hope" comment, you are right, depending on the mannerisms of the people around you. I could very well say "tengo la esperanza de" is as much a translation as "espero que". Again, proof that context and semantics are more important than the language choices themselves.

As a translation to the joke:

"How do you call an elevator in English?" "With your finger."

But until you have the punchline the correct "free" translation of the question is "What do you call an elevator in English?". It's only after you've heard the punchline that you have to go back and re-interpret the question.

Which is why context-free translation is crap. The question itself is translatable, as is the answer itself. But the composition of the two translations is not the translation of the whole. This is equivalent to translating a sentence by composing the translations of each of its words without taking into account their role in the sentence. "How do you call X in L?" is a perfectly valid and understandable English sentence, although less idiomatic than "What do you call X in L?" or "How is X called in L?". The context informs the translation.

That is a fascinating article, but isn't its point rather that languages probably do have an influence on how you think?

And no-one is claiming that you can't do closures in C, but I am claiming that C-only programmers don't usually think in those terms, because the language makes them more complicated than they need to be.

Actually I think they have more an effect on how you express yourself. If closures are the solution to my problem I'll probably think about it in language agnostic way first, and then express it in code in whatever language I prefer or need to write in.

> My native tongue is Spanish, but I'm also fluent in English, French and German; however I find myself thinking in all of them

I think the point is that you can only readily think in terms of what you know. Now, if you're fluent in English, French, German, and Spanish, it's almost a given that you'll be able to think in terms of any of the prominent concepts in those languages. I think the author was really saying that, if you learn a language with a broad range of concepts first, it can only help you as you progress, because you'll be able to actively use those concepts.

For example, I know a number of imperative languages, but I've only recently experimented with functional languages like Lisp. The conceptual shift is fascinating, and something I wouldn't have quite grasped if I constrained myself to imperative languages.

Actually the author kind of made the distinction that he believes your first language should be such and such, so it becomes the 'base thinking platform' (so to speak) in which a programmer develops code in his mind. I agree with you though: you can mostly think (in real life and in programming) in terms of languages you're already fluent in; however, being introduced to a new concept can give way to new ways of thinking that can probably be expressed in every language, albeit poorly in some.

Agreed. It makes it sound like we cannot come up with new ideas or new words, which we all know that we can. My first language was a form of BASIC. Then I mostly programmed on my calculator in another BASIC-derived language not far removed from assembly (GOTO was the only flow control it offered). When I moved to C, suddenly the patterns of if/then and do/while/for loops immediately made sense: I had been using them the entire time without realizing it.

And math? I can't help but use it. When people ask for estimates, I always do some math based on known quantities rather than just guessing. I keep track of how far off my estimates were to apply later corrections and reason backwards to figure out which of my assumptions were wrong (leading to better estimates in the future). I've done merge sort by hand on stacks of paper (and found the speedup when you have a big gap between two merged piles and can move an entire chunk to the sorted pile).
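The paper-pile speedup described above is essentially a galloping merge: when every element in a run of one pile precedes the head of the other, you move the whole run in one step. Here's a minimal Python sketch of that idea (illustrative only, not the commenter's exact by-hand procedure):

```python
import bisect

def merge(a, b):
    """Merge two sorted lists, stably.

    When a run of elements in one list all precede the other list's
    head, the whole run is moved in one step - the paper-pile trick
    of shifting an entire chunk at once.
    """
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            # Every element of a[i:] up to the first one exceeding
            # b[j] can be moved to the output at once.
            k = bisect.bisect_right(a, b[j], i)
            out.extend(a[i:k])
            i = k
        else:
            k = bisect.bisect_left(b, a[i], j)
            out.extend(b[j:k])
            j = k
    return out + a[i:] + b[j:]
```

The same optimization shows up in Timsort's galloping mode: the comparison count drops sharply when the inputs have long disjoint runs.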

And that's not even counting my job, where I end up doing lots of geometry. Knowing the formulas for mutually tangent circles or the formula to find the two intersection points of two circles is surprisingly useful (the case where you have zero or infinity intersection points just doesn't come up and even having one point multiplicity two is rare).
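The two-circle intersection mentioned above is compact enough to sketch; here is a hedged Python version (a hypothetical `circle_intersections` helper, using the standard radical-line construction, not anyone's production code):

```python
import math

def circle_intersections(x0, y0, r0, x1, y1, r1):
    """Return the intersection points of two circles, or [] if none.

    A tangency comes back as one point with multiplicity two,
    matching the rare single-point case mentioned in the comment.
    """
    d = math.hypot(x1 - x0, y1 - y0)
    # Separate, contained, or concentric circles: no proper intersection.
    if d > r0 + r1 or d < abs(r0 - r1) or d == 0:
        return []
    # Distance from circle 0's center to the radical line (the chord).
    a = (r0**2 - r1**2 + d**2) / (2 * d)
    h = math.sqrt(max(r0**2 - a**2, 0.0))  # half-chord length
    # Midpoint of the chord joining the two intersection points.
    mx = x0 + a * (x1 - x0) / d
    my = y0 + a * (y1 - y0) / d
    # Offset perpendicular to the line of centers, in both directions.
    return [(mx + h * (y1 - y0) / d, my - h * (x1 - x0) / d),
            (mx - h * (y1 - y0) / d, my + h * (x1 - x0) / d)]
```

For example, unit-radius-5 circles centered at (0,0) and (6,0) intersect at (3, 4) and (3, -4), the classic 3-4-5 triangle.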

I wish I knew more math. I know more than most people and I even have a degree in it, but there's so much great stuff like linear algebra, statistics, graph theory, and game theory that I never feel like I know enough about any of them.

Is there really no piece of Spanish that cannot be translated into English without changing the meaning? And no English poem that loses some of its power when translated into Spanish?

"As a linguistics major, you're no stranger to the idea that a person is only capable of having thoughts and ideas that can be expressed in their language."

I am sorry, but this is bullshit. I am glad I no longer visit this crappy forum, filled with linguists turned Ruby devs masquerading as Software Engineers and Computer Scientists.

If you have an idea and you can make it run on a machine, then it means you can convert it into machine code, which is what everything finally gets converted into. Also, if this guy knew math: it is called the Turing principle, and as long as the language is Turing complete it makes no difference.

Last time this got discussed:


someone in this comment:



    "especially for a web developer" is the important part.
    As a fellow web developer, I agree. I'd certainly like
    to get into the mathier side of programming someday, but
    it's rarely necessary for my day-to-day operations.
That misses the point entirely. How do you know it would be useful if you don't have any? And do you only, ever, and always want to be a web programmer?

Why limit your options?

"Why limit your options?"

I suspect that focusing on mathy programming jobs would tend to limit your job options, at least in most regions of the US.

People seem regularly to assume that getting good at math means you therefore need to do mathy programming jobs. That's exactly what almost every comment here is trying to refute.

It's not doing math that limits your options. I've never seen a programming job that wouldn't benefit from the programmers knowing more math. Every programming job I've seen has benefited from more math. And I've seen a lot of programming jobs, including ATC, weather forecasting, financial, CRUD, web scraping, automotive and more. Even gathering data from a database and presenting it in a template has occasionally benefited from a bit of obscure math.

It's just that people who think math is arithmetic, formulas, equations, calculations and calculus don't realize what they're missing, and just how widely applicable it is.

Furthermore, the first language you become comfortable with will influence the way you learn other languages in the future.

This is true until you learn enough, then it's not true any more.

I used to work with a guy who learned FoxPro (an old database program). Every problem we had on the team, he would look around and say, "Hey, you know what? We could solve this very easily if we only used FoxPro"

Everybody is like that. We just don't see it. So the author is correct as far as he takes it.

The difference is, once you've been exposed to a bunch of languages and problem-solving paradigms, you start to see patterns. And once you see the patterns, the languages aren't that important anymore, at least in the way you approach solutions. The coding itself may be more or less gnarly, but the way you go about fixing the problem is based on principles, not language constructs.

This is like when you learn a word processor. The first time you use one, you begin thinking of all word processing problems through this one lens. But once you learn 3 or 4 word processors, it's all kind of the same. Programming is just like that, only it takes about ten times longer.

I'd also note that no matter what tool or problem you have, if you keep refactoring and clarifying your code, you always end up in mostly the same spot -- some kind of DSL-ish place. That's true no matter what language you're using.

So yeah, he's right. But not really. Instead of focusing on which language you start with, focus on broadening your programming experience. Got enough language zealots in the world. No need to make more.

"Besides, computers are way better at math than you are anyway." <- This is so wrong (at least until we have strong AI). He confuses math with doing calculations.

Yes. As a mathematician colleague of mine used to say, "It is not the job of a mathematician to perform basic arithmetic. It is the job of an accountant." Usually in response to screwing up some basic arithmetic, as I recall ;)

This article came up before (original http://news.ycombinator.com/item?id=1053753 ), I'm going to quote Eliezer Yudkowsky ( http://news.ycombinator.com/item?id=1054201 ):

"Look, I'm sorry to be the one to break this to you, but if you have difficulty with any programming concept, you must not be a supergenius. You're just an ordinary genius at best. I'm sorry. Life isn't always fair.

Of course, I say this as someone who hasn't yet tried to learn Haskell. On the other hand, I know someone who competes at the national level and I never saw him have trouble with anything including Haskell, so...

The sad truth is that there are some people for whom programming comes as naturally as thinking, with code formed as easily as thoughts; and if it takes an effort to understand any aspect of programming, you have just learned that you are not one of those people. Alas."

Developing code, especially complicated algorithms and data structures, is all about proofs. Not long, formal proofs but the kind of informal proof you do in your head to convince yourself that your code will work. Studying maths will give you a hell of a lot of practice working with proofs and exposure to lots of different ways of proving things. Even if I didn't use any of the material I learned that will still be valuable.
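As a small illustration of that kind of in-your-head proof, here is a hypothetical binary search annotated with the informal invariant that justifies it (a sketch for illustration, not from the original comment):

```python
def binary_search(xs, target):
    """Return an index of target in sorted xs, or -1 if absent.

    The comments sketch the informal invariant-based argument you
    carry in your head to convince yourself the loop is correct.
    """
    lo, hi = 0, len(xs)
    while lo < hi:
        # Invariant: if target is in xs at all, it is in xs[lo:hi].
        mid = (lo + hi) // 2
        if xs[mid] < target:
            lo = mid + 1   # xs[:mid+1] are all < target: discard them
        elif xs[mid] > target:
            hi = mid       # xs[mid:] are all > target: discard them
        else:
            return mid
    return -1              # range is empty, so target is absent
```

Termination follows because `hi - lo` strictly decreases each iteration; correctness follows from the invariant plus the empty-range case. That's exactly the informal proof, written down.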

Last time this got posted there was some interesting discussion: http://news.ycombinator.com/item?id=1053753

Two things.

Programming is too broad an umbrella term for any meaningful generalizations to be made about it. It ranges from artisan-type craft work to engineering to pure mathematical theorizing. You can't tell the people doing machine learning, signal processing or physics engines that they won't need maths. That is just pure silliness. It is very context specific.

---Second thing---

This person misses the point of math. It's not about arithmetic; it's not about calculus or group theory or combinatorics, or even numbers or proofs. No, those are by-products.

The true point of maths is distilling observations of past patterns into their essence and core attributes; packaging them, and then using them to lay the foundation for thinking more complex and less intuitive thoughts. Not unlike libraries in programming. Look at the history of the development of mathematics - shoulders of giants and all that. Abstraction.

Maths is a creative endeavour, and if we had more memory, faster thinking speeds and less fallible intuitions, I dare say our maths would not be so 'advanced' and abstract. A lot of it would be built into everyday thinking. Maybe this is related to why many programmers think they don't need maths - the mathematical principles behind why what they do works, and its mathematical nature, don't necessarily need to be made apparent in order to get by. Or perhaps this is just a reflection of the youth of the field.

But I do believe though that our maths is so advanced only because we are so stupid!

The sentiment of the article is nice. Especially the part about feeling self-conscious and stupid. People need to remember that (myself included).

I'd also add that people online can often exacerbate the problem. "What, you don't know how to write an O(n) gnome sort!? Dude, you are not a programmer." It's really easy to forget that not everyone is an elite CS programmer who graduated with a 4.0. And not everyone needs to be. Learning is a process, and one we should help each other with.

That being said, I'm one of those totally wrong people who disagrees with:

    You're lucky to have stumbled across a very good first
    language, but you're likely to encounter developers who
    will tell you that this decision doesn't matter. These
    people are wrong, plain and simple.

You won't know what's good and bad about your first language until you try a few others anyway. So don't sweat it. I'd say, avoid zealotry and dogma. Program in as many languages as you can handle.

I have a CS degree; I have done my fair share of math, it's a requirement. That being said, I'm not amazing at math (I'm working at becoming better) - I'm passable for sure, and understand the concepts, just not amazing on the details.

It all comes down to the question "How do you solve a problem you don't know how to solve?" A lot of the time, hints for how to approach these problems are given in certain areas of computer science and mathematics. If you don't have the tools (computer science and math), it can make your life significantly harder for a particular subset of problems, which probably align pretty well with the problems you realize you don't know how to solve.

I'd like to add: becoming a programmer fundamentally changes your personality and who you are. Seriously.

I think this is true but I think a lot of us on HN are either too far gone or started too early to realize.

The math question just depends on the type of programming you do, and on your definition of math.

Programming is such a wide field. There are web designers who fit rectangles together all day long, and then there are the people who write Mathematica.

Not that hard. It is like other professions: you need skills and intelligence. A strong math and computer science knowledge base, plus 10 years of programming experience, should be enough for addressing most difficult problems.

>> Language does matter.

I would say "Languages do matter. All of them." Now, I certainly don't mean iterations on a single language (though experiencing that evolution first-hand can be very instructive), but insofar as Java 1.3 and C# 1.0 were nearly identical languages, they were still worth learning even at that time because experiencing the vastly different ecosystems completely cuts through all of the dogmatism that every language has.

Looking at all these comments, I see people like to argue a lot; some of them just for the sake of it. I think we should always read an article from the writer's point of view. That won't suppress all our mismatched feelings, but it will certainly reduce them to a great extent.

The language you choose to start with is important. You should start with Ruby because it is less hard (and pretty useful).

Compared to what? Your comment is like saying "A house is bigger."

What about physics? I think it also has an important role in programming, and no, it's not just for games.

Physics is extremely useful in terms of making models of reality. It requires somewhat different ways of thinking in comparison to math, but is no less useful to a programmer. Of course, if you are building things that interact with the real world, it's invaluable.
