Of course, this is the guy who said "Computer Science is no more about computers than astronomy is about telescopes." And while I believe I understand the point he was trying to make, I have much higher respect for a person who still likes to get their hands dirty (still actually spends significant time programming computers) rather than just dealing with abstract theory, algorithms, analysis, etc. Yes, I know that Dijkstra knew how to program (and did so, extensively, especially earlier in his career). But this is the guy who never owned his own computer -- even after personal computers became commonplace. That, coupled with the above comment about BASIC (and other comments I've read from him like it), makes him come across as kind of an elitist -- like programming is beneath him or something.
Perhaps my take is wrong, and it very well could be. But I will say this: if Dijkstra were still alive and well, and you put him, Guy Steele, and Don Knuth in a room and asked me to pick two of the three to spend the day with, Dijkstra would be the one left out in the cold.
No harm, Dijkstra never struck me as the kind of person that would want to spend a day in a room with anyone, either. Computer Science is about studying the computation which can be done in hardware - not just Python, not just on one chipset, not just on a von Neumann machine, but in general. "Programming" as I think you mean it is a very small subset of that.
He was obviously brilliant and very perceptive with regard to many things in the field. However I still think it is over the top that the guy didn't even own a computer.
There are many who would rather spend their careers just writing applications without ever analyzing the differences between two algorithms, much less ever studying or thinking about computation in general. Then there are those who would spend their careers strictly on theoretical endeavors, and do not wish to continue writing applications or software systems of significant value. It is my personal opinion that the best Computer Scientists are the ones that regularly do both.
Or -- and more relevant to their professional competence -- to develop a large, complex and safety-critical system, such as air traffic control; I wouldn't choose Dijkstra over Knuth or Steele for that job either.
I don't think it really matters what language is first/second/etc, just that you keep searching and finding new ways of thinking.
Of course, the danger is that it can quickly become frustrating to program in the likes of Java once you realize all the things that are simple and straightforward in, say, Scheme but require ridiculous amounts of circumlocution in Java.
EDIT: Maybe there's no such thing as a Blub language (right tool for the job, etc.). However, Blub programmers do exist: half of my co-workers and maybe even myself (to some extent).
I think that abstract symbol manipulation within a set of constraints facilitates skill transfer between math and programming. Learning to see patterns, to perform abstractions of statements for simplification, to construct something larger out of component parts bottom-up (like simple proofs), these are all useful skills in both.
Remember, he's writing a letter to his former self, so the advice applies more to his life than it does in general.
It is always people who don't understand math and its applications who say, "Oh, I don't need it for what I do" (web development, I am looking at you), and then wonder why their systems are complex, ill-performant, and inelegant.
Please learn math before you say you don't need it. Or, more specific to this article: stating that programming is complex/hard, and then following that with "you don't need math." It's only hard because you are not using math.
I have done 3D simulation, AI, robotics, embedded, and now web over the course of my career, and the only needlessly complex systems that I have run into were designed without the assistance of mathematics.
Of course if you are not competent in math you won't do well at that kind of task, and because you don't know better you won't necessarily recognize your inability for what it is.
Even to individuals with a weaker grasp of certain forms of mathematics, math helps you find elegant solutions to complex problems. This is hard to convey to someone who does not have the education in mathematics; they only realize they have been doing things the hard way after they have learned the mathematics and found that the two are interrelated, even for web development.
There would be a lot less terrible code written out there if people had a better grasp of discrete math, knew how to do big-O analysis, knew how to create and implement provably correct algorithms, knew lambda calculus, etc. -- and that's just for general programming. I would say a basic grasp of set theory and graph theory applies to almost everything too.
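To make that concrete, here is a toy sketch (names and example are my own, purely illustrative) of the kind of win a little big-O awareness buys: deduplicating a list with a linear scan per element versus a set.

```python
def dedupe_quadratic(items):
    """O(n^2): the `x not in out` membership test rescans the list every time."""
    out = []
    for x in items:
        if x not in out:  # linear scan of a growing list
            out.append(x)
    return out

def dedupe_linear(items):
    """O(n): set membership is amortized constant time."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# Identical results; wildly different scaling as the input grows.
data = list(range(5000)) * 2
assert dedupe_quadratic(data) == dedupe_linear(data)
```

Both preserve first-occurrence order; only the complexity differs, which is exactly the kind of thing a bit of discrete math trains you to notice before it shows up in production.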
And... well, I can't say that I've ever explicitly used any of it in programming. I suspect you're falling into the trap of generalizing from a field you're familiar with to all fields, and that's a generalization that doesn't hold up.
But the ongoing influence, the style of thought, the ability to visualise and the ability to abstract away from the details - these things I use all the time, every day. They have been enhanced and honed by all that math.
I use my math background implicitly all the time, and I don't know how I'd do what I do without it.
And this is perhaps the most important point about studying math. Often the greatest take-away is the abstract problem solving capability, not the material itself. I've never had to analyse the genus of a manifold in real life, but I have thought about objects moving in an 11 dimensional space with holes, because while everyone else was stuck in the detail, I was seeing things differently. It turned out that the combination of styles was critical to solving the problem.
That's why I got a degree in philosophy.
OK, not the only reason, but one of them...
Meanwhile, I think my point stands. Too many people seem to have an "OMG you don't use linear algebra every day? What kind of crap programmer are you?" attitude.
I say this as someone who periodically finds yet another way to use my math background to help me do things while my co-workers in the same job don't. Which is why I'm the one who got to do the statistics for evaluating A/B testing, eliminated multiple performance problems others had come to accept, and gets tapped to tackle complex "how do we figure this out" problems from time to time.
And yes, I've had the fun of seeing co-workers who theoretically had the same job as me argue that math wasn't useful for programming, while I was sitting there with a list of examples of how I'd recently used my math background. (But then I got promoted into a different job title and in those discussions those programmers were able to respond that of course I needed more math for reporting, but they didn't need it as web developers. Never mind that I was the guy who got asked to help when the database had trouble scaling at peak traffic of several million pages/hour...)
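For flavor, the statistics behind a basic A/B evaluation like the one mentioned above boil down to something like a textbook two-proportion z-test (this is a from-scratch sketch with made-up numbers, not anyone's actual production code):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the difference between two conversion
    rates bigger than chance would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF (math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts at 2.6% vs A's 2.0% over 10k visitors each.
z, p = two_proportion_ztest(200, 10_000, 260, 10_000)
```

With those illustrative numbers the lift is statistically significant (p below 0.05), which is the question an A/B test is actually asking; without the math background it is easy to eyeball two percentages and call a winner on noise.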
Really this seems to me to be a variant of the blub paradox. If you haven't learned to reflexively see the benefits of thinking in some new way, you won't recognize how thinking that way could help with the problems you are solving day in and day out. Conversely if you have gained those mental skills, you recognize their utility in situations that other people wouldn't dream they are applicable for.
At that point, if you haven't figured out basic algebra at least, you'll have trouble conceptualizing the data. At best, this means you won't be as efficient as you could be, at worst, you'll write buggy code.
I am, with surprising regularity, annoyed that I can't pull basic trig out of my head without looking things up -- to say nothing of signal processing and number theory.
As a linguistics major, you have no excuse for not knowing that Sapir-Whorf is utterly discredited.
The extreme forms have been rather thoroughly debunked at this point. The milder forms verge on tautologies.
Exactly. The hypothesis is clearly false in any articulation strong enough to matter.
I thought this idea (the Sapir-Whorf hypothesis) had been discredited.
edit: after I wrote this, I realised it's just a restatement of PG's Blub paradox.
I'm not a linguistics major but I do speak several human languages and have no trouble thinking in them and expressing ideas in them. The first computer languages I learned (6502 assembly, BASIC) don't enter my conscious thought when I think about the programming problems I encounter with the languages I use today.
Of course, I say this as someone who hasn't yet tried to learn Haskell. On the other hand, I know someone who competes at the national level and I never saw him have trouble with anything including Haskell, so...
The sad truth is that there are some people for whom programming comes as naturally as thinking, with code formed as easily as thoughts; and if it takes an effort to understand any aspect of programming, you have just learned that you are not one of those people. Alas.
This applies to pretty much any field - engineering, physics, chemistry, even music!
My background is in electrical engineering and it's quite daunting to realize how little I REALLY understand when it comes to the fundamentals... Sure an engineer can make things "go" but they're standing on the shoulders of giants.
Learning is a humbling endeavor.
Be sure to view the HTML code to see the "hail satan" comment. This guy has some personality, that's for sure.
Another of his sites linked from the main page:
'At this point in the book, I was originally going to present a BSP-based renderer, to complement the BSP compiler I presented in the previous chapter. What changed my plans was the considerable amount of mail about 3-D math that I’ve gotten in recent months. In every case, the writer has bemoaned his/her lack of expertise with 3-D math, and has asked what books about 3-D math I’d recommend, and how else he/she could learn more.
That’s a commendable attitude, but the truth is, there’s not all that much to 3-D math, at least not when it comes to the sort of polygon-based, realtime 3-D that’s done on PCs. You really need only two basic math tools beyond simple arithmetic: dot products and cross products, and really mostly just the former. My friend Chris Hecker points out that this is an oversimplification; he notes that lots more math-related stuff, like BSP trees, graphs, discrete math for edge stepping, and affine and perspective texture mappings, goes into a production-quality game. While that’s surely true, dot and cross products, together with matrix math and perspective projection, constitute the bulk of what most people are asking about when they inquire about “3-D math,” and, as we’ll see, are key tools for a lot of useful 3-D operations.'
(Michael Abrash, "Graphics Programming Black Book Special Edition")
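Abrash's claim is easy to see in miniature: the cross product of two edge vectors gives a polygon's face normal, and a dot product against the light direction gives flat diffuse shading (an illustrative sketch in plain Python, not code from the book):

```python
def dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    """Cross product: perpendicular to both inputs."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

# Two edges of a polygon lying in the XY plane...
edge1, edge2 = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
normal = normalize(cross(edge1, edge2))   # points along +Z

# ...lit head-on: intensity is just a clamped dot product.
light = normalize((0.0, 0.0, 1.0))
intensity = max(0.0, dot(normal, light))  # 1.0 = fully lit, 0.0 = facing away
```

The same two operations cover backface culling (sign of a dot product) and most of what "3-D math" questions are really about, which is Abrash's point.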
Perhaps you're referring to what industry folk call gameplay programming? There's a lot less hardcore math there - but you still typically need to understand basic physics, trig, interpolation, etc.
The bit about being constrained by your first language is demonstrably not true (read pg's own account!). It can be a burden, but what stops people from progressing isn't this, it's the usual suspects: arrogance and ignorance. Once you stop judging a language purely on its merits and, thinking you've found the best, begin evangelizing it, you will have problems seeing more powerful ones (because the language has become part of your id).
You have to treat a programming language like a great chess player treats possible moves: when you find a great one, sit on your hands and look for a better one.
As far as math: in my experience it isn't required. It will make you better and make your work easier. I've had good math people replace whole algorithms of mine with a couple of math statements. But if you really devote yourself to getting better at programming, learning a lot of diverse languages and so on, your math will get better. I've found it easier to learn certain math concepts from related programming concepts that I had already learned.
And I really don't think your first language is all that important, programming is still fun usually, whatever the language. It's only later that we learn the fine art of language snobbery ;)
Working on hard problems, of your choosing, on your own schedule, can be fun and rewarding. But the reality is, you're not always going to get that.
Estimating 3D surface normals and depth from multiple photos of an object? Break out the matrix solvers. Computing homographies between images? Better know what an eigenvector is.
Although, to be intellectually honest, I only went as far as to research this, and no further. I just didn't see it as worth the time.
"Well I never had any problems learning to program. I am pretty much the smartest person in the world I guess. My "problem" is that I am so great I have too big of a head."
Since it is too late to edit it and add a disclaimer, I guess this thread will do.
What is programming? It is the art of starting with very simple primitives, then rigorously building up slightly more complicated primitives, then building another layer on top of that, until eventually you get to a level where you can do actual work. It is staggering how far we get on how few primitives; it is incredibly educational to read what opcodes a processor actually implements. (Even better, make sure you read just the modern subset.) I mean, it pretty much just has "move this here", "add this", "multiply this", "divide this", and "if this thingy is 0 jump there". Yes, I know there's a few more, but the point is that it definitely doesn't have an opcode that downloads a webpage. It is staggering the subtle ways in which these things can interact.
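That layering is easy to demonstrate in a few lines: multiplication rebuilt out of nothing but add, shift, and branch-if-zero style primitives (a toy sketch for nonnegative integers only):

```python
def multiply(a, b):
    """Multiply two nonnegative integers using only the kinds of
    primitives a CPU actually provides: add, shift, test-and-branch."""
    result = 0
    while b != 0:          # "if this thingy is 0, jump there"
        if b & 1:          # low bit of b set?
            result += a    # "add this"
        a <<= 1            # shift: doubles a
        b >>= 1            # shift: halves b
    return result

assert multiply(7, 6) == 42
```

One layer up you'd build exponentiation out of this multiply, then bignum arithmetic, and so on; the whole edifice is the same move repeated.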
It is absolutely possible in both the mathematical and programming cases to do "real work" without having the understanding of things that I refer to in my previous paragraphs. A web programmer does not constantly sit and do logic proofs, an accountant does not constantly refer to number theory throughout their day. Of course this is fine for the accountant, who is not expected to do original work in the field of accounting. (It is rather discouraged, in fact.) So of course it's OK for an accountant to have a very tool-like understanding of numbers. Are you, the programmer, expected to do no original work in the field of computing, such that you don't need to understand computing deeply? It may be so. Such jobs exist. But watch out, that means you're one library acquisition away from not having a job anymore! (And if you can't be replaced by a library, you're doing original work of some kind. Most programmers are.)
Look back at my first two paragraphs, where I have obviously drawn parallels. The real value of mathematics for a programmer is not that the programmer is likely to be sitting there doing matrices all day long, or even worrying much about logic problems, and they certainly aren't going to be sitting around all day doing sums. What mathematics provides is a clean place to learn the relationships I talk about, how we build the large concepts from the small concepts, and provides a playground where you can have that famous all-but-100% certainty that mathematicians like to go on about (justifiably so).
This is great practice for programming anything beyond a trivial project, where, if you have a clue, you will probably be starting with building up some reliable primitives, and then trying to build bigger things out of them. Bad programmers just start slopping concepts together with glue and just pour on more glue when they get in trouble, and produce what can only be described as, well, big piles of glue with no underlying order. A programmer who has become skilled in mathematics has at least a chance of producing something that is not merely a big pile of glue, and can have characteristics in their program that are characteristics that a big pile of glue can't have.
It is possible to come to this understanding without passing through formal mathematics, but it is much harder, because the world of programming is ultimately the world of engineering, and it is much harder to see these patterns. They are there, but they are obscured by the dirtiness of the real world.
That the mathematics may have an independent use is gravy; even if they were somehow otherwise worthless but programming was somehow unchanged (not really possible, but go with me here for the sake of argument) it would still be a worthwhile study. There are few better ways a programmer can spend their time than to become familiar with mathematics. Without the understanding of programming I outline above, regardless of which path you take to get there, your skillset will plateau, the maximum size or complexity of a system you can build without it coming apart will top out noticeably sooner than those who do have this understanding, and there will be things that remain forever a mystery to you. (Like how those large programs really work.)
> My mother, who has the same [thermostat], diligently spent a day reading the user's manual to learn how to operate hers. She assumed the problem was with her. But I can think to myself "If someone with a PhD in computer science can't understand this thermostat, it must be badly designed."