

Poor, Poor Child. You have no idea. - bryanwoods
http://writing.bryanwoods4e.com/

======
jerf
What is math? There are many answers, so the one I pick for this post is that
math consists of starting with some basic axioms, chosen to be as simple as
possible, then rigorously exploring what else you can extract from your simple
axioms by concrete proofs. It is staggering what you can get from simple
axioms. It is staggering the subtlety with which they can interact.

What is programming? It is the art of starting with very simple primitives,
then rigorously building up slightly more complicated primitives, then
building another layer on top of that, until eventually you get to a level
where you can do actual work. It is staggering how far we get on how few
primitives; it is incredibly educational to read what opcodes a processor
actually implements. (Even better, make sure you read just the modern subset.)
I mean, it pretty much just has "move this here", "add this", "multiply this",
"divide this", and "if this thingy is 0 jump there". Yes, I know there are a few
more, but the point is that it definitely doesn't have an opcode that
downloads a webpage. It is staggering the subtle ways in which these things
can interact.
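As a hedged illustration of how far a handful of primitives go, here is a toy Python sketch of a machine that has only "move a constant", "add", "decrement", and "jump if not zero", with multiplication built out of them by repeated addition. The instruction names and encoding are invented for illustration and don't correspond to any real ISA.

```python
# A toy machine with only a few primitives. Multiplication is not a
# primitive; the program below builds it from repeated addition and a
# conditional jump, the same way larger operations are layered on top
# of small ones. (Instruction names are made up, not a real ISA.)

def run(program, regs):
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "mov":            # regs[dst] = constant
            regs[args[0]] = args[1]
        elif op == "add":          # regs[dst] += regs[src]
            regs[args[0]] += regs[args[1]]
        elif op == "dec":          # regs[dst] -= 1
            regs[args[0]] -= 1
        elif op == "jnz":          # jump to target if regs[r] != 0
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return regs

# multiply r0 * r1, result in r2 (assumes r1 >= 1)
multiply = [
    ("mov", "r2", 0),       # accumulator = 0
    ("add", "r2", "r0"),    # accumulator += r0
    ("dec", "r1"),          # counter -= 1
    ("jnz", "r1", 1),       # loop while counter != 0
]

print(run(multiply, {"r0": 6, "r1": 7})["r2"])  # → 42
```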

It is absolutely possible in both the mathematical and programming cases to do
"real work" without having the understanding of things that I refer to in my
previous paragraphs. A web programmer does not constantly sit and do logic
proofs, an accountant does not constantly refer to number theory throughout
their day. Of course this is fine for the accountant, who is not expected to
do original work in the field of accounting. (It is rather discouraged, in
fact.) So of course it's OK for an accountant to have a very tool-like
understanding of numbers. Are you, the programmer, expected to do no original
work in the field of computing, such that you don't need to understand
computing deeply? It may be so. Such jobs exist. But _watch out_, that means
you're one library acquisition away from not having a job anymore! (And if you
can't be replaced by a library, you're doing original work of some kind. Most
programmers are.)

Look back at my first two paragraphs, where I have obviously drawn parallels.
The real value of mathematics for a programmer is not that the programmer is
likely to be sitting there doing matrices all day long, or even worrying much
about logic problems, and they certainly aren't going to be sitting around all
day doing sums. What mathematics provides is a clean place to learn the
relationships I talk about, how we build the large concepts from the small
concepts, and provides a playground where you can have that famous
all-but-100% certainty that mathematicians like to go on about (justifiably so).

This is great practice for programming anything beyond a trivial project,
where, if you have a clue, you will probably be starting with building up some
reliable primitives, and then trying to build bigger things out of them. Bad
programmers just start slopping concepts together with glue and just pour on
more glue when they get in trouble, and produce what can only be described as,
well, big piles of glue with no underlying order. A programmer who has become
skilled in mathematics has at least a chance of producing something that is
not merely a big pile of glue, and can have characteristics in their program
that are characteristics that a big pile of glue can't have.

It is possible to come to this understanding without passing through formal
mathematics, but it is much harder, because the world of programming is
ultimately the world of engineering, and it is much harder to see these
patterns. They are there, but they are obscured by the dirtiness of the real
world.

That the mathematics may have an independent use is _gravy_; even if they
were somehow otherwise worthless but programming was somehow unchanged (not
really possible, but go with me here for the sake of argument) it would
_still_ be a worthwhile study. There are few better ways a programmer can
spend their time than to become familiar with mathematics. Without the
understanding of programming I outline above, regardless of which path you
take to get there, your skillset will plateau, the maximum size or complexity
of a system you can build without it coming apart will top out noticeably
sooner than those who do have this understanding, and there will be things
that remain forever a mystery to you. (Like how those large programs really
work.)

------
kristiandupont
While there are clearly differences between languages, I think it is rather
deterministic to put that much weight on the _first_ language? My first
language was GW-BASIC. By your logic, there wouldn't be much hope for me, I
guess...

~~~
asnyder
"It is practically impossible to teach good programming to students that have
had a prior exposure to BASIC: as potential programmers they are mentally
mutilated beyond hope of regeneration." - Edsger Wybe Dijkstra

~~~
nocman
It is always amazing to me how much of a mix Dijkstra was. Many of the hand-
written articles of his I read are filled with valuable insights, and are
absolutely worth the read. And then there are those other comments (like
this one) that make me want to pay no attention to anything he had to say.

Of course, this is the guy who said "Computer Science is no more about
computers than astronomy is about telescopes." And while I believe I
understand the point he was trying to make, I have much higher respect for a
person who still likes to get their hands dirty (still actually spends
significant time _programming_ computers) rather than just dealing with
abstract theory, algorithms, analysis, etc. Yes, I know that Dijkstra knew how
to program (and did so, extensively, especially earlier in his career). But
this is the guy who never owned his own computer -- even after personal
computers became commonplace. That, coupled with the above comment about BASIC
(and other comments I've read from him like it) make him come across as kind
of an elitist -- like programming is beneath him or something.

Perhaps my take is wrong, and it very well could be. But I will say this. If
Dijkstra was still alive and well, and you put him, Guy Steele, and Don Knuth
in a room and asked me to pick two of the three to spend the day with,
Dijkstra would be the one left out in the cold.

~~~
andreyf
_If Dijkstra was still alive and well, and you put him..._

No harm, Dijkstra never struck me as the kind of person that would want to
spend a day in a room with anyone, either. Computer Science is about studying
the computation which can be done in hardware - not just Python, not just on
one chipset, not just on a von Neumann machine, but in general. "Programming"
as I think you mean it is a very small subset of that.

~~~
nocman
I was not arguing that programming is all there is to Computer Science (which
is why I said I understood the point Dijkstra was making with his astronomy
quote). However, I disagree with the notion that programming is "a very small
subset" of Computer Science -- even though I think Dijkstra would probably
agree with you on that. I think programming is a _large_ subset of Computer
Science. Yeah, there's a lot more to it than that, but in my book the
practical side of things is more important than Dijkstra seemed to think it
was.

He was obviously brilliant and very perceptive with regard to many things in
the field. However I still think it is over the top that the guy didn't even
own a computer.

There are many who would rather spend their careers just writing applications
without ever analyzing the differences between two algorithms, much less ever
studying or thinking about computation in general. Then there are those who
would spend their careers strictly on theoretical endeavors, and do not wish
to continue writing applications or software systems of significant value. It
is my personal opinion that the best Computer Scientists are the ones that
regularly do both.

------
shin_lao
Well, I'm sorry to say I strongly disagree with the mathematics part. A basis
in linear algebra is a definite plus and helped me approach programming in a
sensible way.

~~~
m0th87
I think it really depends on what you want to make. How does math help you
make a CRUD application, for example?

~~~
dusklight
The thing is the answer is YES, but you won't even understand why unless you
have the appropriate math knowledge.

There would be a lot less terrible code written out there if people had a
better grasp of discrete math, knew how to do big-O analysis, knew how to
create and implement provably correct algorithms, knew lambda calculus, and so
on. And that's just for general programming. I would say a basic grasp of set
theory and graph theory applies to almost everything too.
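As a hedged sketch of the everyday payoff of this kind of analysis: the two functions below return identical results, but the set-based one turns an O(n*m) membership scan into roughly O(n + m). The function names are invented for illustration.

```python
# Two ways to find the values of `items` that appear in `allowed`.
# Both return the same result, but the first rescans the `allowed`
# list for every item (O(n*m)), while the second builds a set once
# and then does one average-O(1) hash lookup per item (O(n + m)).

def common_slow(items, allowed):
    return [x for x in items if x in allowed]        # list scan per item

def common_fast(items, allowed):
    allowed_set = set(allowed)                       # build once: O(m)
    return [x for x in items if x in allowed_set]    # hash lookup per item

print(common_fast([1, 2, 3, 4], [2, 4, 6]))  # → [2, 4]
```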

~~~
ubernostrum
I have discrete math under my belt. And a bunch of other stuff too (came up a
bit short of a minor in math in college).

And... well, I can't say that I've ever explicitly used _any_ of it in
programming. I suspect you're falling into the trap of generalizing from a
field you're familiar with to all fields, and that's a generalization that
doesn't hold up.

~~~
RiderOfGiraffes
In part the key word here is "explicitly." I have a PhD in Combinatorics and
Graph Theory, and I have lectured in Calculus, Group Theory, Functional
Analysis and Topology. In my daily work for the past 18 years I have
explicitly used that background exactly once.

But the ongoing influence, the style of thought, the ability to visualise and
the ability to abstract away from the details - these things I use all the
time, every day. They have been enhanced and honed by all that math.

I use my math background _implicitly_ all the time, and I don't know how I'd
do what I do without it.

And this is perhaps the most important point about studying math. Often the
greatest take-away is the abstract problem solving capability, not the
material itself. I've never had to analyse the genus of a manifold in real
life, but I have thought about objects moving in an 11-dimensional space with
holes, because while everyone else was stuck in the detail, I was seeing
things differently. It turned out that the combination of styles was critical
to solving the problem.

~~~
ubernostrum
_Often the greatest take-away is the abstract problem solving capability, not
the material itself._

That's why I got a degree in philosophy.

OK, not the only reason, but one of them...

Meanwhile, I think my point stands. Too many people seem to have an "OMG you
don't use linear algebra every day? What kind of crap programmer are you?"
attitude.

------
tptacek
There's not a lot of math in plugging forms into database rows, or even in
plugging values into MSRs and managing interrupts, but there's enough math in
general programming that I constantly regret ditching that part of my
education.

I am, with surprising regularity, annoyed that I can't pull basic trig out of
my head without looking things up --- to say nothing of signal processing and
number theory.

------
RyanMcGreal
>As a linguistics major, you're no stranger to the idea that a person is only
capable of having thoughts and ideas that can be expressed in their language

As a linguistics major, you have no excuse for not knowing that Sapir-Whorf is
utterly discredited.

~~~
scott_s
Replying to your post in the recent discussion,
<http://news.ycombinator.com/item?id=1033741>, I don't understand your
reasoning. Sure, you can extend a language to introduce a concept. But the lack
of that concept can still have molded speakers' thinking.

~~~
ubernostrum
The problem is that there isn't one "Sapir-Whorf hypothesis". There's a
spectrum of such hypotheses, running the gamut from the mild "language has an
effect on how we think" up to the extreme "if your language uses the same word
for blue and green then you'll be blue/green colorblind".

The extreme forms have been rather thoroughly debunked at this point. The
milder forms verge on tautologies.

~~~
RyanMcGreal
>The milder forms verge on tautologies.

Exactly. The hypothesis is clearly false in any articulation strong enough to
matter.

~~~
scott_s
What is the weakest articulation that you think is clearly false?

------
philwelch
"As a linguistics major, you're no stranger to the idea that a person is only
capable of having thoughts and ideas that can be expressed in their language,
and there is no reason to expect programming languages to differ from spoken
languages in this area."

I thought this idea (the Sapir-Whorf hypothesis) had been discredited.

~~~
dmnd
My hypothesis is that the Sapir-Whorf hypothesis is much more applicable to
programming than natural language.

edit: after I wrote this, I realised it's just a restatement of PG's Blub
paradox.

------
pvg
It's OK to suck at maths if your idea of maths is limited to arithmetic,
something computers are indeed very good at, and your idea of programming is
limited to hooking up web forms to databases.

I'm not a linguistics major but I do speak several human languages and have no
trouble thinking in them and expressing ideas in them. The first computer
languages I learned (6502 assembly, BASIC) don't enter my conscious thought
when I think about the programming problems I encounter with the languages I
use today.

------
Eliezer
Look, I'm sorry to be the one to break this to you, but if you have difficulty
with any programming concept, you must not be a supergenius. You're just an
ordinary genius at best. I'm sorry. Life isn't always fair.

Of course, I say this as someone who hasn't yet tried to learn Haskell. On the
other hand, I know someone who competes at the national level and I never saw
_him_ have trouble with anything including Haskell, so...

The sad truth is that there _are_ some people for whom programming comes as
naturally as thinking, with code formed as easily as thoughts; and if it takes
an effort to understand any aspect of programming, you have just learned that
you are not one of those people. Alas.

------
proee
"Programming is not always intuitive, it's inherently complex, and it's
challenging. Once you start feeling like you've gotten a handle on it, you'll
learn something new that will make things seem even more complex for a while."

This applies to pretty much any field - engineering, physics, chemistry, even
music!

My background is in electrical engineering and it's quite daunting to realize
how little I REALLY understand when it comes to the fundamentals... Sure an
engineer can make things "go" but they're standing on the shoulders of giants.

Learning is a humbling endeavor.

------
NathanKP
Has anyone else checked out the main page of the site?

<http://bryanwoods4e.com/>

Be sure to view the HTML code to see the "hail satan" comment. This guy has
some personality, that's for sure...

Another of his sites linked from the main page:

<http://www.howtousetwitterformarketingandpr.com/>

~~~
gabrielroth
I don't think <http://www.howtousetwitterformarketingandpr.com/> is his site.

~~~
NathanKP
Okay, perhaps not. I didn't do a domain name search to see if he owns the
domain.

------
sonofjanoh
What about game programming? Nobody mentioned it. It's the ultimate test. Try
to hack together a simple pool game. You'll be amazed at how much maths and
physics go into a simple game that millions use and enjoy.

~~~
cellis
Not much math. Some simple physics like velocity and momentum transfer that
you can learn in a tutorial (in a day). I think it is important to make the
distinction between game development (lots of simple math and complex
"pluggable" formulas) and game engine development (yes, you need to understand
linear algebra, trig, and perhaps calculus).
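For a sense of the "simple physics" being discussed, here is a hedged sketch of the momentum transfer at the heart of a pool game: a frictionless elastic collision between two equal-mass balls. The vector math (swapping the velocity components along the line of centers) is standard; the function and variable names are my own.

```python
import math

# Elastic collision between two equal-mass balls. For equal masses the
# balls simply exchange the components of their velocities along the
# line joining their centers; the tangential components are unchanged.
# Names and structure are illustrative only.

def collide(p1, v1, p2, v2):
    """Return post-collision velocities for balls at p1, p2 moving at v1, v2."""
    nx, ny = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(nx, ny)
    nx, ny = nx / dist, ny / dist                 # unit collision normal
    # velocity components along the normal
    a1 = v1[0] * nx + v1[1] * ny
    a2 = v2[0] * nx + v2[1] * ny
    # equal masses: swap the normal components
    dv = a1 - a2
    v1_new = (v1[0] - dv * nx, v1[1] - dv * ny)
    v2_new = (v2[0] + dv * nx, v2[1] + dv * ny)
    return v1_new, v2_new

# Head-on shot: the cue ball stops dead and the object ball takes all
# of its momentum.
v_cue, v_obj = collide((0, 0), (2.0, 0.0), (1, 0), (0.0, 0.0))
print(v_cue, v_obj)  # → (0.0, 0.0) (2.0, 0.0)
```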

~~~
sshumaker
No, game development includes 'game engine development' - the majority of
companies write their own engine or have heavily modified a licensed engine.
And you often need to understand quite a bit of math to use even off the shelf
engines - to debug issues and to tweak stuff.

Perhaps you're referring to what industry folk call gameplay programming?
There's a lot less hardcore math there - but you still typically need to
understand basic physics, trig, interpolation, etc.

~~~
nzmsv
Most of the time working with a game engine does not involve inventing any new
mathematics, so there is an upper bound on just how hardcore it can really get
:)

------
ytinas
I like the layout of the page, nice fonts etc.

The bit about being constrained by your first language is demonstrably not
true (read pg's own account!). It can be a burden, but what stops people from
progressing isn't this, it's the usual suspects: arrogance and ignorance. Once
you stop judging a language purely on its merits and, thinking you've found
the best, begin evangelizing it, you will have problems seeing more powerful
ones (because the language has become part of your id).

You have to treat a programming language like a great chess player treats
possible moves: when you find a great one, sit on your hands and look for a
better one.

As far as math: in my experience it isn't required. It will make you better
and make your work easier. I've had good math people replace whole algorithms
of mine with a couple of math statements. But if you really devote yourself to
getting better at programming, learning a lot of diverse languages and so on,
your math will get better. I've found it easier to learn certain math concepts
from related programming concepts that I had already learned.

------
almost
Programming isn't hard, programming is fun! Ok, it is hardish sometimes, but
hard in a fun way not hard in the non-fun way this article seems to imply.

And I really don't think your first language is all that important,
programming is still fun usually, whatever the language. It's only later that
we learn the fine art of language snobbery ;)

~~~
pvg
Hard and fun are not mutually exclusive.

~~~
almost
Absolutely, but there is fun-hard and there is hard-work-discouraging-hard, and
this article seemed to me to be talking more about the latter. In fact I'd say
that non-hard programming usually gets not-so-fun (although it still has its
charms).

~~~
kscaldef
No, really, sometimes it's "hard-work-discouraging-hard". I don't know anyone
who thinks it's fun to pore through strace or tcpdump output trying to figure
out obscure bugs at the OS or network layers. I've known a couple people who
liked looking at generated assembly to debug compiler issues, but they are a
rare breed. Race conditions or 1 in a million bugs really suck. Trying to
debug any of these things while your company hemorrhages money or the phones
are ringing off the hook with pissed-off customers; any "fun" you're having is
just the adrenalin trying to keep you from being eaten by a tiger.

Working on hard problems, of your choosing, on your own schedule, can be fun
and rewarding. But the reality is, you're not always going to get that.

------
dangrossman
I didn't need any advanced math to program until I started tackling computer
vision problems.

Estimating 3D surface normals and depth from multiple photos of an object?
Break out the matrix solvers. Computing homographies between images? Better
know what an eigenvector is.
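As a hedged, minimal illustration of extracting an eigenvector (a homography solver actually wants the eigenvector of AᵀA with the *smallest* eigenvalue, usually via SVD): plain power iteration, which converges to the dominant eigenvector. This is a toy, not a vision pipeline; names are my own.

```python
# Power iteration: repeatedly multiply a vector by the matrix and
# normalize. The result converges to the eigenvector belonging to the
# largest-magnitude eigenvalue. Toy sketch in plain Python, not a
# homography solver.

def power_iteration(A, steps=100):
    n = len(A)
    v = [1.0] * n
    for _ in range(steps):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    return v

A = [[2.0, 0.0],
     [0.0, 1.0]]
# Dominant eigenvector of A is (1, 0); the second component decays
# toward zero with each iteration.
print(power_iteration(A))
```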

~~~
cellis
For me it was "want to make a fish swim realistically with inverse
kinematics"? Break out the Jacobian Matrices.

Although, to be intellectually honest, I only went as far as to research this,
and no further. I just didn't see it as worth the time.

------
EricBurnett
My first experiences with programming must have been very different from his,
with respect to the first section in particular. I always knew that
programming was supposed to be hard - I grew up knowing no programmers,
teaching myself the esoteric art of C++ from a copy of "Sams Teach Yourself
Visual C++ 6 in 21 Days". So when I understood it reasonably well, I felt I
must be above average. Indeed, I've never felt the feelings of "frustration
and discomfort" he references. Instead, I have always had to battle with my
hubris in thinking that I'm that much better than the programmers around me.

~~~
mattiss
Seriously dude....

~~~
EricBurnett
Yes? My point was simply that where he struggled with frustration, I struggled
with my ego. Considering he is probably a better programmer than me, I thought
this was an interesting contrast to bring up. What is the problem?

~~~
mattiss
Maybe you should clarify your posting. It reads as

"Well I never had any problems learning to program. I am pretty much the
smartest person in the world I guess. My "problem" is that I am so great I
have too big of a head."

~~~
EricBurnett
Hmm, I see. Thanks for pointing that out - I'm still working on getting my
intent across properly for this kind of thing.

Since it is too late to edit it and add a disclaimer, I guess this thread will
do.

------
J3L2404
In the beginning of Hillegass's book, Cocoa Programming for Mac OS X, he has a
great quote of someone from Caltech being asked about the real world
usefulness of a degree in astrophysics. His response was "Not much, but if I
run into a hard problem and start thinking I must be stupid because I can't
figure this out, then I remember I have a degree in astrophysics so I am not
stupid and this must be hard. So in that way it is useful." I'm paraphrasing
(the book is buried somewhere), but that always stuck with me, and your post
reminded me how important it is to keep at it, because coding is not easy, but
it is worth it.

~~~
bryanwoods
Thanks for mentioning this. I've actually been working through that book over
the past few days and think that anecdote really got under my skin. For some
reason it didn't dawn on me when writing the article. I'll make note of it.

~~~
zck
pg made a similar comment in Undergraduation
(<http://www.paulgraham.com/college.html>):

> My mother, who has the same [thermostat], diligently spent a day reading the
> user's manual to learn how to operate hers. She assumed the problem was with
> her. But I can think to myself "If someone with a PhD in computer science
> can't understand this thermostat, it must be badly designed."

