
The Mathematical Hacker - ColinWright
http://www.evanmiller.org/mathematical-hacker.html
======
p4bl0
I find it disturbing that this article has been written in 2012. While I was
reading it I really thought that it was at least 10 years old.

Computer science researchers are doing actual mathematics, and they are
clearly more in what the article calls the "Lisp school" than the "Fortran
school". Research in functional programming is mostly mathematics. The lambda
calculus (which was originally not developed to be a _programming_ language at
all), types, logic, category theory… Most of the tools used in programming
languages research are mathematical tools, and they always have been.

The article is great and I mostly agree with its conclusion, but it seems to
miss a lot about what is actually going on in computer science. I understand
that it mainly talks about programmers and not computer scientists, but it's
less and less true that programmers are not interested in what is happening in
computer science research. Lisp invented a lot of the stuff that is now in
almost every language (conditionals, garbage collection, first-class
functions, everything as an expression, recursion…), and it took many years to
get here. But nowadays I see a lot of interest from the programmer community
in what is happening in programming languages research around Haskell, Scala,
OCaml, and even things like Coq, Isabelle…! To understand even a little of
what is actually going on with these research projects, one has to study (even
if indirectly) mathematics.

Now that I have written this comment, I'm thinking that maybe I have a
skewed/distorted view and that I'm not really talking about the same people as
the article does. I don't know.

~~~
lightcatcher
Math is a very broad field. The point of this article is that _applied_ math
is generally not done in functional programming, and is not typically part of
computer science research (it more often happens as dedicated applied math
research).

Types, logic, and category theory are the sort of things the author thinks
that functional programming people concern themselves with, and he is raising
the point that these mathematical concepts only help people create better
languages and write safer code, not do "useful" things like weather or weapons
simulations, optimization problems, or image/video processing.

~~~
discreteevent
I agree - "Mathematics, in the end, does not help you understand computer
programming. It is not about finding metaphors, or understanding
“fundamentals” that will never be applied. Rather, mathematics is a tool for
understanding phenomena in the world." - I couldn't agree more with this. I
come from an Electrical Engineering background originally. We did a lot of
applied mathematics (in the sense of the article). One year we had a lecturer
from the mathematics department who didn't really have a clue about the kind
of mathematics an engineer needs. He spent a lot of time proving the most
basic/fundamental things, nearly down to the level of 1+1=2. My attitude at
the time was "OK that's interesting but so what? I don't need it to get things
done in the real world". Its the same thing with the lambda calculus or type
theory for me. "OK I get the idea - so what". Its not really of any use to me.
Its probably of very little use to anyone except programming language
designers and there are very few of them. Its kind of disheartening to me then
to see such a focus on these things, article after article. To me it seems
that the field is very inward looking in this sense.

For me code is just a tool for modeling/solving problems in the outside world.
It is the outside world that is interesting. I think the emphasis in the
article is the right one. If more people focused on applied mathematics rather
than category theory then we probably would get more innovation. Eric Evans
said something similar in Domain Driven Design "Instead, the technical talent
goes to work on elaborate frameworks, trying to solve domain problems with
technology. Learning about and modeling the domain is left to others.
Complexity in the heart of software has to be tackled head-on. To do otherwise
is to risk irrelevance."

One problem, though, is that I do like Lisp and its lineage (Smalltalk etc.).
I had to take Fortran years ago and never really liked it.

------
Dn_Ab
The problem with the article is that it dreams up a false conflict and then
buries its thesis, a very valid and solid one, by trying to force onto it a
contentious narrative that just isn't there. Very few people these days are
ignorant of the points he makes, and neither Yegge nor Graham belongs to some
imagined camp opposed to applied math. I would tend to think they are in favor
of it - Yegge is interested in bio, and Graham did the spam-filtering work.

Here is my attempt at a summary that captures all the detail:

Programmer math is not restricted to type theory and formal logic. Applied
math is also a very important aspect. Eric S. Raymond is wrong to make it look
like the 'except for X' is negligible. Plus, machine learning, bio, geo and so
on are increasing in importance and growing rapidly as fields. Even in the
past, applied-math programs had a massive impact on the economy by equipping
engineers with specialized software. Learn math. It is important for you as a
programmer going forward.

---------

I am not sure why he restricts functional programming to recursion with Lisp.
There is no need to denigrate functional programming; it fits extremely well
with these applied math problems and helps a lot with reducing the complexity
of an implementation, in my opinion. Functional languages also tend to be at
least as fast as Java, with OCaml posting incredible single-core speeds.

Another point: implementing math algorithms is distinct from doing programming
mathematically. Math helps one calculate and choose faster algorithms or find
better bounds. But except for some portions of the Haskell compiler, there are
few uses of direct calculation to derive programs mathematically. In
engineering, math allows you to precisely calculate behaviour and properties,
but in programming you have to debug. The key reason for this divide is that
other engineers have lots of components with well-defined properties to work
with.

This is another reason why functional programming is more mathematical.
Although the idea of composition is not inherent to FP, in FP it is the
default paradigm. Combinators with well-defined properties and theorems proven
about them, that only go together in a certain way, so that you can sit down
and have a pretty good idea of how your program will behave theoretically -
this is what makes FP so close to doing _applied math_. And Haskellers really
shine at that. Haskellers like to talk about monoids and categories, but
really most Haskell programming is closer to what a regular engineer does,
with the category theorists serving the same function as physicists.

~~~
sirclueless
To add to your claim that math is fundamental to computer science, I think
there's a big factor that isn't being mentioned: knowing math future-proofs
your career better than knowledge of any technology or language.

Mathematical depth is the irreducible core of a computer scientist's value.
Fundamentally, what computers provide is the mechanization of knowledge. In
the same way that robotics came along and mechanized all of the entry-level
manufacturing jobs (the root of the current "skills gap" in manufacturing,
more so than outsourcing in my opinion), over the coming few decades I expect
that computers will mechanize most entry-level knowledge workers' jobs, if
they haven't already.

And the wacky thing about comp. sci. is that our entry level positions _are_
in fact basic knowledge-work. As the field matures, the tasks that are the
bread-and-butter of basic vocational computer science are increasingly
swallowed by more sophisticated technical solutions that are more mechanized
and productized. Comp. Sci. is an ouroboros, eating its own tail of unskilled
positions.

The most obvious example currently is sysadmins: computing infrastructure is
being productized and sold under the umbrella of "cloud computing," and its
main value proposition is that you can cut your sysadmin budget by a huge
margin. Similarly, QA and software testing departments are having their lunch
eaten by better testing practices and automated solutions such as CI: the jobs
aren't disappearing entirely, they are just being consolidated into a tiny
department (maybe even one dev, part time) that can maintain an automated
infrastructure. At a more basic level, data entry positions started drying up
years ago with the advent of OCR, and now most of the big easy targets are
entirely mechanical (ex. the post office/shipping).

In short, the best way to provide enduring value is to provide mathematical
insight that can't be mechanized. If you can recognize optimizations and
opportunities that take advantage of the unique structure of your domain, you
are unlikely to be swallowed by someone with a productized version of your
livelihood. If all you can manage is CRUD apps for enterprise, then when
someone comes along with a general product that reduces your department from
many guys writing software to one guy managing the product, you aren't gonna
convince anyone of your value. At that point your only defense is the behemoth
of bureaucracy, but the competitive nature of business suggests that it won't
be a great defense for long.

~~~
shurane
I'm not sure I completely agree with you, but these are some valid points.
What happens once we run out of unskilled jobs due to the machines?

------
jrajav
I think this article raises a great point overall, but the Fibonacci example
is a bit weak and smacks of a strawman. Functional programming tutorials that
teach you to write a recursive Fibonacci or factorial calculator are not doing
so with the purpose of teaching a useful function, or even a workable one - in
fact, more often than not this is called out by showing its performance for
moderately large N. They only use those examples because they're simple,
abstract problems that can be used to elegantly demonstrate recursion in a
relatable way.

If we're going to talk about real world, practical applications, then it might
be more relevant to bring up a problem for which there isn't a trivial O(1)
solution. A dynamic programming problem, perhaps? It's glossed over with a
terse dismissal:

> “Advanced” discussions might consider engineering considerations such as
> tail-call behavior, or the possibility of memoizing results.

... But this is exactly why a functional programming language might be better
than an imperative one for certain applied mathematics problems. Note, I'm not
saying that it _is_ better, just that it's not such a simple argument. With a
different set of examples, this article could have been written in _favor_ of
Lisp just as easily. I still think it's a good article overall, just that the
point it tries to sneak in about Lisp and functional programming is a little
shaky.

------
Someone
_"If you are a systems and network programmer like Raymond, you can do your
job just fine without anything more than multiplication and an occasional
modulus."_

A good network programmer should have at least passing knowledge about graph
theory (the network is a graph), queueing theory (how else are you going to
size your buffers?), and also some statistics (given these numbers, how much
bandwidth will be taken by resends? How much data can we send per hour? How
many 9s do we get for 'probability that a message is handled within 10
seconds'?)
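
For instance (a back-of-the-envelope sketch of my own, not from the article):
the textbook M/M/1/K blocking formula gives the probability that an arriving
packet finds all K buffer slots full, which is exactly the kind of number
buffer sizing turns on.

    
    
        # Illustrative sketch only: blocking probability of an M/M/1/K queue.
        # lam = arrival rate, mu = service rate, K = number of buffer slots.
        def mm1k_drop_probability(lam, mu, K):
            rho = lam / mu  # offered load
            if rho == 1.0:
                return 1.0 / (K + 1)
            # stationary probability that the queue is in state K (buffer full)
            return (1 - rho) * rho**K / (1 - rho**(K + 1))
    
        print(mm1k_drop_probability(0.8, 1.0, 8))  # ~0.04 at 80% load, 8 slots
    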

Good systems programmers should have some knowledge of all of these for
essentially the same reasons. They also will need some knowledge of Petri nets
(for deadlock avoidance), formal languages (how else are you going to spec a
language or some complex protocol?).

Also, an article on this subject that mentions neither Knuth nor Dijkstra?
Weird. Is that because they do not have blogs (yes, Dijkstra is no longer
alive, but I doubt he would have had a blog if he did)?

~~~
tzs
> A good network programmer should have at least passing knowledge about graph
> theory (the network is a graph), queueing theory (how else are you going to
> size your buffers?), and also some statistics (given these numbers, how much
> bandwidth will be taken by resends? How much data can we send per hour? How
> many 9s do we get for 'probability that a message is handled within 10
> seconds'?)

Partial differential equations can also be useful there. From Danny Hillis'
TEDxCaltech talk about Feynman's work on the Connection Machine, where Feynman
was figuring out how many buffers they needed in the routers:

---------------------------

By the end of that summer of 1983, Richard had completed his analysis of the
behavior of the router, and much to our surprise and amusement, he presented
his answer in the form of a set of partial differential equations. To a
physicist this may seem natural, but to a computer designer, treating a set of
boolean circuits as a continuous, differentiable system is a bit strange.
Feynman's router equations were in terms of variables representing continuous
quantities such as "the average number of 1 bits in a message address." I was
much more accustomed to seeing analysis in terms of inductive proof and case
analysis than taking the derivative of "the number of 1's" with respect to
time. Our discrete analysis said we needed seven buffers per chip; Feynman's
equations suggested that we only needed five. We decided to play it safe and
ignore Feynman.

The decision to ignore Feynman's analysis was made in September, but by next
spring we were up against a wall. The chips that we had designed were slightly
too big to manufacture and the only way to solve the problem was to cut the
number of buffers per chip back to five. Since Feynman's equations claimed we
could do this safely, his unconventional methods of analysis started looking
better and better to us. We decided to go ahead and make the chips with the
smaller number of buffers.

Fortunately, he was right. When we put together the chips the machine worked.
The first program run on the machine in April of 1985 was Conway's game of
Life.

---------------------------

Source: [http://longnow.org/essays/richard-feynman-connection-machine...](http://longnow.org/essays/richard-feynman-connection-machine/)

~~~
shurane
I am in love with Feynman. He's so captivating.

~~~
Crake
I love him, too.

------
krutulis
As I see it, an essential point of "Data Science" is to finally introduce
mathematical awareness to the world of programming. I welcome our newly
arriving & refreshingly numerate overlords!

"One way to read the history of business in the twentieth century is a series
of transformations whereby industries that 'didn't need math' suddenly found
themselves critically depending on it." This is a significant understatement.
It also summarizes the history of business in the 19th century and even
earlier. See, for example: steam power, metallurgy, bridge building, chemical
industries, and electricity.

Not all hackers miss Miller's point. One of Zed Shaw's older rants
"Programmers Need To Learn Statistics Or I Will Kill Them All" comes to
mind[1].

Perhaps I'm missing some intended irony, but Miller's polemics on Lisp and his
reverence for Fortran feel overdone & unnecessarily narrow. Coders who buy
into Miller's notions can join the "Fortran School" by learning something
along the lines of R programming (lots of fun, functional, array-based, and
adept at statistics) or J (compression and austere beauty that only a
mathematician could love) or any number of other languages that have not
shunned math. Of course, calculus, statistics, and linear algebra are also
required. For the university-trained, brush up (or curse your university for
not requiring you to learn these subjects). For the autodidacts among us, the
likes of Coursera and Sal Khan are indispensable.

[1] <http://zedshaw.com/essays/programmer_stats.html>

------
sdfqwer
The mere fact that SICP (Structure and Interpretation of Computer Programs)
and SICM (Structure and Interpretation of Classical Mechanics) were written by
the same person should be enough to give the lie to Miller's thesis.

Although willful ignorance of math is clearly a problem among certain segments
of industry, it's just plainly not true in general and certainly doesn't
correlate with lisp.

------
juiceandjuice
The first real library I had to learn to use was GSL. The first framework I
ever learned to use was ROOT. I also had to learn how to use FFTW and all
sorts of other math libraries. I've had to hack fortran a bit, but mostly just
know how to read it and compile it with g77 and stuff because that's what my
professors would program in. In my only real course in programming at a
university I learned how to link C to fortran libraries. I used Maple way more
than matlab, and all of this was the majority of my programming training
before I graduated from college.

If there is a divide between fortran and LISP hackers, I'd have to say that
Python is going to be the adopted parent/mentor, and it's the only programming
language truly situated to bring them together. This is why projects like PyPy
get me excited. Other projects like Julia are really interesting as well.

I have to agree with the premise of the article though. There is a divide
between mathematical hackers and non-mathematical hackers. I don't think it's
anti-intellectualism per se, it's possibly more of a result of the increasing
stratification of backgrounds in an industry which has increasingly focused on
products, services, and most importantly consumers and "social".

I went to a talk yesterday by a veteran applied mathematician who does
research at HP Labs. He mentioned something that really resonated with this
notion: the days of academic-style research, of Bell Labs and the like, have
disappeared. This isn't to say that research and math have disappeared from
industry, but as a researcher you have to justify the research by explaining
how it can contribute to a bottom line for the company. There are a few
companies with looser restrictions on this (Microsoft, Google), but typically
this kind of research is best done at companies that are monopolies with high
profit margins. In that case, a company like Apple would be best situated to
conduct research on the level of the Bell Labs of the past, but it's evident
that they don't. The mathematical hacker started falling out of vogue sometime
in the 70s. If the mathematical hacker ever becomes vogue again, it won't be
because of Apple. It will be because of big data, autonomous cars, medicine
and bioinformatics.

------
pdmccormick
I've taken some mathematics and logic courses at a university level, and the
thing I took from those experiences and apply everyday is the reasoning and
proof structuring. Struggling through and coming to something of an
understanding of the concepts of analysis (calculus) and algebra were a
pivotal point in my intellectual development. As somebody for whom math has
never come easily (and still doesn't), but for whom the beauty and wonder of
it all was still enchanting, I'd heartily encourage anyone and everyone to learn
some higher maths in detail. The specific results may not always be applicable
day to day, but the way of thinking will stay with you. Good luck!!

------
laichzeit0
I don't understand what the apprehension is towards learning mathematics. Take
it at college (university). Take as much as you can. Take it instead of silly
courses like philosophy (for which you can sit in a lecture in your spare
time, or look at online videos, or take out a book from the library and read
at your own leisure). Just learn the freggin' math and stop bitching about "is
it necessary, do I really need it, blah blah blah". It's an IMMENSELY useful
subject in just about every branch of human knowledge. It's never a waste of
time, perhaps even more so if you're a computer programmer.

------
norswap
The article casually confounds what programs do with how programs are
written. Programs operate on data, and said operations can be more or less
mathematical in nature.

Writing programs, now, that's not mathematical in nature at all. There are an
infinite number of programs that compute the same result. A program is good
because it meets a number of criteria, among which efficiency and mathematical
soundness are not always the most important. Many of the criteria have to do
with how the human mind works and how it intuits.

------
ninetax
>After coding his recursive solution, the Lisp hacker is more likely to ask
the irrelevant question: how can I reduce these two functions down to one
function?

Why would the Lisp hacker ask that? That makes no sense. There is no reason to
combine those two functions.

------
kenko
" long int fac(unsigned long int n) { return lround(exp(lgamma(n+1))); } "

Sounds great! What's the factorial of 40? Let's use some Python:

>>> int(math.exp(math.lgamma(41)))

815915283247882431423526575245034982027017846784L

Huh, weird. I would've thought the factorial of 40 would be divisible by ten.
Let's try it another way:

Prelude> let fac n = foldr1 (*) [1..n]

Prelude> fac 40

815915283247897734345611269596115894272000000000

The author's disdain for the lambda calculus, as if it is somehow not real
math (because it doesn't involve numbers in the familiar sense?), is bizarre.

~~~
sbi
Well,

    
    
         lround(exp(lgamma(...)))
    

is just about the worst way to use lgamma. If long ints are signed 64 bit
integers, then the largest factorial you can store is 20!. You might as well
look it up in an array. But you frequently need ratios of factorials (and
values of the gamma function more generally) when doing statistical computing,
and lgamma is invaluable.
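
A quick sketch of the kind of thing I mean (illustrative, not production
code): the log of a binomial coefficient is a ratio of three factorials, and
computing it with lgamma keeps everything in a comfortable floating-point
range even for huge n.

    
    
        import math
    
        # log of C(n, k) = n! / (k! (n-k)!), computed in log space so nothing
        # overflows even for very large n.
        def log_binomial(n, k):
            return (math.lgamma(n + 1) - math.lgamma(k + 1)
                    - math.lgamma(n - k + 1))
    
        # C(10000, 5000) has thousands of digits; its natural log is ~6926.6.
        print(log_binomial(10000, 5000))
    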

------
new_test
As a programmer you may not need to use algebra or analysis on a daily basis,
but as a rule I've been noticing that the solutions that are more
mathematically elegant also tend to be more pragmatic.

------
abrezas
"Despite the aesthetic virtues ascribed to functional programming, I find the
preceding solutions to be more beautiful than their recursive counterparts.
They run in constant (rather than linear) time, and they are easily adapted to
work with non-integer inputs."

Isn't this wrong? I don't think pow is computed in constant time.

~~~
Patient0
I think many floating point functions (e.g. sin, cosine) are implemented using
Pade rational approximations - basically, the ratio of two polynomials.
(<http://www.dattalo.com/technical/theory/sinewave.html>)

This usually gives enough accuracy for the purposes of floating point.
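
For example, here is a toy sketch of the idea (mine, not any actual libm
code): the [3/2] Pade approximant of sin(x), a ratio of two small polynomials
that matches the Taylor series of sin through the x^5 term.

    
    
        import math
    
        # Toy illustration, not real library code.
        def sin_pade(x):
            return (x - 7 * x**3 / 60) / (1 + x**2 / 20)
    
        for x in (0.1, 0.5, 1.0):
            # agrees with math.sin to ~6 digits at 0.5, about 3 digits at 1.0
            print(x, sin_pade(x), math.sin(x))
    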

However, I'm not sure if "pow" can be usefully implemented this way. I am
guessing no, because pow grows faster than any polynomial eventually...

edit: Hmm, pow at least looks linear here:
<http://www.netlib.org/fdlibm/e_pow.c>

------
S4M
I don't really see the point of this article. You don't need Fortran or C to
implement the Fibonacci calculations described in the article.

On top of that, the article misses the point that some elements of functional
programming (map, lambda) actually make a numerical implementation neater.

It is not an accident that the original authors of R were themselves Lispers
and acknowledged having been inspired by Lisp when creating R.

I think, among others, these links illustrate the previous point:
[http://cran.r-project.org/doc/html/interface98-paper/paper.h...](http://cran.r-project.org/doc/html/interface98-paper/paper.html)
[http://www.stat.auckland.ac.nz/~ihaka/downloads/Compstat-200...](http://www.stat.auckland.ac.nz/~ihaka/downloads/Compstat-2008.pdf)

~~~
jpitz
The article didn't claim that FORTRAN or C are required for these
calculations. Rather, it lamented that the prevailing attitude of Lisp
practitioners was to eschew the knowledge behind those calculations.

------
cs702
This rings true to me.

It also represents potentially really good news, because it means that we are
probably _at the beginning of a huge wave of wealth creation_ as we figure out
how to apply mathematical knowledge to the massive growing mountains of data
generated by everyone and everything everywhere.

Think about the historical impact that statistical-quality-control software
has had in manufacturing, or the impact that economic-forecasting-and-
optimization software has had in farming, or the impact that 'linear
programming' software has had in supply-chain management and logistics; and
then extrapolate these past achievements to get a sense of the potential
impact that applications of mathematical knowledge could have in an everyone-
connected, everything-networked, everything-measured world.

Exciting times.

~~~
b_emery
Yes! This is what had me thinking that this article is the best one I've read
all week. It's not about language wars, it's about finding the new
applications of math. With the data and computational power at our disposal,
big questions are going to fall and big industries will be disrupted.

------
KirinDave
A fine essay, except that it ignores that the "workaday" programmer actually
hates high degrees of abstraction and avoids them.

No, seriously.

We're taught in mainstream programming languages to temper our desire for
abstraction with mechanical sympathy; being careful not to build too high a
tower of abstraction lest our CPU god grow angry and knock us down to earth
with miserable runspeeds. Even if your algorithm is right, failing to fit in
cache or having too many memory fetches can increase your code runtime by
multiple orders of magnitude from where an optimal, machine-aware piece of
code should be.

And the languages that _do_ try to bring more abstraction to the table are
ironically dismissed as "too academic" or "too strange" or "not practical" by
most people in the industry.

------
meaty
Algebra is required all the time for me for pretty much any non-trivial
problem that isn't CRUD (most apps seem to have no functional complexity past
CRUD).

The only time I had to delve into deep mathematics was implementing the
CORDIC algorithm for microcontroller-powered floating point ops (sin/cos/ln)
on a 68HC11, because we couldn't buy an implementation that came with source
code we could verify. Even then it wasn't all that hard. Took about a week to
wrap my head round the maths involved.
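
The core idea is small enough to sketch, though. Here is a rough floating
point illustration of CORDIC in rotation mode (not the fixed-point 68HC11
code): rotate a vector through a fixed table of arctan(2^-i) angles, steering
each step by the sign of the remaining angle.

    
    
        import math
    
        # Rough sketch, floating point. Valid for angles roughly in [-pi/2, pi/2].
        def cordic_sin_cos(theta, iters=32):
            angles = [math.atan(2.0 ** -i) for i in range(iters)]
            # Pre-scale by the CORDIC gain so the result comes out normalised.
            k = 1.0
            for i in range(iters):
                k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
            x, y, z = k, 0.0, theta
            for i in range(iters):
                d = 1.0 if z >= 0 else -1.0
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * angles[i]
            return y, x  # (sin(theta), cos(theta))
    
        print(cordic_sin_cos(0.5))  # close to (math.sin(0.5), math.cos(0.5))
    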

I do find that mathematical literature is all theory and no application, even
with my engineering background. If it showed some applications, people would
use it more and take it more seriously, and therefore there would be more
mathematical programmers.

~~~
wolfgke
> I do find that mathematical literature is all theory and no application,
> even with my engineering background. If it showed some applications, people
> would use it more and take it more seriously, and therefore there would be
> more mathematical programmers.

I have a mathematics background and I immediately see thousands of
unimplemented applications when I read mathematical literature (especially the
highly abstract kind). So the applications are there. But acquiring the
ability to see these applications takes time and dedication. I try to point
lots of colleagues to these connections and applications and get ignored (I
could talk to a wall instead - it wouldn't make a difference). The colleagues
only want to get their job done somehow and nothing more...

------
mgnagy
While an interesting read, it struck me as an odd, "we've taken the wrong
fork" nostalgia piece.

If you look at the very early days of computing, you tended to find three
types of degree/backgrounds: Electrical Engineering, Mathematics, and
Philosophy. The double E is obvious: building the kit. The other two, the
potentially Left Brained or Right Brained approach to programming. All three
of them have one thing in common: A structured, _logical_ approach to solving
problems.

If there is anything to lament, it is that we still have the tug of war
between left and right brain approaches. The reality is that one size, indeed,
does not fit all. We should accept that and move on to getting the work
appropriately done.

------
mcgwiz
The case for relying on math more when programming is one pillar of a more
general aim: for programmers to be more creative in their approach to solving
problems programmatically. IMHO, creativity can be learned, but not overnight.

Creativity requires comfort with risk and can therefore be costly, which goes
against the contemporary programmer's drive toward efficiency and
optimization. Creativity requires bringing a fresh perspective to each
problem encountered, and can therefore be exhausting, which works against the
programmer's drive toward re-use and generalization. Creativity requires
leaning on other
disciplines for inspiration, which runs counter to the demands of being a
language or technology expert.

With time, the grip of these convictions can be loosened and contextualized,
allowing room to explore the curiosities and creative impulses, many of them
mathematical (or psychological, or a matter of re-framing), that arise
naturally during the course of a project.

------
rartichoke
Math seems extremely important to me as a programmer, even for CRUD apps.

Let's say you graduated HS and slept through your algebra classes. Now a few
years later you're a programmer with pretty much no math skills other than
what you've learned in elementary school.

Math seems to teach you one of the most fundamental and useful skills in
programming: the ability to reduce a problem from something that is complex
into something that is not.

I've only gotten a taste of some basic algebra after taking some online CS
courses, and coming into it with a background of "embarrassingly poor math
skills" I can honestly say that it has changed my life for the better.

I'm still clueless when it comes to some algebra, but I find myself looking at
problems and being able to solve them much more easily now, and this is only
after a few weeks of programming-related courses that happen to use algebra on
some occasions.

The math isn't what made it easier. It's applying the same approach I use to
solve math problems to programming.

------
irahul
Picking on Fibonacci of all things? The goal of the fibonacci and factorial
examples is to teach recursion. Both fibonacci and factorial are good starting
points for a beginner. They can be followed by discussions of dynamic
programming, where the student can be introduced to recurrence relations and
solving them top-down and bottom-up.

EDIT: Adding some background on dynamic programming

For dynamic programming, the problem should be breakable into overlapping
smaller problems, and the base cases should be recognized. If the problems
don't overlap, they fall within the broader divide-and-conquer category
(mergesort, quicksort, etc. are famous examples).

fib(n) is defined as fib(n-1) + fib(n-2) (overlapping smaller subproblems),
with fib(0) = 0 and fib(1) = 1 (base cases):

    
    
        fib(n) = fib(n-1) + fib(n-2)
        fib(0) = 0
        fib(1) = 1
    

A relation defined as above (recursively) is known as a recurrence relation.
Discrete math courses deal with finding a closed-form expression - a
non-recursive function of n. But in programming, we are fine with solving the
recurrence relation without finding a closed-form expression.
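
For completeness, the closed form a discrete math course would derive for the
fibonacci recurrence is Binet's formula. A quick sketch for comparison (mine;
floating point, so only exact up to roughly n = 70):

    
    
        from math import sqrt
    
        def fib_closed(n):
          phi = (1 + sqrt(5)) / 2
          psi = (1 - sqrt(5)) / 2
          return round((phi**n - psi**n) / sqrt(5))
    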

Recurrence relations form the basis of dynamic programming and they can be
solved either top down or bottom up.

The top down approach is the traditional recursive solution.

    
    
        def fib(n):
          if n == 0 or n == 1: return n
          return fib(n-1) + fib(n-2)
    

And then the student should realize that fib(n-1) recalculates fib(n-2), so
memoization is in order.

    
    
        def fib(n):
          cache = {0: 0, 1: 1}
          def _fib(n):
            if n in cache: return cache[n]
            cache[n] = _fib(n-1) + _fib(n-2)
            return cache[n]
          return _fib(n)
    

Then the student should realize that modifying every function this way isn't
practical, and should implement a general memoize decorator.
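
Something along these lines would do (a minimal sketch; keyword arguments and
unhashable arguments are ignored):

    
    
        import functools
    
        def memoize(f):
          cache = {}
          @functools.wraps(f)
          def wrapped(*args):
            if args not in cache:
              cache[args] = f(*args)
            return cache[args]
          return wrapped
    
        @memoize
        def fib(n):
          if n == 0 or n == 1: return n
          return fib(n-1) + fib(n-2)
    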

EDIT: Adding table based bottom up fibonacci.

Now, once the student understands top down dynamic programming, as in he can
find the recurrence relations and base cases, it's time for bottom up. As the
name suggests, bottom up starts from the bottom and calculates up to n,
compared to top down, which starts from n and boils down to the base cases.

    
    
        def fib(n):
          vals = {0: 0, 1: 1}
          for i in range(2, n+1):
            vals[i] = vals[i-1] + vals[i-2]
          return vals[n]
    

The student should recognize how top down and bottom up are calculating the
same recurrence relation, but in a different order. The table vals here is the
same as cache above in top down.

Top down is recursive and might trigger the recursion limit. Bottom up doesn't
have that problem. Sometimes, in the bottom up case, the table can be
eliminated, depending on the overlap. But the important thing is, once the
recurrence and base cases are known, it can be implemented quite easily.

In fibonacci's case, the nth number depends only on n-1 and n-2, and
maintaining the whole table is wasteful. The table-free bottom up approach is
better.

    
    
        def fib(n):
          if n == 0 or n == 1: return n
          f0, f1 = 0, 1
          for i in range(n-1):
            f2 = f0 + f1
            f0, f1 = f1, f2
          return f2
    

Fibonacci just happens to be one of the problems used to demonstrate recursion
and dynamic programming. It's small enough for a beginner to comprehend, and
rich enough to illustrate both.

The article picks one recurrence which has a closed form expression. The
dynamic programming problems which I have encountered aren't that easily
reduced to closed form expressions.

Also, I don't know about Graham or Raymond, but Yegge advocates maths for
programmers:

[http://steve-yegge.blogspot.in/2006/03/math-for-programmers....](http://steve-yegge.blogspot.in/2006/03/math-for-programmers.html)

~~~
tareqak
I think you sort of answered your own question. Fibonacci and factorial
functions have closed forms that can be computed more efficiently than
implementing the recurrence relation, both in terms of clock cycles and
developer time. While there are a lot of dynamic programming problems that do
not easily reduce to closed-form expressions, there are a lot that do. Maybe
there are better examples for students to learn from, of similar difficulty
but lacking closed-form solutions. The Fibonacci one still seems useful, since
you might be on a system that doesn't provide the _lgamma_ function. Now, you
might say _lgamma_ isn't obvious, but then the author would have proved his
point.

There are a lot more closed forms of recurrence relations in the math
literature than the two we talked about above, and a good deal of us (me
included) are ignorant/negligent of them (by negligent I mean lazy :P).
However, this sort of knowledge is exactly the kind that, in large enough
occurrences, leads to game changers and serious disruption in industries.

~~~
irahul
> Fibonacci and factorial functions have closed forms that can be computed
> more efficiently than implementing the recurrence relation, both in terms of
> clock cycles and developer time.

It doesn't matter. The purpose of fibonacci is to understand recursion and
dynamic programming. I have never encountered a practical problem where I need
the value of the nth fib. How does it matter that it has a closed form and
takes fewer clock cycles when I almost never need it?

> While there are a lot of dynamic programming problems that do not easily
> reduce to closed form expressions, there are a lot that do.

The practical dynamic programming problems where a closed form exists and
where we need the value rather than the whole series worked out are rare.
Fibonacci is a learning tool, and so is factorial. The practical dynamic
programming problems, viz. longest common subsequence, interleaving,
alignment, travelling salesman, matrix multiplication etc., either don't have
a closed form, or the closed form isn't useful.

"How many ways to change 100 using 1, 5, 10, 20, 50" does have a closed form,
but more often than not, if I encounter a practical variant, the closed form
is useless as the "how many ways" is not interesting, but the actual
combinations are.
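
For instance, a rough sketch of the combinations version (illustrative only):
enumerate the actual ways instead of just counting them.

    
    
        # Enumerate every way to make `amount` from `coins` (order doesn't
        # matter), rather than just counting the ways.
        def change_combinations(amount, coins):
          if amount == 0:
            yield []
            return
          if not coins:
            return
          coin, rest = coins[0], coins[1:]
          for k in range(amount // coin + 1):
            for combo in change_combinations(amount - k * coin, rest):
              yield [coin] * k + combo
    
        ways = list(change_combinations(100, [1, 5, 10, 20, 50]))
        print(len(ways))  # 343 combinations with these denominations
    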

> Maybe there are better examples for students to learn from, of similar
> difficulty but lacking closed-form solutions. The Fibonacci one still seems
> useful, since you might be on a system that doesn't provide the lgamma
> function.

The existence of a closed form is immaterial. That a closed form exists for
Fibonacci doesn't affect learning recursion and dynamic programming.

Also, Fibonacci's closed form isn't defined in terms of lgamma.

> However, this sort of knowledge is exactly the kind that, in large enough
> occurrences, leads to game changers and serious disruption in industries.

I don't have to know the closed form beforehand to find one when I need it.

------
B-Con
Just as an aside, I dual-majored in pure math and C.S. I found the types of
thinking for both subjects to be very, very similar. In fact, finding a
constructive proof for a math theorem was almost like writing code.

I started programming fairly young, about 12, so the "constructive"
mathematical approach was the most natural to me. By my third year at
university it had worn off and I was much more comfortable with things like
existence proofs and thinking in terms of relations rather than constructions
(a very vague description, but it will suffice for now), but programming
always felt like it emphasized the constructive part of mathematical thinking.
It was like I went to a math class and thought one way, then a C.S. class and
emphasized a large subset of that thinking.

------
agentultra
I found the thesis of this essay to be a foregone conclusion.

We know that math is a universal language that, as far as we know, is capable
of describing every phenomenon in the universe. And while _Computer Science_
(if you can even call it a science) has had a storied history with mathematics
it's not unusual to me to see maths used as a tool rather than the foundation
of study. It's the same in many other fields such as physics, engineering,
chemistry and so on -- mathematics is a useful tool and nothing more. In
Computer Science mathematics is a useful language for discovering and
discussing the properties and consequences of algorithms and their
computation.

It stands to reason, then, that if one increases one's knowledge of
mathematics, one will have a larger vocabulary to work with in one's area of
study. If you're studying physics, then the more math you know, the more
complex phenomena you will be able to model and discuss. In Computer Science
you will be able to find efficient applications and optimizations of various
algorithms and model physical and ephemeral phenomena with more rigour and
precision.

However, there is a fairly low upper bound on the amount of mathematical
knowledge required in order to practice programming effectively. That is where
I think the meme "you don't need to know math" comes from in our field. More
often than not, the primary concern of a software developer is to balance two
things: "does it work as it is expected to?" and "can another human being
understand and maintain this?" Having a broader understanding of mathematics
will certainly open your horizons but there's only so much you need to know to
get by. Most of your "work-a-day" programming won't involve much math at all
and is primarily concerned with APIs and implementation issues.

I personally believe that Computer Science certainly needs more rigour and
mathematics should be stressed a little harder in our curriculum. However in a
"hacker" culture rigour is the least of your concerns. The hacker is a
pragmatic creature and heavily leans towards asking the question, "Does it
work?" more often than, "is this the right way to do it?" See the "New Jersey"
school of development (or "Worse is Better" philosophy). As a field I think
we'd benefit from a stronger approach to rigour. We are currently mired in a
populist culture that constantly re-invents the wheel every decade or so. It'd
be nice if we had a common literature and history to draw upon as
part of our standard curriculum.

Interesting article... but I don't think the treatment of lispers is quite
fair. :p

------
zem
i was amused by "calculus (the real kind)" to distinguish it from usages like
"lambda calculus". don't think i've ever seen that one before, though it
parallels usages like "a real doctor (not a phd)" and "anything with a science
in its name isn't".

~~~
Crake
I assumed he meant you shouldn't take things like "Calculus for the liberal
arts major" classes, but now I'm really not quite sure.

------
ekm2
The irony is that the best way to find out whether mathematics is actually
useless for either programming or real-world problem solving is to know a lot
of it. Just staring at it from a safe window and then making catholic
pronouncements does not help.

~~~
alexkus
As someone with both a Bachelors degree in Computer Science and a Bachelors
degree in Mathematics I'd agree.

My CS degree was quite theoretical (completed in 1999) so it wasn't just being
taught the language of the day (we did x86 assembly, Modula-2, Prolog, Lisp,
C++ and Occam).

The Mathematics degree has definitely been useful for programming;
specifically set theory, matrix manipulation and linear algebra, number
theory, and graphs/networks/design, but a lot of it is stuff I'm unlikely to
use in typical programming work. Number theory (and its implications for
asymmetric encryption) is where I'll probably look to continue learning (as
part of a Masters degree) if I find the time.

------
szany
There's also the simple fact that programming literally _is_ math.

<http://ncatlab.org/nlab/show/computational+trinitarianism>

~~~
btilly
Programming is math, in the same way that accounting is.

Doing programming all day will not make you a mathematician.

~~~
szany
_Programming is math, in the same way that accounting is._

If you read the link that's not what it says.

 _Doing programming all day will not make you a mathematician._

Sad truth. For now!

------
edanm
You can talk and argue all you want. But the _fact_ is, many many successful
systems have been built, and are being built, by programmers who couldn't care
less about mathematics. Many of those couldn't care less about programming,
either, to be honest. We here who are reading such discussions are the elite
of programming, and also the small percentage of programmers who _actually
care_ about our craft. Let's not forget that most don't - and it's not a
problem, the world still works.

------
pmelendez
> "If you are a systems and network programmer like Raymond, you can do your
> job just fine without anything more than multiplication and an occasional
> modulus. "

So a systems programmer would never use a tree, a heap, or any non-trivial
data structure? And since when is graph theory not part of math? I am with the
author on this; those statements are fairly simplistic. In my experience, a
strong math background is the difference between an A and a B/C programmer.

------
sunkencity
I think it's possible to come into programming from lots of other disciplines,
one example is Philosophy. Reading code is quite similar to doing argument
analysis.

I think it's also easy to stare oneself blind at low-level stuff. We need
people who have strong skills in the domain the code is meant to address and
who can also code. Having coders in one camp and the "product owners" in
another camp leads to solutions that don't work for either group.

------
ianstallings
Can we at least restrict the word "hack" to using something in a way it was
not intended? It's being thrown around for everything and it's downright
confusing.

------
11001
Didn't go over so well the first time:
<http://news.ycombinator.com/item?id=4796586>

~~~
coolSCV
Certainly it was the missing "the" in the title.

------
frozenport
You don't need to be good to program. You need to be exceptional to do math.
Most programmers aren't that good at what they do, and I think they should
focus on their craft. I struggle to explain to my colleagues who make 100k
that they need to do memory tiling, or that their loop doesn't vectorize.
Changing a double to a float in a BFGS implementation is two days of work for
these people.
Let the chosen few do math.

~~~
11001
>Let the chosen few do math

And then you wonder why an average person hates/is scared of math.

------
erichocean
_They seem to agree on one thing: from a workaday perspective, math is
essentially useless._

I would love it if someone had resources on how mathematics could actually be
used _as a tool_ by programmers.

What kind of problems would mathematics _as a tool_ help me solve
better/faster/more efficiently than the other tools in my programmer toolbox?

~~~
Jacobi
There are plenty. The analysis of basic algorithms involves math (e.g.
QuickSort, binary search, etc.). Programming a user-friendly website can
involve advanced mathematics, for example when you want to recommend to your
users a product that might interest them (collaborative filtering).
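
A toy sketch of the collaborative filtering idea (illustrative only, with
made-up ratings): recommend the item that the most cosine-similar user rated
highly and you haven't seen yet.

    
    
        import math
    
        ratings = {
            "alice": {"book": 5, "film": 3},
            "bob":   {"book": 4, "film": 2, "game": 5},
            "carol": {"film": 5, "game": 1},
        }
    
        def cosine(u, v):
            common = set(u) & set(v)
            if not common:
                return 0.0
            dot = sum(u[i] * v[i] for i in common)
            norm_u = math.sqrt(sum(x * x for x in u.values()))
            norm_v = math.sqrt(sum(x * x for x in v.values()))
            return dot / (norm_u * norm_v)
    
        def recommend(user):
            # find the most similar other user, then suggest their top unseen item
            sims = [(cosine(ratings[user], ratings[other]), other)
                    for other in ratings if other != user]
            best_sim, nearest = max(sims)
            unseen = set(ratings[nearest]) - set(ratings[user])
            if not unseen:
                return None
            return max(unseen, key=lambda item: ratings[nearest][item])
    
        print(recommend("alice"))  # -> "game", borrowed from the most similar user
    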

------
Crake
"Rather, mathematics is a tool for understanding phenomena in the world: the
motion of the planets, the patterns in data, the perception of color, or any
of a myriad things in the world that might be understood better by
manipulating equations."

^---So true.

------
louischatriot
Very interesting thesis. But why did you use the two very specific examples of
the factorial and fibonacci sequence? They don't strengthen your (again, very
interesting) point but rather dilute it.

------
bitanarch
What this article essentially says is one kind of math is better than the
other.

No.

------
darec1
Pointless rant by some math student, or some programmer who feels he is
missing out on math. Unclear on demands, probably wants to teach more math to
kids.

------
peripetylabs
It would be nice if every programmer owned a copy of Abramowitz and Stegun,
and every mathematician knew at least one programming language.

------
minton
"Although braggadocio doesn't come naturally to most computer programmers..."

Ha! Made me laugh.

------
michaelochurch
I'm in the Yegge school, insofar as I think that the mathematical ignorance
and, more generally, the anti-intellectualism of our industry is its downfall.
We see it in the lack of design sense and the awful code that is produced.
VisitorFactory nonsense is something that was invented by people who hated
math and wanted to tear programming away from its mathematical/problem-solving
roots with a bunch of junk complexity that adds nothing.

If you're doing simple things over and over, with the objective function being
minimal rejection/loss rate, the anti-intellectual/industrial get-your-boss-
off-your-back-oriented development works. Complex work is different. You need
to invest a lot of additional energy into elegance and quality that most
companies won't pay for.

However, what we do is so _intensely_ collaborative if it's done right that we
are _teachers_ more than anything else. Not _engineers_ or _programmers_. Our
job doesn't end when we build the thing: we have to teach people how to use
it, so that it's not mindless complexity being inflicted upon them (as is true
of many enterprise products) but something that actually makes their lives
better (most software doesn't, because bosses have a tendency to force people
to use bad software and this makes engineers' lives, and thus products,
worse). It's sad to me that this is an extreme minority viewpoint, and that
almost no software manager on earth will actually _pay_ for things to be this
way.

~~~
frozenport
There aren't enough humans who can do math, and there are more job openings
than intellectuals.

Programming is not necessarily a job for intellectuals, any more than painting
or automotive repair is reserved for intellectuals. Have you heard of the
Ballmer Peak? This stuff we call code ain't that hard.

I don't like your complexity argument. If you are a gear in a watch, you do
not deal with complexity. You are well insulated, dealing with your 1 or 2
immediate neighbors. Most programmers are gears; attempting to deal with the
complexity inherent in your company is a great way to procrastinate, waste
time remaking the build system, and get fired. I am sure we have all seen this
happen.

~~~
nollidge
The Ballmer Peak was a fictional concept in an XKCD cartoon[0]. Hardly
something to use as evidence in an argument.

But even if it were scientifically accurate, alcohol doesn't make you dumber
(at least not at first) - it makes you less inhibited and slower-thinking, two
traits that may actually be desirable in problem-solving.

[0] <http://xkcd.com/323/>

~~~
11001
[http://www.cbsnews.com/8301-504763_162-57413201-10391704/can...](http://www.cbsnews.com/8301-504763_162-57413201-10391704/can-alcohol-make-men-smarter-study-suggests-yes/?tag=contentMain;contentBody)

~~~
nollidge
Coincidentally, my team got first place in pub trivia last night after several
beers apiece.

QED!

------
papsosouid
Am I the only one who was put off by the first paragraph calling Yegge and ESR
"gifted essayists"? Some people genuinely are gifted essayists. People who
write incredibly long-winded essays that jerk wildly to and from seemingly
random topics are not among them. I'd say PG actually does qualify, his essays
are generally well thought-out, and able to convey a concise, clear point. How
does one possibly group PG and Yegge into the same category as writers?

------
nerdfiles
Mathematics, programming, logic: we are looking at, as at the philosophical
beginnings of the 20th century, a question found in Hilbert, Russell, et al.:
is mathematics reducible to a base system of logic? Today, we might see the
cleanest expression of a theoretical framework answering this question: free
logic.

At the same time, programming itself, like mathematics, is not a world of
numbers or any of this, and is in fact the expression of complexity: the
complexity of the sign as written or drafted by the mathematician, itself
expressed in a non-denumerable set. Programming itself depends on the
conventional expressions and idioms of programmers who write actual code.

As Wittgenstein might say “Mathematics consists of calculations, not of
propositions.”

Like so: "Programming consists of executions, not of functions."

