
The Mathematical Hacker (2012) - ColinWright
http://www.evanmiller.org/mathematical-hacker.html#HN_Repost
======
VLM
Farsightedness. Things that are too close cannot be focused on clearly.

Step back. Let's talk Ohm's law and "electronics people". Some systems-level
folks care only about microwave RF scattering parameters and Smith charts.
Others just use NEC tables vaguely derived from Ohm's law, but all that matters
to them is passing inspection. Some consider Ohm's law a dumbed-down version of
Maxwell's equations for fools. Some, believe it or not, actually use Ohm's law
directly. There is no "right or wrong" other than grouping them all together
as "electronics people".

In a similar way it's pretty dumb to lump language designers, AI researchers,
CRUD frontend devs, and generic code monkeys all under "programmer". The
engineering field went through a spinoff phase over a century ago where we no
longer have "engineers"; we have MechEng, ChemEng, EE, etc. I suspect in a
hundred years it'll be considered humorously quaint that we used to lump all
these vocations together under "programmer" even though they don't fit.

------
stiff
I have spent the last 3 or 4 years self-studying mathematics, and what got me
interested in it was essays by PG, articles by Yegge, and most of all reading
SICP (and then reading "What is mathematics?", which in a lot of ways seemed a
natural continuation of SICP). The argument he builds is really, really weak:

He invents concepts like the "Lisp tradition" out of the blue, based on a
sample of three people, whom he misrepresents and who aren't even serious Lisp
programmers (except PG), and then fantasizes about what he thinks the
"traditions" (my ass, as if old Lisp masters are convincing young adepts not
to learn math or something) represent, so discussing this seriously is beside
the point. Is the average C programmer better at mathematics? If anything, the
books introducing Lisp programming go much further into modelling complicated
domains, including mathematics; compare SICP or PAIP to K&R or your typical
programming intro book nowadays.

Then, he seems to imply that focusing on recursion is somehow less
mathematical. Recursion is a lot like mathematical induction, and inductive
definitions are ubiquitous in mathematics. Also, a lot of mathematics is non-
constructive and mathematicians aren't in general that interested in doing
efficient computations, so it's more about knowing algorithms and numerical
methods. So his big revelation really reduces to the fact that people doing
numerical computations should learn numerical methods, or basically "know your
domain", hardly a revelation. Or is it just a cry for programmers to learn
more mathematics? But then the guys he criticizes have already done a much
better job encouraging people to become interested in mathematics.

~~~
prakashk
> _"What is mathematics?"_

Are you referring to the book authored by Richard Courant
([http://www.amazon.com/dp/0195105192](http://www.amazon.com/dp/0195105192))?

~~~
stiff
Yes, it clearly explains a lot of the mathematical concepts that are the
subjects of some examples in SICP, and it is just as rich in insights. Those
are my two favourite books ever, and studying them over a period of a few
years gave me as much as my whole undergraduate university degree, I think.

------
pmiller2
Previous submissions:

[https://news.ycombinator.com/item?id=4796586](https://news.ycombinator.com/item?id=4796586)

[https://news.ycombinator.com/item?id=4915328](https://news.ycombinator.com/item?id=4915328)

Also:

Refuting “The Mathematical Hacker”:
[https://news.ycombinator.com/item?id=4921953](https://news.ycombinator.com/item?id=4921953)

~~~
ColinWright
Fantastic - thank you. Interesting discussions, and I hope people take the
time to read the article, threads, refutation, and other comments.

I look forward to any additional comments and contributions people may choose
to make here.

~~~
scott_s
Colin, I'm curious about your take on the essay. I have to admit that while I
did read it, I did not engage with it deeply because I felt that the author
created strawman views, particularly with regards to the people mentioned and
the "Lisp school of programming." Since I found the premises so confused, I
didn't have the energy to disentangle the real point from them.

(Part of my curiosity on your take comes from the fact that you commented on a
similarly themed essay I wrote some time back.)

~~~
ColinWright
I've put it on my list of things to write about. It's a long list, but I hope
to knock a few things off it over the holiday. Thanks for asking - I'll post
here if/when I get something done.

------
mtraven
As a Lisp hacker, I am somewhat stunned to learn that the world suffers from
too much Lisp philosophy and that Lisp is unmathematical. Not to mention that
Eric Raymond is a spokesman for Lisp.

Actually there is a somewhat valid and interesting point in that essay,
obscured by crap. There are differences between numerical computing culture
and symbolic computing culture; people should know both. But from where I sit
hardly anyone these days is familiar with symbolic computing while everyone
either claims to be or wants to be a “data scientist”, that is, do numerical
machine learning. Nothing wrong with that, but I have a hard time seeing
numbers as the underdog in this fight.

~~~
xixi77
I think what he argues against is a third culture, where the focus is on
implementation of an algorithm in the code, rather than on the algorithm
itself -- and generally, the algorithm is what matters more. Just a couple of
months ago I saw some code that spent most of its time traversing the entire
(large) data structure and counting elements satisfying certain conditions,
when five minutes of looking at the actual math was all that was needed to see
that the count was a simple function of known quantities.

But, his valid point is indeed obscured by a lot of crap and strawmen :(
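
A contrived sketch in that spirit (hypothetical code, not from the incident
described): counting ordered pairs by walking the whole structure, versus
reading the count off a known quantity:

```python
def count_ordered_pairs_slow(items):
    # walks the whole (possibly large) structure: O(n^2)
    n = len(items)
    return sum(1 for i in range(n) for j in range(n) if i != j)

def count_ordered_pairs_fast(items):
    # five minutes with the math: the count is just n * (n - 1)
    n = len(items)
    return n * (n - 1)
```

Both give the same answer; only the first one needed to touch the data.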

------
crntaylor
The Fibonacci example is flawed, in that the given definition, in terms of phi
= (1 + sqrt 5) / 2.0, is inaccurate even for moderately sized inputs. A
standard O(n) definition for computing Fibonacci numbers is

    
    
      >> let fibs = 0 : 1 : zipWith (+) fibs (tail fibs)
      >> let fib n = fibs !! fromIntegral n
    

The constant-time approximation is

    
    
      >> let fib' n = let phi = (1 + sqrt 5) / 2.0 in round (phi^n / sqrt 5)
    

Around n = 80 the constant-time solution starts giving incorrect results

    
    
      >> fib 80          -- exact O(n)
      23416728348467685
      >> fib' 80         -- approximate O(1)
      23416728348467676

~~~
catnaroek
In all fairness, if there were a type of algebraic numbers, the calculations
would be exact. (But probably not O(1) anymore.)

And of course the sane solution is to use exponentiation by squaring on the
2x2 matrix whose elements are all 1s, except the bottom-right one, which is 0.
Not surprisingly, phi is an eigenvalue of that matrix.

~~~
crntaylor
Indeed! In fact, it occurs to me that you can get something of a 'best of both
worlds' solution between the elegant mathematical solution and the exactness
of the recursive solution very easily in Haskell, by defining the field
extension* Q(√5), i.e. the rational numbers extended with the square root of 5

    
    
      import Data.Ratio
      
      -- The number (a :+ b) represents (a + b √5)
      data Q5 = Rational :+ Rational deriving (Show) 
    
      instance Num Q5 where
        (a :+ b) + (c :+ d) = (a + c) :+ (b + d)
        (a :+ b) * (c :+ d) = (a * c + 5 * b * d) :+ (a * d + b * c)
        fromInteger n       = fromInteger n :+ 0  -- so numeric literals (and phi^0) work
    
      phi = (1 % 2) :+ (1 % 2)
    

It's now enough to notice that the coefficient of √5 in phi^n is half the n'th
Fibonacci number, so define

    
    
      fib n = let a :+ b = phi^n in round (2 * b)
    

to get

    
    
      >> fib 10
      55
      >> fib 20
      6765
      >> fib 80
      23416728348467685 -- exact result in O(log n)!
    

* [http://en.wikipedia.org/wiki/Field_extension](http://en.wikipedia.org/wiki/Field_extension)
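
For readers without Haskell at hand, the same Q(√5) idea can be sketched in
Python with exact rationals (a hypothetical port, with names of my own
choosing): a pair (a, b) stands for a + b√5, and repeated squaring keeps the
whole thing at O(log n) multiplications:

```python
from fractions import Fraction

def q5_mul(p, q):
    # (a + b√5)(c + d√5) = (ac + 5bd) + (ad + bc)√5
    a, b = p
    c, d = q
    return (a * c + 5 * b * d, a * d + b * c)

def fib(n):
    # phi = 1/2 + (1/2)√5; the √5-coefficient of phi^n is F(n)/2
    result = (Fraction(1), Fraction(0))      # 1 + 0√5
    base = (Fraction(1, 2), Fraction(1, 2))  # phi
    while n:                                 # exponentiation by squaring
        if n & 1:
            result = q5_mul(result, base)
        base = q5_mul(base, base)
        n >>= 1
    return int(2 * result[1])
```

Because the arithmetic is exact rational arithmetic, fib(80) comes out exact,
matching the O(n) result above.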

~~~
leephillips
This is mind-blowing. This comment is the best answer I've seen yet to the
question, "Why learn Haskell?".

~~~
colomon
The only reason I haven't actually implemented a number type like this in Perl
6 (someone proposed it to me years ago) is I didn't realize it had such a
lovely and practical application.

~~~
colomon
Props to Haskell, though -- p6 makes it easy to do, but Haskell is definitely
more elegant here. Though maybe if I keep playing with it...

------
ChristianMarks
There is a calculational school of programming in which algorithms are derived
from specifications in a presentation of first-order logic adapted to program
derivation. It does not seem to be popular on HN. _"Programming: the
derivation of algorithms"_ by Anne Kaldewaij is a good source for this. A
derivation of an efficient algorithm to compute Fibonacci numbers is given, as
well as, for me, the most memorable implementation of binary search.

Miller should have used Kaldewaij's O(log(N)) algorithm to compute the n-th
Fibonacci number, given below.

    
    
      def fib(n):
        """Lifted from Kaldewaij."""
        a,b,x,y = 0,1,0,1
        while n != 0:
          if n % 2 == 0:
            a,b = a * a + b * b, 2 * a * b + b * b
            n = n // 2
          else:
            x,y = a * x + b * y, b * x + a * y + b * y
            n -= 1
        return x
    

As Kaldewaij remarks, the algorithm is harder to understand without its
derivation.

Of course there is TAOCP by Knuth, who added a supplement on probability to
the mathematical preliminaries.

Miller's suggestion that programmers should know calculus, statistics and
linear algebra is sound, though perhaps pedestrian. There is no end to the
mathematics one might apply. The article omits logic and dependent type
theory. The Curry-Howard-Lambek isomorphism theorem on the equivalence
between the category of cartesian closed categories and functors and the
category of simply-typed lambda calculi and translations is basic in the study
of functional programming languages. Ramsey's theorem has been applied to
program termination. I myself am interested in applications of game theory--
though I am not giving away my secrets.

~~~
yetanotherphd
That algorithm can be de-obfuscated (see my top level post for the intuition).
The above is equivalent to calculating a power of a 2x2 matrix, and since
matrix multiplication is associative, we can use the squaring trick.

In python (I left out the code for matrix multiplication for brevity) this
becomes

    
    
      def nth_power_assoc(comp, f, n):
        """Reduce n copies of f with associative operation comp (short for composition, not compare)."""
        if n < 1:
          raise ValueError
        if n == 1:
          return f
        if n % 2 == 1:
          return comp(nth_power_assoc(comp, f, n - 1), f)
        else:
          g = nth_power_assoc(comp, f, n // 2)
          return comp(g, g)
    
      def fib(n):
        M = ((1, 1), (1, 0))
        M_n = nth_power_assoc(mmul, M, n)
        return M_n[0][0]
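
For completeness, a minimal 2x2 `mmul` of the kind the sketch above assumes
might look like:

```python
def mmul(A, B):
    # 2x2 matrix product over nested tuples; enough for the Fibonacci matrix
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )
```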

~~~
ChristianMarks
Yes, of course, that's how the algorithm is derived. I have to say I prefer
the "un-de-obfuscated" version, though it wasn't intentionally obfuscated--the
code is the way it is in Kaldewaij's derivation from a matrix multiplication.

------
yetanotherphd
I would say a bigger problem with not knowing the foundations of a field well
is that you can fall victim to the errors of the people who do understand it
and interpret it for you.

In the case of recursion, I think the author does not go far enough. The
problem with recursion is that the fact that a given recursion is O(n) is
usually quite opaque, while for a loop it is obvious. This is a much more
significant concern than the fact that the change of state from one iteration
to the next is implicit, rather than explicit. However, you can avoid both
recursion and mutable state by using map and reduce, e.g.

    
    
      from functools import reduce  # needed on Python 3

      def fibonacci(n):
        initial_state = (0, 1)
        state_change = lambda s: (s[1], s[0] + s[1])
        final_state = reduce(lambda s, _: state_change(s), range(n), initial_state)
        return final_state[0]
    

The problem arises because even though recursion is relied on heavily in the
foundations of mathematics, mathematicians also like to use the weakest axiom
system they can in a given situation. Therefore they use set theory only when
they really have to, and first-order arithmetic (which includes recursion)
most of the time. But computer programming does not usually even need
recursion, and so we shouldn't use it except when we have to.

EDIT: seeing the O(log(n)) solutions, I started to wonder if there is a way to
see this without knowing about numbers of the form a + b * sqrt(5). There is,
and you can also see it clearly from my solution.

By observing that the way I use reduce is just applying a function n times,
you only need to see that the function is a linear transformation on R^2 to
see that the problem is just calculating a matrix power. It is trivial to
compute the matrix power in log(n) steps using the usual tricks.

------
59nadir
Mathematics will take up precisely as much space as it 'needs to' in computer
science. The fact that you're arguing for some kind of effort to get it more
space says to me you're as misguided as the people who say it has no place.
Just let it happen: If it's needed, it's needed. It's not a matter of
principle. It's a matter of practicality.

~~~
coldtea
> _Just let it happen: If it's needed, it's needed. It's not a matter of
> principle. It's a matter of practicality._

There's an underlying premise behind what you suggest, which is that things
turn out optimally by themselves, with no need for any human effort or
supervision in getting them a certain way.

I don't think this idea is true to history.

------
TheLoneWolfling
Except that the "better" solution posted _isn't_ better in many programming
languages.

For example: Python. It has infinite-precision integers. As such, although his
"better" formula starts to diverge from the actual answer at n=71
(308061521170130 versus 308061521170129) while only getting worse from there,
the "naive" solution keeps working - albeit with O(n) behavior.

    
    
      # closed-form (Binet) version: fast, but inexact for large n
      def fib(n):
         return round((pow(0.5 + 0.5 * pow(5.0, 0.5), n) -
                        pow(0.5 - 0.5 * pow(5.0, 0.5), n)) /
                       pow(5.0, 0.5))

      # "naive" iterative version: O(n), but exact
      def fib(n):
          a, b = 0, 1
          for i in range(n):
              a, b = b, a + b
          return a
    

You can do it with the decimal library as well, but you have to adjust
precision on-the-fly to get the required precision.

This is a better solution if you're willing to allow the additional
complexity: [http://willwhim.wpengine.com/2013/04/02/a-fibonacci-a-day-fi...](http://willwhim.wpengine.com/2013/04/02/a-fibonacci-a-day-fibonaccin-in-olgn-steps/)

------
Grue3
He lost me when he claimed that "Lisp culture is hostile to mathematics". I
mean, has he heard of Maxima? Axiom?

~~~
coldtea
I'm not deep in Lisp's culture, but his claim might still very much hold.

The fact that you can name two math-related Lisp projects doesn't mean that
those play a big role in "Lisp culture" in general.

How important are Maxima and Axiom for the average Lisper?

Are they pillars of the Lisp culture or outliers? There very well might be a
thriving 1% math-loving Lisp community and a 99% who are hostile to math.

Those are the right questions to ask, not if counter-examples merely exist.

~~~
lispm
> The fact that you can name two math-related Lisp projects doesn't mean that
> those play a big role in "Lisp culture" in general.

Macsyma, for example, was one of the first big and very useful Lisp
applications, one which brought the mainframes and minicomputers of that time
to their knees.

Macsyma led to: optimizing compilers with good maths performance in Lisp, the
addition of many numeric data types to Lisp (bignums, floats, complex, ratios,
...), Lisp-based workstations, ports of Lisp to many architectures just to run
Macsyma, ... Common Lisp was then standardized with an extensive numeric tower,
which was reused for other languages like Scheme or Haskell. Even today a
free version of Macsyma, called Maxima, is used and maintained.

> Are they pillars of the Lisp culture or outliers? There very well might be a
> thriving 1% math-loving Lisp community and a 99% who are hostile to math.

Many Lisp users apply mathematics. Traditionally either directly with
Mathematics applications (even RPL on the HP calculator stands for Reverse
Polish Lisp) or in application domains like AI (signal processing, image
processing, machine learning, etc etc.).

~~~
catnaroek
Lisp's numeric tower is, if anything, a great example of how _not_ to do
mathematics. It is just too coarse-grained to be useful: it lumps every
imaginable number into a single giant structure. On that basis, you cannot
easily constrain yourself to working in a more specific algebraic structure of
interest. If you need a total order, you are out of luck: the numeric tower
includes complex numbers. If you need to work in a quotient ring, you are out
of luck: the numeric tower includes floats.

Haskell's relatively fine-grained distinctions between numeric types are
nothing like Lisp's numeric tower.

~~~
JadeNB
> Haskell's relatively fine-grained distinctions between numeric types are
> nothing like Lisp's numeric tower.

Ugh; we're not calling out Haskell's numeric hierarchy as a _good_ example,
are we?

(Well, maybe it's a good example of how to handle _numbers_ —although I find
the lack of a positive-integer type a pain—but it's certainly not a good
example of how to handle general algebraic structures, which is what it seems
to pretend to be. What in the world is `sign` doing in there, for example?)

~~~
catnaroek
> Well, maybe it's a good example of how to handle numbers—although I find the
> lack of a positive-integer type a pain—

In another subthread I mentioned the absence of a Semiring type class, which
should be a superclass of Ring (Haskell's Num) and provide addition,
multiplication and fromNatural. So, yeah, I am not entirely happy with
Haskell's standard library either, but the situation in other (general-
purpose) programming languages is even worse.

> but it's certainly not a good example of how to handle general algebraic
> structures, which is what it seems to pretend to be.

How so? The standard library may be flawed, but the core language is perfectly
capable of handling general algebraic structures. The only legitimate pain
point I can see is that a type or tuple of types can only be an instance of a
type class in at most one way, for which the solution (admittedly, rather
ugly) is to use newtype. A non-ugly solution would be to use something like
Standard ML's module system, but it would in no way improve the situation to
move closer to Common Lisp.

~~~
JadeNB
> How so? The standard library may be flawed, but the core language is
> perfectly capable of handling general algebraic structures.

I mean algebraic structures in the mathematical sense you mentioned (rings,
groups, &c.), not in the computer-science sense of algebraic data types.

~~~
catnaroek
> I mean algebraic structures in the mathematical sense you mentioned (rings,
> groups, &c.)

I know you meant those. And I meant those as well.

------
randomsearch
Mathematics is the foundation of Computer Science. The field was created by
mathematicians.

This is why all the schools of Computer Science I have ever encountered teach
maths to their students.

The truth is, you can be a programmer without understanding the theory of
computation, or logic, linear algebra etc.

But you will be a wiser, more competent, flexible programmer if you know the
maths. And there are some things you can only do if you know the maths.

------
raverbashing
This was a necessary post.

I am appalled by the relative "distance" several programmers have in relation
to math.

You can sometimes feel the pain when the person who made something _doesn't
know math_ (especially when it involves math, of course).

But the most many people know is "how to calculate Fibonacci with a recursive
function". Yawn.

Math may be part of the "intelligence" of the business. Putting up a CRUD
interface is unfortunately usually part of the problem, but several companies
have an automated intelligence that is based on a math background (even if you
can use an existing library).

~~~
VLM
"I am appalled by the relative "distance" several programmers have in relation
to math"

If you're familiar with the adage that people unfamiliar with Unix are doomed
to reinvent it, poorly, there is an analogy: people unfamiliar with math are
doomed to reinvent it, poorly. Even if they insist they "don't use math".

An interesting foil the article could have used would have been to bounce
Project Euler off Raymond, Graham, and Yegge. I'm a huge fan of Project Euler.

------
mrcactu5
Wait... this guy is trying to defend the usefulness of math with _Fibonacci
numbers_? I wish him luck with that.

There are elements of math all over computer programming. I like the blog of
Jake Vanderplas - an astrophysicist at U Washington - who liberally mixes
Python and mathematics as applied to his field. jakevdp.github.io/blog/

I just got interested in image processing - one of the branches our blogger
says is specialized. The image processing textbooks use Fourier analysis and
convolutions to get information out of images. This is probably how Instagram
works.

Functional programming is another area that uses mathematics - in this case
category theory - to build an alternative to object-oriented programming. To
the trained eye, tutorials like
[http://twitter.github.io/scala_school/](http://twitter.github.io/scala_school/)
and books like
[http://homotopytypetheory.org/book/](http://homotopytypetheory.org/book/)
have something in common.

Mathematics looks irrelevant to coding because mathematicians are not using
programming. Math requires thinking very hard about the algorithms you are
using, and that is time one could spend coding or drinking with friends.

------
ska
The author is fairly obviously suffering from selection bias when claiming
that "Lisp programmers tend to be ignorant of applied mathematics."

In the past I've been part of the overlapping communities to some degree, and
that wasn't my experience at all.

------
leephillips
I found this article so stimulating when it first came out that it led to me
writing an alternative analysis:
[http://lee-phillips.org/lispmath/](http://lee-phillips.org/lispmath/)

~~~
ColinWright
Now with its own submission:
[https://news.ycombinator.com/item?id=6954218](https://news.ycombinator.com/item?id=6954218)

It would be interesting to see a proper discussion of that, too.

------
thkim
I agree with him that mathematical literacy should be more emphasized for a
"hacker" -- someone who solves a problem. Obviously, mathematical knowledge
comes in handy for any hacker tackling physical phenomena. That does not
mean, however, that mathematics is the most important tool at a hacker's
disposal. I think the author just wanted to say that understanding mathematics
helps in solving a problem, but he went overboard and made it sound like
someone who does not understand mathematics is a second-rate hacker. I don't
believe that hackers are measured by mathematical aptitude.

------
tlarkworthy
There is still tons of drudgery in software engineering (e.g. web scraping),
which makes me believe there are lots more machine learning revolutions to
come. They are all, of course, math-led.

~~~
odonnellryan
Well, web scraping can be fun, too! But it depends on what you know about the
page. If you don't know anything, it's tough. But if you have an idea about
how it's structured... (see below)

[http://www.crummy.com/software/BeautifulSoup/bs3/download/2....](http://www.crummy.com/software/BeautifulSoup/bs3/download/2.x/documentation.html)

~~~
tlarkworthy
I did a probabilistic approach before:
[http://edinburghhacklab.com/2013/09/probabalistic-scraping-o...](http://edinburghhacklab.com/2013/09/probabalistic-scraping-of-plain-text-tables/)
Companies like Skyscanner employ loads of people to keep all the scrapers
fresh.

------
AnimalMuppet
For this discussion, I think it might be useful to separate two ideas.

First, there is the math you need in order to be able to program. This varies
by programming language. (For Haskell, you'd better know some abstract
algebra. For C, all you need is simple arithmetic.)

Second, there is the math you need in order to be able to write a particular
program. This varies by the program you are trying to write. Since math
connects with itself in many ways, you can often solve a problem using
unexpected mathematical techniques, so for these purposes, it's best to know a
lot of them.

Now, it seems to me that Miller's point might be that the Lisp (or perhaps
functional programming) community has to spend more of its effort on the math
knowledge needed just to write programs in their language. This leaves them
less room for learning the broader techniques that would make them more able
to write particular programs.

(Note well: I am not necessarily agreeing with Miller. I am merely stating
what I think he is trying to say. In particular, since math areas relate to
each other, knowing abstract algebra well may in fact turn out to be a help on
the broader math, rather than a hindrance. To some degree, it depends on
whether the average member of the "Lisp community" learns the math needed for
functional programming, and says "Cool, that was useful, I think I'll learn
some more math", or whether they say, "That used up all the time I have to
burn learning math - I guess statistics (or whatever) will have to wait for
next week/year/decade".)

~~~
NAFV_P
> _For C, all you need is simple arithmetic._

I would say _C and its standard library_. For example, suppose you have to
convert an element of argv that contains only numeric characters from a
char[] to an int. Doing this confidently requires a knowledge of power series.
If you have access to stdlib.h, all you do is use strtol() or one of its
little brothers.
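
A sketch of what the manual conversion involves (in Python for illustration
rather than C, and with a hypothetical function name): each digit contributes
one term of a power series in the base:

```python
def parse_decimal(s):
    # "d2 d1 d0" encodes d2*10**2 + d1*10**1 + d0*10**0: a power series in 10
    value = 0
    for ch in s:
        digit = ord(ch) - ord('0')
        assert 0 <= digit <= 9, "non-numeric character"
        value = value * 10 + digit
    return value
```

strtol() does essentially this, plus sign handling, base selection, and
overflow checking.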

------
jhuni
> Lisp programmers tend to be ignorant of applied mathematics.

As a Lisp programmer and a mathematician myself I must disagree. I spent years
working on mathematical applications using Lisp dialects, initially drawing
inspiration from computer algebra systems like Axiom, Derive, Maxima, and
Reduce and later drawing inspiration from ontologies and knowledge bases like
SUMO and Cyc. All of these systems are heavily math-based, and all of them are
built upon Lisp.

------
lispm
> Lisp programmers tend to be ignorant of applied mathematics.

Please don't get your knowledge from some more or less useful bloggers.

Raymond a Lisp user? What has he done in Lisp? Yegge? Mostly an Emacs user
with some Emacs Lisp usage. Unfortunately his Lisp knowledge mostly ends with
Emacs Lisp. Yegge would not recognize a Lisp even if you hit him over the head
with the Lisp Machine Manual. Only Graham was and is a real Lisp user and
hacker. OTOH there are plenty of Lisp-and-mathematics users.

Actually if there is a language with Mathematics background, then it is Lisp.

People like McCarthy, Minsky, Sussman are/were mathematicians.

The MIT Lisp hacker culture had lots of mathematicians from Joel Moses to Bill
Gosper.

Lisp spawned symbolic mathematics which has been used in all kinds of areas
with thousands of applications.

Macsyma, Axiom, Reduce, Derive and several other Lisp written maths
applications have been used in many applications. The whole AI domain is based
on all kinds of diverse mathematics written in Lisp: logic, statistics,
probability theory, neural networks, signal processing, ...

Even students who are reading SICP usually complain that most of the examples
are math-related.

Let's see some actual Lisp applications:

* [http://www.lispworks.com/success-stories/netfonds-primetrade...](http://www.lispworks.com/success-stories/netfonds-primetrader.html) Real Time Stock trading. Applied maths.

* [http://www.lispworks.com/success-stories/inspiredata.html](http://www.lispworks.com/success-stories/inspiredata.html) Data visualization. Applied maths.

* [http://www.lispworks.com/success-stories/brs-mle.html](http://www.lispworks.com/success-stories/brs-mle.html) Machine Learning. Applied Maths

* [http://www.lispworks.com/success-stories/memetrics-xos.html](http://www.lispworks.com/success-stories/memetrics-xos.html) Marketing Analysis: Applied Maths.

* [http://www.lispworks.com/success-stories/raytheon-siglab.htm...](http://www.lispworks.com/success-stories/raytheon-siglab.html) Signal processing: Applied Maths.

* [http://www.franz.com/success/customer_apps/mechanical_cad/mt...](http://www.franz.com/success/customer_apps/mechanical_cad/mtu.lhtml) Isogeometric Analysis of aircraft engines. Applied Maths.

* [http://www.franz.com/success/customer_apps/data_mining/pepit...](http://www.franz.com/success/customer_apps/data_mining/pepite-DM.lhtml) Predictive analysis: applied maths.

and many many others.

It's just that these Lisp users tend to publish their works not as blog posts,
but in domain-specific journals and/or as applications.

~~~
discreteevent
Looks like he was wrong on the Lisp front then. But actually all your examples
kind of strengthen one of the points of the article (for me): that if we want
to get things done, make a difference, then we should be focusing on applied
maths and not on the lambda calculus and category theory. Only a very few
people are needed to focus on those things. For the rest of us, a focus on
those things means taking our eye off the ball, to the detriment of what we
work on (as an industry).

I like your point about the blog posts. If you want to find out how to really
make things, they're often not where it's at.

~~~
sixbrx
I couldn't disagree more strongly. I think systems built on the lambda
calculus are building nothing less than a new form of constructive
mathematics, taking it squarely out of philosophy and into the realm of
practicality. Theorem provers will only get stronger, and they will change a
lot about how mathematics is done.

I do finite element methods myself, and Rust has helped a lot in writing
allocation-free code (meaning allocation-up-front, really), and it has
benefited a lot from functional/induction-centric languages.

------
NAFV_P
> _But I think that most programmers who are serious about what they do should
> know calculus (the real kind), linear algebra, and statistics._

What does he mean by calculus of the real kind? I parsed that as referring to
calculus of real variables, as opposed to complex analysis. One of the reasons
that complex algebra is taught in colleges and universities is that real and
complex algebra are related. Gerolamo Cardano ran into it around 450-odd years
ago, so it isn't exactly an obscure area of maths.

~~~
AnimalMuppet
I believe that he meant real and/or complex calculus, as opposed to lambda
calculus.

------
lispm
> But rarely in these discussions will you find relevant mathematical
> considerations. If the goal is to compute a Fibonacci number or a factorial,
> the proper solution is not a recursive function, but rather knowledge of
> mathematics.

THE introduction to the Lisp culture, SICP, discusses exactly this on page 35
(first edition).

See also:

[http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-11.html...](http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-11.html#%_sec_1.2.2)

------
muyuu
One cannot frame someone's position on a topic from a quotation or two, and
then have the ironic audacity to call that person myopic.

------
graycat
No, the OP mostly fails to 'get it'.

First, beating up on Lisp is next to irrelevant to the role, potential or
current, of math in business or computing.

Second, the OP fails to make a solid case for the power of math to make a
powerful, valuable contribution to the solution of real problems for which we
do believe we need software. In particular, in spite of the claim of the OP, I
doubt that linear programming did much if anything to defeat Germany in WWII;
why? Because both the Kantorovich Nobel prize for his work on the
'transhipment' problem and Dantzig's work on his simplex algorithm came after
WWII was over.

Third, the OP's recommendation for math for applications is too meager; what
he listed is all good, but we need more to be as successful as we should wish.

Here's a simple description of the potential of math for 'information
technology' (IT) business, i.e., the money making kind: We have a lot of input
data and want to process it, that is, manipulate it, with software to generate
output data that is valuable in business, the money making kind.

So, a question is how to process the data? We want powerful means that will
convert the input data to valuable output data; did I mention valuable in the
sense of business, the money making kind?

Well, the traditional means in business IT has been to take some work that is
well understood as manual work, that is, work that in principle could be done
manually, and then write software to do the corresponding data manipulations.
Nearly always those means were just intuitive and obvious.

So, what IT did with the input data was nearly always just intuitive, i.e.,
what was well understood from manual efforts. Some of the big examples were
bookkeeping, accounting, payroll, order entry, inventory control, materials
requirements planning (MRP), customer relationship management (CRM), airline
and hotel reservations. Here, again, we have means well understood manually,
and IT wrote software to do essentially the same thing.

So, we are taking input data and manipulating it to get valuable output data.
So, for more valuable output data, we want more powerful means of manipulating
the data.

So, where do we find the more powerful means? E.g., we are flying airplanes
and want to get the work done, that is, meet the published schedule, but at
minimum cost. It turns out we can do much better, saving maybe 15% of direct
operating costs, where just the fuel savings alone are finger lick'n good,
with integer linear programming set partitioning, i.e., quite a lot of applied
math, including one of the early and main leads to the question of P versus
NP.
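
Crew scheduling of this kind is the classic set-partitioning problem: each
candidate "pairing" (a sequence of flight legs one crew can fly) has a cost,
and we must pick pairings so every flight is covered exactly once at minimum
total cost. Real systems use integer linear programming; a brute-force Python
sketch over hypothetical data shows the structure:

```python
from itertools import combinations

# Hypothetical flight legs and candidate crew pairings: (legs covered, cost).
FLIGHTS = frozenset(["F1", "F2", "F3", "F4"])
PAIRINGS = [
    (frozenset(["F1", "F2"]), 500),
    (frozenset(["F3", "F4"]), 450),
    (frozenset(["F1", "F3"]), 400),
    (frozenset(["F2", "F4"]), 420),
    (frozenset(["F1", "F2", "F3", "F4"]), 1000),
]

def best_partition(flights, pairings):
    """Brute-force set partitioning: cover each flight exactly once, min cost."""
    best = None
    for r in range(1, len(pairings) + 1):
        for combo in combinations(pairings, r):
            covered = [f for legs, _ in combo for f in legs]
            if len(covered) == len(set(covered)) and set(covered) == flights:
                cost = sum(c for _, c in combo)
                if best is None or cost < best[0]:
                    best = (cost, combo)
    return best

cost, chosen = best_partition(FLIGHTS, PAIRINGS)
```

Brute force is exponential, which is exactly why this problem became one of
the early leads into P versus NP; production schedulers attack it with
branch-and-bound integer programming instead.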

Another example? Okay, how to climb, cruise, and descend an airplane for
minimum cost within a given time window. There are various approaches, but a
start is deterministic optimal control theory, which is some quite good and
advanced applied math.
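
The real problem is continuous optimal control, but a toy discrete analogue
(all numbers made up) shows the flavor via dynamic programming: pick an
altitude for each flight segment, paying fuel per segment plus a cost per unit
of climb or descent, starting and ending at the ground:

```python
# Toy discrete trajectory optimization by dynamic programming.
# Hypothetical data: higher altitude burns less fuel per segment,
# but each unit of altitude change costs CLIMB.
FUEL = [10, 7, 5, 4]   # fuel per segment at altitudes 0..3
CLIMB = 3              # cost per unit of altitude change
SEGMENTS = 5

def min_cost_profile(fuel, climb, n_seg):
    """DP over (segment, altitude); aircraft starts and ends at altitude 0."""
    n_alt = len(fuel)
    # cost[a] = min cost to finish the current segment at altitude a
    cost = [climb * a + fuel[a] for a in range(n_alt)]  # climb from ground
    back = []
    for _ in range(n_seg - 1):
        prev, step, cost = cost, [], []
        for a in range(n_alt):
            b = min(range(n_alt), key=lambda p: prev[p] + climb * abs(a - p))
            step.append(b)
            cost.append(prev[b] + climb * abs(a - b) + fuel[a])
        back.append(step)
    # descend back to the ground after the last segment
    last = min(range(n_alt), key=lambda a: cost[a] + climb * a)
    total = cost[last] + climb * last
    profile = [last]
    for step in reversed(back):
        profile.append(step[profile[-1]])
    return total, profile[::-1]

total, profile = min_cost_profile(FUEL, CLIMB, SEGMENTS)
```

With these numbers the climb cost outweighs the fuel saved at the top
altitude, so the cheapest profile cruises at altitude 2 throughout; the
continuous version replaces this grid with differential equations.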

E.g., once the FedEx Board wanted some revenue projections. A mess-up had the
representatives of crucial Board member General Dynamics (GD) pack their bags
and get plane reservations back to TX, which would have killed FedEx. Here's
what pleased the GD guys and saved FedEx: We know the present revenue, and we
know the revenue at our planned target market. So, the question is how to
interpolate between these two. So, let t denote time, say, in days, y(t) the
revenue at time t, and b the revenue per day when we are serving the whole
planned market. Then make a 'virality' assumption, that is, that the rate of growth in
revenue is directly proportional to (a) the number of current customers
talking and (b) the number of potential customers listening (or the
corresponding revenue). With y'(t) the calculus first derivative of y(t), that
is, y'(t) is the rate of growth in revenue, in dollars per day, we have that
there exists some constant k so that

    
    
    y'(t) = k y(t) (b - y(t))

There is a closed form solution via freshman calculus, and the result is a
lazy S curve that rises asymptotically to the full market revenue b. So, with
this bit of math, there is only one guesstimate, k. So, draw some curves for
various values of k and from other considerations pick a reasonable value of
k. Or estimate k from the data on the growth so far. Done. The GD guys stayed,
and FedEx was saved. True story.
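
This is the standard logistic equation; separating variables gives the closed
form y(t) = b / (1 + ((b - y0)/y0) e^(-kbt)), which is exactly the lazy S curve
described above. A small Python sketch (the constants are hypothetical, not
FedEx's numbers) checking that the formula satisfies the ODE and rises to the
asymptote b:

```python
import math

def logistic(t, y0, b, k):
    """Closed-form solution of y'(t) = k*y*(b - y) with y(0) = y0."""
    return b / (1.0 + ((b - y0) / y0) * math.exp(-k * b * t))

# Hypothetical numbers: $1/day revenue now, $100/day at full market.
y0, b, k = 1.0, 100.0, 0.001

# Check the ODE at t = 5 days with a central finite difference.
t, h = 5.0, 1e-6
deriv = (logistic(t + h, y0, b, k) - logistic(t - h, y0, b, k)) / (2 * h)
y = logistic(t, y0, b, k)
```

In practice one draws this curve for several values of k, or fits k to the
growth data so far, just as the story describes.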

There are many more examples, and many more to be found and executed.

Net, the role of the math is to find more powerful means of processing the
input data and, thus, yield more valuable output data, valuable in the sense
of business, the money making kind.

For the relevant math, there is nearly no limit. Fortran, Lisp, C, recursion,
etc. have essentially nothing to do with the value or role of the math. The
math comes before the software development; it tells the programmers what to
program, that is, what data manipulations to do. The data manipulations can be
programmed in nearly any programming language, although some languages are
much easier for the programmer than others.

