
It’s Just Math - raganwald
http://raganwald.posterous.com/its-just-math
======
zackzackzack
First comment: the link to recursiveuniver.se is broken as far as I can tell.
Simple typo it looks like. [Edit: this has been fixed.]

Second comment: the more I work with mathematicians and computer programmers,
the better a sense I get for how long something will take. If the problem is
in the normal range of difficulty, I can usually estimate within a few hours
how long it will take me to make something from scratch.

I think the regime you are referring to is one filled with the hard problems
that fall outside the normal range. The ones where you have no idea how to
make something work. There are no books that give you a rough recipe for what
to do. It's just you and the computer trying to make sweet sweet music with
your ideas.

Whenever I run into a problem like that, I switch my mindset over from
mathematician or programmer and turn into an explorer and pirate with
laughably low standards. I copy and try every idea I can think of. Oh, that
one guy did something that looked like it might work? I steal the idea
wholesale and see
if it solves my problem. No such luck? Time to explore everything. There is no
nook or cranny that gets left unexplored. Every other line is a printf or
console.log. I need to see everything that the computer is doing. I will try
anything once and forgive anything twice if it shows promise.

Normal projects are just you doing some programming. Hard problems are you
doing the footwork and experiments to understand what exactly is going on and
then doing some programming at the very end.

~~~
raganwald
Oh absolutely! If programming is like architecture, you aren’t always building
the first suspension bridge to connect South Africa to Antarctica, most of the
time you’re designing a collection of single family dwellings for a tract
builder.

I was just riffing off the observation that software development is a
behavioural science, not a formal or empirical science. I hoped that this
example, while clearly ridiculous (which client uses terms like “super-linear”
but simultaneously demands that the unknowable be known by Friday?) would
provide a chuckle while making the same point.

Love the description of your approach. Speaking of sweet music, rock on!

------
davedx
I've been working on splines all day.

The client's requirement: to move the control point of the front of a series
of splines, in such a way that it interpolates between the N nearest splines,
where the nearest splines are filtered at the opposite end of the point the
user is manipulating. Basically, you move your control point where you want,
but the other end of the curve moves as little as possible. When you stop
moving it, it interpolates smoothly to the spline whose equivalent control
point is closest to the one you were moving.
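The selection-and-blend part of that requirement can be sketched in a few lines (all names and the simple linear blend here are my illustrative assumptions, not davedx's actual implementation): pick the neighbour whose far-end control point is closest, then ease the dragged spline's control points toward it.

```python
# Toy sketch: pick the spline whose far-end control point is closest
# to the dragged spline's far end, then blend control points toward it.
# Names and the linear blend are illustrative assumptions.

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_spline(splines, far_end):
    """Each spline is a list of control points; index -1 is the far end."""
    return min(splines, key=lambda s: dist2(s[-1], far_end))

def blend(src, dst, t):
    """Linearly interpolate every control point from src toward dst."""
    return [tuple(a + t * (b - a) for a, b in zip(p, q))
            for p, q in zip(src, dst)]

# After the user releases the dragged spline, ease it toward the
# neighbour whose equivalent far-end control point moved least.
dragged = [(0.0, 0.0), (1.0, 2.0), (3.0, 1.0)]
others = [[(0.5, 0.2), (1.2, 1.8), (3.1, 0.9)],
          [(4.0, 4.0), (5.0, 5.0), (9.0, 9.0)]]
target = nearest_spline(others, dragged[-1])
halfway = blend(dragged, target, 0.5)
```

In a real UI the blend factor `t` would be animated over time to get the "interpolates smoothly" behaviour.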

I had no idea how long this would take, but I wasn't at all surprised that
it's taken me all day just to write an implementation. Tomorrow I need to test
it. It's so complicated that it's actually impossible to tell if it's working
just by looking at it. My brain feels fried.

It's times like these when you hope your project manager will be sympathetic
:)

~~~
slindsey
I think I'm a good developer, then I read a post like this and think, "It
would take me a week just to fully understand the problem. He's unhappy with
an implementation, if not a perfect one, in a day?"

~~~
davedx
Ah I didn't start the project from scratch today. I've been working on it for
about a week so far. This is a new problem that means changing a lot of the
code base, but it's not a new application at this point (thank god!).

------
orbitingpluto
Axiom: The reasonableness of a client or employer's expectation of
performance is _DIRECTLY_ (oops) proportional to his or her ability to
accomplish the task.

Corollary: If a client or employer's expectation requires you to bend
spacetime to accomplish the task in his or her set timeframe, yes, he or she
is dumb as a brick.

2nd Corollary: Need coffee.

~~~
bthomas
Don't you mean directly proportional? More ability => more reasonable
expectation

~~~
orbitingpluto
Thank you. (I was thinking of amount of time to accomplish task, not
reasonableness.)

------
skylan_q
"To an approximation, asking when a non-trivial program will be done is just
like asking when a theorem will be proven."

Thank you for this. ;)

~~~
loceng
This whole blog post should be posted to <http://www.devsigh.com> :)

------
jakejake
Any client that talked to me like that better be paying serious bucks. Like
"early retirement" amounts of cash. Sometimes programmers need a push and need
to be motivated, but unless this was paraphrased for dramatic effect, I found
the tone very insulting.

------
jwco
“Design a two-dimensional cellular automata such that we can create a self-
replicating pattern.”

What clients might require cellular automata, and automata such as these?

<http://en.wikipedia.org/wiki/Rule_30> and
<http://en.wikipedia.org/wiki/Cellular_automaton> tell me they have
applications in cryptography and random number generation. Are there any other
areas where cellular automata are applied?
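Rule 30, for reference, fits in a few lines; each new cell depends only on its old left, centre, and right neighbours (this sketch uses wrap-around edges for simplicity):

```python
# Minimal Rule 30 elementary cellular automaton (a sketch; Wolfram's
# rule number 30 = 0b00011110 encodes the new cell for each of the
# eight 3-cell neighbourhoods, which reduces to the formula below).
def rule30_step(cells):
    n = len(cells)
    # new cell = left XOR (centre OR right), with wrap-around edges
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

# A single live cell grows into the famously chaotic triangle; the
# centre column is irregular enough that it has been used as a
# pseudo-random number source.
row = [0] * 7
row[3] = 1
for _ in range(3):
    row = rule30_step(row)
```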

~~~
UK-AL
I think this was made up; just to prove a point.

------
daenz
Why do you let your client talk to you like that?

~~~
keithlaforce
This is what I was thinking.

------
drostie
The big problem people don't seem to understand is that what you were taught
in maths class _wasn't mathematics_. We should rename the popular K12 class
"calculations", except for the one called "geometry", which I suppose should
be called "formal geometry" or "Euclid class" or something like that.

The issue is the scope of the course. I know plenty of professional
mathematicians. None of them can calculate a tip. The way to excite their
mathematical minds is to drag out the Cosmic Wimpout dice, play it with them
until they understand the rules, and then ask them, "so, it's obvious that
strategy doesn't matter much, but what's the optimal strategy?" Bam, they're
writing Markov diagrams and fussing over the fact that one of the dice has a
wildcard which you can technically choose to be anything in certain rolls, so
which one should it be, and so forth.

The most accessible and iconic piece of plain mathematics that I know is the
Fundamental Theorem of Calculus. In Math class in high school, you may have
been taught it like this: "it's very complicated and rests on some very
difficult theorems, and trust us, the mathematicians have proved it." But no.
That's not mathematics.

I occasionally get the opportunity to talk to high school students and college
freshmen about calculus, not in any formal setting, but just one-to-one. I
tell them that the fundamental theorem of calculus is about + and −. The − is
confusingly called a "derivative" and the + is confusingly called an
"integral", but the fundamental theorem just says: + undoes −, and − undoes +,
and that's it.

They look at me funny. Suppose we define a sequence (F₀, F₁, F₂, ...), then
define ΔF as a new sequence with (ΔF)ᵢ = Fᵢ − Fᵢ₋₁ (taking (ΔF)₀ = F₀), and
ΣF as a new sequence with (ΣF)ᵢ = F₀ + F₁ + ... + Fᵢ.

Then it's not too hard to see that Δ(ΣF) = F, in other words, − undoes +. And
it's not too hard to see also that Σ(ΔF) = F, because if you look at it:

[Σ(ΔF)]ᵢ = Fᵢ − Fᵢ₋₁ + Fᵢ₋₁ − Fᵢ₋₂ + ... + F₁ − F₀ + F₀.

And those cancel in pairs, leaving just Fᵢ.
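You can even check the telescoping mechanically (a throwaway sketch, with Δ defined so that the first entry of ΔF is just F₀):

```python
# Discrete versions of the fundamental theorem: Δ (difference) and
# Σ (running sum) undo each other. We set (ΔF)[0] = F[0], matching
# the lone + F₀ term in the telescoping sum.
def delta(F):
    return [F[0]] + [F[i] - F[i - 1] for i in range(1, len(F))]

def sigma(F):
    out, total = [], 0
    for x in F:
        total += x
        out.append(total)
    return out

F = [3, 1, 4, 1, 5, 9, 2, 6]
```

For any such F, `sigma(delta(F))` and `delta(sigma(F))` both give back F.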

Σ of course forms the basis for a Riemann sum and Δ forms the basis for a
difference quotient, and we take limits because the product rule says:

δ(fg) = (f + δf)(g + δg) − fg = g δf + f δg + δf δg

...and we want to say δf δg → 0 to keep the algebra pretty, because if δf and
δg are one in a million, g δf and f δg are a million times larger than δf δg,
and so on until δf δg is negligible by comparison.
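The "one in a million" claim is easy to check numerically at a single point (a quick illustration, not part of the original argument):

```python
# For small perturbations, the cross term δf·δg is dwarfed by the
# first-order terms g·δf and f·δg, which is why it drops out in the
# limit. A numeric check at one point:
f, g = 2.0, 3.0
df = dg = 1e-6          # "one in a million" perturbations

exact = (f + df) * (g + dg) - f * g   # the true change in fg
first_order = g * df + f * dg         # the terms we keep
cross = df * dg                       # the term we discard

# cross is about a million times smaller than first_order, so the
# first-order terms account for essentially all of the exact change.
```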

I kind of lost my point, so let me return to it and say that being able to see
∫dx as + and d/dx as −, being able to _feel_ these patterns, that is actual
mathematics. Being able to write out the matrix transform of complex
multiplication and just seeing, "oh wow, that's a scaled rotation matrix!" and
the implication -- "wow, complex-differentiable functions must preserve
angles!" -- that sort of thing is mathematics.

I think it would be nice if we taught this sort of thing in a mathematics
class, but in the meantime, I'd prefer if we just insisted that "math class"
doesn't teach mathematics until you get toward college.

~~~
twelvechairs
Mathematics isn't necessarily solely about what professional mathematicians do
in the contemporary world. Saying that 'calculations' and 'geometry' aren't
'mathematics' is like saying that atoms aren't chemistry.... ridiculous.

~~~
wtallis
Memorizing the periodic table does not make you a chemist. It won't even
suffice to pass chemistry 101. High school math amounts to barely more than
memorizing the periodic table.

~~~
krschultz
I took 10 semesters of 'math' in college, was on a 'math' team in high school,
and got a 780 on the 'math' SATs. Yet by the OP's definition, I don't know any
_Math_ because I'm not doing his definition of _Math_.

That's like arguing that you can't swim unless you are an Olympic swimmer. I
can't do the back stroke like Michael Phelps, but I sure can swim.

High school math - at least for most - includes Calculus 1. Yes, that involves
memorizing integrals and using them later. But if you are going to be that
flippant about it, you can basically marginalize everything. Organic
chemistry? Meh, that's just a bunch of memorizing compounds, it is not more
difficult than 1st grade memorization of multiplication tables. Anatomy? Ha,
that's child's play, it's just like memorizing the names of colors.

~~~
sold
I think the poster is arguing against the conflation. Organic chemistry is not
memorizing compounds; mathematics is not memorizing integrals; anatomy is not
like memorizing colors. There is a great difference between understanding and
memorizing.

I had a very bad history teacher. On most tests, he required the class to
learn about 20 different dates (only) and some facts like where a peace treaty
was signed. Nobody cared what those events meant, what the connections
between the dates were, etc. Now, many years later, I don't recall anything. I
learned nothing. Was I really learning history or just arbitrary numbers? You
could claim that it was real history, and probably be technically correct, but
I think that's silly. In my mind, the semantics of what I knew did not
resemble history at all. In contrast, my chemistry teacher visibly required
understanding, and to this day I can discuss basic chemistry, even though I
have not exercised those skills for a long time.

It's the same with mathematics: learning algorithms essentially means you are
programmed to do mechanical work, just like I was acting as a lookup table
during history classes. "Real" mathematics starts when the problem you are
doing is not directly amenable to a mechanical approach.

~~~
wtallis
Exactly. In the case of mathematics, if you aren't doing proofs yet, you don't
even _know_ what math is. Most high school math curricula are completely
devoid of proofs or abstract reasoning with the exception of at most a few
months during geometry. How many college students have wrongly expected
"Modern Algebra" to be a rehash of what they learned in high school? The
distinction between _mathematics_ and what gets taught in high school is very
similar to the distinction between _computer science_ and _computer
programming_.

------
psb217
Super-linear complexity is trivially achievable for all computable functions.
Perhaps you intended to say sub-linear (i.e. o(n))?

------
Someone
_“Design a two-dimensional cellular automata such that we can create a self-
replicating pattern.” I’m a smart guy, maybe not as smart as John von Neumann,
but smart enough to come up with a 500 by 500 pattern in a 29-state automaton
that works._

Hm, what about "light goes on if a light is on in a neighboring cell"? Not the
most interesting, but it does self-replicate, if at least one light is on in
generation zero.
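Concretely, that rule in one dimension (a toy sketch; I'm using a wrap-around row for simplicity):

```python
# The trivial rule: a cell lights up if any neighbour is lit, and a
# lit cell stays lit. On a 1-D row this just floods outward from any
# seed, which is the "replication" being proposed.
def flood_step(cells):
    n = len(cells)
    return [cells[i] | cells[(i - 1) % n] | cells[(i + 1) % n]
            for i in range(n)]

row = [0, 0, 1, 0, 0]
row = flood_step(row)   # the single light spreads to both neighbours
```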

~~~
raganwald
Great question!

This is analogous to asking, “How does life as we know it differ from rust?
They both grow..."

There is such a thing as trivial replication, the trick is non-trivial
replication. I don’t have the background to give you the formal definition,
but my layman’s understanding is that non-trivial replication works from a
description of a thing and a general-purpose factory that makes things from
descriptions.

I think the formal definition says something about the information encoded in
the thing that is replicating.

A fundamental question mathematicians and computer scientists asked
themselves in the first half of the 20th century was whether it was possible
for a factory-plus-description to be encoded in the description. 70 years
later, we know the answer, but at the time it wasn't obvious how this could
be done: if the factory requires space F for itself and a description of the
factory requires space d(F), then a factory bundled with its own description
occupies F + d(F), and a description of that bundle, d(F + d(F)), seems
intuitively larger than the description d(F) you started with.

This seemed to go on ad infinitum, so how could you make a description such
that when handed to the factory, the factory would make another factory AND
copy the description so you would end up with a new factory plus a new
description? Smart people figured out how to do this theoretically with things
like the lambda calculus.
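The escape from the infinite regress is self-reference: the description doesn't contain a literal copy of itself, it contains instructions for reproducing itself, used once as code and once as data. A quine makes this concrete (the standard trick, not something from the post):

```python
# A quine: a program that outputs its own source. The "factory" (the
# print logic) and the "description" (the string s) don't nest
# infinitely, because s is used twice: once as code and once as data,
# via %r (repr) and the escaped %%. This is the same fixed-point idea
# as the lambda-calculus constructions. Ignoring these comments, the
# two lines below print exactly themselves.
s = 's = %r\nprint(s %% s)'
print(s % s)
```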

John von Neumann was interested in self-replicating robots. Stanislaw Ulam
had been working with CAs and suggested von Neumann try building a
self-replicating CA along these lines, and von Neumann showed this could be
done with a 29-state
CA. This led to John Horton Conway’s radical simplification that we call
“Life” today, and it has been both proven and demonstrated that Life is a
universal computing machine. You can build self-replicating patterns that have
encoded descriptions, universal Turing machines, hexadecimal calculators, and
just about everything else.
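The rules of Life itself fit in a dozen lines; here is a minimal sketch (using a set of live-cell coordinates, which is my choice of representation), enough to watch a glider carry itself across the grid:

```python
# Minimal Conway's Life step on an unbounded grid, with the live
# cells stored as a set of (x, y) pairs. A sketch, not optimized.
from collections import Counter

def step(live):
    # Count live neighbours of every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next generation with exactly 3 neighbours, or
    # with 2 neighbours if it was already alive.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A glider: after 4 steps it is the same shape shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
```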

Some of the most amusing oddities actually “output” numbers and letters
organized as dots as if printed by a dot matrix printer!

Warning: Life is addictive but unremunerative.

~~~
colonel_panic
"Warning: Life is addictive but unremunerative."

That's how I feel about lowercase life so far.

------
gte910h
Mathematics is not science, it is a tool for doing things, including science.

~~~
lurker14
<https://en.wikipedia.org/wiki/Mathematics#Mathematics_as_science>

~~~
gte910h
Yes, and geology was once called philosophy. Doesn't mean it's a science in
the way we use science now.

------
earldcox
I should just point out, contrary to the really true-to-life dialog, that
mathematics is NOT a branch of science.

You don't apply the scientific method to establish a law in mathematics, you
prove that it is true.

This confusion is one of the four pillars that comprise the Law of Unintended
Consequences.

~~~
raganwald
I have always understood that mathematics is a _formal_ science, whereas the
scientific method applies to what we call _empirical_ sciences. Then there are
things like “Management Science,” which are considered “Bodies of Knowledge”
and are unlike both formal and empirical sciences. The theoretical side of
Computer Science is a formal science.

I won’t say what psychology or psychiatry are. Depending on who you ask,
they’re empirical sciences, bodies of knowledge, or cargo cult medicine.
Software development seems to be all over the place. Some of it clearly relies
on computer science, some of it seems to be a body of knowledge.

~~~
sopooneo
Though "formal science" is not a familiar term to me, I take your word for it
and hear what you are saying. However, I always considered mathematics to be
its own thing, different from science. Real science is physics and chem, and
then biology starts to merge more into a body of knowledge.

Anything that has "science" in the title isn't really. And in line with that,
I consider "computer science" to be a misnomer as it is more accurately
categorized as a branch of mathematics. And computer science is, of course,
completely distinct from software development, which is a craft.

All that said, I would be hesitant to put psychology and psychiatry in the
same boat. Neither one is a hard science, but psychiatry is much less likely
to be anyone's fall-back major.

