

The Lambda Calculus: What is it and why should you care? - api
http://zeroturnaround.com/rebellabs/what-is-lambda-calculus-and-why-should-you-care/#!/

======
api
Total tangential point: I noticed something while reading this otherwise good
explanation. I can't speak for others, but for me it explains why I often had
a tough time with math in school and possibly also why pure functional
languages have had slow uptake.

The thing I hate about math (and mathematical functional coding by proxy) is
_how it is explained_.

This article contains an absolutely perfect example. Go to the heading
entitled "Church Numerals," which explains how numerals are declared as
functions. Above we've been introduced to lambda function notation, which is
fine. Now he throws this at you:

0 = λs.λz. z          (the function s is applied to the argument z zero times)

1 = λs.λz. s z        (the function s is applied once)

2 = λs.λz. s (s z)    (the function s is applied twice)

 _What the bleep is 's'?!?!?_

I spent several minutes trying to figure that out. What is s? (bang head on
table) What is 's'? (bang head on table more)

OOOOOOOHHHH... there it is in _tiny f'ing print_ :

"(the names s and z are short for successor and zero)"

It's in tiny print so you won't see it, because knowing what symbols mean
evidently is not important to understanding them.
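For what it's worth, the same definitions read far less cryptically as executable code, where s and z are just parameter names (a sketch in Python, my own translation; `to_int` is a helper I'm adding to decode the result):

```python
# Church numerals: the numeral n is a function that applies s to z, n times.
zero = lambda s: lambda z: z        # s applied zero times
one = lambda s: lambda z: s(z)      # s applied once
two = lambda s: lambda z: s(s(z))   # s applied twice


def to_int(n):
    """Decode a Church numeral by feeding it a real successor and a real zero."""
    return n(lambda x: x + 1)(0)


print(to_int(zero), to_int(one), to_int(two))  # prints: 0 1 2
```

Once you know s means "successor" and z means "zero", the encoding is almost trivial -- which is exactly the point about explaining symbols first.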

Mathematicians do this habitually. Every math class I attended in college
would consist of a mathematician who threw symbols and notations at you,
proceeded to do stuff with them, and then _maybe_ got around to defining them
after he'd already expected you to get it. Maybe. If he felt like it.

It's like teaching a foreign language by beginning with poetry reading,
progressing to literary theory, and then oh by the way -- as an unimportant
aside -- visiting the topic of words' meanings and their grammar.

Why do mathematicians do this? Are they working to make their subject
impossible to explain? Or is it some weird artifact of how the
mathematically-inclined think -- do they think nonlinearly, and so fail to
grasp that people learn things by building up ideas hierarchically?

If math were taught by beginning with symbols and operators and their
conceptual meaning and then progressing to their use, I think people would
have far less of a problem with it. It would be like learning any other
language, followed by learning the "literature" that is expressed in that
language.

Edit: I sometimes see programming languages explained this way too. Some code
will be thrown at you, followed by some higher-order language concepts, and
then oh by the way the symbol % means ____ in this language. The example, of
course, was utterly incomprehensible prior to that little detail.

~~~
DanBC
> Why do mathematicians do this?

My other FU-money idea is to fork Wikipedia, take the 100,000 most important
STEM articles, and give them i) rigorous fact checking and ii) an introductory
paragraph that explains the concept to anyone. (At least, anyone with an IQ of
about 100 who's prepared to put a bit of work in.)

EDIT: This could be tricky, because we sometimes lie when introducing a
subject just so that people can get their heads around it. See, for example,
the speed of light, and then refraction.

~~~
api
"Okay, we're going to start with a simplified version of this concept first.
Keep in mind that the reality is somewhat more difficult to grasp, but
grasping this simplified and idealized version will help you to eventually
grasp the real deal."

No problem with that.

But out of order explanations are unforgivable. I did get better in math when
I figured this out and learned to find the meaning of the symbols first. But
believe it or not, sometimes this wasn't easy. Sometimes I would encounter
some esoteric mathematical notation and have to spend quite a bit of time
digging through books or searching the net to even find a place where _anyone_
bothered to explain what it meant.

By "what it meant" I mean what it meant _conceptually_. A bunch of jiu-jitsu
of symbol manipulation is completely worthless without a conceptual
understanding of what that manipulation is actually _doing_.

My first aha here was with calculus. After struggling really badly in high
school and then college, someone finally told me that a derivative was a rate
of change. After that, I completely got it and had no further problems with
derivatives, integrals, or differential equations. It all made perfect sense.
I also completely understood why calculus was invented, and its historical
importance. The book did say a derivative was "rise over run," which I suppose
is a maximally-obtuse way of saying "rate of change," but it never clicked.

How in the world do these people perpetuate their discipline?

But back to functional programming. I taught myself to program when I was four
years old. It was BASIC on the Commodore VIC-20. In retrospect I think it was
easy for me even as a 4-year-old because everything was explained in terms of
"this statement does X," and then the more complex variant of "this statement
makes the next statement conditional on..." I got that because that's how
things work. Things work by, umm, doing stuff with stuff.

I think this is why procedural languages are so much easier to grasp.
Procedural languages are explained in terms of what they do with stuff, which
is how people think because that's how things happen in the real world.

Functional languages on the other hand tend to be introduced in this obtuse,
theoretical, mathy way, with symbols and expressions thrown out before their
function or conceptual meaning is adequately explained.

Introduce FP like this: language, semantics, and meaning first, then concepts,
then immediately dive into _doing things_ with it. Try that and I think FP
will get more popular.

