
Alan Kay – Normal Considered Harmful (2009) [video] - e12e
https://www.youtube.com/watch?v=FvmTSpJU-Xc
======
e12e
Just randomly came across this Alan Kay talk that I hadn't seen before. Some
very interesting points on the challenge of real innovation:

"Normal is the greatest enemy with regard to creating the new. And the way of
getting around this, is you have to understand normal, not as reality, but
just a construct. And a way to do that, for example, is just travel to a lot
of different countries -- and you'll find a thousand different ways of
thinking the world is real, all of which is just stories inside of people's
heads. That's what we are too. Normal is just a construct -- and to the extent
that you can see normal as a construct inside yourself, you've freed yourself
from the constraint of thinking this is the way the world is. Because it
isn't. This is the way we are."

~~~
agumonkey
I'm curious about how systems evolve and their tendency to 'normalize'. I,
too, like diversity and originality, but constant newness, as in 'JavaScript
client frameworks', is exhausting .. surely there's a balance, and hopefully
some kind of theory on where to place the center.

~~~
TheOtherHobbes
Devs like to solve problems for their own sake. Building a completely new
anything is a _completely different challenge_ to solving a problem like
"Build a version of existing thing X using language Y and/or running in
environment Z."

The second option is well-bounded and safe. It requires technical skill, not
creativity. It's a legible, comprehensible challenge.

The first option is unbounded and unsafe. It can't be done without creativity,
originality, _and_ technical skill.

I'm becoming convinced there should be a much stronger creative and artistic
element in developer training. Invention and innovation are so much more
valuable than wheel reinvention that turning out people who are only
comfortable with the latter is selling everyone short.

~~~
vezzy-fnord
You can't really teach such a thing, not in any profound way. The programmer
in question must have an intrinsic or otherwise self-conditioned drive to read
computing history and papers, and to be interested in actually doing research
before starting a project.

We have largely crafted a culture where doing research before writing code is
considered slow and ineffectual for whatever reason. Instead, we value "moving
fast and breaking things" and whipping up the quick hack, which encourages
people to propagate their computing biases and never step out of their comfort
zone.

This is one of the reasons why I mostly scoff at attempts to make computer
programming a compulsory schooling subject. Coding as "the new literacy"
devalues computing and is the very embodiment of the programming pop culture
that Alan Kay has warned about.

~~~
agumonkey
Moving fast is valuable sometimes, as is going away to a hammock, as Hickey
would say. What's missing is learning how to alternate between the two modes.
I often thought smart people had that down: they could think in ideals, then
get down through the stack to actually make things, then get back up to the
abstraction without losing focus or getting lost. Quite often I'm stuck at one
level or the other.

> This is one of the reasons why I mostly scoff at attempts to make computer
> programming a compulsory schooling subject. Coding as "the new literacy"
> devalues computing and is the very embodiment of the programming pop culture
> that Alan Kay has warned about.

Especially since the people behind this idea have zero idea of what
programming is. Some want people to learn HTML, which is pretty much empty.

------
talles
"I don't think computing is a real field, it acts like a pop culture, it deals
in fads and it doesn't know its own roots."

Harshly put, but there's some truth there.

~~~
fit2rule
I too think it's very cynical, but extremely accurate.

~~~
coldtea
How is it cynical?

Devs go for fads (just watch the HN homepage over time), there are tons of
snake-oil salesmen pushing their wares (e.g. Mongo), and there are millions of
programmers without basic scientific and engineering rigor.

It would only be cynical if it weren't an extremely accurate description
(which you agree it is).

~~~
fit2rule
It's possible to address the subject without cynicism.

~~~
coldtea
If something is an accurate description, it isn't cynicism.

Cynicism implies that something is in condition X, and the cynic describes it
as a much worse condition Y.

~~~
fit2rule
Actually, that's not quite right.

"Cynicism is an attitude or state of mind characterized by a general distrust
of others' motives."

Perhaps you are conflating the word "cynic" with "critic".

~~~
coldtea
That's just part of the full definition.

And even in that case: it's only "distrust" if others' motives aren't bad in
the first place.

If people in IT generally have bad motives (laziness, profit,
unprofessionalism, etc.), it's not "cynicism" to say so.

It's merely calling a spade a spade.

I'm referring more to the "bitterly or sneeringly distrustful, contemptuous,
or pessimistic" sense of the lemma, though.

Where again, if contempt is warranted and the situation is dire, it's not
cynical to be "contemptuous" or "pessimistic"; it's just a realistic
description.

------
noobermin
It's funny how he chides computer scientists for being ignorant of their
founders. As a physicist, I personally know physicists who know little about
the field's history and care even less. Moreover, I'd argue that in physics
the push toward normal is much stronger than it is in the tech world. Maybe
I'm just thinking the grass is greener on the other side, though.

~~~
agumonkey
Maybe because physics is much more constrained than computing. In the former,
the natural laws are few and strict; in the latter, everybody is free to
invent his own little abnormal world with nothing to push back against it.

~~~
a-nikolaev
The natural laws of computing are much, much stricter and simpler than the
laws of physics.

In physics you have many fundamentally distinct fields, each with its own
models and view of the world: mechanics, thermodynamics, statistical physics,
electrodynamics, optics, quantum physics, and all sorts of fundamental
theories like string theory. This is a very rich system, with many complex
models. Scientists only hope for some unification there, but for the most part
you have to deal with many diverse parts of a huge multi-scale puzzle, and the
pieces don't always fit together nicely.

When you look at computation, at what it really is, a primitive Turing machine
is about all you can hope for. Some researchers push it a little further into
infinite-time computability, but that's not a realistic model of computation
anyway.

What you can do with computation is conditionals, loops, and variable
assignment. Or, in lambda calculus, it's just substitution and name binding.
Even name binding is not really necessary if you express your program with
combinators. So, fundamentally, computation is substitution: a rewriting
system.
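
To make the rewriting view concrete, here is a minimal sketch of
SKI-combinator reduction (in Python; the term encoding and the
reduce_once/normalize helpers are illustrative names of my own, not any
standard API). Repeatedly applying the two rewrite rules is all the
"execution" there is:

    # Terms: a combinator name ('S' or 'K') or an application pair (f, x).
    # Execution is pure rewriting with two rules:
    #   (K a) b      ->  a
    #   ((S f) g) x  ->  (f x) (g x)

    def reduce_once(term):
        """Apply one leftmost-outermost rewrite step; return (term, changed)."""
        if isinstance(term, tuple):
            f, x = term
            if isinstance(f, tuple):
                if f[0] == 'K':                    # (K a) b -> a
                    return f[1], True
                if isinstance(f[0], tuple) and f[0][0] == 'S':
                    a, g = f[0][1], f[1]           # ((S a) g) x -> (a x) (g x)
                    return ((a, x), (g, x)), True
            f2, ch = reduce_once(f)                # otherwise rewrite inside
            if ch:
                return (f2, x), True
            x2, ch = reduce_once(x)
            return ((f, x2) if ch else term), ch
        return term, False

    def normalize(term, limit=1000):
        """Keep rewriting until no rule applies (or we give up)."""
        for _ in range(limit):
            term, changed = reduce_once(term)
            if not changed:
                break
        return term

    # I = S K K, so applying it to any argument rewrites back to the argument:
    identity = (('S', 'K'), 'K')
    print(normalize((identity, 'v')))              # prints: v

No conditionals, loops, or variables in the object language at all; the whole
program is one term being rewritten.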

Computation is simple, but it's still a great model, and you can create
amazing things in it, even though its laws are trivially simple.

~~~
agumonkey
Well, nobody lives at the TM or LC level. I agree that these theories are as
strict as a theory can be. But we stack so many layers that the theory
disappears and it's all politics. It's as if, in the end, programming were
more about running a city than building an engine. And then simplicity feels
like a pipe dream.

~~~
a-nikolaev
Of course no one in their right mind writes code for complex Turing machines,
but don't be confused by all the layers of abstraction.

Programming at any level is fundamentally the same: it's about iteration,
branching, composition of smaller pieces, and packaging smaller pieces into
bigger ones. No matter at what level you write your code, it's always like
this. The differences in APIs are not important.

EDIT: I think concurrency is a bit different in this respect. It really
requires a somewhat different perspective, but again it reduces to simple
basic elements (like the actor model, or the pi calculus) which are reiterated
and reimplemented many times.

------
nly
As much as I agree with the sentiment Alan Kay presents, in this talk and
others, his presentations often feel bogged down in philosophical fluff and
flaky analogy. If you want to see what I mean, mute the video and scan
forward, paying attention only to the slides. It's almost impossible to tell
what this talk is trying to convey from the slides alone. Nothing is concise.
It's very lofty, interleaved with seemingly random stories. If this talk were
given by someone without a name, we'd consider it completely whacked.

The tl;dr:

    * Smalltalk did everything better
    * Software was better in the '60s and '70s
    * Alan Kay really dislikes the web

I'd like to see a lot more talk from his progeny/ilk about modern revivals of
the philosophy of those computing heydays, alongside practical examples of how
we can build modern applications better.

~~~
11thEarlOfMar
I'd agree that there is a pretty long way between him tossing the ball and the
audience catching and running with it. I take his purpose to be to exhort the
viewers to challenge their own thinking and their own purpose, rather than
trying to achieve a specific improvement.

------
bitwize
I'm a bit surprised that Dr. Kay didn't fact-check the boiling frog story;
modern biologists don't believe it has any basis in reality.

~~~
dwmtm
Do you really need to fact-check a metaphor?

~~~
bitwize
I mean, it's a cute story, and the idiom is pretty much indelible in the
language by now, but Kay presented it as if it were a fact about frog biology,
which it is not. He has a background in biology as well as CS, so I am indeed
surprised he didn't check this.

~~~
e12e
Appears to be inconclusive: "None of these modern rebuttals – Melton, Zug, or
Hutchison's – attempts to replicate an extremely slow-heating experiment as
cited by Goltz or Scripture: only Hutchison did a thermal trial at over five
times Goltz's slow rate and over nine times Scripture's 0.002 °C per second
temperature rise." [1]

More interesting, perhaps, is the question of how you get the frog to sit
still -- or maybe change the experiment so there's a grid over the top of the
pan -- and see whether the frog gets more and more "desperate" (flight from
danger, as opposed to just a frog jumping around) as the water slowly heats.

[1]
[https://en.wikipedia.org/wiki/Boiling_frog](https://en.wikipedia.org/wiki/Boiling_frog)

------
mgrennan
This talk describes the problem with software development today: lots of young
people wanting to create the next new thing with no idea what they are
creating.

TOO MUCH DOING, NOT ENOUGH THINKING.

"THINK" is a motto coined by Thomas J. Watson. If you don't know who it was...
Again you fail.

If you want to create the next new thing, go talk to someone who has been
working in IT for 40+ years. Remember, IT works in dog years. That means the
person you are consulting with has 280+ years of experience.

------
marcusarmstrong
I clicked on this talk only to realize, "Wait, that looks familiar." Yeah. I
was at this talk when it was given. Whoops.

------
therealmarv
best usage of "considered harmful" in a title

------
DyslexicAtheist
If I see one more "considered harmful" post/presentation I'm gonna lose my
sh1t, no matter if the author is Alan Kay, Dijkstra, or Wirth.

Maybe I should do a "Using Considered Harmful Considered Harmful".

EDIT: seems somebody already beat me to it:
[http://meyerweb.com/eric/comment/chech.html](http://meyerweb.com/eric/comment/chech.html)

~~~
danbruc
Has already been done.

[http://meyerweb.com/eric/comment/chech.html](http://meyerweb.com/eric/comment/chech.html)

