Alan Kay – Normal Considered Harmful (2009) [video] (youtube.com)
93 points by e12e on Aug 4, 2015 | 52 comments



Just randomly came across this Alan Kay talk that I hadn't seen before. Some very interesting points on the challenge of real innovation:

"Normal is the greatest enemy with regard to creating the new. And the way of getting around this, is you have to understand normal, not as reality, but just a construct. And a way to do that, for example, is just travel to a lot of different countries -- and you'll find a thousand different ways of thinking the world is real, all of which is just stories inside of people's heads. That's what we are too. Normal is just a construct -- and to the extent that you can see normal as a construct inside yourself, you've freed yourself from the constraint of thinking this is the way the world is. Because it isn't. This is the way we are."


Love Alan Kay. Went to a series of talks as part of a week of education for an old employer. We had several very highly qualified people speak to us about various challenges in IT. They were all quite "normal". Alan Kay's talk blew me away because it was so left-field. Interesting guy.



The guy is sickening! Not only did he come across as very likeable, he also dropped in that as a child he had to decide between going into computer science and becoming a professional jazz musician (I think he played guitar).

Choosing between musician and computer genius is bad enough, but to stay likeable on top of that is frankly too much :-)


I'm curious about system evolution and systems' tendency to 'normalize'. I too like diversity and originality, but constant newness, as in 'javascript client framework', is exhausting... surely there's a balance, and hopefully some kind of theory on where to place the center.


Devs like to solve problems for their own sake. Building a completely new anything is a completely different challenge to solving a problem like "Build a version of existing thing X using language Y and/or running in environment Z."

The second option is well-bounded and safe. It requires technical skill, not creativity. It's a legible, comprehensible challenge.

The first option is unbounded and unsafe. It can't be done without creativity, originality, and technical skill.

I'm becoming convinced there should be a much stronger creative and artistic element in developer training. Invention and innovation are so much more valuable than wheel reinvention that turning out people who are only comfortable with the latter is selling everyone short.


You can't really teach such a thing, not in any profound way. The programmer in question must have an intrinsic or otherwise self-conditioned drive to read computing history and papers, and to be interested in actually doing research before starting a project.

We have largely crafted a culture where doing research before writing code is considered slow and ineffectual for whatever reason. Instead, we value "moving fast and breaking things" and whipping up the quick hack, which encourages people to propagate their computing biases and never step out of their comfort zone.

This is one of the reasons why I mostly scoff at attempts to make computer programming a compulsory schooling subject. Coding as "the new literacy" devalues computing and is the very embodiment of the programming pop culture that Alan Kay has warned about.


Moving fast is valuable sometimes, as much as going away in a hammock, as Hickey would say. What's missing is learning how to alternate between the two scales. I often thought smart people had that down: they could think in ideals, then get down through the stack to effectively make things, then get back up to the abstraction without losing focus or getting lost. Quite often I'm stuck at one level or the other.

> This is one of the reasons why I mostly scoff at attempts to make computer programming a compulsory schooling subject. Coding as "the new literacy" devalues computing and is the very embodiment of the programming pop culture that Alan Kay has warned about.

Especially since the people behind this idea have zero idea of what programming is. Some want people to learn HTML, which is pretty much pointless.


It's hard to cultivate a good literary culture until you have a large literate and critical audience. Perhaps the same is true for programming culture.

I agree with your other points though.


Kay draws a distinction between news and new. A new JavaScript framework is news... a person can explain the idea in 5 minutes or so because it is a normal incremental change. It's incremental because it's projecting normal as the future. Evaluating JavaScript frameworks is exhausting because the differences are mostly mundane, not the World Wide Web versus Gopher.


So the difference between "new" and "news" is the difference in the perceived scope? Is this what you've got from Alan Kay's philosophy?


It's the difference between looking at something from a different angle and looking at it from space. It's much harder to go to space than to turn your head (well, it was much harder before anyone did it -- now it's easy).


Yeah, that's what I thought too!


I think it's about inferential distance. "News" is only 1 or 2 inferential steps from common knowledge. "New" is several steps removed, and as such looks alien, crazy, or pointless.


No, and of course not.


Most of those JS libraries are the same MVC/MVP/MVVM done over and over again.


That was the point: you want a certain kind of new, not running around in circles at high speed.


Philosophical question: what happens when "running around in circles at high speed" becomes normal? Will "normal" then become "new"?


I deeply believe that, as Kay said, we are relative and will stop perceiving things after a while, forgetting... anything old will be new, even if it's normal, slow, and more or less sensical. Cycles. Or orbitals, as I like to see them.


> balance

This is a funny word if you think about it. It's the property where several forces on a point are in equilibrium. This definition generalises to very abstract situations, and especially to political ones. Several sides are all pulling (or pushing... the metaphor doesn't really matter) in their own direction, and where the final decision falls is where the powers are in equilibrium. It's the point that is as unsatisfactory for each party as that party is powerless. Sometimes, that's what you want.

Often, balance evokes the image of the halfway point. The balance between killing every Jewish baby and no Jewish baby is not killing half of all newborns. It's also not killing exactly zero babies. If you want a balance between killing all Jewish babies and none, the neo-Nazis still get a say, and you end up having to kill one in every hundred million or so babies. Which is unacceptable. This example is an extremist straw man to illustrate the following point: you're not looking for balance, you're looking to maximise a utility function.

The utility function may be maximised by looking for balance between forces; in fact it often is, or we approach something close to the maximum by finding the balance. However, you mustn't mistake the balance for the utility function. You're not maximising balance. You're looking for a solution to a problem. You want fewer javascript client frameworks? Ok. You still want some javascript client frameworks? Ok. Why?

Because you want to maximise the utility of javascript client frameworks (I assume). This utility might be different for different people, i.e. they may want to do different things with the frameworks, etc. But, I can agree with you that having a constant stream of new ones serves no utility, except the tautological one of "let's have as many frameworks as we can write".

Now we come back to balance. At some rate of new frameworks being made, we balance the amount of work spent on maintaining the old ones against the influx of new ideas that make our work and life easier, and from time to time we completely overhaul and forget an old technology. Which, in turn, maximises our utility of these frameworks. It does look like balance <=> utility, but not quite. This is a balance in the context of a specific utility function. You've defined the utility function as "I want X and Y with preference weights α and β", which looks very much like a linear programming[1] problem, the solution to which is... weighted balance.
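To make that concrete, here's a minimal sketch (toy numbers and weights of my own invention, nothing canonical) of maximising a weighted utility U = alpha*x + beta*y under shared constraints, in Python with scipy. The optimum lands exactly where the binding constraints balance:

    # Toy example: maximise U = alpha*x + beta*y over a shared budget.
    # scipy's linprog minimises, so the objective is negated.
    from scipy.optimize import linprog

    alpha, beta = 2.0, 1.0   # preference weights: "new ideas" (x), "stability" (y)
    c = [-alpha, -beta]      # linprog minimises c @ [x, y]
    A_ub = [[1, 1],          # x + y <= 10   (total effort budget)
            [3, 1]]          # 3x + y <= 18  (new ideas cost more upkeep)
    b_ub = [10, 18]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x)             # [4. 6.]: the point where both constraints bind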

Others would not agree to your utility function. They maybe don't want new ideas so much. Now you're balancing utility functions. But those people still agree that they want to "find the balance". It's just that their equilibrium is not your equilibrium.

So, please don't just throw out "there's balance" like that. Everyone can agree to balance and it's at a different point for different people. It's a word that causes... well, muddled thinking. If you can, avoid using it at all[2]. Get to the root of your problem as much as you can, then say what ails you. "I like seeing new approaches tried, but it's hard to keep up with the influx of largely samey ones that are constantly cropping up. I wish they would slow down." -- or something along these lines, I'm not saying this is what you specifically want.

Sorry for the long rant. I've had to listen to people wanting to balance a lot lately.

[1] https://en.wikipedia.org/wiki/Linear_programming: a type of math problem concerned with finding the point where the given variables all satisfy a set of constraints expressed as linear inequalities and a given function achieves a maximum.

[2] Maybe there is a balance to abusing "balance"...


Well, you actually spelled out my thoughts, so thank you :) I could have been more precise about what I meant, but I wanted to stay abstract, not tied to some trendy example, and take a general view of what structures and shapes forces in a 'new' context and how evolution goes. Maybe it's just an entropy thing blended with the memory limit of a generation that has the same energy as its predecessor but no clear understanding of the state of the art, meaning they will walk the same paths thinking they're new instead of recognizing the old.


Great stuff. This immediately made me think of Steve Jobs and the way he approached -- I was going to say product design, but really everything he did. It's a very Buddhist way of thinking about the world.


"I don't think computing is a real field, it acts like a pop culture, it deals in fads and it doesn't know its own roots."

Harshly put, but there's some truth there.


I too think it's very cynical, but extremely accurate.


How is it cynical?

Devs go for fads (just watch the HN homepage over time), there are tons of snake-oil salesmen pushing their wares (e.g. Mongo), and there are millions of programmers without basic scientific and engineering rigor.

It would only be cynical if it wasn't an extremely accurate description (which you agree it is).


It's possible to address the subject without cynicism.


If something is an accurate description, it's not cynicism.

Cynicism implies that something is in a condition X, and the cynic describes it as a much worse condition Y.


Actually, that's not quite right.

"Cynicism is an attitude or state of mind characterized by a general distrust of others' motives."

Perhaps you are conflating the word cynic with critic.


That's just part of the full definition.

And even in this case: it's only "distrust" if others' motives aren't bad in the first place.

If people in IT generally have bad motives (laziness, profit, unprofessionalism, etc.), it's not "cynicism" to say so.

It's merely calling a spade a spade.

I'm more referring to the "bitterly or sneeringly distrustful, contemptuous, or pessimistic" meaning of the lemma, though.

Where again, if contempt is warranted and the situation is dire, it's not cynical to be "contemptuous" or "pessimistic"; it's just realistic description.


It's funny how he chides computer scientists for being ignorant of their founders. As a physicist, I personally know physicists who know little about the field's history and care even less. Moreover, I'd argue that in physics, the push to normal is much stronger than it is in the tech world. Maybe I'm just thinking the grass is greener on the other side, however.


Maybe because physics is much more constrained than computing. In the former, the natural laws are quite few and strict; in the latter, everybody is free to invent his own little abnormal world with nothing to push back against it.


Computing's natural laws are much, much stricter and simpler than the laws of physics.

Whereas in physics you have many fundamentally distinct fields, each with its own models and view of the world: mechanics, thermodynamics, statistical physics, electronics, optics, quantum physics, and all sorts of fundamental theories like string theory. This is a very rich system, with many complex models. Scientists only hope for some unification there, but for the most part you have to deal with many diverse parts of a huge multi-scale puzzle, and the pieces don't always fit together nicely.

When you look at computation, at what it really is, well, a primitive Turing machine is all you can hope for, really. Some researchers push it a little bit into infinite-time computability, but that's not a realistic model of computation anyway.

What you can do with computation are conditions, loops, and variable assignment. Or in lambda calculus, it's just substitution and name binding. Even worse, name binding is not really necessary if you express your program with combinators. So, fundamentally computation is substitution, a rewriting system.
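That claim fits in a handful of lines. Here's a minimal sketch of beta reduction as term rewriting (my own toy encoding, terms as tuples; naive, with no alpha-renaming, so bound variable names must be distinct):

    # Terms: ('var', name) | ('lam', param, body) | ('app', fun, arg).
    # Computation really is substitution: (lambda x. body) arg
    # rewrites to body with x replaced by arg.
    def subst(term, name, value):
        kind = term[0]
        if kind == 'var':
            return value if term[1] == name else term
        if kind == 'lam':
            _, param, body = term
            return term if param == name else ('lam', param, subst(body, name, value))
        _, f, a = term
        return ('app', subst(f, name, value), subst(a, name, value))

    def beta_reduce(term):
        if term[0] == 'app':
            f, a = beta_reduce(term[1]), beta_reduce(term[2])
            if f[0] == 'lam':                      # (lam param body) applied to a
                return beta_reduce(subst(f[2], f[1], a))
            return ('app', f, a)
        return term

    # Church "true" selects its first argument: (lambda x. lambda y. x) a b -> a
    true = ('lam', 'x', ('lam', 'y', ('var', 'x')))
    print(beta_reduce(('app', ('app', true, ('var', 'a')), ('var', 'b'))))  # ('var', 'a')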

Computation is simple, almost trivial, but still, it's a great model, and you can create amazing things in it, even though its laws are trivially simple.


Well, nobody lives at the TM or LC level. I agree that these theories are as strict as a theory can be. But we stack so many layers that the theory disappears and it's all politics. As if, in the end, programming were more about running a city than building an engine. And then simplicity feels like a pipe dream.


Of course no one in their right mind writes code for complex Turing machines, but don't be confused by all the layers of abstractions.

Programming at any level is fundamentally the same: it's about iteration, branching, composition of smaller pieces, and packaging smaller pieces into bigger ones. No matter at what level you write your code, it's always like this. The differences in API are not important.
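For instance (a toy sketch of my own, nothing deep): the same computation written once as explicit iteration and branching, and once as a composition of smaller pieces. Same fundamental elements, different packaging:

    # Same computation at two "levels".
    def sum_even_squares_loop(xs):
        total = 0
        for x in xs:                 # iteration
            if x % 2 == 0:           # branching
                total += x * x       # variable assignment
        return total

    def sum_even_squares_composed(xs):
        # composition: the loop and branch packaged into one expression
        return sum(x * x for x in xs if x % 2 == 0)

    assert sum_even_squares_loop(range(10)) == sum_even_squares_composed(range(10)) == 120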

EDIT: I think that concurrency is a bit different in this respect. It really requires a somewhat different perspective, but again it reduces to simple basic elements (like the agent model, or pi calculus) which are reiterated and reimplemented many times.


As much as I agree with the sentiment presented by Alan Kay, in this talk and others, his presentations often feel bogged down in philosophical fluff and flaky analogy. If you want to see what I mean, mute and scan forward in the video, paying attention only to the slides. It's almost impossible to tell what this talk is trying to convey from the slides alone. Nothing is concise. It's very lofty, interleaved with seemingly random stories. If this talk were given by someone without a name, we'd consider it completely whacked.

The tl;dr:

    * Smalltalk did everything better
    * Software was better in the 60s and 70s
    * Alan Kay really dislikes the web.
I'd like to see a lot more talk from his progeny/ilk about modern revivals of the philosophy from these computing heydays, alongside practical examples of how we can do modern applications better.


I'd agree that there is a pretty long way between him tossing the ball and the audience catching and running with it. I take his purpose to be to exhort the viewers to challenge their own thinking and their own purpose, rather than trying to achieve a specific improvement.


> If you want to see what I mean, mute and scan forward in the video paying attention only to the slides. It's almost impossible to tell what this talk is trying to convey from the slides alone.

That's why it's a speech and not a slideshow.

(Besides, who said you HAVE to summarize the speech in the slides? You could use them as supplementary material, and have people pay attention to your speech as opposed to PowerPoint BS.)

And it's a generic speech, not him presenting some paper etc. Have you seen his research papers?

> Smalltalk did everything better

Well, compared to most things we have today, especially JS, it really did.

> Software was better in the 60s and 70s

No, but thinking about software was better. We got far more breakthroughs in those days compared to today -- and not just because they were "low hanging fruit".

> Alan Kay really dislikes the web.

Who doesn't? Compared to what it could be?


It is most definitely a "mindset" talk more than a "concrete solutions" talk. That said, bear in mind this talk is from 2009 -- "JavaScript: The Good Parts" came out in 2008. So while what he says about the web might seem entirely redundant today, with the focus on single-page apps, that misses (at least) two things: it wasn't quite that obvious across the field in 2009, and perhaps more importantly, one of the concrete things he does link to, Lively Kernel[1], was already a real (though new, and not 1.0-stable) working system, with persistence via WebDAV and the possibility of solving real problems.

[1] http://www.lively-kernel.org/


> scan forward in the video paying attention only to the slides. It's almost impossible to tell what this talk is trying to convey

I don't find it particularly surprising that not listening to a talk makes it hard to divine what the speaker said. If I didn't already believe it, the tl;dr [for a video?] would be evidence upon which I might come to such a belief.


True, this talk could be said to be one where he points out problems but doesn't show solutions.

Anyway, he has a talk which at least shows some more recent examples of what he considers a better way: Is it really "Complex"? Or did we just make it "Complicated"?: https://www.youtube.com/watch?v=ubaX1Smg6pY

In it he talks about two programming languages, Nile and OMeta, which allow them to solve some problems in very few LOCs (among other things).


I'm a bit surprised that Dr. Kay didn't fact-check the boiling frog story; modern biologists don't believe it has any basis in reality.


You don't need to be a "modern biologist" to test those claims. I just tried once and it worked (I mean, the frog did die).


I was worried about that too, but I'm sure someone with a degree in biology would have figured that out, right? Pretty sure it's meant as a metaphor.


Do you really need to fact-check a metaphor?


I mean it's a cute story and an idiom that is pretty much indelible from the language, but Kay was putting it in terms like it's a fact about frogs' biology when it is not. He has a background in biology as well as CS, so I am indeed surprised he didn't check this.


Appears to be inconclusive: "None of these modern rebuttals – Melton, Zug, or Hutchison's – attempts to replicate an extremely slow-heating experiment as cited by Goltz or Scripture: only Hutchison did a thermal trial at over five times Goltz's slow rate and over nine times Scripture's 0.002 °C per second temperature rise." [1]

More interesting is perhaps the question of how you get the frog to sit still -- or maybe change the experiment so there's a grid over the top of the pan -- and try to see if the frog gets more and more "desperate" (flight from danger as opposed to "just a frog jumping around") as the water slowly heats?

[1] https://en.wikipedia.org/wiki/Boiling_frog


This talk describes the problem with software development today: lots of young people wanting to create the next new thing with no idea of what they are creating.

TOO MUCH DOING, NOT ENOUGH THINKING.

"THINK" is a motto coined by Thomas J. Watson. If you don't know who it was... Again you fail.

If you want to create the next new thing, go talk to someone who has been working in IT for 40+ years. Remember, IT works in dog years; at seven to one, that means the person you are consulting has 280 years of experience.


I clicked on this talk only to then realize, "Wait, that looks familiar". Yeah. I was at this talk when it was given. Whoops.


best usage of "considered harmful" in a title


If I see one more "considered harmful" post/presentation I'm gonna lose my sh1t, no matter if the author is Alan Kay, Dijkstra, or Wirth.

maybe I should do a "Using Considered Harmful, Considered Harmful"

EDIT: seems somebody already beat me to it http://meyerweb.com/eric/comment/chech.html



same goes for "eating your own dog food".



