
The Deep Insights of Alan Kay - mythz
http://mythz.servicestack.net/blog/2013/02/27/the-deep-insights-of-alan-kay/
======
thebear
Thanks for this post; the importance of Alan Kay's work can hardly be
overstated. Here's another quote by him that I find interesting:

 _I invented the term Object-Oriented and I can tell you I did not have C++ in
mind._

Whether or not you share the disdain for C++ that is implicit in this quote, I
believe it is important to understand the difference between the way OO was
originally conceived of and the way C++ interprets it. I believe the best way
to understand the difference is to look at object-variable binding. Smalltalk
and almost every other OO language that followed it uses _reference semantics_
: the assignment

x = y

causes the variable x to release its binding to whatever object it is
currently referring to, and to refer to the object that y is referring to
instead. C++, on the other hand, is pretty much the only OO language that uses
_value semantics_ : the intended meaning of the assignment

x = y

is to copy the state of the object to which y is referring over to the object
to which x is referring. The older and wiser I get the more I am inclined to
believe that this attempt to marry OO and value semantics was an experiment
that has failed. Even if you do not believe this, you'd have to admit that the
collective amount of time and effort that the C++ community has spent on
dealing with the assignment operator is simply staggering. (I would have made
that last statement even two years ago, and now we have rvalue refereces.)
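
The distinction can be sketched in Python, which (like Smalltalk) uses reference semantics for assignment; C++-style value semantics has to be emulated with an explicit copy:

```python
import copy

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

a = Point(1, 2)
b = Point(3, 4)

# Reference semantics (Smalltalk, Python, Java...): assignment rebinds
# the name, so both names refer to the same object afterwards.
x = a
x = b                 # x releases its binding to a and refers to b
x.x = 99
assert b.x == 99      # the mutation is visible through b
assert a.x == 1       # a is untouched

# Emulating C++ value semantics: assignment copies the state of the
# source object into a fresh, independent object.
x = copy.copy(a)      # roughly what C++ `Point x = a;` does
x.x = 7
assert a.x == 1       # the copy is independent of a
```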

~~~
alephnil
The object model of C++ comes almost entirely from Simula rather than from
Smalltalk. Simula's object model is less dynamic, and the language itself is
based on Algol rather than being built from scratch like Smalltalk. In many
ways it is Simula's object model, not Smalltalk's, that has been the most
influential, through C++ and later Java and C#.

~~~
skrebbel
Can't be. Simula wasn't invented by Alan Kay.

------
grinich
I've admired Alan Kay's work for years, but I wish he had another Jobs/Apple
to turn his ideas into products. His work on the GUI was put into the Mac,
Smalltalk directly influenced Objective-C, and the Dynabook is clearly a
predecessor to the iPad. I don't think VPRI is a similar vehicle for new work.

It's a really tough dichotomy: people who invent world-changing things very
rarely have the skills (or desire) to spend years polishing and shipping. I
think almost all of academia is a great example of this, where invention is
praised over innovation.

I'm not saying this is bad. Researchers excel at doing research, and should
focus in that domain. But papers and conference talks don't change the world.
It takes _real_ products, which are the result of substantial non-idea work.
Even Doug Engelbart's famous demo took decades to actually build at scale.

One of the most important things about PARC may have been that it was a
pairing of research-like projects with real world engineering accomplishments.
For several years, I've felt like Google is on the verge of this as well.
Glass may be their first shipping experimental product, so we'll see.

It'd be interesting if Alan took a research position at Google. Hal Abelson
(co-author of SICP, founder of Creative Commons, prof at MIT, etc.) spent a
sabbatical there and shipped App Inventor for Android, which essentially took
ideas from the Scratch programming environment and made them work for Android.
Something like Smalltalk on top of App Engine could be really awesome.

    
    
        When the Mac first came out, Newsweek asked me what I [thought] 
        of it. I said: Well, it’s the first personal computer worth 
        criticizing. So at the end of the presentation, Steve came up 
        to me and said: Is the iPhone worth criticizing? And I said: 
        Make the screen five inches by eight inches, and you’ll rule the world.
    
        -- Alan Kay

~~~
slurgfest
Why would it help to have Smalltalk on an opaque, proprietary server platform
which absolutely locks in every project that is coded for it?

~~~
grinich
Where on the web can you run Smalltalk right now? Where is the Heroku/Rails
for Squeak?

And before you write it, I'll save you the time. Here's the typical HN
response to this question:

    
    
        Just go get an EC2 instance and apt-get a bunch of shit and 
        then figure out how to use a CLI to push your code there and 
        add framework dependencies and learn Postgres and and configure it 
        for DNS and oh and you probably need a credit card. Not that 
        hard, srsly guys. Also, Dropbox is just git with a shiny frontend.
    

There is an unfathomably _monumental_ difference between not being able to do
something at all and doing it in a way that is constricted and limited from
the perspective of an expert. The difference is most dramatic in terms of
learning, exploration, experimentation, and imagination.

Lego Mindstorms was proprietary and I loved the shit out of that-- probably
was the primary reason I dove into science and engineering. And to your point,
everything I built with Lego was (literally) locked into the Lego world. But
by the time it actually mattered, I had moved to a legit machine shop. But
those plastic blocks laid the foundation for construction. That's what I
really want Alan Kay's work to do: lay a new foundation for teaching people
how to think about computation and symbolic manipulation with computers.

------
david927
Alan is the most important person alive in Computer Science, showing us
-loudly- the way forward, and he's largely ignored. I don't know of a more
damning indictment than that.

~~~
chc
I agree that Alan Kay is brilliant, but to be completely fair, if Squeak is
the way forward, I think I'm happy where I am. It's not _bad_ , but the
tradeoffs are such that it's hard to imagine ever using it for production
software.

~~~
grinich
Ever use Objective-C? It's pretty much Smalltalk with C backwards
compatibility.

~~~
chc
I've been using Objective-C for about 12 years now. It is similar to Smalltalk
in its message system and naming conventions — which, to be fair, were at the
core of Kay's Smalltalk vision — but otherwise the experience of programming
in Objective-C is not much like programming in Squeak.
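
The shared core is the message send: a selector plus arguments that the receiver may handle however it likes, including trapping unknown selectors. A rough Python sketch of that dispatch model, using `__getattr__` as a stand-in for Smalltalk's `doesNotUnderstand:` (and Objective-C's forwarding machinery):

```python
class MessageReceiver:
    """Every 'call' is really a message: a selector plus arguments."""

    def greet(self, name):
        return "hello, " + name

    # Analogue of Smalltalk's doesNotUnderstand: and Objective-C's
    # message forwarding -- a hook that fires for unknown selectors.
    # (__getattr__ is only consulted when normal lookup fails.)
    def __getattr__(self, selector):
        def handler(*args):
            return "doesNotUnderstand: " + selector
        return handler

obj = MessageReceiver()
assert obj.greet("world") == "hello, world"
assert obj.fly() == "doesNotUnderstand: fly"   # unknown selector is trapped
```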

~~~
nzeribe
That is no bad thing, because programming in Squeak absolutely blows
programming in Xcode away.

------
podperson
I admire Alan Kay as much as the next guy, and a lot of the stuff in this
article looks great, but he doesn't need his lily gilded:

* Laser Printers

Yes -- invented at Xerox PARC.

* Object Orientated Programming / Smalltalk

No. OOP was invented in Norway (Simula) / Yes

* Personal Computers

Um... really? No.

* Ethernet / Distributed Computing

Yes Xerox PARC. / Not really

* GUI / Mouse / WYSIWYG

No (Engelbart et al) / No (Engelbart et al) / Yes

See:
<http://en.wikipedia.org/wiki/History_of_the_graphical_user_interface>

~~~
mythz
See: <http://www.parc.com/about/> and
<http://en.wikipedia.org/wiki/PARC_(company)>

There is no question that Alan did invent the term "Object Oriented
Programming" (and was rewarded with a Turing award for his work) and also
invented Smalltalk, which was well known to be inspired by Lisp, Simula, Logo,
and Sketchpad.

The GUI, overlapping windows on a bitmap screen, the WYSIWYG editor, etc. --
i.e. the personal computer as we know it today -- were first created at PARC,
which later influenced Apple and whose work Apple commercialized.

~~~
podperson
Naming something isn't inventing it.

Kay's Turing award was "For pioneering many of the ideas at the root of
contemporary object-oriented programming languages", which might or might not
include _inventing_ "Object Oriented Programming" -- and, as it happens, it
turns out not to.

See: <http://amturing.acm.org/award_winners/kay_3972189.cfm>

Key quote from the article:

"While working on FLEX, Kay witnessed Douglas Engelbart’s demonstration of
interactive computing designed to support collaborative work groups.
Engelbart’s vision influenced Kay to adopt graphical interfaces, hypertext,
and the mouse."

Incidentally, Doug Engelbart is still alive and a Turing award winner:

<http://amturing.acm.org/award_winners/engelbart_7929781.cfm>

"Engelbart slipped into relative obscurity after 1976 due to various
misfortunes and misunderstandings. Several of Engelbart's best researchers had
become alienated from him and left his organization when Xerox PARC was
created in 1970."

History of OOP, including Simula 67:
<http://en.wikipedia.org/wiki/Object-oriented_programming>

Go look at the "Mother of All Demos"
(<http://en.wikipedia.org/wiki/The_Mother_of_All_Demos>), which introduces
the mouse, a MacDraw-like drawing program, and a mouse-driven word processor.
This is the invention of the GUI: it not only predates Xerox PARC, but
students from the lab that did this work went on to work at Xerox PARC. (And
the Xerox PARC mouse was a clumsy, expensive, and unreliable device very
similar to the crude device from the older demo -- the modern mouse was
actually invented by an engineer contracted by Steve Jobs.)

"the personal computer as we know it today was first created at PARC"

The personal computer as we know it today was invented by a lot of people over
a period of time, although it is usually ascribed to Apple, Altair, IMSAI,
etc. -- not Xerox. Xerox added a GUI to their $10,000 "personal computer" and
that is a feature of today's personal computer, but so are scalable fonts
(btw: that's Donald Knuth who deserves as much fanboi love as Alan Kay) and
web browsers, neither of which were created by Xerox.

~~~
nessus42
At MIT we had perfectly usable and reliable mice on Lisp Machines long before
the Mac or Lisa existed. It is true that they were not mass-produced, however.

~~~
podperson
Interesting: when did the Lisp Machine actually ship as a product? And did it
have a mouse from day one? According to the Wikipedia article it looks like
something may have shipped in 1980 for $70,000 per unit...

~~~
podperson
@nessus42: Thanks for the info. Another question is how intrinsic the mouse
was to use of the machine and did it have more than one button? Part of the
genius of the Mac design was doing with one button what Xerox did (badly) with
three. (Yes, other companies later added more buttons all over again, but the
Xerox UI, which I used briefly, required using different buttons for distinct
operations in a way that, once you'd used a Mac mouse, made no sense at all.)

~~~
nessus42
The mouse was completely intrinsic to the Lisp Machine. Though I'm sure that
most people would consider the GUI rather ugly and primitive by today's
standards. Think Emacs for X11.

The mouse was a three button mouse. The way it worked certainly made sense to
me at the time, but it was made for nerds. Apple certainly did a ton of great
work to make personal computers usable (and affordable) by normal people.

------
espeed
_To me, one of the nice things about the semantics of real objects is that
they are “real computers all the way down (RCATWD)” – this always retains the
full ability to represent anything._

I remember 15 years ago when Philip Greenspun (<http://philip.greenspun.com>)
first introduced me to the idea that a Web service is like an object -- it may
seem obvious today, but at the time it completely changed my Web view:

 _The challenge is in realizing that the Web service itself is an object. The
object has state, typically stored in a relational database management system.
The object has methods (the URLs) and arguments to those methods (the inputs
of the forms that target the URLs). The engineering challenges of Web
development are (a) coming up with the correct data model for the object
state, (b) coming up with a correct and maintainable organization of URLs, and
(c) defining the semantics of each URL. By the time an individual page is
constructed, the engineering challenge is over and it doesn't really matter
whether you build that script in a simple language (e.g., Perl or Tcl) or a
complex powerful language (e.g., Common Lisp or Java)._

<http://philip.greenspun.com/wtr/aolserver/introduction-1.html>

Philip also discusses this idea in chapter 13 of his classic book, "Philip and
Alex's Guide to Web Publishing" (<http://philip.greenspun.com/panda/>).

But as Alan Kay points out, this RPC-type model still hasn't evolved to a
point where you can pass around self-contained objects that are RCATWD.
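
Greenspun's mapping can be sketched in a few lines of Python. Everything here (the `BookmarkService` name, the dict standing in for the RDBMS, the URL table) is hypothetical, just to make the state/methods/arguments correspondence concrete:

```python
class BookmarkService:
    """A web service as an object: state in a store, methods as URLs,
    form inputs as the arguments to those methods."""

    def __init__(self):
        self.db = {}          # object state (an RDBMS in a real deployment)
        self.next_id = 1

    def dispatch(self, url, **form):
        # the URL organization *is* the object's method table
        handler = {"/add": self.add, "/get": self.get}[url]
        return handler(**form)

    def add(self, title, href):
        bid = self.next_id
        self.db[bid] = {"title": title, "href": href}
        self.next_id += 1
        return bid

    def get(self, id):
        return self.db[int(id)]   # form inputs arrive as strings

svc = BookmarkService()
bid = svc.dispatch("/add", title="PARC", href="http://www.parc.com")
assert svc.dispatch("/get", id=str(bid))["title"] == "PARC"
```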

------
gnosis
There's really no way to adequately summarize such a varied and content-rich
article. You should really read it in full for yourself. But here are the
section headings:

* On Software Engineering

* On Object Orientated Programming

* Tear it down and build something better

* On Messaging

* On LISP

* The Unknown side of Alan Kay

------
malbs
Anything I've ever read that Alan Kay has written either inspires me greatly
or depresses me deeply. His "point of view is worth 80 IQ points" quote is
something I live by. It applies to _every_ aspect of life. Then you listen to
him talk about computer science, computing in general, and what the future
might hold, and you end up wondering why we're using languages today that are
essentially prettied-up versions of languages 40 and 50 years old (sweeping
generalisation!)

~~~
malbs
reading that back, damn, the "depresses me deeply" is a terrible exaggeration,
making light of anyone who actually suffers from genuine depression.

~~~
talaketu
come now... not all depression is clinical.

------
qznc
If "message passing between objects" is so great, then why don't those
distributed object frameworks (CORBA, (D)COM) rule the world? Bad
execution?

~~~
mythz
CORBA/DCOM/SOAP-RPC are examples of RPC-based services, i.e. exactly what Alan
is advocating against.

Here's my definition of a message-based service:
<https://github.com/ServiceStack/ServiceStack/wiki/What-is-a-message-based-web-service%3F>

Here are some of the advantages:
<https://github.com/ServiceStack/ServiceStack/wiki/Advantages-of-message-based-web-services>

Here is the difference between WCF RPC methods and a message-based
ServiceStack service: <http://www.servicestack.net/files/slide-35.png> (i.e. 6
RPC methods vs 1 Msg service)

Here is the difference between WebApi RPC methods vs an equivalent message-
based ServiceStack service: <http://www.servicestack.net/files/slide-34.png>
(e.g. 5 RPC vs 2 msg ones)
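
The contrast can be sketched in Python. This is not ServiceStack's actual API -- the `GetCustomers` message and its fields are made up for illustration -- but it shows the shape of the trade: several RPC methods collapse into one self-describing message and one handler:

```python
from dataclasses import dataclass
from typing import Optional

# RPC style needs one method per query shape:
#   get_customer(id), get_customer_by_email(email), get_customers(ids), ...
# Message style: one request message whose populated fields describe
# the query, and a single handler that interprets it.

@dataclass
class GetCustomers:                 # the message *is* the service contract
    ids: Optional[list] = None
    email: Optional[str] = None

CUSTOMERS = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
]

def handle(msg: GetCustomers):
    """Single entry point: filter by whichever fields the message carries."""
    result = CUSTOMERS
    if msg.ids is not None:
        result = [c for c in result if c["id"] in msg.ids]
    if msg.email is not None:
        result = [c for c in result if c["email"] == msg.email]
    return result

assert handle(GetCustomers(ids=[2]))[0]["email"] == "b@example.com"
assert len(handle(GetCustomers())) == 2          # no filters: everything
```

Adding a new query shape then means adding a field to the message, not a new service method.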

------
codemac
> The ARPA/PARC history shows that a combination of vision, a modest amount of
> funding, with a _felicitous context and process_ can almost magically give
> rise to new technologies that not only amplify civilization, but also
> produce tremendous wealth for the society.

[...] felicitous context and process [...]

I am interested in the process he's referring to that PARC used.

I think software engineering as a discipline is too young to have decent
processes figured out, as is evident with the takeover of the "agile/scrum"
process (I don't mean this in a bad way, but before this it was either _NO_
process or a bug tracker with dates in it).

~~~
vajrabum
He's thinking process at a higher level than you are. He means process in
terms of funding, choice of vision and who chooses to commit people and
resources. It's his contention that ARPA in the early to mid 60s had a
funding/research process that really worked. The nut of it was to pick
productive people with vision and get out of their way and let them drive.

------
vidoc
While there's no question that some of Alan Kay's achievements are very
important, I cannot help thinking that our industry's tendency to
surrender to 'fanism' (e.g. Douglas Crockford, Linus Torvalds) is
ridiculous.

~~~
mythz
Learning the insights of the brilliant minds in history is not 'fanism'; it's
benefiting from and building on their experiences, and is what Alan refers to
when he says: "Point of view is worth 80 IQ points"

In his talk he uses Leonardo da Vinci, who despite being one of the smartest
people in existence wasn't able to invent a motor for any of his machines, a
feat accomplished by a less capable Mr Ford, who did it en masse.

Not learning from history is one of the reasons why the state of the industry
is where it's at today.

~~~
vidoc
Don't get me wrong, I'm not contradicting that, nor am I advocating not
learning from history. I just think that stuff like this:

"The Deep Insights of Alan Kay", "Alan is the most important person alive in
Computer Science"

is over the top.

~~~
mythz
I never said "Alan is the most important person alive in Computer Science"
(though he's obviously a contender)

And I have no regrets over the title "The Deep Insights of Alan Kay"...

Alan has generated a wealth of experience and insight in his work that most
IT people still don't know about -- many don't even know who he is. The
objective of the post is to bring some of his meaningful and powerful ideas to
the surface and summarize them so they're more palatable and accessible to
more people. I think the title accurately captures this.

------
army
Some of those points are a bit misleading. Sure, you can define a Lisp
interpreter in very few lines of code. But building a decent standard library
and an optimizing compiler/JIT is where the LOCs rack up.

~~~
mythz
Attitudes like this are short-sighted, and part of the problem. Finding
innovative and elegant solutions to problems is more scalable than the
brute-force approach of throwing resources at them and having people churn out
more LOCs. I'm not saying it's easier or requires less effort, just that it
results in fewer LOCs, which scales better and provides better building
blocks to build on.

Look into the special properties of Lisp, which Alan refers to as the
Maxwell's equations of software:
<http://www.michaelnielsen.org/ddi/lisp-as-the-maxwells-equations-of-software/>

It's more powerful and extensible than most languages despite requiring orders
of magnitude less effort to implement (see Peter Norvig's Lisp interpreter in
60 lines of Python, <http://norvig.com/lispy.html>, rewritten in 40 lines of
Ruby: <http://stuff.thedeemon.com/rispy/>).
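
In the same spirit, and with no claim to matching Norvig's code, here is an even smaller sketch of the idea in Python: a tokenizer, a reader, and an evaluator, enough to run simple arithmetic and a conditional:

```python
# A drastically pared-down cousin of Norvig's lispy, illustrating why a
# Lisp evaluator fits in so few lines: read s-expressions into nested
# lists, then evaluate them with a handful of rules.

def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)                      # discard the closing ")"
        return expr
    try:
        return int(tok)                    # numeric literal
    except ValueError:
        return tok                         # symbol

ENV = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(x, env=ENV):
    if isinstance(x, str):                 # symbol lookup
        return env[x]
    if isinstance(x, int):                 # self-evaluating literal
        return x
    if x[0] == "if":                       # (if test then else)
        _, test, then, alt = x
        return evaluate(then if evaluate(test, env) else alt, env)
    f = evaluate(x[0], env)                # application
    return f(*[evaluate(arg, env) for arg in x[1:]])

assert evaluate(parse(tokenize("(+ 1 (* 2 3))"))) == 7
```

Everything beyond this (a standard library, tail calls, a compiler) is where the effort goes, which is exactly the parent's point.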

~~~
army
I'm familiar with the LISP metacircular interpreter and programming language
theory. I know you can define all sorts of things with a very small set of
primitives.

It's one thing to build a minimal implementation where your main constraints
are that it works and is concise. It gets a lot messier once you start
worrying about constraints like efficiency on real machines - there are limits
to how much you can abstract away the heuristics and special cases.

~~~
mythz
And Lisp's macro support has a major impact on the small code-base size of its
class libraries.

Right, if you need it to be efficient then mechanical sympathy starts creeping
into your code-base. But a language should optimally be designed to capture
human thought - that's the part that doesn't scale well, and isn't solved by
adding more machines.

The Nile/OMeta work is an example of using DSLs as one way to reduce the LOCs
required: it captures the required top-level math functions for image
processing and uses OMeta to convert them to runnable source code. DSLs might
not be the best way forward in all situations, but investing in research into
different ways to solve the problem can only help.

------
Uhhrrr
"Object Orientated"? Really?

How does someone read so much material on Alan Kay and still manage to have
"object orientated" stuck in their head? It's like putting together an
overview of Edward Said's "Orientatilism".

~~~
cgh
I feel the same way about "orientated". But my British friends tell me that's
the normal usage of the word in the UK and "oriented" sounds as strange to
them as "orientated" does to me.

That said, Kay did call it object-oriented programming, so it makes sense to
stick with his name.

~~~
rjknight
I'm British and I think your British friends are wrong. I do agree that most
of the people who use "orientated" are British but I don't understand where
they get that from.

"Orientation" is the term for the process of orienting something. So a design
process in which, say, user stories are turned into object-oriented designs
could be called "object-orientation". But it does not follow that the result
is "object-orientated" - we'd just call it "object-oriented", in the same way
that something which has been through a transformation is not "transformated"
but "transformed".

------
psadri
... orientated around “visions rather than goals” and “funded people, not
projects”

Being exclusively goal-oriented constrains the originality of the high-level
ideas that are produced.

On the other hand, being constrained by specific goals (e.g. put a person on
the moon) can produce a lot of creativity, as people's efforts are focused on
a narrow domain of problems to solve.

------
ternaryoperator
Surprised not to see any quotes from Dr. Dobb's interview with Kay from 2012 (
<http://www.drdobbs.com/240003442> )

Some quotes from that interview:

"My interest in education is unglamorous. I don't have an enormous desire to
help children, but I have an enormous desire to create better adults."

"Extracting patterns from today's programming practices ennobles them in a way
they don't deserve."

"I like to say that in the old days, if you reinvented the wheel, you would
get your wrist slapped for not reading. But nowadays people are reinventing
the flat tire. I'd personally be happy if they reinvented the wheel, because
at least we'd be moving forward. If they reinvented what Engelbart did we'd be
way ahead of where we are now."

and so on...

------
elviejo
Alan Kay is a voice we need to listen to more... we are too distracted by the
fact that our current industry generates a lot of money to look at the fact
that our craft is in the pyramid-construction age... we need real
architecture and real engineering.

------
pbreit
I don't think the dig at the "web amateurs" is fair. It seems clear to me that
the simple and, perhaps, amateurish underpinnings of the web (HTTP and HTML)
ended up being a great strength.

~~~
moe
I disagree.

HTTP is defensible, HTML also has its time and place.

But what it evolved into (HTML/CSS/JS) is pretty obviously a dead-end waiting
to happen. The number of hacks we pile on one another for a "modern web-
application" is nothing short of ridiculous.

Realistically we still live in the internet stone-age, helplessly piling up
stones and wooden sticks...

------
qznc
Oh, how I would like to base everything on simple abstractions like “real
computers all the way down (RCATWD)”. However, reality seems to think
differently. James Hague recently said it well: "Simplicity is Wonderful, But
Not a Requirement"

<http://prog21.dadgum.com/167.html>

------
ned11
Good article. I just translated it into Chinese and posted it on my blog
(<http://ned11.iteye.com/blog/1828088>). Thanks for the introduction to
such a great person.

------
warmfuzzykitten
It's a small thing, I know, but the author's use of "orientated" for
"oriented" was a bit annoying.

Other than that, it's a nice collection of quotes and links. Good to see.

------
drpepper
I'd love to get a hold of that Bob Barton article Alan Kay talks about, but
strangely it seems like all the PDFs of it have been taken down...

~~~
mythz
The revolutionary 1961 Bob Barton paper Alan Kay was referring to was called
"A new approach to the functional design of a digital computer".

Here's a copy of it on Scribd:
<http://www.scribd.com/doc/61812037/Barton-B5000>

~~~
drpepper
Great!

------
lani
Has anyone seen him talking about a Tim Gallwey video? Kathy Sierra mentioned
one of Alan Kay's talks called 'Doing with Images Makes Symbols'... I'm trying
to find this one.

------
endlessvoid94
What network protocol did the ARPA network at PARC use?

------
mike_ivanov
The author lost me here: "Which didn’t even require inheritance, which is not
like we know it today".

------
martinced
My new favorite from Alan Kay, which I just learned from TFA:

 _"Lisp: greatest single programming language ever designed"_

My previous favorite quote attributed to Alan Kay, which is not in TFA:

 _"I made up the term "object-oriented", and I can tell you I did not have C++
in mind."_

: )

