
Alan Kay on AI, Apple and Future - pm2016
http://factordaily.com/alan-kay-apple-steve-jobs/
======
analyst74
> Part of the idea behind “real objects” was to act as “virtual machines”
> connected by “neutral messages” in the very same way as we were starting to
> design the Internet — and that the “virtual internet of objects” would map
> into the subsequent hardware network. The latter got made, and the former
> did not get understood or adopted.

Can anyone elaborate on what he means by this (and the overall idea of "real
objects")?

~~~
noahlt
"In computer terms, Smalltalk is a recursion on the notion of computer itself.
Instead of dividing "computer stuff" into things each less strong than the
whole—like data structures, procedures, and functions which are the usual
paraphernalia of programming languages—each Smalltalk object is a recursion on
the entire possibilities of the computer. Thus its semantics are a bit like
having thousands and thousands of computers all hooked together by a very fast
network."

That's Alan Kay in The Early History of Smalltalk, which you might enjoy
reading in full:
[http://worrydream.com/EarlyHistoryOfSmalltalk/](http://worrydream.com/EarlyHistoryOfSmalltalk/)

~~~
Animats
It isn't, though. Smalltalk "messages" are just function calls. It's not like
all Smalltalk objects are running asynchronously, in parallel, with
unsynchronized messages flowing around.

~~~
tailrecursion
In my opinion this is the most important insight into Smalltalk: that
Smalltalk renames indirect function calls as "messages". If Smalltalk
message passing were asynchronous, and if Smalltalk objects were processes,
then Smalltalk would be closer to the vision that Kay conveys.

I believe it was Jonathan Rees who emphasized the relationship between
Smalltalk's messages and generic functions.
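
Roughly, in Python terms (an illustrative sketch, not how any Smalltalk VM is
actually implemented), a message send is just a late-bound, indirect function
call:

    # look up the method by selector at runtime, then call it synchronously
    def send(receiver, selector, *args):
        method = getattr(type(receiver), selector)  # dynamic lookup by name
        return method(receiver, *args)              # an ordinary function call

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

        def moved_by(self, dx, dy):
            return Point(self.x + dx, self.y + dy)

    p = send(Point(1, 2), "moved_by", 3, 4)  # same effect as Point(1, 2).moved_by(3, 4)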

~~~
Animats
Smalltalk was descended from Simula, which was a variant of ALGOL-60 with
discrete event simulation capabilities. As a side effect, Simula was the first
object-oriented language. This confused things. Objects were associated with
discrete event simulation, which led to the obsession with messages. Kay liked
discrete event simulation - he thought that one of the big applications for
personal computers was going to be simulation and scheduling. There's a little
hospital simulation for the Alto shown in the Personal Dynamic Media book.

In a discrete event simulator, there's a notion of time, and a simulated
pseudo-clock, but in reality, all the events are sorted in time order and
executed sequentially. (You can schedule something to happen in 2 seconds, but
that just puts it in the event queue in the proper place.) Locking against
concurrency is not required. This serialized notion of concurrency is more or
less equivalent to just calling functions.
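
A minimal sketch of the idea (illustrative Python, not any particular
simulator):

    import heapq

    class Simulator:
        def __init__(self):
            self.now = 0.0
            self.queue = []   # events kept sorted by simulated time
            self.seq = 0      # tie-breaker for events at the same time

        def schedule(self, delay, action):
            # "in 2 seconds" just inserts an entry at now + 2
            heapq.heappush(self.queue, (self.now + delay, self.seq, action))
            self.seq += 1

        def run(self):
            while self.queue:
                self.now, _, action = heapq.heappop(self.queue)
                action(self)  # one event at a time; no locking needed

    sim = Simulator()
    sim.schedule(2.0, lambda s: print("fires at t =", s.now))
    sim.schedule(1.0, lambda s: print("fires at t =", s.now))
    sim.run()  # the t=1.0 event runs first, then t=2.0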

Once people realized that object-oriented programming doesn't require discrete
event simulation, the concepts parted company. OOP became more about
encapsulation, although some languages still use the "message" terminology.

So this is a historical artifact. Those happen. In von Neumann's EDVAC report,
where he laid out the design for most modern computers, there's discussion of
logic gates as simplified neurons and synapses. Nobody thinks of logic gates
that way any more, and we now know that neurons don't work like logic gates.
But at the time, people thought of them as similar.

~~~
jecel
Messages were clearly not fancy subroutine calls in Smalltalk-72 and -74: they
were a stream of tokens between the sender and the receiver. This was
optimized away in Smalltalk-76 (and so -78 and -80) so that messages no longer
seemed like the ones in Actor languages or Erlang.

But I don't think it is unfortunate that the name "messages" has persisted.
Check out Squeak running on a 56-core Tilera chip with the RoarVM. Messages
from one object to another in the same core are indeed just fancy subroutine
calls. But if the receiver is in a different core, then a message is a bunch
of bytes sent from one core to the other, even though the two messages are
identical at the source level and at the bytecode level.

~~~
Animats
When you write

    i := (j + 1)

this is said to be sending a "+" message to j with argument 1. But how does
the value get sent back to the assignment? For consistency, it ought to be j
sending an assignment message to i with the value. But how would j find out
about i? The assignment message is sent to i, not j. Problem. So the Smalltalk
convention is that each message send produces a value. That value is
determined by the recipient of the message; it's not just a send status code
as in a real message-passing system.

Values break the message-passing paradigm, and force the "message passing" to
work like a function call. So there was no real reason not to implement
messages as function calls, and then no real reason not to think of them as
function calls.

A purer message-passing approach would involve a callback when the result is
available. That's how everything with a delay works in JavaScript. It's a bit
unwieldy to do that for every piece of computation, though.
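
The contrast, sketched in illustrative Python rather than Smalltalk:

    def plus_value(j, n):
        # value-returning send: the caller waits for the result,
        # which is exactly a function call
        return j + n

    def plus_callback(j, n, reply):
        # purer message style: the result is delivered to a callback later
        reply(j + n)

    i = plus_value(41, 1)                     # i := j + 1
    plus_callback(41, 1, lambda v: print(v))  # the continuation receives 42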

Real asynchronous messages are something else. Go uses them extensively. Then
you have locking, race conditions, lockups, and all the problems of
concurrency, but you can get multiple processors working on the problem.
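
A rough sketch of that style (illustrative Python, with a thread and a queue
standing in for Go's goroutines and channels):

    import queue
    import threading

    mailbox = queue.Queue()

    def receiver():
        while True:
            msg = mailbox.get()
            if msg is None:        # sentinel: shut down
                break
            print("got", msg)      # runs concurrently with the sender

    t = threading.Thread(target=receiver)
    t.start()
    mailbox.put("hello")           # the send returns immediately; no reply expected
    mailbox.put(None)
    t.join()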

~~~
jecel
In Smalltalk-72 and -74 the assignment (the left-arrow character) was a
message just like any other. It became a special case in Smalltalk-76 and
later, and became just a message again in Self, where you wrote

    i: (j + 1)

meaning

    self i: ( self j + 1 )

For tinySelf 1 I did implement each message using future objects. This is, as
you say, slooooow to do for every piece of computation, but there are
implementation tricks that can optimize away all the cases where it is not
really needed.

[http://www.merlintec.com/lsi/tiny.html](http://www.merlintec.com/lsi/tiny.html)
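
The flavor of it, as a rough sketch (illustrative Python; the real mechanism
is described at the link above):

    from concurrent.futures import ThreadPoolExecutor

    pool = ThreadPoolExecutor()

    def send_async(receiver, selector, *args):
        # the send hands back a future right away instead of a value
        return pool.submit(getattr(receiver, selector), *args)

    f = send_async("hello", "upper")   # does not wait for the result
    print(f.result())                  # blocks only here, when the value is needed
    pool.shutdown()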

------
hashkb
> Startups are not a good place to do research

I saw Alan Kay give a talk on Squeak in NYC when I was a little Sega Genesis
kid... he made me want to be a programmer because I hoped one day to work on
the stuff he does the way he does it. That's like wanting to be an astronaut.

The tricky bit is that startups pretend they're doing that kind of work in
order to hire engineers (or to get press/funding/etc.).

~~~
im_down_w_otp
Oftentimes they demand that you uphold the fiction in order to work for them,
which makes for fun social experiments in deflating mass self-delusion.

Them: "We just really want to find people who are passionate about this work
and these tools!"

Me: "Anybody who is telling you they're 'passionate' about making yet-another-
dashboards-as-a-service product on top of a tired old Java Spring software
stack is lying to you."

Them: "So you're not passionate about dashboards or Java?"

Me: "No. No I am not. What I am is capable of fixing your performance and
scale problems so that you don't lose your reference customers and destroy any
chance you have at acquisition or IPO. Look on the bright side. At least I'm
not lying to you."

Them: "Oh. I see."

Me: "I'm passionate about making money."

~~~
jfe
You captured 18 months of personal depression in 7 lines :)

~~~
im_down_w_otp
Sorry you've been dealing with depression. From experience I know that's a
tough thing to grapple with.

Being compelled to maintain sustained cognitive dissonance is a form of low-
grade chronic stress, and unfortunately depression is often a side-effect of
chronic stress.

I'm always wary of places that are more interested in building a cult than a
company, for precisely these kinds of reasons.

Get well, mate.

------
steveeq1
"The irony is that the ‘second Steve’ of the later Apple made and sold the
equivalent of mental sugar water to all via convenient appealing consumer
gadgets."

Ouch.

~~~
chillaxtian
How was the Macintosh different?

~~~
ktRolster
For one thing, it was open. You could install your own software on it.

~~~
chillacy
It was so open that your software could write all over the memory space of
other programs:
[https://en.wikipedia.org/wiki/Mac_OS_memory_management](https://en.wikipedia.org/wiki/Mac_OS_memory_management)

Then OS X introduced protected memory, giving each program its own address
space. Programs became a little bit safer.

Then in later versions, Application Sandboxing, code signing, etc.

Every step of the way programs can do less but are safer for the OS.

~~~
TeMPOraL
> _Every step of the way programs can do less but are safer for the OS._

The important thing here is that it's _a tradeoff_, not a universal good. And
a tradeoff that I personally don't like very much. Doing fun and interesting
things keeps getting harder, because you have to jump through an increasing
number of security hoops. For instance, reading and writing the memory of
other programs enabled tweaks that are now almost impossible to perform.

I understand the need for change, now that a computer is expected to be
connected and running untrusted third-party code by default. But we've lost
something with that change, and I wish for a way to get it back.

------
thecolorblue
Interesting interview. My experience with startups has been different from
what he talks about at the end. It has been one of using a modified version of
the scientific method to accelerate product development: there is less peer
review, and a focus on keeping the subject matter (usually the customer)
close. Alan makes it sound like startups come up with an idea and push it into
the market as fast as possible.

His quote is “research means you can change your mind”. Well, I change my mind
all the time.

~~~
bluetomcat
His main point has always been that for significant breakthroughs, you start
by searching for a problem that isn't even formulated yet. Solving the
problems of your customers is the opposite of that. The impact of solving the
"unformulated" problem may be so strong that it eliminates the need to solve
the little particular problems. In that sense, office workers of the 1970s
probably wanted better typewriters instead of GUIs.

~~~
davmar
Depends on your definition of "significant breakthrough". Technical
breakthroughs are what Kay values, and that's OK. But to suggest they're the
only kind of breakthroughs is myopic. Startups can, in their best
incarnations, change human behavior and the way we interact with our world.

~~~
nikki93
To call it purely technical is also myopic, IMO... The GUI/typewriter change
the previous post talks about is a paradigm shift, much like those in Kuhn's
literature on scientific revolutions. Some neat examples are paradigm shifts
in, say, music production or other art, which is technical, yes, but achieves
aesthetic ends and is about the human-tool relationship.

------
asragab
Orthogonal to the article, but: "He is widely considered one of the fathers of
object-oriented programming, or OOP, which at its simplest is a “if-then”
programming paradigm that links data to pinpointed procedures."

I'd like to think I have read a few definitions of OOP, but I have to say I
don't recognize the above. What does "pinpointed procedures" mean? Also,
"if-then" programming doesn't seem like a paradigm, or a necessary or
representative flow-control structure for OOP.

To be fair, I am not familiar with Smalltalk, and perhaps this definition
makes more sense in that context.

~~~
stonesam92
I think that the "if-then" part is a simplification of imperative paradigms.

I've never heard it described that way, but I assume the part about
"pinpointed procedures" tied to data is a weird way of describing how methods
relate to the data encapsulated by their owning object.

Ultimately, I think this is a case where someone familiar with the concept has
explained it to someone who is not, who has then tried to paraphrase and,
lacking any domain knowledge, summarised it in an unusual and vague way.

~~~
infinite8s
Actually that reads as a pretty good non-developer summary of how current
object-oriented programming languages work. Give that to a business person and
it would probably make sense to them.

------
VladimirGolovin
If you skipped the linked PDF with his tribute to ARPA/PARC, read it. Here are
some quotes that I just put in my quote file:

 _> "A fish on land still waves its fins, but the results are qualitatively
different when the fish is put in its most suitable watery environment."_

 _> "Because of the normal distribution of talents and drive in the world, a
depressingly large percentage of organizational processes have been designed
to deal with people of moderate ability, motivation, and trust. We can easily
see this in most walks of life today, but also astoundingly in corporate,
university, and government research. ARPA/PARC had two main thresholds: self-
motivation and ability. They cultivated people who "had to do, paid or not"
and "whose doings were likely to be highly interesting and important"._

 _> "Out of control" because artists have to do what they have to do.
"Extremely productive" because a great vision acts like a magnetic field from
the future that aligns all the little iron particle artists to point to
“North” without having to see it. They then make their own paths to the
future."_

 _> "Unless I'm badly mistaken, in most processes today—and sadly in most
important areas of technology research—the administrators seem to prefer to be
completely in control of mediocre processes to being "out of control" with
superproductive processes. They are trying to "avoid failure" rather than
trying to "capture the heavens"._

~~~
infinite8s
Although a fish on land is what eventually led to the evolution of all land
and air based lifeforms :)

------
ktRolster
Wow, this guy is humble:

 _I’m not interested in being remembered — but I would to have the ideas,
visions, goals, and values of my whole research community not just remembered
but understood and heeded._

~~~
bluetomcat
From listening to his talks and interviews, I see Kay as a true renaissance
person. Close enough to the industry to know its inherent illnesses, distant
enough to not be indoctrinated by contemporary BS. The absolute opposite of
serial bullshitters like Martin Fowler, for example.

I recommend his talk "Normal Considered Harmful".

------
mangeletti
Kay compares the iPhone et al. to "selling sugar water to children" in an
attempt to relegate them to "consumer gadget" status... compared to what,
enterprise gadgets?

~~~
davmar
He sounds like a grumpy old man mixed with a bit of hipster. If you can
dismiss the mobile revolution as selling "sugar water to children", you're
wearing blindfolds. Look at the adoption rates of these devices all around the
world - they're unmatched in human history. Look at how they've given access
to healthcare and finance in poor countries.

And he calls it "sugar water". Baffling.

~~~
mundo
He didn't dismiss the mobile revolution, he dismissed consumer Apple products.
Not quite the same thing, unless you think the impoverished masses were
uplifted by iTunes.

------
digi_owl
The funny thing about his "real objects" answer is that it sounds very close
to what unix shell scripting does - shell scripting that the web kiddies turn
their noses up at.

~~~
Jtsummers
I think it's fairer to compare his "real objects" to actors in the actor
model. Live processes with state that can spin off new processes, send and
receive messages, and change state based on the messages they receive.
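
A toy version of such an actor, sketched in illustrative Python (a thread
holding private state, with a mailbox for messages):

    import queue
    import threading

    class Counter:
        def __init__(self):
            self.count = 0                  # private state
            self.mailbox = queue.Queue()
            threading.Thread(target=self._loop, daemon=True).start()

        def _loop(self):
            while True:
                msg, reply = self.mailbox.get()
                if msg == "increment":
                    self.count += 1         # state changes only via messages
                elif msg == "read":
                    reply.put(self.count)

        def send(self, msg):
            reply = queue.Queue()
            self.mailbox.put((msg, reply))
            return reply

    c = Counter()
    c.send("increment")
    print(c.send("read").get())             # -> 1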

Shell scripting is a subset of this, with short-lived processes typically
communicating sequentially via text-based messages (usually line-oriented,
moderately structured data).

Long-running unix processes communicating with each other or with external
processes via some other mechanism is also an example of the actor model, and
consequently can each be conceived of as "real objects". But they aren't
typically modeled this way or constructed around this concept, so it's more a
post hoc description rather than a deliberate consequence of their design.

For "web kiddies", they deal with multiple objects all the time. Their
database is an object (or collection of objects), it responds to queries by
returning information, changing information, storing information. Their web
server is an object responding to HTTP connections, generating session
objects, which deal directly with each browser instance viewing the web
content. Again, not typically modeled in this way, but it's the actor model at
work.

When software is designed this way, IMHO, it comes out better in the end,
though perhaps not as performant, so sometimes we have to take those designs,
drill down a bit, and end up with something slightly different from the
high-level model.

------
infinite8s
This is the first interview I've seen with Alan Kay where he discusses some of
his thoughts on AI.

------
sebringj
Was it just me, or did Alan say the brain is magic? It's so arrogant of him to
think he understands the brain well enough to declare that, because he doesn't
understand it, it must be magic. I so love Minsky. Long live Marvin Minsky.

