
Alan Kay is still waiting for his dream to come true - sohkamyung
https://www.fastcompany.com/40435064/what-alan-kay-thinks-about-the-iphone-and-technology-now
======
alankay1
Let me try to help this community regarding this article by providing some
context. First, you need to realize that in the more than 50 years of my
career I have always waited to be asked: every paper, talk, and interview has
been invited, never solicited by me. But there is a body of results from these that
do put forward my opinions.

This article was a surprise, because the interview was a few years ago for a
book the interviewer was writing. It's worth noting that nowhere in the actual
interview did I advocate going back and doing a Dynabook. My comments are
mostly about media and why it's important to understand and design well any
medium that will be spread and used en masse.

If you looked closely, then you would have noticed the big difference between
the interview and the front matter. For example, I'm not still waiting for my
dream to come true. You need to be sophisticated enough to see that this is a
headline written to attract. It has nothing to do with what I said.

And, if you looked closely, you might note a non sequitur right at the beginning,
from "you want to see old media?" to no followup. This is because that section
was taken from the chapter of the book but then edited by others.

The first version of the article said I was fired from Apple, but it was Steve
who was fired, and some editor misunderstood.

In the interview itself there are transcription mistakes that were not found
and corrected. And of course they didn't send me the article ahead of time (I
could have fixed all this).

I think I would only have made a few parts of the interview a little more
understandable. It's raw, but -- once you understand the criticisms -- I think
most will find them quite valid. In the interview -- to say it again -- I'm
not calling for the field to implement my vision -- but I am calling for the
field to have a larger vision of civilization and how mass media and our tools
contribute or detract from it. Thoreau had a good line. He said "We become the
tools of our tools" -- meaning, be very careful when you design tools and put
them out for use. (Our consumer-based technology field is not being at all
careful.)

~~~
austenallred
Wow, this article is a pretty egregious misinterpretation of those points. I
didn't come away with anything even vaguely resembling what you just said,
even if we disregard the factual errors.

Candidly, your comments here make a _lot_ more sense than some "We haven't yet
built the DynaBook the way we should" piece.

~~~
alankay
People often ask me "Is this a Dynabook, is that a Dynabook?". Only about 5%
of the idea was in packaging (and there were 2 other different packages
contemplated in 1968 besides the tablet idea -- the HMD idea from Ivan, and
the ubiquitous idea from Nicholas).

Almost all the thought I did was based on what had already been done in the
ARPA community -- rendered as "services" -- and resculpted for the world of
the child. It was all quite straightforwardly about what Steve later called
"Wheels for the Mind".

If people are interested to see part of what we had in mind: a few years ago,
a few of us including Dan Ingalls revived a version of the Xerox PARC
software from 1978 that Xerox had thrown away (it was rescued from a trash
heap). This system was the vintage that Steve Jobs saw the next year when he
visited in 1979. I used this system to make a presentation for a Ted Nelson
tribute. It should start at 2:15. See what you think about what happens around
9:05. [https://youtu.be/AnrlSqtpOkw?t=135](https://youtu.be/AnrlSqtpOkw?t=135)

Next year will be the 50th anniversary of this idea, and many things have
happened since then, so it would be crazy to hark back to a set of ideas that
were conceived in the context of what could be built over 10 years, and that
would be ridiculous not to have within 30 (that would be 1998, almost 20 years ago).

The large idea of ARPA/PARC was that desirable futures for humanity will
require many difficult things to be learned beyond reading and writing and a
few sums. If "equal rights" is to mean something over the entire planet, this
will be very difficult. If we are to be able to deal with the whole planet as
a complex system of which we complex systems are parts, then we'll have to
learn a lot of things that our genetics are not particularly well set up for.

We can't say "well, most people aren't interested in stuff like this" because
we want them to be voting citizens as part of their rights, and this means
that a large part of their education needs to be learning how to argue in ways
that make progress rather than just trying to win. This will require
considerable knowledge and context.

The people who do say "well, most people aren't interested in stuff like this"
are missing the world that we are in, and putting convenience and money making
ahead of progress and even survival. That was crazy 50 years ago, and should
be even more apparently crazy now.

We are set up genetically to learn the environment/culture around us. If we
have media that seem to our nervous systems like an environment, we will try to
learn its ways of thinking and doing, and even take from it our conception of reality.

We can't let the commercial lure of "legal drugs" in the form of media and
other forms put us into a narcotic sleep when we need to be tending and
building our gardens.

The good news about "media as environment" was what attracted a lot of us 50
years ago -- that is, that making great environments/cultures will also be
readily learned by our nervous systems. That was one of Montessori's great
ideas, and one of McLuhan's great ideas, and it's a great idea we need to
understand.

There aren't any parents around to take care of childish adults. We are it. So
we need to grow up and take responsibility for our world, and the first
actions should be to try to understand our actual situations.

~~~
pls2halp
I'm interested to know whether you've seen the concepts of atemporality[1] and
network culture[2] floating around. Basically, the core thesis associated with
these is that we have adopted the internet as our primary mode of processing
information, and in the process have lost the sense of a cohesive narrative
that is inherent in reading a book or essay, or listening to a whole talk.

You become fully immersed in Plato's worldview when reading The Republic, but
if you were to see someone explaining the allegory of the cave in the absence
of a wider context, you would only take the elements which fit your worldview
and not his wider conception of knowledge.

I think this ties into what happened to the concept of a centralised computer
network working for the good of humanity, turning into today's fractured
silos working to mine individuals for profit.

[1][http://index.varnelis.net/network_culture/1_time_history_und...](http://index.varnelis.net/network_culture/1_time_history_under_atemporality)
[2][http://index.varnelis.net/network_culture](http://index.varnelis.net/network_culture)

~~~
alankay
Thanks for the references. I haven't seen these, but the ideas have been
around since the mid-60s by virtue of a number of the researchers back then
having delved into McLuhan, Mumford, Innis, etc. and applied the ideas to the
contemplated revolutions in personal computing and networking media we were
trying to invent.

I think a big point here is that going to a network culture doesn't mandate
losing narrative, it just makes the probability much higher for people who are
not self-conscious about their surrounding contexts. If we take a look at
Socrates (portrayed as an oral culture genius) complaining about writing --
e.g. it removes the need to remember and so we will lose this, etc. -- we also
have to realize that we are reading this from a very good early writer in
Plato. Both of them loved irony, and if we look at this from that point of
view, Plato is actually saying "Guess what? If you -decide- to remember while
reading then you get the best of both worlds -- you'll get the stuff between
your ears where it will do you the most good -and- you will have gotten it
many times faster than listening, usually in a better form, and from many more
resources than oral culture provides".

This was the idea in the 60s: provide something much better -- and by the way
it includes narrative and new possibilities for narrative -- but then like
reading and writing, teach what it is in the schools so that a pop culture
version which misses most of the new powers is avoided.

------
firasd
I found it interesting that in an HN discussion about how to auto-update
comments Alan Kay said, "How about a little model of time in a GUI?"[1] It
ties into when he said of smartphone apps, "It's painful to see people using
billions of devices that have forgot that undo is a good idea." [2]

It made me think about how things like Redux (or even just Google Drive's
revision history), where you can 'time travel' through state, can be more
important for UI than we sometimes realize.

1)
[https://news.ycombinator.com/item?id=11940472](https://news.ycombinator.com/item?id=11940472)

2)
[https://www.youtube.com/watch?v=S6JC_W9F8-g&t=40m30s](https://www.youtube.com/watch?v=S6JC_W9F8-g&t=40m30s)
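
For the curious, the core of that time-travel idea is tiny. A minimal sketch in TypeScript (a Redux-style shape; `Timeline`, `apply`, `undo`, and `redo` are illustrative names, not Redux's actual API):

    // Keep every past state so the UI can step backwards (undo)
    // and forwards (redo) through time.
    interface Timeline<S> {
      past: S[];     // states we can undo back to
      present: S;    // what the UI renders now
      future: S[];   // states we can redo into
    }

    // A new change pushes the present into the past and clears redo.
    function apply<S>(t: Timeline<S>, next: S): Timeline<S> {
      return { past: [...t.past, t.present], present: next, future: [] };
    }

    function undo<S>(t: Timeline<S>): Timeline<S> {
      if (t.past.length === 0) return t;
      return {
        past: t.past.slice(0, -1),
        present: t.past[t.past.length - 1],
        future: [t.present, ...t.future],
      };
    }

    function redo<S>(t: Timeline<S>): Timeline<S> {
      if (t.future.length === 0) return t;
      return {
        past: [...t.past, t.present],
        present: t.future[0],
        future: t.future.slice(1),
      };
    }

Every state the user ever saw stays reachable, which is exactly a "little model of time in a GUI".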

~~~
cma
It is painful that most desktop apps never even got branching undo/redo.
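
A branching history just stops throwing the old future away: keep a tree of states plus a pointer to the current node. A hypothetical sketch (not any particular app's implementation):

    // Branching undo: every state is a node; undoing and then editing
    // creates a sibling branch instead of discarding the old redo path.
    interface HistoryNode<S> {
      state: S;
      parent: HistoryNode<S> | null;
      children: HistoryNode<S>[];  // every branch ever taken
    }

    function edit<S>(current: HistoryNode<S>, next: S): HistoryNode<S> {
      const node: HistoryNode<S> = { state: next, parent: current, children: [] };
      current.children.push(node);
      return node;  // the new "current" node
    }

    function undo<S>(current: HistoryNode<S>): HistoryNode<S> {
      return current.parent ?? current;  // walk up; abandoned branches stay reachable
    }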

~~~
jcelerier
Most non-technical people I know have trouble with "simple" undo-redo and
copy-paste; this would only make it harder for them.

~~~
zmix
So what? It shouldn't be too difficult to implement three states of userspace
and have one be selected upon login:

- Novice
- Advanced
- Expert

'Novice' gives you simple 'undo'. 'Advanced' gives you 'undo' with history.
And 'Expert' gives you branching. For example, for file managers this could be
done so that the novice gets something like the Finder or Windows Explorer:
basically a file browser that handles its pane contents as a document.
'Advanced' gives you tabs, rearrangeable layouts (dual-pane) and some more
configuration. 'Expert' gives you fully configurable menus and toolbars,
integrated with the host's generic scripting host. Etc.
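
To picture the proposal concretely, here is a hypothetical sketch of such a mapping (the level names and capability flags are illustrative, not from any real system):

    // Hypothetical mapping from the user level chosen at login
    // to the features the UI exposes.
    type UserLevel = "novice" | "advanced" | "expert";

    interface Capabilities {
      undo: "simple" | "history" | "branching";
      tabs: boolean;
      dualPane: boolean;
      configurableMenus: boolean;
      scripting: boolean;
    }

    const profiles: Record<UserLevel, Capabilities> = {
      novice:   { undo: "simple",    tabs: false, dualPane: false, configurableMenus: false, scripting: false },
      advanced: { undo: "history",   tabs: true,  dualPane: true,  configurableMenus: false, scripting: false },
      expert:   { undo: "branching", tabs: true,  dualPane: true,  configurableMenus: true,  scripting: true  },
    };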

~~~
astrange
Finder does have tabs, and further automation is available through scripting
(AppleScript and Terminal).

Actually, both scripting and the commandline are effective ways of making you
organize your program into nouns and verbs that can be extended by the user.

But I don’t think rearranging the basic UI is that useful, because all your
users are basically the same - we’re all human and not Lisp programmers.

------
david927
If, like Alan Kay, you think we need a major overhaul in software construction
and are in the San Francisco Bay Area, let me know. I'm part of a group that
meets on Saturdays once a month and we work on side research projects toward
that end. My email is in my profile.

~~~
mncharity
If anyone in the Boston area is interested in whiteboarding high-skill
software development environments in the context of 2018/2019 hardware, I'd be
interested. So: continuous speech recognition; hand tracking, controllers, and
haptic exoskeletons, but still keyboards; HMDs with 30+ px/deg, and eye
tracking, but still screens.

Dynabook was a nice vision. As always, society executed poorly, and failed to
realize much of the potential of its technologies. No doubt that will happen
again with VR/AR. For example, there's a toxic cauldron of patents brewing.
Screens and keyboards and mice have had a good half-century run. And GUIs,
and personal local hardware. But perhaps time to shift focus.

~~~
theoh
When you say society always executes poorly, what alternative social reality
are you comparing it with?

~~~
mncharity
I don't understand - are you suggesting the only way to know you could have
done something better, is to see someone else doing the same thing for
comparison? That's certainly one way. Others include... all the ways you
evaluate the execution of a company or project, when you don't have a direct
competitor to compare it with, no?

~~~
theoh
Maybe you would agree with the statement "reality is broken"? I certainly
don't think it's meaningful.

The unfortunate associations of wanting society to "execute better" (i.e. the
merger of industry and the state found under fascism) just underline the fact
that applying performance measurements to human life/social phenomena is
misguided. We don't all agree on the purpose of life or the nature of
happiness, nor should we.

------
mkhalil
I don't buy that everyone would be a digital creative if they just had the
right tools. Making quality content, whether it be apps, music, movies, etc.,
requires much more than tools.

I respect Alan Kay, but I don't understand the need to bash on modern day
technology.

Have we really come a long way in terms of general computing? Maybe not
[Example: 0]. But in terms of the digital world, I can take a video of my
parents and send it to my cousin who lives 10,000 miles away and he can
respond in a matter of seconds.

I can literally meet people in remote areas of the world. I can interact with
people who barely have food but somehow can get cell phone access and now they
are learning and communicating with everyone else. I can't imagine a better
way to level the playing field (socioeconomically) globally than how we have
it.

Do we have a lot more work to do? Sure. But Rome wasn't built in a day.

The future of the internet & technology, the direction it's headed, is going
to come from small contributions from millions across the world.

What people will need will turn into what we have and use. And there won't be
some magical device that just pops out of nowhere that will change everything.

[0]: [https://www.youtube.com/watch?v=yJDv-zdhzMY](https://www.youtube.com/watch?v=yJDv-zdhzMY)

~~~
scroot
> "I don't buy it that everyone would be a digital creative if they just had
> the right tools."

That is not Kay's argument. Not everyone who scribbles notes or draws
something on a piece of paper is a "creative" or artist. Not all who write are
professional writers. But we do not have the computing equivalent of paper for
everyone. Sure, someone can write in a word processor or draw in a drawing
program, but that's not all that a computer is for -- that's just imitating
paper. The outstanding question is: can we make something that is as
extensible as pen and paper and literacy for aiding human thought at the next
level, the computing level? All we have now are stiff applications. Saying
that such things make people literate in computing is like calling form-filling
writing.

> "Do we have a lot more work to do? Sure. But Rome wasn't built in a day."

Kay is your ally here, a constant gadfly putting the lie to all the hype and
BS. He's a constant reminder that we shouldn't feel so satisfied with our mud
huts. He's reminding us to build Rome at all.

------
pcl
It's clear from the interview that Kay thinks mobile computing isn't where it
should be. But I haven't studied the DynaBook. Does anyone know exactly what
Kay thinks is needed, in order to realize his dream?

I found a few such things in the article:

- more-discoverable undo

- some sort of AI-based virtual assistant

- a stylus and holder, and presumably input methods and applications that are
better suited for that stylus

- something nebulous about having warning labels and being designed for the
higher-order cognitive centers of our brains

~~~
TuringTest
What's missing in today's software is described by the field of End User
Development: the ability for normal people to craft new _software artifacts_
without the need to learn a formal programming language.

For most people, using computers means being force-fed a selection of
existing applications; and most applications are pro-consumption, which is
favoured by the media companies.

App stores (and the Linux repositories that preceded them) were a significant
advance for the public, as they allowed common people to find tools to cover
their needs; before that, only power users were able to tune the computer to
their needs.

However, non-developers still depend on the capabilities put in there by the
developers, and can't fine-tune their behaviour.

~~~
jcelerier
> the ability for normal people to craft new software artifacts without the
> need to learn a formal programming language.

The problem is not programming. It's conceiving. You can get 8-year-old kids
to do pretty advanced imperative algorithms with Scratch. What's hard is to
get people to write down on a blank sheet _what they actually want to do_
(mostly because you _think_ you want something and then when you've got this
thing you notice you actually wanted something a bit different).

~~~
sbov
You don't need to be able to write down in a blank sheet what you actually
want to do if you don't need someone else to do what you want to do.

Software developers aren't special in this sense -- even when writing our own
software we can't say up front what we want. This is why "scratching your own
itch" is so much easier than the alternative. You reduce the "is this what I
really want?" loop by an order of magnitude by cutting most of the people out
of the process.

~~~
TuringTest
_> You don't need to be able to write down in a blank sheet what you actually
want to do if you don't need someone else to do what you want to do._

This.

Why is it so hard for developers to understand this simple idea?

Life in computing would be so much easier if the people building the system
understood which part of their job is essential (finding out the best way to
arrange hardware and software to satisfy a requirement) and which part is
circumstantial (gathering a set of precise requirements to begin with).

The last part wouldn't be needed if end users were able to build their own
working prototypes to solve their needs, and engineers were simply called to
rebuild the prototype with best engineering practices.

------
GuiA
_> rather than the wheels for the mind that Steve Jobs envisioned._

Proper quote, for those confused:

 _" I read a study that measured the efficiency of locomotion for various
species on the planet. The condor used the least energy to move a kilometer.
Humans came in with a rather unimpressive showing about a third of the way
down the list....That didn't look so good, but then someone at Scientific
American had the insight to test the efficiency of locomotion for a man on a
bicycle and a man on a bicycle blew the condor away. That's what a computer is
to me: the computer is the most remarkable tool that we've ever come up with.
It's the equivalent of a bicycle for our minds.”_

~~~
hyperpallium
oblig. Bikes need road infrastructure, or at the very least, a path cleared of
trees/underbrush (though less efficient). Condors don't.

~~~
Pulcinella
Pretty sure condors need a clear path. You usually don't see them flying below
the treetops, low to the ground.

------
noam87
Reminds me of what the people of the Red language are trying to achieve:
[http://www.red-lang.org/p/about.html?m=1](http://www.red-lang.org/p/about.html?m=1)

Maybe I'll finally give it a try!

Every once in a while I daydream about a sort of "distributed app platform"
built on top of Scheme and IPFS. -- Perhaps Red would actually be perfect for
this...

~~~
detaro
> _Every once in a while I daydream about a sort of "distributed app
> platform"_

Have a peek at dat and beaker browser, they've done interesting stuff in that
regard recently.

------
ozaark
Without the vision that came from PARC, none of us would be viewing this today.
The vision they pursued was vast and it still has more to offer today.

Disappointed that many comments have dissociated this from where we are now.

Thank you Alan. Excellent read.

------
zmix
Alan Kay comes from a time when the IT business was all about the creation of
tools. The companies forming out of the 60s/70s experiments, of which he was
a main protagonist, would drive innovation according to the more abstract
needs of information management. That era ended in the 2000s.

The companies that drive technical advancement and innovation now are a
shopping mall, some telcos, family camps with an added spying bonus, and all
manner of information gatherers that do business with the worst part of our
economy, selling image rather than delivering identity. I am talking about the
advertisement business.

Just compare the Windows 10 address book to earlier versions. Back in
the old days, the address book was scriptable. You could query it from other
apps. Some third-party apps had powerful filtering. And now? It is totally
dumbed down to the most basic, primitive needs. And while we use cloud-based
address books these days, it is much more difficult to keep them neat and
ordered. I could go on for hours. Like XHTML vs HTML5, and so on.

------
jeffdavis
Simplicity is closely related to education, and education is closely related
to freedom.

How are we supposed to take education seriously when we create devices of such
complexity that they are essentially magic, and hand them to students?

Before computers, complexity always had tight upper bounds. A watch could only
get so complex before it wouldn't fit in its housing; and a car could only get
so complex before it could not be efficiently manufactured/repaired. With
computers, there is no upper bound to the complexity.

We are now overrun with complexity, but it's hard to see how to fix that. A
business that puts blind faith in a complex system to do something slightly
faster/better will probably win against a business that tries to truly
understand what it's doing. An employee who just keeps adding to the
complexity will probably win against an employee stepping back to simplify
things. A consumer who makes convenient choices about complex things without
asking many questions will probably find life easier.

~~~
TeMPOraL
Simplicity is related to education in the sense that education is about teaching
you a sequence of simple steps that lead toward the complex.
Also, I think the perception of computers being magic doesn't come as much
from their complexity as from the people who try to force the complex into
looking like it's simple. They're hiding complexity so hard the end result
doesn't make any logical sense, so it's magic.

AK had a lovely quote in the article:

"Well, a saying I made up at PARC was, “Simple things should be simple,
complex things should be possible.” They’ve got simple things being simple and
they have complex things being impossible, so that’s wrong."

~~~
jeffdavis
Education offers tools to help manage complexity -- abstract mathematics and
lab sciences help us understand the complex world around us.

But that has mostly been used to manage complexity that already existed (e.g.
physics, biology), or that is somehow inherent to our world (e.g. complexity
resulting from social interactions among humans).

Now, we are creating our own, new kinds of complexity that didn't exist
before. Software isn't there to help us manage complexity, it is _adding_ to
the complexity of our world, and dramatically so. That moves education
_backwards_.

------
thoughtsimple
Interesting that neither Kay nor the interviewer seemed aware of Swift
Playgrounds. I would have loved to hear his criticisms of that environment as
a teaching tool for kids. It seems to be targeted at 8-12 year olds given the
graphics. I would be curious as to how closely Swift Playgrounds follows
Papert’s ideas.

------
kemiller
Neglecting path dependence is the root of all evil. I mean, he’s not wrong,
but maybe the answer is just that this is fucking hard. And some of his
specifics are just wrong. The pen? No. It’s super useful in some areas, like
drawing, but if you include it by default you get lazy developers making UI
that depends on it for navigation. Apple absolutely made the right call. I'm
not taking anything away from his amazing contributions, but his habit of
slamming every incremental bit of mainstream, at-scale tech because it doesn't
fulfill his vision (yet) is not that helpful.

------
ksec
This isn't the first time Alan Kay has expressed his views on HTML / CSS and
JavaScript.

But what alternatives do we have, or did we have?

~~~
david927
That's the problem here, actually, but not in the way you think. There are
alternatives (Canvas, SVG, etc.) but they're not as good for the standard use
case of coming up with something quick that fulfills the business requirement.

And then we need to recognize that that stack is a fucking abomination that
should be burned with fire. It should so deeply embarrass us that we wouldn't
want to admit in polite company that we use it.

We need people who are willing to experiment with new ideas and give us
something better than HTML/CSS/JS. And we need them to get funding.

~~~
mbrock
What should embarrass us so deeply and why? HTML and CSS are some of the most
generative, enduring, and transformative technologies ever. I wouldn't fund
anyone who doesn't recognize how great they are!

~~~
goatlover
HTML & CSS were good for the early web, when it was a simple document layout
platform. They're not ideal for building rich UIs and applications. Just
because they've been extended to allow that doesn't mean they are the best
tool for the job. Similarly, Excel is abused to do all sorts of things it's
not intended for as well, when there are better options for databases,
financial applications and data science.

~~~
klibertp
There is a huge difference between Excel and HTML: the former is
Turing-complete (_in addition_ to containing Turing-complete scripting
language(s)) and has been for as long as I can remember, while HTML is not. I
guess CSS nowadays may be Turing-complete (?), but HTML is most decidedly not.

That difference means that it's orders of magnitude easier to "abuse" Excel to
do some things it wasn't designed to do than to do the same with HTML. I
seriously wonder if the reliance on HTML to this day is not a kind of "job
security" thing for web devs (like myself, btw)...

------
agumonkey
Just today I read that the VPRI fonc mailing list was shutting down because
they shifted focus onto another area (communication design or something like
that). A YC incubator thing, IIRC.

~~~
kennethfriedman
Hey, the FONC mailing list isn't shutting down! It's just moving! It's now
hosted at MIT instead of VPRI. It's located here:

[http://mailman.mit.edu/mailman/listinfo/fonclist](http://mailman.mit.edu/mailman/listinfo/fonclist)

~~~
agumonkey
ha good to know, I only read a few mail on that thread maybe I missed the part
where they mentionned relocation.

------
kegan
The title is ironic, given Alan Kay is the one who said "The best way to
predict the future is to invent it."

------
thomastjeffery
> let’s do what Sony did with the Walkman, let’s use this technology to make
> things more convenient in a way that a consumer can recognize, even if the
> user interface limits them.

That's a fantastic characterization of what I _hate_ about Apple, and to
contrast, what I _love_ about free software.

The sad thing (in my opinion) is that people _love_ the limitations Apple
gives them. There's no undefined behavior when the behavior is so
arbitrarily limited. People don't want to create. They just want content. I
find it rather depressing.

~~~
thomastjeffery
Also from the article:

> “Simple things should be simple, complex things should be possible.” They’ve
> got simple things being simple and they have complex things being
> impossible, so that’s wrong.

------
jeffdavis
Many of Alan Kay's concepts will take not only time, but also some kind of
catalyst to gain acceptance. What might that catalyst be?

------
pdfernhout
For me, the flip side of all of Alan's (and other's) inspiring work on
Smalltalk is that, having started programming with Smalltalk in the late 1980s
(especially with VisualWorks and OTI's Envy), my entire professional career
since then has seemed like a long succession of dealing with less pleasant
systems around people who just don't get it. That has been frustrating and
painful (even if the money is good) -- in some ways, it might be better not to
know what is possible because then you don't miss it.

It's sad that ParcPlace Systems made such a mess of commercializing
ObjectWorks/VisualWorks Smalltalk (including not licensing it to Sun
affordably for settop boxes, where Sun then reacted by creating
Oak/Green/Java) and then they sold Smalltalk for a song to Cincom instead of
open sourcing it and becoming a services company. I remember talking with one
ParcPlace salesperson who kept going on about how Digitalk was their
competitor, with my trying to point out that Visual Basic was really their
competitor (and ParcPlace buying Digitalk was another part of the disaster for
Smalltalk commercially).

It was also sad to see IBM abandon _two_ reliable fast Smalltalks it owned in
favor of a buggy slow early Java for marketing reasons. And even working
within IBM at IBM Research around 1999, I could not use Smalltalk because
IBM's OTI subsidiary (which IBM had bought) wanted around US$250K a year
internally just to let the embedded speech research group I was in try their
embedded Smalltalk for the product we were making. They claimed they'd have to
devote an entire support person to it -- and I tried to explain I was going
out on a limb there as it was to suggest using Smalltalk instead of C and
VxWorks (which my supervisor had already licensed). So instead I got IBM
Research to approve Python for internal use (which took weeks of dealing with
IBM's lawyers) and Python, not Smalltalk, ended up on Lou Gerstner's desk when
he asked for one of our "Personal Speech Assistant" devices for his office.

I really learned to dislike proprietary software from that experience of
seeing something I loved like Smalltalk being killed by a business model
focusing on runtime fees and such. I remember how when an IBM Research group
licensed their Jikes Java compiler as FOSS, their biggest surprise was not how
many external users they had -- but how many internal IBM users they suddenly
had who no longer had to jump through complex hoops to gain access to it.

Even now, Java, JavaScript, and Python are not quite where Smalltalk was back
then in many ways. Of course, it's decades later, so there are other good
things like faster networks and CPUs and mobility and DVCS and above all a
widespread culture of FOSS -- and when I squint, I can kind of see all the
browsers out there running JavaScript as a sort of big multi-core Smalltalk
system (just badly and partially implemented).

For a while I had hopes for Squeak, but it got bogged down early on by not
having a clear free and open source license. Same as happened recently when
Automattic used React with a PATENTS clause: when I pointed out the concern,
some lawyer chimed in with how everything was fine -- and only years later did
the growing damage to the community become obvious to all.
[http://wiki.squeak.org/squeak/501](http://wiki.squeak.org/squeak/501)
[https://github.com/Automattic/wp-calypso/issues/650](https://github.com/Automattic/wp-calypso/issues/650)

I really learned to dislike unclear licensing on a community project from that
early Squeak experience.

Squeak continues to improve and has fixed the licensing issue, but it lost a
lot of momentum. Also, the general Squeak community remains more focused on
making a better Smalltalk -- not on making something better than Smalltalk. That is
something Alan Kay has pointed out -- saying he wanted people to move beyond
Smalltalk with Squeak as a stepping stone. Yet most people seem to not pay
attention to that.

But if even Dan Ingalls could be willing to build new ideas like the Lively
Kernel on JavaScript, I decided I could too, and I shifted my career in that
JavaScript direction inspired by his example.

As luck would have it, my day job is now programming in the overly-complex
monstrosity that is Angular, when I know the more elegant Mithril.js is so
much better from previous experience using it for personal FOSS projects... I
guess every technology has its hipster phase where decision making is more
about fads than any sort of technical merit or ergonomics? I can hope that
sorts itself out eventually. But once bad standards get entrenched either in a
group or the world at large, it can be hard to move forward due to sunk costs,
retraining costs, retesting costs, and so on. But, thick-headed as I am, I
keep trying anyway. :-)

And as in another comment I made, there are important social/political reasons
to keep trying to create better systems to support democracy:
[https://news.ycombinator.com/item?id=15311950](https://news.ycombinator.com/item?id=15311950)

------
dkarl
_If people could understand what computing was about, the iPhone would not be
a bad thing. But because people don’t understand what computing is about, they
think they have it in the iPhone, and that illusion is as bad as the illusion
that Guitar Hero is the same as a real guitar. That’s the simple long and the
short of it._

Some kids learn guitar. Some kids fantasize about learning guitar. Apple makes
machines that enable the computing equivalent of both of these things.

You're never going to turn all listeners into musicians, all readers into
writers, all theater-goers into actors. You can't even get everyone to dance
at a wedding with an open bar. Technology can play a role in getting a few
more people over the line, but Alan Kay's head is way too far in the clouds
for him to even see the problem that way.

Honestly, I think what he's done is dissociated himself from his own snobbery.
He talks about technology failing humanity, but he's really saying he doesn't
like people. He doesn't like how most people think and live. Why else would he
constantly be disappointed in technology failing to turn all of humanity into
always-on cultural creatives? It's like he looks at books and says, well, the
dream for books is for everyone to read challenging literature, but at the
beach or on the subway you see mostly "Eat Pray Love" and Stephen King, so
clearly books have failed.

People are doing the same dumb shit with books and technology that they did
without it. They choose passive consumption. When I complain about it, I'll
blame it on technology so I don't sound like a misanthrope.

He constantly talks about creating, learning, and education, but he doesn't
bring any attention to any current work supporting these activities. He
doesn't say, look, here's this great work being done for kids, and it's
limited because the iPad needs this feature. He doesn't say, look, here's this
great computing system I use to support my own work, and it would be a lot better
with this specific kind of OS or hardware support. Instead of talking about
how the Apple hardware is holding back current work, he's talking about how it
fails to implement his own ideas from thirty years ago.

You'd think he could find SOME new idea to promote and celebrate, if only to
point out what he thinks progress looks like, but apparently Apple has snuffed
out every worthwhile avenue of improvement by a few unfortunate hardware
choices.

Hell, where's even the usesthis interview with Alan Kay?

~~~
scroot
The topic is Apple because the interviewer is a journalist who was writing a
book about the iPhone. Plenty of other resources online show him praising work
he likes (Bret Victor, for example) and systems that he thinks are in the
right direction. It just wasn't in the bounds of the interview topic.

As for the "not turning all readers into writers," well, what can we say? One
would hope that all readers are literate enough to write. Now, we don't expect
all literate people to be professional writers, but this is not in line with
what Kay is arguing anyway. He's saying that being literate in computing
should mean being able to wield computing in order to aid human thinking --
for everyone. He's calling out our current systems, saying that their
availability and use, though ubiquitous, do not constitute such literacy.

~~~
TeMPOraL
In particular, even though we don't expect all literate people to produce
professional-quality writing, we do hope they can use their writing skills to
solve their own problems. Be it filling forms, writing letters, posting "lost
cat" messages on trees, _whatever_.

That's what I think computing should be: giving people the ability to solve
even more complex problems themselves, even more effectively.

------
pmoriarty
I kind of feel sorry for Alan Kay. He's trapped in endlessly rehashing ideas
from the '60s and '70s.

~~~
jfoutz
What new ideas are there though? Social networking I guess. Have you seen the
mother of all demos? Engelbart's team invented, like, everything.

On the programming side, dependent types maybe. Everything else is from ML or
Lisp.

Neural networks are old old ideas as well. Computers have finally gotten fast
enough to make use of the ideas.

~~~
thomastjeffery
> On the programming side, dependent types maybe.

Have you learned about monads yet?

------
norswap
For all the respect and admiration I have for Alan Kay, I think he's often
showing a lot of bad faith.

His ideal vision is always described in very abstract terms. If he took the
care to write down precisely what it should look like, I'm sure there would be
hundreds eager to build it for him. Of course, part of it is "design
principles". But then work through some common use cases.

When the abstraction meets reality we get comments like "the iPad should have
a pen". Which is really an interesting observation, but it doesn't quite thrill
the crowd like the abstract impressive-sounding stuff does. And you'd hardly
call these isolated observations a "vision".

Clearly, the user interfaces we have today are sub-optimal, and I can make
hundreds of factual remarks on what could be improved. But I don't think this
works out to a "vision". In fact, a vision is, I think, sometimes bound to be
too abstract. If we just had products where the shit was fixed relentlessly, I
reckon we'd get to "good" without needing one.

~~~
bitwize
On Hacker News there was once a skeptic of postmodernism who got into a debate
with a Derrida fan. The skeptic said "If deconstruction is not BS, how about
you explain to us, in plain English, what it's all about." The Derrida fan
replied "No. How about _you_ demonstrate that you've read Husserl, Heidegger,
Hegel, Nietzsche, Freud, and Levinas. _Then_ we can begin to talk about
Derrida."

If Alan Kay were like the Derrida fan, then he could have rebuffed anyone who
came asking what the Dynabook was about with "How about _you_ demonstrate that
you've studied the achievements of McLuhan, Montessori, Papert, Minsky,
Engelbart, etc. Then we'll talk." Of course he's not like that so he's going
to try to give the reporter a bit of the background he has in those things and
let them figure it out for themselves.

The Dynabook is supposed to be, roughly, an iPad with a keyboard, stylus, and
a Lisp Machine-like OS. Of course Apple and third parties have closed the gap
with keyboards and styli but it's the OS that's the really important bit
because in order for a Dynabook to be a Dynabook it must not be a thing that
you can just download and run preapproved apps on. You have to be able to
discover -- and change -- the entire system from the bare metal on up. Think
"Stallmanism on steroids".

It's like a television you can talk back to. Computing is a dynamic medium
that demands interaction from its user, so to deny the user the ability to
question and change the assumptions of the medium is to lock them in to
certain stereotyped modes of interaction and deny the full richness of the
medium. This is where Kay's "real guitar vs. Guitar Hero" analogy comes in.
You can either have a device that offers its full potential up front, like a
real guitar, or you can have a device that can _only_ be used in circumscribed
ways, in vague imitation of having the full richness of the original device,
like Guitar Hero. The price you pay for having a real guitar, or a real
Dynabook, is what we call "ease of use" -- the ability for a complete
neophyte to pick it up and immediately display something resembling
proficiency with it. (It is _much_ easier for a trained data-entry clerk to
operate a 3270 terminal than it is for that same clerk to fill out a modern
Web form, but that sort of ease doesn't count.) And Apple has really doubled
down on ease of use, at the expense of everything else wonderful about having
a powerful computing device you can hold in one hand. To the point where it's
become just like regular, can't-talk-back-to-it television.

~~~
digi_owl
The basic problem there is that no company in the industry, from Apple all the
way down to the lowliest gray-box manufacturer in China, is interested in that.

OLPC tried; it got hoodwinked by Microsoft et al.

Mozilla tried; FirefoxOS bombed.

Hell, even Linux is turning ever more complex and opaque year by year.

The simple thing is that such flexibility and self-support do not sell widgets
and support contracts, and thus companies are not interested in making them.

~~~
bitwize
Well, yeah. For the benefits to be realized you need a society that values
that sort of thing. It will look quite a bit different from our society, just
like societies with writing are completely different from entirely oral
societies -- something else Kay touches on.

~~~
digi_owl
In effect Star Trek lala land...

------
cocktailpeanuts
I have total respect for Alan Kay but I hope he stops criticizing other people
and actually shows his vision through his own execution (and by execution I
don't just mean invention of a prototype but actual productization and
distribution).

If you're building something visionary and are not the one who's making tons
of money off of it, you have two solutions:

1. IF you're interested in making money: Try to figure out what you're doing
wrong and fix it so you actually CAN capitalize on your original invention.

2. IF you're NOT interested in making money: Enjoy the respect these people
pay you and please don't badmouth others who worked hard to build important
products that changed the world.

I feel like Alan Kay is in the second boat, so my heart breaks whenever he
badmouths other people who worked hard to build cool things.

It doesn't matter what came first. What matters is what actually managed to
pull off the ultimate success.

~~~
loup-vaillant
> (and by execution I don't just mean invention of a prototype but actual
> productization and distribution)

So, the STEPS project I mentioned above¹ doesn't count? Bret Victor's work
wouldn't count? That's incredibly short sighted.

You seem to have missed the part where it's really _really_ hard to be
successful with unnatural stuff, even when that stuff is unbelievably useful
and valuable. Writing is unnatural, and took a long time to spread. It's still
not there yet, and with the television and YouTube, we're actually reverting
back a little bit.

Computing is even less natural, so instead of a Victor-esque interactive
wonder, we got the caveman interface, where you point and you grunt².

Speaking of badmouthing, my heart breaks every time someone implies that
recent Apple devices are cool. They're dystopian demons that seduced people
into thinking digital prisons are cool. The Apple ][ was cool. The
iPad is crap (it _could_ be cool, with the right OS).

[1]:
[https://news.ycombinator.com/item?id=15266230](https://news.ycombinator.com/item?id=15266230)

[2]:
[https://www.youtube.com/watch?v=uKxzK9xtSXM&t=2m18s](https://www.youtube.com/watch?v=uKxzK9xtSXM&t=2m18s)

~~~
cocktailpeanuts
> So, the STEPS project I mentioned above¹ doesn't count? Bret Victor's work
> wouldn't count? That's incredibly short sighted.

It is incredibly short sighted to make an assumption about what I said and
criticize based on that wrong assumption. There's a term for this logical
fallacy but I forget it.

Anyway, there's a huge difference between coming up with an idea/prototype and
turning it into a product that's meant for mainstream usage. You would only
know this if you have actually built something meant for mainstream usage, so
if you don't get it that's fine, but try doing that next time and you'll
realize it's true.

What I'm criticizing is his comments on Apple's implementation decisions on
things like the stylus. There are plenty of reasons why they decided to go
that way, and I totally understand if this criticism was coming from some
ignorant journalist who hasn't built anything (because they're ignorant), but
Alan Kay should know better. There are way more things to consider than just
ideals when you're building a real product.

This is probably why Xerox PARC built so many innovative prototypes but most
of them were commercialized by other companies. If they had had people like
Steve Jobs instead, Xerox would be Apple by now.

If you think I'm being overly critical, just watch some of his talks on
YouTube; too much of what he says is based on the past and not the future.

~~~
loup-vaillant
> It is incredibly short sighted to make an assumption about what I said and
> criticize on that wrong assumption.

Are you saying I was mistaken and the STEPS project and Bret Victor's demos
_do_ count after all? Even though they're prototypes that haven't been
"productised and distributed"?

I merely assumed you meant what you wrote. If you didn't, don't complain.

> _What I 'm criticizing is his comments on Apple's implementation decisions
> on things like the stylus._

Come on, that criticism was spot on. The iPad could have a spot for the pen.
The pen could be tethered to the iPad. Apple could have sold a pen, like they
now do. _The Nintendo DS ships with a freaking pen_.

I'm reluctant to dispute your point in general, but that was a bad example.

~~~
cocktailpeanuts
> The iPad could have a spot for the pen. The pen could be tethered to the
> iPad.

Yes they could, but I really think this is subjective. In my case I prefer NOT
to have a spot, for aesthetic reasons, plus I never use the stylus.

But like I said, this is subjective. And that's exactly the point. You can't
really judge execution based on subjective qualities like this. It's
impossible to satisfy everyone.

It's just a difference in vision. People have different priorities. For
example the Bitcoin community right now is divided into two based on their
philosophy, and you can't say one is right and one is wrong because each has
their own vision and each side makes sense in their own right.

I'm sure Alan Kay had his own vision about what a computer could become. I
think if he's so dissatisfied with what it's come to, the best approach is to
build one himself and give it to the world. (But honestly, as someone who's
been thinking a lot about this field, I see so many "plot holes" in all the
criticism he's making)

I am all for visionaries building inspiring products and shipping to market.
But it's sad to see people like him dwelling on the past instead of the
future.

------
scottlegrand2
The ad layout on mobile makes this article unreadable on my phone to the point
of rage quitting halfway through or so. How appropriate somehow...

~~~
digitalengineer
[https://www.mozilla.org/nl/firefox/focus/](https://www.mozilla.org/nl/firefox/focus/)

------
johansch
I have seen this pattern in other people with interesting but narcissistic
personalities:

"Steve wasn’t capable of being friends. That wasn’t his personality."

------
anotherbrownguy
I hope someone could tell Alan Kay about the Galaxy Note. It has a really nice
pen and a place to put it.

------
c3d
I recognize the Alan Kay I met at HP. Very sure of his own genius and quick to
snarkily dismiss the genius of others.

Case in point: the iPad and iPhone make kids creative like no other tool has
before. Kids know how to use them to text, exchange ideas or pictures of
kittens, draw or sketch, learn how to program (Swift Playgrounds, anyone?),
take and edit photos. It is everything but a glorified TV where kids are
passive. This is the only environment where kids are truly first-class
citizens. How on earth Kay could spend so much energy (a whole article) trying
to convince us otherwise, based mostly on super-weak arguments like "nobody is
using the shake gesture to undo," is truly beyond my understanding.

Don't get me wrong. The Dynabook was a somewhat interesting idea. But it's
about as much an inspiration for the real thing we use daily as the Star Trek
tricorder. What Kay, to this day, did not get, unlike Jobs, is the power of
networking, both physical and human. What made the iPhone greater than the
Dynabook ever would have been is the App Store, and the combined creativity of
millions of developers. What made the iPhone greater is apps like Messages,
Facetime, Facebook, Twitter, Whatsapp, and countless others. What made the
iPhone greater is Instagram, Flickr, sport apps, medical apps, HP-48
emulators, 3D "kill them all and think later" games.

By contrast, what would have made the Dynabook great in Kay's mind was
"designed by Kay for Kay's personal interests". Or, as Fake Bill Gates would
have said: "One programmer should be enough for everybody".

~~~
icebraining
_What Kay, to this day, did not get, unlike Jobs, is the power of networking,
both physical and human. What made the iPhone greater than the Dynabook ever
would have been is the App Store,_

Interesting phrasing, considering that the iPhone did not come with the App
Store. In fact, Apple's was the _third_ software distribution platform to be
released for iOS, after Installer.app and Cydia.

Meanwhile, Alan Kay had been working on Croquet[1] -- a project that allows
the creation of multi-user collaborative software with live editing and
automatic distribution and replication -- since at least 2001.

[1]
[https://en.wikipedia.org/wiki/Croquet_Project](https://en.wikipedia.org/wiki/Croquet_Project)

~~~
coldtea
> _Interesting phrasing, considering that the iPhone did not come with the App
> Store. In fact, Apple's was the third software distribution platform to be
> released for iOS, after Installer.app and Cydia._

Which is irrelevant.

First, because it eventually came, and second, because whether they intended
to put one out or not, obviously they couldn't create a first-class API for
third-party use, fix public interfaces as much as possible, write
documentation, create a store, et al., all while creating the first edition of
a new product (which they didn't even know would be a success or not).

------
kerkeslager
Pop-up warning.

------
cerealbad
Writing diminished thinking and reasoning capacity; computers continued
offloading cognitive functions to adaptive machines. "Dig up, stupid!" is a
fitting epithet.

------
paulsutter
Elon didn't wait, he just made it happen

~~~
jlg23
I don't know Elon Musk personally, but I'm pretty sure he'd refer to Alan Kay
as one of the giants on whose shoulders he and quite a few industries stand.

