
The Xerox Alto, Smalltalk, and Rewriting a Running GUI - kens
http://www.righto.com/2017/10/the-xerox-alto-smalltalk-and-rewriting.html
======
igouy
> _If you're used to Java or C++, this object-oriented code may look strange,
> especially since Smalltalk uses some strange characters._ <

It's Smalltalk-76 that uses some strange characters.

The version that made it out of the lab into the wide world was Smalltalk-80.

 _edited_ With help from Paolo Bonzini, Smalltalk-80 syntax for that square
root code would be something more qwerty-friendly like:

        sqrt
           | guess |
           (self <= 0.0) ifTrue: [
              (self = 0.0) 
                 ifTrue: [^0.0] 
                 ifFalse: [self error: 'sqrt invalid for x < 0.']
              ].
           guess := self timesTwoPower: self exponent // -2.
           5 timesRepeat: [guess := (self - (guess*guess)) / (guess*2.0) + guess].
           ^guess

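For anyone who reads Python more fluently than Smalltalk, here's a rough translation of that method (my own sketch, not from the article; Python's `math.frexp`/`math.ldexp` stand in for Smalltalk's `exponent` and `timesTwoPower:`):

```python
import math

def st_sqrt(x):
    """Rough Python translation of the Smalltalk-80 sqrt method above
    (Newton's method with a power-of-two initial guess)."""
    if x <= 0.0:
        if x == 0.0:
            return 0.0
        raise ValueError('sqrt invalid for x < 0.')
    # self timesTwoPower: self exponent // -2
    # Smalltalk's Float>>exponent is frexp's exponent minus 1;
    # // is floored division in both languages.
    e = math.frexp(x)[1] - 1
    guess = math.ldexp(x, e // -2)
    # 5 timesRepeat: [guess := (self - (guess*guess)) / (guess*2.0) + guess]
    for _ in range(5):
        guess = (x - guess * guess) / (guess * 2.0) + guess
    return guess
```

The scaling trick gives a starting guess within a small factor of the true root, so five Newton iterations are enough to converge to full double precision.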
"I Can Read C++ and Java But I Can’t Read Smalltalk" pdf

[http://carfield.com.hk/document/languages/readingSmalltalk.p...](http://carfield.com.hk/document/languages/readingSmalltalk.pdf)

~~~
uytuijhgj
From your link:

 _Finally, let’s insist that the separators be part of the method name; i.e.,
let’s require that the name of the method be “rotate by: around:” and let’s
get rid of the spaces to get “rotateby:around:” as the name and finally, let’s
capitalize internal words just for readability to get “rotateBy:around:”. Then
our example could be written

rotateBy: a around: v //This is Smalltalk_

This is also what Objective-C / Apple does.
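To make the selector-naming rule concrete: the keywords interleave with the arguments, and the concatenated keywords (colons included) form the method's name. A little illustrative sketch in Python (`selector` is just an illustration, not a real API):

```python
def selector(*keywords):
    """Build a Smalltalk/Objective-C style selector name from its keyword parts."""
    return ''.join(k + ':' for k in keywords)

# The message "rotateBy: a around: v" has two keywords and two arguments;
# its name is the concatenation of the keywords:
name = selector('rotateBy', 'around')  # 'rotateBy:around:'
```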

~~~
igouy
> This is also what Objective-C / Apple does.

Inspired by Smalltalk-80.

See "A short history of Objective-C"

[https://medium.com/chmcore/a-short-history-of-objective-c-
af...](https://medium.com/chmcore/a-short-history-of-objective-c-aff9d2bde8dd)

------
shalabhc
An easy way to try ST-72 is in your browser here: [https://lively-
web.org/users/Dan/ALTO-Smalltalk-72.html](https://lively-
web.org/users/Dan/ALTO-Smalltalk-72.html) and ST-78 here: [https://lively-
web.org/users/bert/Smalltalk-78.html](https://lively-
web.org/users/bert/Smalltalk-78.html)

The above links run emulators in Javascript and have been used for live demos
as well (see
[https://youtu.be/AnrlSqtpOkw?t=2m29s](https://youtu.be/AnrlSqtpOkw?t=2m29s)
for a fun one)

Related, for a Javascript based live system check out [https://www.lively-
kernel.org/](https://www.lively-kernel.org/) (also created by Dan Ingalls).

~~~
robertkrahn01
Thanks for posting the links.

The Lively project evolved over time:

\- [https://www.lively-kernel.org](https://www.lively-kernel.org) is from the
Sun Labs / HPI days (check out the ancient [http://sunlabs-kernel.lively-
web.org](http://sunlabs-kernel.lively-web.org), fully SVG based rendering :D)

\- Lively Web: [https://lively-web.org](https://lively-web.org) A live,
programmable wiki (2012-2015)

\- Since 2016 we are working on lively.next: [https://lively-
next.org](https://lively-next.org). lively.next will focus more on the
"personal environment" aspect.

~~~
detaro
There also seems to be
[https://github.com/LivelyKernel/lively4-core](https://github.com/LivelyKernel/lively4-core)

------
flavio81
You know your article was worth writing when my idol, Sir Alan Curtis Kay,
Smalltalk inventor, takes the time to comment on it!

~~~
mindcrime
Fun note for those who weren't aware: he's been known to post here at HN from
time to time as well:

[https://news.ycombinator.com/threads?id=alankay1](https://news.ycombinator.com/threads?id=alankay1)

~~~
flavio81

        You
            made
                my
                    DAY!

------
maxpert
It's amazing to see how little the concepts have changed in the desktop world
since the Xerox Alto. The only different concept I saw a few years back was
10gui ([http://10gui.com/](http://10gui.com/)). With VR, MR, and AR coming in,
we need such innovation again.

~~~
PrimHelios
I had some reservations at first, but that actually seems incredibly useful.
I'd love to be able to try it.

~~~
KGIII
Yeah, worth watching the video. The musical accompaniment was terrible and
louder than it should have been, but the concept is nice.

At the end of the video, they show it where the trackpad would be, which makes
me think there might be a size issue. They also show it with a regular
keyboard, and I'm not sure everyone would require one. I do know that people
like tactile keyboards, but some folks have managed to function without one.
So, a model where the touchpad is also the keyboard might be cool.

I touch-type, but maybe the trackpad could also be some sort of secondary
display? I could see that coming in handy, perhaps even to display the above-
mentioned keyboard for those who can't touch-type.

Off-topic: HN is one of the few sites where I regularly click on links in the
comments. On most sites, that's not a very productive activity. On HN, it is
often interesting and educational.

------
gumby
Smalltalk was the first dynamic UI environment, but within a few years you
could do this on Lisp machines as well (both the Xerox D machines, in either
Smalltalk or Interlisp modes, and the MIT CADR Lisp machines). Which is to
say, the Smalltalk environment was influential both at the time and later.

~~~
pjmlp
If you read the Xerox PARC papers, there was a lot of shared work between the
Interlisp-D, Smalltalk and Mesa/Cedar teams.

Actually, some of the REPL and debugging features in Mesa/Cedar were added
because they wanted to appeal to the Interlisp-D and Smalltalk users while
offering a strongly typed development environment.

~~~
gumby
Indeed I worked at PARC (at ISL, using Interlisp) and used all three
platforms.

This article was about the '73 Alto implementation though, which preceded the
D machines (and preceded my time at PARC by a decade as well). At that time
Interlisp was PDP-10 only and had no GUI.

~~~
pjmlp
Yeah, I should have paid attention to the nickname. :)

------
igouy
> _Smalltalk-80 in turn led to modern Smalltalk systems such as…_ <

Such as:

\- the direct commercialization of that work (by Xerox PARC spin-off ParcPlace
Systems) as ObjectWorks and then VisualWorks (now Cincom Smalltalk)

\- IBM Smalltalk on mainframe and mini computers
[http://www-01.ibm.com/support/docview.wss?uid=swg27000344&ai...](http://www-01.ibm.com/support/docview.wss?uid=swg27000344&aid=1)

\- HP Distributed Smalltalk
[http://www.hpl.hp.com/hpjournal/95apr/apr95a11.pdf](http://www.hpl.hp.com/hpjournal/95apr/apr95a11.pdf)

\- Gemstone [https://gemtalksystems.com/](https://gemtalksystems.com/)

------
bsaul
i wonder what today's equivalent of these pioneering technologies is...

biotech gene editing ? quantum computers ? artificial intelligence ? Cars ?

Is there someone somewhere discovering the ubiquitous technology of the next
decades? Does computer science still have the potential to bring the same kind
of world-changing tools?

Here's something i'd like to ask Alan Kay: at that time, you probably had the
feeling that you were working on groundbreaking technologies, but what were
the magazines saying? I'm pretty sure some people were thinking that
computers were going to play a role, but was this obvious to everyone? Did
people hesitate between funding CS and, say, flying cars?

~~~
KGIII
I am not he, but I may be able to offer some history/perspective.

There was a real fear of computers back then, much like the fear of AI today.
They worried that the 'big brain' would put people out of work and take over
the world. Multiple movies were made where the computer was the antagonist,
including the famous 2001: A Space Odyssey.

The first computer I ever touched was called a calculator and I've heard a
couple of reasons for this. The official reason was that people expected
computers to be big and the HP 9100A wasn't very big by standards of the day.
The second, and unofficial reason, was that it came out in the late 1960s,
which was at a time when people had a real fear of computers.

If you're curious, I believe the model I used was the 9100B. It supported
magnetic strip cards and punch cards to store your algorithm. You'd load it
into the memory by swiping the card or inserting the punch card. It would
output to a TV and there was a plotter option that functioned pretty much like
a modern plotter. There was also a dot-matrix printer included that was like
what was on the older POS systems for receipt printing.

Anyhow, they called it a calculator instead of a computer. People had notions
of what a computer was supposed to be and this was not that.

At the time, I didn't really appreciate it for what it was. I think my first
exposure was in 1971, and it was used almost exclusively in the physics labs.
It wouldn't be used in the astronomy labs until later, after the observatory
was completed and the telescope put in.

This was all at Kents Hill, no apostrophe, if you'd like to learn more about
it. It's a boarding prep school located in Central Maine. As I was leaving, we
got a mainframe connection to Dartmouth, which was a 'real' computer, though I
never played with it.

So, largely people didn't seem to envision much for the future of computing,
at least not at the personal level. There were exceptions, some visionaries
who saw the potential, but the idea of a personal computer wouldn't really
exist until later in the decade. Frankly, nobody saw much use for one.
Computers were pretty useless without a great deal of domain knowledge; to use
one, you were almost required to know how to program it. Specialty software
was the norm, and interaction was otherwise limited.

For the most part, games were also limited to hardware-specific devices,
either consoles or, eventually, arcade machines that were not general-purpose
computers. People really didn't have much use for a computer in their homes
and, as mentioned above, there was a real fear of computers.

YouTube has a copy of the HP 9100 promotional video. I can dig it out, if you
want. It was pretty remarkable but I wasn't nearly as impressed by it as I
should have been. So, I don't really think that the future of computing was
obvious to everyone. In fact, I'd say the opposite was true and that it was
only obvious to a very limited set of visionaries.

Edited to clean up some verbiage and add a wee bit more information about the
9100.

Edit again. If you have 21 spare minutes and would like to learn about the HP
9100, you can click this link:

[https://www.youtube.com/watch?v=Ki1Inux1_wU](https://www.youtube.com/watch?v=Ki1Inux1_wU)

(I figure some folks might actually be curious about the tech of the day. That
was pretty state-of-the-art back then and far more compute power than most
people would be exposed to for the next decade and a half, if not longer.)

~~~
d-crane
Thank you for the historical info, and the video; as a big computing history
buff, I really appreciate both the first-hand account and the media of the
time!

~~~
KGIII
I'm always happy to help. I get a great deal from HN, lots of things to learn
and smart people to answer questions. It makes me happy when I'm able to give
a return contribution.

If, for some reason, you have further questions, uninvolved@outlook.com is the
email address that I hand out in public. I tend to write long posts and
replies, so you have been officially warned.

------
GarvielLoken
"Smalltalk is a highly-influential programming language and environment that
introduced the term "object-oriented programming" and was the ancestor of
modern object-oriented languages." Funny way to spell Simula.

------
andreasgonewild
Pharo ([http://pharo.org](http://pharo.org)) is also worth mentioning on the
subject of modern Smalltalk environments.

~~~
kens
Ok, I've added Pharo to the article.

------
yipopov
Having seen the Smalltalk environment leads me to conclude that modern IDEs
aren't integrated at all. It's a scam.

------
fairpx
It's fascinating how GUIs haven't changed much since then. I run a UI design
agency (we work mostly with startups) but would love to collaborate with
someone who's working on a (niche?) OS and see if we can redesign the UI. Open
source is fine. If you're working on something, hit me up. Details in my bio
:)

~~~
seanmcdirmid
You can redesign the UI using Canvas; no need to do it at the OS level. It is
hard to get past the standard WIMP GUI, though. Natural UIs (NUIs) were tried
a decade ago, but they were found to not actually be more usable.

~~~
fairpx
I’ve been following OS UI experiments for some years and the problem with all
of them was that they took an approach that sounded cool but wasn’t really
practical. From 3D desktops to strange card based interfaces that wanted to
replicate the messy physical desk. I think the key is to redesign the GUI by
removing stuff instead of adding fancy new ways of shuffling files.

~~~
setr
I imagine the big issue is that the underlying model isn't actually being
changed to support the new UI (even by a wrapper layer), so you're just doing
shallow edits to the "skin" of the OS. The problem is that the underlying
model and the WIMP model have been co-evolving, and likely aren't very open to
other fashions of interaction.

Presumably, the way to beat it out is to figure out how to change both to
instantiate an entirely different interaction model, and thus naturally a new
UI.

Otherwise you can really only commit to incremental improvements, but nothing
particularly interesting.

I.e., Smalltalk _requires_ an entirely different underlying model to support
its in-place-editable UI model; it cannot naturally be built (though maybe
hacked together) if you try to avoid that shift.

~~~
mncharity
The big incoming UI challenge is XR. But as with ST/Alto, it's an integrated
bootstrap problem of hardware, system design, foundational software, user
software, _and_ UI.

Absent research labs, we wait on market availability for hardware. For the
modern analogues of mice and bitmapped screens. Eye tracking may be cheap next
year, but now it's $2k. High-resolution HMDs may be $10k next year, but now
they're simply unavailable. Hand tracking is flaky. Haptics are "it buzzes".

For software, we're so used to the 2D windows interface that it's easy to
forget how much foundational work was needed, e.g. around typography,
Smalltalk, and live coding. Hand-gesture recognition, spoken dialog
management, and 3D constraint-based layout are just a few modern analogues.
And so on.

So it looks like we're going to be doing broad-based innovation and
infrastructure construction, gated on hardware availability. And plagued by
patents.

It's not clear to me what can usefully be done to explore UI in the meantime.
You can do wireframing in Tilt Brush, but not really. It doesn't give you the
_feel_ of, say, hand-attached controls, and of automated collaboration in
general. The "oh, it _feels good_ to present this graph as a sphere, and hold
it in your lap and peel off layers". So...

Perhaps do UI design by paper improv? A crew scribbling on and holding up
paper, hovering around a "user", who is managing a todo list? :) That could be
fun.

~~~
seanmcdirmid
VR has a huge input problem: you can't touch any of the 3D stuff; you are
basically left waving your hands in the air. Bret Victor's lab is going in the
other direction, toward tangible objects with digital projections.

~~~
mncharity
> tangible objects with digital projections

You can also use tangible objects with XR. It might be useful for sorting
things, like system subparts, and thinking with your hands. You can fit camera
tracking markers on 1 cm wooden cubes.

> you can't touch any of the 3D stuff

The current controller "tock" impulse haptics can be surprisingly effective.
Your brain fills in a lot. But no, no force feedback or individual finger
haptics is mass market yet.

~~~
seanmcdirmid
I'm not as pessimistic as Bret Victor is about this (see
[http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...](http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/)),
but ya, haptics is an obvious solution.

I think voice input will be even more important. Once we get conversational
UIs going, XR will work much better with some pointing and voice input (as in
Iron Man).

~~~
mncharity
> I'm not as pessimistic as Bret Victor [...] ABriefRant

I'd actually go one step further than that "rant".

In the real world, small physical motions usually have little control
significance. Mouse movement and keyboards are among the many exceptions. With
XR's hand tracking, that can change. For example, while gaming is focused on
"feeling like it's real", so game hands overlap real hands, for getting work
done, it's nice to have stretchy arms, so you can reach across the room to
grab things. For optimal long-term ergonomics, you want a mix - sometimes you
want the exercise of waving your arms, and sometimes you want to achieve the
same end by merely twitching a hand resting on a knee or keyboard.

So not only did the video reduce all of human motion vocabulary, including
gaze and voice, down to a single finger over a sad UI, but even that single
finger was used in an impoverished manner. ;)

> XR will work much better with some pointing and voice input

Eye tracking as well. "Look; point; click" has a seemingly redundant step.

When someone is watching an educational video, say IBM's atom animation[1],
you want both to be able to field common questions like "What are the
ripples?" and to notice that they are looking at the ripples and volunteer
related content.

One can do prototyping now with WebVR, Google speech recognition, and in-
browser or Google voice synthesis.

[1]
[https://www.youtube.com/watch?v=oSCX78-8-q0](https://www.youtube.com/watch?v=oSCX78-8-q0)

~~~
WorldMaker
> Eye tracking as well. "Look; point; click" has a seemingly redundant step.

> When someone is watching an education video, say IBM's atom animation, you
> want to both be able to field common questions like "What are the ripples?",
> but also notice that they are looking at the ripples, and volunteer related
> content.

This is why I think the gaze tracking in the Windows MR platform (which lights
up, quite literally, in Fluent design applications in the just-released FCU)
is more important than people yet realize. It's been a part of HoloLens demos
from the beginning, but it's interesting how many people haven't noticed yet.

