
Duckspeak vs. Smalltalk: Decline of the Xerox PARC Philosophy at Apple (2011) - panic
http://dorophone.blogspot.com/2011/07/duckspeak-vs-smalltalk.html?view=classic
======
DonHopkins
Here's some stuff I wrote about a HyperCard-inspired system called HyperLook
(nee HyperNeWS (nee GoodNeWS)) and some stuff I developed with it:

SimCity, Cellular Automata, and Happy Tool for HyperLook (nee HyperNeWS (nee
GoodNeWS))

HyperLook was like HyperCard for NeWS, with PostScript graphics and scripting
plus networking. Here are three unique and wacky examples that plug together
to show what HyperNeWS was all about, and where we could go in the future!

[https://medium.com/@donhopkins/hyperlook-nee-hypernews-nee-g...](https://medium.com/@donhopkins/hyperlook-nee-hypernews-nee-goodnews-99f411e58ce4)

Some highlights:

The Three Axis of AJAX, Which NeWS Also Has To Grind!!!

NeWS was architecturally similar to what is now called AJAX, except that NeWS
coherently:

…(drum roll)…

1) Used PostScript CODE instead of JavaScript for PROGRAMMING.

2) Used PostScript GRAPHICS instead of DHTML and CSS for RENDERING.

3) Used PostScript DATA instead of XML and JSON for DATA REPRESENTATION.

The Axis of Eval: Code, Graphics and Data

We will return to these three important dimensions of Code, Graphics and Data
as a recurring theme throughout this article. But which way to go from here?

Alan Kay on NeWS:

“I thought NeWS was ‘the right way to go’ (except it missed the live system
underneath). It was also very early in commercial personal computing to be
able to do a UI using Postscript, so it was impressive that the implementation
worked at all.” -Alan Kay

What’s the Big Deal About HyperCard?

"I thought HyperCard was quite brilliant in the end-user problems it solved.
(It would have been wonderfully better with a deep dynamic language
underneath, but I think part of the success of the design is that they didn’t
have all the degrees of freedom to worry about, and were just able to
concentrate on their end-user’s direct needs."

"HyperCard is an especially good example of a system that was “finished and
smoothed and documented” beautifully. It deserved to be successful. And Apple
blew it by not making the design framework the basis of a web browser (as old
Parc hands advised in the early 90s …)" -Alan Kay

~~~
leoh
Yikes, am I reading this right -- that PostScript is being held up as a good
thing here? Here is xeyes implemented in PostScript. It was probably better
than the X Windows implementation of the era, but compared to JavaScript... I
shudder.

[https://groups.google.com/forum/#!original/comp.windows.news...](https://groups.google.com/forum/#!original/comp.windows.news/pIoof2p2eew/HSBV2prw3CoJ)

~~~
DonHopkins
That's 93 lines of PostScript code, including comments. And how many lines of
C code is XEyes? (Not including the Makefile.am, autogen.sh, configure.ac, and
other supporting cruft, to be charitable. I'm counting xeyes.c (141), Eyes.c
(642), Eyes.h (56), EyesP.h (56), transform.c (111), transform.h (32),
eyemask.bit (22), eyes.bit (22) = 1082 lines of really ugly code -- not easy
on the eyes!)

[http://cgit.freedesktop.org:80/xorg/app/xeyes/snapshot/xeyes...](http://cgit.freedesktop.org:80/xorg/app/xeyes/snapshot/xeyes-1.1.1.tar.gz)

By the way, XEyes was a later imitation of Jeremy Huxtable's original NeWS
"Big Brother" eyes.

[https://web.archive.org/web/20140228084012/http://en.wikiped...](https://web.archive.org/web/20140228084012/http://en.wikipedia.org/wiki/Xeyes)

Here's a much more elaborate version of NeWS eyes, that lets you drag and drop
eyes into each other (see "MoveStopSub"), nesting them to any depth (so child
eyes recursively move around with their parent eye, and look really creepy),
and split eyes in two with a menu (see "SplitEye"), in 375 lines of PostScript
code:

[https://donhopkins.com/home/archive/NeWS/eyes.ps](https://donhopkins.com/home/archive/NeWS/eyes.ps)
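
Whatever the windowing system, the eye-tracking itself is a little vector
math: aim the pupil at the pointer, clamped so it stays inside the eyeball.
Here is a minimal Python model of that idea (an illustrative sketch, not the
actual NeWS or xeyes code; all names are made up):

```python
import math

def pupil_position(eye_cx, eye_cy, eye_r, pupil_r, mouse_x, mouse_y):
    """Aim the pupil at the pointer, clamped inside the eyeball."""
    dx, dy = mouse_x - eye_cx, mouse_y - eye_cy
    dist = math.hypot(dx, dy)
    limit = eye_r - pupil_r          # how far the pupil center may wander
    if dist <= limit:                # pointer inside the eye: track it exactly
        return mouse_x, mouse_y
    scale = limit / dist             # otherwise pin the pupil to the rim
    return eye_cx + dx * scale, eye_cy + dy * scale
```

Point far outside the eye and the pupil pins to the rim; point inside it and
the pupil tracks the pointer exactly.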

Have you ever actually used the X11 SHAPE extension to make a round window,
did you know all X11 windows actually have FOUR shapes (client bounding
region, client clip region, effective bounding region, effective clip region),
and do you understand all the nuances and implications of the ICCCM ("Ice
Cubed") window management protocol?

[https://en.wikipedia.org/wiki/Shape_extension](https://en.wikipedia.org/wiki/Shape_extension)

[https://www.x.org/releases/current/doc/libXext/shapelib.html](https://www.x.org/releases/current/doc/libXext/shapelib.html)

[https://blog.dshr.org/2018/05/recreational-bugs.html](https://blog.dshr.org/2018/05/recreational-bugs.html)

"You will get a better Gorilla effect if you use as big a piece of paper as
possible." -Kunihiko Kasahara, Creative Origami.

And what is the net speed velocity of an XConfigureWindow request?

[https://medium.com/@donhopkins/the-x-windows-disaster-128d39...](https://medium.com/@donhopkins/the-x-windows-disaster-128d398ebd47#5a77)

~~~
gilbetron
Apples and Oranges. Postscript itself includes the code for the primitives
used in xeyes, whereas the C code implements them first. Take away that code
and it is much more illuminating.

~~~
DonHopkins
Are you saying that HTML Canvas should not have a method to draw circles
because you can implement a circle drawing algorithm in JavaScript?

What is your point? That it's a liability that PostScript has a far superior,
more powerful, scalable, device independent, future proof, and vastly more
convenient and easier to use and understand imaging model than X11?

And by the way, NeWS eyes came first: "xeyes" was a lower quality knock-off,
not the original. The first version of "xeyes" actually used a rectangular
window, which kind of missed the whole point, until somebody added hundreds of
lines of extremely complex code to implement the SHAPE extension. I linked to
the source above: download and read it if you don't believe me.

NeWS lets you shape a round (or any shape) window the exact same way you draw
the same shape. See how pizzatool shapes the spinning full or half circle
pizza window floating inside a rectangular frame, so you can easily move and
resize it:

    
    
      % Reshape the parent popup pizza window,
      % to reflect a change in the pizza's shape.
      % In all truth, the pizza canvas is rectangular!
      % It gets its round (or semi-circular) shape
      % from the shape of its parent canvas,
      % which is a hollow rectangular popup window frame
      % (just the window borders),
      % with a discontiguous pie floating in the center.
      %
      /ReshapeParent { % - => -
        gsave
          Parent setcanvas
          /bbox PreviewWindow send /reshape PreviewWindow send
        grestore
      } def
    
      % The popup pizza window path includes the window borders and the
      % round (or semi-circular) pie floating in the middle, but excludes
      % everything between.
      %
      /path { % x y w h => -
        matrix currentmatrix 5 1 roll		% mat x y w h
          /minsize self send xymax
          4 2 roll translate			% mat w h
          0 0 3 index 3 index rectpath
          WInset SInset translate			% mat w h
          EInset WInset add				% mat w h ewinsets
          NInset SInset add				% mat w h ewinsets nsinsets
          xysub					% mat insidew insideh
          0 0 3 index 3 index rectpath
    
          2 div exch 2 div exch			% mat centerx centery
    
          2 copy translate
          min dup neg scale				% mat
    
          { % send to Center client (the Pizza):
    	/radiusscale self send 0 moveto
    	0 0 /radiusscale self send
    	0 360 /fraction self send mul
    	arc closepath
          }						% mat {msg}
          /Center /client self send			%       ... client true | false
          /WhatPizza? assert			% mat {msg} client
          send					% mat
        setmatrix					%
      } def
    
      % Reshape the window canvas with the even/odd rule.
      %
      /reshape { % x y w h => -
        /invalidate self send
        gsave
          4 2 roll translate			% w h
          0 0 4 2 roll				% 0 0 w h
          /path self send				%
          self eoreshapecanvas
        grestore
      } def
    

So it looks like this:

[https://donhopkins.com/home/catalog/images/pizzatool.gif](https://donhopkins.com/home/catalog/images/pizzatool.gif)

But with X-Windows you have to bend over backwards manipulating pixmaps or
lists of rectangles, so the window shaping code is completely different and
much more complex than the drawing code.

Case in point: take a look at the "ShapePieMenu" function in the Tk pie menus
I implemented for X11/TCL/Tk SimCity, for another example of how much of a
pain in the ass it is to shape a window in X11. It has to first get the
display and query to make sure the extension is supported, then if it is, make
the window exist, get the window id, make a pixmap, make a graphics context,
set the foreground color, erase the background, set the foreground color, fill
a bunch of rectangles, free the graphics context, and call XShapeCombineMask.
And that's only shaping the window to a simple list of a few rectangles for
all the labels, not a circle, which would have been MUCH harder:

[https://github.com/SimHacker/micropolis/blob/master/micropol...](https://github.com/SimHacker/micropolis/blob/master/micropolis-activity/src/sim/w_piem.c#L2305)

So funny that you'd compare drawing circles in X11 -vs- PostScript. Have you
actually tried to draw a circle with either API yourself -- let alone used the
X11 SHAPE extension to make a round window? Here's an excerpt from the
X-Windows Disaster which addresses that very topic -- specifically "in the
case of arcs":

[https://medium.com/@donhopkins/the-x-windows-disaster-128d39...](https://medium.com/@donhopkins/the-x-windows-disaster-128d398ebd47)

Myth: X is “Device Independent”

X is extremely device dependent because all X graphics are specified in pixel
coordinates. Graphics drawn on different resolution screens come out at
different sizes, so you have to scale all the coordinates yourself if you want
to draw at a certain size. Not all screens even have square pixels: unless you
don’t mind rectangular squares and oval circles, you also have to adjust all
coordinates according to the pixel aspect ratio.

A task as simple as filling and stroking shapes is quite complicated because of
X’s bizarre pixel-oriented imaging rules. When you fill a 10x10 square with
XFillRectangle, it fills the 100 pixels you expect. But you get extra “bonus
pixels” when you pass the same arguments to XDrawRectangle, because it
actually draws an 11x11 square, hanging out one pixel below and to the
right!!! If you find this hard to believe, look it up in the X manual
yourself: Volume 1, Section 6.1.4. The manual patronizingly explains how easy
it is to add 1 to the x and y position of the filled rectangle, while
subtracting 1 from the width and height to compensate, so it fits neatly
inside the outline. Then it points out that “in the case of arcs, however,
this is a much more difficult proposition (probably impossible in a portable
fashion).” This means that portably filling and stroking an arbitrarily scaled
arc without overlapping or leaving gaps is an intractable problem when using
the X Window System. Think about that. You can’t even draw a proper rectangle
with a thick outline, since the line width is specified in unscaled pixel
units, so if your display has rectangular pixels, the vertical and horizontal
lines will have different thicknesses even though you scaled the rectangle
corner coordinates to compensate for the aspect ratio.
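
The "bonus pixels" rule is easy to model in a few lines. Here is a Python
sketch of the pixel semantics as the manual describes them (a model for
illustration, not Xlib itself):

```python
def fill_rect_pixels(x, y, w, h):
    """XFillRectangle semantics: fills exactly w * h pixels."""
    return {(px, py) for px in range(x, x + w) for py in range(y, y + h)}

def draw_rect_pixels(x, y, w, h):
    """XDrawRectangle semantics (1-pixel line): traces the outline of the
    box with corners (x, y) and (x + w, y + h), spanning w+1 by h+1 pixels."""
    pixels = set()
    for px in range(x, x + w + 1):
        pixels |= {(px, y), (px, y + h)}
    for py in range(y, y + h + 1):
        pixels |= {(x, py), (x + w, py)}
    return pixels
```

For nominal 10x10 arguments the outline really does occupy an 11x11 bounding
box, and the manual's suggested fill at (x+1, y+1, w-1, h-1) fits exactly
inside it -- the compensation trick that, as quoted above, has no portable
analogue for arcs.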

~~~
gilbetron
I created a new language called zis, which has a function named "doXeyes()". I
can do xeyes in literally one line of code. It destroys PostScript with its
amazing simplicity. Sorry for the snarkiness, but that's my point.

~~~
DonHopkins
Snarkiness? What snarkiness? I don't understand what you mean. So I'd
appreciate it if you'd explain what your point is more clearly, by directly
addressing the questions I asked.

Please be kind enough to link to the source code of the language "zis" that
you wrote, just like I linked to the source code that I and others wrote to
illustrate my point. I'd like to see the actual code you wrote, so I don't
miss your point, and can see that you're arguing in good faith. Thank you,
kind sir or ma'am!

So what else is your language "zis" good for? Is it fully general purpose,
easily editable at runtime, graphically skinnable with a built-in drawing
editor, and dynamically scriptable and extensible by normal users, like
HyperCard and HyperLook and Smalltalk? Do you think Alan Kay would describe
"zis" as the "right way to go" like he described NeWS, and what other end-user
problems does it solve, in the sense that he thinks HyperCard is brilliant and
that Smalltalk pioneered? Is "zis" as "finished and smoothed and documented
beautifully" as Alan Kay describes HyperCard?

Have you developed and shipped any commercial products or free software with
your language, like SVR4 or SimCity, that I can see and run for myself, so I
can be confident you're not just making stuff up or wildly exaggerating, and
that it works as you advertise? And have you written any books or
documentation or articles or research papers about it that I can read to learn
more about it? Or at least a screen snapshot, please? What are the URLs?

Have you finished reading the X-Windows xeyes source code that I referred you
to? Did you simply copy that code somebody else wrote into "zis" verbatim, or
did you actually write your own original code? How is it licensed, and where
has it been distributed: is it open source or not, and where's the repo?

Now that you have presumably used the X-Windows SHAPE extension API first
hand yourself, please share your experience and tell me what you think of it:
do you think that it is well designed and easy to use or not? What changes or
improvements would you suggest? Did you run into any of its limitations, or
notice any unnecessary complexity, or ICCCM window manager incompatibilities?
How well does your language "zis" support the SHAPE extension and ICCCM
protocol, or improve upon them, and what else is it good for besides xeyes?

Finally, does "zis" support dragging and dropping eyes into each other,
nesting their windows into an arbitrarily deep hierarchy, like my NeWS eyes
did 27 years ago, and what ICCCM window managers support that feature, or did
you have to write your own window manager or X extension to support that
feature?

Thanks for taking all this time and effort to design and write and share all
that code, and to support and clarify your fascinating arguments. I'm looking
forward to seeing your code and your answers to my questions.

~~~
gilbetron
So, yes, I've made xeyes in both X11/C and another language - it was part of
a computer graphics course I took years ago, oddly enough, and it _might_ have
been PS, but I don't really remember. I did write one or two small programs in
PS, I know that.

My point with the (imaginary) "zis" language is that comparing a domain-
specific language like PS to C code is a strange comparison - not without
merit, but PS is a higher level abstraction, which is great, and saying it is
an amazing language because it is slightly easier than raw C is misleading. If
I build up (or use) a nice library written in C, it is fair to compare that to
PS. (And I'm not going to get pulled into defending X - I'm not as harsh on it
as you, but it is ... painful ... to use. Also, side note: I remember skimming
through the Unix Haters Handbook when it came out, so I probably read your
words back then, funny!)

I'd rather do it in OpenGL/C for instance, plus my version will be more
performant. There's a reason that combination (or DirectX) is used in the vast
majority of graphics software out there.

------
armadsen
Since this article was written, the iPad has gained a number of excellent
programming tools from third party developers (Pythonista is my favorite).

And perhaps more importantly, Apple has released two: Swift Playgrounds and
Shortcuts. Swift Playgrounds is explicitly designed so that kids (and adults)
can learn to program. But it’s also pretty full featured, with access to all
of the APIs and frameworks on the system. Shortcuts is programming-in-
disguise, much like HyperCard before it. It’s an end user app that nearly
requires you to “program”, albeit visually for the most part.

I think the vision of every computer user also being a programmer hasn’t
happened because we really still haven’t figured out how to make that work.
But that doesn’t mean no one is trying anymore.

As for me, I’m a professional programmer because when I bought a Mac 15 years
ago, it came with Xcode in the box, and I was able to start building my own
Mac apps without too high of a barrier to entry.

~~~
gumby
> Since this article was written, the iPad has gained a number of excellent
> programming tools

That's true, and good, but misses the point. In the smalltalk world (and Lisp
machine worlds of PARC and MIT) _everything_ running in the machine (above the
microcode layer) was inspectable, breakpointable, and modifiable.

While on the iPad everything is, as the article says, opaque and unmodifiable,
unless it's your own app.

note: I worked on Lispms at MIT (and other places) and on D-machine Interlisp
at PARC (and other places)

~~~
erikb
> In the smalltalk world (and Lisp machine worlds of PARC and MIT) everything
> running in the machine (above the microcode layer) was inspectable,
> breakpointable, and modifiable.

So Linux plus maybe some elegance features?

~~~
gumby
There were no kernel/user space distinctions in any of the systems I
mentioned, and all of it could be dynamically modified at runtime (no
recompilation or restarting) down to the function and variable level.
Philosophically completely different from Unix et al.
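
As a toy illustration only: Python can rebind a function while the process
runs, which gives a faint taste of the idea -- though the systems described
here extended it to the entire machine, not just one interpreter:

```python
def greet():
    return "hello"

before = greet()

# Rebind the definition while "the image" is running:
# no recompilation, no restart; subsequent callers see the new version.
def greet():
    return "bonjour"

after = greet()
```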

But you raise an interesting point: one of the reasons I started Cygnus with
John and Michael was that I had never used a machine to which I didn't have
access to the source code and the idea of that horrified me.

------
sgt101
I have spent hundreds of hours building a simple simulator using Julia recently.
I mention this because this is what this view of computing misses; despite
having hugely productive tools (compared to the 70's, remember "compile
time?") vast reuse (library after library filled with wonderful goodies) and
amazing computers (I use an X1 and have access to servers and so on and if I
really need it I can use the cloud any time), software is complex and time
consuming. That's why programming / developing is a full time job. Everyone
else needs to get on with all the other things they've got to do.

~~~
scroot
The view of computing you are criticizing in fact doesn't miss your point. It
attacks it straight on. What does "programming" mean? Insofar as it involves
people using Algol inspired languages -- with some features here and there --
typing up instructions in expensive systems that are (still!) teletype
emulators, then what you say is always going to be true.

The point of the article is that once upon a time people began to think
another way was possible and started going in that direction. The reasons that
all of this has fallen out of the computing pop culture have little to do with
viability, but rather the needs of the market.

------
dang
Discussed in 2015, including comments from an HN user who worked at both PARC
and Apple:

[https://news.ycombinator.com/item?id=8976872](https://news.ycombinator.com/item?id=8976872)

------
PinkMilkshake
Windows has a surprising number of ways to code out of the box, and all
require nothing more than opening Notepad and saving a file with the right
extension.

.js .vbs .bat .ps1 or .hta

The problem is that they are not simple for the average user to use. Windows
lacks a straightforward procedural automation system. Flow, Workflows, and
PowerApps address this in some ways but are really janky.

I want something like:

    
    
      GET COLUMN 'customer name' FROM accounts.xls
      REMOVE ROW 1
      SAVE customers.xls
      EMAIL foo.xls TO 'some@guy.com'
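
For contrast, here is roughly what those four English lines cost in plain
Python today (a sketch using CSV and the standard library in place of .xls,
with the email step left as a comment because that is where it gets genuinely
ugly):

```python
import csv

def get_column(path, name):
    """GET COLUMN name FROM path (CSV stand-in for the .xls example)."""
    with open(path, newline="") as f:
        return [row[name] for row in csv.DictReader(f)]

def remove_row(values, n):
    """REMOVE ROW n (1-based, as the sketch above counts rows)."""
    return values[:n - 1] + values[n:]

def save_column(path, header, values):
    """SAVE path."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([header])
        writer.writerows([v] for v in values)

# EMAIL ... TO 'some@guy.com' is where it really falls apart:
# smtplib, MIME assembly, an SMTP server, credentials, attachments.
```

Four lines of intent become three function definitions before you even reach
the email step, which is exactly the gap a straightforward automation system
would close.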

------
lispm
Smalltalk is the product of a very well funded research lab - Xerox PARC. It
was not so much a successful product.

It's less interesting to think about what Apple did with it - because they had
other, more conventional, goals at the end of the day.

I think it's more interesting to think about what happened at Xerox - why
didn't they get it out to millions of people as the software a computer boots
into, or as the primary development environment, or even as a system design
philosophy?

Xerox sold a few expensive Smalltalk machines, but mostly they sold office
systems on the same hardware NOT implemented in Smalltalk. IIRC they might
have sold larger laser printers with Smalltalk-based control software - can't
remember.

Xerox gave Smalltalk-80 to a few companies - like Apple and others. Apple
ported it to their machines - this is also the origin of Squeak (a later open
source Smalltalk). But it never really caught on, even though Apple gave
developers access to it.

Now the philosophy of Smalltalk was already in the Apple ][. It booted into a
BASIC prompt. But later computers were more universal and current incarnations
are only seen as appliances. Nobody needs to program to use an iOS device.

On a Lisp Machine one can force any application into a REPL (called listener)
at any time. The keyboard even has a key for that. Live programmability is the
main purpose of such a computer. Source is only a keystroke/mouse click away.

We don't have that anymore and there must be a reason for it. Probably it
would also be a security nightmare in current networked surroundings...

~~~
simonh
The reason Smalltalk didn’t take over the world was that it was so resource
intensive. A single workstation would have cost about $40,000. There’s no way
the early Macs could have run Smalltalk, they only became capable of that
years later and it was slow as molasses. I suppose it should really have had a
chance by the Java era, but C style syntax languages ruled the world and the
state of the art had moved on by then.

~~~
ksec
But would it take over the world today? Our latest smartphones probably have
10x to 100x the processor power of the "$40,000" workstation back then.

I am not entirely sure Smalltalk was the right approach; most people, the 98%
who don't read HN, don't want to code. I think HyperCard is much closer to
that vision, but it doesn't seem there is massive interest in something like
that either. The world today is like 90% consumption: people consume news,
media, video, etc.

~~~
scroot
> it doesn't seem there is massive interest in something like that either

It's been a long enough time that most people don't even know that "Hypercard
like things" are even possible, or what they mean, or how one could use them.
The "appification" of personal computing has ensured that there are
"programmers" and "users" (scribes and plebs) and that the relationship will
be transactional and commercial.

> The world today is like 90% consumption: people consume news, media, video,
> etc.

The more the development community realizes this, the better off we will be --
because it will become clear that in a technical sense a completely different
way (ie the direction of Smalltalk or Hypercard) is possible and that the
truly limiting factors are cultural more than anything else.

------
fourthark
_But while computers rapidly increased in power, the tools that programmers
used to program them developed relatively conservatively. It is easy to
imagine a world where those tools developed along with the computers, until
programming itself became so easy that the average user would feel comfortable
doing it._

It's nice to imagine. But is it so easy to imagine?

~~~
scroot
He has provided historical examples that trend in that direction. The point
is: industry stopped looking into this (and an implicit point is also that
investigating these things is no longer well funded).

------
overgard
Maybe I’m cynical but I just don’t think the average person cares enough to
want to program even if it’s really easy. I mean it’s easy to criticise the
software for being locked down, and fair enough, but maybe .0001% of users
would really want to modify things anyway.

~~~
true_religion
The number of half-baked JavaScript bookmarklets, Excel scripts, and SQL
stored procedures I have come across in my lifetime begs to differ.

People are already programming, and they'd do more of it if everyone had basic
knowledge of how. It's like literacy. Before general literacy, people thought
it was pointless to learn to write; now everyone jots down grocery lists and
working notes because they're capable of doing it without a second thought.

~~~
coldtea
> _The number of half-baked JavaScript bookmarklets, Excel scripts, and SQL
> stored procedures I have come across in my lifetime begs to differ._

Even those are made by what? 1% of users?

------
hevi_jos
Creators love to create, thinkers love to think, programmers love to program.

Most people do NOT like creating (when they can use something already created
by someone else without effort), programming, or thinking (it is slow, takes
effort, and gratification is not instant) at all.

It is not a criticism of most people; in fact I have seen people spending
months of work installing or compiling a custom Linux like Gentoo just because
they can, thinking about problems that will probably be a dead end forever, or
programming things that would have been done cheaper, better, and way faster
manually.

That is the reason Steve Jobs succeeded: he made what people (most people)
wanted. Instant gratification, like music on iPods. Simple solutions to
problems people have (simple install). Making sure creators could live from
their professional work (simple payment system).

There is a talk somewhere where Steve Jobs talks about realizing that people
really demand things like trash TV.

When you sell products you see reality, Alan could only see the idealism of
Academia.

It is not that people do not want to improve themselves. It is just that most
people do not want to pay the price, in effort, in time or focus. Just look at
the Ads on TV: Lose weight with no effort with product X, in no time, while
you watch TV.

------
erikb
> It is easy to imagine a world where those tools developed along with the
> computers, until programming itself became so easy that the average user
> would feel comfortable doing it.

I would argue it IS SO EASY. The thing is that programming itself is not the
problem. The problem is the complex world the programming happens in.
Programming itself even helps to reduce that complexity, by allowing you to
traverse some amount of it automatically.

The problem is not that programming is too hard for people, but that people
are not willing to take responsibility for themselves and their problems. Most
people look to others for solutions to their problems. That's also why people
who promise them solutions, like Jesus or Jobs, are praised as saviours, even
though they are never able to fulfill all the promises, at least in the amount
and dimensions that the receiver would hope for.

> It is interesting that at one point, Jobs (who could not be reached for
> comment) described his vision of computers

A kind of statement we haven't seen for a long time. RIP, Steve.

PS: The article didn't really explain why he thinks there was a Xerox PARC
philosophy inside of Apple and how/why it was in decline, right? It basically
just says there was Xerox PARC with that awesome philosophy, and there was
Apple, who copied some of it but was altogether a closed-system developer.

------
linguae
This has been something that I've been thinking quite a lot about for the past
few years. The early Apple was heavily influenced by the work that came out of
Xerox PARC. Even during Apple's low point in the 1990s, Apple maintained a
research lab that was the logical successor to many of the ideas that came out
of Xerox PARC. The Dylan programming language would have been an interesting
environment for programming Newton applications, and OpenDoc would have
brought the idea of larger applications and compound documents composed from
smaller applications and documents, which I find quite similar to the
Smalltalk vision in some ways. Unfortunately Apple's research lab would be
shuttered in 1997, but the rationale was understandable; Apple was on the
verge of bankruptcy back then and Apple desperately needed to focus on its
core product strengths.

Apple used to be the champion of personal computing. Personal computing is
about empowering individuals by giving them access to computation in a
relatively accessible and affordable fashion. Apple's mission was to empower
the user through usability, and they applied the research from Xerox PARC and
other places to accomplish this. Even though the classic Macintosh operating
system is not as powerful as the Smalltalk environment, and even though
certain important proposed additions such as OpenDoc unfortunately were
cancelled, Mac OS enabled people to be more productive and more creative, and
it even helped create many industries such as the desktop publishing market
and the early web design market during the 1990s. Apple's usability guidelines
for Mac programs were well thought out and enforced a coherent vision of
usability for the entire platform. Mac OS X in the 2000s was the pinnacle of
Mac, combining the Mac's focus on usability and good design with a solid Unix
foundation that provided features that the classic Mac did not have such as
preemptive multitasking and protected memory. And, let's face it, once
competitors like the Amiga and BeOS died, Apple seemed to be the only major
player remaining in the computer industry to have a coherent view of personal
computing for the masses. If the Mac at its peak was like In-n-Out Burger,
then Microsoft Windows is McDonald's, and the desktop environments for Linux
are like frozen microwaveable burgers from the grocery store.

Unfortunately once Apple started making piles of money from the iPhone and
other parts of the iOS ecosystem, Apple started to neglect the Mac (especially
on the desktop hardware side) and the overall vision of personal computing as
a way of empowering the masses. It seems today that all Apple is concerned
about is the iOS ecosystem, which is a locked-down walled garden instead of
the freedom and empowerment that the Mac provides.

Right now personal computing needs a champion. Many hundreds of millions of
people, if not billions of people, rely on personal computers in order to
carry out their business and creative tasks. Unfortunately there are no
companies that are passionate about personal computing. Apple has become the
iPhone company, Microsoft is focused on upholding its monopoly and building
its cloud business, Google is all about mining personal data, and the Linux
desktop world is too fragmented in order to put up a united front. Even worse,
many of the major players in personal computing bought into the notion in the
late 2000s and early 2010s that personal computers will be replaced with
smartphones and tablets. This led to Apple's neglect of the Mac, Microsoft's
failed Windows 8 Metro interface, and some odd design decisions in the early
days of GNOME 3 and for Ubuntu. While the industry has been backing away from
the thought that tablets will supplant personal computers, personal computing
still lacks a champion.

My dream is for either a company or a team of open source software developers
to pick up from where Alan Kay, Don Norman, and other people left off and
create a personal computing operating system that combines the best ideas of
Smalltalk, Lisp machines, Hypercard, OpenDoc, and Apple's usability guidelines
from the Mac OS 8-9 days. It will be an operating system that is focused on
composable documents similar to OpenDoc but with Smalltalk-like levels of
flexibility and control. It will also have a strong emphasis on usability with
a "back-to-basics" viewpoint instead of the flat design promoted by
contemporary desktop and mobile GUIs.

~~~
Banana_Peel
I am curious to know which GUI you think was better: Mac OS 8-9, one of the
versions of OS X, or the original black-and-white Mac OS (System 6)?

I am currently writing a window manager / desktop environment, and I'm kind of
torn. At the moment, I'm using the look of the black and white Mac GUI, which
I prefer over the colorized Mac OS 8-9 GUI, but at the same time I am fond of
the early OS X GUI (10.1 Puma or 10.2 Jaguar).

You say that Mac OS X in the 2000s was the pinnacle of Mac, but then you say
you want Apple's usability guidelines from the Mac OS 8-9 days. Which GUI do
you think was better? What exactly was it that made OS X in the 2000s the
pinnacle?

My goal is to build a "desktop programming GUI environment" using the
Objective-C runtime as the base, but with the look and feel of the black and
white System 6 GUI. I've written my own simplified Foundation subset (only the
stuff that I need) as well. I'd like for it to be as simple as possible
without making too many sacrifices. After all, the original Mac used 128K of
RAM, the original OS X could run with 128MB of RAM, and the original iPhone
also came with 128MB of RAM. It's ridiculous that everything requires
gigabytes of RAM nowadays.

~~~
linguae
I should've clarified in my original post my preferences regarding the Mac OS.

When it comes to overall systems, I believe that Mac OS X is the pinnacle of
the Mac due to its combined usability and stability, and I also believe that
the Mac as a platform peaked around the Snow Leopard era (2009-11). Mac OS X
has the usability and consistency of the Mac platform with the stability of
Unix. Mac OS X hasn't gotten worse since Snow Leopard, but unfortunately it
hasn't dramatically improved either; if it weren't for the necessity of modern
browsers and security updates, I could still be productive on a computer
running Snow Leopard. And the hardware situation
since 2012 has been discussed on Hacker News numerous times.

However, when it comes to usability alone, I have a soft spot for the classic
Mac OS. Personally, I believe the classic Mac OS's interface is actually more
user-friendly than Mac OS X's (the classic Mac OS is really simple with its
controls and its spatial Finder, while Mac OS X is more complex due to its
NeXT roots), and
I also believe that the Mac platform back then was more compliant with Apple's
user interface guidelines than today (although unfortunately I don't know of a
way I could measure this). I find Apple's A/UX and Rhapsody projects to be
quite interesting attempts to bridge the classic Mac environment with Unix. I
also like the conservative choices that Apple made at the time regarding
color. While Mac OS X's Aqua is quite impressive, there's something about the
subdued atmosphere of Mac OS, whether it's the original black-and-white
interfaces of System 6, the light touches of color in System 7, or the
Platinum theme from Mac OS 8 and 9, which is still quite conservative compared
to Aqua.

------
bsaul
Funny how Smalltalk was one of the first object-oriented languages, and Kay is
among the inventors of object-oriented programming, and yet the vision is
completely opposed to the concept of encapsulation (which is one of today's
core principles of OOP).

That is: provide a simple interface for manipulating an object, rather than
exposing its internals.

~~~
igouy
> …the vision is completely opposed to the concept of encapsulation…

Why do you think that?

p.286, "Design Principles Behind Smalltalk" by Daniel H. H. Ingalls

[https://archive.org/details/byte-
magazine-1981-08](https://archive.org/details/byte-magazine-1981-08)

~~~
bsaul
I’m not talking about Smalltalk’s vision as a programming language, but rather
the operating system exposing the source of every component in read/write
mode.

~~~
igouy
That is not what you wrote.

------
microtherion
I'm very much in favor of constructionism as an educational philosophy, and
thus of teaching programming as a basic skill to a wide audience.

However, I also think there is a need for professional software development,
and the interfacing between professional software and casual end user
programmable systems is problematic in my experience.

Try dealing with an organization where business critical processes are handled
by a FileMaker database, a HyperCard stack, or an Excel macro written by a
rando. For extra credit, try dealing with an organization where there are
dozens of copies randomly copied to different users' machines, with different
strains of customizations, none of them under version control (even if the
authors were willing and able to use a VCS, casual programming environments
often don't seem to play nicely with version control).

I have little personal experience with pervasively programmable systems, but I
can't imagine that they interface well with professional software development
either. How do you install a word processor on a system where it's anyone's
guess how the addition operator behaves in a given week?
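To make the hazard concrete, here's a toy Python sketch (Python standing in
for a fully live system; the `Num` class is hypothetical, a stand-in for a
system-wide numeric type that, in a Smalltalk image, any user really could
rebind):

```python
class Num:
    """Toy numeric class standing in for a live system's Integer."""
    def __init__(self, v):
        self.v = v
    def __add__(self, other):
        return Num(self.v + other.v)

before = (Num(2) + Num(3)).v   # 5, as any installed software would expect

# In a pervasively programmable system, any script can rebind '+' at
# runtime -- here, hypothetically, swapping addition for multiplication:
Num.__add__ = lambda self, other: Num(self.v * other.v)

after = (Num(2) + Num(3)).v    # now 6: same expression, new meaning
```

In CPython the built-in `int` can't be patched, so the damage is confined to
user-defined classes; in a system where everything is live and open, the
rebinding could reach every integer in every running program, which is exactly
why shipping professional software on top of one is hard.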

------
scroot
Gets right to the heart of the matter. A clear and well-written article.

------
patrickg_zill
If you want to run an Alto, the Living Computers Museum has an emulator that
can be downloaded.

[https://livingcomputers.org/Discover/News/ContrAlto-A-
Xerox-...](https://livingcomputers.org/Discover/News/ContrAlto-A-Xerox-Alto-
Emulator.aspx)

