
The Decline of the Xerox PARC Philosophy at Apple Computers (2011) - rlander
http://dorophone.blogspot.com/2011/07/duckspeak-vs-smalltalk.html
======
jedharris
I worked at PARC (in the Learning Research Group) for most of a year in 1974,
and later at Apple for 10 years. In both places I worked with Dan Ingalls and
other Smalltalk folks. At Apple I was very involved in projects that led to
various HyperCard-related follow-ons, including AppleScript.

This article does not ring true to me at all. The problem with making a big
fraction of users into programmers is that we don't know how to do it. The LRG
at PARC and then various groups at Apple tried every way they (we) could think
of, and other ways have continued to be invented and tried. So far none work.
HyperCard was indeed the most accessible development environment, but only a
small fraction of HyperCard users ever wrote code (maybe 5%).

Apple has continued to make user programming of Macs as easy as they
conveniently can (as far as I can tell having been out of there for a long
time). iOS has a different goal -- making devices safe and usable -- which is
intrinsically in conflict with maximum programmability. Even on iOS there are
plenty of "user programmable" apps.

I think part of the problem here is that many developers take themselves and
their developer friends as "typical" but that is totally not the case. This
really treats most of the population with (unintentional) disrespect.

~~~
mempko
"Apple has continued to make user programming of Macs as easy as they
conveniently can"

I don't see how that is true. Xcode is not installed by default. You can use
AppleScript and Automator to automate some tasks, but you can't really call
that programming.

The source code to the whole system is closed, which limits discoverability.

I remember in my childhood playing Gorillas on my DOS machine. I pressed a
button and suddenly all the code for it popped up. I learned that if you
changed a part of it, the game would change.

There is NOTHING like that on a Mac install today. The closest thing is in the
web browser.

~~~
jedharris
It would be interesting to mandate that all Mac apps be written in JavaScript
so users could modify them. However, this seems ambitious. Even mandating that
all apps be scriptable seems to be a bridge too far.

Note that JavaScript is supported alongside AppleScript as a user-level
scripting language.
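
For what it's worth, here's a rough sketch of what that looks like today:
JavaScript for Automation (JXA), run through osascript. The Application()
global and the standard-additions commands come from Apple's scripting
runtime, not from the language itself, so treat the details as illustrative
rather than a tutorial.

    // count-windows.js -- run with: osascript -l JavaScript count-windows.js
    // Application() is provided by the JXA runtime; this is a sketch, not a
    // complete tour of the scripting dictionary.
    var finder = Application("Finder");
    var windowCount = finder.windows().length;   // queries Finder via Apple events
    var me = Application.currentApplication();
    me.includeStandardAdditions = true;          // enables displayDialog and friends
    me.displayDialog("Finder has " + windowCount + " windows open.");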

Much of the Mac OS is in fact open source -- see Darwin and other projects.
However, having the code for the OS available -- or having Xcode installed --
does _nothing_ for user programmability.

This comment is a good example of why it doesn't work for us developers to see
ourselves as typical users.

~~~
TheOtherHobbes
Apple still supports Widgets, which are canned JavaScript built to run in a
special Dashboard app. (Does anyone still develop Widgets?)

But this still misses the point about HyperCard, which was that HyperCard was
its own toolchain. To get started with it, you didn't have to:

- Install a separate editor
- Learn a separate editor
- Install a build system
- Learn how to build a project
- Learn how to share code
- Deal with dependencies

It's the no-separate-toolchain feature that made HyperCard so accessible, and
which was influenced by Smalltalk.

Modern programming is a nightmare in comparison - including modern app
programming.

Just because Apple uses Objective-C doesn't mean it's approaching app
development in a Smalltalk-like way. Getting an app out of Xcode and into the
store is a horror story of provisioning profiles, debug vs production builds,
sandboxes, entitlements, and so on. Obviously people manage to fight through
this, but it's a long way from being friendly and accessible.

IMO only VBA gets close to being a successful no-separate-toolchain
environment - which is one of the main reasons VBA became so popular.

(You could argue Python is, but I don't think it's equivalent: unlike VBA and
HyperCard, the first thing the user sees is the editor, rather than a product
that's obviously and immediately useful and happens to include a code editor.)

------
mmarks
In 1995, we watched these 1979 videos of Smalltalk in my college programming
languages course, which focused on Scheme.

I vividly remember the reaction of my fellow students. Given the mockery and
jokes, you'd think they were watching a bad sci-fi movie. Most students
discounted everything they saw: 'real men and real programmers use C'. I
remember being so disheartened by how little we seemed to have evolved in
tools and languages from 1979 to 1995.

At the time, everything was Unix and C programming (DEC Alphas were just being
installed on campus, and Windows 95 had just been released). There were a lot
of reasons Unix/C succeeded; there is a great classic paper about why C beat
Lisp, and I agree with its author.

What always troubled me, though, is how my fellow students completely ignored
any potential lessons from those videos. In many ways, those early Smalltalk
programs were far more impressive than anything they had created, but they
just wrote them off.

At GDC 2014, a post-mortem was presented on the classic game Myst, which was
written entirely in HyperCard.

~~~
jarcane
The prejudice of programmers is one of the biggest hindrances to technological
advancement in computing, AFAIC.

Think about it: functional programming is finally getting some well-deserved
recognition in the wider programming world.

Yet almost everything it offers has been around for as much as 45 years. The
original paper on Hindley's type system was published in 1969. Milner's theory
went to print in 1978. Scheme first appeared in 1975 and was already building
on functional ideas spawned by earlier Lisps. Guy Steele designed an actual
Scheme CPU in hardware in _1979_.

And yet even today, a non-trivial number of programmers react with absolute
horror at the idea of a Lisp (usually based solely on ignorant trivialities
like the parenthesized syntax), more or less exactly as your C-programming
classmates did in 1995. And while FP is starting to make major inroads in some
spheres, others dismiss the whole field as wank, and Java and C remain kings
that are unlikely to be unseated for another decade at a minimum, if ever.

We remain utterly bound to one model of hardware, one model of programming,
and largely only a couple of models of operating system, after decades of
development, because so many programmers react with horror at anything they're
unfamiliar with or that deviates from the perceived norm, be it in features,
syntax, or focus.

And God forbid you make anything that might actually be easy for non-
programmers to learn. It will be more or less met with instant and persistent
scorn, and its users derided and outcast, simply because they didn't use a
'Real Programming Language' like C. Go ask a BASIC coder what life has been
like for the last 40 years, or a game dev who worked in AGS or GameMaker prior
to the last half decade or so. Hell, I have a friend who still sneers at
visual layout designers.

The divide described in the article is enforced as much culturally as
economically.

~~~
sillysaurus3
_And God forbid you make anything that might actually be easy for non-
programmers to learn. It will be more or less met with instant and persistent
scorn, and its users derided and outcast, simply because they didn't use a
'Real Programming Language' like C._

Prejudice doesn't seem to have anything to do with it. Functional programmers
think differently, and what's obvious to the Functionals isn't to the
Statefuls. And the Statefuls are currently most of the world. I've flip-
flopped myself, because while I love the elegance of being a Functional, being
a Stateful is just so much more productive. There are a few reasons for this:
If I want to make a game, there's no good functional framework. If I want to
write a script to get something done, like downloading a webpage, my go-to
language is Python, because I know for a fact that its libraries work and that
its documentation is almost always stellar. Contrast that with Lisp where you
can spend at least a day just getting the environment set up in a way that
asdf doesn't hate. Especially on Windows. (Yes, if you want to make games,
Windows needs to support your dev environment.) My info about asdf is a couple
of years out of date, because to be honest I haven't felt inclined to look
into it again after some bad experiences.

Haskell could be wonderful. Never tried it. Will someday. Until then, I'd love
some sort of competition where a Haskell programmer and I are given a task,
like "write a script to X," where X is some real-world task and not an
academic puzzle, and see who finishes first. It would be illuminating: I'd
give myself about a 30% chance of finishing first, but it would reveal what
I'm lacking.

Arc had potential. It really did. Everyone just gave up on it, and it never
attracted the kind of heroic programmers that Clojure did.

So the wildcard seems to be Clojure. It's a decent mix of performant,
practical, and documented.

I'm out of time to pursue this comment further, but the main point is just
that FP's problems have very little to do with societal acceptance or scorn.
If you're running into that, you're probably running with a bad crowd anyway.
It's mostly because imperative languages are popular, so network effects mean
they'll just get better and better. If FP wants to chip away at that, it'll
need to start off better and stay better. "Better" is many things, but it
includes performance, cross-platform support (yes, Windows is necessary),
documentation, and practicality (the ability to quickly accomplish random
tasks without a huge prior time investment; Python seems to be the best at
this so far).

~~~
emiljbs
> Contrast that with Lisp where you can spend at least a day just getting the
> environment set up in a way that asdf doesn't hate.

I recently re-installed my OS and setting up my Lisp environment (SBCL,
Quicklisp, SLIME+Emacs) took about 30 minutes.

~~~
aerique
Yeah, download SBCL, load quicklisp.lisp, done.

I don't see what the problem is on Windows.

~~~
sillysaurus3
Several years ago SBCL was unusable for game programming on Windows due to
crashes. I tried and eventually gave up.

My info is out of date though. Maybe it's better now.

~~~
jarcane
I've not had any issues with SBCL on Windows except with the thread support.

------
Animats
I got a tour of Xerox PARC in 1975, when taking McKeeman's UCSC course in
computer architecture. I got to see the first Ethernet (Alan Kay referred to
it as "an Alohanet with a captive ether"), the first Alto workstations (they
were making their own CRTs, and were having trouble getting a uniform phosphor
coating), an early version of Smalltalk, and a xerographic print server.

Back then, Kay saw simulation as the killer app. He saw discrete-event
simulators as a business tool. They had a hospital simulation, where patients
went in and went through Waiting, Examination, Surgery, Recovery, Rest, and
Discharge. This was visual, and you could click on the little patient icons
and get something like "I am a victim of Bowlerthumb". Smalltalk was a
descendant of Simula, which was a simulation add-on for Algol. So that was a
natural direction.

It turns out that very few people use discrete-event simulators as business
tools. Although you can model your operation, say a bank branch, in a
discrete-event simulator and try to fix bottlenecks, nobody does this. That
was a dead end as a concept.

Xerox's commercial product, the Xerox Star, was very much locked down. It came
with a set of canned applications, and was intended for use by clerical staff.
I don't think it even ran Smalltalk, but was programmed in Mesa or Cedar. It
competed with a forgotten category of products, shared-logic world processors.
These were low-end time sharing systems with dumb terminals tied to a server
machine with a disk, used for word processing. Wang was the leader in that
field. Those, too, were very locked down.

Back then, hardware cost was the huge problem. Kay said they could build the
future because Xerox had the money to spend. (Xerox stock hit its all-time
high in 1973). Each Alto cost about $20K at the time. Apple's first attempt at
an Alto-like machine was the Lisa, which was a good machine with an MMU and a
hard drive. But it cost around $10,000. The Macintosh was a cost-reduced
version, and the original 128K floppy-only Mac was a commercial flop. It
wasn't until the cost of parts came down and the Mac could be built up to 512K
and a hard drive, with the option of a LaserWriter, that it started to sell in
volume.

What made the Macintosh line a success was desktop publishing. "Now you can
typeset everything in your office" - early Apple advertising. It sold because
it was far cheaper than a print shop.

Before the Macintosh, there was UNIX on the desktop. Yes, UNIX on the desktop,
with a GUI, predates the Mac. Three Rivers, Apollo, and Sun all had good
workstations with big screens in the early 1980s, before the Mac. The Three
Rivers PERQ launched in 1979, five years before the Macintosh, with a big
vertical CRT like the Alto. All these had some kind of GUI, generally not a
very good one. Those were the first descendants of the Alto. They were used
for engineering and animation, not just word processing.

------
zoul
iPad is the first computing device in history that I can suggest to my non-
computer-savvy friends and know they will be able to use it, without the
software getting screwed up in a few months. With the iPhone and iPad, Apple
has (pardon the hyperbole) brought usable computing to the masses.

We can bash Apple for not turning the masses into programmers at the same
time, but it’s far from obvious this is even possible (on the contrary). And,
at least for me, a hackable software device is a device that can be broken
more easily, thus compromising the first and most important principle of
usability. It’s _so_ refreshing to be able to tell people that they don’t
have to worry about the device, because there is almost nothing they could
break, no matter how hard they try.

Also, this sounds awfully condescending:

 _I think one of the main consequences of the inventions of personal computing
and the world wide Internet is that everyone gets to be a potential
participant, and this means that we now have the entire bell curve of humanity
trying to be part of the action. This will dilute good design (almost stamp it
out) until mass education can help most people get more savvy about what the
new medium is all about. (This is not a fast process). What we have now is not
dissimilar to the pop music scene vs the developed music culture (the former
has almost driven out the latter -- and this is only possible where there is
too little knowledge and taste to prevent it). Not a pretty sight._

It’s like someone has got the whole computer interaction thing sorted out and
is just waiting for the rest of the idiots to catch up. With all respect to
Alan Kay, I’m not buying that.

~~~
carlosrg
iPad is the first computing device in history that essentially takes away any
user capability to tinker with the machine. It forbids installing any software
that Apple decides it shouldn't be running on it, including basic things like
alternative browsers (as you know, Chrome on iOS is only a reskin of the
included WebKit), and it forbids installing a previous OS (which means my
one-year-old iPad 3 became a slow but beautiful paperweight after updating to
iOS 7). You can't develop apps for it without paying Apple 100€/year (even if
you don't plan to distribute them), etc. But if you think that's okay because
that means less technical support for friends, go ahead.

~~~
mullingitover
> forbids installing any software that Apple decides it shouldn't be running
> on it

Weird, I got a developer license and was building and installing random github
projects on my iPad without any fuss. If you can't afford the license, go in
on it with 100 of your closest friends and it's only a buck.

> But if you think that's okay because that means less technical support for
> friends, go ahead.

I'd happily pay 100 bucks not to do tech support anymore. I sent my mother my
iPad 2 a year ago and haven't heard a peep from her about computing problems
since.

~~~
carlosrg
So you don't see any issue with paying Apple _again_ to grant you the right to
install what you want on the device you already paid for (and I'm not talking
about distributing it in the App Store). Well, we can agree to disagree.

~~~
mullingitover
It's not that I don't see any issue--I do and it's actually a selling point.
The best security in the game comes at a price that I, and millions of other
satisfied customers, are more than willing to pay. The only people who
complain about Apple's system are people who don't mind doing free sysadmin
labor on a device that they already paid for.

------
kemiller
I have mixed feelings about these sorts of utopian dreams of universal
programming. On the one hand, yes, that would be amazing! On the other,
whenever you make a system that open, inevitably malicious strong programmers
will find a way to take advantage of naive weak ones. I still think it's
possible, but Smalltalk, like everything else we've come up with so far, is
still hopelessly inadequate. I suspect that the problem of creating a programming
environment that anyone can use is roughly equivalent to the problem of
creating an AGI — it has to do what the user means, while seamlessly handling
the thousands of details and gotchas that can sink them.

~~~
javajosh
_> On the other, whenever you make a system that open, inevitably malicious
strong programmers will find a way to take advantage of naive weak ones._

Can you explain this and give an example? I've been around for a while and
don't see, or perhaps don't recognize, this pattern. If anything, I observe
the opposite: very strong programmers _help_ the weaker ones by creating new
languages, libraries, and tools.

~~~
kemiller
I'm talking about malicious hackers. They're a tiny minority, but it only
takes a few.

------
DonHopkins
Alan Kay gave a talk that was recently discussed here on Hacker News titled:
Is it really "Complex"? Or did we just make it "Complicated"?

[https://www.youtube.com/watch?v=ubaX1Smg6pY](https://www.youtube.com/watch?v=ubaX1Smg6pY)

Someone asked Alan Kay an excellent question about the iPad, and his answer is
so interesting, and reveals something so surprising about Steve Jobs losing
control of Apple near the end of his life, that I'll transcribe it here.

To his credit, he handled the questioner's faux pas much more gracefully than
how RMS typically responds to questions about Linux and Open Source. ;)

Questioner: So you came up with the DynaPad --

Alan Kay: DynaBook.

Questioner: DynaBook!

Yes, I'm sorry. Which is mostly -- you know, we've got iPads and all these
tablet computers now. But does it tick you off that we can't even run Squeak
on it now?

Alan Kay: Well, you can...

Q: Yea, but you've got to pay Apple $100 bucks just to get a developer's
license.

Alan Kay: Well, there's a variety of things.

See, I'll tell you what does tick me off, though.

Basically two things.

The number one thing is, yeah, you can run Squeak, and you can run the eToys
version of Squeak on it, so children can do things.

But Apple absolutely forbids any child from putting a creation of theirs to
the internet, and forbids any other child in the world from downloading that
creation.

That couldn't be any more anti-personal-computing if you tried.

That's what ticks me off.

Then the lesser thing is that the user interface on the iPad is so bad.

Because they went for the lowest common denominator.

I actually have a nice slide for that, which shows a two-year-old kid using an
iPad, and an 85-year-old lady using an iPad. And then the next thing shows
both of them in walkers.

Because that's what Apple has catered to: they've catered to the absolute
extreme.

But in between, people, you know, when you're two or three, you start using
crayons, you start using tools. And yeah, you can buy a capacitive pen for the
iPad, but where do you put it?

So there's no place on the iPad for putting that capacitive pen.

So Apple, in spite of the fact of making a pretty good touch sensitive
surface, absolutely has no thought of selling to anybody who wants to learn
something on it.

And again, who cares?

There's nothing wrong with having something that is brain dead, and only shows
ordinary media.

The problem is that people don't know it's brain dead.

And so it's actually replacing computers that can actually do more for
children.

And to me, that's anti-ethical.

My favorite story in the Bible is the one of Esau.

Esau came back from hunting, and his brother Joseph was cooking up a pot of
soup.

And Esau said "I'm hungry, I'd like a cup of soup."

And Joseph said "Well, I'll give it to you for your birth right."

And Esau was hungry, so he said "OK".

That's humanity.

Because we're constantly giving up what's most important just for mere
convenience, and not realizing what the actual cost is.

So you could blame the schools.

I really blame Apple, because they know what they're doing.

And I blame the schools because they haven't taken the trouble to know what
they're doing over the last 30 years.

But I blame Apple more for that.

I spent a lot of -- just to get things like Squeak running, and other systems
like Scratch running on it, took many phone calls between me and Steve, before
he died.

I spent -- you know, he and I used to talk on the phone about once a month,
and I spent a long -- and it was clear that he was not in control of the
company any more.

So he got one little lightning bolt down to allow people to put interpreters
on, but not enough to allow interpretations to be shared over the internet.

So people do crazy things like attaching things into mail.

But that's not the same as finding something via search in a web browser.

So I think it's just completely messed up.

You know, it's the world that we're in.

It's a consumer world where the consumers are thought of as serfs, and only
good enough to provide money.

Not good enough to learn anything worthwhile.

------
lispm
While Jobs saw a Smalltalk system during some Xerox PARC visit, the Mac was
more influenced by the Xerox Star office system with its GUI, Desktop, Icons,
...

http://www.digibarn.com/collections/systems/xerox-8010/retrospective-fig1.jpg

Compare that with a classical Smalltalk UI:

http://research.microsoft.com/en-us/um/people/blampson/38-altosoftware/WebPage_files/image018.jpg

The programming language of the Xerox Star system was 'Mesa', which was much
more like what Apple used: Clascal, Lisa Pascal, and then Object Pascal, ...

http://en.wikipedia.org/wiki/Mesa_(programming_language)

~~~
SlipperySlope
Yeah, back in the day the bitmapped display was so awesome. Look at the
proportional fonts and the bitmapped image!

I was using IBM 3270 terminals, in a special shared room for that purpose, to
edit COBOL programs with fixed-width fonts - ugh.

------
sebastianconcpt
Since Smalltalk lets you write really readable code, I often wonder how much
subconscious fear within the IT community itself is triggered by something
like Smalltalk.

How strong would the self-defence mechanisms be if such a system succeeded at
allowing _anyone_ to be _equally able_ to program _anything_ on the system?

Wouldn't it be way "safer" for the status quo of the IT community to hide
implementation power behind the walls raised by the C learning curve?

Interestingly, the divorce between technicians and users/consumers is steadily
reversing the trend.

~~~
zoul
That’s simply crazy. Are you a programmer?

I, as a programmer, spend my days trying to express things as clearly as
possible. Programming, as a whole field, is trying hard all the time to come
up with safer and simpler ways to express precise thoughts.

Smalltalk is probably neat, but it’s not a magic bullet. There are no magic
bullets. Not everyone can be a programmer, and not because programming
languages are hard; not everyone can be a programmer because you have to be
able to think precisely, abstract, and deal with complexity.

~~~
krig
Yeah, the simple explanation as to why the programming community at large
doesn't agree on languages or methodologies is that no one knows what they are
doing. This is still a very new field. We're all just bashing rocks together.
To declare some language or methodology as universally "better" just reveals a
fundamental lack of insight into just how difficult the problem being solved
really is.

------
washadjeffmad
"...[Steve] Jobs (who could not be reached for comment)..."

I did a double-take after reading that line before I noticed the publish date.

------
coliveira
Despite the fact that computer languages are technological constructs, what
happens to them is similar to what happens with other human languages.
Adoption correlates more with usefulness than with any intrinsic
characteristic of the language. For example, English is the most used language
in the world, despite being a bad language in so many senses: hard to spell,
hard to speak, ambiguous, etc. Other languages more suited to the task are
forgotten -- just take the example of Latin, which is now a dead language. So,
despite the fact that the C family has shortcomings, this has little to do
with its destiny as a way for humans to collaborate in creating software.

------
GuiA
This line of thought is pretty common amongst a certain class of modern HCI
researchers.

See for example a tweet from Bret Victor a few days ago:

[https://twitter.com/worrydream/status/560519372641677312](https://twitter.com/worrydream/status/560519372641677312)

Or this article by Alec Resnick:

http://alecresnick.com/post/84756789325/from-bicycles-to-treadmills

Their argument can be roughly summed up as "We were on the right track with
the constructivist approach to HCI in the 70s/80s, but capitalism ruined
everything and now we just watch Kim Kardashian on our tablets". There's of
course some cynical truth in there -- I also spent my grad school years reading
Papert and Piaget and Kay and Resnick (Mitchell, not Alec), and I also find
their vision of personal computing very enticing. And in many ways I orient my
work to fit within the frameworks they built -- I have nothing but deep respect
for them as academics.

But I don't buy the whole "tablets and phones are just mind-numbing dumb
entertainment devices" line. In the past 5 years, I have seen:

- my younger brother start an internet radio station from scratch, from a
bedroom at our parents', and grow it to thousands of listeners

- my grandmother use the internet to communicate with her geographically
distant friends and family

- teenagers making short movies in their backyard, or learning how to compose
music

- kids discovering a passion for photography, able to retouch photographs
without needing access to a dedicated darkroom, all with a hundred-dollar
device (when I was a kid, a hundred dollars barely got you a Walkman)

- illustrators empowered to work from home and make a living by working with
clients many thousands of miles away

- and many, many more.

None of that would have been possible even 15 years ago. Our tools are much,
much better: GarageBand and Photoshop and iMovie and Fruity Loops and others
offer the means to do things in your bedroom that would have cost tens of
thousands of dollars and required hundreds of square feet of space a few
decades ago. Sure, some purists might argue that Instagram is to a darkroom
what a McDonald's burger is to Kobe beef, but I take issue with that as well
[0].

Naturally, there are people who only use these devices for watching movies and
playing games. That's absolutely fine, and I'm not sure why computer
scientists like to take such a haughty attitude towards that. Not everyone
spends all their waking hours working on their next masterpiece; and in fact,
even the people who produce masterpieces spend idle time doing unproductive
things just on par with reading Reddit or watching silly YouTube videos.

Are computers the best they can be? Far from it, and many of us work very hard
at it (including the people I quoted earlier). But the attitude that we are
now in a tarpit of mind controlling devices and that the golden days of
computing are behind us is deeply wrong. We have come such a long way.

[0]
[https://news.ycombinator.com/item?id=8856780](https://news.ycombinator.com/item?id=8856780)

------
zackmorris
I blame fervent fanboyism of any particular hardware or software for leading
us to such levels of distraction. The article mentioned something dear to my
heart that I want to expand upon because I think it’s at the core of what’s
preventing real progress not just in computer science but in the way computers
elevate humanity. One of my biggest dreams is to see a reconciliation between
message passing languages like Smalltalk/HyperTalk and functional languages
like Lisp. They are more similar than they may seem.

For example, functional languages eschew the use of variables, but I think
that has set back their adoption, because variables merely attach human
readability to logic (like comments). Seeing a big blob of functions is great
for brevity but disastrous for long-term maintenance. A compiler can trivially
treat variables (especially immutable ones) like macros and transform them
back into the same syntax tree that pure functional code generates. To say
that another way: why don't functional paradigms suggest that we break long
blocks of code into shorter blocks with variables? They seem to be
discouraging a human instinct when there is no cost in encouraging it.
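
To make that concrete (my example, not the original comment's): the two
functions below build the same syntax tree once the named, immutable bindings
are inlined, so the readable version costs nothing at runtime.

    // One anonymous expression vs. the same logic broken up with named,
    // immutable bindings. A compiler can inline the consts, so both forms
    // reduce to the same thing; the names exist purely for the human reader.
    function bmiTerse(kilograms, centimeters) {
      return kilograms / Math.pow(centimeters / 100, 2);
    }

    function bmiNamed(kilograms, centimeters) {
      const meters = centimeters / 100;        // name the unit conversion
      const heightSquared = meters * meters;   // name the intermediate value
      return kilograms / heightSquared;        // same result, same eventual code
    }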

Also, if we take a step back and see that eliminating globals gets us most of
the way to functional programming, then the main thing left is the notion of
time, due to externalities like input/output. Imagine for a moment working in
a Lisp environment at a REPL. When you type some code and hit enter, how is
that fundamentally different from a monad? Well, it potentially triggers the
entire syntax tree to be reevaluated, which is actually more expensive than
handling input with a monad. But metaphorically it's similar: we can just
think of monads as places where the logic can't proceed because it's missing a
piece of the puzzle. If we were using functional programming properly and
assumed that we had a near-infinite number of cores and flops, we'd quickly
see that monads are the bottleneck.

We should be able to translate between global-less Smalltalk message passing
and Lisp monads, and let the compiler optimize away any monads that don't
depend on external state. To me, that suggests that working at a level below
human-readable/imperative is generally a waste of time. We should be able to
select any code block and use a dropdown menu to pick the language we want it
presented in. I remember the first time this hit me: I asked myself whether it
was possible to write something in HyperTalk/Smalltalk/C (or any other
imperative language) and convert it to Lisp with these conventions. The
answer, to me, is pretty self-evidently yes. Going the other direction, from
Lisp to an imperative language, is even easier.

A ramification of this is that if you converted a C loop that uses immutable
variables and no globals to Lisp, it would be reduced to the most minimal
logic possible, because analysis would reveal that (for example) the elements
being iterated over have no side effects between one another. In other words,
the compiler could independently parallelize the loop and derive something
analogous to map/reduce. Why are we doing that by hand? Surely there has to be
a better reason than brevity, but I struggle to find one.
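
A rough sketch of the two shapes being compared, with JavaScript standing in
for both the hand-written loop and the map/reduce a compiler could in
principle derive (the parallelization itself is hypothetical and not shown):

    // A loop whose iterations share no mutable state beyond the accumulator...
    function totalLoop(items) {
      var total = 0;
      for (var i = 0; i < items.length; i++) {
        total += items[i].price * items[i].quantity;   // each step independent of the others
      }
      return total;
    }

    // ...and the map/reduce form that could be derived automatically, because
    // the per-item work has no side effects between elements.
    function totalMapReduce(items) {
      return items
        .map(function (item) { return item.price * item.quantity; })
        .reduce(function (sum, x) { return sum + x; }, 0);
    }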

When it’s all said and done, I think the arbitrary distinctions we’ve made
between computer languages are relevant to education and worker productivity,
but in the end are fantasy. I would have thought by now that we would have
dropped the pretense. If computers worked the way I had imagined they would by
this point in history, a tablet would have at least 1000 cores (or drop the
notion of cores entirely and go with functional hardware), the
compiler/runtime would consider the latency between it and the other devices
around it in the mesh and adapt accordingly, all processes from the kernel to
userland services to executables and threads would just be functions with the
minimum permissions necessary to do their jobs, and most importantly usage
would be reversed so that users would write macros through the use of human
language rather than try to figure out how to do a task by hand from a locked
down sandbox. All this code would be shared out to the world through some kind
of hybrid Git/BitTorrent so that the solution for how to perform some
declarative programming task would almost always already exist. And all of
that would be constantly evolving with genetic algorithms and other software
agents.

The tablet may as well not exist, because with the world’s computing power at
your fingertips, why is your dumb terminal any better than another except for
eye candy? It’s reminiscent of idol worship. Kind of gives me the heebie
jeebies actually. And tragically moves us further away from technoliteracy
with each passing moment.

------
guelo
(2011)

------
SlipperySlope
In the early 1990s, VisualWorks Smalltalk became the language that introduced
enterprises to object-oriented programming.

At FPL, the large Florida electric utility where I worked, we went all-in,
received training, and, with pricey consultants, built several client-server
applications using GemStone as the Smalltalk server.

What most developers loved about Smalltalk was the liveness of the system: a
just-in-time compiler made it possible to develop inside a debugging
environment. You changed a line in a method, and immediately you could
interact with the revised app.

In 1995, Sun Microsystems released Java whitepapers. When I saw those I
thought - Java is Smalltalk with C syntax, but without the image-derived
liveness.

FPL switched from Smalltalk to Java, gaining all the benefits of reusable
code, and object-oriented libraries.

I miss Smalltalk syntax, in particular named method arguments, and to this day
comment my Java code to compensate like this...

    
    
        sendMessage(Message.notUnderstoodMessage(
                message, // receivedMessage
                this)); // skill
    

where the method parameter names appear as comments if the calling method does
not have local variables of the same name.

A particular feature of the VisualWorks Smalltalk implementation was a
concept known as "Save to Perm" which, after a thorough garbage collection,
moved the remaining long-lived objects, e.g. nearly the whole language
runtime, to a memory buffer thereafter free of garbage collection.

Even after two decades, no Java implementation has this feature without
resorting to off-heap storage.

IBM got on board the object-oriented development wave in the early 1990s and
introduced VisualAge Smalltalk. They bought a Smalltalk version-control
product, "Envy", and created the development environment that has over the
years evolved into the famous Eclipse (IBM's Eclipse "blocks out" Sun
Microsystems' Java).

~~~
thewarrior
I make an app at work using a JavaScript framework called Cappuccino. It
transpiles an Objective-C-like language to JS, allowing Cocoa developers to
develop web apps.

Initially it was very tedious to develop apps, as you'd have to refresh the
page each time, and the transpilation times began to add up. I read about
Smalltalk and realized that Objective-C is very close to Smalltalk, and hence
I could implement hot reloads in a similar way.

I proceeded to make a watcher process in Node.js and made a process in the
page listen via WebSockets. Whenever a file changed, I'd just fetch the file
and execute it again, replacing it in the central class repository. All
existing objects would instantly have their implementations changed.
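
For what it's worth, the arrangement boils down to something like the sketch
below (a reconstruction, not the actual code; the port, the watched path, and
the "just eval it" step are placeholders for whatever Cappuccino really needs
to recompile and swap a class):

    // Watcher side (Node.js), using the "ws" package: broadcast the name of
    // any file that changes under src/.
    var fs = require("fs");
    var WebSocket = require("ws");
    var server = new WebSocket.Server({ port: 8080 });

    fs.watch("src", { recursive: true }, function (eventType, filename) {
      if (!filename) return;
      server.clients.forEach(function (client) {
        client.send(filename);                  // tell the page which file changed
      });
    });

    // Page side: re-fetch the changed file and execute it, so the classes it
    // defines replace the old definitions and existing instances pick up the
    // new method bodies.
    var socket = new WebSocket("ws://localhost:8080");
    socket.onmessage = function (msg) {
      fetch("/" + msg.data)
        .then(function (res) { return res.text(); })
        .then(function (source) { eval(source); });
    };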

It's much more fun to develop this way, but it's also easier to make a mess.
Still, it has sped me up enormously, and I end up missing it when I code in
other object-oriented languages.

------
johansch
The Smalltalk environment, while technically impressive, had very low
usability for beginners. Just download Squeak and try to use it! It was
basically academic in every sense of the word.

Or put another way: yes, Smalltalk addressed programmability, but it did not
address usability.

~~~
gaius
Squeak is not representative of the original Smalltalk.

~~~
johansch
You mean the original Smalltalk was more user-friendly?

------
cbd1984
Is this article just not loading for anyone else? I have all scripts running
and everything else the webpage wants, and there's no content there. Just the
loading icon.

~~~
SlipperySlope
Works OK for me - Chrome on Ubuntu.

------
webwielder
Always important to establish credibility by getting the name of the company
you're writing about wrong in the title.

