
Lost Lessons from 8-Bit BASIC - ingve
http://prog21.dadgum.com/198.html
======
chx
There is one more important thing here: the 8 bit computer immediately invited
you to start coding. The barrier to entry was incredibly low to the point of
nonexistent. Compare this to a laptop today or even worse a tablet.

I very strongly believe that perpetuating a "code me!" mindset vs the "consume
me!" mindset has really big consequences. The sense that you can make your
computer do something yourself, that the kind of game you bought with your
computer is something _you_ could attain as well (which is not necessarily
true -- some of those games were incredible feats of programming, but still,
the illusion was there), is completely, absolutely missing today.

And the consequence is: you get (at least the illusion of) the possibility
that you yourself can create something that is easily spread worldwide. The HN
readers will argue the huge audience of tablets makes up for this but the
problem is -- most people will never even think they can do it. That's the
problem: did your iPad come with a manual for a programming language? I didn't
think so.

Program or be programmed. [http://www.rushkoff.com/program-or-be-programmed/](http://www.rushkoff.com/program-or-be-programmed/)

Kids Can't Use Computers [http://coding2learn.org/blog/2013/07/29/kids-cant-use-computers/](http://coding2learn.org/blog/2013/07/29/kids-cant-use-computers/)
(I know there is a lot of controversy about this article but it does have
valid points.)

~~~
userbinator
Even the IBM PC and XT had BASIC in ROM, which worked much the same way; and
it also came with a complete set of schematics and the source code of the BIOS
(I don't think the source for ROM BASIC was included, since it was licensed
from Microsoft.) While it wasn't truly "open source" in the legal sense and
was still copyrighted, it gave users a chance to learn how their machines
worked all the way down to the hardware level, and along with that came the
feeling that they actually _owned_ the hardware they bought. DOS also came
with a version of BASIC and DEBUG, a simple debugger that allowed the
creation, editing, and testing of assembly programs. I remember the magazines at
the time would have source code listings of simple programs for readers to
type in and use. This isn't really "programming" in the sense of "write your
own code", but it definitely provided a "catch-point" \- some of the more
curious might modify the program and see its effects, or do more research
about e.g. the instructions and BIOS calls it used, and eventually start
writing their own.

Several decades later, you're lucky if you can find even a detailed datasheet
or programming manual for the most important chips in a computer. BIOSes are
all closed and proprietary, with the exception of minority projects like
Coreboot. The relatively few schematics for commercial PCs only exist because
someone was nice and neighbourly enough to leak them. I think the gradual
shift towards consumer-oriented design is part of it, but security also had a
chilling effect: belief in "security through obscurity" and the idea that
users shouldn't be developers has led to a situation in which access to
development tools and information is seemingly treated as a privilege instead
of a right, and systems are correspondingly locked down against users (but
they'll all say this was to prevent "malicious attackers"...)

The walled gardens of Apple's iDevices, Microsoft's position on Secure
Boot/Trusted Computing, and increasing prevalence of other schemes like DRM
designed to take control away from users and strongly push a consumer-oriented
mentality are great evidence of this effect. More subtly, dumbed-down
software designed to be "easy to use" takes away much of the incentive to learn
about how things work that is often responsible for transforming consumers
into producers. No doubt the companies like this because they want to be in
control and regulate the creation of software; as I and others have said
before, "knowledge is power, and they don't want the users to have too much of
it." However, I don't think they're ultimately going to benefit from this
practice, since by encouraging users in the direction of consumption, they'll
be reducing the number of potential good developers in the future.

From that article you linked to:

 _A kid puts her hand up in my lesson. ‘My computer won’t switch on,’ she
says, with the air of desperation that implies she’s tried every conceivable
way of making the thing work. I reach forward and switch on the monitor, and
the screen flickers to life, displaying the Windows login screen. She can’t
use a computer._

Having done some work helping with teaching before - in a _computer science_
course - the number of times I've seen this happen is astounding. A large
number of the population seem to have this condition where it appears their
brain completely shuts down the moment they're put in front of a monitor, and
I think a large part of it has to do with the notion that computers are
somehow "magical" and "mysterious" things that don't follow the same rules of
the universe as everything else.

~~~
pavlov
_... I think a large part of it has to do with the notion that computers are
somehow "magical" and "mysterious" things that don't follow the same rules of
the universe as everything else._

Excellent point. Computers are indeed like bubbles of custom universes that
don't follow the rules of this one. The web doesn't have a real-world
metaphor.

How do those universes and their foundational laws get created? By
programming, of course. We all know that the stuff computers do is just built
up from simple stuff that's been meticulously assembled into layers of
increasing complexity: electrons on tiny wires, ands and registers, ifs and
structs, packets and sockets, requests and threads, etc. But that knowledge is
not discoverable by using a computer today.

It's no wonder that children don't know how to deal with failures that occur
in the real-world interface of these custom bubble universes, when all they're
taught about that interface amounts to magic gestures that trigger actions
within the bubble. When a magic wand stops working, how do you fix it?

~~~
joelanders
Regarding machines and discoverability: I think an interesting comparison to
flesh out would be computers vs. cars. Compare how tinkerers in each came to
be (opening up their parents' car/computer, for example). Compare how "the
average user" treats the thing when it breaks. Etc.

------
fit2rule
I've still got all my machines from that era. They still work, they're still
highly entertaining, and very, very useful. My kids (6 and 4 yrs) are learning
to read, write, spell .. and do math .. with the same machines I used when I
was 13.

This just points out, to me, how arbitrary technology really is. All the
energy that went into building that C64 is wasted if the thing ends up on the
trash heap .. but dust it off today and someone, somewhere, will still find a
use for it.

    
    
        "Where did the IDE go wrong?"
    

I think where things went wrong is the disassociation of 'developer' from
'user' that happened as a consequence of marketing-grads getting involved in
the business of computers. I've never considered an OS truly 'user friendly'
if it doesn't ship with everything on board that a person would need to build
applications for it - and that is something the BASIC guys did well, back in
the day.

(Which is why I think that things like LOAD81 are so darn cool .. ;)
[http://github.com/antirez/load81](http://github.com/antirez/load81))

~~~
zokier
> I think where things went wrong is the disassociation of 'developer' from
> 'user' that happened as a consequence of marketing-grads getting involved in
> the business of computers

Somehow this happened to happen at the same time as computer users rose from
1% of the populace to the vast majority. We probably ended up with _more_
"developer" guys this way.

~~~
fit2rule
It's difficult to really say, but you may be 'right' in the sense that the more
people you throw at something, the greater the variety of skill sets you have
to accommodate. In the early days, "only nerds used computers" - and now look
at us. Still, I blame the tools vendors for not making it viable to include
developer-style applications as a basic default built-in - certainly not true
of most OS's, except 'those built for consumers'. Imagine if we'd had the
temerity, we nerds, to demand that the dev tools be treated as 1st-class
applications in the OS/execution environment? I suppose we'd all be using Lisp
machines, eh? :P

------
jacquesm
To me a huge loss from that time is that you could still completely understand
what your computer was doing and what the software running on it was. Bloat
has solidly killed that possibility, you could not even understand all the
code on your phone these days if you wanted to, let alone your desktop
machine.

~~~
Narishma
Nowadays a phone and a desktop computer are at about the same level of
complexity in terms of code.

~~~
orf
Sorry, got any sources to back up that claim? Smartphones are indeed complex
but I highly doubt Windows phone has the same LOC as Windows 8 (despite them
sharing the same core), same for iOS -> OSX

~~~
tim333
The download is certainly bigger for OS X Mavericks (5.3 GB) vs iOS 7 (750 MB).
OS X seems a lot more open in terms of understanding what's going on, though.

------
tragomaskhalos
As a ZX Spectrum veteran the only things I really missed - in the sense that
they made my bigger programs unwieldy - were precisely those cited in the
first paragraph, viz calling subroutines by name rather than line number, and
parameter passing. Oh yes and an Else statement. Of course those snooty BBC
Micro kids had all that IIRC.

The Spectrum community was fantastic in those days - in addition to a plethora
of magazines there was the "ZX Spectrum ROM disassembly" (which I still
possess) that gave an annotated listing of the whole 16K ROM; basic
interpreter, fp calculator, cassette tape routines, the lot; an absolute
goldmine.

So an entire ecosystem that basically screamed "program me!". A beautiful
time.

------
Aardwolf
I know it's not exactly the same, but open the Javascript developer console
in a browser, and you have somewhat similar capabilities at your fingertips.
You can alter your whole environment if you consider the particular webpage
you're visiting to be your whole environment. You don't need to do anything
special for "cos" there either, just type Math.cos(2) / 2.

And if you're using Linux, it's easy to be welcomed by the bash prompt of
course (and I hear Macs come with a Unix terminal these days too).

------
jhallenworld
You can easily have this on a Linux machine:

First install basic: apt-get install bwbasic

Next, find where getty starts the login program and change it to run BASIC
instead. In Ubuntu: /etc/init/ttyS0.conf:

    
    
       start on stopped rc RUNLEVEL=[2345]
       stop on runlevel [!2345]
       respawn
       exec /sbin/getty -8 -n -l /usr/bin/bwbasic -L 115200 ttyS0 vt102
    

You will see this on the serial port:

    
    
       Bywater BASIC Interpreter/Shell, version 2.20 patch level 2
       Copyright (c) 1993, Ted A. Campbell
       Copyright (c) 1995-1997, Jon B. Volkoff
     
    
       ERROR: Failed to open file --
       bwBASIC: 
       bwBASIC: print "Hello, world!"
       Hello, world!
       bwBASIC: 10 for a = 1 to 10
       bwBASIC: 20 print "Hello ", a
       bwBASIC: 30 next a
       bwBASIC: run
       Hello         1
       Hello         2
       Hello         3
       Hello         4
       Hello         5
       Hello         6
       Hello         7
       Hello         8
       Hello         9
       Hello         10
       bwBASIC: 
    

bwBASIC is a shell.. so you can type "ls".. or "exec emacs"..

~~~
JoeAltmaier
That's pretty similar to what the OP covered, except part of the attraction
was NOT having to install or configure special tools. Remember, having BASIC
always on was what made it so accessible.

I get it, after you do this you can just log in and voila, BASIC. Still,
until Linux comes with this as a standard user login, there's a gap in the
experience.

~~~
davb
Agreed. There was something universal, almost magical, about ROM BASIC.

For me it was Commodore BASIC on the C64. I turn the machine on, and instantly
there's a little flashing prompt inviting me to give it some input.

Having to enter commands here to LOAD from tape was what initially sparked my
interest in programming. I'm telling the machine what to do, and it's doing it
right away. Wow.

There was no barrier to entry. And despite valiant efforts these days, it was
so much better than the modern "making programming accessible" ideas. I didn't
have to install anything. Every C64 I would encounter had this functionality.
It didn't feel like a dumbed down second-class-citizen environment. This was
how you interacted with the machine.

I think the main-streaming of general purpose computers has killed this. I
can't imagine booting my primary machine to a BASIC prompt now (or even a
shell, I guess). It would make listening to music, watching videos, checking
my bank balance or accessing my virtual machines for work impractical.

I don't have separate media devices. I can't just throw on my headphones and
listen to my walkman while my computer is busy with something.

Although modern computing has unlocked a lot of potential, I think we've made
too many compromises. Outside of industrial and scientific applications, most
computing devices try to be a jack of all trades, being everything to
everyone, and doing poorly at it. Like so many contemporary software engineers
(myself included - I mean, the modern tools and frameworks are too large and
diverse know as intimately as we used to know computers).

Sigh.

------
Turing_Machine
Hmm.... the JavaScript console in modern browsers like Chrome and Firefox is
probably the closest thing we have to the old 8 bit BASICs. It's always on,
and available at the click of a mouse.

It's not as directly "in your face" as the BASIC was, of course, but it's
there.

------
dylanrw
I bought a mint C64 off of eBay last year; included was its original
Programmer's Reference. It immediately had me missing the afternoons where the
limited functionality and simplicity of it all invited you to peek and poke
your way around the system trying to coax music and art out of the hardware.
Even people who weren't 'savvy' understood this, e.g. in the 6th grade our
librarian gave us simple programs to type in and encouraged us to change them.

~~~
dylanrw
[http://cl.ly/image/3Z2I3r051J0O](http://cl.ly/image/3Z2I3r051J0O) <- Simple
Memory Map of the C64, those were the days!

------
laumars
I think the article misses the point a little bit. The IDE facets that the
author fondly remembers aren't part of the BASIC programming language; they're
part of the command shell. It's loosely akin to your terminal emulator running
Bash. By default it works in real time but you can write more complicated
routines programmatically and then run them at your convenience; and you can
do so from the shell prompt (either via aliases, shell functions or just
echo'ing to a shell script in a similar fashion as his line numbered example).

Plus a lot of his complaints seem to be about the modularisation of modern
languages - which seems an odd complaint to make in my opinion. If anything,
I'd personally argue that importable, self-contained chunks of code are one of
the single greatest advances.

He definitely has a point that the barrier for entry these days is much higher
(and this is probably why so many kids these days fall into web development
over native applications) but I think the examples he's used don't justify the
conclusion he's trying to draw. And neither do I agree that regressing to a
BASIC-like environment would fix the problem. I think the problem is simply
expectation - people expect so much more that there often isn't the patience
to start with the basics. Plus the "code me!" vs the "consume me!" mindset
raised by chx[1] erodes what little patience some might have.

That's my 2c worth anyway

[1]
[https://news.ycombinator.com/item?id=8256211](https://news.ycombinator.com/item?id=8256211)

~~~
JoeAltmaier
BASIC and its command shell were in an 8KB ROM. They were one monolithic
program. You've got modern ideas and are trying to apply them to a 20-year-old
environment. It's not 'like' a lot of things; it came before them, so the most
you could say is that those things are like BASIC.

What BASIC was, was an extremely accessible try-it-now environment that needed
no installation, no setup, no environment variables, no directory structure.
It was what we thought of when we thought of interpreters, as opposed to
compilers. But then interpreters got all file-oriented and broken too.

So there have been a lot of improvements since then. But some of what we lost
was very, very different. And some of it was valuable in a way.

~~~
laumars
_> BASIC and its command shell were in an 8KB ROM. They were one monolithic
program. You've got modern ideas and are trying to apply them to a 20-year-old
environment. It's not 'like' a lot of things; it came before them, so the most
you could say is that those things are like BASIC._

Except what you've described is exactly what I described with Bash ;)

 _> What BASIC was, was an extremely accessible try-it-now environment that
needed no installation, no setup, no environment variables, no directory
structure. It was what we thought of when we thought of interpreters, as
opposed to compilers. But then interpreters got all file-oriented and broken
too._

Again, all that is comparable with Bash:

1. Bash doesn't need any installation on most UNIX-like systems. But even
in the rare case that Bash isn't bundled with your OS, there's the Bourne
Shell (sh) or even Windows cmd.exe (for all its faults).

2. Bash et al don't require any setup.

3. Environment variables are just global OS variables, and since BASIC
wasn't namespaced, all variables were effectively environment variables.

4. Directories are a file system thing, not a language thing. However, it's
worth noting that many BASIC systems did have a directory structure, even
though (at that point) nested subdirectories didn't really exist. You could
have different storage devices that were switchable (akin to changing drive
letter in DOS) and you could store multiple files on a particular medium. In
fact I still have a stack of floppy disks for an Amstrad CPC 464 (which ran
Locomotive BASIC) in my attic that would testify to this.

5. Bash et al are interpreters.

Don't get me wrong, I romanticise about the old days too. They were fun. But I
don't think you can blanket-say _"[it's a] 20-year-old environment. It's not
'like' a lot of things"_ just to dismiss any arguments you dislike.

~~~
JoeAltmaier
I only disliked the wrong arguments :) Like

"The IDE facets that the author fondly remembers isn't part of the BASIC
programming language, it's part of the command shell"

There was no such distinction. That's recasting of the facts into something
understandable today.

Know why the language keywords were so short? Because in an 8K ROM the symbol
table was a significant hit on resources. Add a keyword? Means remove some
other feature. It was a whole different world, with different constraints. And
still it was usable, accessible, friendly even. At least when coming from
nothing to a computer (instead of coming from 20 years of growth and confusing
hindsight with foresight).

~~~
laumars
_> There was no such distinction. That's recasting of the facts into something
understandable today._

But that's the whole bloody point of the article. If you have an issue with
that then take it up with the author rather than me.

Please also remember that the Bourne Shell is as old as the BASIC
microcomputers. So my comparisons are of two environments of the same age,
rather than older systems vs modern systems (like you keep accusing me of).

And around the same time some microcomputers (even the lower end ones) would
support other interpreters (e.g. the BBC Micro supported BBC BASIC, LISP, LOGO,
Fortran, and a few others) and you could switch between languages like you
switch shells in Linux. Which is also why I like to make the distinction
between the language and the shell.

 _> Know why the language keywords were so short? Because in an 8K ROM the
symbol table was a significant hit on resources. Add a keyword? Means remove
some other feature. It was a whole different world, with different
constraints. And still it was usable, accessible, friendly even._

I know - I was there. And while it's interesting, it's also irrelevant to any
of my or the author's points.

------
Zardoz84
I remember when I used my ZX Spectrum to study multiplication tables with a
program I made in BASIC that showed me the tables and then asked me random
multiplications from a table. Fun times...

------
lmm
This has something of the Smalltalk image-based approach (and the author might
enjoy using a Smalltalk development environment). IMO the costs outweigh the
gains - it's worth decoupling programming a computer from using it. When
source code is just text files, you can manipulate it with lots of powerful
tools; even better, you can use tools from different languages whose authors
never talked to each other. When the shell and the compiler are just user-mode
programs, they can iterate much faster, and you can choose one that suits your
style. You can use the same program for both, if you really want to - for a
few weeks I used tclsh as my login shell - but it turns out the tools you need
for programming are quite different from those for general computer use.
Division of labour is ultimately a good thing, even if it means less of the
population has any specific skillset.

------
VLM
No commentary on the results of a quick Google search of "tryruby",
"trypython", "tryclojure" - pretty much try*, where * is a currently popular
language? Or, even less cool, I found a "try brainfck" at

[http://www.compileonline.com/execute_brainfk_online.php](http://www.compileonline.com/execute_brainfk_online.php)

I'm specifically pulling lists of "try" services rather than the much heavier
"I am an IDE in your web browser" services which might be somewhat
overwhelming as a first introduction to programming...

Take your web browser appliance, make
[http://tryclj.com/](http://tryclj.com/) your web browser appliance's home
page, all done.

------
tim333
I kind of miss BASIC. One of the good things in those days was that printing a
few lines or drawing a graph was about the limit of what the computer could
do, so it seemed cool. You can now run BASIC in a javascript interpreter in
your browser
([http://www.calormen.com/jsbasic/](http://www.calormen.com/jsbasic/)) but
it's not cool anymore. I'd guess the nearest equivalent of something simple to
learn that you can impress your mates with these days would be javascript -
then you can make apps, funny effects on webpages and BASIC interpreters if
you get good at it.

------
omnibrain
I just finished reading Petzold's Code after having it on my reading list
for several years. I expected something along the same lines as Code
Complete and got something entirely different. And boy, was I in for a ride. I
read it in two days straight. I think this book can offer an answer for those
of us who were too late for the type of computer mentioned in the linked
article. Do any of you have experience giving this book to people who don't
have much to do with technology, or even to children?

~~~
zokier
> I think this book can offer an answer for us who were too late for the type
> of computer mentioned in the linked article

What is the question for the answer that the book gives?

------
davb
I think the best thing anyone can do to encourage people to learn programming
(with what we currently have) would be for Google to include a RAD IDE, with a
simple interpreted programming language like Python, in the OS itself.
Something standard, across one of the currently most popular platforms. When
first launched, it should give you a programming manual.

But I can imagine them ruining it. It would be tied to their cloud service (oh,
my tablet's offline? Half of the features won't work, the manual is gone, and I
can't share my code). It would be updated all the time, so it wouldn't be
stable. The runtime and IDE would be fragmented (one of the benefits of ROM was
that it was expensive to burn, so it tended to be quite stable over a long
period). And of course there would be the AOSP version and the Google version,
furthering fragmentation.

Modern computing has turned me into a cynic. I'm slowly starting to hate what
our industry has become. By the time I was old enough to start my professional
career, the world I had fallen in love with was gone.

~~~
teamonkey
Not exactly what you mean, but...

    
    
        Ctrl+Shift+J
        
        console.log("Hello World");

------
smacktoward
As a kid, I first learned how to program in an 8-bit BASIC environment. This
proved to be a mixed blessing.

On the plus side, it was every bit as approachable as this post makes it
sound. Not just because of its "always-on" nature, but because of the
relatively small learning surface it presented: line numbers, GOTO, a few
operators. It was _comprehensible_ in a way that more complex languages
weren't.

On the minus side, soaking my young mind in the paradigm of structuring
program flow around line numbers bent it in ways that didn't become apparent
until I got a little older and tried to graduate to languages like Pascal and
C. I struggled to get comfortable in these environments in ways that other
peers with less programming experience did not, precisely because I had
internalized so much of BASIC's skewed way of thinking about program
structure. It took a fair bit of time to un-learn the bad habits all that
BASIC programming had taught me.

~~~
sharpneli
I learned my ropes with QBASIC. It's as approachable as good ol' BASIC, but it
had text labels instead of line numbers.

Thus, for me, the biggest steps in moving to C when I was 12 or so were string
processing, the concept of having to compile, and the weird include files. And
why do I have to pass -lm to djgpp so I can use cos? Those were the days.

------
markbnj
David Brin and others have lamented the lack of a ubiquitous, interpreted,
always-there programming environment to help get kids started down the path to
software development. I tend to agree that it is a lot harder to penetrate the
first layer than it used to be. So, from someone who also started with ROM
BASIC, thanks for the memories.

~~~
chillingeffect
Montfort et al described the same thing in the book "10 print
chr$(205.5+rnd(1));:goto 10" [0]

[0] [http://trope-tank.mit.edu/10_PRINT_121114.pdf](http://trope-tank.mit.edu/10_PRINT_121114.pdf)

------
derekp7
What this article boils down to is two things: 1) Computers used to have a
built-in language that would be immediately available, and 2) that language
had a REPL.

If you wanted to get the same experience in any REPL without line numbers (and
with Lambdas), you could say:

    
    
        mainprog = lambda{some lines of code};
        exec mainprog;
    

instead of:

    
    
        10 line 1 of code
        20 line 2 of code
        RUN
    

In the case of BASIC, prefixing a line number is a shortcut for saying
"variable LINE-XX = lambda{some code}". Now we just need to get computers to
come up with a standardized REPL readily available at boot time. Maybe if web
browsers started to default to having a Javascript console prominently
displayed at all times.
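
The line-number-as-lambda idea above can be sketched concretely. This is a toy
illustration in Python (not how any real BASIC was implemented): numbered
statements are stored in a dict keyed by line number, and RUN simply executes
them in ascending order.

```python
# Toy model of a line-numbered REPL: each numbered line is a stored
# closure, and run() executes the closures in line-number order.
program = {}

def store(number, thunk):
    """Entering '10 PRINT ...' amounts to binding a lambda to line 10."""
    program[number] = thunk

def run(env):
    """RUN: execute every stored line, lowest line number first."""
    for number in sorted(program):
        program[number](env)

# Roughly the equivalent of:
#   10 FOR A = 1 TO 3
#   20 PRINT "Hello", A
store(10, lambda env: env.update(a=range(1, 4)))
store(20, lambda env: env["out"].extend(f"Hello {a}" for a in env["a"]))

env = {"out": []}
run(env)
print(env["out"])
```

Replacing or deleting a line is then just reassigning or removing a dict
entry, which is exactly the editing model the old BASICs gave you.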

------
cturner
If you wanted to recapture that feel of an instant-on console, you could set
up a computer that booted into lighttable. It supports mouse and graphics and
music. But to get a web browser up, you call a function.

------
mncolinlee
The author mentions jokers breaking out of his Atari demo loop written using
BASIC.

I first learned BASIC on the Radio Shack TRS-80 Model 1. As a child, I had a
favorite jokester trick involving QBasic on Windows machines. I'd visit a
department store, break out of each store computer's demo loop to DOS, and
then type up a simple QBasic loop with random SOUND calls, where I had
painstakingly tested and memorized random frequency ranges and delays to
simulate the sound of running water.

Returning these computers to their store demo loop left a sea of scratched
heads in my wake.

------
xxs
I started with BASIC when I was 6. That thing had a built-in calculator with
parentheses (and calculators were not that common in the early 80s). So it was
totally fascinating. Of course, I never realized there could be anything but
global variables, or that GOTO was considered bad style.

It was possible to code a small program that draws on the screen (with ijkl)
in several minutes. Fun times indeed, but I am not sure it's applicable to
young kids any more.

------
narag
_How did IDEs go so wrong?_

I'd say it's a rhetorical question, but the IDEs part makes me doubt.
Programming is no longer immediate, graphics are difficult. All that.

------
drblast
The best thing about line oriented languages like BASIC was that, lacking
abstract functions, the language worked exactly like the machine worked. If
you wanted functions and parameters, you had to implement that much like you'd
do if you were writing assembly.

Learning BASIC with line numbers and GOTO meant learning how the hardware
control-flow worked. This is abstracted away in all languages today.

------
McUsr
If you are on a Mac, then you can really do all this with AppleScript, which
is also a bit Smalltalk-like and which, as of Mavericks, lets you create
libraries quite easily.

I use it a lot as a calculator, when I don't use it for automating the UI. It
is great; IMO, the best thing about a Mac.

~~~
vidarh
You can do it with environments provided with most OS's. The big difference is
that it is not put in front of people to the same extent that makes it
trivially discoverable, and available from the _second_ you press the on
switch in the case of most of the 8-bits.

AppleScript is a particular peeve of mine. I was a long-time Amiga user. I was
used to "everything" being glued together via ARexx ports. OS X has the
capability, yet the environment is very different. It is far more rare
(presumably not Apple's or developers' fault per se - more a reflection of a
different demographic) to see OS X application manuals call out script
integration as a major feature, or "hang" all their own automation off of it,
even for the ones with good script integration (and though I'm a Linux user at
home, this is a sore point: Linux script integration is _woeful_ in
comparison).

------
user_id3
I think the important message here is that simple programming has long been
the niche of hobbyists and entrepreneurs; it's about time that we cornered
this market and commoditized it. It should be really easy, since it's small
and unimportant.

------
t__r
Don't forget that the first IBM PC (8088, 16-bit) also had ROM BASIC, as well
as a cassette port.

