
Learning BASIC Like It's 1983 - fcambus
https://twobithistory.org/2018/09/02/learning-basic.html
======
jcmeyrignac
I'll share my own experience, since I started computer programming in 1982 (on
a 6502 based computer called Oric-1), then worked in the game industry for 18
years.

What I remember from these early times:

1) We had only one television at home, so typing in programs required access
to the TV. This is why I spent a lot of time analyzing programs BEFORE typing
them in, since I didn't have much time at the keyboard. This probably taught
me a lot!

2) I always wanted to improve the programs I typed, so I spent a lot of time
optimizing them. This also proved useful later ;-)

3) Some programs included mysterious hexadecimal characters. I tried to find
documentation about them, which was difficult because information was scarce
and there was no Internet. One day I had an aha! moment, and that was the day
I discovered the 6502. This proved useful, since I went on to write quite a
lot of games in 6502, and it got me my first job in the game industry in 1985.

4) In France, there was a wonderful newspaper called Hebdogiciel. It contained
programs for all kinds of computers. I tried to convert these programs to my
computer, and this also taught me how to convert between BASIC dialects. My
first job involved converting BASIC programs between various computers
(Thomson TO7 <> Exelvision).

5) Everything was so new and exciting! Nowadays, I don't feel that kind of
excitement; everything is so easy to put in place. At the time, we only had 48
to 64 kilobytes of memory, and everything was a challenge. The computers were
not designed for writing games, but games were doable.

~~~
ur-whale
The lack of access to documentation was hard, very true, but it also meant
reverse-engineering was the only way to proceed, and from a certain angle
that was a good thing.

~~~
japanuspus
This! Still have fond memories of reverse-engineering the screen-buffer
encoding on my Amstrad CPC 464 by poking values directly into the buffer and
watching what happened on the display (i.e. my TV). Somehow got hold of the
Z80 datasheet for opcodes - but working out the relative jumps with pen and
paper was a slow process. Upgraded to Amiga around the time when I realized
that assemblers had to be a thing...
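
The poke-and-watch experiment described above can be sketched in a few lines
of Locomotive BASIC. The CPC's screen memory does start at &C000 by default,
but the particular loop below is just an illustrative guess at the kind of
thing one typed, not a recovered listing:

```basic
10 REM WRITE BYTES DIRECTLY INTO SCREEN RAM AND WATCH THE DISPLAY
20 FOR A = &C000 TO &C000 + 79
30 POKE A,255 : REM 255 SETS EVERY BIT AT THIS ADDRESS
40 NEXT A
```

Where the stripes actually land on screen is what reveals the CPC's
interleaved line layout - the encoding being reverse-engineered.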

~~~
3chelon
Yes, I was coding in hand-coded hex machine language for a couple of years
before I had access to an assembler. These days, when an IDE drops into
assembly and all my 20-something colleagues duck for cover, I rub my hands
together in glee...

------
YouKnowBetter
I cannot agree more. I started when a computer did not have an OS, let alone
1,000 applications.

You got dropped right into the interpreter. There wasn't already a gigabyte of
OS loaded that you had a hard time learning. All the code that was there was
what you wrote (or copied) yourself.

The processor and interpreter were about as fast as you could think, so it was
easy to follow and step through (mentally). Reversing to assembly was the
logical next step, and since the programs were small, it was easy to learn and
memorize.

After years of coding yourself, you'd stumble on the first OS, which consisted
of the most rudimentary libraries that one could basically read & remember.

Years later still, the first rudimentary networking picked up. It was slow and
not business-critical, so again easy to experiment with. By the time I
connected the first commercial network to "the internet", an email outage of
less than 24 hours wasn't even noticed.

I do not envy the kids who nowadays stand zero chance of ever learning the
complete stack of code running on any modern device. From what I see, they are
all "stuck" on top of a GUI with only the slightest idea of what happens
between their mouse and the actual hardware (and even that is often not
hardware anymore).

~~~
3rdAccount
I haven't read it yet, but I think Nand2Tetris was meant to address this.

Petzold's CODE is also really good going from logic gates to microprocessors
to assembly language.

I still wish you could buy something like a Pi board that has just an
interpreter and compiler on it as well as a textbook and you implement a
simple version of a file system, text utilities, task manager...etc.

~~~
aerotwelve
I find myself asking the same things when it comes to the lower-level software
side of things. I find this area fascinating as well, but it's all so
"invisible" in modern systems.

I graduated from a CS program (BS) a few years ago, and the projects I'm still
most proud of existed "lower" in the stack: implementing FAT16, writing
compilers/interpreters (I still have my Brainfuck interpreter!), playing
around with paging in MULTICS, and the like.

These were all very toy-like (with good reason, a semester is only so long),
but since I enjoyed them I've found myself asking things like "I wonder how
the process scheduler/virtual memory manager in OS X is handling [whatever I'm
doing at the moment]? What does my stack look like right now? How are all of
these threads communicating with each other, and what are they saying?"

You can occasionally see this in the console when something is going wrong,
but usually not when the system is operating normally. When the OS is handling
a heavy load brilliantly, that's kind of when I'm most impressed, and
therefore most interested.

(maybe this kind of procedural output would be dreadfully boring or unreadable
due to the complexity of a modern system if I actually saw it, I don't
actually know of course.)

~~~
exikyut
`perf top` on Linux, and doing a (really) deeeep dive on the perf and BPF
APIs, may be of interest.

The possibly-incorrect impression I get of the performance-analysis-related
areas of the kernel is that they're a bit siloed (in the same way that X11 is
siloed, and only a very small group of people look at it), which may render
them a bit functionally academic. (It's newer code, though, so there's less
chance of eccentricity, FWIW.) Isolation does have benefits - with less chaos
to keep up with and a small bus factor, the code changes less and the
maintainers have more mental bandwidth to post on mailing lists :), so you
have more opportunity to get a good understanding of what's going on.

I don't have the same low-level experience you do, but I do share the same
interest in wanting to understand "what's really going on" - and I
incidentally want to make a Linux system-monitor tool that vacuums up as much
information as the kernel is willing to make available to it. `htop` and
friends surface a very caricatured picture from maybe 1-10% of the data the
kernel has to offer at any given moment.

------
ofrzeta
> If you wanted to play one of those games, you had to type in the whole
> program by hand. Inevitably, you would get something wrong, so you would
> have to debug your program. By the time you got it working, you knew enough
> about how the program functioned to start modifying it yourself.

As someone who was there and did that, I want to refute that assumption :) I
typed in many programs, and there's not a lot to learn, because most of them
consisted of many pages of DATA lines plus a small loop that loaded those
machine instructions into the home computer's memory and started the program
via a USR call (please note that I worked out this explanation many years
after the fact). I doubt there are many teenagers who are able to debug this
kind of program by looking at the actual opcodes.
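
A typical type-in "loader" of the kind described looked something like the
sketch below. This is illustrative Commodore BASIC, not a listing from any
actual magazine; the nine DATA bytes happen to be a tiny valid 6502 routine
(it blacks out the C64's border and background colors), and 49152 ($C000) is
a conventional free-RAM address on the C64:

```basic
10 REM READ MACHINE-CODE BYTES FROM DATA LINES AND POKE THEM INTO MEMORY
20 FOR I = 0 TO 8
30 READ B : POKE 49152 + I, B
40 NEXT I
50 SYS 49152 : REM JUMP TO THE CODE (SOME BASICS USED X = USR(0) INSTEAD)
60 DATA 169,0,141,32,208,141,33,208,96
```

Real listings were exactly this shape, only with hundreds of DATA lines -
which is why a single mistyped byte was so hard to find.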

Sure, you could learn a lot from typing in regular BASIC programs, but those
weren't the most interesting games, as far as I remember. The most productive
learning experience was interactive and exploratory programming, as shown in
the OP's article.

~~~
gaius
_As someone who was there and did that I want to refute that assumption :) I
have typed in many programs and there's not a lot to learn because most of
them consisted of many pages of DATA lines and a small loop that loaded those
machine instructions into the home computer's memory and started the program
by a USR directive_

On the Beeb, most listings were BASIC with all the logic there, and DATA
statements just for in-game assets such as sprites or room descriptions in a
text adventure. All the Usborne books are online now, check them out:
[https://www.raspberrypi-spy.co.uk/2016/02/usborne-releases-1980s-coding-books-as-free-pdfs/](https://www.raspberrypi-spy.co.uk/2016/02/usborne-releases-1980s-coding-books-as-free-pdfs/)
- many happy memories inside!

~~~
pdjstone
For more 80's microcomputer BASIC nostalgia, there's also INPUT magazine
([https://en.wikipedia.org/wiki/Input_(magazine)](https://en.wikipedia.org/wiki/Input_\(magazine\))).
I spent hours and hours as a kid poring over those magazines. Archive.org
has the entire series as PDFs -
[https://archive.org/details/inputmagazine](https://archive.org/details/inputmagazine).
They were really good at explaining core programming concepts and the
illustrations were also amazing.

------
tralarpa
> I think the people that first encountered computers when they were
> relatively simple and constrained have a huge advantage over the rest of us

I sometimes pity the young people who are studying computer science nowadays.
I studied CS in the 1990s. Compared to a modern curriculum, my courses look
very basic: five years on functional programming, compiler construction,
networking, databases, etc. No P2P, cloud computing, mobile applications, IoT,
and so on.

Now, most CS studies have to rush through the basics in three years, followed
by two years where the students have to learn all the tools and techniques
that they need for their professional career. I had an entire course
("Advanced topics in databases") on the efficient implementation of indexing
and query execution for databases. Today's students have to learn in the same
time: a shortened version of the old course PLUS nosql, column-oriented DBMS,
DHTs etc.

~~~
tomsmeding
> Today's students have to learn in the same time: a shortened version of the
> old course PLUS nosql, column-oriented DBMS, DHTs etc.

If your university is good. I did a computer science bachelor's over the past
three years in the Netherlands. In my case, I fear that I learned about as
much about databases as your basic course taught you, and maybe even less. In
particular, the implementation of databases wasn't even touched on. I learned
nothing whatsoever about NoSQL databases, column-oriented databases, DHTs and
other cool things, though the existence of the first two was hinted at.

Luckily I can usually figure out what I want to know without help of a
teacher, but there's a definite difference in level of difficulty and amount
of content between universities today.

~~~
pjmlp
Doesn't the Netherlands publish a national classification of university
quality, thus allowing a better choice of where to apply?

~~~
tomsmeding
They do, and this university came out pretty high up, though probably not at
the top. My reason for choosing it, though, was that they provided a maths-CS
double bachelor programme. In my experience, their maths programme is of very
high quality, so that makes up for it.

~~~
pjmlp
Sure, there are always multiple factors when choosing where to apply.

I was just curious, because not all countries do it.

------
hfdgiutdryg
_It was just last week that you saw the Commodore 64 ad on TV. Now that M*A*S*H
was over, you were in the market for something new to do on Monday nights.
This Commodore 64 thing looked even better than the Apple II that Rudy’s
family had in their basement._

Alternative version:

It's 1983 and your dad decides to buy a personal computer. The C64 is too
expensive at $595, so he buys a VIC 20 (introduced at $295, probably $200 by
'84).

You plug it into the television downstairs and, after mentally tuning out the
hum from the RF converter, you start to enjoy gems like GORF and Radar Rat
Race
([https://m.youtube.com/watch?v=1LRkON9XTOk](https://m.youtube.com/watch?v=1LRkON9XTOk)).

You try to read the included user manual, with its helpful computer chip
themed cartoon mascot explaining things like strings, but none of it makes
sense because you're eight. You and your sister spend hours reading aloud and
typing in bytecode listings from then-popular computer magazines. Imagine
spending an hour transcribing hex codes, only to type the run command and have
it crash.

If, on the off chance you found the errors and got the program to run, you'd
find it was wildly over-hyped in the description. You didn't want to waste
your work, so you'd save it to the tape drive that used audio cassette tapes.

A few days later, you'd try to load the program from the cassette tape and
find that it was corrupted. I never once got a saved program to load from that
thing; it only successfully read commercially published software.

The nostalgia is largely inaccurate. It was an era of immense frustration. And
I never saw any C64 television ad, probably because we only had three
television stations and no market to speak of for personal computers.

Incidentally, there's no way the kids in Stranger Things would have had those
high end walkie talkies. They'd have had the crappy ones that only work to
maybe 100 yards and emit static nonstop.

~~~
mixmastamyk
I had a VIC-20, was about 12 and took to it like a duck to water. The tape
drive worked flawlessly, though it did take ten minutes to find and load a
program.

One long-form magazine program I tried never worked right, even after finding
and fixing some bugs myself. But we neighborhood kids had hours upon hours of
laughs plugging our names and bad words into the sentence generator program's
DATA lines.

------
amorphous
The problem is not that it was easier in the 80s to start programming (that
would be the same as saying it was easier to study medicine in the 1800s) but
that today it is much harder to get to the point of doing something
_meaningful_.

If you want the simplicity of the Basic interpreter, just fire up a Python
console, and you are in a much more comfortable position to learn programming
and computer science than you were back then. But it is still a long way to
get to someplace useful. In the 80s, by owning a VC-20 and programming Basic
and Assembler, I was pretty much already at the edge of something new and
powerful.

~~~
autokad
> "but that today it is much harder to get to the point to do something
> meaningful."

In some ways that is true; in many ways it is false. As recently as ~2005, if
I didn't know Linux and wanted to learn it, I needed to purchase hardware, and
installing and messing with binaries and configurations ran the risk of making
a machine inoperable. Today it's very easy to spin something up in AWS or a
VM. Want to experiment with TensorFlow on a large GPU box? $4/hour on demand.

Also, we have lots of APIs and libraries we can just import, and I think
that's a great thing. Writing a website where a login was needed but wasn't
ultimately what you wanted to explore used to be hard; now it's `gem install
devise`. Making a basic website look pretty enough? Bootstrap. Want to compare
random forests, xgboost, and linear regression for predicting home prices?
`from sklearn import ...`

I know none of these things are doing 'new' things, but they enable a person
to not worry about the details and do something cool. I think the volume of
interesting stuff we see coming from blogs and articles on Hacker News is a
testament to that.

~~~
aidenn0
I think the point is that in 1983, if you wrote a program that read in some
input, did some calculations and printed output, you had created a program
that resembled much of the professional software out there.

If you had kids do that now, they would probably say "It's not a _real_
computer program, it's just text!"

------
jgrahamc
_I think the people that first encountered computers when they were relatively
simple and constrained have a huge advantage over the rest of us_

I don't and I grew up in the era of 8 bit machines and kilobytes of RAM etc. I
fully recognize that it was _fun_ to have those constraints and we learnt a
lot about dealing with them (using less memory, using fewer instructions) but
I don't buy that that really matters for most programmers today. They'll have
other things to worry about: e.g. debugging distributed programs.

Sure, if you want to do microcontroller work then that sort of thing is
useful, but literally nothing stops a "Full Stack" programmer picking up an
Arduino and programming it and learning something new.

I love playing with those environments
([http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-few-io.html](http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-few-io.html))
but every day I see people with different
experience of computing from me and I don't feel that I have an advantage over
them: they often know about things I'm totally ignorant of. It's true that I'm
very good at debugging horrible low level things, but it's also true that I'm
not good at imagining the state of a system with hundreds of micro-services.

~~~
ur-whale
> I'm not good at imagining the state of a system with hundreds of micro-services.

I feel you. Thinking across multiple levels of abstraction is one of the
hardest things in engineering, especially if, over the years, you've
overspecialized in one specific stratum.

------
DanielBMarkham
I've been doing a lot of tech blogging lately, playing around with various F#
tools and toy projects, seeing what resonates with the community.

There was a progressive complexity that happened back in the late 70s and
early 80s such that people alive today who still code and learned back then
have taken the ride from machine language to multi-gigabyte stacks.

We just kept adding stuff and having to make sure we could be functional in
all of it. Not an expert, but functional. It was standard practice on my
commercial programming teams to decide what everybody wanted to learn on a new
project before starting. (And these were high-paying projects. We always left
with happy customers).

People were jack-of-all-trades. Most everybody was. You had to be.

What am I seeing resonate, at least as far as I can tell? The inability to
understand what the hell is going on and work with it. If you've gotten a C++
compiler compiling a hellacious codebase in DOS, a Rails configuration ain't
nothing.

I see what are supposedly senior programmers walk a bit off the happy path on
a framework and they're lost. Not only are they lost, they are insecure,
afraid, embarrassed. There's nothing wrong with these people. There's
something wrong with the way we're training and staffing them.

Fifteen years ago I was still coding commercially, having a blast. Talking to
a recruiter one day about various projects, she said, "You know, you're one of
the last true general consultants."

There may be ten thousand of us. Beats me. But her general appraisal was
correct. There is a drastic and complete change between the way coders used to
relate to technology and the way they do today. It's not tech. It's mindset.

~~~
52-6F-62
Just to your point about the recruiter’s assessment— I’ve never met a modern
tech recruiter who can make that kind of remark.

They largely focus on buzzwords and where you went to school. It seems like
they often fail the companies they work for by making poor matches, and
wouldn’t recognize experience like you’ve noted as being worth anything. Then
again, maybe I’m just in a weird bubble here, and I’m generalizing - they’re
not all the way I’ve described, but many (possibly most) are.

~~~
0x445442
"Just to your point about the recruiter’s assessment— I’ve never met a modern
tech recruiter who can make that kind of remark."

How many years experience of "the XML" do you have?

~~~
JDWolf
"What versions of .Net were you using at each of these jobs?" "What percentage
of time were you using SQL databases?" "On a scale of 1-10, how well do you
know Java?" and my personal favorite...."Are you willing to do a contract role
for less than your salary? We have many candidates go long term."

------
tarcyanm
This is why the "STEPS Toward the Reinvention of Programming" (a.k.a. 20k
Lines of Code) Project was important. It was spearheaded by computing luminary
Alan Kay with help from some very impressive researchers.

The final report is available here:
[http://www.vpri.org/pdf/tr2012001_steps.pdf](http://www.vpri.org/pdf/tr2012001_steps.pdf)

I haven't kept up with subsequent research the Institute has produced. I do
know they had an aversion to producing software artifacts and were far more
concerned with the written reports (which in some senses restricted the
ability of amateurs like me to play with the interesting output by the group).
I did play with OMeta (a meta-parser) and the COLA / Id code - which was
enlightening!

------
masswerk
For all those willing to play around a bit, here's an easy entry point to
Commodore BASIC: a web-based PET 2001 emulator. You may write programs in your
favorite text editor and load them via drag-and-drop, and even export any
screen contents as text. (Also, all the special characters are accessible via
a virtual keyboard.) Manuals can be found in the programs/download section.

[https://www.masswerk.at/pet/](https://www.masswerk.at/pet/)

(Core emulation by Thomas Skibo, interface and IO enhancements by yours
truly.)

~~~
Jupe
Thanks for this... Took a minute or two to re-create my very first program:

10 PRINT " __ __ __* ";

20 GOTO 10

:)

------
wmnwmn
You are right that it was a special and exciting feeling learning these things
when these machines were brand new. When my friend's family bought a TRS-80 in
1978, it was like some kind of alien artifact that fell from space; we were
utterly fascinated by it even though its capabilities were almost absurdly
minimal. It had only 4K of RAM, and even that took several minutes to load
from the cassette tape storage.

Nevertheless it ran BASIC and my friend and I learned to program on that
machine. Subsequently we honed our skills on Apple IIs at school, and later by
hacking on the school's PDP-11. In 1981 I built a simple Z-80 computer,
roughly equivalent to the TRS-80 (note the "80"). At that time doing something
like this literally caused newspaper reporters to come interview you. It was
nice to learn these things sort of organically, albeit perhaps not optimally;
I've made my living in software development and to this day have never taken
any class in programming.

Of course that moment of novelty really was brief. Once the IBM PC came out in
1981, computers proliferated rapidly and they no longer seemed so special.
Nevertheless I do think that our "Generation X" had sort of lucky timing with
computer culture, since we were also just reaching working age when the
Internet revolution hit (my first job out of grad school came from an ad which
literally said "the Internet revolution is here and you can be part of it"...a
small part to be sure but still part!)

But anyway every time period has its pluses and minuses! If you want to know
how it really felt to grow up during that time, the truth is that we envied
the 60's generation hugely and thought that everything we had was just kind of
a pale imitation of what they did (for example, music). Take a look for
example at the book called "Generation X", which is pretty dystopian, and
really did express how many people felt. There's always something new
happening!

------
Ricardus
It was definitely fun to go to the computer lab in high school and play
with/program the Commodore PETs. These were the 4K models with the chiclet
keyboard.

I will say this though. I disagree with one word in the author's text. He said
the machines of the time were "constrained." They weren't constrained at all.
They had limitations just like today's machines do. But we weren't aware of
the limitations because we weren't time travelers coming from the future with
more computing power in our pockets than most corporations had in 1983.

In fact the exact opposite was true. The computer industry was bursting at the
seams. It was exploding. It was not constrained at all.

Then they cloned the IBM PC and the rest is history. Every machine you're
reading this post on descended from those.

------
bennyp101
I would have been 6 around 1990 when I first discovered BASIC, on our Amstrad
CPC464. Found the manual filled with code, and that cemented my love of
programming.

It's interesting that rather than coming with an instruction manual nowadays,
it's assumed that everybody knows how a PC (including phones) works,
especially as they are more complicated now - just hidden beneath a veneer of
GUI.

------
Koshkin
You can have the full experience today using the Colour Maximite board:
[https://www.youtube.com/watch?v=XQA8lowEKOo](https://www.youtube.com/watch?v=XQA8lowEKOo).
The MMBasic environment it uses is surprisingly powerful and convenient - it
even includes a full-screen text editor.

------
atemerev
Now, this sort of experience can be recreated with Arduino, micro:bit and
other educational microcontrollers and nanocomputers. Raspberry Pi doesn’t
count: too complicated.

~~~
pmyteh
You _can_ recreate it with Raspberry Pi, though it's no longer the best
platform for it. It can natively run RISC OS Open, which can be set up to drop
you straight into a BBC BASIC prompt as in the old days. And there was plenty
of software printed in paper listings form for 32-bit Acorn computers if you
want the traditional experience. The issues of Acorn User magazine from 1988
onwards will give you plenty to be getting on with...
[http://8bs.com/aumags.htm](http://8bs.com/aumags.htm)

~~~
UncleSlacky
The RISC OS people have one already set up to run BBC BASIC standalone:
[https://www.riscosopen.org/content/sales/risc-os-pico](https://www.riscosopen.org/content/sales/risc-os-pico)

------
stevage
I can't decide how much I buy this argument. I was born in 1980, and started
programming Logo and BASIC very young (around 5 and 7 years old respectively).
Almost everything this post describes is familiar to me: typing in code from
books, learning PEEK and POKE, etc. I just don't think much of that matters:
whether you learn BASIC on a stupid terminal that can't do much else, or in a
super simulated terminal in a web browser seems kind of irrelevant.

OTOH, the experience of learning C and having to actually write video and
network drivers (or their barest elements) because there wasn't a web you
could download a library from...yeah, that probably actually did make me a
better programmer. Having had the experience of writing a little bit of actual
assembler, even just the old "MOV AX 10; INT 13" (am I remembering that
right?) does give me a sense of connecting with the machine more deeply than
someone who grew up with the internet.

On the first hand again though...living through the late 90s and most of the
2000s was crap. The time of Java. And the worst kind of JavaScript. I pretty
much stopped coding altogether until the web had matured as a platform a bit,
by the early 2010s.

~~~
danmg
dude make a new account. this one is hellbanned.

------
philjohn
I've recently been introducing my son to programming, and it was so much more
difficult just to choose the best starting language.

When I started in the mid to late 80's as a child, we had a Commodore 128 and
it was Basic or nothing. I also remember typing in those program listings,
getting annoyed they didn't work, realising I'd transcribed something wrong,
then when it was working making changes.

We then moved onto a 386 and I gravitated towards QBasic (and later TurboBasic
because you could use the mouse and compile to an exe)

With my son, I ended up searching high and low, umming and ahhing, and in the
end I found the spiritual successor to those computer magazines with program
listings: a Python for Kids book that takes them step by step (a little like
the Commodore manual did), introducing concepts and eventually getting you to
build a game.

I think a lot of the wonder about computers that seems to have waned is
because we're all older and not looking at it through the same lens. True,
computers are far more ubiquitous now, but learning that you can tell one what
to do and are only limited by your imagination is still incredibly powerful to
a child.

------
tokyodude
Not BASIC, but fantasy consoles like Pico-8 try to bring back that feeling.
Pico-8 has a 32K limit, and all graphics, sound, and code are memory-mapped,
so you can PEEK and POKE if you want to hack old-school. It boots directly
into a Lua interpreter.

[https://www.lexaloffle.com/pico-8.php](https://www.lexaloffle.com/pico-8.php)

------
esfandia
The nature of programming has changed so much since then. Back then, you could
get going by knowing a very limited set of building blocks (BASIC keywords,
all described in a single manual). The challenge was to build something
meaningful out of those building blocks. Programming could be hard, but never
too complex. Problem solving at its purest. That's what got me hooked!

Today, the set of building blocks is unlimited. The challenge is to find the
right building block for what you want to do, using the infinity of resources
available to you (all the APIs, libraries, Google, GitHub, StackOverflow, blog
posts). You're drowning in complexity, even if what you want to program is
straightforward. And you are constantly mindful of writing idiomatic code,
always wondering if there's a better way to do the same thing - which is
normally a good thing, but it can be time-consuming and get in the way of the
fun part, which is the problem solving.

~~~
eesmith
"Back then" means programming as a kid, on a hobbyist's microcomputer, yes?

I'm a bit confused because you compare that experience to modern programming
as a software developer. I think you are thinking of the difference across 40+
years, rather than the difference of kid vs. adult.

I mean, if you were programming on a Unix machine in 1983, or a VMS machine,
or an IBM mainframe, you would have far more than a single manual.

~~~
esfandia
You're right, I guess I'm comparing against both those axes at the same time,
since I'm relating to my own experience as a kid 30 years ago vs as someone
working in tech today. Indeed, that's not really a fair comparison.

The point I was trying to make was about the distinction between programming
with a limited set of building blocks vs programming with a large and open-
ended selection of libraries. They don't require the same skill set, and one
seems more fun to me than the other.

~~~
eesmith
It is easy to get started with Lego, and fun to use. Those who are really good
can build some impressive objects. The limitations of the parts make it easy
to find challenges to overcome.

Mechanical engineering is a lot harder, there are so many options to consider,
new options come along all the time so you're always falling behind, and cost
plays a much larger and - I think - more boring role in the process.

I think that is analogous to the point you are making with computers.

~~~
esfandia
Exactly.

------
eggy
I started on a Commodore PET 2001 in 1978, and wrote a horse racing game for
my parents: three horses, ASCII characters randomly moving across the screen
0, 1, or 2 spaces at a time, odds at the start and payoffs calculated at the
finish. My parents loved it! (They were OTB and racetrack fans.) I then went
on to a Vic-20 (1981),
Amiga 1000 (1985) and eventually a 386 PC, a PowerMac PC (I got minix or other
variant running on it), and then Linux, Mac, and PCs thereafter. The Basic
Stamp in 1997 really brought back old coding to me with the addition of
hardware tinkering. My fondest memories are the horse racing game, and
plotting the 4 main moons of Jupiter on my Vic-20's thermal printer. Figuring
out how to program those two programs started me on coding. Funny, I have
always coded, but only as an employed coder for two years at one job. I
program for myself, and when I can for work when needed (mainly technical
computing).

------
sjclemmy
I recently found a copy of the Commodore 64 programming manual at a garage
sale. Of course, I had to buy it. Compared to the *nix paradigm with heaps and
stacks, it's very, er, basic. The manual explains that you are just loading
registers with instructions when you program. No Heap, No Stack, No memory
management.

~~~
s800
You're loading memory with instructions and data, registers with operands, and
yes, there's definitely a stack that's used heavily.

------
temporallobe
BASIC was my first love. I was 10 when I got my first computer, a Tandy
TRS-80. It came with an instruction manual full of example BASIC programs,
and I could save my programs with an external tape deck. Something awoke in
me, and from then on, I knew my calling. I still remember the first command
instruction: SOUND 39,20

------
emersonrsantos
There’s this naive notion that programming on these 8-bit microcomputers was
based on BASIC. The speed of these Z80 chips was 1-3 MHz.

Almost all games and programs in the 8-bit world were written in assembly.
The BASIC interpreter was just used as the bootloader to load and run these
programs; BASIC was in fact the shell interface and the toy language, but the
assembly programs essentially sold the computers.

If you wanted to do anything serious on these computers, it had to involve
assembly because of BASIC's slowness. And there were newbie books about
assembly language, assembly libraries, and tons of assemblers and debugging
utilities available everywhere.
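
The usual bridge between the two worlds was POKEing machine code from DATA
statements into free RAM and jumping to it with SYS. A minimal, hypothetical
C64-style example (the six bytes are LDA #$00 / STA $D020 / RTS, which sets
the border color to black):

    
    
      10 FOR I=0 TO 5 : READ B : POKE 49152+I,B : NEXT
      20 SYS 49152
      30 DATA 169,0,141,32,208,96
    

Address 49152 ($C000) is a block of RAM the C64's BASIC leaves untouched,
which is why type-in listings used it so often.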

~~~
dragonwriter
> If you wanted to do anything serious on these computers, it had to involve
> assembly because of BASIC's slowness.

No, you could do all kinds of useful, productive things in BASIC, which were
still orders of magnitude faster (and also still more accurate) than manual
processes you were replacing.

------
freediver
Thinking about how to introduce my 6yr old to the world of programming I
decided she is going to write her first code like I did - using BASIC on C64.

Luckily there are many JS C64 emulators online, like this one:
[https://virtualconsoles.com/online-
emulators/c64/](https://virtualconsoles.com/online-emulators/c64/), so it's a
breeze to get started.

And of course her first program was

    
    
      10 PRINT "HELLO PETRA"
      20 GOTO 10
    

exactly 33 years after her dad made his first.

She also tried Scratch and liked it, but to me, in order to write code you
need to learn to "write" it.

------
mmjaa
I've been quite into retro-computing for a few years, and have recently
started to amass a collection of my favourite machines from the 8-bit era ..
and one thing I'm really intrigued by is the 'lost tech' of these machines.

I remember for one of these platforms, there wasn't really a great commercial
release scene - but there was a great home-hacker scene, with type-ins from
magazines and so on ... and I remember having quite a small library of
routines and utilities, saved on cassette tape, that could do various things -
fast scrolling, tape copying, UNDELETE commands, and so on. But now these
things are lost to time (still out there on my cassette collection, wherever
it is these days) .. and we have to re-create them.

So now one of my favourite aspects of the hobby is the reconstruction of all
the 'cool utilities and stuff' that made the platform great in the 80's. It's
not so easy! For some of the obscure platforms, we really have to dig deep ..
fortunately, though, 8-bit computer magazines seem to have been pretty well
preserved on the Internet. It's now just a matter of going through them,
spotting the gems for the obscure platform, and re-typing it all in, lol. ;)

But that said, having a variety of 8-bit machines at hand is really a special
treat. I always wanted a Spectrum machine, and now it's finally affordable. ;P

~~~
moochamerth
If you are looking for 80's listings, please check out
[http://www.hebdogiciel.free.fr/](http://www.hebdogiciel.free.fr/)

The site contains all the listings from the Hebdogiciel magazine, re-typed and
ready for download. The magazine was a must for all French-speaking computer
kids back then.

~~~
mmjaa
Thanks for that - as an Oric-1/Atmos fan, I'm already aware of it, though. ;)

No doubt other readers of this thread will find it useful.

------
seriocomic
Wow, what a throwback - this was me, but I had already cut my "basic" teeth on
the Commodore VIC-20, quickly upgrading to the C-64 as soon as I could get
someone to drive me to the nearest city to buy one with my meagre earnings.
I'm still fumbling my way through "programming" today...

------
jmull
Actually, the memory-mapped approach of the C64 is an abstracted API. It's so
immediate and intuitive, though, that I can see why you might not think of it
that way.

(I started on the PET and later the C64, and had so internalized this
approach that I was baffled when I first encountered systems where this
wasn't the primary approach.)

I think there’s actually a lesson here for API design today: the power of the
“everything’s-a” approach. Then it was everything’s a memory location. But
generally the everything’s-a approach allows for a flat abstraction space with
low cognitive overhead and high inherent/automatic composability, leading to
short learning curves and high productivity.
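
The "everything's a memory location" idea shows up in a couple of classic C64
one-liners. The addresses below are the documented VIC-II color registers and
the start of screen and color RAM; the same two keywords, PEEK and POKE,
cover graphics, sound, and I/O alike:

    
    
      10 POKE 53280,0 : REM VIC-II BORDER COLOR REGISTER (BLACK)
      20 POKE 53281,0 : REM VIC-II BACKGROUND COLOR REGISTER
      30 POKE 1024,1  : REM SCREEN CODE 1 ("A") INTO TOP-LEFT OF SCREEN RAM
      40 POKE 55296,1 : REM MATCHING COLOR RAM CELL (WHITE), SO THE "A" SHOWS
    

No setup calls, no handles, no drivers: the hardware and the display share
one flat address space.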

------
drawkbox
In 6th grade there were Apple IIs at school; I was part of a group of honors
kids that got to code BASIC on them. I made a Tron disc game. It wasn't good,
but the ability to create a game, render block graphics using text, and code
BASIC was magic.

My friend also had a Commodore 64, where even to start a game you had to know
some commands (LOAD "$",8, then LIST, then LOAD your game). Computing and
coding/interfacing with the computer was more involved but simple as well,
and led to lots of fun learning to code and create.

Between these two machines (Apple II and Commodore 64) I fell in love with
coding and games. I never had an Amiga, which was a bummer, but these were
enough to inspire kids to be creators with code.

In high school my teacher Mr. Isles was big into the internet and multimedia
computers: we were watching TV on a computer, playing games like Scorched
Earth, and making games in Pascal while browsing the web. It was amazing and
moving fast.

Flash had the same fun factor in the late 90s up to around 2006-ish, before
the mobile phone came out. Flash communities were very special for both
designers and developers; it was powerful that either could create games,
interactives, experiences, and Amiga-like demo scenes.

When mobile truly arrived in 2007, and the iPhone and smartphones upped the
game, I was blown away that OpenGL was on it, and I knew immediately that it
was a new handheld gaming market I had to get into. Mobile existed at that
point, and I was making games on Windows Mobile, but everything changed with
iPhone/Android in terms of the ability to create. That fun creation market is
still going on today; now we are on to fun interactive tools like augmented
reality and location-based games.

There are really inspiring innovations happening all the time in each
generation, but it does seem like the fun platforms, and the really
interesting ones, are led by gaming (or apps today) and by areas approachable
by designers and developers alike, where they can create interesting games,
apps, and interactives and the platform makes it fun.

I think it is really important for platform designers/developers to make
their platforms approachable and simple, and to reduce complexity, so that
they can attract people interested in creating. I think an engineer's job is
to create simplicity from complexity: to design platforms smartly, with
senior skill, for the junior. In some areas today we are failing at that due
to heavy specialization. Every truly successful platform that really hit and
moved innovation/creation forward did exactly that.

------
Annatar
" _I think the people that first encountered computers when they were
relatively simple and constrained have a huge advantage over the rest of us.

Today, (almost) everyone knows how to use a computer, but very few people,
even in the computing industry, grasp all of what is going on inside of any
single machine._"

An astute observation: I see a lot of complexity in IT today which obviously
comes from not having a clue how the hardware functions or how it is
efficiently programmed. The complexity, performance, and resource hits grow
with every layer of abstraction. Convenient for those who write software;
very bad for users, who then needlessly suffer.

------
okket
This is literally how I got hooked on computers in the 80s. That and a
soldering iron.

~~~
3chelon
+1 for the soldering iron. There was nothing quite so stressful as the first
time you plugged in the prototype relay controller you'd built on Veroboard
out of salvaged parts and a couple of 74-series logic chips. The whole thing
was connected _directly to the CPU_ via the edge connector, and if you didn't
have flyback diodes and enough decoupling then every time the relay switched,
the computer would crash with a psychedelic random bitmap on the screen and
you'd wonder if it would ever work again... Ah, the nostalgia!

------
tabtab
A magic feature of BASIC was that you didn't need to learn different editors
to get started. It was based on line numbers, so the line number was how you
added, changed, and deleted lines.
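
For anyone who never used a line-numbered BASIC, the whole edit cycle looked
roughly like this (a hypothetical session; the arrows are annotations, not
input):

    
    
      10 PRINT "HELLO"
      20 PRINT "WORLD"
      20 PRINT "EVERYONE"    <- retyping a line number replaces that line
      10                     <- typing a bare line number deletes that line
      LIST                   <- shows the program: 20 PRINT "EVERYONE"
    

The interpreter itself was the editor, and it worked the same way on nearly
every machine of the era.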

I could walk into a computer store at the mall in the early/mid 1980s, spot a
model of microcomputer I had never encountered before, and type:

    
    
       10 PRINT "Oh no! Something is going to blow! Run!"
       20 GOTO 10
       RUN
    

Then scurry off while snickering.

------
sytelus
Can we replicate this experience on modern PCs using VMs?

------
jonplackett
This brings back so many memories from being about 12 and figuring out BASIC
on my dad’s PC.

Why did programming get so complicated?

------
macca321
I own 10printhelloworld.com, but unfortunately 20goto10.com is only available
for a whole heap o' money. :(

------
tomrod
This is brilliantly written. Thanks and kudos to the author!

------
tsumnia
This is my current research [1]! In 2015, I was at PyCon's Educational Summit
when I thought about integrating some of what we do in martial arts to CS -
drilling moves to use in sparring/randori. Sparring/randori is a high
intensity activity that requires fast problem solving skills which rely
heavily on muscle memory due to the speed involved. Additionally, forcing a
beginner to spar is one of the fastest ways to make them quit [2]. I think
this is one of the reasons why CS has a high dropout rate - we are asking them
to "spar" (problem solve) too early or incorrectly and as a result they quit
because they hate feeling like failures. Instead there should be some level of
drilling before getting "thrown to the wolves" (as I used to tell my students)
to build their confidence and understanding. I don't think traditional
small/large scale programming exercises fully tackle this problem.

I think drilling is something we do in almost all technical skill development
(music, art, athletics, vocational) and I wanted to bring the same thing to my
CS courses - so I started requiring typing exercises as one of their
assignments for the week. These aren't just "typing a for loop 10 times", but
additional context (for example, the link below shows regular expressions for
addresses) to give them something they could use as a template for their
programming exercises. To combat copy and pasting, I just made the code an
image. In my first link, you'll see an example of using a regular expression
to validate addresses. After completing this, the students would then be
required to complete some Q&A exercises as well as traditional programming
exercises where they needed to design functions that: validate phone numbers,
(a limited scope of) email addresses, and Social Security Numbers. The
objective of that week was to get them familiar with regular expressions, not
finding a StackOverflow link that teaches them how to implement regular
expressions.

As the article says, this is what we did in the 80's. That doesn't make it
better, it just makes it how things were done "back in the day..." However, K.
Anders Ericsson states that early specialization is often a key determinant to
future mastery and that deliberate practice refines areas where an individual
struggles and may be unenjoyable [3] (see my older comments on
grit/perseverance). Likewise, syntax errors are one of the first problems
novices face [4]. By completing typing exercises, the learner does not need
to worry about using problem-solving skills, which they may still be
struggling with, just the correctness of the typed characters. Thus, typing
exercises give the learner deliberate practice resolving a simple but major
issue. Additionally, typing exercises remove students' ability to just "copy
and paste" example code. With syntax errors mostly resolved, the student can
then focus on problem-solving rather than on where the semicolon should go.

I currently have a SIGCSE paper under review, but the gist of the paper is
that students that voluntarily completed typing exercises performed better in
their class than students that did not. The students may have just been more
motivated and therefore that is why they scored higher, so there is a
limitation to my study. I could require it, but then designing a control group
that would receive the same amount of learning would be difficult as well.

[1]
[https://research.csc.ncsu.edu/arglab/projects/exercises.html](https://research.csc.ncsu.edu/arglab/projects/exercises.html)
(the Heroku link is currently down as I've recently made changes to the live
version)

[2]
[https://www.youtube.com/watch?v=hHebXvoHue0](https://www.youtube.com/watch?v=hHebXvoHue0)
(Rener Gracie is a character, but listen to those first few minutes)

[3]
[https://en.wikipedia.org/wiki/Practice_(learning_method)#Del...](https://en.wikipedia.org/wiki/Practice_\(learning_method\)#Deliberate_practice)

[4]
[https://dl.acm.org/citation.cfm?id=2677258](https://dl.acm.org/citation.cfm?id=2677258)

------
gaius
_I think the people that first encountered computers when they were relatively
simple and constrained have a huge advantage over the rest of us_

It is not a blessing but a curse. If you understand the computation, you will
spend so much time wondering “WTF is this (modern) computer doing?” when it
fails to do something simple that you know an older machine with a tiny
fraction of the power could do easily.

~~~
HeadsUpHigh
Like the crazy latency from pressing a key to letters appearing on the
screen. I see it's much faster on HN than on other websites, but not
hardware-interrupt fast. Why in the world does it have to be USB?

~~~
pjc50
USB isn't really the problem, it's all the code that feels it has to get
involved in handling the event. Lots of websites (e.g. Facebook) invoke all
sorts of background network activity on keypress.

~~~
HeadsUpHigh
You might be surprised, but I remember an article comparing various computers
and input methods, and it turns out that USB has significantly more latency
than a hardware interrupt (PS/2). Software plays a role, but even e.g. vim
vs. an old-school computer actually has a noticeable difference. I haven't
experienced this myself, but I do know of writers who use computers from the
80s for this exact reason (George R. R. Martin being one example). I have
also heard from other people who have tried this that the QoL improvement is
pretty big.

~~~
TeMPOraL
I think we need to take it one step at a time. First, let's kill the modern
web, then continue to get rid of other layers of bloat on modern computers,
and when all the low-hanging fruits have been plucked, we can do something
about hardware protocol latency...

~~~
HeadsUpHigh
Agreed. But how do you kill the modern web? And how do you go about killing
all the bloat underneath? Do you expect Microsoft to fix Windows? Or Red Hat
to abandon systemd? Or Wayland? Or the countless other bloated pieces of the
puzzle? Honestly, at this point, unless we go back and remake everything from
the start, the interests are probably too big to be dealt with.

~~~
gaius
_But how do you kill the modern web?_

There’s no actual need for a page to be 10 MB to deliver 5 KB of text and a
50 KB JPEG. Especially not on mobile.

~~~
TeMPOraL
Yeah, but the GP's question - the one that's really hard - is _how_ , not
_why_.

