Why 80s BASIC still matters (usejournal.com)
110 points by ingve 28 days ago | hide | past | web | favorite | 131 comments

This article inadvertently underscores the huge challenge in getting kids interested in programming today.

In 2019, it's very hard to do anything "interesting" with BASIC when kids are surrounded by apps and games with hundreds or thousands of person-hours of development. "Hunt the Wumpus" just doesn't cut it anymore.

As others have written, Python is much more interesting because of the extensive library ecosystem. Integrating the physical world, graphics, controllers, sound, audio, network, video, etc. offers a lot more engagement than "Hello World".

Javascript is also interesting because it runs everywhere, and HTML/JS skills will offer good summer job money. :)

Even in 1985, the things that a child could produce in BASIC weren't "interesting". Children in that era could make a few dots appear on the screen, or make a prompt that asked questions and responded, or maybe draw some interesting procedural graphics. It's not much when you compare it to games like The Bard's Tale, Ultima IV, or King's Quest II. But it was enough, because programming was by itself an interesting puzzle to solve.

It's not the result that's interesting, it's the process of programming.

> Even in 1985, the things that a child could produce in BASIC weren't "interesting". Children in that era could make a few dots appear on the screen, or make a prompt that asked questions and responded, or maybe draw some interesting procedural graphics.

Prior to 1985: (As a child.)

I wrote "snake" and independently came up with the circular queue data structure. I wrote a "Berzerk" clone in low-res graphics, where everything was just a dot. (I had no idea how I could handle the AI at the time, so baddies just moved randomly.) I wrote a "skiing" game that exploited text scrolling for the vertical scrolling. I wrote a "Star Wars" game where you tried to "fly" a ship with a joystick, first person, to line up TIE fighters in the crosshairs. It was a clone of one of the demo games which came with the Apple II, but mine had a "horizon." I wrote the start of an Asteroids-style game, where I got a swarm of 50 ships to chase each other around in a massive "dogfight." All of the above was done in BASIC, except for the last, which was done in a BASIC-derived macro language called Macrosoft.
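For anyone curious, the circular-queue trick behind a snake body can be sketched in a few lines of modern Python (the class and names here are my own illustration, not the original BASIC): a fixed array plus a head index that wraps with MOD, so each frame you overwrite the oldest slot instead of shifting every segment.

```python
class RingQueue:
    """Fixed-capacity circular queue: a BASIC DIM array plus a head
    index and a size, wrapping with modulo instead of shifting."""
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.head = -1   # index of the newest element
        self.size = 0

    def push(self, item):
        # advance head with wrap-around; the old tail simply falls off
        self.head = (self.head + 1) % len(self.buf)
        self.buf[self.head] = item
        self.size = min(self.size + 1, len(self.buf))

    def tail(self):
        # oldest surviving element: the snake segment to erase
        return self.buf[(self.head - self.size + 1) % len(self.buf)]

# a 3-segment snake crawling right: each frame pushes the new head,
# and tail() tells you which screen cell to blank out
snake = RingQueue(3)
for pos in [(0, 0), (1, 0), (2, 0)]:
    snake.push(pos)
```

Push one more head and the tail advances automatically, which is exactly why the structure fits a snake so well.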

More recently, a few years back I taught a class for high school aged women, where we used Unity 4 to create a 2D shoot-em-up game in just one afternoon. 18 of the 19 students rated the class highly. Granted, we had to "pre-fab" a lot of things for the students to keep things moving. The actual point of the course was to give the students a taste of what they could do.

It's not the result that's interesting, it's the process of programming.

Some years back, before Arduino was even a thing, a friend of mine (who looks a bit like a cross between the stereotypical 50's geek and Kyle MacLachlan) and I spent an entire evening bringing an embedded computer to life. We got an LED to blink, then awkwardly high-fived. His wife just shook her head.

Funny enough, Ultima I was originally written mostly in Basic (with graphics routines written in assembly).

The stuff I was able to do in Basic in 1981 as a ten year old kept me interested. Then like now I enjoy programming, but mostly as a means to an end. If anything, the fact that what I could program wasn’t as “interesting” as a commercial game (say) was motivation to keep learning.

> the things that a child could produce in BASIC weren't "interesting"

That's just not true. For I/O you had keyboard input, screen output, and maybe file I/O - three basic things that pretty much all programs used. Most commercial programs were 80x25 text mode with 16 color attributes. If you could learn to read scancodes from the keyboard, write text to the screen, and save/load files (along with a few fundamental ideas like loops and conditionals), you could do almost anything that a commercial program did, with just that basic knowledge of a few simple things.

It was simple because it was direct, there weren't 72 layers of abstraction and UI libraries that you had to know before you could even begin. You could also do pixel graphics, and again it was pretty simple - just set the screen mode and go with DRAW, LINE, CIRCLE, POKE etc. Again, no layers of indirection or libraries or window managers or canvases or DrawableObjectFactoryManagerFactoryInterfaces needed.

One of the problems of modern attempts to recreate that feel is just that there is so much between the programmer and the hardware and it's so much more complex. Back then, we had single-user, single-process systems with direct access to simple hardware. Everything you needed to do was either a built-in part of the language or a BIOS/OS interrupt that you could call just as easily. It's hard to recreate that on a multi-user multitasking system with indirect access to more complicated hardware and countless layers of software between the programmer and the machine and BIOS/OS interrupts. Sure you can do more powerful stuff now, but it's not as interesting.

Direct access to hardware is not the difference. It is easier to get things done today. You can argue that it was simpler in 1985, but I don't think that simplicity is the right way to measure things. You can present an easy-to-use programming environment to people, accessible to newcomers, usable by experts, and it doesn't have to be simple but it has to be possible to use it in simple ways.

Just personal observations here. Someone with a computer in 2019 can download Unity, which is free, and watch a tutorial on YouTube, which is also free. With near zero understanding of what is going on, and no prior experience, they can have some kind of rudimentary platformer working within hours. This is then a good starting point to learn programming (you can dive into C#) or you can continue to jumble together copy/pasted bits of code that you see online (kind of like how I remember doing with BASIC and library books).

Sure, there are a bunch of layers of abstraction, and those abstractions will break down all the time. You don't need to learn those abstraction layers, you can stay at the top layer and still get good work done, maybe working around a few problems that you don't understand from time to time by futzing around with stuff until it works. Or you can dive in and try to understand what's underneath an abstraction. That's the whole point of having abstractions in the first place. They exist to hide the complexity, to let you get work done without understanding the entire system. 72 layers of abstraction is a bit of an exaggeration. I'd say if BASIC has three layers (interpreter/machine code/hardware), Unity only has six (editor/scripts/engine/os/machine code/hardware), but who knows how you count them.

In my experience, watching people learn how to create things, it has never been easier to learn programming and start building things. The main difference is that in 1985, programming was considered an essential computer skill, and in 2019, you're expected not to program. BASIC was amazing because it was what you saw on the screen when you turned your computer on, nothing more.

I wrote a fun game on the C64 as a teen - the starship Enterprise drawn as an "O" with lines behind it. You could steer it with arrow keys to aim at Klingon warbirds, and fire phasers with the space bar.

I suppose this exposes some assumptions I was making. When I think of children learning programming with BASIC, Scratch, or Logo, I'm usually thinking of pre-teens or young teenagers, say 10-14 years old. Or younger, say 8, for some kids.

When my daughter was three, I built a Windows 95 computer for her and installed a load of preschool games on it. She could power the computer up, find her game, play it, close it and shut the computer down at that age. She learned to read from those games.

I learned BASIC in 5th grade and instantly fell in love with how quickly I could read and understand the code and modify it to do what I wanted.

Sure, when I wanted to make games I moved to C++ and Perl for web apps, but without programming experience those languages looked like a mass of symbols while BASIC looked like sentences.

For an entry level language it was perfect for me.

In 1985, putting your own pixels on the CRT was a pretty big deal.

We used to write goofy games, share them, hack on all of it, and had lots of fun. Imagination.

Now, the blue dot is the baddie, and you....

#IDARB stands for "I Drew a Red Box."


See if you can find the GDC presentation they gave. It was the most entertaining presentation of that year I can remember.

I watched a couple of things. Yes! That is exactly that old 80's feel in a current game. Really cool story. I have to go play this damn thing.

Is this the presentation you are referring to? https://youtu.be/0MJ_oxzJ2zc

> Even in 1985, the things that a child could produce in BASIC weren't "interesting".

Back then, when I was a child, my parents purchased books containing BASIC programs that one could manually transcribe and save onto tape or disk, and they were actually interesting. IIRC, those books also contained an overview of how each program worked.

My son is 6.

He likes electronics because they have games. Usually the older games are better for him anyway because they are simpler. Right now his favorites are Worms Armageddon (iPad/PC/N64) and San Francisco Rush 2049 (N64).

Since he doesn't have free access to any of these devices, he is perfectly happy to use whatever we allow at the time ("Daddy, can we play games on the TV from when you were a little boy", referring to a 12" junky CRT for the SNES). I have no doubt he would be thrilled to have some kind of retro computer to use QBasic on, just like I did when I was 8.

You wildly underestimate the power of a child's imagination. The problem is free access to overstimulating contemporary technology, not some intrinsic boringness of old software.

>San Francisco Rush 2049

You and he should make "San Francisco Rush 2019" where you jump around from startups to FAANG all the while trying to make enough money to keep up with housing price increases.

> The problem is free access to overstimulating contemporary technology, not some intrinsic boringness of old software.

I feel like that's exactly what the parent comment said.

My kids have always loved retro games too, as well as current games, and they are also interested in programming. But for a decade they've been unable to motivate themselves to actually do the programming. My oldest has asked me to teach him multiple times, and when I help him, it doesn't stick. He doesn't have the patience for code. He chooses to spend his free time playing very high quality games and browsing addictive content on the internet, from an insanely vast sea of choice that we never had when we were programming in BASIC.

> You wildly underestimate the power of a child's imagination.

In my experience, that imagination tends not to come out until kids are bored, and produced content keeps them from being bored.

Wait until your kid is actually programming to pass judgement. My son is finally starting to find some patience and just turned 15. It was easier for me when I was 12 because there was nothing else to do.

I'm also getting the feeling that the trend is global, because I'm involved in hiring and the younger kids out of school all started programming in college, not in junior high like my cohort did. My group asks specifically at what age candidates started programming. The theory was that early is good because it shows self-motivation, but we're seeing a general trend of kids starting later.

EDIT: BTW, the main motivation for my son's newfound patience for programming, unix, and shells is to escape parental controls and sandboxing.

> Worms Armageddon

If he likes Worms Armageddon, he might like the Gorillas game: https://www.youtube.com/watch?v=UDc3ZEKl-Wc

I had a lot of fun hacking that game back in the day.

So did I! Also SNAKES.BAS or whatever it was.

As with everything regarding "ranking" programming languages, subjectivity abounds. Target tasks and audiences are the real value assessor.

For example, my 5 year old, who is just learning how to read, can understand the beginnings of BASIC. She's super excited to see the TI-99/4A print back what she painstakingly types, or to see the computer count from 1 to 5 with a basic FOR loop. She wouldn't know what the F is going on with Scratch, let alone Python.

BASIC is exactly what its name implies. I'm going to move her to Scratch and Python eventually, but I have yet to see either meet the needs of the absolute entry level, very young programmer.

BASIC made it easy to learn incrementally, one line at a time. Syntax and block issues in Python require too much up-front training, in my opinion. And, the library issue is not mutually exclusive. BASIC could be given libraries/functions that do dramatic/interesting things: "10 boom(100, 200, 50, 3)"

This makes an explosion at coordinate x=100, y=200 that is 50 pixels wide and lasts 3 seconds, including sound. Put it in a loop that slowly increases the size or decreases the duration. What boy doesn't want to have control over screen explosions? (I may be gender-stereotyping, but you get the idea.)
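The `boom` call above is hypothetical, of course, but the idea is easy to sketch. Here's a rough Python take on what such a library function might compute per frame (the function name, signature, and frame rate are all made up for illustration):

```python
def boom_frames(x, y, width, seconds, fps=10):
    """Hypothetical sketch of what '10 boom(100, 200, 50, 3)' might do:
    produce one (x, y, radius) tuple per frame, with the radius growing
    linearly to width/2 over the requested duration."""
    total = int(seconds * fps)
    max_radius = width / 2
    return [(x, y, max_radius * (f + 1) / total) for f in range(total)]

frames = boom_frames(100, 200, 50, 3)
# 30 frames of an explosion that ends at a 25-pixel radius
```

A real kid-facing library would draw and play sound instead of returning tuples, but the one-call-does-something-dramatic shape is the point.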

I wish Hypercard had managed to survive as a default install on Macs. It's the perfect first development environment for kids IMHO. You can learn the basics by reading the source in any stack (including the home stack) and there's enough depth that you won't get immediately bored with it. Plus your applications (stacks) are graphical right from the start so it doesn't feel like you're back in the 80s and being amazed when it scrolls out "Hello World" indefinitely.

Pyret, along with the Bootstrap course, is being used successfully for getting kids into programming:


Scratch before it made programming like Legos.


The situation isn't nearly as bleak as you indicate. I've also found that I can counter that expectation by showing the labor and code involved in big projects. Then I show them a small project that's still fun, made by one person or a small team, along with its code. Then I tell them they can start there, make fun/useful things, and work their way up to bigger projects like their apps or AAA games. The only thing I run into at that point is laziness about putting in effort, or just disinterest in favor of other pursuits.

I do need to update my set of exemplar apps some time in the future, with examples that appeal to a wide audience. Probably several, or a pile of them with summaries, so people can just skim through to whatever interests them.

> Integrating the physical world, graphics, controllers, sound, audio, network, video, etc. offers a lot more engagement than "Hello World".

While I can agree that there are more "interesting" things to occupy time, I disagree that simple things will always be beaten by a team of engineers. To use an example outside of programming and video games, there is a multitude of tricks you can learn with a deck of cards. You don't need elaborate magic rigs to be entertained.

Secondly, I'd recommend looking at what BASIC was capable of doing. I'm going to be showcasing the game "Missile!" (pg 35 of [1]) on an actual Apple IIe next month. It's going to take only 43 lines of code to have a GUI driven game. It's obviously limited, but designed to get kids interested in Computer Science. To produce the equivalent in a text-based programming language would require too much overhead - even in Python's case.

[1] https://drive.google.com/file/d/0Bxv0SsvibDMTVUExUjFhTURCSU0...

I like the Arduino for this reason. It was a lot more interesting than printing hello world to a screen. I would find random little things I could do with the Arduino that I couldn't easily do with a pre-existing app on the phone or computer.

Also, before I started studying computer science, I liked the Unity Game engine as it was quick to implement something interesting and it was easy to find lessons and instructions online to develop in C# and the Unity engine.

I think Arduino, micro:bit et al. have taken the place of the home computer these days.

They are simple to use, connect with the real world so you actually get feedback, and can be used in a variety of cool projects.

I long hesitated over which language to teach my 8yo kids, between Python and BASIC.

I've gone with Python, but I'm not sure I made the right choice (too rich, too much information to swallow before starting).

Despite that, my son is working on a car race game, all ASCII, using curses. He made his own track with pipes and slashes and is pretty proud of it.

To be fair, their most advanced device for playing is a Nintendo DS (not a 3DS or 2DS, the original one).

But I still think it would have been easier in BASIC.

In the Summer of 1985 (when I should have been outside chasing girls) I wrote a COBOL interpreter in C64 BASIC as the college I was at was still using punched cards and sending them offsite to be run overnight.

That was interesting.

My kid started on BASIC a year or two ago, and is now requesting that we have sessions to learn Unity and C++.

Never forget that operational causality over computers is gained one step at a time.

This appears cool: designing a graphics programming interface on a Raspberry Pi Sense HAT.


80s BASIC (GW-BASIC, to be specific) was, right up to his retirement two years ago, my father's go-to language for anything work related. It was "good enough" for the job, cheaper and faster than trying to get their software department to understand the problem needing solving, and it was what he knew -- including its limitations and when not to use it.

Old, "bad" languages can still be extremely powerful and productive for what used to be called "power users"; don't dismiss them out of hand because the professional software development world has moved on.

What was transformative for me in my youth were the "listings" he mentions. I remember spending hours typing in pages of code, having it fail, and having to debug it. Even though I didn't think so when I was 12, I think these days that those failures were the best thing that ever happened to me. I actually think they were far more valuable than the AP CS course I took as a senior in HS.

That experience taught me skills that I still use today. Mainly, trying to understand an unfamiliar (often poorly commented) codebase, and having to debug it.

These days, I think that the most equivalent experience might be working on something open source. But I'm afraid the motivation just isn't there for most kids. My motivation was "Wow! Free GAMES!", but my kid gets free games on his tablet without having to debug a darned thing.

Motivations are certainly different now. I do wonder if kids today still have that "wild west" feel when messing with computers. I remember when I was young, writing programs and making computers do simple things seemed really impressive to me at the time! The bar is a lot higher now and I wonder if that dissuades some kids from trying because doing something impressive now requires a lot of extra overhead that I didn't have to deal with.

> I remember when I was young, writing programs and making computers do simple things seemed really impressive to me at the time!

It was the same feeling for me too, in the early/mid 80s. And it was impressive also because few other people were even doing it. Whatever their capabilities, most people didn't actually have a home computer to even practice on. Merely having a home computer in, say, 1983, was a rare thing. Actually being able to make it do things you wanted was even rarer. It probably seemed more impressive to outsiders than it actually was (in the sense that a for-loop is pretty boring/basic), but it's hard to separate that impressiveness into having access to the situation at all vs. having the skills.

Me too. Hours and hours of typing...then "RUN". I had a computer for about a year with no tape drive to store anything I typed. My Mom used to do me a favour and shut my computer off...thus losing all my work. Good times.

I learned to program from old books a coworker of my dad’s gave me. Some of my favorites were two books of computer games, where you would type in the code word for word.

Each line would have symbols that told you which versions of basic used that line, so there would be duplicate lines for different versions of basic. They didn’t have the version I had, so I had to figure out the differences on my own.

I learned so much.

I typed in stuff from Compute! magazine. They had a neat helper program to show a simple checksum of each line in the margin, so you could catch typos more easily. To make things even more fun I didn't have a tape drive so I got really good at typing in my games each time I felt like playing :)

Oh wow, I wrote games for Compute! magazine. Rather, ported them. Readers would submit games for their particular computer, an editor would select a few for each issue, and then we'd port them to the other popular 8-bit micros. Best high school job ever.

IIRC the guy who wrote the checksum generator also wrote the common assembly language graphics routines we used for all the games.

That's amazing. The game I had was called "Biker Dave" for Atari 800XL [1] and it was a little biker sprite jumping over obstacles.

[1] This was it: http://www.atarimania.com/game-atari-400-800-xl-xe-biker-dav...

I still have the Acorn Atom that I started learning to code on. 12KB of memory (with the expansion pack installed). No hard drive - if you didn't spend 10 minutes saving to cassette, it was gone. As I found out when my cousin turned off my computer after I'd rushed to dinner following an 8-hour coding session :( Still haven't forgiven him.

My girlfriend is trying to learn to code, and I realised that I had it easy - because computers were much simpler, learning to code on them was much simpler. I learned assembly coding trying to speed up my simple homemade games, but that was easy: the machine had only 3 registers, the entire machine memory was addressable, and there were maybe two dozen instructions to learn.

I think this is why I like Go so much. It reminds me of old-school BASIC in its simplicity.

I still keep typing `IF <condition> THEN` and `FOR I = 1 TO 10` if I'm tired, even after all these years.

There's a nice little app called PICO-8 that retains that simplicity; I heartily recommend it for anyone who wants to learn programming (and I don't in any way get paid for saying this). Also, that syntax is almost valid Lua, which is what PICO-8 uses.

There are a number of fantasy computers around now. I have played around with PICO-8 and TIC-80 [1] so far. Very much fun. TIC-80 is open source and can be found compiled for most platforms, even the Pandora handheld console for those who managed to get hold of one. Hopefully someone makes a port for the new DragonBox Pyra [2] when that one is out - that would be really cool :-)

[1] https://tic.computer/

[2] https://www.dragonbox.de/en/45-pyra

I'm not sure we can turn the clock back like that. Back in the early 80's computers were expected to do very little, so we were easily impressed if they did anything. For those of us who were easily impressed by this, it got us interested. And it was possible to bang out a somewhat-playable version of an arcade game in 12KB ;)

Nowadays the expectations are so much bigger. I've seen people create their first website and be disappointed it looks so crappy, rather than excited it works at all. Not everyone, mind - some are still excited.

I can tell you from experience, PICO-8 has enthralled all my children, especially my oldest, with how little is required to make a really cool game like Jelpi. Every time we realize yet another cool thing you can do with just cos, sin, and atan2, we're always floored. Maybe we're simpler people, but my children were aware of modern video games and web apps, and somehow they still are fascinated by it and the oldest one has mastered it by now.
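The cos/sin/atan2 trio really does carry most of a 2D game. A tiny Python sketch of the classic pattern (note this uses Python's math.atan2 in radians, whereas PICO-8's own atan2 works in turns with an inverted y axis; the function here is my illustration):

```python
import math

def aim(px, py, tx, ty, speed):
    """Steer an entity at (px, py) toward a target at (tx, ty):
    atan2 turns the offset into an angle, then cos/sin split the
    speed into x/y velocity components."""
    angle = math.atan2(ty - py, tx - px)
    return speed * math.cos(angle), speed * math.sin(angle)

dx, dy = aim(0, 0, 10, 0, 2)   # target straight to the right
```

Call it every frame and a homing missile, a chasing enemy, or a turret all fall out of the same three functions.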

That's really cool :)

He was using a ZX Spectrum. People might be interested in https://www.worldofspectrum.org with old magazines and games and everything Spectrum related.

My main side project right now is a 3D libretro front end with Lua programming. One of my plans is to create a virtual 80s microcomputer lab with 3d models of C64, Apple II, Spectrum and maybe a few other things as well as virtual programming manuals and floppy disks and of course the emulators running on the screens. The Lua code can read and write the emulator memory so it will be possible to create a C64 demo that manipulates 3D objects for example.

The problematic thing for the Lab is copyright on the related content. I'm not sure how I'm going to solve that. But if anyone is interested, what I have so far (without the copyrighted content) is totally programmable with Lua and free: https://vintagesimulator.com

A new commercial version of the Spectrum is about to ship, the Spectrum Next.


I started coding on a Spectrum, initially in BASIC then in z80 machine code:


Recently I hacked together a simple BASIC interpreter in golang, which was fun. Whenever I had a choice about implementation I picked the choice that was closest to the Sinclair-BASIC I remembered:


A decent candidate for a BASIC dialect which includes 'the basics' (e.g. GOSUB for backwards compatibility with OLD programs) as well as a long-developed (since 2004) set of modern enhancements (objects, named functions, ...) : https://www.freebasic.net/

Free, open-source, multi-platform.

Gotta also mention QB64 here: https://www.qb64.org/

Same thing - free, open source, multi-platform

Also - a couple for Android:

Mintoris Basic - http://www.mintoris.com/ (not open source)

RFO BASIC! - http://rfo-basic.com/ (open source)

Can you tell I like BASIC?

While I don't code in it for anything but fun and toying, it's what I learned on in the 1980s (Microsoft BASIC for the TRS-80 Color Computer 2). Also played a lot with Applesoft BASIC on the Apple IIe in high school.

Later I moved on to QBasic 1.1, QuickBasic 4.5 and PowerBasic 3.2.

My first software development job, about a year after high school, was using PICK BASIC (in the form of UniVERSE on the IBM RS6000 running AIX - my first exposure to UNIX).

Then on my Amiga with AmigaBASIC (IIRC?) then later AMOS...

I've toyed around with PBASIC (for the BASIC Stamp microcontroller).

Still holds a special place in my heart...

Small BASIC is maybe the "biggest" one you missed that I'm aware of: https://smallbasic-publicwebsite.azurewebsites.net/

(It's good that Microsoft still has a BASIC out there that you can point kids at. BASIC was Microsoft's first product and absolutely left a legacy across the decades.)

Also, I feel that as long as schools continue to require Texas Instruments graphing calculators for tests, TI-BASIC will long have a place in teenage experiments when bored in class.

Or GLBasic that just was released on Steam.

Late 70's BASIC is an interesting language. Even though it was called a high-level language at the time, it's still very close to how the CPU actually works. You get a couple primitive types (integer, float, string and half-precision floats and ints if you are very lucky) and fixed-size arrays of them, conditionals always involve a GOTO or GOSUB and all variables are global. You can still PEEK and POKE (read and write directly to memory addresses) your way around. There is no such thing as a data stack or named (actual) functions (or their parameters).

I think it's a fine language for introducing people to programming because it's conceptually very simple and leads one to realize how a limited language can or cannot express things (try to implement recursion, for instance).

This is kinda recursion - stackless in terms of local data, though - so it doesn't work right. (And honestly, I don't know whether it would work properly on an actual BASIC from the era; it would depend on how the GOSUB/RETURN stack worked - I don't recall whether you can GOSUB to a line number prior to the GOSUB line itself - not to mention other reasons.)

  10 Y0=1:GOSUB 1000:PRINT Y0
  999 END
  1000 REM
  1010 Y0=Y0+1
  1020 IF Y0<5 THEN GOSUB 1000
  1030 RETURN
You could potentially make something work with a DIM array as a limited-size stack, to make "local" variables of some sort for each level of recursion. It would be very ugly - but barely possible, I think.
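That DIM-array trick is easier to see in modern Python than in period BASIC, so here's a sketch of it (factorial as the example; the array size and names are arbitrary): no language call stack is used for the data at all, just a fixed array and a hand-managed stack pointer.

```python
def fact_manual(n):
    """Recursion with no local variables: keep each level's 'local' n
    in a DIM-style array and move a stack pointer by hand, the way an
    early all-globals BASIC would force you to."""
    stack = [0] * 32      # DIM S(31), in BASIC terms
    sp = 0
    # "descent" phase: each pass plays the role of one GOSUB,
    # saving this level's n before going one level deeper
    while n > 1:
        stack[sp] = n
        sp += 1
        n -= 1
    # "return" phase: each pass plays one RETURN, popping the
    # saved n back off and folding it into the result
    result = 1
    while sp > 0:
        sp -= 1
        result *= stack[sp]
    return result
```

In BASIC the two loops would be a GOSUB-ed routine and its RETURNs, with `SP` and `S()` as globals - ugly, exactly as described, but it works.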

This seems like a fun challenge to try out...

Many of the languages of the era, BASIC dialects included, just wouldn't recurse. You could call a single subroutine and return from that one, but it would either terminate with an error or overwrite the first return address if you tried to use it like a stack. So you would have to manage the entire thing manually - addresses and variables both. With enough POKE and PEEK you could certainly rig up a solution.

(C having a callstack is actually a pretty big deal, and not everyone adapted to it well at the time.)

Which dialects?? A stack is the obvious thing to have. You'll already need something like it for nested FOR loops, so it's the obvious thing to do for GOSUB as well. I'd be surprised to hear that any non-toy implementation didn't have a GOSUB/RETURN stack.

ZX81 BASIC has a GOSUB/RETURN stack. MS BASIC has a GOSUB/RETURN stack. Atari BASIC has a GOSUB/RETURN stack. Apple 1 BASIC has a GOSUB/RETURN stack! - sure, it appears to be all of 8 deep, so good luck with using it for anything recursive, but it's a stack nonetheless...

You need to keep a set of global pointers to arrays for all internal state of the function, increment it on the entry point and decrement it before the subroutine's RETURN.

It's a nightmare, but doable.

I remember bringing a BASIC code listing I had written to show-and-tell in 3rd grade. I printed that sucker out on a big stack of fan-fold paper on the ol' Apple Imagewriter so that when I was standing in front of the class, I could drop it and let it unspool onto the floor in dramatic fashion and say "This is a program I wrote."

Blank stares all around, including from the teacher. So it goes.

There are plenty of Python equivalents of his book. I bought "Computer Coding Python Games for Kids" for my son and it's very good. It doesn't just list the program to type in; it breaks it up into logical sections like "now we need to write the code to animate the dragons" and briefly explains how each works before giving a short code listing to type in.

Python's indentation rules are a source of confusion, but I'd much rather he learned that instead of BASIC.

To be fair, C-style blocks are at least as much a source of confusion as Python's indentation.

    if (banana)
        x = orange();
        strawberry(x, 3);

I'd think Racket (and DrRacket) would be a good language & environment for a kid to explore programming in: simple, consistent syntax, easy access to graphics. I'm not sure if there's any books or tutorials targeting kids & parents though.

80s-style BASIC does not have functions. This makes BASIC programs tend toward spaghetti, unreadable code, especially considering the constant memory constraints of those platforms. I grew up on these systems, and every time I think back on it I wish something like Forth had taken their place - i.e. something with a clean and scalable pattern for abstraction. BASIC, on the other hand, doesn't do abstractions. It barely has data types, and what it calls "functions" are an even more limited form of Python's style of lambdas; every piece of code that can actually do something looks like "GOSUB 11600" when you call it. No naming, no abstractions, nothing.

It doesn't have functions as they are normally thought of - but most BASIC languages of the era (generally Microsoft derived) did have a simple "DEF FN" style:
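For readers who never met it, a Microsoft-style DEF FN was a single named expression on one line; the Python lambda below is a rough equivalent of the same idea (the BASIC lines are a hypothetical example, not from the article).

```python
# Roughly what
#   10 DEF FN SQ(X) = X * X
#   20 PRINT FN SQ(7)
# buys you: one named expression -- no statements, no locals beyond
# the parameter -- much like a single-expression lambda.
sq = lambda x: x * x
print(sq(7))  # 49
```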


Otherwise, you had to fake it, which you kinda note: Since all variables were global in nature, this made it challenging - but possible. You could set aside a few named variables for your "functions", then just use GOSUB:

  10 X0=1:X1=3:GOSUB 1000:PRINT Y0
  999 END
  1000 Y0=X0+X1
  1010 RETURN

That isn't "spaghetti code"; done properly, it's a form of structured BASIC. No, it isn't as readable, but you honestly get used to it. "GOSUB 2050" just means "do this task" and you know what to pass it (and if you don't, then you need to consult your docs or REM statements).

It's only when you start mixing in GOTOs jumping out of routines, all over the place, no structure, etc - that's where you get the infamous spaghetti code...

> It doesn't have functions as they are normally thought of - but most BASIC languages of the era (generally Microsoft derived) did have a simple "DEF FN" style:

Um, yes? That’s exactly what I meant by “an even more limited form of Python's style of lambdas”. Which is what they are. I never used them myself; I never, ever, found a use for them.

The rest of your argument seems to be “If you’re careful, it’s technically possible to write structured code.”, but the same can be said of most any language. Even assembly language, for $FOO’s sake, usually has labels!

Modern BASICs like FreeBASIC have functions. When I tried different languages, FreeBASIC was the easiest to set up: getting from a download to compiling some source is a single terminal command. It compiles fast, too, like older BASICs or Go.


EDIT: Fixing comment to reflect you said 80's-style. My speedreading made me miss it first time.

> 80s style Basic does not have functions.

> No naming, no abstractions, nothing.

This is not entirely true. BBC BASIC (certainly the versions I used on the Electron and Master 128) had both named sub-procedures and named functions, with local variables and allowing recursion. While limited in some ways they did allow programming without sight nor sound of GOTO/GOSUB, which I actually used to do.

You could even do away with line numbers with a little hackery: write your code in a text editor (I used View, built in to the Master series, which we'd earlier had as an add-on to our Electron at home). I forget how due to the mists of time, but I had a method of having the file replayed as if typed from the keyboard (adding in the line numbers via AUTO) and executing it directly. I felt rather clever, particularly as I did it mainly because a CS teacher had said it was not possible!

> BBC BASIC […] had both named sub-procedures and named functions […]

OK, sounds great. I also remember the versions of Basic on the Amiga and Atari ST being perfectly adequate in this regard.

But this is never the version of Basic which people talk about! What you describe is not the version of Basic used by all those listings in magazines, or in the linked article. This is, for all intents and purposes, not the “80s style Basic” which everybody remembers with such apparent and baffling fondness.

This is true of the 8-bit Basics.

However, later 80's 16-bit Basic (think Amiga, ST, QuickBasic for the PC, etc.) at least had functions, libraries, etc.

Going back to 8-bit is probably too far...

BBC BASIC (1981 - you want the 1982 version though) had procedures, multi-line functions, and variable names of any length. It's still a bit rubbish by modern standards, but workable for the sort of size of program that you might actually write. You're vanishingly unlikely to make something longer than 1,000 lines, if even that...

> and variable names of any length

Almost, but not quite. It would ignore anything after the 40th character of a name, and the overall line length restriction (255 characters, less line number) imposed a secondary limit.

Still night-and-day compared to some common contemporary BASIC variants that only allowed two-character variable names, of course, and not much of a limitation as really long names would soon have you hitting the overall memory restriction of the default address space (between 10 and 29Kbyte available, depending on screen mode, which had to be enough for the heap (including your code) and the call stack).

I perhaps remember this far too well, given how long it is since I touched one of those machines...

Yes, sorry, the line length is indeed limited to 255 chars (or is it 252/253?)! - I guess I just meant that there are no specific additional limits on variable name lengths.

I couldn't find any evidence of a 40 char limit on BASIC II... the full length of the variable name appears to be stored and compared.

253 IIRC: 256 less two bytes for a 16-bit line number and one for either an EOL marker or a length indicator (I forget which it used and I'm feeling too lazy to look it up ATM).

Though there was also a limit (also ~256) to the line on entry, which you would normally hit first because keywords are stored tokenised: "PRINT" would be stored as one byte not five.

Some of that stuff got burned in, if you ask me.

> Going back to 8-bit is probably too far...

I agree, but this is exactly what the linked article does. It even promotes the idea that the GOTO is better than functions or OOP, because it’s “simpler”. Needless to say, I disagree.

Naming things is almost a prerequisite to be able to think and reason about things without keeping the entire program in your head at once. And without any means of abstraction with names for code, old-style 80s Basic fails this requirement.

And many of them compiled to actual native code.

More precisely, they didn't have a stack, and honestly most 8-bit computers didn't have the memory or hardware support for one.

I think exposing people to at least a little bit of basic or assembly is probably a good thing so people can understand why functions are useful.

I disagree; growing up deprived of things damages you deeply, as you can’t as easily get used to them later in life. I am reminded of the quote from The Unix-Haters Handbook:

I liken starting one’s computing career with Unix, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. But, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act.

— Ken Pier, Xerox PARC

Even putting aside its educational value (questionable), 80s BASIC still matters for the same reason that COBOL still matters: I know for a fact it is being used to run businesses with $1m+ annual revenue (I get paid to maintain a couple of them). Quasi-ERP type systems sold in the 70s and 80s continue to survive on commercial interpreters like ProvideX (https://home.pvxplus.com/) and BBX (https://www.basis.com/). Note, I'm not saying that's a good thing or that it's fun to work on.

Computers were, in general, much more accessible in the 80's. I'm sure the simplicity helped...

Many shipped with books documenting not only the Basic language, but full hardware documentation: memory map of the machine, special memory addresses (useful for peek/poke), etc. I even remember the Apple II manuals coming with schematics for the entire motherboard!

And commented ROM listings, where one found out there was a line assembler.

Good times.

I second that. Today, BASIC can clearly be a great "toy" language to get into programming. And being so different from "modern" languages is not a disadvantage but actually an advantage. There's a world beyond the standard imperative-OOP languages. Why not go to Haskell after BASIC? There is so much more to explore than Python or Processing...

Question: is there a small and simple terminal that runs basic and suitable for young children?

I’m looking to avoid a general purpose device that can play modern games and connect to the internet.

The 8-Bit Guy is building one, you might want to check out his video on this:


QBasic on FreeDOS running in an emulator on a RaspberryPi?

I think RiscOS includes a pretty robust BASIC, and then you avoid the emulation overhead.

QBasic is also a sort of weird situation - modern in some ways, but sometimes feature- and resource-limited compared to other BASICs. It was a defeatured QuickBasic; they sort of expected users to graduate to the full product.

You sure you couldn't squeeze a few more layers of abstraction on there?

QBasic on FreeDOS running in a Raspberry Pi emulator in a Docker container over VNC on Kubernetes running on Amazon EKS in several availability zones for five nines of abstraction nostalgia goodness.

You could have FreeDOS run on a virtual machine? And emulate the RasPi in the cloud, but that would just be silly.

Check out the Pocket CHIP; you can use BASIC on it, but the real gem is writing for PICO-8.



There are a few projects using things like the Arduino to create a homebrew 80s micro style computer. Found this one relatively quickly: https://www.instructables.com/id/Arduino-Basic-PC-With-VGA-O...

RC2014 'is a simple 8 bit Z80 based modular computer originally built to run Microsoft BASIC' https://rc2014.co.uk/ Though for keyboard and screen you still need a terminal (or terminal emulator, which can be a Raspberry Pi).

RISC OS Pico - a minimal BBC Basic environment for the Raspberry Pi: https://www.riscosopen.org/content/sales/risc-os-pico

Raspberry Pi running IchigoJam BASIC? Or the IchigoJam board itself?


Surely there's a server linux distro that ships without X11. Or you could uninstall X11 from another distro.

Today's warty mainstream beginner language, JavaScript, is 100x better than BASIC, IMO.

Most graphing calculators match that description.

BASIC256 on a rpi?

Ah... BASIC. I started into programming as a kid back in the 80's, by transcribing BASIC games from Andrew Lacey's book "Games for MSX" into, of course, an MSX. Learnt a lot.

I've found the exact Spanish version! https://sites.google.com/site/santiagoontanonvillar/personal...

Radio 4 covered BASIC in episode 3 of 5 of 'Codes that Changed the World': https://www.bbc.co.uk/programmes/b05pnvmh

The episode does a good job of explaining that the language hit a sweet spot of easy to learn, and easy to implement on the limited hardware of early home computers.

There are a number of old Usborne books available for download which teach how to write simple games in BASIC (go to the bottom of the page): https://usborne.com/browse-books/features/computer-and-codin...

Lua feels like it could fit the bill as an easy to get into, but modern language. See the great demos folks make with the Pico-8 fantasy console https://twitter.com/hashtag/tweetcart

I am an admitted retro addict - I have every computer I've ever owned since 1983. So, to me there is just one answer to why BASIC still matters: Old computers never die - their users do!

Every computer out there is still capable of delivering value to someone. I really think it's a shame we have allowed the consumer-cult viewpoint to prevail, and that we consider 'old computers' to be "useless" - when this is really not what's happened.

We changed, not the physical hardware.

By way of anecdote: My 11 year old kid is learning as much about programming computers - and most importantly, how computers really work - from hacking code on one of my 8-bit machines, as he would by trying to crank up Xcode, were that at all feasible ..

BASIC was a huge turn off for me as a kid in the 90s. I hated it. I hated having to deal with the line numbers. I hated everything about it, except that it let you write a program. Whenever I would look at a long BASIC program I would think, “how the hell did anyone write this?” To this day I don’t understand how people liked this language. I guess it’s easier than assembly, which also has line numbers. And I absolutely loved programming and computers. I remember reading a C++ book around 1997 and even though I couldn’t get my hands on a compiler then, it was liberating to just read about.

Because we only got to do it during the 8-bit era, there were tons of books and magazine listings to learn from, and for many of us it was the only way to get software.

16 bit home computers already had structured BASIC compilers, producing native code, for example I learned Turbo Basic in 1990.

Here is what it looked like:


The Amiga demoscene had AMOS.

Just to cite the ones I had the most experience with.

Thanks for the perspective and especially the book link. I looked around at the book. Looks really nice. I couldn’t have afforded this compiler back in the day though :). Incredible all these compilers/interpreters are free now.

Everyone liked BASIC for 2 reasons, if they did not have much prior exposure to a Unix-like or VAX-like multiuser/multitasking environment in the 80's:

- BASIC was a very interactive environment. This was new and awesome in the 80s. You could enter programs, run them immediately, edit, run, repeat, without having to go through a compilation step.

- BASIC worked without any planning whatsoever. You don't need to declare variables, or even arrays if you only use subscripts 1-10. Strings were handled automatically.

So it was really fun to play with. You could make your new computer do something rather quickly, and refine it over time.

Line numbers were needed on the old 8-bit systems that didn't have full screen text editors with copy/paste built in. It's how you told BASIC where to put new lines in the program.

Admittedly it was certainly getting long in the tooth in the 90's though.

Thanks for this perspective. I guess I’m just one generation late so I never got it. Admittedly most of my BASIC experience was on graphing calculators :)

What was it that made games in BASIC simpler than simple games in currently popular languages? Things like "x = x + vx" aren't really different. Python is a much bigger language and though you don't need to know the extra features, they're going to leak through in the form of harder-to-understand errors when you go off the rails. But is there more to it? I haven't tried to use pygame or https://love2d.org or the like. Is there a niche for an even simpler library and IDE?
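As a point of comparison, the heart of such a game really is just a few additions per frame. A minimal, library-free Python sketch (all numbers arbitrary) of a ball bouncing inside a box:

```python
# The whole "physics" of a simple BASIC-era game: position plus
# velocity each frame, with a sign flip at the walls.
x, y = 0.0, 0.0
vx, vy = 3.0, 2.0
WIDTH, HEIGHT = 40, 24   # a text-screen-sized playfield

for frame in range(100):
    x += vx
    y += vy
    if not 0 <= x <= WIDTH:   # bounce off left/right walls
        vx = -vx
        x += vx
    if not 0 <= y <= HEIGHT:  # bounce off top/bottom walls
        vy = -vy
        y += vy

print(f"after 100 frames: x={x:.0f} y={y:.0f}")
```

The BASIC version of this is a handful of lines too; the difference today is mostly in how much ceremony surrounds getting pixels on screen, which is where libraries like pygame or LÖVE come in.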

My Own Experience with Basic and teaching programming:

What I know from teaching kids is that LINE NUMBERS make more sense to them. Personally I HATE line numbers and think they are the devil.

I was showing kids my old 1970s code books that I used to copy into BASIC to play games. I told them how I was in 1st grade and that my dad would help debug them with me. They all wanted to try, and one kid copied a good 300 lines for a battleship game. For some reason the logical numbers made sense to them. That kid must have played that game dozens of times because he felt like he made it.

My own opinion is that BASIC shouldn't be used to teach programming. I opt for Racket.

I know where those kids are coming from. I understood 8-bit BASICs "better" than I did QBASIC, because I could spatially follow the flow of a program written with GOTO: knowing the line number told me roughly where in the code it would land. But code that used labels and indirection (arrays, pointers, that kind of thing) was beyond me for quite a while, so I didn't fully understand most of the interesting listings, even though I could type them in and find data assignments or comparisons to change.

Modern languages pack in tons of indirection because that's where the power tools are - but an introductory environment might benefit from cutting down on that.

Yes. I think it was Atari BASIC that allowed GOTO (X*100)+1000

Lol, found out later why that was bad, but as a kid, why not?

What else are the numbers for?
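A computed GOTO like that is really arithmetic dispatch: the value of X picks the target line. The tame modern equivalent is a table of handlers; here's a hedged Python sketch with made-up room names:

```python
# GOTO (X*100)+1000 jumps to a line computed from X. A dict of
# functions does the same dispatch without breaking the moment
# you renumber your program (names below are invented).
def cave():    return "You are in a cave."
def forest():  return "You are in a forest."
def castle():  return "You are in a castle."

rooms = {0: cave, 1: forest, 2: castle}

x = 1
print(rooms[x]())  # -> You are in a forest.
```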

That's how I started - on an Apple ][/e clone (the Bulgarian version was called "Pravetz 8C"). Right there the ] prompt was waiting for me, and even CALL-151 was available for further experimentation ;)

Oh, the joy when I found Beagle Bros software - https://en.wikipedia.org/wiki/Beagle_Bros Also this: https://stevenf.com/beagle/where.html

Here is a browser-based Javascript Apple Basic emulator: https://www.calormen.com/jsbasic/

I assigned my son exercises for it over one summer. I had him save his results into local Notepad for later inspection by me. It also comes with drop-down samples.

What about a 21st century beginner's language, in the same vein as BASIC, but which is more similar to modern languages? Does one exist?

That would obviously be Javascript. A toy language no sane person would use willingly but just powerful enough to demonstrate the sheer minimum of what a computer can do. Yet its adepts are know-it-all gurus who will never, ever, need another language again. Hence the preoccupation with re-implementing every piece of software that has ever been conceived, in Javascript. </rant>

We don't condone torturing children here.

The subset of C/C++ you see used in typical Arduino sketches seems to be what's evolved as the replacement.

The kids I see using Arduinos seem roughly like the same set of kids that would have had an 8-bit home computer in the 80's, and actually spent any time programming on it.

Oh no no no. If we do a 21st century version, we'll throw in all sorts of opinionated CS jerkoffery into it, and then it will look just like everything else--where we have to pass through three different toolchains before we can even think about running something.

Bicycles are already pretty good just the way they are.

Microsoft maintained Small Basic from 2008 to 2015 (https://en.wikipedia.org/wiki/Microsoft_Small_Basic), though a cynic might say that it's just an on-ramp to VB.Net.

Python is pretty much taking over as the go-to beginner programming language (see what I did there?).

However my daughters are learning VB.NET at school for their GCSE Computer Science (UK). It's a very capable language, if a little bit verbose and clunky, on a modern runtime and with excellent tooling. Until recently they were using Visual Studio for Mac, but we got my eldest a Windows 10 laptop for Christmas so now she's on 'real' Visual Studio. They're both very good though. It's quite something to watch her debugging code by setting breakpoints, inspecting variables and such.

For the last couple of years at my previous job I was working in the life sciences domain, and got to meet several researchers who were using VB.NET for their research projects.

In all cases they were people who could already do some stuff in Office VBA, got IT to make a VS install available to them, and just coded away in VB.NET when needed.

It helped that many devices in life sciences tend to have Windows-only drivers, sometimes COM-based; as such, in their specific domain, having .NET-based applications around was already pretty common anyway.

HyperCard has a modern implementation called LiveCode[1] that might be a good starting spot, but is definitely not as accessible as the default installed HyperCard on a System 6/7 machine.

[1] https://livecode.com/

Also Swift has 'Playgrounds' as an iPad app or online http://online.swiftplayground.run/

My fondest memories of programming are all the days of BASIC. I felt it was super simple, easy to keep in your head, and made no mistakes about flow. I don't honestly know if this was just because I was young and excited, or if the simplicity of BASIC has a lot of merits that are missing in today's popular languages.

I think it was youth. My fondest memories of programming were Turbo Pascal. I used basic in the eighties but I was still a literal child. You were probably either a Gen-Xer or a little more precocious about programming than me.

In about five to ten years we'll have a spate of articles from nostalgic early Millennials about Turbo Pascal, Turbo C, Watcom C and such, then a rash of remembrances about CGI-bin, then early Java....

Personally, I miss mode 13h at least twice a month.

bas55 [1] is the closest I can find to a Dartmouth-style BASIC (so 70s rather than 80s). Alas, constants are truncated at 6 places (internal calculations are all doubles). It compiles fine and works well, complete with LOAD, SAVE and limited line editing.

Does anyone know of a portable basic interpreter that supports double precision all through?

The kind of stuff I (used to) do [2] requires more than 6 figures.

[1] https://jorgicor.niobe.org/bas55/

[2] https://sohcahtoa.org.uk/kepler/moon2.html

Maybe FreeBASIC would be useful, (although it's a compiler rather than an interpreter, and a more modern dialect than 70s BASIC, mostly compatible with QBASIC/QuickBASIC). https://www.freebasic.net/

It handles 64-bit doubles: https://www.freebasic.net/wiki/wikka.php?wakka=KeyPgDouble

Have yet to read article, but I believe that it is always beneficial to learn how things were done. If only to gain context around way things are the way they are, ignoring anything else.

and you can grab TurboBasic/QBasic and have functions, subroutines, switch/case, while loops....
