
Why 80s BASIC still matters - ingve
https://blog.usejournal.com/why-80s-basic-still-matters-1c17de5768fa
======
payne92
This article inadvertently underscores the huge challenge in getting kids
interested in programming today.

In 2019, it's very hard to do anything "interesting" with BASIC when kids are
surrounded by apps and games with hundreds or thousands of person-hours of
development. "Hunt the Wumpus" just doesn't cut it anymore.

As others have written, Python is much more interesting because of the
extensive library ecosystem. Integrating the physical world, graphics,
controllers, sound, audio, network, video, etc. offers a lot more engagement
than "Hello World".

Javascript is also interesting because it runs _everywhere_, and HTML/JS
skills will offer good summer job money. :)

~~~
klodolph
Even in 1985, the things that a child could produce in BASIC weren't
"interesting". Children in that era could make a few dots appear on the
screen, or make a prompt that asked questions and responded, or maybe draw
some interesting procedural graphics. It's not much when you compare it to
games like _The Bard's Tale_, _Ultima IV_, or _King's Quest II_. But it was
enough, because programming was by itself an interesting puzzle to solve.

It's not the result that's interesting, it's the process of programming.

~~~
Falkon1313
> the things that a child could produce in BASIC weren't "interesting"

That's just not true. For I/O you had keyboard input, screen output, and maybe
file I/O: three basic things that pretty much all programs used. Most commercial
programs were 80x25 text mode with 16 color attributes. If you could learn to
read scancodes from the keyboard, write text to the screen, and save/load
files (along with a few fundamental ideas like loops and conditionals), you
could do almost anything that a commercial program did, with just that basic
knowledge of a few simple things.

It was simple because it was direct; there weren't 72 layers of abstraction
and UI libraries that you had to know before you could even begin. You could
also do pixel graphics, and again it was pretty simple - just set the screen
mode and go with DRAW, LINE, CIRCLE, POKE etc. Again, no layers of indirection
or libraries or window managers or canvases or
DrawableObjectFactoryManagerFactoryInterfaces needed.

One of the problems of modern attempts to recreate that feel is just that
there is so much between the programmer and the hardware and it's so much more
complex. Back then, we had single-user, single-process systems with direct
access to simple hardware. Everything you needed to do was either a built-in
part of the language or a BIOS/OS interrupt that you could call just as
easily. It's hard to recreate that on a multi-user multitasking system with
indirect access to more complicated hardware and countless layers of software
between the programmer and the machine and BIOS/OS interrupts. Sure you can do
more powerful stuff now, but it's not as interesting.

~~~
klodolph
Direct access to hardware is not the difference. It is easier to get things
done today. You can argue that it was simpler in 1985, but I don't think that
simplicity is the right way to measure things. You can present an easy-to-use
programming environment to people, accessible to newcomers, usable by experts,
and it doesn't have to be simple but it has to be possible to use it in simple
ways.

Just personal observations here. Someone with a computer in 2019 can download
Unity, which is free, and watch a tutorial on YouTube, which is also free.
With near zero understanding of what is going on, and no prior experience,
they can have some kind of rudimentary platformer working within hours. This
is then a good starting point to learn programming (you can dive into C#) or
you can continue to jumble together copy/pasted bits of code that you see
online (kind of like how I remember doing with BASIC and library books).

Sure, there are a bunch of layers of abstraction, and those abstractions will
break down all the time. You don't _need_ to learn those abstraction layers,
you can stay at the top layer and still get good work done, maybe working
around a few problems that you don't understand from time to time by futzing
around with stuff until it works. Or you can dive in and try to understand
what's underneath an abstraction. That's the whole point of having
abstractions in the first place. They exist to hide the complexity, to let you
get work done without understanding the entire system. 72 layers of
abstraction is a bit of an exaggeration. I'd say if BASIC has three layers
(interpreter/machine code/hardware), Unity only has six
(editor/scripts/engine/os/machine code/hardware), but who knows how you count
them.

In my experience, watching people learn how to create things, it has _never_
been easier to learn programming and start building things. The main
difference is that in 1985, programming was considered an essential computer
skill, and in 2019, you're expected not to program. BASIC was amazing because
it was what you saw on the screen when you turned your computer on, nothing
more.

------
fotbr
80s BASIC (GW-BASIC, to be specific) was, right up to his retirement two years
ago, my father's go-to language for anything work related. It was "good enough"
for the job, cheaper and faster than trying to get their software department
to understand the problem needing solving, and it was what he knew --
including its limitations and when not to use it.

Old, "bad" languages can still be extremely powerful and productive for what
used to be called "power users"; don't dismiss them out of hand because the
professional software development world has moved on.

------
drewg123
What was transformative for me in my youth were the "listings" he mentions. I
remember spending hours typing in pages of code, having it fail, and having to
debug it. Even though I didn't think so when I was 12, I think these days that
those failures were the best thing that ever happened to me. I actually think
they were far more valuable than the AP CS course I took as a senior in HS.

That experience taught me skills that I still use today. Mainly, trying to
understand an unfamiliar (often poorly commented) codebase, and having to
debug it.

These days, I think that the most equivalent experience might be working on
something open source. But I'm afraid the motivation just isn't there for most
kids. My motivation was "Wow! Free GAMES!", but my kid gets free games on his
tablet without having to debug a darned thing.

~~~
mipmap04
Motivations are certainly different now. I do wonder if kids today still have
that "wild west" feel when messing with computers. I remember when I was
young, writing programs and making computers do simple things seemed really
impressive to me at the time! The bar is a lot higher now and I wonder if that
dissuades some kids from trying because doing something impressive _now_
requires a lot of extra overhead that I didn't have to deal with.

~~~
mgkimsal
> I remember when I was young, writing programs and making computers do simple
> things seemed really impressive to me at the time!

It was the same feeling for me too, in the early/mid 80s. And it was impressive
_also_ because few other people were even doing it. Whether or not other
people could have done it, most didn't actually _have_ a home computer to even
practice on. Merely having a home computer in, say, 1983, was a rare thing.
Actually being able to make it do things you wanted was even rarer. It
probably seemed more impressive to many outsiders than it actually was (in the
sense that a for-loop is pretty boring/basic), but it's hard to separate how
much of that impressiveness came from having access to the situation at all
vs. having the skills.

------
marcus_holmes
I still have the Acorn Atom that I started learning to code on. 12Kb of memory
(with the expansion pack installed). No hard drive - if you didn't spend 10
minutes saving to cassette then it was gone. As I found out when my cousin
turned off my computer after I'd rushed to dinner following an 8-hour coding
session :( Still haven't forgiven him.

My girlfriend is trying to learn to code, and I realised that I had it easy -
because computers were much simpler, learning to code on them was much
simpler. I learned Assembly coding trying to speed up my simple homemade
games, but that was easy: the machine only had 3 registers, the entire machine
memory was addressable, and there were maybe two dozen commands to learn.

I think this is why I like Go so much. It reminds me of old-school BASIC in
its simplicity.

I _still_ keep typing `IF <condition> THEN` and `FOR I = 1 TO 10` if I'm
tired, even after all these years.

~~~
sephoric
There's a nice little app called PICO-8 that retains that simplicity; I
heartily recommend it to anyone who wants to learn programming (and I don't
in any way get paid for saying this). Also, that syntax is almost valid Lua,
which is what PICO-8 uses.

~~~
marcus_holmes
I'm not sure we can turn the clock back like that. Back in the early 80's,
computers were expected to do very little, so we were easily impressed if they
did anything - and for those of us who were easily impressed, that's what got
us interested. And it was possible to bang out a somewhat-playable version of
an arcade game in 12Kb ;)

Nowadays the expectations are so much bigger. I've seen people create their
first website and be disappointed it looks so crappy, rather than excited it
works at all. Not everyone, mind - some are still excited.

~~~
sephoric
I can tell you from experience, PICO-8 has enthralled all my children,
especially my oldest, with how little is required to make a really cool game
like Jelpi. Every time we realize yet another cool thing you can do with just
cos, sin, and atan2, we're floored. Maybe we're simpler people, but my
children were aware of modern video games and web apps, and somehow they are
still fascinated by it, and the oldest has mastered it by now.

~~~
marcus_holmes
That's really cool :)

------
ilaksh
He was using a ZX Spectrum. People might be interested in
[https://www.worldofspectrum.org](https://www.worldofspectrum.org) with old
magazines and games and everything Spectrum related.

My main side project right now is a 3D libretro front end with Lua
programming. One of my plans is to create a virtual 80s microcomputer lab with
3d models of C64, Apple II, Spectrum and maybe a few other things as well as
virtual programming manuals and floppy disks and of course the emulators
running on the screens. The Lua code can read and write the emulator memory so
it will be possible to create a C64 demo that manipulates 3D objects for
example.

The problematic thing for the Lab is copyright on the content. I'm not
sure how I'm going to solve that. But if anyone is interested, what I have so
far (without the copyrighted content) is totally programmable with Lua and
free. [https://vintagesimulator.com](https://vintagesimulator.com)

~~~
walkingolof
A new commercial version of the Spectrum is about to ship, the Spectrum Next.

[https://www.specnext.com](https://www.specnext.com)

------
8bitsrule
A decent candidate for a BASIC dialect that includes 'the basics' (e.g. GOSUB
for backwards compatibility with OLD programs) as well as a long-developed
(since 2004) set of modern enhancements (objects, named functions, ...):
[https://www.freebasic.net/](https://www.freebasic.net/)

Free, open-source, multi-platform.

~~~
cr0sh
Gotta also mention QB64 here: [https://www.qb64.org/](https://www.qb64.org/)

Same thing - free, open source, multi-platform

Also - a couple for Android:

Mintoris Basic - [http://www.mintoris.com/](http://www.mintoris.com/) (not
open source)

RFO BASIC! - [http://rfo-basic.com/](http://rfo-basic.com/) (open source)

Can you tell I like BASIC?

While I don't code in it for anything but fun and toying, it's what I learned
on in the 1980s (Microsoft BASIC for the TRS-80 Color Computer 2). Also played
a lot with Applesoft BASIC on the Apple IIe in high school.

Later I moved on to QBasic 1.1, QuickBasic 4.5 and PowerBasic 3.2.

My first software development job, about a year after high school, was using
PICK BASIC (in the form of UniVERSE on the IBM RS6000 running AIX - my first
exposure to UNIX).

Then on my Amiga with AmigaBASIC (IIRC?) then later AMOS...

I've toyed around with PBASIC (for the BASIC Stamp microcontroller).

Still holds a special place in my heart...

~~~
WorldMaker
Small BASIC is maybe the "biggest" one you missed I'm aware of:
[https://smallbasic-publicwebsite.azurewebsites.net/](https://smallbasic-publicwebsite.azurewebsites.net/)

(It's good that Microsoft still has a BASIC out there that you can point kids
at. BASIC was Microsoft's first product and absolutely left a legacy across
the decades.)

Also, I feel that as long as schools continue to require Texas Instruments
graphing calculators for tests, TI-BASIC will long have a place for teenage
experiments when bored in class.

------
rbanffy
Late-70's BASIC is an interesting language. Even though it was called a high-
level language at the time, it's still _very_ close to how the CPU actually
works. You get a couple of primitive types (integer, float, string - and half-
precision floats and ints if you are very lucky) and fixed-size arrays of
them, conditionals always involve a GOTO or GOSUB, and all variables are
global. You can still PEEK and POKE (read and write directly to memory
addresses) your way around. There is no such thing as a data stack or named
(actual) functions (or their parameters).

I think it's a fine language for introducing people to programming because
it's conceptually very simple and leads one to realize how a limited language
can or cannot express things (try to implement recursion, for instance).

~~~
cr0sh
This is kinda recursion - stackless, though - so it doesn't work right (and
honestly, I don't know whether it would work properly on actual BASIC from the
era; it would depend on how the GOSUB/RETURN stack worked - I don't recall
whether you can GOSUB to a routine at a line number prior to the GOSUB line
number, not to mention other reasons):

    
    
      10 Y0=1:GOSUB 1000:PRINT Y0
      999 END
      1000 REM SUBROUTINE CALLS ITSELF VIA GOSUB
      1010 Y0=Y0+1
      1020 IF Y0<5 THEN GOSUB 1000
      1030 RETURN
    

You could potentially make something work with a DIM array for a limited sized
stack to make "local" variables of some sort for each level of recursion. It
would be very ugly - but barely possible I think.
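A sketch of that DIM-array idea, in Python rather than BASIC purely to keep it
short and runnable (the factorial example and names are mine, not from the
thread): recursion simulated with a fixed array and a hand-managed stack
pointer.

```python
def factorial_manual_stack(n):
    """Factorial via an explicit array-as-stack, the way a BASIC
    program would do it with DIM S(31) and a stack-pointer variable."""
    stack = [0] * 32   # fixed-size array, like DIM S(31)
    sp = 0             # stack pointer, managed by hand
    # "call" phase: push each value of n on the way down
    while n > 1:
        stack[sp] = n
        sp += 1
        n -= 1
    result = 1
    # "return" phase: pop and multiply on the way back up
    while sp > 0:
        sp -= 1
        result *= stack[sp]
    return result

print(factorial_manual_stack(5))  # 120
```

Ugly, as predicted, but it shows why it's "barely possible": you end up
reimplementing the call stack the language doesn't give you.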

This seems like a fun challenge to try out...

~~~
mntmoss
Many of the languages of the era, BASIC dialects included, just wouldn't
recurse. You could call a single subroutine and return from that one, but it
would either terminate with an error or overwrite the first return address if
you tried to use it like a stack. So you would have to manage the entire thing
manually - addresses and variables both. With enough POKE and PEEK you could
certainly rig up a solution.

(C having a callstack is actually a pretty big deal, and not everyone adapted
to it well at the time.)

~~~
tom_
Which dialects?? A stack is the obvious thing to have. You'll already need
something like it for nested FOR loops, so it's the obvious thing to do for
GOSUB as well. I'd be surprised to hear that any non-toy implementations
didn't have a GOSUB/RETURN stack.

ZX81 BASIC has a GOSUB/RETURN stack. MS BASIC has a GOSUB/RETURN stack. Atari
BASIC has a GOSUB/RETURN stack. _Apple 1 BASIC_ has a GOSUB/RETURN stack! -
sure, it appears to be all of 8 deep, so good luck with using it for anything
recursive, but it's a stack nonetheless...

------
jcadam
I remember bringing a BASIC code listing I had written to show-and-tell in 3rd
grade. I printed that sucker out on a big stack of fan-fold paper on the ol'
Apple Imagewriter so that when I was standing in front of the class, I could
drop it and let it unspool onto the floor in dramatic fashion and say "This is
a program I wrote."

Blank stares all around, including from the teacher. So it goes.

------
zelos
There are plenty of Python equivalents of his book. I bought "Computer Coding
Python Games for Kids" for my son and it's very good. It doesn't just list the
program to type in; it breaks it up into logical sections like "now we need to
write the code to animate the dragons" and then briefly explains how it works
before giving a short code listing to type in.

Python's indentation rules are a source of confusion, but I'd much rather he
learned that instead of BASIC.

~~~
klodolph
To be fair, C-style blocks are at least as much a source of confusion as
Python's indentation.

    
    
        if (banana)
            x = orange();
            strawberry(x, 3);  /* indented like the if-body, but runs unconditionally */
        kumquat(x);
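For contrast, here is the Python side of the same comparison (my example, not
from the thread, with made-up names): the indentation _is_ the block, so what
executes matches what you see.

```python
def pick(banana):
    """Indentation decides block membership: both appends below
    are inside the if; the dedented line always runs."""
    picked = []
    if banana:
        picked.append("orange")
        picked.append("strawberry")  # still inside the if
    picked.append("kumquat")         # dedented: always runs
    return picked

print(pick(True))   # ['orange', 'strawberry', 'kumquat']
print(pick(False))  # ['kumquat']
```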

------
teddyh
80s-style Basic does not have _functions_. This makes Basic programs tend
toward spaghetti and unreadable code, especially considering the constant
memory constraints of those platforms. I grew up on these systems, and every
time I think back on it I wish something like Forth had taken its place - i.e.
something with a clean and scalable pattern for abstraction. Basic, on the
other hand, doesn't _do_ abstractions. It _barely_ has data types, and what it
calls “functions” are an _even more_ limited form of Python-style lambdas;
every piece of code that can actually _do_ something looks like
“GOSUB 11600” when you call it. No naming, no abstractions, nothing.

~~~
icedchai
This is true of the 8-bit Basics.

However, later 80's 16-bit Basic (think Amiga, ST, QuickBasic for the PC,
etc.) at least had functions, libraries, etc.

Going back to 8-bit is probably too far...

~~~
tom_
BBC BASIC (1981 - you want the 1982 version though) had procedures, multi-line
functions, and variable names of any length. It's still a bit rubbish by
modern standards, but workable for the sort of size of program that you might
actually write. You're vanishingly unlikely to make something longer than
1,000 lines, if even that...

~~~
dspillett
_> and variable names of any length_

Almost, but not quite. It would ignore anything after the 40th character of a
name, and the overall line length restriction (255 characters, less line
number) imposed a secondary limit.

Still night-and-day compared to some common contemporary BASIC variants that
only allowed two-character variable names, of course, and not much of a
limitation as really long names would soon have you hitting the overall memory
restriction of the default address space (between 10 and 29Kbyte available,
depending on screen mode, which had to be enough for the heap (including your
code) and the call stack).

I perhaps remember this far too well, given how long it is since I touched one
of those machines...

~~~
tom_
Yes, sorry, the line length is indeed limited to 255 chars (or is it
252/253?)! - I guess I just meant that there are no specific additional limits
on variable name lengths.

I couldn't find any evidence of a 40 char limit on BASIC II... the full length
of the variable name appears to be stored and compared.

~~~
dspillett
253 IIRC: 256 less two bytes for a 16-bit line number and one for either an
EOL marker or a length indicator (I forget which it used and I'm feeling too
lazy to look it up ATM).

Though there was also a limit (also ~256) to the line on entry, which you
would normally hit first because keywords are stored tokenised: "PRINT" would
be stored as one byte not five.

------
velcrovan
Even putting aside its educational value (questionable), 80s BASIC still
matters for the same reason that COBOL still matters: I know for a fact it is
being used to run businesses with $1m+ annual revenue (I get paid to maintain
a couple of them). Quasi-ERP type systems sold in the 70s and 80s continue to
survive on commercial interpreters like ProvideX
([https://home.pvxplus.com/](https://home.pvxplus.com/)) and BBX
([https://www.basis.com/](https://www.basis.com/)). Note, I'm not saying
that's a good thing or that it's fun to work on.

------
icedchai
Computers were, in general, much more accessible in the 80's. I'm sure the
simplicity helped...

Many shipped with books documenting not only the Basic language, but full
hardware documentation: memory map of the machine, special memory addresses
(useful for peek/poke), etc. I even remember the Apple II manuals coming with
schematics for the entire motherboard!

~~~
ddingus
And commented ROM listings, where one found out there was a line assembler.

Good times.

------
ktpsns
I second that. Today, BASIC can clearly be a great "toy" language to get into
programming. And being so different from "modern" languages is not a
disadvantage but actually an advantage: there's a whole world beyond standard
imperative OOP languages. Why not go to Haskell after BASIC? There is so much
more to explore than Python or Processing...

------
banku_brougham
Question: is there a small and simple terminal that runs basic and suitable
for young children?

I’m looking to avoid a general purpose device that can play modern games and
connect to the internet.

~~~
achr2
QBasic on FreeDOS running in an emulator on a RaspberryPi?

~~~
AnIdiotOnTheNet
You sure you couldn't squeeze a few more layers of abstraction on there?

~~~
ohithereyou
QBasic on FreeDOS running in a Raspberry Pi emulator in a Docker container
over VNC on Kubernetes running on Amazon EKS in several availability zones for
five nines of abstraction nostalgia goodness.

------
scoutt
Ah... BASIC. I started into programming as a kid back in the 80's, by
transcribing BASIC games from Andrew Lacey's book "Games for MSX" into, of
course, an MSX. Learnt a lot.

I've found the exact Spanish version!
[https://sites.google.com/site/santiagoontanonvillar/personal...](https://sites.google.com/site/santiagoontanonvillar/personal/old-computer-books-preservation)

------
cpswan
Radio 4 covered BASIC in episode 3 of 5 of 'Codes that Changed the World'
[https://www.bbc.co.uk/programmes/b05pnvmh](https://www.bbc.co.uk/programmes/b05pnvmh)

The episode does a good job of explaining that the language hit a sweet spot
of easy to learn, and easy to implement on the limited hardware of early home
computers.

------
UncleSlacky
There are a number of old Usborne books available for download which teach how
to write simple games in BASIC (go to the bottom of the page):
[https://usborne.com/browse-books/features/computer-and-codin...](https://usborne.com/browse-books/features/computer-and-coding-books/)

------
bump-ladel
Lua feels like it could fit the bill as an easy to get into, but modern
language. See the great demos folks make with the Pico-8 fantasy console
[https://twitter.com/hashtag/tweetcart](https://twitter.com/hashtag/tweetcart)

------
fit2rule
I am an admitted retro addict - I have every computer I've ever owned since
1983. So, to me there is just one answer to why BASIC still matters: Old
computers never die - their users do!

Every computer out there is still capable of delivering value to someone. I
really think it's a shame we have allowed the consumer cult viewpoint to
prevail, and that we consider 'old computers' to be "useless" - when this is
really not what's happened.

 _We changed_, not the physical hardware.

By way of anecdote: My 11 year old kid is learning as much about programming
computers - and most importantly, how computers really work - from hacking
code on one of my 8-bit machines, as he would by trying to crank up Xcode,
were that at all feasible...

------
linkmotif
BASIC was a huge turn off for me as a kid in the 90s. I hated it. I hated
having to deal with the line numbers. I hated everything about it, except that
it let you write a program. Whenever I would look at a long BASIC program I
would think, “how the hell did anyone write this?” To this day I don’t
understand how people liked this language. I guess it’s easier than assembly,
which also has line numbers. And I absolutely loved programming and computers.
I remember reading a C++ book around 1997 and even though I couldn’t get my
hands on a compiler then, it was liberating to just read about.

~~~
pjmlp
Because it was all we got to use during the 8-bit era: there were tons of
books and magazine listings to learn it from, and for many of us it was the
only way to get software.

16-bit home computers already had structured BASIC compilers producing native
code; for example, I learned Turbo Basic in 1990.

Here is how it looked:

[https://archive.org/details/bitsavers_borlandBorsHandbook198...](https://archive.org/details/bitsavers_borlandBorsHandbook1987_15768512)

The Amiga demoscene was presented with AMOS.

Just to cite the ones I had more experience with.

~~~
linkmotif
Thanks for the perspective and especially the book link. I looked around at
the book. Looks really nice. I couldn’t have afforded this compiler back in
the day though :). Incredible all these compilers/interpreters are free now.

------
abecedarius
What was it that made games in BASIC simpler than simple games in currently
popular languages? Things like "x = x + vx" aren't really different. Python is
a much bigger language and though you don't _need_ to know the extra features,
they're going to leak through in the form of harder-to-understand errors when
you go off the rails. But is there more to it? I haven't tried to use pygame
or [https://love2d.org](https://love2d.org) or the like. Is there a niche for
an even simpler library and IDE?
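To make the comparison concrete, here's the kind of update loop in question,
sketched in plain Python with no libraries (the numbers are arbitrary, my
example): the core logic really is the same in BASIC, Python, or Lua.

```python
# A ball bouncing in a 1-D range: the whole "game" is a position,
# a velocity, and a bounce test.
x, vx = 0, 1
for _ in range(10):
    x += vx              # the "x = x + vx" in question
    if x <= 0 or x >= 5:
        vx = -vx         # reverse direction at the walls
    print(x)             # prints 1,2,3,4,5,4,3,2,1,0
```

The difference isn't the kernel; it's everything wrapped around it (event
loops, windowing, assets) before you get pixels on screen.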

~~~
baldfat
My Own Experience with Basic and teaching programming:

What I know from teaching kids is that LINE NUMBERS make more sense to them.
Personally I HATE line numbers and think they are the devil.

I was showing kids my old 1970s code books that I used to copy into BASIC to
play games. I told them how I was in 1st grade and that my dad would help
debug them with me. They all wanted to try, and one kid copied a good 300
lines for a battleship game. For some reason the logical numbers made sense to
them. That kid must have played that game dozens of times because he felt like
he made it.

My own opinion is that BASIC shouldn't be used to teach programming. I opt for
Racket.

~~~
mntmoss
I know where those kids are coming from. I understood 8-bit BASICs "better"
than I did QBASIC, because I could spatially follow the flow of a program
written with GOTO: knowing the line number told me roughly where in the code
it would land. But code that used labels and indirection (arrays, pointers,
that kind of thing) was beyond me for quite a while, so I didn't fully
understand most of the interesting listings, even though I could type them in
and find data assignments or comparisons to change.

Modern languages pack in tons of indirection because that's where the power
tools are - but an introductory environment might benefit from cutting down on
that.

------
malkia
That's how I started - on an Apple ][/e clone (the Bulgarian version was
called "Pravetz 8C"). Right there the ] prompt was waiting for me, and even
CALL-151 was available for further experimentation ;)

Oh, the joy when I found Beagle Bros software -
[https://en.wikipedia.org/wiki/Beagle_Bros](https://en.wikipedia.org/wiki/Beagle_Bros)
Also this:
[https://stevenf.com/beagle/where.html](https://stevenf.com/beagle/where.html)

------
tabtab
Here is a browser-based Javascript _Apple Basic_ emulator:
[https://www.calormen.com/jsbasic/](https://www.calormen.com/jsbasic/)

I assigned my son exercises for it over one summer. I had him save his results
into local Notepad for later inspection by me. It also comes with drop-down
samples.

------
homarp
And if you're looking for the book, it's available on
[https://www.amazon.com/ZX-Spectrum-Games-Code-Club/dp/099347...](https://www.amazon.com/ZX-Spectrum-Games-Code-Club/dp/0993474454) or
[https://www.amazon.co.uk/gp/product/0993474403](https://www.amazon.co.uk/gp/product/0993474403)

------
hammock
What about a 21st century beginner's language, in the same vein as BASIC, but
which is more similar to modern languages? Does one exist?

~~~
tempodox
That would obviously be Javascript. A toy language no sane person would use
willingly but just powerful enough to demonstrate the sheer minimum of what a
computer can do. Yet its adepts are know-it-all gurus who will never, ever,
need another language again. Hence the preoccupation with re-implementing
every piece of software that has ever been conceived, in Javascript. </rant>

~~~
rbanffy
We don't condone torturing children here.

------
axaxs
My fondest memories of programming are all the days of BASIC. I felt it was
super simple, easy to keep in your head, and made no mistakes about flow. I
don't honestly know if this was just because I was young and excited, or if
the simplicity of BASIC has a lot of merits that are missing in today's
popular languages.

~~~
StrictDabbler
I think it was youth. My fondest memories of programming were Turbo Pascal. I
used BASIC in the eighties but I was still a literal child. You were probably
either a Gen-Xer or a little more precocious about programming than me.

In about five to ten years we'll have a spate of articles from nostalgic early
Millennials about Turbo Pascal, Turbo C, Watcom C and such, then a rash of
remembrances about CGI-bin, then early Java....

Personally, I miss mode 13h at least twice a month.

------
keithpeter
bas55 [1] is the closest I can find to a Dartmouth-style BASIC (so 70s rather
than 80s). Alas, constants are truncated at 6 figures (internal calculations
are all doubles). It compiles fine and works well, complete with LOAD, SAVE
and limited line editing.

Does anyone know of a portable BASIC interpreter that supports double
precision all through?

The kind of stuff I (used to) do [2] requires more than 6 figures.

[1] [https://jorgicor.niobe.org/bas55/](https://jorgicor.niobe.org/bas55/)

[2]
[https://sohcahtoa.org.uk/kepler/moon2.html](https://sohcahtoa.org.uk/kepler/moon2.html)
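(Not a BASIC interpreter, but as a calibration point: plain Python's float is
an IEEE-754 double throughout - roughly 15-16 significant decimal digits -
which comfortably clears the 6-figure bar. A quick check, my example:)

```python
import sys

# Python floats are IEEE-754 doubles: 15 guaranteed decimal digits
# of precision, from a 53-bit mantissa.
print(sys.float_info.dig)       # 15
print(sys.float_info.mant_dig)  # 53
print(f"{2/3:.15f}")            # 0.666666666666667
```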

~~~
Falkon1313
Maybe FreeBASIC would be useful, (although it's a compiler rather than an
interpreter, and a more modern dialect than 70s BASIC, mostly compatible with
QBASIC/QuickBASIC). [https://www.freebasic.net/](https://www.freebasic.net/)

It handles 64-bit doubles:
[https://www.freebasic.net/wiki/wikka.php?wakka=KeyPgDouble](https://www.freebasic.net/wiki/wikka.php?wakka=KeyPgDouble)

------
derblitzmann
I have yet to read the article, but I believe it is always beneficial to learn
how things were done - if only to gain context for why things are the way they
are, ignoring anything else.

------
Zardoz84
And you can grab TurboBasic/QBasic and have functions, subroutines,
switch/case, while loops....

