In 2019, it's very hard to do anything "interesting" with BASIC when kids are surrounded by apps and games with hundreds or thousands of person-hours of development. "Hunt the Wumpus" just doesn't cut it anymore.
As others have written, Python is much more interesting because of the extensive library ecosystem. Integrating the physical world, graphics, controllers, sound, audio, network, video, etc. offers a lot more engagement than "Hello World".
It's not the result that's interesting, it's the process of programming.
Prior to 1985: (As a child.)
I wrote "snake" and independently came up with the circular queue data structure. I wrote a "Bezerk" clone in low-res graphics, where everything was just a dot. (I had no idea how I could handle the AI at the time, so baddies just moved randomly.) I wrote a "skiing" game that exploited text scrolling for the vertical scrolling. I wrote a "Star Wars" game where you tried to "fly" a ship with a joystick first person to line up Tie Fighters in the crosshairs. It was a clone of one of the demo games which came with the Apple II, but mine had a "horizon." I wrote the start of an Asteroids style game, where I got a swarm of 50 ships to chase each other around in a massive "dogfight." All of the above was done with BASIC, except for the last, which was done in a BASIC derived macro language called Macrosoft.
More recently, a few years back I taught a class for high school aged women, where we used Unity 4 to create a 2D shoot-em-up game in just one afternoon. 18 of the 19 students rated the class highly. Granted, we had to "pre-fab" a lot of things for the students to keep things moving. The actual point of the course was to give the students a taste of what they could do.
Some years back, before Arduino was even a thing, a friend of mine (who looks a bit like a cross between the stereotypical 50's geek and Kyle MacLachlan) and I spent an entire evening bringing an embedded computer to life. We got an LED to blink, then awkwardly high-fived. His wife just shook her head.
The stuff I was able to do in BASIC in 1981 as a ten year old kept me interested. Then, as now, I enjoyed programming, but mostly as a means to an end. If anything, the fact that what I could program wasn't as "interesting" as a commercial game (say) was motivation to keep learning.
That's just not true. For I/O you had keyboard input, screen output, and maybe file I/O: three basic things that pretty much all programs used. Most commercial programs were 80x25 text mode with 16 color attributes. If you could learn to read scancodes from the keyboard, write text to the screen, and save/load files (along with a few fundamental ideas like loops and conditionals), you could do almost anything a commercial program did, with just that basic knowledge of a few simple things.
It was simple because it was direct, there weren't 72 layers of abstraction and UI libraries that you had to know before you could even begin. You could also do pixel graphics, and again it was pretty simple - just set the screen mode and go with DRAW, LINE, CIRCLE, POKE etc. Again, no layers of indirection or libraries or window managers or canvases or DrawableObjectFactoryManagerFactoryInterfaces needed.
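For example, a complete little graphics program in a QBasic-style dialect (from memory, so take the exact statements as approximate) was just:

SCREEN 13                     ' 320x200, 256-color mode
CIRCLE (160, 100), 40, 14     ' yellow circle, center of screen
LINE (10, 190)-(310, 190), 4  ' red line along the bottom
PSET (160, 100), 15           ' one white pixel
DO: LOOP UNTIL INKEY$ <> ""   ' wait for a keypress, then exit

No window handles, no canvas objects, no event loop: set the mode, draw, done.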
One of the problems of modern attempts to recreate that feel is just that there is so much between the programmer and the hardware and it's so much more complex. Back then, we had single-user, single-process systems with direct access to simple hardware. Everything you needed to do was either a built-in part of the language or a BIOS/OS interrupt that you could call just as easily. It's hard to recreate that on a multi-user multitasking system with indirect access to more complicated hardware and countless layers of software between the programmer and the machine and BIOS/OS interrupts. Sure you can do more powerful stuff now, but it's not as interesting.
Just personal observations here. Someone with a computer in 2019 can download Unity, which is free, and watch a tutorial on YouTube, which is also free. With near zero understanding of what is going on, and no prior experience, they can have some kind of rudimentary platformer working within hours. This is then a good starting point to learn programming (you can dive into C#) or you can continue to jumble together copy/pasted bits of code that you see online (kind of like how I remember doing with BASIC and library books).
Sure, there are a bunch of layers of abstraction, and those abstractions will break down all the time. But you don't need to learn those abstraction layers; you can stay at the top layer and still get good work done, maybe working around a few problems you don't understand from time to time by futzing around with stuff until it works. Or you can dive in and try to understand what's underneath an abstraction. That's the whole point of having abstractions in the first place: they exist to hide complexity, to let you get work done without understanding the entire system. 72 layers of abstraction is a bit of an exaggeration. I'd say if BASIC has three layers (interpreter/machine code/hardware), Unity only has six (editor/scripts/engine/OS/machine code/hardware), but who knows how you count them.
In my experience, watching people learn how to create things, it has never been easier to learn programming and start building things. The main difference is that in 1985, programming was considered an essential computer skill, and in 2019, you're expected not to program. BASIC was amazing because it was what you saw on the screen when you turned your computer on, nothing more.
Sure, when I wanted to make games I moved to C++ and Perl for web apps, but without programming experience those languages looked like a mass of symbols while BASIC looked like sentences.
For an entry level language it was perfect for me.
We used to write goofy games, share them, hack on all of it, and had lots of fun. Imagination.
Now, the blue dot is the baddie, and you....
See if you can find the GDC presentation they gave. It was the most entertaining presentation of that year I can remember.
Back then, as a child, my parents purchased books containing BASIC programs that were actually interesting, which one could manually transcribe and save onto tape or disk. IIRC, those books also contained an overview of how each program worked.
He likes electronics because they have games. Usually the older games are better for him anyway because they are simpler. Right now his favorites are Worms Armageddon (iPad/PC/N64) and San Francisco Rush 2049 (N64).
Since he doesn't have free access to any of these devices, he is perfectly happy to use whatever we allow at the time ("Daddy, can we play games on the TV from when you were a little boy", referring to a 12" junky CRT for the SNES). I have no doubt he would be thrilled to have some kind of retro computer to use QBasic on, just like I did when I was 8.
You wildly underestimate the power of a child's imagination. The problem is free access to overstimulating contemporary technology, not some intrinsic boringness of old software.
You and he should make "San Francisco Rush 2019" where you jump around from startups to FAANG all the while trying to make enough money to keep up with housing price increases.
I feel like that's exactly what the parent comment said.
My kids have always loved retro games too, as well as current games, and they are also interested in programming. But for a decade they've been unable to motivate themselves to actually do the programming. My oldest has asked me to teach him multiple times, and when I help him, it doesn't stick. He doesn't have the patience for code. He chooses to spend his free time playing very high quality games and browsing addictive content on the internet, from an insane vast sea of choice that we never had when we were programming in BASIC.
> You wildly underestimate the power of a child's imagination.
In my experience, that imagination tends not to come out until kids are bored, and produced content keeps them from being bored.
Wait until your kid is actually programming to pass judgement. My son is finally starting to find some patience and just turned 15. It was easier for me when I was 12 because there was nothing else to do.
I'm also getting the feeling that the trend is global because I'm involved in hiring, and the younger kids out of school all started programming in college, not in junior high like my cohort did. My group asks specifically at what age candidates started programming. The theory was that early is good because it shows self-motivation, but we're seeing a general trend that kids are starting later.
EDIT: BTW, the main motivation for my son's newfound patience for programming, unix, and shells is to escape parental controls and sandboxing.
If he likes Worms Armageddon, he might like the Gorillas game: https://www.youtube.com/watch?v=UDc3ZEKl-Wc
I had a lot of fun hacking that game back in the day.
For example, my 5 year old, who is just learning how to read, can understand the beginnings of BASIC. She's super excited to see the TI-99/4A print back what she painstakingly types, or to see the computer count from 1 to 5 with a basic for loop. She wouldn't know what the F is going on with Scratch, let alone Python.
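The entire counting program is small enough for her to type in herself; in TI BASIC it's roughly:

10 FOR I=1 TO 5
20 PRINT I
30 NEXT I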
BASIC is exactly what its name implies. I'm going to move her to Scratch and Python eventually, but I have yet to see either meet the needs of the absolute entry level, very young programmer.
This makes an explosion at coordinate x=100, y=200 that is 50 pixels wide and lasts 3 seconds, including sound. Put it in a loop that slowly increases the size or decreases the duration. What boy doesn't want to have control over screen explosions? (I may be gender-stereotyping, but you get the idea.)
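Something like this, say, where EXPLODE is a made-up command standing in for whatever the environment actually provides:

10 REM EXPLODE X, Y, SIZE, SECONDS -- HYPOTHETICAL COMMAND
20 FOR S = 10 TO 100 STEP 10
30 EXPLODE 100, 200, S, 3
40 NEXT S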
Scratch before it made programming like Legos.
The situation isn't nearly as bleak as you indicate. I've also found that I can counter that expectation by showing the labor and code involved in big projects. Then I show them a small project that's still fun, made by one person or a small team, with the associated code. Then I tell them they can start there, make fun/useful things, and work their way up to bigger projects like their apps or AAA games. The only thing I run into at that point is laziness about putting in effort, or just disinterest in favor of other pursuits.
I do need to update my set of exemplar apps at some point, with appeal to a wide audience. Probably several, or a pile of them with summaries, so kids can just skim through to whatever interests them.
While I can agree that there are more "interesting" things to occupy time, I disagree that simple things will always be beaten by a team of engineers. To use an example outside of programming and video games, there is a multitude of tricks you can learn with a deck of cards. You don't need elaborate magic rigs to be entertained.
Secondly, I'd recommend looking at what BASIC was capable of doing. I'm going to be showcasing the game "Missile!" (pg 35 of ) on an actual Apple IIe next month. It's going to take only 43 lines of code to have a GUI driven game. It's obviously limited, but designed to get kids interested in Computer Science. To produce the equivalent in a text-based programming language would require too much overhead - even in Python's case.
Also, before I started studying computer science, I liked the Unity game engine: it was quick to implement something interesting, and it was easy to find lessons and instructions online for developing in C# with the Unity engine.
They are simple to use, connect with the real world so you can actually get feedback, and can be used in a variety of cool projects.
I've gone with Python, but I'm not sure I made the right choice (too rich, too much information to swallow before starting).
Despite that, my son is working on a car race game, all ASCII, using curses. He made his own track with pipes and slashes and is pretty proud of it.
To be fair, the most advanced device they play on is a Nintendo DS (not a 3DS or 2DS, the original one).
But I still think it would have been easier in BASIC.
That was interesting.
Never forget that operational causality over computers is gained one step at a time.
Old, "bad" languages can still be extremely powerful and productive for what used to be called "power users"; don't dismiss them out of hand because the professional software development world has moved on.
That experience taught me skills that I still use today. Mainly, trying to understand an unfamiliar (often poorly commented) codebase, and having to debug it.
These days, I think that the most equivalent experience might be working on something open source. But I'm afraid the motivation just isn't there for most kids. My motivation was "Wow! Free GAMES!", but my kid gets free games on his tablet without having to debug a darned thing.
It was the same feeling for me too, in the early/mid 80s. And it was impressive also because few other people were even doing it. Whatever their capabilities, most people didn't actually have any home computer to practice on. Merely having a home computer in, say, 1983 was a rare thing. Actually being able to make it do things you wanted was even rarer. It probably seemed more impressive to outsiders than it actually was (in the sense that a for-loop is pretty boring/basic), but it's hard to separate that impressiveness between having access to the situation at all and having the skills.
Each line would have symbols that told you which versions of BASIC used that line, so there would be duplicate lines for different versions of BASIC. They didn't have the version I had, so I had to figure out the differences on my own.
I learned so much.
IIRC the guy who wrote the checksum generator also wrote the common assembly language graphics routines we used for all the games.
 This was it: http://www.atarimania.com/game-atari-400-800-xl-xe-biker-dav...
My girlfriend is trying to learn to code, and I realised that I had it easy - because computers were much simpler, learning to code on them was much simpler. I learned Assembly coding trying to speed up my simple homemade games, but that was easy: the machine only had 3 registers, the entire machine memory was addressable, and there was maybe two dozen commands to learn.
I think this is why I like Go so much. It reminds me of old-school BASIC in its simplicity.
I still keep typing `IF <condition> THEN` and `FOR I = 1 TO 10` if I'm tired, even after all these years.
Nowadays the expectations are so much bigger. I've seen people create their first website and be disappointed it looks so crappy, rather than excited it works at all. Not everyone, mind - some are still excited.
My main side project right now is a 3D libretro front end with Lua programming. One of my plans is to create a virtual 80s microcomputer lab with 3D models of a C64, Apple II, Spectrum and maybe a few other things, as well as virtual programming manuals and floppy disks and of course the emulators running on the screens. The Lua code can read and write the emulator memory, so it will be possible to create a C64 demo that manipulates 3D objects, for example.
The problematic thing for the Lab is copyrights related to content. I'm not sure how I'm going to solve that. But if anyone is interested, what I have so far (without the copyrighted content) is totally programmable with Lua and free. https://vintagesimulator.com
Recently I hacked together a simple BASIC interpreter in golang, which was fun. Whenever I had a choice about implementation, I picked the option closest to the Sinclair BASIC I remembered:
Free, open-source, multi-platform.
Same thing - free, open source, multi-platform
Also - a couple for Android:
Mintoris Basic - http://www.mintoris.com/ (not open source)
RFO BASIC! - http://rfo-basic.com/ (open source)
Can you tell I like BASIC?
While I don't code in it for anything but fun and toying, it's what I learned on in the 1980s (Microsoft BASIC for the TRS-80 Color Computer 2). Also played a lot with Applesoft BASIC on the Apple IIe in high school.
Later I moved on to QBasic 1.1, QuickBasic 4.5 and PowerBasic 3.2.
My first software development job, about a year after high school, was using PICK BASIC (in the form of UniVERSE on an IBM RS/6000 running AIX - my first exposure to UNIX).
Then on my Amiga with AmigaBASIC (IIRC?) then later AMOS...
I've toyed around with PBASIC (for the BASIC Stamp microcontroller).
Still holds a special place in my heart...
(It's good that Microsoft still has a BASIC out there that you can point kids at. BASIC was Microsoft's first product and absolutely left a legacy across the decades.)
Also, I feel that as long as schools continue to require Texas Instruments graphing calculators for tests, that TI-BASIC will probably long have a place for teenage experiments when bored in classes.
I think it's a fine language for introducing people to programming because it's conceptually very simple and leads one to realize how a limited language can or cannot express things (try to implement recursion, for instance).
10 Y0=1:GOSUB 1000:PRINT Y0:END
1000 Y0=Y0+1:REM "RECURSE" UNTIL Y0 REACHES 5
1020 IF Y0<5 THEN GOSUB 1000
1030 RETURN
This seems like a fun challenge to try out...
(C having a callstack is actually a pretty big deal, and not everyone adapted to it well at the time.)
ZX81 BASIC has a GOSUB/RETURN stack. MS BASIC has a GOSUB/RETURN stack. Atari BASIC has a GOSUB/RETURN stack. Apple 1 BASIC has a GOSUB/RETURN stack! - sure, it appears to be all of 8 deep, so good luck with using it for anything recursive, but it's a stack nonetheless...
It's a nightmare, but doable.
Blank stares all around, including from the teacher. So it goes.
Python's indentation rules are a source of confusion, but I'd much rather he learned that instead of BASIC.
x = orange();
Otherwise, you had to fake it, which you kinda note: Since all variables were global in nature, this made it challenging - but possible. You could set aside a few named variables for your "functions", then just use GOSUB:
10 X0=1:X1=3:GOSUB 1000:PRINT Y0:END
1000 REM THE "FUNCTION": RETURNS Y0 = X0 + X1
1010 Y0=X0+X1
1020 RETURN
It's only when you start mixing in GOTOs jumping out of routines, all over the place, no structure, etc - that's where you get the infamous spaghetti code...
Um, yes? That’s exactly what I meant by “an even more limited form of Python's style of lambdas”. Which is what they are. I never used them myself; I never, ever, found a use for them.
The rest of your argument seems to be "If you're careful, it's technically possible to write structured code.", but the same can be said of most any language. Even assembly language, for $FOO's sake, usually has labels!
EDIT: Fixing comment to reflect you said 80's-style. My speedreading made me miss it first time.
> No naming, no abstractions, nothing.
This is not entirely true. BBC BASIC (certainly the versions I used on the Electron and Master 128) had both named sub-procedures and named functions, with local variables and allowing recursion. While limited in some ways they did allow programming without sight nor sound of GOTO/GOSUB, which I actually used to do.
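From memory (so the syntax may be slightly off), a recursive named function looked something like this, with the parameter stacked on each call; DEF PROC/ENDPROC with LOCAL worked similarly for procedures:

10 PRINT FNfactorial(5)
20 END
30 DEF FNfactorial(N)
40 IF N < 2 THEN = 1
50 = N * FNfactorial(N - 1)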
You could even do away with line numbers with a little hackery: write your code in a text editor (I used View, built into the Master series; before that we had it as an add-on on the Electron at home). I forget exactly how, due to the mists of time, but I had a method of having the file replayed as if typed from the keyboard (adding in the line numbers via AUTO) and executing it directly. I felt rather clever, particularly as I did it mainly because a CS teacher had said it was not possible!
OK, sounds great. I also remember the versions of Basic on the Amiga and Atari ST being perfectly adequate in this regard.
But this is never the version of Basic which people talk about! What you describe is not the version of Basic used by all those listings in magazines, or in the linked article. This is, for all intents and purposes, not the “80s style Basic” which everybody remembers with such apparent and baffling fondness.
However, later 80's 16-bit Basic (think Amiga, ST, QuickBasic for the PC, etc.) at least had functions, libraries, etc.
Going back to 8-bit is probably too far...
Almost, but not quite. It would ignore anything after the 40th character of a name, and the overall line length restriction (255 characters, less line number) imposed a secondary limit.
Still night-and-day compared to some common contemporary BASIC variants that only allowed two-character variable names, of course, and not much of a limitation in practice, as really long names would soon have you hitting the overall memory restriction of the default address space (between 10 and 29 Kbyte available, depending on screen mode, which had to be enough for the heap (including your code) and the call stack).
I perhaps remember this far too well, given how long it is since I touched one of those machines...
I couldn't find any evidence of a 40 char limit on BASIC II... the full length of the variable name appears to be stored and compared.
Though there was also a limit (also ~256) to the line on entry, which you would normally hit first because keywords are stored tokenised: "PRINT" would be stored as one byte not five.
I agree, but this is exactly what the linked article does. It even promotes the idea that the GOTO is better than functions or OOP, because it’s “simpler”. Needless to say, I disagree.
Naming things is almost a prerequisite to be able to think and reason about things without keeping the entire program in your head at once. And without any means of abstraction with names for code, old-style 80s Basic fails this requirement.
“I liken starting one’s computing career with Unix, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. But, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act.”
— Ken Pier, Xerox PARC
Many shipped with books documenting not only the Basic language, but full hardware documentation: memory map of the machine, special memory addresses (useful for peek/poke), etc. I even remember the Apple II manuals coming with schematics for the entire motherboard!
I’m looking to avoid a general purpose device that can play modern games and connect to the internet.
QBasic is also a sort of weird situation: modern in some ways, but sometimes feature- and resource-limited compared to other BASICs. It was a defeatured QuickBasic, and they sort of expected users to graduate to the full product.
I've found the exact Spanish version! https://sites.google.com/site/santiagoontanonvillar/personal...
The episode does a good job of explaining that the language hit a sweet spot of easy to learn, and easy to implement on the limited hardware of early home computers.
Every computer out there is still capable of delivering value to someone. I really think it's a shame we have allowed the consumer-cult viewpoint to prevail, and that we consider 'old computers' to be "useless" - when this is really not what's happened.
We changed, not the physical hardware.
By way of anecdote: My 11 year old kid is learning as much about programming computers - and most importantly, how computers really work - from hacking code on one of my 8-bit machines, as he would by trying to crank up Xcode, were that at all feasible ..
16-bit home computers already had structured BASIC compilers producing native code; for example, I learned Turbo Basic in 1990.
Here is roughly what it looked like:
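(A sketch from memory, not the original listing, so the dialect details may be off.)

' Structured, compiled BASIC: named subs, real loops, no line numbers
FOR I% = 1 TO 3
  CALL GREET("WORLD")
NEXT I%
END

SUB GREET(NAME$)
  PRINT "HELLO, "; NAME$
END SUB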
On the Amiga, the demoscene had AMOS.
Just to cite those which I had more experience with.
- BASIC was a very interactive environment. This was new and awesome in the 80s. You could enter programs, run them immediately, edit, run, repeat, without having to go through a compilation step.
- BASIC worked without any planning whatsoever. You didn't need to declare variables, or even arrays if you only used subscripts 1-10. Strings were handled automatically.
So it was really fun to play with. You could make your new computer do something rather quickly, and refine it over time.
Line numbers were needed on the old 8-bit systems that didn't have full screen text editors with copy/paste built in. It's how you told BASIC where to put new lines in the program.
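For example, given

10 PRINT "FIRST"
30 PRINT "THIRD"

you could later type

20 PRINT "SECOND"

and LIST (or RUN) would slot it in numeric order between the other two - which is also why everyone numbered by tens: it left room for insertions.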
Admittedly it was certainly getting long in the tooth in the 90's though.
What I know from teaching kids is that LINE NUMBERS make more sense to them. Personally I HATE line numbers and think they are the devil.
I was showing kids my old 1970s code books that I used to copy into BASIC to play games. I told them how I was in 1st grade and my dad would help me debug them. They all wanted to try, and one kid copied a good 300 lines for a battleship game. For some reason the logical numbers made sense to them. That kid must have played that game dozens of times, because he felt like he made it.
My own opinion is that BASIC shouldn't be used to teach programming. I opt for Racket.
Modern languages pack in tons of indirection because that's where the power tools are - but an introductory environment might benefit from cutting down on that.
Lol, found out later why that was bad, but as a kid, why not?
What else are the numbers for?
Oh, the joy when I found Beagle Bros software - https://en.wikipedia.org/wiki/Beagle_Bros
Also this: https://stevenf.com/beagle/where.html
I assigned my son exercises for it over one summer. I had him save his results into local Notepad for later inspection by me. It also comes with drop-down samples.
The kids I see using Arduinos seem roughly like the same set of kids that would have had an 8-bit home computer in the 80's, and actually spent any time programming on it.
Bicycles are already pretty good just the way they are.
However, my daughters are learning VB.NET at school for their GCSE Computer Science (UK). It's a very capable language, if a little bit verbose and clunky, on a modern runtime and with excellent tooling. Until recently they were using Visual Studio for Mac, but we got my eldest a Windows 10 laptop for Christmas, so now she's on 'real' Visual Studio. They're both very good though. It's quite something to watch her debugging code by setting breakpoints, inspecting variables and such.
In all cases they were people who could already do some stuff in Office VBA, got IT to make a VS install available to them, and just coded away in VB.NET when needed.
It helped that many devices in life sciences tend to have Windows-only drivers, sometimes COM-based, so in their specific domain, having .NET-based applications around was already pretty common anyway.
In about five to ten years we'll have a spate of articles from nostalgic early Millennials about Turbo Pascal, Turbo C, Watcom C and such, then a rash of remembrances about CGI-bin, then early Java....
Personally, I miss mode 13h at least twice a month.
Does anyone know of a portable BASIC interpreter that supports double precision all through?
The kind of stuff I (used to) do requires more than 6 figures.
It handles 64-bit doubles: https://www.freebasic.net/wiki/wikka.php?wakka=KeyPgDouble
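For example, a trivial sketch:

DIM x AS DOUBLE
x = 1.0 / 3.0
PRINT x  ' shows roughly 15-16 significant digits, well past 6 figures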