I distinctly remember as a kid it was very slow indeed, but interesting.
I got lost reading the ads. In retrospect, computing used to be a much more expensive hobby than it is today - not just in relative terms, but in absolute terms. Then again, people are much poorer now, so it's required.
Anyway, in the years since 1983 he became a neuroscience prof, and he still mentions his BASIC LISP on his homepage.
The line numbers are not consecutive, but I think he's well under a thousand lines of BASIC; there just aren't enough pages of code in the listing to exceed that.
And yes, this was considered reasonable coding style back then. That is why this generation never shrank in terror at the sight of bad Perl code. Why yes, this is a bit hard to read, but I've certainly seen worse...
- Line numbers, and control flow exclusively by IF/THEN and FOR/NEXT. We Love GOTO (noobs probably don't know what a GOTO is or why some considered them harmful).
- No namespaces: everything lived in one giant global shared namespace where $T meant the same thing everywhere.
- No naming conventions for variables (like Hungarian or CamelCase).
- No revision control.
- No unit tests.
- No symbolic debuggers (they were coming soon, and something like them existed for assembly, but this was 1983 and not-assembly).
- For better or worse, no regexes.
- No object orientation; pure procedural.
- Everything in one file, because quite a few people had no disk drive and relied on cassette tape.
- Line editing that was a little crude compared to modern vim/emacs and the pretenders to the throne.
- No IDEs until Turbo Pascal and the like quite a few years later (or was QuickBASIC first? Either way it would be a long time...).
By modern standards MSBASIC is pretty weird.
But more to your point, by the 1990s the idea that computer languages should be understandable by non-programmers with a reasonable education and a general familiarity with the principles of computing was dead.
Think of COBOL. If its approach hadn't been abandoned for the obfuscations of C++ and Perl, Eric Evans wouldn't have needed to write a book and go on the lecture circuit to spread the gospel of ubiquitous language. There's a reason the unschooled learned HTML and PHP in the 1990s - they were accessible to moderately educated people and could do useful work, in the same way that COBOL and BASIC were by design.
The TRS-80 ran a version of Tiny BASIC. The use of line numbers and GOTO allowed BASIC to run closer to the metal: GOTO 15 is, in effect, an assembly-language JUMP to the address where the instruction on line 15 lives. Without it, BASIC would not have been such a successful bridge to higher-level programming for serious programmers steeped in assembly language. GOTO is also handy if you want to translate from Knuth's MIX without a lot of fuss.
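To make the GOTO-as-JUMP analogy concrete, here is a minimal Python sketch of a line-numbered program as a jump table. This is not actual interpreter code - the `run` helper and the toy program are made up for illustration - but it shows how GOTO resolves a line number straight to a target, while ordinary statements fall through to the next line.

```python
# Hypothetical sketch: a line-numbered program as a jump table.
# Each statement either returns a line number (a GOTO-style jump)
# or returns None (fall through to the next line in order).

def run(program, start):
    """program: {line_number: statement}. Executes until we run off the end."""
    lines = sorted(program)
    pc = start
    out = []
    while pc is not None:
        target = program[pc](out)          # execute the statement on line pc
        if target is not None:
            pc = target                    # GOTO: jump directly to that line
        else:
            idx = lines.index(pc) + 1      # fall through to the next line
            pc = lines[idx] if idx < len(lines) else None
    return out

# 10 PRINT "HELLO" / 20 GOTO 40 / 30 PRINT "SKIPPED" / 40 PRINT "DONE"
prog = {
    10: lambda out: out.append("HELLO"),
    20: lambda out: 40,                    # GOTO 40 skips line 30
    30: lambda out: out.append("SKIPPED"),
    40: lambda out: out.append("DONE"),
}
print(run(prog, 10))  # ['HELLO', 'DONE']
```

A real interpreter wouldn't do a linear `lines.index` lookup on every fall-through, of course; the point is just that the line number acts like an assembler label that resolves to a jump target.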
It's strange to look back and see just how popular virtual machines were at the time. BASIC typically used one, as did many other languages. Smalltalk is famous for using a virtual machine, for example. Microsoft's original Mac apps all ran bytecode in a virtual machine.
It seems crazy, because these computers were already tremendously slow, relatively speaking, and adding a virtual machine makes it much worse. However, it was ultimately a useful tradeoff because these machines were even more limited on RAM than they were on CPU power, and using a virtual machine with bytecode that allowed for an efficient instruction encoding could save a lot of space. It doesn't matter how fast your code runs if it doesn't fit in RAM, after all.
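The space savings from tokenization can be sketched in a few lines of Python. This is a made-up miniature, not the real Level II tokenizer - the keyword byte values here are invented - but it shows the core trade: each multi-character keyword collapses to a single byte, so the stored program is smaller than the ASCII you typed.

```python
# Hypothetical sketch: tokenizing a BASIC line so keywords become
# one-byte tokens, the way Level II-style interpreters stored programs.
# The byte values below are made up for illustration.

KEYWORDS = {"PRINT": 0x9E, "FOR": 0x81, "TO": 0xBD, "NEXT": 0x87,
            "GOTO": 0x89, "IF": 0x8F, "THEN": 0xCA}

def tokenize(line: str) -> bytes:
    out = bytearray()
    for word in line.split(" "):
        if word in KEYWORDS:
            out.append(KEYWORDS[word])     # keyword -> single token byte
        else:
            out.extend(word.encode())      # everything else stays ASCII
            out.append(ord(" "))
    return bytes(out)

src = "FOR I = 1 TO 100"
tok = tokenize(src)
print(len(src), len(tok))  # the tokenized form is shorter than the source
```

Multiply that saving across every keyword in a few hundred lines of BASIC and it's the difference between a program fitting in 4K or not.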
On a TRS-80 Model I, compiling means the compiler, the input, and the output all have to live in 4K of RAM (or 16K on the later versions).
Considering that the Level I Tiny BASIC interpreter lived in a 4kB ROM, Level II lived in a 12kB ROM, and mass storage for most early machines was audio tape - not only slow but also notoriously prone to not loading files correctly - compiling code would have been great for masochists, not so good for people who were just trying to get something done.
And that's before considering the complexities of tuning a compiler to optimize code.
Tokenization also allowed some syntax error detection to occur as you typed code in, which was interesting. I don't remember the details. Obviously some mistakes won't tokenize at all, or will tokenize into gibberish.
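A toy version of that entry-time check, as a hedged Python sketch (the `check_line` helper and its rules are invented, and far cruder than anything a real tokenizer did): if the first word after the line number looks like a keyword but isn't one, the line can be rejected before you ever type RUN.

```python
# Hypothetical sketch: flagging an obvious mistake while "tokenizing"
# a line at entry time, before the program is ever run.

KEYWORDS = {"PRINT", "GOTO", "IF", "THEN", "FOR", "TO", "NEXT", "LET"}

def check_line(line: str):
    """Return an error message, or None if the line looks tokenizable."""
    parts = line.split()
    if not parts or not parts[0].isdigit():
        return "line number expected"
    if len(parts) < 2:
        return "statement expected"
    if parts[1].isalpha() and parts[1].upper() not in KEYWORDS:
        return f"unknown keyword: {parts[1]}"   # e.g. PRNT instead of PRINT
    return None

print(check_line("10 PRNT X"))   # unknown keyword: PRNT
print(check_line("20 PRINT X"))  # None (accepted)
```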
So Level I Tiny BASIC stored plain old ASCII in memory and saved plain old ASCII to cassette tape. Level II MSBASIC stored tokens in memory, although it could optionally save pure ASCII to cassette tape. This had some interesting software-distribution and compatibility implications, as it was sort of halfway possible to save something on Level I and load it into Level II if you were careful, and vice versa.
Level I BASIC was pretty much a 1979 Model I thing only. I believe Level I was technically available for the Model III, but...
The article was more or less contemporary with the Model 4, which had an 80-column display, used a licensed LDOS instead of TRSDOS, and I'm pretty sure was Level II BASIC only. So by the time of the article, Level I BASIC was about two generations and four years out of date.
Also, I recall Radio Shack sold the Level II upgrade EPROM for something ridiculous like $19, so a Level-I-only machine was probably a 1979 experience (before the release of Level II) or somewhat unusual in not having been upgraded.
Just wanted to say thanks for linking to this fine chap. He seems to have a cross-section of interests that I can really relate to and draw from. In particular, some people here might be interested in the intersection of cellular automata theory and (what it has to say about) cognition, autonomous systems, etc. (see e.g. [1, 2]). This is one of his approaches to understanding dynamic systems and how coordinated behaviour can arise in them. See his publication list, too.
Apparently formal treatments of / approaches to autopoiesis have been developed for quite some time (the currently recommended books seem to date from e.g. 1980 onward). Interesting indeed! :)
The short version is that bash is the closest thing to being universally available on every UNIXoid system no matter what, and so by writing stuff in bash, you make it so that it can run everywhere. But because bash sucks to program in, this is a minimalist interpreter for a sane language. You can then write programs in that language, and they will only depend on bash and on this interpreter, and the interpreter is simple enough not to need any sort of complex installation.
I can't quite think of a use case for this where it's not worth e.g. installing Python first, but it's an interesting project all the same.
Wherever you get Bash, you can reasonably assume Perl 5 (unless you're in an initrd or something). Even on some old AIX 4 I had a readily available Perl 5.005.
Nonetheless I wish there were more actual shells that were not sh descendants.
Also I wonder if awk would have been a better language than bash. I think awk is more available across various Unixes.
If anything, the lengths they are willing to go to here point to the brokenness of package managers. There are still a lot of disadvantages to a package manager versus just cp-ing some code.
Here you go!
In the meantime, for variadic addition, one can do:
(reduce + 0 '(1 2 3)) ;=> 6