> The ZX Spectrum has a 3.5 MHz Z80 processor (1,000 times slower than current computers)
Actually, it's much, much slower than that.
The Z80 takes at least four clock cycles to add or subtract an 8-bit number (often several more, depending on addressing mode). For a 32-bit number those clock cycles add up quickly.
In contrast, a modern 3.5 GHz CPU can add or subtract several 64-bit numbers every clock cycle.
And don't get me started on multiplication or floating-point arithmetic. A modern CPU would be thousands of times faster, even if it were running at the same clock speed.
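A back-of-the-envelope sketch makes the gap concrete. The T-state counts below are the documented Z80 timings for a memory-to-memory byte add; the "4 scalar adds per cycle" figure for the modern core is an assumption, not a measurement:

```python
# Illustrative cycle estimate, not a benchmark.
# Documented Z80 T-state counts for one byte of a multi-byte add:
LD_A_DE = 7   # LD A,(DE)  - fetch one byte of operand A
ADC_A_HL = 7  # ADC A,(HL) - add one byte of operand B with carry
LD_DE_A = 7   # LD (DE),A  - store the result byte
INC_HL = 6    # advance one operand pointer
INC_DE = 6    # advance the other operand/result pointer

t_states_per_byte = LD_A_DE + ADC_A_HL + LD_DE_A + INC_HL + INC_DE
t_states_32bit = 4 * t_states_per_byte  # four bytes, ignoring loop overhead

z80_hz = 3_500_000
adds_per_second_z80 = z80_hz / t_states_32bit

# Assume a modern 3.5 GHz core retiring ~4 scalar 64-bit adds per cycle:
modern_hz = 3_500_000_000
adds_per_second_modern = modern_hz * 4

print(t_states_32bit)  # 132 T-states for one 32-bit add
print(round(adds_per_second_modern / adds_per_second_z80))  # 528000
```

So even before multiplication or floating point enters the picture, the per-add gap is on the order of half a million to one.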
Thus, I think that the feat is even more impressive than presented!
The fastest Z80 instructions are 4 clock cycles, at least 7 clock cycles if they also need to load or store an 8-bit value from/to memory, and at least 10 clock cycles for instructions that load or store a 16-bit value.
I did a deep dive into Z80 timings a while ago using the Z80 netlist simulation from visual6502.org.
But for BASIC programs those timings don't matter much, since BASIC on 8-bit home computers was easily 100x slower than assembly code doing the same thing (the difference is probably smaller when floating-point math is involved, since that also has to be 'software-emulated' in assembly).
To grasp the interpreter speed you need to remember how memory was even more of a constraint than the CPU. BASIC had to hold your code in RAM in a form that was source-editable, runnable, and compact, and process it using interpreter code that was itself squeezed for space. Any obvious optimization you'd take for granted in a speed demon like Python nowadays might well be missing from those 8-bit BASICs.
I've been working with the eZ80 on the Agon Light and the performance ceiling is considerably different. Both the clock and the instruction timings are more generous thanks to a three-stage pipeline. Interpretation is still slow, and the chip has a clear preference for 8- or 24-bit operands (in ADL mode), but it's now a Z80 that can be applied productively to high-level code operating on 32-bit values...
...except that BBC BASIC was giving me Heisenbugs, so I had to give up on it. So I'm on Forth now, and immediately found that Forth is a more productive environment than an early BASIC anyway. Line numbers add a killer amount of bookkeeping.
Well, yes and no. You're correct that the ZX Spectrum is a lot slower than the author estimates, but there's also a software-side performance bottleneck: the code is written in the built-in BASIC language, which on the ZX Spectrum was not only interpreted at runtime, but also almost comically unoptimized. For example, people would use "aftermarket" circle-drawing routines that performed 10x better than the stock ROM implementation.
BASIC on ZX Spectrum was meant for learning, but no serious software of the era relied on it for anything that needed to run fast.
I suspect this code could be 10x faster if written in assembly or a compiled language (a handful existed for the platform).
After learning to code on it, we used BASIC as a means to write utilities that loaded DATA statements representing assembly code, hexdump monitors, or compiled languages, although the latter were relatively hard to take advantage of, given the space requirements.
Yeah, there was a C compiler, a Pascal compiler... I'm pretty sure there were others.
I would guess at least 100x for an optimized assembly version. Even 500x between this sort of interpreter and assembly is not unheard of.
The thing I loved about Spectrum was game loading graphics from tape. Spectrum had a very strange video memory layout, so the bitmap would load first into different thirds of the screen (but still all monochrome). And then at the end the attributes (which were much smaller due to the 8x8 size) would load almost instantly, "painting" the monochrome image in a final splash! So this is very nostalgic for sure.
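That loading-in-thirds behaviour falls straight out of the Spectrum's interleaved bitmap layout: the scanline bits and character-row bits of the y coordinate are swapped in the address. A small Python sketch of the standard screen-address formula shows it:

```python
# The ZX Spectrum bitmap starts at 0x4000. For the byte containing
# pixel (x, y), the y bits are shuffled so that consecutive scanlines
# are 256 bytes apart and the screen is split into three 64-line thirds.
def screen_address(x, y):
    return (0x4000
            | ((y & 0xC0) << 5)   # which third of the screen (y bits 6-7)
            | ((y & 0x07) << 8)   # scanline within the character cell
            | ((y & 0x38) << 2)   # character row within the third
            | (x >> 3))           # byte column (8 pixels per byte)

print(hex(screen_address(0, 0)))    # 0x4000 - top of the first third
print(hex(screen_address(0, 1)))    # 0x4100 - next scanline is +256 bytes
print(hex(screen_address(0, 64)))   # 0x4800 - top of the second third
```

A tape load writes bytes in address order, so you see each third fill in stripes of character rows rather than top-to-bottom, and only then do the attributes arrive.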
And the funny thing is, I like the Spectrum renders better than the "perfect" ones from a modern computer :-)
I also remember that there were cool loaders that could paint the image in arbitrary order. It was pretty impressive, because it required doing extra work inside the tight loop that measured the delay between slopes of the signal coming from the tape.
Below is a link to one I did. It loads in various patterns, including writing, and from what I remember it was spread out over the time it took to load the game, so there was always something going on.
It wasn't the best game around, hopefully the loader made up for that!!
I remember a loader where you could play a game of Mastermind while the game was loading. Magic. Blew my mind, it's the kind of thing that wasn't supposed to be possible, yet it was...
There is something like a 171 clock cycle delay in the regular tape loader routine between checks for edge transitions in the audio signal. You just break your program up into pieces that fit into that. I did one that ran a countdown timer and broke it into exactly the loop delay, but I suspect the tape loading would be tolerant of a little more inexactness.
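The "break your program up into pieces" idea can be sketched in Python. The 171-cycle budget is the figure quoted above; the per-task cycle costs are hypothetical placeholders, not real Z80 timings:

```python
# Sketch: pack small work items into the idle window between tape-edge
# polls. BUDGET is the ~171 T-state gap mentioned above; the task costs
# below are made-up illustrations.
BUDGET = 171

def pack_into_slots(costs, budget=BUDGET):
    """Greedily pack per-task cycle costs into budget-sized slots."""
    slots, current, used = [], [], 0
    for cost in costs:
        if cost > budget:
            raise ValueError("task cannot fit in one idle window")
        if used + cost > budget:
            slots.append(current)
            current, used = [], 0
        current.append(cost)
        used += cost
    if current:
        slots.append(current)
    return slots

# e.g. a countdown-timer update broken into small steps:
slots = pack_into_slots([60, 50, 40, 90, 30, 100])
print(slots)  # [[60, 50, 40], [90, 30], [100]]
```

Each slot then runs between two edge checks; in real code you'd also pad each slot to a constant cycle count so the sampling rate stays steady.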
Excellent! A similar program, apparently from 3 years ago, did similar things, also in BASIC, on the Amstrad CPC: a machine with the same processor, a little more RAM, and more colors (27) or shades of green, depending on the screen it was hooked to.
The webpage displays the color (left) and green (right) versions on hovering your pointer over the ribbon (above).
Image 1 and 2 are okay in both cases, image 3 is clearly tuned for color screen, image 4 for green screen.
That's pretty impressive. The main difficulty with reflections in the ZX Spectrum is not so much the limited color palette, which can be dealt with, but the 2-color limit per 8x8 block. It's already a big issue with just lights and shadows as it is :-/
> the 2-color limit per 8x8 block. It's already a big issue with just lights and shadows as it is :-/
Well, you impressed me (and others) with your simple bet that "hey, what if I simply choose the two most popular colors" and just code that in simple BASIC. The same could be done with the shading. It would probably pick a good compromise when bright yellow meets dark red: by choosing to have the few should-be-red pixels become yellow, you avoid a bigger obviously-square-grid-based clash of colors.
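That "two most popular colours" bet fits in a few lines of Python (a sketch, not the author's BASIC; snapping stray pixels to the block's most common colour is one simplistic choice for the fallback):

```python
from collections import Counter

def quantize_block(block):
    """Keep only the two most common colours in a pixel block,
    snapping any other pixel to the single most common colour."""
    counts = Counter(pixel for row in block for pixel in row)
    keep = [colour for colour, _ in counts.most_common(2)]
    fallback = keep[0]
    return [[p if p in keep else fallback for p in row] for row in block]

# Toy 4x4 "block" with three colours; colour 3 is the odd one out.
block = [
    [1, 1, 2, 2],
    [1, 3, 2, 2],
    [1, 1, 2, 2],
    [1, 1, 2, 2],
]
print(quantize_block(block))  # the lone 3 becomes a 2
```

The result always respects the Spectrum's two-colours-per-attribute-cell limit, whatever the input looked like.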
Next step: error diffusion dithering would IMHO be a good bet. But maybe that becomes really too much for BASIC.
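For reference, a minimal Floyd-Steinberg error-diffusion pass looks like this (a Python sketch on a greyscale image; a real Spectrum version would also have to fold in the attribute constraints):

```python
# Floyd-Steinberg dithering: threshold each pixel to 0/255 and push the
# quantization error onto its right and lower neighbours with the
# classic 7/16, 3/16, 5/16, 1/16 weights.
def dither(img):
    h, w = len(img), len(img[0])
    px = [list(row) for row in img]  # working copy
    for y in range(h):
        for x in range(w):
            old = px[y][x]
            new = 255 if old >= 128 else 0
            px[y][x] = new
            err = old - new
            if x + 1 < w:
                px[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    px[y + 1][x - 1] += err * 3 / 16
                px[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    px[y + 1][x + 1] += err * 1 / 16
    return px

flat = [[128] * 8 for _ in range(8)]  # mid-grey test card
out = dither(flat)
white = sum(p == 255 for row in out for p in row)
print(white)  # roughly half of the 64 pixels come out white
```

It's only two nested loops and four additions per pixel, so the cost is modest by modern standards; on an interpreted 8-bit BASIC, though, that inner loop over every pixel is exactly where the hours go.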
I've been considering doing Raytracing on the Amstrad CPC, but in C.
Perhaps using integers only, for speed.
> Line 3000 starts with REM, short for “remark”. We call them “comments” these days, but the ZX Spectrum is British, the brainchild of mad genius Sir Clive Sinclair.
Exactly this. This is only one of several mistakes in the piece about Sinclair BASIC. It was the language of the first computer I owned, so I'm still a bit defensive about it.
* REM for Remark is 100% standard usage.
This is true in all BASIC dialects I know of and in DOS batch files and so on.
* Line numbers are not for GOTO and GOSUB.
Line numbers are an elegant abstraction for a teaching language, and for machines with no directly addressable non-volatile storage.
Type in some valid BASIC, and the computer does it now. Prefix it with a line number, and it _doesn't_ do it now, it remembers it for later. Sequence is controlled by the numbers, so flow of control is explicit, not implicit.
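That store-or-execute rule is tiny to model. Here's a hypothetical mini-interpreter in Python (only PRINT is implemented, and expressions are cheekily delegated to `eval`; none of this is Sinclair's actual code):

```python
# Toy model of the BASIC line-number convention: input with no line
# number runs immediately; input with a leading number is stored, and
# RUN executes the stored lines in numeric order.
program = {}

def execute(stmt):
    # Toy: only PRINT, with the expression evaluated by Python itself.
    if stmt.upper().startswith("PRINT "):
        print(eval(stmt[6:]))

def enter(line):
    head, _, rest = line.partition(" ")
    if head.isdigit():
        program[int(head)] = rest   # remember it for later
    else:
        execute(line)               # do it now

def run():
    for number in sorted(program):  # sequence IS the line numbers
        execute(program[number])

enter("PRINT 2 + 2")      # immediate mode: prints 4 right away
enter("20 PRINT 'world'")
enter("10 PRINT 'hello'")
run()                     # prints hello, then world
```

Note how "the editor" falls out for free: retyping a line number replaces that line, and the sort in `run` handles insertion between existing lines.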
This is much simpler than learning multiple Unix-like abstractions such as "this is a 'file', files have 'file names', to edit the code you must 'load' a 'file' into another separate program called an 'editor', then you 'save' the file to 'disk' and pass its 'file name' to the 'interpreter'."
All that stuff is 1970s minicomputer legacy nonsense, that we have fossilised into our computing culture. Smalltalk banished all this legacy baggage in 1980 but Unix was so dominant that we learned nothing from it.
Kids should not have to know about ludicrous 1970s abstractions such as "files", "programs", "editors", "interpreters", "compilers" and all that DEC PDP-7 nonsense. The computer should come to us, not us to its concepts.
* No ENDIF? No, almost no early-1980s BASIC had ENDIF. BBC BASIC on the BBC Micro is the _only_ one I can think of, and it took 2x the ROM space of Sinclair BASIC -- and that was on top of a separate 16kB ROM chip for the BBC Micro OS, or MOS. A full 50% of the BBC Micro's memory space was consumed by 32kB of ROM. The Spectrum had half that ROM, and so you got 50% more usable memory.
* The increment of 10 is nothing to do with the Sinclair editor. All BASICs did that.
* "there are no function calls" -- yes there are. DEF FN() worked, as in more or less all 8-bit BASICs.
* "It’s also interpreted, so super slow." -- again, they all were.
These are cavils. It's a great tech demo and I enjoyed reading it!
I guess you were inspired by the recent Acorn Electron demo, as used by the BBC Micro Bot on Mastodon?
All of this is accurate, and I just want to add how hard it is to appreciate how revolutionary BASIC was at the time of its creation (at Dartmouth, when everything was batch-oriented FORTRAN). It was probably the first _widely_ available example of interactive computing, and as we have come to understand, instant feedback is critical for learning and a boon for productivity.
However, using line numbers for editing wasn't, AFAIK, unusual at the time, though I think making the semantics depend on them was unfortunate. But 20/20 hindsight, etc.
Beautiful, I love it, congratulations! I started my programming journey with Basic on a ZX Spectrum +, then Z80 assembly language, then 8086 and the rest from there... 40 years of programming and my appreciation for my original platform is still there. Thank you for the tribute.
This was almost my first computer (almost because it was actually a TK-90) so it's nice to see it here, I guess it was a common starting point if you were in a family with a computer in the 80s in UY :)
I know it's not the same (plus I'm a backend guy, so I know nothing about graphics, a fact that may change in the near future as my daughter is set to study animation & videogames after she's done with high school next year), but your last version made me think of Batman for the Spectrum. I remember thinking that game's graphics were magical compared with most of the other games for that computer at the time.
Batman is a good example of avoiding attribute clashing by going monochrome :) But I have the same memories, huge sprites and smooth animation, it was pretty great!
> each 8x8-pixel block can show one or two different colors, but never three or more
That's not entirely true. If you modify the colour attributes between the drawing of each row, you can get eight separate 2-colour palettes in an 8x8 block. I believe I saw somewhere that people had managed this horizontally as well somehow.
From what I remember there were two major challenges with tricks like that.
The frame timing was based on one single interrupt at the top of each frame and you had to count down using CPU cycles to trigger the change. This meant no variable path code.
Secondly it needed doing every frame so a lot of those precious CPU cycles were taken up with the fancy display so less available for the game logic.
(did a 2d morphing animator and a simple cad tool back then. also straight forward base algs that you can keep adding to, adding some more, and some more...)
Great write-up. Thanks. Love the Spectrum and its direct token input despite never owning one.
It would be fun to see a set of frames rendered out and stitched together to make an animated sequence. 17 hours per frame. It'd be like Pixar, where you'd need a render farm of Speccys to knock it out.
One could "cheat" by running this in an emulator of a massively overclocked ZX Spectrum to cut down on the render time and still get an authentic end result.
Also, not exactly the same type of rendering, but basically word for word what you wrote: the full-screen fully animated parts appear to be "a set of frames rendered out and stitched together to make an animated sequence".
Amiga is famous for The Juggler, an animation by Eric Graham which did exactly that in January 1986. It became an Amiga "mascot" of sorts and the animation was used to advertise the machine in computer shops, magazines, etc.
I did some extremely rudimentary animations on my Spectrum 128 in Beta BASIC this way.
I generated a couple of dozen frames of a Julia set distorting, in monochrome, saved them to the RAMdisk... which took a couple of hours or something... and then loaded them back in sequence from the RAMdisk, which gave me a few frames-per-second "video".
It worked and it fit into 20kB or less of code, and about 90kB of RAMdisk.
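The frame-generation side of that is easy to sketch in modern Python (the constant, drift rate, and frame size here are arbitrary choices, not the original Beta BASIC program's):

```python
# Generate monochrome Julia-set frames as ASCII art, with the constant
# c drifting slightly between frames so the set "distorts" over time.
def julia_frame(c, size=32, limit=16):
    frame = []
    for y in range(size):
        row = ""
        for x in range(size):
            z = complex(3.0 * x / size - 1.5, 3.0 * y / size - 1.5)
            n = 0
            while abs(z) <= 2 and n < limit:
                z = z * z + c
                n += 1
            row += "#" if n == limit else " "  # inside vs escaped
        frame.append(row)
    return frame

# A couple of dozen frames of the set distorting as c drifts:
frames = [julia_frame(complex(-0.8, 0.156 + 0.01 * i)) for i in range(24)]
print(len(frames), len(frames[0]))  # 24 frames of 32 rows each
```

Playing the frames back in sequence gives exactly the few-FPS "video" effect described above; the expensive part, then as now, is the per-pixel iteration loop.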
I have absolutely no recollection of being able to put multiple statements on a line, separated by : - is that just my old brain forgetting stuff, or is there something else going on here?
Most (all?) BASICs could do that at the time: I used it a lot to get more on one screen (moving through numbered BASIC code was even more annoying than moving through code now). Also it saved a few bytes here and there, which was quite crucial as there was not a lot of memory.
Took me a while to remember this too (it had been 30 years)! I started using IF just GO TOing over a couple of lines whenever I had a more complex IF "block", but I had this nagging feeling that there was a better way :)
Cool project and great write up!
I’ve literally gone through a very similar process just a couple of days ago!
I used a threshold screen with random element added, which worked really well (and isn’t toooo slow).
The Softek FP Compiler might help with performance, although it failed to compile mine, as it doesn't support DEF FN.
Mine: https://github.com/deanthecoder/ZXSpeculator/tree/main#exper...
If you're talking about the first iteration, you're right (ROX, ROY, ROZ) is hardcoded to (0, 0, 0), but I decided to implement a general algorithm anyway, because performance wasn't a problem at that point.
No, but I'm likely to rewrite it in assembler at some point, once I have a stable feature set -- still thinking I can improve the attribute clash handling in some way.
My favorite class in college was building a raytracer from scratch and slowly adding features and optimizing. I always wanted to keep adding on to the raytracer I made but never found the time.