This is cool, but there's an argument to be made that the baggage is the whole point of doing retro computing. You can emulate a Z80 on an STM32 and it will run faster than the real thing. Much faster. So why bother with a real Z80? Because there is something ineffably cool about the "real thing". It's kind of like vinyl records. Digital audio is vastly superior by any objective measure, and yet people still want vinyl. They don't want it despite the clicks and pops and the annoyance of having to clean it and futz with the needle and the turntable speed; they want it because of all of these things.
Right, building projects like any of these isn't for "most people". Kind of like how painting a picture is "too much futzing for most people". Thankfully, on something of this scale, it only has to be interesting to one person for the project to get built (and for us to have a fine piece of engineering to admire, even if we'd never want to build it ourselves).
I remember when the bootstrap loader failed on our PDPs, having to manually load the bootstrap program from the front panel. Glad I only had to do that twice :-)
Switches/LEDs or keypads are the minimal I/O a computer must have. Not the most convenient input method, but it gives you full control of the whole computer, no questions asked.
Even people back in the day used punch cards and paper tape for day-to-day I/O. They used keypads and LEDs only for bootstrapping the bootloader/OS/whatever and for debugging.
Do you want to make games? Using something like the PICO-8 will still mean you're gonna futz around. But you'll be futzing around on game design-y stuff, rather than on "how do I get pixels to the screen" futzing.
If you want to make game engines, then the pixels to the screen is the fun part.
I futzed around so much as a kid, but not because I wanted to. It was more because I didn't know any better, and was just trying to get to what I "really" wanted to futz with.
Yes. I don't think I was clear enough with my first comment.
Futzing is great, and I celebrate people who futz on this kind of level. The people who code golf a bubble sort to be one less instruction; the people who can cram a plasma and text scroller demo into 89 bytes; the people who really understand everything about a system from top to bottom, inside out and upside down.
As a society we're not very good at protecting those people from harm (in the form of bullying from other students) or providing them with work that makes the most of their rare talent.
Some of the most interesting people, I recall, have been Futzers. So many of them really clever people. I don't even know if Futzer is a word, but I hope the world produces more Futzers.
When I was a kid--it was kinda frowned upon. "Don't waste your time fooling around." I think times have changed though? I hope?
The other day I made a watch band using the leather from my old boots, and dental floss. I didn't tell anyone because I knew it would be considered eccentric.
There's still a surprising amount of hostility toward tinkerers to this day. When one of my hobby hardware projects got picked up by Hackaday and PetaPixel, the comments were mostly negative to downright rude.
None of them even questioned the execution, only the point of doing it at all. It ranged from calling me a stupid hipster to insinuating I'm a con artist who will use it to rip people off on Kickstarter (?!?).
Hackaday comments are terrible no matter the subject. I don't know why, but they always have been; useful comments are rare and the tone is that of a struggle session. When you're on Hackaday, nothing is about fun. You are hobbying wrong and the world has to know about it RIGHT NOW. Granted, there's the occasional case where flagrant disregard for safety exists and wants addressing, but even that, I think, could be handled more usefully with less of a "you're worthless and you should feel terrible about yourself" vibe, not least because if you drive decent people away with your godawful attitude, no one's going to hear your warning.
I don't know from Petapixel, but if it's the hobby photographer equivalent of Hackaday, that's terrifying. I'm sure there are serious amateur photographers who are also perfectly nice people, but for some reason the ones I find myself dealing with are more on the "why bother trying to frame anything, it's a phone so it'll come out bad anyway" level. I'm sure that guy would be perfectly at home on Petapixel.
Using an FPGA makes it especially weird because a Z80 only has a few thousand transistors and would be pretty easy to just implement on the FPGA!
That said, I've used a microcontroller as a debug device for a retrocomputer and it made things a lot easier. But that's the way it was originally done -- using a bigger computer to build a smaller computer is "fair game".
This is not an 8-bit computer built for the 8-bit retro experience. It's a workstation to develop, trace, and debug Z80 programs at the hardware level, down to step-by-step execution.
Any modern computer is a workstation to do all that, because accurate Z80 emulation exists. What difference does using a real hardware Z80 make if you're avoiding the "retro experience"?
Z80 based, with 16K of RAM (yes, that's a K) and a 2 MHz processor. I plugged it into a cassette player for storage; loading programs was hit and miss. I later splurged and bought a BASIC ROM; initially you had to program it in hex. I used it for most of my uni days, running linear regressions for lab. It cost me $300 AUD then, all my school kid savings :-).
Edit: It initially had just upper case characters - these were bitmaps stored in a ROM. I later bought an add on board that added lower case characters and graphic characters (like the old PC-DOS characters) contained in another ROM. I think from memory you could also add a RAM chip and load your own programmable characters, though that could have been my imagination. I have vague recollections of trying to figure out ways to make it have graphics that you could move around using the RAM, so I could make my own space invaders game :-)
I have that same book on my shelf, survived 20+ years and more moves than I can count.
My first computer was the Sinclair Spectrum and I've always been pleased that the z80 assembly was close to Intel's syntax. The competitors at the time, such as the 6502, always looked so dense and hard to read.
It's an interesting take on retro; gives the user the feel of a retro computer without all the gnarly period-accurate hardware. Also amusing that the STM32 supporting CPU has far more power than the main Z80 CPU.
Nothing will replicate the same feeling I had when I wrote my first machine code program by poking into RAM, and discovered the world of million+ operations per second, rather than thousands from the land of BASIC. It was like a glimpse of blinding light through the narrowest of cracks.
Those were the days; I programmed in assembly language for at least two years without an assembler.
As you say, poking bytes into RAM via the lookup table in the computer manual, writing them down on paper, and trying to leave space in the code so I didn't have to recalculate all the relative jump offsets if I tweaked my program over a few days.
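For anyone curious what that bookkeeping looked like: a Z80 relative jump (JR) is two bytes, and its signed displacement is counted from the address of the *next* instruction, which is why inserting a single byte could invalidate every offset that crossed the edit. A minimal Python sketch of the calculation (the function name is mine, just for illustration):

```python
def jr_offset(jr_addr: int, target_addr: int) -> int:
    """Signed displacement byte for a Z80 JR at jr_addr jumping to target_addr."""
    # The displacement is relative to the instruction *after* the 2-byte JR.
    offset = target_addr - (jr_addr + 2)
    if not -128 <= offset <= 127:
        raise ValueError("target out of JR range; need an absolute JP instead")
    return offset & 0xFF  # two's-complement byte, as you'd poke it into RAM

# A JR at 0x8000 jumping forward to 0x8010:
print(hex(jr_offset(0x8000, 0x8010)))  # → 0xe
# The same jump backward (0x8010 -> 0x8000) wraps to a negative byte:
print(hex(jr_offset(0x8010, 0x8000)))  # → 0xee
```

Tweak the program so the target moves one byte and every one of these bytes has to be recomputed by hand, which is exactly why leaving slack space was worth it.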
> Also amusing that the STM32 supporting CPU has far more power than the main Z80 CPU.
I believe some Commodore drive already had two processors identical to the main processor of the computer -- effectively giving the drive more processing power than the actual computer :)
One of the fun things on my Z80 CP/M machine was a 20MB Seagate SCSI hard drive with a 68000 controlling the drive electronics. Which is why I've never really understood people who focus on one chip to "define" a system, it really is about all the parts playing in harmony.
That's funny - My 68000 based Amiga 2000 had a SCSI card in it with a Z80 on it.
That machine was a real monster - I had a bridge-board in it: the A2000 had both ISA and Zorro-II (Amiga proprietary) buses, with one ISA and one Zorro-II slot in line so that you could buy a bridge-board that held an 8086 etc. on it to basically give you a "PC in a window" on your Amiga.
I'd then bolted a 286 accelerator board onto it, and a 68020 accelerator board onto the Amiga side (I think both were of the type that slotted into the CPU socket and where you re-seated the CPU in the accelerator, in case you wanted to switch to the slower CPU for compatibility reasons).
On top of that, the Amiga 2000 and Amiga 500 used a 6502-compatible SoC with onboard RAM and PROM as their keyboard controller.
So that one machine had a 68000 (though not usually running), 68020, 8086 (not usually running), 80286, Z80 and a 6502-compatible. I found it quite pleasing to have one machine with representatives of four of the most common CPU families at the time in it.
> Which is why I've never really understood people who focus on one chip to "define" a system, it really is about all the parts playing in harmony.
It was about identity, in a sense, when so much was done in assembler on the main CPU, and most people were never aware of those auxiliary CPUs.
Well, unless you can run programs on your hard disk controller, that's not the chip you use to define how fast the system is.
Yes, the speed of your computer is about more than just the CPU, but the CPU, the RAM, and the GPU are far more influential than what speed your disk controller runs at.
The 8250 (the equivalent disk drive for the PET) had two 6502s inside!
In today's language, one was the application processor running DOS and the other was a DSP that did the low level hardware monkeying; they were connected up to a shared data bus. Because the 6502 only does bus accesses on one phase of the clock, by inverting the clock for one processor, it was safe to connect them both up to the bus simultaneously with no loss of speed...
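The trick is easy to sketch: a 6502 only performs bus accesses while its phi2 clock is high, so feeding the second CPU an inverted clock interleaves the two CPUs' access windows perfectly. A toy model of the idea (not real hardware timing, just the alternation it relies on):

```python
def bus_owner(phi2_a: int) -> str:
    """Which CPU owns the shared bus in a given half-cycle.

    phi2_a is CPU A's phase-2 clock level (1 = high). CPU B's clock is
    the inverse, so B accesses exactly when A doesn't.
    """
    return "A" if phi2_a == 1 else "B"

# Walk a few half-cycles: ownership strictly alternates, so the two
# processors never drive the bus at the same time.
owners = [bus_owner(p) for p in [1, 0, 1, 0, 1, 0]]
assert all(x != y for x, y in zip(owners, owners[1:]))
print(owners)  # ['A', 'B', 'A', 'B', 'A', 'B']
```

Because each CPU already idles the bus during its other half-cycle, neither loses any speed to the sharing.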
The really fun thing about these was that you could actually download code to them and run it, so if you did have code that was small enough and CPU intensive enough and required little enough I/O, you could in fact use them as an accelerator of sorts (but limited by a slow serial port and what I think was 2 KB of RAM on the drive).
I know of various code that downloaded routines to it, but usually loaders or just things like blinking the LEDs; I've never seen code that actually used the extra CPU for anything serious. I'm really curious if any exists.
It worked by transferring the "live" GCR-encoded data from the 1541's disk head to the C64 and simultaneously doing a fast checksum. Part of the checksumming was done on the 1541, part was done on the C64. There simply weren't enough cycles left on either side! Most of the transfer happened asynchronously by adjusting for the slightly different CPU frequencies and with only a minimum number of handshakes. This meant meticulous cycle counting and use of some odd tricks.
A GPU isn't really comparable to the modern CPU sitting beside it, since it lacks many of the CPU's features, e.g. virtual addressing, while in the other instances mentioned the processors were pretty much the same, feature-wise.
GPUs do, like most peripherals, contain additional CPUs that control them -- in the case of GPUs these are proprietary architectures specifically designed just for scheduling and sequencing tasks on the GPU and are said to be quite potent, for a device control processor.
Similarly hard drives had multi-core MIPS and ARM processors for quite a while (ditto for SSDs, in both cases they are ASICs, the actual data juggling isn't done by the CPU cores), and they even have quite a bit of RAM -- the cache -- that they'd normally share with the DMA engine, plus some of their own...
The control processors in most other peripherals are usually much weaker, though. They're pretty much just microcontrollers.
Some models of LaserWriter also had an undocumented PostScript command which would print a La Costeña order form for you to fax in. I always appreciate super-obscure Easter Eggs like that.
Sorry I'm late seeing your comment, but I'll tell you what I know for posterity. I heard the story third-hand: a prior manager of mine, Mark "The Red" Harlan (of Clarus the dogcow fame), used to be a DTS manager at Apple back in the late 80's/early 90's. He also MC'd a session at WWDC each year called "Stump the Experts," where the audience got to challenge a panel of Apple engineers with the most obscure questions possible. One audience member asked about this PostScript command, and one of the engineers admitted to the fax-in burrito order form. Sorry I don't remember the engineer's name or which WWDC year, but I'll ping Mark.
Kurzweil Music Systems produced the 150 additive synthesizer around the same time that also used a 12 MHz 68k. Do you know if this faster variant was encumbered in some way that made it less suitable for use in a "general purpose" computer, or was it simply a cost consideration?
I'm not sure how you think it would be encumbered, but from the Wikipedia page, my guess is it was mostly cost. The faster variants (like 12 MHz) only became available later, and I would guess that was due to improved production processes and the binning they enabled.
Really nice to see a Z80 project :) I'm actually finishing my own Z80 microcomputer this weekend. But mine will utilize a symphony of exotic 7400-series chips and hard-to-get Z80 peripheral chips. All ceramic ones. So if anyone wants to build their own, it will cost several hundred dollars :-/. But the idea of the project was to learn how computers were built in the early 80s, not to take a modern approach, like you :-)
Fun name, by the way. Mine will be named Calculon80 or Calculon64 :-D
Yes, and it is a lot of soldering :-) The real retro feeling is given by the old UV EPROM with gold pins (DDR copy of 2732) and the shiny big 2W resistors :-D
I first built my computer using low quality breadboards and a lot of wires, which I tried to keep as tidy as possible, without success ;-) I designed a PCB in KiCAD and got 5 of them delivered a week ago. Sooo much better :-)
So, at some point I learned that you could use an aspirin tablet instead of rosin. It works, kinda, but the fumes are the kind that melt your lungs off.
But yeah, there's no "real" soldering without burnt fingertips.
Burnt fingertips, hell. My second, and so far last, iron burn happened as a result of trying to solder an SD card slot hacked from an old floppy edge connector onto the CPU pins of a WRT54G with an insufficient magnifier that had me leaning way too close to the work. Turns out my nose sticks out further than I'd thought! And the blister stuck out further still. The sort of thing about which my mother might say "that'll learn ya," and it did - I'm much more cautious now.
It is especially the Z80 DART and SIO that are very expensive on eBay. I have 3 of them now, but I accidentally broke one when I inserted it the wrong way into the IC socket :'-(
Interesting. I was poking around the MAME database and found that the Z80 was used as recently as 2010 by a company called Igrosoft to make video slot machines, along with some modern PLDs and support chips. They also use a YM2149-compatible sound chip for that authentic bleepy sound.
The 8051 family also has a huge amount of support tooling for it -- compilers, validation tooling, etc. For a lot of jobs where the 8051 lives, a new architecture would just mean an expensive set of validation tests for no real benefit.
Wow, you're right. I figured an 8051 was popular, but its derivatives make up >50% of the embedded market it seems. I am guessing most clones offer more than 128 bytes of memory.
Old 8-bit architectures like Z80, 6502, and 8051 are everywhere, because they're cheap (royalty-free, unlike e.g. ARM) and there's plenty of software and tools for them.
This is a nice project and I have no objection to it.
It's interesting that he includes an STM32, an FPGA, and other such devices, as support and peripheral emulation chips, in his effort to get his 6502 running as he would like.
This has been part of the education of my students; here on the reservation we don't have a lot of resources, but we were able to score some nice FPGAs, so they've designed and built their own CPUs from scratch at the gate level, then built compilers and interpreters on top of that. The youngest to do so was 9. It's also been really convenient to be able to order inexpensive boards rather than having to deal with all the copper etching with chemicals, as we did a few years ago.
When i was a kid, i had a Mattel Aquarius. Those early days were great. Whilst this project does rekindle some of that nostalgia, I'm not sure I'll be FAPping anytime soon.
So instead of trying to recreate the “good old days”, I made the decision to liberally use modern parts to simplify the design process...
Then uses a thru-hole Z80.
If that's not a perfect illustration of nostalgia, I don't know what is. Remembering the good parts without all of the fuss of the not so good. Very nice project. I'd probably buy that as a hobby kit.
I would also recommend looking at the RC2014 project. Kits are available for order on Tindie. The mailing list is very active and new developments happen regularly.
Awesome project. Kind of off-topic: I wish I could buy such a machine or something similar as fully assembled mini laptop device with a large character LCD display.
I've been looking for fully programmable, full keyboard devices with extremely long battery life (weeks not days) for years, and all I can find are old pocket calculators and vintage machines from the 80s. I've had high hopes for the PocketChip, but there you go: 5 hours battery life only. :(
I really would love someone to build something like this on one of those FRAM chips so there is no volatile memory in the system. I'd love to write an operating system for such an environment (like those non-volatile time-sharing systems from the 80s that made use of stateless microkernels and really pushed the limits of disk).
The peak transfer rate of NVMe SSDs is getting close to theoretical DDR1 SDRAM speeds; not sure how well random access works in comparison, and if there's an easy way of interfacing them, but it should be able to keep an alternate history-retro machine with a ~100 MHz system clock occupied.
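For a rough sanity check on that claim, using nominal peak figures (these are spec-sheet maximums, not measured throughput, and random access would look far worse for the SSD):

```python
# DDR1-400: 400 MT/s on a 64-bit (8-byte) module.
ddr1_400_peak = 400e6 * 8          # 3.2 GB/s peak

# A PCIe 3.0 x4 NVMe drive: roughly 985 MB/s usable per lane
# after 128b/130b encoding overhead.
nvme_gen3_x4_peak = 4 * 985e6      # ~3.94 GB/s peak

print(ddr1_400_peak / 1e9, nvme_gen3_x4_peak / 1e9)
```

So in raw sequential bandwidth a common NVMe drive does indeed land in DDR1 territory; it's the access latency and block granularity that keep it from being a drop-in main memory.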
Another possibility would be to use an obscene amount of SRAM (by retro standards) and a battery to keep it powered at all times.
SSDs (and NAND flash in general) tend to have block oriented interfaces, so you'd need at least a cache. You'd end up at best with a hardware implementation of paging...
You could use NOR flash, which tends to support execute-in-place as well. I've booted Linux on a system with the bootloader running from flash.
The problem is still erase units. Doing read-only is easy; the moment you want writes, you'll need RAM as a cache and complex controller logic to do wear-levelling, unless you want to wear out your flash in no time.
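To make the erase-unit cost concrete, here's a toy model (the 4 KB block size and the behavior are generic assumptions, not any specific part's datasheet): flash programming can only clear bits, so changing one byte forces a read-modify-write of the whole erase block, which is exactly where the RAM cache and the wear on the block come from.

```python
ERASE_BLOCK = 4096  # bytes; a typical NOR sector size (assumed)

def rewrite_byte(flash: bytearray, addr: int, value: int) -> int:
    """Rewrite one byte; return how many bytes had to be reprogrammed."""
    block_start = addr - (addr % ERASE_BLOCK)
    # Read-modify-write buffer: this is the RAM cache the comment mentions.
    ram_cache = bytearray(flash[block_start:block_start + ERASE_BLOCK])
    ram_cache[addr % ERASE_BLOCK] = value
    # Erase sets every bit in the block to 1 ...
    flash[block_start:block_start + ERASE_BLOCK] = b"\xff" * ERASE_BLOCK
    # ... then the whole block is programmed back from the cache.
    flash[block_start:block_start + ERASE_BLOCK] = ram_cache
    return ERASE_BLOCK

flash = bytearray(b"\x00" * 8192)
cost = rewrite_byte(flash, 5000, 0xAB)
print(cost)  # 4096 bytes reprogrammed to change a single byte
```

Every one-byte write costing a full block erase is why naive writes chew through the flash's limited erase cycles, and why real controllers add wear-levelling on top of the cache.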
I think your project is awesome. Wish I had something like this 30 years ago. Having the Z80 clock controlled by a microcontroller is pure genius. We used an in-circuit emulator, and always wished we could write code for the ICE.
> FAP stands for FPGA Assisted Processor, inspired by Steve Ciarcia's 1981 book Build Your Own Z80 Computer, in which he called his computer ZAP, short for Z80 Application Processor.
I read the readme and this is what it said. Did you read the readme to the end? Was it different for you?
I read that too, but even after reading it, "FPGA Assisted Processor" isn't the first parsing that comes to mind for "fap", for the same reason that "Stream-Extracted XML" wouldn't be how I'd read "sex", even if I had just seen it in a readme.
If you go looking for issues, you'll find them, alright...
From the repo:
"FAP stands for FPGA Assisted Processor, inspired by Steve Ciarcia's 1981 book Build Your Own Z80 Computer, in which he called his computer ZAP, short for Z80 Application Processor."
Go looking? It's literally the project's name. Or do you think people see "FAP" and go "oh, jeez, that's such an unfamiliar word, I guess it's an acronym? wonder what it stands for"?
>The author would be wise to change it if he wants to be taken more seriously.
Based on the discussion so far it doesn't seem there's a problem on that front. If anything, the connotation will probably help it gain popularity, and that's a good thing.