This was a brilliant design. It had to balance cost, in terms of pin count and the overhead of dynamic refresh, against capacity and reliability.
It hit a sweet spot on every axis, which is basically what let this design take over the world of early computing, and it composed easily into larger memory sizes, up to a point that was sufficient for the time.
There's a good reason this style of RAM was used on so many machines of the era, in essentially the same configuration, and that you can often swap RAM chips between machines of that period.
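To make the pin tradeoff concrete, here is a minimal sketch (my own illustration, not from the article) of a 4116 read cycle: the 14-bit address of a 16K x 1 part is presented in two 7-bit halves over the same seven address pins, latched first by /RAS and then by /CAS. The pin helpers (set_addr_pins, ras_low, and so on) are hypothetical stand-ins for whatever actually drives the bus.

    /* Illustrative only: a bit-banged 4116 (16K x 1) read cycle.
       The helpers declared below are assumed, not a real API. */
    #include <stdint.h>

    extern void set_addr_pins(uint8_t a);   /* drive the shared A0..A6 pins */
    extern void ras_low(void), ras_high(void);
    extern void cas_low(void), cas_high(void);
    extern uint8_t read_dout(void);         /* sample the DOUT pin */

    uint8_t dram_read_bit(uint16_t addr14)
    {
        set_addr_pins((addr14 >> 7) & 0x7F); /* row half of the address */
        ras_low();                           /* /RAS falling edge latches the row */
        set_addr_pins(addr14 & 0x7F);        /* column half, same seven pins */
        cas_low();                           /* /CAS falling edge latches the column */
        uint8_t bit = read_dout();           /* data valid after the access time */
        cas_high();
        ras_high();                          /* precharge before the next cycle */
        return bit;
    }

Accessing a row also refreshes it, which is why the refresh burden boils down to strobing every row address with /RAS every couple of milliseconds.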
We are at an interesting point in computer history, IMO. Microcontrollers are getting fast enough to keep up with most of what happens inside the first (or first few) generations of personal computers. Before, you had to use FPGAs. Now people use the Pico to create things like digital RGB upscalers or N64 cartridges. The Raspberry Pi Pico is fast enough (or almost fast enough) to keep up with most timing transitions of the 4116 and 4164, and I was able to create a Raspberry Pi Pico-based RAM tester that tests 4116 or 4164 RAM chips _live_ as they are being driven by an MSX. If you're interested, take a look at my blog entry here: https://blog.qiqitori.com/2023/08/testing-4164-4116-dram-ics...
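For a flavor of what the Pico side can look like (this is not the code from the blog post, and the pin assignments are made up), you can busy-wait on /RAS and snapshot the multiplexed address pins on each falling edge; a tester that keeps up with every cycle would realistically need PIO and DMA rather than this polling loop:

    /* Hedged pico-sdk sketch: latch the row address on each /RAS fall.
       GPIO numbers are assumptions; a real live tester would use PIO/DMA. */
    #include <stdio.h>
    #include "pico/stdlib.h"

    #define RAS_PIN 8   /* hypothetical: /RAS wired to GPIO8 */
    /* hypothetical: multiplexed A0..A7 wired to GPIO0..GPIO7 */

    int main(void)
    {
        stdio_init_all();
        for (int p = 0; p <= RAS_PIN; p++) {
            gpio_init(p);                    /* pins default to inputs */
        }

        while (true) {
            while (gpio_get(RAS_PIN)) { }            /* wait for /RAS low */
            uint32_t row = gpio_get_all() & 0xFF;    /* snapshot A0..A7 */
            while (!gpio_get(RAS_PIN)) { }           /* wait for /RAS high */
            printf("row 0x%02x\n", (unsigned)row);   /* logging misses cycles */
        }
    }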
I guess this is a bit outside of microcontrollers, but it's nice to see the current moving toward things like FPGAs too.
GPUs are great, but for my use case they can also be a bit limiting, because I'd like to do high-throughput bit manipulation on numbers, which starts to get difficult past a certain point. And odd bit depths like 5, 3, and below get weird too.
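As a toy illustration of the 5-bit case (my own sketch, with a made-up get5 helper): a 5-bit element can straddle a 32-bit word boundary, so each extraction may need two loads stitched together, which is exactly the kind of addressing that gets awkward on GPUs:

    /* Illustrative sketch: extract the i-th 5-bit field from a packed stream.
       The caller must pad the buffer with one extra word, since an element
       near the end may read into buf[word + 1]. */
    #include <stddef.h>
    #include <stdint.h>

    uint32_t get5(const uint32_t *buf, size_t i)
    {
        size_t   bit  = i * 5;
        size_t   word = bit >> 5;   /* bit / 32 */
        unsigned off  = bit & 31;   /* bit % 32 */
        /* stitch two words so a straddling field reads out cleanly */
        uint64_t pair = (uint64_t)buf[word] | ((uint64_t)buf[word + 1] << 32);
        return (uint32_t)(pair >> off) & 0x1F;
    }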
All that said, I definitely agree that much more powerful microcontrollers are extremely nice: fewer weird, fiddly quirks to fight, more devvy magic. <3 :'))))
The amazing thing (as Ken points out right at the end) is that the latest chips store 16 gigabits, about a million times more (16 Gbit / 16 Kbit = 2^20 ≈ 10^6), in a similar area. So you can think of each of those 1979 transistors as now containing 10^6 2020s transistors! It's mind-boggling. Also a million transistors is approximately an Intel 80386 (actually it's a little bit more).
275k for a 386. 1.2M for a 486.
(I just noticed the Wikipedia article on the 386 is wrong: it gives both 275k and 855k, with the latter citing a set of presentation slides that doesn't even contain the number 855. Perhaps that's where the misinformation came from? But I strongly remember the 275k number from early Intel marketing material.)
I looked into this: the 386SL has 855,000 transistors while the other 386 versions have 275,000 transistors. I updated the Wikipedia page, adding a reference to Intel's detailed "Microprocessor Quick Reference Guide", which has transistor counts for most of their chips. (I wish I had found this guide long ago.) https://www.intel.com/pressroom/kits/quickreffam.htm
The 386SL adds a cache controller, memory controller, AT bus controller, and EMS 4.0 hardware, which is why it has more than three times the transistors. I checked multiple independent sources before posting, and they all said 855,000 transistors for the 386SL. (Sources from the 1990s, so they're not parroting Wikipedia.) The cache controller has a (relatively) huge amount of tag RAM, which accounts for a lot of the additional transistors.
Neat! I remember feeling very, very fancy when I got my Timex/Sinclair 16K RAM pack. It used 4116 chips. I was puzzled why they'd use so many chips in the expansion instead of just using static RAM, but back then everything was pricey. Compared with the 2K of RAM the TS/1000 came with, this was a huge amount of memory :)
And of course the ZX Spectrum's main memory used only half of each chip. Sinclair could buy faulty RAM chips very cheaply; they were binned into "top half working" or "bottom half working", then either all tops or all bottoms went into each Speccy, and a solder point on the motherboard was set accordingly.
The Speccy used 4116s and 4164s (and later 4464s in the Amstrad versions, except the "+2"). The 4116 was used for the lower RAM in the 48k version, or all the RAM if you had a 16k version. The 4164 was used for the upper RAM; those were the ones binned for a working upper or lower half, and you set a jumper to select the good half. The 128k "toastrack" and grey "+2" used fully working 4164s for all their memory. A neat and typical Sinclair solution to saving a few bob on each unit.
Always interesting how much engineering went into a 'simple' early chip like this. This was state-of-the-art once. :-)
Never liked these 4116s myself, due to their multiple supply voltages (+12 V, +5 V, and -5 V). They were a common cause of problems wherever they were used (like in the ZX Spectrum, as another poster pointed out).
And of course Ken Shirriff is brilliant as always!