When I worked in Qualcomm's R&D department, I developed a feature for their modem chips that allowed the code to be shipped in compressed form. I wrote a custom paging handler that fetched and decompressed pages on the fly.
It started with just code and RO data. Then we added RW compression, which meant our handler now had to re-compress evicted pages of RW data that had been modified.
I came up with the eviction scheme. I even added tooling to our build system that analyzed hardware traces of typical usage and fed the results to the linker, which organized the image layout to reduce page fetches (and thus decompressions) in those common use cases.
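A much-simplified sketch of the idea: pages live in a backing store in compressed form, a tiny resident pool holds decompressed pages, and a fault handler fills frames on demand. The run-length codec and round-robin eviction here are toy stand-ins for the real compressor and eviction policy, and all names are invented:

```c
#include <string.h>

#define PAGE_SIZE  16
#define NUM_FRAMES 2   /* tiny resident pool, to force eviction */

/* Backing store: each page stored run-length encoded as
 * (count, byte) pairs, terminated by a zero count. */
static const unsigned char packed_page0[] = { 16, 'A', 0 };
static const unsigned char packed_page1[] = { 8, 'B', 8, 'C', 0 };
static const unsigned char packed_page2[] = { 4, 'D', 12, 'E', 0 };
static const unsigned char *backing[] = { packed_page0, packed_page1, packed_page2 };

static unsigned char frames[NUM_FRAMES][PAGE_SIZE];
static int frame_owner[NUM_FRAMES] = { -1, -1 };  /* which page lives in each frame */
static int next_victim = 0;                       /* round-robin eviction cursor */

static void decompress(const unsigned char *src, unsigned char *dst) {
    while (*src) {                /* expand each (count, byte) run */
        memset(dst, src[1], src[0]);
        dst += src[0];
        src += 2;
    }
}

/* Page-fault handler: return the page if resident, else evict and fill. */
static unsigned char *fetch_page(int page) {
    for (int f = 0; f < NUM_FRAMES; f++)
        if (frame_owner[f] == page)
            return frames[f];
    int victim = next_victim;
    next_victim = (next_victim + 1) % NUM_FRAMES;
    /* RO pages can simply be discarded; a dirty RW page would be
     * re-compressed back into the store here before reuse. */
    frame_owner[victim] = page;
    decompress(backing[page], frames[victim]);
    return frames[victim];
}

unsigned char read_byte(int page, int offset) {
    return fetch_page(page)[offset];
}
```

Every access goes through the fault handler, so the working set stays bounded by `NUM_FRAMES` no matter how large the compressed image is.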
They've since gone wild with it. I even remember there being discussion on stack compression. I think they are working on a hardware compression/decompression engine these days and integrating with caching.
Compression algorithms like LZ weren't viable in games until much later, as most computers lacked the memory and clock speed to run them quickly.
True. A space-time tradeoff.
>I ended up making a segmented that spilt the call graph into mostly subtree by selecting the subroots. A small set of the shared pets got to live in non overlay memory.
I didn't get what you mean by "making a segmented" and "shared pets". Typos?
The aim was to make Byte readers aware of Zork.
IIRC this article is a transcript of an article from an issue of Byte magazine in the early 1980s.
Ahh, here it is: https://archive.org/stream/creativecomputing-1980-07/Creativ...
my approach was a little bit different - all strings were shared with reference counts, and were copied on write. it took a while to boot and load the world, but once it was loaded it worked well.
that was still a little bit too big, so paging was introduced, which made it run quite nicely in what little ram I had at the time.
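The shared-string scheme described above can be sketched roughly like this — a minimal illustration of refcounting plus copy-on-write, not the actual interpreter's code; all names are invented:

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    int   refs;   /* how many owners share this string */
    char *data;
} Str;

static char *dup_cstr(const char *s) {
    char *d = malloc(strlen(s) + 1);
    strcpy(d, s);
    return d;
}

Str *str_new(const char *s) {
    Str *p = malloc(sizeof *p);
    p->refs = 1;
    p->data = dup_cstr(s);
    return p;
}

/* Sharing a string is just bumping the count. */
Str *str_share(Str *p) { p->refs++; return p; }

void str_release(Str *p) {
    if (--p->refs == 0) { free(p->data); free(p); }
}

/* Copy-on-write: clone only if someone else still holds a reference. */
Str *str_set_char(Str *p, size_t i, char c) {
    if (p->refs > 1) {
        p->refs--;
        p = str_new(p->data);
    }
    p->data[i] = c;
    return p;
}
```

Reads stay free and copies only happen on the first write to a shared string, which is what makes the scheme pay off when most of the world's text is never modified.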
New games would come out simultaneously on every platform they supported; Wikipedia lists those as "the Apple II family, Atari 800, IBM PC compatibles, Amstrad CPC/PCW (one disc worked on both machines), Commodore 64, Commodore Plus/4, Commodore 128, Kaypro CP/M, Texas Instruments TI-99/4A, the Mac, Atari ST, the Commodore Amiga and the Radio Shack TRS-80."
(Eventually they expanded the restrictions on the Z-Machine and began creating games too big to fit on the smaller machines, but they were still able to do simultaneous releases on everything with enough room to juggle the larger games.)
Also, the VM implemented a domain-specific language designed with "writing text adventure games" in mind; this was a huge advantage in a time when most games were programmed in assembly language, both in terms of a faster development cycle and in terms of building better text-based games by hiring people who were writers first and foremost rather than programmers.
(The VM interpreter also implemented virtual memory; while a 128k game file doesn't sound like much, it was pretty huge in an era when typical home computers had 32 or 64k of RAM. Swapfiles didn't exist on anything but huge minicomputers.)
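That interpreter-level virtual memory can be pictured as a small page cache sitting between the VM's address space and the story file: addresses split into (page, offset), with misses filled from disk. This is a toy illustration assuming LRU replacement and an in-memory stand-in for the file, not Infocom's actual implementation:

```c
#include <stdint.h>
#include <string.h>

#define PAGE_SHIFT  9                  /* 512-byte pages */
#define PAGE_SIZE   (1 << PAGE_SHIFT)
#define CACHE_PAGES 2                  /* pretend RAM affords only 1K of cache */

static uint8_t story[4 * PAGE_SIZE];   /* stands in for the on-disk story file */

static struct {
    int      page;                     /* -1 = slot empty */
    unsigned last_used;
    uint8_t  data[PAGE_SIZE];
} cache[CACHE_PAGES] = { { -1 }, { -1 } };
static unsigned clock_tick;
static int disk_reads;                 /* counts simulated disk accesses */

static uint8_t *get_page(int page) {
    int lru = 0;
    for (int i = 0; i < CACHE_PAGES; i++) {
        if (cache[i].page == page) {   /* hit: refresh recency and return */
            cache[i].last_used = ++clock_tick;
            return cache[i].data;
        }
        if (cache[i].last_used < cache[lru].last_used)
            lru = i;
    }
    /* Miss: replace the least recently used slot from "disk". */
    disk_reads++;
    cache[lru].page = page;
    cache[lru].last_used = ++clock_tick;
    memcpy(cache[lru].data, story + page * PAGE_SIZE, PAGE_SIZE);
    return cache[lru].data;
}

/* Every VM memory read goes through the cache. */
uint8_t vm_read(uint32_t addr) {
    return get_page(addr >> PAGE_SHIFT)[addr & (PAGE_SIZE - 1)];
}
```

The game file can thus be much larger than physical RAM, at the cost of a disk read whenever the working set overflows the cache.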
The 3:1 reduction doesn't necessarily apply to operations that would normally map directly to CPU instructions. But especially for that generation of 8-bit microprocessors like the 6502, as soon as you're working with 16-bit data (which you definitely want for something like Zork) you end up spending a lot of code space for even simple operations. There's a famous example of a VM designed for the 6502 explicitly for reducing the code size (and inconvenience) of working with 16-bit operands: https://en.wikipedia.org/wiki/SWEET16
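The SWEET16 idea can be sketched as a tiny interpreter: 16-bit virtual registers driven by compact byte-coded instructions, so a 16-bit add costs three bytes of code instead of a multi-instruction 6502 sequence of loads, `CLC`/`ADC` pairs, and stores. The encoding below is invented for illustration and is not SWEET16's actual opcode map:

```c
#include <stdint.h>

enum { OP_HALT, OP_SET, OP_ADD };   /* SET r, imm16 ; ADD rd, rs */

/* Execute byte-coded instructions over 16-bit virtual registers;
 * returns r0 on halt. */
uint16_t run(const uint8_t *code) {
    uint16_t r[4] = { 0 };
    for (;;) {
        switch (*code++) {
        case OP_SET: {                    /* 4 bytes: op, reg, lo, hi */
            uint8_t reg = *code++;
            r[reg] = (uint16_t)(code[0] | (code[1] << 8));
            code += 2;
            break;
        }
        case OP_ADD: {                    /* 3 bytes: op, rd, rs */
            uint8_t rd = *code++, rs = *code++;
            r[rd] = (uint16_t)(r[rd] + r[rs]);
            break;
        }
        case OP_HALT:
            return r[0];
        }
    }
}
```

The 16-bit arithmetic logic appears once, inside the interpreter; each use site pays only a few bytes of bytecode, which is exactly the code-size tradeoff the quote below describes for Z-code.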
"In effect, Z-code is like P-code: a string of subprogram calls, with the bodies of the subprograms executed by a Z-machine or ZIP. Any often-used sequence of pperations [sic] in Zork programs could, in principle, be compressed into a Z-code instruction, thereby moving the sequence of operations into the Z-machine or ZIP, where it needs to appear only once."
The goal wasn't to do it the easy way, but to achieve something that happened to be difficult.
It's also important to remember that loading a program into a computer back then was very different than today. Today, you load a program and it exists along with the operating system and all its support routines, needed or otherwise. For example, you may load a car racing game on Windows or macOS, but the printer drivers are still there.
Back then, most games on most computers would wipe out the entirety of the machine down to the last byte. Even the fonts could be blown out if you needed the space. (Storing changing variables in what we would call the font space was a fun way to create an interesting display while heavy math was being done, but without wasting resources on a "loading..." progress bar.)
Edit: cool, found it.
The archive.org site has a vast collection of old DOS games that are browser-playable: