In this era, code for games was essentially write-only, by one or two engineers. No one would be confused by the code in later releases because there simply weren't any. Code that was common between games was the exception rather than the rule (at Atari, some of the best debugged and well-structured code was the stuff that operated the coin mechanism, because it had to be right or customers would aggressively exercise the tilt switch mentioned in the article).
Code also had to run on really tiny hardware. ROMs cost real money, and marketing wasn't going to give you more just to make your life easier. If your code ran in 4K, but Joe's code was a complete train-wreck that ran fine in 2K, guess who got the big bonus?
One cow-orker of mine had spent some time writing code for pinball machines at Midway. He said that on one project the hardware engineers wanted to save money by half-populating the board's RAM; unfortunately the way they sliced it was to only provide the bottom 4 bits of each byte.
"The ROM is still 8-bits wide, you can work around the RAM thing in software."
He only got his full-width RAM by pointing out that the processor's stack wouldn't work.
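The stack argument is concrete: on these processors, a CALL pushes the return address into RAM a full byte at a time, so RAM that stores only the bottom 4 bits of each byte silently drops the upper nibble of every pushed byte, and the eventual RET jumps somewhere wrong. A minimal sketch of that failure (all sizes and names here are hypothetical, just to illustrate the mechanism):

```python
RAM_SIZE = 256

class NibbleRAM:
    """RAM chips wired to only the bottom 4 bits of the data bus."""
    def __init__(self):
        self.cells = [0] * RAM_SIZE

    def write(self, addr, byte):
        self.cells[addr] = byte & 0x0F   # upper nibble is simply lost

    def read(self, addr):
        return self.cells[addr]

ram = NibbleRAM()
sp = RAM_SIZE - 1  # stack pointer, growing downward

def push(byte):
    global sp
    ram.write(sp, byte)
    sp -= 1

def pop():
    global sp
    sp += 1
    return ram.read(sp)

# A CALL pushes the 16-bit return address as two bytes.
return_addr = 0x12AB
push((return_addr >> 8) & 0xFF)  # high byte 0x12 stored as 0x02
push(return_addr & 0xFF)         # low byte 0xAB stored as 0x0B

# RET pops them back -- and lands somewhere entirely different.
recovered = pop() | (pop() << 8)
print(hex(recovered))  # prints 0x20b, not 0x12ab
```

You could work around nibble-wide RAM for ordinary variables by packing data into the low 4 bits, but the hardware stack gives you no such choice: the CPU writes whole bytes on every CALL and interrupt, which is why only full-width RAM could fix it.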
That’s hilarious. I guess people were not that knowledgeable back then.
And that scale of logic, translated into a microprocessor, makes for some pretty trivial software; the games got to skip over all the intermediate technologies of vacuum tubes and drums and core memory. So part of that initial video game boom - and the corresponding switch to solid-state pinball - really was a kind of seeking out of applications for hundreds to thousands of bytes of memory, often supplemented with existing analog tech. It's a fascinating crossover in many ways, with a punctuated ending: the '84 games crash and the '85 computers crash precipitated cheaper RAM, and in the following years, suddenly all the games started having more expansive worlds, full soundtracks, keyframed sprite animation, etc. Development shifted from a hardware and manufacturing focus to a software one, and the computing devices became much more convergent and portable. And that is the trend we've been on since in the consumer space - inexhaustible cravings for more memory and higher fidelity, held at bay only by the whims of hardware teams.
But looking at coverage of CES this year, I see this long trend as something that's played out, with a lot of companies across the board trying to pitch new concepts instead. We're in a space where people have Enough Memory. So things are getting exciting again.
(alternative YouTube link: https://www.youtube.com/watch?v=WU49fpXtUTU)
> Move so you are peaking out of the right edge
Obviously, "peaking" should be "peeking".
I don't recall seeing it much until relatively recently, and it seems to have become very common.
Have I missed something? Was it always a common homophone error, or is there a reason for it peaking recently?
Incidentally, 9 credits of space invaders ruins the space invaders experience.