How Atari 2600 Game Pitfall Builds Its World (evoniuk.github.io)
166 points by weare138 on May 10, 2021 | 29 comments



I'm surprised this article does not discuss one of the most fundamental constraints facing all Atari 2600 games - the fact that the machine had no framebuffer and every programmer had to "race the beam".

You had to keep track of where the TV's light gun was at every moment, and if your program took too much CPU time to compute things, you would miss changing colors, etc., as the beam went past.

This is why so many Atari 2600 games (like Pac-Man) have graphics that seem weirdly, horizontally distended - the CPU wasn't fast enough to change the color before the light gun got to the next pixel.

All of this is beautifully explained in the wonderful book _Racing the Beam_, which is part of the MIT Press Platform Studies series and is one of my favorite books on any subject.


The 2600, called the VCS (Video Computer System) when I was a kid, offers three kinds of graphics.

Background. This can be changed by the CPU, but as mentioned, the CPU is not fast enough to do that per pixel. It is often changed per scanline, though.

Playfield. These are 40x1 pixels that can be mirrored and otherwise manipulated. Call it a scanline buffer, or a partial one, depending on what one wants to do.

If you want symmetry, 2 and a half bytes of buffer has you covered. Otherwise, you need to load different pixels for the other side of the screen.

Playfield is monochrome. One bit per pixel. The colors can be changed at any time, and are often changed per scanline.

Sprites. There are two "missiles" and a "ball", each a one-bit-wide strip (stretchable to 2, 4, or 8 pixels) that can run the length of the screen vertically. It is up to you to put the right pixels and positions in place for every scanline.

And two player sprites, the same idea but 8 bits wide.

The horizontal precision of the system is basically one NTSC color clock, which gives roughly 160 pixels across the visible line, or positions at which the sprites can be drawn.

This generally means pixels at least twice as wide as they are tall. Other graphics can be much wider.

A typical game will contain a complicated kernel that is responsible for the graphics. Game logic and other tasks happen during the blanking times.
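
To make that concrete, here's a hedged sketch of a kernel fragment in DASM-style 6502, assuming the register names from the standard vcs.h include (pfdata and colors are made-up tables, not code from any shipping game):

          ldx #8           ; height of this band in scanlines
  band    sta WSYNC        ; halt the CPU until the next scanline starts
          lda pfdata,x     ; hypothetical per-line playfield table
          sta PF1          ; the middle 8 bits of the 20-bit playfield
          lda colors,x     ; hypothetical per-line color table
          sta COLUPF       ; playfield color, changeable every line
          dex
          bne band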

System resources:

128 bytes of RAM

4k ROM

Sound and game controller I/O, which is actually fairly robust, supporting digital direction input, up to 4 paddles (pots that can indicate a position), and some buttons.

I do not recall ever having seen one, but a nice analog joystick is supported, as are rotary encoders and various button and stick combinations.
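
Reading the standard joystick is just a port read on the RIOT. A minimal sketch, again assuming the usual vcs.h names (the bits are active-low):

          lda SWCHA          ; RIOT port A: both joysticks, one nibble each
          and #%10000000     ; bit 7 goes low when player 0 pushes right
          beq MoveRight      ; branch to a hypothetical handler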


Pixel-perfect hardware collision detection


And sprite / playfield priority. The cool, invisible room in ADVENTURE comes to mind.
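
Both are just a register or two. A hedged sketch with the usual vcs.h names (CXPPMM latches player/player hits; CTRLPF bit 2 puts the playfield in front of the players):

          lda #%00000100
          sta CTRLPF       ; playfield now drawn in front of the players
          bit CXPPMM       ; bit 7 -> N flag: did player 0 touch player 1?
          bmi Collided     ; branch to a hypothetical hit handler
          sta CXCLR        ; strobe: reset all collision latches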


The NES didn't have a framebuffer either, but its sprite hardware was a lot more capable. OTOH, Atari had a patent on synchronizing the CPU with a scanline, so if you ran out of stuff to do during a scanline, you could wait and start again knowing exactly where the beam was. Nintendo didn't build that into the NES, so if you wanted to do something tricky based on where the beam was, you had to work a lot harder (or have something to help you in the cartridge hardware).


> The NES didn't have a framebuffer either

The NES does have two nametables that perform a similar role, though. Instead of storing a 256x240 array of pixels, they store a 32x20 array of tile indices, each describing an 8x8 pixel tile. Once the CPU has written to the nametables, it can do something else while the PPU draws an entire frame.

Whereas on the 2600, the CPU needs to step in every few scanlines or it can't draw a 2D image at all!
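
In 6502 terms, drawing on the NES is mostly poking tile numbers at the PPU during vblank. A rough sketch ($2006/$2007 are the real PPU address/data ports; the tile value is made up):

  PPUADDR = $2006
  PPUDATA = $2007
          lda #$20         ; nametable 0 starts at PPU address $2000
          sta PPUADDR      ; high byte of the target address
          lda #$00
          sta PPUADDR      ; low byte of the target address
          lda #$42         ; arbitrary tile index, for illustration
          sta PPUDATA      ; the PPU now redraws this tile every frame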


On the VCS, you are supposed to set up a scanline, which will be then drawn by the graphics chip (TIA) automatically. Manipulating this setup on the fly as in "racing the beam" is really more exploit territory. The vertical aspect of the image is left to the program(mer), but the VCS gives you a helping hand in form of a critical feature: The graphics chip can halt the CPU until it has reached the start of a new scanline. So your program can sleep and will be woken up precisely at the beginning of the horizontal blank interval. (The 6502C "Sally" MPU used in later Atari 8-bit computers had a similar feature: an extra signal to pause the processor, when pulled low.)
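
Horizontal sprite positioning shows how central that halt is: there is no X-coordinate register, so you strobe RESP0 at the moment the beam is where you want the sprite. A rough sketch (real code refines the result with the HMP0 fine-motion register):

          sta WSYNC        ; sleep until the start of a scanline
          ldx #5           ; burn a known number of cycles...
  wait    dex
          bne wait         ; ...five per loop iteration
          sta RESP0        ; player 0 appears wherever the beam is now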


> The graphics chip can halt the CPU until it has reached the start of a new scanline.

This seems similar to the horizontal-blank interrupt that's available on later machines, where it's widely used to change color palette or scroll values part-way down the screen. Because it’s an interrupt, though, the CPU doesn’t have to wait for it and can do other things during the active display period. As toast0 says, on the NES this interrupt is usually provided by hardware on the cartridge, but other 8-bit consoles (including the Game Boy and Master System) support it natively.

> Manipulating this setup on the fly as in "racing the beam" is really more exploit territory.

Even on the NES, "racing the beam" is not unheard of. For example, Marble Madness uses mid-scanline changes to draw text boxes in the middle of the level graphics. For part of the display it uses timed code to switch from the level nametable to text and back again within each scanline.


> a 32x20 array of tile indices

Oops, that should say 32x30.


Yes, it's all very stressful to consider from our modern cushy full-screen 32-bit 8-megapixel triple-buffered world. :)

That said, a tiny nit: televisions don't have light guns, they have electron guns. The light is created by the phosphor layer on the inside of the tube. Chances are you know this from the rest of the comment, of course.


I didn't think to include that (I'm the author) 'cause that constraint is more about the difficulty of timing instructions, rather than saving memory, which is the main reason Crane implemented the LFSR that he did.
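
For the curious, a one-byte Galois-style LFSR step is only a few instructions on the 6502. A hedged sketch, not Crane's actual code ($1D is a textbook primitive feedback mask and his taps differ; the article covers how Pitfall also steps the sequence in reverse so walking left works):

  room    = $80            ; hypothetical zero-page state byte
          lda room
          asl              ; shift left; old bit 7 falls into carry
          bcc store        ; a 0 fell out: no feedback needed
          eor #$1D         ; a 1 fell out: XOR in the feedback taps
  store   sta room         ; a nonzero seed hits all 255 states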

But yeah I read that book and it started my interest in this topic. Super interesting stuff.


I got interested in 2600 programming a few years ago, it's really fun. I never got around to making anything of interest, and I've sort of lost the drive to try. But even though I'm not programming, or even playing it much, it turns out the community is just really cool in general. I still browse the Atari Age forums daily and there's a really great Twitch channel, ZeroPage Homebrew, that's usually live a couple times a week. The developers of the homebrew games are usually in the chat, and sometimes even OG developers are on the show. David Crane has been on a few times. It's the coolest online community I've ever come across.


I went through the same cycle, and did end up making a fun game using Batari Basic.

That is a compiled BASIC for the 2600, and while that might seem crazy on a machine with 128 bytes of RAM, it actually works very well.

Agree with you on the scene. Lots of fun!


https://www.youtube.com/watch?v=tfAnxaWiSeE

The GDC post-mortem of Pitfall has some more info, and I found it pretty entertaining.

I think the whole series is worth watching if you like this sort of stuff.


Pitfall II was the first video game I ever owned. Good times.

The impact games had on my generation should not be underestimated. I am sure for many if not most people who got interested in computer programming, games were the entryway. For me, it's certainly true. Ever since my dad brought home a used Commodore 64, I wanted to learn how to program it. The C64 User Manual got read over and over again until it almost fell apart.

And in a way, the simple times back then made this not only possible but almost a necessity. Pre-internet, you didn't have access to millions of games. So if you wanted to do something with that great machine, learning how to write your own game (which, coincidentally, I never mastered) seemed like the only logical thing. And it was entirely possible: flip the on/off switch and two seconds later, you're greeted by a BASIC prompt. Have fun!

(Of course, later on people started to figure out how to circumvent copy protection, and it was very common for kids to share floppy disks with cracked games on the school yard. Copying a disk, however, required special tools and took quite a while, because you'd only have one disk drive and the computer did not have enough memory to hold all the contents at once. So you would have to go back and forth between the source and the target disk a few times. This took 10 or 15 minutes, if memory serves. And you could never be sure if the copy would actually work, for some reason. You'd have to try different copying tools to find one that did the job right.)

The closest to that feeling today is probably coding JavaScript, but the experience is entirely different. It's actually much better today, in that the chances are much lower that you get stuck - there are infinite free programming resources, plus platforms like StackOverflow where you can find answers to common questions, or even try to ask some questions of your own (good luck, newbie).

But I still look back fondly on the endless hours that I spent in my room in front of a bulky but tiny monitor, staring at those 16 colors, having big dreams of fantastic games and learning by just figuring things out.

And every now and then, when things got too frustrating, you could always put in the old cassette tape and load up Pitfall II again to give yourself a break.

SHIFT + RUN/STOP


This is a perfect place to mention Ben Fry's Distellamap project https://benfry.com/distellamap/

It creates stunning visualizations of VCS game code.


Incredible that each room in Pitfall is represented by just one 8-bit byte.

That's what makes these old games fascinating: producing results under extreme constraints.


We're all so used to modern abstractions, libraries, data-driven engines, general solutions. We can afford them now. The developers of these old games couldn't, they were forced to do more with less. The results are absolutely fascinating. There's a certain quality that seems to have been lost over time as computing resources available to developers have improved.


It would be surprising if those old game developers didn't have their own library of code, gradually perfected from game to game. That would have been their modern frameworks.


The procedurally generated universe in Elite [1] contained 8 galaxies and each galaxy had 256 planets. They were all generated from a single seed number.

Interestingly, they artificially limited the number of galaxies, as theoretically they could have had 2^48 of them.

[1] https://en.wikipedia.org/wiki/Elite_(video_game)
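
If I remember the scheme right, the 8-galaxy limit falls out of the math: a galactic hyperspace jump just rotates each byte of the 6-byte seed left by one bit, so eight jumps bring you home. A sketch of that rotate in 6502 terms (seed is a hypothetical 6-byte buffer):

          ldx #5           ; six seed bytes, indexed 5..0
  twist   lda seed,x
          asl              ; bit 7 falls into carry...
          adc #0           ; ...and carry back into bit 0: a rotate
          sta seed,x
          dex
          bpl twist        ; after 8 full passes the seed repeats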


The call to action at the end may be the most important part of the article, so if you get tempted to bail early, try not to.


Reminds me of http://www.neocomputer.org/projects/et/ - Fixing ET for the 2600.


It would have been kinda neat to learn assembly programming with the concepts from the article. When I was in school it was a required part of the curriculum and seemed like a gate that weeded out students, since people complained it was so hard. Context for what assembly was used for, and what it could be used on, might have helped students understand better. I found it easy, almost freeing, since there were so few instructions to use.

There are some really innovative ways things were accomplished back then as pointed out in the article.


8-bit Workshop[0] got me started on 2600 programming. I've since moved on to a local setup using Stella (emulator) and DASM (assembler), but it's a really easy place to start. The book advertised there is decent, but there are also some really good tutorials that can be found online, this one[1] is pretty popular.

[0] https://8bitworkshop.com/

[1] https://atariage.com/forums/topic/33233-sorted-table-of-cont...


I wanted to comment directly on the page, but I didn't see a way to do so. I am not an assembly language programmer, so some of this just made my eyes glaze over - but I have to say anything about Adventure OR Pitfall is gold as far as I'm concerned. Thank you SO much for your article! I hope this ends up on archive.org so it never gets lost...


Thanks so much! I'm glad you enjoyed!


> For comparison, this sentence alone takes up more space if encoded in ASCII, let alone the UTF format in which it's actually encoded.

If the author had used a U+2019 Right Single Quotation Mark instead of an ASCII apostrophe, the sentence would actually be true.


Lol. Just added it!


Was hoping to see some more photos of Pitfall.



