
How Naughty Dog Fit Crash Bandicoot into 2MB of RAM on the PS1 - ekianjo
http://www.quora.com/How-did-game-developers-pack-entire-games-into-so-little-memory-twenty-five-years-ago/answer/Dave-Baggett?srid=z9ZA&share=1
======
JonnieCache
If you haven't read the "Making Crash Bandicoot" blogposts, you're in for a
treat.

[http://all-things-andy-gavin.com/2011/02/02/making-crash-ban...](http://all-things-andy-gavin.com/2011/02/02/making-crash-bandicoot-part-1/)

~~~
joesmo
Nice read. I really like this part: "But we worried about the camera,
dizziness, and the player’s ability to judge depth – more on that later."

It's interesting that they were concerned with dizziness and the camera -
concerns which unfortunately seem to have evaporated in most 3D games made
since, to their detriment.

~~~
JabavuAdams
These concerns are back for VR. Also, remember that Crash Bandicoot was made
at a time when devs weren't sure how mouse-look should work.

~~~
talmand
Well, mouse look was already in use in shooters at that point; it was well
known. Maybe you mean the behind-the-character third-person camera? Or just
the concept on consoles?

~~~
Slothrop99
Quake came out a few months earlier and had mouselook turned off by default
(and I (somehow) played the whole game with only the keyboard). So I would
agree that the concept wasn't well-known or established yet.

------
M4v3R
> Ultimately Crash fit into the PS1's memory with 4 bytes to spare. Yes, 4
> bytes out of 2097152. Good times

Wow. Just wow. One can only imagine the amount of hard work and sweat that
went into making this possible, and the pride of the developers when it
actually worked and the game became a success. Great story.

~~~
bryanlarsen
4 bytes to spare isn't surprising. When you're crunching things down, you stop
once it fits. I once worked on a project that was using 4x its memory budget
when the project was half done. When I was done, I had 13 free bits of space.
Yes, bits; I was doing a lot of bit picking to get it to work.

What was surprising is the lengths they went to to make things fit. A solver?
Wow. My problem was relatively straightforward in comparison: just bit packing
and silly amounts of code reuse: hey, these two completely unrelated routines
have the same 7-byte sequence; can I make it common?

Fun times, I miss projects like that.
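
For readers who haven't had to do this kind of thing: here's a tiny, purely
illustrative C sketch of that sort of bit picking, squeezing four made-up
fields into a single 16-bit word. The field names and widths are invented for
the example.

    #include <stdint.h>

    /* Pack four small, hypothetical fields into one 16-bit word instead of
     * spending a byte (or an int) on each one. */
    typedef uint16_t packed_state_t;

    #define PACK_STATE(mode, channel, level, flag)                    \
        (packed_state_t)(((mode)    & 0x7)  << 13 |  /* 3 bits */     \
                         ((channel) & 0x1F) << 8  |  /* 5 bits */     \
                         ((level)   & 0x7F) << 1  |  /* 7 bits */     \
                         ((flag)    & 0x1))          /* 1 bit  */

    #define STATE_MODE(s)    (((s) >> 13) & 0x7)
    #define STATE_CHANNEL(s) (((s) >> 8)  & 0x1F)
    #define STATE_LEVEL(s)   (((s) >> 1)  & 0x7F)
    #define STATE_FLAG(s)    ((s) & 0x1)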

~~~
stackcollision
Yes, this is kind of like "of course the remote is in the last place you look,
why would you keep looking once you found it?"

A little more than a year ago I was working on a very space-constrained
device: only 2KB of program flash (an attiny23, for those curious). I had to
use libusb with this, which ate up a huge portion of that space. My first shot
at the main program put me over the limit by almost 500 bytes. By the time I
was done, I had packed the program plus the USB library into the program flash
with 4 bytes to spare.

Man, was that fun.

------
kayoone
Pretty amazing what kind of skills game development required back then. One
side of me is happy that we have all these great tools today; the other is sad
because you hardly use this low-level stuff in today's software development
world.

We are solving different problems today, but the level of skill that was
required back then for a game that a single person could now build in Unity in
a few weeks is quite impressive.

~~~
fra
Some day I will write a blog post about the tricks we have to pull off to fit
all code & assets - or get animations running at 30FPS - on Pebble.

Trust me when I say this: low level development is alive and well at hardware
companies.

~~~
fragmede
Why wait?

Blog posts serve as great marketing these days, from showing off your
engineering team's technical prowess to recruiting others who hope to
participate in some really advanced problem solving.

The press also helps keep you in people's minds. With Google entering the
market with Android Wear, and of course Apple's entry, a couple of 'this is
how cool a Pebble watch is inside' posts would go a long way toward not
letting us forget you exist!

------
i_are_smart
> and this had to be paged in and out dynamically, without any
> "hitches"—loading lags where the frame rate would drop below 30 Hz.

This is what gets me. Modern game development seems to say "eh, a little
hitching won't hurt anyone", and then we wind up with games that run like
shit. Even on consoles.

~~~
gambiting
I work as a programmer in the games industry, and I feel like the problem is
made worse by artists and level designers who add more stuff without worrying
about performance. I can make a super-efficient physics system or model
loader, but that only means that someone somewhere is going to add more
particle effects or lights, or place so many props in the scene that the
PS4/X1 can't handle it. In fact, the separation is huge nowadays - I know our
engine inside out, but I personally wouldn't really know how to use the editor
to remove things from a scene. Likewise, a level designer will know how to put
props in a scene, but they will have no idea how the underlying systems are
wired together or what the performance cost of placing them is.

It's a complicated problem, which might have been made worse by the fact that
games are simply easier to make nowadays than ever before.

~~~
talmand
Back when I was somewhat involved in the gaming industry, a very, very smart
programmer told me the reason his game, with its fancy new, innovative,
advanced engine, didn't work out:

If you make it possible for level designers to make six-square-mile levels,
they all make nothing but six-square-mile levels.

~~~
Jare
The internal Commandos level editor had a bug where it reported double the
actual memory footprint for a level. When the boss found out, he ordered the
tool programmer to NOT fix that bug or tell anyone about it.

------
derefr
Compare this to the Super Nintendo's 128KB of working memory.

It's hard to tell which games used more or less of that memory; the big thing
about game complexity in that era was always ROM size limiting asset
complexity, rather than RAM size limiting computational complexity, so the
games released toward the end of the console's lifecycle were simply the ones
with the biggest ROMs and therefore the most assets, rather than the games
that used the available CPU+RAM most efficiently.[1]

Now I'm considering writing a memory profiler patch for a SNES emulator, to
see how much of the 128KB is "hot" for any given game. I would bet the
hardest-to-fit game would be something like SimCity or SimAnt or Populous.

On the other hand, the SNES also had "OAM" memory—effectively the 2D-sprite
equivalent to GPU mesh handles. And _those_ were a very conserved resource—I
think there was space to have 128 sprites active in total? Developers
definitely had problems fitting enough live sprites into a level. Super Mario
World's naive answer was to basically do aggressive OAM garbage-collection,
for example: any sprite scrolled far-enough offscreen ceases to exist, and
must be spawned again by code. Later games got more clever about it, but it
was always a worry in some form or another.
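
Purely as an illustration of that despawn idea (constants and struct layout
invented here, not taken from Super Mario World), the check looks roughly like
this in C:

    #define MAX_HW_SPRITES 128
    #define DESPAWN_MARGIN  64      /* pixels past the screen edge */

    typedef struct {
        int active;
        int x, y;                   /* world coordinates */
    } sprite_t;

    static sprite_t sprites[MAX_HW_SPRITES];

    /* Free the slot of any sprite that has scrolled far enough off-screen;
     * level code has to spawn it again if the player scrolls back. */
    void despawn_offscreen(int camera_x, int screen_w)
    {
        for (int i = 0; i < MAX_HW_SPRITES; i++) {
            if (!sprites[i].active)
                continue;
            if (sprites[i].x < camera_x - DESPAWN_MARGIN ||
                sprites[i].x > camera_x + screen_w + DESPAWN_MARGIN)
                sprites[i].active = 0;   /* slot is free for new spawns */
        }
    }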

\---

[1] There were also those that used expansion chips to effectively displace
the SNES with something like an ARM SoC in the cartridge, once that became
affordable. It's somewhat nonsensical to talk about how much SNES system
memory some games used, because they came with their own.

------
codesushi42
Pretty awesome. Goes to show the great lengths developers went to even on
state-of-the-art hardware at the time.

And the PS1 wasn't even the worst of it. The Sega Saturn and N64 were both
considerably more difficult to develop for. And the PC market was terribly
fragmented and had a high rate of obsolescence.

This stuff can give you nightmares:
[http://koti.kapsi.fi/~antime/sega/docs.html](http://koti.kapsi.fi/~antime/sega/docs.html)

~~~
Grazester
I sometimes wonder what could have been had Sega released proper documentation
and a decent dev kit earlier for their difficult-to-program-for Saturn. It was
only later in its life that Sega started to utilise the second processor.
Shenmue was originally being worked on for the Saturn. How's that for
ambitious? It was rumoured to require the 4MB RAM cart. That I believe 100%.

[https://www.youtube.com/watch?v=foZUcPQAMvg](https://www.youtube.com/watch?v=foZUcPQAMvg)

~~~
codesushi42
Absolutely. I've seen the Shenmue video and it's stunning. It's more
impressive than anything that launched on the PS1 or N64.

> I sometimes wonder what could have been had Sega released proper
> documentation and a decent dev kit earlier

That is a complaint I have heard a lot, and it is valid. But ultimately I
think Sega shot itself in the face by even having a second CPU. Concurrency is
a hard problem, and game devs back in the mid-90s were certainly not up to the
task of utilizing a second CPU; they had enough on their hands with the
transition from 2D to 3D already. I have read that most games developed for
the Saturn only used one CPU.

Choosing quads over triangles was also a major blunder in the Saturn's design.
The list of Sega's mistakes with the Saturn is so lengthy that it's impossible
to think it had any chance of succeeding.

But I still play mine. :)

The sad thing is that while Sega did everything wrong with the Saturn, they
did everything /right/ with the Dreamcast and it still failed miserably. I
kind of think of them as the Commodore of the games industry: technology that
was ahead of the curve, awesome products, but ruined by terrible management
and stomped out by juggernauts (MS Windows, Sony PlayStation).

------
xedarius
Resources blowing out the memory was a common problem on the PS1. A common
trick was to bake the resources into the exe. I worked on a game where the
final disc was a directory of 25 exes. Each exe was a level; even the front
end was a separate exe. You could see that the size of each file was under
2MB, so you knew it would work. You would never, or hardly ever, dynamically
allocate memory on the PS1.

There was a lot of duplication, but the CD was a huge resource and memory was
thin on the ground. It also meant that we could use the CD for audio during
the game.

~~~
trumpete
As mentioned in the answer, there were levels that were a lot bigger than 2
megabytes. Their goal was to ensure that these bigger levels would seamlessly
appear to be in memory at all times.

------
Htsthbjig
I am waiting for Mr. Baggett's book on the Lisp internals of Crash Bandicoot.

I don't care about the game, I never liked it, but this man's Lisp code should
be on par with PG's.

~~~
dmbaggett
That would have to be Andy. I'm really kind of a hack when it comes to lisp
programming. :)

But yes, Andy's lisp code is certainly great -- all the more so because he
also wrote the lisp compilers that compiled it. :)

~~~
e40
He didn't use a commercial lisp? I thought he used Allegro?

~~~
dmbaggett
He did use Allegro, but only to host his own compilers.

------
iNerdier
Dave Baggett could write a book on the making of Crash and I'd buy it in a
second. There was something very special that went into making that game, and
I was just the right age to appreciate it.

~~~
dmbaggett
Thanks!

It's on my bucket list to write that book -- also including many humorous
tales from my 10+ years at ITA Software, and anecdotes from my current startup
([http://inky.com](http://inky.com)).

If I live long enough, that is. :)

------
ZenoArrow
Sounds impressive.

You might also be interested to know that the PS1 version of the first Ridge
Racer ran completely from RAM (aside from the music tracks):
[https://en.wikipedia.org/wiki/PlayStation_models#Net_Yaroze](https://en.wikipedia.org/wiki/PlayStation_models#Net_Yaroze)

~~~
jbrooksuk
Ridge Racer was such a great game and always felt so smooth to me and now I
know why. Thanks for this fact! :)

~~~
ZenoArrow
You're welcome. :-)

------
kozak
Old console games are great examples of how creativity benefits from
constraints.

~~~
Someone1234
Or that survivorship bias is a real thing. People only remember the good games
that escaped obscurity over tens of years, while ignoring the majority of
terrible games that didn't.

A lot of people like to claim "old games used to be better!", but pick any
month of the 1990s and look at the newest releases for that month; I bet for
the average month there is maybe one title you've even heard of.

------
Kenji
_(Incidentally, this problem—producing the ideal packing into fixed-sized
pages of a set of arbitrarily-sized objects—is NP-complete, and therefore
likely impossible to solve optimally in polynomial—i.e., reasonable—time.)_

Aren't there polynomial-time algorithms that approximate to within a certain
percentage of the optimum?

~~~
riffraff
yes[0], the author mentions "first fit" which presumably is "first fit
decreasing" which is one of those. His approach was, from what I understand,
to try a few approximate techniques and choose the best result without trying
to run an exact algorithm with unbounded time.

[0]
[https://en.wikipedia.org/wiki/Bin_packing_problem#Analysis_o...](https://en.wikipedia.org/wiki/Bin_packing_problem#Analysis_of_approximate_algorithms)
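
For anyone curious what "first fit decreasing" actually does, here is a
minimal C sketch under the article's 64K-page assumption. The function name
and error handling are invented for illustration; it is obviously not the real
packer.

    #include <stdlib.h>

    #define PAGE_SIZE 65536

    static int by_size_desc(const void *a, const void *b)
    {
        return *(const int *)b - *(const int *)a;
    }

    /* Sort objects largest-first, then drop each one into the first page
     * that still has room, opening a new page when none fits.  Returns the
     * number of pages used, or -1 if a single object exceeds a page. */
    int pack_first_fit_decreasing(int *sizes, int n)
    {
        int *free_space = malloc(n * sizeof *free_space); /* worst case: one page per object */
        int pages = 0;

        qsort(sizes, n, sizeof *sizes, by_size_desc);

        for (int i = 0; i < n; i++) {
            if (sizes[i] > PAGE_SIZE) {
                free(free_space);
                return -1;
            }
            int p = 0;
            while (p < pages && free_space[p] < sizes[i])
                p++;                              /* first page with room */
            if (p == pages)
                free_space[pages++] = PAGE_SIZE;  /* open a new page */
            free_space[p] -= sizes[i];
        }
        free(free_space);
        return pages;
    }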

~~~
dmbaggett
Exactly. You said it better than I. :)

------
listic
I wonder if anyone controls the physical layout of bytes on the disk, at least
for things like installing large software packages on an HDD, or a major
release of an operating system on a DVD. It doesn't look likely: every time I
install Ubuntu, I feel the process could be made much faster.

~~~
cnvogel
At least for Windows there's now a way to install the OS so that it occupies
just one contiguous large filesystem image containing the compressed install
files, and stores only the changes to this frozen filesystem image in the
"traditional" way. It's called "wimboot".

[http://labalec.fr/erwan/?p=1078](http://labalec.fr/erwan/?p=1078)

That way, installing the OS mainly consists of copying this huge image file,
so the "physical layout of bytes on the disk" will mostly be fixed.

I'd guess that this could also be replicated in Linux, but personally I don't
know if this is being done. I usually "debootstrap" or "pacstrap" my installs
from a bootable USB stick :-).

~~~
shubb
Sure. On one project we built a custom Linux distribution by mounting a file
as a loopback filesystem and copying the right files in. We installed a GRUB
boot sector at the start. Then we copied it as a file onto a simple live USB.
The live USB also contained a script that used dd to copy the data from that
file over the start of the first disk of whatever system you plugged it into.
Nice quick install USB, minimal work.

We were installing onto standard hardware, so it worked pretty well. But if
your hardware varies, this approach isn't great.

We could also have created a fresh filesystem on the device and untarred into
it, but we wanted to keep the SELinux attributes.

~~~
cnvogel
dd'ing to the target device and then growing the fs to the full disk extent
seems like a splendid idea, especially as Linux tends to replace quite a lot
of code over the usual course of updates...

For a cluster system I once built a netbootable system for reinstalls that
would restore a dump of the reference system's fs onto a node and adjust
hostnames, ssh host keys, etc. It was also quite fast at the time.

------
kayoone
Looks like the system RAM on PlayStations has increased by a factor of 2^4
(16x) with every generation: 2MB (PS1) to 32MB (PS2) to 512MB (PS3) to 8GB
(PS4).

~~~
userbinator
Following the same trend, the PS5 having 128GB of RAM seems a little
ridiculous now, but consider that the PS4 was released in 2013 and the PS3 in
2006, so perhaps by 2020 it really will have that much.

~~~
kayoone
Yeah, I thought about that too. My PC already had 8GB of RAM in 2009, and even
today that is still totally fine for most games. However, consoles usually
share RAM between the GPU and CPU, and high-end PC GPUs today have up to 8GB
of VRAM, which will only increase once 4K content becomes more commonplace. So
something like 64GB of shared RAM for a supposed PS5 does not sound as
outlandish as it may seem at first glance.

------
drzaiusapelord
> The PS1 had 2MB of RAM, and we had to do crazy things to get the game to
> fit.

That's what, about $80-100 worth of RAM back then? On a product that sold at
$299 (after the July 1995 price drop), that's incredible. Nearly a third of
your cost was RAM alone.

------
gavanwoolery
Crash Bandicoot was quite a technical feat, but also (perhaps less talked
about) a marketing feat. The series went on to sell ~50 million copies IIRC,
and it was also the best-selling PlayStation game of all time.

~~~
Mahn
Games that are well engineered tend to be good games for some reason :-)

~~~
gavanwoolery
See: Doom :)

------
leeoniya
Also a great read about packing:
[https://fgiesen.wordpress.com/2012/04/08/metaprogramming-for...](https://fgiesen.wordpress.com/2012/04/08/metaprogramming-for-madmen/)

submitted before:
[https://news.ycombinator.com/item?id=7739599](https://news.ycombinator.com/item?id=7739599)

~~~
nickpsecurity
That was a fun read. An insane solution I probably wouldn't have attempted.
Impressed that they threw together a C++ parser and translator that worked
that well in such a limited time.

------
tluyben2
I still play Crash (2/3, not 1) quite a lot on the OpenPandora and on my PS2
when I'm in my wood cabin. I still like it, and that's partly because of how
many constraints there were on the PS1 to get something 'big' like this out
there.

I keep hoping for a book with annotated source (including the Lisp)... Please
please!!

~~~
fit2rule
Crash on the OP is a real treat - it's also one of my favourite wastes of time
from the Pandora repo .. truly wonderful to have it so easily portable!

------
ParrotyError
Back in the days of the 8-bit micros there were some games that were
absolutely incredible for what they could pack into a tiny machine. Elite on
the BBC Micro (32K) was one, but being poor I had a Spectrum 128. There was a
helicopter simulator by Digital Integration for the 48K Spectrum called
Tomahawk which was particularly good, and another 3D game called Starglider,
which was originally for the Atari ST but had ports to the Spectrum 128 and
even the 48K Spectrum! I seem to remember an article in Sinclair User where
they interviewed the developers, who explained how they got it all to fit into
such a small machine. Self-modifying code and using parts of the frame buffer
to store code and data, IIRC...

------
sown
Time + Internet = A smaller world that doesn't forget

------
jkldotio
I recall reading that at one point game programmers were using portions of
their code as textures. Which reminds me of the story of Mel.
[http://www.catb.org/jargon/html/story-of-mel.html](http://www.catb.org/jargon/html/story-of-mel.html)

~~~
qrmn
Indeed, the old FTL game, Dungeon Master (an old RPG, its most direct modern
successor-by-inspiration being Legend of Grimrock) put some copy-protection
code in one of its sprites, where it hoped you wouldn't notice.

One part of it would repeatedly read one "weak" sector that had been
deliberately encoded incorrectly on the disk, in such a way that it would
usually read back inconsistently, and it would crash the game with an obscure
system error if the sector _didn't_ read differently.

Cute, but kind of a problem when you have a much better-quality floppy drive
which reads the weak sector consistently.

------
landmark2
confirmed: I'm a shitty developer

------
voltagex_
Found while looking for other articles about this:
[http://web.stanford.edu/group/htgg/sts145papers/jdelahunt_20...](http://web.stanford.edu/group/htgg/sts145papers/jdelahunt_2004_1.pdf)

------
kelvin0
I could read stuff like this 24/7 ... something visceral about plowing through
some ridiculous constraints and finally making history really gets to me ...

------
ermias
Great post! Unfortunately it's on Quora.

~~~
GolfyMcG
I don't mean this sarcastically, but what's wrong with Quora? I've always
found it to have pretty good content, but your comment seems to imply it's
assumed to be a bad place to read things. Why?

~~~
gergles
To read the rest of this comment, please log in or append a secret URL
parameter to the URL.

~~~
__z
What's the secret URL parameter?

~~~
codezero
?share=1, though it's not a secret, it was posted about when this change went
in:
[https://blog.quora.com/Making-Sharing-Better](https://blog.quora.com/Making-Sharing-Better)

Then again, who's going to find that old blog post :P

~~~
__z
Thank you!!

(I figured out you don't have to pass anything specific into the share
parameter; you can do "?share=poop" or even just "?share".)

~~~
codezero
Hah, nice :)

Also, this parameter used to (and maybe still does) set a cookie so that you
can browse around for the rest of your session without worrying about a
sign-up modal.

------
adibchoudhury
Gotta love Naughty Dog. Still leading the industry! Can't wait for Uncharted 4

------
znpy
Glorious tales of a forgotten past.

------
B0073D
For those of you who don't want to endure Quora's forced account setup:

Here's a related anecdote from the late 1990s. I was one of the two
programmers (along with Andy Gavin) who wrote Crash Bandicoot for the
PlayStation 1.

RAM was still a major issue even then. The PS1 had 2MB of RAM, and we had to
do crazy things to get the game to fit. We had levels with over 10MB of data
in them, and this had to be paged in and out dynamically, without any
"hitches"—loading lags where the frame rate would drop below 30 Hz.

It mainly worked because Andy wrote an incredible paging system that would
swap in and out 64K data pages as Crash traversed the level. This was a "full
stack" tour de force, in that it ran the gamut from high-level memory
management to opcode-level DMA coding. Andy even controlled the physical
layout of bytes on the CD-ROM disk so that—even at 300KB/sec—the PS1 could
load the data for each piece of a given level by the time Crash ended up
there.

I wrote the packer tool that took the resources—sounds, art, lisp control code
for critters, etc.—and packed them into 64K pages for Andy's system.
(Incidentally, this problem—producing the ideal packing into fixed-sized pages
of a set of arbitrarily-sized objects—is NP-complete, and therefore likely
impossible to solve optimally in polynomial—i.e., reasonable—time.)

Some levels barely fit, and my packer used a variety of algorithms (first-fit,
best-fit, etc.) to try to find the best packing, including a stochastic search
akin to the gradient descent process used in simulated annealing. Basically, I
had a whole bunch of different packing strategies, and would try them all and
use the best result.

The problem with using a random guided search like that, though, is that you
never know if you're going to get the same result again. Some Crash levels fit
into the maximum allowed number of pages (I think it was 21) only by virtue of
the stochastic packer "getting lucky". This meant that once you had the level
packed, you might change the code for a turtle and never be able to find a
21-page packing again. There were times when one of the artists would want to
change something, and it would blow out the page count, and we'd have to
change other stuff semi-randomly until the packer again found a packing that
worked. Try explaining this to a crabby artist at 3 in the morning. :)
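
(For illustration, a rough C sketch of that "try lots of orderings, keep the
best" idea - names invented, and certainly not the actual tool: shuffle the
object order, greedily first-fit it into 64K pages, repeat, and remember the
best run. It also makes clear why a "lucky" packing can be impossible to
reproduce once a single object changes size.)

    #include <stdlib.h>

    #define PAGE_SIZE 65536

    /* Greedy first-fit over the objects in their current order; returns the
     * page count, or -1 if the page budget is exceeded. */
    static int first_fit(const int *sizes, int n, int *free_space, int max_pages)
    {
        int pages = 0;
        for (int i = 0; i < n; i++) {
            int p = 0;
            while (p < pages && free_space[p] < sizes[i])
                p++;
            if (p == pages) {
                if (pages == max_pages)
                    return -1;
                free_space[pages++] = PAGE_SIZE;
            }
            free_space[p] -= sizes[i];
        }
        return pages;
    }

    /* Try many random orderings and keep the best page count found. */
    int stochastic_pack(int *sizes, int n, int max_pages, int tries)
    {
        int *space = malloc(max_pages * sizeof *space);
        int best = -1;

        for (int t = 0; t < tries; t++) {
            for (int i = n - 1; i > 0; i--) {     /* Fisher-Yates shuffle */
                int j = rand() % (i + 1);
                int tmp = sizes[i]; sizes[i] = sizes[j]; sizes[j] = tmp;
            }
            int pages = first_fit(sizes, n, space, max_pages);
            if (pages > 0 && (best < 0 || pages < best))
                best = pages;
        }
        free(space);
        return best;    /* -1 if no ordering fit within max_pages */
    }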

By far the best part in retrospect—and the worst part at the time—was getting
the core C/assembly code to fit. We were literally days away from the drop-
dead date for the "gold master"—our last chance to make the holiday season
before we lost the entire year—and we were randomly permuting C code into
semantically identical but syntactically different manifestations to get the
compiler to produce code that was 200, 125, 50, then 8 bytes smaller.
Permuting as in, "for (i=0; i < x; i++)"—what happens if we rewrite that as a
while loop using a variable we already used above for something else? This was
after we'd already exhausted the usual tricks of, e.g., stuffing data into the
lower two bits of pointers (which only works because all addresses on the
R3000 were 4-byte aligned).
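
(For readers who haven't seen the pointer trick before, a small C sketch of
the general idea, relying only on the stated 4-byte alignment; the helper
names here are invented, not from the Crash source.)

    #include <assert.h>
    #include <stdint.h>

    #define TAG_MASK 0x3u   /* low two bits are always 0 in an aligned pointer */

    /* Store a 2-bit tag in the low bits of a 4-byte-aligned pointer. */
    static inline void *tag_ptr(void *p, unsigned tag)
    {
        assert(((uintptr_t)p & TAG_MASK) == 0 && tag <= TAG_MASK);
        return (void *)((uintptr_t)p | tag);
    }

    static inline unsigned ptr_tag(void *p)
    {
        return (unsigned)((uintptr_t)p & TAG_MASK);
    }

    /* Strip the tag to recover the original pointer before dereferencing. */
    static inline void *untag_ptr(void *p)
    {
        return (void *)((uintptr_t)p & ~(uintptr_t)TAG_MASK);
    }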

Ultimately Crash fit into the PS1's memory with 4 bytes to spare. Yes, 4 bytes
out of 2097152. Good times.

------
hitlin37
Very good read.

------
mianos
Multiple orders of magnitude less artwork. Multiple orders of magnitude fewer
polygons in the models. That would explain a lot of the size difference.

~~~
MojoJolo
But it also had multiple orders of magnitude less memory, so they had little
or no room for excess.

