HN discussion here: https://news.ycombinator.com/item?id=6654905 (with participation from the author, dmbaggett https://news.ycombinator.com/user?id=dmbaggett)
As to why I didn't link to the Quora answer? Because not everyone on HN has a Quora account. And in the past, Quora has sometimes erected a login-wall for non-users trying to view a post. Gamasutra does not.
It's interesting that they were concerned with dizziness and the camera, concerns which seem to have unfortunately evaporated in most 3d games made since, to their detriment.
EDIT: Crash Bandicoot on the other hand managed the camera extremely well without creating any motion sickness.
- Models were not "skinned" as was popular at the time. Some textures covered only the front part of the body, others the arms, etc. That made it possible to use very few colors per texture (16) and use palettes (a palette being a very small "texture" in the graphics memory). If the models were skinned they would've required all the colors used anywhere on the body, and would have produced other unpleasant effects (different sampling frequency, especially on the shoulders, etc.). Konami's character modeling is top-notch.
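For anyone unfamiliar with palettized (CLUT) textures, the lookup is roughly this - a generic sketch, not Konami's code; the struct, names and RGB values are all made up:

    #include <stdint.h>
    #include <stdio.h>

    /* A 4-bit palettized texture: each byte stores two texel indices, and
       the palette (CLUT) holds 16 colors, e.g. in 15-bit RGB like the PSX.
       Structure and names are illustrative only. */
    typedef struct {
        const uint8_t  *indices;   /* width*height/2 bytes, two texels per byte */
        const uint16_t *clut;      /* 16 palette entries                        */
        int width;
    } PalTexture4;

    /* Fetch texel (x, y): pick the right nibble, then look the 0..15 index
       up in the 16-entry palette.  Only 16 colors, but a different CLUT can
       be assigned per body part (face, arms, ...). */
    static uint16_t sample4(const PalTexture4 *t, int x, int y)
    {
        uint8_t byte = t->indices[(y * t->width + x) / 2];
        uint8_t idx  = (x & 1) ? (uint8_t)(byte >> 4) : (uint8_t)(byte & 0x0F);
        return t->clut[idx];
    }

    int main(void)
    {
        const uint8_t  px[]     = { 0x21, 0x03 };                      /* 4x1 texels */
        const uint16_t clut[16] = { 0x0000, 0x001F, 0x03E0, 0x7C00 };
        const PalTexture4 tex   = { px, clut, 4 };

        for (int x = 0; x < 4; x++)
            printf("texel %d -> color 0x%04X\n", x, (unsigned)sample4(&tex, x, 0));
        return 0;
    }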
- Music/Sound - this was an enigma for us. We were never given their internal sound mixer, but the popular Metal Gear tune was "mod"-like with very short samples - all of this + game effects fit in a 512KB audio buffer (ADPCM).
- The game used overlays for the executable part. About 600KB was a main/shared/common part, and if I'm not mistaken 100KB or a bit more was swapped (the overlay). The main part would declare entry-points to be reached, and the "swapped" overlays were like many .so/.dylib/.dll files that knew about the main part.
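In plain C, an overlay scheme like that boils down to a table of entry points that the resident code calls through, and that each loaded overlay fills in. A hypothetical sketch, with compiled-in "overlays" standing in for code actually loaded from CD:

    #include <stdio.h>

    /* Entry points the currently loaded overlay must provide.
       The resident "main" part only ever calls through this table. */
    typedef struct {
        void (*init)(void);
        void (*update)(int frame);
        void (*shutdown)(void);
    } OverlayEntry;

    static OverlayEntry g_overlay;      /* filled in after an overlay is loaded */

    /* In the real game the overlay is raw code copied from CD into a fixed
       address range; here two functions pretend to be one such overlay. */
    static void stage_init(void)     { puts("stage overlay init"); }
    static void stage_update(int f)  { printf("stage frame %d\n", f); }
    static void stage_shutdown(void) { puts("stage overlay done"); }

    static void load_stage_overlay(void)
    {
        g_overlay.init     = stage_init;
        g_overlay.update   = stage_update;
        g_overlay.shutdown = stage_shutdown;
    }

    int main(void)
    {
        load_stage_overlay();           /* on the PSX: read overlay image from CD */
        g_overlay.init();
        g_overlay.update(0);
        g_overlay.shutdown();
        return 0;
    }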
- A TCL-like language was used to script the game, the radio, traps/objects in the game, etc. Each character would have a "main"-like function that accepted (int argc, const char *argv[]) and handled the arguments from there (these came directly from the TCL scripts). Ah, and the whole thing used "C" only.
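That per-character entry point probably looked something like this in shape - the command names and handler here are invented, not the real script vocabulary:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Each scripted object gets a main-like handler; the TCL-ish script
       engine tokenizes a command line and passes it in as argc/argv. */
    static int guard_main(int argc, const char *argv[])
    {
        if (argc >= 2 && strcmp(argv[0], "patrol") == 0) {
            printf("guard patrols route %s\n", argv[1]);
        } else if (argc >= 2 && strcmp(argv[0], "alert") == 0) {
            printf("guard alert level %d\n", atoi(argv[1]));
        } else {
            printf("guard: unknown command\n");
        }
        return 0;
    }

    int main(void)
    {
        const char *cmd[] = { "patrol", "3" };   /* as if parsed from a script line */
        return guard_main(2, cmd);
    }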
- So 600KB+100KB leaves you about 1.0MB for objects, "scenario" files to be loaded, etc. Since our port was more or less "wrapping" the PSX libs as PC ones, we didn't have to change too much, just on the surface - a bit like patching here and there.
- The game used a tricky pointer hack: basically, on the PSX accessing a pointer with the bits above the 24-bit address set means reading it through the CPU cache, otherwise not (or maybe the other way around). This was used, for example, to indicate whether the C4 bomb was planted on the ground or on the wall, instead of keeping a boolean/bit flag for it. Possibly it was used for some more things. Since the address was 24-bit, that meant 16MB.
To work on Windows we had to ensure that we didn't go above 16MB (and the exe starts at 4MB); we also had all overlays for the game compiled in rather than doing the swapping as the game did, but even then we had plenty of space to fit. It's possible that we might've messed up some of the AI tweaks, but no one complained, and we were young and did not care. Then I wrote something to find all the places where these pointers were used and mask them out when they had to be read, while keeping that highest bit in there (okay, it's a bit like tagging, as I learned much later when I did some Common Lisp).
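The masking he describes is classic pointer tagging: a flag lives in address bits the hardware doesn't need, and gets stripped before the pointer is used. A sketch using simulated 32-bit PSX-style addresses (the bit positions and the C4 example are illustrative):

    #include <stdint.h>
    #include <stdio.h>

    /* Only the low 24 bits were a real address on the PSX (16MB), so a
       top bit could carry a flag for free.  Simulated here with 32-bit
       "PSX addresses" rather than host pointers. */
    #define PTR_TAG       0x80000000u
    #define PTR_ADDR_MASK 0x00FFFFFFu

    static uint32_t tag(uint32_t addr)       { return addr | PTR_TAG; }
    static int      is_tagged(uint32_t addr) { return (addr & PTR_TAG) != 0; }
    static uint32_t real_addr(uint32_t addr) { return addr & PTR_ADDR_MASK; }

    int main(void)
    {
        uint32_t c4 = tag(0x00123450u);     /* tag = e.g. "C4 planted on a wall" */

        printf("on wall: %d\n", is_tagged(c4));
        printf("addr:    0x%06X\n", real_addr(c4));   /* mask before use */
        return 0;
    }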
- As we couldn't do shit about getting the original mod-music working, we relied on a couple of then-popular MGS web-sites and "stole" from them the whole music piece, and other things which came in an audio "pre-rendered" form, and then played them directly from our game. Ah... So embarrassed!
- For my part, I'm really proud that I was able to do a global hack where I kept the fixed-point coordinates' sub-pixel precision, so our PC port did not "tremble" or "shake" like others to come. Basically, on the PSX when you draw a triangle the "chip" rounds all coordinates to integer pixels, and each vertex "sticks" to a concrete pixel - this creates a shimmering-like effect, and I was able to get around it.
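The difference is easy to see with numbers: the PSX GPU only accepts whole-pixel vertex coordinates, whereas keeping a few fractional bits (as the hack presumably did on the PC side) lets a vertex land between pixels. A toy sketch with illustrative values only:

    #include <stdio.h>

    /* The PSX GPU takes integer screen coordinates, so every projected
       vertex snaps to a whole pixel and "swims" as the camera moves.
       Keeping fractional bits (12.4 fixed point here) preserves the
       sub-pixel position. */
    int main(void)
    {
        float projected_x = 160.4375f;          /* ideal sub-pixel position */

        int snapped   = (int)(projected_x + 0.5f);         /* PSX-style: 160    */
        int fixed12_4 = (int)(projected_x * 16.0f + 0.5f); /* keep 4 frac. bits */

        printf("snapped to pixel : %d\n", snapped);
        printf("12.4 fixed point : %d (= %g px)\n", fixed12_4, fixed12_4 / 16.0);
        return 0;
    }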
- The other team-mates were able to get software/hardware rendering working (DirectX 3 I think, or was it 5?...). Konami used quite a lot of rendering tricks that were not available back then. For example the camo-suit basically used the framebuffer as a texture, from the location where the character was rendered - so it looked a bit like shimmering!
- Two lessons learned from it - We'd put much better high-res textures in for the eyes (hired someone from Texas to do it for us), but the idea was rejected by Hideo himself (by phone); he told us (through the interpreter) that during normal game-play the game did not have any eye-movement, so higher-res textures would look like crap, while with a blurry texture your own eyes won't see it as a problem - sometimes LESS really is better.
- Another was from my boss back then. We had to keep a very strict frame-rate - I think 30fps - otherwise some things were not properly emulated. On some older machines we had the fps going below 15, due to the actual rendering, not the game code - and since he had experience, he simply said: we'll just skip drawing the frame then, to gain some time. Now that seemed like something that should not work, but it did - and saved us from trying to do non-constant frame-rate hacks.
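The trick amounts to a fixed-timestep loop that keeps simulating at 30Hz and only drops the rendering when it falls behind. A rough sketch, with a simulated clock and a deliberately slow "renderer" standing in for the real platform calls:

    #include <stdio.h>

    #define STEP_MS  33          /* ~30Hz simulation step              */
    #define MAX_SKIP 5           /* cap so a slow machine can't spiral */

    /* Stand-ins for the real platform: a fake clock and a renderer that
       always takes 60ms, i.e. slower than one simulation step. */
    static unsigned g_clock_ms = 0;
    static unsigned now_ms(void)       { return g_clock_ms; }
    static void     update_game(void)  { printf("update @ %u ms\n", g_clock_ms); }
    static void     render_frame(void) { g_clock_ms += 60; }

    int main(void)
    {
        unsigned next = now_ms();
        for (int frame = 0; frame < 10; frame++) {
            int steps = 0;
            /* Always advance the simulation at the fixed rate... */
            while (now_ms() >= next && steps < MAX_SKIP) {
                update_game();
                next += STEP_MS;
                steps++;
            }
            /* ...and draw once caught up.  On a slow machine more than one
               update runs per drawn frame, so gameplay speed stays correct
               and only the displayed framerate drops. */
            render_frame();
        }
        return 0;
    }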
- Another minor tidbit. The game referred to its files/chunks/etc. by a 16-bit CRC, and since there were quite a lot of objects - almost 32000 overall - there could be collisions, but the way Konami solved it was by simply renaming an object, rather than changing the hash function or anything else. It puzzled us why some soldiers were called charaB or chara4 without other numbers, but when I got afraid of hash collisions and checked that there were none (for all objects in the game), it kind of explained itself.
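In other words, asset names were hashed down to 16-bit IDs and collisions were resolved by renaming the offending asset. A sketch of the idea - the CCITT polynomial is chosen arbitrarily here, and whatever CRC Konami actually used isn't known to me:

    #include <stdint.h>
    #include <stdio.h>

    /* A 16-bit CRC turns an asset name into a small ID.  With ~32000
       names in a 65536-slot space, collisions are plausible; the fix
       described above was simply to rename the asset until its CRC
       was unique. */
    static uint16_t crc16(const char *s)
    {
        uint16_t crc = 0xFFFF;
        while (*s) {
            crc ^= (uint16_t)(unsigned char)*s++ << 8;
            for (int i = 0; i < 8; i++)
                crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                     : (uint16_t)(crc << 1);
        }
        return crc;
    }

    int main(void)
    {
        const char *names[] = { "chara1", "chara4", "charaB", "sceneA" };
        static int seen[65536];

        for (size_t i = 0; i < sizeof names / sizeof *names; i++) {
            uint16_t id = crc16(names[i]);
            printf("%-7s -> 0x%04X%s\n", names[i], (unsigned)id,
                   seen[id] ? "  (collision: rename this asset!)" : "");
            seen[id] = 1;
        }
        return 0;
    }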
- Who knows how many other treasures we did not discover. All in all, working on it made me love the "C" language more than "C++" back then. The code had only Japanese comments, and early on I wrote a very simple translator - using the offline Star Dictionary (I'd downloaded it from somewhere). While it was not much use apart from on really weird-to-understand (at first) code or algorithms, it also uncovered things like "CONTINEKU" "METARU GIRU SORIDU" (Continue, Metal Gear Solid), and at first I was like... are these folks writing English with Japanese symbols?
- They had a dedicated "optimization" programmer - he basically went through the code, found the hot-spots and turned them into assembly (mainly model t-junction extrapolations, splitting the model into pieces to fit in the small 1KB fast cache, and a few others). Hopefully he kept the original "C" code, and it was easier for us to choose the right part here and there.
Lots of music on the PSX used a system like that, because that's a very natural fit for the PSX SPU. Tracker "modules" combine the sample data and tabulated sequencing data, but what you found more often on the PSX was separate sample wavetables and sequencing data closer to (i.e. literally converted from, and convertible back to) a MIDI format: it's smaller, timing-based, without all those pesky 00s wasting space. (It sounds better with the PSX reverb unit/buffer on top, of course.) It's actually very similar to what Minoru Akao did for the AKAO sound engine for the PSX Final Fantasy games, for example.
What did you think of the multi-tasking kernel/DMA bit in the "main" binary? (Or did you just remove that?)
By the way, the VR missions mentioned above were released as a separate add-on disc in many regions (rather than the later release Integral which the PC port was). If you do happen to have an original and can't play it on a PS2/PS3 because it doesn't recognise that the 'lid' is open (because it's a tray/slot-loader)... try launching the other executable, it runs fine :)
Konami were very late with delivering their (MTS?) system that was their audio/tasking thing; we were not given anything in advance. As such we just found in the code where the samples and music had to be played, and as I said above we "stole" (downloaded) the music data from the web-sites that had it (not sure how they got all the effects, or it could be that we also found some waveforms in the source package).
I think at some point the "radio-codec" (okay, that's the actual Radio that Snake uses to talk to the others) used this system - maybe a bit like fibers/threads: when a message comes in, it switches. I'm not sure what exactly I did, and how well I understood it (threads/fibers were not my thing back then... that much), but we got it working.
You can decode the audio bits - it's just Sony's special version of ADPCM - or use a cartridge (for those PlayStations old enough to have a cartridge port) and read out the SPU memory over X-Link. You didn't even need a debug model to do it (although you did need a handy parallel port and the ability to bit-bang, or run a DOS program).
The CODEC used CD-XA Mode 2 Form 2 (2352-byte sectors with fewer error-correction layers, leaving more room for data) ADPCM-compressed streaming audio, 1 of 8 channels, at a relatively low sample rate - which works fine for speech. Lots of PlayStation games used the same basic technique for music and voices (as well as FMVs, although the bulk of that data would have been MDEC-compressed video).
Oh, these were some exciting times! - the whole system was there, open for you to see (at least at the software level, and to some extent the HW).
Could this be part of the reason why I didn't like the look of the GameCube port as much as the PS version?
For anyone that enjoys it, here is some great art done with 8-bit palette cycling (cycling a dozen or more colors to achieve animation) - http://www.effectgames.com/demos/canvascycle/ - (select other bitmaps too)
It's pretty normal for Japanese people to write English words in katakana, especially in things like games. Many program menus are perfectly readable by English speakers if you can read katakana. It's something taught in every Japanese school, so being skilled in it makes you look intelligent.
It was probably closer to KONEKUTTO and METARU GIRU SORIDO though.
English loan words in Japanese are so fascinating to me. Here's an example: "limited slip differential" -> リミテッド・スリップ・デフ (rimiteddo surippu defu)
(The ・ is used to separate foreign words/names when a Japanese speaker would not be able to figure it out)
This must be how Romance-language speakers feel when they see their words modified and incorporated into English.
If you want to learn the Katakana syllabary, try this website I found recently: http://katakana.training
There's also http://hiragana.training for the other syllabary.
Sometimes it really takes imagination. I have a family member who has an arcade game labeled "Hangly Man" (a Pac-Man clone). It took quite a while for it to dawn on me to reverse that back to kana (HANGURI) and figure out that it was meant to be "Hungry Man."
That is quite amusing! I think the hardest word I've found for Koreans and Japanese to say is "parallel".
I found this video where you can see the effects of the sub-pixel vertex precision issue: https://www.youtube.com/watch?v=HrFcYbwz_ws
Early on, due to my porting libraries, I introduced a severe bug where the internal timer was 10x (or 100x?) faster, causing issues for loading/saving; this was resolved just weeks before shipping thanks to the other awesome Ukrainian programmer. It was also a lesson for me to be less cocky, and to take the blame sometimes.
The IHRA Drag Racing game's PC version had a full simulation of the engine (valves, torque) - e.g. for any configuration it would calculate the torque/gear-ratio tables right away (okay, I was never a car buff, and my memory is very short here), but essentially there was an algorithm which, I later learned, certain car-tuning services were using to adjust real cars! - I mean there were DOUBLES and lots of Fortran-looking code written in "C".
Where we failed was trying to reuse this code on the PSX. First there was no hardware floating point, and what took 10 seconds of calculation on the PC took 45 minutes on the PS1 - unacceptable.
Again our boss came up with a crazy idea - why don't we precalculate some values and store them on the CD - way more limited than the PC, but still something. Not sure how the values were chosen, but overall the game was not a success - one magazine rated it as one of the worst PSX games ever... I left the project early on, as I felt it wasn't doing well (and have felt really bad about it ever since, as I felt like I was deserting the person that took care of me and brought me to the US) - http://www.mobygames.com/game/playstation/ihra-motorsports-d...
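Precalculating on the PC and interpolating on the console looks roughly like this in general terms - a hedged sketch with made-up torque numbers and table layout, not the game's actual data:

    #include <stdio.h>

    /* Offline (on the PC, with doubles and time to spare) the expensive
       engine model is evaluated at a grid of RPM points and the results
       are burned to the CD.  At runtime the PS1 only interpolates
       between table rows in cheap integer math. */
    #define TABLE_POINTS 9
    static const int rpm_axis[TABLE_POINTS]  = { 1000, 2000, 3000, 4000, 5000,
                                                 6000, 7000, 8000, 9000 };
    static const int torque_nm[TABLE_POINTS] = {  120,  180,  230,  260,  270,
                                                  265,  240,  200,  150 };

    /* Linear interpolation along the precomputed curve. */
    static int torque_at(int rpm)
    {
        if (rpm <= rpm_axis[0])                return torque_nm[0];
        if (rpm >= rpm_axis[TABLE_POINTS - 1]) return torque_nm[TABLE_POINTS - 1];
        for (int i = 1; i < TABLE_POINTS; i++) {
            if (rpm <= rpm_axis[i]) {
                int span = rpm_axis[i] - rpm_axis[i - 1];
                int frac = rpm - rpm_axis[i - 1];
                return torque_nm[i - 1] +
                       (torque_nm[i] - torque_nm[i - 1]) * frac / span;
            }
        }
        return 0;   /* unreachable */
    }

    int main(void)
    {
        printf("torque @ 3500 rpm: %d Nm\n", torque_at(3500));
        return 0;
    }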
Regarding engine simulation, that's a trip that it was useful enough that mechanics were using it. Kind of disturbing they were using it, though... Yeah, precalculation is kind of the go-to way to deal with this sort of thing on many resource- or performance-constrained systems. Always worth remembering.
I looked up the game on eBay. I can't get a consistent price because everyone starts at $10 and works down from there. Still worth somewhere from $1-9 plus shipping. Your worst project is at least helping people with bill money. Not the worst outcome. ;)
You mean "Fortunately he kept the original "C" code?
Great read anyway!
Wow. Just wow. One can only imagine the amount of hard work and sweat that was put into making this possible. And the pride of developers when it actually worked and the game has become a success. Great story.
What was surprising is the lengths they went to to make things fit. A solver? Wow. My problem was relatively straightforward in comparison: just bit packing and silly amounts of code reuse: hey, these two completely unrelated routines have the same 7-byte sequence; can I make it common?
Fun times, I miss projects like that.
A little more than a year ago I was working on a very space-constrained device: only 2kb of program flash (an attiny23 for those curious). I had to use libusb with this, which ate up a huge portion of that space. My first shot at the main program put me over the limit by almost 500 bytes. By the time I was done, I had packed the program + the usb lib into the program flash with 4 bytes to spare.
Man, was that fun.
The character of Mew was meant to be referenced through the story, but not actually catchable. Considering that you can only get Mew by exploiting bugs in the game, it's pretty believable that it wasn't added above the radar.
Besides some of the features, there is no way in hell a Commodore 64 would be able to handle the textures, input, and UI of an app like that.
And while you may think that those things are useless, to the overwhelming majority it's the reason why they use that app at all.
Besides, why not use 33MB of memory? I think the number of people that would actually benefit from being able to have 30MB more memory (while using the clock app) on their Android devices is literally 0. Plus being able to be somewhat wasteful with memory provides tons of benefits. It speeds development time, it reduces CPU usage via caching (which reduces battery consumption), it allows higher res textures and a nicer UI, it allows easier-to-maintain code, and tons of other little benefits.
Multi-touch is in the system libraries; shadows, faux-3d layering and animations, too; the alarm system and every timezone imaginable are in the system; automatic home time is in the system.
The app uses 33MB of storage on disk. Only the app, none of the above mentioned libs.
The Facebook app is nowadays 159MB. That’s 120 floppies. For a single app.
Also, the Facebook app (on my Android device at least) is 40.36MB...
But accounting for assets, multi-language translations, functionality, and the fact that it contains a bunch of multi-platform code (multi-platform meaning Android with Google services, Android with Amazon's stuff, Android with none of that, etc... which means it can't depend on a lot of "system" libraries that might not be there), more than 40MB isn't even that bad. That being said, the Facebook app is a bit of an outlier, with most apps being in the 5MB range.
This app already takes over 8MB of storage space and 34MB of RAM. And the app has only 5 images in it overall, uses no Google services, and has only about 25k LOC.
We are solving different problems today, but the level of software development skills required for a game that today could be done by a single person in Unity in a few weeks is quite impressive.
I don't work in games any more, but on the last title I worked on (Forza Horizon, Xbox 360), one of my colleagues engaged in a very similar exercise in order to allow data for the open world to be streamed in quickly enough for a car (travelling at potentially 150+ mph) to drive through the world without having to wait for loads, whilst streaming direct from the DVD (we weren't allowed to install to the HDD).
Given that the world was open and you could drive in pretty much any direction, trying to pre-load the right tiles of the world was difficult, and seek times made it tough to bring stuff in if it wasn't all packed together. However we were almost at the limit of DVD capacity so we couldn't do much duplication of assets to lower the amount of seeking required.
My colleague wrote a system that took about 8 hours overnight (running on a grid) to compute a best attempt at some kind of optimized packing. It did work though!
Trust me when I say this: low level development is alive and well at hardware companies.
Blog posts serve as great marketing these days, from showing off your engineering team's technical prowess, to recruiting others who hope to participate in some really advanced problem solving.
The press also helps maintain your existence in people's minds. With Google entering the market with Android Wear, and of course Apple's entry; a couple of 'this is how cool a Pebble watch is inside' would do well to not let us forget you exist!
I've seen many Unity projects with the worst code you could imagine still run close to 60 FPS when shipped.
Having an easy entry point means you're also going to get a lot of mediocre programmers using it. They seem productive in the first few weeks of the project but then quickly grind to a halt once they start changing the design, and end up with massive overtime hours while trying to debug and optimize the resulting mess.
It's made even worse by managers trying pseudo-agile learned in a two-day masterclass, adding even more procedures and overhead.
So yeah, you can make games today with less development talent than yesterday, but it's still going to cost you more than having skilled software engineers, and the resulting product will be a fraction of its potential.
On the other hand, a lot of the worst Unity dreck is horribly unoptimized and runs incredibly poorly, despite graphical simplicity and no real visual effects. If you tool around on Steam or YouTube you can find tons of examples - look up, for instance, Jim Sterling's "Squirty Play" series. Not every bad game on there has issues, but many do.
I'm not terribly familiar with the scene, but there are a variety of competitions for fun and art that operate within highly constrained environments. The demoscene and code golf come to mind (although ironically code golf in some sense has to be very high-level). There's also the security scene, which is quite bit-fiddly.
It's also the case that Go and particularly Rust are quite low-level system languages at their heart, so are presumably amenable to running in constrained environments.
E.g. Rhoscript: http://rhoscript.com/
Be sure also to check out Brian Goetz's excellent "Lambdas under the covers" talk, linked in the gist.
I don't think there is a dichotomy. It's always about smartly leveraging available resources. The problem with modern development, perhaps, is that there are these tempting high-level orthodoxies that often obscure the core matter at hand, i.e. focusing on some pointless candied abstract object interface rather than focusing design efforts on data flow and the data structures and algorithms used.
The need for low-level optimization has most definitely not vanished. When program data fits into local RAM, the bottleneck moves into cache optimization.
Yeah, devices might get bigger storage every year, but the reason why Google, Apple and Microsoft do talks about package size at their developer conferences is that size is the number one reason for people to avoid installing apps, or to choose which one to remove.
Also, given how the app life cycle works on mobile platforms, big apps are killed all the time when they go into the background.
All iOS devices except the iPad Air 2 have less than 2GB RAM (most 512MB). Android 1-4 devices often have less than 1GB RAM. It's common that only 1-3 apps can stay in RAM, depending on the platform and the apps' memory usage (foreground apps, not background services).
Applications/games in the Win95/PS1/N64 era were coded a lot more efficiently. Back then, a common Win95a PC had 4-8MB RAM (high-end was 32MB).
Any realistic setup had 8 MB RAM or more.
Applications at that time also didn't support i18n, didn't anti-alias fonts, and had low-res, low-color assets that were enough at 320x200(240)/640x480 resolution.
The amount of RAM needed has to do with the assets used by the code, not the code itself. The code itself is minuscule.
And no, Android phones do not have 4 GB RAM. The low end has 512 MB, with many phones in the 1-1.5 GB range, and the 2015 flagships have 3 GB. (Nexus 5 and 7 have 2 GB; Nexus 6 has 3 GB.) All that without swap (where would you swap to? To flash?). While most modern 32-bit ARM CPUs do come with LPAE, Android does not support that, so going above 4 GB will have to wait for ARMv8.
Lots of industrial applications run embedded Java with a few KBs and acceptable performance for their use cases.
javac has nothing to do with Dalvik or ART.
Prior to this year, javac compiled the Java code to .class files and then dx translated the Java bytecode in the .class files into Dalvik bytecode in a .dex file, with some simple dedupe optimizations.
Only this year did the Android build system switch to Google's own compiler.
Then make little drawings about which piece of Android is converting intermediate code representation into native CPU instructions.
For brownie points, compare the quality of generated assembly code between Hotspot, Dalvik and ART for the same unmodified jar file.
I gather from your response that you've realized you were wrong about Android not using javac but were too proud to admit it. Don't worry, we can fix your pride problem with these tasks below:
1. Dalvik and ART don't take jar files as input, so it is impossible to get your brownie points. Learn why.
2. Oracle's Hotspot targets x86 and x86-64, and Dalvik and ART are mostly focused on ARM. Learn the difference between ISAs.
3. Hotspot and Dalvik make different tradeoffs between CPU and memory both in their choices of garbage collectors and in their JIT strategies. Think about why that would be.
4. The word "compiler" by itself refers to a program that translates source code into object code. Notably, an assembler is not usually considered to be a compiler, and JIT "compilers" were originally called dynamic translators for three decades, with JIT compiler only appearing in the 90s. Given that terminology background, figure out why most people would call javac a compiler but not Hotspot or Apple's Rosetta.
And yet failed to grasp the difference between frontend, backend and intermediate execution format.
> I gather from your response that you've realized you were wrong about Android not using javac but were too proud to admit it. Don't worry, we can fix your pride problem with these tasks below:
I don't have to acknowledge anything. Anyone knows that javac does not execute code on the Android platform. As such, talking about whatever influence it might have on runtime performance, besides peephole optimizations, constant folding and similar AOT optimizations, only reveals ignorance about the Android stack.
> 1. Dalvik and ART don't take jar files as input, so it is impossible to get your brownie points. Learn why.
Yes they do. Jar files get converted into dex files, which means the same file can be used as canonical input for both platforms.
Then again we are learning about Android aren't we?
> 2. Oracle's Hotspot targets x86 and x86-64, and Dalvik and ART are mostly focused on ARM. Learn the difference between ISAs.
Maybe you are the one that should inform yourself about Oracle's and certified partners' Java JIT and AOT compilers for ARM platforms.
Learn about the Java eco-system.
> 3. Hotspot and Dalvik make different tradeoffs between CPU and memory both in their choices of garbage collectors and in their JIT strategies. Think about why that would be.
Of course they make different tradeoffs. The ones made by Dalvik and ART are worse than the approaches taken by other Java vendors, hence why they generate worse code, which leads to bad performance.
Learn about commercial embedded JVMs.
>4. The word "compiler" by itself refers to a program that translates source code into object code. Notably, an assembler is not usually considered to be a compiler, and JIT "compilers" were originally called dynamic translators for three decades, with JIT compiler only appearing in the 90s. Given that terminology background, figure out why most people would call javac a compiler but not Hotspot or Apple's Rosetta.
Learn about the Xerox PARC documentation and its references to JIT compilers.
Or better yet feel free to dive into OS/400 documentation about its kernel level JIT compiler.
All of which go back a little earlier than the 90s.
"Instead, if the amount of free memory drops below a certain threshold, the system asks the running applications to free up memory voluntarily to make room for new data. Applications that fail to free up enough memory are terminated."
If the system requires memory, the ones with a higher memory footprint are the first to go. They aren't asked nicely, just killed.
This might have changed on newer versions though. I am typing this from memory.
Also, on the Watch there are time constraints on how quickly an app is allowed to execute.
Windows Phone also has similar constraints.
Besides, most mobile games are being played by the casual crowd. Games don't need graphics that push hardware limits to sell.
Any source by mobile OS that you can point to?
I am quite sure there are other contenders like home-grown engines, LibGDX, Marmalade, Cocos (all variants), SDL, MonoGame, DirectXTK, Project Anarchy, Apple's own SceneKit and SpriteKit,...
> Besides, most mobile games are being played by the casual crowd. Games don't need graphics that push hardware limits to sell.
Why do you think then that all major OS vendors are teaching devs how to reduce their package sizes? I can happily post links to such presentations; I just need to hunt them down again.
Game logic + Assets + Engine
Something has got to give if one is required to push the size down.
It doesn't matter how many frameworks are out there. If you hang around the game dev scene long enough, you'll see that most small devs are using Unity, and if not that, Cocos2DX. Just head over to Gamasutra, /r/gamedev, or browse Steam & itch.io and see the # of cross-platform mobile ports. Talk to devs; they are using Unity.
> Why do you think then all major OS vendors are teaching the devs how to reduce their packages sizes?
Reducing package size isn't really comparable to writing portions of your game in assembler. I don't consider that low level coding or pushing hardware limits.
Former IGDA member, Gamasutra subscriber and GDCE attendee here, hence why I asked for numbers.
> Reducing package size isn't really comparable to writing portions of your game in assembler. I don't consider that low level coding or pushing hardware limits.
It is not, but the goal of fitting as much code as possible in small packages is.
Maybe "former" is why. There are no numbers published to confirm or deny. It's apparent if you keep up with the community and ask developers what they use.
> It is not, but the goal of fitting as much code as possible in small packages is.
You're really talking about reducing the sizes of assets & included libs. That has more to do with optimizing download time than hardware performance, and nothing to do with low-level coding, where you're writing machine instructions without touching a higher level of abstraction. Not the same at all.
So, just an anecdote, kind of "in my neighbourhood...".
A typical engine controller ECU in a car might have 256KB of RAM (and maybe 2-4MB of flash).
Of course there exist very complex components in the category of microcontrollers, some of which even offer enough resources to run Linux, but if you stick to the $1-$5 range the specs are very limited.
Here are two examples, the first one costs around $3 and the second one is less than $1.
I develop on such platforms and even though there is an interesting challenge in programming these tiny processors and optimizing CPU cycles and memory usage all the time, in the long run it becomes quite strenuous because there is only low-level stuff and I miss the expressiveness and flexibility of more abstract languages.
A typical Google search from 2009:
"Meaning of Life: Approximately 72,000,000 Results (0,00000042 Seconds)"
A typical Google search today:
"Meaning of Life: Approximately 364,000,000 Results (0,62 Seconds)"
During my engineering days (circa 2005) I programmed the 8085, and our professor would give us all kinds of crazy assignments and small projects. That was the first taste of any genuine programming challenge I faced in my life. Immediately after that, programming in C felt a little boring.
Recently I worked on an embedded systems project which made me relive those days. I had to invent all kinds of crazy tricks to work around resource constraints.
Your true creativity is kindled when your resources are constrained in all manners, time or otherwise. Unfortunately you can't academically recreate the same experience.
This is what gets me. Modern game development seems to say "eh, a little hitching won't hurt anyone", and then we wind up with games that run like shit. Even on consoles.
It's a complicated problem, which might have been made worse by the fact that games are simply easier to make nowadays than ever before.
If you make it possible for level designers to make six square mile levels, they all make nothing but six square mile levels.
The difference is, sadly, that we don't control the assets at all. :(
Early access game devs pushing unoptimized releases should set up their release system betterer.
Backfilling existing code with ifdefs and dealing with compile breaks and other weirder things can be intimidating and time-consuming, with hard-to-define ROI, so I can empathize if someone doesn't do it.
Great game, except it has NES graphics and stutters like hell.
Wasteland 2 on my machine also ran really, really badly, it was unplayable (it was the first time I got pissed for kickstarting something).
Kerbal Space Program also has some performance issues, but not as bad as the previous two.
Then I go play some graphics-heavy game made by some studio that likes to make good tech, or play emulated Wii or PS2 games, and there are no issues and the games look awesome.
Garbage collection is often a big problem for real-time performance, especially on the old version of Mono that Unity has.
But even having had a worse experience than a day-1 patch (namely being unable to get the day-1 patch because it is 10GB and you only have 5GB for everything for a month), I wouldn't call it the worst thing ever to happen to game development.
It shouldn't really become acceptable to knowingly ship a subpar product with the attitude "we can always issue an update later."
There is an art in exploiting bugs in old games and working to glitch your way to worlds you aren't supposed to enter at that time. The kind of time and effort in finding these is really amazing. Finding and exploiting those bugs is an art form in itself. Take a look at this - https://www.youtube.com/watch?v=aq6pGJbd6Iw Skip to 12:15 for the real insanity.
Or, just like in the PS1 days, game developers still have to make trade-offs to meet drop-dead dates set by publishers.
Online enabled patches can allow developers to be a little more cavalier with the quality, as they push to build more features closer to ship date.
That just means that we get to deal with buggy crap while they tell themselves it's OK because they can ship another update.
(It is possible to do 60hz on the N64, but it's really hard)
I also wonder if they are doing the equivalent in modern consoles - pushing the limits with graphics, etc. where you simply can't avoid the hitch.
It would be good to hear the perspective of someone who works on these types of games.
Chances are they're not doing anywhere near the same thing with modern consoles; maybe in off-the-shelf engine code to get maximum FPS, but in the games themselves, not likely. Also because modern video games are millions of lines of code - you don't want to duplicate those tenfold by squeezing every bit of performance out of them. Maybe only in the most frequently accessed codepaths.
PS: Skyrim is an interesting case where there is a player-made patch that opens the cities up into the open world, where the original game has a loading screen. http://www.nexusmods.com/skyrim/mods/8058
The streaming model isn't simple though. You need to decide which assets to load, when, and when to discard them. You also have to consider how your level is designed, i.e. if you've got three tiles, and a player travels from tile a to b, and then b to c. When he moves from b to c, you can remove a from memory, but what if zone b + c is too big to fit, whereas a to b is okay.
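At its simplest, the decision loop looks something like this - a toy sketch; real streamers add look-ahead along the direction of travel, priorities, async reads and a memory budget:

    #include <stdio.h>
    #include <stdlib.h>

    /* Keep the 3x3 block of world tiles around the player resident,
       load missing ones, evict the rest.  Tile coordinates and world
       size are invented for the example. */
    #define WORLD_W 8
    #define WORLD_H 8

    static int resident[WORLD_H][WORLD_W];   /* 1 = tile currently in memory */

    static void stream_around(int px, int py)
    {
        for (int y = 0; y < WORLD_H; y++) {
            for (int x = 0; x < WORLD_W; x++) {
                int wanted = abs(x - px) <= 1 && abs(y - py) <= 1;
                if (wanted && !resident[y][x]) {
                    printf("load tile (%d,%d)\n", x, y);   /* kick off a read */
                    resident[y][x] = 1;
                } else if (!wanted && resident[y][x]) {
                    printf("evict tile (%d,%d)\n", x, y);
                    resident[y][x] = 0;
                }
            }
        }
    }

    int main(void)
    {
        stream_around(2, 2);   /* player starts at tile (2,2)               */
        stream_around(3, 2);   /* moves east: column 1 evicted, 4 loaded    */
        return 0;
    }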
A really interesting presentation was given at GDC this year from Insomniac games on streaming http://s3.crashworks.org/gdc15/ElanRuskin_SunsetOverdrive_St...
Also, when you get close there are many ways to hide the loading going on so it seems cleaner. Like a player going from planet A to planet B through a more limited space ship. On arrival they see a larger but still limited space port, giving the game time to load the new planet. Or even just boosting the glare when someone steps outside.
However, IMO these things can easily be overdone.
It's hard to tell which games used more or less of that memory; the big thing about game complexity in that era was always ROM size limiting asset complexity, rather than RAM size limiting computational complexity, so the games released toward the end of the console's lifecycle were just ones with the biggest ROMs and therefore most assets, rather than games that used the available CPU+RAM more efficiently.
Now I'm considering writing a memory profiler patch for a SNES emulator, to see how much of the 128KB is "hot" for any given game. I would bet the hardest-to-fit game would be something like SimCity or SimAnt or Populous.
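The patch could be as simple as instrumenting the emulator's WRAM accessors - a sketch with invented function names; a real emulator core has its own read/write hooks to wrap:

    #include <stdint.h>
    #include <stdio.h>

    /* Wrap the emulated console's work-RAM accessors so every read/write
       marks that byte "hot", then report how much of the 128KB a game
       ever touched. */
    #define WRAM_SIZE (128 * 1024)

    static uint8_t wram[WRAM_SIZE];
    static uint8_t touched[WRAM_SIZE];

    static uint8_t wram_read(uint32_t addr)
    {
        touched[addr % WRAM_SIZE] = 1;
        return wram[addr % WRAM_SIZE];
    }

    static void wram_write(uint32_t addr, uint8_t value)
    {
        touched[addr % WRAM_SIZE] = 1;
        wram[addr % WRAM_SIZE] = value;
    }

    int main(void)
    {
        /* Pretend the emulated game used a few regions of work RAM. */
        for (uint32_t a = 0x0000;  a < 0x2000;  a++) wram_write(a, 0xAA);
        for (uint32_t a = 0x10000; a < 0x10800; a++) (void)wram_read(a);

        unsigned hot = 0;
        for (uint32_t a = 0; a < WRAM_SIZE; a++) hot += touched[a];
        printf("hot bytes: %u / %u (%.1f%%)\n",
               hot, (unsigned)WRAM_SIZE, 100.0 * hot / WRAM_SIZE);
        return 0;
    }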
On the other hand, the SNES also had "OAM" memory—effectively the 2D-sprite equivalent to GPU mesh handles. And those were a very conserved resource—I think there was space to have 128 sprites active in total? Developers definitely had problems fitting enough live sprites into a level. Super Mario World's naive answer was to basically do aggressive OAM garbage-collection, for example: any sprite scrolled far-enough offscreen ceases to exist, and must be spawned again by code. Later games got more clever about it, but it was always a worry in some form or another.
 There were also those that used expansion chips to effectively displace the SNES with something like an ARM SoC in the cartridge, once that became affordable. It's somewhat nonsensical to talk about how much SNES system memory some games used, because they came with their own.
And the PS1 wasn't even the worst of it. The Sega Saturn and N64 were both considerably more difficult to develop for. And the PC market was terribly fragmented and had a high rate of obsolescence.
This stuff can give you nightmares:
>I sometimes wonder what could have been had Sega released proper documentation and a decent dev kit earlier
That is a complaint I have heard a lot, and it is valid. But ultimately I think Sega shot itself in the face by even having a second CPU. Concurrency is a hard problem, and certainly game devs back in the mid 90s were not up to the task of utilizing a second CPU. They had enough on their hands with transitioning from 2D to 3D already. I have read that most games developed on the Saturn only used one CPU.
Choosing quads over triangles was also a major blunder in the Saturn's design. The list of Sega's mistakes with the Saturn is so lengthy that it's impossible to think it had any chance of succeeding.
But I still play mine. :)
The sad thing is while Sega did everything wrong with the Saturn, they did everything /right/ with the Dreamcast and it still failed miserably. I kind of think of them as the Commodore of the games industry. Technology that was ahead of the curve, awesome products, but ruined by terrible management and stomped out by juggernauts (MS Windows, Sony Playstation).
There was a lot of duplication, but the CD was a huge resource and memory was thin on the ground. Also meant that we could use the CD for audio during the game.
I don't care about the game, I never liked it, but the Lisp code of this man should be on par with PG's.
But yes, Andy's lisp code is certainly great -- all the more so because he also wrote the lisp compilers that compiled it. :)
It's on my bucket list to write that book -- also including many humorous tales from my 10+ years at ITA Software, and anecdotes from my current startup (http://inky.com).
If I live long enough, that is. :)
A biography of the guy who did the R-Type port for the ZX Spectrum; it's a riot of a read and has a lot of the old info.
Might also be interested to know that the PS1 version of the first Ridge Racer ran completely from RAM (aside from music tracks):
A lot of people like to claim "old games used to be better!", but pick any month of the 1990s and look at the newest releases for that month: I bet for the average month there might be one title you've even heard of.
Aren't there polytime algorithms that approximate to a certain percentage of the optimum?
That way installing the OS mainly consists of copying this huge image file, hence the "physical layout of bytes on the disk" will mostly be fixed.
I'd guess that this could also be replicated in Linux, but personally I don't know if this is being done. I usually "debootstrap" or "pacstrap" my installs from a bootable USB stick :-).
We were installing into standard hardware so it was pretty good. But if your hardware varies this approach isn't great.
We could also have created a fresh filesystem on the device and untarred into it, but we wanted to keep the SELinux attributes.
For a cluster system I had built a netbootable setup for reinstalls that would restore a dump of the reference system's fs onto a node and adjust hostnames, ssh host keys, etc... It was also quite fast at the time.
Web installs are also much slower unless you have a reliable internet connection (which is not a given). Installing from a USB stick (preferably of the 3.0 variety in a 3.0 port) is indeed the best option for all but the oldest of machines (in which case you'll need a boot floppy to start the machine up with a bootloader capable of initializing the boot medium (USB, CD/DVD/BD, network) and booting from it).
That's what? About $80-100 worth of RAM back then? On a product that sold at $299 (July 1995 price drop), that's incredible. Nearly a third of your cost was RAM alone.
submitted before: https://news.ycombinator.com/item?id=7739599
I keep hoping for a book with annotated (including lisp) source... Please please!!
One part of it would repeatedly read one "weak" sector that had been incorrectly encoded on the disk in such a way that it would usually read inconsistently, and would crash the game with an obscure system error if it didn't read differently.
Cute, but kind of a problem when you have a much better quality floppy drive which read the weak sector consistently.
Steganography via rearranging executable code within a binary file so that the binary file looks most like a particular bitmap.
Then again, who's going to find that old blog post :P
(I figured out you don't have to pass anything specific into the share parameter; you can do "?share=poop" or even just "?share".)
Also, this parameter used to (maybe still does) set a cookie so that you can browse around the rest of your session without worrying about a sign up modal.
Here's a related anecdote from the late 1990s. I was one of the two programmers (along with Andy Gavin) who wrote Crash Bandicoot for the PlayStation 1.
RAM was still a major issue even then. The PS1 had 2MB of RAM, and we had to do crazy things to get the game to fit. We had levels with over 10MB of data in them, and this had to be paged in and out dynamically, without any "hitches"—loading lags where the frame rate would drop below 30 Hz.
It mainly worked because Andy wrote an incredible paging system that would swap in and out 64K data pages as Crash traversed the level. This was a "full stack" tour de force, in that it ran the gamut from high-level memory management to opcode-level DMA coding. Andy even controlled the physical layout of bytes on the CD-ROM disk so that—even at 300KB/sec—the PS1 could load the data for each piece of a given level by the time Crash ended up there.
I wrote the packer tool that took the resources—sounds, art, lisp control code for critters, etc.—and packed them into 64K pages for Andy's system. (Incidentally, this problem—producing the ideal packing into fixed-sized pages of a set of arbitrarily-sized objects—is NP-complete, and therefore likely impossible to solve optimally in polynomial—i.e., reasonable—time.)
Some levels barely fit, and my packer used a variety of algorithms (first-fit, best-fit, etc.) to try to find the best packing, including a stochastic search akin to the gradient descent process used in Simulated annealing. Basically, I had a whole bunch of different packing strategies, and would try them all and use the best result.
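The general shape of "try every strategy, keep the best packing" looks something like this - a toy sketch with made-up object sizes; the real packer's heuristics were more elaborate:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define PAGE_SIZE 65536          /* 64K pages */
    #define N_OBJECTS 12
    #define MAX_PAGES 64

    /* First-fit: drop each object into the first page with room, opening
       a new page when none fits.  Returns the page count. */
    static int first_fit(const int *sizes, int n)
    {
        int pages[MAX_PAGES] = { 0 }, used = 0;
        for (int i = 0; i < n; i++) {
            int p = 0;
            while (p < used && pages[p] + sizes[i] > PAGE_SIZE) p++;
            if (p == used) used++;
            pages[p] += sizes[i];
        }
        return used;
    }

    static int cmp_desc(const void *a, const void *b)
    {
        return *(const int *)b - *(const int *)a;
    }

    int main(void)
    {
        int sizes[N_OBJECTS] = { 40000, 31000, 27000, 22000, 20000, 18000,
                                 15000, 12000,  9000,  7000,  5000,  3000 };
        int order[N_OBJECTS];

        /* Strategy 1: objects in original order. */
        int best = first_fit(sizes, N_OBJECTS);

        /* Strategy 2: first-fit decreasing (sort big to small). */
        memcpy(order, sizes, sizeof sizes);
        qsort(order, N_OBJECTS, sizeof *order, cmp_desc);
        int ffd = first_fit(order, N_OBJECTS);
        if (ffd < best) best = ffd;

        /* Strategy 3: a stochastic pass: shuffle, retry, keep the best. */
        srand(42);
        for (int trial = 0; trial < 1000; trial++) {
            for (int i = N_OBJECTS - 1; i > 0; i--) {
                int j = rand() % (i + 1), t = order[i];
                order[i] = order[j]; order[j] = t;
            }
            int pages = first_fit(order, N_OBJECTS);
            if (pages < best) best = pages;
        }

        printf("best packing found: %d pages\n", best);
        return 0;
    }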
The problem with using a random guided search like that, though, is that you never know if you're going to get the same result again. Some Crash levels fit into the maximum allowed number of pages (I think it was 21) only by virtue of the stochastic packer "getting lucky". This meant that once you had the level packed, you might change the code for a turtle and never be able to find a 21-page packing again. There were times when one of the artists would want to change something, and it would blow out the page count, and we'd have to change other stuff semi-randomly until the packer again found a packing that worked. Try explaining this to a crabby artist at 3 in the morning. :)
By far the best part in retrospect—and the worst part at the time—was getting the core C/assembly code to fit. We were literally days away from the drop-dead date for the "gold master"—our last chance to make the holiday season before we lost the entire year—and we were randomly permuting C code into semantically identical but syntactically different manifestations to get the compiler to produce code that was 200, 125, 50, then 8 bytes smaller. Permuting as in, "for (i=0; i < x; i++)"—what happens if we rewrite that as a while loop using a variable we already used above for something else? This was after we'd already exhausted the usual tricks of, e.g., stuffing data into the lower two bits of pointers (which only works because all addresses on the R3000 were 4-byte aligned).
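That last trick relies on the fact that 4-byte-aligned pointers always end in two zero bits, so those bits can carry flags for free as long as they're masked off before the pointer is dereferenced. A modern-C sketch of the idea, not the original code:

    #include <stdint.h>
    #include <stdio.h>

    /* On the R3000 every word-aligned pointer has two zero low bits;
       the same holds for a 4-byte-aligned struct on a modern machine. */
    #define LOW_BITS 0x3u

    typedef struct { int hp; } Critter;

    static uintptr_t pack(Critter *p, unsigned flags)
    {
        return (uintptr_t)p | (flags & LOW_BITS);
    }

    static Critter *ptr_of(uintptr_t v)   { return (Critter *)(v & ~(uintptr_t)LOW_BITS); }
    static unsigned flags_of(uintptr_t v) { return (unsigned)(v & LOW_BITS); }

    int main(void)
    {
        static Critter turtle = { 3 };           /* static => suitably aligned */
        uintptr_t packed = pack(&turtle, 0x2);   /* two spare bits of state    */

        printf("flags: %u, hp: %d\n", flags_of(packed), ptr_of(packed)->hp);
        return 0;
    }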
Ultimately Crash fit into the PS1's memory with 4 bytes to spare. Yes, 4 bytes out of 2097152. Good times.