Hacker News

Honestly? I used to jabber on about this with regard to the still-distant future of actual nanotechnology ... we need to find the guys who wrote arcade video games in the 1980s and press them for their secrets before those brilliant tricks are lost to time. They did so much with so little!

My guess is that we will slowly approach this wall and spend a lot of time chasing incremental gains, trying to avoid the inevitable: the design of new chipsets with new instruction sets, new languages explicitly designed to take advantage of the new hardware, and then tons of advances in compiler theory and technology. On top of that, very tight protocols designed for specific uses.

I think we have layers upon layers of inefficiency, each built with what was at hand. All reasonable things to do in the short term, given the pressures of business. But at the end of the day we're still transmitting video over HTTP, of all things. Sure, we did it! But you can't tell me it's efficient, or even within the original scope of the protocol's design.

Naturally, I think the whole thing would run about a trillion dollars and take armies of geniuses, but it would at least be feasible, just ... it would require a lot of will. And money.




The secrets of 1980s video game programmers?

1) Hardware that doesn't change. One C64 is just like every other C64 out there. You knew what the hardware was, and since it didn't change, you could start exploiting undefined behavior because it would just work [1].

2) The problem domain doesn't change---once a program is done, it's done, and any bugs left in are usually not fixed [2]. The problem domain was fixed because:

3) The software was limited in scope. When you only have 64K of RAM (at best---a lot of machines had less; 48K, 32K, and 16K were common sizes), you couldn't have complex software, and a lot of what we take for granted these days wasn't possible. A program like Rogue, which originally ran on minicomputers (with way more resources than the 8-bit computers of the 1980s), was still simple compared to what is possible today (it eventually became NetHack, which wouldn't even run on the minicomputers of the 1980s, and it's still a text-based game).

4) The entire program is nothing but optimizations, which makes the resulting source code both hard to follow and hard to reuse. There are techniques that no longer make sense (embedding instructions inside instructions to save a byte), that can now make code slower (self-modifying code forces the instruction cache to be flushed), and that make the code hard to debug.

5) Almost forgot---you're writing everything in assembly. It's not hard, just tedious. At the time, compilers weren't good enough on 8-bit computers, and depending on the CPU, a high-level language might not even be a good match (think of C on the 6502---a horrible idea).

[1] Except, of course, when it doesn't. A game that hits the C64 hard on a PAL-based machine may not work properly on an NTSC-based machine because the timing is different.

[2] Bug fixes for video games started happening in the 1990s with the rise of PC gaming. Of course, PCs didn't have fixed hardware.

EDIT: Add point #5.


I have this theory that a lot of corporations know this but they don't want to be the pioneers who volunteer their money and man-hours, only for their competitors to then reap the fruits of their labour for free.

I can't prove it, but I intuitively feel there's a lot of spite out there. Many people are unhappy with the status quo but are also unhappy with the idea of sacrificing their resources for everybody else -- who will likely not only be ungrateful, but might try to pull an Oracle or Amazon and sue the creators over the rights to their own labour.

Things really do seem stuck in this giant tug of war game lately.



