1987: Kid submits 7 Robotron bugfixes using only the machine code from the ROMs (robotron2084guidebook.com)
230 points by bluedino on July 10, 2013 | 65 comments


The most amazing thing about this to me is the description of the internals of Robotron... Given a lot of the 8-bit machine code I've read, it's simply astonishing to hear about a game of that era being so structured, rather than the more typical approach of the day: hardcoding data all over the place (some games even used the on-screen life counter as the actual "variable" holding the number of lives left, often with bugs like never range-checking the values) and hanging a basic event loop off the IRQ. A lot of the time this was seen as necessary in order to fit everything into the small amount of memory, and so even on larger machines like the Amiga a lot of programmers kept manually laying out data structures for many years...
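
To make the screen-counter trick concrete, here's a minimal sketch in C - the screen base and cell offset are invented for illustration, and the real locations varied per machine:

    #include <stdint.h>

    /* Hypothetical memory-mapped text screen (0x0400 happens to be the
       C64 default, but the address here is just for illustration). */
    static volatile uint8_t *screen = (volatile uint8_t *)0x0400;
    #define LIVES_CELL 39  /* arbitrary screen cell doubling as the variable */

    /* The on-screen digit *is* the lives counter: losing a life just
       decrements the character code, with no range check, so going
       below '0' turns the "variable" into a different glyph entirely. */
    void lose_life(void) { screen[LIVES_CELL]--; }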

Compare the recently released Prince of Persia code to that description, for example. I found the Prince of Persia code very clean and readable for the period when I read it, but still relatively typical of the style of 80's 8-bit machine code programs even if it looks like it avoided the dirtiest tricks.

A game from that era using dynamic memory allocation, multi-tasking and an object-oriented structure just sounds totally alien. Even Amiga games, running on a system with substantially more memory, and on an OS with substantial built-in support for all of it - with some honourable exceptions - mostly just disabled the multi-tasking entirely and hung everything off a loop tied to an interrupt handler, or at most an interrupt handler and a separate "main" thread.
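
For anyone who hasn't read the article yet: the structure it describes is roughly a linked list of per-object "processes" walked once per frame. A minimal sketch of the idea in C (names and layout are my guesses, not the actual Robotron code):

    #include <stdlib.h>

    /* Hypothetical per-object "process" record. */
    struct proc {
        void (*tick)(struct proc *self); /* behaviour, run once per frame */
        struct proc *next;               /* link in the active list       */
        int alive;                       /* cleared when the object dies  */
    };

    static struct proc *active;          /* head of the live-process list */

    /* One video frame: step every process, unlink the dead ones. */
    static void run_frame(void)
    {
        struct proc **pp = &active;
        while (*pp) {
            struct proc *p = *pp;
            p->tick(p);
            if (!p->alive) { *pp = p->next; free(p); }
            else            pp = &p->next;
        }
    }

Under a model like that, shooting a robot really is killing a process - its record simply gets unlinked from the list, as another comment below points out.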


That's a Eugene Jarvis and Larry DeMar game. Jarvis also co-wrote Defender, Smash TV, and other classics with DeMar. Eugene now teaches at DePaul in Chicago.

http://en.wikipedia.org/wiki/Eugene_Jarvis


Jarvis also owns/runs Raw Thrills, a game design company outside of Chicago.

http://www.rawthrills.com

DeMar owns/runs Leading Edge Design, a slot machine design company. He also cofounded Spooky Cool Labs, a social game design firm. Spooky Cool was sold to Zynga late last month.

http://www.spookycool.com


The 6809 is a very nice 8-bit CPU, designed with higher-level languages in mind---it made position-independent code easy, had four general-purpose index registers (all 16-bit, one doubling as the system stack register), two 8-bit accumulators that could be used as one 16-bit accumulator, and some rather advanced addressing modes.
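
To illustrate the accumulator pairing: the 8-bit A and B combine into the 16-bit D, with A as the high byte. A toy model in C (struct and function names are mine, purely illustrative):

    #include <stdint.h>

    /* Toy model of the 6809 accumulators. */
    struct m6809_accum {
        uint8_t a, b;   /* the two 8-bit accumulators */
    };

    /* D is simply A:B - A supplies the high byte. */
    static uint16_t get_d(const struct m6809_accum *r)
    {
        return (uint16_t)((r->a << 8) | r->b);
    }

    static void set_d(struct m6809_accum *r, uint16_t d)
    {
        r->a = (uint8_t)(d >> 8);    /* high byte -> A */
        r->b = (uint8_t)(d & 0xFF);  /* low byte  -> B */
    }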


As a kid, I used to adore the 6502. Now, after many years, having used 68k and MIPS assembler, I feel that the 6502 is just awkward to code in. Everything you do is so hardcoded and specialized for the task.

I wish the C64 had used a 6809 or z80 instead.


The 68xx range wasn't viable for low cost machines until it was too late and Commodore was already deeply invested (as in, it owned MOS).

Keep in mind that the 6502's entire reason for being was that Chuck Peddle had tried (and repeatedly failed) to get Motorola to accept the idea of pushing a low-cost design. At the time the 6501/6502 were introduced, MOS charged $20 per CPU while Motorola was charging $175 for the 6800. Motorola dropped their price to $69 almost instantly, but that was still a substantial percentage of the cost of a typical low-end machine.

On a machine with more memory it might have mattered, but for a machine constrained to 64KB, the 6502 instruction set was easy enough to work with - it's not like you could afford to use a large stack most of the time anyway, for example.


One thing I think is important to understand is that these machines, and this code, are still out there, working. People are still playing these games (MAME), and the emulation scene that has arisen from the fanboix fervour has really pushed a lot of the gems into the 21st century with new life and vigor. The Z80-coded "automatic game player" - i.e. hardware playing hardware - is a real hoot!

And last but not least, consider this: people are still writing amazing software for the older architectures, using modern principles, and learning astonishing new things about the "antique hardware", to boot! In my particular nook I have witnessed the Oric-1/Atmos machines (http://oric.org) - typically considered fairly "grotty" 8-bit machines of the 80's, never enough decent software, died early - come to new life with the discovery of never-considered color modes and graphics capabilities, as some of the modern scene's very elite hackers turn their eyes to the "old" machines and produce new hits!

I think a lesson in all of this is that the old machines still contain a lot of power. The Robotron code could probably be refactored into something else easily enough, if there was some intention behind doing it. There are a lot of 6809's still out there - presumably in landfills, it's true - but nevertheless one could do worse than to re-discover old machines in the modern era, were one to want to sharpen one's skills ..


In Commodore circles the Oric machines were often derisively called "doorstops", as that was the only thing we considered them good for as kids. Of course this was without ever having used one, much less having tried to exploit one to its limits, and was relatively mild compared to the Commodore fanboy hatred of Atari. The C64 is in a similar situation in terms of new discoveries, despite how well it was mapped out by the late 80's (I remember finally getting my hands on a copy of a hand-written diagram of exactly how many bus cycles the graphics chip "stole" from the CPU under what circumstances - essential to many demo effects that even many Commodore designers insisted were impossible until they were demonstrated...)

I'm not sure about 6809's, but the very similar MOS/WDC 6502 derivatives are still manufactured by the Western Design Center, founded by Bill Mensch (the co-designer of the 6502 together with Chuck Peddle, who was on the 6800 design team at Motorola for some time, though he joined after the main 6800 design was done): http://www.wdesignc.com/wdc/


If you want to read a bit more about Robotron, there's another good write-up by someone who did the same thing: reverse engineered Robotron. It's a much shorter article, but includes some C# representations of the process management:

Steve's Tech Talk: Robotron and OOP http://www.atalasoft.com/cs/blogs/stevehawley/archive/2006/0...


Having some insider knowledge of how Williams wrote their videogames, Steve's disassembly is the best presentation of how the Robotron architecture worked. This setup was used in almost all of the games that DeMar and Jarvis did for a decade or two to come, including the pinball operating system (which was pretty much this exact same process/task system). These guys always wrote in assembly. I don't think C came along at Williams/Bally/Midway until they switched to an X86/PC-based architecture sometime around 1998-1999.

Earlier on HN someone posted an article about a Tron-like light cycle game where the cycles would break out of the framebuffer and start to mess with program memory. What's cool about Robotron is that you are actually killing processes when you wipe out the robots. A true superuser!


There is an old programmer joke that goes "I can write BASIC code in any language!", which is a lament about people writing spaghetti code (the norm for early programs in BASIC) even when the language supported much more elegant constructs. But the inverse is also true (although people rarely complain about it :-): since ultimately you're compiling into machine code anyway, if you're writing in machine code you can write as elegantly as you could in the most beautiful high-level code. It's a function of the programmer, not the language.

There are people who write solid, supportable code in any language. Keep those people around, as they will ultimately create the most code with the least technical debt.


The original joke is "The determined Real Programmer can write FORTRAN programs in any language." It's from an essay written by Ed Post titled "Real Programmers Don't Use Pascal". It's before BASIC :)


Nitpick: it can't be from "before BASIC" if it mentions Pascal. BASIC is older than Pascal (1964 vs 1970, according to Wikipedia)

Also, http://en.wikipedia.org/wiki/Real_Programmers_Don't_Use_Pasc... (with links to the essay; look at the email address. That alone shows that we are talking of a real programmer) says it's from 1983.


I was tickled pink to find the core bits of SpriteCore, the game library I brewed up as a teenager, and have been hacking on idly ever since, implemented in a for-realz 1980s arcade game. In assembly.


Hey, nice of you to dredge that up. At the time that I owned a Robotron machine, I worked at Adobe and had easy access to a ROM burner, so joe and I pulled the ROMs and joe wrote a disassembler. We spent the next few weeks idly reading and commenting code in a shared text file. One of the really fun things was spotting the bitmap images in the output. They were 4bpp, so 2 pixels per byte - they were surprisingly easy to find, and once I did, I dumped the raw data from one set into a file and managed to convince Photoshop to open it. I set the colormap entries and I was looking at Hulks on a Macintosh IIci.
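
For the curious, unpacking "2 pixels per byte" looks something like this (modulo nibble order, which I won't swear to from memory):

    #include <stddef.h>
    #include <stdint.h>

    /* Unpack 4bpp pixels: two per byte, each nibble an index into a
       16-entry colormap. High-nibble-first is assumed here. */
    void unpack_4bpp(const uint8_t *src, uint8_t *dst, size_t nbytes)
    {
        for (size_t i = 0; i < nbytes; i++) {
            dst[2 * i]     = src[i] >> 4;    /* first pixel  */
            dst[2 * i + 1] = src[i] & 0x0F;  /* second pixel */
        }
    }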

A year later, I wrote an emulator for the sound board, and on the Mac I had at home it ran at about 60% real time (this was a 25MHz 68K based machine). That's a great number for code written in C, because it means that with some careful attention I could easily get it running better than realtime, especially if I wrote some of the routines in inline 68K and took advantage of the condition codes of the 68K matching up nearly exactly to the 6800's (thanks, Motorola!). Before I started that task, I ported the code to a PowerPC machine at work and it was running 4x real time, so I decided that Moore's law just won and assembly was a waste of time.

The model of execution was, again, fairly decent for the task at hand. The sound board was set up with either 5 or 6 data lines (depending on the rev) coming into a latch. When the host game wrote to the PIA, it would latch the value and cause an interrupt on the sound board. The sound board (which was a 1MHz 6808 with 128 bytes of RAM) would be lodged in a WAI (wait for interrupt) instruction. When it got an interrupt, it read the latch and then ran an enormous switch statement to decide what sound to make. The output was an 8-bit D/A converter, but it looked like most of the time they treated it like a one-bit DAC, by either writing 0 or FF. Most of the sounds were made by using frequency generators to drive other frequency generators - FM synthesis (more or less) - and the individual sounds would just set up parameters and kick off the sound generator. Most of them were designed to make sound forever (or close to it) or until they were interrupted by another request for a sound.
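
The "generators driving generators" idea boils down to something like this - a deliberately crude C sketch with invented constants, slamming the output to 0x00/0xFF the way the board effectively used its DAC:

    #include <stdint.h>
    #include <stdio.h>

    /* One phase accumulator (the modulator) wobbles the increment of
       another (the carrier); output is hard 0x00/0xFF samples. All
       constants are made up for illustration. */
    int main(void)
    {
        uint16_t carrier = 0, modulator = 0;
        for (int n = 0; n < 8000; n++) {              /* ~1 s at 8 kHz   */
            modulator += 37;                          /* slow generator   */
            carrier   += 1500 + (modulator >> 8);     /* driven generator */
            putchar(carrier & 0x8000 ? 0xFF : 0x00);  /* one-bit style    */
        }
        return 0;
    }

Piping the raw bytes into something like "aplay -r 8000 -f U8" gives a suitably warbly square tone.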

The two that really stood out in that set (and there are a ton of sounds on it that are not used in Robotron) were a crowd cheer, which was white noise with some envelope modifiers and some whistles mixed over the top, and a peal of thunder. You can hear each of these by dropping credits into a Robotron until they come up. IIRC, Jarvis was particularly proud of these.

Eventually, I modded the code to run as a CGI script and had it running on my website as RobotrOnline, which would let you set up the hardware lines that modeled the lines going in, set how many seconds you wanted, and when you submitted the form it spat back a WAV file meeting those parameters.


Cached link, because I couldn't get it to load: http://webcache.googleusercontent.com/search?q=cache:HIzQu_b...



That was absolutely crazy. He decoded a 48K assembly program on paper, while working as a security guard? Just amazing.


I don't want to diminish what he did, but consider that this was one of the most common methods of learning assembler programming in depth in that era. Most books on the subject were relatively primitive in the context of games and demo programming, and so if you didn't want to lag behind, you needed to figure out techniques by studying disassembled programs.

(EDIT: Just to add that this is in reference to the disassembling bit. Pulling eproms to dump them and studying them so intently you start to fix bugs the game designers hadn't managed to solve is certainly dedication well above the typical level)

Many large demos at least for 8-bit computers, and quite likely some games too, were written even without a symbolic assembler, but instead entered using a simple machine code monitor (think just one step above debug.com on MS DOS, and decidedly more primitive than programming directly in gdb...) using hardcoded address references instead of labels.

I'd actually been programming on the C64 for a couple of years before I got an assembler that supported labels that was good enough to use (I had some before that, but they wasted too much memory and/or were too slow to seem worth it) and from people I met I gather that's a fairly common experience.


The printout-and-highlighters method of reverse engineering was indeed common back then. It became even better known in 1982, when Don Lancaster wrote a book entitled Enhancing Your Apple II. In it was a chapter, "Tearing Into Machine-Language Code," in which he explained a method for using a disassembly printout and colored highlighters to quickly reverse-engineer machine code. This method soon spread beyond the Apple II community. For example, from a 1983 book review in Microcomputing Magazine:

    Software buffs are likely to vote for the third chapter ("Tearing
    Into Machine-Language Code") as the jewel of the collection. It is
    a unique contribution that applies to all machine languages, of
    which Apple is only one. Possibly no manual anywhere that describes
    the use of disassemblers and techniques for extracting hidden code
    can equal this 60-page adventure in decryption. Programmers would
    do well to look into this chapter to check on techniques they might
    have missed; commercial software vendors may want to know some
    things that hackers do to defeat them.

    Source: Microcomputing Magazine (October 1983) [1]

[1] http://archive.org/stream/kilobaudmagazine-1983-10/Microcomp...


Interesting... The Apple machines were pretty much unknown in large parts of the world outside the US, so as far as I'm aware I've still never seen an Apple II in person to this date, and I didn't come across Don Lancaster's stuff until a year or two ago either.

Europe was dominated largely by Commodore, with Sinclair, Atari and maybe Amstrad in "supporting roles", far ahead of the usual suspects of weirder machines, for pretty much the entire 80's, until the PC finally got the upper hand.

The computer shops near me in the early 80's at one point or another probably stocked about 20-30 distinct home computer brands, but I can't remember seeing any Apple products at any of them until the Macintosh.

It's fascinating how the view of home computing is often so completely different from the US vs Europe as a result.


Australian, checking in. I first saw a Commodore PET, and then an Apple II shop, in Perth WA around 1977-1982, competing with each other. Tandy had their TRS-80 shops too, but generally the Apple II was the first "real PC" I remember seeing in the neighborhood in the late 70's. In the early to mid 80's it was all C64, Atari and Amstrad for games, pretty well mixed, but a lot of the richer kids had big Apple II investments even into the 80's .. and there were of course a lot of other clones in the arena at that time, too.


Were there large import tariffs or something making it prohibitively expensive to get Apple computers from the US, or did Apple not market the machines internationally?

Countries such as the USSR and Brazil simply made their own Apple clones. I'd suspect Western Europe did not do so because of legal ramifications.


No, it wasn't down to tariffs. Keep in mind Commodore did immensely well in Europe despite being a US company (though from some point in time it was, in name, a Bahamas corporation for tax reasons).

It was many factors I think.

Apple indeed was not set up to handle international sales well, while Commodore had a huge international channel owing to a much longer history (the progenitor of Commodore Business Machines was founded in 1954).

But all indications are that Commodore (and probably Tandy) outsold Apple substantially in the late 70's in the US too. At that point Apple got the "hacker" market, Commodore focused on the business market.

It wasn't until the VIC 20 that Commodore started aggressively going after the consumer market, but the VIC 20 was a low end product, and it was probably first with the C64 that Commodore had a product going after some of the same market as the Apple II.

At this point Apple increasingly was the expensive choice, and the European market was apparently a lot more price sensitive. On top of that Apple had much more aggressively pursued schools in the US, and so I guess US parents might have seen the Apple II as a more "serious" choice at a time when Commodore was increasingly getting associated with games.

Commodore thoroughly trounced Apple in worldwide sales of 8-bit machines, and probably for quite some time in 16-bit models from soon after the introduction of the Amiga. But more and more of their output went to Europe for these reasons (eventually quite a lot of Commodore's machines were also manufactured in Germany), and also because Tramiel quickly realized that the European markets had higher margins, so if supply couldn't be ramped up quickly enough for certain models, Europe got priority.

Commodore also burned itself in the US, and its US dealer network, through an extreme price war with Texas Instruments and Tandy (pushing things so far that TI pulled out of the market and posted massive losses, but also massively hurting Commodore's profitability - likely a major factor when Jack Tramiel was ousted), and the damage it did to the goodwill of its US dealer network probably shouldn't be underestimated.

(If you're interested in the Commodore/non-Apple view of the early home computer market, "Commodore: A company on the edge" by Brian Bagnall is a fascinating read)


You've triggered a flashback. I now sorta recall using different colored highlighter pens to denote different kinds of things: loops, jumps, data, etc.


Not to derail, but wow! Thanks for pointing out those old Kilobaud issues on TIA.


TIA is amazing. There are also these sites worth knowing about for old computer magazines:

http://www.atarimagazines.com/ (Not just Atari stuff - also Compute! etc.)

http://www.bombjack.org/commodore/ (Mostly Commodore, but also some more generic 6502 oriented magazines)

http://www.atarimania.com/list-atari-magazines.html


Nasir Gebelli was a game programmer who created some of the classic early games for the Apple ][ (Space Eggs, Gorgon). He eventually went on to program Final Fantasy at Square.

When programming his games in 6502 assembly for the Apple, Gebelli used the built-in mini assembler from the command line. No text editor, and according to most accounts no printer either. All in his head, by hand.


I don't know the Apple mini assembler, but that sounds similar to what I meant.

We'd load a tiny "monitor" - which is what they were typically called, in the Commodore world at least - and get a primitive REPL with commands like ".D C000 C100" to disassemble addresses 0xC000 to 0xC100, or ".A C000" to get a prompt at address 0xC000, after which you could enter one instruction, press enter to have it instantly assembled, and get the next address (with no support for labels). If you needed to correct something, you'd press enter to exit back to the prompt, type ".A [address to correct] [new instruction]", and hopefully you wouldn't need to alter the length of the instruction, as then you'd have the fun task of remapping absolute addresses and copying the following code around.

Then when you wanted to test something, you'd do ".G C000" and hope your program didn't crash things so bad you had to load things back in...

Scarily enough it was common at least in demo circles to not even save before each test as it was too slow with a tape or floppy and so we just hoped we wouldn't lose changes too often from bugs overwriting our program etc.

Often not all variables would even be reinitialized on starting the program again, and so behaviour might differ between test runs... When we saved the programs it was often just a raw memory dump of the locations we'd assembled stuff to (e.g. .S "filename" C000 C3FF might save the 1K range from 0xC000 to 0xC3FF).

(I keep using C000 by habit, even after 25 years or so away from the C64 - 0xC000 to 0xCFFF was free space in between the BASIC ROM and memory mapped registers for the graphics and sound chips, and BASIC would not touch memory above 0x9fff, so putting stuff at 0xC000 was popular since you could e.g. load basic programs without wiping it)


The Apple II had a monitor you could reach via:

    ]CALL -151
and you could type a program such as:

    *300:AD 30 C0 88 D0 FD 4C 00 30
but while there was a disassembler (300 L) there was no assembler!

Fortunately, if you had Integer Basic, you could do:

    ]INT
to get to Integer Basic, then go into that monitor:

    >CALL -151
then jump into the "mini assembler" inside Integer Basic:

    *F666 G
and now you're in something resembling an assembler:

    !300:LDY#$0
    !302:JMP$309
    !305:JSR$FDED
    !308:INY
    !309:LDA $320,Y
    !30C:BNE $305
    !30E:JMP$FF65
    !$320:48 45 4c 4c 4f 20 48 41 
    !$328:43 4b 45 52 20 4e 45 57 
    !$330:53 21 00
    !$300G


  *300:AD 30 C0 88 D0 FD 4C 00 30
OWWW my ears! And I think you meant "4C 00 03" there. :)


Yup definitely. Good catch!


Nicely explained. And hello back at ya. (JSR $FF3A)


>I keep using C000 by habit

It's been over 25 years since I last programmed in Assembly on my Commodore 64 but I will never forget typing:

SYS 49152


Yes, 49152 (decimal for $C000) is the starting address of the 4KB of "protected" memory that we'd poke our assembler opcodes into. All of us C64 programmers (a loosely-used term in my case; I was ten years old or so) have that address burned into our brains forever.


SYS 64738


Agreed. A symbolic assembler was just a convenient way of managing names and jump targets. For reverse engineering, it wasn't too hard to write down the bytes and translate them to symbols, because the instruction sets were relatively simple and regular (unlike current x86).
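
As a quick illustration of that regularity, here's a toy table-driven disassembly step in C - a handful of real 6502 opcodes, with the output format invented:

    #include <stdint.h>
    #include <stdio.h>

    /* A few real 6502 opcodes: one byte names the instruction and
       fixes the operand size, which is why a printed lookup sheet
       was enough to disassemble by hand. */
    struct op { uint8_t code; const char *mnem; int bytes; };
    static const struct op ops[] = {
        { 0xA9, "LDA #", 2 }, { 0xAD, "LDA", 3 }, { 0x8D, "STA", 3 },
        { 0x4C, "JMP",   3 }, { 0xD0, "BNE ", 2 }, { 0x60, "RTS", 1 },
    };

    /* Decode one instruction at mem[pc]; returns the next pc. */
    static uint16_t dis1(const uint8_t *mem, uint16_t pc)
    {
        for (size_t i = 0; i < sizeof ops / sizeof *ops; i++) {
            if (ops[i].code != mem[pc]) continue;
            if (ops[i].bytes == 1)
                printf("%04X  %s\n", pc, ops[i].mnem);
            else if (ops[i].bytes == 2)
                printf("%04X  %s$%02X\n", pc, ops[i].mnem, mem[pc + 1]);
            else
                printf("%04X  %s $%02X%02X\n", pc, ops[i].mnem,
                       mem[pc + 2], mem[pc + 1]);
            return (uint16_t)(pc + ops[i].bytes);
        }
        printf("%04X  .byte $%02X\n", pc, mem[pc]);  /* data, not code */
        return (uint16_t)(pc + 1);
    }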


It wasn't just the method he used, though. It was 512 pages of assembly, and he spent more than 6 months studying it. The amount of effort he put in is astounding.


It sounds like a lot when put that way, but it's not 512 pages of dense text: in terms of typical presentation, perhaps a four-character hex address, a three-character mnemonic and an operand of <10 characters or so per line - he could've fit the entire thing on 64 double-sided pages easily, maybe less.

Lots of people printed out games and other programs in the 32K-64K size range all the time and studied them for months. It's what we did to work with "large" code bases. I've written perhaps a dozen pretty-printing programs for asm (including ones that would painstakingly download narrower fonts to our dot matrix printer to fit more columns....) exactly because I wore out a large number of printer cartridges printing out code, and kept trying to get things printed faster and on less paper - that was a pretty common pursuit.

Again, it's a lot of effort, and some of what he did shows a quite impressive level of dedication (such as dumping the eproms, building his own clone of the game, and fixing bugs the designers hadn't found), but actually printing out a code base that size and studying it to learn was quite normal.

I'm all for recognizing this guy's effort, but not for the most mundane part of what he did.


In my days in the 80's, I had one of these:

http://www.old-computers.com/museum/hardware/Oric_Atmos_MCP4...

.. and used it as a general work-buffer for expressing routines and dumping assembly .. my parents called it my "bogroll homework" since I usually took rolls of plotted product to the head with me, light reading ..


My father in law is a security guard for a private company.

He watches DVDs for 7 hours a day on a laptop and goes for an amble the other hour.

That is the entire job.

If only he'd put some of that time to better use like that kid did. It's not amazing, but it's a good use of the time.


In the early days of game piracy on computers like the Apple ][, people hand-disassembled the copy protection schemes to understand how they worked and develop workaround patches.

Granted, it's a lot easier to get to the object code on an Apple ][ than it was on a Williams Robotron. Harder still when you don't have documentation about the specific I/O ports on the hardware and other things like the blitter chip.


Compare that to Tim Skelly, lead programmer at Cinematronics (80s vector games), who had to reverse engineer an abandoned custom CPU design while writing games for said CPU in machine code on legal pads.

http://www.dadgum.com/halcyon/BOOK/SKELLY.HTM

It's amazing what you can do when you're focused.


The most amazing part about the Cinematronics/Vectorbeam games is that they had no CPU in the form of a microprocessor. It was all common TTL gates wired up into a simple processor architecture.


That's how my brother and I learned to program graphics and games as 14-year-olds. We disassembled over 50% of Manic Miner on the Sinclair Spectrum using a hex-to-opcode sheet, lots of pen and paper, and oh yes, tracing lines along the jumps. Fun times! (I continued ripping apart many other Spectrum games, but with actual disassemblers.)

All it required was: free time, motivation and, above all, a mind free of assumptions and structure, ready to learn and be filled no matter the effort.


(25 years ago) The first and hardest step was to get the hex memory dump printed (the number of pages to be printed required escalating one's permissions to "operator" level) and physically get said printout into your possession - depending on the situation, one could try to charm the girl who was "operator on duty" in the university department datacenter, or divert her attention some other way while one's partner tried to get to the printer ...


I can remember using 'ddt' to rummage through the code for WordStar (I think) on my old Kaypro back in the early '80s. It really caught me off guard when, somewhere deep in the middle of the disassembly, I ran into some chunk of data that said "Nosey, aren't you?"


Part 3 is also really interesting.

http://www.robotron2084guidebook.com/technical/christianging... (cached: http://webcache.googleusercontent.com/search?q=cache:SOgJkJa...)

Incredible that he was working for minimum wage after completing this. And he's right in saying that we put too much value on a university diploma (though this is slowly changing).


There is an interesting bug which I think has been left in, even in the ROMs that fix the more annoying problems like the corner-shot crash.

The objective of the game is to dodge and kill all the Robotrons on each screen while "saving" as many humans as possible by touching them. One of the Robotron models is the Brain, which reprograms the humans and sends them chasing after you. On the first level with the Brains, though, there are a whole bunch of Mommies on the screen and one Mikey (a little kid dressed in red). It turns out that the Brains won't start reprogramming humans until you save Mikey, so if you avoid him and gather up all the Mommies, you can rack up a lot of points and make the level much easier.

So, kind of a bug that turned into a feature. It's a one-off thing and probably not too relevant to people scoring well over 2 million points, but for a guy like me (many heartbreaking runs ending in the 900,000s), a good score on that level could be make-or-break for a whole 10-15 minute run.

Some of the scores guys put up on the original machines in the early 80s are even more impressive when you consider they could end their game at any time by shooting into the corner the wrong way.


What an abrupt ending... no resolution or anything, just this random tidbit!

"Larry insisted to come back and drive us to the airport. When he aggressively honked at a city bus driver for trying to cut him, this felt like a life lesson for me"


There's another page that explains what happened after - they offered him a job after he'd finished college but the video game crash ended up stopping that.


Why was he working as a security guard instead of writing programs?


In one of the later parts, he complains that employers only look at diplomas instead of what you can actually do. His prospects opened up a bit after he graduated.


Discussion of why you should aspire to be a security guard:

http://captaincapitalism.blogspot.co.uk/2013/05/why-you-shou...


Edward Snowden got his start at the NSA as a security guard!



What does he mean by partner? The partner seems twice his age so perhaps it's a poor translation for friend, professor, father, etc?


Awesome!! Is there some info on:

- Which country was he from?
- Did he become a successful coder?
- What is he up to these days?

This story is incomplete!!



I loved this game. Great read, thank you.


This is truly an amazing feat. What an awesome thing to do. My hat is off to him.


In part 3 a controller is mentioned called a "wireball". What is that?


It sounds like a messy translation of "mess of wires", like Christian had soldered on a bunch of extra wires to scope out various signal lines and interrupt the clock, etc.



Epic, pure epic.



