The Polygons of Another World (fabiensanglard.net)
704 points by ibobev on Jan 2, 2020 | 125 comments

Love this, love the write-up... but on the Amiga...

The A500 had 6 bitplanes, not 5. Also, with PAL being the most popular region, I'd say 320x256 was the more popular resolution over 320x200. 6 bitplanes are used for Hold-and-Modify (HAM mode, where 2 bits select Red, Green, Blue or Palette, and the remaining 4 bits indicate either the R, G or B value, or the index of the color). The other 6-bitplane mode is "Halfbrite", which doubles the palette size for a total of 64 colors, where the second 32 colors are half the intensity of the first 32. When running with 6 bitplanes, Agnus (the memory controller, if you will) steals a lot of cycles from both the 68K and the blitter to feed Denise (the display chip) - the blitter being crucial for drawing filled polygons.
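For anyone who hasn't seen HAM before, the decode rule is simple enough to sketch in a few lines of C. This is just an illustration of the 2+4 bit scheme described above (with a made-up palette), not Denise's actual implementation:

```c
#include <assert.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b; } RGB4;   /* 4-bit components, 0..15 */

/* Decode one HAM6 pixel: either load a palette entry, or hold the
 * previous pixel's color and modify one 4-bit component. */
RGB4 ham6_next(RGB4 prev, uint8_t pix, const RGB4 *palette)
{
    uint8_t ctl  = (pix >> 4) & 3;      /* 2 control bits */
    uint8_t data = pix & 0x0F;          /* 4 data bits    */
    RGB4 out = prev;                    /* "hold" by default */
    switch (ctl) {
    case 0: out = palette[data]; break; /* set from 16-entry palette */
    case 1: out.b = data; break;        /* modify blue  */
    case 2: out.r = data; break;        /* modify red   */
    case 3: out.g = data; break;        /* modify green */
    }
    return out;
}
```

The "hold" part is why HAM fringing appears: an arbitrary color change can take up to three pixels to complete.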

The second thing that made the Amiga revolutionary (and one could argue, for this reason, that the PCI architecture of Hombre/AAA is not a true Amiga no matter Commodore's demise) is the way Agnus orchestrates memory access, much like the microcode inside a CPU orchestrates registers and ALUs. In particular, Agnus controls the memory address lines and the chip-register-address lines; the two are opened up simultaneously against each other, with one side reading and the other writing. So, for instance, when reading bitplanes, Agnus opens the chip bus using one of the register addresses 0x110-0x11A for BPL1DAT to BPL6DAT, depending on which of the 6 bitplanes is being read, while at the same time driving the appropriate memory address by putting the contents of Agnus registers 0x0E0-0x0F6 for BPL1PTL/H to BPL6PTL/H on the address bus of the "chip memory" DRAMs. The data lines of the DRAM and the special chips (Denise here) directly open on each other (e.g. DRAM and Paula chip-selected at the same time.)

I'm not sure if I did it justice and got the elegance of this architecture across in this short post. I'm sure many enthusiasts like me who got their start on the Amiga programming 68K continue to discover new things about this machine.

Good times, warm memories.

Thanks for the correction. I will update the drawings asap.

Since you seem to know the machine well, maybe you can shed some light on something I was unable to clarify: what is the fastest way to clear a framebuffer to a desired color on an Amiga? My hypothesis is that it was done with the Blitter, with all its DMA inputs disabled and "doing something else". Do you have any idea?

To set the framebuffer to a single color, consider that there are actually up to 6 "framebuffers" (one for each bitplane). Consequently, the task is to either set all bits of a given bitplane to 1 or set them all to 0. To control the actual word that is blitted without DMA, you would use the BLTADAT, BLTBDAT or BLTCDAT registers to set the corresponding blitter input word with DMA disabled (so the blitter does not write into these registers itself).
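As a rough illustration of the per-plane idea (plain memsets standing in for constant-source blits, with hypothetical PAL lo-res sizes):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

enum { PLANES = 6, PLANE_BYTES = 320 / 8 * 256 };  /* illustrative sizes */

/* Clear a planar screen to an arbitrary color: each plane is filled
 * with 0xFF or 0x00 depending on one bit of the color number. On real
 * hardware each fill would be one blit with BLTxDAT holding the
 * constant word; memset stands in for the blitter here. */
void clear_to_color(uint8_t planes[PLANES][PLANE_BYTES], unsigned color)
{
    for (unsigned p = 0; p < PLANES; p++)
        memset(planes[p], (color >> p) & 1 ? 0xFF : 0x00, PLANE_BYTES);
}
```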

A common technique to avoid having to blit each bitplane independently (esp. for BOB sprites, where the blits are small and plentiful) is to interleave the bitplanes by row using the BPL1MOD and BPL2MOD register values; these values are added to each bitplane pointer address at the end of each row to move the pointer to the start of the next row. Consequently a single blit operation can cover all bitplanes, since in memory a row consists of all row bits belonging to bitplane 0, then all row bits belonging to bitplane 1, then bitplane 2, and so on.

If this is the bitplane layout used then a single blit can only set the values to 0 or 15 (all 0's or all 1's for 4 bitplanes enabled); note however that the blitter too has modulo registers (BLTA/B/CMOD for each input and BLTDMOD for the destination) so perhaps best examine how fancy the programmer got with it (eg. multiple blits to clear the screen to a specific color even if the bitplanes are interleaved using BLTDMOD.)
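The address arithmetic for the row-interleaved layout is easy to sketch in C (sizes here are illustrative: 4 bitplanes at 320 pixels wide):

```c
#include <assert.h>

enum { ROWBYTES = 320 / 8, NPLANES = 4 };  /* illustrative screen format */

/* Byte offset of (row, plane) in a row-interleaved screen: each screen
 * row stores plane 0's row, then plane 1's, and so on. */
unsigned interleaved_offset(unsigned row, unsigned plane)
{
    return (row * NPLANES + plane) * ROWBYTES;
}

/* The modulo each bitplane pointer needs so that, after fetching its
 * row, it skips the other planes' rows and lands on its next row. */
unsigned interleaved_modulo(void)
{
    return (NPLANES - 1) * ROWBYTES;
}
```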

Obviously I'm a bit rusty given it's been a few years, but I hope this helps. I'm a fan of your work, thank you.

I had no idea about BPL1MOD and BPL2MOD. Now that I know they exist, I don't see why a game developer would not use them.

I will try to double-check with Eric Chahi, but this is so much more efficient that it has to be the way he drew polygons (instead of drawing into a scrap buffer and blitting four times, he used BPL*MOD to draw and blit only once).

Unless I'm wrong, a drawback of using interleaved bitplanes to do fewer but larger blits is that the mask for the sprite shape gets bigger too.

If you do several blits, one per bitplane, you get to use the same mask data, but when using interleaved data the mask needs the same shape data * number of bitplanes involved.

It's still a pretty small price to pay compared to Atari ST/ZX Spectrum preshifted bitmaps.

Does that mean you cannot use the Blitter to draw lines and fill area as described in the article?

You'd use a single multi-bitplane interleaved fill-blit if the background surrounding it did not need to be preserved (eg. no mask requirements as per Flow - demos tend to do this for its speed).

I'd imagine Another World creates an off-screen single-bit deep bitmap and uses that as a mask. I sent you an email if you'd like to explore more, however the original programmer probably just remembers what he actually did.

Also note that non-overlapping polygons can be grouped together in both approaches by line-blitting with XOR destination (adjacent polygons share an edge and so the edge is XOR blitted twice); so there are substantial speed-ups to be had.
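A toy C model of the trick: drawing a shared edge twice with XOR cancels it, and a subsequent fill pass turns the remaining edge bits into solid spans. The fill function is a simplified model of the blitter's fill mode, not its exact semantics:

```c
#include <assert.h>
#include <stdint.h>

/* XOR-draw an edge mask into a row byte: the same edge drawn twice
 * cancels, so adjacent polygons sharing an edge can all be line-blitted
 * into the same buffer before one fill pass. */
void xor_edge(uint8_t *row, uint8_t edge_mask)
{
    *row ^= edge_mask;
}

/* Simplified model of a fill pass over one byte: a set bit toggles the
 * "inside" state, and pixels are emitted while inside. */
uint8_t fill_byte(uint8_t src)
{
    uint8_t out = 0, inside = 0;
    for (int bit = 0; bit < 8; bit++) {
        uint8_t m = (uint8_t)(1u << bit);
        if (src & m) inside ^= 1;   /* edge bit toggles inside/outside */
        if (inside)  out |= m;      /* fill while inside               */
    }
    return out;
}
```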

Thank you so much, I am going to amend the "fill" section. Are you ok with being credited as "quincunx" or you prefer something else?

That's very kind, quincunx is fine, or no credit is fine too. Glad I could be of help.

To maximize performance you'd tend to combine the blitter with a M68k loop, typically using MOVEM to maximize the number of bytes copied per instruction.

The reason for this is that the blitter can not utilize every single memory cycle.

So if you want to maximize graphics throughput you'll need to carefully balance the work between the blitter and CPU depending on which CPU.
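The balancing itself is just proportional arithmetic; a hedged sketch, with made-up rate numbers (real figures depend on the CPU, bitplane count and blitter setup):

```c
#include <assert.h>

/* Split the rows to clear between the blitter and a MOVEM-style CPU
 * loop in proportion to their effective fill rates, so both engines
 * finish at roughly the same time. The rates are illustrative
 * placeholders, not measured values. */
unsigned rows_for_cpu(unsigned total_rows,
                      unsigned cpu_rate,    /* rows/frame the CPU manages  */
                      unsigned blit_rate)   /* rows/frame the blitter does */
{
    /* choose cpu_rows so cpu_rows/cpu_rate == blit_rows/blit_rate */
    return total_rows * cpu_rate / (cpu_rate + blit_rate);
}
```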

There's also a NASTY bit you can set so the Blitter gets the bus cycles it want, any CPU access will have to wait for a spare cycle.

I'm not sure how much faster using that bit will make Blitter graphic operations in practise. I suppose when using 6 bitplanes a small speedup of like 10-20% could be seen.

so fun to see both of you discuss in depth like old magicians

Half-brite was not in the Amiga 1000 and early Amiga 500 and 2000. The later "ECS" chipset included "halfbrite". Most games did not take advantage of it.

Do you know of any games that used halfbrite? I can’t think of a single game that did.

I think there were like three games that used HAM mode, if you filter out "displaying a scan of the box art while the rest of the game loads": Knights of the Crystallion, Pioneer Plague, and maybe Mindscape?

At least Pinball Dreams did and googling suggests Cannon Fodder. I suppose a few more must be in there.

Edit: https://retrocomputing.stackexchange.com/a/196/4616

Also The Settlers

Black Crypt - http://hol.abime.net/126
Links - http://hol.abime.net/892
Universe - https://en.wikipedia.org/wiki/Universe_%281994_video_game%29

Also interesting is the game 'Universe', which boasted achieving 256 colours on ECS (edit: a comment on the Stack Exchange page linked below says it works on OCS as well). I was planning to ask you guys about it, but some quick Googling turned up this:


The Amiga was such an interesting machine. It's too bad I never got into assembly programming back then.

Psygnosis' Shadow of the Beast used halfbrite, and was heavy on the copper effects too. Gameplay kinda sucked, but it was a hell of a demo against an ST or PC. When it was released no one had seen anything like it. :)

Edit: I suspected many Psygnosis titles used EHB, so I did a quick search and discovered that the surreal barn-owl based Defender clone, Agony (shades of Jeff Minter?), indeed used EHB. Then I got easily sidetracked, as someone has made a UE4-based PC clone of it.


I remember my friends and I spent far more time showing it off to people than actually playing it properly.

Which fits, as people involved later described the original Shadow of the Beast pretty much as a graphics showcase, with "no thought whatsoever" going into anything beyond cramming in more parallax scrolling and more monsters.

It's hard to show it to people now and have them understand just how jaw-dropping it looked to people at the time, both to Amiga users, but especially to PC users where most people still did not have graphics cards capable of lots of colours and smooth scrolling with even a single parallax layer, much less the dozen or so in Shadow of the Beast...

I also just realised the intro I remembered showing friends is the intro for Shadow of the Beast II, a game I never bothered playing - but where "everyone" had seen the couple of minutes of animation of the intro, at a time when having a long animated intro like that was in itself a big deal.

Also the smoothness - locked at 50 frames per second, it was mesmerising.

I came from MSX computers, which usually had blocky character-by-character scrolling, or slower animations. The PC games were usually even crappier than that. And the Amiga's stereo sound was also incredible if you happened to play a game on a stereo TV set. Most games didn't even bother to mix the sound properly (too slow a CPU and/or too few channels, 2 for left, 2 for right), but the fact that the sound had some kind of "depth" not present in mono speakers also added to the subconscious feeling that this was different.

(3D games - Wolfenstein & Doom - instantly turned the tables; it was obvious that the parrot was not merely pining but actually dead.)

Codetapper has a nice breakdown of how it did things


I still vividly remember Agony beautiful soundtrack. The game itself was very pretty but gameplay was nothing exceptional.

Most Amiga games with parallax used dual playfields mode. AFAIK, you couldn't combine that with EHB.

Shadow of The Beast used dual playfield graphics, not EHB. Agony too, for gameplay. Title screen etc could be EHB.


Fusion was another one that used half brite


I know of one game that actually used HAM (specifically HAM6, the original OCS version) during gameplay and it seemed to run very fast as well - it was a PD game called Zdzisław Bohater Galaktyki 3D and it's, well, not very serious [0].

[0] https://www.youtube.com/watch?v=IV5UPk8ZQZw

It seems all sequences are prerendered, it’s all animation.

Yes indeed, but typically displaying a fullscreen HAM image takes up several seconds [0] of very visible scanline progress. This is pretty much instantaneous and runs on even the base 68000 models, as long as they fulfill the memory requirement (which is admittedly rather hefty - 2MB chip and 8MB fast).

[0] example: https://youtu.be/Y9qzUbT_5Dk?t=95 timestamp and later on as well

That’s just Deluxe Paint being slow. A HAM image is no slower than an EHB image, it’s the same amount of bytes.

Lionheart? Very impressive pixel art, too.

Yeah, and pretty impressive performance wise as well.

It was only early NTSC A1000s that didn't include EHB -- it was an OCS feature, not an ECS one. As far as I recall, all revision A500s and A2000s had EHB. All PAL A1000s had it.

Quite a few games used it for drop shadows and such. Well, the ones that weren't ST identical, which was far too many.


I just want to let you know that I appreciate it when someone shows up in threads like this with such solid knowledge of things like the details of the Amiga Halfbrite mode.

Targeting 320x256 makes it harder to port to 320x200. Meanwhile, porting 320x200 to 320x256 is easy: just add a small letterbox.

I always thought 256 vs 200 was a PAL vs NTSC thing? Is that wrong?

Ya, depends on the system. For example, the NES is 256x224 (NTSC) and 256x240 (PAL).

I can't speak to why they differ exactly though.

The number of lines per second is constant, but you get 50 frames per second on PAL and 60 frames per second on NTSC.

Actually the number of lines per second is not constant, but the overall video bandwidth is "similar". In analog TV terms, PAL has at the same time a lower refresh rate and a higher channel bandwidth, which in computer-resolution terms means that NTSC is essentially 640x480 (that resolution is derived from NTSC timing), while PAL is something like 800x600. Home computers, for the entire time that class of machine even existed, simply didn't have either the framebuffer sizes or the video bandwidth to generate full-resolution video, so the resolutions ended up halved (and in analog TV there is no horizontal resolution to speak of, so 320 is as good a number as any).


320x200x60 = 384k lines
320x256x50 = 410k lines

Akshually, the horizontal (bandwidth) resolution has little to do with timing. It's as good as the analog components are, and as good as the computer can output. Amigas had 1280x256 resolution, if you wanted, on standard PAL equipment. (On a fuzzy TV set, you weren't likely to appreciate 1280 pixels of horizontal resolution, though.)

You need to ignore the horizontal resolution and count the blanking areas as well: PAL = 25 * 625 = 15625 lines/s, and NTSC = 29.97 * 525 = 15734.25 lines/s.
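Spelled out as code, with the same numbers: both standards land near the same ~15.6 kHz horizontal rate, which is why the per-frame line counts differ (625 at 50 Hz vs 525 at 60 Hz) while the hardware timing stays similar.

```c
#include <assert.h>

/* Horizontal line rates implied by frame rate * lines per frame. */
double pal_lines_per_sec(void)  { return 25.0  * 625.0; }  /* 15625.00 */
double ntsc_lines_per_sec(void) { return 29.97 * 525.0; }  /* 15734.25 */
```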

Haha I wasn’t thinking straight! Good thing I used 320 in both examples

Not wrong but the Atari ST's low resolution is 320 by 200, even for PAL, so targeting that resolution is helpful if you want to target both machines

It certainly was on the Amiga.

And only a few years afterwards, the Acorn Archimedes (launched in 1987!) would achieve comparable or better (especially in 3D) graphics and sound via pure CPU power, with little or no custom hardware. Cool stuff, but it was essentially a dead end architecturally. Even if Commodore hadn't screwed up to the extent it did, that whole generation of machines was tied to the 68k series, and would have died with it anyway. (Even Apple could not save the 68k, they jumped ship to POWER instead.)

Just to state what is obvious to many, but the Archimedes CPU is none other than... ARM, the most ubiquitous CPU on the planet nowadays.

I would contest that. The CPU power advantage was considerable: the 68000 took 4 clocks to add two registers, and anything fancier took more, compared to the ARM, where you could build the kitchen sink and shift it across a bit in a clock. That was why the 3D performance was so much better. But to gain the 2D capabilities that were available on the Amiga (the advantages of playfields, sprites and copper lists), you'd need far more CPU power.

I'm not sure "little or no custom hardware" is a great way to describe the Archimedes.

OP meant custom graphics hardware. A dumb framebuffer is also what the Apple Macintosh used, combined with a full graphics library in firmware. A brilliant less-is-more/worse-is-better approach. Mac graphics accelerators simply patched the firmware routines with their own accelerated versions - no drivers, no software rewrites; plop a card in and everything is faster.

The Archimedes had custom graphics hardware. The VIDC was an Acorn design.

The VIDC provided a configurable framebuffer (with choices for resolution and color depth), plus a single "mouse pointer" sprite. Audio was just PCM channels. The Amiga custom hardware did a whole lot more than that.

Sure, but the quote was “little or no custom hardware” - a bit wide of the mark! The thing didn’t even have an off the shelf cpu.

ARM is not an "off the shelf CPU"? Would you say that the Commodore 64 didn't use an "off the shelf CPU" simply because MOS Technology was owned by Commodore?

Well, yes and no - it is a slightly modified version, the 6510, so in that sense, yes. But in another sense, no, because what did they do, when they found themselves in need of a CPU? They used an existing design. That's pretty much what "off the shelf" means.

The VIC-II would count as custom hardware, I should think, as it sounds as if it was designed specifically for the C64. Same as the Amiga stuff was designed by the Amiga team for use in their computers. And same as the ARM CPU (and the other bits) were designed by Acorn for use in theirs.

Just leaving "The Ultimate Acorn Archimedes talk" here: https://www.youtube.com/watch?v=Hf67JYkUCHQ

Traditionally some corrections for the author:

>The Amiga 1000 could not boot by themselves, they had no ROM. The bootloader was on a floppy and you better not lose or damage it!

The Amiga 1000 has a bootloader (the Kickstart loader) in ROM, chips U5N and U5P - two 256Kbit EPROMs/mask ROMs, 64KB in total. What it didn't have was the firmware (Kickstart, like a PC BIOS) or the system (Workbench, like PC DOS/Windows) in ROM. You can see the chips containing the A1000 bootloader in the motherboard picture; it's the two with stickers, right below the two 8520 CIA chips https://www.bigbookofamigahardware.com/bboah/media/download_...

You can't boot (load) without a bootloader. Trivia: back in the day, Bill Gates held the record for the shortest bootloader for the Altair 8800, a computer with nothing but LEDs for output and switches for input. You had to key in said bootloader every single time you wanted to load something (like BASIC) from tape. 'Computer Notes Volume 1, Issue 6, 1975', page twenty-one, author Bill Gates: "I've written a bootloader that only takes 13 bytes of keyed-in data, but anything smaller than 20 bytes isn't easy to use." He was finally beaten by one byte in 2017: http://just8bits.blogspot.com/2017/03/doing-it-in-less-than-...

>It was Commodore best selling product with an estimated 6 millions units shipped from 1987 to 1991

other than, you know, the C64 ;-) and that's not counting other products like the Datasette Commodore 1530, shipped with every VIC-20 and C64.


Chip RAM is not directly connected to the CPU; it's gated behind Agnus https://www.pmsoft.nl/amiga/A500-block-diagram.jpg The DBR signal is what switches CPU data-bus access.

Thanks for these clarifications, I will update the article and drawings tonight.

> firmware(Kickstart, like PC bios)

My understanding is that Kickstart contained both Exec (the kernel) and Intuition (the windowing GUI library), so it was more than a BIOS.

Corrections for http://fabiensanglard.net/another_world_polygons_PC_DOS:


The VGA was hanging off the 16-bit ISA bus. The CPU wasn't on ISA, and RAM wasn't on ISA. You had a chipset with a built-in memory controller and ISA arbitration, and a fast cache was the norm. Typical contemporary 386/486 motherboard diagram http://www.textfiles.com/bitsavers/pdf/samsung/pc/98134-925-.... page 1-1, the system block diagram, will explain a lot.

> clear the screen

this must be the worst clear-screen routine of all time :o Why bank switch at all? Setting the map mask to 0xff once will write to all planes at the same time:

    outpw(0x3c4, 0xff02);   /* Sequencer index 0x02 (Map Mask), data 0xff:
                               enable writes to all 4 planes at once      */
    memset(VGA, 0, 8000);   /* 320*200/8 = 8000 bytes per plane           */
done. Enough for a 30fps clear even on absolutely the worst 8-bit ISA VGA card, something like a Trident TVGA9000 or Realtek RTG3105.

>SOLVING COPY, 4x speed up

Transfer speed when writing to a non-crappy 16-bit ISA VGA card is ~3MB/s. Reading is always slower; nobody optimized graphics cards for reading, sometimes even ending up under 1MB/s. I think I read Abrash or Carmack stating that in real life, on real hardware, this ended up being a wash in terms of performance.

>SURPRISINGLY DIFFICULT AUDIO, On Amiga and ST it was a piece of cake

The ST has no native PCM output. Atari uses the "Yamaha" YM2149, aka a re-licensed AY-3-8910, the audio chip from the ZX Spectrum and Amstrad CPC. It's the same pain as trying to play PCM on an Adlib (volume register?), except with even lower fidelity.
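For the curious, the usual trick is to hammer a channel's 4-bit volume register with one value per output sample. A rough, illustrative amplitude-to-volume mapping might look like this (the threshold table is a generic log approximation, not the YM2149's measured DAC curve):

```c
#include <assert.h>
#include <stdint.h>

/* Map an 8-bit unsigned PCM amplitude onto the 16 roughly-logarithmic
 * volume steps of a PSG channel (about two steps per doubling of
 * amplitude). The returned value would be written to the channel's
 * volume register once per sample period. */
uint8_t pcm_to_volume(uint8_t sample)
{
    static const uint16_t threshold[16] = {
        0, 1, 2, 3, 4, 6, 8, 11, 16, 23, 32, 45, 64, 91, 128, 181
    };
    uint8_t v = 0;
    for (uint8_t i = 1; i < 16; i++)
        if (sample >= threshold[i]) v = i;
    return v;
}
```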

> Datasette Commodore 1530, shipped with every VIC-20

The Datasette did not ship with every VIC-20. There were plenty of non-Commodore datasette clones on the market, however.

You are correct, they weren't bundled with every VIC-20 nor C64/C128 right in the same box. It's hard to get any sales numbers for Commodore-branded storage; I found claims of a couple million floppy drives shipped up to ~1986 (Compute!'s Gazette Issue 32, Feb 7, 1986). But third-party tape decks/floppy drives must have been a drop in the ocean, considering Commodore was a billion-dollar company by 1984. It's a widely known secret they were making the bulk of their profits on peripherals (The Retro Hour podcast EP202, 1980s Home Computer Wars: interview with Michael Tomczyk, Jack Tramiel's right-hand man).

Realistically every single C64/VIC-20 user, and probably at least half of C128 owners, had a tape drive, the bulk of them manufactured by Commodore.

btw the C128, considered a failure by many (including me), sold 4.5 million units, right in the ballpark of total Amiga sales, all models combined, worldwide.

Since you seem knowledgeable, and this seems surprisingly hard to google, do you know why it was named "Amiga"? I know the codename was "Lorraine" and, because I'm a native Spanish speaker, I know "amiga" means "female friend", but why was this name chosen? Did they simply pick a random name, and if so, why Spanish?

The team that designed it (which Commodore later bought) was called Amiga Corporation.

They were initially called Toro, but there was another company with that name.

As for the choice: they just wanted a nice friendly name, and Amiga was that, being simultaneously a nice-sounding foreign word (for the English speakers) but with the friendly Spanish connotation known to many.

They were originally called Hi-Toro, and wanted something more friendly so became Amiga corp whilst they were still based in Los Gatos. No doubt they were aware of the Spanish meaning. Allegedly the fact it came before Apple and Atari alphabetically helped the choice.

After the joystick and Joyboard production died off (the Joyboard was also the reason Amiga crashes are called Guru Meditations), they were left with only Lorraine, so Amiga became the machine name too...

A few games used the RAM intended for the Kickstart on A1000. The same game on A500 needed 1MB RAM.

A bit more info about the Amiga 1000 Kickstart writable control store memory. https://www.old-computers.com/museum/doc.asp?c=28

Bootloaders are interesting, and the one that impresses me the most is the one for the EDSAC. It's 31 instructions, and implements a relocating assembler that is used to load and assemble the program you want to run on the computer.

Another World was _astounding_ when it came out. Although the Amiga was considered a high-spec machine at the time, its graphics hardware was oriented around scrolling, sprites, and rectangular blits. This is good for the typical platformers and shoot-em-ups that dominated gaming at the time, but useless for the type of full-motion video that AW was attempting.

The Amiga's blitter could write almost 4MB per second (if you set things up juuuust right and you had the wind behind you). On a modern monitor (a modest 1920x1080) the Amiga's blitter would take almost 2 seconds just to clear the screen.

I think the coolest part about many games from this time period is how vastly differently they were developed compared to modern games from modern game studios.

"...it is the original version built from 1989 to 1991 by then 21 years old Eric Chahi working alone in his bedroom."

Because he got royalties from an earlier game, and that paid enough for him to take on this project. And he did pretty much everything except the music: Programming, graphics, story, box art. One dude. One 21yo dude. In his bedroom. And he wrote a VM! He wrote his own bytecode! He wrote his own VM host in assembly!! And then he wrote the game in his own bytecode!!!

And then you have the people who ported games in this time period, who often got nothing to work with. No code, no comments, no assets. Disassemble the thing yourself on a platform you might know very little about, and re-write the thing on a platform you do know a lot about. You get a couple of months. Chop chop, the publisher is waiting.

And if you haven't read Jordan Mechner's journals of him writing Prince of Persia from the same era, you should. It's equally bananas. And impressive.

Eric Chahi (and Jordan Mechner) undoubtedly have genius-level IQs which is to say that they stood apart even in that time period.

The more important thing to note is that the brutal constraints of the day and the extremely high bar to entry acted like a pre-sorting mechanism but also fueled the best and brightest to exceed their perceived limitations and unleash their creativity.

Compare and contrast with today, where the low bar to entry has led to a proliferation of garbage-level work (on all levels, not just games) that tends not only to drastically decrease the SNR but also to recalibrate the "standard". When everything looks like mediocre garbage, that is what everyone expects.

You take the good with the bad. There may be more "garbage" work, as you put it, but that's because there is more work overall, likely by several orders of magnitude.

And with that still comes some gems to go along with the garbage, so in the end, we win.


I also enjoy that we still have a modern-day version of small indie developers (sometimes also one-person teams) building and shipping games.

It is of course a different landscape today and most small or single-person teams are working with an engine like Godot, Unreal or Unity (or many others), but given the increased complexity (and capability) of the underlying hardware, this sort of makes sense.

It was astounding back then and in my mind still is a work of art. Like Fabien says, its limited palette actually worked in its favor. The evocative graphics still remain some of my favorites, ever.

It's one of my most replayed games ever. Even when I know the plot and there are no surprises, this game manages every time to take me to... (cue music) ANOTHER WORLD.

If you haven't played this game I strongly recommend it. It's on GOG. It's on old emulators too but the graphics and input handling can be pretty rough and unforgiving.

Given that it’s apparently a bytecode executable for a VM, is there a modern implementation of said VM? Maybe even one that translates the draw ops into Vulkan commands? Seems like a fun project.

I second this. Another world was a fantastic, innovative game.

Also look for Another World – 20th Anniversary Edition.

Didn't realize they released an updated version. Just found this graphics comparison[1]. Looks like they mostly just smoothed out the lines.

1. https://www.youtube.com/watch?v=b86QXaOdgX8

IIRC the 20th-anniversary version has the backgrounds entirely redrawn and pre-rendered as bitmaps; they're not polys anymore. The animated stuff is the original polys rendered as always, only at a much higher resolution than the original 320x200. Sadly they lack detail; see notably 2:26 in the comparison video. Also at 2:47 you can see the perspective has been corrected, but the overall result is less consistent (left wall vs the five-spoke machine). I very much prefer the original overall; the reduced detail and the inconsistencies in the art of the new one just take away the immersion.

What I found funny about watching that is that the 2014 version is exactly how I remembered the 1991 version. Memory smoothed it all out I suppose.

The magic of CRT :)

I know it was one of the last to get it, but my first experience with this game was the SNES version. The SNES version has some horrible slowdowns in parts, but the musical score is amazing! I was disappointed to hear other versions didn't have that score. I haven't checked all the versions; is it unique to the SNES? Does anyone know if any other version has that music? I was hoping the 20th anniversary would have had it, but it doesn't.

Btw it's not as good, but the sequel, Heart of the Alien, is only on the Sega CD.

I hunted down a Sega CD and Genesis just to play it.

There is a version on Playstation that's very good. You can switch between the original graphics or updated graphics by simply pressing a button at any time. It plays really well.

It's also on Steam now!

Amazing, as is everything on Fabien's site.

I rushed, hoping to find some oldskool tricks and remember my time spent reading Denthor's VGA Trainer[0].

It seems I'll have to wait some more chapters for that, though. :-)

[0] http://textfiles.com/programming/astrainer.txt

Another World's original author Eric Chahi gave a great GDC "Classic Game Postmortem" talk about creating the game:


Using a custom VM was certainly clever and efficient at the time. LucasArts used the same trick for a lot of point-and-click games.

I don't think this idea would scale well for much bigger projects, as debugging internal scripts can quickly become a bottleneck.

Well, the technique has been used in newer projects as well. Quake and Quake 3 both had VMs that ran the game logic.

Tooling is a perennial problem with extension systems, but it’s really damned if you do, damned if you don’t. If you’re not using an extension system, then you’re probably writing the game logic and the engine as one giant mass of C++ code. Writing tons of game logic in C++ will get you in trouble for lots of reasons. Writing an interface between your C++ core and some kind of scripting system will get you in trouble for completely different reasons.

Tim Sweeney did a presentation about the programming environment needs of different parts of games back in 2005, still worth reading: https://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced...

Another example of custom VMs from that era is Infocom's Z-machine, used for text adventure games.

The strategy of using a bytecode VM to ship cross-platform games seems to have been pretty popular back then, which is a little surprising to me -- I thought Java came up with that idea first!

The first computer language in history was a bytecode language for a VM.


A lot of people working during the PC revolution in the 80s that had a formal education would have worked on languages with exactly that kind of tech: various Lisp dialects, ML, Smalltalk. Same for all kinds of meta-programming, JIT or dynamic compilation, etc.

The Apollo Guidance Computer that was used for navigation and other control tasks in the moon missions also used bytecodes.

The reason was to save space, as well as not having to constantly update the software as the hardware specs changed.

I guess Java's "innovation" with VMs was requiring the user to download it separately from the programs it runs.

It's older than that. There was the p-code machine used for Pascal. https://en.wikipedia.org/wiki/P-code_machine

There was also a (brief) period in the early 90's where Microsoft's C++ had an option to compile to p-code. IIRC this was to reduce executable size.

Now that the Atari ST part of the article is out as well, I'll add some corrections regarding the ST graphics format. The article states that "There are no bitplanes here like on the Amiga we saw in the previous entry." This is slightly inaccurate. The ST does have bitplanes, but the organization is interleaved into 16-bit words, unlike on the Amiga, where the bitplanes are stored as separate memory regions.

So in 1 bpp hires mode, there's just a single bitplane (and the memory layout is identical when comparing to Amiga 640x400 interlaced hi-res 1 bpp screen mode). In 2 bpp medium res (640x200, 4 colors), there are two bitplanes, alternating bitplanes for every 16-bit word. So 16 bits of bitplane 0 for the first word, then 16 bits of bitplane 1 for the second word (these together form the first 16 pixels), and so on. In the 320x200 low res mode there are four bitplanes interleaved in the same way.

What this means from the programmer's perspective is that you can do similar bitplane tricks as with Amiga, e.g. modify just one of the bitplanes independent of the others. For instance effects where car headlights change the background image colors can be done the same way on both machines: render to just one bitplane and use the palette to achieve the change in colors.

However, while on the Amiga it's trivial to modify bitplane pointers independently of each other to scroll graphics around, on the Atari ST all bitplanes are tied to the same base pointer (because they are interleaved in memory). (And to make things like full-screen scrolling even more difficult, this base pointer is aligned to 256 bytes, i.e. the low byte of the video memory base address is always 0 on the Atari ST - on the STE this is programmable.)

There are some cases where the interleaved bitplane format gives benefits: while neither machine is good for writing texture mappers (or other effects that require manipulating single pixels), it's slightly more efficient on the Atari ST because the layout allows some tricks (e.g. using the MOVEP instruction to write 8 pixels to all 4 bitplanes at once from a single 32-bit register).

From the point of view of a game like Another World, the bitplane arrangement of the Atari ST doesn't really cause additional problems compared to the Amiga. The issue is the lack of a blitter (and the STE blitter isn't that great an improvement: it doesn't have the line-drawing capabilities of the Amiga blitter and has a less flexible src/dst mechanism, so no wonder it isn't utilized by the game).

The SNES was the platform I played "Out of this World" on.

The interesting thing about it was that it was a polygon game that ran without using the SuperFX chip.

With the SNES being sprite-based I could never figure that out.

The article has a link to a video of the author of the SNES port talking about it. https://www.youtube.com/watch?v=tiq0OL8rzso

I've always wondered, did Nintendo demand that all games have continuously running soundtracks? I remember playing this game and Prince of Persia on the SNES and being really put off by these soundtracks someone shoehorned in.

Funny trivia about this:

>[...] They also wanted to replace all the music made by Jean-François. I had yielded for the extra songs, but I wanted to keep the music of the introduction, as it perfectly matched the atmosphere and the animation timing. This became a real struggle [...] So I took drastic measures. I thought of creating an endless fax. A huge fax a meter long in which I wrote in big letters "keep the original intro music". I would insert it in the fax, enter the number, and when the transmission started, I would tape both ends of the letter together, which would create a circle that went on and on until there was no paper left in the offices of Interplay, at the other end.


I think hardware limitations were a huge source of creativity in games. If you want to make an interesting game, tie a proverbial hand behind your back and see where it forces you to go.

Game developers are ALWAYS held back by hardware limitations.

I think a lot of the time that just results in framerate or resolution or draw distance knobs being adjusted.

Biggest hardware hold back right now is all of the real time lighting and ray tracing stuff. Eventually that will be solved, then it will be something else that hardware hasn't caught up to yet.

Fabien's blog is a pot of gold. I love this "archeological" approach to programming. He also makes the articles feel like documentaries. It's just unique.

Couldn't agree more. His two books are also phenomenal [1]. If anything, worth getting a copy just to support his amazing work.

[1] https://www.amazon.com/Fabien-Sanglard/e/B075Q5W35H

I was one of the few kids who got a 3DO (yes, really), and I had a copy of Another World for it after having already played it on the Genesis. I was a bit underwhelmed, since moving to comparatively more powerful hardware didn't make the game look a million times better.

After a bit, though, I realized it wasn't that the 3DO version was underwhelming, but instead that the Genesis version was amazing. It already felt like a next-gen game, and so the upgrade to the 3DO didn't make a huge change.

In the end, it has become one of my favorite games. I love the art, the style, the cutscenes, and even though it's incredibly short, I love the story.

I never played Heart of the Alien, because I never had a Sega CD, but I'd like to play it at some point on an emulator.

"The 3DO version of Out of This World is quite different from the other versions in terms of graphics and sound. The polygon backgrounds have been replaced by hand-drawn versions, the quality of which varies from stunning to amateurish."


Yeah, I realize I didn't make it super clear; there were some enhancements, but it wasn't this huge thing like I was expecting. I was expecting it to be like the Genesis vs 3DO versions of Road Rash, where the 3DO version is vastly superior graphics-wise.

It's available on Nintendo Switch for those who are interested in trying it out.

I played this game on the PC, back then. It was one of those games I considered a "top game" for DOS. What a nostalgia bump I got.

I played this for the first time on the Nokia N900 (armv7-a). I thought it pretty, but unplayable. Had no idea of the porting history until this article.

I was just eyeing the game in my Steam library earlier in the day. Fantastic title; shame he went on to produce so few follow-ups.

Eric is clearly a genius, I assume he hit a sophomore slump. He's only made a couple games since, many years later.

Even with just one major game to his name, its legacy lives on. Another World set the template for the "cinematic videogame" and even today you can see the influence of its cutscenes and minimalist gameplay on games like Ico/Shadow of the Colossus and a plethora of modern indie games. It accomplishes with wordless dialog what Hideo Kojima tries to do with 20 hours of overwrought monologues.

I couldn't agree more. It's one of my all-time favourite games. I replay it every few years.

Another World was so far ahead of its time. I have such fond memories of playing it on my brother's Amiga :)

It's also just a lovely piece of art.

fond memories ... ¯\_(ツ)_/¯

Awesome read

Amazing stuff. Also give the video from the SNES conversion programmer a try.

If you install the 2004 version of the game on iOS (https://apps.apple.com/us/app/the-bards-tale/id480375355), the old 1980's versions are also installed as extras. I spent so much time on these, but never finished III, I remember the combats taking forever almost like infinite loops.

This is truly amazing.

Is there a known Javascript VM implementation of this game?

This is amazing.

Third time this has been posted in 17 hours:


Tell the mods: hn@ycombinator.com
