8088 MPH: We Break All Your Emulators (oldskool.org)



Quick explanation of compiled sprites:

Most commonly a sprite is represented as a 2D array of pixels that you loop over (for X, for Y) and use math or branching to blend onto the screen. But that's a lot of reading and writing, and a lot of the math ends up doing nothing because many of the pixels are intentionally invisible.

So, you could do some sort of visible/invisible RLE to skip over the invisible pixels. That's better, but it's still a complicated loop and still a lot of reading pixels.
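A rough Python sketch of those first two approaches (purely illustrative; screen and sprite are plain 2D lists, with 0 meaning transparent):

    # Naive blit: test every pixel, even the transparent ones.
    def blit_naive(screen, sprite, x0, y0):
        for y, row in enumerate(sprite):
            for x, color in enumerate(row):
                if color != 0:                      # 0 = transparent
                    screen[y0 + y][x0 + x] = color

    # RLE blit: each row is a list of (skip, run_of_colors) pairs, so a
    # transparent span costs one addition instead of one test per pixel.
    def blit_rle(screen, rle_sprite, x0, y0):
        for y, row in enumerate(rle_sprite):
            x = x0
            for skip, colors in row:
                x += skip
                screen[y0 + y][x:x + len(colors)] = colors
                x += len(colors)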

So, many crazy democoders have decided to write "sprite compilers" that read the 2D color array of a sprite and spit out the assembly for the exact instructions needed to write each visible pixel, one at a time, as a linear instruction sequence with no branching. The sprites are then assembled and linked into the program code as individual functions. I believe they can even exclusively use immediate values encoded inside the instructions rather than reading the colors from a separate memory address. So, rather than read instruction, read data, write data; it becomes a read instruction, write data in two straight lines in memory.
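To make that concrete, here's a toy sprite compiler in Python that emits hypothetical 8086-style mov-immediate lines (the register choice and the 320-byte line pitch are my own assumptions, not taken from the demo):

    # Toy sprite compiler: turn a 2D pixel array into straight-line
    # "mov byte [di+offset], imm" instructions, skipping transparent pixels.
    def compile_sprite(sprite, pitch=320, transparent=0):
        lines = ["; DI = top-left destination address, set up by the caller"]
        for y, row in enumerate(sprite):
            for x, color in enumerate(row):
                if color == transparent:
                    continue                  # invisible pixel: no instruction at all
                offset = y * pitch + x
                lines.append(f"    mov byte [di+{offset}], 0x{color:02x}")
        lines.append("    ret")
        return "\n".join(lines)

    if __name__ == "__main__":
        ball = [[0, 7, 0],
                [7, 9, 7],
                [0, 7, 0]]
        print(compile_sprite(ball))

Because every destination is expressed relative to DI, the caller only has to load a different base address to move the sprite around.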


> it becomes a read instruction, write data in two straight lines in memory.

So basically this kind of hack could not be used for a game, then, where interaction is needed?


IIRC, because of relative-address store instructions, the destination address does not have to be hard-coded. So, the sprites can still move around dynamically.

What's harder is clipping against the sides of the screen. With no branching, there's no way to prevent the sprite from writing past the end of a line/screen (wrapping/mem-stomping). So, there does need to be a test per sprite to detect that case and fall back on a more complicated blitter.

The Allegro game framework had a compiled-sprite JITter as a feature early on. So that would be an existence proof of them being used in games :) http://alleg.sourceforge.net/stabledocs/en/alleg016.html


> What's harder is clipping against the sides of the screen. With no branching, there's no way to prevent the sprite from writing past the end of a line/screen (wrapping/mem-stomping). So, there does need to be a test per sprite to detect that case and fall back on a more complicated blitter.

In fullscreen modes, you could also just make your screen buffer bigger than the actual screen by the width and height of your largest sprite.
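A quick sketch of that guard-band idea in Python (sizes made up; the point is just that the copy-to-screen step ignores the margin):

    # Back buffer padded by the largest sprite size on every edge, so
    # unclipped compiled sprites can safely overdraw into the margin.
    SCREEN_W, SCREEN_H = 320, 200
    MAX_SPR_W, MAX_SPR_H = 32, 32

    BUF_W = SCREEN_W + 2 * MAX_SPR_W
    BUF_H = SCREEN_H + 2 * MAX_SPR_H
    backbuffer = [[0] * BUF_W for _ in range(BUF_H)]

    def present(screen):
        # Copy only the visible inner rectangle to the real screen.
        for y in range(SCREEN_H):
            row = backbuffer[MAX_SPR_H + y]
            screen[y][:] = row[MAX_SPR_W:MAX_SPR_W + SCREEN_W]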


In my game engine I wrote for the Apple IIgs a long time ago, I used compiled sprites and maintained a 1-scanline-wide mask that I used to clip the compiled sprite to the screen edge.

This only cost one extra AND instruction and allowed the sprites to be clipped to any size rectangular playfield while still maintaining almost all of the speed benefits.
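Modeled very roughly in Python (the real thing was presumably 65816 code with a single extra AND per write; the playfield bounds here are made up):

    # One-scanline clip mask: 0xFF inside the playfield, 0x00 outside.
    # The write always happens (into a buffer wide enough to be safe),
    # but one AND per pixel decides whether the sprite pixel survives.
    BUF_W, PLAYFIELD_LEFT, PLAYFIELD_RIGHT = 384, 16, 336
    clip_mask = [0xFF if PLAYFIELD_LEFT <= x < PLAYFIELD_RIGHT else 0x00
                 for x in range(BUF_W)]

    def masked_write(buf, x, y, color):
        m = clip_mask[x]
        buf[y][x] = (buf[y][x] & ~m & 0xFF) | (color & m)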


That's actually cool, as it would also allow clipping against a "foreground" by varying the address of the scanline mask. E.g. imagine foreground trees in a jungle scene.


Exactly!

I did extend it to use a full-screen foreground mask that implemented this sort of clipping. I was able to make the mask scrollable which allowed the compiled sprites to appear "behind" fences and other complex shapes with per-pixel accuracy.

It could even be used to mask out individual pixel bits that allowed for fake "lighting" changes with a carefully chosen palette.


I keep wanting to do a "retro-game" and make use of what I've learned about these types of effects now. Despite how far machines like the C64 for example were pushed, I don't think they were pushed nearly as far in terms of games as with demos and it'd be fascinating to try to push the limits..

(maybe one day..)


I'd love to see a demoscene version of "Hacker's Delight" [0]

[0] http://www.hackersdelight.org/


I don't quite understand, could you explain?


It certainly can be (and has been) used for games where a limited set of sprites are drawn - each frame of animation is just a separate compiled sprite routine. Compiled sprite routines write at video memory addresses relative to a specified position, so they can be moved around at will.


If you create code for each sprite and only change the base address to write to, you can use it for games alright. Jazz Jackrabbit is one that I've seen mentioned using compiled sprites. Lots of DOS games basically had to.


It could probably be used for games as long as you keep within the bounds of your pixel data.

However, modern games use GPU acceleration instead of plotting the pixels with the CPU, and most higher languages don't expose the sort of functionality you need to use this trick in the first place.


Even the very first digital video game (Spacewar!) used something very similar (Dan Edwards' outline compiler). Movable and rotatable (think of advancing by unit vectors), compiled just in time from directional encodings. Read more at: http://www.masswerk.at/spacewar/inside/insidespacewar-pt4-oc...


I think Wolfenstein 3D compiles column scaling for each texture to speed up column rendering.


The comments in the Wolf3D source implied that the self-compiling raycasting wasn't faster on an 80286 than a conventional BSP tree would have been, and was in fact slower on an 80486 thanks to invalidating the code cache over and over.


Your game could dynamically modify the instructions.


Sure, but wouldn't that incur a loss of performance ?


On a modern pipelined CPU with separate instruction/data caches self-modifying code does have a rather large penalty since it has to flush the pipeline and the caches, but on the original 8088 PC which has no cache and no pipeline (there's only a 4-byte prefetch queue), the penalty is much smaller.


Some, but maybe not very much: The screen is redrawn much more frequently than a key is pressed.


You might get noticeable latency though.


The party this was released at had some absolutely mind-blowing stuff. The 8k and 64k competitions were amazing. The demoscene continues to be an astonishing force in computing.

Some of the stuff would be completely at home as installation art in any top modern art museum. My favorite is https://www.youtube.com/watch?v=XF4SEVbxUdE done in 64kb!

Here are the results, with links to the productions (most of them have YouTube videos by now):

http://www.pouet.net/party.php?which=1550&when=2015


I think Chaos Theory still tops 64k.

https://www.youtube.com/watch?v=4DjBq2O0XXk


I found the raw video awesome enough to submit a few days ago, but the super-detailed explanation in this blog raises this to a whole new level of epic.

I wonder if youngsters who didn't grow up thinking 1 MHz is a perfectly acceptable CPU speed and that 640 KB is a whole lot of RAM will understand what the fuss is about here...


You know you've seen something special when the only thing you can think to say afterwards is "But it isn't supposed to do that...".


As a guy who grew up coding BASIC on a 4 MHz MSX, I have no idea what's going on here...


I would love to see what could be done with a CPU running at 100KHz


> I wonder if youngsters who didn't grow up thinking 1 MHz is a perfectly acceptable CPU speed and that 640 KB is a whole lot of RAM will understand what the fuss is about here...

They won't, and thus they don't understand the value of demos in the first place. But don't blame them; even back in the day I knew many people who could not appreciate demos either.


Stop generalizing. Plenty will.


> Stop generalizing. Plenty will.

Look at modern forum discussions about smartphones: they're full of youngsters comparing the specs of their respective phones without grasping what they mean at all. Or maybe you are referring to a highly educated subset of youngsters, but that's very few of them.


You should meet some of these youngsters :) A few weeks ago, I taught a 12-year-old kid to write a Mandelbrot zoomer in Processing--no really, I taught him the required basics of complex numbers; it took about 2.5-3 (very fun) hours[0]. Afterwards we had a big A3 piece of paper with notes, graphs and formulas all over it, which he took home with him.

Two weeks later I met him again, and of course I wanted to continue teaching him. I thought maybe the Z^4+C variant would be a nice step further. Turns out he had already written the Julia version of the zoomer ... on his Android phone, while waiting at the dentist's ... O_o

Now, I used to be all about fractals when I was his age, and later grew up to be a 4096-byte democoder (around 2000). I was sooo jealous; what I wouldn't have given for a pocket computer that powerful! Lucky kid :)

Aaanyway, apart from sharing this cute anecdote, my point is this. There's some extremely clever young bastards out there. Now there's not many, but they're also not extremely rare. I know a handful, although this particular guy is probably the cleverest right now. They come from all sorts of backgrounds, too. But the important part is not that they're highly educated, but that they're highly educatable, and given the opportunity to develop this. Their little hacker brains are hungry enough :)

Having written his own graphics code and run up against CPU limits (although we hit float precision before it got really slow), I'm sure he'd appreciate some of the awesomeness of stunts on a limited machine like the 8088. In fact, one of the earlier graphics programs I wrote with him was something very similar to the circle interference pattern described in the article (it was mostly his own idea, playing with interference patterns; I just carefully nudged him towards the classic demo effect, because I knew it'd result in a really cool effect).

[0] He already knew how to draw stuff with Processing. He had already tried to look online for how the Mandelbrot algorithm works, but couldn't quite wrap his head around it. The missing bit of information turned out to be (a+b)(c+d) = ac+ad+bc+bd; he hadn't learned that in school yet. If you explain i as a rotation by 90 degrees, the rest becomes quite intuitive, visually. We also took a quick skim through that great WebGL-illustrated article about "How to fold a Julia fractal" (google it) while coding, way better than the five stapled pages I had when I was 15 :) :)


What environment did he use on an android? I would think programming on a small touchscreen device would be really tedious, wondering if there's something good out there.


Didn't ask. Probably just the standard Processing for Android thing? I'll get some more information next time I see him.

Yes it's probably tedious, but when you're really into something, at that age, you just persevere because you can :) Also young children are incredible on touchscreens, small fingers and they grew up with them :) [I'm the opposite, I have some stress/burnout related tremors in my fingers, some mornings (when it's worst) I can hardly control the device's apps, let alone typing]


As if the majority of people had a clue about these things back in the day.


Probably more than now because computing was not really "mainstream" at the time. Before Win95 at least.


Computing was not mainstream, so kids had a better idea about it?


I guess the kids who got it were a smaller fraction of the total population of kids, but a bigger fraction of the population of kids who owned a computer.


This is HN, not Facebook.


These days I find it slightly weird that they don't share the source code of the demos or related tools. The demoscene has this wonderful alpha-male thing going.


The demoscene grew out of the cracking/warez scene, so it's somewhat implied you should take a disassembler to it if you really want to figure out what a demo is doing. It's the complete opposite to the OSS culture, where the prevailing assumption is that the source code is most important and nothing can be done without it; in the demoscene, it's more like "we don't need no stinkin' source!" I think these two approaches are both interesting in their own ways.


The demoscene and cracking/warez scene grew apart many decades ago. Even back when I was active (1998-2000 or so), they weren't a big influence any more. There was enough source and tutorials floating around. Just not for every demo released. Mainly because of the hassle with releasing somewhat "presentable" code. Many coders are happy to explain, if you just ask them, though.


Their culture sure is a bit alien. The authors are boasting about the DOSBox they've broken, yet (it looks to me that) not a single bug/crash has been reported.

Edit: happily I'm wrong. Quoting ajenner below, "there are emulators (for other targets) which do emulate NTSC decoding properly, but until I did the research for this demo nobody understood how the CGA card generates composite signals well enough to emulate it properly. I have some code which I hope to be adding to DOSBox (and any other emulators that want it) soon."


Thanks for the edit!


I can't speak for the other authors, but I for one plan to release the source code for the parts I wrote (it needs a bit of cleaning up first, though, and I want to get the technical write-ups done first).


I am sorry for whatever pain you're feeling.

I don't feel the ethics of open-source apply here or to any works of art.

Programs like Microsoft Word, which have a near-monopoly on the work that literally billions of people do everyday to be productive and feed their families, when not distributed in a free manner, are tools of unjust power.

I don't feel this person's expressive work, a lifelong dream with no monetary gain, that might merely provide a few weeks of bliss, 15 minutes of internet fame and a little inspiration for the rest of us, then be horribly forgotten to the sands of time, is a tool of unjust power.

> alpha-male thing going

I am sorry for whatever you've experienced that leads you to sexist comments like this. I hope that it's able to work its way through your life until you reach the point that you can simply share another person's joy without feeling entitled to have a piece of it yourself.


Lol, I don't feel any pain. :)

I can see that my words can easily be interpreted the way you did, but I was quite literal in stating that the culture is WONDERFUL. I have no hatred against them, although I'm definitely an outsider.

Also, I was literal when I referred to the closed source of these creations. I don't demand or expect them to release any source code. I'm merely wondering whether these demos would be MORE interesting with the source code released as well. As always. In 2015 it seems slightly weird that they don't.

To be clear, I definitely share the joy the authors feel accomplishing these feats. Deep respect.

But I don't get your sexism comment. I maintain that the sub-culture involves some behavior that can be described as "alpha-male". Maybe I was inaccurate with the wording. Could have said "competitive" as well.


> But I don't get your sexism comment. I maintain that the sub-culture involves some behavior that can be described as "alpha-male". Maybe I was inaccurate with the wording. Could have said "competitive" as well.

I don't think you've ever been to a demoparty :-)

If there's any "alpha male" behaviour whatsoever, it's purely in a self-ridiculing way. It used to be there, back in the nineties when all demosceners were insecure teenage nerds and some of them felt a need to compensate for something. That part of the scene has been gone for twenty years now, but it's still a lot of fun to make references to that part of history.

My favourite example of this is the demo "Regus Ademordna" by Excess [0]. The title is the reverse of "Andromeda Sucks" in Norwegian. Andromeda is another Norwegian demogroup who had just made a reappearance in the demoscene at the time after having been gone since those nineties. They hadn't gotten the memo that all that alpha male stuff was something of the past, so they took serious offense, much to the enjoyment of the rest of the scene.

[0] https://www.youtube.com/watch?v=NkVbS4CTtfc


the bizarre condescension in your comment seems utterly unnecessary.


The condescension is a work of art. You can't achieve that level of elevated dismissal without years of practice.

/joke


Too bad HN is usually not a very fertile ground for such works of art .. But this is a demoscene thread ;-)


There's loads of demo source and demotool source that has been and is being released.

The main reasons, afaik, for demo source code not being released are mostly circumstantial. Either the democoder forgets about it, because a lot of the code is one-off stuff and the next demo is going to be fresh and new, or their code is a terrible mess of glue, duct tape and kludges, hacked together moments before the compo deadline (see article ;-) ). The democoder intends to clean up the code (see elsewhere in this thread ;-) ) but then forgets about it because of the after-demoparty crash. Occasionally, however, they rest up a bit and later on write a great article about the tricks they pulled (while promising to clean up the code and release it "soon") -- saying this with a great big ;-) of course.

And when something happens to be not released, I've always found just mailing the coder about it incredibly helpful, they're happy to explain, I've made great friends, and learned amazing stuff.


Besides, I don't think my interrupt handler and API would be of that much interest (oOoooh, sexy, an API!)


... but someone could use it to disrupt the interrupt market.

Anyhow, omg Trixter :) You don't know me[0] but thanks for all the work you did on the Hornet archive!

[0] just a random 4k coder, ritz, https://www.youtube.com/watch?v=620CmQ9CJoU / http://www.pouet.net/prod.php?which=343


Keep in mind that demos are usually released at parties in competitions, and occasionally even rewarded with prizes of real value.


That's definitely not the reason the source is often not released. Often, these days, it's mostly the fact that the code will probably only build on two computers in the world and documenting/fixing that is way more boring than coding a new demo.


Amazing. An 80×100 resolution 1024-color mode doing CRT controller (6845) tricks, on a plain IBM CGA adapter. Also fullscreen bitmap rotation and 3D rendering on a 4.77 MHz Intel 8088 CPU. Wow.


I can't believe my eyes. 256 colors on CGA?! HOW?!


I believe you can manipulate the colour bleed between pixels on the composite output from a high resolution mode to achieve high-colour output. (They actually have a 1024-colour mode in part of the demo.)

There are probably a bunch of weird limitations on the adjacent colours you can achieve, but it’s an impressive effect regardless!

(Here’s the obligatory wikipedia page: https://en.wikipedia.org/wiki/Composite_artifact_colors It sounds like the demo developers have taken this effect & turned it up to 11.)
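For intuition, here's a toy Python model of where those artifact colors come from in CGA's 640-pixel mode: four adjacent 1-bit pixels fall within one NTSC color-carrier period, so each 4-bit pattern decodes to a different color. This is not how a real decoder (or the demo) works -- no filtering, just the rough luma/phase idea:

    import math

    def artifact_color(pattern):            # pattern: 4 bits, MSB = leftmost pixel
        bits = [(pattern >> (3 - i)) & 1 for i in range(4)]
        luma = sum(bits) / 4.0               # brightness ~ how many pixels are lit
        # chroma ~ where within the carrier period the lit pixels sit
        i_part = sum(b * math.cos(math.pi / 2 * k) for k, b in enumerate(bits))
        q_part = sum(b * math.sin(math.pi / 2 * k) for k, b in enumerate(bits))
        hue = math.degrees(math.atan2(q_part, i_part)) % 360
        sat = math.hypot(i_part, q_part) / 2.0
        return luma, hue, sat

    for p in range(16):
        luma, hue, sat = artifact_color(p)
        print(f"{p:04b} -> luma {luma:.2f}, hue {hue:5.1f}, sat {sat:.2f}")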


I remember that "extra colours" could be achieved on a ZX Spectrum using the bleed of the RF or composite output.


On the Atari ST, it was possible to change the palette colors in mid-flight, and to use that trick to display more colors on screen. The technique was used by Spectrum 512, Quantum Paint and many demos.

"QPs 512 mode is pretty straightforward; its only color limitation being that it can display a maximum of 40 colors on a single scan line. Mode 4K uses a special technique called "interlacing" in order to display a supposed 4,096 "colors -- http://www.atarimagazines.com/startv3n2/quantumpaint.html see also http://www.atari-wiki.com/index.php/ST_Picture_Formats


You could also go beyond that by changing the image 50/60 times a second, which would then blur together. Enabling you to display colours that aren't even in the ST's palette.
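A crude model of that flicker mixing (the eye roughly averages successive frames, helped along by CRT persistence):

    # Perceived color when two palettes are swapped every frame at 50/60 Hz.
    def flicker_mix(rgb_a, rgb_b):
        return tuple((a + b) // 2 for a, b in zip(rgb_a, rgb_b))

    # e.g. two 3-bit-per-channel ST colors giving an in-between shade
    print(flicker_mix((7, 0, 0), (5, 2, 0)))    # -> (6, 1, 0)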


I'm not aware of the specifics anymore, but by using well-placed NOPs when drawing a scanline you could make the borders disappear on the Atari ST, thus getting a higher resolution. This was one of the many advantages the Amiga had over the Atari ST: being able to use the whole screen, while the ST had a screen like a letterboxed movie, except the borders were on all four sides of the screen.


Toggling the register to change between 50/60Hz mode at the right time, specifically, could reset the counter in the ST's Video Shifter and trick it into carrying on drawing the screen when it should have been outputting blank border. The top and bottom borders were much easier, however, because you could open them with just one carefully-timed interrupt each (I used Timer-B, which was linked to horizontal blank and counted lines, if I remember!). Described as far back as the B.I.G. Demo (check the scrolltext).

Opening the left and right borders however required doing it for each line, I recall, which uses a lot more CPU time. (Unless, of course, there's a trick I don't know!)

Spectrum 512 uses NOP timings to swap the palette at regular intervals throughout the screen; the "4096 colour interlaced" mode just flickered between one colour and another on alternate blanks to give the visual impression of flickery intermediates (before the STe came out, which used the high bit of each nybble to actually have 4-bit-per-channel palettes of 16). That technique, in turn, came from the C64 scene, as did the border trick, though I think they wrote the screen address?

What's old is new again: plenty of lower-end TN LCD panels pull the same colour trickery to fake 8-bit colour from 6-/7- bit panels (or, reportedly, 10-bit deep colour from 8-bit ones in some cases).

This demo is crazy. I don't think CGA even gives them a VBL to hang off! Wonderful.


You're close with respect to the C64 border trick.

On the C64, you could pull the border in the width / height of one character in order to support smooth scrolling (coupled with registers to set a 0-7 pixel start offset for the character matrix). This was done so that the borders wouldn't move in/out while scrolling.

By turning this option on and off with precise timing, the VIC graphics chip never found itself at the "correct" location to enable the borders, and so never did.

Opening the top/bottom borders was done very early because it didn't require much timing.

Opening the left/right borders with static sprites happened soon afterwards.

Opening the left/right borders with moving sprites was particularly nasty because the VIC "stole" extra bus cycles from the CPU for each sprite present on a given scan line, so if you wanted to move sprites in the Y position and open the borders, you needed to adjust your timing by the correct number of cycles for each scan line, often done by jumping into a sequence of NOP's. There were additional complications, but that's the basics.

I think DYSP (Different Y Sprite Positions) on C64 was first achieved in 1988.


> What's old is new again: plenty of lower-end TN LCD panels pull the same colour trickery to fake 8-bit colour from 6-/7- bit panels (or, reportedly, 10-bit deep colour from 8-bit ones in some cases).

That's known as temporal dithering/FRC (http://en.wikipedia.org/wiki/Frame_Rate_Control ). Getting 2 more bits of "fake" colour depth requires a 4-frame cycle, in which you display either the darker or the lighter colour in sequences like 0000, 1100, 1110, and 1111. It's a form of PWM, and the same technique is used to drive those large outdoor graphic LED display signs, although to avoid flickering the frequencies are in the tens of kHz.
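A little Python sketch of the 4-frame FRC cycle, using one common set of duty patterns (the exact sequences vary by controller):

    # Frame-rate control: alternating between two adjacent panel levels with
    # these duty patterns over 4 frames fakes two extra bits of depth.
    PATTERNS = [(0, 0, 0, 0), (1, 0, 0, 0), (1, 1, 0, 0), (1, 1, 1, 0)]

    def frc_sequence(level_low, level_high, fraction_index):
        # fraction_index 0..3 selects 0/4, 1/4, 2/4 or 3/4 of the higher level
        return [level_high if bit else level_low
                for bit in PATTERNS[fraction_index]]

    # Between 6-bit levels 32 and 33, the 4-frame average becomes
    # 32.0, 32.25, 32.5 or 32.75 -- i.e. 8-bit-sized steps.
    for i in range(4):
        seq = frc_sequence(32, 33, i)
        print(seq, sum(seq) / 4)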


Thanks for correcting me, it was indeed the 50/60Hz switch. It's been too long!


Vague details from memory: To make the vertical borders disappear, you could reset the refresh timing (maybe it was just by switching from 50Hz to 60Hz mode) halfway through a frame, and the video chip on the ST would desync from the retrace in the CRT. The only way I know to draw on the horizontal borders of a vanilla ST was by perfectly timed border color changes - I used that along with the compiled sprites technique discussed elsewhere in this post, to write a border-less horizontal scroller.


The Amiga had such tricks as well, to display 4096 colors on screen at the same time while it could only display up to 32 according to the specs. Not sure how they do it on PC, but there's probably a way to achieve it as well.


CGA's "Composite" mode contains output artifacts, a trait also found in other early microcomputer graphics like those of the Apple II or Atari 800; with careful use of dithering, CGA can be expanded to a larger effective gamut[0] although most games wouldn't be as ambitious as this demo. It's not as well-known or used as the 4-color modes because it wasn't supported by later PC graphics adapters, nor was it in the spec of the standard monitors of the era. You can see this system struggle to reproduce the demo's intended look: [1]

[0] Example: https://www.youtube.com/watch?v=TfVe9l77zLU

[1] https://www.youtube.com/watch?v=aibZKrXc8Nk


The 4096 color HAM mode was always part of the spec.


Yes, it is, but it can only be used under very specific conditions (you cannot have arbitrary colors wherever you want); it's a pure hack.


You could either set any pixel to any of the 16 palette entries, OR hold red & green from the pixel to its left and modify blue, hold red & blue and modify green, and so on.

You could also change those 16 colors for every scanline (horizontal line).

Overall there was a lot of control, but software such as Deluxe Paint did not support the full possibilities of HAM mode.

Oh and, btw, EHB (extra halfbrite) mode has 64 colors. 32 colors palette + second 32 colors with brightness halved. http://en.wikipedia.org/wiki/Amiga_Halfbrite_mode
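From memory, the HAM6 decode rules are roughly as below; a minimal Python sketch of decoding one scanline (4-bit channels, border color assumed black):

    # HAM6: 6 bits per pixel; the top two bits say "use a palette entry" or
    # "hold the previous color and modify one channel with the low 4 bits".
    def decode_ham6_line(pixels, palette):
        out, r, g, b = [], 0, 0, 0
        for p in pixels:
            ctrl, data = (p >> 4) & 0x3, p & 0xF
            if ctrl == 0:
                r, g, b = palette[data]       # set from the 16-entry palette
            elif ctrl == 1:
                b = data                      # hold R,G; modify blue
            elif ctrl == 2:
                r = data                      # hold G,B; modify red
            else:
                g = data                      # hold R,B; modify green
            out.append((r, g, b))
        return out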


The most impressive ASCII art effect I ever saw was during movement; the minute it stopped, the motion blur went away. In effect, I concluded I was looking at temporal antialiasing-like effects. (I forget which demo it was, some 3D thing.)

I wonder how much you could improve the effect (i.e. the woman after the 4:00 minute mark) by making it move. From the way it's drawn I'm not sure if you could get even 2 fps though.


I wanted to do some more impressive animation with the 1K color mode, but ran out of time unfortunately. One tricky part about doing animation with this mode is that it suffers from the "CGA snow" bug (visual glitching when writing to video memory). To avoid that by writing to video memory only during the vertical retrace/overscan period means that it would take about a quarter of a second to update the entire screen.
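Back-of-the-envelope for that quarter-second figure, with an assumed number of snow-free bytes per retrace (the real number depends on exact CRTC timing):

    VRAM_BYTES       = 16 * 1024    # CGA video memory
    BYTES_PER_VBLANK = 1100         # assumption: safely writable per retrace/overscan
    FIELD_RATE_HZ    = 60

    fields_needed = -(-VRAM_BYTES // BYTES_PER_VBLANK)     # ceiling division
    print(fields_needed, "fields ->", fields_needed / FIELD_RATE_HZ, "seconds")
    # -> 15 fields, i.e. about 0.25 seconds to touch every byte of the screen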


Is the CGA snow bug deterministic? Or at least model-able?


Totally deterministic. I think there's even an emulator which models it (though without a cycle-exact emulation of the CPU the snow won't be in the same places as on real hardware).


Wow, amazing. It would be interesting, though, to know which hacks make the emulators fail.


I bet their color hacks end up doing something weird, or possibly nothing, with emulators. I have no idea about the implementation details of any PC emulator, but it sure would be tempting for an emulator to just display the bitmap copied from the emulated machine's display buffer instead of emulating the actual display adapter. Even emulating the adapter wouldn't be enough, I suppose, as the color tricks rely on some unintentional (from the vendor's PoV) bleed behavior, probably on the analog side of things. So I guess a proper emulator would also have to emulate some of the analog details of the display itself.


There are emulators (for other targets) which do emulate NTSC decoding properly, but until I did the research for this demo nobody understood how the CGA card generates composite signals well enough to emulate it properly. I have some code which I hope to be adding to DOSBox (and any other emulators that want it) soon.


Do you know how the video that was uploaded to YouTube was made, if the emulators don't work? Was it really recorded with a plain camera?


We used a real PC obviously (my machine), and a capture device plugged into the NTSC composite output of the CGA card.


Is there a link to download the music from the ASCII credits at the end? I really like it.

Also, I would like to propose a little challenge. What do you think could be achieved on this "virtual" computer:

- Specs : https://github.com/trillek-team/trillek-computer

- Implementation/Emulator : https://github.com/trillek-team/trillek-vcomputer-module

In short:

- 32 bit RISC CPU running at 100KHz (to 1MHz)

- CGA-like text mode, but it can be remapped to any desired RAM address and the font can be changed on the fly. Fixed palette of 16 colours

- VSync interrupt + two timers

- 128KiB of RAM (to 1MiB)

- Floppy drive (max 1.2 MiB)

Extra: displaying it on a Commodore 1084S monitor -> http://imgur.com/GuTVEdj


I'm interested to know why it breaks all the emulators. Certainly I wouldn't expect emulators to reproduce all the graphical glitches that this takes advantage of, but what is it doing that actually crashes them?


The emulator I was mostly using for development (DOSBox) doesn't actually crash itself, but here's a nice example of how it goes wrong. In the final part of 8088MPH there is an instruction that modifies the instruction after it, but then (on the real hardware) the old version of the instruction is executed (because it's already in the CPU's prefetch queue when the modification is made). DOSBox executes the modified instruction because it doesn't simulate the prefetch queue. I tried moving the to-be-patched instruction above the patching instruction but that made the code take longer to execute and it no longer met the precise timing requirements necessary for the best audio quality.
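Here's a toy Python model of that ordering issue (nothing like a real 8088, and not the demo's code -- it only shows why the prefetch depth changes which bytes get executed):

    # Two "instructions" in memory; the first one patches the second.
    def run(prefetch_depth):
        mem = {0x100: "store that patches 0x101", 0x101: "OLD instruction"}
        addrs = sorted(mem)
        # whatever was already fetched before the store lands stays as-is
        queue = [mem[a] for a in addrs][:prefetch_depth]
        mem[0x101] = "PATCHED instruction"        # the self-modifying write
        return queue + [mem[a] for a in addrs[prefetch_depth:]]

    print("8088 with prefetch queue:", run(2))   # executes the OLD instruction
    print("emulator without one:   ", run(1))    # executes the PATCHED instruction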


Modifying code in the prefetch queue was a well-known anti-debugging/emulation trick in the pre-Pentium days but this is probably the first time I've heard it being used as an optimisation - seriously amazing work.


This is just a guess, but one thing that could do it would be cycle-exact self-modifying code using interrupts. Modify the code too early or too late and the CPU will execute the wrong instructions.

I see this demo as somewhat of a challenge to the emulator authors.


This is the coolest thing I've seen all year, way to go to everyone involved!


Now, I wonder what people could be doing 10, 20 years from now on today's hardware

Probably a lot fewer tricks up the sleeve are possible (especially with 3D accelerators and dependency on a lot of proprietary AND very complex software)

Or maybe just drop to the framebuffer and push pixels like it has always been done


> Probably a lot fewer tricks up the sleeve are possible (especially with 3D accelerators and dependency on a lot of proprietary AND very complex software)

Sorry but doesn't that mean there are a LOT MORE tricks up the sleeve possible? They might be hard to find if you have to reverse a proprietary driver, but why not? :)


Slight technical inaccuracy at the start: the Z80 also required a minimum of 4 clocks for a memory access, it wasn't better than the 8088 in that regard.


Nope. An opcode fetch cycle takes 4 clocks, but a normal read or write is only 3. That's why an instruction like LD A,(HL) takes 7 cycles. I believe the GBZ80 always takes 4, but it's more of a separate 8080 clone that borrows a bit from the Z80 than an actual Z80.


Thanks for the correction. I can't believe I've had it wrong for so many years.


Not sure why you are referring to the Z80? We compare against C64, which uses a 6510 (6502-derivative), not a Z80.


Oh nevermind, I thought you were referring to the intro screen of the demo... but there is something about Z80 in the article.


Possibly stupid question: does the demo run inside DOS or is it completely self-contained? (I would assume the latter.)


Not clear. They mention writing a custom loader, reading EXE files; whether that's calling the INT21h DOS functions or the INT13h BIOS disk loader is unclear. I don't see why they'd bother to write a custom disk filing system when you can just use the DOS one.

DOS is more something you run "on" than "in". It's just a filing and utility layer, no task scheduler, no memory protection.


It runs on top of DOS, using INT 0x21 to spawn the effect executables. The loader doesn't read the executables directly; it just decides when to start and stop them, plays music and bounces some text up and down while the effects are loading/decompressing/precalculating. Writing a demo that is its own OS and has total control of the machine is definitely something I'd like to have a go at in the future, though.


> "[Step] 5. Effect starts, magic occurs"


Incredibly impressive.



