> However, unless you're using a CPU from the 6502 era, it's probably not worth the trouble for multiplication and division.

When we talk about PCs, fixed-point math stayed popular for a few generations beyond the 6502 era. The 6502 had no multiply or divide instructions at all, and up through the 80386 there was only integer multiply and divide, and those were slow as molasses. Before the 80486, fixed point wasn't a matter of speed, it was a matter of survival (at least for graphics and such).
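
To make it concrete, here's a minimal 16.16 fixed-point sketch in C (the type and helper names are mine, not from any particular library): addition is plain integer addition, and multiply/divide only need a widening intermediate and a shift to keep the binary point in place. On a 286/386 you'd do the same thing with integer MUL/DIV and shifts instead of going through float emulation or a slow coprocessor.

  #include <stdint.h>
  #include <stdio.h>

  /* 16.16 fixed point: 16 integer bits, 16 fractional bits */
  typedef int32_t fx;

  #define FX_ONE (1 << 16)

  static fx fx_from_float(float f) { return (fx)(f * FX_ONE); }
  static float fx_to_float(fx x)   { return (float)x / FX_ONE; }

  /* add/sub are ordinary integer add/sub; mul/div need a shift
     to put the binary point back where it belongs */
  static fx fx_mul(fx a, fx b) { return (fx)(((int64_t)a * b) >> 16); }
  static fx fx_div(fx a, fx b) { return (fx)(((int64_t)a << 16) / b); }

  int main(void) {
      fx x = fx_from_float(3.25f);
      fx y = fx_from_float(1.5f);
      printf("%.4f\n", fx_to_float(fx_mul(x, y)));  /* 4.8750 */
      printf("%.4f\n", fx_to_float(fx_div(x, y)));  /* 2.1667 */
      return 0;
  }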

I remember my old 8086 PC, turbo'd to 8 MHz, would crawl along as it rendered the wireframe space shuttle demo image that came with AutoCAD. I somehow got my hands on an 8087 math coprocessor, plugged that bad boy in, and the difference was phenomenal. Good times.


Indeed, I was using fixed-point maths on PlayStation 1 games in the mid-to-late 90s. It was often responsible for the gaps you'd see between polygons in many PS1 games.


Gaps were mostly a sign of sloppy developers and rushed schedules. Compare Tomb Raider 1 on both PC and PSX with Destruction Derby (Reflections) and Gran Turismo 1/2 (Polyphony).


Yes, there were ways to avoid it (shared vertex calculations), but the effect you saw was mostly two separate vertices that were supposed to be at the same position ending up slightly apart, because each vertex was positioned with a different transformation matrix.
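
To illustrate the mechanism with a toy 4.12/16.16 setup (the format and names below are just for illustration, not the PS1's actual GTE API): transform what should be the same vertex through two matrices that differ by a single LSB of rounding and the results come out a fraction of a unit apart. When that fraction happens to straddle a pixel boundary, the two polygons rasterize different edges and a one-pixel crack opens; transforming each shared vertex once and reusing the result is the fix.

  #include <stdint.h>
  #include <stdio.h>

  typedef int16_t fx12;               /* 4.12 matrix entry        */
  typedef int32_t fx16;               /* 16.16 vertex coordinate  */
  #define FX12_ONE (1 << 12)

  typedef struct { fx12 m[3][3]; } Mat3;
  typedef struct { fx16 v[3];    } Vec3;

  /* fixed-point matrix * vector, truncating away the 12 extra
     fractional bits the multiply produces */
  static Vec3 transform(const Mat3 *m, const Vec3 *p) {
      Vec3 r;
      for (int i = 0; i < 3; i++) {
          int64_t acc = 0;
          for (int j = 0; j < 3; j++)
              acc += (int64_t)m->m[i][j] * p->v[j];
          r.v[i] = (fx16)(acc >> 12);
      }
      return r;
  }

  int main(void) {
      /* one vertex, logically shared by two polygons */
      Vec3 v = { { 1000 << 16, 2000 << 16, 3000 << 16 } };

      /* two matrices that "should" be identical, but one entry
         differs by a single LSB of rounding, as happens when each
         object builds its own chain of fixed-point multiplies */
      Mat3 a = { { { FX12_ONE, 0, 0 },
                   { 0, FX12_ONE, 0 },
                   { 0, 0, FX12_ONE } } };
      Mat3 b = a;
      b.m[0][0] += 1;

      Vec3 va = transform(&a, &v);
      Vec3 vb = transform(&b, &v);
      printf("x via A: %.4f  x via B: %.4f\n",
             va.v[0] / 65536.0, vb.v[0] / 65536.0);
      /* prints 1000.0000 vs 1000.2441: transform the shared vertex
         once and reuse it, and the two polygons cannot disagree */
      return 0;
  }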


I was using it for Z80 Game Boy Color homebrew. Getting down in the bits like this is... bracing!


I remember doing a Game Boy Color game after working on a PS1 title. And even though I came from an 8-bit 6502 background (BBC Micro), going back to it was hell. 'Bracing' is an understatement! I couldn't imagine much worse these days.
