
"The original IBM PC had a clock speed of 4.77 MHz. The 8088 processor inside was actually specified to run at 5 MHz, so how did IBM end up at 4.77?"

" At some point an IBM hardware design engineer made a leap: The Color Graphics Adapter would need a 3.579545 MHz signal to create a color subcarrier; one way to get that signal was to divide 14.31818 MHz by four; 14.31818 MHz is only about 5% less than the maximum speed of the Intel 8284 clock generator (which would divide it by three to get 4.77273 MHz for the 8088 processor). Thus by sacrificing 5% in performance, around $0.50 could be saved in parts"

Thus setting the tone for the future of the PC.
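
The arithmetic in the quote is easy to sanity-check; here's a quick Python sketch that just redoes the divisions (nothing in it beyond the numbers already quoted):

  # Redoing the divisions from the quote above (values straight from the quote).
  crystal = 14.31818e6               # master crystal, Hz

  color_subcarrier = crystal / 4     # what the CGA needs: ~3.579545 MHz
  cpu_clock = crystal / 3            # what the 8284 feeds the 8088: ~4.77273 MHz

  print(color_subcarrier / 1e6)      # 3.579545
  print(cpu_clock / 1e6)             # 4.772726...
  print(1 - cpu_clock / 5e6)         # ~0.045, the "about 5%" sacrifice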




One man's cheap'n'nasty is another man's elegance. And $0.50 was a lot of money back in those days ;)

Anyway, for computers of the era that generated TV output, it seems to have been common to start with whatever clock speed the video output required and work back from there. Hence the odd CPU speeds of the C64, Amiga, Atari 8-bit, etc.
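
The usual starting point is the NTSC color subcarrier, which is exactly 315/88 MHz. Here's a rough sketch of how the various odd CPU clocks fall out of it; the divider ratios are from memory, so treat them as approximate for any particular machine or board revision:

  from fractions import Fraction

  # NTSC color subcarrier is exactly 315/88 MHz; the common master crystal is 4x that.
  colorburst = Fraction(315, 88) * 10**6     # Hz
  crystal = 4 * colorburst                   # ~14.31818 MHz

  # Divider/multiplier ratios below are from memory (NTSC machines only).
  clocks = {
      "IBM PC 8088": crystal / 3,            # ~4.772727 MHz
      "C64 6510":    crystal / 14,           # ~1.022727 MHz
      "Atari 8-bit": colorburst / 2,         # ~1.789773 MHz
      "Amiga 68000": colorburst * 2,         # ~7.159091 MHz
  }

  for name, hz in clocks.items():
      print(f"{name}: {float(hz) / 1e6:.6f} MHz")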


For some real excitement, look at how the Apple 2's graphics worked. If you ever worked with hi-res mode you know how screwy the colors were. It turns out they basically serialized the bits right out of RAM into a crude NTSC signal, and the bit patterns ended up producing different frequencies, hence different colors (the high bit of each byte specified whether to delay the other 7 bits by half a clock).

Looking at an Apple 2 schematic is eye-opening. There's hardly anything there!
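
Here's a toy model of the artifact-color behavior described above. The hardware of course computes nothing like this (it just shifts bits onto the wire), the mapping of phase to particular colors is my assumption, and real behavior at byte boundaries is messier:

  # Toy model of Apple II hi-res artifact color, under the assumptions above.
  def hires_colors(byte):
      """Guess a color for each of the 7 dots encoded in one hi-res byte."""
      shifted = (byte >> 7) & 1                     # high bit: delay dots half a dot clock
      bits = [(byte >> i) & 1 for i in range(7)]    # bit 0 is the leftmost dot
      colors = []
      for i, bit in enumerate(bits):
          if not bit:
              colors.append("black")
          elif (i > 0 and bits[i - 1]) or (i < 6 and bits[i + 1]):
              colors.append("white")                # adjacent lit dots blur to white
          else:
              # Isolated dot: color depends on dot parity and the delay bit.
              # This particular mapping is an assumption, not read off the schematic.
              palette = [["purple", "green"], ["blue", "orange"]]
              colors.append(palette[shifted][i % 2])
      return colors

  print(hires_colors(0b00100101))   # isolated dots, high bit clear: purple/green palette
  print(hires_colors(0b10100101))   # same dot pattern, high bit set: blue/orange palette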


I have to do something similar with embedded devices to support USB. Of course, 12 MHz multiples don't look as odd as 3.579545 MHz to the casual observer.
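
For the curious: full-speed USB logic typically wants a 48 MHz reference, so the clock tree ends up built from multiples of 12 MHz. A trivial illustrative check, not tied to any particular part's datasheet:

  # Illustrative only: which small multiples of a 12 MHz crystal hit the
  # 48 MHz reference commonly used for full-speed USB?
  crystal_mhz = 12
  usb_ref_mhz = 48

  for mult in range(1, 9):
      clk = crystal_mhz * mult
      mark = "  <- usable for USB" if clk % usb_ref_mhz == 0 else ""
      print(f"x{mult}: {clk} MHz{mark}")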



