
A lot of monitors* in that era (1994ish) that could handle 640x480 or 800x600 at an acceptable-ish refresh rate of 60, 72, or 75Hz could only do 1024x768 as an interlaced mode - meaning odd and even lines were drawn on alternate fields, so each line was effectively refreshed at half the rate. Phosphor persistence wasn't long enough to bridge such long "frames", which manifested as visible flicker. IIRC, 1024x768 @ 43Hz interlaced was a common (supported, if rarely used) standard. It hurt.
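
To put numbers on the halving - assuming the 8514/A-descended timing that "43Hz interlaced" usually meant, i.e. roughly an 87Hz field rate - a quick Python sketch:

    # "1024x768 @ 43Hz interlaced": the monitor scans ~87 fields per
    # second, but each field carries only half the scanlines, so any
    # given line is redrawn only ~43 times a second - hence the flicker.
    field_rate_hz = 87                      # assumed 8514/A-style timing
    lines_per_field = 768 // 2              # 384 odd or 384 even lines
    per_line_refresh_hz = field_rate_hz / 2
    print(lines_per_field, per_line_refresh_hz)   # 384 lines, ~43.5 Hz each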

* Graphics card RAMDAC bandwidth was a limiting factor for higher resolutions x higher refresh rates, too. And video memory limited higher desktop resolutions x higher color depths for years - really until 3D accelerators were coming of age.
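
For a rough sense of why those limits bit, here's a back-of-the-envelope sketch in Python (the ~1.35x blanking-overhead factor is my assumption; real mode timings varied):

    # Rough pixel-clock and framebuffer math for early-90s video modes.
    # The 1.35x blanking-overhead factor is an assumed ballpark.
    BLANKING = 1.35

    def pixel_clock_mhz(width, height, frame_rate_hz):
        # The RAMDAC must push every visible pixel each frame, plus
        # horizontal/vertical blanking intervals.
        return width * height * frame_rate_hz * BLANKING / 1e6

    def framebuffer_kib(width, height, bits_per_pixel):
        # Video memory needed for a single frame at that color depth.
        return width * height * bits_per_pixel / 8 / 1024

    modes = [
        ("640x480@60 8bpp",     640, 480, 60,  8),
        ("800x600@72 8bpp",     800, 600, 72,  8),
        ("1024x768@43i 8bpp",  1024, 768, 43,  8),   # interlaced, flickery
        ("1024x768@70 16bpp",  1024, 768, 70, 16),   # needs a faster RAMDAC
    ]
    for name, w, h, hz, bpp in modes:
        print(f"{name}: ~{pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock, "
              f"{framebuffer_kib(w, h, bpp):.0f} KiB of video memory")

The first line works out to roughly 25 MHz and 300 KiB, while the last needs a ~75 MHz pixel clock and a 2 MB card - which is about where mid-90s RAMDACs and video memory sizes drew the line.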
