At some point an IBM hardware design engineer made a leap: the Color Graphics Adapter would need a 3.579545 MHz signal to generate the NTSC color subcarrier, and one way to get that signal was to divide 14.31818 MHz by four. Conveniently, 14.31818 MHz is only about 5% below the maximum input speed of the Intel 8284 clock generator, which would divide it by three to get 4.77273 MHz for the 8088 processor. So by sacrificing about 5% in performance, around $0.50 could be saved in parts.
Thus setting the tone for the future of the PC.
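A quick sanity check of the arithmetic in Python (the 15 MHz ceiling is my reading of what "maximum speed" refers to, not stated above):

    # Checking the IBM PC clock derivation
    colorburst = 3.579545       # MHz, NTSC color subcarrier
    master = 4 * colorburst     # 14.318180 MHz motherboard crystal
    print(master / 3)           # 4.772727 MHz -> 8088 clock (8284 divides by 3)
    print(master / 4)           # 3.579545 MHz -> CGA color subcarrier
    print(1 - master / 15.0)    # ~0.045, i.e. roughly 5% below a 15 MHz ceiling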
Anyway, for computers of that era that generated TV output, it seems to have been common to start with whatever clock speed the video output required and work backwards from there. Hence the odd CPU speeds of the C64, Amiga, Atari 8-bit, etc.
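For illustration, the commonly cited NTSC CPU clocks all fall out of the same 14.31818 MHz colorburst crystal. The divisors below are the well-known ones, not taken from this thread, so treat this as a sketch rather than a datasheet:

    # CPU clocks as integer divisions of the NTSC master crystal
    MASTER = 14.31818  # MHz

    machines = {
        "IBM PC (8088)":      MASTER / 3,   # 4.772727 MHz
        "C64 NTSC (6510)":    MASTER / 14,  # 1.022727 MHz
        "Atari 8-bit (6502)": MASTER / 8,   # 1.789773 MHz
        "Amiga NTSC (68000)": MASTER / 2,   # 7.159090 MHz
    }

    for name, mhz in machines.items():
        print(f"{name:22s} {mhz:.6f} MHz")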
Looking at an Apple II schematic is eye-opening. There's hardly anything there!