
To what can I attribute the consistently horrible quality of 64kHz streams ten or fifteen years ago? Would that fall under the "bad encoder" bucket?

Edit: christ, I mixed up bitrates (e.g. 192kbps) with sampling frequency (e.g. 192kHz) again. I was referring to 64kbps streams.




64 kHz isn't a standard sample rate -- you're probably thinking of the bit rate of an MP3 or AAC file. A 64-kbit MP3 does sound pretty awful.
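For anyone else who mixes these up, here's a rough back-of-the-envelope sketch (assuming a CD-quality 44.1 kHz / 16-bit / stereo source, which the thread doesn't specify) of how much a 64 kbps stream has to throw away:

    # Rough comparison of an uncompressed CD-quality stream with a 64 kbps MP3.
    # The sample rate (44.1 kHz here) is how often the signal is measured;
    # the bit rate is how many bits per second the encoded file actually uses.

    sample_rate_hz = 44_100   # standard CD sample rate (assumed source format)
    bit_depth = 16            # bits per sample
    channels = 2              # stereo

    pcm_bitrate_kbps = sample_rate_hz * bit_depth * channels / 1000
    print(f"Uncompressed PCM: {pcm_bitrate_kbps:.0f} kbps")  # ~1411 kbps

    mp3_bitrate_kbps = 64
    print(f"Compression at 64 kbps: {pcm_bitrate_kbps / mp3_bitrate_kbps:.0f}x")

So a 64 kbps encode is squeezing roughly 22x fewer bits out of the same audio, which is why it sounds so rough even with a good encoder.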


Yup. What further confused me was that (if memory serves) Apple offered MP3s at 192kbps for a while before moving up to 320kbps.

Edit: apparently my memory is worse than I thought.


Apple doesn't sell anything at 320kbps; it sells 256kbps AAC, which is probably better than 320kbps MP3 anyway.


Hasn't Apple always offered AAC? First at 128 kbps, then at 256 kbps.


iTunes has offered both 128kbps and 256kbps AAC files. Now it's only the 256kbps versions.


MP3 encoders have gotten better over time. Beyond general improvements in fidelity, older encoders had bugs that would occasionally produce terrible encoding for fragments of a sample.



