Hacker News

Oh, now I understand what you're saying -- that makes sense. It's not what I've observed, but I can definitely imagine how the compression settings, or what the algorithm concentrates on, could make that happen. I'll have to pay close attention now! And your rationalization probably makes sense -- I read that the H.266 codec is supposedly focusing specifically on perceptual improvements at 4K/8K, so it may be addressing exactly what you're perceiving.

Funnily enough, this reminds me of what I did with JPEGs on websites when Retina first became a thing. Instead of serving up different assets depending on resolution, I discovered that serving everyone a 2x-resolution JPEG with really crappy quality was superior to serving 1x with high quality. On 1x screens, the blockiness was inherently shrunken because of JPEG's fixed block sizes, and on 2x screens the low quality was harder to perceive because it was at a small scale anyways. How compression interacts with scale/resolution is not necessarily an intuitive thing.
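The trade-off described above can be sketched with a few lines of Python using Pillow. This is a minimal illustration, not the original setup: the quality values (85 for 1x, 30 for 2x) and the synthetic gradient image are assumptions, and actual byte sizes depend heavily on the image content.

```python
# Sketch of the "one 2x low-quality JPEG for everyone" trick (illustrative only).
from io import BytesIO
from PIL import Image  # assumes Pillow is installed

def jpeg_size(img, quality):
    """Encode an image as JPEG at the given quality and return the byte count."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.getbuffer().nbytes

# A simple synthetic gradient as a stand-in for a real photo asset.
base = Image.new("RGB", (400, 400))
base.putdata([(x % 256, y % 256, (x + y) % 256)
              for y in range(400) for x in range(400)])

one_x = base.resize((200, 200))  # the 1x asset for standard displays
two_x = base                     # the 2x asset, served to everyone

# The comparison from the comment: high quality at 1x vs. crappy quality at 2x.
print("1x @ quality 85:", jpeg_size(one_x, 85), "bytes")
print("2x @ quality 30:", jpeg_size(two_x, 30), "bytes")
```

On a real photograph, the 2x/low-quality file often lands near the 1x/high-quality size while looking better on both screen types, since browsers downscale it on 1x displays (shrinking the block artifacts) and Retina displays hide the compression at native scale.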



That's a brilliant observation on the JPEG hack, and yes, that's precisely what I've observed. You really notice it if you compare to even 1080i at high bitrates (OTA HD TV, for example). It's weird.




