
For networks, what matters is file size, not the compression algorithm. Compressors have an interface where you specify a desired file size and the program tries to produce a file of that size. With a better compression algorithm, the image will simply be of better quality; the time to download and the cost per GB will stay the same.



> Compressors have an interface where you specify a desired file size and the program tries to produce a file of that size.

That’s not really the case for JPEG XL, where the main parameter of the reference encoder is in fact a target quality. There is a setting to target a certain file size, but it just runs a search on the quality setting to use.
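To make the "search on the quality setting" concrete, here is a minimal sketch of that idea in Python. It uses Pillow's JPEG encoder as a stand-in for the JPEG XL reference encoder (an assumption for illustration only); the function name and parameters are hypothetical, not part of any real encoder's API.

    # Sketch: "target size" implemented as a binary search over a quality knob.
    # Pillow's JPEG encoder stands in for JPEG XL here; the helper is hypothetical.
    import io
    from PIL import Image

    def encode_to_target_size(img: Image.Image, target_bytes: int,
                              lo: int = 1, hi: int = 95) -> bytes:
        """Binary-search the quality setting so the output fits in target_bytes."""
        best = None
        while lo <= hi:
            q = (lo + hi) // 2
            buf = io.BytesIO()
            img.save(buf, format="JPEG", quality=q)
            if buf.tell() <= target_bytes:
                best = buf.getvalue()   # fits: try a higher quality
                lo = q + 1
            else:
                hi = q - 1              # too big: lower the quality
        if best is None:
            raise ValueError("target size unreachable even at minimum quality")
        return best

    # Usage: aim for roughly 100 kB
    # data = encode_to_target_size(Image.open("photo.png").convert("RGB"), 100_000)

The search works because, for a fixed image, higher quality settings produce larger files (approximately monotonically), so the size target reduces to finding the highest quality that still fits.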


This algorithm is supposedly lossless and requires no fiddling with such settings, so a good enough lossless algorithm could be quite disruptive. Of course, with a lossy algorithm, if you throw away enough detail the file is going to be smaller.



