Hacker News

I'm a complete dunce when it comes to compression and how it fits in the industry, so help me out here. Say that everyone accepts that Zstandard is amazing and we should start using it. What would the adoption process look like? I understand individual programs could implement it since they would handle both compression and decompression, but what about the web?

Would HTTP servers first have to add support, then browser vendors would follow?




>what about the web

The browser sends the server a request header indicating which compression methods it understands. Current Firefox, for example, sends

    Accept-Encoding: gzip, deflate, br
meaning the server is free to send a response compressed with gzip, deflate, or brotli, labelling it with a matching Content-Encoding header. Or the server can choose to send the data uncompressed.

This means the adoption path for the web would be an implementation in at least one major browser, which advertises the capability in its Accept-Encoding header. Then any server can start using Zstandard for clients that accept it.
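As a rough sketch, server-side negotiation boils down to picking the first encoding the server prefers that the client also advertised. The helper name and the simplified q-value handling here are my own illustration, not any real library's API:

```javascript
// Hypothetical helper (not a real library API): pick the best response
// encoding given the client's Accept-Encoding header and the encodings
// the server supports, listed in the server's order of preference.
function pickEncoding(acceptEncoding, supported) {
  // Parse e.g. "gzip, deflate, br;q=0.9" into { name, q } entries.
  const offered = acceptEncoding
    .split(",")
    .map(part => {
      const [name, ...params] = part.trim().split(";");
      const qParam = params.map(p => p.trim()).find(p => p.startsWith("q="));
      return { name: name.trim(), q: qParam ? parseFloat(qParam.slice(2)) : 1.0 };
    })
    .filter(e => e.q > 0); // q=0 means the client explicitly refuses it

  // First server-preferred encoding the client accepts wins.
  for (const enc of supported) {
    if (offered.some(e => e.name === enc)) return enc;
  }
  return "identity"; // nothing matched: send uncompressed
}

pickEncoding("gzip, deflate, br", ["zstd", "br", "gzip"]);       // → "br"
pickEncoding("gzip, deflate, br, zstd", ["zstd", "br", "gzip"]); // → "zstd"
```

Once a browser starts advertising zstd, servers on this scheme pick it up automatically, which is why the two sides can deploy independently.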


Nice, so really browsers and servers can implement it independently. Seems elegant, I like that browsers can advertise multiple compression formats. Thanks for the info!


It's also possible to implement a decompressor in JavaScript to support browsers that don't handle it natively. The performance would likely suck, but if you're truly bandwidth constrained and don't mind users seeing a bit of lag, it's an option...
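A minimal sketch of that fallback: the server sends the raw .zst bytes without a Content-Encoding header (so the browser passes them through untouched), and the page decodes them itself with a shipped JS/WASM decompressor. `decompressZstd` below is a hypothetical stand-in for such a decompressor, stubbed out in the usage example:

```javascript
// Fallback path, assuming a hypothetical decompressZstd(bytes) function
// shipped with the page (e.g. a WASM build of a zstd decoder).
function decodeBody(rawBytes, encoding, decompressZstd) {
  if (encoding === "zstd") {
    // No native support: decompress in JS. This is the slow path.
    return decompressZstd(rawBytes);
  }
  // gzip/br/identity: the browser has already decoded the body natively.
  return rawBytes;
}

// Usage with a stubbed decompressor that pretends a 3:1 ratio:
const stub = bytes => new Uint8Array(bytes.length * 3);
decodeBody(new Uint8Array(10), "zstd", stub).length; // → 30
```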


Assuming the decompressor itself isn't larger than the savings you gain from using this compression algorithm over another...


Well, the decompressor would be cached by the browser, so it would pay off for repeat visitors. And if it gets used widely enough, the user will already have downloaded the decompressor from another site.


And easily shared across sites if they all use a popular CDN'd version.



