
Having two independent systems while destroying the traffic savings of a transparent caching system seems like a bad trade-off to me.

Consider that you're a cloud provider running customer images. If everyone downloaded the same package via HTTPS over and over again, the incurred network utilization would be massive (for both you and the Debian repositories in general) compared to everyone using HTTP and verifying via GPG, served from a transparent Squid cache you set up on the local network.
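
As a rough sketch of that idea (assuming gpgv is installed and the standard debian-archive-keyring path; the mirror URL and temp path are just illustrative):

    # Fetch a Debian InRelease file over plain HTTP (so a transparent proxy
    # can cache it) and verify its GPG signature locally -- integrity does
    # not depend on the transport being HTTPS.
    import subprocess
    import urllib.request

    MIRROR = "http://deb.debian.org/debian"   # plain HTTP, proxy-cacheable
    KEYRING = "/usr/share/keyrings/debian-archive-keyring.gpg"  # assumed path

    def fetch_and_verify(suite="bookworm"):
        url = f"{MIRROR}/dists/{suite}/InRelease"
        data = urllib.request.urlopen(url).read()  # may come from the Squid cache
        path = "/tmp/InRelease"
        with open(path, "wb") as f:
            f.write(data)
        # gpgv exits non-zero if the clearsigned file doesn't match the keyring
        subprocess.run(["gpgv", "--keyring", KEYRING, path], check=True)
        return data

    if __name__ == "__main__":
        fetch_and_verify()

apt already does essentially this via apt-secure; the point is only that the integrity check is independent of the transport, so a transparent cache on port 80 can serve everyone the same bytes.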




I fear the trust issues with generic HTTP caching make it infeasible.

It would probably be better to use a distributed system design for this: BitTorrent, or who knows, maybe IPFS.



