
You lose the ability to remove unused code, but for people using this, it doesn't matter, because it is cached anyway.

The number of HTTP requests doesn't matter much these days, thanks to HTTP/2 multiplexing. And again, for people who use this service and get a speedup from it, it doesn't matter, because it is cached.



> You lose the ability to remove unused code, but for people using this, it doesn't matter, because it is cached anyway.

I was under the impression that parsing (etc.) time matters as well, not just raw download time/latency?


Yes, that's fair: with less code there is less to parse. Parsing time is ideally negligible, especially because it can normally happen asynchronously, but for very large libraries I could see it being an issue.
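That parse cost is measurable on its own, separate from any network time. A rough Node sketch (the source string and its size are made up for illustration, and real engines also parse lazily, so treat this as a ballpark, not a benchmark):

```javascript
// `new Function` hands the engine a fresh source string, forcing a
// parse+compile with no download involved -- roughly the per-script
// work a browser does after the bytes arrive.
const source = "1 + 1;\n".repeat(200000); // ~1.4 MB of trivial JS
const t0 = process.hrtime.bigint();
const compiled = new Function(source);    // parse + compile only
const ms = Number(process.hrtime.bigint() - t0) / 1e6;
console.log(`parsed ${(source.length / 1e6).toFixed(1)} MB in ${ms.toFixed(1)} ms`);
```

On a large real bundle the number is not zero, which is the point being made above.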


"ideally"


Well, yes and no; it depends on how they build and serve the dependency graph. If they did their due diligence with the production build, they should be able to optimize reuse of common imports and trim modules down to the bare minimum of what's needed. Granted, as far as I can tell, in absolute terms you will always get the best builds by tuning them locally, but it's not all or nothing either.
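A toy sketch of what that build-time trimming amounts to: tree-shaking is essentially a reachability walk from the entry module's imports, keeping only the exports actually used. The module graph and names below are invented purely for illustration:

```javascript
// Hypothetical module graph: each export lists the imports its body
// depends on, as [module, exportName] pairs.
const graph = {
  "app.js": { uses: [["utils.js", "formatDate"]] },
  "utils.js": {
    exports: {
      formatDate: [["helpers.js", "pad"]],
      slugify: [],                          // never imported anywhere
    },
  },
  "helpers.js": {
    exports: { pad: [], memoize: [] },      // memoize is dead code
  },
};

// Keep only exports reachable from the entry point's imports.
function shake(graph, entry) {
  const kept = new Set();
  const visit = ([mod, name]) => {
    const key = `${mod}#${name}`;
    if (kept.has(key)) return;
    kept.add(key);
    (graph[mod].exports[name] || []).forEach(visit);
  };
  graph[entry].uses.forEach(visit);
  return kept;
}

const kept = shake(graph, "app.js");
console.log([...kept].sort());
// formatDate and pad survive; slugify and memoize are dropped
```

A real bundler (Rollup, esbuild, webpack in production mode) does this statically over ES module syntax; a shared CDN build can only do it if it knows, or is told, which exports its consumers need.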


You assume I can use HTTP/2. I've pleaded with my upstream service owners to no avail. Since they control network traffic, I have no choice.


What do you mean by upstream service owners? All you need for HTTP/2 is for your browser to support it when it downloads from the CDN domain.
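For example, you can see which protocol a given CDN edge negotiates with curl (assuming a curl built with HTTP/2 support; cdnjs.cloudflare.com is just a stand-in host):

```shell
# The protocol is negotiated between the client and the CDN edge;
# your own origin servers aren't in the path for CDN-hosted assets.
curl -sI --http2 -o /dev/null -w '%{http_version}\n' \
  https://cdnjs.cloudflare.com/
```

A modern CDN will typically report 2 (or 3) here regardless of what your internal servers speak.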


In my organization, someone else controls the traffic that arrives at the servers hosting things like CDN assets. And it's all HTTP/1.1.




