
The common case is that the user doesn't have the file yet, and then has to do another DNS lookup (for the CDN) and establish another HTTP connection (to the CDN).
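For illustration, here is a rough way to observe that per-host setup cost in Node (a sketch, not a benchmark; the hostnames are placeholders): time the DNS lookup and TLS handshake on a cold connection to each extra host.

    import https from "node:https";

    // Time the DNS lookup and TLS handshake for a cold connection to `host`.
    // agent: false forces a fresh connection so nothing is reused.
    function timeColdRequest(host: string): void {
      const start = process.hrtime.bigint();
      const req = https.get({ host, path: "/", agent: false }, (res) => {
        res.resume(); // drain the response; only the setup timings matter here
      });
      req.on("socket", (socket) => {
        socket.on("lookup", () => console.log(`${host} DNS resolved after ${ms(start)} ms`));
        socket.on("secureConnect", () => console.log(`${host} TLS ready after ${ms(start)} ms`));
      });
      req.on("error", (err) => console.error(`${host}: ${err.message}`));
    }

    function ms(start: bigint): string {
      return (Number(process.hrtime.bigint() - start) / 1e6).toFixed(1);
    }

    timeColdRequest("example.com");     // the page's own origin
    timeColdRequest("cdn.example.com"); // a hypothetical third-party CDN host

Each additional host the page pulls from pays that lookup-plus-handshake cost again, which is the overhead being described here.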

There are so many public CDNs now, and so many versions of each library scattered across them, that the chances the user already has the resource cached are rather slim.

Your build tool might as well have combined it all and served it with the same "cache this forever" header. And in the common case that would have come from the same host as the rest of your files, so: one DNS lookup, one HTTP connection.
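As a minimal sketch of that setup, assuming an Express server and content-hashed bundle filenames (both assumptions of mine, not from the comment): serve the combined assets from your own origin with a long-lived, immutable cache header.

    import express from "express";

    const app = express();

    // Hashed bundle filenames (e.g. app.3f9c2b.js) change on every build,
    // so the old URLs can safely be cached forever and never revalidated.
    // This sends Cache-Control: public, max-age=31536000, immutable.
    app.use(
      "/assets",
      express.static("dist/assets", { maxAge: "1y", immutable: true })
    );

    app.listen(3000);

Because a new build produces a new hashed filename, clients never need to ask about the old one again, which gives you the "cache this forever" behavior without involving a third-party host.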

I think public CDNs optimise an edge case for little benefit. But maybe it's clearer to me because I always have high latency where I live, and my connection speed can be described as "medium" at best. On a daily basis I see sites load only some of their resources because they split files across CDNs.

I can understand that this might be difficult to see if you have a sub-20ms ping time to your local CDN and a cable modem.

Making everything work or everything fail together is a lesson I've learned many times.


