Examining the file requests made by the top 100 Alexa sites (dangoldin.com)
15 points by dangoldin on March 10, 2014 | 4 comments



Do the sizes of different types of files have any impact? For example, does a 50 KB image have more impact on load time than a 50 KB JavaScript file? (I'm guessing yes, since the JS file will typically be gzipped for transfer, whereas the image is already compressed.)
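
A minimal sketch of that intuition (not from the article; the payloads are synthetic stand-ins): repetitive text like JS shrinks a lot under gzip, while data that is already compressed, like a JPEG, barely moves.

  import gzip, os

  js_like = (b"function add(a, b) { return a + b; } " * 1500)[:51200]  # ~50 KB of repetitive text
  image_like = os.urandom(51200)  # ~50 KB of incompressible bytes, standing in for a JPEG

  for name, payload in [("js-like", js_like), ("image-like", image_like)]:
      print(name, len(payload), "->", len(gzip.compress(payload)), "bytes after gzip")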

How about the use of a CDN for common files? Is there a correlation between, say, the use of Google's hosted jQuery CDN and faster/slower sites? What about distinct domain lookups for multiple files? Is Rakuten loading all those files from a single domain, while Google spreads its 10 requests across a pile of specialized static file servers?
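
One rough way to answer the domain-lookup question (hypothetical script, assuming a HAR file saved from the browser's network tab as page.har; this is not the article's code):

  import json
  from collections import Counter
  from urllib.parse import urlparse

  # A HAR export records every request the page made; count hits per hostname.
  with open("page.har") as f:
      entries = json.load(f)["log"]["entries"]

  hosts = Counter(urlparse(e["request"]["url"]).hostname for e in entries)
  for host, count in hosts.most_common():
      print(count, host)

A single dominant hostname suggests the serve-it-yourself pattern; a long tail of one-request domains means extra DNS lookups and connections.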

It's interesting data though!


I didn't get a chance to pull this info, but it should be pretty easy to do. I also expect different types of content to load at different speeds (images need to be rendered, JS needs to be evaluated, ...).

I published the code on GitHub and you can take a stab at it =)
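
For anyone taking that stab, here's one rough way to pull per-type sizes, again from a HAR export (hypothetical page.har, not the repo's code):

  import json
  from collections import defaultdict

  # Total bytes transferred per content type for one page load.
  totals = defaultdict(int)
  for e in json.load(open("page.har"))["log"]["entries"]:
      mime = e["response"]["content"].get("mimeType", "unknown").split(";")[0]
      size = e["response"].get("bodySize", 0)  # bytes on the wire; -1 when unknown
      totals[mime] += max(size, 0)

  for mime, size in sorted(totals.items(), key=lambda kv: -kv[1]):
      print(mime, size, "bytes")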


Foremost conclusion: fast load times depend on serving everything yourself, ideally in as few files as possible. Google and Wikipedia do this.

Sounds obvious, but it's funny how often we forget this -- most sites pull scripts and other resources from multiple domains.


Yep, pretty much. I think the problem is that many sites now rely on third-party resources, which slow everything down.



