

Examining the file requests made by the top 100 Alexa sites - dangoldin
http://dangoldin.com/2014/03/09/examining-the-requests-made-by-the-top-100-sites/

======
mpettitt
Do the sizes of different types of file have any impact? For example, does a
50kb image have more impact than a 50kb JavaScript file? (I'm guessing yes,
since the JS file will tend to be compressed for sending, whereas the image is
already compressed.)
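
A quick way to see that difference is to gzip two same-sized payloads. A rough Python sketch (the payloads are made up: repetitive JS-like text, and random bytes standing in for already-compressed image data):

```python
import gzip
import os

# Hypothetical payloads: repetitive JS-like text compresses well,
# while random bytes mimic image data that is already compressed
# and has little redundancy left for gzip to exploit.
js_like = b"function add(a, b) { return a + b; }\n" * 300
image_like = os.urandom(len(js_like))

js_ratio = len(gzip.compress(js_like)) / len(js_like)
image_ratio = len(gzip.compress(image_like)) / len(image_like)

print(f"JS-like payload: gzip shrinks it to {js_ratio:.0%} of original")
print(f"image-like payload: gzip leaves it at {image_ratio:.0%} of original")
```

So two 50kb responses on disk can cost very different amounts over the wire.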

How about the use of a CDN for common files? Is there a correlation between,
say, the use of Google's hosted jQuery CDN and faster/slower sites? What
about distinct domain lookups for multiple files? Is Rakuten loading all those
files from a single domain, while Google spreads its 10 requests across a pile
of specialist static file servers?
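
The distinct-domain question is easy to check once you have each page's request URLs. A sketch with made-up data (the site names and URLs here are placeholders, not the article's actual crawl):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical per-site request logs; real data would come from the
# article's crawl of each page's resource URLs.
requests_by_site = {
    "spread-out-site": [
        "https://www.spread-out-site.com/logo.png",
        "https://static1.example-cdn.com/app.js",
        "https://static2.example-cdn.com/style.css",
    ],
    "single-domain-site": [
        "https://www.single-domain-site.com/a.js",
        "https://www.single-domain-site.com/b.js",
        "https://www.single-domain-site.com/c.css",
    ],
}

for site, urls in requests_by_site.items():
    domains = Counter(urlparse(u).netloc for u in urls)
    print(f"{site}: {len(domains)} distinct domain(s) {dict(domains)}")
```

Each extra domain is a potential extra DNS lookup and TCP connection, so the count alone says something about load time.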

It's interesting data though!

~~~
dangoldin
I didn't get a chance to pull this info, but it should be pretty easy to do. I
also expect different types of content to load at different speeds (images
need to be rendered, JS needs to be evaluated, etc.).

I published the code on GitHub and you can take a stab at it =)

------
anilgulecha
Foremost conclusion: fast load times depend on serving everything
yourself, ideally in as few files as possible. Google and Wikipedia do this.

Sounds obvious, but it's funny how often we forget this -- most sites download
scripts and resources from multiple domains.
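
A back-of-the-envelope sketch of the "as few files as possible" point: same-origin scripts and stylesheets can be bundled into one file per type, while third-party requests can't be merged by the site owner. All names and URLs below are hypothetical:

```python
from urllib.parse import urlparse

# Hypothetical resource list for one page.
resources = [
    "https://www.example.com/js/a.js",
    "https://www.example.com/js/b.js",
    "https://www.example.com/css/main.css",
    "https://cdn.example-ads.net/tracker.js",
]

own_host = "www.example.com"
own = [r for r in resources if urlparse(r).netloc == own_host]
third_party = [r for r in resources if urlparse(r).netloc != own_host]

# One bundle per file extension for same-origin resources; the
# third-party tracker stays a separate request.
bundles = {r.rsplit(".", 1)[-1] for r in own}
print(f"{len(resources)} requests -> {len(bundles) + len(third_party)} after bundling")
```

Every resource you can't serve (and bundle) yourself is a request floor you can't get under.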

~~~
dangoldin
Yep, pretty much. I think the problem is that many sites now rely on third-party
resources, which slow everything down.

