Most popular sites use iframes either to load JavaScript ads asynchronously or to load them from a separate page, so they don't affect your initial page speed. Most popular ad platforms also offer iframe-specific codes; you just have to ask for them (I know Adify does). If they don't offer iframe codes, ask whether it's against their policy to load the codes in an iframe — they may make exceptions for high-traffic sites (Ars Technica loads all its ads in iframes).
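As a rough sketch of that approach (the URL and dimensions here are placeholders, not any particular ad network's tags):

```javascript
// Hedged sketch: defer an ad by loading it in its own iframe after the
// main page has finished. The ad's JavaScript runs inside the frame's
// document, so document.write and slow ad servers can't block the parent.
function buildAdFrame(src, width, height) {
  // Build the markup for a dedicated ad iframe
  return '<iframe src="' + src + '" width="' + width + '" height="' + height +
         '" frameborder="0" scrolling="no"></iframe>';
}

// In a browser you would inject it once the page is interactive, e.g.:
// window.addEventListener('load', function () {
//   document.getElementById('ad-slot').innerHTML =
//     buildAdFrame('https://ads.example.com/slot.html', 728, 90);
// });
```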
For general optimization, Yahoo has an excellent resource page: http://developer.yahoo.com/performance/rules.html Following it, I was able to bring my site from ~8-9s load time down to ~2-3s, running on a not-too-powerful server.
A few optimizations that worked great for me:
- CDN for static files (MaxCDN has a cheap introductory offer of 1 TB for $9.99 and supports origin pull)
- Minify and gzip CSS and JS files, then serve them from the CDN
The absolute best thing you can do to reduce page loading time is to cut the number of requests the browser has to make. Each request requires a round-trip to the server, with all the latency of a cross-country or cross-continent trip. It requires overhead for TCP/IP and HTTP headers. And most browsers limit concurrent open connections to 2-6, so the bulk of these requests are serialized and block further loading of the page.
Sprite all your images, so that it takes one request to get the whole chrome rather than one per image. If you have multiple JavaScript files, concatenate them together. Same with your CSS. Obviously, gzip them and push them to a CDN if possible, and cache them aggressively. And consider using data: URLs to inline images directly into the page: the time saved on requests more than makes up for the added bytes from base64-encoding the data.
Thanks for the tips. Most of these I'm already aware of. You're very much right about cutting the number of requests to the bare minimum. I was able to go from 28 down to 12, but the last few are tricky because some requests are for external files (ads, analytics, Disqus). Also, I don't think it's the raw number of connections so much as the number of DNS lookups that causes the delay: 12 files spread across 3 hosts is better than 5 files spread across 5 hosts.
Concatenation is also a good idea, but in some cases it breaks scripts for me.
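The usual culprit is the file boundary itself — automatic semicolon insertion doesn't kick in when the next file starts with `(`. A minimal sketch of the failure and the standard fix:

```javascript
// Sketch: why naive concatenation breaks. A file that ends without a
// semicolon followed by a file starting with '(' parses as one
// expression across the boundary.
const fileA = 'var a = 1';                       // no trailing semicolon
const fileB = '(function () { /* widget */ })()'; // an IIFE

// Naive join parses as 'var a = 1(function(){...})()' — it tries to
// call the number 1 as a function, a runtime error.
const broken = fileA + '\n' + fileB;

// Safe join: always put ';\n' between files, which is harmless even
// when the previous file already ends with a semicolon.
const safe = fileA + ';\n' + fileB;
```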
Personally I'm happy with ~2s average load time; I'm just trying to see how far I can push that number.
A PHP opcode cache (APC, eAccelerator, or XCache) also helped a lot. I'm now trying to get below the 2-second mark.
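For the opcode cache, a minimal php.ini sketch for APC (the sizes and TTL here are illustrative defaults, not tuned values — APC's own documentation covers the full list):

```ini
; Enable APC opcode caching (values are illustrative)
apc.enabled = 1
apc.shm_size = 64M   ; shared memory reserved for cached opcodes
apc.ttl = 3600       ; seconds before a stale entry can be evicted
apc.stat = 1         ; re-check file mtimes; can be 0 on rarely-deployed prod
```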