
I'm on a pretty fast line, and I am sometimes appalled at how long pages take to load. Often it comes down to loading content I have no interest in:

  * comments
  * ads and images
  * headers
  * sidebars
  * toolbars
  * menus
  * social network tools
  * meebo bars (really google, really?)
  * javascript to load all of the above crap
The amazing thing is how often I can't even begin to read the page for what feels like the better part of a minute, as the page slowly renders and various pieces of it jump, shift, and scroll.

I find that tools that redirect various ad servers to localhost help, as do tools like Adblock Plus that load the page but keep the crap off it; even more useful are Adblock Plus's custom filters, which let me shitcan all the crap myself.
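For reference, the "localhost" trick above means mapping ad-server hostnames to the loopback address in the hosts file, so requests to them fail immediately instead of fetching anything. A minimal sketch (the hostnames here are illustrative, not a real blocklist):

```
# /etc/hosts (Windows: C:\Windows\System32\drivers\etc\hosts)
127.0.0.1   ads.example.com
127.0.0.1   tracker.example.net
```

Curated blocklists in this hosts-file format, such as the MVPS hosts file, can be appended wholesale.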

One of these days I want to write an extension, similar to Adblock Plus, that seeks out and removes jQuery crap. A lot of the reasons I can't read pages anymore seem to be jQuery slideshows, jQuery toolbars, jQuery popups, and the like.

I am pretty sure that if you graph this out, you find the end of the web occurring sometime in 2018, when page designers and their bosses and engineers and marketing pukes have so larded down their pages that the net runs out of available bandwidth and any page takes 4:33 to load.

There seems to be an expectation these days that everyone's on a 50Mbit+ connection with an ultra mega blazingly fast computer, and nobody tests their sites on anything but that setup.

Example: The Verge. Chrome reports theverge.com as making 118 requests with 2.62MB transferred, taking 11.04s (onload: 6.42s, DOMContentLoaded: 3.32s). Kotaku: 234 requests, 1.35MB transferred, 11.93s (onload: 10.91s, DOMContentLoaded: 4.28s).

I tested this on a maxed out 2010 MacBook Air, and that's with AdBlock Plus and ScriptNo.

The Verge is basically unusable while it loads, and even after it has loaded, Chrome struggles to render it while scrolling on my machine. This is all too common these days.

Is there a consulting market for fixing these kinds of examples? I quite enjoy optimizing these kinds of issues, but perhaps the value is too implicit or too low for them to bother.

The blockers are the issues these optimizations pose for the whole team's workflow, plus the maintenance requirements. You must be able to convince people that the gains are worth so much of their time (the same people who use high-level languages and JS frameworks they may not need very much, for the productivity boost).

But for the slow sites highlighted, I don't understand why client-side performance and bandwidth usage apparently aren't a priority; I mean, their site, their front-end, is their primary product, the thing they make money off of. A ten-second load time is simply unacceptable; just look at the performance guys at Google, who have worked, and are still working, hard to shave milliseconds off of page load time.

I'm probably going to offend a few people here, but creating a website with minimal or no performance optimization is just plain sloppy, uncraftsmanlike work. Get your act together, or get a different job.

Right, and thanks for separating how to approach this issue when creating an app versus maintaining one. Real-world issues get in the way: having a million images that are not compressed or sized correctly, or having part of an application outside your control that cannot be modernized or share assets with another part. Some companies strive to outsource this work to a platform/appliance, like the recently acquired http://www.akamai.com/blaze, instead of actually fixing/upgrading these issues (not to say that it works). After all, most companies don't have the resources for their front-end devs to be profiling page loads by country and pushing HTTP performance under such dramatically different bandwidth/latency variations.

You have to think about the target demographic. The guys at The Verge are probably targeting those with faster computers, faster internet connections. Google is targeting everyone.

It sounds like the solution here, to respond to both lincolnbryant and nfm, is to find a way to highlight the benefit of a round of optimization but in an external fashion.

I, too, really enjoy performance optimization but that's an area that is especially hard to break into because many times the client wants something that just works. In my experience as a consultant it's rare that you get a client that cares about performance much unless it's a company like Apple that has a crazy amount of money to spend on making everything perfect (where 'perfect' isn't literal perfection but surely of a much higher caliber than the competition in many regards).

See my comment on the grandparent post. Just a 'skip to content' link at the top of the page and a way of accessing comments without JavaScript. People on text-only clients are then good to go.


It's not like the Verge guys threw the page together themselves. They made a huge deal about working with Code and Theory, who designed their web page. http://www.theverge.com/2011/11/1/2528367/welcome-to-the-ver... The fact that a respected professional web design company did this is scary.

Just tried theverge.com on a not-so-powerful computer:

35 seconds and 6.50MB transferred!

The Verge is painfully slow to load, even on mobile. Their content is good but their page is so heavy I tend to avoid it.

s/even/especially/. Most mobile versions of sites load just as much cruft as the standard browser pages.

Try it in a command-line web client in your terminal. I use w3m, as it has vaguely Vim-like shortcuts.

Loads the front page in about 5 seconds on a 24 Kbit/s connection in the UK.

They could include a 'skip to content' link for those using text or audio interfaces, because the entire navigation structure of the site appears to be included on every page. They could also make comments available in a way that is visible to a basic HTTP client. Two minor mods like that would mean fast access to their reading material for CLI nerds.

Not disagreeing about the ridiculous number of requests, but The Verge's (especially) and Kotaku's audiences should lean towards high-end users.

What if I'm trying to read their offerings on my phone or tablet? A lot of the things these pages load are not worth the time it takes to load them.

They have mobile offerings. The Verge, in particular, loads in about a second on my iPhone 5.

In the case of The Verge that works, but some "mobile-optimized" sites are worse than the desktop site. For example, navigating to extremetech.com from my iPad 1 takes 20 seconds over wifi before the page is interactive. After load, interaction is still slow and unreliable (and downright buggy). If I force it to use the desktop site, the page loads faster and interacts more smoothly.

It's amazing, but many mobile editions seem designed only to produce a screenshot demonstrating to management that the site team did something mobile. Does anybody ever actually try out these sites?

Mobile sites suck. I often change the user agent so I can see the desktop site on tablet or tethered laptop.
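Changing the user agent is usually a browser-extension job, but the mechanism is just a request header: UA-sniffing servers choose the desktop page based on the User-Agent string. A sketch with Python's standard library (the UA string is an illustrative desktop Chrome string, not anything canonical):

```python
import urllib.request

# Illustrative desktop UA string; any current desktop browser's would do.
DESKTOP_UA = ("Mozilla/5.0 (Windows NT 6.1; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/24.0 Safari/537.36")

def desktop_request(url: str) -> urllib.request.Request:
    """Build a request that claims to be a desktop browser, so
    UA-sniffing servers return the full site instead of the
    stripped-down mobile edition."""
    return urllib.request.Request(url, headers={"User-Agent": DESKTOP_UA})
```

"Request desktop site" extensions do essentially this for every request the browser makes.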

In that case I don't understand your argument.

You don't want to use the full site because it has too many features and is therefore too big.

You also don't want to use the mobile site because it doesn't have all the bells and whistles.

Choose one :) ... or use Opera Mini and/or ziproxy on your server.

False dichotomy... Either use a broken mobile site with half the content, or a full site that barely loads.

No, I'm saying that the 'features' on the full site are rarely worth the time and bandwidth required to load them. I'm not complaining about things like Google Earth or other high-bandwidth services, but the amount of visually and functionally useless cruft on many sites, news sites in particular.

I did consider this when using those as examples, but they were the first that sprang to mind. The sites that are utterly unusable tend to be forgotten entirely.

For comparison:

I got theverge.com in ~3 seconds from first click, using Chrome on a 40Mbit FiOS connection, an i7 2500K @ 3.6GHz with an SSD and 16GB of RAM on Windows 7, without AdBlock or NoScript.

My PC is over a year old, so it's probably the internet connection; then again, the CPU is also 2-3x+ as fast.

...it's not your connection. It's the 10MB page load.

> "One of these days I want to write an extension, similar to Adblock Plus, that seeks out and removes jQuery crap. A lot of the reasons I can't read pages anymore seem to be jQuery slideshows, jQuery toolbars, jQuery popups, and the like."

This might be something that NoScript's Surrogate Script feature could handle - it was originally developed to allow blocking Google Analytics without breaking sites that expected it to be loaded, and now includes surrogates for a lot of other junk.

If you're already using Adblock Plus, you might consider Fanboy's Annoyances list. It removes a fair bit of web clutter (though I still often block unneeded elements on sites I frequent).


Funny to blame jQuery, one of the leanest frameworks, which strives only to fix the browser, not to provide app architecture. Disable it and not much of the web will work. Remember, these same jQuery-ites want to ignore IE6 and 7, which are exactly what the people he describes around the world are far more likely to be using.

Just because people have bad net connections doesn't mean they run out-of-date software. Does anyone know how true this is?

I don't have a link right now, but I've seen lots of statistics that show much of the remaining IE6 traffic coming from China. Feature phone usage is also most prevalent in developing countries.

I believe that out-of-date software and slow connections are highly correlated. New hardware and fast connections are the most expensive part of computing. Old (or very low cost) hardware doesn't run up-to-date software. So there you have the reason for that correlation.

AFAIK the IE6 traffic from China comes from modded IE6 versions that are very popular there, and from pirated Windows versions that never update the browser because Windows Update is disabled or not working.

I have heard this about internet cafes, specifically in Central and South America. Fellow web developers have come back describing Facebook Chat being accessed through a Java tool, etc.

> "The amazing thing is how often I can't even begin to read the page..."

Cannot upvote this comment enough.

Ghostery takes care of several of those things.
