
Post Web site loads too slowly - The Washington Post - amr
http://www.washingtonpost.com/opinions/post-web-site-loads-too-slowly/2011/09/23/gIQAxocfrK_story.html?hpid=z2
======
drewcrawford
> In finding solutions, The Post doesn’t want to sacrifice ad revenue. Nor
> does it want to give up much of the valuable marketing information that
> comes from the tracking of your reading habits.

Translation: You're not the customer, you're the _product_.

I've seen this story too many times: We want high performance, AND we want a
simple user interface, AND we want these 20 features, and we will not listen
to the tech guys who tell us we can't have it all. Something has to give.

~~~
quanticle
In the context of the Post, these are excuses. They even admit that their site
loads more slowly than all of their competitors' sites. What does the Post's
site do that the New York Times, the Guardian, and Reuters don't?

~~~
mattdeboard
Sounds like the WaPo site has ceased being a website and become a bikeshed,
instead.

------
arghnoname
I disable javascript on the Post website. I did it initially because they
recently did something that breaks in Opera, but the page is noticeably
snappier and more usable without it. I haven't missed anything by turning it
off.

------
422long
When I hear about a slow-loading site, especially one slowed by many third-
party includes and dependencies, my secondary concern from an operational
perspective is: what happens when those 3rd-party links go down?

Does the page load up blank? Stop loading halfway through? It's very difficult
for WashPost to achieve high availability when they only control a portion of
the experience.

~~~
guard-of-terra
When all the 3rd-party content is linked properly (i.e. <script> blocks at the
bottom of the page), nothing bad happens; you just see blanks where the third-
party content would be.

When it's not linked properly (and as far as I know, most of those 3rd-party
services advertise the "wrong" way by default), not only does the page come up
blank or half-blank when the 3rd-party service is down, but every page load is
noticeably slower for everyone.
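
A minimal sketch of the two placements (widgets.example.com stands in for any
third-party host):

    <!-- "Wrong": a blocking script in <head> stalls the whole page
         whenever the third party is slow or down -->
    <head>
      <script src="http://widgets.example.com/buttons.js"></script>
    </head>

    <!-- Better: the same script at the bottom of <body>; the article
         renders first, and an outage only leaves a blank widget -->
    <body>
      ...article content...
      <script src="http://widgets.example.com/buttons.js"></script>
    </body>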

I.e. I use neither facebook nor twitter nor google+, but half of the internet
loads a few hundred milliseconds slower for me because every page is crammed
with their fairly useless buttons.

Why useless? First, that functionality should be part of the browser, not of
the webpage. I should have a share button in my browser for _every_ page I
see, which would also prevent those services from tracking me.

Is there a Firefox/Chrome extension that:

- Provides the Like, Tweet, +1, Digg, and other buttons in the toolbar (only
the ones I need)

- Prevents the browser from accessing the button embeds, thereby disabling
tracking?

I would install it in a heartbeat.

~~~
thwarted
_First, that functionality should be part of the browser, not of the webpage._

This is why I prefer a bookmarklet for the one specific social sharing site
(out of the hundreds possible) I actually like to use.
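
For example, a minimal share bookmarklet (this one targets Twitter's share
URL; swap in whichever service you use):

    javascript:void(window.open('https://twitter.com/share?url='+encodeURIComponent(location.href)+'&text='+encodeURIComponent(document.title)))

No third-party script ever runs in the page itself, so nothing slows the page
down or tracks you until you actually click it.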

------
k33l0r
It amazes me how many high traffic sites seemingly can't be bothered to test
their sites with YSlow or Google Page Speed.

For example, The Post gets a D from YSlow, as does TechCrunch, but they aren't
the worst of the pack: ReadWriteWeb gets an abysmal F.

~~~
blauwbilgorgel
I was amazed too. Also at this public sneer from an editor: won't the
developers feel responsible for this?

Then again, they include javascript files with a copyright from 2007
([http://www.washingtonpost.com/rw/sites/twpweb/js/wp_omniture...](http://www.washingtonpost.com/rw/sites/twpweb/js/wp_omniture.js)).
They source javascript files that are 5 lines long. They forget alt attributes
on navigation images. And they wrap their stories in this soup:

    #wrapperMainCenter, #wrapperInternalCenter, #container, #pagebody, #pagebody-inner, #article, .blog_entry, #c-main-content, #center, .content, .hnews hentry item

That can't be the result of a single developer, or even of a single project.

There is currently no quick fix when, to keep advertising and tracking going,
you are faced with over 100,000 bytes of third-party javascript code (I
stopped counting).

Besides a complete redesign, a mentality change would be needed. Sure, you can
asynchronously load a single compressed and combined core javascript resource
just before the closing </body> tag. But would the advertising department of
The Washington Post be happy if all the advertisements showed up 5 seconds
after the content has loaded?
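
A minimal sketch of that pattern (the bundle path is hypothetical):

    <script>
      // Inject one combined, minified bundle without blocking rendering
      var s = document.createElement('script');
      s.src = '/js/core.combined.min.js';  // hypothetical combined file
      s.async = true;
      document.body.appendChild(s);
    </script>
    </body>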

I wouldn't even know where to start with this massive site. There must be 10+
projects, with different developers, all working over the years to build
things like the Sports section, the Classifieds section, etc., all using their
own javascripts and style sheets... Perhaps a good CDN could patch this oil
tanker.

~~~
thwarted
_But would the advertising department of The Washington Post be happy if all
the advertisements showed up 5 seconds after the content has loaded?_

I can almost see how that conversation could go down. "Our users are staring
at half-rendered content during the 15 seconds our site takes to load!
Scramble the web team! That's blank real estate that could use some ads on
it!"

------
DieBuche
It's not the video or whatever; it's just a matter of observing some basic rules:

-Combine external JavaScript

-Enable gzip compression

-Leverage browser caching

(As suggested by Chrome's Inspector; the gzip and caching items are sketched
as response headers after this list.)

As well as

-Using CSS embeds where possible

-Server-side caching, if they don't do that already
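
To make the gzip and caching rules concrete, this is roughly what the relevant
response headers look like (values illustrative):

    Content-Encoding: gzip
    Cache-Control: public, max-age=2592000
    Expires: Sun, 23 Oct 2011 00:00:00 GMT

The first tells the browser the body is gzip-compressed; the other two let it
cache static resources (here, for 30 days) instead of re-fetching them on
every page view.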

------
fwonkas
Just loading JavaScript in a non-blocking fashion would probably improve
apparent page-load time significantly.
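
For instance, HTML5's async and defer attributes are the simplest way to get
there (script paths illustrative):

    <!-- async: fetch in parallel, execute whenever ready (fine for analytics) -->
    <script async src="/js/tracking.js"></script>
    <!-- defer: fetch in parallel, execute in order after the document is parsed -->
    <script defer src="/js/widgets.js"></script>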

------
fooko0n
At least it loads faster than Salon.com. _Everything_ loads faster than
Salon.com.

~~~
k33l0r
That's not that surprising when you look at the grades Salon.com gets from
Page Speed and YSlow (a C and an E, respectively):
<http://gtmetrix.com/reports/salon.com/Tebabbbf>

