

Using event capturing to improve Basecamp page load times - hawke
http://37signals.com/svn/posts/3137-using-event-capturing-to-improve-basecamp-page-load-times

======
krmmalik
That was a very well-written and engaging article. Most programming articles
concentrate on the programming logic only, but I liked this one because it
included the business case and explained the reasoning. Although I didn't quite
understand what's going on with the code itself, I still found it an
interesting read.

~~~
sstephenson
Thanks, I really appreciate it!

------
JBiserkov
The _last_ paragraph is the one that should be on _top_:

"If you’re looking to speed up page load times in your own projects, _start by
profiling_. Then identify the hotspots and defer them until the furthest point
of intent — where you know the user is about to perform the action you need to
set up."

~~~
newman314
It's too bad that the article didn't measure how long the action takes to
fire, after the refactoring, when the user actually performs it.

I would think it's a bad user experience to hit a big pause right at the
moment they want to do something. Sure, the page loads faster, but a
noticeable pause after a click would be tremendously annoying and contribute
to the "laggy" feel.

~~~
sstephenson
At that point it's much more about feel than a hard time measurement.

After implementing deferred initialization I asked myself "does reordering
feel laggy?" and the answer was no. If it had felt laggy then I'd have started
profiling again.

~~~
joshuacc
Excellent article. Do you mind going into what tools/techniques you used to do
the profiling?

~~~
sstephenson
Sure thing.

For high-level measurements I use good old "new Date().getTime()". One call
before, one call after, subtract the former from the latter and log it to the
console.

When I want to dig deeper, I go to the WebKit Inspector's Profile tab. You can
turn it on or off through the UI, or use
console.profile()/console.profileEnd() in your code to control it
programmatically.
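
A minimal sketch of that stopwatch approach (the label and the timed work here are illustrative, not Basecamp's code):

```javascript
// Crude stopwatch: grab the time before and after, subtract, log.
function measure(label, fn) {
  var start = new Date().getTime();
  fn();
  var elapsed = new Date().getTime() - start;
  console.log(label + ": " + elapsed + "ms");
  return elapsed;
}

// Stand-in for an expensive setup step, e.g. initializing sortables.
measure("init sortables", function () {
  for (var i = 0; i < 1e6; i++) { Math.sqrt(i); }
});

// For deeper digging, wrap the same region in
// console.profile("init sortables") / console.profileEnd("init sortables")
// and read the results in the Inspector's profile view.
```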

~~~
masklinn
>For high-level measurements I use good old "new Date().getTime()". One call
before, one call after, subtract the former from the latter and log it to the
console.

Why not use `console.time` and `console.timeEnd` for that? They probably have
the same resolution as the underlying date object and they're significantly
less of a hassle.
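
For reference, a sketch of the console.time variant, which folds the subtraction and the logging into the console API itself:

```javascript
// One call starts a named timer, the matching call stops it and logs
// the elapsed time, so there's no date math or string formatting to write.
console.time("init sortables");
for (var i = 0; i < 1e6; i++) { Math.sqrt(i); }
console.timeEnd("init sortables");
```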

~~~
sstephenson
Because I didn't know about it :) Great tip!

------
pilif
Instead of using capturing, you could also just trigger another mousedown
event, which the sortable plugin will then see. This has the advantage of
working even in old IEs.
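
A sketch of that replay idea, using a tiny stand-in for a DOM node so the control flow is visible outside a browser (the node object, initSortable, and the event shape are all illustrative, not the article's code):

```javascript
// Minimal event-target stand-in: register, remove, and fire handlers.
function makeNode() {
  var handlers = { mousedown: [] };
  return {
    on: function (type, fn) { handlers[type].push(fn); },
    off: function (type, fn) {
      handlers[type] = handlers[type].filter(function (h) { return h !== fn; });
    },
    fire: function (type, event) {
      handlers[type].slice().forEach(function (fn) { fn(event); });
    }
  };
}

var node = makeNode();
var log = [];

function initSortable(n) {
  // The plugin attaches its own mousedown handler during setup.
  n.on("mousedown", function (event) { log.push("drag from " + event.x); });
}

// Cheap bootstrap handler: run the expensive setup on first use,
// then re-fire the same event so the plugin's handler sees it.
function bootstrap(event) {
  node.off("mousedown", bootstrap);
  initSortable(node);
  node.fire("mousedown", event); // replayed event reaches the plugin
}
node.on("mousedown", bootstrap);

node.fire("mousedown", { x: 42 }); // the user's first press still starts a drag
```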

Be mindful, though, that by deferring the setup of the sortable list until the
mouse action, you introduce UI lag for the user. Sure, the page might load a
bit faster, but users clicking on the sortable elements now have to wait
longer for their click to register.

As an aside, it's ironic that as usual, the browsers which would profit the
most from these kinds of speed optimizations don't support them to begin with.

It's also always the slowest browsers that require the polyfills with the
worst performance characteristics.

------
lovskogen
Why not just load it after the page has loaded and been displayed to the
user? Then they can scan the page and start reading while you load the
sortable and the other functions.
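
A sketch of that suggestion, deferring the expensive setup until after the load event (deferUntilLoaded and the commented-out sortable call are illustrative names, not the article's code):

```javascript
// Run the heavy init only once the page has loaded and painted.
function deferUntilLoaded(init) {
  if (document.readyState === "complete") {
    setTimeout(init, 0); // already rendered; yield once, then run
  } else {
    window.addEventListener("load", function () { setTimeout(init, 0); });
  }
}

// In a browser page, something like:
if (typeof document !== "undefined") {
  deferUntilLoaded(function () {
    // $(".todos").sortable();  // the deferred, expensive part
  });
}
```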

------
firefoxman1
That was a really great explanation of how capturing works; I've always just
used bubbling. But wouldn't it be easier to just setTimeout for maybe 1.5
seconds? Or perhaps split your page load into two parts: the essential core
stuff first, and once that's done it signals the start of the second, less
critical part, which would include things like .sortable()?
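
A sketch of that two-phase idea; rather than guessing at 1.5 seconds, setTimeout(fn, 0) queues the second phase as soon as the essential work has finished (the phase functions are illustrative):

```javascript
var phases = [];

function essentialSetup() { phases.push("core"); }      // must block first paint
function deferredSetup()  { phases.push("sortables"); } // can wait a tick

// Phase one runs synchronously; phase two is queued behind rendering
// and any pending events instead of behind an arbitrary delay.
essentialSetup();
setTimeout(deferredSetup, 0);
```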

------
MatthewPhillips
> document.addEventListener

Have a feeling you're going to regret that.

------
illumen
"New basecamp requires IE9 or higher." They just lost a sale with that.

Finally made the decision to use redmine(<http://www.redmine.org/>) instead.

~~~
dspillett
If they are saying IE9+ _only_ then that is a rather daft decision both
technically and commercially.

If they are meaning "something not from the stone age" (i.e. you need FF3.5+,
Chrome or Opera of similar lineage, or IE9+) then I can understand not
supporting IE8 or less. Increasingly people are deciding that it simply isn't
cost effective to continue to support IE8 and its elder kin, the few lost
sales that result being worth less than the cost in time and other resources
committed to testing those environments (i.e. people are taking the opinion
that someone using IE8 is a _cost_ not a _customer_ ).

It is a decision that should not be taken lightly of course. I _wish_ I could
refuse to support IE6 and IE8 but our clients are large banks who really are
stuck in the stone age in that regard. I assume that they fully considered the
pros and cons of telling IE6/7/8 users to get stuffed before taking that path.

If you are stuck on IE8 or less then you have no choice but to vote with your
wallet. If enough people do the same then they were wrong in their assessment
and will have to reconsider. Given the target market, as I understand it,
though: I don't think they are likely to be wrong.

~~~
sstephenson
Of course we support browsers other than Internet Explorer.

Here's the full list of supported browsers: <https://basecamp.com/browsers>

And way to miss the forest for the trees, guys—this article isn't about IE at
all. I've come to expect no less from the Hacker News crowd these days.

~~~
illumen
Well, it did have a whole paragraph about IE support... so it is somewhat
about IE8. I've been looking at Basecamp over the last few days too, so for
me this was the most relevant part of the article. Unfortunately some
customers still use IE8 (and even 7), so it would be silly of me to force a
tool on them that they can't use.

Back onto the main topic...

In my experience IE8 is the most important one to profile and speed up, since
it is quite a bit slower than other modern browsers. Spending time profiling
in the IE developer tools can give some pretty good gains.

Moving expensive code into click handlers has the drawback of slowing down
the user's touch points, the places where they are actually interacting, and
that's where users most often notice a slowdown. But I'm guessing you managed
to keep it under 200ms, and it's probably a rarely touched item anyway, so
it's probably still worth it. The alternative is to spread the computation
out after load, so the user might not even notice it happening. Or, if you're
modifying the DOM, why not just do it before you send the page to the user,
and send them HTML that doesn't need a lot of JavaScript manipulation?

It also brings up the point of testing and profiling. I noticed there was a
bug report in your blog comments. It's really helpful to have tests in place
before you start profiling, and especially once you have made your changes.
Tools like speed.pypy.org are really nice for tracking performance
regressions too: when a developer adds some new JS functionality to your app
in four weeks' time, you can see exactly when the regression was introduced.

cya

