
Faster Firefox landing page increases download conversion by 15% - mbrubeck
http://blog.mozilla.com/metrics/2010/04/05/firefox-page-load-speed-%e2%80%93-part-ii/
======
patio11
This worked about as well for me as it did for Firefox, when I implemented it
a few years ago, after reading the YSlow presentations. I've said it three
times on HN but I'll say it a fourth: watch one of the YSlow presentations,
get their checklist, go down and tick off items on it. It is the easiest money
you'll make in your life. Look at the improvements here: combining uncombined
JS files and inlining CSS. These are tweaks that you can have coded, tested,
and live in ten minutes or less.

As an aside: It is a little disturbing to me that I'm ahead of Mozilla on the
Internet technology adoption curve.

~~~
exit
does it make sense to inline css if it's the same across multiple pages?

~~~
anthonyb
Should do, since it's one less connection/round trip back to the server to get
more data.

~~~
johnswamps
Shouldn't it be cached, though? You can't cache if it's in-line. (I'm not a
web expert)

~~~
jeff18
The short answer is no. Even an extremely intense, CSS-heavy page might have,
say, 16 kilobytes of CSS. After gzipping, that should turn into about 3
kilobytes. 3 kilobytes, inline with the rest of your content, is utterly
negligible and caching that would probably classify as a micro-optimization,
even though every instinct is telling you to put it in a nice, cacheable
stylesheet.

It feels dirty not to cache it, but if you look at a download waterfall and
see how much time that separate HTTP request takes (while blocking the
display of the page), inlining is well worth it.
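That ratio is easy to sanity-check with Python's stdlib gzip; the sample stylesheet below is synthetic, but real-world CSS (repetitive selectors and property names) compresses comparably well:

```python
import gzip

# Build ~16 KB of CSS-like text; repetition is what gzip feeds on,
# and stylesheets are naturally repetitive.
rule = (".nav { margin: 0; padding: 0; list-style: none; }\n"
        ".nav li { float: left; padding: 4px 8px; }\n")
css = rule * 180
compressed = gzip.compress(css.encode("utf-8"))
print("%d bytes -> %d bytes gzipped" % (len(css), len(compressed)))
```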

~~~
fhars
So what you would need is a bit of logic that applies your site-wide css to
your landing page html on the server side and sends that to the browser,
either on demand or by statically producing a transformed html page. Does
anybody know about something like that? "html css server" are not really the
terms to enter into a search engine if you want specific results...
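The transformation is small enough to sketch directly; a minimal Python version (the regex assumes a `rel` attribute before `href`, and the names are illustrative; this is a sketch, not a production inliner):

```python
import re

LINK_RE = re.compile(r'<link[^>]*rel="stylesheet"[^>]*href="([^"]+)"[^>]*/?>')

def inline_css(html, css_by_href):
    """Rewrite <link rel="stylesheet" href="..."> tags into inline
    <style> blocks. css_by_href maps each href to its stylesheet text."""
    def repl(match):
        css = css_by_href.get(match.group(1))
        # Leave the tag alone if we don't have that stylesheet's content.
        return "<style>%s</style>" % css if css is not None else match.group(0)
    return LINK_RE.sub(repl, html)
```

Run it once at deploy time to emit a static, transformed landing page, or on demand with caching.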

~~~
jeff18
No, you really just want to use inline CSS. Is it really worth all of the
trouble just to cache 3 kilobytes of data (probably 20 milliseconds of
download time)?

~~~
fhars
That is what I said, yes, except that I wanted a tool that does the inlining.
But as another comment said, that tool would probably not be worth the
hassle, either.

~~~
jeff18
I'm not totally sure what you mean by tool. Whatever framework you're using
surely can include an external file, and if not, even Apache itself can handle
this via SSI. It really depends what your web app looks like.
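For the SSI route, the include is a one-line directive in the page itself (requires Apache's mod_include with SSI processing enabled for the file; the path is illustrative):

```html
<style>
<!--#include file="main.css" -->
</style>
```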

------
iigs
It's clear that a benefit was seen. Moreover, it's clear that in other cases
(shopping at Amazon, searching at Google), more responsive servers mean an
increase in traffic, since people are free to shop / browse more, and it keeps
people's interest.

In the case of a single site with a single product / purpose (ignoring
Mozilla's other products), is there a generally agreed on explanation for what
is going on here? Do people really change their mind about something as big as
changing their web browser because of a delay of one to two seconds? Are these
people marginal users and unlikely to actually finish installing it, or likely
to abandon the browser after a single use?

I'd like a peek into the psychology of the marginal people in a scenario like
this.

~~~
tokenadult
I often conclude that the people running websites are doofuses, and thus their
product may be a no-good product, if the website doesn't meet the universal
usability standard of speedy page views. That's a simple quality proxy
heuristic.

------
CytokineStorm
Finally! It looks like someone actually did some simple math to determine if
their results were statistically meaningful. I wish more people would include
a section like this when they talk about conversion rates.

~~~
sesqu
_15% looks great, but how confident are we?

Running a one-sided Student’s t-test with a means difference of 14%, our
experimental data yields a P value of 0.000051. This means that there is only
a 0.0051% chance that we would obtain a 14% (or greater) improvement if the
real effect wasn’t at least this large._

Err... Student's t doesn't assign a probability to P(measured>=14 |
actual<14), it assigns one to P(measured>=14 | actual=14) under the assumption
of equal variances (which here means actual=0, so clearly not applicable).

I figure that the actual effect is 16.05% +- 0.17%. And that's simply by
assuming 145k is close enough to infinity, and for IE.html only.

That is, unless I'm mistaken about this, since I grok only the fairly basic
stuff. Which is certainly plausible in and of itself.
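To make that concrete: the usual move is a confidence interval around the observed difference between two proportions, which is what actually bounds the real effect. A sketch with hypothetical counts (the post reports rates, not raw conversions, so these numbers are made up for illustration):

```python
import math

# Hypothetical counts -- the blog post does not publish raw conversions.
n_a, conv_a = 145000, 20300   # control arm, 14.0% conversion
n_b, conv_b = 145000, 23345   # variant arm, 16.1% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
diff = p_b - p_a

# Standard error of the difference of two independent proportions.
se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)

# 95% interval via the normal approximation -- fine at n = 145k.
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print("difference %.4f, 95%% CI (%.4f, %.4f)" % (diff, lo, hi))
```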

------
houseabsolute
I don't know if this talk was ever made public but I heard at Google that
under some circumstances cutting a few hundred milliseconds of latency can
double click-through, conversions, etc., or even more. I do occasionally see
Google people encouraging web developers to make their pages faster at
conferences and such. Personally I would be very concerned about the money I
was losing if I had a web site where the 95th percentile of my paying
customers had to wait more than two seconds (from the client's perspective)
for a pageload. I'd buy a bigger database machine, spread my servers to more
geographical regions, whatever it took to try to bring that latency down.

~~~
patio11
Google's Marissa Mayer has presented their page loading results a few times
publicly: in one unplanned experiment, about 0.5 seconds of marginal load time
caused the number of user searches (and, by extension, clicks on ads and Google
revenue) to decline by 20%. That was the Web 2.0 Conference in 2006.

 _I'd buy a bigger database machine, spread my servers to more geographical
regions, whatever it took to try to bring that latency down._

Why make life hard? Buying a bigger database server costs actual money and
time. Getting out the YSlow checklist and knocking two to three things off of
it costs nothing and no significant amount of engineer time.

Gzip CSS/JS files: four lines in your Apache config, warm restart, done.
Collapsing CSS/JS together: 16 characters (:cache => true) in Rails, warm
restart, done. Spriting CSS images: open a web page, type a bit, copy/paste
what it tells you into development, push to staging, verify it works, done.
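For reference, the "four lines" for gzip are roughly this with Apache 2.x's mod_deflate (exact MIME types vary by setup):

```apache
# Requires mod_deflate (e.g. "a2enmod deflate" on Debian-style installs).
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
```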

Moreover, as compared to buying a DB or spreading your servers, these are
virtually guaranteed to _actually work_. Many, many of the suggestions that
come under the heading "performance improvements" do not _actually work_
because they address things that are not problems. (For example, unless you
have data which convincingly demonstrates differently, the vast majority of
websites can assume that the web stack is _not_ a problem. Optimizing your
code within the web stack is generally a hideously expensive waste of time.)

~~~
houseabsolute
Sure, fixing the webpage would come first. I only mean to demonstrate that I
would spare no metaphorical expense in fixing the problem.

------
tokenadult
It's surprising it would take people with their background so long to figure
that out.

~~~
wglb
Yes, but as lots of recent HN postings have shown, actually using A/B tests
seems to be a rarity.

~~~
tokenadult
My comment was more along the lines that a page that promotes your service
should load reliably rapidly, because people who are kept waiting by page
loading may not stay around to convert to a user of your product, even if it
is free. The comparisons the Firefox team show with the webpages of Brand X
products made me think "Ouch!" because I am a user of Firefox myself.

~~~
RyanMcGreal
I've been using Firefox since ... well, since Mozilla 1.2. [1] It makes me sad
and nostalgic, but I have to admit that Chrome kicks ass at sheer page-
rendering speed. I still use Firefox for testing and troubleshooting (Firebug
is amazing), but I'm increasingly using Chrome for straight browsing.

[1] <http://www-archive.mozilla.org/releases/mozilla1.2/>

~~~
armandososa
Some of us are still using Firefox out of pure loyalty. I'm painfully loyal
(That's why I'm still watching _24_ and _Heroes_ ) but I'm seriously thinking
about switching to Chrome.

~~~
colonelxc
I have a firefox shirt and backpack, I can't turn back now!

But really, Chrome is nice (and I occasionally use it), but it is still
lacking extensions that I use on my main machine. Some of them (such as
NoScript) I hear won't be possible with Chrome's extension model.

------
pg
Incidentally, how is HN for speed?

~~~
patio11
"Very freaking fast."

The front page loads with 10 HTTP requests on an empty cache, and doesn't have
any user-perceptible blockage while waiting on elements. (Pulled this straight
out of YSlow.)

Improvement is overkill, but if you wanted to:

1) Create images{1,2}.ycombinator.com and split y18.gif, s.gif, and
grayarrow.gif, and your JS and CSS files across the three domains. This will
cause older browsers to load them all in parallel. (The HTTP spec suggests
having no more than 2 requests to any one domain open at a time, so you end up
with stairstep loading graphs on browsers which are spec-compliant, such as
IE6. Browsers which have a more pragmatic view towards compliance with
published specifications, such as Firefox 3, will do 8 ~ 10 requests in
parallel.)

Note the balancing act in that every domain you add is another DNS query that
needs to get resolved once. Ideally you want to keep it to about 4 domains or
less.

2) Put far future expires headers on your static assets. You can bust caches
by using the Rails-y filename.js?timestampOfLastModificationOfCode method.

But, again, this site is probably the fastest one I use on a regular basis,
even from braindead clients like my Kindle. I wouldn't guess the marginal
improvement is worth the expensive engineer time.

~~~
jeff18
This is good advice, but not optimal. Using multiple domains in this case will
actually slow down the site because of the additional DNS lookups -- only do
that if you are serving up a ton of images.

a) The CSS file should be inline.

b) The images should be done away with entirely by including them as a data
uri.

c) For IE6-7 (data uris not supported) the fallback image should be sprited.

d) For bonus points (if you want HN to be served nearly instantly, globally)
splurge for an application caching CDN provider like Akamai.
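For (b), a data URI is just the image bytes base64-encoded behind a MIME prefix; a minimal sketch (the MIME type and bytes here are illustrative):

```python
import base64

def data_uri(image_bytes, mime="image/gif"):
    """Embed an image directly in HTML or CSS, saving one HTTP request
    per image. Unsupported in IE6/7, hence the sprite fallback in (c)."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return "data:%s;base64,%s" % (mime, b64)
```

Drop the result into an `src="..."` attribute or a CSS `url(...)`.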

------
robryan
Things like this always make me worry about the patience levels of some
people.

~~~
tokenadult
People ALWAYS have something more important to do than wait for your site to
load, always. It's not that your site is unimportant, it's that your users'
personal lives are more important to each of those users. But don't take my
word for this; this is a well-replicated finding of usability research in
hypertext environments, going back to before the development of the World
Wide Web.

~~~
robryan
I think they have hit the long tail of Firefox browser adoption: people who
don't care much what browser they use.

~~~
tokenadult
We don't know how long that tail is

<http://en.wikipedia.org/wiki/Long_Tail>

until all Web browsers are equally easy to install, as the submitted article
shows Firefox is discovering to its chagrin.

