

Google Website Optimizer Case Study: Daily Burn, 20%+ Improvement - divia
http://www.fourhourworkweek.com/blog/2009/08/12/google-website-optimizer-case-study/

======
onreact-com
"Conclusion: Simplified design improved conversion by an average of 20.45%."
is the most important sentence in this post.

To be honest you don't have to use Google Website Optimizer to notice that.

~~~
patio11
It is easy to believe the things you _want_ to believe. A/B testing is useful
for convincing you of the things you _don't_ want to believe.

Example: I'm releasing my Rails A/B testing framework this Sunday, with a case
study writeup taken from my site. My site currently allows people to defer
signup for the trial by means of a guest login. That sort of option is popular
here. I'm told it increases usability and user engagement, right?

Without ruining the surprise: that is a testable hypothesis. ;)
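
A minimal sketch of what such a test boils down to, in plain Python rather than the Rails framework mentioned above; the variant names and counts are invented for illustration:

    # Two-proportion z-test: does offering a guest login change trial-signup
    # conversion? All numbers below are hypothetical.
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """z statistic and two-sided p-value for the difference in rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return z, 2 * (1 - NormalDist().cdf(abs(z)))

    # Assumed traffic split: 1,000 visitors per arm.
    z, p = two_proportion_z(conv_a=180, n_a=1000,   # A: signup required
                            conv_b=210, n_b=1000)   # B: guest login offered
    print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 would reject "no difference"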

~~~
onreact-com
A/B testing is great, no doubt. But "simplify the design" on its own is not
much of a takeaway from such a test. It's like driving a car for the first
time and then telling everybody that it's faster than walking.

------
videophile
There is some "New Math" being invoked here. If the conversion rate goes from
24.4% to 29.6%, it's disingenuous to claim a 21.2% improvement.

I stopped at the first misuse of statistics, but by that point, they had taken
out informative links from their web site and turned it into a landing page.
Frankly, a 5-point change in conversion rate is in the noise compared to the effect
of targeted advertising (and I have no faith that these sloppy reporters
actually did a properly controlled experiment).

Edit: Downmodded for asking for real math over marketing math?

~~~
jbm
It's been a while since I've taken a math class, but I'm not sure what the
problem is with calling it a 21.2% improvement.

The old conversion rate was 24.4%. The new one was 29.6%.

(29.6 - 24.4) / 24.4 = .213 = 21.3%.

Is it misleading because their number of conversions is so low? (Not trying to
be snarky, I have no idea)
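
For what it's worth, the two figures being argued over come from the same numbers; a quick Python sketch of both, using the rates quoted above:

    # Absolute change (percentage points) vs. relative change (percent of
    # the old rate), both from the 24.4% -> 29.6% figures in the article.
    old_rate, new_rate = 0.244, 0.296

    absolute_change = new_rate - old_rate               # 0.052
    relative_change = (new_rate - old_rate) / old_rate  # 0.213...

    print(f"absolute: +{absolute_change * 100:.1f} percentage points")  # +5.2
    print(f"relative: +{relative_change:.1%} of the old rate")          # +21.3%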

~~~
videophile
The improvement in the conversion rate is (29.6 - 24.4) = 5.2 percentage points.

Reporting "a percent of a percent" is downright misleading because it is
entirely dependent on the starting baseline and also because conversion rates
have high natural variance. Imagine an improvement from 2% conversion (for a
truly awful site) to 2.6% (for a site just as awful). That "30% improvement"
just isn't. If they were intellectually honest and called it a "0.6-point
improvement," anyone would be able to see that the claimed improvement is well
within stddev.
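
A rough way to check the "within stddev" claim: compute the standard error of a 2% conversion rate at some sample size and compare it to the 0.6-point move. The sample size below is an assumption, not a figure from the thread or the article:

    # Standard error of a binomial proportion vs. the claimed improvement.
    from math import sqrt

    p, n = 0.02, 1000            # assumed: 2% baseline, 1,000 visitors
    se = sqrt(p * (1 - p) / n)   # standard error of the observed rate
    delta = 0.026 - 0.020        # the 0.6-point change

    print(f"standard error ~ {se * 100:.2f} points")      # ~0.44 points
    print(f"change = {delta / se:.1f} standard errors")   # ~1.4, i.e. hard to
                                                          # distinguish from noise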

~~~
webwright
Huh.

The number that matters is "how many more users are signing up since we
changed"? The answer is NOT 5.2%.

Example: I have a 1% conversion rate on 100,000 visitors. 1,000 customers!
Yay! I change something and I get it up to 2% on the next 100,000 visitors.
_2,000_ customers.

Which is a better description: "I doubled my conversion rate" or "I increased
my conversion rate by 1%?" I'd go with the former.

~~~
videophile
An equally valid question to ask is: "How many users are bouncing away from
that website after landing on the first page?" That raw rate improved by 5.2
percentage points. When you quote percentages of percentages, the figure
varies widely depending on your point of view -- in the original example, an
optimist might report a 20% improvement, looking at the users who reached
point B, and a pessimist might report a 10% improvement, looking at the users
who left without reaching point B. That is part of why it's a bad idea to
quote percentages of percentages: they depend on the baseline, and there are
at least two baselines to choose from (successful conversions vs. bounces).
Percents of percents also hide the difficulty of the task, which varies
depending on the starting point (going from 1% to 2% is probably not hard;
going from 99% to 100% is probably very, very hard).

But the other reason why it's a bad idea to quote percentages of percentages
is that there is no indication whether the claimed improvements are actually
statistically significant. I didn't see any data in the article to indicate
they are, and what little I know about conversions from my own observations is
that they vary by at least 10% depending on totally random factors.
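
The "totally random factors" point can be illustrated with a small simulation: hold the true rate fixed and watch how much the observed rate swings from sample to sample. The traffic volume and true rate below are assumptions:

    # Simulate repeated measurement periods at a fixed true conversion rate.
    import random

    random.seed(1)
    true_rate, visitors, periods = 0.25, 2000, 20   # assumed numbers

    rates = []
    for _ in range(periods):
        conversions = sum(random.random() < true_rate for _ in range(visitors))
        rates.append(conversions / visitors)

    lo, hi = min(rates), max(rates)
    print(f"observed rates: {lo:.3f} .. {hi:.3f}")
    print(f"swing relative to the true rate: {(hi - lo) / true_rate:.0%}")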

>Which is a better description: "I doubled my conversion rate" or "I increased
my conversion rate by 1%?" I'd go with the former.

I see your point and I'd use it if I needed to flatter myself. But I'm
sticking to the latter. And I'd be sure to mention the starting baseline as
well.

~~~
webwright
"I see your point and I'd use it if I needed to flatter myself. But I'm
sticking to the latter. And I'd be sure to mention the starting baseline as
well."

Help yourself. Just don't use that type of language in any conversation with
anyone else on the planet because you'll likely confuse the hell out of them
(take note of the "wtf is this guy talking about" downvotes you've received on
this thread).

It's not about flattery. It's about metrics that matter and revenue. Doubling
your conversion rate pretty much doubles your revenue. That aside, the goal
here is communication. I'd wager that if you said that you doubled your
conversion rate, people would grok what it meant (1% to 2% or 5% to 10%--
either way, a big win because it'd double revenue and profit down the funnel).
If you said you increased your conversion rate by 10%, I'd wager people would
assume that meant something like 10% to 11%.

~~~
videophile
>take note of the "wtf is this guy talking about" downvotes you've received on
this thread

I don't rely on opinion-polling to determine who is right, esp. when it comes
to questions related to science.

>conversation with anyone else on the planet

On your planet, a move from 0.0001% to 0.001% would be reported as "an
improvement of 900%." No thanks.

> It's about metrics that matter and revenue.

Yes, we established long ago that it's about marketing-speak versus math.

So, since you have so many upmods, perhaps you can tell me what happens in
your world when the conversion rate goes from 0% to 1%? :-)

