
How We Increased Conversion Actions 218% And Increased Sales 0% - gsaines
http://georgesaines.com/2010/09/28/how-we-increased-conversion-actions-218-and-increased-sales-0/
======
aresant
The point is to make sure you're optimizing around a QUALIFIED action.

EG - in the Skritter conversion funnel - you're not disclosing that it's a
paid service until checkout.

That's doing two things:

a) Creating a jarring experience for users who are genuinely interested and
didn't see any $$$ requirement before investing in watching a tutorial.

b) Sending lots of non-qualified people through the funnel.

You’ve got some great content here: <http://www.skritter.com/pricing> \- maybe
incorporate a little of that?

Alternatively use a testimonial to disclose $$$ like: "I learned more during
the past 5 months than the past 3 years – best $9.95 I ever spent!"

~~~
gsaines
Good idea aresant; as I mentioned, we're actually testing a lot of different
variables right now, and one of them is including testimonials to that effect.
It's not in the testing queue yet, however, due to limited traffic and an
abundance of tests!

------
carbocation
Thank you for this. I especially appreciated that you have brought the concept
of "publication bias" (though not by name) into the discussion of AB testing.
In medicine there is a tendency to publish positive results and not negative
results, leading to a bias in favor of positive results. Most likely, this is
an equally large or larger problem for bloggers.

~~~
patio11
Yep. I have an A/B results page which automatically collects closed tests.
<http://abingo.org/results> 75% or so of tests get a null result (A may or may
not be better than B.). Lots of the remainder tell me to what degree of
statistical significance my great new idea for improvement sucked. But you can
still improve over time.
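The significance math patio11 alludes to can be sketched with a two-proportion z-test, a standard way to compare two conversion rates; the visitor and conversion counts below are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for comparing B's conversion rate against A's."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# |z| < 1.96 means no difference at the 95% level: the "null result"
# that roughly three quarters of tests land on.
z = two_proportion_z(conv_a=50, n_a=1000, conv_b=60, n_b=1000)
print(abs(z) < 1.96)  # True: 5.0% vs 6.0% on 1,000 visitors each isn't significant
```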

~~~
gsaines
Hey Patio, thanks for sharing that link, I didn't even know that page existed!

------
biznickman
The greatest issue, as you acknowledge in your post, was that you weren't
testing the point closest to the sale. The first thing to test is the final
registration page. I often spend a fair amount of money driving traffic to
single-page landing pages that I can optimize against. If you can't make
enough money through Google Ads or Facebook ads (e.g. your cost per sale >
revenue per sale), then you either have a bad business or a shitty landing
page.

Rather than just optimizing existing traffic, buy traffic and find out how far
off you are. The best part of these ad systems is that they are relatively
efficient. That means you should be able to quickly figure out how far you are
off.

While I'm sure "learn Japanese" has already been taken, there are probably
plenty of long-tail keywords as well as people on Facebook who haven't been
targeted. Your key is going to be to find out who those people are and drive
them to your site.

I tend to be pretty straightforward with my own investments in projects: if
there's a huge gap between cost of acquisition and revenue per customer (or
the present value of the customer's LTV), you probably need to build
something better or change your product altogether. Your own internal cost
per user will end up being just as high as Google Ads (if not higher), except
Google provides an instant vehicle for testing. In other words, it will help
you fail quicker ... something that's extremely valuable in the world of
startups.
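The back-of-the-envelope check described above is easy to run; every number below is a hypothetical placeholder except the $9.95 price quoted upthread:

```python
# Hypothetical ad buy and funnel rates -- substitute your own.
ad_spend = 500.00        # total spent on Google/Facebook ads
clicks = 1000            # paid visitors delivered
signup_rate = 0.04       # fraction of visitors who sign up
paid_rate = 0.25         # fraction of signups who become paying customers
revenue_per_sale = 9.95  # the monthly price mentioned upthread

sales = clicks * signup_rate * paid_rate  # 10 paying customers
cost_per_sale = ad_spend / sales          # $50.00 per customer
# Compare against one sale's revenue (or, better, the customer's LTV):
print(cost_per_sale > revenue_per_sale)   # True: cost per sale exceeds one month's revenue
```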

------
chrisgoodrich
I think that many people who post about AB tests fail to mention that one's
expectations of AB testing have to align with the types of results it can
actually deliver.

AB testing isn't meant to be used through the entire conversion funnel. That
is, you can't measure an optimization on your homepage by sales. Sales are an
indicator of your entire conversion funnel. Each page of every site should
have a single goal and you should AB test against that goal. In your case I
would say that your AB test was successful; however, you didn't go far
enough. Once you were getting increased traffic to your signup page, it's
time to AB test the signup page and improve that conversion.

The statistics behind AB testing are often lost in the hype. The tests used
to calculate statistical significance require a random sample and isolated
variables. The moment you try to attribute success or failure of your AB test
beyond the primary measurable goal, you're introducing outside variables that
will skew your results.

All of that being said, I would argue that true AB testing is not cost
effective for most startups. It takes a lot of time and energy to test every
part of your conversion funnel.

~~~
ryanwaggoner
_In your case I would say that your AB test was successful; however, you
didn't go far enough. Once you were getting increased traffic to your signup
page, it's time to AB test the signup page and improve that conversion._

This was my thought at first, but thinking about it now, their AB test
actually caused a _decrease_ in conversion on the signup page. And that's the
point they're making: they drove traffic to the signup page more effectively,
but that traffic was no more interested in actually signing up. You can do a
lot of things to massage the stats if you're only optimizing for a single
step. You can make your homepage promise people a golden pony if they'll only
click on the link to your pricing page, but they're not going to get there and
suddenly decide they don't care about the pony and want to give you money
instead.
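The distinction being drawn here, per-step versus end-to-end conversion, is easy to see with invented funnel numbers (the jump in signup-page views below just echoes the article's 218% headline):

```python
def funnel_rates(visitors, signup_views, sales):
    """Conversion rate at each funnel step plus the end-to-end rate."""
    return {
        "click_through": signup_views / visitors,   # homepage -> signup page
        "signup_conversion": sales / signup_views,  # signup page -> sale
        "end_to_end": sales / visitors,             # homepage -> sale
    }

before = funnel_rates(visitors=1000, signup_views=100, sales=10)
after = funnel_rates(visitors=1000, signup_views=318, sales=10)

# Click-through more than tripled, but signup-page conversion fell by the
# same factor, so end-to-end conversion is unchanged at 1%.
print(after["end_to_end"] == before["end_to_end"])  # True
```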

~~~
chrisgoodrich
IMO that screams for more AB testing as there's obviously a disconnect between
the homepage and signup pages that should be tested.

------
btilly
In this situation I would recommend A/B testing to signup. Because that is the
first hard decision that people have to make.

If you're A/B testing to actual sale, I have two points to make. The first is
that you should be aware that it is OK to run multiple A/B tests at the same
time. The second is that you should pay close attention to whether there is a
possibility that your change will compress the sales cycle. If it does, an
apparently positive result can be meaningless.
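One common way to keep simultaneous tests from interfering is to assign visitors to each test independently, e.g. by hashing the visitor ID together with the test name. A minimal sketch; the function and test names are made up for illustration:

```python
import hashlib

def assign_variant(user_id, test_name, variants=("A", "B")):
    """Deterministically assign one visitor to one variant of one test.

    Hashing (test_name, user_id) together keeps the assignment random
    across users, stable per user, and independent across tests, so
    each concurrent test still measures an isolated variable.
    """
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

variant = assign_variant("user-42", "homepage-headline")  # "A" or "B", stable per user
```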

~~~
gsaines
Yeah, we're actually running 3 tests currently, although you probably didn't
notice them because they are fairly subtle. We were doing multivariate stuff
before, but I think we're going to stick with simple A/B tests for now due to
limited traffic.

------
paraschopra
This is a great article demonstrating that testing doesn't automatically
guarantee a boost in conversions.

On the homepage, you optimized for conversions to the signup page, so that's
what you got! If on the signup page you optimize for signups, you will get
that (hopefully). The only thing testing does is replace blind flying with a
set of gauges. The act of coming up with test objectives and good variations
is much more vital for increasing sales and conversions.

