

This one A/B testing case study will leave you smarter and wiser - paraschopra
http://visualwebsiteoptimizer.com/split-testing-blog/increase-relevance-to-boost-conversions/

======
tlarkworthy
I would probably click the one without the price (thinking I was getting a
price list for free), then get really annoyed to find it was trying to charge
me $2.

Increase click rate? ... yes

Increase conversion rate? ... inconclusive

I bet they have the conversion numbers, but either they contradicted the
story, or the results were so weak that the A/B test wasn't actually useful in
this case.

~~~
siddharthdeswal
They didn't measure how this affected revenue. They're still in the process of
rolling out that test. (I work for VWO)

------
robjh
I didn't click the link; it sounds too much like an advert.

But if I did, I suspect I'd find one weird trick that doctors hate.

~~~
siddharthdeswal
Thanks for the feedback. Tells us we need to think up better headlines.

------
ScottWhigham
Any time I see wording like "Which of these call-to-action button versions
increased clickthroughs by 600%?", my first thought is "too small a sample
size". Over a large sample, it's hard to believe the difference between those
two buttons would be anywhere close to 600%. Maybe I missed it - did she say
anywhere how many days/impressions were used?

~~~
siddharthdeswal
Sample size was 4500+, distributed between the two versions. In this case the
sample size was not small, but the conversion rate of the page was very low,
which is why a 600% relative increase was even possible.

That's a common situation in real-world A/B testing: plenty of websites have
conversion rates hovering in the 0.xx% range. There, thoughtful hypotheses
almost always produce significant gains. Not because they did something
magical, but because they simply communicated the business offer better.
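
To make that concrete, here's a minimal back-of-the-envelope sketch (Python)
of the standard two-proportion z-test. The counts are hypothetical - the test
only reported 4500+ visitors total and a very low baseline rate - but they
show why a low baseline plus a modest sample can still produce a clearly
significant 600% lift:

    import math

    def two_proportion_ztest(x1, n1, x2, n2):
        # Two-sided z-test for the difference between two proportions,
        # using the pooled rate under the null hypothesis.
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p2 - p1) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
        return z, p_value

    # Hypothetical split: ~2250 visitors per arm, a 0.5% baseline rate,
    # and a ~7x (i.e. +600%) rate in the variant.
    z, p = two_proportion_ztest(11, 2250, 79, 2250)
    print("z = %.2f, p = %.2g" % (z, p))  # z ~ 7.2, p far below 0.05

Note the flip side: with a 20% baseline, the same 4500 visitors could never
show a 600% lift at all, since the variant rate would have to exceed 100%.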

