

Ask HN: Should I believe the results of our A/B testing? - oziumjinx

Over the last few weeks our little app got written up on some blogs around the web.  It boosted our traffic levels significantly for around a week, and the conversion rates from the referring blogs were extremely high.  I'm attributing the high conversion rates to the fact that after someone read a review of our app, they wanted to come to our site and try it.  So before they even became a 'user' the conversion was already there, just waiting to be registered.

So let's say we had 100 users who read one of our mentions on a blog and clicked through to our site to give the app a try... Does the download button or headline of the landing page (which we are A/B testing) really matter that much at this point? These users had a pretty clear goal of trying our app.
The conversion rates from these visitors were around 75-85% across about 10 blogs (about 4,000 visitors in total).
The app has no registration or credit-card barriers.  The only requirement is that the user needs to have Firefox (which all of the blogs clearly mentioned before sending the users).

So should I be making judgment calls about the most effective headline or the color of the download button?  Or should I just throw away this data and look at other traffic sources before deciding which headline to set as the next control?
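The worry above, that pre-sold blog visitors may mask the landing page's real effect, is the standard argument for segmenting results by traffic source before drawing conclusions, rather than throwing the data away. A minimal sketch of that tally (the log format and numbers are hypothetical, not from the actual app):

```python
# Hypothetical visit log: (referrer, converted) pairs.
visits = [
    ("blog", True), ("blog", True), ("blog", False),
    ("search", True), ("search", False), ("search", False),
]

# Tally (visits, conversions) per traffic source.
rates = {}
for source, converted in visits:
    seen, hits = rates.get(source, (0, 0))
    rates[source] = (seen + 1, hits + converted)

for source, (n, hits) in rates.items():
    print(f"{source}: {hits}/{n} = {hits / n:.0%}")
```

If the headline/button variants only differ for the non-referred segment, that tells you the blog traffic was indeed converting regardless of the page.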
======
po
So you're saying the sale was basically yours to lose? I would think that if
the results of your tests say the difference was significant (assuming you ran
them correctly), then it probably was. If you have data, then you should use
it to your advantage. What exactly makes you not trust it?

It sounds like the question you mean to be asking is more like: Do I have
bigger fish to fry than playing with the color of the buttons on the site?

