
Nice and all, but I have started to wonder if the A/B-Testing mania is just the new SEO mania, resulting in just as much snake oil in the end.



I think the snake-oil nature of (much of) SEO stems from the fact that it's difficult or impossible to measure in an accurate, unambiguous way. Since A/B testing includes measurement by its very nature, I suspect it will be more resistant to snake oil.


Can you please elaborate on how it is snake oil? I am genuinely curious, since I run an A/B testing company.


There are probably some things that work (naturally, most products can be improved upon), but I doubt that it is a solution for everything. There was already a comment in this thread from someone who claimed that popup forms did not work in his case.

Everybody is using the "you should follow me on Twitter" line on their blog, just because one guy ran a successful A/B test with that line.

It just seems that the A/B testing companies promise a lot. Will everybody who uses visualwebsiteoptimizer see a 50% increase in signups (when they test the popup forms)? Apparently not... so it is a false promise of sorts.


I'm a bit troubled by my results being used to support that conclusion, in the same way a pharmaceutical researcher who published "Penicillin is not an effective treatment for brain cancer" would be troubled to be cited in support of "Well, that tears it, medical research is a sham."

It seems to me that the issue isn't A/B testing, it is blind adoption of other people's recommendations. You certainly don't need A/B testing to give people suboptimal advice -- designers, developers, and marketers do it every day without any evidence whatsoever! ("Eat more bananas, they're good for brain cancer!" <-- the state of web design best practices and, ahem, not far off from the state of nutrition research)


The same thing happens with optimizing code for speed/efficiency/etc. People want rules of thumb and they don't want to hear the truth, which is "Rules of thumb are usually bullshit. You need to measure, change something, then measure again. Your system may or may not be the same as mine."
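As a minimal sketch of that measure-change-measure loop in Python (the two string-joining functions are invented purely for illustration; substitute your own code and workload):

    # Measure a baseline, make one change, measure again on *your* system.
    import timeit

    def join_with_concat(items):
        # baseline: repeated string concatenation
        out = ""
        for s in items:
            out += s
        return out

    def join_with_join(items):
        # candidate change: str.join
        return "".join(items)

    data = ["x"] * 10_000

    baseline = timeit.timeit(lambda: join_with_concat(data), number=200)
    candidate = timeit.timeit(lambda: join_with_join(data), number=200)
    print(f"baseline:  {baseline:.3f}s")
    print(f"candidate: {candidate:.3f}s")
    # Only keep the change if the numbers actually improve on your hardware
    # and your data; the rule of thumb alone proves nothing.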


Btw, I don't like your analogy; I did not conclude from your example that A/B testing never works. But one example is enough to show that it does not always work. Specifically, the claim "use A/B testing and your signup rate will improve x%" is apparently not true.


Of course, I have no doubt that A/B testing is useful; I take more issue with the way it is being marketed. Likewise, a lot of SEO is probably useful, but there are also a lot of black sheep.


It sounds like you think that the reported results of these A/B tests are snake oil, not A/B testing itself. I think most authors of A/B-testing-report articles would recommend testing such changes yourself, as opposed to accepting their findings as fact for every site in existence. Those who said the popup forms did not work for them only know that because they A/B tested them.

I think a healthy conclusion is "this worked for us, so it might be a good idea, but you should test it for your site". Wouldn't you agree?


Yes and no - you cannot test it all; there might be a point of diminishing returns after a certain level of optimization has gone into a site. Just as with SEO: SEO is not completely useless, yet there are people who promise too much or even hurt you.

I suspect with A/B tests you could also hurt yourself, for example if you measure the wrong thing.

I don't know, maybe I am just a little bit tired of all the simplified promises.


Totally agree. A/B testing is not a panacea, and certainly not everyone who uses VWO sees improvements. In fact, the week before last we prominently featured an article on our blog by Noah of AppSumo, who said only 1 out of 8 A/B tests produces results of any sort. Here is that article: http://visualwebsiteoptimizer.com/split-testing-blog/a-b-tes...

Only 1 out of 8 A/B tests produces results, and that's the reality. And only the positive results get published, so you may get the impression that A/B testing is snake oil.

The reason such case studies are important is not that these changes will work for everyone, but that they may work. In fact, if I knew modal boxes worked every time, why would I even A/B test them?

Naturally, A/B testing is just one part of the puzzle, but discounting it as snake oil is not justified.
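For what it's worth, here is a rough sketch of what "producing results" usually comes down to: a simple two-proportion z-test on a variant's conversion numbers. The visitor and conversion counts are invented, and this is just one textbook way to check significance, not a description of how VWO itself computes it:

    # Decide whether a variant's lift is distinguishable from noise.
    # All numbers below are made up for illustration.
    from math import sqrt, erf

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # two-sided p-value from the standard normal CDF
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return p_b - p_a, p_value

    lift, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=230, n_b=5000)
    print(f"observed lift: {lift:.4f}, p-value: {p:.3f}")
    # Here the p-value is around 0.14, so the test would count as "no result"
    # despite a visible lift - which is how most of the 8 end up.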


No offense intended, I am sure VWO is a great product. Haven't tried it yet, but I have been meaning to. Also, the blog is a joy to read. I was more expressing a general worry.


No offense taken :)

Yes, I can see that A/B testing is definitely getting attention and hype, and some people may directly implement results from others' A/B tests (which defeats the original purpose).



