Everybody is using the "you should follow me on Twitter" line on their blog, just because one guy ran a successful A/B test with it.
The A/B testing companies just seem to promise a lot. Will everybody who uses visualwebsiteoptimizer see a 50% increase in signups when they test popup forms? Apparently not... so it's a false promise of sorts.
It seems to me that the issue isn't A/B testing, it is blind adoption of other people's recommendations. You certainly don't need A/B testing to give people suboptimal advice -- designers, developers, and marketers do it every day without any evidence whatsoever! ("Eat more bananas, they're good for brain cancer!" <-- the state of web design best practices and, ahem, not far off from the state of nutrition research)
I think a healthy conclusion is "this worked for us, so it might be a good idea, but you should test it for your site". Wouldn't you agree?
I suspect you could also hurt yourself with A/B tests, for example if you measure the wrong thing.
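To make that concrete, here's a minimal sketch in plain Python (the funnel rates are entirely made up) of how a test that only tracks clicks can crown the worse variant:

```python
import random

random.seed(42)

# Hypothetical funnel: each visitor may click the signup button,
# and each clicker may then complete the signup form.  Variant B's
# flashier button gets more clicks but annoys people, so fewer
# clickers actually finish signing up.  (Numbers are made up.)
RATES = {
    "A": {"click": 0.10, "signup_given_click": 0.50},
    "B": {"click": 0.15, "signup_given_click": 0.25},
}

def run_variant(name, visitors=10_000):
    clicks = signups = 0
    for _ in range(visitors):
        if random.random() < RATES[name]["click"]:
            clicks += 1
            if random.random() < RATES[name]["signup_given_click"]:
                signups += 1
    return clicks, signups

for name in ("A", "B"):
    clicks, signups = run_variant(name)
    print(f"variant {name}: {clicks} clicks, {signups} signups")

# Measuring only clicks, B "wins" (~1500 vs ~1000); measuring
# signups, B loses badly (~375 vs ~500).
```

If your dashboard only shows clicks, B looks like a 50% improvement while it's actually costing you a quarter of your signups.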
I don't know, maybe I am just a little bit tired of all the simplified promises.
Only about 1 out of 8 A/B tests produces a significant result, and that's the reality. And since only the positive results get published, you may get the impression that A/B testing is a magic bullet.
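To see why selective publication inflates expectations, here's a quick simulation (all numbers hypothetical: a 10% baseline conversion rate, a real +2 point lift in 1 of 8 tests, and a one-sided two-proportion z-test at the usual threshold):

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical world: 800 sites each run one A/B test with 5,000
# visitors per arm.  Only 1 in 8 variants has a real effect
# (+2 points over a 10% baseline); the rest change nothing.
BASE, LIFT, REAL_FRACTION, N = 0.10, 0.02, 1 / 8, 5000

def conversion_rate(rate, n):
    return sum(random.random() < rate for _ in range(n)) / n

published = []
for _ in range(800):
    true_lift = LIFT if random.random() < REAL_FRACTION else 0.0
    a = conversion_rate(BASE, N)
    b = conversion_rate(BASE + true_lift, N)
    # Two-proportion z-test, normal approximation, one-sided.
    pooled = (a + b) / 2
    se = math.sqrt(2 * pooled * (1 - pooled) / N)
    z = (b - a) / se
    if z > 1.64:  # "significant win" -> becomes a blog post
        published.append((b - a) / a)

print(f"{len(published)} of 800 tests were 'publishable' wins")
print(f"average published relative lift: {statistics.mean(published):.1%}")
# The null results (~7 in 8 tests) never get written up, so the
# case studies you read systematically overstate the typical lift.
```

The average lift in the "published" subset looks far better than the typical test, simply because the null results never become case studies.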
The reason such case studies are important is not that the results will work for everyone, but that they might. In fact, if I knew modal boxes worked every time, why would I even A/B test them?
Naturally, A/B testing is just one part of the puzzle, but dismissing it as snake oil is not justified.
Yes, I can see that A/B testing is definitely getting attention and hype, and some people may implement other people's A/B test results directly without testing them (which defeats the original purpose).