
Don't wait: http://mynaweb.com/ [Yes, this is a shameless plug for my startup.]

It seems like most people who use these content optimization tools don't really understand the statistics involved. What are your thoughts on this? How do you educate your users on the merits of your approach vs. A/B testing when the topic is so complex?

Also, despite this being a slightly pro-A/B-testing post, I have to say it's actually made me more interested in trying out Myna's MAB approach.

The same way as every product from GWO and T&T on down: show a pretty graph that ignores the underlying assumption that it's even possible to use statistics to conjure certainty from uncertainty, and trust that users will never know or care about the difference.

/former A/B test software dev who fought my users to try to stop them from misinterpreting results, and failed.

If it gives you any comfort: if there is a significant underlying difference and the calculations are done right, then with high probability they will get the right answer even though they are misunderstanding the statistics.
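You can check this claim with a quick simulation. The sketch below (stdlib only; the conversion rates, sample size, and trial count are all hypothetical numbers chosen for illustration) runs a standard two-proportion z-test many times when one variant really is better, and counts how often the test names the right winner:

```python
import math
import random

random.seed(0)

def simulate_ab_test(p_a, p_b, n, z_crit=1.96):
    """Run one A/B test: n visitors per arm, then a two-proportion z-test.

    Returns 'B' if B is declared the significant winner, 'A' if A is,
    and None if no significant difference is found.
    """
    conv_a = sum(random.random() < p_a for _ in range(n))
    conv_b = sum(random.random() < p_b for _ in range(n))
    pooled = (conv_a + conv_b) / (2 * n)
    se = math.sqrt(2 * pooled * (1 - pooled) / n)
    if se == 0:
        return None
    z = (conv_b / n - conv_a / n) / se
    if z > z_crit:
        return 'B'
    if z < -z_crit:
        return 'A'
    return None

# Hypothetical true rates: B really is better (12% vs 10% conversion).
trials = 200
wins = sum(simulate_ab_test(0.10, 0.12, 10_000) == 'B' for _ in range(trials))
print(f"B correctly declared winner in {wins}/{trials} trials")
```

With a real 2-point gap and 10,000 visitors per arm, the test picks B in nearly every trial, whatever the user believes the p-value means.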

Acceptance of this fact has avoided a lot of potential ulcers for me.

Just to be clear, we don't have an anti-MAB or pro-A/B-testing stance. The point was that MAB is not "better," as an earlier article (titled "20 lines of code that beat A/B testing") had claimed. These methodologies clearly serve two different needs.

You camouflaged your sign-up button with a nature color. It took me a while to find it.

You probably just pulled the wrong lever on his logarithmic-regret-optimised multi-armed bandit. ;)

I should note here that if you use Myna, you will be using a much better multi-armed bandit approach than the epsilon-greedy algorithm that lost in this blog post.

See my longer top-level comment for some of the trade-offs.
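For readers unfamiliar with the algorithm being criticized, here is a minimal epsilon-greedy sketch (the arm conversion rates, epsilon value, and pull count are hypothetical examples, not anything from the blog post or Myna):

```python
import random

random.seed(1)

def epsilon_greedy(true_rates, pulls, epsilon=0.1):
    """Minimal epsilon-greedy bandit over Bernoulli arms.

    With probability epsilon, explore a uniformly random arm;
    otherwise, exploit the arm with the best observed success rate.
    """
    n_arms = len(true_rates)
    counts = [0] * n_arms     # pulls per arm
    successes = [0] * n_arms  # observed conversions per arm
    for _ in range(pulls):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)  # explore
        else:
            # exploit: best empirical rate (unpulled arms count as 0.0)
            rates = [s / c if c else 0.0 for s, c in zip(successes, counts)]
            arm = rates.index(max(rates))
        counts[arm] += 1
        successes[arm] += random.random() < true_rates[arm]
    return counts, successes

counts, successes = epsilon_greedy([0.05, 0.15], pulls=20_000)
print("pulls per arm:", counts)
```

The known weakness, which smarter bandits (e.g. the logarithmic-regret algorithms joked about above) address, is that epsilon-greedy keeps wasting a fixed fraction of traffic on exploration forever and can temporarily lock onto a lucky inferior arm.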
