How One Check Box Lowered Conversions by 17% (georgesaines.com)
17 points by gsaines on Dec 7, 2010 | 24 comments



I would suspect that users are responding to your headline message "Get 5 tip emails on how to use Skritter" with concern that your product has a steep learning curve - I have to read 5 whole emails to figure out how to use the product? Ugh.

The difference between a good opt-in headline and a bad one can account for decreases like the one you experienced, and a good one can actually increase your conversion rate.

Trying something less offensive, or even pushing an active voice, would be a better fit:

[X] Get free tips and bonuses from the Skritter newsletter.

or

[X] YES - I'd like the most recent news & updates on Skritter delivered to my inbox.

The following article about some of the results we drove is not perfectly relevant, but it demonstrates how the addition of a checkbox bumped results up by 11%:

http://conversionvoodoo.com/blog/2010/07/11-conversion-rate-...


That's a great point aresant, and we certainly didn't test the wording, which I admit would probably be a good idea. As I mentioned in the post, we actually thought very little about the wording or the styling or anything else; we were just thinking "let's get this email tips thing underway - oh yeah, just hook something up to give everyone the option."

Point definitely taken about the learning curve. I'll see if we can't make a few more variants of this to test whether that box could actually work for us.


I think that for a lot of A/B testing software (if not all), the statistics are not done correctly.

Here is what people do: they keep running the test until they feel the results are statistically meaningful. That is, they do not fix N a priori.

But that is wrong.

Put another way: if you wanted to test whether a coin was biased, you should not stop after 10 tosses just because you got 9 heads and declare the coin biased. You should decide beforehand how many tosses (N) to make, and then conduct the test.

And if the results are inconclusive, you need to flush the data and start again.
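
A quick simulation (my sketch, not from any particular A/B tool) makes the coin example concrete: flipping a fair coin and stopping as soon as a z-test dips below p = 0.05 declares "bias" far more often than the nominal 5%, while committing to a fixed N up front stays near 5%:

    import random
    from statistics import NormalDist

    def p_value(heads, n):
        # Two-sided p-value for H0: fair coin (normal approximation).
        z = (heads - 0.5 * n) / (0.25 * n) ** 0.5
        return 2 * (1 - NormalDist().cdf(abs(z)))

    def looks_biased(max_n=1000, peek=True):
        heads = 0
        for i in range(1, max_n + 1):
            heads += random.random() < 0.5  # a genuinely fair coin
            if peek and i >= 30 and p_value(heads, i) < 0.05:
                return True  # stopped early: "statistically significant"
        return p_value(heads, max_n) < 0.05  # one test at the fixed N

    trials = 2000
    print(sum(looks_biased(peek=True) for _ in range(trials)) / trials)   # well above 0.05
    print(sum(looks_biased(peek=False) for _ in range(trials)) / trials)  # close to 0.05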


In the screenshot, the checkbox is actually checked. Was that the default setting? If so, I could see that being off-putting to users -- whenever I see that, I assume they're trying to trick me into opting in to something.


I think people are very, very wary about pre-checked checkboxes just under payment details, because it reeks of cramming.

That is, they set off alarms not just that they might be subscribed to some email list, but that a "free trial subscription" (that auto-renews to a paid subscription) is being added to the order.


Agreed. Whether I'm automatically opted in or out of the marketing checkbox on signup forms is usually my first indication of the legitimacy of the company/service.


It's been more than a month since we took the test down, but I do think we had it defaulted to checked ... never thought about it being sleazy, but that's a good point.


That was my immediate reaction as well: "Oh, they want to spam me. What else am I inadvertently agreeing to if I sign up?"


That was exactly my response. "If their default action is to send me five separate emails immediately after paying, am I going to be getting emails for the rest of my life for trying this thing out?"

I'm touchy about my inbox. I let DailyBurn email me for as long as I had a paid account there, and I still let OKCupid email me, years later. But they both had to earn my trust with a combination of a slow start and relevant emails. Pretty much everyone else annoyed me immediately.


I use 33mail.com, it's fantastic.


Actually, we were testing it checked and unchecked by default, too. I don't remember the details, but the conversion rate drop came from combining both of those variants.


Even if people don't find it 'sleazy' per se, you're forcing people who don't want the emails to take two actions to submit the form - one of which requires care and attention, rather than just hitting the big button. The extra burden alone may well be enough to account for the drop off.


The sentence "After only a short period of time we were able to determine with 94% confidence that the variant was performing worse..." sounds like they might have fallen into the trap described here: http://www.evanmiller.org/how-not-to-run-an-ab-test.html


Yeah, I would really like to know how big the sample was.

It's scary to think a checkbox could have such a huge impact on my income next year. Are people really so fickle?


We had thousands of visitors and hundreds of conversions (I don't know the exact numbers off the top of my head). We let it run long enough that the confidence intervals for the results fell below 0.1%. For this particular test we would have let it run longer, but the results let us know we didn't actually need to: since the variant was likely to be worse, we just shut it down.

We are (unfortunately) well aware of how statistically significant things can look with too little data and, on the flip side, of how easy it is to optimize for the wrong conversion goal (see my other post: http://georgesaines.com/2010/09/28/how-we-increased-conversi...). We use Visual Website Optimizer, so we have to set up a test, agree on a number of conversions we have to see before making our first judgment, and then adjust our expectations based on the number of times we peek.
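
To make the peek adjustment concrete, here is a minimal sketch assuming a Bonferroni-style split of the error budget (not necessarily what Visual Website Optimizer actually does):

    # Hypothetical helper: divide the overall false-positive budget
    # across the number of planned peeks at the data.
    def per_peek_alpha(overall_alpha, num_peeks):
        return overall_alpha / num_peeks

    # Peeking 5 times at a test you want at 95% confidence overall
    # means demanding 99% confidence at each individual peek:
    print(per_peek_alpha(0.05, 5))  # 0.01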


If you got a 99.9% confidence, why do you say 94%? Is that just what it was when you reached the original sample size?


I'm surprised no one has mentioned this, but the "x" isn't very confidence-inspiring, especially next to financial fields. Patio11 once told a tale [1] of an Indian designer who created a "continue" graphic that was red with a yellow halt-don't-continue-past-here hand. I would not be surprised if a green checkmark instead of a black "x" would have helped conversions, but this is coming from someone who has not run any A/B tests...

[1] http://www.kalzumeus.com/2008/03/02/my-experience-with-outso...


The small things really do matter when dealing with fickle users. I wonder, though, how much of the 17% might be due to the placement of the checkbox. Putting it below the credit card info would make me think "wait, I'm paying these guys and they are telling me it isn't easy to use?". Perhaps moving it up by the email address field would yield different results.


Just what I was going to say. The form is split into two sections, one for account information, the other for billing. The "Tips" checkbox is in the billing section! That's a terrible place to put a checkbox labelled "Tips". Are these additional charges, as per a restaurant tip!? Do I get charged for these emails? Do I get sent 5 per day?


Yeah, that's also a good idea. Along with the comment above about the wording, I'm wondering now if I couldn't iterate that checkbox for the better. Thanks for the idea.


94% is not science. If you do 15 random experiments, there's a good chance one of them will succeed at the 94% confidence level.
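
To put a rough number on that (back-of-the-envelope, assuming independent experiments):

    # Chance that at least one of 15 null experiments clears a 94% bar:
    print(1 - 0.94 ** 15)  # ~0.605, i.e. roughly 60%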


Expedia had a similar problem (http://www.silicon.com/management/sales-and-marketing/2010/1...) with a different data field, but the message is clear - use thy analytics well :-)


"Get 5 tip emails on how to use Skritter"

is very hard for a customer to place a value on when they haven't even used Skritter yet (they're still at the registration stage!).

Why not

"Get our Secret Tips to Language Learning free course (5 emails)"

Or something like that. The benefit sounds much more general, although the content can probably stay basically the same as it is now.


I don't think it's the presence of the checkbox so much as the reminder that "we will now have your email address and you will receive generic mail from us". I don't think I've ever heard of anyone /excited/ at the prospect of receiving generic mail from vendors, and I certainly know of plenty who hate it.



