I think a more accurate headline is "Redesigned form increases conversion 25-40% (who knows why?)"
"While it's possible these adjustments also contributed to the increase, it's unlikely they were solely responsible for it"
What was he smoking when he wrote that? There's absolutely no justification for that statement that I can see.
I think the mad-lib form technique is interesting, and I'd love to see some good A/B testing of just that one change, without all the rest. You could also test reducing the number of input fields (also likely important), clearly marking which fields are optional, and hiding optional fields behind a link instead of presenting them all at once.
But you're right, of course: there's no reason to assume the improvement came from the most obvious of the seven changes rather than from all (or just one!) of the other six. Did anybody notice that the button call to action changed? I've seen A/B tests where that did 20% by itself. Did anybody notice that the header call to action changed? I've seen A/B tests where that did 20% by itself. Did anybody notice that they added the words "Thank you."? I've seen A/B tests where... actually, I haven't for "Thank you." (if you've got one, please share). But you get the general idea.
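If you did isolate one change (say, the button copy) in its own A/B test, a two-proportion z-test is the usual way to check whether the lift is real. A minimal stdlib-only sketch, with made-up visitor and conversion counts:

```python
# Sketch: two-proportion z-test for one isolated change. The counts below
# are hypothetical, not from the article.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for conversion rates a vs b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * (1 - Phi(|z|))
    return z, p_value

# 1000 visitors per arm; 100 vs 120 conversions is a "20% lift" --
# but is it significant at this traffic level?
z, p = two_proportion_z(100, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Note that even a 20% relative lift isn't significant with only a thousand visitors per arm, which is exactly why multi-change redesigns are tempting and single-change tests need patience.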
A fractional factorial design (http://en.wikipedia.org/wiki/Fractional_factorial_design) of the experiments, or other statistical techniques (http://en.wikipedia.org/wiki/Bayesian_experimental_design), may decrease the amount of traffic needed.
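To make that concrete: with seven on/off changes, a full factorial needs 2^7 = 128 variants, but a 2^(7-4) fractional design covers all seven main effects in just 8 variants, at the cost of confounding them with interactions. A toy construction (standard generators D=AB, E=AC, F=BC, G=ABC; nothing here is from the article):

```python
# Build a 2^(7-4) fractional factorial design: 8 runs, 7 two-level factors.
# -1/+1 encode "old version" / "new version" of each form change.
from itertools import product

base = list(product([-1, 1], repeat=3))  # full factorial in base factors A, B, C
runs = []
for a, b, c in base:
    # Derived factors via generators: D = AB, E = AC, F = BC, G = ABC
    runs.append((a, b, c, a * b, a * c, b * c, a * b * c))

for r in runs:
    print(r)
print(len(runs), "variants cover 7 factors (vs. 2**7 = 128 for a full factorial)")
```

Each column is balanced (every factor is on in exactly half the runs), which is what lets you estimate all seven main effects from so few variants.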
I don't think the headline or article is wrong, though I suppose he could say, "your mileage may vary".
Regardless, I think it's compelling enough to try it out on a few forms. I'll try it on our site and crow if there's a meaningful change in either direction.
I'd also suspect that seeing your full name in the paragraph gives you more reason to enter accurate information. Not to mention there could be spam and security benefits in there too.
I'd really like to hear if anyone runs some tests against this.
I didn't see any uptick in those who signed up. But the generation of anonymous accounts on behalf of the user certainly did increase the interaction with my site.
* "I'm given you" -> "I've given you", "I'm given", "I'm giving you", or "I'm givin' you"
* It's not clear what the field after "from" is supposed to be
* "your email is [ ] with a password of [ ]" -- what do you want my email password for? Try "your email is [ ] and your password is [ ]".
I've been reading Don't Make Me Think and Rocket Surgery Made Easy, both by Steve Krug. Good stuff if you want to catch issues like these. Just a small amount of usability testing ("a morning a month") can bring significant benefits.
I'd like to see some actual independent, not-the-same-dude tests before I switch to this style.
A/B Testing case studies are good for two reasons:
a) Help create curiosity on what the heck this A/B thing is (good for my product)
b) Gives you ideas to test on your website
But it is very dangerous to implement results without testing for your case. Very, very dangerous.
I wish they'd stop writing "You should follow me on Twitter" as well. That's just about the most irritating, patronising thing I've seen in weeks. I am quite capable of choosing how to spend my time, and following the orders of someone whose web site I have just arrived at (probably looking for some actual content, which has been shoved further down the page by some annoyingly over-sized blue bird logo) is not high on my list of priorities. On the contrary, you could just relabel your entire page "Back" and make it clickable to simulate the effect it has on me.
Ah, I feel so much better now. :-)
People like following blindly.
Full disclosure: in my student days I used to work for a direct mail company, not as a programmer but as a general layout and graphics guy. They weren't mail spammers per se, they specialized in hospital funding drives, or did mail followups on people who were on prescription drugs, to remind them to keep renewing.
Anyway these letters were tested just as carefully as we do today (only an A/B test involved actually printing and mailing stuff). The "Yes! I want to know more about $FOO" format may seem cheesy but it was a proven winner.
You would think people might get tired of it, but they believed that their targets were instead "trained" to respond to these sorts of appeals. That's literally what I was told, when I suggested they try something different.
So, maybe we should be paying close attention to those crappy cards that fall out of magazines?
The idea that direct marketers have something to teach us also probably spreads because it's counter-intuitive: you mean all those icky ads I grew up with were clever?
Now I'm just waiting for patio11 to apply his godlike A/B testing powers on this and show us the results :-)
P.S. If you're using Rails, download A/Bingo. If you're not, email me and I will write you design documents sufficient to create an A/B testing framework in your language of choice.
It should not take godhood to do A/B tests. I want A/B tests to be like for loops: not only will you know them and use them automatically every single time, you will laugh with your buddies about that one time somebody applied to your company without ever having written an A/B test before.
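The heart of any such framework is small. Here's a sketch (my own illustration, not A/Bingo's actual API) of the one piece every implementation needs: deterministic bucketing, so a returning visitor always lands in the same variant without any stored state:

```python
# Sketch of deterministic A/B assignment: hash (experiment, user) to a
# stable bucket. Not A/Bingo's API -- just the core idea any framework shares.
import hashlib

def variant(experiment: str, user_id: str, variants=("A", "B")) -> str:
    """Return the same variant for the same experiment+user, every time."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Same user, same experiment -> same arm on every request.
print(variant("signup_copy", "user-42"))
print(variant("signup_copy", "user-42"))
```

Record a conversion event against the variant the user saw, count participants and conversions per arm, and you have the rest of the framework.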
Unfortunately that project doesn't look that active, but it's the best out there now.
Great idea, and great to find that it performs better. It probably helps alleviate the confusion some people feel when dealing with their computer - the same kind of people who have no idea how to respond to a yes/no or accept/decline dialog box, but can answer the same question promptly when posed verbally.
It's nice to see an actual implementation - and that it's not a conversion killer or anything.
But it would be nice to see a clearer example of it being the only change to a form.
Right now it looks as if it's only optional in the "after" form. And then it's quite obvious why it converts better.
It doesn't really say whether both forms' address fields were optional.
The golden rule is: the less you ask for, the higher the conversion. So if one of the changes they made was to make the address fields optional, then it's obvious why it converts better.
If both are optional then I would say the "before" version seems almost to be made bad on purpose.
On the OP example at Huffduffer one clicks a "sign up" button (which has the mouseover feedback you lack incidentally) to get to a page which is for sign up. Hence the affordance is not so important as they're already in the "I need to enter data"-pipeline.
Also FWIW I think messing with default button formats should be avoided unless you're going to make sure that it's clearly a button (affordance again).
These kinds of form improvements make the most sense for services the user is already familiar with (buying a car, getting an insurance quote) where there are multiple service providers to choose from, right?
Because, on a sign-up form for a new web service, I really don't want users to be tricked into signing up. I only want users who understand what my service offers (through the intro blurb, videos, etc.) to sign up, and if you've already decided to try it on the merits, why would the form design matter? If I lose a sign-up because of form design, my service is probably weak, no?
Even something as simple as asking for an email address in addition to username/password will lose you a percentage of signups. (As a real-world example, we tried removing all the fields from the signup page on Twiddla and saw a 1000% increase in users trying us out).
This form design seems to be a way of convincing people that they're not looking at a form, and therefore tricking them into filling it in to go do the thing they wanted to do in the first place.
Using a little creativity in the design, even for forms that users fill in only once, has been a great way for us to set ourselves apart.