

Optimization at the Obama campaign: a/b testing - kylerush
http://kylerush.net/blog/optimization-at-the-obama-campaign-ab-testing/

======
patio11
_About halfway through the campaign we figured out that of all variables that
affect user behavior (design, usability, imagery, etc.), copy has the highest
ROI_

I thought I'd highlight this because it is true, apparently not obvious since
everyone discovers it for themselves, and _wildly actionable_ by many people
on HN.

~~~
ktr
Any idea or theory why that is? E.g., after reading sites like
<http://www.useit.com/> you'd think the only thing that people do is scan (not
read).

~~~
LanceJones
Scanning and reading are not mutually exclusive. In my opinion (and as someone
who leads testing at Adobe), scanning is a style of reading that has become
more commonplace with the Web -- a medium that is extremely copy-heavy (a good
thing!). People who understand how to write copy know how to get their copy
read. Generalizations like "nobody reads" serve to keep everyone down EXCEPT
for the people who know -- based on data -- that such statements are
erroneous.

------
pdog
_> Overall we executed about 500 a/b tests on our web pages in a 20 month
period which increased donation conversions by 49% and sign up conversions by
161%._

How did they measure this? Did they isolate the effects of the a/b testing?
Donation and sign-up conversion rates will increase anyway as the day of the
election approaches.

~~~
karolisd
You're always testing against a control, so you always have something to
compare to.

If they really wanted to, they could've kept a small amount of traffic going
to the original control from before they started testing.
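
Roughly, the comparison looks something like this (not from the post -- the
numbers and helper names are invented, just to illustrate the idea): both arms
run over the same dates, so "election day is getting closer" pushes control
and variant up equally, and only the difference between them is credited to
the change being tested.

```typescript
// Rough sketch: compare a variant's conversion rate against a concurrent
// control with a two-proportion z-test. All numbers below are invented.

interface Arm {
  visitors: number;
  conversions: number;
}

// Relative lift of the variant over the control, plus a rough significance
// check (|z| > 1.96 is roughly 95% confidence).
function compare(control: Arm, variant: Arm): { lift: number; significant: boolean } {
  const p1 = control.conversions / control.visitors;
  const p2 = variant.conversions / variant.visitors;
  const pooled =
    (control.conversions + variant.conversions) /
    (control.visitors + variant.visitors);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / variant.visitors)
  );
  const z = (p2 - p1) / se;
  return { lift: (p2 - p1) / p1, significant: Math.abs(z) > 1.96 };
}

console.log(compare(
  { visitors: 20000, conversions: 500 },  // control: 2.5% conversion
  { visitors: 20000, conversions: 610 }   // variant: 3.05% conversion
));
// -> lift ≈ 0.22 (a 22% relative improvement), significant: true
```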

------
edwinnathaniel
HN-ers, I'm new to this whole A/B Testing, Copywriting, "testing your
hypothesis", and "trying to sell something" + "MVP" (or Lean Startup) thing.

Yesterday I saw a link to Intuit's methodology, which also uses similar A/B
testing to decide what to build:
[http://www.forbes.com/sites/bruceupbin/2012/09/04/intuit-
the...](http://www.forbes.com/sites/bruceupbin/2012/09/04/intuit-the-30-year-
old-startup/)

Could someone connect the dots for me, or point me somewhere I can make sense
of the whole thing?

How can one come up with so many pages to test so quickly? And how does one
decide what the content of each test page should be?

~~~
mikkel
There are a few JavaScript-based solutions for testing:

* Optimizely (a YC company with a WYSIWYG editor for specifying variants)

* Xander.io (targeted at developers, with HTML5 markup to specify variants - full disclosure: I help with this project)

* Visual Website Optimizer

There are also server-side solutions depending on what framework you are
working within.

On getting started:

You need a goal for your pages (or collection of pages). It could be you want
users to sign up for a mailing list, or donate to a cause. As long as the goal
is clear, you will have a direction to work your hypotheses towards.

Once you have the goal, coming up with pages and content is the fun and
difficult part.

I like to think of pages as a combination of sections, with each section
having a bunch of variants. We wrote a simple example with xander.io -
<http://www.xander.io/tutorials.html> - and there are a few on each of the
services I linked above.
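
Very roughly, the model looks something like this (hypothetical names only --
the `data-section` markup and the `/track` endpoint are made up, and this is
not the actual API of xander.io or any of the tools above): each visitor is
bucketed deterministically per section, and conversions are reported together
with the variants that were shown.

```typescript
// Rough sketch of the "page = sections, each section has variants" model.
// Hypothetical names throughout; not the API of any tool listed above.

type Section = { id: string; variants: string[] };

const sections: Section[] = [
  { id: "headline", variants: ["Donate now", "Chip in to help"] },
  { id: "cta-color", variants: ["red", "blue"] },
];

// Deterministic bucketing: the same visitor sees the same variant on every
// visit, independently for each section.
function pickVariant(visitorId: string, section: Section): string {
  let hash = 0;
  for (const ch of visitorId + section.id) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return section.variants[hash % section.variants.length];
}

// Identify the visitor with a cookie so assignments stay stable.
const visitorId =
  document.cookie.match(/vid=([^;]+)/)?.[1] ??
  Math.random().toString(36).slice(2);
document.cookie = `vid=${visitorId}; path=/; max-age=31536000`;

// Tag each section's chosen variant on the page; CSS/JS then renders the
// matching content for elements marked with data-section="...".
for (const s of sections) {
  const variant = pickVariant(visitorId, s);
  document
    .querySelectorAll<HTMLElement>(`[data-section="${s.id}"]`)
    .forEach((el) => (el.dataset.variant = variant));
}

// When the goal is reached (sign-up, donation, ...), report it along with the
// variants that were shown so each section's effect can be analyzed.
// e.g. form.addEventListener("submit", () => reportConversion("donation"));
function reportConversion(goal: string): void {
  const shown = sections
    .map((s) => `${s.id}=${encodeURIComponent(pickVariant(visitorId, s))}`)
    .join("&");
  navigator.sendBeacon(`/track?goal=${encodeURIComponent(goal)}&${shown}`);
}
```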

It's definitely worth figuring out, let me know if I can help!

------
kevinconroy
Great write-up. For those interested, I've found similar results over five
years of A/B testing on a non-profit fundraising site (www.globalgiving.org).

Pro tip: don't just copy what Obama did. Your mileage will vary based on your
user base and context. Take their learnings and A/B test them on your own site
to see whether they work for you. Not all optimizations are universal.

~~~
irahul
> Not all optimizations are universal.

One example off the top of my head would be Wikipedia's "personal appeal from
Jimmy Wales" campaign. That photo made me uncomfortable, and I would have
liked a more panoramic shot with the subject in focus, like the one used for
"Dinner with Obama". But Jimmy Wales responded to someone (I can't find the
thread; it was on reddit, I think) that it converted better than the other
alternatives.

That said, starting with some known best practice (e.g. for long forms, make
them multi-step) and then measuring the alternatives goes a long way.

~~~
patio11
Wikimedia published all the numbers from that campaign. The "personal appeal
from Jimmy" creative absolutely ROFLstomped everything else they came up with.
10X improvements, literally.

Here's the raw data.

[http://meta.wikimedia.org/wiki/Fundraising_2010/Banner_testi...](http://meta.wikimedia.org/wiki/Fundraising_2010/Banner_testing)

------
justjimmy
The biggest surprise was the 5% increase from breaking up the steps - it seems
small compared to the copy and imagery changes (a massive 19% and 21%).

That's kinda insane! Love the report, thanks.

~~~
karolisd
A lot of conversion optimization is finding the balance between the number of
steps and the complexity of each step.

------
irahul
I have mixed feelings about "Now save your payment information". "Save your
payment info for next time" makes it clear that it is optional, whereas the
former makes it look like part of the process. It might increase conversion
(and did, per the post), but I get the feeling it somehow cons the user into
signing up.

~~~
karolisd
That's one issue with a/b testing: it can lead to behavior that's entirely
data-driven, which in turn can lead to "tricky" designs being implemented.
Sometimes someone should just be able to make a call.

~~~
mistercow
Well, a call certainly had to be made about what they were going to test. The
trick there is that it's easier to get people to agree to "try a technique
out" even if it is unscrupulous, and that means a tricksy design has a better
chance of getting its foot in the door.

------
MattSayar
I wish somebody would do A/B testing with internal company applications.

~~~
thematt
The big difference is that most employees are _forced_ to use internal company
applications, so user adoption/conversion is not as big a concern.

------
new_test
Wired coverage: <http://www.wired.com/business/2012/04/ff_abtesting/>

------
latchkey
It is interesting to see how the Romney campaign copied from the Obama
campaign.

And on the flip side, what was copied from the Romney campaign?

------
chrisringrose
Awesome. Every business could learn from this. I know Facebook does A/B
testing.

------
danso
Why is it that there are no Romney campaign write-ups? Is their tech community
that much less open? Yes, I know they lost, but, technically, they beat out
many other competitors in the Republican primaries and so must have exercised
some kind of technical prowess.

(And yes, I'm aware of their fail-whale Orca system... that would also make
for a fun write-up.)

~~~
danglazer
There actually was a write-up yesterday:
[http://www.targetedvictory.com/2012/12/success-of-the-
romney...](http://www.targetedvictory.com/2012/12/success-of-the-romney-
republican-digital-efforts-2012/)

Funny thing is, they were using Optimizely as well.

Disclaimer: I work at Optimizely.

~~~
danso
I knew someone was going to point out my confirmation bias... thanks!

------
ryangripp
Clearly, Kyle had no experience with eCommerce before this project, as is
evident in his write-up.

However, props to him for all the hard work.

~~~
irahul
> Clearly, Kyle had no experience with eCommerce before this project, as is
> evident in his write-up. However, props to him for all the hard work.

I don't know if it was your intention, but you sound incredibly vain and
arrogant. "Clearly, Kyle had no experience" implies this is so trivial that
it's hardly worth talking about. And to top it off, "However, props to
him..." is even more insulting.

