
Ask HN: Why Does the Smiling People A/B Test Not Work for Us? - znq
We're developers at heart, and marketing doesn't come naturally to us. We've read multiple articles on the positive effects of placing smiling people on pages. So we decided to try this out.

Bugfender is available for several platforms, with our primary focus on iOS and Android. So we thought we'd A/B test our original platform pages against a version with a smiling person.

We ran that test at a 50/50 ratio for 10 weeks, and tracked the number of page views and the number of clicks through to the signup form from each version.

Both versions were served on the same URL, but for the purpose of this demonstration, we've separated them out, and you can view them here:

- https://bugfender.com/platforms/ios

- https://bugfender.com/platforms/ios-smiling

- https://bugfender.com/platforms/android

- https://bugfender.com/platforms/android-smiling

Based on what we'd read, we expected to see some sort of increase (however small) in sign-up clicks when coming from the smiling versions. However, the results we had were very different:

• Android smiling CTR: 17.49%

• Android normal CTR: 21.29%

• iOS smiling CTR: 21.74%

• iOS normal CTR: 31.43%

* Percentage of users who clicked through to sign up after viewing each platform page

Over the course of 10 weeks, the pages served with a smiling person always performed worse, averaging 5% lower for Android and 10% lower for iOS.

We understand that there is no special formula that works for everyone, but we didn't expect this, and we were wondering if anyone has any insights or ideas as to why this might be.

Could it have been the extended content? Or that the picture was too generic? Did it perhaps distract from the real data about the product?

These are some of our thoughts, but as marketing is not our forte, we really have no idea and this is just speculation.

We're always keen to try out A/B tests and experiments, so we'd love to hear any ideas or suggestions you have too.

Thanks for taking the time to read this :)
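[Editor's note: whether the reported CTR gaps are meaningful depends on the view counts, which the post doesn't give. One way to sanity-check is a two-proportion z-test; here is a minimal stdlib-only sketch, with the view counts (350 per variant) purely hypothetical.]

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1 = clicks_a / views_a
    p2 = clicks_b / views_b
    # Pooled proportion under the null hypothesis (no difference).
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p1 - p2) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts roughly matching the reported iOS CTRs:
# normal 110/350 (31.43%) vs smiling 76/350 (21.71%).
z, p = two_proportion_z_test(110, 350, 76, 350)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With counts on that order the iOS gap would be significant at the 5% level, but at much lower traffic it would not be; only the real view numbers can settle it.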
======
dirktheman
The smiling picture focuses on features ('Remote logging for iOS apps'), whereas
the normal page focuses on benefits ('Start Building Better iOS Apps Today').
Also, the layout/contrast is different.

To get meaningful results out of your A/B testing, the number of variables
should be as few as possible. So if your premise is 'smiling people
convert better than a picture of an iPhone', you should keep every other detail
the same.

------
oldcynic
My take:

Generic stock photo dude draws the eye -- enough to distract from reading the
copy as he stands out, _but doesn't add anything._ Why is he there? No idea,
but he's not helping me figure out your service.

Having a vague, unreadable screenshot with some colours carelessly slapped
over his image appears very amateurish to my eye. Thus I am actually put off by a
bug-finding application.

No point scrolling on, you're not telling me anything, it's just yet another
generic page with generic empty sales words "enterprise grade encryption" -
yawn (close page here).

So I wouldn't get below the first fold before leaving.

The non-smiling page _immediately_ presents me a screen and use-case that
looks like it might be useful to me. I'll spend a few seconds more and scroll
down to look around a little.

------
tylerhou
As a developer, being able to immediately see the value add from the example
picture above the fold on the non-smiling version is a really big sell. In one
glance I know what your software does and I know how I might need it.

With the smiling version, it's a lot less clear what your software does and
how I might use it. There's more text explaining the product, sure, but I
don't really want to spend the cognitive effort reading it. Even after I have
read it, I still don't really understand how your software solves my problem -
at least not nearly as well as when you give an example image.

------
hluska
I noticed some things and have some questions.

1.) What resolution/browser combination did you test? And what
resolution/browser combinations visited the site during your a/b test? On my
(old) iPhone, I don't see the smiling face at all. Instead, all I see is his
elbow.

2.) Your copy is significantly different between the smiling and non-smiling
options. A/B testing gets messy as you change more variables. I think the copy
on the non-smiling option is much stronger.

3.) The design is quite different between the two options. The texture on the
smiling version looks dirty and slightly pixelated.

4.) How big was your sample? One dirty secret of A/B testing is that you need
fairly significant levels of traffic before a test has any meaning.

Based on what I've seen, this isn't a very good study and I'd urge you not to
consider it actionable. Rather, I'd recommend further study. When you design
your next test, you should consider:

A.) What percentage of users can see the smiling face? A decent analytics tool
should give you all kinds of insight into resolutions, browsers and devices.
Remember, if you're sending 50% of your traffic to the smiling face, but only
50% of those people can see it, you're not testing what you think you are.

B.) Keep the copy and design elements identical. A/B testing gets messy as you
start changing more than one thing.

C.) Be careful with your sample size. If you only send a sample of 100 people
to each version, you can't draw many (if any) conclusions and expect them to
hold up under more traffic.
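[Editor's note: the per-arm traffic needed before a test "has any meaning" can be estimated up front with a standard power calculation. A stdlib-only sketch, where the baseline 21% CTR and the hoped-for 25% lift target are illustrative assumptions, not figures from the post:]

```python
import math

def required_sample_size(p_base, p_variant, alpha=0.05, power=0.8):
    """Approximate per-arm sample size for a two-proportion test."""
    def z_score(q):
        # Inverse normal CDF by bisection on erf.
        lo, hi = -10.0, 10.0
        for _ in range(100):
            mid = (lo + hi) / 2
            if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < q:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    z_a = z_score(1 - alpha / 2)   # significance threshold
    z_b = z_score(power)           # desired power
    var = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    n = (z_a + z_b) ** 2 * var / (p_base - p_variant) ** 2
    return math.ceil(n)

# To detect a lift from 21% to 25% CTR at 95% confidence / 80% power:
print(required_sample_size(0.21, 0.25))
```

Under those assumptions you'd need on the order of 1,700+ views per variant; smaller lifts require dramatically more traffic, which is exactly the "dirty secret" above.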

If you do a proper test and still get those results, you should look into the
people visiting your site and what their motivation is.

Do you have any evidence to suspect that the people who are visiting you are
currently trying to fix a bug? If so, is it possible the smiling face is too
much for stressed out developers?

------
claudiulodro
Your generic stock photo dude is not very appealing to developers. Try the
test with that superhero bugfender lady instead of the bro and I bet you'll
have better results.

------
saluki
Like others have said, try to keep your A/B tests the same except for the one
change you are testing.

If you haven't already, check out patio11's book.

[https://www.kalzumeus.com/2012/12/21/i-wrote-a-book-on-conversion-optimization-for-software-companies/](https://www.kalzumeus.com/2012/12/21/i-wrote-a-book-on-conversion-optimization-for-software-companies/)

It still has lots of great info.

His blog, podcasts and comments here on HN are worth checking out too.

------
adzeds
As one of the other comments has already touched on, you did not actually test
smiling faces vs non-smiling faces, as the two landing pages differ in more
than just one element.

Also, from my experience, your industry is better suited to different types of
tests, such as testing the value proposition and selling the benefits rather
than the features of your product.

Answer the visitors' need/desire and they will sign up.

~~~
adzeds
I run a Conversion Rate Optimisation Agency and would be happy to help you
design a new landing page to test as part of a free consultancy, in return for
a review/testimonial of our work.

Let me know if interested.

