
How we reduced our cancellation rate by 87.5% - kareemm
http://blog.reemer.com/how-we-reduced-our-cancellation-rate-by-87-5
======
patio11
Seriously, one of the best and most actionable articles you'll read this week.
(n.b. This sort of thing _prints money_ for companies at pretty much all
sizes. Well, after you've got enough customers to worry about cancellations.)

I'll probably write something about this eventually. There are a lot of
generalizable tactics which repeatably work well. (Email engagement is
probably the highest bang for the buck, considering that you can implement it
in about an afternoon and, coming from the starting point "We send no email",
it will virtually immediately produce visible results.)

~~~
cperciva
On the topic of emails and account cancellations: When I started sending out
"Your Tarsnap account will be deleted soon" emails, the account attrition rate
dropped by 50%. The change wasn't from people who didn't realize their account
was going to be deleted -- they had received two emails already, 1 week and 2
weeks earlier -- but getting that last prompt seems to have shifted people's
default action from "do nothing" to "pay some money to make the account stay
around".

Some time later, I cut Tarsnap's account attrition rate by another ~50% by
adding a line to the "account will be deleted soon" email: _"If you've decided
to stop using Tarsnap, I'd love to know why."_ This wasn't deliberate -- I
added it for the simple reason that I really do want to get that information
-- but it seems to be causing people to stop and say "hmm, I can't think of
any good reason to not use Tarsnap, so maybe I should keep using it after
all".

~~~
enjo
Regarding that last bit:

I wonder if they feel a bit of social pressure to stay. Basically when you say
" _I'd_ love to know why" all of a sudden there is an actual person who would
be clearly harmed by their decision to leave.

~~~
polynomial
Would like to see some A/B testing with "I'd love to know why" vs. "We'd love
to know why."

~~~
cperciva
Fortunately, the number of people who let their Tarsnap accounts expire is
small enough that unless there was a huge signal it would take a very long
time to get a statistically significant result.
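To get a rough sense of the timescale, a standard two-proportion power calculation gives the sample size needed per variant. A minimal sketch in Python, with all numbers invented (they are not Tarsnap's):

```python
from math import sqrt

def sample_size_per_arm(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate n per arm to detect p1 vs p2 at 95% confidence and
    80% power (standard two-proportion normal approximation)."""
    p_bar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p1 - p2) ** 2

# Hypothetical: 50% vs. 40% of expiring accounts actually cancel.
n = sample_size_per_arm(0.50, 0.40)
print(round(n))  # hundreds of accounts per variant
# At, say, 10 expiring accounts a month split across two variants,
# filling both arms takes (2 * n) / 10 months -- i.e. years.
```

Even a 10-point absolute difference needs hundreds of expiring accounts per variant, which is exactly the "very long time" problem at low volumes.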

------
ThomPete
Great analysis and I really want to believe in this, but I am a sceptic, and
this:

 _Interested in learning how a cohort analysis can help your business grow?
Get in touch – I work with select clients to help identify growth and
retention opportunities, and build features to realize those opportunities._

Kind of killed it for me. Now I am not sure whether I should trust the
results.

~~~
patio11
Do you routinely distrust code written by people who take money for coding, on
the grounds that it might have been written with an eye to securing work?

~~~
ThomPete
I am not sure what that comparison is supposed to prove? Care to elaborate?

I have no doubt kareemm is fantastic at his job. I have simply seen too many
cases where a case study is used to create sales. Again, nothing wrong with
that. It just makes me sceptical when I see fantastic results combined with a
sales message.

That's just me.

~~~
daemon13
Except that we probably don't have many gym owners here :-)

------
ehsanu1
If the average time to cancel is 61 days, and they waited only 2 months since
the changes to calculate the new cancellation rates, it stands to reason that
the cancellation rate will rise over the next few months, right?

Assuming a normal distribution (probably not that accurate, but it's just a
guess), the final cancellation rate would be about double that measured until
now.
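A minimal sketch of that estimate in Python (the sigma and the eventual rate are pure guesses): with a symmetric distribution centred at day 61, a ~60-day window only sees about half of the eventual cancellations, so the final rate comes out at roughly double the measured one.

```python
from statistics import NormalDist

# Assume time-to-cancel ~ Normal(mean=61 days, sigma=20); sigma is a guess.
cancel_times = NormalDist(mu=61, sigma=20)

observed_fraction = cancel_times.cdf(60)  # cancels visible within ~60 days
print(observed_fraction)                  # just under 0.5

measured_rate = 0.04 * observed_fraction  # 0.04 eventual rate, hypothetical
print(0.04 / measured_rate)               # final rate is ~2x the measured one
```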

Am I missing something here?

~~~
jamiequint
The key here is that it is a cohort analysis.

So while your analysis is correct and the cancellation rate will likely rise
over time, the key thing to check when comparing a pre-change cohort to a
post-change cohort is whether the cancellation rates at two weeks post-signup
are significantly different. Obviously it's ideal to run the tests in parallel
to reduce the risk of selection bias. That isn't always feasible, though, and
with a result like an 85%+ improvement it isn't really necessary in order to
conclude that the control was beaten.
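As a toy illustration of the fixed-age comparison (all data invented), you measure both cohorts at the same account age instead of over different calendar windows:

```python
from datetime import date

CHANGE_DATE = date(2012, 5, 1)  # hypothetical date the changes shipped

# (signup_date, days_until_cancel or None if still active) -- made-up data
customers = [
    (date(2012, 3, 10), 9), (date(2012, 3, 22), None),
    (date(2012, 4, 5), 12), (date(2012, 4, 18), 40),
    (date(2012, 5, 3), None), (date(2012, 5, 15), 30),
    (date(2012, 5, 20), None), (date(2012, 6, 2), None),
]

def cancel_rate_by_day(cohort, day=14):
    """Fraction of a cohort that cancelled within `day` days of signup."""
    cancelled = sum(1 for _, d in cohort if d is not None and d <= day)
    return cancelled / len(cohort)

pre = [c for c in customers if c[0] < CHANGE_DATE]
post = [c for c in customers if c[0] >= CHANGE_DATE]
print(cancel_rate_by_day(pre), cancel_rate_by_day(post))  # 0.5 vs. 0.0
```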

~~~
jmilloy
I don't really understand. Isn't the "cohort analysis" part of it how the data
about the gyms that cancelled was extracted? That shouldn't affect how you
compare the results once you've made changes. I'm also not sure I understand
what you mean by "comparing a pre-change cohort to a post-change cohort"
because we're not comparing the cohorts, we're comparing the cancellation rate
among the whole list of customers.

~~~
jamiequint
Ah, the article is actually extremely unclear. What I took his conclusion to
mean was that some cohort of gyms that signed up post-change had lower
cancellation rates than pre-change customers. If the analysis just looked at
the overall rate across all customers, it is prone to all sorts of errors
(including the one you mention).

~~~
ehsanu1
My impression was that the pre-change cancellation rate was based on all
customers over the lifetime of the product (they did calculate the 61 day
average from this), whereas the post-change was only for the new customers in
the last two months (the max it could be, and the only thing it makes sense to
measure).

------
jamiequint
I'm interested in why you found Mixpanel hard to use. It satisfies every
requirement you described out of the box and only takes minutes to set up.
Unless you wanted to process a ton of past data, it probably would have been
easier for you to skip all the manual data entry Excel requires.

~~~
kareemm
I haven't used Mixpanel, only Kissmetrics. I wanted to run the analysis on all
of our data and while that's doable in Kissmetrics, Excel is faster (desktop
vs latency of web app requests), more flexible (I can do what I want vs being
limited to what KM lets me do), and likely easier to get data in (import CSV
vs ...?)

~~~
StavrosK
I had the exact same problem as you, so I'm writing
<http://www.instahero.com>. It will offer everything kissmetrics etc offer,
but will give you the ability to easily program your own reports, without
having to download your data or anything like that.

Think of it like kissmetrics having an "edit the code that generates this
report" option. You can leave your email or drop me a line if you want to try
out the beta.

~~~
fab1an
I would strongly suggest you drop the "websites not using us yet" line from
your landing page. Showing their logos isn't a trick to increase conversions,
but borderline dodgy -- and people _will_ notice.

~~~
StavrosK
Ah, sorry, that was just a joke, the logos came with the theme. I'll remove
them asap, thanks.

------
bdunn
These are the sort of posts that keep me coming back to HN. Great analysis,
and as a former Crossfitter - awesome product idea!

~~~
kareemm
thanks!

------
edhallen
Great post on the usefulness of cohort analysis (or of experimentation more
broadly, which if you think about it, is exactly what cohort analysis is - it
just uses the past as a control).

One thought on ways to analyze the follow-on problem of customers canceling
after 61 days (a problem similar to what I've seen at every web company I've
ever worked at).

First, perform the same cohort analysis you’ve already done, but look at the
cancelling customers vs retained customers at day 1, day 15, day 30 and day
45, then use this analysis to figure out your triggers (things like # of
Facebook posts needed by day 15, % of profiles claimed by day 30, etc).
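A rough sketch of that trigger analysis on invented data: compare an engagement metric for cancelled vs. retained customers at each milestone, and look for milestones where the gap is wide enough to use as a trigger threshold.

```python
# Each record: did the gym cancel, and cumulative Facebook posts by each
# milestone day (all numbers made up for illustration).
customers = [
    {"cancelled": True,  "fb_posts": {15: 1, 30: 2,  45: 2}},
    {"cancelled": True,  "fb_posts": {15: 0, 30: 1,  45: 3}},
    {"cancelled": False, "fb_posts": {15: 6, 30: 14, 45: 20}},
    {"cancelled": False, "fb_posts": {15: 4, 30: 9,  45: 16}},
]

def avg(xs):
    return sum(xs) / len(xs)

for day in (15, 30, 45):
    gone = avg([c["fb_posts"][day] for c in customers if c["cancelled"]])
    kept = avg([c["fb_posts"][day] for c in customers if not c["cancelled"]])
    print(day, gone, kept)  # a wide gap marks a candidate trigger
```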

Once you have your triggers, you can make proactively calling / emailing
problematic customers a key part of your daily routine. While discounts might
still be the way to go, this trigger-based approach is one I've seen work
well. Additionally, because you are in touch with problematic customers, it
often gives you insight into what to do next.

------
DanielRibeiro
Related great post written by Shopify guys a while ago:
[http://www.shopify.com/technology/4018382-defining-churn-
rat...](http://www.shopify.com/technology/4018382-defining-churn-rate-no-
really-this-actually-requires-an-entire-blog-post)

------
Smerity
Congratulations on the impressive result. It seems you have a good product but
the real change appears to be encouraging those who wouldn't use your system
properly to improve their habits.

I wish you could work out more concretely why the situation improved, but
with three substantial improvements (that likely impact different customers in
different ways) that's difficult. I could imagine "drop[ping] prices by
15-60%" would help those not using your product fully, for example: even if
they don't use all the features, they don't feel like they're overpaying.

~~~
kareemm
Our hypothesis is that the biggest reason for the change was price. If it's a
good use of my time, I'll run another analysis in a couple months to see if I
can isolate the reasons for change.

------
Angostura
Interesting article, two issues.

First, you changed two variables simultaneously, so it's tricky to tell how
much either contributed. The cynic in me suggests that the article _could_
actually be summarised as 'we cut our prices'.

Second, you write:

> So we improved our onboarding to help a gym owner export a CSV of their
> members’ email addresses to send to us.

Certainly in Europe, that could fall foul of data protection legislation;
you'll need to make sure that the customer has given permission for their data
to be passed to third parties.

------
janesvilleseo
Great article. I went to check out your site socialwod.com and was unable to
scroll on my iPhone. You may want to check your analytics to see if this is
affecting a large percentage of your visitors. Keep up the great work!

~~~
kareemm
Thanks for the heads up - will take a look.

------
1123581321
Which items from your analysis made the most difference?

~~~
kareemm
Not sure - will run another analysis in a couple months if it's important
enough to figure out.

------
philip1209
Cool. I've read about JBara, which provides a CRM plugin that aims to predict
when people are going to cancel and give you a chance to retain them.

------
RobPfeifer
I think the real takeaway here is: "Email is a better marketing channel than
Facebook"

------
mredbord
This is a good article. I have a few issues with price-reduction for existing
customers, though, that I want to highlight. I think it's fantastic that OP
got the desired results on customer churn, which was his goal - but I'd
categorize pricing changes as "gray hat" retention improvement with regard to
the overall health of his business and future revenue. Here's why:

On price specifically: In a subscription business like this one, you have to
meet a minimum utility requirement in each month that a customer is able to
cancel if you want to retain customers. Each customer's minimum utility is
different and could even be comprised of different factors/features depending
on the breadth of the product offering. But there is one factor that cuts
across all of them: price. A significant element of churn is price because the
initial purchase thrill may decrease over time and result in customer
cancellation requests at a certain point in their lifecycle. So, cutting price
is kind of an easy way to reduce churn in a subscription model...particularly
because people bought in at X and are now paying fractional X. Boom - happy
customers. Also, price-cutting is habit-forming, and the customers who
received a reduced price will come back asking for more reductions in time.

On features: Multiple times, I've seen "get more people using our product"
work well as a way to reduce churn. I won't comment on permission-based
customer marketing and whether or not what OP did was legal, but the results
of a feature like this are great, and it seems like he added more than just
this one. He added improved functionality and invested in his product at
reduced prices - great deal!

On onboarding & cohort analysis: OP was right to focus on onboarding features
and adoption to improve stickiness among new cohorts. He would have also been
smart to raise the price for new customers if he materially improved the
product (which it sounds like he did). Over the same time period, he could
have had newer cohorts of higher paying customers, making the older ones less
important to the financials of the business. By "hiring" higher-priced
customers into increasingly recent cohorts and continually "firing" older,
lower-priced customers, the balance of his revenue would have shifted to these
newer, more valuable customers over time, making the older-less-happy
customers less important to his business. That's how you really turn the crank
on a subscription business, and if your onboarding is good enough to
continually improve retention in new cohorts, you've really nailed it.

Overall, I don't mean to be overly critical of OP's choices. I aim to
highlight where optimizing for customer churn alone can harm the financials of
the business, particularly around price-cutting for existing customers. He's
doing lots and lots right with his cohort analyses, onboarding improvements,
and assumptions about churn impact of new features. However, we're in business
to make money, so these have to be balanced with the health of the business
itself.

