
Optimizely’s decision to ditch its free plan - capocannoniere
https://venturebeat.com/2018/02/05/optimizelys-decision-to-ditch-its-free-plan-suggests-a-b-website-testing-is-dead/
======
syncerr
A/B/n testing is certainly not dead. It's more likely Optimizely failed to see
enough movement from their free plan into their paid tiers and in an effort to
streamline/focus/cost-cut they're dropping the free plan.

This is interesting because Optimizely was in part responsible for
commoditizing A/B testing in 2010/11. Previously, it was a thousands-of-
dollars-per-month product.

~~~
Throwaway211218
I think the reason Optimizely decided to ditch the free plan, self serve
plans, and the lower-level enterprise plans is that churn is high in those
segments.

Of course, the move upmarket may or may not improve the mechanics of churn in
the short-term, but incumbents get displaced from the bottom, not the top.
Optimizely may close bigger deals now, but those deals will become fewer and
farther between until the spigot shuts down. Because of the move away from the
low-end of the market, a generation of new marketers will not be using
Optimizely. My expectation: ARR growth at Optimizely will continue to slow,
churn will continue to be an issue, execs will soon depart. I wonder if the
board will push for a sale. It's a shame because, speaking from the
perspective of a churned customer, the reason we churned is that their
packaging is poor, their sales process is terrible, and their marketing-speak
has become unbearable. We didn't want to churn; there was just no good plan
for us.

------
dsiroker
When I started Optimizely with my co-founder, Pete Koomen, our goal was to
enable anyone in the world to do website A/B testing. I believed then, as I do
now, that we are democratizing the scientific method. That said, when a
company doesn’t consider experimentation a business critical process, they are
unlikely to dedicate the internal resources necessary to make it successful.
We experimented (ha!) with a free plan to see if it would be a good model for
getting customers to dip their toe into A/B testing. Turns out, the data
showed it wasn’t as effective as a robust free trial, which we continue to
offer for all products.

Also, A/B testing is far from dead. Companies that have a culture of
experimentation have grown their market cap by 756% in the last five years.
See our Opticon keynote for more context:
[https://www.optimizely.com/opticon/](https://www.optimizely.com/opticon/)

------
diiaann
We were using Optimizely until we didn't qualify for the lowest tier paid
plan. Not enough impressions?

We knew at our traffic tier, Optimizely experiment analysis was not going to
be statistically robust. We were going to have our data person do the analysis
with different techniques to deal with the smaller sample size. What we really
wanted was the interface for a non-technical person to create the variation
and Optimizely to manage which variation people saw.

Instead of letting us continue, they canceled our account. In the coming
months, when we do have the traffic, we are not going to consider them. Not
impressed.

~~~
paraschopra
You are welcome to use VWO. I can help you set up a trial account with us, and
I usually hang around on HN to answer any of your questions. We have a
comparison page here:
[https://vwo.com/comparison/optimizely/](https://vwo.com/comparison/optimizely/)

~~~
diiaann
We are using VWO now! :D

------
rexbee
It seems far from dead; there are thousands of sites (many enterprise) that
are currently using Optimizely's JS library:

[https://nerdydata.com/technology-reports/optimizely/78649239...](https://nerdydata.com/technology-reports/optimizely/78649239-9bbe-4383-a09c-f65e072e54d8)

------
hn_throwaway_99
I get it, the economics of the web force articles to have silly clickbait
titles like this one, but it's still annoying.

I thought the article actually did a good job of exposing the problems of
"naive" A/B testing, but the conclusions of the article do not say, at all,
that "A/B web testing is dead".

~~~
dang
Yes. We deleted the baity bit above and left what I assume is the factual bit.

------
lukestevens
Optimizely presumably has no incentive to compete with Google Optimize for
free users:
[https://www.google.com/analytics/optimize/](https://www.google.com/analytics/optimize/)

~~~
jarym
Google can afford to offer lots of stuff for free - their principal revenue
remains ads, and until that changes they'll keep offering a lot for free.

------
jbreckmckye
As I've been saying for several years, most website A/B testing in the wild is
statistically illiterate, expensive nonsense. Optimizely is right to cull
their MVT product.

~~~
user5994461
That's not related, though. Free customers are being ditched because they have
costs to support and provide no value to the company.

~~~
jbreckmckye
It's more than that. Optimizely fears that poorly delivered tests reflect
badly on their other products.

~~~
user5994461
By that line of reasoning, they should stop their product and close the
company.

No really, this is about money. Free customers don't pay the bills.

------
danielfoster
I'm curious whether this decision will have any negative downstream effects.
Optimizely doesn't need to be free, but an affordable, mainstream option seems
smart to get early-stage companies in the door. A wise competitor would take
advantage of this new gap in the market.

~~~
paraschopra
Early-stage companies do not need A/B testing. They need to set up their
funnel and growth channels properly. Only when companies have reasonably
stable metrics (CAC per channel, LTV, CTR) should they consider _optimizing_
those metrics.

Moreover, without proper staffing (a dedicated resource for A/B testing), the
effort will simply fizzle out. To get benefits out of A/B testing, a company
needs to invest in resources and process; without that investment, A/B testing
is hit-and-miss.

~~~
danielfoster
You're totally right, early stage companies do not need A/B testing and don't
have the resources to get it right. But that doesn't mean that many uninformed
entrepreneurs won't want to try it.

If the startup in question does get to the stage where real A/B testing is
necessary, the enterprise solution the founders are already familiar with will
probably have an edge in the selection process.

------
Yvonne_Kol
Author of the article here :)

I'm glad to hear it sparked your interest (btw, I didn't pick the headline).

I agree with what has been said about A/B testing so far - it is a great tool
for more mature companies, but I hope I also made my case for
personalization. After all, the "let's test that" mentality is not always
applicable. Startups and SMEs should not underestimate how good their gut
feeling is when it comes to optimizing their website for different audiences.
And yes, splitting your visitors into different segments might seem
counterintuitive when talking about testing, but I strongly believe it's the
way to go for companies who haven't achieved their desired results with A/B
testing.

Disclaimer: I work for Unless, a personalization service.

------
CptJamesCook
SplitOptimizer.com is a very cheap A/B testing tool. It's not as full-featured
as Optimizely, but it works perfectly.

------
noahmbarr
In addition to scrapping the free plan, they have also moved away from
displaying pricing altogether.

~~~
Maro
Maybe they ran an A/B test :)

------
baron816
So A/B testing isn’t dead...just doing it the wrong way is slowly dying,
maybe.

------
toomuchtodo
Are there any open source alternatives to Optimizely? If not, would an
experienced SWE comment on an estimated difficulty level to get an MVP
together?

~~~
danmaz74
Not a full alternative, but long ago I created this super-simple A/B testing
tool you can use on top of Google Analytics - no back-end required!

[https://github.com/danmaz74/ABalytics](https://github.com/danmaz74/ABalytics)
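To toomuchtodo's question above about MVP difficulty: the core of a minimal tool is surprisingly small. The main ingredient is deterministic variant assignment, so a returning user always sees the same variant without any server-side state. A sketch (not how ABalytics works specifically; the function and names are illustrative):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list) -> str:
    """Deterministically bucket a user into a variant.

    Hashing (experiment, user_id) means the same user always lands in
    the same bucket for a given experiment, with no state to store.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always gets the same variant for a given experiment.
v1 = assign_variant("user-42", "signup-button", ["control", "treatment"])
v2 = assign_variant("user-42", "signup-button", ["control", "treatment"])
assert v1 == v2
```

The hard parts of a real product are elsewhere: recording exposures and conversions, and doing the analysis correctly.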

~~~
gingerlime
You need to be VERY careful using Google Analytics as a backend. After a
certain threshold, they start sampling your data, and results will therefore
become highly inaccurate.

Disclaimer: I also created an A/B test open-source framework with Google
Analytics as (one potential) backend, but then realized it's sub-optimal. I
wrote about it on [0].

[0] [http://blog.gingerlime.com/2016/a-scaleable-ab-testing-backe...](http://blog.gingerlime.com/2016/a-scaleable-ab-testing-backend-in-100-lines-of-code-and-for-free/)

------
paraschopra
CEO of VWO here - a competitor to Optimizely.

The article's title "A/B testing is dead" is very misleading. A/B testing is
not a tool; it's a mental model for seeking learnings and improvements. Any
time you introduce a new change to your business, you're implicitly or
explicitly trying to understand its impact. Most of the time, setting up an
environment to tease out the real effects of a change is hard (experiments in
strategy are usually very difficult), so you're left guessing.

However, for mediums that allow easy tests (like websites and landing pages,
where you can set up scientific experiments with ease and have enough traffic
to get results in a short time period), why wouldn't you? The way I see it,
rejection of A/B testing is a rejection of the scientific method of conjecture
and improvement. In a highly contextual business environment with tens or
hundreds of conditional variables, best practices and "gut feeling" can only
go so far.

A/B testing for very small businesses (those attracted to free plans) doesn't
make a lot of sense because they usually don't have enough traffic to get
statistically significant results. Plus, optimization via A/B testing comes
only after you've gotten your basics right and it's time to scale. This is why
at VWO we never had a free plan.
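The traffic point can be made concrete with a standard sample-size estimate (normal approximation for a two-sided test of two proportions; the numbers and function are illustrative, not any vendor's methodology):

```python
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate visitors needed per variant to detect a change in
    conversion rate from p1 to p2 (two-sided z-test of proportions)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)       # critical value for significance level
    z_beta = z(power)                # critical value for desired power
    p_bar = (p1 + p2) / 2
    num = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(num / (p2 - p1) ** 2) + 1

# Detecting a 2% -> 2.4% lift (a 20% relative improvement) takes
# roughly 21,000 visitors per variant at 80% power.
print(sample_size_per_variant(0.02, 0.024))
```

A site with a few hundred visitors a month would need years to finish that one test, which is the practical reason free-plan users rarely get meaningful results.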

So Optimizely's decision to ditch its free plan makes sense.

~~~
jbreckmckye
> A/B testing is not a tool, it's mental model seeking learnings and
> improvements

No, it's a validation phase most businesses use to sign off product decisions
that are already made. They invest thousands in a new feature, run an A/B
test, and then wait until they see an ambiguous 'win' in favour of the new
feature. This is somewhat akin to flipping a coin until you get three heads in
a row, and declaring yourself winner of the game.
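The coin-flip analogy can be made concrete with a quick simulation (illustrative only, stdlib Python; nothing here is from Optimizely's or VWO's products): run A/A tests where both arms are identical and declare a win the moment a z-test looks "significant".

```python
import random
from statistics import NormalDist

def aa_test_with_peeking(n_peeks=20, batch=300, alpha=0.05, rng=None):
    """Simulate one A/A test: both arms have an identical 5% conversion
    rate, but we 'peek' after every batch and stop as soon as the
    z-test p-value dips below alpha."""
    rng = rng or random.Random()
    a_conv = b_conv = n = 0
    for _ in range(n_peeks):
        a_conv += sum(rng.random() < 0.05 for _ in range(batch))
        b_conv += sum(rng.random() < 0.05 for _ in range(batch))
        n += batch
        pooled = (a_conv + b_conv) / (2 * n)
        se = (2 * pooled * (1 - pooled) / n) ** 0.5
        if se > 0:
            z = abs(a_conv / n - b_conv / n) / se
            if 2 * (1 - NormalDist().cdf(z)) < alpha:
                return True  # a "significant" win -- a false positive
    return False

rng = random.Random(0)
rate = sum(aa_test_with_peeking(rng=rng) for _ in range(200)) / 200
# A fixed-horizon test would be wrong ~5% of the time; peeking after
# every batch pushes the false-positive rate well above that.
print(rate)
```

There is no real difference to find, yet "run until you see a win" finds one far more often than the nominal 5%, which is exactly the three-heads-in-a-row game described above.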

You can denounce this as A/B testing gone wrong, but it's a reality of the web
industry that most testing is badly motivated and statistically illiterate.
The mentality I describe isn't limited to SMEs and tinpot organisations,
either. I've seen it espoused by highly respected product managers at
prestigious organisations.

Where does this epidemic of pseudoempiricism come from? Well, if I had to
guess, I'd point to the marketing of companies like your own.

Take a look at [https://vwo.com/ab-testing/](https://vwo.com/ab-testing/) -
"AB testing - The Complete Guide". There's not one mention of statistical
significance, in word or spirit. There's no talk of setting a timeline in
advance of the test. There are no hard numbers to set a context for what's an
acceptable sample size - in fact, the guide is specifically aimed at small
business owners.

So if anyone is undermining trust in the "scientific process", it's not
Optimizely, but in a way - you.

~~~
paraschopra
> There's no talk of setting a timeline in advance of the test. There are no
> hard numbers to set a context for what's an acceptable sample size - in
> fact, the guide is specifically aimed at small business owners.

You should take a trial of our product. Before setting up a test in VWO, we
ensure you understand that the test will have to run until we have
statistically significant results. We've moved away from frequentist
methodologies (where peeking at results was an issue) to Bayesian ones. Check
out what we do here:
[https://vwo.com/blog/smartstats-testing-for-truth/](https://vwo.com/blog/smartstats-testing-for-truth/)

The guide is aimed at people who are just starting out, but I agree with you:
we should have pointed them to the importance of getting statistically correct
results.

