Optimizely’s decision to ditch its free plan (venturebeat.com)
51 points by capocannoniere 6 months ago | 43 comments



A/B/n testing is certainly not dead. It's more likely that Optimizely failed to see enough movement from its free plan into the paid tiers, and in an effort to streamline/focus/cost-cut, it's dropping the free plan.

This is interesting because Optimizely was in part responsible for commoditizing A/B testing in 2010/11. Previously, it was a thousands-of-dollars-per-month product.


I think the reason Optimizely decided to ditch the free plan, self serve plans, and the lower-level enterprise plans is that churn is high in those segments.

Of course, the move upmarket may or may not improve the mechanics of churn in the short term, but incumbents get displaced from the bottom, not the top. Optimizely may close bigger deals now, but those deals will become fewer and farther between until the spigot shuts off. Because of the move away from the low end of the market, a generation of new marketers will never use Optimizely. My expectation: ARR growth at Optimizely will continue to slow, churn will continue to be an issue, and execs will soon depart. I wonder if the board will push for a sale. It's a shame, because speaking as a churned customer, the reason we churned is that their packaging is crappy, their sales process is terrible, and their marketing-speak has become unbearable. We didn't want to churn; there was just no good plan for us.


When I started Optimizely with my co-founder, Pete Koomen, our goal was to enable anyone in the world to do website A/B testing. I believed then, as I do now, that we are democratizing the scientific method. That said, when a company doesn't consider experimentation a business-critical process, they are unlikely to dedicate the internal resources necessary to make it successful. We experimented (ha!) with a free plan to see if it would be a good model for getting customers to dip their toes into A/B testing. Turns out, the data showed it wasn't as effective as a robust free trial, which we continue to offer for all products.

Also, A/B testing is far from dead. Companies that have a culture of experimentation have grown their market cap by 756% in the last five years. See our Opticon keynote for more context: https://www.optimizely.com/opticon/


We were using Optimizely until we no longer qualified for the lowest-tier paid plan. Not enough impressions, apparently.

We knew that at our traffic tier, Optimizely's experiment analysis was not going to be statistically robust. We were going to have our data person do the analysis with different techniques to deal with the smaller sample size. What we really wanted was an interface for a non-technical person to create the variations, with Optimizely managing which variation each visitor saw.
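For what it's worth, a standard small-sample technique here is Fisher's exact test rather than the usual z-test. A sketch with made-up counts, using scipy:

    from scipy.stats import fisher_exact

    # made-up counts: [converted, did not convert] for each variation
    table = [[18, 482],   # control:   18 conversions out of 500 visitors
             [29, 471]]   # variation: 29 conversions out of 500 visitors

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio {odds_ratio:.2f}, p = {p_value:.3f}")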

Instead of letting us continue, our account was canceled. In the coming months, when we do have the traffic, we are not going to consider them. Not impressed.


We were in a similar spot. We were on their (then) silver plan, but by the time we were ready to upgrade to gold, there was no gold. And no silver lining (pun intended). Only Enterprise. (see [0] for more details)

That was the reason I ended up creating Alephbet[1], an open-source JS A/B testing framework you can use with lots of different backends, and later Gimel[2], a backend that runs on AWS Lambda and Redis.

For developers, I think Alephbet and Gimel are a great free and open-source alternative to commercial tools. We've been running them in production for a couple of years now and our monthly AWS bills are $peanuts/month. They're NOT for marketers or non-technical users though.

[0] http://blog.gingerlime.com/2015/alephbet-javascript-ab-test-...

[1] https://github.com/Alephbet/alephbet

[2] https://github.com/Alephbet/gimel

EDIT: formatting, typos


You are welcome to use VWO. I can help you set up a trial account with us, and I usually hang around on HN to answer any of your questions. We have a comparison page here: https://vwo.com/comparison/optimizely/


We are using VWO now! :D


These guys are the real deal.

I was disappointed that YCombinator funded Optimizely, given that it was expressly copying the idea of a small, single-founder company whose founder was sharing his business and experience on these boards. As an observer, admittedly without knowledge of who made that funding decision and why, it still seems the most unethical thing YC has ever done.


It seems far from dead; thousands of sites (many of them enterprise) are currently using Optimizely's JS library:

https://nerdydata.com/technology-reports/optimizely/78649239...


I get it, the economics of the web force articles to have silly clickbait titles like this one, but it's still annoying.

I thought the article actually did a good job of exposing the problems of "naive" A/B testing, but the conclusions of the article do not say, at all, that "A/B web testing is dead".


Yes. We deleted the baity bit above and left what I assume is the factual bit.


Optimizely presumably has no incentive to compete with Google Optimize for free users: https://www.google.com/analytics/optimize/ .


Google can afford to offer lots of stuff for free - their principal revenue remains ads, and until that needs to change, they'll keep offering a lot for free.


This should be higher. Optimize will eat the low end of the market just like Google Analytics did.


I'm curious whether this decision will have any negative downstream effects. Optimizely doesn't need to be free, but an affordable, mainstream option seems like a smart way to get early-stage companies in the door. A wise competitor would take advantage of this new gap in the market.


Early-stage companies do not need A/B testing. They need to set up their funnel and growth channels properly. Only when companies have reasonably stable metrics (CAC per channel, LTV, CTR) should they consider optimizing them.

Moreover, without proper staffing (a dedicated resource for A/B testing), the effort will simply fizzle out. To get benefits out of A/B testing, a company needs to invest (in resources and process), and without that investment, A/B testing is a hit-and-miss kind of thing.


You're totally right: early-stage companies do not need A/B testing and don't have the resources to get it right. But that doesn't mean many uninformed entrepreneurs won't want to try it.

If the startup in question does get to the stage where real A/B testing is necessary, the enterprise solution the founders are already familiar with will probably have an edge in the selection process.


Author of the article here :)

I'm glad to hear it sparked your interest (btw, I didn't pick the headline).

I agree with what has been said about A/B testing so far - it is a great tool for more mature companies - but I hope I also made my case for personalization. After all, the "let's test that" mentality is not always applicable. Startups and SMEs should not underestimate how good their gut feeling is when it comes to optimizing their website for different audiences. And yes, splitting your visitors into different segments might seem counterintuitive when talking about testing, but I strongly believe it's the way to go for companies that haven't achieved their desired results with A/B testing.

Disclaimer: I work for Unless, a personalization service.


As I've been saying for several years, most website A/B testing in the wild is statistically illiterate, expensive nonsense. Optimizely is right to cull their MVT product.


That's not related though. Free customers are being ditched because they have costs to support and provide no value to the company.


It's more than that. Optimizely fears that poorly performing tests reflect badly on its other products.


By that line of reasoning, they should stop their product and close the company.

No really, this is about money. Free customers don't pay the bills.


SplitOptimizer.com is a very cheap A/B testing tool. It's not as full-featured as Optimizely, but it works perfectly.


In addition to scrapping the free plan, they have also moved away from displaying pricing altogether.


Maybe they ran an A/B test :)


So A/B testing isn’t dead...just doing it the wrong way is slowly dying, maybe.


Are there any open source alternatives to Optimizely? If not, would an experienced SWE comment on an estimated difficulty level to get an MVP together?


For A/B tests implemented in backend code, I just use rand() to bucket a user, then store that bucket in redis.

After it's been live for a while, I pull the data from redis then use this open-source calculator to view the results: https://thumbtack.github.io/abba/demo/abba.html
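Roughly, the bucketing side is just a few lines. A sketch (hypothetical key names, using redis-py):

    import random
    import redis

    r = redis.Redis()  # assumes a Redis instance on localhost

    def bucket(user_id, experiment="signup_copy", variants=("control", "variant_b")):
        """Return the user's bucket, assigning one at random on first sight."""
        key = f"ab:{experiment}:user:{user_id}"  # hypothetical key scheme
        existing = r.get(key)
        if existing is not None:
            return existing.decode()             # sticky: same bucket every time
        choice = random.choice(variants)
        r.set(key, choice)
        r.hincrby(f"ab:{experiment}:exposures", choice, 1)  # per-variant tally to pull later
        return choice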

For anything more complicated (e.g. split test shared across multiple systems, A/B testing multi-step funnels) I fall back to Kissmetrics.


With a ton of assumptions, mainly that you have an existing data or analytics pipeline, a simple in-house experimentation framework could take on the order of weeks rather than months to build.


There are no open-source alternatives for the WYSIWYG editor part of Optimizely. But before you roll your own backend implementation, have a look at https://facebook.github.io/planout/ .

(We had success combining PlanOut with a CMS for the visual part. Marketing could do simple A/B tests. More complicated tests were done in the backend together with a developer.)
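If I remember right, a minimal backend experiment with PlanOut looks roughly like this (adapted from memory of their README, so treat the details as approximate):

    from planout.experiment import SimpleExperiment
    from planout.ops.random import UniformChoice

    class ButtonColorExperiment(SimpleExperiment):
        def assign(self, params, userid):
            # deterministic hashing: the same userid always gets the same variant
            params.button_color = UniformChoice(
                choices=["#3c88dd", "#44aa66"], unit=userid)

    exp = ButtonColorExperiment(userid=42)
    print(exp.get("button_color"))  # exposure is logged automatically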


Not a full alternative, but long ago I created this super-simple A/B testing tool you can use on top of Google Analytics - no back-end required!

https://github.com/danmaz74/ABalytics


You need to be VERY careful using Google Analytics as a backend. After a certain threshold, they start sampling your data, and results will therefore become highly inaccurate.

Disclaimer: I also created an open-source A/B testing framework with Google Analytics as (one potential) backend, but then realized it's sub-optimal. I wrote about it at [0].

[0] http://blog.gingerlime.com/2016/a-scaleable-ab-testing-backe...


I already posted this on another thread[0]. Please check it out and happy to answer any questions or accept PRs.

[0] https://news.ycombinator.com/item?id=16357600



I had googled, but was looking for recommendations based on actual experimentation or use. Apologies that wasn’t clear.


CEO of VWO here - a competitor to Optimizely.

The article's title "A/B testing is dead" is very misleading. A/B testing is not a tool; it's a mental model for seeking learnings and improvements. Anytime you introduce a new change to your business, you're implicitly or explicitly trying to understand its impact. Most of the time, setting up an environment to tease out the real effects of a change is hard (experiments in strategy are usually very difficult), so you're left guessing.

However, for mediums that allow easy tests (like websites and landing pages, where you can set up scientific experiments with ease and have enough traffic to get results in a short time period), why wouldn't you? The way I see it, rejection of A/B testing is rejection of the scientific method of conjecture and improvement. In a highly contextual business environment with tens and hundreds of conditional variables, best practices and "gut feeling" can only go so far.

A/B testing for very small businesses (those who get attracted to free plans) doesn't make a lot of sense, because they usually don't have enough traffic to get statistically significant results. Plus, optimization via A/B testing comes only after you've gotten your basics right and it's time to scale. This is why at VWO we never had a free plan.

So Optimizely's decision to ditch its free plan makes sense.


> The way I see it, rejection of A/B testing is the rejection of the scientific method of conjecture and improvement.

How much of A/B testing being done is actually scientific?

The article touches on that a little bit, but I'm reminded of this gem from 2014:

http://blog.sumall.com/journal/optimizely-got-me-fired.html

When it was discussed two years ago, I teased out[0] (with a dose of cynical attitude) some key points about structural problems with A/B testing tools - problems which steer people away from scientific methods and into the realm of easily exploitable bullshit artistry.

I never had an opportunity to use or check out VWO, so I'd like to ask you: does your tool structurally help users create sound statistical experiments? Can you share something about how you do it?

--

[0] - https://news.ycombinator.com/item?id=10873226


> does your tool structurally help users create sound statistical experiments? Can you share something about how you do it?

Yes, check out the methodology we use: https://vwo.com/blog/smartstats-testing-for-truth/


> A/B testing is not a tool; it's a mental model for seeking learnings and improvements

No, it's a validation phase most businesses use to sign off product decisions that are already made. They invest thousands in a new feature, run an A/B test, and then wait until they see an ambiguous 'win' in favour of the new feature. This is somewhat akin to flipping a coin until you get three heads in a row, and declaring yourself the winner of the game.
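The coin-flip analogy is easy to make concrete. A quick simulation (arbitrary parameters) of 'peeking' at an A/A test, where both variants convert at the same 5% rate, shows how often you'll find a nominal 'winner' anyway:

    import math, random

    def z_stat(c_a, n_a, c_b, n_b):
        # two-proportion z statistic
        p = (c_a + c_b) / (n_a + n_b)
        se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
        return abs(c_a / n_a - c_b / n_b) / se if se else 0.0

    random.seed(1)
    runs, false_wins = 1000, 0
    for _ in range(runs):
        conv, n = [0, 0], [0, 0]
        for visitor in range(10_000):
            arm = visitor % 2
            n[arm] += 1
            conv[arm] += random.random() < 0.05  # identical 5% rate in both arms
            # peek every 500 visitors, stop at the first nominal p < 0.05 "win"
            if visitor % 500 == 499 and z_stat(conv[0], n[0], conv[1], n[1]) > 1.96:
                false_wins += 1
                break
    print(f"'winner' declared in {false_wins / runs:.0%} of null experiments")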

You can denounce this as A/B testing gone wrong, but it's a reality of the web industry that most testing is badly motivated and statistically illiterate. The mentality I describe isn't limited to SMEs and tinpot organisations, either. I've seen it espoused by highly respected product managers at prestigious organisations.

Where does this epidemic of pseudoempiricism come from? Well, if I had to guess, I'd point to the marketing of companies like your own.

Take a look at https://vwo.com/ab-testing/ - "AB testing - The Complete Guide". There's not one mention of statistical significance, in word or in spirit. There's no talk of setting a timeline in advance of the test. There are no hard numbers to set a context for what's an acceptable sample size - in fact, the guide is specifically aimed at small business owners.
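For context, those hard numbers aren't difficult to produce. Detecting a lift from a 5% to a 6% conversion rate at the usual alpha = 0.05 with 80% power takes on the order of 8,000 visitors per arm - far more traffic than many small businesses have. A sketch using statsmodels (the baseline and lift are illustrative):

    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    # illustrative numbers: 5% baseline conversion, hoping to detect a lift to 6%
    effect = proportion_effectsize(0.06, 0.05)
    n_per_arm = NormalIndPower().solve_power(
        effect_size=effect, alpha=0.05, power=0.8,
        ratio=1.0, alternative="two-sided")
    print(f"~{n_per_arm:,.0f} visitors per arm")  # on the order of 8,000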

So if anyone is undermining trust in the "scientific process", it's not Optimizely, but in a way - you.


>There's no talk of setting a timeline in advance of the test. There are no hard numbers to set a context for what's an acceptable sample size - in fact, the guide is specifically aimed at small business owners.

You should take a trial of our product. Before setting up a test in VWO, we ensure you understand that the test will have to run until we have statistically significant results. We've moved away from frequentist methodologies (where peeking at results was an issue) to Bayesian ones. Check out what we do here: https://vwo.com/blog/smartstats-testing-for-truth/

The guide is aimed at people who are just starting out, but I agree with you; we should have pointed them to the importance of getting statistically correct results.
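For the curious, the core of the Bayesian approach (the textbook Beta-Binomial version, not our exact SmartStats implementation) fits in a few lines:

    import numpy as np

    rng = np.random.default_rng(0)

    # illustrative data: conversions / visitors for each variation
    a_conv, a_n = 120, 2400
    b_conv, b_n = 145, 2400

    # Beta(1, 1) prior + binomial likelihood gives a Beta posterior per variation
    a_post = rng.beta(1 + a_conv, 1 + a_n - a_conv, size=100_000)
    b_post = rng.beta(1 + b_conv, 1 + b_n - b_conv, size=100_000)

    # probability that B's true conversion rate beats A's
    print(f"P(B > A) = {(b_post > a_post).mean():.1%}")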


> No, it's a validation phase most businesses use to sign off product decisions that are already made.

Interesting, could you elaborate further on this? I've only really worked with technically unsophisticated companies (i.e. non-tech companies), and they've always been willing to do it properly, factor in the results, and not set up leading experiments.

Is it a symptom of technical companies that staff are more bought-in to their solutions?


That's not the fault of the tool if people want to throw money at it and ignore the results.

Please let people who want to throw money throw money, while the rest of us can use A/B testing to improve our sites. Thank you.


Normally I don't upvote "CEO of <competing thing>" but this is a good comment.



