"Someone signs up for my service and does something on my service that's expected of them?" BOOM! Leading indicator... No shit, Sherlock...
As a marketer, I'm getting tired of this buzzword shit around "growth hacker"...
For instance, when signing up to Facebook, there is a bunch of stuff they want you to do: add friends, invite people, add content like photos, and come back on day 1, day 2, and so on. It turns out that the first activity, adding friends, is more predictive of future engagement than the others, and specifically that 7 friends in 10 days is an important threshold. You can't figure that out from intuition alone, and you definitely can't arrive at the specific values (7 friends, 10 days) just by consulting your intuition or your sense of what's obvious.
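To make that concrete, here's a toy sketch (all data invented, the threshold is the "7 friends in 10 days" figure from the talk) of the kind of cohort split you'd run to see whether an early-activity threshold separates retained users from churned ones:

```python
# Hypothetical signup data: (friends_in_10_days, photos_uploaded, retained_at_day_90)
users = [
    (9, 0, True),
    (12, 3, True),
    (8, 1, True),
    (2, 5, False),
    (3, 0, False),
    (7, 2, True),
    (1, 4, False),
    (6, 6, False),
]

def retention_rate(rows):
    """Fraction of users in this cohort who were still around at day 90."""
    return sum(r for _, _, r in rows) / len(rows) if rows else 0.0

# Split cohorts on the candidate threshold and compare retention.
hit = [u for u in users if u[0] >= 7]
miss = [u for u in users if u[0] < 7]
print(f"retention if >=7 friends in 10 days: {retention_rate(hit):.0%}")
print(f"retention otherwise:                 {retention_rate(miss):.0%}")
```

In practice you'd sweep many candidate activities and thresholds over real cohort data; the point is that the winning (activity, threshold) pair falls out of the data, not out of anyone's gut.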
At the same time Chamath from Facebook was keen to point out that there was nothing mystical about what the growth team at Facebook did. He said the best way of describing what they did on the growth team is that they:
- Measured things
- Tested things
- Tried things
His main takeaway was that you should be extremely rigorous and analytical when thinking about what drives engagement, and not rely on your gut or on company lore.
Except the end result is usually the exact same.
A wildly inaccurate model whose sole purpose is impressing advertisers/investors/stakeholders with fancy, meaningless charts.
The conclusions you can draw from these models are either utterly obvious or false. I'm looking forward to hearing a single example of the opposite.
They were also at pains to mention that you cannot A/B test your way to a great product. You need deep insight and intuition to build a great product. But once you have a great product, you can help it grow faster by being rigorous about optimizing the drivers of growth.
"If, on a graph, the plotted data does not match the
initial hypothesis, make the line fatter."
-- J. Meals
I'm still waiting for an example where such a model predicted something non-obvious.
Did Myspace have growth hackers?
I suggest that we respectfully agree to disagree.
> ChenLi Wang, who runs the growth team at Facebook [sic: Dropbox?], said that the leading indicator of an engaged Dropbox user is when they put at least one file in one Dropbox folder on one device.
Basically, the leading indicator is any use at all. The other businesses all had, relatively, a much higher threshold.
This is probably because Dropbox is immediately useful and stupid-easy to use.
Of course, it's hard to take this at face value since we don't know how everyone is defining an engaged user (and each definition isn't directly comparable to the other services' definitions).
Presumably there can be a correlation-not-causation effect, in which case optimizing said indicator wouldn't have as much of an effect on engaged users as it did on the indicator itself. Does anyone know of a situation where they had an increase in the leading indicator but none in engaged users? Might the relationship change over time?
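That worry is easy to simulate. Here's a toy model (all numbers invented) where a hidden trait drives both the leading indicator and engagement, so they correlate, but a growth tactic that inflates the indicator directly does nothing for engagement:

```python
import random

random.seed(0)

def cohort(indicator_boost=0.0):
    """Return (mean leading indicator, engagement rate) for a simulated cohort."""
    ind_total, engaged = 0.0, 0
    for _ in range(10_000):
        motivation = random.random()              # hidden confounder
        indicator = motivation + indicator_boost  # e.g. files added, friends invited
        ind_total += indicator
        engaged += motivation > 0.7               # engagement ignores the boost
    return ind_total / 10_000, engaged / 10_000

baseline_ind, baseline_eng = cohort()
nudged_ind, nudged_eng = cohort(indicator_boost=0.5)  # tactic inflates indicator only
print(f"indicator:  {baseline_ind:.2f} -> {nudged_ind:.2f}")
print(f"engagement: {baseline_eng:.2f} -> {nudged_eng:.2f}")
```

The indicator roughly doubles while engagement stays flat, which is exactly the "indicator up, engaged users unchanged" scenario asked about above. The fix is the same as always: run the tactic as a controlled experiment and measure engagement itself, not just the proxy.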
Chamath stressed how critical it is to both minimize the time it takes to deliver a user their first 'Aha' moment, and then to continue to deliver those moments as regularly as possible. Presumably how many friends you have is causal, to some degree, of those 'Aha' moments - I expect that many of the 'Aha' moments on FB are to do with seeing cool content from your friends.
Nabeel also mentioned that it's important to be aware of various KPI metrics. His view was that there should be one overarching operational metric that is broadly predictive of future engagement, and that the growth team should focus on that, but that for any given tactic you try out, you should be aware of its effect on a number of KPIs.
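A minimal sketch of that setup (metric names and numbers are hypothetical): judge a tactic by its lift on the north-star metric, while flagging any guardrail KPI that moved too much in either direction:

```python
# Hypothetical A/B results for one growth tactic.
control = {"d7_retention": 0.30, "signups": 1000, "support_tickets": 40}
variant = {"d7_retention": 0.33, "signups": 990, "support_tickets": 70}

def lift(metric):
    """Relative change of the variant vs. control for one metric."""
    return (variant[metric] - control[metric]) / control[metric]

north_star = "d7_retention"
guardrails = ["signups", "support_tickets"]

print(f"{north_star} lift: {lift(north_star):+.1%}")
for kpi in guardrails:
    flag = " <- check this" if abs(lift(kpi)) > 0.10 else ""
    print(f"{kpi} lift: {lift(kpi):+.1%}{flag}")
```

Here the tactic wins on the overarching metric but the 75% jump in support tickets gets flagged, which is the point of watching more than one number per experiment.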
Erratic mouse movements, page refreshes, etc ;)