
Stop Measuring Everything - smalter
https://savagethoughts.com/stop-measuring-everything-8adb1118e0e5#.j0b6kqsfq
======
lmm
The correct response to bad measurement isn't to throw up your hands and stop
trying to measure. It's to get better measures.

Be careful about your measure. Check that what you're measuring actually
corresponds to what you think you're measuring. There are a lot of good
examples here, but the answer isn't to stop measuring, or to resort to "art" -
it's to get better at it.

~~~
codezero
Yep. It seems like they took the opposite approach to what they should have.

They killed their billboard campaign because their attempts to attribute the
value of the campaign to growth failed.

They talked to a few people (got attribution) and learned it had a wider
effect than their original attribution did. This is more data, not less.

It sounds like they should have taken a different approach to attribution in
the first place when they moved from purely digital marketing to meatspace
ads.

~~~
garysieling
I'm curious if you or the parent have any suggestions for a problem I've been
thinking about.

I've been building a search engine for standalone lectures
([https://www.findlectures.com](https://www.findlectures.com)). For me, the
ideal outcome of being introduced to a new concept is that an idea sticks in
my head for a while and won't let go, and I'd like to help people replicate
that experience.

Are there ways that you could track something that gets at whether this
happens for people, or is this so qualitative that the best option is to
interview users?

~~~
codezero
That's tricky and I don't have any great advice. If you could find some
tangible piece of knowledge associated with each video, then have a quiz after
some time, that might help. Or introduce some kind of spaced repetition.

Alternatively, if you make the video categorization about what the viewer is
interested in learning (that is, make the interface drive people to see videos
based on what they want to learn, rather than having a hierarchical
organization), then you can ask them later "did you learn X?" with some other
survey like questions.
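A minimal sketch of the spaced-repetition idea mentioned above, assuming a simple interval-doubling schedule (the function name, intervals, and cap are illustrative assumptions, not from any real product):

```python
from datetime import date, timedelta

def next_review(interval_days: int, answered_correctly: bool) -> int:
    """Return the next quiz interval in days.

    A correct answer doubles the interval (capped at 60 days, an
    arbitrary ceiling chosen for this sketch); a miss resets the
    item back to a one-day interval.
    """
    if answered_correctly:
        return min(interval_days * 2, 60)
    return 1

# Example: schedule the first quiz for a lecture watched today,
# then update the interval after each quiz result.
interval = 1
due = date.today() + timedelta(days=interval)

interval = next_review(interval, answered_correctly=True)
interval = next_review(interval, answered_correctly=True)
interval = next_review(interval, answered_correctly=False)
```

Whether the knowledge "sticks" would then show up as growing intervals per user, which is a measurable signal rather than a purely qualitative one.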

------
incongruity
The piece isn't terrible – the author has found the value of _qualitative
research_. Yay! (I do a bit of that, so I'm biased).

The _title_ however, is wrong. It's more about figuring out how to measure the
right things and not missing signal because of bad measures.

Open-ended, qualitative research is good at that because it helps the signal
stand out – you can find unknown unknowns and then figure out _how_ to be more
systematic in _measuring_ them.

------
iamthepieman
Money. You should measure money. Everything else is just a proxy for that.

~~~
noja
That's a very sad way of thinking.

~~~
thunderbong
Yes, it is. But let's also not forget time. It should be money/time.

------
godelski
I don't understand this whole dichotomy between art and science. Or why people
are so insistent that they are mutually exclusive. Science is an art, taking
part in this myth hurts both art and science as a whole.

~~~
initram
I agree. And most art these days involves quite a bit of science. Photoshop's
image processing algorithms aren't just sloppily thrown together, for example!
It takes a lot of science to make good artistic tools, and artists often do a
lot of science when creating their art. (Checking audio levels, figuring out
when the light will be just right, etc.)

------
ihaveahadron
I think it helps to learn how to gamble or play poker properly.

You'll learn what poker players call "variance".

If you understand variance (not just patience), then over the long term you
aren't gambling. Or you are gambling, but in a way that, done properly, will
make you money over the long term with near certainty.
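The long-run point can be illustrated with a quick simulation of a bet with positive expected value (the 55% win rate and even-money stakes here are made-up numbers for illustration):

```python
import random

def simulate(bets: int, p_win: float = 0.55, stake: float = 1.0,
             seed: int = 0) -> float:
    """Simulate repeated even-money bets; return total profit.

    With p_win > 0.5 the expected value per bet is positive:
    EV = p_win * stake - (1 - p_win) * stake  (0.10 here).
    Short runs can easily lose, but by the law of large numbers
    the average profit per bet converges to EV as bets grows.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(bets):
        total += stake if rng.random() < p_win else -stake
    return total

# A short run swings wildly; a long run tracks the expected value.
print(simulate(100))
print(simulate(100_000))
```

This is the sense in which a skilled player "isn't gambling": any single session is noisy, but a positive edge compounds reliably over enough hands.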

~~~
dasil003
Assuming you are better than the other players.

