

Jokob Nielsen: Guesses vs. Data as Basis for Design Recommendations - edw519
http://www.useit.com/alertbox/guesses-data.html

======
tokenadult
"In our discussion group example,

"100% of the designers who provided external data were right, whereas

"25% of the designers who relied on their personal opinion were right.

"Most strikingly, 75% of guessers were wrong. You'd be better off tossing a
coin than asking advice of these people."

A very important point. Confident guessing can sound more convincing than data-
based hypothesizing, even when it is flat wrong. This is why I appreciate any
participant here on HN who can provide data to back up statements, and why I
also appreciate comments that ask for data--yes, including from me--if a
statement is first made without data.

~~~
gizmo
> "Most strikingly, 75% of guessers were wrong. You'd be better off tossing a
> coin than asking advice of these people."

That claim above is absolutely ridiculous. If it were true, you'd do much
better by always doing the exact opposite of what they propose. If doing the
opposite of their guesses works as well as doing real research, then these
guys clearly know far more than a random coin does.

Consistently getting 75% wrong is difficult.
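
gizmo's inversion argument can be checked with a quick simulation. This is a minimal sketch; the 25% accuracy and the sample size are illustrative, not taken from Nielsen's article:

```python
import random

random.seed(0)

# A "guesser" who answers yes/no design questions correctly only 25%
# of the time is a strong anti-signal: invert every answer and you
# are right 75% of the time, far better than a coin toss.
n = 100_000
truth = [random.choice([True, False]) for _ in range(n)]
guesses = [t if random.random() < 0.25 else not t for t in truth]

acc = sum(g == t for g, t in zip(guesses, truth)) / n
inverted = sum((not g) == t for g, t in zip(guesses, truth)) / n

print(f"guesser: {acc:.2f}, inverted: {inverted:.2f}")
```

For a binary question the two accuracies always sum to exactly 1, which is why "75% wrong" would be more informative than a coin's "50% wrong".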

The data sets are too small to do any meaningful extrapolation from. The
important message should've been about falsifying claims.

When you're designing anything you have a whole bunch of assumptions. Those
assumptions ("everybody knows how to do X") can be falsified by a single
counterexample. But this absolutely doesn't imply that two people happening to
think X is a meaningful reason to assume X is correct.

~~~
tokenadult
_If it were true you'd do much better by always doing the exact opposite of
what they propose._

How do we know how many possibilities there are? Maybe there isn't just one
exact opposite choice.

But, agreeing with Jakob Nielsen, if guessers are getting it wrong more than
half the time when dealing with yes-no questions, it isn't a very good idea to
guess, and guesses are poor guidance for important business decisions about
Web services.
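
tokenadult's objection can be made concrete with a similar sketch: when there are more than two options, knowing a guess is wrong still leaves several candidates, so "do the opposite" degrades to chance. The choice of k = 4 and the trial count are illustrative:

```python
import random

random.seed(1)

# With k design alternatives there is no single "exact opposite".
# Strategy: take the guesser's pick, then choose uniformly among the
# other k - 1 options and count how often that choice is correct.
k = 4
n = 100_000
wins = 0
for _ in range(n):
    correct = random.randrange(k)
    guess = random.randrange(k)  # a chance-level guesser: wrong 75% of the time
    opposite = random.choice([o for o in range(k) if o != guess])
    if opposite == correct:
        wins += 1

rate = wins / n
print(f"'do the opposite' accuracy with {k} options: {rate:.2f}")
```

With four options, "wrong 75% of the time" is exactly chance level, and the "opposite" strategy also lands at 1/k -- no better than picking at random, which is tokenadult's point.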

~~~
ThomPete
Of course it depends on what the metrics are and how well they measure what
they intend to measure.

------
jfarmer
Part of being an expert is finding the right balance between accumulated
knowledge and tactical flexibility.

"Intuition" and "common sense" are nothing more than predjudices. They may be
right; after all, prejudices are usually founded in real-life experiences.

The problem comes when you over-generalize those experiences and crystallize
that over-generalization as knowledge.

A silly example: you run a test of button colors on a website and find that
red is the best color and that it made a large difference.
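
A minimal sketch of how such a button-color test might be evaluated, using a pooled two-proportion z-test. All visitor and click counts below are invented for illustration:

```python
import math

# Hypothetical A/B test: same page, red vs. green button.
visitors = {"red": 5000, "green": 5000}
clicks = {"red": 450, "green": 360}

p_red = clicks["red"] / visitors["red"]        # 9.0% click-through
p_green = clicks["green"] / visitors["green"]  # 7.2% click-through

# Pooled two-proportion z-test: is the observed gap likely real?
p_pool = (clicks["red"] + clicks["green"]) / (visitors["red"] + visitors["green"])
se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors["red"] + 1 / visitors["green"]))
z = (p_red - p_green) / se

print(f"red: {p_red:.1%}, green: {p_green:.1%}, z = {z:.2f}")
```

Here |z| > 1.96 corresponds to p < 0.05 two-sided; with these invented numbers z comes out around 3.3, so the difference would be "large" in exactly the sense the example describes -- which still says nothing about whether red wins on a different site.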

There are a lot of lessons you could possibly take from this, e.g.:

1. Users prefer red buttons over any other color.

2. You should always test button color.

3. It's important to think about what draws users' eyes and test that,
including buttons.

4. It's important to test everything, no matter how small.

Learning (1) is probably wrong. Red might work in this situation, but not in
another. This is the problem with learning tactics in general. As the
competitive landscape evolves or the situation "on the ground" changes
specific tactics become less useful, and might even be worse than an
alternative you've precluded because of your "expertise."

(2) and (3) are probably good lessons to take away. (3) is better because it
opens your eyes to new, possibly fruitful things to test, although (2) isn't
bad.

(4) is the opposite of (1) in that it's an overbroad lesson that is hard to
apply and if done religiously would probably paralyze the decision-making
process.

Anyhow, that's how I think about applying "data-driven" processes to business,
marketing, and product decisions.

~~~
edave
Interesting, how do you feel about all the talk on Google's data-driven UI
development?

~~~
jfarmer
In principle I think it's fine, but from the story it sounds like the
decision-making process was busted.

What I've found is that companies that live or die by data don't have better
arguments, necessarily, they just have different arguments.

At the end of the day it's about someone making a decision and taking
responsibility for it. Sometimes forward momentum and the feelings of your
employees are more important than minute details like that.

The story was more about Marissa Mayer's ego and how she uses the data-driven
process at Google to frustrate her internal enemies, IMO, and less about the
specific fact that they tested 41 shades of blue.

Personally I'd just make a decision and get on with it. If, for whatever
reason, we thought the shade of blue would really, really matter, you can
always test it later.

The risk of being data-driven is that you wind up in a world of Cartesian
uncertainty. Do I really know ANYTHING? What if this thing I don't think
matters really DOES matter? Doesn't that mean I should test every detail no
matter how small?

Anyhow, these are issues you have to grapple with when you commit to being
data-driven.

There's no simple answer, although it's easy for Google to justify testing
every single thing because they have a gargantuan amount of traffic and even
an improvement of a fraction of a percent could mean millions of dollars in
revenue won or lost.

------
kingsley_20
Interesting quote: "Fair enough: All people are experts on their own
preferences". I have mostly found this to not be the case. Most people have
preferences, but mostly don't know why. People are, however, entitled to their
own preferences.

PS: It's "Jakob"

~~~
access_denied
> PS:It's "Jakob"

You mean "it's that Jakob again"? You must be a designer.

Oh, I get it: you are a hacker, you mean the typo in the headline.

~~~
kingsley_20
//designer !(!=) hacker;

------
jfarmer
Forgive me for tooting my own horn:
<http://20bits.com/articles/decisions-without-data/>

I like the fact that Jakob took it one extra step and included data that shows
making decisions with data is (empirically) better than making decisions
without data.

Very meta. :)

------
ThomPete
I know programmers don't like to hear about intuition, and that is probably
why I am being downvoted for this too.

But that is what makes your work stand out and what separates the excellent
designer from the mediocre one.

The same goes for what makes the excellent developer stand out: that which
cannot be put into a data set.

I challenge anyone to find a common piece of design functionality that
couldn't be solved with common sense and intuition--something for which you
could prove the actual results would be better if they were based on data.

The problem, of course, is that no such data exists.

~~~
dkarl
I have a hunch that if you use data to separate the excellent designers from
the mediocre designers, then the designers who use data turn out to be the
excellent ones. If you use intuition to separate the excellent designers from
the mediocre designers, then perhaps the intuitive designers fare better.

------
shellerik
I enjoy the fact that he uses two data points to support his conclusion that
using two data points gives you better conclusions than guessing. Nothing
circular about that reasoning at all. :-)

------
madair
This seems more like an advertisement than an article. Make them worry real
big that their experts or expertise just isn't good enough, then close with a
paragraph about our training. Hmmmm.

I like reading expert articles from people _not_ trying to sell me their
expert services.

~~~
tokenadult
I wonder what Peter Norvig (who doesn't sell Web consulting services, being
employed by Google full time) would say about Jakob Nielsen's general
approach? It happens that Nielsen has consulted with Google before, and Google
is acclaimed for its usability, so I think many of Nielsen's ideas make sense.
I'd hate to have experts stay silent to avoid accusations of having commercial
interests, when they could do what Nielsen does and share a lot of good advice
for free.

------
SergeyHack
Typo: 'Jokob' must be 'J _a_ kob'.

------
ThomPete
"Summary: Even the tiniest amount of empirical facts (say, observing 2 users)
vastly improves the probability of making correct UI design decisions."

Yes, if you are a complete newbie or an academic without any experience of web
design. Otherwise, not in a million years.

~~~
olavk
Is that a guess, or do you have empirical data to back it up?

~~~
ThomPete
It's an informed opinion.

The map is not the territory.

