
When Worse Is Better (2011) - tambre
https://susam.in/blog/when-worse-is-better/
======
chupa-chups
This is so wrong since it implies that a decision is binary, thus taking the
opposite decision of a "wrong decision" is well defined and automatically
right.

Which is hilariously different from reality.

~~~
mrpara
Well, in the author's defense, they specifically stated that "Each assignment
required a yes-no decision". I agree, though, that there really is no great
lesson to be learned here since the real world doesn't work like that. The
technical bit about channel capacity was honestly more interesting. It's a
neat result (that should be fairly obvious to anyone who's worked with
probabilities) with a not-so-great example to illustrate it.

~~~
JoeAltmaier
Agreed, "Yes/No" is in no way correlated with "50/50 chance".

Will I win the lottery? Yes/No. 50/50?

My favorite conflation is the "are we in a simulation?" argument. That's a
yes/no also, and the ignorant conclusion is that it's very likely we are,
without any statistics on the probability either way.

~~~
TeMPOraL
Also, Pascal's wager. The presented sides of the bet aren't equal, because on
the one hand you have existence of a _particular_ God, and on the other hand
its non-existence, whereas in the real world, there's a whole space of
possible Gods that people believe in that are not included in the wager.

~~~
dredmorbius
"Wheareas in the real world" is an interesting phrasing for that poylemma.

------
brudgers
Binary choice describes a range not a distribution. Suppose Bob is optimistic
and always decides "yes". The opposite of Bob's decision is always "no".

The right course of action requires looking at the data, not theory. In a
context where the probability that "yes" is right is 0.05 -- 19/20 proposals
being bad ideas is not an implausible reality -- there's no excuse for keeping
Bob, just answer 'no'. There's no reason to assume that correct _yes_'s and
_no_'s are equally distributed.

There's also no reason to assume that an incorrect _yes_ has the same cost as
an incorrect _no_ or that a correct _yes_ has the same benefit as a correct
_no_. Silicon Valley venture financing comes to mind.
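The base-rate point is easy to check numerically. A minimal sketch, using the comment's hypothetical 0.05 probability that "yes" is the right answer (the 19-out-of-20-bad-proposals scenario):

```python
import random

random.seed(42)

# Hypothetical setup from the comment: each proposal is a good idea with
# probability 0.05, i.e. "yes" is the correct answer only 1 time in 20.
N = 100_000
proposals = [random.random() < 0.05 for _ in range(N)]  # True = "yes" is correct

# Optimistic Bob always says "yes"; the opposite strategy always says "no".
bob_accuracy = sum(p for p in proposals) / N          # ~0.05
always_no_accuracy = sum(not p for p in proposals) / N  # ~0.95

print(f"always-yes Bob: {bob_accuracy:.2f}")
print(f"always-no:      {always_no_accuracy:.2f}")
```

With a skewed base rate like this, a constant "no" already hits ~95% accuracy with no oracle required, which is the point: the 50/50 framing only holds if correct yes's and no's are equally distributed.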

To me, it's probably a good idea to base business decisions on business facts
rather than the beauty of a theory. It's not that theories shouldn't inform
decisions, but data is probably a better place to start.

------
arithma
As long as Bob doesn't observe himself and try to improve his performance.

------
whoisjuan
I think the problem with this example is the assumption that both decision
outputs impact profits equally (that both are perfectly symmetrical in their
effects).

For example, if this person is deciding whether or not to extend a credit line
to a client, the NO decisions are not going to have a direct effect on the
bottom line.

Saying NO a lot, in this case, means having the same cost to operate but fewer
opportunities to profit from new and existing clients. On the other hand,
saying YES a lot means that the company can profit from extending those credit
lines but there's a risk that they would lose money from unrecovered credits
and a heavier collections operation.

So, actually, Bob is only useful if he is the conservative one that says NO a
lot. If Bob says YES a lot, his impact is only as good as the 5% of his
correct decisions.
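The asymmetry can be made concrete with a quick expected-value calculation. The dollar figures below are hypothetical (not from the thread); only the 5% correct-yes rate comes from the original example:

```python
# Hypothetical credit-line economics: a good loan earns $1,000 in interest,
# a bad one loses $10,000 in unrecovered principal. Assume only 5% of
# applicants would actually repay (Bob's 5% correct-yes rate).
p_good = 0.05
profit_good = 1_000
loss_bad = -10_000

# Expected value per decision of saying YES to everyone vs. NO to everyone.
ev_always_yes = p_good * profit_good + (1 - p_good) * loss_bad  # ~ -9450
ev_always_no = 0.0  # declining costs nothing directly, just forgoes profit

print(ev_always_yes)
print(ev_always_no)
```

Under these (made-up) numbers an always-YES Bob loses money on average, while an always-NO Bob merely forgoes upside: the two kinds of error are nowhere near symmetrical.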

------
harryh
Related to this is the maxim about how to tell if a game is based on luck or
skill: can you lose on purpose?

------
ncmncm
We named people like Bob "Oracles of Wrong".

In any big enough group there is always an Oracle of Wrong. They are very
precious. It's very hard to be right all the time, but less hard to be wrong
all the time.

Over decades I have puzzled over the existence of Oracles of Wrong. One clue
is that they always come from privileged backgrounds. Taking the easy, wrong
choice has never caused them any personal discomfort, but has often saved them
some minor inconvenience that the hard, right choice would have cost.

------
TeMPOraL
TL;DR: when facing binary choices (yes/no), stupidity is being ~50% correct,
not ~0% correct. When someone is correct e.g. only 5% of the time, it means
they're strongly anticorrelated with reality, and you just have to do the
opposite of what they say to be 95% correct.
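The inversion trick is a one-liner to simulate. A minimal sketch, assuming a genuinely 50/50 yes/no question and an oracle who is right only 5% of the time:

```python
import random

random.seed(0)

# A hypothetical "Oracle of Wrong": answers a yes/no question correctly
# only 5% of the time, i.e. strongly anticorrelated with the truth.
def oracle(truth: bool) -> bool:
    return truth if random.random() < 0.05 else not truth

N = 100_000
truths = [random.random() < 0.5 for _ in range(N)]
answers = [oracle(t) for t in truths]

oracle_acc = sum(a == t for a, t in zip(answers, truths)) / N        # ~0.05
inverted_acc = sum((not a) == t for a, t in zip(answers, truths)) / N  # ~0.95

print(f"oracle:   {oracle_acc:.2f}")
print(f"inverted: {inverted_acc:.2f}")
```

Negating the oracle's answer flips 5% correct into 95% correct, which is why a reliably wrong oracle carries just as much information as a reliably right one.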

