

Cognitive biases - gcheong
http://io9.com/5974468/the-most-common-cognitive-biases-that-prevent-you-from-being-rational

======
tokenadult
I can't upvote this as much as it deserves. The comments posted before this
one point out the delicate issue in learning about biases in human thinking:
it's a lot easier to notice the other guy's biases than my own. I'm glad that
quite a few Hacker News participants like to share articles about systematic
flaws in human thinking. Over time, with practice, that can help us disagree

<http://www.paulgraham.com/disagree.html>

more constructively in comment threads here, and avoid the "worst argument in
the world"

<http://lesswrong.com/lw/ee7/cleaning_up_the_worst_argument_essay/>

as we argue with one another.

------
gregholmberg
Here is an alphabetical list of cognitive biases. There seem to be more than
170, including a 'Google effect'.

<http://en.wikipedia.org/wiki/List_of_cognitive_biases>


Daniel Kahneman (prominent psychologist, 2002 Nobel in Economics) helped
develop "reference class forecasting" to avoid poor project management
decisions on novel or unusual projects.

<http://en.wikipedia.org/wiki/Reference_class_forecasting>
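The core move (replacing an optimistic inside-view estimate with the distribution of outcomes from similar past projects) can be sketched in a few lines. The overrun ratios and percentile below are made-up, purely illustrative numbers:

```python
# Minimal sketch of reference-class forecasting (illustrative numbers only):
# instead of trusting an inside-view estimate, look up the distribution of
# actual outcomes for comparable past projects and take a chosen percentile.

# Hypothetical cost-overrun ratios (actual cost / estimated cost)
# observed on comparable past projects.
reference_class = [1.1, 1.3, 1.4, 1.6, 1.9, 2.2, 2.5]

def reference_class_forecast(inside_estimate, overruns, percentile=0.8):
    """Scale an inside-view estimate by the overrun at a chosen percentile."""
    ranked = sorted(overruns)
    idx = min(int(percentile * len(ranked)), len(ranked) - 1)
    return inside_estimate * ranked[idx]

# An optimistic inside-view estimate of $1M, adjusted by the 80th-percentile
# overrun of the reference class:
print(reference_class_forecast(1_000_000, reference_class))
```

The point of the technique is that the adjustment comes from observed outcomes, not from the forecaster's own reasoning, which is exactly where biases like the planning fallacy live.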

------
cristianpascu
I must say I have mixed feelings about these biases.

One thing I've noticed is that they are often used as counter-arguments: "Oh,
you're only saying that because you're biased." Could we say that some people
are biased toward seeing biases everywhere?

Even if you've rationalized an opinion of yours in its entirety, as deeply as
is humanly and currently possible, you can still be subject to, say, "wishful
thinking".

Say you hold a belief about something that is not yet verified to be true or
false. It's a belief. It's not knowledge, because you can't justify its truth
value yet. A skeptical philosopher might hold that you will never justify it
entirely. But that's a different story.

However, some people will just throw a "bias" at you for the simple reason
that they think you wish it to be true, rather than that you believe and
think it to be true.

~~~
frooxie
Reminds me of Eliezer Yudkowsky's comment that "I've seen people severely
messed up by their own knowledge of biases. They have more ammunition with
which to argue against anything they don't like."

~~~
confluence
Well, if people can use these biases against the things they don't like, then
the things they don't like aren't backed up with data, aren't reasoned
correctly, or are argued fallaciously. Having a deep knowledge of cognitive
biases, logical fallacies and the improper use of statistics destroys the vast
majority of hypotheses that are stated on a daily basis.

This is not because we don't like these things. No, it's because most people
are full of shit.

See if you can spot all the examples of bias, fallacious argument and poor
reasoning in my rant above!

I've used a fair few. But only because I know all of them. Knowledge allows me
to see through people whilst making myself opaque to others. Yes this can be
abused. However this is only true to the extent that most things are already
bullshit - using these techniques can only change what bullshit people
believe.

However, these techniques don't affect what is true or not - they merely help
find it.

~~~
TeMPOraL
The problem Eliezer tried to highlight is that sloppy thinkers, and people
with no desire to understand the truth of a given matter, will tend to look
carefully for biases in arguments they don't like, while not applying the same
scrutiny to the arguments supporting their position. To some extent, it's a
problem common to all of us - we tend to be more sceptical about things we
disagree with than about the ones we like. But some people exhibit this very
strongly, and the more ammunition they have, the more surely they will hurt
themselves.

------
Wingman4l7
I highly recommend checking out the articles on <http://lesswrong.com/>, which
is a whole online community devoted to understanding and overcoming cognitive
biases.

------
bane
I highly recommend this book (Richards Heuer's "Psychology of Intelligence
Analysis") for anybody interested in this:

<https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/>

------
bthomas
If you like this stuff, I can't recommend "Thinking, Fast and Slow" enough.

~~~
mbesto
I came here to say this. Look at my comment stream and you'll see me post the
Amazon link probably once a month, haha!

------
B-Con
I believe that knowledge of cognitive biases is useful for identifying flaws
in one's own thinking, but I can't say I've seen it used very productively in
argument - especially by the "learn it and sling it" type of fact-thrower,
with more knowledge than understanding. It becomes harder to talk to those
people because they're convinced they have your psyche all figured out, no
matter what logic you employ.

~~~
cma
This reeks of survivorship and hindsight bias.

------
strangestchild
It's a shame to see the driving/flying statistic trotted out again. My
absolute chance of dying in a car crash as opposed to a plane crash is
irrelevant when deciding which to take - as an individual I'm far more likely
to die of drowning than of acid burns, but I'd still rather go swimming in
plain old H2O. Neglecting the base rate is itself a common probabilistic
fallacy.

In fact the relative risk of flying versus driving depends significantly on
the way you choose to assess risk: chance of death per journey, per mile, or
per hour. Further, a straightforward mortality assessment does not factor in
the risk of nonlethal but debilitating injury. Like all good questions, it's
not as simple as it first appears.
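A toy calculation makes the point. The figures below are invented round numbers, not real accident statistics, chosen only to show that the ranking of two travel modes can flip depending on which denominator you normalize by:

```python
# Illustrative sketch (hypothetical numbers, NOT real data): the same two
# travel modes can rank differently depending on whether risk is
# normalized per journey, per mile, or per hour.

def risk_rates(deaths, journeys, miles, hours):
    """Return fatality rates per journey, per mile, and per hour."""
    return {
        "per_journey": deaths / journeys,
        "per_mile": deaths / miles,
        "per_hour": deaths / hours,
    }

# Hypothetical mode A: few, long, fast trips (a flight-like profile).
a = risk_rates(deaths=1, journeys=1_000_000, miles=500_000_000, hours=1_000_000)
# Hypothetical mode B: many, short, slow trips (a car-like profile).
b = risk_rates(deaths=50, journeys=500_000_000, miles=5_000_000_000, hours=150_000_000)

print("A riskier per journey:", a["per_journey"] > b["per_journey"])
print("A riskier per mile:   ", a["per_mile"] > b["per_mile"])
```

With these made-up numbers, mode A is riskier per journey while mode B is riskier per mile - so "which is safer?" has no answer until you pick the denominator that matches how you actually travel.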

------
frooxie
The best book on cognitive biases that I have read is Irrationality by Stuart
Sutherland.

------
chris123
There are many good Wikipedia pages on this. Start with the lists and drill
down to pages for individual items:
<https://www.google.com/search?q=wikipedia+bias+fallacy>

------
jakeonthemove
I don't know, choosing the middle option seems like a good thing - get the
best possible product for the least possible amount of money...

------
thewarrior
Do these biases give us any cognitive advantages? We wouldn't have evolved
them unless they gave us some evolutionary benefit.

~~~
strangestchild
Absolutely. For example, the ingroup bias allows a group of related
individuals to benefit at the expense of those with whom they are less likely
to share genetic material; and the observation bias allows our brain to focus
on information that is more likely to be of interest - if I tell you tigers
have stripes, you're likely to notice more stripes and maybe spot more tigers.

In general, though, fallacies like this arise because the brain prefers rules
that are simple and quick to apply - they may not be optimal in terms of the
solution obtained, but they are effective heuristics once cost and time are
factored in. It's better to spot a hidden tiger quickly but occasionally get
it wrong, than to be the world's greatest tiger spotter given half an hour to
think about it. Most probabilistic and decision-making fallacies fall into
this bracket - the middle-choice fallacy is actually a pretty good heuristic
(as another poster pointed out) - but there are edge cases where it can trick
us if we don't think over our decisions rationally when we have the time and
freedom to do so.

