
Why you think you’re right, even when you’re wrong (2017) - ColinWright
https://ideas.ted.com/why-you-think-youre-right-even-when-youre-wrong/
======
tombert
I have tried to employ the strategy of "I'm probably wrong all the time." It's
not perfect, and occasionally I'll get defensive about something that I later
find out is wrong (I was staunchly against unit tests, for example), but
having the mentality that your own mind is fairly fallible is a good way to
avoid yelling at people, at least for me.

I do wish it were more politically acceptable for politicians to adopt this
mindset. Without taking a side in "Democrat" vs. "Republican", it feels like
the world views admitting that you are incorrect as a sign of weakness, or
"flip-flopping".

~~~
jtbayly
Perhaps you ought to assume you are wrong that you were wrong about unit
tests. :)

~~~
tombert
I just try not to hold any viewpoint with any dogmatism now. I'll admit that
maybe people will come back and realize that unit tests were stupid, but this
time at least I'll know that I was neither a staunch defender nor a detractor.

~~~
mixmastamyk
The final truth is usually, "it depends…"

------
jumpman500
But how do we know that the scout's belief is right? It seems like the author
has already come to the conclusion that it's always going to be right. Doesn't
that just bring us back to the mindset of the soldier?

Maybe we just have to accept that part of the experiment of living in a
democratic and free society is allowing people to create their own values, and
accepting that people are going to get it wrong. Their values are going to
conflict with ours, and that's good because we need to be sure our values are
actually worth holding.

~~~
dahart
> But how do we know that the scout's belief is right?

I feel like the assumption in the article is that neither is right, but that
the scout is searching for right, and the soldier is defending right.

> Seems like the author has already come to the conclusion that its always
> going to be right. Doesn't that just bring us back to the mindset of the
> soldier?

I don't think so. The mindset distinction isn't between right and wrong, both
types prefer to be right. The distinction is whether you're a motivated
defender of an idea in spite of the evidence, or whether you're searching for
truth and willing to change your position.

If you think about this in just pure statistical probabilities, who's more
likely to be right at any given time, the person who is certain they're right,
or the person who has changed their mind to favor the evidence?

It feels like this might even be related to the Monty Hall problem to me; once
you have some evidence that something's wrong, there's a distinct statistical
advantage to changing your mind, even when you don't know and you're choosing
randomly.
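
The Monty Hall analogy can actually be checked with a quick simulation (a
sketch of the standard puzzle, not anything from the article): after the host
reveals a losing door, sticking with your first pick wins about 1/3 of the
time, while switching wins about 2/3.

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round of Monty Hall: returns True if the final pick wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
stay_wins = sum(monty_hall_trial(switch=False) for _ in range(trials))
switch_wins = sum(monty_hall_trial(switch=True) for _ in range(trials))
print(f"stay:   {stay_wins / trials:.3f}")    # ≈ 1/3
print(f"switch: {switch_wins / trials:.3f}")  # ≈ 2/3
```

The intuition matches the comment above: the host's reveal is new evidence,
and updating your choice in light of it beats defending the original pick.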

~~~
jumpman500
I like this. I think you have a better interpretation than my initial one.

If it is sort of probability-based, then it can't be right or wrong, just sort
of incomplete until we observe the consequences of the action. More often than
not the scout will probably be right, and I can concede that. But that doesn't
imply that the soldier will always be wrong either.

~~~
dahart
> But that doesn't imply that the soldier will always be wrong either.

Completely agree! I maybe implied too strongly that soldiers claim to be right
against contrary evidence, which isn't what Julia said. Not only are soldiers
often right, people often defend their stance and stop looking for the right
answer precisely because they are right and know it. The risk is just that
sometimes that decision is made prematurely.

Personally, I probably revert to soldier when I _think_ I'm being a scout more
often than I want to admit.

------
Isamu
This is very interesting.

>Julia Galef is a writer and co-founder of the Center for Applied Rationality,
a nonprofit organization devoted to helping people improve their reasoning and
decision-making, particularly with the aim of addressing global problems. She
is also the host of the Rationally Speaking podcast.

[http://www.rationality.org/](http://www.rationality.org/)

------
mcgwiz
"What do you most yearn for — to defend your own beliefs or to see the world
as clearly as you possibly can?"

Before you answer the latter, ask yourself when was the last time you admitted
to someone you were wrong about something significant to you both? Can you do
so more often?

------
andyjohnson0
Julia Galef did a talk last year for the Long Now Foundation on the
soldier/scout mindsets [1]. It's about 90 minutes including questions, and I
found it interesting and engaging.

[1] [http://longnow.org/seminars/02018/sep/12/soldiers-and-scouts-why-our-minds-werent-built-truth-and-how-we-can-change/](http://longnow.org/seminars/02018/sep/12/soldiers-and-scouts-why-our-minds-werent-built-truth-and-how-we-can-change/)

------
xutopia
Scouting as described in this article is exactly what we need to foster in
today's fake-news world. I am doing all I can to drive this point home with
my children.

------
a_c
Not sure where I read it, maybe in Nonviolent Communication or somewhere else,
but paraphrased: "Being right feels the same as being wrong." Think about
hundreds of years ago when people debated whether the earth is flat: either
side truly believed they were right, even though one clearly wasn't.

~~~
tialaramex
No. They didn't.

People who actually thought about this tended to pretty quickly figure out
that the only shape that works is the globe, a big ball. They estimated how
big it was. The ancient Greek estimates are pretty good. The result was
disheartening if your employer was an emperor because the globe was much
bigger than their empire. Still, you could point out that maybe most of it is
just ocean.

Most people didn't think about it at all. Seriously insisting that you believe
the Earth is flat, generally as part of some larger grand conspiracy theory,
is a modern phenomenon.

------
kgwxd
"Julia Galef is a writer and co-founder of the Center for Applied Rationality"

So the author has definitely heard the term "confirmation bias". Why make up a
new term for it?

~~~
theoh
She probably does know about cognitive bias, but you can't seriously be
suggesting that her bio establishes that with certainty. People who write and
set up organizations are a very mixed bunch. Galef is not exactly a high-
powered scientist or philosopher.

"Motivated reasoning" has its own Wikipedia page: it's fine to talk about it
as something distinct from cognitive bias.
[https://en.wikipedia.org/wiki/Motivated_reasoning](https://en.wikipedia.org/wiki/Motivated_reasoning)

~~~
king_panic
"Motivated reasoning" is the product of confirmation bias. That's where the
distinction ends.

~~~
theoh
No. If anything confirmation bias is a consequence of one particular case of
motivated reasoning.

I think you will need to make a much stronger case if you want to convince
anyone that this distinction is not useful (and that, therefore, the
dismissive comment I initially replied to was justified.)

~~~
king_panic
I'll let you make my case: Do you make a decision before or after you gather
and interpret information?

------
sandwall
'To fight against confirmation bias is to fight the hardest battle and never
stop fighting'

Biases are built in, it takes effort to be rational.

------
king_panic
Simple explanation for why we think we're right even when we're wrong: we
perceive what we believe. Not the other way around. Even when we're dead
wrong.

In psych they call this confirmation bias.
[https://en.wikipedia.org/wiki/Confirmation_bias](https://en.wikipedia.org/wiki/Confirmation_bias)

Scott Adams wrote a fantastic book on the subject, in the context of Trump's
election to the Presidency. [https://www.amazon.com/Win-Bigly-Persuasion-World-Matter/dp/0735219710](https://www.amazon.com/Win-Bigly-Persuasion-World-Matter/dp/0735219710)

