
Ideology Impairs Sound Reasoning - Reedx
https://psyarxiv.com/hspjz
======
ilaksh
My take on this is that people need to realize that they are not an exception
to this. _Everyone_ has an ideology that overrides logic for many topics.

I think the reason for this is that our brains just generally don't operate on
a rational basis, because that's not practical for humans; we have to rely on
preformed perspectives when it comes to certain broad topics. Another
reason is that it's almost necessary to adopt your group's perspectives in
order to fit in socially. Or at least it's unlikely you will have a different
perspective if much of your information comes from one group.

~~~
haberman
I believe the best ways to buck this weakness are:

1\. Invest more energy into trying to falsify your favored beliefs than into
confirming them.

2\. Embrace humility, since we're always wrong sometimes. Humility to me
means: you can have convictions, but you should never assume that people who
disagree with you are stupid or evil.

~~~
toufiqbarhamov
_I believe the best ways to buck this weakness are:

1\. Invest more energy into trying to falsify your favored beliefs than into
confirming them.

2\. Embrace humility, since we're always wrong sometimes. Humility to me
means: you can have convictions, but you should never assume that people who
disagree with you are stupid or evil._

I’d add two other ideas that I try to adhere to, continuing your list.

3\. Try to adopt the perspectives of those you disagree with, especially when
you find yourself ascribing malice, stupidity, or moral failing to someone, or
just find yourself baffled by them. Such feelings are often a sign that you
don’t understand where someone is coming from, and though you may still
disagree with them, you’ll have a better platform to discuss matters.

4\. Try to see yourself and your time through the same lens you use to look
back on history. We’re not special or unique; we’re going to be seen as
primitives eventually, and it can be helpful to imagine in which ways that
perception might be levied.

Both of these help to broaden perspective and deepen empathy, while also
divorcing yourself from a strict “first person view” or tribal view. They’re
also a couple of concrete steps for gaining some distance from your feelings in
the present, which makes the earlier steps easier to achieve. You’ll also find
that the people you end up despising form a much smaller pool than most
people’s.

~~~
Haga
Torture yourself. Uncertainty is discomforting. If thinking and deciding is
easy for you - you are not doing it. Intellectual honesty is masochistic.

~~~
2muchcoffeeman
We must be very different people. Whenever I realise I’ve been wrong about
something for a long time, I’m fairly excited about it.

And I’ve been very stubborn in the past. The biggest struggle was being able
to accept being wrong in public. It’s much easier to have a ‘holy shit’ moment
in private!

------
james_s_tayler
A few good books I'm reading / have read on the subject

    
    
      The Righteous Mind
      The Elephant In The Brain
      In Defense of Troublemakers
    

The deck is stacked very far against us cognitively. We are a walking, talking
political nightmare unto ourselves and others.

The worst part is it's excruciatingly difficult and extremely unlikely for you
to find your own blind spots. So you need to hash things out with other
people. As others mentioned the best thing you can do is hold defeasibility
and corrigibility as some of your highest values and do your best to
understand all the pitfalls in our thinking.

~~~
acabal
I'm currently reading _The Righteous Mind_. While I haven't finished it yet,
so far a major theme has been that reason's major job is merely to justify
what the gut instinct has already decided.

Taking that theory to its conclusion, that means that most people are
approaching the question of rationality vs. emotion the wrong way by assuming
that appeals to reason and logic are the way to convince the average person.
Not so; one must convince their emotions first, and then they will reason
their way to a logical framework that fits their emotions.

It's a bit distressing to think about, but unsurprising when one reflects on
the course of history that humanity has taken, and is taking.

~~~
james_s_tayler
His experiment in one of the early chapters was pretty interesting: they come
up with little stories that are for some reason morally repugnant but on
closer inspection don't appear to be hurting anyone, then press people for
their reasons on why they think it's wrong, and people wind up inventing
justifications that are sometimes pretty far-fetched.

There's another great bit of research like that detailed in The Elephant In
The Brain.

In the '60s and '70s, Roger Sperry and Michael Gazzaniga did some research,
for which Sperry eventually won a Nobel Prize in 1981, where they showed
images to patients who had undergone a corpus callosotomy, in which the two
halves of the brain can no longer communicate, and asked them questions about
what they saw. The results were really striking. They did it in such a way
that they showed the image to only the right side of the brain but then asked
the left side of the brain to describe what it saw verbally. The patients just
invent some kind of fantastical rationalization, and they seem to fully
believe that's what they think, despite the fact that it can't be. I'm
explaining it kind of badly, but that part was jaw-dropping. It seems to fit
with Haidt's research, though.

------
tabtab
Often it comes down to trust, whether that's rational or not. Take climate
change. The topic is too involved for most non-scientists to absorb enough to
come to an informed decision on their own. Therefore, unless one invests
boatloads of time, they must rely on expert opinion for much of their
conclusion.

So climate-change deniers ask whether one should trust scientists over their
favorite political pundits or favorite CEOs. There are enough incidents of
scientists being biased by their funding source to justify some skepticism.

Then I reply, "Don't pundits and CEOs have similar financial biases?"

Deniers typically respond something like, "Sure, so I have to rely on my gut,
and my gut gives them more credit than it gives to scientists. Most scientists
come from liberal-leaning universities."

~~~
parsnips
What a way to fully illustrate the OP's point.

~~~
tabtab
The title is misleading, then. Ultimately, complex topics probably require
trust regardless, unless one can spend a lot of time on them. I once debated
whether one can clearly determine the Earth is not flat without relying on
experts. I found alternative models for just about every home-spun experiment.
Maybe with really good weather and carefully crafted survey tools one could
rule out the alternative models, but again, that's a lot of time (at least for
the Middle Ages, which is where we set our thought experiment; no Amazon.com
back then).

~~~
eeZah7Ux
> without relying on experts

This does not fly with most complex disciplines where it takes decades to
become proficient.

But relying vs. not relying on experts is a false dichotomy. You can choose
experts to trust, as rationally as possible.

~~~
tabtab
I'll reword it this way: the more you know, the less you have to rely on
experts, or at least the degree of reliance decreases. But the practical angle
is that one will still have to rely heavily on experts until one reaches a
non-trivial point in one's education/study of the topic. It's not realistic
for the average person to master every controversial topic in order to vote
fully informed.

------
indigochill
I would counter that reasoning doesn't stand by itself. Reasoning is a method
of getting from A to B, but you need an A (your axioms) to reason from. It's
easy to disparage people whose axioms are different from yours (Jehovah is
God/Allah is God/There is no god), but sometimes that's all there is to it.
Though once emotions get involved (such as by challenging the
political/religious framework on which they've built their life), pretty much
everyone will also fail to adhere to strict reason.

------
scandox
Christopher Tyerman's body of work on the Crusades is really interesting in
this regard. One of the key points he keeps pressing home is that though we
think of Medieval people as ignorant and superstitious, they were in fact
highly rational, and as a culture they were committed to rational
investigation of the world.

But they had axiomatic beliefs that they were building upon. Which meant they
were also committed to a rational investigation of the supernatural world.
Which we generally view as non-rational.

I'm very much paraphrasing and he would not express it so crudely.

------
nerdponx
I really really like the premise of this study, but I don't buy the results
fully.

The logic in the 2nd and 3rd prompts is subtle. In fact, I don't think the 2nd
one is a valid syllogism -- it's unclear whether Judge Wilson believes _if_ or
_if and only if_. In the former case the argument is _not_ valid, but in the
latter case it is. I wouldn't expect your average study participant to pick up
on the difference.

I am afraid that the only conclusion here is that, in the absence of a clear
logical argument to evaluate, (either due to ambiguity or complexity), people
fall back on their beliefs.
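The if/iff reading can be checked mechanically. Here's a minimal Python sketch
(the proposition names are my own shorthand, not from the paper) that
brute-forces all truth assignments and reports whether any of them makes the
premises true and the conclusion false:

```python
from itertools import product

def valid(premises, conclusion):
    """An argument form is valid iff no truth assignment makes every
    premise true while leaving the conclusion false."""
    for p, k in product([True, False], repeat=2):
        if all(prem(p, k) for prem in premises) and not conclusion(p, k):
            return False  # found a counterexample
    return True

# p = "the living thing is a person"; k = "one has the right to end its life"
if_version  = lambda p, k: (not p) <= k   # "if not a person, then may end it" (for bools, a <= b is a -> b)
iff_version = lambda p, k: (not p) == k   # "may end it exactly when not a person"
is_person   = lambda p, k: p              # "a fetus is a person"
no_right    = lambda p, k: not k          # "no one has the right to end its life"

print(valid([if_version, is_person], no_right))   # False: invalid (denying the antecedent)
print(valid([iff_version, is_person], no_right))  # True: valid under the iff reading
```

So whether prompt 2 counts as valid hinges entirely on which reading of the
first premise a participant adopts.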

~~~
TomMckenny
Number 3 also requires the reader to infer iff.

But it's not an unreasonable requirement, because without inferring iff you
get "Judge Wilson believes one has the right to end the life of all living
things" and "Doctor Simmi believes the surgery should proceed no matter what."

Still, I imagine that's enough to throw off an unknown percentage of people,
alas.

>I am afraid that the only conclusion here is that, in the absence of a clear
logical argument to evaluate, (either due to ambiguity or complexity), people
fall back on their beliefs.

Still an interesting conclusion though.

------
viburnum
Practical men, who believe themselves to be quite exempt from any intellectual
influences, are usually the slaves of some defunct economist. Madmen in
authority, who hear voices in the air, are distilling their frenzy from some
academic scribbler of a few years back.

------
fossuser
This feels like evidence for motivated skepticism:
[https://wiki.lesswrong.com/wiki/Motivated_skepticism](https://wiki.lesswrong.com/wiki/Motivated_skepticism)

------
tlb
Before this research, did a lot of people assume the null hypothesis that
ideology doesn’t impair sound reasoning?

Psychology seems to be responding to the replication crisis by studying some
really obvious truisms.

~~~
shard972
Before I die, I'll be reading an academic article explaining how the sky is,
in fact, not blue.

------
rntz
As far as I can tell from skimming the paper, it fails to establish that
ideology impairs reasoning _more_ than any _other_ kind of pre-existing
belief.

There is a well-established bias, called _belief bias_ (which the paper
mentions), against accepting the logical validity of arguments whose
conclusions you disbelieve. The study tested examples of this where the
conclusions were political (agreed with liberal or conservative viewpoints),
but AFAICT did not use a control test where the conclusions were apolitical,
but the participants still agreed or disagreed with them.

A control could have established whether political arguments were _more_ (or
less, or equally) susceptible to belief bias. But they didn't use one. So the
study only establishes that political arguments are susceptible to belief
bias.

------
fhfjgkvjvj
Sure, ideology impacts formal reasoning, but if you look at the examples
given, the participants have good reason to reject the conclusions of formal
logic that is based on terrible premises.

Despite the result of a simple sequence of sentences divorced from reality,
cigarettes really are bad for you and salads are good (depending on their
contents).

In study 1), participants might disagree with dangerous drugs being banned,
and they might disagree that marijuana is dangerous. In 2), premise 1 seems
relatively straightforward, but premise 2 is a highly ideological belief
(which is the point of the study).

I don't think this is very revealing. When you add 1+2 and get 68445788, you
are surprised by the conclusion and check your work. People responding to this
aren't dumb; they just aren't playing along with what they regard as faulty
reasoning. Basically, they are likely saying, "I know what the researcher
wants me to say, but this is wrong and I won't go along with it."

~~~
scythe
Sometimes I wonder if the general decrease in trust in society affects
psychology research:

[http://www.niemanlab.org/2019/01/a-gloomy-vision-for-fake-ne...](http://www.niemanlab.org/2019/01/a-gloomy-vision-for-fake-news-in-2019-low-trust-societies-the-death-of-consensus-and-your-own-lying-eyes/)

The Milgram experiment is the most prominent example which might not be
possible today because it depends on participants' willingness to obey
scientists. The Milgram _effect_ is still real, but it has become harder to
measure.

So in this case, when participants are asked to reason based on premises they
don't agree with, they might, as you've suggested, refuse. They might
interpret the researchers' intent as nefarious, e.g. "if I agree this
syllogism is valid the researcher will report that I support the conclusion".
This line of thinking is of course absurd, but it comes close to what I've
observed some people believe about scientists.

~~~
fhfjgkvjvj
Is it? Scientists and PR firms get funded by big donors. Professional
scientists are typically honest, but which studies get funded is decided by
people who aren't necessarily. The public sort of understands that things are
generally arrayed against them, though they can't necessarily explain why
mechanistically.

In this case, it might be even simpler than that. It might be that regardless
of what they believe about scientists and the formal logic of their
assignment, they just don't like being forced to say things they despise. This
could be seen as noble, though it's not exactly a plus for debating.

------
fzeroracer
I have to agree with some of the other commenters that I don't think this
study is particularly revealing, because there's a lot of nuance in the
questions that doesn't directly follow logical thinking. I want to address the
first two questions.

>All drugs that are dangerous should be illegal. Marijuana is a drug that is
dangerous. Therefore, Marijuana should be illegal.

The flaw with this question is that it's divorced completely from the cultural
reality we live in. Cigarette smoking is legal, despite being dangerous. The
final clause does not actually follow from the rest of the logic, because we
have tangible examples of Drug A being Dangerous and Legal. People might think
of this, and therefore disregard the conclusion. I don't think that would be
the result of ideological impairment, but rather the result of people
examining nuance.

>Judge Wilson believes that if a living thing is not a person, then one has
the right to end its life. She also believes that a fetus is a person.
Therefore, Judge Wilson concludes that no one has the right to end the life of
a fetus.

This is just poorly worded. If something is not a person, then Wilson believes
you have the right to terminate it; since a fetus is a person, no one has the
right to terminate it. That conclusion does not necessarily follow from the
initial statement, because the nuance there is that Wilson's personal logic is
being expanded into the logic of everyone else. Again, there's a bit of nuance
in that people may personally not be for abortion, but believe that women
still have the right to abortion despite their own personal beliefs.

The premises are also biased right from the start, since they imply that
Wilson would also believe killing animals, pets, etc. is OK.

I don't think these two questions really resolve the issue of bias preventing
sound reasoning, because that assumes the conclusions of the two questions
were logical in the first place. They effectively managed to prove that two
highly nuanced questions are, in fact, highly nuanced.

~~~
antidesitter
The marijuana argument you cited is logically valid. Logical validity isn’t
about whether the premises are true. It’s about whether they entail the
conclusion.
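To make that concrete, here's a tiny Python sketch (my own encoding, not from
the study). It encodes the marijuana argument over two propositions and
searches every truth assignment for a counterexample, i.e. true premises with
a false conclusion:

```python
from itertools import product

# d = "marijuana is a dangerous drug"; i = "marijuana should be illegal"
# Premise 1: all dangerous drugs should be illegal  ->  d implies i
# Premise 2: marijuana is a dangerous drug          ->  d
# Conclusion: marijuana should be illegal           ->  i
def has_counterexample():
    """Look for a truth assignment with true premises and a false conclusion."""
    for d, i in product([True, False], repeat=2):
        premise1 = (not d) or i  # material implication d -> i
        premise2 = d
        if premise1 and premise2 and not i:
            return True
    return False

print(has_counterexample())  # False: the form (modus ponens) is valid
```

Soundness is a separate question: you can reject premise 1 or premise 2 as
false and still grant that the argument form is valid.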

~~~
fzeroracer
Is it, though? Like let's go down some similar lines of logic:

> All websites that are dangerous should be banned. Hacker News is a dangerous
> website. Therefore Hacker News should be banned.

> Cigarette smoking causes cancer. Things that cause cancer should be illegal.
> Therefore, cigarette smoking should be illegal.

> Things that can harm children should be taken away. Phones can harm
> children. Therefore, phones should be taken away.

All of these follow the exact same logic posited in the study: A is B; B is C;
therefore, A is C. However, this really doesn't apply to the categories
above because when you start talking about 'dangerous' or 'causes cancer' or
'harms children' etc there is a wide berth for rational disagreement.

~~~
antidesitter
> when you start talking about 'dangerous' or 'causes cancer' or 'harms
> children' etc there is a wide berth for rational disagreement

That doesn't matter as long as the meaning of words remains the same
throughout the argument.

~~~
fzeroracer
The meaning of the word can remain the same, but the way something has the
property of said word can vary drastically. As a thought experiment:

Dangerous things should be banned.

Riding your bike without a helmet is dangerous.

Doing cocaine is dangerous.

Driving a car is dangerous.

Ergo, those three items listed above should be banned.

Are all of these actions equally dangerous? The answer almost everyone would
give is no. That's where the bias enters into play. The definition of the word
hasn't changed at all throughout.

~~~
antidesitter
Do you understand the concept of "logical validity" [1]? The argument you
stated is valid if "dangerous" means _exactly the same thing_ in each and
every clause. That doesn't mean the premises (in particular the first one) are
true.

[1]
[https://en.wikipedia.org/wiki/Validity_(logic)](https://en.wikipedia.org/wiki/Validity_\(logic\))

------
collyw
It seems quite concerning, then, that most political parties are ideologically
driven. I would really like to see a political party focused on evidence-based
politics.

~~~
barry-cotter
Politics isn’t mostly about values or beliefs; it’s mostly about winning,
beating the other side. Insofar as it is about either, we have a proposal to
make values explicit and to incentivise teaching the goals that serve those
values.

[http://mason.gmu.edu/~rhanson/futarchy.html](http://mason.gmu.edu/~rhanson/futarchy.html)

> Futarchy: Vote Values, But Bet Beliefs

> by Robin Hanson

> This short "manifesto" describes a new form of government. In "futarchy," we
> would vote on values, but bet on beliefs. Elected representatives would
> formally define and manage an after-the-fact measurement of national
> welfare, while market speculators would say which policies they expect to
> raise national welfare. Democracy seems better than autocracy (i.e., kings
> and dictators), but it still has problems. There are today vast differences
> in wealth among nations, and we can not attribute most of these differences
> to either natural resources or human abilities. Instead, much of the
> difference seems to be that the poor nations (many of which are democracies)
> are those that more often adopted dumb policies, policies which hurt most
> everyone in the nation. And even rich nations frequently adopt such
> policies. These policies are not just dumb in retrospect; typically there
> were people who understood a lot about such policies and who had good
> reasons to disapprove of them beforehand. It seems hard to imagine such
> policies being adopted nearly as often if everyone knew what such "experts"
> knew about their consequences. Thus familiar forms of government seem to
> frequently fail by ignoring the advice of relevant experts (i.e., people who
> know relevant things).

~~~
collyw
That's true, which is probably worse than being ideology-driven, now that I
think about it.

~~~
eeZah7Ux
There's a typical moral dilemma involved - the political "trolley problem":

Suppose you are leading a party that has an effective plan to address a lot of
preventable deaths.

If you lose, many people won't be saved.

If you bend your values and engage in manipulating public opinion, making
empty promises, lying, and so on, you are much more likely to win.

What would you do?

------
the6threplicant
But how do you become ideological in the first place?

Either you're a conspiracy theorist and believe so many weird things that
adding another belief won't tip the cart.

Or is it that we want to believe explanations that we understand (or think we
do) and not believe what we don't understand (science)?

------
xvilka
The only hope for humanity is to find a way to fix this mistake of nature in
future generations by correcting the genetic code (or whatever else is
important for this process), once it has been completely reverse engineered.
All cognitive biases at once.

It might take centuries though...

~~~
dwaltrip
It's a noble program that can achieve meaningful things, but we should be
aware that all information-processing systems will have biases. The map is not
the territory, i.e. an information-processing system can never perfectly
represent or model reality.

------
mbrodersen
Always ask the question: "What evidence would I/you need to change my/your
opinion?"

------
tokai
I intuitively understand the word ideology as denoting false consciousness, in
the Marxist sense of the word. So the title reads like a total tautology to
me, a la Unsound Reasoning Impairs Sound Reasoning. Guess I'm experiencing
ideology.

------
gibsonf1
So True!

------
ymgch
Ideology or religion, don’t know what’s worse.

