
Why Facts Don’t Change Our Minds (2017) - BerislavLopac
https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds
======
tunesmith
We have this weird fetish these last few years with asserting in broad
language that people don't change their minds, don't respond to evidence, dig
their heels in further when presented with evidence that contradicts their
preconceived opinions, etc. It's all over pop psychology and the headlines.

Bugs the heck out of me, because if the language they use is literally true,
then no one would ever be convinced to change their minds ever. And yet, we
do.

It's true that perhaps _some_ or _many_ people never change their minds, or
that all people might be _apt_ to behave that way when they're not focusing
their attention, but that's wildly different language than the headlines use.

This article is obviously guilty as well - "Why Facts Don't Change Our Minds".
Like, ever? The content of the article doesn't back up the headline, but the
headline is what people remember.

Even some of the content is probably guilty. In the Stanford capital
punishment study, is it really true that _each and every individual_ in the
study responded as they describe? Because that's how the article is written.

And the problem is it gives people more excuse to give up - to not engage with
someone who is wrong, or to dismiss someone who is right (that they think is
wrong). Because, studies!

The real lesson is the opposite - that we have to study and learn critical
thinking, and practice, as a discipline, changing our minds when the evidence
or reasoning warrants it. Just because it's hard doesn't mean it's impossible,
and in fact the ability to do so is part of what makes us an evolved species -
or more generally, our ability to surpass our instincts and evolved traits.

Show me some experiments that demonstrate what conditions need to be in place,
in order for people to change their minds after they've proven to be
resistant. (Hint: psychological safety and lack of time pressure.) That's
what'll be valuable. I'm tired of all these other studies that just say we
don't.

~~~
titanix2
Well, most of the time the "Why Facts Don't Change Our Minds" question could
actually be rephrased as "Why Doesn't Everybody Agree with the Left?" I just
skimmed the article, but the mention of Steve Bannon at the end is an
indication that the article probably belongs to that trend. This turns the
issue into political ground, not scientific ground. Thus, the word "fact" is
used as an appeal to authority when it's actually an ideological discussion
that is at stake. Some facts are carefully ignored or dismissed when they
don't fit the narrator's world view (on either side), so it would be better to
acknowledge that it is opinions that are being discussed, and that they are
not absolutely right.

~~~
gameswithgo
How about "why don't facts change people's minds about global warming"?

or "why don't facts change people's minds about gun control?" (this is an
issue on both the left and the right; both are wrong about many aspects of gun
issues!)

These are just two cases that came to mind, where I have friends/family who I
talk to about it, and I can explain some facts in great detail that I'm
familiar with, and in that moment they will seem to change their mind a little
bit. But a couple months later they are right back to their old thinking.
Maybe people tend to forget the stuff they would rather not believe.

~~~
mbrumlow
> How about "why don't facts change people's minds about global warming"?

I have always thought it was either the lack of the ability to understand the
facts, or the sheer unwillingness to actually look at the facts.

~~~
PurpleBoxDragon
What about the way the information is often presented to the average person as
a justification for policy change? How often does the average person encounter
information about global warming devoid of an attempt to change policy in
relation to the information? That alone gives a person reason to doubt the
authenticity of the information, since the source is seen as biased. Add in
how badly news organizations butcher the actual science, and there is plenty
of room for doubt that can then be exploited by those who specifically seek to
do so.

~~~
skosch
Hmmm, interesting perspective. However, forget about the denialist fringe for
a moment. The majority of people do understand that climate change is bad, and
most feel paralyzed into inaction by all the doom-and-gloom messaging – _welp,
looks like we're all screwed, now let me get back to what I was doing_.

Few understand that climate change is still very much a solvable problem, and
that our success crucially depends on building the political will for new
policies around carbon pricing and land use — and that, in turn, hinges on
creating an understanding of the scales at which such policies must and can
enact change. The current mainstream discourse, on the other hand, is half
fatalistic cynicism, and half turn-off-the-lights-and-reuse-your-plastic-bags
blather.

Ergo, talking about policy change is important, no?

------
the_greyd
I say that's a good thing. It's a good thing that beliefs have inertia, and it
makes sense. Think of the years of patterns of thinking and neural connections
that have formed in your brain. Are you expected to read something and change
that in an instant? Imagine when Darwin's theory of evolution came out:
assuming it had perfect evidence behind it, would we expect everybody to
instantly change their beliefs? And sometimes what we call facts is poor
research; see the ongoing replication crisis in psychology, social science,
nutrition, etc. The time it takes to change a mind is a long, complicated
process, and that is justified. A world where people's minds were easily
changed by facts would be a world of fads.

~~~
Bartweiss
One of my favorite points in favor of facts, debate, and persuasion is the
observation that you almost never see someone _change_ their mind, but you can
find legions of people who _have_ changed their minds.

There are a host of reasons we don't respond to facts with "oh, I guess I'll
abandon all my deeply-held beliefs immediately!" Everything from "that's an
embarrassing loss of face" to "I'd like to fact-check you before I accept
that, but it's rude to say I doubt you" to "brains don't work that way, it's
physically impossible to discard a whole belief on demand" factors in. And I
agree, this is hardly a bad thing. Taking time to change our minds is a safety
feature. Not only are facts sometimes false, or falsified, or misleading, but
most of us aren't great at knowing what's relevant when.

On almost any topic, there is _someone_ who could argue me into an
embarrassing defeat, or at least an awkward stalemate of "we can't both be
right". This is true for most people about all things, and all people about
most things. I am overwhelmingly confident that young-earth creationism is
wrong, but I've seen its adherents speak; I have a lot of arguments they can't
answer, but they have a lot of arguments I can't answer either. I don't know
enough about the geology of Mount Ararat to explain why some arcane point
about flood sedimentology is wrong, but the correct response to that is not
for me to agree that the Earth is 5,000 years old. And that's for a fringe
theory that I'm uncommonly well-qualified to rebut - the problem only gets
worse when we move to a lay viewer looking at any serious debate.

People today may have unprecedented access to facts, but that didn't give us the
time or memory or training to evaluate every possible issue from the bare
facts up. Everyone who's ever cast a vote is, on a great many issues, working
from heuristics and expert opinions and best guesses. With only one lifetime
to learn things, that's not avoidable. I'm sometimes alarmed by how harshly
people will resist looking into new facts and evidence, but the idea that
people should promptly respond to compelling-sounding facts by changing their
minds doesn't strike me as a workable one.

~~~
the_greyd
After reading a bit of evolutionary psychology, I realized to my astonishment
that humans have evolved to survive, not to be perfectly rational. We are
rational only insofar as it's useful. I'm actually convinced that extreme
rationality has evolutionary disadvantages. When you realize the universe has
no meaning, does it motivate you and your species to keep breeding and take
risks? As Schopenhauer said, if you truly considered all the costs associated
with having kids before having them, no rational being would ever have kids!
No person thinking clearly would ever trade 10-15 minutes of feeling good for
a lifetime of costs and responsibilities. Then there's tribalism and ingroup
and outgroup psychology, which dictates much of what we see. These are
survival mechanisms meant for living in small tribes, malfunctioning in a
globally connected civilization. I would suggest Yuval Harari's recent article
on how humans are hackable and how that completely demolishes our concepts of
liberal democracy, free markets, etc., which assume rational actors behaving
in their self-interest.

------
BerislavLopac
My problem with the studies described at the beginning is with this
conclusion:

 _Even after the evidence “for their beliefs has been totally refuted, people
fail to make appropriate revisions in those beliefs,” the researchers noted._

My understanding, from the descriptions in the article, was that it was not
that the beliefs were _refuted_; it was just that the subjects were told that
what they had been told originally was not true. The subjects had no reason to
believe the latter assertion over the former.

~~~
geezerjay
It is also important to note that facts, in the true sense of the word,
represent single observations of a phenomenon, while beliefs are more about
models that reflect relationships between observable phenomena and a way to
model and predict behavior.

Thus it's easy to understand that a single observation, particularly one which
is given at face value, is not enough to scrap whole models.

To provide an example: should we scrap the whole notion of gravity
accelerating all dropped objects uniformly, independent of mass, if someone
observes a feather taking more time to fall than a cannonball? I mean, that's
a fact. Anyone can see it and reproduce the same behavior. So, should everyone
just abandon the notion of gravity uncritically because a single fact was
presented?
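The feather-versus-cannonball point can be made concrete with a toy
simulation. This is only an illustrative sketch under an assumed linear
air-drag model (real drag on a feather is messier), and the masses, heights
and drag coefficient are made up for illustration:

```python
import math

def fall_time(mass, drag=0.5, height=10.0, g=9.81, dt=1e-4):
    """Euler-integrate m*dv/dt = m*g - drag*v and return time to fall `height`."""
    v = y = t = 0.0
    while y < height:
        v += (g - drag * v / mass) * dt  # gravity minus linear drag
        y += v * dt
        t += dt
    return t

vacuum_time = math.sqrt(2 * 10.0 / 9.81)  # mass-independent prediction
feather_time = fall_time(mass=0.005)      # 5 g "feather": drag dominates
cannonball_time = fall_time(mass=5.0)     # 5 kg ball: barely differs from vacuum
```

Under these made-up parameters the feather takes far longer to land than the
cannonball, while the cannonball's time stays close to the vacuum prediction:
the single observation is explained by adding drag to the model, not by
scrapping the notion of uniform gravitational acceleration.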

------
tobr
For some reason this made me think of the problems with side effects in
programming. If we can read and write willy-nilly, it’s very difficult to
figure out why the data is the way it is, and we can’t undo/rewind unless we
planned for it all along.

When our minds are mulling over some piece of information, whether it’s true
or not, it will have side effects on other thoughts, opinions and emotions. If
we’re confronted with a revised set of facts later, there is no way to rewind
all those side effects.
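The analogy can be sketched in code (hypothetical names, not from any real
library): mutating shared state in place loses history, while keeping
immutable snapshots makes "rewinding" a retracted fact trivial, but only if
you planned for it all along.

```python
def form_opinion(fact):
    # A downstream "side effect" derived from the fact.
    return "favourable" if "good" in fact else "unfavourable"

def mutate_in_place(beliefs, fact):
    # Read/write willy-nilly: the old opinion is overwritten and gone.
    beliefs["latest_fact"] = fact
    beliefs["opinion"] = form_opinion(fact)

def revise_with_history(history, fact):
    # Planned-for undo: append an immutable snapshot instead of mutating.
    updated = {**history[-1], "latest_fact": fact, "opinion": form_opinion(fact)}
    return history + [updated]

# In-place version: no way back to the old state.
beliefs = {"opinion": "neutral"}
mutate_in_place(beliefs, "the evidence looks good")

# Snapshot version: retracting the fact is just dropping the last entry.
history = [{"opinion": "neutral"}]
history = revise_with_history(history, "the evidence looks good")
retracted = history[:-1]
```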

------
dang
Discussed at the time:
[https://news.ycombinator.com/item?id=13810764](https://news.ycombinator.com/item?id=13810764)

------
izzydata
They sure change my mind. I'm just not always convinced that what people are
saying are facts. At best they are the current best explanation, and at worst
only opinions.

~~~
merpnderp
At one end of the spectrum of what constitutes a fact is Richard Feynman's
definition, which was anything he'd replicated himself in a lab. He considered
everything else he knew to be based on faith (in people, in organizations, in
whatever made the claim, but which he held in high enough esteem to actually
believe.)

------
philipps
Both this and the previous discussion show that HN readers believe facts (or
information) can and do change people's minds. The way I understand the
article (and the books it refers to), it suggests that social factors have a
stronger effect on opinions than facts. I find this hypothesis plausible when
I look at the polarized discussions I see playing out at the societal level,
as well as interactions with other people individually. And maybe, rather than
refute it, the HN reaction to the article is further evidence for the
mechanism the article describes?

Edit: typo

------
Isamu
Belief is not generally fact based. Going strictly with the facts is an
incredibly disciplined thing. Feynman used to play games to keep himself
sharp, to avoid fooling himself, because, as he said, the easiest person to
fool is yourself.

With people in general the belief comes first and then you backfill with
supporting evidence and argument.

Also, we tend to pull our beliefs into our identity. Then a challenge to our
beliefs is a challenge to our identity, and very hard to swallow. So it is
difficult but important to keep from identifying yourself by your beliefs.

------
dagaci
Humans are overwhelmingly emotional creatures. We are a bunch of monkeys with
only a fragment of real awareness of why we are doing X and Y.

Human brains judge another person's whole life in milliseconds (we all do it,
can't help that), but then struggle with what is quite simple math. That
should tell you something about what we are.

My advice is: don't ever forget that logic is only the smallest part. Demeanor
and a strong image are much more important than they should be; simply
presenting a "fact" is not enough.

------
hevi_jos
We should talk about complexity and complication here.

Complex is the opposite of simple. If there are 50 identical elements
communicating with each other, we have a complex system.

This is important because a thousand ants behave totally differently from an
isolated ant.

You just can't study an ant and expect to understand the nest or anthill.

Psychologists try to do this all the time, and it is totally wrong: they
studied isolated rats in order to understand addiction to drugs, and they did
not understand that rats are social animals which travel 20 miles (30 km) each
day.

So they jailed the rats in a cage, because it was easier to study them that
way, and extracted conclusions that were totally bogus for human beings. Those
conclusions were the basis for the US war on drugs.

Here they do the same. They run a very simple experiment and extract
conclusions about the whole system: "in Africa, when humans..." whatever; more
fiction than science.

It is ok to make fiction and speculation, but you should always differentiate
what we really know with high degree of confidence from what we do not.

------
Asturaz
The study Mercier and some European colleagues did together, which is
mentioned in this article, seems very similar to the Choice Blindness study.

[https://www.wired.com/2010/08/choice-blindness/](https://www.wired.com/2010/08/choice-blindness/)

[http://www.lucs.lu.se/wp-content/uploads/2011/01/Choice-Blindness-summary.pdf](http://www.lucs.lu.se/wp-content/uploads/2011/01/Choice-Blindness-summary.pdf)

The latest paper from that study, which is incredibly interesting:

[http://www.lucs.lu.se/wp-content/uploads/2018/08/Strandberg-Siv%C3%A9n-Hall-Johansson-P%C3%A4rnamets-False-Beliefs-and-Confabulation-Can-Lead-to-Lasting-Changes-in-Political-Attitudes-2018.pdf](http://www.lucs.lu.se/wp-content/uploads/2018/08/Strandberg-Siv%C3%A9n-Hall-Johansson-P%C3%A4rnamets-False-Beliefs-and-Confabulation-Can-Lead-to-Lasting-Changes-in-Political-Attitudes-2018.pdf)

------
netcan
This is a tangent.

How much/long does a study like this cost/take? Is it possible/useful to scale
this?

From reading into some of the replicability discussion, I've gotten curious
about social science generally. What's an experiment like this trying to
prove/demonstrate? The journalistic narrative likes the easy "X debunked." I
imagine researchers have a more nuanced perspective.

I guess what I'm asking is: what does the larger effort look like? Do these
studies eventually add up to a larger understanding of how we _do_ form
opinions, where facts _do_ change our minds....

Replicability is one thing. How about generalisability?

~~~
TangoTrotFox
The replication rate in social science studies is very poor. And an even
larger problem is that the studies themselves do not actually test what they
aim to test, often because they simply cannot. For instance, do video games
cause violence? The only way you could realistically test this would be to
have one group somehow prevented from playing video games for their entire
life, or at least up to a certain age, and another group obligated to play
video games in a similar fashion, and then some control group presumably free
to do as they saw fit. And then follow these individuals for decades and see
what happens.

That's not really possible. So instead the experiments that are made are toy
experiments, but when you're not really testing what you're trying to measure,
it becomes impossible to prove anything, and possible to show anything. For
instance could you design a toy experiment that might indicate video games
cause violence? Of course. Could you design a toy experiment that might
indicate video games don't cause violence? Of course. The experiments are
meaningless.

As an example of the problem, 'emotional intelligence' at work is all the rage
right now. Yet the keystone study that sparked it is really quite bad. The
author had people split off into groups and perform a variety of tasks such
as, literally, planning a shopping list. Using some method of determining who
made the best shopping list, the author then determined that the groups with
the highest average IQ did not perform the best, whereas their 'emotional
intelligence', as measured by a (again, literally) "reading the mind in the
eyes" test, mapped better to performance. Therefore, the conclusion goes, it's
not merit alone that determines performance but some emotional intelligence,
at least as measured by "reading the mind in the eyes". That's just broken
logic: at the bare minimum, IQ != specific task merit. The most logical way to
perform that experiment would have been to create teams of those who performed
individually best on any given task and have them work against those who did
not score so well on merit but did well on the "reading the mind in the eyes"
test. Of course she did not do this, the bare minimum to even begin
approaching the question, since the result would not have been what she
wanted. And negative results don't get published. Yet now there have likely
been hundreds of articles and spin-off studies taking that study's unjustified
conclusion for granted.

So no, there is absolutely no big picture progress in the social sciences.

~~~
Chyzwar
I think there is a way. Facebook/Google have all the data to perform studies
like that. The problem is that the current university system is broken. You
are forced to publish short success stories instead of doing actual research.

------
macspoofing
Don't facts need to be contextualized in a larger belief framework to be
useful? "Sun rises in the East" is a fact that is compatible with a
heliocentric and geocentric view of the world. So that fact on its own is
useless in explaining the world. In politics, your ideology will guide your
interpretation. If you have a particular view on, say, immigration, you will
highlight certain facts, suppress or dismiss inconvenient facts, and in
general interpret all facts through your ideological lens.

------
chiefalchemist
If anyone would like to explore the topic a bit further, try "The Influential
Mind" by Tali Sharot. It was on the FT's short list for best of 2017. It's a
quick, easy read, Gladwell-esque. The difference is, she is an actual
scientist who works in that field. Tho' perhaps that makes her biased? ;)

[https://www.amazon.com/Influential-Mind-Reveals-Change-Others/dp/1627792651](https://www.amazon.com/Influential-Mind-Reveals-Change-Others/dp/1627792651)

------
ASinghUK
The article is basically saying that if we're told we're right, then we're
told we're wrong, we will still believe we're right even though we're
presented with a contrary fact? The test group aren't asked their answers in
response to being told a fact - it's more of a psychological game, in my
opinion.

------
TheSpiceIsLife
> _In 1975, researchers at Stanford invited a group of undergraduates to take
> part in a study ... A few years later, a new set of Stanford students was
> recruited for a related study._

We have to wonder if _Stanford undergraduate student_ generalises to the whole
population.

It doesn't require much stretching of the imagination to see that _circa 1975
Stanford undergraduates_, as a cohort, may score below average on objective
measures of _humbleness_.

~~~
conistonwater
Does it really rely on 1975 Stanford undergrads, though? For example, I think
Dan Kahan (Yale's Cultural Cognition Project at
[http://www.culturalcognition.net/blog/](http://www.culturalcognition.net/blog/))
usually says about the same thing about facts and changing minds.

~~~
TheSpiceIsLife
The first discussion in your link talks about people's bias for wanting to
remove plastic from the ocean remote from the source, rather than at the
source.

Surely it's obvious that preventing plastic entering the ocean at the _source_
does nothing to remove the plastic that's already there.

Maybe I should read more of this blog, but that doesn't inspire confidence.

~~~
conistonwater
I think you should. If you scroll back a couple of years, there's plenty of
discussion of published research on science communication, which might be more
interesting to you.

~~~
TheSpiceIsLife
Cool, thanks, I'll have a gander.

------
dqpb
This is why cache invalidation is important.

------
merpnderp
But what is considered a fact? Richard Feynman only considered a thing a fact
if he'd replicated the results himself. Everything else he took on faith.

When the FDA said you should limit fat intake, that was considered a fact and
only nuts would disagree, but it is now considered a much trickier statement.
When the FDA says the flu vaccine is safe, that is a fact and only nuts would
disagree - and that is not a questionable statement. But it isn't hard for a
nut to use the first to confuse the latter.

------
slx26
This is very long and complicated, but I'll try to write a quick comment.

Many people know from experience how hard it is to convince anyone to change
what they say, the way they behave, etc., from facts alone. I say "behave"
because it's very important to realize the difference between "rationally
accepting" and "caring about a truth or fact"; more on that later.

We also know that people can change their minds quite easily when they are
exposed to pretty much anything for enough time, with enough repetition. Even
when we know something is false, the repetition of a certain discourse can
have a noticeable effect on us. When you pair that with other types of
external pressures, it becomes even more egregious.

The key point is that we can't easily change how people feel while being
respectful to those people and trying to make them change through words only.
We are irrational, and we all have different priorities. Even if I accept a
fact, it might not be something important to me, even if I rationally say it
is, so I won't change the way I act, and it won't matter that much.
Conversely, I might say that I don't accept something just because I don't
feel that way, even if I have to discard and ignore (unconsciously) the facts.
What "I feel" is more relevant than the facts that I "don't really (want to)
understand". As tobr puts very well in another comment, all these feelings and
ideas have a long-term effect: "When our minds are mulling over some piece of
information, whether it's true or not, it will have side effects on other
thoughts, opinions and emotions. If we're confronted with a revised set of
facts later, there is no way to rewind all those side effects."

I thought it was interesting to write about all this, not only because it
really helps to understand why facts might or might not do much to change the
way people behave, but also because it helps us understand better how to
actually make people change their minds. Words might be very effective for
those who are already in a similar line of thought, but in other cases we
might want to try changing the way people feel about something instead, by
making them live, in first person, the contradictions in their own beliefs.
(Unfortunately, for many technical issues, you can't do that without becoming
a teacher, and that assumes the other person trusts you enough to let you
teach them something; but then it's not surprising that people can't trust
facts that depend on knowledge they don't have. It's only natural.) And all
this also helps us understand that even though we might accept many rational
truths, we all have different priorities, so the ones we end up acting upon
deserve some consideration. Sometimes you are so focused on your own causes
that you don't understand how others don't share them, when it's simply that
they have their own causes too, not necessarily that they don't rationally
accept and understand what you are doing. And there are a billion worthy
causes. For some people it might be about saving the world; others focus only
on their kids. We can do a lot to better manage that collision between
rationality and irrationality, between facts and feelings, because both are
critical to us as human beings.

~~~
emodendroket
> We also know that people can change their minds quite easily when they are
> exposed to pretty much anything for enough time, with enough repetition.
> Even when we know something is false, the repetition of a certain discourse
> can have a noticeable effect on us. When you pair that with other types of
> external pressures, it becomes even more egregious.

After decades people still believe that global warming is a hoax and vaccines
cause autism, despite mountains of oft-repeated evidence that this is not the
case.

~~~
slx26
Well, not the evidence they are frequently exposed to.

But of course there are many other factors. I already talked about the problem
with more technical arguments, but nowadays in many cases we have this
aversion to science and mistrust of anything that might come from it. Also,
many people are mostly exposed to what they themselves think and feel, because
when someone else tells them something they don't believe in, they don't hear
it; they just repeat their own discourse again for themselves. So that is what
they are exposed to. And in case it was a bit confusing, I didn't say that
"people _will always_ change their minds [...]"; there are a lot of factors.
But can we very easily come to doubt ourselves when information comes
predominantly from only one direction? For sure. And that direction doesn't
necessarily need to be "science" or "media". We are just bad at detecting
where the relevant information comes from for a person.

------
antidesitter
Michael Huemer has written extensively on the topic of why people are
irrational about politics [0]. I recommend this brief talk [1] which goes over
some of the key ideas. Fundamentally, it's a problem of incentives: There is
little or no incentive for people to be rational about their political
beliefs, and there is frequently an incentive to be _irrational_ about them.
In particular:

 _The theory of Rational Irrationality holds that people often
choose—rationally—to adopt irrational beliefs because the costs of rational
beliefs exceed their benefits. To understand this, one has to distinguish two
senses of the word “rational”: Instrumental rationality (or “means-end
rationality”) consists in choosing the correct means to attain one’s actual
goals, given one’s actual beliefs. This is the kind of rationality that
economists generally assume in explaining human behavior. Epistemic
rationality consists, roughly, in forming beliefs in truth-conducive
ways—accepting beliefs that are well-supported by evidence, avoiding logical
fallacies, avoiding contradictions, revising one’s beliefs in the light of new
evidence against them, and so on. This is the kind of rationality that books
on logic and critical thinking aim to instill.

The theory of Rational Irrationality holds that it is often instrumentally
rational to be epistemically irrational.

The theory of Rational Irrationality makes two main assumptions. First,
individuals have non-epistemic belief preferences (otherwise known as
“biases”). That is, there are certain things that people want to believe, for
reasons independent of the truth of those propositions or of how well-
supported they are by the evidence. Second, individuals can exercise some
control over their beliefs. Given the first assumption, there is a “cost” to
thinking rationally—namely, that one may not get to believe the things one
wants to believe. Given the second assumption (and given that individuals are
usually instrumentally rational), most people will accept this cost only if
they receive greater benefits from thinking rationally. But since individuals
receive almost none of the benefit from being epistemically rational about
political issues, we can predict that people will often choose to be
epistemically irrational about political issues._

[0]
[http://www.owl232.net/papers/irrationality.htm](http://www.owl232.net/papers/irrationality.htm)

[1]
[https://www.youtube.com/watch?v=4JYL5VUe5NQ](https://www.youtube.com/watch?v=4JYL5VUe5NQ)
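The incentive argument in the quoted passage can be sketched as a toy
expected-utility comparison. All names and numbers here are hypothetical
illustrations, not from Huemer's paper:

```python
def preferred_belief(comfort_payoff, stakes, p_decisive):
    """Pick the belief-forming strategy with the higher expected payoff.

    comfort_payoff: psychic benefit of believing what one wants to believe
    stakes:         value of the outcome actually going the right way
    p_decisive:     probability one's own belief changes that outcome
    """
    rational_value = stakes * p_decisive  # benefit of actually being right
    irrational_value = comfort_payoff     # benefit of merely feeling right
    return "rational" if rational_value > irrational_value else "irrational"

# Politics: huge stakes, but a single voter is almost never decisive.
politics = preferred_belief(comfort_payoff=10, stakes=1_000_000, p_decisive=1e-8)

# Daily life: modest stakes, but your belief directly drives your own outcome.
crossing_street = preferred_belief(comfort_payoff=10, stakes=1_000, p_decisive=1.0)
```

With these made-up numbers, epistemic rationality "wins" for crossing the
street but "loses" for political beliefs, which is exactly the asymmetry the
theory predicts.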

------
taneq
I still think they do.

~~~
enriquto
I see what you did there, you Russell.

------
lisper
I just now (30 minutes ago) changed my mind about something based on a fact,
and it wasn't even something that someone pointed out to me. I just realized
on my own that I had been wrong:

[https://news.ycombinator.com/item?id=18111870](https://news.ycombinator.com/item?id=18111870)

I wonder if this fact will change anyone's mind about the idea that facts
never change anyone's mind.

~~~
emodendroket
Why such a mindlessly literal interpretation of a title clearly not intended
to be literal?

~~~
hnbroseph
Well, it's a time-honored tradition of us geeks on the internet. It helps us
feel clever. :)

------
jokoon
People are free to believe what they want. The problem with science is that it
cannot change, so it can appear as a problem in politics; people feel science
is a prison you cannot criticize.

Also there are good reasons why science is not in control of our lives.

What's worse is that people will also speak in the name of the science of
economics to advocate about any kind of policy or tax break.

~~~
commandlinefan
> The problem with science is that it cannot change

Maybe science can't change, but our collective understanding of it changes
(hopefully gets better) almost all the time.

~~~
jokoon
I guess people don't realize how scientifically illiterate people can be.
That's the issue, I guess. If you don't understand science, you cannot trust
it.

