
Why Facts Don’t Change Our Minds (2017) - skm
https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds
======
im3w1l
I really dislike this narrative that facts don't change our minds and we are
all irrational.

The thing is, truth does exert a pull on our beliefs. It's a slow force. It
may take years for people to come around to it. Sometimes it even happens on a
generational scale. But we are approaching the truth. Everything in history
and everything in our daily experience tells us this. A couple of experiments
where researchers manage to fool the people in their studies does not disprove
this overall trend.

What scares me about this narrative is that people are using it to discredit
democracy. "Look how stupid people are! We have to spoon-feed them the
cherry-picked facts that lead them to the right beliefs. We have to decide
everything for them."

~~~
simonh
Whether you like it or not has no bearing on whether it's true. In fact,
that's confirmation bias right there. Presented with evidence of something you find
unpalatable you simply reject it outright purely on the basis you don't like
it. Personally I'd rather know.

The point isn't that evidence has no power, it's that it has dramatically less
power than most people think. However there are strategies for getting us out
of the personal bias quagmire, such as the scientific method and the approach
described in the article of providing an account or explanation of your
position and the reasons for it. The debating rule of first explaining your
opponent's position in your own words, but in a form they accept as being
accurate, before trying to rebut it is also hugely powerful. These do seem to
work and help lead us to better outcomes, so this is valuable and actually
useful work.

~~~
im3w1l
> Presented with evidence of something you find unpalatable you simply reject
> it outright purely on the basis you don't like it.

I presented a case for why I think it is wrong. And I presented a case for why
people have reason to push it despite it being wrong: it gives them more
power. Considering people's motivations is important, and when someone stands
to gain, we should be suspicious and go over everything extra carefully.

~~~
simonh
I suppose I was a bit antagonistic in my post, sorry, but if anything your
observation supports the article's position. Why should facts exert only a
slow, painstaking generational force on belief if people are actually
rational? Surely it should have immediate effect?

I really don't see what these researchers or journalists have to gain, beyond
what they would gain from doing any research or journalism. I'm just not
seeing any credible counter-arguments.

~~~
im3w1l
> Why should facts exert only a slow, painstaking generational force on belief
> if people are actually rational? Surely it should have immediate effect?

If you look at rationality as a binary, then people aren't rational. But
people are a little bit rational. Sufficiently rational to eventually find the
truth.

I like the parallel with machine learning. Many, many bright minds have tried
to formalize our intuitions into automatic systems. Gradually they make
progress. But it's plain to see that it's not as easy as "just incorporate the
new fact". We have systems that can deal with facts, and systems that can
learn from experience. But systems that do both, that can learn from
experience and express it in terms of facts, or use facts to guide their
exploration: that's an open problem.

As for who has something to gain: I do think journalists, and editors, and
newspaper owners have something to gain. Their role transforms from giving
people "just the facts" to manipulating people into the right beliefs. And
what are the right beliefs? That's for the journalists and their benefactors
to decide.

~~~
michaelmrose
A small minority of people are rational enough to forge new truths in the face
of conflicting and complicated information, usually in a narrowly focused
way. A much larger minority is capable of digesting and making productive use
of the work product of the former group, again within a broader but still
narrow scope.

The majority is too stupid to make up their own minds and needs to be educated
at a young age to accept the work product of prior generations' experts,
because they are just too unintelligent to evaluate it for themselves. This is
literally most people.

The fact that this is unpleasant doesn't make it untrue.

------
SuoDuanDao
I'm reminded of Tetlock and Gardner's excellent book 'Superforecasting', which
was essentially a study of people who consistently score at the top of
prediction markets. One key thing that these 'superforecasters' had in common
was that any new information caused them to update their model of the world,
but none caused them to update it very much - typical people making
predictions either didn't update their model or updated it too much in
response to new facts.

I think it makes a lot of sense, when one is trying to identify patterns in
information, that it's easy to over- or undervalue novel information. We don't
necessarily know what a new fact means, so ignoring it is one common error
while paying too much attention to it is another.
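This "update a little on each new fact" pattern looks a lot like Bayesian updating, where the weight of any single observation naturally shrinks as evidence accumulates. A toy Beta-Bernoulli sketch (my own illustration, not from the book) makes that concrete:

```python
# Toy Beta-Bernoulli belief updating: a belief is a pair of
# pseudo-counts (alpha, beta); the estimate is alpha / (alpha + beta).

def update(alpha, beta, observation):
    """Fold one yes/no observation into the pseudo-counts."""
    return (alpha + 1, beta) if observation else (alpha, beta + 1)

def estimate(alpha, beta):
    return alpha / (alpha + beta)

a, b = 1, 1                        # flat prior: estimate = 0.5
a, b = update(a, b, True)          # the very first fact moves us a lot
early_shift = estimate(a, b) - 0.5

for _ in range(98):                # accumulate 98 more confirming facts
    a, b = update(a, b, True)
before = estimate(a, b)
a, b = update(a, b, True)          # the 100th fact barely moves us
late_shift = estimate(a, b) - before
```

The first observation shifts the estimate by about 0.17; the hundredth shifts it by less than 0.001. Updating on everything, but never by much, is exactly what a well-calibrated accumulator of evidence ends up doing.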

~~~
erichocean
> _We don't necessarily know what a new fact means_

We also rarely even know if a "new fact" is actually true. So many studies
don't replicate that it makes sense to hold off on updating core beliefs
whenever "new facts" seem unlikely or in contradiction with previously known
(and reliable) facts.

SSC had a nice article (now gone) that discussed this for a scientific theory
that had literally hundreds of confirming studies done for it. All wrong. The
"new facts" were bullshit. So even with tons of studies, it's reasonable to be
skeptical in some situations.

It's also great that, eventually, science was able to figure out the "new
facts" were bullshit. Yay, science. But it also means that people aren't being
irrational when they don't immediately alter their fundamental beliefs while
the ink is still dry, especially "new facts" that seem in contradiction with
everything else we know…

------
dlkf
IMO "facts don't change our minds" is a stronger conclusion than the first two
experiments support. On my reading, the first two experiments show that:

1. if I have a uniform/undefined prior (how the fuck should I know how
risky/conservative firefighters are?)

2. and then I'm given an anchor

3. and then told the anchor is bunk

4. the anchor still affects me

But I suspect this hinges very heavily on the fact that our initial prior is
basically non-existent. By contrast, if you:

1. picked a topic where I actually have some prior belief (What country is
colder: Sweden or Germany?)

2. gave me some information ("Germany is actually colder on average than
Sweden because of a weird atmospheric thing that affects the Nordics")

3. told me that 2 was BS

I highly doubt you'd be able to replicate 4.
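The prior-strength point can be made concrete with a small Bayesian sketch (my own toy numbers, not from the experiments): the same few pieces of "evidence" drag a flat prior a long way but barely budge a firm one.

```python
# Posterior mean of a Beta-Binomial model: prior pseudo-counts
# (alpha, beta) combined with observed confirmations/refutations.
def posterior_mean(alpha, beta, confirmations, refutations):
    return (alpha + confirmations) / (alpha + beta + confirmations + refutations)

# Essentially no prior (flat Beta(1,1)), as with the firefighter
# question: three confirming data points move the estimate 0.50 -> 0.80.
anchored = posterior_mean(1, 1, 3, 0)

# A firm prior (Beta(50,50)), as with Sweden vs Germany: the same
# three data points move the estimate only from 0.50 to about 0.51.
barely_moved = posterior_mean(50, 50, 3, 0)
```

On this reading, the anchor "sticks" in the firefighter study precisely because participants start with nothing for it to compete against.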

~~~
eagsalazar2
Specific facts are orthogonal to the actual underlying positions held, which
are presented outwardly as other positions for the sake of political cover;
hence the illusion of facts not changing minds. What's needed is an
understanding of the actual underlying, usually hidden, positions, and then
presenting facts to disrupt _those_ positions.

~~~
dlkf
This is unclear to me. Can you explain what you mean in the context of my
alternative example?

~~~
NoSorryCannot
Not OP, but

You might not know anything about risk-taking behavior of firefighters but you
may already have some vague belief like "firefighters are heroes" that
obliquely colors your impression of their behavior.

Or alternatively you might hold onto the info that Germany is colder because
your underlying belief is more like, you don't like cold and you don't like
Germany, so you'd like to also believe that Germany is colder than other
places.

This entails two things. First, your apparent position on firefighter behavior
or the weather in Germany can change depending on what, in the context of the
conversation, is being construed as good or bad. Second, trying to inform you
with specific facts on these issues is unlikely to change your mind because
the drivers of your positions are your more general beliefs about firefighters
and Germany.

In politics, I think partisanship often degenerates in this way. Arguing the
issues is often just a facade for arguing for your party's position or arguing
against an opposing party's position, regardless of merit. Facts won't work
here to change minds.

~~~
dlkf
You're just making the same claim as the authors of the study, but adding a
proposed mechanism. But just as they don't have evidence, neither do you. You
have to actually do the study to prove it. If you believe the experiment would
work out this way, it just means you and I have different priors on the
matter.

------
olah_1
I saw that Yuri Bezmenov interview[1] ages ago and didn't really think of it
until now, when crime statistics are openly denied almost as if crime doesn't
really exist at all.

Then I thought back to that Bezmenov interview with what he said about
"demoralization". When a population is demoralized, they cannot discern true
information when it is staring them in the face.

I think ignoring facts has less to do with some kind of esoteric psychological
process and more to do with raising multiple generations to believe that
they've been lied to and that the whole "system" is evil.

[1]:
[https://www.youtube.com/watch?v=wYaR7mWxuf8](https://www.youtube.com/watch?v=wYaR7mWxuf8)

~~~
mistermann
The general public _has_ been lied to, to a significant degree. People who
have full trust in entities that have published, and continue to publish,
untruths seem more irrational to me than _supposedly_ irrational skeptics, etc.
(It is unknowable what the aggregate rationality of a given group is, but good
luck finding anyone rational enough to realize that.)

~~~
olah_1
This comment is too vague and general for me to interact with. I have a notion
that I disagree with what you’re getting at here, but I can’t be sure.

I agree that the public is lied to. But that is usually through
editorialization of headline news that omits or emphasizes convenient
information for the sake of a narrative. What I’m talking about is being
presented with raw information and considering it.

~~~
mistermann
> I agree that the public is lied to. But that is usually through
> editorialization of headline news that omits or emphasizes convenient
> information for the sake of a narrative.

How would you have any way of knowing this? And I mean that as a serious
question, not as snark.

------
trabant00
I see quite a big problem with those studies: the facts were made up, and the
truth was contingent, not necessary.

So why was it expected of the participants to change their minds? Nothing they
could verify disproved their initial position.

For me all this proves is what I already knew: "garbage in, garbage out".

edit: as the comment below pointed out, this might not be a problem with the
studies but with how the article tries to use them to prove its point.

~~~
simonh
For example, in the study where the participant's own answer was disguised as
that of another person, we can't discount the result so easily. That's also
true of the studies where participants downgraded their confidence when asked
to give an account of it.

On the invented studies, bear in mind that the point wasn't to measure
changing the participant's mind, only for them to rate the value of a study
that either supported or contradicted their initial position. Their only basis
for evaluating the value of either study was their own pre-existing bias, so
objectively they had no reason to evaluate them differently.

That's quite different from expecting them to change their minds, as the
reasons for them holding their position might not even have been addressed by
the study. For example, someone who disagrees with capital punishment on moral
grounds may not care whether it is an effective deterrent, and so may not have
any reason to doubt a study finding that it is.

------
TopHand
What politicians know that the authors of this study don't seem to realize is
that if we are told the same story repeatedly for long enough, no matter how
absurd, we'll start believing it. If you throw in some scary outcome if we
don't believe the story, we'll come around sooner. It seems that fear will
cause us to re-examine our beliefs and values.

~~~
Majromax
That's the illusory truth effect
([https://en.wikipedia.org/wiki/Illusory_truth_effect](https://en.wikipedia.org/wiki/Illusory_truth_effect)),
which is a well-documented cognitive bias.

------
RoutinePlayer
According to 19th century German philosopher Arthur Schopenhauer, “All truth
passes through three stages: First, it is ridiculed. Second, it is violently
opposed. Third, it is accepted as self-evident.”

~~~
rafaelvasco
Oh, that's a fact right there. Take any piece of evidence or information: if
it has been ridiculed or violently opposed for ages, but no one has forgotten
it, then it's probably true or at least partially true.

~~~
danaris
Yep, this "round earth" nonsense that has had people ridiculing good honest
flatworlders for generations will die out any minute!

Aaaaany minute....

~~~
rafaelvasco
Ok you found a negative to my law. That was too easy. Now look with more
attention and tell me a positive. You can do it!

------
jstanley
Related: Epistemic Learned Helplessness:
[https://web.archive.org/web/20180406150429/https://squid314....](https://web.archive.org/web/20180406150429/https://squid314.livejournal.com/350090.html)

------
RcouF1uZ4gsC
First of all, have all these psychological studies been replicated?

Part of the reason "facts" don't change our minds is that a lot of "facts"
aren't really facts like those of physics, but are rather the result of
statistical games.

Finally, and I think this is the biggest issue: a lot of facts rely on trust,
since they are practically impossible for the average person to fully verify.
And I think, for a variety of reasons, that trust has been lost. Think about
vaccines. Back in the 1950s, say, you probably knew or had heard of someone
who died from polio. Your mom might have had a sibling who died from one of
the other vaccine-preventable illnesses. The doctor recommending the vaccines
was seen as a trusted friend. He (it was usually a he back then) probably
spent his whole life in your town. He knew your grandparents. Maybe he
delivered your parents. He would spend hours at the bedside of a sick child or
a dying grandparent. Maybe he was the one who delivered your children as well.
So now, when he says that he recommends you give your child this vaccine, you
are going to listen.

Now fast-forward to modern times. You book your appointment. You go to the
office, where you wait for hours. The pediatrician comes in and rushes through
a 15-minute visit. Says your kid should get vaccinated. On the way home you
listen
to an investigative report of how doctors are paid by big pharma to prescribe
drugs. By the way, you have never heard of anyone you know getting one of
these vaccine preventable illnesses.

Now the gap between the educated elites and regular people in this country is
widening. They don't interact much socially. They don't even live in the same
places. In the United States, the non-college-educated have seen a steady
decline in their real wages and well-being. Of course they are going to
distrust "facts" put out by an elite who are seen as out of touch.

I say this as someone who totally believes in vaccines and who has persuaded
many of my friends to have their children vaccinated. The growing gap between
the rich and poor in this country is at the root of many issues.

~~~
treeman79
Trust is a big part of it.

Facts are closely related to statistics, and statistics can be both true and a
complete lie at the same time.

Abusive people will often use "facts" to control victims. You learn to be very
mistrustful after a while.

~~~
zzzcpan
"Facts" is a loaded term here. Facts cannot rely on trust; those are called
authoritative opinions, not facts. And opinions about vaccination are still
just opinions, not facts. If you say, for example, that it's hazardous not to
vaccinate, as the article does, that's not a fact, it's a judgement, and
advertising judgements is the essence of propaganda. It's basically the
opposite of a fact, a dystopian use of the word.

An actually decent factual picture of vaccines is complicated. It's about
balancing many big and small risks: catching the virus while you live your
life, catching something at the clinic while getting vaccinated, having
complications from vaccination, being subjected to unnecessary treatments and
drugs (which may also cause complications) because doctors want to profit from
you, or just being able to afford vaccination, and so on. Not to mention all
the unknown unknowns and not knowing how to evaluate the risks involved. Even
a poor but still factual picture would at least not advertise any judgement;
it would present the reasoning for everyone to draw their own conclusions.

~~~
ghthor
Thank you, this is a well-balanced comment, and it's highly valuable to have
this type of viewpoint in our world.

------
abetusk
Interesting read. They're basically proposing that our anti-rational behavior
came about as a type of 'hyper-socialization'. I can believe it, and, if true,
it would point to why things like shifting the Overton window [1] and other
mass shifts in public perception change individual perception.

I don't think it's the only way to change people's minds, and I hesitate to
dive into "just employ emotional reasoning", as that seems dangerous.

From personal experience, another effective way to change people's minds is to
give them "skin in the game".

I've tried, over the years, to convince friends of the solution to the Monty
Hall [2] problem. After explaining the solution and finding they either don't
believe it or don't understand it, I then play the game with them with 100
doors, revealing 98 after the first pick. Once this game is played a couple of
times, they understand the solution much more readily.

My take on this is that they suddenly have a personal stake in the game, even
if it's weak. There's a personal cost, in the form of social shame or loss
aversion, even in a game played between friends with no money involved, that
gives them a stake. Once they start wanting to actively avoid losing, they're
much more willing to listen to reason.

The article points out that our anti-rational behavior is at odds with
survival, but I would bet there's a level of abstraction below which our
survival-minded rationality kicks in, and above which we don't have enough of
a stake in the answer to use our rationality to good effect.

[1]
[https://en.wikipedia.org/wiki/Overton_window](https://en.wikipedia.org/wiki/Overton_window)

[2]
[https://en.wikipedia.org/wiki/Monty_Hall_problem](https://en.wikipedia.org/wiki/Monty_Hall_problem)
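The 100-door variant described above is easy to check with a quick simulation (a sketch of the game; the door and trial counts are my own choices):

```python
import random

def play(n_doors=100, switch=True, trials=10_000):
    """Simulate the Monty Hall game: the host opens every door except
    the player's pick and one other, never revealing the car."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        if switch:
            # If the first pick was a goat (probability (n-1)/n), the
            # single remaining closed door must hide the car.
            wins += pick != car
        else:
            wins += pick == car
    return wins / trials
```

With 100 doors, switching wins about 99% of the time and staying about 1%, so a couple of rounds make the asymmetry hard to deny.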

------
082349872349872
An ancient (albeit trivial) argument for facts not changing minds is that
rhetoric was a distinct discipline from logic.

------
pier25
> _strong feelings about issues do not emerge from deep understanding_

I've thought about this too on my own strong feelings. The more I know about
something, the more I understand its nuances, pros and cons, etc, the less I
feel strongly about it. Now when I spot myself with a strong feeling about
something I try to remind myself that I'm most likely missing something.

We see this constantly in the dev world. Younger devs feel very strongly about
languages, libraries, frameworks, etc., probably because they have a shallower
understanding of them.

------
Isamu
It takes constant training and energy to follow where the facts lead you.
Feynman used different approaches to keep himself focused on the facts and not
exclusively on what he "knew" was true. He said the easiest person to fool is
yourself.

Mostly people want to validate their intuition and gut feelings and don’t want
to experience the discomfort of finding out that their intuition is not
magically correct.

------
dang
Why they didn't in 2018:
[https://news.ycombinator.com/item?id=18099488](https://news.ycombinator.com/item?id=18099488)

Why they didn't at the time:
[https://news.ycombinator.com/item?id=13810764](https://news.ycombinator.com/item?id=13810764)

------
iconjack
The fundamental problem is that our beliefs become part of our identity, and
thus most of the time we're not actually seeking the "truth". This is
obviously true when it comes to religion, and almost as bad when it comes to
politics. And these days, a lot of "science" has become hyper-political: race,
climate, gender, evolution. Forget changing anyone's mind on those topics, no
matter what facts you have in your arsenal.

------
mD5pPxMcS6fVWKE
Truth is only important to us as long as it contributes positively to our
well-being. This sort of mushroom is edible and that one is poisonous -
everyone would agree on that. As far as more abstract truths are concerned:
people believed for centuries that the Earth is flat. Many still do. If you
said otherwise, society would probably burn you for heresy, so the cost of
truth was hugely negative.

------
btmoney06
Were the New Yorker honest, they'd have titled this "Why the Uneducated Don't
Understand That You're Right." Which is a shame. This type of information
should be used to better the reader by asking them to understand their own
blind spots--not to indulge the reader by telling them that their adversary is
ignorant and irrational.

------
SmokeyHamster
Slightly misleading headline. The study tested how much a lie persists in
someone's mind even after they're told the truth.

The study found that facts do indeed change people's minds, just not as much
as we'd like, because the initial impression sets expectations. Cialdini talks
about this in some of his books on persuasion.

------
bigpumpkin
The Stanford experiment forgot to account for the fact that the students
could've used the fake score they first received as a useful prior on how
difficult the task was. It does not show that "Facts Don't Change Our Minds".

------
gadders
The New Yorker can't help itself, can it? Reasonably fair article, but then
suddenly veers into:

"When I talk to Tom and he decides he agrees with me, his opinion is also
baseless, but now that the three of us concur we feel that much more smug
about our views. If we all now dismiss as unconvincing any information that
contradicts our opinion, you get, well, the Trump Administration."

And:

"(They can now count on their side—sort of—Donald Trump, who has said that,
although he and his wife had their son, Barron, vaccinated, they refused to do
so on the timetable recommended by pediatricians.)"

The thing with studies like this is that they're used by people on the losing
side of elections to start complaining about "low information voters", with
the subtext being "If only everyone was as clever as me and all my friends who
think the same, then [thing I disagree with] would never win elections."
Ironically, this also lets them avoid any introspection as to whether they
lose because there are defects in their policy positions.

~~~
Barrin92
> it's used by people on the losing side of elections to start complaining
> about "low information voters" with the subtext being "If only everyone was
> as clever as me and all my friends that think the same then [thing I
> disagree with] would never win elections."

It's pretty well backed up by evidence (and, honestly, by attending a Trump
rally) that the average Trump voter is less educated, much more prone to
misinformation, and simply holds a ton of trivially wrong beliefs about the
state of the world.

That's without making a value judgement about the voters or saying they
shouldn't have their vote, which of course they should, since there's no such
requirement for voting in a democracy. But it seems silly to pretend that such
a thing as an uninformed group of voters does not exist, or even that it
cannot exist because that would be offensive.

Autocrats and corrupt leaders have banked on them throughout all of history,
and measured, intelligent, truthful discourse is not always found in the
majority. If we're concerned with truth, then "they keep losing elections"
and other might-makes-right style arguments hold no value; in fact they're
quite dangerous.

~~~
war1025
> It's pretty well backed up by evidence (and, honestly, by attending a Trump
> rally) that the average Trump voter is less educated, much more prone to
> misinformation, and simply holds a ton of trivially wrong beliefs about the
> state of the world.

This is just as true of your "average" Democrat. The "average" person is
woefully misinformed about most things. It's probably safe to say that nearly
everyone, myself and the majority of the HN crowd included, is misinformed
about many things that aren't critical to our day to day life.

~~~
Barrin92
It's not as true, and there have actually been studies on voter behaviour in
2016: belief in 'fake news' (as in literally made-up stuff) was a strong
predictor of defection from the Democratic to the Republican ticket, and
there's solid psychological evidence for why this affects conservatives in
particular. [1]

It's also very easy to see if you eyeball the size of the market for
misinformation. While there are some highly partisan left-wing media in the
US, and there were some Facebook pages targeting, say, Bernie voters, they
paled next to the market for the Trump base, by an order of magnitude or so in
revenue. Which I think is also very obvious if one looks at the size of the
audiences of the YouTube channels attracting those viewers, or of people like
Alex Jones.

NPR in 2016 actually did an interview with one such 'entrepreneur', who tried
to sell fake news to virtually everyone but had very little success with
liberal audiences. [2]

[1] [https://www.theatlantic.com/science/archive/2017/02/why-fake-news-targeted-trump-supporters/515433/](https://www.theatlantic.com/science/archive/2017/02/why-fake-news-targeted-trump-supporters/515433/)

[2] [https://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-finds-the-head-of-a-covert-fake-news-operation-in-the-suburbs](https://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-finds-the-head-of-a-covert-fake-news-operation-in-the-suburbs)

~~~
war1025
For one, the two articles you linked are from liberal media sources. Of course
they are going to find fault with conservatives.

More importantly, just because conservatives are more likely to believe a
certain form of fake news doesn't mean liberals are immune to being misled.
All it means is that conservatives are motivated by different things than
liberals, and will therefore latch on to a particular flavor of things that
confirm their beliefs. Liberals love confirmation bias just as much as anyone.

Find any random person on the street and ask them to explain why they hold the
views they do. You'll quickly find that opinions are based on emotion and
backfilled later with plausible explanations.

------
thisrod
"Knowledge advances one funeral at a time" \- old physics saying.

------
rbecker
"Why some facts, on some topics, don't change our minds as much as they maybe
should" would better reflect the article content.

------
troughway
Jordan Peterson studied/covered this as well; here's a short clip -
[https://www.youtube.com/watch?v=sWbj-2DRLps](https://www.youtube.com/watch?v=sWbj-2DRLps)

~~~
war1025
It's downvoted and greyed out because of a general hate of Jordan Peterson
(which I've found is really independent of political affiliation), but it's a
good clip in my opinion.

------
dutch3000
i very much enjoyed the article, but i do prefer apolitical content when
possible. unsure why it was necessary to reference trump in the vaccine
portion. people (authors included) that can’t control themselves from
injecting politics where it doesn’t naturally belong are becoming more and
more irritating imo.

~~~
squarefoot
> unsure why it was necessary to reference trump in the vaccine portion.

It wasn't necessary; however, it gave the authors the opportunity to test in
just one line whether the summary was true, and I guess it worked.

I also don't want politics injected into scientific topics, but the role of
politicians is to rule for the people's good, and to speak with extreme
caution and responsibility because of the trust people place in them. When a
high-profile politician says "this is good", a lot of people will follow the
advice blindly. So when a politician puts people's lives at risk by claiming,
for example, that hydroxychloroquine works as a cure for the coronavirus (to
date, at least one person dead and one poisoned after following that advice),
it's politics actually harming lives with dangerous information, which makes
it everyone's duty to inject common sense back into the debate. If only
because scientists don't have the same exposure, and it becomes hard or even
impossible for them to undo the damage done by clueless politicians who talk
about things they don't know squat about.

BTW. I would have the same exact opinion even in the case it was Obama or
Clinton doing what Trump did.

~~~
dutch3000
i’ve purposely attempted to fully disconnect from all politically related news
and i’ve begun to notice oddities in conversation patterns mostly. the pattern
is mainly people injecting political comments in completely nonrelated topics
being discussed. the trump references were not educational, yet a sign of the
author’s inability to control himself. it’s just a complete turn off for me.
people can easily make the connections in the article to today’s reality, it
doesn’t need to be explicitly referenced.

------
MaxBarraclough
(2017)

~~~
dang
Added. Thanks!

------
baxtr
Is that actually a fact?

~~~
willvarfar
I don't mean to be paranoid, but do you think the article might be
misinformation in some study of the collective gullibility of the HN crowd?

~~~
danaris
Well, given that it was published in the New Yorker 3 years ago, it clearly
wasn't written primarily for us.

Given that it is discussing the results of actual psychological studies (that
I have seen talked about in a number of other places), it is vanishingly
unlikely that it is in some way intended to study anyone's gullibility.

