Why Facts Don’t Change Our Minds (2017) (newyorker.com)
144 points by BerislavLopac on Oct 1, 2018 | 158 comments



We have this weird fetish these last few years with asserting in broad language that people don't change their minds, don't respond to evidence, dig their heels in further when presented with evidence that contradicts their preconceived opinions, etc. It's all over pop psychology and the headlines.

Bugs the heck out of me, because if the language they use is literally true, then no one would ever be convinced to change their minds ever. And yet, we do.

It's true that perhaps some or many people never change their minds, or that all people might be apt to behave that way when they're not focusing their attention, but that's wildly different language than the headlines use.

This article is obviously guilty as well - "Why Facts Don't Change Our Minds". Like, ever? The content of the article doesn't back up the headline, but the headline is what people remember.

Even some of the content is probably guilty. In the Stanford capital punishment study, is it really true that each and every individual in the study responded as they describe? Because that's how the article is written.

And the problem is it gives people more excuse to give up - to not engage with someone who is wrong, or to dismiss someone who is right (that they think is wrong). Because, studies!

The real lesson is the opposite - that we have to study and learn critical thinking, and practice, as a discipline, changing our minds when the evidence or reasoning warrants it. Just because it's hard doesn't mean it's impossible, and in fact the ability to do so is part of what makes us an evolved species - or more generally, our ability to surpass our instincts and evolved traits.

Show me some experiments that demonstrate what conditions need to be in place, in order for people to change their minds after they've proven to be resistant. (Hint: psychological safety and lack of time pressure.) That's what'll be valuable. I'm tired of all these other studies that just say we don't.


Well, most of the time the "Why Facts Don't Change Our Minds" question could actually be rephrased as "Why Doesn't Everybody Agree with the Left?" I just skimmed the article, but the mention of Steve Bannon at the end is an indication that the article probably belongs to that trend. This shifts the issue onto political ground, not scientific ground. Thus, the word "fact" is used as an appeal to authority when it's actually an ideological discussion that is at stake. Some facts are carefully ignored or dismissed when they don't fit the narrator's world view (on either side), so it would be better to acknowledge that it's opinions that are being discussed, and that they are not absolutely right.


"You cannot convince a right person"

This was a realization I came to years ago, long before it was a political topic. And when I say 'right person', I don't mean right in political views, but I mean "A person that believes they are correct in their thinking on the topic".

But that's what politics seems to be these days: "convincing your side they are right," then repeating it often so they don't forget they are right. In this way you can create a person who never changes their mind.


The question has political import but there is, of course, no way to write those in without someone skimming the article finding a way to call it a hatchet-job and refusing to consider it.

Perhaps we might call this another way facts don't change our minds: we can't be bothered to consider them if interspersed with even a mild statement of opinion that differs from our own.


Without endorsing titanix's view of the article, I have to say it's bothersome that all the political examples leaned the same way.

It's an article about confirmation bias, and while most examples were neutral, they only ventured into politics when it would confirm the biases of a left-leaning readership.


> it's bothersome that all the political examples leaned the same way.

But they don’t: The article also cites the example of vaccine denialism which is much more pervasive on the political left (though they tie it back to Trump). As an aside I agree that it’s somewhat lazy that these kinds of articles always come back to politics. Then again, it’s hard not to discuss the Trump administration when talking about “post-truth”.

EDIT: apparently I was misinformed; vaccine denialism doesn't seem to follow political leaning, though it does seem to follow political extremism (on both sides of the political spectrum), see [1]. But apparently my mistake is a somewhat widespread belief ([2]).

[1] https://theconversation.com/anti-vaccination-beliefs-dont-fo...

[2] https://www.washingtonpost.com/news/energy-environment/wp/20...


>The article also cites the example of vaccine denialism which is much more pervasive on the political left

Is it? Or is that how you perceive it? Because vaccine denialism, as far as I've seen, doesn't abide by political alignment. All it really takes is a poor understanding of science and history, and living in a world where modern medicine has been so successful that vaccines are a victim of their own success. A lot of the people alive today, in the U.S., can't even comprehend diseases being a real threat, which makes vaccines seem kind of unnecessary.

Throw in some autism scares and any other BS you can manufacture, and you get a plurality of people from all sorts of beliefs and backgrounds who buy into some wonkass bullshit out of fear and ignorance. And no political party has a lock on that demographic...


I'd say it maps better to some political tendencies than others; by necessity American political parties are coalitions of a ton of tendencies with plenty of differences among them.


That stuff (and associated new-age health mumbo jumbo) has a lot more pull with hard-right guys than you'd think.


Well, the article itself doesn't state that it's more pervasive on the left and just makes the link to Trump anyway. So it still actually does lean the same way.


So you're saying they take an issue which is roundly described as wrong, vaccine denialism, state that it is much more pervasive on the political left, and then, without any irony, you claim they didn't lean left even though they blamed this issue on Trump?

I'm going out on a limb and claiming that taking a left-leaning fault in society and blaming it on Trump does not count as a right-leaning example.


They don’t blame it on Trump though. And I’m not stating that they don’t lean left (and it’s likely that they do).


Lurking beneath the surface, I'm sure, is bewilderment at the Trump phenomenon by New Yorker readers. But nevertheless, the piece isn't a partisan screed.


I read the GP as accusing the article of that old trick of making a point (that facts don't convince people) by dissertating about a different, just slightly related issue (how facts don't convince people). People do that a lot, even instinctively, as it saves both the labor of going out and gathering actual evidence and the risk of being called a liar.

Yet, after a quick dissertation on his point he goes out and does the same thing by arguing about the political views of the author...

Thus, I don't think your complaint is fair. You could complain about hypocrisy, but that's not productive. I do think he is correct in questioning the article's premises...

Anyway, after seeing all the reaction to the GP, I'm inclined to think your point is also correct, about people turning things into identity politics too.


How about "why don't facts change people's minds about global warming"?

or "why don't facts change people's minds about gun control?" (this is an issue with both the left, they are both wrong about many aspects of gun issues!)

These are just two cases that came to mind, where I have friends/family who I talk to about it, and I can explain some facts in great detail that I'm familiar with, and in that moment they will seem to change their mind a little bit. But a couple months later they are right back to their old thinking. Maybe people tend to forget the stuff they would rather not believe.


"Do humans cause climate change?" is a question of fact. "What type of gun control should we have?" is not a question of fact, so you shouldn't expect that simply stating some facts relevant to the issue would automatically change someone's opinion.


I agree, but there are facts about gun issues that make that gun control question harder to answer. People who believe strongly that guns make a country much safer. People who don't understand what assault rifles are, or that you can shoot up a school 99% as well with a `regular` rifle. People who think lack of guns will make a country MUCH safer. There is plenty of empirical evidence to refute those kinds of beliefs, but people hold on to them pretty strongly.


>People who don't understand what assault rifles are,

Or don't care for pedants who try arguing that an AR-15 isn't an assault rifle... because the issue isn't about terminology. Trying to argue about that does deflect from the real issue, though.

>or that you can shoot up a school 99% as well with a `regular` rifle

Are you trying to say a bolt action hunting rifle is 99% as effective at massacring large numbers of people compared to a magazine fed semi-automatic weapon? Since you seem to have some complaint about "assault rifles" being used colloquially, you might want to define what a regular rifle is, technically.

>people who think lack of guns will make a country MUCH safer. There is plenty of empirical evidence to refute those kinds of beliefs, but people hold on to them pretty strongly.

What you mean to say is you can tolerate the tens of thousands of gun deaths each year. And your belief is that if we take all those guns away those same number of deaths will just transfer over to stabbings or poisonings. Well while I'm not a badass or anything, I'll take my chances with a knife wielding maniac over a gun wielding maniac.

So... all those developed first world nations that have strong gun control laws still have an equivalent amount of murder and weapon assaults because everyone who would have used a gun still picks up a less dangerous weapon and is somehow just as efficient as they would be with a gun?

The biggest problem the U.S. has with gun control is that there are so many guns in the country that it will take a generation, maybe 100 years, for anything but the most draconian laws to have a tangible effect. And the whole time you'll have 2nd amendment wonks crying every day for 100 years about how gun control doesn't work because it didn't instantly fix the issue. However, doing nothing hasn't been working either.


You contradict yourself: you talk about the issue not being one of terminology, then say the opposite by treating a `regular` rifle as a bolt action rifle, when the most common retail rifle sales are of magazine fed semi-automatic weapons.

Please come back with an internally coherent argument, and then we can start from there.


Most proposed "assault-weapons bans" ban features like magazines larger than ten rounds and bump stocks.

Whether you want to call a qualifying weapon a "regular" rifle or an "assault" rifle is not pertinent because neither of those terms is well-defined.


> Most proposed "assault-weapons bans" ban features like magazines larger than ten rounds

Magazines larger than ten rounds are OEM standard on the regular rifles that are in common retail sale.

You're still failing at being internally consistent.


Perhaps the appeal of an assault rifle in these situations is the emotional charge a spree killer feels in wielding one. The marketing of the weapons - in fictional media as well as in gun magazines - is that these are tools of power, probably something that appeals to that kind of spree killer. So... maybe assault rifles being banned would be a consequence of their great marketing. (And why else would they get a bump in sales after every mass shooting?)


> people who think lack of guns will make a country MUCH safer. There is plenty of empirical evidence to refute those kinds of beliefs, but people hold on to them pretty strongly.

Well, I look at it this way. Both the left and the right (including our current administration in the US), as well as those who financially and politically support the current administration (such as the NRA), agree emphatically that gun control works, and that a lack of guns in the hands of law-abiding citizenry is safer. So when both the left and the right agree through words and policies, and it's demonstrated through safety and security, it's hard to argue otherwise. The question really then becomes one of scope.


> But a couple months later they are right back to their old thinking. Maybe people tend to forget the stuff they would rather not believe.

Or it could be that their follows/likes on social media tend to be an echo chamber and dominate the information they receive. As soon as you stop talking to them about something different than what they normally hear, it quickly fades from view as their social feeds bring the "facts" that support their current position to the front again 24/7.


> How about "why don't facts change people's minds about global warming"?

I have always thought it was either the lack of the ability to understand the facts, or the sheer unwillingness to actually look at the facts.


What about the way the information is often presented to the average person as a justification for policy change? How often does the average person encounter information about global warming devoid of an attempt to change policy in relation to the information? That alone would give a person reason to doubt the authenticity of the information, since the source is seen as biased. Add in how badly news organizations butcher the actual science, and there is plenty of room for doubt that can then be exploited by those who specifically seek to do so.


Hmmm, interesting perspective. However, forget about the denialist fringe for a moment. The majority of people do understand that climate change is bad, and most feel paralyzed into inaction by all the doom-and-gloom messaging – welp, looks like we're all screwed, now let me get back to what I was doing.

Few understand that climate change is still very much a solvable problem, and that our success crucially depends on building the political will for new policies around carbon pricing and land use — and that, in turn, hinges on creating an understanding of the scales at which such policies must and can enact change. The current mainstream discourse, on the other hand, is half fatalistic cynicism, and half turn-off-the-lights-and-reuse-your-plastic-bags blather.

Ergo, talking about policy change is important, no?


What is the most compelling website/report you know of that one who is willing to have their mind changed should read?


Generally, because "something some guy said" is not a fact, and the article makes the very same mistake three times. For example, "95% of climate scientists agree" is not a fact about global warming. It's a fact about climate scientists, and at best a rumor about climate change.

What would be a fact? The GISS raw data would come close. But James Hansen doesn't release it. No, the data has to first be "cleaned", lest people misinterpret it. That reduces a fact to "James Hansen said", and I might simply conclude that Hansen is a moron and discard the rumor.

In short, to convince someone, it isn't enough to tell them what you think; you also have to give them the means to verify those claims. You can verify that particular claim by reading E. T. Jaynes: "Probability Theory---The Logic Of Science".


I wouldn't know what to do with raw data about climate change and I'm unusually well educated. Just not in climate science. You can't expect a majority of people to educate themselves sufficiently in every relevant topic to make informed decisions. That's why we have scientists who study things for years and years and specialize in interpreting data. The rational thing to do for almost everything is to trust the majority of the specialists.


> The rational thing to do for almost everything is to trust the majority of the specialists.

Brilliant way to miss the point, which is that rational reasoning also tells you whom to trust. If you have two groups pointing fingers at each other while yelling "Those are not specialists! We are!", you gain no information about who really is. You need something else, and untampered data is something else, even if you can't analyze it yourself.


Well, only one of those two groups actually studied climate science, so my money is on them being the actual specialists.


You could just as well throw your money at any other group of advocacy scientists... say creation scientists.

Of course, I am not an actual creationist, so don't throw any money at my belief that the world is billions of years old.


>>For example, "95% of climate scientists agree" is not a fact about global warming. It's a fact about climate scientists, and at best a rumor about climate change.

If 95% of climate scientists agree that something is a fact, that is not "at best a rumor" about climate change.


> If 95% of climate scientists agree that something is a fact, that is not "at best a rumor" about climate change.

True, but if you're a big enough pedant and chase down what is actually behind the "95% of climate scientists agree" claim, you'll discover it's not actually what it's sold as. Which is a shame, because I believe there's probably more than enough genuine evidence that resorting to this sort of thing shouldn't be necessary to change people's minds. To be fair, it's worked very well on most people, but there are still plenty of others on (or on the wrong side of) the fence that I suspect could have their minds changed if the ~facts were presented in the right way.


What is the "right way" to convince these people? I'm genuinely curious about your research/experiential findings on this, because I'm at a loss.


I must admit to smirking at the unintended irony created by the title of the article, and the publication it appears in.

That said, Jonathan Haidt's book The Righteous Mind does outline a very similar thing. That our 'reasoning' is almost always a post-hoc attempt to explain how we already feel. And Haidt, while claiming neutrality, is very definitely on the right end of the spectrum.


> That said, Jonathan Haidt's book The Righteous Mind does outline a very similar thing. That our 'reasoning' is almost always a post-hoc attempt to explain how we already feel.

I think Hume beat him to this position by a bit.


And St. Augustine beat Hume to his position by more than a millennium :)

In The Righteous Mind Haidt actually quotes Hume's "Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them."

Given the above, I'm still happy with Haidt as a reference here, as he references Hume, and also has a decade or two of research to back up his claims.


> And Haidt, while claiming neutrality is very definitely on the right end of the spectrum.

Is he? I have always felt like he is at the centre, if not slightly to the left, where most social psychologists are. (BTW, I am on the left myself.)


He always considered himself a left-of-center, moderate liberal until he wrote The Righteous Mind. He doesn't necessarily identify with the right, but he disavows some of his older liberal beliefs.

https://www.nytimes.com/2012/07/29/magazine/a-liberal-learns...


I see it differently. It appears to me that large swaths of the political right have gone over to intellectual relativism, where everything is reduced to mere opinion, equally valid to all other opinions, and facts simply don't matter. See any discussion of climate change for that.

To be fair, it's happening some on the left as well, but not to the degree it's happening on the right.

Of course, that's my opinion, and I'm a leftist, so you can just dismiss me out of hand.


The left ignores facts when it suits them as well, but with different subjects. See race and IQ correlation, gender wage gap, implicit association test.


Generally those discussions follow a certain pattern. Opponents object to the conclusions drawn from the data, and the proponents accuse the opponents of not accepting the data.


Race-IQ correlation? Really?

Do tell.


Oh for god's or whomever's sake, please don't post a flamewar incantation like this on HN.

https://news.ycombinator.com/newsguidelines.html


The Minnesota Transracial Adoption study [1] is something that, for better or for worse, changed my outlook. It was a major, long-term study designed to try to prove once and for all that environmental factors were dominant in the correlation between race and IQ. The study followed the results of well-to-do white parents adopting children of black, half-black, indigenous/asian, and white ancestry. In it they controlled for location, income, family stability, age of the adopted children, etc. -- as many environmental variables as they could.

The problem is that the study ended up showing the exact opposite of the desired outcome. By age 17 the difference in IQ between an adopted black child and an adopted white child was 18 points. Interestingly, even adopted white children and the biological white children born to well-to-do parents showed a substantial 4 point difference in IQ, which could be explained by genetic differences between those who end up with the means to adopt children and those who end up in situations where they're left to give away their children for adoption.

There was even a phenomenal accidental control. A number of the adopted children originally identified as having two black parents actually had one white and one black parent. And these children fit the IQ outcomes of the other half-black half-white children, which was about 10 points above the children from two black parents. This means even though the parents thought the children came from two black parents, and the children themselves also thought they came from two black parents - their performance mapped to their genetic background and not their environmental assumption.

The authors have tried to undermine the results of their own study in two main ways. The first is by suggesting that differences could be attributable to prenatal environmental differences. This not only defies belief given the differences and extreme consistency, but if the authors actually believed this, it would have made the entire study, which went on for nearly two decades, completely pointless. The other is suggesting that skin color can create environmental differences. This is a much stronger argument against the results. However, it also suffers from the same post-facto rationalization, but even more importantly -- this hypothesis is also strongly undermined by the accidental control group of half-black children. Given their misidentification it's safe to say that these children did not have strongly indicative transracial features. And in any case they would have identified as 100% black, given that's what they thought they were.

I think that this study's results are precisely why there have been no further studies, of the same scale, performed on the topic. The results are not what we want to believe.

[1] - https://en.wikipedia.org/wiki/Minnesota_Transracial_Adoption...


Your analysis of this study is, to put it gently, not the prevailing view in academic psychology. In addition to methodological criticisms, the findings of your cited study not only haven't replicated, but have been contradicted by larger studies that show, for instance, adoption from low-SES parents to high-SES parents --- in co-sibling controlled studies --- is associated with IQ gains of up to 15 points.

It's always a little interesting to me to see the vintage of the studies "HBD" advocates introduce their ideas to HN with.


Please do link to any studies that actually designed, developed, and carried out a comparable experiment with different results. I'd be more than happy to read them! I try to base my views on evidence so much as I possibly can -- and to my knowledge, such studies do not exist. On that note please do not link to any studies where an individual decided to 'process' data and p-hack their way to a proof for their ideological biases. There are indeed a multitude of such studies, and such behavior goes a long ways towards explaining the current replication crisis plaguing psychology.

p-hacking is particularly easy to do with IQ since there's a peculiar effect in that the heritability of IQ is not fixed. When we are young our IQ is generally able to be more significantly influenced by environmental factors, yet as we age our IQ becomes ever more a product of genetics with recent studies showing the heritability to be upwards of 80%.

So for instance in this study, if you look at the results of the children at age 7, you'll note that both black and white children had measured IQs that were about 10% higher than their IQ at age 17 (around 8 and 10 points respectively). This is normal and indicative of a privileged upbringing, as all the children in this study had. Yet that privilege counterintuitively does not carry over into adulthood, which is phenomenally interesting -- but also makes it very easy to generate disingenuous arguments when it comes to this topic.


"80% heritability" (which, from the literature, appears to be a wild overestimate, overshooting even Herrnstein's estimate from _The Bell Curve_) does not mean "80% genetically determined". Heritability and genetic determination are different concepts.

Regardless, the only cite you've brought to this thread is a study from the 1970s which has not only not replicated, but has, as I've said, been contradicted by later studies, studies informed by the methodological criticisms the Minnesota study received --- some of those criticisms coming from the study's own authors.


The Left has plenty of unsupported ideas.

The question is really why people debate Physics or Chemistry. People that thought the Sun was a perfect sphere would discount things they could directly see with their eyes.


> People that thought the Sun was a perfect sphere would discount things they could directly see with their eyes.

What do you mean by this? The Sun is, to a very good approximation, a perfect sphere. Or as perfect a sphere as one is likely to find in the physical world.


It's one thing to say it's a very good approximation of a sphere. It's another thing to say it's in fact an absolutely perfect sphere.

People had firm belief in the second and discounted evidence to the contrary. Even ignoring that rotation causes a significant distortion (https://www.nasa.gov/topics/solarsystem/features/oblate_sun....) they would not accept https://eclipse2017.nso.edu/coronal-mass-ejections-cme/ as a 'real' thing.


> Or as perfect a sphere as one is likely to find in the physical world.

That's precisely the point. One may argue that the sun being a perfect sphere is a fact, but someone else may also point to the fact that the surface of the sun shows significant deviations from a perfect sphere, the fact thus being that the sun is not a sphere.

The thing is, they wouldn't be discussing facts but instead personal opinions based on observations. Thus both would be right and wrong while the facts were still the same.


The problem is that the lab isn't isolated, and so facts produced by the lab can have strong political bias. A 'wrongthink' produced by a lab would have a far stronger criticism applied to it than a 'rightthink' using the same methodology. Combined with issues like competing for funding, publish or perish, and the extreme lack of reproduction, it casts a level of doubt on any facts coming from science, especially when they have strong tie in to political areas.

And to specify, the 'rightthink' and 'wrongthink' examples I've studied in the past were cases that didn't break down by any common political stance like left or right, but were findings that either agreed or disagreed with positions held strongly by people regardless of political standing. I do think the results would apply as expected when considering a political issue that breaks down along party lines, but the reaction is much stronger when there is more uniform agreement for or against something.


It was mostly a neutral, intelligently written article until that last paragraph.

It's as if the author wrote a good article about confirmation bias, and someone jumped in at the last second and said "wait, add a line implying this is all about shitty conservatives so our readers will like it." I wonder if anyone at the New Yorker appreciates the irony.

Only those idiots who disagree with me have cognitive biases.


partisanship is a misleading mental shortcut.

rather, loss aversion (a cognitive bias) can explain a fair bit of the behavior here. people with (perceived) status or privilege don't want to lose those statuses/privileges, so seek to rationalize their position. if it means attacking the underpinnings of facts and evidence, so be it (as the bias goes). it's part of our primitive brains to do so, and requires the use of our modern brains to keep in check.

also, facts and evidence live on a continuum, where facts have low margin of error while evidence has high margin of error. the way you address a high margin of error is to gather lots of evidence, so that statistically, you narrow the margin of error to something tiny, and then you can call it a fact. for example, the sun rising tomorrow has a margin of error that's approximately zero, so we call it a fact.
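
to make that narrowing concrete, here's a minimal sketch (an illustration only, assuming independent observations with spread sigma, where the 95% margin of error shrinks like 1.96 * sigma / sqrt(n)):

    import math

    sigma = 1.0  # assumed spread of a single observation
    for n in (10, 100, 1000, 10000):
        # 95% margin of error for the mean of n observations
        moe = 1.96 * sigma / math.sqrt(n)
        print(n, round(moe, 3))
    # prints 0.62, 0.196, 0.062, 0.02 -- more evidence, smaller margin of error

so a hundred times more evidence buys you a tenth of the error, which is why a pile of consistent evidence eventually gets treated as fact.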

most reasonable people are not interested in arguing facts, especially if those calling into question those facts have no basis of expertise or credible counter-evidence.

you can have your own beliefs, but you don't get to have your own facts. moreover, it's reasonable to discount people who believe something without evidence, especially in the face of evidence to the contrary.


> could actually be rephrased as "Why Doesn't Everybody Agree with the Left?"

"the left" doesn't mean much of anything other than meaning "those i don't like".

it's a general derogatory phrase used by (for example) commentators like ben shapiro... on most episodes of his show, it's a handwave reference to indistinct millions of people that is almost always followed by chatter about how terrible they are. just the absolute worst. evil people. etc.

being one of "the right" doesn't mean you would like someone like steve bannon.


I see it often come up with regard to alternative health topics like vaccine refusal or homeopathy. I could be wrong, but my perception is that people who hold these beliefs are more likely to be left leaning.

It also comes up often in the global warming context, obviously, which goes the other way.


According to a Pew Research study [1], in regards to vaccine refusal it appears that, rather than political affiliation (left/right), it's the magnitude of political extremity that is the big indicator. Which isn't surprising, given that anti-vaccine is an extreme position, and those who are extreme in their political opinions are also extreme in their private lives.

[1] http://www.pewinternet.org/2015/01/29/public-and-scientists-...


There are anti-vaxxers on both sides, but since Trump's rise, I've noticed a big upswing in right wing anti-vaxxers.

It shouldn't be surprising. He's been pushing the vaccine autism link since 2012:

"Massive combined inoculations to small children is the cause for big increase in autism...."

"Autism rates through the roof--why doesn't the Obama administration do something about doctor-inflicted autism. We lose nothing to try."

"Healthy young child goes to doctor, gets pumped with massive shot of many vaccines, doesn't feel good and changes - AUTISM. Many such cases!"

"I am being proven right about massive vaccinations—the doctors lied. Save our children & their future."


> Why Doesn't Everybody Agree with the Left

As someone on the left, I find it bothersome that when people say "the left" they either mean neoliberals or SJWs, neither of which actually stands for leftist ideas. Neoliberals believe in a right-wing economic agenda while playing identity politics, and SJWs want to feel good about themselves from their couch, so they play identity politics.

Both of these, I have a massive issue with, but due to this caricature of the left coming from the right, I cannot honestly debate with the other side.


> could actually be rephrased as "Why Doesn't Everybody Agree with the Left?"

Well, let's have a look at the actual examples in the article: what things are people apparently failing to change their minds about? (I've also included a few things that aren't examples of that but that seem relevant.)

1. Their ability to distinguish real from fake suicide notes. No obvious political slant.

2. What makes a good firefighter. No obvious political slant.

3. The effectiveness of capital punishment. Highly politicized topic, but reported experiment and reported results both symmetrical.

4. Answers to an unspecified set of reasoning problems. Presumably no political slant.

5. The next study described is one in which people actually *did* change their minds, when asked to explain in detail things they thought they understood. Also not political.

6. Next one also isn't about changing minds. Found that a rough proxy for ignorance about Ukraine is correlated with eagerness for the US to intervene there. Definite opportunity for political slant here.

7. General discussion about mutually reinforcing ignorance. Has kinda-gratuitous anti-Trump comment at end.

8. Next one also isn't about changing minds. Found that trying to explain things in detail leads to more moderate opinions, whichever side one's on. Politically contentious questions; procedure and results seem symmetrical.

9. Alleged dangers of vaccination. Anti-vax sentiment tends to skew left, though I think less drastically than is sometimes thought. Brief, semi-gratuitous, Trump comment here, but it isn't particularly negative.

10. Whether owning a gun makes you safer. Politically contentious question.

So. Most of the stuff here is either entirely unpolitical (1,2,4,5) or concerned with political issues but balanced in both procedure and conclusions (3,8). The last section (9,10) describes a book that discusses demonstrably-false opinions held by some people who skew left (9) and some who skew right (10). That leaves 6 (which looks to me -- your opinion might differ -- like an unbiased attempt to investigate a question with political consequences) and that little swipe at the Trump administration in 7. (And, if you really find it offensive, the comment about Steve Bannon at the end.)

I don't think any reasonable person could summarize this as anything even slightly like "Why doesn't everybody agree with the left?", even though those little swipes make it clear that the author is no fan of Donald Trump. (A characteristic he shares with plenty of Republicans, so it's not exactly a clear indicator of egregious bias.)


I think antivax is an example of bipartisan stupidity as well, even if motivations differ. It seems to be an easy answer to uncomfortable questions and decisions, whether it is accepting something from an uncomfortable source (science, something "unnatural", government mandate, etc.) or avoiding the pain involved. I am aware of my personal biases, but I suspect that if the vaccine form factor was changed to be truly painless we would see antivax numbers decline. Antivax is also inspired by the sociopathic scam of "homeopathic vaccines", which were liquids to be drunk. (Especially ironic when vaccination is the one area where diluting something harmful to help with the original cause can actually work for a change.)


> I don't think any reasonable person could summarize this as anything even slightly like "Why doesn't everybody agree with the left?"

it's quite fashionable to trash talk this amorphous and indistinct group called "the left".

"conservative" political commentators in particular love using "the left" to as a generic stand in as "those undesirables" or "those fools" or "those leeches" etc.


One quick note on point 10: the question of whether owning a gun makes you safer isn't a politically contentious one -- it is a knowable fact in the realm of empiricism.

Whether individuals should be allowed to own guns, without regard to whatever risk there might be, is the politically contentious one.

But the mere question of safety isn't political (how things should be), it's observable (how things are).


The fact that something is empirically knowable is not, unfortunately, a guarantee that it isn't politically contentious.


The thing is, slim majorities of conservatives will answer phone surveys with things like "yes, Obama was born in Kenya" or "no, global temperature isn't rising".

Specious argument can come from anybody, but the "la la can't hear you" portion of the left is measurably smaller compared to the right.


Source?


There really was a scientific poll where 51% of Republican voters said they believe that Obama was born in Kenya.

https://www.politico.com/story/2011/02/51-of-gop-voters-obam...


Did you really mean to use the word "scientific" here? If so, I'm not sure what you mean for a poll to be scientific.


There's a lot of material out there on what makes a scientific poll. Basically, demographic weighting and a derived margin of error.

It's not the same thing as a web poll.
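
For the margin-of-error part, a minimal sketch (assuming a simple random sample and the usual normal approximation; real polls layer demographic weighting on top of this):

    import math

    def margin_of_error(p, n):
        """95% margin of error for a proportion p measured on n respondents."""
        return 1.96 * math.sqrt(p * (1 - p) / n)

    # a 51% result from 1,000 respondents carries roughly +/- 3.1 points
    print(round(margin_of_error(0.51, 1000), 3))  # 0.031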


A good point.

When I see the word 'fact', I remind myself that even now evolution is still only considered a theory - despite the weight of evidence on its side. Because in science something might just show up that invalidates all that came before, and maintaining that faint skepticism is what separates science from belief.

(And of course evolution is a theory, not a mere hypothesis because of this weight of evidence).

If you want to show me a fact, make sure it comes with a mathematical proof attached.


> even now evolution is still only considered a theory

No, you simply misunderstand what “theory” means in the scientific context. It’s a synonym for “explanatory model”. And “evolution” in this context stands for something close to “gradual change via random mutations and natural selection [as well as other, less important mechanisms]”. Calling it “theory” does not imply that it’s not based on observable facts, or that there’s doubt about it. It’s simply used to distinguish observable facts from an explanatory model.

There is no “progression of certainty” from hypothesis → theory → fact. As another commenter has said, evolution is as much a theory as gravity.


There's a difference between the theory of evolution and the fact of evolution borne out by the fossil record.

Using your definition, even gravity is a "theory", yet things still fall to the ground when we drop them from above.


A few years ago I really enjoyed arguing politics on the internet. Not trolling, but legitimately arguing. I ran into several people who would just link-bomb the conversation.

So one day, when I had a couple of hours free I sat down and read every single link. All of them.

The lack of substance beyond an opening paragraph that claimed a point was appalling. When it was brought to the attention of those link-bombers, they actually agreed that there was no substance and that they hadn't read those links at all. I was really proud of my friend and thought we had actually resolved a political disagreement on Facebook.

Then I saw them dropping the exact same links in another conversation within a couple of days.

Facts without context don't prove anything. Presenting some of the facts but not all of them just frames the story in a particular light.

After years of this stuff, it becomes clear that nobody has time to become an expert on every subject just to be able to identify the critical details that people are leaving out.

Eventually, you just research your positions on the subjects that matter to you and vote accordingly. Now, if something doesn't make logical or mathematical sense, I oppose it. If there's no logic or math involved, I generally take no position at all.


That's called the Gish Gallop, and for whatever reason it's the mode du jour for winning an Internet argument now.

https://rationalwiki.org/wiki/Gish_Gallop


> winning an Internet argument now.

and before! The prevalence of dumb things to link and dumb people to link them is a bit higher now but the Gish Gallop isn't new on the internet.


I am really souring on such conversations because of the tremendous number of bad-faith arguing tactics people use -- willfully misconstruing your point, racing as quickly as possible to a glib dismissal without really understanding what you said, or, like you said, link-bombing. Another failure mode I find extremely frustrating is failure to understand analogies. For instance, if I were having a discussion about whether potatoes are healthy it might go like this:

A: Of course potatoes are healthy! They're natural!

B: If everything natural were healthy it'd be healthy to eat hemlock.

A: Potatoes aren't hemlock and it's really offensive you'd make that comparison.


I think people instinctively refuse to parse analogies because it's too much work for too little reward. Practically every time somebody starts to use an analogy it becomes an analogy battle, because of course it wasn't perfect. Secondly, it's like trying to prove a point by introducing a bunch of weird mental tricks. Yeah, you might have a point there, but it's rude to expect me to parse all this extra made-up stuff you just throw at me. If you have a point, you can make it without an analogy.

I have the same opinion on fallacies: practically every single time somebody says "wait, you just used a fallacy" it's wrong, it doesn't matter, or he himself just used five fallacies by pointing that out. It's an endless battle over magic words that have so much social weight behind them that using them should be forbidden in any serious debate. That's my personal, very unpopular opinion.


If you don't want to do work understanding someone's argument then I'd suggest not participating in a debate; there is no way to present an argument that makes consuming it completely mindless. And really, it doesn't take that much work to understand "if we really believed that principle was true for ANY political philosophy then it'd be just as true for the Nazis as it is for whatever group you're supporting."

As far as logical fallacies go: I don't like the Internet style of just rattling off a bunch of logical fallacies to dismiss an argument, but I think it's fair to say "this argument is such-and-such a fallacy because of X, Y, and Z."

If we took your suggestions I don't know how anyone would have a substantive debate at all.


In most cases the problem with link bombs is that the stated conclusions don't fit the linked facts, and people can't separate the verifiable fact from their conclusion.


I tend to model people changing opinion as ships with different inertia changing course:

If you formed your opinion yesterday, it has low inertia, and it is easy to change course. If you formed your opinion a decade ago, it is high inertia, and the best I can do in a conversation is a little nudge in the right direction, hoping it will have some effect later on.
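
As a toy sketch of that model (an illustration only; "inertia" here is just a made-up damping parameter, not a measured quantity):

    def nudge(opinion, evidence, inertia):
        """Move an opinion toward new evidence; higher inertia means a smaller shift."""
        return opinion + (evidence - opinion) / (1.0 + inertia)

    # a decade-old opinion barely moves after one conversation...
    print(nudge(0.9, 0.1, inertia=50))  # ~0.884
    # ...while yesterday's opinion swings easily
    print(nudge(0.9, 0.1, inertia=1))   # 0.5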

The causes for opinion inertia are known. Long held opinions are part of people's identity (think religion, for example). People actively filter for supporting evidence, eschewing opposing views. It takes a while to breach both barriers.

If you test high inertia opinion change in one sitting, like studies do, it looks immutable.

(don't ever think you are immune to these inertia effects. Everyone is affected)


Another cause is that changing your opinion means admitting to yourself (and others) that you were wrong. And that's hard for many people to do, especially with long-held beliefs.


My observation is that when the real point of the argument is social pecking order bullshit where proving you got X fact wrong means you are evil (stupid, etc) and now deserve to be treated really terribly, people dig their heels in as the only possible defense against "agreeing" to be crapped on. If that's not what is at stake, people are vastly more willing to learn and grow.

Some social settings are better than others in that regard.


> Bugs the heck out of me, because if the language they use is literally true, then no one would ever be convinced to change their minds ever. And yet, we do.

This is the wrong conclusion to draw from the discussion. People do change their minds, but rarely all at once. What they want is to be consistent, so you have to move the needle a little bit at a time.

One technique to convince somebody of something: first you have to show you understand their position better than they do and are on their side. Then you can lead them towards the correct solution. It's called "pacing and leading".


I agree. It might be better to say, "People don't change their minds when we want them to. But people do change their minds when they come to their own realizations about the information they have. And sometimes that sucks, because it can take days, months, years, a lifetime."

For example, I'm an ex-Jehovah's Witness, and over on the r/exjw subreddit a common thing you see is a person who starts posting. They've just realized their long held sacred beliefs are nonsense, but all their friends and family are JWs (which is a problem because JW culture is such that losing your faith makes you a pariah, possibly shunned by those friends and family), so they want to know how to wake up their friends and family to the fact that JW beliefs are nonsense. But you just can't. Believers have strong faith and ignore contrary evidence and apostate arguments while under the spell of their faith. The believer has to make a lot of decisions for themselves to be able to read a bunch of facts, apply them rationally, and decide that their faith doesn't stand on its own two feet.

And depending on how or what you believe in, not just religion, but any subject, if you have pre-established or long held beliefs about it, information that runs contrary becomes suspect.

After all people will argue until they use up all the oxygen in the room over evolution (because of the perceived implications to their religious faith or other beliefs). But not a goddamn person you've ever met is going to pick the atomic weight ceiling of the heaviest possible atom as the hill they want to die on.


I agree that people do change their minds, but it is not necessarily a direct result of facts. Things like emotion, social pressure, and moral arguments are far more persuasive. People just get angry when their facts aren't enough to persuade someone. But sometimes you need to engage people at a more personal level than just science and statistics.


>But sometimes you need to engage people at a more personal level than just science and statistics...

Now I don't know if I would write this off as simple "emotion". Obviously, "emotion" is probably part of it, but so is "social pressure", etc as you point out. I'd probably summarize that collection of factors as "Ideology". Convincing people to change their minds is likely one reason that ideology is actually useful to humans, even though oftentimes it bears little resemblance to the underlying reality we live in. It allows us to understand and interpret the world in ways that are comfortable and intuitive. Being "comfortable" matters.


Traditionally ideology is more often cast as a reason people cannot understand the implications of facts staring them in the face.


It's both.

It's the reason people ignore facts. And the reason people change their minds.

Just to sharpen the point, think of it this way, ideology can make you "change your mind" and start believing things that have absolutely no basis in fact.


A good example of this is the history of medicine. Perfectly intelligent and educated people believed some really wild things about how infection occurred. It took a long time to convince people that hygiene was necessary, and that hospitals should prevent contamination between patients. Nowadays we all believe in bacteria and viruses without question, but most of us have never seen either with our own eyes. But over the years our basic ideology has changed.


Medicine is still, in fact, full of people who believe things that, 100 years from now, will likely elicit belly laughs. It's all about the ideology. If the tenets of the ideology tell you you're right, then you're right. (Or as right as you need to be, if most people are willing to adhere to that ideology.)


Agreed

"Facts don't change people's mind" forget about the presentation of the facts, usually in a condescending and over authoritarian way.

No fact is an absolute truth that shouldn't be open to discussion, facts help, but when people get "holier than thou" about them, that's usually when the pushback happens (not only on those occasions, though)


I would also add that the word "fact" is often abused to refer to strongly-held beliefs instead of actual substantiated and well-accepted information. Thus when those feeling empowered by holding facts criticise others for not changing their minds when faced with those facts, they are actually attacking others for not succumbing to their personal beliefs.


> This article is obviously guilty as well - "Why Facts Don't Change Our Minds". Like, ever? The content of the article doesn't back up the headline, but the headline is what people remember.

I think these days most readers, especially of publications like the New Yorker, are more media savvy than you give them credit for. When you read the headline, did you really think it would be an article espousing the view that nobody anywhere has ever had their point of view altered by learning something new?


>And the problem is it gives people more excuse to give up - to not engage with someone who is wrong, or to dismiss someone who is right (that they think is wrong). Because, studies!

I know someone who is the victim of a professional conspiracy theorist. This guy has figured out how to pull people in with explanations of why your typical conspiracy theory is wrong. He'll happily refute that 9/11 was an inside job in public. So his marks think "Great, rational guy! Let's learn more!"

But that's when you have to pay. All "hidden secrets" are on Patreon of course. But now his marks are literally and emotionally invested. I read once some conspiracy theorists feel like they have a special position because they know something the public doesn't.

Between that (if it's true) and the fact you've literally paid for whatever hogwash this guy is selling, you'll soon find yourself (as my friend did) explaining that every study quoted by the World Health Organization is flawed because they profit from the vilification of Monsanto. How does he know? Studies that refute the studies! Written by whom, you may ask?

In which case you'll be challenged for challenging "the true skeptics" who totally have it figured out. And the burden of proof is left to you: if you can't refute the studies that refute the studies well you must be wrong and now must accept The True Facts.

It feels very much like a conversation with my friends in cyber security who see a boogeyman behind every tree. I am willing to entertain rational fears. But when you don't like something new and reject it for an undefinable reason, or even worse because it could reveal secret squirrel knowledge only cyber may know, then don't be surprised if I too am a skeptic.


I remember reading some article with a former head of Goldman Sachs. He was saying that the best traders aren't necessarily the smartest ones. The best traders are those who don't get attached to a position. So from that example, it would seem that some people are able to change their minds in response to feedback (and do well because of it).


It stems from a misunderstanding of what it means to say that you've changed somebody's mind. Hardly anybody ever changes their mind in the moment. That's not how opinions or the stubborn humans who hold them work. People change their mind on issues over time, in response to multiple inputs.

If you expect to see evidence of changing minds literally represented in comment threads or at the end of dinner-table arguments, then you've totally failed to understand the value of those discussions. Intense discussions routinely change minds, but they rarely end with any evidence that a change has taken place.

The notion that contentious discussions on broad cultural disagreements are valueless is, I think, mistaken and the meme is, in general, profoundly damaging.


One of my favorite phrases is “a man convinced against his will is of the same opinion still” because it reminds me of the ever-useful principle that if someone has personal (and maybe even irrational) reasons to keep believing something they shouldn’t, like that they have a good chance at winning the lottery this time (no they don’t) and finally get out of the poverty that’s haunted them for decades, or that bitcoin is finally their ticket to entrepreneurial independence (probably not), or that the foreign prince really does want to send them money and all they need to do is buy iTunes gift cards and send him the numbers over the phone, no amount of facts are going to change their minds.


> This article is obviously guilty as well - "Why Facts Don't Change Our Minds". Like, ever? The content of the article doesn't back up the headline, but the headline is what people remember.

To me the obvious implication of the title is not that they never change our minds (we all know this is not true), but that they very rarely do -- at least much more rarely than one would expect.

At any rate, this is an article from a weekly magazine; I'd expect somewhat more careful reading than occurs in a newspaper to be the norm.


> if the language they use is literally true, then no one would ever be convinced to change their minds ever.

Not sure how you've made this leap. The point is not that people never change their minds, just that the way they do so is less evidence-based than one would hope. It even says later in the article:

> Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.


> antithetical to the goal of promoting sound science

That's perhaps an argument for the sciences directly built on positivism (the philosophical positivism), but that list ends quickly. Everything else is based on at the very least odds, but mostly consensus, and convincing based on those is required for anything beyond a few basic sciences.

And positivism only covers math, large parts of physics and chemistry (but not all), and ... that's mostly it. I mean, you could argue small portions of economics, biology also qualify, but only small portions.

Consensus itself, like presented in climate science and medicine, is an argument by authority. So if you convince someone of the truth of, say, global warming, you have in fact convinced them mostly of a social fact: that lots of people studying this problem seriously come to this conclusion. You have not convinced them that "because X leads to Y, here's the proof, Y leads to Z, here's the proof, this will now happen". You can't do that. These sciences don't work like that.

Things like climate science, medicine, psychology itself, ... just aren't built from rational argument, but from tying anecdotes into a larger framework and some amount of statistical inference. For all those sciences any positivist would say that

1) there are (very small) odds that the science is entirely wrong

2) the odds that significant parts or specific studies are wrong, even when executed entirely correctly and with integrity, are quite large: obviously 1 in 20 should be outright wrong at 95% confidence
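
A quick simulation of that 1-in-20 figure (a sketch, assuming each study reports a normal-approximation 95% interval around a true effect of zero):

    import random, statistics

    random.seed(0)
    trials, misses = 2000, 0
    for _ in range(trials):
        # each "study": 30 noisy observations of a true effect of 0
        sample = [random.gauss(0, 1) for _ in range(30)]
        mean = statistics.mean(sample)
        se = statistics.stdev(sample) / 30 ** 0.5
        if abs(mean) > 1.96 * se:  # the 95% interval excludes the truth
            misses += 1
    print(misses / trials)  # close to 0.05, i.e. about 1 in 20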


I can't understand what you're saying. As far as I can tell, climate science and medicine and to some extent psychology depend on genuine, evidence based reasoning. It's not pure guesswork.

Either way this is very tangential to my point.


> no one would ever be convinced to change their minds ever. And yet, we do.

"I've never seen anybody change their behavior as the result of a well-reasoned, rational argument. I have seen people change their behavior to avoid ridicule." -- Scott Adams


I've seen plenty of people change their behavior as the result of a well-reasoned, rational argument. I'm not sure what Scott Adams' problem is.


Yes. It would help to test how rational the study subjects are to begin with, why they are or aren't so, etc. And then, what can and cannot be done to increase flexibility.

That said, be careful what we wish for. There are certainly unforeseen consequences here.


All facts (those that are not lies) are true under specific circumstances/environment. Capitalism is bad. This statement is not always correct and neither is it always wrong. When unchecked, capitalism is bad. Capitalism also gave rise to SpaceX, Apple ...


Apple and SpaceX exist because of free markets. There is no reason to think they required our current flavor of capitalism to come to fruition.


I am not sure I follow the point you are making. The point I was trying to make is that there are good and bad things about capitalism. People are often always against capitalism or always pro-capitalism, but it isn't as simple as that. When you start delving into details you will find some things about capitalism are good and some aren't so great. I for one think capitalism is the best of a bad lot. I don't think much of the banking and finance systems that are part of capitalism today. As far as I know, and I am no economics professor, the free market is a component of capitalism: once you produce the goods, the market decides the price.


What do you expect from a publication that hired people like Talia Lavin?

https://www.thecut.com/2018/06/new-yorker-fact-checker-ice-t...

Both sides of the political extreme are guilty of claiming their "facts" are real and that others are ignorant for not believing them. It's the modern-day equivalent of "our god is the one true god" nonsense. But the left just has a much bigger and louder megaphone to shout at people.

It's obvious to anyone paying attention that neither the New Yorker nor any other publication cares about facts. They care about their own agenda. The New Yorker isn't an objective publication. They are an advocacy group.

Essentially it's the Ministry of Truth attacking people for wrongthink.


I say that's a good thing. It's a good thing that beliefs have inertia, and it makes sense. Think of the years of patterns of thinking and neural connections that have formed in your brain. Are you expected to read something and change that in an instant? Imagine when Darwin's theory of evolution came out, assuming it had perfect evidence behind it: would we expect everybody to instantly change their beliefs? And sometimes what we call facts is poor research; see the ongoing replication crisis in psychology, social science, nutrition, etc. The long, complicated process it takes to change a mind is justified. A world where people's minds were easily changed by facts would be a world of fads.


One of my favorite points in favor of facts, debate, and persuasion is the observation that you almost never see someone change their mind, but you can find legions of people who have changed their minds.

There are a host of reasons we don't respond to facts with "oh, I guess I'll abandon all my deeply-held beliefs immediately!" Everything from "that's an embarrassing loss of face" to "I'd like to fact-check you before I accept that, but it's rude to say I doubt you" to "brains don't work that way, it's physically impossible to discard a whole belief on demand" factors in. And I agree, this is hardly a bad thing. Taking time to change our minds is a safety feature. Not only are facts sometimes false, or falsified, or misleading, but most of us aren't great at knowing what's relevant when.

On almost any topic, there is someone who could argue me into an embarrassing defeat, or at least an awkward stalemate of "we can't both be right". This is true for most people about all things, and all people about most things. I am overwhelmingly confident that young-earth creationism is wrong, but I've seen its adherents speak; I have a lot of arguments they can't answer, but they have a lot of arguments I can't answer either. I don't know enough about the geology of Mount Ararat to explain why some arcane point about flood sedimentology is wrong, but the correct response to that is not for me to agree that the Earth is 5,000 years old. And that's for a fringe theory that I'm uncommonly well-qualified to rebut - the problem only gets worse when we move to a lay viewer looking at any serious debate.

People today have unprecedented access to facts, but that didn't give us the time or memory or training to evaluate every possible issue from the bare facts up. Everyone who's ever cast a vote is, on a great many issues, working from heuristics and expert opinions and best guesses. With only one lifetime to learn things, that's not avoidable. I'm sometimes alarmed by how harshly people will resist looking into new facts and evidence, but the idea that people should promptly respond to compelling-sounding facts by changing their minds doesn't strike me as a workable one.


After reading a bit of evolutionary psychology, I realized to my astonishment that humans have evolved to survive, not to be perfectly rational. We are rational only insofar as it's useful. I'm actually convinced that extreme rationality has evolutionary disadvantages. When you realize the universe has no meaning, does it motivate you and your species to keep breeding and taking risks? As Schopenhauer said, if you truly considered all the costs associated with having kids before having them, no rational being would ever have kids! No person thinking clearly would ever trade 10-15 minutes of feeling good for a lifetime of costs and responsibilities. Then there's tribalism and ingroup/outgroup psychology, which dictates much of what we see. These are survival mechanisms meant for living in small tribes, malfunctioning in a globally connected civilization. I would suggest Yuval Harari's recent article on how humans are hackable, and how that completely demolishes our concepts of liberal democracy, free markets, etc., which assume rational actors behaving in their self-interest.


Agree with most of what you said, but

>A world where people's minds are easily changed by facts would be a world of fads...

is an interesting statement.

Because we live in a world of fads anyway, even though people's minds are not easily changed by facts.


Let me rephrase that: it would be a world where we would become extinct quickly. I am convinced that extreme rationality has evolutionary disadvantages. So the emphasis on rationality only makes sense in a modern society when you can in some sense override your biological programming since you don't have to fight a bear and hunt etc.


So true. I read this article recently about doctors not switching to the newest techniques. [https://www.nytimes.com/2018/09/10/upshot/its-hard-for-docto...]

The interesting thing is that I think the author makes the opposite point of what they were intending. The tl;dr of the article is that a study came out saying doctors should strictly control blood sugar in the ICU. It was very slow to gain adoption. Eventually it started to take hold, but a new, more rigorous study came out and said this was a bad practice, as it was actually causing too many hypoglycemic events and killing patients. The new recommendation was to not strictly control blood sugar in the ICU. The author states that doctors are now not switching back to the strategy they had in the first place. It seems like waiting for a preponderance of evidence might always be the best path forward, especially for doctors and other situations where people's lives are at risk.


"Imagine if America created an evidence-based centrist party in the 1940s that listened to the scientists how much more advanced things would be. Mandatory frontal lobotomies for hysterical women who show sapphic tendencies. Free cigarettes for kids to smooth their breathing." https://twitter.com/getfiscal/status/1032299239160913920


> Free cigarettes for kids to smooth their breathing.

How was this “evidence-based”?


My problem with the studies described at the beginning is with this conclusion:

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted.

My understanding, from the descriptions in the article, was that it was not that the beliefs were _refuted_; it was just that the subjects were told that what they had been told originally was not true. The subjects had no reason to believe the latter assertion over the former.


It is also important to note that facts, in the true sense of the word, represent single observations of a phenomenon, while beliefs are more about models that reflect relationships between observable phenomena and a way to model and predict behavior.

Thus it's easy to understand that a single observation, particularly one which is given at face value, is not enough to scrap whole models.

To provide an example: should we scrap the whole notion of gravity accelerating all dropped objects uniformly, independent of mass, if someone observes a feather taking more time to fall than a cannonball? I mean, that's a fact. Anyone can see it and reproduce the same behavior. So, should everyone just abandon the notion of gravity uncritically because a single fact was presented?


For some reason this made me think of the problems with side effects in programming. If we can read and write willy-nilly, it’s very difficult to figure out why the data is the way it is, and we can’t undo/rewind unless we planned for it all along.

When our minds are mulling over some piece of information, whether it’s true or not, it will have side effects on other thoughts, opinions and emotions. If we’re confronted with a revised set of facts later, there is no way to rewind all those side effects.
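
To make the analogy concrete, here's a toy Python sketch (my own made-up example, with hypothetical names): once a write has rippled into derived state, reverting the original value doesn't revert everything it touched.

    # Unrestricted writes make "rewinding" a fact impossible once its
    # side effects have propagated into other state.
    beliefs = {"fact": 10}
    derived_opinions = []

    def absorb(value):
        beliefs["fact"] = value
        derived_opinions.append(value * 2)  # side effect on other "thoughts"

    absorb(10)
    absorb(42)                # misinformation comes in and gets absorbed
    beliefs["fact"] = 10      # the correction rewinds the fact itself...
    print(derived_opinions)   # ...but not its side effects: [20, 84]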



They sure change my mind. I'm just not always convinced that what people are saying are facts. At best they are the current best explanation, and at worst only opinions.


At one end of the spectrum of what constitutes a fact is Richard Feynman's definition, which was anything he'd replicated himself in a lab. He considered everything else he knew to be based on faith (in people, in organizations, in whatever made the claim, but which he held in high enough esteem to actually believe.)


Agreed. Or even worse: purposeful misreporting of research data. Example: recently disgraced "scientist" Brian Wansink.


Both this and the previous discussion show that HN readers believe facts (or information) can and do change people's minds. The way I understand the article (and the books it refers to), it suggests that social factors have a stronger effect on opinions than facts do. I find this hypothesis plausible when I look at the polarized discussions playing out at the societal level, as well as my interactions with other people individually. And maybe, rather than refuting it, the HN reaction to the article is further evidence for the mechanism the article describes?

Edit: typo


Belief is not generally fact based. Going strictly with the facts is an incredibly disciplined thing. Feynman used to play games to keep himself sharp, to avoid fooling himself because as he said, the easiest person to fool is yourself.

With people in general the belief comes first and then you backfill with supporting evidence and argument.

Also, we tend to pull our beliefs into our identity. Then a challenge to our beliefs is a challenge to our identity, and very hard to swallow. So it is difficult but important to keep from defining yourself by your beliefs.


Humans are overwhelmingly emotional creatures. We are a bunch of monkeys with only a fragment of real awareness of why we are doing X and Y.

Human brains judge another person's whole life in milliseconds (we all do it; we can't help it), but then struggle with quite simple math. That should tell you something about what we are.

My advice: don't ever forget that logic is only the smallest part. Demeanor and a strong image are much more important than they should be; simply presenting a "fact" is not enough.


We should talk about complexity and complication here.

Complex is the opposite of simple. If there are 50 equal elements communicating among themselves, we have a complex system.

This is important because a thousand ants behave totally different from an isolated ant.

You just can't study an ant and expect to understand the nest or anthill.

Psychologists try to do exactly that all the time, and it is totally wrong. They studied isolated rats in order to understand addiction to drugs, and they did not understand that rats are social animals that travel 20 miles (30 km) each day.

So they jailed the rats in a cage, because it was easier to study them that way, and extracted conclusions that were totally bogus for human beings. Those conclusions were the basis for the US war on drugs.

Here they do the same. They run a very simple experiment and extract conclusions about the whole system. All the "in Africa, when early humans..." storytelling is more fiction than science.

It is OK to produce fiction and speculation, but you should always differentiate what we really know with a high degree of confidence from what we do not.


The study Mercier and some European colleagues did together, which is mentioned in this article, seems very similar to the Choice Blindness study.

https://www.wired.com/2010/08/choice-blindness/

http://www.lucs.lu.se/wp-content/uploads/2011/01/Choice-Blin...

The latest paper from that study, which is incredibly interesting: http://www.lucs.lu.se/wp-content/uploads/2018/08/Strandberg-...


This is a tangent.

How much/long does a study like this cost/take? Is it possible/useful to scale this?

From reading into some of the replicability discussion, I've gotten curious about social science generally. What's an experiment like this trying to prove/demonstrate? The journalistic narrative likes the easy "X debunked." I imagine researchers have a more nuanced perspective.

I guess what I'm asking is: what does the larger effort look like? Do these studies eventually add up to a larger understanding of how we do form opinions, and of where facts do change our minds...

Replicability is one thing. How about generalisability?


The replication rate in social science studies is very poor. An even larger problem is that the studies themselves do not actually test what they aim to test, often because they simply cannot. For instance, do video games cause violence? The only way you could realistically test this would be to have one group somehow prevented from playing video games for their entire life, or at least up to a certain age, another group obligated to play video games in a similar fashion, and then some control group presumably free to do as they saw fit. And then follow these individuals for decades and see what happens.

That's not really possible. So instead the experiments that are made are toy experiments, but when you're not really testing what you're trying to measure, it becomes impossible to prove anything, and possible to show anything. For instance could you design a toy experiment that might indicate video games cause violence? Of course. Could you design a toy experiment that might indicate video games don't cause violence? Of course. The experiments are meaningless.

As an example of the problem, 'emotional intelligence' at work is all the rage right now. Yet the keystone study that sparked it is really quite bad. The author had people split into groups and perform a variety of tasks such as, literally, planning a shopping list. Using some method of determining who made the best shopping list, the author found that the groups with the highest average IQ did not perform best, whereas their 'emotional intelligence', as measured by a (again, literally) "reading the mind in the eyes" test, mapped better to performance. Therefore, the conclusion goes, it's not merit alone that determines performance but some emotional intelligence, at least as measured by "reading the mind in the eyes". That's just broken logic: at the bare minimum, IQ != specific task merit. The most logical way to perform that experiment would have been to create teams of those who individually performed best on a given task and have them work against those who did not score so well on merit but did well on the "reading the mind in the eyes" test. Of course she did not do this, the bare minimum to even begin approaching the question, since the result would not be what she wanted. And negative results don't get published. Yet there have now been likely hundreds of articles and spin-off studies taking that study's unjustified conclusion as a given.

So no, there is absolutely no big picture progress in the social sciences.


I think there is a way. Facebook/Google have all the data needed to perform studies like that. The problem is that the current university system is broken: you are forced to publish short success stories instead of doing actual research.


Which study are you referring to? Several are referenced.


Don't facts need to be contextualized in a larger belief framework to be useful? "The sun rises in the east" is a fact compatible with both a heliocentric and a geocentric view of the world. So that fact on its own is useless in explaining the world. In politics, your ideology will guide your interpretation. If you have a particular view on, say, immigration, you will highlight certain facts, suppress or dismiss inconvenient ones, and in general interpret all facts through your ideological lens.


If anyone would like to explore the topic a bit further, try "The Influential Mind" by Tali Sharot. It was on the FT's short list for best of 2017. It's a quick, easy read, Gladwell-esque. The difference is, she is an actual scientist who works in that field. Tho' perhaps that makes her biased? ;)

https://www.amazon.com/Influential-Mind-Reveals-Change-Other...


The article is basically saying that if we're told we're right, and then we're told we're wrong, we will still believe we're right even though we're presented with a contrary fact? The test group aren't asked their answers in response to being told a fact; it's more of a psychological game, in my opinion.


> In 1975, researchers at Stanford invited a group of undergraduates to take part in a study ... A few years later, a new set of Stanford students was recruited for a related study.

We have to wonder whether "Stanford undergraduate student" generalises to the whole population.

It doesn't require much stretching of the imagination to see that circa 1975 Stanford undergraduates, as a cohort, may score below average on objective measures of humbleness.


Does it really rely on 1975 Stanford undergrads, though? For example, I think Dan Kahan (Yale's Cultural Cognition Project at http://www.culturalcognition.net/blog/) usually says about the same thing about facts and changing minds.


The first discussion in your link talks about people's bias for wanting to remove plastic from the ocean far from the source, rather than preventing it at the source.

Surely it's obvious that preventing plastic entering the ocean at the source does nothing to remove the plastic that's already there.

Maybe I should read more of this blog, but that doesn't inspire confidence.


I think you should, if you scroll back a couple of years, there's plenty of discussion of published research on science communication, which might be more interesting to you.


Cool, thanks, I'll have a gander.


>It doesn't require much stretching of the imagination to see that circa 1975 Stanford undergraduates, as a cohort, may score below average on objective measures of humbleness.

Which doesn't matter in this test. Both groups in this study would have the presumed bias, so the differences in scores could still be attributed to the independent variable.


This is why cache invalidation is important.


But what is considered a fact? Richard Feynman only considered a thing a fact if he'd replicated the results himself. Everything else he took on faith.

When the FDA said you should limit fat intake, that was considered a fact and only nuts would disagree, but it is now considered a much trickier statement. When the FDA says the flu vaccine is safe, that is a fact and only nuts would disagree, and it is not considered a questionable statement. But it isn't hard for a nut to use the former to cast doubt on the latter.


This is very long and complicated, but I'll try to write a quick comment.

Many people know from experience how hard it is to convince anyone to change what they say, the way they behave, etc., from facts alone. I say "behave" because it's very important to realize the difference between "rationally accepting" and "caring about" a truth or fact; more on that later.

We also know that people can change their minds quite easily when they are exposed to pretty much anything for enough time, with enough repetition. Even when we know something is false, the repetition of a certain discourse can have a noticeable effect on us. When you pair that with other types of external pressures, it becomes even more egregious.

The key point is that we can't easily change how people feel while being respectful to them and trying to change them through words only. We are irrational, and we all have different priorities. Even if I accept a fact, it might not be something important to me, even if I rationally say it is, so I won't change the way I act, and it won't matter that much. Conversely, I might say that I don't accept something just because I don't feel that way, even if I have to discard and ignore (unconsciously) the facts. What "I feel" is more relevant than the facts that I "don't really (want to) understand". As tobr puts it very well in another comment, all these feelings and ideas have a long-term effect: "When our minds are mulling over some piece of information, whether it’s true or not, it will have side effects on other thoughts, opinions and emotions. If we’re confronted with a revised set of facts later, there is no way to rewind all those side effects."

I thought it was interesting to write about all this, not only because it really helps explain why facts might or might not do much to change the way people behave, but also because it helps us understand how to actually change people's minds. Words might be very effective for those who are already in a similar line of thought, but in other cases we might want to try changing the way people feel about something instead, by making them live, in first person, the contradictions in their own beliefs. (Unfortunately, for many technical issues you can't do that without becoming a teacher, and that's assuming the other person trusts you enough to let you teach them something. But then it's not surprising that people can't trust facts that depend on knowledge they don't have; it's only natural.) All this also helps us understand that even though we might accept many rational truths, we all have different priorities, so the ones we end up acting upon deserve some consideration. Sometimes you are so focused on your own causes that you don't understand how others don't share them, when it's simply that they have their own causes too, not necessarily that they don't rationally accept and understand what you are doing. And there are a billion worthy causes. For some people it might be about saving the world; others focus only on their kids. We can do a lot to better manage that collision between rationality and irrationality, between facts and feelings, because both are critical to us as human beings.


> We also know that people can change their minds quite easily when they are exposed to pretty much anything for enough time, with enough repetition. Even when we know something is false, the repetition of a certain discourse can have a noticeable effect on us. When you pair that with other types of external pressures, it becomes even more egregious.

After decades people still believe that global warming is a hoax and vaccines cause autism, despite mountains of oft-repeated evidence that this is not the case.


Well, not the evidence they are frequently exposed to.

But of course there are many other factors. I already talked about the problem with more technical arguments, but nowadays in many cases we have an aversion to science and a mistrust of anything that might come from it. Also, many people are mostly exposed to what they already think and feel, because when someone else tells them something they don't believe in, they don't hear it; they just repeat their own discourse to themselves, so that is what they are exposed to. And in case it was a bit confusing, I didn't say that "people will always change their minds [...]"; there are a lot of factors. But that we can very easily come to doubt ourselves when information comes predominantly from only one direction? For sure. And that direction doesn't necessarily need to be "science" or "media". We are just bad at detecting where the relevant information comes from for a given person.


Michael Huemer has written extensively on the topic of why people are irrational about politics [0]. I recommend this brief talk [1] which goes over some of the key ideas. Fundamentally, it's a problem of incentives: There is little or no incentive for people to be rational about their political beliefs, and there is frequently an incentive to be irrational about them. In particular:

The theory of Rational Irrationality holds that people often choose—rationally—to adopt irrational beliefs because the costs of rational beliefs exceed their benefits. To understand this, one has to distinguish two senses of the word “rational”: Instrumental rationality (or “means-end rationality”) consists in choosing the correct means to attain one’s actual goals, given one’s actual beliefs. This is the kind of rationality that economists generally assume in explaining human behavior. Epistemic rationality consists, roughly, in forming beliefs in truth-conducive ways—accepting beliefs that are well-supported by evidence, avoiding logical fallacies, avoiding contradictions, revising one’s beliefs in the light of new evidence against them, and so on. This is the kind of rationality that books on logic and critical thinking aim to instill.

The theory of Rational Irrationality holds that it is often instrumentally rational to be epistemically irrational.

The theory of Rational Irrationality makes two main assumptions. First, individuals have non-epistemic belief preferences (otherwise known as “biases”). That is, there are certain things that people want to believe, for reasons independent of the truth of those propositions or of how well-supported they are by the evidence. Second, individuals can exercise some control over their beliefs. Given the first assumption, there is a “cost” to thinking rationally—namely, that one may not get to believe the things one wants to believe. Given the second assumption (and given that individuals are usually instrumentally rational), most people will accept this cost only if they receive greater benefits from thinking rationally. But since individuals receive almost none of the benefit from being epistemically rational about political issues, we can predict that people will often choose to be epistemically irrational about political issues.

[0] http://www.owl232.net/papers/irrationality.htm

[1] https://www.youtube.com/watch?v=4JYL5VUe5NQ


I still think they do.


I see what you did there, you Russell.


I just now (30 minutes ago) changed my mind about something based on a fact, and it wasn't even something that someone pointed out to me. I just realized on my own that I had been wrong:

https://news.ycombinator.com/item?id=18111870

I wonder if this fact will change anyone's mind about the idea that facts never change anyone's mind.


Why such a mindlessly literal interpretation of a title clearly not intended to be literal?


Well, it's a time-honored tradition of us geeks on the internet. It helps us feel clever. :)


People are free to believe what they want. The problem with science is that it cannot change, so it can become a problem in politics: people feel science is a prison you cannot criticize.

Also there are good reasons why science is not in control of our lives.

What's worse is that people will also speak in the name of the science of economics to advocate for any kind of policy or tax break.


> The problem with science is that it cannot change

Maybe science can't change, but our collective understanding of it changes (hopefully gets better) almost all the time.


I guess people don't realize how scientifically illiterate people can be. That's the real issue. If you don't understand science, you cannot trust it.



