Bugs the heck out of me, because if the language they use is literally true, then no one would ever be convinced to change their minds ever. And yet, we do.
It's true that perhaps some or many people never change their minds, or that all people might be apt to behave that way when they're not focusing their attention, but that's wildly different language than the headlines use.
This article is obviously guilty as well - "Why Facts Don't Change Our Minds". Like, ever? The content of the article doesn't back up the headline, but the headline is what people remember.
Even some of the content is probably guilty. In the Stanford capital punishment study, is it really true that each and every individual in the study responded as they describe? Because that's how the article is written.
And the problem is it gives people more excuse to give up - to not engage with someone who is wrong, or to dismiss someone who is right (that they think is wrong). Because, studies!
The real lesson is the opposite - that we have to study and learn critical thinking, and practice, as a discipline, changing our minds when the evidence or reasoning warrants it. Just because it's hard doesn't mean it's impossible, and in fact the ability to do so is part of what makes us an evolved species - or more generally, our ability to surpass our instincts and evolved traits.
Show me some experiments that demonstrate what conditions need to be in place, in order for people to change their minds after they've proven to be resistant. (Hint: psychological safety and lack of time pressure.) That's what'll be valuable. I'm tired of all these other studies that just say we don't.
This was a realization I came to years ago, long before it was a political topic. And when I say 'right person', I don't mean right in political views, but I mean "A person that believes they are correct in their thinking on the topic".
But that's what politics seems to be these days: "convincing your side they are right", then repeating it often so they don't forget they are right. In this way you can create a person who never changes their mind.
Perhaps we might call this another way facts don't change our minds: we can't be bothered to consider them if interspersed with even a mild statement of opinion that differs from our own.
It's an article about confirmation bias, and while most examples were neutral, they only ventured into politics when it would confirm the biases of a left-leaning readership.
But they don’t: The article also cites the example of vaccine denialism which is much more pervasive on the political left (though they tie it back to Trump). As an aside I agree that it’s somewhat lazy that these kinds of articles always come back to politics. Then again, it’s hard not to discuss the Trump administration when talking about “post-truth”.
EDIT: apparently I was misinformed, vaccine denialism doesn’t seem to follow political leaning, though it does seem to follow political extremism (on both sides of the political spectrum), see . But apparently my mistake is a somewhat widespread belief ().
Is it? Or is that how you perceive it? Because vaccine denialism, as far as I've seen, doesn't abide by political alignment. All it really takes is a poor understanding of science and history, plus living in a world where modern medicine has been so successful that vaccines are a victim of their own success. A world where a lot of the people alive today, in the U.S., can't even comprehend diseases being a real threat, which makes vaccines seem kind of unnecessary.
Throw in some autism scares and any other BS you can manufacture and a plurality of people from all sorts of beliefs and backgrounds who buy into some wonkass bullshit out of fear and ignorance. And no political party has a lock on that demographic...
I'm going out on a limb and claiming that taking a left-leaning fault in society and blaming it on Trump does not count as a right-leaning example.
Yet, after a quick dissertation on his point he goes out and does the same thing by arguing about the political views of the author...
Thus, I don't think your complaint is fair. You could complain about hypocrisy, but that's not productive. I do think he is correct in questioning the article premises...
Anyway, after seeing all the reaction to the GP, I'm inclined to think your point is also correct, about people turning things into identity politics too.
or "why don't facts change people's minds about gun control?" (this is an issue with both the left and the right; they are both wrong about many aspects of gun issues!)
These are just two cases that came to mind, where I have friends/family who I talk to about it, and I can explain some facts in great detail that I'm familiar with, and in that moment they will seem to change their mind a little bit. But a couple months later they are right back to their old thinking. Maybe people tend to forget the stuff they would rather not believe.
Or don't care for pedants who try arguing that an AR-15 isn't an assault rifle... because the issue isn't about terminology. Although trying to argue about that does deflect from the real issue.
>or that you can shoot up a school 99% as well with a `regular` rifle
Are you trying to say a bolt action hunting rifle is 99% as effective at massacring large numbers of people compared to a magazine fed semi-automatic weapon? Since you seem to have some complaint about "assault rifles" being used colloquially you might want to define what is a regular rifle, technically.
>people who think lack of guns will make a country MUCH safer. There is plenty of empirical evidence to refute those kinds of beliefs but people hold on to them pretty strongly.
What you mean to say is you can tolerate the tens of thousands of gun deaths each year. And your belief is that if we take all those guns away those same number of deaths will just transfer over to stabbings or poisonings. Well while I'm not a badass or anything, I'll take my chances with a knife wielding maniac over a gun wielding maniac.
So... all those developed first world nations that have strong gun control laws still have an equivalent amount of murder and weapon assaults because everyone who would have used a gun still picks up a less dangerous weapon and is somehow just as efficient as they would be with a gun?
The biggest problem the U.S. has with gun control is that there are so many guns in the country that it will take a generation, maybe 100 years, for anything but the most draconian laws to have a tangible effect. And the whole time you'll have 2nd amendment wonks crying every day for 100 years about how gun control doesn't work because it didn't instantly fix the issue. However, doing nothing hasn't been working either.
Please come back with an internally coherent argument, and then we can start from there.
Whether you want to call a qualifying weapon a "regular" rifle or an "assault" rifle is not pertinent because neither of those terms is well-defined.
Magazines larger than ten rounds are OEM standard on the regular rifles that are in common retail sale.
You're still failing at being internally consistent.
Well, I look at it this way. Both the left and the right (including our current administration in the US), as well as those who financially and politically support the current administration (such as the NRA), agree emphatically that gun control works, and that a lack of guns in the hands of law-abiding citizenry is safer. So when both the left and the right agree through words and policies, and it's demonstrated through safety and security, it's hard to argue otherwise. The question really then becomes one of scope.
Or it could be that their follows/likes on social media tend to be an echo chamber and dominate the information they receive. As soon as you stop talking to them about something different from what they normally hear, it quickly fades from view as their social feeds bring the "facts" that support their current position to the front again 24/7.
I have always thought it was either the lack of ability to understand the facts, or the sheer unwillingness to actually look at the facts.
Few understand that climate change is still very much a solvable problem, and that our success crucially depends on building the political will for new policies around carbon pricing and land use — and that, in turn, hinges on creating an understanding of the scales at which such policies must and can enact change. The current mainstream discourse, on the other hand, is half fatalistic cynicism, and half turn-off-the-lights-and-reuse-your-plastic-bags blather.
Ergo, talking about policy change is important, no?
What would be a fact? The GISS raw data would come close. But James Hansen doesn't release it. No, the data has to first be "cleaned", lest people misinterpret it. That reduces a fact to "James Hansen said", and I might simply conclude that Hansen is a moron and discard the rumor.
In short, to convince someone, it isn't enough to tell them what you think, you also have to give them the means to verify those claims. You can verify that particular claim by reading E. T. Jaynes: "Probability Theory---The Logic Of Science".
Brilliant way to miss the point, which is that rational reasoning also tells you whom to trust. If you have two groups pointing fingers at each other while yelling "Those are not specialists! We are!", you gain no information about who really is. You need something else, and untampered data is something else, even if you can't analyze it yourself.
Of course, I am not an actual creationist, so dont throw any money at my belief that the world is billions of years old.
If 95% of climate scientists agree that something is a fact, that is not "at best a rumor" about climate change.
True, but if you're a big enough pedant and chase down what is actually behind the "95% of climate scientists agree" claim, you'll discover it's not actually what it's sold as, which is a shame because I believe there's probably more than enough genuine evidence that resorting to this sort of thing is unnecessary to change people's minds. To be fair, it's worked very well on most people, but there are still plenty of others on (or on the wrong side of) the fence that I suspect could have their minds changed if the ~facts were presented in the right way.
That said, Jonathan Haidt's book The Righteous Mind does outline a very similar thing: that our 'reasoning' is almost always a post-hoc attempt to explain how we already feel. And Haidt, while claiming neutrality, is very definitely on the right end of the spectrum.
I think Hume beat him to this position by a bit.
In The Righteous Mind Haidt actually quotes Hume's "Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them."
Given the above, I'm still happy with Haidt as a reference here, as he references Hume, and also has a decade or two of research to back up his claims.
Is he? I have always felt like he is at the centre, if not slightly to the left, where most social psychologists are. (BTW, I am on the left myself.)
To be fair, it's happening some on the left as well, but not to the degree it's happening on the right.
Of course, that's my opinion, and I'm a leftist, so you can just dismiss me out of hand.
The problem is that the study ended up showing the exact opposite of the desired outcome. By age 17 the difference in IQ between an adopted black child and an adopted white child was 18 points. Interestingly, even adopted white children and the biological white children born to well-to-do parents showed a substantial 4-point difference in IQ, which could be explained by genetic differences between those who end up with the means to adopt children and those who end up in situations where they're left to give their children up for adoption.
There was even a phenomenal accidental control. A number of the adopted children originally identified as having two black parents actually had one white and one black parent. And these children fit the IQ outcomes of the other half-black half-white children, which was about 10 points above the children from two black parents. This means even though the parents thought the children came from two black parents, and the children themselves also thought they came from two black parents - their performance mapped to their genetic background and not their environmental assumption.
The authors have tried to undermine the results of their own study in two main ways. The first is by suggesting that the differences could be attributable to prenatal environmental differences. This not only defies belief given the size and extreme consistency of the differences, but if the authors actually believed this, it would have made the entire study, which went on for nearly two decades, completely pointless. The other is suggesting that skin color can create environmental differences. This is a much stronger argument against the results. However it also suffers from the same post-facto rationalization, but even more importantly -- this hypothesis is also strongly undermined by the accidental control group of half-black children. Given their misidentification it's safe to say that these children did not have strongly indicative transracial features. And in any case they would have identified as 100% black, given that's what they thought they were.
I think that this study's results are precisely why there have been no further studies, of the same scale, performed on the topic. The results are not what we want to believe.
 - https://en.wikipedia.org/wiki/Minnesota_Transracial_Adoption...
It's always a little interesting to me to see the vintage of the studies "HBD" advocates introduce their ideas to HN with.
p-hacking is particularly easy to do with IQ since there's a peculiar effect in that the heritability of IQ is not fixed. When we are young our IQ is generally able to be more significantly influenced by environmental factors, yet as we age our IQ becomes ever more a product of genetics with recent studies showing the heritability to be upwards of 80%.
So for instance in this study, if you look at the results of the children at age 7, you'll note that both black and white children had measured IQs that were about 10% higher than their IQs at age 17 (around 8 and 10 points respectively). This is normal and indicative of a privileged upbringing, as all the children in this study had. Yet that privilege counterintuitively does not carry over into adulthood, which is phenomenally interesting -- but also makes it very easy to generate disingenuous arguments when it comes to this topic.
Regardless, the only cite you've brought to this thread is a study from the 1970s which has not only not replicated, but has, as I've said, been contradicted by later studies, studies informed by the methodological criticisms the Minnesota study received --- some of those criticisms coming from the study's own authors.
The question is really why people debate Physics or Chemistry. People that thought the Sun was a perfect sphere would discount things they could directly see with their eyes.
What do you mean by this? The Sun is, to a very good approximation, a perfect sphere. Or as perfect a sphere as one is likely to find in the physical world.
People had firm belief in the second and discounted evidence to the contrary. Even ignoring that rotation causes a significant distortion (https://www.nasa.gov/topics/solarsystem/features/oblate_sun....) they would not accept https://eclipse2017.nso.edu/coronal-mass-ejections-cme/ as a 'real' thing.
That's precisely the point. One may argue that the sun being a perfect sphere is a fact, but someone else may also point out the fact that the surface of the sun shows significant deviations from a perfect sphere, the fact thus being that the sun is not a sphere.
The thing is, they wouldn't be discussing facts but instead personal opinions based on observations. Thus both would be right and wrong while the facts were still the same.
And to specify, the 'rightthink' and 'wrongthink' examples I've studied in the past were cases that didn't break down by any common political stance like left or right, but were findings that either agreed or disagreed with positions held strongly by people regardless of political standing. I do think the results would apply as expected when considering a political issue that broke down along party lines, but the reaction is much stronger when there is more uniform agreement for or against something.
It's as if the author wrote a good article about confirmation bias, and someone jumped in at the last second and said "wait, add a line implying this is all about shitty conservatives so our readers will like it." I wonder if anyone at the New Yorker appreciates the irony.
Only those idiots who disagree with me have cognitive biases.
rather, loss aversion (a cognitive bias) can explain a fair bit of the behavior here. people with (perceived) status or privilege don't want to lose those statuses/privileges, so seek to rationalize their position. if it means attacking the underpinnings of facts and evidence, so be it (as the bias goes). it's part of our primitive brains to do so, and requires the use of our modern brains to keep in check.
also, facts and evidence live on a continuum, where facts have low margin of error while evidence has high margin of error. the way you address a high margin of error is to gather lots of evidence, so that statistically, you narrow the margin of error to something tiny, and then you can call it a fact. for example, the sun rising tomorrow has a margin of error that's approximately zero, so we call it a fact.
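The "gather lots of evidence until the margin of error is tiny" point is the standard statistical result that uncertainty in an estimate shrinks roughly as 1/sqrt(n). A quick illustrative simulation (the coin-flip setup and function name here are my own, just to make the continuum concrete):

```python
import random
import statistics

def margin_of_error(n_samples, p=0.5, trials=2000, seed=0):
    """Estimate a true proportion p from n_samples observations, repeated
    over many trials; return the spread (std dev) of those estimates."""
    rng = random.Random(seed)
    estimates = [
        sum(rng.random() < p for _ in range(n_samples)) / n_samples
        for _ in range(trials)
    ]
    return statistics.stdev(estimates)

# The spread of the estimates shrinks roughly as 1/sqrt(n):
# with 10 observations it's "evidence"; with enough of them,
# the margin of error narrows toward what we'd call a fact.
for n in (10, 100, 1000):
    print(n, round(margin_of_error(n), 3))
```

With 10 samples the spread is around 0.16; with 1000 samples it drops to around 0.016, a tenfold narrowing for a hundredfold increase in evidence.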
most reasonable people are not interested in arguing facts, especially if those calling into question those facts have no basis of expertise or credible counter-evidence.
you can have your own beliefs, but you don't get to have your own facts. moreover, it's reasonable to discount people who believe something without evidence, especially in the face of evidence to the contrary.
"the left" doesn't mean much of anything other than meaning "those i don't like".
it's a general derogatory phrase used by (for example) commentators like ben shapiro... on most episodes of his show, it's a handwave reference to indistinct millions of people that is almost always followed by chatter about how terrible they are. just the absolute worst. evil people. etc.
being one of "the right" doesn't mean you would like someone like steve bannon.
It also comes up often in the global warming context, obviously, which goes the other way.
It shouldn't be surprising. He's been pushing the vaccine autism link since 2012:
"Massive combined inoculations to small children is the cause for big increase in autism...."
"Autism rates through the roof--why doesn't the Obama administration do something about doctor-inflicted autism. We lose nothing to try."
"Healthy young child goes to doctor, gets pumped with massive shot of many vaccines, doesn't feel good and changes - AUTISM. Many such cases!"
"I am being proven right about massive vaccinations—the doctors lied. Save our children & their future."
As someone on the left, I find it bothersome that when people say "the left" they either mean neoliberals or SJW, neither of which actually stand for leftist ideas. Neoliberals believe in a right-wing economic agenda, while playing identity politics and SJWs want to feel good about themselves from their couch, so they play identity politics.
Both of these, I have a massive issue with, but due to this caricature of the left coming from the right, I cannot honestly debate with the other side.
Well, let's have a look at the actual examples in the article: what things are people apparently failing to change their minds about? (I've also included a few things that aren't examples of that but that seem relevant.)
1. Their ability to distinguish real from fake suicide notes. No obvious political slant.
2. What makes a good firefighter. No obvious political slant.
3. The effectiveness of capital punishment. Highly politicized topic, but reported experiment and reported results both symmetrical.
4. Answers to an unspecified set of reasoning problems. Presumably no political slant.
5. Next study described is one in which people actually *did* change their minds, when asked to explain in detail things they thought they understood. Also not political.
6. Next one also isn't about changing minds. Found that a rough proxy for ignorance about Ukraine is correlated with eagerness for the US to intervene there. Definite opportunity for political slant here.
7. General discussion about mutually reinforcing ignorance. Has kinda-gratuitous anti-Trump comment at end.
8. Next one also isn't about changing minds. Found that trying to explain things in detail leads to more moderate opinions, whichever side one's on. Politically contentious questions; procedure and results seem symmetrical.
9. Alleged dangers of vaccination. Anti-vax sentiment tends to skew left, though I think less drastically than is sometimes thought. Brief, semi-gratuitous, Trump comment here, but it isn't particularly negative.
10. Whether owning a gun makes you safer. Politically contentious question.
So. Most of the stuff here is either entirely unpolitical (1,2,4,5) or concerned with political issues but balanced in both procedure and conclusions (3,8). The last section (9,10) describes a book that discusses demonstrably-false opinions held by some people who skew left (9) and some who skew right (10). That leaves 6 (which looks to me -- your opinion might differ -- like an unbiased attempt to investigate a question with political consequences) and that little swipe at the Trump administration in 7. (And, if you really find it offensive, the comment about Steve Bannon at the end.)
I don't think any reasonable person could summarize this as anything even slightly like "Why don't everybody agree with the left?", even though those little swipes make it clear that the author is no fan of Donald Trump. (A characteristic he shares with plenty of Republicans, so it's not exactly a clear indicator of egregious bias.)
it's quite fashionable to trash talk this amorphous and indistinct group called "the left".
"conservative" political commentators in particular love using "the left" as a generic stand-in for "those undesirables" or "those fools" or "those leeches" etc.
Whether individuals should be allowed to own guns, without regard to whatever risk there might be, is the politically contentious one.
But the mere question of safety isn't political (how things should be), it's observable (how things are).
Specious argument can come from anybody, but the "la la can't hear you" portion of the left is measurably smaller compared to the right.
It's not the same thing as a web poll.
When I see the word 'fact', I remind myself that even now evolution is still only considered a theory - despite the weight of evidence on its side. Because in science something might just show up that invalidates all that came before, and maintaining that faint skepticism is what separates science from belief.
(And of course evolution is a theory, not a mere hypothesis because of this weight of evidence).
If you want to show me a fact, make sure it comes with a mathematical proof attached.
No, you simply misunderstand what “theory” means in the scientific context. It’s a synonym for “explanatory model”. And “evolution” in this context stands for something close to “gradual change via random mutations and natural selection [as well as other, less important mechanisms]”. Calling it “theory” does not imply that it’s not based on observable facts, or that there’s doubt about it. It’s simply used to distinguish observable facts from an explanatory model.
There is no “progression of certainty” from hypothesis → theory → fact. As another commenter has said, evolution is as much a theory as gravity.
Using your definition, even gravity is a "theory", yet things still fall to the ground when we drop them from above.
So one day, when I had a couple of hours free I sat down and read every single link. All of them.
The lack of substance beyond an opening paragraph that claimed a point was appalling. When it was brought to the attention of those link-bombers, they actually agreed that there was no substance and that they hadn't read those links at all. I was really proud of my friend and thought we had actually resolved a political disagreement on Facebook.
Then I saw them dropping the exact same links in another conversation within a couple of days.
Facts without context don't prove anything. Some of the facts but not all of them just frame the story in a particular light.
After years of this stuff, it becomes clear that nobody has time to become an expert on every subject just to be able to identify the critical details that people are leaving out.
Eventually, you just research your positions on the subjects that matter to you and vote accordingly. Now, if something doesn't make logical or mathematical sense, I oppose it. If there's no logic or math involved, I generally take no position at all.
The prevalence of dumb things to link and dumb people to link them is a bit higher now but the Gish Gallop isn't new on the internet.
A: Of course potatoes are healthy! They're natural!
B: If everything natural were healthy it'd be healthy to eat hemlock.
A: Potatoes aren't hemlock and it's really offensive you'd make that comparison.
I have the same opinion on fallacies: practically every single time somebody says "wait, you just used a fallacy", either it's wrong, it doesn't matter, or the accuser himself just used five fallacies by pointing it out. It's an endless battle over magic words that have so much social weight behind them that using them should be forbidden in any serious debate. That's my personal, very unpopular opinion.
As far as logical fallacies go: I don't like the Internet style of just rattling off a bunch of logical fallacies to dismiss an argument, but I think it's fair to say "this argument is such-and-such a fallacy because of X, Y, and Z."
If we took your suggestions I don't know how anyone would have a substantive debate at all.
If you formed your opinion yesterday, it has low inertia, and it is easy to change course. If you formed your opinion a decade ago, it has high inertia, and the best I can do in a conversation is give a little nudge in the right direction, hoping it will have some effect later on.
The causes for opinion inertia are known. Long held opinions are part of people's identity (think religion, for example). People actively filter for supporting evidence, eschewing opposing views. It takes a while to breach both barriers.
If you test high inertia opinion change in one sitting, like studies do, it looks immutable.
(don't ever think you are immune to these inertia effects. Everyone is affected)
Some social settings are better than others in that regard.
This is the wrong conclusion to draw from the discussion. People do change their minds, but rarely all at once. What they want is to be consistent, so you have to move the needle a little bit at a time.
One technique to convince somebody of something: first you have to show you understand their position better than they do and are on their side. Then you can lead them towards the correct solution. It's called "pacing and leading".
For example, I'm an ex-Jehovah's Witness, and over on the r/exjw subreddit a common thing you see is a person who starts posting. They've just realized their long held sacred beliefs are nonsense, but all their friends and family are JWs (which is a problem because JW culture is such that losing your faith makes you a pariah, possibly shunned by those friends and family), so they want to know how to wake up their friends and family to the fact that JW beliefs are nonsense. But you just can't. Believers have strong faith and ignore contrary evidence and apostate arguments while under the spell of their faith. The believer has to make a lot of decisions for themselves to be able to read a bunch of facts, apply them rationally, and decide that their faith doesn't stand on its own two feet.
And depending on how or what you believe in, not just religion, but any subject, if you have pre-established or long held beliefs about it, information that runs contrary becomes suspect.
After all people will argue until they use up all the oxygen in the room over evolution (because of the perceived implications to their religious faith or other beliefs). But not a goddamn person you've ever met is going to pick the atomic weight ceiling of the heaviest possible atom as the hill they want to die on.
Now I don't know if I would write this off as simple "emotion". Obviously, "emotion" is probably part of it, but so is "social pressure", etc as you point out. I'd probably summarize that collection of factors as "Ideology". Convincing people to change their minds is likely one reason that ideology is actually useful to humans, even though oftentimes it bears little resemblance to the underlying reality we live in. It allows us to understand and interpret the world in ways that are comfortable and intuitive. Being "comfortable" matters.
It's the reason people ignore facts. And the reason people change their minds.
Just to sharpen the point, think of it this way, ideology can make you "change your mind" and start believing things that have absolutely no basis in fact.
"Facts don't change people's minds" forgets about the presentation of the facts, which is usually done in a condescending and overly authoritarian way.
No fact is an absolute truth that shouldn't be open to discussion. Facts help, but when people get "holier than thou" about them, that's usually when the pushback happens (not only on those occasions, though).
I think these days most readers, especially of publications like the New Yorker, are more media savvy than you give them credit for. When you read the headline, did you really think it would be an article espousing the view that nobody anywhere has ever had their point of view altered by learning something new?
I know someone who is the victim of a professional conspiracy theorist. This guy has figured out how to pull people in with explanations of why your typical conspiracy theory is wrong. He'll happily refute that 9/11 was an inside job in public. So his marks think "Great, rational guy! Let's learn more!"
But that's when you have to pay. All "hidden secrets" are on Patreon of course. But now his marks are literally and emotionally invested. I read once some conspiracy theorists feel like they have a special position because they know something the public doesn't.
Between that (if it's true) and the fact you've literally paid for whatever hogwash this guy is selling you'll soon find yourself (as my friend did) explaining that every study quoted by The World Health Organization is flawed because they profit from vilification of Monsanto. How does he know? Studies that refute the studies! Written by whom, you may ask?
In which case you'll be challenged for challenging "the true skeptics" who totally have it figured out. And the burden of proof is left to you: if you can't refute the studies that refute the studies well you must be wrong and now must accept The True Facts.
It feels very much like a conversation with my friends in cyber security who see a bogeyman behind every tree. I am willing to entertain rational fears. But when you don't like something new and reject it for an undefinable reason, or even worse because it could reveal secret squirrel knowledge only cyber may know, then don't be surprised if I too am a skeptic.
If you expect to see evidence of changing minds literally represented in comment threads or at the end of dinner-table arguments, then you've totally failed to understand the value of those discussions. Intense discussions routinely change minds, but they rarely end with any evidence that a change has taken place.
The notion that contentious discussions on broad cultural disagreements are valueless is, I think, mistaken and the meme is, in general, profoundly damaging.
To me the obvious implication of the title is not that they never change our minds (we all know this is not true), but that they very rarely do -- at least much more rarely than one would expect.
At any rate, this is an article from a weekly magazine; I'd expect somewhat more careful reading than occurs in a newspaper to be the norm.
Not sure how you've made this leap. The point is not that people never change their minds, just that the way they do so is less evidence-based than one would hope. It even says later in the article:
> Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.
That's perhaps an argument for the sciences directly built on positivism (the philosophical positivism), but that list ends quickly. Everything else is based on at the very least odds, but mostly consensus, and convincing based on those is required for anything beyond a few basic sciences.
And positivism only covers math, large parts of physics and chemistry (but not all), and ... that's mostly it. I mean, you could argue small portions of economics, biology also qualify, but only small portions.
Consensus itself, like presented in climate science and medicine, is an argument by authority. So if you convince someone of the truth of, say, global warming, you have in fact convinced them mostly of a social fact: that lots of people studying this problem seriously come to this conclusion. You have not convinced them that "because X leads to Y, here's the proof, Y leads to Z, here's the proof, this will now happen". You can't do that. These sciences don't work like that.
Things like climate science, medicine, psychology itself, ... just aren't built from rational argument alone but from tying anecdotes into a larger framework and some amount of statistical inference. For all those sciences, any positivist would say that
1) there are (very small) odds that the science is entirely wrong
2) the odds that significant parts, or specific studies, are wrong, even when executed entirely correctly and with integrity, are quite large: at 95% confidence, roughly 1 in 20 should be outright wrong
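The 1-in-20 figure is just the definition of a 95% confidence level applied across many studies. A minimal simulation (all numbers hypothetical) sketches it: under a true null hypothesis, p-values are uniform, so a fraction alpha of such studies will report a spurious positive purely by chance.

```python
import random

random.seed(42)
alpha = 0.05          # 95% confidence level
n_studies = 10_000    # hypothetical studies whose null hypothesis is actually true

# Under a true null hypothesis the p-value is uniform on [0, 1], so a study
# declares a (false) positive whenever its p-value falls below alpha.
false_positives = sum(random.random() < alpha for _ in range(n_studies))
print(false_positives / n_studies)  # roughly 0.05, i.e. about 1 in 20
```

Note this only counts false positives among true-null studies; it says nothing about how many published results are wrong overall, which also depends on base rates and publication bias.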
Either way this is very tangential to my point.
"I've never seen anybody change their behavior as the result of a well-reasoned, rational argument. I have seen people change their behavior to avoid ridicule." -- Scott Adams
That said, be careful what we wish for. There are certainly unforeseen consequences here.
Both sides of the political extreme are guilty of claiming their "facts" are real and that others are ignorant for not believing them. It's the modern-day equivalent of the "our god is the one true god" nonsense. But the left just has a much bigger and louder megaphone to shout at people.
It's obvious to anyone paying attention that neither the New Yorker nor any other publication cares about facts. They care about their own agenda. The New Yorker isn't an objective publication. They are an advocacy group.
Essentially it's the Ministry of Truth attacking people for wrongthink.
There are a host of reasons we don't respond to facts with "oh, I guess I'll abandon all my deeply-held beliefs immediately!" Everything from "that's an embarrassing loss of face" to "I'd like to fact-check you before I accept that, but it's rude to say I doubt you" to "brains don't work that way, it's physically impossible to discard a whole belief on demand" factors in. And I agree, this is hardly a bad thing. Taking time to change our minds is a safety feature. Not only are facts sometimes false, or falsified, or misleading, but most of us aren't great at knowing what's relevant when.
On almost any topic, there is someone who could argue me into an embarrassing defeat, or at least an awkward stalemate of "we can't both be right". This is true for most people about all things, and all people about most things. I am overwhelmingly confident that young-earth creationism is wrong, but I've seen its adherents speak; I have a lot of arguments they can't answer, but they have a lot of arguments I can't answer either. I don't know enough about the geology of Mount Ararat to explain why some arcane point about flood sedimentology is wrong, but the correct response to that is not for me to agree that the Earth is 5,000 years old. And that's for a fringe theory that I'm uncommonly well-qualified to rebut - the problem only gets worse when we move to a lay viewer looking at any serious debate.
People today may have unprecedented access to facts, but that hasn't given us the time or memory or training to evaluate every possible issue from the bare facts up. Everyone who's ever cast a vote is, on a great many issues, working from heuristics and expert opinions and best guesses. With only one lifetime to learn things, that's not avoidable. I'm sometimes alarmed by how harshly people will resist looking into new facts and evidence, but the idea that people should promptly respond to compelling-sounding facts by changing their minds doesn't strike me as a workable one.
>A world where people's minds are easily changed by facts would be a world of fads...
is an interesting statement.
Because we live in a world of fads anyway, even though people's minds are not easily changed by facts.
The interesting thing is that I think the author makes the opposite point of what they were intending.
The tl;dr of the article is that a study came out saying doctors should strictly control blood sugar in the ICU. It was very slow to gain adoption. Eventually it started to take hold, but a new, more rigorous study came out and said this was a bad practice, as it was actually causing too many hypoglycemic events and killing patients. The new recommendation was to not strictly control blood sugar in the ICU. The author states that doctors are now not switching back to the strategy they had in the first place. It seems like waiting for a preponderance of evidence might always be the best path forward, especially for doctors and other situations where people's lives are at risk.
How was this “evidence-based”?
Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted.
My understanding, from the descriptions in the article, was that it was not that the beliefs were _refuted_; it was just that what they were told originally was not true. The subjects had no reason to believe the latter assertion over the former.
Thus it's easy to understand that a single observation, particularly one which is given at face value, is not enough to scrap whole models.
To provide an example: should we scrap the whole notion of gravity accelerating all dropped objects uniformly, independent of mass, if someone observes a feather taking more time to fall than a cannonball? I mean, that's a fact. Anyone can see it and reproduce the same behavior. So, should everyone just abandon the notion of gravity uncritically because a single fact was presented?
When our minds are mulling over some piece of information, whether it’s true or not, it will have side effects on other thoughts, opinions and emotions. If we’re confronted with a revised set of facts later, there is no way to rewind all those side effects.
With people in general the belief comes first and then you backfill with supporting evidence and argument.
Also, we tend to pull our beliefs into our identity. Then a challenge to our beliefs is a challenge to our identity, and very hard to swallow. So it is difficult but important to keep from identifying yourself with your beliefs.
Human brains judge another person's whole life in milliseconds (we all do it; we can't help that), but then struggle with what is quite simple math. That should tell you something about what we are.
My advice is: don't ever forget that logic is only the smallest part. Demeanor and a strong image are much more important than they should be; simply presenting a "fact" is not enough.
Complex is the opposite of simple. If there are 50 similar elements communicating among themselves, we have a complex system.
This is important because a thousand ants behave totally different from an isolated ant.
You just can't study an ant and expect to understand the nest or anthill.
Psychologists try to do this all the time, and it is totally wrong: they studied isolated rats in order to understand drug addiction, not understanding that rats are social animals which travel 20 miles (30 km) each day.
So they jailed the rats in a cage, because it was easier to study them that way, and extracted conclusions that were totally bogus for human beings. Those conclusions were the basis for the US war on drugs.
Here they do the same. They make a very simple experiment and extract conclusions about the whole system. "In Africa, when humans..." Whatever; it's more fiction than science.
It is ok to make fiction and speculation, but you should always differentiate what we really know with high degree of confidence from what we do not.
The latest paper from that study is incredibly interesting.
How much/long does a study like this cost/take? Is it possible/useful to scale this?
From reading into some of the replicability discussion, I've gotten curious about social science generally. What's an experiment like this trying to prove/demonstrate? The journalistic narrative likes the easy "X debunked." I imagine researchers have a more nuanced perspective.
I guess what I'm asking is: what does the larger effort look like? Do these studies eventually add up to a larger understanding of how we do form opinions, where facts do change our minds...?
Replicability is one thing. How about generalisability?
That's not really possible. So instead the experiments that are made are toy experiments, but when you're not really testing what you're trying to measure, it becomes impossible to prove anything, and possible to show anything. For instance could you design a toy experiment that might indicate video games cause violence? Of course. Could you design a toy experiment that might indicate video games don't cause violence? Of course. The experiments are meaningless.
As an example of the problem, "emotional intelligence" at work is all the rage right now. Yet the keystone study that sparked it is really just quite bad. The author had people split off into groups and perform a variety of tasks, such as, literally, planning a shopping list. Using some method of determining who made the best shopping list, the author then determined that the groups with the highest average IQ did not perform the best, whereas their "emotional intelligence", as measured by an (again, literally) "reading the mind in the eyes" test, mapped better to performance. So, therefore, it's not merit alone that determines performance but some emotional intelligence, at least as measured by "reading the mind in the eyes".
That's just broken logic. At the bare minimum, IQ != specific task merit. The most logical way to perform that experiment would be to have created teams of those that performed individually best on any given task to work together against those that did not score so well on merit but did well on the "reading the mind in the eyes" test. Of course she did not do this, the bare minimum to even begin approaching this question, since the result would not be what she wanted. And negative results don't get published. Yet now there have been likely hundreds of articles and spin-off studies taking that study's unjustified conclusion as a given.
So no, there is absolutely no big picture progress in the social sciences.
We have to wonder whether the Stanford undergraduate student generalises to the whole population.
It doesn't require much stretching of the imagination to see that circa 1975 Stanford undergraduates, as a cohort, may score below average on objective measures of humbleness.
Surely it's obvious that preventing plastic entering the ocean at the source does nothing to remove the plastic that's already there.
Maybe I should read more of this blog, but that doesn't inspire confidence.
Which doesn't matter in this test. Both groups in this study would have the presumed bias, so the differences in scores could still be attributed to the independent variable.
When the FDA said you should limit fat intake, that was considered a fact and only nuts would disagree, but it is now considered a much trickier statement. When the FDA says the flu vaccine is safe, that is a fact and only nuts would disagree, and it is not considered a questionable statement. But it isn't hard for a nut to use the former to cast doubt on the latter.
Many people know from experience how hard it is to convince anyone to change what they say, the way they behave, etc., from facts alone. I say "behave" because it's very important to realize the difference between "rationally accepting" and "caring about" a truth or fact; more on that later.
We also know that people can change their minds quite easily when they are exposed to pretty much anything for enough time, with enough repetition. Even when we know something is false, the repetition of a certain discourse can have a noticeable effect on us. When you pair that with other types of external pressures, it becomes even more egregious.
The key point is that we can't easily change how people feel while being respectful to them and trying to change them through words only. We are irrational, and we all have different priorities. Even if I accept a fact, it might not be something important to me, even if I rationally say it is, so I won't change the way I act, and it won't matter that much. Or I might say that I don't accept something just because I don't feel that way, even if I have to discard and ignore (unconsciously) the facts. What "I feel" is more relevant than the facts that I "don't really (want to) understand". As tobr puts very well in another comment: "When our minds are mulling over some piece of information, whether it’s true or not, it will have side effects on other thoughts, opinions and emotions. If we’re confronted with a revised set of facts later, there is no way to rewind all those side effects."
I thought it was interesting to write about all this, not only because it really helps in understanding why facts might or might not do much to change the way people behave, but also because it helps us understand how to actually make people change their minds. Words might be very effective for those who are already in a similar line of thought, but in other cases we might want to try changing the way people feel about something instead, by making them live, in first person, the contradictions in their own beliefs. (Unfortunately, for many technical issues, you can't do that without becoming a teacher, and that assumes the other person trusts you enough to let you teach something. But then it's not surprising that people can't trust facts that depend on knowledge they don't have; it's only natural.)
All this also helps us understand that even though we might accept many rational truths, we all have different priorities, so the ones we end up acting upon deserve some consideration. Sometimes you are so focused on your own causes that you don't understand how others don't share them, when it's simply that they have their own, not necessarily that they don't rationally accept and understand what you are doing. And there are a billion worthy causes. For some people it might be about saving the world. Others focus only on their kids. We can do a lot to better manage that collision between rationality and irrationality, between facts and feelings, because both are critical to us as human beings.
After decades people still believe that global warming is a hoax and vaccines cause autism, despite mountains of oft-repeated evidence that this is not the case.
But of course there are many other factors. I already talked about the problem with more technical arguments, but nowadays in many cases we have an aversion to science and a mistrust of anything that might come from it. Also, many people are mostly exposed to what they already think and feel, because when someone else tells them something they don't believe in, they don't hear it; they just repeat their own discourse to themselves. So that is what they are exposed to. And in case it was a bit confusing, I didn't say that "people will always change their minds [...]"; there are a lot of factors. But that we can very easily come to doubt ourselves when information comes predominantly from only one direction? For sure. And that direction doesn't necessarily need to be "science" or "media". We are just bad at detecting where the relevant information comes from for a person.
The theory of Rational Irrationality holds that people often choose—rationally—to adopt irrational beliefs because the costs of rational beliefs exceed their benefits. To understand this, one has to distinguish two senses of the word “rational”: Instrumental rationality (or “means-end rationality”) consists in choosing the correct means to attain one’s actual goals, given one’s actual beliefs. This is the kind of rationality that economists generally assume in explaining human behavior. Epistemic rationality consists, roughly, in forming beliefs in truth-conducive ways—accepting beliefs that are well-supported by evidence, avoiding logical fallacies, avoiding contradictions, revising one’s beliefs in the light of new evidence against them, and so on. This is the kind of rationality that books on logic and critical thinking aim to instill.
The theory of Rational Irrationality holds that it is often instrumentally rational to be epistemically irrational.
The theory of Rational Irrationality makes two main assumptions. First, individuals have non-epistemic belief preferences (otherwise known as “biases”). That is, there are certain things that people want to believe, for reasons independent of the truth of those propositions or of how well-supported they are by the evidence. Second, individuals can exercise some control over their beliefs. Given the first assumption, there is a “cost” to thinking rationally—namely, that one may not get to believe the things one wants to believe. Given the second assumption (and given that individuals are usually instrumentally rational), most people will accept this cost only if they receive greater benefits from thinking rationally. But since individuals receive almost none of the benefit from being epistemically rational about political issues, we can predict that people will often choose to be epistemically irrational about political issues.
I wonder if this fact will change anyone's mind about the idea that facts never change anyone's mind.
Also there are good reasons why science is not in control of our lives.
What's worse is that people will also speak in the name of the science of economics to advocate for any kind of policy or tax break.
Maybe science can't change, but our collective understanding of it changes (hopefully gets better) almost all the time.