As is often said, you cannot reason someone out of a position they didn't reason themselves into. It also shows the immense danger in allowing authorities to lie and build a straw man, because this leads to permanent damage to the thinking of the people who believed it, even once they know it was a lie.
Adopting scientific thinking and evidence is significantly harder work, which, combined with the bias of prior experience, makes it very hard indeed for new evidence to change existing minds. We know this as science progressing one dead person at a time: even scientists cannot adopt new thinking, so we have to wait until they die while training all new people with the new understanding.
We have deep flaws as a species.
> As is often said you cannot reason someone out of a position they didn't reason themselves into.
But the point here is that you also cannot reason someone out of a position that they did reason themselves into. In the first study with the suicide notes, students had decent reasons to believe they were good/bad at the task; yet even after they learned that those reasons were false, they kept believing it.
The problem, as someone who feels this way, is that you're always bad at anything relative to someone else. If that's how your mind sees it, then being in the top 5% doesn't help, because there are still a million people you see all the time doing incredibly better than you.
>"We know this as science progressing one dead person at a time, even scientists can not adopt new thinking we have to wait until they die while training all new people with the new understanding.
We have deep flaws as a species."
I've always considered this an advantage. The previous generation provides stability by sticking to what it knows and believes and the next generation adapts society to new understanding and circumstances. On aggregate this provides a measured response to changes in society and understanding.
It may also be painful to watch and experience if the paradigm shift has real-world implications, that is, if it matters at all.
It took a long time before the community of doctors accepted the experimentally proven practice of washing hands and instruments before doing anything to a patient, e.g. surgery. While it may have provided some nice stability, it also prolonged and produced a great deal of suffering.
Even now it's a problem getting healthcare workers to wash their hands; it's a constant task of enforcement.
We have had good and increasing evidence for the airborne spread of diseases since the early 20th century, and yet even now airborne spread of viruses via aerosol is rejected by most. We have experiments showing that Covid and other viruses spread airborne in small 1-3 micrometre aerosols, with confirmed transport over quite large distances on the wind. We also have experimental evidence that the virus can survive in the air for many hours.
Yet the WHO still has not accepted that viruses spread this way, and indeed many governments are rejecting it as well. This is a Semmelweis moment happening right now, and it's not just one scientist being rejected; it's hundreds, all coming at the problem from different directions and finding the same thing.
Are health officials rejecting the idea that viruses can spread airborne? Or merely rejecting the idea that we should have some common, widespread intervention to prevent it?
It seems like there’s not much room to disbelieve the former, but plenty of sensible people could have different opinions on the latter.
This worked well when the biggest technical advance that would happen across two generations would be a new style of beer tankard, and when the biggest social change was the old king dying and the new king being crowned.
When the pace of change outstrips this, you develop Problems.
The entirety of 20th century history has been us wrestling with this.
Depends on how you look at it. I believe this "perfect" species would be quite moronic. You can never think entirely rationally, as that would be quite horrible, at least until you come to the point where you cannot find a rational reason to continue your existence/eat chocolate/drink alcohol/..., because you likely will not find one. And different people have different ideas about what they deem important or not, and that is completely fine.
That said, it is important to deliver rational explanations to the largest degree possible. The recent trend to "curate" free information is quite destructive, even if that information is just nonsense. We also have some forms of "endorsed" nonsense that authorities would like us to believe and that you probably should not.
The panic about "those other people might believe something wrong and that is a problem" is a disease that should be overcome and you should adapt accordingly.
I have to agree with the second comment as well - this shows that you cannot reason someone out of a position they reasoned themselves into.
Let's try what the article suggested - perhaps I am not understanding your use of words; could you show me how they are linked? The reasoning (article) and not-reasoning (your statement) parts seem to be moving in different directions.
>science progressing one dead person at a time, even scientists can not adopt new thinking
I saw an interesting theory on this the other day: it's not about challenges to their thinking but about politics and power. Scenario:
Young researcher has no power, does short-term research contracts for low money. Randomly gets lucky and makes a cool discovery, gets promoted, gets a permanent position on the faculty. Spends the rest of their career milking that and getting promoted.
Hence attacks on their theory ("string/amyloid etc. is rubbish") are treated as political attacks on their position and salary and counterattacked accordingly, rather than it being a matter of their being incapable of seeing what's what.
It's like if you go to a communist country and try saying this is no good, switch to markets and democracy. You get arrested not because they can't see that but because they can.
> We know this as science progressing one dead person at a time
> We have deep flaws as a species.
Speak for yourself. 'We' doesn't know anything or have anything. The perceived collective consensus view on whatever is actually just a projection - it is not really there. 'Facts' are consensus opinion, deferral to expert opinion - the term (defined by the Royal Society) has authority and hierarchy baked in. https://www.etymonline.com/word/fact
With regards to science, the scientific method, applied personally, should carry weight with oneself - if one understands oneself to be a rational thinker, the method is great. If one is rational, able to change one's mind as one verifies (or fails to verify) this or that, then great. If one prefers to read articles on science sites and believes one now 'knows' something (without personal verification), one is choosing to believe one knows. This is not actual knowing - to call a 'belief' 'knowledge' is a subversion of the meaning of the term.
Ultimately, knowing facts is a subjective endeavour; objectivity is not possible. One cannot take oneself out of the experience, no matter how hard one pretends to be objective. One can pretend one is a 'we' (psychotic behaviour) and that 'we know this or that' when we merely read an article or watch a video somewhere. If you ask me, this is to be so deluded and dissociated from reality that it is like being in the middle of the ocean and thinking one is climbing a hill as one is lifted by a wave.
The collective consensus which you call "not really there" is actually the belief of the people who have the power to influence the society. I think that's why people are so interested in it - because it matters for government policy, who people vote for, etc. which affects everyone.
When the collective consensus said that infant formula was as good as breast milk, it led to a lot of babies being fed that. Then the collective consensus reversed and a lot of babies got breast milk instead. Those babies grew up to be people like you and me with all the possible lifelong good or harm that might have come from whatever the collective consensus at the time was. People like mothers and nurses who implemented it did so even if they didn't really believe it. That's the power of collective consensus - to influence just about everyone. Denial is punished socially so people conform rather than act on their own beliefs, however well informed.
Yes. Beliefs/stories are real, in so far as individuals do have them, and will act on them. But characterising a 'fact' as 'true', believing it, saying it is so, does not make it actually true. It is in this sense that I meant "not really there" - there is no verifiable reality to the thing that one can confirm.
This is why I think the study of logic and having fields where the culture generally agrees on "what counts" as good argumentation is really important. Does it solve these problems entirely? Certainly not. But look at, for example, the culture in the scientific community. From my outsider's perspective, it seems the scientific community has adopted a series of guardrails that generally prevent "bad" research from getting published.
Famously, it doesn't always work, and I'm not ignorant of the latest series of scandals involving illegitimate journals, p-hacking, etc. But I think these tend to be the attention-grabbing headlines rather than the norm. Glad to be challenged on that point.
But to return to my initial idea: I do think that we can arm ourselves with certain principles that, if applied, will move us away from biased thinking. Simply being aware of "confirmation bias" and other psychological pitfalls might make us more capable of figuring out where we're going wrong. As the article notes, people are famously blind to their own errors, but quite good at pointing out others'. So it's not like it's impossible for us to become better and stronger thinkers.
It just takes effort and some "meta-thinking" and I think some personal virtue and character to become better!
I think the mechanism in the scientific community that allows progress is just the presence of multiple perspectives.
The version where there is one scientific authority that sets these guardrails is prone to fail, and it does so quite reliably. Different faculties have different requirements for their scientific work, but even here the established players often become a problem. The power of journals and their reputations can be detrimental, but as long as there is competition, it should work.
I think some fields are less vulnerable here than others. You cannot just deliver scientific work in sociology/political science; you will have to fight a lot of people who will disagree for ideological reasons. These fights are by nature distinct from a discussion about what color some gluon needs to have in some theoretical particle, although you have entirely "people-focused" conflicts there as well.
But overall you should never put your belief in guardrails. They will sometimes be wrong, and the next ostracized scientist will turn out to have been right in the end. The only relevant content is indeed the content of the scientific work itself. Whom you delegate to evaluate it for you is a personal matter, and no institution alone can take up this task.
> But to return to my initial idea: I do think that we can arm ourselves with certain principles that, if applied, will move us away from biased thinking. Simply being aware of "confirmation bias" and other psychological pitfalls might make us more capable of figuring out where we're going wrong. As the article notes, people are famously blind to their own errors, but quite good at pointing out others'. So it's not like it's impossible for us to become better and stronger thinkers.
This is why it's so important we teach our children things like critical thinking and reasoning, educate them on the various biases we're all prone to, and teach them how to recognize propaganda that works on those biases.
> From my outsider's perspective, it seems the scientific community has adopted a series of guardrails that generally prevent "bad" research from getting published.
> But to return to my initial idea: I do think that we can arm ourselves with certain principles that, if applied, will move us away from biased thinking.
Run the proposition that science has cast its net of expertise and authority a bit too broadly past the science crowd and see how that goes over.
Or, pay attention going forward to what that crowd has to say about philosophy and other non-hard-science disciplines while reading your various socials.
There's an extremely simple answer to this: humans don't actually change their minds rationally. People arrive at most beliefs emotionally. The reason people DO NOT change their minds when told that "the scores were fake" is that they felt good when told the scores showed that they were correct, and that reinforced their own belief about their skill.
When told "actually the scores were made up" this is new evidence but it doesn't change the underlying narrative (they already believed they were good, and that was emotionally reinforced). If you keep this in mind a lot of human behavior becomes much more predictable
That doesn't entirely explain why the group with lower scores still thought they did worse than average. Why would they reinforce a false narrative that made them feel bad?
It might be that the mind will ignore evidence that doesn't confirm its emotions, regardless of whether those emotions make the person feel better or worse.
That would explain things like imposter syndrome, where people continuously ignore evidence that they are doing well because it doesn't confirm their lack of confidence in their skills.
You just rationally explained a logical issue in the parent's post. Will the parent change their mind, or will the positive narrative that they were explaining to everyone how it worked override it?
The article doesn't provide details, but it would be interesting to know what percentage of the test subjects, if any, did in fact change their view in light of new evidence. Surely some people are more rational than others.
Maybe you can find the answer to your question in the original papers? These are probably the publications:
Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32, 880-892.

Anderson, C. A., Lepper, M. R., & Ross, L. (1980). The perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology, 39, 1037-1049.
It’s a reasonable assumption. Some people are shorter than others.
Also, quite a bit of what we class as mental illness seems to be an overabundance of rationality. The existential dread of the depressed. The questioning of societal norms by autists. Neither are wrong, but both are socially dysfunctional.
Then, there are those of us who just can't hack belief. "Just trust me, it's true" was never an acceptable answer, and so I was a kid full of "why", and am the same as an adult. If it can't be explained, rationally and coherently within the context of everything else, it just gets put in a "maybe" pigeonhole and awaits further evidence or information. I form hypotheses, but they are not precious to me - I suppose for me because my identity isn't tied to any concept in particular, and I'm quite accepting of myself as a remarkable yet purposeless molecular Heath Robinson contraption. I also doubt that I am alone.
Am I perfectly rational? Lord, no. Do I think I manage a higher level of rationality than most, and have an… “unusual” life to show for it? Sure.
The insights gained from artificial intelligence about neural networks can help explain why facts often fail to change our minds. Repeated exposure to certain messages strengthens the neural connections that support them, embedding these ideas deeply in our memory and influencing our perceptions and beliefs. As a result, when we encounter facts that contradict these established beliefs, the brain is less likely to integrate the new information. Instead, it relies on the well-worn pathways created by repeated messages, making it challenging to replace old beliefs with new facts. This process shows how repetition can entrench ideas, creating mental resistance to changing perspectives even when presented with solid evidence.
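To make the analogy concrete, here is a minimal toy sketch of my own (purely hypothetical, not from the article or any study): a single logistic "belief" weight is repeatedly reinforced toward one answer, then shown only contradicting evidence. The model and numbers are illustrative assumptions; the point is just that more prior reinforcement means more counter-evidence is needed before the output flips.

    # Toy sketch (hypothetical illustration): one logistic "belief" weight is
    # repeatedly reinforced toward "true", then fed contradicting evidence
    # until the belief flips.
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def update(w, target, lr=0.5):
        # One gradient step of log-loss for a single-weight logistic unit (input = 1.0)
        return w + lr * (target - sigmoid(w))

    for reinforcement in (5, 50, 500):
        w = 0.0
        for _ in range(reinforcement):      # repeated exposure to "the belief is true"
            w = update(w, target=1.0)
        counter_updates = 0
        while sigmoid(w) > 0.5:             # now present only contradicting evidence
            w = update(w, target=0.0)
            counter_updates += 1
        print(f"{reinforcement:4d} reinforcing updates -> "
              f"{counter_updates} contradicting updates before the belief flips")

In this toy setup the flip count grows with the amount of prior reinforcement, which is roughly the "well-worn pathways" picture described above, though of course real brains and real beliefs are far messier than a one-weight model.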
This article is a collection of interesting experiments, but it didn't quite explain the "why" in the title for me. Maybe the point is simply that we humans are naturally inclined to make our beliefs persistent.
I think the proper term for this is the "backfire effect".
I understood their "why" to be more of an evolutionary thing... "Reason is an adaptation to the hypersocial niche humans have evolved for themselves". They also seem to cite confirmation bias as part of it as well.
Facts changing / not changing our mind will depend on what you define as a fact.
Usually the nuance is that just because “someone” said that something is true, doesn’t make it automatically true.
Most people will change an objectively held belief in light of tangible evidence. However, news, statistics, and statements from other people are not inherently believable.
Something we can see / smell / taste / hear / touch will absolutely change someone’s mind.
In the article, people were critical about positions they themselves created though.
There is a lot of weight being carried by “someone” and “tangible evidence”; based on that framing it’s possible that tangible evidence is only tangible if it is from the correct “someone”.
Not sure if this is what you wanted to say, but this seems to be the fundamental truth you are espousing, versus "there is some set of facts that will change people's minds."
People very much reject what they can see, smell, taste, hear and touch.
The one thing that impacts such behavior is not facts, but people actually working on the problem ("what are the implications of your belief?", "what happens when this occurs?", etc.).
Can you give examples of people rejecting what they see/touch/etc.? Or are they actually rejecting some other intangible information that's required to interpret those things in the "correct" way?
For convenience’s sake we can use the article itself -
>In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.
People rejected what they themselves reasoned was acceptable - the exact same thing, when it was presented back to them.
If you want other examples, which match specific conditions and criteria you are looking for, then I must admit that my laziness is sadly a bar to your edification.
For the purposes and context of this thread though, this example should suffice.
It clearly shows how our brains and cognition can reject information, which they themselves can see, touch and feel.
That's not rejecting what they saw. It's the opposite - trusting what they saw when re-presented with "their" answers combined with what the researchers told them about whose they were. They had apparently forgotten what they initially chose, or at least weren't very sure and trusted what the researchers told them instead.
That's like going to a shop once and remembering it was on street X, then returning later and finding that it's actually on street Y. Nobody would think "The sign looks like it says Y but it really says X and my eyes are deceiving me.". They might think the sign was changed or the shop moved or they remembered it wrong or some other explanation that doesn't involve rejecting what they can see.
To accept even tangible evidence, one needs to be prepared to change one's opinion at all. Very often the opinion, and the worldview it buttresses, is too important for one's comfort and confidence in the world to allow any evidence to ruin it. Counterexamples would be rejected as rare deviations or observational errors. Systemic evidence would be written off as propaganda and gaslighting. Persuasion by close friends may even be seen as perfidy and an attempt to maliciously pull one's leg.
Sadly, the most correct model of the world is not necessarily a prerequisite for the best fitness and thus survival.
> As is often the case with psychological studies, the whole setup was a put-on.
Do people who participate in psychological studies ever realize this is the case? Surely after the first time being misled by someone in a lab coat you'll be more skeptical the next time?
At a seminar, the economist Binmore noted this as a problem - every psychological study from Milgram on has been a trick, and so subjects (undergraduates) are alert to this. He just wanted to study how markets / prices could form, if memory serves, but he could see students reading the instructions and trying to figure out where the trick was.
Psych studies have to inform participants of the purpose of the test and debrief them if they want to survive ethics review.
This can be a challenge, but does lead to many interesting workarounds.
The original change blindness / door experiment conducted outside a lab setting didn’t get permission from participants before enacting the test.
This had to be corrected, and subsequent tests needed participants to sign off before they could participate. If you know the experiment, you would assume that being informed of the experiment would not be conducive to the test.
The researchers found a way to still create a workable experiment. Volunteers would be able to go to a kiosk to sign up. If they signed the release (? forget the name) form the kiosk attendant would duck down out of line of sight for a second to pick something up, and would be replaced by the accomplice.
They would then direct the volunteer to the debriefing room.
>Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles...
I wonder if the use of AI may help with these things - able to absorb a much larger quantity of facts about what's going on than your average voter, while being easier to check for bias than, say, typical news sources.
Does this explain some LLM behavior when they refuse to give up their position? I've heard about this and I've seen it myself. Not saying it's because of the architecture, but perhaps because of human conversations in the training data.
I've seen facts dismissed with "most of what you said is a lie" by people who believe what they are saying.
My favorite is someone saying "if you're fine with the absolute shitshow of recent years you haven't been paying attention", i.e. Fox tells me everything sucks, I listen to a thousand hours of that, you make one short list of contradictory facts, and it looks to me like lies and nonsense.
The researchers mostly demonstrated that if you lie to people, you can confuse them. This is not much of a result.
They're not really doing the experiment they claim they are doing. They need an experimental design that doesn't involve lying.
This is the biggest flaw in the New Yorker article. It assumes that the rational thing to do, upon being told you'd been lied to, is to re-evaluate your assumptions while assuming that THIS time the researchers are telling you the truth. When in fact, it's far more rational, when you discover that someone has lied to you, to start mistrusting ANY information from that source. Including the statement, "I lied earlier, here's what's really going on."
So even though the participants had no basis for persisting in their belief that they'd done well or poorly, they also had no basis for changing their belief. Because why would you trust the word of an admitted liar? And so, in the absence of any reason to change their previously-formed assumptions, they persisted in them. The New Yorker article thinks that that is irrational, but for that to be irrational, you have to assume that taking the word of a known liar is rational behavior. I, for one, don't agree with that premise.
> I'm not a woman and I don't have daughters. Do I care about abortion. No!
Empathy allows seeing things from others' perspective. Rational thinking allows looking ahead on the chess board. My kids are grown and gone but unlike some olds, I don't object one bit to paying taxes for public schools, in fact I vehemently object to across-the-board slashing of spending† on public schools because that way lies disaster.
† which is a different issue than whether to cut particular programs, etc.
Facts themselves are fictions - they are only a description of memes which have achieved consensus. Persuasion is all about 'ambient propaganda': just flood the information system with your memes of choice and we will come to accept them as true.
I find this to be particularly true with modern feminists. I don't think they have ill intentions, but as a group they seem to suffer from overcorrection and start touching subjects outside their understanding. The abuse was clear on some subjects, so they extrapolate those findings to everything else. If you show a modern feminist that the homeless and suicides are overwhelmingly male, or point out correlations with poverty or any subject that isn't gender, any sort of rational discussion is thrown out of the window: pointing out any of those correlations is perceived as an attack and no longer as an intellectual pursuit. The tribalism kicks in and it's impossible to take it out.
True for most people on the left today. For example, they don't like Elon Musk's personality/politics (which is fine, very understandable), so they are completely unable to objectively evaluate his achievements. Because they don't like him, he must be stupid, and therefore they must find reasons to dismiss anything he's ever done. It's almost comical to watch. There is no amount of evidence that can change their minds: first-hand accounts from the engineers working under him, multiple biographies, at least five companies worth more than a billion dollars, two in the hundreds of billions. But no, it's all luck or fraud.
Maybe he's an evil Forrest Gump?
Honestly, seeing this kind of brain rot is kind of radicalizing in itself. You start to question what other views you've adopted are just group think and emotional reactions.
If you wonder in what kind of environment human "reason" possibly evolved, read this commentary on the experience of a young woman kidnapped by a primitive society and her survival among them.
First, "citation needed", but also for me personally, I'm often totally undecided or otherwise un-committed to one view or another about something, because I sense my knowledge of that thing is minimal, or my only exposure to that thing is via hearsay and anecdotes etc. … so it remains in that unresolved state until I either gain more information or am forced to make a choice with incomplete info. I don't often see this approach from other people, for whatever reason.
We all "know" that people behave irrationally, yet studies that attempt to demonstrate systematically irrational behaviors end up producing weak or irreproducible results. Or, the studies uncover rational behaviors that the researchers didn't expect.
The one I remember is about "loss aversion" where it turned out that people were making reasonable choices, but not ones based on the simplistic probabilities that the researchers had designed for their experimental method. Another one is the "marshmallow test" where it turns out that some children have a reason to not trust adults. I don't have detailed references or anything like that.
Somewhat famously, people do get more rational when money is on the line. Not homo economicus rational, but a little more connected to reality. Usually this is interpreted cynically.
> And yet, mainstream theories of economy assume that individuals are reasonable actors successfully and efficiently acting in their own self-interest.
They really don’t. In the context of this discussion it’s quite ironic that this myth is so widespread.
I think in the US barriers to entry are lower. I've seen a video about a software developer in the US who lived in a neighborhood where the garbage collection company wanted to charge extra for getting out of the truck and emptying the trash bins.
This guy basically told his neighbors that he could provide the same service they used to have at the same rate, collected orders, and bought a random used "smaller" (but still quite large) garbage truck with a robotic arm to pick up the bins.
In Europe, without a special driving license for trucks he wouldn't be able to drive it (it weighed over 3.5 tonnes). I'm assuming other kinds of licenses are also required to ensure the collected garbage is disposed of properly.
In many places garbage collection is a service that's ordered through the local government so that prices can be kept similar for everybody, and so that people from remote places (who'd need to pay more for collection) won't just toss their garbage into the forests and such.
>> If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
Confirmation bias much? Ironic?
>> Of course, what’s hazardous is not being vaccinated; that’s why vaccines were created in the first place. “Immunization is one of the triumphs of modern medicine,” the Gormans note. But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved.
There are very few studies for vaccines proving their safety, and even those few are only for a FEW vaccines.
I was thinking they were oversimplifying a bit with "vaccines are safe", as the reality is they have some hazards but generally much less than the disease. But not always: for instance, the AstraZeneca Covid vaccine seemed to carry a worse risk than the illness itself for some young people and got discontinued for them for that reason. I'm not sure dumbing it down too much is a good idea.