I find that a lot of people aren't used to changing their mind. For these people, conversation is a means to convince others rather than an exchange and synthesis of ideas. To take this further, people in general are very attached to their own ideas, and if presented with contrary viewpoints, will fight as though backed into a corner. I think this comes from attaching their ego to their opinion, so if they're wrong, then they're a "worse" person, so they really don't want to be wrong. Perhaps it's also a form of sunk cost fallacy.
Part of "gauging the conversation" is to understand if your conversation partners are in such a mode - and if so, then to adjust appropriately.
> I think this comes from attaching their ego to their opinion, so if they're wrong, then they're a "worse" person, so they really don't want to be wrong. Perhaps it's also a form of sunk cost fallacy.
Mmm. I've seen similar things but I interpret it differently. People literally can't think that quickly - it takes a few days (!), ideally involving some good sleep, to really synthesize new ideas.
That time required to think implies that even if someone is convincing, it still makes sense to keep pressing your disagreement in the here and now.
1) It is possible that their certainty is convincing but not their argument. In my early days on HN that happened to me a few times - I got convinced in conversation because someone seemed confident despite having clearly understood what I'd said. Later I swung back to my initial opinion after reviewing the arguments carefully and deciding that the compelling part was flawed.
2) Even if the argument is compelling, it still makes some sense to just throw out all the arguments you have, shotgun style, to find out what the appropriate counters to them are in the future.
Regardless, I'm pretty sure people do change their minds, it just happens a bit slower than people expect so nobody notices it happening. Of course, there is a particular style needed to achieve that - if they get threatened or attacked while changing their mind they will not do it.
I agree with this. I don’t think I’ve ever changed my mind on an opinion (although I have on facts I was wrong about) based on a single conversation. In fact I usually don’t even notice my opinion changed until next time I think about the topic and realize I feel differently than before.
I used to be more politically radical but in recent years I have found myself subconsciously mellowing out in favor of reformism. It literally happens without me even knowing it. The same has happened with regard to social issues, which I used to dismiss as divisive and unimportant, but now realize that from the perspective of the people affected they are of the utmost importance - which probably stems from a very gradual process of becoming more well socialized.
I think what you’re describing is just a natural part of growing up. It’s well documented that folks tend to be more radical in their political leanings when they’re younger.
For synthesizing new ideas maybe it takes that long. But if you hold any serious views that you expect to have challenged by others (relating to politics or otherwise), you should have clear rules for why you believe those things and exactly what evidence it would take to cause you to change your mind. Then, you can change your beliefs the exact moment that new information is provided to you.
I try to start all of my conversations about politics or sensitive issues with the question "What would you have to be shown, or what evidence would it take for you to change your position on X?". If the person doesn't have a good answer to that, then they are usually either an ideologue or have not actually seriously thought about the issue in any depth, and it's probably going to be fruitless to try and have a conversation about that subject.
I abide by this rule and can change my views at the drop of a hat when presented with new evidence.
The question makes more sense if you additionally pose an alternative (i.e. change my position to Y). Posed strictly like this, all I can really say is that I would change my position if you demonstrated to me that it was in my own or others' self-interest to behave differently.
Since we are talking about assessments and decisions about a course of action, not values, the question falls apart. I think it was intended as a trick question calling you out as a hypocrite, but from what I have read of your comments on HN your answer would run more like:
A: Examples of a better way to phrase the question.
A: That for some category or type of person the question won't make sense, in which case it's better to ask a different question--or perhaps not to engage in conversation.
You might start by asking what evidence--experiences, observations, or stories--has framed their approach before asking them to speculate about new evidence that would cause them to change their decision.
I have used your approach and found it effective in dealing with managers: is there any new data or information we could gather that would affect your decision? If the answer is "no" then you know that you are dealing with a political situation (small "p" as in organizational politics) not a problem solving one.
Those are fair suggestions. I'm not typically inclined to "do the work for them" so-to-speak, by asking them what experiences & observations have framed their approach, because that's basically priming the person to have a particular response to the second question which might not be in alignment with their actual beliefs.
A lot of the time people don't have a good response to the second question because they have never actually even considered the possibility that they could ever hold the opposing view on something, or what holding that view would entail.
If there is a value implicit in seeking to gather disconfirming evidence by asking the decision maker what would change their mind, it seems to me to be a commitment to the scientific method of allowing new data to overturn existing models and hypotheses.
My apologies if I imputed motives in your question that were not there.
I am curious: what would you do if you were facing a high stakes situation where another person's decision was going to have a material impact on your life? How would you go about trying to change their decision?
Sure, "commitment to the scientific method" is also a "value", if we want to be abstract like that.
Regarding your question: Depends on how much impact on my life it would have, and what exactly the decision is. I have no prepared strategies to change people's minds apart from presenting reasonable arguments and maybe slightly bullshitting my way through it at times. There was never a need for more than that.
In the extreme the response would be lies and even violence, if nothing else works, I suppose. Again, it depends on what the situation and "material impact" on my life is.
I really don't think this is how it "should" work. If you took the time to actually research something and build knowledge in that area, a single sentence or verbal paragraph should not change your opinion. You should take time to evaluate and reconcile whatever was new, and spend time checking whether what was said was actually true. A single thing said by someone with no special authority on the matter should not make you flip.
Debates are more about speed and quick recollection than anything else.
Also, to your last paragraph, it does not show that the other people are ideologues. It just shows that they see you want to win a verbal game rather than engage in serious debate on the topic. People are not interested in that, because why would they be?
> Also, to your last paragraph, it does not show that the other people are ideologues. It just shows that they see you want to win a verbal game rather than engage in serious debate on the topic. People are not interested in that, because why would they be?
In fairness, there are more interpretations beyond wanting to win.
Trying to make the point general: that filter ("do you have an evidence standard?") does identify some people who will change their minds, but will rule out others who are also willing to change their minds.
Probably quite a lot of someones, if my theory is correct. People can be quite flexible toward someone who is respectful and wants to help them get better outcomes. I've had instances where I turned out to have convinced someone 24 hours after the fact - and vice versa.
I don’t waste my time trying to convince anyone of anything unless it directly affects me.
That means the only person I have to convince of anything in my personal life right now is my wife, and even that is related to shared goals. For instance she's religious - I'm not. What's the purpose of trying to convince her of anything?
If relatives have a different opinion when I eat dinner with them, I just nod and quickly change the subject to something unrelated to, for instance, religion and politics.
At work, I have to do some convincing. But even then, “I stay in my lane”. If it is a larger project and I’m only responsible for one slice, I will give my opinion on the rest. But they can take it or leave it.
There are topics on which I hold opposing positions logically and emotionally. There are topics on which I can hold different positions depending on my mood.
Humans are not robots; we do not compute our worldview from a set of learned facts or opinions - it is formed by lived experience.
Therefore, convincing your opponent is not guaranteed to ultimately change their mind - they may easily change it back an hour later. Nor is it pointless to try to convince someone who appears immovable in their position - your words may be the last drop that makes them change their mind, or at least puts a crack in their worldview the next time they think about the topic.
Glad I'm not the only one who doesn't think fast enough to change an opinion within a conversation. It takes me literal hours of reflecting on a past conversation to change. There is no way any non-trivially held opinion of mine could change within a single conversation. Questioned? Maybe. Changed? No.
I would say it is the overwhelming majority of people's approach. They have a point of view and are emotionally committed to it. They hold this commitment almost equally across opinions, even if they only arrived at one by a first gut reaction moments ago.
They also project this mode onto their conversation partners. For example, if you offer a counter point or drawback to a proposal, they assume you are strongly opposed to the proposal. Rather than discuss the drawback, they will launch into a stream of defenses of the proposal.
I haven’t found any reliable ways to steer people out of this. The most useful adjustment is to expect it and know that most people are just giving you depth on one intuitively appealing solution.
Simply put most people can’t think straight and therefore can’t converse logically either.
My typical method of helping with this is to find common ground. Find something you both agree on, even if it’s only tangentially related, and then pose an argument and phrase it as such.
For example, when discussing if the Earth is flat or round, you might both find agreement in that nobody has ever fallen off the edge of the Earth. Then you might try to pose an argument as to why - which is more likely, that nobody has ever traveled far enough to fall off, or that the Earth's surface is actually continuous, with no edge at all?
I find assigning probabilities to topics often helps.
But sometimes you can’t, and that’s when you agree to disagree and politely exit the discourse.
The first part of this sounds very closely related to something called Rogerian argument [1] which aims to find new opportunities for consensus by building on views already held in common.
The bit about assigning probabilities is interesting, precisely because I can think of very few contexts in which it would be of use to me. People seem to have little tolerance for shades of uncertainty when expressing views, whereas privately we think in probabilities all the time. It's as though we play a kind of poker where our need to conserve our 'stack' of reputational authority makes us relegate the actual ideas in contention to mere 'hands' to be represented, bluffed, and trivially discarded when a more amenable certainty presents itself.
> For example, when discussing if the Earth is flat or round, you might both find agreement in that nobody has ever fallen off the edge of the Earth.
So I'm not trying to argue that the earth is flat, but how do you respond if they ask for proof that nobody has fallen off the edge of the earth?
It's not like you can point to some record of all history and show that the event doesn't exist within it. You might try to explain that there is no edge to fall off of, but if you can convince them of that, then the conversation never needed to happen in the first place.
> how do you respond if they ask for proof that nobody has fallen off the edge of the earth?
The same way you should respond to all questions. Truthfully. You have no proof, and you don't care to waste your time looking for it. If that is a showstopper for the conversation, it was never worth having in the first place. Simple as that.
I think that's practically what will happen a lot of the time, but it's not very satisfying. It seems to amount to "this person won't be convinced by my unconvincing argument, so the conversation wasn't worth having". (Unconvincing in that you're basically asking them to take it on faith rather than demonstrating it rigorously.)
>you might both find agreement in that nobody has ever fallen off the edge of the Earth.
You've revealed yourself as not having actually found common ground with flat earthers. Evvvvverybody knows that it's not about travel distance. You can't fall off the edge of the earth because of the ice walls. It's not like it just ends. That'd be silly. The oceans would drain out into flat space.
I think that interpersonal relatedness or 'connection' relates to the degree to which people are willing to have their minds changed, or perhaps more accurately, admit to having their mind changed. Being open to admitting a change of mind requires a certain amount of vulnerability that can be difficult or impossible, depending on how well you know the other person.
Anecdotally, I've noticed that in group interactions where time is invested up front in 'ice breakers' (the non-cringey kind) or other exercises where the objective is not about the topic at hand but rather to better get to know the group participants, this can help build interpersonal relations. Seeing others as more 3-dimensional helps with social bonding and thus makes us more open to being more vulnerable.
This mode is essentially universal. Our brain is a world-modelling and outcome-prediction machine. Its being wrong would be an existential threat in the environment it evolved to function in. This is why being fundamentally wrong is extremely uncomfortable for us.
The difference is in the level of abstraction an individual takes their stand at and becomes uncomfortable when challenged. Which, I think, depends on how grounded their beliefs, knowledge or opinions are - how many layers and columns of "I'm still right" they can fall back to.
You ask people "how did you arrive at this conclusion?". That's a good invitation to extract a walkthrough of how the conclusion was reached - what observations, data, evidence, or references informed it.
Even worse than this, sometimes I find myself changing my mind internally but feeling compelled to defend my initial opinion, as if being convinced is having lost. That’s definitely detrimental to a conversation.
Yes. This can happen when the person you're talking to is in the right but is being arrogant/obnoxious/smug about it. Publicly agreeing would be rewarding rudeness. I don't think that's even such a bad thing; they might only be right once, but they'll probably be rude forever, so why ever give them the satisfaction?
Effective arguing doesn't have to look like "arguing". You have to leave people an off-ramp so they can change their minds without feeling like they are losing a status contest.
Most people are emotional, not rational. Even people who are supposed to be rational are rarely unemotional about their beliefs.
It's why facts and evidence are so unpersuasive. If you don't make them emotionally and personally relatable most of the population won't understand what you're trying to say.
Which is why the rule is "People won't remember what you said, but they'll remember how you made them feel."
Isn’t it hypocritical, though, to criticize people for having their egos attached to their opinions? I mean, who hasn’t? It’s likely that we only think better of ourselves when we are disagreeing with others on topics that do not matter to us deeply (but do to them - so really, who’s being the bad person here?), but everyone has something they deeply believe in and upon which they anchor their identity. Otherwise, why would we bother arguing with others if we didn’t believe that our beliefs are the correct guiding principles for the day-to-day actions that define our identities?
I think there is a worthwhile distinction between the values you hold, and specific ways of implementing those values. When disagreeing with another person you're far more likely to differ on the exact road you want to take, but not actually the destination - presumably because a lot of individual assumptions play into the implementation you deem ideal.
To give an example, I do attach my ego to a lot of my values. But at the same time I know that I am fallible, and more likely to be wrong than right on any given topic, even if I do my best to research it. So if I really hold true to my values, I can't stomp my feet and say "But we have to do it this way!" if someone else can reasonably argue for a different (and potentially better) road to take.
Most points of contention are complex, and there isn't a watertight argument one way or the other. Most people aren't very convincing in the best of circumstances.
Thus, most people don't change their mind because they don't get presented with a sufficiently good explanation to move them from one not-watertight position to another not-watertight position.
Though he primarily talked about religion and politics (it's an old essay), I see people attaching other concepts to their identity, and then verbally fighting to the death if you point out the slightest flaw in any of these concepts.
I'm curious about why you would want (or, expect) an institution to resolve a cultural or interpersonal phenomenon. Maybe I've misunderstood your meaning or intent. It strikes me that this is a behavior, and if it's common, the best way to address it is by changing the norm through positive examples and gentle feedback.
Off the cuff, looking to an institution to shape some behaviors has not generally gone well for humanity. I'm sure that we can think of counter-examples, but this seems to be contrary to what makes Western culture successful.
It must come from outside pressure. Internal dynamics of institutions work in the retrograde direction: Individuals most easily rise to power by flattering prevailing opinions.