The ability to tolerate some amount of cognitive dissonance is one of the best things I've developed in myself recently. In coding, for example, I used to suffer mental anguish over seeing inconsistencies in the codebases I was working on and had an OCD-like drive to fix them. Learning to live with those inconsistencies freed me up to work on what actually matters.
Furthermore, the unforgiving drive for consistency is a reason why people don't update their beliefs when new evidence comes to light. In Superforecasting[1] (a book about people with an unusual ability to forecast the future), the author says that one common trait among superforecasters is a larger capacity for tolerating cognitive dissonance. The drive to avoid cognitive dissonance shackles you to your existing beliefs (see confirmation bias).
The ability to deal with uncertainty is not the same as cognitive dissonance. There is no cognitive dissonance involved in accepting that either of two mutually contradictory propositions, P or Q, could be true; you just don't know yet which one is. Cognitive dissonance is when you believe that both P and Q are true at the same time. Humans have a tendency to jump to conclusions because they don't like not knowing something, and any answer is better than no answer. It's precisely those people who can't deal with uncertainty who must deal with the most cognitive dissonance when the conclusion they jumped to turns out to be wrong, particularly when they've turned that conclusion into dogma.
For the most part. Cognitive dissonance is the stress that results from believing P and Q are true at the same time, along with a few other scenarios, such as acting in ways that contradict one's beliefs or being confronted with evidence contrary to one of them (the belief disconfirmation paradigm). We then engage in dissonance reduction to arbitrate the difference and lessen the stress, one of the easiest methods being to ignore or reject the conflicting evidence, or to tune out the voices that are--literally--causing you mental pain.
Socially, belief disconfirmation is arguably the most problematic, because it turns rational discourse into something very, very ugly. Instead of persuading people, evidence and debate can further entrench a given belief. When certain issues become almost identity and lifestyle statements (climate change denial, the anti-vax movement, GMO hysteria, etc.), contrary evidence is perceived as a personal attack rather than as scientific evidence to be weighed in determining fact. Unfortunately, overcoming this isn't easy, and it's made that much harder when people can retreat into their preferred echo chambers rather than being forced to confront the dissonance directly.
Everyone must jump to a conclusion at some point -- you can't possibly collect all the evidence in the world. I think believing that you can do otherwise is itself a case of cognitive dissonance.
If you are only thinking about it, yes. But if you want to make a decision based on that thought process, you have to pick a side at some point, knowing that it might not be the right one.
You can make decisions in the face of uncertainty based on an assessment of the risk and the probabilities of the possible outcomes. If you get in your car you put your seatbelt on without knowing whether you're going to need it for that particular ride. There's no need to "pick a side" on the question of whether you're going to have an accident on that ride. Being uncertain about something is almost the opposite of cognitive dissonance, which comes from being quite certain about contradictory propositions.
Only if you think your decisions are purely based on fact, rather than thinking of them as based on the best facts you have at the time.
Cognitive dissonance is believing that two contradictory theories are most likely true based on the evidence you have. Since the evidence of the truth of one is also evidence for the untruth of the other, it means that when evaluating the evidence for one, you discount a different set of evidence than you do when evaluating the evidence for the other.
This is never rational, and never something that you have to do. It's usually done to avoid conclusions that would force you (according to your own ethics) to give up something that you have, or not take something that you want.
> This is never rational, and never something that you have to do.
Not sure I agree with that.
The real world is full of mutually exclusive moral problems, like when "do unto others" runs directly into "first do no harm" when deciding to give help to someone whose bad behavior you might be enabling.
The problem is continuing to hold both beliefs. You can't have two prime directives. You can't say "never do harm to anyone" and "always defend your children." You have to explicitly prioritize beliefs that have the potential to cause dilemmas.
During an interview, one of my answers was: "For me - it sounds shit to live ambitions!"
And the answer was like the old Adam and Eve thingy: "But shit is a super fertilizer; it gives a flimsy plantlet growth and thriving."
"Because I have seen where you're going with this," I answered, "let me tell you that some people think you can make every stand - when you only ground it."
"The test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function." (F. Scott Fitzgerald)
I've also noticed something about the term "cognitive dissonance" in popular discussion. It's used as a pejorative explanation when the speaker can't figure out how someone else harmonizes two apparently inconsistent or contradictory positions. But when it's a speaker responding to someone else's accusation of cognitive dissonance, something like Fitzgerald's line usually comes up.
It's probably a useful concept in the context of doing actual psychology (either science or therapy), but I've come to believe that in an argument it's often a form of ad hominem attack, like most stories participants in arguments tend to spin about each other.
> Furthermore, the unforgiving drive for consistency is a reason why people don't update their beliefs when new evidence comes to light.
Only if you reject the evidence that does not fit your beliefs. If you use the new evidence to examine and update your beliefs, then you can be open to new evidence and still avoid cognitive dissonance.
There is a difference between managing probability mass and "managing" cognitive dissonance. If one of two events will occur, or only one of two hypotheses can be true, with equal probability according to the information you possess, it is not cognitive dissonance to hold that fact in your head. Cognitive dissonance would be more like formulating two hypotheses in logical contradiction to each other, and insisting they must both be true.
I don't think they are that different. First, the basic insight from Bayes is that beliefs of all kinds can be represented as probabilities. Second, as @dancompton alluded to, most things are not so cut and dried that you can say they are logically in contradiction. There is usually a compatibilist view or a third alternative.
If you are presented with evidence that requires a Bayesian update, and you are smart enough to figure this out, yet you still do not perform the Bayesian update (or you calculate it incorrectly according to your whims as opposed to the evidence), then I would say you are cognitively dissonant in proportion to the change in probability mass you should have recalculated but did not.
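For concreteness, here is a minimal sketch of the kind of update being described (Python; the prior and likelihood numbers are invented purely for illustration, not taken from anything above):

    def bayes_update(prior, p_e_given_h, p_e_given_not_h):
        """Return P(H | E) for a binary hypothesis H via Bayes' rule."""
        numerator = p_e_given_h * prior
        evidence = numerator + p_e_given_not_h * (1.0 - prior)
        return numerator / evidence

    # Invented numbers: a belief held at 70%, and new evidence that is
    # three times as likely if the belief is false than if it is true.
    prior = 0.70
    posterior = bayes_update(prior, p_e_given_h=0.2, p_e_given_not_h=0.6)

    # The "probability mass you should have recalculated but did not"
    # is roughly the size of this shift.
    print(f"posterior = {posterior:.2f}, shift = {prior - posterior:.2f}")
    # posterior = 0.44, shift = 0.26

Accepting the likelihoods but refusing to move off 0.70 would be dissonance in roughly the sense described above.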
You keep replying to me with one-sentence posts that say little and have no point. If you have something to say or a point you're trying to make, I'd prefer you just do that rather than trying to Socratic-method everyone to enlightenment.
To (try to) answer your question, to the best of my ability given limited information about what your question even is: no, it is not about detecting when that update should be performed. Remember that we are talking about brain states, i.e. the-world-as-you-think-it-is, as it is represented in your mind, and perhaps, if you want to push it, as you are consciously aware of it. If you hold conflicting beliefs but you are truly ignorant of that fact, then you can't really be said to have cognitive dissonance. In Bayes terms, I guess this is analogous to having priors, but not being aware of how to evaluate them against evidence, and maybe not even being aware that you should do so in the first place. Cognitive dissonance, on the other hand, would be having priors, knowing how (and when) to update them, doing the update, and then ignoring the results. Alternatively, it would be performing faulty updates while being at least partially aware of the error in doing so.
If it negates it for the majority of inputs, but you insist it negates it for very few or no inputs, I would call that cognitive dissonance as well. In other words, if it's highly unlikely that two things are both true, but you consider it very likely they are both true, that's probably cognitive dissonance even if the two propositions are not logical contradictions.
Do you disagree that given a bag of all functions which accept N inputs, it is unlikely that 2 chosen at random would negate each other across all N inputs?
Obviously I don't disagree with that. Where are you going with this? People don't formulate their beliefs by picking randomly from a collection of all theoretically-possible beliefs. To expand on your example, if you are selecting from a bag of beliefs according to some criteria (based on some combination of morality, desire, etc) which are themselves in conflict (e.g. "I wish to be feared; I wish to be loved") then I think you are more likely to pick some beliefs which happen to contradict each other (again, if not logically, then for most inputs) than you would just by selecting randomly. Or, probably, if your criteria are more in tune with one another.
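To put a rough number on the claim in this exchange, here is a small simulation (Python; it assumes "functions which accept N inputs" means Boolean-valued functions on a domain of N points, which is my reading rather than anything stated above):

    import random

    def random_boolean_function(n):
        """A random function from n inputs to {True, False}, as a truth table."""
        return [random.random() < 0.5 for _ in range(n)]

    def negates_everywhere(f, g):
        """True iff g(x) == not f(x) for every input x."""
        return all(fx != gx for fx, gx in zip(f, g))

    N, TRIALS = 8, 100_000
    hits = sum(
        negates_everywhere(random_boolean_function(N), random_boolean_function(N))
        for _ in range(TRIALS)
    )
    # The exact probability is (1/2) ** N = 1/256, about 0.004;
    # the simulated frequency should land close to that.
    print(hits / TRIALS)

The point, as I read the exchange, is that total negation is the rare case; most conflicting pairs of beliefs disagree only on some inputs.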
If you are using a Bayesian approach then all new evidence does is adjust the probability of your belief being true. Only those impervious to new information have beliefs with 0 or 1 probabilities.
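As a quick illustration of the "0 or 1" point: priors of exactly 0 or 1 are fixed points of Bayes' rule, so no evidence can move them (a tiny self-contained check; the likelihood numbers are again invented):

    def bayes_update(prior, p_e_given_h, p_e_given_not_h):
        numerator = p_e_given_h * prior
        return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

    # However lopsided the evidence, a prior of exactly 1 (or 0) never moves.
    print(bayes_update(1.0, p_e_given_h=0.001, p_e_given_not_h=0.999))  # 1.0
    print(bayes_update(0.0, p_e_given_h=0.999, p_e_given_not_h=0.001))  # 0.0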
[1]: http://www.amazon.com/Superforecasting-Science-Prediction-Ph...