Ideology Impairs Sound Reasoning (psyarxiv.com)
169 points by Reedx on Jan 14, 2019 | 177 comments

My take on this is that people need to realize that they are not an exception to this. _Everyone_ has an ideology that overrides logic for many topics.

I think the reason for this is that our brains just generally don't operate on a rational basis, because it's not practical for humans. We have to rely on preformed perspectives when it comes to certain broad topics. Another reason is that it's almost necessary to adopt your group's perspectives in order to fit in socially. Or at least it's unlikely you will hold a different perspective if much of your information comes from one group.

I believe the best ways to buck this weakness are:

1. Invest more energy into trying to falsify your favored beliefs than into confirming them.

2. Embrace humility, since we're always wrong sometimes. Humility to me means: you can have convictions, but you should never assume that people who disagree with you are stupid or evil.

I’d add two other ideas that I try to adhere to, continuing your list.

3. Try to adopt the perspectives of those you disagree with, especially when you’re finding yourself ascribing malice, stupidity, moral failing, or just find yourself baffled by someone. Such feelings are often a sign that you don’t understand where someone is coming from, and though you may still disagree with them, you’ll have a better platform to discuss matters.

4. Try to see yourself and your time through the same lens you use to look back on history. We're not special or unique; we're going to be seen as primitives eventually, and it can be helpful to imagine in which ways that judgment might be levied.

Both of these help to broaden perspective and deepen empathy, while also divorcing yourself from a strict "first person" or tribal view. They're also a couple of concrete steps for gaining some distance from your feelings in the present, which makes the earlier steps easier to achieve. You'll also find that the people you end up despising form a much smaller pool than most people's.

"It is the mark of an educated mind to be able to entertain a thought without accepting it." - Aristotle

Then what kind of mind not only refuses to entertain novel thoughts, but will condemn anything that even contains a whiff of what seems to be different thinking? It's easy to see the answer in history.

This was one of my dad's favorite sayings to me when I was growing up: "I don't know, I fear what I don't know, I hate what I fear, I seek to destroy what I hate." It was always presented in this rhetorical manner, in the first person, and would come up anytime ignorance, fear, hate, or rage/war was the topic of conversation. Needless to say, it was repeated a lot. In hindsight it definitely helped me build empathy towards others and illuminated the motivations of so many. Also, it's worth adding that it was never brought up as a condemnation of, nor an excuse for, the actions of others.

An uneducated mind?

Plenty of educated minds have engaged in just this.

Another concrete suggestion: read books you know you'll disagree with. It's really really hard, but great at honing critical thinking skills and forces one's self to acknowledge when the author has made a good point, and then reflect on why you still disagree with it.

> It's really really hard

How so? I find it interesting to see how people frame and rationalize ideas that I dislike or disagree with.

Maybe it's just me, but it takes a lot more effort to read stuff I disagree with than, say, a novel or a book whose major line of thinking matches mine.

Torture yourself. Uncertainty is discomforting. If thinking and deciding is easy for you - you are not doing it. Intellectual honesty is masochistic.

We must be very different people. Whenever I realise I’ve been wrong about something for a long time, I’m fairly excited about it.

And I’ve been very stubborn in the past. The biggest struggle was being able to accept being wrong in public. It’s much easier to have a ‘holy shit’ moment in private!

You could say the same thing about physical exercise.

Point 4 is an important one for me in many discussions:

- we are not talking about a blank slate, we are talking specifically about position X in this very specific path from an animal to.... something

we cannot judge others without remembering their entire upbringing, and the fact that in order to operate we need to use rules of thumb, as there is not enough processing power in the universe to do it any other way.

Really trying to accept this helps me take number 4 there to a very real place, where I absolve myself of being "right" and focus on "trying to improve things and give an honest account of the results to help us all move forward".

Thinking of this in terms of others' ideologies: we cannot logically think through what is right or wrong, so we NEED people to believe "wrong" things as we develop/evolve, to ensure we are correct. Negative testing is tedious but very, very important.

I think this value in trying "provably" wrong things is a real part of what causes us to be who we are (e.g. it could be partially responsible for stubbornness and counterculture).

> Invest more energy into trying to falsify your favored beliefs than into confirming them.

On the other hand, ideas are in a competition. If you put most of your energy into falsifying yours, they will lose out against ideas that are ruthlessly championed regardless of merit. (I think we can agree that the best ideas don't automatically win).

So it's a bit of a pickle.

If your ideas are not constantly put under scrutiny, how do you know they are sound and valid?

As I said, it's a bit of a conundrum, and I for one, have been very much on the side of questioning my own ideas more than anyone else would.

However, as I said, I have come to the conclusion, alas only recently, that that may not be such a good idea.

To answer your question: the world will tell you.

Yep. Steelman instead of strawman - seek the best form of the opposing viewpoint. Remind yourself of the times you were confident about X and turned out to be wrong.

I've also found it useful to think about it this way: I don't want to be wrong a second longer than necessary. (credit to Sam Harris)

> Steelman instead of strawman - seek the best form of the opposing viewpoint.

That's gold. Thank you.

>seek the best form of the opposing viewpoint

Strongly agree with this. Let's try a specific example on HN:

Seek the best form of the view that the anti-vaccine people may have a point.

It is not about right or wrong, it is about trying to understand why they made this decision.

And that could mean putting yourself into the position of someone with very particular experiences with the health system (e.g. you were very vulnerable and doctors treated you like a piece of meat) on top of other circumstances (you have never been so ill that modern medicine saved your ass, most of your very real health problems have psychosomatic roots, and the neighbour's wife with magical esoteric healing abilities was the only one who really talked to you about it).

This is about understanding who these people are, where they come from, and what led them to their (at times utterly wrong) beliefs. Doing that will not somehow stain your beliefs (unless you built them on similarly shaky grounds). Most people/nations/companies think they are the center of the universe and that they are doing the right thing, sometimes simply because they see themselves as "The Good" side.

They developed some sort of culture, rituals, or superstition that backs their behaviour. Of course this gets harder and harder to decipher with each iteration they take down the devil's spiral into irrationality, until you look at them from the outside and tell yourself: no way on earth do I understand where they come from with their &)$&)!&@) beliefs.

But if you want to criticise effectively, you need to know how to dismantle these beliefs, and in order to do that you have to understand (not agree with!) their position.

If you feel like you are giving someone too much power simply by trying to take their perspective, this might have its roots in your own culture, rituals and beliefs.

So in the case of some antivaxxers, the best valid ideas that come from their standpoint are probably a criticism of capitalistic medicine (incentive: make money, not heal) and (from their subjective position) untrustworthy medical professionals. On top of that, they can suddenly google every illness on the internet, and they feel like they know more than the experts in the field (this is, by the way, a sentiment of our times, which again has its roots somewhere).

Of course antivaxxers are irrational lunatics, but if you don't just want to yell at them, and instead want to use rational arguments, then understanding their background will help tremendously.

>Of course antivaxxers are irrational lunatics, ...

Why is that "of course"?

It looks to me like you haven't got the point of what #ReedX wrote at all:

>Remind yourself of the times you were confident about X and turned out to be wrong.

Or #haberman:

>Embrace humility, since we're always wrong sometimes.

Imagine your plane crashes in the middle of the desert. You are one of the survivors. One of the other survivors says he knows how to deal with this and will lead you out of the desert with his navigational skills.

After some time it turns out he believes the earth is flat. You know he is factually wrong and that he will lead the whole group in circles till you starve. So you could embrace humility... but on the other hand, there are facts about the world that you don't really have to seriously debate, unless good points of doubt are raised that you can check. There are situations where not pointing out somebody's wrong beliefs and not demanding a rational explanation from them is the ethically wrong thing to do.

The stakes in this story are starvation; the stakes in the vaccination fad are either killing people with damaging vaccines, or killing people by not vaccinating them. The thing is, this is not just some opinion: the outcome of this will (independent of who is right) kill people.

I am not a person who vaccinates a lot, but I do, and of course I was curious whether there was something behind the claims about the supposed side effects of vaccination. What I did was embrace humility. I decided not to take a stance on this until I had informed myself and talked to people who were convinced of the damaging nature of vaccines. The claims made here would be revolutionary if these people were right.

So you check out where the idea comes from, and you find that one study with multiple methodological failures which comes to a weak conclusion that there is a link between autism and vaccines. And then you find a flood of studies that can't find that link. Then you talk with anti-vaxxers about it, and suddenly you are the devil for even considering there is a doubt. But remember the stakes: in both cases innocent people die!

What I discovered then is that this is not about truth or what is really to be observed in reality; rather, it is a belief rooted in a general "clean eating" ideology where detoxing is the law of the land (all of which seems to be based on a vague gut feeling about one's own body and how it must only be fed with certain substances). For others it was part of a general conspiracy that, after questioning, turned out so cuckoo that I thought I was talking to a schizophrenic patient at a mental facility. So even if it turned out they were right and vaccines have a net-negative effect on humanity, they would be right only by accident, and not because they really had a rational point that anyone could take seriously and follow.

But given the stakes, I considered that they could be right, checked for myself, reached the conclusion that they very certainly are not right, also reached the conclusion that this idea will produce a net result of more harm than good, and decided to speak out against it.

None of the anti-vaxxers I had contact with ever attempted to show this kind of humility or consideration themselves, despite this being a situation like the one in the desert: you weigh your belief against the lives of other people. If I were in those shoes I'd try my best to disprove myself, because the consequences in both directions are not a joke. That is why I am interested in where these beliefs come from and why people stand behind them so strongly. And by my definition, this is humility. I'd take back the "of course", as it displays the certainty I had after doing that research. There are obviously people for whom it isn't nearly as clear.

I would add: try hard to keep the coupling loose between most beliefs and self-identity. The more a particular belief is integrated into one's identity, the harder it is to entertain its possible falsity.

With all due respect, I don't think the article proved or indicated that "everyone has an ideology which overrides logic on many topics."

The study found that people are less successful at spotting logical fallacies when the conclusions are supported by their politics.

This is itself an interesting finding, but (1) I don't think the "everyone has biases" argument is justified by this article and (2) that precise argument is often used by people to apologize for allying themselves with others who have abhorrent views but are otherwise inline with their political interests (e.g. White Evangelicals in Iowa who vote for Steve King). I'm 1000% not suggesting that you're doing this but I think it's an unfortunate conclusion many draw from your argument.

Yes, everyone has some level of irrationality, but not everyone has a sound ideology; many people often waver between conclusions, and human judgement is real, important, and helpful.

I wasn't trying to say that the article proved "everyone" has an ideology.

What the article does say is that among two groups, both groups were influenced by ideology.

And I realize that you and many others feel that you are an exception.

It is my belief that you are not. I believe that no one is an exception and everyone has an ideology. I know that more often than not "ideology" is thought of as something some _other_ group of supposedly _less rational_ people have, but I think that is not the case.

So anyway these are beliefs that I have, so they are not something we can discuss constructively. Due to the nature of belief.

In practice you'll be able to find people who are heavily influenced by ideology and people who are almost not influenced at all, along some normal distribution. It's safe to say that everyone has been influenced by some ideology at some point, but that's not a strong point to make.

Why should we suppose a normal distribution? Ideology is pretty clearly not formed by summing independent events--people's interpretations of things seem influenced by their existing beliefs all the time.

Both may have been influenced by ideology, but were the practical outcomes identical?

What’s a “sound ideology”?

The answer to that question is obviously : "My ideology"

Nonsense, my views are based on sound reasoning, unlike my opponents, who are blinded by their ideology!

> our brains just generally don't operate on a rational basis because it's not practical for humans.

It goes way deeper than that. Reasoning being hard to do well would explain a high error rate, but not systematic biases. I think human reason evolved mainly for (as you said) getting ahead socially, and only secondarily for solving object-level problems. There's a good recent book on this, The Elephant in the Brain.

> I think the reason for this is that our brains just generally don't operate on a rational basis because it's not practical for humans. We have to rely on preformed perspectives when it comes to certain broad perspectives.

I think there is a more fundamental reason, i.e., that there is no complete rational explanation of reality. What we do have are partial rational explanations which work well in some specific contexts but are not absolute: they break down if you change context. See for example relativity vs. quantum mechanics; even the hardest of hard sciences doesn't have a unified, complete theory.

That's why AI based on logical reasoning didn't work, and will never work IMHO. Rationality is a tool that intelligent creatures can use - the most advanced tool, probably - but it's not enough to live in the real world. Some of the cognitive shortcuts we use are "only" necessary to cut decision time, but many are fundamentally necessary for adaptation to our environment.

This was at one point (during the AI winter before 2006, I believe) referred to as the "starving at the door" problem.

The "problem" was that a rational robot (or rational human) needs to decide whether or not to open a door. He can't just stand in front of the door forever, because eventually he'll die. So the "cost" of waiting is small, but accumulating. And of course, we know this is stupid.

He also cannot open the door. There might be a bear behind the door, which might kill him. Small, but nonzero chance of that happening. Cost of death is inf (because it fails all other goals, it must be a higher cost than the sum of failing everything, therefore effectively it's infinite).

So a rational robot/human will never open the door. Rather he'll die in front of it, from exhaustion. The fun thing is that you actually see this behavior in rule based reasoning systems. They will refuse to do anything at all, because it might kill them (even if they technically don't know what death is this still happens. They can still add together the costs of failing everything). You have to go in and manually remove these cases if you want things to work.

So yes, you need a belief system. Even a purely rational decision-making system needs a belief system. To solve this problem it needs to believe it will never die (or at least that a sufficiently small chance of death means death never occurs). The fun thing is, this is truly a belief: it's demonstrably wrong. In a practical system there will be a lot more beliefs than just this one.

> So a rational robot/human will never open the door.

If the door leads from outside to inside, there's also a small chance the robot might die if it doesn't go inside (a passing car might hit it, anti-AI vandals might destroy it, etc). It's probably safer inside. So perhaps we could say that a rational robot/human will never go outside.

Interestingly, some people do exhibit this behavior (agoraphobia) and it's generally considered anything but rational.

The problem for the AI can be fixed, without a belief system, by assigning a lower cost to danger. Infinite cost is too high because it makes probability irrelevant. A finite but very high "cost of death" would still avoid likely death without being paralyzed by the possibility of unlikely events.
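This can be sketched as a toy expected-cost comparison. All the probabilities and costs below are made-up illustration values, not from any real system:

```python
# Toy sketch of the door-opening decision: an expected-cost agent.
# All numbers are hypothetical illustration values.

P_BEAR = 0.001        # assumed small chance a lethal bear is behind the door
COST_STARVE = 1000.0  # guaranteed cost of waiting forever (starvation)

def expected_cost_of_opening(cost_of_death):
    """Expected cost of opening the door, given the cost assigned to death."""
    return P_BEAR * cost_of_death

# With an infinite cost of death, probability becomes irrelevant: any nonzero
# risk makes opening infinitely bad, so the agent waits forever and starves.
paralyzed = expected_cost_of_opening(float("inf")) > COST_STARVE

# With a finite but very high cost of death, the comparison becomes sane:
# 0.001 * 10000 = 10, far below the certain cost of starving.
sensible = expected_cost_of_opening(10_000.0) < COST_STARVE

print(paralyzed, sensible)  # True True
```

Whatever finite value you pick is itself a judgment call, which is the sense in which even this "fixed" agent smuggles in something like a belief.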

We sometimes see people deciding that a cause is worth risking, even knowingly giving, their life for, proving that even humans don't weigh the cost of death above all other activities, so it's hard to see why a nonsentient AI should do so.

My opinion is that we could think about an infinite number of factors to consider if we had to do a completely rational analysis of each choice we make. In practice, we can't, and I think nobody could.

Thanks. "Belief systems" is one way to describe those partial theories which work on a practical level. The thing is, you need many, and they're contradictory. I see them as similar to linearisations of complex functions: they work well in a neighborhood, but not outside of it. The real function we'd like to reason about has infinite dimensions and can't really be measured...

"All models are wrong, but some are useful." - George E. P. Box

> So a rational robot/human will never open the door. Rather he'll die in front of it, from exhaustion.

That’s incorrect. The inaction you’re describing is wrong from a “purely rational” perspective because starvation also constitutes death and has a much higher probability given inaction. Your example doesn’t show there’s something wrong with “pure rationality”.

Could this not be solved by simply adding a rule "if probability < certain acceptable number then execute action"?

If you want to make completely rational decisions, then you should compute those probabilities to know they actually are below the threshold before discarding them. Considering that the number of possible (albeit improbable) events to consider is practically infinite, this would still take a practically infinite amount of time.

What we could do instead is discard lots of possible events without even considering them, because they just look intuitively unlikely enough. In other words, using the so-called "cognitive shortcuts".
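The threshold rule proposed above can be sketched in a few lines (the threshold value and the event list are hypothetical illustration values):

```python
# Toy sketch of a probability-threshold "cognitive shortcut": discard events
# below an intuitive cutoff instead of analysing every one of them.
# Threshold and numbers are hypothetical illustration values.

EPSILON = 1e-6  # events less likely than this are ignored without analysis

def effective_risk(events):
    """Sum expected costs over (probability, cost) pairs, skipping any event
    whose probability falls below the threshold."""
    return sum(p * cost for p, cost in events if p >= EPSILON)

events = [
    (0.001, 10_000.0),    # bear behind the door: considered
    (1e-9, 1_000_000.0),  # meteor strike: intuitively discarded
]
print(effective_risk(events))  # 10.0
```

Note the catch the parent points out: deciding that an event's probability is below EPSILON without actually computing it is exactly the intuitive, non-rational step.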

> that there is no complete rational explanation of reality

... that fits inside a current day human's brain. I see no reason to believe this is true fundamentally.

Of course we can't prove anything about possible future non-human brains, but try to think about it this way.

Is there a limit to the number of consequences that a single action can have? If you think about all possible future consequences, then the answer is clearly no. Using an example from another comment, opening a closed door could bring you to your death, which has infinite consequences for your (potentially future) family.

Therefore, if rationally analysing each possible consequence takes any non-zero amount of time, deciding any action rationally would take an infinite time. You could try to rationally decide what is the maximum amount of time you can spend on each choice, but deciding that amount rationally would require infinite time.

This isn't a proof, but this consideration comes from spending a lot of time thinking about rationality and AI.

I think you're conflating two things: solvability and rationality. Your example shows that nothing may be completely solvable, because the cost of calculation is too high. It says nothing about the universe being fundamentally irrational.

The problem I had in mind is if it's possible, for a human, to decide what to do in each situation based only on rationality.

I wouldn't know how to define if the universe itself is rational, in an absolute/abstract sense.

They were referring to present time, not a hypothetical future of more complete explanations of nature and more powerful human brains for conceiving and storing them.

That said, there is a reason why it may not be possible: human consciousness can only exist at a higher level than the most fundamental level of nature, and on that basis, consciousness may never be able to perceive enough of the fundamentals of nature to be able to formulate a complete explanation of it.

However it's not really relevant to the topic at hand. I'm sure humans will always use cognitive shortcuts like ideologies in order to navigate our daily experience, for reasons of speed and efficiency.

On a fundamental level, the decision to stick with 'rationality' as the basis off of which to make decisions is its own kind of ideology.

"Rationalism" is an ideology$. It elevates ideas arrived at by rational arguments to a special status. Other sources of knowledge, such as empirical knowledge, culture, and tradition, as well as biological instincts and gut feelings, are often disregarded by rationalists, which can potentially prevent adherents from reaching their goals.

"Rational" is also a word that can be used for rhetorical purposes. If one claims that something is "rational", it is often precisely because it is not obvious to the audience that it IS rational. The claim is an attempt to influence the listener to believe in the truthfulness of that something.

Similarly to when people say "trust me", such a claim may in itself be a reason to be a bit cautious.

"Rationality" is a tool that can be wielded in many ways and for many purposes. Properly used, it is a good source of knowledge, but it needs to be combined with empirical knowledge as well as heuristic information stored in social groups and even our DNA, even when not fully understood.

Edit: $ From the definition of "Ideology", epistemology is specifically excluded. Since Rationalism deals with epistemology, it is technically excluded from the definition, as long as the Rationalism is in good faith, and not driven by ulterior motives. This distinction does not make a difference for the rest of the argument.

> From the definition of "Ideology", epistemology is specifically excluded

Considering there are epistemological ideologies, I'm not sure how you can claim this.

For example, in the West (and now common throughout the world, since most of the world has adopted Western epistemology, whether they admit it or not), we believe in the scientific method -- that if we can observe, measure, and replicate an experiment, we have established 'empirical truth'. The idea that such a process can establish universal truth (rather than simply be individual measurements of how the universe acted at a certain point in time) is based on the Abrahamic notion of an unchanging God. It is certainly an ideology in and of itself. It may be a useful ideology, but claiming that it is not an ideology is just dishonest.

You are right.

I checked the definition on wikipedia after writing my comment:


But further reading shows that my correction was wrong, and my initial assumption was correct.

Thanks for pointing this out :)

Rationality is just a tool, which can be used or misused. That said, it has a really impressive list of accomplishments when it is used well. The rationalist view and rationalist methods have created more wealth and alleviated more human suffering than most any mystic of the world pre-1800s would ever have imagined.

> That said, it has a really impressive list of accomplishments when it is used well

Of course it does. Almost every person in the history of mankind will have claimed to act rationally, including the scientist inventing better means of wheat production on fewer acres, as well as the voodoo witch doctor trying to do the same thing.

By necessity, whatever works is going to claim to have been inspired by rationality.

whatever works is going to claim to have been inspired by rationality

Everything that works turns out to work through rational principles subject to the scientific method, enabling improvement through engineering, often enabling the creation of wealth.

Funny that.

People would do well to remember that the ideology of the Nazis had deep roots in rationalism and ideas about the enlightenment.

Rationality is not an unmitigated good when not tempered with humanity and empathy for others.

A decent discussion on this topic:


Rationality is not an unmitigated good when not tempered with humanity and empathy for others.

No disagreement at all.

What do you mean, exactly?

1) Objectivity is a kind of ideology? This is objectively false ;)

2) Assigning desirability to that which we perceive as rational is a kind of ideology? In this case, can something so universal that it's surely biologically hard-coded be considered an 'ideology', esp in the context of the article?

3) Choosing to reason rationally instead of ideologically is an ideological choice (or perhaps even a choice motivated by the kind of ideological reasoning discussed in the article)?

4) Something else entirely?

Maybe he means the fact that some people seem to believe only in the kind of rationality that splits things up, decomposing everything into units or facts, instead of relying on intuition / gut feeling / the indescribable knowledge you have accumulated and that is within you. And (just guessing here) thinking that the analysis you have done reaches conclusions that are indisputable, which is surely not the case, because everything you rationalize is relative and rooted in your view of the world, your assumptions. On the other hand, you can always know more; you have no time to know all the facts, no time to weigh all sides of things, except when playing tic-tac-toe. Edit: Rationalization often leads to an imbalanced position as a whole, because it's so difficult to avoid over-optimizing certain areas of your life while not neglecting other aspects, especially when you rely on the IDEA of how things should work out (and, let's say, not paying attention to what your body is signaling to you). If on the other hand you mean that it's rational to listen to your body... see where this is going...

Rationality is not ideology, but epistemology and ideology are often closely related.

Also, it's not as if rationality is "above the fray", so to speak. Differing relationships to rationality are at the core of major differences between schools of thought in both international relations and economics, for instance.

I think of it as different errors in perspective about what is rational, but rationality is still the virtue regardless.

You're redefining words; rationality means something much more specific than "not magical thinking".

1) Saying that you can reach objective conclusions about what actions to take purely based on rationality, without making axiomatic assumptions, has all the same problems as doing so for mathematical theorems.

Objective theories for how to act (including morality) require that all subjects that could possibly be covered by the theories are 100% in agreement with all axioms. Also, all theory on top of the axioms must be 100% stringent.

2) I think you take the argument the wrong way. Our ideologies tend to influence both what we perceive as rational as well as how important it is for us that something is rational (as opposed to socially acceptable, comfortable, self-serving or a number of other metrics we can use to measure the value of the "something")

3) Rationalism is a well documented ideology (see my other post).

The mistake is to think you can reach a state of "objective" to begin with.

Spot on. The word "rationality" is thrown around as if it were possible (and desirable) to reach an objective view on every subject.

> Objectivity is a kind of ideology? This is objectively false ;)

The number of things that can be said universally objectively is so vanishingly small that it's not even worth considering them for the most part.

> Assigning desirability to that which we perceive as rational is a kind of ideology?

No. All people will label what they perceive as rational as desirable. The issue is that 'what we perceive as rational' is almost necessarily based in our underlying ideology.

> In this case, can something so universal that it's surely biologically hard-coded be considered an 'ideology', esp in the context of the article?

What are you claiming is biologically hard-wired? First-order predicate logic? Propositional logic? If it's hard-wired, why are there college classes dealing with introducing the topic?

Moreover, the existence of languages that are unable to express first-order predicate logic indicates that it is not hard-coded.

Finally, given the variety of human language, it does not seem clear that you can make these claims of 'biological hard-wiring' without justification.

> Choosing to reason rationally instead of ideologically is an ideological choice (or perhaps even a choice motivated by the kind of ideological reasoning discussed in the article)?

Can you please explain what system you use to reason rationally? Can it add two numbers together? If so, it has an ideology behind it (see the incompleteness theorems), as there are statements in your logic that it can never prove.

If your system is incapable of basic arithmetic, then I question the value it has in everyday problems.

> Something else entirely?

The constant claim I see on HN, Reddit, and social media in the 21st century, from people claiming to act 'rationally' via movements such as LessWrong, RationalWiki, etc., is that such motivations are not ideological. However, it is clear that they are ideological, because they often elevate what amounts to first-order logic plus the scientific method as the means of discovering truth. First-order logic with the law of excluded middle is unsound, and the scientific method as a means of establishing truth is based in the notion of an unchanging transcendent reality (in other words, a shadow form of the Abrahamic God). If this is how you want to approach life, you should be aware of your own implicit ideology while doing so. Claims of 'rationality' are simply dishonest (note, I'm not saying they're wrong, just dishonest).

If we're being honest with ourselves, we all have an ideology, a way of explaining the world, and if we're honest with others we will not claim that we're being 'rational' without first explaining our ideology.

I think you need to justify this claim. I don't think rationality is fundamentally just the same as any other ideology; in fact that seems fundamentally wrong to me, the same way that claiming that opinions trump observable facts would be.

Can you point me to a system of logic that is both able to make substantial claims and for which every statement in that system can be proven?

Mathematics says you cannot (the incompleteness theorems), so there will always be a fundamental truth you will need to subscribe to. This indicates that those claiming 'rationality' almost certainly have a belief that is irrational even according to themselves (if they are honest enough to admit it).

Here is an example. Suppose we are arguing about the need for a certain tax. The point of contention is whether the tax will mainly benefit the rich or the poor. You start out with an argument: even if the tax benefits those not in poverty, the effects will trickle down. Regardless of your opinions on trickle-down economics, the argument was unsound from its very first words. 'Even if' is a form of argument via excluded middle: it presupposes that either the tax will help those in poverty or it will not. The law of excluded middle forms unsound logical systems. There are ways around it, but I guarantee you will not see them in 'rational' discourse.
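The claim that 'even if' smuggles in a case split can be made concrete in a proof assistant. A minimal sketch in Lean 4 (using only the standard library): in Lean's constructive core, `P ∨ ¬P` is not a theorem; it is available only as the explicit classical axiom `Classical.em`, which is exactly the unstated assumption an 'even if' argument relies on.

```lean
-- In Lean 4's constructive core, excluded middle is not derivable;
-- it must be invoked as an explicit classical axiom.
example (P : Prop) : P ∨ ¬P := Classical.em P

-- An "even if" argument has this shape: a case split on P,
-- which silently assumes the split (Classical.em P) is exhaustive.
example (P Q : Prop) (h₁ : P → Q) (h₂ : ¬P → Q) : Q :=
  (Classical.em P).elim h₁ h₂
```

Whether one regards that extra axiom as a flaw or as harmless is itself part of the disagreement here; the sketch only shows that it is an assumption rather than a theorem.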

In the context of these everyday absurdities, and the fact that most logical systems are -- by necessity -- incomplete, it's almost invariably more interesting to talk about the unproven statements (the ideology) than the system of logic surrounding them. In other words, the ideology is ultimately what matters, even to rational thinkers.

Correct. This is what the Age of Reason and The Age of Enlightenment were all about.

Philosophy underpins ideology to a large extent.

> I think the reason for this is that our brains just generally don't operate on a rational basis because it's not practical for humans.

But that is an ideology! It isn't even wrong; what's wrong is to say it's the opposite of logic.

> We have to rely on preformed perspectives when it comes to certain broad perspectives.

That is generally true, but if the perspective is limited, we start speculating, making and testing assumptions. That's not an ideology per se. The thing with ideology is that long-distance planning requires sticking with one theory to see it through. That's why discipline, form, and idea are somewhat synonymous; think about what it means when somebody says someone "had no clue." As a corollary, different people might partially succeed with different attempts and reach different conclusions. These might be mutually exclusive, if leaning on probabilistic arguments, or just not evidently the same, if it's not obvious how to correlate the experiences. These are different meanings of ideology, and the contradictory sense is rather euphemistic or derogatory.

> Another reason is that it's almost necessary to adopt your group's perspectives in order to fit in

That might be dogmatism and virtue signaling, which to a degree signals subordination. That's less than ideal, but the alternative fight to death is often farther from optimal.

> _Everyone_ has an ideology that overrides logic for many topics.

Indeed. Both sides of the political spectrum have this! As a person who values truth and knowledge, let me warn my fellow people: Beware of jingoistic and tribalistic claims that reality has a [political-descriptor-here] bias. No human being is free of the need for a little self doubt and introspection. No ideology can act as an oracle, and ideologies which have made that claim throughout history have acted against the truth and even given rise to tragedy and atrocity. If you have reached the point where you believe your opponents are somehow lesser than you are, and that you no longer need to convince but instead must coerce their cooperation, you need to take a step back and beware. No belief system, philosophy, or intellectual movement is so clued into ultimate truth that its adherents can afford to cease questioning themselves. It's precisely those groups of people who cease questioning themselves and suppress the questions from others who are most at risk of becoming history's villains.

(One of my most disillusioned days was when I saw fellow atheists wearing their atheism like an arm band, using it as a pretext to declare their superiority over fellow human beings, and intoning it like some sort of religious creed. Everyone was wasting time preaching to the choir, and as I was at a musical gathering and just wanted to play music, I called it out. Then I watched them turn on me.)

(Both sides of the political spectrum deny science for ideology. They just deny different parts of science. https://www.youtube.com/watch?v=hkdIB03RdBg )

> Both sides of the political spectrum

That's a bit of ideology all by itself. The idea that there are only two sides is faulty.

Note that in context, I'm talking about "both sides of any argument," so I'm deliberately using an abstraction. As for the political spectrum, I'd argue that in 2019, you have to use a 2-axis square at the very least, with the 2nd axis representing Authoritarianism/Anti-Authoritarianism.

I just left that detail out for the time being.

> Both sides

What both sides? (Two? Most political compasses, which are already reductionist, have more than two quadrants.)

It looks like everyone above you in this comment thread is talking about U.S. politics, which is polarized between two political parties, Democrats and Republicans.

As long as the United States continues to use first-past-the-post voting schemes to determine the winners in electoral contests, we will continue to have a two-party system (https://en.wikipedia.org/wiki/Duverger%27s_law).

Hence "both sides."

I’d argue that a centrist also has an ideology. So does a non-voter. The division between left and right is tempting, but it ultimately fails to remind us that most people don’t really fit ideologically into that concept of two sides.

> I’d argue that a centrist also has an ideology.

I'm a left-leaning centrist whose attention has shifted more towards a 2nd political axis of Anti-Authoritarianism.

Oh look a YouTube video to a talking head. I wonder what ideologies he thinks are ignoring science? Maybe an interesting perspective on differences in town-planning outcomes, or a critique of economic schools compared to real world measurements...nope! Of course he has a lot of opinions on feminism, homosexuality and for some reason seems to have a lot of trouble imagining himself discussing these topics without offending people.

> Oh look a YouTube video to a talking head.

A professor of evolutionary biology.

> I wonder what ideologies he thinks are ignoring science?

The left likes to deny social science, biology, and psychology which contradicts its narrative. The right denies climate science, mostly by misstating science and for tribal reasons.

> Maybe an interesting perspective on differences in town-planning outcomes, or a critique of economic schools compared to real world measurements...nope!

Your assertion that this proves something is quite vacuous, as his commentary is mostly aimed at evolutionary biology, which is denied by the left for political reasons, and climate science, which is denied by the right for political reasons.

> Of course he has a lot of opinions on feminism, homosexuality and for some reason seems to have a lot of trouble imagining himself discussing these topics without offending people.

Because he faces lots of underhanded authoritarian tactics for merely talking about scientific truth. He's a Lebanese Jew whose family had to flee deadly religious persecution, who is regularly called a "white supremacist" or Nazi by dishonest people who just want to push a political agenda by sneering and name-calling.

It's high time people started calling out such dishonest smearing tactics and such authoritarian mindsets. What we need in 2019 is discourse and facts, not coercion and smearing.

Your comment is very snarky and unsubstantive. Refer to the main article of this thread.

Believing that all sides are always equal is an ideology in itself, and a harmful one at that, because it prevents you from taking action when there is real danger around.

Democracy is an ideology, Nazism is an ideology, and communism is an ideology, and all three are subject to biases. They are not the same, nor equal in their consequences.

> Believing that all sides are always equal is an ideology in itself, and a harmful one at that

No one is promulgating that. Very specifically, the idea forwarded is that both sides are human and therefore subject to human biases and groupthink.

That's not ideology. That's just well established fact. Also, don't conflate that with my description of the current political climate in the western world, which is largely one of polarization and tribal entrenchment.

Yes all sides are human. Some sides are superior morally, some lie much more and some lie less. Some groups have more groupthink and others less. Some people are more biased and others less.

The differences in degree matter a lot.

> The differences in degree matter a lot.

I'm glad you say that. You should go and look into the frequency of political violence, with an eye to finding things which are not reported by the media. A preponderance of incidents are committed by the extremist far left (Antifa &c) in 2018 and 2019. Then go and look into the frequency of people advocating for and tacitly approving of political violence. A preponderance of incidents are committed by the extremist far left and far right. Really, I'm tired of yahoos who basically say they can do anything, so long as it's no worse than the other side.

In terms of differences of degree, the extremes on both ends of the left/right spectrum seem to be trying for escape velocity from reason and sense.

That's the thing. Nearly everything is an ideology, but that doesn't mean everything is equally valid or equally wrong. But we need to get out of the tribalism and dogma that often comes with any ideology, belief system, -ism or simple opinion, and consider the more tangible ideas, actions and policies instead.

Lately I've tried to avoid talking about -isms. Doesn't always work, because everybody talks about them, and I'm certainly deeply committed to a couple of -isms. But they tend to become both hollow labels and collections of dogma. "Socialism" is an obvious one that I often see abused. People oppose for example universal health care because they consider it socialist, and because socialist Russia/Venezuela was/is bad, universal health care must be bad. It's a stupid way to argue, but surprisingly common. The same is also true of "capitalism" of course. It can mean either the concentration of wealth in the hands of a small elite, or the free market. And even "free market" means different things to different people.

Better to let go of the dogmas and labels and talk about the results we want, while looking critically at what actually works and what doesn't.

Communist Russia was much more than just healthcare or public schools. And it did not get to the violence and oppression by having healthcare or public schools. The complete conflation of socialism, its variants (including democratic ones), and communism did not happen at random either.

Refusing to use names in this situation won't eliminate people's fear; it will merely make it impossible to talk about what they are afraid of. Good-sounding ideas can go really bad, and it is good to learn from history. And I think that one of the lessons of Russian history is that if people are violent before gaining power, or openly celebrate violence, they will be violent after.

Maybe it would also be worth talking about Russian history, and the ideologies that affected it, as they actually were.

> Good-sounding ideas can go really bad, and it is good to learn from history. And I think that one of the lessons of Russian history is that if people are violent before gaining power, or openly celebrate violence, they will be violent after.

Too much of the current left in the US celebrates harassment and violence. There's the bike lock bashing professor. There's the various hippies and other Berkeley residents interviewed by Tim Pool who give tacit support to Antifa violence. There's the simply-conservative father and son who were chased down by Antifa goons shouting epithets and given a beat down. There's the local news reporter who was bashed, his smartphone damaged for simply filming Antifa. There's Tim Pool himself, who is a Korean-American and center-left politically, being called "Alt-Right" so a mob of goons would come and beat him up. (And that's just off the top of my head. There's so much of this, and it's simply not covered by the mainstream media.)

How is it the left can object to violence done on its behalf overseas, but either tacitly support it or even perpetrate and celebrate it as a means of political intimidation? Something has gone corrupt.

Mostly because the left is more than just one group. Some groups are violent in themselves; some are a response to violence from the right. It is not as if the left has a monopoly on harassment or violence. If anything, there is more violence on the right.

Most importantly, the USA is not 19th-century Russia in pretty much any way. You went completely off-topic and also conflated Antifa with social programs, in a discussion about tribalism and biases. Quite an achievement.

The same is true of the right of course. Not everybody there approves of violence, but some do. Some have a long history of violence and harassment, and the president seems to support it.

Part of that, particularly from Trump and his supporters, and probably from Antifa and similar groups, is excessive tribalism. They hate the other side and are eager to hurt them. Trump supporters are certainly quite open about that. Antifa might actually be different. It's also possible they actually believe that violence is necessary in order to stop fascism. But any support they may get from the left is probably pure tribalism again.

Of course there's also another aspect to violence from the right: things like lynchings in the 1950s, harassment of women, violence against Muslims and other minorities. Those stem from pure bigotry and a belief in (usually) white male superiority and a right to punish others if they seem too uppity or seem like a threat. I assume most people on the right abhor this, and yet some of these still happen.

> a right to punish others

This "right to punish others" should be a big red flag, be it from extremists on the right or the left. Both extremes have poor records, historically speaking.

You're getting downvoted, but I can't tell why. Maybe you've offended both sides by contending that their side is as guilty as the other of ideological reasoning?! :)

EDIT: Wow, someone downvoted me literally ~2 seconds after I posted. Good work, Quickdraw!

I'm guessing it was the both sides thing, which assumes a binary opposition as a starting premise when many issues are multidimensional.

The downvotes on your own comment may be because one of the HN guidelines discourages commenting about voting patterns.

The issues are multifaceted but our society is largely and increasingly divided in two. I would love for it not to be, but my aspirations alone don’t change reality.

And I thought the fact that the downvote was there as soon as I posted was the remarkable bit, not that I was downvoted at all.

But the biggest not-a-Trump-tweet political story of the last year is the massive shift of American politics into left-wing, centrist, and right-wing groups as the effect of Trump getting elected ripples through both parties.

Saying this during an era that looks like it will be partly remembered as the one in which the American binary political divide ended seems very odd to me. American society is largely and increasingly not well modeled by 2 clusters of opinion.

I really hope you're correct about a centrist group forming. One of the huge distortions of American politics is that the center isn't explicitly represented, even though that's the actual politics that runs everything.

> American society is largely and increasingly not well modeled by 2 clusters of opinion.

I've seen studies and polls indicating that there are two diverging clusters of opinions and positions, with the left part moving further left, and the right clump moving more slowly right.

> You're getting downvoted, but I can't tell why. Maybe you've offended both sides by contending that their side is as guilty as the other of ideological reasoning?! :)

There is a big part of online society for whom such jingoistic tribalism is thought of as some sort of virtuous, intellectually worthy exercise. I find this highly disturbing. If one is interested in finding the truth, then self doubt and the willingness to ask questions is paramount! It's precisely that "you're either with us, or against us" mentality which is a hallmark of the tyrants and the enemies of reason. It's precisely that mindset that burned witches and imprisoned Galileo.

It's precisely that mindset that gives rise to ideological reasoning.

> EDIT: Wow, someone downvoted me literally ~2 seconds after I posted. Good work, Quickdraw!

It seems like some group of activists has tried to come in to Hacker News and use it to push their agenda. I, for one, do not appreciate their attempts to push their agenda by creating fear of being labeled an "-ist."

> It seems like some group of activists has tried to come in to Hacker News and use it to push their agenda. I, for one, do not appreciate their attempts to push their agenda by creating fear of being labeled an "-ist."
Was the irony here intentional? ;)

> Was the irony here intentional? ;)

I've been around here since 2007, and I'm even on the leaderboard. There has indeed been a sharp change around here, starting slowly in 2014 and more quickly starting in 2016. This sort of outrage-mongering and "you're with us or you're against us, or we'll smear you" anti-intellectual mentality on HN is something new that's come in from outside.

I imagine that they were downvoted for mentioning "Both sides."

It is correct in this case, but in other contexts, it is also a common bit of fallacious reasoning, used to draw false equivalences.

I groan every time I hear someone say "both sides", as if it's just not possible for there to be a simple right answer sometimes. Remember "teach the controversy"?

Edit: not to say that pointing out that "both sides" have made some sort of mistake is never valid; it's just that almost every time I see the argument, it's from someone akin to a flat-earther, climate change denier, or anti-evolutionist who really doesn't have the facts on their side. And it's incredibly frustrating that they expect me to respect their opinions just as much as I would respect someone whose talking points hadn't been debunked over and over already.

I agree, but I think “both sides” has merit when one side is trying to position themselves as champions of reason across the board. It simply isn’t true for any major political ideology (in America, anyway).

But you can use reason to work out that if people with different subjective values use perfect reason to choose their political opinions they will often end up disagreeing.

So it isn't true for anyone (in this universe, anyway).

It's not a fallacy in the case where something does happen to apply to both sides. As far as I am concerned, "both sides" in any debate are human beings, subject to human foibles.

Could be because "My values/beliefs are the ideology free centre and everyone else is ideological" is the most common and boring ideology in North America.

Even the idea that politics involves "2 sides" or that this is a useful way to model things is an obvious ideological statement.

Pointing this out gets old; at this point, even the jokes about how someone always says this, betraying their complete misunderstanding of the entire concept, are old. And since you know this ideology, and you know what works and what doesn't when trying to get someone to confront unexamined ideological commitments ... you just downvote and move on.

I'm certainly not an exception. I wouldn't say that I let ideology override reason, but I let it override argumentation and persuasion, which can easily masquerade as reason.

I think that a certain level of stubbornness may actually be a survival trait, because it protects a highly social animal from manipulation and deception.

In modern society, we are deluged with information with a very poor signal-to-noise ratio and many false leads. My approach is to let my short term decisions, such as voting, be influenced to a certain degree by ideology, but let my ideology be influenced over the long term by a preponderance of evidence. This allows me to function without being jerked back and forth by the best debater or hottest news du jour.

This hits the nail on the head, plenty of engineering and programming types think they are way too smart to fall prey to politics and ideology, and are all the more vulnerable due to it.

There is a reason that the majority of educated political extremists come from STEM backgrounds.


I see it at work in my own workplace. There is someone who is bound by ideology and tries to manage by that (managerial) ideology.

It has led this person to make decisions which do not fit the characteristics of their current company. They have taken their experience from other industries, and from companies with other characteristics, and have tried to apply and shoehorn it. It has led to some poor decisions and execution, as well as inferior relationships with other key players in the company. Superficially, something like Mr. Johnson when he went from Apple to JCPenney.

These studies were conducted along political axis lines, "liberal versus conservative". However, most arguments employed in politics are not hard to invalidate in the strict sense by application of formal logic and counterexamples. That being the case, it seems an obvious conclusion that one preferentially picks holes in the opposition's arguments.

Politics is about vested self interest, not logic.

> My take on this is that people need to realize that they are not an exception to this. _Everyone_ has an ideology that overrides logic for many topics.

Is that what the paper actually says? Based on my personal experience I would think there's a lot of variance but I would be interested to see actual data.

Do you believe it is possible to be ideology-free?

> _Everyone_ has an ideology that overrides logic for many topics.

Yes. However, my ideology is that logic, evidence, reason, and reality are far more important than ideology.

All I care about is evidence and if I hold a position that is faulty I abandon it the moment there is evidence that I'm incorrect.

See, I believe even people who (like me) have a scientific or rational worldview are biased by their existing preconceived premises. You have to start from some perspective. So effectively, this idea that your worldview is totally evidence-based just acts as a blanket, automatic (non-rational) reinforcement for your existing views.

I find this worldview tends to face difficulties in practice because on any reasonably controversial issue you can find solid evidence that 'both' (assuming only two sides here for linguistic ease) sides are wrong, and evidence that both sides are right. So which do you choose? Neither? What if it's an issue that you find particularly relevant? Do you now base your decision on popular vote? Weighted vote? Or do you go with what you personally find most compelling? Something altogether different?

To avoid the hornet's nest that is contemporary issues, let's go back a bit in time. Is Earth in the very center of the universe with literally everything in existence revolving around it, a geocentric view? Or do we live in a universe where the Earth holds no particular relevance and is just another rock rotating around our star, a heliocentric view? There are hundreds of years of astronomical evidence indicating that the Earth is at the center of the universe. But at the same time, you understand that alternative views, heliocentrism in particular, have not been given anywhere near the same degree of consideration. It borders on heresy against the established political forces, but even in academia it turns against centuries of established research. But might we be missing something?

Personally, you find the models used to articulate the astronomical 'reality' of a geocentric universe to be rather unrealistically fanciful. Mercury at some point in its orbit literally stops and just starts going the other way. Above all, the planets have to travel in these really peculiar, swirly orbits that we don't have any physical evidence of in any other phenomena. Perhaps most worrying of all to you, the models used to demonstrate a geocentric universe are completely unfalsifiable. Each time we make a new observation, we simply tack it onto the model. There is no way the model could ever be refuted outside of being able to view the universe through the eyes of God himself, which is something we surely will never be able to do.

So would you choose to believe that the Earth is at the center of the universe? Or might you find it something less than compelling in spite of the fact that 'logic, evidence, reason, and reality' as framed by the times had entrenched it as indisputable fact? In any case there would certainly be no indisputable evidence that Earth was not at the center of the universe with everything else revolving around it.

Geocentrism vs heliocentrism was actually a proxy debate between religious authoritarianism and the first stirrings of scientific empiricism.

You buy ideology wholesale, and ideological thinking is essentially authoritarian. It means you can hide behind something that you assume is bigger than yourself to make your beliefs more plausible and your arguments more convincing.

You can see this reduced to bare essentials when you talk to religious types who "prove" everything with a bible quote. If you don't believe in the authority of the bible this seems ridiculous, and if you do it's completely authoritative.

But science has similar issues. It's not unusual to see "You can't argue with me, I'm a scientist" being used in the fringes of science where it isn't truly justified empirically or theoretically. It's even more popular among technical types who aren't scientists at all, but who use "science" to dismiss opinions they disapprove of as "woo".

Ideology is essentially just people not-thinking in a herd. You can buy your value system wholesale, have it justified by the size and heft of your herd, and persuade yourself that your beliefs make sense - and even that they can't be argued with.

The authoritarian part comes from the political reality that political "morality" is about power, status, and credibility, not about facts or accuracy. In a political moral framework you score points by destroying the power, status, and credibility of your opponent using any means possible - including outright lies and character smears.

Humans seem much more likely to "prove" a point politically than empirically and with intellectual humility and integrity - which is fatal for real knowledge, because making and admitting mistakes in a political frame is a strongly losing move.

In political issues, what kind of evidence do you require? For example, what evidence can settle the question of whether the state ought to use taxes to provide healthcare?

Remember time. One way to think about this dilemma is that evidence is a result of observing reality: evidence is about the present and the past. Politics, on the other hand, is a way to change reality, which will change future observations to yield different evidence. Politics is about making changes now to change the future. Evidence and politics are connected, but separated by time.

Indeed. But questions of how to evaluate results are always answered by way of some kind of preference, desire, or ideology, right? Different ethical principles can be in conflict; there's no objective measurement of the good.

I can answer that one: look at other countries. Look at what they do and compare the results. Some countries use taxes to provide health care, some fund it partially by taxes, some don't use taxes at all. Compare the results: which system gets the best health care for the least amount of money?

That is indeed a good way to gain insight into consequences, but still you will have to make ambiguous decisions when formulating the measure of good healthcare—e.g., do you consider equality for rich and poor?—and when judging tradeoffs like those involved in budgeting tax revenue, etc.

I agree that this is the main point: What are you trying to achieve? If your aim is to increase average happiness, what do you consider to be a good measure of happiness? And what kind of average are you using? Most "ideology" relates to this kind of arbitrary preference, I think.

But there are also other problems with evidence: How well do you trust the authors and institutions that produce or report the evidence? And if the evidence is probabilistic, as most evidence is, outside mathematics, what are your prior probabilities (in a Bayesian sense)?

In the case of international comparisons of health care, and other situations in which there is no repeatable experiment, there's also the problem of whether A caused B or whether A and B are both consequences of something else, perhaps something that the authors had considered and dismissed with some non-numerical justification, or perhaps something that wasn't even considered.

I personally don't usually have strong preferences in politics. Most things make sense (or fail to) in some light.

That's why I have trouble choosing a party. Most issues are a close decision for me, so they don't clearly fall into one party's political view.

I trust people like you, who don't even acknowledge the impact of ideology on their reasoning, even less to be objective.

Do you believe there are things that cannot be understood through reason alone? Are there things that lie beyond understanding? That cannot be understood or discussed rationally?

I think it's often even simpler, where ideology just provides false premises. So people are acting in a mostly rational manner but starting from bad information.

(I get that acting on false information is not rational; I'm arguing that it is frequently the core of the irrationality.)

My point is that everyone has a set of premises. Logically speaking, I think we should be aware we are making a leap when we assume that all of our own premises are correct and the other person's premises are incorrect. I mean there is no logical basis for the premises, they are just assumed by both parties. Even in the case where one set of premises is supposed to be based on science or rationality. They are still just broad assumptions.

Is sound reasoning even needed all the time, though? I have an environmental ideology that causes me not to use cars even when it's much more convenient for me. Have I ignored logical reasoning, and is that even a bad thing?

> My take on this is that people need to realize that they are not an exception to this. _Everyone_ has an ideology that overrides logic for many topics.

Agreed, but it's also worth pointing out that not everyone indulges in ideological reasoning to the same degree in general. I've heard people make related arguments that because everyone reasons ideologically at points, there can be no objective position on <topic>; of course this argument holds little water, and it seems designed to rationalize the conversant's belief to himself rather than to persuade his audience.

> I think the reason for this is that our brains just generally don't operate on a rational basis because it's not practical for humans.

Not meaning to pick on you, but this line of reasoning is ultimately self-refuting.

If we're not capable of operating on a rational basis, then the belief that "we're not capable of operating on a rational basis" is itself irrational and therefore shouldn't be believed.

Since this is self-refuting it must not be true and therefore we must be capable of operating rationally.

Just because it's very hard and, ultimately, not useful for most situations it doesn't mean we shouldn't strive for it.

I don't think you can equate "generally not doing X" to "not being capable of doing X" here

> _Everyone_ has an ideology that overrides logic for many topics.

speak for yourself

i have yet to be shown that it's the case for me, while i have shown many others that it is for them.

Ironically, this in itself is a good example.

A few good books I'm reading / have read on the subject

  The Righteous Mind
  The Elephant In The Brain
  In Defense of Troublemakers
The deck is stacked very far against us cognitively. We are a walking, talking political nightmare unto ourselves and others.

The worst part is it's excruciatingly difficult and extremely unlikely for you to find your own blind spots. So you need to hash things out with other people. As others mentioned the best thing you can do is hold defeasibility and corrigibility as some of your highest values and do your best to understand all the pitfalls in our thinking.

I'm currently reading _The Righteous Mind_. While I haven't finished it yet, so far a major theme has been that reason's major job is merely to justify what the gut instinct has already decided.

Taking that theory to its conclusion, most people are approaching the question of rationality vs. emotion the wrong way by assuming that appeals to reason and logic are the way to convince the average person. Not so; one must win over their emotions first, and then they will construct a logical framework to fit that emotion.

It's a bit distressing to think about, but unsurprising when one reflects on the course of history that humanity has taken, and is taking.

His experiment in one of the early chapters was pretty interesting: they come up with little stories that are for some reason morally repugnant but on closer inspection don't appear to hurt anyone, then press people for their reasons for thinking the acts are wrong, and the subjects wind up inventing justifications that are sometimes pretty far-fetched.

There's another great bit of research like that detailed in The Elephant In The Brain.

In the 60s and 70s, Roger Sperry and Michael Gazzaniga did research (for which Sperry eventually won a Nobel Prize in 1981) on patients who had undergone a corpus callosotomy, so that the two halves of the brain could no longer communicate. They showed an image to only the right side of the brain, then asked the left side of the brain to describe verbally what it saw. The results were really striking: the patients would just invent some kind of fantastical rationalization, and they seemed to fully believe that's what they thought, despite the fact that it couldn't be. I'm explaining it kind of badly, but that part was jaw-dropping. It seems to fit with Haidt's research, though.

I am wrapping up The Righteous Mind and while I think there are a lot of intuitive ideas presented in the book, I have a lot of problems with Haidt's arguments, things he isn't addressing, and his approach to the field in general. It's been a great exercise of critical thinking skills for me.

Anything in particular you think he isn't addressing?

Nothing is jumping out for me personally that I can see, but I know well the feeling of reading all the arguments of one particular school of thought and having this glaringly obvious edge case that no one ever seems to bring up.

Curious to know what those things you see are?

yes, one of the reasons i believe we evolved to be social creatures is that we uncover better answers when we triangulate opinions and facts with others rather than solely relying on our own intuition or reasoning.

Often it comes down to trust, whether that's rational or not. Take climate change. The topic is too involved for most non-scientists to absorb enough to come to an informed decision on their own. Therefore, unless one invests boatloads of time, they must rely on expert opinion for much of their conclusion.

Therefore, climate-change deniers ask if one should trust scientists over their favorite political pundits or favorite CEOs. There are enough incidents of scientists being biased by their funding source to warrant some skepticism.

When I reply, "Don't pundits and CEOs have similar financial biases?"

Deniers typically respond with something like, "Sure, so I have to rely on my gut, and my gut gives them more credit than it gives to scientists. Most scientists come from liberal-leaning universities."

In the spirit of making the best argument for an opposing viewpoint, I think the better argument to make is that climate science is the only scientific discipline that seems predisposed to a certain, predicted outcome.

The argument is made that facts are selected or engineered to support a negative outcome because those producing the science already deeply believe in a particular truth and inject that bias into their work. Any scientist seeking to prove otherwise is silenced or ridiculed by the majority, who happen to be true believers. It only takes a few dissenting voices or a few cases of statistics being "manipulated" to add credibility to it.

The conspiratorial nature of it makes it even more compelling to untrusting, unsophisticated outsiders.

I'm also suddenly reminded of Umberto Eco's book 'Foucault's Pendulum', which deals with belief and conspiracy. While a fun, satirical work of fiction, I found it to be very constructive in understanding how people can come to believe things that are completely wrong.

What a way to fully illustrate the OP's point.

The title is misleading, then. Ultimately, complex topics probably require trust regardless, unless one can spend a lot of time on them. I once debated whether one can clearly determine the Earth is not flat without relying on experts. I found alternative models for just about every home-spun experiment. Maybe with really good weather and carefully crafted survey tools one could rule out the alternative models, but again, that's a lot of time (at least for the Middle Ages, which is where we set our thought experiment; no Amazon.com back then).

> without relying on experts

This does not fly with most complex disciplines where it takes decades to become proficient.

But relying VS not-relying on experts is a false dichotomy. You can choose experts to trust, as rationally as possible.

I'll reword it this way: the more you know, the less you have to rely on experts, or at least the degree decreases. But the practical angle of this is that one will still have to rely heavily on experts until they reach a non-trivial point in their education/study of the topic. It's not realistic for the average person to master every controversial topic in order to vote fully informed.

> I found alternative models for just about every home-spun experiment.

I have put thought into how to prove a round earth and all the models of a flat earth I have seen cannot stand up to basic observations of the sun, moon and stars (over the course of one year from a single location).

I would be very interested in an alternative model that can account for astronomical phenomena that are easily perceived with the unaided eye.

I think one is biased by knowing the answer ahead of time. Pick one of your favorite "home proofs" and we can play around with that one.

I would counter that reasoning doesn't stand by itself. Reasoning is a method of getting from A to B, but you need an A (your axioms) to reason from. It's easy to disparage people whose axioms are different from yours (Jehovah is God/Allah is God/There is no god), but sometimes that's all there is to it. Though once emotions get involved (such as by challenging the political/religious framework on which they've built their life), pretty much everyone will also fail to adhere to strict reason.

Christopher Tyerman's body of work on the Crusades is really interesting in this regard. One of the key points he keeps pressing home is that though we think of Medieval people as ignorant and superstitious, they were in fact highly rational. And they were, as a culture, committed to rational investigation of the world.

But they had axiomatic beliefs that they were building upon. Which meant they were also committed to a rational investigation of the supernatural world. Which we generally view as non-rational.

I'm very much paraphrasing and he would not express it so crudely.

I really really like the premise of this study, but I don't buy the results fully.

The logic in the 2nd and 3rd prompts is subtle. In fact, I don't think the 2nd one is a valid syllogism -- it's unclear whether Judge Wilson believes "if" or "if and only if". In the former case the argument is not valid, but in the latter case it is. I wouldn't expect your average study participant to pick up on the difference.

I am afraid that the only conclusion here is that, in the absence of a clear logical argument to evaluate (whether due to ambiguity or complexity), people fall back on their beliefs.

Number 3 also requires the reader to infer iff.

But it's not an unreasonable requirement, because without inferring iff you get "Judge Wilson believes one has the right to end the life of all living things" and "Doctor Simmi believes the surgery should proceed no matter what".

Still, I imagine that's enough to throw off an unknown percentage of people, alas.

>I am afraid that the only conclusion here is that, in the absence of a clear logical argument to evaluate, (either due to ambiguity or complexity), people fall back on their beliefs.

Still an interesting conclusion though.

For 2 & 3, neither uses "if and only if", so I am interpreting those two as intentionally unsound syllogisms.

Do people commonly expand "if" to mean "if and only if" outside of a conversational setting where it's implied? I would think that in a small body of text people would refrain from doing so.

I'd be interested in having the per syllogism results to see if this impacted the accuracy of the respondents.
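The "if" vs. "if and only if" distinction is easy to make concrete with a truth table. A minimal sketch (the propositions and variable names are my own illustration, not from the study):

```python
from itertools import product

def implies(p, q):
    """Material 'if p then q': only false when p holds and q doesn't."""
    return (not p) or q

def iff(p, q):
    """'p if and only if q': true exactly when p and q agree."""
    return p == q

# p = "the living thing is not a person", q = "one may end its life"
for p, q in product([False, True], repeat=2):
    print(f"p={p!s:5} q={q!s:5}  if: {implies(p, q)!s:5}  iff: {iff(p, q)}")
```

The two connectives disagree only at p=False, q=True: a plain "if" is silent about persons, while "iff" forbids ending a person's life, which is the inference the prompt asks readers to make.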

Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.

This feels like evidence for motivated skepticism: https://wiki.lesswrong.com/wiki/Motivated_skepticism

Before this research, did a lot of people assume the null hypothesis that ideology doesn’t impair sound reasoning?

Psychology seems to be responding to the replication crisis by studying some really obvious truisms.

Before I die, I'll be reading an academic article explaining how the sky is, in fact, not blue.

As far as I can tell from skimming the paper, it fails to establish that ideology impairs reasoning more than any other kind of pre-existing belief.

There is a well-established bias, called belief bias (which the paper mentions), against accepting the logical validity of arguments whose conclusions you disbelieve. The study tested examples of this where the conclusions were political (agreed with liberal or conservative viewpoints), but AFAICT did not use a control test where the conclusions were apolitical, but the participants still agreed or disagreed with them.

A control could have established whether political arguments were more (or less, or equally) susceptible to belief bias. But they didn't use one. So the study only establishes that political arguments are susceptible to belief bias.

Sure, ideology impacts formal reasoning, but if you look at the examples given the participants have good reason to reject the conclusions of formal logic that is based on terrible premises.

Despite the result of a simple sequence of sentences divorced from reality, cigarettes really are bad for you and salads are good (depending on their contents).

In study 1, participants might disagree with dangerous drugs being banned, and they might disagree that marijuana is dangerous. In 2, premise 1 seems relatively straightforward, but premise 2 is a highly ideological belief (which is the point of the study).

I don't think this is very revealing. When you add 1+2 and get 68445788, you are surprised by the conclusion and check your work. People responding to this aren't dumb; they just aren't playing along with what they regard as faulty reasoning. Basically, they are likely saying: I know what the researcher wants me to say, but this is wrong and I won't go along with it.

Sometimes I wonder if the general decrease in trust in society affects psychology research:


The Milgram experiment is the most prominent example; it might not be possible today because it depends on participants' willingness to obey scientists. The Milgram effect is still real, but it has become harder to measure.

So in this case, when participants are asked to reason from premises they don't agree with, they might, as you've suggested, refuse. They might interpret the researchers' intent as nefarious, e.g. "if I agree this syllogism is valid, the researcher will report that I support the conclusion". This line of thinking is of course absurd, but it comes close to what I've observed some people believe about scientists.

Is it? Scientists and PR firms get funded by big donors. Professional scientists are typically honest, but which studies get funded is decided by people who aren't necessarily. The public sort of understands that things are generally arrayed against them, though they can't necessarily explain why mechanistically.

In this case, it might be even simpler than that. It might be that regardless of what they believe about scientists and the formal logic of their assignment, they just don't like being forced to say things they despise. This could be seen as noble, though it's not exactly a plus for debating.

What you're saying and the conclusions of the study are one and the same. Though I would disagree that their questions were like presenting 1+2=68445788. I would suggest they're more like "1 is 32119592 and 2 is 36326196, thus 1+2=68445788". 1 and 2 are not those numbers, but premised that way, the logic formally checks out.

Regardless of whether it's revealing or not, it's a good thing to work towards establishing these sorts of conclusions through studies, so that someday we can hope to have fewer terrible premises.

I have to agree with some of the other commenters that I don't think this study is particularly revealing, because there's a lot of nuance in the questions that doesn't directly follow from logical thinking. I want to address the first two questions.

>All drugs that are dangerous should be illegal. Marijuana is a drug that is dangerous. Therefore, Marijuana should be illegal.

The flaw with this question is that it's divorced completely from the cultural reality we live in. Cigarette smoking is legal, despite being dangerous. The final clause does not actually follow from the rest of the logic, because we have tangible examples of a drug being dangerous and legal. People might think of this and therefore disregard the conclusion. I don't think that would be the result of ideological impairment, but rather the result of people examining nuance.

>Judge Wilson believes that if a living thing is not a person, then one has the right to end its life. She also believes that a fetus is a person. Therefore, Judge Wilson concludes that no one has the right to end the life of a fetus.

This is just poorly worded. If something is not a person, then Wilson believes you have the right to terminate it. Since a fetus is a person, no one has the right to terminate it. That logic does not necessarily follow from the initial statement, because the nuance there is that Wilson is extending her personal logic to everyone else. Again, there's a bit of nuance in that people may personally not be for abortion, but believe that women still have the right to abortion despite their own personal beliefs.

The logic is also biased right from the start, since it implies that Wilson would also believe that killing animals, pets, etc. is OK.

I don't think these two questions really resolve the issue of bias preventing sound reasoning, because it implies that the conclusion of the two questions was logical in the first place. They effectively managed to prove that two highly nuanced questions are, in fact, highly nuanced.

> >All drugs that are dangerous should be illegal. Marijuana is a drug that is dangerous. Therefore, Marijuana should be illegal.

> The final clause does not actually follow from the rest of the logic

The final clause follows formally from the truth of the first two clauses. What you are complaining about is that the final clause is not true. But the question being asked of the study participants is not whether the argument is true, but whether the final clause follows if the first two are true. You are exhibiting exactly the behavior the study is testing: giving a wrong answer to a question of validity by substituting for it a question of truth.

There is, perhaps, an argument to be made that "logical reasoning" in the sense being tested - being able to tell valid from invalid syllogisms - isn't all that pertinent in the real world. Is that what you're saying, though? I find it hard to tell.

> >Judge Wilson believes that if a living thing is not a person, then one has the right to end its life. She also believes that a fetus is a person. Therefore, Judge Wilson concludes that no one has the right to end the life of a fetus.

There is a logical problem with this argument, independent of any of the issues you bring up. Consider it divorced of context:

  W believes: if an X is not Y, then it is a Z.

  W believes: this particular X is Y.

  W therefore believes: this particular X is not Z.
(Key: W = Wilson, X = living thing, Y = a person, Z = may-end-life.)

The problem is that just because (X & not Y --> Z) doesn't mean (X & Y --> not Z). In other words, thinking it's okay to end non-persons' lives doesn't imply thinking it's _not_ okay to end persons' lives. Concretely, for example, one might think killing in war is justified.
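The invalidity can be checked mechanically by brute-forcing every truth assignment to the abstracted propositions. A small sketch (the variable names are mine, following the key above):

```python
from itertools import product

def implies(p, q):
    """Material 'if p then q'."""
    return (not p) or q

# y = "it is a person", z = "one has the right to end its life"
# Premises: (not y) -> z, and y.  Alleged conclusion: not z.
# The argument is valid only if no assignment makes the premises
# true while the conclusion is false.
counterexamples = [
    (y, z)
    for y, z in product([False, True], repeat=2)
    if implies(not y, z) and y      # both premises hold...
    and z                           # ...but the conclusion "not z" fails
]
print(counterexamples)  # [(True, True)] -- a counterexample, so invalid
```

The one counterexample (y and z both true) is exactly the killing-in-war case: a person whose life one nonetheless has the right to end is consistent with both premises.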

The marijuana argument you cited is logically valid. Logical validity isn’t about whether the premises are true. It’s about whether they entail the conclusion.
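Validity can even be checked mechanically: search every interpretation of the predicates over a small finite universe for one that makes the premises true and the conclusion false. A sketch (the predicate and variable names are my own abstraction of the prompt):

```python
from itertools import product

# Form: "All D are I; m is a D; therefore m is an I."
# (Read D as "dangerous drug", I as "should be illegal", m as "marijuana";
# validity doesn't depend on those readings.)
UNIVERSE = range(4)
SUBSETS = [{x for x in UNIVERSE if mask >> x & 1}
           for mask in range(2 ** len(UNIVERSE))]

counterexamples = [
    (D, I, m)
    for D, I, m in product(SUBSETS, SUBSETS, UNIVERSE)
    if D <= I          # premise 1: all D are I
    and m in D         # premise 2: m is a D
    and m not in I     # conclusion "m is an I" fails
]
print(counterexamples)  # [] -- no counterexample, so the form is valid
```

For two monadic predicates, a standard small-model result says a 4-element universe is enough to witness any counterexample, so the empty result really does mean the form is valid no matter what "dangerous", "illegal", or "marijuana" denote.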

Is it, though? Like let's go down some similar lines of logic:

> All websites that are dangerous should be banned. Hacker News is a dangerous website. Therefore Hacker News should be banned.

> Cigarette smoking causes cancer. Things that cause cancer should be illegal. Therefore, cigarette smoking should be illegal.

> Things that can harm children should be taken away. Phones can harm children. Therefore, phones should be taken away

All of these function as the exact same logic posited in the study. A is B. B is C. Therefore, A is C. However, this really doesn't apply to the categories above because when you start talking about 'dangerous' or 'causes cancer' or 'harms children' etc there is a wide berth for rational disagreement.

> when you start talking about 'dangerous' or 'causes cancer' or 'harms children' etc there is a wide berth for rational disagreement

That doesn't matter as long as the meaning of words remains the same throughout the argument.

The meaning of the word can remain the same, but the way something has the property of said word can vary drastically. As a thought experiment:

Dangerous things should be banned.

Riding your bike without a helmet is dangerous.

Doing cocaine is dangerous.

Driving a car is dangerous.

Ergo, those three items listed above should be banned.

Are all of these actions equally dangerous? The answer almost everyone would give is no. That's where the bias enters into play. The definition of the word hasn't changed at all throughout.

Do you understand the concept of "logical validity" [1]? The argument you stated is valid if "dangerous" means exactly the same thing in each and every clause. That doesn't mean the premises (in particular the first one) are true.

[1] https://en.wikipedia.org/wiki/Validity_(logic)

It seems quite concerning, then, that most political parties are ideologically driven. I would really like to see a political party focused on evidence-based politics.

Politics isn't mostly about values or beliefs; it's mostly about winning, about beating the other side. Insofar as it is about either, we have a proposal to make values explicit and to incentivise pursuing the goals that serve those values.


> Futarchy: Vote Values, But Bet Beliefs

> by Robin Hanson

> This short "manifesto" describes a new form of government. In "futarchy," we would vote on values, but bet on beliefs. Elected representatives would formally define and manage an after-the-fact measurement of national welfare, while market speculators would say which policies they expect to raise national welfare. Democracy seems better than autocracy (i.e., kings and dictators), but it still has problems. There are today vast differences in wealth among nations, and we can not attribute most of these differences to either natural resources or human abilities. Instead, much of the difference seems to be that the poor nations (many of which are democracies) are those that more often adopted dumb policies, policies which hurt most everyone in the nation. And even rich nations frequently adopt such policies. These policies are not just dumb in retrospect; typically there were people who understood a lot about such policies and who had good reasons to disapprove of them beforehand. It seems hard to imagine such policies being adopted nearly as often if everyone knew what such "experts" knew about their consequences. Thus familiar forms of government seem to frequently fail by ignoring the advice of relevant experts (i.e., people who know relevant things).

That's true, which is probably worse than being ideology driven when I think about it.

There's a typical moral dilemma involved - the political "trolley problem":

Suppose you are leading a party that has an effective plan to address a lot of preventable deaths.

If you lose, many people won't be saved.

If you bend your values and engage in manipulating public opinion, making empty promises, lying and so on, you are much more likely to win.

What would you do?

But how do you become ideological in the first place?

Either you're a conspiracy theorist and believe so many weird things that adding another belief won't tip the cart.

Or is it that we want to believe explanations that we understand (or think we do) and not believe what we don't understand (science)?

The only hope for humanity is to find a way to fix this mistake of nature in future generations by correcting the genetic code (or whatever else is important to this process), once it has been completely reverse-engineered. All cognitive biases at once.

It might take centuries though...

It's a noble program that could achieve meaningful things, but we should be aware that all information-processing systems will have biases. The map is not the territory, i.e. an information-processing system can never perfectly represent or model reality.

Always ask the question: "What evidence would I/you need to change my/your opinion?"

I intuitively understand the word ideology as denoting false consciousness, in the Marxist sense of the word. So the title reads like a total tautology to me, a la "Unsound Reasoning Impairs Sound Reasoning". Guess I'm experiencing ideology.

So True!

Ideology or religion, don’t know what’s worse.
