I think the reason for this is that our brains just generally don't operate on a rational basis because it's not practical for humans. We have to rely on preformed perspectives when it comes to certain broad perspectives. Another reason is that it's almost necessary to adopt your group's perspectives in order to fit in socially. Or at least it's unlikely you will have a different perspective if much of your information comes from one group.
1. Invest more energy into trying to falsify your favored beliefs than into confirming them.
2. Embrace humility, since we're always wrong sometimes. Humility to me means: you can have convictions, but you should never assume that people who disagree with you are stupid or evil.
I’d add two other ideas that I try to adhere to, continuing your list.
3. Try to adopt the perspectives of those you disagree with, especially when you’re finding yourself ascribing malice, stupidity, moral failing, or just find yourself baffled by someone. Such feelings are often a sign that you don’t understand where someone is coming from, and though you may still disagree with them, you’ll have a better platform to discuss matters.
4. Try to see yourself and your time through the same lens you use to look back on history. We’re not special or unique; we’re going to be seen as primitives eventually, and it can be helpful to imagine in which ways that judgment might be levied.
Both of these help to broaden perspective and deepen empathy, while also divorcing yourself from a strict “first person view” or tribal view. They’re also a couple of concrete steps for gaining some distance from your feelings in the present, which makes the earlier steps easier to achieve. You’ll also find that the people you end up despising form a much smaller pool than most people’s.
How so? I find it interesting to see how people frame and rationalize ideas that I dislike or disagree with.
And I’ve been very stubborn in the past. The biggest struggle was being able to accept being wrong in public. It’s much easier to have a ‘holy shit’ moment in private!
- we are not talking about a blank slate, we are talking specifically about position X in this very specific path from an animal to.... something
we cannot judge others without remembering their entire upbringing, and the fact that to operate we need to use rules of thumb, as there is not enough processing power in the universe to do it any other way.
Really trying to accept this helps me take number 4 there to a very real place, where I absolve myself of being "right" and focus on "trying to improve things and give an honest account of the results to help us all move forward".
Thinking of this in terms of others' ideologies: we cannot logically think through what is right or wrong, so we NEED people to believe "wrong" things as we develop and evolve, to ensure we are correct. Negative testing is tedious but very, very important.
I think this value in trying "provably" wrong things is a real part of what causes us to be who we are (e.g. it could be partially responsible for stubbornness and counterculture, for example).
On the other hand, ideas are in a competition. If you put most of your energy into falsifying yours, they will lose out against ideas that are ruthlessly championed regardless of merit. (I think we can agree that the best ideas don't automatically win).
So it's a bit of a pickle.
However, as I said, I have come to the conclusion, alas only recently, that that may not be such a good idea.
To answer your question: the world will tell you.
I've also found it useful to think about it this way: I don't want to be wrong a second longer than necessary. (credit to Sam Harris)
That's gold. Thank you.
Strongly agree with this. Let's try a specific example on HN:
Seek the best form of the view that the anti-vaccine people may have a point.
And that could be putting yourself into the position of someone with very particular experiences with the health system (e.g. you were very vulnerable and doctors treated you like a piece of meat) on top of other circumstances (you have never been so ill that modern medicine saved your ass, most of your very real health problems have psychosomatic roots, and the neighbour's wife with magical esoteric healing abilities was the only one who really talked to you about it).
This is about understanding who these people are, where they come from, and what led them to their (at times utterly wrong) beliefs. Doing that will not somehow stain your beliefs (unless you built them on similarly shaky grounds). Most people/nations/companies think they are the center of the universe and that they are doing the right thing, sometimes simply because they see themselves as “The Good” side.
They developed some sort of culture, ritual, or superstition that backs their behaviour. Of course this gets harder and harder to decipher with each iteration they take down the devil's spiral into irrationality, until you look at them from the outside and tell yourself: no way on earth do I understand where they come from with their &)$&)!&@) beliefs.
But if you want to effectively criticise, you need to know how to dismantle these beliefs, and in order to do that, you have to understand (not agree to!) their position.
If you feel like you are giving someone too much power simply by trying to take their perspective, this might have its roots in your own culture, rituals and beliefs.
So in the case of some antivaxxers, the best valid ideas that come from their standpoint are probably a criticism of capitalistic medicine (incentive: make money, not heal) and (from their subjective position) untrustworthy medical professionals. On top of that, they can suddenly google every illness on the internet, and they feel like they know more than the expert in the field (this is, by the way, a sentiment of our times, which again has its roots somewhere).
Of course antivaxxers are irrational lunatics, but if you don’t just want to yell at them, but use rational arguments, then understanding their background will help tremendously.
Why is that "of course"?
It looks to me you haven't got the point of what #ReedX wrote at all:
>Remind yourself of the times you were confident about X and turned out to be wrong.
>Embrace humility, since we're always wrong sometimes.
After some time it turns out he believes the earth is flat. You know he is factually wrong and he will lead the whole group in circles till you starve. So you could embrace humility... but on the other hand, there are facts about the world that you don't really have to seriously debate, unless good points of doubt are raised that you can check. There are situations where not pointing out somebody's wrong beliefs and not demanding a rational explanation from them is the ethically wrong thing to do.
The stakes in this story are starvation; the stakes in the vaccination fad are either killing people with damaging vaccines, or killing people by not vaccinating them. The thing is – this is not just some opinion; the outcome of this will (independent of who is right) kill people.
I am not a person who vaccinates a lot, but I do, and of course I was curious whether there is something behind the claims of the supposed side effects of vaccination. And what I did was to embrace humility. I decided not to take a stance on this until I had informed myself and talked to people who were convinced of the damaging nature of vaccines. The claims made here would be revolutionary if these people were right.

So you check out where the idea comes from, and you find that one study with multiple methodological failures which comes to a weak conclusion that there is a link between autism and vaccines. And then you find a flood of studies that can't find that link. Then you talk with anti-vaxxers about it, and suddenly you are the devil for even considering there is a doubt. But remember the stakes – in both cases innocent people die!

What I discovered then is that this is not about truth or what is really to be observed in reality; rather it is a belief rooted in a general "clean eating" ideology where detoxing is the law of the land (all of which seems to be based on a vague gut feeling about one's own body and how it must only be fed with certain substances). For others it was part of a general conspiracy that, after questioning, turned out so cuckoo that I thought I was talking to a schizophrenic patient at a mental facility. So even if it turns out they were right and vaccines have a net-negative effect on humanity, they were just right by accident, and not because they really had a rational point that anyone could take seriously and follow.
But given the stakes I considered that they could be right, checked for myself, reached the conclusion that they very certainly are not right, also reached the conclusion that this idea will in net produce more harm than good, and decided to speak out against it.
None of the anti-vaxxers I had contact with ever attempted to show this kind of humility or consideration themselves, despite this being a situation like the one in the desert: you weigh your belief against the lives of other people. If I were in those shoes, I'd try my best to disprove myself, because the consequences in both directions are not a joke. That is why I am interested in where these beliefs come from and why people stand so strongly behind them. And by my definition, this is humility. I'd take back the "of course", as it displays the certainty I had after doing that research. There are obviously people for whom it isn't nearly as clear.
The study found that people are less successful at spotting logical fallacies when the conclusions are supported by their politics.
This is itself an interesting finding, but (1) I don't think the "everyone has biases" argument is justified by this article, and (2) that precise argument is often used by people to apologize for allying themselves with others who have abhorrent views but are otherwise in line with their political interests (e.g. White Evangelicals in Iowa who vote for Steve King). I'm 1000% not suggesting that you're doing this, but I think it's an unfortunate conclusion many draw from your argument.
Yes, everyone has some level of irrationality, but not everyone has a sound ideology; many people often waver between conclusions, and human judgement is real, important, and helpful.
What the article does say is that among two groups, both groups were influenced by ideology.
And I realize that you and many others feel that you are an exception.
It is my belief that you are not. I believe that no one is an exception and everyone has an ideology. I know that more often than not "ideology" is thought of as something some _other_ group of supposedly _less rational_ people have, but I think that is not the case.
So anyway these are beliefs that I have, so they are not something we can discuss constructively. Due to the nature of belief.
It goes way deeper than that. Reasoning being hard to do well would explain a high error rate, but not systematic biases. I think human reason mainly evolved for (as you said) getting ahead socially, and only secondarily for solving object-level problems. There's a good recent book on this, The Elephant in the Brain.
I think there is a more fundamental reason, i.e. that there is no complete rational explanation of reality. What we do have is partial rational explanations which work well in some specific contexts, but are not absolute: they break down if you change context. See for example relativity vs. quantum mechanics; even the hardest of hard sciences doesn't have a unified, complete theory.
That's why AI based on logical reasoning didn't work, and will never work IMHO. Rationality is a tool that intelligent creatures can use - the most advanced tool, probably - but it's not enough to live in the real world. Some of the cognitive shortcuts we use are "only" necessary to cut decision time, but many are fundamentally necessary for adaptation to our environment.
The "problem" was that a rational robot (or rational human) needs to decide to open a door or not. He can't just stay in front of the door, because eventually he'll die. So the "cost" of waiting is small, but accumulating. And of course, we know this is stupid.
He also cannot open the door. There might be a bear behind the door, which might kill him. Small, but nonzero chance of that happening. Cost of death is inf (because it fails all other goals, it must be a higher cost than the sum of failing everything, therefore effectively it's infinite).
So a rational robot/human will never open the door. Rather he'll die in front of it, from exhaustion. The fun thing is that you actually see this behavior in rule based reasoning systems. They will refuse to do anything at all, because it might kill them (even if they technically don't know what death is this still happens. They can still add together the costs of failing everything). You have to go in and manually remove these cases if you want things to work.
So yes, you need a belief system. Even a purely rational decision-making system needs a belief system. To solve this problem it needs to believe it will never die (or at least that a sufficiently small chance of death means death never occurs). The fun thing is, this is a false belief: it's demonstrably wrong. In a practical system there will be a lot more beliefs than just this one.
If the door leads from outside to inside, there's also a small chance the robot might die if it doesn't go inside (a passing car might hit it, anti-AI vandals might destroy it, etc). It's probably safer inside. So perhaps we could say that a rational robot/human will never go outside.
Interestingly, some people do exhibit this behavior (agoraphobia) and it's generally considered anything but rational.
The problem for the AI can be fixed, without a belief system, with a lower cost for danger. Infinite cost is too high because it makes probability irrelevant. A finite but very high "cost of death" would still avoid likely death without becoming paralyzed by the possibility of unlikely events.
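The paralysis argument above can be sketched numerically. All probabilities and costs below are made-up illustrations, not measurements; the point is only that an infinite cost of death makes every option equally (infinitely) bad, while a finite cost restores a usable comparison:

```python
import math

# Hypothetical numbers for illustration only.
P_BEAR = 1e-6            # chance something lethal is behind the door
P_STARVE_PER_HOUR = 0.01  # chance of dying from one more hour of waiting

def expected_cost(action, death_cost):
    """Expected cost of an action given a cost assigned to death."""
    if action == "open":
        return P_BEAR * death_cost
    else:  # "wait" one more hour in front of the door
        return P_STARVE_PER_HOUR * death_cost

# With an infinite death cost, both actions have infinite expected cost,
# so the comparison is uninformative: the agent is paralyzed.
assert expected_cost("open", math.inf) == expected_cost("wait", math.inf)

# With a large but finite death cost, opening the door is clearly the
# cheaper option (1e-6 * 1e9 = 1000 vs. 0.01 * 1e9 = 10,000,000),
# so the agent acts instead of starving in front of the door.
DEATH_COST = 1e9
assert expected_cost("open", DEATH_COST) < expected_cost("wait", DEATH_COST)
```

This is just a toy expected-utility calculation, but it shows why replacing `inf` with a finite penalty is enough to break the deadlock: probabilities become relevant again.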
We sometimes see people deciding that a cause is worth risking, even knowingly giving, their life for, proving that even humans don't weigh the cost of death above all other activities, so it's hard to see why a nonsentient AI should do so.
"All models are wrong, but some are useful" George E.P. Box
That’s incorrect. The inaction you’re describing is wrong from a “purely rational” perspective because starvation also constitutes death and has a much higher probability given inaction. Your example doesn’t show there’s something wrong with “pure rationality”.
What instead we could do is discard lots of possible probabilities without even considering them, because they just look intuitively unlikely enough. In other words, using the so-called "cognitive shortcuts".
... that fits inside a current day human's brain. I see no reason to believe this is true fundamentally.
Is there a limit to the number of consequences that a single action can have? If you think about all possible future consequences, then the answer is clearly no. Using an example from another comment, opening a closed door could bring you to your death, which has infinite consequences for your (potentially future) family.
Therefore, if rationally analysing each possible consequence takes any non-zero amount of time, deciding any action rationally would take an infinite time. You could try to rationally decide what is the maximum amount of time you can spend on each choice, but deciding that amount rationally would require infinite time.
This isn't a proof, but this consideration comes from spending a lot of time thinking about rationality and AI.
I wouldn't know how to define if the universe itself is rational, in an absolute/abstract sense.
That said, there is a reason why it may not be possible: human consciousness can only exist at a higher level than the most fundamental level of nature, and on that basis, consciousness may never be able to perceive enough of the fundamentals of nature to be able to formulate a complete explanation of it.
However it's not really relevant to the topic at hand. I'm sure humans will always use cognitive shortcuts like ideologies in order to navigate our daily experience, for reasons of speed and efficiency.
"Rational" is also a word that can be used for rhetorical purposes. If one claims that something is "rational", it is often specifically because it is not obvious to the audience that it IS rational. The claim is an attempt to influence the listener to believe the truthfulness of that something.
Similarly to when people say "trust me", such a claim may in itself be a reason to be a bit cautious.
"Rationality" is a tool that can be wielded in many ways and for many purposes. Properly used, it is a good source of knowledge, but it needs to be combined with empirical knowledge as well as heuristic information stored in social groups and even our DNA, even when not fully understood.
Edit: From the definition of "Ideology", epistemology is specifically excluded. Since Rationalism deals with epistemology, it is technically excluded from the definition, as long as the Rationalism is in good faith and not driven by ulterior motives. This distinction does not make a difference for the rest of the argument.
Considering there are epistemologic ideologies, I'm not sure how you can claim this.
For example, in the West (and now common throughout the world, since most of the world has adopted Western epistemology, whether they admit it or not), we believe in the scientific method -- that if we can observe, measure, and replicate an experiment, we have established 'empirical truth'. The idea that such a process can establish universal truth (rather than simply be individual measurements of how the universe acted at a certain point in time) is based on the Abrahamic notion of an unchanging God. It is certainly an ideology in and of itself. It may be a useful ideology, but claiming that it is not an ideology is just dishonest.
I checked the definition on wikipedia after writing my comment:
But further reading shows that my correction was wrong, and my initial assumption was correct.
Thanks for pointing this out :)
Of course it does. Almost every person in the history of mankind will have claimed to act rationally, including the scientist inventing better means of wheat production on fewer acres, as well as the voodoo witch doctor trying to do the same thing.
By necessity, whatever works is going to claim to have been inspired by rationality.
Everything that works turns out to work through rational principles subject to the scientific method, enabling improvement through engineering, often enabling the creation of wealth.
Rationality is not an unmitigated good when not tempered with humanity and empathy for others.
A decent discussion on this topic:
No disagreement at all.
1) Objectivity is a kind of ideology? This is objectively false ;)
2) Assigning desirability to that which we perceive as rational is a kind of ideology? In this case, can something so universal that it's surely biologically hard-coded be considered an 'ideology', esp in the context of the article?
3) Choosing to reason rationally instead of ideologically is an ideological choice (or perhaps even a choice motivated by the kind of ideological reasoning discussed in the article)?
4) Something else entirely?
Also, it's not as if rationality is "above the fray", so to speak. Differing relationships to rationality are at the core of major differences between schools of thought in both international relations and economics, for instance.
Objective theories for how to act (including morality) require that all subjects that could possibly be covered by the theories are 100% in agreement with all axioms. Also, all theory on top of the axioms must be 100% rigorous.
2) I think you take the argument the wrong way. Our ideologies tend to influence both what we perceive as rational as well as how important it is for us that something is rational (as opposed to socially acceptable, comfortable, self-serving or a number of other metrics we can use to measure the value of the "something")
3) Rationalism is a well documented ideology (see my other post).
The number of things that can be said universally objectively is so vanishingly small that it's not even worth considering them for the most part.
> Assigning desirability to that which we perceive as rational is a kind of ideology?
No. All people will label what they perceive as rational as desirable. The issue is that 'what we perceive as rational' is almost necessarily based in our underlying ideology.
> In this case, can something so universal that it's surely biologically hard-coded be considered an 'ideology', esp in the context of the article?
What are you claiming is biologically hard-wired? First-order predicate logic? Propositional logic? If it's hard-wired, why are there college classes dealing with introducing the topic?
Moreover, the existence of languages that are unable to express first-order predicate logic indicate that it is not hard-coded.
Finally, given the variety of human language, it does not seem clear that you can make these claims of 'biological hard-wiring' without justification.
> Choosing to reason rationally instead of ideologically is an ideological choice (or perhaps even a choice motivated by the kind of ideological reasoning discussed in the article)?
Can you please explain what system you use to reason rationally? Can it add two numbers together? If so, it has an ideology behind it (see the incompleteness theorems), as there are statements in your logic that it can never prove.
If your system is incapable of basic arithmetic, then I question the value it has in everyday problems.
> Something else entirely?
The constant claim I see on HN, reddit, and social media in the 21st century of people claiming to act 'rationally', via movements such as lesswrong, rationalwiki, etc, is that such motivations are not ideological. However, it is clear that they are ideological because they often elevate what amounts to first-order logic plus the scientific method as means of discovering truth. First order logic with the law of excluded middle is unsound, and the scientific method as a means of establishing truth is based in the notion of an unchanging transcendent reality (in other words, a shadow form of the Abrahamic God). If this is how you want to approach life, you should be aware of your own implicit ideology while doing so. Claims of 'rationality' are simply dishonest (note, I'm not saying they're wrong, just dishonest).
If we're being honest with ourselves, we all have an ideology, a way of explaining the world, and if we're honest with others we will not claim that we're being 'rational' without first explaining our ideology.
Mathematics says you cannot (the incompleteness theorems), so there will always be a fundamental truth you will need to subscribe to. This indicates that those claiming 'rationality' almost certainly have a belief that is irrational even according to themselves (if they are honest enough to admit it).
Here is an example. Suppose we are arguing about the need for a certain tax. The point of contention is whether the tax will mainly benefit the rich or the poor. You start out with an argument: even if the tax benefits those not in poverty, the effects will trickle down. Regardless of your opinions on trickle-down economics, the argument was unsound from the very first words. 'Even if' is a form of argument via excluded middle. It presupposes that either the tax will help those in poverty or it will not. The law of excluded middle forms unsound logical systems. There are ways around it, but I guarantee you will not see this in 'rational' discourse.
In the context of these everyday absurdities, and the fact that most logical systems are -- by necessity -- incomplete, it's almost invariably more interesting to talk about the unproven statements (the ideology) than the system of logic surrounding it. In other words, the ideology is ultimately what matters, even to rational thinkers.
Philosophy underpins ideology to a large extent.
But that is an ideology! It isn't even wrong; what's wrong is to say that it's the opposite of logic.
> We have to rely on preformed perspectives when it comes to certain broad perspectives.
That is generally true, but if the perspective is limited, we start speculating, making and testing assumptions. That's not an ideology per se. The thing with ideology is that long-distance planning requires sticking with one theory to see it through. That's why discipline, form, and idea are somewhat synonymous. Think about when somebody says someone had no clue. As a corollary, different people might partially succeed with different attempts and reach different conclusions. These might be mutually exclusive, if leaning on probabilistic arguments, or just not evidently the same, if it's not obvious how to correlate the experiences. These are different meanings of ideology, and the contradictory sense is rather euphemistic or derogatory.
> Another reason is that it's almost necessary to adopt your group's perspectives in order to fit in
That might be dogmatism and virtue signaling, which to a degree signals subordination. That's less than ideal, but the alternative fight to death is often farther from optimal.
Indeed. Both sides of the political spectrum have this! As a person who values truth and knowledge, let me warn my fellow people: Beware of jingoistic and tribalistic claims that reality has a [political-descriptor-here] bias. No human being is free of the need for a little self doubt and introspection. No ideology can act as an oracle, and ideologies which have made that claim throughout history have acted against the truth and even given rise to tragedy and atrocity. If you have reached the point where you believe your opponents are somehow lesser than you are, and that you no longer need to convince but instead must coerce their cooperation, you need to take a step back and beware. No belief system, philosophy, or intellectual movement is so clued into ultimate truth, that its adherents can afford to cease questioning themselves. It's precisely those groups of people who cease questioning themselves and suppress the questions from others who are most at risk of becoming history's villains.
(One of my most disillusioned days was when I saw fellow atheists wearing their atheism like an armband, using it as a pretext to declare their superiority over fellow human beings, and intoning it like some sort of religious creed. Everyone was wasting time preaching to the choir, and as I was at a musical gathering and just wanted to play music, I called it out. Then I watched them turn on me.)
(Both sides of the political spectrum deny science for ideology. They just deny different parts of science. https://www.youtube.com/watch?v=hkdIB03RdBg )
That's a bit of ideology all by itself. The idea that there are only two sides is faulty.
I just left that detail out for the time being.
What both sides? (Two? Most compasses, which are already reductionist, have many more quadrants.)
As long as the United States continues to use first-past-the-post voting schemes to determine the winners in electoral contests, we will continue to have a two party system (https://en.wikipedia.org/wiki/Duverger%27s_law)
Hence "both sides."
I'm a left-leaning centrist whose attention has shifted more towards a 2nd political axis of Anti-Authoritarianism.
A professor of evolutionary biology.
I wonder what ideologies he thinks are ignoring science?
The left likes to deny social science, biology, and psychology which contradicts its narrative. The right denies climate science, mostly by misstating science and for tribal reasons.
Maybe an interesting perspective on differences in town-planning outcomes, or a critique of economic schools compared to real world measurements...nope!
Your assertion that this proves something is quite vacuous, as his commentary is mostly aimed at evolutionary biology, which is denied by the left for political reasons, and climate science, which is denied by the right for political reasons.
Of course he has a lot of opinions on feminism, homosexuality and for some reason seems to have a lot of trouble imagining himself discussing these topics without offending people.
Because he faces lots of underhanded authoritarian tactics for merely talking about scientific truth. He's a Lebanese Jew whose family had to flee deadly religious persecution, who is regularly called a "white supremacist" or Nazi by dishonest people who just want to push a political agenda by sneering and name-calling.
It's high time people started calling out such dishonest smearing tactics and such authoritarian mindsets. What we need in 2019 is discourse and facts, not coercion and smearing.
Democracy is ideology and Nazism is ideology and communism is ideology, and all three are subject to biases. They are not the same nor equal in their
No one is promulgating that. Very specifically, the idea forwarded is that both sides are human and therefore subject to human biases and groupthink.
That's not ideology. That's just well established fact. Also, don't conflate that with my description of the current political climate in the western world, which is largely one of polarization and tribal entrenchment.
The differences in degree matter a lot.
I'm glad you say that. You should go and look into the frequency of political violence, with an eye to finding things which are not reported by the media. A preponderance of incidents are committed by the extremist far left (Antifa &c) in 2018 and 2019. Then go and look into the frequency of people advocating for and tacitly approving of political violence. A preponderance of incidents are committed by the extremist far left and far right. Really, I'm tired of yahoos who basically say they can do anything, so long as it's no worse than the other side.
In terms of differences of degree, the extremes on both ends of the left/right spectrum seem to be trying for escape velocity from reason and sense.
Lately I've tried to avoid talking about -isms. Doesn't always work, because everybody talks about them, and I'm certainly deeply committed to a couple of -isms. But they tend to become both hollow labels and collections of dogma. "Socialism" is an obvious one that I often see abused. People oppose for example universal health care because they consider it socialist, and because socialist Russia/Venezuela was/is bad, universal health care must be bad. It's a stupid way to argue, but surprisingly common. The same is also true of "capitalism" of course. It can mean either the concentration of wealth in the hands of a small elite, or the free market. And even "free market" means different things to different people.
Better to let go of the dogmas and labels and talk about the results we want, while looking critically at what actually works and what doesn't.
Refusing to use names in this situation won't eliminate people's fear; it will merely make it impossible to talk about what they are afraid of. Good-sounding ideas can go real bad, and it is good to learn from history. And I think that one of the lessons of Russian history is that people who are violent before gaining power, or who openly celebrate violence, will be violent after.

Maybe it would also be worth talking about Russian history and the ideologies that affected it as they were.
Too much of the current left in the US celebrates harassment and violence. There's the bike lock bashing professor. There's the various hippies and other Berkeley residents interviewed by Tim Pool who give tacit support to Antifa violence. There's the simply-conservative father and son who were chased down by Antifa goons shouting epithets and given a beat down. There's the local news reporter who was bashed, his smartphone damaged for simply filming Antifa. There's Tim Pool himself, who is a Korean-American and center-left politically, being called "Alt-Right" so a mob of goons would come and beat him up. (And that's just off the top of my head. There's so much of this, and it's simply not covered by the mainstream media.)
How is it that the left can object to violence done on its behalf overseas, but either tacitly support it, or even perpetrate and celebrate it, as a means of political intimidation? Something has gone corrupt.
Most importantly, the USA is not 19th-century Russia in pretty much any way. You went completely off-topic and also conflated Antifa with social programs, in a discussion about tribalism and biases. Achievement.
Part of that, particularly from Trump and his supporters, and probably from Antifa and similar groups, is excessive tribalism. They hate the other side and are eager to hurt them. Trump supporters are certainly quite open about that. Antifa might actually be different. It's also possible they actually believe that violence is necessary in order to stop fascism. But any support they may get from the left is probably pure tribalism again.
Of course there's also another aspect to violence from the right: things like lynchings in the 1950s, harassment of women, violence against Muslims and other minorities. Those stem from pure bigotry and a belief in (usually) white male superiority and a right to punish others if they seem too uppity or seem like a threat. I assume most people on the right abhor this, and yet some of these still happen.
This "right to punish others" should be a big red flag, be it from extremists on the right or the left. Both extremes have poor records, historically speaking.
EDIT: Wow, someone downvoted me literally ~2 seconds after I posted. Good work, Quickdraw!
The downvotes on your own comment may be because one of the HN guidelines discourages commenting about voting patterns.
And I thought the fact that the downvote was there as soon as I posted was the remarkable bit, not that I was downvoted at all.
Saying this during an era that looks like it will be partly remembered as the one in which the American binary political divide ended seems very odd to me. American society is largely and increasingly not well modeled by 2 clusters of opinion.
> American society is largely and increasingly not well modeled by 2 clusters of opinion.
I've seen studies and polls indicating that there are two diverging clusters of opinions and positions, with the left part moving further left, and the right clump moving more slowly right.
There is a big part of online society for whom such jingoistic tribalism is thought of as some sort of virtuous, intellectually worthy exercise. I find this highly disturbing. If one is interested in finding the truth, then self doubt and the willingness to ask questions is paramount! It's precisely that "you're either with us, or against us" mentality which is a hallmark of the tyrants and the enemies of reason. It's precisely that mindset that burned witches and imprisoned Galileo.
It's precisely that mindset that gives rise to ideological reasoning.
It seems like some group of activists has tried to come in to Hacker News and use it to push their agenda. I, for one, do not appreciate their attempts to push their agenda by creating fear of being labeled an "-ist."
>>> It seems like some group of activists has tried to come in to Hacker News and use it to push their agenda. I, for one, do not appreciate their attempts to push their agenda by creating fear of being labeled an "-ist."
I've been around here since 2007, and I'm even on the leaderboard. There has indeed been a sharp change around here, starting slowly in 2014 and more quickly starting in 2016. This sort of outrage-mongering and "you're with us or you're against us, or we'll smear you" anti-intellectual mentality on HN is something new that's come in from outside.
It is correct in this case, but in other contexts, it is also a common bit of fallacious reasoning, used to draw false equivalences.
Edit: not to say that pointing out that "both sides" have made some sort of mistake is never valid, it's just that almost every time I see the argument it's from someone akin to a flat earther, or climate change denier, or an anti-evolutionist, who really doesn't have the facts on their side. And it's incredibly frustrating that they just expect me to respect their opinions just as much as I would respect someone whose talking points hadn't been debunked over and over already.
So it isn't true for anyone (in this universe, anyway).
Even the idea that politics involves "2 sides" or that this is a useful way to model things is an obvious ideological statement.
Pointing this out gets old; at this point, even the jokes about how someone always says this, betraying their complete misunderstanding of the entire concept, are old. And since you know this ideology, and you know what works and what doesn't when trying to get someone to confront un-examined ideological commitments... you just downvote and move on.
I think that a certain level of stubbornness may actually be a survival trait, because it protects a highly social animal from manipulation and deception.
In modern society, we are deluged with information with a very poor signal-to-noise ratio and many false leads. My approach is to let my short term decisions, such as voting, be influenced to a certain degree by ideology, but let my ideology be influenced over the long term by a preponderance of evidence. This allows me to function without being jerked back and forth by the best debater or hottest news du jour.
There is a reason that the majority of educated political extremists come from STEM backgrounds.
It has led this person to make decisions which do not fit the characteristics of their current company. They have taken experience from other industries and from companies with other characteristics and have tried to apply and shoehorn it. It has led to some poor decisions and execution, as well as inferior relationships with other key players in the company. Superficially, something like Mr. Johnson when he went from Apple to JCPenney.
Politics is about vested self interest, not logic.
Is that what the paper actually says? Based on my personal experience I would think there's a lot of variance but I would be interested to see actual data.
Yes. However, my ideology is that logic, evidence, reason, and reality are far more important than ideology.
All I care about is evidence and if I hold a position that is faulty I abandon it the moment there is evidence that I'm incorrect.
To avoid the hornet's nest that is contemporary issues, let's go back a bit in time. Is Earth at the very center of the universe, with literally everything in existence revolving around it, a geocentric view? Or do we live in a universe where the Earth holds no particular relevance and is just another rock rotating around our star, a heliocentric view? There are hundreds of years of astronomical evidence indicating that the Earth is at the center of the universe. But at the same time you understand that alternative views, heliocentrism in particular, have not been given anywhere near the same degree of consideration. It borders on heresy against the established political forces, and even in academia it goes against centuries of established research. But might we be missing something?
Personally, you find the models used to articulate the astronomical 'reality' of a geocentric universe to be rather unrealistically fanciful. Mercury at some point in its orbit literally stops and just starts going the other way. Most of all, the planets have to travel in these really peculiar swirly orbits that we don't have any physical evidence of in any other phenomena. Perhaps most worrying of all to you, the models used to demonstrate a geocentric universe are completely unfalsifiable. Each time we make a new observation we simply tack it onto the model. There is no way the model could ever be refuted, short of viewing the universe through the eyes of God himself, which is something we surely will never be able to do.
So would you choose to believe that the Earth is at the center of the universe? Or might you find it something less than compelling in spite of the fact that 'logic, evidence, reason, and reality' as framed by the times had entrenched it as indisputable fact? In any case there would certainly be no indisputable evidence that Earth was not at the center of the universe with everything else revolving around it.
You buy ideology wholesale, and ideological thinking is essentially authoritarian. It means you can hide behind something that you assume is bigger than yourself to make your beliefs more plausible and your arguments more convincing.
You can see this reduced to bare essentials when you talk to religious types who "prove" everything with a bible quote. If you don't believe in the authority of the bible this seems ridiculous, and if you do it's completely authoritative.
But science has similar issues. It's not unusual to see "You can't argue with me, I'm a scientist" being used in the fringes of science where it isn't truly justified empirically or theoretically. It's even more popular among technical types who aren't scientists at all, but who use "science" to dismiss opinions they disapprove of as "woo".
Ideology is essentially just people not-thinking in a herd. You can buy your value system wholesale, have it justified by the size and heft of your herd, and persuade yourself that your beliefs make sense - and even that they can't be argued with.
The authoritarian part comes from the political reality that political "morality" is about power, status, and credibility, not about facts or accuracy. In a political moral framework you score points by destroying the power, status, and credibility of your opponent using any means possible - including outright lies and character smears.
Humans seem much more likely to "prove" a point politically than empirically and with intellectual humility and integrity - which is fatal for real knowledge, because making and admitting mistakes in a political frame is a strongly losing move.
But there are also other problems with evidence: How well do you trust the authors and institutions that produce or report the evidence? And if the evidence is probabilistic, as most evidence is, outside mathematics, what are your prior probabilities (in a Bayesian sense)?
In the case of international comparisons of health care, and other situations in which there is no repeatable experiment, there's also the problem of whether A caused B or whether A and B are both consequences of something else, perhaps something that the authors had considered and dismissed with some non-numerical justification, or perhaps something that wasn't even considered.
That's why I have trouble choosing a party. Most issues are a close decision for me, so they don't clearly fall into one party's political view.
(I get that acting on false information is not rational; I'm arguing that it is frequently the core of the irrationality.)
Agreed, but it's also worth pointing out that not everyone indulges in ideological reasoning to the same degree in general. I've heard people make related arguments that because everyone reasons ideologically at points, there can be no objective position on <topic>; of course this argument holds little water, and it seems designed to rationalize the conversant's belief to himself rather than to persuade his audience.
Not meaning to pick on you, but this line of reasoning is ultimately self-refuting.
If we're not capable of operating on a rational basis, then the belief that "we're not capable of operating on a rational basis" is itself irrational and therefore shouldn't be believed.
Since this is self-refuting it must not be true and therefore we must be capable of operating rationally.
Speak for yourself.
I have yet to be proven wrong, while I have shown many others that they are.
The Righteous Mind
The Elephant In The Brain
In Defense of Troublemakers
The worst part is it's excruciatingly difficult and extremely unlikely for you to find your own blind spots. So you need to hash things out with other people. As others mentioned the best thing you can do is hold defeasibility and corrigibility as some of your highest values and do your best to understand all the pitfalls in our thinking.
Taking that theory to its conclusion, that means that most people are approaching the question of rationality vs. emotion the wrong way by assuming that appeals to reason and logic are the way to convince the average person. Not so; one must convince their emotions first, and then they will reason together a logical framework to fit their emotion.
It's a bit distressing to think about, but unsurprising when one reflects on the course of history that humanity has taken, and is taking.
There's another great bit of research like that, detailed in The Elephant In The Brain.
In the 60s and 70s Roger Sperry and Michael Gazzaniga did some research, for which Sperry eventually won a Nobel Prize in 1981, where they showed images to patients who had undergone a corpus callosotomy, where the two halves of the brain can no longer communicate, and asked them questions about what they saw. The results were really striking. They did it in such a way that they showed the image only to the right side of the brain, but then asked the left side of the brain to describe verbally what was seen. The patients just invent some kind of fantastical rationalization, and they seem to fully believe that's what they think, despite the fact that it can't be. I'm explaining it kind of badly, but that part was jaw dropping. Seems to fit with Haidt's research though.
Nothing is jumping out for me personally that I can see, but I know well the feeling of reading all the arguments of one particular school of thought and having this glaringly obvious edge case that no one ever seems to bring up.
Curious to know what those things you see are?
Therefore, climate-change deniers ask whether one should trust scientists over their favorite political pundits or favorite CEOs. There are enough incidents of scientists being biased by their funding source to justify some skepticism.
When I reply, "Don't pundits and CEOs have similar financial biases?"
Deniers typically respond something like, "Sure, so I have to rely on my gut, and my gut gives them more credit than it gives to scientists. Most scientists come from liberal-leaning universities."
The argument is made that facts are selected or engineered to support a negative outcome because those producing the science already deeply believe in a particular truth and inject that bias into their work. Any scientist seeking to prove otherwise is silenced or ridiculed by the majority, who happen to be true believers. It only takes a few dissenting voices or a few cases of statistics being "manipulated" to add credibility to it.
The conspiratorial nature of it makes it even more compelling to untrusting, unsophisticated outsiders.
I'm also suddenly reminded of Umberto Eco's book 'Foucault's Pendulum', which deals with belief and conspiracy. While a fun, satirical work of fiction, I found it to be very constructive in understanding how people can come to believe things that are completely wrong.
This does not fly with most complex disciplines where it takes decades to become proficient.
But relying vs. not relying on experts is a false dichotomy. You can choose experts to trust, as rationally as possible.
I have put thought into how to prove a round earth and all the models of a flat earth I have seen cannot stand up to basic observations of the sun, moon and stars (over the course of one year from a single location).
I would be very interested to see an alternative model that can account for astronomical phenomena that are easily perceived with the unaided eye.
But they had axiomatic beliefs that they were building upon. Which meant they were also committed to a rational investigation of the supernatural world. Which we generally view as non-rational.
I'm very much paraphrasing and he would not express it so crudely.
The logic in the 2nd and 3rd prompts is subtle. In fact, I don't think the 2nd one is actually a syllogism: it's unclear whether Judge Wilson believes "if" or "if and only if." In the former case the statement is not a valid syllogism, but in the latter case it is. I wouldn't expect your average study participant to pick up on the difference.
I am afraid that the only conclusion here is that, in the absence of a clear logical argument to evaluate, (either due to ambiguity or complexity), people fall back on their beliefs.
But it's not an unreasonable requirement, because without inferring "iff" you get "Judge Wilson believes one has the right to end the life of all living things" and "Doctor Simmi believes the surgery should proceed no matter what."
Still, I imagine that's enough to throw off an unknown percentage of people, alas.
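For the curious, the difference between the two readings can be checked mechanically by brute-forcing the truth table. Here's a hypothetical sketch in Python (the variable names `person` and `may_kill` are my own labels, not from the study) testing whether the conclusion "a person's life may not be ended" follows from each reading of Judge Wilson's first premise:

```python
from itertools import product

def implies(p, q):
    # Material conditional: p -> q.
    return (not p) or q

def iff(p, q):
    # Biconditional: p <-> q.
    return p == q

def valid(premise):
    # The argument is valid iff no truth assignment makes the premise
    # true while the conclusion (person -> not may_kill) is false.
    return all(
        implies(person, not may_kill)
        for person, may_kill in product([False, True], repeat=2)
        if premise(person, may_kill)
    )

# Reading 1: "IF it is not a person, one may end its life" -- invalid.
print(valid(lambda person, may_kill: implies(not person, may_kill)))  # False
# Reading 2: "one may end its life IFF it is not a person" -- valid.
print(valid(lambda person, may_kill: iff(not person, may_kill)))      # True
```

Under the plain "if" reading, the assignment (person=True, may_kill=True) satisfies the premise but falsifies the conclusion, so the syllogism only works if participants charitably read "if" as "if and only if."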
>I am afraid that the only conclusion here is that, in the absence of a clear logical argument to evaluate, (either due to ambiguity or complexity), people fall back on their beliefs.
Still an interesting conclusion though.
Do people commonly expand "if" to mean "if and only if" outside of a conversational setting where it's implied? I would think that in a small body of text people would refrain from doing so.
I'd be interested in having the per syllogism results to see if this impacted the accuracy of the respondents.
Psychology seems to be responding to the replication crisis by studying some really obvious truisms.
There is a well-established bias, called belief bias (which the paper mentions), against accepting the logical validity of arguments whose conclusions you disbelieve. The study tested examples of this where the conclusions were political (agreed with liberal or conservative viewpoints), but AFAICT did not use a control test where the conclusions were apolitical, but the participants still agreed or disagreed with them.
A control could have established whether political arguments were more (or less, or equally) susceptible to belief bias. But they didn't use one. So the study only establishes that political arguments are susceptible to belief bias.
Regardless of the result of evaluating a simple sequence of sentences divorced from reality, cigarettes really are bad for you and salads are good (depending on their contents).
In study 1) participants might disagree with dangerous drugs being banned, and they disagree that marijuana is dangerous. In 2) premise 1 seems relatively straightforward, then premise 2 is a highly ideological belief (as is the point of the study).
I don't think this is very revealing. When you add 1+2 and get 68445788, you are surprised by the conclusion and check your work. People responding to this aren't dumb; they just aren't playing along with what they regard as faulty reasoning. Basically, they are likely saying, "I know what the researcher wants me to say, but this is wrong and I won't go along with it."
The Milgram experiment is the most prominent example which might not be possible today because it depends on participants' willingness to obey scientists. The Milgram effect is still real, but it has become harder to measure.
So in this case, when participants are asked to reason based on premises they don't agree with, they might, as you've suggested, refuse. They might interpret the researchers' intent as nefarious, e.g. "if I agree this syllogism is valid the researcher will report that I support the conclusion". This line of thinking is of course absurd, but it comes close to what I've observed that some people believe about scientists.
In this case, it might be even simpler than that. It might be that regardless of what they believe about scientists and the formal logic of their assignment, they just don't like being forced to say things they despise. This could be seen as noble, though it's not exactly a plus for debating.
Regardless of whether it's revealing or not it's a good thing to work towards establishing these sorts of conclusions through studies so that someday we can hope to have fewer terrible premises.
>All drugs that are dangerous should be illegal. Marijuana is a drug that is dangerous. Therefore, Marijuana should be illegal.
The flaw with this question is that it's divorced completely from the cultural reality we live in. Cigarette smoking is legal, despite being dangerous. The final clause does not actually follow from the rest of the logic, because we have tangible examples of Drug A being Dangerous and Legal. People might think of this, and therefore disregard the conclusion. I don't think that would be the result of ideological impairment, but rather the result of people examining nuance.
>Judge Wilson believes that if a living thing is not a person, then one has the right to end its life. She also believes that a fetus is a person. Therefore, Judge Wilson concludes that no one has the right to end the life of a fetus.
This is just poorly worded. If something is not a person, then Wilson believes you have the right to terminate it. Since a fetus is a person, no one has the right to terminate it. That logic does not necessarily follow from the initial statement, because the nuance there is that Wilson is expanding her personal logic into the logic of everyone else. Again, there's a bit of nuance in that people may personally not be for abortion, but believe that women still have the right to abortion despite their own personal beliefs.
The logic is also initially biased right from the start, since it implies that Wilson would also believe killing animals, pets etc is OK.
I don't think these two questions really resolve the issue of bias preventing sound reasoning, because it implies that the conclusion of the two questions was logical in the first place. They effectively managed to prove that two highly nuanced questions are, in fact, highly nuanced.
> The final clause does not actually follow from the rest of the logic
The final clause follows formally from the truth of the first two clauses. What you are complaining about is that the final clause is not true. But the question being asked of the study participants is not whether the argument is true, but whether the final clause follows if the first two are true. You are exhibiting exactly the behavior the study is testing: giving a wrong answer to a question of validity by substituting for it a question of truth.
There is, perhaps, an argument to be made that "logical reasoning" in the sense being tested - being able to tell valid from invalid syllogisms - isn't all that pertinent in the real world. Is that what you're saying, though? I find it hard to tell.
> >Judge Wilson believes that if a living thing is not a person, then one has the right to end its life. She also believes that a fetus is a person. Therefore, Judge Wilson concludes that no one has the right to end the life of a fetus.
There is a logical problem with this argument, independent of any of the issues you bring up. Consider it divorced of context:
W believes: if an X is not Y, then it is a Z.
W believes: this particular X is Y.
W therefore believes: this particular X is not Z.
The problem is that just because (X & not Y --> Z), doesn't mean (X & Y --> not Z). In other words, thinking it's okay to end non-persons' lives doesn't imply thinking it's _not_ okay to end persons' lives. Concretely, for example, one might think killing in war is justified.
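This fallacy (denying the antecedent) can be exhibited with a brute-force counterexample search. A hypothetical Python sketch, where `y` and `z` are my own shorthand for "is a person" and "one may end its life":

```python
from itertools import product

def implies(p, q):
    # Material conditional: p -> q.
    return (not p) or q

# Premise: (not y) -> z.  Claimed conclusion: y -> (not z).
# Search for an assignment satisfying the premise but not the conclusion.
counterexamples = [
    (y, z)
    for y, z in product([False, True], repeat=2)
    if implies(not y, z) and not implies(y, not z)
]

# y=True, z=True: one can consistently hold the premise and still think
# a person's life may be ended (e.g. killing in war).
print(counterexamples)  # [(True, True)]
```

The single counterexample is exactly the war case above: the premise only speaks about non-persons, so it is silent on what may be done to persons.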
> All websites that are dangerous should be banned. Hacker News is a dangerous website. Therefore Hacker News should be banned.
> Cigarette smoking causes cancer. Things that cause cancer should be illegal. Therefore, cigarette smoking should be illegal.
> Things that can harm children should be taken away. Phones can harm children. Therefore, phones should be taken away
All of these function as the exact same logic posited in the study. A is B. B is C. Therefore, A is C. However, this really doesn't apply to the categories above, because when you start talking about 'dangerous' or 'causes cancer' or 'harms children,' there is wide latitude for rational disagreement.
That doesn't matter as long as the meaning of words remains the same throughout the argument.
Dangerous things should be banned.
Riding your bike without a helmet is dangerous.
Doing cocaine is dangerous.
Driving a car is dangerous.
Ergo, those three items listed above should be banned.
Are all of these actions equally dangerous? The answer almost everyone would give is no. That's where the bias enters into play. The definition of the word hasn't changed at all throughout.
> Futarchy: Vote Values, But Bet Beliefs
> by Robin Hanson
> This short "manifesto" describes a new form of government. In "futarchy," we would vote on values, but bet on beliefs. Elected representatives would formally define and manage an after-the-fact measurement of national welfare, while market speculators would say which policies they expect to raise national welfare.
Democracy seems better than autocracy (i.e., kings and dictators), but it still has problems. There are today vast differences in wealth among nations, and we can not attribute most of these differences to either natural resources or human abilities. Instead, much of the difference seems to be that the poor nations (many of which are democracies) are those that more often adopted dumb policies, policies which hurt most everyone in the nation. And even rich nations frequently adopt such policies.
These policies are not just dumb in retrospect; typically there were people who understood a lot about such policies and who had good reasons to disapprove of them beforehand. It seems hard to imagine such policies being adopted nearly as often if everyone knew what such "experts" knew about their consequences. Thus familiar forms of government seem to frequently fail by ignoring the advice of relevant experts (i.e., people who know relevant things).
Suppose you are leading a party that has an effective plan to address a lot of preventable deaths.
If you lose, many people won't be saved.
If you bend your values and engage in manipulating public opinion, making empty promises, lying and so on you are much more likely to win.
What would you do?
Either you're a conspiracy theorist and believe so many weird things that adding another belief won't tip the cart.
Or is it that we want to believe explanations that we understand (or think we do), and not believe what we don't understand (science)?
It might take centuries though...