One thing I'm stuck on regarding rationality vs. instinct:
I'm partial to the argument that the reason we have heuristics, or "rules of thumb", is that they save energy. If we invested full rational effort in every decision, we wouldn't get as much done. Heuristics are useful because they are usually correct and save time and energy.
However, they are also sometimes wrong. And I also know that cognitive biases can lead us astray in the same way a heuristic can be wrong. That's where rationality is useful: it helps us make the correct decision in those cases.
If both are true, then the optimal way of living would be to use heuristics where they don't get you in trouble, and rational examination where the result would be counterintuitive. Here's where I'm stuck: how do you recognize ahead of time when you should ignore your instinct/heuristic/cognitive bias and instead apply the more exhaustive rational examination?
It's as if we need to develop an intuitive skill for recognizing when a situation is likely to be counterintuitive. "Oh, my gut tells me that this is one of those cases where my gut will be wrong." Which seems a bit contradictory.
It does seem contradictory, but it's not. You definitely learn to recognize your biases, it's a skill. There have been many times when I've caught myself thinking something, only to then say "wait a minute, I've just fallen victim to bias X" and then reversed my decision or opinion.
I'm not great at it, but I'm very happy that it catches at least some of my brain's bugs.
The stakes in your decision? If doing a full rational analysis has a cost in time and effort, but leads to a more accurate result on average, there's a level of risk/reward at which it's worth applying and one where it's not.
Of course, identifying the crossover point would seem to require a second-order heuristic...
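That crossover can at least be made concrete with a toy expected-value model. Everything below is an illustrative assumption, not a measurement: suppose a heuristic is free but right 80% of the time, while a deliberate analysis is right 95% of the time but costs some fixed effort. The question is at what stakes the analysis starts to pay.

```python
# Toy model: fast heuristic vs. slow rational analysis.
# All numbers (accuracy rates, effort cost) are made-up assumptions.

def expected_value(stakes, p_correct, cost):
    """Expected payoff of a decision method: win `stakes` when right,
    lose `stakes` when wrong, minus the effort cost of the method."""
    return stakes * p_correct - stakes * (1 - p_correct) - cost

def crossover_stakes(p_heuristic=0.80, p_analysis=0.95, analysis_cost=3.0):
    """Stakes level above which deliberate analysis beats the heuristic.
    From stakes*(2*p_a - 1) - cost = stakes*(2*p_h - 1)."""
    return analysis_cost / (2 * (p_analysis - p_heuristic))

print(crossover_stakes())  # 10.0 with these assumed numbers

for stakes in (5, 10, 20):
    heuristic = expected_value(stakes, 0.80, cost=0.0)
    analysis = expected_value(stakes, 0.95, cost=3.0)
    print(stakes, round(heuristic, 2), round(analysis, 2))
```

Below the crossover the heuristic wins on average; above it the analysis does. Which only relocates the problem, as the next comment notes: estimating where you sit relative to that threshold is itself a snap judgment.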
Maybe he's being cagey about it because his "evidence" is incredibly weak sauce. He expects most of the audience for his book to nod along when he describes a guy who managed to achieve the highest political office in an incredibly competitive environment as unable to make rational decisions. He cites for his analysis of GWB's cognitive function a whole two people.
First, there's David Frum, who served as a speechwriter for 13 months from early 2001 to early 2002. Before that he had "no connection to the campaign or the Bush family". He's used that stint to great career effect, but seriously, there are a lot of folks who worked more closely with Bush for a longer period of time and have written about his thought and leadership style. Surely if Stanovich wasn't just looking for a good "hook" to snare his liberal readership, he could have found better sources, critical and otherwise. Oh, and Frum was a huge initial supporter of the Iraq invasion, and obviously Stanovich or his imagined readership wouldn't dream of calling that a clearly irrational decision.
Second, there's a bit from a George Will column. The section in italics was strangely omitted by Stanovich:
"He has neither the inclination nor the ability to make sophisticated judgments about competing approaches to construing the Constitution. Few presidents acquire such abilities in the course of their pre-presidential careers, and this president particularly is not disposed to such reflections."
So which U.S. presidents weren't "dysrationalic" by this standard?
This guy may be a terrific research psychologist, but he's obviously not above slandering a president and pandering to his readership with bad scholarship just to sell books.
Ignoring the polarisation of the debate when a politician's name is used:
It's possible for someone to reach a high office in a competitive environment and still make decisions that are not so rational. Cognitive biases are very strong and affect most people, so people in high office have their own biases.
I find that "slandering a president" phrase a bit odd. The idea that W was stupid or irrational is the kinder interpretation of his presidency. The alternative explanation is that he was just plain evil.
It's also odd to point to his performance in politics as evidence of his rationality. Politics is a deeply irrational field, and I wouldn't consider becoming President an indicator that a person is rational any more than I would consider becoming head of the Flat Earth Society such.
I'm not saying these specific conclusions are legitimate, but I don't think it comes down as "slander", really.
I'm struggling not to repeat myself, but suppose I wrote a book with the first chapter titled "Inside mikeash's Mind" that purported to diagnose you with the propensity to make irrational decisions, and proceeded to document this inability by referring to a former employee who resigned after a year rather than be fired and selectively quoting something written about you by a non-associate...and nothing else. It's hard to see how that would not be a slander on you, regardless of whether you've done more good or ill in your life.
 Not in the legal sense, of course, since it's written, not spoken.
If I had had a long public life filled with actions whose only explanation is malice or irrationality, I don't think it would be slander, no, even if the argument wasn't very well constructed. That doesn't make it a good argument; I just don't think it qualifies as slander. (Or libel.) For one thing, slander or libel in the US has to be actually false.
I'll grant assumption, but flawed? What else is there? I recognize that there are some people who will explain his actions as "brilliant, wonderful, generous, pure of heart, etc." but I have no qualms simply calling them wrong.
Yet George Bush's presidency is widely acknowledged to be, well, a fail parade. Or perhaps we should go with "by their works ye shall judge them". An (extremely brief) overview of Bush's presidency:
* he waved off warnings about Bin Laden ("All right. You've covered your ass, now")
* he turned a highly effective (under Clinton) national disaster agency into a joke, at least in part by putting a failed horse judge in charge of it, then oversaw an ineffective response to one of the largest national disasters ever to strike the US (Katrina)
* he lost Bin Laden at Tora Bora
* he and his chosen VP and Secretary of State back-channeled incredibly wrong information into the public, in part via the NYT, in order to sell a war with, well, people who had no weapons of mass destruction and who were uninvolved in 9/11
* by the by on the above, he fired the general (Shinseki) who gave an accurate prediction of Iraq war costs
* Bush hired Wolfowitz, who predicted the Iraq war would cost between $10B and $100B (what's $700B and counting between friends)
* Bush ignored, well, basically everyone who understood much of anything about the Middle East in favor of a war which wildly destabilized a highly geopolitically important (oil) zone, while empowering Iran
So your link, in which the author writes:

> [...] President Bush's thinking has several problematic aspects: lack of intellectual engagement, cognitive inflexibility, need for closure, belief perseverance, confirmation bias, overconfidence, and insensitivity to consistency. These are all cognitive characteristics that have been studied by psychiatrists and that can be measured with at least some precision. However, they are all examples of thinking styles that are not tapped by IQ tests. Thus, it is not surprising that someone could suffer from many of these cognitive deficiencies and still have a moderately high IQ. Bush's cognitive deficiencies do not impair performance on intelligence tests, but they do impair rational decision making.
Even if I grant you all of that, it doesn't change the fact that Stanovich's claims re: the cognitive abilities of George W. Bush are based on extremely flimsy evidence. If he had stronger evidence than the testimony of Frum and a column from George Will (who, as far as I know, never attended a meeting in which GWB made a decision), wouldn't he have used it?
Indeed, in contrast to your laundry list, Stanovich doesn't present a single concrete example of an irrational decision by the former president and the cognitive bias that led to it. He doesn't have to, because he wants to appeal to a readership that is delighted to find that not only is the target of their political animus wrong, but actually suffering from a mental disability!
It's intellectually lazy and unworthy of a serious scholar.
If this had been a dissertation, I would completely agree with your point of view. However, serious scholarship has little to do with writing a popular book. If anything, Malcolm Gladwell is the gold standard for popular, interesting, and bullshit books that sell like hot cakes.
Consider that his religious affiliation suggests a level of irrational thinking, but you can't say such things if you want a best seller. Yet there are plenty of scholarly works focused on the connection between religion and irrationality, because scholars, especially those with tenure, can get away with such things.
and I have recommended that book repeatedly in Hacker News discussions over the years. The book is readable, interesting, and surprising, and the bibliography cites most of the best recent research on human cognition.
I have a hard time reading Yudkowsky. I'm not as in love with his writing as he is. I think he makes good points. But if you're starting from the position of evidence-based beliefs and Bayesian reasoning, the knowledge gained by suffering through a longwinded essay seems too little. I have tried a few times, because others seem to love Less Wrong, but each time I found I came away with little or no new information.
1) No writer can please everybody, so it's fine if he's not your style.
2) Have you read only a few things that he wrote about, for example, the better known cognitive biases and such, or have you also read some of the more advanced sequences?
Because if you're reading the advanced essays and feel you already know all that stuff, you are a very rare breed -- good for you! I hope you're working on some hard problem in an un-sexy field and not building another photo-sharing app :)
To quote Dr. Aubrey de Grey:
>It has always appalled me that really bright scientists almost all work in the most competitive fields, the ones in which they are making the least difference. In other words, if they were hit by a truck, the same discovery would be made by somebody else about 10 minutes later.
3) When someone says they don't like something that I like, I ask them what they do like. I figure maybe they've found something even better and I'd love to get my hands on it too, and it's also a good way to see if they're just signalling superiority by disliking things that many others like. So what would you recommend I read to learn more about human rationality? Anything other than the usual suspects (Jaynes, Kahneman, Tversky, Schelling, Hastie & Dawes, etc.)?
1) Agreed. I observe other people like him, so in this case it's probably all me.
2) I've tried to read some of the major sequences, but end up quitting partway through. Each essay seems so long for what it's trying to impart. I tried reading HPMOR in hopes of getting the same or similar information with a whimsical story instead, but didn't enjoy that either.
I doubt I'm a rationalist prodigy, so I'm more worried that I'm missing out on something profound than failing humanity :)
3) Unfortunately, he's the only author I'm even aware of that writes on human rationality and is well-regarded. I spend most of my reading time on other topics.
Well, the people over at the Center for Applied Rationality have tried to work on something like this, if only to attempt to better measure the effects of their workshops. I suspect this is a very hard undertaking.
Define "rational". Psychologists like Kahneman often say they're using some agreed-upon definition that exists outside their field. But applications of Bernoulli and Bayes are an area of ongoing research, even narrowly construed. More generally, no theory of general intelligence exists, and the assumption that you can trick intelligent agents like humans with simplistic experiments is unconvincing, IMO.
The notion of domain-independent rationality is the favourite way for some nerds (so-called "sceptics") to pump up their egos and then throw around nonsensical judgements in domains they don't know the first thing about. But hey, they know Bayes' theorem! So they will out-smart everyone on every topic! In some sense this is an intellectual philosopher's stone: the notion (never said out loud, but that is what seems to happen subconsciously) that once you are "rational" enough you can judge things effectively without having expertise in a given field, medicine for example.
A good example is Nassim Taleb's last book, a collection of completely ridiculous opinions on biology, medicine, computing and fitness...
Depending on how rigorously the research on this is done, this is a test I might actually take, especially since, as he says in the article, a lot of it is stuff you can improve if you find yourself lacking.
On the other hand, I don't want to take an IQ test. This is because you end up with a number which can now be used to place you on a scale relative to others, which as far as I'm concerned is fairly useless. I'd much rather be judged based on what I've done rather on some number that purports to measure my potential.
 In my (fairly limited, I grant) experience, IQ scores are mostly used by insecure navel-gazers as a sort of bragging right.
While I am not there yet myself with most of the biases, it seems that memorizing the name of a bias once you understand it helps you spot it more often in daily life. [Sorry, I do not know which cognitive bias that would be! :-)]
Just looked up these books on Amazon. They sound promising, but at least the first and last ones don't seem to be related to rationality. Could you please say more about what you learned from these books as it relates to rationality?
The first book also sounds like a promising read on consciousness. I'm keen to hear your key takeaways on that too, to help me prioritize. I'll definitely be reading it myself, but can't get to it right away due to existing load.
The best way to test this would be to approach it the way intelligence research originally approached that hypothesis for IQ scores: by testing a reasonably large set of twins separated at birth. That's such a tricky criterion to come by that I can't imagine it'll be easy to do again, but it did help cement the idea that IQ was likely not significantly impacted by cultural/environmental factors, and thus possibly a result of subtle genetic traits. The interesting thing about those studies, though, was that the separated twins tended to have a lot of other unexpectedly subtle quirks in common too (such as sense of humor). So in that context, I actually wouldn't be surprised if rationality falls under those similarities as well.
I used to naively assume that intelligence implied rationality. An old work colleague taught me differently.
My old work friend was really intelligent. He had been considering a PhD in philosophy and couldn't stop talking about Wittgenstein.
On one hand, this guy had a strong interest in -- and had studied deeply -- language meaning, epistemology, logic, the scientific method, etc.
On the other hand, he believed in conspiracy theories like being a 9/11 truther, etc.
I tried to discuss this contradiction with him, but probably reasoned at too low a level. I argued that there is an infinite number of conspiracy theories that would satisfy the available facts. It might have worked better to simply ask: "What do your philosophy studies say about conspiracy theories based on hand-picked facts?"
Except if he is right, you know. Which depends on which conspiracies he is arguing about, and on what depth. It's just as wrong to assume the official or popular version of the stories was correct, just because it was reported in the media that way.
If he argued for example that there was widespread surveillance of Americans 2 years ago, lots of people would have told him "No, that's a conspiracy theory". Well, turns out it wasn't.
I'm not going to go over this guy's many and varied strange opinions, just note that they were conspiracy theories and that he thought the people he disagreed with intellectually were dishonest (e.g. AI researchers from before the AI winter). And so on.
(And if you have a hard and certain opinion about something really complex you don't understand without good facts/logic for support, what are you doing..? Don't answer, that was a rhetorical question.)
The reality is rational people do dismiss hypotheses completely unsupported by the available evidence. They just at the same time acknowledge that they could theoretically be wrong.
There's nothing irrational about saying "Aliens did not build the Pyramids", or "I know that aliens did not build the pyramids". It is assumed that a rational person acknowledges an infinitesimally small possibility that they are wrong in stating such a fact. But the reality is, all claims, whether negative ("There is no conspiracy") or positive ("There is a conspiracy"), are accepted by a rational person with such an acknowledgment of varying importance, so there's no reason for a rational person to append every statement with "but I could be wrong".
The crucial flaw I think that you (and the many, many others who have made such arguments) make is a fallacious distinction between positive and negative claims.
Let's take a positive claim like: "There is a chair in the corner of the room". This claim, however well supported by physical evidence (our rational subject has felt the chair, seen the chair, and seen others interact with the chair), can't be known rationally with absolute certainty; of course, there's always the possibility that some sufficiently advanced technology, conspiracy, mind-altering substance, or intelligent being is deceiving the rational subject, and there is no chair. This remote possibility must be acknowledged, but at the same time, it is dismissed by every rational agent, and rational agents do not base their actions on such possibilities. The point is, every positive claim like "There is a chair" can be framed as a negative claim like "There is no force or phenomenon deceiving me into believing there is a chair". Thus, if your suggestion that "dismissing unsubstantiated negative claims is irrational" is accurate, it would follow that you also cannot accept any positive claims without being irrational.
Rationality is having an open mind, accepting the most likely conclusion based on the available evidence, and properly estimating the degree to which you may be wrong. It works exactly like that for both positive and negative claims.
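"Properly estimating the degree to which you may be wrong" is exactly what Bayes' theorem tracks. A minimal sketch (all probabilities here are illustrative assumptions, not measurements of anything) of an agent that dismisses an unsupported claim for practical purposes while still letting evidence move it:

```python
# Bayes' theorem: update a prior belief on a piece of evidence.
# All numbers below are made-up for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start nearly certain the claim is false (prior of 0.1%)...
p = 0.001

# ...then update on three pieces of weak "evidence", each assumed to be
# only twice as likely if the claim is true as if it is false.
for _ in range(3):
    p = bayes_update(p, p_evidence_if_true=0.2, p_evidence_if_false=0.1)

print(round(p, 4))  # still well under 1%: dismissible, but not frozen at zero
```

The point of the sketch is the symmetry the comment describes: the same update rule applies whether the claim is phrased positively or negatively, and "dismissing" just means the posterior stays low enough that it shouldn't drive your actions.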
I hope you realize this isn't just semantics. This difference is crucial in understanding why arguments like "You don't know for certain if there's no God" and "You can't prove there's no conspiracy" are flawed, and these arguments are actually used by irrational people to justify all kinds of nonsense.
I'm not going to discuss arguments about one of the beliefs of a non-rational person. If you feel a pressing need to iron out the finer distinctions of such stuff, I'm certain there are multiple suitable sub-reddits.
(But I'll add: There are also an infinite number of non-conspiracy theories fitting the facts.)
Edit: coldtea, sorry I'm not going to discuss e.g. if an infinite number of possible alien species could be responsible for 9/11. (And no, not even if the possibly existing alien species are enumerable :-) ) See above. I might, if I had more time.
> he believed in conspiracy theories like being a 9/11 truther, etc.
On that subject, there's a video that it would be great if every American watched. It's made by a group called Architects & Engineers for 9/11 Truth. It's about evidence that explosives were used on 9/11. https://www.youtube.com/watch?v=Ddz2mw2vaEg
Seriously, come on, people are not rationally thinking creatures at all, nobody is. There is nothing to even measure.
If everybody became rational, the world would collapse instantly. Do you realize how many people would cease their socially indispensable work if they acted perfectly rational from an individual (meaning egotistical) perspective?
I clearly see that I am irrational, but I cannot stop being so.
I estimate that in order to become rationally thinking, I would have to undergo an unbearably painful transformation of my entire mental entity. My mind just works that way, it is hardwired irrationally, as any other human's mind.
The article and the attempt itself are great though.