I'm partial to the argument that the reason we have heuristics, or "rules of thumb", is that they save energy. If we invested fully rational mental effort in every decision, we wouldn't get as much done. Heuristics persist because they are usually correct and save time and energy.
However, they are also sometimes wrong. And I also know that cognitive biases can lead us astray in the same way that a heuristic can be wrong. And that's where rationality is useful, because it helps us make the correct decision in those cases.
If both are true, then the optimal way of living would be to use heuristics where they don't get you in trouble, and rational examination where the result would be counterintuitive. Here's where I'm stuck: how do you recognize ahead of time when you should ignore your instinct/heuristic/cognitive bias and instead use the more exhaustive rational examination?
It's as if we need to develop an intuitive skill on when to recognize that a reality is likely to be counterintuitive. "Oh, my gut tells me that this is one of those cases where my gut will be wrong." Which seems a bit contradictory.
I'm not great at it, but I'm very happy that it catches at least some of my brain's bugs.
Of course, identifying the crossover point would seem to require a second-order heuristic...
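That second-order heuristic can at least be caricatured as a cost-benefit check. A minimal sketch (the function name and all numbers below are illustrative assumptions, nothing from the thread): deliberate only when the stakes times the heuristic's expected error rate exceed the fixed cost of slow, rational analysis.

```python
# Hypothetical sketch of the "when to override a heuristic" decision.
# All quantities are illustrative assumptions, not measurements.

def should_deliberate(stakes, heuristic_error_rate, deliberation_cost):
    """Deliberate only when the expected loss from a wrong snap
    judgment exceeds the cost of slow, rational analysis."""
    expected_loss = stakes * heuristic_error_rate
    return expected_loss > deliberation_cost

# Low stakes, reliable gut feeling: trust the heuristic.
print(should_deliberate(stakes=10, heuristic_error_rate=0.05,
                        deliberation_cost=5))       # False

# High stakes, same gut feeling: slow down and reason it out.
print(should_deliberate(stakes=10_000, heuristic_error_rate=0.05,
                        deliberation_cost=5))       # True
```

Of course, this just relocates the problem: estimating the stakes and the error rate is itself a snap judgment, which is exactly the contradiction the parent comment points at.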
No, he outright makes that "diagnosis" of the former president, going so far as to title the chapter 1 "teaser" of his book "Inside George W. Bush's Mind". It's available here:
Maybe he's being cagey about it because his "evidence" is incredibly weak sauce. He expects most of the audience for his book to nod along when he describes a guy who managed to achieve the highest political office in an incredibly competitive environment as unable to make rational decisions. For his analysis of GWB's cognitive function, he cites a whole two people.
First, there's David Frum, who served as a speechwriter for 13 months from early 2001 to early 2002. Before that he had "no connection to the campaign or the Bush family". He's used that stint to great career effect, but seriously, there are a lot of folks who worked more closely with Bush for a longer period of time and have written about his thought and leadership style. Surely if Stanovich wasn't just looking for a good "hook" to snare his liberal readership, he could have found better sources, critical and otherwise. Oh, and Frum was a huge initial supporter of the Iraq invasion, and obviously Stanovich or his imagined readership wouldn't dream of calling that a clearly irrational decision.
Second, there's a bit from a George Will column. The section in italics was strangely omitted by Stanovich:
"He has neither the inclination nor the ability to make sophisticated judgments about competing approaches to construing the Constitution. Few presidents acquire such abilities in the course of their pre-presidential careers, and this president particularly is not disposed to such reflections."
So which U.S. presidents weren't "dysrationalic" by this standard?
This guy may be a terrific research psychologist, but he's obviously not above slandering a president and pandering to his readership with bad scholarship just to sell books.
It's also odd to point to his performance in politics as evidence of his rationality. Politics is a deeply irrational field, and I wouldn't consider becoming President an indicator that a person is rational any more than I would consider becoming head of the Flat Earth Society to be one.
I'm not saying these specific conclusions are legitimate, but I don't think it comes down as "slander", really.
 Not in the legal sense, of course, since it's written, not spoken.
* he waved off warnings about Bin Laden ("All right. You've covered your ass, now")
* he turned a highly effective -- under Clinton -- national disaster agency into a joke, at least in part by putting a failed horse judge in charge of it, then oversaw an ineffective response to one of the largest natural disasters ever to strike the US (Katrina)
* he lost Bin Laden at Tora Bora
* he and his chosen VP and Secretary of State back-channeled incredibly wrong information into the public, in part via the NYT, in order to sell a war with, well, people who had no weapons of mass destruction and who were uninvolved in 9/11
* by the by on the above, he fired the general (Shinseki) who gave an accurate prediction of Iraq war costs
* Bush hired Wolfowitz, who predicted the Iraq war would cost between $10B and $100B (what's $700B and counting between friends?)
* Bush ignored, well, basically everyone who understood much of anything about the Middle East in favor of a war which wildly destabilized a highly geopolitically important (oil) zone, while empowering Iran
So your link, in which the author writes,
> [...] President Bush's thinking has several problematic aspects: lack of intellectual engagement, cognitive inflexibility, need for closure, belief perseverance, confirmation bias, overconfidence, and insensitivity to consistency. These are all cognitive characteristics that have been studied by psychiatrists and that can be measured with at least some precision. However, they are all examples of thinking styles that are not tapped by IQ tests. Thus, it is not surprising that someone could suffer from many of these cognitive deficiencies and still have a moderately high IQ. Bush's cognitive deficiencies do not impair performance on intelligence tests, but they do impair rational decision making.
Indeed, in contrast to your laundry list, Stanovich doesn't present a single concrete example of an irrational decision by the former president and the cognitive bias that led to it. He doesn't have to, because he wants to appeal to a readership that is delighted to find that not only is the target of their political animus wrong, but actually suffering from a mental disability!
It's intellectually lazy and unworthy of a serious scholar.
Consider that his religious affiliation suggests a level of irrational thinking, but you can't say such things if you want a best seller. Yet there are plenty of scholarly works focused on the connection between religion and irrationality, because scholars, especially those with tenure, can get away with such things.
This must be some kind of alternate reality where GWB did not declare war on Iraq based on a complete knee-jerk (read: irrational) reaction to 9/11.
And I'm being very generous here. There are many smart people who have provided evidence that GWB's decision to invade Iraq was based on malice.
It's possible for someone to reach a high office in a competitive environment and still make decisions that are not so rational. Cognitive biases are very strong and affect most people, so people in high office have their own biases too.
and I have recommended that book repeatedly in Hacker News discussions over the years. The book is readable, interesting, and surprising, and the bibliography cites most of the best recent research on human cognition.
A great place to study the art of human rationality is LessWrong.com. The 'sequences' are the place to start:
2) Have you read only a few of the things he wrote about -- for example, the better-known cognitive biases and such -- or have you also read some of the more advanced sequences?
Because if you're reading the advanced essays and feel you already know all that stuff, you are a very rare breed -- good for you! I hope you're working on some hard problem in an un-sexy field and not building another photo-sharing app :)
To quote Dr. Aubrey de Grey:
>It has always appalled me that really bright scientists almost all work in the most competitive fields, the ones in which they are making the least difference. In other words, if they were hit by a truck, the same discovery would be made by somebody else about 10 minutes later.
3) When someone says they don't like something that I like, I ask them what it is that they like. I figure maybe they've found something even better and I'd love to get my hands on it too, and it's also a good way to see if they're just signalling superiority by disliking things that many others like. So what would you recommend I read to learn more about human rationality? Anything other than the usual suspects (Jaynes, Kahneman, Tversky, Schelling, Hastie & Dawes, etc)?
2) I've tried to read some of the major sequences, but end up quitting partway through. Each essay seems so long for what it's trying to impart. I tried reading HPMOR in hopes of getting the same or similar information with a whimsical story instead, but didn't enjoy that either.
I doubt I'm a rationalist prodigy, so I'm more worried that I'm missing out on something profound than failing humanity :)
3) Unfortunately, he's the only author I'm even aware of that writes on human rationality and is well-regarded. I spend most of my reading time on other topics.
A good example is Nassim Taleb's last book, a collection of completely ridiculous opinions on biology, medicine, computing and fitness...
On the other hand, I don't want to take an IQ test. This is because you end up with a number which can now be used to place you on a scale relative to others, which as far as I'm concerned is fairly useless. I'd much rather be judged based on what I've done rather on some number that purports to measure my potential.
 In my (fairly limited, I grant) experience, IQ scores are mostly used by insecure navel-gazers as a sort of bragging right.
The bit about Bush should have been omitted.
But, that may just be my irrationality.
While I am not there yet myself with most of the biases, it seems that memorizing the name of a bias once you understand it helps you spot it more often in daily life. [Sorry, I don't know which cognitive bias that would be! :-)]
Something that would give me a paragraph and ask me to name the bias present if any.
The Commanding Self
Influence: The Psychology of Persuasion
The first book also sounds like a promising read on consciousness. I am keen to hear your key takeaways from that too, to help me prioritize. I'll definitely be reading it myself, but can't right away due to existing load.
My old work friend was really intelligent. He had been considering a PhD in philosophy and couldn't stop talking about Wittgenstein.
On one hand, this guy had a deep interest in -- and had studied deeply -- the subjects of meaning in language, epistemology, logic, the scientific method, etc.
On the other hand, he believed in conspiracy theories like being a 9/11 truther, etc.
I tried to discuss this contradiction with him, but probably pitched my reasoning at too low a level. I argued that there is an infinite number of conspiracy theories that would satisfy the available facts. It might have worked better to simply ask: "What do your philosophy studies say about conspiracy theories based on hand-picked facts?"
Such a sad, sad waste of a good brain.
Except if he is right, you know. Which depends on which conspiracies he is arguing about, and on what depth. It's just as wrong to assume the official or popular version of the stories was correct, just because it was reported in the media that way.
If he argued for example that there was widespread surveillance of Americans 2 years ago, lots of people would have told him "No, that's a conspiracy theory". Well, turns out it wasn't.
If somebody argued that the CIA knowingly had people sell crack in the USA in order to finance foreign operations, people would also yell that it's a conspiracy theory. But then you get this: http://en.wikipedia.org/wiki/CIA_and_Contras_cocaine_traffic...
(And if you have a hard and certain opinion about something really complex you don't understand without good facts/logic for support, what are you doing..? Don't answer, that was a rhetorical question.)
Thus, you can't say it wasn't a conspiracy. Yeah, you can't say it was one either. Both of you are jumping to conclusions, which is irrational by the article's definition.
There's nothing irrational about saying "Aliens did not build the Pyramids", or "I know that aliens did not build the pyramids". It is assumed that a rational person acknowledges an infinitesimally small possibility of being wrong when stating such a fact. In reality, all claims, whether negative ("There is no conspiracy") or positive ("There is a conspiracy"), are accepted by a rational person with such an acknowledgment, of varying importance, so there's no reason for a rational person to append every statement with "but I could be wrong".
The crucial flaw that I think you (and the many, many others who have made such arguments) commit is a fallacious distinction between positive and negative claims.
Let's take a positive claim like "There is a chair in the corner of the room". This claim, however well supported by physical evidence (our rational subject has felt the chair, seen the chair, and seen others interact with the chair), can't be known with absolute certainty -- of course, there's always the possibility that some sufficiently advanced technology, conspiracy, mind-altering substance, or intelligent being is deceiving the rational subject, and there is no chair. This remote possibility must be acknowledged, but at the same time, it is dismissed by every rational agent, and rational agents do not base their actions on such possibilities. The point is, every positive claim like "There is a chair" can be framed as a negative claim like "There is no force or phenomenon deceiving me into believing there is a chair". Thus, if your suggestion that "dismissing unsubstantiated negative claims is irrational" were accurate, it would follow that you also cannot accept any positive claims without being irrational.
Rationality is having an open mind, accepting the most likely conclusion based on the available evidence, and properly estimating the degree to which you may be wrong. It works exactly like that for both positive and negative claims.
I hope you realize this isn't just semantics. This difference is crucial in understanding why arguments like "You don't know for certain if there's no God" and "You can't prove there's no conspiracy" are flawed, and these arguments are actually used by irrational people to justify all kinds of nonsense.
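The symmetry between positive and negative claims falls out naturally if you treat belief as probability. A hedged sketch (the prior and likelihoods below are made-up illustrative numbers, not measurements): Bayes' rule assigns "there is a chair" and "there is no chair" complementary probabilities from the same evidence, so neither kind of claim gets special treatment.

```python
# Illustrative sketch: Bayes' rule treats a claim and its negation
# symmetrically. All probabilities here are invented for illustration.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(claim | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# "There is a chair": strong sensory evidence, even prior.
p_chair = posterior(prior=0.5, p_evidence_if_true=0.99,
                    p_evidence_if_false=0.01)

# The negative claim "there is no chair" is just the complement.
p_no_chair = 1 - p_chair

print(round(p_chair, 3), round(p_no_chair, 3))  # 0.99 0.01
```

The rational move is the same in both directions: accept the high-probability side, keep the residual probability on the other side, and don't ritually announce it in every sentence.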
(But I'll add: There are also an infinite number of non-conspiracy theories fitting the facts.)
Edit: coldtea, sorry I'm not going to discuss e.g. if an infinite number of possible alien species could be responsible for 9/11. (And no, not even if the possibly existing alien species are enumerable :-) ) See above. I might, if I had more time.
That's wrong too. Social life and human action don't have infinite possibilities.
Depending on what scale you draw the line for detail, there is a finite number of theories fitting any fact -- and fewer still if you weight them by plausibility (Occam's razor).
And the more facts you have, the smaller the number of theories becomes.
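That narrowing can be made concrete. A toy sketch (the "theories" and "facts" below are hypothetical, chosen to be neutral): each new observed fact prunes the candidate explanations that fail to account for it.

```python
# Toy sketch: each observed fact shrinks the set of theories that
# remain consistent with everything seen so far. All entries are
# hypothetical examples, not claims about any real event.

theories = {
    "power outage": {"server down", "lights off"},
    "disk failure": {"server down", "io errors"},
    "network cut":  {"server down"},
}

observed = set()
for fact in ["server down", "io errors"]:
    observed.add(fact)
    # A theory survives only if it accounts for every observed fact.
    consistent = [name for name, predicts in theories.items()
                  if observed <= predicts]
    print(f"after {fact!r}: {consistent}")
```

More facts, fewer survivors; the plausibility point above (Occam's razor) would then enter as a weighting over whatever survives.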
On that subject, there's a video that it would be great if every American watched. It's made by a group called Architects & Engineers for 9/11 Truth. It's about evidence that explosives were used on 9/11. https://www.youtube.com/watch?v=Ddz2mw2vaEg
The result is around zero.
Seriously, come on, people are not rationally thinking creatures at all, nobody is. There is nothing to even measure.
If everybody became rational, the world would collapse instantly. Do you realize how many people would cease their socially indispensable work if they acted perfectly rational from an individual (meaning egotistical) perspective?
I clearly see that I am irrational, but I cannot stop being so.
I estimate that in order to become a rational thinker, I would have to undergo an unbearably painful transformation of my entire mental apparatus. My mind just works that way; it is hardwired irrationally, like any other human's mind.
The article and the attempt itself are great though.
It is often in the interest of individuals to contribute to the group.