Hacker News
Yes, You’re Irrational, and Yes, That’s OK (nautil.us)
39 points by dnetesn on Feb 26, 2015 | 12 comments



I often wonder if it's truly as irrational as we think, or if the fallacy is assuming that all our behavior should be viewed exclusively as centered upon the individual.

A lot of weirdness looks logical from the perspective of what I call "gene logic." If you frame us as containers for selfish genes, much of that strange behavior becomes "rational."


The fact that it ends up being beneficial to the species does not make the behavior any more rational. Strictly speaking, the label rational implies a cognitive phenomenon where some "thinking subject" represents a problem in an abstract way and applies the rules of logic to solve it.

Informally speaking, of course, people use "rational" as a synonym for "smart," "clever," or even "good." In this case, I think the word you are looking for is closer to "adaptive."


I'm considering the idea that the thinking subject in this case is best conceptualized as a vehicle for genes. In that case survival/prosperity has a completely different and even alien meaning from what we typically imagine. To give a trivial example: if I kill myself in order to save my children, I have succeeded and this is rational from the genetic point of view. It's not rational from a purely individualistic / hedonic point of view.


This is precisely my point. You attribute the capacity to reason to chemical molecules because their behavior seems purposeful. It does not occur to you that living entities may have goals and achieve those goals by using information processing means other than Reason(TM).

By the same logic, you could argue that tables are indeed rational to fall down, because otherwise they would fly into the sky.


Yeah, in the end you need to set some axiomatic framework for rationality.

If you asked me, I would probably include a term for the happiness of others around me (and some further away) in the stochastic expected-utility maximization / risk minimization that a rational individual should follow. But then I'm not sure that would be necessary, since I might be automatically less happy if I see that others are miserable.

Not relevant to this kind of analysis, but there are also a few bugs (or paradoxes) in naive definitions of rationality: for example, say you took a pill that somehow leaves you completely happy but in a state of "stasis," and people provided you with just subsistence food (essentially, taking heroin) -- if you define happiness in a straightforward way, that's something you should seek.

In other words, there's no small mathematical formula that will give you the perfect "rational" action to take; it requires nitty-gritty definitions involving the human mind.
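The expected-utility idea above can be sketched in a few lines. Everything here (the actions, the social weight, the payoff numbers) is invented purely for illustration, not taken from any formal model:

```python
import random

def expected_utility(action, n_samples=10_000, social_weight=0.5, seed=0):
    """Monte Carlo estimate of E[own_happiness + w * others_happiness]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        own, others = action(rng)            # sample one stochastic outcome
        total += own + social_weight * others
    return total / n_samples

# Two toy actions, each returning (own happiness, others' happiness):
def keep_money(rng):
    return 1.0 + rng.gauss(0, 0.1), 0.0     # sure gain for me, nothing for others

def donate(rng):
    return 0.2 + rng.gauss(0, 0.1), 2.0     # small gain for me, large gain for others

actions = {"keep": keep_money, "donate": donate}
best = max(actions, key=lambda name: expected_utility(actions[name]))
print(best)  # with social_weight=0.5: donate ≈ 0.2 + 0.5*2.0 = 1.2 > 1.0
```

Note how the "bug" in the parent comment shows up here: the answer depends entirely on how you define the utility terms and pick `social_weight`, and nothing in the math itself rules out a "happiness pill" action that dominates everything else.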


This is true, but it's too easy to go from that to assuming that ALL our cognitive biases were selected for. Some of them just emerge because it's hard to design a brain that doesn't work like that.

But some really basic cognitive biases, like the tendency for irrational confidence, are clearly evolutionarily adaptive.


True -- if you take this too far you get into the flimsier parts of evolutionary psychology.

On irrational confidence, I've long wondered whether Dunning-Kruger-type effects are adaptive. If we had an accurate image of our level of expertise, would we give up in despair in the face of undertakings like trying to understand theoretical physics?


I don't think the overestimation part of the Dunning-Kruger effect has much to do with evolution. Basically, it's a fundamental informational limitation: if you've never even seen an obstacle, you can't say whether you could jump over it or not. After that, it's just the natural optimism required of every species on earth to do anything non-trivial, nothing special. The underestimation part is more interesting, though: why not just keep going?


Since we can understand theoretical physics, giving up trying to understand it would not be rational, at least not in every case. You'd need to somehow derive the utility you expect to gain from understanding a thing and then compare that with the resources required to understand it. Admittedly, both of these things are hard to estimate when you have no idea what you're talking about in the first place.


Great article, and I totally stand by this in how I've designed my consumer product. However, it is quite poorly written. If I hadn't spent a ton of time reading academic papers, there's no way I would have been able to follow it.


Indeed, heuristics and biases have evolutionary origins and served their purpose in the EEA (it really couldn't be any other way...), and indeed, manipulating people into correcting for these heuristics and biases without their knowledge may not necessarily be the best course of action.

I wonder how many people will take this to mean that the study of heuristics and biases is for Misguided Robotic Bad People (economists) and we're Perfect The Way We Are.


Nice article. I expect there will be more and more acceptance that we cannot always be rational, and more and more products that help us leverage our finite willpower to make better rational decisions. I think there is a relation between willpower and rational decision-making (which may be a reason why it's good to sleep on something before making a decision).




