

Humans are not Automatically Strategic (2010) - jeremynixon
http://lesswrong.com/lw/2p5/humans_are_not_automatically_strategic/

======
YokoZar
While the author has a great underlying point, I find this style of writing a
particular disservice: "Perhaps 5% of the population has enough abstract
reasoning skill to verbally understand that the above heuristics would be
useful once these heuristics are pointed out."

This is a version of "you, enlightened reader, are among the special chosen
few" - to me it reads no differently in rationalist essays than when it gets
pitched to me by multi-level marketers selling their get-rich-quick schemes.
If you're familiar with reading bullshit, such an appeal to the reader's
vanity should immediately send up a few red flags, and it can completely
detract from what would otherwise be a reasonable piece.

~~~
jere
The site has always given me that vibe. I'm not sure I understand the point of
a bunch of people getting together to celebrate how smart they are. Especially
since they're pulling statistics out of thin air and using unconvincing
examples: you're telling me most comedians _don't_ keep track of which jokes
are getting laughs?!

My final impression: [https://xkcd.com/874/](https://xkcd.com/874/)

------
maaaats
If you like blog posts like these, check out the book _Rationality: From AI to
Zombies_. It's basically a compilation of these posts.

~~~
roryokane
Link: [https://intelligence.org/rationality-ai-zombies/](https://intelligence.org/rationality-ai-zombies/).
The ebook collection has a suggested price of $5.00 and a minimum price of
$0.00.

------
mangeletti
Wow, this is really interesting... I'm going to return later to read all the
comments herein and in the article.

------
paulsutter
> Why? Most basically, because humans are only just on the cusp of general
> intelligence.

I agree. To me, general intelligence is defined as the ability to create a
greater intelligence - an ability that's hypothesized for humans but as yet
unproven. Hence, on the cusp of general intelligence.

~~~
dvanduzer
General intelligence does seem to be non-orthogonal to strategic thinking.

(The nuance is what Yudkowsky et al. tend to write about.)

second edit: "Can we actually change that en masse?" is the most interesting
question.

~~~
gull
Is there any way this could be done systematically with software?

~~~
dvanduzer
Are you talking about software that provides relevant information for any
human making a general decision? Or are you describing another kind of
software?

