
What Developmental Milestones Are You Missing? - ggreer
http://slatestarcodex.com/2015/11/03/what-developmental-milestones-are-you-missing/
======
SilasX
The milestone about understanding trade offs really resonated with me. I had
been using (what I call) the "Scylla-Charybdis Heuristic":

-Don't trust any model that implies X is too low unless it's also capable of detecting when X would be too high

I've been frustrated by how people can fail this for X=the minimum wage,
immigration, ease of getting public assistance (just to show that it's across
the spectrum), as well as less political issues like "approaching strangers"
or "trying to persuade after a rejection".

There is a very common mentality out there that cannot admit to any downsides
to a favored policy. I had often attributed this to "well, they have to be
wary of tripping others' poor heuristics", but this may itself be the #1
fallacy Alexander mentions! It can just as well be that the person cannot
think in terms of tradeoffs.

~~~
widforss
I tend to avoid people who come up with simple solutions based on their
ideology. Folks who are honest about not having a simple solution make a much
more intelligent impression on me.

Does this comment make me a hypocrite?

~~~
ionised
What about situations where a simple solution would be the most effective and
appropriate?

Would you be averse to it just because it is simple?

~~~
pjc50
Potentially, yes? Especially if it's a long-standing complex problem. History
is full of oversold simple solutions. Hearing "just do .." should immediately
alert you to the possibility that the speaker hasn't understood your
situation.

~~~
eli_gottlieb
>History is full of oversold simple solutions.

Of course, history is _also_ full of people insisting, "No, hold on, it's far
more complicated than _that_!" and then being totally wrong.

For instance, people spent a very long time believing that a difficult-to-
model combination of many different factors produced stomach ulcers. Then an
experiment was done, and voila, the real cause was _Helicobacter pylori_.

Simplicity (or, in fact, _regularization_ ) is helpful far more often than
it's harmful.

~~~
mrob
In the case of stomach ulcers it actually is more complicated. _Helicobacter
pylori_ is the most common cause, but not the only cause:

[http://www.uptodate.com/contents/association-between-helicobacter-pylori-infection-and-duodenal-ulcer](http://www.uptodate.com/contents/association-between-helicobacter-pylori-infection-and-duodenal-ulcer)

------
duncanawoods
IMHO LW grossly overstates the importance of statistical thinking.

If you have a job making decisions for groups then statistics matters greatly,
but in our personal lives, genuinely statistical problems are rare compared
to the huge number of decisions we actually face.

Most of our decisions are characterised by uncertainty, uniqueness and an
absence of data. Real-world rationality is less like mathematics and more like
system design where we weave together plans and strategies. For example using
contingency planning - a rational buyer ensures the seller has a good return
policy rather than trying to use Bayes' rule to analyse their past purchase
experiences.

IMHO the overlooked core of rationality is creativity - you have to imagine
possibilities and invent plans. It's only when you reduce decision-making to
math problems that math seems so important.

~~~
goodcanadian
You are exhibiting a case of failing in the "2. Ability to model other people
as having really different mind-designs." The author is a psychologist, not a
statistician. When he talks about thinking probabilistically, he is not
talking about sitting down and doing the math. He is talking about recognizing
that there is no such thing as certainty and understanding that even very
unlikely outcomes may occasionally occur. It is about making reasonable
rational decisions in an uncertain world, hedging uncertainty if possible, and
generally being OK with it.

~~~
kazagistar
In college I had a friend who didn't understand 50/50. If an outcome was not
yet decided, then both options are the same "size", right?

One would like to think that my debates with him about this, or maybe the
stats class he took, were what helped fix this intuition. But what probably
was just as important was playing games like LoL, where in the face of
uncertainty, you make calculated risks all the time. At some point, it is
impossible to get better without making intelligent risks.
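The intuition failure described above (treating any undecided outcome as 50/50) can be poked at with a quick simulation; the die example here is my own illustration, not from the thread:

```python
import random

# An undecided binary outcome need not be 50/50: "roll a six" vs "don't"
# has two possibilities, but they are not the same "size".
rng = random.Random(42)
rolls = [rng.randint(1, 6) for _ in range(60000)]
p_six = sum(r == 6 for r in rolls) / len(rolls)
print(p_six)  # ≈ 1/6, not 1/2
```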

------
TeMPOraL
I'm gonna add - understanding feedback loops. A _lot_ is written about
economics and politics by people who don't seem to get that one, and those
discussions are mostly pointless. Myself, I understood it only after a control
theory course in university, though now I recognize that high-school level
chemistry teaches it too (chemical equilibrium). After you grok it you start
to see how incentives can feed one another and that it's perfectly reasonable
for a bad system to exist in which there is no single actor to assign blame to
(compare: is television stupid because people are dumb, or are people dumb
because TV is stupid?).
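As a toy illustration of the control-theory framing (my own sketch, not from the comment): a negative feedback loop in which each step corrects a fraction of the current error converges to an equilibrium regardless of where it starts.

```python
# Minimal negative-feedback loop (thermostat-style proportional control).
# The setpoint and gain are made-up numbers; any gain in (0, 1) converges.
setpoint = 20.0
x = 0.0
gain = 0.3

for _ in range(50):
    x += gain * (setpoint - x)  # correct a fraction of the remaining error

print(round(x, 6))  # settles at the equilibrium, ~20.0
```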

~~~
chipsy
I've had the experience of understanding feedback loops perfectly well but
still feeling inclined to blame a "big bad" for most ills. I had trouble
developing an ethical basis for my actions because I swing across the
political aisle very readily based on utilitarian arguments, but
simultaneously feel tons of empathy for others and thus want to be in
collaboration with everyone I meet. Thus I swept through the gamut of
political thought, radical in various ways at various times, moderate at
others.

Working through that meant gaining a better distinction of the self from the
system. I explicitly use a virtue ethics mode for my immediate surroundings
and everyday behaviors (I behave in the way in which I feel the least guilt),
and then a utilitarian mode when I have to make a decision affecting
potentially large groups (e.g. my voting decisions tend to be less selfish and
more model-based). This keeps my priorities healthy because it stops me from
applying an inappropriate "scale" of thinking to the situation.

------
djhn
> whether the listener needs the piece, already has the piece, or just plain
> lacks the socket that the piece is supposed to snap into.

This is very relevant to how we think about education, communication,
information design and propagation.

The combination of this SSC post and the linked David Chapman piece is very
good. They explained and formalized quite a few concepts that had seemed like
weird nebulous glitches in the matrix to me. And not only did they make some
things click into place, they expanded the number of places things could click
into.

------
dools
Reminds me of the 7 transformations of leadership
[https://hbr.org/2005/04/seven-transformations-of-leadership](https://hbr.org/2005/04/seven-transformations-of-leadership)

------
vlehto
>2. Ability to model other people as having really different mind-designs

I recently got a very good lesson about this while learning about MBTI.[1]
It's really not a scientifically rigorous model, but it shows you how some
people could function in a fundamentally different way than you. And there are
people who subscribe to each of the 16 categories. The Big Five is more
statistically valid, but it is lacking in the potential to explain the world
from different points of view.

1.
[https://en.wikipedia.org/wiki/Myers%E2%80%93Briggs_Type_Indicator](https://en.wikipedia.org/wiki/Myers%E2%80%93Briggs_Type_Indicator)

------
teekert
"Ability to think probabilistically"

I think that is really powerful, the ability to think in a Bayesian way, to
not label something as one thing or the other but to accept fully that, as
things stand now, something can be 70% A and 30% B. This does not even have to
collide with objectivist views, as the thing is either A or B but, based on
all the information you have, you can only be 70% sure it is A.

New information may well lead to a new 20% A, 70% B, 10% something else. I
tend to judge people by their ability to do this. "Never enter a discussion
unwilling to change your mind" is one of my mantras. I'm also fully aware I
break it often, by the way.
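A minimal sketch of that kind of update in code (the prior matches the 70%/30% above; the likelihood numbers are invented purely for illustration):

```python
# Toy Bayesian update: prior belief 70% A / 30% B, then new evidence E.
# The likelihoods P(E|A) and P(E|B) below are made-up numbers.
prior = {"A": 0.7, "B": 0.3}
likelihood = {"A": 0.2, "B": 0.8}  # the evidence fits B much better than A

unnorm = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnorm.values())
posterior = {h: unnorm[h] / total for h in unnorm}
print(posterior)  # belief shifts from mostly-A toward B
```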

I have often seen two experimental situations with highly overlapping
histograms for one parameter and then have a colleague ask: "So where do we
draw the line for positive/negative?" I always say: "We don't. We call the
sample that is exactly in the middle 50% positive, 50% negative and we have
said everything we can about said sample given the 1 dimensional information
we have."

What's cool for the Dune fans, this seems to me to be exactly what Bene
Geserit witches and mentats do. Ingest proofs, take tiny, tiny hints, combine
them in a Bayesian manner, produce a likelihood of truth for a certain
hypothesis... Perhaps I'm over-interpreting (and projecting) ;)

~~~
dmichulke
This entails the ability to change one's mind, as in "previously, I would have
done this, but given the new evidence now I favor that" - something often seen
as a weakness in politicians.

~~~
kabouseng
Not just in politicians...

"Hmmm I think this bug will take a week to fix"

Gets fixed in two days.

"So why were you wrong in your estimate?" or even worse, now your estimates
gets discounted next time. And the same holds true if it happens the other way
round.

~~~
jdietrich
It can be useful to explicitly state error bars. "Delivery will take three to
seven days" communicates uncertainty much more clearly than "delivery will
take about five days".

------
zamalek
_Anecdotal and worthless information but someone might find it interesting._

> he gets the thing, where “the thing” is a hard-to-describe ability to
> understand that other people are going to go down as many levels to defend
> their self-consistent values as you will to defend yours.

I think it boils down to anger.

I can't recall what causes it (the basal ganglia if I'm not mistaken); at
birth we have "biological firmware" that protects our internal mental state -
part of which is our belief system. When our internal state is challenged, a
fight-or-flight response is invoked.

Hypothesis 1: this means that the anger or irrational response to a challenge
of beliefs _could_ be a biological disposition.

Hypothesis 2: if you still have any semblance of neuroplasticity it might be
possible to 'unwire' or 'reflash' this primitive "firmware".

I've been attempting to do this over a few years: killing off that primitive
part of my brain by proactively seeking out conversation with intelligent
people who are opposed to my beliefs: mostly bigots. I have anecdotal evidence
that it might be working (although it can't be quantified or proven in any
way) - mostly by observing this fight-or-flight response in others when I have
none.

~~~
vlehto
It's called "cognitive dissonance".

If someone passionately disagrees with you, accusing him of cognitive
dissonance _always_ works. So you can assume the high ground by being right
and being capable of seeing the motivations of the other party. (Or it might
be that your opponent actually passionately disagrees. But let's not talk
about that.)

~~~
zamalek
> cognitive dissonance

Very, _very_ close to what I was referring to. As I recall there is a
biological reason underpinning effects like cognitive dissonance and
confirmation bias (Freudian denial in general). I could just be remembering
things incorrectly, and you may have hit the nail on the head.

------
delinka
2. Ability to model other people as having really different mind-designs

I believe not learning this is a problem in the communities within which I
grew up. Bible belt, rural ... The reinforcement of "we should all
think/believe/do the same things" seems to come from a lack of exposure to any
people or ideas from outside the community; from "isolationism" one could say.

~~~
johncolanduoni
I think the other extreme of elitist intellectuals has exactly the same
problem; they understand that people have different mind-designs, but that's
because those minds are _wrong_.

~~~
vlehto
I think it's typical for people to divide other people into two categories:
the ones who function just like me, and enemies who are nothing like me.

You find that phenomenon in surprising places. I'd claim that nobody is
completely immune.

~~~
will_pseudonym
"There are two types of people in this world. People who divide the world into
two types, and those who don't."

~~~
vlehto
Good catch. :)

I was thinking about Judith Butler. Philosopher, gender theorist and published
author. I read her "Gender Trouble" to about halfway. It occurred to me that
she might not really recognize that some women might be genuinely straight. It
explained the tone of the book a bit too well. My queer ex-roommate also
expected everybody appearing straight to just be closeted. These are
individuals who _really_ should understand that sexuality comes in all colors
of the rainbow.

------
bakhy
really interesting read.

the only thing that bugs me, with this text, and the whole rationalist world-
view, is how human emotion sometimes becomes a thing to be ashamed of.
something beneath a hypothesized "true", "correct" condition. like, the part
about recognizing that other people have different minds, different motives,
and thus reach different conclusions - on one hand i agree that it is a matter
of personal development, and that many people are just plain jerks (sometimes,
myself included). but on the other hand, the political examples given in the
text seem like a normal emotional response. isn't it simply hard trying to
think for everyone, understand everyone all the time? is it even possible?
particularly when the issue is something that, for whatever reason, you feel
personally. it is not automatically a sign of retarded cognitive development
if you just simply got tired.

rationalism sometimes seems so, well, christian - we are all filthy sinners by
default.

~~~
TeMPOraL
> _the only thing that bugs me, with this text, and the whole rationalist
> world-view, is how human emotion sometimes becomes a thing to be ashamed of.
> something beneath a hypothesized "true", "correct" condition._

The "rationalist world-view" is just noticing that (per #1) what your brain
tells you and what the world really is are two different things; therefore
your emotions may or may not be properly aligned with reality (btw. this is
also the primary insight of Cognitive Behavioral Therapy). Emotions are just
like opinions, but in a different part of the brain - you want to have ones
that correspond to reality.

"Contrary to the stereotype, rationality doesn't mean denying emotion. When
emotion is appropriate to the reality of the situation, it should be embraced;
only when emotion isn't appropriate should it be suppressed." -
[http://wiki.lesswrong.com/wiki/Emotion](http://wiki.lesswrong.com/wiki/Emotion)

~~~
bakhy
OK, but let's take again the political example - there is a confounding social
factor there. i think it is common to regard a concession, in the eyes of the
public, as defeat. this drives many people to shout their positions, and
disregard rational counterarguments, resorting to personal attacks (like the
"democrats are pro-crime" claim). even if those people are in fact perfectly
reasonable otherwise... the political example is really insufficiently
explained by bringing in just cognitive development.

now, generally, from experience, i have the impression that higher mental
skills often translate simply into a higher capability of producing more
elaborate rationalizations of positions that a person would have held anyway,
positions which are rooted in emotions. pointing out rational counterarguments
can in many situations only go so far. in fact, i have many times had an
easier time discussing with people one might call primitive, since you get to
the truth of their motives very quickly, and with respectful approach to
people's feelings, a lot can be achieved. with very rational people it's
sometimes really difficult to break through the layers of rationalizations.

i'm not saying that rationalism is bad, or that the author (or you) like to go
around calling people retards :D i just feel that despite best intentions it
can lead to disregard for emotion, even though in many cases emotion is the
ultimate relevant "reality of the situation". we can and must make ourselves
better, but we can never stop being people.

~~~
yummyfajitas
The point of rationality is to detect your own cognitive failures and to
recognize and exploit those of others. It's pretty explicitly NOT about
ignoring them.

So when I observe someone saying "that guy is racist, don't believe his
arguments", I can recognize that this is an emotional claim. It's an attempt
to demonize a person rather than refute the argument. I can then observe that
no one is disputing the actual argument, evaluate that on merit, and have a
true belief about the world unbiased by my desire not to affiliate with
racists. I can also update my beliefs about the rationality/honesty of the
person saying "hey that guy is racist".

Conversely, I can also recognize that a desire not to be racist is strong in
people, and exploit it when I want to manipulate the less rational. For
example, if I'm arguing against economic protectionism with an emotionally
driven person, I might use Jim Crow as an example of protectionism rather than
occupational licensing.

~~~
bakhy
i think you missed my point, which was that these so called "cognitive
failures" are really integral to what we are. rationalism is not generally
wrong. improving your cognitive abilities should be a goal for everyone. but
it tends to lead people to a worldview where simple humanity is deemed
inferior, wrong, and where a person actually starts believing that they
outgrew themselves and their humanity, that they are superior, objective, free
of bias. this is an illusion, nobody is free of bias.

in fact, people who believe they successfully suppressed or outgrew any
emotion are typically the ones most influenced by it, subconsciously.

~~~
falcolas
Much like the Zen ideologies, there is no end state to being a rational human
being. You can't just stop at one point and say "That's it, I'm perfectly
rational now, therefore..."

Instead, it's more of an ongoing process. The work of identifying your biases
and reasoning about them never ends. You will always have biases; the trick is
to make the subconscious influence conscious. To come back to the Zen
comparison - the more rational you become, the more you realize you're not
rational at all.

~~~
bakhy
well said. i just feel it is something people easily say, but hardly put into
practice.

i am being unfair, perhaps, as the other comment here states. i may be
identifying some negative aspects in some people with a whole group that did
not deserve it. but, OTOH, you two may be resorting to the "No True Scotsman"
fallacy.

~~~
falcolas
> you two may be resorting to the "No True Scotsman" fallacy.

That may be a fair statement, since I do not believe there is a perfectly
rational person in the real world - just those who are working to fight
against their inherent irrationality. All we can ever rationally (oh, the
irony) expect is that they do their best.

------
calcsam
Scott makes a lot of good points, but frames them in a very Less Wrong kind of
way, as if the key to development or growth were always being able to think
more clearly.

You could pick any number of other developmental milestones in other
categories, for children, teenagers, or adults.

In the ethical bucket, children begin to recognize abstract rules as a system
of dispute resolution ('he should go first because he got there first'),
develop a concept of fairness, etc.

In the social / emotional bucket, adolescents begin desiring independence and
feel peer pressure much more acutely.

Is there any reason to believe that theory-of-mind-relevant developmental
psychology is more important than the other parts?

~~~
TeMPOraL
I think you're reading too much into that post. I don't see any place where
Scott presents this particular type of development as the most important one.
It's simply the one he decided to focus today's blog post on.

------
gohrt
SSC writing is so rambling and confusing. Someone could have a great career in
condensing SSC blog posts into concise, readable essays, not steeped in LW
arcana.

~~~
SilasX
Thanks for the idea!

------
lifeisstillgood
This suggests another role for government - to break a system out of a local
peak (forgotten term) or sub-optimal Nash equilibrium

Interesting

~~~
TeMPOraL
That's basically _the_ role of the government. It's a solution to coordination
problems, which are in themselves those local optima you're thinking about.

------
lordnacho
Seems to me numbers 3 and 4 are quite unlikely to develop unless you studied
statistics. Something like the Monty Hall problem is unobvious to most people
I've met. Heck, even PhDs discuss it. Conditional probability is really tricky
and counterintuitive.

The other example is this old error, which is attributed to doctors (for some
reason): You have a test which is 99 % accurate (will show the correct status)
for presence of some illness. 1/1M people have the actual illness. You find
someone with a positive score. What's the chance they have it? Well actually
not as high as you think.
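Working through those numbers with Bayes' rule (accuracy and prevalence as stated above):

```python
# Base-rate check: a 99%-accurate test, disease prevalence 1 in 1,000,000.
# What is P(ill | positive)?
prevalence = 1e-6
sensitivity = 0.99        # P(positive | ill)
false_positive = 0.01     # P(positive | healthy), i.e. 1 - specificity

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_ill_given_positive = sensitivity * prevalence / p_positive
print(p_ill_given_positive)  # ≈ 0.0001, i.e. about 1 in 10,000
```

Despite the "99% accurate" label, a positive result leaves only about a 1-in-10,000 chance of actually being ill, because false positives from the huge healthy population swamp the true positives.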

~~~
tomp
The main reason Monty Hall is non-intuitive is that the shift in probability
is rather small (relatively).

To make it more intuitive, we can simply make the shift in probability
absurdly high!

Imagine you have 1000 doors; behind one there is a car and behind the others,
there are goats. You pick one door. The show host opens _all other_ 998 doors,
except yours and one other door, revealing only goats. Do you think it's more
likely the car is behind your door or behind the other unopened door?
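A Monte Carlo sketch of this 1000-door variant (door and trial counts are arbitrary; the host is assumed to know where the car is, as in the actual show):

```python
import random

def simulate(trials=20000, doors=1000, seed=0):
    """Host knowingly opens 998 goat doors, leaving the player's door
    and one other closed. Compare staying vs switching."""
    rng = random.Random(seed)
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = rng.randrange(doors)
        pick = rng.randrange(doors)
        # The host never opens the car's door, so the other closed door
        # is the car's door whenever the player's first pick was wrong.
        if pick != car:
            other = car
        else:
            other = rng.choice([d for d in range(doors) if d != pick])
        stay_wins += (pick == car)
        switch_wins += (other == car)
    return stay_wins / trials, switch_wins / trials

stay, switch = simulate()
print(stay, switch)  # staying wins ~1/1000 of the time, switching ~999/1000
```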

~~~
purplelobster
The problem is in the phrasing, and you're actually not improving on it. The
way you explain the problem, it would still be 50/50. What changes the
situation is that the show host CANNOT open the door with the car in it. The
way you phrase it, it's not clear that the show host has any information that
the participant doesn't have, but he does.

~~~
benmmurphy
He does kind of improve on it. You do have information about whether the host
knows which door the car is behind. If the host does not know which door the
car is behind he just did something very unlikely so you should update your
model based on this information. Assuming you have reasonable priors about the
distribution of whether the host has information about the car then after he
did what he did it is very likely he has information about where the car is.

Though actually I don't think you can then use this information to answer the
question because it is tainted with the assumption that the car is not behind
your door. :(

~~~
purplelobster
Yes, but that's not what this problem is about. Monty Hall actually knew where
the car was and never picked it; that's how the show worked. The problem is
that whenever the Monty Hall problem comes up, this is never clear in the
phrasing. For someone who has never seen "Let's Make a Deal", this is not
obvious, and thus people are confounded by the problem when really it's not
very confounding if you know the facts.

On a side note, I think what you talked about would be 0.2% chance:
product((n-1)/n) n=3 to 1000

I'm not a stats/probability wiz, but I suppose if you need to decide between
Monty Hall using his knowledge or not, you'd be fairly certain by this
point...
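The telescoping product mentioned above can be checked exactly with rational arithmetic:

```python
from fractions import Fraction

# product((n-1)/n) for n = 3..1000 telescopes: (2/3)(3/4)...(999/1000) = 2/1000.
p = Fraction(1)
for n in range(3, 1001):
    p *= Fraction(n - 1, n)

print(p)  # 1/500, i.e. 0.002 = 0.2%
```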

------
roel_v
I'm trying to understand the fundamental concept he's describing - how are the
examples different from 'rational thought'? Or are they that, in a way? And is
the conclusion then (bringing the first and second parts together) that one
can only develop this through being taught? But then, don't the same arguments
that applied to the politics example apply to mysticists? And isn't that
epistemological relativity all the way down?

------
kailuowang
I don't see these as hard milestones that require the amount of effort
suggested by the article. To me, it all comes down to strict logical thinking.

Take the last milestone, understanding tradeoffs, as an example: if you are
good at identifying logical jumps in an argument (or checking its validity),
you will be good at understanding tradeoffs, because it isn't logical to think
"something has a downside and thus I shouldn't pick it."

------
dsfsdfd
1 and 2 were a gotcha for me for many years...

------
klunger
There are some good ideas in here. But, the discussion of "primitive cultures"
is so problematic that it undermines my ability to take him seriously.

~~~
TeMPOraL
> _But, the discussion of "primitive cultures" is so problematic that it
> undermines my ability to take him seriously._

Consider then that maybe it's a flaw in your ability to #1 - "to distinguish
“the things my brain tells me” from “reality”". Especially since there's little
"discussion" of "primitive cultures" there, just a list of factual
observations about them made by various people.

------
manish_gill
Sorry, but I can't take anything associated with LessWrong even remotely
seriously. To call them a cult isn't an exaggeration.

~~~
ajkjk
Why?

------
Agathos
So if I catch you in an error, I can not only call you on it but now I can
imply that you are developmentally stunted as well? Sounds like fun! Let me
try:

 _The Post argues that because the Democrats support gun control and protest
police, they are becoming the “pro-crime party”._

Actually, the Republican-leaning columnist Ed Rogers argues this. If Scott
Alexander were a fully developed adult, he would recognize that individual
columnists' views may not align with those of the paper. If indeed "the paper"
can even be said to have a single point of view.

~~~
mrob
That depends on exactly how the column was labeled. Newspapers as corporations
endorse the views of their columnists by default. For the view to be
attributed solely to the columnist there has to be something calling it out
specifically as a personal view.

