
The Happiness Code: Cold, hard rationality - applecore
http://www.nytimes.com/2016/01/17/magazine/the-happiness-code.html?hn=1
======
unabst
Their method boils down to "think about what you're doing." It sounds simple
enough, but it's something that takes practice, and as someone who does
practice it, I find the lack of practice by others becomes apparent. We tend to
shoot ourselves in the foot, and the control and focus we need to get things
done often is a simple matter of not shooting ourselves, as opposed to gaining
some new or special insight.

The reason so many of us end up on Facebook, addicted to cigarettes, buried in
credit card debt, or subscribed to Time Warner is not because we suck. It's
because there are seriously effective forces actively trying to get us to do
these things. Cigarettes maybe not so much as before, but it still boils down
to them being physically addictive, their ads being effective, etc. The
engineers at Facebook aren't trying to help you not waste time. They're
actively seeking to keep you there. So in the end, we're not underestimating
our own inadequacy. We're underestimating our collective competency at getting
people to buy and do things we want. It's just that we find ourselves at the
wrong end of it too often.

And regarding happiness... At the end of the article there is a guy talking
about how he gets up an hour early to do the things he wants... That's pretty
much it. Happiness is also a skill because it takes practice, and the best way
to practice it is by making other people happy. Once we get good at
recognizing the simple causes of happiness, we can then do the same with
ourselves, almost as if we were just another person. And it does boil down to
the simple pleasures such as "making coffee and listening to Moby-Dick". There
really isn't much more to it.

So practice thinking about what you're doing, and practice making everyone
happy. Include "yourself" in "everyone". Brilliant!

~~~
prostoalex
> It's because there are seriously effective forces actively trying to get us
> to do these things.

You will probably enjoy "The Shallows" by Nicholas Carr if you're on the
receiving end of this or "Hooked" by Nir Eyal if you're on the production end
of this.

------
nefitty
Jesus, this sounds so complicated. Mental models of self-behavior are
effective ways to change those behaviors, but it sounds like these guys are
piling way too much on at once. My most effective behavior change models have
always been simple and almost singular. The one I'm exploring now is "Dopamine
in your brain correlates with motivation." Others have been "Plan ahead all at
once so you aren't tempted to procrastinate", "Measure it and visualize it to
drive yourself forward", etc. I've effectively changed my behavior in massive
ways by playing with simple models of myself. Constantly thinking of all those
cognitive biases and mental errors seems like it would exhaust me to the point
of paralysis.

~~~
chipsy
I agree with the sentiment. I don't think there's a solution in focusing your
rumination inwards - in my case it's a good way to feel more anxious.

But I do have a bit of learned rationality in me, in the form of "I know I
will commit error _x_ in the future, therefore..." So rather than try to
become a master of self-control, I make moves to arrange the world around me
so that there are safeguards, the error is insubstantial, and I end up on the
happy path automatically 90% of the time.

Similarly, I focus personal development on going "from strength to strength"
rather than on shoring up weak areas. If I start from the models I know and
keep extending them out more and more, I will get to the things I was
overlooking, eventually. One thing I am doing now is using Streak Club [0] to
develop a single habit over a long period of time, that I would ordinarily
excuse myself from.

[0] [https://streak.club/](https://streak.club/)

~~~
nefitty
Awesome. I've managed to get a consistent meditation habit built using
streaks. My longest chain was a little over 100 days straight!

------
SCAQTony
Is self evaluation of one's "modus operandi" really a recipe for happiness?
Not to be a smart ass, but throwing around confident statements about
rationality and charging people $3,900 just smells like "Garage Scientology."

~~~
pjscott
In general, it's a bad idea to form opinions of people from a news story about
them -- sometimes accurate news doesn't make for a good story. The CFAR folks
are... what's the opposite of cocksure? Epistemically cautious?

Here's an illustrative example. The New York Times article says this:

"[...] Afterward, participants are required to fill out an elaborate self-
report, in which they’re asked to assess their own personality traits and
behaviors. (A friend or family member is given a similar questionnaire to
confirm the accuracy of the applicant’s self-assessment.)"

What the article doesn't mention is that the reason they're giving such a
questionnaire to a friend or family member is to cut down on self-report bias
in their one-year longitudinal study of whether or not they're getting real
results [1]. People can trick themselves into thinking that their lives have
changed for the better after spending time and money on it, and the ask-a-
friend thing is an attempt by the CFAR people to avoid getting a rosy-tinted
picture of their own efforts.

They could get away with being a lot less honest.

[1]
[http://lesswrong.com/lw/n2g/results_of_a_oneyear_longitudina...](http://lesswrong.com/lw/n2g/results_of_a_oneyear_longitudinal_study_of_cfar/)

------
SwellJoe
I befriended one of the founders of CFAR a few years ago when visiting NYC,
before she moved to Berkeley and founded CFAR. She genuinely practices what
she preaches, and was doing so long before CFAR. It's interestingly disarming
to meet someone who is so smart, but has no fear of seeming dumb by asking
lots of questions (including seemingly silly ones). It has stuck with me for
years, in fact, and it's been one of my personal goals ever since to stop
being such a damned know-it-all (mostly to myself) and ask more questions, and
more genuinely engage in conversation even when I think I'm smarter and know
more than the people I'm talking to.

I'm sure their approach to rationality would have a different effect for
different people, and the article covers the different ways people step out of
the comfort zones. But, my personal weakness, one that slows my own personal
growth and development and impacts my productivity and general happiness more
than most, is the "I am smart" armor that I built up as a kid (because I
wasn't all that good at other stuff, being smart was the identity I embraced).

It has all sorts of negative consequences. I don't ask questions when I
should, for fear of looking like I'm not smart and on top of everything. When
complex things are hard, it is more frustrating than it needs to be, because
most of the time I expect thinky things to not be hard (or at least, when in
school and comparing my own performance vs. effort to others in the class,
nothing was hard). A lot of it boils down to that "I am smart" belief, rather
than taking an approach that accepts that being smart is a process and not a
permanent condition.

Anyway, if hanging out casually for a couple of weeks with someone practicing
this stuff has had a years-long and mostly positive effect on me, I would
guess a formal program would be awesome.

------
dunkelheit
What I don't quite get is why discussions of rationality so often mention
threats from superhuman AIs. Is it just because that happens to be the
interest of the key figures of the movement?

~~~
lahwran
Ugh, there's a lot of argument about that in the cfar alumni community. Some
folks take it for granted (why??? Take things for granted about such an
uncertain subject???) that we're just doomed unless you Give Miri Money(tm).
Those folks tend to be pretty good about actually carrying out what would be
reasonable behaviour in most other ways if they were right about that one
thing - if it really were such a big deal, you'd want to not ignore it.

Meanwhile, another part of the alumni community actually understands the
theory behind ai and machine learning, and those folks end up in arguments
frequently with the first category about the topic.

The reason you hear about it is the first category is a pretty panicked and
hopeless group - for the people who actually believe yudkowsky's "recursive
algorithmic improvement" to be able to give large improvements, they generally
think that humanity "loses by default" if they do nothing. So they tend to be
very, very into recruiting. Thankfully they're not so nuts about it that
they'll never change their minds, the problem is it takes a lot of explaining
to get the theoretical basis for why the recursive self improvement thing
isn't actually as scary as they think it is. No, it's not going to take an
hour as soon as an ai is built, learning is hard, and humans are freakishly
good at it.

Computers will beat us at data efficiency eventually but it's gonna take a
while, and current machine learning is data-inefficient but gets good results
from large amounts of data. And the best you can do
isn't good enough to make miri's monster - unbiased, maximally data-efficient
Bayesian inference doesn't actually fit in the universe in either a time or
memory sense if you try to build a full ai out of just that one thing. And
approximating it is, you guessed it, less data efficient.
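To make that concrete with a toy sketch (numbers and code are my own
illustration, nothing to do with MIRI's actual models): even for noisy binary
data, exact Bayesian updating over every hypothesis of length n means tracking
2**n weights - and that's before hypotheses are allowed to be programs.

```python
from itertools import product

def exact_posterior(data, noise=0.1):
    """Exact Bayesian update over every binary hypothesis of len(data).

    The hypothesis space has 2**n members, so the cost of staying
    "unbiased and maximally data-efficient" doubles with each new bit.
    """
    n = len(data)
    hypotheses = list(product([0, 1], repeat=n))  # all 2**n candidates
    prior = 1.0 / len(hypotheses)                 # uniform prior

    def likelihood(h):
        # Each observed bit matches its hypothesis bit with prob 1 - noise.
        p = 1.0
        for hi, di in zip(h, data):
            p *= (1 - noise) if hi == di else noise
        return p

    unnorm = {h: prior * likelihood(h) for h in hypotheses}
    z = sum(unnorm.values())
    return {h: w / z for h, w in unnorm.items()}

post = exact_posterior([1, 0, 1, 1])
best = max(post, key=post.get)
print(len(post), best)  # 16 hypotheses for just 4 bits; (1, 0, 1, 1) wins
```

Any tractable learner has to approximate this by pruning or biasing the
hypothesis space, which is exactly the data-efficiency trade-off above.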

~~~
someguydave
Okay so: 1.) preaches about new destroyer god 2.) saviors who possess the
secret knowledge of salvation 3.) give them money 4.) think correctly, not
incorrectly

Yep, it's a cult.

~~~
lahwran
Eh... Sort of? The two organisations together would form a cult, but the
community split I mention makes it a bit confusing. Overall, I agree that
miri's level-of-cult is too damn high.

~~~
someguydave
Interestingly, it's an ancient form of cult called Gnosticism. Gnostics teach
that the material world is evil ("gives rise to the AI"), and only through
hidden knowledge ("correct thinking") can one find the true path to spiritual
salvation ("become an immortal human")

[https://en.wikipedia.org/wiki/Gnosticism](https://en.wikipedia.org/wiki/Gnosticism)

------
RikNieu
Initially I was interested in reading a bit more about these exercises they
do(sounds like a form of mindfulness, to be honest) but seeing a fee of $3900
set off all kinds of alarm bells in my head.

Now I'm more interested in their marketing strategy with its obvious market
segmentation of premium + suggestiveness. I'm also curious as to how they
handle sales objections and how they approach up-selling.

~~~
philh
> I'm also curious as to how they handle sales objections

A friend of mine told them he was interested, but hesitant because it was a
lot of money. They asked him whether he knew anyone else who'd attended who he
could ask. As it happened, he had me. I told him something like: it's really
hard to know if this has had long term effects on me, I do think I'm happier
than before, I'm definitely glad I went. My friend decided to attend.

> and how they approach up-selling

I'm not sure what you mean by this? I can't think of anything they'd need to
upsell. I suppose, for the yearly reunion, they say "it's free to attend, but
it costs us $x per participant, and if you could pay that much, we'd greatly
appreciate it". You could count that as a form of upselling, and their
approach to it is "please".

~~~
RikNieu
> I'm not sure what you mean by this? I can't think of anything they'd need to
> upsell.

Ah, ok. So it's a one-time sale only? No other products or services that they
recommended to you afterwards?

May I ask you some other questions regarding their service, if you don't mind?

~~~
philh
Yeah, once only.

Feel free to ask. I attended in... April 2013, I think, and then helped out at
another of their workshops in I think November 2014. So I might not remember
things in much detail, and I don't know what changes have happened since, but
I'll answer the best I can.

Also, someone linked to this thread on the CFAR alumni mailing list, so others
might see your questions and be able to answer better.

(Actually, another caveat: for at least some of the questions you could
plausibly ask, the true-and-complete answer includes uncomfortable personal
details that I'd rather not share. So still feel free to ask, but I may elide
those details.)

~~~
RikNieu
Yeah, no, I won't ask anything too personal.

1\. Why the high cost? Was this addressed?

2\. Do you personally feel the cost was justified?

3\. What benefits/improvements specifically did you notice afterwards.

4\. And your friend? Did their life or attitude observably improve?

5\. How much of what you learned is freely available on the net? In other
words, is the course simply a curated set of information that is otherwise
available elsewhere?

6\. Do you still receive regular communications from CFAR, and if so, what do
these communications consist of?

7.(last one!) Regarding the uncomfortable personal questions, was this
information that CFAR recorded and archived?

Thanks!

~~~
philh
1\. It costs a lot to run. They have to rent out space, get meals, snacks,
stationery. The staff:attendee ratio is pretty high. And they need to meet
their general operating costs - office space and staff - not just for the
workshops but as an organization. They've written about their finances:
[http://lesswrong.com/lw/n39/why_cfar_the_view_from_2015/#fin...](http://lesswrong.com/lw/n39/why_cfar_the_view_from_2015/#finances)
(you might also find the rest of that interesting).

2\. Justified for them to charge, certainly. Worthwhile for me to pay, I think
so.

3\. Difficult to say, partly because I don't have strong memories of what I
was like before attending. I think that post-CFAR, I'm generally happier, more
likely to try new things, better at making phone calls. I expect other changes
too. But I can't say that these are definitely attributable to CFAR.

It's also difficult because I don't often explicitly use the techniques they
teach, but that doesn't mean I don't use them. E.g. about six months ago I
went mostly-vegetarian. I realized that for the most part, I'm just as happy
without meat as with it, but there are exceptions. So I let myself have those
exceptions, and get almost all the benefits of vegetarianism with few of the
costs. This was a form of goal factoring, and I don't know if I would have
done it without CFAR. But I didn't sit down and think "okay, I'm going to goal
factor this".

4\. I'm not sure. This was too long ago, and I don't think I saw him super-
often around that time period.

5\. I suspect most of the specific techniques can be found elsewhere in one
form or another. They run tests to try to find the best way to teach the
techniques. There's also value in curation, and in having someone there to
help debug and adapt to circumstances. ("You suggest committing to a specific
time to do this, but I'm reluctant because my social life is pretty
unpredictable." "Sure, maybe try picking a time, but giving yourself
permission to change it, maybe up to three times.")

But what they're trying to teach isn't really techniques. This won't do it
justice, but it's closer to: they're trying to teach the mindset that lets you
generate the techniques. That lets you say, "okay, here's a problem. Let's try
to solve it. Is this solution going to work? No. Okay, what could I do that
would actually work?" and to come up with answers.

I'm not aware of any other resources trying to teach that.

6\. They have an alumni mailing list, but CFAR doesn't post to it much as an
organization. They sometimes ask for volunteers to help out at workshops, or
if someone can help them scope out venues, or for beta testers for classes
they're working on. Outside that, I don't receive anything from them (except
automated thank-you emails when I donate).

7\. Not at all. Some of it came up during conversations, either naturally or
deliberately as something I chose for comfort zone expansion. There was also a
session of "againstness training", which was even more optional than
everything else, in which Valentine deliberately asked me uncomfortable
questions so I could practice being in a super-uncomfortable state.

When I said about personal stuff, I just meant that - for example, if one
thing I'd got from CFAR had been "I realized I could come out to my closest
friends, but probably not my family", then I wasn't going to make that public
knowledge. It wasn't to do with CFAR itself.

~~~
RikNieu
Cool, thanks for the reply

------
hotcool
Cold hard rationality also means being honest with ourselves. The stories we
tell ourselves may not be true, and this incongruity can cause a lot of harm:
[https://medium.com/@hypnobuddha/be-honest-are-you-lying-to-y...](https://medium.com/@hypnobuddha/be-honest-are-you-lying-to-yourself-727d86116e9b#.bn77hn4ln)

------
Takizawa
Their videos are worth checking out.
[http://rationality.org/videos/](http://rationality.org/videos/)

That said, I am not sure it is rational to pay $3,900 for four days of
training. At least give me a celebrity like Tony Robbins.

~~~
lahwran
As someone who went, I agree. I think most people who do it right now consider
the high price to be a donation to help them scale, rather than an actual
product-for-money trade. I'd pay $500, maybe, if I thought the marginal
benefit they'd get from my going was negligible. That's how much conferences
of that length usually cost, anyway. Also, it's totally right that the
techniques aren't exactly the point - you don't go to a tech conference
because you can't read about the things presented there elsewhere, you go
because you won't focus enough on them. But it's definitely not 4k of value
from the workshop, it's 3.5k of giving them momentum to refine and scale the
org, and .5k of actual experience.

------
tim333
There was an earlier discussion of the Center for Applied Rationality (the
company discussed rather than the NYT article) here:

[https://news.ycombinator.com/item?id=4751584](https://news.ycombinator.com/item?id=4751584)

------
dschiptsov
How about children's warm, openhearted instinctivity?)

Also scientific psychology (opposite of popular meme jogging) tells us that
pure rationality is a myth - fictional concept of the mind. We are driven and
motivated by hardwired, non-verbal heuristics, such as looks, status, health
and beauty, which could be defined, at least for living beings, as youth +
health/good genes markers (lack of any age or sickness related deformities).

Is there any estimate of what percent of GDP and personal wealth is spent
annually on booze, hookers and status items? Including all the money spent to
impress potential mates?)

The more appropriate meme instead of "rationality" would be "understanding"
(and Joy instead of happiness) - I do, more or less, understand "how it
works", so I could enjoy it occasionally, not too often to become a mere numb
consumer. Read about Dorian Gray also.)

------
smaddox
HPMOR was just referenced in a NY Times article. Awesome.

------
yarou
The tl;dr version of this is that "Applied Rationality" is a New Age cult,
whose God is "Rationality" and whose Devil is human nature, rather than
watered down Eastern mysticism.

The creepy part is how they expect you to live together and unquestioningly
repeat the mantras ("techniques"). It reeks of Dianetics and other bullshit
that claims to be a cure-all for all of your problems.

~~~
p4wnc6
This is a bit disingenuous but not completely. I think a comparable service
would be outdoor wilderness survival training, like BOSS in Boulder. Part of
what makes it effective is that you are dislodged from an environment in which
you can comfortably cling to the heuristics you already use to get by. In the
new environment, you have to adjust all your norms and it provides more of a
blank slate, cognitively, on which to imprint the lessons.

The CFAR stuff is like this too, but rather than being an outdoor wilderness
survival school, it's just a survival school. It's even weirder than what it
would take to survive in the wilderness, because the space of mental tools is
so much more vast than the space of physical tools tailored to one type of
environment.

I would guess that many CFAR employees would like their service to feel more
like a "boot camp" sort of thing -- a transformative experience in which the
intensity of learning and the bandwidth demanded are extremely high compared
with what that intensity and bandwidth will be back in regular life. But I
also think they don't want it to feel like an indoctrination, and would want
to preserve and even enhance someone's ability to be skeptical, even about
CFAR itself.

In that sense, by promoting self-skepticism, CFAR is very different from a cult,
and just because it shares some superficial aspects of a cult doesn't mean
it's fair to make that comparison.

But, but, I still do agree with you that CFAR has work to do to prove that
they are not just a marketing engine fleecing bored rich people who fancy
themselves seeming like philosopher savants or some shit. Merely having
verifiably good, open content, like the LessWrong sequences, is not enough.
They further have to show that they are willing to change, and verify that
they aren't just a certain kind of boutique fraternity.

I for one would really welcome hearing ideas about alternative ways to teach
rationality. For example, I recently read the science fiction book _The Black
Cloud_ by Fred Hoyle, and I was particularly interested in a part of the book
where humans communicate with a far more intelligent being. Hoyle's writing is
fun and all, but what I really thought was cool was the idea of a human (Hoyle
himself) trying to emulate a being far smarter than him, and how believably he
did this. But of course, on closer inspection, we should expect that Hoyle's
portrayal would not be good enough, or else such superior intellect would be
in our grasp merely by imagining how it should sound.

I think I got more out of reading that fiction book, in terms of thinking
about how to think better, than I did out of vast swaths of LessWrong. Maybe
that says more about me than anything else, but it is a data point that maybe
there are all kinds of ways to elucidate the useful tools of rationalism, and
the format of CFAR might not even be close to optimal unless your goal is to
vend a status merit badge to a certain set of semi-wealthy people.

~~~
TheOtherHobbes
The classic cult tell is the fact that if you strip out all the claims to
spiritual and moral wisdom, cults exist to service the leadership with money,
narcissistic strokes and a sense of authority, sexual opportunities, and free
labour. (I have a non-scientific theory that this is how religions propagate.
They're such an effective way to provide all of the above that whatever the
dogma, the social dynamics are just too attractive for weaker individuals to
ignore.)

Aside from money, it's hard to see how that applies here. (If Yudkowsky was
running this personally I'd definitely be concerned.)

But while there are obvious culty elements here, there doesn't seem to be a
funnel which uses introductory bootcamps/workshops to find the most
suggestible converts so it can sell them more and more expensive follow-ons.
There also isn't any sense that there's a "reward" scheme where loyal
followers are allowed into an inner circle - from which they can be publicly
purged if they misbehave.

It looks more like there are some interesting brain hacks on offer, packaged
into a format that's maybe too intense to be ideal.

~~~
p4wnc6
By this definition I would argue that most SF _start-ups_ are more similar to
cults than CFAR is -- though as I said in my comment above, I do agree that
CFAR hasn't conclusively proved yet that this is more than just a sort of
Space Camp for bored rich people of a particular variety.

~~~
lahwran
oh my god yes. I'm frequently creeped out by how much my various employers
have wanted me to be TOTALLY AND ENTIRELY on board with their mission. like, I
like making awesome software, and I like customers enjoying it, but plz no I
do not want to devote my life to x thing just because it's both fun and gives
me money. If I'm going to devote my life to anything, it's going to be doing
something like building computational models of the genome or something.

------
proksoup
You do what you want to do. Ask yourself what you want to do, and why you want
to do it.

------
throwaway999888
What other approach do you expect from a place/culture like that?

~~~
dang
Not cool. Please post civilly and substantively, or not at all.

~~~
bigdubs
Judging by the user name I doubt we'll be seeing this person again.

edit: Just kidding they have 201 karma.

~~~
throwaway999888
I'm not great at naming.

------
johnzabroski
Too much drugs from Haight Ashbury...

~~~
dang
Generalizations about Silicon Valley are a tedious media game; please let's
not make it worse.

