

Doubting at the door: when it's right to be wrong. (For the lesswrong crowd.) - KennethMyers
http://techno-anthropology.blogspot.com/2011/01/doubting-at-door-when-its-right-to-be.html

======
Helianthus16
I think your language, even in the title, indicates the problem with the
question, and why, while I appreciate the goals, purposes, and actions of the
lesswrong crowd, I wouldn't consider... um, "joining up."

The whole premise, as I see it, of being a rationalist is that there are
particular methods for working with truth, that there is an actual truth, and a
solid, singular rationality. The problem is that there is no sense in, having
begun using those methods, declaring that you are now in charge of the truth or
have any particular access to it at all.

Statistical analysis doesn't prove things correct. It allows us to prove
things incorrect. The assertion that whatever is left standing is true is what
typifies rationality.

But here's the thing, here's why it's called lesswrong, why the idea is to
increment in favor of the truth: everyone's actually pretty rational. Some
people are more wrong than others, but given the assumptions they work with,
they are making rational choices.

If you are declaring people irrational, the problem is with your definition of
rationality, not with them. In finding out _why_ they are rational you will
find out _why_ they think differently than you.

Take atheism vs. theism. The assumption of irrationality on either party shuts
down debate and lets no information traverse between the viewpoints.

But if you realize that everyone's pretty much the same, the problem you raise
just _goes away_. Everyone's at a different place on the axis of rationality,
no one is stupid, and there are people who are more right than you so there's
not much point being annoyed with those who are less right.

So calling one's self a rationalist, to me, is either equivalent to saying "I
am a human being," because all humans are somewhat rational and believe in the
assertion that true things can be known (whether they say so or not), or
"Other people are dumber than me and that is a part of my identity."

~~~
derefr
Calling yourself a rationalist means, basically, that you're a virtue-ethicist
who considers rationality to be a virtue: you're willing to allow for the
maximization of truth at the expense of (some amount of) utility. In its way,
this is a signaling mechanism like any other: there's no "rational sense" in
doing something that will earn you locally sub-optimal utility, even if that
thing is rationality itself.

However, just as the local signal of revenge allows for global cooperation in
the iterated prisoner's dilemma, maximizing for local truth is a clear signal
to others that they can globally rely on your work to benefit all parties
equally (including them), rather than just whoever the rationalist works for.
Succinctly, "no one has their ICBMs pointed at the enemy's universities."
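The prisoner's-dilemma analogy can be made concrete. Here is a minimal sketch
(payoff values and function names are illustrative assumptions, not from the
thread) of tit-for-tat, the "local revenge" rule that sustains global
cooperation in the iterated game:

```python
# Minimal iterated prisoner's dilemma sketch. All payoff numbers are
# the standard illustrative values; nothing here is from the thread.
# Tit-for-tat: cooperate first, then mirror the opponent's last move --
# local "revenge" that makes sustained cooperation the stable outcome.

PAYOFF = {  # (my_move, their_move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate on the first round, then copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []  # each side's record of the *other's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a)
        move_b = strategy_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

# Two tit-for-tat players settle into permanent mutual cooperation:
print(play(tit_for_tat, tit_for_tat))  # (30, 30) over 10 rounds
```

Against an unconditional defector, tit-for-tat loses only the first round and
then defects back, which is the "revenge" part of the signal.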

~~~
Helianthus16
Your definition does not contest my either-or equivalence, my assertion that
"rationalist" either means "human being" or "someone whose identity involves
caring about other people's dumbness."

Mostly it's just dressing up how people think with terminology. Virtue-
ethicist = "believes in bad and good." Maximizing truth = "trying to be
right."

Who is not a rationalist by your definition?

------
syllogism
> There are plenty of good reasons not to be a rationalist. In fact, valuing
> the truth as a sort of Platonic ideal to which everything else must be
> subservient is an arbitrary moral choice in a universe where moral facts are
> either non-existent or inaccessible.

The rationalists' position is that believing true things is almost always
strategically superior to believing untrue things. This is an empirical claim,
not one about "Platonic ideals". You can choose to value whatever you want,
but presumably you have some ends you want to achieve, which requires decision
making, which is usually assisted more by true beliefs than by false beliefs.

Now, there's some evidence about cases where this might not be true. The blog
post raises this about happiness surveys, etc. This evidence seems to me to
weakly establish a negative correlation, but not necessarily that
rationality causes less happiness. At any rate, for any particular subgoal
(say, becoming better at programming, achieving material wealth, etc.),
believing true things is probably going to be a good strategy. That's the
rationalist claim, anyway. For happiness in general, it gets murkier.

Yudkowsky also has a whole thing about how a rationalist can still win at
Newcomb-like problems. The gist of it is, what's rational is whatever will
meet your goals. This means if you need to preserve some false beliefs, you
should go ahead and do that.

For example, you may find false beliefs about how long a project will take
motivate you to get started. If you really accepted that your project would
take a long, long time to bear fruit, maybe you'd never have begun, and you'd
have been worse for that. In this case you've got this other irrationality
about how you're weighing future benefits that you're compensating for by
lying to yourself about how far into the future you'll reap your rewards.

Here, the two irrationalities together take you to a local maximum. If you
only got rid of the false belief about the project times, you'd be worse off.
But, if you could get rid of both irrationalities you'd be _better_ off. You
wouldn't have the motivation problem AND you could plan more efficiently.

But maybe you just can't get rid of your tendency to over-discount future
benefits. If so, you should just accept the local maximum, rather than anti-
strategically attacking your tendency to underestimate project costs and
leaving yourself worse off.
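The two-bias argument above can be sketched as a toy payoff model (every
number and name here is invented purely for illustration):

```python
# Toy model of the compensating-biases argument. All payoffs are invented.
# discount_bias: you over-discount future rewards (demotivating).
# estimate_bias: you underestimate how long the project takes (motivating).

def utility(discount_bias, estimate_bias):
    """Made-up payoff for each combination of the two biases."""
    if discount_bias and estimate_bias:
        return 5   # biases cancel: you start the project, but plan sloppily
    if discount_bias and not estimate_bias:
        return 0   # accurate estimate + heavy discounting: you never start
    if not discount_bias and estimate_bias:
        return 7   # motivated either way, but planning is still off
    return 10      # neither bias: you start AND plan well (global maximum)

both_biases = utility(True, True)
fix_estimate_only = utility(True, False)
fix_both = utility(False, False)

# Removing only the estimate bias drops you off the local maximum...
print(fix_estimate_only < both_biases)   # True
# ...but removing both biases beats keeping the compensating pair.
print(fix_both > both_biases)            # True
```

The point of the sketch: with the discounting bias fixed in place, keeping the
false estimate is locally optimal, even though the bias-free corner dominates
globally.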

You could make a similar argument about religion and many other false beliefs.
Which local maxima we can realistically step beyond is another empirical
question. But I think it's clear that there are many false beliefs that are
strictly negative: that is, you're better off without them, even if you don't
change anything else about your thinking.

