Anecdotally, I found that the Less Wrong community tends to be decidedly more full of crap than average. In the same vein as spiritual materialism, many people who engage in a bias witch-hunt seem to be falling prey to "logical materialism", where the whole exercise turns into people deluding themselves into thinking they're somehow "better" than others because they're less full of crap than average.
It's good to know thyself, but it's no use if your knowledge isn't tempered by wisdom, and you're not going to get that by reading blog posts about cognitive biases online, no matter how good the posts.
It's also heartbreaking to see intelligent people getting so excited about ideas like cryonics and personality uploading. I mean, they're interesting things to talk about, but a lot of people on LW seem to actually think they might get to live forever. It's kinda sad.
I wouldn't totally dismiss sci-fi concepts like brain uploading. But then I'm not totally sold on their viability either.
One of the things I always thought was interesting about cryonics was data integrity. It could be a while before you hit the singularity. (Or whatever it is you think will wake you up.) Even with liquid nitrogen, I doubt your brain can be 100% preserved. So let's say hypothetically you get yourself some proper Ray Kurzweil recursively improving A.I. and, as a common courtesy, decide to revive the Alcor people. If you have 99% of someone's brain image and use statistical inference like Bayes' theorem to fill in the rest, is it still the same person when you wake them up? How about 99.99% of their brain? 99.999999%? (Which brings us back to the semantics of labels and reductionism vs. holism.)
People who think they'll live forever have huge logic fails on their hands; the heat death of the universe is an obvious one. In fact, every time I think of the whole business, Isaac Asimov's The Last Question comes to mind.
This is pretty much where I land on this. Given any empirical theory of consciousness, it's never going to be "I get to live in the computer," just "I die, but a copy of my brain lives in the computer." And it's pretty hard to draw a bright line between that and "I die, my brain slowly decays for thousands of years, then it's surgically reconstructed and awoken." Still feels like death to me.
Of course, to your transhumanist theorist, there's probably just as much connection between me and the computer as there is between me in the night and me in the morning; it's all just an illusion created by a persistent brain state. But that doesn't help either, because now you're describing a kind of immortality -- "This exact stream of continuous consciousness is a dead end, but something very like it will continue to be and think of itself as part of me and that gives me and it some comfort about the whole thing" -- which the general public has been achieving quite successfully for some time in the form of procreation.
In fact, since every living organism represents a terminal link in a chain of unbroken life dating back to the first self-replicating molecule, it might be quite reasonable to say that we have all been alive for billions of years at least, we just don't remember most of it. But this is changing: I can go on Wikipedia today and recover the memories of our culture dating back for most of its existence.
Naturally, they grow vaguer the further they go back in time, as memories do; but for events that take place today, we have a record which far exceeds human memory in accuracy and exactness of fact, and which will very soon be competitive with it in emotional effect. It is quite realistic for me to expect to create a record of my life which has as much effect on my descendants in a hundred years as my own memories of today would have on me were I to live that long.
So I think it's possible the transhumanists missed the boat. Or rather, they're on the boat already and just don't realize it. The human macro-organism does, starting from now, seem to stand a decent chance of living to see the heat death of the universe.
I don't mind that as much, though -- I am much more disturbed by the prospect of a tyrannical or even "Friendly" AI that some of them seem to be fond of.
Ignoring the implications of an AI that may be required to support a transhuman, I believe that one could draw up a very pragmatic argument for why transhumanism could be useful even to us, embodied souls. What you described as inter-generational memory via things like history books and Wikipedia is good but not perfect (the classic example is that history gets written by the victors). A transhuman living for thousands of years would potentially bring a fresh perspective to the table, even if that perspective were imperfect too. Just as both a free market and an ecosystem can benefit from diversity in their pool of ideas/genes, so can humans.
Same. Even if such a thing is possible, the probability that it's done right the first time is close to nil. And since a recursively improving singularity A.I. can be assumed to irreversibly take control of the balance of power, it's not really something that you could afford to screw up.
Of course, Yudkowsky (to the extent that he's doing anything) seems to be working off the assumption that if he doesn't do it, someone else will.
The argument I'm making about memory is that when communication has become advanced enough, you can't make a clear distinction between inter-generational memory and meat-memory over a significantly long lifespan. People change over time; give it enough time and you're as different from yourself now as your great grandchildren will be.
I don't think we need long-lived human beings or human personality constructs to gain the societal advantages of "I remember when...". It's feasible today for a person to record and archive audiovisual, geospatial and limited haptic data of their entire life experience, beginning to end. We can't record your thoughts, but if they're important you can write them down. I'd also wager that we'll see almost fully convincing sensory recording, which is a plain prerequisite for uploading, well before any life-extension technology which deserves the title of immortality. It would then not be unrealistic for your descendants to say that they remember the things that happened to you, if only because these recordings would be far superior to memory.
Of course, the only issue is that we've had this sort of thing for a good while now, and it turns out we just aren't that interested in the things that happened a long time ago, just like, aside from the highlights, I don't care that much about what happened to me ten years ago.
Ten years ago, of course, I thought that everything that was happening to me was quite important. That's why I label this idea of immortality "greedy"; it represents the whim of a brain state at the present moment to continue to influence the world long after it has become irrelevant. Just look at the current state of US politics to see where that gets us. (I've never seen a transhumanist argue that every transient state should be preserved in perpetuum, but I'd be curious to know what they tend to think about it.)
The point being that if uploading constitutes a form of immortality, so does having kids; the same theory of consciousness underlies both.
This is a bit of a tangent, but I think this is (most of) the reason that burial rituals are one of the cornerstones of human society. Obviously it doesn't matter to the dead person what happens to them, but it is crucially important for us to have a say in what happens to us; we hope that, if we respect our parents' wishes after they die, our children will respect ours. And we take this so seriously that, in fact, they do.
I suppose I consider transhumanism, especially cryonics and uploading, to be a very highly developed burial practice. If it is the wish of a dying man to have his brain frozen in nitrogen, I will respect his wish, and even humor his beliefs about what that might mean. But I don't believe it means any more in reality than if we stuck him in the ground with everyone else.
And yes, I recognize the irony in writing this much about something I think is silly to spend time thinking about :)
And they aren't horrified at the prospect of that being possible?
Not much thinking going on there, I suppose.
I expect it is not so much about wishing not to die, which even a transhumanist must admit is at least no worse than living, but wishing not to have loved ones die. Truly tragic.
I suspect it is you that needs to put a bit more thought into this matter.
This is seriously weirding me out.
Than average what?
I often lurk on LessWrong, and post there very occasionally. I find it to be a very rich source of original ideas, some of which are truly profound. YMMV, of course.
I'm curious if you think there is anything else we could stand to work on. I interpreted your use of the word "wisdom" to mean a lack of arrogance, but if you were using it to mean other stuff as well I'd love to know.
(I'm a fan of yours, BTW.)
But the idea "Hey, want to be more rational? Join our community" gives me the willies. If you want to be more rational, then joining a groupthink-ish community is the last thing you should want to do.