Hacker News
Where Are All the Successful Rationalists? (applieddivinitystudies.com)
92 points by jeffreyrogers 5 days ago | 130 comments





My hypothesis:

Rationalists are the opposite of visionaries.

A job at Google making $200-400k, having good work life balance and enough time to pursue your hobbies is success. It is also more reliably achievable. A rationalist thrives in the absence of uncertainty. Becoming a decent engineer is a far more reliable way to be successful than starting the next billion dollar company. That is practically leaving everything up to luck.

The truth is, we only see two kinds of successful people in the media: the lucky, pig-headed visionaries and those who fall in line. Rationalists are by ideology averse to luck, stubbornness and compliance. Go figure why they do not feature in cohorts so opposed to their core ideology.

Now, militant rationalists are rarely rationalists. If anything, the virtue of gut feeling/intuition in the presence of uncertainty, and the prevalence of uncertainty in everything we do, are two of the most evident facts of our universe. To make a strong claim of correctness in the presence of massive uncertainty is innately irrational.

That being said, a lot of the most successful people I've met use various tools of rationalism quite frequently.


> A job at Google making $200-400k, having good work life balance and enough time to pursue your hobbies is success. It is also more reliably achievable.

Yes. Perhaps restating your point in statistical terms, "Where Are All the Successful Rationalists?" is entirely the wrong question. A rationalist is not particularly interested in outliers and assumes their outcome is no more likely to be better than the odds predict.

The right question is "What is the median level of success among all rationalists compared to other groups?" My hunch is that you'd find that they do tend to be more successful in the aggregate.

If you only look at the set of people who have made a ton of money to draw your conclusions, it would tell you you should buy more lottery tickets.
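The lottery point can be made concrete with a quick expected-value calculation; a minimal sketch with invented odds and payouts (not real lottery figures):

```python
# Survivorship bias in a nutshell: judging a strategy only by its visible
# winners ignores the far larger pool of losers.
# Hypothetical lottery: a $2 ticket with a 1-in-10,000,000 shot at $5,000,000.

ticket_price = 2.0
p_win = 1 / 10_000_000
jackpot = 5_000_000.0

# Averaged over everyone who plays, each ticket loses money.
expected_value = p_win * jackpot - ticket_price
print(f"Expected value per ticket: ${expected_value:.2f}")

# But if you sample only the winners, the "observed" return looks fantastic.
observed_return_among_winners = jackpot - ticket_price
print(f"Return conditioned on winning: ${observed_return_among_winners:,.2f}")
```

Conditioning on success flips the sign of the conclusion, which is exactly the trap the parent describes.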


Correlation is not causation. Maybe nerdiness makes you interested in both software and movements with names like "rationalism", but if the movement didn't exist I think nerdy people would still be making good money in software without it.

That's me to a T. I was a "militant rationalist" as it were, back in high school. Anyone remember talk.origins?

Yes, I spent a lot of time there and on the JREF forums.

it's no longer around? talkorigins.org seems to work

I was referring more to the newsgroup than the archives site, but I used past tense because I haven't been a regular there in almost two decades now.

[1] https://groups.google.com/g/talk.origins


> A job at Google making $200-400k, having good work life balance and enough time to pursue your hobbies is success.

What job at Google (or anywhere for that matter) do you have that actually facilitates good work-life balance?


Software engineer.

Source: at Google for 4.5 years. Rarely if ever had to stay late.


Rationalists seem extremely common amongst the $300k+/year software engineers I know. If that isn’t “successful”, then I’m not sure we’re using the same word anymore.

You can argue that the causal arrow points the wrong way: that people who are successful merely wear the affectation of "rationalist" because it happens to be fashionable, rather than their rationalism leading them to a "successful" career. But that doesn't seem to hold up against scrutiny, in my experience. Of those I personally know, most have been engaged with the community since at least the golden days of lesswrong.


> Rationalists seem extremely common amongst the $300k+/year software engineers I know

I'm a programmer who lives on the other side of the world (so no chance for me to make $300k+/year) and I must say that all this "rationalism" discussion makes me a little confused: do people really believe in this sort of stuff? Do they actually equate "success" with (mostly owning) "money"? Do they really think a "rational"(-ist) person would mainly think about how to earn (supposedly more) money? Why on Earth would he/she do that? Money is just a tool. Holding an important position in society (CEO, founder, whatever) is just a hindrance, it keeps one away from actually thinking about the stuff that really matters.

I'm pretty sure all this stuff was explained a lot better a long time ago by people a lot smarter than me (right now I'm thinking of one of Plato's works, maybe "Symposium"? I'm not sure, I last read many of them ~20 years ago). Point is, this specific "view of the world" seems very US-specific to me.


Earning lots of money sets you up to donate lots of money to effective charities. Many rationalists believe that's the best thing you can be doing for global quality-of-life.

If one understands that the value of your work is almost certainly higher than your salary, then you'd conclude that, to benefit the world the most, you'd be better off allocating your work directly to the cause of global quality-of-life.

This. Money is no hindrance. It can be set in motion for good. It's the only thing capable of producing real change.

> Holding an important position in society (CEO, founder, whatever) is just a hindrance, it keeps one away from actually thinking about the stuff that really matters.

You may believe stuff like this, but many others don't, so it should be phrased as your opinion rather than as a fact.

Elon Musk, for example, has very strong beliefs about humanity's future on other planets. His position as founder/CEO of SpaceX lets him actually work towards making that dream a reality. You or I can dream all we want, but we can't make those dreams reality. If this is something you care about, then he's clearly successful in ways that we aren't, ways that are directly attributable solely to his role as wealthy person/founder.


I definitely recommend reading the sequences. More than anything, it's about recognizing cognitive biases. Altruism is where money comes in, but I would classify that as the Effective Altruism movement.

I think this is exactly right. Being successful in the top 0.1% is not actually rational. Even if you're brilliant, your chance of succeeding at that level is low. A truly rational person seeks out the best risk-adjusted return, not the best absolute return. If you're a technical minded person, making 300k/year in a software job is about the best risk-adjusted return of any profession I can think of.

Well the risk is you have to move to the US to get that kind of salary in a software job.

And the cost of living is so high that despite the fantastic salary most software engineers in Silicon Valley will never buy a house there.

Great place to make money before retiring elsewhere though.


Well yes, but many other countries have similar math when it comes to software engineering salaries as opposed to other fields. In most places, going into software is a good thing to do moneywise.

Not in Germany. Software is more on the lower end of high-skilled work here, definitely not at the top end.

Interesting. Still, am I correct that the lower end of high-skilled work is still middle-to-upper class in terms of income?

I find it difficult to find reliable numbers (since I work in a different industry), but it seems the median entry salary for an IT graduate is 40k€, which is about 25k€ net annually. I also find "middle-upper class" difficult to pin down for Germany (a salary in the upper class doesn't mean you are asset upper class, i.e. rich).

Personally I would say 25k€ net is middle class; but here everybody wants to belong to the middle class, so it usually gets stretched wide.

Big companies are unionized, so salaries do not get that big, but they are considered higher than in non-unionized companies.

How is the situation in the US?


What is the optimal risk-adjustment ratio? And is it chosen by reason or by personal preference? Many software engineers are a bit risk-averse and just do what they are told. Many entrepreneurs would rather experience the rush of a business roller-coaster and walk away with no money than just sit and program as told.

There is no optimal ratio per se. You just divide gain by risk and maximize that quantity. This is a scale invariant function, at least, with respect to its maximum.
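The scale-invariance claim is easy to check numerically. A toy sketch (the career options and their gain/risk numbers are entirely invented for illustration):

```python
# Maximizing gain/risk: multiplying all gains (or all risks) by a constant
# changes the ratio's magnitude but not which option comes out on top.

options = {
    "big tech job": (300_000, 1.0),         # (expected gain, risk) -- invented
    "startup":      (5_000_000, 50.0),
    "lottery":      (5_000_000, 10_000_000.0),
}

def best_option(opts, scale=1.0):
    # argmax of (scale * gain) / risk over the candidate options
    return max(opts, key=lambda name: scale * opts[name][0] / opts[name][1])

print(best_option(options))             # same winner...
print(best_option(options, scale=7.0))  # ...regardless of the scale factor
```

With these numbers the steady job wins on a risk-adjusted basis even though its absolute gain is the smallest, which is the parent's point in miniature.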

I think you must be omitting variables, or simply trying to maximize dollars or some other approximately ordered quantity rather than maximize subjective fulfillment in life.

Which variables are being omitted? Certainly there are other variables relevant to a quality life than money. But the rules apply to any utility function you might come up with. You want to maximize risk-adjusted return.

I’ve asked a number of people I know who are not “tech adjacent” (meaning that they don’t frequent boards like HN or tech Twitter, and they don’t live in the Bay Area) if they’ve ever heard of the Internet-based phenomenon that calls itself “rationalism”, and I have yet to find one who has heard of it. So I am skeptical about your claim regarding the causal arrow: I only know of this phenomenon because I’m in tech, and other tech people pointed me to it.

To me, this feels a little like saying “lots of tech people I know with high salaries were early contributors to Wikipedia, therefore being an early contributor to Wikipedia probably made them successful.”


I had become a rationalist(ish) at least a decade before I knew rationalism existed as a movement. Very similarly, I had lost my faith in religion far before I ever knew atheism existed as a movement, too.

It can be considered causal, in that many people follow the thought processes that underlie rationalism, which direct them towards careers and outcomes of the sort the person above was mentioning (95th+ percentile salary as a stable individual contributor in a field that values logic and structured thought). They may do it while being completely ignorant of the movement, but still being a rationalist for all intents and purposes.


> They may do it while being completely ignorant of the movement, but still being a rationalist for all intents and purposes.

If you consider “rationalism” (in the sense of the subject of this thread) to be equivalent to “following a scientific or rational thought process”, then the main question asked in the article becomes nonsensical. “Where are all the successful people who followed rational thought processes?” is a genuinely foolish question, because you can find countless notable examples with no effort at all.

But of course that’s not what the post was asking, and it’s why the poster has a harder time answering the question. The post refers to the very specific Internet phenomenon of “rationalism” which, while cleverly incorporating the notion of rational thought into its name, actually refers to a specific group of people who follow a specific set of teachings.

And those people are massively concentrated in US tech and tech-adjacent areas, largely because that’s where this specific set of beliefs took off first. That’s the causal arrow here.


Answer: Too busy being successful to spend time writing things on the internet to make themselves visible to you as rationalists? (Similar to the classic idea that, to a first approximation, the only people with enough free time to attend Mensa meetups are those who haven't been otherwise successful.) This overlaps with some of the supplied answers.

That was a pretty good Mensa burn, hadn't heard until today and I just got a good chuckle. It's the corollary to the same burn against teachers, but punching Up instead of Down.

For those who don't know the phrase: Those who can't do, teach.

And to go further, the derogatory phrase means if you can't make money doing something, you could still make money teaching other people to do it.


> punching Up instead of Down

Are you implying Mensa holds more "institutional" power than teachers?

Good catch, I suppose I see them as more 'Up' because of the perceived narcissism and the very real 98th percentile IQ membership requirement, the 2%, so to speak.

Every third phrase in the article is inscrutable, and understanding it requires that I click through and read a 2,000-word Scott Alexander post. So I’m going to go out on a limb and offer a hypothesis to answer the post’s main question: rationalists are prevented from being successful in normal endeavors because they’re too busy trading weird in-group language and parsing encyclopediae of inefficiently-organized exposition transmitted via blog post.

The actual answer is that rationalists are highly successful, but successful rationalists tend not to talk loudly about the weird esoteric thing they learned their mindset from, because they know how to read a room.

Unfortunately my only evidence is being a successful rationalist and knowing a lot of successful rationalists.


Right. Has anyone ever surveyed successful people at large on their views on rationalism? I wouldn't guess that most successful people are rationalists, but I would definitely bet that a significantly higher proportion of successful people are rationalists than of the population in general, even if it's still a small number.

Then again, having a stated preference for a philosophical camp at all probably correlates highly with success, compared with the population that is totally uninterested or hasn't had enough education to be exposed.

Either way I don't buy the premise.


I think most successful people would say "of course I'm rational but what's the point of making an -ism out of it?"

Good branding always incorporates a word that people find desirable. The very best branding selects entire branches of human endeavor and claims them as its own.

Has anyone tried starting a cult of goodism?

That's, funnily enough, one of the arguments he makes, sort of.

I think the underlying issue is that rationalists are wanna-be intellectuals: they want fancy-sounding jargon, but without the rigorous intellectual hard-work to back it up.

(Not that most jargon is good in an academic context either, but at least there you can kinda see why it’s sometimes necessary)

(The other reason might be because rationalism, at least recently, has been adjacent to crackpot racist phrenology ideas, and so has driven away contributions from most of the world, preventing them from branching out into the real world)


On the other hand, maybe some people are interested in a type of reading/writing but that doesn't mean they could do it themselves.

Like last night I watched the writer Jia Tolentino interview Chanel Miller (the woman who was raped by Brock Turner, if you're familiar with the case.) Watching Jia think in real time is magical to me. Her writing, too, is at times incisive, at others deeply personal, sometimes indignant or marveling or both, and can still pull back to culture/society level relevance.

I would say that the best of Jia's longform writing is among the best narrative non-fiction being written today.

Is there a part of me that wants to be able to do that? i.e. am I a wannabe? I mean, kind of, but I also really enjoy it and that's more the point. I enjoy Ren Hang's photography but am not trying to be a photographer. I enjoy helping founders but don't have much desire to found another company.

We don't _have_ to assume that everyone who enjoys something only says that they do because they are status-obsessed and lazy.

EDIT:

Another example I would give is Daniel Day-Lewis. Good god, the lengths he goes to to method act; the almost terrifying commitment. And it fucking pays off. But do I want that life? Hell no. Because I'm lazy? Maybe if I wanted that life then work ethic would be a question, but what disqualifies my trying to be Daniel Day-Lewis in the first place is that I don't want to make the trade-offs he does, even if I could.


Approximately none of this reflects my experiences with rationalists I've met through things other than LessWrong meetups, etc. If you define rationalism very narrowly to mean "the type of people that religiously attend LW/CFAR events because they aren't too busy for that," this may have elements of truth, but it's also intentionally excluding a lot of people who are meaningfully members of the rationalsphere.

I think they tend to use a lot of fancy jargon because the jargon is actually really good, and there's no real replacement for it without circumlocution.

I'm not even sure if it was intended, but I chuckled reading the word `circumlocution`.

I have been heavily influenced by lesswrong. After the initial stage of incorporating the rationality identity and arguing about everything with everyone, I tried becoming more of an applied rationalist when I realised the first approach wasn't leading me to winning anything. Shortly after, I read: https://www.lesswrong.com/posts/ZbgCx2ntD5eu8Cno9/how-to-be-...

I made a series of life changes. I can't attribute these changes to the blog post alone, but it was something that pushed me towards a more meaningful life. I stopped spending money on things and started buying experiences more often. I bought a bicycle (a thing, but one that gives you many experiences) and started cycling to/from work in an effort to a) exercise and b) spend more time outside. I broke up with my girlfriend of 9 years because I figured it just wasn't making me any happier. Cycling in London, UK for years made me very satisfied, but I also got tired of the climate, and I ended up finding a remote role and relocating to a climate where I can be outside every day. Finally, I started strength training, and now I am in much better shape after doing it on a regular basis for a few years. It's hard to say that I am happier exactly, but there is a sense of liberty I feel that is hard to describe. The life I am living feels like my own; I feel a sense of control, and I am liberated. I would like to think many of these things count as "winning".


I've treated lesswrong as a way to still make terrible choices, just by expressing bias through more complex thinking, which hides it better and makes us feel better when we still experience the same outcomes.

how would you judge the choices you have made? what would be a better choice, and how would you know it was better?

Poorly, because hindsight is hard, I just would skip the steps related to hiding my biases behind overly precise numbers.

"poorly" and "bad" related to what? what measure are we using to evaluate these outcomes? Whats the magical thing that we use to tell whether "rationality" is bad at achieving "it"?

Poorly and bad refer to the inability to assess outcomes objectively and accurately in a way that identifies predictors for future outcomes.

In other words, the techniques praised in lesswrong writings dress up what is effectively bias shuffling, which results in the same problem of being unable to assess the predictors of outcomes, just now better hidden behind falsely confident figures instead of admission of uncertainty.


> Poorly and bad refer to the inability to assess outcomes objectively and accurately in a way that identifies predictors for future outcomes.

So at some point you were able to assess outcomes objectively and update your perceptions? What would you call that process?

> In other words, the techniques praised in lesswrong writings dress up what is effectively bias shuffling, which results in the same problem of being unable to assess the predictors of outcomes, just now better hidden behind falsely confident figures instead of admission of uncertainty.

I’m asking how you can identify a truth-finding process that is not bias shuffling and what you would call it (since “rational” is now taboo).


You're barking up the wrong tree. I offer no solutions, I only spot plenty of problems with what lesswrong tries to sell as one.

You say "I broke up with my girlfriend of 9 years because I figured it just wasn't making me any happier." Now do you think that was a wise decision?

(It's also hard to tell what subtle distinction the verb "figured" is meant to convey in your sentence, if any, beyond say a neutral verb like "concluded".)


I was very involved with the Less Wrong community, starting when it was the joint Overcoming Bias blog between Yudkowsky and Hanson, through the splintering-off and the formation of MIRI (formerly the Singularity Institute) around 2008-2011 or so.

I think it comes down to a few factors.

1. Deep Learning took off and took the narrative as the "right path" to "General AI." Back in its heyday all the LW folks (including myself) were really focused on approaching AI from the perspective of Pearl's Causality and the work of Ray Solomonoff

2. There were no practical products, tools, frameworks or research that came out of that group that could either be easily applied to existing business or demonstrated SOTA on any computing domain.

3. AGI/GAI/HLAI/etc... is generally still (unfairly IMO) seen as the domain for cranks and zealots - which was the primary end-goal of the LW crowd, where "rationality" was the practical middle-step.

4. It kind of turned into a mini-cult around Yudkowsky, which IMO turned people off.

[0] https://intelligence.org/


I would also add in the toolmaker's problem.

Rational thinking, like programming, is a tool. When you turn programming on itself, you get metaprogramming and new programming languages. Which is cool for programmers, and many a very skilled programmer has fallen down that rabbit-hole, but neither particularly profitable nor necessarily all that useful. To get the big bucks and make a big -- direct -- difference in the world, you need to combine the tool, programming, with domain knowledge and real-world problems.

Likewise, if you turn rational thinking on itself, you get metacognition, thinking about thinking, and rationalists. It's neat, but at the end of the day...


> It kind of turned into a mini-cult around Yudkowsky

I think this transformation had already happened when I first heard of rationalists. Some of the things they were saying seemed vaguely interesting, but I couldn't quite put my finger on what felt off. At some point I came to the conclusion that Yudkowsky was some sort of cult-leader-like figure.

A cult around rational thinking. Isn't that something.

Then later tiger king came out and I realized that apparently everything can devolve into forming cults.


Humans can turn (almost) anything/anyone into a tribal fetish.

The rationalist community and the number of people who call themselves rationalists has actually grown enormously (exponentially, in fact) since those early days. Modern rationalists exist in all walks of life. The early Yudkowsky-centered rationalist movement is a historical, not a modern depiction.

I don't think rationality alone cuts it; you need a goal. For example, I see many MBAs being rational in their relentless focus on efficiency. Yet this is reactive (to the market); it doesn't follow any value in itself. To attempt something nobody has done before, or even to pursue a certain value above others, is inherently irrational.

Yanis Varoufakis told a nice story about teaching game theory. At the beginning of the semester, he offered the class a free pass on the course: each student would write down the grade they wanted, and everyone would be awarded the minimum grade proposed. But first, they would all have to agree to the proposal. He mentions that there was always a person who refused. When he asked them why they were against it, they responded, without a hint of irony, that surely some idiot would write down a low grade (and they would not pass).
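The grading scheme in the story is a minimum game; a toy sketch (class size and proposed grades invented) shows how a single low proposal decides the outcome for everyone:

```python
# Varoufakis's offer as described above: every student writes down a grade,
# and the whole class receives the minimum of the proposals.

def class_grade(proposals):
    # The payoff rule: everyone gets the lowest grade anyone proposed.
    return min(proposals)

cooperative_class = [95, 95, 95, 95, 95]
one_defector      = [95, 95, 95, 95, 40]  # one student writes a low grade

print(class_grade(cooperative_class))  # everyone prospers
print(class_grade(one_defector))       # one proposal drags everyone down
```

The refusing student's fear is that the second scenario is possible at all, which is why unanimous agreement never materializes.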

Frankly, I don't think this problem has a rational solution. It's a paradox: by being totally rational, you're being irrational. And I think the same is true for rationalism as a life philosophy. If you apply it to everything, life will be hollow.


I don't see how the disagreeing students in Varoufakis's classes are irrational. There are at least four reasonable explanations I can think of for their behavior.

1. They perceive a tail risk that at least one student is willing to spite the others. Possibly a student that has less to lose from bad grades than their classmates do, but that's not the only context where it could happen. While an assumption that everyone has the same values and will make the optimal choice under those values often leads to an effective approximation, modeling human behavior isn't always that simple... especially when only one person has to deviate for your model to make a catastrophically wrong prediction.

2. Grades aren't quite on an absolute scale; at least when pushed to extremes like "everyone in the class gets an A+", the most sophisticated readers of the transcripts will interpret that A+ differently than an A+ in the same class where the distribution of grades was more typical.

3. Getting yourself to behave in your long-term self-interest isn't always trivial. People do things like publicly commit to donating to a cause they hate if they don't achieve goal X, to try to increase their chance of reaching that goal when they've wanted to but failed in the past. Meaningful grades can be better than their absence if you struggle with self-control and trust the professor enough.

4. If you have a clear interest in seeing your classmates put effort into the course, you might vote against this.

Note that, in some of these cases, the student has an incentive to be quiet about their primary motive and instead give the cover-story explanation Varoufakis heard. I don't think those cases are likely (I think it really is mostly #1), but I wouldn't rule them out.


The disagreeing students are rational and irrational at the same time; their rationality forces them to act irrationally. That's why it is a paradox, kind of similar to the https://en.wikipedia.org/wiki/Card_paradox

Of course you can argue about the details in particular situation, just like you can explain that the card is not really a paradox, as the statements were written by different people, etc.

The main reason I pointed out the story is that it offers one answer to the question: why aren't rational people (more) successful? We can easily imagine a class of "irrational" students who take Varoufakis's proposal, all get their good grades, and move on to other tasks, beating the hardcore rationalists in the process.


This is a very good point. Rationalism is inherently flawed in that it lacks a value system, and all value systems are inherently irrational. This manifests in philosophy as the "is-ought" problem.

Rationality is about internal consistency of beliefs and values.

"Rationalism is inherently flawed in that it lacks a value system" strikes me as a category error. Giving you a value system isn't its job; instead, it provides a toolbox for identifying and resolving contradictions in your preexisting beliefs and values.

(Since we all work with imperfect information about the world, internal consistency is far from a guarantee of accuracy; but it improves the odds.)
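One concrete sense in which internal consistency "improves the odds" is the classic Dutch book argument: degrees of belief that violate the probability axioms can be bet against for a guaranteed loss. A minimal sketch (the events, prices, and stake are invented):

```python
# Dutch book: someone who prices a bet on "rain" at 0.6 and a bet on
# "no rain" at 0.6 (implied probabilities summing to 1.2) can be sold
# both bets and will lose money no matter which event occurs.

def bookie_profit(p_rain, p_no_rain, stake=100.0):
    # Sell a bet paying `stake` if rain, priced at p_rain * stake,
    # and a bet paying `stake` if no rain, priced at p_no_rain * stake.
    premiums = (p_rain + p_no_rain) * stake
    payout = stake  # exactly one of the two events occurs, so one bet pays
    return premiums - payout

print(bookie_profit(0.6, 0.6))  # inconsistent beliefs: guaranteed loss for the bettor
print(bookie_profit(0.6, 0.4))  # consistent beliefs: no free lunch for the bookie
```

Consistency doesn't make the beliefs accurate (0.6/0.4 may still be badly wrong about the weather), but it at least closes off this kind of guaranteed exploitation.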


There are two issues here.

First of all, not all value systems are rational and optimizable. The most relevant ones to humans simply aren't. Building a rational system to maximize these value systems is an impossible task, it's a category error. Like trying to run regression with a non-differentiable cost function.

Value systems are an "ought". No rational, logical system can ever change them. Resolving contradictions in your values is a category error, because there is no such thing as a contradiction-free value system: having more than one value means inherent contradiction.

As a result, a purely rationalist perspective is ineffective. You can only achieve a certain level of rationality.

Which is already exactly what people have been doing for millennia: trying to find the most logical way to follow their values, and trying to phrase their inherent, base values as a set of explicit values that represent them as well as possible. Construed that way, it's nothing different from what secular philosophy has done for over 4,000 years.

Finally, it's never possible to know that your system is without any contradiction, because that would be equivalent to solving the halting problem. So, do you have a notion that contradiction can be estimated? Measured? How do you come to the conclusion that your belief system, with that contradiction removed, is any less contradictory than before, knowing full well that you might have introduced a more complex, as of yet unknown contradiction, or ten thousand of them?

And so it seems that having soft contradictions is not actually necessarily a bad thing, and that jumping to resolve them should only be done if they actually are terminal, because balancing contradictions in your cognitive process is actually very doable and a good tool. It's not just about information either, even with perfect knowledge it's impossible to be sure of the absence of contradiction.

In other words, it's never possible to prove the consistency of a model within that model itself. You can try to prove it using the framework of another model, but how do you know that model is consistent? You can't.

This is why I personally try to ground the internal consistency of my thought process in the material world: a contradiction is only problematic if resolving it can materially improve the situation or the understanding I derive from it.

But then again, philosophical materialism vs. idealism is a whole other can of worms, so I'll leave it at that very small facet of it.


It is not necessary to resolve all contradictions in one's beliefs and values to gain some benefit from rational introspection. Even professional physicists haven't arrived at a consensus on how general relativity and quantum mechanics fit together; that doesn't make the internal consistency of those two models worthless.

Yes, and that is a completely useless remark that has been the basis of Western philosophy for 4,000 years. This isn't what "lesswrong rationalism" is about; that is about frantically trying to eliminate as many contradictions as possible, even if it means being less effective at actually optimizing the values, whether by eliminating flexibility, increasing the likelihood of error, or abstracting the contradictions away to such a level that they can't be consciously dealt with anymore.

As you've missed in my first post, the issue is that personal philosophies are fundamentally different from physics, in that in physics there is a correct model, that can be reached by eliminating contradictions and constantly iterating. As far as models to maximize personal success, however that is, there is no such thing as a correct model, there will always be contradictions, and hyperfocusing on the most rational possible model to optimize for an irrational goal will almost always yield worse results.

Rationalism doesn't just mean "we can derive some benefit by carefully using rationality to arrive at better models and decisions", the same way that scientism doesn't just mean "we can derive some benefit by using science". Rationalism is the idea that *all* beliefs and decisions should be based purely on rationality. Unless your goal is purely quantifiable and free of contradictions, this is actively harmful.

I never once suggested that rationality isn't useful or that resolving some contradictions is useless, I actively stated the reverse, that they are both useful insofar that they actually make the situation better for you, which is absolutely not what rationalism is about.

Rationalism presupposes the supremacy of rationality, which itself cannot be rationally proved.


Any toolbox can be misused. Anything can be made into an idol.

The fact remains that the "lesswrong rationalism" community was the first to popularize several rather sophisticated reasoning concepts. They have provided common reference points that have made it easier for me to communicate my thinking to others and vice versa, and I am far from the only writer/reader who has benefited in this way. As a consequence, I'm generally sympathetic to this group, and personally identify as a rationalist.

I agree that the heuristic "internal consistency is always worth actively working on" can go very wrong. It looks like you're defining rationalism as structuring of one's life around that heuristic; I'm guessing you must have had bad experiences with people who actually do that.


In that case, you're not advocating for rationalism. It's a word that has a well accepted definition of a philosophical framework where reason is the primary source of justification and (moral) truth.

Just using reason to come to conclusions that aren't necessarily ultimately justified, or deriving truth from pure reason, is literally the entire concept of Western philosophy. And by Western, I mean from Iceland to Baghdad and beyond. I'm fairly certain it's also the entire concept of Eastern philosophy too, but I don't know much about that.

Lesswrong is certainly good for introducing concepts and allowing discussions. That does not a philosophy make. I'm not critiquing the website, nor the community, but the "pure reason" rationalist philosophy that a lot of people there adopted and peddled, without spending enough time justifying it, instead spending vast amounts of time applying it. This is exactly what the article is addressing when it talks about rationalism, too.

Rationalism is, by definition, idolatry of reason. At the very minimum, to even show any kind of rationalism, you have to believe that some things are true by virtue of pure reason, independently of the empirical world. Rationalism really doesn't have anything to do with using or not using reason.


What's your take on Harris' The Moral Landscape?

I find it pretty much useless. It's just a confused case for rule utilitarianism, an age-old moral framework: it makes some good arguments for it but fixes none of its problems. Overall, the central claim is unproven.

To make the central claim worthwhile, Harris would have to find a way to measure well-being. Since he can't, there is really no difference between that and bog-standard utilitarian ethics.


Actually I think the value system of rationality is based on the (arguably irrational) belief in the supremacy of rationality.

This is an ahumanist or post-humanist value system. Rationalism becomes a movement of transcendence from mortality, fleshly desires, the unconscious, and nature.

The rationalist worships the neural network.


Varoufakis is not only very irrational but mentally ill and I take full responsibility for saying this.

> Pope Clement VII was so influential that King Henry VIII had to leave the church and invent in Anglicanism just to escape.

This is so amazingly reductive I can't take anything else in the post seriously.


Perhaps further evidence of too reductive of an approach:

>The LSD has an estimated $5B in annual revenue, with an estimated $100B in funds, and protestants can make claims to nearly every US president.

Importantly, this is the LDS, not the LSD. Do rationalism's thinkers engage seriously enough to be able to make important cultural contributions? Or do they have such a surface-level view of everything (too much breadth, not enough depth; too much depth is impractical, and practicality is key for rationalists) that even in this post they cannot be bothered to cross the t's and dot the i's enough to avoid confusing a psychoactive substance with an influential religious institution?

More to the point, rationalism's great age has passed. Rationalists did carry import in the post-Reformation era, but in philosophical terms, we have long left the era of classical Enlightenment behind. We have even left Modernism behind, which was arguably born out of Rationalism. We are at the tail end of postmodernism; of course you don't find "highly successful" "rationalists" in the postmodern age. While you don't necessarily have to be conversant with (or even aware of) Barthes' structuralism, "success" (at least, in the sense that is discussed in this post: cultural import, legacy, etc.) is dependent on living and breathing postmodernism.

Like a fish in water, you don't have to know what postmodernism is, exactly. I do not think any TikTok star knows what postmodernism is. But they intuit that they must be a chimera, a product of their audience, and that the signs they create must be readily meme-able and able to live a life outside of their author. Rationalists do not breathe in this water; their hierarchical approach is stuck at the turn of the last century, and so they will not find themselves carrying import at the beginning of this one.


These aren't the same rationalists. This is about LessWrong-style Bayesian rationalism from ~2009 or so. The name is pretty unfortunate, though.

Oddly, I feel compelled to defend that bit of glibness. Sure, the Pope vetoed the annulment of Henry VIII's marriage because he was under extreme pressure to do so, and Henry VIII left the Catholic Church out of frustration with this rather than anything resembling sympathy for what became the English Reformation. But the point was that the Church's stance on being above worldly politics was born of being so integral to worldly politics that they needed excuses. It'd be great for the rationalist movement if their leading figures were so influential on business, politics and actually existent AI that they had to feign disinterest in them.

I guess the flip side of that argument also lives in British politics: the PM's advisor Dominic Cummings, who fulfils a Rasputin-like role of being on the one hand a magnet for criticism and on the other hand someone the PM will go to unprecedented extremes to continue to receive advice from[1]. He's also someone who has Overcoming Bias and Yudkowsky in his short blogroll and sniggers at his detractors that they "don't understand epistemological uncertainty" when his comments that his priorities "in some possible branches of the future... will be an error" get media attention. An archetypal 'rationalist' who is incidentally currently the UK's most influential policy wonk.

[1]Usually British PMs don't accept their Chancellor's resignation so their adviser can fire his staff, or refrain from firing advisers whose actions become the #1 subject of public outcry from their core voter base.


...right, but it isn't correct. Clement wasn't feigning disinterest, his interest was in not granting an annulment. And Henry was able to appoint Cranmer, he was able to dictate his domestic situation freely, he had no real influence beyond the spiritual (which is why Henry left). I am not sure what you mean by "world politics"...that isn't a term that makes sense in this context.

Also, the situation with Cummings isn't remotely odd. What confuses people is recent history. Blair/Brown was the archetype of why you don't want a powerful Chancellor: Brown briefed against Blair and the govt constantly, his advisers used to attack other ministers, it was insane. Osborne's relationship with Cameron was a function of seeing Brown attempt to destroy Blair.

But that isn't always the case when there are issues with other Ministers or the Civil Service (both of which were true when Johnson took power). Thatcher is the prime example: almost everyone in the Tory party hated her, her 1981 Budget was made by Alan Walters not Howe, Lawson resigned when she opposed ERM (again, due to advice from Walters). Equally, in foreign policy she ignored her Ministers and, more importantly, the Foreign Office (who gave uniquely terrible, bizarre advice in this period...even by FO standards).

Our system is built with this tension in mind. Cameron had a strong policy unit. Blair didn't have one but he relied on a small circle of Ministers so it wasn't necessary. It is very normal, and quite necessary, for a Prime Minster to be hostage to no-one. The Prime Minister is always responsible to Parliament, that hasn't changed. But the growth of the media, growth in complexity of policy, and growth in suspicion of the Civil Service (particularly after the 70s) has led to this requirement.

And btw, I don't think most people in the public understand how ludicrous the situation with Javid was (it generally wasn't reported because the person in question was liked by the media because they leaked). Javid's advisers were actively briefing against the govt daily. When Javid was sacked, his advisers were out there literally the day after briefing against the govt. It wasn't close. It wasn't even remotely close. It is a very good thing that SPADs get taken down a peg because the massive proliferation in their number since the early 90s has meant some attempting to adopt a remit that vastly exceeds their purpose.


Hi Dominic :) In all seriousness, talk about it being a 'good thing that SPADs get taken down a peg or two' is ironic when my observation was that Cummings is literally the only SPAD I can think of that any PM has appeared so beholden to. I'm sure Johnson isn't the first PM ever to let a minister go on a SPAD's advice, but the adviser is usually in the background, not personally firing the Chancellor's staff. And how often is it that a SPAD not only survives lawbreaking in his personal life becoming a focus for public outrage (instead of being summarily fired and if really valued directed to a suitably close think tank and told he'll be rehired when the fuss dies down) but has the PM stan for his 'utmost integrity' as the warmup act for his own press conference. You don't think it's odd that when the public refuses to believe a SPAD claiming the reason he took his wife out for a lockdown drive on her birthday to 'test his eyes', the PM swaps his contact lenses for glasses so he can give a statement about coronavirus affecting eyesight? By all means name me a comparable example of a PM brazening it out to defend an underling who in normal circumstances is supposed to be invisible.

None of the political intrigue itself has anything to do with rationalism of course, but it does show a pretty archetypical 'rationalist' whose counsel was ranked unusually highly by a global leader.

As for Clement, I never said he feigned disinterest. I agreed with the original blogger that this was one of many examples of ecclesiastical orders getting involved in all the City of Man stuff the Church's founding mythology placed them outside; often excommunication was even more transparently nothing to do with spiritual matters. Sure, excommunication proved no impediment to Henry VIII resolving his domestic issues at the cost of only a few centuries of religious strife and antagonistic relations with Catholic states, but unless your argument is that this proves the Church wasn't any more influential on the politics and economics of the Middle Ages than LessWrong is on the politics and economics of the world today, I'm not really sure why you're twisting my original statement to make that tangential point.


I understand that you can't think of an example... that is why I gave you an example (Thatcher relied almost totally on her policy unit because, again, literally everyone else hated her). I will give you another: under Cameron, everyone had the same issues with Hilton (a hint: Tory backbenchers always complain; people seem to forget this every few years).

What you don't answer is why you believe that a PM wouldn't fire an adviser briefing against the govt. The only instance where this has occurred was Brown (and remember, pretty much every person around Blair asked repeatedly for Brown to be sacked, and most people ended up leaving because they couldn't deal with Brown). Indeed, this occurred under Cameron (with one of May's closest advisers) and the person was fired, no questions. The odd thing was that Javid refused to sack the people in question (I remember the day it happened the advisor was briefing the press that Johnson was going to fire Cummings... it was utter insanity, and this was after it was well known that they had leaked govt documents... this is typical for the Tories, but Javid was given plenty of room, and chose to shoot himself).

Being "a focus for public outrage" is neither here nor there (thankfully). Some people are perpetually outraged, doing what they demand doesn't affect things.

Advisers also aren't supposed to be invisible. Again, when anyone has advised the PM, it has caused problems (with backbenchers, with the Civil Service, etc.). Under Blair, everyone complained about his advisers. Under Cameron, everyone complained about his advisers. Under Thatcher, everyone complained about her advisers. Under Brown, particularly, everyone complained about his advisers. Under May, particularly, everyone complained about her advisers. They are not supposed to be doing interviews and briefing the media (openly), and they clearly aren't supposed to be briefing against ministers in other departments...but they aren't invisible either.

It isn't tangential, the original point was: Clement was influential...saying that he wasn't influential is not a tangential point.


But Thatcher seeking economic advice from an economist, Tory backbenchers finding Hilton a ridiculous figure and SPADs regularly getting fired by their seniors are not remotely similar to Cummings taking personal responsibility for firing the Chancellor's employees and holding press conferences to tell the public they're wrong to be upset at him for breaking lockdown. If my argument was that PMs never listened to their advisers before, you might have a counterexample; but my point is that advisers whose behaviour in their personal life has a direct measurable impact on polling support for the government don't, in normal circumstances, keep their jobs, never mind do public briefings in the Rose Garden. I'm not sure why my original assertion that Cummings was an apparent 'rationalist' with quite a lot of influence has prompted quite such an impassioned defence of his actions, but my last post asked for an example of a PM going to comparable extremes to retain an adviser whose personal life is at the centre of negative media attention, and you responded by pointing out that PMs fire advisers quite a lot...

Suffice to say you are also perhaps in a minority of one in believing that Clement VII had no influence on the Middle Ages. Especially when the point of reference the Church's influence is being compared with is a collection of moderately popular blogs and think tanks.


"Explain the English Reformation in one sentence" is a pretty tall order, but it seems reasonable to wonder whether Clement VII would have allowed the annulment if Catherine of Aragon's nephew hadn't just sacked Rome and taken Clement prisoner.

The geography and logistics were wrong for either a successful covert assassination mission or for installing/supporting an anti-pope to oust him. The reason Anglicanism lasted was that England is a large island: it was hard to build a navy capable of overcoming its own, raise an army large enough to defeat it, and transport said large army after the naval fight.

Anglicanism was a branch born of cynical political power struggles and everyone knows it.


Funding your companies, advising your executives and politicians, building your products, and smack talking on HN.

It is a funny question though. I might posit that it's not a philosophy so much as a subculture that appreciates a certain flavour of writing. There's a form, like a prog-rock anthem or a minimalist piece, where writers ourselves tend to appreciate it, but where most people are just looking for something to dance to.

If I could characterize most of the criticism I've read of rationalist writing, it would be that their main complaint is they can't dance to it.


"How to Solve It" by Polya is a great book about general techniques for solving hard math problems. It also contains a warning: "[...]if we're not careful, the next generation of students will be able to talk the hind leg off a donkey about the thought-processes involving in solving problems but will be able to operate only at a very low level in terms of content."

My (admittedly untestable or at least hard to test) hypothesis is that people who identify themselves as rationalists fell into this trap.


It's an easy trap to fall into. Reading and thinking about how you'll think when you need to solve that hard problem is waaaay easier than actually solving the hard problem. Plus, you get this little bonus of still feeling like you're doing something productive -- "I'm just sharpening the axe really quick." Meanwhile, no actual work is done.

The quote at the end of the article about computers being for community, not insight, resonates well with me, especially given where this is posted right now. I also empathize with the author's take on reading rationalist literature basically making me feel less alone rather than helping develop any kind of special insight or skill at winning anything, really.

Rationalism is hydrofluoric acid. It silently creeps inside you and hollows out essential parts of you when you thought it was just a tool helpful for dissolving some external problem.

You start with a problem, and instead of solving the problem, you end up just losing your desire for the solution.

It's our nature to be self obsessed. So when we learn a new way of thinking, we cannot help but make ourselves the primary target of it. And nothing will disabuse you of a deeply held, arational, intensely motivational heroic narrative like the tools of rationalism.


Counterpoint: a lot of successful people who do not claim the mantle of rationalism are in fact rationalists.

I run into successful rationalists all the time in the business world. They might not self-label (I don't, though I acknowledge that the sequences have drastically changed my life), but we're definitely in the demo he's talking about. Yes, there may be some selection bias, but I know for a fact that the way I'm able to contribute at work has quite a bit to do with things I've learned from the rationalist community.

It definitely improves things to work some of the relevant lingo into your work environment so you can more easily help colleagues grok why the perspective you have on a topic seems very different, and I can see an argument that in many areas you'd be better off with specific knowledge. But is that the trade-off? Not in my experience. You can have both, or at least 80% of both.


I personally know at least one rationalist who is amongst the wealthiest people in the world and got there at least in part as a direct consequence of his rationalist interests, uses the lingo etc.

(No, I won’t say who it is)

That being said most people who self-identify as rationalists are significantly held back by (a) lack of charisma (b) the fact that they believe that their elaborate mental constructs are reality. Of course these are just another mental narrative, and not even a particularly useful one.


Rationalists tend to be irrational people who need help.

No doubt there are some good thinkers in the sphere, but I don't know why you'd correlate rationalists with 'success'


>Rationalists tend to be irrational people who need help.

wot..


I suppose an analogy is that the most puritanically religious are often those who feel most inclined to sin. Your average sinner doesn't feel the need to go full-on zealot to keep to the straight and narrow.

Scott Alexander: "A Christian proverb says: “The Church is not a country club for saints, but a hospital for sinners”. Likewise, the rationalist community is not an ivory tower for people with no biases or strong emotional reactions, it’s a dojo for people learning to resist them."

https://slatestarcodex.com/2014/01/12/a-response-to-apophemi...


Because rationalism is more about not taking avoidable losses by frankly being stupid than about producing outliers by doing something normally stupid or crazy that pays off and gets noticed. Nobody pays attention to the 999 times someone prays for rain and fails, but the 1 time it works it is called "working a miracle". That sort of selection bias is in play here, but worse, as they ignore the background yields of regularly applied rationalism.

Not to mention the projection involved - "success" is easy to come by if you have a crowd of rubes paying for what they want to believe but rationalism doesn't offer anything with such popular appeal directly.

Really the whole article is anti-intellectualism fallacies strung together in an "if you know where the rays of the sun create gold, why aren't you rich?" style of fallacious retort.


Jeff Bezos seems to have a pretty rationalist philosophy, although he doesn't cite Yudkowsky. Gates and Musk seem to have aspects of it too.

"I knew that when I was 80 I was not going to regret having tried this. I was not going to regret trying to participate in this thing called the Internet that I thought was going to be a really big deal. I knew that if I failed I wouldn’t regret that, but I knew the one thing I might regret is not ever having tried."[1]

https://medium.com/@alyjuma/the-regret-minimization-framewor...


I can by no means claim to be as knowledgeable on any of this stuff as the post's author is, but my take on this has always been that rationalism is a philosophical view, and thus, it's really not clear why you'd think it should lead to great material success given that philosophers in the modern world tend not to achieve said great success. I am personally enthralled by philosophy and I'd also consider myself a rationalist (and have spent a decent amount of time reading SSC), but I'm unable to draw any sort of putative causal connection between that and being very successful. Indeed, delving into this stuff is what I do to screw around, so pretty much exactly the opposite.

The problem is the competition is much better than anticipated. Human irrationality is greatly overstated, because people conflate being bad at explaining why they are doing something with doing something irrational. Humans aren't anywhere near as irrational as portrayed. If they were, they'd be dead. Not just, like, dying today, but dead long ago, and you'd have never been born, because humans would just be dead. Reality does not permit high degrees of irrationality to survive very long. To listen to some people describe it, humans are beings who, when presented with a choice between a cookie now or two cookies in 15 minutes, choose to gouge their eyes out.

Rationality studies aren't going to rewire a fundamentally crazy, irrational person into a rational one that is suddenly the one-eyed man in the land of the blind. Rationality studies can only clean up some of the rough edges on a being that fundamentally behaves fairly rationally most of the time. There isn't anywhere near as much room for optimization as some people seem to think.

And part of the reason for this is, it is very difficult to overstate how bad humans are at explaining why they are doing something. Just, mind-blowingly bad. Study after study reinforces the endless parade of anecdotes from my own life. Everything agrees; people are terrible at this.

But, the thing is... doing the rational thing for the wrong reason, or without any reason at all, is still doing the rational thing. Jodie may gossip terribly about Ethel for all sorts of reasons our calm, collected putative "rationalists" declare to be "irrational", and Jodie herself may think she's doing it because Ethel is ugly and smells bad and just pissed her off yesterday, but nevertheless, the goal Jodie doesn't even consciously know she has is likely to be attained, and Jodie's access to better mates is likely to be successfully increased at Ethel's expense. (Not "guaranteed", but likely enough that, well, this strategy has been around since the dawn of recorded history and is still going strong for a reason, you know?)

A good chunk of the remaining belief about human irrationality will come from people who aren't clearly looking at the world, and will declare on one level or another that that's an irrational thing for Jodie to want... but... goals are much less amenable to "rationality" than means. That you personally think Jodie shouldn't want that, or shouldn't need to want that, or whatever, doesn't change the fact that Jodie is operating on that goal, and really quite rationally pursuing it.

That said, there is room for improvement. But I don't think it's the sort of improvement that leads to fame and visibility very often; it's life improvement. Nobody's famous for really having their life together, and making good decisions. But there are still plenty of people who are better at it and plenty who are worse, and even if there isn't as much room as some people may think to improve that, there's still plenty to make it worthwhile.


I can't read the name "Jodie" without hearing the song "Jody" by Tatsuro Yamashita, the eminent inventor of Japanese City Pop.

https://www.youtube.com/watch?v=0J-SNMOcB5M

My apologies for not actually engaging with your ideas. I mostly agree with you, and have no major criticism or connections to add. I just thought you might like the song.


I have no idea what this guy is talking about to be honest. I can't really discern even what claim they are trying to disprove in this 10 year old blog post by another author referencing another discussion that I don't have background on.


This is a great post, and i'm a huge fan of Glen Weyl, but Glen Weyl is the biggest purveyor of illegible-to-nontechnical-people technocratic ideas i've ever seen. I think his ideas are great and interesting, but I don't really understand on what level he views himself as separate and distinct from the rest of technocracy.

Where are the successful philosophers in general? His opening complaint is that the most successful rationalist philosophers are stuck in academic philosophy and then goes on to name the only modern philosophers I've ever heard of.

Academic philosophy has long been a circle jerk that occasionally in a moment of triumphant success becomes useful as a side note to something in AI research or whatever.

I mean, when was the last time that an academic philosophy paper made it to the front page of HN (which is probably the most receptive mainstream audience that is ever going to exist for something like that)?


I'm not generally into whatever scene this article is all about, but this seems like a trip down a philosophical rabbit hole that maybe has too little data or not a lot to do with 'success'?

Maybe there's not enough rationalists to decide or maybe we don't know how many folks are rationalists or not so we can't answer the question?

Or success has nothing to do with being a rationalist or not? Or 'success' isn't easily defined / observed?

This seems like measuring something that I'm not sure you can measure, or tell if it matters or not.


Of course analyzing arguments logically is a valuable skill, but here's the thing: the educational preparation for fields where rational thinking is most valuable is already chock full of opportunities to practice and refine rational thinking. After years of practice, what further marginal gains can be made by further analysis and study of the skill itself? Isn't it overwhelmingly likely that a person with that background has other, more pressing weaknesses they can work on to promote their success?

Unfortunately, the people who gravitate towards rationalism tend to be people who are already good at it and naturally oriented towards it, people who are more likely to lack the skill of turning it off. When somebody makes an argument to you, you want to think along with them, try to inhabit their head temporarily. You'll learn things about them that allow you to work productively with them and even understand the factual content of their speech better. If you're diagnosing the logical structure of their irrationality while they speak, it's going to knock you out of their headspace, like spotting plot holes in a movie takes you out of the film. You've spotted something invalidating, and now you have to wrestle your brain back into paying attention. I would wager that the best movie critics aren't generally elite at spotting continuity errors. Learning to turn critical distance on and off, and to direct it where you need it to go (focusing on emotional dynamics, for instance, instead of logical structure), is probably the skill that many people attracted to rationalism really need, but by doubling down on rationalism they're making it even more automatic.

That aside, I think the fundamental question of the article is bizarre. Where are the successful "sprinters" in professional football? Speed is a critical factor for wide receivers and defensive backs, and they train for it, but they don't go around calling themselves sprinters or talking about sprint training in their press conferences. Speed is a skill and a quality for them, not an identity. Self-identifying as a "sprinter" would be counterproductive for a defensive back, because it takes one element of success and elevates it above the others instead of balancing it against them. It only makes sense if being a sprinter is more important to them than being successful as a defensive back, and you wouldn't imagine a person with that order of priorities to achieve elite success as a defensive back.

As a skill and a quality, rather than an identity, rationalism might be common at the top of every field; who knows. That's a more interesting question at least.


Are rationalists less likely to enter the pool of candidates striving for widely-publicly-visible 'success' because they note the low probability of success (and the discomforting fact that success arguably arises out of factors beyond one's control/foresight)?

Entering this pool arguably requires a touch of useful delusion (e.g. Trump).


I think most super-successful people have taken gambles that were a priori foolish but ended up paying off. People who behave more or less rationally may not get the big jackpots, but they get a lot of small wins and avoid most catastrophes.

Some of the richest people in the world - I'm thinking Bezos, Gates and Buffett - are pretty rational even if they don't blog much.

How about Dominic Cummings? I don’t like the guy, but he clearly thinks of himself as a rationalist and has been extremely successful.

I'm basing this more on friendships with libertarians than the article (tho I did read it, if not get the meaning); people are not rationality engines; reason is at the behest of emotions; reason is even often used to back-fill explanations for decisions made by the weighing of various neural networks for reasons concrete but obscure.

Success for humans certainly is increased by an ability to reason well (quickly or slowly, depending on the field), but even more so by being able to motivate or persuade other humans. Even technically, my success at learning complicated stuff is less about being able to code and fix systems and more about being able to consistently and correctly explain the complicated stuff and its logical consequences in meetings. I force myself to keep programming because I need to keep that understanding alive, but companies would be happy to have me just explaining stuff. And the people I've enjoyed working for the most are able to create a warm feeling of we are all in it together making cool new technologies! Yay we are making the internet! Pump up the feelings so we are motivated to use reason to solve problems in the path.


I've seen the claim that reason is just emotional rationalization before. It strikes me as dreadfully cynical. Rationalizing one's emotions is the entire point of life. We experience something, it confuses us, we try to figure it out and explain it, model it so that we can better predict in the future. In other words, it's called "understanding". Understanding is a good thing, not a bad thing.

I think the criticism is better and more narrowly targeted toward those that don't prioritize consistency. Yes, from your experiences and emotions, try to derive (or infer, rather) your principles. That's fine! But then, don't stop there. From there, try to remain consistent with those principles. And if they conflict, use that emotional experience to again understand and update your principles and then try to remain consistent with those.


Reason is not only or solely emotional rationalization; my point was that in the substrate of the brain, emotions are the motivator for our actions and long-term patterns. You can't understand yourself in completeness, at least so far; we are all just too complicated. There may be pieces you understand about your typical reactions and preferences, but why do I like organ music while my wife hates it? Why do I love spicy curries when others don't? Very complex answers to these questions, if any exist, much less to why I prefer group decision-making with a bit of arguing and a bit of time to mull over the answers. I don't derive or infer principles; I try to live up to principles, to squeeze something important into a few words so I can more often do the right thing and avoid the wrong thing. I certainly have emotions that are never going to be made into principles. I have even gotten so angry that I wished ill toward others, though I cannot recommend the desire to harm others as beneficial for one's own happiness, for the others' well-being, or for society as a whole.

For me the point of life is closer to building satisfying emotional experiences and connections, to people and to a great ongoing work; to do good and heal harm; to enjoy each day's beauty. Those things are what make my reason work well. When I'm scheming in anger, my brain is less proficient at reason. When I'm walking under the pulsing Mars and mulling over some terraforming issue, I'm a veritable fountain of rationality, which nonetheless cannot explain the happiness I feel when I return to my family after the walk.

If you think you can come up with a reason to do one thing rather than another that is not based on something isomorphic to emotional drives, I wish good luck to you but I cannot have much hope in the likelihood of success.


The post talks like rationalism is a choice someone makes. It's not though.

Funny typo: "The LSD has an estimated $5B in annual revenue". :)

I assumed it was a Star Trek reference.

Secular Humanists > Rationalists.

The Rationalists can get into difficult corners. Rationally, the way to fix many problems is de-humanising.


I'd rather be a rationalizationist, they've never been wrong once.

Certainly, convincing rationalists that their rationality is based on skewed evidence and false reasoning is hard. I try to ration my time investment in this, rationalising that I will gain more by doing less. Which is, in its own way, the Zen outcome: rationally, we're all dead for a long time, so why bother?

Postscript: rationalisationism is post-hoc reasoning. I did take the last cookie; now, think of a justification for why it's OK. Working in children's brains since the invention of the cookie jar.


Genius! Got a good chuckle out of me.

There are lots of successful rationalists.

As an example, Objectivists generally accept the label of being rationalists. And the people at https://en.wikipedia.org/wiki/List_of_people_influenced_by_A... came under the influence of Objectivism before they were successful.

Some of the more prominent names on that list include Mark Cuban (Shark Tank), Alan Greenspan (economist), Penn Jillette (magician), John Mackey (Whole Foods), Gene Roddenberry (Star Trek), Peter Thiel (PayPal), and Jimmy Wales (Wikipedia). And those are just the ones who publicly talked about it. How many more achieved success but didn't choose to talk about it?


The article is talking about this rationalism: https://rationalwiki.org/wiki/LessWrong

There are successful Objectivists, but Objectivism is very different from "LessWrong rationalism".

More so, though, I take issue with this:

>came under the influence of Objectivism before they were successful

Not only are there people on that list who were successful since before they were born, but the condition you cite is not necessary to be on that list. You can have been influenced by Rand as a teenager, grown out of it by the time you became successful, and still be on that list; and you can also be on that list if you were influenced by it after becoming successful.

Also, Objectivism is objectively incorrect, and unlike rationalism it is an anti-materialist ideology.


To wit, Ayn Rand's novels are massive best-sellers, and many other Rand acolytes like Nathaniel Branden also published massive best-sellers. Among their readers there are many, many, many total losers and mediocrities. That list is cherry-picked, given that most of the names on it don't even pay dues to the Ayn Rand Institute and could not be considered card-carrying Objectivists. There have been many successful card-carrying Objectivists, but claiming that everyone who read "The Fountainhead" derived their success from Ayn Rand in some way is just silly.

Somebody who was influenced by somebody who generally accepts the label X doesn't really fit the definition of X, do they?

They might, but they might also just like capitalism because it does well for them and find it enjoyable when somebody tells them "and you know what? you're not just lucky, you're right".


If something ends in 'ist' or 'ism' it probably isn't rational. But then, perfect rationality is only possible if you can keep the irrational aspects of the universe out of your personal reality...

Scientist.

Humanitarianism.

Orthodontist.

Egalitarianism.

Industrialist.

Magnetism.

Psychologist.

Stoicism.

Virologist.

Capitalism.

Neurologist.

Formalism.

I could list dozens more.


list ;)

>Tyler Cowen once called rationalism “just another kind of religion”, but if so, it seems to be a fairly unsuccessful one.

It's much worse: it's basically self-help, or Oprah for nerds; religion at least has some spiritual value. I honestly don't know how anyone can tolerate that stuff. It ranges from cringe-inducing Harry Potter fan fiction to bad takes on AI. One of the worst things I've experienced working in software development is that there are way too many people who keep sending me these blog posts.



