Rationalists are the opposite of visionaries.
A job at Google making $200-400k, with good work-life balance and enough time to pursue your hobbies, is success. It is also more reliably achievable. A rationalist thrives in the absence of uncertainty. Becoming a decent engineer is a far more reliable way to be successful than starting the next billion dollar company. That is practically leaving everything up to luck.
The truth is, we only see two kinds of successful people in the media: the lucky, pig-headed visionary and those who fall in line. Rationalists are by ideology averse to luck, stubbornness, and compliance. Go figure why they do not feature in cohorts so opposed to their core ideology.
Now militant rationalists are rarely rationalists. If anything, the virtue of gut feeling/intuition in the presence of uncertainty, and the prevalence of uncertainty in everything we do, are the two most evident facts of our universe. To make a strong claim of correctness in the presence of massive uncertainty is innately irrational.
That being said, a lot of the most successful people I've met, use various tools of rationalism quite frequently.
Yes. Perhaps restating your point in statistical terms, "Where Are All the Successful Rationalists?" is entirely the wrong question. A rationalist is not particularly interested in outliers and assumes their outcome is no more likely to be better than the odds predict.
The right question is "What is the median level of success among all rationalists compared to other groups?" My hunch is that you'd find that they do tend to be more successful in the aggregate.
If you only look at the set of people who have made a ton of money to draw your conclusions, it would tell you you should buy more lottery tickets.
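The survivorship-bias quip above can be made concrete with a toy simulation (all numbers hypothetical, chosen only for illustration): a low-variance "steady" strategy beats a lottery-like strategy by a huge margin at the median, yet the very top outcomes are dominated by lottery players, so drawing conclusions only from the biggest winners points you at the wrong strategy.

```python
import random

random.seed(0)

def simulate(strategy, n=100_000):
    """Return lifetime outcomes for n people following a strategy (toy numbers)."""
    if strategy == "steady":
        # Salaried-engineer-style career: modest variance around a solid mean.
        return [random.gauss(5_000_000, 1_000_000) for _ in range(n)]
    else:
        # Lottery-style bet: tiny chance of a huge win, otherwise nothing.
        return [100_000_000 if random.random() < 0.001 else 0 for _ in range(n)]

steady = sorted(simulate("steady"))
lottery = sorted(simulate("lottery"))

# Median outcome: the steady strategy wins decisively.
print("median steady :", steady[len(steady) // 2])
print("median lottery:", lottery[len(lottery) // 2])

# But if you only sample the single best outcome, the lottery player dominates,
# which is exactly the survivorship bias the comment describes.
print("best steady :", steady[-1])
print("best lottery:", lottery[-1])
```

Conditioning on extreme success selects for high-variance strategies even when their typical outcome is far worse.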
What job at Google (or anywhere for that matter) do you have that actually facilitates good work-life balance?
Source: at Google for 4.5 years. Rarely if ever had to stay late.
You can argue that the causal arrow points the wrong way: that successful people just happen to wear the affectation of "rationalist" because it is fashionable, rather than their rationalism leading them to a "successful" career. But that doesn't seem to hold up against scrutiny, in my experience. Of those I personally know, most have been engaged with the community since at least the golden days of LessWrong.
I'm a programmer who lives on the other side of the world (so no chance for me to make $300k+/year) and I must say that all this "rationalism" discussion makes me a little confused: do people really believe in this sort of stuff? Do they actually equate "success" with (mostly owning) "money"? Do they really think a "rational"(-ist) person would mainly think about how to earn (supposedly more) money? Why on Earth would he/she do that? Money is just a tool. Holding an important position in society (CEO, founder, whatever) is just a hindrance, it keeps one away from actually thinking about the stuff that really matters.
I'm pretty sure all this stuff was explained a lot better a long time ago by people a lot smarter than me (right now I'm thinking of one of Plato's works, maybe "Symposium"? I'm not sure, I last read many of them ~20 years ago); the point is this specific "view of the world" seems very US-specific to me.
You may believe stuff like this, but many others don't, so it should be phrased as your opinion rather than as a fact.
Elon Musk, for example, has very strong beliefs about humanity's future on other planets. His position as founder/CEO of SpaceX lets him actually work towards making that dream a reality. You or I can dream all we want but we can't make them reality. If this is something you care about, then he's clearly successful in ways that we aren't, that are directly attributable solely to his role as wealthy person/founder.
Great place to make money before retiring elsewhere though.
Personally I would say 25k€ net is middle class; but here everybody wants to belong to the middle class, so it usually has a wide stretch.
Big companies are unionized so salaries do not get that big, but are considered higher than in non-unionized companies.
How is the situation in the US?
To me, this feels a little like saying “lots of tech people I know with high salaries were early contributors to Wikipedia, therefore being an early contributor to Wikipedia probably made them successful.”
It can be considered causal, in that many people follow the thought processes that underlie rationalism, which direct them towards careers and outcomes of the sort that the person above was mentioning (95+ percentile salary as a stable individual contributor in a field that values logic and structured thought). They may do it while being completely ignorant of the movement, but still be rationalists for all intents and purposes.
If you consider “rationalism” (in the sense of the subject of this thread) to be equivalent to “following a scientific or rational thought process”, then the main question asked in the article becomes nonsensical. “Where are all the successful people who followed rational thought processes?” is a genuinely foolish question, because you can find countless notable examples with no effort at all.
But of course that’s not what the post was asking, and it’s why the poster has a harder time answering the question. The post refers to the very specific Internet phenomenon of “rationalism” which, while cleverly incorporating the notion of rational thought into its name, actually refers to a specific group of people who follow a specific set of teachings.
And those people are massively concentrated in US tech and tech-adjacent areas, largely because that’s where this specific set of beliefs took off first. That’s the causal arrow here.
For those who don't know the phrase: Those who can't do, teach.
And to go further, the derogatory phrase means if you can't make money doing something, you could still make money teaching other people to do it.
Unfortunately my only evidence is being a successful rationalist and knowing a lot of successful rationalists.
Then again, having a stated preference for philosophical camp at all probably corresponds highly with success against the population who is totally uninterested or hasn't gotten enough education to be exposed.
Either way I don't buy the premise.
(Not that most jargon is good in an academic context either, but at least there you can kinda see why it’s sometimes necessary)
(The other reason might be because rationalism, at least recently, has been adjacent to crackpot racist phrenology ideas, and so has driven away contributions from most of the world, preventing them from branching out into the real world)
Like last night I watched the writer Jia Tolentino interview Chanel Miller (the woman who was raped by Brock Turner, if you're familiar with the case.) Watching Jia think in real time is magical to me. Her writing, too, is at times incisive, at others deeply personal, sometimes indignant or marveling or both, and can still pull back to culture/society level relevance.
I would say that the best of Jia's longform writing is among the best narrative non-fiction being written today.
Is there a part of me that wants to be able to do that? i.e. am I a wannabe? I mean, kind of, but I also really enjoy it and that's more the point. I enjoy Ren Hang's photography but am not trying to be a photographer. I enjoy helping founders but don't have much desire to found another company.
We don't _have_ to assume that everyone who enjoys something only says that they do because they are status-obsessed and lazy.
Another example I would give is Daniel Day Lewis. Good god the length he goes to to method act; the almost terrifying commitment. And it fucking pays off. But do I want that life? Hell no. Because I'm lazy? Maybe if I wanted that life then work ethic would be a question, but what disqualifies my trying to be Daniel Day Lewis in the first place is that I don't want to make the trade-offs he does — even if I could.
I made a series of life changes. I can't attribute these changes to the blog post alone, but it was something that pushed me towards a more meaningful life. I stopped spending money on things and started buying experiences more often. I bought a bicycle (a thing, but one that gives you many experiences) and started cycling to/from work in an effort to a) exercise b) spend more time outside. I broke up with my girlfriend of 9 years because I figured it just wasn't making me any happier. Cycling in London, UK for years made me very satisfied, but I also got tired of the climate, and I ended up finding a remote role and relocating to a climate where I can be outside every day. Finally, I started strength training, and now I am in much better shape after doing it regularly for a few years. It's hard to say that I am happier exactly, but there is a sense of liberty I feel that is hard to describe. The life I am living feels like my own; I feel a sense of control and I am liberated. I would like to think many of these things count as "winning".
In other words, the techniques praised in lesswrong writings dress up what is effectively bias shuffling, which results in the same problem of being unable to assess the predictors of outcomes, just now better hidden behind falsely confident figures instead of admission of uncertainty.
So at some point you were able to assess outcomes objectively and update your perceptions? What would you call that process?
> In other words, the techniques praised in lesswrong writings dress up what is effectively bias shuffling, which results in the same problem of being unable to assess the predictors of outcomes, just now better hidden behind falsely confident figures instead of admission of uncertainty.
I’m asking how you can identify a truth-finding process that is not bias shuffling and what you would call it (since “rational” is now taboo).
(It's also hard to tell what subtle distinction the verb "figured" is meant to convey in your sentence, if any, beyond say a neutral verb like "concluded".)
I think it comes down to a few factors.
1. Deep Learning took off and took over the narrative as the "right path" to "General AI." Back in its heyday, all the LW folks (including myself) were really focused on approaching AI from the perspective of Pearl's Causality and the work of Ray Solomonoff.
2. There were no practical products, tools, frameworks or research that came out of that group that could either be easily applied to existing business or demonstrated SOTA on any computing domain.
3. AGI/GAI/HLAI/etc... is generally still (unfairly IMO) seen as the domain for cranks and zealots - which was the primary end-goal of the LW crowd, where "rationality" was the practical middle-step.
4. It kind of turned into a mini-cult around Yudkowsky, which IMO turned people off.
Rational thinking, like programming, is a tool. When you turn programming on itself, you get metaprogramming and new programming languages. Which is cool for programmers, and many a very skilled programmer has fallen down that rabbit-hole, but neither particularly profitable nor necessarily all that useful. To get the big bucks and make a big -- direct -- difference in the world, you need to combine the tool, programming, with domain knowledge and real-world problems.
Likewise, if you turn rational thinking on itself, you get metacognition, thinking about thinking, and rationalists. It's neat, but at the end of the day...
I think this transformation had already happened when I first heard of rationalists. Some of the things that they were saying seemed vaguely interesting. But I couldn't quite put my finger on something being off. At some point I came to the conclusion that Yudkowsky was some sort of cult-leader-like figure.
A cult around rational thinking. Isn't that something.
Then later tiger king came out and I realized that apparently everything can devolve into forming cults.
Yanis Varoufakis told a nice story about teaching game theory. At the beginning of the semester, he offered the class a free pass on the course: each student would write down the grade they wanted, and everyone would be awarded the minimal grade proposed. But first, they would have to agree to this proposal. He mentions that there was always a person who refused to agree. When he asked why they were against it, they responded, without a hint of irony, that there would surely be some idiot who would write down a low grade (and they would not pass).
Frankly, I don't think this problem has a rational solution. It's a paradox: by being totally rational, you're being irrational. And I think the same is true for rationalism as a life philosophy. If you apply it to everything, life will be hollow.
1. They perceive a tail risk that at least one student is willing to spite the others. Possibly a student that has less to lose from bad grades than their classmates do, but that's not the only context where it could happen. While an assumption that everyone has the same values and will make the optimal choice under those values often leads to an effective approximation, modeling human behavior isn't always that simple... especially when only one person has to deviate for your model to make a catastrophically wrong prediction.
2. Grades aren't quite on an absolute scale; at least when pushed to extremes like "everyone in the class gets an A+", the most sophisticated readers of the transcripts will interpret that A+ differently than an A+ in the same class where the distribution of grades was more typical.
3. Getting yourself to behave in your long-term self-interest isn't always trivial. People do things like publicly commit to donating to a cause they hate if they don't achieve goal X, to try to increase their chance of reaching that goal when they've wanted to but failed in the past. Meaningful grades can be better than their absence if you struggle with self-control and trust the professor enough.
4. If you have a clear interest in seeing your classmates put effort into the course, you might vote against this.
Note that, in some of these cases, the student has an incentive to be quiet about their primary motive and instead give the cover-story explanation Varoufakis heard. I don't think those cases are likely (I think it really is mostly #1), but I wouldn't rule them out.
Of course you can argue about the details in a particular situation, just like you can explain that the card is not really a paradox, as the statements were written by different people, etc.
The main reason I pointed out the story is that it is a certain answer to the question: why aren't rational people (more) successful? We can easily imagine a class of "irrational" students who take Varoufakis's proposal, all get their good grades, and move on to other tasks, beating the hardcore rationalists in the process.
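The dynamic in Varoufakis's story can be sketched as a toy game (the payoffs and class size are hypothetical, not from the anecdote): everyone receives the minimum proposed grade, so the trusting class aces the course, while a single feared or spiteful low proposal ruins it for all, which is exactly the tail risk the refusing student is pricing in.

```python
def class_outcome(proposals):
    """Under the deal, every student receives the lowest grade anyone proposed."""
    return min(proposals)

# The "irrational" class that simply trusts each other all ace the course.
trusting = [100] * 30
assert class_outcome(trusting) == 100

# One defector (or merely the fear of one) drags everyone down.
with_defector = [100] * 29 + [10]
assert class_outcome(with_defector) == 10
```

The refuser's reasoning is individually defensible, yet it forecloses the outcome where everyone wins, which is the paradox the comment above is pointing at.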
"Rationalism is inherently flawed in that it lacks a value system" strikes me as a category error. Giving you a value system isn't its job; instead, it provides a toolbox for identifying and resolving contradictions in your preexisting beliefs and values.
(Since we all work with imperfect information about the world, internal consistency is far from a guarantee of accuracy; but it improves the odds.)
First of all, not all value systems are rational and optimizable. The most relevant ones to humans simply aren't. Building a rational system to maximize these value systems is an impossible task, it's a category error. Like trying to run regression with a non-differentiable cost function.
Value systems are an "ought". No rational, logical system can ever change them. Resolving contradictions in your values is a category error, because having more than one value inherently means contradiction.
As a result, a purely rationalist perspective is ineffective. You can only achieve a certain level of rationality.
Which is already exactly what people have been doing for millennia: trying to find the most logical way to follow their values, and trying to phrase their inherent, base values as a set of explicit values that represent them as well as possible. It's not anything different from what secular philosophy has done for over 4000 years, if you construe it that way.
Finally, it's never possible to know that your system is without any contradiction, because that would be equivalent to solving the halting problem. So, do you have a notion that contradiction can be estimated? Measured? How do you come to the conclusion that your belief system, with that contradiction removed, is any less contradictory than before, knowing full well that you might have introduced a more complex, as of yet unknown contradiction, or ten thousand of them?
And so it seems that having soft contradictions is not actually necessarily a bad thing, and that jumping to resolve them should only be done if they actually are terminal, because balancing contradictions in your cognitive process is actually very doable and a good tool. It's not just about information either, even with perfect knowledge it's impossible to be sure of the absence of contradiction.
In other words, it's never possible to prove the consistency of a model within that model itself. You can try to prove it using the framework of another model, but how do you know that that model is consistent? You can't.
This is why I personally try to ground the internal consistency of my thought process in the material world: a contradiction is only problematic if resolving it can materially improve the situation or understanding I derive from it.
But then again, philosophical materialism vs idealism is a whole other can of worms, so I'll leave it at that very small facet of it.
As you missed in my first post, the issue is that personal philosophies are fundamentally different from physics: in physics there is a correct model, which can be reached by eliminating contradictions and constantly iterating. For models to maximize personal success, however that is defined, there is no such thing as a correct model; there will always be contradictions, and hyperfocusing on the most rational possible model to optimize for an irrational goal will almost always yield worse results.
Rationalism doesn't just mean "we can derive some benefit by carefully trying to use rationality to arrive at better models and decisions", the same way that scientism doesn't just mean "we can derive some benefit by using science"; rationalism is the idea that *all* beliefs and decisions should be based purely on rationality. Unless your goal is purely quantifiable and free of contradictions, this is actively harmful.
I never once suggested that rationality isn't useful or that resolving some contradictions is useless, I actively stated the reverse, that they are both useful insofar that they actually make the situation better for you, which is absolutely not what rationalism is about.
Rationalism presupposes the supremacy of rationality, which itself cannot be rationally proved.
The fact remains that the "lesswrong rationalism" community was the first to popularize several rather sophisticated reasoning concepts. They have provided common reference points that have made it easier for me to communicate my thinking to others and vice versa, and I am far from the only writer/reader who has benefited in this way. As a consequence, I'm generally sympathetic to this group, and personally identify as a rationalist.
I agree that the heuristic "internal consistency is always worth actively working on" can go very wrong. It looks like you're defining rationalism as structuring of one's life around that heuristic; I'm guessing you must have had bad experiences with people who actually do that.
Just using reason to come to conclusions that aren't necessarily ultimately justified, or deriving truth from pure reason, is literally the entire concept of Western philosophy. And by Western, I mean from Iceland to Baghdad and beyond. I'm fairly certain it's also the entire concept of Eastern philosophy too, but I don't know much about that.
Lesswrong is certainly good for introducing concepts and allowing discussions. That does not a philosophy make. I'm not critiquing the website, nor the community, but the "pure reason" rationalist philosophy that a lot of people there adopted and peddled, without spending enough time justifying it, instead spending vast amounts of time applying. This is exactly what the article was addressing when it talks about rationalism, too.
Rationalism is, by definition, idolatry of reason. At the very minimum, to even show any kind of rationalism, you have to believe that some things are true by virtue of pure reason, independently of the empirical world. Rationalism really doesn't have anything to do with using or not using reason.
To make the central claim worthwhile, Harris would have to find a way to measure well-being. Since he can't, there is really no difference between that and bog-standard utilitarian ethics.
This is an ahumanist or post-humanist value system. Rationalism becomes a movement of transcendence from mortality, fleshly desires, the unconscious, and nature.
The rationalist worships the neural network.
This is so amazingly reductive I can't take anything else in the post seriously.
>The LSD has an estimated $5B in annual revenue, with an estimated $100B in funds, and protestants can make claims to nearly every US president.
Importantly, this is the LDS, not the LSD. Do rationalism's thinkers engage seriously enough to be able to make important cultural contributions? Or do they have such a surface-level view of everything (too much breadth, not enough depth – too much depth is unpractical, and practicality is key for rationalists) that even in this post they cannot be bothered enough to cross the t's and dot the i's enough to avoid a confusion between a psychoactive substance and an influential religious institution?
More to the point, rationalism's great age has passed. Rationalists did carry import in the post-Reformation era, but in philosophical concepts, we have long left the era of classical Enlightenment behind. We have even left Modernism behind, which was arguably born out of Rationalism. We are at the tail end of postmodernism – of course you don't find "highly successful" "rationalists" in the post-modern age. While you don't necessarily have to be conversant (or even aware) of Barthes' structuralism, "success" (at least, in the sense that is discussed in this post – cultural import, legacy, etc.) is dependent on living and breathing postmodernism.
Like a fish in water, you don't have to know what postmodernism is, exactly. I do not think any TikTok star knows what postmodernism is. But they intuit that they must be a chimera that is the product of its audience, that the signs they create must be readily meme-able and able to live a life outside of their author. Rationalists do not breathe in this water – their hierarchical approach is stuck at the turn of the last century, and so they will not find themselves carrying import at the beginning of this one.
I guess the flip side of that argument also lives in British politics: the PM's advisor Dominic Cummings, who fulfils a Rasputin-like role of being on the one hand a magnet for criticism and on the other hand someone the PM will go to unprecedented extremes to continue to receive advice from. He's also someone that has Overcoming Bias and Yudkowsky in his short blogroll and sniggers at detractors that "don't understand epistemological uncertainty" when his comments that his priorities "in some possible branches of the future... will be an error" get media attention. An archetypal 'rationalist' who is incidentally currently the UK's most influential policy wonk.
Usually British PMs don't accept their Chancellor's resignation so their adviser can fire his staff, or refrain from firing advisers whose actions become the #1 subject of public outcry from their core voter base.
Also, the situation with Cummings isn't remotely odd. What confuses people is recent history. Blair/Brown was the archetype of why you don't want a powerful Chancellor: Brown briefed against Blair and the govt constantly, his advisers used to attack other ministers, it was insane. Osborne's relationship with Cameron was a function of seeing Brown attempt to destroy Blair.
But that isn't always the case when there are issues with other Ministers or the Civil Service (both of which were true when Johnson took power). Thatcher is the prime example: almost everyone in the Tory party hated her, her 1981 Budget was made by Alan Walters not Howe, Lawson resigned when she opposed ERM (again, due to advice from Walters). Equally, in foreign policy she ignored her Ministers and, more importantly, the Foreign Office (who gave uniquely terrible, bizarre advice in this period...even by FO standards).
Our system is built with this tension in mind. Cameron had a strong policy unit. Blair didn't have one but he relied on a small circle of Ministers so it wasn't necessary. It is very normal, and quite necessary, for a Prime Minster to be hostage to no-one. The Prime Minister is always responsible to Parliament, that hasn't changed. But the growth of the media, growth in complexity of policy, and growth in suspicion of the Civil Service (particularly after the 70s) has led to this requirement.
And btw, I don't think most people in the public understand how ludicrous the situation with Javid was (it generally wasn't reported because the person in question was liked by the media because they leaked). Javid's advisers were actively briefing against the govt daily. When Javid was sacked, his advisers were out there literally the day after briefing against the govt. It wasn't close. It wasn't even remotely close. It is a very good thing that SPADs get taken down a peg because the massive proliferation in their number since the early 90s has meant some attempting to adopt a remit that vastly exceeds their purpose.
None of the political intrigue itself has anything to do with rationalism, of course, but it does show a pretty archetypal 'rationalist' whose counsel was ranked unusually highly by a global leader.
As for Clement, I never said he feigned disinterest. I agreed with the original blogger that this was one of many examples of ecclesiastical orders getting involved in all the City of Man stuff the Church's founding mythology placed them outside; often excommunication even more transparently had nothing to do with spiritual matters. Sure, excommunication proved no impediment to Henry VIII resolving his domestic issues at the cost of only a few centuries of religious strife and antagonistic relations with Catholic states, but unless your argument is that this proves the Church wasn't any more influential on the politics and economics of the Middle Ages than LessWrong is on the politics and economics of the world today, I'm not really sure why you're twisting my original statement to make that tangential point.
What you don't answer is why you believe that a PM wouldn't fire an adviser briefing against the govt. The only instance where this has occurred was Brown (and remember, pretty much every person around Blair asked repeatedly for Brown to be sacked, and most people ended up leaving because they couldn't deal with Brown). Indeed, this occurred under Cameron (with one of May's closest advisers) and the person was fired, no questions. The odd thing was that Javid refused to sack the people in question (I remember the day it happened the adviser was briefing the press that Johnson was going to fire Cummings...it was utter insanity, and this was after it was well known that they had leaked govt documents...this is typical for the Tories, but Javid was given plenty of room, and chose to shoot himself).
Being "a focus for public outrage" is neither here nor there (thankfully). Some people are perpetually outraged, doing what they demand doesn't affect things.
Advisers also aren't supposed to be invisible. Again, when anyone has advised the PM, it has caused problems (with backbenchers, with the Civil Service, etc.). Under Blair, everyone complained about his advisers. Under Cameron, everyone complained about his advisers. Under Thatcher, everyone complained about her advisers. Under Brown, particularly, everyone complained about his advisers. Under May, particularly, everyone complained about her advisers. They are not supposed to be doing interviews and briefing the media (openly), and they clearly aren't supposed to be briefing against ministers in other departments...but they aren't invisible either.
It isn't tangential, the original point was: Clement was influential...saying that he wasn't influential is not a tangential point.
Suffice to say you are also perhaps in a minority of one in believing that Clement VII had no influence on the Middle Ages. Especially when the point of reference the Church's influence is being compared with is a collection of moderately popular blogs and think tanks.
Anglicanism was a branch born of cynical political power struggles and everyone knows it.
It is a funny question though. I might posit that it's not a philosophy so much as a subculture that appreciates a certain flavour of writing. There's a form, like a prog-rock anthem or a minimalist piece, that we writers ourselves tend to appreciate, but where most people are just looking for something to dance to.
If I could characterize most of the criticism I've read of rationalist writing, it would be that their main complaint is they can't dance to it.
My (admittedly untestable or at least hard to test) hypothesis is that people who identify themselves as rationalists fell into this trap.
You start with a problem, and instead of solving the problem, you end up just losing your desire for the solution.
It's our nature to be self obsessed. So when we learn a new way of thinking, we cannot help but make ourselves the primary target of it. And nothing will disabuse you of a deeply held, arational, intensely motivational heroic narrative like the tools of rationalism.
It definitely improves things to work some of the relevant lingo into your work environment so you can more easily help colleagues grok why the perspective you have on a topic seems very different, and I can see an argument that in many areas you'd be better off with specific knowledge. But is that the trade-off? Not in my experience. You can have both, or at least 80% of both.
(No, I won’t say who it is)
That being said most people who self-identify as rationalists are significantly held back by (a) lack of charisma (b) the fact that they believe that their elaborate mental constructs are reality. Of course these are just another mental narrative, and not even a particularly useful one.
No doubt there are some good thinkers in the sphere, but I don't know why you'd correlate rationalists with 'success'
Not to mention the projection involved - "success" is easy to come by if you have a crowd of rubes paying for what they want to believe but rationalism doesn't offer anything with such popular appeal directly.
Really the whole article is anti-intellectualism fallacies strung together in an "if the rays of the sun don't create gold, why aren't you rich?" style fallacious retort.
"I knew that when I was 80 I was not going to regret having tried this. I was not going to regret trying to participate in this thing called the Internet that I thought was going to be a really big deal. I knew that if I failed I wouldn’t regret that, but I knew the one thing I might regret is not ever having tried."
Rationality studies aren't going to rewire a fundamentally crazy, irrational person into a rational one that is suddenly the one-eyed man in the land of the blind. Rationality studies can only clean up some of the rough edges on a being that fundamentally behaves fairly rationally most of the time. There isn't anywhere near as much room for optimization as some people seem to think.
And part of the reason for this is, it is very difficult to overstate how bad humans are at explaining why they are doing something. Just, mind-blowingly bad. Study after study reinforces the endless parade of anecdotes from my own life. Everything agrees; people are terrible at this.
But, the thing is... doing the rational thing for the wrong reason, or without any reason at all, is still doing the rational thing. Jodie may gossip terribly about Ethel for all sorts of reasons our calm, collected putative "rationalists" declare to be "irrational", and Jodie herself may think she's doing it because Ethel is ugly and smells bad and just pissed her off yesterday, but nevertheless, the goal Jodie doesn't even consciously know she has is likely to be attained, and Jodie's access to better mates is likely to be successfully increased at Ethel's expense. (Not "guaranteed", but likely enough that, well, this strategy has been around since the dawn of recorded history and is still going strong for a reason, you know?)
A good chunk of the remaining belief about human irrationality will come from people who aren't clearly looking at the world, and will declare on one level or another that that's an irrational thing for Jodie to want... but... goals are much less amenable to "rationality" than means. That you personally think Jodie shouldn't want that, or shouldn't need to want that, or whatever, doesn't change the fact that Jodie is operating on that goal, and really quite rationally pursuing it.
That said, there is room for improvement. But I don't think it's the sort of improvement that leads to fame and visibility very often; it's life improvement. Nobody's famous for really having their life together, and making good decisions. But there are still plenty of people who are better at it and plenty who are worse, and even if there isn't as much room as some people may think to improve that, there's still plenty to make it worthwhile.
My apologies for not actually engaging with your ideas. I mostly agree with you, and have no major criticism or connections to add. I just thought you might like the song.
Academic philosophy has long been a circle jerk that occasionally in a moment of triumphant success becomes useful as a side note to something in AI research or whatever.
I mean, when was the last time that an academic philosophy paper made it to the front page of HN? (Which is probably the most receptive mainstream audience that will ever exist for something like that.)
Maybe there aren't enough rationalists to decide, or maybe we don't know who counts as a rationalist, so we can't answer the question?
Or success has nothing to do with being a rationalist or not? Or 'success' isn't easily defined / observed?
This seems like measuring something that I'm not sure you can measure, or tell if it matters or not.
Unfortunately, the people who gravitate towards rationalism tend to be people who are already good at it and naturally oriented towards it, people who are more likely to lack the skill of turning it off. When somebody makes an argument to you, you want to think along with them, try to inhabit their head temporarily. You'll learn things about them that allow you to work productively with them and even understand the factual content of their speech better. If you're diagnosing the logical structure of their irrationality while they speak, it's going to knock you out of their headspace, like spotting plot holes in a movie takes you out of the film. You've spotted something invalidating, and now you have to wrestle your brain back into paying attention. I would wager that the best movie critics aren't generally elite at spotting continuity errors. Learning to turn critical distance on and off, and to direct it where you need it to go (focusing on emotional dynamics, for instance, instead of logical structure), is probably the skill that many people attracted to rationalism really need, but by doubling down on rationalism they're making it even more automatic.
That aside, I think the fundamental question of the article is bizarre. Where are the successful "sprinters" in professional football? Speed is a critical factor for wide receivers and defensive backs, and they train for it, but they don't go around calling themselves sprinters or talking about sprint training in their press conferences. Speed is a skill and a quality for them, not an identity. Self-identifying as a "sprinter" would be counterproductive for a defensive back, because it takes one element of success and elevates it above the others instead of balancing it against them. It only makes sense if being a sprinter is more important to them than being successful as a defensive back, and you wouldn't imagine a person with that order of priorities to achieve elite success as a defensive back.
As a skill and a quality, rather than an identity, rationalism might be common at the top of every field; who knows. That's a more interesting question at least.
Entering this pool arguably requires a touch of useful delusion (e.g. Trump).
Success for humans certainly is increased by an ability to reason well (quickly or slowly, depending on the field), but even more so by an ability to motivate or persuade other humans. Even on the technical side, my success at learning complicated stuff comes less from being able to code and fix systems, and more from being able to consistently and correctly explain the complicated stuff, and its logical consequences, in meetings. I force myself to keep programming because I need to keep that understanding alive, but companies would be happy to have me just explaining stuff. And the people I've enjoyed working for the most are able to create a warm feeling of "we are all in it together making cool new technologies! Yay, we are making the internet!" Pump up the feelings so we are motivated to use reason to solve the problems in our path.
I think the criticism is better and more narrowly targeted toward those that don't prioritize consistency. Yes, from your experiences and emotions, try to derive (or infer, rather) your principles. That's fine! But then, don't stop there. From there, try to remain consistent with those principles. And if they conflict, use that emotional experience to again understand and update your principles and then try to remain consistent with those.
For me, the point of life is closer to building satisfying emotional experiences and connections, to people and to a great ongoing work; to do good and heal harm; to enjoy each day's beauty. Those things are what make my reason work well. When I'm scheming in anger, my brain is less proficient at reason. When I'm walking under the pulsing Mars and mulling over some terraform issue, I'm a veritable fountain of rationality, which nonetheless cannot explain the happiness I feel when I return to my family after the walk.
If you think you can come up with a reason to do one thing rather than another that is not based on something isomorphic to emotional drives, I wish you luck, but I cannot hold out much hope of success.
The Rationalists can get into difficult corners. Rationally, the way to fix many problems is dehumanising.
Postscript: rationalisation is post-hoc reasoning. I did take the last cookie; now, think of a justification for why it's OK. Working in children's brains since the invention of the cookie jar.
As an example, Objectivists generally accept the label of being rationalists. And the people at https://en.wikipedia.org/wiki/List_of_people_influenced_by_A... came under the influence of Objectivism before they were successful.
Some of the more prominent names on that list include Mark Cuban (Shark Tank), Alan Greenspan (economist), Penn Jillette (magician), John Mackey (Whole Foods), Gene Roddenberry (Star Trek), Peter Thiel (PayPal), and Jimmy Wales (Wikipedia). And those are just the ones who publicly talked about it. How many more achieved success but didn't choose to talk about it?
More so, though, I take issue with this:
>came under the influence of objectivism before they were successful
Not only are there people on that list who were already successful before they encountered Rand, but the condition you cite is not even necessary for inclusion. You can have been influenced by Rand as a teenager, grown out of it by the time you became successful, and still be on that list; you can also be on that list if you were influenced by it after becoming successful.
Also, objectivism is objectively incorrect, and unlike rationalism is an anti-materialist ideology.
They might, but they might also just like capitalism because it does well for them and find it enjoyable when somebody tells them "and you know what? you're not just lucky, you're right".
I could list dozens more.
It's much worse: it's basically self-help, or Oprah for nerds; religion at least has some spiritual value. I honestly don't know how anyone can tolerate that stuff; it ranges from cringe-inducing Harry Potter fan fiction to bad takes on AI. One of the worst things I've experienced working in software development is that there are way too many people who keep sending me these blog posts.