I think there are two main things about LW that strike some people as cultish. (There are others, less important.) Both are less true than they were, say, a year ago.

1. Its distinctive brand of rationalism grew out of a huge, long series of blog posts by Eliezer Yudkowsky, conventionally referred to on LW as "The Sequences". So: we have a group of people united by their adherence to a set of writings by a single person -- a mixture of generally uncontroversial principles and more unusual ideas. It's not a big surprise if this reminds some people of religious scriptures and the prophets who wrote them.

2. The LW culture takes seriously some ideas that (a) aren't commonly taken very seriously in the world at large, and (b) share some features with some cults' doctrines. Most notably, following Yudkowsky, a lot of LW people think it very likely that in the not too distant future the following will happen: someone will make an AI that's a little bit smarter than us and able to improve itself (or make new AIs); being smarter than us, it can make the next generation better still; this iteration may continue faster and faster as the AIs get smarter; and, perhaps on a timescale of days or less, this process will produce something as much smarter than us as we are smarter than bacteria, which will rapidly take over the world. If we are not careful and lucky, there are many ways in which this might wipe out humanity or replace us with something we would prefer not to be replaced by. -- So we have a near-omnipotent, incomprehensible-to-us Intelligence, not so far from the gods of various religions, and we have The End Of The World (at least as we know it), not so far from the doomsdays of various religions.

Oh, and LW is somewhat associated with Yudkowsky's outfit, MIRI (formerly the Singularity Institute), and Yudkowsky is on record as saying that the Right Thing to do is to give them every cent one can afford, in order to reduce the probability of a disastrous AI explosion. Again, kinda reminiscent of (e.g.) a televangelist telling you to send him all your money because God is going to wrap things up soon. On the other hand, I do not believe that's his current position.

For the avoidance of doubt, I do not myself think LW is very cult-like.


