
I don’t know why, but it reminds me of Roko’s Basilisk: https://slate.com/technology/2014/07/rokos-basilisk-the-most...



Roko's Basilisk is not a thing. It's just a repackaging of the idea that God or Jesus or whatever is watching over you at all times, so you had better please him or you will burn in hell for all eternity. But this time with AI instead of spirits, because it originates from a community that considers itself highly rational and therefore rejects religion as a foundation for decision making. But an all-powerful omniscient AI, now that's something they can get behind, because that's not religion, that's technology!


Yudkowskyist rationalism is a prime example of how whatever you banish out of hand may percolate back up through your ideas -- unless you know enough about the thing you're trying to banish to recognize it when it returns.

The thing they tried to banish is religion: it simply morphed into another form. "Those who do not learn about religion are doomed to repeat it", as it were.


I'm always mildly surprised when I encounter people on the internet talking about Roko's basilisk as something that Eliezer Yudkowsky / "the rationalists" believe in. Very few people ever took it seriously, and Yudkowsky wasn't one of them.

Another one that comes up a lot is the idea that "AGI ruin is vanishingly unlikely, but it would be so bad that we should be worried about it anyway". I don't think I've ever seen anyone make that argument with a straight face. Yudkowsky himself thinks that AGI ruin is very likely indeed. (As in, it has a >50% chance of ending life on Earth within a century.)

Of course that group does hold many wacky beliefs. Things like the entire universe splitting into pieces billions of times a second, people's brains being a kind of generalized refrigerator, the wisdom of not taking a free $1000, and the possibility of bringing sufficiently well preserved dead people back to life. Also, thinking that there's a high chance of AGI ruin is, if anything, an even wackier belief than thinking it's very unlikely. So there's still plenty of room to make an argument that Yudkowsky and many of his readers have ended up believing wacky things despite their disdain for religion.

I do find it very odd, though, that those two misconceptions are so popular. It's like Gell-Mann amnesia: If people can be so wrong about this particular internet subculture that I happen to know something about, how can I trust anything said by anyone about a culture they aren't a member of?


I didn't say the community at large believes in it. I did say that it's a repackaged idea that originated from their community, and that's 100% true.

Maybe you're interpreting my saying they could "get behind" the idea to mean that the entire community endorsed it, but that's not what I intended to convey. What I meant was that some members of that community came up with this idea and then engaged with it in a way they would never engage with God, Jesus, and Hell, solely because the deity was framed as a technology instead of a spirit, and Hell as a simulation instead of another dimension.


Classic Chesterton’s Fence.

https://fs.blog/2020/03/chestertons-fence/


Your thoughts are very well articulated. When you put it like that, it finally makes sense to me why I subconsciously avoid that community.

People on LessWrong like to suppose probabilities, saying that the probability of some hypothetical situation is such and such. E.g., people argue "simulation theory" must be true because the probability of this and that blah blah blah means that one of their suppositions is true. But that's all just baloney, as they have no way to actually quantify their assumptions.

Therefore it's not science backed by experimental evidence, but rather hypothetical scenarios backed by leaps in logic -- which they justify as having such and such probability if some other precondition is true/false, blah blah blah.
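
To make that concrete, here's a toy Python sketch (all numbers invented purely for illustration, not taken from any actual LessWrong argument) of how a chained-probability argument works, and why the conclusion is only as credible as the least-justified prior in the chain:

    # Toy illustration: chaining assumed conditional probabilities.
    # Each number below is a made-up prior standing in for an
    # unquantifiable assumption.
    p_precondition = 0.9  # e.g. "civilizations reach simulation-capable tech"
    p_given_1 = 0.9       # "...and choose to run ancestor simulations"
    p_given_2 = 0.9       # "...and simulated minds vastly outnumber real ones"

    p_conclusion = p_precondition * p_given_1 * p_given_2
    print(f"P(conclusion) = {p_conclusion:.3f}")  # ~0.729

    # Halve just one unquantifiable prior and the "confident"
    # conclusion changes dramatically:
    print(f"P(conclusion) = {0.45 * p_given_1 * p_given_2:.3f}")  # ~0.36

The arithmetic is trivially valid; the problem is that none of the inputs can be measured, so the output inherits the hand-waving of every precondition.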


You're under the impression that anyone on LessWrong actually takes Roko's Basilisk seriously? It's a mildly amusing thought experiment; precisely zero people have ever changed their behaviour because of it.


It didn't seem like the person you're responding to was saying "Roko's Basilisk is a real thing".


... in that a sweeping ostensibly prophetic message is a lie intended to manipulate?



