Two years unmasking a well-funded Silicon Valley 'apocalypse cult' (truthdig.com)
24 points by anigbrowl on Aug 30, 2023 | 19 comments


Extreme longtermism is hard to distinguish from belief in heaven and hell. E.g. the future is so important that almost anything can be justified in the near term because of the faraway ends, which often we won't even live to experience.

There is also the frequent longtermist speculation about "living forever" in a future where "anything is possible", "all needs are met", and we will be "traveling among the stars". That sounds very Biblically inspired to me.

The unfortunate fact is that the future is really hard to predict with any certainty. So justifying any significant suffering in the near term in service of something far in the future is often a fool's errand (or worse).

Some of the black-and-white scenarios posited are also unrealistic. It isn't likely that either AGI or humanity survives to the exclusion of the other -- this is just a sci-fi plot. It usually isn't an us-or-them situation in the world, unless you want to motivate a populace to war.

What usually happens is that all possible futures are achieved in some fashion and they coexist and there is long-term complex competition and cooperation and evolution between them. For example, there are probably human colonies and probably AGI colonies and hybrid colonies and they trade with each other and do their own things. Some will expand faster than others, etc.

Contrary to most popular sci-fi literature that is human-centric (Star Trek, Star Wars, The Expanse, Foundation, etc.), if you think about it, AGI / robots are infinitely better suited to exploring and exploiting the galaxy than we are: infinite lifespans, transmissible consciousness, no need for biological life support, repairability, more varied physicality, the ability to survive high G / radiation, etc.


>What usually happens is that all possible futures are achieved in some fashion and they coexist and there is long-term complex competition and cooperation and evolution between them. For example, there are probably human colonies and probably AGI colonies and hybrid colonies and they trade with each other and do their own things. Some will expand faster than others, etc.

What is your reference for what is usual in this context?


> What is your reference for what is usual in this context?

Probably a wishy-washy statement on my part, I apologize.

I was partly making an analogy to evolution, where everything is generally tried out in a random fashion and what works tends to stick around. It is really hard to predict ahead of time what will work, because context matters and it is always changing. What works for a while may not work in the future as the context changes.

I also do not mean all possible futures that one can ever think of. I mostly mean the speculative futures that are often talked about in sci-fi books.

In my experience, technological inventions talked about regularly in sci fi tend to be achieved over the long run, although it is hard to predict exactly when.


I think that's a particularly poor example, given that an estimated 99 percent of all species are already extinct.


> I think that's a particularly poor example, given that an estimated 99 percent of all species are already extinct.

Huh?

Anyhow, it is really hard to predict evolution over the long term because of the ever-changing context (environment, other species, etc.). Generally things happen, get tried out and experimented with, and what works sticks around. Those claiming to be able to predict the future concretely from among many potential futures are probably wrong (and are also likely fans of Asimov's Foundation series).


Exactly. This is what's so confusing to me. How can any intelligent person think that they can predict what will happen even 10 years out, let alone 100 or 1000? The chaotic dynamics of nonlinear systems render the Foundation series laughably ungrounded, and that seems to be overlooked by many today.


What almost always happens in the context of evolution is that things are tried out and what works sticks around until something changes and the species goes extinct. Nearly every species that ever existed is already extinct. It's hard to predict exactly what will happen in the long term with evolution, but that doesn't mean "the species that are now populous will always remain so" is a good bet.


Complex teleological hierarchies of followers of many types of worlds have been around since the Gnostics before Christ.

Whether it is elite warriors, philosopher-magicians, kings, emperors or gurus, many earlier faiths and ideologies have supposed a superior plane of existence for their "type" of believer.

We do (in reality) have nations, classes, cultures and maybe one could say secret societies that live blessed lives by some design. Some "animals" are more equal than others. Still, I would venture luck has a far bigger role in their favored status in this material plane.

By this reasoning ... extreme actions (without a genuine benefit within two or fewer generations) ... are usually done by cults.


The operative concept here is: Cult. A lot of people look at effective altruism et al and wonder, is it really so bad that some people want to be really good at helping other people? Overall, no, but it's a cult. Sam Bankman-Fried used that cult to build a cryptocurrency empire of madness that collapsed on everyone.

A lot of people would say that innovation requires a certain amount of crazy, and great ideas sometimes emerge from these weirdo goings-on, and I suppose that's fine as long as we recognize that some of these folks are batshit nutjobs floating up to their eyeballs in kool-aid. Beyond that, feel free to steal anything good that comes of it...


The gist of this is that if those who are concerned with AGI catastrophe are wrong but try to prevent the perceived looming disaster, then they might do bad things. Bad because they caused harm to people that was unnecessary.

I don't think anyone would disagree with that, but the author side-steps the most important dynamic here, which is whether there really are extreme risks from AGI. The closest we get to an argument for why we should share the author's implied assumption that AGI won't be a problem is flippant remarks framing those who argue that there are great risks as "believers".


Every time I encounter this sort of argumentation, I come away deflated that the critics are as breathless and histrionic, for all their careful posture of reasonable pragmatic sobriety, as those they take issue with.

There is real, important, debate to be had about many of the notions being attacked via this godawful acronym, which is a remarkable example of broad brush overreach itself.

That debate is not happening or invited by these sorts of critiques, which I assume are largely about carving out a brand in a fashion not overly distinguishable from the posting of edgy think-pieces on LinkedIn.

Absolutism, and contempt, spell the death of productive collective discussion. The presumption that those being attacked are unworthy of coming to the table, whether insinuated or openly stated in various personal attacks, undoes all moral authority.

What a sh-t show.


I can’t take this seriously. Why do people give so much weight to the opinions of crackpots?


When you say crackpots are you referring to Sam Bankman-Fried?


His pot is not as cracked ... but it is quite full of his victims' money, apparently.


In which case I believe the correct term is "fraudster".


> longtermism is bad philosophy and that, if taken literally, it could be used to “justify” a wide range of extreme actions.

Isn't that the danger of ANY philosophy or religion? I don't buy that "longtermism" itself is any more dangerous than anything else. If people take extreme, evil actions in the name of religion X, that should not invalidate the religion. I had never heard of longtermism, but it seems to be based on sacrificing now and planning for future generations. Sounds reasonable to me. That is what civilizations have always done. Cathedrals are not built for this generation, but for the next.


The funny thing is that a lot of the people on social media that I've seen taking this stance against "longtermism" tend to also hold their own long-term agendas about utopia. This "conflict" feels really contrived to me.


The author is a flaky writer. They write an article about "TESCREAL"[1] but don't define it, instead linking to another of their articles where the definition is buried in the middle of the 4th paragraph. The author notes they come from this crackpot culture, and they're really still in it; they do a weak job of organizing their thoughts for any audience not in their head.

[1] "transhumanism, Extropianism, singularitarianism, cosmism, Rationalism, Effective Altruism and longtermism" being major themes and key words in the crackpot intelligentsia scene the article is about.


Agreed, the article was pretty unreadable. His criticisms are relatively fair, but the article structure was horrible.




