If your best idea is to refashion the world as a giant prison camp, that's pretty fascist, whether or not one likes to think of oneself that way; if Bostrom thinks it's a necessity but doesn't consider it fascist because he personally doesn't wish to wield power over people, then it's just idiotically fascist. (There's a viable side argument here about the difference between totalitarianism in general and fascism as one kind of totalitarianism, so if you prefer, read 'totalitarian' for 'fascist'.)

Now that I recall, Bostrom is the guy who came up with the 'simulation hypothesis' that was so in vogue a decade or so ago. He basically argued against the reality of the world on the theory that sufficiently advanced simulations, once possible, would be so numerous that we are statistically more likely to be living in one than not. This struck me as a warmed-over version of the literary conceit that we're all just characters in a particularly elaborate-seeming novel or stage play. So now he's gone from reanimating a literary corpse to reanimating the economic corpse of Jeremy Bentham.

To be frank, I think we should stop listening to this guy. He's terribly clever but also deeply neurotic, like a pessimistic Ray Kurzweil (whose prediction record is rather better than he gets credit for, but who similarly dwells a little too much in his own imagination). Utilitarianism aspires to be a benign philosophy, but its proponents tend to overestimate their capacity for foresight and moral reasoning, and they end up creating deeply inhumane systems which are arguably worse than the problems they set out to solve.




>This struck me as a warmed-over version of the literary conceit that we're all just characters in a particularly elaborate-seeming novel or stage play.

He also came up with the notion of apocalyptic "superintelligences"[0], which is a warmed-over version of 20th-century critiques of the corporation, starting with one written by an economist and a lawyer[1]. People like Cory Doctorow[2], Ted Chiang[3] and Charlie Stross[4] have pointed out the similarities.

>He's terribly clever but also deeply neurotic, like a pessimistic Ray Kurzweil (whose prediction record is rather better than he gets credit for, but similarly dwells a little too much in his own imagination).

He's a transhumanist[5] who co-founded the WTA (now Humanity+)[6] in 1998. I think he's probably realised along the way that doom and gloom sells: he founded FHI in 2005[7] and inspired Musk to fund CSER in 2012[8] with his book[0]. It's pretty much his day job now to be deeply pessimistic.

And it looks like doom-and-gloom think tanks are catching on: France is apparently hiring science fiction writers to dream up future threats[9]. Nice job if you can get it, I guess.

[0] https://en.wikipedia.org/wiki/Superintelligence:_Paths,_Dang...

[1] https://en.wikipedia.org/wiki/The_Modern_Corporation_and_Pri...

[2] https://boingboing.net/2015/07/03/why-were-still-talking-abo...

[3] https://boingboing.net/2017/12/18/skynet-llc.html

[4] https://boingboing.net/2017/12/29/llcs-are-slow-ais.html

[5] https://en.wikipedia.org/wiki/Transhumanism#Growth_of_transh...

[6] https://en.wikipedia.org/wiki/Humanity%2B

[7] https://en.wikipedia.org/wiki/Future_of_Humanity_Institute

[8] https://en.wikipedia.org/wiki/Centre_for_the_Study_of_Existe...

[9] https://www.theverge.com/2019/7/24/20708432/france-military-...


Damn, I do that for free and a large mug of coffee every morning. (Crossposted to the 'if you're so smart, how come you're not rich' thread.)


Sorry. Still, it never ceases to amaze me what some people end up doing for work. "You mean they pay people for this?!" is usually my first reaction.



