
Some people think that. I've heard descriptions from multiple smart people along the lines of "empty and despairing" and "leaves me with an empty feeling at the end". I think they probably thought through the implications. I can't comment though; maybe some of the books in the series are different from others.



Some of the books are _definitely_ different from others. As a whole, the universe can be seen as somewhat troubling too, but the vast majority of it is quite optimistic. The majority of the universe, that is; bad things still happen in the plot.


Implications such as?


There are various things, but one of the main problems these people probably have, since some of them are AI researchers/simulationists, is that the real goals of the AIs are very unclear. Simulations at a very high level of detail either genuinely work and are widely used, or are faked (with horrible implications if they really are faked), so it's likely the AIs use simulations in planning. It's even worse if the simulations are not faked at all, since the AIs would then run thousands or millions of simulations of really horrible situations while testing different interventions. If they are trying to reduce suffering, even if not as their sole concern, why are they doing this? And if they aren't trying very hard to reduce suffering, the mind control is not really in the humans' best interests; it's just there to control them.



