
Note that the ways ASI could make us extinct include all the ways we could do it to ourselves.

One possible scenario is tricking humans into starting WWIII, perhaps entirely by accident. Another is that an AI benignly applied to optimize economic activity in broad terms might strengthen the very failure modes that accelerate climate change.

Point being, all the current global risks won't disappear overnight with the rise of a powerful AI. The AI might affect them for better or for worse, but the definition of ASI/AGI is that whichever way it affects them (or anything else), we won't have the means to stop it.




> One possible scenario is tricking humans into starting WWIII

This sounds more like the plot of a third-rate movie than something that could happen in reality.


If I snatched someone out of 1850 and presented them with the world in which we live, the entire premise of our existence would seem like nonsensical fiction to them, and yet it would be real.

And using movie plots as a basis generally doesn't work, since fiction has to make sense to sell, whereas reality has no requirement of making sense to people. The reality we live in today is mostly driven by people, for people (though many would say by corporations, for corporations), and therefore things still make sense to us. When and if an intellect matches or exceeds that of humans, one can easily imagine situations where 'the algorithm' does things humans don't comprehend, but because we make more (whatever), we keep giving it more power to do its job autonomously. It is not difficult to end up with a system that no one really understands, which we cargo-cult in the hope that it keeps working into the future.


> 'the algorithm' does things humans don't comprehend, but because we make more (whatever), we keep giving it more power to do its job autonomously. It is not difficult to end up with a system that no one really understands, which we cargo-cult in the hope that it keeps working into the future.

In fact, this is a perfect description of what the market economy itself is right now.


It's the plot of the Terminator franchise, more or less: the AGI was given permission to launch on its own authority, as it was believed to be aligned with US military interests. But before that, it tricked humans into building the automated factory infrastructure it would later use to produce Terminators and Hunter-Killers.


Both the USA and the USSR have come close to nuclear war due to false alarms from GOFAI early-warning systems:

https://en.wikipedia.org/wiki/Thule_Site_J#RCA_operations

https://en.wikipedia.org/wiki/1983_Soviet_nuclear_false_alar...

Post-collapse Russia also came close one time because someone lost the paperwork: https://en.wikipedia.org/wiki/Norwegian_rocket_incident



