
Well actually, no!

There are a lot of terms in this Armageddon-through-AI equation. First, there's this thing called 'energy'. A paper clip maximizer would meet its demise long before converting any significant part of the planet to paper clips, simply because producing energy and maintaining itself would require enormous resources.

In fact, a true AI would not destroy anything, because a true AI would know it doesn't have sufficient data to justify any such decision.




There is more than enough energy on Earth alone to destroy humanity. Most likely an AI would want to maximize its energy supply, whether by covering the entire face of the Earth with solar panels or by building a Matrioshka brain from the mass of the solar system.

An AI not specifically programmed to value human life would have no reason to keep us around. It would either kill us deliberately, because it sees us as a threat, a competitor, or a mere annoyance, or kill us by accident as it consumes every resource it can reach.




