Physics. You're right that ecosystems are brutal. That's exactly why I'm not worried about AI as an existential threat to humanity.

A few years back Bill Joy was sounding the alarm on nanotechnology. He sounded a lot like Elon Musk does today. Nanobots could be a runaway technology that would reduce the world to "grey goo". But nothing like that will ever happen. The world is already awash in nanobots. We call them bacteria. Given the right conditions, they grow at an exponential rate. But they don't consume the entire earth in a couple of days, because "the right conditions" can't be sustained. They run out of energy. They drown in their own waste.
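
To make that concrete, here's a minimal sketch (Python, with made-up parameter values) contrasting unchecked exponential growth with logistic growth, where a finite carrying capacity stands in for running out of energy and drowning in waste:

  import numpy as np

  # Toy model, not real microbiology: all parameter values are assumed.
  r = 0.5                # growth rate per hour (assumed)
  K = 1e9                # carrying capacity: the energy/waste limit (assumed)
  N0 = 1.0               # starting population
  t = np.arange(0, 72)   # hours

  exponential = N0 * np.exp(r * t)                    # "right conditions" forever
  logistic = K / (1 + (K / N0 - 1) * np.exp(-r * t))  # conditions can't be sustained

  # The exponential curve blows past K around hour 42 and keeps climbing;
  # the logistic curve flattens out at K. Same growth rate, different limits.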

AI will be the same. Yes, machines are better than us at some things, and that list is growing all the time. But biology is ferociously good at converting sunlight into pockets of low entropy. AI such as it exists today is terrible at dealing with the physical world, and only through a tremendous amount of effort are we able to keep it running. If the machines turn on us, we can just stop repairing them.




The danger of nanotechnology is that it can be built better than biological life. It could outcompete life in its own environment, or at least in some environments.

Solar panels can collect more energy than photosynthesis. Planes can fly faster than any bird. Guns are far more effective than any animal's weapons. Steam engines can run more efficiently than biological digestion. And we can get power from fuel sources biology doesn't touch, like fossil fuels or nuclear.

We conquered the macro world before we even invented electricity. Now we are just starting to conquer the micro world.

But AI is far more dangerous. Advancing nanotechnology to that level would take many decades, perhaps centuries, of work. Grey goo is probably possible to build; it's just not easy or near. AI, on the other hand, could be much closer, given the rapid rate of progress.

If you make an unfriendly AI, you can't just shut it off. It could spread its source code through the internet. And it won't tell you that it's dangerous. It will pretend to be benevolent until it no longer needs you.


> Solar panels can collect more energy than photosynthesis. Planes can fly faster than any bird. Guns are far more effective than any animal's weapons. Steam engines can run more efficiently than biological digestion. And we can get power from fuel sources biology doesn't touch, like fossil fuels or nuclear.

A gun isn't effective unless a human loads it, aims it, and pulls the trigger. All your other examples are the same. We do not have any machine that can build a copy of itself, even with infinite energy and raw materials just lying around nearby. Now consider what an "intelligent" machine looks like today: a datacenter with 100,000 servers, consuming a GW of power and constantly being repaired by humans. AI is nowhere near not needing us.
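
A quick back-of-the-envelope check on that scale (the 1 GW and 100,000-server figures are the ones above; the division is the only thing added):

  total_power_w = 1e9        # 1 GW, the figure above
  servers = 100_000
  kw_per_server = total_power_w / servers / 1000
  print(kw_per_server)       # 10.0 kW per server

That's roughly the draw of a dense multi-GPU node, an order of magnitude more than a commodity box, which fits the point: these machines are extreme, high-maintenance outliers.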


Because we haven't had the reason or the ability to make self-replicating machines yet. It's possible, though. With AI and some robotics, you could replace all humans with machines. The economy doesn't need us.


Advanced nanotechnology is not the only possible way to achieve power: http://slatestarcodex.com/2015/04/07/no-physical-substrate-n...


Yeah, interesting. I'll just point out that my argument is not that AI can't affect the physical world. Clearly it can. It's that AI is still embodied in the physical world, and still subject to the laws of physics. We're so much more efficient and effective in the physical world that we are not threatened by AI, even if it gets much more intelligent than it is today.


great read. thanks for that.


"If the machines turn on us, we can just stop repairing them."

Never understood this reasoning.

We are not talking about machines vs. biological life; that's a false dichotomy. We are talking about intelligence.

Intelligence is the ability to control the environment through understanding it. Any solvable problem can be solved with enough intelligence.

Repairing a machine is just such a problem. The only limits on intelligence are the laws of physics.


That's my point. The laws of physics are a bitch. We like to think of the internet as a place of pure platonic ideals, where code and data are all that matter. But that ethereal realm is still grounded in matter and ruled by physics. And without bags of mostly-water working 24/7 to keep it going, the internet just falls apart.


> But they don't consume the entire earth in a couple of days, because "the right conditions" can't be sustained. They run out of energy. They drown in their own waste.

Maybe, but not necessarily, and even if they do "drown in their own waste" they might take a lot of others with them. When cyanobacteria appeared, the oxygen they produced killed off most other species on the planet at the time [1]. The cyanobacteria themselves are still around and doing fine.

[1] https://en.wikipedia.org/wiki/Great_Oxygenation_Event



