
I'm confused. You mention nuclear weapons, which everyone was convinced would destroy the world and didn't, then go on to claim that AGI, with the same potential, will assuredly do it.

Just as nuclear weapons radically transformed the world, dramatically reducing the amount of armed conflict, AGI may have a similarly transformative effect.

I see no signs that this is going to lead to destruction. Is it really the sign of an intelligent machine to go all Skynet on us?

Even that doomsday scenario had machine intelligences fighting for us. I think your pessimism is distorting your estimate of the relative probabilities of the outcomes.



I'm not being pessimistic; I'm being realistic. I absolutely want a positive outcome, where we build machines millions of times smarter than us, and they magically develop human values and morality and decide to help us.

But making that happen is very, very hard, and it's far more likely they will be paperclip maximizers. There's no reason they would care about us any more than we care about ants.



