
This is the second time in recent days that Kaczynski's writings have been the subject of HN discussion. Rather than rehearse what I had to say then, I'll just link it: https://aaron-m.com/2017/07/09/on-kaczynski


Taking "technology" to its logical end, you will probably arrive at AGI. And despite being decades off away from it, there are already many people making sure that it will be "safe" since the default otherwise will most likely lead to human extinction.

https://waitbutwhy.com/2015/01/artificial-intelligence-revol...

https://waitbutwhy.com/2017/04/neuralink.html

EDIT: I am getting a lot of downvotes for this comment. Just to clear up my intent: all I am trying to say here is that yes, technology could be deleterious, but we can address that in ways other than concluding “and that’s why we have to start blowing shit up and killing people”. I don't see why that's offensive.


Singularitarianism tends to polarize opinion here, I think, in ways that find stronger expression via the voting buttons than the reply option. I wouldn't worry too hard about it.


I'm not smart enough to dismiss cautions that AGI will lead to human extinction, but many of the arguments I have seen, including Bostrom's, don't adequately account for the fact that AIs are not life, and therefore do not compete for resources the way life does. That is, an AGI will not be alive, but neither can it die. It doesn't have genes, and it will not have evolved to propagate genes competitively.

So, while it's possible that AGI will be as bad as or worse than competing with a more powerful and intelligent species of animal, AGIs are not animals and probably won't behave like animals. Increasing chimp intelligence, on the other hand, is probably not a good idea. Bonobos, maybe. Not chimps.

Also missing from the discussion is that intelligence isn't the only danger from competitors for resources. Insects collectively weigh a multiple of the total human biomass and many of them would eat us all, given half a chance. Microbes could evolve to kill us all, directly, or by modifying the ecosystem.


Part of what we see as "human intelligence" comes from our biological origins. For example, intelligence AND emotions drive our actions: ambition, empathy, fear, greed, jealousy, love, etc. They are survival and reproductive mechanisms built into our brains based on the human condition.

My response to the theory that an all-powerful computer intelligence will one day have us at its whim is: why would it even care? For the same reason it would lack empathy, it would also lack any ambition or fear or jealousy.

The bigger risk is relying too much on A.I. built for very specific purposes (ANI). A contrived example would be putting A.I. in charge of managing the nuclear arsenal. A bug in the A.I. that caused a pre-emptive strike could wipe out all life on earth.


I'm going to stay on the self-promotion train for one more stop, because a little while back I actually started writing a long piece on a subject very much adjacent to what you're describing: https://aaron-m.com/2017/08/01/on-the-theodicy-of-system-sho...

Part 2, although I know where I intend to take it, is very inchoate at this point for a reason that I mention in a postscript to part 1. I'd love to have any feedback anyone would care to provide! It'll be of great use in improving the back half of the essay.


Indeed - the most likely dangerous kind of AI is the amoral servant; one that decides to massacre humans based not on its own volition but on the orders it has been given by a human.


I know I'm revoking it by commenting here, but I did give you a downvote. It wasn't that you said something ridiculous. It was that you started from Kaczynski and segued into a topic that's only tangentially related.


I think it's very optimistic to imagine that we're only decades away from inventing God! But I'm also glad there are people taking the time to consider how we might do so without too gravely regrettable a result.


[flagged]


...says the voice in the back of my head, every time I say or write anything at all. How did you get out here?



