I've opened my eyes recently to the catastrophe that we're headed toward with unaligned AI. I know plenty of people here aren't worried, and are actually excited about it. I was too, but I just hadn't really thought very deeply about it before. I was imagining the most rose-tinted sci-fi. Now I'm trying to figure out how this doesn't end extremely poorly for us all, and can't.
I am trying not to panic, not to sink into a deep despair, but it seems like even if AI doesn't actually kill everyone (which apparently over half the people working on AI think has a real chance of happening!), it's going to screw things up far worse than the weak AIs we've already used to build the never-ending attention economy. It seems like AI is a fire that has already started to burn and consume us, and we just keep feeding it instead of fighting it. Maybe we create AIs that cure our diseases, and then someone uses one to take out the power grid, modern society collapses, and hundreds of millions starve. Or maybe no super-intelligent AI ever takes over, but we rely on these systems more and more until every important decision is made by AI instead of people, and by the time bad things start happening it's too late to undo. We're going so, so fast, and there's nobody at the helm.
I have a little boy, and I don't want this world for him. He's so naive, so excited about technology, and I cry for him when he's not looking. I wasn't doing too badly up until today; I tried to hold onto a little hope and maybe start the grieving process ahead of time, but it's just so terrifying. I'm trying so hard to shout "stop! stop!" but the world is so noisy.
And to the people working on making better, smarter AIs: do you really think it's going to be a good thing for humanity to develop and release these things to everyone, this quickly? Aren't you worried that you're bringing about the end of the world, or at least causing massive amounts of suffering? I've heard that a lot of people working in AI are privately very concerned, but don't feel it would make much of a difference whether or not they quit.
People feared domesticated plants, the printing press, the mechanical loom, formal education, radio, the automobile, nuclear tech, and so on. Socrates had misgivings about written language because of all the harm it might cause. I'm sure the same thoughts were had about the domestication of fire a million years ago. In every case, we successfully managed the transition and emerged a more advanced civilization.
In living memory, people cried doomsday about overpopulation, crop failure, mass starvation, and nuclear annihilation. In time, those fears were replaced by fears of acid rain, the ozone hole, deforestation, even the plastic rings on a six-pack. Now we hear it's CO2, plastics, rising seas, population collapse, wokeism, and AI.
Yet here we are, stronger than ever.