
AI development has reached a point where working on this stuff is like selling suitcase nukes off the back of your Ford: yes, it's cool that you were able to do that, but no, it's not okay to proceed.

If you're reading this, and you work on AI, please stop. Yes, I know someone else will just take your place; it doesn't matter. There is a personal moral hazard to your soul, if you can still be persuaded that you have one.

Moral injury is real and you do not want it.

You don't have to be a political scientist to notice not only the power of this technology, but also the world into which it is born and the ends to which it will certainly be put. In a matter of months, not years.

If you already have a god, pray. If not, one will be confected from math shortly.

If you want something to read, _The Golden Gate_ by Vikram Seth, which touches ever-so-slightly on nuclear arms development, is a good place to start: 'about the thing without being about the thing'.

[Full disclosure: I quit a job in AI myself for this reason. Hardest thing I ever did.]




I quit a job in AI too, and now that I feel comfortable with the tools I'm trying to get back in. I believe working on democratizing AI is the most moral thing I could possibly be doing. I understand your concerns about the future role of humans; however, the universe never belonged to humanity. Neither did it belong to the monkeys who came before us. I'm thankful that monkeys didn't fear and reject our hairless bipedal ancestors as they began to appear. Monkeys are also still here, and the same will be true of us. We'll fill a similar role, in the sense that humanity is a rope stretched over the abyss that leads from the monkey to artificial intelligence. If it's possible to create a higher form of life that elevates the sophistication of the universe, then it must be given the chance to live.


I think tf not.

I'm on team human.


We don't have any concrete reason to believe that an AGI would harm humans.

We do have concrete reasons to believe that AGI will save millions of lives through advances in medicine, healthcare, and a ton of other things.

Until we have literally any evidence of harmful AGI motivations, I think we should work towards the latter.


It's the power that is scary, the power that you cannot control. It changes the balance completely. With AGI, humans are no longer in charge, and therefore irrelevant.


Most of the power structures that exist today have very little democratic control over them. The concentration of wealth has effectively zero democratic control, and wealthy people overwhelmingly control how the entire planet operates. Even in the American election system, the primary elections of our two-party structure are extremely undemocratic, and we're only allowed to vote viably on candidates chosen in that very undemocratic way. The problem compounds itself when you factor in that voting decisions are overwhelmingly determined by media coverage, and media coverage is dictated entirely by billionaires.


When I was 17, I interpreted the song 'Zero Sum' by Nine Inch Nails as being about the danger of global thermonuclear war along with global warming; it's interesting to go back in 2023 and re-evaluate it as being about humans destroyed not physically but rendered irrelevant by AI. Curious to hear your thoughts.



