The obvious solution to AI safety is already right there: the OpenAI ToS. We currently have a defender from the technodystopia in Sam Altman. By making sure that every bit of text generated by LLMs costs money (and that money goes to OpenAI) he can ensure the safety of the world through his Terms of Service.
Giving one guy or one small group of people vetted by Eliezer Yudkowsky complete monopoly over this technology or industry is a small price to pay to ensure that the power to easily generate text does not get too spread out and accessible to the wrong people. By concentrating all of the power over content and revenue from the industry into the hands of Good Guys, we make sure that no bad things can happen.
>> We currently have a defender from the technodystopia in Sam Altman
Was that sarcasm? Sam Altman is the most dangerous man on the planet right now, because he is manipulating the public with the AI alignment "problem" while simultaneously rewriting OpenAI's "core values" and developing AGI. And let's not forget his "retina" project with the scam coin. Sam Altman wants to be the sole owner of an AGI that will predict whatever he wants.
>> Giving one guy or one small group of people vetted by Eliezer Yudkowsky complete monopoly over this technology
Nope. Giving anyone or any group exclusive access to, or veto power over, a technology will result in a dystopia. Especially after Eliezer's hysterical letter and his calls to bomb the data centers. He is biased, and his letter was not rational; it was emotional and full of fear. That does not make his point of view any more justifiable.
So I hope that was sarcasm too.
Edit: separated my answers from the original comments.
>Especially after Eliezer's hysterical letter and calls to bomb the data centers. He is biased, and his letter was not rational; it was very emotional and full of fear
I would encourage you to peruse this other post from the same very serious website whose content we are discussing here.
Agreed, the worst outcome here is the little guy getting funny ideas about being able to freely access information. We need to keep it locked up so our betters can decide what the most appropriate use is.