I want to create AGI, or ASI if possible. I'm doing that in a non-academic, non-enterprise way. I want this because I'm afraid that one day Mathematics will become too hard for humans, but I don't want a single second to pass without progress in this field. I believe computers are a decent successor for Math culture. Everything else is less important by far.
People want to create something better than what's already there. With ML and AI, computers can already do better than following instructions or recommending possibly better items from a list; however, they don't know why something is better or what its consequences are. As a system, what's next is an AI that learns the consequences of its behavior and actions.
As humans, we tend to make things better and profit from it. AGI is no different.
Not sure I agree with that philosophy. But it is a valid viewpoint.
I'm more in the camp of "AGI will bring some good, some bad, and some ugly. One step @ a time, please". As if anyone in a leading AI company would care...
A self-learning AI is essentially an endless R&D machine and would lead to the automation of virtually every job in every industry that relies on skilled or unskilled labor. People say it's about money, but where's the money in a society with no jobs?
It's about power. Whoever wields the most powerful version of this technology will hold more influence than any government.
Of course, my goals are incompatible with OpenAI.