We don't align our current actions with AGI itself. Rather, we align them with a presumption of what we think AGI will become, assuming some inevitability.
Some people believe AGI is imminent; others believe it is already here. Observe their behavior, calm your anticipation, and satisfy your curiosity rather than seeking to confirm a bias on the matter.
The tech is new in my experience, and accepting a claim this grand, one beyond my capacity to validate, would require me to take on faith the word of someone I don't know and have never seen, who likely generated the query in the first place, well outside the context length of ChatGPT mini.
The problem is that tomorrow someone might announce a technology that revolutionizes AI all over again, the way ChatGPT did only a couple of years ago. No one expected that, and no one expected creative jobs to be taken over by Midjourney.
We could hide behind "we can't predict the future," but it would be wise to get ready for that eventuality.
One day you will ask your computer to "open a word processor" and it will pull a fully-featured Word 2013 out of thin air. What will developers do then?
That day could be March 1st, 2025, or it could be 2050. Many of us will likely still be in the job pool either way.