Loved the lone footnote defining their view of AGI:
> A highly autonomous system that outperforms humans at most economically valuable work
Holy goalposts shift, Batman! This is much broader and much less than what I'd been led to believe from statements by this company, including by Altman himself.
Every tweet, press release, etc. I've seen has defined their AGI as "a system that can think like humans, or better than humans, in all areas of intelligence." That's the public's view of it as well - surely you can see how burying their "working" definition in a footnote like this, apart from the hype they drum up publicly, is a bit misleading, no?
The footnote is aligned with what Sam Altman had been saying in most interviews until recently. I was actually surprised to see it, since they have shifted how they talk about AGI.