Training has in no way hit a peak in cost, compute, or training-data efficiency.
Big technical advances, like the models of the last year or so, don't happen without a long tail of significant follow-on improvements, fine-tuning at a minimum.
The sheer number of advances being announced by disparate groups, even individuals, also indicates improvements will continue at a fast pace.
There are all kinds of avenues left for making training cheaper and models computationally cheaper, smaller, and so on.
Once that happens (or has already happened), it benefits OAI to throw up walls via legislation.