Nuance.
They have insufficient GPUs for the amount of training going on.
If you assume there's a plateau where the benefits of continued training no longer outweigh the costs, then they probably have too many GPUs.
The question is how far away that plateau is.