I'm researching the AI GPU market landscape...
I don't have the tech background in semiconductors to evaluate this question:
In theory, could Mac servers (with, say, M5 Ultras) be used in a data center a couple of years from now as a substitute for NVIDIA's H100s?
If so, why isn't anyone other than Apple exploring this direction?
If not, why not?
If you grok the tech, please explain to someone who doesn't. THANK YOU!!!