I just realized that I do not understand the current "AI" craze. I understand OpenAI and ChatGPT. Amazing product, provides value, clear room for further improvement. Fine.
What is everyone else trying to do? Why are they buying entire clusters of 1,000 GPGPUs? Is each of them trying to train their own LLM from scratch? If not, what are they trying to do?
- self-driving (Tesla Dojo)
- computational chemistry, drug discovery, protein folding
- physics simulations
- the many things the US national labs do (https://www.scientificamerican.com/article/new-exascale-supe...)