I can see the big chip makers making out like bandits - a la Cisco and other infra providers with the rise of the internet.
They are facing competition from companies building hardware geared specifically toward inference, which I think will push their margins down over time.
On the other end of the competitive landscape, what moat do those companies have? What is to stop OpenAI from pulling a Facebook and Sherlocking the most profitable products built on their platform?
Something like Apple developing a chip that can do LLM inference on device would completely upend everything.
It's a good question. I think the user-facing stuff has things like brand recognition, customer support, user trust, and inertia on its side.
Models don't have that benefit. In Cursor, I can even switch between models freely. It would take a lot of convincing for me to switch off of Cursor itself, however.