
> What makes you think AI will become a commodity?

Because it already is. There have been no magnitude-level capability improvements in models in the past year (sorry to make you feel old, but GPT-4 was released 17 months ago), and no one would reasonably believe that there are magnitude-level improvements on the horizon.

Let's be very clear about something: LLMs are not harder than search. The opposite is true: LLMs, insofar as they replace Search, have made competing in the Search space a thousand times easier. This is evidenced by the fact that there are at least four totally independent companies with comparable near-SOTA models (OpenAI, Anthropic, Google, Meta); some would also add Mistral, Apple Intelligence is likely SOTA in edge LLMs, and xAI just finished a 100,000-GPU cluster; it's a vibrant space. In comparison, even at the height of search competition there were, like, three search engines.

LLM performance is not an absolute static gradient; there is no "leader" per se when there are a hundred different variables on which you can grade a model. That's what the future looks like. There are already models that are better at coding than others (many say Claude is this); there will be models better at creative writing, an entire second class of models competing for best-at-edge-compute, ultra-efficient models useful in some contexts, open-source models awesome at others, and hyper-intelligent ones that are the best for yet others. There's no "leader" in this world; there are only players.




Yes, and while training is still expensive, governments will start funding research at universities.



