
The SGD/pre-training/deep learning/transformer local maximum is profitable. Trying new things is not, so you are relying on researchers making a breakthrough — and even then, to make a blip you need a few billion dollars to move the promising model into production.

The tide of money means we are probably locked into transformers for some time. Transformer ASICs, for example, will be built in droves, and it will be hard to compete with that status quo. Transformer architecture == x86 of AI.




I think it's possible that the breakthrough(s) needed for AGI could be developed any time now, by any number of people (probably not even a heavily funded industry researcher), but as long as people remain hopeful that LLMs just need a few more $10Bs to become sentient, such a breakthrough might not be able to rise above the noise. Perhaps we need an LLM/dinosaur extinction event to give the mammals space to evolve...





