Hacker News

> I think they were in fact way ahead of everyone else,

This would be a lot easier to argue if they hadn't gimped their Neural Engine by only letting it run CoreML models. Nobody in the industry uses or cares about CoreML, even now. And back in 2017 it was still underpowered hardware that a GPU compute shader would obviously outshine.

I think Apple would be ahead of everyone else if they had done what Nvidia did: combine their Neural Engine and GPU, then tie them together with a composition layer. Instead they have a bunch of disconnected software and hardware libraries; you really can't blame anyone for trying to avoid iOS as an AI client.




