
Depends on which part you want to do on the smartphone: training, inference, or both.

Training is quite expensive computationally, but inference needn't be. We have many models that can run on a smartphone, after all.
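To make the cost difference concrete, here is a minimal sketch (with made-up weights, not a real pretrained model) showing that inference for a small network is just a couple of matrix multiplies, well within a phone CPU's budget:

```python
import numpy as np

# Hypothetical tiny two-layer MLP. In a real app the weights would be
# shipped with the app from an offline training run; here they are random
# placeholders just to show the shape of the computation.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 8))
W2 = rng.standard_normal((8, 3))

def predict(x):
    h = np.maximum(x @ W1, 0.0)    # ReLU hidden layer
    logits = h @ W2
    return int(np.argmax(logits))  # predicted class index (0, 1, or 2)

x = rng.standard_normal(16)
print(predict(x))
```

Training those same weights is the expensive part, because it needs many passes over a large dataset plus the backward pass; the forward pass above is all inference ever does.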

However, you can do some limited training on the smartphone by leveraging pre-trained models: the internal representation at the very end of the network (the embedding) can be used as the input for training a much simpler model on top of it.
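That "simple model on frozen features" approach can be sketched in a few lines. The embeddings below are synthetic stand-ins for what the frozen network's last layer would produce; the trainable part is plain logistic regression by gradient descent, which is cheap enough to run on-device:

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend these came out of a frozen pretrained network's final layer.
emb = rng.standard_normal((200, 8))
labels = (emb[:, 0] + emb[:, 1] > 0).astype(float)  # toy binary task

# Train only a small linear classifier on top of the fixed features.
w = np.zeros(8)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(emb @ w + b)))   # sigmoid predictions
    grad_w = emb.T @ (p - labels) / len(labels)
    grad_b = np.mean(p - labels)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

acc = np.mean(((emb @ w + b) > 0) == (labels == 1))
print(round(float(acc), 2))
```

Only the 8-element weight vector and the bias are learned; the expensive feature extractor never sees a gradient, which is exactly why this fits in a phone's compute budget.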

But all of the above depends on what actually needs to be done, which you haven't specified. Classical (non-deep) ML models can easily be trained on a smartphone, provided the dataset fits in memory.
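As one example of such a classical model, a k-nearest-neighbour classifier has essentially no training phase at all beyond storing the dataset, so a small dataset trivially fits on a phone:

```python
import numpy as np

def knn_predict(train_x, train_y, query, k=3):
    # Distance from the query to every stored training point.
    dists = np.linalg.norm(train_x - query, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    # Majority vote among the k nearest neighbours.
    values, counts = np.unique(nearest, return_counts=True)
    return int(values[np.argmax(counts)])

# Toy two-class dataset: class 0 near the origin, class 1 near (1, 1).
train_x = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array([0, 0, 1, 1])
print(knn_predict(train_x, train_y, np.array([0.05, 0.1])))  # → 0
```

The same goes for decision trees, naive Bayes, or linear models: their memory and compute footprints are tiny compared to deep networks.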

The keyword to search for is "edge AI", if that helps.
