Hacker News

ML expert here. Good points above, especially on training vs. inference.

To be honest, yes, it is possible. Most models I've built could run on a mobile device; in practice they don't, because they were written in Python and, since compute is cheap, I didn't worry much about RAM or efficiency for a training job.

I think dataset size is overrated, thanks to things like Kaggle and news coverage of deep learning models for image recognition. Bigger datasets are better, but if your data quality is good, a few hundred rows (like a small CSV file) can be enough for many applications!
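As a rough illustration (my own sketch using scikit-learn, with a synthetic stand-in for the small CSV): a simple classifier trained on ~300 rows can already generalize decently when the features are informative.

```python
# Sketch: a few hundred clean rows can be plenty for a simple tabular task.
# Synthetic data stands in for "a small CSV file"; assumes scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# ~300 rows, 10 features -- about the size of a small CSV
X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=5, random_state=0)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

The point is not the specific model, just that "enough data" depends far more on data quality and problem difficulty than on raw row count.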

Most data challenges are not image recognition or NLP either, so you could run them on smaller devices. I think the main issue would be 'support', though. Small devices do not run Python (or R/Julia), so you need to compile your inference code to some binary format (like WebAssembly) or rewrite it in C/C++. Fortunately, inference code is much smaller than training/experimentation code.
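To illustrate that last point (my own sketch, with made-up exported weights): once a linear model is trained, its inference reduces to a dot product and a sigmoid, a few lines of arithmetic that port directly to C or any other language.

```python
# Sketch: inference for a trained logistic regression is just
# dot(weights, features) + bias, then a sigmoid -- trivially portable.
import math

# Hypothetical weights, as if exported from a trained model
WEIGHTS = [0.8, -1.2, 0.5]
BIAS = 0.1

def predict(features):
    # linear combination of the inputs
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    # sigmoid squashes the score into a probability
    return 1.0 / (1.0 + math.exp(-z))

print(predict([1.0, 0.0, 2.0]))  # a probability in (0, 1)
```

None of the training machinery (optimizers, autograd, the dataset itself) needs to ship with the device, only the frozen weights and this tiny function.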
