Hacker News
Android to launch TensorFlow Lite for mobile machine learning (venturebeat.com)
119 points by ascorbic on May 18, 2017 | 15 comments



The most important part of the announcement seems to be about hardware acceleration. Maybe it's something like Qualcomm NPE?

https://developer.qualcomm.com/software/snapdragon-neural-pr...


It'll be cool to see what Qualcomm releases. Until then, there's already some released TensorFlow code that will run on the Hexagon DSP in a Snapdragon 820: https://github.com/tensorflow/tensorflow/tree/master/tensorf...

You can run it on a rooted OnePlus 3t (or a OnePlus 3 if you find one used).


So I assume it will be similar to this? It almost sounds like they would keep this interface but redesign how it uses the specialty chips. https://www.tensorflow.org/mobile/


I'd like to know how it's different from that at all.


Interesting. I'm wondering whether it will be more popular to train ML models on mobile, or just to deploy models there and run predictions.


It's going to be resource-intensive; won't your users hate you for draining their battery?


It'd be interesting to compare the power drain of running the model on-device vs. transmitting the data to a server, waiting for the response, and then acting on it. Even if on-device is still worse, the responsiveness and reduced data usage might make it worth it.
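The comparison is just energy = power × time on both sides. A toy back-of-envelope sketch; every figure below is a placeholder assumption for illustration, not a measurement:

```python
# All figures below are placeholder assumptions for illustration,
# not measurements -- real numbers vary hugely by device and network.
radio_power_w = 1.0    # assumed cellular-radio power while active
round_trip_s = 0.5     # assumed time the radio stays up for request + response
cpu_power_w = 2.0      # assumed CPU power while running the model
inference_s = 0.05     # assumed on-device inference latency

energy_offload_j = radio_power_w * round_trip_s  # energy to ask the server: 0.5 J
energy_local_j = cpu_power_w * inference_s       # energy to run locally: 0.1 J
```

Under these assumed numbers on-device inference wins, but the conclusion flips easily; the point is only that radio time, not just CPU time, belongs in the battery budget.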


Not to mention that training can be done somewhat "out of band". So you can have your phone train on data while charging overnight to get better predictions the next day.


If you found a way to do efficient distributed training, you could just have each device do a few training runs and the user probably wouldn't even notice.


You may be interested in federated learning: https://research.googleblog.com/2017/04/federated-learning-c...
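The core idea behind federated learning can be sketched as federated averaging: each device trains on its own data locally, and only weight updates (never the raw data) go back to the server, which averages them. A minimal toy sketch in Python/NumPy; the logistic-regression model, the synthetic data, and all hyperparameters are hypothetical choices for illustration:

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training (full-batch gradient descent on
    logistic loss); only the resulting weights leave the device."""
    w = w.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))  # sigmoid
        grad = X.T @ (preds - y) / len(y)
        w -= lr * grad
    return w

def federated_round(w, clients):
    """Server step: average the clients' locally trained weights.
    (Real FedAvg weights the average by each client's data size;
    here both clients hold equal amounts, so a plain mean suffices.)"""
    return np.mean([local_update(w, X, y) for X, y in clients], axis=0)

# Two hypothetical devices, each holding private data that never leaves it.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # ground truth used only to label the toy data
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    clients.append((X, (X @ true_w > 0).astype(float)))

w = np.zeros(2)
for _ in range(20):  # 20 communication rounds
    w = federated_round(w, clients)
```

After the rounds, the server's averaged model fits both clients' data even though it never saw any of it directly.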


It will use accelerators, which should help a bit. You shouldn't use this on the phone's GPU.


They show an example of ML when selecting text in Android: it predicts how much text to select depending on what it is (a full address, an email, etc.). I don't know how they made it, but it was almost instant; I'm sure the users won't care.


I think that was just using a pre-trained model, i.e. the training didn't happen on the device but offline.


Simple models should run performantly. We're all thinking of CNNs and RNNs, but simple logistic regression will probably be the most ubiquitous application.
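Logistic regression inference really is that cheap: one dot product plus a sigmoid. A minimal sketch; the weights and features below are made-up numbers, not any real model:

```python
import math

def predict(weights, bias, features):
    """Logistic regression inference: one dot product and a sigmoid --
    a handful of multiply-adds, cheap even on a phone CPU."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical 3-feature model; all numbers are made up for illustration.
p = predict([1.2, -0.7, 0.3], bias=-0.5, features=[1.0, 0.0, 2.0])
```

The work grows linearly in the number of features, so even models with thousands of features run in microseconds on a mobile CPU.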


If you can train on mobile, you can have the model update itself with user-specific data without sending that data back to a server.



