
Show HN: Hand Sign Recognition on iOS Using CoreML - mendeza
https://www.youtube.com/watch?v=Tl7qRWQFqJc&feature=youtu.be
======
mendeza
Here is a demo of a CoreML model I developed to recognize hand signs, running in
realtime on iOS camera output. The architecture is a MobileNet v1 classifier.
The model is 13 MB, the average processing time is 40 ms per frame, and
accuracy on the dev and test sets is 94%!
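For anyone curious what a MobileNet v1 classifier for this task might look like, here is a minimal Keras sketch. It assumes the 6-class SIGNS dataset (hand signs for the digits 0-5); the input size and training settings are illustrative, not taken from the repo.

```python
# Sketch: a MobileNet v1 classifier for hand-sign recognition in Keras.
# Assumes 6 classes (SIGNS digits 0-5); hyperparameters are illustrative.
from tensorflow.keras.applications import MobileNet

model = MobileNet(
    input_shape=(224, 224, 3),  # standard MobileNet input resolution
    weights=None,               # train from scratch (or warm-start from ImageNet)
    classes=6,                  # SIGNS: hand signs for digits 0-5
)

model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

MobileNet's depthwise-separable convolutions are what keep the exported model small and fast enough for a phone, which lines up with the 13 MB / 40 ms numbers above.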

The dataset and the inspiration came from this tutorial:

[https://cs230-stanford.github.io/proj..](https://cs230-stanford.github.io/proj..).

The demo here is running on an iPhone 6s. The model was built in Keras,
trained on CS230's SIGNS dataset plus a custom dataset, and converted to
CoreML using coremltools.
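The Keras-to-CoreML step can be sketched with coremltools roughly as below. This is a hedged example, not the repo's actual script: it uses an untrained MobileNet as a stand-in for the trained model, and the input scaling and file names are assumptions.

```python
# Sketch: converting a Keras model to CoreML with coremltools.
# The untrained MobileNet here is a stand-in for the trained model;
# substitute your own. Scale and file names are illustrative.
import coremltools as ct
from tensorflow.keras.applications import MobileNet

keras_model = MobileNet(input_shape=(224, 224, 3), weights=None, classes=6)

# Declare the input as an image so the iOS camera pipeline can feed
# pixel buffers directly; scale assumes training images were divided by 255.
mlmodel = ct.convert(
    keras_model,
    inputs=[ct.ImageType(shape=(1, 224, 224, 3), scale=1 / 255.0)],
    convert_to="neuralnetwork",  # classic .mlmodel format
)
mlmodel.save("SignsMobileNet.mlmodel")
```

The resulting .mlmodel file can be dragged into an Xcode project, and CoreML generates the Swift wrapper class for on-device inference.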

Code to train and deploy the model is available here:
[https://github.com/interactivetech/Ha..](https://github.com/interactivetech/Ha..).

If you're interested in integrating deep learning into your iOS app, feel free
to email me at interactivetech1@gmail.com

