
Some technical details about how this works:

In the demo, the computer vision and gesture recognition algorithms run on a Windows 8 tablet connected to the glasses via a USB cable. The tablet processes the live video feed from the glasses and communicates with the AR Drone via the super easy to use https://github.com/venthur/python-ardrone project. For rapid prototyping we started with the great JS hand recognition demo at http://revealjs.herokuapp.com/, but we quickly wanted the flexibility of OpenCV, so we moved to the OpenCV Python bindings along with other high-performance Python libraries such as SciPy.
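To give a flavor of the drone side, a minimal python-ardrone session looks something like this (the calls are the ones from the project's README and source; the gesture mapping in the comments is just for illustration):

    import time
    import libardrone  # module provided by the python-ardrone project

    # Connect over the drone's WiFi network; the library manages the
    # AT-command and navdata sockets in background threads for you.
    drone = libardrone.ARDrone()

    drone.takeoff()
    time.sleep(5)       # give it a few seconds to stabilize in the air
    drone.turn_left()   # e.g. what a "swipe left" gesture might trigger
    time.sleep(1)
    drone.hover()
    drone.land()
    drone.halt()        # shut down the communication threads cleanly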

The bulk of the algorithm uses a variety of features (such as color, motion, and shape) to estimate where hands might be in the frame. When a hand is found, models of the foreground (hand) and background (non-hand) are learned to make subsequent hand estimation more robust.
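As a rough illustration of the color-feature part (a simplified sketch, not our exact pipeline), a skin-color mask in OpenCV Python looks something like:

    import cv2
    import numpy as np

    def skin_mask(frame_bgr):
        # Work in HSV, where skin tones cluster more tightly than in
        # BGR. The bounds below are illustrative; real values need
        # tuning per camera and lighting.
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([0, 40, 60], dtype=np.uint8)
        upper = np.array([25, 180, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        # Remove speckle noise before looking for hand-sized blobs.
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        return mask

    # Largest remaining blob is a candidate hand region
    # (OpenCV 2.4 findContours signature shown here):
    # contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
    #                                cv2.CHAIN_APPROX_SIMPLE)

In practice the color mask is only one vote; combining it with motion and shape cues is what keeps the estimate from falling apart under bad lighting.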

We are building an application platform for developers to create applications that integrate with our glasses, and we'll be exposing a gestures API (among other APIs) for easy gesture-triggered events. This will enable lots of fun applications, like waving your hand to turn up the music on your glasses, but we're most excited about all the ideas we haven't even thought of that everyone else will build with the platform and APIs.
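The API isn't finalized yet, but to give a feel for the direction, gesture-triggered events might look something like the sketch below. Everything here, from the decorator to the gesture names, is hypothetical; the dispatch function is a stand-in for what the glasses runtime would do.

    # Purely hypothetical sketch -- none of these names are final.
    _handlers = {}

    def on_gesture(name):
        """Register a callback for a named gesture (stub dispatcher)."""
        def register(func):
            _handlers[name] = func
            return func
        return register

    @on_gesture("wave_up")  # hypothetical gesture name
    def turn_music_up(event):
        print("volume up!", event)

    # The glasses runtime would invoke something like this whenever
    # the vision pipeline recognizes a gesture:
    def dispatch(name, event=None):
        handler = _handlers.get(name)
        if handler is not None:
            handler(event)

    dispatch("wave_up", {"confidence": 0.93})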



