
ASLSpeak: Intuitively decode sign language using the Leap Motion, and speak it - Oatseller
http://devpost.com/software/aslspeak
======
lioeters
Using gesture recognition and machine learning to translate sign language to
speech - brilliant!

Of all the interesting things people have been doing with Kinect, Leap Motion,
etc., this seems to be one of the most practical and helpful applications. I
hope they pursue it further and develop a product for the general public; it
could be of great benefit to many people.

They could have volunteers provide more input to teach and improve the
algorithm, and even branch out into other sign language dialects.

~~~
lioeters
Just realized the team is a pair of 18-year-olds! My goodness, so smart and
socially conscious.

------
Oatseller
I originally found this on the MSDN blogs:
[http://blogs.msdn.com/b/microsoft_student_developer_blog/archive/2015/11/03/what-s-the-secret-to-winning-hackathons-a-great-idea-22-hours-without-sleep-and-microsoft-azure.aspx](http://blogs.msdn.com/b/microsoft_student_developer_blog/archive/2015/11/03/what-s-the-secret-to-winning-hackathons-a-great-idea-22-hours-without-sleep-and-microsoft-azure.aspx)

    
    
        Like Nick Bowmen (18) and Nelson Liu (18, pictured left). The duo
        forms team ASLSpeak - winner of both the Microsoft Hack Award and the
        EMC Isilon Hack Award.
    
        Knowing they wanted to create something to help the disabled in some
        way, they came up with an idea that aims to aid American Sign Language
        (ASL) speakers.
    

Truly inspiring work.

GitHub link: [https://github.com/nelson-liu/ASLSpeak](https://github.com/nelson-liu/ASLSpeak)

