
I’d like to see someone develop a chorded keyboard with some sort of gesture-based feedback: an advanced sign language for computer input.

Cameras and other sensors could be used to read our gestures:

https://www.cnbc.com/2018/07/16/ctrl-labss-neural-tech-lets-...

https://atap.google.com/soli/




I also think our hands could do more from the keyboard position. Lift them slightly off the keys and a camera could recognize gestures; even two-finger scrolling could be done that way.

Same with pinch-zoom. Or, when working with a 3D model, you could cup your hand as if gripping a globe and twist to rotate the object on screen.

I remember posting excitedly about this on Slashdot 15 years ago, and it only just came back to me. Basic detection should be easily achievable with modern hand-tracking libraries.
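As a minimal sketch of that basic detection, assuming MediaPipe Hands and OpenCV are installed (pip install mediapipe opencv-python): track the index and middle fingertips from a webcam and turn their vertical motion into scroll events. The fingertip landmark indices are MediaPipe's own; the dead-zone threshold and the print-instead-of-scroll output are placeholder choices of mine, not any shipped API.

    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands
    INDEX_TIP, MIDDLE_TIP = 8, 12  # MediaPipe Hands fingertip landmark indices

    cap = cv2.VideoCapture(0)
    prev_y = None

    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                lm = results.multi_hand_landmarks[0].landmark
                # Mean vertical position of the two fingertips, normalized to [0, 1].
                y = (lm[INDEX_TIP].y + lm[MIDDLE_TIP].y) / 2
                if prev_y is not None:
                    dy = y - prev_y
                    if abs(dy) > 0.01:  # dead zone to ignore jitter
                        print("scroll", "down" if dy > 0 else "up", round(dy, 3))
                prev_y = y
            else:
                prev_y = None  # hand left the frame; reset
            cv2.imshow("hands", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
                break

    cap.release()
    cv2.destroyAllWindows()

Feeding the events into real OS scrolling (e.g. pyautogui.scroll) is a one-liner; the genuinely hard part is telling "fingers lifted to gesture" apart from ordinary typing.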

Although some abhor the idea and would rather have more complex keystroke chords, I think certain gestures are intuitive enough that I wouldn't have to learn anything new.


Leap Motion gesture sensors were shipping in HP laptops at least five years ago; this is from 2013: https://www.cnet.com/pictures/this-hp-laptop-has-a-built-in-...

How does it compare to the Soli?



