

Leap Motion: hand and finger motion controller - aheilbut
http://www.leapmotion.com

======
bearwithclaws
Previously on HN: <http://news.ycombinator.com/item?id=4002418>

------
Jach
So it's a smaller Kinect (with no Linux support) for half the price? I'm
assuming it's using infrared; they don't say... Personally, I'm still waiting
for a company to discover ultrasound and combine this with tactile feedback.
<http://www.youtube.com/watch?v=-e8tsG4uIt0>

~~~
vibrunazo
I agree tactile feedback would be helpful. But it's not as critical here as it
would be on a normal touch screen, because here you at least have visual
feedback. On a touchscreen you only have two states: touch and no touch. Your
app cannot know where the user's finger is when it's not touching, which is
why there are no cursors and onFocus elements lose their meaning.

But if the Leap is as good as in the videos, then your app could show a cursor
where the fingers are before they touch the UI, and use that as visual
feedback to help the user mentally locate themselves. So I'd guess typing on a
virtual keyboard with the Leap would be much more efficient than on a
touchscreen virtual keyboard.
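To make the distinction concrete: a touchscreen gives an app two input states, while a device that tracks fingers in the air can expose a third, hover state driven by the finger's height above a virtual touch plane. A minimal Python sketch of that idea (the state names, thresholds, and virtual plane are illustrative assumptions, not any actual Leap API):

```python
from enum import Enum

class PointerState(Enum):
    AWAY = "away"    # finger out of tracking range: no cursor shown
    HOVER = "hover"  # finger tracked above the virtual surface: show a cursor
    TOUCH = "touch"  # finger crossed the virtual surface: activate the UI

# Assumed geometry, purely for illustration.
HOVER_RANGE_MM = 150.0  # how far above the plane we still track a finger
TOUCH_PLANE_MM = 0.0    # z-position of the virtual touch plane

def classify(z_mm: float) -> PointerState:
    """Map a finger's height above the virtual plane to an input state."""
    if z_mm <= TOUCH_PLANE_MM:
        return PointerState.TOUCH
    if z_mm <= HOVER_RANGE_MM:
        return PointerState.HOVER
    return PointerState.AWAY
```

A touchscreen app only ever sees the TOUCH/AWAY pair; the extra HOVER state is what lets the app draw a cursor and fire onFocus-style events before contact.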

Of course, since existing apps won't be expecting this, it will be hard to use
them as-is. Ideally, people will adapt their apps to this input instead of
trying to migrate directly from touch.

------
paduc
The problem is that the user has no tactile feedback. It could quickly become
very painful to hold one's hand/arm over the device.

Maybe they should orient the device toward the table and turn it into a
virtual trackpad.

