
Leap Motion's High-Res NUI Will Make Today's Touch A 'Legacy' - akosner
http://www.forbes.com/sites/anthonykosner/2012/07/16/leap-motions-high-resolution-natural-user-interface-will-make-todays-touch-a-legacy/
======
AndrewDucker
No it won't.

It's a fascinating system, with lots of potential, but it's not going to
replace actually touching the thing I want to touch on the screen, and it's
not going to get rid of the problem of "Gorilla Arm" which happens if you hold
your arm out straight for more than a few minutes.

~~~
zacharypinter
Side note: Have there been any studies (or even informal accounts) on people
that hold their arms up a lot (e.g. puppeteers)? Do they still get tired arms
easily, or have they gotten so used to it that they can do a day of work
without their arms getting tired?

~~~
vadman
"[Rotator cuff tear] also commonly affects orchestra conductors, choral
conductors, and drummers (due, again, to swinging motions)." -
<http://en.wikipedia.org/wiki/Rotator_cuff#Rotator_cuff_tear>

I wonder if this could be a side effect of Leap. The motions required are not
as violent as those of conductors, but then again there are finer/weaker
muscles in our wrists/fingers that could be affected.

------
dsirijus
I've been watching this since the public announcement date, preordered, applied
for the dev SDK, and been active on the forums, as well as pondered the
possibilities (read: waved my hands in the air a lot).

My conclusion: THIS WILL REPLACE THE TRACKPAD.

Not augment, not sit alongside. REPLACE. And not only because Leap is
awesome, but because the trackpad is horrible.

EDIT: You'll remember not upvoting this one when you're proved WRONG.

------
nsns
I don't get it... if I want my computer to take note of my hands, I have to
physically touch my keyboard or trackpad; any movement lacking that physical
contact doesn't affect it. With this, how do I signal my computer to ignore
movements that are not intended for it?

------
ila
Having seen a prototype, I can only say that this tech is a game changer,
comparable to the mouse, and it will make keyboard-style input/UX obsolete. It
wouldn't surprise me if the company gets snatched up before it ships a product.

------
th0ma5
Couldn't one duplicate this with just a few webcams and some computer vision?
I haven't read their patents or anything, but I thought I read the thing just
has three webcams in it.
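For what it's worth, the basic idea behind recovering depth from a pair of webcams is stereo block matching: a point closer to the cameras shifts more between the two images, and that shift (disparity) is inversely proportional to depth. Here is a minimal single-scanline sketch in plain NumPy, purely illustrative; Leap's actual approach is not public, and real systems use calibrated, rectified cameras and far more robust matching:

```python
import numpy as np

def disparity_row(left, right, block=3, max_disp=8):
    """Estimate per-pixel disparity for one scanline pair by block matching.

    For each small window in the left row, search up to `max_disp` pixels
    in the right row and pick the shift with the smallest sum of absolute
    differences (SAD). Larger disparity means the point is closer.
    """
    n = len(left)
    half = block // 2
    disp = np.zeros(n, dtype=int)
    for x in range(half, n - half):
        patch = left[x - half:x + half + 1]
        best_cost, best_d = float("inf"), 0
        # Only consider shifts that keep the candidate window in bounds.
        for d in range(min(max_disp, x - half) + 1):
            cand = right[x - d - half:x - d + half + 1]
            cost = np.abs(patch - cand).sum()
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp
```

For example, matching a row against a copy of itself shifted by 4 pixels recovers a disparity of 4 across the interior of the scanline.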

------
jcfrei
This might be useful in many niche cases, but I doubt it will see widespread
usage as a replacement for traditional input methods. I imagine it being
useful when you quickly need to switch to another desktop, when you're just
browsing the web, or when playing another one of those funny games where you
have to juggle balls or something.

Or maybe it'll forever remain one of those cool demo tools, where you show off
a model of your newly developed protein to some executives on a large screen.

~~~
pbreit
Not sure you could get more narrow-minded than that. I'd suggest opening up
your outlook a few millimeters and trying to envision some of the zillions of
ways this could be used. Here's a hint: you might not be sitting in front of a
PC.

------
wahsd
I get tired of holding a remote too long; you want to start waving around like
a little Harry Potter wannabe to get anything done?

I don't think so!

------
debacle
I was really hoping 'NUI' stood for 'Neural User Interface.'

~~~
lnanek2
I was at a hackathon over in Santa Clara recently where we were using consumer
brain wave reading headsets: <http://www.neurosky.com/>

Unfortunately, it looked like there were only something like 8 signals read,
the two easiest being concentration and relaxation. Blinking was there too,
although that had false positives and was delivered as a probability rather
than a boolean. But hey, the tech is coming.

------
DanielBMarkham
I like this a lot, but I think it will augment instead of replace systems.

So I walk into my living room, say "Computer. Gesture interface" then change
around things on my large screen monitor while I remain standing, perhaps
setting up some music to play and launching and arranging some apps. Then I
say "Computer. Mouse interface" and sit down and begin programming.

Not sure of the details, but this inside a mixed environment would be awesome.
Might not want to work all day with my hands extended, but I shouldn't need to
sit and find a keyboard for everything, either.

I'd also like to see surface displays with touch feedback. I think there are a
lot of non-intuitive things like image processing or gaming that could benefit
from haptic feedback systems. It's going to be awesome watching these
technologies mix with each other over the next ten years or so. Even the
simple idea of having multiple screens all over the wall and having them
switch depending on which one you're staring at would completely change our
relationship with modern technology. (And while we're blue-skying, true 3D
would be really nice too.) Perhaps we end up with _all_ surfaces being some
sort of color e-ink and something like Leap as a standard feature of house,
office, and car construction? Could that happen?

~~~
nchlswu
Ultimately, I really do think it will be a mixed environment like you say (or
a different paradigm entirely). There are a lot of NUI proponents who gas up
their work in speech or touch (and touch related) interfaces.

I do agree that they will be used pretty widely (maybe not so much
speech...), but all of these technologies remain stop-gaps in the sense that
the mouse was a stop-gap solution. They are great stepping stones to
something _truly_ natural* that encompasses more than one method of
interaction.

* I don't think "natural" is necessarily the right goal for the future, but that seems to be what a lot of folks strive for.

