
Leap Motion: Amazing, Revolutionary, Useless - kevin_morrill
http://www.hanselman.com/blog/LeapMotionAmazingRevolutionaryUseless.aspx
======
jdietrich
Believe it or not, this is a century-old problem.

The theremin is an electronic musical instrument, played by waving your hands
in the air. It works by detecting RF capacitance between a pair of antennae
and the player's body. You can see the theremin being played at the YouTube
link below.

Playing the theremin is incredibly difficult, due to the lack of tactile
feedback. The human body is very poorly equipped to point precisely at an
arbitrary position in free space. Only a handful of players can achieve
anything better than squeaky science fiction noises and even virtuoso players
struggle constantly with intonation. Modern theremin technique depends on a
system of discrete hand gestures, which reduce the player's dependency upon
coarse proprioception.

If the Leap Motion is to have any real utility, it will need _phenomenally_
sophisticated software, to interpret intent from hand motion rather than
simply passing the hand location as a raw input. The human body simply isn't
capable of making the kind of movements that the designers of Leap Motion seem
to expect, even with a great deal of practice.

[https://www.youtube.com/watch?v=Ptq_N-gjEpI](https://www.youtube.com/watch?v=Ptq_N-gjEpI)

~~~
lifeformed
I wonder if you could add some tactile feedback with some precision fans? A
grid of fans below your hand and in front of it could provide a gradient of
subtle pressure. Or maybe a fan on some sensitive servos that track each
fingertip?

~~~
vilya
That's pretty much exactly the idea behind the "Aireal" project from Disney
Research:

[http://www.disneyresearch.com/project/aireal/](http://www.disneyresearch.com/project/aireal/)

------
nairteashop
I think the article is a bit unfair. I've been playing around with a Leap for
a few days and am suitably impressed.

What works very well with the device are coarse movements, especially relative
hand movements. What doesn't work so well are finer gestures (1/100th of a
millimeter motions of all 10 fingers? Ha).

This app is a perfect example: [https://airspace.leapmotion.com/apps/cyber-science-motion](https://airspace.leapmotion.com/apps/cyber-science-motion)

You can use your hands, kept flat, to spin around / zoom a 3-D rendering of a
human skull. You can also point at specific elements on the skull. Both of
these coarse gestures work great, and the experience is incredible.

However, the app also unfortunately has a "click" gesture to pick apart
elements of the skull - you click by spreading out your thumb and then folding
it back in. Works terribly, as this fine gesture is detected maybe 50% of the
time. It should've simply been left out.

I showed this app to my dad, who's a doctor, and he was blown away. He was
visibly excited about the potential for a device he can use to spin around CT
and MRI scans in the operating room without having to touch a mouse/joystick -
currently he has a person doing this for him to keep things sterile, and this
can sometimes be frustrating.

The Leap, at least in its current incarnation, reminds me a lot of Google
Glass. Both Google/Leap and their proponents say the devices are going to
change the world. Maybe, maybe not. Neither device works perfectly like what
you see in the heavily edited demo videos. But both can be invaluable in
certain specialized fields, _today_, as long as folks are realistic about
what can be done with them.

~~~
abrichr
_I showed this app to my dad, who's a doctor, and he was blown away. He was
visibly excited about the potential for a device he can use to spin around CT
and MRI scans in the operating room without having to touch a mouse/joystick -
currently he has a person doing this for him to keep things sterile, and this
can sometimes be frustrating._

This is exactly the value proposition of my startup, TouchFree Labs. We're
developing software that uses the Leap Motion Controller to allow surgeons to
manipulate medical images inside of the operating room. You can see a
demonstration of an early prototype here: [http://www.youtube.com/watch?v=WaO-cimDOEQ](http://www.youtube.com/watch?v=WaO-cimDOEQ). The demo starts at about
35s. (Apologies in advance for the low production value.)

Right now our bottleneck is medical expertise, and we're looking for surgeons
who would be interested in collaborating with us. We're developing workflows
that are tailored for different types of procedures, which requires very
specialized knowledge. The application also learns the nuances of individual
users' movements to improve gesture recognition, which requires lots of data.

I don't know how far away you are from Toronto, but if you could pass the
message along to your dad, I'd be very grateful--if only to get some basic
feedback. But if he's interested, he could be among the first surgeons in the
world to use the Leap Motion inside of an operating room.

~~~
jamesbritt
Is there anything you can say about how you arranged a license for this use?
The default Leap EULA forbids use with medical equipment, as well as use on a
shared work-station.

~~~
abrichr
We're still working on this. But since our product won't be used for
diagnostics or direct surgery, the regulatory requirements should be a bit
more relaxed.

------
mbesto
There's a general assumption that the Minority Report interface is the
interface of the future.[0] There are a few reasons why I'm saying no.

1\. First and foremost, gorilla arm.[1] My presumption about the "interface of
the future" is that it's needed for prolonged use. So, first things first, the
interface can't be one where our arms require our hands to be higher than our
elbows. Unless of course our species gets a whole lot stronger in the forearm
to support such a feature. Don't see our species doing that anytime soon.

2\. Feedback - Right now the feedback loop is eye->brain->hand->brain->eye
(repeat) where the hand's pressure against a solid surface is the most
important feedback response. With the minority report style interface we
currently have a massive delay (comparatively speaking) between the
brain->hand->brain loop. We also have to iterate the whole loop much more
because we need to constantly assess with our eye where our hand is in 3D (not
digital) space. Now let's say the technology gets much better and reduces this
to 5ms. We are now bound by the differences of our synapses firing between
touch and light. I could be wrong, but my assumption is that, the speed of
light being what it is, "touch" will always beat "sight" in performance.

For prolonged-use applications my bet is on adaptive surfaces. For short-term
interfaces (turning a stove on, flicking a light switch, etc.) I can
potentially see this Minority Report style interface happening. But does the
benefit justify the cost of innovation? Personally I think we are fooling
ourselves.

[0] -
[http://www.ted.com/talks/john_underkoffler_drive_3d_data_wit...](http://www.ted.com/talks/john_underkoffler_drive_3d_data_with_a_gesture.html)

[1] -
[http://en.wikipedia.org/wiki/Touchscreen#.22Gorilla_arm.22](http://en.wikipedia.org/wiki/Touchscreen#.22Gorilla_arm.22)

~~~
nairteashop
I agree with you that a Minority Report style interface doesn't make sense as
the _sole_ interface to a PC. However, something like the Leap IMO makes a
great secondary interface to a keyboard and mouse/trackpad.

I'm using a Leap controller right now, with BTT for Mac/Touchless for gesture-
based control. As I read through a page I can simply stick my hand out and
wave it up to scroll down the page - it's a phenomenal experience for passive
reading as I don't have to break focus to reach out for my mouse/trackpad.
I've also configured some additional coarse gestures to launch mission control
etc.

Using the Leap for such brief, coarse gestures avoids both the problems you've
mentioned: my arm is resting on my desk, with fingers just a few inches above
my trackpad/keyboard, so no "gorilla arm" problems; the gestures are coarse,
requiring very little hand-eye coordination; and finally, the gestures are
brief, so no fatigue problems.

All of this breaks down once you start trying to do any finer-control
gestures, like trying to point at links and click on them like the OP tried to
do. IMO the Leap should be used to augment the keyboard/mouse as a secondary
interaction interface that you use occasionally.

~~~
Fuzzwah
I don't really understand how reaching out and waving my hand to scroll a page
is a step forward from simply using a scroll wheel on the mouse which is
already under my hand. Even if it's not under my hand, my peripheral vision
is good enough that I don't need to break focus on what I'm reading to
position my hand on the mouse...

I totally accept that commenting on a LEAP without first using one could /
will make me look stupid.

~~~
superuser2
>I don't really understand how reaching out and waving my hand to scroll a
page is a step forward from simply using a scroll wheel on the mouse which is
already under my hand

The real value proposition is not necessarily in applications designed under
existing UI paradigms like scrolling text. It does, however, let you
accomplish "science fiction" effects like changing the camera position on a 3D
model far, far more easily than the mouse.
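The camera-orbit case is easy to see in code: a relative hand displacement maps directly onto spherical camera angles, with no clicking or precise pointing needed. This is a minimal illustrative sketch only; every function name and constant here is an assumption, not the actual Leap SDK.

```python
import math

def orbit(yaw, pitch, dx, dy, gain=0.01):
    """Turn a relative hand displacement (dx, dy) into new orbit angles."""
    yaw += dx * gain
    # Clamp pitch so the camera can't flip over the poles.
    pitch = max(-math.pi / 2, min(math.pi / 2, pitch + dy * gain))
    return yaw, pitch

def camera_position(yaw, pitch, radius=5.0):
    """Place the camera on a sphere around the model's origin."""
    return (radius * math.cos(pitch) * math.sin(yaw),
            radius * math.sin(pitch),
            radius * math.cos(pitch) * math.cos(yaw))

# One sweep of the hand: 150 units right, 40 units up.
yaw, pitch = orbit(0.0, 0.0, dx=150, dy=40)
print(camera_position(yaw, pitch))
```

The point is that coarse relative motion, the thing the Leap tracks well, is already in exactly the shape a 3D orbit wants.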

~~~
clarky07
Is it better than a touchpad? My iPad with touch does rotating and zooming
really well.

I can see this being an awesome technology in an operating room where they
don't want to touch things for sanitation reasons. I have a hard time
imagining how it is useful in my day to day use of a computer.

------
bennyg
A coworker and I have written some gesture recognizers for Leap in C# and
Objective-C hoping to make the thing more usable and easier for other
developers to write software using it. Hate to do a shameless plug, but both
can be found here:

[https://github.com/uacaps/MotionGestureRecognizers-ObjC](https://github.com/uacaps/MotionGestureRecognizers-ObjC)

[https://github.com/uacaps/MotionGestureRecognizers-CSharp](https://github.com/uacaps/MotionGestureRecognizers-CSharp)

\---

We're hoping to lay the community foundation for tools that make the Leap
extremely usable from both a development and a user experience standpoint. The
Leap is awesome and beautiful, and we think it can be used in a myriad of
applications.

------
nine_k
If all you do is browse the web, a touchpad is often all you need.

For text editing / word processing, a good keyboard is often all that's
needed, and the use of a mouse is often discouraged by gurus.

I still can easily imagine using the Leap Motion device while editing images
and especially 3D models. Even more, I can imagine using it in games,
especially games written with this device in mind.

I don't own the device, but I tried it. What's great is that you don't need to
wave your hands in the air, Minority Report-style; moving your fingers is
enough. I wish it were built into a keyboard; it would easily replace a
touchpad / trackpoint while adding many more capabilities.

BTW does anyone here remember how clumsy mice were on PCs in, say, 1992?

~~~
kabdib
Mice worked pretty well in 1992. They worked well in 1984 on the Mac, and five
years before that on Xerox hardware and Lisp Machines (though they suffered
from "really small ball bearing" disease, and easily got dirty or cranky and
refused to roll well).

With a mouse, you have at least one button you can signal an event with.

Imagine doing a UI where you didn't have a mouse button. All you can do is
move and point. That's a Kinect, for the most part.

I haven't used a LeapMotion, but I suspect it's the same problem; there's no
way to generate a discrete event. It's all fuzzy. Did your fingers touch? Did
you wave in a particular way? Some fuzzy matcher is pumping out "90%
probability of event X, 75% probability of event Y" every few milliseconds,
and it's up to higher layers to turn this goo into decisions that people are
happy with. It's hard at all layers.
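That probability-to-event "goo" can be firmed up in software with a hysteresis threshold: fire once when confidence crosses a high bar, and re-arm only after it drops below a low one. The sketch below is purely illustrative; the names and thresholds are assumptions, not any real Leap or Kinect API.

```python
def discretize(probabilities, fire_at=0.9, rearm_at=0.5):
    """Turn a stream of fuzzy gesture probabilities into discrete events."""
    armed = True
    for p in probabilities:
        if armed and p >= fire_at:
            armed = False
            yield True           # one discrete "click"
        else:
            if p <= rearm_at:
                armed = True     # gesture released; ready for the next one
            yield False

# A noisy confidence stream with two confident bursts.
stream = [0.1, 0.4, 0.92, 0.95, 0.7, 0.3, 0.91, 0.2]
clicks = [i for i, fired in enumerate(discretize(stream)) if fired]
print(clicks)  # [2, 6]
```

Even with this, a physical clicker is a harder signal than any threshold; the hysteresis just keeps one confident burst from registering as ten clicks.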

I really think you need a button, a clicker. Something "hard" in the UI that
slams a voice of reason into that fuzzy tower that's continually only able to
/guess/ what you're trying to do.

[We wanted a clicker on Kinect. Politically impossible. I think it would have
helped a lot.]

~~~
rywang
Actually, it's quite possible to build a "clicker" on the Kinect. It involves
mounting the sensor from above, and building completely different software
that tracks the hands and fingers well.

We've done it: [http://threegear.com](http://threegear.com) Here's a video of
tracking arbitrary hand motion using a Kinect-equivalent sensor:
[http://youtu.be/exZ6wukQCpk](http://youtu.be/exZ6wukQCpk)

~~~
jfoutz
I think he was calling out the microsoft politics as limiting the
functionality of the kinect, not technical expertise.

------
will118
I find the Leap part of BTT (BetterTouchTool) is actually... err..
use-worthy..? Neither useful nor useless.

I've got some really cool (still a big part for me) and useful stuff working,
augmenting my mouse/keyboard use. For example, a finger to the left minimises
and two fingers to the right opens a list of recently used apps.

Yet I'm very conscious that everything would just be better suited to a
keyboard shortcut..

I never bothered with Touchless and mouse emulation things; years of 2D GUI
design isn't suited to this kind of interface. "Midnight" is my favourite
Leap app, but I think that's just an iPad app that lends itself very well to
the Leap input too.

------
imroot
I did some investigative work to see if we could use Leap Motions to replace
the touch-sensitive overlays that we strap to 60" TVs for our on-air traffic
folks to use during their segments -- the overlays run about $2,500, and if we
could replace them with a Leap Motion and get the same functionality at less
cost, this could allow us to roll out the traffic application to more stations
than the six or so that are currently using our in-house traffic application.

The first thing I noticed was that it couldn't cover the range of the 60"
television we had hooked up to the traffic software, so I scaled this down to
a Thunderbolt display and tried again. In my tests, recognition of the thumb
was sporadic, if not completely missing -- both in my (fat guy) test and
during the testing of the local personality (non-fat guy).

I then made some changes to our software to try to minimize the effects of the
natural movements of the hand -- I turned down the sensitivity to attempt to
compensate for the normal shakes and jitters that you have with your hands.
This gave it a better feel, but the traffic reporters still missed the
feeling of touching the display and watching it respond to their touch.

They're still neat devices (I really wanted to say neat toys, but, I don't
want to cheapen the work that the Leap Motion folks put into this thing), but,
I'm having a hard time implementing them in a way that would work for us...so
they're sitting on my shelf, waiting for a project that could use them (or,
take them to my local hackerspace should I not find a good project for them
shortly)...

------
CRidge
Amazing, Revolutionary, Useless... and let's not forget buggy! And with
horrible support... My device was not able to recalibrate, a problem shared
with many others, if the forum is to be believed. A week or two has passed,
and no reply from the makers of the device, either on the forum or on my bug
report.

I guess I just got another hunk of junk to put in the failed-devices-closet...
:-(

~~~
alternize
same here. my device reports "bright light detected" even when there's not
enough ambient light to see the keys on my keyboard. turning off the monitor
seems to help, but...

~~~
vanderZwan
It works by detecting infrared light - the light source is likely an infrared
one that you can't see, possibly your monitor.

~~~
alternize
absolutely. yet if it won't work properly with the monitor turned on, there's
little use for it.

------
Semaphor
This is why I decided to back Mycestro [1] and pass on the Leap Motion. While
it won't support multi-hand (unless you have 2 devices) and multi-finger
stuff, at least it should be able to easily recognize any motion I make with
it on.

[1] [http://www.mycestro.com/](http://www.mycestro.com/)

~~~
draugadrotten
Expensive. 30 USD shipping for a tiny < 50g light device. Will they send John
Travolta delivering these things in person using his jet, or what is up with
that?

~~~
dsego
How is that expensive? You pay more for a good mouse.

~~~
nodata
30 dollars shipping for a mouse?

~~~
dsego
oops, I thought that was the item price.

------
nixarn
My first impressions of the Leap are similar. Got one, played around with it,
felt it was kinda useless, haven't "touched it" since.

~~~
lvs
As another early adopter, I have to say that it's disappointing how much the
company is relying on "the community" to generate their business model for
them, rather than properly develop the software themselves.

------
nsxwolf
It's obviously not a replacement for a mouse and keyboard and never will be. I
could see some useful gesture-based macros, like "throw your hands up in total
frustration" to rage-quit an app or open a distraction-free full screen
editor.

------
utopkara
I find the Leap Motion quite accurate. You quickly get used to the convenience
of gestures with BetterTouchTool, and wish you had it on computers that don't
have a Leap Motion.

Otherwise, the gestures used by apps are something that needs to be carefully
crafted. For instance Touchless, the mouse replacement, simply doesn't cut it;
you'll find yourself reaching for the mouse/trackball/trackpad within the
first 10 seconds.

The Leap is affected by strong light sources on the ceiling. You might want
to use it facing downwards if that is an issue. Also, if you are wearing a
watch or a ring, it might get confused by the reflection.

------
jecs321
gorilla arm: [http://catb.org/jargon/html/G/gorilla-arm.html](http://catb.org/jargon/html/G/gorilla-arm.html)

------
tsenkov
_Disclaimer: I haven't tried Leap Motion, yet._

Did I get this right? Leap Motion vs. Kinect:

    
    
      - LM is smaller (significantly);
      - LM is cheaper (significantly);
      - LM is more accurate (significantly);
      - LM has almost no real apps (mostly concept demos).
    

If these are all correct I find Scott's post nothing more than a "normal",
"the competition sucks, too", Microsoft type of post.

~~~
AndrewDucker
Very different markets.

The LM is very short range, so you couldn't use it like a Kinect. And you
wouldn't plug a Kinect into your PC to watch your fingers move either.

------
sytelus
What surprised me after a little digging was that the Leap Motion does not
support point clouds. That means you can't get the 3D world as points in space
from the Leap Motion. Its founders say the Leap Motion isn't designed for this
purpose, which means you can't use it for applications such as 3D scanning.
Personally I think that would be much more exciting than the ability to move
windows by waving.

------
nwh
Is all writing seriously boiling down to animated GIFs of "reactions"?

~~~
simias
I had the same reaction after seeing the first two gifs but the rest are
mostly demonstrative and to the point.

On a side note, it's quite amazing that gifs are still the simplest way to
show short video clips on the web.

~~~
nwh
They're really not. Embedded video is a fraction of the size, and faster to
render to boot.

~~~
knowaveragejoe
Is that true? What kind of encoding are we talking about?

~~~
nwh
[https://mediacru.sh/](https://mediacru.sh/) was on the front page a day or so
ago; it grabs GIFs and re-encodes them into video (MP4 and Ogg). The results
are much smaller and, in my local experimentation, much, much faster to
render than the original files.

There's some technical details at
[https://mediacru.sh/demo](https://mediacru.sh/demo) .

Draw your own conclusions, but that's how I have experienced it.

~~~
jodrellblank
The gif loaded and played seamlessly. The video was a placeholder which loaded
a full screen QuickTime window that delayed for 5-10 seconds then played.

Claim " _gifs are still the simplest way to show short video clips on the
web._ " agreed.

~~~
nwh
I'm assuming you're using Chrome. Chrome delays the loading of <video> tags
until after <image> ones. It's mentioned on the page I linked to.

~~~
jodrellblank
No, mobile Safari.

------
rhema
The biggest problem with the Leap that I have seen (there is one in my lab) is
that it sees hands spread out evenly on the surface really well, and that's
about it. If you do a thumbs-up gesture (or a rude gesture when it doesn't
work), the fingers get occluded by the bottom of your hand.

Think of your hands as five friends trying to play connect at the same time,
and you can imagine the kinds of occlusion problems you might face.

Still, I, and probably others, like the Leap. It's not useless. You just have
to exploit it the right way, looking for natural interface design beyond a Tom
Cruise movie.

The biggest free-air interaction problems are (1) making visible what the
available gestures are, and (2) providing tangible or visible feedback. You
don't get to see and feel the interaction like you can with a keyboard or
less digitally inclined tools.

~~~
lnanek2
Maybe a camera is just the way to go then. There's Kinect, of course, but
Intel is getting into the act as well:
[http://click.intel.com/intelsdk/Intel_Developer_Kit-0-C92.as...](http://click.intel.com/intelsdk/Intel_Developer_Kit-0-C92.aspx)

Video demos here: [http://software.intel.com/en-us/vcsource/tools/perceptual-
co...](http://software.intel.com/en-us/vcsource/tools/perceptual-computing-
sdk)

------
deanclatworthy
I've been experimenting with the LeapMotion at work today, and some initial
observations:

\- There are no apps yet that have made me go wow.

\- The range is quite small.

\- The motion of hovering an arm in front of you is extremely tiring after
more than 10-15 minutes. Try holding your arm out in front of you for that
long without moving and you'll see why.

The reason Kinect was a success is that you can take real-world activities
such as dancing, jumping over obstacles, jogging (on the spot), and translate
them into an interactive digital version.

With the Leap, I've yet to think of a real-world scenario where I would be
waving my hands in front of my chest that would translate well into a digital
experience. Conducting an orchestra would be one good application for this,
perhaps training conductors, but I couldn't think of anything else.

------
pbreit
I think the "useless" comment misses that its sweet spot utility is unknown at
this point. And I'm not sure "general purpose computer input device" is going
to be such a sweet spot.

The precision "problem" can obviously be addressed with software.

That said, I do believe the absence of killer utility is a problem for Leap
right now, since it came out with a decent bang and now the less than
favorable reviews are trickling in. I think they would have done themselves a
significant
favor by having a killer app ready from the outset. I also think they need to
encourage people to look beyond simple human-computer interaction. Apparently
these things could map a whole football game or count the number of people at
a concert. Things like that. I also think the commercial angles will be better
for business.

~~~
lvs
The onus was on the company to figure out what its "sweet spot utility" was
before launch, don't you think?

~~~
pbreit
Yep: "I think they would have done themselves a significant favor by having a
killer app ready from the outset."

~~~
lvs
Sorry -- yes, chiming in with agreement.

------
egypturnash
I've got my Leap doing what I bought it for: controlling iTunes when my hands
are covered in hair dye. [1]

If I can get it to do more, awesome. But that's enough for now.

1:
[http://www.youtube.com/watch?v=_O6sR0PKofc](http://www.youtube.com/watch?v=_O6sR0PKofc)

~~~
Domenic_S
Then you got ripped off!

[https://flutterapp.com/](https://flutterapp.com/)

~~~
egypturnash
When it's at my desk, my laptop is connected to an external monitor and
keyboard; it runs closed. Flutter is of no use to me.

~~~
Domenic_S
Why does it run closed? It doesn't have to.

~~~
egypturnash
Short answer: I like it that way.

Long answer: When it's plugged into the external monitor/keyboard/wacom
tablet/leap/etc, the laptop is on a shelf beneath the main desk surface. It's
completely out of view when I'm standing there working. I've got a 24" monitor
on an adjustable arm; I don't _need_ a second screen. Or want one; if I have
two screens next to each other I inevitably go crazy if they're not exactly
the same color profile.

(Photo of my setup:
[http://egypt.urnash.com/media/blogs.dir/1/files/2012/08/desk...](http://egypt.urnash.com/media/blogs.dir/1/files/2012/08/desk-porn.jpg))

------
mattdanger
I played with a LeapMotion last week and was impressed by it but within 5
minutes my arm was tired.

~~~
chiph
The local Best Buy had one set up & I played with it for a few minutes.
_Really interesting_, but not all that applicable to how I use a computer on a
daily basis.

I think its greatest potential will be in gaming. Imagine casting a spell by
using the appropriate arcane hand gestures. Or swinging a sword - it can tell
the difference between an overhand and a side cut, and a blocking move.

~~~
gdulli
> I think its greatest potential will be in gaming. Imagine casting a spell
> by using the appropriate arcane hand gestures. Or swinging a sword - it can
> tell the difference between an overhand and a side cut, and a blocking move.

I've had a Wii, and a Kinect, and gesturing as input got annoying very quickly
with both. It's something that just sounds fun, but isn't.

Even with the Nintendo DS, when you'd use a stylus to trace a certain shape to
trigger an action, it didn't translate well to sustained gameplay. Though it
remained workable longer than with Wii/Kinect gaming. Simple swipes with a
stylus or finger (on a screen) continue to work well, whereas even simple arm
or hand gestures quickly become annoying and fatiguing.

It turns out that pressing buttons works just fine, and is actually ideal,
even if other input methods sound sexier.

------
hoffcoder
I think the newer Kinect 2.0 will be able to take care of all these more
sophisticated gestures. Compared to the Leap Motion it has much better range,
and its accuracy will improve now too.

------
vermontdevil
Four students at RIT are using this to develop a sign-to-text app.

[http://motionsavvy.com/](http://motionsavvy.com/)

So it does have potential/use.

~~~
rpwilcox
Fascinating. What's annoying about computer/video-based sign language
instruction is the lack of feedback. "Did I get this sign right, or do I just
_think_ I have it right?"

If MotionSavvy can notify me when I get it wrong, very nice!

------
mdaniel
I also have one, and after seeing what it sees during the orientation I
wondered how many of its problems could be cured by having a second sensor
that one could position about a foot away from the other to give the Leap
stereo vision. I am speaking totally out of school because I don't know if
that would create more problems than it solves, though.

On my phone, so apologies if this is a duplicate of another thread.

------
JamesCRR
Nice example of someone adding a leap motion controller to their site,
GoSquared: [https://www.gosquared.com/blog/playing-around-with-the-
new-l...](https://www.gosquared.com/blog/playing-around-with-the-new-leap-
motion-controller-on-the-web) I'm not going to argue this demonstrates
utility, but it does show some good recognition of smaller gestures.

------
zero_intp
I think a ring to click might be the best answer. Clicking (for me) works
better with a tactile response.

------
wesley
Looks like a much better alternative just revealed itself:
[http://www.kickstarter.com/projects/haptix/haptix-
multitouch...](http://www.kickstarter.com/projects/haptix/haptix-multitouch-
reinvented)

------
tocomment
I think combining a Kinect and several Leaps with an Oculus Rift might be
great. That way you can use your own body motions within the Rift. The Kinect
would be for gross motions and the Leaps would be for detecting fine motions.

What do you guys think?

~~~
replax
Well, as the Kinect is already capable of tracking your fingers, I am not sure
the Leap would be very useful. Rather a slightly better Kinect..

~~~
VLM
"kinect already is capable of tracking your fingers"

I will nitpick the word choice: "capable" is nearly useless in a user
interface. It needs to be nearly 100% reliable or it's useless. 99% of my
interaction with the Kinect is my daughter crying that she can't navigate
menus in her dance games and is all frustrated, followed by me being all
frustrated and swearing about how if only I could bypass this POS and use the
buttons on the controller I would have been done twenty seconds ago, and I
hate Kinect with a passion. It works fine for gross motor control, like my
daughter's dance games, but is useless for fine motor control. Perhaps in the
future I will write SQL statements by performing an interpretive dance at
work, but I hope not.

The failure rate is vital... If I'm typing this at 100 WPM, which is probably
about right, then a 99% motion detection success rate means I'd swear and hate
motion detection and have to stop and fix an error, what, every six seconds or
so? All day long? Forget that, I'm sticking to the keyboard and mouse, I don't
have the patience for 99% success.
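For what it's worth, the back-of-envelope arithmetic behind that estimate (assuming the usual 5-characters-per-word typing convention) lands in the same ballpark:

```python
# Rough check of the 100 WPM / 99% recognition scenario above.
# The 5-characters-per-word figure is the standard WPM convention, assumed here.
wpm = 100
gestures_per_sec = wpm * 5 / 60          # ~8.3 "keystroke" gestures per second
error_rate = 0.01                        # 99% recognition success
seconds_per_error = 1 / (gestures_per_sec * error_rate)
print(seconds_per_error)                 # ~12 s between failed gestures
```

One failure every ten-odd seconds, all day long, is exactly the "stop and fix an error" grind described above.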

~~~
hoffcoder
The newer Kinect 2.0 will be able to do almost perfect finger tracking. In
fact, sign language recognition apps have been written for even the older
Kinect (Xbox 360 version). Check out the SigmaNIL framework and the FORTH
libraries offered with OpenNI.

~~~
Qworg
You wouldn't see this in an Xbox game though - the Xbox SDK doesn't support
it.

~~~
hoffcoder
I think that the SDK is way ahead of OpenNI. Soon the more enhanced features
of the new Kinect will make their way into games too!

------
melling
Can someone build something like this but have it recognize only a small set
of simple gestures? Using it simply to browse the web would be cool. If you
could throw in a way to click Build in Xcode and IntelliJ, then I'm sold.

------
mrbill
This is exactly my experience with the Leap unit; I played with it for a
couple of days, then uninstalled all the drivers and put it back in the box.
I'll try again in a few months to see if things have improved.

------
iekadou
Leap Motion – Just a toy or the future? -
[http://tech.particulate.me/](http://tech.particulate.me/)

------
JabavuAdams
Question: does the LEAP need to be flat on the table, or could you hang it
around your neck to get mobile gesture input in a vertical plane?

------
nikunjk
Touchless on Airspace sucks. Try BetterTouchTool. The gesture tracking and
usability of that app is incredible.

------
colmvp
Well jeez, let's not even bother then. Wendell, shut off the generator, Scott
thinks we're useless.

------
nicenemo
Can it be that I saw a prototype at the ICT Delta conference in May 2007 in
Utrecht, the Netherlands?

------
JoeAltmaier
Thank you for the most hilarity I'm likely to encounter today! That's a great
article.

------
seivan
"Hey so LeapMotion is still being developed, I better hurry to write a blog
post and complain about it before they improve it".

~~~
davedx
If it's still being developed why is it on retail for $80?

Most people make sure their product is useful and working before selling it to
people.

If you bought a car and the brakes _mostly_ worked, would you be happy?

~~~
oscilloscope
If the year was 1896, I probably would be happy. Brakes were far less
functional and reliable in the first 20 years of the automobile's history.

