
Kinect finally fulfills its Minority Report destiny - shawndumas
http://www.engadget.com/2010/12/09/kinect-finally-fulfills-its-minority-report-destiny-video/
======
shawndumas
gorilla arm: n.

The side-effect that destroyed touch-screens as a mainstream input technology
despite a promising start in the early 1980s. It seems the designers of all
those spiffy touch-menu systems failed to notice that humans aren't designed
to hold their arms in front of their faces making small motions. After more
than a very few selections, the arm begins to feel sore, cramped, and
oversized — the operator looks like a gorilla while using the touch screen and
feels like one afterwards. This is now considered a classic cautionary tale to
human-factors designers; “Remember the gorilla arm!” is shorthand for “How is
this going to fly in real use?”.[1]

----

[1]: <http://ftp.sunet.se/jargon/html/G/gorilla-arm.html>

~~~
Cushman
_humans aren't designed to hold their arms in front of their faces making
small motions_

Well, humans aren't designed, but we _did_ evolve holding our arms in front of
our faces making small motions.

<http://en.wikipedia.org/wiki/List_of_gestures>

I know it's hard to remember around here, but it's not just a UI buzzword.

(Which is why I see promise in this tech; using touchscreens is unnatural, but
using gestures in front of your body to communicate with something a couple of
feet away is quite human.)

~~~
shawndumas
the phrase, "for 8hrs a day" is implied...

~~~
Cushman
Of course, that's a valid point. But again, lots of (perhaps most?) people
engage in conversation for several hours per day, and a lot of that time is
spent gesturing, mostly unconsciously.

I'm not envisioning it as a primary input device, naturally (although the deaf
seem to get by fine). But I don't think I've ever been in a situation where a
heated conversation made my _arms_ tired; as a replacement for a few basic
mouse tasks, I bet it could even improve ergonomics.

Just try it: raising your arms as if gesturing conversationally is way, _way_
less stressful than reaching out to touch your monitor.

~~~
hugh3
Right now I'm sitting back in my chair, with my elbows on my chair's armrests.
With comfortable armrests I reckon I could gesture all day. If I had a hundred
odd inches of monitor surrounding me like in Minority Report it might be a
reasonable way to control things -- easier than moving a mouse pointer all the
way from one side of a vast piece of screen real estate to the other.

------
roadnottaken
Boy, if my computer-use consisted exclusively of browsing snapshots and
rotating them, this would be AWESOME.

~~~
moe
Don't be afraid, it's not going to replace your mouse and keyboard. It's going
to _augment_ them.

That means you can just work normally on your computer, until that moment
comes where you want to sit back comfortably and... browse snapshots and
rotate them.

I predict there will also be much more interesting input assistance beyond
photo browsing. I'm using a Magic Trackpad in addition to mouse/keyboard for a
bunch of gestures (window management). However, the number of distinct
gestures that can be performed on a flat surface is fairly limited.

I'll gladly take the additional gestures that a Kinect-style device will give
me. It could very well revolutionize the way we interact with window managers
_without_ forcing us to grow gorilla arms.

------
elblanco
It's cool, but I'm pretty sure I can flick through hundreds of photos much
more efficiently with my mouse and keyboard (and without the tired shoulders).

I think gesture-based interfaces may have some quick utility, like remotely
operating equipment for small sequences of activities (gestures operating an
Asimo come to mind, since it doesn't yet have a real input system like a mouse
and keyboard: come here, go there, stop, move aside, etc.), but I've yet to
see a demo of this interface method, in this context, that isn't wildly less
efficient than existing input methods.

~~~
rue
More efficiently than the example or in "Minority Report"? Sure, even if a
less-sophisticated computer user (such as a cop) might not.

You cannot, however, use a mouse + keyboard faster than performing very small
gestures mainly with fingers (as in, raise your hands from the keyboard and
perform similar gestures).

~~~
elblanco
I can hit the right arrow key on my keyboard hundreds of times a minute.
Combined with a mousewheel for some spot checking and zooming, it only slows
that down a hair.

I'm an amateur photographer and after a week of taking photos I have thousands
of photos I need to vet. This kind of gesture-based interface would take me
until the universe grows cold to go through a single week's worth of shots.

> Sure, even if a less-sophisticated computer user (such as a cop) might not.

I'm not sure how learning a set of rather specific gestures is less
complicated for a novice user than hitting the right arrow key or the spacebar
a bunch of times (or clicking a button labeled "next photo"). I've tried to
use gesture-based systems a number of times in the past and have always ended
up turning the feature off, mostly because I couldn't remember which of a
dozen gestures meant the particular thing I wanted to do.

The idea is a nice one: use something that humans do all the time to build an
intuitive interface. But the current state of the art doesn't understand
normal human gestures very well. Why do two fingers held vertically mean "grab
this" and not some other random gesture? In the end, the problem is that we're
trying to use arbitrary physical gestures, intended for a physical,
three-dimensional world, to interact with virtual objects in a two-dimensional
window, in ways that may not have any particular meaning to us as normal
gesturing (let alone culturally specific gestures; the number of different
ways people perform fairly universal gestures, like pointing at something, is
mind-boggling).

I'm not saying there's no value in this work; I just think the applications
these interfaces are currently being designed for simply don't make sense.

Somebody else here mentioned that it might make a great deal of sense in 3D
modeling, since mouse-and-keyboard combinations are rather clumsy in that
application. I happen to think using gestures as a control mechanism for
robotics or machine operation might make more sense (imagine controlling a
crane remotely by making the appropriate hand-open/close and arm-up/down
gestures).

~~~
rue
I agree with you that the composer-style arm flailing is not great (even if
intuitive to non-techies), but I do consider smaller, closer-proximity
gestures a perfectly viable direction of research.

> _I'm not sure how learning a set of rather specific gestures is less
> complicated to a novice user than hitting the right arrow key or the
> spacebar a bunch of times (or clicking on a button labeled "next photo")._

They are just not as good with traditional computer input methods, negating
the advantage that you or I would gain there.

The point about 3D is very good, and something that can be workable even with
2D displays (in a "3D" environment, like your average FPS). Rotating, pulling
things from "behind" something else etc. For certain categories of use, it is
probably better than voice commands - another natural UI for humans - in the
same way some of us still prefer the CLI over a GUI.

The current technology obviously is not quite there, but it is already
possible to implement a virtual keyboard by observing finger movements.
Combine that with gestures performed above the virtual surface, and you might
have something.

------
mitko
The other day I was walking up the stairs in CSAIL, and through the windows of
the conference room I saw the guy filming this. I was late to my meeting (in
the room above this one). "The future is here," I thought.

Even if there are some bugs and issues with the interface now, imagine that in
just a few years this will be mainstream :)

------
xentronium
All we need now is a more responsive kinect.

As a proof of concept this kind of interface is fine but for real use it's
slow and unresponsive.

~~~
rodh
Isn't the latency experienced with the games caused by software (the
algorithms running on the Xbox, not the Kinect hardware itself)?

~~~
m_eiman
According to the web, "The depth sensor uses a monochrome image sensor.
Looking at the signals from the sensor, resolution appears to be 1200x960
pixels at a framerate of 30Hz." It would perhaps be better to be able to run
it at a lower resolution and a higher framerate; maybe it's possible.
Processing 35M pixels per second does sound like something that could use up a
CPU cycle or two. It also sounds like something that OpenCL would be good at
handling, doesn't it?

<http://openkinect.org/wiki/Hardware_info>
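
The quoted throughput checks out, by the way. A quick sanity check of the
arithmetic, assuming the openkinect.org figures above (1200x960 monochrome
depth frames at 30Hz):

```python
# Back-of-the-envelope check of the depth sensor's pixel throughput.
# Assumed figures, taken from the openkinect.org numbers quoted above.
width, height, fps = 1200, 960, 30

pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} M pixels/s")  # 34.6 M, i.e. the ~35M figure
```

That's 34.6 million pixels per second before any skeletal tracking even
starts, so it's easy to see why people point at OpenCL or the GPU for the
heavy lifting.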

~~~
rodh
My guess would be that the high resolution is required to interpret the IR
pattern at the maximum distance supported by the Kinect (tens of feet). I'm
wondering whether the 30Hz is a limitation of the camera sensor or of the
pattern matching.

------
powrtoch
It's interesting how 3D, holographic interfaces have been in so many sci-fi
movies etc., yet they all seem to get referred to as the "Minority Report"
interface.

Relevant:
[http://tvtropes.org/pmwiki/pmwiki.php/Main/HolographicTermin...](http://tvtropes.org/pmwiki/pmwiki.php/Main/HolographicTerminal)

~~~
Cushman
It's good to look out for the little guy, but I think this is misplaced. The
Minority Report interface was neither 3D nor holographic; it was just
projected onto glass. What it did have was a UI that used individual
fingertips as control points without touching the screen. So this system, for
example, is pretty specifically a Minority Report interface.

I don't know for a fact that Minority Report was the first conceptualization
of that specific interface, though.

------
Cushman
Obviously it's still a bit choppy, but it's clear this is only a few software
improvements away from being a very usable peripheral.

Just from waggling my hands around a little in front of my monitor, I feel
like the touchscreen fatigue Apple made a big deal of isn't as much of an
issue, since you're holding your arms upright, which has much more structural
support than having your arm outstretched to touch the monitor. It actually
feels like a very natural way to control virtual desktops or application
switching. Additionally, this is a peripheral that can be added to any display
of any size, or even used across multiple displays.

Is there a reason this tech hasn't gotten traction as a PC peripheral before
Kinect?

~~~
shin_lao
Because using a mouse is faster and more convenient perhaps?

~~~
Cushman
A mouse is faster and more convenient for _some tasks_, sure. You wouldn't
use a mouse to replace keyboard input, though, or vice versa. Different
interfaces for different tasks.

Is it really so hard to imagine a task for which gesturing is appropriate?

~~~
roadnottaken
_"Is it really so hard to imagine a task for which gesturing is appropriate?"_

Yes. Can you think of some examples? I can't think of a single task I'd like
to do _on a desktop computer_ that wouldn't be easier and more-precise with a
mouse. You know what's easier than pinch-zoom? A scroll-wheel. Multi-touch is
great for phones/tablets, but this seems pretty silly to me as an interface
for, y'know, computing.

~~~
IChrisI
I browse the web a lot on my desktop, and I find myself missing my laptop's
two-fingers-and-swipe-down scroll.

~~~
ollysb
Perhaps you want one of these apple trackpads in your stocking
[http://store.apple.com/us/product/MC380LL/A?fnode=MTY1NDA1Mg...](http://store.apple.com/us/product/MC380LL/A?fnode=MTY1NDA1Mg&mco=MTg1ODE3MDE)

------
snes
Movies like Minority Report can now be made for $150 instead of with expensive
CGI.

I could see this on a desk, like MS Surface.

------
ck2
Now the CPU needs to be about 3x faster so there's no perceptible lag (it's
pretty obvious right now).

Very nice of Microsoft to back down on the anti-hacking threats, this is great
for everyone.

I wonder how long until someone hacks it to help certain disabilities.

------
nlavezzo
Looks interesting, and maybe something that's actually useful will come of it,
but I don't think I'll be trading in my mouse anytime soon.

------
jbillingsley
Add a really awesome voice control interface to this and I could see it being
nice in a consumer context.

------
baddox
I like the part of the video at 0:30 where my arms would be completely worn
out.

------
WiseWeasel
Looks like it might be a nice iPad app.

------
mohsen
getting closer, but not there yet.

