

Augmented Reality 3D Video on iPad with Kinect - iqram
http://www.youtube.com/watch?v=R8tiHXDiqsw
This is an awesome way to demonstrate what can be done with the Kinect and the String Augmented Reality SDK on the iPad.
======
rkalla
So damn clever... releasing an SDK for the Kinect[1] was one of the smartest
things Microsoft could have done, and I think it has gone a long way toward
the Kinect becoming the fastest-selling piece of hardware in history[2]
(did anyone know that? I sure didn't, and barely believed it when I read it).

This reminds me a lot of what Johnny Lee did with the Wii soon after it was
released, with head-tracking 3D[3].

Part of me wants to stop using a computer and take up beet farming, because
what I do seems so much less cool than what these guys are doing.

Anyone on HN actively toying with a Kinect and want to share some video?

[1] <http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/>

[2] <http://www.vgchartz.com/article/83375/kinect-is-the-fastest-selling-videogame-hardware-of-all-time/>

[3] <http://www.youtube.com/watch?v=Jd3-eiid-Uw>

~~~
hammerdr
I have no video, but I was able to get a WebGL-powered game world running that
was controlled through a Kinect using DepthJS. This was done in just over a
day of playing around. It was pretty cool when it worked.

The source code is available on GitHub at
<https://github.com/thoughtworks/kinect-spike>

Edit: My co-workers were looking at me with concerned looks that day. I would
stand up and wave my arms for 30 seconds and then sit back down at random
times throughout the day. :)

~~~
rkalla
Very impressive, Derek -- how easy was it to work with the Kinect data via the
API? Is the information coming out of the sensors pretty straightforward, or
does it just come in as a video feed with additional (positional?) metadata?

~~~
hammerdr
DepthJS works through three technologies: libfreenect[1], OpenCV[2], and a
simple Tornado (push) server. libfreenect grabs all of the raw data from the
Kinect device, OpenCV acts as the 'interpreter', and the server pushes the
results along, so the browser just receives events such as 'move'.

[1] <http://openkinect.org/wiki/Main_Page>

[2] <http://opencv.willowgarage.com/wiki/>
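To make the pipeline concrete, here is a minimal, hypothetical sketch of the "interpreter" step. In the real project libfreenect supplies the depth frames and OpenCV does the vision work; here a synthetic NumPy array stands in for both, and the function and field names (`hand_centroid`, `move_event`, `dx`/`dy`) are illustrative, not DepthJS's actual API.

```python
# Hedged sketch of a DepthJS-style gesture pipeline: threshold a depth
# frame to find a "hand", then turn centroid motion into a 'move' event
# like the one a push server would forward to the browser.
import numpy as np

def hand_centroid(depth, near_mm=600, far_mm=900):
    """Return the (x, y) centroid of pixels in the 'hand' depth band, or None."""
    mask = (depth > near_mm) & (depth < far_mm)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def move_event(prev, curr, threshold=5.0):
    """Translate centroid motion between two frames into a 'move' event dict."""
    if prev is None or curr is None:
        return None
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return None  # ignore jitter below the threshold
    return {"type": "move", "dx": dx, "dy": dy}

# Two fake 480x640 depth frames (values in mm): a hand-sized patch at
# ~750mm moves 60 pixels to the right between frames.
frame1 = np.full((480, 640), 2000)
frame1[200:240, 100:140] = 750
frame2 = np.full((480, 640), 2000)
frame2[200:240, 160:200] = 750

event = move_event(hand_centroid(frame1), hand_centroid(frame2))
print(event)  # {'type': 'move', 'dx': 60.0, 'dy': 0.0}
```

The real system does considerably more (blob tracking, gesture classification), but the shape is the same: depth frame in, small JSON-able event out.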

------
KingOfB
Here's the launch video for the String SDK. It looks pretty impressive, much
smoother than many other implementations I've seen.

<http://www.poweredbystring.com/official-launch>

There's a great demo about 1:30 into the launch video.

~~~
rkalla
It's hard to tell how it performs in real life (as opposed to a launch video),
but that being said, the demos of it in action were impressive.

I wonder if this is the form that the Holodeck-style 3D technology we've all
been waiting for since the '70s will take.

AR of your living room mixed with a glassesless 3D TV[1] and some motion
controls. Very exciting stuff.

[1] <http://3dradar.techradar.com/3d-tech/52-inch-glasses-free-3d-tv-hits-japan-23-05-2011>

------
chopsueyar
Isn't the new Nintendo tablet doing something like this?

Also, why wasn't the piece of paper on the table showing up, only the books?

~~~
dstein
Yeah, it looks like the "augmented reality" part was faked. The angle of the
video never quite matched up with where he was holding it, either. In any case,
it's still a nice proof of concept.

~~~
chopsueyar
I found the Nintendo E3 Wii U demo.

It gets interesting around the 2:40 mark (though the whole thing is pretty
impressive)...

<http://www.youtube.com/watch?v=DowHrWluHxg>

------
kposehn
Very, very cool. I want to use this with my own app!

