

Luma updates its video camera app for iOS, adding cinematic image stabilization - alexkcd
http://thenextweb.com/apps/2013/02/25/luma-improves-its-super-smooth-image-stabilization-for-shooting-video-on-ios/

======
cosbynator
I'm completely biased in this (I know the founders), but this is the real
deal. I've recorded shaky, all-over-the-place video while hand-holding my
iPhone on a bike, and it came out like this: <http://luma.io/v/B2->

It is pretty neat stuff.

~~~
samstave
That's really smooth. Installed the app after viewing that.

I'd love to see a side-by-side as well though.

~~~
alexkcd
Here you go: <http://luma.io/v/CIt> (thanks to cosbynator for letting me make
this)

~~~
samstave
That's awesome. (It is also awesome that when you right-click and "Show video
in new tab" it simply downloads the MP4!)

I hope that Google Glass has this level of stabilization (or that Google buys
this for that purpose).

------
networked
Digital image stabilization is a fascinating subject. On one hand, the core
idea is simple (average out the motion over time in some way, then pan and
zoom the image to compensate for deviations from the average); on the other,
it's hard to get right, because beyond random jitter, some camera motion,
even rapid motion, can be exactly what the camera operator wants.

Here's an example of an actual algorithm used for digital IS:
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.148...](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.148.6856&rep=rep1&type=pdf).
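
A minimal sketch of that core idea, assuming per-frame translation estimates
already exist (from optical flow, block matching, or gyro data, none of which
are shown here): integrate them into a camera path, low-pass the path with a
moving average, and counter-shift each frame by the difference. The window
size and the crop step are placeholders.

    # Toy digital-IS loop: smooth the camera path, then counter-shift frames.
    # `offsets` are assumed frame-to-frame (dx, dy) estimates from elsewhere.
    def stabilization_corrections(offsets, window=15):
        path, x, y = [], 0.0, 0.0
        for dx, dy in offsets:                 # integrate into absolute path
            x, y = x + dx, y + dy
            path.append((x, y))
        corrections = []
        for i in range(len(path)):
            lo, hi = max(0, i - window), min(len(path), i + window + 1)
            avg_x = sum(p[0] for p in path[lo:hi]) / (hi - lo)
            avg_y = sum(p[1] for p in path[lo:hi]) / (hi - lo)
            # shift the crop window so the frame follows the averaged path
            corrections.append((avg_x - path[i][0], avg_y - path[i][1]))
        return corrections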

------
rubberbandage
I’ve used this app since it was called SteadiCam (before that name was pulled
for copyright reasons, of course), and there’s really nothing else like it
around. Long ago I thought of applying 6-axis gyroscope readings in reverse
to video, but this dev has done that and more, better than I ever could. It’s
a fantastic application of amazing technology and crazy math. Major kudos!

------
codex
If you're clever, you can do this kind of postprocessing without a big CPU hit
on mobile phones. Phones compress video in hardware, and as part of that
process the hardware looks for blocks of video that are roughly the same
across frames. When it finds one, it attaches a pointer indicating where it
should go in the next frame.

Taken in aggregate, all of these pointers in the compressed data stream
effectively show you which way the image "shook" relative to the previous
image (and how far it shook), saving you the CPU cycles of determining this
yourself.
You can even detect rotation. So all you need to do now is compensate for the
shake by rewriting the compressed stream, "panning" (and perhaps rotating) in
the opposite direction of the shake. In order to have room to pan, you need to
emit a smaller rectangle than the original video.

This isn't sufficient for advanced stabilization, but it's a quick first pass.
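
A rough sketch of that first pass, assuming the per-macroblock motion vectors
have already been pulled out of the bitstream (the extraction itself and the
stream rewriting are not shown):

    from statistics import median

    # One frame's macroblock motion vectors -> a single global shake estimate.
    # The median is reasonably robust to a moving subject in part of the frame.
    def global_shake(motion_vectors):
        return (median(dx for dx, _ in motion_vectors),
                median(dy for _, dy in motion_vectors))

    # Pan the smaller output rectangle against the shake, clamped to the
    # margin reserved around it (margin pixels on each side of the crop).
    def next_crop_origin(origin, shake, margin):
        ox, oy = origin
        sx, sy = shake
        return (min(max(ox - sx, 0), 2 * margin),
                min(max(oy - sy, 0), 2 * margin))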

I took a stab at writing this for the iPhone in 2010, but by that time the
writing was clearly on the wall: Apple was soon going to offer this
functionality in hardware (and they do, on the iPhone 4S and 5), and they
would only do a better job with every phone refresh. The only way one can hope
to compete is to perform global optimizations across the entire video clip
that the hardware encoder can't do (e.g. dynamic programming), or else apply
fancier transforms that are so CPU-intensive they kill the mobile experience.
Good job on the part of the developers; the video looks great. As iPhone GPUs
get more powerful, stabilization algorithms will only get better.

One business angle here is to give the app away for free and charge a dollar
per video to deshake clips as a web service in the cloud.

~~~
mamp
The app is great.

It seems to take the approach of a post-processing frame shift rather than
real-time stabilisation. In other words, it lets the phone capture and
compress using hardware, but keeps the sensor data separately and shifts the
x,y of the video to stabilise. Unlike Final Cut etc., which has to analyse
each frame, Luma has the raw sensor data from when the video was being shot.
So I'm guessing the battery hit comes on export, when the video has to be
re-rendered.

I think this is the method because the Luma video appears more zoomed in than
the original, i.e. they leave some margin around the edges so the shifted
video always fills the frame. My guess is this means the Luma output will be
lower-res than the original hardware-captured video.
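
If that guess is right, the per-frame shift could come straight from the gyro
trace instead of pixel analysis. A sketch under that assumption (the field of
view, timestamps, and rolling-shutter handling are all glossed over, and none
of this is Luma's actual pipeline):

    import math

    # Focal length in pixels, derived from an assumed horizontal field of view.
    def focal_length_px(width_px, hfov_radians):
        return (width_px / 2) / math.tan(hfov_radians / 2)

    # Map one frame's angular motion (rad/s over dt seconds) to a pixel shift:
    # yaw moves the image horizontally, pitch moves it vertically.
    def pixel_shift(pitch_rate, yaw_rate, dt, focal_px):
        return (focal_px * math.tan(yaw_rate * dt),
                focal_px * math.tan(pitch_rate * dt))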

Still, if it avoids walking around with your phone in a steadicam rig, it's a
huge win. I have a Steadicam Smoothee, but the ridicule from friends and
family means I don't use it; the slightly lower-res output with Luma is the
price my family will pay for their insensitive mockery!

~~~
alexkcd
Luma applies stabilization on playback. We keep all the pixels we get from the
camera, even when you upload to our cloud service. As we improve our
algorithms your existing videos will continue to improve with time. There's
much more that we can do to help you create stunning videos. We're just
getting started :)

~~~
codex
It sounds like you're applying a post-decode linear transformation to the
data,
but, if so, the video would only appear stabilized in your own custom player.
Does MPEG-4 allow one to specify per-frame linear transformations beyond
simple ROI? I didn't think so, but if so, that's pretty cool. Or perhaps you
keep the original data around server-side, but always stream a corrected
version to anyone who asks to play the video.

Either way, your videos look great. I love the fact that you also compensate
for rotation, which is expensive even when you have macroblock vectors. I hope
you can stay ahead of Apple (iPhone, iMovie, and FCP) and Android in this
space. Best of luck. May you become the Instagram of online video!

------
hellopat
Very very cool stuff. I downloaded the app and took this (not the ideal stress
test, but the panning is incredibly smooth): <http://luma.io/v/CId>

~~~
alexkcd
(Cofounder here.) That's our motion retargeting algorithm at work. It'll fit
the smoothest motion to your recording. In this case a constant panning motion
does the trick. You can get perfectly stationary shots, motorized-tripod-like
panning shots, and dollies with Luma, without any equipment other than your
hands.
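
A toy illustration of retargeting (as opposed to just damping): fit the
simplest motion that explains the recorded camera path, here a constant
velocity pan via a least-squares line, and move every frame onto it. Luma's
actual optimization is certainly more sophisticated than this.

    # Fit x(t) = a + b*t through the horizontal camera path by least squares,
    # then return the per-frame corrections that put the video on that line.
    def retarget_to_constant_pan(xs):
        n = len(xs)
        if n < 2:
            return [0.0] * n
        t_mean = (n - 1) / 2
        x_mean = sum(xs) / n
        denom = sum((t - t_mean) ** 2 for t in range(n))
        slope = sum((t - t_mean) * (x - x_mean)
                    for t, x in enumerate(xs)) / denom
        return [x_mean + slope * (t - t_mean) - x for t, x in enumerate(xs)]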

~~~
tuxidomasx
Is this a proprietary algorithm? Can desktop video editing software or
plugins do it this well? I would love to have this as a quick drop-in for
Premiere or After Effects.

Also, are there any plans for motion estimation for achieving super-slow
motion, like Twixtor or Timewarp? That would be a neat effect for mobile
video.

~~~
alexkcd
We developed this in-house. We use additional sensor data and fairly advanced
optimization algorithms to produce these videos (all done in real time on
your mobile phone!). Without that sensor data, and with subpar camera models,
you won't get results of this quality from desktop video editing software
like Final Cut.

------
eclipxe
This is really impressive. I'd love to see a similar Android product.

~~~
HorizonXP
Me too!

------
kybernetyk
Hmm, the comparison videos are interesting. The one with IS turned on reminds
me somehow of a first-person shooter video game, while the other video feels
more 'real'.

Maybe game developers could introduce more shake into their games to make them
feel more real?

Other than that: The tech is great. The videos certainly get a cinematic
feeling with that kind of image stabilization.

~~~
tuxidomasx
Shaky cams in video games can easily become a hindrance to a good experience.
For example, Kane & Lynch 2 was criticized for having a shaky cam that made
gameplay more difficult. It's definitely a balancing act to get the right
amount of shake.

~~~
joshschreuder
Plus, anecdotally, it makes viewers feel 'dizzy'. See criticisms of found
footage movies like Cloverfield.

------
koudelka
How different is this to AVFoundation's stabilization?

There's an example of it in action here, but it requires a login:
[https://developer.apple.com/videos/wwdc/2012/?include=520#52...](https://developer.apple.com/videos/wwdc/2012/?include=520#520)

~~~
alexkcd
(Cofounder here.) Ours is better: <http://luma.io/v/CJC>

You can think of AVFoundation's as dampening hand shake. We retarget the
motion to be along the smoothest path possible, so we can often eliminate
hand shake completely. Luma supports both native and cinematic stabilization
modes.

------
joshschreuder
Here's a better way of comparing the two versions of the same video. The
improvement is even clearer when viewing them side by side:
<http://goo.gl/xoMgQ>

------
softgrow
If you want to try it, here is an iTunes link:
<http://www.itunes.com/apps/luma-camera>

------
diziet
Very awesome tech! I'm really impressed -- what is the loss in terms of
frames/frame size? Is missing data at the edges of a shaky frame generated
based on previous frames?

Also, another thing that goes along with shaky video is bad or stuttering
sound. If you guys handle both, I can easily see you becoming the go-to
solution for filming on phone cameras.

------
duncans
Very nice. It's missing the ability to use the volume buttons as a record
button, though.

~~~
alexkcd
It's on our todo list :)

------
barista
How does this compare to the optical image stabilization on the Nokia Lumia
920?

~~~
pavlov
It's basically like comparing optical zoom to digital zoom.

Digital stabilization will inevitably lose resolution because it needs to crop
some data to even out the shake, and also because there is varying motion blur
in the images that needs to be cleaned up somehow.

Like with zoom, optical is always a better solution because it fixes the
problem before light hits the sensor, but of course it comes with its own
tradeoffs involving physical size and production cost.
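
As a back-of-the-envelope for that resolution cost: reserving a fraction of
each dimension per side for the crop window scales the output down
accordingly. The 10% margin below is just an illustrative figure, not Luma's.

    # Output size after reserving margin_frac of each dimension on every side.
    def stabilized_resolution(width, height, margin_frac=0.10):
        return (round(width * (1 - 2 * margin_frac)),
                round(height * (1 - 2 * margin_frac)))

    # e.g. 1920x1080 with a 10% margin per side -> (1536, 864)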

~~~
alexkcd
Luma's cinematic stabilization performs much better than Nokia's optical
stabilizer in daylight. As sensors get better (more light sensitive, higher
resolution), the advantages of optical stabilizers will diminish even further
and eventually disappear. (Disclosure: I cofounded Luma).

