

Hyperlapse makes first-person videos smooth and speedy - davio
http://news.microsoft.com/features/microsoft-hyperlapse-makes-first-person-videos-smooth-and-speedy/

======
spilk
How does this differ from the app Instagram released last year with the
exact same name, which appears to do the same thing?

[https://hyperlapse.instagram.com](https://hyperlapse.instagram.com)

[http://blog.instagram.com/post/95829278497/hyperlapse-
from-i...](http://blog.instagram.com/post/95829278497/hyperlapse-from-
instagram)

[https://itunes.apple.com/app/id740146917](https://itunes.apple.com/app/id740146917)

~~~
codezero
Microsoft published a research paper about this technology before the
Instagram app was released. I believe the Instagram app was a hack week
project developed based on the MS research paper.

~~~
macca321
[http://research.microsoft.com/en-
us/um/redmond/projects/hype...](http://research.microsoft.com/en-
us/um/redmond/projects/hyperlapse/igcomp/)

------
hammock
How it works, from the creators, back in August[1]:

Standard video stabilization crops out the pixels on the periphery to create
consistent frame-to-frame smoothness. But when applied to greatly sped up
video, it fails to compensate for the wildly shaking motion.

Hyperlapse reconstructs how a camera moves throughout a video, as well as its
distance and angle in relation to what’s happening in each frame. Then it
plots out a smoother camera path and stitches pixels from multiple video
frames to rebuild the scene and expand the field of view.

[1][http://blogs.microsoft.com/next/2014/08/11/hyperlapse-
siggra...](http://blogs.microsoft.com/next/2014/08/11/hyperlapse-siggra..).
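The "plots out a smoother camera path" step can be illustrated with a toy sketch. This is just a centered moving average over per-frame camera positions, an assumption-laden stand-in for the optimized path Hyperlapse actually computes, not Microsoft's method:

```python
def smooth_path(path, window=5):
    """Toy path smoothing: replace each per-frame camera position with
    the average of its neighbours inside a centered window."""
    half = window // 2
    smoothed = []
    for i in range(len(path)):
        lo = max(0, i - half)
        hi = min(len(path), i + half + 1)
        smoothed.append(sum(path[lo:hi]) / (hi - lo))
    return smoothed

def jitter(path):
    """Mean absolute frame-to-frame movement, a crude shakiness measure."""
    return sum(abs(b - a) for a, b in zip(path, path[1:])) / (len(path) - 1)

# A shaky horizontal camera track: steady forward motion plus alternating jitter.
shaky = [i + (1.5 if i % 2 else -1.5) for i in range(20)]
smooth = smooth_path(shaky)
```

After smoothing, the per-frame movement of `smooth` is much more uniform than `shaky`, which is the property the real pipeline optimizes for before it stitches pixels from neighbouring frames to cover the exposed periphery.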

~~~
bnegreve
For some reason, I find the demonstration in this article far more convincing
than the ones in the posted article.

------
totolapraline
Note that the app appears to be based on this recent research paper:
[http://research.microsoft.com/en-
us/um/redmond/projects/hype...](http://research.microsoft.com/en-
us/um/redmond/projects/hyperlapserealtime/hyperlapse.pdf)

Not to be confused with the hyperlapse paper from 2014, which instead used a
very computationally heavy structure-from-motion pipeline. Quoting the 2015
paper: "Our approach resides in between the Instagram approach and that of
Kopf et al [the 2014 paper]".

------
owenwil
Here's a quick real-world demo I made in Amsterdam today:
[https://www.youtube.com/watch?v=gO4-LBGejqw](https://www.youtube.com/watch?v=gO4-LBGejqw)

~~~
fiblye
I easily get motion sickness from first person games, but I've never had a
problem with video before.

This made me get a tickle in my throat just two seconds in. Definitely very
uncomfortable to watch.

~~~
seszett
I don't usually get motion sickness from first person games, but I got a
headache just watching the video. Twice since I showed it to a coworker.

~~~
sbarre
Wow, I just had the same issue, and I'm also rarely affected by this stuff.

------
hamoid
This is from 6 years ago, and I still haven't seen anything as good:
[http://createdigitalmotion.com/2009/06/magical-3d-warping-
te...](http://createdigitalmotion.com/2009/06/magical-3d-warping-techniques-
steadies-your-videos/)

~~~
forrestthewoods
That's totally different though. That's strictly image stabilization and it
crops the image down. Microsoft's goal is smooth, fast video playback.

If you took the image stabilization you linked and sped it up the end result
would be garbage. Different techniques for different goals.

~~~
platinum1
In addition to cropping the frame, it warps the image to map each frame to a
set of points calculated to fit a simulated camera following a line, parabola,
or a filtered version of the original camera path. When sped up, the end
result is not "garbage" at all because it's a path along a perfect
line/parabola.

You can play the video at 2x on YouTube to see for yourself. The major
drawbacks are cropping and computation time, which is where Microsoft's
technique excels.
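The "simulated camera following a line" idea can be sketched with a closed-form least-squares fit. This is a toy illustration of the concept, not the linked tool's implementation:

```python
def fit_line(ys):
    """Least-squares line y = m*t + b through per-frame camera positions.
    The fitted line plays the role of the ideal smooth camera path that
    each frame would then be warped onto."""
    n = len(ys)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - m * mx
    return [m * t + b for t in range(n)]

# A jittery pan: roughly linear motion plus alternating shake.
shaky = [2 * t + (1 if t % 2 else -1) for t in range(10)]
ideal = fit_line(shaky)
```

The residual between `shaky` and `ideal` is what the per-frame warp has to absorb; a parabola or a filtered path would be fit the same way, just with more parameters.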

~~~
forrestthewoods
This isn't about a small 2x speedup. This is about 10x. Big difference.

------
sp332
Here's discussion from back when the research was published.
[https://news.ycombinator.com/item?id=8160571](https://news.ycombinator.com/item?id=8160571)
It's amazing that the performance problems were addressed to make this look
good on a phone!

~~~
widdma
At first I was amazed they'd managed to get it to run on a phone too, but it
seems this new one uses quite a different technique, with results nowhere
near as smooth.

The new technique is described here: [http://research.microsoft.com/en-
us/um/redmond/projects/hype...](http://research.microsoft.com/en-
us/um/redmond/projects/hyperlapserealtime/hyperlapse.pdf)

------
sytelus
For those who are new to this technology: if you have ever taken timelapses
while moving (or long videos that you want to speed up), you will have seen
how jittery and frustratingly bad they look. The usual remedy involved either
bulky stabilization hardware or half-baked motion stabilization algorithms.
This thing is completely different and takes the whole stabilization game to
the next level.

In essence, it removes frames that produce really big sudden movements, or it
"re-shapes" them to make the motion consistent and smoother, and it does all
that with some very deep algorithms. Ultimately your long timelapse or fast
video looks as if it was shot with expensive stabilization and great care. My
only complaint is: why is iOS being left out? :(

For much better demonstration of how cool this is:
[https://www.youtube.com/watch?v=SOpwHaQnRSY](https://www.youtube.com/watch?v=SOpwHaQnRSY)

------
johnohara
The level of complexity introduced by moving the camera AND the scene is
remarkable.

Very different from the work of Keith Loutit and Tony Leech where a stationary
camera is key.

[http://www.keithloutit.com/](http://www.keithloutit.com/)
[https://vimeo.com/11445353/](https://vimeo.com/11445353/)

Filming everything in motion and then calculating stability is an interesting
problem.

Well done to Microsoft and Instagram for attempting it.

edit: fixed links

------
maaaats
I had to try, with some downhill skiing footage I took a few weeks ago. It
works well with the stabilizing[1], but screws up the FOV. The footage fed to
the Hyperlapse program had even more detail vertically than the first clip in
my video[2]. Maybe there's too much motion skiing down, or it's unable to
process things that are too far away?

[1]:
[https://www.youtube.com/watch?v=_CvzPyQF7fA](https://www.youtube.com/watch?v=_CvzPyQF7fA)
[2]: [http://imgur.com/w1mLrM6](http://imgur.com/w1mLrM6) (not the same frame
in both pictures, but you get the idea)

~~~
machbio
From your video, it looks like the vertical stabilization is almost perfect,
but the horizontal stabilization is really not working.

~~~
rtkwe
I think the horizontal stabilization is working fine; the lines in the snow
snake back and forth a lot, and maaaats is slaloming back and forth, making
it look jumpy where it isn't.

This is probably a pretty hard video to process, since so much of the field
is just plain white snow.

~~~
maaaats
Yeah, you are probably right. And compared to other videos with hyperlapse,
there are no close reference points here. No buildings or trees, just
mountains far, far away, which may be hard to extrapolate anything from.

------
frik
Is it related to Microsoft's Photosynth v3?
[https://photosynth.net/preview](https://photosynth.net/preview)

~~~
sp332
The same people, at least Johannes Kopf and Richard Szeliski, were on both
projects. So I wouldn't be surprised if there is some overlap in techniques.
But Photosynth (and Microsoft's first hyperlapse research) is focused on
making a 3D model using pieces of photos to recreate a scene, while the new
Hyperlapse app just picks frames from a video.

~~~
URSpider94
If you read the detailed technology link posted elsewhere in the HN comments,
you can see that this is a different manifestation of the same technology.
They are calculating the camera position individually for each frame, then
mapping a new (smoother) POV through the scene and re-rendering appropriately.
The frames in the final video may include data from more than one source frame
so that the POV can pan smoothly without cropping.

~~~
sp332
That's the older version that I mentioned. This is the link for the new
realtime version: [http://research.microsoft.com/en-
us/um/redmond/projects/hype...](http://research.microsoft.com/en-
us/um/redmond/projects/hyperlapserealtime/)

 _We develop a dynamic programming algorithm, inspired by dynamic-time-warping
(DTW) algorithms, that selects frames from the input video that both best
match a desired target speed-up and result in the smoothest possible camera
motion in the resulting hyperlapse video. Once an optimal set of frames is
selected, our method performs 2D video stabilization to create a smoothed
camera path from which we render the resulting hyper-lapse._
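The frame-selection idea in that quote can be sketched as a small dynamic program. This is a minimal illustration with made-up cost terms (a squared deviation from the target speed-up plus a pluggable per-transition motion cost), not the paper's actual formulation:

```python
def select_frames(motion_cost, n_frames, target_speed, max_skip=12):
    """DTW-inspired frame selection: choose a chain of input frames that
    best matches the target speed-up while keeping transitions cheap.

    motion_cost(i, j) -> cost of jumping from frame i to frame j
    """
    INF = float("inf")
    best = [INF] * n_frames   # best[j] = min cost of a chain ending at j
    prev = [-1] * n_frames
    best[0] = 0.0
    for j in range(1, n_frames):
        for i in range(max(0, j - max_skip), j):
            # Penalize deviating from the target gap, plus visual motion cost.
            c = best[i] + (j - i - target_speed) ** 2 + motion_cost(i, j)
            if c < best[j]:
                best[j], prev[j] = c, i
    # Trace back the optimal chain from the last frame.
    path, j = [], n_frames - 1
    while j != -1:
        path.append(j)
        j = prev[j]
    return path[::-1]

# With a uniform motion cost, the optimizer falls back to even 8x sampling.
frames = select_frames(lambda i, j: 0.0, 33, target_speed=8)
print(frames)  # [0, 8, 16, 24, 32]
```

With a real motion cost (e.g. alignment error between frames), the chain would stretch or shrink gaps locally to skip the shakiest frames, which is the trade-off the quote describes; the 2D stabilization pass then runs on the selected frames.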

------
philip1209
I'm excited that I can use Hyperlapse via Azure without a Windows computer. It
will be the reason that I finally create an Azure account.

~~~
yayadarsh
Disclaimer: PM on Azure Media Services who brought Hyperlapse on board.

This is great to hear! Feel free to reach out if you have any questions
adsolank at microsoft dot com

------
dharma1
An .apk, please, anyone? Does it support images/video from your library? I'd
like to try it with ARC Welder in Chrome, using my DSLR shots.

~~~
eco
What do you mean? You can download the Windows desktop version here:
[http://research.microsoft.com/en-
US/downloads/b199c523-bcd9-...](http://research.microsoft.com/en-
US/downloads/b199c523-bcd9-4a1f-b58b-af75bd5c621c/default.aspx)

------
aceperry
Unfortunately, I can't download it for my Android phone; only a few models are
supported. But it's great that MS is releasing on platforms other than
Windows. Even though the Windows Phone market is tiny right now, the more
they open up their products, the more awareness they will gain for their
platform.

------
skizm
So the "Picking the right frames" section indicates that the tech is mostly
selecting and ordering _already existing_ frames. My question is does it also
_add_ frames that it generates? Or is the entire algorithm about selecting the
right frames?

~~~
nostrebored
They describe their algorithm in the video at the bottom of this page:

[http://research.microsoft.com/en-
us/um/redmond/projects/hype...](http://research.microsoft.com/en-
us/um/redmond/projects/hyperlapse/igcomp/)

It says the hyperlapse is constructed from selecting the right frames and
doing a projection onto the proxy geometry.

------
jakejake
The smoothed footage looks great. I'm excited for sensor technology to
advance so that we can have a global shutter (as opposed to a rolling
shutter) in our consumer devices. Once we have that, I'd imagine the
stabilization could be almost imperceptible.

------
blackhaz
Problem: share hours of dull video content nobody wants to see.

Solution: play it back at 30x speed.

------
hammock
Previous discussion on HN (August 2011):
[https://news.ycombinator.com/item?id=8160571](https://news.ycombinator.com/item?id=8160571)

~~~
dang
I think you meant 2014.

------
anotheryou
Why is it smooth _and_ speedy?

Wouldn't the 3D extraction + reconstruction for smoothness alone be super
cool?

edit: Or do they need so much redundancy and variety in the images that they
need to have an "oversampled" source?

------
cdnsteve
Wasn't this open sourced like a year ago on HN?

------
ljk
What kind of path would lead to doing something like this? Is this something
only MS/PhD folks get to work on?

------
unicornporn
Does anybody have an APK? I don't want a Google Plus account and it seems like
a troublesome procedure to try this.

~~~
crumpled
[https://cdn.fbsbx.com/hphotos-
xfp1/v/t59.2708-21/11253139_10...](https://cdn.fbsbx.com/hphotos-
xfp1/v/t59.2708-21/11253139_10206879288129321_1211397065_n.zip/Hyperlapse-
Mobile_1.0.9.zip?oh=b1a90199f3197116f3ca30e4681d501c&oe=5556C33D&dl=1)

~~~
unicornporn
Thanks, but "Sorry, something went wrong." :(

------
guilt
Microsoft should consider uploading to Vine :)

