Hyperlapse makes first-person videos smooth and speedy (microsoft.com)
191 points by davio on May 14, 2015 | hide | past | favorite | 65 comments



How does this differ from the app released by Instagram last year with the exact same name, which appears to do the same thing?

https://hyperlapse.instagram.com

http://blog.instagram.com/post/95829278497/hyperlapse-from-i...

https://itunes.apple.com/app/id740146917


"Difference between Microsoft's and Instagram's Hyperlapse" http://research.microsoft.com/en-us/um/redmond/projects/hype...


Highly recommend watching the video in the link. Holy shit is it impressive.


Instagram's version is more traditional image stabilization. It uses gyroscopic data captured alongside the video to help calculate stabilization. This is all done on the device.

Microsoft's version uses post processing on their servers to generate a 3D model of the scene. They then synthesize a new video based on this data.

Microsoft's approach is more time-intensive and can lead to strange artifacts in the video. But it lets them generate a smooth video if, for example, the person wearing the camera glances around quickly mid-shot.
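As a rough illustration of the gyro-based side, here's a toy Python sketch (not Instagram's actual code, and the smoothing constant is made up): smooth the recorded orientation signal, then counter-rotate each frame by the difference between the raw and smoothed orientation.

```python
# Toy gyro-based stabilization: smooth the per-frame orientation signal,
# then counter-rotate each frame by (smoothed - raw) so the apparent
# camera path follows the smooth curve instead of the jittery one.

def smooth(angles, alpha=0.1):
    """Exponential moving average of per-frame gyro angles (radians)."""
    smoothed = [angles[0]]
    for a in angles[1:]:
        smoothed.append(smoothed[-1] + alpha * (a - smoothed[-1]))
    return smoothed

def correction_angles(raw_angles):
    """Rotation to apply to each frame so it tracks the smoothed path."""
    return [s - r for r, s in zip(raw_angles, smooth(raw_angles))]

# A mostly steady pan with one sudden glance at frame 3.
raw = [0.0, 0.02, 0.01, 0.5, 0.03, 0.02]
print([round(c, 3) for c in correction_angles(raw)])
```

The sudden glance gets by far the largest counter-rotation, which is the whole point: sensor data tells you directly which frames moved, with no image analysis needed.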


Microsoft published a research paper about this technology before the Instagram app was released. I believe the Instagram app was a hack week project developed based on the MS research paper.



It was actually based on a previous app (Luma) that did the same thing for several years prior. Instagram acquired Luma back in 2013.


The Instagram app, I believe, relies on gyro data from the phone to sync and align footage seamlessly. However, this would not work with, say, a GoPro camera or video from a DSLR.

The Microsoft product is not limited in this way from what I recall of their research. It is much more robust.


Microsoft's method can take any previously recorded footage and stabilize the video by doing frame-by-frame comparison, whereas Instagram uses motion sensors that are recorded along with the video. So Instagram's method won't work for any videos not recorded with their app.


It works on Android among other things.


Yup! And that's not a small thing. More and more companies neglect Android and the web these days with their "mobile first" strategy, which in practice means iPhone first, and we should pressure them not to discriminate against the majority of users, who include a vast share of the influencers as well. In the early days it was hard to find Android developers, but now there are more of them, and the latest SDKs make it much easier to develop for and support a multitude of devices. Test on a Samsung Galaxy, a Nexus, and a few more leading devices, and you'll still do better than totally neglecting Android! For smaller startups I can find some excuse, but for billion-dollar companies like Instagram/Facebook, there's no excuse!


Worth noting that there is an Instagram Hyperlapse-type app on Android called "Gallus". It's made by one guy (i.e., it lacks polish), and apparently isn't perfect on all devices (1), but my experience has been that it yields outcomes on my Nexus 5 as good as, if not better than, Instagram's app on my iPhone.

It has gone almost entirely unnoticed, but its output is a world better than this Microsoft app's. I get that they're going for 'the videos you already have', but if I have a mobile device with so many sensors, why wouldn't I use them?

(1) - I mention that because whenever something pushes the edge, if someone finds that it doesn't work on their own Android handset, they tend to be the absolute loudest about it, declaring that therefore it must not work on anyone's handset. It is the most bizarre aspect to apps in Android land.


Well, for one thing, it actually works on something other than iOS. Big plus in my book.


Microsoft showed their stuff before Instagram showed theirs.


How it works, from the creators, back in August[1]:

Standard video stabilization crops out the pixels on the periphery to create consistent frame-to-frame smoothness. But when applied to greatly sped up video, it fails to compensate for the wildly shaking motion.

Hyperlapse reconstructs how a camera moves throughout a video, as well as its distance and angle in relation to what’s happening in each frame. Then it plots out a smoother camera path and stitches pixels from multiple video frames to rebuild the scene and expand the field of view.

[1]http://blogs.microsoft.com/next/2014/08/11/hyperlapse-siggra....
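The crop-margin point can be made concrete with some back-of-the-envelope Python (all the numbers here are made up, just to show why cropping alone fails at large speed-ups):

```python
# Why crop-based stabilization breaks down at high speed-ups: the apparent
# per-frame shake grows roughly with the speed-up factor, and once it
# exceeds the crop margin there are no spare pixels left to shift into.

def max_correctable_speedup(frame_width, margin_frac, shake_px_per_frame):
    """Largest speed-up whose per-frame shake still fits inside the crop
    margin on one side. Purely illustrative numbers."""
    margin_px = int(frame_width * margin_frac)
    return margin_px // shake_px_per_frame

# 1920px-wide frame, 10% margin on each side, ~20px of shake per source frame
print(max_correctable_speedup(1920, 0.10, 20))  # -> 9
```

Past that point you need to synthesize pixels from neighboring frames rather than just crop, which is exactly what the stitching step described above does.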


For some reason, I find the demonstration in this article far more convincing than the ones in the posted article.


Note that the app appears to be based on this recent research paper: http://research.microsoft.com/en-us/um/redmond/projects/hype...

Not to be confused with the hyperlapse paper from 2014 that was instead using a very computationally heavy structure from motion pipeline. Quoting the 2015 paper: "Our approach resides in between the Instagram approach and that of Kopf et al [the 2014 paper]".


Here's a quick real-world demo I made in Amsterdam today: https://www.youtube.com/watch?v=gO4-LBGejqw


I easily get motion sickness from first person games, but I've never had a problem with video before.

This made me get a tickle in my throat just two seconds in. Definitely very uncomfortable to watch.


It's the focal length of the lens. If it had a wider FOV I think it would feel more comfortable.


It's much better in 480p30 than 720p60. Shockingly so, even. I'm not sure if that's a fundamental property of 60Hz or something about how it was encoded.


I don't usually get motion sickness from first person games, but I got a headache just watching the video. Twice since I showed it to a coworker.


Wow, I just had the same issue, and I'm also rarely affected by this stuff..



I thought you were possibly just overly sensitive to those kinds of things, but after watching it, I began to feel the exact same way. I've never had a video cause severe motion sickness before.


I suspect the 60fps is a large contributor to this, at least for me.


Agreed, subtle, but instant queasiness. Not a pleasant video to watch.


Holy moly this is bad. Seems unstabilized (compared to Instagram's Hyperlapse) and resolution drops dramatically on some scenes.


This is from 6 years ago, and I still haven't seen anything as good: http://createdigitalmotion.com/2009/06/magical-3d-warping-te...


That's totally different though. That's strictly image stabilization and it crops the image down. Microsoft's goal is smooth, fast video playback.

If you took the image stabilization you linked and sped it up the end result would be garbage. Different techniques for different goals.


In addition to cropping the frame, it warps the image to map each frame to a set of points calculated to fit a simulated camera following a line, parabola, or a filtered version of the original camera path. When sped up, the end result is not "garbage" at all because it's a path along a perfect line/parabola.

You can 2x the video on Youtube to see for yourself. The major drawback is cropping and computation time, which is where Microsoft's technique excels.
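For the curious, the "simulated camera following a line" idea boils down to something like this least-squares toy in Python (1D position only, with a made-up path; the real system fits full camera poses):

```python
# Fit a straight line to a jittery 1D camera path by least squares.
# The stabilized path is the fitted line; each frame is then warped by
# the residual (raw position minus fitted position).

def fit_line(path):
    n = len(path)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(path) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, path)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return [intercept + slope * x for x in xs]

def residuals(path):
    """Per-frame correction needed to put the camera on the fitted line."""
    return [y - f for y, f in zip(path, fit_line(path))]

path = [0.0, 1.2, 1.8, 3.1, 3.9, 5.2]  # shaky forward motion
print([round(r, 2) for r in residuals(path)])
```

Because the fitted path is a perfect line, playing it back sped up stays smooth; the cost is the crop needed to absorb the residuals.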


This isn't about a small 2x speedup. This is about 10x. Big difference.


Here's discussion from back when the research was published. https://news.ycombinator.com/item?id=8160571 It's amazing that the performance problems were addressed to make this look good on a phone!


At first I was amazed they'd managed to get it to run on a phone too, but it seems this new one uses quite a different technique which gets nowhere near as smooth results.

The new technique is described here: http://research.microsoft.com/en-us/um/redmond/projects/hype...


For those who are new to this technology: if you have ever taken timelapses while moving (or long videos that you wanted to speed up), you will have seen how jittery and frustratingly bad they look. The usual remedy involved either bulky stabilization hardware or half-baked motion stabilization algorithms. This thing is completely different and takes the whole stabilization game to the next level.

In essence, it removes frames that produce really big sudden movements, or "re-shapes" them to make the result consistent and smoother, and it does all that with some very deep algorithms. Ultimately your long timelapse or fast video comes out as if it were shot with expensive stabilization and great care. My only complaint is: why is iOS being left out? :(

For much better demonstration of how cool this is: https://www.youtube.com/watch?v=SOpwHaQnRSY


The level of complexity introduced by moving the camera AND the scene is remarkable.

Very different from the work of Keith Loutit and Tony Leech where a stationary camera is key.

http://www.keithloutit.com/ https://vimeo.com/11445353/

Filming everything in motion and then calculating stability is an interesting problem.

Well done to Microsoft and Instagram for attempting it.

edit: fixed links


I had to try, with some downhill skiing footage I took a few weeks ago. It works well with the stabilizing[1], but it screws up the FOV. The footage fed to the Hyperlapse program had even more vertical detail than the first clip in my video[2]. Maybe it's too much motion skiing downhill, or it is unable to process the stuff too far away?

[1]: https://www.youtube.com/watch?v=_CvzPyQF7fA [2]: http://imgur.com/w1mLrM6 (not the same frame in both pictures, but you get the idea)


Made a new video: a new clip from a bicycle ride, in which the hyperlapse is very good, plus the old clip, showing the case with too much cropping.

https://www.youtube.com/watch?v=b0jNJBZP1QA


From your video, it looks like the vertical stabilization is almost perfect, but the horizontal stabilization is really not working.


I think the horizontal stabilization is working fine: the lines in the snow snake back and forth a lot, and maaaats is slaloming back and forth, making it look jumpy where it isn't.

This is probably a pretty hard video to do since so much of the field is just plain white snow.


Yeah, you are probably right. And compared to other videos with hyperlapse, there are no close reference points here. No buildings or trees, just mountains far, far away, which may be hard to extrapolate anything from.


Is it related to Microsoft's Photosynth v3? https://photosynth.net/preview


The same people, at least Johannes Kopf and Richard Szeliski, were on both projects. So I wouldn't be surprised if there is some overlap in techniques. But Photosynth (and Microsoft's first hyperlapse research) is focused on making a 3D model using pieces of photos to recreate a scene, while the new Hyperlapse app just picks frames from a video.


If you read the detailed technology link posted elsewhere in the HN comments, you can see that this is a different manifestation of the same technology. They are calculating the camera position individually for each frame, then mapping a new (smoother) POV through the scene and re-rendering appropriately. The frames in the final video may include data from more than one source frame so that the POV can pan smoothly without cropping.


That's the older version that I mentioned. This is the link for the new realtime version: http://research.microsoft.com/en-us/um/redmond/projects/hype...

We develop a dynamic programming algorithm, inspired by dynamic-time-warping (DTW) algorithms, that selects frames from the input video that both best match a desired target speed-up and result in the smoothest possible camera motion in the resulting hyperlapse video. Once an optimal set of frames is selected, our method performs 2D video stabilization to create a smoothed camera path from which we render the resulting hyper-lapse.
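A stripped-down Python sketch of that frame-selection dynamic program (the cost terms here are stand-ins, not the paper's actual formulation):

```python
# DTW-style frame selection: choose a subset of frames whose spacing is
# close to the target speed-up while penalizing jumpy transitions.

def select_frames(n_frames, target_speedup, motion_cost, max_skip=None):
    """motion_cost(i, j): how 'jumpy' cutting from frame i to frame j is.
    Returns the chosen frame indices, starting at frame 0."""
    max_skip = max_skip or 2 * target_speedup
    INF = float("inf")
    cost = [INF] * n_frames   # cost[j]: cheapest way to end a path at frame j
    prev = [-1] * n_frames    # prev[j]: predecessor frame on that path
    cost[0] = 0.0
    for j in range(1, n_frames):
        for i in range(max(0, j - max_skip), j):
            speed_penalty = (j - i - target_speedup) ** 2
            c = cost[i] + speed_penalty + motion_cost(i, j)
            if c < cost[j]:
                cost[j], prev[j] = c, i
    # backtrack from the last frame to recover the selected path
    path, j = [], n_frames - 1
    while j != -1:
        path.append(j)
        j = prev[j]
    return path[::-1]

# With a uniform motion cost, the optimum is frames spaced exactly at the
# target speed-up.
print(select_frames(21, 4, lambda i, j: 1.0))  # -> [0, 4, 8, 12, 16, 20]
```

With a real motion cost (e.g. alignment error between frames), the DP will deviate from uniform spacing exactly where the camera shakes, which is the behavior the quote describes.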


I'm excited that I can use Hyperlapse via Azure without a Windows computer. It will be the reason that I finally create an Azure account.


Disclaimer: PM on Azure Media Services who brought Hyperlapse on board.

This is great to hear! Feel free to reach out if you have any questions adsolank at microsoft dot com


.apk please, anyone? Does it support images/video from your library? I'd like to try it with ARC Welder in Chrome, using my DSLR shots.


What do you mean? You can download the Windows desktop version here: http://research.microsoft.com/en-US/downloads/b199c523-bcd9-...


From the project page, you can get it for Android, Windows Phone, and Windows Desktop: http://aka.ms/hyperlapse


Unfortunately, I can't download it for my Android phone; only a few are supported. But it's great that MS is releasing on platforms other than Windows. Even though right now the Windows Phone market is tiny, the more they open up their products, the more awareness they will gain for their platform.


So the "Picking the right frames" section indicates that the tech mostly selects and orders already-existing frames. My question is: does it also add frames that it generates, or is the entire algorithm about selecting the right frames?


They describe their algorithm in the video at the bottom of this page:

http://research.microsoft.com/en-us/um/redmond/projects/hype...

It says the hyperlapse is constructed from selecting the right frames and doing a projection onto the proxy geometry.


The smoothed footage looks great. I'm excited for sensor technology to advance so that we can have a global shutter (as opposed to a rolling shutter) in our consumer devices. Once we have that, I'd imagine artifacts in stabilized footage could be almost imperceptible.


Problem: share hours of dull video content nobody wants to see.

Solution: play it back at 30x speed.


Previous discussion on HN (August 2011): https://news.ycombinator.com/item?id=8160571


I think you meant 2014.


why is it smooth and speedy?

wouldn't the 3d extraction + reconstruction for smoothness alone be super cool?

edit: Or do they need so much redundancy and variety in the images that they need to have an "oversampled" source?


Wasn't this open-sourced like a year ago on HN?


what kind of pathway would lead to doing something like this? is this something only MS/PhD would get to work on?


Does anybody have an APK? I don't want a Google Plus account and it seems like a troublesome procedure to try this.



Thanks, but "Sorry, something went wrong." :(


I can't download it from the Play Store either... but it didn't tell me that it was because I wasn't part of Google Plus... I even tried a few Google accounts and different browsers (in case addons were blocking something necessary). I guess I will pull out my unused Windows Phone.

The error I got:

Sorry - nothing to see here.

There may be several reasons: The app you’re looking for doesn't exist or the app developer is not currently running a test or you are not eligible to participate in the testing program. If you received the link to this page from someone, we recommend you contact them for more information.


Microsoft should consider uploading to Vine :)



