Instagram's version is more traditional image stabilization. It uses gyroscopic data captured alongside the video to help calculate stabilization. This is all done on the device.
Microsoft's version uses post-processing on their servers to generate a 3D model of the scene. They then synthesize a new video based on this data.
Microsoft's approach is more time-intensive and can lead to strange artifacts in the video, but it lets them generate a smooth video if, for example, the person wearing the camera glances around quickly mid-shot.
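To make the gyro-based approach concrete, here's a toy sketch of the idea: smooth the recorded camera angles, then counter-rotate each frame toward the smoothed path. All names and numbers are illustrative; this is not Instagram's actual pipeline, just the general shape of gyro-assisted stabilization.

```python
# Toy gyro-assisted stabilization: smooth the per-frame yaw readings
# (radians, assumed recorded alongside the video), then compute the
# counter-rotation each frame needs to follow the smoothed path.

def smooth_path(angles, window=5):
    """Moving-average the recorded camera angles to get a target path."""
    half = window // 2
    smoothed = []
    for i in range(len(angles)):
        lo, hi = max(0, i - half), min(len(angles), i + half + 1)
        smoothed.append(sum(angles[lo:hi]) / (hi - lo))
    return smoothed

def corrections(angles, window=5):
    """Per-frame rotation to apply: (smoothed - actual), so the rendered
    camera follows the smooth path instead of the shaky one."""
    target = smooth_path(angles, window)
    return [t - a for t, a in zip(target, angles)]

# A mostly steady pan with one sudden glance at frame 3.
yaw = [0.00, 0.02, 0.04, 0.30, 0.08, 0.10, 0.12]
fix = corrections(yaw)
```

The sudden glance gets a large negative correction (counter-rotated back toward the smooth path), while the steady frames barely move.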
Microsoft published a research paper about this technology before the Instagram app was released. I believe the Instagram app was a hack week project developed based on the MS research paper.
The Instagram app, I believe, relies on gyro data from the phone to sync and align footage seamlessly. However, this would not work for footage from, say, a GoPro or a DSLR.
The Microsoft product is not limited in this way from what I recall of their research. It is much more robust.
Microsoft's method lets you take any previously recorded footage and stabilize it by doing frame-by-frame comparison, whereas Instagram uses motion-sensor data recorded along with the video.
So Instagram's method won't work for any videos not recorded with their app.
Yup! And that's not a small thing. More and more companies neglect Android and the web these days with their "mobile first" strategy, which in practice means iPhone first, and we should pressure them not to discriminate against the majority, and a vast share of the influencers as well. In the early days, it was hard to find Android developers, but now there are more of them, and the latest SDKs make it much easier to develop for and support a multitude of devices. Test on a Samsung Galaxy, a Nexus, and a few more leading devices, and you will still do better than totally neglecting Android! For smaller startups, I can find some excuse, but for a billion-dollar company like Instagram/Facebook, there's no excuse!
Worth noting that there is an Instagram Hyperlapse-type app on Android called "Gallus". It's made by one guy (i.e. it lacks polish), and apparently isn't perfect on all devices (1), but my experience has been that it yields outcomes as good as, if not better than, Instagram's app: compare it on my Nexus 5 versus Instagram's app on my iPhone.
It has gone almost entirely unnoticed, but it creates outputs a world better than this Microsoft app. I get that they're going for 'the videos you already have', but if I have a mobile device with so many sensors, why wouldn't I use them?
(1) - I mention that because whenever something pushes the edge, if someone finds that it doesn't work on their own Android handset, they tend to be the absolute loudest about it, declaring that therefore it must not work on anyone's handset. It is the most bizarre aspect of apps in Android land.
How it works, from the creators, back in August[1]:
Standard video stabilization crops out the pixels on the periphery to create consistent frame-to-frame smoothness. But when applied to greatly sped up video, it fails to compensate for the wildly shaking motion.
Hyperlapse reconstructs how a camera moves throughout a video, as well as its distance and angle in relation to what’s happening in each frame. Then it plots out a smoother camera path and stitches pixels from multiple video frames to rebuild the scene and expand the field of view.
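The "rebuild the scene from a new camera path" step in that quote can be sketched with a standard bit of multi-view geometry: for a (mostly) rotating camera, re-rendering a frame from the smoothed orientation is a homography H = K · R_delta · K⁻¹. This is a minimal illustration, not Microsoft's implementation; the intrinsics and angles below are made up.

```python
import numpy as np

def pan_rotation(theta):
    """Rotation about the vertical axis (a horizontal pan), in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def stabilizing_homography(K, actual_pan, smoothed_pan):
    """Homography that re-renders a frame as if shot from the smoothed
    path. For a purely rotating camera: H = K @ R_delta @ inv(K)."""
    R_delta = pan_rotation(smoothed_pan - actual_pan)
    return K @ R_delta @ np.linalg.inv(K)

# Illustrative intrinsics: 1000 px focal length, principal point (640, 360).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

# Frame was shot panned 0.05 rad; the smoothed path says 0.02 rad.
H = stabilizing_homography(K, actual_pan=0.05, smoothed_pan=0.02)
```

Applying H to every pixel warps the frame onto the smoothed viewpoint; pulling pixels from neighboring frames through their own homographies is what lets them expand the field of view instead of cropping.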
Not to be confused with the hyperlapse paper from 2014 that was instead using a very computationally heavy structure from motion pipeline. Quoting the 2015 paper: "Our approach resides in between the Instagram approach and that of Kopf et al [the 2014 paper]".
It's much better in 480p30 than 720p60. Shockingly so, even. I'm not sure if that's a fundamental property of 60Hz or something about how it was encoded.
I thought you were possibly just overly sensitive to those kinds of things, but after watching it, I began to feel the exact same way. I've never had a video cause severe motion sickness before.
In addition to cropping the frame, it warps the image to map each frame to a set of points calculated to fit a simulated camera following a line, parabola, or a filtered version of the original camera path. When sped up, the end result is not "garbage" at all because it's a path along a perfect line/parabola.
You can 2x the video on Youtube to see for yourself. The major drawback is cropping and computation time, which is where Microsoft's technique excels.
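The line/parabola fitting described above is easy to illustrate: least-squares fit a parabola through the estimated per-frame camera positions, and the gap between each frame and the curve is how far that frame gets warped. Values here are made up; it's just the shape of the idea.

```python
import numpy as np

# Jittery per-frame camera positions (one axis, illustrative values).
positions = np.array([0.0, 0.9, 2.2, 2.8, 4.1, 4.8, 6.3])
t = np.arange(len(positions), dtype=float)

# Fit a single parabola y = a*t^2 + b*t + c through the whole path.
a, b, c = np.polyfit(t, positions, deg=2)
target = a * t**2 + b * t + c       # the perfectly smooth camera path
offsets = target - positions        # per-frame warp toward the curve
```

Because the target path is an exact parabola, its acceleration (second difference) is constant, which is exactly why the sped-up result doesn't look like "garbage": there are no frame-to-frame jerks left to amplify.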
Here's discussion from back when the research was published. https://news.ycombinator.com/item?id=8160571 It's amazing that the performance problems were addressed to make this look good on a phone!
At first I was amazed they'd managed to get it to run on a phone too, but it seems this new one uses quite a different technique which gets nowhere near as smooth results.
For those who are new to this technology: if you have ever taken timelapses while moving (or long videos that you want to speed up), you will have seen how jittery and frustratingly bad they look. The usual remedy involved either bulky stabilization hardware or half-baked motion-stabilization algorithms. This thing is completely different and takes the whole stabilization game to the next level.
In essence, it removes frames that produce really big sudden movements, or "re-shapes" them to make the motion consistent and smoother, and it does all that with some very deep algorithms. Ultimately your long timelapse or sped-up video comes out as if it were shot with expensive stabilization and great care. My only complaint is: why is iOS being left out? :(
I had to try it with some downhill skiing footage I took a few weeks ago. It stabilizes well[1], but screws up the FOV. The footage I fed to the Hyperlapse program had even more vertical detail than the first clip in my video[2]. Maybe there's too much motion skiing downhill, or it is unable to process things that are too far away?
I think the horizontal stabilization is working fine; the lines in the snow snake back and forth a lot, and maaaats is slaloming back and forth, making it look jumpy where it isn't.
This is probably a pretty hard video to do since so much of the field is just plain white snow.
Yeah, you are probably right. And compared to other videos with hyperlapse, there are no close reference points here. No buildings or trees, just mountains far, far away, which may be hard to extrapolate anything from.
The same people, at least Johannes Kopf and Richard Szeliski, were on both projects. So I wouldn't be surprised if there is some overlap in techniques. But Photosynth (and Microsoft's first hyperlapse research) is focused on making a 3D model using pieces of photos to recreate a scene, while the new Hyperlapse app just picks frames from a video.
If you read the detailed technology link posted elsewhere in the HN comments, you can see that this is a different manifestation of the same technology. They are calculating the camera position individually for each frame, then mapping a new (smoother) POV through the scene and re-rendering appropriately. The frames in the final video may include data from more than one source frame so that the POV can pan smoothly without cropping.
We develop a dynamic programming algorithm, inspired by dynamic-time-warping (DTW) algorithms, that selects frames from the input video that both best match a desired target speed-up and result in the smoothest possible camera motion in the resulting hyperlapse video. Once an optimal set of frames is selected, our method performs 2D video stabilization to create a smoothed camera path from which we render the resulting hyper-lapse.
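The quoted frame-selection step can be sketched as a small dynamic program: each candidate jump from frame i to frame j pays a cost for deviating from the target speed-up plus a cost for camera motion, and we take the cheapest path through the video. The cost terms below are simplified stand-ins, not the paper's actual matching cost.

```python
# Toy DTW-style frame selector: pick frames whose spacing best matches a
# target speed-up while keeping camera motion between picks small.

def select_frames(cam_positions, target_step=4, max_skip=8, smooth_w=1.0):
    n = len(cam_positions)
    INF = float("inf")
    cost = [INF] * n      # best cost of a frame path ending at frame j
    prev = [-1] * n       # back-pointer for path recovery
    cost[0] = 0.0
    for j in range(1, n):
        for i in range(max(0, j - max_skip), j):
            speed_cost = (j - i - target_step) ** 2
            motion_cost = smooth_w * abs(cam_positions[j] - cam_positions[i])
            c = cost[i] + speed_cost + motion_cost
            if c < cost[j]:
                cost[j], prev[j] = c, i
    # Walk the back-pointers from the last frame to recover the selection.
    path, j = [], n - 1
    while j != -1:
        path.append(j)
        j = prev[j]
    return path[::-1]

# Illustrative 1D camera positions for a 12-frame clip.
picked = select_frames([0, 1, 1.5, 3, 3.2, 5, 5.1, 7, 7.4, 9, 9.1, 11],
                       target_step=3)
```

The real method then runs 2D stabilization over the selected frames, as the quote says; this sketch only covers the "which frames" half.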
Unfortunately, I can't download it for my Android phone; only a few models are supported. But it's great that MS is releasing on platforms other than Windows. Even though the Windows Phone market is tiny right now, the more they open up their products, the more awareness they will gain for their platform.
So the "Picking the right frames" section indicates that the tech is mostly selecting and ordering already-existing frames. My question is: does it also add frames that it generates, or is the entire algorithm about selecting the right frames?
The smoothed footage looks great. I'm excited for sensor technology to advance so that we can have a global shutter (as opposed to a rolling shutter) in our consumer devices. Once we have that, I'd imagine stabilized footage could be almost imperceptible.
I can't download it from the Play Store either... but it didn't tell me whether that was because I wasn't part of Google Plus... I even tried a few Google accounts and different browsers (in case add-ons were blocking something necessary). I guess I will pull out my unused Windows Phone.
The error I got:
Sorry - nothing to see here.
There may be several reasons:
The app you’re looking for doesn't exist
or
the app developer is not currently running a test
or
you are not eligible to participate in the testing program.
If you received the link to this page from someone, we recommend you contact them for more information.
https://hyperlapse.instagram.com
http://blog.instagram.com/post/95829278497/hyperlapse-from-i...
https://itunes.apple.com/app/id740146917