What Makes Hyperlapse from Instagram so Hyper?


Earlier this week, Instagram released Hyperlapse, a very simple new app for recording videos that are then sped up.

Much like a timelapse, a hyperlapse is footage captured at a slow frame rate (for example, one frame per minute) and then played back at a regular video frame rate (24, 25 or 30 frames per second). This compresses a long stretch of time and activity into a far shorter clip than regular video could. The difference between a hyperlapse and a timelapse is that in a hyperlapse the camera moves a great deal as well as the subject, adding even more activity and movement to the sped-up result.
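To put rough numbers on that, here's a quick sketch of the arithmetic; the one-frame-per-minute capture rate and 30 fps playback are just the example figures above:

```python
# Back-of-the-envelope speed-up calculation for the example figures above.
capture_interval_s = 60   # one frame captured every 60 seconds
playback_fps = 30         # frames per second at playback

speedup = capture_interval_s * playback_fps
print(f"Speed-up factor: {speedup}x")                                  # 1800x
print(f"One hour of real time plays back in {3600 / speedup:.0f} s")   # 2 s
```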

Using the App

First of all, the user interface is magnificently simple. Users simply tap a button to record and tap it again to stop. They are then presented with a slider to adjust the speed of the clip. Save the clip, and that's it!

Thanks to the steadily improving quality of smartphone cameras, the resulting footage is outstanding, and coupled with this app it could now be treated as a serious addition to a filmmaking enthusiast's camera bag – or pocket.

Here’s an example of how impressive the stabilisation is:

Hyperlapse 2.0

I have some initial thoughts about Hyperlapse that I would love to see developed. First of all, it would be excellent if the app made its way to Google Glass. It seems like such an obvious use of the camera on Glass, as people could easily film a hyperlapse from their own point of view.

It would also be great if the video recorded sound. I know why there is no sound, as it would mostly be an unintelligible mash-up of noise, but I do feel there's a slight loss of control in having that decision made for the user. There is plenty of software that can speed up audio while maintaining its pitch so that it doesn't sound like chipmunks, so why not include the option in Hyperlapse? I think a lot of people would get entertainment out of the high-pitched sounds, but some scenes could also benefit from a sped-up ambience (e.g. a bustling city).
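To be clear, that's not something Hyperlapse currently does; the snippet below is just a minimal sketch of pitch-preserving time-stretching using the librosa library, with made-up file names, to show the kind of processing I have in mind:

```python
# A minimal sketch of pitch-preserving time-stretching, the kind of processing
# that could speed up a hyperlapse's audio without the chipmunk effect.
# Assumes librosa and soundfile are installed; the file names are hypothetical.
import librosa
import soundfile as sf

y, sr = librosa.load("city_walk.wav", sr=None)        # original audio track
speed = 6.0                                           # match a 6x hyperlapse
y_fast = librosa.effects.time_stretch(y, rate=speed)  # 6x shorter, same pitch
sf.write("city_walk_6x.wav", y_fast, sr)
```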

I’d also love to see both the front and back cameras being used at the same time when recording a Hyperlapse. I can see potential for people filming large chunks of events and also themselves within the events. I’m not sure how intensive that would be on the phone’s processor though…

The ability to record multiple clips into one Hyperlapse, rather than having to export the footage to another app or video editor simply to stitch clips together, would be a great addition too.

Technically Tremendous

Instagram have already blogged about how they built the app, which you can read on Instagram's blog. Otherwise, just watch the following videos and you'll probably get an instant “oh yeah!” moment when you visualise how the stabilisation works!

The smartest thing about Hyperlapse is the way the footage is stabilised. It's such a simple concept: use the built-in gyroscope to record the movement of the camera while filming, then counterbalance that movement in the opposite direction when processing the final result. It's pretty much as simple as that (plus a few thousand hours or so of perfecting the code). It's a bit like how noise-cancelling headphones work – using microphones to record the surrounding noise and then playing the inverse sound through the headphones, leaving a clean slate to play music over.
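As a very rough illustration of the idea (and definitely not Instagram's actual pipeline), here's a toy sketch that integrates a hypothetical gyroscope roll signal and counter-rotates each frame with OpenCV; real stabilisation works in three dimensions and smooths the camera path rather than flattening it:

```python
# Toy single-axis gyro stabilisation: accumulate the camera's roll from the
# gyroscope's rate samples, then rotate each frame by the opposite amount.
# The frames and gyro readings are hypothetical inputs.
import cv2

def stabilise_roll(frames, roll_rates_deg_per_s, frame_interval_s):
    """frames: list of HxWx3 images; roll_rates_deg_per_s: one gyro sample per frame."""
    angle_deg = 0.0
    stabilised = []
    for frame, rate in zip(frames, roll_rates_deg_per_s):
        angle_deg += rate * frame_interval_s   # camera roll accumulated so far
        h, w = frame.shape[:2]
        # Counter-rotate the frame (the sign depends on the gyro's axis convention).
        m = cv2.getRotationMatrix2D((w / 2, h / 2), -angle_deg, 1.0)
        stabilised.append(cv2.warpAffine(frame, m, (w, h)))
    return stabilised
```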

It got me thinking that DSLR manufacturers and software engineers could learn a lot from Hyperlapse, particularly around image-stabilising (IS) lenses, which have gyroscopes built in to detect shake so the internal glass can be held steady. The difference with Hyperlapse is that it uses only the digital data from the gyroscope to stabilise the final picture, whereas an IS lens uses its gyroscope to physically steady the glass. What would be really interesting is for editing software to have access to that IS lens gyroscope data and stabilise the footage further digitally. Just think how good the results could be if the software had real physical information about the shot, rather than only analysing how pixels move across the frame.
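Purely as a thought experiment, fusing the two sources could be as naive as a weighted blend of per-frame motion estimates; the sketch below is hypothetical and not any real editing suite's API:

```python
# Hypothetical fusion of per-frame rotation estimates (in degrees) coming from
# lens/gyro metadata and from pixel analysis; the weighting here is arbitrary.
import numpy as np

def fuse_rotation_estimates(gyro_deg, pixel_deg, gyro_weight=0.7):
    gyro_deg = np.asarray(gyro_deg, dtype=float)
    pixel_deg = np.asarray(pixel_deg, dtype=float)
    return gyro_weight * gyro_deg + (1.0 - gyro_weight) * pixel_deg
```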

Can’t Wait to Play

I’m personally very excited to see what I can create with Hyperlapse, but for now I’ll probably have to wait until I’ve got a new iPhone, as mine is now 4 years old and struggling with most things. All eyes are on Apple now to see what their September event brings…
