Hacker News
HDR video accomplished using dual 5D Mark IIs (engadget.com)
57 points by makeramen on Sept 9, 2010 | hide | past | favorite | 23 comments



It's really disappointing to see even engadget disparaging HDR because they think it's the same thing as tone-mapping, and because they think tone-mapping always produces freaky results. All HDR really is is a technique for gathering more information that can be post-processed.
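
(To make the point about "gathering more information" concrete, here is a toy sketch of merging a short and a long exposure into one radiance map. The threshold and exposure ratio are made-up illustration, not anything from the article or the rig it describes.)

```python
# Toy HDR merge: combine two exposures of the same scene into one
# floating-point radiance map. Pixel values are assumed linear in [0, 1].

def merge_exposures(short_exp, long_exp, ratio=4.0):
    """short_exp, long_exp: lists of linear pixel values in [0, 1].
    ratio: exposure ratio between the two shots (illustrative)."""
    merged = []
    for s, l in zip(short_exp, long_exp):
        if l < 0.95:                 # long exposure not clipped: trust it
            merged.append(l)
        else:                        # clipped highlight: rescale short exposure
            merged.append(s * ratio)
    return merged

# A bright sky pixel clips the long exposure but not the short one:
print(merge_exposures([0.25, 0.3], [1.0, 0.9]))  # → [1.0, 0.9]
```

The merged values can exceed the displayable range, which is exactly why a tone-mapping step comes afterwards.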

And, from what little we've seen of HDR+auto tone-mapping on the iPhone, it could be the harbinger not of rampant psychedelic tone-mapping, but of more reserved and tasteful tone-mapping.

One thing I'm willing to bet on, though, is that this will elicit a lot of criticism from the people who've fallen in love with the limitations of current cinematographic techniques. They'll complain about how you can actually see stuff in shadows, and that the lens flare doesn't do enough to obscure the subject, all without realizing that this advance merely makes those things optional.


" ... and that the lens flare doesn't do enough to obscure the subject, all without realizing that this advance merely makes those things optional."

The funny thing is, lens flare is like tape hiss in audio recording; it's an artifact of a given capturing technology, not intrinsic to the thing being captured. On the one hand, it should just go away as recording technology improves, since you don't experience lens flare when you see something with your own eyes, yet, unlike tape hiss, it has become some sort of token indicator of "realism" (especially in CG movies).

Some people will look at HDR video and complain that, like HDR still shots, it doesn't look real, as if something viewed through a recording intermediary could ever properly look real.


You can experience lens flare in your eyes, because each eye has a lens. http://en.wikipedia.org/wiki/Lens_flare

Close your eyes until only a slight aperture remains open, and you will experience all kinds of interesting effects made by your eyelashes too.

Totally real, like HDR (we see far more dynamic range than digital cameras, and our eyes respond logarithmically, not linearly, to light).


Interesting. I'm aware of visual distortion when squinting, but never thought about lens flare under typical eyeball conditions. Perhaps it's a phenomenon that the mind filters out, or is so obvious I never think about it. (Now I'll have to pay attention.)

Do you think the traditional cinema lens flare mimics what the naked eye sees? I'd think not.

A problem with pure CG movies is that they feel sterile. Producers add in visual distortions to make it feel real, but what they're adding is merely what's familiar (that is, the results of traditional lens and film that has become accepted as a stand-in for real), not what a naked eye might normally experience. But it's enough to make people feel more comfortable with what they're seeing.


Using spatial dithering on the iPhone 4's Retina display, you could make an app that displays HDR while still having the resolution of the old iPhone. Throw in temporal dithering if it's something like a static photo viewer, and you would have a pretty massive range.


That would just increase the resolution. The real problem is that the device is way too dark to display the levels of radiance you see on a bright day. Tone-mapping has to be used to bring those levels down, and there is no "correct" way to do that - one has to play with settings.
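
(A concrete example of "bringing those levels down": the global Reinhard-style curve L/(1+L) is one common choice, sketched below. The pre-scaling "key" value is exactly the kind of setting one has to play with; the numbers here are illustrative.)

```python
# One way to compress scene radiance into a display's [0, 1) range: scale by a
# "key" value, then apply L / (1 + L). There is no single correct choice of
# curve or key; that is the subjective part of tone mapping.

def reinhard(luminance, key=0.18):
    scaled = key * luminance
    return scaled / (1.0 + scaled)

# A huge input range collapses into [0, 1): dim values stay roughly
# proportional, while very bright values saturate toward 1.
for radiance in [0.1, 1.0, 100.0, 10000.0]:
    print(radiance, round(reinhard(radiance), 4))
```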


It wouldn't increase the resolution, it would increase the dynamic range; for two shots with different exposures you would do four pixels white if both were white, two pixels white if one was full white and the other full black, etc. (Your flat panel probably already uses a similar approach to get roughly 8 bits of overall color depth out of 7-bit elements.)
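
(The scheme described above can be sketched as follows; the 2x2 block size and fill order are my own illustration of the general idea.)

```python
# Spatial dithering sketch: trade resolution for extra levels by driving a
# 2x2 block of physical pixels per logical pixel. With each physical pixel
# either on or off, a block can show 0-4 lit pixels: five levels from one bit.

def dither_block(level):
    """level: desired block intensity, 0..4. Returns a 2x2 block of 0/1s."""
    assert 0 <= level <= 4
    order = [(0, 0), (1, 1), (0, 1), (1, 0)]  # fill order spreads lit pixels
    block = [[0, 0], [0, 0]]
    for r, c in order[:level]:
        block[r][c] = 1
    return block

print(dither_block(2))  # half intensity: two of four pixels lit
print(dither_block(4))  # full white: all four pixels lit
```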


I meant intensity resolution (or color depth, as you say). Dynamic range is something else - it is the interval [darkest, brightest], or their ratio if you prefer.


Ick... it's like looking at some of the earliest HDR photos, when people using the tech had no idea how to use it properly.

If that's what people think when they think "HDR", it's no wonder there's been so much misguided animosity.

edit: Please to repeat after me:

Good HDR: http://blogs.adobe.com/jnack/files/images/ChristChurchCathed... (ironically, the same site links this and the video)

Bad HDR: http://www.digitalcameratracker.com/wp-content/uploads/2006/...

Good HDR: http://ursispaltenstein.ch/blog/images/uploads_img/hdr_japan...

Bad HDR: http://www.flickr.com/photos/9147703@N03/2176897085/

What have we learned?


Not entirely sure that it's good or bad, just differing points of view on how to get a certain look.

Sure, with time artists will turn the dial down and video HDR will be more subtle. All new tech goes over the top at first. Remember video morphing?


Agreed. The clip of the HDR'ed person looks almost fake, since there's this halo that surrounds him.


My first thought is that this video looks exactly like a video game, but that is because current video doesn't use these techniques whereas video games do. I think this actually looks better than traditional video, but whether it is actually adopted or not depends on what viewers think. There's an interesting parallel with why TV broadcasts and digitally filmed movies aren't displayed at higher framerates: they look "fake" or "homemade".

What we really need to make our digital images more lifelike, of course, is cameras and monitors with higher dynamic range. There are efforts underway to create actual HDR LCD monitors where, instead of a uniform backlight, there is an array of white LEDs that are selectively lit according to which parts of the screen should be bright, resulting in a ridiculously high contrast ratio. But for the time being, data formats are designed around our current contrast ratios and dynamic range.

Don't remember who said this, but a researcher in this field said something along the lines of "you would never mistake your monitor for a window to the outside, but this is what we are trying to achieve". This actually seems plausible, apart from the depth perception side of things, with today's developments in super-high resolution displays. Think the new iPhone display scaled up to 50 inches, with a much higher contrast ratio. I'm very excited about this.

It's going to be interesting to see what amateurs do with the 5D camera. Equipment cost is no longer an excuse not to do cool things with video, and this demonstration is a perfect example.


I'm amazed at how much it looks like modern video game graphics, particularly the video of the man. If there was a way to go from the HDR to a "normal" image, perhaps that would let video games look slightly more realistic too.


http://vimeo.com/11774969 - similar setup, but with more advanced cameras.


Now this is what it is supposed to look like... But I think that it is down to technique rather than the cameras.

This guy lays out most of the obstacles with the process in a very short comment:

1. Maintain consistent depth of field between the two cameras by using separate neutral-density filters for each camera.

2. Spend time tweaking your luma keys for each video stream.
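
(A luma key here just means choosing, per pixel, how much of each exposure to use based on brightness. A minimal grayscale sketch, with thresholds made up for illustration:)

```python
# Minimal luma-key blend of two aligned grayscale streams (values in [0, 1]):
# take the dark exposure where the bright one is blown out, and feather the
# transition so the seam isn't visible. Thresholds here are arbitrary.

def luma_key_blend(bright_px, dark_px, lo=0.7, hi=0.9):
    """Blend one pixel from each stream based on the bright frame's luma."""
    if bright_px <= lo:
        w = 0.0                           # well exposed: keep bright stream
    elif bright_px >= hi:
        w = 1.0                           # blown out: use dark stream
    else:
        w = (bright_px - lo) / (hi - lo)  # feathered transition
    return (1 - w) * bright_px + w * dark_px

print(luma_key_blend(0.5, 0.3))   # → 0.5, keeps the bright stream
print(luma_key_blend(0.95, 0.6))  # → 0.6, swaps in the dark stream
```

The width of the feathered band (lo to hi) is the knob the comment is talking about tweaking; too narrow and you get visible seams, too wide and you get the halos people complain about.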

People that work on lighting crews should be getting nervous right about now.


It's a 3D rig with zero inter-axial separation (well, zero as far as compositing is concerned, not physically zero, of course). So shooting this stuff bears the same problems as shooting 3D in general. DOF is easy to solve, as is controlled lighting (stabilized voltages are the norm anyway). The tricky part is shooting outside, as evidenced by the strong flickering in the OP's video. Though that can be solved in post.

And you are right, this video (and the other one on that vimeo account) is better mostly due to process/skill and not the equipment used. One could argue about HDR, though, since it's only a technique to widen the range of possibilities within compositing; how one utilizes those can be viewed as artistic direction. I also dislike masks with visible feathering, as seen in most HDR stuff out there.


From the comment here http://vimeo.com/12828140 it looks as if this guy isn't even using a beam splitter, just two wide angle lenses.

I think that most of the magic in this technique is in the luma keys. (duh...)



I'm still waiting for an HDR display to actually view all this HDR material that is being produced these days.


My brain actually hurts during some of this. You can see some of the defocusing, misalignment, and artifacts that show up in individual frames. Not really sure how I feel about HDR in video.


How much post-processing is needed to combine the two video streams? Is it parallelizable?


At first blush it seems embarrassingly parallel, with each matching frame (or block of several frames) from the two cameras being chunked off in parallel. But it can't be that straightforward, can it?
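
(If each output frame really did depend only on its own pair of input frames, an ordinary process pool would do. A sketch with toy "frames" and a stand-in averaging merge, none of which is from anyone's actual pipeline:)

```python
# Embarrassingly parallel frame merging: each output frame depends only on
# one frame from each camera, so pairs can be processed independently.
from multiprocessing import Pool

def merge_pair(pair):
    under, over = pair            # one toy "frame" from each camera
    return [(u + o) / 2 for u, o in zip(under, over)]  # stand-in merge

if __name__ == "__main__":
    pairs = [([0.1, 0.2], [0.9, 1.0]), ([0.3, 0.4], [0.7, 0.8])]
    with Pool() as pool:
        merged = pool.map(merge_pair, pairs)  # frames processed independently
    print(merged)
```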


You would probably want to make sure that the tone mapping used from one frame to the next is relatively consistent.

What this means is that if you have a scene that is tone-mapped to just fit within the dynamic range of the monitor, and then the brightest lights in the scene are cut, you have a choice: either use a tone mapping that is similar to the previous frame, resulting in an apparently under-exposed frame that uses only a fraction of the dynamic range of the monitor, or use a radically different tone map, which would replace the sudden darkening effect with colors shifting all over the place.

A naive automated tone-mapper might take the latter approach in order to be embarrassingly parallel, but in order to look natural, the tone-map would have to shift gradually, just like pupils dilate gradually. These inter-frame data dependencies will make tone-mapping video comparable in complexity to video encoding. It also means that it is unlikely to be a task that can be fully automated, because there is a tradeoff between inter-frame contrast and intra-frame contrast that can only be resolved with knowledge of the artistic intent of the sequence.
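
(The gradual shift described above can be sketched as a low-pass filter on the per-frame exposure key; the smoothing constant is an illustrative choice, not a recommendation.)

```python
# Temporally coherent tone mapping sketch: instead of using each frame's own
# average luminance directly, low-pass filter it across frames, mimicking how
# pupils dilate gradually. This is the inter-frame data dependency described
# above: frame N's tone map depends on frames 0..N-1, so frames can no longer
# be processed fully independently.

def smoothed_keys(frame_avgs, alpha=0.1):
    """frame_avgs: average luminance per frame. alpha: adaptation speed."""
    keys, current = [], frame_avgs[0]
    for avg in frame_avgs:
        current += alpha * (avg - current)  # move a little toward the target
        keys.append(current)
    return keys

# Lights are cut at frame 3; the key adapts over several frames instead of
# jumping, so the tone map doesn't shift radically within a single frame.
print(smoothed_keys([8.0, 8.0, 8.0, 1.0, 1.0, 1.0]))
```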



