James Cameron champions faster film projection rates (from 24 to 60 fps) (latimes.com)
214 points by fourspace on April 1, 2011 | 134 comments



> The filmmaker said that when he begins shooting the "Avatar" sequel in about 18 months, he will be shooting at a higher frame rate, though he has yet to decide if that will be 48 fps or 60 fps. He said George Lucas was "gung-ho" to make the conversion, and also called Peter Jackson one of his allies. Jackson, he said, had at one point been heavily weighing shooting "The Hobbit" at 48 fps.

Awesome!

It's finally happening. The 24 fps speed is a relic from the stone age of cinema, back when technology simply could not cope with higher frame rates. So all they could do was shoot at 24.

Then something odd happened. Everyone was shooting at 24, including the grand masters of the art. All the procedures, techniques, all the clever things they devised, took that frame rate as a given. They were not simply shooting AT 24 fps, they were shooting FOR 24 fps.

Thus the "film look", which is the way a motion picture looks if it's shot at 24 fps using the standard cinema techniques. From a technical limitation, it became a cultural given. Because Bunuel and Eisenstein and Bergman all shot at 24 fps, everyone simply assumed this is how cinema is supposed to look like. Higher speed video ended up being seen as "cheaper looking, made for TV".

But it's exclusively a habit, a purely conventional thing, imposed by culture.

I remember thinking a while ago that what it would take to break the mold is a few big names in cinema purposefully shooting at higher frame rates, while at the same time figuring out new ways to express the same things at the new speed. Some geniuses need to reinvent the art to break free from the old stereotype.

Cameron filming at 48 or even (please, please, please) at 60 fps cannot do the same things that Cecil DeMille did at 24. Or, rather, is free to not do the same things, because the lower frame rate is limiting in many ways.

And now finally it happens. This is great. I hope 24 fps finds its true place - in the history books. It's time to move on.


I'm not a cinematographer, but having gone to USC I met and talked with some (both aspiring ones and those who came back to give guest lectures). There was a digital projection system back in the '80s which had 60fps video; the output was much more 'real' in the sense that looking at the screen seemed more like looking at real life, and it jarred folks. One of the directors who was giving a lecture on the future of film felt it was the future, but acknowledged the following counterpoint.

24FPS celluloid, with its known gamut limitations and temporal resolution, creates an ambiance for the film. This is not unlike the way that canvas and oils create a certain ambiance about an oil painting. Now, is the future of 'art' digital printers that can reproduce what an artist might have painted in oils, something they can paint using a Wacom tablet and Illustrator? No, not really. The medium is part of the art; it's part of the palette.

Do digital films at high FPS have a place? Of course they do (art is pretty liberal in what it will accept as a medium :-) but will 24fps celluloid be consigned to the history books? Answering that question is less obvious.

The change from hand-cranked to mechanically operated shutters was a clear win for cinematographers because it took some randomness out of what the audience would see; talking pictures did the same. The change from celluloid to video is, however, a medium change, not a technique change.

I believe the comment at the particular talk I attended was "If Leonardo could have taken a picture of Mona Lisa instead of painting her, would it be in the Louvre today?" Can't easily answer that.


This sounds like an argument similar to the one lamenting the advent of digital photography because the grain of the film no longer provided the same effect in photos. But although the grain effect is available for digital photos, it is seldom seen.


Your claim "But although the grain effect is available for digital photos, it is seldom seen." is a fallacy. I'm sure if someone wanted a grain effect in their photograph they would be more likely to actually use film rather than use a digital simulation of what film would have looked like had they chosen to do that. I don't see wide spread use of the 'water color' effect either but people still make water color pictures. It is just that they start out with water colors.

I was rebutting the claim that shooting film on celluloid would 'go away' because there were higher frame rate technologies available. I don't believe that conclusion is supported by either current or past experience in the art world. It was not a 'lament'; it merely challenged the reasoning that 24 fps celluloid movies would be replaced in their entirety by higher frame rate ones.


> Your claim "But although the grain effect is available for digital photos, it is seldom seen." is a fallacy.

Yet it's factually true.

> I was rebutting the claim that shooting film on celluloid would 'go away'

Perhaps I did not choose the best words.

It will not disappear from the face of the Earth, of course. But it will be relegated to "retro" or nostalgic movies, etc.

It depends on how quickly the new, less limited technologies are adopted by the big names in cinema, and of course on how quickly the old generation, accustomed to the old way a film "is supposed to look", will retrain their taste or die off. :)


a grain effect added to a digital photo is a kind of affectation, an artifice that is not intimately connected to the process of making the image, included as a stylistic decision; possibly an afterthought. it seems dishonest (not necessarily wrong) but it is part of a panoply of effects that can be applied that constitute what's possible with digital. there's no need to 'lament' since each medium has its own panoply of possibilities: photochemical emulsion gets to be 'honest' about grain, however, and digital does not, since with emulsion grain is not an 'effect': it is what emulsion, in fact, consists of. the same cannot be said about digital.

there are analogous issues between film and video, and even between older video formats and digital video. the future of movies no doubt includes a choice of frame rates, frame resolutions, interlace vs. progressive, and aspect ratios as creative choices. they will be part of the panoply of possibilities that artists will be able to use.

n.b.: in the silent era, there was no real standard film speed, and the film was often exhibited (in the days of hand crank projectors) at varying speeds to suit the content of the image and the music that was invariably played live alongside it. for example, romantic scenes might be cranked a bit more slowly; comedy scenes could be cranked a bit faster. it was only when the industry started to be professionalized that silent films were shot at 16 or 18 fps. with the sound era, the fps was upped to 24 partly to improve the fidelity of the optical sound. most projectors have had, for decades now, multi-leaf shutters which interrupt the light path as many as 3 times per frame to eliminate the perception of flicker.


...interlace vs. progressive...

Please, no. Interlacing doesn't have the same nostalgic effect as a 24fps frame rate, and only serves to lower video bandwidth for devices that can't handle a progressive image at the desired frame rate. It's a tradeoff between vertical and temporal resolution that works well for limited technology because it preserves the sharpness of still images. Basically everything except broadcast TV can handle 1080p60, though, so let's let the hack die.


generally speaking, i'm no fan of interlace either, but my argument still stands that it could be used expressively -- in other words, to enhance the experience of the content for appropriate material. as an example, it might be a good choice if you were making a program and you wanted to exploit the impression that the viewer was watching a tv broadcast from a few decades ago.


I definitely agree that no option should be removed from the palette available to artists, just as long as the final delivery medium doesn't require deinterlacing.


There's simulated and real. I see scanlines drawn on CRTs appearing in cartoons all the time, which has nothing to do with the actual format of the video playing.


Instagram is showing that people do like low-fi effects.

But I agree with you -- this is a mere affectation, a way of turning common images into something evocative. And this becomes much easier to achieve when starting with a high resolution image.


That is an argument of professional taste. This is different. Many non-professionals unconsciously associate the more realistic quality of video with TV movies, soap operas, and game shows. (i.e., 60fps video reads as a sign of lower-quality content.)

The same was not true for still photography. It certainly is an open question whether a negative association with video will go away or remain.


I think any picture Leonardo would have taken would be a significant cultural artifact, as would any telephone he built, or any computer he made.


Now we have higher framerate digital and post-production teal and orange effects.


Isn't everything about film and film-watching "exclusively a habit, a purely conventional thing, imposed by culture"? Or are there aspects that are objective? (Would I even watch a James Cameron film if it weren't for cultural norms encouraging me to?)


A postmodernist would reply that everything is purely conventional. Others may disagree. But that's an entirely different can of worms.


I don't see how it's any different, I suppose. The aesthetics of the 24-fps film are aesthetics, just like basically any other aspect of film, ranging from acting styles to types of cuts. I happen to like the aesthetics of 24-fps film for certain kinds of scenes, and actually prefer 14-fps and 18-fps even more, again for certain kinds of scenes. You seem to be arguing that I don't "really" like the aesthetics, but am being misled by my associations. But I think I do really like the aesthetics!

Mostly, I don't see why my film fps preference is more culturally constructed, or constructed in some more objectionable way, than basically every single other thing having to do with film, which is nothing but an aesthetic and cultural phenomenon to begin with.

I mean, we might inquire into the habits and conventions that lead you to prefer 60-fps film, just as much as we look into those that lead me to prefer 24-fps film! Perhaps you have an aesthetic preference that films "should" be more realistic, and less stylized? Or a technological-aesthetic ideology that films should use "better" technology because it must necessarily be better aesthetically?


Wow, that's a lot of sophistry.

There are objective differences. 24 fps contains less information than 60 fps. Its temporal resolution is 2.5x worse.

Believe me, I can clearly see my own cultural bias. It keeps telling me 24 fps somehow does look okay. But we are not complete automata; we can distinguish between instinctive reactions (acquired tastes) and reality.

And the reality is, the higher the frame rate, the more information is recorded. An intentionally crippled low-framerate movie may serve certain esthetic norms, but that's pretty much all it does.

As far as 24 fps cinema is concerned, thanks to a century of conditioning, we're all Pavlovian dogs.


I guess I'm missing how information recorded is even relevant. If recording footage of a scientific phenomenon for later study, sure. But art? What principle says that more temporal resolution equals better aesthetics? This seems like some sort of cultural bias, assuming higher fidelity = better aesthetics, which I don't think is true.

I tend to prefer more stylized films overall, not just in temporal resolution either. If you prefer more realistic ones that seem like you're really watching a scene that's happening, that's a legitimate preference, but it's not mine. I'm not sure the 24-fps (or better, 18-fps) effect is the most important or best anti-realism aesthetic effect in a film-maker's toolbox, but it's one of them. It's probably, as you say, an accident of history that it's as widely used as it is, but it's in my view a positive accident of history. Modern Hollywood is far too realistic in its shooting style imo, and this is one of the few fortunate areas where it hasn't gone all the way off that cliff.


I think you're missing GP's point. The look of black-and-white film creates a different feel than color film. Clearly, there's a lot of "information" missing from a black-and-white film. But filmmakers still use it when they want that feel. Same with the way CSI colors everything blue. Sure, it loses a lot of color info and distorts the gamut etc. but it has a particular feel because of that.

So, sometimes (maybe even most of the time) you want a smoother picture with more motion information. But sometimes you want something that feels choppier. Example: http://revision3.com/filmriot/warfilm There's a short film at the beginning that demonstrates the techniques they're talking about, with commentary starting about 5:00 in. Specifically, in this film, they used a high shutter speed at 24 fps to create a choppier look. If they used 60 fps, motion would be smoothed out, so even with a high shutter speed the violent feel created by missing information wouldn't be the same.


more != better


This sounds a lot like "Now that we have color film, it's time for B&W to find its true place - in the history books. It's time to move on."

Which is not to say that 24p won't get largely eclipsed. I suspect it will. But not because it's universally 'better'- just more appropriate to a growing number of things.


The problem is that 24 FPS is now part of the Hollywood "look". 24 FPS is associated with high quality high budget productions in people's minds; 60 FPS looks like a home movie or low-budget TV sitcom. It may take a while to change people's perceptions.


Sitcoms and soap operas look the way they do because of lighting. I won't disagree that 24 fps has something to do with the 'film' look, but I think camera lens, filters, and lighting affect the look of the movie more than the framerate.


I'm just a layperson when it comes to film but I want to respectfully disagree with the assertion that it's all about lighting.

A friend recently picked up an LED "3D" TV. Now, I always thought 3D was a BS gimmick, and I still do, though I will concede that it works far better than I ever thought it would. Which was: not at all.

But the TV -- a Samsung -- had a feature that reminds me a bit of up-converting; however, it's used on every input, even 1080p from the Blu-ray. I googled it at the time, but the gist is that, in 2D mode, it computes deltas between frames and inserts new ones to make the image seem more fluid.

IT LOOKED LIKE TRASH!

It made everything look like a soap opera or a documentary. It messed up the depth of field. Sure, things were hyper-crisp, but alarmingly so. It made it look cheap. Now, there's a setting buried in the control panel to tweak or disable that, but this is how it came from the factory, and I thought it was unwatchable.

So it's not merely lighting.

One last thought: I actually think the perception of higher-quality shots in films has a lot to do with Steadicam, which of course will be the same at 60fps. But that's just a guess.


This is a straw man. Interpolated tweening frames are not the same as actual frames with new content.

It may look like trash, but--then again--so do upscaled images.


His observation suggests that higher frame rate, with different content in each frame, does achieve the soap opera look. If the interpolation is advanced enough, it may be arriving at something close to what the extra frame would have been.
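
To make that mechanism concrete, here's a deliberately crude sketch (Python with NumPy, illustrative only) of where inserted frames come from. Real TVs do motion-compensated interpolation, not this plain blending, so don't read it as how any particular set actually works:

    import numpy as np

    def blend(frame_a, frame_b, t=0.5):
        """Naive 'in-between' frame: a straight blend of two neighbors.

        Real motion interpolation estimates motion vectors and warps pixels;
        this linear mix is only a stand-in to show where extra frames come from.
        """
        return (1.0 - t) * frame_a + t * frame_b

    def double_frame_rate(frames):
        """Turn an N-frame clip into a (2N - 1)-frame clip by inserting blends."""
        out = []
        for a, b in zip(frames, frames[1:]):
            out.append(a)
            out.append(blend(a, b))
        out.append(frames[-1])
        return out

    # Toy 2x2 grayscale "frames" with rising brightness
    clip = [np.full((2, 2), v, dtype=np.float32) for v in (0.0, 64.0, 128.0)]
    print(len(double_frame_rate(clip)))  # 5 frames out of 3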


A "straw man" ? I didn't know this was a debate. Besides, I think you missed my point: That it's not just about lighting.


But seanalltogether never said it was just about lighting. He mentioned, "camera lens, filters, and lighting."

Your dissection of what is wrong with Samsung's interpolation had nothing whatsoever to do with seanalltogether's comment.


Does Saturday bring out the debaters? What is this? "Sitcoms and soap operas look the way they do because of lighting."

Now come back at me with some word-by-word, point-by-point dissection, but I'm done; this isn't worth it.


"It messed up the depth of field."

Absolutely preposterous. Depth of field is a function of lens focal length, aperture, and sensor size. There is no way that Samsung's TV could in any way affect the depth of field of the original recording. If it could, then it would be a magical motherfucking TV.


You ought to watch the TV before wetting the bed over it. Go to a Best Buy and have a look.

Because as much as you clearly know more about film tech than I do, you seem to be lacking a slight amount of common sense: what's happening there is processing of a digital signal. Specifically, the TV is creating frames to create the illusion of smoother video. And what this effect does is make the shot appear flatter.

What your slashdot-worthy post overlooks is that you're speaking from your limited knowledge of film. However, in this TV, there's a feature that lets you apply this setting to just half the screen, so you can see them both, side by side, and tweak as you wish.

Just like in life, there's no amount of theoretical education that can make up for actually being there, for actually having hands-on experience.

So really, save the hyperbole.


Uh, you started a debate, then got mad at people for responding in like manner. I don't understand. Go back and reread your post; it was both confrontational and of low technical quality.


The reason it looks like trash is not that it's a higher frame rate; it looks like trash because it's removing natural motion blur from the image.

This actually makes it more like a video game, in which discrete images are generated from the world, instead of a natural film/digital sensor, in which millions of photons are captured over a short duration.


That's what I always used to think, until a few years ago. Then a new crop of TVs came along and started to mess up the movies. The movies looked like soap operas. It turned out that it was caused by a feature that increases fps and inserts interpolated frames (every manufacturer has a different name for it). After turning it off, the movies look like movies again.


I have always asked myself why films made for television look different than the ones made for cinema. Thank you for the explanation!


Not to mention dynamic range/gamut, gamma, etc.


No, the Hollywood "look" is having tens/hundreds of millions of dollars to dump. TV looks low-budget because it is low budget. But there's a big difference between real low budget TV and simply TV.

Honestly, download an original British or Canadian drama show that doesn't get carried in the US and you'll see low budget. You'll see horrific, unwatchable direction.

I'm sorry, but give Sam Raimi a video camera with any frame rate and there's a significant chance he'll produce a blockbuster with it. I mean Oliver Stone released big budget movies using Super8!

The Hollywood "look" is genericness, because like everything else about Hollywood the studios are whole-heartedly fearful of anything that might stop someone watching a film, and in doing so turn people off watching films by lacking originality from plot to cast to cameras.


I can't speak for Canadian TV but primetime British dramas are generally technically excellent - for instance, this year's Oscar winner for Best Director, Tom Hooper, has directed episodes of Cold Feet, EastEnders and Byker Grove.

Practically nothing that doesn't feature US accents and locations is picked up by the networks - sometimes period co-productions sneak in, as long as there's pretty frocks and no swearing.

You're missing Misfits, for instance, which is the best show on TV...


I'm not arguing against you; I'm from the UK and there are some great shows. Skins has been technically excellent from the start; I even got my wife (Canadian) watching it. Shameless is another example. Even shows like Hollyoaks (if it's still on) are technically great; I actually thought their quality of shooting (the clearness of camera positioning, etc.) was better than shows like 24 were managing, and it's a rolling production with 1 week to film an episode and 3 episodes airing a week.

I'm talking about, say, the first episode of Primeval. It was atrocious across the board, from acting to directing to 3D graphics, especially for a show reaching an audience of ~6.8 million. Compare that to the outstanding production qualities of Stargate Universe, which only got ~2.3 million in a 5x bigger market.

My examples are shows like the Canadian Flashpoint, which despite a decent budget and getting 1 million viewers can just be awful. The stories often have little reasoning behind them (shit literally just happens), and the direction can often be gaudy, trying for grand shots and just utterly failing. It often feels cold, like they're not using close-ups right. I often find this in many British shows too; comedies and traditional dramas not so much, but a lot of the police shows especially can be awful.


OK, Primeval is not great - but it is well-researched: most of the monsters in the show are not known prehistoric creatures, for example, which makes sense given the long gaps in the fossil record. And the rendering is excellent - it's just that the creatures have a limited interaction with the real environment, which I guess is down to budget.

And Primeval could never approach, say, the brain-rotting stench of The Cape. SGU's poor ratings are not a badge of honour; it is a cancelled show!

As for police shows - that's pretty much a US speciality, but the really good ones are under constant threat of cancellation.

I do concede though, while I've frequently thought of British TV lately "damn, there's a lot that's good right now!", this thought has never consciously crossed my mind while watching anything Canadian. Due South, anyone?


The BBC has some of the best camerawork I've ever seen. Other European productions aren't bad either (RTL comes to mind).

Disclaimer: I'm a layman, but that's my opinion as a South American that has watched both North American and European TV (with long stays in Canada and Austria).


When I was in college and doing a lot of video work (I worked at a TV station, and volunteered in the video editing labs in my free time), one of the most requested features from students doing short films was doing a 3:2 pulldown* to convert their edited video to 24 FPS and then reconvert back to 29.97, thereby giving them the appearance of having originally shot on film (at 24 fps).

Sadly we were using first-generation G4s with Final Cut 1.0, so to do this for any film over about 5 minutes meant that the render time would be measured in days, not hours.

* http://en.wikipedia.org/wiki/3:2_pulldown#23pulldown
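
If you've never run into the cadence, here's a toy sketch (Python, illustrative only; real NTSC runs at 23.976/59.94 rather than exactly 24/60) of how 3:2 pulldown spreads 24p frames across 60i fields:

    def three_two_pulldown(num_film_frames):
        """Map 24p film frames onto 60i fields using the classic 3:2 cadence.

        Each pair of film frames becomes 5 video fields (3 + 2), so 4 film
        frames -> 10 fields -> 5 interlaced NTSC frames per 1/6 of a second.
        """
        fields = []
        for i in range(num_film_frames):
            repeat = 3 if i % 2 == 0 else 2  # alternate 3 fields, then 2
            fields.extend([i] * repeat)
        return fields

    # 4 film frames produce 10 fields, i.e. 5 NTSC frames:
    print(three_two_pulldown(4))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]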


60fps also looks like a high-budget video game. I think that can play a role in overcoming a high-frame-rate aversion.


Perhaps digital animation is the proper place for the transition to take place?


That's likely a big part of it, but there are also genuine aesthetic effects. In the arthouse-film world, Super 8 filmmakers can normally choose between 18 fps and 24 fps, and don't have the same baggage Hollywood does in which only one of those "looks right": both 18-fps and 24-fps films are accepted by audiences as legitimate. Yet some directors choose one, some choose the other, and some directors switch depending on the film, because 18-fps and 24-fps shooting produce different stylistic effects.


I think the fact that Cameron, Jackson, and Lucas are using it will make the transition much easier.


There are some televisions that have the ability to interpolate a 24fps format and produce an apparently higher frame rate. Watching sports on these televisions is great, but I tried watching Iron Man on one and it made it look like an after-school special. I believe this is colloquially known as the 'soap opera effect'.

Google for 'samsung soap opera effect'.


My new TV does that - it came with the mode enabled, and when I watched an HD movie off Netflix, it seemed very odd and low-budget - this soap opera effect - until I realized what was going on with a second movie doing the same thing. It's definitely jarring and everything looks, ironically, like it was shot with cheap cameras.


Have you tried getting used to it? Does the effect go away with time or is it forever burned in your brain?


I watch everything like this on my Samsung television (this is a 2009 model, so I assume nearly all new TVs have interpolation + a high refresh rate). My favorite is watching old movies on Blu-ray; combined with this, it is like you are actually there.


Notice he's leaning towards 48p as opposed to 60p, even though 60p is an already-standardized framerate that your modern TV likely supports.

I do not think that is an accident. After all, once we all buy our new 3D TVs, something will have to happen for people to buy new TVs again in 3-4 years.

And before you mention bandwidth considerations, I do not have actual data, but I suspect that when using H.264 High Profile, the difference in bandwidth between 48fps and 60fps won't be that large. There is a limit to how much you can change visually in a single second: human perception.


48 divides evenly into 240, so in theory a 240Hz TV could take a 48Hz signal and display it without any pulldowns, while 120Hz is 2.5 times 48, so it needs to do some work to create extra frames. (And really, that's the main advantage, currently, of 120Hz and 240Hz: 24, 30, and 60 (the primary American frame rates) all divide evenly into them, unlike 24 into 60).
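
A small sketch of that arithmetic (Python, purely illustrative): it shows how evenly a given source rate maps onto a given refresh rate, and an uneven cadence is exactly the judder in question.

    from math import gcd

    def cadence(source_fps, display_hz):
        """How many display refreshes each source frame gets, over one cycle.

        A single number means every frame is held equally long (no judder);
        a mixed pattern like [2, 3] is the uneven 3:2-style cadence.
        """
        if display_hz % source_fps == 0:
            return [display_hz // source_fps]
        cycle = source_fps // gcd(source_fps, display_hz)
        counts, shown = [], 0
        for i in range(1, cycle + 1):
            total = (i * display_hz) // source_fps
            counts.append(total - shown)
            shown = total
        return counts

    print(cadence(24, 60))   # [2, 3] -> uneven, the familiar 24p-on-60Hz judder
    print(cadence(24, 120))  # [5]    -> even
    print(cadence(48, 120))  # [2, 3] -> uneven again, as described above
    print(cadence(48, 240))  # [5]    -> even, no pulldown needed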


Seems like you wouldn't count on the TV to do this work. Instead, you'd shoot at 48, then convert to 60 before delivery, in the same way that TV shows shot on film and edited at 24p subsequently convert to 25 for PAL markets and 29.97 for NTSC.


I would rather count on the media player (Blu-ray player or whatever) or TV to do it than bake it into the source. Think of it this way: for NTSC, some DVDs had hard 3:2 pulldown applied, converting 24Hz to 29.97Hz to show on the interlaced CRTs of the time. But now that we have TVs that can show 24Hz, if you play those DVDs they still have the baked-in pulldown artifacts. Whereas if you left it up to software pulldown in the DVD player (which they basically all had, since I believe it was part of the standard), they can now be shown at the proper frame rate without any pulldown.

Most theatrical DVDs released after the early 2000s were 24Hz, as far as I know, and depended on the DVD player to convert to NTSC. So now, with modern players, they're output at 24Hz for 120Hz and 240Hz TVs.

The only downside is if you get crappy TVs and/or Blu-ray players that do horrible pulldown, I guess. But I've gotten DVDs with hard pulldown baked in that look horrible as well (I'm looking at you, BBC).


Problem is, while you can devise ways of converting 24p to interlaced 29.97, getting from 48 progressive frames to 60 progressive frames isn't obvious to me at all. You'd have to add a frame every 4 frames. This would likely ruin the whole large framerate experience.
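
Concretely, the naive approach would look something like this sketch (Python, illustrative): every 4th source frame gets shown twice, which is exactly the kind of cadence unevenness a high frame rate is supposed to get rid of.

    def repeat_48_to_60(num_source_frames):
        """Naive 48p -> 60p conversion by repeating every 4th source frame.

        48:60 is a 4:5 ratio, so each group of 4 source frames has to fill
        5 output frames; duplicating one of them is the crudest way to do it.
        """
        out = []
        for i in range(num_source_frames):
            out.append(i)
            if i % 4 == 3:          # every 4th frame gets doubled
                out.append(i)
        return out

    print(repeat_48_to_60(8))  # [0, 1, 2, 3, 3, 4, 5, 6, 7, 7]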


Assuming the bitrates are the same, 48fps will require 80% of the bandwidth needed to handle 60.

To the best of my knowledge, H.264 only supports two GOP structures (short = 6 frames / long = 15 frames), both of which operate independently from the playback speed.

In other words, a short GOP setting produces 8 blocks per second running at 48, and 10 per second running at 60. Assuming a constant bitrate (I know, I know), the throughput of the entire system needs to increase by an additional 20% to handle 60. And that's where things fall apart.

Consider the iTunes store, where I believe HD rentals top out at 720p. Or most HD broadcasters, who skew away from the 19.39 Mbps ceiling on the ATSC MPEG-2 spec in favor of programs encoded at 13.something Mbps (the legal floor) thereby allowing an SD channel encoded at ~6Mbps to be delivered in parallel. Does this exercise in bandwidth maximization impact the quality of the HD component? You bet. Do the networks care? If an additional SD means they can dramatically increase the number of ads served, then no, absolutely not.

So regardless of how great the acquisition setup, and how awesome the playback system, there's always the network in between them to consider. And with providers moving toward bandwidth caps, I see picture quality going down, not up. So, streaming 1080p60 H.264 encoded at (say) 7-9 Mbps? Choice, yes, but total fantasy for the time being.

Of course, Cameron is talking about making theaters competitive, so maybe that's his point. Given that BluRay has effectively failed as a universal home standard, and that instant-availability is hugely favored over order-and-wait programs, image quality of filmed entertainment at home is now tied to network limits.

The real question, then, is "are there enough cinephiles who actually care?"


"Assuming the bitrates are the same, 48fps will require 80% of the bandwidth needed to handle 60."

Not exactly - while your 80% approximation is true for uncompressed footage, h.264-encoded video and essentially every modern-day video compression scheme will inherit data from the frame(s) before it. The higher the framerate, the less (generally speaking) the images will change between frames - leading to a reduction in data per frame.

Here's an example:

File 1: http://files.jjcm.org/60fps.mp4 - 127KB

File 2: http://files.jjcm.org/48fps.mp4 - 119KB

As seen here, the 48fps video is 94% of the size of the 60 fps video. Granted, this is a singular example, but at least it supports the concept that I described above.


Wow, you're absolutely right about the (minor) file size differences. That's really interesting. H.264 FTW.

I have to admit, I couldn't see any difference in the quality of the motion between the two clips, but thanks for posting them.


H.264 has no such limitation regarding GOP structure. The levels (3.0, 4.1, etc.) define (slightly indirectly) the DPB (decoded picture buffer) size, which is how many frames must be held in memory at one time. The GOP itself is not limited. Furthermore, compression efficiency for higher frame rates actually improves for live action, since there is less change between frames, even at a fixed framerate. There are compromises to be made, and a faster framerate at a fixed bitrate may not be great for all usages, but I think you are being very, and unduly, pessimistic here. From my experience, at 720p 60fps with a 6 Mbps stream (which we currently have deployed) and a nice encoder, you are looking at a very nice picture.


Very interesting. Thanks for the detail. I guess the fixed GOP structure must be limited to HDV (MPEG-4). That flexibility is pretty awesome.

For what it's worth, I suspect what you describe (720p60 @ 6Mbps) is more than fine for most people. In other words, Cameron's idea that people will be going to theaters because of magical picture quality seems like wishful thinking. I suspect people go to theaters because they like going to theaters. It's social. It's big. It's immersive and (relatively) distraction-free - all part of a fun night out.

Are they going to go home to 720p60 @ 6Mbps and think 'gah!'? Unlikely.


HDV has nothing to do with H.264, or any of the MPEG-4 standards for that matter.


You're absolutely right. I see that HDV is encoded with the H.262/MPEG-2 Part 2 compression scheme.


> "To the best of my knowledge, H.264 only supports two GOP structures (short = 6 frames / long = 15 frames), both of which operate independently from the playback speed. In other words, a short GOP setting produces 8 blocks per second running at 48, and 10 per second running at 60. Assuming a constant bitrate (I know, I know), the throughput of the entire system needs to increase by an additional 20% to handle 60."

I'm sorry, but it needs to be stated: the above isn't true and the calculations have nothing to do with reality. H.264 supports pretty much any GOP structure and you don't get from frames to final bitrate by dividing frame rates by GOP size.


Dude, chill. Also, read.

Accepting that I was mistaken about the rigidity of h.264 (a possibility I acknowledged up front), and assuming that you DID establish a GOP structure of 6 frames (at a constant bit rate), shooting for one minute at 48 fps would produce 80% of the data generated by shooting at 60fps.

Indeed, this ratio would be true if you used no compression at all. The point was that in a fixed GOP scheme (again, at a constant bitrate), the difference in FILE SIZE would increase with the difference in frame rate, so the idea that file sizes for 60 fps material would only be marginally larger than those for 48 fps seemed mistaken.

My only mistake was in assuming that the GOP is fixed in h.264. I agree that "you don't get from frames to final bitrate by dividing frame rates by GOP size." But then again, I never said you did.


So because one country is willfully going back to the stone age with caps, everyone living in other, less idiotic countries has to suffer?


Since no one else responded with this: you're right, it is not an accident that he would be considering 48fps, the reason being that a good amount of 24fps film projection occurs at 48fps on equipment that double pumps each frame. Such equipment (and the related equipment and processes around it) would likely be easier to upgrade/retrofit to 48fps than to 60fps.


I believe there are already some TVs that accept 48Hz as an input frame rate over HDMI. This is the only way 48fps video will look right, unless we move to 240Hz (the LCM of 48 and 60). A 120Hz display would give the same juddery results as playing 24fps on a 60Hz display, but going up to 180 might be good enough to make the judder invisible (I don't notice the judder of a 24fps stream on an 85Hz CRT, but do at 60Hz, for example).
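
The arithmetic, in case anyone wants to play with it (Python 3.9+ for math.lcm; purely illustrative):

    from math import lcm

    print(lcm(48, 60))      # 240: the lowest refresh rate that shows both evenly
    print(lcm(24, 48, 60))  # 240: also covers 24p with no pulldown at all
    print(120 % 48)         # 24 != 0 -> 48p on a 120Hz panel needs an uneven cadence
    print(180 % 48)         # 36 != 0 -> 180Hz still isn't an exact multiple of 48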


This is more exciting for me than 3D. I remember thinking this when I first noticed that Counter-Strike didn't feel right at <60fps, even though I was fine watching action films at 24fps.


That is because a game's frame is rendered at one absolute time point, while a movie's frame is like a photograph taken with a shutter speed of 1/24th: it includes the motion data of that whole time interval, it contains motion blur, and it's smooth from one frame to the next.

That's why a game at 24fps looks bad, but an (action) movie at 24fps looks good.


> That's why a game at 24fps looks bad, but an (action) movie at 24fps looks good.

"Looks good"? Maybe passable. And that's only because of a century of experimentation with camera techniques allowing to eschew the limitations of the format.

E.g. try giving an amateur a 24 fps camera and ask them to shoot a sequence that involves a lot of panning. The result is going to look like junk.

Even in the hands of a professional it still doesn't look good. But we've all trained our brains to shut up and accept it, because we've all watched so much 24 fps content. It's 100% pavlovian conditioning, nothing else.

To someone coming from a universe where all motion pictures are shot at 60 fps or above, our movies would look excruciatingly bad.


That doesn't seem like it could be true. The film is rolling through the camera, and if the shutter were open continuously, the entire film would be a vertical blur of light. The shutter would open and close once per frame, so the film can be advanced. The actual shutter speed of a camera shooting 24 fps would be much faster than 1/24 s.


You're right about the continuous roll. Film actually moves through the gate incrementally. There's a really neat mechanism that closes the shutter and advances the film to the next frame in one elegant move. But while the film is exposed, it's exposed for a fraction of a second, meaning that any motion occurring within that fraction of a second will register as blur in that particular frame.


Correct. There's a simple formula for that: http://en.wikipedia.org/wiki/Shutter_speed#Cinematographic_s...
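
In rough code form (just a toy sketch of that rotary-shutter relationship, not anything authoritative):

    def exposure_time(frame_rate, shutter_angle_deg=180.0):
        """Exposure per frame for a rotary shutter: (angle / 360) * (1 / fps)."""
        return (shutter_angle_deg / 360.0) / frame_rate

    print(exposure_time(24))        # 1/48 s with the standard 180-degree shutter
    print(exposure_time(60))        # 1/120 s: less motion blur per frame at 60fps
    print(exposure_time(24, 90.0))  # 1/96 s: the choppier high-shutter look mentioned upthread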


Oh, cool. The linked article about the rotary shutter also has a great animation showing how the camera works:

http://en.wikipedia.org/wiki/Rotary_disc_shutter


Just hit pause on practically any movie when there is movement.


It is really interesting how our brain compensates for the motion blur in films. I mean, you don't usually notice that things are really blurry until you pause the movie.


Our brain compensates for motion blur, period, because everything we see is effectively slightly blurred due to our eyes' response to light changes being non-instantaneous.

Specifically, cone response time seems to be on the order of 15-20ms if I read http://www.ncbi.nlm.nih.gov/pubmed/14990682 correctly. And that's just to go from initial stimulation to peak response; presumably the response also dies off in nonzero time, so the full width of the response in time is more than 20ms. Rods respond slower than cones according to http://en.wikipedia.org/wiki/Photoreceptor_cell though I can't find numbers.


Douglas Trumbull's 'Showscan' format shot and projected 70mm film at 60 fps. The huge cost kept it from succeeding, though the visual impact was supposed to be amazing.

Cameron's proposal will introduce perceptual problems similar to those of HD video - fake stuff will look more fake, every tiny detail of a performer's face will be apparent, and scenes that play a certain way at 24fps might fall flat at 60. One study showed that physical comedy was funnier at low resolution and low frame rate, and the same action looked painful to the audience in greater definition and detail.

I'll watch a good story in any format, thanks.


I saw one of those 'ride movies' in Las Vegas that had been shot in Showscan, and can't say I noticed much of a difference -- the seats were shaking and vibrating and careening though, so it may not have been the best test.

Trumbull's research found that 60fps somehow resonated the most with the brain -- no slower, no faster.


Wasn't Showscan also projected with more lumens? A quick search didn't reveal this, but I am sure I remember that Showscan was much brighter.


If you're ramping up the shutter speed on the projector, you gotta compensate with more light coming through.


Film geeks in here will eat up the story of MaxiVision48 - it's an equally inexpensive upgrade to existing film capture and projection that results in much higher picture quality, way higher than even the Red camera.

Check out this infographic on Roger Ebert's blog comparing the different formats:

http://blogs.suntimes.com/ebert/assets_c/2011/01/resolution%...

The full blog post is well worth a read, especially given the story of a single passionate entrepreneur trying to change Hollywood and his story of trying to get Christopher Nolan to champion the format:

http://blogs.suntimes.com/ebert/2011/01/more_than_ever_the_f...


That chart is seriously misleading. They wanted to represent the 2x increase in framerate, so they showed that as a 4x increase in resolution, which doesn't make any sense.

And they seem to suggest that RED made the entire chart, which they didn't. RED made the accurate part of the chart, but MaxiVision48 added their own nonsense.

The impression I get about this company is that they tried to sell a product to a bunch of people who didn't want or need it. The reason RED is so successful is that they actually made a camera directors wanted to use. Directors want to be able to do long takes, they want to see instant footage, they want to be able to go directly into the editing room after shooting, and they want a cheap medium. MaxiVision doesn't offer any of those advantages. In fact, if it ends up being harder to process, it's actually worse on all those points.

MaxiVision48 seems a little like BluRay. It's the next generation of a dead technology. Film will always be useful for certain films, but the benefits of digital go way beyond high resolution.

Anyway, interesting story, thanks for the link. Gives me some food for thought about my own business.


"BluRay. It's the next generation of a dead technology."

Slightly off topic, but you just nailed it, sir. Upvoted accordingly.


What's misleading is that you're only focusing on resolution or framerate, not the point that MV48 upgrades image quality while maintaining the unique look of film vs. digital, and also uses existing film and cameras, unlike digital, IMAX, or 3D.


In what way is it equally inexpensive to shoot twice as much film and use twice as much film to make every print? Digital tech is what's going to make high framerate capture and playback economical.


Two reasons:

1) according to the Ebert article, it's not twice as much film, it "requires 50% more film and one stop more light (a trivial issue)"

http://blogs.suntimes.com/ebert/assets_c/2011/01/MaxiVision%...

Notice what's implied in the diagram is that the MV48 format would use existing 35mm film and would modify existing cameras for it.

2) it's way less expensive than 3D, both to shoot and display.

3D is just as much a competitor as the Red camera. It's what the movie studios and theaters have been banking on to compete with the quality of home theater setups, not an increase in 2D image quality. Caring about the unique look of film compared to digital while trying to increase the quality is why directors like Christopher Nolan like the IMAX format, but shooting with 70mm film is not only expensive but unwieldy because the cameras are huge. Sort of like what bulky 3D cameras were like before James Cameron forced the invention of a new one for Avatar.


That Maxivision sample there is for prints, not for capture. If you were to take a look at a Super 35 original negative shot at 24 fps, it would look pretty much like the Maxivision print there (no waste, but half as many frames). So it would almost certainly take twice as much film to shoot in Maxivision. Also, a stop more light is not a trivial issue. It could mean having to rent 20K HMIs instead of 10K, a bigger generator, etc. Or using a higher ISO film, thus negating the benefits of the format in the first place.

No one shoots with 70 mm film. 65 mm film is the capture size, 70 mm is the print size (added 5 mm for audio tracks).


Fine, I misspoke about the IMAX format and those are all points about #1, but what are you comparing the increase in cost to? Continuing to use the same old equipment as now? Or switching to digital? Because the discussion is about the least expensive way to upgrade image quality, not about whether it's more expensive at all.

And what about the second point about 3D being an expensive sham that movie studios are using to milk more money out of the audience? And this isn't the first time they've tried that - we're going through the same fad that hit in the 60's and it'll die for similar reasons.

Regardless of whether it's Maxvision or RED, an increase in picture quality way beyond 1080p is possible and being held up by bean counters, not the directors. It's not like RED's native high resolution is translating directly to the screen - it's being downgraded for the projectors at movie theaters. I'm getting this from the FAQ page on the RED website. Scroll down to the last one - "So, where can I see 4K?" and you'll see promises of 4k projectors and displays coming soon.


I completely agree with you about 3D. I despise it.

The Maxivision goal was a noble one, but it was just too expensive to work. Increasing the framerate of content playback on digital projectors will be a far less costly initiative than the Maxivision proposal would have been. Unfortunately I'm not sure 48 fps vs. 24 fps has enough "wow" factor to convince the average movie goer to go to the 48 fps screening over a 24 fps screening, but I really hope it will.


I agree that a simple increase in framerate isn't enough of a wow factor. It might be if it's also accompanied by a massive increase in resolution?

Either way, I'm looking forward to any improvement that doesn't involve 3D. Ebert's other guest piece, the week before the Maxivision one, was an anti-3D article by Walter Murch, probably the most famous film editor and sound designer ever:

http://blogs.suntimes.com/ebert/2011/01/post_4.html


I wish that the sample footage he shot at 24, 48, and 60 fps were available somewhere I could download it =(.


A video comparing 24 fps to 60 fps (in video games, where the difference is more noticeable):

http://mckack.tumblr.com/post/3535190601/60fps-vs-24fps


I really must be "blind". I can see the difference side-by-side (especially the moving ball) but the difference is so slight. Even 15fps looked fine to me. If you showed them to me back-to-back, I don't think I could tell the difference. Which is frustrating to me, because I know the difference but I wish I could see the difference. I feel like my eyes are failing me.

This is watching the downloaded avi.


Hm, I was still stuck in the "above 30 FPS the eye cannot notice the difference" mindset. What's going on?


That was never correct. The eye is not a camera; it doesn't work in frames per second.

http://www.100fps.com/how_many_frames_can_humans_see.htm

Humans can notice stuff that happens in 1/200th of a second.


Excellent link, thanks.

His other pages are worth a look too; you can find them in the upper-left corner under FAQ.

It is nice to see that I'm not the only hacker with an interest in video technology.


I would wager that this upper limit, whatever it is, is context-sensitive. Depending on the details of the scene, and the emotional state of the viewer, the eye (and the brain) will notice less or more. (Also, many of today's films use very quick cuts, where a significant clip of an action scene is a mere 10 or so frames, and viewers are likely adjusting to drinking in a lot of visual information in a short amount of time.)


I'm a big fan of 60fps video. It takes a bit of getting used to because it looks so different, but after you watch 60fps content for a while, 30fps content looks like a slideshow -- almost unwatchable.

(And there's plenty of content that's 60fps, just not a lot of American content that's 60fps.)


Could you give some examples?


Visit your favorite torrent site that has Japanese TV shows and search for 60fps. (Sometimes the video is encoded as 30fps and is interlaced, in which case you need to turn on bob deinterlacing to get 60fps. But this seems to be rather rare these days.)


All the Japanese animation that I've seen has been animated at a much lower frame rate than American animation.


Why not adaptive framerate? Project the film at 60fps but gradually switch (this is the tricky part) between low and high framerate as the scene demands?


I assume that higher fps = higher digital payload = higher bandwidth load for ISPs, considering services like Netflix. Since this will place a bigger burden on ISPs, they will use it as added ammunition when lobbying Washington re: Net Neutrality, and the whole thing will become a political clusterf*ck. Ain't life in the US grand? But other than that, it sounds like a good idea.


Higher FPS = reduced delta between frames = better compression as compared to the raw footage.

Doubling the framerate will not double the bitrate. (Though it will increase it somewhat, probably not as much as you'd expect) Modern video codecs are getting pretty smart.


ISPs don't enter into it. The goal is to have exclusive tech in movie theaters only.


Indeed. It's all about doing what your ISP can't.


People are used to 24fps for movies now, however.

In my opinion, higher framerates just end up making everything look like a sitcom.


We're at a local maximum. It will look better after it looks worse.


Well I certainly have never been one to complain about current film standards.

I suppose I am just wary of change simply for the sake of change. I would like to be able to see the same 24fps and 60fps scene side by side and evaluate the advantages myself.


Recent movies filmed on digital video have the problem that the exposure time is much less than the frame gap. So action scenes look like flick-books. Gladiator had this issue, as well as most of Michael Mann's recent output. A higher framerate will make action really flow.


What would be the impact on movie theaters? I am looking at this from the perspective of the simple 2D (non 3D) projection system.

I assume if they have a modern digital 2D playback system this may not be a problem. However, what if they have just good ol' projectors that require film reels?

Questions on my mind:

- Can we roll film at a higher speed on the projector? Can we double the reel intake rate if we go with 48 fps? Is it tougher to do fractional speed increases (like 60, which is 2.5 x 24)?

- Are there any refresh issues - I assume no since the light is always turned on.

- How about sound? [The assumption here is sound is on the same reel - but I could be wrong]

If there are any projectionists in the HN house, I would appreciate you filling us in...


I'm not a projectionist, but:

- Sound is either on the same reel or on a DTS disc synced to the reel.

- I was surprised that a rather small theater in my metro area (no stadium seating, out in the boonies) has digital projectors in all of its screening auditoriums.

For modern movies, I would imagine that in 10 years or less there will be no more acetate prints and all distribution will be digital. For digital prints framerate is solely limited by the response rate of the projector and the throughput available in the playback system.


I'd guess our brain fills in data when the frame rate is only a few per second. When it's much higher we have to directly absorb what we see without mental filtering. This is why 24fps is more "cinematic".


Well, you see at barely 1 megapixel, in the sense that that's how many photoreceptors there are in your eyes. But you don't notice this because by the time it reaches your level of actual perception, lots of scenes from every small movement of your eyes have been stitched together. Your brain has a ton of hardware for doing interpolation. 24fps seems to mesh well with that hardware. But we're not machines... Higher numbers in the lab don't necessarily translate into better human experience.

Another case in point: people still shoot and love photos on Tri-X, a grainy B&W film, even tho' you can buy cameras that do a dozen or more full-colour megapixels off the shelf...


My idea of great film tech? 8K, 60fps, and true 3-D (not the hokey 2-level thing)

It'd be awesome, but it's a long, long way away.

As Jim says, this is low-hanging fruit. Easily done with most everything that's on the shelf today.


It may be low hanging fruit, but imagine rotoscope work on a 24fps feature vs a 60fps feature. The work has just increased 2.5x for the same shot.

For fully rendered content, there is likely no additional man hour time, but for any frame by frame work involving humans, post processing could take a lot longer.


Is this kind of thing really done frame-by-frame though? I think with smart interpolation techniques you shouldn't have to do single frames anymore.


Rotoscoping[0] is by definition frame by frame. Keyframed animation would largely remain unchanged, though rendering would take 2.5x as long.

[0] http://en.wikipedia.org/wiki/Rotoscoping


I'm pretty sure large budget animations are already rendered with higher frame rate for proper motion blur. You can't add proper motion blur with post-effects, like in video games.


"120 fps ought to be enough for everyone."


Given the way your eyes work... yes! 120FPS is beyond almost everyone's internal "frame rate," so going any higher really doesn't make much sense except in special cases (trying to deliver different frames to each eye using higher frame rate and shutter glasses, for example).


I think people read my initial comment as some sort of plea for more and more tech, but that's not the way I meant it.

My point was for archival purposes. If you record data at a level a degree or two beyond human perception, you can always add whatever post-processing you want to get any kind of artistic effect desired. You want that old jerky 24fps stuff? Fine. Post-process it to get it. People 200 years from now will be able to watch it in 24fps, in black-and-white, in 2-D, or whatever the initial artistic intent. But future audiences and artists might also choose to remix it or to see it with more data.

We are currently in a situation akin to knowing how to shoot color movies but refusing to because everybody liked black and white so much. Or being able to record audio but being afraid of putting all those movie pianists out of work.

This kind of thing can be framed up as artistic all day long, but in reality it's just about fear of change -- more to the point, fear of mucking around with your business model too much.

So having learnt that 120FPS is the limit, I'd shoot for 240 or 480FPS.


As other people have commented, the medium is part of the artwork. To quote a comment from above: "If Leonardo could have taken a picture of Mona Lisa instead of painting her, would it be in the Louvre today?"

Also, the days of watching just the raw footage are long over. These days, almost every frame you see in a movie has been heavily post-processed. Quadrupling the frame rate for no reason other than to be safe or to allow others to remix your work in the future means quadrupling render times and, for jobs like rotoscoping, quadrupling people's workload. It just doesn't make any economic sense.

Unless you were talking about shooting just the raw footage with higher resolution and frequency (and downsampling it prior to post-processing), but then what people are remixing isn't your film but your footage. Plus, you still have increased cost for more sophisticated equipment and for storage.


I hear you, and you make good points.

But this issue is not unique to frame rate, or even cinema. We ran into this same problem when HDR took off in still photography. How much of the art is the medium, and how much is the processing?

I think you can argue it either way, but my point was that additional data can always lead to various interpretations later on, while less data always leads to a less-varied range of future possibilities. Speculating on what kinds of post-production work might be done, who might do it, or whether it's a good thing or not misses the point. What used to be part of the medium is now part of the process.

Leonardo would have taken the picture, then painted the painting. What hung in the Louvre would have been up to him as to what he decided to release. Would the painting be worth any less if we had a fully-rendered, highly-detailed model of the studio, subject, and the artist? Not at all, but there are many derivative works we could create with that kind of data that we could never do from just a painting. Think of it in a silly way: if I see a man and draw a stick man, am I locked in forever for only remembering that man in such simple terms? Or might I want to come back and paint him? Why make me choose when the tech lets me have it both ways?

I won't go into the technology/cost issues, as these things have a way of changing dramatically from year to year.


I agree with you that it would be wonderful if all footage could be preserved with maximum spatial and temporal resolution, and maximum dynamic range (and, for that matter, some day become part of the public domain!)

All I was saying is that realistically, and unfortunately, the economic realities of film production make it unlikely that anything will be shot with quality much higher than what the stakeholders can make use of in the short to medium term.


Replying mostly to DanielBMarkham, the sibling.

I respect your point and I mostly share the sentiment. Even so: no, da Vinci wouldn't choose what to put in the Louvre. That's the point. We chose what to put in the Louvre, and if da Vinci had chosen another medium he would have created something else, which might not hang there, or anywhere, today.

I'm working on a program where Redis is a great fit for some things, while I could use MongoDB for others. Even so, I choose to use Redis exclusively because I have a feeling that the constraint will help me create a better program.

I really believe in constraints in creation. But of course, if Cameron needs 60 fps he should shoot in that!


I can't recall the source, but this "max eye frame rate" was shown to be a misinterpretation.

The eye's individual components do have a frame rate, but they are not synchronized, so the eye as a whole can detect changes smaller than that.


going to 48 would be fine, but I really want digital theaters at a minimum of 4K, with the brightness of the bulb kept turned up to where it is supposed to be (not lowered to save money)


Apparently we have someone who believes that 2k projectors are good enough or has never experienced cheap theater operators who turn down the brightness on the projector bulbs to make them last longer / save money.

48 fps might be a fine thing, but there are other factors to enjoyment that should be looked at.


He seems to think higher frame rate will make his movies suck less.


It's about damn time. I've never understood the rush to increase resolution and 'dimensions' when simple camera pans fundamentally do not work, and never have.


And the increased dimensions merely make it more obvious. I generally prefer to watch things on my laptop or a slightly larger screen; things look better than at the theater, where a moderate-speed pan means things are jumping a foot or so at a time.



