How many frames can humans see (100fps.com)
87 points by coliveira on Jan 7, 2010 | 34 comments



Re sensitivity to brightness: human eyes can see single photons (well, one is too few, but ten will definitely do it). Which means that if there is some light - any light at all - we will see it. Not the direction (if we are talking about single photons), but still. Oh well, there will never be a TV good enough :)


The first hit from Google:

"The human eye is very sensitive but can we see a single photon? The answer is that the sensors in the retina can respond to a single photon. However, neural filters only allow a signal to pass to the brain to trigger a conscious response when at least about five to nine arrive within less than 100 ms." http://math.ucr.edu/home/baez/physics/Quantum/see_a_photon.h...


Is this a correction? To quote myself: "(well, one is too few, but ten will definitely do it)"


It seems rather like an elaboration.


Definitely. And it's pretty apparent that the responder wanted to actually reference something, rather than just rely on someone saying, "10 photons should be enough for anyone"


It is easy to prove that humans can perceive more than 60 frames per second: wave your hand in front of a fluorescent light. They flicker at the same speed as your electricity (in the US it's 60Hz) and you can easily see the motion break down.


That's not "perceiving" at 60Hz though. The eye is seeing an integrated image, which happens to be of several "frames" of your hand lit mixed with gaps of darkness. Effectively, you're seeing multiple strobes "at the same time".

You do get aliasing effects with strong strobing though, where your eyes can see beating patterns in the image intensity. That's why some people get eye strain looking at 60Hz displays; moving to ~70Hz or so generally fixes it.

But again, our visual systems do not have the ability to distinguish between individual events separated by 1/60th of a second. We just can't do it, and trying to animate at that speed is purely wasted effort.


>But again, our visual systems do not have the ability to distinguish between individual events separated by 1/60th of a second. We just can't do it, and trying to animate at that speed is purely wasted effort.

And yet, there's a very clear perceivable difference between a game running at 30hz and 60hz. How does that work?


As the article said, you see chopping because the images are sharp. Imagine if your eye only worked at 10fps. You'd still see the difference between 30 and 60 because moving objects would chop even after averaging over 3 or 6 frames. It's just that the difference between the frames would be smaller at higher fps, so eventually they are too close to notice and you effectively get motion blur.

If the game rendered at 30fps but with motion blurring to account for 1/30s of motion, and at 60fps to account for 1/60s of motion, I predict it would be very difficult to tell the difference.
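The averaging idea is easy to sketch. Here is a minimal Python illustration (the frame data is made up for the example; a one-dimensional "scanline" stands in for real frames) of decimating 60fps to 30fps by averaging frame pairs, which approximates the blur a 1/30s exposure would record:

```python
# Sketch: decimate a 60fps sequence to 30fps by averaging consecutive
# frame pairs, approximating the motion blur of a 1/30 s exposure.

def decimate_with_blur(frames_60fps):
    """Average each pair of 60fps frames into one 30fps frame."""
    return [
        [(a + b) / 2 for a, b in zip(f1, f2)]
        for f1, f2 in zip(frames_60fps[::2], frames_60fps[1::2])
    ]

# A one-pixel-wide object sweeping across a 6-pixel scanline,
# moving one pixel per 60fps frame.
frames = [[1.0 if x == t else 0.0 for x in range(6)] for t in range(6)]
blurred = decimate_with_blur(frames)
# Each 30fps frame now spreads the object's energy across two pixels,
# smearing the motion instead of simply dropping every other frame.
```

Pair-averaging like this is just a box filter over the exposure interval; real renderers use fancier motion-blur, but the principle is the same.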


I don't buy it. Until someone shows me a real study (or even a demo where I can verify the frame rate and see the effect for myself), I'm not going to believe this "effect" exists.

Almost certainly, you're not seeing a 30Hz game. You're seeing a game typically running at 30Hz and being annoyed by a handful of frames that take longer than ~100ms. It's the outliers that are doing it, not the frame rate.


There is a significant difference between a locked-at-30fps game and a locked-at-60fps game. Heck, I've seen 60fps video on a PC and you could instantly tell that it was uncannily smooth compared to regular 30fps video. Many high-end filmmakers have been pushing for higher framerate movies as well - Roger Ebert has been a big proponent of this due to demos he's seen (preferring it over 3D), and James Cameron was pushing for Avatar to be filmed at 60fps.

One of the most interesting examples of 30fps vs. 60fps was the PC port of the original Halo. It has the option to unlock the framerate from 30fps and render as fast as possible, unlike the locked-at-30 Xbox original. However, all the animations were 30Hz - if you disabled the framerate lock, the gameworld would run perfectly smooth (moving, physics, etc.), but all of the animations stayed at 30Hz, appearing choppy and disconcerting. (I believe there are demos of Halo 1 for PC and Mac, should you have a computer capable of running it - since it's many years old it should be OK on anything with a reasonable graphics card, though you may need to turn the graphics settings all the way down to see the effects.)

Many games will be fine at 30FPS, as long as they maintain that framerate - but for very fast action games, the extra frames do make a noticeable difference in feel and responsiveness.


Then watch a 60fps video. Then decimate it to 30fps, and watch that. Or will you claim that there are "outliers" in a constant framerate video, too?

The visual difference is so significant that television broadcasters have kept interlacing solely for the greater temporal resolution, despite the tens of billions of dollars in inefficiencies that the legacy technology creates.


If the decimation does anything but average frames, then it's a lossy process and of course there will be detectable differences. And if it does average frames, it's susceptible to the aliasing issues I mentioned above (lightning flashes become gray blurs, etc...)

The question was if there's a single game (or video) you can point me to that (1) has a solid 30Hz frame rate and (2) looks perceptibly "not smooth" for some obvious definition thereof.


Sure.

Here, I'll give an example that doesn't even rely on high motion, flashing, or other such tricks: a simple video game clip. This particular game engine has its display locked to 60fps: the in-game time between two frames is absolutely constant, so even if your computer is too slow to display in realtime, it will simply output frames slower. As a result, the FRAPS'd capture of the game is a perfect smooth 60fps no matter what.

Additionally, it doesn't have any single-frame effects that would be visually aliased (e.g. lightning), nor does it have any sort of motion blur.

http://mirror05.x264.nl/Dark/testfps1.mkv http://mirror05.x264.nl/Dark/testfps2.mkv

Don't look at the filesizes (obviously, 60fps will be larger), and don't go checking the file info or whatever. Just play them in your favorite media player.

I'm pretty sure you'll be able to tell which is 60fps.


Uh... then why is there a frame rate counter in the bottom right of the screen giving variable numbers between 40-70? :)

I agree that the smaller file looks jumpier. But it's still reporting the same frame rates as the bigger file, which leads me to believe the jumps are an artifact of translation somewhere. Certainly nothing seems to be "locked".


> Uh... then why is there a frame rate counter in the bottom right of the screen giving variable numbers between 40-70? :)

Because that's the rate of the frames being displayed during capture. Again, the in-game time between two frames is the same, so if it captures at 50fps, that just means the game runs at 50/60=5/6 times normal speed.

It's similar to how you can capture a 1000fps recording in Counterstrike: the game doesn't actually have to run at 1000fps, it just slows down the game accordingly.
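The fixed-timestep relationship can be sketched in a couple of lines of Python (the names here are mine for illustration, not from any particular engine):

```python
# Sketch of the fixed-timestep idea: the engine always advances game time
# by exactly 1/60 s per rendered frame, so a slower capture rate simply
# slows the game down instead of dropping frames.

GAME_DT = 1.0 / 60  # in-game seconds advanced per rendered frame (fixed)

def playback_speed(capture_fps):
    """Fraction of real-time speed when capturing at capture_fps."""
    return capture_fps * GAME_DT

print(playback_speed(50))  # ~0.833: at 50fps capture, the game runs at 5/6 speed
print(playback_speed(60))  # ~1.0: at 60fps capture, the game runs in real time
```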

> I agree that the smaller file looks jumpier. But it's still reporting the same frame rates as the bigger file, which leads me to believe the jumps are an artifact of translation somewhere. Certainly nothing seems to be "locked".

It sounds like your media player is broken, because ffmpeg gives 60fps for the higher framerate file and 30fps for the lower one.


Yes 30fps has less information than 60fps, but I thought you were saying above that we can't perceive it. The whole point of the argument is that removing every other frame (and thereby doubling the viewing time of the shown frame) is perceptible. A game running at 60fps is doing exactly that: rendering twice as much information and showing it to you.

And it's all relative… The argument is that 30fps is perfectly smooth until you see it side by side with 60fps.


Your conclusion is a non sequitur. While people cannot discern individual events at 60Hz, most will perceive a CRT monitor as slightly but disconcertingly flickering, an effect that only vanishes at 70Hz or more. (I once had a co-worker who didn't notice the 60Hz flicker and opted for more pixels over a higher frequency in his XF86Config; nobody else could look at his monitor for extended periods of time...)

So there definitely is some room for quality improvements above 30Hz, but the amount depends on the viewer and the technology (flat screens do not flicker even at 60Hz, and it is almost always preferable if your animation frame rate is the same as your video refresh rate [edit: or an integer fraction thereof. From which premise we can mathematically prove that there is only one acceptable frame rate for TV sets: the least common multiple of 30Hz (NTSC video), 25Hz (PAL and SECAM video) and 24Hz (movies), which happens to be 600Hz. Way to go, my dear marketing departments.]).
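As a sanity check on that arithmetic, the Python standard library can verify the least-common-multiple claim:

```python
# Verify that 600 is the smallest refresh rate into which all three common
# source rates (24, 25 and 30 fps) divide evenly.
import math

rates = [24, 25, 30]
lcm = math.lcm(*rates)
print(lcm)                        # 600
print([lcm // r for r in rates])  # [25, 24, 20] repeats per source frame
```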


They flicker at 120Hz, because the top and bottom of the AC sine wave both "flick".


Side note: if you have fluorescent lighting running at 60Hz and a UK TV running at 50Hz, you get a horrendous flicker effect on the TV. (I ran into this while getting the EU version of a console game working in the US, on an imported UK TV.)
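There's a quick way to see why, assuming the usual beat-frequency picture: two periodic sources at slightly different rates drift in and out of phase at their difference frequency, which lands squarely in the visible range here.

```python
# Two periodic light sources at slightly different frequencies drift in
# and out of phase, producing a visible "beat" at the difference frequency.

def beat_frequency(f1_hz, f2_hz):
    """Rate at which two periodic sources drift in and out of phase."""
    return abs(f1_hz - f2_hz)

# US mains-powered lighting vs. a 50Hz UK TV:
print(beat_frequency(60, 50))  # 10 Hz beat, slow enough to be very visible
```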


That is not perceiving, that is aliasing, and it conclusively demonstrates that you cannot perceive 60fps. If you could, you would see only one image, not a superposition of several illumination states.


What's the resolution and bitrate of a general human eye?

I have a theory: the ultimate limit of the broadband-upgrade race is our brain's total throughput. Once we reach it, individual consumers' demand will slow down and the broadband market will shift to industrial customers. :)


That's a hard problem.

The human eye is only ~2 megapixels, but it's focused on a small part of your field of view. So if you want to fool the eye, you either need to track what it's looking at or create a huge high-resolution screen, most of which is ignored. Add in the ability to focus, and you suddenly need depth information and a display that can handle it, or you will not fool the eye. Now add the fact that most people have two of them, and it gets even more insane.

PS: Video cards have far higher internal bandwidth than external bandwidth for a reason. Focusing on the brain's bandwidth is not going to be enough.


I'm pretty sure I've read that the resolution is somewhat unlimited (i.e. your field of vision blends seamlessly into short-term memory) but the bandwidth is actually quite limited, though variable (i.e. it can handle much change in a small area or little change in a large area).


Back in 1995 Jakob Nielsen estimated it as ~1Tbps. http://www.useit.com/alertbox/9511.html


Fascinating, but one bit of feedback - I really would like to pass this on to a couple of young, budding scientists, but I can't because of the examples: murder, people being stabbed with an icepick, etc. Out of all the movies, this is what they chose? Sigh.

Please, writers--choose your examples with a broad audience in mind.


3dfx had a demo of this a while back. (11 years ago!) It would show 60/30/15 fps side-by-side, so you could better compare how the frame-rate impacts you. I found it archived here: http://www.falconfly.de/artwork.htm http://www.falconfly.de/downloads/3dfxdemo-3060hz.zip

You're likely to need a Glide wrapper too: http://www.sierrahelp.com/Utilities/DisplayUtilities/GlideWr...


The trouble is that the higher the frame rate, the fewer photons are captured by the light-sensitive elements, and so the larger the shot noise becomes.


Follow this up with his other useful sites including:

http://100-dating-tips.com

http://in-my-opinion.org


Sorry, I down-voted you thinking this was a spam comment. I see now that you were actually pointing out the other sites this author has written. Sorry about that!


BTW: These are other sites by the same author.

I think we should dismiss his 100fps articles based on the fact that he operates a strange dating tips site.


whoops, this is missing a not. "we should not dismiss ..." sorry about that!


There was me thinking it was sarcasm - works both ways :)


from http://100-dating-tips.com/

"Moreover many women have bisexual tendencie"

I can confirm this based on the sample of my female friends. Wondering why ...



