
Here's a chart which shows the ideal viewing distance for various resolutions, based on the smallest detail the human eye can discern at 20/20 vision.

http://cdn.avsforum.com/4/4c/600x376px-LL-4cd4431b_200ppdeng...

This monitor is pretty close to retina-level DPI at the typical viewing distance, but I guess a 24-inch 4K would be even better.
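If you want to sanity-check the chart's numbers, here's a rough sketch assuming it follows the usual 1-arcminute-per-pixel rule for 20/20 vision (the chart's exact constants may differ):

    import math

    def ppi_threshold(distance_in, arcmin_per_pixel=1.0):
        """PPI at which one pixel subtends `arcmin_per_pixel` arcminutes
        at the given viewing distance (in inches)."""
        pixel_size_in = distance_in * math.tan(math.radians(arcmin_per_pixel / 60))
        return 1.0 / pixel_size_in

    for d in (12, 20, 28, 36):
        print(f"{d} in viewing distance -> ~{ppi_threshold(d):.0f} PPI")

That gives roughly 286 PPI at 12 inches, 172 PPI at 20 inches and 123 PPI at 28 inches, which is the usual "retina" arithmetic.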




Note also that 20/20 vision isn't actually that high, especially if you're young, so the chart isn't necessarily definitive.

"a subject diagnosed as having 20/20 vision will often actually have higher visual acuity because, once this standard is attained, the subject is considered to have normal (in the sense of undisturbed) vision and smaller optotypes are not tested."

http://en.wikipedia.org/wiki/Visual_acuity


It's probably better to have a larger screen further away, particularly for older people who can't focus up close any more. That happens some time in your 40s, for you youngsters who may think "older" means quite a few more years than that.


Your information is wildly inaccurate. Take a 30" black screen vs. one with a single white pixel: someone can tell the difference from across a football field if it's dark enough. Do the same thing with one white pixel vs. two next to each other and you can't tell the difference. The important point is that screens showing normal video have aliasing effects, so in some situations with unedited video you get differences such as flickering at fairly long distances. Edit: Basically, if you have 480p and 720p video, having a 720p monitor is worse than a 720x4 monitor at fairly long distances.

Toss in compression artifacts and you want a screen with at least 4x the resolution shown in that chart.
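On the scaling point specifically, here's a small sketch of why non-integer scaling is the ugly case (nearest-neighbour only; real scalers filter, so this is the worst case, and the numbers are my own illustration, not from the parent):

    from collections import Counter

    def row_coverage(src_rows, dst_rows):
        """How many display rows each source row occupies under
        nearest-neighbour scaling from src_rows to dst_rows."""
        hits = Counter()
        for y in range(dst_rows):
            hits[int(y * src_rows / dst_rows)] += 1
        return hits

    # 480 -> 720 is a 1.5x ratio: half the source rows get 1 display row,
    # the other half get 2, so a one-row detail is drawn at different
    # thicknesses depending on where it lands.
    print(Counter(row_coverage(480, 720).values()))

    # 480 -> 2880 and 720 -> 2880 are exact 6x and 4x ratios:
    # every source row gets identical treatment, no uneven duplication.
    print(Counter(row_coverage(480, 2880).values()))
    print(Counter(row_coverage(720, 2880).values()))

That's the sense in which a "720x4" panel can look better than a native 720p panel for mixed 480p/720p material.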


The chart is just about what DPI you need at a given viewing distance to reach the point where the average human eye stops seeing benefits from going any higher.

Your points are more about refresh rates, video compression and lighting. For example, in gaming, antialiasing and other "smoothing" techniques are widely used to improve image quality, but when playing at 4K on a 24-inch screen you wouldn't need them anymore because your eye can't see the difference.
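To put rough numbers on the 24-inch 4K claim (my own arithmetic, not from the article or the chart):

    import math

    def panel_ppi(diagonal_in, width_px, height_px):
        """Pixel density of a panel from its diagonal (inches) and resolution."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(f'24-inch 4K:    {panel_ppi(24, 3840, 2160):.0f} PPI')  # ~184 PPI
    print(f'24-inch 1440p: {panel_ppi(24, 2560, 1440):.0f} PPI')  # ~122 PPI

At a typical 24-28 inch desktop viewing distance the 1-arcminute threshold works out to roughly 123-143 PPI, so a 24-inch 4K panel is past the point where the average eye resolves individual pixels, which is the basis for the "no antialiasing needed" argument.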


Aliasing is very easy to notice even on a 4K screen. Look at the second image: it's got way too much white, and you can easily have the same issues at 4K or 8K.

http://en.wikipedia.org/wiki/Aliasing

In the end increasing resolution does help in most cases, but with the right fractal pattern there is no 'safe' resolution.

PS: It's basically the same reason QuickSort is an O(n^2) sorting method in the worst case: pick the wrong data and your assumptions fall apart.
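For anyone who hasn't seen it, the QuickSort point is easy to demonstrate (a minimal sketch using the naive first-element pivot; good implementations pick pivots more carefully):

    import random

    def quicksort_first_pivot(xs, comparisons):
        """Quicksort that always uses the first element as the pivot.
        `comparisons` is a one-element list used as a mutable counter."""
        if len(xs) <= 1:
            return xs
        pivot, rest = xs[0], xs[1:]
        comparisons[0] += len(rest)  # one pivot comparison per partitioned element
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        return (quicksort_first_pivot(left, comparisons)
                + [pivot]
                + quicksort_first_pivot(right, comparisons))

    for n in (100, 200, 400):
        shuffled, already_sorted = random.sample(range(n), n), list(range(n))
        c_rand, c_sorted = [0], [0]
        quicksort_first_pivot(shuffled, c_rand)
        quicksort_first_pivot(already_sorted, c_sorted)
        # random data grows roughly n*log(n); sorted data grows as n*(n-1)/2
        print(f"n={n}: random {c_rand[0]}, sorted {c_sorted[0]}")

Double n on the already-sorted input and the comparison count roughly quadruples, which is exactly the "wrong data" case.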


Note that you want no pixelation effects when image features are at the limit of your visual acuity, so you want the pixels to be 3-5 times smaller than the smallest detail you can see.

To demonstrate: draw a pair of vertical black lines 1px wide, with 1px white space between them. Then, tilt them at 30° or 45°. (Or, draw a pair of circles that are 1px thick and have 1px between them at the top, bottom, and sides; then look at various other positions.) Then try the same thing with a line and space thickness of 3px and 5px.
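If you'd rather not fire up an image editor, here's a rough way to see the same thing in pure Python (no antialiasing, a hand-rolled rasteriser, so only an illustration):

    import math

    def render(thickness_px, gap_px, angle_deg, size=40):
        """ASCII-rasterise two parallel tilted lines with no antialiasing.
        A pixel is '#' if its centre is within thickness_px/2 of either line."""
        theta = math.radians(angle_deg)
        nx, ny = -math.sin(theta), math.cos(theta)   # unit normal to the lines
        centres = (0.0, thickness_px + gap_px)       # the two line centres along the normal
        rows = []
        for y in range(size):
            row = ""
            for x in range(size):
                d = (x - size / 2) * nx + (y - size / 2) * ny  # signed distance along the normal
                row += "#" if any(abs(d - c) <= thickness_px / 2 for c in centres) else "."
            rows.append(row)
        return "\n".join(rows)

    print(render(thickness_px=1, gap_px=1, angle_deg=30))  # at 1px the gap width wobbles and the edges stair-step
    print(render(thickness_px=3, gap_px=3, angle_deg=30))  # at 3px the lines and the gap stay much cleaner

At 1px the gap wobbles in width and the edges stair-step visibly; at 3px and up the separation survives the tilt much more cleanly, which is the 3-5x margin being described.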

See also: https://en.wikipedia.org/wiki/Moir%C3%A9_pattern


This chart gets trotted out in seemingly every resolution-related discussion... does anyone know where the numbers come from? I've always assumed they were pulled from somebody's ass, but I'd be interested to find out if that's not the case.

Very curious, since it looks dubious to me but gets thrown around as cold hard fact every single time.


Do you have something like this but on a log scale? The most interesting part of the graph is the least readable.


I believe that chart is for video viewing. That's important for a TV, but on a computer monitor you also spend a lot of time reading text, and higher resolutions tend to keep providing benefits for longer there.



