Hacker News

I'm confused by the quoted number of pixels required to match the eye's resolution. I don't see a definition of the size of a pixel in that calculation. Though the whole thing seems way too thorough for them to have missed that, so I'm assuming I misunderstood.



> How many pixels are needed to match the resolution of the human eye? Each pixel must appear no larger than 0.3 arc-minute. Consider a 20 x 13.3-inch print viewed at 20 inches. The print subtends an angle of 53 x 35.3 degrees, thus requiring 53*60/0.3 = 10600 x 35.3*60/0.3 = 7060 pixels


The calculation uses the angular size of a pixel in arc-minutes, not its physical size. If you specify a physical pixel size (in mm or cm), then you also need to specify the eye-to-print distance. It's easier to calculate using the angular size of a pixel directly.
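As a rough sketch of the arithmetic above: the print's angular extent (in arc-minutes) divided by the 0.3 arc-minute per-pixel budget gives the required pixel count along each dimension. The function name and exact-trig formula here are my own illustration, not from the quoted source; note that using the full arctangent gives slightly larger numbers than the quote's rounded angles.

```python
import math

# Acuity limit quoted in the thread: each pixel should subtend
# no more than 0.3 arc-minute at the eye.
ACUITY_ARCMIN = 0.3

def pixels_needed(print_inches, viewing_inches, acuity_arcmin=ACUITY_ARCMIN):
    """Pixels needed along one print dimension so that each pixel
    subtends at most `acuity_arcmin` arc-minutes at the given distance."""
    # Angle subtended by the full print dimension, in degrees.
    angle_deg = math.degrees(2 * math.atan(print_inches / (2 * viewing_inches)))
    # Convert degrees to arc-minutes, then divide by the per-pixel budget.
    return angle_deg * 60 / acuity_arcmin

# The example from the quote: a 20 x 13.3-inch print viewed at 20 inches.
width_px = pixels_needed(20, 20)    # roughly 10600, as in the quote
height_px = pixels_needed(13.3, 20)  # roughly 7000-7400
```

Doubling the viewing distance roughly halves the angular size of the print, which is why the required pixel count depends on viewing distance rather than on print size alone.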





