High DPI on something like a phone screen makes sense since it is often used so close to the face - but are people really getting this close to their computer displays?

For a 1080p 14" screen, and a person with 20:20 vision, you would need to be closer than 50cm to the screen to be able to discern any visual differences. Hackernews has lots of people who like to claim they can tell the difference, but the science of visual acuity is against them and they never back up these claims with studies or experiments.

Given a blind A/B test at a normal desk viewing distance, I'm doubtful any of these people would be able to pick out the difference between 1080p and 4k on a screen so small.




High DPI displays have been available for a decade, since the 2012 Retina MacBook Pro. "Science of visual acuity" aside, the difference is noticeable.

I use a tablet PC and find it bizarre that some vendors sell them with 1080p displays when they're meant to be used closer to the face at times. Budget 2-in-1 laptops have the same issue when they use 1080p panels.


Read this article: https://tonsky.me/blog/monitors/. There simply aren't enough pixels to render characters correctly on non-HiDPI displays.
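A rough illustration of the pixel budget the article is talking about, as a sketch (the 16px nominal font size and the plain 1x vs 2x comparison are illustrative assumptions, not figures from the article):

    # Device pixels available to shape one glyph: the em square at a
    # nominal 16 px UI font size, rendered at 1x (non-HiDPI) and 2x (HiDPI).
    for name, scale in [("1x, non-HiDPI", 1), ("2x, HiDPI", 2)]:
        side = 16 * scale  # device pixels per em side
        print(f"{name}: {side} px per em side, {side * side} px in the em square")

Doubling the scale factor quadruples the pixels available per glyph, which is roughly the gap the article describes: at 1x, stems and curves get rounded to a handful of whole pixels or smeared by antialiasing.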


I have two laptops, one with a high DPI display (a MacBook) and one with a regular 1080p screen, both running Linux. There is next to no difference viewing text at a regular viewing distance.

If I move uncomfortably close to the screen, I can of course make out the difference. The examples on that page do not work because they do not take viewing distance into account.


The difference is pretty obvious. I'm 100% sure I would pass an ABX test.

Whether it's seriously needed is a different question. I strongly prefer it, but I can live without it.



