
Sometimes I feel like I'm the only person who is really excited about these resolution improvements. When the MacBook Pro Retina came out in 2012, the only reason I bought it was because of the display. I had never used a Mac before then.

Going from 4K to 8K for a 32" monitor may seem like a small improvement, but it is a subtle sensory improvement that just makes using a computer more pleasant. Until displays reach 1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast), I will always want higher resolution.

Other than resolution improvements, it would be nice if someone would attempt an HDR light field display. This would ultimately lead to a monitor that is indistinguishable from a window.




> Until displays reach 1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast), I will always want higher resolution.

The human eye is normally rated at 1 arc minute == 1/60 degree rather than "1 arc second == 1/60 arc minute". This is around 1/30th the size of the full moon.

8K at 32 in viewed from beyond ~0.32 m is "Retina" quality, since it's > 60 px/deg; 4K achieves this beyond ~0.6 m. The former is ~275 dpi, which is close to the ~300 dpi Apple used to define "Retina" for the iPhone.
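For the curious, a quick sketch of that arithmetic (assuming a 16:9 panel and the small-angle approximation; the marketed 32" is really a 31.5" panel, which shifts the numbers slightly):

    // PPI and the closest "Retina" viewing distance (>= 60 px/deg, i.e. one
    // pixel subtending <= 1 arc minute) for an 8K 32" 16:9 panel.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double pi = 3.14159265358979;
        const double diag_in = 32.0;                  // diagonal, inches
        const double px_w = 7680.0, px_h = 4320.0;    // 8K UHD
        const double arcmin = pi / (180.0 * 60.0);    // 1 arc minute, radians

        double width_in = diag_in * px_w / std::hypot(px_w, px_h);
        double ppi = px_w / width_in;
        double pitch_m = (width_in / px_w) * 0.0254;  // pixel pitch, metres
        double retina_m = pitch_m / arcmin;           // 1 px = 1 arc minute here

        std::printf("%.0f ppi, 'Retina' beyond %.2f m\n", ppi, retina_m);
        // prints roughly: 275 ppi, 'Retina' beyond 0.32 m
        return 0;
    }

Halving the linear pixel count for 4K doubles the pitch and therefore the distance, which is where the ~0.6 m figure comes from.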

Personally, I would prefer 120+ fps rather than higher resolution. I have both a 120Hz & 60Hz LCD on my desk and the difference when scrolling or dragging windows is quite noticeable.


I agree. 4K is enough. If you have perfect eyesight you need to sit about a foot away to resolve the pixels of an 8K 32" panel.

At that distance you cannot see the whole screen anyway, since your peak acuity is limited to a very small spot (the fovea) at the center of your vision.

Personally, I'd prefer a taller aspect ratio than 16:9, which seems designed for movies. 4:3 or even 1:1 (square) would suit desktop and medical imaging work better.


Totally agree... I have a 32" 4k monitor, and it's great. If I get really close I can still see pixels, but... realistically now I want a 32" 4k monitor with a 120Hz refresh rate. And graphics cards that can handle it... I know my MBP, even with the upgraded graphics option, is struggling...


I got a 29" wide-screen monitor. I want it to be twice as tall as it is. I guess I want a 42"+ desktop monitor that's reasonably priced.


Consider Dell's P4317Q[1] for ~$1000, which is a 4K 43-inch display with multiple inputs (you can split the screen across 2 or 4 devices simultaneously, if you need to). I couldn't be happier, well, except for the lack of 120Hz.

[1] http://accessories.us.dell.com/sna/productdetail.aspx?c=us&c...


If you're okay with 4k at that screen size there are TVs that would work.


It takes some elite window management skills I'd think. How many windows do you expect to have on a screen that big? If more than 4, doesn't moving them around get cumbersome?

Side note: SizeUp for macOS is very good; I wish there were something as good for Ubuntu.


Tiling window managers solve this problem pretty well. I've used xmonad and stumpwm, although people generally seem to prefer awesome or i3 derivatives


Frankly, a 4K 32" monitor is quite crappy...4K works at 24" or so, but above that you begin to see pixels if you use it as a standard monitor (arms length) rather than a TV (a few feet away). 8K is fairly reasonable for 32", putting it in 200+ PPI area (279.73 to be precise), meaning it could be used as a real monitor...but...it's a bit big for me (27" is kind of a stretch already).

I would like to see OLEDs at this size/resolution, which would accomplish something similar.


I recently traded away a 4K monitor for a 1080p that was better in other respects, because Windows' handling of high resolution, even in recent versions, is seemingly no better than it has ever been. I have yet to find a way to get sufficiently large text for readability without breaking all sorts of GUIs, including built-in Windows apps. Typically I find these problems show up when a level or two deep in a menu or config window.

For me the resolution of the hardware doesn't matter a bit until the OS can handle it properly. I don't know what you're using but I would guess that it's not Windows 7.


Do you mean Windows 7? That's an OS that has been out of mainstream support for two years now. I wouldn't hold your breath for much to change.

I haven't used Windows 10 on a high DPI external monitor, but it certainly works well on the Surface Pro 3/4's high DPI screens.


Windows 10 has been spectacular on my 4k monitor; some of the software isn't (old games), but the Windows experience itself is nice.


Have you figured out how to stop Desktop icons from moving all over the place?


Unfortunately, I've never noticed that, because I hide my desktop icons. ;)


The newest update (the Creators Update) aims to fix it.


Yes, Windows 7. I have heard that newer versions are no better in this regard, but I can't speak for them directly as I am still on 7.


I use Windows 10 on a 1440p display and it works fine. Windows actually has better scaling options than a Mac, which just resorts to bilinear scaling and totally disables the option on low DPI displays where the blurring would be too obvious.


Windows 10 does just as bad a job as Windows 7 at handling high-DPI displays. Especially with multiple displays, and especially when it comes to application support (not sure how much of this is on the application side, but my impression from Retina UI-designing colleagues is that Apple made it really easy for applications to migrate to and support both high- and standard-DPI displays).

I've heard the Creators Update has some improvements to display handling, so we'll see how that goes.


"Just as bad a job as Windows 7" is not true; it's much better in this respect, but still not perfect. The Creators Update improves it even more, as MMC panels now render fonts correctly.


As someone who develops Windows desktop software, I can understand why application support is so terrible. It's not easy, especially when you have a multi-monitor system where one screen is 4K and another is 1080p. Dealing with dynamic DPI changes as the application is dragged from one monitor to the other is something I still haven't fixed.
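For anyone wondering what that looks like at the Win32 level, here is a minimal sketch (not the poster's actual code) of a per-monitor-DPI-aware window reacting to WM_DPICHANGED; the painful part in a real app is the "rescale everything" step, which here is only a comment:

    // Minimal per-monitor DPI-aware Win32 window (Windows 10 1703+ for the
    // awareness-context call; WM_DPICHANGED itself exists since Windows 8.1).
    #include <windows.h>

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
        switch (msg) {
        case WM_DPICHANGED: {
            UINT newDpi = HIWORD(wParam);  // new DPI (X and Y are always equal)
            const RECT* r = reinterpret_cast<const RECT*>(lParam);
            // Move/resize to the rectangle Windows suggests for the new DPI...
            SetWindowPos(hwnd, nullptr, r->left, r->top,
                         r->right - r->left, r->bottom - r->top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
            // ...then recreate fonts/bitmaps and re-lay-out, scaled by newDpi / 96.0.
            (void)newDpi;
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProcW(hwnd, msg, wParam, lParam);
    }

    int WINAPI wWinMain(HINSTANCE hInst, HINSTANCE, PWSTR, int nCmdShow) {
        // Usually declared in the application manifest; done in code here only
        // to keep the sketch self-contained.
        SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);

        WNDCLASSW wc = {};
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = hInst;
        wc.lpszClassName = L"DpiDemo";
        RegisterClassW(&wc);

        HWND hwnd = CreateWindowExW(0, L"DpiDemo", L"Per-monitor DPI demo",
                                    WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                                    640, 480, nullptr, nullptr, hInst, nullptr);
        ShowWindow(hwnd, nCmdShow);

        MSG msg;
        while (GetMessageW(&msg, nullptr, 0, 0)) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        return 0;
    }

Dragging the window across a 4K/1080p boundary delivers WM_DPICHANGED with a suggested rectangle; everything that caches pixel sizes (fonts, icons, layouts) has to be rebuilt at that point, which is exactly the work that rarely gets done.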


This. I only bothered with fixing for a single DPI value, and even that was hard enough.

This is on Microsoft, no question. Apple handled the transition much better with the hard-coded 2x scaling factor. Windows is more flexible with DPI in theory, but a PITA to code against in practice.


Could you elaborate on what 10 does better than 7 on this?


A couple of great resources on the improvements in the latest Windows 10 release are this blog post: https://blogs.windows.com/buildingapps/2016/10/24/high-dpi-s... and this Channel 9 video: https://channel9.msdn.com/Events/Windows/Windows-Developer-D...


That's fair. I'll admit I haven't spent much time using Win7 with 4K+ displays. I just assumed it was bad based on how bad Win10 has been.


Agree, Windows is optimized for multiple smaller monitors; 3x24" works great for me. Not sure I'd be able to use a 32" single monitor setup as well.

(Being able to snap windows to half of the screen was one of the biggest UI quality of life improvements I can remember in a long time. MacOS is lagging on this one, as they are optimizing the experience for one screen.)


You can do split screen in macOS, too: Keep the window’s expand button pressed until a split screen appears, drag the window to the right or left side, select a window that should fill the other half of the screen.


Yeah, they added this a couple of versions ago (I use macOS primarily, so I use this feature a lot), but the keyboard shortcuts on Windows make it really snappy to use.

As far as I know, the Mac version of the feature requires two windows to be selected to share the screen, which is less flexible than the Windows implementation where I can temporarily pop a window into half-screen mode to peek at another window that's beneath it.


Magnet is a little menubar utility app for the Mac that does this.


Didn't know about that one. How does it compare to Spectacle?


I'm not sure if Spectacle has a menu bar icon, but I mostly use Magnet's rather than the keyboard shortcuts. The downside, I guess, is that Magnet isn't free, though it often seems to be "on sale" for 99 cents.


  Frankly, a 4K 32" monitor is quite crappy..
The Apple Cinema display is 2560×1440 (1440p) at 27" and plenty of people use that. That was a pretty standard resolution for 27-30" screens before 4K became popular. Most people would find 2160p (i.e. 4k) at this screen size to be really nice.


Apple no longer makes or sells that; it's outdated tech. I'm so used to 200+ PPI now that when I had a 4K 28" monitor at work, I thought it was just not that good.


I just bought a P2715Q (27", 163ppi) and I really haven't noticed the difference from my 220ppi 15" MacBook Pro. I'm sure if I put them side by side and shoved my face into them I would notice, but practically speaking, 163ppi is pretty great.


If you are using macOS, font smoothing makes it really hard to tell. If you are using Windows, the difference is more apparent given the sharper font rendering.


This makes sense. I've also had a P2715Q for more than a year and it looks retina to me, but I'm mostly looking at the (anti-aliased) text when determining that.


I use one and while the text is pretty clear, it's not as good as a Retina-type display. The pixels are visible.

This matters mostly with tiny type. The difference there is astounding.


You mean the Thunderbolt monitor, right? Apple didn't make a 27" Cinema.


The 27" was a non-Thunderbolt Cinema Display for about a year.


Ah, my mistake. Thanks for the correction!


Most Mac people might, but professionals doing real work want at least UHD 4K


All's relative.

I recently argued that 1080p is no good at anything larger than 24" for a monitor, but that it was just right for 24".


Some people like large pixels, mainly because software support for pixel-doubled HiDPI displays is still spotty.

Personally I like larger pixels and a larger display at a longer distance (maybe it's my vision, getting old?).


"4K 32" monitor is quite crappy."

You must have really good eyes. When I sit a normal distance away from my 32" I really have to make an effort to see the pixels. Unless you are a graphics professional I would say it's more than enough. I also work on a 40" 4K and there you can really see it's pixelated, but if you use your computer for coding/mail/browsing, it's still fine.


You must have an incorrectly positioned monitor. At arm's length distance, the difference is immediately visible with only 27", where you need 5K (see the iMac) to lose visible pixels. You need to SEE it, not theorize.

A low-DPI screen is "fine", sure, until you get used to better ones.


We have a nice range of different sizes and resolutions at our office, so I have seen it. That's subjective of course, but if you can see the difference between 4K and 5K at 27" you have much better vision than me.


You're not the only one. I too want as much resolution as I can get. I work with text all day and a high-dpi display is like the difference between reading text printed on a laser printer vs a dot matrix printer.


The Rayleigh criterion defines the theoretical limit of what is resolvable looking through an aperture, your pupil for example, irrespective of how good your retina is. For a pupil diameter of 0.5cm and 500nm light (green), this is 1.22e-4 radians. 1 arcsecond is about 5e-6 radians, or about 25 times smaller than it is theoretically possible to resolve with a pupil that size and that wavelength of light. There is perhaps a factor of two of play in the wavelength and the pupil size, but it is physically impossible for you to resolve anything this small.
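A quick back-of-the-envelope check of those figures (theta ~ 1.22 * lambda / D for a circular aperture):

    // Rayleigh criterion: minimum resolvable angle through a circular aperture.
    #include <cstdio>

    int main() {
        const double pi = 3.14159265358979;
        const double lambda = 500e-9;   // green light, metres
        const double pupil  = 5e-3;     // pupil diameter, metres

        double theta  = 1.22 * lambda / pupil;     // radians
        double arcsec = pi / (180.0 * 3600.0);     // 1 arc second, radians

        std::printf("limit: %.3g rad = %.0f arc seconds\n", theta, theta / arcsec);
        // prints roughly: limit: 0.000122 rad = 25 arc seconds
        return 0;
    }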


You are not the only person, I totally agree. I was longing for a "retina" display years before the hints of it came out of Apple, and I find display quality (not just resolution, but colour quality and the right size/resolution ratio for text size) make a huge difference to my enjoyment and productivity when using a computer.


I agree. I have a 13" laptop with a 2560x1700 resolution (236 PPI) and the text quality is amazing, much more pleasant to look at than on my 24-inch 1920x1200 Dells. Also, I don't entirely understand people who blame hardware when really it's their software that sucks on high-DPI displays (looking at you, MS). Most Linux distros have handled this fine for quite a while now.


I don't understand why they don't scale up the screen size for monitors any further.

I've been coding on a 40" 4K monitor for a while now and it's awesome to be able to see most of my code without scrolling. I would love to get better resolution but I don't want to go back to 32" (4K is good enough for coding but it's definitely pixelated at 40").

You can buy TVs in humongous sizes, so I'm sure there is no technical limitation, but apparently they think there is no market for large monitors. The first vendor to ship a 40" monitor with a resolution above 4K has my money.


Resolution improvements are nice; we're finally reaching print-level-ish DPI for desktop monitors, which means we can finally get some decent typesetting going on. But my main gripe is that we keep upping resolution while ignoring the fact that we're still using garbage color gamuts. UHDTV Rec.2020 is basically the first reasonable-and-practical gamut, but even this monitor only does Rec.709, which is virtually identical to the ancient sRGB gamut we've been forced to live with for decades.
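As a rough sketch of how much bigger Rec.2020 is, here are the areas of the two gamut triangles in CIE 1931 xy space, computed from the published primaries (xy area is not perceptually uniform, so treat the ratio as indicative only):

    // Compare the Rec.709/sRGB and Rec.2020 triangles in CIE 1931 xy space.
    #include <cmath>
    #include <cstdio>

    struct Xy { double x, y; };

    // Shoelace formula for the area of the triangle spanned by the primaries.
    double area(Xy r, Xy g, Xy b) {
        return 0.5 * std::fabs(r.x * (g.y - b.y) + g.x * (b.y - r.y) + b.x * (r.y - g.y));
    }

    int main() {
        double a709  = area({0.640, 0.330}, {0.300, 0.600}, {0.150, 0.060});
        double a2020 = area({0.708, 0.292}, {0.170, 0.797}, {0.131, 0.046});
        std::printf("Rec.709: %.3f  Rec.2020: %.3f  ratio: %.2fx\n",
                    a709, a2020, a2020 / a709);
        // prints roughly: Rec.709: 0.112  Rec.2020: 0.212  ratio: 1.89x
        return 0;
    }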


> 1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast)

What would this translate to in PPI or any other unit/measurement we might be familiar with?


1 arc second is about 4.85x10^-6 radians. For small angles, sin(x) is approximately x, so sin(1 arc second) = 4.85x10^-6 is the ratio between pixel width and viewing distance. At 3 ft this is about 5730 dpi, and at 10 ft it's about 1720 dpi.
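The same conversion as code, for the stated distances:

    // DPI implied by a pixel that subtends 1 arc second at a given distance.
    #include <cstdio>

    int main() {
        const double pi = 3.14159265358979;
        const double arcsec = pi / (180.0 * 3600.0);  // ~4.85e-6 radians

        const double distances_in[] = {36.0 /* 3 ft */, 120.0 /* 10 ft */};
        for (double d : distances_in) {
            double pixel_in = d * arcsec;             // pixel width, inches
            std::printf("%4.0f in -> %.0f dpi\n", d, 1.0 / pixel_in);
        }
        // prints roughly: 36 in -> 5730 dpi, 120 in -> 1719 dpi
        return 0;
    }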

I think the 1 arc second threshold is too strict by about an order of magnitude.


Yeah, I found this:

https://www.quora.com/How-do-you-convert-arc-seconds-to-mete...

"One degree is an angular measurement. 1/60th of a degree is an arc minute. 1/60th of an arc minutes is an arc second, a rather small angle but an angle none the less.

(...)

1 arc second subtends 1 meter at a distance of 205,787 meters (I did say it was a small angle)."

Which should mean that 1 arc second subtends one millimetre at ~200 meters? Do we really have that high visual acuity?

100 pixels per millimeter at 2m? 200 at 1m? That's ~25 * 200 = 5000 pixels per inch at 1m. I would think 1200 dpi would be more than sufficient at 1m...

[ed: missed the bit about: "from standard viewing distances" - still sounds rather extreme]


The theoretical upper limit of the resolving power of the human eye will be given by the Rayleigh criterion. For a pupil diameter of 5mm and a wavelength of 500nm (green light) we can theoretically resolve about 1.22e-4 radians. 1 arcsecond is about 5e-6 radians, i.e. 25 times smaller than we could theoretically resolve without a magic retina.

1 arcminute is a more realistic figure for what people can reliably resolve, I think. We probably need somewhat better than that to avoid a subjective feeling of blurriness, but that is about the limit of what we can reliably tell apart. Below 25 arcseconds or so we cannot tell the difference at all; it is not possible without larger pupils.


Can you tell us at what distance the theoretical 1.22e-4 radians would apply? I suppose you mean something like regular screen-viewing distance.


The angle in radians applies at any distance. If I understand correctly, it is about when light from two adjacent points would have to "hit" the aperture so close together as to interfere with each other. Another random link from the Internet makes the connection between the definition of 20/20 vision and the Rayleigh limit:

http://hyperphysics.phy-astr.gsu.edu/hbase/phyopt/Raylei.htm...


> Other than resolution improvements, it would be nice if someone would attempt an HDR light field display.

Light field displays just mean even more resolution increases, because they're usually created by throwing more pixels at the problem and sticking microlenses over them. It doesn't have to be planar; afaik you can also achieve light fields by modifying the phase, altering the properties of the medium in its depth dimension. But you still need to encode a lot more information, so whether you have 2D or 3D pixels... you still need more resolution.
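A back-of-the-envelope illustration of that cost, with made-up numbers for a hypothetical microlens-over-panel design:

    // A microlens light-field display trades spatial resolution for angular
    // views: each NxN block of panel pixels becomes one multi-view "3D pixel".
    #include <cstdio>

    int main() {
        const int panel_w = 7680, panel_h = 4320;  // an 8K panel (hypothetical)
        const int views_per_axis = 8;              // 8x8 views under each lenslet

        std::printf("per-view spatial resolution: %d x %d\n",
                    panel_w / views_per_axis, panel_h / views_per_axis);
        // prints: per-view spatial resolution: 960 x 540 -- i.e. roughly 540p
        // per view, which is why light fields need vastly more pixels still.
        return 0;
    }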


The thing that bugs me about hi-res displays is that not all software is optimised for them. I bought a Dell 24" to make my workflow easier (transitioning from a 14" laptop!), but when I load up a multimedia application that's central to my contract work, the interface gets all tiny and squished. In the end, I have had to lower the resolution anyway, which defeats the purpose.


> Until displays reach 1/60th of an arc minute …

Or an arc second, a second being 1/60th of a minute

(as an aside, I was prevented from posting this for some time due to 'submitting too fast,' but I've only posted five times today, twice in the past hour & thrice in the hours before that — how strange!)


I think a lot of us aren't excited yet because it costs $5000 and therefore isn't a realistic option for most people. When the price comes down, I'll be ecstatic.


Exactly. I think it's only this year that most people are starting to move to 4K monitors. They've become affordable, with the cheapest now in the $300-$400 range.


I run my 4K 15" laptop at half res (2x2 pixel blocks) because I'd much rather take lower resolution everywhere than have even a single app or bit of the OS scale poorly.

I might just not be very sensitive, but I don't notice the positive difference when switching to 4K, whereas the negative ones, such as various apps not respecting DPI scaling settings, are immediately visible. On a 15" screen, any app that isn't scaled is unusable.



