Dell’s 32-inch 8K UP3218K Display Now for Sale (anandtech.com)
229 points by DiabloD3 on March 24, 2017 | 247 comments



Sometimes I feel like I'm the only person who is really excited about these resolution improvements. When the MacBook Pro Retina came out in 2012, the only reason I bought it was because of the display. I had never used a Mac before then.

Going from 4K to 8K for a 32" monitor may seem like a small improvement, but it is a subtle sensory improvement that just makes using a computer more pleasant. Until displays reach 1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast), I will always want higher resolution.

Other than resolution improvements, it would be nice if someone would attempt an HDR light field display. This would ultimately lead to a monitor that is indistinguishable from a window.


> Until displays reach 1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast), I will always want higher resolution.

The human eye is normally rated at 1 arc minute == 1/60 degree rather than "1 arc second == 1/60 arc minute". This is around 1/30th the size of the full moon.

8K 32in at > ~.32m is "Retina" quality since it's > 60px / deg. 4K achieves this at >.6m. The former is ~275 dpi which is close to the ~300 dpi Apple used to define "Retina" for their iPhone.
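For anyone who wants to check those numbers, here's a rough back-of-the-envelope sketch in Python (the 32" 16:9 panel and the 60 px/deg threshold are the figures assumed in the comment above, nothing authoritative):

    import math

    def ppi(diag_in, h_px, v_px):
        """Pixels per inch of a flat panel, from diagonal size and resolution."""
        aspect = h_px / v_px
        width_in = diag_in * aspect / math.hypot(aspect, 1)
        return h_px / width_in

    def retina_distance_m(ppi_val, px_per_deg=60):
        """Closest distance (m) at which the panel still gives px_per_deg pixels per degree."""
        pitch_m = 0.0254 / ppi_val               # pixel pitch in metres
        return px_per_deg * pitch_m / math.radians(1)

    for name, h in [("8K", 7680), ("4K", 3840)]:
        p = ppi(32, h, h * 9 // 16)
        print(name, round(p), "ppi, 'retina' beyond ~%.2f m" % retina_distance_m(p))
    # -> ~275 ppi / ~0.32 m for 8K, ~138 ppi / ~0.63 m for 4K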

Personally, I would prefer 120+ fps rather than higher resolution. I have both a 120Hz & 60Hz LCD on my desk and the difference when scrolling or dragging windows is quite noticeable.


I agree. 4k is enough. If you have perfect eyesight you need to sit about 1 foot away to resolve the pixels at 8k.

At that distance you cannot see the whole screen anyway, as the 1 arc second resolution is limited to a very small spot at the center of your vision.

Personally, I'd prefer a taller aspect ratio than 16:9, which seems designed for movies. 4:3 or even 1:1 (square) would suit desktop and medical imaging work better.


Totally agree... I have a 32" 4k monitor, and it's great. If I get really close I can still see pixels, but... realistically now I want a 32" 4k monitor with a 120Hz refresh rate. And graphics cards that can handle it... I know my MBP, even though it has the upgraded graphics card option, is struggling...


I got a 29" wide-screen monitor. I want it to be twice as tall as it is. I guess I want a 42"+ desktop monitor that's reasonably priced.


Consider Dell's P4317Q[1] for ~$1000, which is a 4K 43-inch display with multiple inputs (you can split the screen across 2/4 devices simultaneously, if you need). I couldn't be happier, well, except for the lack of 120Hz.

[1] http://accessories.us.dell.com/sna/productdetail.aspx?c=us&c...


If you're okay with 4k at that screen size there are TVs that would work.


It takes some elite window management skills I'd think. How many windows do you expect to have on a screen that big? If more than 4, doesn't moving them around get cumbersome?

Side note: sizeUp for osx is very good, I wish there were something as good for Ubuntu.


Tiling window managers solve this problem pretty well. I've used xmonad and stumpwm, although people generally seem to prefer awesome or i3 derivatives


Frankly, a 4K 32" monitor is quite crappy...4K works at 24" or so, but above that you begin to see pixels if you use it as a standard monitor (arms length) rather than a TV (a few feet away). 8K is fairly reasonable for 32", putting it in 200+ PPI area (279.73 to be precise), meaning it could be used as a real monitor...but...it's a bit big for me (27" is kind of a stretch already).

I would like to see OLEDs at this size/resolution, which would accomplish something similar.


I recently traded away a 4K monitor for a 1080p that was better in other respects because Windows 7's handling of high resolution is still, seemingly, no better than it ever has been in recent versions of Windows. I have yet to find a way to get sufficiently large text for readability without breaking all sorts of GUIs, including built-in Windows apps. Typically I find these problems show up when a level or two deep in a menu or config window.

For me the resolution of the hardware doesn't matter a bit until the OS can handle it properly. I don't know what you're using but I would guess that it's not Windows 7.


Do you mean Windows 7? That's an OS which has been out of mainstream support for two years now. I wouldn't hold your breath for much to change.

I haven't used Windows 10 on a high DPI external monitor, but it certainly works well on the Surface Pro 3/4's high DPI screens.


Windows 10 has been spectacular on my 4k monitor; some of the software isn't (old games), but the windows experience itself seems to be nice.


Have you figured out how to stop Desktop icons from moving all over the place?


Unfortunately, I've never noticed that, because I hide my desktop icons. ;)


The newest update (Creators) aims to fix it.


Yes, Windows 7. I have heard that newer versions are no better in this regard but can't speak for them directly as I am still on 7.


I use Windows 10 on a 1440p display and it works fine. Windows actually has better scaling options than a Mac, which just resorts to bilinear scaling and totally disables the option on low DPI displays where the blurring would be too obvious.


Windows 10 does just as bad a job as Windows 7 at handling high-dpi displays. Especially with multiple displays, and especially where it comes to application support (not sure how much of this is on the application side, but my impression from Retina UI-designing colleagues is that Apple made it really easy for applications to migrate/support both high- and standard- DPI displays).

I've heard the Creators Update has some improvements to display handling, so we'll see how that goes.


"just as bad a job as Windows 7 " is not true, its much better in this respect but sill not perfect. The Creators update improves it even more as now MMC panels render fonts correctly.


As someone that develops Windows desktop software, I can understand why application support is so terrible. It's not easy, especially when you have a multi-monitor system where one screen is 4k and another is a 1080. Dealing with dynamic DPI changes as the application is dragged from one monitor to the other is something I still haven't fixed.


This. I only bothered with fixing for a single DPI value, and even that was hard enough.

This is on Microsoft, no question. Apple handled the transition much better with the hard coded 2x scaling factor. Windows is more flexible with dpi in theory, but a PITA to code against in practice.


Could you elaborate on what 10 does better than 7 on this?


A couple of great resources on the improvements in the latest Windows 10 release are this blog post: https://blogs.windows.com/buildingapps/2016/10/24/high-dpi-s... and this Channel 9 video: https://channel9.msdn.com/Events/Windows/Windows-Developer-D...


That's fair. I'll admit I haven't spent much time using Win7 with 4K+ displays. I just assumed it was bad based on how bad Win10 has been.


Agree, Windows is optimized for multiple smaller monitors; 3x24" works great for me. Not sure I'd be able to use a 32" single monitor setup as well.

(Being able to snap windows to half of the screen was one of the biggest UI quality of life improvements I can remember in a long time. MacOS is lagging on this one, as they are optimizing the experience for one screen.)


You can do split screen in macOS, too: Keep the window’s expand button pressed until a split screen appears, drag the window to the right or left side, select a window that should fill the other half of the screen.


Yeah, they added this a couple versions ago (i use MacOS primarily so I use this feature a lot), but the keyboard shortcuts on windows make it really snappy to use.

As far as I know, the Mac version of the feature requires two windows to be selected to share the screen, which is less flexible than the Windows implementation where I can temporarily pop a window into half-screen mode to peek at another window that's beneath it.


Magnet is a little menubar utility app for the Mac that does this.


Didn't know about that one. How does it compare to spectacle?


I'm not sure if spectacle has a menubar icon, but I mostly use that as opposed to keyboard shortcuts. Downside, I guess, is that Magnet isn't free but often seems to be "on sale" for 99 cents.


  Frankly, a 4K 32" monitor is quite crappy..
The Apple Cinema display is 2560×1440 (1440p) at 27" and plenty of people use that. That was a pretty standard resolution for 27-30" screens before 4K became popular. Most people would find 2160p (i.e. 4k) at this screen size to be really nice.


Apple no longer makes or sells that; it is outdated tech. I'm so used to 200+ PPI now that when I had a 4K 28" monitor at work, I thought it was just not that good.


I just bought a P2715Q (27", 163ppi) and I really haven't noticed the difference from my 220ppi 15" MacBook Pro. I'm sure if I put them side by side and shoved my face into them I would notice, but practically speaking, 163ppi is pretty great.


If you are using Mac OS, font smoothing makes it really hard to tell. If you are using Windows, the difference is more apparent given its sharper font rendering.


This makes sense. I've also had a P2715Q for more than a year and it looks retina to me, but I'm mostly looking at the (anti-aliased) text when determining that.


I use one and while the text is pretty clear, it's not as good as a Retina-type display. The pixels are visible.

This matters mostly with tiny type. The difference there is astounding.


You mean the Thunderbolt monitor, right? Apple didn't make a 27" Cinema.


The 27" was a non-Thunderbolt Cinema Display for about a year.


Ah, my mistake. Thanks for the correction!


Most Mac people might, but professionals doing real work want at least UHD 4K


All's relative.

I recently argued that 1080p is no good at anything larger than 24" for a monitor, but that it was just right for 24".


Some people like large pixels - mainly because HiDef displays with pixel-doubling have spotty coverage.

Personally I like larger pixels, larger display at a longer distance (maybe it's my vision - getting old?)


"4K 32" monitor is quite crappy."

You must have really good eyes. When I sit a normal distance away from my 32" I really have to make an effort to see the pixels. Unless you are a graphics professional I would say it's more than enough. I also work on a 40" 4K and there you can really see it's pixelated, but if you use your computer for coding/mail/browsing, it's still fine.


You must have an incorrectly positioned monitor. At arm's length distance, the difference is immediately visible with only 27" - where you need 5K (see iMac) to lose visible pixels. You need to SEE it, not theorize.

Low dpi screen is "fine", sure - until you get used to better ones.


We have a nice range of different sizes and resolutions at our office, so I have seen it. That's subjective of course, but if you can see the difference between 4K and 5K at 27" you have much better vision than me.


You're not the only one. I too want as much resolution as I can get. I work with text all day and a high-dpi display is like the difference between reading text printed on a laser printer vs a dot matrix printer.


The Rayleigh criterion defines the theoretical limit of anything resolvable looking through an aperture, your pupil for example, irrespective of how good your retina is. For a pupil diameter of 0.5cm and 500nm light (green), this is 1.22e-4 radians. 1 arcsecond is about 5e-6 radians, or about 25 times smaller than it is theoretically possible to resolve with a pupil that size and that wavelength of light. There is roughly a factor of two of headroom in the wavelength and the pupil size each, but it is physically impossible for you to resolve anything that small.
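A minimal numeric check of that in Python (pupil diameter and wavelength are the values assumed in the comment):

    import math

    # Rayleigh criterion: theta_min = 1.22 * wavelength / aperture_diameter
    wavelength = 500e-9                        # 500 nm green light (assumed)
    pupil = 5e-3                               # 5 mm pupil diameter (assumed)

    theta_rayleigh = 1.22 * wavelength / pupil     # ~1.22e-4 rad
    arcsec = math.radians(1 / 3600)                # ~4.85e-6 rad
    arcmin = math.radians(1 / 60)                  # ~2.9e-4 rad

    print("Rayleigh limit: %.2e rad = %.0f arcsec = %.2f arcmin"
          % (theta_rayleigh, theta_rayleigh / arcsec, theta_rayleigh / arcmin))
    # -> 1.22e-04 rad = 25 arcsec = 0.42 arcmin; 1 arcsec is ~25x below the limit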


You are not the only person, I totally agree. I was longing for a "retina" display years before the hints of it came out of Apple, and I find display quality (not just resolution, but colour quality and the right size/resolution ratio for text size) make a huge difference to my enjoyment and productivity when using a computer.


I agree. I have a 13 inch laptop with 2560x1700 resolution (236 PPI) and text quality is amazing - much more pleasant to look at than on my 24 inch 1920x1200 Dells. Also I don't entirely understand people who blame hardware when really it's their software that sucks on high DPI displays (looking at you, MS). Most Linux distros have had no problem with that for quite a while now.


I don't understand why they don't scale up the screen size for monitors any further.

I've been coding on a 40" 4K monitor for a while now and it's awesome to be able to see most of my code without scrolling. I would love to get better resolution but I don't want to go back to 32" (4K is good enough for coding but it's definitely pixelated at 40").

You can buy TVs in humongous sizes so I'm sure there is no technical limitation, but apparently they seem to think there is no market for large monitors. The first one on the market with a 40" monitor with a resolution above 4K has my money.


Resolution improvements are nice, we're finally reaching "print level ish" dpi for desktop monitors, which means we can finally get some decent typesetting going on, but my main gripe is that we keep upping resolution and ignoring the fact that we're still using garbage color gamuts. UHDTV Rec.2020 is basically the first reasonable-and-practical gamut, but even this monitor only does Rec.709, which is virtually identical to the ancient sRGB gamut we've been forced to live with for decades.


1/60th of an arc minute from standard viewing distances (roughly the discerning power of the human eye under assumptions on contrast)

What would this translate to in PPI, or any other unit / measurement we might be familiar with?


1 arc second is 4.8x10^-6 radians. For small angles, sin(x) is approximately x. Thus, sin(1 arc second) = 4.8x10^-6 is the ratio between pixel width and viewing distance. At 3ft this is roughly 5700 dpi, and at 10ft it's 1720 dpi.

I think the 1 arc second threshold is too strict by about an order of magnitude.
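A minimal sketch of the conversion, with the 1 arc minute figure that most of the thread prefers shown alongside the 1 arc second assumption:

    import math

    def dpi_needed(view_dist_in, angle_arcsec):
        """DPI at which one pixel subtends the given angle at the given viewing distance."""
        theta = math.radians(angle_arcsec / 3600.0)   # small angle: sin(x) ~ x
        return 1 / (view_dist_in * theta)

    for feet in (3, 10):
        inches = feet * 12
        print("%2d ft: %5.0f dpi at 1 arcsec, %3.0f dpi at 1 arcmin"
              % (feet, dpi_needed(inches, 1), dpi_needed(inches, 60)))
    # -> ~5730 / ~95 dpi at 3 ft, ~1720 / ~29 dpi at 10 ft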


Yeah, I found this:

https://www.quora.com/How-do-you-convert-arc-seconds-to-mete...

"One degree is an angular measurement. 1/60th of a degree is an arc minute. 1/60th of an arc minutes is an arc second, a rather small angle but an angle none the less.

(...)

1 arc second subtends 1 meter at a distance of 205,787 meters (I did say it was a small angle)."

Which should mean that 1 arc second subtends one millimetre at ~200 meters? Do we really have that high visual acuity?

100 pixels per millimeter at 2m? 200 at 1m? That's ~25 * 200 = 5000 pixels per inch at 1m. I would think 1200 dpi would be more than sufficient at 1m...

[ed: missed the bit about: "from standard viewing distances" - still sounds rather extreme]


The theoretical upper limit of the resolving power of the human eye will be given by the Rayleigh criterion. For a pupil diameter of 5mm and a wavelength of 500nm (green light) we can theoretically resolve about 1.22e-4 radians. 1 arcsecond is about 5e-6 radians, i.e. 25 times smaller than we could theoretically resolve without a magic retina.

1 arcminute is a more realistic size for what people can reliably resolve, I think. I guess we need significantly better than that to avoid a subjective feeling that it is blurry, but that is about what we can actually reliably tell the difference between. We cannot tell the difference below 25 arcseconds or so, though; it is not possible without larger pupils.


Can you tell us at what distance the theoretical 1.22e-4 radians would apply? I suppose you mean something like regular screen-viewing distance.


The angle in radians applies at any distance - if I understand correctly, it is about when light from two adjacent points hits the aperture so close together that it interferes with itself. Another random link from the Internet makes the connection between the definition of 20/20 vision and the Rayleigh limit:

http://hyperphysics.phy-astr.gsu.edu/hbase/phyopt/Raylei.htm...


> Other than resolution improvements, it would be nice if someone would attempt an HDR light field display.

Light field displays just mean even more resolution increases, because they're usually created by throwing more pixels at the problem and sticking microlenses over them. Although it doesn't have to be planar; afaik you can also achieve light fields by modifying the phase, by altering the properties of the medium in its depth dimension. But you still need to encode a lot more information, so whether you have 2D or 3D pixels... you still need more resolution.


The thing that bugs me about hi-res displays is that not all software is optimised for them. I bought a Dell 24" to make my workflow easier (transitioning from a 14" laptop!) but when I load up a multimedia application central to my contracts, the interface gets all tiny and squished. In the end, I have had to lower the resolution anyway, which defeats the purpose...


> Until displays reach 1/60th of an arc minute …

Or an arc second, a second being 1/60th of a minute

(as an aside, I was prevented from posting this for some time due to 'submitting too fast,' but I've only posted five times today, twice in the past hour & thrice in the hours before that — how strange!)


I think a lot of us aren't excited yet because it costs $5000 and therefore isn't a realistic option for most people. When the price comes down, I'll be ecstatic.


Exactly. I think it's only this year that most people are starting to move to 4K monitors. These have become affordable, as the cheapest are moving into the $300-$400 range.


I run my 4K 15" laptop at half res (2x2 pixel blocks) because I take lower res everywhere much rather than even one single app or bit of OS scaling poorly.

I might just not be very sensitive, but I don't notice the positive difference when switching to 4K, while the negative ones, such as various apps not respecting DPI scaling settings, are immediately visible. On 15" any app that is not scaled is unusable.


Any engineers familiar with DSC 1.2 care to comment on how the required compression to support 8K@60Hz over DP 1.4 could presumably manifest undesirable artifacts (for lack of a better analogy) which would run counter to the reasons that would make this high-end display attractive from a productivity perspective? I initially thought this would be a vanilla lossless frame transaction, but that's apparently not the case. If imperfections aren't perceivable at 32", I'm curious how they'll visually manifest (if at all) as display size scales up.

Wiki[1] notes that DSC is visually lossless per ISO/IEC 29170-2 test method, but I'm neither familiar with DSC nor the ISO/IEC standard. What are the practical implications of calibrating against a $200+ colorimeter accessory that Dell is so intent on adding to your cart?

[1] https://en.wikipedia.org/wiki/DisplayPort#Display_Stream_Com...


Dell specifies[1] 87W (typical) to 125W (max), with 89.5W in Energy Star mode. Has anyone ballpark-estimated power requirement to generate the video signals driving such a monitor?

Also, any idea what DP cable lengths will be pragmatically limited to at such a resolution?

P.S. The naivete of this type[2] of marketing tactic never fails to blow my mind...as perceived from an inferior Dell U2415.

[1] http://www.dell.com/en-us/shop/dell-ultrasharp-32-8k-monitor...

[2] http://i.dell.com/das/xa.ashx/global-site-design%20WEB/2a9f3...


"Dell specifies[1] 87W (typical) to 125W (max), with 89.5W in Energy Star mode. Has anyone ballpark-estimated power requirement to generate the video signals driving such a monitor?"

I've never looked at LCD panel power usage before ... are those numbers high or low or ... ?

If I go to best buy and buy a 50" samsung 4k TV, what would that power usage be, roughly ?

(just trying to get a sense of comparison)


For comparison, my 10 year old Dell 3007WFP (30" 16:10 1600p monitor for similar market segment) is 147W typical 177W max. So 8x the pixels for ~60% of the power consumption after a decade of improvement sounds good. Shame it's 16:9.


An Asus PA328Q (32" 4K monitor) specs say "Power On: <138.3W".[1]

For the tv comparison, choosing the first 55" 4K tv on Samsung's site: "Typical Power Consumption: 65, Maximum Power Consumption: 170".[2]

[1]: https://www.asus.com/us/Commercial-Monitors/PA328Q/specifica...

[2]: http://www.samsung.com/us/televisions-home-theater/tvs/4k-uh...


I guess that points to a chance that a USB-C 5A cable at 20V or something could soon power something like this. If that cable were carrying the DP signal, that'd be rad.


8K@60Hz appears to be the end-game bandwidth limit[1] of DP over USB-C, although I suspect the pragmatic reality of the situation will ultimately be driven by cable length limits.

I'm really struggling to wrap my mind around how sensibly pushing 100W of power out of a wee little USB-C port on a laptop might look though.

[1] https://www.displayport.org/what-is-displayport-over-usb-c/


Afaik the Displayport Alternate mode of USB-C is limited to Displayport 1.2, which means 4k@60Hz max. For that reason the external Apple/LG 5k displays use the Thunderbolt3 mode of USB-C instead of Displayport.

Faster DisplayPort alternate modes could also be possible, but just not available on current-gen transceivers.


>It’s worth noting that Raja Koduri, SVP of AMD’s Radeon Technology Group, has stated that VR needs 16K per-eye at 144 Hz to emulate the human experience

This is tangential to this thread, but can anyone in here explain the state of the art in eye tracking? Actually rendering 16k quality for my peripheral vision seems insane to me, so I'm really interested in the barriers between today's tech and a good foveated headset.


You may not need to render that pixel density in all places, but since you may need that in any place, your display hardware needs to be able to show that detail in any place where the user can point their eyeballs.


Every year at CES, e3, etc. there are vendors with pretty good eye tracking headsets. APIs for foveated rendering are in place already. I'd imagine it'll be in the next generation of headsets.


I didn't think the civilian state of the art in foveated rendering was good enough to fool our vision processing into believing it is seeing a "real" view, even in our peripheral vision, without head fatigue after, say, 4 hours of continuous usage, but I don't follow the field that closely because a couple years ago I thought that level of refinement was so far away. A quick Google seems to reinforce that perception. I only hear PR puff pieces about military-grade gear that good now, like the F-35 helmet that is supposed to be light years ahead of anything civilians can get their hands on.

Can anyone in the field give a précis on what is the state of the art?


I cannot possibly believe that rendering at this resolution is necessary. Most VR users don't turn their eyeballs while wearing headsets, but opt to use controls to adjust their perspective. I've read several accounts of hacks developers have taken when rendering VR games, and one that keeps cropping up is reducing the resolution of areas in the periphery.

While you certainly could use 16K displays for VR, I think it would be entirely unnecessary.


Most VR users don't turn their eyeballs right now because the field of view is pretty small.


> Most VR users don't turn their eyeballs while wearing headsets

Err what? Of course they do!


It needs 10K, but I guess that becomes 16K when you round up to the next power of two.


I think he may have meant that if you were to process everything for a 360° view, you would need 16K. For games that are rendered on the fly, you can probably do with less (have the card render only the portion of the scene you see), but for video, you would have to shoot at 16K.


Didn't someone once say 640K ought to be enough for anyone?


Gee, and I always thought 15.5K and 143 Hz was enough.


Well... if you have 20/20 vision, you can resolve about one arc-minute. (That's at your fovea. Everywhere else, not so great. But we will probably find it easier to render everything than track your eye saccades and re-render fast enough to fool them.) Your field of vision is about 180 degrees wide (not really) and about 135 degrees vertically (full eye movement).

So: 60 arc-minutes to a degree, 180 wide: 10800 pixels.

60 * 135 tall: 8100 pixels.

An 8192x4096 is not there, but a 16384x8192 is.

Now double that so you can get stereo vision (though really there's a lot of overlap that's going to be hard to arrange), update it 100 times a second in 48 bit color... about 420 billion bits per second per eye, so roughly 840 billion for both. Double that again if you believe that we need 200Hz updates.
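A quick check of that napkin math in Python (the field of view, pixel density, bit depth and refresh rate are all the figures assumed in the comment above):

    h_deg, v_deg = 180, 135        # assumed field of view, degrees
    px_per_deg = 60                # ~1 arc minute per pixel
    fps = 100
    bits_per_px = 48

    h_px = h_deg * px_per_deg      # 10800
    v_px = v_deg * px_per_deg      # 8100
    per_eye = h_px * v_px * bits_per_px * fps
    print("per eye:   %.0f Gbit/s" % (per_eye / 1e9))        # ~420
    print("stereo:    %.0f Gbit/s" % (2 * per_eye / 1e9))    # ~840
    print("at 200 Hz: %.0f Gbit/s" % (4 * per_eye / 1e9))    # ~1680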


Yes, but shouldn't we eventually be eye-tracking anyways for depth-of-field effects? If you're eye-tracking already, you can leverage that for resolution optimization.


You still need the display to support the resolution even if you can optimize rendering costs.


To allow the eyes to actually refocus based on distance you want light field displays in your VR headsets


How many years away are we from being able to push this kind of throughput through a GPU?


Oh boy, napkin math! (not sarcastic at all - I think this kind of ballpark estimate is actually super fun)

A Titan XP/1080 Ti is finally just about enough to push 4K/60Hz with a single card. 16K would be 16 times as many pixels per frame, and 144 Hz is more than twice the refresh rate.

To put that another way, we haven't even started to think about the connector standard that you could gang together to push that many pixels, let alone having a GPU that could actually push them. At this point you are pretty much talking about some kind of lossy compression being involved ("visually lossless" sure is a great euphemism). With lossy compression you might be able to get away with ganging together a couple of whatever DisplayPort 1.5 or 1.6 end up being.

In fact with the end of Dennard scaling there's some fundamental problems with how you would even use a GPU like that as a consumer. Your average (US) household circuit is 120V/15A peak, and can be run at 80% load continuous (12A), which works out to 1440 watts at the wall. With an 80% efficient PSU that works out to 1152 watts inside the case.

A Titan XP/1080 Ti pulls 270 watts at stock clocks [0]. So hypothetically even if you stacked four of them on an interposer you're now pulling 1080 watts inside the case, which is almost your entire circuit capacity. So by a naive calculation (16 times the pixels twice as fast = 32x divided by four GPU dies) we need at least an 8x improvement in overall efficiency before this is viable.

(Four big GP102-sized chips on an interposer is technically possible right now - they only need to be on the interposer where you need interconnect bumps - they can hang off onto a support substrate, and you can have shared memory controllers/HBM2 stacks/etc on the interposer that make this appear to be one big GPU chip instead of SLI/Crossfire. Since the interposer is actually a chip all on its own, some of these auxiliary functions can actually be built into the interposer itself (the "active interposer" concept), with the biggest obstacle being getting the heat out since the interposer has other chips stacked on top of it...)

220V users obviously have things a little easier here (~twice the capacity per circuit). This cuts the overall efficiency improvement necessary down to 4x.

There will also be some efficiency improvements on the software side. For starters we can cut quality a bit (that's 4K/60fps ultra), foveate rendering, and other software magic. Speeds don't scale perfectly negatively with increased resolution so you will get some savings there. Assume some speedup from DX12/Vulkan too.

Oh and since this is 16K/144 Hz per eye I guess there's an implicit assumption here that we can make NVIDIA-style Multi Viewport Rendering work at near-100% efficiency, otherwise that's another factor of (up to) 2x that needs to be accounted for.

So let's say that we need a 4x improvement in overall hardware/software efficiency. Let's say, something like a doubling in GPU throughput-per-watt and a doubling in software efficiency before an absolutely state-of-the-art rig could even feasibly do this task on its own dedicated 220V/15A circuit. And there's a few fairly optimistic assumptions built into that 4x number.

Take your best shot at how long it will take to quadruple efficiency in the post-Dennard era. Let's say 20-30 years, unless there's a massive breakthrough in materials science or optical computing or something.

[0] https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_...
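Spelling out that power-budget arithmetic (every figure here is the comment's own assumption: the 120V/15A circuit at 80% continuous load, the 80%-efficient PSU, 270W per GPU die, and "16x the pixels, roughly twice as fast" as the workload factor):

    wall_watts   = 120 * 15 * 0.80      # 15 A circuit at 80% continuous load -> 1440 W
    inside_case  = wall_watts * 0.80    # 80% efficient PSU                   -> 1152 W

    gpu_watts    = 270                  # Titan Xp / 1080 Ti class card
    four_dies    = 4 * gpu_watts        # hypothetical 4-die interposer       -> 1080 W

    workload     = 16 * 2               # 16x the pixels at ~2x the refresh   -> 32x
    speedup_need = workload / 4         # spread across four dies             -> 8x
    print(inside_case, four_dies, speedup_need)   # 1152.0 1080 8.0
    # on a ~230 V circuit the wall budget roughly doubles, which is where the
    # comment's "4x instead of 8x" figure comes from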


> 220V users obviously have things a little easier here (~twice the capacity per circuit). This cuts the overall speedup necessary down to 4x.

Actually, in Europe, you get 220V/25A from normal wall sockets, or, for stuff like washing machines or stoves, you get up to 400V/63A.

With that – roughly 3.5kW from a normal socket, or 25kW if you use a socket for a large appliance, compared to the 1.4kW available in the US – you can easily power a fuckton more.


Apparently the 400V/63A one also exists in 125A and 200A versions though I guess those are not very likely to show up in households. There's also a 690V edition. It's quite a shock.


Normally "large appliance sockets", as he put it, are 400 V / 16 A CEE, though, 32 A CEE is pretty rare in households and more a commercial thing, although you can install it, since total capacity is at least 40 A -- but usually 63 A or 100 A.

690 V is industrial (this is the logical progression dictated by star-delta connection of motors), and not used in residential contexts. 125 A CEE is the largest standard size. Beyond that other connectors are used. Industrial use also sees other connectors / kinds of connections for higher voltages (up to medium voltage [couple kV]).


Ugh, you have no idea how annoying that is. I'd like to set up a little MPI/CUDA cluster for personal/friend use but four GPUs (1060/1070) at full tilt works out to 500-600W right there, which means a standard circuit only runs two boxes maximum.

As far as I can see it's either colo or run a 220V circuit, which is a bit of a non-starter for a hobby project.

Heck, you can even get uncomfortably close to maxing out a circuit with a single box. SLI 1080 Tis = 540W, plus 250W for an OC'd Intel HEDT processor is 800 watts inside the box, or 1000W at the wall (8.25A of 12A continuous). Hope your wife doesn't start the hairdryer while you're gaming...


How are you even using vacuum cleaners? Many vacuum cleaners here are 1800 or 2400W, kettles are 3500W, and hair dryers usually above 1800W, too.

This all works because you have between one and three 400V/63A triphase circuits per housing unit, but if your US circuits are this tiny...


Guess why it's 3500W max? Because Schuko is specced for 16A only (and 10A continuously). Where did you get that 25A number from?


The 25A is, in the house of my parents (built '05), what each breaker and line support.

As a line might be hardwired to multiple sockets, or to an appliance, or might be connected via a CEE plug instead of Schuko, it might support more than 16A.

If you build a large-scale GPU Cluster, you won’t connect them all to a single socket, but each to a separate socket, although they might be on the same line. That’s where the 25A can be useful.


I'm even more impressed with my eye and brain circuitry now.


The thing to remember about eyes is that they don't work in terms of frames and pixels.

This is really the fundamental problem with saying "X ppi is enough for the eye" or "you can't see faster than X Hz". This is a digital approach to an analog system. Your brain is tuned to identify resolution in the center of your vision and movement in the periphery, and has all kinds of "special case" circuitry to react quickly when necessary.

Classic example, fighter pilots are capable of recognizing silhouettes flashed on a screen at some ungodly rates. But if you were, say, playing a game, 100 Hz is very smooth already (even CounterStrike), and you might not be able to visually distinguish that from 144 Hz.

It's like a psychovisual model with a temporal aspect incorporated.


Thanks to the sampling theorem, if there is a frequency limit in a continuous signal, then you merely have to sample at twice that frequency.

The eye clearly has frequency limitations, so we just need to find the sampling strategy which accurately codes all perceptible frequencies.


It's going to be more than twice--Nyquist-Shannon is for 1D; 2D images would have a lower resolution on the diagonals, so you'd want a minimum of twice the frequency on the diagonals.

In an ideal world, we would also be using hexagonal grids rather than square, which gives a resolution increase for the same area and a given pixel size, and also reduces off-axis aliasing.


Very astute, thank you for bringing this to my attention. So it'd be about 2 and 58/70 times the maximum frequency, assuming two-dimensional sampling.

As for hexagonal grids, there are a number of complications introduced by that, especially when it comes to using general algorithms based on regular 2D euclidean space.


With a sampling rate of 2 on the diagonal, you'd have a rate of 2.828 on the axes, with a square grid (2 * sqrt(2)). Unless I'm missing something.

Agreed about the complications of the hexagonal grid. We make implicit assumptions everywhere. Though I'm not sure they are insurmountable for some purposes; with graphics APIs like OpenGL, texture coordinates and samplers could be using a hexagonal grid relatively transparently. That would allow use of hexagonal displays even if the texture data is square.


  The eye clearly has frequency limitations
No...the eye doesn't work that way. It is not a machine. It is organic and analog. It doesn't have some frequency limit. In any case, you're abusing the Shannon-Nyquist theorem. It absolutely does not imply that if some device is sampling at x hz then you can sample at 2x hz and produce a sufficient reproduction for the 1x hz sample. This is a common misconception.


16:9 is suboptimal for me. I usually like one extra display in portrait orientation for code and web pages.

My current 16:9 24" 4k display in portrait is too tall---have to move my head up and down a lot, and too skinny---hard to put two windows side by side comfortably. Have to fiddle with overlapping windows a lot. In landscape it would be too short, and I rarely watch movies on my desktop. One movie trailer a quarter approximately, so optimizing for that use case is absurd.

I would prefer a fatter 16:10 instead, and was happy with the 22" 1920x1200 that was replaced, though I love the increased resolution of the new monitor.


This is the single reason why I have not made the switch to 4k displays. I simply cannot give up my existing 24" 16:10 monitors.


I use two 24" 4K displays and scale the image to look like 2560x1440 per monitor (macOS). It took a while to get used to the smaller UI, but I love the amount of space I get.

Edit: The rendered viewport per monitor is 5120x2880 (=2560x1440 Retina), so everything is sharp.


4K should be seen as a qualitative upgrade over 1920x1080, and I know the pain of trying to work with that few vertical pixels. However, 2560x1440 is still 16:9, but 1440 pixels is enough for me. So my upgrade path will be to 5K (5120x2880).


If you care about aspect ratio, your monitor is too small. On a sufficiently large and sufficiently high-resolution monitor, the edges of the display no longer define the viewport; you just draw the windows at whatever aspect ratio you like. We have pixels to burn these days.


As I mentioned the monitor is already too tall, and not wide enough. I don't need a bigger monitor I need a squarer one.

In fact my old monitor was smaller (22") and I liked it better except for the lower resolution. Why? Because I could tile two windows side by side in portait. Docs and code, no fiddling, very little scrolling, and very productive.

I guess I could ignore the top few inches of my monitor but that is somewhat of a pain because of what happens on window maximization.


>I guess I could ignore the top few inches of my monitor but that is somewhat of a pain because of what happens on window maximization.

Windows has lousy built-in tools for laying out windows. Try AquaSnap, or script something with AutoHotKey.


By your definition, any monitor that doesn't fill your entire FOV is too small.


That is already starting to happen in some sense. I've got a 38" monitor (4K resolution) and I sit about 2 feet from it. Regularly, I miss mail notifications in the top-right corner.


Pretty much, yes.


I wish they would still make 4:3 laptops. I love my trusty old T61!


I would take that over 16:9.


8K doesn't excite me. I want a monitor with 4K, FreeSync, over 144hz, running on Displayport-over-USB type C... and ideally powered only by USB.

This seems like it will be possible one day.


Give me a 50+ inch 8K concave OLED desktop display with a matte surface and I will pay a huge premium.

My predictions aren't holding up [1]. I feel too many customers are satisfied with small form-factor and/or are tolerant of using multiple displays and suffering the inconvenience of bezels.

[1] http://tiamat.tsotech.com/ideal-desktop-displays


What kind of "huge" premium are you talking about here? Because your projections were wildly incorrect. You proposed a target price of $1200 for a 50-inch 8k display with a target date of a year ago. Instead Dell is selling a 32-inch 8k display for $5000, and still not actually shipping. I'd imagine that if Dell started selling what you're describing, the price range would be $30k or more at this point.


What's the use case where bezels are inconvenient? For me I want natural delineation of workspace. Being able to snap applications to 3 physical boundaries makes it easier for me to organize my work, vs one massive display where every window is floating arbitrarily.

I'm not saying those are the only two options, I'm just curious what the downside is, specifically with productivity use. I feel like if I had a 50" concave display, I'd want my window manager to have some kind of logical organization similar to what I'd have with multiple monitors anyway.


For one, you can't actually expand something across multiple screens. Depending on the kind of work you do (especially anything visual like graphics), the bezels may be a deal-breaker. Personally, I also just prefer having one large monitor because my head doesn't need to move.


At this point even a 4K 50" display costs around 6000$... how much of a premium would you pay? 20.000$?


Not true, LG sells its 2016 curved 55" UHD OLED TV for $2000.

But 8K/UHD-2 might take a while, since the marginal gain from such a resolution is probably not going to be huge for your typical TV viewing experience, and it would cause problems at many levels.


Or even 42" flat OLED. They'll come soon enough.


> "Linus from LinusTechTips should be happy, as they just invested in a pair of 8K video cameras. Time to submit my own acquisition request"

I have never understood the point of investing in such expensive and new tech for YouTube videos; most tech channel videos on YouTube become irrelevant really fast, so it's not like you are future proofing... two years from now still very few people will have 8k screens, and cameras will cost at least 50% less. Storing 8k videos will increase storage costs and processing bandwidth too.


> I have never understood what's the point of investing in such an expensive and new tech for videos for YouTube

these people are professional content makers. they make their living from buying/receiving gear and reviewing it on youtube. this isn't a hobby. youtube is big media now. very, very big. "just because you don't use it doesn't mean nobody else does."

if a film editor, or professional photographer, or rich guy, or web developer, or options trader, or magazine designer, or whoever is in the market for an 8k display and/or camera rig, they go to youtube and watch the reviews.

but in general... you know we used to have 640x480 screens and 20MB hard drives, right? that was 'normal', and 1024x768 seemed excessive. welcome to the forever now.


I believe a lot of people shoot in higher resolution than intended output because they want to be able to crop the video and still have desired resolution.


I just want to have nicer text.


I don't think they're rendering 8K video to youtube. They're probably just using 8K raw footage for editing, and then outputting the final version as 4K.


In fact they edit most of the video in 1080p because it's easier and faster. Then they upscale to 4k when they send it to YouTube because the extra bandwidth they get for the 4k stream makes a noticeable difference. They keep close-ups and B-roll in 4k because it's easier to edit anyways, and it looks really nice.


For editing you usually use proxy low-res versions of your 4k/8k/16k content to get optimal speed and then just do the final rendering at your result resolution.


I don't really remember, but I think they had a video where they talked about it. I think one of the conclusions was that YouTube's own compression loses a lot of the quality improvements, and as higher resolutions can be more difficult in general to work with/encode, they do a lot in 1080p.


It's like programmers who start learning a hot new language before it's production-ready. You want to get ahead of the curve so that you understand the technology and can take full advantage of it once its time comes.

(And I'm sure that for a professional video person, it's not just "use it like your old camera, but it produces better output." There will no doubt be subtle changes needed for the best output, whether lighting, scenery, filters, workflow, or what have you.)

We programmers just happen to be really fortunate that the hot new tools we want to learn are usually free.


Off the top of my head:

- Cropping can be more flexible.

- Downscaling from a higher-resolution sensor gives better image quality at the target resolution than shooting on a sensor native to that resolution, because sensors have a Bayer pattern.


Some YouTube channels upgrade to higher resolutions as soon as they can afford it. If they are one of the first, their videos will be watched as demo content by any users that look for material to take advantage of their new higher resolution monitor. You could see this phenomenon when video titles all had “HD” in it. Later it changed to “4K”. And soon it will change to “8K”. It is used to attract viewers.


Well, I just invested in a PC with an i7 for more than a thousand euros so that I can do web development on it. It will probably give me a maximum of 5% speedup over my 6-year-old PC, but that's still worth it. What I develop needs to run on dirt-cheap mobile phones too, which is what 90% of my users have, and even then it still makes sense.


It's a tech tips channel, do you realise how many videos they get out of reviewing their own kit?


They will get media coverage just by having this equipment. This article is an example.


digital zoom


tl;dr: US$4999, 7680 × 4320 resolution, 1300:1 contrast, 31.5" IPS, 60Hz refresh, 2 × DisplayPort 1.4 to handle the bandwidth


One last spec: it is indeed 16:9.

Which is unfortunate.


That is a lot of pixels. And quite an improvement over the 16 lines of 64 characters on a converted television that the first computers had.

The article points out the challenge though: all that memory needs to move from where it is into the pixels on the screen at a rate fast enough to not annoy you with repaint lag. Which reminds me that the real 'winner' in this space will be whoever can put 300 ppi wherever you are looking in a larger field of view and leave the rest at 100ppi and 50ppi.


Wow, that's a really interesting idea. Unfortunately any amount of lag and you get 100dpi for a fraction of a second, and then suddenly it sharpens to 300dpi... like your eyes are constantly refocusing outside of your control.


That'd be a fascinating piece of tech.

I'd bet against it being achieved before GPU's have caught up with the requirements though.


Foveated rendering has already been demonstrated at GDC this year; it's expected to be included in the next generation of VR headsets, and there are already some attachments for your monitor you can buy now.

http://www.tobii.com/


I have 3x 4k monitors, one 31.5" DCI 4k, one 28" UHD 4k and a 55" UHD TV (that also reports itself as DCI capable).

Frankly, 8k at 31.5" is kinda pointless unless you have eyes of an eagle or are working glued to your monitor. As a tech demo from Dell, it's cool!


I agree. I couldn't use my 4k Dell monitor with anything but my mbp on OS X because I couldn't read much on Windows. Windows does not do display scaling very well.

I imagine 8k is not the easiest to use on Windows and there do not seem to be any Apple products with the right amount of power to use a display like this correctly (i.e. With the same power you'd put on a custom machine running Windows.).


Set manual scaling to 200% (I think by default you can only go up to 150%; works on Windows 7 as well!). That will make it retina-style and you are going to enjoy your 4k monitor even under Windows!


For everything that supports that in Windows you're correct. It's much better. I was doing this around Windows 8 (before the major update) and it's been improved a lot now in Windows 10 (esp the latest version I've heard).

However, I'm not sure what macOS/OSX do, but it feels like they simply render whatever the app wants to render using many pixels instead of one. On Windows it feels like each app has the full control so each one needs to be updated to handle the DPI settings.


Windows 10 works mostly ok for me (27" UHD), I just set the scaling to 175% and everything scales nicely (if a program doesn't fully support scaling then, it's usually scaled up but blurry). It's supposedly a per-monitor setting, but I haven't tried it.

Linux is the real pain - most things only support integer scaling for UI elements and 200%/2x makes it feel a bit big to me.


> Linux is the real pain - most things only support integer scaling for UI elements and 200%/2x makes it feel a bit big to me.

That’s actually only the fault of Gnome.

All KDE and Qt programs support fractional DPI scaling – with a different ratio per screen.

Don’t blame all Linux programs if it’s only Gnome that’s broken.

The environment variable used is

    QT_SCREEN_SCALE_FACTORS=DisplayPort-2=1.75;HDMI-A-0=1.08;


The trick I've found for Linux with 4K is set it to 200% and then decrease the font size to .8 or .7.


Great reason to use a Mac. It's a great example of how Apple just figured it out and made it "just work".


I use a Dell 4K 24" monitor on a Surface Pro 3 with Windows 10 and rendering at 150% or 175% works great.


280 ppi! At around 800 ppi you don't need to do AA anymore. So, around 32k at the same size.


Why would you need AA at 280ppi? At sane viewing distances, I would imagine that 99% (maybe 100%) of the population lacks the visual acuity to benefit from >280ppi.

Where does your 800ppi claim come from?


Yeah, it depends on viewing distance, of course. There was a calculation from which 800 came out as the number; I forgot it and can't reproduce it atm, I just remember the result. Same as in print when printing fine detail, such as maps: never less than 800 dpi (not related, but there was also a calculation for that which I forgot, though I remember the result).


Print and monitors aren't directly comparable. Print needs higher DPI because people will literally hold a map up to their face to look at fine detail. You just zoom a digital map instead.

If you aren't holding your face up to your monitor, you don't need 800dpi. The original iPhone "retina" display was 326ppi and that was assuming a 12 inch viewing distance. For monitor viewing distance, you'll be 20 or more inches away, making 280ppi plenty for the vast majority of people.


Do you think that the market will sooner or later move to 8K monitors? I'm not so sure.

At a normal viewing distance a 100dpi monitor is already decent. A UHD monitor is just great. You have to get very close to see individual pixels. I'd say doing AA is no longer required even then.

8K seems overkill for most purposes. Sure, there is a niche that can take advantage of it, but I don't see advantages for the mass market.

Same as with SACD, CD tech is simply good enough for pretty much everyone so SACD never took off.

I write this despite being a high dpi junkie; I bought a ViewSonic VP2290b (IBM T221 clone, 3840x2400 22") back in 2006 and dealt with the huge hassle of 4 DVI inputs for years.


I have a 28" 4k screen and while I find the crispness amazing I still notice a little bit of aliasing in games (and I don't exactly have an amazing eyesight). Nothing terrible of course, but I still leave AA enabled for that reason.

I'm guessing that for monitors <= 30" 8K will be the endgame, at least for me.

In any case, I'm super happy to see the resolution race back in full force, it took the industry quite a while to recover from the CRT->LCD switch. It was about time we moved past 1080p as the standard.

It's also amusing that so-called "high definition" is not so high anymore, and SD not so standard. What will 8K be? Ultra HD Alpha Plus?


>At a normal viewing distance a 100dpi monitor is already decent.

No it isn't.


You'll see, it'll do perfectly in a couple of decades. Most people wear glasses nowadays, and those who don't will at some point in the future.


Wearing glasses doesn't reduce the need or desire for higher resolution monitors. Maybe uncorrectable vision problems do, but if your glasses get your eyes close to 20/20, you'll benefit from higher resolution just like people with natural 20/20 vision.

And as another poster pointed out, the blurriness of the screen may compound with the natural blurriness of uncorrectable vision loss.


Nope, I'm "old" and wear glasses and my small 4k monitor (~250dpi) was a significant improvement.


Actually, my eyes are more sensitive to low resolution as they get worse. The blurriness of the text compounds with the natural blurriness of my eyes.


Depends on the usage I guess. The larger the screen, the further away you stand. I have a 2560x1440 32" monitor with a <100dpi. It's great for text editing (I use a bitmap font). For other use, I would love 5k :)


I thought font rendering was the biggest let down in switching to 4k.

AA not only is still required on my 27" 4k display, but in terms of clarity AA fonts are still MARKEDLY inferior to fonts which are optimized for non-AA rasterization (like Terminus).


From our human eyeball's perspective, aren't we getting to a point of diminishing returns?

I just wish Apple had made a retina Cinema display and left it at that (they have the tech, it's in the 5K iMac).


Until the last year Apple didn't make any devices that could actually reasonably drive a 5k monitor (save the iMac of course), and that's ignoring the connectivity problem.


I think 32" 4k is too dense for most people, but maybe a 40" screen could work. I have 20/40 vision, so my eyesight has long been surpassed.


> "By then, 16K might exist, back at the $5000 price point. Maybe."

I appreciate the author saying 'maybe' because I can't, off the top of my head, understand why 16K would add any benefit over 8K... Aren't we at 'retina' with 4/8K anyway?

EDIT: at a 32" resolution...


Retina is determined by view distance and visual acuity. You can get retina on any 4k tv right now at several meters of view distance (hell, you can get "retina" on old 1080p tvs when you are sufficiently far away), but if you stood right in front of it, it would be obviously pixelated at any screen size from 30" up.

Same applies to desktop monitors and phones. The reason phone screens pushed high DPI first was because you were much closer to the screen to make it fill your view.


The image on the screen in that article certainly looks better than the monitor I'm using ;)


Right? And I also don't understand this side-by-side comparison. If I'm viewing the image on a 1080p screen (say), then why can I tell the difference between these images? What is the comparison supposed to be showing?


The comparison images are zoomed in, though they should specify how much. Presumably by a factor of four.


So can I drive this from a 15" touchbar MBP?


Personally don't see the point. I was disappointed w/ 4K monitors when half of my applications didn't scale well for them. I imagine even less things will scale well for an even less popular resolution.


I bought a 27 inch 4K display recently and instantly regretted not having purchased a 4K display 3 years ago.

Back then, 4K displays were still way too expensive. Although I could have bought one, I didn't think spending half the price of a good laptop on a 4K display was worth the money. I was wrong.

4K display greatly improved my productivity. I should have purchased a few of them already.


I don't need a super-high DPI display, but 3440x1440 @2x is almost 7K, and that sounds like a natural upgrade from the existing 21:9 34" displays.

It's a pity that most operating systems aren't designed for huge screens - Windows' start menu is always in the bottom-left corner, for example.


Hopefully, they drop the price on the 32'' 4K monitor to under $1,000. That'd be a great deal.


There are 32" 4k monitors (IPS, 60Hz, 4ms response) for $600 or less from LG already.


Can you please provide a link? They are not showing up on the LG website or Amazon. If anyone else knows of a 32" 4k monitor for 500 USD or less, share a link. For me 32" is smallest size I can get away with 100% scaling at 4k.


I have bought this one in December, there's 1 bad pixel but otherwise am satisfied. http://www.ebay.com/itm/QNIX-NEW-32-UHD3216R-REAL-4K-MINE-38...


That article says I can get 4K for 350. More interested in that than the 8K, anybody got a link in Europe?


2-3 years ago I got a 28" 4k TN panel for ~£280. Then a 24" 4k IPS for ~£200.

This year 2 27" 4k IPSs for ~£350 each (both LGs, the first one a better model but cheaper because pre-Brexit-vote).

Prices have gone up from my POV.

In answer to your question: https://www.overclockers.co.uk/monitors/by-type/4k-ultra-hd?... -- under £300 for 24" 4k IPS. OCUK is a UK subsidiary of Caseking (de), so I'd be surprised if similar prices weren't available across Europe.

In all the above experience, I've found Acer to be very good (3 monitors, 2 excellent and 1 with pixel defects), Benq to be poor (multiple returns before I gave up and bought something cheaper) and LG to be absolutely flawless.


I highly recommend the $370 28" https://www.amazon.com/ASUS-PB287Q-3840x2160-DisplayPort-Mon... - we've standardized on them at our office, and if you have a graphics card that can sustain 4K at 60Hz, it's amazing for coding and other work.


I've got this one hanging off of a 2016 MacBook Pro. It's a TN panel.

https://www.amazon.de/dp/B00WUACE4S


I might be tempted if it used a USB-C input and was compatible with my MacBook. I recently bought from Apple the LG ultra-definition monitor they recommend for the MacBook, and while the LG + MacBook combination is great, an even larger, higher resolution screen would be great too.


Interesting that it uses two DisplayPort 1.4 inputs. I guess HDMI 2.1 isn't ready yet?


Any current gfx cards featuring HDMI 2.1?


Oh, right, there's a chicken-and-egg problem there.


I'm using multiple 32" 4K monitors, and while the additional definition might be "nice," I certainly can't work with any "smaller" text.

I wish Dell would produce 8K monitors in a much larger format, like 48".


It's not about smaller text size, it's about clarity. With proper scaling (like on Mac OS), 4K can have the same font sizes as lower resolutions; the clarity, though, will be incredibly better.


And to add to that, higher DPI usually = less inter-pixel gap, which for some people reduces eye-strain.


That's personal preference. I prefer more screen real estate over larger fonts at higher clarity.


But with 8K you can have the same font size as you currently have with 4K but with a better clarity...


Huh, since when does macOS have 'proper' scaling? I usually only hear complaints... same with Windows 10 for old apps.


Can you give me an example? I've been using retina macbooks since the day they were released, and I've never seen a single issue with scaling...


Ah, never had issues with my retina MBP or the 12". Unless I connected it to an external display, which is where people seem to complain.


Used it with a 27" Cinema Display for a couple of years, never had issues with scaling. Now I've switched to a 32-inch 4K from Asus and it's also working perfectly.


macOS is the only OS that gets it right.

You can even drag a window from a HiDPI monitor to an old fashioned one and it will do what's right.


> and while the additional definition might be "nice," I certainly can't work with any "smaller" text.

I never understand why people say things like this. Why not just throw more pixels at text rendered in a readable font? Does DPI scaling not work on their operating systems?

As my monitor resolutions have gotten larger, my text on screen hasn't become smaller at all. I simply go with a higher fidelity. Isn't that what most people do?


I go smaller, including on my retina displays. My point is I'd love to replace 4x 4K monitors with 1x or 2x 8K displays; they just need to be larger than 32".


I don't think most people do that, they don't go smaller.

Also, do you know what a larger than 32" display is going to do to your poor neck?


Resolution has very little to do with smaller text.

If it does, you're doing something wrong.

(It does improve legibility of small text, I'll grant you that)


Not true, the default settings on OSX on a retina display give you an effective lower resolution with higher clarity. I opt for the higher resolution with lower quality. It effectively makes your text, windows, buttons, icons, smaller.


So resolution has something to do with larger text then? ;-)

Anyway, on iMac Apple doubled the resolution (quadrupled the number of pixels) when they introduced the "retina" versions so there was no font size change.


I'm using a single 44" Vizio 4K TV as my main monitor, attached to a Dell XPS 15 laptop which acts as a second monitor (it also has a 4K screen).

I picked 44" as the optimal size for me by standing 3 feet in front of all the 4K TVs at my local warehouse store and seeing what sized TV let me see the entire screen without swiveling my head - 48" was just a little too big.
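
For anyone curious, the geometry behind that test is easy to sketch. This is just an illustration, assuming a flat 16:9 panel viewed head-on from 3 feet (the helper name and sizes are my own examples):

    import math
    # Horizontal field of view of a flat 16:9 panel viewed head-on.
    def horizontal_fov_deg(diagonal_in, distance_in):
        width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from the diagonal
        return math.degrees(2 * math.atan(width_in / 2 / distance_in))
    for size in (44, 48):
        print(f'{size}": about {horizontal_fov_deg(size, 36):.0f} deg of horizontal FOV at 3 ft')

At 3 feet, the jump from 44" to 48" takes you from roughly 56 to 60 degrees of horizontal field of view, which lines up with the "had to swivel my head" feeling.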

My 4K screen works really well in Windows 10, other than a couple of old apps that don't do font scaling for their title bars correctly. I have the TV scaling at 125% and the laptop scaling at 250%. I'm not sure what an 8K screen would do for me as a programmer.


>I'm not sure what an 8K screen would do for me as a programmer.

Maybe you wouldn't be stuck with monochrome bitmap fonts to get good clarity. Anti-aliased text on 4K is still quite uncomfortable in comparison.


How is the response time? I thought using my 40" 4k Samsung TV had too much "mouse lag" for desktop use.


I don't notice any lag at all.

One thing I also did before I bought the exact model was do a little research -- the TV I bought has 1 HDMI port (out of the 5 available on it) that runs at 60Hz instead of 30Hz like the other 4 ports. I use that port for my laptop. I also read online that that port also skips the "Ultra HD engine" in the TV but I can't find that in the tech specs.

Btw, while I was looking for the tech specs I realized I have a 43" TV not a 44" (https://www.vizio.com/m43c1.html)


That might be due to the HDMI version: if I remember correctly, HDMI 2.0 supports 4K at 60Hz, but HDMI 1.4 can only do 30Hz. If your TV has DisplayPort, use that instead and you should be good.

I have two Samsung 4K monitors and they are unbearable to use with HDMI at 4K; I had to get a new video card to drive them both over DisplayPort.
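
As a back-of-the-envelope check (this ignores blanking intervals and encoding overhead, so real link requirements are somewhat higher), the raw pixel data alone shows why 4K60 needs HDMI 2.0 or DisplayPort 1.2+:

    # Raw pixel data rate only; blanking and encoding overhead add more on the wire.
    def pixel_gbps(width, height, fps, bits_per_pixel=24):
        return width * height * fps * bits_per_pixel / 1e9
    print(f"4K @ 30 Hz ~ {pixel_gbps(3840, 2160, 30):.1f} Gbit/s")
    print(f"4K @ 60 Hz ~ {pixel_gbps(3840, 2160, 60):.1f} Gbit/s")

HDMI 1.4 links run at roughly 10 Gbit/s total and HDMI 2.0 at roughly 18 Gbit/s, so 4K60 only fits on the newer link.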


Forgive my ignorance since I've been on Mac for so long, but does windows still not do UI scaling?


It does scaling well now and also handles multiple monitors much better than macOS. I love that it doesn't matter which DisplayPort I use; Windows remembers the monitors and doesn't force you to swap cables until your monitors' positions match their configuration in the OS.


It does scaling well for apps which have been updated to fully support the UWP scaling APIs. Everything else is quite disappointing and wildly inconsistent. In my business, apps which use the UWP account for maybe 15% of usage time? I've encountered many situations where one cannot even achieve clear text, at any size, for a given app without overriding the default scaling system and using the "not recommended" custom scaling factor (i.e. the legacy controls).

Multiple monitor support on Windows 10 I've also found to be very dependent on your graphics drivers and hardware configuration, as it always has been.


Windows does UI scaling, but you can't say the same for its application ecosystem. Any app with a custom UI needs to explicitly support scaling.


Windows has HiDPI support pretty well nailed down AFAIK. Laptops like the Dell XPS have been shipping with HiDPI screens for a while.

Linux support is spottier, especially when using multiple displays of varying DPIs, but even Gnome works pretty well on a HiDPI laptop these days.


> Linux support is spottier, especially when using multiple displays of varying DPIs, but even Gnome works pretty well on a HiDPI laptop these days.

Actually, Gnome is the only one that doesn’t do proper HiDPI support. It only supports 96 and 192dpi - not any other ratio.

In comparison, Qt supports different ratios for every screen, all ratios specified as float – you can even scale up, down, whatever.

The environment variable used is

    QT_SCREEN_SCALE_FACTORS=DisplayPort-2=1.75;HDMI-A-0=1.08;
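
Since those factors are arbitrary floats, one way to fill the variable in is to derive each factor from the panel's physical size. A minimal sketch (the screen names and dimensions below are made-up examples, and this would have to run before the Qt app starts):

    import os
    # Hypothetical screens: (horizontal pixels, physical width in mm).
    screens = {"DisplayPort-2": (3840, 597), "HDMI-A-0": (1920, 531)}
    def scale_factor(px, mm, base_dpi=96.0):
        return round(px / (mm / 25.4) / base_dpi, 2)   # actual DPI relative to 96
    os.environ["QT_SCREEN_SCALE_FACTORS"] = ";".join(
        f"{name}={scale_factor(px, mm)}" for name, (px, mm) in screens.items()) + ";"
    print(os.environ["QT_SCREEN_SCALE_FACTORS"])   # e.g. DisplayPort-2=1.7;HDMI-A-0=0.96;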


I'm using Ubuntu 16.04 on a Dell XPS 13 (3200x1800) and it works well as long as I use 2.0 as the magnification value. Other values such as 1.83 are broken.


I'm using XFCE (Debian unstable) on a Zenbook (3200x1800) and that works really well too. There isn't a magnification value to set - rather you set the display DPI.


Windows 7 is still pretty bad. Maybe windows 10 is better


Forgive me, but Windows 7's last large update (Service Pack 1) was "February 22, 2011; 6 years ago" according to Wikipedia. As this was before high-resolution monitors really took off, you can see why Windows 7 doesn't have great support.

I can't say anything about windows 10's support (The highest I have is 1080p)


Windows 7 is what my company and many others still use. I only described what I have seen.


It sort of is, in that it sort of works... but scaling on Windows 10 still introduces a bunch of UX issues. Especially if you pair a Retina laptop with a 1080p external and try and use both side by side.


DPI scaling is pretty good; the only time it's an issue is when people use older versions of Qt or Windows Forms (?), which shouldn't be an issue for many people.


Sort of, but a lot of stuff doesn't scale. Java apps, things like WebEx...


Actually, Java Swing and JavaFX do scale. JavaFX does not scale on Linux, though, because it uses GTK to determine the scale ratio, and GTK is still broken.

Qt’s had QT_SCREEN_SCALE_FACTORS=DisplayPort-2=1.75;HDMI-A-0=1.08; for years, but GTK still only supports one global scale factor, and that’s limited to 96dpi or 192dpi.

So JavaFX only scales to these, while Java Swing scales perfectly (but still only handles one global scale factor)


Per HN's own doctorpangloss 1 week ago: The Dell P2715Q is the best deal on a high DPI display https://news.ycombinator.com/item?id=13901752#13902158

HN's skwirl, right as I posted this: (27", 163ppi) https://news.ycombinator.com/item?id=13949827

I don't think anyone is beating https://amzn.com/dp/B00PC9HFO8 at $540.

If you found this info useful, this is a referral link you could choose of your own free will (no pressure from me!): http://amzn.to/2nvx9XP


8k but no Rec.2020 gamut support... that's actually incredibly disappointing.


holy shit...look at the size of the taskbar


Now if only Microsoft could fix that Remote Desktop resolution scaling problem, things would be so much better. Remote Desktop to a hi-res machine shrinks everything to an unviewable level.


Does it have a matte finish?


Can't wait for AR/VR to make monitors obsolete.


FTA: "It’s worth noting that Raja Koduri, SVP of AMD’s Radeon Technology Group, has stated that VR needs 16K per-eye at 144 Hz to emulate the human experience, so we're still a way off in the display technology reaching consumer price points at least."

Dual 16k @ 144Hz

Granted, phones are ahead of desktop displays in pixel density, and that seems more applicable to VR displays.
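
To get a rough sense of scale (assuming "16K per eye" means a 15360x8640 panel per eye), the raw pixel throughput works out to:

    # Pixel throughput of the quoted VR target vs. this 8K monitor.
    per_eye = 15360 * 8640            # assumed meaning of "16K per eye"
    vr = 2 * per_eye * 144            # both eyes at 144 Hz
    dell_8k = 7680 * 4320 * 60        # the UP3218K at 60 Hz
    print(f"VR target: {vr / 1e9:.0f} Gpixel/s, 8K@60: {dell_8k / 1e9:.0f} Gpixel/s, ratio ~{vr / dell_8k:.0f}x")

That's roughly 19x the pixel throughput of the monitor in the article, before even considering the optics.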


I still can't imagine that they would replace displays at work for anyone who regularly interacts with a team. VR is not good for collaboration; it works only as an extension, for when you need to shut out the world around you. But it'll be hard to tell your company that you suddenly need 8K displays and VR.


Project your teammates into a shared VR world. Just think, finally you can gesture wildly at the whiteboard without whacking people in the head.


> whacking people in the head

Which, incidentally, you could do now that you are in a VR, whenever you feel like.


Those displays will need huge improvements in effective pixel density (after optics) as well.


4k is 4096p, I would expect 8k to be 8192p, but it's only 7680p.


Long story short, it depends..[0]

In most cases, I'd guess that most people mean 3840×2160.

[0]: https://en.wikipedia.org/wiki/4K_resolution


4K is 2160p. Digital theater 4K is 4096x2160 but UHDTV 4K is 3840x2160.

8K is 4320p. UHD 8K is 7680×4320.


4k refers to approximately four thousand horizontal pixels. 1080p refers to 1080 vertical pixels. So how many 'k' and how many 'p' a display is/has aren't comparable figures.
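
A tiny illustration of the two conventions (using the UHD variants, just for concreteness):

    # 'k' loosely counts thousands of horizontal pixels; 'p' counts vertical lines.
    for name, (w, h) in {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}.items():
        print(f"{name}: ~{round(w / 1000)}k across, {h}p tall")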


Thanks, this was my confusion.


The 'p' stands for progressive scan, not the dimension being measured (neither does 'k').


I know that, and 'k' just means thousands, but in context they still indicate which dimension is being referred to. 'p' is never used to refer to horizontal dimension metrics, and I've never seen 'k' on its own with a number used to indicate vertical resolution. Not without that being explicitly stated, anyway.


DCI 4k is 4096x2160, UHD 4k is 3840x2160. If you buy EIZO/LG for video/photo editing, you get DCI, the rest is UHD.


4K UHD is 3840 × 2160. 8K UHD is 7680 × 4320.


meanwhile, 60hz...



