> Many laptop screens are in fact 6-bit panels that use dithering to fake an 8-bit output. This includes even high-priced workstation replacements, like the HP ZBook Fury 15 G7 and its 6-bit LCD panel, which I'm sitting in front of right now.
This made me double-check the date of the article (it's from 2023.)
The author's laptop appears to have launched in 2020. I'm astounded any manufacturer would think this is even remotely acceptable in this day and age, much less on such a high-end device. (Going off the current generation, these laptops start around $3k.)
Is this actually that common or did the author just get unlucky?
It is surprisingly common even on desktop monitors, though not high-end ones.
The panel is something that often gets changed over time in a given laptop model without any visible change in the model number etc., and there have been cases of early versions (including review models, of course) having better panels and later revisions having 6-bit ones. I'd be really irritated if it happened in a high-end model, but I wouldn't be surprised to discover there are examples where it has.
They are very careful not to state the parts of the panel spec that they don't absolutely have to; that way they can legally (though obviously not morally) change the unstated parts without any issue. With early review units there are other extra excuses available to them.
It is rare for a screen to get such a significant downgrade, though it has happened. More common are downgrades in RAM and drive performance. Drive makers themselves play fast and loose with this sort of trick, sometimes changing both controllers and memory to the detriment of performance, multiple times, without any outward change in product codes.
The "panel lottery" is still a thing, especially from the major brands like HP, Lenovo, and Dell. They don't say what you are going to get, they change panel suppliers without changing the model number, and in general they don't give a damn about how the thing looks. I guess it's why people buy Apple.
I don't know about Lenovo and Dell, but for HP, they do say what you're getting.
I'm not familiar with the ZBook line, but for the EliteBook and ProBook (which we have at work - it's how I know) they do give some specs. Usually they quote NTSC coverage and backlight intensity. Now, they never say it's great, mind.
I have a top-of-the-line EliteBook screen and while it's very bright, colors aren't great and viewing angles are a bad joke. But that's likely related to the "privacy screen" feature.
One of its cousins has a lower-end screen, which is 6-bit. Its angles are fine, but it's basically unable to display real reds.
The first one was quoted as something like 72% NTSC. I forget what the other one was, but certainly below 50%.
And, indeed, these laptops are not cheap. My 14" EliteBook cost a few hundred euros less than a 14" M1 MBP with the same amount of RAM and SSD (32 / 512). Now, I haven't seen the new MBP screens, but my 2013 MBP's screen runs circles around these pieces of crap. Also, battery life is comparable to my 2013 MBP, but the SSD is slower.
Apple, at least, used to use multiple suppliers; I remember people saying one was better than the other, somewhere back in the first few generations of laptops with Retina displays.
When generating images, also mix in a little blue noise to avoid banding effects. A simple lookup table goes a long way; I found it worked much better than adding uniformly random noise (white noise) to each pixel.
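For what it's worth, a minimal NumPy sketch of that lookup-table approach (the noise_tile is assumed to be a precomputed blue-noise texture; a Bayer matrix works as a stand-in, just with a more visible pattern, and dither_quantize is only an illustrative name):

    import numpy as np

    def dither_quantize(img, noise_tile, bits=8):
        """Quantize a float image in [0, 1] to `bits` per channel using a tiled threshold texture.

        img:        HxWx3 float array in [0, 1]
        noise_tile: NxN float array in [0, 1), e.g. a precomputed blue-noise tile
        """
        h, w, _ = img.shape
        n = noise_tile.shape[0]
        # Tile the noise across the image, one threshold per pixel (shared by R, G, B).
        thresh = np.tile(noise_tile, (h // n + 1, w // n + 1))[:h, :w, None]
        levels = (1 << bits) - 1
        # Adding the threshold before truncation rounds each pixel up or down
        # with probability proportional to its fractional part.
        return np.floor(img * levels + thresh).clip(0, levels).astype(np.uint8)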
Apple UI blur & vibrancy[1] look smooth without having to introduce noise. They have the advantage of owning the entire pipeline with phones & laptops, but the effect is decent even on budget external displays.
The LCD itself should ideally be a nice HDR LCD, but if it isn't, it should apply time-dependent error diffusion dithering. I.e., if you took a photograph, you would see a 6-bit (or whatever) error-diffused image, but that dither pattern changes at 60 fps. That should happen entirely within the firmware of the screen, and is actually easy to do (some types of dithering require no RAM, others require one line of pixels as a buffer).
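As a rough toy model of the "one line of pixels as a buffer" case, here is a Python sketch where the scan direction alternates with the frame counter so the pattern changes every refresh (real panel firmware would do this in fixed-point on the pixel stream; the function name and the 50/50 error split are just for illustration):

    import numpy as np

    def diffuse_frame(img, frame, bits=6):
        """Toy in-panel error diffusion with a one-scanline error buffer.

        img:   HxW float array in [0, 1] (single channel for brevity)
        frame: frame counter, used to flip the scan direction each refresh
        Returns quantized levels in [0, 2**bits - 1].
        """
        h, w = img.shape
        levels = (1 << bits) - 1
        out = np.zeros((h, w), dtype=np.uint8)
        below = np.zeros(w)                      # error pushed to the next scanline
        for y in range(h):
            row_err, below = below, np.zeros(w)  # errors received from the row above
            carry = 0.0                          # error pushed to the next pixel on this line
            xs = range(w) if (y + frame) % 2 == 0 else range(w - 1, -1, -1)
            for x in xs:
                v = img[y, x] + carry + row_err[x]
                q = min(max(round(v * levels), 0), levels)
                out[y, x] = q
                err = v - q / levels
                carry = err * 0.5                # half the error goes to the next pixel
                below[x] += err * 0.5            # half goes straight down
        return out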
History shows again and again how nature points out the folly of lying to the OS.
Maybe you could have an option in the LCD controller to do dithering, but it's better to be truthful and let the OS control the dithering. Maybe it's hardware-accelerated in the GPU or the LCD controller, but sometimes it's probably better to turn it off, especially for people sensitive to flickering.
My recommendation is to do the equivalent of FRC in your shaders. I use one of the dither functions from https://developer.oculus.com/blog/tech-note-shader-snippets-... to break up banding in my geometry rasterization shaders that I use for rendering all my UI elements. It looks good in static images and fantastic in motion since the dithering pattern shifts every frame.
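The link is to GPU shader snippets; written out CPU-side in Python, the idea looks something like this (interleaved gradient noise is one common choice of hash, not necessarily the exact function from that post, and the names and the per-frame offset constant here are just illustrative):

    import numpy as np

    def ign(x, y):
        """Interleaved gradient noise: a cheap, shader-friendly dither value in [0, 1)."""
        return np.modf(52.9829189 * np.modf(0.06711056 * x + 0.00583715 * y)[0])[0]

    def frc_dither(color, x, y, frame, bits=8):
        """Add a screen-space dither offset before quantizing shader output to `bits` per channel.

        color: float RGB in [0, 1]. Offsetting the pixel coordinates by the frame
        counter shifts the pattern each frame, which is what makes it melt away in motion.
        """
        levels = (1 << bits) - 1
        shift = (frame % 64) * 5.588238          # arbitrary per-frame shift of the pattern
        noise = ign(x + shift, y + shift)
        return np.floor(np.asarray(color) * levels + noise) / levels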
One thing to watch out for is that if you're alpha blending, you need to be fairly cautious about how you dither and multiply. Premultiplied alpha does not get along with dithering, so the output from your shaders needs to be dithered but NOT premultiplied, with premultiplication performed by the GPU's ROPs (they typically seem to operate above 8 bits of precision). If you don't do this you will get really nasty banding on alpha fades even though you dither. Also, counter-intuitively, you may not want to dither the alpha channel (test it yourself, YMMV).
When dealing with sRGB and linear space it can also be important whether you dither before or after the color-space conversion.
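A sketch of that ordering, assuming you render in linear light and output 8-bit sRGB: apply the dither against the 8-bit sRGB codes, i.e. after the transfer function (function names here are illustrative):

    import numpy as np

    def linear_to_srgb(x):
        """Standard sRGB transfer function for linear values in [0, 1]."""
        return np.where(x <= 0.0031308, 12.92 * x, 1.055 * np.power(x, 1 / 2.4) - 0.055)

    def encode_8bit(linear_rgb, noise):
        """Convert linear -> sRGB first, then dither and quantize to 8 bits.

        Dithering in linear space and converting afterwards would misplace the
        thresholds, since the 8-bit steps are uniform in sRGB, not in linear light.
        """
        srgb = linear_to_srgb(np.asarray(linear_rgb))
        return np.floor(srgb * 255.0 + noise).clip(0, 255).astype(np.uint8)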
Noise and dithering are different things, though; I'm referring to dithering. It's correct that for videos it can be distracting to have the dithering change every frame, particularly if your framerate is low. I'm doing game dev, so my framerates are >= 60, at which point FRC dither is near-invisible.
Are there any debanding methods that add a point to the color channels?
E.g. A delta of +(1, 1, 1) in RGB space would have six intermediary (but not perceptually evenly spaced) values, e.g. (1,0,0), (0,0,1), (0,1,0), (1,0,1), (0,1,1), and (1,1,0).
This might be something dithering already does (and I just don't understand it).
It’s more or less linear—what you can do is dither in each of the R,G,B channels separately. To remove banding in an image, you dither in R, dither in G, dither in B.
If you take a gray value that is 50% between two gray values, and you use uncorrelated noise to dither, then the result will contain all (±0.5, ±0.5, ±0.5) deltas from the original gray. If you use ordered (correlated) dither, you'll get a subset of those 8 colors.
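A quick numeric check of that with uncorrelated per-channel white noise on a gray exactly halfway between two 8-bit levels (values chosen purely for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    gray = np.full((64, 64, 3), 100.5)        # halfway between levels 100 and 101
    noise = rng.random(gray.shape)            # independent noise per channel
    out = np.floor(gray + noise).astype(np.uint8)
    # Each channel rounds up or down independently, so all 8 combinations of
    # (100, 101) per channel appear, including the six "mixed" intermediates.
    print(np.unique(out.reshape(-1, 3), axis=0))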