How to (and how not to) fix color banding (frost.kiwi)
148 points by asicsp 10 months ago | hide | past | favorite | 28 comments



> Many Laptop screens are in fact 6-bit panels performing dithering to fake an 8-bit output. This includes even high-priced workstations replacements, like the HP Zbook Fury 15 G7 and its 6-bit LCD panel, that I sit in front of right now.

This made me double-check the date of the article (it's from 2023.)

The author's laptop appears to have launched in 2020. I'm astounded any manufacturer would think this is even remotely acceptable in this day and age, much less on such a high-end device. (Going off the current generation, these laptops start around $3k.)

Is this actually that common or did the author just get unlucky?


It is surprisingly common even on desktop monitors, though not high-end ones.

The panel is something that often gets changed over time in a given laptop model without any visible change in the model number, etc., and there have been cases of early versions (including review models, of course) having better panels and later revisions having 6-bit ones. I'd be really irritated if it happened in a high-end model, but I wouldn't be surprised to discover there are examples where it has.


> there have been cases of early versions (including review models, of course) having better panels and later revisions having 6-bit ones

How is this not fraud?


They are very careful not to state parts of the panel spec that they don't absolutely have to, then they can legally (though obviously not morally) change the unstated parts without any issue. With early review units there are other extra excuses available to them.

It is rare that a screen will get such a significant downgrade, though it has happened. More common are downgrades in RAM and drive performance. Drive makers themselves play fast and loose with this sort of trick, sometimes changing both controllers and memory to the detriment of performance multiple times without any outward change in product codes.


The "panel lottery" is still a thing, especially from the major brands like HP, Lenovo, and Dell. They don't say what you are going to get, they change panel suppliers without changing the model number, and in general they don't give a damn about how the thing looks. I guess it's why people buy Apple.


I don't know about Lenovo and Dell, but for HP, they do say what you're getting.

I'm not familiar with the ZBook line, but for the EliteBook and ProBook (which we have at work; it's why I know) they give some specs. Usually they quote NTSC coverage and backlight intensity. Now, they never say it's great, mind.

I have a top-of-the-line EliteBook screen and while it's very bright, colors aren't great and viewing angles are a bad joke. But that's likely related to the "privacy screen" feature.

One of its cousins has a lower-end screen, which is 6-bit. Its angles are fine, but it's basically unable to display real reds.

The first one was quoted as something like 72% NTSC. I forget what the other one was, but certainly below 50%.

And, indeed, these laptops are not cheap. My 14" EliteBook cost a few hundred euros less than a 14" M1 MBP with the same amount of RAM and SSD (32 GB / 512 GB). Now I haven't seen the new MBP screens, but my 2013 MBP's screen runs circles around these pieces of crap. Also, battery life is comparable to my 2013 MBP, but the SSD is slower.


Apple, at least, used to use multiple suppliers; I remember people saying one was better than the other, somewhere back in the first few generations of laptops with Retina displays.


Yes, on the first gen of Retina displays, LG panels were prone to image retention, while the Samsung ones were not.


This can also be done temporally not simply spatially. https://www.shadertoy.com/view/tslfz4


Many thanks for posting my article here <3


Another GPU friendly approach is mixing with blue noise. I used it in a display driver with XRGB8888 framebuffer and Y2 pixels, and it worked great.


When generating images also mix in a little blue noise to avoid banding effects. A simple lookup table goes a long way. I found it worked much better than adding uniformly random noise (white noise) to each pixel.
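The lookup-table approach described above can be sketched in a few lines. This is a hedged illustration, not the commenter's actual code: a real implementation would load a precomputed blue-noise texture, so the small Bayer matrix below is only a stand-in with the same tiled-lookup mechanics.

```python
# Sketch: dithering an 8-bit channel down to 6 bits via a tiled
# threshold lookup table. A real blue-noise implementation would
# replace BAYER4 with a precomputed blue-noise tile.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_to_6bit(value8, x, y):
    """Quantize one 8-bit channel value to 6 bits with ordered dithering."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0  # in (0, 1)
    # The 6-bit quantization step is 4 in 8-bit units; nudge the value
    # by sub-step noise before truncating, so averaging over an area
    # recovers the missing precision.
    dithered = value8 + (threshold - 0.5) * 4
    return max(0, min(63, int(dithered // 4)))

# A shallow gradient: without dithering, values 0..15 collapse into
# hard 6-bit bands; with dithering, neighboring levels mix spatially.
row = [dither_to_6bit(v, x=v, y=0) for v in range(16)]
```

The same table can be sampled with a per-frame offset to make the pattern temporal, as discussed elsewhere in this thread.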


Apple UI blur & vibrancy[1] look smooth without having to introduce noise. They have the advantage of owning the entire pipeline with phones & laptops, but the effect is decent even on budget external displays.

1: https://developer.apple.com/design/human-interface-guideline...


This is great... but it's at the wrong level.

The LCD itself should ideally be a nice HDR LCD, but if it isn't, it should apply time-dependent error-diffusion dithering. I.e., if you took a photograph, you would see a 6-bit (or whatever) error-diffused image, but that dither pattern changes at 60fps. That should happen entirely within the firmware of the screen, and is actually easy to do (some types of dithering require no RAM, others require only a one-line pixel buffer).
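For illustration, here's a minimal sketch of the kind of error diffusion that needs almost no memory: a single running error value per scanline, with a hypothetical per-frame phase so the pattern changes between frames. The seed scheme is made up for the example; real panel firmware would do something subtler.

```python
# Sketch: quantize one scanline of 8-bit values to 6 bits, carrying
# the quantization error forward to the next pixel. Only one running
# error value is needed (a full Floyd-Steinberg kernel would need a
# one-line buffer). The frame-dependent starting error is a
# hypothetical way to vary the pattern per frame.

def diffuse_line(line8, frame):
    error = ((frame * 17) % 4) / 4.0 - 0.5   # made-up per-frame phase
    out = []
    for v in line8:
        target = (v + error * 4) / 4.0        # 6-bit step = 4 in 8-bit units
        q = max(0, min(63, round(target)))
        error = target - q                    # carry residual to next pixel
        out.append(q)
    return out

line = [130] * 8        # 130/4 = 32.5: exactly between two 6-bit levels
frame0 = diffuse_line(line, 0)
frame1 = diffuse_line(line, 1)
# Output pixels alternate between levels 32 and 33, and the pattern
# differs between the two frames.
```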


We draw on the screens we have, not the screens we want.


History shows again and again how nature points out the folly of lying to the OS.

Maybe you could have an option in the LCD controller to do dithering, but it's better to be truthful and let the OS control the dithering. Maybe it's hardware accelerated in the GPU or the LCD controller, but sometimes it's probably better to turn it off. Especially for people sensitive to flickering.


Plenty of displays use various temporal and spatial dithering methods.


I guess we only notice the ones that don't do it properly...


I’d be interested in a simulation of this. I feel like this could well be worse, basically a kind of constant flickering, but I’m not sure.


Just dropping this here, because there is more to this issue in real-world scenarios:

https://loopit.dk/banding_in_games.pdf

(Not my talk, but I found it enlightening.)


That's a fantastic and practical overview, thanks


My recommendation is to do the equivalent of FRC in your shaders. I use one of the dither functions from https://developer.oculus.com/blog/tech-note-shader-snippets-... to break up banding in my geometry rasterization shaders that I use for rendering all my UI elements. It looks good in static images and fantastic in motion since the dithering pattern shifts every frame.

One thing to watch out for is that if you're alpha blending, you need to be fairly cautious about how you dither and multiply. Premultiplied alpha does not get along with dithering, so the output from your shaders needs to be dithered but NOT premultiplied, with premultiplication performed by the GPU's ROPs (they typically seem to operate above 8 bits of precision.) If you don't do this you will get really nasty banding on alpha fades even though you dither. Also, counter-intuitively, you may not want to dither the alpha channel (test it yourself, ymmv)

When dealing with sRGB and linear space it can also be important whether you dither before or after the color-space conversion.
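The premultiplied-alpha caveat above can be shown numerically. This is a sketch under simplified assumptions (plain round-to-8-bit quantization, no blending pipeline), just to show why premultiplying in the shader before the 8-bit store destroys the gradient that dithering is trying to preserve.

```python
# Numeric sketch of the premultiplied-alpha caveat. At low alpha,
# premultiplying in the shader and then storing 8-bit output crushes
# the color range, so subtle per-pixel differences (e.g. from dither)
# all round to the same value. Storing the straight-alpha color and
# letting the blend unit premultiply at higher precision keeps them.
def quantize8(x):
    return max(0, min(255, round(x * 255))) / 255.0

alpha = 4 / 255.0                                    # a faint alpha fade
colors = [c / 255.0 for c in (100, 101, 102, 103)]   # a subtle gradient

# Wrong order: premultiply, then quantize the shader output to 8 bits.
premul_first = [quantize8(c * alpha) for c in colors]

# Better order: quantize the straight color; premultiplication happens
# later in the blend hardware at higher internal precision.
straight_first = [quantize8(c) * alpha for c in colors]

# premul_first collapses all four steps into one value (banding),
# while straight_first keeps four distinct values.
```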


Changing the dither pattern every frame subjectively increases noise, so I keep the "change dither pattern every frame" option turned off for videos.

I also turn off colored noise, because even though it lowers luma noise, it increases chroma noise.

I don't want to see noisy colors moving around on videos, esp. anime.

My monitor does FRC itself, so all this software stuff is added noise, which is why I like to turn it off.


Noise and dithering are different, though. I'm referring to dithering. It's correct that for videos it can be distracting to have the dithering change every frame, particularly if your framerate is low. I'm doing game dev so my framerates are >=60, at which point FRC dither is near-invisible.


Are there any debanding methods that add a point to the color channels?

E.g. A delta of +(1, 1, 1) in RGB space would have six intermediary (but not perceptually evenly spaced) values, e.g. (1,0,0), (0,0,1), (0,1,0), (1,0,1), (0,1,1), and (1,1,0).

This might be something dithering already does (and I just don't understand it).


It’s more or less linear—what you can do is dither in each of the R,G,B channels separately. To remove banding in an image, you dither in R, dither in G, dither in B.

If you take a gray value that is 50% between two representable gray levels, and you use uncorrelated noise to dither, then the result will contain all eight (±0.5, ±0.5, ±0.5) deltas from the original gray. If you use ordered (correlated) dither, you’ll get a subset of those 8 colors.
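A tiny simulation of that claim, as a hedged sketch: dither each channel of a gray midway between two levels with uncorrelated random thresholds and collect the colors that come out.

```python
# Sketch: per-channel dithering with uncorrelated noise on a gray
# exactly between two levels produces all 8 corner colors around it.
import random

random.seed(1)

def dither_channel(v):
    # v sits halfway between two integer levels; a uniform random
    # threshold rounds it up or down with equal probability.
    return int(v + random.random())

gray = (10.5, 10.5, 10.5)
seen = {tuple(dither_channel(c) for c in gray) for _ in range(1000)}
# seen contains all 8 combinations of levels 10 and 11, e.g. (11, 10, 10),
# which are exactly the "intermediate" colors asked about above.
```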


Something similar is used in visual psychophysics: https://www.spiedigitallibrary.org/conference-proceedings-of...


This is exactly what I was thinking about! Thank you!



