What is Color Banding? And what is it not? (simonschreibt.de)
97 points by deverton on Dec 3, 2015 | 26 comments



Very well written and illustrated article.

This is one of those things that's so difficult to explain to a mediocre to average graphic designer, much less a layperson. Now the trick would be to get them to read the whole thing...


Dumping color table entries from the last frame of the "256" color GIF:

> gifbuild -d colorBanding_255colorsOn1920pixels.gif | tail -n 200 | grep rgb | sort | uniq | wc -l

> 100

What happened here is some sequence of color space conversions with gamma correction added/removed that ended up compressing the darks.

A true 256 color gray gradient will not display so much visible banding. Just throwing together a naive 256 color gradient in Gimp reveals some RGB triplets with mismatched values suggesting the encoder is doing some odd color space conversion. Even then, it has much less perceptible banding when scaled with nearest neighbor to 1920.

I suspect most of the examples stem more from improper gamma handling than from any true demonstration of the limitations of 8-bit channels.
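
For comparison, here's a quick way to build a clean 256-step ramp and count its unique values (a sketch assuming ImageMagick is installed; the exact count may vary slightly with the encoder):

  # Generate a 256-step grayscale ramp, upscale it with nearest neighbor,
  # and count the unique colors in the result.
  convert -size 256x1 -depth 8 gradient:black-white -sample "1920x100!" ramp.png
  identify -format "%k\n" ramp.png   # should print roughly 256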


Every image on the page is broken for me (Chrome, if it matters). I was hoping they would help me understand what I was reading. The console is full of errors where each image tried to load.


Do you use Privacy Badger or another blocker?

The markup loads images from seemingly local URLs:

  http://data.simonschreibt.de/gat055/intro_comic.png
but redirects to images stored at SSL-ACCOUNT.COM:

  https://ssl-account.com/data.simonschreibt.de/gat055/intro_comic.png
I reset Privacy Badger, and the assets loaded fine.


It seems that my site has problems when it's visited, for example, from behind a firewall that doesn't like my SSL proxy. I've heard that before from people visiting the blog from work, but at home everything was fine...

Are you behind a firewall?


When I tried it in IE I got loads of popups about SSL/certificate stuff that I just don't get. So it probably is work messing it up. I will hopefully get a chance when I get home to revisit the page.

To the others, I did disable ublock and ghostery but that didn't do anything for me.


The page uses WebP and WebM. I don't know why your Chrome doesn't support those. Someone over on Reddit mentioned the same thing for an old version of Chromium.


The greyscale gradient animations that are meant to show banding are full of gif artifacts so they are not actually smooth gradients.


Is this an example of Poe's Law? GIF's palette size is 256 and there are 256 shades of gray, so no GIF artifacts should be present. GIF compression doesn't introduce artifacts anyway; those would come from the dithering/palettization step before compression.


Are you viewing on mobile? Some mobile providers run images through an "optimizing cache", and they use weird bad settings.


I was really surprised the first time I saw banding in a 24-bit gradient image; I spent many years thinking it was impossible. But I'm still not sure whether that's an inherent limitation of 24 bits, or whether my display isn't capable of showing all 24 bits with full fidelity. LCD manufacturers use tricks to squeeze the last couple of bits out of panels that aren't up to the task.
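
One way to convince yourself it can be inherent (a sketch assuming ImageMagick; file names are arbitrary): a ramp that spans only part of the 8-bit range has far fewer than 256 steps to spread across the screen, so the bands become wide enough to see even on a perfect display.

  # A 1920-pixel ramp from #666666 to #999999 contains only ~52 distinct 8-bit
  # values, so each band is ~37 pixels wide, which is visible on any display.
  convert -size 1920x200 "gradient:#666666-#999999" narrow_ramp.png
  identify -format "%k\n" narrow_ramp.png   # roughly 52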


Dithering is seen so often in 90s 256-color images, so why is it so rarely used to hide the ugly banding of 24-bit RGB images?


Well, like the video explains, I think a lot of people don't recognize the problem, because it's less obvious with 24-bit color. And, if they do recognize the problem, they might think it can't be fixed, or they might not think subtle bands are worth fixing.

I've used the following in the past with a lot of success, for one-click fixing Photoshop banding.

http://nomorebanding.com


The alternatives to dithering a 256-color image both end up with undesired results. You can either create a palette specific to your image or application (which makes the rest of a multitasking system look like a tie-dye shirt), or accept really, really ugly banding, since you have a mere 256 colors to work with (minus reserved colors) instead of 2^24.

24-bit images are 'good enough' and dithering would just add another step to the asset workflow.

I can remember the days with a 1MB video card and Windows 95 - switching between 16-bit color at 800x600, or 8-bit color at 1024x768. Resolution vs dithering...


And there is one alternative: 30-bit (+ alpha) color.

But that's currently only used by designers.


30-bit color means 10 bits per channel, which equals 1024 discrete values. If you do a full-screen gradient from, say, red to black on a high-resolution monitor, there will still be color banding.

It's better than 24 bit color but doesn't solve the problem.
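
Back-of-envelope, assuming a full-width ramp on a 3840-pixel (4K) display:

  # 1024 ten-bit levels spread across 3840 pixels still leaves multi-pixel bands.
  echo "scale=2; 3840 / 1024" | bc   # ~3.75 pixels per quantization step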


Current movies use 36-bit color – that's usually enough for 4K.


I can't say for sure, but one of my guesses would be stability. By that I mean that the noise is acceptable for a single static image, because it doesn't change, but it becomes distracting in an animation if the dithering completely changes from frame to frame.

It might be a non-issue - either not noticeable or not distracting - and the real reason may be that people just don't see banding as a problem, or that the per-frame rendering overhead is too much.
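
For what it's worth, the stability concern has a known workaround: ordered (Bayer) dithering thresholds each pixel against a fixed tile, so identical pixels dither identically and the pattern stays put from frame to frame. A sketch assuming ImageMagick (file names are placeholders):

  convert -size 1920x100 gradient:black-white ramp.png    # smooth source ramp
  convert ramp.png -posterize 16 banded.png                # 16 levels, visible bands
  convert ramp.png -ordered-dither o8x8,16 dithered.png    # 16 levels, stable Bayer dither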


Dithering seems to work for animated gifs.


It’s very complicated there, too, though.

As this article [1] shows, even with animated GIFs you often get unstable dithering, with artifacts somewhere in the image.

[1] http://blog.pkh.me/p/21-high-quality-gif-with-ffmpeg.html
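
A sketch of the two-pass approach described there, assuming an ffmpeg build with the palettegen/paletteuse filters (file names are placeholders):

  # Pass 1: compute an optimized 256-color palette for the whole clip.
  ffmpeg -i clip.mp4 -vf "fps=15,scale=480:-1:flags=lanczos,palettegen" palette.png
  # Pass 2: map the clip to that palette with a stable Bayer dither.
  ffmpeg -i clip.mp4 -i palette.png \
    -lavfi "fps=15,scale=480:-1:flags=lanczos[x];[x][1:v]paletteuse=dither=bayer:bayer_scale=3" out.gif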


The video itself has a lot of color banding, with a posterized or quantized look. Chrome on a 2015 MacBook Pro Retina 13"; the same happened in Safari.

I thought this was intentional and expected the author to mention it at the end of the video, but... nothing.


The video looks bad because YouTube's encoder butchers video content. Add to that the reality that video decoders typically apply filters to a video post-decode to compensate for the artifacts introduced by encoders. Those filters can add banding, and the encoder already quantized the video.

It would look better if the video had been uploaded at 1080p60 or something ridiculous, because that increases the bitrate budget and each macroblock then covers a smaller, less important slice of the image. But that's kind of an awful hack.

You can observe this clearly with YouTube clips of classic games, like those at TASVideos ( https://www.youtube.com/channel/UCFKeJVmqdApqOS8onQWznfA ). The source games are 480i or 240p at best, but the quality at YouTube's 480p bitrate is visibly compromised. If you watch in 1080p it looks much closer to what you'd see in an emulator (or off the hardware when connected via RGB/component).


He mentions YouTube compression at 3:28.


Simply hack pixman, and then most things that request gradients on Linux (including Cairo, and therefore Inkscape) will generate flawlessly smooth color curves.


Bit depth is not a measure of quantity, but of dynamic range. In other words, what matters is not only how many quantization steps you have, but how large of a space you are trying to map them over.

The examples in the article point to this conclusion but don't quite state it explicitly. The relationship between space and quantization effects is demonstrated, for a physical interpretation of space, by the horizontal greyscale bar that stretches and contracts.

But what if you stretch and contract the dynamic range of your monitor itself? Each bit in the encoding space (naively) offers a doubling of dynamic range in the natural space, so even your 30 bit encoding can be stretched if you display it on a monitor that intends to output a contrast ratio many times greater than what we are used to.

For instance, imagine a monitor that could output light perceptually as bright as the afternoon sun, next to effectively infinite blackness. Will 30 bits be enough when 'stretched' across these new posts of dynamic range, or will banding (quantization) still be visually evident when examining a narrow slice of the space?
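
Back-of-envelope, assuming the 1024 levels of a 10-bit channel were spread linearly over a hypothetical 0-10,000 nit display:

  # Each step near black would then be ~10 nits, far coarser than the luminance
  # differences the eye can resolve in shadows, so banding would be obvious
  # without a perceptual (gamma-like) transfer curve to redistribute the steps.
  echo "scale=2; 10000 / 1024" | bc   # ~9.77 nits per linear step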


HDR monitors do exist. BrightSide Technologies was showing them at SIGGRAPH over ten years ago. It looks like consumer OLEDs are starting to get on board [1]. And they really did want 10 bits per channel. But even that seems lightweight.

10 bits per channel will carry us for a while. Apparently Dolby bought BrightSide, and now they are pushing for 12 bits. 16-bit ints will probably be enough for home use in practice. Internally, most games that do HDR rendering use 16-bit floats for their intermediate frame buffers. That format is popular in film production as well. I would be surprised if consumer display tech ever bothered to go float16-over-DVI. But maybe it will get cheap enough eventually that we might as well have the best :)

[1] http://www.avsforum.com/forum/40-oled-technology-flat-panels...



