
Is it just my eyes, or does the gradient dithered in sRGB look more accurate than the one dithered in linear space?



I think both gradients are "wrong" in that they themselves interpolate without correcting for sRGB gamma. I think in the first example the original and the dither are wrong in the same way, while in the second the dither is more right than the gradient is.
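A minimal sketch of the difference being described, assuming the standard sRGB transfer function (the function names are mine, not from the article): lerping the midpoint of a black-to-white ramp in sRGB code values versus lerping in linear light and then encoding for display.

    def srgb_to_linear(c):
        """Decode one sRGB channel value in [0, 1] to linear light."""
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        """Encode one linear-light channel value in [0, 1] to sRGB."""
        return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # Midpoint of a black-to-white gradient:
    naive_mid = 0.5                    # lerp in sRGB code values: 50% grey code
    correct_mid = linear_to_srgb(0.5)  # lerp in linear light, then encode for display
    print(round(naive_mid * 255), round(correct_mid * 255))  # 128 188

If a gradient is built by interpolating code values directly, its midpoint sits at roughly 128 instead of the ~188 that half the light of white actually encodes to, which is the kind of "wrong in the same way" being pointed at here.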

Basically I'm afraid the author of this post is a bit of a "careless eager student" archetype who, while generously pointing out gotchas that might be second nature to an expert, also introduces unintentional errors that add some confusion right back.

I'm no expert in color, but with anything that has so many layers of abstraction (physical, astronomical, psychological, and various models that approximate each), it helps to work symbolically as long as you possibly can so the work can be audited. Trying to correct from the current "baked" state is numerical hell.


Yes, this is also my impression.

But see also:

https://news.ycombinator.com/item?id=25644176


If it does, then it probably means you've got some funky substandard or non-standard LCD panel or gamma setting or color correction on whatever you're viewing it on.

Which isn't terribly unusual. But if your screen is well calibrated, then no -- the sRGB gradient should by definition be identical. That's literally the specification.

(And it is on my MacBook, as Apple screens tend to be among the most accurate of general consumer screens.)


The sRGB version is evenly balanced in code values yet "gamma free". The linear RGB version appears imbalanced in code values because of gamma correction, but cross your eyes and blur your vision and you'll see the linear RGB one is actually more gamma correct! (Better contrast and luminosity.)
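A rough numerical version of that squint test, again just a sketch assuming the standard sRGB transfer function rather than anything from the article: the eye integrates light, so an equal mix of black and white pixels averages to linear 0.5, which matches an sRGB code value of about 188 rather than 128.

    def srgb_to_linear(c):
        # Decode one sRGB channel value in [0, 1] to linear light.
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        # Encode one linear-light channel value in [0, 1] to sRGB.
        return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # Light the eye integrates from an equal mix of black and white pixels:
    perceived_linear = (srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2   # 0.5 in linear light
    print(round(linear_to_srgb(perceived_linear) * 255))  # ~188: code value the blurred dither matches
    print(round(0.5 * 255))                               # 128: code value of a naive 50% sRGB grey

That gap between ~188 and 128 is why the dither done in linear light blurs to the "correct" brightness even though its code values look lopsided.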



