Actually, the coefficients in JPEG are 12-bit, before quantization. In principle you can make pretty accurate 10-bit HDR JPEG files, and with an accurate JPEG decoder that would work well enough.
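To see where the 12 bits come from (a minimal numpy/scipy sketch, not how any real decoder computes its DCT): an orthonormal 8x8 DCT of level-shifted 8-bit samples in [-128, 127] has a worst-case DC magnitude of 8 * 128 = 1024, i.e. 11 magnitude bits plus a sign bit.

    import numpy as np
    from scipy.fft import dctn

    # Level-shifted 8-bit samples lie in [-128, 127]; a uniform block at
    # one extreme maximizes the DC coefficient of the 8x8 DCT.
    block = np.full((8, 8), -128.0)
    coeffs = dctn(block, norm='ortho')
    print(coeffs[0, 0])  # -1024.0 -> 11 magnitude bits + sign ~ 12 bits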

The most common JPEG decoders, though (in particular libjpeg-turbo), use a cheap but not very precise iDCT that outputs 8-bit YCbCr, which then gets chroma-upsampled if needed and converted to 8-bit RGB. Because red and blue are reconstructed from chroma with gains greater than 1, that caps the effective precision in reds and blues at only about 7 bits. But in principle you could get about 10 bits of effective RGB precision; it just requires a sufficiently precise JPEG decoder.
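A quick back-of-the-envelope check of the 7-bit figure (a sketch assuming the JFIF/BT.601 full-range conversion B = Y + 1.772 * (Cb - 128) that libjpeg uses; the variable names here are just illustrative): with Y held fixed, each 8-bit Cb step moves B by about 1.77, so only ~145 of the 256 possible B values are reachable, roughly 7.2 effective bits. Red behaves similarly via R = Y + 1.402 * (Cr - 128).

    import numpy as np

    # B = Y + 1.772 * (Cb - 128): each 8-bit chroma step moves B by ~1.77,
    # so for a fixed Y most 8-bit B values are unreachable.
    Y = 128
    cb = np.arange(256)
    B = np.clip(np.round(Y + 1.772 * (cb - 128)), 0, 255).astype(int)
    print(len(np.unique(B)))           # ~145 distinct blue values
    print(np.log2(len(np.unique(B))))  # ~7.2 effective bits

(Varying Y can fill in intermediate B values, but only at the cost of shifting R and G along with it.)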



