
Somewhat related: a few years ago I noticed that an image with a Gaussian blur applied to it takes up significantly less space as a JPEG than the original image does. I suppose it works out because the blur smooths detail across each area, so every JPEG block has less to encode internally?
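This is easy to reproduce. A minimal sketch with Pillow, where the synthetic noise image and the blur radius are arbitrary stand-ins for a detailed photo:

```python
# Blur an image before JPEG encoding and compare the encoded sizes.
# Requires Pillow; the noisy test image is an assumption -- any photo
# with fine detail shows the same effect.
import io
import random

from PIL import Image, ImageFilter

random.seed(0)
# 256x256 grayscale noise: maximal high-frequency detail.
img = Image.new("L", (256, 256))
img.putdata([random.randrange(256) for _ in range(256 * 256)])

def jpeg_size(image, quality=75):
    """Encode to JPEG in memory and return the byte count."""
    buf = io.BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    return len(buf.getvalue())

original = jpeg_size(img)
blurred = jpeg_size(img.filter(ImageFilter.GaussianBlur(radius=2)))

print(f"original: {original} bytes, blurred: {blurred} bytes")
assert blurred < original  # the blurred version compresses better
```

Noise is the extreme case, but the same comparison on a real photo shows the same direction, just less dramatically.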



A Gaussian blur is essentially a low-pass filter, so the high-frequency components are removed. The resulting image can be represented with fewer (and smaller) DCT coefficients, which take less space to store.
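You can check this numerically: blur an 8x8 block, take its 2D DCT (the per-block transform JPEG uses), and watch the high-frequency coefficients collapse. Pure NumPy; the [1 2 1] blur kernel and the frequency cutoff are arbitrary demo choices:

```python
import numpy as np

N = 8
# Orthonormal DCT-II matrix, the same basis JPEG uses on 8x8 blocks.
k = np.arange(N)[:, None]
n = np.arange(N)[None, :]
D = np.sqrt(2 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
D[0] /= np.sqrt(2)

def dct2(block):
    return D @ block @ D.T

rng = np.random.default_rng(0)
block = rng.uniform(0, 255, size=(N, N))   # noisy block: rich in detail

# Separable [1 2 1]/4 blur, applied along rows and then columns.
kern = np.array([0.25, 0.5, 0.25])
smooth = lambda a: np.apply_along_axis(
    lambda r: np.convolve(r, kern, mode="same"), 1, a)
blurred = smooth(smooth(block).T).T

def high_freq_energy(coeffs, cutoff=4):
    """Sum of squared DCT coefficients with row+col index >= cutoff."""
    i, j = np.indices(coeffs.shape)
    return float((coeffs[i + j >= cutoff] ** 2).sum())

before = high_freq_energy(dct2(block))
after = high_freq_energy(dct2(blurred))
print(f"high-frequency energy: {before:.1f} -> {after:.1f}")
assert after < before
```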

This page has a few visual examples at the bottom: https://www.cs.unm.edu/~brayer/vision/fourier.html


JPEG's compression is very similar to blurring.

It converts each 8x8 block of the image into the frequency domain with a DCT, then reduces the precision of the frequency coefficients (quantization), with a special cheap encoding for coefficients that round to zero.
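The quantization step can be sketched directly. The table below is the example luminance table from the JPEG spec (Annex K); the smooth gradient block is an assumed stand-in for blurred content:

```python
# Divide each DCT coefficient by a table entry and round; small
# high-frequency values become exactly zero.
import numpy as np

N = 8
# Orthonormal DCT-II matrix (JPEG's per-block transform).
k = np.arange(N)[:, None]
n = np.arange(N)[None, :]
D = np.sqrt(2 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
D[0] /= np.sqrt(2)

# Example luminance quantization table from the JPEG spec, Annex K.
Q = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

# A smooth gradient block (level-shifted by 128 as JPEG does).
x, y = np.meshgrid(np.arange(N), np.arange(N))
block = 50 + 10 * x + 5 * y - 128.0

coeffs = D @ block @ D.T
quantized = np.round(coeffs / Q).astype(int)
zeros = int((quantized == 0).sum())
print(f"{zeros} of 64 quantized coefficients are zero")
```

Those runs of zeros are what JPEG's zigzag run-length and entropy coding then represent almost for free.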

However, a whole-image blur crosses the 8x8 block boundaries, so it isn't perfectly aligned with the blur-like smoothing JPEG applies within each block. Lowering the quality setting in JPEG (or using custom quantisation tables) will be more effective at reducing file size.
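A quick check of that last point with Pillow: just lowering the quality setting shrinks the file monotonically, no pre-blur needed. The noise image is again an assumed stand-in for detailed content:

```python
import io
import random

from PIL import Image

random.seed(0)
img = Image.new("L", (128, 128))
img.putdata([random.randrange(256) for _ in range(128 * 128)])

def jpeg_size(image, quality):
    """Encode to JPEG in memory at the given quality, return byte count."""
    buf = io.BytesIO()
    image.save(buf, format="JPEG", quality=quality)
    return len(buf.getvalue())

sizes = {q: jpeg_size(img, q) for q in (95, 75, 50, 25)}
print(sizes)
assert sizes[25] < sizes[50] < sizes[75] < sizes[95]
```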

There's also a fundamental information-theoretic reason this works: a low-pass-filtered signal genuinely carries less information, since by the sampling theorem a band-limited signal can be fully represented by fewer samples.
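One way to see that concretely: after an ideal low-pass filter, the signal really is determined by fewer samples. A 1-D NumPy sketch (FFT-based filter, cutoff chosen arbitrarily at a quarter of the band):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(64)

# Ideal low-pass: keep only frequencies |k| < 16 of the 64-point spectrum.
X = np.fft.fft(x)
X[16:49] = 0
x_lp = np.fft.ifft(X).real

# Throw away every other sample -- half the data is gone.
y = x_lp[::2]

# Reconstruct all 64 samples exactly from the 32 kept ones:
# the 32-point spectrum of y contains the surviving coefficients (halved).
Y = np.fft.fft(y)
Xr = np.zeros(64, dtype=complex)
Xr[:16] = 2 * Y[:16]      # positive frequencies
Xr[49:] = 2 * Y[17:]      # negative frequencies
x_rec = np.fft.ifft(Xr).real

print("max reconstruction error:", np.abs(x_rec - x_lp).max())
assert np.allclose(x_rec, x_lp)
```

The filtered signal survives 2x decimation unscathed, while the original would not; that lost representational cost is exactly what the encoder no longer has to pay.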


Blurring averages neighbouring pixels, which is roughly what lossy compression does too when it throws away high-frequency detail.



