
Is there any good, systematic discussion of the trade-off between computational complexity and compression? My intuition is that the closer a codec pushes its output toward the entropy of the signal, the more complex the encoder and decoder potentially need to be, while giving up a little compression could, in some cases, let you make do with dumber codecs. Is this idea explored anywhere?
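For a crude feel for what I mean, zlib's compression-level knob trades exactly this within a single codec (conflating "codec complexity" with CPU time, which isn't quite the same thing). A throwaway Python sketch; the sample data and levels are arbitrary:

    import time
    import zlib

    # Repetitive sample data, ~880 KB, chosen only so the effect is visible.
    data = b"the quick brown fox jumps over the lazy dog " * 20000

    for level in (1, 6, 9):
        start = time.perf_counter()
        compressed = zlib.compress(data, level)
        elapsed = time.perf_counter() - start
        ratio = len(data) / len(compressed)
        print(f"level={level}  ratio={ratio:.1f}  time={elapsed * 1000:.1f} ms")

Higher levels spend more CPU searching for matches in exchange for a smaller output, which is the knob I'd like to see treated systematically across codecs rather than within one.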



You may want to read about the Pareto frontier. The Squash Compression Benchmark [1] plots the Pareto frontier (ratio vs. speed) for each dataset it tests; a small sketch of the dominance check follows the link below.

[1] https://quixdb.github.io/squash-benchmark/#optimal-codecs
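For illustration, a minimal Python sketch of how a frontier like that is extracted from a set of (ratio, speed) measurements. The codec names and numbers here are invented, not taken from the benchmark:

    # Hypothetical measurements: (codec name, compression ratio, decode speed in MB/s).
    results = [
        ("codec-a", 2.1, 900.0),
        ("codec-b", 2.8, 400.0),
        ("codec-c", 2.6, 350.0),  # dominated by codec-b: worse on both axes
        ("codec-d", 3.5, 120.0),
    ]

    def pareto_frontier(points):
        """Keep points not dominated by any other (higher is better on both axes)."""
        frontier = []
        for name, ratio, speed in points:
            dominated = any(
                r >= ratio and s >= speed and (r > ratio or s > speed)
                for n, r, s in points
                if n != name
            )
            if not dominated:
                frontier.append((name, ratio, speed))
        return frontier

    for name, ratio, speed in pareto_frontier(results):
        print(f"{name}: ratio={ratio}, decode={speed} MB/s")

Everything on the frontier is a defensible choice; which point you pick is exactly the complexity-vs-compression question being asked.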


Read up on Kolmogorov complexity. Mind-blowing stuff. Not exactly what you’re asking about but I think it’s relevant.



