Fast Lossless Color Image Compression (kdepepo.wordpress.com)
131 points by dochtman on Jan 30, 2012 | 21 comments



Beating PNG is not actually hard; zlib is not fast at compression, nor is it very good at compressing anything besides graphics. It doesn't even do RGB decorrelation, which can be a really big win (e.g. YCgCo).
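
For reference, one common way to do that decorrelation losslessly is the reversible YCoCg-R lifting transform; a minimal sketch (helper names are mine, not from any particular codec):

    // Reversible YCoCg-R transform in lifting form; Co and Cg need one extra
    // bit of range compared to the RGB inputs.
    inline void rgbToYCoCgR(int r, int g, int b, int &y, int &co, int &cg)
    {
        co = r - b;
        int t = b + (co >> 1);
        cg = g - t;
        y  = t + (cg >> 1);
    }

    // Exact inverse: recovers the original RGB triple bit-for-bit.
    inline void yCoCgRToRgb(int y, int co, int cg, int &r, int &g, int &b)
    {
        int t = y - (cg >> 1);
        g = cg + t;
        b = t - (co >> 1);
        r = b + co;
    }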

Ironically, on non-graphics material, most lossless video encoders beat PNG, too. FFV1 is particularly good at grainy images, and is probably quite difficult to beat without either a much fancier predictor (e.g. LPC instead of median) or much more costly entropy coding (context mixing).
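
The median predictor mentioned above is the MED/LOCO-I predictor from JPEG-LS, which FFV1 also uses; a sketch (the function name is mine):

    #include <algorithm>

    // MED ("median edge detector") predictor from LOCO-I/JPEG-LS; FFV1's
    // predictor is the same. a = left, b = above, c = above-left.
    inline int medPredict(int a, int b, int c)
    {
        if (c >= std::max(a, b)) return std::min(a, b);  // edge above or to the left
        if (c <= std::min(a, b)) return std::max(a, b);
        return a + b - c;                                // smooth area: planar prediction
    }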


I wonder how much you could improve PNG just by giving it some more appropriate filters for truecolor images?
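
For context, PNG's existing filter set (None, Sub, Up, Average, Paeth) chooses one filter per scanline and applies it byte-wise; the fanciest of them, the Paeth predictor from the spec, is this simple (sketch, function name is mine):

    #include <cstdlib>

    // Paeth predictor from the PNG specification: pick whichever of left (a),
    // above (b) or upper-left (c) is closest to the linear estimate a + b - c.
    inline int paethPredict(int a, int b, int c)
    {
        int p  = a + b - c;
        int pa = std::abs(p - a);
        int pb = std::abs(p - b);
        int pc = std::abs(p - c);
        if (pa <= pb && pa <= pc) return a;
        if (pb <= pc) return b;
        return c;
    }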


Impressive. But, I guess someone has to ask the inevitable, sad question: "which patents does this code unwittingly infringe?"


The fact that the question is inevitable is disgusting.


Agreed, especially if this was developed in a closed room.


Do you mean clean room? That protects you from copyright, but not from patents.


It would be great if the author could actually write a description of how IZ's algorithm works. The C++ source is fairly complex and has no comments whatsoever.


A quick perusal of the source indicates that the algorithm hinges on the predictive encoding strategies in this file:

http://gitorious.org/imagezero/imagezero/blobs/master/pixel....

Without comments the specifics are still somewhat mysterious, though.


For fast lossless image compression algorithms that are used by the world's leading visual effects studios, check out OpenEXR: http://www.openexr.com

From the Features page: "The current release of OpenEXR supports several lossless compression methods, some of which can achieve compression ratios of about 2:1 for images with film grain. OpenEXR is extensible, so developers can easily add new compression methods (lossless or lossy)."
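
A minimal sketch of picking one of those lossless compressors through OpenEXR's C++ RGBA interface (the function name and arguments are placeholders of mine, loosely following the library's documented usage):

    #include <ImfRgbaFile.h>
    #include <ImfHeader.h>
    #include <ImfArray.h>

    // Write a width x height RGBA image with PIZ, one of OpenEXR's lossless
    // compressors. "pixels" is assumed to be filled elsewhere; samples are
    // stored as half floats by the Rgba interface.
    void writeExr(const char *fileName, const Imf::Array2D<Imf::Rgba> &pixels,
                  int width, int height)
    {
        Imf::Header header(width, height);
        header.compression() = Imf::PIZ_COMPRESSION;  // or ZIP_COMPRESSION, RLE_COMPRESSION, ...

        Imf::RgbaOutputFile file(fileName, header, Imf::WRITE_RGBA);
        file.setFrameBuffer(&pixels[0][0], 1, width); // x stride: 1 pixel, y stride: one row
        file.writePixels(height);
    }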


Not everyone wants 16-bit floating point samples.


An OpenEXR file can contain 8-bit channels.


> The best known algorithms, however, are very slow, and sometimes impractical for real-world applications.

Is this in reference to paq [1]? Are there any other really good compressors that are too slow?

[1] http://en.wikipedia.org/wiki/PAQ


For any algorithm that’s heuristically searching a very large space (in this case, the space of model parameters to compress a given string), you can expect it to do better given more time. So it would actually be surprising if the best compressors, in terms of compression ratio, were fast.

In practice, the competitions that people like to use to measure which compressor is “best” have resource limits, and compressors from the PAQ family are at or near the top of most rankings.

http://mattmahoney.net/dc/text.html has some nice charts of the Pareto frontier for at least one competition. This lets you see the most efficient compressor for a given compression ratio and vice versa. The log scale gives a sense of the diminishing returns: it’s really easy to compress most structured data to half its original size, but getting from, say, 0.214 to 0.213 can be a hell of a lot of work.


Right now the CPU usage of a project I'm working on is dominated by PNG compression. An alternative with significantly faster compression times would be very useful. I'll be watching this closely, and hope that it becomes possible to decode IZ images in the browser (or implement it myself).


Just as a side note, someone wrote a JavaScript-backed JPEG 2000 decompressor: https://github.com/mozilla/pdf.js/pull/1068/files


PNG compressors generally try a couple different strategies, which can get very expensive. You might be able to ask the compression library to be a little more hands-off, if you don’t need great compression.
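
If the project uses libpng directly, those knobs look roughly like this (a sketch; the function name is mine and png_ptr is assumed to be an already-created write struct):

    #include <png.h>
    #include <zlib.h>

    // Trade compression ratio for speed on an existing libpng write struct.
    // png_ptr is assumed to come from png_create_write_struct() elsewhere.
    void makePngWritesFaster(png_structp png_ptr)
    {
        // Fastest zlib level; level 0 would emit stored (uncompressed) blocks.
        png_set_compression_level(png_ptr, Z_BEST_SPEED);

        // Don't let libpng try all five row filters per scanline; pick one
        // cheap delta filter up front.
        png_set_filter(png_ptr, PNG_FILTER_TYPE_BASE, PNG_FILTER_SUB);

        // Z_RLE or Z_FILTERED can also speed up zlib on filtered image rows.
        png_set_compression_strategy(png_ptr, Z_RLE);
    }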


Thanks for the suggestion. Unfortunately I'm already specifying the pixel delta strategy that works best with my images, and using the fastest zlib compression setting. Going with no zlib compression would turn my CPU problem into a bandwidth problem :).


Ouch. Well, it looks like you get to be an early adopter of IZ!


There is a Windows binary in this thread (obviously try it with caution):

http://encode.ru/threads/1471-iz-New-fast-lossless-RGB-photo...


Sample size of one, and they didn't even publish the image itself.


http://skulpture.maxiom.de/playground/list-iz.txt

They tried it on the same benchmark used by this list:

http://www.imagecompression.info/gralic/LPCB.html

Of course it's meaningless without decompression times on there, but those would have to be measured on the same machine as the master benchmark list.



