

Fast Lossless Color Image Compression - dochtman
https://kdepepo.wordpress.com/2012/01/30/fast-lossless-color-image-compression/

======
DarkShikari
Beating PNG is not actually hard; zlib is _not_ fast at compression, nor is it
very good at compressing anything besides graphics. It doesn't even do RGB
decorrelation, which can be a really big win (e.g. YCgCo).
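For the curious, the reversible YCgCo-R variant does that decorrelation with integer lifting steps, so it round-trips exactly and stays usable for lossless coding. A minimal sketch (illustrative only, not IZ's or any particular codec's code):

```python
def rgb_to_ycgco_r(r, g, b):
    """Forward lossless YCgCo-R transform (integer lifting, reversible)."""
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, cg, co

def ycgco_r_to_rgb(y, cg, co):
    """Inverse transform; undoes the lifting steps in reverse order,
    recovering the exact RGB values."""
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b
```

Because each step only adds or subtracts a shifted copy of another channel, the inverse can peel them off exactly; the win comes from Cg/Co being much closer to zero than raw G/B on natural images.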

Ironically, on non-graphics material, most lossless video encoders beat PNG,
too. FFV1 is particularly good at grainy images, and is probably quite
difficult to beat without either a much fancier predictor (e.g. LPC instead of
median) or much more costly entropy coding (context mixing).
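The "median" predictor in this class of codecs is typically the MED (median edge detector) predictor from LOCO-I/JPEG-LS: it chooses between the left and above neighbours near an edge, and falls back to a planar estimate in smooth regions. A sketch:

```python
def med_predict(left, above, above_left):
    """MED predictor from LOCO-I/JPEG-LS: predict the current pixel
    from its left, above, and above-left neighbours."""
    if above_left >= max(left, above):
        return min(left, above)       # likely a horizontal/vertical edge
    if above_left <= min(left, above):
        return max(left, above)       # edge in the other direction
    return left + above - above_left  # smooth region: planar prediction
```

The encoder then stores only the residual (actual minus predicted), which is cheap to entropy-code; an LPC-style predictor spends more computation fitting coefficients to squeeze those residuals further.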

~~~
duskwuff
I wonder how much you could improve PNG just by giving it some more
appropriate filters for truecolor images?

------
sambeau
Impressive. But, I guess someone has to ask the inevitable, sad question:
"which patents does this code unwittingly infringe?"

~~~
joeybaker
The fact that the question is inevitable is disgusting.

~~~
sambeau
Agreed, especially if this was developed in a closed room.

~~~
smosher
Do you mean clean room? That protects you from copyright, but not from
patents.

------
lubutu
It would be great if the author could actually write a description of how IZ's
algorithm works. The C++ source is fairly complex and has no comments
whatsoever.

~~~
chipsy
A quick perusal of the source indicates that the algorithm hinges on the
predictive encoding strategies in this file:

[http://gitorious.org/imagezero/imagezero/blobs/master/pixel....](http://gitorious.org/imagezero/imagezero/blobs/master/pixel.h)

Without comments the specifics are still somewhat mysterious, though.

------
pavlov
For fast lossless image compression algorithms that are used by the world's
leading visual effects studios, check out OpenEXR: <http://www.openexr.com>

From the Features page: _"The current release of OpenEXR supports several
lossless compression methods, some of which can achieve compression ratios of
about 2:1 for images with film grain. OpenEXR is extensible, so developers can
easily add new compression methods (lossless or lossy)."_

~~~
klodolph
Not everyone wants 16-bit floating point samples.

~~~
pavlov
An OpenEXR file can contain 8-bit channels.

------
huhtenberg
> _The best known algorithms, however, are very slow, and sometimes
> impractical for real-world applications._

Is this in reference to paq [1]? Are there any other really good compressors
that are too slow?

[1] <http://en.wikipedia.org/wiki/PAQ>

~~~
celoyd
For any algorithm that’s heuristically searching a very large space (in this
case, the space of model parameters to compress a given string), you can
expect it to do better given more time. So it would actually be surprising if
the best compressors, in terms of compression ratio, were fast.

In practice, the competitions that people like to use to measure which
compressor is “best” have resource limits, and compressors from the PAQ family
are at or near the top of most rankings.

<http://mattmahoney.net/dc/text.html> has some nice charts of the Pareto
frontier for at least one competition. This lets you see the most efficient
compressor for a given compression ratio and vice versa. The log scale gives a
sense of the diminishing returns: it's really easy to compress most structured
data to half its original size, but getting from, say, a ratio of 0.214 to
0.213 can be a hell of a lot of work.
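The diminishing-returns point is easy to reproduce with zlib itself: the fastest level gets the bulk of the win, and cranking the effort up buys comparatively little. A quick sketch (the sample data here is made up):

```python
import zlib

# Structured, repetitive sample data -- a stand-in for "most structured data".
data = b"the quick brown fox jumps over the lazy dog\n" * 2000

# Compressed size at the fastest, default, and slowest zlib levels.
sizes = {level: len(zlib.compress(data, level)) for level in (1, 6, 9)}
```

On input like this, level 1 already shrinks the data dramatically; levels 6 and 9 typically only shave off a little more while costing noticeably more CPU.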

------
nitrogen
Right now the CPU usage of a project I'm working on is dominated by PNG
compression. An alternative with significantly faster compression times would
be very useful. I'll be watching this closely, and hope that it becomes
possible to decode IZ images in the browser (or implement it myself).

~~~
celoyd
PNG compressors generally try a couple different strategies, which can get
very expensive. You might be able to ask the compression library to be a
little more hands-off, if you don’t need great compression.
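Concretely, "hands-off" means fixing one scanline filter up front and using a fast zlib setting instead of letting the encoder search. The zlib side of that might look like this (the scanline bytes are a made-up stand-in):

```python
import zlib

# Made-up stand-in for PNG scanline bytes that have already been filtered.
scanlines = bytes(range(256)) * 256

# level=1 is the fastest deflate setting; Z_FILTERED tunes deflate for
# filter/predictor output (mostly small values with a noisy distribution).
comp = zlib.compressobj(level=1, strategy=zlib.Z_FILTERED)
packed = comp.compress(scanlines) + comp.flush()

assert zlib.decompress(packed) == scanlines  # still perfectly lossless
```

You trade some ratio for a large drop in per-image CPU time, which is usually the right trade when compression dominates the pipeline.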

~~~
nitrogen
Thanks for the suggestion. Unfortunately I'm already specifying the pixel
delta strategy that works best with my images, and using the fastest zlib
compression setting. Going with no zlib compression would turn my CPU problem
into a bandwidth problem :).

~~~
celoyd
Ouch. Well, it looks like you get to be an early adopter of IZ!

------
ck2
There is a windows binary in this thread (obviously try with caution)

[http://encode.ru/threads/1471-iz-New-fast-lossless-RGB-photo...](http://encode.ru/threads/1471-iz-New-fast-lossless-RGB-photo-compression)

------
rorrr
Sample size of one, and they didn't even publish the image itself.

~~~
ck2
<http://skulpture.maxiom.de/playground/list-iz.txt>

They tried it on the same benchmark as this list

<http://www.imagecompression.info/gralic/LPCB.html>

Of course it's meaningless without the decompression times, but those would
have to be measured on the same machine as the master benchmark list.

