
Image Compression: Seeing What's Not There - davidbarker
http://www.ams.org/samplings/feature-column/fcarc-image-compression
======
dietrichepp
You might think image compression technology has stalled, but it hasn't...
it's just being done under the umbrella of video compression. Demand for
better video compression techniques has driven image compression far beyond
the underwhelming JPEG 2000 standard.

JPEG 2000 doesn't have wide support because it sucks. It does solve two major
problems: blockiness and efficient lossless encoding. However,

1. It's not actually much better than JPEG in terms of quality / file size
tradeoff, at least under usual circumstances. JPEG 2000 is sometimes
subjectively worse than JPEG.

2. Available encoders and decoders have poor performance.

Meanwhile, look at how amazing I-frames are in H.264. They decode rather
quickly, you can do adaptive quantization, etc. I'll bet on WebP (basically,
VP8 I-frames) over JPEG 2000 any day. JPEG XR is the other serious contender.
JPEG 2000 was a good idea, it just didn't work out.

Unfortunately, the tests I see compare image formats just using SSIM (or
worse, PSNR), rather than allowing subjective comparisons.

[https://developers.google.com/speed/webp/docs/c_study](https://developers.google.com/speed/webp/docs/c_study)
(who uses PSNR? the poor methodology means you can't take this analysis
seriously)

[http://goldfishforthought.blogspot.com/2010/10/comparison-webp-jpeg-and-jpeg-xr.html](http://goldfishforthought.blogspot.com/2010/10/comparison-webp-jpeg-and-jpeg-xr.html)
(SSIM is better, but we still need to measure the subjective quality)
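To see why PSNR is such a blunt instrument: it's a function of mean squared
error alone, so two distortions with identical MSE get identical scores no
matter how differently they look. A quick sketch (plain Python on a toy
"image"; nothing here is taken from the linked studies):

```python
import math

def psnr(orig, dist, max_val=255):
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    mse = sum((a - b) ** 2 for a, b in zip(orig, dist)) / len(orig)
    return 10 * math.log10(max_val ** 2 / mse)

# A flat 10x10 "image" as a list of 100 pixel values.
original = [128] * 100

# Distortion A: every pixel shifted by 2 -- visually invisible.
shifted = [p + 2 for p in original]

# Distortion B: four pixels off by 10 -- a visible blemish.
blemished = list(original)
for i in (0, 1, 2, 3):
    blemished[i] += 10

# Both have MSE = 4.0, so PSNR can't tell them apart.
print(psnr(original, shifted))    # ~42.11 dB
print(psnr(original, blemished))  # ~42.11 dB
```

SSIM at least tries to model local structure, which is why it catches the
blemish case better, but neither replaces actually looking at the images.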

~~~
jacobolus
Thanks for this comment. I hate how many analyses use PSNR to represent
“quality” (especially academic papers; I don’t understand why academic CG/CV
people have such a love for nonsense metrics).

I think you’re being a bit too hard on JPEG 2000. If you try compressing
images to a very very tiny size, I find that the regular old JPEG encoders
I’ve tried produce much more objectionable artifacts than the JPEG 2000
encoders I’ve tried, at the same filesize. I haven’t done any very
comprehensive study though; do you have a link to a study showing JPEG 2000
performing worse? What type of images and what level of compression was that?
You’re right that the JPEG 2000 encoders and decoders have abysmal performance
though, making them impractical in many (or even most) use cases.

The big use case I can think of for JPEG 2000 is archival of very high
resolution images where disk space is limited/expensive and encoding/decoding
speed doesn’t matter much.

I personally thought Google’s WebP was very underwhelming on all the stuff I
threw at it 2–3 years ago when I did some ad-hoc personal tests. Basically
pretty similar in filesize vs. subjective quality to regular old JPEGs for the
images I tested (and noticeably worse than JPEG 2000 at the same file sizes),
except brand new and not supported by anyone but Google.

[But note my little hour of ad-hoc testing based on personal judgments of a
small handful of images was certainly far from comprehensive or authoritative,
and it was a few years ago, so YMMV.]

Seemed to me like the advantages were more political (i.e. giving Google
control and helping push support for their video codec and their fight against
MPEG license fees) than technical. And it didn’t help that their publicly
published comparisons and marketing material were full of obviously bogus
tests. Felt like snake-oil sales.

Have they improved the WebP codec since then, or is there a better analysis
somewhere of when WebP is a better choice than JPEG?

------
Scaevolus
JPEG supports progressive rendering by sending one group of coefficients at a
time -- first the DC coefficient (average value), then each AC coefficient in
turn (starting with low frequencies).

This actually tends to improve compression efficiency, since coefficients of a
given frequency are more similar to each other across blocks than the mixed
coefficients within a single block are.
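A rough way to see this with synthetic data (not a real JPEG bitstream --
just simulated quantized DCT coefficients for a smooth image): reordering the
same bytes from block-major to coefficient-major groups statistically similar
values together, which even a generic compressor like zlib can exploit.

```python
import random
import zlib

random.seed(0)

# Simulate quantized DCT coefficients for 1000 8x8 blocks of a smooth
# image: the DC term drifts slowly from block to block, low-frequency
# AC terms are sparse, and high-frequency terms quantize to zero.
n_blocks, n_coeffs = 1000, 64
dc = 128
blocks = []
for _ in range(n_blocks):
    dc += random.randint(-2, 2)
    acs = [random.choice([0, 0, 1, 255]) if k < 16 else 0
           for k in range(1, n_coeffs)]
    blocks.append([dc % 256] + acs)

# Block-major order: all 64 coefficients of block 0, then block 1, ...
block_major = bytes(c for block in blocks for c in block)

# Coefficient-major order (roughly what progressive JPEG transmits):
# all DC terms first, then each AC frequency across every block.
coeff_major = bytes(blocks[b][k]
                    for k in range(n_coeffs) for b in range(n_blocks))

# Same bytes, different order; the coefficient-major stream tends to
# compress smaller because same-frequency values share statistics.
print(len(zlib.compress(block_major)))
print(len(zlib.compress(coeff_major)))
```

Real JPEG uses tuned Huffman/arithmetic coding rather than zlib, but the
underlying reason progressive mode often wins is the same.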

~~~
hrjet
Yeah, I have noticed it improves compression on many of my photos. Why isn't
it the default? It probably takes more time / memory to decode.

------
richcuteguy34
Awesome stuff.

Here's another article about this
[http://www.datagenetics.com/blog/november32012/index.html](http://www.datagenetics.com/blog/november32012/index.html)

