
JPEG 2000: The Better Alternative to JPEG That Never Made It Big - artsandsci
https://petapixel.com/2015/09/12/jpeg-2000-the-better-alternative-to-jpeg-that-never-made-it-big/
======
niftich
Other commenters are right that licensing was a big factor: the 1990s were
marred by a few high-profile events where patents on popular formats were
suddenly enforced [1][2]. These events led to the PNG format being developed
as an alternative to GIF, and to Vorbis being developed as an alternative to MP3.

By the time JPEG 2000 rolled around, wariness of patents was well-established.

On the technical side, at medium compression ratios JP2 produced unsightly
ringing artifacts much more noticeably than JPEG did, while also blurring
detail. JPEG, by contrast, only experiences ringing around sharp transitions,
and gets blocking artifacts elsewhere. To some viewers, JPEG's blocking
imparts a sort of 'fake sharpness' to the image, while JP2 produces a blurry
smudge. Dark Shikari, longtime dev on x264, wrote about this phenomenon [3],
which also adversely affected early H.264 encoders and made them look worse
compared to ASP encoders (like Xvid), despite the former being an objectively
better format.

[1] https://en.wikipedia.org/wiki/GIF#Unisys_and_LZW_patent_enforcement
[2] https://en.wikipedia.org/wiki/MP3#Licensing.2C_ownership_and_legislation
[3] http://web.archive.org/web/20141124213159/http://x264dev.multimedia.cx/archives/317#more-317

------
bnolsen
jpeg2k is far more complex, harder to implement, and far more computationally
expensive, and the quality of the results arguably isn't that much better.
Revisiting how the DCT coefficients are encoded shows tremendous gains can
still be had there. After digging into jpeg2k I was never a big fan.

IMHO for image compression there's a balance to strike between algorithmic
simplicity, computational efficiency, and image quality.

I've been looking for an article from some months back about DCT coding sans
Huffman, but here's a quick page on how using arithmetic coding (already part
of the JPEG standard) cuts JPEG file sizes down 5-12%:
[http://www.rw-designer.com/entry/1311](http://www.rw-designer.com/entry/1311)
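As a toy sketch of why that gain exists (this is not JPEG's actual QM-coder,
just an entropy argument): Huffman coding must spend an integer number of
bits, at least one, per symbol, while arithmetic coding can approach the
entropy of the skewed distributions that quantized DCT coefficients follow.
The numbers below are made-up probabilities for illustration only.

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Code lengths produced by Huffman's algorithm for a probability list."""
    counter = itertools.count()  # tie-breaker so floats never compare tuples of lists
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        # Merge the two least-probable subtrees; every symbol inside
        # either subtree gets one bit deeper in the code tree.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

# A skewed distribution, roughly like quantized DCT coefficients (mostly zeros).
probs = [0.9, 0.05, 0.03, 0.02]
huff = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
entropy = -sum(p * math.log2(p) for p in probs)
print(huff, entropy)  # Huffman: 1.15 bits/symbol; entropy: ~0.62 bits/symbol
```

With real JPEG data the distributions are less extreme, which is why the
measured savings land in the 5-12% range rather than this toy's near-50%.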

Okay, one DCT-based recompressor is "lepton", which produces files about 20%
smaller than JPEG while keeping the DCT coefficients bit-identical:
[https://blogs.dropbox.com/tech/2016/07/lepton-image-compression-saving-22-losslessly-from-images-at-15mbs/](https://blogs.dropbox.com/tech/2016/07/lepton-image-compression-saving-22-losslessly-from-images-at-15mbs/)

------
theandrewbailey
I think this is a bad article. I was expecting a more technical piece, but it
only gives a few high-level reasons why JPEG 2000 never got used widely. And
weren't legal requirements a big reason?

It mentions that backwards incompatibility was a big reason. Why would you
design an obviously better media format, then ruin it with a burdensome,
needless requirement? Has any widely used media format been backwards
compatible? (I can't think of any.)

------
dmitrygr
Missing main reason: licensing

------
skrowl
When can we expect the same article about WebP?

~~~
tehlike
I made a good slice of mobile app ads use WebP; it provides significant
bandwidth savings over JPEG, and I am not sure if it is going anywhere.

PS: I work for Google.

