
JPEG2000: Choices and Tradeoffs for Encoders (2004) [pdf] - nkurz
http://www.cs.sfu.ca/CourseCentral/820/li/material/source/papers/jpeg2000tips04.pdf
======
jsingleton
I did my final University project on an algorithm based on JPEG2000. It's a
pretty interesting way of compressing images.

One of the best things about using the Wavelet Transform over the Discrete
Cosine Transform is that it allows you to structure the file or stream from low to
high frequency. This lets you get an image of any size out of a single file
just by taking a subset of it. No more resizing thumbnails. It's kind of like
a super version of progressive JPEG.
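
As a toy illustration of that low-to-high ordering, here's a one-level 1-D Haar transform in Python (much simpler than the 5/3 and 9/7 filters JPEG2000 actually uses, but the principle is the same): the low-frequency half of the coefficients is already a half-size version of the signal, so truncating the stream still yields a usable smaller image.

```python
# Toy sketch: a one-level 1-D Haar wavelet transform. JPEG2000 uses
# fancier 2-D filters, but the resolution-scalability idea is the same.

def haar_forward(signal):
    """Split a signal into low-frequency (pairwise averages) and
    high-frequency (pairwise half-differences) coefficients."""
    low = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    high = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return low, high

def haar_inverse(low, high):
    """Perfectly reconstruct the full-resolution signal."""
    out = []
    for l, h in zip(low, high):
        out.extend([l + h, l - h])
    return out

pixels = [10, 12, 80, 84, 50, 52, 20, 22]
low, high = haar_forward(pixels)

# Stream ordered low-to-high frequency: reading just the first half
# already gives you a half-resolution "thumbnail" ...
print(low)  # [11.0, 82.0, 51.0, 21.0]

# ... and reading the rest restores the original exactly.
print(haar_inverse(low, high))
```

Apply the same split recursively to the low half (and in two dimensions) and you get the multi-resolution pyramid that makes "any size out of a single file" work.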

Sadly it never took off as storage / bandwidth are cheap and JPEG is so
entrenched (especially in hardware). It still has some niche applications in
constrained environments though.

 _Edit:_

I dug out the paper if anyone wants to read it:
[https://unop.uk/misc/university-final-year-project-
cbir](https://unop.uk/misc/university-final-year-project-cbir)

It's funny looking back at something you wrote over 8 years ago and seeing how
you've improved. In other words, please don't use this to judge me!

    
    
      Abstract
      This project investigated the effects of scaling images
      with JPEG2000 emulation software on a CBIR system based 
      on MPEG-7. A CBIR system was built that consisted of 
      indexing and retrieval. The performance of the system was 
      evaluated with standard metrics.
    

Neither JPEG2000 nor MPEG-7 ever really took off. Most people moved on to more
advanced algorithms.

~~~
varjag
My understanding is that patent encumbrance was what killed JPEG2000. There was
an effort later to waive the patents, but I'm not sure how comprehensive it was,
and it perhaps came a little too late.

So it's the same reason no one uses the theoretically superior arithmetic
coding option in baseline JPEGs.

~~~
IshKebab
It was more that it didn't offer enough benefits. The pitch was just "the files
are a bit smaller".

Image files don't take up _that_ much space anyway, so that's less compelling
than it is for video compression.

What they needed to do was add more features, like WebP has now:

* Single format for photos and diagrams

* Transparency

* Lossy and lossless in the same format

* HDR

* Tiling of large pictures

Hopefully those features (especially lossy transparency) will help WebP do
better than JPEG2000, though I'm not holding my breath. If IE and Firefox ever
add support for it then I think web developers will start using it a lot.

~~~
PaulKeeble
Performance was also a big problem. In the beginning it was hard to justify the
amount of time it took to encode and especially decode the images. IIRC it was
on the order of 100x the time of JPEG, which meant that at the connection speeds
of the time, the bandwidth saved was overshadowed by the decode time.

I did a lot of work with it and had quite a lot of problems with the
performance of the sample software.

~~~
yoklov
Did you feel that this was an intrinsic property of the format, or just that
less/not enough time had been spent on developing efficient implementations?

------
pornel
JPEG2000 just isn't that great, and DCT-based compression is very hard to
beat. DCT with varying block sizes gets most of the efficiency of wavelets
with fewer complications.

A poorly encoded block only makes a limited area of the image look bad. With
wavelets, errors in the lower frequencies have ripple ( _wink_ ) effects. It's
mathematically easier to reason about DCT, so encoders can be really good.

DCT can easily create noise and sharp edges (even when they're fake, they can
still create realistic texture), but any sort of noise and sharpness is
expensive with wavelets. In practice it turned out to be easier to do DCT and
then blur unwanted edges than to express edges with wavelets.

------
TorKlingberg
It's interesting how JPEG2000 use never really took off outside of a few
niche areas. Web browser vendors have been reluctant to implement it because of
the increased attack surface, and of course web sites will not use it without
browser support. Patents may also be an issue.

JPEG works mostly on 8x8 blocks of pixels, and in high-resolution pictures
each block is almost a single color. JPEG does a poor job of taking advantage
of the similarity between blocks.
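
To make the 8x8 point concrete, here's a rough sketch of the 2-D DCT-II that baseline JPEG applies per block (a naive pure-Python version, nothing like a real codec's fast implementation). For a nearly single-color block, almost all of the energy collapses into the one DC coefficient, and every block in a smooth region repeats roughly the same story.

```python
import math

def dct2_8x8(block):
    """Naive 8x8 2-D DCT-II, as used (conceptually) in baseline JPEG."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            cu = math.sqrt(0.5) if u == 0 else 1.0
            cv = math.sqrt(0.5) if v == 0 else 1.0
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[u][v] = 0.25 * cu * cv * s
    return out

# An almost-single-color block, like one from a high-resolution photo.
flat = [[128 + (x + y) % 2 for y in range(8)] for x in range(8)]
coeffs = dct2_8x8(flat)

# Nearly all the energy sits in the DC term coeffs[0][0]
# (8x the block's mean value); the AC terms are tiny.
print(round(coeffs[0][0], 1))
```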

JPEG2000 does at least see some use in archival formats.

~~~
captainmuon
> Web browser vendors have been reluctant to implement it because of the
> increased attack surface, and of course web sites will not use it without
> browser support.

I believe codecs should actually be part of the OS. There should be one good,
well-tested implementation per OS that browsers and other apps just link to.
It might actually be possible to sandbox the codec components for increased
security. Ideally, a web browser would just be an HTML+stuff parser, a layout
engine, and a bit of gui, and shell out to the OS for everything else.
Unfortunately, it seems the monolithic kitchen sink model has won.

~~~
vetinari
The monolithic kitchen sink won because the OS-bundled libraries have
different APIs (not only among different OSes, but also among different
versions, and OS vendors are interested in pushing the latest). So app
vendors just said to themselves "screw it" and bundled a library (like jpeglib
or libpng) that has the same API on all systems, with versioning under their
control.

That's the short version ;).

------
fnordfnordfnord
I'd like to see this get more attention.
[http://bellard.org/bpg/](http://bellard.org/bpg/)

------
therealmarv
RIP. The wars today are WebP (Google) vs. JPEG-XR (Microsoft) vs. BPG?! Maybe
they will all die too... I've only heard that WebP is or was used by Facebook
and is used by Telegram.

~~~
ashmud
BPG may have patent issues.

[http://bellard.org/bpg/](http://bellard.org/bpg/)

"Some of the HEVC algorithms may be protected by patents in some countries
(read the FFmpeg Patent Mini-FAQ for more information). Most devices already
include or will include hardware HEVC support, so we suggest to use it if
patents are an issue."

