
MozJPEG 3.0 - ssttoo
http://calendar.perfplanet.com/2014/mozjpeg-3-0/
======
georgef
I'm reminded here that JPEG includes arithmetic coding as part of the
standard, but almost everyone uses Huffman because, until a couple of years
ago, arithmetic coding was patent-encumbered (the patents have since expired).
Is anyone aware of a study like Mozilla's that considers JPEG with arithmetic
coding? Or perhaps it does, and I failed to notice?

Most competing file formats seem to beat JPEG by only a slim margin, and what
I've read on arithmetic encoding suggests it gives a ~5-10% gain, which would
make that difference slimmer still, perhaps vanishing into the uncertainty of
the usefulness of these quality benchmarks. Of course, there would be inertia
to overcome to support it, as with a new format, but recompiling everyone's
libjpeg is surely less work than adding support for whole new file formats. At
the very least, it seems there might be a better effort/payoff ratio.
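
The ~5-10% figure comes from arithmetic coding's ability to spend fractional bits per symbol, where Huffman must round every code up to a whole number of bits. A toy Python sketch of that gap, using a made-up skewed distribution (not JPEG's actual symbol statistics):

```python
import heapq
import math

def huffman_avg_length(probs):
    """Average code length (bits/symbol) of an optimal Huffman code.

    The sum of every internal node's weight equals the probability-weighted
    average leaf depth, i.e. the expected code length."""
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    total = 0.0
    uid = len(probs)
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        total += p1 + p2          # every leaf below this merge gains one bit
        heapq.heappush(heap, (p1 + p2, uid))
        uid += 1
    return total

def entropy(probs):
    """Shannon entropy: the rate an ideal arithmetic coder approaches."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up skewed distribution, loosely in the spirit of quantized DCT symbols.
probs = [0.55, 0.2, 0.1, 0.06, 0.04, 0.03, 0.015, 0.005]
h, e = huffman_avg_length(probs), entropy(probs)
print(f"Huffman {h:.3f} vs entropy {e:.3f} bits/symbol ({100 * (h / e - 1):.1f}% overhead)")
```

The overhead depends heavily on the distribution: the closer the most probable symbol gets to certainty, the more Huffman's whole-bit minimum costs relative to the entropy.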

~~~
chkuendig
According to other comments and a Mozilla bug[1], most (if not all) browsers
don't even support decoding these JPEGs.

[1]
[https://bugzilla.mozilla.org/show_bug.cgi?id=680385](https://bugzilla.mozilla.org/show_bug.cgi?id=680385)

------
jacquesm
JPEG is absolutely awesome and this is a valuable addition.

I was using the very first release of the source back in the stone age or so.
We took passport photo images with a video camera at reasonably high
resolution and then scaled them down and compressed with PCX to save on
storage.

Quality after compression was absolutely terrible.

Then Tom Lane came along with libjpeg and suddenly the quality was better than
what we could print!

~~~
jacquesm
Too late to edit: PCX is lossless; what we did was reduce the higher-frequency
bits in the images before using PCX, in order to achieve a reasonable
compression ratio, in effect making a lossy wrapper around PCX. It was a
pretty crude way of making this work, and JPEG was _so_ much better that it is
hard to believe we managed to sell our customers on the original version. I
should see if I can dig up some of those old images; they're interesting
historically.

------
joshmoz
FYI, I plan to officially release mozjpeg 3.0 tomorrow.

Thanks for the writeup and your work on this release, Kornel!

~~~
auvi
I am waiting for 3.0 in Homebrew! Thanks for the awesome work; mozjpeg has
saved me a fair amount of network bandwidth.

~~~
Kiro
What are you using it for?

------
BorisMelnik
Beautiful! I had this file sitting on my desktop as a compressed JPEG. JPEGs
of text are inherently hard to compress and tend to spit out ugly pics:

Original - 327kb -
[http://i.imgur.com/DTxTcLp.jpg](http://i.imgur.com/DTxTcLp.jpg)

MozJPEG - 127kb -
[http://i.imgur.com/jVESWGS.jpg](http://i.imgur.com/jVESWGS.jpg)

Stared at both side by side and really struggled to tell the difference. Great
job!

Sorry, WebP is great, but I just don't see it getting adopted unless all
browsers get on board, along with the big software. JPEG is practically a
household name: photographers, artists, Instagrammers all know what it is, and
short of a mild revolution I just don't see it happening.

~~~
mikhailt
How can you not tell the difference? Look at "Humor, empathy, resilience":
the characters 'r'/'i'/'n' are damaged.

There are more in the fourth column.

~~~
joshmoz
Thanks for pointing this out! Filed a bug:

[https://github.com/mozilla/mozjpeg/issues/139](https://github.com/mozilla/mozjpeg/issues/139)

I said in another comment that we'd release 3.0 tomorrow; we'll probably hold
up the release to investigate this.

~~~
uniclaude
Wow, that was fast.

They already fixed the bug. I'm impressed.

~~~
mikhailt
Awesome to hear that.

@joshmoz Hopefully, you can re-run the image and share it with us in case we
can find something else. :D

------
sandstrom
Deringing (removing 'noise' around text and similar sharp shapes in
JPEG-compressed images) is awesome in itself.

Hopefully this will find its way into image authoring tools.

~~~
leeoniya
wonder if SmartDeblur would benefit from this, also:
[http://smartdeblur.net/gallery.html](http://smartdeblur.net/gallery.html)

------
leeoniya
Would adding dithering support to the encoder help with gradient smoothness? I
know it helps a lot with non-compressed formats, in addition to shrinking
file size (though that may not be the case with JPEG compression). You can toy
with the params [1] and see that even dropping the target palette color count
by >50% still gets good results with a dithering kernel selected. Repo here
[2], btw.

[1] [http://o-0.me/RgbQuant/](http://o-0.me/RgbQuant/)

[2]
[https://github.com/leeoniya/RgbQuant.js](https://github.com/leeoniya/RgbQuant.js)
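
For reference, error diffusion (the family of dithering kernels a quantizer like RgbQuant.js offers) can be sketched in one dimension: quantize each sample and carry the rounding error into the next. A toy scanline version, not the library's actual 2-D kernels:

```python
def dither_1d(row, levels=2):
    """1-D error diffusion: snap each 0-255 sample to the nearest of `levels`
    evenly spaced values, then push the rounding error onto the next sample so
    local averages are preserved. Illustrative only."""
    step = 255 / (levels - 1)
    out, err = [], 0.0
    for v in row:
        target = v + err
        q = max(0, min(255, int(round(round(target / step) * step))))
        err = target - q            # the error the next sample must absorb
        out.append(q)
    return out

# Mid-gray at 2 levels becomes a black/white mix whose average stays near 128.
print(dither_1d([128] * 10))
```

This is why dithering lets you cut the palette hard: the quantization error turns into high-frequency noise instead of visible banding.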

~~~
pornel
Yes, some sort of deblocking in the decoder could go a long way, e.g.

[http://johncostella.com/unblock/](http://johncostella.com/unblock/)

This isn't something that can be done on the encoding side, so it's out of
scope of MozJPEG.

I do think it may be worthwhile to spec a backwards-compatible extension for
decoders that adds deblocking.
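
The core of a decoder-side deblocking filter is simple: at each block boundary, soften small discontinuities (likely quantization artifacts) while leaving large ones (real edges) alone. A minimal 1-D sketch of that idea, with an arbitrary threshold; real filters work in 2-D and adapt the threshold to the quantizer:

```python
def deblock_row(row, block=8, threshold=10):
    """Toy 1-D deblocking: at each block boundary, pull the two border pixels
    a quarter-step toward each other if the jump is small (likely an artifact),
    and leave big jumps (real edges) untouched."""
    out = list(row)
    for b in range(block, len(row), block):
        step = row[b] - row[b - 1]
        if abs(step) <= threshold:      # small discontinuity: smooth it
            out[b - 1] += step // 4
            out[b] -= step // 4
    return out

row = [100] * 8 + [108] * 8 + [200] * 8   # artifact-sized step, then a real edge
print(deblock_row(row))
```

Because this runs purely in the decoder, bitstreams stay compatible, which is why it falls outside an encoder project like MozJPEG.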

~~~
derf_
You could also just modulate the lambda used in the trellis quantization so
that it is less aggressive in smooth blocks, and more aggressive in textured
blocks. It's not as good as being able to change the quantizer, but you can
get somewhere around half the benefits of real activity masking by changing
lambda alone.
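
The idea above, sketched in Python: derive a per-block multiplier for lambda from the block's variance ("activity"). All the constants here are illustrative placeholders, not any real encoder's tuning:

```python
def block_lambda(block, base_lambda=1.0, strength=0.5, ref_variance=100.0):
    """Scale the rate-distortion lambda by block activity: smooth blocks get a
    smaller lambda (spend bits, avoid visible artifacts), busy blocks a larger
    one (errors are masked by texture)."""
    n = len(block)
    mean = sum(block) / n
    variance = sum((x - mean) ** 2 for x in block) / n
    # Map variance to a multiplier that is 1.0 at the reference activity level.
    return base_lambda * ((variance + 1) / (ref_variance + 1)) ** strength

smooth = [128, 129, 128, 130, 129, 128, 129, 130]
textured = [10, 200, 40, 180, 90, 220, 15, 170]
print(block_lambda(smooth), block_lambda(textured))
```

The trellis search then minimizes D + lambda*R per block as usual; only lambda varies, which is why this captures part of the benefit of activity masking without touching the quantizer.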

------
Cherian
Thanks so much for this piece of art.

I’ve been using JPEGMini trial
[[http://www.jpegmini.com/](http://www.jpegmini.com/)] for a while. How does
this compare?

~~~
sinak
I just ran a quick comparison:

Original: 250kB
[http://files.sina.is/original.jpg](http://files.sina.is/original.jpg)

Compressed by JPEGmini Lite: 133kB [http://files.sina.is/jpeg-
mini.jpg](http://files.sina.is/jpeg-mini.jpg)

Compressed by MozJPEG 3.0 @ Medium: 82kB [http://files.sina.is/moz-
medium.jpg](http://files.sina.is/moz-medium.jpg)

Compressed by MozJPEG 3.0 @ High: 141kB [http://files.sina.is/moz-
high.jpg](http://files.sina.is/moz-high.jpg)

~~~
kristofferR
Thanks, but unfortunately it's not that good of a comparison since the
original is already heavily compressed. Could you do another with a less
compressed original image (and, preferably, more color variation)?

------
shmerl
So does this mean Daala's compression could be used to produce a new image
format once it's ready (similar to how WebP was derived from VP8)?

~~~
pornel
Yes, Daala is doing exactly that:

[http://people.xiph.org/~xiphmont/demo/daala/update1.shtml](http://people.xiph.org/~xiphmont/demo/daala/update1.shtml)

~~~
shmerl
Thanks, that's a good overview. Hopefully something better than JPEG and WebP
will come out of Daala still images.

~~~
bnolsen
Based on my experience with BPG: compressing a 39-megapixel image takes 2s
with jpeg-turbo (the original is a raw TIFF, but already cached), while the
same image takes 8m30s as BPG. This is on an Ivy Bridge Xeon. I wanted to
compress a few hundred thousand of these 39MP images for transport and backup
storage, but that's unacceptable time-wise. How much faster would Daala be
than HEVC?

~~~
shmerl
Daala's approach to video compression differs from HEVC's: it optimizes for
the perceived quality of the image. So in theory it can be computationally
lighter, because it can spend less effort on areas that affect perception
less. But it's not there yet.

Here is an overview of this idea:
[http://jmvalin.ca/video/spie_pvq_abstract.pdf](http://jmvalin.ca/video/spie_pvq_abstract.pdf)

I'm not sure, though, how exactly that translates into still-image compression
efficiency. For video, they do plan to eventually beat HEVC on both quality
and algorithmic delay.

------
CookWithMe
Since it's mentioned in the article: Does anyone have experience with lossy
png tools?

I'm currently working on a project that needs alpha channels. I've been
optimizing the images with pngcrush, which helped (interestingly, images put
out with Adobe products were already pretty optimized, but I'm generating
thumbnails locally with sips, where pngcrush often saves 60+%).

Still, for photographic images, file size often remains multiple times larger
than what I'd expect from a high-quality JPEG.

~~~
Daiz
>Does anyone have experience with lossy png tools?

I do. Lossy PNGs work great for images with not a lot of colors. I use
pngquant and optipng a lot in my work to compress a lot of PNG images with
practically no visual quality loss.

For very colorful images, lossy (quantized & dithered) PNGs just don't work,
though. They end up looking nasty, with larger file sizes than what a
high-quality JPEG gives you.

~~~
pornel
There's also a method for true-color lossy PNGs that blurs instead of
dithering: [https://speakerdeck.com/pornel/lossy-png-for-true-color-
imag...](https://speakerdeck.com/pornel/lossy-png-for-true-color-images-
velocity-conf-eu-2014) ([https://github.com/pornel/mediancut-
posterizer](https://github.com/pornel/mediancut-posterizer)). It's not as
efficient as PNG8, but still better than nothing :)
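
The underlying trick can be sketched simply: snap each channel to fewer distinct values so PNG's DEFLATE stage finds longer, more frequent matches. This uniform version is a stand-in for the adaptive (median-cut) level selection the linked tool actually uses:

```python
def posterize(pixels, levels=16):
    """Snap each 0-255 value to the nearest of `levels` evenly spaced values.
    Fewer distinct bytes after PNG filtering means DEFLATE compresses much
    better; a uniform grid is the crudest possible level choice."""
    step = 255 / (levels - 1)
    return [int(round(round(v / step) * step)) for v in pixels]

print(posterize([0, 7, 100, 128, 250, 255]))
```

Unlike PNG8's single 256-color palette, this keeps the image true-color (each channel still spans the full range), so it degrades more gracefully on colorful photos.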

------
pwr22
Looks awesome, but I hope there will be a better naming convention for the
quant tables than

    -quant-table 2

etc.

------
annand_virk
This is totally awesome. Nothing bothers me more than seeing that awkward
noise around images I export from Photoshop. Someone mentioned this already,
but I hope this finds its way into the apps I use.

On a totally unrelated note, Denny, the dude that dropped the first comment on
that post, is not a stand-up guy.

------
Timmmmmm
I have a feeling that nobody would really bother with WebP for its
compression, but does JPG/PNG have:

* Lossy compression with alpha channels.

* Efficient lossless compression of photo-like images.

* Efficient compression of photo-like and diagram-like images in the same format (and in the same image, e.g. screenshots containing photos).

* Good lossy compression of diagram-like images.

No.

~~~
d0ugie
> nobody would really bother..

I did. Last summer I converted all 35K images on my NSFW hobby site (check
profile) to WebP, with no JPEG fallback or shabby JavaScript decoder (those
don't work on very high-res images), and haven't looked back.

On my journey to 1000ms-to-glass with a site like mine, I'm going to go with
the format that gives me dramatic size savings, thank you Google.

That said, I can see how it benefits Firefox users not to be able to render
WebP... sigh.

It would be helpful if 4chan followed my lead by at least allowing users to
post WebP with something like mod_pagespeed running.

~~~
Renaud
What particular sorcery do you use that prevents your site from serving
Firefox browsers?

~~~
d0ugie
An nginx redirect based on user agents to an apology and a list of download
links to WebP friendly browsers. I used to include a link to a Firefox fork
that supported WebP natively, but no one bothered.

I made a sort of Google+ companion to the site which I'd bump them onto but I
still haven't gotten the hang of not getting banned.

~~~
cbr
ngx_pagespeed would be another way of serving WebP to supporting browsers and
JPEG to others.

~~~
d0ugie
Yes, but not when storage, bandwidth, money, a desire to deliver only the best
user experience (or nothing) and pushing WebP are concerns.

By the way, it's remarkable, when running an image-heavy site, how much
bot/mass-downloader traffic relative to humans vanishes when you turn away
Firefox user agents.

