
Progressive JPEGs: a new best practice - ssttoo
http://calendar.perfplanet.com/2012/progressive-jpegs-a-new-best-practice/
======
herf
I don't like this suggestion of "best practice" without any numbers.
Progressive JPEG uses more RAM (normal JPEGs can be streamed, so you only have
to buffer a row of JPEG blocks at a time) and lots more CPU (up to 3x).

Most of the compression benefits can be obtained by setting the "optimized
Huffman" flag on your compressor. e.g., a baseline JPEG will save 10-20% when
"optimized" and progressive very rarely achieves a double-digit win after
that.
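
If anyone wants to reproduce that comparison, here's a minimal sketch using
Pillow (my pick of encoder; any compressor with a Huffman-optimization flag
works the same way). Note it re-encodes the pixels, so it's for size
comparison only, not lossless conversion:

    from PIL import Image
    import os

    img = Image.open("photo.jpg")

    # Baseline JPEG with the default Huffman tables
    img.save("baseline.jpg", "JPEG", quality=85)

    # Baseline with optimized Huffman tables (costs an extra encoder pass)
    img.save("optimized.jpg", "JPEG", quality=85, optimize=True)

    # Progressive, also with optimized tables
    img.save("progressive.jpg", "JPEG", quality=85,
             optimize=True, progressive=True)

    for name in ("baseline.jpg", "optimized.jpg", "progressive.jpg"):
        print(name, os.path.getsize(name))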

MobileSafari eats cycles every time you inject a large image into the DOM, so
(while I haven't benchmarked progressive), it seems like this new-found love
of progressive JPEG is beating up one of the slower code paths in the mobile
browsers. And to me, it doesn't look that good!

~~~
aerobson
This brings up some important points. Yes, we need numbers. Let's get them.

Progressive jpegs do not necessarily need to use more RAM. The FAQ I linked to
also says "If the data arrives quickly, a progressive-JPEG decoder can adapt
by skipping some display passes." Win!

Also, why do you say "up to 3x" more CPU? Is that an estimate based on how
many scans you're guessing a progressive jpeg has? A progressive jpeg can have
a variable number of scans -- we used to be able to set that number, which is
totally cool!
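
(You still can, for what it's worth: libjpeg's jpegtran takes a "scans
script" as a wizard option. A sketch driving it from Python, assuming
jpegtran is installed; the scans.txt contents shown are a hypothetical
example:)

    import subprocess

    # A scans script defines each progressive pass: which components, which
    # DCT coefficient range, and the successive-approximation bits, e.g.:
    #   0,1,2: 0-0,  0, 0;   # DC coefficients for all components first
    #   0:     1-5,  0, 0;   # coarse luma AC
    #   0:     6-63, 0, 0;   # remaining luma AC
    #   1,2:   1-63, 0, 0;   # chroma AC
    subprocess.run(
        ["jpegtran", "-scans", "scans.txt",
         "-outfile", "custom.jpg", "in.jpg"],
        check=True,
    )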

As for the compression benefits, you say "progressive very rarely achieves a
double-digit win after that." We web performance geeks LOVE single-digit wins,
so you can't burst our bubble that way.

Yes, Mobile Safari has trouble with images. Period. But Mobile Safari does not
progressively render progressive jpegs (I wish it did). So we can make it a
best practice without worrying about Mobile Safari. When the web is full of
progressive jpegs, Apple will have to deal with them. It's not an evil plan,
it's the right thing to do.

When you say it doesn't look that good, are you saying that for yourself
personally, or are you saying it for your users? We need to think about what
they see. As I say, perceived speed is more important than actual speed, and
the thing that excites me most about progressive jpegs is not the file size
savings, but instead the behavior of the file type in browsers that properly
support it.

~~~
brigade
Progressive JPEG needs a minimum of 2 x width x height additional bytes over
baseline to decode an image (maybe more, and definitely 1.5-3x more than that
if you're displaying coarse scans), regardless of how many scans you have or
display, as it needs to save coefficients for N-1 scans over the entire image,
whereas baseline needs only to save the coefficients for a couple of 8x8
blocks at a time. Though if you're clever about it and sacrifice displaying
coarse scans, you could reduce this somewhat.
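
Back-of-the-envelope, taking the 2-bytes-per-pixel floor above:

    def progressive_overhead_bytes(width, height, bytes_per_pixel=2):
        # minimum extra decode memory: coefficients saved across the image
        return width * height * bytes_per_pixel

    print(progressive_overhead_bytes(1024, 768))   # 1572864, ~1.5 MB
    print(progressive_overhead_bytes(3264, 2448))  # 15980544, ~16 MB (8 MP)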

If you don't display coarse scans (and if you're comparing progressive vs.
baseline CPU usage then counting such isn't fair), then approximately the only
additional CPU time progressive should take is the time additional cache
misses take. It's probably a wash considering that decoding fewer coefficients
/ less Huffman data takes less CPU.

Maybe I'll get some numbers, I'm curious now... But unless you're serving
multi-megapixel images the additional CPU and memory doesn't matter. Probably
not even until you're in the double digits, if then.

~~~
aerobson
Thanks for the comment. Whatever detail you can add to this conversation is
much appreciated. It's a neglected topic, and it's important for us to
understand it better.

------
lysol
I'm nitpicking, but

    When images arrive, they come tripping onto the page, pushing other
    elements around and triggering a clumsy repaint.

This is easily avoided by defining the image dimensions in your stylesheet.

~~~
sltkr
And when the dimensions aren't defined, then progressively encoded JPEGs don't
offer an advantage either, because with both progressive and non-progressive
images, browsers reserve space for images as soon as their size is known, i.e.
when the file header is received.
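
(Easy to see with Pillow, whose open() is lazy and parses just the header --
exactly the information a browser needs to reserve space. A sketch, with a
local file standing in for a partial download:)

    from PIL import Image

    # open() reads format and dimensions from the file header; no pixel
    # data is decoded until the image is actually used.
    img = Image.open("photo.jpg")
    print(img.format, img.size)  # e.g. JPEG (1024, 768)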

~~~
aerobson
Agreed. But would you say that progressive jpegs don't offer a visual
advantage in this case? Imagine if a photo has a caption below it: baseline
starts rendering far away from the caption, but with progressive we'll get the
caption in the correct place without a big gap between the caption and where
the photo is rendering. In the case of baseline, the photo will draw to "meet"
the caption.

------
cbr

    their Mod_Pagespeed service

mod_pagespeed is an open source Apache module, not a service. Google also runs
PageSpeed Service, an optimizing proxy. Both support automatic conversion to
progressive jpeg.

    SPDY does as well, translating jpegs that are over
    10K to progressive by default

The author has SPDY and mod_pagespeed confused; this is a mod_pagespeed
feature.

(I work at Google on ngx_pagespeed.)

------
kijin
> _Plotting the savings of 10000 random baseline jpegs converted to
> progressive, Stoyan Stefanov discovered a valuable rule of thumb: files that
> are over 10K will generally be smaller using the progressive option._

At first I thought: "What, you're opening JPEGs and saving them again? Don't
you lose image quality every time you open and save in a lossy format?"

But then I read the actual source [1], and it says that `jpegtran` can open
baseline JPEGs and save them losslessly as progressive JPEGs. That sounds
useful!
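
(The conversion itself is a one-liner; here's a sketch wrapping it from
Python, assuming jpegtran is on the PATH:)

    import subprocess

    # jpegtran rearranges the existing DCT coefficients instead of
    # re-compressing pixels, so baseline -> progressive loses no quality.
    subprocess.run(
        ["jpegtran", "-progressive", "-copy", "all",
         "-outfile", "progressive.jpg", "baseline.jpg"],
        check=True,
    )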

Does anyone know whether other image editing software does the same thing to
JPEGs that are re-saved without modification? What about Photoshop? GIMP? MS
Paint?

[1] <http://www.bookofspeed.com/chapter5.html>

------
weixiyen
I would not call this a "best practice", but simply an alternative.

Progressively loading photos in that manner is not a good user experience
either. The first paint is often quite jarring, especially since, as pointed
out in the article, over 90% of photos simply don't load this way.

For content such as photos that are contained within some sort of layout, it
would be better to have a placeholder there that is the same frame as the
final image size, then have the final image appear upon completion.

~~~
neumann_alfred
_The first paint is often quite jarring_

To me it's still better than nothing. Just like I don't enjoy websites that
only paint after they're done loading.

~~~
weixiyen
Is it really better than a loading indicator?

Just use the title link as a reference. First you see chicken wings, then all
of a sudden 6 piglets.

The transition paint is really meaningless.

~~~
neumann_alfred
_Is it really better than a loading indicator?_

For me, yes. It is a loading indicator, without any additional "noise".

Also, the second-to-last pass is often quite good, if not indistinguishable
from the last one. So you have the full image already, while more details get
filled in, instead of first having the top, then the middle, then the bottom,
and to me that's less "jarring".

------
peterjmag
This is sort of tangential, but that browser chart reminded me of something
I've been curious about for a while: Why do certain browsers only render
foreground JPGs progressively? Is it a rendering engine limitation or an
intentional one (perhaps for usability during page load)?

~~~
Someone
Guessing: rendering anti-aliased text over an image background is expensive
(you have to read the background image to blend colors). Re-rendering page
text multiple times while an image is loading may not be a good idea.

~~~
TazeTSchnitzel
Browsers seem to be very lazy about this kind of thing, perhaps for that
reason. Chrome dealt very weirdly with one of my sites, where I had an image,
with a transparent-background iframe containing a semi-transparent-background
div above it, and text above that iframe. The text's position was lagging
relative to some images placed above it. I wonder if this is why.

------
fluxon
Oh, no no no. Ill-advised idea. We've gone through this before already, and it
was resolved in favor of baseline with optimization. How was it resolved? By
website visitors, who hated-hated-hated progressive JPEGs. Boy, those who
ignore history... cue the "Doom Song" from Invader Zim, sung by Gir.

(edited)

~~~
Terretta
> _visitors hated-hated-hated progressive JPEGs_

I call BS. On most every web project I've been involved in, we've used
progressive JPEGs for the size optimization. This was crucial in the dial-up
modem days, when we were inventing how web sites should best serve users, and
we still do it. Even in browsers not supporting progressive rendering,
perceived completion was faster, since the smaller size meant a faster load.
Switching from baseline to progressive consistently drove higher page views
and longer times on site.

I have never heard a single client say a single user complained about
progressive JPEGs. I'm not saying it hasn't happened somewhere to someone.
Users will complain about anything. But in billions of page views across
countless clients (including pro photo clients), I haven't run into user
complaints from progressive JPEGs, only measurable page view and time on site
improvements in user behavior.

------
r0s
Assuming the image has a small thumbnail embedded in the EXIF data, maybe it
would be even faster to use that resource (already included with the image),
scaled to fill the image space and replaced when the image loads.

Essentially the same effect, with some back-end code. I suppose we'd need a
benchmark test to find the real numbers. The main advantage would be sticking
with existing file types already in use.
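
(The extraction step is simple server-side; a sketch using the piexif
library, one assumed way to get at the embedded thumbnail:)

    import piexif

    # EXIF keeps an optional thumbnail in its own IFD; piexif returns its
    # raw JPEG bytes under "thumbnail", or None if nothing is embedded.
    exif_dict = piexif.load("photo.jpg")
    thumb = exif_dict["thumbnail"]
    if thumb:
        with open("photo_thumb.jpg", "wb") as f:
            f.write(thumb)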

~~~
quasque
On that note, it would be interesting to see if the EXIF data is typically
included at the start or end of JPEG files, or if it varies by compressor.

~~~
jzwinck
In EXIF, whether applied to JPEG or TIFF, the header comes first, then the
thumbnail if present, then the primary image. Exactly as you'd want (which is
more than can be said for much of EXIF, e.g. that it lacks time zone
encodings).
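
(You can confirm the ordering from the first few bytes of a file; a sketch:)

    # SOI is 0xFFD8; the EXIF container (APP1, 0xFFE1) follows within the
    # first few bytes of a typical camera JPEG, well before the image data.
    with open("photo.jpg", "rb") as f:
        head = f.read(4)
    print(head[:2].hex())   # ffd8 (SOI)
    print(head[2:4].hex())  # usually ffe1 (APP1/EXIF) or ffe0 (APP0/JFIF)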

------
pbhjpbhj
> _and progressive jpegs quickly stake out their territory and refine (or at
> least that’s the idea)_ //

Your browser stakes out the image area if you set height and width attributes
for the image thus avoiding reflows.

Also the bracketed comment makes it sound like this feature doesn't work well?

~~~
aerobson
I have observed that even when you set height and width attributes, the area
is not always "staked-out." Of course we'd assume that it is. I'll need to get
you some browser and version details about this.

~~~
codeka
Are you saying if you specify <img width="xx" height="xx"> that the browser
doesn't reserve exactly the right amount of space for that image? Because
that's exactly what it's _supposed_ to do: if it's not, then it's a browser
bug.

------
mistercow
> _They are millions of colors and pixel depth is increasing._

Um, no. Nobody is serving anything more than 8 bpc on a web page, and nobody
has a monitor that could show it to them if they did.

~~~
csense
The real limitation is the number of colors that can be distinguished by the
human eye. I've read that medical scanners and other science-y imaging gear
can sample intensity with 16-bit resolution.

But if you're talking about hardware whose sole purpose is displaying images
for humans to view, having more than 8 bits is just a waste of resources.

~~~
Dylan16807
I might buy 10 bits, but it's pretty easy to distinguish 8-bit brightness
levels.

Edit: Found a source quoting 450 light levels the eye can distinguish, so
adding a bit of buffer for smooth transitions and imperfect gamma you'd need
10 or 12 bits.
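
(The arithmetic behind that, for the curious: 2^9 = 512 just covers 450
levels, and the buffer pushes it higher:)

    import math

    levels = 450  # the quoted figure for distinguishable light levels
    print(math.ceil(math.log2(levels)))  # 9 bits, the bare minimum
    # headroom for smooth gradients and imperfect gamma pushes the
    # practical figure to 10-12 bits per channel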

~~~
liuliu
Also, it depends on the color: for blue, 8 bits may be sufficient, but
definitely not for green.

~~~
Dylan16807
Rods pick up blue just fine, so I don't think you can skimp on any color.

~~~
mistercow
Red is picked up about half as well as green, and blue is discerned about half
as well as red. You can definitely skimp on blue.

~~~
Dylan16807
I strongly disagree. Blue is less bright to rods, that's it. If you _happen_
to be looking at even gray, then sure the fact that blue seems a quarter as
bright means you can chop off two bits. But if you look at pictures that are
blue, without a lot of other colors to wash it out, you're going to have the
exact same 450 distinct brightness levels over the 100:1 contrast ratio that
the eye picks up.

------
lucaspiller
Good article! I'm quite surprised mobile browsers basically ignore this
though. Showing the low quality image when zoomed out, and only the full
quality when zoomed in would avoid the CPU issues.

~~~
jonny_eh
Although zooming in would require a re-rendering of the image, which could
cause lag/jitter.

------
cbsmith
Wow. Everything old is new again...

~~~
kamjam
Exact same thing I was thinking. We used to use this technique when I first
started web dev 12 years ago and had to support dial-up modems. As internet
connection speeds have gotten faster, the newer breed of developers totally
ignored any optimization techniques and page loads have become extremely
bloated. Now with mobile, limited speeds and limited data plans, as you say,
we have gone full circle. Kind of like responsive designs and 100% tables
from yesteryear - ok, so areas didn't reflow like they do now, or areas did
not hide themselves, but the content did scale according to screen width.

~~~
Terretta
Thanks for pointing this out. I don't usually like the phrase "kids these
days", and truth is this has nothing to do with age and everything to do with
experiencing first hand the many interesting ways users' connections can be
borked.

Same holds true in video streaming, where companies got overconfident with
broadband, then are surprised when the limited bandwidth and high latencies
of wireless are better managed by the multi-bitrate and error-correcting
streaming technologies of a decade ago.

~~~
kamjam
Just realised how bad the spelling was on my comment, sheesh, the joys of
typing from my smart phone!

I'm currently working in Canada, having come over from the UK where we are
much more used to cheap unlimited broadband plans, both at home and cellular.
So the caps and prices here have been quite a shock! A friend told me that
NetFlix (or similar) streams at a much lower rate and doesn't offer HD in
Canada because of this. My current broadband plan costs me $60
(+15% tax) for 130GB a month.

One thing it has taught me is you really need to consider the differences of
the local markets!

------
shod
I'm usually eager to jump on board with recommendations from Stoyan’s
performance calendar, but Anne’s description of baseline loading doesn't
comport with my experience in Webkit browsers (and possibly others). They
don't "display instantly after file download", but display almost line by line
as the file is downloading, with a partial image appearing from top to bottom
as information becomes available. Since this particular trick is about
perception -- users being given some visual indication that the image is
loading, and data as soon as possible -- the difference between progressive
and baseline loading seems like it should be subtler than the article
suggests.

~~~
aerobson
I never said baseline jpegs display instantly after file download. They render
as I describe several times, top to bottom or "chop chop chop." It's
progressive jpegs that display instantly after file download IF the browser
does not support progressive rendering of progressive jpegs.

The difference between the rendering of the two file types is not subtle.

------
esharef
Thanks for an interesting article. What's your guess on when other browsers
will make using progressive jpegs easier?

~~~
mmariani
Guesses won't make any difference. File a rdar or open tickets instead ;)

------
btown
Now if only there was a bot that could send pull requests to every GitHub
repository making JPEGs progressive...
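
(The conversion half of that bot is the easy part, reusing the lossless
jpegtran trick mentioned upthread; a sketch over a local checkout, with
"repo/" as a placeholder path:)

    import os
    import subprocess

    # Losslessly rewrite every JPEG in a checkout as progressive.
    for root, _, files in os.walk("repo/"):
        for name in files:
            if name.lower().endswith((".jpg", ".jpeg")):
                path = os.path.join(root, name)
                tmp = path + ".tmp"
                subprocess.run(
                    ["jpegtran", "-progressive", "-copy", "all",
                     "-outfile", tmp, path],
                    check=True,
                )
                os.replace(tmp, path)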

------
stesch
A current blog post but Firefox versions from April and August in the data?

------
justincormack
s/SPDY/mod pagespeed/

