
Faster Images using HTTP2 and Progressive JPEGs - ilarum
http://calendar.perfplanet.com/2016/even-faster-images-using-http2-and-progressive-jpegs/
======
philbo
It's worth balancing this against other research showing that users often
don't like the experience provided by progressive image rendering:

[http://www.webperformancetoday.com/2014/09/17/progressive-im...](http://www.webperformancetoday.com/2014/09/17/progressive-image-rendering-good-evil/)

~~~
rebuilder
Sample size of one, but on mobile, my number one pet peeve (apart from popover
ads and autoplay videos!) is layouts that change as the page loads.
Progressive encoding with multiplexing should help make sure newly loaded
images don't make article text jump around as I try to read it.

Then again, maybe this is a laziness issue since it seems to me defining a
layout independent of actual image content is something that you could do in
the 90's.

~~~
rimantas
Specifying image dimensions is enough to prevent any jumps.
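
For example (a minimal sketch; the filename is made up):

```html
<!-- The browser reserves a 600x400 box before any image bytes arrive,
     so surrounding text does not reflow when the image loads. -->
<img src="photo.jpg" width="600" height="400" alt="Example photo">
```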

~~~
eknkc
On mobile, we mostly size images with percentages because display sizes vary.
It gets ridiculous when you want to specify the dimensions, because you can't:
we need a way to specify aspect ratios, and CSS does not provide one.

We went with a padding-top + absolute-position-in-wrapper hack in the end.
That jumpiness is partly a tooling problem.
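
A minimal sketch of that hack (class names made up): padding percentages
resolve against the element's *width*, so padding-top reserves a box with the
image's aspect ratio before any bytes arrive.

```html
<style>
  /* Reserve a 16:9 box: 9 / 16 = 56.25% of the width. */
  .ratio-box { position: relative; width: 100%; padding-top: 56.25%; }
  /* Stretch the image to fill the reserved box. */
  .ratio-box img {
    position: absolute; top: 0; left: 0;
    width: 100%; height: 100%;
  }
</style>
<div class="ratio-box">
  <img src="hero.jpg" alt="Hero image">
</div>
```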

~~~
kuon
I know it's a hack, but what is wrong with it? I've always used it and it
works.

I haven't looked in detail at the new srcset and sizes attributes of <img>,
but won't those help as well?

~~~
woof
<picture> with multiple srcset makes it possible to serve images that are
displayed at 100% width, filling their (DOM) container.

Check out
[http://scottjehl.github.io/picturefill/](http://scottjehl.github.io/picturefill/)
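
A sketch of the pattern (filenames and breakpoints are made up):

```html
<picture>
  <source media="(min-width: 800px)" srcset="large.jpg 1x, large@2x.jpg 2x">
  <source media="(min-width: 400px)" srcset="medium.jpg 1x, medium@2x.jpg 2x">
  <!-- Fallback for browsers without <picture> support -->
  <img src="small.jpg" srcset="small@2x.jpg 2x" alt="Example photo"
       style="width: 100%">
</picture>
```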

------
millstone
> it is possible to flag individual scan layers of progressive JPEGs with high
> priority and making the server push those scan layers into the client
> browsers’ Push cache even before the request for the respective image is
> initiated

Whoa - I had no idea this is possible. Isn't this a crazy layering violation
(why should HTTP2 know about progressive JPEGs)? The links don't seem to
provide any more information about it.

 _edit_ It looks like HTTP2 only talks about streams. So it's too strong to
say that you can flag "individual scan layers with high priority." You can't
change the order of scan layers within a JPEG file, or send the file in
anything except its natural byte order. So it seems like this has the same
limitations as HTTP 1.x.

~~~
the8472
It's only a violation if you don't have a clearly structured API to provide
the necessary hints from the application to the HTTP server.

~~~
millstone
How can a layering violation in a protocol be fixed at the API level? An API
is what it is; a good API atop a gross protocol is only lipstick on a pig.

~~~
mnarayan01
I think he's saying that you could potentially have the prioritized output
without "HTTP2 know[ing] about progressive JPEGs". E.g. specifying "these
layers are high priority" versus "the first _n_ bytes of this stream are high
priority": The first would be a layering violation, but I don't see any reason
the second would be (not saying anything about its practicality). Now
obviously the protocol would need to support the second (at least to use it in
the context we're talking about), but I read the GP as using "API"
figuratively enough to encompass this.

~~~
the8472
Yes.

And would pushing chunks actually require a protocol modification? AIUI http2
push mirrors the hypothetical request headers. So it could essentially push a
range-request which the browser could use to partially populate its cache.
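
Sketched out, the hypothetical exchange might look like this (entirely
speculative; header layout illustrative, byte offsets made up):

```
PUSH_PROMISE (the request the server answers on the client's behalf):
  :method: GET
  :path: /hero.jpg
  range: bytes=0-4095        <- only the first scan layers

Pushed response:
  :status: 206
  content-range: bytes 0-4095/48213
  content-type: image/jpeg
```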

------
vladdanilov
> The best way to counter negative effects of loading image assets is image
> compression: using tools such as Kornel Lesiński‘s ImageOptim, which
> utilizes great libraries like mozjpeg and pngquant, we can reduce image byte
> size without sacrificing visual quality.

The standard metrics these tools provide (JPEG quality or PSNR) are not enough
to preserve visual quality across a variety of images. I'm working on a
project to actually do that [1].

My favorite example is the actual Google logo [2]. It's 13,504 bytes and can
be cut almost in half, to 7,296 bytes, probably saving gigabytes of bandwidth
each day.

[1] [http://getoptimage.com](http://getoptimage.com)

[2] [https://www.google.ru/images/branding/googlelogo/2x/googlelo...](https://www.google.ru/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png)

~~~
Klathmon
Image compression is a great first step. Imagemin is a great JavaScript-based
tool that lets you incorporate a bunch of optimization tools (mozjpeg,
pngquant, jpegtran, optipng, gifsicle, etc.) all in one (if your project can
easily use JavaScript modules...). And there is a plugin [0] (that I wrote)
for webpack that makes it happen without any thought.

Another potentially massive step is to use the srcset [1] attribute of the
<img> tag. It lets you provide a bunch of different resolutions for the same
image, and the browser will choose the best one to download and render based
on the physical screen pixel density, zoom level, and possibly in the future
even things like bandwidth preferences or battery level.

Combine the imagemin plugin with a webpack-loader [2] that will auto-generate
5-ish different downscaled versions of an image as srcset, and you get a
pretty perfect setup.

My web apps now always use the highest-resolution image I have available by
default (within reason; I do cut it down to a realistic value), then
automatically provide 5 downscaled versions alongside it in the srcset, all
run through a battery of optimizations to compress them as well as possible.
The browser then downloads only the biggest one it can realistically use.
Everyone gets high-quality images, and nobody wastes bandwidth just because
higher-res screens are supported.

[0] [https://github.com/Klathmon/imagemin-webpack-plugin](https://github.com/Klathmon/imagemin-webpack-plugin)

[1] [https://css-tricks.com/responsive-images-youre-just-changing...](https://css-tricks.com/responsive-images-youre-just-changing-resolutions-use-srcset/)

[2] [https://github.com/herrstucki/responsive-loader](https://github.com/herrstucki/responsive-loader)
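
The generated markup ends up looking roughly like this (filenames, widths,
and the sizes value are made up; responsive-loader emits something
equivalent):

```html
<img
  src="photo-1600.jpg"
  srcset="photo-320.jpg 320w, photo-640.jpg 640w, photo-960.jpg 960w,
          photo-1280.jpg 1280w, photo-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 600px"
  alt="Example photo">
```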

~~~
vladdanilov
Agreed. Serving the right size is just as important.

------
the8472
I never liked progressive images. A plain top-to-bottom load is far less
tantalizing than having to squint at the pixelation to tell whether you're
really looking at the final product.

~~~
angry-hacker
Me too, but choosing between page jumping and your scenario, I'll take
pixelated. Often I don't even look at those images; all I notice is the page
jumping.

~~~
the8472
Why would the page jump? The rectangular size of the image is encoded in the
header either way. The layout engine will already reserve the necessary space
as soon as the first few bytes of the image are loaded, progressive or not.

~~~
angry-hacker
It's possible I was mistaken. Thank you for pointing it out, I need to
research it a bit more.

------
dgreensp
Progressive JPEGs are actually a worse experience on browsers that don't
render them progressively, like Safari (including iOS). This is why many major
sites, such as Flickr, don't use progressive JPEGs (last time I checked, which
was a few years ago).

See: [http://calendar.perfplanet.com/2012/progressive-jpegs-a-new-...](http://calendar.perfplanet.com/2012/progressive-jpegs-a-new-best-practice/)

~~~
simonlc
Flickr does use progressive JPEGs now. IMO, I hate seeing the blurry image
first.

------
ge96
I'm curious: this would probably be better than using an image preloader or
loading icon, as far as delay or even blank spots/placeholders go.

I think I've seen this before, but what I've used is an overlay loading GIF
shown over the image while it loaded; once the image finished loading, the
GIF would be hidden.

~~~
nothrabannosir
I'm always happy when sites don't mess with <img> like this, since it
completely breaks down for non-JS browsers. :/ There are ways to do this
without breaking, but it's not always done. Apparently.

~~~
ge96
Oh man, that's one of those things about the web that sucks: you've got to
factor it in. I don't know the percentage, but say, on average, a basic site
covers 70% of users. Then you have to factor in blind users, non-JavaScript
users, Internet Explorer... Ahhh.

It bothers me because I know it's something to address.

So regarding non-JavaScript: most interfaces are built with JavaScript, so
what percentage of users are you addressing?

The non-JavaScript case has only come up for me personally with Tor, and I
don't know/use Tor much.

I don't ever put <noscript> alerts in. Ahhh, I should though.

I use AdBlock too; not sure if that affects the desired end result for
non-JavaScript users.

~~~
nothrabannosir
I was talking about myself, in this case. Quite selfish, unlike your comment
:)

I use noscript and disabled JS on iOS Safari, for battery life and security.
Not a fanatic noscript user at all, it just seems slightly less bad than full
js at this point.

Graceful degradation is a better choice than <noscript> tags, where possible.
Which, for <img>, it is.

thanks for considering the minority, even if it doesn't pan out :)

~~~
ge96
I don't know if accessibility is a futile battle. I guess if you're a big
enough company/group you'd have experts in that area.

Hmm, yeah, I guess I'm not as up to date with mobile (the stuff you
mentioned).

------
kodfodrasz
If only a side-by-side video of the page rendering with the two methods were
available.

I like progressive images and the whole idea, but a catchy video of a page
re-laying-out multiple times as images arrive over traditional HTTP + simple
JPEG would be the most convincing.

------
79d697i6fdif
I think at this point a better solution is to use a polyfill for WebP.

The polyfill is pretty light because the underlying image format is just a
single frame of VP8. There shouldn't be much of a performance hit, since it
will just render as a single frame of video on polyfilled browsers.

~~~
Eric_WVGG
I was going to say the same thing, except "polyfill for JPEG-2000"

~~~
79d697i6fdif
Nah, JPEG 2000 isn't quite as nice, because some browsers can't natively
parse it :). WebP support literally already exists in all mainstream
browsers; it's just a single frame of VP8. And it's still around half the
size of a regular JPEG.

Not sure why we're downvoted here; either of these is clearly a better option
for increasing image download speed.

