Most of the compression benefit can be obtained by setting the "optimized Huffman" flag on your compressor: a baseline JPEG will typically save 10-20% when "optimized", and progressive very rarely achieves a double-digit win on top of that.
Mobile Safari eats cycles every time you inject a large image into the DOM, so (while I haven't benchmarked progressive) it seems like this new-found love of progressive JPEG is beating up one of the slower code paths in the mobile browsers. And to me, it doesn't look that good!
Progressive JPEGs do not necessarily need to use more RAM. The FAQ I linked to also says: "If the data arrives quickly, a progressive-JPEG decoder can adapt by skipping some display passes." Win!
Also, why do you say "up to 3x" more CPU? Is that an estimate based on how many scans you're guessing a progressive JPEG has? A progressive JPEG can have a variable number of scans -- we used to be able to set that number, which is totally cool!
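For the curious, you can count the scans yourself. A minimal stdlib-only sketch (not a robust parser): each scan begins with an SOS marker (0xFF 0xDA), and JPEG's byte-stuffing rule keeps that two-byte sequence out of entropy-coded data, so a plain substring count works in practice.

```python
def count_scans(jpeg_bytes: bytes) -> int:
    # Each scan starts with an SOS marker (0xFF 0xDA). Byte stuffing
    # (a literal 0xFF inside entropy-coded data is always followed by
    # 0x00 or an RST marker 0xD0-0xD7) means this sequence only appears
    # at real scan boundaries, so a substring count is a fair estimate.
    # A baseline JPEG has 1 scan; a progressive one typically has
    # several, depending on the scan script the encoder used.
    return jpeg_bytes.count(b"\xff\xda")
```

A robust tool would walk the marker segments by their declared lengths instead of substring-matching, but for a quick look at your own files this shortcut is usually accurate.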
As for the compression benefits, you say "progressive very rarely achieves a double-digit win after that." We web performance geeks LOVE single-digit wins, so you can't burst our bubble that way.
Yes, Mobile Safari has trouble with images. Period. But Mobile Safari does not progressively render progressive jpegs (I wish it did). So we can make it a best practice without worrying about Mobile Safari. When the web is full of progressive jpegs, Apple will have to deal with them. It's not an evil plan, it's the right thing to do.
When you say it doesn't look that good, are you saying that for yourself personally, or are you saying it for your users? We need to think about what they see. As I say, perceived speed is more important than actual speed, and the thing that excites me most about progressive JPEGs is not the file size savings, but instead the behavior of the file type in browsers that properly support it.
If you don't display the coarse scans (and if you're comparing progressive vs. baseline CPU usage, counting those isn't fair), then about the only additional CPU time progressive should take is the cost of the extra cache misses. It's probably a wash, considering that decoding fewer coefficients / less Huffman data takes less CPU.
Maybe I'll get some numbers; I'm curious now... But unless you're serving multi-megapixel images, the additional CPU and memory don't matter. Probably not even until you're into double-digit megapixels, if then.
Written from a ten-year-old notebook: progressive JPEGs are slower for me no matter which browser. Without Chrome, on the newest computer? Still slower. Mobile devices: the same story.
If the author never leaves his desk, has a new computer, and likes Chrome, good for him. Others shouldn't trust him too much.
When images arrive, they come tripping onto the page, pushing other elements around and triggering a clumsy repaint.
Unless width="" and height="" have been deprecated on <img> overnight.
And because it allows browsers to make room for the image and avoid reflowing.
their Mod_Pagespeed service
SPDY does as well, translating JPEGs that are over 10K to progressive by default.
(I work at Google on ngx_pagespeed.)
At first I thought: "What, you're opening JPEGs and saving them again? Don't you lose image quality every time you open and save in a lossy format?"
But then I read the actual source, and it says that `jpegtran` can open baseline JPEGs and save them losslessly as progressive JPEGs. That sounds useful!
Does anyone know whether other image editing software does the same thing to JPEGs that are re-saved without modification? What about Photoshop? GIMP? MS Paint?
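One way to answer that empirically (a stdlib-only sketch, assuming you have the saved file's bytes in hand): the frame-header marker distinguishes the two encodings -- SOF0 (0xFF 0xC0) for baseline vs. SOF2 (0xFF 0xC2) for progressive -- so you can sniff whatever each editor actually wrote out.

```python
def is_progressive_jpeg(data: bytes) -> bool:
    """Walk the marker segments of a JPEG (given as raw bytes) and report
    whether its frame header is SOF2 (progressive) rather than
    SOF0/SOF1 (baseline / extended sequential)."""
    i = 2  # skip the SOI marker (0xFF 0xD8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break                     # lost sync; give up
        marker = data[i + 1]
        if marker == 0xFF:            # fill byte; resync
            i += 1
            continue
        if marker == 0xC2:            # SOF2: progressive DCT
            return True
        if marker in (0xC0, 0xC1):    # SOF0/SOF1: sequential frame
            return False
        if marker == 0xDA:            # SOS: entropy-coded data follows
            break
        # every other header segment carries a 2-byte big-endian length
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        if seg_len < 2:
            break                     # malformed segment
        i += 2 + seg_len
    return False
```

Run it over the same image re-saved by each program; arithmetic (SOF9-SOF11) and hierarchical variants would need extra cases, but those are vanishingly rare on the web.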
Progressively loading photos in that manner is not a good user experience either. The first paint is often quite jarring, especially since, as pointed out in the article, over 90% of photos simply don't load this way.
For content such as photos that sit within some sort of layout, it would be better to have a placeholder with the same dimensions as the final image, then have the final image appear upon completion.
To me it's still better than nothing. Just like I don't enjoy websites that only paint after they're done loading.
Just use the title link as a reference. First you see chicken wings, then all of a sudden 6 piglets.
The transition paint is really meaningless.
For me, yes. It is a loading indicator, without any additional "noise".
Also, the second-to-last pass is often quite good, if not indistinguishable from the last one. So you already have the full image while more details get filled in, instead of seeing first the top, then the middle, then the bottom -- and to me that's less "jarring".
When you say
it would be better to have a placeholder there that is the same frame as the final image size, then have the final image appear upon completion
isn't that exactly the quality progressive JPEGs give you?
The way I'd approach photos is this: no image should take more than 3 seconds to load, and anything between 1-3 seconds gets a small progress bar (mobile) or just an empty frame with a thin border (mobile thumbnails or any web photos).
I'd take into account device, connection speed, CDNs, jpeg compression to ensure that I meet the time requirement for the full image to load.
If the full image isn't consistently loading within that timeframe, I've already lost and need to rethink the quality of the images being delivered or if I'm designing the right app / site, because it's going to be a terrible user experience either way, progressive jpeg or not.
I call BS. On most every web project I've been involved in, we've used progressive JPEGs for the size optimization. This was crucial in the dial-up modem days, when we were inventing how web sites should best serve users, and we still do it. Even in browsers not supporting progressive rendering, perceived completion was faster, since the smaller files loaded faster. Switching from baseline to progressive consistently drove higher page views and longer times on site.
I have never heard a single client say a single user complained about progressive JPEGs. I'm not saying it hasn't happened somewhere to someone. Users will complain about anything. But in billions of page views across countless clients (including pro photo clients), I haven't run into user complaints from progressive JPEGs, only measurable page view and time on site improvements in user behavior.
Essentially the same effect, with some back-end code. I suppose we'd need a benchmark test to find the real numbers. The main advantage would be sticking with existing file types already in use.
(I'm sure someone has already implemented something better than this, but I am lazy.)
Your browser stakes out the image area if you set the height and width attributes on the image, thus avoiding reflows.
Also, the bracketed comment makes it sound like this feature doesn't work well?
Um, no. Nobody is serving anything more than 8 bpc on a web page, and nobody has a monitor that could show it to them if they did.
But if you're talking about hardware whose sole purpose is displaying images for humans to view, having more than 8 bits is just a waste of resources.
Edit: Found a source quoting 450 light levels the eye can distinguish, so adding a bit of buffer for smooth transitions and imperfect gamma, you'd need 10 or 12 bits.
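Checking the arithmetic behind that (taking the quoted 450-level figure at face value): 2^8 = 256 falls short of 450, while 2^9 = 512 covers it, so 9 bits is the bare minimum before any headroom.

```python
import math

levels = 450  # distinguishable light levels quoted in the comment above
bits_needed = math.ceil(math.log2(levels))  # smallest bit depth covering 450 levels
print(bits_needed)  # -> 9
# Headroom for smooth gradients and imperfect gamma is why the estimate
# lands on 10-12 bits rather than the bare minimum of 9.
```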
I'm scanning slides today with a Nikon Coolscan 5000ED, and it's a full 16 bits per channel of R, G, and B, and 16 bits of infrared (48 bits + 16 bits). The "raw" TIFFs are 182 MB, while the final JPEGs range between 2.2 and 6.9 MB.
The same holds true in video streaming, where companies got overconfident with broadband and were then surprised when the limited bandwidth and high latencies of wireless were better managed by the multi-bitrate, error-correcting streaming technologies of a decade ago.
I'm currently working in Canada, having come over from the UK, where we're much more used to cheap unlimited broadband plans, both at home and cellular. So the caps and prices here have been quite a shock! A friend told me that Netflix (or similar) streams at a much lower rate in Canada and doesn't offer HD there because of this. My current broadband plan costs me $60 (+15% tax) for 130 GB a month.
One thing it has taught me is you really need to consider the differences of the local markets!
The difference between the rendering of the two file types is not subtle.