

Real world analysis of google's webp format versus jpg - jjcm
http://englishhard.com/2010/10/01/real-world-analysis-of-googles-webp-versus-jpg/

======
lysium
Unfortunately, webp claims to achieve the same image quality at a smaller file
size, but the 'real world analysis' compares image quality at the same file size.

Further, image quality is measured in terms of mean RGB value difference,
which is a very technical measure that doesn't matter much to the human eye.
For example, in the portrait (the last picture), the JPEG artifacts in the
person's face hurt far more than webp's blurring, yet both have about the same
statistical mean values.

Last, without the original lossless image (at hand), it is hard to tell which
lossy encoder is better. Again with the portrait picture as an example, you
have to download the provided lossless image to see that webp blurred the face
too much.

Still, better than other quick-and-dirty 'analysis' I've seen so far.

~~~
jfager
1\. How else would you construct an objective test? Creating images of the
same quality and then comparing their file sizes seems prohibitively difficult
if not impossible. This test, otoh, implies the same information (minus the
exact difference in file size), but is pretty easy to set up.

2\. Those were the reported numbers, but it's not like that was the only data
point discussed in the article. Every single photo had deeper analysis and
included subjective evaluation. He even directly made the point that you did
about the last photo.

3\. Loading megabytes of photo data into folks' browsers by default doesn't
seem worth the benefit (the page is already sluggish). If you're interested,
the originals are available for download.

~~~
sp332
There's more than one "objective" test for image quality, and some of them are
closer to human perception than PSNR. The trouble with comparisons like this
is that the JPG encoder might be tuned to maximize PSNR, but if the WebP
encoder is tuned to optimize for something else, of course it won't perform as
well when you compare the PSNR!

More info on the issues with this type of benchmark from Jason Garrett-Glaser
(a.k.a. Dark Shikari), an x264 developer:
<http://x264dev.multimedia.cx/?p=458>
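
To make the distinction concrete, here's a toy numpy sketch of the two kinds
of metric (the function names are mine, and `global_ssim` is a single-window
simplification of Wang et al.'s SSIM; real implementations average it over
small sliding windows):

```python
import numpy as np

def psnr(ref, img, max_val=255.0):
    """Peak signal-to-noise ratio in dB: a pure per-pixel error measure."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

def global_ssim(ref, img, max_val=255.0):
    """Simplified single-window SSIM: compares luminance, contrast, and
    structure instead of raw per-pixel error."""
    x, y = ref.astype(np.float64), img.astype(np.float64)
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

# Two distortions with nearly the same per-pixel error score very
# differently on a structural metric: heavy noise leaves the underlying
# gradient intact, while flattening (blurring everything to the mean,
# roughly what over-smoothing does) destroys it.
rng = np.random.default_rng(0)
ref = np.tile(np.linspace(0.0, 255.0, 64), (64, 1))
noisy = ref + rng.normal(0.0, 70.0, ref.shape)  # no clipping; toy demo only
flat = np.full_like(ref, ref.mean())

print(psnr(ref, noisy), global_ssim(ref, noisy))
print(psnr(ref, flat), global_ssim(ref, flat))
```

The point being: an encoder tuned to win on the first number can lose badly
on the second, and vice versa, even though both are "objective".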

~~~
jfager
That's fine, I agree. My point was not "His metric is the most awesome, shut
up", it was "Despite only giving one kind of number, the author still included
other analysis and subjective opinion in the writeup for each photo, so it's
not as bad as you're trying to imply".

~~~
sp332
You asked _How else would you construct an objective test?_ And my answer is,
just make sure you're being fair in measuring the same thing the encoders are
optimising for.

~~~
scott_s
If I only test the optimal circumstances for a new thing, then I have done a
terrible job at trying to understand the consequences of adopting it.

~~~
sp332
_On two occasions I have been asked,—"Pray, Mr. Babbage, if you put into the
machine wrong figures, will the right answers come out?" .... I am not able
rightly to apprehend the kind of confusion of ideas that could provoke such a
question._

\--Babbage (1864), Passages from the Life of a Philosopher, ch. 5 "Difference
Engine No. 1"

You can't reasonably use an encoder optimized for PSNR and then "ding" it for
producing output with bad SSIM, or vice-versa.

~~~
scott_s
I'm not familiar enough with the image encoding algorithms (and the tradeoffs
they are faced with) to comment on that, specifically. But I've done enough
research with systems in general to disagree with your statement. No matter
what I optimize a system for, it's always fair for someone to test it under
adverse conditions. If we're going to adopt something, we need to know its
limitations.

Further, my understanding of Google's intent with webp is that it is being
offered as a replacement to jpeg. In that case, even if it's optimized for one
thing, it's not just fair but necessary to see how it works under all other
things.

------
timb
compare his horrendous 33kb jpg: <http://i.imgur.com/djmBv.png> to the 33kb
jpg i made in photoshop: <http://i.imgur.com/02E6e.jpg>

------
klon
Strange artifacts in the first JPG image; it looks more like a GIF with
reduced colors.

------
bstrong
To my eye, webp reduces compression artifacts at the expense of losing detail
(this is especially visible in the portrait example).

Another way to accomplish a similar effect without requiring a new file format
would be to apply a selective gaussian blur (or another de-artifacting filter)
to highly compressed jpegs before displaying them.

Of course, no one does that because the trade-off is generally not considered
to be a good one.

~~~
chc
Compression artifacts _are_ loss of detail. Applying any kind of blur filter
to an image that had compression artifacts would result in _more_ loss of
detail (and applying it strongly enough to remove the artifacts would pretty
much obliterate any recognizable details).

~~~
bstrong
Actually, while applying a blur does obviously eliminate detail, you can
retain most of the important detail with a selective blur. Give it a try in
gimp and see for yourself. The default selective gaussian settings work pretty
well for de-artifacting, and the loss of detail arguably isn't much worse than
that of webp compression. Some of the commercial de-artifacting tools use
different techniques (I don't know what they are) that result in even less
loss of detail.
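
The idea sketches easily in numpy. This is a rough approximation of what a
selective Gaussian blur does, not GIMP's actual implementation: blur the
image, then accept the blurred value only where it stays within a threshold
of the original, so low-amplitude ringing gets smoothed while strong edges
survive untouched.

```python
import numpy as np

def gaussian_blur(img, sigma=1.5):
    """Separable Gaussian blur of a grayscale float array, reflected edges."""
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    pad = np.pad(img, r, mode="reflect")
    rows = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, rows)

def selective_blur(img, sigma=1.5, max_delta=12.0):
    """Replace a pixel with its blurred value only when that changes it by at
    most max_delta: small artifacts get smoothed away, while real edges
    (where blurring would move pixels a lot) are left alone."""
    img = img.astype(np.float64)
    smoothed = gaussian_blur(img, sigma)
    return np.where(np.abs(smoothed - img) <= max_delta, smoothed, img)

# Toy demo: a hard 0/200 step edge with mild noise around it.
rng = np.random.default_rng(1)
img = np.where(np.arange(64) < 32, 0.0, 200.0) * np.ones((64, 1))
img += rng.normal(0.0, 4.0, img.shape)
out = selective_blur(img)
```

The `max_delta` parameter plays the role of GIMP's "max delta" setting: raise
it and more of the image gets smoothed, lower it and only the faintest
artifacts are touched.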

------
thegir
I wake up and now I have to look at server logs because of this. Way to go
jjcm =(

------
ableal
Page not coming up for me, but over at
<http://lwn.net/Articles/407884/#Comments> there was a link to another less-
than-thrilled appreciation: <http://x264dev.multimedia.cx/?p=541> (by Jason
Garrett-Glaser, aka Dark Shikari, x264 Developer)

------
ergo98
Those who don't learn from history are doomed to repeat it. This is a futile
effort by Google and will go nowhere.

If they had somehow managed to chop file sizes by 90% or more, it would have a
small chance (still not guaranteed: in the era of CDNs, caching, and large
pipes, static images just aren't a big concern). Instead they've marginally
chopped file sizes in only certain scenarios, while adding numerous new
downsides and actually reducing the feature set.

Are they insane? I'm surprised they actually announced this.

Like MP3, JPEG is _good enough_ unless the improvement of a replacement format
is overwhelming. It doesn't have the political baggage of something like h264,
so that argument doesn't apply.

More interesting than this silliness from Google are formats that actually
bring new and impressive features. I recall that JPEG2000 could do
incremental, stallable loading (or maybe I'm thinking of something else), such
that as you scaled an image it wasn't loading an entirely new image, but
instead was loading incrementally more data to provide the detail for that
level. IP stopped it from taking off, but that was actually interesting. This
isn't.

~~~
gojomo
Give it 15 years. If it's free and better, then at some point, all browsers
and tools will have it as an option. If CPU keeps improving faster than
bandwidth, and free codebases keep growing, then an option to create webP on
the fly (or downconvert to jpg for older clients) will become effortless.

In the meantime, if early adopters get a slightly better web experience --
that's a win for Google. They want more marginal pressure to upgrade.

~~~
ergo98
There will be no early adopters.

~~~
gojomo
Not even Chrome users hitting Google's own websites? At Google's scale, the
savings from that alone might pay for the R&D that went into this.

