It basically gives a few data points on resulting image size at a few quality settings, starting from an unknown source image that was already JPEG-compressed at an unknown quality setting.
No examples of perceptual quality degradation and no screenshots (which would have been useful for a perceptual codec), besides the author finding no difference and another person finding very little.
No comparison to a standard JPEG compressor for the same image / quality. I'm curious: how does Guetzli compare to libjpeg? libjpeg-turbo? whatever Photoshop is using? Especially in size at the same quality on your image.
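Something along these lines would show it (a rough sketch; the filenames are placeholders, and it assumes libjpeg's djpeg/cjpeg plus the guetzli CLI, whose --quality flag has a minimum of 84 and whose scale isn't strictly comparable to libjpeg's):

    # decode once so both encoders start from the same pixels (guetzli reads JPEG/PNG directly)
    djpeg -outfile photo.ppm photo.jpg
    cjpeg -quality 90 -outfile photo-libjpeg.jpg photo.ppm
    guetzli --quality 90 photo.jpg photo-guetzli.jpg
    ls -l photo-libjpeg.jpg photo-guetzli.jpg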
I find the original blog post better (it has links and actual screenshots): https://research.googleblog.com/2017/03/announcing-guetzli-n...
The arXiv paper is much better controlled in how the experiment was run, but has few figures beyond the claim that it's better: https://arxiv.org/pdf/1703.04416.pdf
EDIT: I find this article more complete: https://www.34sp.com/blog/speed-testing-googles-guetzli-jpeg...
> 3MB of those being the CSS/script
> Should be noted that there was not any form of image compression before.
I consider that unlikely, given that they were starting out with JPEG files, which are usually compressed. (Does JPEG even have a non-compressed mode?)
> Being a JPEG encoder, it cannot output PNGs (so no transparency). But it can convert and compress your PNGs.
How does Guetzli compare against PNG compression for the things that PNG is good at, i.e. diagrams and sketches with a small palette? I'd like to see a comparison where the same PNG file is compressed with optipng on one hand, and converted to JPEG and compressed with Guetzli on the other, and then file size and the amount of artifacts are compared.
optipng can really squeeze PNGs if the palette is small. For example, this decorative image from a website I did is 2400x160 pixels, but fits in barely over 1 KiB: https://fsfw-dresden.de/theme/img/banner.png
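For what it's worth, a rough sketch of that comparison (filenames are placeholders; it assumes optipng and the guetzli CLI, whose --quality flag bottoms out at 84):

    # same small-palette source image, two pipelines
    optipng -o7 -out diagram-optipng.png diagram.png
    guetzli --quality 84 diagram.png diagram-guetzli.jpg
    ls -l diagram-optipng.png diagram-guetzli.jpg   # then eyeball both for artifacts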
Does anyone know which patents cover JPEG2000 and their expiry dates?
And it looks like their latest beta includes Guetzli support.
I'm working on a more practical tool which produces comparable results.
Anecdotally, jpegmini's 'optically-lossless' algorithm seems to provide better results than the other options, but obviously ymmv.
I'm super interested in this because I want to optimize in the region of 20k photos a day (don't ask :() and the PAYG options all work out relatively expensive with that kind of throughput, whilst mozjpeg is generally only saving us between 5% and 7% on the most-served images.
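For reference, a minimal sketch of what local batch re-encoding could look like at that volume (directory names are hypothetical; it assumes mozjpeg's djpeg/cjpeg on the PATH and GNU parallel):

    # decode each JPEG and re-encode it with mozjpeg, one job per core
    mkdir -p out
    parallel 'djpeg {} | cjpeg -quality 80 -outfile out/{/}' ::: in/*.jpg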
How do you justify this?
Our analytics indicate people like "stuffed" sites more. The data clearly shows that the bounce rate is much higher for people with outdated hardware and software; conversely, the conversion rate is much higher for people downloading our multi-megabyte scripts and styles. Clearly, this means we should not waste time optimizing the site for people with stale hardware/software, and should focus on delivering a more exciting experience for those whose machines can process it.
That's unheard of.
I think the problem here is gross incompetence more than compression tools.
30MB for homepage images? This guy's bounce rate has got to be off the charts.
I tried jpeg-compress -c in.jpg out.jpg (-c skips output that is bigger than the original) and got visually good results, lower file sizes, and quick processing. The file size will ultimately be larger than Guetzli's, but not by that much, and it doesn't take an extremely long time to compress one image.