Hacker News
Compressing Images Using Google’s Guetzli (theodo.fr)
52 points by ClementHannicq on May 23, 2017 | 26 comments

I find this article a bit poor.

It basically gives a few data points about the resulting image size at a few quality settings, from an unknown source image that was already compressed as JPEG with an unknown quality setting.

No examples or screenshots of perceptual quality degradation (which would have been useful for a perceptual codec), beyond the author finding no difference and another person finding very little.

No comparison to a standard JPEG compressor for the same image/quality. I'm curious: how does Guetzli compare to libjpeg? libjpeg-turbo? Whatever Photoshop is using? Especially in size for the same quality on your image.
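For anyone who wants to run that comparison themselves, here's a rough shell sketch. Everything in it is illustrative: the file names are placeholders, quality 90 is arbitrary, and both encoders (guetzli, and cjpeg from libjpeg/mozjpeg) are guarded in case they're not installed. Note cjpeg wants PPM/BMP input, while guetzli takes PNG or JPEG directly.

```shell
# Encode the same source at the same quality setting with both encoders.
# Each step is skipped if the tool isn't on PATH; file names are placeholders.
command -v guetzli >/dev/null && guetzli --quality 90 photo.png photo.guetzli.jpg
command -v cjpeg   >/dev/null && cjpeg -quality 90 -outfile photo.libjpeg.jpg photo.ppm

# Relative saving of file size a over file size b, in percent.
saving() { awk -v a="$1" -v b="$2" 'BEGIN { printf "%.1f%%\n", 100 * (b - a) / b }'; }

if [ -f photo.guetzli.jpg ] && [ -f photo.libjpeg.jpg ]; then
  saving "$(wc -c < photo.guetzli.jpg)" "$(wc -c < photo.libjpeg.jpg)"
fi
```

Comparing sizes at a fixed quality number is itself shaky, since the encoders' quality scales aren't calibrated against each other; the butteraugli-score comparison in the paper is the principled version of this.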

I find the original blog post better (it has links and actual screenshots): https://research.googleblog.com/2017/03/announcing-guetzli-n...

The arXiv paper is much better controlled in how the experiment was run, but has few or no figures beyond showing that it's better: https://arxiv.org/pdf/1703.04416.pdf

EDIT: I find this article more complete: https://www.34sp.com/blog/speed-testing-googles-guetzli-jpeg...

How do you get to the point where you have 30MB of images on the homepage? I could understand 1-5MB if you have several images and want to keep them high-quality enough that users don't notice, but there are many techniques for avoiding a 30MB load to try before compressing the images better...

He actually has 27MB of images, which is even more worrying: that leaves 3MB of who knows what.

I'm sure it's multiple copies of jQuery.

From the article:

> 3MB of those being the CSS/script

3MB of TCP overhead?

Cool stuff indeed. Two things:

> Should be noted that there was not any form of image compression before.

I consider that unlikely, given that they were starting out with JPEG files, which are usually compressed. (Does JPEG even have a non-compressed mode?)

> Being a JPEG encoder, it cannot output PNGs (so no transparency). But it can convert and compress your PNGs.

How does Guetzli compare against PNG compression for the things that PNG is good at, i.e. diagrams and sketches with a small palette? I'd like to see a comparison where the same PNG file is compressed with optipng on one hand, and converted to JPEG and compressed with Guetzli on the other hand, and then they look at file size and the amount of artifacts.

optipng can really squeeze PNGs if the palettes are small. For example, this decorative image from a website I did is 2400×160 pixels, but fits in barely over 1 KiB: https://fsfw-dresden.de/theme/img/banner.png
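Sketching the comparison I'd like to see (file names are placeholders, the pixel count reuses that 2400×160 banner, 84 is Guetzli's minimum quality, and both tools are guarded in case they're not installed):

```shell
# Losslessly squeeze the PNG, vs. converting it to JPEG with Guetzli
# (Guetzli accepts PNG input directly). diagram.png is a placeholder.
command -v optipng >/dev/null && optipng -o7 -out diagram.opt.png diagram.png
command -v guetzli >/dev/null && guetzli --quality 84 diagram.png diagram.jpg

# Bytes per pixel is a fairer metric than raw size when comparing images.
bpp() { awk -v bytes="$1" -v px="$2" 'BEGIN { printf "%.3f\n", bytes / px }'; }

if [ -f diagram.opt.png ] && [ -f diagram.jpg ]; then
  echo "png:  $(bpp "$(wc -c < diagram.opt.png)" $((2400 * 160)))"
  echo "jpeg: $(bpp "$(wc -c < diagram.jpg)" $((2400 * 160)))"
fi
```

The size numbers only tell half the story, of course — for flat-color diagrams you'd also have to eyeball the JPEG for ringing around the edges, which PNG never has.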

Yes, JPEG has a lossless mode.

The answer is "no", because lossless compression is still compression. But there's no chance that a website was using lossless JPEG, since it's very poorly supported.

There are no standard lossless JPEGs. Even a quality of 100% will introduce loss. Only JPEG 2000 is lossless, but it's unsupported on the web because of patent claims.

I was referring to https://en.wikipedia.org/wiki/Lossless_JPEG which is not compatible with regular JPEG.

JPEG 2000 is from 2000, hence the name. Shouldn't that thing be coming off patent soon?

Does anyone know which patents cover JPEG2000 and their expiry dates?

The best thing about using ImageOptim[1] is that it keeps up with all these latest encoders and uses the best one. And it Just Works™ on a folder of images.

And it looks like their latest beta includes Guetzli support.

1: https://imageoptim.com/mac

Guetzli is great. Here's a better comparison (same filesize) done by one of the authors [1].

I'm working on a more practical tool [2] which produces comparable results [3].

[1] https://drive.google.com/drive/u/1/folders/0B0w_eoSgaBLXQk1V....

[2] http://getoptimage.com

[3] http://i.imgur.com/qmIwJGw.jpg

I’m still hoping someone will compare Guetzli to mozjpeg.

Also jpegmini and kraken.io

Anecdotally, jpegmini's "optically lossless" algorithm seems to provide better results than the other options, but obviously YMMV.

I'm super interested in this because I want to optimize on the order of 20k photos a day (don't ask :() and the pay-as-you-go options all work out relatively expensive at that kind of throughput, while mozjpeg is generally only saving us between 5% and 7% on the most-served images.
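At that volume a self-hosted batch pass is cheap to sketch. The script below is illustrative, not production code: the photos/ directory and the 5% threshold are my assumptions, it re-encodes lossily through libjpeg/mozjpeg's djpeg | cjpeg pipeline, and it only keeps the re-encode when it actually wins.

```shell
# Keep the optimized copy only when it is at least 5% smaller than the original.
keep_if_smaller() { awk -v new="$1" -v old="$2" 'BEGIN { exit !(new < old * 0.95) }'; }

optimize_one() {
  src="$1"; tmp="${src%.jpg}.opt.jpg"
  # Decode to PPM, re-encode at quality 85 with optimized Huffman tables.
  djpeg "$src" | cjpeg -quality 85 -optimize -outfile "$tmp" || return
  if keep_if_smaller "$(wc -c < "$tmp")" "$(wc -c < "$src")"; then
    mv "$tmp" "$src"
  else
    rm -f "$tmp"
  fi
}

# Re-encode every JPEG under photos/ (guarded if the tools aren't installed).
if command -v djpeg >/dev/null && command -v cjpeg >/dev/null; then
  for f in photos/*.jpg; do
    if [ -e "$f" ]; then optimize_one "$f"; fi
  done
fi
```

With xargs -P or GNU parallel the loop spreads across cores, which matters more than encoder choice at 20k/day — Guetzli in particular reportedly takes on the order of a minute of CPU per megapixel.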

27MB of images on a webpage is inexcusable! You don't need better compression. You need to reconsider what you are doing with your life (and your poor users' data allowance). Visiting your site once would use up about 1% of an AT&T 4G user's monthly data!

How do you justify this?

> How do you justify this?

Our analytics indicate people like "stuffed" sites more. The data clearly shows that the bounce rate is much higher for people with outdated hardware and software; conversely, the conversion rate is much higher for people downloading our multi-megabyte scripts and styles. Clearly, this means we should not waste time optimizing the site for people with stale hardware/software, and focus on delivering a more exciting experience for those whose machines can process it.


I can possibly imagine how you get to 27MB of images if you have lots of them, all at very high resolution for some reason and with no compression, but 3MB of CSS and JavaScript!?

That's unheard of.

I think the problem here is gross incompetence more than compression tools.

This compression is very, very good, and the price is right.


30MB of homepage images — this guy's bounce rate has got to be off the charts.

Besides all the compression issues, I wonder if Google gives you extra ranking points for using this tool. At least using Guetzli for the image compression of a customer's online shop gave us one extra point in Google's PageSpeed. I do hope we get an extra ranking point for using a Google tool, which these days is the most interesting part, since with LTE and other standards "size does not matter" that much anymore :)

Why don't browsers implement HEVC- or AV1-based image formats? They are so much better than JPEG. https://wyohknott.github.io/image-formats-comparison/

jpeg-archive (https://github.com/danielgtaylor/jpeg-archive) may be the better choice for images that you don't intend to have downloaded a billion times.

I tried jpeg-recompress -c in.jpg out.jpg (-c skips output that is bigger than the original) and got visually good results, lower file sizes, and quick processing. The file size will ultimately be larger than Guetzli's, but not by that much, and it doesn't take an extremely long time to compress one image.

Nothing against the article strictly speaking, but can we stop with the "subtle" innuendos to get clicks already? We have enough sexist crap going on in the industry as is.

I miss bytesizematters.com ( http://lea.verou.me/2009/07/bytesize-matters/ ), made by a true dude bro in more innocent times.

what kind of stuff?
