
Essential Image Optimization - mmazzarolo
https://images.guide/
======
Klathmon
A bit of "self-promotion" here, but I was annoyed with optimizing images
manually, so I wrote a plugin for webpack that does it for me.

It allows you to use any imagemin plugins you want, which have wrappers for
most image compressors out there. And since it runs on every production build,
you can commit your unoptimized images to your repo.

[https://github.com/Klathmon/imagemin-webpack-plugin](https://github.com/Klathmon/imagemin-webpack-plugin)

~~~
spookyuser
Hey thanks, I used this recently with responsive-loader and it worked great,
although responsive-loader was a little finicky to get working!

~~~
Klathmon
That's actually why I created it in the first place! I no longer use
responsive-loader (fully offline web application, so responsive images just
weren't worth the extra complexity at that point), but it does work
wonderfully once you wrangle it under control.

~~~
spookyuser
Yeah, it does. But you're totally right about the complexity. Figuring out the
right sizes to generate the srcset at felt insanely complicated for what you
get out of it in data savings, but images were so blurry in Chrome when
downscaling them even a little bit that srcset almost seemed mandatory :(

------
mikerg87
I appreciate that using WebP has its advantages, but when I look at caniuse,
its support isn't as widespread as the author suggests.

[https://caniuse.com/#search=WebP](https://caniuse.com/#search=WebP)

~~~
Theodores
Yes, so it is much better to use Google's PageSpeed module for Nginx or Apache
to handle this aspect. That way the compression is abstracted away and you
don't have to worry about it. PageSpeed will know that someone is using an old
version of Safari rather than Chrome and will send the correct type of image,
i.e. one that will render.
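
For reference, a minimal sketch of what that looks like in an nginx config with ngx_pagespeed compiled in (the directive and filter names are from the PageSpeed docs; the cache path here is just an example and must be writable by nginx):

```nginx
# Enable PageSpeed and give it a disk cache for rewritten resources.
pagespeed on;
pagespeed FileCachePath /var/cache/ngx_pagespeed;  # example path

# rewrite_images recompresses/resizes images; convert_jpeg_to_webp serves
# WebP only to clients that advertise support, falling back otherwise.
pagespeed EnableFilters rewrite_images,convert_jpeg_to_webp;
```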

------
olegkikin
I have a mini benchmark of lossless PNG compressors, in case anyone finds it
useful.

[https://www.olegkikin.com/png_optimizers/](https://www.olegkikin.com/png_optimizers/)

~~~
gravypod
It would be really cool to get something like this that is run against all
formats and their specific optimizers. I'd like to see how MozJPEG stands up
against the best PNG optimizers, or how everything compares to WebP (which in
my experience is the smallest).

~~~
olegkikin
It's hard to objectively compare lossy encoders. Yes, there are various
metrics, but you will get very different results depending on which metric is
chosen.

------
Splines
Here's an open question that I'm having a hard time finding an answer for:

I have a series of images that are basically screenshots of a visual novel
(pretty much a comic book). Most of the space of the screenshots are identical
from image to image, and I'd like to compress these files together to save
space. My first thought is to use some kind of video compression format to do
this, but I'm wondering how to handle playback (currently the windows built in
image viewer is used).

Backstory: My wife enjoys reading VNs on her phone and screenshots them so she
can re-read them on the PC. I've automated most of the capturing of
screenshots via some adb scripting and a WPF app that talks to the phone, but
the resulting files are quite large and I can tell that I could be smarter
with how the resulting files are stored.

~~~
klodolph
Let's say you already have image X saved, and you want to also store image Y,
which is mostly similar. Take all of the pixels which are the same in both X
and Y and set them to RGBA (0,0,0,0). Instead of using the Windows image
viewer, write a web viewer in JavaScript. The viewer takes image X and image
Y, composites them in a Canvas, converts that into an image, and then displays
the result.

Compositing in a Canvas first will prevent you from getting artifacts from
scaling the images, though the artifacts may not be particularly visible
anyway.

If you are comfortable with NumPy this whole process should be fairly easy and
painless. I have done something similar in the past, but it was for making
animated GIFs.

If you are particularly adventurous, you could remove large chunks of the VN
foreground by taking the per-pixel median value from a large stack of images
that share a background. You could then use this as a base image.
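
To make the idea concrete, here's a rough NumPy sketch of all three steps (the function names are mine, and it works on raw RGB arrays rather than files; in practice you'd load and save the images with something like Pillow):

```python
import numpy as np

def diff_frame(base, frame):
    """Turn frame (H,W,3 RGB) into an RGBA delta against base:
    pixels identical to base become fully transparent (0,0,0,0)."""
    alpha = np.full(frame.shape[:2], 255, np.uint8)
    out = np.dstack([frame, alpha])            # add an opaque alpha channel
    same = np.all(frame == base, axis=-1)      # mask of unchanged pixels
    out[same] = 0                              # zero out RGBA where unchanged
    return out

def composite(base, delta):
    """Reconstruct the original frame: opaque delta pixels win over base."""
    result = base.copy()
    opaque = delta[..., 3] > 0
    result[opaque] = delta[opaque, :3]
    return result

def median_background(frames):
    """Per-pixel median over a stack of frames sharing a background,
    usable as the base image X."""
    return np.median(np.stack(frames), axis=0).astype(np.uint8)
```

The transparent deltas compress very well as PNGs because the zeroed regions are huge runs of identical bytes, and `composite` is exactly what the Canvas-based viewer would do in JavaScript.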

~~~
nitrogen
This is basically how GIF works. If you have a GIF viewer that can pause and
advance frame by frame, you could use GIF without the canvas needs.

~~~
solidr53
But the compression in GIF is so terrible that it will probably be way bigger
than individual JPEG or WebP files. I would say go with WebM and a mobile
frame-by-frame player; there are dozens.

------
trishume
Shameless plug for my tool for automatically taking in a static site and
spitting out one that uses responsive optimized images:
[https://github.com/trishume/enfasten](https://github.com/trishume/enfasten)

Unlike all the non-custom Gulp/webpack/JS-based tools I know of, it does
incremental builds, so it only resizes/optimizes images that you change or
add and won't substantially slow down your site build times. It also rewrites
the output of your existing static site generator, so you don't have to do
much to integrate it.

------
vladdanilov
Images in the guide can be further compressed by 7% (lossless) and 25% (lossy)
using Optimage [1], and even these results can be improved. There are many
nuances in image optimization, and sadly, some of them are not in the guide
[2].

[1] [http://getoptimage.com](http://getoptimage.com)

[2] [https://github.com/GoogleChrome/essential-image-optimization...](https://github.com/GoogleChrome/essential-image-optimization/issues/55)

------
fold_left
I have some benchmarks of image optimisation tools here at
[https://foldleft.io/image-tools/](https://foldleft.io/image-tools/)

------
nayuki
Good article, and very comprehensive coverage of different aspects. However,
they are using JPEG images for screenshots and clean vector artwork. Isn't
this worse than using PNG in terms of quality and size? And doesn't this
violate their own goal of promoting better image optimization practices?

------
eashish93
Hi, please ignore if not useful. I built an image compressor with bulk
uploading and parallel processing of images. It's free to use. Suggestions,
bug reports and feature requests welcome. Link here:
[https://imgsquash.com](https://imgsquash.com)

------
Houshalter
Pngnq is a bit better than pngquant, and zopflipng is really good at
compressing PNGs even further.

------
GvS
I use the ngx_pagespeed module for Nginx for automatic image optimization.
It's really good and easy to install:
[https://github.com/pagespeed/ngx_pagespeed](https://github.com/pagespeed/ngx_pagespeed)

I describe the whole setup step by step on my blog:
[https://blog.tjl.rocks/cheap-secure-and-fast-ghost-blog-set-...](https://blog.tjl.rocks/cheap-secure-and-fast-ghost-blog-set-up/)

~~~
therealmarv
Yes, agreed about ngx_pagespeed, but personally I never liked the idea of
taking care of nginx compilation (and all the modules). My advice:
[https://github.com/cryptofuture/nginx-hda-bundle](https://github.com/cryptofuture/nginx-hda-bundle),
so that you can use nginx, Brotli and PageSpeed all in one package :)

------
stereo
This is such low-hanging fruit, yet it's so rarely done. I've been getting
good GitHub mileage out of submitting PRs that just losslessly optimise
images with ImageOptim.

~~~
Klathmon
I completely agree!

People will complain over a few hundred KB of JavaScript, but a 3 MB
unoptimized image will be ignored.

It drives me insane!

------
jenhsun
I use [https://tinypng.com/](https://tinypng.com/) to optimise JPEGs and
PNGs. Not bad at all.

------
therealmarv
Great guide!! Only downside: I think it needs some essential text/article
optimization itself; the text is huge, and I had to scale it down to at
least 75%.

~~~
zzzcpan
It's easier to read. I appreciate proper font size for once. I have to use
200% zoom on HN to get the same font size, for example.

------
IgorPartola
Is there any real benefit to progressive JPEGs? I kept reading about how it’ll
load in a better fashion in browsers but it seems no browser actually supports
it.

------
the8472
Except image hosting sites applying _lossy_ "optimization" contribute to the
VHS problem, especially when they don't let you download the original. I am
looking at you, Twitter "gifs". Stripping the metadata from a high-res photo
shot with a DSLR is also a minor nuisance, since you can't even check the
settings it was taken at. Optimizing lossless compression is OK, although
even that has a minor downside: it makes data deduplication more complex,
i.e. you can't rely on ZFS or Btrfs deduplication and have to use
format-specific tools that compare the decompressed output.

Most of these things can be avoided by providing a link to the original.

[https://xkcd.com/1683/](https://xkcd.com/1683/)

------
styfle
This was posted yesterday

[https://news.ycombinator.com/item?id=15933687](https://news.ycombinator.com/item?id=15933687)

~~~
jwilk
There was no discussion, so there's no point linking to it.

