
Image optimization decreased website's page weight by 62% - ayoisaiah
https://freshman.tech/image-optimisation/
======
acconrad
As someone else mentioned, it's much easier to just use ImageOptim if you're
on Mac. There's also a CLI tool on npm that bundles it together with
ImageAlpha and JPEGmini:
[https://www.npmjs.com/package/imageoptim-cli](https://www.npmjs.com/package/imageoptim-cli)

But one thing I'd caution is that WebP is not a panacea for image
optimization. It's only supported in Chrome. If you want to fully leverage
next-gen image formats cross-browser, you'll also need JPEG 2000 and JPEG
XR... and even if you do all of that, you still won't get support for Firefox.
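
If you do want to generate WebP versions anyway, here's a minimal sketch
using Google's cwebp encoder (the file layout is hypothetical):

      # produce a .webp sibling for every .jpg, at quality 80
      for f in *.jpg; do
          cwebp -q 80 "$f" -o "${f%.jpg}.webp"
      done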

There are also srcset and lossy compression, which are viable options:
[https://userinterfacing.com/the-fastest-way-to-increase-your-sites-performance-now/](https://userinterfacing.com/the-fastest-way-to-increase-your-sites-performance-now/)

~~~
theandrewbailey
The market fragmentation is why I prefer to use JPEGs more intelligently than
authoring multiple versions of the same image for each browser. It feels like
the 90s all over again. Using fewer images, smaller images, and caching them
has worked well for me.

~~~
reaperducer
I'm at the point now where I've realized that nothing is ever the solution all
the time. It's a little frustrating.

I used to go by the rule of thumb that JPEG was for photos, and PNG was for
charts and things with text.

But these days I go through each image I get from the art department and
optimize it myself in Photoshop, cycling through a series of presets.

To my surprise, depending on the image, sometimes an 8-bit PNG will end up
smaller than a JPEG, and provide better visual quality.

Naturally, your mileage will vary; at participating locations; not valid in
Alaska, Hawaii, or Puerto Rico; no cash value; batteries not included; do not
taunt Happy Fun Ball.
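
For illustration, one way to run that PNG-8 vs. JPEG comparison outside
Photoshop is a rough sketch with ImageMagick (file names are hypothetical):

      # render an 8-bit palette PNG and a JPEG from the same source,
      # then compare by eye and by bytes
      convert source.png -colors 256 PNG8:out.png
      convert source.png -quality 82 out.jpg
      ls -lh out.png out.jpg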

------
gingerlime
I'm a bit surprised there's no mention of image optimization proxies /
services like thumbor[0] (which is open source). Instead of pre-processing all
your images, it lets you worry about it later. You can compose different
transformations and filters (e.g. add a watermark, resize, crop, etc.). This
is especially useful when things on the website change. It lets you keep the
originals at full size and transform them as you need.

There are some commercial services in this space, as well as other similar
open source services.

If you're looking for a quick way to get thumbor up and running with docker,
I'd plug
[https://github.com/minimalcompact/thumbor](https://github.com/minimalcompact/thumbor)

[0] [https://github.com/thumbor/thumbor](https://github.com/thumbor/thumbor)
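
To sketch how the URL-based API composes (server address, size, and source
image are all hypothetical; "unsafe" bypasses URL signing and is only for
development):

      curl -o thumb.jpg \
        "http://localhost:8888/unsafe/300x200/filters:quality(80)/example.com/photo.jpg"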

~~~
Cthulhu_
Similarly, if you're still living in the past and / or want to maintain your
own server instead of passing it through a 3rd party like that or a CDN: a few
years ago Google's mod_pagespeed (a module / plugin for both Apache and nginx)
did image optimization and a lot of other optimizations, all transparently -
page caching, creating sprites, inlining small images (as base64), above-the-
fold CSS optimization, etc.

I'd still recommend it if you have an old-fashioned webserver (like, idk, a
random WordPress installation) and don't want to pay a third party. I don't
know if it's still being maintained or updated though; I haven't heard much of
it since its release.

~~~
leeoniya
> Similarly, if you're still living in the past

you mean like when privacy was still a thing?

~~~
briandear
If you have a public-facing site, how is using something like Imgix a privacy
issue?

~~~
JoshTriplett
You're letting a third party track your users and their usage of your site.

------
Bluecobra
Coincidentally, when I was a freshman in high school in the late 90's, this
was a topic our instructor drilled into us. I remember trying to shave every
little kilobyte off my .gif and .jpg files to make my personal website load as
quickly as possible, with a reasonable amount of quality, over a modem.

From my perspective, everything has since gotten way more bloated, and there
is an assumption that everyone has unlimited data and bandwidth. I used to
have a 1 GB data cap on my phone that I would blow through in a couple of
weeks just from reading news websites. For example, bloomberg.com shouldn't
need to make ~300 requests and download 18 MB of data just to load the front
page.

~~~
MR4D
Your comment made me bring out DevTools, because the number of requests you
quoted (~300) sounded a bit high.

Boy was I wrong!!!

    
    
      - 500/565 requests
      - 7.0 MB transferred
      - 9.86 seconds to load
    

Holy crap!!!

~~~
mehrdadn
This is weird. In Chrome incognito, I get ~400 requests (without scrolling),
and in Firefox private browsing, I get ~120 requests before scrolling, and
~190 after. What in the world? I disabled all the extensions in Firefox too.

~~~
detaro
Doesn't Firefox private browsing have a tracking blocklist active by default?

~~~
mehrdadn
Wow, I didn't know that! I thought the tracking protection was just about the
DNT header and I didn't expect that to make much difference here. However,
even disabling that, I only get ~200 requests. There's still a lot missing...
is it still blocking things under the hood?

~~~
RugnirViking
It may also be that different resources are loaded based on the user agent, to
track the user in different ways, display the page correctly in different
browsers, or polyfill JavaScript differences.

------
Mojah
A long time ago I "automated" the optimization of images on my site by
running `optipng` in a cronjob. Every file that touches my server gets
optimized.

I wrote about it here:
[https://ma.ttias.be/optimize-size-png-images-automatically-webserver-optipng/](https://ma.ttias.be/optimize-size-png-images-automatically-webserver-optipng/)

Benefits:

\- Don't have to think about it

\- Optipng is really good at reducing PNGs to their bare minimum

Downsides:

\- Doesn't resize images (if a 1024x768 is displayed as a 10x8, it'll still
download the 1024x768)

\- Only does PNG

\- If your images are stored in git (and you didn't pre-optimize before
committing/deploy), you can get merge conflicts

Still, better than nothing.
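
For reference, a minimal sketch of the cronjob approach (schedule and paths
are hypothetical):

      # nightly: optimize any PNGs modified within the last day
      0 3 * * * find /var/www/uploads -name '*.png' -mtime -1 -exec optipng -o5 -quiet {} \;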

~~~
analogmemory
The node package Sharp does a great job of optimizing images and lets you
manipulate them: resizing, colors, channels, and so on.

[http://sharp.pixelplumbing.com](http://sharp.pixelplumbing.com)

~~~
davidmurdoch
If using gulp you can use gulp-responsive which uses sharp internally.

The automatic cropping methods are pretty cool and work really well, too.

------
l5870uoo9y
    
    
      <picture>
        <source srcset="sample_image.webp" type="image/webp">
        <source srcset="sample_image.jpg" type="image/jpeg">
        <img src="sample_image.jpg" alt="">
      </picture>
    

Didn't know you could wrap images in a <picture> tag and browsers (except for
IE) will automatically download the .webp version if they support it. I used
to do this via JavaScript. I also like on-demand scaling, where you pass
scaling parameters in the URL, such as: /200x200/sample_image.jpg.

~~~
mimischi
No need to do this via JavaScript either! The srcset attribute is also
available on <img> tags and lets you define differently sized images for
different viewports:
[https://developer.mozilla.org/en-US/docs/Learn/HTML/Multimedia_and_embedding/Responsive_images](https://developer.mozilla.org/en-US/docs/Learn/HTML/Multimedia_and_embedding/Responsive_images)
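
A minimal sketch of the attribute (file names, widths, and breakpoints are
hypothetical):

      <img src="photo-640.jpg"
           srcset="photo-640.jpg 640w, photo-1280.jpg 1280w"
           sizes="(max-width: 600px) 100vw, 50vw"
           alt="A responsive photo">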

------
Cenk
If you’re developing on a Mac, ImageOptim can handle all of the image
compression (JPG, PNG, etc):
[https://imageoptim.com/mac](https://imageoptim.com/mac)

For SVGs, svgo (brew install svgo) usually produces the best results for me.
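
For example (file names hypothetical):

      # optimize an SVG, applying svgo's plugins over multiple passes
      svgo --multipass logo.svg -o logo.min.svg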

~~~
bwbw223
svgcleaner is supposed to be faster and have a better compression ratio than
svgo:
[https://github.com/RazrFalcon/svgcleaner](https://github.com/RazrFalcon/svgcleaner)

It also has a GUI.

~~~
1996
After fighting with the options, failing to find a good default, and still
consistently getting larger images, I fail to see how svgcleaner is better.

~~~
bwbw223
I think it's only with lossless compression that it's better.

------
Kagerjay
I just ran a quick comparison on two big ecommerce sites I'm familiar with. I
know for a fact they performed as much optimization as they possibly could in
their respective file formats.

The WebP and .jpg files had similar dimensions, picture detail complexity, and
dpi. The WebP format came out to a 50% smaller file size.

I didn't have enough of a sample size and/or tests though.

I personally don't think image optimization with WebP should be a thing
though. The lack of cross-browser support is one issue; the lack of native
support on Windows is another.

Two things IMO are most important about image optimization for image-heavy
sites. One is lazy loading (via a frontend library like lazyloading.js), by
specifying a class on images past a threshold browser height. This is most
notably used on many ecommerce sites, though on analysis Amazon doesn't seem
to be using it.

Next would be sprite compression of common social links. Amazon is actually a
great example of this; check out this image I extracted from their webpage:

[https://images-na.ssl-images-amazon.com/images/G/01/gno/sprites/nav-sprite-global_bluebeacon-V3-2x_optimized._CB474516457_.png](https://images-na.ssl-images-amazon.com/images/G/01/gno/sprites/nav-sprite-global_bluebeacon-V3-2x_optimized._CB474516457_.png)

~~~
Cthulhu_
I think with images, just as with videos, there will never be a consensus;
some people will insist on certain proprietary formats, forever. At least
there are standards now, so you can provide fallbacks - if you as a website
owner would prefer people use WebP because it compresses better than JPG, you
can offer both without forcing one or the other on them.

I wouldn't mind a server-side solution though, where you just put an img tag
in your HTML and the server determines the best format based on browser
support. Of course, that would mean inspecting a header (or some smart user-
agent analysis) for every image request, and you'd want to keep that overhead
to a minimum.

------
mcjiggerlog
I've been using this great ImageMagick script for optimizing images for the
past few years. Works like a charm. Any images that are going to be served
from my websites get optimized via the script first.

    
    
      smartresize() {
          mogrify -path "$3" -filter Triangle -define filter:support=2 \
              -thumbnail "$2" -unsharp 0.25x0.08+8.3+0.045 -dither None \
              -posterize 136 -quality 82 -define jpeg:fancy-upsampling=off \
              -define png:compression-filter=5 -define png:compression-level=9 \
              -define png:compression-strategy=1 -define png:exclude-chunk=all \
              -interlace none -colorspace sRGB "$1"
      }
    

Usage:

    
    
      smartresize image.png 300 outdir/
      
    

Looks like I must have found it here:
[https://www.smashingmagazine.com/2015/06/efficient-image-resizing-with-imagemagick/](https://www.smashingmagazine.com/2015/06/efficient-image-resizing-with-imagemagick/)

~~~
mabynogy
Thanks. I noticed a gain of 20% on my images with this method.

------
Klathmon
If you are using webpack, I highly recommend imagemin-webpack-plugin [0]
(although I might be a bit biased as I created it...)

It will run a slew of image optimizers by default using imagemin, and has
support for a wide range of others.

It also supports caching and optimization of images that aren't directly
imported through webpack (thanks to some awesome contributors), so it's a
great way to set it and forget it, and never worry about accidentally sending
3 MB images to your users.

[0] [https://github.com/Klathmon/imagemin-webpack-plugin](https://github.com/Klathmon/imagemin-webpack-plugin)

~~~
ayoisaiah
That's a nice one, thanks for sharing! Makes sense for those who already use
Webpack

------
kawsper
If you are using Ruby, I can recommend the image_optim[0] gem together with
image_optim_pack (which packages the binaries). It is maintained by a great
person whom I only know by the handle "toy".

I used to give him a few dollars per week when Gratipay was still up and
running; sadly I don't know of an alternative now.

[0] [https://github.com/toy/image_optim](https://github.com/toy/image_optim)

~~~
NHern031
Curious as to what Gratipay was, I did a little research. It turns out the
maintainers of Gratipay handed all assets over to Liberapay[0]. Liberapay
appears to be a fork of Gratipay. This might be what you have been looking
for!

[0] - [https://liberapay.com](https://liberapay.com)

*I have no affiliation with Liberapay, just did some curious research.

------
Adamantcheese
Another option, if your images are simple enough, is to use
[https://github.com/fogleman/primitive](https://github.com/fogleman/primitive)
to convert them to SVG. It might not be worth the effort, though, as the space
savings may be too small to justify the artifacts produced. Neat effect for
small numbers of shapes, though.

~~~
rectangletangle
Another approach is to design the site with SVG as the priority format. Use
flat, cartoon-like graphics in place of JPG photographs whenever possible.

------
tobyhinloopen
Sometimes I visit a project and drop all its public images (logos, icons,
stock photos, whatever) into ImageOptim. A 75% reduction in file size for some
images is not uncommon.

------
greysteil
One of the nicest additions to the GitHub Marketplace is a bot that will
optimise your images automatically:
[https://github.com/marketplace/imgbot](https://github.com/marketplace/imgbot)

(No connection to me - just think it's a great idea, and totally free.)

~~~
ayoisaiah
Another GitHub bot[0] was shared earlier. This one looks similar. Thanks for
sharing!

[0]: [https://www.shrink.sh/](https://www.shrink.sh/)

------
jim-jim-jim
Making sure that grayscale/monochrome images are encoded as such (oftentimes
they aren't) can also shave the size down. I use ImageMagick for that.
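
A one-line sketch of that fix with ImageMagick (file names hypothetical):

      # re-encode as true single-channel grayscale
      convert photo.jpg -colorspace Gray photo-gray.jpg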

~~~
detritus
This applies to any image that will never be seen in its original state - e.g.
photos that are opacified or dulled down to act as background elements.

We had a site outsourced (because I found Shopify to be ..'tricky'.. to
meaningfully work with on a PC), and the developer was darkening 'hero' images
on the homepage in CSS, rather than us simply pre-processing them in something
like Photoshop before uploading.

When I realised (I wasn't particularly hands-on by this point) I was livid, so
I changed the code and we knocked a few hundred KB off the front page. Our
site is necessarily image-heavy, so any gains anywhere are useful.

When I first started, I had a ceiling of about 120 KB for the entire page -
images and all - so today's internet is a weird and foreign land to me.

------
calyth2018
Set up metrics that you'd like your pages to hit. When I was a kid putting up
a fan site for games (back on Geocities), I was mostly stuck on a 28.8k modem,
and aimed to have the page finish loading in 30s.

Now, I was quite brutal with the compression and probably should have backed
off a bit to avoid artifacts.

It might be nice to have an automated test where the system under test is
loaded over limited bandwidth, with target load times that you'd like to hit.

------
jordache
Is there a robust, automated, universal process for gauging image optimization
results and comparing them to a required image quality level? I've always done
it manually, and I have a keen eye for image degradation as I increase the
compression.

------
nicolasbistolfi
I personally have been working around web technologies and performance for the
last 14 years. I understand where this article is coming from and I actually
have to give it some credit; it has a good click-bait title, but it's an old
solution. I'm tired of reading about this type of solution; there are
thousands of articles exactly like this one.

Reducing the size of your images is just the first step. There are many more
things you need to consider and solve in order to make your website faster:

\- Format: deliver the images in the right format for each browser (e.g. using
WebP for Chrome)

\- Size: what size for each image? What happens on mobile, tablet, and
desktop, across the different screen sizes and pixel ratios (it's not only
retina or not)?

\- Quality: is your image being resized by the browser? Are you using raw
files to generate the optimized images?

\- Thumbnails: are you also generating thumbnails for listings, or smaller
versions of your images? How are you going to link those thumbs to your
original images? Do you need to use a database?

\- Storage: where are you going to store those images?

\- Headers: caching static assets is key for recurrent users. Are you using
Apache or Nginx? Is your setup working well?

\- CDN: are you using a CDN to deliver those assets? CloudFlare is great, but
it's not the fastest way to deliver images. What about setting the right
configuration for that CDN? How much are you going to spend?

So what's next? Going to one of the API services that optimize images and
reading their 500 pages of docs just to resize and crop an image? Adding
complex plugins to your backend and taking on a heavy dependency?

I mean, if you like adding more dependencies to your project, maintaining more
code, and spending hours rebuilding scripts and running cron jobs to update
your images, go for it.

That's why about 9 months ago we started working on a new concept, solving all
these problems with a service that integrates as easily as a lazy-loading
plugin and solves EVERYTHING about image optimization (and yes, everything
being discussed in the comments on this HN post).

Don't get me wrong, we have a lot to improve and there are many details of our
product that we need to polish, but we believe we've built a solution that
handles all the most important parts of image optimization and delivery well.
It's not about reducing the image size by 1 KB more; it's about everything
else and understanding the big picture.

We love feedback and our backlog is prioritized based on our customer needs,
let us know what you think.

Here's the link to our startup website: [https://piio.co](https://piio.co)

~~~
ayoisaiah
Author here. Thanks for the feedback. I agree that what I've done is nothing
revolutionary, but the vast majority of websites out there don't do even this
sort of thing. That's one of the reasons the average page weight continues to
grow every year [0].

Moreover, this solution is good enough for people with small blogs just like
mine. Anyone who needs something more involved can use your service or other
alternatives.

As for responsive images, I implemented that on my site too although I didn't
mention it in the article. I plan to write a follow up article on that topic
soon.

On a side note, you might want to bump up the font size of the navigation
links on your site. They're too faint.

[0]:
[http://idlewords.com/talks/website_obesity.htm](http://idlewords.com/talks/website_obesity.htm)

~~~
nicolasbistolfi
Totally agree on the page weight, and that we're missing something. Increasing
page weight while connection speeds are also increasing is only bad when the
former grows more rapidly, and I believe that's the situation we're in.

Would love to connect to chat about this and for sure I'll read the follow-up
article.

Thanks for the feedback too!

------
Xt-6
After seeing people forget to do basic optimization steps on images at our
respective jobs, a friend and I built
[https://www.shrink.sh](https://www.shrink.sh). The goal of this tool is to
create a catch-all system. It also saves you from installing tools that slow
down your builds or deploys even more, and that you would need to maintain
forever.

------
chrisparton1991
For one of my websites (a static page with some pretties), I challenged myself
to remove as much cruft as possible without degrading the experience.

I used Fontello to strip out unnecessary FontAwesome icons and uncss to remove
unused Bootstrap styles, replaced some Bootstrap JS with vanilla JS and made
use of SVGs (optimised with SVGOMG) for backgrounds and the logo.

The resulting site totals 178 KB when viewed in Chrome (down from over 1 MB),
including Bootstrap, analytics, some screenshots, a custom font and an
animated logo. There's plenty more I could do to trim the size, but I had more
important things to do.

There are so many ways to make webpages smaller and more efficient, and it can
be a really fun learning experience.

------
molotovbliss
I've not seen imgproxy mentioned:
[https://github.com/DarthSim/imgproxy/](https://github.com/DarthSim/imgproxy/).
It's probably the fastest image processing tool I've used yet. It uses libvips
([https://github.com/jcupitt/libvips/wiki/HOWTO----Image-shrinking](https://github.com/jcupitt/libvips/wiki/HOWTO----Image-shrinking)),
which not only handles resizing and other basic image needs but also
optimizes, while being very light on memory and CPU cycles compared to most
other implementations.

------
snowwrestler
There's some unexamined hooey in this post. For example, you can't really
compress a JPEG, but you can re-encode it at a lower quality, which can have a
dramatic effect on the file size. That's what mozjpeg is doing for this
person.

But they could have just done the same thing in Photoshop, Preview, MS Office
image tool, etc. JPEG is a standard; file size does not depend on what tool
you use to create it. It's strictly dependent on the image itself, and the
render settings you choose. Same with PNG.

In fact, you'll get better quality for the file size if you go directly from
the original image straight to your final resolution in one step. Rendering to
high-quality JPEG, then re-rendering on the server to shrink the file size,
will give you worse image quality than just going straight from the original
file to the final in one render.

WebP looks promising but is not yet well-supported. Most sites can go a long
way just by caring about, testing for, and adjusting image rendering defaults
to optimize for file size.

EDIT to add a bit more:

If you are optimizing images as part of a pre-deploy build process, you can
use whatever library you want. The only thing that really matters is your
choice of format (JPG or PNG), and the render settings. Or, you can hand-
optimize the images and drop them into your repo to deploy as-is.

If you're running a CMS where non-developers are going to be uploading images
through an admin UI (like Wordpress), your CMS should be using a server-side
library to render optimized versions of the images that get uploaded, then
serving the optimized versions. You can adjust the settings of the server's
image library, although that might require a plugin or module, or custom code,
depending on the CMS.

Missing this is a common killer mistake in page load times. I visited a site
the other day that served a _16 MB JPEG file_ for the "hero" image on the
homepage. My guess is that it was the JPEG straight out of a high-resolution
camera.

This is also good for user privacy, as the server-side rendering should remove
IPTC and EXIF data that would get served with the original image.
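
As a hedged sketch of the single-step render described above, with
ImageMagick (file names and settings are hypothetical):

      # straight from the camera original to the final web asset:
      # apply EXIF orientation, resize, strip metadata, encode once
      convert original.jpg -auto-orient -resize 1600x -strip -quality 75 hero.jpg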

~~~
donatj
> JPEG is a standard; file size does not depend on what tool you use to create
> it.

That’s simply not true. The specific choice of cosines can make a huge
difference in compression while having nearly zero perceived visual
difference. Most encoders take a naive approach, whereas something like
Guetzli does an amazing job of compressing JPEGs way better than Photoshop
ever could.

~~~
snowwrestler
Guetzli is not appropriate as a general-purpose tool for optimizing website
images. That's not really what it's designed for.

I guess I should specify that I'm trying to give practical advice for people
who think the linked blog post is instructive. For the vast majority of
people, the simple act of thinking about, selecting, and testing the available
settings in popular image optimization tools is going to have a far greater
effect than the small optimizations (and sometimes big tradeoffs) that might
come from cutting-edge stuff like Guetzli.

The reward per effort of going from "not optimizing my images" to
"purposefully optimizing my images using common tools" is typically much
bigger than the step from the latter to "using the absolute best possible tool
for each image."

~~~
avhon1
> Guetzli is not appropriate as a general-purpose tool for optimizing website
> images. That's not really what it's designed for.

What is it designed for? I downloaded and compiled it, and it seems to work
quite well for the photographs on my website. The README says:

> Guetzli is a JPEG encoder that aims for excellent compression density at
> high visual quality

[https://github.com/google/guetzli/](https://github.com/google/guetzli/)

~~~
snowwrestler
Guetzli is for minimizing file size at the highest quality levels for JPEGs.
Essentially, it's for nice looking photos.

It only goes down to quality 84, and it takes a looonngg time to optimize. As
a point of comparison, the author of the linked blog post dropped his JPEG
quality to 70 and was happy with it. A JPEG at 70 (or lower), if you're happy
with the look, has a good shot at being even smaller than the smallest Guetzli
output.

Generally speaking, the easiest gains in JPEG file size will probably come
from just dropping the quality down and down in tests, and deciding what you
can live with. But if you have to have the best quality, and have plenty of
resources/time for encoding, then maybe Guetzli will be a good fit.
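
A minimal sketch of that quality sweep (file names hypothetical):

      # render a range of qualities, eyeball the results against the
      # file sizes, and pick the lowest quality you can live with
      for q in 50 60 70 80 90; do
          convert source.jpg -quality "$q" "preview-q$q.jpg"
      done
      ls -lh preview-q*.jpg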

------
stereo
You can save another 7.6% on that png by passing it through advpng+pngout.
ImageOptim is fantastic for this:
[https://imageoptim.com/mac](https://imageoptim.com/mac)
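
If you'd rather invoke the tools directly, something like this should be
roughly equivalent (both operate on the file in place):

      advpng -z -4 image.png   # recompress the PNG's deflate stream
      pngout image.png         # second pass with pngout's encoder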

------
adityapatadia
We at Gumlet also provide image optimisation that just works:
[https://www.gumlet.com](https://www.gumlet.com). It's accompanied by a
client-side JavaScript library:
[http://github.com/gumlet/gumlet.js](http://github.com/gumlet/gumlet.js)

------
yurishimo
Cloudinary is another option to get optimized images/videos without having to
manually optimize everything. Definitely a good option if you have lots of
user uploaded content. Running an optimization script within your app will
likely use more resources than it's worth unless you're already operating at
scale.

~~~
markc
If you use Akamai's CDN you can get per-browser auto image optimization (size,
quality, format) as an add-on service via Image Manager.

------
harias
The 2016 Google developers conference had something on this:
[https://m.youtube.com/watch?v=r_LpCi6DQME](https://m.youtube.com/watch?v=r_LpCi6DQME)

------
freecodyx
We recently moved all our images to S3 and created a Lambda function that
compresses the images using Guetzli. It's very slow, but the results are good.
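
For reference, the invocation is a single command (file names hypothetical;
84 is the lowest quality Guetzli accepts):

      guetzli --quality 84 input.jpg output.jpg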

------
vladdanilov
This article, like many others, is full of fallacies.

Image formats are not used wisely: [1] is PNG, not JPEG, and [2] is JPEG, not
PNG.

> I found that setting quality (mozjpeg) to 70 produces good enough images for
> the most part, but your mileage may vary.

You can get away with this setting for hidpi sizes but 1x will look horrible
[1]. If you care about quality, the mileage is actually 75-95.

> (Pngquant) quality level of 65-80 to provide a good compromise between file
> size and image quality

Again, it may only be applied to hidpi sizes, and it will easily ruin any
gradients or previously quantized images.

Pngquant is a great color quantization tool but it does not actually perform
any lossless PNG optimizations, which can save you at least 5% more, and up to
90% in some cases.
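
A hedged sketch of chaining the two steps (file names hypothetical; pngquant
exits with an error if the quality target can't be met):

      pngquant --quality 65-80 --output quantized.png input.png \
          && optipng -o3 quantized.png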

All of these tools will also blindly strip metadata (but it's not guaranteed!)
along with color profiles and Exif orientation, resulting in color shifts and
image transformations respectively.

Most importantly, none of them are good enough for automatic lossy
compression. Guetzli is the closest but it still has some severe issues [3].
I'm also trying to build a real thing, and it is hard.

> there’s value in using WebP formats where possible

WebP lossless and WebP lossy are quite different formats. WebP lossy being
always 4:2:0 is not a good replacement for JPEG [4] especially at higher
quality. On the contrary WebP lossless has evolved into a decent alternative
for PNG including lossy [5].

Proper responsive images would give you considerably smaller page weight and
improve performance on mobile devices. BTW Google treats oversized images as
unoptimized [6].

[1] [https://freshman.tech/assets/dist/images/http-status-codes/everything-ok-683.jpg](https://freshman.tech/assets/dist/images/http-status-codes/everything-ok-683.jpg)

[2] [https://freshman.tech/assets/dist/images/articles/freshman-1600-original.png](https://freshman.tech/assets/dist/images/articles/freshman-1600-original.png)

[3] [https://github.com/google/guetzli/issues](https://github.com/google/guetzli/issues)

[4] [https://research.mozilla.org/2014/07/15/mozilla-advances-jpeg-encoding-with-mozjpeg-2-0/](https://research.mozilla.org/2014/07/15/mozilla-advances-jpeg-encoding-with-mozjpeg-2-0/)

[5] [https://twitter.com/jyzg/status/958629795692150790](https://twitter.com/jyzg/status/958629795692150790)

[6] [https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Ffreshman.tech&tab=desktop](https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Ffreshman.tech&tab=desktop)

------
nickthemagicman
Hate to ask a dumb question, but what are the best tools/strategies to
optimize images?

------
memelord69
I just did `convert -strip -quality 40` for a website I had to make.
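
Spelled out in full, that would be something like this (file names
hypothetical):

      convert input.jpg -strip -quality 40 output.jpg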

------
partiallypro
I use Kraken on every site I do for this reason.

------
Theodores
There are better ways that place usability first. By usability I mean that
there is nothing for the content creator to do and nothing for the frontend
developer to do.

I use mod_pagespeed - there are versions for nginx and Apache that do all of
the heavy lifting.

With mod_pagespeed you can get all of the srcset images at sensible
compression levels. All you need is to mark up your code with width= and
height= values for each img.

With this in place the client can upload multi-megabyte images from their
camera without having to fiddle in Photoshop etc. It just works and the hard
part is abstracted out to mod_pagespeed.

By taking this approach there is no need for fancy build tools. However, a
background script to 'mogrify' your source images is a nice complement to
mod_pagespeed; if you want your images to be in Google Image Search, then
1920x1080 is what you need.

The really good thing about taking the mod_pagespeed route is that you get
'infinite zoom' on mobile: pinch and zoom, and it fills in the next srcset
size. Keep going and you eventually get the original, which you have
background-converted to 1920x1080.

There is also the option to optimise images perceptually, so you are not just
mashing everything down to 70% (or 84%).

On your local development box you can run without mod_pagespeed and just have
the full resolution images.

Or you can experiment with more advanced features such as lazy_loading - this
also comes for free with mod_pagespeed.

If you want your images to line up in nice squares, you might add whitespace
to the images, perhaps taking time in Photoshop to do this. However, it is
easier to just 'identify' the image heights/widths and set something sensible
for them, keeping the aspect ratio correct. Then you can use modern CSS to
align them in figure elements and let mod_pagespeed fill out the srcsets.

Icons and other images that are needed are best manually tweaked into cut down
SVG files and then put into CSS as data URLs, thereby reducing a whole load of
extra requests (even if it is just one for a fiddly 'sprite sheet').

Oh, a final tweak: if you are running a script to optimise uploaded images and
restrict their max size, you can also use 4:2:0 colour sampling. This is where
the image still has all the dots, but the colours are 'halved in resolution'.
This is not noticeable in a lot of use cases, and it's particularly good if
you are using PNGs to get that transparency.
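
A one-line sketch of that subsampling step for JPEGs, using ImageMagick (file
names and quality are hypothetical):

      convert upload.jpg -sampling-factor 4:2:0 -quality 82 out.jpg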

As mentioned, mod_pagespeed reduces project complexity by offloading the hard
work to the server, keeping cruft out of the project and keeping build tools
out of the way. It can also be configured to inline some images and plenty
else to get really good performance.

Mileage may vary if the decision has been made to use a CDN where such
functionality is not possible. However, if you're serving a local market, a
faux CDN is pretty good, i.e. a static domain on HTTP/2 where the cache is set
properly and no cookies are sent up/down the wire for every image.

[https://www.modpagespeed.com/doc/filter-image-optimize](https://www.modpagespeed.com/doc/filter-image-optimize)
[https://www.modpagespeed.com/doc/filter-image-responsive](https://www.modpagespeed.com/doc/filter-image-responsive)

------
amelius
> How Image Optimimization decreased my website's page weight by 62%

Title needs spellcheck.

~~~
akerr
Would save 3% too.

~~~
_arvin
$ echo "How Image Optimimization decreased my website's page weight by 62%" |
wc -m

67

$ echo "How Image Optimization decreased my website's page weight by 62%" | wc
-m

65

$ echo "scale=3; (67-65)/67" | bc

.029

Math checks out, sir.

~~~
amelius
But:

    
    
        $ echo "" | wc -m
        1
    

You probably want to add the -n flag to echo. It doesn't change the validity
of the statement though.

------
alberto_ol
Optimisation

------
Atreus
Venkatraman Santanam.

Made deep learning improve thumbnail representation. Facebook (or Google)
showed up and gave him 6 to 8 zeros to the left of the decimal, preceded at
the far left by a $, and O(1).

[https://arxiv.org/pdf/1612.03268.pdf](https://arxiv.org/pdf/1612.03268.pdf)

