
Zopfli Optimization: Literally Free Bandwidth
http://blog.codinghorror.com/zopfli-optimization-literally-free-bandwidth/
======
roeme
Somewhat OT: as a Swiss, the omitted ö is _really_ beginning to bug me.

It's Zöpfli. Gopferteckel.

(The second word is a somewhat soft cuss word - but don't try to use it as a
non-native.)

Also, you can't "zopfli" something - it's a noun! You "zöpf" - or, since
we're in the Alemannic German space, you "zöpfle".

/rant /vent

~~~
xorcist
Same thing with Löve 2D, or is it Love 2D?

The problem here is that the naming isn't consistent. Sometimes it's Love and
sometimes Löve. I never know what to call it, but I think the whole idea of
naming a software package with diacritics is just asking for trouble. If they
had transcribed it to Loeve it would at least be obvious what's going on.

This is perhaps exacerbated by my native tongue treating ö not as an o with
umlaut, but as a completely different character distinct from o and at another
place in the alphabet. To me it's like naming a package Blam but half the time
referring to it as Blym instead.

~~~
aristidb
Löve is not a German or Swiss German word, so the umlaut is simply
gratuitous.

(Other languages such as Dutch or French use diacritics to mark a vowel as
not-diphthongised, but that doesn't apply here either.)

~~~
xorcist
If it's not German it's gratuitous?

I'll try not to take that as an insult to speakers of Swedish, Finnish,
Icelandic, Estonian, Hungarian, Turkish, and any of the other dozen languages
which make use of the character.

------
arcameron
Regarding the avatar images:

Why not a <div> with border-radius and a background color? It seems you could
achieve the same thing without another HTTP request (one for each unique
avatar), and with no need to Zopfli 45,000 unique files.

~~~
codinghorror
Also, cross-browser font and alignment issues can be brutal. There is a lot to
be said for a tiny image that works everywhere, on any device.

~~~
qopp
What about an svg?

~~~
other_herbert
That won't work in email... I'm generating a usage graph (SVG) that is emailed,
and the graph will not display in the client. It must be opened in a browser.

~~~
ZeroGravitas
Seems like you could use an SVG avatar for clients that can handle it and
fall back to PNG for email.

------
millstone
> It's a smaller file to send over the wire, and the smaller the file, the
> faster the decompression.

Can someone elaborate on this? Why do smaller files decompress faster?

> However, remember that decompression is still the same speed, and totally
> safe

Wait, what? Didn't we just establish that it's faster to decompress?

~~~
ndesaulniers
> Why do smaller files decompress faster?

The same reason why "no code" is faster than optimized code. ;)

~~~
millstone
The trivial identity "compressor" is fastest to decompress, since it requires
no work. Likewise, DEFLATE (which PNG uses) has stored literal blocks, which
are longer but faster to decompress. So is it actually true that
Zopfli-compressed files decompress faster?
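
To see the compatibility point concretely, here is a sketch using zlib's own levels as a stand-in (Zopfli just spends more effort producing a standard DEFLATE stream, so the same idea applies): the fast and the heavily-optimized outputs differ in size, but a single decompressor reads both.

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 10_000

quick = zlib.compress(data, 1)  # fast compressor, larger output
best = zlib.compress(data, 9)   # slow compressor, smaller output (Zopfli pushes this further)

# One decompressor, one algorithm: both streams inflate identically.
assert zlib.decompress(quick) == zlib.decompress(best) == data
print(len(quick), len(best))
```

Any decompression speedup from the smaller file is incidental (less input to read), not a different algorithm.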

------
jonsneyers
The original PNG of that PBF comic is 671,012 bytes. ZopfliPNG crunches it
down to 585,117 bytes.

Not bad, but if you use newer image formats, you can do better.

Lossless WebP brings it down to 429,696 bytes (using -lossless -m 6 -q 100)

FLIF with default options (which means interlaced for progressive decoding)
takes it down to 322,858 bytes. Non-interlaced FLIF reduces it further to
302,551 bytes.

------
HeyImAlex
If anyone is interested in PNG optimization in general, I wrote an article
about it a while back. The deflate step is only one part of it!

[http://heyimalex.com/journal/png-optimization](http://heyimalex.com/journal/png-optimization)

------
callumjones
You could probably have a worker system that eventually produces a Zopfli
compressed file.

A user uploads a PNG; you perform the quickest compression but then queue up a
Zopfli pass. Up front you're only returning the less-compressed file, but
after a while you begin serving the smaller one.

If the uploaded file or associated post is deleted then you can wipe it from
the queue.
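
A rough sketch of that two-stage pipeline, using zlib levels as stand-ins for the quick pass and the Zopfli pass (in a real system the worker would shell out to zopflipng on the stored file):

```python
import queue
import threading
import zlib

jobs = queue.Queue()
served = {}  # filename -> best compressed bytes available right now

def handle_upload(name, data):
    # Serve something cheap immediately (fast deflate as a stand-in)...
    served[name] = zlib.compress(data, 1)
    # ...and queue the expensive Zopfli pass for later.
    jobs.put((name, data))

def worker():
    while True:
        name, data = jobs.get()
        # Stand-in for the heavy Zopfli recompression.
        served[name] = zlib.compress(data, 9)
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

handle_upload("avatar.png", b"example pixel data " * 5000)
jobs.join()  # only the test waits here; the upload request wouldn't
```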

~~~
brianwawok
Or maybe don't even Zopfli it until it gets 100 or so hits. The compression
costs some number of cents of CPU time... unless maybe you made the uploader
do it.

~~~
codinghorror
Yeah, the smart thing to do is schedule "important" user-submitted images for
heavier recompression, but even then you might get in trouble with very large
PNGs and need to gate based on dimensions. For huge PNGs I'm not sure the
80x-160x time penalty is tenable; you might be talking 30 minutes in some
cases.
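
A dimension gate could be as simple as this (the pixel cap is an invented, purely illustrative number, to be tuned against the observed 80x-160x slowdown):

```python
MAX_PIXELS = 4_000_000  # illustrative cap, not a recommendation

def should_zopfli(width, height):
    """Only schedule the heavy recompression pass for images under the cap."""
    return width * height <= MAX_PIXELS

print(should_zopfli(120, 120))    # small avatar: worth the CPU
print(should_zopfli(8000, 6000))  # huge screenshot: skip the heavy pass
```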

------
eridius
Are the reported Zopfli numbers for PNGs achieved by recompressing the
original, or by recompressing the output of PNGOUT? The Wikipedia page for
PNGOUT says

> _PNGOUT also performs automatic bit depth, color, and palette reduction
> where appropriate._

Assuming the Zopfli numbers were created by recompressing the original, I
wonder if there are further savings to be had by recompressing the output of
PNGOUT.

Alternatively, PNGcrush can also do the same sort of lossless bit depth and
palette reduction, so I'd be curious about the combination of PNGcrush +
Zopfli as well.

~~~
danielvf
That doesn't matter - both tools decompress the lossless PNG data before
recompressing it.

~~~
HeyImAlex
It does kind of matter, because compression isn't the only form of
optimization that can be done. Consider a 32-bit PNG that is actually bilevel
with no transparency. If Zopfli just recompresses, PNGOUT will still produce
smaller output because it does bit-depth reduction.

~~~
danielvf
The ZopfliPNG program does do bit-depth reduction when possible. It even
tests whether a small image that could be bit-depth reduced would actually be
smaller kept at 32 bits, since the added palette overhead may cost more than
the savings in pixel storage.

------
legulere
And you could save even more bandwidth if you changed the PNG standard to also
allow Brotli compression - and probably even more if you created a better file
format.

That's not free anymore, but it's technologically easy to drastically reduce
the amount of resources we use. What holds us back is that it's hard to get
other people to do things like support new file formats, or even produce
better output from their image manipulation programs.

~~~
eridius
If you change the PNG standard, it's no longer PNG. If existing PNG decoders
can't decode it, then all you've done is invented a new format that looks very
similar to PNG.

~~~
legulere
Something similar was done with WOFF. Version 1 used deflate; version 2 uses
Brotli. You could now argue the semantics of whether WOFF2 is still WOFF, but
that doesn't matter at all. The point I wanted to get across here is that it's
a relatively easy change.

~~~
eridius
> _You could now argue semantics whether WOFF2 still is WOFF_

No arguing needed. It's a different format. It uses a different extension
("woff2" instead of "woff"), a different magic number, a different Universal
Type Identifier string, and a different @font-face format.

> _it's a relatively easy change_

But it's not. Even WOFF2 isn't supported everywhere (according to
[http://caniuse.com/#feat=woff2](http://caniuse.com/#feat=woff2)). And that's
a format that's fairly recent (the W3C Recommendation doc for WOFF 1.0 is
dated December 2012) and only has a handful of implementations to begin with.

PNG is a format that's a lot older and is decodable by practically everyone.
Even if there's only a handful of distinct implementations (and I don't
actually know how many implementations there are), and even if every single
implementation updated immediately, it would still take an incredibly long
time for it to get deployed widely enough to actually use as a general-purpose
format.

It's also worth pointing out that web fonts have built-in fallback behavior
(e.g. if a browser can't handle a WOFF2 font, you can provide a WOFF font as a
backup), and they're also things that are typically set up once (so generating
multiple font formats is reasonable). Images don't really have fallback
behavior. On the web, the WHATWG HTML Living Standard defines a <picture>
element that provides fallback but it's not supported everywhere. Outside of
the web there's typically no way to do fallback either (if your browser can
render an image but nothing else can, saving that image to disk isn't very
useful, and sending it to someone else isn't very useful either). Also, while
font files are created very rarely, images are created very frequently, and
most people aren't going to want to create "PNG2" images if they also have to
create PNGs and deal with fallback (just look at WebP, which was released 5
years ago and still AFAIK is not used by very many people outside of Google).

~~~
legulere
Technologically it's relatively easy: just link the Brotli library and call
its decoding functions instead of zlib's when you encounter a file with Brotli
compression. It's getting everyone to implement that which is hard - which is
what I already wrote in my first post.

~~~
eridius
The "getting everyone to implement that" part is the part which actually makes
it a format.

------
Rygu
It's not even just about bandwidth. You're reducing page load time, and
therefore increasing revenue: [http://www.fastcompany.com/1825005/how-one-second-could-cost-amazon-16-billion-sales](http://www.fastcompany.com/1825005/how-one-second-could-cost-amazon-16-billion-sales)

If you're a designer/developer on Mac I would strongly recommend ImageOptim
([https://imageoptim.com/](https://imageoptim.com/)). It supports Zopfli and
has a simple drag-n-drop user interface.

~~~
vanderZwan
On Linux there is Trimage:

[http://trimage.org/](http://trimage.org/)

------
ck2
Batch files which go through all the tools and find the smallest sizes for
various images and .gz files:

[http://css-ig.net/tools/](http://css-ig.net/tools/)

Personally I've found this tool is faster and does a better job than most
others, and it's free:

[http://psydk.org/pngoptimizer](http://psydk.org/pngoptimizer)

------
jibsen
I don't think tools like pngquant should be so easily dismissed. With the
pixel density of today's monitors, the lossy changes they introduce can be
very hard to see (which may be surprising for those of us who remember the
pixelated horrors of Floyd-Steinberg dithering from a couple of decades ago).

For some image types, lossy PNG has a huge advantage over JPG at the same
file size: no JPG artifacts.

    
    
        671.012 original
        584.677 zopflipng -m
        580.180 zopflipng -m --lossy_transparent
        576.637 pngwolf --max-stagnate-time=0 --max-time=300 --normalize-alpha --strip-optional
        190.598 pngquant --speed 1
        179.638 pngquant + pngwolf

------
jzelinskie
His example use case seems a little contrived. We have a similar default
avatar at Quay.io, but we handle it all client-side with CSS. There's some
more free bandwidth!

~~~
quadrature
A little, but images are important if you want to ensure consistency across
different platforms.

------
leni536
> ~250 color schemes

Theoretically one could use one of the indexed PNG formats and only change the
palette. I don't think those avatar images use very many colors (even with
anti-aliasing), so 8-bit indexed PNG should be more than enough.
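
This could even be done without re-encoding the pixel data at all, by rewriting just the PLTE chunk and recomputing its CRC. A minimal sketch following the PNG chunk layout (signature, then length/type/data/CRC chunks; validation and tRNS handling omitted):

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def swap_palette(png, palette):
    """Return a copy of an indexed PNG with its PLTE chunk replaced."""
    out, pos = [PNG_SIG], len(PNG_SIG)
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        data = png[pos + 8:pos + 8 + length]
        if ctype == b"PLTE":
            data = palette  # 3 bytes (R, G, B) per palette entry
        crc = zlib.crc32(ctype + data) & 0xFFFFFFFF
        out.append(struct.pack(">I", len(data)) + ctype + data + struct.pack(">I", crc))
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC
    return b"".join(out)
```

One base avatar plus ~250 tiny palettes would replace ~250 full recompressions.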

------
manigandham
This is interesting - but shouldn't better things like FLIF be gaining
developer momentum first?

That would solve basically everything wrong with today's decades old formats.

------
bartvk
Just checked; on my MacBook it installs with a quick

$ brew install zopfli

That's assuming you have Homebrew installed; if not:
[http://brew.sh](http://brew.sh)

------
sandGorgon
The lion's share of image creation these days happens on mobile, and I can't
seem to find a Zopfli library for Android. I wonder why - is it too heavy-duty
for a mobile CPU?

~~~
Nutmog
Photos maybe, but who makes images that would suit PNG on mobile?

------
Qwertious
Sounds like there should be a system to JIT images with this.

------
jMyles
> it's about as close as it gets to literally free bandwidth in our line of
> work.

Listen, I'm all about inventive ways to lighten the yoke of static media on
the web today.

But, in two important ways, this is not "literally free bandwidth":

1) The weaker: despite the tone of obviousness in this article, it
acknowledges that the choice of which technology to use is not made for you:
there are edge cases where other methodologies are indeed superior. So, far
from being free, these sorts of solutions do have a time cost.

2) The stronger: we live in a world where, on a great day, the user's realized
downstream bandwidth is 20% of their LAN connection, and their upstream 5% or
less.

Connecting to a next-door neighbor via a conventional web application served
through a typical corporate ISP probably means pushing packets a thousand
miles or more, only for them to come back into our community.

Complicating this issue: our name service and certificate distribution are
implemented in a way that is reasonably called "incorrect."

Our ISPs have a "speak when spoken to" mentality about connectivity, and
competition is rare.

A solution bragging "literally free bandwidth" needs to address this concern -
let me transfer a piece of media to a next-door neighbor using the other
95% of my network interface's upstream capacity. That I'll call free bandwidth.

~~~
cbsmith
Both scenarios are free bandwidth.

