

Optimizing Images on Alexa's Top 100 Global Sites - rast-a
https://blog.kraken.io/optimizing-images-on-alexa-top-100-global-sites/

======
dm2
A third of Google's image size is this thing:
[https://ssl.gstatic.com/s2/oz/images/notifications/spinner_3...](https://ssl.gstatic.com/s2/oz/images/notifications/spinner_32_041dcfce66a2d43215abb96b38313ba0.gif)
which is basically a loading icon that is rarely seen.

How does HTTP compression factor into image optimization? Is it possible to
optimize an image but have the HTTP-compressed size end up being greater?
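
A quick way to sanity-check that question: JPEG/PNG bytes are already high-entropy, so gzip (which is what HTTP compression usually means) can't shrink them further, and its framing adds a little overhead. A minimal Python sketch, using random bytes as a stand-in for optimized image data:

```python
import gzip
import os

# Stand-in for an already-optimized JPEG/PNG: compressed image data is
# close to random, which is the worst case for gzip.
payload = os.urandom(100_000)

gzipped = gzip.compress(payload)

# gzip finds no redundancy in high-entropy bytes, and its header,
# trailer, and block framing add overhead, so the transferred size
# can indeed end up *larger* than the optimized file itself.
print(len(payload), len(gzipped))
print(len(gzipped) > len(payload))  # True
```

This is why servers are normally configured not to gzip image MIME types at all.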

You should release a monthly report using this same algorithm. You should also
put in a column for competitors' algorithms (not just the ones you beat, all of
them).

Don't forget that some images are loaded after the page and content load, so
they have very little impact on user experience (and all these sites make
heavy use of CDNs).

~~~
yalooze
At first I assumed Google's reduction was so high because they'd scraped the
site on a day when the logo was a doodle. But I just manually checked Google's
main logo and was surprised to see a 40% lossless reduction. I would assume
they dogfood PageSpeed (or at least use some internal version of it) which
does image optimisation automatically for them. So is Kraken just better or is
it just an oversight from Google?

~~~
dm2
There might be a browser support issue that prevents doing it.

The bandwidth and mobile loading might not be an issue because of the
massive number of data centers they serve content from.

It seems like it would be easy to test the new image and see if anyone
reports any issues, but there might just not be any benefit to reducing the
homepage image by 40%. Keep in mind that the homepage image is being
displayed at a smaller size than the actual image; some optimizers take this
into account and will give you a pre-scaled image.

~~~
DanBC
> but there might just not be any benefits to reducing the homepage image by
> 40%

Wouldn't there be advantages for people on limited-bandwidth mobile data plans?

I'm gently curious about the amount of CO2 it takes to push that extra 40%
through the Internet.

(Off topic: Seeing the Daily Mail on the Alexa list is profoundly depressing.)

~~~
dm2
CO2 from extra bandwidth? Probably less than a text-message.

According to this, "Club Penguin" is a major source of CO2; I'm not sure why.
[http://visualization.geblogs.com/visualization/co2/#/club_pe...](http://visualization.geblogs.com/visualization/co2/#/club_penguin)

------
tmikaeld
Hm, I tested the field/mountain comparison from kraken.io's front page on
JPEGmini: 184 kB on JPEGmini vs 242 kB on Kraken.io.

Seems their solution is both cheaper and better?

[http://www.jpegmini.com/](http://www.jpegmini.com/)

~~~
ksec
Well, the server version starts at $199, so I don't think that is cheap.

~~~
tmikaeld
Considering you're not limited to any number of images, yes it is.

------
BorisMelnik
I don't know much about the technical side of things, but as someone studying
SEO and UX this seems like a huge win. Not only does it cut down on page size
and load time, it's so much better for UX.

~~~
noir_lord
> I don't know much about the technical side of things, but as someone
> studying SEO and UX.

I felt a great disturbance in the Force, as if millions of developers suddenly
cried out in terror...

Joking aside, it is usually a win to reduce page load sizes from a speed and
bandwidth point of view (which on a heavily trafficked site can add
significantly to costs). Occasionally you have to be careful, as some of the
compression methods result in non-standard or "technically standard but the
client doesn't really do it that way" files, which can render corrupted or
more slowly.

