

Automatically Compressing Your Amazon S3 Images Using Yahoo’s Smush.it Service - drp
http://developer.yahoo.com/blogs/ydn/posts/2010/10/automatically-compressing-your-amazon-s3-images-using-yahoo%E2%80%99s-smush-it-service/

======
simonw
I just used Smush.it to optimise a couple of images on <http://lanyrd.com/>
(the logo and the background image) and it knocked a good 5KB off the page
load - not too bad at all.

Our filenames are generated from the truncated SHA1 hash of the file contents,
so logo.png becomes logo.16c7e567.png - which means we can safely set a far-
future expires header and serve through Amazon CloudFront without worrying
about changes to the images, JavaScript or CSS not propagating to the live
site.
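
(For anyone curious, a minimal sketch of that naming scheme in Python; the
helper name and the 8-character truncation are assumptions for illustration,
not necessarily exactly what Lanyrd does.)

```python
import hashlib
import os

def fingerprinted_name(path, length=8):
    """Return a name like logo.16c7e567.png derived from the file's contents.

    The fingerprint changes whenever the file changes, so the asset can be
    served with a far-future Expires header: a new version simply gets a new
    URL, and stale CDN/browser caches are never a problem.
    """
    with open(path, "rb") as f:
        digest = hashlib.sha1(f.read()).hexdigest()[:length]
    base, ext = os.path.splitext(path)
    return f"{base}.{digest}{ext}"

print(fingerprinted_name("logo.png"))  # e.g. logo.16c7e567.png
```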

~~~
imurray
Thanks for the example. You can do another 1KB better than Smush did for you:

A mirror of your logo:
<http://homepages.inf.ed.ac.uk/imurray2/tmp/logo.16c7e567.png>

A smaller file: <http://homepages.inf.ed.ac.uk/imurray2/tmp/logo.squish.png>

My notes on squishing images:
<http://homepages.inf.ed.ac.uk/imurray2/compnotes/squish_images.html>

~~~
simonw
Neat - thanks for that.

------
SabrinaDent
_Autosmush scans your S3 bucket, runs each file through Smush.it, and replaces
your images with their compressed versions._

I think I'm in love.

~~~
coderdude
That depends on how many images you have. Doesn't it have to download each
image and then re-upload it? That could be quite a cost for minimal savings if
you're storing a large number of images.

Edit: I'm talking about 20,000+ product images, not the images or files
associated with a layout.
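
(For a rough sense of scale, a back-of-the-envelope sketch; the image count is
taken from this comment, but the sizes and prices are placeholder assumptions,
not actual AWS rates.)

```python
# Rough one-off cost of pulling every image out of S3 and re-uploading it.
# All prices are illustrative placeholders -- substitute current S3 pricing.
NUM_IMAGES = 20_000          # product images, per the comment above
AVG_SIZE_MB = 0.2            # assumed average image size
PRICE_PER_GB_OUT = 0.15      # $/GB transferred out of S3 (placeholder)
PRICE_PER_1K_PUT = 0.01      # $/1,000 PUT requests (placeholder)
PRICE_PER_10K_GET = 0.01     # $/10,000 GET requests (placeholder)

transfer_gb = NUM_IMAGES * AVG_SIZE_MB / 1024
cost = (transfer_gb * PRICE_PER_GB_OUT
        + NUM_IMAGES / 1_000 * PRICE_PER_1K_PUT
        + NUM_IMAGES / 10_000 * PRICE_PER_10K_GET)
print(f"~{transfer_gb:.1f} GB out, estimated one-off cost: ${cost:.2f}")
```

Transfer out of S3 is the dominant term, which is why running the job from
EC2 changes the picture, as the reply below notes.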

~~~
bkrausz
Wouldn't it be free (or close to it) if you run it on an EC2 instance?

------
steve19
Is there a tool that can do this for a local directory? I have Googled but
could not find one.

~~~
inm
<http://github.com/grosser/smusher>

There's a bunch of others on Github but this is the only one I've personally
used.
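
(Not the smusher gem itself -- just a rough local-directory equivalent in
Python to show what such a tool boils down to, assuming Pillow is installed;
it only re-encodes PNGs losslessly, so it's less thorough than Smush.it.)

```python
import os
from PIL import Image

def squish_pngs(root):
    """Losslessly re-encode every PNG under root in place, reporting savings."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(".png"):
                continue
            path = os.path.join(dirpath, name)
            before = os.path.getsize(path)
            img = Image.open(path)
            img.load()  # read pixel data before overwriting the source file
            img.save(path, optimize=True)  # ask the encoder for its smallest output
            print(f"{path}: {before} -> {os.path.getsize(path)} bytes")

squish_pngs("images/")
```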

~~~
steve19
Thanks

