
Show HN: Dynamic image size generator - ryan21030
Hi,

Firstly, the link to the tool: https://github.com/DrRoach/DynamicImage

I created a dynamic image sizing tool that allows images of any size to be generated on the fly. Its aims are as follows:

a) Allow users of sites to upload images of any size without the site owner having to worry about their dimensions or having to resize them.

b) Reduce load times for sites, as images can be stored on their own dedicated server and served in milliseconds.

I'd love to hear some feedback on what you think, whether you'd consider using this tool, and if not, why not?

All constructive criticism is welcome!
======
ShirsenduK
Thumbor is an awesome open source solution which has proved itself on many
high-traffic sites.

[https://github.com/thumbor/thumbor](https://github.com/thumbor/thumbor)

------
digsmahler
Nice! This is definitely an important tool when building a site that serves
images across a variety of differently sized screens. Having worked
on this type of project before, I'll mention a few challenges you haven't yet
addressed.

At some point of scale, your single DynamicImage server will have more resize
requests than it has CPU cycles to serve. The straightforward solution will be
to add more DynamicImage instances. This means that your source image
directory will have to be accessible from all instances. It also duplicates
your cache directory, meaning a single image size could get generated on
multiple servers. You might solve that by having a shared image cache on a
network directory. Another solution could be to use a hash mechanism at the
load balancing layer to pick a particular backend server, so any particular
image would always be generated on the same instance.
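A minimal sketch of that hash-routing idea (in Python rather than the project's PHP; the backend names are hypothetical): hash the image name and map it onto a fixed pool of backends, so every request for the same source image lands on the same instance and its cache.

```python
import hashlib

# Hypothetical pool of resize backends behind the load balancer.
BACKENDS = ["img1.internal:8080", "img2.internal:8080", "img3.internal:8080"]

def backend_for(image_name: str) -> str:
    """Route every request for the same source image to the same backend,
    so each size is only ever generated (and cached) on one instance."""
    digest = hashlib.sha1(image_name.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(BACKENDS)
    return BACKENDS[index]
```

Note that with plain modulo hashing, adding or removing a backend reshuffles most keys; consistent hashing avoids that, at the cost of more bookkeeping.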

Something interesting happens when people upload an image to their newsfeed.
If it's a public newsfeed, there can be a good many requests for the image
within seconds of upload. This means that you can get multiple cache-miss
requests for the same image arriving at your DynamicImage server at the same
time. Your PHP processes run independently of each other, so your server will
be generating the same resize simultaneously in several of them. For users
with only a few friends, this is a small waste of CPU cycles, but for users
with massive followings (e.g. Britney Spears), a single upload could grind
your DynamicImage server to a halt purely through simultaneous duplicated
work. Varnish (and other load balancers) can solve this by collapsing
multiple requests for the same resource into a single HTTP call
([https://varnish-cache.org/docs/3.0/tutorial/handling_misbehaving_servers.html](https://varnish-cache.org/docs/3.0/tutorial/handling_misbehaving_servers.html)).
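The request-collapsing idea above can be sketched in application code too (a minimal "single-flight" pattern in Python; the names here are illustrative, not from DynamicImage): concurrent cache misses for the same key block on one generator instead of each doing the resize.

```python
import threading

# Minimal single-flight sketch: concurrent cache misses for the same key
# wait for one "leader" thread instead of all doing the resize work.
_results = {}
_inflight = {}
_lock = threading.Lock()

def get_or_generate(key, generate):
    with _lock:
        if key in _results:
            return _results[key]          # cache hit
        event = _inflight.get(key)
        if event is None:                 # first miss: become the leader
            event = threading.Event()
            _inflight[key] = event
            leader = True
        else:                             # duplicate miss: wait for leader
            leader = False
    if leader:
        value = generate()                # expensive resize runs exactly once
        with _lock:
            _results[key] = value
            del _inflight[key]
        event.set()
        return value
    event.wait()
    return _results[key]
```

In practice you would let Varnish (or nginx's `proxy_cache_lock`) do this in front of PHP, since PHP-FPM workers don't share memory, but the mechanism is the same.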

------
technimad
I've created scripts like these as part of our home-brewed CMS in the
early 2000s.

It also makes a nice and easy DoS tool: enumerate the width and height
parameters and bring the server down fast.

A better way to do it is to specify, or allow admins to specify, a set of
named sizes and only allow those named sizes.

You could also think about adding parameters to control the crop, or force
an aspect ratio, etc. (with the same risk of DoS).
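The named-sizes idea is a few lines in any language (a Python sketch; the preset names and dimensions are made up): arbitrary width/height values from the URL never reach the resizer, only admin-defined presets do.

```python
# Hypothetical whitelist of admin-defined named sizes; any other
# value coming in from the URL is rejected outright.
NAMED_SIZES = {
    "thumb": (150, 150),
    "card": (400, 300),
    "hero": (1200, 600),
}

def resolve_size(name: str) -> tuple[int, int]:
    """Map a preset name from the query string to (width, height),
    refusing anything not explicitly whitelisted."""
    if name not in NAMED_SIZES:
        raise ValueError(f"unknown size preset: {name!r}")
    return NAMED_SIZES[name]
```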

~~~
noir_lord
I did both, and used a .htaccess rule to look for a sharded filepath based on
the hash of the requested image and the parameters in the URL.

So you could do a1jjajda.jpg?size=200x150 and it would check for the existence
of a/l/jjajda/200x150.jpg; if it didn't exist, it would create it from
a/l/jjajda/original.jpg, store it in the right place, and serve it.
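A rough sketch of that sharded cache layout (Python; the hash choice and path shape are assumptions modelled on the description above, not the actual implementation):

```python
import hashlib
from pathlib import Path

def cache_path(root: str, original_name: str, size: str) -> Path:
    """Derive a sharded cache path for a resized image: the first hash
    characters become shard directories so no single directory fills up."""
    digest = hashlib.md5(original_name.encode("utf-8")).hexdigest()
    return Path(root) / digest[0] / digest[1] / digest[2:8] / f"{size}.jpg"
```

Because the path is a pure function of image name and size, the web server can probe it directly and only fall through to PHP on a miss.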

The .htaccess file lookups meant I didn't have to boot PHP to serve an image
the second time (we were using Laravel, and even optimised it takes 30-40ms to
come up), and in 95% of cases not at all (particularly since I then wrote a
shell script that trawled the paths, built a list of common sizes and named
presets, and requested them when the server was quiet).

It worked out pretty well actually and had the benefit of relying on extremely
robust and well tested technology.

Those were some hinky-looking regexes, though.

    
    
        RewriteCond %{QUERY_STRING} ^presets=([a-zA-Z0-9_]+)?$
        RewriteRule ^files/(.*)/(.*)/(.*)/(.*) files/$1/$2/$3/%1_$4? [L,QSA]
    
        RewriteCond %{QUERY_STRING} ^options=[\[|\{]?([0-9]*x[0-9]*)[\]|\}]?$
        RewriteRule ^files/(.*)/(.*)/(.*)/(.*) files/$1/$2/$3/%1_$4? [L,QSA]

~~~
technimad
That's exactly how I did it, and then the creation of a new cached file was
handled by a PHP script run as the 404 errorHandler, which output the image to
both the browser and the file system.

Still pretty dangerous stuff, i.e. ?size=10000x2000000

~~~
noir_lord
Nah, I'm a cynical fucker, I had...

    
    
        $x = $x > 9000 ? 9000 : $x;
        $y = $y > 9000 ? 9000 : $y;
    

I also had checks for negative values and that what I got actually made sense
as an integer, since, well, it's the internet and a GET request; they can put
anything in.

Tbh even 81MP was pushing it, but I got to put in the comment:

// check and limit maximum image size
// it's over 9000!

As someone who was around online back then, how could I resist?
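The full validation described above (integer check, positivity, clamp) fits in one small function; a Python sketch of the same checks the PHP did:

```python
MAX_DIM = 9000  # the "it's over 9000!" clamp from the comment above

def parse_dim(raw: str) -> int:
    """Validate a width/height query parameter: it must parse as a
    positive integer, and is clamped to the maximum allowed dimension."""
    try:
        value = int(raw)
    except (TypeError, ValueError):
        raise ValueError(f"not an integer: {raw!r}")
    if value <= 0:
        raise ValueError(f"dimension must be positive: {value}")
    return min(value, MAX_DIM)
```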

EDIT: Just remembered, I had to really resist the urge to change the returned
image to a raised middle finger if either parameter was out of limits: not
because I didn't think it was funny, but because with my luck it'd be me that
fat-fingered it.

------
coleifer
Nginx has a module for this which you can tie into your cache:
[http://charlesleifer.com/blog/nginx-a-caching-thumbnailing-reverse-proxying-image-server-/](http://charlesleifer.com/blog/nginx-a-caching-thumbnailing-reverse-proxying-image-server-/)

------
Pigo
I was playing around with some image optimization libraries a few weeks ago.
This looks handy, but my wish list also includes image optimizations to reduce
file size.

------
dalore
If people are looking for a third party service that does this I would
recommend cloudinary.com

------
notrheadagain
There are SaaS offerings for that; Uploadcare offers it for free, for instance.

~~~
WillPostForFood
Good deal for a small site, but if you have traffic, it gets expensive fast:
$.25/GB bandwidth vs. $.085/GB from CloudFront vs. $.06 at a budget CDN.

------
godot
imgix is an image CDN service (not free) that offers this feature and more.
It's not usually used for personal projects, I guess, since there is no free
tier for it.

~~~
_eht
Imgix is great, but expensive as a CDN plus on-the-fly image resizing company.
I have been testing them for ~6 months for thumbnail generation but will be
transitioning off soon. The cost is running about half of my S3 storage and
transfer bill, for only generating thumbs.

It's hard to justify not just creating my own thumbnails and stashing them on
S3 for my use case.

