

500 requests/s with Ruby on Rails 4 on a 5$ / month server - matwiemann
http://blog.wiemann.name/500-requestss-with-ruby-on-rails-4-on-a-5-month-server

======
benologist
Am I reading the ab output wrong or did you only make 40 requests and 13 of
them failed?

Also, there's no mention of what your test actually is.

~~~
Keats
Indeed,

  Complete requests: 40
  Failed requests: 13

is far from good.

~~~
jcampbell1
And 28 were non-2xx responses, which is also a failure. Probably DB issues, since
there is no way SQLite can handle 20 concurrent connections.

~~~
matwiemann
I updated the blog post with a proper sample size. Thanks for the feedback!

~~~
benologist
You're missing the ab output now, which also hurts the quality of the analysis.

------
acanby
What is the best way to establish the optimal pool size? The article seems to
gloss over this and instead just says 'you need to set pool: 25', but I doubt
that is a one-size-fits-all solution.

Does anyone with experience profiling this sort of thing have some info to
share?

~~~
trustfundbaby
You probably want to find out max_connections for your database server.

Then figure out how many connections you'll need outside your app server
(rails console, cron jobs, command line), subtract that, and you have your
number.

I like to keep about 10 spare connections.

Running

  select name, setting from pg_settings where name = 'max_connections';

in Postgres, I see I have 100 max connections.

So that minus 10 gives me a pool size of 90.
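
The arithmetic above can be sketched in a few lines of Ruby (a minimal illustration; the 100 and 10 are just the example numbers from this comment, not universal values):

```ruby
# Pool-size arithmetic from the comment above (numbers are examples, not universal).
max_connections = 100  # as reported by pg_settings for this example server
reserved        = 10   # spare connections for rails console, cron jobs, command line
pool_size       = max_connections - reserved

puts pool_size  # prints 90
```

In a Rails app this number would then become the `pool:` value in config/database.yml.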

------
tlogan
Cool write-up - I had never heard of the Puma server. I should check it out.

BTW, similar results can also be achieved with any RoR app and JRuby 1.7 (for
me, JRuby 1.7 uses much less memory than 1.6). Threads in JRuby are quite
cheap. And, as always, the database is the bottleneck, but JRuby is cool
because you can embed BerkeleyDB in it (so open a temp Berkeley DB and cache
things like crazy).

------
nwh
If anybody wants to have a go at trying it themselves, the coupon "SSDTWEET"
will give you $10 credit on DigitalOcean.

------
viktorsr
Last time I benchmarked Puma on MRI 1.9 and 2.0, it was quite fast for most
requests, but some requests took more than 20 seconds to complete. The bottom
line is that it would be nice to see the full ab output.

------
LogicX
Discussion from his previous post in the series:
<https://news.ycombinator.com/item?id=5763282>

------
timmillwood
I find a single dyno on Heroku does a good job too and costs $0 / month. So
far the most traffic I've handled is 1,708 page views in an hour with Rails 4
(and a lot of action caching).

~~~
siong1987
That's less than 1 request/second. Rails should be fine even without caching.
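
A quick sanity check of that figure in Ruby (trivial, but it makes the comparison concrete; 1,708 is the traffic number from the parent comment):

```ruby
# 1,708 page views spread over one hour, expressed as requests per second.
page_views = 1708
seconds    = 3600.0
rate = page_views / seconds

puts format('%.2f requests/second', rate)  # prints "0.47 requests/second"
```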

