
How much traffic have you handled and what was the architecture like? - kingnothing
We've been dealing with, to me, a large amount of traffic lately on a number of apps and I'm curious to see how it compares. Three recent Rails projects have had to deliver between 15k and 25k sustained requests per minute serving dynamic content. On the low side, I think we had 8 app servers; on the high side, we spun up over 20. In retrospect, I don't think all of the app servers were necessary, but of course it's better to err on the side of caution when AWS is so cheap. This was all Ruby 1.9.2, Rails 3, and MySQL.

What have you had to deal with and what was your setup?
======
chuhnk
What is your definition of an app server? An AWS EC2 instance? I think
understanding the throughput of a Rails application is very simple. Rails is
not thread safe, so to serve 10 requests concurrently you must be running 10
processes. That's not to say you will serve 10 reqs/sec; it depends on how
quickly the application responds.
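That arithmetic can be sketched directly (the numbers here are illustrative, not from the thread):

```ruby
# Capacity math for single-threaded app servers: each Rails process
# handles one request at a time, so peak throughput is bounded by
# worker count divided by average response time.
workers         = 10    # concurrent requests you can serve
avg_response_ms = 200   # assumed average response time

throughput = workers * 1000.0 / avg_response_ms  # requests per second
puts throughput  # => 50.0
```

At a 200 ms average response, 10 processes top out around 50 reqs/sec; a slower app serves proportionally less.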

The key is knowing where the bottlenecks lie and what gives you the greatest
gains for the least optimization effort.

Run Unicorn for your Rails apps. It runs in a master/worker setup in which the
master distributes requests to workers. Benchmarks have shown it to be
much faster than competitors. That said, you may also want to look into Phusion
Passenger Lite 3.0. I have not seen any clear benchmarks against Unicorn, so I
cannot give a definitive answer, but I have heard good things.
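A minimal Unicorn config sketch (the worker count, socket path, and ActiveRecord hooks here are assumptions, not from the thread):

```ruby
# config/unicorn.rb -- minimal master/worker setup
worker_processes 8                 # one in-flight request per worker
listen "/tmp/unicorn.sock", :backlog => 64
timeout 30                         # master kills workers stuck longer than this
preload_app true                   # load the app once in the master, then fork

before_fork do |server, worker|
  # the master's DB connection must not be shared across forks
  ActiveRecord::Base.connection.disconnect! if defined?(ActiveRecord::Base)
end

after_fork do |server, worker|
  # each worker opens its own DB connection
  ActiveRecord::Base.establish_connection if defined?(ActiveRecord::Base)
end
```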

Make sure static content is not being proxied through to your Rails apps.
You'll be wasting request processing on js, css, jpeg, and gif files in Rails
when it could be handled by Apache, lighttpd, nginx, or even Varnish if you've
layered in some caching. An even bigger win is to serve the static content
from a second domain on a different web server, so that browsers open parallel
connections to retrieve that content.
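For example, a front-end nginx sketch (paths and the upstream name are assumptions):

```nginx
upstream app {
  server unix:/tmp/unicorn.sock fail_timeout=0;
}

server {
  listen 80;
  root /var/www/app/public;

  # static assets served straight from disk, never touching Rails
  location ~ ^/(images|javascripts|stylesheets)/ {
    expires max;
    add_header Cache-Control public;
  }

  location / {
    try_files $uri @app;   # a file on disk wins; otherwise proxy
  }

  location @app {
    proxy_set_header Host $http_host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_pass http://app;
  }
}
```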

Enable caching in Rails so that classes, views, etc. stay in memory. It helps
to not have to reload a class or view every time a request is made.
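In Rails 3 terms, that's a few lines in the production environment file (a sketch of the relevant settings only):

```ruby
# config/environments/production.rb
config.cache_classes = true                        # don't reload classes per request
config.action_controller.perform_caching = true    # enable page/action/fragment caching
config.action_view.cache_template_loading = true   # keep compiled templates in memory
```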

Look for the most efficient templating engine for Rails. ERB was 3 times
slower than Erubis last time I checked.

Are you IO or CPU bound? Run top and vmstat. If you are IO bound, then CPU
utilization will be low and it's most likely waiting on the db. This could be
for any number of reasons. Rails is notorious for selecting everything. Limit
your queries and select only the fields that are required. Index columns used
in your where clauses and on join fields. If your entire dataset does not fit
into the buffer pool, then your MySQL might be hitting disk, which is orders
of magnitude slower than RAM.
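A sketch of that query advice in ActiveRecord (the model, columns, and migration name are hypothetical):

```ruby
# Hypothetical Post model, Rails 3-era syntax.

# Selects every column: SELECT `posts`.* FROM `posts` WHERE ...
posts = Post.where(:published => true)

# Select only what the view actually needs:
posts = Post.where(:published => true).select("id, title, created_at")

# Index the columns used in WHERE clauses and joins:
class AddLookupIndexes < ActiveRecord::Migration
  def change
    add_index :posts, :published
    add_index :comments, :post_id
  end
end
```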

Set noatime on all your servers so that file access times are not updated;
it makes a huge performance difference. This can be done on your database
server too.
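On Linux that's a mount option, e.g. in /etc/fstab (device names here are just examples):

```
# mount with noatime so reads don't trigger metadata writes
/dev/xvda1   /               ext4  defaults,noatime  0 1
/dev/xvdf    /var/lib/mysql  ext4  defaults,noatime  0 0
```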

I hope some of that helps.

------
briandoll
At New Relic, we have customers that scale Rails apps well over 100k requests
per minute. They also have the advantage of understanding how their app scales
(throughput, CPU, memory at various levels of load) and how they are utilizing
their infrastructure.

Check out <http://newrelic.com> - We have a week-long free trial of our
premium service and a forever-free version with fewer features as well.

As far as what architecture/infrastructure is required to support certain
levels of traffic, it really depends. However, the faster your app, the fewer
app instances you need to serve a given amount of traffic.

