
Optimizing Nginx for High Traffic Loads - ichilton
http://blog.martinfjordvald.com/2011/04/optimizing-nginx-for-high-traffic-loads/
======
morganpyne
I would also recommend that you gzip any static assets and use
<http://wiki.nginx.org/HttpGzipStaticModule> to serve these pre-gzipped files.

No point in re-compressing the same files over and over again. I normally gzip
all static assets as part of the deployment process.
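A sketch of what that looks like in the nginx config, assuming ngx_http_gzip_static_module is compiled in (it is not built by default):

```nginx
# Serve foo.css.gz instead of compressing foo.css on the fly,
# whenever a pre-gzipped sibling exists and the client accepts gzip.
gzip_static on;

# Still compress dynamic responses that have no .gz sibling.
gzip on;
gzip_types text/css application/x-javascript application/json;
```

The deploy step can be as simple as `for f in *.css *.js; do gzip -c9 "$f" > "$f.gz"; done`.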

~~~
JshWright
Perhaps even more importantly than not recompressing... If Nginx is reading
gzipped files, rather than full size files, it reduces the pressure on
filesystem cache. This increases the odds that the system won't have to go to
the disk to find the requested resource.

------
patio11
Note the absence of any particular setting which will cause it to crash if you
get a link to your blog retweeted.

 _glares at Apache_

------
not_chriscohoat
Some useful Nginx optimizations, but I've always found that the bottlenecks
are far worse elsewhere. Nginx is a champ at serving static files, and I have
it proxy all requests upstream to Apache (with mod_wsgi, since the majority of
what I work with is Django apps). Optimizing the DB schema and queries has had
the biggest performance gains by far, followed by Apache tweaks, and then
finally Nginx config changes. But I really love the setup I use for how easy
it is to get a Django app spun up.
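A minimal sketch of that split, with hypothetical paths and an assumed Apache listening on port 8080:

```nginx
server {
    listen 80;

    # nginx serves static assets directly...
    location /static/ {
        root /var/www/myapp;   # hypothetical path
        expires 30d;
    }

    # ...and proxies everything else to Apache/mod_wsgi.
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```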

------
clemesha
If your app uses a database, you've got bigger problems than optimizing Nginx
for high traffic loads.

That said, I absolutely love Nginx because it's so damn lightweight, and easy
to setup and maintain.

~~~
mfjordvald
Even if your app does not use a database, you've still got bigger problems.
Optimizing Nginx is often a bit of a joke as you don't really get a lot for
your effort. I think reducing the IO impact of Nginx is probably the best
thing you can do. CPU-wise you're kinda stuck with the awesome stock
performance.

------
brendoncrawford
_> The biggest optimization happened when you decided to use Nginx and ran
that apt-get install..._

I recommend against doing this on Ubuntu Lucid without first installing the
unofficial nginx PPA, unless you want to get stuck with v0.7.65.

~~~
mfjordvald
Thank you for pointing this out, I always tell people to do this so I should
really take my own advice.

~~~
brendoncrawford
Well, it doesn't negate your statement. apt-get is still a great way to
install nginx.

~~~
justincormack
I don't know, I compile from source. The issue is that modules have to be
compiled in at build time and everyone needs different sets, so there is
rarely a PPA with what you want unless you build your own.

It's a build model that does not work so well with binary package managers; it
works better with, say, the Gentoo USE flag model.

Oh, and if you're not using any nginx modules you're missing out on a lot of
its power: you should be able to move big chunks of your app, like auth, right
into nginx, use Redis and memcached directly, and so on.
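For example, the stock memcached module can answer cache hits without ever touching the app; a sketch, assuming responses are cached under their URI and a hypothetical upstream on port 8080:

```nginx
location / {
    set $memcached_key $uri;
    memcached_pass 127.0.0.1:11211;
    # On a miss (or memcached error), fall through to the application.
    error_page 404 502 504 = @app;
}

location @app {
    proxy_pass http://127.0.0.1:8080;  # assumed app backend
}
```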

------
originalgeek
I think he's got a little math problem. Many browsers will open up to 4
connections (possibly more, if the config is tweaked) to overlap requests for
content, so you might want to consider this when configuring your
worker_connections.

~~~
rryan
Each worker can handle many thousands of concurrent connections so this isn't
really an issue. This is where nginx and Apache differ. Generally you want one
worker per CPU core.
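In config terms (assuming a quad-core box; at the time of this thread worker_processes had to be set by hand to the core count):

```nginx
# One worker process per CPU core.
worker_processes 4;

events {
    # Each worker multiplexes this many concurrent connections.
    worker_connections 1024;
}
```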

~~~
originalgeek
I know how nginx works, I use it all the time. He is saying that if you have
1024 worker_connections, at 2 connections per user, you'll get 512 users per
worker process. But the truth with most modern browsers is that it's 4
connections per user, so at worker_connections = 1024 that is more like 256
users. If you really want to serve 512 users, you need to set
worker_connections to 2048. This is the math problem I was highlighting, a
simple division problem.
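Redoing the division with 4 connections per user:

```nginx
events {
    # 1024 connections / 4 per browser ≈ 256 users per worker;
    # to handle 512 users per worker, double it:
    worker_connections 2048;
}
```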

I believe you are talking about worker_processes

~~~
rryan
Oops -- you're right I misread your comment. Sorry about that.

------
anto1ne
Too bad there's not much said about optimizing latency. Serving a lot of
traffic is not that difficult, but shaving 50ms off the time it takes to serve
your files can make a big difference.

~~~
mfjordvald
This is kind of outside the scope of Nginx.

------
known
Optimizing == Customizing as per your business needs

