

How Twitter DDoS’d our Website 61 times in Past 2 days - chintan
http://trialx.com/enablers/2011/02/how-twitter-ddosd-our-website-61-times-in-past-2-days/

======
there
in short, twitter didn't DDoS them, it was the result of all the various bots
that look for new urls posted to twitter and fetch them for whatever reason.

the article says the "DDoS" was "80 requests within the span of 2-3 seconds".
if that amount of traffic causes as much trouble to your server as it did for
that site (load average of 31, out of swap memory, server reboots), you have a
really shitty architecture.

it sounds like twitter did these guys a favor by pointing out how poorly
engineered their site was. why they posted about it and drew attention to
their problem, i have no idea.

~~~
chintan
All these requests were first-time requests made before the cached pages were
created, hence they caused so much load.

~~~
tedunangst
How many requests does it take to start caching?

------
jefe78
You shouldn't be getting DDoS'd by a bunch of bots. Since you're using Apache
with that hardware, you should be able to avoid crashing the entire server.
Regardless, you should be able to control your server enough to keep it from
swapping and crashing. You may want to hire a sysadmin with more experience.
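The usual way to keep Apache from swapping is to cap the number of worker processes at what physical RAM can actually hold, rather than letting load spawn children until the box thrashes. A sketch for the prefork MPM (the numbers are illustrative assumptions, to be sized from per-child memory use):

```apache
# prefork MPM: cap concurrent children so total RSS stays below physical RAM.
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients           50    # Apache 2.2 name; MaxRequestWorkers in 2.4
    MaxRequestsPerChild 500    # recycle children to bound memory growth
</IfModule>
```

Under this cap, excess requests queue or fail fast instead of pushing the machine into swap.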

------
thezilch
I'm not familiar with the internals of WP Super Cache, but it is really odd
that a cached page can't be served at ~80 req/s. Perhaps Varnish will work in
the case where WP Super Cache can't be tooled to write through to the cache,
or sooner.
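Putting Varnish in front of Apache is a standard fix for this class of problem: anonymous page views never reach PHP at all. A minimal sketch, assuming Varnish 2.x-era VCL and Apache moved to a backend port (both assumptions, not the site's actual setup):

```vcl
# Varnish in front of Apache: cache pages for anonymous visitors.
backend apache {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Visitors without a WordPress login cookie get the cached copy.
    if (req.request == "GET" && req.http.Cookie !~ "wordpress_logged_in") {
        unset req.http.Cookie;
        return (lookup);
    }
}
```

Varnish also coalesces concurrent misses for the same URL by default, so a burst of bots hitting a brand-new post produces a single backend fetch.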

~~~
thezilch
Something else is really amiss here. As I refresh the article to double-check
the "~80 req/s":

$ curl -I -m5 http://trialx.com/enablers/2011/02/how-twitter-ddosd-our-website-61-times-in-past-2-days/
curl: (28) connect() timed out!

Surely their caching is still enabled and most of their articles should be hot
in an LRU.
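The "hot in an LRU" point is just that a small, popular working set of pages should live in memory and be nearly free to serve. A toy illustration using Python's built-in LRU cache (the render function is a hypothetical stand-in):

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # keep the 128 most recently used pages in memory
def render_page(url):
    # Stand-in for an expensive page render; hypothetical.
    return f"<html>{url}</html>"

render_page("/popular-article")  # first request: rendered and cached
render_page("/popular-article")  # repeat request: served from the LRU
```

After the first hit on an article, repeat traffic never touches the expensive path until the entry is evicted.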

------
tedjdziuba
"How I Run a Website with Swap On"

