That made me LOL. Is this really surprising? A symptom of years of high-level web development or similar... ?
This is /the/ classical optimisation strategy - it has been since the days of pen-and-paper computation (anyone remember "log" books?).
For example, who wouldn't like to hear the stories (possibly horror stories) of the early Twitter days? Even though there was lots of downtime for precisely that reason, there is a lot to learn.
As the number of requests climbs you reach limits, and those limits manifest to the end user as suckage: slowness, errors, and, in the worst cases, lost data. You can't plan those limits out ahead of time; you discover them as you go.
I don't know if people really use them as a reference, but Reddit is one of the few sites in the sweet spot of getting a lot of traffic and being pretty open about their architecture. Any presentation given on behalf of Reddit (PyCon '09, '10, this) is great fodder for technical Reddit users like myself.