
Imagine you're running a load balancer in front of your application. Because you care about your application's response time, you measure the start time when a request comes in and the end time when the response leaves your application. You record each and every diff, and you're super happy to find out that your 99.999999th-percentile response time is 2ms!

...Except, it turns out that the requests were spending 6330ms queued in your load balancer because ... well, it doesn't really matter why...

Something is coordinating the flow, so what you're measuring never sees the full picture: you're timing only the work your application does, not the time requests spend waiting to reach it.
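A minimal sketch of that measurement gap, with made-up numbers (arrival rate, service time, and queue behavior are all illustrative, not from any real system): the server times only its own handling of each request, while the client's experience includes the time spent queued in front of it.

```python
service_time_ms = 2.0       # how long the app takes once it dequeues a request
arrival_interval_ms = 1.0   # requests arrive faster than they are served

server_measured = []        # what the app records: dequeue -> response
client_observed = []        # what the caller sees: arrival -> response

server_free_at = 0.0
for i in range(10_000):
    arrival = i * arrival_interval_ms
    start = max(arrival, server_free_at)   # waits in the queue until the app is free
    finish = start + service_time_ms
    server_free_at = finish
    server_measured.append(finish - start)
    client_observed.append(finish - arrival)

def percentile(samples, p):
    ordered = sorted(samples)
    return ordered[min(len(ordered) - 1, int(p / 100 * len(ordered)))]

print(f"server-side p99: {percentile(server_measured, 99):.0f} ms")  # always 2 ms
print(f"client-side p99: {percentile(client_observed, 99):.0f} ms")  # grows with the backlog
```

The server-side histogram reports a rock-steady 2ms at every percentile, while the client-side number climbs with the backlog, which is exactly the scenario described above.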

RapGenius v. Heroku comes to mind: http://genius.com/James-somers-herokus-ugly-secret-annotated
