These are excessively detailed numbers with very little context. What is the code doing? What's the performance of the equivalent implementation in another language (I'd be most interested in a typed language like Java)?
I realize you probably have more context in your head, but for everyone else the body of the post adds very little to the title.
It might be just as fast in Java; I don't know. My only point is that serving 78 million requests in a 10-hour day, most within 5 ms (server processing only), each a unique gzipped page based on the user's past interaction with the site, all on a relatively cheap server, seems blazing fast to me. I'd think it was fast even if every user were shown the same page but each user's page was gzipped. Could Java match it?
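For scale, 78 million requests over a 10-hour day works out to 78,000,000 / 36,000 s ≈ 2,167 requests per second. As a rough way to probe the "could Java match it?" question, here is a minimal sketch that times the per-page gzip cost using the JDK's built-in GZIPOutputStream. The class name, the synthetic ~12 KB page, and the iteration count are all my own assumptions for illustration, not anything from the post:

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPOutputStream;

public class GzipBench {
    // Gzip a byte array in memory and return the compressed bytes.
    static byte[] gzip(byte[] input) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(input);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Synthetic ~12 KB "page" standing in for a rendered HTML response.
        byte[] page = "hello world ".repeat(1000).getBytes();
        int n = 10_000;

        // Warm up the JIT before timing.
        for (int i = 0; i < 1_000; i++) gzip(page);

        long start = System.nanoTime();
        for (int i = 0; i < n; i++) gzip(page);
        double msPerPage = (System.nanoTime() - start) / 1e6 / n;
        System.out.printf("gzip: %.3f ms per page%n", msPerPage);
    }
}
```

If the per-page gzip time on your hardware comes in well under 5 ms, compression alone wouldn't rule Java out; the rest of the 5 ms budget (personalization lookup, rendering, I/O) is where the comparison would actually be decided.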