Under the Poisson process assumption the average rate fully characterizes the distribution, so it is directly relevant to this discussion. The non-Poisson components of the phenomenon (correlated spikes in traffic due to events) are not well characterized by averages, which is why the original analysis relaxed that limitation by examining a single worst-case traffic hour. Beyond that, the Poisson process analysis is standard.
While 10k rps may be possible, it would require either vastly exceeding 6M requests per day (and per hour) or some very non-standard traffic event. Such events can happen in practice; for instance, email asset delivery is increasingly peaky due to modern email clients and can exhibit very large traffic spikes. But since HN is visited primarily by human users spanning many organizations, such a strongly correlated event seems unlikely.
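To make the arithmetic concrete, here is a small sketch (assuming the ~6M requests/day upper bound from the discussion) of the average rate it implies, the rate if the entire day's traffic were compressed into a single worst-case hour, and a Chernoff upper bound on a Poisson process producing a 10k-request second at that worst-hour rate:

```python
import math

# Upper-bound estimate from the discussion: ~6M requests/day.
REQUESTS_PER_DAY = 6_000_000

# Average rate over the whole day.
avg_rps = REQUESTS_PER_DAY / 86_400        # ~69 requests/second

# Even if the entire day's traffic landed in one worst-case hour,
# the implied mean rate is still far below 10k rps.
worst_hour_rps = REQUESTS_PER_DAY / 3_600  # ~1667 requests/second

# Under a Poisson model with mean lam per second, a Chernoff bound gives
# P(X >= k) <= exp(-lam) * (e * lam / k)**k for k > lam.
# Working in log space to avoid underflow:
lam, k = worst_hour_rps, 10_000
log_bound = -lam + k * (1 + math.log(lam / k))

print(f"average rps: {avg_rps:.0f}")
print(f"worst-hour mean rps: {worst_hour_rps:.0f}")
print(f"log P(X >= {k}) <= {log_bound:.0f}")  # hugely negative => negligible
```

So even under the deliberately pessimistic one-hour compression, a 10k-request second would require the non-Poisson, correlated behavior discussed above; the Poisson tail alone cannot get there.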
Again, you seem to be nitpicking my upper-bound estimate of their traffic. Fair enough. Are you making any broader point other than that we may mildly disagree on what that bound is? If so, that's fine; I picked a higher round number than is likely, which simply reinforces the broader point I was making.