The article explains that this only works if you assume there are many fewer sales than pixels, so that the number of sales evenly divides the pixel count.
And it wasn't sales data; I believe it had to do with chip design or market share (during the design phase of the current generation of consoles).
Can someone give a bit more justification for this? It seems like the average rate shouldn't be constant; it looks heavily dependent on the time and date.
If not, is there another justification for why sales mean should equal sales variance?
A priori it is assumed that sales are independent of time. That's part of what a Poisson distribution means: events arriving independently at a constant average rate, like repeated rolls of a fixed weighted die. The assumption could be wrong.
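The mean-equals-variance property is easy to check empirically. Here's a minimal sketch (λ = 4.0 is an arbitrary illustrative rate, and the sampler is Knuth's standard algorithm, not anything from the article):

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: multiply uniforms until the product falls below e^-lam.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
lam = 4.0
xs = [poisson_sample(lam, rng) for _ in range(200_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# Both sample mean and sample variance should land close to lam.
print(mean, var)
```

With enough draws, both numbers converge to λ; that identity is what the variance-based estimate in the article leans on.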
> justification for why sales mean should equal sales variance
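For what it's worth, mean equals variance is a property of the Poisson distribution itself, regardless of whether sales actually follow one. It falls straight out of the pmf:

```latex
P(X=k) = e^{-\lambda}\frac{\lambda^k}{k!}, \qquad
E[X] = \sum_{k \ge 0} k\, e^{-\lambda}\frac{\lambda^k}{k!} = \lambda, \qquad
E[X(X-1)] = \lambda^2,
```

so $\operatorname{Var}(X) = E[X(X-1)] + E[X] - E[X]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda = E[X]$. The real question is whether the Poisson model fits, not whether the identity holds.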