I usually only pay attention to real memory. I also notice that when my real memory runs out, my machine immediately starts hitting swap. I guess 4 gigs is just not enough anymore.
I actually don't think that's accurate, since it naively adds up the resident set size for each Chromium thread. The problem is that there are probably a lot of pages shared among those threads - the Chromium runtime and such. I think - but I'm not sure - that the resident set size includes pages that are shared with other threads/processes.
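The double counting being described can be illustrated with a toy example (the numbers below are made up for illustration, not measured from Chromium):

```python
# Two processes each map the same 30 MB of shared library pages,
# plus 10 MB of private pages of their own. The shared pages occupy
# physical memory once, but they show up in each process's RSS.
shared, private_a, private_b = 30, 10, 10

naive_sum = (shared + private_a) + (shared + private_b)  # counts shared twice
actual = shared + private_a + private_b                  # shared counted once

print(naive_sum, actual)
```

Summing RSS across processes reports 80 MB here, while the machine is really only using 50 MB for them - which is why per-thread RSS totals can overstate Chromium's footprint.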
You may be right. I get the same memory usage numbers from my Chromium memory usage script as from Chromium's "Stats for nerds", but Chromium's detailed memory usage tool does link to a bug where it over-reports its own memory usage: http://code.google.com/p/chromium/issues/detail?id=25454 I guess if even Google cannot figure out how much memory Chromium is using, laymen such as myself are not going to do much better.
(Forgive the Python notation; it would be easier to express this with summations if I could draw freehand, but since I can't, I figure code is clearer.)
Each thread in the process must be solely responsible for at least its resident set minus its shared set. Summing that over all threads already gives a lower bound, but it ignores the shared pages entirely. We can raise (improve) this lower bound a bit by adding back the smallest shared set among all of the threads, since at least that much shared memory must be resident once. If my thinking is correct, then we can be sure the total memory used won't be lower than this.
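A minimal sketch of that bound, written as a pure function over (resident, shared) pairs so the arithmetic is explicit (the function name and sample numbers are mine, not from the linked gist):

```python
def memory_lower_bound(procs):
    """Lower bound on total memory use.

    procs: list of (resident_kb, shared_kb) pairs, one per thread/process.
    Each entry contributes at least its private portion (resident - shared);
    the shared pages are then counted once, conservatively, using the
    smallest shared set observed.
    """
    private = sum(rss - shared for rss, shared in procs)
    smallest_shared = min(shared for _, shared in procs)
    return private + smallest_shared

# Example with made-up sizes in kB:
print(memory_lower_bound([(50000, 20000), (30000, 18000), (40000, 22000)]))
```

The private portions are 30000 + 12000 + 18000 = 60000 kB, and the smallest shared set is 18000 kB, so the bound here is 78000 kB. In practice the resident and shared figures would come from something like `/proc/<pid>/statm` on Linux.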
This is what I use to measure what I believe to be real memory usage for Chromium: https://gist.github.com/864606