"Computers are so fast today, does this benchmarking business really matter anymore?"
Famous last words.
Computers are much faster than they were years ago, but look at how much crap they have to deal with... So much crap, in fact, that we barely notice how much faster they are. So much crap that the hardware can barely keep up.
Performance is never mission critical until it becomes mission critical, and that usually happens all of a sudden. It's that moment a few days into production when the system becomes unbearably slow and the rest of the company is breathing down your neck asking "WTF is going on?! Fix it, NOW!".
It's problematic enough when you're just creating too many objects and thrashing the processor caches, but if your performance issues are baked directly into your coding style... good luck with that.
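As a rough illustration of the allocation problem, here's a minimal Python sketch (the `Point` class and function names are made up for this example, not from any particular codebase) comparing a loop that allocates a throwaway object on every iteration against one doing the same arithmetic on plain scalars:

```python
import timeit

class Point:
    """A tiny wrapper object; allocating one per iteration adds up fast."""
    __slots__ = ("x", "y")

    def __init__(self, x, y):
        self.x = x
        self.y = y

def sum_with_objects(n):
    total = 0
    for i in range(n):
        p = Point(i, i)  # a brand-new object every single iteration
        total += p.x + p.y
    return total

def sum_with_scalars(n):
    total = 0
    for i in range(n):
        total += i + i  # same math, no per-iteration allocation
    return total

if __name__ == "__main__":
    n = 100_000
    t_objects = timeit.timeit(lambda: sum_with_objects(n), number=20)
    t_scalars = timeit.timeit(lambda: sum_with_scalars(n), number=20)
    print(f"objects: {t_objects:.3f}s  scalars: {t_scalars:.3f}s")
```

On a typical interpreter the object-per-iteration version loses noticeably, even though both loops compute the exact same result. Now imagine that pattern as the default style across an entire codebase.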
We're not talking micro-optimization to squeeze an extra 0.5% performance out of a system. We're talking about an order of magnitude (or more) slowdown, in a language already not exactly known for its blazing speed.
My question of the day: why isn't more time spent on building abstractions that aren't just convenient but that are also fast? Why does readable, maintainable code have to mean code so slow it effectively turns Core i7s into Pentium 90s?
I don't think it has to mean that at all, but that's exactly the choice a lot of programmers seem to think they're making.