If DCI helps to keep your code maintainable, readable, and less buggy, is that a better tradeoff? I would think that 95-99% of the time, the answer is yes.
On a side note, if speed is mission critical, would Ruby really be the language of choice?
Famous last words.
Computers are much faster than they were years ago, but look at how much crap they have to deal with... So much crap, in fact, that we barely notice they are indeed much faster. So much crap that the hardware can barely keep up.
Performance is never mission critical until it becomes mission critical, and that usually happens all of a sudden. It's that moment a few days into production when the system becomes unbearably slow and the rest of the company is breathing down your neck asking "WTF is going on?! Fix it, NOW!".
It's problematic enough when you are merely creating too many objects and making processor caches irrelevant, but if your performance issues are baked directly into your coding style... good luck with that.
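To make "baked into your coding style" concrete, here is a minimal sketch of the DCI pattern as it is commonly done in Ruby: injecting a role into an object at runtime with `Object#extend`, once per interaction. The names (`Account`, `SourceAccount`, `transfer`) are hypothetical, chosen for illustration; the performance point is that in MRI each `extend` creates a singleton class, which historically invalidated the VM's method caches, so the cost recurs on every transaction, not just once at load time.

```ruby
# Hedged sketch of DCI-style role injection; all names are
# illustrative assumptions, not taken from the original text.
class Account
  attr_accessor :balance

  def initialize(balance)
    @balance = balance
  end
end

# A role: behavior the object takes on only within a context.
module SourceAccount
  def withdraw(amount)
    self.balance -= amount
  end
end

# The context: mixes the role in for this one interaction.
# In MRI, each call to extend builds a singleton class and
# (in older versions) blew away method caches, so doing this
# per transfer bakes the slowdown into the style itself.
def transfer(source, amount)
  source.extend(SourceAccount) # runtime role injection
  source.withdraw(amount)
end

acct = Account.new(100)
transfer(acct, 30)
puts acct.balance # => 70
```

A common workaround is to define the role's behavior on a wrapper (e.g. via `SimpleDelegator`) or on the class itself, trading some of DCI's runtime flexibility for cache-friendly method dispatch.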
My question of the day: why isn't more time spent on building abstractions that aren't just convenient but that are also fast? Why does readable, maintainable code have to mean code so slow it effectively turns Core i7s into Pentium 90s?
I don't think it has to mean that at all, but that's exactly the choice a lot of programmers seem to think they're making.