I've seen that countless times: exuberant "experts" hastily choose a new technology over the warnings of those who are a little more thoughtful in their technology choices. Usually the ignoring comes with a heavy dose of disdain and mocking of the person for not being "up to speed with the latest stuff".
Then when disaster strikes, they seem to have forgotten about how they were warned and write a blog post about how much smarter and more experienced they have now become.
Not suggesting that this is the case here, but I have to wonder if Cloud9 didn't have at least one person saying "Whoa there, sparky! Has any one of you considered how we are ever going to query this stuff?", etc.
I'd be a millionaire if I made a nickel every time someone guessed at a bottleneck rather than looked. If I wanted to expand my portfolio, I would add a nickel for every time someone says 'we had our programmers working for 6 months to improve performance' when the performance issues could have been solved with a $10k SSD array in a week.
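The "look, don't guess" point is easy to act on with a profiler. A minimal sketch in Python's standard cProfile (the workload and `slow_lookup` function are made up for illustration):

```python
import cProfile
import io
import pstats

def slow_lookup(items, key):
    # Hypothetical hot spot: a linear scan where a set lookup would do.
    return [i for i in items if i == key]

def workload():
    data = list(range(10_000))
    for _ in range(200):
        slow_lookup(data, 9_999)

# Measure where time actually goes before changing any code.
profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Sort by cumulative time so the real bottleneck tops the list.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()
print(report)
```

Five minutes of this kind of measurement frequently points somewhere nobody guessed.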
Nice nugget from the article: "Facebook uses MySQL to keep most of its data, are you going to get bigger than them"
You could rephrase that as Knuth did: "Premature optimization is the root of all evil"
First solve the problem, then optimize, unless the optimization gives you a special edge that'll bring in money.
When I started doing "stuff", I wish I had read an article like this. It would have saved me both sleepless nights AND unneeded worries.
This sort of approach won't scale too well when 10 startup companies all need to hire the core node.js (or insert favorite trendy base tech of the day) team.
It sucks to be the alpha tester. However, the SQL vs NoSQL debate is silly, because Facebook seems to be using both to scale.
A lot of content at Facebook seems to be served from a denormalized NoSQL storage and that is the useful takeaway insight. I do not know what I can learn from the fact that a team with some of the most talented SQL ops and developers can store some core data in SQL.
I've grown more conservative over time, but other people tend to continue to operate with the idea that with enough searching/testing/querying, there will always be 'the' answer which will solve all their issues. Not picking on node specifically, but choosing that a year ago or so as the platform to write a production system in seems hopelessly naive. I suspect that some of the people involved had this unconscious idea that, with enough work, the 'right' answer would magically appear, and they'd move on to the next problem. Turns out that's not what happens when choosing bleeding edge tech, but I'm not sure it's a lesson everyone learns (and I suspect it's a maxim not everyone really cares about anyway).
Superior technology (as distinct from new technology) almost always gives you an edge. The edge may be a small one, but it exists, assuming other things are equal (quality of your devs, say). Isn't this the whole point of pg's "Beating the Averages" essay?
But then, I guess you need some real hands-on experience to distinguish superior tech from 'the latest buzzword' tech. For most apps, node.js is probably an example of the latter rather than the former. So the OP's point holds.
This guy is not doing what he says people should do. He uses the latest fancy web page UX where he should use a well-proven, if boring, one. Do as he says, not as he does.
His whole article is about "whoops, going with the latest tech wasn't such a good idea - here is what you can learn from this".
So yes, he is using the latest fancy web page UX and he has written an article about what he has learned. Next site he makes will probably be using well proven UX.
(just add strip=1 to the end of cache searches)