Perhaps you believe that in web applications, all of the real computation happens on the server, and JavaScript is just for providing a nice UI. Some people probably want it to be this way, so they can do most of the work in their preferred language and runtime environment, and only deal with JS for the minimum of client-side work. But taking this approach leaves unused the embarrassment of riches we now have in client-side computing power, even on smartphones. If we make full use of that power, we can reduce server resource consumption and, more importantly from a user's perspective, the number of network round trips they have to wait for.
Or maybe you think that in modern web applications, most of the complex logic belongs on the client side, and the server is mainly for receiving, storing, and sending data. In that case, perhaps you think the most important attribute of a server-side language is performance. But it's probably quite rare for the server side of a real web application to be a dumb data store. So what happens if you write some non-trivial logic in JS and then discover that it needs to run server-side?
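To make that concrete, here's a rough sketch of the kind of logic I have in mind: a small validation routine written once and exported so the same file runs in the browser and under Node.js. The module name and the order fields are made up for illustration.

    // validate-order.js -- hypothetical shared module, usable in the browser and in Node.js
    (function (root) {
      // Reject orders with a missing product id or a non-positive quantity.
      function validateOrder(order) {
        var errors = [];
        if (!order.productId) {
          errors.push('productId is required');
        }
        if (typeof order.quantity !== 'number' || order.quantity <= 0) {
          errors.push('quantity must be a positive number');
        }
        return errors;
      }

      if (typeof module !== 'undefined' && module.exports) {
        module.exports = { validateOrder: validateOrder }; // Node.js (CommonJS)
      } else {
        root.validateOrder = validateOrder;                // browser global
      }
    })(this);

The browser can use it to reject bad form input before a request is ever sent, and the server can require the same file to re-check whatever it receives, so the rule only has to be written once.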
I think we're still figuring out what belongs on the client and what belongs on the server. Therefore, I think using the same language on both sides might have some benefit. Admittedly, I've never written a real application with Node.js, and the thought of callback spaghetti everywhere doesn't exactly appeal to me. But I might accept that in return for maximal code re-use.
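To be fair about the other half of that sentence, the "callback spaghetti" I'm wary of is the style where each step of a task only runs inside the previous step's callback, along these lines (a contrived example using Node's standard fs callbacks; the file names and config fields are invented):

    var fs = require('fs');

    // Read a config file, then read the file it points to, then write a
    // report -- three levels of nesting for three sequential steps.
    fs.readFile('config.json', 'utf8', function (err, text) {
      if (err) { return console.error(err); }
      var config = JSON.parse(text);
      fs.readFile(config.inputPath, 'utf8', function (err, data) {
        if (err) { return console.error(err); }
        var report = data.split('\n').length + ' lines\n';
        fs.writeFile(config.reportPath, report, function (err) {
          if (err) { return console.error(err); }
          console.log('report written');
        });
      });
    });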
One does not have to believe that "all of the real computation happens on the server" to not want to use JavaScript if one doesn't have to.
Where are these mythical Node.js applications that share so much code with the client though? What functionality is shared and how much time did it save anyone?
Personally, I wouldn't trust one browser-land function or library in my server application without digging through every line of code to make sure it's not going to do some browser-specific hackery. Then what have I saved? I'd rather go get a separate module that was built specifically for Node.js.
Then there's the time I'd save sharing my own code between Node.js and the browser, which would be next to nothing, because any time I saved would be spent thinking about platform differences anyway.
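To give one concrete example of the "browser-specific hackery" and platform differences I mean, a lot of nominally shared code branches on the environment somewhere, along these lines (a made-up but representative snippet), and those branches are exactly what I'd have to audit before trusting it on my server:

    // Same function name, completely different behaviour per platform.
    function saveBlob(name, data) {
      if (typeof window !== 'undefined' && window.localStorage) {
        // Browser path: stash the data in localStorage.
        window.localStorage.setItem(name, data);
      } else {
        // Node.js path: fall back to writing a file synchronously.
        require('fs').writeFileSync(name, data);
      }
    }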