It's not the cool new thing anymore, but it's definitely not going to go away. Its killer feature is that the same code can run on the client and the server. Not a big deal for most apps, but very important for some.
Also, ES6 & ES7 make callback-heavy Javascript so much more pleasant to program in. That addresses Node's main pain point.
> Its killer feature is that the same code can run on the client and the server
It's a feature in search of a problem. Nobody is rushing to replicate this, because it's difficult to leverage. Almost every language has a way to parse JSON, and most of them are easier to maintain.
I very much disagree. Using react and webpack I can render my views on the client and the server using the same code and using react-router I can even share the routing code unmodified on both ends. Having a fully powered SPA that cleanly falls back to server rendering for crawlers and no-script clients is pretty sweet, in my experience.
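To make the shared-rendering point concrete, here's a minimal sketch of the idea with React stripped away, so the mechanics are visible: the view is a pure function from state to an HTML string, and the server and the browser call the exact same function. The `renderProfile` function and the state shape are made up for illustration.

```javascript
// Pure view function: same input, same markup, whether it runs in
// Node for the initial response or in the browser after a state
// change. This is the property React's renderToString exploits.
function escapeHtml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}

function renderProfile(state) {
  return '<div class="profile">' +
    '<h1>' + escapeHtml(state.name) + '</h1>' +
    '<p>' + state.posts.length + ' posts</p>' +
    '</div>';
}

// On the server you would embed this in the page shell; in the
// browser you would assign it to element.innerHTML. Either way the
// markup is identical, so crawlers and no-script clients see the
// same thing the SPA renders.
const html = renderProfile({ name: 'Ada <3', posts: [1, 2, 3] });
```

With React the pure function is the component tree and the string concatenation is handled for you, but the underlying contract is the same.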
This has been my experience. To the extent that I'm trying to embed JavaScript everywhere now. Many server-side operations are deterministic and only need to be executed on the server because it's a trusted environment. They can be executed "optimistically" on the client if you have the same information and a compatible runtime. It makes everything feel snappier and enables a practical programming model: a single state transition function for each "action" performs the immediate client update and triggers validated server persistence in parallel. Without a ubiquitous execution environment that's four or five times as hard.
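A hedged sketch of that "single state transition per action" model: `applyRename` is a hypothetical action, written as a pure function so the same code can run optimistically in the browser and authoritatively on the server; only the persistence around it differs per environment.

```javascript
// Deterministic state transition: safe to run on either end because
// the same input always produces the same output.
function applyRename(state, action) {
  if (typeof action.name !== 'string' || action.name.length === 0) {
    throw new Error('invalid name');
  }
  return Object.assign({}, state, { name: action.name });
}

// Client side: apply immediately for a snappy UI, then ship the
// action to the server, which re-runs the same function in its
// trusted environment before persisting. If the server disagrees,
// the optimistic state gets rolled back to `before`.
const before = { id: 42, name: 'old' };
const optimistic = applyRename(before, { name: 'new' });
```

The payoff is that validation and update logic are written once; without a shared runtime you'd maintain two implementations and keep them in sync by hand.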
Pre-rendering SPAs on the server is rather nice, and there are several frameworks that cover the hard parts for you. But there's also no great downside to using Node.js for just that and writing the rest of the backend in some other language.
This is pretty much where I'm at. I've always rolled my eyes a bit at node.js, but a current client is using it to render SPAs and I don't think I mind that at all. For my own stuff I'd do the heavy lifting in Rails or on the JVM (depending on my needs), but I am truly sold on React/Webpack for that.
We do. We have one app where that ability is a key part of its design. I'm sure there are others.
On a different app I was working on yesterday, I was adding a feature. That feature was much easier to do on the server, so I moved a bunch of code from the client to the server. It was less than an hour's work in Node; it would have taken several days in any other framework.
We do. It's great: we can have a fast first-page load time while bootstrapping additional client-side functionality, and all subsequent page loads are ridiculously fast.
This way, we are making our web server "just another client."
> Its killer feature is that the same code can run on the client and the server.
In practice this (almost) never happens. The popularity of Node.js has much more to do with asynchronous I/O, and with having a package manager that works very well for the community.
The ability to deploy the same code to the front end and the back, for me anyway, works out quite well. With browserify, I can write common utilities and require the code using the CommonJS module system. It's pretty seamless. It doesn't change the skill set necessary for front-end development in any way, though.
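For illustration, here's a sketch of that shared-utility pattern. In a real project this would live in its own file (say `lib/validate.js`), be `require`d directly in Node, and be bundled for the browser by browserify; the email check itself is a deliberately naive stand-in.

```javascript
// --- lib/validate.js (sketch) ---
// Simplistic email check: good enough for client-side hints; the
// server would re-run the same function for authoritative validation.
function isValidEmail(s) {
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(s);
}

// Guarded so the snippet also runs outside a CommonJS module scope;
// in an actual browserify project you'd just write module.exports.
if (typeof module !== 'undefined' && module.exports) {
  module.exports = { isValidEmail: isValidEmail };
}

// --- usage, identical on either end ---
// const { isValidEmail } = require('./lib/validate');
const ok = isValidEmail('user@example.com');
const bad = isValidEmail('not-an-email');
```

The point isn't the validation itself but that there is exactly one copy of it, shared by the form in the browser and the handler on the server.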
Node's event driven design is also called "co-operative multi-threading", and is more efficient than OS threading. It is annoying that you have to use multiple processes or Web Workers to use more than one core, but that is far from being the main pain point.
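A small sketch of what "co-operative" means in practice: a callback runs to completion and must yield voluntarily (here via `setImmediate`) before anything else gets a turn. The scheduler never preempts, which is what makes it cheaper than OS threads, but also what confines a process to one core. The `countTo` helper is made up for illustration.

```javascript
const order = [];

// A chunked task that yields to the event loop between steps, so
// other queued callbacks can interleave with it. Without the
// setImmediate, the loop would run to completion and starve them.
function countTo(n, done) {
  let i = 0;
  (function step() {
    order.push('count:' + i);
    if (++i === n) return done();
    setImmediate(step); // voluntary yield: this is the "co-operation"
  })();
}

countTo(3, () => order.push('done'));
setImmediate(() => order.push('other task'));
```

Running this under Node, the 'other task' callback interleaves between counting steps precisely because `countTo` yields; replace `setImmediate(step)` with a direct `step()` call and it runs last instead.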
A), that does not allow parallelism, only concurrency. B), that is a description of all single-threaded systems. "Event driven design" is a euphemism for "async primitives bolted onto a single-threaded language." Multithreading is a superset of "event driven design", and a much more scalable one.
Node is largely a reaction to advances in event multiplexing like kqueue and epoll. It was intended to be a scripting language bolted on top of a single-threaded concurrency model. That design decision is responsible for the previously unseen degree of callback hell that is Node's primary pain point.
It was intended for low latency services that mostly wait on I/O. These services don't compute enough to need parallelism. Especially when you can just fork another process or FFI to a C library with proper threading if you need it. It is necessary that they be concurrent but parallelism introduces a LOT of complexity beyond that.
>Node's event driven design is also called "co-operative multi-threading"
I think you mean multi-tasking, not multi-threading. And isn't cooperative multi-tasking the Windows 3.1 model on DOS? Not sure how cooperative multitasking is better than OS threading.
And am I mistaken, or is Node.js the same thing as the Windows programming model from 1990 repackaged in JavaScript? Windows programs have a single message loop and Windows events, and you write callbacks that handle those particular events, like when the mouse clicks, when it moves, etc.
Well, callback hell is a consequence of the event-loop paradigm, which in turn comes from being single-threaded. But callbacks aren't a big deal since the introduction of native Promises in ES6. And a lot of people (me included) find the event-driven paradigm very useful for simple REST backends serving mobile apps.
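To illustrate the point about Promises: the same three-step sequence as nested callbacks versus a flat ES6 chain. The `step` helper is a stand-in that "completes" on the next event-loop turn instead of doing real I/O.

```javascript
// Fake async operation: resolves its label on the next tick.
function step(label) {
  return new Promise(resolve => setImmediate(() => resolve(label)));
}

// Callback style nests one level per dependent operation
// ("pyramid of doom"):
//   step1(a => { step2(b => { step3(c => { /* ... */ }); }); });
//
// Promise style stays flat, with a single error path for the
// whole chain:
const result = step('fetch user')
  .then(user => step('fetch posts for ' + user))
  .then(posts => 'rendered: ' + posts)
  .catch(err => 'failed: ' + err.message);
```

Each `.then` returns a new Promise, so dependent steps chain instead of nest, and one `.catch` at the end replaces an error argument threaded through every callback.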
I feel like if you make an effort to avoid callback hell, you can prevent it fairly easily. I don't know if I'm not writing complicated enough code, or everyone else is just bad at JavaScript.
> Its killer feature is that the same code can run on the client and the server.
In most cases that's not even completely true. Try to run anything with a Node require in the browser... That's why we have to use ugly hacks like browserify :/ (not meant to bash browserify, it's really useful; the ugly thing is having to bundle tens of thousands of lines of JavaScript from multiple packages into a single file).