
Node.js lets you write server applications that can handle tens of thousands of concurrent connections, in a loosely typed language like JavaScript that lets you code faster. It uses the same event-driven, non-blocking design as Nginx, which is why it can handle so many connections without a huge amount of memory or CPU usage.

If you were to do this on Nginx you'd have to write the module in C.

You can't do it on Apache because of Apache's multi-process/thread model.

The fact that you can write a web server in a few lines of easy to understand and maintain Javascript that can handle over 10,000 concurrent connections without breaking a sweat is a breakthrough.

Node.js may do for server applications what Perl did for the Web in the '90s.




  > If you were to do this on nginx you'd have to write the
  > module in C.
You can write a module to glue nginx and V8. Many people have done it. It takes less than 400 lines of code, and a lot of that is typical nginx module boilerplate. (The problem is more the lack of online documentation for nginx, perhaps.)

  > The fact that you can write a web server in a few lines of easy to
  > understand and maintain Javascript that can handle over 10,000
  > concurrent connections without breaking a sweat is a
  > breakthrough.
Yes. But the big performance issue is still hitting the database and disks. There's no point in having a super-fast web server if the DB is dog slow, like the vast majority of databases out there, including the NoSQL bunch. They aren't fixing the tension between latency, scalability, and reliability; to do that, they'd need to address many uncomfortable problems of current hardware architectures. This is the elephant in the room.


Well, at least it's an elephant we're all talking about a lot. It's not as if anyone's ignoring that.

If your DB is dog slow, you can still handle wildly high concurrency with Node. It's just that users will feel the slowness, and your DB will struggle. But at least your server won't sit there unable to serve requests while it waits on IO.

Of course it's still on the developer to architect their system for success. Node just takes one unnecessary bottleneck out of the equation.
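A sketch of what "not blocked while waiting on IO" means in practice. Here slowQuery is a hypothetical stand-in for an async database driver call, with setTimeout playing the part of 100ms of DB latency; the point is that five in-flight queries overlap instead of queueing behind one another.

```javascript
// slowQuery is a hypothetical stand-in for an async database driver call;
// setTimeout plays the part of ~100ms of DB latency.
function slowQuery(sql, callback) {
  setTimeout(() => callback(null, { rows: [] }), 100);
}

const start = Date.now();
let done = 0;

// Five "requests" all waiting on the slow DB at once. The event loop is
// never blocked, so they overlap: total wall time is ~100ms, not ~500ms.
for (let i = 0; i < 5; i++) {
  slowQuery('SELECT 1', (err, result) => {
    done += 1;
    if (done === 5) {
      console.log(`5 queries finished in ${Date.now() - start}ms`);
    }
  });
}
```

The loop itself returns immediately; the server could keep accepting new connections the whole time the DB is "thinking".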


"The fact that you can write a web server in a few lines of easy to understand and maintain Javascript that can handle over 10,000 concurrent connections without breaking a sweat is a breakthrough."

I think it's a big mistake to judge based on "number of lines taken to write 'hello world'". It's what happens when you have 20k LOC to maintain and a heap of complexity that matters.


Agreed. It's too bad it's hard to discuss complexity issues at that level in blog posts - it tends to skew things towards overly trivial examples.

I'm curious how suitable node's callback-centric model is for larger codebases - it's workable for smaller stuff, but can turn into spaghetti code quickly, like CPS code or giving someone elaborate instructions in passive voice. (Of course, relatively autonomous chunks of the system can be moved into their own processes.)
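The nesting shows up after only a few sequential steps. A sketch with hypothetical loadUser/loadPosts/renderPage helpers, each taking a Node-style (err, value) callback:

```javascript
// Hypothetical helpers, each taking a Node-style (err, value) callback.
function loadUser(id, cb)      { process.nextTick(() => cb(null, { id })); }
function loadPosts(user, cb)   { process.nextTick(() => cb(null, ['a', 'b'])); }
function renderPage(posts, cb) { process.nextTick(() => cb(null, `${posts.length} posts`)); }

// Three sequential steps already indent three levels deep; real code also
// has to thread error handling through every layer by hand.
loadUser(42, (err, user) => {
  if (err) return console.error(err);
  loadPosts(user, (err, posts) => {
    if (err) return console.error(err);
    renderPage(posts, (err, page) => {
      if (err) return console.error(err);
      console.log(page); // "2 posts"
    });
  });
});
```

Coroutines (or CPS transforms done by the compiler rather than the programmer) let the same sequence read top-to-bottom, which is the appeal of the Lua approach mentioned below.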

Off and on, I've been working on a similar event loop/async server framework in Lua, and I think Lua's coroutines make the resulting code easier to manage. (No time frame on that yet, btw.)


The complexity of callback-heavy code is why I think the Clojure Aleph library's hybrid approach presents an avenue worth exploring - if your language has good syntactic support for dealing with concurrency.



