Hacker News
All Node.js servers are vulnerable to DoS (groups.google.com)
75 points by johnx123-up on Jan 3, 2012 | 35 comments

Ryan Dahl commented on the thread, and it's being fixed in node. Nice to see assessment and responsiveness at the core of the project.

I'm evaluating node.js as an application platform choice for a large public infrastructure project. One thing that concerns me is (my perception here) a lack of public hardening of the server that's yet to come. I've been around long enough to see that effect on PHP, Django, Rails, etc.

We'll continue our evaluation but it's encouraging to know that issues like these are being discovered and addressed.

A look at node's HTTP parsing code, 1500 lines of hand-coded and rather pretty C, makes it clear that Ryan Dahl cares a lot about HTTP in node doing the Right Thing.


(This is also very handy for people writing HTTP servers and clients in other languages, since it's independent of node, and really fast and feature-complete.)

I don't think Dahl takes credit for all of this code:

"Based on src/http/ngx_http_parse.c from NGINX copyright Igor Sysoev"

This is a particular case of http://news.ycombinator.com/item?id=3401900. Basically, weak hash functions allow an attacker to create lots of hash table collisions, degrading performance to that of a linked list. It is common to put POST'ed data into a hash table (the equivalent of ?foo=bar&baz=qux becomes {"foo": "bar", "baz": "qux"}).
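To make the collision mechanics concrete, here's a toy chained hash table with a deliberately weak hash (this is an illustration only, not V8's actual hash function) showing how attacker-chosen keys collapse it into one long chain:

```javascript
// Toy chained hash table to illustrate the attack (not V8's real hash).
class ToyTable {
  constructor(buckets = 16) {
    this.buckets = Array.from({ length: buckets }, () => []);
  }
  // Deliberately weak hash: only the last character matters, so an
  // attacker can trivially force every key into a single bucket.
  hash(key) {
    return key.charCodeAt(key.length - 1) % this.buckets.length;
  }
  set(key, value) {
    const chain = this.buckets[this.hash(key)];
    for (const entry of chain) {
      if (entry[0] === key) { entry[1] = value; return; }
    }
    chain.push([key, value]);
  }
  get(key) {
    for (const [k, v] of this.buckets[this.hash(key)]) {
      if (k === key) return v;
    }
  }
}

const t = new ToyTable();
for (let i = 0; i < 1000; i++) t.set('key' + i + 'a', i); // all keys end in 'a'
// Every key landed in the same chain, so each get() now scans up to
// 1000 entries — linked-list behavior, not hash-table behavior.
console.log(Math.max(...t.buckets.map(b => b.length))); // 1000
```

With a real (strong but unseeded) hash, the attacker does the same thing by precomputing keys that genuinely collide, which is what the 28c3 paper demonstrates.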

This title is a great piece of FUD. Anyone paying a little more attention would have seen the original paper (http://www.nruns.com/_downloads/advisory28122011.pdf) which states: "PHP 5, Java, ASP.NET as well as v8 are fully vulnerable to this issue and PHP 4, Python and Ruby are partially vulnerable".

That's not to say that Node doesn't need to fix this (and it seems like Bert from the core team is) but it's not a Node specific issue.

I disagree.

Node.js and client-side JavaScript should treat security issues differently because they face different risks. E.g., a DoS against client-side JavaScript is not a big deal (it might slow down a single browser, or even just a single tab within a browser). However, on the server side, a DoS could take down an entire service, which is much more significant.

Thus you could say that V8 is "secure" on the client side but "insecure" on the server side because of the different risk assessments. It is poor security practice to take software designed for one security environment and assume it will be secure in other environments. If Node.js wants to have a secure system they will need to take these security issues into consideration and harden their system appropriately.

He's saying it's FUD because the headline is misleading, not because he's trying to downplay the security issue. You and grandparent are likely in agreement with respect to your comment.

(The headline is misleading because the issue affects several major language runtimes, V8 included – yet only Node.js is mentioned.)

Thanks; upon re-reading the comment I see it now.

Ruby 1.9, the current production version of Ruby, is not vulnerable. Rails 4, the next version of Rails, will require Ruby 1.9.

I wonder if this is a bigger deal for Node because it's single-threaded? Just one malicious POST request could slow down the entire server, whereas other languages that spawn a process for each request could easily kill a process that's using 100% CPU, right?

You can just as easily spawn multiple node.js servers and kill off the unresponsive ones.

Yeah, but you're encouraged to only spawn child processes for CPU-intensive tasks.

If you are using external processes (tools like gd's converters and so forth), yes. For code running purely in node's environment, the accepted way to make better use of extra CPU used to be running one or more processes per core, using something like nginx as a reverse proxy to tie them to one port.

There is even a cluster module built in now that removes the need for an external tool to manage the processes: http://nodejs.org/docs/latest/api/cluster.html (fuller-featured options are available as extra modules; I'm not sure how they compare efficiency-wise with the built-in one). I'm guessing this isn't the way to go if the processes need to communicate, but I've not looked into it deeply yet (my experiments with node haven't grown to the point of needing more than one core).

This might provide a little more explanation than the title and comment thread do: http://www.ocert.org/advisories/ocert-2011-003.html It is a big issue, but it isn't just node.js.

Looks like it's getting fixed:

>> Yep. Hang tight. v0.6.7 is coming up soon.

Putting the kibosh on the paranoia parade, huh?

It's like everyone figured out how hashes work just a couple of days ago. What happened to spark all of this conversation? I also keep hearing that this problem is solved with randomized hash functions, but my best guess is that this doesn't eliminate collisions; it just becomes roughly impossible to generate a set of keys that would cause enough collisions to actually be a problem. Hooray for data structures.

As a Ruby developer, I'm glad I won't need to deal with a "All Rails Servers are Vulnerable to DoS" FUD post in the next few days.

I was expecting a bunch of Perl developers in this thread reminding us that they fixed this problem in 2003. :)

So are all string-based dictionaries vulnerable?

Yes, if they are implemented via hash tables and do not randomize their hash generation somehow. The talk at 28c3 specifically mentions PHP, Java, ASP.NET, and Python. Ruby 1.9 is fine, but other variants of Ruby are apparently also vulnerable.

Worth noting that this has been "fixed" in Perl since 5.8.1, released over 8 years ago.


Ruby 1.9 is fine, 1.8 has problems.

1.8 is also fixed.

Go team! Good to know.

PHP already fixed this in the 5.3.9 and 5.4RC branches.

> Didn't I see this same thing about PHP the other day?

This affects all languages that use hash tables with non-randomized hash functions to store POST arguments; it's been discussed on Python's mailing list, for instance. It's also been noted on the Erlang list, but all Erlang frameworks apparently use proplists for POST mappings, so none of them is affected.
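The proplist approach sidesteps the attack entirely: store the parameters as a flat list of pairs rather than a hash map. A rough JavaScript rendering of the idea (real code would also URL-decode keys and values):

```javascript
// Proplist-style POST parsing: an array of [key, value] pairs instead
// of a hash map. Each insert is O(1), so parsing k parameters is O(k)
// total — there is no collision-driven quadratic worst case. Lookup is
// linear, but k is bounded by the request size anyway.
function parseQuery(body) {
  return body.split('&').map((pair) => {
    const i = pair.indexOf('=');
    return i < 0 ? [pair, ''] : [pair.slice(0, i), pair.slice(i + 1)];
  });
}

function lookup(pairs, key) {
  for (const [k, v] of pairs) if (k === key) return v;
}

const params = parseQuery('foo=bar&baz=qux');
console.log(lookup(params, 'baz')); // "qux"
```

The trade-off is that every lookup pays O(k), which is usually fine for a handful of form fields but would hurt if the same structure were used as a general-purpose dictionary.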

I believe that the critical ASP.NET framework update that Microsoft pushed just before the new year was to fix this issue as well, amongst a few others.


Anything measured can be improved.

That's the spirit of those "penis enlargement" ads.

To be pedantic, there are hard limits that we cannot improve upon. We cannot sort a list of n items in less than O(n) time, for example.

O(n) only applies to non-comparison-based sorts; for comparison-based sorts that might be used on arbitrary "items", the lower bound is indeed O(n log n).
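Counting sort is the standard example of the non-comparison case: for n integers known to lie in a range [0, k), it runs in O(n + k), beating the comparison bound by never comparing elements at all.

```javascript
// Counting sort: O(n + k) for n integers in the known range [0, k).
function countingSort(items, k) {
  const counts = new Array(k).fill(0);
  for (const x of items) counts[x]++;        // tally each value
  const out = [];
  for (let v = 0; v < k; v++) {
    for (let c = 0; c < counts[v]; c++) out.push(v); // emit in order
  }
  return out;
}

console.log(countingSort([3, 1, 4, 1, 5], 10)); // [1, 1, 3, 4, 5]
```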

Not necessarily specific to node.js, but in general, instead of a standard webserver, use netcat, on multiple obscure ports, where each instance of netcat acts once and is discarded.


That plan sounds like half-assed voodoo, but it kind of resembles the approach that qmail uses for security, which is actually pretty neat:


Thank you for the link, but my approach is neither "half-assed" nor "voodoo". I've used netcat, as described, for a small project, and it worked well. I'm about to do the same for a big project, and I expect it will again work well. Standard webservers are bloated. Speed, security and stability can be enhanced by distributing work across a system of one-shot processes. I take some inspiration from Jef Poskanzer's design decisions in thttpd.



NOTE TO DOWNVOTERS: You shouldn't downvote a technical suggestion, no matter how strange, unless you are certain that it won't work.

