
The added fault tolerance of Erlang is not free - it adds complexity and overhead. Would you rather maintain a distributed Erlang system running hundreds of tasks on tens of machines, or a simpler C++-based system that, thanks to its performance, can run on a single machine? Erlang may scale out, but does it scale down?

This is similar to the Ada argument -- it's great and can be used to write safe software, but do you need safe software a month from now or unsafe software a week from now? Global networks with millions of users are extreme outliers.




A few points:

- A large system never runs on a single machine if you want any realistic level of fault tolerance. Two nodes is the bare minimum, three an acceptable one.

- The number of nodes will always depend on where your bottlenecks lie. Few Erlang developers would write CPU-bound code directly in Erlang; the usual approach is to write that part in whatever language is appropriate and then coordinate things with an Erlang layer (see the sketch after this list).

- I would a hundred times more willingly maintain and debug a running Erlang system than a C++ one. I'm kind of sold on the idea, though, and wrote http://erlang-in-anger.com to share my experience there.
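
To make the "Erlang as a coordination layer" point concrete, here is a minimal sketch (not from the comment itself) of a gen_server that owns an external worker written in another language and lets a supervisor restart both when the worker dies. The executable name ./cpp_worker and the length-prefixed framing are assumptions for illustration, not a specific real project.

    %% worker_port.erl - hypothetical sketch: Erlang coordinating an
    %% external CPU-bound worker over a port.
    -module(worker_port).
    -behaviour(gen_server).

    -export([start_link/0, request/1]).
    -export([init/1, handle_call/3, handle_cast/2, handle_info/2]).

    start_link() ->
        gen_server:start_link({local, ?MODULE}, ?MODULE, [], []).

    %% Send a binary request to the external worker and wait for its reply.
    request(Bin) when is_binary(Bin) ->
        gen_server:call(?MODULE, {request, Bin}, 5000).

    init([]) ->
        %% open_port/2 spawns the external program (assumed path);
        %% {packet, 4} frames every message with a 4-byte length prefix.
        Port = open_port({spawn_executable, "./cpp_worker"},
                         [binary, {packet, 4}, exit_status]),
        {ok, Port}.

    handle_call({request, Bin}, _From, Port) ->
        port_command(Port, Bin),
        receive
            {Port, {data, Reply}} ->
                {reply, {ok, Reply}, Port};
            {Port, {exit_status, Code}} ->
                {stop, {worker_exit, Code}, {error, {worker_exit, Code}}, Port}
        after 5000 ->
            {reply, {error, timeout}, Port}
        end.

    handle_cast(_Msg, Port) ->
        {noreply, Port}.

    %% If the external process crashes, crash this server too and let the
    %% supervisor restart both - that is where the fault tolerance lives.
    handle_info({Port, {exit_status, Code}}, Port) ->
        {stop, {worker_exit, Code}, Port};
    handle_info(_Other, Port) ->
        {noreply, Port}.

The heavy lifting stays in the external binary; Erlang only supervises, restarts, and routes messages, which is the split the comment is arguing for.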



