On one hand, I agree with him. The software ecosystems we work in have a whole lot of needless and incidental complexity. I could go on and on about the insanely overcomplicated things that developers -- especially ones like Ryan Dahl -- have to deal with all the time.

On the other hand, it's arrogant for one to think that he or she could do it that much better than the next guy. Writing efficient, maintainable, and "simple" software requires adding layers of indirection and complexity. You have to use your best judgment to ask whether the new layer you're adding will make things ultimately cleaner and simpler for future generations of programmers, or will hang like a millstone around their necks for years to come.

Let's try a little thought experiment: go back a few decades, to the early 80s. Propose to build node.js as a tool to make it much easier for developers to write real-time network applications. You'll need to design a prototype-based dynamic language, itself an extremely difficult (and dare I say complicated) task. The implementation will need a garbage collector, a delicate, complicated, and cumbersome piece of code to write. To make it acceptably fast, you'll need to write a JIT, which traces and profiles running code, then hot-swaps in JITted routines without missing a beat. You'll need to write a library which abstracts away event-based IO, like the libev that node.js uses. That will require kernel support.
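
For concreteness, the payoff of all that machinery is the callback-driven style node.js exposes today. A minimal sketch using node's standard 'net' module (the echo server itself is just an illustration, not anything from Dahl's post):

    // A tiny "real-time" network service in the node.js style:
    // one thread, one event loop, IO delivered through callbacks.
    var net = require('net');

    var server = net.createServer(function (socket) {
      // Runs whenever a client connects; no threads, no locks.
      socket.on('data', function (chunk) {
        socket.write(chunk); // echo each chunk straight back
      });
      socket.on('end', function () {
        console.log('client disconnected');
      });
    });

    server.listen(8124, function () {
      console.log('echo server listening on 8124');
    });

Every line of that "application logic" quietly leans on the prototype-based language, the GC, the JIT, and the event-loop library described above.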

Frankly, even forgetting about relative CPU power at the time, I think you'd be laughed out of the room for proposing this. All of these were extremely speculative, "complicated" things for production systems at the time they were introduced. People can't predict the future, and they obviously have difficulty predicting which tools will become useful and simple, and which will become crufty tarpits of painful dependencies and incidental complexity. No one in 1988 could say "a dynamic, prototype-based, garbage-collected language paired with a simple event model will allow developers to create simple real-time network applications easily in 2011". Many of them probably had high hopes that C++ templates would deliver on the same vision by then. But, instead, we have Boost.

Further, it's extremely arrogant of Dahl to create a dichotomy between those who "just don't yet understand how utterly fucked the whole thing is" and those, like him, with the self-proclaimed clarity of vision to see what will help us tame and conquer this complexity. Who knows, maybe in 15 years we'll be saddled with tons of crufty, callback-soup, unreliable node.js applications we'll all have to maintain. I don't think James Gosling envisioned the mess that "enterprise Java" became when he designed the initial simple language. Most developers do many of the things he cites, like adding "unnecessary" hierarchies to their projects, because they believe it will help them conquer complexity and leave something simple and powerful for others to use down the line.
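
(For anyone who hasn't run into the term: "callback soup" is what you get when every asynchronous step nests inside the previous one's callback. A contrived sketch -- the file names and the 'transform' helper are made up for illustration:)

    var fs = require('fs');

    // Stand-in for whatever real work would be done on the data.
    function transform(text) { return text.toUpperCase(); }

    // Each IO step nests inside the previous step's callback, and
    // the error handling is repeated at every level.
    fs.readFile('config.json', 'utf8', function (err, data) {
      if (err) return console.error(err);
      var config = JSON.parse(data);
      fs.readFile(config.inputFile, 'utf8', function (err, input) {
        if (err) return console.error(err);
        fs.writeFile(config.outputFile, transform(input), function (err) {
          if (err) return console.error(err);
          console.log('done');
        });
      });
    });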




Erlang came out in 1986, and was used in production systems shortly thereafter. The world really is just catching up to the state of the art of the early 80s.

Ryan is right. Most of the software we use is crap. That's because Worse is Better.


Ryan doesn't claim he can tame it. The post comes off as self-deprecating and not arrogant to me:

"Node is fucked too. I am also one of these people adding needless complexity. ... The entire system is broken - all the languages and all the operating systems. It will all need to be replaced."

I am not a node.js user.


In fairness, the internals of Node get complicated due to cross-platform support. The fact that every OS does it differently is both a blessing (because we can learn from the mistakes of others) and a curse.


FYI, in 1988, Self (a dynamic, prototype-based, garbage-collected language that had a JIT) existed. I'm not sure how it handled concurrency, though.


Not well, unfortunately. Neither does it handle errors well. It doesn't have proper closures, either. I love it, but readily admit it's a giant research demo.


Lisp came out in 1958; Smalltalk came out in 1980.

Oh, hell, just watch the Mother of all demos (1968): http://www.youtube.com/watch?v=JfIgzSoTMOs

It's not arrogant of him to think like this. It's more like Steve Jobs, circa 2006, thinking phones sucked and deciding to do something about it. Or maybe it's like Steve Ballmer thinking he could take over the phone market. I think it's too early to say for sure, but the early signs are promising.


Modern Lisp didn't come out until much later, but still, Common Lisp was out in 1984, and that's still very early, especially considering that it was a response to the already widespread use of Lisp dialects.


Arrogance? Come on, it's a rant. Cut him some slack.

Other than that, I agree with pretty much everything you said. It's easy to forget how reliant we are on things like industrial-strength GC, multithreading, and JITs, and how young those things really are.


I don't see where he says he knows what the solution is.



