The Future of Programming in Node.js (groups.google.com)
278 points by Yakulu on Aug 13, 2013 | 176 comments



Only tangentially related, but the "rant" by Ryan Dahl (node creator) when he left Node day-to-day is worth a read:

https://gist.github.com/cookrn/4015437


This guy is either pretending to be an idiot or really doesn't understand just how complex everything is. The whole "thing" that is "fucked" is there to manage that insane complexity. If you don't have good tools or good libraries, or don't know how to use them, you won't be able to deliver on your promise of "solving the problem", because you will be overwhelmed by the complexity monster.

I'm sorry, I just can't get over it. "Just solve the problem" - I hate that attitude. That's how shitty software is born. Incompetent schmucks give birth to unholy, unmaintainable, monolithic code because they think it's okay to not know your shit as long as you "solve" the problem.


Yo, the point is that the complexity is there because the well-intentioned and simple underlying systems were not properly scaled to meet the demands of modern applications. At no point does he suggest that you should eschew the tools that are in place to maintain the current complexity – that would be nonsensical, as you pointed out. He's arguing that we should use the system as it was designed, if that's at all possible, or re-architect the system to handle modern application complexity without increasing system complexity unnecessarily. Achieving this would be near impossible – see Plan9, Minix, VPRI, and Squeak for "failed" attempts – hence this being a rant and not a serious proposal.

Note also that Ryan is a mathematician. Mathematicians hate engineering because there's nothing to protect real-world systems from little accidents that compound over time. No proof, no piece of code, no matter how elegant, can reverse the thousands of bad design decisions that have resulted in the modern Linux ecosystem.


Ryan Dahl created Node. He's not just some schmuck.


> Ryan Dahl created Node. He's not just some schmuck.

---

"The only interesting languages are C/C++, Go, Dart, JS (and possibly Rust). Everything else is legacy bullshit."

"Node.js has linear speedup over multiple cores for web servers.”

There is also the one about math not being useful for computer programmers, and how programming is just like playing with Duplo blocks and Legos.

---

I don't know. Node scaling linearly over multiple cores? I can see how many would agree with you that he created Node.js, but might think twice about the schmuck label.


I know. That doesn't mean he can't be incorrect about something. Let's not appeal to authority here.


Fine, let's all go back to business as usual of calling people idiots to make ourselves feel superior, HN.


Also, I didn't mean to say that he writes crappy code. What I meant was, those who already write crappy code will now be less inclined to improve because some tech-celebrity said it's okay.


Those are fighting words bro, I'll send you a code sample right now, see who's the ninja.

Look, you can't just throw cheap shots, that's all, douche. What was the need for calling him an idiot or incompetent?


I agree, no need to call anyone an idiot or incompetent.

But, I would not want Ryan writing code that people completely trust to live on (hospital equipment, NASA, etc).

That does not make him an incompetent person. Just someone who wants to keep things as minimalistic as possible, which is just one of the many paths available for us to choose in life. In some cases this is great, in others it is not.


If you write every piece of code the way one would write hospital-equipment or mission-critical code, then you are wasting your time and money.

Just like I wouldn't expect a civil engineer to build a bridge in a suburb capable of holding 5 tons because "NASA does it that way because their spaceships are heavy", I wouldn't expect the same guarantees from software engineers writing an image-sharing application.


I wouldn't trust most developers to write code for hospital equipment, NASA, Boeing, etc...even myself (I'm not that kind of programmer) and probably you (though I don't know your background). It takes a certain kind of programmer to write safety critical code, and they have to have some amount of training in that direction.

Diversity is good anyways.


"The only thing that matters in software is the experience of the user."

Hell no. Software comes from software development. And software development is a process. My editor, programming style, directory structures and other personal organizational behaviours do matter. This is work we do, and while the end user's experience may be identical regardless of very different personal development style options, the one which gives me the most comfort and work satisfaction is superior. So yes, I will configure my editor so that I don't have to keep repeating the same keystrokes dozens of times throughout the day. I will align things the way I see fit, and I will endeavour to study and improve my skill with the language. Not only so that I'm more efficient, but because it will bring increased contentment to the process of development.


>Hell no. Software comes from software development. And software development is a process. My editor, programming style, directory structures and other personal organizational behaviours do matter. This is work we do, and while the end user's experience may be identical regardless of very different personal development style options, the one which gives me the most comfort and work satisfaction is superior.

Only nobody cares about you.

As in a restaurant, people care about the food, not the cook.

And even if you (and I) mattered, developers of a piece of software are N, where users are 1000N or 100000000N. They matter massively more.

Oh, and there will always be programmers as long as there is money to be made, so the "without me [in particular] there would be no software" argument doesn't hold much either.


> As in a restaurant, people care about the food, not the cook.

The health inspectors care about both the food and the cook, and the latter's workflow is certainly scrutinized.

If we're going to beat this analogy to a bloody pulp, the health inspector in software land would be technical debt and colleagues.

Not taking care of the process will bury the project, and the users with it. Ryan Dahl's brilliant, but I don't think his entire statement should be adhered to verbatim. It's a solid guiding principle, nothing more.


No, the health inspector would be non-existent in most software shops.

What a different industry it would be if all software required independent audits and the public could get access to the results.


Colleagues are non-existent in most software shops? I obviously didn't imply there was some independent software inspection service.

I care about how code is written, because I'll have to work on it. My employers care about how code is written, because our reputation is on the line.

It's about solving problems, certainly, but workflows directly contribute to solving problems better, faster and more reliably.


This is like saying "the only thing that matters at this hotel is service". The challenge is hotels that go overboard at the expense of employees wind up with bad service anyway. There's a reason high end hotels treat employees well.

Similarly companies too crazy about short term shareholder value find that it takes committed employees to get long term results.


> Only nobody cares about you.

Except me, I care about me.

And I don't want to use crappy and shitty tools. I want to use good tools. I am less productive when I use crappy tools; it can be done, but then a lot of time is spent fighting the tools, not really solving the problem.

> Oh, and there will always be programmers as long as there are money to be made

There are not that many good programmers. Or at least they are not the ones throwing resumes around. One has to find them; if they are good they are probably already in a good place and are not looking for a job. So if you just take anyone off the street because they did some VBScript in high school, you are not going to replace one good programmer with even 100 incompetent ones. It is just not how it works.


> And I don't want to use crappy and shitty tools.

Neither does Ryan. That's what the whole rant is about. He just wants to make software for users without having to hack through the forest of bad design decisions that is the modern Linux system.


Yeah I agree with Ryan's point there I was just replying to the GP about how nobody cares about programmers and how programmers who whine and complain are just easy to replace like cogs in a machine.


Software developers are users who experience software. Might be a different category of software, but user experience is still king.


Technology is just a tool, for the purpose of solving end-user's problems.

Spoiler: Most software you create won't be around in 10 years: http://www.youtube.com/watch?v=zut2NLMVL_k&feature=youtu.be

What matters is the value it delivers in the meantime, which is ultimately the end-user result - not how you got there.


What matters to me is being able to enjoy my life while providing value. And the end user doesn't gain any advantage if I happened to have suffered while I wrote the software. There's nothing wrong with setting yourself up to enjoy your work as much as possible.

Yes, in the end, the software won't be around, in fact nor will those who derived value from it. All that will matter is how you got there.


    >Spoiler: Most software you create won't be around in 10 years
A personal anecdote (that at least I still chuckle about):

I was working for a startup about 5 years ago on the reporting system for measuring customer metrics. The data was granular enough that we allowed the customer to drill down to the per-hour activity level. As a result it was deemed important to represent the data in the customer's timezone. To "future proof" the system we had a table indicating when to shift an hour for daylight saving time. I populated the table to go out 25 years, I think. My manager asked me "what happens in 2033?". But after a moment of thought he came to the same conclusion you did, and 25 years was deemed sufficient.


That you got there is important. If you didn't get there, no value was delivered to end-users.


>The only thing that matters in software is the experience of the user.

That is true, but the experience of the people developing the software matters too, as it informs the care that will be put into creating a better user experience.

Let's say you're creating an application and you're the sole developer and designer of that application. When you have a predictable and reliable development environment that emphasizes good, well defined practices, you reach a state of flow with more ease when you're coding, and your project gets developed faster. Faster development allows you to iterate quickly and better test your assumptions about the user interface and the overall user experience. It gives you more time to design the application. All in all, with a better development experience, the developer gets more time to create a better experience for the user.

I bet Ryan's frustrations—all the abstractions and crappy interfaces—come from years of legacy technologies and the need for compatibility and integration with other systems, and sometimes as preemptive measures to avoid security issues. When they were made, there was a reason for their existence.


Consider systems like the GDS infrastructure backing the travel industry. These systems are long-lived (50+ years) and are the backbone of all travel booking and scheduling. The only thing that matters for them is not the end user experience, it's the stability of the system, and society's reliance upon its operation. It's robust and it was produced at the right time to carry an industry into the efficient monster it needed to be to succeed. A whole industry was built on it layer by layer, and what we now have is a complex beast composed of ancient IBM hardware, crazy nested and disparate schemas, greenscreen scrapers, and probably more than a few dashes of COBOL. One can't expect the system to stop evolving while we take a 5 year breather to rewrite it all.


This seems to parallel some of the work VPRI is doing, eg from:

http://www.vpri.org/pdf/tr2010004_steps10.pdf

STEPS Toward Expressive Programming Systems, 2010 Progress Report Submitted to the National Science Foundation (NSF) October 2010 Alan Kay et al

"If computing is important—for daily life, learning, business, national defense, jobs, and more — then qualitatively advancing computing is extremely important. For example, many software systems today are made from millions to hundreds of millions of lines of program code that is too large, complex and fragile to be improved, fixed, or integrated. (One hundred million lines of code at 50 lines per page is 5000 books of 400 pages each! This is beyond human scale.) What if this could be made literally 1000 times smaller — or more? And made more powerful, clear, simple, and robust? This would bring one of the most important technologies of our time from a state that is almost out of human reach—and dangerously close to being out of control—back into human scale.

(...)

Previous STEPS Results

The first three years were devoted to making much smaller, simpler, and more readable versions of many of the prime parts of personal computing, including: graphics and sound, viewing/windowing, UIs, text, composition, cells, TCP/IP, etc. These have turned out well (they are chronicled in previous NSF reports and in our papers and memos). For example, essentially all of standard personal computing graphics can be created from scratch in the Nile language in a little more than 300 lines of code. Nile itself can be made in a little over 100 lines of code in the OMeta metalanguage, and optimized to run acceptably in real‐time (also in OMeta) in another 700 lines. OMeta can be made in itself and optimized in about 100 lines of code."

More:

http://www.vpri.org/html/writings.php


did this stuff actually work?


According to the last yearly report that is available:

http://www.vpri.org/pdf/tr2011004_steps11.pdf

"This year we are able to write this entire report in the Frank part of STEPS, including producing the PDF version for NSF’s web site. We can give comprehensive high‐quality, high‐resolution presentations using STEPS which include live code and in situ demos, and which show working versions of a number of the document types derived from the STEPS “universal document”."

So yes, they have something that is working. Unfortunately, as far as I can figure out - the "Frank part of STEPS" isn't really available as such.

There's a few thing to play with, like:

http://piumarta.com/software/cola/

http://lukego.livejournal.com/16036.html key quote: "Here's what a hello-world-esque operating system looks like"

There's also:

    svn co http://piumarta.com/svn2/idst/trunk idst
    head -2 idst/function/examples/tcp/00_README 
    This example demonstrates an alternate 'shell' for
    Jolt and how to build a tiny TCP/IP stack using it.
And there's also: http://tinlizzie.org/ometa-js/ (see "go to project", top right).


Hm, looks like the Logo example doesn't quite work anymore - but the prolog one does.


To use his language, I agree that the whole thing is "fucked", but I also think it's fun. I enjoy the fact that things are a challenge and sometimes quirky.

Sure, down in the trenches it makes you want to pull your hair out. And the world would probably be more productive if software interfaces and programming languages were completely frictionless. But it would also make programming, to me, less fun, challenging, and enjoyable.

We are imperfect beings living in an imperfect world, and I don't think changing the status of either the former or the latter would create the nirvana Ryan is alluding to.


>To use his language, I agree that the whole thing is "fucked", but I also think it's fun. I enjoy the fact that things are a challenge and sometimes quirky.

The problem is that the wrong things are fucked and that's neither challenging nor fun.

E.g. configuring some system so you can get your software built correctly is neither fun nor challenging. It's merely tedious busy-work. Actually solving problems, that's the fun part.


That's precisely what I'm disagreeing with. Not everyone sees configuring a system as "merely tedious busy-work". I for one like setting things up. It might be frustrating, but there's also an enjoyable element of accomplishment when you overcome something frustrating. It might not be the kind of fun and challenging you're looking for, but please, try not to state opinion as fact.

One man's tedium is another man's pleasure, as much as I too enjoy solving the bigger-picture problems. There's a reason both Gentoo and Arch exist :)


>That's precisely what I'm disagreeing with. Not everyone sees configuring a system as "merely tedious busy-work". I for one like setting things up.

And what you're saying is precisely what I'm disagreeing with. I'm sure there are people who like those things (mostly beginners with Unix and such, tinkerers, etc). It's a form of pastime.

That doesn't mean it's challenging in any creative way. It's a boring ass brain puzzle that tons of people are solving again and again.

The sense of "accomplishment" of getting the dependencies to build X right might be valid if you just started building things and getting to know how compilers, libs, etc. work, but it grows old very quickly for normal people, and it's a false accomplishment after that initial stage.

Actually saying you enjoy that is like saying you like boilerplate code, because that's exactly what it amounts to.

>It might not be the kind of fun and challenging you're looking for, but please, try not to state opinion as fact.

Sure, you can find people that like everything, even the most boring-ass shit. I'm sure there are people who enjoy calculating logarithms manually too. But I wouldn't call such bizarro outlier fun "fun" in the accepted sense.

But it's a fact (I've seldom seen anyone state the opposite, and tons of texts from Stack Overflow to blog posts support this) that most programmers/hackers tear their hair out to get rid of that "fun" tedious configuration/setup/dependency-figuring work and move on to the real part.

I've actually never read an interview with a hero programmer where he said that he likes those parts of programming, but I have read tons that say the exact opposite, from Alan Kay to Jamie Zawinski.

Actually a large part of programming is exactly about removing (automating) those tedious parts.


Yeah, but what about us who are building software to make money?


I certainly appreciate Ryan Dahl's opinion and absolutely agree with the conclusion (which has done incredibly well for me in my career), but these things are not mutually exclusive. One can put user experience above all and still hone their tools and habits while working towards that goal.


I wonder if Ryan Dahl is following in Chuck Moore's footsteps now.

http://www.ultratechnology.com/forth.htm


sorry, why did you say that was worth a read? it doesn't appear to have any content.


You could possibly draw a connection between Ryan Dahl pleading for simplicity in that gist, and Isaac Schlueter claiming that they would not be adding extra complexity to Node.js.

By the way, your post comes off as unnecessarily rude.


Your post comes off as unnecessarily passive-aggressive. If we're pointing these things out.


There's a Confucius saying in here somewhere.


I beg to differ. It was just necessarily passive-aggressive enough. On the dot.


I thought it was good.


He is so right! No wonder he wrote such a beautiful and useful piece of software. A sort of genius. Thank you, Ryan!


> We are done making breaking changes at the Node.js layer. If your program runs today, we're doing everything we can to make sure that it will run next year, albeit faster and more reliably.

I've yet to devote any significant amount of time to developing on Node but this is a big +1 for more serious consideration.

My memory is fuzzy (this was well over a year ago) but I remember building an app in express and it being completely broken by updates a very short time later (along with a good number of other modules that hadn't been updated either).

Anyone venture a guess/opinion on how many things in userland will stabilize because of this? (Or perhaps things already have; I don't hear as many complaints about Express's API changing lately.)


Userland has become a lot more stable over the past year or two as projects have matured. A lot of popular projects (like express) went through rewrites with breaking changes.


Didn't UserLand make Frontier? (It's a Dave Winer joke -- ha ha!)


I remember that time too. Most of the modules I use are pretty stable now, which is nice.


Well, I had an issue with Express going up a major version to 3.0 as well, but it was actually my fault because I didn't specify the major version in my package.json, i.e. 2.x. If I had done that, the breaking changes would not have got me.
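For reference, a minimal sketch of what that pinning would look like in package.json (the exact range is just one option):

    {
      "dependencies": {
        "express": "2.x"
      }
    }

With "2.x", npm will pick up 2.x patch and minor releases but never jump to 3.0.0 on its own.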


Working on a large-scale and fluid Node project at the moment, and we probably see at least a dozen dependency updates a week. It's never caused a problem.

The community as a whole is pretty strict in its respect for versioning conventions.


Node's module system is hands down the best module system I've worked with. I wish other new languages would stop trying to innovate and instead just copy what they've done.


Really? I think it is terrible when compared to other languages (C#, Java, Ruby, Python, etc...). I have only done a few applications on node, but my experience with including modules was painful to say the least.


I'd have to say the biggest thing that npm has over the module systems found in Java, Ruby, Python, etc. is the complete isolation of transitive dependencies. It is nice to use two dependencies and not waste a day or two because:

* module A depends on module C v1.0.2

* module B depends on module C v1.4.3

In all the languages you mentioned it becomes a pain because you can only use one version of module C, meaning either module A or B simply will not work until you find a way around it.
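This works because npm nests each module's dependencies under its own node_modules, so the two copies of C never collide. Roughly (layout illustrative):

    node_modules/
      A/
        node_modules/
          C/        <- v1.0.2, visible only to A
      B/
        node_modules/
          C/        <- v1.4.3, visible only to B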


This is a cultural problem, not a module management problem per se. Blame the author of module C, not the package management system.

Semantic versioning's raison d'être is to prevent these sorts of issues. Reusing a major version number is supposed to constitute a promise not to remove or change the call signatures of any functions published by your library. This is necessary so that a dependent library can declare a dependency on version 1.x.x of your library when version 1.1.0 is released, without having to worry that version 1.2.0 of your library will break things.

The problem is that too many library and module authors (who are otherwise talented, or are simply the first provider of a useful library that ends up gaining traction) refuse to follow the rules, and there's no effective sanction in the OSS marketplace for this sort of antisocial behavior.

As soon as backward-incompatible change is introduced without bumping the major version number, the dependent module author becomes paranoid (and being closer to the user, he's going to wrongly get a disproportionate amount of blame), and (rightly) feels he has no other option but to declare a strict version dependency. And when there is more than one dependent module involved, the misbehavior of the independent module author can cause a dependency graph that is impossible to satisfy.

As far as I can tell, this whole situation started with Ruby, and is the main reason (along with second-class documentation) why I am generally averse to its ecosystem. rvm and its ilk shouldn't even have to exist.


> Semantic versioning's raison d'être is to prevent these sorts of issues.

Semver may surface them by making it very clear (assuming all involved libraries use semver) where they can occur, but if you have a package management/loading system that only allows one version of a particular package to be loaded, it obviously can't do anything to prevent the situation where different dependencies rely on incompatible versions of the same underlying library.

Sure, with semver it won't happen if A depends on C v1.0.1 and B depends on C v1.4.3 (as A and B can both use C v1.4.3), but it will still happen if A depends on C v1.0.1 and B depends on C v2.0.0.

To actually avoid the problem, you need to isolate dependencies so that they aren't included globally but only into the package, namespace, source file, or other scope where they are required.
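To make the arithmetic above concrete, a quick sketch using the 'semver' npm package (the explicit range is just one way to express "any 1.x at or above 1.0.1"):

    var semver = require('semver');

    // A and B can share C v1.4.3 if A's requirement is a 1.x range...
    semver.satisfies('1.4.3', '>=1.0.1 <2.0.0');  // true

    // ...but no single version satisfies a 1.x range and a 2.x requirement.
    semver.satisfies('2.0.0', '>=1.0.1 <2.0.0');  // false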


That's all good and well, but these packaging problems happen and they'll continue to happen, so wouldn't you rather have a system like npm that can tolerate mediocre packaging than one that doesn't? When you're trying to fix clashing dependencies, are you really going to care about whether those clashes are an intrinsic or a cultural problem?


Regarding semantic versioning: it works in theory, but in practice applications often end up relying on bugs, private APIs, or other kinds of non-public behavior. For example, the GNOME libraries have followed semantic versioning forever, yet sometimes an upgrade breaks something else because that something else was relying on a bug. In 2004 there was a famous case where upgrading your glib would break gnome-panel. This is of course not to say that semantic versioning is useless, but in practice you will still need some kind of version pinning system.

As for "rvm and its ilk shouldn't even have to exist": you do realize that rvm and its ilk are not just to allow you to pin your software to a specific Ruby version, right? They're also there to allow you to easily upgrade to newer versions. Let's face it, compiling stuff by hand sucks, and your hair has turned white by the time the distro has caught up.


> because you can only use one version of module C

This is not strictly true in Java. You can set up your own classloaders that allow you to load multiple different versions of the same class and hand out the right instances on demand. (This requires some work of your own, since classes are not versioned by themselves. But you can evolve simple versioning schemes on (say) jar files to solve this.)

Obviously not trivial, especially if you are rolling one on your own. But you can use an OSGi implementation to do most of this for you in a standard way. JSR 277, if and when it is implemented, should provide another solution in "standard" Java.

Not really sure about Python and Ruby.


NPM's way of managing dependencies can still waste a day or two (or more) of your time. For example: get a C object from B, then pass it into A; if A bundles a different version of C internally, instanceof checks and shared state can silently break.

Things are even more twisted when you have a half dozen versions of C floating around in your node_modules, and the problem isn't in your code, but a dependency of a dependency.

Another issue I've run into is patching a bug in a module, and then having to figure out how to get that patch into all of the other versions that cropped up in node_modules.

NPM is one way to solve the modules problem, but it's no panacea.


That's great, but it's not without cost. Here, the cost is you end up with deeply nested directories (which breaks Jenkins's ability to properly purge the workspace after a job). Node modules are also extremely liberal in the number of files they create: even a "simple" app using just a few common modules could end up with 1k+ extra files. This can produce problems in your IDE, as well as with your source control or continuous delivery systems, among other things.

So, it solves some headaches, and creates others.


Maybe you need to run Jenkins on a better/faster filesystem. We use Jenkins as well, and our deeply nested directories are deleted in under a tenth of a second.

I feel like your complaints are a user problem. I don't have the "too many files" issue when I use vim.


The OS is Windows, and Jenkins handles everything else just fine. It's just the Node projects that ever have issues. Of course, it's easier to blame the OS.


Are your concerns more than just theoretical? I've been developing in Node for a long, long time now and have never had any issue with any of this. Take source control for instance: isn't the first thing you do to put `node_modules` in your gitignore?


Some argue against gitignoring node_modules: http://www.futurealoof.com/posts/nodemodules-in-git.html


That post predates `npm shrinkwrap`.


What makes you think they're just theoretical? Are you insinuating that I'm just here to argue arbitrary crap for the hell of it?

Also, gitignore does not work in SVN (*omg he uses SVN! the shame!), and the node_modules do actually have to be included in the source since the runtime is disconnected from the internet (intranet app).


Since for some reason that's what everyone else is doing in this thread ("I once read the Node documentation two years ago and am therefore in a position to make grand and sweeping judgments about it") I'm afraid I lumped you in with that crowd. Apologies.


That's a property of Node/Javascript, not a property of NPM.


In the Java world, OSGi takes care of that and much more.


I found OSGi to be a promised land that we desperately needed in Java. Unfortunately I found that the journey there was painfully difficult.


Did you use bnd/bndtools? Without those I heartily agree: you'll have the steep learning curve of OSGi upfront, and the manual maintenance of metadata for the duration of your project. Using bnd/bndtools, it's only the initial learning curve that you have to worry about.


Could you elaborate on that? What did you find difficult about `npm install $PACKAGE` and `var thing = require('thing')`?


Weird conventions, like including index.js by default. It does not play nicely with CoffeeScript's default behavior of wrapping a module. The modules being a hack rather than built into the language, I have to read library docs and follow external decisions (versus a holistic approach built by the language maintainer). Introducing a variable to contain a library is unconventional and potentially limiting (specifically for metaprogramming).

Like I said, I didn't spend much time with them. They are a solution. NPM feels like the old Rails gems and is certainly better than most package managers. Requiring external libraries and files specifically is a hack and feels as such. I would gladly take any of the language inclusion techniques I listed above over Node's.

Edit: I am not trying to put down the hard work that went into creating the system. JavaScript does not support modules natively, so adding support is a momentous accomplishment. The unfortunate reality is that it will always be a limited solution. JavaScript modules (in any form) are going to be inferior to languages that natively support such a basic part of programming.


Don't complain about a tool built for JavaScript when you try to shoehorn CoffeeScript into it.

CoffeeScript being a hack layered onto a new beast of a language, I have yet to read docs that drift from "JS is better! ... No, CS is better!"

Use the right tool for the job, and don't complain that your pentagonal screwdriver doesn't always work on Phillips-head screws.


I see your point, but really, what he said about CoffeeScript doesn't make any sense. It doesn't have any 'default behaviour of including a module' because it's just syntax sugar, you use the same require() and libraries.


The default behavior for CoffeeScript is to hide the code in a JS module (the pattern). This is to prevent turning variables into globals. This behavior can be disabled, but I see it as one of the many benefits of CoffeeScript.


"JS modules" (what you're actually referring to are called IIFEs, which stands for "immediately-invoked function expressions" — JS modules are something else, and CoffeeScript doesn't support them) don't have anything to do with importing/exporting code. In browser-CoffeeScript the only way to "import" and "export" code is to make global whatever you intend to export.

IIFEs are also pretty orthogonal to Node's module system — they work perfectly fine together, and don't have much to do with each other at all. Not sure why you'd feel the need to disable IIFEs in the CoffeeScript compiler except out of desire for cleanliness since they're an ugly hack in the first place.
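For reference, the wrapper in question is roughly what the CoffeeScript compiler emits around each compiled file:

    // the compiler's top-level "safety wrapper", approximately:
    (function() {
      var x = 1;  // stays local to this file instead of leaking as a global
    }).call(this);

As noted above, this sits harmlessly inside Node's own per-file module scope.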


Could you provide an example scenario in which this would cause a problem? This has never had any noticeable effects in the 2 years I've been using Node and CoffeeScript together.


A closure. That's true, but it doesn't change the behavior inside a node module at all; they already have their own contained scope, you have access to the same globals `module/exports/global/require/etc`, and `this` is preserved.


Please don't confuse JS modules and closures, they are completely different things.


Fair points, though I quite like the concept of a "variable containing a library" (or module).

Personally, with JavaScript being the first language I seriously learned, it's intuitive for me to reason about. JavaScript is built around closures, and you're simply importing a closure that returns (exports) its public interface.

In fact, the main issue I have with ES6 modules is that they differentiate a "module object" from its default exports, which can be tricky to reason about (basically, what I wanted from ES6 modules was just syntactic sugar over CommonJS, whereas what we have is more powerful but more complex).
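As a sketch of that mental model in Node (file names hypothetical):

    // counter.js: the module's file scope acts as the closure
    var count = 0;                         // private; importers can't see it

    exports.increment = function () {
      return ++count;                      // the exported "public interface"
    };

    // app.js
    var counter = require('./counter');
    counter.increment();                   // 1; 'count' itself is unreachable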


You do realise that CoffeeScript is just a layer of sugar/arsenic on top of JavaScript?


It's not clear to me how you can import something without assigning it to one or more variables. I don't just mean in Node, I mean in any language.


Why do you feel node modules are a hack? They are part of node.js core and have very well-defined behavior. Most developers prefer them over the proposed `import` syntax in ES6.

I have nothing but praise for NPM - it's simple, fast, keeps dependencies isolated (but can dedupe if wanted), and did I mention fast? It runs circles around any other package manager I've used, except maybe for homebrew.


You can implement a module pattern with closures quite easily.


Which is not the same thing as node's modules. There's nothing stopping me from making global variables within a module "pattern". Contrast that with a node.js module where even if a variable declaration leaves out a 'var' by accident, the global will not show up in anything that imports that module. I.e., there is no chance of name collisions because each node module gets its own unique and clean global scope.


True, but the question was whether JS had any native ways to support modules, and closures are literally right there. That's a good way to start them. Add more shit on top to make it robust, but they're still there.


index.js is not a default you need to include in your package. You had better go read and experiment before spouting bullshit. JavaScript is a language with multiple platforms, and Node.js is just one of them.


I believe he's talking about node's behavior when requiring a folder: http://nodejs.org/api/modules.html#modules_folders_as_module...
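Per those docs, requiring a folder resolves roughly like this:

    mylib/
      package.json   <- if it has a "main" field, that file is loaded
      index.js       <- otherwise index.js is the default entry point

So `require('./mylib')` loads index.js only as a fallback convention, not a hard requirement.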


I personally found that some packages wouldn't install because they had dependencies that were broken in some way. This was about a year ago though, so I don't know how things have changed since then. Also, node packages seems to be small and specific so npm downloads many dependencies for each package. Again, I only played around with node for a little but I got a bit frustrated with all the packages and dependencies.


I think the issue you are describing really stems from poor semantic versioning. People will peg to specific revisions that they happen to use rather than depending on the general major or minor revision level, so small fixes don't naturally propagate to second- or third-level dependencies. There's no good solution to the problem that doesn't lead to other sorts of hell (imagine asking users to manage the revisions for third-level dependencies)


Were you using Windows? Although node.js "works" on Windows, there are a number of modules that only compile under UNIX, which makes Windows basically useless for node development.


Might be worth your while to give it another shot. Npm is so lovely.


While I like using NuGet packages with C#, I'm not really wild about how they can get magically linked into a project, and then required. I had NUnit and Fluent Assertions become inextricable from a project I was working on even after all the tests were removed. Just a total mind-f*ck. Python when using pip is a whole lot better, but I've had some issues finding things there too. Ruby... it depends. Are we talking a Rails Gemfile or "gem install $package"? Conflicting versions can become an issue. Java with Gradle has been pretty cool so far. NPM, as a whole, has just worked. Packages are referenced in ONE place (package.json); I can do an "npm install $package --save" during development and it gets included automatically.


There's nothing magic going on. A C# project's settings are located in the .csproj file (other .NET languages work similarly). This file is normally edited from within Visual Studio, but it's just XML and can be edited by hand fairly easily.

A project file has references [1], which define the project's external dependencies. Removing references is easy. [2][3]

[1] http://msdn.microsoft.com/en-us/library/ez524kew(v=vs.110).a...

[2] http://msdn.microsoft.com/en-us/library/hh708954.aspx

[3] http://msdn.microsoft.com/en-us/library/7314433t(v=vs.90).as...


Didn't understand the NuGet sentence. It's just binaries and XML descriptions. Worst case, just delete packages.config and all the package folders. A one-minute fix I used when not using source control to revert.


I agree. pip in Python is great, but has some extra overhead and difficulty with it, like having to set up virtual environments for each project. NPM by default installs to the local project only, and with a quick --save will put it in the package.json dependencies (similar to requirements.txt with pip). Node package management is awesome because it is so simple.
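Side by side (package names illustrative):

    npm install express --save   # local node_modules + recorded in package.json
    pip install flask            # goes to site-packages unless a virtualenv is active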


Virtual environments are optional though, right? You could have one big virtualenv for all projects, or simply install things into the system path (although I wouldn't recommend either)


I should probably just say: clearly you haven't seen Common Lisp's defpackage. Modules are actually first-class objects there and are completely decoupled from the file system.

But most importantly, as Barbara Liskov mentions in this video[1], we don't know what a module is exactly or how to use them yet. Which is a specific statement aligned with Alan Kay's famous "We don't know how to design systems so let's not turn it into a religion yet."[2]

tl;dr: 1) Innovation is good. 2) JavaScript's module is a half-assed implementation of Common Lisp's defpackage. (Don't get me wrong, it's still way better than Python's abhorrent ninja goto: import.)

[1]: http://www.infoq.com/presentations/programming-abstraction-l... [2]: https://www.youtube.com/watch?v=oKg1hTOQXoY


Have you used ML?


You have not used a good module system. Clojure's namespace system for example is really nice.


What do you like about Clojure's system?


It's full, proper namespaces. Like the best of Python and Java but simpler, more powerful, and more general.


Still, that is the thing the Node.js lead developer regrets the most. Too late to change it, though.


Do you have a link? I'd be interested to read what the Node.js lead developer has to say on this, thanks


I applaud the Node developers for their effort. While recent "competitors" like Go are fascinating and in some ways even arguably better, the fact that every device on earth runs JavaScript makes me bet that Node will prevail.

Computing will need to move fluently between the cloud and the terminal, and JavaScript is the only language which allows you to use a single codebase for both (in the foreseeable future). As developer time will stay the biggest expense when developing software, I am bullish on Node and JavaScript.


I disagree with the idea that "[...] every device on earth runs JavaScript makes me bet that Node will prevail."

Because Node does not strictly equal JavaScript: V8 is not JavaScript, nor is SpiderMonkey. Most node applications, without some modification, do not run in the browser; there they are missing tons of libraries and cannot interact with the file system.

I do appreciate the ability to have a single codebase, but over the past few years I've pivoted, and I hope to someone's god that it is NOT JavaScript. Mostly because, in my experience, JavaScript itself, even with a great focus from the beginning, is incredibly difficult to maintain.

Even then, is having a codebase in one language the best idea? There are some things that are much better to write in Erlang than they are C, and vice versa. It depends on the problem you're trying to solve.

The bigger the project, the less Javascript I want in it.

I think asm.js, or something similar, will be the death of hand-written JavaScript. Then it won't matter what language it's written in; we can use source maps to debug, and it'll be faster than we could've ever written by hand*

*single one-off functions don't count. Try writing a webapp in pure x86 assembly.


These days virtually every language can be compiled to JavaScript in a very usable manner. For example, I've had a lot of success with js_of_ocaml[1], not really running into any of the problems people like to harp about regarding compiled-to-JS languages.

[1]: ocsigen.org/js_of_ocaml/

The argument for developer time--as well as decreased maintenance cost--is exactly why I went with OCaml over pure JavaScript. And so far it's certainly paid off!


Sure, but the language that _every_ developer will need to know is JavaScript, not OCaml or some other of the dozen alternatives that can be compiled to JS.

Besides, the ability to compile to JS is not enough; you will need to read it in JS, debug it in JS and let other developers work on it in JS (because they don't know OCaml).


Does _every_ C or C++ developer know the ISA they're targeting? Also, source maps.


> There will be no changes to the module system. None. It's finished. It's been finished for over a year. Any proposed changes must come with a clear reproducible bug that cannot be worked around or addressed with documentation.

This seems short-sighted, with ES6 modules now being used in production (through transpilers) and slowly making its way into SpiderMonkey and, more importantly, V8.


It is a shame, but I just don't know how node would migrate to ES6 modules. I'm sure someone smarter than me could think up a plan, but it seems like a huge amount of work. One of the strongest points about node is the number of small reusable libraries available. They aren't about to ditch that.


Some thought has definitely been put into this problem:

https://gist.github.com/domenic/4748675

https://gist.github.com/wycats/51c96e3adcdb3a68cbc3#importin...

(both of those are slightly out of date, syntax-wise, but are still conceptually current, IIRC)


Glad to hear it.


This is the kind of open-source leadership that made me move from PHP to Node. Node knows what it is.


At least we can trust the node developers not to check in untested code that breaks crypt.


I hope that the future of Programming is not in Node.js


You could make this comment useful and worth reading by explaining why.


I think that's obvious: forces you to stick with one specific legacy dynamic weak-typed language.


> Synchronous child process execution is being added, at long last.

Thank the jesus. This will be nice to have, especially for some command line and filesystem operations.


Yes, best part since the node module we are currently using does not work in v0.10 and the author is very adamant about not supporting it anymore.


> Callbacks will remain the de facto way to implement asynchrony. Generators and Promises are interesting and will remain a userland option.

I am looking for a comparable alternative to node (or something on top of node) in which Promises are the de facto way to implement async. Any suggestions?


Is the fact that they use callbacks instead of Promises outside of userland a real problem for you (as an application developer)?

You can totally write your node applications using only Promises using something like Q if you feel like it.
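For instance, a minimal sketch with Q (file name illustrative):

    var Q  = require('q');
    var fs = require('fs');

    // wrap a node-style (err, result) function so it returns a promise
    var readFile = Q.nfbind(fs.readFile);

    readFile('package.json', 'utf8')
      .then(function (text) { console.log(text.length); })
      .fail(function (err)  { console.error(err); });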


Dart is very Node-like in that it's event-based, but almost all async APIs are based on Futures and Streams, including all IO, and therefore any other packages that do IO, like database drivers. This makes things very consistent and composable.

On top of that you get to use a much nicer language than JavaScript.


Check out C++11. It has lambda callbacks, promises/futures, and async with threading.


I don't think C++11 is a comparable alternative to node, personally. Although I understand that's a subjective judgment on my part.


Have a look at node.native. It's just libuv and C++11, and it behaves pretty much the same as node, but much lighter, and with no JavaScript engine tacked on. Obviously not nearly as stable and supported, but it's an example of how C++11 can work very similarly.


Or luvit (libuv + Lua)


Scala+finagle is comparable in use cases, but scala isn't very close to javascript. Perhaps still worth looking into though.


It probably would be fairly straightforward to write a generator that wrapped every function in a module with Q.nfcall.
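A rough sketch of that idea, using Q.nfbind and assuming every exported function follows the node-style (err, result) callback convention:

    var Q = require('q');

    function promisifyAll(mod) {
      var wrapped = {};
      Object.keys(mod).forEach(function (key) {
        wrapped[key] = typeof mod[key] === 'function'
          ? Q.nfbind(mod[key])   // callback-taking -> promise-returning
          : mod[key];
      });
      return wrapped;
    }

    var pfs = promisifyAll(require('fs'));
    pfs.stat('.').then(function (s) { console.log(s.isDirectory()); });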


Dart. Look for Future and Stream.


The E language http://erights.org/ works that way. It's unfortunately been inactive for years; its BDFL moved on to the Javascript committee.


Take a look at CoffeeScript (cleaner callbacks) and ToffeeScript (callbacks hidden, just looks like sync code). Those are much better solutions than promises.


You may find https://npmjs.org/package/pr useful.


node + clojurescript. you can do better than promises.

http://swannodette.github.io/2013/07/12/communicating-sequen...


"TypeScript and CoffeeScript will not be added to core." "As new language features are added to V8, they'll make their way into Node."

Sigh... Let's leave this "future of programming" to the future then. I hope someone needs it.


Sounds good on all of those. Completely dropped PHP for Node, no regrets. Good, because those breaking changes in Express v2-3 were annoying.


That's what semantic versioning is for. Node modules should continue to move forward, and reinvent themselves to improve how we write our apps. Otherwise it'll just turn into another PHP.


I still, still, still don't understand why Node hasn't done in-line server-side Javascript.

Give it a snazzy name. Maybe.. "Livewire".

http://www.datacraft.com/livewire.html


I think the reason for this is that the industry in general has moved to more of an MVC approach than this. No one wants to mix their server-side code with HTML anymore. In fact, more and more, people want to delay the rendering of HTML until it hits the browser.

I don't think it would be very hard to rewrite LiveWire as a node module though, if you were really interested in it.


Of course it's not part of core, but there are certainly a number of modules for that, although it's not especially popular these days.

https://github.com/visionmedia/ejs


You have PHP for that.


Can someone please explain Node.js's relationship with Joyent? I like what I've seen/done with Node but I've had less-than-stellar experiences with Joyent in the past. I'm hesitant to devote additional time to it if they are very tightly coupled.

Is it similar to Go or Angular @ Google?

One thing I love about Rails is that it exists quite separately from 37signals or any other company. I don't get that feeling with Node/Joyent, but maybe I'm wrong.


Joyent owns the trademark and copyright on the IP related to Node.js. (Logo, "Node.js" mark, etc.) They also are involved with marketing, legal, etc.

The active core committers over the last month, and their employers, in order of whether or not they're me:

- Isaac Z. Schlueter (npm author/admin, lead gatekeeper, stream maintainer, javascripter), Joyent

- Ben Noordhuis (issues list champion, most opinionated C++ developer, libuv unix guy), StrongLoop

- TJ Fontaine (build owner, C shim author, tracing aficionado, Master of Robots), Joyent

- Trevor Norris (performance perfectionist, master of dirty hacks), Mozilla

- Fedor Indutny (russian ninja rockstar, TLS tamer), Voxer

Also very important to the project are Bert Belder (Windows guy, libuv co-author) at StrongLoop, and Scott Blomquist (Windows bug-chaser, Microsoft liaison) at Microsoft.

Honorable Mention to Ryan Dahl (Node.js creator, BDFL) who is currently off being a professional hipster somewhere in NYC, and not involved with the project. When he did work on Node, he was paid by Joyent, and he got some money for selling the copyright to the company as well.

The Node.js community experience would be much less pleasant if not for Nodejitsu's continued support of the public npm registry. (Nodejitsu acquired IrisCouch a few months ago.)

Node would continue on even if Joyent bit the dust. (Depending on who assumed ownership of the mark, it may or may not have to change name/logo.)

If you've had a bad experience in the past with Joyent, it probably has nothing to do with the experience you might have with Node.js. I am given free rein here to manage the project free of any overt Joyent influence. Joyent is a big Node user, and a lot of really smart people, so I do take it seriously if they have something to say. Especially considering that most of the time when Joyent engineers have a problem with Node, it's related to other users hosted on their platform (Walmart, Voxer, etc.)


Cool, thanks for the thorough reply.


Node is not really tied to Joyent. Actually, a good amount of the core team is at StrongLoop.

Joyent has basically got it as far as they needed it to get for their own uses. Now they are just sitting on it and keeping it stable while letting userland modules cover future expansion.


I'm new to web programming, but I got the impression that C++ for web work was silly... use PHP, use Rails, use this, blah, blah, everyone said. Now it seems that Node.js is just C++-style programming in JavaScript, except that you get "packages" that manage the HTTP headers (and other things of course) for you.

It would be cool if someone made a Node.C++. I would have less of a learning curve.


You mean this https://github.com/d5/node.native? Godspeed.


i hit "casablanca" the other day from microsoft.. what impress me is that is open source and it also runs in linux. (not even in my wild dreams o would predict this to happen one day)

anyway.. looking at the source its pretty decent, and go right to the point.. of course its focused in HTTP/REST..

theres a bultin IO scheduler and a concurrent task system.. from microsoft.. who would say so :)


Wow ... thanks for the info!


My god. Could they possibly have created any more jargon, or any more inscrutable jargon? They must have had whole teams dedicated to doing just that. Keeps out the riff-raff, I suppose.


> This is not a democracy.

I agree 100%. But still, a lot of people are complaining about some of the Node.js API choices.

They are complaining because it makes programming more difficult, and the truth is, async programming is hard when it relies on callbacks.


Wasn't that the original selling point of Node? That it used callbacks for async and that it was easier that way?


No, the original selling point was that Node forces all code to be async and Erlang is too weird.


The main selling point of Node was that it was JavaScript, and since you are going to have to use that on the front-end anyway, you might as well use it on the backend.

I've never seen anyone argue that callbacks are one of the easier ways of doing async.


No, the main selling point is that it's an event-loop-based paradigm where all the APIs are async by default. As opposed to Python's Twisted or Ruby's EventMachine, where you have to look for async versions of the libraries you want to use, and people writing modules usually aren't even aware that you want to be event-based.

So yes, grandparent is correct.


Some people will buy into it for different reasons than others. This is a silly argument.


Not the original point perhaps, but it's come around to that. A couple years ago they settled on callbacks, after a fairly active debate, as the least ambiguous option for continuation passing. As more people explore Node, callbacks are one level of abstraction too deep or just too tedious for some people. Fortunately there's a great selection of async modules to help them.


I really enjoy the callback paradigm. I love how powerful the abstraction of async code is. In the procedural paradigm, there are some things you want to accomplish that you simply cannot create an abstraction for, because the operation you want to do is time-based. With the power of callbacks, you can abstract away complex things with a single function.
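For instance, retry-with-backoff is a time-based operation that folds neatly into a single callback-taking function (a sketch; names made up):

    function retry(attempts, delayMs, task, done) {
      task(function (err, result) {
        if (!err || attempts <= 1) return done(err, result);
        setTimeout(function () {
          retry(attempts - 1, delayMs * 2, task, done);  // double delay each try
        }, delayMs);
      });
    }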


I thought the original point was that it is JavaScript that can do stuff.


As AndyKelley mentions, the point was it being async and event-driven by default. JavaScript was chosen because it was the only language that was simultaneously popular and free of sync baggage.


This became my all time favorite comment on HN. I have been laughing for a good 10 minutes.


I've personally never had issues with the callback pattern. To the contrary, it forces me to have well-organized code, which is a ++.


It was easier until we discovered even cleaner ways, and most of the Javascript community began to adopt them. But node chose to stick to the "old" ways. (Not really old, but not as shiny and new as the alternatives)

I feel like promises don't need to be part of core though. It's easy enough to wrap the standard library in userland and just have other things depend on the wrappers.


ES6's generators and the `yield` keyword play very nicely with callback-based code, don't require any changes to the underlying libraries, and give you a much nicer syntax to work with.

See https://github.com/visionmedia/co
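A minimal sketch of the style (thunk-based, as in co's early API; file name illustrative):

    var co = require('co');
    var fs = require('fs');

    co(function *() {
      // yield a thunk; co resumes the generator when the callback fires
      var text = yield function (cb) { fs.readFile('package.json', 'utf8', cb); };
      console.log(text.length);
    })();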


I've been looking at another library like the one you linked. Essentially the "yield" keyword can be compared to "await" in C#.

https://github.com/bjouhier/galaxy


As the post said, there are plenty of libs that add additional functionality in place of callbacks. Let everyone choose the way that works best for them.


You can use fibers. We are doing that via common-node (https://github.com/olegp/common-node/) and are very happy with the choice at https://starthq.com


I don't understand. Can you not just roll in a promises library (like Q) and call it a day?


Why would you when someone already rolled a promises library like Q?


Maybe I used the wrong lingo, but my intimation was that one could use something like Q.


A lot of short node.js code snippets you run into on the web use anonymous callbacks, and I agree, that gets ugly very quickly, but if you bind callbacks to methods on prototypes the code remains pretty readable.
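E.g., a sketch of the prototype-method style:

    var fs = require('fs');

    function Loader(path) { this.path = path; }

    Loader.prototype.load = function () {
      // a named method instead of a nested anonymous function
      fs.readFile(this.path, 'utf8', this.onRead.bind(this));
    };

    Loader.prototype.onRead = function (err, text) {
      if (err) return console.error(err);
      console.log(text.length);
    };

    new Loader('package.json').load();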


Take a look at ToffeeScript https://github.com/jiangmiao/toffee-script


So callbacks are annoying enough that they have to mention it's a non-fixable in the mailing list.

Awesome.



