
The Node.js Community is Quietly Changing the Face of Open Source - apunic
http://caines.ca/blog/programming/the-node-js-community-is-quietly-changing-the-face-of-open-source/
======
shadowmint
Obviously if Node doesn't come 'batteries included' then you have to re-
invent-from-scratch all the tools like package management, dependency
management, MVC libraries, etc.

That doesn't mean the community is amazing and wants to contribute! ...it
means that those features didn't exist, and _someone_ had to do them (and
yes, it's really great that they're all coming out open source, but it's not
revolutionary).

Any language in that situation will have a strong initial growth as people
port their favorite toys across.

I'd be much more interested to see the rate of change over time, year by year.

Is it still growing? Is it slowing down now there are some mature frameworks?

That's interesting stuff. "Changing the face of open source?" --> self-
important gasbagging.

~~~
InclinedPlane
Obviously if linux doesn't come "batteries included" then you have to re-
invent-from-scratch, or borrow, all the tools like compilers, web servers,
package management, archive utilities, etc.

That doesn't mean the community is amazing and wants to contribute! ...it
means that those features didn't exist, and _someone_ had to do them (and
yes, it's really great that they're all coming out open source, but it's not
revolutionary).

Any OS in that situation will have a strong initial growth as people port
their favorite toys across.

I'd be much more interested to see the rate of change over time, year by year.

...

~~~
geon
If you have a point, make it clearly instead of hiding it.

~~~
InclinedPlane
It's pretty obvious. Your criticisms would have been exactly as justified with
regard to linux. That doesn't mean node.js is guaranteed to be as wildly
successful as linux, but it does mean that merely dismissing it because so
much of the work on it is "just porting things" is misplaced.

~~~
shadowmint
Oh the sweet irony. This attitude is exactly what I'm talking about.

You're right, node is great. :)

------
autarch
It's hardly surprising that there's a huge burst of growth when a language and
its package repository first takes off. I'm sure if we looked at the 4 year
periods where each of Perl, Python, and Ruby were hitting their stride we'd
see similar numbers.

It's nice that node.js is succeeding, but it's hardly changing the face of
open source. It's treading a very well-trodden path blazed by Perl starting
about 28 years ago.

~~~
RKoutnik
Looking at the numbers posted in the article, there are almost as many npm
packages published in four years of Node as there are Python packages
published in 22. I think just saying "hey, they're new, it'll level off" does
a great disservice to the Node guys who made this happen.

Truth is, npm is growing (in terms of # of packages) faster than _any
community like this ever_.

~~~
oscargrouch
True, but this may also be a "technology generation" issue.. i mean.. compared
with the 90's.. there are many more people interested in, and with hands-on
experience of, computer development than ever before..

And this is all about the internet boom.. so it happens that the technology of
the moment now is javascript and consequently node.js.. it's natural that a
great part of this bigger labor force ends up doing node.js stuff..

to see these numbers clearly we need to see the net effect, the size of the
whole development community in comparison..

it's not fair to compare the 2010's with the 90's.. what was the audience for
perl back then? now what's the audience for node.js and all the other players?

~~~
RKoutnik
There are other factors at play here. Node has access to a bigger audience,
but it's an audience that already has mature toys like Python/Perl/etc. Node
not only has to create a great ecosystem, it has to create a _better_ one than
what already exists (which is the central premise of the article).

Simply saying "Welp, there's more people on the internet now" is ignoring the
excellent competition for hackers that Node (or any other new tech with
grandiose ambitions) has.

------
steveklabnik
> It seems to me that monolithic frameworks like Rails don’t have a clear
> notion of what does not belong in Rails, and so they tend to expand
> indefinitely to include everything an application could need.

Rails 4 will see _eight_ different things that used to be in core extracted
into gems. The diff from 3.2.13 to 4.0.0.beta1 was +110,000 to -100,000, so
it's grown about 10% in the next major version.

Rails is obviously 'full stack' but it's not like it's a black hole that sucks
in all Ruby code.

~~~
vidarh
I was about to comment about this as well. I don't particularly like Rails.
One of my most linked posts on my blog is a rant about what I don't like about
Rails. But that was written when Rails 2 was current, and while I don't
particularly like Rails still, I'm very happy to see that Rails has steadily
moved towards splitting more and more stuff into separate gems that are
increasingly useful in non-Rails scenarios.

So while Rails certainly is "batteries included" and "full stack" in many
ways, I don't see how it contradicts the "tiny module aesthetic" that this guy
mentions: A typical Rails deployment is big, sure, but it's big by pulling in
a ton of (in many cases quite small) modules, more and more of which can be
replaced.

I still have issues with the level of coupling in a typical Rails deployment
(e.g. a number of gems still pretty much expect a full "default" Rails stack,
so replacing or avoiding specific components is often tricky), but the
landscape has changed drastically.

------
icambron
To speculate a little about the huge rate of package growth (none of this
should be taken as critical of the article, which makes no claims about this):

1\. Reuse - a lot of npm packages also work in the browser. There are good
reasons to make your browser code run in Node too; it makes command-line-
driven testing easier, and it takes little additional work to add support for
a whole platform. And it should be obvious why there is so much JavaScript in
the world; in fact, one of Node's biggest advantages is the JS monoculture.

2\. Take-off time. Node itself has gotten really popular really fast (there
are good reasons for this too), which means its ecosystem needs to do some
catching up. It has half the packages, but Node developers still need to get
the same things accomplished. I've contributed more to the Node ecosystem than
the Ruby one even though I write more Ruby in general, mostly because _more
needed to be done_. You look around for a good existing solution, and if it's
not there, you roll your sleeves up and you write it.

~~~
misframer
It's also the case that Node modules are sometimes inspired by those in other
ecosystems, like Rails.

~~~
acchow
It's just like technology transfer into a booming (developing) country.
Happens rapidly until they've caught up to the frontier.

------
drakaal
I have never known Node.JS developers to be quiet.

Quite the opposite. With a single thread, gaping security holes if you want
large memory support, and until about 3 weeks ago nothing that even came close
to resembling compliance with HTTP standards, the only thing Node is changing
in the open source community is that a crowd that would have been called
script kiddies 5 years ago can now put that on a resume and for some reason
think that should earn them 6 figures.

Changing the Open Source community (for the better) requires doing more than
hacking together a few lines of code that dupes the functionality of someone
else's lines of code. Node.JS doesn't have a single innovative project that is
moving all of computer science forward. If node wants to be taken seriously it
needs to prove that it can play big data, or science, or linguistics with the
rest of the communities.

Every language has a niche where it dominates in resources for a given field.
Node has none.

~~~
lmm
And yet it succeeds. I think it's simply that async IO is in javascript
developers' blood - sure, everything you can do in node you've been able to do
for five years with twisted, but if you start trying to do something in
twisted all that wonderful python ecosystem is suddenly useless to you, and
no-one wants to talk about creating replacements because they've all moved on
to tornado or gevent or incompatible-framework-du-jour.

Because javascript started in the single-threaded execution environment of the
browser, the whole ecosystem has had to be nonblocking; browser APIs were
callback-oriented so the whole ecosystem has been written in callback-oriented
fashion, and all the libraries play well with each other. Single-threading is
a hair shirt that results in better code in the long run, like laziness in
haskell. And async I/O is so much more performant than blocking (in modern
application stacks) that it's making node dominate there.

I do wonder whether other languages could have ended up the same way if
threading hadn't been invented. Perl always had these unreliable bodged
threads, and so there were some really interesting event-driven libraries for
it - but the threads were good enough for many practical uses, and AFAIK the
ecosystem never converged on a single approach. Did I hear of a PEP attempting
to standardise a compatible API for doing these things in python?

~~~
drakaal
In web languages threading is often not the win that people think it will be.
True web scale is serving 1000s of requests per minute, if not per second.
Each of those users is a "thread" in Python, Java, etc. So enabling multiple
threads per user robs Peter to pay Paul. For this reason the async model
doesn't offer the huge performance gains in code deployed at enterprise scale
that it offers to single users trying to build fast one-off projects.

~~~
lmm
You seem confused. The whole point of the async model is to allow one thread
to serve many users, and it's precisely on large scale projects that this
becomes useful.

------
RogerDodger_n
Strange that in a blog comparing package managers there's no mention of CPAN,
which "currently has 120,446 Perl modules in 27,328 distributions, written by
10,567 authors" [0].

It's a lively community, sure. But "changing the face of open-source" is a
stretch -- "quietly" even more so.

[0]: <http://www.cpan.org/>

~~~
MetaCosm
There is a huge difference in the barrier to entry between contributing a
module to CPAN and contributing a module to npm.

You might consider the barrier a feature, but it is a wild difference.

~~~
xb95
Well, if your argument is that it's easier to contribute to NPM, then you
should consider that Perl has been managing the same output for 18 years that
the Node community has for the past four: about 6700 modules/year.

For what it's worth, uploading to CPAN is not very difficult. Getting a PAUSE
account is easy and uploading is no problem. It's really no harder than the
PyPI system.

~~~
MetaCosm
Step #1 is "apply". That is a hell of a barrier to entry. You also need to
explain what you want to do with the pause account.

As I said in another comment, it might end up being to the detriment of node,
but now it feels like a feature.

------
aidos
So, you want to compile LessCss. You go to npm and have a look at the options.
Which package to use?

assemble-less, baidu-less, buildr laessig, less, less-bal, less-clean, less-
cluster, less-context-functions, less-features-connect, less-less,
lesscompile, lesscw, lessup, lesswatcher, lessweb, style-compile, styles,
watch-lessc, wepp

Number of packages is far from a perfect measure of how much interesting and
important work is going on in a community.

~~~
andypants
Or you could go to the official lesscss.org website and follow the very clear
instructions to 'npm install -g less'.

------
aneth4
Just what I want in a framework. Thousands of libraries with functional
overlap, each tested in a small number of real use cases and having a large
number of undiscovered bugs, none of which is flexible enough to solve my
problem or has a large community to maintain it.

In time, node users will realize the pain of abandoned library dependencies
and the overhead of researching alternatives and grinding through bugs and
functional gaps in poorly planned libraries.

I like node, but the library situation is a disaster. I'll take monolithic,
well designed, tested, and popular over a huge choice of one off libraries any
day.

~~~
karterk
_I'll take monolithic, well designed, tested, and popular over a huge choice
of one off libraries any day._

Been there, done that. And it's terrible. Monolithic things try to do too
many things. It's fine if you grow along with it, but a few years down the
line a beginner will find it very difficult to get started and manoeuvre
around.

Small, composable things are better. If something is popular, it will be
forked and maintained. Also, monolithic things tend to slow down over time (in
terms of keeping up) as they start accruing so much baggage.

~~~
aneth4
Small and composable things that are popular and well tested are great. That
is not what you get from something like the wild west of node. Perhaps your
ideal is Linux, with small composable commands. How many of those commands
copy or move a file? Pretty much one - that's because lots of unexpected
things go wrong, and we need one good implementation instead of 500 shitty
ones.

Rails is actually moving towards being a pre-installed collection of small
composable things, much like a *nix distribution.

Small and composable is great. Everyone implementing cp is not.

------
VeejayRampay
Isn't Node also coming 10 years late to the party? It's always easier to join
the race at a later time when all you have to do is copy the efforts of others
and catch up. Especially since Node has quite a few users and a lot of early
days hype to sail on. That being said, good on them and good on the variety it
offers people.

I just wouldn't say it's "changing the face of open source" though...

~~~
MetaCosm
It came late (but at the perfect time) -- it lowered the barrier to entry --
it changed the default licensing -- it leveraged existing systems (github).

In short, it didn't do anything amazing -- but by simply combining the right
tech at the right time, the result is something IMHO much better than pip or
gem or X.

~~~
integraton
What do you mean by "it changed the default licensing"?

~~~
MetaCosm
Lots of other communities tend to default to GPL.

~~~
jerf
Name them.

You might be surprised. "Perl Artistic License" is not GPL. Python is largely
a slightly tweaked BSD, I think, certainly it's not GPL. Both tend to release
packages under the same license as the implementation. And so on.

This strikes me as another instance of the Node community conveniently
rewriting history so they can tell each other how revolutionary they are,
instead of looking around at what really has already been done. I don't know
of any other community so prone to that, consistently and persistently, even
after being corrected.

------
sippndipp
I assume half of the packages are outdated plugins for grunt.js. If we then
subtract all the packages without tests (aka the ones that are almost useless
for open source), I guess the numbers would be about on par. Doing such
comparisons reminds me of the days when LOC was a measure of productivity.

------
ultimoo
While I see the point the OP is trying to make, with whatever little knowledge
of node.js I have, I feel the article is not written in a neutral way.

Isn't Node.js a platform? And JavaScript the language? I looked it up and
JavaScript has been around since 1995 (which makes it contemporary to Ruby and
Python).

~~~
htilford
It's not about the language, it's about the module ecosystem. Less Node.js vs
Ruby vs Python, more npm vs gem vs pip.

~~~
Tobu
pip didn't exist in 1991 though. CPAN got mature faster (and it has 120k
modules), but such a comparison isn't satisfactory without a curve showing the
yearly growth rate.

------
hayksaakian
While I understand the benefits the author presented about having many small
packages, I still wonder what kind of effect this has on resolving overlapping
dependencies.

EX: I have package X that depends on A, B, and C and I decide to use package Y
that depends on C, D, and E.

D and E may very well be an alternative implementation of A and B, given the
large number of packages and a finite solution space.

On the one hand it's easy to think that the best libraries for a particular
task will rise to the top, but as a newer node dev, having to reinvent so many
wheels for use in node is not very attractive. (However, this is almost
counteracted by having to only write in one language).
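Under npm's nested node_modules layout, the shared dependency C above simply
gets installed twice, once under each consumer, while the functional overlap
between D/E and A/B goes undetected (all names hypothetical):

```
node_modules/
├── X/
│   └── node_modules/     # X's private copies of A, B, C
│       ├── A/
│       ├── B/
│       └── C/
└── Y/
    └── node_modules/     # C is installed a second time here;
        ├── C/            # D and E may duplicate what A and B do,
        ├── D/            # and npm has no way to detect that
        └── E/
```

Nesting sidesteps version conflicts at the cost of duplication, but it does
nothing about two packages reimplementing the same functionality.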

~~~
ladon86
I think the modules which tackle very common functionality do naturally rise
to the top, and end up being well-maintained and fairly standard throughout
the community. A good example is mikeal's request
(<https://github.com/mikeal/request>), which I suppose is a urllib2 equivalent.

Modules like this are constantly referenced in other modules, and in tutorials
etc. By scrolling down this list (<https://npmjs.org/browse/depended>) you can
get a sense of which modules now form the community curated 'standard library'
for node - if you were a new node dev looking for a very common piece of
functionality for your project, this might be a good place to start looking.

------
jacques_chester
By analogy, bacteria introduced to a freshly sterilised petri dish are
changing the face of existing, thoroughly colonised petri dishes.

------
teleological
Bully for Node, but to compare apples to apples, you should count from when
the packaging system was released, not from when the language was released.
Node's numbers wouldn't look so good if divided by the number of years since
JavaScript was released.

Rubygems dates back to 2003, in which case Ruby's 5,439 packages per year
still trails Node's pace but not so dramatically.

------
dr_faustus
It's ridiculous to compare the take-up of an execution environment to that of
new programming languages. Of course it takes MUCH longer for a language to
get traction than for a simple piece of infrastructure using a language that
has been around for 14 years (at the time of node's inception).

Many packages in npm are just code which had been around for quite some time
before npm existed, or come from the client side and were repackaged for node
(underscore, backbone, jquery ...). So to make the comparison fair, let's
divide 26,966 by 18 years of JavaScript: about 1,500/year. Not bad, but
certainly far from game changing, considering the inclusion of JS in all major
browsers gave that language an initial boost that Ruby and Python never had...

~~~
TazeTSchnitzel
Um, I doubt most of the packages were existing JS code. Many, sure, but not
most.

------
bdcravens
Seems a bit unfair to show a rate of packages per year based on the full life
of each language, as the principal package repos haven't been around as long
as the packages/year figure implies.

~~~
jacques_chester
Paging Newton and Leibniz.

Newton and Leibniz to the HN thread, please.

~~~
drivebyacct2
I don't understand this reference. I have to say, I agree: I don't see a lot
here that is unique to node.js. If anything it's the culmination of timing
that reveals the current state of advancement and collaboration in today's
OSS/software world. But I think Go and Rust are emerging examples of a similar
phenomenon.

~~~
jacques_chester
> _I don't understand this reference._

Newton and Leibniz independently invented calculus.

------
davedx
I'm sorry, but it's not a good thing when there are multiple packages to do a
binary search.

------
jeffdavis
PostgreSQL has a somewhat similar philosophy of extensibility and moving
things out of core. Consider something like PostGIS: it's an entire first-
class geospatial system done entirely as an extension (I don't think any other
database can claim that).

It's still basically a batteries-included distribution, but it is very
extensible and getting more so. I expect to see many more domain-specific
extensions (you can already see a lot at <http://pgxn.org> ).

However, I will say that it should be a simple core but not _too_ simplistic.
Especially for something like a database, some things are better done in the
core (perhaps not inherently, but it's in research-project territory). That's
really the challenge: it's not "pro-modularity" versus "anti-modularity"; it's
about what constitutes a good base from which to build.

------
fijal
He got me to the point where he divided number of packages by number of years
(hell, I did not have _internet_ in 1991, so why would there be packages???)

------
petercooper
_ruby: 54,385 packages / 18 years = 3022 packages per year_

Gems / rubygems haven't been around 18 years. The first public release was in
2004. So 9 years or 6042 packages per year is probably more realistic.

~~~
jagira
And, most of those gems have been written after Rails became prominent
(somewhere around 2006/7). Denominator should be 6 or 7.

------
waxjar
The metric OP uses (total # of packages divided by the # of years the language
has existed) is misleading.

Node.js is new, a lot of people are trying to solve a lot of problems that
aren't solved yet. Once they're solved and have a more or less standard
solution the rate at which new packages are released will drop, simply because
most problems will have been solved.

------
shocks
I find the quality of these packages to be quite low, in general. :/

------
pistacchioso
Irrelevant, but I must say it: Node.js's forced async IO is its strength, but
programming it is a pain in the butt.

~~~
dagw
I used to think so too; then I stopped trying to use it for everything and
started treating it as a DSL for just writing IO-heavy asynchronous web
services, and now I'm much happier.

~~~
argonaut
This sounds interesting. Can you elaborate on what you mean? Do you mean using
node in conjunction with another backend language/[micro-]framework?

~~~
dagw
Basically. My general setup (to the extent that it is reasonable) is to set up
my web apps as a collection of freestanding APIs that simply send and receive
JSON. The only component in common between the APIs is the back end
database(s). This way I can write each API using the framework/language that
makes most sense for the task and simply chuck Nginx in front of the whole
thing to have it route requests to the right back end.
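A minimal sketch of that routing layer in Nginx (ports and URL prefixes are
invented for illustration):

```nginx
# One backend per freestanding JSON API; Nginx just routes by prefix.
server {
    listen 80;

    # IO-heavy async endpoints handled by a node process
    location /api/feed/ {
        proxy_pass http://127.0.0.1:3000;
    }

    # everything else handled by a backend in another language
    location /api/ {
        proxy_pass http://127.0.0.1:8000;
    }
}
```

Since the APIs only share the database, each one can be swapped or rewritten
without touching the others.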

------
malandrew
Great post. I'd also add to that list the benefits of local-by-default
packages that live in your project folder. The approach of tools like
easy_install, pip, bundler, etc. is to hide all your dependencies in some dot-
prefixed folder outside your project directory. This promotes an out-of-
sight, out-of-mind mentality.

The npm approach, on the other hand, puts them in your project and makes it
much more likely that you are going to explore the source of your
dependencies, more likely that you will fix bugs in those dependencies instead
of working around them, and, most importantly, that you'll treat that folder
like part of your own project. This last point means that you are likely to
develop one of your own generic modules to the point where it's good enough to
submit back to npm as a public module.

------
niggler
"I’ve only very rarely seen a node.js project that wasn’t on github."

To an extent, that's due to npm's git integration and the lack of strong free
(for open source) git hosting alternatives. You can specify a git endpoint for
a module import.

~~~
rodw
> You can specify a git endpoint for a module import.

But you lose some of the magic that comes from expressing dependencies through
semver versioning. You can publish multiple versions of a module sourced from
git, of course, but that takes a lot more manual effort for publishers than it
does to publish on npmjs.org.
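For comparison, the two dependency styles in a package.json (module names are
hypothetical):

```json
{
  "dependencies": {
    "registry-module": "~1.2.0",
    "git-module": "git://github.com/someuser/git-module.git"
  }
}
```

The registry form lets npm pick any satisfying 1.2.x release automatically;
the git form fetches whatever the default branch happens to contain unless you
manually pin a commit or tag after the URL.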

------
smandou
Java has more than 100k packages in 16 years... more than 6k per year as well.

------
jpadkins
The author uses the term 'monoculture', but I think the better term might be
'network effect'. There are a lot of positive network effects from all these
projects hosting on github.

------
endlessvoid94
quietly?

------
cmccabe
I don't agree with the idea that there should be no standard library. There is
a lot of benefit to having a big standard library with code for common tasks
like manipulating strings, performing network I/O, or formatting text. Sure,
you might be able to improve on the standard library, but the next person who
has to maintain your code probably won't thank you for using something weird.
And if your new thing is awesome enough, it might be added to the standard
library in time.

Things like this were a big problem in C++. In the early days of C++,
literally nothing was standard. Everyone ran around using his own string
class-- "std" string was named aspirationally, not to reflect reality. They
all behaved slightly differently. It has taken literally decades to get to the
point where std::string appears in most new projects in the UNIX world. (I
think that Windows is still hosed, due to the continuing UCS-2 train wreck on
that platform... but I digress.) Similarly, there are 31 different flavors of
smart pointer, and even a master C++ programmer won't know them all. So much
pointless non-standardization, so much cognitive overhead.

I'm also reminded of Perl's "there's more than one way to do it" and the
"enhanced job security" that ended up providing for anyone who managed to
sneak Perl into production.

There is value in modularity, but only when the choice you're giving to the
library user is a valuable one. Choosing whether to use a MySQL database
connector or a Postgres database connector is good diversity. Choosing which
string class to use, or what we're calling BigInt this month, or which of 15
Regex libraries we're using today... that kind of diversity just gives you
headaches and maintenance pain, nothing more.

~~~
glenjamin
All of the things you listed _are_ within the scope of node.js's standard
library - except for BigInt.

