
What the web platform can learn from Node.js: The Darwinism of small modules - ingve
https://developer.atlassian.com/blog/2015/11/what-the-web-platform-can-learn-from-nodejs/
======
dham
Web development is truly in a sad state. Having worked on a larger Node
project, I'll never do it again. The only thing you can do is npm shrinkwrap
and never update your dependencies ever again. If you update, you are f'ed.
It's a nightmare. Semver doesn't help at all. For instance, Bookshelf added a
method called count. A co-worker was on 0.4 and using count; everyone else was
on 0.3. 0_0.
[https://github.com/tgriesser/bookshelf/issues/888](https://github.com/tgriesser/bookshelf/issues/888)

If you want things to just work like in the Ruby community, make sure you
shrinkwrap.
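
For context, the floating ranges that make shrinkwrapping necessary look like
this in package.json (package names from this thread; the version numbers are
hypothetical). A caret range lets npm install any compatible newer release,
while a bare version pins one exact release:

```json
{
  "dependencies": {
    "knex": "^0.8.0",
    "bookshelf": "0.8.2"
  }
}
```

`npm shrinkwrap` goes a step further and records the exact resolved version of
every package in the whole dependency tree, not just your direct dependencies.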

I don't think most people run into this because most backend node programmers
have moved to Go at this point, so they're not aware of the upgrade problems.

On the front end, people are probably rewriting the front end every 3 or 4
months, moving from JavaScript framework to JavaScript framework. They've
never dealt with upgrading 6to5 to Babel, then Babel 4 to 5 to 6, and npm from
2 to 3. Or browserify, browserify react hot reload, react-proxy v1 only, gulp.
Ugh.

I'm not sure it's possible to upgrade a project using Babel 5, async/await,
generators, and React to Babel 6. I've spent hours on it.

Sometimes I think it's all some kind of cruel joke. Look at Babel: a library
that doesn't do anything without installing a plugin. At least support
transpiling ES6 and ES7 to ES5 out of the box and everything else with
plugins. I have to install like 4 plugins to get anything running.

~~~
Scarbutt
_I don't think most people run into this because most backend node
programmers have moved to Go at this point, so they're not aware of the
upgrade problems._

How does Go solve this?

~~~
lobster_johnson
Go's static typing goes a long way in helping catch breakage due to things
like adding or removing symbols. (However, it does have some warts where
incompatible versions may silently compile, like initializers.)

While Go doesn't have package management, it does have a mechanism (go get)
for automatically fetching and compiling dependencies. However, people quickly
realized that there's no way to pin dependencies; it will always fetch the
master branch's HEAD, so if you "go get" a project, it will fetch some
arbitrary version depending on what time of day it is. The community has
devised various ways around it, and recent versions of Go can resolve packages
from a local "vendor" directory, but it's not a solved problem. No solution
stands out from the rest as particularly good or feature-complete when you
compare them against the historical alternatives (Ruby's Bundler in
particular).

~~~
paulddraper
Always getting HEAD seems like a rookie mistake that rejects decades of
lessons from mature systems and is a real deal breaker.

But that may just be me. I think the same thing about generics, and others
clearly don't.

~~~
muraiki
As far as I understand it, always fetching HEAD was the default because Go was
developed with Google's giant monorepo full of vendored dependencies in mind,
where doing this is both safe and sane.

FWIW, I've been using Go 1.5's vendoring alongside govendor and it seems
pretty nice. While this situation is a bit of a disappointment given the
generally awesome tooling in Go, it's hopefully something that will be
resolved soon.

~~~
kashif
This is the kind of reasoning that bothers me about Go. How can you write a
language that you expect the world to use based on assumptions that are only
true within Google?

Brittle.

~~~
muraiki
Well, I think that it is reasonable to say that the practice of vendoring
dependencies is not limited to Google alone. It might not be as common in
dynamic languages, but it's fairly common in static languages like C++, which
Go was designed to replace. `go get` is basically a better curl for _getting_
dependencies, not _managing_ them. The language authors and the community are
addressing the problem of management right now.

Before dismissing Go, please try to understand its history and the fact that
it is changing.

------
drinchev
HN, why all the hate against NodeJS? It's still (IMHO) amazing.

Yes, the versioning is sad and breaks almost every time. Also, some modules
look fragile, and what was good yesterday is not good anymore today.

But still, these are the reasons why I stick with NodeJS:

1\. Electron. Simply awesome - I copied 90% of my app, changed the Sequelize
ORM to use SQLite for its storage, and boom: I have my webapp as an
OS-packaged app.

2\. Isomorphic code - I'm rendering React / Flux applications on the server-
side and on the client-side. Reusing code, storage and syncing logic between
them. Even my logger module works on client-side / server-side.

3\. Front-end / back-end need very similar skills - This is one of my top
reasons to tell my clients (I work as a freelancer). It will not be a big deal
for a front-end dev to write / understand backend code, and vice versa. Also,
it's JavaScript, the most popular language.

4\. Libraries for most popular services - There is always something someone
wrote that is NodeJS, or at least JavaScript.

P.S. I'm heavily using TypeScript nowadays. I think this solved a lot of
NodeJS syntax/compiler problems.

~~~
pikzen
>1\. Electron. Simply awesome - I copied 90% of my app, changed the Sequelize
ORM to use SQLite for its storage, and boom: I have my webapp as an
OS-packaged app.

And your users get to enjoy awful performance and insane resource usage for
their app. How generous of you.

Also, the hate for NodeJS is because Javascript is a terrible language, no
matter how many things you put on it (TypeScript, Babel, etc.). It's also
because V8 performance is laughable.

Everything you mentioned is also mentioned in
[http://notes.ericjiang.com/posts/751](http://notes.ericjiang.com/posts/751) ,
which I believe does a nice job of explaining why NodeJS is terrible.

~~~
drinchev
I usually answer "Javascript is bad" statements with "English is not the most
declarative language either, but... it's the most common". Anyway, I think JS
is not part of my statement, so I'm not going to debate why JS sucks. That's a
matter of opinion. And contrary to logic, it works well once you get used to
it.

About Electron: I think nobody will build Photoshop or 3D games (at this
point) using it as a platform. I would personally decline if I got an
assignment like that. But for apps like Slack, Visual Studio Code, Atom, etc.,
it works pretty well. What's more, I'm actually working on my own accounting
app that suits my needs, which I would not be able to do in the limited spare
time I have these days if I used Objective-C / Swift.

~~~
pikzen
>But for apps like Slack, Visual Studio Code, Atom, etc., it works pretty
well.

I won't get into a long rant, but aside from on OS X, Slack and Atom are
slow, resource-hogging, and unstable piles of crap. Atom couples that with bad
architecture and engineering choices, but that's not Electron's fault.

(Also, your comparison with English is bad. English naturally evolved into
the state it is in today. Javascript was designed from scratch, and it still
manages to be awful. Also, sorry, Javascript isn't the most common language,
thankfully.)

~~~
tracker1
I have an i7 desktop with 32gb of ram... a recent rMBP i7/16gb-ram for my
personal laptop, and another work-issued one... I also use VMs with Linux
frequently... VS Code is one of the better code editors I've used. Granted,
I'm on fast machines with plenty of ram... I really haven't had a serious
problem with it... hell, my browser typically uses up way more ram than any
other single application on my computer... it still does the job well
enough...

That's the key right there "Well enough"...

------
tlrobinson
I've run into so many npm errors over the past couple weeks, some due to npm
bugs, some due to buggy packages, that I'm not very confident in npm or the
package ecosystem any more.

First of all, most people don't actually follow semantic versioning, but npm's
default version ranges assume they do, so unless you lock down your
dependencies with `npm shrinkwrap` you'll find your app occasionally breaks
because some deeply nested dependency broke something in some package you
don't know anything about.
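
As an illustration of how those default ranges behave, here is a minimal
sketch of caret-range matching in plain JavaScript (not npm's real resolver;
prerelease tags are ignored). The point is that `^1.2.3` silently admits any
newer 1.x release, so an unintentionally breaking minor bump flows straight
into your build:

```javascript
// Illustrative sketch (not npm's actual resolver): why a "^" range in
// package.json can silently pull in a newer, possibly breaking release.
// Assumes plain x.y.z versions; prerelease tags are ignored for brevity.

function parse(v) {
  return v.split('.').map(Number); // "1.2.3" -> [1, 2, 3]
}

// A caret range "^1.2.3" matches any version >= 1.2.3 with the same
// major number (for 0.x versions npm is stricter and pins the minor).
function satisfiesCaret(version, base) {
  const [vMaj, vMin, vPat] = parse(version);
  const [bMaj, bMin, bPat] = parse(base);
  if (vMaj !== bMaj) return false;
  if (bMaj === 0) {
    // ^0.3.x only matches 0.3.x -- 0.x minors are treated as breaking
    if (vMin !== bMin) return false;
    return vPat >= bPat;
  }
  return vMin > bMin || (vMin === bMin && vPat >= bPat);
}

console.log(satisfiesCaret('1.3.0', '1.2.3')); // true: auto-upgraded
console.log(satisfiesCaret('2.0.0', '1.2.3')); // false
console.log(satisfiesCaret('0.4.0', '0.3.0')); // false: 0.x pins minor
```

If a maintainer ships a breaking change in a "minor" release, every range
like `^1.2.3` accepts it, which is exactly how a deeply nested dependency
breaks an app overnight.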

But shrinkwrap has a bunch of problems itself (so many that Uber wrote a
separate tool to try to fix it: [https://github.com/uber/npm-
shrinkwrap](https://github.com/uber/npm-shrinkwrap)). npm-shrinkwrap.json
files are _huge_, in part due to the nature of npm, but they also contain a
bunch of unnecessary information. The bigger problem is that shrinkwrap is
_non-deterministic_, resulting in huge diffs every time you run it, even if
nothing has changed.

Here's another fun bug I encountered recently:
[https://github.com/npm/npm/issues/9642](https://github.com/npm/npm/issues/9642)

Every time I think I've worked around some bug, another pops up. I wish I'd
kept track of them all.

/rant

~~~
Klathmon
Can't you just check dependencies into your source control if you really need
consistency at that level?

------
matthewtoast
I'll share a thought or two while I wait for $ npm install to finish.

I agree that other ecosystems could learn a lot from the Node philosophy. And
in general I enjoy working with the technology. But it's not all roses. In an
ecosystem of small modules, you often find yourself wading through many --
equally small -- annoyances which can add up to a lot of friction. Here's a
contrived fake example:

Your app needs ABCXYZ. So you go shopping for a module. The most popular one
looks good, but when you Browserify it your bundle ends up with 1,000s of
extra lines of code for the Buffer shim. So much for keeping that footprint
small. So you keep looking and find another one that has a good footprint, but
a weird API and it hasn't been updated by the maintainer since 2011. The next
one you find looks great, but the maintainer decided they were tired of ES5
and didn't feel like adding a distfile, so you would have to install a bloated
transpile toolchain to use it. You're getting annoyed but you keep going. The
next one's author inexplicably decided to implement everything with streams.
But you're too busy to spend a day grokking streams and reworking your entire
API to be compatible. The next one is packaged as a jQuery plugin. The next
one doesn't have a license. The next one uses Web Workers for no reason you
can see. Etc. etc. etc. Repeat.

Any of these could be problems with a monolith (and monoliths certainly come
with their own pile of frustrations), but monoliths are less likely to hit
these problems simply because they consolidate effort and personal emotional
investment around a certain set of techniques and standards, which leads to
more consistency overall.

~~~
tracker1
Or, fork the light library, and work around the troubling bits? Many (most?)
npm modules are ISC, MIT or something equally permissive.

Several times I've found something close to what I want, forked it on GH,
changed the bits I wanted, updated the name everywhere, and published it...
many times the forks continue to work as-is without much grief. Of course,
that's part of why there are so many options for any problem.

I usually look at the following: when was the last commit, and how many
issues have been sitting stale for over a month... If the last commit was
several months ago, but there aren't any recent issues, I'll check through the
backlog issues... Some libraries reach maturity and don't change much; if the
issues aren't recently curated, I move on...

~~~
matthewtoast
I agree and that's what I like to do too when these issues come up (time
permitting). Still, I hope my basic point came through: the small modules
approach isn't a panacea; it can add its own set of problems; "monoliths" can
be a good choice, not always, but sometimes.

------
Osiris
I work on a fairly large node.js project with over 100k LOC, and I rarely run
into issues where a minor update to a dependency breaks the application.

thrift is the only module where I've locked the version number in
package.json.

Maybe it's easier because I don't shrinkwrap: I also get the latest versions
when CI runs, and that allows me to make sure that my application is always
compatible with the up-to-date releases of all the dependencies. Shrink-
wrapping has the downside of actually making it harder to upgrade
dependencies, because you can get further and further behind on updates.

------
tjholowaychuk
Not being able to create something useful out of the box on the web is pretty
painful. I have the same problem with node, you need 1000 modules with
ridiculous names to do anything.

Fundamental core patterns like streams are still broken and suffer from huge
UX issues. I hope future versions will use promises and async/await in core
and redo streams to utilize them, more like Go's readers and writers. Just by
moving that stuff into core, the UX without third-party modules would improve
drastically, and it would encourage better practices throughout third-party
code as well.
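
As a rough illustration of the Go-style reader shape being wished for here,
this is a hypothetical sketch (the `Reader` class and its `read()` API are
made up, not Node core): a promise-returning `read()` that resolves `null` at
end-of-stream replaces the usual 'data'/'end'/'error' event juggling:

```javascript
// Hypothetical sketch of promise-based streams, roughly in the shape of
// Go's io.Reader: read() resolves with the next chunk, or with null once
// the source is drained. Nothing here is Node core API; it only
// illustrates the UX being asked for.

class Reader {
  constructor(chunks) {
    this.chunks = chunks.slice(); // copy so reads don't mutate the input
  }
  read() {
    const next = this.chunks.length ? this.chunks.shift() : null;
    return Promise.resolve(next);
  }
}

// Consuming the stream is a plain loop -- no event handlers to wire up,
// and errors propagate as ordinary promise rejections.
async function readAll(reader) {
  let out = '';
  let chunk;
  while ((chunk = await reader.read()) !== null) {
    out += chunk;
  }
  return out;
}

readAll(new Reader(['hello, ', 'world'])).then(function (text) {
  console.log(text); // "hello, world"
});
```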

I'm not bashing Node; I'd love for it to do great. I'd love to use Lambda out
of the box with no callbacks, and not have to worry about which streams are
missing which handlers, or whether some third-party module implements them
correctly or will explode on me :D.

------
homulilly
While npm and semver are far from perfect, I've personally found them to make
dependency management a lot less of a headache in node than in, for example,
python, which is an absolute nightmare to deal with. You still have to make
sure you're using fairly stable, well-written packages by people who actually
understand semver and best practices, but the same applies everywhere, and
I've found the node community's attention to this kind of detail better than
in most other ecosystems.

------
BinaryIdiot
I don't know; as a very heavy user of node, I absolutely hate the hundreds
and thousands of dependencies. A vast, vast majority of modules never maintain
an LTS version, so when one module has a security update, you had better have
the latest of everything (especially if it's a very common dependency);
otherwise you're stuck backporting security patches.

I've had to backport security patches far more often than I would ever like
to admit.

The node community is still pretty immature. Sure, it's nice to rush ahead
releasing version after version without maintaining an LTS version, but when I
want to build something serious with it, it can be downright frustrating.

------
phillmv
I've heard this claim before and it feels like a mixture of stockholm syndrome
and wishful thinking.

In my experience, it's a bit of a shit show. You end up with libraries that
are subtly incompatible, or have overlapping functionality, or strangely large
holes of functionality that would require overlapping combos of libraries and
that therefore nobody has written.

If you get your socks off writing shims and glue and maintaining it against a
thousand moving targets, power to you.

In my opinion, what you _really want_ is someone to bundle composable
libraries for you. That bundling has incredible value. You know that
everything more or less works, in ways that you'd expect, because someone else
managed to test it.

You have all the functionality basics down pat. The shims you do have to write
work at higher levels of abstraction. Etc, etc.

~~~
Swizec
> In my opinion, what you really want is someone to bundle composable
> libraries for you. That bundling has incredible value. You know that
> everything more or less works, in ways that you'd expect, because someone
> else managed to test it.

Sooooo a big monolithic framework like Rails or Django? I thought we said
those are bad.

~~~
woah
I think that at this point we can conclude that monolithic frameworks and
minimal modules are BOTH bad. Software development is bad.

------
spo81rty
One app with 1,000 package dependencies? Sounds like a nightmare to me when it
comes time to update them and ensure they didn't break anything... especially
in a language without a compiler to help.

~~~
muraiki
This is why they rely on semantic versioning.
[http://semver.org/](http://semver.org/)

Edit: I know this doesn't solve the problem completely, but comparing, for
instance, a typical CPAN module's versioning scheme with what you get in npm,
I'm far more confident in a lack of breakage in the npm case. :)

Edit2: For downvoters: I program in Perl and Javascript professionally. I'm
speaking from experience (my boss just ran into this exact problem with a cpan
module in production yesterday), not picking on Perl.

~~~
BinaryIdiot
When you're looking at modules individually, yeah, semver is useful. But when
you're looking at modules that include other modules that include other
modules? When one person makes the decision to upgrade, everyone else either
has to upgrade or not, and I rarely see even big projects start an LTS version
that they keep updating; no, most of them move on the second the next big
version is released.

This can be painful when you're dealing with a really big project and someone
provides a security update but it's only for the newer version that has lots
of breaking changes.

~~~
muraiki
Yes, I agree that this is a problem. You might be using foo in your project,
foo gets a security update, and you have two other packages that depend on
foo, one of which is still using the vulnerable foo. So you now have two
versions of foo installed, including one that is vulnerable.

I'm curious to know how other dynamic languages handle this. In Perl I think
you generally can't safely load >1 version of the same module, so while you
could end up with breakage on an update, you can't end up with the above
security problem.
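
The duplication described above can be sketched with a toy dependency tree
(the packages and layout below are hypothetical): npm will happily nest two
copies of `foo`, so patching your direct dependency does not remove the
vulnerable copy that another package still carries:

```javascript
// Toy model of a nested node_modules tree (hypothetical packages and
// versions). The app depends on a patched foo directly, but bar still
// bundles the old, vulnerable foo -- so both versions end up installed.

const tree = {
  name: 'app', version: '1.0.0',
  dependencies: [
    { name: 'foo', version: '1.0.1', dependencies: [] }, // patched
    { name: 'bar', version: '2.0.0', dependencies: [
      { name: 'foo', version: '1.0.0', dependencies: [] } // vulnerable
    ] }
  ]
};

// Walk the tree and collect every installed version of a package.
function versionsOf(node, name, found) {
  found = found || [];
  if (node.name === name) found.push(node.version);
  node.dependencies.forEach(function (dep) {
    versionsOf(dep, name, found);
  });
  return found;
}

console.log(versionsOf(tree, 'foo')); // [ '1.0.1', '1.0.0' ]
```

Because Node resolves each require() against the nearest node_modules, both
copies really do get loaded, which is what makes the lingering vulnerable
version easy to miss.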

------
jondubois
I'm pretty unhappy with the current state of software development. Developers
who have a voice are often extremists with very strong ideas (and sometimes
very bad ideas).

The idea that you should use a module to check whether a number is -0 is just
insane.
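
For reference, the check such a module performs fits in a line or two of
plain JavaScript, with no dependency needed:

```javascript
// Detect negative zero. -0 === 0 is true in JavaScript, so a plain
// comparison can't tell them apart; these two approaches can.

function isNegativeZero(x) {
  return Object.is(x, -0); // ES2015 Object.is distinguishes -0 from +0
}

function isNegativeZeroES5(x) {
  return x === 0 && 1 / x === -Infinity; // 1 / -0 is -Infinity
}

console.log(isNegativeZero(-0)); // true
console.log(isNegativeZero(0));  // false
```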

Developers need to wise up and accept the reality that there is no single
right or wrong way to do anything. The problem is that people like radical
ideas. No one wants to hear the word 'sometimes' \- which is actually the
right word to use 99% of the time.

Question:

\- Should you use as many small modules as possible?

Answers:

\- Famous developer: Yes, always break everything into small modules because
it encourages code reuse, composability, collaboration. That's the Unix
philosophy.

\- Moderate developer: Small modules are great sometimes, but in some cases
you just want a complete tool that just works and takes care of all the edge
cases (and gives you operational guarantees). Having too many separate modules
increases management complexity and can make the upgrade/deployment process
more complicated and time-consuming. Sometimes having too many small modules
can make your code difficult to read, because people are forced to read
through pages of documentation on GitHub every time they encounter a new
module.

Question:

\- Are microservices always the right approach to software development?

Answers:

\- Famous developer: Yes, always break up your system into separate services -
This promotes separation of concerns which makes your system easier to manage.

\- Moderate developer: Sometimes. If you have a small team, you don't want to
be managing too many different services and having too many code repos -
Sometimes it's better to start out with one service, but build it in a way
that it can easily be split out later on.

Question:

\- Should you use Docker?

Answers:

\- Famous developer: Yes, you need to Dockerize your whole stack from the
beginning if you want fast deployment.

\- Moderate developer: Sometimes. You should be selective about which
technologies you use within your project and try to keep the deployment
process as simple as possible. If you can keep your technology count to a
manageable level, you may end up not needing Docker.

Question:

\- Should you use promises in JavaScript?

Answers:

\- Famous developer #1: No, promises are evil, they are slow, they swallow
your errors and they make your logic impossible to follow.

\- Famous developer #2: Yes, promises should always be used. Callbacks make
code difficult to read and cause callback hell.

\- Moderate developer: Promises are good sometimes, but you have to be
careful how and when you use them. You don't want really long chains of
promises spread across multiple files. Sometimes callbacks are more
appropriate, if you design your system in such a way that nesting is minimal.
Ultimately it comes down to what makes your team happy.

------
ilaksh
[https://github.com/WebAssembly/design/issues/363#issuecommen...](https://github.com/WebAssembly/design/issues/363#issuecomment-150891842)

------
falsedan
Reminds me of the plethora of packages on CPAN; juxtapose with PyPI & trying
to convince python devs that releasing a distribution that only does one
thing/has one package wasn't a waste of time…

~~~
mhd
I sometimes think that in the JS world, List::Util would be 12 different
packages…

~~~
substack
I think that's preferable to remembering whether any given method is in
List::Moreutils, List::Util, List::AllUtils or some other package entirely.
You already know the name of the method you want, why can't that be the name
of the package?

~~~
riffraff
That is only true if you know the name in advance. Considering List::Util, I
can look at the module documentation for a function that returns the first
element of each pair in a list and find "pairkeys", but I would probably not
come up with that name myself and match what the package author thought.

------
wereHamster
I still want something like stackage for node modules. Dealing with shrinkwrap
is pure insanity.

