
JavaScript Growing Pains: From 0 to 13,000 Dependencies - nikolalsvk
https://pragmaticpineapple.com/javascript-growing-pains-from-0-to-13000-dependencies/
======
jonny383
One can see why this happens when packages like [1] and [2] not only exist, but
are frequently used.

[1] [https://www.npmjs.com/package/is-even](https://www.npmjs.com/package/is-even)

[2] [https://www.npmjs.com/package/is-odd](https://www.npmjs.com/package/is-odd)

EDIT: my favourite part, is-even depends on is-odd.

[https://github.com/jonschlinkert/is-even/blob/master/index.js](https://github.com/jonschlinkert/is-even/blob/master/index.js)

~~~
caseymarquis
I immediately thought: "But no one would actually add that dependency!" Only
to see 120k weekly downloads of is-even.

I... don't understand...

~~~
onion2k
It's not _just_ checking if the number is odd. It checks that the input is a
number (the modulus of a non-numeric value is NaN in JS), that it's an integer
(not strictly necessary, I guess), and that it's within the bounds of JS's safe
integer range (again, arithmetic silently loses precision if the number is too
big). If you want to make sure your code is safe, those _are_ sensible things
to check.

A lot of JS devs wouldn't bother checking those things, so really a lot of code
is _improved_ by this library. It's kind of a shame that so many developers
just skip over validating inputs. That's the real problem.

~~~
berkes
The obvious solution would be to have some stdlib, or even a common-utils
project or something like it, which contains all such items in one package.

Because, indeed, no-one should be rewriting obvious code like isArray,
leftPad, isEven and so on.

Yes, that might cause _some_ overhead in some situations (when you need isOdd,
but not leftPad). But I'm certain that overhead is minimal. In practice,
however, we currently see dependency-trees where these "should-be-stdlib"
modules like "leftPad" or "isArray" are repeated numerous times in one
project.

Any stdlib could even be optimized by browsers and engines, turning that
"overhead" into a net benefit.

------
trollied
JavaScript really needs a proper standard library. Has any sort of central
authority ever tried to drive this forward? Maybe it should fall upon the ECMA
people to get this done, seeing as they’re the ones that define the ESx specs?
I don’t think npm/node is the answer either, this needs putting together from
scratch without being too bloated, much like the C or C++ stdlibs.

~~~
MauranKilom
> JavaScript really needs a proper standard library.

Well, Deno
([https://en.wikipedia.org/wiki/Deno_(software)](https://en.wikipedia.org/wiki/Deno_\(software\)))
has that as one of its explicit goals. As well as fixing a lot of the other
grown idiosyncrasies.

------
Epskampie
The only way for this deluge to stop is for big packages to adopt an 'as few
deps as possible' attitude. React clearly did so with only 22 deps; it's time
for others like webpack to do the same.

~~~
rumanator
That, or for the JavaScript community to get their shit together and start
investing in a modular standard library, akin to C++'s Boost, to bundle basic
functionality.

~~~
the-dude
Smells like jQuery.

~~~
rumanator
> Smells like jQuery.

It does, if all you care about is mainly DOM stuff. That use case is taken off
the table once you adopt any framework such as React, Vue, Angular, etc.

------
alkonaut
How many of the 13,000 dependencies are duplicates in different versions? Would
it help if npm didn't allow more than one version of a dependency? If I
understand correctly, you currently aren't even warned if you have transitive
dependencies on two versions of a library?

------
kumarvvr
I was a JavaScript enthusiast until a short while ago.

But now I strongly encourage every developer to look at .NET Core and ASP.NET
for web development.

The learning curve is slightly steep, but it's well worth it. Especially if
you are a lone developer deploying to a VM in the cloud, the cost savings from
the enhanced throughput alone are substantial.

Creating a web api is a breeze.

Standard websites with a few pages and some forms can take advantage of razor
pages.

And Blazor is an awesome alternative to React/Angular/Vue etc.

Edit: Also, the full tech stack, including the developer tools, is open source,
with proper licensing. And the open source dev tools experience is also very,
very good. VS Code is an awesome IDE.

~~~
varrock
Thank you for writing this. I saw how well ASP.NET Core performed in the 2020
Stack Overflow Developer Survey and I was curious to hear from people who have
had good experiences with it.

[0] [https://insights.stackoverflow.com/survey/2020#technology-most-loved-dreaded-and-wanted-web-frameworks](https://insights.stackoverflow.com/survey/2020#technology-most-loved-dreaded-and-wanted-web-frameworks)

------
satya71
The author left out another option. Run away from JS ecosystem when you can.
Unfortunately, it's not possible for client-side code on the web.

~~~
ScottFree
We're at peak JS, and WASM hasn't managed to get its JS integration done to
any acceptable degree. Where do we go from here? Simply avoiding the web isn't
viable at this stage.

~~~
RedShift1
We used to have more options: there were Flash and Java. Flash is insecure, and
so is Java (it's basically running code that can do anything a native
executable can do). Java also has the competing-JVM problems, backwards
compat, ... But instead of getting our hands dirty and fixing the issues, the
powers that be decided to throw away all the hard work of the past and move to
WASM, which is incomplete, barely supported, and will have the same security
issues Flash and Java had once it gets broader adoption. And so the circle
continues.

~~~
lokedhs
Java had sandboxing from the beginning. There were some bugs that allowed
client code to escape the sandbox, but it wasn't nearly as bad as some people
made it out to be. I'd dare to guess that JavaScript has had more
sandbox-escaping bugs than Java ever had.

------
hoorayimhelping
Something that may not be obvious: JavaScript absolutely cannot, under any
circumstances, break backwards compatibility. If JavaScript breaks backwards
compatibility, it literally breaks the web.

Most of the complexity we're seeing in projects is around that constraint
applied to 'modern' JavaScript - we have to translate all of our modern code
into older code that older browsers can still understand. JavaScript can't
just forget the past right now - there might be a point in the future where
it's given the IE6 treatment: "if you're using a browser that can't support a
standard that is 10+ years old, tough, you need to upgrade." But we have to
all move there, together, as an industry. It took a long time for us to dump
IE6 in that way, it'll take a while before we can dump ES5.
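As a sketch of what that translation looks like, a transpiler rewrites modern syntax into an older equivalent; the ES5 version below is hand-written to illustrate the idea, not actual Babel output:

```javascript
// Modern (ES2015+) source:
const double = (xs) => xs.map((x) => x * 2);

// Roughly what a transpiler would emit for ES5-only browsers:
var doubleES5 = function (xs) {
  return xs.map(function (x) {
    return x * 2;
  });
};

// Both produce the same result; only the syntax differs.
console.log(double([1, 2, 3]));    // [ 2, 4, 6 ]
console.log(doubleES5([1, 2, 3])); // [ 2, 4, 6 ]
```

Syntax can be rewritten this way, but new built-ins (Promise, Map, etc.) additionally need polyfills, which is where much of the dependency weight comes from.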

This is exacerbated by the fact that there is no standard library in
JavaScript. Check out some of webpack's dependencies:
[https://github.com/webpack/webpack/blob/master/package.json#...](https://github.com/webpack/webpack/blob/master/package.json#L14-L23).
A lot of these are replicating basic functionality a standard library should
provide.

But we can't just blame this on environmental factors - a lot of it feels very
much self-inflicted. For example: node provides a file system utility, but
webpack uses a different one. Maybe the fs utility can't solve all of their
problems and they need an extended one. Instead of just using the standard
file system and dealing with its quirks, we write a new, slightly better one
and use that instead. Locally, it might make sense for webpack to do that, but
when you do it in aggregate across the industry, you end up with the
death-by-a-thousand-cuts situation we find ourselves in.

~~~
dktoao
To me it is preferable to write JavaScript as if it were 10 years ago (using
slightly more rudimentary functionality) than to deal with transpiler hell
just to write shiny ES6 code. Old tech gets the job done, and if you are
aiming for stability, it's probably the better choice. I know it isn't exactly
a fair comparison, but C developers don't code everything in C18 and then
transpile to C99 for maximum platform compatibility (I hope). We usually just
use the older standard until the shiny new one is supported on the platforms
we care about. I may know of one or two legacy C89 code bases that still exist
and have not caused the sky to fall yet.

------
bambam24
Can you imagine trying to find 22 new packages for your Java application?

------
awinter-py
> there’s not much you can do except be aware of what you add to your project
> ... try to avoid nested dependencies when possible

how? cultural expectations around transitive dep size?

I'm _certainly_ making decisions about graph complexity when designing
software in house (avoid circular import deps in code, be careful about call
depth in service graph)

But I've given up doing this for 3rd party deps in JS, except _maybe_ at the
point where `npm install` starts to bog down and I'm like 'this one isn't for
me'

------
austinpena
Why is javascript (seemingly) so much more prone to dependency hell?

Is it just the standard library falling short?

~~~
williamdclt
- Standard lib falling short
- No control over the execution env (the user's browser), which means we need
polyfills, transpilers, abstractions
- Web is everywhere => JS is needed for many, many things => lots of libraries
for all sorts of stuff => common problems extracted into small libs (which has
good and bad aspects)
- The last point is exacerbated by JS being both a frontend and a backend
language
- Developers tend to be more junior on average (not sure that's actually true;
it's believable, but I've never seen numbers)
- NPM made it painless to add dependencies, so that's what happens

~~~
Sebb767
Part of it also stems from the fact that NPM installs dependencies as trees.
I.e., if you and two of your dependencies use is-odd, you will have three
copies of it lying around, possibly in different versions.
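Concretely, with hypothetical package names and versions, the resulting on-disk layout looks something like:

```
node_modules/
├── is-odd/                  # your copy, e.g. 3.0.1
├── dep-a/
│   └── node_modules/
│       └── is-odd/          # dep-a's copy, e.g. 2.0.0
└── dep-b/
    └── node_modules/
        └── is-odd/          # dep-b's copy, e.g. 1.0.0
```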

~~~
berkes
And now I'm stuck wondering why anyone would ever need to release a new
version of isOdd. Apparently isOdd has had 6 releases. Does that mean that the
previous 5 times they got "calculating whether a number is odd" wrong?

~~~
Sebb767
Looking at the commit history, they've mostly added checks for edge cases:
[https://github.com/jonschlinkert/is-odd/commits/master](https://github.com/jonschlinkert/is-odd/commits/master)

Which is, to be fair, exactly the reason you'd use a library like this.

~~~
berkes
> Which is, to be fair, exactly the reason you'd use a library like this.

There is a crucial nuance here:

Which is, to be fair, exactly the reason you'd use a library.

With which I mean to emphasize that, indeed, libraries are crucial. But
nano-libraries, like nano-services, are an antipattern.

This could be a stdlib (ideally), a bundle (like Lodash), or even a
numberUtils. Having a gazillion packages that have a ridiculous
boilerplate-to-code ratio, contain just one (well evolved!) function, and so
on, is doing more harm than good.

A library "like this" is harmful. If only because it gives people an argument,
or just the idea that libraries are stupid and should be avoided.

------
nojvek
One of the big reasons why I like TypeScript: it doesn't have any
dependencies. Sure, it's a big package, but it's very self-contained.

Don't use Babel if you don't want a gazillion dependencies.

The JavaScript ecosystem is vast, with many tools and trade-offs.

------
MoSattler
The problem is really not as big as most commenters here make it out to be.

With most modern bundlers, bundle size is rarely a problem anymore. `npm
audit` helps detect and fix security vulnerabilities in dependencies. And a
build process breaking because of dependencies is more a sign of a bad CI
pipeline.

I am not saying it is great to have many dependencies, but it's really not
THAT bad.

~~~
Sebb767
The problem is really not the size or the number of files (both are pretty
manageable by modern hardware), but that you execute mostly unverified code by
several thousand people. It only takes one of them to go rogue to have your
data siphoned, your drive encrypted ... you name it.

Sure, big projects (hopefully!) watch their dependencies, but auditing 13k
packages yourself is simply impossible. And, given how much these numbers
explode, it unfortunately seems like people simply add dependencies without
doing their due diligence.

~~~
MoSattler
This is what `npm audit` is for. It's not perfect, obviously, but it should
prevent the worst.

Besides, my guess is that the vast majority of developers do not verify their
dependencies in the first place, no matter whether that number is 5 or 5000.

------
mrlonglong
Just thinking about npm makes me feel really ill.

------
chrismorgan
I don’t believe “13,000 dependencies” is counting the right thing at all. In
actual fact, I think the number is only 756 or 691, depending on how you count
it.

If you look at the eventual package-lock.json and filter it to just lines
containing “resolved”, it’s only 756 lines, because it does plenty of
deduplication. I don’t have the time to waste on installing it all myself to
check, but I _think_ it fails to deduplicate some that could theoretically be
deduplicated, because of incompatible versions: I _think_ that if you have
dependencies _a_ → _x@1_ , _b_ → _x@2_ , _c_ → _x@1_ and _d_ → _x@2_ , it’ll
pick one of those versions of _x_ (no idea how it chooses—first it encounters,
perhaps?) to sit at the top-level node_modules, and any packages that need
another version will install their own version, so that in this situation you
might get node_modules/x (version 2, used by b and d),
node_modules/a/node_modules/x (version 1) and node_modules/c/node_modules/x
(version 1, again). I say this based upon very vague recollections of things I
read and interacted with years ago, and the structure of package-lock.json; I
may be wrong in the details of it.

This way of having multiple copies of the same version of the package is the
difference between 756 and 691—there are 65 exact duplicates. For example, you
get debug-2.6.9 at the top level, and then within other dependencies, you get
three copies of debug-3.2.6, and five of debug-4.1.1. That’s just one example.
There are eight copies of four different versions of kind-of. After excluding
these exact duplicates, there are then another 67 cases of multiple versions
of the same package being installed (kind-of’s four versions is the most).

A few days ago I looked at a case that was double the size of all this:
[https://news.ycombinator.com/item?id=23488713#23490055](https://news.ycombinator.com/item?id=23488713#23490055).

When you get duplicates with incompatible versions like this, it strongly
implies unmaintained, or occasionally incorrectly maintained, software. If
they all got their act together and simply updated to the latest version of
all their dependencies, the number of packages you’d install would not exceed
624.

Look, it’s still a lot, and I scorn many of them as unnecessary and
oft-counterproductive frivolities, and there’s way too much overlap among
them; but 13,000 is just a shock number that doesn’t represent what people
expect it to represent, or match what they’re concerned about.

(Also this number doesn’t mean you’re taking code from over six hundred
sources; some things are just split up into multiple packages because they
genuinely are separate concerns; for example, there are 93 packages named
@babel/*, indicating first-party code from Babel.)

------
speedgoose
The weekly JavaScript/NPM bashing thread is here. I've started to fill in my
bingo card.

~~~
rumanator
The fact that this problem is discussed weekly should be enough to raise a red
flag.

~~~
speedgoose
It's very repetitive and such a minor problem isn't enough to justify a red
flag in my humble opinion.

~~~
daze42
It's a minor problem until it becomes a major one. All it takes is one common
dependency to go rogue or a bug to be exploited without the maintainers being
around to fix it and a large part of the ecosystem becomes unusable. Using a
dependency implies an element of trust and now we have a huge web of trust
between thousands of maintainers that we really have no way to check. The
larger that number grows, the higher the chance of catastrophic failure.

~~~
speedgoose
Indeed, you can't trust that many people or check all the code of that many
dependencies. It's a chain of trust, but how is it different from other parts
of your stack?

Do you trust every employee working on the Intel CPU microcode? Every
maintainer of the Linux kernel? The people who maintain the glibc? The
developers of V8 and nodejs? Do you do the same for your database? Your cloud
provider? The codebase of your business partners?

I would guess you don't, despite most of what I cited being highly critical in
terms of security, and some being written in memory unsafe programming
languages with tons of critical issues all the time.

If an NPM dependency goes rogue, you will get notified. It will also make the
front page of HN and be mentioned in comments on news about JavaScript for
years.

But what will most likely happen: a maintainer will fix the issue. If the
maintainer is in jail because they killed someone, well, someone else can
still maintain it, or someone else will fork it and other maintainers will use
the fork.

In practice, when an NPM dependency of your project has a security issue, all
you have to do is accept a pull request from a robot on GitHub.

I know it's not perfect, but it's really not that bad.

~~~
daze42
You bring up very good points. I certainly don't trust most if any of the
other critical pieces of the stack. I think those are attack vectors or points
of failure as well and that risk needs to be mitigated. But on the other hand,
the NPM package web is much larger and the barrier to entry is much lower so I
would consider it to be much higher risk than the rest of the stack.

We as an industry need to put work into reviewing, simplifying, and increasing
visibility at all levels of the stack, especially firmware. We're building
high and fast and while standing on the shoulders of giants is a great place
to be, we need to make sure the giant is more than just a house of cards.

