Hacker News
JavaScript Growing Pains: From 0 to 13,000 Dependencies (pragmaticpineapple.com)
57 points by nikolalsvk 11 months ago | 88 comments



One can see why this happens when packages like [1] and [2] not only exist, but are frequently used.

[1] https://www.npmjs.com/package/is-even

[2] https://www.npmjs.com/package/is-odd

EDIT: my favourite part, is-even depends on is-odd.

https://github.com/jonschlinkert/is-even/blob/master/index.j...


Anyone who needs to test whether an input is an odd number using vanilla JS will eventually end up rolling something that looks a lot like isOdd() themselves (or they'll just ignore a set of potential problems, which is definitely worse). It's very obvious when you read the code - https://github.com/jonschlinkert/is-odd/blob/master/index.js.

Utilities like that are a function of using a weakly typed language. "Just use TypeScript" is an alternative solution, but even in TS you should still check the number is within the bounds of isSafeInteger.
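Rolled by hand, those checks come out to something like the following (a sketch with the same guards the thread discusses, not the package's actual source):

```javascript
// Hand-rolled isOdd with the input validation discussed above
// (a sketch, not is-odd's exact code):
function isOdd(value) {
  if (typeof value !== 'number' || Number.isNaN(value)) {
    throw new TypeError('expected a number');
  }
  if (!Number.isInteger(value)) {
    throw new Error('expected an integer');
  }
  if (!Number.isSafeInteger(value)) {
    throw new Error('value exceeds the maximum safe integer');
  }
  // Math.abs handles negative numbers: -3 % 2 is -1 in JS, not 1.
  return Math.abs(value % 2) === 1;
}
```

Even in TypeScript, the `Number.isInteger`/`Number.isSafeInteger` checks still matter, since the type system only rules out non-numbers, not non-integers.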


Are those checks actually needed?

"2.2 % 2 === 1" works just as well, so why is the isInteger check needed? Why is the isSafeInteger check needed? And "a % 2" is already NaN (and thus false), although an explicit error is arguably better than silently eating it.
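The cases in question are quick to verify in a console:

```javascript
// How % behaves for the edge cases above:
const a = 2.2 % 2;   // roughly 0.2, so (2.2 % 2 === 1) is already false
const b = 'a' % 2;   // NaN: the string coerces to NaN before the modulo
const c = NaN === 1; // false: NaN compares unequal to everything
console.log(a, b, c);
```

So the comparison alone silently yields false for non-integers and non-numbers, which is the commenter's point; the library's checks trade that silence for explicit errors.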


But why should isOdd, isEven and isNumber reside in separate packages? Surely a single math tools package should cover all of this?


> Surely a single math tools package should cover all of this?

They shouldn't; it's silly. You're right that there should be a "math" package[1], but as long as those things aren't part of the core JS ecosystem, people will keep writing utility libraries and publishing them.

[1] There is, it's https://mathjs.org/. More people should use it.


It's an overreaction to the criticism of bloated libraries like jQuery. Dead-code removal with tree shaking is a relatively recent feature in JS toolchains, so instead you have single-function libraries. For example, there are also npm packages with each lodash function as its own package [1]

1: https://www.npmjs.com/search?q=keywords:lodash-modularized


Think of it like being able to pull any function from a global scope, e.g. `var/const foo = require("bar")`. It's one gigantic standard library.


I feel like somebody ought to start a project which systematically goes through the ecosystem removing these dependencies.

For more complex micro-libraries, perhaps they could be collated into a single, larger high quality library.


Very often these stupid dependencies are at the leaves of the dependency tree, often in packages which no longer get new releases to NPM. Fixing those packages has zero net effect on the whole situation.



It gets better when you read the repo description.

> I created this in 2014, when I was learning how to program.

Someone probably just did this for the fun of learning and somehow it got adopted elsewhere.


I immediately thought: "But no one would actually add that dependency!" Only to see 120k weekly downloads of is-even.

I... don't understand...


It's not just checking if the number is odd. It checks if the input is a number (modulus on a non-Number value returns a NaN in JS), if it's an integer (not strictly necessary I guess), and if it's within the bounds of JS's safe integer range (again, you'll get a NaN if the number is too big). If you want to make sure your code is safe those are sensible things to check.

A lot of JS devs wouldn't bother to check those things, so a lot of code really is improved by this library. It's kind of a shame that so many developers just skip over checking inputs. That's the real problem.


The obvious solution would be to have some stdlib or even a common-utils project or something like it, which contain all such items. In one package.

Because, indeed, no-one should be rewriting obvious code like isArray, leftPad, isEven and so on.

Yes, that might cause some overhead in some situations (when you need isOdd, but not leftPad). But I'm certain that overhead is minimal. In practice, however, we currently see dependency-trees where these "should-be-stdlib" modules like "leftPad" or "isArray" are repeated numerous times in one project.

Any stdlib could even be optimized by browsers and engines, turning that "overhead" into a net benefit.


It doesn't.

  isOdd('') => false
  isOdd(null) => false
  isOdd(2n) => TypeError: Cannot convert a BigInt value to a number
But even if it did, the same line of reasoning leads to the absurd conclusion we should have an addOne function.


> I... don't understand...

You are not supposed to reinvent the wheel. Somebody has already gone through the trouble of implementing the is-even algorithm, and it would be a waste of time to re-write it for no purpose. You may think that it is a simple algorithm, but I doubt you could implement it better than the people who uploaded these files to npm. Notice that the algorithm has many obscure corner-cases that you are likely to miss on your first implementation. Fortunately, it is already written and packaged; just use it.


Note that this implementation of `isEven` throws on input that isn't a number or numbers that are not integers, NaN or larger than 2^53-1. I think it's fair to argue an implementation could just as well return false for 1.4, NaN or '2'. There is a cost to learning these minutiae of your dependencies, especially if there is no compiler support to assist you.


I've not programmed a lot of JS, so this might be obvious for someone who did, but:

Why does he first use Math.abs on the parameter and then type check the result of that? I'd think if you do an argument type check, you'd do it before using it. Just to make it not throw on null? I don't see the sense in that...


I'm not sure this is the reason for that specific implementation but `Math.abs` _will_ do a string-to-number conversion.

I.e., `Math.abs("-27")` yields `27`.

It might be the case that Math.abs does other *-to-number conversions also, so maybe this method is trying to take advantage of that. (Math.abs is usually implemented as a native function so without digging more deeply it's not obvious to me what the actual implementation does.)

UPDATE: Curiously, `Math.abs(null)` yields `0`, so a null argument passed to is-even would yield `true` I guess.

Also, whatever the logic for that conversion is, it is not the same as the built-in `parseInt` or `parseFloat`: `parseInt("27 meters")` yields `27`, but `Math.abs("27 meters")` yields `NaN`.
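The difference is easy to see side by side (same examples as above): `Math.abs` runs the whole value through number conversion, while `parseInt` parses a numeric prefix and stops.

```javascript
// ToNumber coercion via Math.abs vs. parseInt's prefix parsing:
console.log(Math.abs('-27'));           // 27: full-string numeric conversion
console.log(Math.abs(null));            // 0: number conversion of null is 0
console.log(parseInt('27 meters', 10)); // 27: parseInt stops at the first non-digit
console.log(Math.abs('27 meters'));     // NaN: the whole string must be numeric
```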

Personally, in JavaScript I almost universally use home-grown `isInt` and `toInt` methods whose behavior is predictable for me - e.g. my toInt() will reject a string that's not _just_ an integer value (and return `null` rather than `NaN` in all cases, because who wants to check `isNaN`?). That's probably not a great practice either, but I've been programming long enough to realize that the principle of least surprise is probably the most important concern here for long-term efficiency. I suppose what is or is not surprising may be subjective, though.
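A `toInt` matching that description might look like this (a hypothetical implementation of the behavior the commenter describes, not their actual code):

```javascript
// Hypothetical toInt: accepts only an actual integer, or a string
// containing nothing but an integer; returns null instead of NaN on failure.
function toInt(value) {
  if (typeof value === 'number') {
    return Number.isInteger(value) ? value : null;
  }
  if (typeof value === 'string' && /^[+-]?\d+$/.test(value.trim())) {
    return parseInt(value, 10);
  }
  return null; // null is simpler to check than NaN
}
```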


</sarcasm>

I think you're getting some down votes as you missed a closing tag.


But if you indicate sarcasm as such, it ceases to be so! The best sarcasm is one that leaves exactly half of the people believing it to be sarcasm, and the other half agreeing with it. Otherwise, it is just a farcical exaggeration (which is also funny, but for different reasons).


Man I hope they did (just forget the closing tag). Poe's law in action.


For the record, 120k weekly downloads isn't really _that_ large in this ecosystem.

That's 120k instances of people or scripts doing `npm install is-even` or the equivalent.

For comparison:

* Lodash - a general purpose set of library/utility functions by the way - has ~26M weekly downloads.

* request - an HTTP client library - has ~19M weekly downloads

I wouldn't add is-even as a direct dependency either, but I think people are underestimating the size and diversity of the JavaScript ecosystem as a whole.


> EDIT: my favourite part, is-even depends on is-odd.

Probably because it just does

    return !isOdd(x);
So it can take advantage of all the type checking and other corner-case junk in `is-odd`


It does seem like JavaScript/NPM has the advantages of composability found in Unix tools. It's no different from zcat | grep or similar. All that's different is the people making these tools have a wider range of experience and knowledge than people hacking on C making Unix tools, and easier distribution.


There’s an argument to be made that micro packages follow the ideas of the Unix philosophy, but to compare is-odd and a derivative that simply negates its result to piping two well-optimized coreutils together is definitely a stretch.

I’d also say that implying JS developers inherently have a wider range of experience and knowledge compared to developers of coreutils is flat out absurd.


>> "I’d also say that implying JS developers inherently have a wider range of experience and knowledge compared to developers of coreutils is flat out absurd."

I was way too ambiguous, so I see how you understood it that way. I meant in the downward direction. There are more packages from more people in JS, so you end up with lots of first projects, small experiments, etc. So you have 100% of coreutils being decent to amazing, while it's probably closer to 1% with NPM. Both ways have their advantages, but it's easier to footgun with the highly inclusive NPM approach.


> but are frequently used.

From the short dependents lists, it looks like neither is frequently used directly. Edits to one or two dependent projects would likely be enough to largely eliminate them from the ecosystem.


I opened the index.js files of both repos and their copyright comment made me chuckle.


JavaScript really needs a proper standard library. Has any sort of central authority ever tried to drive this forward? Maybe it should fall upon the ECMA people to get this done, seeing as they’re the ones that define the ESx specs? I don’t think npm/node is the answer either, this needs putting together from scratch without being too bloated, much like the C or C++ stdlibs.


> JavaScript really needs a proper standard library.

Well, Deno (https://en.wikipedia.org/wiki/Deno_(software)) has that as one of its explicit goals. As well as fixing a lot of the other grown idiosyncrasies.


I like this a lot. I would say that going with one of the existing standard libraries is a start - lodash is a good example.

Another thing: anyone can roll most of these tools into standard libraries if the source license allows. So maybe some "rollup" packages are in order. You would get more than you need, but if you set up any sort of tree shaking (which has been the focus of many build systems lately), then you only bundle what you use in the end. One other thing that may be an issue: if you have two major libraries and each one is missing one function that you need - then you need both...
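Such a "rollup" package could be a single module of small utilities (hypothetical names below); written as an ES module, a bundler's tree shaking keeps only what you import:

```javascript
// Sketch of a combined utility module. As an ES module you'd write
//   export { isOdd, isEven, leftPad };
// and `import { isEven } from './utils.js'` lets the bundler drop the rest.
function isOdd(n) { return Math.abs(n % 2) === 1; }
function isEven(n) { return !isOdd(n); }
function leftPad(s, len, ch) { return String(s).padStart(len, ch || ' '); }
```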


The point of having a standard library is that it'd actually ship with the interpreter, so you wouldn't have to "get more than you need" - it'd already be there without having to download. Obviously there would be a period of time (years) whereby you would have to do a conditional download, but the longer term aim would be that it's already provided as part of the base language.


Maybe lodash or "just" could be the std lib.

Just doesn't have any further dependencies.

https://github.com/angus-c/just


My vote is for lodash, if only to avoid the endless Abbott-&-Costello-style discussion thread confusion :P


I’m working on a better “standard” (or “base”) library (https://github.com/xixixao/jfl), but I’m not sure it’ll be as big as C++ stdlib.

Then again I don’t think even a library the size of std would change this picture much. Most of the dependencies in npm are too specific I think.


The only way for this deluge to stop is for big packages to adopt an ‘as few deps as possible’ attitude. React clearly did so with only 22 deps; it’s time for others like webpack to do the same.


That, or for the JavaScript community to get their shit together and start investing in a modular standard library akin to C++'s Boost to bundle basic functionality.


Smells like jQuery.


> Smells like jQuery.

It does, if all you care about is mainly DOM stuff. That use case is off the table once you adopt any framework such as React, Vue, Angular, etc.


Most of the "utility" deps you see in the JS ecosystem are fragmented by opinions.

So basically, if I don't agree with your way of doing things, easy: I will just create a new module and publish it.

Things get really messy when different levels of the module hierarchy use different modules.

Having basic functionality built in may fix it in the long term, but it will bring a huge rift in the JS community.


Shouldn’t TypeScript be able to do this without caring about JS?

If TypeScript had a reasonable standard library (at least the size of the Java/.NET libraries) and these were tree-shaken down when compiling, wouldn’t that go a long way?


It’s coming from tools or libraries trying to implement too many things in one, or from developers loading a huge library when they just need a tiny feature. This happens in many languages.


How many of the 13,000 dependencies are duplicates in different versions? Would it help if npm didn’t allow more than one version of a dependency? If I understand correctly, you currently aren’t even warned if you have transitive dependencies on two versions of a library?


I was a javascript enthusiast till a short while ago.

But now I strongly encourage every developer to look at .net core and asp.net for web development.

The entry has a slightly steep learning curve, but it's well worth it. Especially if you are a lone developer deploying to a VM in the cloud, the cost savings due to the enhanced throughput alone are substantial.

Creating a web api is a breeze.

Standard websites with a few pages and some forms can take advantage of razor pages.

And Blazor is an awesome alternative to React/Angular/Vue etc.

Edit: Also, the full tech stack, including developer tools, is open source, with proper licensing. And the open source dev tools experience is also very, very good. VS Code is an awesome IDE.


Thank you for writing this. I saw how well ASP.NET Core performed in the 2020 Stack Overflow Developer Survey and I was curious to hear from people who have had good experiences with it.

[0] https://insights.stackoverflow.com/survey/2020#technology-mo...


The author left out another option. Run away from JS ecosystem when you can. Unfortunately, it's not possible for client-side code on the web.


We're at peak JS, and WASM hasn't managed to get its JS integration done to any acceptable degree. Where do we go from here? Simply avoiding the web isn't viable at this stage.


We used to have more options. There was Flash and Java. Flash is insecure, and Java too (it's basically running code that can do anything a native executable can do). Java also has the competing-JVM problems, backwards compat... But instead of getting our hands dirty and fixing the issues, the powers that be decided to throw away all the hard work from the past and move to WASM, which is incomplete, barely supported, and will have the same security issues Flash and Java had once it gets broader adoption. And so the circle continues.


Java had sandboxing from the beginning. There were some bugs that allowed client code to escape the sandbox, but it wasn't nearly as bad as some people made it out to be. I'd dare to guess that Javascript has had more sandbox-escaping bugs than Java has had.


Look at intercoolerjs[1] (or its successor, HTMX[2]), or turbolinks[3]. They make custom front-end code nearly non-existent, and the rest effortless. Just include them in your source and start using HTML attributes.

[1] - https://intercoolerjs.org

[2] - https://htmx.org/

[3] - https://github.com/turbolinks/turbolinks


HTMX looks pretty cool. Thanks for the tip. I've been enjoying AlpineJS[1]

[1] https://github.com/alpinejs/alpine/


Something that may not be obvious: JavaScript absolutely, can not under any circumstances, break backwards compatibility. If JavaScript breaks backwards compatibility, it literally breaks the web.

Most of the complexity we're seeing in projects is around that constraint applied to 'modern' JavaScript - we have to translate all of our modern code into older code that older browsers can still understand. JavaScript can't just forget the past right now - there might be a point in the future where it's given the IE6 treatment: "if you're using a browser that can't support a standard that is 10+ years old, tough, you need to upgrade." But we have to all move there, together, as an industry. It took a long time for us to dump IE6 in that way, it'll take a while before we can dump ES5.

This is exacerbated by the fact that there is no standard library in JavaScript. Check out some of webpack's dependencies: https://github.com/webpack/webpack/blob/master/package.json#.... A lot of these are replicating basic functionality a standard library should provide.

But we can't just blame this on environmental factors - a lot of this feels very much self-inflicted. For example: Node provides a file system utility (fs), but webpack is using a different one. Maybe the fs utility can't solve all of their problems and they need an extended one. Instead of just using the standard file system module and dealing with its quirks, we write a new, slightly better one and use that instead. Locally, it might make sense for webpack to do that, but when you do it in aggregate across the industry, you end up with the death-by-a-thousand-cuts situation we find ourselves in.


To me it is preferable to write JavaScript like it's 10 years ago (using slightly more rudimentary functionality) than to deal with transpiler hell just to write shiny ES6 code. Old tech gets the job done and, if you are aiming for stability, is probably a better choice. I know it isn't exactly a fair comparison, but C developers don't code everything in C18 and then transpile to C99 for maximum platform interoperability (I hope). We usually just use the older standard until the shiny new one is supported on the platforms we care about. I may know of one or two legacy C89 code bases that still exist and have not caused the sky to fall yet.


Can you imagine trying to find 22 new packages for your Java application?


> there’s not much you can do except be aware of what you add to your project ... try to avoid nested dependencies when possible

how? cultural expectations around transitive dep size?

I'm certainly making decisions about graph complexity when designing software in house (avoid circular import deps in code, be careful about call depth in service graph)

But I've given up doing this for 3rd party deps in JS, except maybe at the point where `npm install` starts to bog down and I'm like 'this one isn't for me'


Why is javascript (seemingly) so much more prone to dependency hell?

Is it just the standard library falling short?


- Standard lib falling short

- No control over the execution env (the user's browser), which means we need polyfills, transpilers, abstractions

- The web is everywhere => JS is needed for many, many things => lots of libraries for all sorts of stuff => common problems extracted into small libs (which has good and bad aspects)

- The last point is exacerbated by JS being both a frontend and a backend language

- Developers tend to be more junior on average (not sure that's actually true; it's believable, but I've never seen numbers)

- NPM made it painless to add dependencies, so that's what happens


A bit of it also stems from the fact that NPM installs dependencies as trees. I.e., if you and two of your dependencies use is-odd, you will have three copies of it lying around, possibly in different versions.


And now I'm stuck wondering why anyone ever would need to release a new version of isOdd. Apparently isOdd had 6 releases. Does that mean that the previous 5 times they had "calculating if a number is odd" wrong?


Looking at the commit history, they've mostly added checks for edge cases: https://github.com/jonschlinkert/is-odd/commits/master

Which is, to be fair, exactly the reason you'd use a library like this.


> Which is, to be fair, exactly the reason you'd use a library like this.

There is a crucial nuance here:

Which is, to be fair, exactly the reason you'd use a library.

With which I mean to emphasize that -indeed- libraries are crucial. But nano-libraries, like nano-services are an antipattern.

This could be a stdlib (ideally), a bundle (like Lodash), or even a numberUtils. Having a gazillion packages that have a ridiculous boilerplate-to-code ratio and contain just one (well evolved!) function is doing more harm than good.

A library "like this" is harmful. If only because it gives people an argument, or just the idea that libraries are stupid and should be avoided.


Another way to word this is that the Javascript ecosystem is really good at reusing modular code.


This is one of the big reasons I like TypeScript: it doesn’t have any dependencies. Sure, it’s a big package, but very self-contained.

Don’t use Babel if you don’t want a gazillion dependencies.

The JavaScript ecosystem is vast, with many tools and trade offs.


The problem is really not as big as most commenters here make it out to be.

With most modern bundlers, bundle size is rarely a problem anymore. `npm audit` helps detect and fix security vulnerabilities in dependencies. And a build process breaking because of dependencies is rather a sign of a bad CI pipeline.

I am not saying it is great to have many dependencies, but it's really not THAT bad.


The problem is really not the size or the number of files (both are pretty manageable by modern hardware), but that you execute mostly unverified code by several thousand people. It only takes one of them to go rogue to have your data siphoned, your drive encrypted... you name it.

Sure, big projects (hopefully!) watch their dependencies, but you auditing 13k packages is simply impossible. And, given how much these numbers explode, it unfortunately seems like people simply add a dependency without doing their due diligence.


This is what `npm audit` is for. It's not perfect obviously, but it should prevent the worst.

Besides, my guess is that the vast majority of developers do not verify their dependencies in the first place, no matter if the number is 5 or 5000.


Detecting and fixing problems after your app has been deployed is NOWHERE NEAR as good as avoiding problems. Not depending on thousands of packages from thousands of strangers is about avoiding problems.


Obviously the measures are meant to be run before you deploy, not after. That's what a CI pipeline does.


Auditing detects issues in published packages. You can't know ahead of time in your CI that an issue is going to be found in a package.


What is a good CI pipeline meant to do when dependencies aren't able to be pulled anymore, caching aside?


Almost 100% of dependencies are hosted on npm. So the CI should fail and you wait for npm to fix the issue.

In the unlikely scenario that npm is gone forever, you can still get the dependencies from a previous build, a random developer's laptop, or GitHub.

If you must build and deploy a version of your software while npm is down, which is unlikely but may happen, well you may have to skip the CI and build from a developer laptop.


Just thinking about npm makes me feel really ill.


I don’t believe “13,000 dependencies” is counting the right thing at all. In actual fact, I think the number is only 756 or 691, depending on how you count it.

If you look at the eventual package-lock.json and filter it to just lines containing “resolved”, it’s only 756 lines, because it does plenty of deduplication. I don’t have the time to waste on installing it all myself to check, but I think it fails to deduplicate some that could theoretically be deduplicated, because of incompatible versions: I think that if you have dependencies ax@1, bx@2, cx@1 and dx@2, it’ll pick one of those versions of x (no idea how it chooses—first it encounters, perhaps?) to sit at the top-level node_modules, and any packages that need another version will install their own version, so that in this situation you might get node_modules/x (version 2, used by b and d), node_modules/a/node_modules/x (version 1) and node_modules/c/node_modules/x (version 1, again). I say this based upon very vague recollections of things I read and interacted with years ago, and the structure of package-lock.json; I may be wrong in the details of it.

This way of having multiple copies of the same version of the package is the difference between 756 and 691—there are 65 exact duplicates. For example, you get debug-2.6.9 at the top level, and then within other dependencies, you get three copies of debug-3.2.6, and five of debug-4.1.1. That’s just one example. There are eight copies of four different versions of kind-of. After excluding these exact duplicates, there are then another 67 cases of multiple versions of the same package being installed (kind-of’s four versions is the most).
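The copies-vs-distinct-versions counting can be sketched against a package-lock-style tree (the data below is synthetic, for illustration; a real package-lock.json nests the same way under its "dependencies" keys, and `countCopies` is a hypothetical helper):

```javascript
// Count installed copies vs. distinct name@version pairs in a
// package-lock-style dependency tree.
function countCopies(deps, counts = new Map()) {
  for (const [name, info] of Object.entries(deps || {})) {
    const key = `${name}@${info.version}`;
    counts.set(key, (counts.get(key) || 0) + 1);
    countCopies(info.dependencies, counts); // nested duplicates live here
  }
  return counts;
}

const lock = {
  a: { version: '1.0.0', dependencies: { 'kind-of': { version: '3.2.2' } } },
  b: { version: '2.0.0', dependencies: { 'kind-of': { version: '3.2.2' } } },
  'kind-of': { version: '6.0.3' },
};

const counts = countCopies(lock);
const totalCopies = [...counts.values()].reduce((s, n) => s + n, 0); // 5 copies
const distinct = counts.size; // 4 distinct name@version pairs
```

The gap between `totalCopies` and `distinct` is the exact-duplicate count the parent comment measures (65 in the article's lock file).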

A few days ago I looked at a case that was double the size of all this: https://news.ycombinator.com/item?id=23488713#23490055.

When you get duplicates with incompatible versions like this, it strongly implies unmaintained, or occasionally incorrectly maintained, software. If they all got their act together and simply updated to the latest version of all their dependencies, the number of packages you’d install would not exceed 624.

Look, it’s still a lot, and I scorn many of them as unnecessary and oft counterproductive frivolities, and there’s way too much overlap in many of them; but 13,000 is just a shock number that doesn’t represent what people expect it to represent, or match what they’re concerned about.

(Also this number doesn’t mean you’re taking code from over six hundred sources; some things are just split up into multiple packages because they genuinely are separate concerns; for example, there are 93 packages named @babel/*, indicating first-party code from Babel.)


The weekly JavaScript/NPM bashing thread is here. I've started filling in my bingo card.


The fact that this problem is discussed weekly should be enough to raise a red flag.


It's very repetitive and such a minor problem isn't enough to justify a red flag in my humble opinion.


It's a minor problem until it becomes a major one. All it takes is one common dependency to go rogue or a bug to be exploited without the maintainers being around to fix it and a large part of the ecosystem becomes unusable. Using a dependency implies an element of trust and now we have a huge web of trust between thousands of maintainers that we really have no way to check. The larger that number grows, the higher the chance of catastrophic failure.


Indeed, you can't trust that many people or check all the code of that many dependencies. It's a chain of trust, but how is it different from other parts of your stack?

Do you trust every employee working on the Intel CPU microcode? Every maintainer of the Linux kernel? The people who maintain the glibc? The developers of V8 and nodejs? Do you do the same for your database? Your cloud provider? The codebase of your business partners?

I would guess you don't, despite most of what I cited being highly critical in terms of security, and some being written in memory unsafe programming languages with tons of critical issues all the time.

If an NPM dependency goes rogue, you will get notified. It will also make the front page of HN and be mentioned in comments on JavaScript news for years.

But what will most likely happen: a maintainer will fix the issue. If the maintainer is in jail because they killed someone, well, someone else can still maintain it, or someone else will fork it and other maintainers will use the fork.

In practice, when an NPM dependency of your project has a security issue, all you have to do is accept a pull request from a robot on GitHub.

I know it's not perfect, but it's really not that bad.


You bring up very good points. I certainly don't trust most if any of the other critical pieces of the stack. I think those are attack vectors or points of failure as well and that risk needs to be mitigated. But on the other hand, the NPM package web is much larger and the barrier to entry is much lower so I would consider it to be much higher risk than the rest of the stack.

We as an industry need to put work into reviewing, simplifying, and increasing visibility at all levels of the stack, especially firmware. We're building high and fast and while standing on the shoulders of giants is a great place to be, we need to make sure the giant is more than just a house of cards.


The Linux kernel, glibc, V8 and nodejs are some of the most vetted software in existence. Of course I trust them. If my business partner has a security breach, it's possible to sue them.

That is different from adding 1000 barely-looked-at dependencies to my JavaScript project. Every addition is another chance for an undetected security vulnerability. "It's really not that bad" is probably what Equifax thought before the magnitude of what happened was revealed.



>> and such a minor problem

It's not a minor problem. It's a huge structural and operational problem that would render the whole stack unsalvageable, if not for all the infrastructure and legacy.

And the mere fact that you need half a dozen JavaScript bundlers and plugins and polyfills to rein the stack into the realm of salvageability should speak for itself.


As someone with JS disabled by default, I am enjoying these. Every article justifies my decision; JavaScript is the cancer of the current web.


Why? The number of npm dependencies has little to do with why you disable JavaScript by default.

Do you really care if some website uses a dependency from npm or some copy-pasted code? What does it change for you as a user?


Tons of websites have trash performance with JavaScript enabled, and it certainly isn't just ad bloat or tracking, although that contributes a lot.

E.g. web players are a nice example: on a couple of sites I was able to sort out a native HTML5 player in about 40 lines of JavaScript, while they use 100+ kB libraries for whatever reason.

And I am not even talking about HLS streaming or anything like that here, just pulling an mp4 file.
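For a plain mp4, a minimal version of such a "player" is roughly just the browser's native video element plus a few lines of script (a sketch; the file name and wiring are placeholders, not any particular site's code):

```html
<!-- Native HTML5 player, no library: the browser supplies the controls. -->
<video id="player" src="movie.mp4" controls></video>
<script>
  const video = document.getElementById('player');
  // Example of wiring a custom control yourself: click anywhere to toggle.
  video.addEventListener('click', () => {
    video.paused ? video.play() : video.pause();
  });
</script>
```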


It's another thing that may be an issue, but is it? My smartphone runs these bloated players just fine.


The video may be running fine, but the animations are slow and terribly laggy, the input lag when I try to pause is awful, and the messing with right-click options annoys me.



