The overhead is in managing your dependencies. The size of each module isn't the problem; it's that you end up using so many of them (especially transitively).

Consider this specific case. This author moved all their modules from one hosted location to another. Now, if you want to use this author's modules, you need to update the scripts and configs that install them (some package.json files in this case). In a better world, like the C or Python world, you might only need to update one or two URLs pointing to a couple of this author's popular libraries (maybe one you use directly, and one used by one of your handful of direct dependencies).
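
To make that concrete, here's what such a pinned dependency might look like in a package.json; the author, library, and host names below are made up for illustration:

    {
      "dependencies": {
        "some-lib": "git+https://old-host.example/some-author/some-lib.git#1.0.2"
      }
    }

Every file like this that points at the old host has to be found and edited by hand when the modules move.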

In this crazy npm world, this author has 272 modules. Maybe 20 are in widespread use ... it's already a lot of work to figure that out. Maybe you use a couple directly, and your dependencies have their own nested sub-dependencies on additional copies or versions of these or other modules by the same author! Maybe you have to cut your own versions of some of your dependencies just to change their package.json to point at the new URLs! In any case, you probably have to sift through hundreds of your dependencies and sub-dependencies to see whether any of them are among these 272 moved modules.
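
For what it's worth, later tooling added ways to redirect a nested dependency without forking the packages that depend on it: Yarn has a "resolutions" field and npm 8.3+ has "overrides". A sketch with made-up names, assuming the resolver accepts a full dependency spec (such as a git URL) as the value:

    {
      "overrides": {
        "some-lib": "git+https://new-host.example/some-author/some-lib.git#1.0.2"
      }
    }

That forces every copy of some-lib in the tree to resolve to the new location, wherever it appears.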

I've seen npm dependency trees with over 2000 modules (not all unique, of course). That's totally unmanageable. I think that's why privately versioned sub-dependencies are such a big feature in nodejs: they let you try to ignore the problem of an unmanageable dependency tree. But if you need to build reliable software, at some point you have to manage your dependencies.
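
If you want to see how deep the hole goes in your own project, npm can at least dump the whole tree; a rough way to size it (the --all flag is npm 7+, older versions use --depth=Infinity):

    # print the full dependency tree, nested duplicates and all
    npm ls --all

    # rough count of installed packages, duplicates included
    npm ls --all --parseable | wc -l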




I agree that NPM needs to push namespacing much harder, as that would make the whole process much easier.
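
npm does already have scoped package names, which at least gives each author a collision-free namespace; a hypothetical example:

    {
      "dependencies": {
        "@some-author/some-lib": "^1.0.0"
      }
    }

The catch is that most existing packages live in the unscoped global namespace, so pushing scopes harder would mean migrating them.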

Also a "provides" field could go a long way into stopping issues like this. Allow packages to say that they provide a package in them that is compatible with another in these version ranges.

That would let "API compatible" packages be dropped in to replace even deeply nested packages easily, and would allow easy "bundling" in big libraries while still allowing easy creation and access to "micro libs".

I really believe that composing tons of small libraries is the way to go, but there needs to be better tooling to make it work. In my (admittedly not extremely experienced) opinion, bundling many small libs into one big package to make them manageable is a symptom of the problem, not a solution to it.


This will become easier with rollup, webpack@2, and so on, which can effectively reduce the penalty of including large modules like lodash by tree-shaking out all of the parts you don't use. I would expect many more utility libraries to then be aggregated into a single package/repo and for users to simply pick and choose functions at will.
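
For example, with a tree-shaking bundler and the ES-module build of lodash (the lodash-es package), you can import one function from the big utility package and ship only that function:

    // with a tree-shaking bundler, only isEqual and the helpers
    // it needs end up in the final bundle
    import { isEqual } from 'lodash-es';

    console.log(isEqual([1, 2], [1, 2]));  // true

The per-method path style (require('lodash/isEqual')) gets you most of the same win even without a tree-shaking bundler.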



