
Worrying about the NPM Ecosystem - diiq
https://sambleckley.com/writing/npm.html
======
segphault
The npm ecosystem needs something like the distinction between Ubuntu's "main"
and "universe" repositories, so that you have a smaller subset of known-good
packages with harmonized transitive dependencies, stronger centralized
curation, a lot more direct scrutiny, tighter control over versioning, and
some party that is responsible for addressing vulnerabilities and ensuring
appropriate maintainership.

As a venture-backed startup, the company behind npm needed the number of
packages and the number of downloads to constantly trend upward in order to
justify their valuation. This led to extremely poor policy and prevented them
from taking strong steps to remedy the deterioration of the ecosystem. Now
that it's owned by Microsoft, there's an opportunity to fix this.

Linux distributions have decades of experience solving these problems, there's
no excuse for the JavaScript community to continue ignoring the longstanding
precedents and best practices that have emerged from that experience.

~~~
mayank
This is a really good idea, and a fitting analogy. NPM already supports
private registries, so it would be a simple configuration change to point to
"main". On top of that, there is a lot of good work in the node community
around static analysis, CVE detection in transitive dependencies, and more
finely grained security perimeters, which could be used to detect possible
backdoors or malicious code.
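As a sketch of that configuration change, a project could opt into a hypothetical curated registry with a one-line `.npmrc` (the registry URL here is invented purely for illustration):

```ini
; .npmrc: point this project at a curated "main" registry
; (the URL is hypothetical)
registry=https://registry.main.example.org/
```

Everything else about the npm workflow stays the same; only the source of packages changes.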

~~~
jessaustin
It would be completely straightforward to do the work you describe. Presumably
numerous private actors have already done it for themselves. Until someone
does this work and shares it with the public, we'll all have to wonder how
valuable it would really be... anyway, it's unreasonable to expect the npm
people to do this work on top of everything else they do.

------
spankalee
This post describes several potential "problems" with NPM packages, but not
_why_ they are problems in the first place. The author doesn't say why they
consider circular dependencies bad, or why dependency depth should be low, or
even why their categorization of packages is the way it is (and leaves out
large categories like tools and CLIs).

I don't think that a high number of dependencies is necessarily bad, or a low
number is necessarily good. A dependency can often have factored out common
functionality into a shared and higher-quality project than the same bespoke
implementation re-implemented in several packages.

I think we'd need some more complicated metrics, possibly looking a bit like a
web of trust analysis, to determine quality and health of packages. Depending
on a well-maintained, popular, and highly-trusted package like lodash or
express shouldn't be an indicator of low-quality or low-trust. We also have
deprecation and security notices provided by NPM.

So I'd be much more interested in questions like "how many unique, low-trust
packages are included in this package graph?" or "how out of date are the
dependencies?" to try to estimate package health. From there I'd be curious to
know how healthy the ecosystem is in aggregate, weighted for usage.
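One cheap approximation of "how out of date are the dependencies?" is to score the JSON that `npm outdated --json` already emits. The `current` and `latest` fields assumed below are part of npm's documented output; the scoring itself is just a sketch:

```javascript
// Given the parsed output of `npm outdated --json`, report how many
// direct dependencies are behind their latest published version.
function staleness(outdated) {
  const entries = Object.entries(outdated);
  const behind = entries.filter(([, info]) => info.current !== info.latest);
  return {
    total: entries.length,
    behind: behind.length,
    names: behind.map(([name]) => name),
  };
}
```

A real health metric would also weight how far behind each package is, but even this count makes "six months out of date" visible at a glance.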

~~~
diiq
You're absolutely right that I am doing a certain amount of "searching under
the spotlight", here. The metrics I measured were the ones I had the
information and computational power to measure. You want to see how many "low
trust" packages there are -- me too! But what does that mean? How do I find
them? That is, in essence, the _problem_ I'm describing. If I could solve it,
believe me, the post would have been about that, instead :)

I do propose, towards the end, a network-of-trust score -- but I don't think
that's something a single person can implement today; it requires social work.

(I am sorry you feel I didn't explain why deep dependency trees are a problem
-- my premise is based on the desire to assess packages for if they are
appropriate to include in a project, and I _do_ mention why deep and broad
trees make that harder. Circular dependencies I specifically say _aren't_ a
problem; they're just disconcerting to me.)

~~~
chrisweekly
+1 for "searching under the spotlight", such a great (and IMHO underused) turn
of phrase

------
DLA
One of the author’s points is the thing that makes me crazy about the JS
ecosystem “... find a package ... it might depend on hundreds of other
packages, with dependency trees stretching ten or more levels deep...”

I cannot count the number of times when I’ve installed a package and got >500
or >1000 packages as dependencies.

Just today I installed ‘alptail’, a simple set of components using AlpineJS and
TailwindCSS (I love both!), but it npm-installed 1314 packages! For a simple
set of maybe a dozen components.

This is madness.

How do we make this better?

Maybe we need more support from tooling to understand just what is being used
down the tree of trees of trees?
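Some of that tooling half-exists already: `npm ls --all` prints the whole tree, and the lockfile records every installed package. As a sketch of the "how many things did I actually inherit?" question, assuming a v2/v3 `package-lock.json` whose `packages` map keys entries by their `node_modules` path:

```javascript
// Count the installed packages recorded in a package-lock.json (v2/v3).
// Every non-root entry in "packages" is keyed by its node_modules path.
function countInstalled(lock) {
  return Object.keys(lock.packages || {}).filter((key) =>
    key.startsWith("node_modules/")
  ).length;
}
```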

Maybe as lib writers we should seek to implement our own supporting functions
(within limits, of course) rather than having a first reaction “let’s go grab
XYZ for that” and inherit XYZ’s entire dep tree.

Not sure what the answer is but what we have today is a beautiful mess
(love/hate it).

Thanks for letting me vent a bit. I feel better. But this really bothers me.

~~~
stevebmark
I just wonder who cares? At all? What is "better"? Reinventing wheels instead
of relying on open source, battle tested code?

Also alptail isn't an npm package?
[https://github.com/danieljpalmer/alptail](https://github.com/danieljpalmer/alptail)

~~~
marcus_holmes
This "reinventing the wheel" concept has to die.

Writing your own code is not "reinventing the wheel". It's what we do. A lot
of times writing a new implementation is faster in the long run (more focused,
less obscure, easier to maintain).

Open source code is often not "battle tested" in your environment for your use
case. And if there is a problem, getting it fixed can be a nightmare.

It's like I'm making a go-cart, and making some nice appropriately-sized
matching wheels for it, and someone comes along insisting I don't "re-invent
the wheel" and use these four mismatched truck wheels instead. Yes, I'll save
time making the wheels. But the rest of the project is screwed.

~~~
DLA
> Writing your own code is not "reinventing the wheel". It's what we do. A lot
> of times writing a new implementation is faster in the long run (more
> focused, less obscure, easier to maintain).

So much agreement. We often will rewrite smaller sets of functions or other code
for the express purposes of trimming dependencies, understanding what the code
does, and having the opportunity to adopt it into our codebase with a fresh
re-look at implementation, adding comments, etc.

> Open source code is often not "battle tested" in your environment for your
> use case. And if there is a problem, getting it fixed can be a nightmare.

Again. Agree with you strongly. Just because something has been available in
npm does not automatically mean battle tested. There's a ton of broken/poor
code out there.

------
ashtonkem
The chance to fix npm is dwindling quickly. With each year, a culture sets in
that will get harder and harder to fix with time.

Personally I suspect that the leftpad debacle might have been the point of no
return for npm; the fact that such a simple package managed to break
everything should have been a wake-up call, but it does not appear that the
community fixed much.

~~~
harikb
What could happen with a malicious dependency, added knowingly by the
original author or unknowingly because their account is compromised, will one
day make leftpad look like a harmless warning, a near miss we ignored.

Sure, leftpad broke productivity for one day. But the continued willful
ignorance we show towards "yet another dependency" will kill us one day.

There is work happening in creating notary etc. Github/Microsoft is in a very
good position to solve this. I hope they take the opportunity.

~~~
quantummkv
The sad thing is that it already happened 2 years ago [1]. And nothing changed
apart from some general apathy on the interwebs.

The JavaScript ecosystem is past the point of no return right now. Any
improvements will only be band-aids.

[1] -
[https://www.trendmicro.com/vinfo/dk/security/news/cybercrime-and-digital-threats/hacker-infects-node-js-package-to-steal-from-bitcoin-wallets](https://www.trendmicro.com/vinfo/dk/security/news/cybercrime-and-digital-threats/hacker-infects-node-js-package-to-steal-from-bitcoin-wallets)

------
fenchurchh
My non-technical thoughts on the npm ecosystem:

1\. Split package-lock into a human readable file and a machine readable one.
Right now it's just pain.

2\. TRUSTED PACKAGES. Somebody has to assign trust. Companies or highly
involved developers should become curators.

3\. LIMITED Packages with reduced security risk: File Access, Network Access,
package access, OS access, etc...

4\. Transition npm package best practices into a single module format. Reduce
clutter and the number of files in node_modules.

5\. Better tagging. Allow votes on tags.

6\. Add monetization links for package authors to npm (blunt) and to the CLI
(subtle, e.g. 'npm pay leftpad' opens the monetization page on npm).

Additional ideas:

A: Buyable certificates/insurance for packages.

B: Better clone, modify & re-publish workflow

C: Integrate FAQ/Forum directly in NPM directory website. Simple
downvotes/upvotes for visibility. Allow and ask for readme pull requests from
npm package page.

~~~
chocolatkey
If packages are "trusted", how would NPM ensure that they are continually
trusted as new versions come out? I've read those stories with packages whose
developers hand them over for maintenance, and then a new update comes out
with malicious code

~~~
erichurkman
M of N signatures on new version tags, or on ranges of commits?

------
nisten
Hate to break it to the author, but npm has been a raging dumpster-fire for
over 4 years now. My personal advice on how to live with it:

\- Assume you will have to rewrite your app in a year or less from now and
organize it as such. This won't be easy when dealing with release deadlines
which result in all-nighters full of spaghetti code. Whether you're doing
react, vue or angular, just keep your components and logic small and if you
need a package, (say for handling CSVs) then look for one that promotes itself
as having no dependencies. Keep in mind that while re-writes can be a taboo
subject to many managers, they can actually be the fastest way to solve deep
dependency issues.

\- Unless your framework requires it, try not to lock your package versions,
this is a very common bad practice in my opinion. ```npm audit``` is your new
best friend, make updating to latest part of every sprint. If you're just 3-4
months out of date these days you're likely to have over 40 security
vulnerabilities. If you're 6 months out of date on a public facing application
and need to pass a security audit, you're not gonna pass it. Compliance
requirements, like SOC2 or healthcare ones are getting a lot more thorough as
well. It's a lot less work to update continuously.

\- Secure your devops. Any npm package that runs has full access to your entire disk and network I/O. The cheapest way to harden security is to move your devops to AWS or GCP. They can afford to audit their networks and OSes a lot more often than you can. If you're doing it alone or on a very small team, I'd recommend trying out aws-amplify or firebase hosting on GCP.

\- If you're doing node on the backend, go full typescript. I think it's easier to adopt a "strongly typed" mentality when your logic is clearly defined. If you don't feel confident in your use of typescript, then don't use it in the front end; you're gonna have a bad time, especially with state management in React. However, it is getting more mature and package maintainers are learning to avoid type issues too when updating their methods.

\- And last, when picking a new framework or boilerplate or package, run npm
audit on it first then make sure to read up on their github issues as much as
you possibly can. It's very important to get not just a technical feel but a
social one too on how their devs react to their issues.

~~~
lioeters
Aside from the "raging dumpster-fire" (I personally have benefitted greatly
from NPM, and many of its problems are likely common in other package
ecosystems/managers), I do find myself nodding in agreement with all your
points.

"Assume churn as inevitable and be organized for it" is a good mindset to stay
sane on the leading edge, especially in the world wild web. The Jenga tower of
dependencies will move around and break things every so often, changing
interfaces, sometimes for seemingly no reason other than preference. Major
shifts in foundation, architecture, patterns, standard libraries, or
organizational concepts are a regular occurrence.

------
furstenheim
A lot of the results of this analysis are a consequence of actually good UX in
dependency management.

In NPM publishing a library is really easy. To compare with java, I recently
had to follow a [three part medium
article](https://proandroiddev.com/publishing-a-maven-artifact-3-3-step-by-step-instructions-to-mavencentral-publishing-bd661081645d)
to publish. NPM: one login, one cli command and you're done.
That lowers the bar to creating a library: there are a lot of low-quality
libraries, but there's also much more diversity and some really cool libraries.
In java I've seen a lot of copy pasting. In Node you just create a library. You
can worry a lot about 1k dependencies, but a lot of those will be one-liners.

In node, it's basically impossible to have conflicts with dependencies. Again
if we compare to Java. In Java namespaces are global, if there are two
libraries sharing the same package name they will collide. That makes deep
dependency trees actually a liability. I've seen recommendations of not using
guava if you are writing a library, because they change the api so much that
it will most certainly give problems if your user wants to use it as well, so
much for a library. In Node, in contrast, if there are two different libraries
that require different versions, they will each have their own dependency
version. That has the cost of space, but it saves hundreds of hours of
conflict resolution. You can even have two different versions of the same
library under different names, in case you need to partially migrate a project.
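That partial-migration trick is supported directly by npm's alias syntax (available since npm 6.9; the alias name `lodash3` and the version ranges here are just examples):

```json
{
  "dependencies": {
    "lodash": "^4.17.21",
    "lodash3": "npm:lodash@^3.10.1"
  }
}
```

After which `require('lodash3')` resolves to the old major version, side by side with the new one.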

Making it easy to create and consume libraries has of course a clear
consequence, more deep and big dependency trees.

------
tgbugs
The solution mentioned at the start of the article is what distribution
package maintainers have been doing for decades. I would take this account as
a giant flashing warning sign of what could happen to the flatpak/snap
ecosystem, and also a flashing warning sign for languages that are developing
their own package manager, especially if it is difficult for packages from the
language package manager to be integrated into actual system level package
managers. The curation effort required to get NPM into manageable shape is
almost certainly beyond the scope of any traditional community effort.

~~~
mratsim
Not having a package manager for your language means you're back to CMake and
the "FindBoost", "FindCuda", "FindMKL" scripts that always fail for one
reason or the other.

It encourages reinventing the wheel instead of standing on the shoulder of
giants. For smaller languages having everyone implement a JSON parsing library
because there is no dependency management would be a significant hurdle that
might kill off the language before it's even started.

System package managers are a pain to deal with as a developer. When adding CI
to my projects I'd rather deal with the language package manager that works
on Windows, MacOS, and Linux rather than brew + msys2/pacman + apt-get, and some
C/C++ dependencies are just unavailable (looking at you, recent BLAS/Lapack on
Windows).

It also significantly streamlines the documentation. Any step you add to your
git clone or foo install is an extra hurdle for package adoption.

Lastly, the language community's needs are often at odds with the system
administrator's needs. You can't expect the distro package manager to deal with
every single package in the R community, for example, each of which lets you
analyze data in a slightly different but niche way. This was exemplified by the
recent Haskell package management in Archlinux rant (link dead but was here:
[https://dixonary.co.uk/blog/haskell/cabal-2020](https://dixonary.co.uk/blog/haskell/cabal-2020))

Now I'm not saying that system package managers are bad, I go out of my way to
integrate all the Data Science libraries I use in my system:
[https://github.com/mratsim/arch-Data-Science](https://github.com/mratsim/arch-Data-Science),
but that's because here I'm wearing my sysadmin/end-user/package-consumer hat.

~~~
dane-pgp
> Not having a package manager for your language ... encourages reinventing
> the wheel instead of standing on the shoulder of giants.

Isn't writing a new package manager from scratch a pretty clear example of
reinventing the wheel, especially when these package managers often miss vital
features that system package managers have had for 15 years?

[https://wiki.debian.org/SecureApt](https://wiki.debian.org/SecureApt)

~~~
hombre_fatal
You link a package management solution that works for a single OS and wonder
why programming language libraries don't just use that?

So, I have a useful library that I want to share with the world. I do what,
exactly, so that other people across platforms can use it?

~~~
bwyeBr1j
Your comment is a beautiful demonstration of the problem -- a lack of
understanding of the existing tool drives a desire for a new, different tool.

There is absolutely nothing about apt or dpkg that is OS-specific other than
that they require a POSIX-like environment. The same goes for rpm and yum. And
pacman.

At the end of the day, these package managers all do the following: 1)
describe the dependency relationships of packaged software 2) describe where
to extract files 3) describe any actions that need to be taken prior to or
after extracting files

There's nothing stopping you from installing .debs with dpkg on an RedHat
system nor .rpms with rpm on a Debian system aside from conflicts when both
package managers try to put files in the same place.

This same conflict occurs when language package managers come into the
picture, because, newsflash, someone else has been working on the same fucking
problem for decades, and language package managers land somewhere between just
ignoring their work and actively hampering it.

~~~
hombre_fatal
Can you give an example of a project, maybe a programming language ecosystem,
that unifies some sort of publish-once, access-anywhere package system that
uses those tools and works across Linux, Windows, and macOS?

I don't understand what that would look like in practice. Tools like Homebrew
and MacPorts don't even do what you are asserting is the obvious, trivial way
to do it. And I have a feeling it isn't just because my microbrain doesn't get
it.

For example, I don't see how your post pitches something different than what I
used to have to do with C/C++ deps and CMake back in the day, and those were
not the heyday of dep management.

Also, enough with the animosity. When you find yourself registering a new
account because you want to sneak in some venom when talking about software
dep management on an internet message board, what are you doing?

~~~
tgbugs
Gentoo prefix can run on a wide variety of operating systems and
distributions. It needs a few updates to work around the recent changes in
macos permissioning, but if it can find a familiar shell it will more or less
work as expected.

I think part of the above comment is expressing frustration about the fact
that part of the reason for the lack of wide portability of these programs is
precisely that people say "oh they can't do x" or "this is operating system
specific" and take it as axiomatic, rather than realizing that with a bit of
engineering time and effort they can be extended to work on other systems.
This is especially true since there is pretty much no language that is used
entirely by itself, without dependencies on another language at some point. If
your language has an FFI, then you need a real package manager to accompany it.

[0]
[https://wiki.gentoo.org/wiki/Project:Prefix](https://wiki.gentoo.org/wiki/Project:Prefix)

------
paulhodge
Nice to see some actual numbers and research on the topic. Most complaints
about NPM seem to fall into the category of fretful, emotional hand-wringing
(although there's some of that here too).

If we measure success by the number (not the percentage) of high-quality
packages available, NPM is easily the best package management ecosystem yet.
But there's definitely room for improvement.

One major improvement I'd like to see is some solution for unwanted indirect
dependencies. If I depend on package A, and A provides features X and Y, and I
only need it to do X, and the code for doing Y depends on package B, then it
shouldn't download B. The recommended solution is to split up A into a family
of packages (like lodash's per-method packages), but that just creates a mess
of versioning that's annoying for both the authors and the consumers. And
those massively-multipackage libraries (like Babel) just seem to create
dependency hell situations that semantic versioning is not good enough at
solving. So maybe instead of package splitting, the next-gen package manager
could have first-class understanding of ES6 modules, with tree-shaking logic
to prevent unused indirect dependencies.

------
matthewaveryusa
I think that a lot of ideas here are really good, but they shouldn't be
implemented in npm. Npm is not meant to be curated. NPM is meant to be a
canonical way to reference a dependency, and that's about it.

I don't think it's NPMs role to assign "quality" to a package (even though
ironically they have a quality metric) -- that's something a third-party
should do on top of npm.

That's the unix philosophy in me speaking.

------
akho
For most other languages the package ecosystem is a competitive advantage;
Perl had CPAN; Python famously comes with batteries included, but is used
where its batteries don’t matter (numpy and friends, django); rubyists will go
on about their gems for as long as you care to listen. JS is the only
ecosystem I know of where the libraries are something that scares people away
from the language. My own opinion is that npm is not state-of-the-art, and
there is a culture problem (starting with wrapping single statements in a
library — leftpad is most famous, but wrapping a single regex in its own,
incompatible wrapper seems to be an accepted way to do things, too). The
culture problem may be unsolvable.

~~~
ojnabieoot
Some other salient differences with Perl and Python:

\- Perl and CPAN are tightly integrated, whereas NPM and JavaScript are not.
Perl without CPAN is unthinkable; JavaScript without NPM is rather pleasant. A
huge amount of JavaScript culture, best practices, and idioms existed before
NPM and therefore there’s much less cultural cohesiveness on “standard” third
party libraries, etc. It’s a different story with Node specifically but the
horse already left the barn by then.

\- many of the “batteries included” in Python are from a rich standard library
that JavaScript/Node lacks. So there isn’t a need for a third party library to
implement extremely basic functions like you see in Lodash or leftpad, and for
pip packages there’s a much deeper base of code/API standards to draw
standards and idioms from.

~~~
vaughnegut
I'm of the opinion that most of lodash should just be incorporated into
vanilla javascript, as part of its standard library. It could be done
similarly to Python (or any language, really), and just group each collection
of functions by usage (like itertools in Python groups tools for iteration).

Once people stop feeling the need to reach for third-party packages to
accomplish day-to-day tasks, maybe there'll be more freedom to move away from
the way npm is today.

~~~
Hedja
> lodash should just be incorporated into vanilla javascript, as part of its
> standard library

That's already happening. More and more of lodash/underscore/etc's functions
are being introduced to Array/Object/String APIs.

[https://github.com/you-dont-need/You-Dont-Need-Lodash-Underscore](https://github.com/you-dont-need/You-Dont-Need-Lodash-Underscore)

------
open-source-ux
I think the problems articulated in this article will make Deno attractive to
many JavaScript developers. The standard Deno library has everything needed
for building server-side dynamic websites.

But I think what will speed adoption of Deno is something that has been
little-commented on: the runtime is a single-executable. This massively
simplifies deployment.

Contrast that with Node, Python, Ruby and the way they spew out dozens, often
hundreds, of tiny script files all over the place. Simple, uncomplicated
deployment solutions (particularly for self-hosting) simply don't exist with
many interpreted languages. Whatever you think of Deno, their decision to
create a single executable runtime will make it attractive to many developers.

~~~
_pdp_
Deno makes no sense to me outside of hobby projects, lambda functions and
other small components. Large projects with thousands of lines of code will be
difficult to support with this paradigm. The security model is also not great.
I applaud the effort, I think it is great, but command line flags for security
work only for small project files. For any large project you need something
that is granular and per module/function. If you start going down that rabbit
hole you will end up with some kind of configuration file no different than
package.json.

------
ausjke
I have decided to use javascript strictly for the frontend, no more node.js or
express or whatever else for the backend. I will just use non-js backends these
days, of which there are plenty.

It's sad I still have to use npm even for frontend development nowadays, along
with babel, webpack and a matrix to get started.

I now stick with jquery (yes I know, but its concise syntax matters to me) and
bootstrap, and they seem to work for 99% of use cases for me. Super easy to get
started with and maintain, good enough for my use cases.

------
gtirloni
The only problem I see is the lack of a comprehensive standard library for
Node.js.

~~~
captn3m0
Ruby: x.is_a? Integer

PHP: is_int($x) (alias of is_integer)

Python: isinstance(x, int)

Node.js[1]:

    
    
        typeof val === "number" &&
        (!(typeof val !== 'number' || val !== val || val === Infinity || val === -Infinity)) &&
        Math.floor(val) === val
    

I think all supported Node.js versions now have Number.isInteger (via es262),
but the is-integer package still gets 200,000 downloads a week[2].

[1]: [https://github.com/parshap/js-is-integer/blob/master/index.js#L4-L8](https://github.com/parshap/js-is-integer/blob/master/index.js#L4-L8)

[2]: [https://www.npmjs.com/package/is-integer](https://www.npmjs.com/package/is-integer)
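For comparison, the built-in that replaces all of the above handles the same edge cases the hand-rolled check guards against:

```javascript
// Number.isInteger (ES2015) rejects non-numbers, NaN, and infinities
// without any of the inlined checks from the is-integer package.
console.log(Number.isInteger(42)); // true
console.log(Number.isInteger(4.2)); // false
console.log(Number.isInteger("42")); // false
console.log(Number.isInteger(NaN)); // false
console.log(Number.isInteger(Infinity)); // false
```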

~~~
Xavdidtheshadow
This actually taps into something I really like about npm. For things that
"should be easy/obvious", there are usually edge cases, _especially_ around
how `Object` is handled.

The real answer is for Node to have built-ins that handle this (which they now
do for `isInteger`, but absent that, I love that there are these specific
packages that have unit tests and a simple API. People scoff at things at
`isPromise`[1], but it's reliably filling a super important niche that the
language doesn't handle.

[1]: [https://github.com/then/is-promise/blob/1d547300e780a4eca391236f15e5ccbf76a5789d/index.js](https://github.com/then/is-promise/blob/1d547300e780a4eca391236f15e5ccbf76a5789d/index.js)

~~~
captn3m0
Maybe we should adopt these packages into es262 and make the shims available
by default directly with the language?

------
julianlam
A package depending on itself is just an accident (and probably something npm
ought to detect on publish and reject, but alas...)

However, I believe a 1-package circular dependency is often the result of a
developer mistakenly adding a package as a dependency when they meant to add it
to `peerDependencies`:

e.g. in TFA: "babel-core depends on babel-register, which depends on
babel-core... yeoman-generator depends on yeoman-environment which depends on
yeoman-generator."

Peer dependencies, of course, no longer exist because they got deprecated
because raisins.

Which leaves us with zero options to define sideways dependencies.

~~~
diiq
Where did you read peerDependencies was deprecated? It's still valid and
recommended, AFAIK:

[https://docs.npmjs.com/files/package.json#peerdependencies](https://docs.npmjs.com/files/package.json#peerdependencies)
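For reference, the sideways shape being described is exactly what `peerDependencies` declares: the package states a requirement without vendoring its own copy. Using the babel pair from TFA as an illustration (the version range here is made up):

```json
{
  "name": "babel-register",
  "peerDependencies": {
    "babel-core": "^6.0.0"
  }
}
```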

~~~
wldcordeiro
I think they meant the behavior of not installing peerDependencies was
"deprecated".

~~~
julianlam
Right, I misspoke. However, what's the point of calling it a "dependency" when
you don't install it alongside?

------
TehShrike
"is this package 'healthy'?" is a good line of thought, but the proposed
metrics are all less useful than just opening the source on Github or
unpkg.com and giving the source code a quick skim read.

If it looks too scary for you to maintain, you shouldn't take it on as a
dependency.

> It’s not really a technical problem, but mostly a social one.

This is absolutely correct.

~~~
diiq
You're right! That works great for the package itself, and it's what I do. But
will you also chase down and open on github all that package's dependencies,
and their dependencies, etc.? It is that more recursive problem that I'm
trying to point at. If you are content to trust that the package maintainer
chose their dependencies wisely, then this probably won't seem like a big deal
-- and that's ok! Different projects have different requirements for how they
assess third-party code.

~~~
TehShrike
I do check out dependencies, yeah – Octolinker makes it easy to do due
diligence on reasonable libraries.

If it has a ton of dependencies, then I don't want to take it on/maintain it
anyway.

------
cosmotic
The problem with the javascript ecosystem is that the javascript ecosystem
rejects the problems with the ecosystem.

------
habosa
Why why why is the default behavior of npm install to choose a range of
versions rather than a single version?

I don’t want “a version compatible with lib 0.1.0”, I want lib 0.1.0...
package-lock is a strange band-aid over this.

If I want the code my app depends on to change, I’ll change it.
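For what it's worth, the caret range is only npm's default save prefix; exact pins are one config flag away (`save-prefix` is a real npm config key; the package name here is illustrative):

```json
{
  "dependencies": {
    "lib": "0.1.0"
  }
}
```

Running `npm config set save-prefix=''` makes `npm install --save` write exact versions like this by default instead of `^0.1.0`.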

~~~
marcus_holmes
why bother with the package system at all then? Why not just copy/paste the
code into your project?

~~~
habosa
Npm provides much more value than copy-pasting code! I think we all know that.

That said if I happen upon a possible dependency that's just one JS file and
has a permissive license, I often do choose the copy-paste route.

------
_pdp_
Btw, nobody is forcing anyone to use any packages from npm. You can easily use
whatever repository you want. You can store your dependencies in your source
repositories together with your application or in another entirely separate repo.

JavaScript is a huge, perhaps the biggest, software ecosystem in the world.
There is a vast amount of code out there and you are going to get some bad
stuff if you are not careful. This does not mean it is bad at the core. It
means that it comes at the cost of doing security work upfront if you care
about this, while benefiting from the crowd-sourced experience of the masses.

------
_pdp_
While I agree that NPM can be improved, most people who are complaining about
the number of dependencies have no idea what they are doing.

If you install a package without the --production flag, which is the default,
you get everything, including perhaps some build packages like babel, linters,
and whatnot. These dev dependencies tend to be huge. That is how it works. Similarly,
if you install the dev version of a library in some other language you get not
just the source but perhaps hundreds of megs more of other source. What is the
problem?

~~~
diiq
This analysis does not include devDependencies. The counts and statistics in
it relate to strict dependencies only.

------
NiceWayToDoIT
What if npm packages were re-coded by popularity? In the sense that any package
that has achieved high connectivity and popularity must eliminate as many
dependencies as possible with pure JS code. But again, that would cause code
bloat. Dan Abramov had a talk about abandoning Clean Code. It is kind of strange
that a simple React app nowadays has an NPM folder with 1GB of files, but I
guess we've got used to it.

------
young_unixer
If I install React it's because I trust React. If React installs 10000
packages, I see no problem because I assume that React knows what they're
doing, otherwise I wouldn't trust them and I wouldn't have installed React in
the first place.

If I don't like deep dependency trees, then I install packages that I trust
not to use deep dependency trees (or rather graphs since they can have
cycles).

------
marcus_holmes
I refuse to use npm at all. Too risky.

I only use libraries if they can be added as a first-order dependency (i.e. in
a script tag on the page). This is still risky, but I can mitigate it
somewhat. And at least I'm not giving a bunch of random maintainers
commit-to-prod access in my core code.

------
ncrmro
Been looking for an mdns browser package and I'm still not sure how to filter
out node packages on NPM.

------
qppo
Cyclical dependencies are a feature, not a bug. Many languages don't allow
such a feature to exist at all with their compilation/interpreter models.

~~~
yjftsjthsd-h
What is a good usecase for circular dependencies?

~~~
crooked-v
Compilers/transpilers seem like a prime example. For example, imagine
Typescript vX using utility-library-a, which itself is written in Typescript
and has Typescript vY as a dev dependency.

~~~
diiq
You're right, that is a perfectly valid occurrence! However, I did not count
dev- or peer-dependencies in cycles; only true-blue dependencies. So all the
cycles I counted were either not of that type, or failing to use
devDependencies -- which would also be unfortunate.

