Hacker News

The whitepaper notes that almost 9 billion NPM packages are downloaded per week, so I don't see anything laughable about needing good monitoring.

Which is roughly the equivalent of every single human being downloading two npm packages per week. To me, this suggests that the real problem is that too many packages are being downloaded.

I think this is a natural result of two things which should be appealing to fans of old-school UNIX philosophy:

- NPM is intentionally suited to lots of small libraries that do one thing and do it (hopefully) well, and composing those libraries in useful ways. Whereas systems like Debian have scaling limits with large numbers of packages, NPM tries hard to avoid this so that one hundred ten-line packages are as reasonable as a single thousand-line package.

- CI systems aim for reproducibility by deploying from source and having declarative configurations, in much the way that most distro package builds happen in a clean-room environment.
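The first point above is easier to picture with a concrete example. A sketch of the kind of ten-line, single-purpose module npm encourages (this is illustrative only, not the actual left-pad package):

```javascript
// A hypothetical tiny npm module: pads a string on the left
// to a given width with a fill character. Small enough that
// publishing it as its own package is idiomatic in the npm world.
function leftPad(str, width, fill = ' ') {
  str = String(str);
  while (str.length < width) {
    str = fill + str;
  }
  return str;
}

module.exports = leftPad;
```

Debian would never ship this as its own package; on npm, hundreds of packages of exactly this size are normal.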

Probably a lot of these downloads come from bots. Continuous Integration is very common in Node.js/JavaScript projects, so on each git commit, anyone running CI (without dependency caching) will download lots of packages.
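For what it's worth, the caching half of this is not hard to set up. A sketch, assuming your CI system can persist a directory between builds (the `.npm-cache` path is arbitrary):

```shell
# Point npm's cache at a directory the CI system restores/saves
# between runs, and prefer cached tarballs over re-downloading.
npm ci --cache .npm-cache --prefer-offline
```

With a warm cache, most of those registry hits never happen.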

> Which is roughly the equivalent of every single human being downloading two npm packages per week

The current human population of Earth is about 7.7 billion, so that number should probably be closer to 1.17 npm packages per week per human being. That is still quite a lot, though.
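The arithmetic checks out (using the 9 billion and 7.7 billion figures from the thread):

```javascript
// Per-capita weekly npm downloads, using the figures quoted above.
const weeklyDownloads = 9e9;   // ~9 billion downloads per week
const population = 7.7e9;      // ~7.7 billion humans
const perPerson = weeklyDownloads / population;
console.log(perPerson.toFixed(2)); // "1.17"
```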

This highlights the problem of averages. Most (99.87% or so) humans download zero npm packages. But those that do, often download them in the thousands at a time. And yes, clean-room CI servers are a big part of that.
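To make the skew concrete, here is the same average recomputed over only the people who download anything at all (the 0.13% figure is the rough one from the comment above):

```javascript
// If ~0.13% of humans account for all npm downloads, the
// meaningful average is per downloader, not per human.
const population = 7.7e9;
const downloaderShare = 0.0013;                   // ~0.13% of humans
const downloaders = population * downloaderShare; // ~10 million people
const perDownloader = 9e9 / downloaders;          // roughly 900/week each
console.log(Math.round(perDownloader));
```

Roughly 900 downloads per week per actual downloader, and the real distribution is skewed even within that group.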

Perhaps npm could save themselves oodles of money by supplying a nice turnkey npm package cache and requiring major users to use it.

And perhaps the CI server folks would want this anyway because it would be vastly faster.
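A turnkey version of this already more or less exists in the form of caching proxy registries. A sketch using Verdaccio, one popular open-source option (the port is Verdaccio's default; whether it fits a given CI setup is another question):

```shell
# Run a local npm registry that proxies and caches the public
# registry, then point npm at it instead of registry.npmjs.org.
npx verdaccio &
npm config set registry http://localhost:4873/
```

After the first build, repeat installs are served from the local cache rather than npm's servers.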

You might be surprised (or maybe not) to learn that many service providers are far more willing to spend money on predictably large bandwidth bills than on less predictable changes in their infrastructure which require human time and attention to implement.

Yep, not that surprising though, given the anemic state of the JavaScript standard library.

The idea of a scripting language is that it does not have a standard library; that will be different in each environment. You don't, for example, want the same standard library in Node.js and the browser. Each runtime can choose which APIs it wants to expose.

That’s not a definition of scripting language I’ve ever heard before, and it’s neither true nor desirable. Even JavaScript has a standard library – think of things like Set, Map, RegExp, Promise, etc. – because they’re universally useful, as opposed to runtime-specific APIs like the DOM, which are less relevant to many use cases. JSON is a great example of something crossing over, as an increasingly high percentage of projects use it.
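The built-ins mentioned here are specified by ECMAScript itself, so they behave identically in Node.js and browsers, with no runtime-specific API involved:

```javascript
// All of these are ECMAScript standard-library built-ins,
// available in every compliant runtime.
const unique = new Set([1, 2, 2, 3]);        // Set deduplicates: size 3
const byName = new Map([['alice', 1]]);      // keyed collection
const isHex = /^[0-9a-f]+$/.test('deadbeef'); // RegExp literal
const copy = JSON.parse(JSON.stringify({ ok: true })); // JSON round-trip
Promise.resolve(unique.size).then((n) => console.log(n)); // async built-in
```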

Not having a standard library on par with other scripting languages just added overhead and incompatibility for years as people invented ad hoc alternatives, often buggy. The accelerated core language growth has been hugely helpful for that but you still have issues with things as basic as the module system which exist for understandable reasons but are just a waste of developer time.

Python is the batteries included scripting language. The two concepts are not mutually exclusive.
