
Except that now, the "download deps" step in CI can fail if any one of the hundreds of websites hosting my hundreds of dependencies goes down. If the main npm registry goes down, I can switch to a mirror and all of my dependencies will be available again.



To play the rubber duck: if wiping the cache on each build is a risk to your CI, what could you do to keep your CI up?

1 - don't wipe the cache folder on each build? It's easy and secure, and your builds will be faster.

2 - use a caching mirror for the deps you use? It takes about 10 minutes to put in place and is already standard practice at companies that care about security and availability anyway.

3 - there's https://deno.land/x if you want to put all your eggs in the same basket, npm-style


Yes, I think I'd probably settle for solution number 2.

I still don't understand how this is better than npm, or how Deno solves Node's horrible dependency management, but maybe if I actually build something with Deno I'll get some answers.


From the post:

> [With NPM] the mechanism for linking to external libraries is fundamentally centralized through the NPM repository, which is not inline with the ideals of the web.


> which is not inline with the ideals of the web

Subjective.

> Centralized currency exchanges and arbitration is not in line with the ideals of the web! - Cryptocurrency

Nek minute. Besides, let's get real here; they will just end up centralized on GitHub. How exactly is that situation much different from npm, or from any other language ecosystem's library directory being mirror-able?


The centralization of git repositories on GitHub is completely different in nature from the centralization of Node packages on npm.

git does not require GitHub to be online in order to work, nor does it rely on GitHub's existence for its functionality.


I'm talking about the centralization of software packages (Go, Deno) on GitHub as it applies to dependency resolution.


I'd highly recommend mirroring packages anyway. Obviously this isn't always necessary for small projects, but if you're building a product, the laws of the universe basically mandate that centralized package management will screw you over, usually at the worst possible time.


You answered your own question. Nothing stops you from using a mirror with Deno, too.


Which again brings me back to something I'm still not understanding: how is Deno's package management better than npm if it is extremely similar to npm, but slightly less secure?

I'm only asking because lots of people seem to be loving this new dependency management, so I'm pretty sure I'm missing something here.


We need to distinguish between npm, the service (https://www.npmjs.com/), and npm, the tool.

Deno has the functionality of npm, the tool, built-in.

The difference is that like Go, Deno imports the code directly from the source repository.

In practice that's going to be github.com (but it can be GitLab or any code host that you, the author of a Deno module, choose to use).
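
For example, a minimal sketch of a URL import (the std version pinned here is just illustrative):

    // main.ts — Deno fetches the module directly from the hosting URL
    // on first run, then caches it locally; @0.140.0 pins a git tag.
    import { serve } from "https://deno.land/std@0.140.0/http/server.ts";

    serve(() => new Response("hello from a URL import"), { port: 8000 });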

npm is an unnecessary layer that both Go and Deno have removed.

It's better because it's simpler for everyone involved.

In Go, I don't need to "publish" my library. People can just import the latest version or, if they want reproducibility, an explicit git revision. Compared to Go, publishing to npm is just unnecessary busywork.
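
Deno gets the same reproducibility out of the URL itself; a rough sketch (the module names, repo, and revision below are made-up placeholders):

    // Track a tagged release via deno.land/x (the tag maps to a git tag):
    import { encode } from "https://deno.land/x/somelib@v1.2.3/mod.ts";

    // Or pin an exact commit by importing the file at that revision
    // straight from the code host:
    import { decode } from "https://raw.githubusercontent.com/user/repo/8f0c2d7/mod.ts";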

I've seen JavaScript libraries where every other commit is related to publishing a new version to npm, littering the commit history.

In Go there's no need for package.json, which mostly replicates the information that was lost when publishing to npm (who's the author? what's the license? where's the actual source repository?).

As to this being insecure: we have over 10 years of experience in the Go ecosystem showing that in practice it works just fine.


How do you list the dependency libraries if you don't have a package.json?

Do you manually install a list of libraries provided by the author's readme?


The simplest approach is to either import anything anywhere, or to have one local module that imports the external dependencies and then have your code import them via that local module.
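
In Deno circles that local module is conventionally called deps.ts; a minimal sketch (the re-exported modules are just examples):

    // deps.ts — plays the role of package.json's dependency list:
    // every external import lives here, version-pinned in its URL.
    export { serve } from "https://deno.land/std@0.140.0/http/server.ts";
    export { parse } from "https://deno.land/std@0.140.0/flags/mod.ts";

    // elsewhere, e.g. main.ts:
    // import { serve } from "./deps.ts";

Upgrading a dependency then means editing one URL in one file.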


The dependencies are imported in the source code of the package.


npm, the tool, has been able to install directly from GitHub instead of npmjs.org for many, many years as well. No one really used it except as a workaround for unpublished fixes, because it has no other tangible benefits.


I like it because it's simpler. I know what happens when I import from a URL. I'd have a hard time whiteboarding exactly what happens when I `npm install`.


What happens?


My least favorite thing about importing from npm is that I don't actually know what I'm importing. Sure, there might be a GitHub repository, but the code is uploaded to npm separately, and it is often minified. A malicious library owner could relatively easily inject some code before minifying, while still maintaining a clean-looking repo alongside the package.

Imports from URLs would allow me to know exactly what I'm getting.


Install from the repo, then?

You can install a specific version from git via yarn/npm.

How do you trust a URL more without reading the code?

What's going to stop the Deno ecosystem from putting minified JS files on CDNs and importing them?


It's decentralized.



