Not sure if Bun has had an influence on this; these features must have been in development for a long time, but the timing surely is impeccable.
npm compatibility is huge for Deno. It is basically the one major drawback to Deno, and hopefully this feature fixes it. It also looks like this compatibility layer is implemented transparently in the existing module management, which is a big bonus. Nobody who has worked with Deno for an extended period wants to deal with node_modules anymore; just love to see this.
If you deep dive into Bun, it only has performance gains for certain features, and it can't really be compared to Deno or Node at the moment because it lacks so many other features that would better showcase its speed.
I had a huge jolt away from the Bun hype-euphoria when I noticed it won't run Express.
That level of incompatibility is banner-headline material, not some minor footnote, which is effectively how the Bun team decided to communicate that little detail.
If they're going to do things all-over-again... again... in the JS community, couldn't they get it right this time and go with speed, stability, compatibility...
Hm, that claims SQLite is 'never' used for web backends; Bun says it's motivated for use 'at the edge' - first thing I thought of when I read that was Fly.io/Litestream: https://fly.io/blog/all-in-on-sqlite-litestream/
(No idea if there's any relationship between the projects, just makes me think that article's a bit dismissive.)
Core features for npm compatibility, though (like implementing most of the Node API, including require() and its special globals), have been making progress for a long time.
Relative to the existing node compat efforts, the URL import (and accompanying package download code) will be fairly small. The most difficult part is probably the extra stuff that Node throws into the global namespace. How to handle that without having to pollute the global namespace for all programs (even those that do not import from node) is unclear.
Ideally these globals would be visible only from code in modules imported from npm. But the spec does not really allow for this unless the npm code is loaded in a different realm, and cross-realm code causes a lot of headaches, which could only be avoided by having the realms share most globals and intrinsics (and sharing intrinsics is not allowed by the spec).
There may be some other way to hack this into working, or perhaps programs with such imports that actually use npm-specific objects will need to be run with the "--compat" flag. It is really unclear at the moment.
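To make the "special globals" problem concrete, here is a minimal illustration (my own sketch, not anything from the Deno team): typical npm code leans on Node globals that Deno does not define by default.

```ts
// typical npm package code, relying on Node globals:
const home = process.env.HOME;          // `process` is a Node-specific global
const buf = Buffer.from("hi", "utf8");  // so is `Buffer`

// Plain Deno code sees neither of these. The open question above is how to
// make them visible to code imported via npm: specifiers without also
// injecting them into the global scope of programs that never touch npm.
```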
Heh, I commented in that post from yesterday how Bun looked more promising longer term to me than Deno, even though it wasn't production ready yet: https://news.ycombinator.com/item?id=32460658
Like you said, don't know if Bun had an influence on the timing of this announcement, but great to see innovation in the JS server space. Kudos to both the Deno and Bun folks!
My guess is they were planning on announcing these things sometime in the next couple of months, but then Bun showed up on HN, so they moved up the blog post.
Deno's progress on NPM compatibility may have accelerated due to how BunJS was received, but yes I believe this idea has been baking for a while in the Deno world. It was always somewhat of a concern when Deno was launched, but the team was just more focused on delivering a solid core product before they started worrying about the ancillary tasks. After all, you can always use the various Deno-friendly CDNs to use a Deno-compatible package distributed over NPM.
BunJS does of course have a slightly different use case, packages or apps that still need to make use of Node APIs. I can see a future where we say "if you have an old Node app, you can run it on BunJS with a codemod and get an X% perf boost for free", as well as "if you are starting a new project, just go with Deno because there's less BS to think about".
Npm is the main thing I dislike about Node though. I hope compatibility is a feature meant for easing transitions, not something the general user base should be embracing.
It's a package manager, package managers are almost always entirely optional.
Feel free to gather and build your dependency trees by hand, if you believe this to be worth it. Most packages have their sources and build instructions readily available on GitHub/Lab :)
These problems exist with every package manager, and every ecosystem of reusable modules that can be downloaded from a 3rd-party location. The problems that `left-pad` caused are minuscule compared to the ones created by Log4J. It's just the trade-off you make when you download a package from the internet vs when you write the code by hand. NPM may have more instances these days because it happens to be the most popular, but CPAN had the same problems over 20 years ago. The next package ecosystem to come along will also inherit a vulnerability to supply chain attacks, being part of the supply chain.
I don't think NPM is the best way to create a package manager, in fact there are many choices they made that I think are rather stupid and led to many more problems than there needed to be (whoever decided that the default for `npm add` should be a caret dependency instead of a tilde should never be allowed to work in the industry again, IMHO). But I'm not going to blame them for a problem that definitely existed in the package management world before them and will continue to exist long after everyone forgets about NPM.
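For anyone who hasn't been bitten by this, the difference is easy to show with concrete ranges (package names below are placeholders): `^1.2.3` accepts any `1.x.y` at or above `1.2.3`, so `1.3.0` can arrive unasked in a fresh install, while `~1.2.3` only accepts patch updates like `1.2.4`.

```json
{
  "dependencies": {
    "caret-example": "^1.2.3",
    "tilde-example": "~1.2.3"
  }
}
```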
The default should be shrinkwrap, like `yarn`, with the ability to upgrade packages manually when you decide that it's a good time to upgrade. I don't like things changing at all without explicit interaction. Services like Snyk will let me know if there's a specific security concern with a package version I'm relying on.
> Don't downplay it's importance and it's disadvantages that came along.
This is rather unfair.
Your previous comment referred to NPM in the broadest possible sense, so what else would you expect its audience to do but to interpret it in the same way?
It is primarily a package manager and so I assumed you didn't like it for how it functions in that capacity.
That is not to say that I disagree with you - it's their priorities and stubbornness on major issues that I myself take issue with, but I'm complaining too much already in general so I'll leave it at that.
I'm actually a little disappointed by NPM compatibility. Or at least conflicted. Deno has felt like an opportunity to reboot the server-side JS ecosystem into something far more sensible than it currently is. Not that anyone is to blame for what it is, really, but you learn lessons and move forward. Sometimes making breaking changes.
This seems like an acceptance that moving on simply won't happen. The existing ecosystem is too big and too powerful to be ignored. If I were a particularly cynical person I'd also point to Deno taking on millions in VC funding - once you've done that you can't be content to create a perfect, gleaming sandbox. You have to bring in money.
I can speak to this since I've used Deno quite a bit. The problem is that the Deno ecosystem is, unfortunately, still very young. This young age also brings a lot of instability with it. On the other hand, the Node ecosystem is extremely mature. There's a lot of legitimately great npm packages with years of development and bug fixes.
Not being able to use these packages is a detriment to Deno's adoption, not a boon. IMO, the biggest issue with Node is Node itself, not its ecosystem. All Deno does is bring the JavaScript ecosystem to a more modern standard. Being able to pick and choose the npm packages you want to use while slowly adopting the Deno stdlib and Deno specific packages will make it easier for developers to make the switch.
Once Deno ships these features I'll try Deno again. Right now I'm really turned off by JavaScript development. The last couple of times I've tried to start up a project on Node it's been an absolute nightmare with its ESM madness and lack of TypeScript support. On the other hand, getting started with Deno was a breath of fresh air... up until the real development starts.
This was my experience as well. When Fresh dropped a few weeks/months ago, I thought, "oh neat, islands architecture, that's right up my alley". I decided just to blindly fool around with it one weekend. I immediately was hit with, "but what do I do with my styles? How can I transform all this ancient Less into CSS and attach it to files/components?" I'm sure there's some sort of way around all that but, right now it seems like I'd need to have a node instance just to do a compilation step!
Node ESM support REALLY sucks. Node import maps suck (and are poorly documented). Node doesn't support TypeScript out of the box. Node dependency management has always sucked. I've used three different package managers, and I'm sure more will pop up in the future.
I've always loved Node.js. But the unfortunate reality is it sucks to build new projects in it in 2022.
Totally agree on ESM support being a mess. We're going to see projects littered with .mjs files for decades to come. People will whisper in hushed tones: what the heck is an .mjs file, isn't this just JavaScript? Why did we do this again? No one will remember.
I think this is the motivation behind requiring an npm: prefix. It is a way to virtue signal for devs who buy into the deno goals to push for migrating to non-“legacy” packages. If you see an npm: prefix in a deps.ts file you can consider it tech debt. And potentially set up a lint rule to enforce it in the future. If they had not put an explicit reference in the source (passed through the packages transparently or kept a server side list of packages on npm) then it wouldn’t be colocated with the source and the PR processes. I think this was a good move overall.
It's not just a "virtue signal"; those packages will have to go through some special (and pretty aggressive) transpiling in order to work. This wasn't just flipping a switch from the Deno team
That's not really what's happening behind the scenes.
We are consuming sources as they are provided on npm, using a "compat layer" that is part of Deno's standard library. We had to provide special module resolution, but besides the polyfills for built-in Node APIs, it works very similarly to how Node consumes these packages.
As a non-user of Deno, I’m curious, how do people tend to manage dependency version centralisation in Deno? I suppose this will apply to existing Deno as well as npm imports. I can just see you ending up having to change “import express from "npm:express@5";” from 5 to 6 in lots of places when the next major version comes out.
When contemplating how I might do it, I came up with centralised reexport, a file deps.ts containing:
export { default as express } from "npm:express@5";
This would basically stand in for the dependencies object in package.json.
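A minimal sketch of the consuming side under that scheme (file names and the Express usage are just illustrative):

```ts
// app.ts — everything imports through deps.ts, so bumping
// "npm:express@5" to "npm:express@6" is a one-line change in one file.
import { express } from "./deps.ts";

const app = express();
app.get("/", (_req, res) => res.send("hello"));
app.listen(8000);
```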
deps.ts is currently the most common way; however, new projects like Fresh use import maps. The problem with import maps is that they are not composable. As such, using an import map for a library is usually bad, since the library's import map is not picked up and used, so the end user would have to add their own import map with entries compatible with the library (this is what Fresh does: it generates an import map with the entries it needs). Due to this problem, deps.ts is the most common solution, but I do hope that the composability issue with import maps will change.
It could be nice to have import-map composability. However, I don't think it is planned, because browsers' import maps are not composable. I dislike the deps.ts concept. It is one of the main reasons I have not switched to Deno yet.
IMHO it's a feature and not a bug that if you import a dependency you also have to carefully review and approve (or add) its dependencies to your application. The entire reason Node is a mess of supply chain issues and problems, and is almost like nuclear waste in some professional orgs (i.e. impossible to use because of unclear and unknown licensing concerns, ownership, etc. across thousands of dependencies), is the ease with which long and complex dependency chains are pulled in by a single npm install.
This isn't entirely fair to Deno, but it always amazes me how people start out with "we don't need packages! Just do direct imports" and eventually end up with "okay, fine, fine, here's a package manager". Packages are great! Don't dismiss them just because they're bad in edge cases.
In fairness, the two examples, Deno and Go, are successful projects, so maybe it's not a bad initial strategy.
Deno still won't have a package manager, at least in the style of npm. You don't need to learn a bunch of CLI and janitor a package.json file. You just add an import to your source code and you're done. Pulling in a npm package is the same as pulling in code from github, it's just a special URL. That's it... that's the 'package manager'.
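Concretely, the two forms sit side by side in source under this announcement (the std URL below is one that appears elsewhere in this thread; treat the exact specifiers as illustrative):

```ts
// a plain URL import, fetched and cached by Deno on first use:
import { serve } from "https://deno.land/std@0.152.0/http/server.ts";

// an npm package, resolved from the npm registry into the same cache,
// with no node_modules folder and no `npm install` step:
import express from "npm:express@5";
```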
Except that, as the ecosystem grows, you eventually have packages that are complex enough that they have imports, and those packages have imports.
And now because you've hard-coded imports at every level, you can end up with a dozen different versions of the same import, causing potential conflicts between the various components of your app.
Package managers evolved to be the way they are _because they're useful._ It's not like people said "let's add all of this complexity for fun!"
Direct imports failed as a solution for Go, and they already have been acknowledged as an incomplete solution for Deno (import maps are already a larval form of package.json, but without the ability to inherit child import maps, which means it's still broken, but presumably it will eventually be fixed).
Not having a standardized place where you can see the dependencies of a project, and not being able to distinguish which are _development_ vs _runtime_ dependencies, is also rather broken, and would require a package analysis tool that's far more complex than a package manager to use to trace the dependency tree reliably.
It's the same stuff npm does, but it doesn't depend on npm's CLI. It sounds like you're mad they're moving the cheese a bit. IMHO I'm glad to be rid of the npm ecosystem baggage.
Package.json is a mish mash of a lot of concerns. It has project metadata, that's also duplicated in your readme and github descriptions. It has scripts, that are also probably just calling into other shell scripts and such in your project. It has your dependencies. It has your development environment setup. It has random key value pairs you decided to place there years ago and forgot the exact reason for now but are too scared to remove them. It's a royal mess of a file to manage.
Having a file to manage locked dependency versions isn't bad. Having a file that has grown into a crufty monster with millions of uses can be bad.
I don't care whether it's all in one file or in a dozen files, but I want all of that information to be available programmatically in a text file (unlike in a readme or on Github) in a standardized location in a project.
In that respect, package.json is a strict win. Your lack of willingness to use `git blame` to see why you added a line, or lack of reasonable git comments, is not to be blamed on the file.
Complexity is unavoidable. How could you write a tool like license-checker [1] for a Go-based project without having license information in a standardized location? Without the scripts section, how can you create a tool like husky [2] that automatically installs git hooks for a project? Every single part of package.json is there for a good reason; at best you could argue that putting some of it in other files would be aesthetically superior, but that's just bikeshedding.
Complexity isn't de facto bad. Some complexity is required if you want a certain level of functionality to become available. Deno (and Go) are slowly accumulating that "cruft" as people realize that those functions are actually useful or even critical to a mature ecosystem.
The dependency inspector is showing each TypeScript file, which strikes me as a newb mistake if I've ever seen one. That's just tons of unnecessary noise distracting you from what you need to know. Heck, it's also showing library files, which are going to be mostly irrelevant. If I include a framework that has 50 source files, I really just want to see what framework version I've included, not the place where every one of those fifty files are included every time they're included. Heck, if one indirect dependency had a bad version tag it would be nearly impossible to see.
The lock file format is also TypeScript-file based, which ... well, since dependencies can point at arbitrary git repos, you could easily end up in a situation where the version tag on the repo is changed and then you can't find the file you need to download. It's a pretty worthless lock file if the file can just disappear from the internet; a lock file should exist to guarantee that a package can be rebuilt exactly, not just prevent it from being rebuilt if it can't be rebuilt exactly, which is all the Deno lock does.
npm on the other hand will prohibit anyone from deleting an older version of a package if anyone else is using that package. "leftpad" can never happen again now in Node. Having a centralized repository is actually a Good Thing; being able to search through all the packages in one central repo and determine relative popularity of each is also useful.
Also: The lockfile is a generated shrinkwrap file. That's only a tiny piece of what I'm talking about. What I want to be able to do is set the exact version of an indirect dependency without needing to fork and rebuild every dependency. See yarn resolutions [1] for an example in the npm ecosystem.
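For readers unfamiliar with resolutions, the block looks roughly like this (versions illustrative); it pins an indirect dependency everywhere in the tree without forking anything:

```json
{
  "resolutions": {
    "**/lodash": "4.17.21"
  }
}
```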
No, I'm not mad because it's different. I habitually chase the bleeding edge, up to and including jumping between frameworks and build systems and even languages and editors, in constant search of ways to improve my development workflow. I thrive on learning new ways of doing things. I was enthusiastically digging into Deno shortly after the very first 0.x announcement, but quickly saw a number of showstopper flaws (including lack of npm compatibility) that caused me to write it off.
No, I'm complaining because Deno is missing crucial features. "Lack of features" isn't a feature on its own, especially when people use those features. (See also the Go language.) I also would miss the "scripts" section of package.json, since it's a centralized location for the various commands that a project might need; things like "generate a new migration file" or "run both the client and both servers" vs "run only server A".
This is all complexity that was swept under the rug by Deno and that will need to be replaced by community standards that won't be standard.
I'd still say Go doesn't have a "package manager". The dependencies are fetched, cached & built as you need them, but there's nothing comparable to "npm i"/"npm uninstall", dpkg/rpm/apk/etc. as such. If you want, you can just import example.com/foo/bar in your source, and the rest is just reproducibility (version & hash) and caches.
Is deno successful? I (in my limited world) don't know anybody using it and don't see it mentioned in job descriptions etc.
And comparing to other languages: it's not like "ok, projects became too big, we need better package management" or something; to me it reads more like "there's too little Deno stuff, and migration costs prevent people from coming here, so let's try to embrace the other runtime's stuff so maybe more people can use our runtime without having to start from scratch".
It's not even 5 years old, give it some time for people to pick it up and use it. Python was created in '91 but even by '96 it still wasn't widely used.
I think there's quite a big gap between success and being more popular than NPM. Plus it's only a few years old. Job adverts are usually several years behind what companies actually use and companies are behind what hobbyists use.
I'd say it is on a successful trajectory. The main issue with it was NPM compatibility so I think this is a very very wise move, even if it is pragmatic rather than ideal.
Wow, very interested in an npm compatible Deno! I think that will remove a huge barrier to entry for Deno and could mark a huge shift towards usage in production.
I'm very interested in how packages will manage runtime support in the future. Right now we are seeing an explosion of JavaScript runtimes, from Cloudflare Workers to Bun, all with slightly different APIs. I wonder if there will be an easy way to get cross-runtime support without it being much extra work for library authors.
Right now, there has already been conflict with some packages being browser-only and some being node-only. I feel like there needs to be a better solution than the one we have now.
Browser, node (OS-level), or serverless worker, these are all fundamentally different roles and need different API requirements. Node and serverless don't have a DOM. Serverless can't have unfettered access to the host OS. Node should be as capable as any host language (Python, Ruby, C, etc.) with an API equally powerful. The browser obviously needs a more limited sandbox.
There is a parallel universe out there, where Sun had success with Java in the browser, the JVM evolved to better support distribution and HTTP standards as well as replaced the need for WASM. And we all lived in a happy world with Java on the server and Java in the browser and everything is cross-platform and Just Works(tm). Cats and dogs also live together in harmony in this world. We had big dreams in the '90s.
Yes, the lack of npm compatibility was the only thing keeping me back from Deno. I really like the way they're going to do it: `import express from "npm:express@5";`
I'm very familiar with the space. The majority of these have never reached production status or any meaningful usage, the few active ones are very niche products - Duktape, Espruino, Quickjs, etc, you'll be hard pressed to find someone running a web server, gui or cli app on them.
Hermes might show up in the engine radar due to how prevalent RN is, but other than that, you can bet 99% of projects run Node.js or Deno now.
As long as these are actual standards, all the other runtimes also get a fair shot at implementing them. I think it's a move in the right direction.
So we've come full circle: from the author's talk of how "NPM was a big mistake" to "nobody is adopting it, we will die without the NPM ecosystem". I was hoping that Deno would spin off into something that can be compiled, thanks to type system hints, into something more optimized than JS, but it seems I was just living in a dystopian utopia.
> There will be no node_modules folder, no npm install; the packages will be automatically downloaded in the Deno cache.
My understanding of Deno's package system (see https://deno.land/manual/linking_to_external_code#it-seems-u...) is that, basically, at compile time it will go and fetch external URLs and cache those files locally. So this change basically makes it so your import is magically generated from npm when you write this instead:
> import { assert } from "npm:testing@2.0.1";
Ok cool. Whatever.
...surely this isn't the actual problem?
There seem to be certain pretty fundamental problems:
- not all npm packages are typescript
- a package is not just an 'assert.ts' file; it's often an amalgamation (e.g. built with Rollup)
- a package often has many dependencies
- a package may use the node API, that deno doesn't have (it has a more-or-less compatibility mode, see 'differences that cannot be overcome' -> https://deno.land/manual/node)
So bluntly, how on earth is this going to work?
I mean, I'm a fan, the folk working on Deno are smart and motivated, and if it works, I'm more 'wow' than 'I don't believe it'... but I'm very surprised to see:
> the vast majority of npm packages work in Deno within the next three months
That seems... ambitious.
I can't see how the approach is really sustainable, given that it basically means 'anything Node supports, we have to support too'; doesn't that mean you're forever playing catch-up and 'doesn't quite work'?
How is vendoring going to work without a package lock file? Doesn't this just mean you've reimplemented npm and you'll forever be playing catchup to the various npm features that are required to make packages work?
You overestimate many of these difficulties. The --compat mode can already run quite a bit of the node ecosystem, including programs like npm.
Imports don't need to be written in TypeScript. They could have manually written type definitions (which is how people using TypeScript in the Node ecosystem already handle this), and packages without type definitions will have their imports typed as `any`. Sure, that makes them a bit annoying to handle, especially in strict mode, but it is entirely possible to use such packages.
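A minimal sketch of the hand-written route, for a hypothetical untyped package (the package and function names are made up):

```ts
// some_untyped_pkg.d.ts — hand-written types for a package that ships none
declare module "npm:some-untyped-pkg@1" {
  // declare only the surface you actually use:
  export function doThing(input: string): Promise<string>;
}
```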
Deno is actually working on adding the native Node API (https://github.com/denoland/deno/pull/13633), so the website was a bit overoptimistic when saying "Deno will never support Node plugins". However, it is possible for some plugins to be incompatible still, for example if they try to use Node internals rather than just the API, or if they try to utilize the libuv event loop, since Deno is not libuv-based.
Looking at it in terms of sets is not really useful because the utility of TS is to exclude invalid behavior. JS allows many things which are not permissible in TS. Semantically, not syntactically. And TS can't infer a number of things such as higher-order functions. So you either have external definitions for those libraries, give up and type it "any" (and taint your entire code base), or have errors which means you can't use the JS library with your TS code base.
Or to put it another way, JS is only a subset of TS in as far as you're willing to forgo the benefits of TS.
The NPM thing is a huuuuge deal. This has been by far the biggest thing holding Deno back, but it was tough because one of Deno's most exciting benefits was standardizing the module system and ditching NPM. It looks like they may have found a best-of-both-worlds solution for shoehorning NPM's massive ecosystem into Deno while it's still building out its own. Big changes indeed.
That, and the emphasis on speed (esp quotes like "We aren't optimizing for a handful of edge cases, but for overall real world performance") makes me wonder if Bun is lighting a fire under them a bit. It's still a long way from catching up, but it's been closing the gap at a blistering pace and getting enormous amounts of hype. The more competition the better.
This might sound odd, but: one thing I've wanted to do is use the Typescript compiler without using NPM. (I'm not smart enough to know why I hate NPM. I just want a tool that compiles TS.)
It almost sounds like I could write a self-contained deno command-line "app" that does this. Does that seem right (for those that know better)?
Errr, why do you need any non-js-runtime tool for this? TS's normal `tsc` command line is a single js file - you can pull it straight from GitHub[1] or unpkg[2] and run it directly in node without installing a full package via a package manager if you'd for some reason like to.
This is just for personal projects. But I sometimes write JS code for the browser and I tend to not want to use any 3rd party libraries. But I do like the extra type checking. So I just want something that does the type checking and does TS to JS -- but without installing npm (which I have some irrational dislike for -- maybe just because I don't understand it enough).
One other alternative to npm that is very useful for this specific case is npx. `npx -p typescript tsc` will download typescript, cache it somewhere outside of the current project (often %LocalAppData% on Windows or a near equivalent elsewhere), and then run the compiler. You can run specific versions with `npx -p typescript@4 tsc`, and you can follow that with command line arguments for tsc.
npx has been installed right alongside npm in Node installs for some time now.
Learning npm is still often a good idea when working with Node. One thing that may be useful to learn here: npm has a concept of a development dependency needed only for development, not run time. You can install one with `npm install --save-dev typescript` for example. (Or `npm i -D typescript` if you want to save some typing.)
You can clean out dev dependencies with `npm prune --production`. Or if you are in a fresh install situation (a fresh git clone, for instance) and you want to skip developer dependencies you can `npm install --omit=dev`.
I find the distinction between production dependencies and developer dependencies quite useful.
Yeah, this is the answer. TBH..... you should get over your fear of npm. If you already have node installed you can just run `npm i -g typescript ts-node`.
This will give you tsc as a global command line to compile typescript to javascript.
ts-node is optional, but it is the standard node cli that can run typescript
After you install these two you can just ignore that you have npm forever.
(... but yes you can also just download the js file as the person above suggested if you're really stuck on being anti-npm)
TS "compilers" basically strip out the type annotations, they don't care about the validity of the types, so you'll be missing out on the main reason to be using TypeScript.
esbuild is so awesome... my steps towards web dev sanity are
1. Switch everything to esbuild; remove the 10k-dependency security nightmare, bloatfest, and sluggishness of pretty much every other Node-based build system / bundler.
2. Switch to deno.
Haven't quite got to deno yet, but #1 was liberating. esbuild also has some preliminary deno support, but not sure how stable it is.
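"Switching everything to esbuild" can be as small as this, a sketch using esbuild's JS API (paths are made up):

```ts
// build.ts — the entire bundler config, replacing a webpack/grunt setup
import { build } from "esbuild";

await build({
  entryPoints: ["src/index.ts"], // made-up entry point
  bundle: true,                  // inline the whole dependency graph
  minify: true,
  sourcemap: true,
  outfile: "dist/app.js",
});
```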
The main thing I like about esbuild is that I feel like I won't ever have to "migrate" build systems again (and I have to deal with an unusually high number of repos, so this is a big deal). Not much in my repos is esbuild-specific; migrating was more a matter of removing build-system-specific smells from repos, and it's all just a JS and CSS bundle now. I.e., there could be something better than esbuild in the future, and switching to that should in theory be effortless. This has never been true for any of the npm build systems prior (I was there at the beginning, with Grunt).
Not OP, but on paper Deno offers a lot of things out of the box:
> - Provides web platform functionality and adopts web platform standards.
> - Supports TypeScript out of the box.
> - Has built-in development tooling like a dependency inspector (deno info) and a code formatter (deno fmt).
Also, they have a built-in testing library, so for me it's mostly about getting rid of the tooling as a dependency. The only thing missing is a frontend bundler (I think they have one, but it's intended for the backend only), but there are swc and esbuild, which hopefully will reach feature parity with webpack and friends.
> TS "compilers" basically strip out the type annotations, they don't care about the validity of the types, so you'll be missing out on the main reason to be using TypeScript.
I guess I'd want a Typescript compiler that does the "type checking" before stripping out the type annotations. Like that's how TS is supposed to work.
If ESBuild doesn't do the type checking, then it probably doesn't fit my need.
Edit: In fairness, if my IDE does the type-checking, this is almost good enough.
You can run esbuild during development for quick builds and eslint for ide typescript enforcement, and then use tsc plus eslint during a production build to enforce type checking.
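A sketch of that split in npm-scripts form (script names and paths are placeholders; `tsc --noEmit` is the type-check-only invocation):

```json
{
  "scripts": {
    "dev": "esbuild src/index.ts --bundle --outfile=dist/dev.js --watch",
    "typecheck": "tsc --noEmit",
    "build": "npm run typecheck && esbuild src/index.ts --bundle --minify --outfile=dist/app.js"
  }
}
```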
I've been using Deno in production at work for a while, and the lack of npm compatibility is the one thing dragging me back to Node; some packages, like the Elasticsearch client, simply won't work in Deno. If they solve that, I'll be throwing a party for all of HN where we collectively remove Node from all my development workspaces.
> We've been working on some updates that will allow Deno to easily import npm packages and make the vast majority of npm packages work in Deno within the next three months.
This is really huge and will be a huge boost to the Deno ecosystem. On the other hand, I quite enjoyed that it wasn't jacked into NPM. There were reasonable alternatives like https://jspm.org/. This is a big swing at Node and I'll be watching with maximum curiosity.
As an aside, the only thing preventing me from adopting Deno as my daily driver has been the lack of AWS Lambda support (requiring a lambda layer, slowing down cold starts). Would be great if they could leverage connections and advocate for native AWS support.
Deno is awesome, and the npm compatibility is a game changer. Isn't this the 3rd HTTP server rewrite in less than a year, though? They didn't mention it being backwards compatible. I guess they needed to change the underlying server to support features in Fresh. This is one downside of the team making the core platform as well as many of the main components of the ecosystem: they can evolve the runtime to suit their vision even if it affects competitors' implementations.
I have limited experience using Node.js. My first interaction with Node was unpleasant. The whole npm world seemed very insecure and very wild-west-like; thinking back, I can't remember why I felt that now.
I know the article said some would like to get away from the npm. For those who still would like to use npm with Deno, I would love to know why. (I'm genuinely curious, not looking to start a flame war)
Hi Luca, can you provide a bit of detail on how this will be accomplished? Will Deno be analyzing an entire project’s npm imports simultaneously, rather than individually?
Deno already analyzes the entire module graph before the application starts up, so we extract out the npm specifiers ahead of time, do the npm dependency analysis, cache any packages as needed, then start execution.
We'll have custom registry support. For local npm packages, that would be nice to have, but probably something that will be implemented later. It could actually be done in a hacky way with the existing implementation of this we have, but it would be better to have something specifically designed for this.
I hope we figure out a better way to deal with the mountain of dead (and just plain bad) packages on NPM. Not carrying this legacy always sounded like a plus to me, but I haven't used Deno in production yet.
It's also interesting to note how the tone changes after receiving funding. May just be a coincidence, but this looks like a reactionary jab back at Bun, which is barely out of the gate.
Deno has had a node compatibility flag for a while now, and quite a few node packages have worked flawlessly. For anyone who has been using deno for some time this is not a big surprise to see smoother npm import support. I think all the people saying this is such a giant, game-changing big deal or that deno has changed, blah blah haven't actually been using it...
The first individual investor in the funding announcement is Nat Friedman, who defended ICE while at GitHub. I was one of many who signed the petition and cancelled my GitHub Pro account. Node is still a community project, since the creation of the Node.js Foundation. I feel like switching to Deno would be going backwards. https://www.theverge.com/2019/10/9/20906213/github-ice-micro...
I'm excited to see this, and I'm sure they'll work this out, but I sure hope I don't have to keep dependency versions in sync across all of my import statements. I don't have much love for package.json and node_modules, but it is nice to have one source-of-truth for all the dependencies of a project.
Deno's philosophy is to be aligned with web browser standards. I suspect import maps will be the path forward for most projects as it's what browsers will or are supporting today too.
Package.json is a mess, it combines a million different concerns like project metadata, dev environment, scripts, and random key/value pairs along with all the dependency information. It's a node and npm specific configuration that ballooned into a dumping ground for metadata. There's no reason to keep it around in a new runtime.
You create the import map by figuring out what dependencies you want to use and crafting it and your code to reference them. You can do anything--you can import five different versions of the same library but under different names if you want. This was not something even possible with node and npm. You can make it as simple or as complex as your needs require. For most people they'll keep it simple and drop a few import urls directly in their source, never even needing to touch a tool or other file to worry about dependencies.
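For example, an import map that does exactly that trick, mapping two versions of the same library to different names (URLs are placeholders):

```json
{
  "imports": {
    "lib": "https://example.com/lib@2.0.0/mod.ts",
    "lib-legacy": "https://example.com/lib@1.4.0/mod.ts"
  }
}
```

Code then imports from "lib" or "lib-legacy", and Deno resolves each bare specifier through the map.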
They're giving users what they want, an ability to run packages which haven't been ported to work with deno yet. The blog post explicitly makes this clear. This is not some grand conspiracy like so many replies seem to be trying to imply.
> This is not some grand conspiracy like so many replies seem to be trying to imply.
There's no conspiracy. All the awkward workarounds with tedious manual steps to recreate a package.json "dependencies" field are frankly laughable. Deno should bite the bullet and automate this since they already have the dependency resolution built in. No need to "craft" anything.
Seriously, no idea why anyone thinks this is a better option. Seems like more work, and if two packages export things with the same name you have to use aliases. Also, it seems like this could get REALLY messy on large projects.
I LOVE TypeScript, and was so excited about Deno when I first saw it, but the lack of package.json "feature" is a huge turn off for me. I sometimes feel like I'm the only person that has no problem with package.json. I think it works great.
Haven't used deno but I assume the standard `export * as asserts from "https://deno.land/std@0.152.0/testing/asserts.ts";` will work, which is the same effort as a package.json and does not create any identifier conflicts.
In addition to the other solution posted, a global find & replace is... kinda trivial to do in most editors.
Anyway, besides that, I'm sure you can use different versions of a library within one codebase well enough; especially in NodeJS land, nowadays, differences between library versions are small.
> a global find & replace is... kinda trivial to do in most editors.
Global find & replace isn't a real solution here. For example, if I'm reviewing someone else's code, it's harder for me to verify that they didn't miss an import.
In general, anything that replaces a single source of truth (package.json) with a process ("run find-and-replace with this query") will increase the amount of energy spent on being vigilant about that process at the expense of other things.
It would be nice if they included a compat flag for importing without this special syntax. As it stands, there is no clear drop-in replacement path for Deno, and as a result, I have no interest in using it.
I'm not rewriting my work so someone can have fun playing with VC money and abandon mature software.
I wish there were more clarity about how package dependencies will be resolved by Deno. Supporting npm is one big step forward, but without a package-lock.json that locks all child dependencies, it'd be a nightmare to use the new compatibility in production.
Deno optionally supports lock files, though its current lock files are just for asserting that specific URLs have specific hashes rather than controlling where/which dependencies are found. Maybe its lock files will be extended for handling this.
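For reference, today's lock file is essentially a flat map from URL to content hash, something like this (hash elided; shape as I understand it from the Deno manual):

```json
{
  "https://deno.land/std@0.152.0/testing/asserts.ts": "<sha256 of the fetched file>"
}
```

That verifies the integrity of files you already point at, but it doesn't pin which version an `npm:` specifier resolves to, which is the gap the parent comment is asking about.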
This is pleasing to see - no node_modules, no npm install. Just a special import format.
They are not clear if you need to have npm installed or not for this to work. I sincerely hope not - if it requires a locally-installed npm then they've shat the bed I think... that would be a major turn-off for me.
Can't say I've ever been concerned by the performance of Deno, but I'm looking forward to seeing this play out anyway, since faster is always nice to have.
There's a lot of code on NPM in general, and while there's definitely a lot of warts, there's a lot of good code on there as well. It makes no sense to write off an entire package ecosystem just because some of the packages are unmaintained or poorly written. Plenty of NPM packages are useful beyond the JS ecosystem, and I think until you get a better and more reliable system for distributing those packages, NPM is probably going to be here to stay, like it or not.
Deno still has a great standard library, so there's much less need to pull in crufty little packages to do simple tasks. IMHO the npm support in deno is more for pulling in the big packages like wrappers around databases or services and such which haven't yet written or want to write a deno-specific version. Sure you can still pull in junk like left-pad, but you don't need to do that.
The Deno/npm incompatibility up till now means a good number of popular and semi-popular npm packages got a shoddy Deno port done by someone a year ago, then never updated again. That is to say, legacy code got even more legacy.
Btw, I like where it's going but I find it quite sad that there is no official linux ARM64 support yet - which means I can't try to use it on AWS lambda for example.
ARM64 builds are something that has been getting investigated. The biggest problem is that GitHub doesn't provide any ARM64 runners, so workarounds are necessary.
By default they are x86 unless you specify ARM. I find the parent comment a bit confusing. Unless he wants the graviton 2 pricing and perf benefits. Which I 100% support!!! Love graviton!
If we really care about performance, then this split between the runtime and user code is not ideal. What we should be trying to do is bake the user code into the runtime and apply total program optimization. This can be done today with C, C++ and Rust. However, in principle JavaScript code could be transpiled to Zig and then baked into Deno.
JITs (like those found in V8, JSC, and SpiderMonkey) can theoretically compile better code than statically compiled languages like C, C++, and Rust, because the runtime can collect information about what is being run and then generate code accordingly.
This is true in some situations, but I think is mostly a just-so story told about JITed languages.
Here are the problems:
1) Often, you have lots of code that only runs once (e.g. UI initialization) or runs rarely (click handlers). By the time the JIT has had an opportunity to observe this code and optimize it, it may be too late.
2) The dynamic nature of JS means that making your code JIT-friendly requires deep knowledge of which language features to avoid (see the sketch after this list).
3) Performance isn't only driven by having the tightest possible machine code. Memory layout is often more important on modern processors. Indirection causes cache misses, and JS objects contain far more indirection than equivalent structs defined in C/C++/Rust. Further, in JavaScript you can't really create cache-friendly collections of objects like vectors, BTreeMaps, etc.
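A small example of the second point (engine behavior paraphrased; details vary by engine and version):

```ts
// V8 specializes a hot function for the object shapes ("hidden classes")
// it has observed at the call site:
function mag(p: { x: number; y: number }): number {
  return Math.sqrt(p.x * p.x + p.y * p.y);
}

mag({ x: 3, y: 4 }); // one shape seen: the call site stays monomorphic
mag({ y: 4, x: 3 }); // same properties, different creation order: a new
                     // hidden class, so the call site goes polymorphic
```

In C/C++/Rust the struct layout is fixed at compile time, so none of this bookkeeping exists, which also ties into the third point about indirection.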
The "JITed languages can be faster than natively compiled ones" meme has been making the rounds at least since Java gained JIT support. It never worked out that way (except for niche cases), for several reasons:
1) Gathering information at runtime competes with the actual program for resources.
2) Optimizing at runtime competes with the actual program for resources.
3) Most importantly, JITed languages tend to be pointer-heavy and full of indirections, and there's not much a JIT compiler can do about that.
But in practice they take a long time to warm up, may or may not get faster, and in general don't attempt whole-program optimisation. And some parts of the Node standard lib are native code, which can't be optimised at runtime.
With normal Deno dependencies, you could use an import map to accomplish patches like this. Not sure if that currently is planned to work for npm sub-dependencies but it seems like a natural place for that to be implemented.
This chips away at one of the showstoppers for Deno for me, which is good.
But while "the vast majority" of npm packages don't require a gyp build step for native addons, some of those modules are pretty important, and I see no indication in the announcement that they're also going to be implementing the Node C API or the gyp build process.
Right now I'm working with a machine learning project, and XGBoost [0] is a direct Node.js extension [1] through the binary interface.
So this does bring things a step closer to being generally usable, but there are still significant roadblocks.
A WebAssembly build of XGBoost could work with Deno, but aside from some guy's unsupported side project/proof-of-concept for use in a browser, I'm not seeing an XGBoost WebAssembly build. And generally when deploying something like a machine learning model I'd rather use well-supported tools than to need to dive into the rabbit hole of maintaining my own.
And yes, XGBoost will likely eventually have that kind of support for Deno, but then the next bleeding-edge project will come along and only support Node.
Even assuming Deno eventually hits a tipping point in popularity where everyone wants to release Node _and_ Deno support in their bleeding-edge projects, there are still things that I miss from package.json that don't seem to exist in the Deno ecosystem.
Things like the "scripts" block: A nice centralized place to find all of the things that need to be done to a project, plus auto-run script entries that can trigger when a project is installed. And inheritable, overridable dependency maps (see the yarn "resolutions" block).
I'd love to jump into Deno, but I think there has been far too much "baby thrown out with the bathwater" to its design. It's the classic development problem of looking at a system and seeing a ton of complexity, but not really understanding that all of that complexity was there for a reason. Maybe when it re-evolves 80% of Node's and npm's features I'll be convinced to make the jump. I'm a huge TypeScript fan after all. But it still strikes me as a violation of "As simple as possible, but no simpler."
[0] XGBoost is a _very_ promising approach to machine learning, training models much faster and with much more accuracy than traditional approaches.
Finally. This kind of freedom of choice was sorely missing in the JavaScript world.
Not only will I be able to choose a package manager, frontend flavour, backend flavour, build system, and unit testing framework, now I will be able to choose which runtime to pick as well. I can't wait to see how productive my team will be when this ships.
> the next release of Deno will include a new HTTP server. It is the fastest JavaScript web server ever built.
I feel unease reading untestable claims (with no numbers) about future releases. I'm made more uneasy by the fact that this is the second bullet point of the main tl;dr. It makes me wonder, what are the organizational incentives driving such a publication?