Deno 1.34: Deno compile supports NPM packages (deno.com)
271 points by unripe_syntax on May 25, 2023 | 90 comments



Deno is becoming more and more useful, but at the same time it feels like the project is moving further and further away from ry's initial presentation.


For Deno to grow, it needs to be used in production systems. Deno seems to be betting most of its business on serverless/edge deployment (Deno Deploy, Supabase functions, Netlify functions). We also support it for the self-hosted infra/internal tools production use case at windmill.dev. The #1 feedback we get is that people are frustrated that they have this one script or library they cannot use that otherwise works with Node. Deno is a marvelous piece of engineering that requires a team of very qualified engineers; those need to get paid, so some tradeoffs need to be made.


Maybe, but as an(other) aficionado of Deno for smallish projects where Node is irrelevant, I don't feel like this pandering to the mainstream hurts anything.

To the contrary, they make every conversation about using Deno easier, even where we don't and won't use Node (including NPM packages).

I mean sure, they consume some of their resources that might in some ideal (to me) world be spent on something more relevant (to me), but in this world it seems like engineering resources well spent. ¯\_(ಠ_ಠ)_/¯


Agree; I preferred Deno as the small-ish, Web API-compatible, focused JS/TS runtime that wasn't Node compatible. I personally feel all this NPM-compatibility is a big waste of time.

That being said, I still like Deno.


It's become clear that it's a hard requirement for wide adoption, especially with the new competition from bun. Personal anecdote: I tried to pitch Deno last year at my job, and as soon as I mentioned that it can't use the NPM ecosystem they went "ahh..." and the conversation basically stopped.

So I don't blame the team for being practical. You can still make things better without a clean break.


> especially with the new competition from bun

Competition seems to always lead companies to drop what makes them special in pursuit of the other guy's customers. They could have chosen to let bun take the npm-compatibility crowd and stick to their guns, but VC funding precludes that.


You're making some bad-faith assumptions.

Ecosystems die without adoption. A runtime that plugs its ears to the practical needs of users is a runtime that nobody but enthusiasts ever use. And you can't build any kind of sustainable business on an enthusiast-only runtime, VC or no VC.

Idealism requires pragmatism to succeed.


I don't disagree with your general sentiment, but you can be pragmatic and useful to users without bending to their every need. It doesn't mean you're plugging your ears; it means you're being careful. Node and NPM are moving targets; chasing compatibility itself is not practical, and Dahl explicitly mentioned in his intro to Deno that Node compatibility was not a great idea (he was right).

If your team/company is reliant on Node packages, it does not make any sense to go for another runtime.


Here's an example where NPM compatibility allowed me to use Deno where I otherwise couldn't:

I've got a personal blog that had been on Node. I wanted to port it to Deno. I ported all the server code to be Deno-idiomatic, I dropped dependencies in favor of Deno standard library functions, etc. It was great.

But I needed a markdown parser with specific special features that my corpus of existing documents already used. I researched Deno-native ones, I found one or two half-baked markdown parsers but none that supported the features I needed.

So, with a little fiddling, I got the (mature, featureful) NPM markdown library I'd been using before, working directly within the Deno version of my site. Markdown in -> HTML out. Everything else is lovely pure Deno code, and this one small piece of functionality didn't become a show-stopper because the team chose to support what I needed.
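(For context, the mechanism Deno exposes for this is the `npm:` import specifier. A minimal sketch, using `marked` as an illustrative package, not necessarily the library I used; it needs Deno installed and network access on first run:)

```typescript
// main.ts — Deno resolves and caches the npm package itself;
// no package.json or node_modules directory is needed.
import { marked } from "npm:marked@9";

console.log(marked.parse("# Hello from *Deno*"));
```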

I tend to be an idealist in my programming - that's why I like Deno - but the absolutism I'm seeing in this thread is bizarre. An ecosystem is a huge asset, and you don't rebuild one - especially one as vibrant as NPM - overnight. It takes time, and in that time people can stop caring about your runtime and just use another one so they can get on with their lives. Projects like Deno are beholden to network effects; if you shun the network, the project dies.


That is a weird example, because you make it sound like you could easily have just extracted the Deno-compatible JS, made a regular ECMAScript module, and avoided an NPM compatibility layer entirely.

In fact, I did this exact thing to use marked[1] with QuickJS[2]. Obviously, QuickJS doesn't have an NPM compatibility layer because it doesn't need one, and neither does Deno. The great thing is the resulting .js file can easily be a Deno module as well.

> I tend to be an idealist in my programming - that's why I like Deno - but the absolutism I'm seeing in this thread is bizarre

I don't really see any absolutism in this thread, it's just a disagreement. Yes, NPM compat is great for adoption. I'm arguing it's not great for the runtime, and that it adds tech debt, and slows the team down from doing (imo) more creative things. It also adds more to the binary.

[1] https://raw.githubusercontent.com/markedjs/marked/master/lib... [2] https://bellard.org/quickjs/quickjs.html


> because you make it sound like you easily could have just extracted the Deno-compatible JS, and made a regular ECMAScript Module and avoided an NPM compatibility layer at all

"Easily" is relative. I'm sure one could port the library I'm using to be Deno-compatible, maybe with less trouble than some other libraries because it's unlikely to use many system APIs, but it's also a large, significant, mature library; we're not talking about leftpad here.

And I'm just making a blog site. I want to get on with my life, not spend hours or days fiddling with all the long-tail incompatibilities I would probably run into making a Deno-compatible fork of a major library that I'm not planning to maintain (not to mention any dependencies it itself has!)

If I'm writing a new library that I've already decided to put effort into, I'm likely to make it Deno-compatible from the get-go. But porting something significant to Deno is not trivial, even where it is straightforward. Things take time, and over a whole ecosystem it adds up, and we can't erase that labor gap from the discussion.


There is a difference between Node compatibility and NPM compatibility though. NPM may have started with Node, but it has become part of the core development process for front-end web development. One of the main purposes of Deno is to decrease the gap between frontend JS development and backend JS development, and adding NPM support furthers that goal.

What doesn't make sense is adding a compatibility layer for all the Node specific functionality that has since become redundant with new Web standards.


Given that a significant number of packages in npm won't work without Node compatibility, I'm not sure the distinction makes much of a difference.


> If your team/company is reliant on Node packages, it does not make any sense to go for another runtime

If Deno does some things better than node, and nothing worse, then NPM support means users can gradually migrate from node. Any of those improvements might be enough motivation if they help your use-case.

AFAICT, there still isn’t a killer reason for my mature project to migrate, but removing the NPM blocker means I can consider it as soon as they release something that fits my use case.


> Competition seems to always lead companies to drop what makes them special in pursuit of the other guy's customers.

Can you explain in your own words why supporting the world's leading package manager system makes Deno "drop what makes it special"? Does it make any sense to argue against gaining access to all conceivable dependencies?


Turns out, in the real world, arbitrary complexity under the hood which is almost completely invisible to the user, is less of a drawback than not having access to the largest ecosystem of 3rd party libraries that exist for the platform.

I greatly prefer a project that is willing to reconsider its founding principles than sticking dogmatically to them even when they are being harmful to the project and its users.


You make a great point. No matter how high the complexity level is, if interfaces and abstractions are trivial to follow and they "just work" then the system is actually simple and trivial.

I'm seeing poorly advised commenters in this thread trying to argue that reinventing the wheel is simpler than just using industry standards that are readily available and "just work".


I'll do you one better, and explain it in Ryan Dahl's words [0]. Getting away from NPM, node_modules, and package.json was one of a very small number of founding principles for Deno as he introduced it:

> Linking to a package requires a lot of components, a lot of systems. The problem that I have with [package.json] is that it gives rise to this concept of a module as this directory of files, where that wasn't really a concept before, where we just had JavaScript files. ... It's not a strictly necessary abstraction. And package.json has all this unnecessary noise in it.

> ...

> [In] Deno I want to simplify the module system, so screw all this stuff about how Node modules work... it can't be compatible with Node, otherwise you end up building Node, so there's no attempt at compatibility with existing software.

[0] https://youtu.be/M3BM9TB-8yA?t=1256


> The problem that I have with [package.json] is that it gives rise to this concept of a module as this directory of files, where that wasn't really a concept before, where we just had JavaScript files

> > [In] Deno I want to simplify the module system, so screw all this stuff about how Node modules work... it can't be compatible with Node, otherwise you end up building Node, so there's no attempt at compatibility with existing software.

I think that Deno's module system is still incompatible with Node's: supplying a node_modules directory of files in your project root, at run time or at build time, still won't work. Deno is now just doing the work to ingest npm packages and process them into Deno modules at build time, if you want to use them. That doesn't seem out of line with Dahl's vision to me.


[flagged]


I linked to the second half of the quotation. The first half was far from a passing comment, he spends a solid 3 minutes on his substantial problems with package.json: https://youtu.be/M3BM9TB-8yA?t=589

That first half is essential for understanding the second half. It sounds like a passing comment if you didn't watch both segments because he's assuming you already have the context for why he's ditching node.

Since you're assuming bad faith I won't be monitoring this thread any more. If you'd like to have longer, substantive discussions on HN, I'd suggest knocking off the name calling.


> the world's leading package manager system

What, exactly, do you mean by this? Just that NPM is where the most publicly available JavaScript code lives?


> I personally feel all this NPM-compatibility is a big waste of time.

You feel like being compatible with one of (if not the) biggest package management ecosystems for any programming language is a big waste of time?


Yes.


I'm using deno in production and npm compat is, for me, a must.


If you don't use the node compatibility do you even notice or care one bit about it? It doesn't hurt a pure deno project at all.


If Deno has no Node compatibility then you know that no Deno library pulls Node modules in with it, because it's impossible. The more seamless node interop becomes, the harder it will be to sift the Pure Organic Deno libraries from the NPM-infused Deno libraries.

So, on the contrary: Node becoming embraced by the wider ecosystem absolutely hurts pure Deno projects because they'll rapidly become perceived as a fundamentalist minority that doesn't need to be catered to, where the original idea was that all projects would be pure.


Seems like it could easily just be a compiler flag to 'only allow pure deno' and then fail/warn if a dependency pulls in npm stuff.


I think a flag like that shouldn't have to exist. It's complicating and bloating the runtime more just to gain more adoption, not to make Deno actually better.


I think not wanting to pull packages in from the largest package ecosystem in JS land is a fairly marginal need which should only be catered to through a setting, if at all.

You can always not add NPM packages if that’s something that bothers you, and there are probably fairly simple ways to add a lint step that ensures you’re doing that by simply scanning your package.json (which makes me think…can’t you simply not add a package.json? The original direct URL library references will be around anyways).


You just have to also lint your URL imports for "npm:" URLs. That's an easy enough lint. (Though slightly more complicated if you also decide to block in your lint rule other common npm-based CDNs like unpkg/esm.sh/others.)
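A minimal sketch of such a lint pass, in plain JavaScript (the prefix list is just the sources named in this thread, and the import regex is deliberately naive):

```javascript
// Specifier prefixes to flag: Deno's npm: scheme plus common npm-backed CDNs.
const BLOCKED_PREFIXES = [
  "npm:",
  "https://unpkg.com/",
  "https://esm.sh/",
  "https://cdn.skypack.dev/",
];

// Scan source text for import specifiers and return the blocked ones.
function findBlockedImports(source) {
  const importRe = /from\s+["']([^"']+)["']/g;
  const hits = [];
  for (const match of source.matchAll(importRe)) {
    const specifier = match[1];
    if (BLOCKED_PREFIXES.some((p) => specifier.startsWith(p))) {
      hits.push(specifier);
    }
  }
  return hits;
}

const example = `
import { serve } from "https://deno.land/std/http/server.ts";
import { marked } from "npm:marked";
`;
console.log(findBlockedImports(example)); // flags "npm:marked" only
```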


I don't buy it, you can already use stuff like skypack or other bundling CDNs to import npm modules in deno just by their URL.

You have a very bizarre argument that I don't follow. I don't care about the "purity" of the project, I care about getting shit done and deno is helping greatly there.


> I don't buy it, you can already use stuff like skypack or other bundling CDNs to import npm modules in deno just by their URL.

It's not that simple if the project you are importing uses Node-specific APIs. By the way, this is an area that can really confuse newbies.


Then you're stuck with a CDN you don't control. The runtime (Deno) being able to work directly with most packages on npm means I don't have to trust a third party for this anymore, and that to me is a big win.


Agreed, I'm just refuting the reply above that deno recently besmirched some bizarre purity by adding friendly node NPM support. From the very beginning of deno you could import any JS code, including simple stuff from npm that didn't rely on then unsupported node APIs.


On the other hand, at $DAYJOB, being able to say that Deno has access to everything node can do goes a long way. Not having to reinvent the wheels is definitely nice!


True but I think trying to make a tool useful to everyone's $DAYJOB is kind of overrated. Instead of trying to make the tool the best it can be on its own merits, you're trying to please the millions of folks who already bought into another ecosystem.


This is tough/impossible when you're VC-funded like Deno is. Once they took VC funding, they didn't have a choice but to shoot for the moon. Deno without NPM support wasn't seeing the uptake they wanted.


Yep, agree. I think things went a little wayward when they took the VC funding.


Depends on your role and organizational goals. If you work in application security, it makes Deno harder to recommend, because then the recommendation (or perhaps even mandate) has to come with extra instrumentation in order to enforce the additional security the runtime is supposed to bring.

It also makes you question the priorities of the project, given that this is still early days. If the supposed focus on security is already being compromised on and whittled away today, would a recommendation to build with Deno even prove meaningful in 5 or 10 years?


No. Deno would've been widely adopted if they were compatible with Node from the beginning.


One of the main takeaways I got from Ryan's Deno intro was pragmatism over preferences.


Are we talking about the same talk? 10 Things I Regret about Node.js [0]?

Goal #2 (after security) was to simplify the module system, and the very first thing he said about that was no node compatibility.

[0] https://www.youtube.com/watch?v=M3BM9TB-8yA


Supporting NPM allows the entire ecosystem to gradually migrate from CommonJS/NPM packages to ESM/Deno imports.

New developers in Deno will still write pure code - not NPM code that has slow require; they don’t bear any mental complexity - it’s only legacy code that pays a runtime complexity cost.


I disagree. If it had become powerful enough for them to support node/npm sooner they would have supported it sooner. Pure Deno stuff like fresh is still packaged in a Golang-like way.


another perspective is that you can gradually move away your project dependencies into deno while still using node modules.


In practice that's not going to happen, though. People will add their npm dependencies and move on, and instead of solving JS dependency hell Deno will become a layer of abstraction on top of it.

A lot of us were hoping Deno would do the Gopher thing and force a complete reset. Yes, that means less/slower adoption, but it means that when you pick up a Deno library you know you're not getting all of npm with it. The further we move along the less likely it seems that any Deno project will be able to be npm-free.


What’s the “gopher” thing? A fringe system no one uses in the long run isn’t a reset. It’s a blip.


I'm very happy with the direction Deno is taking in its development. At the same time, I'm not sure when I would feel comfortable deploying something to production with it. Is anyone here running Deno in production?


We run a medium sized trading system (20 microservices) in Deno for about a year now. There have been a few runtime quirks but they have currently all been ironed out. I love the approach that Deno takes with respect to permissions which allows you to strictly specify which urls your program can access, which files/directories can be read/written, etc.
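For anyone who hasn't seen it, this is roughly what it looks like on the command line (the hostname, path, and variable name here are hypothetical):

```shell
# Everything not explicitly allowed below is denied by the runtime.
deno run \
  --allow-net=api.example.com \
  --allow-read=./config \
  --allow-env=API_TOKEN \
  service.ts
```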


You are most likely aware, but just to point out: Node now also has a version of the permissions feature [0] (not sure on parity, but it was obviously Deno-inspired) behind an experimental flag!

[0] https://nodejs.org/api/permissions.html


How are JS's scientific computing and stats libraries?


Python.js?


There are brain.js and tensorflow.js.

It's nowhere near Python, though.


I have a small Deno powered bot that generates Shopify listings from some inputs. It’s been running for a few months with no crashes or restarts.

I think it really comes down to what APIs or packages you need. I have had trouble with projects such as Prisma and wouldn’t do that in production as the generated output is slightly different for some reason (haven’t had time to inspect).


We have also run it in production for about a year, but it's a tiny metrics collector and doesn't do anything fancy at all. But writing and deploying it end to end was a fantastic and quick experience.


Why wouldn't you be comfortable with running it in production?

Just don't use the parts labeled as "experimental" or "beta" and it's very production-ready and stable.


Because things like adoption and longevity of a project correlate with it not being abandoned, etc. Dealing with abandoned tech in your stack sucks


It's still just V8 under the hood, so IMHO it's not nearly as risky as if someone had just written their own bespoke JS interpreter. And everything around V8 is in Rust, so there's less chance of catastrophic footguns and crashes (still non-zero, but way more trustworthy than a new C/C++ project).

IMHO the biggest risks would be misunderstanding of capabilities/configuration particularly around its stricter sandboxing, i.e. forgetting it won't let you access file system or network by default and your testing failing to capture some dependency there.


Can anyone comment on the experience with the feature of producing a single binary? Is it really a standalone binary with all the NPM libraries and everything without needing anything else from the host machine? For example can I ship a self-contained server with it that carries its own html/css/js?


Yes, up until the html/css/js part. There is a tool that puts all your files into a JS object and serves them like a virtual file system. Obviously that doesn't go well with bigger files. I wish there was a way to use PE's/ELF's existing mechanisms for embedding files, especially for memory-usage reasons.
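Roughly, the generated module looks like this (a hypothetical sketch of the pattern, not any specific tool's actual output):

```javascript
// Assets are baked into the binary as a plain object keyed by path,
// acting as a read-only virtual file system at run time.
const STATIC_FILES = {
  "/index.html": "<!doctype html><h1>hello</h1>",
  "/style.css": "h1 { color: rebeccapurple; }",
};

// Look up an embedded file; throw if it wasn't bundled in.
function readStatic(path) {
  const body = STATIC_FILES[path];
  if (body === undefined) throw new Error(`not embedded: ${path}`);
  return body;
}

console.log(readStatic("/style.css"));
```

Note the drawback mentioned above: every asset lives in memory as a JS string, which is why this scales poorly to big files.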


The self contained executable & the strict permissions is what catches my eye the most!

Love the direction Deno is headed.


what tool is that?


Isn't one of the main selling points of Deno the new package management system? Wouldn't supporting NPM effectively nullify the benefits?

Will there be a strategy to avoid NPM ultimately becoming the de facto package manager for Deno?


Keep in mind there is no complete feature parity with NPM. This only allows you to use packages not present in Deno's ecosystem, but it does not guarantee it will work flawlessly (and in practice it can be a pain to use some packages).

The best way to use Deno and take advantage of its benefits is to use Deno's packages. Allowing NPM packages is a good strategic move, because on top of already being quite different from Node, Deno does not have many native packages yet; and NPM support does not take away from how well Deno-native packages work.


Sure. But no one wants to use Deno because there are no packages, and so there's no benefit over npm. No one wants to link to random GitHub files.


I like Deno. The benefits to me are largely around developer experience and tooling. I much prefer Deno's dependency management solution to Node's.

Additionally, lots of packages have made their codebases more runtime-agnostic as a result of Deno existing, so there definitely are Deno packages out there. It just depends on what you're building!


With all of this Node compatibility work, it should be able to run Next.js at this point. I wonder what fails when one tries to do that?


Bartek from the Deno team here. We're currently missing polyfills for the IPC module needed to use the `next dev` command.


Honestly, I don't have an issue with Node's approach to module resolution, especially when combined with a package manager like pnpm that symlinks dependencies.

Multi project workspaces are easy to manage with pnpm and publishing to an npm registry is fine enough.

I thought about git as a package distribution mechanism (like Go) but it adds complexity in how you handle transpilation for releases and I'm not sure how versioning works in the context of multi-package projects/workspaces (how does Go do it?).

I just want to use TypeScript on the back end without a billion years of setup.


Producing an SBOM would be a very nice feature to add to this.


Can it compile native modules like serialport into an executable? or am I dreaming too much?


These modules can be included, but currently there's no way to use them: they need to be present on disk to be loaded as dynamic libraries.

We're debating how best to tackle that. If you have a specific use case in mind, I would appreciate a feature request in our issue tracker.


Does this mean we can use npm packages with Deno Deploy? Or is that another issue?


It's another issue, but it's close.


That's super exciting, can't wait to test it.


I wonder if deno will ever converge with node the way io.js did.


Not really, because it's not a fork and is written in a different language.


There are apparently plans to add a deno-style permission system to Node (I think there is experimental support behind a flag in one of the recent builds) but I don't think it will become the default the way deno does it, at least not in the foreseeable future.


I put this in another comment also but yeah, the permissions feature is documented here https://nodejs.org/api/permissions.html

It definitely seems that Node is surfing in deno's wake here and taking heavy inspiration from some of the more successful features.

I guess remains to be seen if this will follow in something like the Java / Scala / Kotlin model or if it'll lead to more of a merge a la io.js - I tend to think that a merge fits better since we're dealing with runtimes and not separate languages, but I think the former model could work too.

Either way, great thing for the ecosystem!


Unless Node converts to Rust, the future is Deno or a Rust-based approach. The main reason NOT to use Node is the whole C backend.


As an admin and not a dev, why would this be beneficial to me, managing a fairly standard app and node/yarn frontend? Sell it to me :)


For an admin, I'm not sure. The deployment and CI/CD story may be cleaner? The real win to me is the developer experience. Deno comes with everything: test runner, linter, typescript, language server, etc. That's the winner for me.


Cheers, I worded that badly I suppose but you answered the question I meant :)


Disclosure: I work at Deno

For example, if you use Deno first frameworks like fresh and use Deno Deploy, you’re eliminating build steps. You git push and the application is live across the world in 1-2 seconds. It feels like magic.


Honestly, it would have felt like magic back in 2009. Currently there are huge waves of change, with everyone trying to offer a utopian ecosystem. If the tendency is to gather a community, it seems they need to chase companies, given that many have somewhat stable stacks. Reshaping existing projects requires far more sacrifices. For example, the major shift from Vue 2 to Vue 3 disrupted the entire Vue 2 plugin ecosystem. People barely get motivated to rewrite the same thing for the sake of the Composition API while the Options API remained. I don't know the exact reasoning, but there is huge tech debt in the Vue 2 plugin ecosystem. So again: sacrifices, sacrifices.


Why should I use Deno?


For me it is convenience: tooling like fmt, testing, etc. is built in, and they try to keep the APIs close to browser APIs when possible.


I've no usage experience myself, but I'd say: running (and now packaging or compiling) modern JS or TypeScript directly, without having to build a pipeline with tools like webpack.


I don't think Deno's gonna help deal with something like Webpack, is it?

Webpack's for building browser bundles, Deno's about running TypeScript server-side, isn't it?


It's hype, you need another reason?



