Bun v1.0.0 (bun.sh)
1089 points by tuananh on Sept 8, 2023 | 367 comments



I work on Bun. Happy to answer any questions

Could the mods change the link to the blog post? It explains things better than the GitHub release page.

Blog post: https://bun.sh/blog/bun-v1.0



Thank you


  > The transition from CommonJS to ES modules has been slow and full of terrors. 
  > Bun supports both module systems, all the time. No need to worry about file
  > extensions, .js vs .cjs vs .mjs, or including "type": "module" in your 
  > package.json.
  >
  > You can even use import and require(), in the same file. It just works.

This is the highlight for me. The Node.js ecosystem is more-or-less completely broken otherwise. This might save it.

(I think that the most impressive thing about Bun isn't its performance but the pragmatic, developer-friendly choices that have been consistently made by Jarred.)


This problem is very old at this point. I'm doubtful that Bun has "solved it" any more than other tools. They've just selected a different set of trade-offs that will result in a different set of packages not working. Is there a reason to think they've cracked some code that other tools were unable to?


In Bun, you can use import and require in the same file. There are three steps to what makes this work:

1) We patched JavaScriptCore to load ES Modules synchronously when they don't use top-level await. The main tradeoff here is that requiring an ES Module which uses top-level await is unsupported (Bun throws an exception when you try to do this)

2) To support require() inside ES Modules, we add `import.meta.require` and have a transpiler integration to convert calls to module.require() or require() into import.meta.require(). This either loads the file as an ES Module or as a CommonJS module, depending on what the transpiler says it is

3) We rely on certain heuristics to decide whether a script is an ES Module or a CommonJS module. If they use certain sloppy mode or CommonJS-only features, it becomes CommonJS. If they use ESM features or it's somewhat ambiguous, it becomes an ES Module.
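A toy sketch of the kind of heuristic described in step 3 (not the actual implementation, which lives in the transpiler): scan the source for format-specific markers and classify, letting ambiguous files lean ESM.

```javascript
// Toy heuristic sketch (not Bun's real implementation): classify a module's
// format by scanning its source text for format-specific markers.
function guessModuleFormat(source) {
  // CommonJS-only markers: module.exports, exports.x assignments, __dirname, etc.
  const cjsMarkers = /\bmodule\.exports\b|\bexports\.\w+\s*=|__dirname|__filename/;
  // ESM markers: static import/export declarations at the start of a line.
  const esmMarkers = /^\s*(import\s|export\s)/m;

  if (esmMarkers.test(source)) return "esm";
  if (cjsMarkers.test(source)) return "cjs";
  return "esm"; // ambiguous files default to ESM, per the heuristic described
}

guessModuleFormat('module.exports = { a: 1 }');        // -> "cjs"
guessModuleFormat('export const a = 1;');              // -> "esm"
guessModuleFormat('console.log("side effects only")'); // -> "esm"
```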


While I appreciate the convenience that will result from this excellent engineering, I'm concerned that this could result in a lot of mixed module files that now can only work with Bun.

Is there an option to require each file to use either CJS or ESM features, but not both? If not, may I suggest adding that option?


Seems like a case for ESLint and I agree with you.

Mixed require and import (the latter in both its static and dynamic variants) is already possible with Webpack and Vite.

I hate it and would complain about any code introducing this in any kind of review.

It's great for integration, full stop.

This is an inconvenience where the tradeoffs really show, and it demands studying the behavioral/semantic differences.

Funnily enough, I still find Bun interesting.

Regarding import.meta, I already love Vite's dynamic import transpilation features. They aim to work almost transparently and often do (I am not even talking about the glob feature here).

That being said, I'm unsure if Bun's way of bridging the module ecosystem gap using import.meta will be as useful.

If you ask me, frontend code should ban Node/CJS/require entirely or transpile it into static or dynamic ESM imports, respectively.

Any case where this is not possible unambiguously is a case for human intervention and code rewrite/conversion.


>If you ask me, frontend code should ban Node/CJS/require entirely or transpile it into static or dynamic ESM imports, respectively.

>Any case where this is not possible unambiguously is a case for human intervention and code rewrite/conversion.

Good god, yes


Why go to such lengths to support a legacy module system?

All three patches mentioned will lead to modules that work only in Bun and not in Node/Deno/browsers, further fragmenting the ecosystem.


It's a legacy module system with a lot of code written in it. Without mixed support, you can't rewrite code incrementally to use ES modules. As a consequence, many projects still stay with CommonJS, since migrating everything in one go is a big hurdle.


I’ve often wondered why node doesn’t do 1 in this list. It would solve 95% of the problems, I’m sure.


For 1), what about requiring an ES module that doesn’t itself use top-level await, but imports another ES module that uses top-level await?


That throws since we cannot make it synchronous. It works by reading the internal state of the Promise and getting the value out of it if the Promise is fulfilled.



The compiler is the bundler is the runtime. The reason it’s such a problem on Node is because the tools are separate and often conceptually at odds.


No, it's a problem because the formats are fundamentally incompatible and it's impossible to infer the author's intent. A package with no imports/exports or requires could be either ESM or CJS and no heuristic can solve for that reality.


How common are packages with no imports, exports, requires, or "module.exports="?

I imagine that's quite rare, because such a package would only be imported or required for its side effects; and because that package cannot import or require any others, could you safely just assume CJS if it was `require()`d and ESM if it was `import`ed?


Rare things still happen a lot at the scale of npm. And due to transitive dependencies it winds up affecting a lot of actual users. And this is just one example of an incompatibility, there are many more. Bun didn't discover a silver bullet here that dozens of other tools in the past missed on.


Do you have an empirical basis for that? I'm curious because on one hand, Bun is advertising they are solving a major challenge for library authors, really betting quite heavily on that being persuasive for adoption.

The risk for Bun's adoption is that they're wrong, of course.

The risk for Node's usage is that they're right, and library authors begin urging users to use Bun because it's 100x easier than making a useful library with Node. (This will happen very slowly, but things that happen very slowly can start happening all at once very quickly.)


> And this is just one example of an incompatibility, there are many more

Such as? If you could provide 5-10 of your many examples and why you think that Bun's approach doesn't solve them it would be very helpful.


Yeah, I don't believe GP. Their assertions that Bun hasn't solved this (or that the heuristic used is inadequate) are basically based on speculation without any evidence.


Jarred even came here and agreed with them. It's nothing controversial. Just nuance. No need to flame. DYOR.


Assuming it has no imports or requires, and no references to meta, it’s a bare JavaScript file, and whether you treat it as esm or cjs shouldn’t matter, no?


JS has two top level grammars, one defaults to loose mode and the other defaults to strict mode, among other nuances.

Possibly the most devious nuance is whether the spec's appendix B applies, which affects whether html comment syntax is valid (yes, this is a thing). The html comment token can therefore be parsed either as a comment or as a series of operators depending on the grammar being used.

Effectively, this means it's possible to craft a program that does different things in CJS vs ESM mode.
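A small demonstration of the Annex B hazard described above, runnable in Node (the `eval` call parses its argument with the script grammar, where HTML-like comments apply):

```javascript
// In the script grammar (sloppy mode, Annex B), "<!--" starts a single-line
// comment, so the expression below evaluates to just 2:
const asScript = eval("2 <!-- 1"); // "<!-- 1" is ignored as a comment

// In the module grammar, the same characters tokenize as `<`, `!`, `--`,
// i.e. a comparison against a negated pre-decrement: 2 < !(--x).
let x = 1;
const asModule = 2 < !(--x); // --x is 0, !0 is true, 2 < true -> 2 < 1 -> false

// Same source text, two different programs depending on the grammar chosen.
```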


Yeah, definitely recommend you try it before discounting... I too have been super jaded with the current Node import ecosystem (esp. as an author), but Bun has basically become something of a silver bullet, with some small exceptions.

If I want, I can literally run a TS file with mixed import/require (not that you ever would) directly, without a package.json, tsconfig, dependencies, etc.

In my experience, it's basically Node, minus all the setup/config/build headaches.


In a sense, an opinionated collection of tools _is_ a solution.


This is what I pushed Node to support when they proposed their split syntax and I'm very happy to see Bun adopt this. It is the "correct" way as far as I'm concerned.

Node's split ecosystem has really hurt the entire language and runtime.


Is there a quick summary of the problem somewhere? I haven't worked on frontend stuff in nearly a decade.


From a package maintainer standpoint, trying to support both ESM and CJS in the same published package is a _nightmare_.

I wrote an extensive post a few weeks ago detailing all the pain that I've dealt with and problems I've run into this year trying to modernize the Redux packages to fully support ESM and CJS:

- https://blog.isquaredsoftware.com/2023/08/esm-modernization-...

I've gotten a lot of feedback from other maintainers saying they've run into similar issues.

Also had a couple podcast discussions following up on that topic:

- https://syntax.fm/show/661/supper-club-shipping-esm-with-mar...

- https://changelog.com/jsparty/290


Just my 2 cents/anecdata in response to your tweet about what library authors have to put up with, hoping it helps some of the doomers replying:

> Build artifact formats (ESM, CJS, UMD)

I don't know anyone that ever used AMD/UMD. CJS is loadable in ESM, and tsup makes the two of them trivial.

> Matrixed with: dev/prod/NODE_ENV builds

Not every library has this issue.

> Bundled or individual .js per source

Not a new issue, this has always been a toss up (I prefer .js per source)

> exports setup

Yup this is a pain, especially because I prefer multiple .js files, and I wish tsup would help here.

> Webpack 4 limits

I'm not sure what you mean by webpack not understanding optional chaining syntax, we use webpack 4 and use it all the time. Might be a benefit of using typescript.

And tsup helps with the .js requirement issue.

> TS moduleResolution options

Again it looks like tsup solves this.

> User environments

When is that ever not an issue? Very fair for maintainers to say "nope, sorry, that's too esoteric an environment"

> TS typedef output (bundled? individual? .d.ts, or .d.mts?)

Should match your .js file output 1:1 (.cjs/.mjs et al)

---

All that to say, if you use typescript and don't do anything fancy, you've got a pretty compatible library with just `tsc`

And if you're looking to do something cross compatible, tsup will help a lot.
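For readers who haven't wrestled with the "exports setup" pain being discussed, here's a rough sketch of a dual-format package.json (the file names are illustrative, assuming tsup-style `.cjs`/`.mjs`/`.d.ts` output):

```json
{
  "name": "my-lib",
  "type": "module",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  },
  "main": "./dist/index.cjs",
  "module": "./dist/index.mjs"
}
```

The `import`/`require` conditions route ESM and CJS consumers to different artifacts, while the legacy `main`/`module` fields cover older bundlers that don't read `exports`.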


I maintain a lot of packages consumed by thousands of engineers internally at my job.

We don’t have super rigid standards in many ways; however, that experience has led me to wonder why so many engineers are struggling with dual support of CJS and ESM.

That said, none of those packages are at Redux's level of downloads, and I imagine you hit more edge cases than I do there.

Edit: originally stated hundreds. Turns out we’ve grown enough that I can say thousands now


Oh my god, thank you for this. I expected a bad situation, but I underestimated the badness by several orders of magnitude. Makes me want to never touch JS again.


That blog post is exactly what I was looking for, thanks!


> From a package maintainer standpoint

Like why is this the standpoint that trumps everything else (eg stability of the ecosystem)? Have a tool to solve the problem and be done with it. It's not like with esm you can publish packages free of tooling anyway.


I never said it "trumps everything else".

I'm saying that I've spent much of this year dealing with all the pain around this, and linked an article on my own experiences, and that other package maintainers agree that this matches their experiences.

If you can build a tool that magically solves all these problems, great! Please let me know when that's available :)

(FWIW Bun looks like a genuine improvement on the _consuming_ side of things, but that doesn't help on the _publishing_ side of things.)


I'm having trouble understanding why ESM was needed in the first place. I've always found the require system to be very elegant. What benefits does the module syntax provide?

I forked and customized an old Javascript templating engine and have published it as npm packages. Should I spend effort migrating to the new way?


Jesus, I thought python packaging sucked, but wow.


Still not clear to me what ESM exactly fixes. CJS works pretty well for me. The amount of pain ESM gives in unit tests is enough to ignore ESM. Maybe Bun will make a difference here.


By being declarative instead of imperative:

- The dependencies of a module can be scraped without running the code

- The subset of declarations used in an import can be known

This allows modules to be loaded in parallel, and also allows for things like tree-shaking. The biggest benefits are for web browsers, but all around it's a more constrained way to express dependencies, which allows the bundler/runtime to take more liberties.
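To make the "scraped without running the code" point concrete, here is a toy illustration (real tools use a proper parser, not a regex; this is only a sketch):

```javascript
// Static import declarations can be discovered with a simple scan, without
// ever executing the module. CJS require() calls, being ordinary function
// calls, offer no such guarantee.
const source = `
import { readFile } from "node:fs/promises";
import lodash from "lodash";
export const answer = 42;
`;

const deps = [...source.matchAll(/^import\s.*?from\s+["']([^"']+)["']/gm)]
  .map((m) => m[1]);
// -> ["node:fs/promises", "lodash"]
```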


CJS does not work in browsers. Bundlers fake it well enough you might not feel that pain regularly, but ESM was designed for use in browsers and CJS wasn't.


You could get those benefits through special syntax that chose to not deviate from commonjs semantics.

We are in this mess because TC39 explicitly chose not to give a damn about the Node.js ecosystem.

None of the secondary benefits that ESM gets from its incompatibility with CommonJS semantics (like recursive imports, top-level await, etc.) are IMHO worth the ecosystem disruption.


Commonjs is synchronous. If you used it in the browser, it would block the main thread each time you required a module.

That's bad.


ESM has dual syntax for static and dynamic imports too.

What I am saying is that

import Foo from "foo"

could be semantically equivalent to

const Foo = require("foo")

and

const Foo = await import("foo")

could be semantically equivalent to

const Foo = await Promise.resolve().then(() => require("foo"))

if tc39 designed ESM import spec with CJS compat in mind.

It is fully possible to introduce additional syntax while not breaking the ecosystem.


CJS makes an assumption that all of the main body of a module is synchronous, including its imports, and that the main body of all of its imports also ran synchronously prior to the main body of that module. A lot of CJS modules are built around side effects during module loading, built on assumptions from the way that Node especially synchronously loads everything, that break in any attempt to make the process asynchronous.

It's not just a matter of syntax sugar; it's a problem of bad assumptions and presumptions in the CJS format itself that browsers have no way to paper over (but bundlers can fake by putting everything in the same file). Loading a new file is always an asynchronous operation in a browser. CJS has never supported that.

Those problems were known at the time when Node picked CJS. (AMD was the competing format that worked in browsers. AMD had its own problems and was a pain to work in without tools like TypeScript, but it was compatible with browsers; CJS never was. ESM imports are backward compatible with AMD imports, but because AMD is mostly a dead format in 2023, no browser today actually implements the shims to support that. They were on the table for a long time, though, and can still be ponyfilled if anyone is crazy enough to have an AMD codebase in 2023 that didn't switch to a CJS-based hodgepodge in Node at some point in the last decade and that they can't migrate to ESM all at once. In my experience that mostly just describes what's left of Esri's over-reliance on Dojo's AMD loader for ArcGIS JS, and just about nothing else in the wild in 2023.)

This mess is all Node's fault; it should have never picked CJS in the first place. That was always the losing horse.

(Hypothetically there might have been a place and time in CJS history to force CJS require() to always return a promise and module.exports to always take a function that returns a promise and that hypothetical version might have been compatible enough to import directly from ESM. Nobody actually wanted that hypothetical "ACJS" so it never existed. ACJS could have at least met browsers half-way and might have made the transition less overall painful.)


Nobody is advocating for synchronously importable modules in browsers. Also, people had already written support utilities like require1k [1] to support cjs in browsers. We don't necessarily need a full bundler - we only need a module analysis step between parse and execute which a native js engine is quite well positioned to facilitate.

Having said that, the fact remains that vast majority of pain points that people who actually need to maintain isomorphic libraries face today have nothing to do with synchronous vs asynchronous nature of cjs and esm modules.

It is the myriad pointless nuances, like default exports and namespace imports, that are the biggest sources of headaches in day-to-day work. In CJS there is a simple model: a module exports an object, and when importing you import that object. Instead, now we have a situation where we need to deal with:

    export default { 
        foo() {...} 
    }
is not the same as:

    export function foo() { ... }
and

    import Foo from "foo"
is something different from

    import * as Foo from "foo"

Plus the additional complexities introduced by import bindings being live etc. are just annoyances that one has to deal with over and over again every time module interop is involved.

[1] https://github.com/Stuk/require1k
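A toy model of the distinction drawn above, with plain objects standing in for real module namespace objects (a sketch, not how a loader actually builds them):

```javascript
// `export default { foo() {...} }` produces a namespace with ONE export, "default":
const nsDefault = { default: { foo: () => "foo" } };

// `export function foo() {...}` produces a namespace with a named export "foo":
const nsNamed = { foo: () => "foo" };

// `import Foo from "mod"` reads the "default" binding:
const FooFromDefault = nsDefault.default; // has .foo
// `import * as Foo from "mod"` reads the whole namespace object:
const FooStar = nsDefault;                // .foo is undefined; it lives on .default

FooFromDefault.foo(); // works
FooStar.foo;          // undefined — the interop gotcha described above
```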


Every few years I end up having to write a bunch of JavaScript-- and recently it was time again for me to dig in. The `import` vs `require` thing caused me HOURS of pain. The modern JS ecosystem is truly painful if you're not using a react app or some off-the-shelf solution-- and I am not, I'm writing a small library. I honestly can't even fathom how difficult and maddening it has to be for folks just getting started.

The fact Bun solves that problem is more than enough of a reason for me to give it a try, I'm honestly fucking stoked to try it out, and I never thought I'd have those feelings for yet another JS runtime or build tool. Big thanks to the Bun folks, I'm looking forward to gaining some sanity in my day.


I've been creating module projects as the default for nearly a year now. It's been largely fine for the past 6 months, since TypeScript 5.0 released with the "bundler" resolution strategy.

Very rarely run into issues, and I have gotten around them with pnpm patch and meta modifications in the past, until projects have fixed the issues.

My take is 2023 is the year of the module. The water is fine.


So almost 10 years after modules were added to the spec, a few years after they've been available in most browsers, the past 6 months have been largely fine to use them for _new_ projects.

That doesn't paint a great picture.


When I say fine, I mean I've had no complaints with any aspect of my workflow since TypeScript released the bundler resolution strategy, which mostly just means you no longer need to specify extensions when importing.

I don't know about the past 10 years as I haven't tried to use modules for the past 10 years. I've been using them for the past year on new AND existing projects with Vite+React, Vite+Storybook, SWC+Fastify, Rollup and library projects, and etc.

Unless somebody has to work in the past, I'm not sure what relevance that has on using modules in 2023.
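For anyone curious, the "bundler" resolution strategy mentioned above amounts to roughly this tsconfig sketch (TypeScript 5.0+; the `module` and `target` values here are illustrative):

```json
{
  "compilerOptions": {
    "module": "esnext",
    "moduleResolution": "bundler",
    "target": "es2022"
  }
}
```

With `"moduleResolution": "bundler"`, relative imports no longer need explicit file extensions, matching how bundlers like Vite and esbuild resolve modules.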


I don't usually comment on downvotes and I have just under 5k karma so it's uhh, just no thing to me typically..

It's also against the guidelines, but again 5k karma...

That said, kinda curious now downvoted into negative for sharing my, extensive, experience creating module type projects.. Hrrmmm.


agree. i ought to be a javascript expert by now but nothing makes me feel dumber than navigating all the type:module and .js/cjs/mjs/ts stuff. for a production app of course you should be clear about what you are writing. but for prototyping Bun's permissiveness lets me write code first and worry about the rest later.


> for a production app of course you should be clear about what you are writing

Agree in sentiment, but I don't think this even applies to the module/import mess. My experience has been that the difficulty is in using different packages that made different choices, and making them interop with your own code. I have still yet to make top-level await work in a Typescript codebase where I need to import nontrivial packages. I don't think you should ever need to worry about the specific way your modules are imported, not in prototyping or in production.


I've been writing JavaScript for a long... long... time and I've kinda zoned out the last few years with regards to ESM vs CSJ

I only know that I prefer writing import over require, as those semantics and syntax make more sense to me.


For the benefit of the future reader, I’m guessing you meant “CJS”, short for CommonJS.


From my experience, the feeling that "Node.js ecosystem is more-or-less completely broken" comes only when you're trying to create a package supporting both CommonJS and ESM (see https://gist.github.com/sindresorhus/a39789f98801d908bbc7ff3...). Just rip off the band-aid and convert all your code to ESM and you'll be much happier (otherwise you experience something like https://blog.isquaredsoftware.com/2023/08/esm-modernization-...).


"just convert all your code to..."

I remember hearing that once from the Python devs...


I don't use Python, but AFAIK the differences between Python 2 and 3 and the effort needed to upgrade are higher than ESM vs CommonJS. But that's not relevant anyway; I'm not saying that upgrading to ESM-only is easy, I'm saying that in the end, it's easier than trying to support both module formats.


It's anecdotal but I switched to ESM in Node a couple of years ago. Never experienced any problem.

My stack is usually Fastify with official plugins and a PG client.


yes, definitely and completely broken. The require/import differentiation is the most pointless of the changes; it broke my build. Spent some time trying to understand the substantial difference; turned out it's nothing too substantial.


Seems like a bad idea to support both. Encourages bad behavior and half-ass code. I've had no problem moving codebases to es6 without a ton of effort.

Maybe at most, hide the capability behind an execution flag.


You may not be aware that node created a situation where ESM maximizes compatibility when consuming libraries and CJS maximizes compatibility when producing libraries. It's a circular incentive paradigm. I suspect you are consuming libraries using "import" and everything "just works". Well, under the hood you are likely importing many libraries that use CJS "module.exports", because that is what libraries use to ensure their library can be used with both import/ESM AND require/CJS.

The only solution to this problem is to upgrade require to work for importing ESM. Once that is implemented, ESM will become the natural choice for those publishing libraries. Until that happens, library authors are going to continue to use CJS so their libraries are available to everyone.


The last version of Node that only supported CJS libraries has already long faded out of LTS. In my opinion, more libraries should just rip the band-aid off and entirely stop publishing CJS. Sure that will upset some downstream dependencies, but semver major tag it, and many of the downstream dependencies need the kick in the pants to move to ESM anyway.


Node.js v12 doesn't have proper ESM support and went out of LTS < 6 months ago. Have some patience!


Node 12 went out of LTS 16 months ago. 4 months ago was Node 14


Also, Node 12 supported ESM somewhat fine, you just had to use the --experimental-modules flag for Node. If you still have to support Node 12 in 2023 for some odd reason, you probably shouldn't have a problem enabling an experimental flag to support more modern libraries.

Node 10 was truly the last LTS that had zero ESM support, and that support ended 2 and a half years ago.


The problem isn't your code. It's the dependencies of your dependencies.


Is there a better phrasing you could use than "drop-in replacement" for a 1.0 release that has decided to not implement all of "node:"?

The let-down from the first two projects not being a drop-in replacement is jarring, now I have to be skeptical of all communications coming from bun.

For context, both projects use osc which requires dgram. I was able to find a feature request tracking ticket from 9 months ago with no communicated plans to implement, but searching for 'dgram' in this announcement page doesn't show anything.

For a quick suggestion - can you explicitly state what modules are not supported by bun 1.0 and put that in the section related to node.js compat?

edit: found the docs on node support - https://bun.sh/docs/runtime/nodejs-apis . Would be nice to link in the release notes.

A cursory glance at the modules: 17 implemented, 17 partially implemented, 7 unimplemented.


I think the version is actually closer to `1.0.0-beta` than to `1.0.0`. I just installed v1.0.0 and ran `bun repl`, and it failed with exit status code 1; it turns out the repl wants to use port 3000, which was already in use in my case. There are probably a lot of other small things like that; look at the reported bugs on GitHub. So I should also be 'skeptical' about all their claims.

Nevertheless, I am super impressed with their speed and excited about the result. I didn't expect this project to grow to this state so quickly; I thought it would take them much more time. For comparison, Deno was started way earlier and now they are miles behind (personal feeling). I am considering using it for my pet projects.


How is Deno "miles behind?"

I just started a new project, writing the back end. I'm new to JS and TS, and I don't want to deal with NPM and a morass of modules at all. Deno, Oak, and MariaDB seem like a pretty tidy combo. I have routes and queries up and working with no experience in writing server-side code since PHP 5.

What makes Bun better?


bun repl was a last-minute hack we added. All it does is run `bun x bun-repl`, which is currently not an official thing.

The repl is about to be added and rewritten completely


1.0.0 means the API and semantics are stable, not that there are no bugs.


I’m a little surprised they announced 1.0 already. When I tried bun a little while ago, I noticed many bugs reported on GitHub around reading from standard-in and readline. I attributed it to being a pre-1.0 thing, but those bugs have not been resolved as far as I can tell.


What issues are you running into? We added support for setRawMode in Bun v0.8 (a month ago) and that addressed many issues with stdin and readline.


The port being in use is user error; it's not really fair to put that on bun.


The following is probably not relevant for bun. But as an aside:

Most programs should bind to port 0 and let the OS allocate a free port and then report the resulting port to any future clients.


It is when you're trying to run a REPL. You really shouldn't need exclusive access to a port for that.


> Cursory glance looks like for modules - 17 implemented - 17 partially implemented - 7 unimplemented

The question isn't just *how many?* but also *which ones?* Anecdotally, I've been test driving bun for a large application with millions of existing customers. So far, the only issue I've had isn't with bun per se, it's with patch-package not yet supporting the lock file (bun.lockb).

Bun is without question faster than yarn by a wide margin. And I say this as someone who really loves yarn.


All good questions - and every person has different needs.

The fact is each potential adopter either needs to go through and comb over the compatibility page (not what I would expect with a drop-in replacement); or just try and see if it works, and be surprised later if they run into an unimplemented module because they were operating under an incorrect assumption.


In the section related to node.js compat, they state:

> Note — For a detailed breakdown of Node.js compatibility, check out: bun.sh/nodejs.

The detailed breakdown explicitly states which modules are not supported.


Drop-in replacement [1]

---

[1] Not a drop-in replacement


"Drop-in replacement" doesn't imply perfect 1:1 compatibility with no missing features or differences. Otherwise there would be no such thing as a drop-in replacement (in the software world at least).


I think the definition on Wikipedia is correct, otherwise a "drop-in" replacement is no different than any other replacement, making it a meaningless term.

> [Drop-in replacement] refers to the ability to replace one hardware or software component with another one without any other code or configuration changes being required and resulting in no negative impacts.

https://en.wikipedia.org/wiki/Drop-in_replacement


My feeling is that not all replacements have feature parity, but a drop in replacement does.


I don't see a conflict by that definition. The drop in could easily or partially be interpreted from the perspective of the receiver and does not have to be a static property of the replacement.


Sure, and for many projects it can be replaced without any other code or configuration changes.


If a drop-in replacement isn't almost assuredly 1:1 compatible then _what the heck is it_? The nodejs compat page for Bun notes that base64 encoding is still incomplete. Where do you draw the line?


> Where do you draw the line?

That's up to you, and society. Do you accept that tall people exist?

Right. Now exactly what height do you need to be a "tall person"?

This is the same. I haven't used it so I can't say how optimistic their "drop-in replacement" claim is, but they certainly don't need it to work 100% of the time with 100% of project with 0 changes to make that claim.


"Drop in replacement" makes an implicit guarantee about utility. It sets expectations about what I'm able to accomplish by investing time and effort into a solution. I don't expect tall people to have any specific utility.

If you call something a drop in replacement and it is not a drop in replacement, it's simply a lie.


> Otherwise there would be no such thing as a drop-in replacement (in the software world at least).

That's obviously wrong, because there are things that could be different while the API/compatibility is still matching 1:1, such as:

- speed, memory usage, memory safety

- additional features, more capabilities

- simple licensing stuff

Imho, a drop-in replacement actually suggests that you can simply swap out the runtime and expect your code to work out of the box.


To me it mostly implies that wherever the replacement is supported, it doesn't require substantial reconfiguration or learning a different interface. It's about that approach, even when the execution of that approach is limited or incomplete in scope.

It does generally suggest a high degree of compatibility to me though.


podman is a drop-in replacement for docker but yarn is not a drop-in replacement for npm. Makes sense?


Yes exactly. Podman is not 1:1 compatible with Docker. Occasionally you will run into issues e.g. due to the rootless nature. I actually recently had to uninstall Podman and replace it with Docker due to an incompatibility!

But it works enough of the time without any changes that it's reasonable to call it a drop-in replacement.


Either I missed this on first read-through, or they just added it.

The note mentioning this feels buried under a large banner image, instead of near all the other text about compatibility.

Anyway, glad it is there, hope that their communications design improves in the future as at least a few others got caught with mismanaged expectations.


I second this notion. I was really excited to try it out, only to have it fail to run most of our projects:

* can't use AWS SDK v3, because it throws an error when parsing a response;

* can't use octokit (or anything that depends on jsonwebtoken, for that matter), because node:crypto is not fully implemented;

* can't run our tests, because jest.resetAllMocks() is missing;

* in another project, bun test failed to even run, complaining about an invalid call to a shared library.

So for me the local workflow is "run it with bun, see if it fails, re-run with node if it does". At least I can always replace npm with it, I guess? Can't imagine running our services with it just yet. Really excited for the future of the project, because when it works, it really is amazing. But I wouldn't call it 1.0 if the selling feature is Node compatibility.


Congrats!

This has probably been asked before, but has there been any thoughts about moving community chat to a platform other than Discord? Discord has been brought up many times on HN for its accessibility/privacy/proprietary lock-in concerns that don't seem to be in line with the spirit of open-source. Also see [1].

[1] https://drewdevault.com/2021/12/28/Dont-use-Discord-for-FOSS...


Not likely, even in light of how important it is not to use Discord. In the comments of the bun 0.6 announcement, a question similar to yours was downvoted almost enough to get flagged:

https://news.ycombinator.com/item?id=35970869

And in the comments of the bun 0.8 announcement, the same question generated very little interest and some "well, every other project uses it, so..." responses:

https://news.ycombinator.com/item?id=37244294


Yet we should still continue to press against proprietary chat. Developers are the kind of folks wise enough about technology to know better than to support this. There are very good reasons to at a minimum support a bridge. If we have to teach new developers raised on Discord, then that’s a great opportunity for them.

> Choosing proprietary tools and services for your free software project ultimately sends a message to downstream developers and users of your project that freedom of all users—developers included—is not a priority.

— Matt Lee


I see this issue as overblown.

Discord is just an added communication channel, mostly for the community.

Relevant discussions happen on GH.

I am part of numerous programming discords and I just see no issue with them being on discord; again, being there is far from a requirement for anything, really.


> I am part of numerous programming discords and I just see no issue with them being on discord

Has Discord ever asked you to provide and confirm your phone number to continue using it?


I’ve been there. I told the project manager to buy me a burner phone+SIM or I wasn’t using it. The admin disabled the SMS requirement as a result & for a brief project I used it in locked down container not tied to any of my personal info. I also was bombarded with CAPTCHAs any time I tried to use it despite keeping the cookies in that container. It was a bad experience, & I pray I won’t have to interact with it again.


Well, if that's your concern, let's start with proprietary remote repositories such as GitHub and all the CI providers. That's a lot more critical to the development process than a chat app.


I prefer to compare apples to apples. Git is distributed. If you host your repo on Github, you can very well just push that same Git repo to Sourcehut or your private Git server.

But otherwise, yeah, I would warn against potential vendor lock-in, especially if that service is not open-source.


Yeah, the lock-in will come with CI and the pull-request model that requires an account in this case. For the former, Microsoft GitHub largely ate all the competition with its own ‘free’ (gratis) offering in Actions, but you could seek out a different CI or self-host. The latter could be easily circumvented by offering a mailing list or maintainer email—or a mirror elsewhere—but it’s rare to see this with projects on Microsoft GitHub.

Well, and joining most discussions require an account, and if your alternative communication platform is Discord, we see all the same issues of lock-in.


Matt Lee’s quote comes from an article about why you would want to leave the walls of Microsoft GitHub. I would agree that would be ideal too, but developers are even more reluctant with that one—if you think suggesting libre coms gets you a downvote here, try suggesting not Microsoft GitHub.


Hacker news is proprietary too


I don't think anyone uses it for community chat / support. You can also read content without logging in, its content is search-indexed, and it doesn't try to convince you to spend money on it.


So HN wasn’t a perfect example. What about GitHub. You can’t use GitHub’s search feature unless you’re logged in, and it very much encourages you to spend money on it.

The Discord-hate is a double standard that is not applied to any service (to the extent that comments criticizing Discord appear with extreme regularity on threads that should be completely unrelated).


Git is distributed.

> Discord-hate is a double standard that is not applied to any service

What does this mean? Dropbox/Amazon/Google/Windows get criticism across the web and on HN regularly, some of those far more than Discord.

> to the extent that comments criticizing Discord appear with extreme regularity on threads that should be completely unrelated

I don't get it. Are you implying a conspiracy?


GitHub issues are indexed by Google and searchable.


https://www.answeroverflow.com

And I think Deno uses it.


Definitely useful, but this should be a built-in feature of Discord at this point.


Discord _is_ particularly hostile to user privacy, that said.


No, but if your account at one of the big techs gets stuck in bureaucratic hell, the only way to get support is to make a fuss here.


FOSS projects don't organize on HN, it's a public link aggregator.


Discord's accessibility has improved quite a bit. Even a screenreader user in the article's source section agrees:

> What I can tell you is that, to my surprise, Discord’s accessibility has apparently improved in recent years, and more blind people are using it now. One of my blind friends told me that most Discord functionality is very accessible and several blind communities are using it. He also told me about a group of young blind programmers who are using Discord to discuss the development of a new open-source screen reader to replace the current Orca screen reader for GNOME

Discord is honestly a great place for FOSS to house their communications. I find all of the article's claims very haphazard:

> When you choose Discord, you are legitimizing their platform and divesting from FOSS platforms

It is legit? It's free? It has loads of compelling features, is very accessible, and - most importantly... Also, choosing Discord does not take value away from FOSS communication platforms.

> Use IRC

It isn't IRC which is very inaccessible to lots of people new to computers and software in general.


> Discord's accessibility has improved quite a bit. Even a screenreader user in the article's source section agrees:

Yes, but this ignores the other issues the post brings up: Users who cannot afford new enough hardware to make the resource-intensive client pleasant to use are also left by the wayside.

Or: Discord also declines service to users in countries under US sanctions, such as Iran.

> Discord is honestly a great place for FOSS to house their communications ... Discord does not take value away from FOSS communication platforms.

The blog's author, and others such as myself, disagree. Discord chats are not indexable by search engines, so solutions are harder to find. You need a Discord account, and you need to join the channel, to even see chat and use the search feature. Discord is proprietary and non-extensible because of it. Lastly, Discord is also profit-motivated, so they can shut things down or add limitations because they need to maintain a profit, and there would be nothing we can do about it. "Enshittification", as HN users love to say, is practically inevitable.

> It isn't IRC which is very inaccessible to lots of people new to computers and software in general.

Which client are you speaking of? :) The beauty of IRC, Matrix, XMPP, etc is that you have the choice and freedom (without being legally threatened by Discord Inc.) to build your own client.


But discord is a better experience than IRC or any other chat I know of, and I say it as someone who pioneered #quakenet, #freenode and #libera on IRC.

Instead of bashing library authors that have already so much to think, why don't you propose a discord replacement and pay for it, while providing the same ease of use for the users and the same features from channels, threads, third party apps/plugins, video conferencing, etc?


> But discord is a better experience than IRC or any other chat I know of, and I say it as someone who pioneered #quakenet, #freenode and #libera on IRC.

I'm glad you enjoy Discord! And thanks for your hard work. But I disagree, and all of the issues with Discord I pointed out above still persist despite your love for the UX.

> Instead of bashing library authors that have already so much to think

I never bashed library authors.

> why don't you propose a discord replacement and pay for it

I engage in communities that use Zulip and Matrix. Zulip is free for open-source projects [1], and I donate to a small community that hosts its own Matrix server.

> while providing the same ease of use for the users and the same features from channels, threads, third party apps/plugins, video conferencing, etc?

We're kinda getting away from the point here. No one is suggesting that there is a FOSS alternative to Discord that has 100% feature parity.

[1] https://zulip.com/for/open-source/


> Users who cannot afford new enough hardware to make the resource-intensive client pleasant to use are also left by the wayside.

Wait… people can’t run web browsers?


By "client", they mean the web application that is Discord, which is CPU and RAM intensive especially for folks with older hardware.


Not everyone lives in the West & has access to the fastest hardware or fastest networks. Some folks are living mostly off-grid using solar power & each heavy application is chewing thru their limited batteries.


> choosing Discord does not take value away from FOSS communication platforms.

It does, though. It's the same vicious cycle that allowed Microsoft to gain control of the OS market: developers write programs for Windows (because that's where the users are) and users run Windows (because that's where the programs run). The more people agree to use Discord, the more it becomes "the place where the <x> community is", and the harder it becomes for any community to not use it.

9/10 Discord users don't understand or care about this at all, which is why it's of utmost importance that developers - especially open source developers, presumably developing open source software for a reason - encourage discussion on other platforms.


Requiring stakeholders (users, contributors, whatever) in a free software project to use non-free software to fully participate in the community is a disservice to the movement, the users, and software freedom generally.

There's some room for debate about offering Discord as a bridged option. But having the official community chat only on Discord? How is there even a question?


This argument also applies to GitHub, yet I’ve never seen an open source project criticized for hosting issues on GitHub.

Both are proprietary websites owned by for-profit companies. Both require you to create an account in order to participate in discussion.


A couple points:

1. Git is distributed by design. Hosting on Github tends to not be controversial because that code can also live on Gitea/Sourcehut/your private git server at the same time. If Github goes down, it does not really matter. Very different from Discord, where there is no way to actually backup server/channel data, and attempting to do so may be a violation of the ToS and get you IP banned.

2. Your argument hinges on the fact that you have never seen an open-source project criticized, but it does happen. The blogpost in the parent comment even suggests not hosting on Github.


Most sizeable F/OSS organizations track bugs somewhere other than GitHub for exactly that reason!

Don't get me wrong. I get small developers defaulting to infrastructure they don't have to set up and maintain themselves, and I understand wanting to meet people where they're at. But it's definitely a problem that GitHub plays such a central role in F/OSS development, too.


Non-free software? Either this seems misleading or I'm confused. You don't need to pay to participate in the Discord. The server owners may have costs, but that's not a burden on the community and is always going to be the case.


You may be unfamiliar with the term "free software". See the Wikipedia article: https://en.wikipedia.org/wiki/Free_software


They (the Discord team) seem not to know what Free Software means either.

Some time ago I asked them whether they planned to ever become Free Software so it could be actually safe and have a community around it, and after a couple of messages they went (not verbatim, but in spirit) "oh, open source? lolno", missing the point entirely in multiple ways.

Taking into account their massive teen-focused PR campaign (the teenage years being a particularly social phase of human growth, where FOMO reigns supreme), their predecessor OpenFeint's demise (including a privacy lawsuit), and the amounts of data and metadata implicitly and explicitly collected, it's at least _likely enough to distrust it_. The doors in the back are bound to be giving that sweet personal data to whomever has a big enough money stick, and/or other non-trustable agencies.


Accessibility also refers to the hardware & network speeds of the user's machine. Discord is a heavy web app (that also pings back a lot of analytics), with users actively using ‘heavy’ features like sending videos. It's also subject to US sanctions or internal bans, where entire countries can be blocked or do the blocking. Speaking of blocking, Discord could block a community member for a non-community-related reason (or by mistake), & there would be no way to resolve it, since the community is not in control of the server.

Free as in gratis, sure. But as in libre? No, it’s definitely proprietary & it’s free because users are the product (data collected and upselling Nitro); not every user wants to give up their privacy freedoms to participate. Discord’s not federated so users will be required to create an account & agree to the ToS & CoC of Discord, not the community running it. Discord has also been hostile towards folks trying to make alternative clients to meet usability needs & certain kind of bots.

Without IRCv3 & a bouncer, IRC would be a difficult/unexpected experience for a new user. Luckily IRC isn't the only libre option. If you want to keep the system requirements low, XMPP MUCs are good & allow federation. If you want newer bells & whistles at the cost of system requirements, federated Matrix can be considered (tho there are centralization concerns, with Matrix.org having the most users, the most used server, the most used client, & control of the spec). There are bridges between all of these, so you can choose as many or as few as meet the community's needs, such as a community-hosted XMPP main server in a neutral country with bridges to IRC & Discord.


> He also told me about a group of young blind programmers who are using Discord to discuss the development of a new open-source screen reader to replace the current Orca screen reader for GNOME

Could you link to the program they are working on? Nothing appears when searching for "orca screen reader for gnome replacement."


Are there any FOSS alternatives to Discord that are free (as in money)?


Zulip is free for open source projects.

Creating a channel on Matrix.org is free.


Conversations & blabber.im offer free XMPP accounts. I self-host so I haven’t tested whether or not its users can create MUCs on those servers.

Libera.Chat & OFTC offer free IRC for open source—& can depending on the community be a welcome bridged platform for any alternative as IRC requires the least resources to run (tho if you need rooms with encryption, that won’t be an option).


Could answeroverflow.com be a temporary solution to this?


Since Bun is VC backed, what is the monetization plan? One of the things I consider with new tech is "how likely will this tech still be actively developed in N years." Bun will need to make money somehow or funding will be pulled.

I see bun's licensing is MIT, and that is fantastic. So that does give me some hope it won't die should the business go under. I hope the business succeeds, but I'm curious should I adopt bun, what might the upsell be down the road.


Business plan is outlined on https://oven.sh/:

Oven will provide incredibly fast serverless hosting & continuous integration for backend & frontend JavaScript apps — and it will be powered by Bun.

It will support popular frontend frameworks like Next.js, Vite, SvelteKit, SolidStart and many more — along with backend frameworks like Express, Fastify, NestJS and more.

The plan is to run our own servers on the edge in datacenters around the world. Oven will leverage end-to-end integration of the entire JavaScript stack (down to the hardware) to make new things possible.


Considering there are dozens of mainstream serverless hosting providers out there already, with more popping up every day and pricing going down to ~free, I can't really see this being a viable business model anymore.


I mean, it can if they do it better than everyone else and eat their lunch


DX is the key differentiator IMO.


So it’s competing with vercel?


Since it's a bundler, I guess something like "bun deploy" to make money from hosting JS apps. It's a common strategy, getting mass developer adoption first, then soft-pushing for their private hosting of applications.


There was a rad HMR devserver called Pundle. It was instant-fast, but it was also maintained by one guy in a poor country.

Trying to keep up with one guy's side project caused me so much pain that I moved everything to Rollup and never looked back.

VC funded and random side project clearly have different concerns, but the ultimate risk with either is "if the creator can't support me, then what"


I guess they'll follow the NextJS/Vercel or Deno/Deploy playbook; even a license change shouldn't be out of the question. Advertise everything as "open source", get free bug reports, contributions, marketing, and adoption, then push everything into their commercial offering.


The proposition of replacing the layered morass of node-based tooling (node, ts-node, nodemon, tsc, jest, ts-jest, webpack, Babel, 2 incompatible module specs, UMD insanity, 3 package managers!) and the Cambrian explosion of config files that it leads to is super compelling.

The tooling situation is certainly my biggest pain point around JavaScript; it turns a really very capable and pleasant language (ES6 + TS) into a complete chore. Here's hoping that Bun can deliver.

It has to be a feeling similar to going from CMake to Cargo.


Congrats! Your blog post paints a very compelling value proposition for simpler, all-in-one (yet extensible) software. I find I'm more compelled these days by "batteries included" software than "bring your own everything", so I'm excited to try this out.

It reminds me of a runtime version of Rome tools, which had a similar goal (replacing a bunch of software with a single, faster one). It went under and has [transitioned to OSS under a new team](https://biomejs.dev/blog/annoucing-biome). I hope you find more success than they did! It's certainly a hard problem to solve.


I agree with "batteries included"; if Node.js and co. had a harmonious ecosystem to begin with, we wouldn't need such an approach. Look at the way Apple does things: they own and control the hardware and OS, which allows them to deeply optimise their systems and also provide peace of mind, knowing all their devices will work seamlessly across the board. Having an all-in-one toolkit reduces the overhead of running around finding the perfect solutions when one is already right in front of you.


On the other hand, the DX of Apple's development toolchain is among the worst in the world


I would be interested to learn from somebody with experience using both Bun and Deno: which one is actually the more compelling Node successor? Bun's website makes some impressive performance claims over Deno. Are these true in practice? And if so, why? Seems like Deno also has similar goals. Also, are there large philosophical differences between the two projects? Like Bun tries to reimplement the kitchen sink but Deno wants a new post-Node way of doing things? (Just guessing, no idea if that's true.)

Any seasoned users with time spent on both?


I’m not over the moon with it but I’m kind of interested to know why I need a Node successor at all. Both Bun and Deno are VC funded tools so I have base level suspicion around monetization and longevity.

It seems Bun’s major selling point is performance. I can’t say I’ve really run into massive performance concerns with Node. It isn’t earth shatteringly fast but I’m way more likely to run into IO constraints than Node speed issues.

Deno’s major selling point was that it was Node Done Right in many ways: better packaging, ES6 all the way, etc. (a pitch I was sold on!) but it seems they gave up trying to create a new ecosystem and instead are adding Node compatibility.

Alongside all of this I'm encouraged by a number of recent Node improvements like having its own test runner and built in .env support. So I’m struggling to see good reason to use either Bun or Deno. Even if I were to switch I'd need to make sure I have a concrete path back to Node should the new generation tool become unviable.


It is well explained in the blog article: https://bun.sh/blog/bun-v1.0#why-bun-exists

This is extremely compelling for frontend DX.


That kind of speaks to my point, though: a number of the features listed there (e.g. test runner, .env file support, watch mode) have either been recently added to Node or will be soon (and are available today as experimental flags).

The bundler stuff is certainly more compelling though the page doesn't specify what makes it better than esbuild except that it comes built-in... but that's where, for me, VC concerns raise their head. If I go all in on Bun bundler, what happens if the company switches their priority towards monetization and neglects the bundler? I'm going to have to unroll all the configuration work I will have done and go back to an external bundling library. And it's still not entirely clear what I gain!


If it delivers on the promise of solving the require vs import module nonsense, that would be enough for me. Maybe Node copies that too, but that's also a win imo.


Does Deno suffer from that in the first place? I don't know what that issue even is, so I'm just asking.


Getting VC-funded HHVM vibes.

Not sure how they plan on monetizing.


Seems they both have similar plans: serverless hosting, continuous integration, etc. etc.

All valid ideas. But the involvement of VCs makes me wonder what kind of scale they're going to be expected to achieve and what will happen if they don't.


I'm super skeptical of the viability, much less the desirability, of language-VM-specific CI and hosting services.

Feels like an acquisition by Cloudflare/Vercel is a more likely exit scenario. But then, if we get an HHVM repeat and most of the benefits are just introduced into Node proper, what's the real value to the acquirer? An acqui-hire? Deno has raised at least $25m though, and Bun $7m.

I guess most startups fail so maybe there is just no answer.


Google is failing me - what's the VC spin on HHVM?


VC spin?

The HipHop Virtual Machine was created over a decade ago to address serious performance issues with PHP. This spurred (lit a fire under?) serious efforts to address the issues in PHP proper. Over a rather short period of time, PHP became performant enough that HHVM was sunsetted.

I think the history here would call into question the viability of Bun as a monetary investment. It may end up a net good for the ecosystem, but if the major benefits can just be incorporated back into Node proper, what are the ROI prospects?


Thanks for the clarification.

I thought your original comment was saying something to the effect of "VCs funded HHVM, and this feels similar."

I'd only heard of HHVM in the context of Facebook, so I was surprised to read that (but I know Phabricator was another infrastructure company spun out of FB, so it seemed possible).


These tools are way too young for anyone to have actually spent a lot of time using them.

Bun is bundling everything and making it really fast, while also striving to maintain as much compatibility as possible with Node. It doesn't throw away the existing ecosystem.

Deno took on too much of an adversarial perspective towards the Node ecosystem and now they're working towards re-adding support.

So in terms of a successor, I'd say the only option is Bun because it's still trying to maintain compatibility with Node while innovating with new features.


Not a seasoned user; I've mostly played with Deno. To me, Deno seems more oriented to be an alternative Node runtime with security as a first principle and built-in support for TypeScript and JSX, plus some tooling like a linter. It's also based on V8. My best guess is it's what Ryan Dahl wanted Node to be, in hindsight. Bun, technically speaking is based on Webkit, but can't really say why; it seems a better all-in-one tool (also, remember Rome?) and not only a runtime, with out-of-the-box compatibility with current frameworks. Deno wasn't npm-compatible some time ago, and I wonder whether it was ever meant to be, or whether that was a pivot made on the run.


> technically speaking is based on Webkit, but can't really say why

Why Bun uses WebKit/JSC was described here:

> One of the reasons why Bun bet on JavaScriptCore instead of embracing the server-side V8 monoculture is because JavaScriptCore and WebKit/Safari are strongly tied together. This means that Bun can often use implementations of Web APIs from WebKit/Safari directly, without having to reimplement them. This is a great example of that.

via https://bun.sh/blog/bun-v0.7.1#messageport-messagechannel-ar...


It's known that Deno tacked on NPM support.

For new, from-scratch projects done by someone who doesn't know Node anyway, why not use Deno? I just started writing the server side of a mobile app with it, and I didn't even know JS, TS, or have any experience with routing frameworks. I had server-side queries working in a matter of days, and I don't claim to be fast at all.

The issues cited in the Bun PR, like the morass of modules and related performance problems, don't seem to exist in plain Deno. Or am I missing something? I don't anticipate ever integrating anything from NPM, so I'm actually disappointed (but understanding of the motivation) to see Deno hedge on the "fresh start" idea.


You’re unlikely to encounter any issues with Node in a small project. It’s possible though that you’ll need some module that is not working with Deno.


Thanks. If I can do CRUD and client authorization I should be OK…

I say now, anyway!


We're a full-stack TypeScript shop, and I manage ~50 internal libs and ~500K LOC of TS. Last month I tested out both Deno and Bun as alternative runtimes for us. TLDR: for any semi-complex codebase we have, Bun almost always works, Deno almost never works. We now run all our tests in both Node.js and Bun, and gave up on trying to make Deno happen.


What were your "semi-complex" codebases written in and for? Were they testing native Deno, or its integration of NPM packages?

I have never used Node and am creating a new project from scratch, so I don't know how worried to be about your commentary.


The "semi-complex" code is legacy code, aka revenue code. Started with plain JS in CJS on Node 12 and evolved to strict TS/ESM on Node 20. A lot of cruft built up from that multi-year evolution.

These days, if you're authoring from scratch, just write idiomatic TS in ESM, and you'll be fine. Even if targeting only Node, prefer to use Web standards (e.g., fetch, Request/Response, WebCrypto, Web Streams, URL, AbortController) as Node is migrating in that direction, and standard-compliant code will give you optionality in runtimes (Bun/Node/Workers).


Thanks for the reply! I'm just writing TS code that utilizes Deno/Oak and their MySQL support, to present a REST-style API. I don't know enough to say whether these depart from the standards you're recommending. If they do, I don't know how I would do routing and API support using only the things you mentioned. I'm a one-man band writing both a mobile app and the server to support it. I also know very little about scaling, which I anticipate hiring outside expertise on.

I want to invest my time wisely and learn portable techniques, but I realize the scope of my ignorance may exceed what you can address in a comment forum!


I have the luxury of toying with tech because the business around it is relatively successful. Engineering deep-dives are a lagging indicator of success (your own or your employer's) and almost never a leading one. All those "we did X/Y/Z with cutting-edge stuff" blog posts are overwhelmingly written by people who don't need to worry about bringing in money to pay for the time spent on that stuff.

Choose the tech stack that allows you to rapidly experiment in building something other people will give you money to use. That means choosing DX and iteration speed over runtime speed. That means using whatever gives you the most joy, so you're incentivized to build more often. That means something high-level so you don't waste time re-building components your customers don't care about, so you spend fewer hours on invisible, undifferentiated, but complex and time-consuming stuff.

If you're most effective with JS, Node is fine. Bun is fine. Deno is fine (for greenfield). One is super stable, battle-tested, and has 10x more contributors and 100x more libraries than the other two (most of them are trash). This maturity gap may not matter at all or may break your startup if you must re-implement something complex or continuously spend hours understanding and battling your runtime with minimal information on the web from others who encountered the same problem before. The same goes for your prod infrastructure, mobile app architecture, business ops, infosec, etc. Choose boring technology https://boringtechnology.club


Thanks for that. I'm in the middle situation, in that I'm somewhat in a hurry but not desperate (yet). So far I've been pretty encouraged by how quickly I've gotten something working with Deno; the main reason I went with it over PHP 8 is that I wanted to learn JS/TS. Second, the author's goals of addressing Node shortcomings and providing a cleaner import mechanism resonated with me. Thus far it hasn't annoyed me, which is a miracle because I'm easily annoyed. I think sifting through a vast collection of libraries (mostly junk, as you say) would annoy the shit out of me but more importantly waste valuable time.

I learned that problems that seem simple and solved apparently often aren't, regardless of how many blog posts and HN posts there are about the framework of the week. I needed to define an API, so I did so using OpenAPI. While the idea and standard seems mostly sound, the tooling (from editing to code generation) is absolute trash. I wasted soooo much time trying to make it work. Should I ever use it again, I'm writing my own tools. But right now I do just need to get stuff done.


If you don't mind sharing, what are some of the issues you've had with Bun so far?


  1. Missing `node:http2`
  2. Missing `node:test`, so it's more difficult to execute the same test files within Bun as we do via Node. I wrote a custom Bun loader to mock parts of `node:test`.
  3. Vite and ESLint do not work.
  4. Still occasionally segfaults, and it's difficult to find out why/where.
  5. Surprisingly, some code runs slower in Bun than in Node. For example, generating JWE with symmetric encryption. But this might be WebCrypto vs OpenSSL.
  6. Subtle differences between WebKit and V8 (e.g., how they handle dates).


Have you tried since v0.7? Vite support was supposedly added then


So it has, thanks for pointing that out. I goofed, as I actually meant to write Vitest, which remains broken in v1.0.


I am annoyed with the bombastic claims behind bun.

I wasted a day trying to get Vite to work when they first announced it. I was really excited about not needing >1GB of RAM to compile a React project... It boggles my mind that React bundling uses more RAM than compiling Linux.

It is still unable to compile http://chatcraft.org due to some problem with the wasm plugin.

They also said that the bunx --bun option was a pre-1.0 workaround, and didn't keep that promise.

Performance-wise, their claims are suspect: Safari's JS engine was always better at startup time and memory use, at the expense of a relatively weak JIT. They paired that with a ton of stuff reimplemented in native code to make their CLI and hello-world workflows fast. This means people will be in for a performance surprise when they start bottlenecking in JS hot paths.


And you're right to be skeptical. With performance, so much depends on your actual code and program-specific bottlenecks. The difference in runtime speed between JSC and V8 is minimal, so a lot of Bun optimization must be in the gaps between your code and the core JS engine or in startup time (e.g., running `bun install` or `bun test`).

In my benchmarks, actual runtime performance of long-running JS code is ~25% faster under Bun than under Node. However, sometimes Bun is ~25% slower, notably with tasks that require binary processing (e.g., fflate, sharp) or encryption (e.g., jose). We're committed to JS and will be constrained by its performance for a long time, so a "free" 10-30% speedup at runtime is worth the effort, but it's not a 10x slam dunk like the benchmarks on Bun's homepage imply.
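
A sketch of the kind of long-running, pure-JS hot path these runtime numbers come from (the workload itself is made up; the point is that on code like this only engine JIT quality matters, not the runtime's native glue):

```javascript
// A tight, allocation-free loop: once the JIT warms up, Bun (JSC) and
// Node (V8) tend to land within a few tens of percent of each other here.
function hotpath(n) {
  let acc = 0;
  for (let i = 0; i < n; i++) acc = (acc + i * 31) % 1000003;
  return acc;
}

const start = performance.now();
hotpath(10_000_000);
console.log(`hotpath: ${(performance.now() - start).toFixed(1)}ms`);
```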


I haven't been able to get any work code working in Bun yet. All my personal/pet projects work fine, though. Speed is impressive when it works.

I did raise issues with repros, but I wasn't able to try fixing them myself; building Bun is not working for me.


If I recall correctly, Bun doesn’t support Windows, unlike Node/Deno.

Edit: sounds like this is changing. Thanks for the correction!


It has experimental support as of this release: https://bun.sh/blog/bun-v1.0#bun-more-thing


Interesting! The original link has no Windows binaries, so I assumed nothing had changed.


The link has been changed to the Bun 1.0 blog post which specifically mentions experimental/incomplete Windows support.


Deno wanted to be different but never did enough to force its way. Maybe if they had 10x the funding to rewrite the ecosystem. Now it's in limbo.

Bun aims to be a better replacement for Node. There's less to consider, so it's just a matter of whether it's compatible enough and faster/better to warrant the switch. A lot easier to swallow.


There's plenty to consider if you don't care about Node compatibility. From what I've read (including in this thread and on the Bun PR page), Node seems pretty messy, with modules coming from different sources in different ways. In Deno you just import with a URL.

I know very little about JS and TS and "modules," but from a newcomer that's how it all looks. I'm happy to have any further insight!


Does Bun support url imports for module loading?


I was curious about this too, whether Bun supports URL module imports. Skimmed through the documentation a few times, and I believe it does not.

But, the command `bun install` supports NPM and custom registries, as well as Git and .tgz URLs.

https://bun.sh/docs/cli/install#git-dependencies
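
For reference, git and tarball dependencies go straight into package.json like any other dependency (the package names and URLs below are made up purely for illustration):

```json
{
  "dependencies": {
    "my-git-dep": "git+https://github.com/example/my-git-dep.git#v1.2.3",
    "my-tarball-dep": "https://example.com/my-tarball-dep-1.0.0.tgz"
  }
}
```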


I think you can achieve this via a bun plugin


Deno wanted a post-Node way but they have gone back on their original ambitions (which never resonated with me, to be frank) and now have pretty limited compatibility, whereas Bun has full compatibility with Node.


Per their own docs, Bun is not fully Node compatible [0]. In my experience, even APIs for which its docs claim full compatibility have some sharp edges. That said, Bun's node-compat story is better than Deno's.

Meanwhile, Node's performance story keeps improving release over release [1].

[0] https://archive.is/NCzRQ

[1] https://twitter.com/lemire/status/1699459534190698999


We originally planned to release yesterday, but there were test failures in fetch() body streaming that needed to be fixed. The blog post was not meant to be public until Bun 1.0 was available on GitHub, but the blog post link was publicly accessible, and our RSS feed had a bug that didn't hide drafts.

The specific bug ended up not being in fetch() body streaming, but in our JavaScriptCore binding for getting a property from an object that may not have defined the property. Not all of our code was checking that the value was an object, only that it was a JSCell, which usually is an object; but things like symbols and BigInts are JSCells without being JSObjects.
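
The distinction behind that bug is visible from plain JavaScript: symbols and BigInts are primitives ("cells" in the engine, in JSC terms), not objects, so a binding that treats every cell as an object mishandles exactly these values. A quick demonstration, runnable in any modern JS runtime:

```javascript
// Symbols and BigInts are primitives, not objects; property access on them
// goes through a temporary wrapper rather than the value itself.
console.log(typeof Symbol("s")); // "symbol"
console.log(typeof 10n);         // "bigint"
console.log(typeof {});          // "object"

console.log(Symbol("s").description); // "s" (read via a transient wrapper)
```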

Thread from yesterday: https://news.ycombinator.com/item?id=37424724


The docs' install section for Linux is missing DEB/RPM packages; please update the docs.

Extra hint: you could provide a text snippet for an Ansible playbook/taskset with proper sha256 (or similar) checksumming for the "download from the Internet, not from your distro repos" path, and the same for Puppet (which is going to be a bit more complicated).

It would ease system administrators' work a bit. SAX - system administrator experience, if you wanna name it that way :)


If Bun is able to run and bundle a TypeScript React app out-of-the-box, what are the benefits of using Vite.js on top of it?

I'm confused cause this guide on the official site showcases how to use Bun + Vite.js to build a TypeScript React app. [1]

There is also this issue on their Github. [2]

Can Vite.js be used to handle more complicated scenarios / advanced use cases that Bun doesn't handle? My use of Vite.js is pretty basic (start & build with the default TS+React config) so maybe I'm missing something here.

--

[1] https://bun.sh/guides/ecosystem/vite

[2] https://github.com/oven-sh/bun/issues/250


Vite still leverages node, esbuild, swc, tsc, etc. for parts it's not directly responsible for.

Granted I have no experience with bun yet, my take on it is that you can leverage bun for running the local dev server (instead of node), leverage bun for bundling (instead of esbuild or tsc or rollup or an amalgamation of them), and leverage bun for transpiling instead of Babel and TypeScript.

What vite provides is an easy to configure setup for developing frontend applications with a nice DevX when working locally.


I'm new around here; I've only seen people use Vite with bun for HMR, but imho it's not great to combine them just yet, as there's config complexity.


I thought Bun had HMR out-of-the-box but it's not working? Not sure. https://github.com/oven-sh/bun/issues/833


It's a big ask to get people to adopt an ecosystem wholesale. Compatibility with the existing ecosystem is a key goal of Bun, so that people can start using it immediately on their existing codebases and then adopt it gradually for more and more stuff


My guess is that you can leverage Vite's plugin ecosystem: PostCSS, Tailwind, Terser, etc.

And most of the new meta frameworks (Nuxt, SvelteKit, Astro, SolidStart, Qwik) run on Vite, so that might pave a path for adoption.


That's like asking why you need Vite when you have tsc. I assume Bun does not "run a TypeScript React app"; it only does the jsx/tsx transpiling out of the box.


Looks great, but the fact that it uses Zig, which is pre-1.0 and describes itself as "not for use in production", gives me pause.

Realistically how much of a problem is this likely to be?


I guarantee zig is 100x more production ready than 99% of the 4 million nested dependencies included in the average js project


Wow, that many guarantees should persuade me to base my tech stack on Bun and Zig.

Seriously, I think this question is worth asking. Why was Zig chosen as the language when it's not even stable, and what implications does this have for the long-term viability of the project (besides the fact that it's _fast_)? Zig's head guy isn't even sure when Zig will hit v1.0, and Bun's head guy hasn't really responded either AFAIK.


Zig is fast, promising, compatible with existing C libraries, and relatively barebones. The language itself may not be ready for production, but the binaries built in the language work just fine.

If Zig dies tomorrow, bun could probably continue using it as-is, perhaps after fixing the bugs they encounter. It's "the API and language spec isn't complete yet" unstable, not "we haven't implemented floating point operations yet" unstable. So far, only the allocalypse has caused major grief in terms of language changes, as far as I know.


What does "allocalypse" mean here? Were memory allocations in Zig less explicit in the past?


No, they changed the allocator API so everyone had to update their code, here's the story:

https://pithlessly.github.io/allocgate.html


>The language itself may not be ready for production, but the binaries built in the language work just fine.

Why is it not ready for production if the binaries work just fine?


Because they may yet still change the language spec.


Because an update to the compiler could come out next week that completely alters the language. If you're developing a product in this language, you'll need to put significant effort into keeping it up to date, and any dependencies you've downloaded in source form may not work on the most recent compiler.

In one such change, all *Allocator parameters were turned into Allocator parameters (note the missing *). That meant rewriting tons of function bodies and signatures, because passing specific allocators around is one of Zig's strengths. The compiled binaries came out just fine, but every major Zig component needed refactoring from one compiler version to the next.


they probably figure the developer velocity gained from using it over c++/rust is worth possibly having to make large refactors if a feature in the language is removed.


it's possible that they chose Zig because this started as an unpaid passion project and it simply sounded the most interesting to them.

Demonstrably by this 1.0 bun release it seems safe to say it ended up being a fine decision, no?


> Demonstrably by this 1.0 bun release it seems safe to say it ended up being a fine decision, no?

That’s just a decision they’ve made themselves. I honestly think it’s an interesting question: can software built on a <1.0 base legitimately call itself 1.0? What if there are big underlying issues discovered within Zig?


Well sure, it can legitimately call itself anything. You are wondering if Bun’s standards for 1.0 match up with the standards for 1.0 that you have in your head, but of course only you can answer that.


I don't see why not. 1.0 would mean the API isn't changing. And they can certainly swap out the language and still keep the API the same.


Node or Deno dependencies would be a fairer comparison.


I tried to compile a hello world template for Zig a few months ago. Pulled it from GitHub, ran zig build, and it gave an error. The problem? They had recently redesigned a major part of the build system and deprecated old functions. No deprecation message, and no backwards compatibility. Just a plain old undefined-function error.

That's what I don't like about these modern languages. You have to be in the community and keep up with the latest releases in order to use the language. I just want something that's stable. I don't want to keep relearning everything.


> Realistically how much of a problem is this likely to be?

It is a very young language with momentum and a foundation behind it. I'm not aware of any pre-1.0 stability guarantees.

In fact it is generally recommended to use the nightly release (see: https://ziglang.org/learn/getting-started/) because they are moving quickly.

Definitely seems to be in an early-adopters-and-tinkerers phase.


Calm down Gramps. People have moved on. Rust is yesterday's technology.


Jarred, congrats on the release!

I've been following the progress of bun since your initial announcement and today I decided to give it a try, just to play around.

Haven't done much with it, but even for the 5 min I played with it I'm kind of impressed so far.

- Super quick install, PLUS it did not require root (unlike many other installs)

- It identified that I was using fish shell and added itself to my PATH (nice!)

- I ran a very quick bench of "npm install" vs "bun install" on one of my projects, and the performance is amazing: 50 seconds vs 4.5 seconds on the first install. Moreover, re-executing "bun install" takes 122ms on my machine. Removing node_modules and re-executing it takes 769ms (because of course it uses a local cache elsewhere, but still). Amazing.

I'll probably continue exploring tomorrow and see whether it is able to run the rest of the backend/frontend or whether it gives me a hard time. I've seen there are certain things that are not 100% compatible with node yet, but since the initial impressions are great I'll explore further.

BTW, a code formatter and a linter would be great additions to bun.

I know there is this ticket: https://github.com/oven-sh/bun/discussions/712

But one of the advantages of integrating both things in bun is that it makes it the perfect standard tool to be used inside of a team. So no extra installations from other projects, no extra mental burden of what to use, etc... bun would be the perfect dev companion with both ;)

Probably a linter is a different beast (and I'm not sure you or the rest of the people working on bun want to get in there... probably not important right now), but a formatter seems doable and adds a lot of value from my point of view. Given that bun already runs, installs, tests and bundles, at the very least _formatting_ seems like a natural addition to the family. To me a formatter is part of the standard toolset for developers nowadays.

Once again, thanks a lot for the effort to you and the rest of the people contributing to the project!

(edit: re-formatted my comment :p)


Personal experience: Bun has been a flawless drop-in replacement for both NodeJS and npm/yarn since I started using it around v0.4. Not only have I had zero issues running it, but it's been WAY faster. No benchmarks, but very noticeable, from "let's start a new build and go grab a coffee while waiting for it" on NodeJS, to "hit run aaaaand it's done."

I haven't thought twice about it. Frankly, I forgot that bun has been happily running behind the scenes for me. I highly recommend anyone using NodeJS to give it a go.


I've worked on several large node.js codebases and builds never took more than a few seconds, def. not "go grab coffee" time, so maybe something else was going on.


It’s not uncommon to have a nodejs library take like 10-15 mins to go through building and testing in GitHub actions CI


But that’s probably more about tests and github resources than raw build times.


Average Angular apps take at least 4 min in release mode on my machine.


Get a new machine


As a response to performance issues? Always very valid!


>and testing

And Bun is going to improve that?


So it means you'll drink less coffee, and that's not good for productivity reasons.


Oh, it's a Friday. Deploying on Fridays is what we do. Here you go ... Bun 1.0 officially deployed on Platform.sh https://platform.sh/blog/bun-support-is-here/ enjoy.


Also. As I am excited. Here you go for a one-click deploy .. https://github.com/OriPekelman/express-bun/ Very unofficial. Very.


Slightly hijacking this thread to ask: I've not heard of Platform.sh before. Do y'all have a free/hobby plan for someone wanting to become familiar with it? If not, no worries.


We are not big on free. We like money :) And people like that we will probably be around in a decade (and won't kill free as soon as we've locked enough people in). We do have a no-strings-attached free trial ...


There was some criticism yesterday about bun being stable using a language (zig) that isn't considered stable. Do you care to comment about it?

Ref: https://news.ycombinator.com/item?id=37422894


Zig isn't an interpreted language; once the code is compiled, the produced artifact doesn't change. Stability doesn't really play a factor there.

Sure, the instability of Zig may increase the maintenance cost for Jarred of keeping up with updated versions of Zig. But this has no effect on the features that Bun provides. The risk that a change to Zig would be so consequential that it would actually prevent Jarred from continuing to work on Bun, or would otherwise force Jarred to fork Zig, is so low as to be inconsequential.

(disclaimer: not associated with either bun or zig, the above are pure opinions)


That in itself is no good argument: it surely has some bugs, and if, say, Zig changes in the meanwhile and can no longer compile the previous code, then it can be problematic from a maintenance POV.

With that said, it is probably fine to use for most ordinary apps, but I wouldn't yet build a bank infrastructure on top for example.


also stability is more of a statement about api design than internals


Not OP but I think they meant Zig isn't 1.0 and therefore could be considered "not ready for prod". I'm not claiming anything about Zig; I can only hope that the Bun folks have done their due diligence and ran it through fuzzing/ASAN/etc before making the 1.0 stamp of approval.


That's the point being addressed by the comment you are responding to.


There was a FUD theme with some of the "zig isn't stable (>=1.0) so how can I trust bun" threads. So long as bun is stable, does it matter that zig is not 1.0 for the developer using bun? So long as zig is reliable enough for the developers of bun, I don't see the concern.


Bun ships features and improvements fast and often, which is great

But from anecdotes, I have the impression people still run into bugs and missing functionality semi-commonly. I'm sure it works great for small/personal projects, but I've been hesitant to recommend it for production use at my workplace because of this impression

So my question is: how feature-complete/stable/secure is it right now, what does the v1.0 label say about that, and what are the long-term plans/prioritization/guarantees around this area?


How does Bun support CJS and ESM in the same file? Is it transpiling CJS to ESM? Or is it transpiling ESM to CJS?

ESM can use top-level await and CJS can't. So this can't possibly work, right?

    // x.mjs
    export const x = await 'x';

    // y.cjs
    console.log(require('./x.mjs').x);
And, indeed, it doesn't work in Bun. When I `bun y.cjs` it gives me this error message:

   TypeError: require() async module "/tmp/x.mjs" is unsupported. use "await import()" instead.
… but using "await import()" instead is exactly what I was trying to avoid! I can use "await import()" in Node, too.

But it looks like it does work in Bun if I drop the `await` from x.mjs. So, is it just transpiling ESM to CJS, and failing out if the ESM uses top-level await?

EDIT: I see the documentation page https://bun.sh/docs/runtime/modules has a "low-level details" section, explaining how import(CJS) works (by transpiling CJS to ESM), but it doesn't say anything there about how require(ESM) works, which is unfortunate, because that's by far the most interesting part! I've filed a doc bug on this. https://github.com/oven-sh/bun/issues/4601
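
Bun's maintainer describes elsewhere in this thread a heuristic step: files are classified as ESM or CJS based on the syntax they use, with ambiguous files defaulting to ESM. A deliberately crude regex caricature of that idea (Bun's real detection lives in its native transpiler and is far more precise; this toy is purely illustrative):

```javascript
// Toy classifier: not Bun's actual algorithm, just the shape of the heuristic.
function guessModuleType(source) {
  const usesEsm = /\b(?:import|export)\b/.test(source);
  const usesCjs = /(?:\brequire\s*\(|\bmodule\.exports\b|^exports\.)/m.test(source);
  if (usesEsm) return "esm"; // ESM syntax wins, even if require() also appears
  if (usesCjs) return "cjs";
  return "esm";              // ambiguous files default to ESM
}

console.log(guessModuleType("export const x = 1;"));        // "esm"
console.log(guessModuleType("module.exports = { x: 1 };")); // "cjs"
console.log(guessModuleType("console.log('hi');"));         // "esm"
```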


I believe Jarred has answered this upthread: https://news.ycombinator.com/item?id=37437848


You write TypeScript and unit tests without tsc, jest, webpack/esbuild or a ton of config files. Bun projects can be super lean and still cough out a tested bundle.js for execution in a browser.

I love it; it super-simplifies projects, and for minimalists it's perfect.


And the speed of package installs, bundling and test running makes for very fast pipelines.


How do you do type-checking? Do you just run TSC with “noEmit: true”, like you’d do with any other bundler?

TSC is simultaneously the most valuable and the most flaky part of your typical JS project setup. That’s the main part I’d be interested in replacing with a different tool.


It reads the tsconfig file if present. You just write TS files and bun build outputs JS. Vscode highlights typecheck errors based on TS errors. Bun build fails if you can't typecheck. No tsc installation necessary, but also consistency with tsc if you use a tsconfig. For simple stuff tsconfig is not needed


> Bun build fails if you can't typecheck

It has to run TSC (or expect somebody else to run it) in order to typecheck, though, right?


I am mistaken. Yeah, bun does not typecheck, just like esbuild doesn't. It reads the tsconfig for build options. Either you add tsc or let your editor do it.


That's my understanding, too.

> Bun can run .js, .ts, .cjs, .mjs, .jsx, and .tsx files, which can replace:
>
> tsc — (but you can keep it for typechecking!)
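
In practice that split means Bun strips and transpiles types while tsc (or your editor's TS language server) does the checking. A typical tsconfig for that setup might look like this (a sketch; adjust options to your project):

```json
{
  "compilerOptions": {
    "noEmit": true,
    "strict": true,
    "target": "esnext",
    "module": "esnext",
    "moduleResolution": "bundler",
    "jsx": "react-jsx"
  }
}
```

With "noEmit": true, running `tsc` becomes a pure type-check pass; all actual output comes from Bun.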


Love bun, but I disagree that the blog post explains v1 better; rather, it explains what bun is in general.

It hardly mentions what changed to reach v1 until the very bottom; the GitHub release page, on the other hand, clearly outlines it.

Personally found the blog post to read like documentation rather than a release announcement.


The fact that Bun has taken VC funding [0] (through Oven, the company behind Bun) makes me skeptical. There is also no mention of this on the project's homepage.

[0] - https://oven.sh/


I am skeptical of the long term viability of such projects that rely on VC funding.

However, if I recall correctly, the early days of node.js were mostly funded by Joyent. It took time and some major conflicts for Node to come under a foundation's control.


Joyent isn't a VC fund, though, is it?


I love that you engage with the community on both HN and Twitter. It's always interesting to see your polls on how certain features should be implemented.

Any plans for other interesting experimental features (like macros)?

I don't know how much this is still a problem, but a few years ago one area where I used to experience a bit of pain was that package.json scripts would get pretty unwieldy, and supporting Windows required third-party tools. Have you considered experimenting a bit with scripts, or are you happy with the state of that feature?


from that blog post:

"Unlike Node.js and other runtimes that are built using Google's V8 engine, Bun is built using Apple's WebKit engine. WebKit is the engine that powers Safari and is used by billions of devices every day. It's fast, efficient, and has been battle-tested for decades."

That's really interesting. Does WebKit do less JITting or something to be faster at startup?


bun startup is fast compared to other JS runtimes but imo this has little or nothing to do with JSC versus v8 as the javascript engine. i did some experiments a while back and wrote them up here if you want to take a look. https://just.billywhizz.io/blog/on-javascript-performance-03...


awesome writeup, as always.


thanks! love your uPlot library btw!


<3


Congrats on the launch! The NextJS guide[0] claims that bun doesn't support all the node APIs to run the dev server, is that still the case with 1.0?

[0]: https://bun.sh/guides/ecosystem/nextjs


We landed Next.js pages directory support two days ago and forgot to update that guide


Awesome! Reading through the install script now -- would you consider installing to $XDG_DATA_HOME and falling back to `~/.bun`?


Any docs for migrating an existing Nextjs project? No rush but would be great to have :)


Congratulations to the project; it seems incredible.

I don't know that much, so I have some questions; sorry if they're already answered (share links if possible).

Do you plan to support as much of the Node.js API as possible (99%), or most features plus your own APIs for, let's say, the Cluster module and such?

Speaking of the Cluster module, do you want to recreate Node.js's Cluster module, or do you plan to create your own, more performant way of doing multi-processing with Bun?

I've seen Deno putting a lot of effort into the platform around their runtime, making deploying Deno apps easy; is that your goal too?

I've seen that you use Zig instead of C++; what do you like the most about Zig while working on Bun?

Anyway, I hope the project will make the JS runtime world better; it's promising.


How compatible is Bun with Remix[0]?

I’ve been loosely following the progress of Bun for a while, but I haven’t actually tried it. The blog post you linked does make it sound very intriguing.

[0]: http://remix.run/


Remix in Bun works on Linux. The built version also works on macOS, but the dev version is blocked on us upgrading WebKit. Intl.ListFormat, which Remix uses, depends on a newer version of ICU than the WebKit version we currently ship uses. We didn't want to upgrade WebKit immediately before 1.0, so we held off for this release.


Dumb question: since Bun is MIT-licensed, why doesn’t Node just merge Bun’s code into Node?

https://bun.sh/docs/project/licensing


"just" :_D


because the projects have different goals and are coded in different languages.


...and based on different JavaScript engines.


Are there any downsides? Cause from this blog post it sounds like I should just ditch everything I was doing and move completely to Bun.


This is what I thought. After some fiddling it became clear that compatibility will be an issue for more applications than they claim.

It's probably fine for new projects though.


I’ve been thinking about this and trying it out. The only obviously bad thing is that the ‘drop-in replacement’ claim is BS: they’re aiming for it, but there are a bunch of Node stdlib modules missing. Despite that, it seems to work just fine for my Expo app and RNW project.


Check compat status here: https://bun.sh/docs/runtime/nodejs-apis


Limited Windows support -- I am surprised that this is hardly mentioned. People here may not admit this, but Windows-based developers are a huge group, probably larger than Linux or Mac (e.g., look at the Stack Overflow survey). If you tell people "you can't really use Bun as intended unless you use Mac or Linux or spend half an hour setting up WSL first", people are going to be turned away, especially new developers.

It may not matter to you personally, but if Bun is used in a project with developers working on different platforms, it is going to present some hurdles.


Seems good! I really like the ability to combine CommonJS and ESM modules, and the built-in file watcher. My reluctance is two-fold: Node.js is still dominant and mainstream, and Bun is VC-funded, which, as others have mentioned, doesn't bode well for its longevity. (I realize this concern is a bit circular, since its longevity would be improved by adoption.) It's just that I've been burned adopting marginal technologies in the past: in the JavaScript space, I always liked Brunch, but it never caught on; in the Java space, I always preferred Dropwizard to Spring Boot's complexity, but it also never caught on.


Seems very cool, but hitting 1.0 without full Windows support does not inspire confidence in me. Using tools that consider your environment a second-class citizen is never a great idea... I will give it a try in WSL though.


Consider switching to Linux; I think you will see this trend a lot more.


What trend? The ignorant trend where a niche group of devs thinks Windows support should be second class, even though Windows ranks as the most-used OS among personal and professional developers year after year?

I use all 3 OSes and this superiority complex is tiresome


> superiority complex

what do you base this ad hominem on?

If you try to convince your company to deploy servers in a Windows environment instead of Linux, you'd better have a really extraordinary reason; that's just a fact. If Windows is any good, it's because of the software that runs on it (games and related drivers?), despite the OS.


>what do you base this ad hominem on?

Look at OP's original post. Why would one consider a change at all when everything one needs is already in the OS? I mean... it could perhaps be a preference. Shocking, right?

Anyhow, I wouldn't convince any company to use a Windows environment unless necessary, as you mentioned. However, not all software is a web server, and there are plenty of reasons, none of them extraordinary, requiring a Windows environment.


Yeah, but Bun is a web server. This thread is not about a tool for a domain where Windows shines. Windows has always sucked for web dev and web deployment. Bun is an open source tool; the devs don't owe anything to Windows users. As a matter of fact, they are trying to bring it to Windows. It would probably already have landed if the support were as easy as for Unix-based systems.

If you have circumstances requiring a Windows env and you want to do web stuff, run WSL. If you want to do web stuff and Windows without WSL is your _preference_, don't expect the world to bend for you; keep wallowing in your masochism.


On your GitHub release page [1], there are both "bun-darwin-x64.zip" and "bun-darwin-x64-baseline.zip". What is the difference between them? Why don't the "aarch64" binaries have baseline versions?

In addition, your "bun-linux-x64.zip" doesn't work on CentOS7 due to glibc compatibility. Would be good to provide a more portable binary.

[1] https://github.com/oven-sh/bun/releases/tag/bun-v1.0.0


Honestly if you run CentOS 7 you have to expect that most binaries won't work due to glibc compatibility issues. It's like 95% of the reason musl exists.

I am amazed and not at all amazed that GCC has never got a --compatible-with-old-linux flag.

To be fair, macOS doesn't really either, but people tend not to run 9-year-old versions of macOS.


> It reads your package.json and writes to node_modules, just like other package managers, so you can replace:

> ...

> pnpm

pnpm's flagship feature is global deduplication of packages, so sadly bun's package manager doesn't seem to be a replacement. Is there any plan or intention to support pnpm-like disk saving? (I suppose not having to esbuild etc. everywhere already helps.)

That said, I'm very impressed with the speed of `bun add` with already cached packages. Literally instant.


Bun already supports that [0]. It is the default behavior on Linux, but can also be enabled on macOS.

[0] https://bun.sh/docs/install/cache#saving-disk-space


Any chance of this coming to Windows? Most of our developers are on Windows systems.


Keep track of this issue: https://github.com/oven-sh/bun/issues/43

According to that, the speed benefits will probably not be as noticeable on Windows.


Nice! I was looking at du(1) output on macOS which appears to indicate all files are duplicated. It probably just can’t tell the files are clonefile’d.


There is a very obscure undocumented api in macOS for detecting if two files are equal block-by-block but yeah I don’t think most tools can tell when files are clonefile’d


Just a note on your documentation: it says "this benefit does not extend to macOS" about saving disk space. I find that to be slightly misleading. Since arguably the most common FS for macOS is now APFS, it does save significant disk space unless one were to modify the node_modules files.


You're jumping to conclusions: I just tried the package.json below, and bun's node_modules folder is smaller than pnpm's (3.9M with pnpm vs 3.4M with bun), and as a sibling noted, bun is also deduplicating globally.

That, coupled with the fact that bun installs twice as fast when no packages are cached (634ms compared to 1.3s) and 100x as fast when all packages are cached (6ms vs 639ms), makes me think it's fair to say that bun can replace pnpm.

Here's the package.json I tried:

    { "dependencies": { "express": "^4.18.2" } }


I had difficulty using Bun with Verdaccio, a private/local npm repository.

I know it’s possible to get it to work, but I didn’t have time to fiddle all the bits.

If Bun made this easier, I’d switch from pnpm immediately.


My experiences with Bun so far have been excellent. Its package manager and bundler in particular are extremely fast and pleasant to use. Not needing to care about transpilation or using ts-node for development is also wonderful. For all the tools I cared about, compatibility has been well documented.

I would have trusted it in production before 1.0, and more so now. Unfortunately, I cannot use it for most projects right now. I package my applications with Nix for deployment, and the story for bundling JS dependencies with Nix is mostly not great right now. buildNpmPackage works just fine most of the time, but requires that I use npm to bundle my dependencies.

Of course, this is no fault of Bun whatsoever. Bun can emit a yarn.lock that is compatible with Yarn 1; if it could also emit a package-lock.json, that might unblock my use case. The more ideal solution would be an equivalent of buildNpmPackage that deals directly with Bun, but I'm afraid the effort would not be worth it right now.


It really seems impressive to me. I think I'll try it as a tool first, later as a runtime. I'm just wondering why it was built on WebKit's JavaScriptCore engine rather than V8, beyond what they say on the blog, which is fairly shallow. Does anyone have a link to share or a better explanation?


Both engines are really fast, both of them have places where they beat the other. For a while, JavaScriptCore was the fastest Javascript runtime, then V8 took over again, and now Spidermonkey is the fastest if you cherry pick the right tests.

The places where WebKit lags behind aren't the places Bun really cares about. I'm honestly quite glad they add some diversity to Javascript land.

I recall reading a comment that JavaScriptCore was chosen because of performance. I believe JSC is quicker to start up, and I wouldn't be surprised if it wasn't faster at executing the type of code Bun executes as well.


I'd like to ask whether the BETH stack is the most suitable one to use with Bun, and what the community recommends. I'd like to start a project on this runtime.


OK, I'm impressed. It took 0.01s to run 'bun install react':

    % cd /tmp
    % mkdir app
    % cd app
    % time bun install react
    bun add v0.6.1 (78229da7)

    installed react@18.2.0


    3 packages installed [20.00ms]
    bun install react  0.01s user 0.01s system 50% cpu 0.041 total
    % find node_modules | wc -l
          41


That's probably because it was cached! The first install is "a lot" slower than subsequent ones:

  $ cd /tmp/
  $ mkdir app1 && cd app1
  $ time bun install react
  bun add v1.0.0 (822a00c4)
  
   installed react@18.2.0
  
  
   3 packages installed [431.00ms]
  
  real    0m0.436s
  user    0m0.031s
  sys     0m0.052s
  
  $ cd ..
  $ mkdir app2 && cd app2
  $ time bun install react
  bun add v1.0.0 (822a00c4)
  
   installed react@18.2.0
  
  
   3 packages installed [6.00ms]
  
  real    0m0.010s
  user    0m0.006s
  sys     0m0.005s
Though even 431ms is obviously amazing!


On my currently crappy internet connection on Ubuntu 22, I get ~2.65 seconds for bun and ~5.6 seconds for npm for a clean install without caching. So approximately 2x as fast as npm.


Is there an example one step above "Hello world" that new users can take a look at? There are a bunch of guides for different API's / constructs but I'm wondering if there's a canonical example to get started.

Guides: https://bun.sh/guides


Wondering what (if any) telemetry data Bun collects? Ever since the telemetry in front-end tools website (https://telemetry.timseverien.com/) was shared here a few months ago I've been wondering the same for other tools.


I love how much it includes out of the box. I wrote a little zero-dependency app in TypeScript. It’s got SQLite, routing, etc. It even understands JSX. The only tricky bit was writing a ~100 line JSX-to-HTML function. It’s quite nice and starts so much faster than the normal Node + npm + esbuild stack.
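For the curious, here's a minimal sketch of what such a JSX-to-HTML factory can look like (hypothetical, not the commenter's actual code). With tsconfig's jsxFactory set to "h", TSX like <b>docs</b> compiles into calls to h, and the tree renders straight to an HTML string:

```typescript
// A branded VNode type lets us escape plain text while passing
// already-rendered HTML through untouched.
type VNode = { __html: string };
type Child = string | number | VNode | Child[] | null | undefined;

const escapeHtml = (s: string) =>
  s.replace(/&/g, "&amp;").replace(/</g, "&lt;")
   .replace(/>/g, "&gt;").replace(/"/g, "&quot;");

// Render a child: escape text, recurse into arrays, trust VNodes.
const render = (c: Child): string =>
  c == null ? ""
  : Array.isArray(c) ? c.map(render).join("")
  : typeof c === "object" ? c.__html
  : escapeHtml(String(c));

function h(
  tag: string | ((props: Record<string, unknown>) => VNode),
  props: Record<string, unknown> | null,
  ...children: Child[]
): VNode {
  const kids = children.map(render).join("");
  if (typeof tag === "function") return tag({ ...(props ?? {}), children: kids });
  const attrs = Object.entries(props ?? {})
    .map(([k, v]) => ` ${k}="${escapeHtml(String(v))}"`)
    .join("");
  return { __html: `<${tag}${attrs}>${kids}</${tag}>` };
}

console.log(h("a", { href: "/docs" }, "5 < 6 is ", h("b", null, "true")).__html);
// → <a href="/docs">5 &lt; 6 is <b>true</b></a>
```

A real version would also need void elements (<br>, <img>) and boolean attributes, which is roughly where the extra ~100 lines go.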


What happens with circular require/import calls? Anything weird when requiring/importing the same file, such as instantiating multiple singletons? The two systems have fundamentally inverted trees when it comes to running JavaScript files and I'm curious if anything funny happens.


Hi Jarred,

The docs say that Bun uses clonefiles / hardlinks (yay). But they also state that clonefiles don't provide disk savings:

> This benefit does not extend to macOS, which uses clonefile for performance reasons

As far as I know, clonefiles are COW, which should not take any space except the extra inode.
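For illustration, a copy-on-write clone can be requested from JS through the Node/Bun fs API. COPYFILE_FICLONE asks the kernel for a clone (clonefile on APFS, reflink on btrfs/XFS), so source and destination share blocks until one is written, which is why a clone costs little more than an extra inode. On filesystems without CoW support, it silently falls back to a full copy:

```typescript
import { constants, copyFileSync, readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const src = join(tmpdir(), "clone-src.txt");
const dst = join(tmpdir(), "clone-dst.txt");

writeFileSync(src, "blocks shared until modified");

// Request a CoW clone; falls back to a regular copy if unsupported.
// (Use COPYFILE_FICLONE_FORCE instead to fail rather than fall back.)
copyFileSync(src, dst, constants.COPYFILE_FICLONE);

console.log(readFileSync(dst, "utf8")); // → blocks shared until modified
```

This is also why du(1) reports clones at full size: the files look like ordinary duplicates unless the tool asks the filesystem about shared extents.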


Congrats on the release. May I ask if the company is still hiring C++ engineers? I work full time in systems programming, and the job post says 'post something you built with bun along with your resume', which confused me, as I don't typically build things ON Bun.


It’s been so fun watching Bun progressing as quickly as it has! Truly incredible work, and the blog post is full of real value and time saved for future me from beginning to end - huge congrats on the 1.0.0 release, excited to see where bun goes from here!


Amazing work Jarred.

I am still absolutely amazed at how much more performant websockets throughput in bun vs node.

Congrats on 1.0!


Just wanted to say thanks. I’ve not been as excited to tryout some new tech in a *long* time.


I have been testing out Bun w/ Nest JS, and oddly it can run a dev server but can't run my E2E tests (using the Bun test runner). I think this has something to do with Nest's heavy reliance on old-school TS decorators.


We haven't added support for emitDecoratorMetadata yet but we have a branch that does most of the work. It just needs some more tests before we're comfortable merging it


Looking forward to full (I mean good-enough) Nest.js compatibility!


Woohoo!


Wow, that is a compelling landing page. It's almost too good to be true! Is it fudging the numbers or using highly cherry-picked examples? Or is it really that much better than the competition?


A few weeks ago, I tried to compile a little bit of TS code and was shocked at how long it took. 5 seconds! I can't believe people put up with this. Glad to see there is a solution.


Congrats on the launch! I have been following Bun’s journey since the beginning. Excited to make the transition once it’s got more compatibility with our Node tooling (prisma specifically).


Has anyone managed to run a non-trivial Nest.js application with bun already?

By non-trivial I mean an application using a lot of Nest.js features and libraries, like TypeORM and Swagger-UI.


Not NestJS, but ours is built around Express, TypeORM and the rest, and the codebase is 5 years old, so you can imagine.

Bun has seriously been a godsend. I am so happy that we can finally do top-level awaits; it allowed us to delete a lot of hacky code we had in place just to make Node and all the build dependencies happy.
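As a sketch of the kind of hack being deleted: under CommonJS, an async startup step has to be wrapped in an async IIFE, while an ES module (which Bun runs natively, even for .ts files) can simply await at the top level. loadConfig here is a hypothetical stand-in for any real async work:

```typescript
// Before (CommonJS): the wrapper everyone learned to live with.
//   (async () => {
//     const config = await loadConfig();
//     startServer(config);
//   })();

// After (ESM / Bun): just await at module scope.
const loadConfig = async () => ({ port: 3000 }); // stand-in for real I/O

const config = await loadConfig();
console.log(`listening on ${config.port}`);
```

Note the trade-off mentioned elsewhere in the thread: a module that uses top-level await can no longer be require()'d synchronously.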


If someone from the Nuxt team is around: I have 5+ projects with different module usage, and all of them just log the following when trying to run `bun --bun run dev`:

ERROR Cannot build module "nuxt" {}


I'm not aware of any issues as long as you are on the latest Nuxt and nuxi versions, so it might be easier to help if you open an issue with a reproduction on https://github.com/nuxt/nuxt/issues.


Will do once I’ve investigated more. Tried bun yesterday evening and just saw this thread this morning while heading to the lake :) Thank you for your awesome work on nuxt 3, by the way!


You're very welcome! It's definitely a team effort.

Feel free to DM me if I can be helpful.


Hey Jarred, congrats on the 1.0 release!

I've tried this release on one of my projects and encountered some errors, where is the best place to send reports about compatibility issues?


Please file an issue at https://github.com/oven-sh/bun and include the code to reproduce it


Would love to see bun work as a replacement for Electron, using the OS's existing web browser, similar to Neutralino, Tauri, and Muon. All told, I'm super excited to try this.


Why would I want to choose Bun instead of Deno?

As a library author, how convenient is it to have one suite of tests that can execute in both Bun and Node.js? (E.g. with jest?)


Would be amazing if this had its own formatter and linter as well, like Go. That would make Go + Bun a dream for writing web apps

Either way still excited to check it out


As of today, what are the primary use cases for bun vs Node?

What type of project can I use Bun on without regretting it (due to unforeseen bugs, incompatibilities, etc.)?


Just curious about the degree of compatibility with node, can I alias/symlink node to bun and everything will continue to work but faster?


Reading the features I wonder if there are plans to also add linter and formatter to bun, it sounds compelling to use one tool for everything.


Congratulations on reaching 1.0 !

Performance comparisons to Go/Java on the backend would be insightful. Would be interesting to see how close the gap is.



Yesterday was a mess, with first an announcement video and then a blog post released before they were supposed to. Normally we'd merge the threads but I don't think it makes much sense in this case - there's obviously community appetite to discuss this, so I think we probably just need to eat the duplication, even though we try hard not to.

There's a slightly similar case high on the frontpage right now: https://news.ycombinator.com/item?id=37434918. The thing isn't released yet but the appetite to discuss it is strong.


@jarred will bun work with pm2 out of the box?


So it is intended as a drop-in replacement and is tested against React. Will there be support for Angular apps as well?


Congrats, Jarred. Have really enjoyed following along with your progress on Twitter. Excited to give Bun a deep dive :)


Congratulations on the release! Looks really promising. I wish I was smart enough to solve some of my own issues


From a quick glance, it doesn't seem to support building Vue.js or Electron projects out of the box


I'm curious about the Bun Framework API, is there a release date set for that?


Oh that's awesome! Now to find out if this works for SvelteKit projects...


I haven't tried it, but https://bun.sh/guides/ecosystem/sveltekit suggests it works fine.


They have a tutorial for that; it looks like it just works. The devil is in the details, though, so I would try everything from the dev environment to building production bundles before switching.


Thanks sir! Been looking forward to this and spinning it up right now :)


superseded by https://news.ycombinator.com/item?id=37442646

====

The docs' install section for Linux is missing DEB/RPM packages


Just to double-check, Bun doesn’t support targeting the web, right?


Are there any edge cases when using bun with react-native?


I don't suppose this can cope with Node-RED yet?


This might work now. It didn't in v0.5.x and earlier. We added proper CommonJS support, which means the sloppy-mode features Node-RED relies on should work just fine.


It does! At least the base setup works with "bun run node-red". I'll have to test third-party modules, of course, but the 3.1.0 release seems to work OK after 15 minutes of testing.

I'll have to see how it goes on slower machines, ARM, etc.

Thanks!


Another datapoint - wisp, my favorite LISP-to-JS transpiler, also apparently works fine: https://github.com/wisp-lang/wisp/issues/174

This makes for a pretty awesome combination considering bun's system calls and sqlite support.


The development environment does not work anywhere except macOS. This is not FOSS, just a startup that Apple is going to buy really soon and make its cloud computing platform.


Any plans for Electron support (or similar)?


When will Bun open up to libre and/or federated chat options instead of requiring users sign up for and agree to Discord’s ToS?


Congratulations for the release.


Congrats on the release!


Congrats!


deno or bun for learning?


Congrats on 1.0!


[dupe]


Yesterday's news, more discussion over here https://news.ycombinator.com/item?id=37424724


I thought we, as an industry, had all agreed that `curl https://bun.sh/install | bash` was something we should never do and never encourage.


If you can't trust that, then I suppose you're also checking every node_modules dir for malicious code. More likely, you're just allergic to piping curl into bash.


I don't like `curl|bash`, not because it's insecure, but because I don't know what it's doing to my system, and I don't believe it understands my system properly. Ideally I'd download a single binary and put it in /usr/local/bin myself.


If you don't trust bun.sh, don't do it. They would be torpedoing their project if they put something nefarious in there, though, and it would probably be more effective to put the nefarious stuff in bun directly. Yes, a third party could have gained access to their site and modified that script, but that's covered under "trust bun.sh".


It's also a short, clean and easy to read shell script.


Can you walk us through an alternative and why it would be safer?


You have it backwards: even Microsoft publishes install scripts like this, e.g. for .NET.



