I ported some tests from jest to bun recently and have been pleasantly surprised -- it was pretty much a drop-in replacement and the speed difference is certainly noticeable.
That said, it was a tiny and simple test set[1]. It may not be ready yet for more complex tests, as the docs warn[2]:
> You've never seen a JavaScript test runner this fast (or incomplete).
Paul, if you are interested in speeding up your tests for either Bun or Jest, you might be interested in checking out my latest project. At Brisk we keep pre-built workers for your project in our infrastructure, so there is no build time that eats into your test time. This also means there isn't as much of a concurrency penalty as you get with traditional CI services (scaling up to 30 workers only to spend 30 times as long on building). So we use lots of concurrency (usually around 60 workers) for really, really fast tests. I'd love to hear your thoughts. https://brisktest.com/
Interesting, just checked it out. In this case I need to build and run a local (Rust) dev server in the background while the tests run, does Brisk support that?
So if I understand correctly by local you mean on the test runner? So at the beginning you build your CI env and you start a local web server. Then later on when you are running the tests they hit the local web server (on the runner) during the tests?
We do support this yes. With what we call background jobs. Each test runner also comes with a dedicated docker server for running additional services (db, etc). It’s an experimental/new feature though and I haven’t properly figured out how to charge for it.
In the happy case, while the worker isn't running tests the background job should be consuming very few resources. But we can't rely on that (and also some people will inevitably try to mine crypto), so figuring out how to make sure the workers aren't hogging the CPU between test runs has prevented me from broadcasting the existence of the feature. It is there, though.
Why does bun run the tests faster? Is Bun a faster JS runtime? Is it running the tests in parallel? (This is possible with nodejs using some of the test runners). Is it starting & stopping the server differently? (In process vs child process, or something like that?)
bun:test is fast mostly because it’s integrated deeply into the runtime. Other test runners typically have to mutate globals and spend a lot of time setting up the environment and tearing it down. Also, expect() in bun:test is something like 10x faster[1] than in vitest (100x jest) because it’s written in native code.
Test runners involve a lot of polymorphism which means JS doesn’t get JIT’d much — in cases like that, writing in native is much faster.
Another factor is transpilation time. bun:test supports TypeScript and JSX without plugins, and bun's transpiler is very fast.
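For what it's worth, a test file under bun:test looks just like a Jest one. A minimal sketch (the file and assertion are just illustrative; run with `bun test`):

```ts
// minimal sketch: a jest-style test running under bun:test
// the describe/test/expect API is imported from the runtime itself
import { describe, test, expect } from "bun:test";

describe("math", () => {
  test("adds", () => {
    expect(1 + 1).toBe(2);
  });
});
```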
2. Built-in APIs. A lot of Node's APIs are implemented in JS and for this reason (or simply because they're not well optimised), they run more slowly than Bun's implementations.
3. Bun also has a highly optimised built-in test runner.
It sounds like a test harness bug if startup time matters; the setup here is quite suspicious.
JSC vs V8 is interesting... I'd like to see something like tsc compile times for real projects compared. And then again against something like SWC (rust impl) :) I doubt we'll switch to bun any time soon for prod runtime, but for speeding up tsc/lint/babel/etc, sure!
You might be surprised how much startup time factors in. The last “blazing fast” Node test runner I tried had something like ~300ms of fixed overhead just to even start running a single test. None of that overhead is reported in stats, which is kind of understandable because it’s timing your tests, not itself. But 300ms just to get started measuring is a bit much IMO.
Anyway. Last I checked Bun still handily beats SWC (and ESBuild) on basically everything, and beats Node on tons (all?) of runtime metrics. Perf has been its foundational principle from the start, and every aspect of what’s under test in the article has been specifically mentioned as optimization targets by Bun’s creator. You’ll find many of your questions answered on his Twitter[1].
One thing I think is worth adding is that Bun’s perf obsession has been percolating a lot of optimizations upstream to WebKit/JSC. So even if you don’t ever adopt Bun as a server runtime, you or your users are likely benefiting from those contributions on the client.
I changed some code for a tool to use worker threads instead of spawning child processes. We had libraries that could talk to services, and the tool's job was to compare prod and preproduction, so the simplest thing was to start two processes, one configured for each environment.
But the call time had gone north of 1 second, so there was a lot of friction to adding more questions. I subsequently fixed some dumb bootstrapping code to the tune of ~250 ms, which would have been enough to add all of our current apps to the tool, but what really helped was switching to worker threads, since the number of environments to check was finite. We only needed k threads, so we could eat the startup time entirely.
Bun is also faster. If you added my fix and used Bun, you could use forking to cover cases that are O(m) instead of O(k). (Typically k << m < n when doing orders).
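Roughly the shape of it, as a sketch (file and message names are made up, and the worker script is assumed to answer `{ query }` messages): keep one long-lived worker per environment so the startup cost is paid once instead of per question.

```ts
// rough sketch (hypothetical names): one long-lived worker per environment,
// so startup cost is paid once rather than per comparison
import { Worker } from "node:worker_threads";

const environments = ["prod", "preprod"]; // the finite k mentioned above

// ./env-worker.js is assumed to reply to { query } messages with a result
const workers = new Map(
  environments.map((env) => [env, new Worker("./env-worker.js", { workerData: { env } })])
);

function ask(env: string, query: string): Promise<unknown> {
  return new Promise((resolve) => {
    const worker = workers.get(env)!;
    worker.once("message", resolve); // naive: assumes one question in flight per worker
    worker.postMessage({ query });
  });
}

// compare the same question across environments without re-paying startup time
const answers = await Promise.all(environments.map((env) => ask(env, "getConfig")));
```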
> Last I checked Bun still handily beats SWC (and ESBuild) on basically everything
What did you check?
Bun only has basic transpiling currently. I can't imagine what "everything" is, and without bundling, minification, and down-transpiling support I don't see how it could have been tested on any meaningful project.
> Bun’s perf obsession has been percolating a lot of optimizations upstream to WebKit/JSC
Source?
AFAIK claiming either "bun devs made a lot of JSC perf optimization PRs" or "bun devs pointed perf hot spots to JSC devs" would be incorrect, but I'd love to learn otherwise.
> Bun only has basic transpiling currently. I can't imagine what "everything" is, and without bundling, minification, and down-transpiling support I don't see how it could have been tested on any meaningful project.
Yeah, build/bundling use cases have lagged in priority. I’ll rephrase “everything” as “every transpiling use case”.
> Source? AFAIK claiming either "bun devs made a lot of JSC perf optimization PRs" or "bun devs pointed perf hot spots to JSC devs" would be incorrect
Jarred’s Twitter has been frequently highlighting these contributions. This seems like a weird thing to object to if you’re so familiar with the project!
As someone that has always used Node in the past for side projects and just started looking at Deno should I be looking at Bun instead?
Heads up for the author: the dark theme seems to be handling the tables incorrectly (keeping the tables light but inverting the text to become nearly white).
Bun is fast. But honestly I don't care about the fact that it can handle more requests. What matters to me is how much it affects my dev experience. See how Bun performs in a non-typical benchmark.
For npm install it seems it is a solved problem: just use pnpm as a drop-in replacement. It's such a basic idea: instead of reinstalling dependencies in each repo, just hard-link them from a common store and there you go, you just reuse them. Simple yet brilliant.
That doesn't help CI performance. I bet `bun install` is way faster than `pnpm install` on a cold or hot cache (though I haven't tested myself). I think it does similar things to pnpm under the hood to improve local development speed as well.
If performance is critical IMO it would make a lot more sense to switch to Rust or Go than Bun. Instead of getting a 2-3x improvement (best case scenario) you will get more like 10-20x.
Webpack 5 is probably the last version. Not my words, but from its main author. Vite is the new de facto bundler which uses EsBuild (written in Go).
I don't think you understand the point of bun. It's not to make npm/webpack/typescript faster at running JS code, it's to replace those tools entirely. So in essence, it's doing the same thing that esbuild is doing, just across all of the major tasks people currently do with node. What's different about bun vs esbuild is that it's one tool you can use for everything you do in JS projects.
I've swapped much of my usage of npm over to bun. I still work on projects where nobody else is using bun, but I can run `bun install` instead of `npm install` and `bun run` instead of `npm run` and these run probably 100x faster.
bun is already faster than esbuild? i don't understand this comment. the bun transpiler is basically a direct port of esbuild to zig.
esbuild is a javascript transpiler. it transpiles typescript into javascript that v8 or another js engine can run. it's written in go which is the main reason it's faster than pure js transpilers like babel. were you thinking that esbuild is for go code? i'm confused
bun is a combined javascript transpiler & runtime & test runner & package manager. it improves performance of many things:
- startup time including transpiling code and resolving modules and initializing the javascript vm
- installing packages
- running tests
it can also improve runtime performance (it is faster at basically every bit of the node api it covers & it has its own versions of some things like http servers that can have better performance than the node APIs)
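for example, here's roughly what bun's built-in http server looks like (a minimal sketch; the port number is just for illustration):

```ts
// minimal sketch of bun's built-in http server (run with `bun run server.ts`)
const server = Bun.serve({
  port: 3000,
  fetch(req) {
    return new Response("hello from bun");
  },
});

console.log(`listening on http://localhost:${server.port}`);
```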
switching your project to rust would improve runtime performance, but would make it slower to run tests & compile the project & use an ide language server & …
i haven't used go but i've heard it's fast to compile & has better runtime performance than js
Just watched it - I don't see him saying anything about 5 being the last major version. In the Q&A he answers the question "should I use webpack for a new React project" with an "I'd say a biased 'yes' but it depends". He weighs the pros and cons of fixing vs rewrite, identifying that there are fundamental issues which will never be addressed in webpack.
FWIW the presenter is employed at Vercel on building Turbopack, which is presented as the "webpack successor" - so I'm quite sure either way he wouldn't agree with your take on esbuild being the future ;)
There's a big slide that says: "I don't know" about using Webpack for new projects. Coming from its main author, I'd say that very closely matches what I wrote before.
The Vite part I added myself. Sorry if that wasn't clear.
Regarding TurboPack, we'll see. So far it's not very popular outside of Next. SvelteKit uses Vite, even though Vercel pays two people to work on it full time.
That's an "I don't know if Webpack will be a popular choice for greenfield projects in 5 years" (which is stating the obvious and should be equally true for Vite or turbopack at this point and very far from what you wrote).
Maybe a cultural difference is behind the misinterpretation? The lack of self-inflation, exaggerated confidence and exceptionalism we might expect from Americanized presenters shouldn't be taken for insecurity or doubt ;)
I'd take what they say at face value rather than trying to read between the lines with the wrong lens.
> Regarding TurboPack, we'll see. So far it's not very popular outside of Next.
It's very early still, in alpha, and isn't recommended for production use outside of Next, where most of the focus on integration and tooling has been so far AFAIK.
Besides the other beneficial use cases others have mentioned, there's much more to perf than req/s for a webserver.
Basic examples:
- Latency/processing (which suffers as req/s grows or peaks)
- Memory usage (idle, at peak)
- Startup time (one-off jobs, serverless cold boot)
The "thousands of req/s" benchmarks are often unrealistic - either super simple (the benchmark you linked serves a static tiny JSON payload) or hyper optimized and not representing a run-of-the-mill webserver.
> The "thousands of req/s" benchmarks are often unrealistic
A hello world Fastify project can get you into the 20k reqs per second range. On the same hardware, a real project will probably get a couple thousand, which is more than enough for 99% of projects.
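For concreteness, the kind of "hello world" server those benchmarks measure is just this (a minimal sketch; the port is arbitrary):

```ts
// minimal "hello world" Fastify server, the kind of thing req/s benchmarks hammer
import Fastify from "fastify";

const app = Fastify();

// single route returning a tiny static JSON payload
app.get("/", async () => ({ hello: "world" }));

await app.listen({ port: 3000 });
```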
You’re right. The first paragraph of the article acknowledges that these kinds of throughput benchmarks don’t matter, and the author focuses on the CI/CD time savings you get with tooling like Bun, which can be substantial.
Faster is always better. Given the same hardware, faster requests, faster CI pipelines, etc... = lower power usage. Scale this up to the planet and the power savings could be substantial.
CI is one of the clearest cases of “it feels so good when you stop” that I know.
There are few areas of software where people will obstruct efforts before and support them after quite as much as CI. The biggest is probably software process maturity, where people have to see a scary production issue evaporate into the run book before they “get it”. Build pipeline maturity is solidly in third place, sometimes second.
I've been seeing a lot of glowing benchmarks and posts about Bun recently, but haven't heard anything about its performance or overall experience running actual production workloads. Does it work? How does it compare to Node, Deno and the rest? I'm especially interested in hearing about any possible inconsistencies between V8 (used by basically every other JS runtime) and JavaScriptCore (somewhat unique to Bun). Has anyone used it in any "real" capacity yet?
typescript (tsc) is the only one that does type checking.
bun, deno, esbuild, swc etc. can parse the syntax, but they chuck the TS (they probably don't even add it to the AST, but I haven't checked).
Keeping up with syntax is very doable. It doesn't change often, and updating the parser when it does isn't much work.
There are some past/ongoing projects[1][2] to create type checkers faster than tsc, but they aren't going to reach full parity and probably don't plan on keeping up with language features.
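To make the distinction concrete, a sketch: this file has a type error that only tsc reports, while bun (and the other type-strippers) will transpile and run it without complaint.

```ts
// sketch: only tsc catches this
// `bun run file.ts` strips the annotation and runs it; `tsc --noEmit file.ts` rejects it
const n: number = "not a number"; // tsc: error TS2322: Type 'string' is not assignable to type 'number'
console.log(n); // prints "not a number" under a type-stripping runtime
```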
So tsc runs on bun and deno? Do they implement any node apis used by typescript? I guess there's very little surface area - process, fs, maybe stream, events, and worker_threads or child_process?
[1] https://github.com/drifting-in-space/driftdb/blob/main/js-pk...
[2] https://bun.sh/