Bun 1.2 Is Released (bun.sh)
277 points by ksec 3 months ago | 208 comments



Why are they putting 3rd party (databases) and external (s3) libraries into their core / std lib? Wouldn't that be better as an optional library? I think a runtime like this should be very, very careful with what they put into the std lib; adding these already makes it feel like a kitchen sink project.


I think this position is also one of the reasons it gained attention. Batteries included is a popular and deliberately chosen route for many.

Setting up typescript can be hard. Same goes for webpack, s3, postgres, jest and more. I also find the simplified file and stream access quite interesting.

Let's wait and see how a distributed deployment provider turns out.


Which means they did not learn from Python's mistakes. You need to look further than the next couple of years. For some reason, during the past 10 years or so it has become fashionable to throw away everything the industry has learned in decades prior and start anew, repeating the same mistakes. We'll never turn this into proper engineering with such an attitude.

https://lwn.net/Articles/776239

https://lwn.net/Articles/888043

https://lwn.net/Articles/790677


This doesn't seem like a mistake, other than being the obvious on-ramp to the scope-creep, maintenance-creep pipeline. Python having a highly capable standard lib is exactly why some people love it, especially "10 years ago" when it was very slow to rev up machines that needed to be very much isolated from other machines, and you just had to have a bright enough person on systems making sure everything was included from the get-go. Python saved the ass of more than 0 people reading this discussion right now on that merit alone.

I think there's a simple solution to all this. Libraries targeting third-party protocols get an expiration date and have to be forcefully replaced by name after a given number of versions. Even if they keep the same underlying code, still change the name to force developers to look up its usage and legacy. How many versions? However many equates to the threshold you use to call most systems "legacy". I don't mind some job security and some timebomb punishment aimed at dinosaurs. I have bigger and more consistent issues with that than with whether or not C++ lets me crack a .rar without extra libs.


The opinionated batteries-included stdlib was exactly what made Python popular in the first place though (and even despite its 'weird' syntax). And tbh, most of the current problems were also only added in the Python 3.x era (e.g. 3.x looks very much like a 'second system' - https://en.wikipedia.org/wiki/Second-system_effect)


> during the past 10 years or so it has become fashionable to throw away everything the industry has learned in decades prior and start anew

This is a curious take to me. I've spent the last 10 years seeing people claim again and again that if JS just had common stuff built in like <other lang>, we wouldn't have all this library churn, node_modules bloat, and left-pad silliness. That the mistake was not including a standard library.


This is mostly a sentiment I see from people who used or heard of JS 10+ years ago and didn't bother to learn the language features and tooling added since.

Standard libs can be great, but they should really be reserved for baseline features, especially in a language like JS where all changes must be backward compatible. The standard library JS has now is not at all what it was in the early 2010s; it's a very good set of baseline features.


Python's "mistakes" are my weapons in my restricted corporate production system.


Batteries-included is one of the main things that made Python great and useful before it became big enough to not need it (I don't know about Bun).


Batteries included means that wherever there is a fully compliant implementation, there is something available, even if it isn't the best solution in the galaxy.

Whereas the best solution in the galaxy might only work on a few select planets, in other ecosystems without batteries.

I prefer batteries included, and not having a culture with a function per package.


As most things do, it cuts both ways. Rust suffers from its very slow adoption of libraries into a standard library, imo.



A thing can be both a mistake from a maintenance perspective and a great boon from a user perspective, or even a horrible kludge from a power-user perspective.


I think their move away from a binary lock file to a text-based lock file in this release makes this pretty clear - they shoot first and ask questions later. The problems they've identified with the binary lock file are kinda obvious if you think about it for a bit. A strong indicator that you should think about it is that the popular ecosystems with package managers (npm, ruby, rust) have text-based lock files. The fact that the bun team didn't think about it and assumed binary was better because it was faster and no one else had thought of this idea feels like hubris to me.

It's cool that they're doing the mainstream thing now, but it's something for them to think about.


Or they think extensively about performance.

Regardless, the switch shows they pay attention and are willing to change.


I despise Python's tooling and never touch it willingly.

That said, Python's 'mistake' also made it one of the most used languages ever. For nearly 2 decades, you could just type `python` in a terminal and get rolling, and that was invaluable.

The only real 'mistake' Python made was breaking backwards compatibility so spectacularly that its single greatest feature was rendered useless.


> single greatest feature was rendered useless.

Which feature are you referring to?


being able to type `python` and start writing a program that would work nearly everywhere.

With the compatibility break there was a decade of confusion; even the simplest print statement wouldn't work. I understand there were real reasons to do all that, but it did cause damage.

Steve Yegge put it better than I can[0]:

> the thing is, every single developer has choices. And if you make them rewrite their code enough times, some of those other choices are going to start looking mighty appealing. They’re not your hostages, as much as you’d like them to be. They are your guests. Python is still a very popular programming language, to be sure — but golly did Python 3(000) create a huge mess for themselves, their communities, and the users of their communities’ software — one that has been a train-wreck in progress for fifteen years and is still kicking.

> How much Python software was rewritten in Go (or Ruby, or some other alternative) because of that backwards incompatibility? How much new software was written in something other than Python, which might have been written in Python if Guido hadn’t burned everyone’s house down? It’s hard to say, but I can tell you, it hasn’t been good for Python. It’s a huge mess and everyone is miserable.

[0] https://steve-yegge.medium.com/dear-google-cloud-your-deprec...


> there were real reasons to do all that

No there weren't. It's just pure idiocy and incompetence.


Well, I would agree with you. But I'm no language designer nor maintainer. It could all be bollocks, but since I'm the ignorant one, they get the benefit of the doubt.


golang is also batteries included, which seems great. i hope lua can have something similar, though smaller


> Setting up typescript can be hard.

Node just enabled it by default. You still need the dev dependency for manual compilation and checks, but at runtime it should "just work". https://nodejs.org/en/blog/release/v23.6.0
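For example, roughly (a minimal sketch; assumes Node 23.6+ where type stripping is unflagged, and a file that only uses erasable TypeScript syntax, i.e. no enums or namespaces):

    // hello.ts
    const greet = (name: string): string => `Hello, ${name}!`;
    console.log(greet("Bun 1.2"));

    // run it directly, no build step:
    //   node hello.ts

You'd still run `tsc --noEmit` (the dev dependency mentioned above) for actual type checking, since Node only strips the annotations.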


As enthusiastic as I am about node's typescript-support, calling it "just works" is a bit of a stretch. Not entirely sure if it's still the case anymore on the latest versions but last I checked it was required to use `.ts`-suffixes for all the imports, something a standard typescript project will hardly ever have and needs to be specifically configured to be considered valid syntax (allowImportingTsExtensions:true).

But yeah, there's progress, and once this gets solid traction (which I'm sure it will) it might finally be the last drop in the bucket to convince TC39 to stop being so antagonistic to having some notion of type-support directly in Javascript.


> but at runtime it should "just work"

Maybe when it doesn't use WASM and there's proper integration. Otherwise it's just like npm and people still need to look for alternatives.


nothing wrong with that, but why put it on a global object instead of a built-in module? Better yet, that module could be published on jsr/npm


AFAIK bun is VC backed, so they need to make money at some point, therefore a speculation:

They want to make bun an all-in-one runner in order to vendor-lock you in somehow. But I might be wrong. It indeed does not make sense to put such dependencies in the core/std lib.


I think that's the plan too. But I'm still not fussed about it. At the end of the day, it's still a great tool for getting stuff done and one less decision I have to make.

Also.. I don't quite know how they're going to lock me in unless they mess with the license. I can deploy my own bun server anywhere, no? I'd be stuck on bun I guess but still not paying anything.


Quite a lot of people have told Jarred (Bun's author) the same thing, but his opinion is that Bun should have everything a basic project might need. By keeping it in core he can make it more optimized than it would be as a third-party library.

It's a misguided approach in my opinion. And I feel Jarred has become way too ambitious. But what can I say, it's his passion project.


I can understand this vision: it's neat to be able to just open a file and start making a program, without having to choose one of the 20 relatively popular Postgres client libraries available on npm.

From the engineering standpoint, sure, it's a disaster.

But it's also the lost magic of Turbo Pascal and friends, where you could just be immediately productive, with no dependencies, no external tooling, on an old computer gathering dust in the school library.


> without having to choose one of the 20 relatively popular Postgres client libraries

They can easily provide official extensions/packages clearly namespaced and avoid all this mess. But I fear that they're more focused on a "headline-driven-development" approach, the more different from the status quo, the better


And it's successful! I've unironically learned about Bun on YouTube, and now am a happy user. :)

A set of "blessed" known good packages would be wonderful to have in any language, I wonder why it's not a thing.


I'm wondering what made you switch? I find the ubiquity of node very comforting; maybe I'm not seeing the argument that will convert me over?


Not entirely switch, but so far I've been using bun for small side projects and it's been rewarding. There's a lot less project managing, some scripts can be just a single file with no dependencies, no package.json and no build chain.

Of the "batteries included" features, the `Bun.serve` web server and the `HTMLRewriter` HTML5 SAX API thing are very powerful, saving a lot of node_modules space.

Running TypeScript without a build step is neat. (I know there's `node --experimental-transform-types`, and it's great to see this feature propagated "upstream")

That said, for large established Node.js projects, I don't really see the pull to switch over to Bun. There's cost associated with it (for one, the built-in test runner works a bit differently, so if you're using `node --test` it'll require some fiddling)


Not the original commenter -

It has just been a pretty low effort drop in replacement for me. It's definitely not a complete game changer, but quick iteration is just that bit more convenient since it's faster and I don't need to remember all the flags I normally have for my setup (Typescript, .env file, etc...)


No, thanks. That's what Kotlin does and it's an absolute shitshow. I want to `editor index.ts` and start working, instead of going shopping for something that should've been part of std.


It's one of the things I love about Bun. I now write any non-trivial stand-alone script in TypeScript, which covers a lot of functionality without needing any config or installing any external packages.


I think it's great. There are already alternatives so makes sense for Bun to do its own thing


I like the batteries included option and would likely use it a lot. But not sure why they simply did not make a "batteries" package you could install on top of core and avoid the inevitable push back they knew they would get from this.


Deno is doing this. I'm not sure if it's good or not, tbh. Now I have to worry about the std package version on top of the Deno version. And if I'm going to install a random package, how am I sure Deno's std is the best? Maybe I should choose something else. Back to decision fatigue.


Misguided or not, it is good to have different approaches, including the ones that don't work.


Bravo, let a hundred flowers bloom. May the best ones win, or at least prove themselves to be "worse is better".


> Wouldn't that be better as an optional library?

Totally agree.

In their words, "Bun aims to be a cloud-first JavaScript runtime. That means supporting all the tools and services you need to run a production application in the cloud". This doesn't give me a lot of confidence.

This particular design choice seems even worse than Node.


> This particular design choice seems even worse than Node.

We could argue that it's worse/better, but in the end it's just different. NodeJS when it appeared had the vibe and "marketing" to be something lightweight, fast and event-driven (compared to the alternatives at the time at least), where the 3rd party ecosystem provided the tooling for what Bun now tries to bundle into their "all-in-one" tool.

We've seen the same cycle multiple times. Developers need flexibility to configure something, so a flexible solution appears, everyone gets excited and starts migrating. Eventually, more developers are tired of the flexibility and don't understand why there are so many configuration options, so an "all-in-one" solution appears, everyone gets excited and starts migrating. Eventually, people need to be able to configure more things, so...


Just wait until they give preference to services that their VC benefactors are also supporting.


This is for their managed offerings. The problem of VC backed software was always that there would be these integrations that'd try to provide something unique to it, or IOW, lock-in.


What's the monetisation plan?


Since at least web2.0 I've been burned many times. So my default position on VC backed projects without clear monetization plan is that it is probably bait-and-switch.


I think it makes it easy to provide paid hosted services later if needed.

Hypothetical example: S3 client built in, enable a flag and now get a dashboard seeing analytics around file downloads, download latencies etc

Just pure hypothesis on my end given they have to make money somehow at some point


I'm sure in the short term people will be happy, but maintaining that in the long run is a footgun. I can't see a team big enough to achieve that correctly, especially with the bug confusion (e.g. is it Bun or a third-party issue?) that will creep into the main repo.


As long as it works and follows a consistent API, why would that be an issue?

I kinda like the idea of not having to import potentially very slow JS code to do things that I need in basically all my projects.


The interesting question is: where do you draw the line? Should an HTTP server/client be in the stdlib? File access? String templating? And why not a window system, 3D API and UI wrapper?

IMHO a stdlib should mainly provide standardized interface types, but not necessarily the implementations behind those interfaces. But that's probably not a very popular opinion since it falls between the two existing options of having a very bare bones and a batteries-included stdlib ;)


Those packages exist already though. Pretty sure the bun maintainers (or Ciro Spaciari in this case) asked the question "how fast could it be if written in zig?".


Isn't it possible to write it in Zig as a separate extension? Every mature language I'm aware of supports this AFAICT


Tree shaking means it won't bother you if you don't use it


it is a very bad idea. you start your project using the built-in client libs, you are locked into bun as your js runtime. if the license changes you are stuck.


Not only that, but when the new hot JS runtime inevitably drops, it's going to be a pain to move to that.


What "new hot JS runtimes" are you talking about? Node has been pretty stable for over a decade and a half. This is the first time that anything is even challenging that place. It just so happens that both Bun and Deno efforts started around the same time and now have to compete with each other

I think any competent company would be savvy enough to avoid lock in for a technology with adoption this low. I don't think that's what these features are aiming for. I think they're aiming for the young dev starting side projects that wants to get up and running quick. Or imagine teaching a bootcamp class and you want a tool that will do some magic for you so you can focus on explaining other complex aspects of web development


You don't think they're trying to lock us in? What's their big monetization plan then? I think they want us to drink the Kool aid. I don't have a problem with it though because I'm small enough that I can still refactor the world in a few days if I really need to.


s3 is pretty much a de-facto standard, just like json, like it or not. Postgres also makes sense; it is the most widespread and community-liked db. What is the point of "optional libraries" tbh? It was a PITA in PHP back in the day and very inconvenient; prob most devs would prefer it this way.

I feel like HN is in cognitive dissonance: they complain about JS projects having too many dependencies, and they also complain now when things are more integrated into the runtime because it increases vendor lock-in and adds a few extra megabytes (actually kilobytes, according to the devs) to the binaries :/.

Lastly, big companies also prefer less dependencies, it is not just devs.


It probably looks very wrong to people who still think lots of "modularity" and small packages is a good thing.

I'm all for it, and lots of Bun APIs are purely practical. Bun.stringWidth, for example, exposes code Bun already has internally. Nodejs probably has the same thing, but instead of us being able to use it, it gets reimplemented in 10 different versions in node_modules. How is that better?

I doubt the Bun team will have to change the S3 code very much over the years. The test runner, bundler, Postgres client, sure, I can see those being harder to maintain. But I'm also tired of everyone assuming everything needs to change all of the time. DX aside, my team is still on Webpack and we've only needed one new feature from it in the last ~5 years. Why can't Bun's bundler reach maturity and only receive a few updates?


Node does have the same thing, and you can access it with `--expose-internals` from `const { getStringWidth } = require("internal/util/inspect");`. They don't expose it for some reason.
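For the curious, a rough sketch (assuming the internal module path hasn't moved):

    // width.js -- run with: node --expose-internals width.js
    const { getStringWidth } = require("internal/util/inspect");

    console.log(getStringWidth("hello")); // 5
    console.log(getStringWidth("日本語")); // 6 -- wide characters count as 2 columns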


I'm also still using webpack. Not sure why the community abandoned it. Haven't had any issues. Everything seems to use babel under the hood anyway and they all have their little wrappers over top.


> s3 is pretty much a de-facto standard

So was XML before, and SGML before it. De-facto changes over time, and backwards compatibility means your decisions are cast in stone.

In 20 years, you could see s3 being abandoned for newer APIs, but bun will have to keep those packages.


So? I mean, what is the real downside, besides 5 KB of additional file size?


Assuming backwards compatibility.

Ongoing maintenance burden. Ever increasing API, mistakes set in stone, subpar performance or properties.

"Standard library is where libraries go to die." exists as a saying for reason.


Back in what days of PHP? I've been using PHP and MySQL since... probably 2003ish and never had a problem. I think you need the extension, but most providers had it.


> s3 is pretty much a de-facto standard

Finding documentation on that standard (particularly the edge cases) is very difficult


>Wouldn't that be better as an optional library?

That would be better if the built-in libraries were not optimal or were merely good enough. In Bun's case they're not doing just the minimum; they are basically making everything as good as it can be.


I don't like it either but in practice it doesn't really matter. Tree-shaking means those unused features won't affect your project


Except that you can't shake built-ins. It's in the bun binary.


So? It won't affect performance. Don't call the library and it's effectively unused code in the binary.


I'm not the one complaining. I'm just saying this has nothing to do with tree-shaking.


that's old school javascript mentality. new school is bake it all in and let people get to work on the interesting bits.


Using bun has been a great experience so far. I used to dread setting up typescript/jest/react/webpack for a new project with breaking changes all over the place. With bun, it’s been self contained and painless and it just works for my use. Can’t comment on the 3rd party libraries they are integrating like s3, sql etc but at least it looks like they are focused on most common/asked for ones.

Thanks for the great work and bringing some much needed sanity in the node.js tooling space!


How does bun make a difference in the frontend tech stack that you mentioned?


Last I tried (several months ago) it didn't; the built-in frontend bundler was not very useful, so everybody just used 3rd party bundlers, so (for most people) it would not have any meaningful differences compared to nodejs. It seems they are putting more effort into the bundler now, so it looks like it can handle plain SPA applications just fine (no SSR). The bundler is inspired by esbuild, so you can expect similar capabilities.

IMO the main benefit of using their bundler is that things (imports/ES-modules, typescript, unit tests, etc) just behave the same way across build scripts, frontend code, unit tests, etc. You don't get weird errors like "oh the ?. syntax is not supported in the unit test because I didn't add the right transform to jest configuration. But works fine in the frontend where I am using babel".

But if you want to use vercel/nextjs/astro you still are not using their bundler so no better or worse there.


not up to this point, but with this release, bun is now a bundler.

That means potentially no webpack, vite and their jungle of dependencies. It's possible to have bun as the sole dependency for your front and back end. Tbh I'll likely add React and co, but it's possible to do a vanilla front end with plain web components.


Bun has always been a bundler (and package manager, and Node runtime). This release adds "HTML imports" as a way to use the bundler.


Doesn't the name "bun" come from the fact that it's a bundler?


No, it's just a name. Originally it was an alternative to nodejs.


not really a bundler without a dev server that you can just set to an entry point, and css support


i have been setting up these react/ts/etc projects with vite or next.js just fine, i think you're underestimating how much progress happened in other tooling as well


Idk about next 15, but you can literally bootstrap next 13 using a single index.tsx with typescript & next being the only 2 dependencies in package.json. No typescript is fine too.

It's not new, has been the case for a few years, so honestly I don't get people complaining about next's complexity.


Bun is amazing. It's a life hack for me. ChatGPT doesn't know much about it so there's some productivity hit, but I love bun.


Lots of good stuff here, but I do wonder if some of the default behaviour is getting a little too magical:

When you use new Response(s3.file(...)), instead of downloading the S3 file to your server and sending it back to the user, Bun redirects the user to the presigned URL for the S3 file.

That's a rather surprising choice for the default, and it's not at all obvious how you'd disable it if you don't want to expose your S3 bucket directly.


I respect the area Bun is trying to carve out, but if you follow the creator on Twitter, you'll see the decision making process is very focused on short-term wins/convenience and very light on the deeper implications of the "magic"

I was initially excited about the project, but have no faith in the long term direction given the spurious (and often poor IMO) design decisions. At least as it's publicized on socials.

It would have been fine if they kept it v0.x, but releasing 1.0 should have significantly raised the bar for increasing API surface area


Pass a stream into the response.

Response(file.stream())
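Spelled out, a rough sketch based on the release notes (bucket path made up):

    import { s3 } from "bun";

    const file = s3.file("uploads/report.pdf");

    // default: Bun responds with a redirect to a presigned URL for the object
    new Response(file);

    // proxy the bytes through your own server instead of exposing the bucket
    new Response(file.stream());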


I mean thanks but that doesn't change the fact that

> That's a rather surprising choice for the default, and it's not at all obvious how you'd disable it if you don't want to expose your S3 bucket directly.


To be fair, it does answer my “not at all obvious” bit -- just calling stream() is pretty clear! I should have thought a little longer.


I don't like this design, though I get why they did it. Personally I think this is very 'side effecty', and it reads like you'd be returning the file contents and not a URL.

Even so, what would bother you about exposing the bucket? It's also a presigned URL so it doesn't have broad access utility.

I'd prefer an API like this where it's explicit:

    Response(file(...).getPresignedURL())
Alternatively, the option to set an env variable or bun config which turns this behaviour on


> What would bother you about exposing the bucket?

There's no immediate security hole, but I'd still be wary about exposing the bucket name, directory name and object name unnecessarily. Those might have a direct correspondence with the original public URL, but they might not. Maybe the object name is a database ID and the same ID gets used elsewhere for some other purpose. Who knows? Why take the chance if you don't need to?

> I'd prefer an API like this where it's explicit

That's the thing, in the previous section it already described the "s3.presign()" function. If you want to redirect to a presigned URL, you already have everything you need, no magic required.


I missed the presign method. It would be very concise to be explicit here.

The more I look at recent bun developments the more it seems like they’re shooting for magic rather than intelligent design.

It’s clearly impressive stuff and I’m glad they’re doing what they’re doing, but stuff like this makes it so I’m hesitant to commit to the runtime APIs.


Batteries included makes a huge difference; this is why i love that Web APIs (Fetch API, Service Workers, Web Components, ES6+, WebRTC, etc.) are now native on both the V8 and WebKit runtimes.

But it has to be to a certain degree. Maybe S3 is too far, but SQL drivers make sense - but again, to which degree? There are _many_ databases out there; should there be drivers for half of them? Even at that level it's a lot of added code, which means a slower executable.

Also, I think Bun is missing out on security by adding such sensitive APIs to Bun; imagine bun taking all your source files and uploading them to your private S3 due to some script or path issue that allowed eval to run! It's game over right there.


> Even at that level it's a lot of added code which means slower executable.

Does it? Legitimate question. I would've assumed that this could be almost entirely negligible depending on how the code is loaded into the runtime. If the code being loaded is only triggered when an import statement is seen, wouldn't that lead to essentially no speed overhead?

Even if it was statically linked in, I don't see why having the code would slow down the executable by any amount that we'd want to consider. Maybe literally more of an executable to load into memory, but I don't see that being a tangible slow down.

Would love to know if I'm missing a big piece here though.


All modern build tools have pretty good tree shaking. If you don't use these features they should have basically no impact on your final build


Neither deno nor bun do tree shaking, because they can't make the same assumptions as front-end code. They do cache the compiled code though, which makes some difference. But the reason the S3 SDK is at least 5x slower than bun is partly code size, since JavaScript always has to parse the code on every run. I worked 3 years on cloudflare workers, and scripts even 1MB in size are slow and cpu-intensive to run (even before actually using them!). Now, even the most common auth library for node with basic features is 20MB of scripting. That puts things like this in perspective.


There was a discussion here on HN a while ago about why browsers will not support SQLite first-hand. Maybe that point applies to Bun too:

Point is, who is responsible for maintaining the lib, and how do you change the lib when SQLite changes? There might be a bug in SQLite. How do you fix it in Bun? Which versions receive a fix? How do you handle that a patch of your runtime (Bun) now might change the behaviour of code running on it (because users worked around it)?

These are solvable issues, to some degree and with some downsides. However, at some point you stop being a runtime and start being a platform, which will bring other responsibilities and issues with it.


Well, also there isn't a reliable consensus on what "sqlite" actually is, due to extensions and build flags.


I think the argument in favor of S3 is that there are many object storage services that implement an S3-compatible API. I know it's not truly a web standard, but it is also something that a lot of people have standardized around.


> Even at that level it's a lot of added code which means slower executable.

> the article literally provides benchmarks where bun is twice as fast as the fastest Node.js solution


A recent client uses bun in production.

I'm told their dev experience with Bun is out of this world bonkers good because of speed and simplicity.

Dev experience can play a big role long term. If your codebase and/or process sucks, you'll lose good people unless you pay FAANG tier compensation.


I have not tried Bun yet but the long list of features makes me skeptical that it's all solid and bug-free. I'm wishing to be wrong. I'll give it a spin in a future project.

From a project management perspective I'm a little confused why would you spend time on S3 support while you're still not 100% Node.js compatible. Next.js is a very big ecosystem and if you can get Next.js customers onboard you'll grow much more than supporting S3.


> while you're still not 100% Node.js compatible

100% compatibility is a nice marketing win, but the long tail of compatibility may not make much difference to the average user.

What percentage of the total Node.js API surface area do you actually use in your day-to-day? How many weird edge-cases therein are you actually depending on?


Do you inspect 100% of the code of each library you use to make sure it does not rely on missing or incompatible functionality?


I mean, you are either auditing your supply chain or you are not. And if you are not, then minor node incompatibilities are the least of your worries.


> From a project management perspective

This assumes you know what the project(s) is/are. Also the people working on it aren't robots. Maybe certain things take time to figure out and meanwhile you can do something else? It's also not just 1 person on the task.

> if you can get Next.js customers onboard you'll grow much more than supporting S3

Towards what? That doesn't make $$$. This is VC-backed. The goal isn't to provide Bun for free and gain all the users in the world.


This is a very uninformed response IMO. S3 seems very niche compared to Node.js compatibility. Not sure why you're attacking me for saying this?


You mentioned Next.js and then Node.js. As for Next.js, it is supported by Bun (https://bun.sh/guides/ecosystem/nextjs). I don't think it's safe to assume that every single Node API is more commonly used than S3, which really is the standard cloud-based object storage API.


S3 and niche don't fit in the same sentence.


> I have not tried Bun yet but the long list of features makes me skeptical that it's all solid and bug-free.

Especially since it is written in Zig, which is very memory unsafe. I mean, if you reference a variable that is no longer alive, it just accesses some random unrelated memory instead of segfaulting (in debug and safe mode too)[0]. How hard would it be to bolt a memory-liveness system on top that flags a variable name as dead and blocks access to it if it is dead? No, "just don't write UB"[1].

Anyway, I'd certainly not put anything Zig-made facing the internet, especially not a webserver.

[0]: https://news.ycombinator.com/item?id=41720995

[1]: https://github.com/ziglang/zig/issues/16467#issuecomment-164...


In the early days of the project it was segfaulting during performance tests. That was a pretty hair-raising bit of information for me. Deno it is.

That being said, all of these runtimes use a JS JIT written in a memory-unsafe language that emits and executes raw machine code. They frequently have vulnerabilities.


Isn't Deno written in Rust?

Edit: It is


It is not very solid nor bug-free. We tried it last year and it crashed all the time.


> a bug where bun add would not respect the spacing and indentation in your package.json. Bun will now preserve the indentation of your package.json, no matter how wacky it is

I find this entry pretty funny. Who even asks for this, and what makes them think it's worth writing code for?


I’ve used it on big existing projects with tons of dependencies and small projects

I’m impressed

The dumbest thing I saw was Amazon's CDK library looking for specific package-manager lockfiles, which made it semi-incompatible with bun.

But if you use SST it doesn't matter.


> HTML imports: In Bun 1.2, we've added support for HTML imports. This allows you to replace your entire frontend toolchain with a single import statement.

> To get started, pass an HTML import to the static option in Bun.serve:

    import homepage from "./index.html";

    Bun.serve({
      static: {
        "/": homepage,
      },

      async fetch(req) {
        // ... api requests
      },
    });

this is amazing and so cool, thanks a lot!!


How does this allow me to replace e.g. Vite? Is there a way to do hot module reloading, css pre-processing, or load framework specific plugins (like the Vue SFC compiler)?

Serving a static file isn't exactly new, so I feel like I must be missing something.


From its documentation [1] it looks a lot like a parceljs replacement [2], i.e. a zero config bundler which processes and bundles the dependencies in .html pages. So great for simple websites, not for replacing an entire Vite stack.

[1] https://bun.sh/docs/bundler/fullstack

[2] https://parceljs.org


Thanks for the links! I should have researched a little more before replying.

It actually mentions HMR at the bottom of the docs, and I see plugins are already available. So while it can't currently replace my Vite stack for most projects, it seems like it eventually could.

I'm not sure how I feel about this sort of coupling in general, but for small projects it could be very convenient (as you mention).


> framework specific plugins (like the Vue SFC compiler)?

They have a plugin api, but honestly I don't like the sound of "framework-specific plugins". It's because of this that all front-end frameworks are now mini compilers (inside a bundler/compiler), and they've become too comfortable coming up with new wacky syntaxes. I prefer frameworks to just be frameworks, and being able to write normal typescript.


If that's what you prefer, there are plenty of options. I'm happy writing Typescript with no framework (or even plain es6), as long as there isn't UI complexity.

But for a large project, I'll happily trade a compilation step for the tools modern frameworks provide for managing complexity.


bun has its own bundler as well, so I suppose bun already replaces vite & I also think it does css pre-processing, just not framework-specific.

I liked this approach because I actually wanted to do very simple / static serving in bun and I had to jump through a lot of hoops, like reading files from Bun.file() and a few other things.

It's just nice that it's now solidified into the standard library / behaviour, I suppose.


I couldn't think of a project more doomed to fail than a competing alternative to Node.js, but I'm glad I gave it a shot when I needed to create lots of stand-alone scripts to process text files and SQLite DB updates, which I was able to build with TypeScript, bun:sqlite [1] and the bun $ Shell [2] working OOB without needing to manage any configuration files or local npm dependencies.

I've since tried it with new JS/TypeScript Projects which also makes use of its built-in Bundler [3] and testing support [4], installing deps is also instant. Having everything work OOB, fast, are real quality of life improvements where Bun has now become my first choice for any new JS project.

[1] https://bun.sh/docs/api/sqlite

[2] https://bun.sh/docs/runtime/shell

[3] https://bun.sh/docs/bundler

[4] https://bun.sh/docs/cli/test


Same for me. I was in the "what's the point, npm will have all these next year" camp. Finally tried bun and was blown away. Subtle things in DX add up. For my project, Bun is so "next generation".


I actually used Bun for the first time the other day and it was an amazing experience. All my projects have Webpack or Vite configured to let me write Typescript and once setup they work almost flawlessly but it’s a pain to set it all and not worth it for small scripts.

On the other hand Bun worked right out of the box. I had spent 10-30 minutes futzing around with node-ts or whatever the tool is to run TS “directly” on the CLI and I was dealing with the all the dreaded messages “not a module”, “can’t use import/require”, “ESM/CJS” and trying all the normal fixes (changing package.json module type, changing tsconfig, changing the way import/require) all to get a ~200 line script to run. I switched to Bun as a Hail Mary and it worked wonderfully.


Latest node.js now supports "just running ts" for cases like these:

node --experimental-strip-types index.ts

On non-latest Node, the tsx package worked great for me (as opposed to ts-node).

But both of those options just throw away type information (and so does Bun).
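A small illustration of what "throw away type information" means in practice (should behave the same under strip-types, tsx, or bun):

    // twice.ts
    function twice(n: number): number {
      return n + n;
    }

    // the annotations are erased, so nothing checks the argument at runtime:
    console.log(twice("21" as any)); // prints "2121", not 42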


I love the direction, especially including s3 and Postgres support natively - it makes a ton of sense for this to exist as an alternative to the “build your own framework” status quo

This is the standard in every web framework like Rails and Laravel, and the JS ecosystem will really benefit from this. The next steps are migration and schema management, and a better out-of-the-box testing story (with a nice way to set up factories).


From what I can tell, this change was merged and released without a passing build, indicating that the project's quality assurance process is little more than lip service. I'm not sure how you would track regressions if your tests are flaky to begin with.


TBH, all of what’s in this release came from previous 1.1.x patches.

It seems they just drafted a new release to communicate the groups of change from the previous releases.


I work on Bun. Happy to answer any questions.


Some people make a living out of complex interactions between all the tools and make life hell for everyone. Don't listen to them.

Batteries included is the way to go. I hope the battery gets larger and larger and you also keep up the quality.

- extremely happy bun user


I didn't think this would be quite so contentious. It's been great for me. I manage very distributed teams across multiple time zones, multiple language barriers, and heterogeneous platforms. It's difficult. Bun makes it easier because the features in the distribution are virtually guaranteed to work together and obviate the tooling version hell. I have fewer 1:1 setup meetings. I haven't had cross-platform issues. Finally, the speed is actually important on lower-tier hardware.

It's for the same reasons that I switched to biome. It's faster and reduces total dependencies. I'm very happy with this combo.

Is there more code tooling (linting, formatting, etc.) on the roadmap for bun, or are you focusing on the runtime features?


Are there plans for a stripped-down version without S3 and SQL and things like that for those like us who just want a fast runtime to build our frontend resources to static files?


No plans to do that.

If you're worried about binary size from features you don't use: the binary size cost of Bun.sql is less than 50 KB (you can check this yourself via the .linker-map file in the *-profile.zip builds of Bun or via https://bloaty-csv-reader.vercel.app which was a tool I wrote a little while ago to see how much space various features of Bun use)

If you're worried about runtime overhead, practically everything in Bun is lazily loaded. So if you never access Bun.sql or Bun.S3Client, it will not load it. And, even if it does load it, because we implement it in native code and worry about this a lot, it doesn't really cost much to load.


Thank you, Jarred. I seriously don't understand why there is such huge resistance to including those, especially when they are faster and more memory efficient.


> why there is such a huge resistance in including those

I don't know about all the reasons, but personally I stay away from projects that try to bundle as much functionality as possible into one "all-in-one" thing. I prefer approaches where you yourself choose the right library for the problem at hand, as different libraries have different tradeoffs, and I want to choose those tradeoffs myself. Summarized as The Unix Philosophy or "do one thing and do it well", I suppose.

I'm not saying it's wrong of Bun to include those things, their value proposition is the "all-in-one" approach, which a large group of people seem to like, so seems they're doing right by their audience.

But again, personally I don't like the tradeoffs involved with that approach, but I wouldn't try to convince Bun either to go against their explicit goals.


Not worried about size on disk. That said, it's not just runtime overhead I am worried about; it is a handful of smaller risks that compound: security risks, bugs in one section affecting something else, scope creep affecting time-to-fix-bugs, and not everything is lazy-loaded, so it can affect performance.

And like diggan said here, I like tools with focus, for example I choose to use one note taking app, one separate app to write code in, another app to chat with, and yet more apps for things like email, and SSH even though they are all text-centric apps and could be bundled into one and the same.


I see Bun pop up every now and then and it looks amazing. How is the development funded? With projects such as Node, I don't have to worry about it disappearing from underneath my feet, with Bun I'm not so sure.


thanks for the batteries-included runtime. I really appreciate that aspect. I really don't want to keep glueing libraries together more than neccessary :)


I just tried Bun to make a script to copy files from a service we were moving off of to S3 and it's pretty great.

Instead of having to tinker with my package.json, tsconfig.json, etc. to get everything just right, it works right out of the box the way you expect with `bun init`.

Then it's just `bun run index.ts`.

And it's fast!

Node is great but there's just too many options to configure. I appreciate that Bun went ahead and made a bunch of assumptions and pulled commonly used stuff directly into it.


> Node is great but there's just too many options to configure

What exactly do you have to configure in Node?


TypeScript, maybe handle some CJS vs ESM incompatibility while at it, and a hot reload solution are what come to mind initially.



I'm aware of that initial support; I've been following it since they added the experimental flag.

As far as I know, it requires some non-standard usages in TS code, like adding the .ts extension to your imports at the top of your TypeScript files. It doesn't support .tsx files or any TypeScript feature that requires more than type stripping. The initial support is appreciated, but it's not on par with Deno or Bun yet.

TL;DR you will still need to work on adding proper TypeScript support anyway, compared to Bun.


Typescript, to start with.



I maybe chose the wrong horse and hopped onto deno early on. Bun's success has really surprised me. It had some really misrepresented benchmarks that were oft-repeated early on that seemed to contribute to it but they've been able to really capitalize on its hype since then. I guess choosing a small up-and-coming language like zig has the added benefit of making that entire community rally behind you


> I maybe chose the wrong horse and hopped onto deno early on.

After reading through this thread, I'm personally leaning toward Deno for my first non-Node project. Deno seems to be more thoughtfully managed, more pragmatic (e.g. Node/NPM compatibility), more secure, with better technology choices (e.g. Rust vs. Zig) overall.


> Node/NPM compatibility

Bun is miles better in this regard.

Deno initially did not even want to focus on Node/NPM compatibility, and then backtracked once they understood the importance of it.

Bun OTOH runs the entire Node.js test suite on every commit, and the mentality of breaking less existing node/npm code is clear. e.g, just in this release, `bun publish` has the exact same CLI as `npm publish`, and bun also works out of the box with .npmrc.

About the Node.js test suite: many modules are at 100% compatibility, and many are at 90%. You can track it here: https://bun.sh/docs/runtime/nodejs-apis

Also, they reimplement the V8 public C++ API in JavaScriptCore (!) so that packages like npmjs.com/cpu-features work [1]

Every new feature has this aspect to it, e.g the postgres client inbuilt is a drop-in replacement for the `postgres` package.

As far as Node/NPM compatibility, and thought given to compatibility in general, is concerned, there is absolutely no contest.

And if I'm allowed a little snarky slight... Deno couldn't even maintain compatibility with their own API for reading and writing files during the Deno 1->2 update.

[1]: https://bun.sh/blog/how-bun-supports-v8-apis-without-using-v...


I'm still a fan of Deno and hoping it wins out but

> more pragmatic (e.g. Node/NPM compatibility)

Both projects are good on this front. Deno actually originally explicitly promised NOT to work on node compatibility to "move the industry forward". They realized this was a failing move and backtracked (which has been a little controversial amongst the core base)

> with better technology choices (e.g. Rust vs. Zig)

I think it's a little silly to take the choice of language as a "technology choice". Both are new languages, both still have a lot to prove, and both have pros and cons the other lacks


> I maybe chose the wrong horse and hopped onto deno early on

In the end, it's all JavaScript (or TypeScript if you like Kool-Aid), as long as you know the language you can pretty much effortlessly jump between node, bun and deno, they're more similar than they are different :) Migrating projects on the other hand, well...


If you're writing production code and not using TypeScript, then you're using JSDoc, which is just TypeScript again.


Yup, there is only JSDoc or TypeScript, no alternatives, either before or after.

Most of my production code ends up ClojureScript for frontend stuff.


the assumed topic here is the node/javascript/bun/whatever ecosystem

That's cool that you get to write front-end code in an interesting language though! How long have you been doing that? How big is your team? I feel like larger teams tend to have more boring choices of software


OTOH it's strange to me that they're shipping v1.2, when Zig itself is at v0.13 (and generally considered a moving target).


Zig is following a different versioning philosophy. It's not semver, it's https://0ver.org/


Fun website, I like it. I actually gave this some serious thought when I saw it, but, the reality is that every release so far has had major breaking changes, or was a bug fix release. So, that's indeed a fit for semver with the major version zero:

> Major version zero (0.y.z) is for initial development. Anything MAY change at any time. The public API SHOULD NOT be considered stable.

There's nothing to gain from a different versioning scheme. People just want 1.0 and this is just another way to ask for it.

It'll be done when it's done.


I know that it's not semver, but the notice about the language itself not being stable yet is on the website.


No language is stable unless it has an actual published standard.

Python certainly isn't stable, for example. (And likely never will be, the devs just don't care.)


A bit ambitious maybe, but I don't necessarily see why breaking changes in Zig would need to be exposed to end users.


>To work around this, we had to change the assertion logic in some tests to check the name and code, instead of the message. This is also the standard practice for checking error types in Node.js.

Sounds like something they should try upstreaming?

Now they'll need to track all the tests to manually modify/import...


Really liking the text-based lockfile. I know there's a way to get diffing locally via a `[diff "lockb"]`, `textconv = bun` git setting, but that's still 1. manual setup, and 2. not really webui friendly.
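For reference, the setup I think you mean looks roughly like this (from memory, per the older bun docs):

    # .gitattributes
    *.lockb binary diff=lockb

    # local git config
    [diff "lockb"]
        textconv = bun
        binary = true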

Other stuff, like the C interop and psql client sounds amazing as well.

I'm currently only using Bun for smaller sideprojects where I'm also trying to use some of the more out-there features, and it has been a blast so far.

Though the most important question for me (as always with these announcement videos): where can I get a bun plush?


Can anyone speak to the performance numbers on the Bun page? It represents itself as significantly faster than the existing options, but why is that the case? Is it related to Bun itself? It must be the JavaScript engine they chose right? My understanding is it uses the Safari one rather than V8. Is JSC really that much faster? If so, what are the trade-offs of choosing that over V8 or any other option?


> Is JSC really that much faster

I think it is a bit faster. But a lot of Bun's speed comes from implementing APIs in Zig rather than JS (which node does a lot).


> My understanding is it uses the Safari [JavaScript engine] rather than V8

Looks like you’re right. Thanks for clearing up my misunderstanding: I’m not sure why, but I thought Bun was using a custom JS implementation.


From their docs it is "written in zig", which implies "all of it is Zig". They don't hide that they use JSC, but they don't like to advertise it either.

It is a bit manipulative advertising to draw hype-driven people in, I would say. You know, those tech influencers who like to peddle stuff to get views.

edit: I was a bit unfair, it feels like those "hype tech influencers" are the ones who downplay JSC in favor of promoting Zig, not the project itself. The frontpage of Bun mentions JSC twice and Zig once.


Well, in their defense, it would be hard to get funding for JavaScriptCoreWrapper.sh, so it's a useful misunderstanding to curate.


It's surprisingly hard to find an answer to this (an FAQ on the Bun site would be a good idea?), but I suspect it has to do with reimplementing some stuff (bundler, package manager, test runner etc.) in a compiled language (Zig) rather than using the JS runtime for everything (this is probably also the reason why Bun integrates so many functions into one "kitchen-sink" executable), and using a JS runtime which is faster than V8 at least in some regards.


Jarred is a gift to the FOSS world. It's quite inspiring to see such a success creating a product like this.


These improvements look amazing! I'm always blown away by the performance benchmarks of Bun.

Can anyone speak to how much adoption it's getting professionally? Even anecdotal points are useful. When v1.0 was released I briefly tried testing it out in my employer's monorepo. It unfortunately wasn't a drop-in replacement and we hit compatibility issues/couldn't invest time debugging. While we could have migrated smaller projects to it, we decided not to split our tooling and just stuck with pnpm.

Curious if others have had a different experience!


I've been using it for a greenfield NextJS 14 project for a client for about 8 months. It's been very smooth. No issues whatsoever. (edit: one issue was that I wasn't getting stack traces for error pages but they fixed that a few months ago)

The only issue I hit is that Vercel only runs node, so builds that may have passed on bun sometimes fail on Vercel. So just before I deploy, I run `npm run build` to catch any node-specific build issues and fix them before Vercel finds them.

I hope Vercel will add support for bun in their edge runtime.


Re: bun patch, it would be great if it were possible to fetch remote patches (with a sha specified).

I aim to have an "upstream first" policy when it comes to patching/forking dependencies.

And fun fact about GitHub, you can append .patch to a PR url or commit URL to get a patch file.

This makes patches self-documenting (they literally are a link to the upstream PR) if the tool can fetch remote patches. Nix is the only tool I'm aware of that makes this easy.
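Concretely, the URL patterns are (org/repo placeholders, of course):

    https://github.com/<org>/<repo>/pull/<number>.patch
    https://github.com/<org>/<repo>/commit/<sha>.patch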


How's the Windows support? I check every few months, and it always fails to even run bun install on my projects on Windows.


Works just fine for me


Maybe it depends on the project/dependencies.

Still a lot of open/recent Windows issues: https://github.com/oven-sh/bun/issues?q=is%3Aissue%20state%3...


Why does it matter?


Because I develop on Windows?


My condolences.


With Docker, it isn't that bad. Default Windows UI/UX + PowerToys can be quite efficient.

Maybe I should try dual-booting with Ubuntu.


With docker you don't really need to care about Bun's support on Windows. But I've been using it without issues so far. If you have issues, it might be specific to some package.


aws sdk apis embedded in a runtime? aside from lock-in it looks like a terrible thing to do. to me it seems like just a driver to increase adoption; I'd actually expect free performance gains by just importing s3 from aws-sdk in bun. this way the path forward is clear: a bloated runtime full of 3rd party integrations


Apparently their S3 is running entirely based on a native implementation and offers substantial performance improvements over the AWS SDK.

Bun seems to be a little inspired by the Go language design. It is batteries included and it has a substantial stdlib. Integrating something like an S3 SDK - what I would consider to be a high level feature - seems like an interesting choice, but makes a lot of sense considering Bun specifically targets cloud environments


If building a desktop app with the intention of having JavaScript plugins, would it not be viable to include bun in the app bundle to run the plugins? Curious if anyone has played around with this idea.


Baked in S3 and PGSQL feels like it's just waiting for AWS to undercut Bun's future monetization model. Deno has KV and even they are being pushed to the point of cutting regions.


looking forward to updating more .gitignores with 3 of these 4

  package-lock.json
  yarn.lock
  deno.lock
  bun.lock


This is an underrated and misunderstood comment.

Let me explain: projects usually support only one package manager. In a world of N competing JS package managers, you need to ban lock files from N-1 of them.


Can't forget pnpm-lock.yaml ;)


I use pnpm the most and I missed this :O


Why would you gitignore those? Adding lockfiles to git repositories is considered good practice


I said 3 out of 4 of these. Committing multiple lockfiles is not good practice, but I see people struggle all the time with the idea that package managers are not interchangeable.


Surely you would enforce this at pull request time, no? Ignoring the file works from a functional perspective, yes, but does nothing to solve the actual problem.


the "actual problem" is often management trying to find cheap labor or even using AI to "do it themselves"


If someone commits a wrong lockfile they are fired where I work at lol (exaggerating, but only slightly)


And why would they use multiple competing package managers and runtimes? It isn't a good faith comment.


> And why would they use multiple competing package managers and runtimes?

Some of us work across multiple projects and aren't up in arms about what package manager the current project uses. Some days you touch 3-4 projects that happen to all use different package managers.


It's not just good practice, it's the whole point.


url and dgram are still not all there yet, but I’m going to see if Node-RED runs now.


dgram is at 90% compatibility, and all of the methods listed in the advanced dgram methods issue[0] have been implemented in 1.2. The url package has currently only one unimplemented test out of 13. If you've had any issues with dgram in 1.2, open an issue in our repo and I'll take a look :)

[0] https://github.com/oven-sh/bun/issues/10381


Well, I can now actually _run_ Node-RED 4.0 and receive multicast UDP packets, so that's great. I will be testing it with more complex stuff soon, and let you know.


i am not a JavaScript programmer, but can someone explain exactly what bun does? does it compile JavaScript and target a backend, like an LLVM equivalent for the browser?


> what bun does ?

It installs npm packages, it executes JS and TS code, it bundles code for front end, it runs tests. Plus it has a lot of random pieces related to this development, like integrated database access.

Compare it to node, npm, webpack, and jest, all in one, and whatever the guy dreams of. It's certainly fast and offers great DX, but I wouldn't bet on it staying around for, say, 5 years.


It doesn't really do anything for the browser. It provides a really fast runtime for JS to run in a server, a stdlib for http, crypto, and filesystem (among others) that JS lacks, a bundler for JS projects to go to a browser and a compiler for typescript to JS, among a few other things.


Bun is a build tool and code runner primarily. It can run a tsx or js file, and it can do bundling or the compilation of TypeScript into regular JS. It also manages packages and can run tests. It aims (I think) to be a node compatible replacement for node.


Congratulations!!!


Why?


What? Why would you ask that?


Congratulations!


npmrc support is huge



