Hacker News
Will Bun JavaScript Take Node's Crown (semaphoreci.com)
298 points by kiyanwang on Aug 14, 2022 | 324 comments



I really hope Bun at least becomes a big player. Node.js has been very sluggish about implementing new features, to the point where many people just don't trust it on this anymore. Deno's direction is interesting but IMHO doesn't align well with devs' interests. Bun, in contrast:

> Out-of-the-box .env, .toml, and CSS support (no extra loaders required).

This makes a lot of sense. Node.js should be "sherlocking" (integrating into the core) the most popular/common features devs use. It's crazy to me that, after 10+ years of `.env` being a common abstraction to manage environment variables, you still need to install a separate library because Node.js doesn't know how to read the file itself. Same with YAML or TOML.
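For what it's worth, the parsing side of `.env` is tiny, which is part of why it feels odd that it isn't in core. A minimal sketch (the `parseEnv` name is made up, not a real Node API):

```javascript
// Hypothetical parseEnv(): the kind of dotenv-style parsing being
// discussed. Not a real Node API; just showing how small the feature is.
function parseEnv(text) {
  const result = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // skip blanks/comments
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue;                           // ignore malformed lines
    const key = trimmed.slice(0, eq).trim();
    // strip one layer of optional surrounding quotes from the value
    const value = trimmed.slice(eq + 1).trim().replace(/^(["'])(.*)\1$/, '$2');
    result[key] = value;
  }
  return result;
}

const parsed = parseEnv('# comment\nPORT=3000\nNAME="demo app"\n');
console.log(parsed); // { PORT: '3000', NAME: 'demo app' }
```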

Same with features like fetch(), which took 5 years from the issue being opened to being implemented in Node.js (and not for lack of collaborators, but for lack of willingness to merge it).

I'm happy though that Node.js is finally approving web features, and that the move to ESM is finished, but they are moving so slowly that I can def see a focused small team (it might be too much for one person) overtaking Node.js.


There's value to a system where everything is bolt-on as well.


I have zero experience designing systems like that but I think the sweet spot would be a standard library that's modular and detached from the core. You could pull it in after setting up the core, replace it or upgrade it in parts but it's still maintained by one entity with a consistent level of quality.


That's the general approach Deno has gone with, and it's finding success.


Sure, but I'd argue that value is marginal. For 90%+ of people building websites, you want a straightforward HTTP server that you can build and customize with plain JS; for some deeper cases, yes, that bolt-on system is useful as well.


The problem with introducing things like .env support out of the box is breaking backwards compatibility in a silent way, i.e. without any code changes, in a way that would still run the same in dev and pass tests on CI etc., and then just affect prod.

I know this shouldn't ever happen, but you can well imagine plenty of legacy/badly configured setups where it would. More pertinently, where it would break no matter how loudly you warn about it in release notes etc.

When you're as mature and widely used a platform as Node, you just can't risk things like this, unfortunately, no matter how much more convenient it would be for the vast majority of users.


That could easily be solved by adding a Node.js version or range to your package.json that specifies where the program is supposed to run, and treating major versions as breaking. There's a balance to be had here, and avoiding breaking anything at all costs is surely not balanced either; it's starting to add a lot of cruft that will need to be maintained long-term.

The solution is not to "warn loudly" here; specify a Node.js version in your package.json, and your code will continue working on that version. Upgrade the Node.js engine version, and then it's on the person upgrading to make sure that nothing breaks. That's how virtually all platforms work (except the web itself, but that's not "versioned", so it's fair).
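For reference, package.json already has an `engines` field that could carry exactly that contract (the range below is just an example; npm only enforces it with `engine-strict`, but it documents the supported versions):

```json
{
  "engines": {
    "node": ">=18.0.0"
  }
}
```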

It's a bit more troublesome when changing core packages and considering dependencies, but the same could apply there.


Apple breaks stuff all the time when updating their APIs. If you hold yourself to decisions made long in the past, then your platform will undoubtedly become crusty and stale. Maybe Node maintainers are ok with that, but it shouldn't be surprising when people go to greener pastures.


I do agree to an extent but the specific point I'm making here is about breaking things in a potentially silent, not obvious way.

Not sure which of Apple's breaking API changes you're referencing, but if it's the sort of breaking change where previous code just doesn't compile or run on the new version, it's a different matter in my view - as in, it's far easier to catch in dev or testing rather than only affecting prod.


Node already has a solution for this: core modules are prefixed with "node:" to distinguish them from third-party modules. https://fusebit.io/blog/node-18-prefix-only-modules/?utm_sou...


Node is a small team too, but what features in particular are we missing?


I listed some right in my comment: currently missing are things like native support for .env, .toml, and .yaml. Support for ESM and fetch() was missing for way too long until recently.

Currently I don't see big gaps between the web and Node anymore, but I still find the APIs a bit messy; to work with files, for example, I have to import all three of `node:fs`, `node:fs/promises`, and `node:path`. It'd be nice if at least `node:fs/promises` ALSO included the non-promise APIs from `node:fs`, like createReadStream(), to avoid having to import another core module for that. Also, WebCrypto should define the variable `crypto` as a global, like `fetch()` and `URL` do, for compat with the browser.

Just some ideas/examples off the top of my head; not much research was put into this, so def take it with a pinch of salt.


not actually disagreeing, just my opinion:

.env would be nice but I personally do not miss .yaml/.toml at all so there's likely some subjectivity in these feature-requests and keeping it in user-space has a value too (simple, rock-solid core)

being able to run typescript code (without type checks) is in my opinion the biggest improvement in deno, I don't care about their own formatting or linting and I especially don't care about their LSP and opinionated way of file urls (requiring .ts suffix everywhere). Also, webgpu is nice but it shouldn't be in the core, etc.

BTW: Speaking of complexity, Deno takes 140M of memory, bun and node are both around 10M.

I think bun has huge potential to replace node, especially because it's not written in C++ nor in rust, both of these languages are extremely hard to master and that limits contributions (and FUN) to some extent. Safety is important but it seems to harm productivity a lot.


Those are the kind of things that you don't think about every day until you actually need them, and then you go: why isn't there a `YAML.parse()` like `JSON.parse()` or similar? I not only need to import a separate library, but install it first? These things should def be in the core IMHO.
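The asymmetry in question, with the commented line showing the hypothetical core API being wished for (`YAML.parse` does not exist in Node today):

```javascript
const fromJson = JSON.parse('{"port": 3000}'); // built into the language
console.log(fromJson.port); // 3000

// const fromYaml = YAML.parse('port: 3000');  // hypothetical: today this means
// `npm install yaml` (or js-yaml) and an import before you can parse a line of YAML
```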


This is probably naive, but I'd love to see one of the Node.js competitors, e.g. Bun or Deno, innovate on the "it takes ~forever to load node_modules every time I run a test/script/etc" problem.

E.g. Bun is improving "bun install" time for projects with large node_modules, and that's great, but what about, after it's on disk, having to parse+eval all of it from scratch every time I run a test?

As admitted, this is a very naive ask due to the interpreted/monkey-patching nature of the JS language, but I'll hand wave with "something something v8 isolate snapshots" + "if accomplishing this requires restrictions on what packages do during module load, deno is rebooting npm anyway..." and hope someone smarter than me can figure it out. :-)


Deno doesn't use node_modules so you won't have a problem there. You specify each dependency as an import URL (either in the code or in an import map) and it grabs them directly (also using a local cache directory). There's no need in the deno world to even use a package manager.
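An import map is just a small JSON file mapping bare specifiers to URLs; the esm.sh URL below is illustrative:

```json
{
  "imports": {
    "lodash": "https://esm.sh/lodash@4.17.21"
  }
}
```

With that saved as e.g. import_map.json and passed via Deno's `--import-map` flag, `import _ from "lodash"` resolves to the URL, and Deno caches the download locally.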


Ah yeah, you're right that Deno doesn't have a node_modules, but AFAIU it still downloads dependencies to "somewhere on disk" and then, every time your code runs, it re-evals all of them from scratch.

So, admittedly I was using "node_modules" as a shorthand for "the code that makes up my dependencies", and that AFAIU Deno has not implemented this "use a v8 snapshot to cache preloaded/pre-evald dependencies" optimization.

I.e. I want something like:

https://danheidinga.github.io/Everyone_wants_fast_startup/


You could maybe load bytecode if that's a thing in javascript to avoid parsing at least, but outside of that you'd have to make sure the environment is actually the same.

For example if I define an environment variable and some script deep inside the modules folder reads that variable and does something with it. You'd need to make sure the environment is identical before everything is loaded. There are other things as well like a script defining a global (perhaps for polyfill) and it needs to load before some other script.
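A tiny sketch of that first hazard: a value fixed from the environment at load time is exactly what a naive snapshot would bake in (FEATURE_FLAG is a made-up variable name):

```javascript
process.env.FEATURE_FLAG = 'on';

// Imagine this function body is module top-level code deep in node_modules:
function loadModule() {
  const enabled = process.env.FEATURE_FLAG === 'on'; // read once, at load time
  return { enabled };
}

const snapshotted = loadModule();   // "snapshot" taken while the flag is on
process.env.FEATURE_FLAG = 'off';   // environment differs on the next run
const freshEval = loadModule();     // a real re-eval sees the new value

console.log(snapshotted.enabled, freshEval.enabled); // true false
```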

The Nix package manager is probably a good place to look at how this should be done. However, it's all based on strict constraints and other functional principles that I don't think npm packages are anywhere near satisfying.


does V8 support snapshot + restore?


> There's no need in the deno world to even use a package manager

That doesn't sound like a thing I would want.


It is a package manager. It's just integrated into the runtime to the point where you import things and it "just works".


How does that pan out as an application grows and you're pulling in the same dep across numerous files?


You still get one shared copy of that dependency. Why would it be any different?


How do you ensure that across files on a large project?


Bun and Deno both look really impressive and I hope development in this space continues at the massive pace it's happening right now. But I've tried using both on a small greenfield project recently (nothing fancy - just a static website with some custom build logic made by stitching together some templating engines and libraries) and I ended up reaching for Node again.

As mentioned in the article, Bun still has a lot of incompatibilities with Node. For example, as far as I could tell, scripts designed to be run with npx don't work right now. And I'm not sure what to make of binary lock files. Sure, it's more efficient, but how do you know a `bun add` didn't change something it absolutely shouldn't have changed?

Deno has really fast TypeScript compilation and file watching built in which is awesome. I always loathed using tsc together with nodemon or some custom file watcher thing. And permissions are great. But the package index is more or less the most valuable asset of Node and Deno doesn't provide (and doesn't aim to provide) compatibility with most of it. Also, while I do like the idea of using URLs for dependency management, the way Deno does it with import maps and lock files feels extremely convoluted and too easy to mess up for me. NPM is more intuitive.


> Deno doesn't provide (and doesn't aim to provide) compatibility with most of it

which packages did you have issues with / need that didn't work? my understanding was that with the node compat flag deno supported quite a bit...


At the time it was the "tailwindcss" package IIRC. I also just tried it again on the latest iteration of that codebase and ran into some import problems, seemingly related to "markdown-it" (version 13.0.1). First I got "SyntaxError: Cannot use import statement outside a module"; when I changed the package.json file to add "type": "module", as requested in the error message, I got a panic. If you want, I can open an issue for that.

Additionally, I'm currently using at least one function in "fs/promises" that doesn't seem to be supported yet ("opendir") and the "vm" module to evaluate some (trusted) JS.

Most of those issues can be worked around, I just decided to go with Node instead for the time being.


Some of Bun's server speed can be attributed to uWebSockets, which has node bindings [1]. Of course this is just a small detail, since Bun is a huge project.

[1] https://github.com/uNetworking/uWebSockets/discussions/1466#...


Alex has good points here as usual.

it's been a long time since I've used node's built-in http module, ever since I found uwebsockets.js.


Good to hear there are others successfully using uwebsockets.js! We are using it in a very early-stage project, not in production yet. Can you share your experience using it?


Great experience with it. The examples in the repo, the issues and discussions sections, and the documentation are all very helpful. Alex and the other users are also quite hands-on, replying to each and every issue and discussion there.

We have some thin wrapper for uWebSockets.js that lets us do the following:

- parse request json

- parse request multipart data

- serve response json

- serve response buffers

- serve response streams

- serve response static files

- support async handler

- support multiple async handlers (middlewares)

links are here

- https://github.com/joshxyzhimself/modules/blob/main/uwu.mjs

- https://github.com/joshxyzhimself/modules/blob/main/uwu.d.ts

- https://github.com/joshxyzhimself/modules/blob/main/uwu.test...

internally we just serve http, then it goes through caddy or haproxy depending on the project's needs such as tls, caching, etc.

there are other similar projects too that try to deliver an express-like api:

- https://github.com/kartikk221/hyper-express

- https://github.com/nanoexpress/nanoexpress (defunct)


Got to say your code looks very clean - really impressed! DB access, cache access, sessions, S3 access, and server init with middleware and endpoint interface - in a few short files with practically zero deps!

Honestly if it was 2 months ago we would definitely start with this, but unfortunately it is a little bit too late. Oh well


Installing a gigabyte of NPM packages a little bit faster (or whatever is being claimed to be faster), and a little bit worse, is not that interesting IMO.

I actually use Deno right now because I only use a handful of third-party dependencies in my project, and I can use Deno’s bundler instead of webpack (even though that’s not its intended use). I’d rather simplify away the stuff that’s slow and complicated rather than making it faster and more complicated (less correct, less compatible).


When talking about JavaScript performance, aren't we essentially comparing Bun to V8?

If so, wouldn't it be very hard to beat V8's performance, given that there has been so much engineering effort for many years towards making it fast?

Or are we taking into account the speed of the package managers and bundlers as well?

Disclaimer: I haven't seen the benchmarks, if any.


It's _mostly_ talking about the speed of the surrounding tools. They wrote a package manager that speaks NPM (presumably in C++ or Zig instead of JS), which unsurprisingly completely dominates NPM in performance.

The other stuff he benches is built into the runtime ... HTTP requests, copying files, and a webserver Bun ships with. Presumably the Bun team wrote these tools with performance in mind, but the article doesn't compare features. I'm skeptical the webserver in particular is at feature parity with the ones he benched against, which makes the numbers look pretty watery to me.

It does not address the runtimes of JavaScriptCore (Webkit/Bun) vs V8 (node/deno) which, as you pointed out, are probably very similar.


For the simplest benchmarks the performance disparity might come down to the difference between JavaScriptCore (which Bun uses) and V8 (which Node uses), but for anything non-trivial there's a significant amount of overhead introduced by the Node runtime.

Someone wrote a barebones V8 wrapper called just-js, which is based on V8 just like Node, but it crushed Node in the TechEmpower benchmarks [1].

[1] https://www.techempower.com/benchmarks/#section=data-r21&tes...


>When talking about JavaScript performance, aren't we essentially comparing Bun to V8?

Bun is powered by JavaScriptCore, the JS engine in WebKit. WebKit is developed by Apple. Safari typically outperforms Chrome on benchmarks.


Benchmarks are shown on the project’s homepage


Bun is currently lacking workers, which at least for me is a dealbreaker. I'd also think it'd be interesting to compare against justjs when it comes to speed, since that is a very small wrapper around V8 (much smaller than node/deno) and manages to score extremely high on techempower benchmarks (top 25 on all, first/second spot on a few): https://just.billywhizz.io/blog/on-javascript-performance-01...


Looks like Just-JS author is looking into swapping V8 for JSCore.

I’d be super interested in seeing just how much faster Just-JS would get using JSCore (like Bun uses).

https://Twitter.com/justjs14/status/1557856790897106944#m


Interesting! A small wrapper switching the engine might be the most reasonable "real-world" benchmark there is.


Just to clarify, Bun isn't a wrapper around V8. It's a wrapper around JSC.

I don't think Jared's benchmarks were benchmarking Bun and Node so much as V8 and JSC.

I don't think benchmarking Bun against justjs is going to change much.


Considering the performance difference between node, deno and justjs I don't think that's right.

There is clearly a big difference between different wrappers and how to handle different things. For example some of them delegate most of HTTP handling to a native lib while IIRC node does a lot of that in its js stdlib.


FWIW I read it as the opposite. I think most of the stuff he benched was explicitly not benching V8 against JSC. This is all me reading between the lines .. maybe you know more about JS runtimes than I do and can confirm/deny my theories?

Package manager perf: Bun (presumably) wrote a C++ or Zig package manager that's integrated, and speaks NPM. I guess you could argue that this benches their fast one against npm's V8 runtime, but.. that's a bit of a stretch for me.

Copying large files: I'd be surprised if any of the benched distributions rely on the javascript runtimes to copy files on disk.

HTTP Requests: Maybe distributions actually call into JSC/V8 for these .. I have no idea. I'd guess not, but this one sounds the most plausible case for "benching V8 against JSC".


What is up with Bun's binary lockfile? How does one check dependency versions and get warned of security vulns? How can someone think a binary config format is a good idea?

(edit): https://github.com/oven-sh/bun#why-is-it-binary (though this does not answer the last question, and only partially the second)


Honestly, I can sort of see it making sense. The only time I need to look into my lockfile is to see the exact version of something I'm pulling in, and generally I think I would be fine doing something like `lock-file-tool --show-me xxxx`. In Rust, there's a `cargo tree` command to see the tree of your exact resolved dependencies for when you want the whole thing expanded, so I generally don't use the lockfile for that anyhow. I certainly don't ever update my lockfile by hand, so there'd be no loss of usability in that regard.

That said, I'm not super convinced that performance would necessitate this; if parsing the text file were really that slow, you could instead have a separate binary file, created whenever the lockfile changes, that holds a serialized representation of the lockfile along with a checksum to ensure it's not out of sync, then hash the lockfile before using the binary representation. I guess it's possible that if the tooling is brittle or people edit their lockfile by hand, this might end up detecting an out-of-sync lockfile more often than not, but at that point I think the issue isn't really with the lockfile format.


There's an option to output a yarn-compatible lockfile. In practice, I think this means you'd need a branch protection rule to disallow a change to the binary lockfile without updating the yarn lockfile. I'm not sure that complexity is worth the performance gain of the binary format, personally. I think Bun should have an option (maybe in bunrc) to always use the human-readable format, though that detracts from the "batteries included" nature a bit.


> Documentation is limited, but Bun’s Discord is very active and a great source of knowledge.

Discord is a black hole for information that search engines cannot index; it is not a replacement for documentation.

I hate Discord with a bloody passion, and this is one of the biggest reasons. People think "just ask on Discord" is appropriate as a means of documentation; it absolutely is not.


Speed isn't really the biggest factor for most of my node usage, though. Of course, if it's a drop-in replacement one might switch.

An aside, but as a non-native speaker who hadn't heard of Bun before, a title with every word capitalized is very confusing. "Who is Will Bun?"


Speed nowadays is almost an anti-goal.

People choose the "fastest" language or framework, then add a bunch of lazy ORM queries without really knowing how to properly leverage a performant database, stitch together a bunch of "microservices", adding multiple HTTP trips across the wire, and by that time the performance of the core language is a rounding error.


I got a PHP gig recently (after working as a Go developer for some time), and this rings very true to me. In PHP you often have to make the most of your database, and this tends to result in a far more enjoyable codebase to work in.


Note that nowadays Node is not just a production runtime. Anything related to tooling (compilation/checking/bundling etc.), for instance, will benefit from faster runtimes.

Also, having a slower runtime only marginally pushes people to code more efficiently (e.g. people wanting to use an ORM will do so either way).


Sure, until you're stuck on some inner loop with a poorly performing language.

Of course, if you're doing microservices correctly, it should be easy to separate the parts where performance is important and write them in a language optimized for that.


That is almost never used correctly. I have not seen it, for one. If it's crazy math calculations, file processing, image processing - sure, but that's not how microservices are used in 99% of the cases. It's basically an anti-pattern.


Are you saying that performance can't be outsourced?


Yes, I had to backtrack while parsing this sentence. Not capitalizing every word would not have helped: "Will" is still the first word, so it would be capitalized regardless, and Bun is a proper name that would also be capitalized. It doesn't help that "bun" is a kind of bread, and is also part of the name of the "bo bun" dish.

A "?" at the end would have helped a lot, though.


For me the speed issue is totally solved by Vite. I moved our whole project to it, including old legacy scripts.


There are some real obvious downsides to using this tool right now. Most importantly, the article lists these:

* Documentation is limited, but Bun’s Discord is very active and a great source of knowledge.

I'm not interested in becoming part of a community, I want to use the tool. Without official docs, I'll pass until the project has matured.

* Bun is not 100% compatible with Node yet. Not every npm package works. Express, for instance, is not yet functional.

I don't expect full bug-for-bug Node compatibility but if extremely popular packages such as Express don't work, I'm not sure if any of my projects will even run with this runtime.

Hopefully, this project will turn into a proper Node replacement, because the JavaScript ecosystem could use a big performance boost. It'll be a while before it becomes part of any pipeline I have a say in, though.


Yep. Doing documentation and support through a chat service designed for gamers, one with terrible search (so the same questions get asked over and over) and where most users behave as if they were 15 years old because that's the prevailing social norm, means I'm going to pass.


I'm stuck with an open source project on Discord for a mix of developers, artists and casual users across a wide range of age groups. I'd love to migrate off but I honestly can't think of the right platform.

1. Lots of people like realtime chat. It's worth having something that fills that niche (the old "mailing list + IRC" combo used to work well)

2. Forum software is usually fairly awful. I don't want to install some 15 year old pile of PHP with bad UX.

3. A "Stack Overflow" style site requires a lot of moderation and either annoys users for being too messy or annoys users for being too strict

4. I actually still have some fondness for Google Groups but Google's brand is too tainted for most people.

5. Nobody under 30 seems to use email any more, so mailing lists are out, but something that syncs with email is a must (reply notifications etc.)

I am also hoping for something free but very easy to install. (SaaS with a free tier, one-click docker or similar). And easy on resources ideally (I guess $10/month for hosting is about our limit at the moment)

Yeah. I'm asking for a lot.


Did you look at zulip? https://zulip.com/

I'm not convinced it's better than IRC + proper mailing lists - but given that real users are stuck in awful mail clients (like web Gmail or Outlook), mailing lists aren't a great option any more.

Main thing I miss from zulip is a proper weechat plug-in (like wee-slack) - although the official cli client isn't terrible - it's just not irc-like.


Why is something like Gmail an awful client, and what does a great client have (or not have) that it doesn't?


Mutt/pine (or newer clients like sup/notmuch) are faster, handle quoting/threading better, allow decent keyboard operation, and are workable with a high volume of email.

My biggest issue with Gmail (as someone who interacts with users of Gmail) - is that it generally breaks quoting and hides this from Gmail users with its magic conversation view.

Outlook (web) does atrocious top-quoting, rendering it useless for mailing lists.

In general the web interfaces don't work well with hundreds/thousands of mails, IMNHO.

They also tend to needlessly lean on html formatting rather than plain text - with things like "my replies in blue" - rather than just proper quoting.

Ed: in general I've just given up the idea that the average user has any hope of interacting well with email lists - which means such lists aren't useful for general discussion (but can still work for specialist groups like Linux kernel etc).

The conclusion being that one will need "something else" for a general audience.

Zulip might be a reasonable compromise.


>Outlook(web) does atrocious top-quoting rendering it useless for mailing lists.

The weirdly aggressive way people talk about top vs bottom quoting is probably one of the things turning new users away from mailing lists.


Maybe. I think the bigger problem is that there's no easy fix/recommendation to give new users; "stop using your proprietary groupware that your company/organization pays for" isn't a great recommendation.


Gmail can't do basics properly like replying in the middle of a quoted text, etc. It shows you something relatively sensible, but sends completely garbled mess to the recipient.


Yeah, I don't quite get it. The problem with mailing lists is their fundamental design, not the clients for them.


Mailing lists aren't great, and neither is nntp. But mailing lists (and nntp) work fine with proper clients and some discipline/netiquette.

Sadly I don't think there is a reasonable way to introduce new users at this point - I'm not aware of any great gui clients that do the right thing; and AFAIK neither Gmail or o365/exchange work in a reasonable way with open standards (allow checking for when others are free; accepting/changing appointments etc) - so users are herded towards the semi-proprietary clients. (and interop across silos doesn't really work in a sensible way).

There are things like the d-lang forums that mix a decent web front-end and nntp (and mailman makes an effort for web+smtp) - but I'm not aware of anything that really works great, with a low barrier of entry and reasonably lets users participate across more than a handful of threads.


> 2. Forum software is usually fairly awful. I don't want to install some 15 year old pile of PHP with bad UX.

Isn't Discourse an answer to that?


Yes, you can use Discourse for async, long-term communication. Or Flarum if you want something more modern-feeling (but with a lot fewer features).

And then Matrix for real-time chat. Bridge it to IRC if you want. Search is not great in Matrix but sufficient for simple keyword searches.


I just checked out Flarum the other day; why is it more modern? I actually found Discourse still much better. Flarum doesn't even have nested reply threads as far as I can tell; it's all flat and hard to read for any thread of replies.


I far prefer Discourse myself, and it is superior in useful features. "More modern" is mostly what I heard others saying in response to me recommending Discourse. Suppose it is a look&feel thing mostly, and personal preference. That said, you can customize Discourse quite extensively.


What about GitHub Discussions ? (https://github.com/features/discussions)

I am not seeing a lot of projects adopting it, and I wonder why, because it seems quite good to me.


It's locked behind a proprietary service/account owned by a publicly traded corporation, Microsoft, with shareholder-value obligations. The other options listed in sibling comments can be self-hosted, are all open source, and some don't require an account or at least support a form of decentralized identity.


Have you tried Outverse?[1] It's a new startup aiming to solve this problem, with support for forum spaces, threads and more. I've been using it for some time now and would definitely recommend it! Makes it easier to save threads, common questions, etc and has a super intuitive interface overall. I'm looking forward to using it for my open source project, and it solves a lot of the problems I've had with current platforms.

[1] https://www.outverse.com/


Matrix? Assuming you don't want to host your own homeserver you can make a room/space on matrix.org


I think NNTP+IRC is good, and you can have bridging to other services if desired (which is probably a good idea too). You should also make logs of the IRC; the IRC channel for my project has public logs.

Discourse and Flarum are even worse, in my experience.

However, having chat services does not substitute for having documentation; you should have good documentation too, and not expect people to only ask questions.


Mattermost? And all your chats are in Postgres so you can build all kinds of fun magic on top.


How about GitHub discussions? One open source library I depend on uses it to good effect. You could keep the Discord for realtime chat for stuff that should be realtime. Of course, you’d probably have to make effort to push support questions to GitHub. People would probably still ask questions there.


Discourse


i think element (matrix) could work for you. it can't do email notifications, but if you have the mobile client on your phone you can get push notifications.


Nobody’s saying don’t use Discord. But that shouldn’t be the home of the documentation.


Well - I am kinda saying "don't use Discord" - at least except for specific cases where its ephemeral nature isn't a liability. I personally wish I wasn't so tied to it.


Would Reddit fit your bill?


In terms of functionality - in many ways: yes - but it's weird to throw users who don't use Reddit into that crazy jungle just to get support.


Discord's text search is incredible, though. It's still not a good platform for support, but not because search is bad.


> Discord's text search is incredible

Really? There's no way to do verbatim searches, and it's "very" generous in deciding what it thinks you meant to search for. It's almost useless for searching technical posts (or anything where you need to disambiguate similar words).


It’s far better than Slack’s search, which limits you to the last 10,000 messages unless you pay them a fee per member in your community.


This might change in the future as Discord tries to recoup their VCs' investment. See GitLab for a recent example.


GitLab is charging for metered compute minutes, which is hardly an unreasonable bargain for users who have been getting them for free for two years. Slack's feature-gating is disconnected from any unit economics - indeed, they actually do _store_ the messages prior to the most recent 10,000, but they just refuse to index them. The cost of that indexing is relatively minimal and does not scale linearly with number of users.


Doesn't matter what the technical justification is, it's a business move, not a tech move. It's about the customer's willingness to pay, and it looks like for Slack, companies are very willing to pay.


> But according to Butterfield, Slack is actually an acronym. It stands for: "Searchable Log of All Conversation and Knowledge."

https://www.businessinsider.com/where-did-slack-get-its-name...


Being better than Slack’s search doesn’t make it good.


The primary use case that we’re talking about is that someone has a common error, in which case they can put that error message into Discord search and read conversation about previous errors. Discord search is perfectly adequate for that use case.


That's fairly close to my use case for which Discord fails miserably.

But let's not get stuck on this point. There's a dozen other reasons why Discord shouldn't be used for technical forums


I don’t love that projects use discord now, but I don’t hate it either. Beyond the crappy search, what other reasons make discord a bad fit for technical forums? What’s your preferred solution?


My question was driven by the fact I don't have a preferred solution that would suit both technical and non-technical users.

But as for why Discord is a bad fit - it's completely opaque to search engines, it requires an invite, it has a fairly complex UI and it's basically "unstructured chat". Threads are an afterthought and don't come close to giving any proper structure.

I want chat for people that, you know, want to chat - but structured posts with topics and categories are needed for any sane, long term knowledge-base.

People on chat (this goes for IRC as well) will continually ask the same questions and you will continually have to answer them again because there's no structure.


The primary use case is actually googling the error, in which Discord falls flat on its face.

Every external chat-room log service I've seen has also had bad SEO, bugs, bad UI/UX, etc.


I actually don’t mind projects that use discord as their primary online community, but goddamn, discord’s text search is NOT amazing. Yes, it searches text and is likely fine for gaming / casual online communities. But man, it provides none of the utility I would expect a modern chat app’s search feature to have. Boolean operators, date ranges, wildcard matches, fuzzy matching, scoped searches (beyond just being able to specify a channel, like searching for a string in a user’s posts in a specific channel), etc. The subpar search functionality is a huge weakness, especially for communities like OSS where being able to meaningfully surface past comments / threads / solutions provides enormous value.


In one sense, I completely agree with your central point: Bun isn't yet stable enough or mature enough for production, "Node drop in" usage.

But on the other hand, when I look at Bun I think it has heaps more going for it than Deno. The fundamental value prop of offering better performance (both runtime and developer experience) is huge, and something that would get me to want to use it. Contrarily, while I see a number of improvements in Deno, the vast majority of them seem to be "niceties" that solve some initial "setup" issues that can be painful in Node - but since I already have taken care of a lot of those Node issues in my own projects, there is not a ton that I see compelling in Deno.

Point being, if I were a betting man, I'd easily put all my chips on Bun vs. Deno. It's a "plan where the puck is headed" vs. a "plan where the puck is now" approach, and from that perspective I see a lot more value in Bun.


> …I already have taken care of a lot of those Node issues in my own projects, there is not a ton that I see compelling in Deno.

I felt this way too before using Deno for a while. So far I enjoy:

- Breaking from npm/node modules support on purpose turns out to be a real plus for me. It's refreshing to have dependencies referenced by URL, and it's good to have them cached centrally by default (in $HOME/Library/Caches/deno on macOS and $XDG_CACHE_HOME/deno or $HOME/.cache/deno on Linux, for example).

- Use of web platform APIs (https://deno.land/manual/runtime/web_platform_apis ) and the work to standardise those across platforms (https://wintercg.org/ ) is encouraging.

- `deno lint` and `deno fmt` make adopting and using JS/TS feel more like Rust/Go/other languages with good built-in ceremony-free tooling.

- Fresh is turning into a very nice Next.js/Astro alternative (https://fresh.deno.dev/ ) that I found very easy to learn and deploy, with great performance and developer experience out of the box.

Bun is interesting, but I wish it didn't embrace node modules: perpetuating its use instead of attempting to move the community on by recognising it for the mistake it was feels sad to me. (See Ryan Dahl's “Design Mistakes in Node” PDF or talk for more: https://tinyclouds.org/jsconf2018.pdf and https://www.youtube.com/watch?v=M3BM9TB-8yA )


To your point, it would be refreshing - and appropriate - if more projects baked "documentation-first" into their purpose. For all intents and purposes it's "marketing materials". When it's lacking, as you noted, it creates doubt. "Join us?" Huh. Join what?!?

This is one of those engineering things that happens year after year. We all understand the value of good docs. Yet it keeps happening. Why?


These days now that I'm fully in charge of building my own product, I often wonder if we place too much emphasis on centralized documentation sites that users have to deliberately seek out and visit vs in-context documentation snippets that show up as closely as possible to where users might actually need them, deeply integrated into the product and user journeys.

Users don't search for documentation for the sake of finding documentation. They search for documentation because they want to know how or if our product can solve a particular problem they have. My hypothesis is that the documentation discoverability problem is really just a symptom of the product discoverability problem, and that centralizing docs in 1 searchable website to make docs "discoverable" is only addressing the symptom, when that effort can be much better spent addressing the root cause by making the product itself more discoverable and deeply integrating useful documentation into it.


> centralizing docs in 1 searchable website to make docs "discoverable" is only addressing the symptom

a searchable docs website is the most important thing for me. having to "discover" an api by stepping through code and comments is a waste of time--only useful when you already know the basics, which requires documentation


Centralized documentation is valuable for several pieces:

* Easily get an overview of the entire API. I may just be scouting the library, so I want to understand what the API looks like.

* Examples as a starting point. How do I use your API?

* Makes your product more discoverable / approachable to potential users.

Think of your users like a funnel - how are they using the library? What are the common reasons you’re losing potential users? What are the common reasons you’re losing existing users? Users are also different so you have to analyze by cohort.

Now can something better be done? Maybe. It takes a lot of work and would have to address the above issues, and I don’t know if it would necessarily change the need for something centralized.


I read the documentation before downloading anything, to find out if the software will work for my problem. I’m not sure that “in context” helps with that?


That’s what Kaseya is doing with their software, basically having contextual assistance for each feature as an “AI Buddy”, where users can choose to listen/learn or just continue using the product, all integrated as one


I think it's because documentation (and design) requires different skills than writing code. A great programmer is not necessarily a good writer or designer. There are rare gems, people who have a balance of such skills, but most of us struggle to write useful documentation, even the bare minimum necessary for others to get started - or any at all. The code, or the lead dev's brain, is the documentation.

A perfect example of this is documentation generated from docblock comments. Some projects only have such docs, and expect users to go from there. It's as if programmers envision their audience as another machine to program.
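To make that concrete, here's a hedged sketch (the function and its name are invented for illustration) of what docblock-only documentation looks like in JavaScript: a JSDoc comment like this gets extracted verbatim into reference pages, so if nothing else is written, this terse API note is all a newcomer ever sees.

```javascript
/**
 * Convert arbitrary text into a URL-safe slug.
 *
 * @param {string} text - The input text.
 * @returns {string} A lowercased, hyphen-separated slug.
 */
function slugify(text) {
  return text
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-") // collapse runs of non-alphanumerics into one hyphen
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}

console.log(slugify("Hello, World!")); // "hello-world"
```

A doc generator can surface the parameter types from that comment, but not why the function exists or how it fits the rest of the API - that narrative still has to be written by a human.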

I feel similarly about business/marketing aspects of software projects. Many programmers seem to assume, "If you write it (the program), they will come."

Successful software is so much more than just the code, it usually involves a communal effort of various skills, especially human communication, including writing good documentation.


Because being a small project with limited resources requires devs be as resource efficient as possible. Docs, demos, tuts are all super important but it creates an additional maintenance overhead - if they are not maintained up to date, they become worthless.

So for the early stages of a project it makes sense to keep the overhead low and work with a small group of focussed early adopters.

As projects grow, mature and become more stable, more investment into learning material becomes important. Doing it too early burns valuable resources.


I've learned over time that it's much easier to incrementally update existing docs than it is to add them to a large project from scratch - so every single one of my projects, no matter how small, now has documentation from day one.

As I add new features I update the docs to reflect the changes, trying to keep those documentation updates in the same commit as the tests and implementation.

Since I started doing this the quality of documentation I produce has gone up a ton, because I'm constantly exercising those muscles.

It's been a huge win for my coding quality and productivity too - I don't have to remember as much stuff because I can refer back to the docs, and documenting as I go along causes me to make much better design decisions.

Worth noting: I'm a native English speaker writing documentation in English, and I've been blogging frequently and writing online for over twenty years so I've accumulated a LOT of writing experience. So what's easy for me may not be easy for other people!


Good point but writing and maintaining README.md files for your solo projects is nothing like writing proper docs for big projects with multiple contributors and users. That's a full time job, isn't it?

Writing non-code is easy and fun to me - I used to do it for a living - but when I am pressed to deliver features, documentation takes a backseat. Also, I've never felt like documenting other people's code


The two largest projects that I write documentation for are https://docs.datasette.io/ and https://sqlite-utils.datasette.io/ - they're both big projects with multiple contributors, though most of the commits are still from me.

I find the same approach I take to READMEs for smaller projects scales up pretty well: any time I make a change to one of my larger projects I ensure that the documentation is updated as part of that change.

If I accept a PR from someone without documentation I'll follow up by adding the docs for it myself in the next commit. I think that's more reasonable than demanding people add documentation if it's not necessarily their core skill set.

Honestly, I'd love to experience working with a professional technical writer on this kind of thing, but that's not something that's happened at any point in my career to date!


This is good advice, thanks. Really want to practice my writing more as well.


> As projects grow...

I'd argue if should replace as. The point being docs are all but essential for growth (and added participation). To neglect them is self-defeating.

Put another way: Docs are an investment. They have a known return.

Again, we've seen this. We know this. Yet again and again there's case after case of docs denial.


I've been working on my OS library for 9 years now, almost entirely by myself. For the majority of that time I kept documentation separate from code. Now I do enjoy documenting stuff - I have diplomas in both Computing, and Creative Writing - but my various approaches to documenting the project would start off with the best intentions but soon enough degenerate into an irrelevant mess as I tweaked code, forgot to update related webpages and demos, etc.

In 2017 I stopped working on the project entirely: I'd coded up a new major version of the library but never released it because knowing I had to overhaul and re-document so many different pages, demos, etc ... like a dementor, it sucked all joy from me.

Then in 2019 I recoded the entire project from scratch. This time I thought about the documentation first, before I wrote a line of code. I decided the best approach was to generate the main documentation from inline comments. I did this for both core code, and for demo examples - the demos stopped being standalone afterthoughts and became instead my end-to-end testing suite. To present the documentation to any developers who might show an interest in the library, I coded up the library's website in a way so that the core documentation[1] and the demos[2], with easily accessible code, could be very easily copied over to the website whenever I rebuilt the library (which regenerates all the documentation). I also added a set of lessons to the website, and a set of "How do I" articles - both of which are an ongoing project.

The library's website doesn't have any functionality where users can ask questions - but that's what the GitHub issues and discussions pages[3] are for.

The system isn't anywhere near perfect (I still need to automate the demo testing, for example, and there's no CI for copying stuff from the repo over to the website, etc) - but, given the depressing messes I've managed to fall into in the past ... it's working really well for me!

[1] - https://scrawl-v8.rikweb.org.uk/documentation/

[2] - https://scrawl-v8.rikweb.org.uk/demonstrations/

[3] - https://github.com/KaliedaRik/Scrawl-canvas


The boring (and probably mostly correct) answer is that engineers don't like writing documentation so it is the last thing to get done, if it gets done at all. This is especially true of FOSS projects. You may be motivated to spend your free time hacking on something cool and interesting, but spending your free time writing documentation is another matter.


Many engineers do not value good docs, even if they say they do.

Many people won’t read them at all, no matter the quality, and so it can feel like work wasted.

Plus writing docs is a different skill set than writing code that many simply don’t like to do.


Because a poorly-documented product that does something, is much better than a well-documented product that does nothing.

Engineers have to decide where to spend their time, and taking time away from feature development when your early-stage product doesn't have many features is not a winning move.


> a poorly-documented product that does something, is much better than a well-documented product that does nothing

not necessarily. with the latter, you know the constraints of a solution and can make an accurate judgment on whether it's worth using


True, but if it’s too early for a good documentation then talking about taking someone’s crown is too early as well.


It’s not fun to write documentation, and for documentation to be good, it has to be persistently updated.

It takes a lot of discipline to do things right.


Agree. But it’s not either/or. I invested easily hundreds of hours in docs before launching TinyBase.org - but I don’t think it made the subsequent community building challenge any easier.


Yes, fair enough. And I'd like to add

Docs !== community

Docs === Odds of gaining traction

To your point, achieving community is a whole other beast to battle.


Because people are "scratching an itch", not creating a product. Most engineers are terrible at writing documentation.


True. Adding tho' that the original topic isn't a side project. It's a team bringing something to market and looking for participants.

And to your point, writing docs *is* something to consider when building the team. As is making sure the culture has a reward system if such behavior is important.


> This is one of engineering things that happens year over year. We all understand the value of good docs. Yet it keeps happening. Why?

Because writing them is irrational?

Time is finite, only a rounding error number of projects (especially open-source ones) are going to be successful, with or without docs.

Only an irrational developer would think that they are going to hit the 1-in-a-thousand jackpot with their project.

Any time spent producing documentation is time not spent on adding features and fixing bugs.

Spending time on documentation over and above the bare necessity needed to get out a working product "just in case we win the jackpot" is simply irrational.

This is probably why you don't find many projects spending significant startup time on good, clear and comprehensive docs (as opposed to a README and an FAQ and nothing more): the ones that do as you suggest mostly die before even getting users.

TLDR; don't treat your startup project as if you already have 5m users who depend on you. The odds are that you are never going to get to that point without a good product, and the better the product, the fewer the docs needed.


because time is finite

if you spend your time making sure your product works but don’t write thorough documentation, it can become a breakout hit even though some hacker news commenters may complain about a lack of good documentation

if you spend your time writing great documentation, the product will suffer and nobody will use it


> if you spend your time writing great documentation, the product will suffer and nobody will use it

This is an unfortunate perspective. And I could say the opposite is true. If you don’t spend time writing great documentation, your product will suffer and nobody will use it.

Documentation gets neglected because developers don’t feel like doing it. If you are building a product for developers, documentation is critical.


Docs aren’t the fun part of dev.


They are more fun to me than writing tests


Unless you make it programmable


And it doesn't address the semver malware-injection bug demonstrated by the colors author. Funny, isn't it: any one of the thousands of npm package authors can inject malware into our computers and nobody gives a shit.

https://snyk.io/blog/open-source-npm-packages-colors-faker/


> And it doesn't address the semver malware injection bug demonstrated by colors author

[Not a fan of Bun] In Bun, you can pin versions with package.json just like in node.js

What would address the malware injection when someone chooses to auto-update packages as part of a build?


You can only pin your direct deps.


A lockfile is a way to lock your indirect dependencies. If you only install based on the lockfile every dependency is pinned.
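As a hypothetical illustration (the package name, version, and hash here are made up), the pinning lives in the lockfile rather than package.json: package.json declares a range, while the lockfile records one exact, integrity-checked version for every dependency, direct or transitive, and `npm ci` installs only what the lockfile says.

```jsonc
// package.json — a range your direct dep may float within
{ "dependencies": { "tiny-logger": "^1.3.0" } }

// package-lock.json — the exact resolved version, recorded transitively
{
  "node_modules/tiny-logger": {
    "version": "1.3.2",
    "resolved": "https://registry.npmjs.org/tiny-logger/-/tiny-logger-1.3.2.tgz",
    "integrity": "sha512-…"
  }
}
```

With only package.json ranges and auto-update, you inherit whatever the registry serves next; committing the lockfile and installing from it is what actually freezes transitive deps (Bun ships its own lockfile for the same purpose).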


You cannot audit and pin all your transitive dependencies. The amount of churn the lockfile goes through is insane.


Dependabot watches all the transitive dependencies in your lock files. For better and worse. For worse in that it's not a great developer experience to get a PR on a low level transitive dependencies, which is also one of the largest complaints about Dependabot that it often feels too low level and not working at the dependency level you are working at. But Dependabot (and npm audit) still exist to audit all your low level transitive dependencies in your lock files.


It’s a work in progress, it’s definitely not meant to be ready for production workloads yet. It’s in beta, I’m assuming if they do get to a full stable release, node compatibility will be far better, Express will work, official docs will be improved, etc.


Another concern I have is that, cool as Zig is, it is still pretty unstable, with breaking changes almost every release. So I'm not sure I would want to trust this with production code.


> Documentation is limited, but Bun’s Discord is very active and a great source of knowledge.

What sort of knowledge are we talking about?


The kind that requires another proprietary account behind a nonfree service to get access to.


True story: “But our Discord is very active“ is the new “hard pass”. Tell your friends. We’re gonna make fetch happen.


This just reinforces the power of the Java community to me. The servlet container is completely a choice built around standards, unlike the JS community, where it still seems to be Node or bust. It makes Java look light years ahead of JS.


Discord scares the crap out of me. I just don’t trust the app. I couldn’t figure out how to stop it from opening on boot. And it has access to my camera and mic for some reason? Managing identities across different communities seems leaky at best because I think I have to change my name after joining. I’m sure a lot of this is my own ignorance of the tool but I have no interest in learning a modern gamer chat service. None of it is going to stay the same in 10 years, or even 10 weeks. That’s all a hard pass.

This says nothing of what a terrible choice it is for a knowledge base, especially long term. The only way you choose Discord is if you aren’t thinking beyond the “next release”. That’s a huge red flag in a fundamental framework project like this.


Changing your nickname on a server doesn’t hide the name used on your account from anyone. This is why I’m not on my work’s discord server.


You can have multiple accounts in the Discord client now.


Sure. For now. But I have to carefully police the features of every release before I start the application.

It’s a minefield. People will get hurt. Those of us who have been around will be entitled to say we told you so. But I hope users move away before they get hurt, which will happen.

This is a proprietary application and protocol. It will come back and hurt you. Get out while you can.


> And it has access to my camera and mic for some reason?

That's not unexpected given that it has voice and video chat functionality.

> if you aren’t thinking beyond the “next release”.

I think that lack of long-term thinking is endemic to the JS community in general, unfortunately.


It’s not unexpected but it is undesired.


The post is wrong about Deno's Node compatibility. Deno actually has a Node compat mode [1]

[1] https://deno.land/manual/node/compatibility_mode


True, but in practice it barely works.


Fan of Deno here. You get permissions, standard library, no node_modules, Typescript, built in test, fmt and linter, auto reload for development etc.

Their new “Fresh” framework is a nice touch too along with the Deno deploy infrastructure and good documentation.

I’ve used it for a few projects now, no issues.


> Documentation is limited, but Bun’s Discord is very active and a great source of knowledge.

:-1: I loathe using Discord's search interface to look for information. I'm sure the Bun team doesn't see this as an end goal, but I wish forums were still a thing. At least they're mostly indexed.


> Choosing proprietary tools and services for your free software project ultimately sends a message to downstream developers and users of your project that freedom of all users—developers included—is not a priority.

— Matt Lee https://www.linuxjournal.com/content/opinion-github-vs-gitla...


I agree, but nobody bats an eye if your open source project is hosted on GitHub.


I see your point, and I don't use GH when I can avoid it, but those are very different in my mind. GH, for all its flaws, is a value-added hosting service on top of git. It interoperates freely with any other Git host and you can clone/pull at any time. Discord is in an entirely different category: a completely proprietary reimplementation of IRC, with rent-seeking features bolted on.


IRC is horrible compared to Discord; at least you have a history of chat vs nothing.


This is why SourceHut provides an IRC bouncer with its paid subscription: they are committed to the usability of IRC.


For issue management, sure – but just using Github doesn't remove the semi-decentralization of git.


A friend and I recently started a dev collective (https://pico.sh) to work on projects targeting the smol web. We decided that many popular communication and collaboration tools would not fit into our stack. Things like GitHub and discord were out of the question pretty quickly.

Instead we opted for Sourcehut and irc and have been pretty happy with the results.


At least some of the things on GitHub are not inside a walled garden.


I am all for moving away from GitHub, but GitLab is dogshit:

https://gitlab.com/gitlab-org/gitlab/-/issues/22578

https://gitlab.com/gitlab-org/gitlab/-/issues/556

I think Codeberg is a better option:

https://codeberg.org


How is that first issue a blocker for you? It arguably makes the platform worse to have more gamification and social features.

Anyways, GitLab vs. GitHub is a myopic lens for the quote I posted. All proprietary software choices have consequences, and in this case we're focusing on the parent's issues with Discord vs. open alternatives.


In this case, you're not supposed to search. You're supposed to ask your question, hoping that once somebody gets annoyed enough to repeat the answer over and over, this question gets deemed worthy of documenting.


This was my experience working with Unreal Engine. I couldn't figure out what to do next until I could get somebody to take a look at my problem, and that would take multiple postings on Discord to get anyone's attention. Somebody had the gall to get upset by my question because he was in the middle of getting help on his.

This trend of Discord-only support and community needs to stop. It just creates toxicity, and those with malicious/narcissistic personalities dominate.

Rarely do fast response times create a sane knowledge-based community; they only agitate, and lots of noise is created as a result.


So after Node, then io.js, then the merge, then Deno, now Bun.

JS started to get a bit sane, but I guess we are going back to fragmentation and madness pretty soon.


Not really. Know your history first. Node split to io.js due to team disagreements and fragmentation. io.js made an immense amount of progress, adopted semver, and had all the momentum. At the time, Node was stagnant. The choice to merge was rational and good. Deno is a separate project, and while it has a Node compat mode, it is meant to be a TS-first ecosystem with many different rules and methodologies. Bun is an experiment to see if it's possible to build Node greenfield today. Only the brave will use it in prod, and only the wild mustangs of management will allow it.

Try not to give into FUD. The JS universe is fine.


1 GB+ of files to create a React app seems pretty crazy to me. That's an order of magnitude more than Rails.


perfectly reasonable to choose not to use react for that reason. CRA is also known to be very bloated. See Vite et al.


Nice article.

Was listening to this chat [0] recently. Jarred Sumner is doing really cool things with this! Excited to see where it goes.

[0] - https://youtu.be/rL4qpniIR7o


That’s a great interview. He cites one of his motivations for wanting to make the dev cycle faster: “if it is slow you get distracted and read Hacker News”…


I've given Bun a spin against several of my older Node.js projects and the lack of Express support killed it for me.

As a drop in replacement for Node it has a long way to go. A promising start, but I'll check back in a year


One of the recent releases added partial express support, you might want to give it one more shot.


> Will Bun JavaScript Take Node's Crown

It's far too early to be asking this question. Deno has been around for longer, and it hasn't decrowned Node yet.


IMHO Deno has been approaching it "wrong", so it'd never take over Node.js. Deno has 3-4 big differences from Node: URL imports/no npm, explicit permissions, web compat, TypeScript.

As both a dev and as a package publisher, those are either not an advantage or a straight disadvantage. npm, for all the downsides it has (and it does have them), has IMHO been a huge net positive for the JS/web community. Explicit permissions in the land of 100 dependencies are just a non-starter. Web compat is def the biggest advantage as a dev myself, but Node.js has finally woken up and is catching up. I don't use TS so won't comment.

IMHO the main advantage of Deno, like IO.js back then actually, has been to make Node.js stop being sluggish and actually accept PRs/new challenging features.


I had completely given up on Node and JavaScript in general until I had a chance encounter with Typescript recently. Seriously looking at deno specifically for the built in TS support. Will never touch basic JS again.


From Deno's early days, it seemed their goal was to make something fairly different to Node. Holding up "had it replaced Node yet" as a yardstick seems misguided.


Agreed. The guy just wanted to make something he felt was better. I've never seen a mission statement that said the goals were to replace Node. It's a separate beast.


"No native Windows support" should be written in all caps at the beginning of the article. This is a dealbreaker for most developers working on Windows, and this project will never take off until it treats Windows as a first-class citizen. WSL helps but not everyone uses it (probably most people don't use it for regular development environment).

Don't believe me? Just wait and see how this turns out in three years.


Fair point, but I sometimes wonder if there are even any web developers left who use Windows as their main platform. The one thing that has probably changed most in the last 15 years is that Windows isn't the center of the (coding) universe anymore. And (from what I can tell), Windows support with node.js isn't perfect either, otherwise tools like rimraf wouldn't be needed.


>if there are even any web developers left who use Windows as their main platform

There's quite a lot of developers at stodgy Fortune 500 non-tech companies that are sort of forced to use Windows for development. Either explicitly, or through poor enterprise support (vpn connectivity, local admin restrictions, difficult path to purchase a Mac, etc).

It's one reason things like Gitbash, WSL, Docker Desktop, etc, are very popular.


> There's quite a lot of developers at stodgy Fortune 500 non-tech companies

"Quite a lot"? There are more developers at Fortune 500 than FAANG for sure.


> There's quite a lot of developers at stodgy Fortune 500 non-tech companies that are sort of forced to use Windows for development.

In that sort of company it’d take 2 years to get approval to install it anyway. If you submit the paperwork now there’ll probably be a windows port by the time it’s approved.


This might be the case in your specific social circle, but it doesn't seem to be generally true https://survey.stackoverflow.co/2022/#section-most-popular-t...


Those numbers are all developers overall. In the Django developers survey (i.e. web only), Windows is just above 10%. The most used platform is Linux (42%), Mac 32%, Windows with WSL (i.e. Linux in a VM) 17%, and straight up Windows is a paltry 12%. In other web frameworks the numbers are likely worse, since Django has a pretty good Windows support.

https://medium.com/codex/jetbrains-django-developer-survey-r...


Actually, they are mostly web developers https://survey.stackoverflow.co/2022/#developer-roles-dev-ty...

In case you are seeking anecdotal evidence: does your company employ offshore devs? Try and ask them what do they use. Outside of the few countries where most people can afford Macs, things are different.


I wonder if respondents were allowed to check more than one item on that list when answering the question? If so, there is likely a lot of double counting. Large, overarching surveys are quite difficult to create and interpret. The Django survey I quoted earlier was a worldwide survey for what it's worth.


> I wonder if respondents were allowed to check more than one item

Yes, you are right: they definitely were allowed to check more than one item. The sum of all options comes to more than 100%, so that must have been the case.

But even then, it's hard to overlook the fact that the first three options in the list are related to web development, and with quite high percentages.


I'm coming out of the game development 'social circle' which is still a Windows fortress, but even there the wall is slowly cracking ;) In general I notice more and more that 'new peeps' are not automatically familiar with Windows, but instead started to tinker with programming on Linux.


We are talking about web development, while the survey you linked is about all software developers. No one argues that Windows is king in software development world in general, but for web development I would argue Linux/Mac duo is way more popular.


Any data to support this claim? I'd argue the opposite: Especially in web dev, Windows is king since most people, including semi-professionals, do some sort of web dev.


> No one argues that Windows is king in software development world in general, but for web development

But given that the majority of developers are web developers[1], how can that be the case?

[1] including, unsurprisingly, the respondents to that survey https://survey.stackoverflow.co/2022/#developer-roles-dev-ty...


Health care, logistics, law, finance, retail - there are a number of major industries that all still use Windows as primary. I have a good friend working in tech for Walmart, and another for a major hedge fund; both are Windows shops.

It's naive to assume that Windows isn't prolific. There's probably no good way to quantify numbers, but it's a major player still and likely always will be. WSL certainly helped keep that a reality.


> And (from what I can tell), Windows support with node.js isn't perfect either, otherwise tools like rimraf wouldn't be needed.

I don't see how rimraf is related in any way. If you want to remove folders by calling a shell command (with full knowledge of all potential security and portability risks), your code is going to be platform dependent - it's not Node's responsibility to reimplement bash and/or GNU coreutils to make it magically work. Therefore, you need a Node re-implementation of the same functionality, either in the standard library or as a package.

And besides, rimraf has been unnecessary for a couple of years now, as Node's standard library includes a direct replacement: https://nodejs.org/docs/latest/api/fs.html#fsrmsyncpath-opti....


It might not seem like it, but the vast majority of developers exist outside of the Hacker News sphere, and I have no doubt that Windows is by far the most popular dev platform. WSL2 is making it a lot easier to use Linux though.


> Fair point, but I sometimes wonder if there are even any web developers left who use Windows as their main platform.

I kind of have to as of late, because due to corporate security policy I get around a minute of internet access from WSL and then the antivirus steps in and blocks it.


If that's because you're using wsl-vpnkit, you might want to see this commit:

https://github.com/sakai135/wsl-vpnkit/commit/94a85aaf4db365...


I've been working in web dev for a decade and have yet to see a single dev using Linux. I've seen several working on Macs, from designers to QA. Strictly speaking I'm a bit biased, since I work in the .NET stack. But still.


Same here. Lots of .NET software around in my country, and most of it isn't on .NET Core yet (and having done it once, migrating away from .NET Framework isn't always easy... though it did get easier with .NET 5)


If you are developing for the web, what difference does the OS make? Web technologies work the same on any OS.


Mainly subtle differences around command line tools and the file system. There are UNIX cmdline tools collections for Windows (like busybox) running in the vanilla cmd.exe or Powershell, but they don't quite fit in (for instance the way how string escaping and cmdline argument parsing works). You can use bash in a mingw 'UNIX emulator', but then you can just as well switch to WSL. And those little things propagate and amplify upward into toolchains and workflows.

It's possible to create command line tools that work across Windows and UNIX-oids, and I appreciate this, but it's a lot of additional work (even 'cross-platform' solutions like Python don't fully wrap this stuff, even though they do their best).


> it's a lot of additional work

FWIW, I have been remarkably impressed with Rust for this. The stdlib and package ecosystem are unusually good at building abstractions that work across *nix and Windows, and so the average command line tool written in Rust usually has good Windows support.

Of course, the additional work hasn't really gone away - it's just been relegated to libraries unusually effectively :)


Sometimes the target OS for deployment matters. Obvious case I run into: a deploy broke because a developer using Windows didn't realize he had some MiXeD case file name with the wrong capital letter. The Linux OS on the production system noticed. Sometimes it's the availability of native modules. They tend to exist for Linux, because it's the target for deployment, and sometimes not for Windows.


Basic CI tests should catch those cases.


The file names, yes; modules are caught at development time. "We can't use PackageX because it doesn't exist for Windows and Joe won't be able to run the project anymore." A VM first and WSL2 now solved most of those problems, which leaves the question of why bother with Windows in this kind of business at all.


As a web dev, my choice of OS influences:

* What IDEs/editors I have access to

* What image editing tools I have access to

* How easy it is to get my server’s backend running on my local machine as a dev environment

The last one is most relevant to this conversation. If I’m working on a bun-based project, not being able to run a local dev copy on Windows easily is a nontrivial obstacle.


While the final output should be the same, the development tools are not and that's really where all the subtle problems are.

One of the reason WSL exists is to solve this problem.

(assuming bun doesn't work on native windows but works under wsl)


WSL has its own pain points. It’s not the catch all solution everyone claims it to be.


You cannot install safari on Windows and Linux. If you want to support apple users, you need to get a mac.

Safari often has bugs and rendering issue which aren't present on Chromium or Firefox and need fallback.


Not Safari, but WebKit: https://build.webkit.org/


Lots of little gotchas if you don't at least do a fair amount of testing on Windows, since that's what most of your end users (usually) will be on.

Things like "www-authenticate: Negotiate" in an SSO environment, people pasting rich text into web forms, handling environments with private certificate authorities, and so on.


Windows is painfully slow for doing anything.


I swapped between Windows and Linux (Mint/Pop) for a while because quite a lot of JS ecosystem things were problematic on Windows. WSL2 has solved most of those problems though.

I have bad luck with Macs; getting dev tools properly installed seems as hard as on Windows, and everything falls apart a few times a year. The OS also seems to break itself and crawl to a halt sometimes (>1 minute to open settings). Linux distros were stable only if I stayed on the happy path (single monitor, integrated graphics, don't try to sleep/hibernate) with close to default config. Windows can seemingly tolerate a lot more fiddling, at least since 7.


I hate it but where I work WSL is not allowed and Docker requires continuously renewed exceptions.. InfoSec/IT says they can't see what we (or malware..) are doing within..


There are two developers on my team who have the option of using the latest Mac laptops but still insist on using Windows ThinkPads.


Yes. My entire company. And it’s not some little 5-person shop. They started that way long ago and continue to this day.


I think "WSL is our Windows strategy" is perfectly acceptable for "bleeding edge"/modern things like this. I've been using WSL as my full time web development environment for years now. Thanks to VSCode's great remote support for it I don't notice any downsides to it.


I don't like WSL; it magically works with VS Code until it doesn't, and some extension breaks, or it runs super slow, or git commit takes 10 seconds.


VS Code remoting is great and magic when it works well (which in my experience has indeed been most of the time), but it's still so much overhead versus native builds and native support. I use WSL for the rare times I need to use Ruby, but Node has been mostly good enough in Windows for years at this point and it is nice to have native builds with no remoting overhead.


We should also talk about why this is the case: JavaScriptCore. The reason Bun is so fast is mostly that it uses Apple's JavaScriptCore, so what all of the benchmark comparisons are really doing is comparing Google's V8 engine to Apple's JavaScriptCore engine.

So basically, there will never be Windows support until JavaScriptCore can be used on Windows, and I'm not entirely sure of the state of that. My guess is that it has limited to no support for that scenario.


Zig, the programming language Bun is written in, is currently having some Windows headaches of its own: https://github.com/ziglang/zig/issues/12420

I wonder how much of Bun's immaturity is due to Zig's own immaturity.


Why not point the finger at Windows? Development on Windows is archaic.


> Why not point the finger at Windows? Development on Windows is archaic.

How so? Zig is a programming language, so I'm guessing the main interactions it needs with the OS are file IO. It shouldn't need to do any GUI work as long as it provides proper bindings to the C functions. And file IO is essentially cross platform in C++ as long as you use the stdlib. Threads are also essentially multiplatform if you're using the stdlib. I also don't know if Zig is written in C, C++, or Zig so it may differ.

Generating binaries is different, but I wouldn't consider windows binaries any more archaic than Mac or Linux binaries, and I'm not sure if zig already uses LLVM backend or something anyways.

But given all that, writing basic Win32 code to do file IO or any sort of OS level interactions isn't any less archaic than what I've had to do in Linux. It's an API, and it's got a lot of cruft built up over the years, but so does any sort of Linux OS API.

Here's the Linux API for creating a file[0] and here's Windows'[1]. There are more parameters in the Win32 version, but the documentation is solid and gives a lot of tangential information. I actually prefer the Win32 docs to a lot of the Linux docs that I've used because they describe all the details very explicitly. So I wouldn't call Windows any more archaic than any other OS.

Basically what I'm saying is, it takes like 2-3 hours at most to add Windows support to a programming language unless you've architected your code in a way that tangles OS operations with regular operations. It's really not that hard to keep OS code separate from the app's logic to make porting the app to different operating systems trivial.

[0]: https://linux.die.net/man/3/creat

[1]: https://docs.microsoft.com/en-us/windows/win32/api/fileapi/n...


I was with you until "it takes like 2-3 hours at most to add Windows support to a programming language..."

C'mon.


This is all that it took me to port a fairly sizeable code base to Linux[0]. This commit allowed me to run the app with the only problem being some font issues that I needed to fix by modifying how I used a library.

The total:

> Showing 26 changed files with 255 additions and 90 deletions.

If you architect your code well, porting between different systems shouldn't take any more than a few hours ;)

Edit: I just looked through the diff and remembered that the bulk of these changes was fixing warnings that surfaced from using a different compiler.

The actual code that I changed necessary to get this running on Linux was in File.cpp and consisted of 124 lines of code. Going the other way (from Linux to windows) would have been just as simple, I just would have added the code in the __WIN32__ macro block instead of the code in the __linux__ macro block.

[0]: https://github.com/codingminecraft/StreamMinecraftClone/comm...


A small game is not even remotely comparable to an average programming language.


It really isn't that much different though. After skimming through the source code of Zig, you can see that well over 90% of the code is OS independent.

Not only that, it looks like they do have windows support and it's just failing atm. It also looks like they have cleanly separated all OS functionality from the logic. This is where it looks like the majority of the OS dependent code lives[0], and the implementation for this is 1000 lines of code. So clearly, it looks like it shouldn't take more than a few hours to get even a programming language up and running when porting it.

Further, it looks like they're using the cpp stdlib to assist with some OS dependent functions[1]. They're clearly using at least:

* std filesystem

* std future

* std iostream

* std mutex

* std thread

* std atomic

And more. So if you're being smart about things, which it looks like the developers most certainly are, then you don't need to reinvent 90% of the OS dependent code and can instead use the stdlib that already exists to automagically get that functionality.

[0]: https://github.com/ziglang/zig/blob/master/src/stage1/os.hpp

[1]: https://github.com/ziglang/zig/blob/a9c4dc84f487c4764c578743...


Windows isn't archaic; it's just not Unix. So blame the Unix monoculture.


You can use JavaScriptCore on Windows right now: https://github.com/Lichtso/JSC-Standalone


Not saying you're wrong about the windows support but when it comes to WSL, my entire department uses WSL for their primary development environment


Ooh. I find it a bit painful. Stuff like watching for file changes doesn't work if you mount C:/. Adds friction to web dev for sure.


Dogshit IO performance when accessing native files makes it unusable for me.


I suggest never accessing native files; if you are still keeping your code in C:\ somewhere, I think you're approaching WSL the wrong way. Go all in; I've never looked back.


I'll second this. I treat WSL (2) as a Linux box I'm working on, which I can access via a terminal and \\wsl$ (and VS Code, crucially.)


This is why I don't use WSL 2. WSL 1 has much better performance when used on C:\


> Don't believe me? Just wait and see how this turns out in three years.

There are so many projects which do well without a native Windows port. It may fade away in three years, but lack of Windows support will not be the primary reason for it.


It is the other way round: Windows support usually comes because something is popular (and therefore has the funding, resources or help to do so from keen Windows users).


I am genuinely curious who is using production node.js web apps on Windows server OS. I'm sure there are niche things like a massive legacy Windows shop that's slowly moving off, or trivial load internal applications where it doesn't matter. But it would really surprise me to see someone take a major bet on running large public, internet facing load on node + Windows server.

ASP.NET + Windows, sure that makes total sense. Even C++ + Windows is solid and very performant. But node.js has always been a second class citizen with Windows support (it wasn't until Microsoft themselves put fulltime devs working on node that it supported Windows) and it's a huge bet to take with very little benefit and tons of potential failures.


WSL is there for a reason though: Windows is severely underused in some development areas, and web is one of them.

Not having windows compatibility is not uncommon working on the web.

As an example, out of the 100 developers at my company working on a web product, to my knowledge zero are using Windows.


that might be the case in your company, maybe in your state or even in your country, but not outside of it https://survey.stackoverflow.co/2022/#section-most-popular-t...


It's kind of amazing how our beliefs are still mostly formed by our surroundings. Even in the age of internet where information is plenty and free.

I still read often in HN that "most people use iPhones" and "almost no dev uses Windows".

Makes me question if my own beliefs have these enormous misconceptions too.


Someone on Twitter was upset that people they follow were following someone they disagree with. I pointed out that it's healthy to make yourself aware of dissenting opinions, and that following someone doesn't mean you align with every one of their beliefs and opinions.

They seriously couldn't fathom the idea of intentionally following someone they didn't agree with.

So not only are people aware of their filter bubbles, they guard them like it's the responsible thing to do.


Cult members need something to hate to reinforce their identity.


Question you should. For decades I've been trying very hard to avoid such statements because what do I know about "errybody" or "most people". I find it very off-putting when someone uses these levers in a conversation


Stack Overflow isn't only for web development though; there are areas where Windows is still popular, I 100% agree on that.

Even Microsoft themselves recognise it indirectly, that's the whole reason why they built WSL.


> there are areas where Windows is still popular

those areas where Windows is still popular account for a tiny percentage of the devs who responded to that survey. The majority of responders define themselves as backend, frontend or fullstack developers - which are web development roles https://survey.stackoverflow.co/2022/#developer-roles-dev-ty...

TL;DR nowadays, most developers are web developers.

PS: the reason the total of all responses is more than 100% is that this was a multiple-selection question.


That "over 100%" is more significant than you think.

If you use windows primarily, there's a very big chance it's the ONLY operating system you use.

If you use MacOS or Linux, there are most likely windows machines sitting around too.

In my case, I have a windows laptop from work in addition to a macbook. I'd check both boxes, but I've only opened that windows machine a handful of times for esoteric windows issues.

If this idea held true just 50% of the time, then real use of windows would actually be somewhere around 25% and that's assuming a 100% overlap between the Linux and MacOS people (a bad assumption).

Another telling point is the tiny sliver of WSL responses. MS Server is so incredibly unpopular that even the majority of Azure instances have run on Linux for years now. Writing on Windows without WSL then deploying to Linux is a recipe for disaster.

A question where the reader has to infer a lot is a bad question. They should have asked about how much a particular OS is used.


> WSL is there for a reason though

to be clear, I'm not disputing that in some areas of web dev, Windows is less supported than Linux, I'm just saying that a significant percentage of web devs use Windows, whether or not an open source project decides to keep this into account is obviously not up to me to decide, and I can understand why some projects, especially smaller one, might decide to focus on Unix/Linux in the initial periods.


Windows users are fairly conservative and won't be looking at the latest tech that isn't established yet


> This is a dealbreaker for most developers working on Windows

Yes but how many JS developers use Windows?


Lots


> This is a dealbreaker for most developers working on Windows

How many developers are really working on Windows? That's got to be a rounding-error of zero.

There are so many programming tools with basic or non-existent support for Windows and nobody notices.


HN bubble alert!

I think most programmers are actually using Windows as their development environment.

https://insights.stackoverflow.com/survey/2020#technology-de...


Even the developers I know at Microsoft use Linux or macOS!


You were just shown statistics that disprove your idea that almost nobody codes using Windows... over half of the developers who answered StackOverflow's survey are using Windows. Pointing at more anecdotes around you does not change this fact.


Maybe Stack Overflow skews to Windows developers? I don't know how good their data is.

It's hard not to look around with your own eyes and notice that you never see anyone using Windows - like I say, even the Microsoft ones!

Do you meet a lot of developers using Windows? What kind of circles is that in?


"The survey must be bad because it doesn't fit my optics" is an awful take.


Most surveys are skewed to a particular set of users who choose to fill it in! SO is one of the few examples of a major developer site that runs on Windows, for example.


The JetBrains survey also shows a majority of developers working on Windows.


9 days late, wanted to point out that mac and linux have vim and emacs :p


Games, security, finance, and a mindboggling number of companies writing line-of-business software. "Dark matter developers" is a term I've heard used for some of the latter: https://www.hanselman.com/blog/dark-matter-developers-the-un...

I would generally say that native Windows development is in decline, but it's a long slow decline and it's starting from an incredibly high base.

"a rounding-error of zero" is so wildly incorrect that it is very clear you are in a bubble of sorts.


In game development, it's all almost exclusively Windows, and Mac if developing mobile games.


I think you may have quite a large blind spot here.

The games industry is primarily Windows based.


I work at Microsoft, and the majority of people I know use Windows for development. Macs are around, but they seem to be more popular with the managers than with the devs.

(It can be different in some teams - e.g. lots of Macs in VSCode - but company-wide that's an exception rather than a rule.)


Bad statistics alert!

If you use windows, you are VERY unlikely to use Linux and OSX. Conversely, almost EVERYONE who uses Linux or OSX will still have to use Windows. If even just half of the Linux users checked Windows, but rarely used it, then Windows would drop to 3rd place.

The overwhelming majority of servers are Linux (even Azure was announced to be majority Linux a few years ago). Despite this, only a fraction of "windows users" also used WSL. Trying to build UNIX software on Windows is fraught with difficulty. The fact that this number is so low either indicates a sampling bias for Stackoverflow or a lot of people checking the Windows box despite not using it a lot.


Why should half of the Linux users check Windows when asked "What is the primary operating system in which you work?"?


That looks like global statistics. Of course most developers in the developing world aren't using macOS. It'd be interesting to see the OS breakdown when selecting for just the developed world. I'd imagine the Windows percentage is much lower.

Also, looking at some strictly web development surveys, Windows is behind Mac/Linux: https://medium.com/codex/jetbrains-django-developer-survey-r...


Definitely. I created a popular dev tool, and more than half of the users are using Windows.

I'm also a front-end dev who has been using Windows for decades, and I've had like 0 problems using this OS.


lol this is a ridiculous take


I do professionally but I prefer Linux for coding at home, mainly because it needs less RAM, runs quicker and stuff just works.

If we move to .NET5 or higher I might use linux at work too.


Apparently windows support is planned.

This is still a new project — anyone interested in using a new, cutting-edge beta JavaScript runtime is certainly capable of using bun through WSL.

Your comment is unnecessarily antagonistic. It comes across as if you’re personally offended that bun wasn’t launched with native windows support.

Also, this made me laugh:

> …this project will never take off until it treats Windows as a first-class citizen.


>Just wait and see how this turns out in three years.

I think you seriously overestimate the number of developers still on Windows.


On a lark recently, I tried to get node setup on my Win10 gaming box so I could try out a basic workflow and see how I liked it. My normal environment is using MacOS and Linux.

I don't remember any specific "wtf" details, but generally it was kind of a non-starter. I got node installed, but then there were issues with using that runtime for my simple server demo. The whole attempt was frustrating. Tried installing the pseudo Linux shell thing for the command prompt and I gave up trying to use that pretty quickly.

Obviously I'm not a native Windows developer. This was just me experimenting. I'm not sure how Win devs actually work these days on non-Win applications in any productive capacity.

I can easily get stuff done on MacOS and Linux but Windows is just this multi-decade old black box of cruft.


> I'm not sure how Win devs actually work these days on non-Win applications in any productive capacity.

WSL. It's trivial to drop into Linux for development these days.

That said, I generally haven't had issues with Node on native Windows.


> probably most people don't use [WSL] for regular development environment

I know quite a few people who do, I actually wouldn't be surprised if their number was higher than pure Windows users for web dev.


Are you sure? From, let's say, the 100 web devs I've encountered over the past years, I remember 3 using Windows, and they all used the Linux environment for development.


Web development is predominantly macOS and Linux; node servers generally run in Linux containers. Your claim that this won't take off until Windows is at parity just doesn't ring true.


several folks in the threads here have posted references to the contrary. perhaps you have data which disagrees, or you may be operating on observations from your own circles.


Those numbers were all developers overall. In the Django developers survey, which is web only, the most used platform is Linux (42%), Mac 32%, Windows with WSL 17%, and straight up Windows is a paltry 12%. In other web frameworks the numbers are likely worse, since Django has a pretty good Windows experience.

https://medium.com/codex/jetbrains-django-developer-survey-r...


OK now in true JS fashion, somebody make a combo of the two

- Bun lacks Deno's best feature, which is the permission system / sandboxed-by-default operation

- Deno lacks Bun's best feature, which is backwards compatibility with npm


I have a soft spot for bun because I like zig so much.

But the key attraction of deno for me is being able to delete node_modules, package.json, package-lock.json etc etc. I don't want to bring them back.


First thing I did is to test bun on my prod build and it crashed spectacularly. It's nowhere near being ready for any serious use.


The article mentions Yarn, then proceeds to benchmark Bun against npm only. Not a fair comparison IMO :-)


I have seen that while Bun is fast for startup, it performs about the same as Node and Deno for longer-running tasks, simply because it just wraps JavaScriptCore, which performs similarly to Node and Deno's V8.


Not a js/frontend dev, but what I see is "47.12 percent of respondents reported to be using Node.js, while 42.62 percent were using React."

Don't know if that qualifies as having the "crown".


This thread reminds me of the comments on Slashdot when the iPod was announced. A single guy launches an amazing project after a year of solo work.

The response a month later: "No docs? Pfft. Useless."


I think it’s really unfair to Deno, as that's existed for many years and gets ignored, meanwhile a hobby project is being touted as “the node replacement”.


Deno is featured here frequently. Plus they had $21m in funding.

I wouldn't call it unfair. There's space for both.


Unfair? So what would be fair? Banning articles and discussions on technologies which may compete with Deno?


I’m waiting to see when it can run Node-RED (it currently can’t). Seems quite promising for embedded devices in general (again, not right now).


Are there any benchmarks between Bun and GraalJS?


The problem with Bun currently is the missing killer feature; there is no compelling reason to switch over. Or am I wrong?


I suspect it's going to be the FFI integration.

On my machine (M1 pro), Bun calling into a C library and running a no-op is 15x faster than Python calling a no-op function. V8 has never had stable bindings but Bun is changing the game with a TinyCC JIT-compiled FFI that yields a simple API.

    $ bun bench.js
    3.5331715910204884 nanoseconds per iteration


    $ python3 -m timeit -s 'def f(): pass' 'f()'
    5000000 loops, best of 5: 53.8 nsec per loop


The “what is bun” section explained almost nothing for me. Is there a better explanation?


Just bun it all to the ground.


Betteridge’s law response to headline: no.

Short answer: not any time soon.

Long answer: Maybe, in the long term. Node is really entrenched in orgs, and 99.999% compatibility is probably a must-have before they'd consider switching. Also, don’t underestimate the strength and persistence of Google’s V8 team. Year-over-year incremental performance improvements may prove to be enough to ensure Node’s dominance for another 10 years.


How has Zig worked out in Bun, compared to how Rust has worked out in Deno?


I thought deno was supposed to take node’s crown first?


Not until they remove URI-based imports.


Well I guess Deno is dead in the water then. I have a feeling we are going to have like 10 competing JS runtimes by the end of the year.


Oh there are way more than 10 implementations of JavaScript before we even start counting all their distributions (Bun being a distribution of JSC, etc).

https://notes.eatonphil.com/javascript-implementations.html


Of course, but as far as general purpose runtimes go, your options currently are pretty much Node, Deno, and QuickJS. Most of the other ones are for niche things like IoT or embedded systems.


That's not true. Take a look at the post! And look into these implementations and their users.

Or don't. But don't make assertions then without looking into the facts.


Why do you say that?


If Bun gets more popular, which it seems to be, it will probably draw away a lot of people who are currently using Deno.


i love how these folks are trying to blame node for create-react-app's speed issues


If a title contains a question, the answer usually is No.


Node never had a crown. It’s a dirty turd.


Betteridge's law of headlines. No.


another day, another javascript runtime environment


do we actually get many Javascript runtimes? Certainly fewer than JS frameworks.


Competition is a good thing. The javascript engines wouldn't be as fast as they are now if there was only one engine.


Sometimes it's not a good thing, when it keeps people using JavaScript.


we’ve done virtual dom (and better than virtual dom) UI frameworks to death so I guess this is next!


as far as I'm aware Bun is significantly faster than others while maintaining a lot of the compatibility


The problem with the JS ecosystem isn't exactly speed.


Are there really that many? Outside of browsers, I know of Node, Deno, and now Bun. Are there really that many serious attempts to replace Node?


The engine (the V8 analog) is not a "Zig engine"; it's JavaScriptCore. The wrapper is Zig, just as Node's is C/C++ and Deno's is Rust.


My bad, their page says

"Bun uses the JavaScriptCore engine, which tends to start and perform a little faster than more traditional choices like V8. Bun is written in Zig, a low-level programming language with manual memory management."

I conflated these two.


Bun is using the JavaScriptCore engine, which is the WebKit/Safari JavaScript engine. It is not written in Zig.


History is repeating once again. So let's face it: NodeJS is now the new Java. ;)


This title is bad, it reads as if “Will Bun” might be a person.


I've followed Bun since the start, and its author has done tremendous work!

However, I am a bit afraid of the project's rapid growth. I'd appreciate slower growth that leaves room to think twice about the project's direction. It would be terrible to add yet another tool's quirks to a world already crowded with them.


From the start of the project, at least from when I first saw it, it seemed to me that the author already has a long-term vision for the project, and the direction it should go.

There was a list of super ambitious goals, like replacing Node.js, NPM, Webpack, etc. It's been impressive to watch how the project is making such progress in achieving them.

So I think the rapid growth is a good sign that the project has community support and momentum. I'm looking forward to seeing what Bun becomes, and I'm maybe more excited about it than Deno.

