I’m slightly disappointed that the vision of leaving NodeJS and npm behind ended up failing.
The last year ended up being a bunch of concessions to make Deno more appealing to the masses, like disabling type checking on run by default.
But hey, that’s pragmatism for you, sometimes you just have to let go of ideals even if it hurts a bit.
I’m certainly looking forward to what happens next, even if more from the sidelines.
> But hey, that’s pragmatism for you, sometimes you just have to let go of ideals even if it hurts a bit.
But on the other hand, that's how we end up with a programming language landscape where every language is mostly the same, except for community conventions and very slightly different syntax.
I'd love it if more languages were strongly controlled, like Clojure and the like, where there is a unified vision that is well kept across time. Not that Clojure is perfect, but it's probably the best example of a language where the community is welcome to suggest things, but unless the person in control approves them, they won't make it into the core language and will/could instead be implemented as a library. This is in contrast to Rust, which seems to base language additions/changes on popularity in the community.
JS is JS. It's not going to stop being JS. Deno is a slightly different syntax for the same thing. If you want a different language, step 1 is to pick a different language.
JavaScript in 2022 is JavaScript in 2022, but it won't be the same as the JavaScript we'll have in 2030, nor is it the same JavaScript we used in 2010. That's because JavaScript is not driven by a single person with a unified vision; it's driven by a committee that implements things based on popularity, quite literally.
I don't want a different language, I want a multitude of different languages. And not a multitude of different Fortran/C-like languages.
I’ve tried Smalltalk, Common Lisp/Clojure, and read about Forth over the past few months. And I agree with you that there is more to explore than the standard set of features found in C-like languages (live programming, real macros, …)
I see this differently. Why would you expect "node" people to switch to "deno" just because it's better? I wouldn't expect that. I would expect that to happen only if the "deno ecosystem" is better than the "node ecosystem". And an ecosystem is not built in a day, or even a few years.
And this is a great move: help people switch from "node" to "deno" while they keep using their existing tools/libraries. Then, slowly, people can switch over to Deno standard/third-party modules.
It's a chicken-and-egg thing and reasonable people can disagree. And I happen to disagree with you. :)
I'm not familiar with Deno, but have skimmed a few posts about it here and on Reddit. In any case, my point of view here is general enough that it doesn't matter if it's literally about the Deno project or some other new language/runtime project.
When you say,
> And this is a great move: help people switch from "node" to "deno" while they keep using their existing tools/libraries. Then, slowly, people can switch over to Deno standard/third-party modules.
my main thought is that compromising whatever benefits Deno envisioned to "attract" Node developers actually gives Node developers LESS reason to switch.
I'm no psychologist, but I can't help but believe that part of the reason Rust got such a cult-like following (myself included) is because a bunch of us spent years writing C++ and then tried Rust and it wasn't smooth. Perhaps that's paradoxical, but for me, it really showed me how much safer and more robust my code could be with this new tool. I don't think that Rust would be as popular today if it had some kind of unsafely-call-C++-mode enabled by default.
It's a balance, obviously. If you want your language/runtime to be of the highest quality, you don't compromise much; if you want it to be popular, you make it easy to get into. Often those two are in tension.
Just my two cents as someone who's a bit more on the idealist/academic side of things and is super disappointed by the compromises in languages like Kotlin and TypeScript.
Meh, I feel like this is somewhat inevitable. People love talking about clean breaks and how NPM is evil and left-pad, but nobody wants to write their own packages. It's probably the biggest blocker for adoption for any language or tool. If there's no mature libraries for authentication on Deno, am I going to roll up my sleeves and reimplement JWTs, or am I going to sigh and switch to Node? For a lot of tools they're stuck with people who will do the former, and then finally, after enough libraries are built, they start getting the people who do the latter.
Because as much as there are loud voices on the internet decrying packages, most people are not so ideologically focused. They just want to get their code written. Packages help and so they use packages. Plus most of the loud voices neglect to offer any solution other than "packages are bad!!!"
Supporting TypeScript OOTB in any capacity was itself "appealing to the masses". Being able to use TS without the burden of setting it up was a big selling point that resulted in people giving Deno a try, particularly in the early days.
If you'll permit me to be a cynic: this is VC funding for you. Deno took on a ton of funding and they need results. The barrier of NPM incompatibility simply can't be allowed to get in the way of them gaining market share.
I tried Deno a few months ago, because I wanted to avoid the hassle of setting up a package.json, a yarn lockfile, adding dependencies, etc., just to run a TS script that generates k8s manifest YAMLs via https://github.com/cdk8s-team/cdk8s-plus
npm has a lot of issues, but the flak it gets rarely credits the value it delivers. Everyone's darling Python is still garbage for figuring out packaging and deployment compared to npm, even with the supposedly great new Poetry.
Decoupling running and type checking is quite popular nowadays, Vite does it as well. It can give a huge speed boost and allows you to ignore type errors while you're just messing around. Of course then you also need the discipline to eventually fix them, but there's probably a reason you went with TS instead of JS.
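As a minimal sketch of that decoupling (the file name is made up; this assumes the current Deno CLI, where `deno run` transpiles without type checking and `deno check` or `deno run --check` runs the checker):

```typescript
// add.ts
// `deno run add.ts` strips the annotations and executes immediately;
// `deno check add.ts` is the separate, slower type-checking step.
function add(a: number, b: number): number {
  return a + b;
}

// Because annotations are simply erased at runtime, a value smuggled
// past the types is concatenated, not added. (The cast keeps this
// file free of type errors; a plain `add("1", 2)` is what the
// checker would actually catch.)
const sneaky = add("1" as unknown as number, 2);

console.log(add(1, 2)); // 3
console.log(sneaky); // "12"
```

The point of the split is that the fast path (strip and run) never has to wait on the slow path (full type analysis).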
It is popular in newgen tooling, but I feel it's more a function of the slowness of `tsc` than anything else. I get why `tsc` is slow, and I have a lot of respect for the team and the constraints they work under, but I can't help feeling that if we got a faster type checker (potentially `stc` from the creator of `swc`, which also does this: https://github.com/dudykr/stc), this choice would be less popular. That said, I honestly don't know whether TypeScript's type system puts natural constraints on how fast it can be validated.
I feel like part of the issue is not so much that type checking is slow, but that at runtime type checking "doesn't matter". I think a lot of people would like to see the Stage 1 Type Annotations proposal [1] move forward, where JS would allow and ignore type annotations at runtime, and even "type stripping" for TS files would disappear.
At that point, it doesn't matter how fast a type checker is at runtime.
The only reason to bring back type checking at runtime would be if V8 et al ever also started using type hints as JIT hints and you want to triple check you are sending the right type hints and not accidentally stomping on your own runtime performance. (Which is intentionally not in that Stage 1 proposal mind you: no one is expecting runtime type hints in the near future.)
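For concreteness, a hypothetical file under that proposal (which is still Stage 1, so none of this is settled): an engine would parse the annotations below and treat them as inert, so the file could run as plain JavaScript with no build step.

```typescript
// Today this is TypeScript; under the Type Annotations proposal it
// would also be valid JavaScript, with `: number` and `: string`
// parsed but carrying no runtime meaning whatsoever.
function describe(count: number): string {
  return `${count} item${count === 1 ? "" : "s"}`;
}

console.log(describe(1)); // "1 item"
console.log(describe(3)); // "3 items"
```

Type checking would remain a purely offline, editor/CI concern, which is exactly why runtime checker speed stops mattering.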
Do they really have a choice? Deno is not particularly faster than Node and Bun outperforms both, so Deno has to find a place somewhere between the two. Ecosystem ultimately matters more than speed, thus Deno is making the smart move, as much as I would like to move entirely beyond Node.
*Bun outperforms both on some metrics, which Deno has committed to matching/beating, neither of which may be the actual deciding factor in success since performance isn't everything
> When I’m refactoring I need to be able to test that things are partly working as I go.
Personally speaking, isn't that exactly the value of having a type system in the first place?
If it still type checks after the refactor it ideally* is still working.
Unless of course some system boundaries changed or there is some dynamic component.
I suppose we both come from different philosophies here.
I write out the types for a program first, then the behavior follows through.
If I cannot properly determine the types I escape with dynamism.
You seem to write the system up dynamically and then determine the types afterwards; do I interpret that right?
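A rough TypeScript sketch of that types-first workflow (all names here are invented for illustration): shapes come first, the implementation follows from them, and looser types are the escape hatch only where a shape can't be pinned down.

```typescript
// Step 1: write the types; no behavior exists yet.
type User = { id: string; name: string };
type FindUser = (id: string) => User | undefined;

// Step 2: the behavior follows through from the signatures.
const users: User[] = [{ id: "1", name: "Ada" }];
const findUser: FindUser = (id) => users.find((u) => u.id === id);

// Step 3: where the type can't be properly determined, escape with
// dynamism, e.g. accept `unknown` and narrow at runtime.
function nameOf(value: unknown): string | undefined {
  if (typeof value === "object" && value !== null && "name" in value) {
    return String((value as { name: unknown }).name);
  }
  return undefined;
}

console.log(findUser("1")?.name); // "Ada"
console.log(nameOf(42)); // undefined
```

Under this workflow, a refactor that still type-checks is strong (if not perfect) evidence the behavior survived, which is the point being debated above.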
The current setup might be a better experience then, but I think the default matters here.
In my little bubble I've encountered more libraries with broken typing since type checking has been reduced, so I perceive it as a net-negative.
You end up having to explain that you need to take manual precautions to actually get type checking.
Of course this could just be due to the growing user base and the higher probability of hitting a "wrongly typed" library, so it remains to be seen how much impact it has.
I'm used to the --check flag by now :)
* How ideal depends on how expressive the type system is.