My personal view is that syntactic debates over imports (whether they live in an import/require/URL/package.json/whatever) are basically meaningless surface-level arguments. How "nice" it is to write imports is pretty worthless to me.
The structural and semantic differences between imports are a much more important discussion. Import syntax is the front end to the language's metaprogramming semantics. It's metaprogramming in the sense that your code and other code is the data, but what you're really programming is a task runner with specific instructions about how to find and satisfy the requirements to build and/or run the software.
Integrating package resolution into the language itself is a really important distinction for contemporary designs - passing that buck to external tools but simultaneously coupling them to the runtime is a mistake that I think we should learn from. Deno is a good step in that direction.
What happens when there is the next Codehaus-like shutdown, and so much source code just won't work? Or when a bad actor takes control of a domain that commonly hosted packages (perhaps through completely legitimate means, such as registration expiration), gets a completely legitimate SSL certificate for it, and responds to requests for packages with malicious code? I think the abstraction, and to some degree the centralization, of package management is generally a good thing.
URL-based imports aren't less secure. They just make an existing attack vector more obvious. Is NPM really keeping you safe? What happens if a package maintainer is compromised and the attacker adds malicious code?
The fact that URL-based imports make you uncomfortable is good. Let that discomfort guide you to adopt some extra security measures to protect yourself from third-party dependency exploits
I would argue that NPM and other centralized package managers have the ability to add security: if npm made 2FA a requirement (or publicized the status of a package maintainer having 2FA enabled like GitHub does, which is perhaps a security concern itself), there would be some assurance that a maintainer is not compromised.
If we are using URL-based imports, the scope of security assurances is much broader: an SSL certificate says nothing about whether the current owner of a domain is the same party who owned it back when the dependency was known to be safe. There is no single authority we can expect to take swift (or even slow) action to remove malicious packages from the myriad hosts that are accessible through an environment like Deno.
Every time you fetch a package from NPM you are accessing that code from a URL (at npmjs.com) and then caching that locally. Deno is just eliminating the middleman. If you still trust npmjs.com for source code delivery you could continue to do that.
What it isn't eliminating is the ability to define "local" caches. Just because you are importing everything from URLs doesn't mean they can't all be URLs that you directly control. You don't have to use CDN-like URLs in production; you can copy all the modules you need onto a server you control, under a URL scheme that you maintain.
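For example (the domain and path here are entirely made up, just to show that the specifier is an ordinary URL):

```ts
// main.ts — the import specifier is just a URL, so it can point at a host
// you run yourself instead of a public CDN. Domain, path, and module name
// below are placeholders.
import { camelCase } from "https://deps.mycompany.example/case/v4.1.1/mod.ts";

console.log(camelCase("hello world")); // "helloWorld"
```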
There will still possibly be good uses for caching automation in Deno (once people get bored with xcopy/robocopy/cp -r/rsync scripts), but it will likely feel like a return to bower-like tools rather than full-blown npm-like package managers. (ETA: Or proxy services like GitHub Artifacts that pull an upstream source for you and rehost it on controlled URLs with some additional security checks.)
A note for many: yarn v2 provides '0-config' fully cacheable dependencies (zips in LFS). This makes it possible to fully vet dependencies and enforce change approval and analysis in CI/CD.
They don't do a great job of advertising the changes; take a look under features/zero-installs.
The common practice is to still use yarn v1 as your global yarn; just do "yarn set version berry" inside your project to use v2. 0-config can be a breaking change, though I haven't had problems in a long time.
I wish there was something like Docker Hub's automated builds in the Node world, because the way NPM works right now, what comes from NPM is an unknown. The only thing you know is that if you download a specific version once, you'll always get that same version again, unless it's invalidated. Otherwise, whatever the package author wants to upload and include is what you get, and you can't know that what you're seeing in some Git commit is what's running in your application. I wish that were the state of the art.
THIS! I cannot believe that there is still no automatic hash validator between git and npm. It feels like npm should force a commit containing a hash for the current version on every publish, or something. How can we make this happen?
It would have to be some kind of integrated CI build system thing that builds the product in a container. Seems like they have no incentive to offer that given that they totally own JS packages.
Also: Nobody prevents you from using a package manager anyway. Just because you can use URLs in imports doesn't mean you have to. But it is very convenient that Deno downloads exactly the code that is imported. A package manager will always just throw everything at you. Some packages in Node.js try to fix this by splitting the project into submodules like @babel/core, @babel/env, ..., but that made installing dependencies worse. Just letting Deno figure out what is required is way more elegant IMO.
Not sure I understand, are you implying Deno does automatic tree-shaking on package imports? If not, how does "deno download exactly the code that is imported" and not just a whole package?
Also, from your link:
"In Node this is done by checking node_modules into source control. In Deno this is done by pointing $DENO_DIR to some project-local directory at runtime, and similarly checking that into source control:"
I disagree with this. In my opinion, this is done by using a pull-through cache that caches every package developers request and so inherently has a cache of the packages that will go to production.
Is it possible to do this in deno today? I don't really get that sense.
> Not sure I understand, are you implying Deno does automatic tree-shaking on package imports? If not, how does "deno download exactly the code that is imported" and not just a whole package?
npm install copies an entire zip/tarball package into your node_modules. Deno uses ES2015+ module syntax and it's only going to grab the JS files that are imported (or imported by imports). So it "naturally" tree-shakes at the file level, and doesn't have a concept of a "package" in the same way. It's not directly going to tree-shake inside single-file boundaries the way a major bundler/packager might (though the V8 JIT should still sort of indirectly compensate).
So yeah, if the package is published as just a single (M)JS file it will still get entirely downloaded by Deno, but if the modules are split across multiple files, Deno will only download the ones that are directly or indirectly imported (and will have no idea of the remainder of the files in the "package").
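A small sketch of what that looks like (the host and file layout below are invented for illustration):

```ts
// Hypothetical library published as plain ES modules at https://libs.example/fmt/
//
//   mod.ts      re-exports everything
//   colors.ts   terminal colors
//   printf.ts   sprintf-style formatting
//
// Importing one file only pulls that file plus whatever *it* imports:
import { red } from "https://libs.example/fmt/colors.ts";

// Deno fetches colors.ts (and its own imports, if any); printf.ts is never
// downloaded, because nothing in the import graph references it.
console.log(red("error: something went wrong"));
```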
> I disagree with this. In my opinion, this is done by using a pull-through cache that caches every package developers request and so inherently has a cache of the packages that will go to production.
> Is it possible to do this in deno today? I don't really get that sense.
Yes, because URLs are just URLs, you could always have a proxy service running at a URL that knows how to request the packages from upstream. https://jsproxy.mydomain.example.com/lodash/v1.1/module.js could be a simple caching proxy that knows how to get lodash from upstream if it doesn't have the requested version cached (or sends a 404 error if it isn't allowed to cache a specific version, or whatever).
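For what it's worth, a rough sketch of such a pull-through cache as a Deno script (this assumes a recent Deno with `Deno.serve`; the upstream URL, in-memory cache, and lack of allow-lists/integrity checks are all simplifications):

```ts
// pull_through_cache.ts — a sketch, not production code.
const UPSTREAM = "https://deno.land/x"; // wherever you choose to pull from
const cache = new Map<string, { body: Uint8Array; type: string }>();

Deno.serve(async (req) => {
  const { pathname } = new URL(req.url);
  const hit = cache.get(pathname);
  if (hit) {
    return new Response(hit.body, { headers: { "content-type": hit.type } });
  }
  const upstream = await fetch(`${UPSTREAM}${pathname}`);
  if (!upstream.ok) return new Response("not found", { status: 404 });
  const body = new Uint8Array(await upstream.arrayBuffer());
  const type = upstream.headers.get("content-type") ?? "application/typescript";
  cache.set(pathname, { body, type }); // a real proxy would persist this to disk
  return new Response(body, { headers: { "content-type": type } });
});
```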
Thanks for the package / module / import explanation.
Re: URL proxying, this all feels ad-hoc and overly decentralized. I agree with your assessment that "rolling it yourself" looks simple enough at first glance, but after so much success with package managers and associated tooling, I can't shake the doubt that a new approach will just reskin the same problems. I see they've done some reasonably smart decision-making along the way, though, so let's hope they walked down this logic tree and are happy with the likely paths.
> Re: URL proxying, this all feels ad-hoc and overly decentralized
Well, I started from the "full paranoia" mode where people would want it as ad hoc and decentralized as possible. It's very easy to imagine that there would still be trusted/trustable third parties creating central proxies for stuff like this. Just as unpkg today provides one way to pull individual JS files from npm, you could imagine unpkg as an option for Deno. Similarly, there are lots of artifact repositories with upstream pulling options already in the wild, such as GitHub Artifacts, Azure Artifacts, or jFrog Artifactory, to name three that come to mind (among many). It's not hard to imagine those artifact repositories also supporting Deno-style URLs (in a similar manner to what unpkg does with npm) as Deno becomes more popular.
Ultimately it is likely to be a huge spectrum, from people who want a JS proxy they entirely control/run/manage themselves, to those who want one they trust from a third party, to those who are fine with whatever CDNs their libraries suggest. That's already a spectrum that exists in the npm world: people have private npm servers, people use npm artifact libraries like GitHub Artifacts, people use unpkg directly to pull in JS files from npm, people still go to Bootstrap or jQuery in 2021 and just copy and paste whatever CDN is mentioned in the "Getting Started" section of the appropriate docs. That spectrum is still likely to exist for Deno libraries. While that might make it harder as a user/developer to choose which part of the spectrum is best for your own projects, Deno having little to no "out of the box" opinion on where your project falls (as opposed to Node defaulting to npm these days, the two seemingly ever more inseparable) isn't necessarily a bad thing for the ecosystem's health as a whole.
>"deno download exactly the code that is imported" and not just a whole package?
In a fairly simple and elegant manner. You specify a URL to the specific file FOO you want to import, and FOO gets downloaded. Then Deno looks at FOO to see what files FOO depends on, downloads those, and so on and so forth...
That's very different from how NPM works where you have to download the entire package, including parts you may never use, along with every dependency the package depends on, even if you never end up using those dependencies either.
NPM's approach, which frankly worked well given the circumstances, has the downside that end-users get little benefit from code being written in a modular fashion, so there's no advantage to breaking a package up into multiple files versus distributing a single large minified file.
Deno's approach promotes breaking up packages into multiple files so that an end-user only downloads the files and dependencies that are actually needed.
Thanks for the explanation. I would be a bit scared to have a recursive solver install my dependencies, but it's ultimately not that different from packages I suppose.
I will be interested to see how the tooling & community resulting from this decision look. Hopefully good.
Unfortunately this won't help those who add dependencies based on shared snippets, without the context of a project. Or the innocent mistake (or choice?) of not checking the lock file into version control. But yes, good point: existing projects that have checked in the integrity file will be safe against future malicious modifications of the resource.
There is some degree of assurance that this dependency won't last long in the Maven central repo, or any other user-configured repository, if it contains malicious code. Obviously it is not foolproof and incidents happen, but without a centralized authority for package management, there is much less assurance that a package is not malicious.
If you can magically copy and paste in code that also includes a dependency, then you might have just screwed yourself if you didn't read the code of said dep (or even if you did, maybe you missed something). If it just looks like a comment, then maybe your team missed it in review. It's harder to reason about deps that live in deep modules.
Yes, I think recreating an npm-like central place for your project dependencies seems almost required if you want to avoid depending on a different version of a lib in each file of your project.
I honestly cannot understand the benefit of this scheme. I would have preferred they fix something npm is lacking, like adding the ability to sign packages.
Those still are all trivial in the "too little too late" sense.
URL-based imports are already something you can do in package.json if you really thought that was great. Top-level await really is trivial. The permissions system does very little for you, since it grants process-level permissions while what I'm scared of is transitive-dep attacks. TypeScript in Node is good, and Deno has to catch up in tooling.
If Deno was where it is today but like 8 years ago, it would have made a splash. Now it just looks like someone made a few creature comforts to Node.
Thanks, I agree 100%. Perhaps I'm getting unnecessarily cocky in my old age, but this really does feel like other cases I've seen in the past where people (including the original founder) want "a clean slate", because of course there are warts and lessons learned from the original implementation, but the cost of workarounds for those warts is actually lower than the cost of switching to an entirely new competing technology for most people. Going back to the original list:
- Typescript as a first class citizen: Agreed, it's not hard to get TS set up in Node, and it's a "one and done" cost.
- An actual `window` global with familiar browser APIs: I've never needed or wanted this, though I could see how the server-side rendering crowd could benefit. But I still have to believe there would by necessity be enough differences from the browser's window that implementing SSR is still non-trivial.
- Sandboxing w/ permissions: I honestly think this is going to be useless. As you stated, if you're really running untrusted code, better to rely on process permissions. This actually reminds me of the overly complicated security/permissions architecture in Java's SecurityManager, which I rarely if ever actually saw being used.
- URL-based imports (no need for NPM): NPM certainly has its warts, but I think a lot of Deno supporters woefully underestimate how much most server-side JS developers love having a single primary global package repo.
- Bundling into self-contained binaries: Again, this is nice, but also gets a "Meh, I don't really care" from me.
- Things like top-level-await which Node.js still treats as experimental: Once you learn how to do an IIFE who really cares?
- Better-designed APIs than the Node standard lib (esp. when it comes to promises instead of callbacks): Again, a minor nice-to-have, especially with util.promisify.
Top-level await isn't really trivial, either to implement or in practice. But as a feature it seems obvious, until you step into all the edge cases it can create.
I write a lot of short JS files that aren't part of a big cohesive backend, and little things like top-level await make the code somewhat easier to read. Yes, you can just do an async IIFE, but that's just another thing for the eye to parse and feels quite vestigial.
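To make that concrete, here's the same trivial script both ways (the URL is a placeholder):

```ts
// The async-IIFE version, i.e. what you write when top-level await isn't available:
//
//   (async () => {
//     const res = await fetch("https://example.com/config.json");
//     console.log(await res.json());
//   })().catch(console.error);
//
// The same script with top-level await in an ES module:
const res = await fetch("https://example.com/config.json");
console.log(await res.json());
```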
EDIT: And since I can't respond to your other comment, I'll ask this here; what do you consider great and important about Deno that doesn't fall under the bullet-points I listed? I'm simply curious.
I think Deno is really interesting in terms of how it is parsed and handled, and I have a lot of hope they can continue to do cool things with it. I would like to see them keep taking strong opinions on things that improve the language, and hope to see it evolve beyond JavaScript. But thus far it has been: stealing code from Node.js and taking strong opinions on things that make the overall DX worse. On its current path, Deno's strength will be as a scripting language.
Edit: just realized I gave no example. An example would be a dep that has gone missing and there is no backup for it. Deno sort of handles this, but all we have actually added is no centralized way to describe dependencies. So if you dynamically import a package via its URL, you also need to write some code to ensure it even exists. How is that better?
Can't we have archives in case any packages fail to resolve? Archive.org might even work in some circumstances. (not saying that we should use that, but that it would be trivial to have backups in a repository that we don't always rely on for all dependencies)
It's cute for scripts. Otherwise there's the trivial top-level wrapper runProgram.catch(console.error) entry point. It's just not a detail that does much to sell Deno.
Though I admit it's not that fun for me to come in here to poopoo Deno.
I agree with the spirit of what you're saying, but there are billions of lines of JavaScript code that can't "just go away". Being able to use e.g. 'window.crypto' without worrying about whether I'm in Node, Deno, some other runtime, or a browser is great news for reusing huge amounts of code.
I'm in favour of compatibility and against a global named 'window' in a non-browser JS implementation, and I don't understand why the former necessitates the latter. Since window is the global object, why not just refer to crypto? Then you don't care what the global object is named.
It's relevant here that ECMAScript specifies `globalThis` as the global object (the one `this` points to outside of functions in non-strict mode). All JS environments have it, and simply `globalThis === window` in the browser or `globalThis === global` in Node.
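A small sketch of what that buys you in practice (the Web Crypto check is just an example; nothing here depends on which runtime you're in):

```ts
// The same code runs in a browser, Node, or Deno without caring whether the
// global object is also exposed as `window` or `global`.
const cryptoImpl = (globalThis as any).crypto;

if (cryptoImpl?.getRandomValues) {
  const bytes = cryptoImpl.getRandomValues(new Uint8Array(8));
  console.log(Array.from(bytes, (b) => b.toString(16).padStart(2, "0")).join(""));
}
```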
Well, okay, but it's pretty much here to stay, and it isn't enough of a drawback to be worth breaking any cross-compatible code that currently exists.
If there is going to be a global object, then it might as well be the same one used in the browser, which everyone who writes JS is familiar with.
- AFAIK it still compiles down to normal JS, and the TS team doesn't work on Deno (hope I am wrong), so it's not really first-class in the way most would think
- this doesn't make sense; it's bad API design, and shoehorning in a completely different paradigm is not a good idea. It should have moved somewhere less familiar and more oriented to the environment (I know that sounds weird, but to give a quick answer: there are better solutions and this ain't a browser)
- not bad, but it seems odd to me as a language feature in some ways; there are plenty of ways to achieve this (macOS will even do this to your node scripts by itself, so why do we need more of it?)
- this is subject to the same problems NPM could have, but I guess it is easier? Now you have to lexically parse all files to determine the imports, and you also have to remember where you keep your 3rd-party packages without a centralized way to know it (edit: this seems to harm the previous point)
- bundling isn't that interesting in this context, as it just bundles the whole runtime with the package, which is terrible for package size and doesn't net a lot of benefit, since the two are now tied together: if there is a security flaw you must now fix the entire binary and hope it still compiles. (edit: technically Swift did this for a long time too... but they were also reasonably sure they could ship their runtime with the OS itself once the ABI was stable; I am not sure if Deno has a path for this or if we just get fat binaries forever)
- top-level await is a weird one to me; there are valid reasons it's not in Node currently. But yeah, no one likes having to write janky init code to work around this
Final edit: I have a lot of opinions on this and would love to learn more about why Deno is great. From what I can tell, it's just a faster flavor of JS which, IMO, is great. But the points drawn from GP are just bizarre to me.
- I've had the TypeScript compiler throw WEIRD errors at me depending on configuration; a zero-configuration, no-brainer environment is a godsend
- it makes a ton of sense if you don't want to maintain two different versions of the same library. With WASM there is zero need for Node native modules anymore, so there is no need for platform-specific idiosyncrasies that go beyond Web APIs. Is the fact that it's called "window" a favourite of mine? Certainly not, but when you're trying to get a library running that you really need, that was originally written for the browser or vice versa, you don't care what the thing is called, as long as it's backwards compatible.
- defense in depth: multiple sandboxes are better than one
- this ties into the window point: it's a lot easier to make code cross-platform compatible if you only have to reference publicly accessible HTTP endpoints
- maybe that's not interesting for you, but I've had to deliver command line tools as binaries, and I'm super duper happy that I could do so in JS. The security I gain from controlling the runtime version is also much better than what I'd get from that being the user's responsibility; besides, not knowing exactly which runtime your code is going to end up on is also a big source of security vulnerabilities
- TS also lets anyone do whatever they want, so not a great thing to support; I have always said TS would be better if people couldn't configure it without a PR to TS itself
- we have the chance to define a new and better API, so we should do it; you are already adopting something different
- I agree, but it's not a selling point; these things are probably in containers already, and my OS should be doing most of the work
- window: just no, call it global if you must. Perpetuating ideas like this isn't good for anyone but the lazy.
- I think the command-line aspect of shipping a whole binary is cool for sure, but let's not conflate this with a "compiled language"; it's not. You are shipping a runtime and a scripting language together.
Relatively speaking, the window vs. global question should be a non-issue. I believe that every JS runtime except IE supports globalThis, such that `globalThis.window === window` in the browser and `globalThis.global === global` in Node.
Deno's difference from Node is the choice to implement the web APIs instead of the Node.js stdlib APIs.
While I also think URL-based imports are a weird idea, that might just stem from the fact that I haven't used them much; they might be wonderful, who knows.
But what I’d like to question is why the idea of parsing everything is considered bad. Semver itself, while miles ahead of what came before, is still just lies we tell the compiler.
You can never actually rely on a minor fix not being a breaking change, and the like.
Rich Hickey had a great talk on the subject of imports and breakage, and the conclusion I agree with was to actually load and check every function in all the libraries you use, so that if there is an incompatible change you won't be left in the dark.
I’m glad people are trying to experiment here, as package management is far from a solved problem and issues with it cause sweat, tears and blood to most devs during their careers.
I've used imports in this manner before, and the issue with having no centralized place where packages are defined has two sides:
1. People will import packages willy-nilly without thinking about what they are doing, and it becomes harder to understand WHAT they imported (the why becomes clearer, IMO). I am aware that is very much JS culture today, but I also believe that to be harmful.
2. Having to parse all files to find deps takes time. Obviously not a ton of time, but it takes time, and it simply doesn't scale appropriately.
Working in finance - I think personally that it is really important to make changing the dependency chain something everyone should be VERY aware of.
I was just looking for git tooling and found a few Node-based projects with npm-install instructions. I thought Deno was going to eat these things up for teams. We don't want C++ devs to need a Node toolchain for one tool.
This doesn't seem to add much besides saving you an install of `tsc` or `ts-node`, while taking away the choice of which TypeScript compiler version you use.
- An actual `window` global with familiar browser APIs
Node.js has a `global` global object and the only API I would understand having in common with the `window` object is the `fetch()` API.
- Sandboxing w/ permissions
Sandboxing is so basic that any large project will have to enable all permissions.
- URL-based imports (no need for NPM)
I would consider this a disadvantage.
- Bundling into self-contained binaries
Again, I would say that this is rarely useful in a world where a lot of operations use container technology.
- Things like top-level-await which Node.js still treats as experimental.
This is trivially solved by anonymous self-executing functions
- Better-designed APIs than the Node standard lib (esp. when it comes to promises instead of callbacks)
I think that this is the strongest advantage, however I would argue that this is not a reason to start a completely new backend platform.
Also, I think that it might be a disadvantage in some high performance scenarios because Promises are much, much slower than callbacks currently.
> This doesn't seem to add much besides saving you an install of `tsc` or `ts-node`, while taking away the choice of which TypeScript compiler version you use.
Depends on your point of view. With TypeScript being built in, you don't have to think about using tsc or whatever version of TypeScript you have. It's just what version of Deno you use. If someone doesn't like that, then they still can have the option of using TypeScript by itself.
> Node.js has a `global` global object and the only API I would understand having in common with the `window` object is the `fetch()` API.
It also supports `addEventListener`, which is commonly used by browser code on the window object.
Just the existence of something defined as `window` makes more sense than a `global` which never existed in browsers in the first place.
> Sandboxing is so basic that any large project will have to enable all permissions.
That's pretty dismissive. Why should an app that doesn't interact with the file system be allowed to write or even read from it? I don't know how this feature can be considered a drawback. Don't like it? Don't use it. I don't see how it detracts objectively from Deno.
> I would consider this a disadvantage.
Then you can still use NPM. Others of us get the option to just import packages from a URL instead of publishing it to a central repository.
> Again, I would say that this is rarely useful in a world where a lot of operations use container technology.
Why? Building Docker images requires extra software, Linux images, time spent running apt-get or apk, time spent downloading and installing your runtime of choice, and so forth. Having Deno build a binary can give you a bit of a shortcut in that you have one tool for running and bundling code, and you don't need to deal with as many OS-level nuances to do so. Docker and k8s are there for anyone who needs something beyond that.
> This is trivially solved by anonymous self-executing functions
That's your opinion. Just promise me you don't go on to say that JS is a bad language, because people keep saying that yet are opposed to reducing complexity they consider "trivial". If using IIFE for the mere purpose of accessing syntax makes more sense to you than making `await` syntax available, then I really don't know what to tell you. What exactly is the argument for not implementing this feature besides "all you have to do is type some extra characters", to loosely paraphrase you.
> I think that this is the strongest advantage, however I would argue that this is not a reason to start a completely new backend platform. Also, I think that it might be a disadvantage in some high performance scenarios because Promises are much, much slower than callbacks currently.
I honestly have to wonder if you are joking. This is exactly why people invent new backends, new libraries, and new languages.
My only response to your point about Promises is that perhaps one shouldn't be using JavaScript if Promises are that much of a bottleneck. What you're saying is totally valid, though.
On the sandbox part, you can use workers to offload risky code. There is a cost to doing it, but it can be minimal depending on what you are using it for.
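A rough sketch of what I mean, assuming Deno's per-worker permission options (still behind an unstable flag as of this thread); `untrusted.ts` and the message shape are made up:

```ts
// main.ts — offload risky code into a module worker with no file or network access.
const worker = new Worker(new URL("./untrusted.ts", import.meta.url).href, {
  type: "module",
  // Deno-specific option: deny the worker everything the main process might have.
  deno: { permissions: { read: false, write: false, net: false } },
});

worker.onmessage = (e) => console.log("result from sandboxed worker:", e.data);
worker.postMessage({ task: "parse", payload: "..." });
```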
> That's pretty dismissive. Why should an app that doesn't interact with the file system be allowed to write or even read from it? I don't know how this feature can be considered a drawback. Don't like it? Don't use it. I don't see how it detracts objectively from Deno.
Because any app of medium size and above will require access to all permissions. Also, Deno had some obvious security vulnerabilities, around symbolic links for example, which really detracts from the supposed security goal.
> Why? Building Docker images requires extra software, Linux images, time spent running apt-get or apk, time spent downloading and installing your runtime of choice, and so forth. Having Deno build a binary can give you a bit of a shortcut in that you have one tool for running and bundling code, and you don't need to deal with as many OS-level nuances to do so. Docker and k8s are there for anyone who needs something beyond that.
But you are going to need some kind of operating system image anyway due to other tools that will need to live with your app like log shipping, load balancers, DNS caches, firewalls, daemons, etc.
So in the end you will need to describe this somewhere anyway, so why not also describe the dependencies of your apps at the same time?
> If using IIFE for the mere purpose of accessing syntax makes more sense to you than making `await` syntax available, then I really don't know what to tell you.
If using IIFE is so heavy that a new backend platform needs to be built I don't know what to tell you.
In the apps I see, there is exactly one top level IIFE that is needed in the whole application.
> This is exactly why people invent new backends, new libraries, and new languages.
New libraries yes, new languages no. util.promisify() already makes 90% of the cases work painlessly, and promise wrappers for existing core libraries already exist on top of that. Since core is slowly moving to promises anyway, I fail to see how this advantage will remain one in the future.
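For instance, a minimal sketch of the promisify route in plain Node (the file being read is arbitrary):

```ts
import { promisify } from "util";
import { readFile } from "fs";

// Wrap the callback-style API once, then use promises everywhere after that.
const readFileAsync = promisify(readFile);

readFileAsync("package.json", "utf8")
  .then((contents) => console.log(JSON.parse(contents.toString()).name))
  .catch(console.error);
```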
> My only response to your point about Promises is that perhaps one shouldn't be using JavaScript if Promises are that much of a bottleneck.
Yup, that's absolutely true. I would say that there is always an advantage in having leeway in a programming language between the convenient option and the fast option so that when something becomes a bottleneck you have easier options than porting it to another language. But of course this might not be the most common case.
> due to other tools that will need to live with your app like log shipping, load balancers, DNS caches, firewalls, daemons, etc.
some apps are just CLI tools.
Top-level await helps with the rigidity of the ES module system[1]; I believe it can also be used with dynamic imports, giving ES modules and CommonJS similar expressivity.
- Typescript as a first class citizen
- An actual `window` global with familiar browser APIs
- Sandboxing w/ permissions
- URL-based imports (no need for NPM)
- Bundling into self-contained binaries
- Things like top-level-await which Node.js still treats as experimental.
- Better-designed APIs than the Node standard lib (esp. when it comes to promises instead of callbacks)
To me, those aren't just minor details. This has the potential to create a new epoch in server-side JavaScript.