Think of it as Node.js + TypeScript, but you don't have to think about configuring the TypeScript compiler, linter, and formatter, as everything is bundled in the `deno` binary.
Don't worry about syncing that formatting/linting/import-alias configuration with VSCode; all you need is the Deno plugin and you get all the benefits of working with TS in VSCode.
Packages are obtained (and heavily cached) from any URL instead of relying on a centralized repository. Obtain your dependencies however you want, whether it's Deno's proxy, directly from raw.githubusercontent.com, your own HTTP server, or anything else accessible through a URL.
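For example, a dependency is just a URL (the std module and file names below are picked arbitrarily); the first run fetches it, later runs hit the cache:

// cli.ts — the dependency is just a URL; Deno fetches it once and caches it
import { parse } from "https://deno.land/std@0.120.0/flags/mod.ts";

const args = parse(Deno.args);
console.log(args);

// deno run cli.ts --verbose --out=dist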
In the end, the permission system is the least interesting part of the project imho. It's useful, because if you're writing a CLI that just receives stdin, processes it, and prints to stdout, you can block all disk and network access, but apart from that it's really limited because of the nature of JS itself. (Maybe for the next trendy language we could think about the object-capability model before it's too late. https://en.wikipedia.org/wiki/Object-capability_model)
The thing I value the most is consistency and having a fully working development environment out of the box to be productive.
Nice to have TS + linter + formatter together; it took me an hour or two to set up with my own Vim (using Vite/Volar), so it saves me some time.
What I'd really like from a new Node.js (e.g. Deno) is actually its module system. I don't like that `npm i` in Node.js pulls hundreds of modules; a standard library that contains most commonly needed modules is the key. If Deno does not do that, I probably will never try it (as I can have the bundle of tooling done myself in an hour or two; Vite/Volar now makes this even simpler).
In short, for security and stability reasons I will prefer a gold-quality set of modules or APIs (e.g. glibc) to the 100% flexibility of "you pull whatever you want freely, even over an HTTP URL".
> I don't like that `npm i` in Node.js pulls hundreds of modules
Because modules/packages routinely have dependencies. And their dependencies have dependencies. And...
Deno changes nothing in that regard with one single exception:
> a standard library that contains most commonly needed modules is the key
^ This is the bane of JavaScript, yes. But this doesn't mean that having a standard library somehow prevents modules from having multiple dependencies and subdependencies.
> "you pull whatever you want freely now even over http URL", for security and stability reasons.
- If you pull your deps from a random URL and that URL goes away, how do you solve that?
- If your deps pull other subdeps from a random URL and that URL goes away, how do you solve that?
- For security, how do you vet what your dependencies keep on pulling from random URLs?
For node, the answer is: run your own registry, and don't load anything from outside. That's how many companies operate. How can this be solved with Deno?
The Deno API, which comes bundled with Deno and is present globally, always, without importing: https://doc.deno.land/deno/stable. It includes basic stuff like stdin, stdout, stderr, file management, etc., and standard Web APIs like the Fetch API, WebAssembly interoperability, localStorage, FormData, TextEncoder/TextDecoder, etc.
The Deno std lib, an official library that provides what you'd expect from a standard library, built on the Deno API: https://deno.land/std@0.120.0. You need to import it, because it's no different from any other module in the wild. MIME types, more encoding options, extended file system management, etc.
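To make the split concrete, a small sketch mixing the two (run with --allow-write; file paths are made up):

// The Deno namespace is always available; std has to be imported like any other remote module.
import { ensureDir } from "https://deno.land/std@0.120.0/fs/mod.ts";

await ensureDir("./logs");                         // std lib helper
await Deno.writeTextFile("./logs/app.log", "hi");  // built-in Deno API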
I had been watching some issues around this, but lost track, so I'm very excited to see it is available now! This makes Deno _very appealing_ for a wide range of tasks where FFI is a small but non-negotiable necessity (places where I would use Python's ctypes, for example tooling around C libraries; such tooling becomes much more complicated if another toolchain and compilation step is required before the lib can be called from a script).
1. You get pretty much 80% of what you'll need from the std libraries for a mid-size 'normal' application (similar to glibc, libstdc++, Go's stdlib, Python's stdlib).
2. You get a smaller stdlib than usual (Rust's stdlib); for the rest you use Cargo, and good luck with that.
3. You grab whatever you need (Node.js: you have no idea what's in the 500 modules npm just installed).
C and C++ also have light stdlibs from my point of view, just like Rust.
Rust has collections, strings and algorithms manipulating these. It supports synchronous IO (files and network) and can work with threads/processes. It lacks async IO, higher level network protocols (e.g. HTTP and TLS), regex, advanced unicode support, serialization, UUIDs, GUI, linear algebra/vectors, logging, encoding, time, randomness, command line parsing, cryptography, compression.
C++ has reasonable randomness and time support, but apart from that Rust isn't far behind.
That built-in linter leaves a lot to be desired. If you're a big fan of being told how your code should be formatted, then it's a good fit. However, if you're the slightest bit opinionated, you still have to set up ESLint and jump through those hoops.
Right. Personally, I don't think any opinion is more correct than any other in regards to code formatting, so, consistently and collectively choosing one and going forward with it is the way.
Deno’s permission system is broken, you shouldn’t rely on it. Deno developers consistently ignore security issues, and high-priority bugs take months to fix.
API-based access control can’t possibly work because it’s nearly impossible to predict the effect of any single permission. For example, “permission to run specific command” makes no sense without checking the integrity of the binary, controlling the environment for LD_PRELOAD-like hacks and evaluating the code of this command for possible escape hatches. If you want to isolate a program, you need to do it on the OS level.
I'm not overly concerned that allow-run leads to possible elevation, I'm more interested if:
> deno run --allow-read=./assets
works as intended, preventing most execution of local code and preventing writes to disk. I think it's a useful real-world use case that is complicated to replicate with Node.js.
That said, I think one should still be wary of running random code, but at least Deno makes it a little easier for honest authors to adhere to the principle of least privilege?
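A minimal sketch of that use case (file names made up): the script can read under ./assets and nothing else.

// summarize.ts — run with: deno run --allow-read=./assets summarize.ts
const text = await Deno.readTextFile("./assets/input.txt"); // allowed: inside ./assets
console.log(text.length);

// Denied with the flags above (a PermissionDenied error or an interactive prompt, depending on the Deno version):
// await Deno.writeTextFile("./out.txt", "x");
// await fetch("https://example.com");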
So, if I understand it correctly, the security issues are at the architectural level and not simply bugs?
Is there an alternative tool you can suggest, to allow us to securely run arbitrary JS? I was looking at Apple's JavaScriptCore to run JS, and if it happens that I need any level of access to the system (e.g. files), simply handle that in Swift and pass the file to the JS. Would that be a secure approach?
Yep, trying to restrict access to a system on the API level instead of the OS level will inevitably lead to problems like these (Deno is doing worse than it could, though).
If you want to isolate your program, you should use an OS-level sandbox like bubblewrap or a lightweight VM like Firecracker. I’m not familiar with Apple’s JavaScriptCore, but if it doesn’t provide any access to the system (and instead relies on passing arguments from Swift code), it might also be a viable approach.
Yes, my understanding is that it's the pure language interpreter without any filesystem or browser functionality. You need to create an interface in Swift/Objective-C or C to put data into and get data out of the execution context.
Re: the first linked issue, does it only affect scripts that require the --allow-run permission in some form? Are the other permission types (--allow-read/write/net/etc.) also affected by this or similar issues somehow?
The 2nd issue does seem concerning to have taken so long to resolve.
Yes, it exploits `--allow-run=whatever` + `--allow-write=whatever` to execute arbitrary code outside of the sandbox.
The problem with the Deno security model is that it’s hard to predict how granting any specific permission would affect overall security. For example, it may seem to be kinda reasonable for an application to ask for `--allow-write=~/.config` to create config directories & files, but it’s probably exploitable to escape the sandbox. Is `--allow-env` + `--allow-write=whatever` dangerous? I don’t know. If Deno runtime spawns a subprocess at some point, it could be used to execute arbitrary code via `LD_PRELOAD`. Is there a guarantee that Deno runtime will never spawn subprocesses? There is no way to know.
Lots of weasel words here and you don't answer directly.
It still seems to me that a Deno script running with only the necessary permissions puts up a huge hurdle compared to running a Node script, which by default has all permissions.
Edit: Also, I'm seriously fed up with the tech community cr#pping on everything good because it isn't perfect.
Today with Node, any random package can download any DLL as long as it is novel enough to pass antivirus (hint: last I checked, a 17-year-old could make a trojan that flew straight past, no questions asked).
Even Node is usable with the right precautions.
But trying to get people away from thinking about Deno is not ok.
Painting any criticism of deno as simply the work of haters is much more a tactic of stopping thought.
Whether or not ideas and discussion are negative or positive is not relevant at all. When they're presented in some detail and in good faith, we should tackle them in turn. This encourages thought, discussion, and understanding among all.
GP's comment is in a thread that starts with "Deno’s permission system is broken, you shouldn’t rely on it".
Not relying on Deno's permission system will, in practice, mean just allowing everything or using Node instead of Deno. I can't for the love of god understand why that's better than using a permission system that provides more protection than just about any other commonly used backend dev platform today. Nobody at Deno is suggesting that their permission system solves all your security risks.
I bet "just allow everything" is not what the top poster intended, but that's the takeaway. How many Node deployments do you know that use OS-level protections to e.g. disallow Node from spawning child processes?
Because illusory protection is worse than no protection. If it doesn't actually provide any protection against malicious code in practice, then the only thing it can give you is a false sense of security.
I did answer directly: this exploit requires run permission with any argument and write permission to any directory. It allows malicious script to escape sandbox and execute arbitrary code outside of it.
> this exploit requires run permission with any argument and write permission to any directory. It allows malicious script to escape sandbox and execute arbitrary code outside of it.
I suppose you wrote something wrong here and I'm interested in knowing what.
Because as it stands now I read it falls down to: "If you open the permission system up extremely wide you can get exploited."
Alternatively, after thinking for a couple of minutes I can read it as "if you simultaneously allow run permission with anything and write permission with anything".
In the last case it is slightly more problematic, but if one allows a script to execute anything, that is itself a huge red flag.
... and on Node this red flag is always flying by default.
So, I have a highly upvoted answer below but I'm going to humble myself a bit (smart anyways when I'm wrong and definitely better than having others do it).
I read through your bug reports now.
These are sound and very very useful.
I must admit I pattern matched on your language and answered based on that below and therefore my answer even if it is maybe somewhat(?) correct is extremely wrong in tone and what it implies.
Sorry.
I still think that it would be better if you were somewhat more specific. Everything in this thread seems to be related to subprocesses, which is a scary thing anyway for anything internet-facing, isn't it?
It’s arguably worse than Node because Node doesn’t pretend to provide any security. With Deno you may be tempted to think that permission to run a specific command actually means that the program can’t run some other command (it can, and doing this doesn’t even require _clever_ hacks: Deno uses the binary name instead of the full path in its permission system, so you only need to change $PATH for the child process).
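To make that concrete, roughly the kind of $PATH trick being described, assuming the lookup honors the child's environment and the check really matches on the bare binary name (paths and script contents are made up, and I haven't verified this against current Deno versions):

// Granted only --allow-run=git plus --allow-write=/tmp, nothing else.
await Deno.mkdir("/tmp/evil", { recursive: true });
await Deno.writeTextFile("/tmp/evil/git", "#!/bin/sh\necho not-really-git\n");
await Deno.chmod("/tmp/evil/git", 0o755);

const p = Deno.run({
  cmd: ["git", "--version"],   // the permission check sees the allowed name "git"...
  env: { PATH: "/tmp/evil" },  // ...but the claim is that PATH decides which "git" actually runs
});
await p.status();
p.close();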
> make sure you carefully consider if you want to grant a program --allow-run access: it essentially invalidates the Deno security sandbox
Saying Deno shouldn't "pretend" (or attempt) to provide more security because a non-default flag invalidates the sandbox (as stated clearly in the docs for that flag) seems like slight hyperbole.
It would admittedly be cool if we could use this flag securely (though I'm sure the implementation complexity would be significant, and more code surface area is never nice to audit).
I am very interested in this. Are there existing exploits for deno? Using a stock out of the box configuration, can you execute some code that breaks its permission model?
Has Deno undergone some kind of security audit to verify its claims with regard to security?
EDIT: I see some referenced issues in comments down below involving the --allow-read/write flags. I'm not interested in that. I'm interested in whether anyone can prove that, with no permissions granted at all, they can break out of the sandbox and achieve ACE.
I think you’d need to either grant permissions (e.g. allow-ffi) or find a privilege escalation bug in V8 or Deno’s Rust bindings. The latter is less likely for sure. But being realistic, most people using Deno are granting some privileges, because most use cases at minimum do some I/O.
I’m academically interested if there are other such exploits, too. But I’d expect if they’re found they’ll be patched before they’re disclosed (or they’ll be exploited in the wild).
Most browser exploits these days use heap-spraying attacks that try to corrupt the state of the sandbox in between bindings and native libraries (or their data structures that are transferred between contexts). So technically, a JIT VM always leads to possibilities for breakouts when there is a discrepancy between the optimizer's and deoptimizer's assumptions (e.g. in regards to call stack, garbage, memory ownership, etc.).
Also: there's a legacy navigator.plugins C-bridge-based API which hasn't been maintained or redesigned/refactored since the late 90s, yet it is still active in most browsers.
You are probably right. It would be nice if node adopted it, I agree that it's unlikely to replace, but I do hope it can at least coexist.
I use Deno mainly for simple scripts with Typescript. In nodejs I always find myself having to configure the environment, while in Deno it mostly just runs.
I beg to differ; Deno adds some fire to quality modules becoming the norm for server-side JavaScript: smaller, faster, purpose-made modules and frameworks that don't rely on thousands of dependencies.
Of course, it's up to each and every developer that contributes, but looking at some of the most-starred Deno modules, that's the direction it's heading.
> smaller, faster, purpose-made modules and frameworks that don't rely on thousands of dependencies.
This isn't necessarily enforced by Deno itself, right? That seems like more of a side effect of the self-selection of its users. Once the ecosystem grows and all the "normies" come in, this doesn't seem guaranteed at all.
My understanding is that one of the goals of Deno is to be more compatible with the Web. As a backend-only dev I don't find that great; I'd rather have a clearer separation between quality backend modules and shitty 100-dependencies web modules ;)
Also, I don't want smaller modules. I want bigger and better-maintained modules with few dependencies. Small modules are what makes the npm ecosystem not that great.
> quality backend modules and shitty 100-dependencies web modules
As someone who works in both stacks, I have seen incredibly shitty backend modules that dump 400 MB of node_modules deps.
I'd even argue the Node ecosystem is at fault, because the frontend adopted its package manager and "best practices".
So get off your high horse.
And Deno's goal of web compatibility is about using web APIs where applicable instead of special snowflakes (e.g. the Web Crypto API as browsers use it vs. Node's crypto module) and being permission-focused rather than access-by-default. Feel free to debate the merits of that instead.
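For what it's worth, that's the appeal: this exact snippet runs unchanged in Deno and in a browser, no import needed.

// Standard Web Crypto API, available globally in both Deno and browsers
const data = new TextEncoder().encode("hello");
const digest = await crypto.subtle.digest("SHA-256", data);
console.log(new Uint8Array(digest).length); // 32 bytes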
Sorry again, I see that I forgot to answer your good points about the crypto API and permissions.
- I think the permission thing is bullshit (and probably reason enough not to take Deno seriously). I'd be interested to be proved wrong (I work on backend security, so this is sincere), but it feels like traditional server-side isolation mechanisms (cgroups, namespaces, outgoing HTTP proxy, ...) are well known, work well, seem safer and more flexible, and are not Node.js-specific, so they are better in any system that is not running a JS-only backend (probably any reasonably large system).
- Node.js now supports the WebCrypto API, experimentally (and for what it's worth, I liked the Node.js one better ^^).
Okay, well, my experience is different, sorry :) Maybe it had more to do with the people working on the FE teams vs the BE teams than with the ecosystem, but FE always ended up downloading half of npm, while BE was much more reasonable.
I'm sure not everyone uses React & co, so you are probably right; I should not have generalized so quickly.
What's the real problem with 100 dependencies and the size of a folder? I can name only non-cached install time, which is irrelevant for a regular developer.
What is "shitty"? Do we have something better than that? Like one big web framework which is maintainable, DCE-able, and solves everything frontend ever needs? Why would further modularization of this theoretical module be shitty?
Edit: Also, what are we comparing to? VS202x takes several gigabytes for a base set of libraries. Xcode is tens of gigabytes now, afaik. Qt is a few gigabytes. Enterprise systems also take a few gigabytes at least. But when node_modules takes more than 80 MB to produce a 0.5 MB bundle for a web-scale product, we start to complain. I don't understand.
I don't mind 100 dependencies maintained by a single large community/project. I hate 100 dependencies each maintained (or not maintained) by individual amateur devs (not meant negatively, to be clear; I just mean they are not paid for this).
Thing is, unless you really love JS for some weird reason, Deno is competing with established languages like Go that have a great web server in the standard library. In addition to competing with Node, which is fine! Apart from its insane package management situation, which most people just kind of shrug at.
JS is already a popular language for "DevOps" cloud scripting. Deno would be a better fit for these kinds of use cases for the same reason Go is a good fit and there's already market share.
I maintain Nodemailer and moved to the zero-dependency model years ago (Nodemailer only has some development dependencies needed to run the test suite but no production dependencies). Mostly, though, because npm versions of that time were super slow: aggregating all modules into a single package brought install time down to around a single second, while the same code separated into modules took 15 seconds or more to install. It turned out to have other benefits as well. For example, it is far easier to support older platforms (Nodemailer supports all Node.js versions since Node v6) if your dependencies do not start using the latest ES syntax overnight.
I'd like to use some modern common ground for JS/TS development, but the entire toolchain is not ready for this, which somehow turns it into a chicken/egg problem. TypeScript, webpack, and Babel all contribute to that. For the last 10 days I tried to pull my generic project as far toward the latest setup as I could^, but my modules are still CommonJS, because to use imports I have to set "type": "module" in package.json, which makes webpack.server.config.ts fail because TypeScript is not ready for "type": "module" until 4.6. I can't even recall now what the issue with Babel is, I guess the same thing, since I'm using it to strip types in development builds. And then there are modules from the ESM movement which are incompatible with this state of things. I understand their idea that nothing will move if not kicked forward, but I hate it in real production where I can't upgrade because the author said so.

Once the ESM transition is done, Deno will get many more modules, I believe. But right now the friction is unbelievable. Idk why they can't just allow all of these things (imports, requires, sync/async, the side-effects thing) to coexist at least for a while. It's a matter of form, not of content. And there is seemingly no reason why Node.js couldn't make main.js async by default: sync modules would just return a resolved promise. There is so much circus in all of this, which makes you pull your hair out for weeks of setup.
^ I'm bundling server-side for hmr/watch functionality in a monorepo with many cross-side shared code/modules.
I tried doing Advent of Code in Typescript, and spent entirely too many hours googling circular problems like "I can't use type:module because then ts-node can't load .ts files anymore, but if I don't use it, then require-statements are broken, but ...."
I don't even remember the details, but I remember it all feeling very rickety. I'd never push anything that fragile into production; it didn't even survive 25 days of doing small puzzles without constant nursing.
Yep, the same experience and conclusions. I'm afraid of using this in production, because chances are some of the major players would say "well, I'll just kick the can down the road", and I will be dependency-deadlocked for all the time it's in the air.
Meanwhile, guys like node-fetch and chalk ask us why we don't just adopt ESM.
You can do that with Node.js too, e.g. edit package.json and main.js in VSCode, hit a hotkey to npm install, hit a hotkey to run node ./main.js. This is what I'm doing at the day job.
What I'm trying to do there is to have a client and server in the same ./src, hot-reloaded module-wise on "save" only when a relevant part changes, and vscode to typecheck both at the same time. The language similarity is also a goal. It's a little more than just a traditional lazy-compile-restart cycle.
Non-monorepo, non-TS-only folks do not experience my issues, because they only have one environment per project (or per src-<target>), and don't end up with build configs that are incompatible with other parts of the build system. I tried to push it as far as it could go to evaluate the state of things for writing non-standard, slightly different web apps. To make a TS React app, they just use CRA; for a backend they just run node main.js.
But anyway this shows how interdependent this ecosystem is, instead of being full of orthogonal possibilities.
Love Deno. So much more intuitive and simple than Node. I highly recommend giving it a try; if you've used Node before, Deno will be a super easy tool to learn.
Off-topic, but this webpage disables my bottom navigation bar on mobile Safari; their own navigation on top re-enables it. Has anyone else experienced this?
Now instead of npm packages abusing SemVer unless you use non-default --save-exact behavior as an attack vector, we can import modules from URLs without subresource integrity unless you use non-default lock file behavior! Great! We learned nothing!
Deno’s import system is neat at first, until you're repeating yourself everywhere; so you do what they suggest and make a file of import URLs, and now you have your own package.json format.
Node.js fully supports ECMAScript modules as they are currently specified and provides interoperability between them and its original module format, CommonJS. Authors can tell Node.js to treat JavaScript code as ECMAScript modules via the .mjs file extension, the package.json "type" field, or the --input-type flag.
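E.g., the lowest-friction opt-in is just the file extension (assuming a reasonably recent Node; top-level await needs 14.8+):

// hello.mjs — the .mjs extension alone makes Node treat this file as an ES module
import { readFile } from "fs/promises";

const pkg = JSON.parse(await readFile("./package.json", "utf8"));
console.log(pkg.name);

// node hello.mjs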
Mentions security and then completely glosses over security in the section on "yeah, we load random modules from URLs but it's okay because most Deno registries are immutable and we cache files"
If you care about security you will have set up your own node package registry with a curated/audited list of dependencies, then you need to point to the registry for the dependencies and maintain the registry.
With Deno it should be easier to do this: you set up your own CDN, just upload plain JS files and point to them from your import map [1], and the browser will take care of downloading/caching them all.
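Roughly like this (hostname and module names made up; the --import-map flag tells Deno about the file):

// import_map.json:
//   { "imports": { "mylib/": "https://packages.internal.example.com/mylib/1.2.0/" } }

// app.ts — the "mylib/" prefix is rewritten by the import map before anything is fetched
import { greet } from "mylib/mod.ts";
console.log(greet("world"));

// deno run --import-map=import_map.json app.ts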
> If you care about security you will have set up your own node package registry with a curated/audited list of dependencies, then you need to point to the registry for the dependencies and maintain the registry.
Exactly. And it's quite easy to do.
> With Deno it should be easier to do this: you set up your own CDN, just upload plain JS files and point to them from your import map
I was waiting for the inevitable just.
- Just set up your own CDN.
- Just upload a plain js file there (where do I get those files from?)
- Just point to dependencies using a feature that, quote "is not a W3C Standard nor is it on the W3C Standards Track"
- And then the browser... record scratch. Who said anything about a browser?
And we have lock files to verify integrity. This is no different from module loading in Node. If you don’t trust your registry, you should not be loading code from it!
> If you don’t trust your registry, you should not be loading code from it!
So you immediately pinpointed the difference: with Node I can run my own registry and easily set up npm/yarn to never load packages from anywhere else. Deno loads code from random urls.
> So you immediately pinpointed the difference: with Node I can run my own registry and easily set up npm/yarn to never load packages from anywhere else. Deno loads code from random urls.
Which is why we support a) import maps which allow you to rewrite all URLs however you want, and b) HTTP_PROXY, which allows you to intercept all HTTP traffic (also letting you rewrite all specifiers).
I don't know if you have ever worked on a Go project, but it has a very similar registry proxy situation as Deno. It works well.
You are so wrong. If you had done maybe 3 minutes of Googling you would know we support import maps, which allow you to arbitrarily rewrite specifiers, even deep inside the module graph.
> for example to add bootstrap to a site you import like this
I know how to import a file in a browser. However, Deno is not a browser. The whole subthread is about managing dependencies, which Deno fails at, and its proponents come up with the most ridiculous things to justify it.
NPM is not a trusted repository, I think. There are no checks done on the content of the packages uploaded by users. It's up to you to make sure that what you add to your project doesn't contain malware/vulnerabilities.
If you use a lockfile, downloading a package from NPM or directly using a random URL is conceptually the same, since they are both untrusted sources. Having a lockfile will ensure that if you download a dependency to review it for vulnerabilities, later re-downloads of the dependency will not have changed files.
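In Deno terms that lockfile flow looks roughly like this (flag names per the Deno manual; file names made up):

# Record a hash for every remote module in the dependency graph
deno cache --lock=lock.json --lock-write deps.ts

# Later (e.g. in CI): fail if any fetched file no longer matches its recorded hash
deno run --lock=lock.json --cached-only main.ts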
It's not. But it's rather trivial to run your own registry, and many (most?) companies do exactly that.
> It's up to you to make sure that what you add to your project doesn't contain malware/vulnerabilities.
That's why, again, many companies run their own registries, and don't download random files from the internet.
> Having a lockfile will ensure that if you download a dependency to review it for vulnerabilities, later re-downloads of the dependency will not have changed files.
And to generate that lockfile... you need to first download a random file from the internet. Got you.
> there is not really a difference between a url to a registry and a npm package name.
There is. For starters, I can run my company's registry and make sure all npm packages are downloaded from there, since the resolution mechanism for npm/yarn is well known.
How do I tell deno to download <random-url> from my own registry?
Looking at how deno "solves" this I can't stop laughing [1]
--- start quote ---
In Deno there is no concept of a package manager as external modules are imported directly into local modules. This raises the question of how to manage remote dependencies without a package manager. In big projects with many dependencies it will become cumbersome and time consuming to update modules if they are all imported individually into individual modules.
The standard practice for solving this problem in Deno is to create a deps.ts file. All required remote dependencies are referenced in this file and the required methods and classes are re-exported. The dependent local modules then reference the deps.ts rather than the remote dependencies...
With all dependencies centralized in deps.ts, managing these becomes easier.
--- end quote ---
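For reference, such a deps.ts is literally just a file of re-exports (URLs picked arbitrarily from std here):

// deps.ts — the only file that knows about remote URLs; bump versions in one place
export { parse } from "https://deno.land/std@0.120.0/flags/mod.ts";
export { ensureDir } from "https://deno.land/std@0.120.0/fs/mod.ts";

// any local module then does:
//   import { parse, ensureDir } from "./deps.ts";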
There is one difference. I know npm keeps published versions. I don't know that a random URL keeps versions. Caching locally doesn't help; I expect my code to work for others. Of course using any source is nice. Node also allows this; just put a git URL as your dependency.
> URLs to a registry keep published versions if the registry keeps published versions.
Yes. Have you ever heard of running your own registry? It's quite easy to do, and most companies do it precisely because they want to a) keep published versions and b) prevent things like colors/faker.js.
Literally no one who promotes Deno has yet shown how to do the same with Deno beyond "yeah, you check in all your node_modules dependencies into Git".
> You assign the dependency to a variable and that's it
erm, how is that any simpler than `export const/function/class $definition`?
I guess if you only touch JS once in a blue moon it's difficult to remember?
Also, CommonJS has some acknowledged issues around cyclic dependencies, and it is incredibly fudgeable at runtime, which makes static analysis and linting a pain.
Because the ESM standard specifies that modules are loaded by URL and themselves might initialize async. Once anything is async the whole call stack into it is. See also various discussions of “colored functions” which have made the rounds again recently.
Node did make modules async (just ESM modules). They can’t/won’t make CJS async because their synchronous semantics are a guarantee they determined not to break well before ESM. If you want async modules, ESM is the solution for that.
This is inaccurate, or at least misleading. Your code doesn’t begin executing until all of the modules are loaded, so it doesn’t “make the call stack async” just because of the import system. Module loading that’s async from the point of view of user code is also supported, but its use is rare.
> Your code doesn’t begin executing until all of the modules are loaded.
This is inaccurate, or at least misleading.
This, or some aspect of it, is both possible and relatively common:
import foo from 'foo';
await foo.bar();
await Promise.all([
  import('other'),
  import('stuff'),
]);
// Admittedly this is less common but also valid!
import yet from 'more-stuff';
The asynchrony of ESM was controversial before it even became a standard. But it’s necessary because it allows network I/O. And most of the above patterns being relatively common is one of the major use cases for bundling, because it also introduces an indefinite waterfall.
In terms of rarity, dynamic import calls are already quite common for “lazy”/“suspense” or their equivalents in quite a lot of real-world code, and likely to become more so with React Server Components and other similar solutions deferring to server rendering.
Yes, an import statement is [semantically] blocking. But even so it’s important to know that it’s performing async I/O.
You only have to `await foo.bar()` if foo.bar was an async function already, the module system is irrelevant. You would still need to await it even if it was `require()`'d in.
Of course these have to be awaited, at this point you're explicitly trying to load new code while the program is running. It's no different than any other kind of async IO.
> import yet from 'more-stuff';
I assume this is included to imply that loading `yet` is blocked by the `await Promise.all` above it to show that import statements can run after module resolution. If this was your intent you are mistaken, that import statement is still resolved before execution begins.
// main.js
console.log('a')
await new Promise(res => setTimeout(res, 1000))
console.log('b')
import { foo } from "./foo.js";
// foo.js
export function foo() {
}
console.log('c')
Results in
c
a
[one second pause]
b

Works just fine, no `await` necessary.
> Yes, an import statement is [semantically] blocking. But even so it’s important to know that it’s performing async I/O.
No, it really isn't. From the POV of the application code it's irrelevant. This is trivially proven by the existence of module bundlers which convert non-dynamic `import` statements into one big file that doesn't do anything asynchronous to initialize.
> You only have to `await foo.bar()` if foo.bar was an async function already, the module system is irrelevant. You would still need to await it even if it was `require()`'d in.
You cannot do that. Top-level await is forbidden because CJS require is explicitly synchronous.
> Works just fine, no `await` necessary.
I typed all of that on a phone so I erred on the side of brevity, but suppose either the Promise.all or the awaited imports depended on some result from importing foo.
> I assume this is included to imply that loading `yet` is blocked by the `await Promise.all` above it to show that import statements can run after module resolution. If this was your intent you are mistaken, that import statement is still resolved before execution begins.
I was trying to show that import statements are blocked by prior awaits. I’m not sure what you’re trying to show, but you’d need to log output after the import from foo.js to see how the asynchrony works for that import.
> No, it really isn't. From the POV of the application code it's irrelevant. This is trivially proven by the existence of module bundlers which convert non-dynamic `import` statements into one big file that doesn't do anything asynchronous to initialize.
Compare that to a non-bundled but otherwise similarly optimized app and look at the waterfall.
Right. Working with Node.js I assume all dependencies are on the local disk. And that makes it simple and straightforward to use "require" in Node.js.
But sometimes I'd like the same code to work also in the browser. Then I should be using ES6, but it doesn't work very smoothly with Node.js and I fear there may be complications if I have to wait (or "await") for the modules to have loaded.
The ES6 "dynamic imports" add more considerations to the mix. Surely their exports can not be used until "await" is over?
I think it would simplify things if I didn't have to choose between module-systems when I really just want to choose between sync and async.
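To check my own understanding, a dynamic import only hands you the module namespace once its promise settles, so the exports indeed can't be used before the await (this runs inside an ES module; the file and function names are made up):

// math.js (hypothetical) exports an add() function
const math = await import("./math.js");
console.log(math.add(1, 2));            // fine: we awaited the namespace object

import("./math.js").then((m) => console.log(m.add(1, 2))); // the same without top-level await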