
Deno Is a Browser for Code - beefman
https://kitsonkelly.com/posts/deno-is-a-browser-for-code/
======
soapdog
Just for kicks, I tried using Deno for a task at my day job.

We had a need to extract records from a database into a textual format to be
processed by other tools later. This is a typical task for a throwaway
script, one which can be written in basically any scripting language. There was
nothing in the task that called for Deno, but I wanted to try it out, and so
I did.

I basically worked on the script iteratively: making small changes, running it,
checking the output, fixing things, and running it again. No bundler, no complex
NPM tasks. Just simple spaghetti JS and a single-binary engine to run it. It
was a very refreshing experience. Using URL imports was pretty nice; I really
like working that way, since my preferred JS environment is not NodeJS but the
browser. I can see myself transitioning all those small tools and throwaway
scripts I'd otherwise do in NodeJS to Deno.

People relying on NodeJS for servers and tooling for their client-side webapps
might need to wait a bit longer, but for small tasks and scripts I think it is
quite ready and, IMHO, better than NodeJS. These are also good projects for
getting a feel for the new runtime.

~~~
jfkebwjsbx
For small scripts there are many languages out there which are better, with a
proper standard library and runtimes pre-installed on virtually every machine.

Why someone would want to use a new language and a new runtime for production
scripts is beyond my understanding.

Just write Python for those. Even Perl, awk or sh would be better options, and
that is saying something...

~~~
searchableguy
Well, I can think of a few. For one, if you are writing a script with external
dependencies, distributing them is more work elsewhere than with Deno.

For Python, you would need pip. Can you install a script from an arbitrary URL
using pip?

I am not sure, but I don't think you can. You also have to declare your deps
somewhere for pip to figure them out.

With Deno, it's just `deno run -A <url-to-script>`, since dependencies are
declared in the code with their respective URL or file location.
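
To illustrate, here's a minimal sketch of the kind of script I mean (the file
name, flag, and output are made up; the std import is a real standard library
module):

    
    
        // fetch_records.ts: the dependency is declared inline as a URL,
        // so there is no package.json and no npm install. Deno fetches
        // and caches the import on first run.
        import { parse } from "https://deno.land/std@0.53.0/flags/mod.ts";
        
        const args = parse(Deno.args);
        const out = args.out ?? "records.txt";
        await Deno.writeTextFile(out, "id,name\n1,example\n");
        console.log(`wrote ${out}`);
        
        // Run with: deno run --allow-write fetch_records.ts --out=records.txt
    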

~~~
jfkebwjsbx
The whole point of Python is that _you don't need external dependencies_ for
"small tasks and scripts".

A production "script" requiring downloads of third-party code from external
servers is already a red flag, by the way.

~~~
candu
One could extend that logic to argue that all package managers are red flags:
after all, you're downloading third-party code from external servers; you just
happen to be doing it ahead of time, rather than at runtime.

In principle, there's no reason why the same supply-chain security mitigations
npm and other package managers / repositories have put in place could not also
be applied in this case: you just apply them at download time, same as before,
except now it might result in a runtime error instead of an npm install
failure. (No idea if deno does this in its current state, tbh.)

Agree, however, that blindly executing third-party code without somehow
vetting it is a security risk.

Also, re: Python - there's a reason why libraries like requests exist; while
the Python Standard Library is pretty comprehensive, the APIs it exposes are
not always the most intuitive. You might be surprised at how often people pull
in helper libraries to work around some of its warts - now, one could argue
that maybe they shouldn't do that and should just know the built-in modules
better, just as one could argue that functionality is useless unless combined
with usability.

~~~
jfkebwjsbx
Package managers are not red flags; the packages, the vendors, and the
procedures behind them are.

When you download a software update for your kernel you are trusting the Linux
Foundation and your upstream vendor. They are supposed to have proper
processes in place, they sign the binaries and may even have a support
contract with you.

When you put a random URL as a dependency, you are just trusting some random
person over the Internet not to screw it up.

Re: Python libraries. The topic of this thread was about "small scripts and
tasks". The point of Python and its "batteries included" is that you don’t
need external libraries to accomplish common tasks. It is a mistake to use
external helper libraries in your scripts (not _apps_).

------
jppope
It's very interesting to me, the level of skepticism being shown towards
Deno.

Whether or not you agree with the approach, you have to acknowledge the
underlying problems that Deno is trying to solve. Furthermore, I agree with the
article that there is a paradigm shift here, and it should be given a chance.

I for one am happy about the URL imports for a variety of reasons, but top of
my list is that it will (in theory) reduce the total amount of code people are
using from other sources, and (in theory) make developers more aware of
what they do choose to import. All of this should reduce bloat and reward open
source maintainers who solve novel problems in a clear and concise way.

~~~
emerongi
There are better ways to reduce bloat. GitLab can do web asset analysis and
throw warnings when your JS file goes over a certain size. Of course, in
server-side applications the use case would be different, but the idea remains
the same: do not depend on developers doing the right thing every single day;
depend on an automated system if something like this is crucial to you.

URL imports completely throw away all the goodies that great package
managers have. Being able to manage sub-dependency versions is a big one. I do
not understand how people are OK with throwing those away. Either they have
never encountered issues that stem from dependencies, or they just don't care.

Deno could technically add layers on top that make dependency management
easier, but then we are back to NPM.

------
paxys
The key paragraph:

> The Deno CLI works like a browser, but for code. You import a URL in the
> code and Deno will go and fetch that code and cache it locally, just like a
> browser. Also, like a browser, your code runs in a sandbox, which has zero
> trust of the code you are running, irrespective of the source. You, the
> person invoking the code, get to tell that code what it can and can’t do,
> externally. Also, like a browser, code can ask you permission to do things,
> which you can choose to grant or deny.

The big problem with Deno is that permissions, once granted, apply to _every_
imported URL. You cannot ask it to let one script access the file system,
another access network and the rest of them run fully sandboxed. This makes it
so that you either only import scripts you already fully trust (making the
permission system useless) or don't allow any access (making the ecosystem
useless).
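
To make it concrete, here's a small sketch (the file path is arbitrary):
whatever code ends up in the process, imported from whatever URL, sees the
same grant.

    
    
        // main.ts: the permission applies to the whole process, so any
        // transitively imported module could perform this read too.
        const hostname = await Deno.readTextFile("/etc/hostname");
        console.log(hostname);
        
        // deno run main.ts               -> PermissionDenied
        // deno run --allow-read main.ts  -> prints the file
    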

~~~
chii
> ask it to let one script access the file system, another access network and
> the rest of them run fully sandboxed.

This is actually quite a difficult problem to solve: you're balancing
security needs against usability.

Browsers have an advantage in this respect, because they can assume that a
webpage does not require filesystem access.

I think a fully sandboxed, OS-esque security model is the only way this can
work. Like on Android, where each app sees only its own filesystem, and
app-to-app interaction is done via a permission-based system that the user has
to approve.

~~~
aabbcc1241
On Android, apps do have access to shared folders. For example, my music
player can scan and read files under
/mnt/sdcard/Android/data/com.dropbox.android/files/

It might be more isolated if it followed the Nix file system model?

------
sholladay
To the people who are skeptical of Deno, I would urge you to have patience and
try it out. Deno is a young tool and it will take time to develop the features
that you take for granted in other systems. But Deno gets a lot of things
right. I already love using it day to day.

I remember using Node when it was this age and thinking it was the future,
too. And it was. But as with Deno now, there were problems. In fact, Node was
far worse in many ways. npm was extremely unstable back then, for example,
which often made it impossible to use. Deno doesn't have that problem. Node
had to create libraries from scratch and invent a ton of non-standard things
along the way to do it. Deno is built on web standards but can also leverage
many of the existing Node libraries by using std/node (the Node compatibility
layer) or with jspm.io, for example. Yes, there's the occasional quirk or
missing library, but it's already easy to be productive with Deno and it's a
lot of fun.

~~~
tannhaeuser
> _Node had to create libraries from scratch and invent a ton of non-standard
> things along the way to do it. Deno is built on web standards but can also
> leverage many of the existing Node libraries by using std/node (the Node
> compatibility layer) or with jspm.io, for example_

Node.js is based on CommonJS [1], which defines the require() semantics for
CommonJS/Node.js modules as well as core libs/APIs such as JSGI (for
express.js or Node core http middleware) and others. When Node.js was new,
there were many JavaScript app server projects, such as Helma, v8cgi/TeaJS, and
others, and CommonJS was very much a community effort.

[1]:
[http://wiki.commonjs.org/wiki/CommonJS](http://wiki.commonjs.org/wiki/CommonJS)

~~~
specialist
The inability to import (require) modules using relative paths drove me nuts
while using NodeJS. I hope Deno fixes that.

------
orta
I think this is a good article, and worth the time to give it a read. When
Deno was first announced I was not sold on this aspect of the system.

Over time, though, and after having written a barely-above-trivial app in Deno,
I've come to the conclusion that Deno's form of dependency management is a
fresh perspective, one which ties closer to how browsers work and is a welcome
simplification. The article covers a few ways that you can get
production/reproducible builds out of the box, which is usually people's first
worry.

~~~
jppope
agreed

------
earthboundkid
Odd that the article doesn’t mention Go. Deno works like Go. It’s slightly
different in that Go recently added a system for version rewrites (“modules”),
but other than that it’s the same, and if import maps ever happen, it will be
exactly the same.

Also Deno has a command line fallback for its version of GOPATH, which is
smart because it turns out a huge percentage of devs don’t know what env vars
are or how to set them.

But other than cosmetic differences, it’s pretty much identical to Go.

~~~
maxmcd
I've been skimming the arguments so I might have missed something big, but
isn't the key distinction that Deno can take anything from a URL? Go's "go
get" is backed by version control systems, but you could freely point your
deno at "[http://dangerous.com/evil.ts"](http://dangerous.com/evil.ts"). I
thought the concern was about the potentially problematic flexibility.

~~~
14u2c
Isn't that the advantage, though, that you can do exactly that without concern?

    
    
        deno run --allow-write=~/only-evil-here http://dangerous.com/evil.ts
    

------
gigel82
You hardcode the dependency version in the actual URL string in JavaScript
code? How in the world is that manageable? I run `npm update` once a week and
it brings down a dozen or so updates (occasionally `npm outdated` as well, with
manual testing).

I bet we'll have a bunch of competing standards for "dependency management" in
Deno coming online really soon, because there's an obvious need for them.

Also, are dynamic dependencies a thing? Can I run some code to decide (at
runtime) whether I import this.package or that.package? Holy hell... with test
coverage the way it is in most projects, that's going to blow up really fast.
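
(Something like this, presumably, via standard dynamic `import()`; the module
URLs here are made up:)

    
    
        // Runtime-conditional imports with dynamic import().
        // Both URLs are hypothetical; reading env vars needs --allow-env.
        const mod = Deno.env.get("USE_NEW") === "1"
          ? await import("https://example.com/this_package.ts")
          : await import("https://example.com/that_package.ts");
    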

~~~
PudgePacket
> You hardcode the dependency version in the actual URL

Websites have been doing this for .. 20 years?

It's quite a different paradigm, yes.

~~~
onion2k
_Websites have been doing this for .. 20 years?_

Web developers have been trying to move away from that paradigm in order to
make sites faster, more reliable, and smaller for almost as long.

~~~
jai_
Web developers have been trying to move away from URLs that describe their
content for 20 years?

Well that certainly explains the state of the 'modern' web

~~~
onion2k
_Web developers have been trying to move away from URLs that describe their
content for 20 years?_

My comment wasn't about URLs describing their content, but rather the
assumption that goes along with it that every network request will always work
perfectly. Once you realise that things do actually fail occasionally, you
should realise that local caches and bundling things into a smaller number of
requests can improve resilience.

That's not really an issue with Deno, because it does things to mitigate the
problem (e.g. caching packages locally), but _websites_ that loaded libraries
using separate requests for each one were horribly prone to failure if one
thing was missing, and slower overall if the browser couldn't fetch everything
in parallel, so devs invented things like bundlers to fix the problem. Then
came things like code-splitting and tree-shaking to reduce the size.

------
forty
If they wanted to fix something on dependency management, they could have
started by adding a way to sign the code/packages. Indeed browsers don't allow
to sign pages, which means anyone from the hosting provider to the CDN to the
mitm corporate proxy can inject any code they want. It sucks and I'm not sure
why "Deno is a browser for code" seems to be a good thing....

Hopefully something like google signed http exchange will fix that in the
future for browsers at least.

~~~
ccmcarey
Unless I'm missing what you're referring to, that's not true at all - HTTPS is
secure and uses certificates to sign and encrypt traffic.

~~~
emerongi
I think OP is arguing:

- Host can change what is in a package at any time. If Deno calculates a hash
on your machine (as with package-lock.json), it's not as bad, but without a
signed package you still can't be sure that it was the author who released the
code in the first place.

- Host <-> CDN traffic usually happens over HTTP; often it is at the CDN that
HTTPS is terminated. Technically, the CDN can deliver whatever code it likes
to you.

- Corporate HTTPS proxies exist, and a proxy like this can also replace the
code with whatever it likes.

~~~
brabel
> In case Deno calculates a hash on your machine (e.g. package-lock.json),
> it's not as bad, but still without a signed package you can't be sure that
> it was the author that released the code in the first place.

If you trust that the URL is on a (sub)domain controlled by the author and
it's an HTTPS URL, that's as much of a guarantee as a signature.

------
julvo
The two most frequent arguments against using URLs instead of npm packages
seem to be: 1) security: what if someone changes the code behind a URL? and
2) hard-coded versions in application code.

I don't think these arguments are valid. For 1), you can always limit the URLs
you use to hosts like jsDelivr or unpkg and reference npm packages directly.
In this case, instead of trusting only npm to send you the code you requested,
you're now trusting the CDN as well. Potentially even better would be to
import scripts from GitHub directly. For 2), a simple solution is to have
a deps.ts file for your project, where you import from version-pinned URLs and
re-export the dependencies version-unaware. Then your versions are not
scattered throughout your application code.
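
For example, a sketch of that deps.ts pattern, reusing the oak URL from the
article:

    
    
        // deps.ts: the version is pinned in exactly one place
        export { Application } from "https://deno.land/x/oak@v4.0.0/mod.ts";
        
        // elsewhere, application code stays version-unaware:
        // import { Application } from "./deps.ts";
    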

To me, URLs seem like a nice (because compact and universal) interface for
dependencies, one which lets you decouple your package management from your
runtime.

~~~
jfkebwjsbx
So, in order to solve 1) and 2), we should return to what every other language
does? That sounds like a design mistake...

What do you mean by "URLs are compact"? How can a URL be compact compared to
the same URL in some sort of dependencies file? Unless you use the dependency
once, of course, which is rare.

URLs and explicit versions in the import also encourage you to use several
versions of the same dependency in your code, which is a very bad idea.

NPM’s model of referencing packages from central repositories is bad for many
reasons proven over the years. But referencing GitHub scripts and arbitrary
URLs is even worse.

~~~
julvo
URLs are compact in the sense that a single string specifies everything you
need to locate and use the dependency, and you don't need any context, like a
specific dependency manager (including some sort of dependency file). I guess
I mean compact as in self-contained, not necessarily short.

~~~
jfkebwjsbx
That’s fair, and I’d agree it could be a useful property for scripts and even
small apps if all code were trusted and all networks had perfect
availability.

Sadly, that is not the case, which is why I don’t see the advantage of going
"the browser way" for local scripts.

------
pjmlp
The first time I dealt with dependencies written directly in code was with
Groovy.

It wasn't a good idea back then, it wasn't a good idea in Go, and it
definitely isn't a good idea in Deno.

~~~
vlaaad
Care to explain why?

~~~
pjmlp
Because it ties the source code to the origin of the dependencies, it forces
code rewrites that don't scale across all dependencies, and it isn't
enterprise-friendly for using internal, IT-vetted repositories.

Just because a dependency changed location I should not be obliged to touch a
single character of the source code.

~~~
brabel
In the case of Groovy, which the OP mentioned, that's wrong. This is what it
looks like in Groovy[1]:

    
    
        @Grab(group='org.springframework', module='spring-orm', version='3.2.5.RELEASE')
        import org.springframework.jdbc.core.JdbcTemplate
    

This provides the dependency's coordinates... but not where it comes from.

You can define where it comes from by configuring repositories (normally done
by modifying the M2 settings file). This is how Java has always done it in
Maven and Gradle, and even in Ant a long time ago. And it works really well!
I've never seen anyone complain about how that works.

[1] [http://docs.groovy-lang.org/latest/html/documentation/grape.html](http://docs.groovy-lang.org/latest/html/documentation/grape.html)

~~~
pjmlp
I am the OP and you forgot this little detail,

    
    
        @GrabResolver(name='restlet', root='http://maven.restlet.org/')

~~~
brabel
No, I didn't. That's for when you WANT to use a particular repo. Read the docs
and you'll understand that.

~~~
pjmlp
Which breaks the script when that repo is no longer available.

------
keymone
The holy grail of package management is the absence of packages and versions,
with distribution via signature-addressable code instead.

In the end, it’s all just functions that take arguments and return values. They
should be the minimal unit of distribution, potentially even with precompiled
bytecode signed by some trusted compilation provider.
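
A minimal sketch of the content-addressed half of that idea, in Deno-flavored
TypeScript (the function is hypothetical, and real signing would need key
infrastructure on top):

    
    
        // Refuse fetched code unless its SHA-256 digest matches a pinned
        // value: the core of content/signature addressing.
        async function fetchByDigest(url: string, expectedHex: string) {
          const source = await (await fetch(url)).text();
          const digest = await crypto.subtle.digest(
            "SHA-256",
            new TextEncoder().encode(source),
          );
          const hex = [...new Uint8Array(digest)]
            .map((b) => b.toString(16).padStart(2, "0"))
            .join("");
          if (hex !== expectedHex) throw new Error(`digest mismatch: ${url}`);
          return source; // only now is it safe to evaluate
        }
    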

~~~
devurand
Sounds like you want Unison:
[https://www.unisonweb.org/docs/tour/](https://www.unisonweb.org/docs/tour/)

~~~
j1mr10rd4n
I was just racking my memory and searching through my library of interesting
links to find exactly this! Paul Chiusano gave a nice introductory talk at
strangeloop last year:
[https://www.youtube.com/watch?v=gCWtkvDQ2ZI](https://www.youtube.com/watch?v=gCWtkvDQ2ZI)

------
AbuAssar
The node_modules folder, with its tens of thousands of files, is killing my
SSD!

It is present in every Node project I have and consumes a gigantic number
of I/O operations.

Thus I’m very glad that Deno didn’t continue this tradition.

~~~
Epskampie
Replace ‘node_modules’ with ‘deno cache’ and presto! you’re in the future.

~~~
arcatek
Yarn already removes the need for node_modules without requiring you to change
interpreter and APIs, though.

------
sktrdie
All these "features" of Deno are super cool, but why reimplement a runtime
from scratch :(

Can we not benefit in Node from what I essentially see as "package.json not
mandatory" - which I think is already achievable with ES Modules - and a
"better permissions model"?

Maybe this can be seen like the effort from several years ago where the
"io.js" project branched off of Node to implement certain things the community
was unhappy about. Then, after a few years, those were implemented in Node and
io.js "died off". Although io.js always had the intent of being compatible
with Node, so that definitely changes things.

~~~
petercooper
Some of what Deno does can be done in Node or people are trying to implement
it (I've seen some proofs of concept around the way modules are included).
However, it seems that being able to write the runtime in Rust has been a big
deal here as it dramatically changes how the runtime is distributed and the
developer experience of writing extensions for it versus Node.

~~~
sktrdie
How is choosing between Rust and C++ as the language the thing is written in
important to users who only care about the JS/TS interface?

~~~
mst
Because, in the long run, Rust is a much better language for writing this in,
and it will enable faster addition of features _for_ the JS/TS interface and
more certainty that the security-level stuff is actually secure.

------
talkingtab
The npm registry is like a rickety bridge across a chasm. Every day I cross
that bridge because of how much time and work it saves me, but I am well aware
that the bridge might fall, especially with Microsoft now in charge.

Claiming that I don't need a bridge does not help me. Claiming that Deno is
agnostic about bridges is fine. If someone is looking for a startup idea, here
is a perfect opportunity to build a better bridge - one that works with Deno.

------
dzsekijo
_For example, [https://deno.land/x/](https://deno.land/x/) is effectively
nothing but a URL redirect server, where it rewrites URLs to include a git
commit-ish reference in the redirected URL. So
[https://deno.land/x/oak@v4.0.0/mod.ts](https://deno.land/x/oak@v4.0.0/mod.ts)
becomes
[https://raw.githubusercontent.com/oakserver/oak/v4.0.0/mod.t...](https://raw.githubusercontent.com/oakserver/oak/v4.0.0/mod.ts),
which GitHub serves up a nice versioned module._

Thus [http://deno.land/x/](http://deno.land/x/) is the de facto central
registry. I have yet to see how it makes a difference whether the logic of
resolving a package name to a concrete version of the package is baked into
command-line tooling or into a web service.

------
macca321
Seems odd that they don't support an equivalent of the integrity attribute.

------
diffrinse
Feel like it would've been useful to require a manifest.json or some such that
spec'd required permissions, right next to the Deno package on the FS, in the
web server's web root. This is no different from manifests for Web Extensions.
Then anyone could write their own CLI to download and cache packages but keep
their own "excluded permissions" list that throws/rejects packages that
overreach or could potentially overreach.

Now package writers have a feedback metric for getting as granular as possible
with the access their package needs. Even better, anyone could build server
infrastructure that does the client-side validation bit _for_ package
consumers, with its own attendant CLI tool.

It looks to me like Deno opens the possibility of _multiple_ NPMs, which is a
good thing compared to the de facto monopoly NPM exerts today.
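
Something like this, say (entirely hypothetical; Deno has no such manifest
today):

    
    
        {
          "name": "example-module",
          "permissions": {
            "net": ["api.example.com"],
            "read": false,
            "write": false,
            "run": false
          }
        }
    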

------
forty
The issue I have with Deno is not only in comparison to NodeJS. If you might
use Deno, it means you are open to switching platform, and even language if you
were not on TS already. And if you are at that point, you might as well choose
any platform, for example one that uses a better (more strongly typed)
programming language and has a bigger ecosystem (Scala, Swift, Rust, F#, Java,
OCaml, Kotlin... at this point I feel nearly any other language fits this
description ^^). I have been doing TypeScript for a while now; it's probably
as good as it can be at what it does (being a JS add-on), but if you
don't have that constraint, there are many better options.

~~~
ex3ndr
Please,

Scala can be so different that you can spend years learning it and still not
understand a lot of things. Swift is OK, but ARC is a nightmare; good
for mobile, though. Rust is really for C++ folks with specific use cases.
F# and OCaml are cool languages, but good luck hiring fast if you need to.
Kotlin is probably the best, but in fact it only performs decently on the JVM;
Native is still 15x slower than JS.

TS is a first-class language and has a bigger ecosystem than, say, F#, anyway.
Etc, etc.

~~~
saagarjha
I’m going to skip over the other stuff, but

> Native is still 15x slower than JS.

Surely you meant the other way around?

~~~
cercatrova
[https://discuss.kotlinlang.org/t/kotlin-native-performance/17823](https://discuss.kotlinlang.org/t/kotlin-native-performance/17823)

Seems Kotlin native can be 100x slower than its JVM version. It's still under
development so this can of course change.

------
oddthink
This may be inviting downvotes, but I still don't understand the use-case of
Deno. Why would I use this over something like Python?

In Python, I find I can do most things I want with the standard library plus a
handful of well-known packages, so even just downloading the tarballs and
installing manually isn't all that onerous.

Why do I want a "browser for code"?

I may just be out-of-touch, as a purely backend/machine-learning/quality
engineer. Clearly, this is a real need in the front-end / JavaScript space,
but coming from the outside I just don't quite understand yet the value
proposition here. Can anyone explain?

~~~
ordinaryradical
> Why would I use this over something like Python?

That's more a JS vs. Python question than a Deno question, so I don't think
you'll find many answers here, as we'd veer off-topic.

> I find I can do most things I want with the standard library

A standard library exists for Deno as well. Code from any URL is not an
argument against having a standard library. The distinction being made is
whether or not package management should be brittle and baked into the
programming model to accept only one source of truth vs. any. Any source of
truth has the advantage of letting better solutions evolve and reveal
themselves over time, which is great for longevity, standards, etc. etc.

> Why do I want a "browser for code"?

My understanding is that this has a lot to do with reproducing how front-end
code behaves "in the wild." If your runtime works like a full browser, you
aren't tricking your programming model into thinking it's being run in a
browser. I think this has tooling advantages in the short term (i.e.
simplicity) and more subtle ones in the long term.

~~~
oddthink
Thanks. My question really wasn't focused well, and it was easily read not as
"please help me understand" but as "hurr durr Python is better".

I think my reaction comes from the dissonance between people saying "Deno is
the best thing ever for CLI apps!" then following up with "Here are the ways
it's better than Node!", without acknowledging all the other scripting
languages out there. But I think I need to read that first line as having an
implicit "...in the JS/TS ecosystem", and then it all makes sense.

Since clearly one of Deno's draws is that it improves on the package
management of Node, I was wondering why JS projects so often end up with such
a long list of dependencies, compared to the Python/Go/C++ projects that I'm
more familiar with. That's probably explained just by NPM making it easy, so
people did it. (Sort of like how Java grew convoluted DI schemes because OO +
GC + reflection made it possible.)

One of these days I should learn more JS than having read through the Good
Parts once. :-)

------
swagonomixxx
Can someone explain to me what problem Deno and Node are trying to solve?
Is JavaScript really such an amazing language that we need to use it for
backend applications now?
backend applications now?

What niche is it filling that is unfilled by other, more sanely designed
programming languages?

I'm not trying to be obtuse or anything, I'm just really curious as to why
anyone would use Deno (or Node for that matter) for a project when I can think
of at least 4 alternatives that are more sane (Go, Python, Rust, Java, and
many more out there, just that these are the ones I'm most comfortable with).

~~~
franciscop
This seems like trolling more than a genuine question, but I'll answer in case
it is genuine.

It makes things easier for fullstack developers, since it's a single language
on both the front-end and the back-end. You only need to learn once how
`JSON.parse()`, `atob`, classes/prototypes, etc. work, and you don't have to
worry about language inconsistencies causing errors.

It is decently efficient (not the best, not the worst), fairly readable (I'd
say on the upper side). The three main disadvantages IMHO are:

- Its async nature is more difficult to grasp for beginners, compared to the
one thread/request and then linear-ish work of some other languages.

- The Node.js standard library optimizes for flexibility instead of
conciseness or completeness, so there's quite a bit of manual gruntwork
involved (not the worst of those mentioned, though).

- Right now the ecosystem is in a major migration from `require()` to
`import`. The last large migration like this was from callbacks to Promises,
and IMHO it went very well.

As someone who is very comfortable with Javascript, and was comfortable with
PHP and Python back in the day, I would not trade Node.js for any of those. I
am thinking most of the time about what I want to achieve in a coherent way,
and not fumbling with pointers.

~~~
mekster
> - Its async nature is more difficult to grasp for beginners, compared to the
> one thread/request and then linear-ish work of some other languages.

Grasping it isn't the hard part; it's just a matter of placing callbacks on
functions that accept them.

What's so bad is that it is async by default, and now you need a million
"await" and "async" keywords all over the place. I wish it were sync by
default, and for those rare cases where you really want async, you would add a
keyword.

~~~
franciscop
I mean, 4 of the most common Node.js operations are async by nature: database
calls, file access, external API calls, and crypto.

~~~
dragonwriter
They aren't async by nature, just time-consuming. Async is a useful way of
dealing with time-consuming operations in some situations, but blocking calls
also work (and make more sense for some workloads).

------
marcus_holmes
I still think that sandboxing dependencies is the wrong answer to this
problem.

(First, the problem is trusting dependencies: how to spot malicious code in a
dependency, and how to stop an evil upstream maintainer from doing bad things
to your site.)

This solution slows down the system at run-time (and also never quite solves
the problem - sandboxes are pretty leaky. The history of browser extensions
trying to do the same thing is a good example).

I'm not really sure what the solution is. I have some ideas, but they're
mostly about changing the culture around coding, rather than technical.

~~~
lolc
Having sandbox restrictions available allows for defence in depth. When it
comes to running other people's code, VM restrictions provide a solid layer of
protection.

At work we're transitioning our NPM scripts to run in containers. This is very
cumbersome. But the extra layer protects dev homedirs from rando NPM authors.

------
zubairq
Deno is awesome. We had a similar idea at Yazz Pilot and are currently
building out the ability to import a subset of JavaScript code from a URL.
Currently it only supports code of the form:

    
    
        pilot https://raw.githubusercontent.com/zubairq/pilot/master/start.js
    

------
cercatrova
Interesting, so by declaring dependencies in the code itself, which is more
obtuse than using node modules, we might see fewer packages being used overall
by developers, and more implementations from scratch. No more gigabyte-sized
node_modules! I'm looking forward to this.

~~~
searchableguy
Right!

[https://deno.land/x/is_url](https://deno.land/x/is_url)

[https://deno.land/x/blox](https://deno.land/x/blox)

[https://deno.land/x/deno_flatten](https://deno.land/x/deno_flatten)

[https://deno.land/x/ende](https://deno.land/x/ende)

[https://deno.land/x/humanize_url](https://deno.land/x/humanize_url)

There are many others. I don't know why people aren't looking into std (which
provides most of this functionality), or just using a regex where a dependency
doesn't make sense, given Deno is "a browser for code".

~~~
satvikpendem
Well hopefully people find it too annoying to use deno modules and we get some
sanity in development.

------
jolux
I’m still not sure how this addresses Laurie Voss’s point about left-pad. Do
the caching and the lock file happen by default?
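
For reference, my understanding is that caching is automatic but integrity
checking is opt-in via a lock file (flags as documented for Deno 1.x; worth
verifying):

    
    
        deno cache --lock=lock.json --lock-write deps.ts   # record hashes
        deno run --lock=lock.json main.ts                  # verify on each run
    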

------
paulryanrogers
> The Deno CLI works like a browser, but for code. You import a URL in the
> code and Deno will go and fetch that code and cache it locally, just like a
> browser. Also, like a browser, your code runs in a sandbox, which has zero
> trust of the code you are running, irrespective of the source.

Guessing it uses containerization or virtualization, hopefully with no gaps in
security.

~~~
cookrn
I don’t know that it uses either of those as the built-in approach. It
could have changed, but last I read, it used V8’s built-in Isolate concept to
provide the sandbox; when an Isolate is created, it is only given the
underlying system access specified by CLI flags or other options, e.g.
filesystem, network, etc.

~~~
lioeters
According to its architecture description ¹, there are no containers or
virtualization involved in Deno.

I found deno::CoreIsolate in the source ². Userland process isolation seems to
be provided by V8 Isolates.

The execution and security model remind me of a recent trend in FaaS, in
particular running WebWorkers (or similar), WASM, etc. I found a fascinating
presentation about how V8 is used at Cloudflare ³.

"..using V8 isolates instead of containers or VMs, achieving 10x-100x faster
cold starts and lower memory footprints.."

---

¹ [https://deno.land/manual/contributing/architecture#schematic-diagram](https://deno.land/manual/contributing/architecture#schematic-diagram)

² [https://github.com/denoland/deno/blob/2610ceac20bc644c0b58bd8a95419405d6bfa3dd/core/core_isolate.rs#L77](https://github.com/denoland/deno/blob/2610ceac20bc644c0b58bd8a95419405d6bfa3dd/core/core_isolate.rs#L77)

³ Fine-Grained Sandboxing with V8 Isolates -
[https://www.infoq.com/presentations/cloudflare-v8/](https://www.infoq.com/presentations/cloudflare-v8/)

------
vanderZwan
Clear article, I honestly don't have much to add to the main thesis of it,
other than that I'm very curious to see how well the arguments will hold up
when faced with "real world" usage, and how Deno will evolve.

But if I may indulge in a bit of bike-shedding on the side, does anyone else
think that the repeated URLs in the example dependency tree are a bit noisy
and hide the deeper structure of the dependencies?

    
    
        https://deno.land/x/oak/examples/server.ts
          ├── https://deno.land/std@0.53.0/fmt/colors.ts
          └─┬ https://deno.land/x/oak/mod.ts
            ├─┬ https://deno.land/x/oak/application.ts
            │ ├─┬ https://deno.land/x/oak/context.ts
            │ │ ├── https://deno.land/x/oak/cookies.ts
            │ │ ├─┬ https://deno.land/x/oak/httpError.ts
            │ │ │ └─┬ https://deno.land/x/oak/deps.ts
            │ │ │   ├── https://deno.land/std@0.53.0/hash/sha256.ts
            │ │ │   ├─┬ https://deno.land/std@0.53.0/http/server.ts
            │ │ │   │ ├── https://deno.land/std@0.53.0/encoding/utf8.ts
            │ │ │   │ ├─┬ https://deno.land/std@0.53.0/io/bufio.ts
            │ │ │   │ │ ├─┬ https://deno.land/std@0.53.0/io/util.ts
    

I'm not entirely sure what would be the ideal alternative presentation though.
Perhaps something along the lines of one of these two mock-ups I just edited
by hand:

    
    
         https://deno.land/ 
           └─┬ x/oak/examples/server.ts
             ├── std@0.53.0/fmt/colors.ts
             └─┬ x/oak/mod.ts
               ├─┬ x/oak/application.ts
               │ ├─┬ x/oak/context.ts
               │ │ ├── x/oak/cookies.ts
               │ │ ├─┬ x/oak/httpError.ts
               │ │ │ └─┬ x/oak/deps.ts
               │ │ │   ├── std@0.53.0/hash/sha256.ts
               │ │ │   ├─┬ std@0.53.0/http/server.ts
               │ │ │   │ ├── std@0.53.0/encoding/utf8.ts
               │ │ │   │ ├─┬ std@0.53.0/io/bufio.ts
               │ │ │   │ │ ├─┬ std@0.53.0/io/util.ts
    
    
         https://deno.land/x/oak/examples/server.ts
           ├── https://deno.land/std@0.53.0/fmt/colors.ts
           └── https://deno.land/x/oak/
                 └─┬ mod.ts
                   ├─┬ application.ts
                   │ ├─┬ context.ts
                   │ │ ├── cookies.ts
                   │ │ ├─┬ httpError.ts
                   │ │ │ └─┬ deps.ts
                   │ │ │   ├── https://deno.land/std@0.53.0/
                   │ │ │   │     ├── hash/sha256.ts
                   │ │ │   │     └─┬ http/server.ts
                   │ │ │   │       ├── encoding/utf8.ts
                   │ │ │   │       ├─┬ io/bufio.ts
                   │ │ │   │       │ ├─┬ io/util.ts
    

I'm sure there are good arguments to be made against both of these options as
well - plus I just came up with this on the spot, I don't even know which
"rules" would generate such a tree.

------
draw_down
I think this is pretty good. I like the use of URLs as identifiers for your
dependencies, just because it’s so unambiguous. I also like re-exporting deps
manually.

My only real point of discomfort is how the runtime seems to be a bit too
closely tied to the actual download and transport of these files/URLs. Like,
importing from https implying all further imports are from https is nice, I
guess, but is really stretching the browser metaphor imo.

I guess that is really my main point of contention, I agree with the security
model but I don’t agree that websites are particularly analogous to code
dependencies.

------
einpoklum
The post sounds rather JavaScript-specific, or NPM-specific. How well does
that apply to other languages? Especially compiled ones?

------
xixixao
2 reasons why this could be a faulty approach:

1) Code, specifically library code, which is what npm mostly consists of,
should not, in its most ideal state, have a never-ending variety. On the
contrary, we all want to arrive at one good way of doing a thing. This
naturally invites a single central repository over the decentralization of the
web.

2) Very few people publish their own websites. Why? It's complicated and
costly. This is why Facebook and Twitter fare so well. So even the
decentralized web eventually converges onto a centralized platform that solves
the hosting problem. Why reintroduce the problem when there is already a
fairly unproblematic central registry?

If the security model is the real winner, I don't see why it could not be
back-ported to Node - it likely will be if Deno takes off in any way.

~~~
jclulow
I don't have an opinion on Deno per se, but I quibble with this part of your
point:

> Code, specifically library code, which is what npm mostly consists of,
> should not, in its most ideal state, have a never-ending variety. On the
> contrary, we all want to arrive at one good way of doing a thing.

This implies that there is in fact one good way to do most things, but in
practice different people want different trade-offs and forcing them into a
single codebase isn't possible when the options are mutually exclusive. You
can't have a library that is simultaneously strict about types (panics if you
pass a string instead of a number, say) and forgiving about types
(automatically coerces strings to integers or vice versa).

------
gfxgirl
Deno seems like the next MongoDB fiasco waiting to happen. The fact that it
doesn't grant permissions by default is irrelevant; servers need permissions
to function (access databases, access files, access the network, etc...), so
in practice it's insecure by default.

The incentives are all wrong. If I MITM a website, at best I gain access to
the data of the few users that pass through my part of the net. If I MITM
something Deno is using, I get access to the server. That's orders of
magnitude more data I get access to, and therefore the incentive to MITM (or
worse) is much, much, MUCH higher.

~~~
chrisco255
I agree, they should force cached, sandboxed fetching by default and have
people opt in to dynamic fetching.

