Hacker News
What the Hell Is a Deno? (breadth.substack.com)
166 points by harrylucas on June 4, 2020 | 146 comments



> Javascript is great. But... in saying that it has a few quirks and can work in some unexpected ways.

Typescript has just as many quirks[1] as Javascript (in fact more, as it's a superset). I like using it (and it makes JS type-safe-ish), but it's not really some kind of paradigm shift.

Not sure how I feel about import maps. They are quite literally the same thing as package.json. In fact, converting between the two takes about 20 lines of code[2]. I'd bet my bottom dollar that everyone's going to use them, which is going to lead to exactly the same types of problems as Node.
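
For comparison, a Deno-style import map maps bare specifiers to URLs much like package.json maps names to versions (the lodash entry below is illustrative):

```json
{
  "imports": {
    "lodash": "https://unpkg.com/lodash-es@4.17.15/lodash.js"
  }
}
```

The package.json near-equivalent would be `{ "dependencies": { "lodash": "^4.17.15" } }`, plus the resolution logic npm layers on top, which is why converting between the two is so mechanical.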

[1] https://blog.asana.com/2020/01/typescript-quirks/

[2] https://github.com/WICG/import-maps/issues/60


Frankly I find TypeScript with strict mode turned on to be safer and saner than C# and Java because of explicitly nullable types alone. I _never_ get null pointer exceptions in my own TypeScript code. Combined with a fairly strict ESLint config, you get something that catches a lot of problems at compile time.
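
A minimal sketch of what that buys you, assuming `"strict": true` (which enables `strictNullChecks`) in tsconfig; the function is made up for illustration:

```typescript
// With strictNullChecks, null/undefined are not assignable to string,
// so the compiler forces you to handle them before use.
function greet(name: string | null): string {
  // return "Hello, " + name.toUpperCase();
  // ^ compile error: 'name' is possibly 'null'
  if (name === null) {
    return "Hello, stranger";
  }
  return "Hello, " + name.toUpperCase(); // narrowed to string here
}

console.log(greet(null));
console.log(greet("deno"));
```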

Of course it's still a far cry from being as safe as, for example, Rust.

And yeah, the inconsistency of JavaScript/TypeScript can be frustrating for sure. I think my dream language is one that is simply TypeScript cleaned up to be made consistent, one that sheds a lot of the features and retains a simple core.


C# 8 has support for non-nullable reference types.

For lower C# versions and Java you can get your strict mode turned on via static analysis tools.

I have been using Sonar on CI/CD builds since 2008. Static analysis errors break the build, plain and simple.

Also quite convenient for writing sane C and C++ code by the way.


> I think my dream language is one that is simply TypeScript cleaned up to be made consistent, one that sheds a lot of the features and retains a simple core.

I agree. I think this probably looks a lot like a Rust-lite (GC instead of lifetimes for memory management) or a Go with generics or a mature ReasonML?


Have you tried Kotlin?


Kotlin.js perhaps.


Regarding that first link: discriminated unions are awesome in general. It's a pattern that occurs all over the place in real-world code, and having compiler support for it is very nice.


100% agree that TypeScript has its quirks, and I definitely agree that import maps are most likely going to be the way you use Deno.

However there are tradeoffs with everything you use.

It really comes down to which tradeoffs are the right ones for you, which may not be the same for someone else.


For me this is a case of "too little, too late". I've been burned too much over the years by the Node ecosystem to risk investing any further interest in it now, even if it has marginally improved.

I have worked extensively with Typescript, which I consider to be the minimum viable solution within the Node ecosystem. Typescript shouldn't even be seen as a 'nice-to-have'. It should be seen as a required remedy for some of the mistakes of a terribly engineered language. Javascript should never have been considered a serious language for back-end development. Node.js to me seems like someone's 'Frankenstein's monster'-esque experiment that escaped its captivity and wreaked wide-scale havoc on the surrounding world.


This doesn't make sense to me.

Javascript is a breath of fresh air after writing async code in almost every other ecosystem from Go to Java to Python to Swift. Pretty much any async code snippet in those languages can be improved by porting it to Javascript.

Async/await + the ubiquitous Promise make it my go-to choice for writing anything networked. Especially over the other popular dynamically typed languages.
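
The appeal is easy to show in a tiny sketch; the functions and delays below are made up stand-ins for real network calls:

```typescript
// Simulated "network" calls using timers.
const delay = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

async function fetchUser(): Promise<string> {
  await delay(20);
  return "user";
}

async function fetchPosts(): Promise<string> {
  await delay(10);
  return "posts";
}

// Promise.all starts both immediately, so the total wait is the
// slower of the two (~20ms), not the sum (~30ms).
async function main() {
  const [user, posts] = await Promise.all([fetchUser(), fetchPosts()]);
  console.log(user, posts);
}
main();
```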


Well, if so, I don't think it was true until recent JS versions. And I'm not sure you're picking very good languages for your async comparison. (Python? Yikes.) Long-standing async support in C#, newer support in Kotlin, or almost any language with real coroutines will fare better than JS. As far as promises go, using CPS with or without Promise wrappers seems pretty old hat.

More to the point I think, is that Node (and Deno, FWICT) lack general native or green thread support for true multi-processing without serialization to separate clusters, so you are forced to use async and timers for long-running or parallel work.


I'm not being unfair when I enumerate the most popular languages for comparison. When people crap on Javascript, they presumably prefer another language. And C#/Kotlin aren't exactly the top picks.

Kotlin has BYOB coroutines which are hard to work with. People don't use them. Going with the C# approach where async behavior looks sync was a bad move. I predict Kotlin's coroutines will never be a centerpiece abstraction just like how people don't really use Go's channels (people in practice just go back to Mutexes).

I mean, try it. Write the equivalent to this in Kotlin:

    // get background work started now
    const background = backgroundWork()

    // crawl some urls concurrently as well, just 4 at a time
    const crawling = Promise.map(urls, crawl, { concurrency: 4 })

    // while that's going on, we have some work that
    // we must get done.
    for (const task of tasks) {
      await worker(task)
    }

    // worker's done, now we can wait on
    // the crawler and background work.
    const [a, b] = await Promise.all([
      crawling.then(processResults),
      background
    ])
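(Worth noting: `Promise.map` with a `concurrency` option is Bluebird's API, not part of the language. A rough sketch of the same idea with only native promises, names mine:)

```typescript
// Concurrency-limited map: at most `concurrency` promises in flight.
async function mapWithConcurrency<T, R>(
  items: T[],
  fn: (item: T) => Promise<R>,
  concurrency: number,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // Each worker repeatedly claims the next index until none remain.
  // Safe without locks because JS is single-threaded: `next++` cannot
  // be interleaved mid-operation.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker(),
  );
  await Promise.all(workers);
  return results;
}
```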
CSP never caught on because after you have more than one channel as a central bus (the toy architecture), you immediately descend into channel hell. In-channels, out-channels, channels over channels. Back to using pencil and paper and scouring your code to decode the classic buffer bloat problem.

A single-threaded event loop with a central promise abstraction is a great way to write networked code.

Btw, I use Kotlin in a large JVM project and I'm stuck with the horror of https://docs.oracle.com/javase/8/docs/api/java/util/concurre.... That's more likely what you'll be doing day to day with Kotlin, not playing with its toy coroutines.


What's wrong with something like this?

  import kotlinx.coroutines.*
  import kotlinx.coroutines.channels.Channel

  fun background(): Channel<String> {
      val ch = Channel<String>()
      GlobalScope.launch {
          delay(5000)
          ch.send("OK")
      }
      return ch
  }

  fun getWebsiteData(url: String): Channel<String> {
      val ch = Channel<String>()
      GlobalScope.launch {
          delay(4000)
          ch.send("website data")
      }
      return ch
  }

  fun work(): Channel<String> {
      val ch = Channel<String>()
      GlobalScope.launch {
          delay(1000)
          ch.send("work result")
      }
      return ch
  }

  fun main() = runBlocking {
      println("START")
      val bck = background()
      val urls = listOf("www.web1.com", "www.web2.com", "www.web3.com", "www.web4.com")
      val chs = urls.map { getWebsiteData(it) }
      generateSequence { work() }.take(4).forEach {
          println("sync job done: " + it.receive())
      }
      chs.forEach {
          println("data fetched done: " + it.receive())
      }
      println("background done:" + bck.receive())
      println("END")
  }


Re: Kotlin, I'm not a JVM expert/fan, so I'll take your word for that. As for the lower-level coroutines, isn't the point that they allow you/library-writers to abstract over them and provide the higher-level abstractions that you want to use?

Regarding the "C# approach where async behavior looks sync", I don't follow you here. C# is very explicit, with async functions returning Task<>, whereas with Go's lack of "colored" functions, for example, you don't really know when your code is async or not.


Well, the async/evented execution model, which omits synchronized blocks, complex "happens-before" semantics, and shared memory a la Java (all of which JavaScript and V8 lack), is the entire point of node.js and libuv. I agree that it doesn't fit typical complex business logic with expectations of some level of isolation, but then node.js isn't a good fit for those kinds of problems.

Node.js is based on CommonJS, and there are/were alternative implementations of a CommonJS runtime, including process-per-request implementations like v8cgi/TeaJs, and implementations based on Rhino (Mozilla's venerable JavaScript engine written in Java) such as Ringo, which can call into the JVM and do multithreading. Complaining about this in node.js is really complaining about your own decision to use node.js. And multithreading isn't great for these workloads either; it was originally invented for coroutines in desktop apps.


> Complaining about this on node.js is complaining about your own decision to use node.js really

Yes - but there isn't much choice in the mainstream. Sure, I'd rather use .NET Core/Kestrel, but if you want back-end JS/TS then Node is it, unless your org lets you experiment with Deno or other.


I dislike javascript for much the same reason you love it.

Promises just mean having to manually build and manipulate cooperatively multitasking green-threaded call-stacks. It was a dirty necessity following the inability of the earlier callback patterns to manage the level of complexity people were attempting to express in the language.


Even Rust implemented async/await. It's one of the few nice ways to write asynchronous code.

People who think that it's "manual" simply haven't yet discovered the trade-offs of what they think is their preferred solution. There is no free lunch here.

It's like sniping at people who chose different prongs of the CAP theorem while thinking you're somehow in a superior position.


As someone with most of my coding experience in JS/TS, what would you say is a good alternative to explore?


Go with goroutines and channels is a nice way to handle concurrency.

At least it was the one that was easiest for me to wrap my head around and actually improved the performance of my code without weird race conditions or bugs.


Go concurrency is basically threads + the ability to choose between classic mutex synchronization (and all the problems with that like reentrancy bugs) or burn yourself in channel hell with deadlocks and buffer bloat.


Have to agree with you there. I was pretty hyped for Go some 4-5 years ago but discovered its parallelism/concurrency model was essentially mutexes / condition variables / bounded queues when you get down to it. It's an improvement over doing it manually -- but definitely not a big one.


Erlang / Elixir.


As someone who mainly writes Scala, I really enjoy working with effect types that are also monads. Cats effect, Monix and ZIO are some examples.


Erlang.


If you think async/await is the holy grail of networking, you're in for a surprise in the next few years.

The problem with async/await: http://journal.stuffwithstuff.com/2015/02/01/what-color-is-y...

A solution proposed by none other than Java (available as an experimental feature in JDK 15): http://cr.openjdk.java.net/~rpressler/loom/loom/sol1_part1.h...

This seems to me to obsolete async/await, quite honestly.


Don't fall into the trap of thinking that just because you've found a blog post that could enumerate the downsides of something that the alternatives don't have the same trade-offs.

I've responded to that ancient blog post many times on HN by now. Since you must think that blog post is pretty good criticism of async/await, here's a little challenge for you: which language do you think doesn't have an equivalent of the "red vs blue" problem? :)


> Don't fall into the trap of thinking that just because you've found a blog post that could enumerate the downsides of something that the alternatives don't have the same trade-offs.

I've been using async/await for a long time and I thought it was great, but since I started using JDK 15's virtual threads, I feel like they're superior in every way. If you have some trade-offs you would like to mention, please do.

> which language do you think doesn't have an equivalent of the "red vs blue" problem?

Any language that distinguishes between async and non-async functions will have this problem. I believe one language that works around it is Erlang, as Erlang seems to do everything async (though it's not visible to users; I'm not sure). The proposed Java virtual threads solve that problem as well. You should open your mind to new ways of doing things; this is by no means a solved problem, as you so strongly seem to believe.


A link to one of your responses would be spiffy.


I skimmed the Loom article and I don't see how it obsoletes async/await. It seems primarily focused on performance.

But that's not the problem that async/await solves for me. I like JS concurrency because shared-memory concurrency is very hard to program correctly. By using a single-threaded event loop with async/await, I know exactly when it's possible for the contents of memory to change out from under me: only when I `await`. This makes it much easier for me to reason about the correctness of my application.
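
That invariant is easy to demonstrate in a small sketch (the `await Promise.resolve()` stands in for any real async call):

```typescript
let counter = 0;

// No `await` between the read and the write, so in a single-threaded
// event loop no other task can interleave: this never loses an update.
function incrementSync() {
  const current = counter;
  counter = current + 1;
}

// Across an `await`, other tasks may run and change shared state:
async function incrementAsync() {
  const current = counter;
  await Promise.resolve(); // suspension point: interleaving possible here
  counter = current + 1;   // may clobber updates made during the await
}
```

Running two `incrementAsync()` calls concurrently exhibits a classic lost update (both read 0, both write 1), and the `await` marks exactly where that became possible.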

Given that Node.js makes it easy to spin up processes on multiple cores (e.g., with a library like worker-farm), I get full CPU utilization without the safety and liveness problems that shared-memory threads have. This is very nice.


> If you think async/await is the holy grail of networking,

They wrote that it's better, not perfect.


Why do you need async for "anything networked"? Genuinely asking.

There are many ways of doing I/O, but nowadays everyone seems to do everything async without giving it a second thought.


It's more about the concurrency abstractions available. And half the joy of writing async code in JS is that it's single-threaded.

Doing many things at the same time is inherent to doing networked stuff. How ergonomic that is going to be is one of the only real things that sets languages apart.


> Async/await + the ubiquitous Promise

There's a legitimate concern about async/await: it doesn't have an inbuilt cancellation mechanism. That may not be a big thing for the server, but it's definitely a big thing for the client.

See, for example: https://twitter.com/getify/status/1171820070538022914
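
For what it's worth, the standard `AbortController`/`AbortSignal` pair (available in browsers, Node, and Deno) gives a cooperative workaround, though it's plumbing you wire up yourself rather than something built into async/await. The `sleep` helper below is illustrative:

```typescript
// A cancellable async operation: the caller passes an AbortSignal,
// and the operation is responsible for honoring it.
function sleep(ms: number, signal?: AbortSignal): Promise<void> {
  return new Promise((resolve, reject) => {
    const t = setTimeout(resolve, ms);
    signal?.addEventListener("abort", () => {
      clearTimeout(t);              // stop the underlying work...
      reject(new Error("aborted")); // ...and reject the awaiting caller
    });
  });
}

const ctrl = new AbortController();
const work = sleep(10_000, ctrl.signal);
work.catch((e) => console.log("cancelled:", e.message));
ctrl.abort(); // the caller decides to cancel
```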


Free cancelation has been a pipe dream since day one of computer science.

We don't want it so badly that we want to pass around poison channels or cancelation tokens. People don't even bother threading cancelation contexts in Go because it's annoying and you still have to write disposal logic for anything worth canceling (which usually isn't possible anyways -- e.g. can't undo that database query that's in flight).

There are cute things you can do with generators in the UI where cancelation cascades make a lot of sense, but notice how nobody actually cares enough to use redux-saga nor this guy's library.


> There's a legitimate concern about async-awaits — they don't have an inbuilt cancellation mechanism. May not be a big thing for the server;

They are definitely a big thing for the server.

The lack of a cancellation mechanism and pre-emption, plus the single-threaded nature, are the main reasons why the P95 latency of Node.js apps tends to get horrible under load.


Terribly engineered because it's not type checked, or for some other reason?


Many reasons, some of which Dart fixes. Too bad it hasn't caught on. Now it's a language to build mobile apps with. Maybe it will be usable for cross-platform desktop GUIs as well. And hopefully some of its sane language features will be ported to future versions of ES.

https://medium.com/flutter-community/the-ultimate-javascript...


Deno's existence is partly based on the idea that Typescript would need to be a first class citizen in whatever Node's successor will be.


I like the concept of Deno, and especially love the shift away from NPM. NPM was probably the main reason I never fully embraced node.js. I truly hated the bloat of the modules folder, and the impending fragility the dependencies would bring.

I like the idea of a standard library. This will hopefully only improve with time. It sort of brings the ease of use factor of PHP to a server side JavaScript environment.

I'll be watching Deno very closely these next few months!


The set of core libraries like express (fka connect) + dependencies is the stdlib. It was developed in early node.js days as part of CommonJS, which was an initiative for a portable/modular server-side JavaScript runtime with many implementations besides node.js (RingoJS, Narwhal, Helma, TeaJs/v8cgi). The existence of a community-driven lib effort, the portable nature of JavaScript, and the existence of choice was the entire reason I even looked into node.js.


Other languages had all that, plus a better standard library, plus a type system, for years before Node was a thing.

Node got popular because it enabled frontend devs that only knew JavaScript to move easily to the backend, not for any technical reason.


I definitely agree, the insane dependency chains really make me uncomfortable.

However, not every project will involve pulling in such crazy stacks. I recently experimented with Hapi + TypeORM. Hapi's core goal is to use only what they directly maintain, and I think it works really well! Node_modules is still not tiny, but it's at least an order of magnitude better than what I'm used to.


Deno is great! It makes it possible for us to write TypeScript and execute it really fast, without a build system to worry about!

The only thing that, to me, is a big problem is that even though you have types in your TS code, you're basically throwing them away at runtime, wasting huge optimisation opportunities that even V8 can't recover. If V8 had support for strictly typed TS code, can you imagine how fast it could run?! I think that's the next stage in the evolution of JavaScript/Node/Deno: Node -> Deno -> Done.


I'm no expert on compilers/interpreters, but wouldn't the added overhead of type checking cause things to slow down (genuine question)?


No, because the type checking is already done at compile time: the compiler just fails to carry the type data into the runtime, where it could enable heavy optimization. V8 apparently infers types on the fly to make JS run fast, so if it had pre-declared types it could run the faster version of the code from the start.


Ah, understood. So, almost like Java compiling a class to bytecode before running in the JVM?


Yes, something like that. Interestingly enough, Dart has a mode to compile to the AST, so that when you run it, the interpreter does not need to parse the text again and check the types, it just starts straight off an AST. The TS compiler and V8 could definitely do something similar.


I am the author of Pogo, a web server framework for Deno that has friendly APIs and is secure by default. It supports React out of the box and has the best documentation of all the frameworks.

GitHub: https://github.com/sholladay/pogo

Video tutorial: https://www.youtube.com/watch?v=Fe4XdAiqaxI


> "and is secure by default"

Extraordinary claims require extraordinary evidence.


I think you misunderstood me. I simply meant that the default configuration optimizes for security. For example, Deno and other frameworks listen on `0.0.0.0` by default, which is convenient for development but is not worth the security concerns, in my opinion. Instead, Pogo uses `localhost` unless you explicitly override that setting, meaning much less risk of accidentally exposing your server publicly. This is not some revolutionary feature, it's just attention to detail that I think you'll notice cumulatively.

Additionally, I would like to pay for a thorough security review when we have more features and users. I doubt any of the other frameworks will do that as it's extremely rare in OSS. Of course, that means very little until it actually happens. But know that my intention is to deliver the first Deno framework that I would personally feel comfortable using in production.


any plans to support angular out of the box too?


Not at the moment, but I'm open to suggestions. Angular uses some custom syntax, right? Deno supports JSX natively, which makes it easier to support React. Do you know if there is a way to use Angular's syntax in Deno? What would the server need to do to make Angular development more convenient?


Deno's sandbox security is somewhat similar to the Mandatory Access Control (MAC) implemented by SELinux and AppArmor, but it looks like it's not as fine-grained as MAC. In the example:

    deno run --allow-net myWebserver.ts

With SELinux, one can specify the port range and network interface that the application is allowed to access. It also provides an audit log that can be examined by the admin. Maybe there is no need to reinvent the wheel; just use some form of MAC if you really care about security.


Am I wrong in thinking that this example specifically does not protect against the threat posed immediately preceding it? As in, one is running a script that foolishly imports a nefarious package that uploads tasty environmental variables to an evil server, which it can do when network access is not controlled. Well, what if myWebserver.ts imports that package? A more fine-grained approach that limited network access by source file might be valuable.


> A more fine-grained approach that limited network access by source file might be valuable.

This is what I think I'd like to see as well. The most common case isn't that I don't trust the program I'm running, it's that the level of trust for my dependencies plus their dependencies is essentially opaque.


That's my impression too (see my other comment).

Each package published to Deno could come with a set of declared permissions (similarly to Android apps).

When importing the package in a module, Deno should detect that the permissions scoped at the current module level are wider than what the package requires, and automatically narrow down the list of authorized calls.

This would probably be very costly. Suppose that I'm importing a function from lodash (that requires no permissions) and my module calls it repeatedly while also accessing the file system...


From the docs at https://deno.land/manual/getting_started/permissions:

> --allow-net=\<allow-net> Allow network access. You can specify an optional, comma separated list of domains to provide a whitelist of allowed domains.

So it seems to allow for a bit more fine-grained configuration than just opening up everything.


Deno definitely looks interesting!

All the examples I've seen so far use the 'deno' command to run .ts or .js scripts. Can it also package standalone binaries like Go/Rust, to make a CLI tool, for instance? If so, how do permissions work in that scenario? Does the user need to grant permissions on every invocation, or is there some way to whitelist a script/binary?


Packaging to a standalone executable is being worked on: https://github.com/denoland/deno/issues/986


Nice! Will be interesting to see how permissions work in that case.


My gripe with npm was the lack of a lock file.

- Yarn helped solve that, but because of its backwards compatibility with node_modules, you could not have different versions sitting side-by-side.

- Node_modules could have a different version installed vs lock file and no one would know without looking.

It seems Deno is able to solve the side-by-side versions problem by distributing the 'lock' into the file itself. The Deno team is trying to create a 'map' file to consolidate the 'distributed version' issue.

Sadly, Ruby's Bundler has solved this for years and while I love TypeScript, I'm always saddened by the state of package management in the Node space.

I'm not saying Bundler is perfect, but its canonical lock and ability to have side-by-side versions allow me not to think about that issue.


> - Yarn helped solve that, but because of its backwards compatibility to node_modules, you could not have different versions sitting side-by-side.

> - Node_modules could have a different version installed vs lock file and no one would know without looking.

> Sadly, Ruby's Bundler has solved this for years [...]

I don't understand your first point. Different projects can use different versions since the modules are installed locally (inside the `node_modules` directory). And nested modules can also have different dependency versions, e.g.:

    A
    => depends on B @ 1.0
    => depends on C, and C can depend on B @ 2.0
Regarding your second point, I haven't ever seen that happen in practice, and IIUC it's mostly a property of the fact that `require 'bundler/setup'` checks your dependency versions; you could implement something similar for JS (e.g. traverse node_modules directories recursively, checking that the versions declared in the package.json of dependencies match the ones in your root lockfile).
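
A rough sketch of what such a user-level check could look like against an npm v7+ style lockfile. The function name and message format are made up, not an npm API:

```typescript
import * as fs from "fs";
import * as path from "path";

// Compare each installed package.json version against the lockfile.
// npm v7+ lockfiles list installed paths under the "packages" key.
function checkVersions(projectDir: string): string[] {
  const lock = JSON.parse(
    fs.readFileSync(path.join(projectDir, "package-lock.json"), "utf8"),
  );
  const mismatches: string[] = [];
  for (const [pkgPath, info] of Object.entries<any>(lock.packages ?? {})) {
    if (pkgPath === "") continue; // "" is the root project itself
    const manifest = path.join(projectDir, pkgPath, "package.json");
    if (!fs.existsSync(manifest)) {
      mismatches.push(`${pkgPath}: not installed`);
      continue;
    }
    const installed = JSON.parse(fs.readFileSync(manifest, "utf8")).version;
    if (info.version && installed !== info.version) {
      mismatches.push(
        `${pkgPath}: expected ${info.version}, found ${installed}`,
      );
    }
  }
  return mismatches;
}
```

Run at startup, an empty result means node_modules matches the lockfile; anything else is exactly the "version drift" that usually gets fixed with `rm -rf node_modules`.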

Since we're on the topic of Ruby and JS, Ruby's module system is probably one of the worst I've ever seen and JS one of the best.

In Ruby, just like in Python, everything in a file is public by default and the only way to make things private, AFAIK, is using Module#private_constant, and that only works for methods/inner classes/things in a class scope.

And, unlike Python's import, require is side-effectful! If you have file a.rb that requires b.rb, and b.rb requires c.rb, everything in c.rb will be visible in a.rb. This is terrible.

JS's module system is one of the best IMO (better than haskell, python, java, etc):

- simple mental model: a file is a module

- everything in a module is private by default, you have to explicitly mark things with `export` to make them public

- You can either qualify imports or explicitly import individual functions, so it's always possible to find out where something is defined by simply looking at a file. Most languages fail here. This is useful for beginners and in places where you don't have an IDE available, like GitHub
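
A tiny example of the model; the file contents below are illustrative:

```typescript
// A file is a module: top-level names are private unless exported.
const secret = 42; // invisible to importers

export function double(n: number): number {
  return n * 2;
}

// An importer must name what it uses, so origins stay explicit:
//   import { double } from "./math.ts";   // named import
//   import * as math from "./math.ts";    // or qualify the whole module
```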


> Different projects can use different versions since the modules are installed locally (inside the `node_modules` directory)

I'm speaking about within the same project. It's not hard to have problems over time when node upgrades (for example[0]) or to get a different version than expected.

Any project that's lived long enough runs into some sort of version mismatch where the solution is `rm -rf node_modules`.

Deleting and reinstalling the package folder as a regular fix is symptomatic of a deeper package issue.

Deno solves parts of this by giving module versions their own explicit folder. My concern is that if it still stores the package locally, you can still run into a deno version mismatch.

.rbenv + Bundler's folder structure has been `.rbenv/versions/2.6.5/lib/ruby/gems/2.6.0/gems/mime-types-3.3`

The version of ruby and the version of the gem are explicit allowing separation.

Again, far from perfect, but this keeps out so many problems.

> Since we're on the topic of Ruby and JS, Ruby's module system is probably one of the worst I've ever seen and JS one of the best.

This thread is about package management. While fair criticism, it's too sideways.

[0] https://stackoverflow.com/questions/46384591/node-was-compil...


Your original point was:

> you could not have different versions sitting side-by-side.

bundler can't do that either. You can't depend on both rails 5 and rails 6 in a single package. Most languages can't do that.

> Any project that's lived long enough runs into some sort of version mis-match where the solution is `rm -rf node_modules`.

I agree, but that's not the only solution, as I've said you could write something similar to require "bundler/setup" in JS that does version checking.

> The version of ruby and the version of the gem are explicit allowing separation.

You can specify the node version in your package.json.

EDIT: on the version checking point, I agree that this is a deficiency of npm. It probably should ship something similar to bundler/setup by default and encourage users to do

    require('npm/validatePackageVersions') // or import 'npm/...' in es6
in their top level code.

I was just pointing out that this is not a fundamental limitation of npm, and it should be fairly easy to implement in user-level code


> bundler can't do that either. You can't depend on both rails 5 and rails 6 in a single package. Most languages can't do that.

You're right. Originally I was speaking about package versions which deno does solve, but then I brought in node versions w/o explicitly stating so.

That's managed/wrapped at rbenv's level which I hope deno can come up with a way to solve it. But looking at deno briefly, it appears the packages are still stored locally which leaves the deno version mismatch a possibility still.



Does package-lock.json not fill this void? It's been a thing for a few years IIRC.


Partially; like yarn.lock, it only solved half the problem. The other half is being able to have multiple versions installed at the same time and freely, confidently referencing the version I want.

node_modules can only have one version, and it's not hard to have version drift even while having a lock. The standard answer is to do the `rm -rf node_modules` & install. Often that fixes whatever problem crept in.

Blowing away a package directory to solve problems for years should not be the answer.


> The other half is being able to have multiple versions installed at the same time and freely, confidently referencing the version I want.

It's not well-known, but it is possible:

    "dependencies": {
      "sodium-native-2": "npm:sodium-native@2",
      "sodium-native-3": "npm:sodium-native@3"
    }
> node_modules can only have one version and it's not hard to have version drift even while having a lock.

Don't get me wrong, npm is haunted, but I use it daily and can't remember having experienced "version drift". The only reason I have to `rm -rf node_modules && npm install` is that `npm update` (even with --depth) doesn't do its job, so if you want to update all deep dependencies then you have to blow up your lockfile.

(Btw, if you are experiencing some "version drift" problem, I'd recommend `npm ci` as an alternative to `rm -rf node_modules && npm install`.)


I assume that permissions are given at application level, not at module/import level?

This means that if I write an application that requires filesystem access and has external dependencies, I'm essentially giving them access to the filesystem even if they don't need it.

These dependencies could silently check whether they have permissions and do something fishy only if that is the case.

It would be nice to be able to import dependencies in a nested sandbox but I guess it is not a simple problem.


What deno wants is something like caja - https://en.m.wikipedia.org/wiki/Caja_project - object-capability security for JavaScript. Sadly I believe the caja project was not successful because it is very hard to avoid ambient authority in JS without becoming incompatible with everything.


I haven't looked into it, but TFA suggests you can do it call by call.


Well I'm not sure. The doc says:

> Access to security sensitive areas or functions requires the use of permissions to be granted to a deno process on the command line. [1]

The only other mention of permissions in documentation is that a program may query or revoke permissions.

[1] https://deno.land/manual/getting_started/permissions

[2] https://deno.land/manual/examples/permissions

EDIT: formatting


Deno is on my list of tools to play with. Its sandboxing capabilities are the main selling point for me, because they will allow executing semi-trusted/un-trusted code on private data sets. If I know that the code can't access the network or do anything fancy with the file system, then I can treat the untrusted code as a pure function and know that the output will only depend on the input. This is a very desirable property, and I'm looking forward to the type of code and data sharing it will enable.


Sandboxing has been possible for many years before Deno. Running untrusted code is not popular for good reasons.


I'm aware but Deno allows more programmers to take advantage of those capabilities in a way that will make sense to them which will lead to more and better applications. I think every programmer intuitively understands what network and filesystem access means. Deno has made sandboxing much more approachable and since those capabilities are front and center instead of some hidden feature more people will take advantage of them to structure their applications.


The example on permissions isn't great. It says the library you use cannot read your database password from the env and send it over the internet unless you allow it to.

But when you're putting the database password in the env, you most likely need to grant env-var and network access to your database client library anyway. At that point, the library in the example can do exactly that malicious thing.

The problem here is that Deno requests permissions per process, not per library import.


One thing you can do is explicitly revoke permissions. This allows you to start the program with more permissions and then give them up as the tasks that required them are done, e.g. allowing environment variable access at the beginning and then revoking it before you start a server: https://deno.land/manual/examples/permissions

It's not as fine-grained as allowing libraries specific permissions, but it gets you part of the way there.
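A minimal sketch of that revoke pattern, assuming the Deno runtime (`Deno.permissions` was still behind `--unstable` around this time, so treat the exact API shape as an assumption):

```javascript
// Sketch only: requires the Deno runtime (the `Deno` global doesn't exist in Node).
async function readConfigThenLockDown() {
  // Read the secret once, while env access is still granted.
  const dbPassword = Deno.env.get("DB_PASSWORD");

  // Give up env access for the rest of the process lifetime.
  await Deno.permissions.revoke({ name: "env" });

  // From here on, any Deno.env.get() call, including one made by a
  // compromised dependency, fails instead of leaking the secret.
  return dbPassword;
}
```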


I think the network permission lets you specify which hosts the process is allowed to connect to. So you could limit it to only being able to reach the DB server. It would still be an issue for things that need global internet access, though.
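For example (hypothetical host name), the `--allow-net` flag takes a comma-separated allowlist of hosts, optionally with ports:

```shell
# Only the database host is reachable; all other outbound connections are denied.
deno run --allow-net=db.internal:5432 app.ts
```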


> Both the browser and node.js use the same engine - the V8 engine.

RIP every other engine, I guess?


> RIP every other engine, I guess?

In practice, pretty much :(


React Native uses JavaScriptCore, the JS engine for WebKit, at runtime.


Only on iOS right? Or is it embedded everywhere?


Embedded in iOS & OS X, but one of the few that's easy to get a build of (or compile) on Linux, Windows, Android.

Chakra also builds easily on OS X and iOS, and is embedded in Windows.

V8 is a PITA to build & get a build of. (Wish it wasn't)


Everywhere. Even the android builds use JSC or Hermes instead of V8


Oops that was definitely a mistake on my side! Good pickup though, I've updated it to include a link to a full list :)


Do permissions in Deno propagate to all dependencies recursively? Like, if I grant filesystem access to a top-level script, do all its imports inherit that permission too?

If so, I can see this type of system being mostly worthless.


Yes... it would actually be quite amazing to have different libraries in different sandboxes with defined communication channels.

The browser actually does something quite a bit like this with iframes. Iframes are sandboxed and can only communicate through postMessage. There's more to it but at a simple level it looks like this.

Chrome nowadays even runs iframes in a separate process! Finally... https://www.chromium.org/developers/design-documents/oop-ifr...

This is actually quite impressive because it presents a decent illusion to JS that all frames are running under the same thread.


You could implement this by fine grained imports and subprocess execution. Node.js actually has a very nice sub-process communication API: https://nodejs.org/api/child_process.html#child_process_subp....

At some point I remember writing some gpg wrappers with Node.js and I remember the subprocess API being one of the more pleasant ones to work with. In the case of more stringent Deno process sandboxing, the parent process would spawn another Deno process with a smaller set of capabilities.


Deno uses the web standard Worker API to implement sub processes. They are also working on fine-grained permissions for these workers [1].

[1] https://github.com/denoland/deno/issues/4867


Good to know and even better than my proposed solution then. If the language supports it directly then there is no need to write sub-process shims for managing permissions.


Doesn't seem to limit them?

Using something like Caja https://developers.google.com/caja/docs/about might work, using object capabilities rather than ambient privileges. Not sure if Deno helps at all there though.


Some random comments:

- Good to see native TS support.

- Would be nice to see a builtin maven/gradle-like standard build system; IMHO not having one in node.js was a major letdown for me. Sure, you have gulp/grunt, but having to write repetitive code (which can be buggy!) for running a compiler/tests/packager? With yet another set of plugins? Just give me some standard tool and let me call the equivalent of "mvn package", which will compile/test/package my application with sane defaults.

- The dependencies-as-URL approach is going to be hit hard as soon as some big-fish corporation wants to use Deno for in-house projects. What if those corps disallow calling https://deno.land/my/dep and require an internal repo? Now all 3rd-party dependencies won't work unless manually modified to use the internal corp repo.

- I predict that the --allow-this --allow-that flags will evolve into SecurityManager-like complexity, which only the early enthusiasts will understand; the rest will simply pass the equivalent of "--allow-all" and let the ops deal with the security issues.


Build systems are generally required for frontend code, and that's a bit out of scope for a project like this. Even if it weren't, the frontend landscape is wild and ever-changing; arriving at sensible, useful defaults would be very hard.


Node with a hat and fake moustache.


One point that was not mentioned: Deno is distributed as a single binary. Combined with the security options, I think it's a great feature for small scripts in an enterprise context.


The security stuff for NodeJS is really frustrating. If anything, NodeJS is more secure than something like the JVM or C++. If I include a 3rd-party package on the JVM, I have absolutely no guarantee that it will behave well, much like in Node. In fact, in Node I can actually read the source code and see what the package I'm running is doing. In nearly every other environment, you may simply have access to a binary, with maybe some interface info.

So why do people not throw the same kind of fit about nearly every other programming environment as they do for Node/NPM? And frankly, why do those other environments not have the ridiculous security breaches we have seen in Node/NPM land?

The real problem with Node/NPM, I suspect, is the lack of a standard library. Simply having a standard library would have greatly reduced dependency and package hell. Further, a standard library would mean people would be more willing to write a little more code rather than pull in a new dependency.


Because in those languages:

. dependencies are carefully considered by users

. dependencies are not added recursively

. dependencies try to be dependency-free themselves to assist with the previous point

. dependencies are not blindly nor automatically updated

. dependencies solve important domain problems, they are not trivial one-line-functions

. dependencies are typically developed and tested by a known team or company, which you trust, not just someone random

. binaries can be signed

. support contracts are a thing

. etc etc etc...


This is 100% true, especially for libraries that are someone's class project. And JavaScript developers are so used to dependency hell that one of my developers imported a 3rd-party package for date formatting.


This seems similar to the Bootstrap situation, where you use the same frameworks every time you build something in a language so that it's quick and easy for developers to work on all of your company's projects after learning just one. Using the same familiar libraries is great for reducing the time it takes to train someone to work on and maintain a large portion of your company's tech.


JS's built-in date formatting/handling is terrible and often doesn't do what needs to be done.

MomentJS may be a giant import, but it works, and it works really well.


Formatting is fine. Take a look at `Intl.DateTimeFormat`[1]. Then there is a proposal for `Temporal`[2] which will make handling a lot easier.

[1]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

[2]: https://github.com/tc39/proposal-temporal
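For instance (arbitrary example date), `Intl.DateTimeFormat` covers simple locale-aware formatting with zero dependencies:

```javascript
// Locale-aware date formatting with the built-in Intl API, no library needed.
const fmt = new Intl.DateTimeFormat("en-US", {
  year: "numeric",
  month: "long",
  day: "numeric",
  timeZone: "UTC",
});

console.log(fmt.format(new Date(Date.UTC(2020, 4, 31)))); // "May 31, 2020"
```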


It’s the closest thing we have to a useful standard library. Date handling in js without a library is a code smell.


I am confused: are you using MomentJS for fancy output like "3 days ago" etc., or for simple output like 5/31/2020? I can see how it is useful in the former case, but it seems overkill in the latter.


Safari is lacking in support (again) but we have `Intl.RelativeTimeFormat`[1]

    new Intl.RelativeTimeFormat("default").format(-3, "day")
[1]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


the parent said something about problems being solved if there was a standard library, and if perhaps there were a standard library people would be willing to write more code instead of just adding another dependency.

I believe these points:

. dependencies are carefully considered by users

. dependencies try to be dependency-free themselves to assist with the previous point

. dependencies solve important domain problems, they are not trivial one-line-functions

. dependencies are typically developed and tested by a known team or company, which you trust, not just someone random

would be solved by the parent comment's proposed standard library.


A good standard library helps, no doubt.

However, it is not required. One of the languages mentioned was C++. That language has a tiny standard lib in comparison to Java.

So it is mainly a "cultural" thing and how projects are structured and reviewed.


C++ had a tiny standard library.

That’s why Boost exists (although much of its functionality has been subsumed into the std lib now).


True to an extent, but consider Python which has a standard library and has also seen some of these same types of security breaches.


What you’ve said has nothing to do with languages. Claiming X devs are better than Y devs is just your bias. Other languages can be more domain-specific, have more friction around using package management, and have fewer developers.


I have nowhere claimed anything about devs being better/worse.


Yeah, I feel like Deno will reduce dependency usage, and people will hurrah and say "look, using URIs as deps actually worked to make things easier!", when in reality the reason dependency hell freezes over is because Deno actually has an STL.


I agree. The lack of a standard library in Node.js means everybody has to re-invent the wheel, or find one on npm.

I believe there was a time when C++ did not yet have a standard library. But now it does. JavaScript should have a standard library, not "Deno".


The problem with adding to JS’s stdlib is that it’s interpreted. If the next ECMAScript version provides new features in the stdlib, it’ll take a while for browsers to adopt it, and during that time, you’ll need polyfills.

Compare that to a new version of C++ where you just update your compiler, and the executable runs (almost) anywhere. No polyfills to run it.


I think the problem is less that it is interpreted and more that there are many different interpreters for different platforms, all of which are competing.

Python is also interpreted, but there isn't much problem changing the stdlib because CPython runs pretty much everywhere you need it to. Sure, there are other interpreters like PyPy that also need to implement changes to the lang, but it's not a show-stopper like it is with browsers.


> If I include a 3rd party package in the JVM, I have absolutely no guarantee that it will work well, much like in Node.

In the JVM you can use the SecurityManager [1] and limit file access and access to similarly sensitive areas. If you want, you can fully guarantee that nothing is accessed behind your back.

Of course that builds on the JVM not having a zero-day bug.

[1] https://en.wikipedia.org/wiki/Java_security


There are all kinds of things NPM users can do to mitigate security problems. The only interesting question is what the default is.


I remember 10 years ago (?) people were already complaining about how crazy installing some random packages via pip is instead of installing them via distro's package manager repo. The packages from distro's repo might be out of date but at least someone already vetted them they said. If someone with a time machine go back and tell them what we'll do with npm and docker today they'll probably quit programming on the spot.


I'm not too sure, but in Java you depend on specific versions and the packages are signed. And most companies have an internal repo they work off of, in case the public repo is having downtime or the package gets removed from it. Also, deployments don't make use of dependencies; a single uberjar bundles it all up.

Was this all true of NPM as well?


> Also deployments don't make use of dependencies, a single uberjar bundles it all up.

That's not actually completely right... there are problems with deploying a single jar. With Java 9 modules, you're actually throwing away module encapsulation if you deploy a uberjar. The current state-of-the-art is to deploy the whole app + the JVM in a jlink image, which requires no uber jar.


Deno is the Java-fication of JS. I'm sure there will be a Node-like tide of "JS is better than Java now that we have X", even though Deno brings JS closer than ever to Java. Despite being cynical, I don't think it's a bad thing. Java does a lot of things right.

Deno -> Java

Runtime security options -> Security Manager.

URL based packages with simple HTTP-> Maven works same way.

Bigger standard library -> Java's is huge.

Types -> Java, yes.

Single executable -> Fatjars.

All the features mentioned in this article have been in Java for over a decade. It's relieving to see a JS runtime that finally gives in to enterprise niceties. Us Java devs like to crap on JS for reasons besides being "boomers". The features Deno brings were all real reasons to use Java instead of JS up until this point.

Now if they would only fix threading, I would consider Deno/JS a real contender for backend dev


Funny, I thought it was the Go-fication of JavaScript. Look at the 'contributing' section of their stdlib: "deno_std is a loose port of Go's standard library. When in doubt, simply port Go's source code, documentation, and tests. There are many times when the nature of JavaScript, TypeScript, or Deno itself justifies diverging from Go, but if possible we want to leverage the energy that went into building Go. We generally welcome direct ports of Go's code."


For Deno this seems like a very good choice. Pick the good parts from Go and reuse them. I applaud Deno for going its own way with the module system though... That's been a hot topic in Go for a very long time


I love how browsers & Deno have shown that we can do dependency management using URLs that point at code (ESM). This is a huge step forward.

This makes me think: is a file-based code-splitting strategy enough?

What if we'd put code (not data) behind GraphQL and request only the actual piece of code we need to use?

import { foo } from 'https://graphql/{ give: { me: { foo }}}'


Just `import { foo } from 'https://graphql/'` is enough for that.


I'm interested to read the content but find the writing style and the visual layout make it difficult.


Node => Deno => Done.

One can simply say, 'Deno is like Node but done right'.


'node'.split('').sort().join('')


It's also a nice illustration of little endian vs big endian.


Deno : Node :: Deltarune : Undertale


* This comment fills you with determination.


I think it's only the natural progression. There are definitely some flaws with it; it isn't perfect. However, the tradeoffs really do seem to swing in favor of Deno.


It's too little, too late. Deno today might have been more interesting years ago when you had to write your own Typescript definitions for every library because TS had no traction.

In 2020, Deno is just a stone's throw from `ts-node server.ts`.

And the permissions system lacks the granularity to be useful.


Is the goal to have this also work for electron executables?


> This site requires JavaScript to run correctly. Please turn on JavaScript or unblock scripts

FYI, I read the whole article without JS fine, thanks for making the site work without it even if it says it doesn't.


The author already inflicted one of the worst "worse is better" victories I can think of in the history of computing.

Now he's fighting his own monstrosity.


Please don't cross into personal attack in HN comments. I grant you that this is borderline, but it's still dipping a toe into those black waters.


Author of the article (me) or author of Deno?


I think he means Ryan Dahl

Anecdotally, there are several languages (French, "Argentinian" Spanish, to name a couple) where it's common to rearrange the syllables of words backward-ish, more often than not for slang uses (argot in France, lunfardo in Argentina).


What do you mean by your last sentence? The sentence makes complete sense and is not backwards when translated to Spanish.


Hey sorry if that was not easy to understand, I was referring to the name of the thing. "Deno" is the same as "Node" with the syllables arranged backwards.


Right, of Deno/Node, not you.


This comment is a meta-critique on the blog itself. I've noticed a recent trend arising on the web of applying a text shadow to code to make it seem like it's glowing. I don't know where people started thinking this was a good idea but I hope it doesn't gain too much traction. When I saw it in this blog, I blinked and rubbed my eyes because I actually thought my vision was blurring...

Another nitpick which may be totally invalid and I'm open to being educated on this. It seems odd (i.e. inaccurate) that, at the end, the myProgram.bundle.js is being called an "executable binary." It's just minified JavaScript. Is it a binary because it has been minified and optimized, or is all JavaScript suddenly considered "binary?"


I'm guessing that "binary" can be interpreted as "blob of unreadable mess when opened in a text editor"


re: "binary", I think the article just uses the term incorrectly; it's probably intended to mean "single file thing you invoke directly". I'm guessing they didn't just say "script" because of possible confusion with <script> tags.



