
Things I Regret About Node.js [video] - dazhbog
https://www.youtube.com/watch?v=M3BM9TB-8yA
======
royjacobs
I think it's quite interesting to see that originally node.js was presented as
a bloat-free alternative to "enterprise languages" like Java, C# or even
Python or Ruby. A lot of complexity was subsequently added in an ad-hoc way
which has resulted in (for example) a package management system that's wildly
out of control.

It's very popular of course, so I'm definitely not arguing that metric.
However, the tooling that was originally held up as an example of unneeded
bloat and complexity (Maven) has now been reimplemented in Javascript, but
poorly (npm).

~~~
jrochkind1
I think this is a story that gets repeated lots of times in our world of open
source software dev.

1\. X is SO bloated and poorly engineered, full of bad legacy decisions.

2\. We can totally do better let's invent a new thing, Y!

3\. Wow, Y is so clean and fast and understandable.

4\. But it doesn't do this thing a bunch of people reasonably really need...
let's add it.

(repeat 3 and 4 a few hundred times)

5\. Y is so bloated and poorly engineered and full of legacy decisions. We can
do better! (Go to 1.)

The grass is always greener, but mature complicated software is _usually_
complicated for... reasons.

~~~
pknopf
How do you break the loop?

~~~
flukus
I think vi -> vim -> neovim shows a pretty good model.

Neovim is an effort to modernize and remove cruft from vim, so they get to
keep all the good parts and throw out the backward compatibility. If it works
out it can eventually replace vim, not too different from what vim did to vi.

I'd like to see similar stuff done to much of the GNU tools. Make, for
instance, has to worry about backward compatibility and POSIX compliance that
make it hard to progress. As of today there have been about 12,000 attempts to
replace it with something else, and I find all of them inferior for one reason
or another; they've all reinvented the wheel poorly. If someone had taken the
fork-and-modernize approach we might have something better by now.

It doesn't even have to be a "hostile" fork. The same can be done by the
developers of the existing tools.

~~~
mirekrusin
Text editor and programming language are slightly different things, backward
compatibility story is completely different.

~~~
hjek
Not when the text editor includes a programming language (Vimscript). And
backwards compatibility of plugins is a big issue.

------
thosakwe
This was one of the more interesting software talks I've listened to recently.
I like that it was very real - there are serious, serious problems with
Node.js, and the fact that even the creator acknowledges these problems caught
my attention.

I'm also a long-time user of Dart, so when he brought that up, and compared
TypeScript to its shortcomings, I definitely agreed.

That being said, even with the Deno project, I'm not so sure what can come in
terms of performance and security from running JavaScript outside of a
browser. The choice of V8 also raises concerns for me about build systems. He
mentioned the failing of GYP, but anything using the same build toolchain as
Chromium always introduces a wealth of complexity for anyone building said
software, as not only do you need Google-specific build tools, but also very
specific configuration including Python.

It will be interesting to see what comes in the future.

If it were up to me (which I guess it isn't), I'd probably prioritize
portability/less complex builds, built-in FFI, a flat module system, and
optimizing calls to C from JS.

~~~
mschuetz
> performance and security from running JavaScript outside of a browser.

I'm just now building a node app to filter point clouds, so lots of number
crunching. In two days I've got something in javascript that's faster than the
C++ solution I've been working on for a week. Mostly because javascript/node
makes it trivial to parallelize file IO while doing work in the main thread.
This app reads 13 million points from 1400 files (~200mb), filters all points
within a clip region, and writes the 12 million resulting points in a 300MB
file, all in 1.6 seconds. (File reads were cached by OS and/or disk due to
repeated testing, but file writes probably not)

My personal conclusion is that javascript can rival or even exceed the
performance of C++, not because it's inherently faster (it's obviously not),
but because it makes it much easier to write fast code. For the highest
possible performance you'll definitely want to use C++, but sometimes you'll
have to spend multiple times the work to get there.

~~~
chrisco255
Right, V8 works great at optimizing JS and it handles streams great.
Productivity is one of the most important factors to think about when building
software systems since human time is much more expensive than CPU cycles.
That's why Node.js works great.

~~~
maerF0x0
This rests on the false dichotomy of speed of execution vs speed of
development (which includes fixing bugs). Well-designed languages optimize for
a certain weighted preference of the two, and some languages deliver more of
_both_ than others.

For example: TypeScript is fast to write, and Golang is reasonably quick to
write _and_ execute. Both should have ~15% fewer bugs than javascript,
potentially making them faster to develop with (where bugs matter).

~~~
chrisco255
Once you've paid the upfront cost of learning Golang, I might agree with you.
But then again, in particular when building a full stack app (and not when on
a team that has back end and front end specialists), it's helpful to use the
same context (JS/NPM) when devving. People are bad at context switching.

~~~
notdonspaulding
I've heard this argument several times now and it finally hit me what I
dislike about it.

If you aren't context switching between your backend code and your frontend
code (even when both are JS), you're probably incurring technical debt in your
architecture to be paid in even greater numbers of dev hours down the road.

When you are writing an all-JS full-stack app, do you really feel like you're
only working on a single app, as opposed to two different apps which happen to
share the same repo?

~~~
chrisco255
It's not even likely they're in the same repo when working with Node (although
possible). The context switch between apps is one cost you have to pay (surely
your backend will differ from frontend in architecture). However, the costs of
switching languages is higher. Golang / JS conventions are very different. It
is possible to share some common libs front end / back end (lodash, validation
logic, etc) and that helps too.

~~~
dozzie
> The context switch between apps is one cost you have to pay [...]. However,
> the costs of switching languages is higher.

Is it? I was working with a system where server is written in Erlang and
client (and another server) is written in Python. No problems with switching
back and forth.

~~~
chrisco255
Yes, that's a cost your brain is paying. And a higher cost if you bring on new
devs to that project, who then need to learn and understand both Python and
Elixir. Debugging is different in both languages, libraries are different,
standards, conventions, top level APIs, runtimes, capabilities, all that has
to be understood to operate at a high level. That's a non-zero cost and it's
pretty significant. That you've learned both so well that you can switch
between them is great, but that's equivalent to knowing two musical
instruments as well as one.

~~~
dozzie
>> I was working with a system where server is written in Erlang and client
(and another server) is written in Python. No problems with switching back and
forth.

> Yes, that's a cost your brain is paying.

What cost? I _said_ I haven't noticed any.

> new devs [...] then need to learn and understand both Python and Elixir.

No, they don't need to learn even a speck of Elixir.

\--

What you described is a truism: that one needs to learn two languages to
write in two languages. Yes, this is obviously true. What I'd like to hear is
an argument that _switching_ between them when used in a _single system_ is
costly, because I haven't observed that. That is what I'm disputing, not the
claim that learning another language has its cost.

~~~
always_good
How wouldn't there be a cost?

There are more things you have to remember. Workflows in both languages. Of
course it's more stuff, thus more context. And you have to use both languages
constantly to stay fresh in them. The syntax isn't the only problem, just the
easiest one.

Is it just as easy to maintain Spanish and English skills than just English?

"I don't notice it" isn't a very strong argument. I bet you don't notice the
effects of slight dehydration and your diet and exercise on your output
either. But if you were actually experimenting with it, I guarantee you could
soon perceive it.

~~~
dozzie
> How wouldn't there be a cost?

How wouldn't there be a cost of switching between two languages? Normally.
You could try it, then you'd know. Though the prerequisite is a system that is
_designed_, with clearly designated borders between the parts, not a system
that has just _emerged_.

Proving something's non-existence is a little like proving that you're not a
weapons smuggler. How would you expect to even start?

> There are more things you have to remember. Workflows in both languages. Of
> course it's more stuff, thus more context.

But this is irrelevant to switching between the languages. You have just as
much to remember if you write unrelated things, each in its own language.

> Is it just as easy to maintain Spanish and English skills than just English?

"Just as easy than"? Really? In a thread about _languages_?

You picked the wrong analogy. It is just as easy to write prose with every
second paragraph in English and Spanish as it would be with just English. The
prerequisite, obviously, is that you know both languages.

> And you have to use both languages constantly to stay fresh in them.

For some value of "constantly". It's not like people forget everything about a
language when they don't use it for a week or a month.

> "I don't notice it" isn't a very strong argument.

Well, at least it's _some_ argument. On your side is only "how wouldn't there
be a cost?", clearly from a position of somebody who doesn't use many
languages.

------
tnolet
Having worked with Maven, Gradle, Ruby Gems, Pip and the non-existing Go
package management I must say I actually really like the Node / NPM combo. I
guess artists are their own worst critics.

edit: forgot Scala's SBT, admittedly a builder using Maven repos, but still an
excellent example of how bad UX in this area can get.

~~~
CSMastermind
npm is the worst software I use daily.

* The maintainers have pushed several breaking updates by mistake (I'm a teapot recently).

* There have been a few cascading failures due to the ecosystem (leftpad).

* node-gyp (alluded to in the talk) breaks cryptically on install in different operating system/package combinations. It also obscures the actual package contents.

* The lack of package signatures and things like namespace squatting significantly hurt the overall security of npm.

And let's not forget how terrible things were pre-yarn with the nested folder
structure of node_modules and no lock file.

Compare that to NuGet where I've literally never had any of these problems.

~~~
darzu
Here's one thing Node and npm are great at and NuGet fails at completely:
local development of two packages. With npm, I can use "npm link" to redirect
a package reference to a local folder. With NuGet, the best you can do is edit
the .csproj and change the nuget reference to a project reference (if you can
find the original source code). This makes simple step-through debugging
across package boundaries a chore every time, whereas a source-based package
system doesn't have this issue.

~~~
n0us
"npm link" only establishes a symlink between the two directories and doesn't
respect .npmignore or behave in any comparable way to publishing a package and
installing it. Sometimes the only way to debug is to repeatedly publish and
re-install the package you are developing.

------
sebringj
What I admire about Ryan is that his attitude was "I shouldn't just complain
without giving a solution..." and he gave a solution, more than once. I have
done this on a no-one-cares scale, but it really is better to do it yourself
when you can. Also, Ryan has some sharp sarcastic wit which is pretty fun to
watch in this talk.

~~~
draw_down
I thought that was nice too, but I wish we as an industry placed a little more
value on simply admitting that something is bad or suboptimal. You're not
supposed to "complain" or "be negative", which I think is unfortunate in lots
of ways.

~~~
sebringj
Yah, I get that. I remember at my work, I said something negative about how we
were sending out emails that looked like they were from 1990. I got a lot
of backlash for complaining, but the next day I gave them a modern solution
and bam, they were super happy. Maybe it's just human nature, but providing
something to fix the problem rather than just being unhappy alone is more
powerful.

------
iamleppert
He talks about not adding features you think would be “cute” because they are
always a mistake..

Then a few minutes later says “I thought being able to specify URLs in import
statements would be cute..”

Uhh...Houston, we have a problem with this one.

~~~
jonny_eh
I dunno, importing from a url seems really smart and practical to me. It
divorces the run-time (Deno) from a package manager (like npm). I wonder how
it handles dependencies though.

~~~
iamleppert
Is it synchronous? Does it follow redirects? What happens in a 404 situation?
Does it obey cache headers? What happens when a timeout occurs or the resource
isn’t code? What happens with recursive dependencies and other edge cases as a
result of not knowing the dependency tree until runtime? What about error
handling and recovering from these failures at runtime or compile time? Should
all resources be secure? How does it work when you are developing on one of
those dependencies? Do you have to run a local web server? How does versioning
work? Where is the cache stored and is it content addressable? How to clear
it? Is it global or per user?

I could go on but you get the idea. A package manager is not simple and
requires A LOT of choices. The best one I’ve seen is old CPAN.

~~~
inimino
The Web has all of these issues and yet importing JS libraries this way has
worked just fine.

Obviously it's not what you should do if you are publishing a library for
others to use, but for local use, and the kinds of exploratory scientific
computing he was talking about, it sounds perfect.

~~~
pitaj
It has _not_ worked "just fine" which is why we have bundlers, minifiers, and
other build systems.

~~~
inimino
We have those things as a result of having too much code pulled in from too
many sources. This has approximately nothing to do with URLs as identifiers,
which is one of the foundations of the Web.

------
arenaninja
One takeaway from this is how Microsoft is killing it currently:

* Much-beloved TypeScript

* Much-beloved VSCode

* Much-beloved GitHub

If they hire Ryan to flesh out his vision for deno we'd probably need a new
acronym, MAFANG

On a more serious note, I wonder if deno could support a lower level construct
like Observables. As much as Promises are perceived to be an improvement over
callbacks, they still have major flaws (only one of which is mentioned in this
talk); this is something that Observables can address.

~~~
pjmlp
* Much-beloved Ubuntu

Just leaving it here, given Azure Sphere and WSL.

~~~
darzu
Nitpick, but Azure Sphere doesn't use Ubuntu.

~~~
pjmlp
True, it runs Microsoft's own distribution highly customized for the security
context of Sphere.

My suggestion is just the distribution I would expect Microsoft to acquire, if
they decided to go shopping instead of pursuing their own.

~~~
the_af
Does anyone remember the joke/hoax, back in the old days, of the alleged
Microsoft Linux distro? :D (Of course, this was in the age of the Halloween
Documents and the "Linux is cancer" mindset [0], back when Microsoft was a
very different business).

[0]
[https://www.theregister.co.uk/2001/06/02/ballmer_linux_is_a_...](https://www.theregister.co.uk/2001/06/02/ballmer_linux_is_a_cancer/)

------
pan69
I've said it before and I say it again; NodeJS is an infrastructure component,
not a general purpose application runtime environment.

I totally recognise the IO problem in our connected world, and NodeJS really
does solve the problem of "many simultaneous persistent connections",
something that would be really hard to do without something like NodeJS. In
essence (and in my humble opinion) NodeJS is basically a programmable socket
server and its main feature is "websockets".

Writing software applications in NodeJS is the most awkward experience due to
its async nature. Business logic is inherently sync, not async. Most of
NodeJS's existence has been spent trying to find an elegant solution to make
this async behaviour look like it's sync, from callbacks, to promises, and now
actual language features added to JavaScript itself (async/await).

The problem with Javascript on the server is not Javascript, but the runtime
in which it's executed.

I think it would be interesting to have a sync version of NodeJS that acts
more like traditional Ruby and Python next to the async variant that we
currently have. Both types could then be used alongside each other, each
solving the problem for which it is best suited.

~~~
always_good
I'd say your intel is far out of date. Async-everything + async/await makes
Node more elegant than the same programs in Ruby/Python.

Even little things like "make these two database requests in parallel and wait
on them both" or "process these urls but only have 8 requests in flight at a
time."
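The first of those is essentially a one-liner with `Promise.all`. A minimal sketch, with hypothetical stand-ins for the two database queries:

```javascript
// Hypothetical stand-ins for two independent database queries; in a
// real app these would be calls into your database driver.
const getUser = async (id) => ({ id, name: "user" + id });
const getOrders = async (id) => [{ user: id, total: 42 }];

// Fire both requests before awaiting either: total latency is the
// slower of the two queries, not the sum.
async function loadProfile(userId) {
  const [user, orders] = await Promise.all([
    getUser(userId),
    getOrders(userId),
  ]);
  return { user, orders };
}
```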

~~~
davedx
> "process these urls but only have 8 requests in flight at a time."

What's the state of the art solution to this in node.js?

~~~
cloverich
There are many ways to do this, but at its simplest, because Node is single
threaded, you can have a shared array of URLs and then a pool of workers who
.pop() a URL off and stop when there are none left. Each worker's http request
runs async, so with the `async` keyword at the start of their signature, and
`await fetch(url)` inside, the code reads (and in some ways behaves) like
regular synchronous iteration but runs (mostly) concurrently.
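A minimal sketch of that pool, where `doFetch` is a stand-in for the real `fetch(url)` call:

```javascript
// Shared work list plus a fixed number of workers that pop() items
// until none are left, so at most `concurrency` requests are in
// flight at any moment.
async function processAll(urls, doFetch, concurrency = 8) {
  const queue = urls.slice(); // copy so the caller's array survives
  const results = [];
  async function worker() {
    let url;
    while ((url = queue.pop()) !== undefined) {
      results.push(await doFetch(url));
    }
  }
  await Promise.all(Array.from({ length: concurrency }, () => worker()));
  return results;
}
```

Note that results arrive in completion order, not input order; sorting or indexing by URL fixes that if it matters.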

------
mromanuk
I was thinking, watching the video "Ryan should fork it, or start over"...
until he revealed it: [https://github.com/ry/deno](https://github.com/ry/deno)
:)

~~~
sametmax
My head can't wrap itself around:

import { test } from
"[https://unpkg.com/deno_testing@0.0.5/testing.ts](https://unpkg.com/deno_testing@0.0.5/testing.ts)"

There are so many issues with that I don't even know where to begin.

~~~
dylrich
His arguments for it seemed reasonable - would love to hear your criticisms

~~~
sametmax
What the others say.

And also, golang tried to do without centralized package management by using
git repos. It ended with people go-getting from moving masters. Of course they
did.

And in an era of finally accepting lock files, do you really want to go back
to the middle ages? Lock files are not a constraint. They are a godsend. You
want lock files. You don't want only vague dependencies or only pinned ones.
You need both to stay sane.

And of course somebody will do something dynamic that will open a security
issue.

And of course typo squatting is going to be so much easier.

And removing a bad lib? From npm you signal the admin. When it's on its own
domain? Good luck.

And then searching for libs is going to be fun. And naming, naming will be
amazing.

And having a quick glance at the dependency of a lib ? So much fun.

And wait for the search/replace in the code that will change your entire
runtime by mistake.

And no possible alternative package manager. Hope their dependency resolver
never sucks cause you will be stuck till you can install the next node... if
they ever fix it.

Ah, and the git blames to see what dependencies have changed are going to be
just peachy.

Oh, and your juniors copy/pasting code from the net is going to get extra
crispy.

But wait, running the code on conditional imports means your project may
install something at ANY moment in its life cycle. And it could be changed by
somebody editing the code by mistake, without really asking to change a
dependency. You know, like a ctrl + D on "0.1" to replace all those floats
quickly.

Also, cool URLs don't change. Until they do. Tiny-URL dependencies are going
to be hilarious.

I could go on and on and on and on...

I can't understand how you can be intelligent enough to write freaking Node.js
and not see THAT elephant in the room.

~~~
JeremyBanks
You are imagining the worst possible execution of these ideas. Nobody is
proposing that you should start using libraries that pull in code from random
domains, unless you have some specific need to. Whitelisting sources is such
an obvious step, given the security focus, that you should really have applied
the
[https://en.wikipedia.org/wiki/Principle_of_charity](https://en.wikipedia.org/wiki/Principle_of_charity)
in your speculation.

~~~
sametmax
No one proposed to make spaghetti code with "goto".

Yet we did. And we replaced it with "if" or "while", to avoid repeating
history.

This is a prophecy: this dep managing concept, if kept in this form, will
cause something terrible. It will.

Good luck.

------
pvsukale3
Can anyone explain why he called Dart a complete failure?

[https://youtu.be/M3BM9TB-8yA?t=19m55s](https://youtu.be/M3BM9TB-8yA?t=19m55s)

ps: I am a junior developer. I am taking up Dart to learn Mobile apps
development using Flutter.

~~~
munificent
I'm on the Dart team (but I don't speak for the entire team here). Dart had
two initial goals:

1\. Get a native Dart VM into Chrome and eventually other browsers.

2\. Get a significant number of client-side web developers that were using
JavaScript to move to Dart.

It's probably not obvious, but these goals are in tension with each other. In
order to motivate adding a giant new VM to a browser, you need to make the
language pretty different from JS. Likewise, you need to make your
implementation much faster than JS.

Both of those push you down a path where interop with JS is difficult. You
don't want your language's semantics too close to JS because that reduces the
value proposition of the language. And you don't want JS interop requirements
to limit how you implement the VM around things like garbage collection.

But for (2), to get people to move, you need the absolute smoothest migration
path you can get. You'll make all sorts of compromises and edge cases in your
new language to reduce friction when getting developers to migrate to yours
and you'll do anything to make interop seamless to support heterogeneous
projects. (For example, TypeScript pokes quite large holes in its type system
in order to play nicer with JS idioms.)

The Dart leads prioritized (1) over (2). The idea was that the VM would be so
great users would flock to it giving us (2). That didn't work out,
unfortunately. In practice, I think it's very hard to create a language
implementation so much better that it trumps the value of existing code. So
you really do need to win at (2) at all costs if you want a successful
web-only client-side language.

That's the approach TypeScript has taken, and they did a fantastic job at it.
Having one of the world's best language designers doesn't hurt.

In the past couple of years, in response to this and other changes in the
landscape, we pivoted Dart. We now aim to be a _multi-platform_ client-side
language. In particular, we're the application language of Flutter, a cross-
platform mobile framework.

Flutter is a very different platform than the web -- there isn't an existing
entrenched corpus of billions of lines of code. Performance and memory usage
matters more. You can't JIT on all platforms. Developers coming to Flutter are
equally likely to be coming from Android (Java) and iOS (Objective-C, Swift)
as they are the web.

Those different constraints play well to Dart's strengths. And, in particular,
they align nicely with Dart's move to a full, sound static type system. Dart 2
is more "C# with less boilerplate" than "JS with more types".

Dart is still _also_ a web language, and the better static type system really
helps with static compilation to JS, but it's not our only path to success.

~~~
darzu
How would you characterize the positioning of Dart with respect to Go? Dart is
for client-side, Go is for server-side? I'm also curious if you see having
separate languages for these roles as desirable or just incidental.

~~~
isoos
Disclaimer: I've been working with Dart for 5+ years now, I'm running several
small server-side Dart apps myself, and I also contribute to the Dart app that
is behind pub.dartlang.org

The Dart VM itself is great for server-side work; however, Google is focusing
on the mobile and web tooling and support. While they do develop server-side
packages, e.g. for AppEngine, gRPC, or Memcache support, connecting to
databases like PostgreSQL goes through a community-supported package, and
sometimes it is hard to find an actively developed one.

Considering these limits, there are still good server-side frameworks, and
there exist a couple of big full-stack Dart applications. I've created a
HackerNews-crawler twitter-bot (@DartHype) in a matter of hours in Dart, and
it has been running almost unchanged since then. Not that it is a big feat,
but it was an easy thing to implement given the ecosystem.

If you have a fresh project, and you can select your database and other parts
of your stack, Dart can be a good choice. Depending on the domain, the
performance is close to that of Go or the Java VM, and it is much easier for
beginners to pick up than other languages, while the tooling provides more
safety than JavaScript or TypeScript.

However, if you need to connect to Sybase, it may not be the best choice.

~~~
TheAceOfHearts
The Dart community is tiny and package selection is incredibly limited. I'd
argue that Dart would be a very poor choice for most developers. It's not
easier to pick up than something like Rails or even Spring.

~~~
isoos
Your argument is noted; we just happen to disagree. I've mentored high school
students for a couple of weekends to help them build their mobile apps. Java
(Android): struggle with the bloat. JavaScript (React Native): shoot
themselves in the foot a couple of times - lack of tooling. Dart (and
Flutter): instant success.

The language, its consistent API, the IDE and tooling support with the static
analysis is just great for beginners and advanced developers alike.

People like to hate Dart because it threatened to take away their beloved
JavaScript. For those who have actually tried in the past few years, I only
hear they wish their IT stack could be migrated to Dart. If you start a new
project, choose wisely :)

------
StreamBright
Maybe it is just me, but Javascript is not my favourite dynamic language.
Given projects like ReasonML and other languages that have type inference,
prototyping is not harder with types than without them. Is it a valid argument
that they slow you down? Not sure.

~~~
sebringj
I was not happy having to use TypeScript on a project, thinking along the
lines of your point about it slowing you down, which initially is completely
true if you are forced to tslint it to hell. However, I've done a 180:
TypeScript is very nice when you relax the tslinter and sprinkle it on as you
go. That gives you the feeling of being auto-guarded where it's practical,
without getting in your way, which is exactly what Ryan was saying about his
Deno approach. I've saved myself loads of run-time errors from VSCode knowing
types ahead of time, for example.

~~~
acemarke
Could you summarize what settings you used for the TS compiler and tslint for
that "sprinkle it on but keep it out of the way" approach?

~~~
sebringj
Basically you turn off as many settings as possible that make things
impractical. It's more about how you work, not a catch-all. I have VSCode and
installed a supporting library to work with it. Basically it tells me what's
wrong and how to disable it as well. I either disable a rule inline or go to
the specific tslint rule and disable it globally. It is most difficult when
you have to use another person's boilerplate that was built without TypeScript
AND tslint is turned all the way up.

------
z3t4
For security I make Apparmor profiles for each script. I think the module
system and NPM is what made Node.JS popular. And personally I like function
passing style aka. callbacks, Promises and async/await looks more terse but is
actually more complicated and prone to errors. I also don't like that
TypeScript extend the JavaScript language and ads a compilation step to it,
it's much better to add doctype like comments and you would get the best of
both worlds, although I don't think type-checking is needed if you already do
testing. For me static typing is mainly for performance, like in Dart, you
can't simple make JavaScript more performant without it. With TypeScript you
have "performance optimization" but without the performance benefit. If your
code needs type annotations for others to understand what it does you need to
use better names. The type annotations are for the compiler. Auto-complete and
parameter hinting can be done via inference. And public parameters and methods
should have documentation.
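The comment-based typing suggested above exists today: plain JavaScript annotated with JSDoc comments gets editor type checking and autocompletion (and can be verified with `tsc --checkJs`) without any compile step. A minimal sketch with a made-up function:

```javascript
// Plain JavaScript, no compile step: the types live in comments, and
// editors or `tsc --checkJs` can verify them via JSDoc.

/**
 * @param {number} price unit price
 * @param {number} qty   number of units
 * @returns {number} the total cost
 */
function total(price, qty) {
  return price * qty;
}

// Callers still get inference: total(2, 3) is known to be a number.
```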

------
rb808
Interesting that he's unsure about Go. Would be nice to hear about why.

One huge strength of Node (& Deno) is having the same language and tools on
the front end and back end. It's a huge benefit to have a team on one
language, even if it might not be the optimal choice. I'm not sure if that is
the problem he had with Go though.

~~~
TheOtherHobbes
I've never understood why this is a benefit when the one language is - let's
say - suboptimal in many ways.

The interfaces between server- and client-side code should be well-defined and
language-independent. You don't want the same people writing both, because
it's harder to check that your API is working to spec if it doesn't get fully
independent testing.

There's also a lot of useful server-side optimisation and security management
that Node - or a high-level replacement - can't handle.

V8 may improve things, but it's going to have to improve performance a lot to
be competitive.

[https://www.toptal.com/back-end/server-side-io-performance-n...](https://www.toptal.com/back-end/server-side-io-performance-node-php-java-go)

~~~
overcast
You don't see how having to know only one language is easier/faster than
having to know two languages? A lot of people are doing both front and backend
work. Oftentimes it's a single person running the show. I'll tell you right
now I use node for ALL my web projects, because I only have to focus on a
single language. When I wear every hat, managing the domain, server,
databases, mail, user questions, and everything else involved in running a
modern web app, one less thing to learn sounds great.

~~~
BigJono
I don't see it. Any old idiot can learn a programming language. The difficulty
doesn't come from the language, it comes from the environment and tooling.

~~~
overcast
So what's your alternative? I don't know of any environment easier than
installing Node. I type 'n latest', and yarn add/install. Done.

Ruby? Laughable. PHP has a zillion dependencies. Go with all of its system
paths. Java VM, no thank you.

~~~
BigJono
You completely missed the point. My point isn't that node is better or worse
than any of them, my point is that if you're starting from scratch with no
back-end knowledge, then the benefit of already knowing the language you're
going to use on the back-end is dwarfed so heavily by the other stuff you have
to learn that it's inconsequential.

------
breatheoften
Why would deno implement “download on first encounter” for its module system
...?

Would you ever want this vs an explicit “build” step that downloaded all the
required resources ahead of time ...?

Does deno walk the program source and download everything that could be
required upfront or do the resource loading on demand as it’s executed ...?

~~~
breatheoften
I think it would be a real shame if the module system design made it
impossible to statically enumerate all the modules reachable from a program
entry-point without executing the program ...

------
jaequery
much respect to ryan for coming out like this. i used to think that he might
have thought node was the bee’s knees and that callbacks and promises were
sent from the gods. glad to hear him confirm what a lot of js devs are feeling
right now with the wretchedness of the js callbacks, promises, and generators.
there is a better way and im glad he is trying to do something about it.

~~~
Guillaume86
He looked pretty OK with async/await (which uses Promises under the hood);
one of the advertised features of his new project is top-level await.

~~~
spankalee
Which is unfortunate. Top-level await is a very bad footgun and should almost
never be used.

~~~
tlrobinson
Can you elaborate? It seems useful for simple scripts and REPLs, at least.

~~~
lostctown
It is useful for those things. I implemented a fairly involved scripting
system that we now use inside all of our server instances. Each script in the
system is an entry file mapping to an alias command and the entry file is
invoked from the top level. To get the benefits of async/await, we write the
logic of each entry script under main = async () => do_logic(), and call it at
the end of the file as main().

I just don't see how this workaround is the correct solution. Just saying
that it's generally a bad idea is not enough for me. At some point there
will come a time when you want to treat a file as a function, and when that
day comes you will want async/await at the top level.
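The wrapper pattern described here can be sketched in a few lines; `doLogic` is a stand-in for the real body of an entry script, not any real API:

```javascript
// doLogic stands in for the body of an entry script.
async function doLogic() {
  return "done";
}

// Without top-level await, the entry file wraps its logic in an async
// function and invokes it at the bottom:
const main = async () => doLogic();

// An explicit .catch() is needed, or a rejection escapes as unhandled:
main()
  .then((result) => console.log(result))
  .catch((err) => {
    console.error(err);
    process.exit(1);
  });

// With top-level await (in an ES module), the wrapper disappears and
// the file body itself can read:
//   const result = await doLogic();
```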

------
chriswarbo
It's nice to see some recognition of the idea that interpreters are safe by
default (excluding infinite loops/OOM), and we can avoid many security
concerns (access to files/network/etc.) by simply not including that
functionality in the interpreter (unless opted in via a startup parameter, as
described).

I'm also a fan of using env vars for configuration and locating dependencies,
as mentioned in the talk. Much simpler and easily extensible compared to e.g.
vendored directories (requires messing with the contents of the source
directory, which requires write access and breaks hashes, etc.), hard-coded
system paths like /usr or ~/.some-default-location (causes conflicts when
running multiple incompatible versions), etc. A simple env var of paths, e.g.
colon-separated, maybe with some sane quoting convention, can be used for all
of those if desired, whilst making it super easy to extend-with or restrict-to
any other location(s) instead. It's also trivial to "bake in" an env var to an
application, just call `PACKAGE_PATH=foo:bar my_app` rather than `my_app` (or
make a one-line wrapper script).
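A minimal sketch of the colon-separated search-path idea, without any quoting convention; `PACKAGE_PATH` here is the hypothetical variable from the example above, not a real Node.js setting:

```javascript
// Split a PATH-like value on ":" and drop empty entries, similar to
// how a shell treats PATH.
function parseSearchPath(value) {
  return (value || "").split(":").filter((entry) => entry.length > 0);
}

// "Baking in" a value is just prefixing the command:
//   PACKAGE_PATH=foo:bar my_app
const dirs = parseSearchPath(process.env.PACKAGE_PATH || "foo:bar");
console.log(dirs);
```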

I agree with others that importing from URLs seems like a bad idea: network
I/O is one of the least reliable actions we can take, which would make
importing far more complicated than necessary (what if we're offline? should
we follow redirects? should we check for proxy settings? how should we report
errors? etc.). All of this complexity and the inescapable problems of network
failures are completely avoidable by just downloading things up-front. Package
managers/build tools can fetch whatever they like (URLs, git repos, etc.),
however they like (with proxies, caches, etc.), to wherever they like (one big
cache, project-specific vendor dirs, whatever), and just stick the resulting
directory (possibly of symlinks) in the program's environment (e.g. via a
wrapper script, as above).

~~~
tome
> I'm also a fan of using env vars for configuration and locating dependencies

Oh, please just let things be configured by a single JSON value, built from
smaller JSON values if necessary.

~~~
chriswarbo
Sure thing. How do you get such a JSON value into an application though?

What I'm saying is to use env vars. You could put your JSON straight into an
env var, or if you want a persistent JSON file on disk then put the path to
that file in an env var.

~~~
tome
That's totally fine. I just never want any more than _one_ env var, otherwise
the API surface becomes very flakey very quickly.

------
misiti3780
i have always really enjoyed listening to ryan speak. he seems very humble for
all of his accomplishments.

~~~
curun1r
Agree completely.

Although one of the more interesting "talks" I ever heard him give was a
discussion on promises. I was lucky enough to be in (the ridiculously long)
line behind him waiting for food at NodeConf in 2012 and he and an engineer
from Microsoft had a pretty spirited discussion that explored the subject in
way more depth than I had previously thought possible. Ironically, given that
he now considers the removal of promises to be a mistake, it was the engineer
from Microsoft who took the pro-promises side of that argument.

~~~
tlrobinson
> it was the engineer from Microsoft who took the pro-promises side of that
> argument

JavaScript's async/await is essentially taken from C# (with "promises" instead
of "tasks"), so that could explain it.

------
matchagaucho
He buried the lede. The talk highlights some regrets, but the most interesting
part is a discussion about a new framework he's building based on Go and
TypeScript: _deno_ [https://github.com/ry/deno](https://github.com/ry/deno)

------
tlrobinson
I'm glad to hear he (and the Node community in general) has come back around
to promises.

I remember the very early version of Node.js that had them (looks like they
were added in v0.1.0 and removed in v0.1.30), although they weren't the true
chainable promises we have now.

~~~
jonny_eh
I find it really bizarre that Node still doesn't have full promisified built-
in libraries. It's not like it'd break existing APIs.

~~~
tlrobinson
Node 10 has fs, at least:

    require("fs").promises.readdir(".").then(console.log)

~~~
jonny_eh
Why not just:

    require("fs").readdir(".").then(console.log)

~~~
salehenrahman
It can potentially break things.

There is legacy code that assumes that calling `readdir` will yield
undefined, and that passes the result to a function that alters its
behaviour based on whether a parameter is undefined.

~~~
jonny_eh
Interesting. Seems like a great opportunity for a major version bump.

------
tannhaeuser
Let's not forget node.js wasn't created in a vacuum but was based on CommonJS
(module format and std lib) also implemented by TeaJs/v8cgi, Helma, and many
others [1]. In fact, server-side JavaScript was a thing as early as 1999 or
before (Netscape Server). That it's based on a highly portable language also
used in the browser is what made it attractive over alternatives for me back
in 2012 or so.

[1]: [https://en.wikipedia.org/wiki/List_of_server-
side_JavaScript...](https://en.wikipedia.org/wiki/List_of_server-
side_JavaScript_implementations)

~~~
mlinksva
The Netscape Server SSJS thing was originally called LiveWire. I used it in
1996 and it was terribly buggy, but the web says it dates from 1995.

------
davidw
Some kind soul want to do a summary? I got through "uh hey, uh so" and
remembered why I don't do videos.

~~~
billbrown
Here is his PDF of slides -
[http://tinyclouds.org/jsconf2018.pdf](http://tinyclouds.org/jsconf2018.pdf)

------
logicallee
Here is an amazing quote from a different interview:

[https://www.mappingthejourney.com/single-
post/2017/08/31/epi...](https://www.mappingthejourney.com/single-
post/2017/08/31/episode-8-interview-with-ryan-dahl-creator-of-nodejs/)

Apparently these days,

"Ryan: Yeah, I think it’s… for a particular class of application, which is
like, if you’re building a server, I can’t imagine using anything other than
Go"

!!

------
inimino
It's wild to see this after having been around in the early days of Node, and
now having also moved most of my attention to Go as well.

Ryan has always had, IMHO, excellent taste and good instincts and guts to make
things simpler. A lot of the accidental complexity in our industry persists
simply because people tolerate it, and it's great to see that he hasn't lost
his fire in that area. Looking forward to seeing more about Deno.

The point about promises was an interesting one for me personally since I was
one of the people at the time who argued in whatever small way for taking
promises out, in the hopes that the language community would come up with
something better.

I have mixed opinions on the topic now, but it's interesting to speculate
about what might have been. The Node.js ecosystem was weakened by having
different ways of handling the async question, and by a lot of developers not
knowing the best practices for using callbacks effectively, leading to
"callback hell", which is totally unnecessary. It's possible that having
promises baked into an early Node would have constrained or even fragmented
what we ended up with in the language, and that would have been worse. I'm
still a little disappointed that we didn't end up with anything more elegant
than promises in the language itself.

It's interesting to compare package management in Node and Go. NPM got the
early adoption and became the de-facto package manager at a time when JS had
no such thing. In Go, the package manager question has been unsettled for a
much longer time and there are more chances to experiment. Package management
is simply difficult, and it seems impossible to design a good programming
language and resolve the package management questions at the same time. It's
sad to see some of the criticism against NPM... it's much easier to criticize
than to build a better system.

------
azylman
It's interesting to hear him say that npm and node_modules are regrets since
lots of complaints about Go packaging from people new to Go ask for something
similar...

~~~
k__
I think the package system consists of many parts.

Some parts of npm are much better than in most package systems; some really
suck.

Maybe if npm weren't included so deeply in Node, something like Yarn would
have emerged sooner and replaced npm without much hassle.

~~~
jessaustin
yarn just seems like a set of incremental improvements over npm? Which is
great, we're all for improvements. However, the vehement complaints I've seen
about npm [0] make it seem that nothing short of a complete re-architecture
could be tolerated. yarn does not seem to be that.

I don't agree with those complaints, but I do agree with you that "some parts"
are really good. node really figured out the right search strategy for an
unscoped import (e.g. `require('foo')` rather than `require('./bar/foo')`).
Just look for a directory with the name "node_modules". If you don't find it,
go up one directory and try again. So simple! So predictable! So complete! It
works so well, all manner of "left-pad" abominations can be supported. Any
other system should think very carefully before using a different import
search strategy.

[0] with the exception of those related to path depth: I think those are
resolved now? I wouldn't know because I stay on an OS that doesn't go out of
its way to frustrate me.

~~~
k__
Well, since npm was the de facto default, it had enough time to catch up, I
guess.

------
partycoder
Deno is not a good idea and Ryan Dahl should just let node.js developers
moving to Go continue doing just that.

Go is a well designed language that is productive, has a very tolerable
learning curve, good documentation and does not need a JavaScript facade.

If kids in the 80s without the Internet could learn programming, you can learn
Go in 2018.

Expect another video in 10 years: Things I regret about Deno, probably having
advice very similar to this.

------
danschumann
I still remember the first time I saw his node.js release video- changed my
life.

------
dxhdr
I don't understand the rationale behind using V8 for server code. Yes V8 is a
general purpose JavaScript engine but ultimately all of the performance trade-
offs and design decisions are made with browsers as the optimization target.

It sounds like Ryan is still interested in making V8 work so I have to ask:
why do you want to be writing server code on a client browser engine?

~~~
sametmax
Because it's the best existing engine (even with Mozilla nicely catching up),
with dozens of geniuses and millions of dollars dedicated to it since the
beginning, and for years to come.

Why do you think JS became so much faster compared to the anemic octopus it
was before?

And can you imagine the perf of Ruby or Python if a tenth of those resources
were allocated to them?

------
VikingCoder
Why msg.proto? Why not json?

This seems like an odd dependency to me, and it seems like it adds no value.

~~~
pknopf
Performance.

~~~
VikingCoder
That seems highly unlikely.

I'm going off of what I've seen myself, and what I've seen in the xi editor:

[https://github.com/google/xi-editor](https://github.com/google/xi-editor)

"The protocol for front-end / back-end communication, as well as between the
back-end and plug-ins, is based on simple JSON messages. I considered binary
formats, but the actual improvement in performance would be completely in the
noise. Using JSON considerably lowers friction for developing plug-ins, as
it’s available out of the box for most modern languages, and there are plenty
of the libraries available for the other ones."

~~~
dwetterau
Does the xi editor use a JSON transport layer for all syscalls though?

Low-friction interfaces for developers are great but I'm not sure I agree with
the comparison here.

~~~
VikingCoder
Yes, it does. The frontend and the backend only speak JSON.

------
qaq
Deno looks very promising

------
ebbv
Ryan rightly warns about adding “cute” and unnecessary features to projects.
Then he goes and adds the “Load module directly from a URL” feature with all
its complexity to Deno.

Ryan: kill that feature now. It's not needed. It's just cute.

------
dergachev
I first read this as "Things I regret about Node.js by Roald Dahl" and was
instantly intrigued!

~~~
modzu
can somebody downvoting the comment above kindly explain why?

~~~
bshimmin
One-line jokes are generally not well-received on HN unless they are so
exceptionally amusing and original that they are beyond reproach. The standard
rationale for this - which can make HN seem very dry and humourless at times -
is that people want to avoid HN becoming like Reddit.

~~~
modzu
thanks for answering. but thanks to the parent comment i learned about a new
(albeit widely recognized) author. that's useful because we all come to hn to
learn things. i suspect most downvoters probably dont even understand the
reference and just see a joke qua joke. boo.

~~~
hboon
I don't know how entertaining they are for adults, but I enjoyed reading Boy
and Going Solo when I was younger.

~~~
cafard
I encountered them probably in my 40s, but did find them enjoyable.

------
amelius
TL;DR, from quickly browsing through it:

He mostly regrets the way Promises work (edit: that the async programming APIs
don't work with them), security/sandboxing, and the module system.

Then he introduces "Deno", a successor which is under construction and which
will fix all these problems. It exposes a language based on TypeScript.
Internally, some parts of Deno are written in Go.

~~~
k__
He regrets not including promises early on.

------
bitwize
"And JavaScript is a the best dynamic language." [sic]

Uhhh, Ryan, there's this thing called Lisp...

------
nodesocket
We've missed you Ryan. :-)

------
emehrkay
Interesting that he had promises in node that early. Maybe if he had kept
them, the community wouldn’t have standardized on two-space indents.

Edit: so two-space indents weren’t a consequence of “callback hell”?

~~~
abiox
i could see nested callbacks favoring two spaces, but overall it's not a style
unique to javascript.

for example, a lot of ruby i've seen (admittedly it's been a while) used two
spaces, and i don't recall it being any more prone to deeply nested code than
other languages.

that said js code tends to be rendered in a variety of other places like
browser dev tools, which may influence this as well.

~~~
jessaustin
After getting used to 2 spaces in js I try to use them everywhere now. Except
python, which obviously must be 4 spaces...

------
krmmalik
Great talk. I have missed Ryan and I'm not even a developer. I just love his
mind. He always demonstrates a lot of purity of thought in many ways.

By the way, was i the only person to notice that “Deno” is an amalgamation of
“Node”?

~~~
bitsoda
*anagram

------
uqimerioni
WTF slide brought me here. In a world where Python and Ruby exist, calling
JavaScript the best dynamic language is nothing but heresy.

~~~
always_good
How?

Javascript has a lot going for it from async-everything to conveniences like
destructuring. And it works in the browser and has things like Typescript.

I have to find good reasons to use another dynamically-typed language over
Javascript.

~~~
LargeWu
I would argue async-everything is the biggest hassle with Javascript. Most of the
time, I need things to run synchronously. Do this thing, then do that thing
based on the result of the first thing. The need for async is the exception.
So what we end up doing is expending extra effort forcing all the async stuff
to run synchronously when that should be the default case.

~~~
emilsedgh
With the callback methods it was really a hassle.

With async/await it's so nice now:

    await step1();
    await step2();

That's it.

~~~
LargeWu
I get what you're saying, and yes, async/await is much better. But the entire
reason it has to exist (and be used everywhere) is because javascript's
default async behavior gets in the way so often. The irony is that when I do
want some sort of async behavior, I often have to reach for something like
bluebird anyway because I want better control over the Promises. So either I'm
circumventing JS's default behavior by using async/await, or using a library
that makes promises manageable. Almost never is it preferable to me to use
JS's standard plain async behavior.

------
rmrfrmrf
TBH I felt like a lot of this could have been resolved if he had stayed
involved with the community and opened issues with the relevant repos.

Seems pretty classless to rage quit the community, brag about how much better
Go is, then mark your return at a JS conference by shitting on Node and hard-
forking the server-side JS community.

------
davidy123
The first post in this issue has a summary of some performance problems with
node.js:
[https://github.com/ry/deno/issues/162](https://github.com/ry/deno/issues/162)

Though, it's posted in a confrontational way so the conversation deteriorates.

~~~
Touche
No, that thread is everything that is wrong with open source.

~~~
davidy123
I don't think your comment is clear at all. Do you mean the "community" or
(arbitrary) toolsets of open source?

~~~
Touche
The community. Specifically that someone thought it was appropriate to post an
issue shitting on someone's old project in that person's new project's issue
tracker.

~~~
Demiurge
It's also kind of pathological when someone focuses on just one technical
aspect of software, like performance or security, completely unable to reason
within the larger context of people doing things for other people.

------
VeejayRampay
"And then there were people adding modules. I thought to myself, this projet
is done now. So wrong."

Seriously why are we as a community still entertaining those outlandish
statements and attitudes? If you heard that sentence as a non-tech person,
you'd swear modules is a computer virus or something, that it's so inherently
bad that it's not even worth discussing.

Props to Ryan for his amazing work, but come on let's try to be civil and
respectful here.

~~~
sonofaplum
Thats not what he meant. He was not saying "People are adding modules, modules
are bad, this project is ruined, [modules are] so wrong." He was saying,
"People are creating modules, people are building on top of my work, this
project is COMPLETE. [I was] so wrong."

~~~
VeejayRampay
Ah my bad, completely misunderstood what he was saying.

Thanks for providing context.

