Hacker News
JavaScript Containers (tinyclouds.org)
177 points by 0xedb 12 days ago | hide | past | favorite | 104 comments





There's a lot more to unpack from this concept than Ryan's post lets on. I'm seeing a lot of negative sentiment, largely rooted in dislike of JavaScript itself.

This is entirely beside the point. The emphasis is mostly on V8, which we all know is first and foremost a JavaScript engine. But it is also a WebAssembly engine, meaning many languages beyond JavaScript can run with the approach he's talking about.

What the post doesn't really go into detail on is how V8 is arguably the most secure runtime in the world. The browser runtime is one of the most battle tested pieces of software ever built. At its core, it enables remote code execution on anyone's machine. It has a robust security model. Your program can't do whatever it wants on the host operating system, unless granted by the user.

Docker Containers were an improvement over Virtual Machines, enabling dependency snapshots and compiled programs on top of an OS. V8 Isolates (translation: Chrome tabs) are an improvement over Containers. Cloudflare's discovery here is what enabled them to build their edge network. On a single server deploy, you can execute dozens of concurrent, isolated V8 programs without needing to spin up a new container for each of them. The server also has complete control over execution time and memory. If a V8 program consumes more than its budget, it can easily be terminated.

Runtime performance concerns about JavaScript are also misguided. With Rust, it's easier than ever to write native functions that can be executed from JS or WASM. Extending the V8 codebase has historically been so difficult that most people don't even consider it a possibility. With Deno's V8 bindings, snapshots, and core crates, anyone can extend the language ecosystem with relative ease. Programs operating in this model on the edge have already been shown to be significantly more performant than their traditional server-deploy counterparts. End-to-end latency is all but eliminated, until that edge program needs to fetch data from a datacenter. And for that, the "global state" problem is aggressively being worked on by the major edge providers (sorry Cloudflare KV, we're having a hard time relating with you). Once this has been solved, the web is prepared for some serious optimization.

If you still think "the edge" is a fad in 2022, I'd recommend spending more time learning about it. It's not just JavaScript. Ironically, this architecture is the solution to the web's JavaScript problem, and will be the demise of heavy clients and SPAs.


Technically, V8 by itself is not "secure" with respect to remote code execution. It's secure in terms of memory safety and keeping isolates out of each other's memory space, but the actual "security" you think the engine provides actually lives in Chromium (a very locked-down sprawl of sandbox processes in which the engine is confined, restricted by the OS from touching resources it's not supposed to).

Good clarification. That said, the permissions are highly programmable on top of V8 behind specific calls, and that's better than the root-by-default you get with containers :)

Similar to Chromium, Deno's permissions model is embedded in their CLI crate. Would love to see this get extracted into their core crates so that custom JS runtimes can leverage it more easily.


Thank you. Your post does a much better job describing the potential than the link does. Something about the idea of “javascript containers” resonates with me, but Ryan’s post seems a little unfocused.

Javascript the language will be around forever, but I don’t know that I’d call it futureproof. Just because it’s there doesn’t mean people will continue to want to use it, particularly if wasm as a build target for other languages becomes more realistic and/or practical. However, javascript the technology (i.e., highly optimized and hardened runtime coupled with wasm) is pretty remarkable.


Completely agree with questioning whether JS is futureproof. With the advent of WASM, JavaScript developers have been given foresight, and "reskilling" might not be a bad idea (in the words of swyx [1]).

[1]: https://twitter.com/swyx/status/1521973694414864385?s=20&t=I...


> If you still think "the edge" is a fad in 2022, I'd recommend spending more time learning about it. It's not just JavaScript. Ironically, this architecture is the solution to the web's JavaScript problem, and will be the demise of heavy clients and SPAs.

This appears to be self-contradictory. Putting things on the edge means downloading more code to the client: heavier client code, more logic on the client side, and things that look more like full-fledged apps (I'm avoiding saying Single Page App here because the concept of "single page" is related to website navigation). I don't see how this solves "the web's JavaScript problem", whatever it is you mean by that. Can you expound on this part of your comment?


I think he's talking about edge computing rather than CDNs. There's a new crop of web frameworks building around the idea that you can make navigation in server-rendered apps feel comparably fast to SPAs (and faster in some cases, because SPA payloads are often huge) by putting the server very close to the user. In this model, you just have the client download HTML for the current page, and the browser sees way less JS.

Here's a web framework that is focused on this architecture: https://remix.run/

Here's a PaaS that is focused on this architecture with wide language support: https://fly.io/


> the idea that you can make navigation in server-rendered apps feel comparably fast to SPAs (and faster in some cases because SPA payloads are often huge) by putting the server very close to the user.

Despite all the rhetoric making this claim, I continue to fail to see that server-rendered apps are somehow less interactive or have slower response times (both time to first byte and time to first interaction) than SPAs. I don't think I've ever seen a single-page app respond faster and achieve interactivity sooner than a completely server-side HTML-templated site, but we say the SPA feels faster for some reason. The Linux kernel's cgit-served interface¹ beats the pants off of almost everything else, and it has way more data in its backing store (a git repo containing the Linux kernel) than many sites. Admittedly, there's little to no per-user custom state management there, but that's not something that gets removed by having the server closer to the user or only sending page partials anyway.

I feel like we spend, on average, more time staring at spinning circle gifs waiting for SPAs to request their data and document fragments than we want to admit.

(Yes, yes, I used some hyperbole here to make a point)

¹ https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...


> we spend, on average, more time staring at spinning circle gifs waiting for SPAs to request their data and document fragments

Definitely. I think that's the core problem Remix is trying to solve (simple server rendering with the nice devex of our modern React toolkits). It will be interesting to see if the approach can work at scale, and if the edge computing part of it remains a big selling point.


> The server also has complete control over execution time and memory. If a V8 program consumes more than its budget, it can easily be terminated.

Is CPU limiting also enabled by V8? I.e., the noisy-neighbor problem?


Well said.

> The future of scripting languages is browser JavaScript.

No. I'd wager it's much more likely _any scripting language but JavaScript_. Statistically speaking it's probably Python, though I wouldn't love that universe either, being a Rubyist.

Browser JavaScript has 30 years of technical debt in the form of the general inability to make non-backwards-compatible breaking changes lest half of web pages stop executing properly. This has been a boon for legacy code, but the language itself has languished for so long and in so many ways. It shouldn't be anyone's North Star, especially given a choice. Web Assembly gives us real choices, and I think it will eventually result in the end of JS's dominance in frontend when we all come to our senses and support improves a bit more.


The technical debt really only exists in some APIs. Many have been deprecated or replaced with newer ones, although the old ones still exist.

JS now has: async/await, lambda arrow functions, generators, optional chaining, nullish coalescing, spread and rest operators, destructuring of objects and arrays, block scope with const and let, default values for function parameters, for-of loops, an iterable interface that works in any loop construct, template literals, dynamic object literal keys as {[foo]: 'bar'}, and many new APIs such as Symbol, Map, Set, WeakMap, Typed Arrays, Workers, BigInts natively as ####n, and so much more.
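
A few of those features in one self-contained snippet, runnable in any current engine:

```javascript
// Optional chaining, nullish coalescing, destructuring, rest/spread,
// computed keys, Set, and template literals -- all standard modern JS.
const key = 'role';
const user = { name: 'Ada', [key]: 'admin', prefs: { theme: null } };

const { name, ...rest } = user;                  // destructuring + rest
const theme = user.prefs?.theme ?? 'light';      // ?. and ?? together
const tags = [...new Set(['js', 'js', 'wasm'])]; // spread + Set dedupes

console.log(`${name} uses the ${theme} theme`); // Ada uses the light theme
console.log(rest.role, tags);                   // admin [ 'js', 'wasm' ]
```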

Historically, the language has been progressive in having: first-class regular expressions as /.../, value-preserving non-boolean-coercing logical operators such as || and &&, closures, polymorphism of any flavor (even multiple inheritance) through prototypes, and so much more.

The language is concise and beautiful, and the reason so many people like yourself have an ill perception of it is that you wrote or worked with codebases full of shitty JS code that looked like PHP spaghetti. That isn't JS's fault.

I've always written JS code into separate classes or modules. I've always found the language to be rather elegant and it's only gotten better in recent years in precision and conciseness.

We now have better standard library objects and better DOM methods. The syntax and actual operators are so much more powerful.

And lastly, it is fast as fuck for a JITted language.


> Historically, the language has been progressive in having: first-class regular expressions as /.../, value-preserving non-boolean-coercing logical operators such as || and &&, closures, polymorphism of any flavor (even multiple inheritance) through prototypes, and so much more.

Perl 5 already had all those features in 2000 (Perl 5.6), or even earlier.


Perl was a great language while it lasted. I used it all the time in opsec.

> And lastly, it is fast as fuck for a JITted language.

Nitpick: it's fast for a dynamically typed language. It's not faster than, say, Java.


I agree that the new syntax approaches beauty, but there are places where it just so awkwardly dodges old skeletons, and there is no sense of stability. Imagine writing JS and having it look good and utilize best practices 5 years later without a complete syntactical overhaul. If you can't imagine that, there's something wrong with the community.

How do you do multiple inheritance with prototypes?

That's the point of prototypes -- they give you the power to do OOP in any way you want to do it. You manually write the procedures to do inheritance through prototype manipulation and generation.

See https://www.crockford.com/javascript/inheritance.html


It's very easy to ignore the cruft in JS, and being able to open the console, type some code, and have a graphical playground makes it ideal for trying things.

Python and Ruby are not in the same category IMHO, and while they are extremely good at what they are designed to do, I doubt that they could reach the lingua franca status js has. I would evaluate JS with PHP, Bash and C together, and I know how weird that sounds.


Lingua franca: "a language used as a means of communication between populations speaking vernaculars that are not mutually intelligible." Javascript is too Web-focused. Python and C are better candidates, as they are more than the Web.

Nearly everybody can read JS; significantly fewer people can read Python or Ruby. It's not even the same ballpark.

There is tooling for most script languages which enables the same and often much better experimentation playgrounds.

JS having it in the browser, which is basically guaranteed to be on all devices and doesn't have to be set up before use, is its true advantage.


A generalization like this doesn't convey much:

> the language itself has languished for so long and in so many ways

When I read this, I thought "I wonder what they're talking about?" and then came up with numerous counter-examples of things that have improved immeasurably since I started writing Javascript. For example, there is now a `class` keyword with truly private properties. When I started writing Javascript, you had to manually manage the prototype chain.
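
For example, the `class`-with-private-properties point in miniature:

```javascript
// Modern JS: `class` with truly private #-fields, inaccessible outside
// the class body (c.#count elsewhere is a SyntaxError).
class Counter {
  #count = 0;
  increment() { return ++this.#count; }
  get value() { return this.#count; }
}

const c = new Counter();
c.increment();
c.increment();
console.log(c.value);       // 2
console.log('#count' in c); // false -- not an ordinary property at all
```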

But, I don't know what you're thinking of with that generalization. So it could be that there are many things that I don't notice anymore, but are still really crufty (like array reverse & sort mutating in place). Bringing those up specifically would improve the discussion.


Specifically, I would call the haphazard implementation of classes in JS one of these examples: a place where true OOP should have been added, but they couldn't, because they had to support existing keywords, web pages, and language conventions, so we end up with the hacky syntax that is in the spec today.

Another example is the mess with for-each loops: .forEach, the for..in syntax, and the peculiar behaviors therein, where nothing does what you'd expect coming from any other programming language.
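
The peculiarity in a nutshell: `for..in` yields keys (as strings, even for arrays), `for..of` yields values, and only `for..of` can be broken out of:

```javascript
const arr = [10, 20, 30];

const keys = [];
for (const k in arr) keys.push(k);   // array indices... as STRINGS
console.log(keys);                   // [ '0', '1', '2' ]

const values = [];
for (const v of arr) values.push(v); // the actual element values
console.log(values);                 // [ 10, 20, 30 ]

// .forEach cannot be broken out of early; for..of can.
const firstBig = [];
for (const v of arr) { if (v > 15) { firstBig.push(v); break; } }
console.log(firstBig);               // [ 20 ]
```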

Then you have things like "should I use var or let". It's embarrassing, frankly. And don't even get me started on the uneven and confusing state of modules across the different ecosystems that exist now.
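
The classic var-vs-let footgun, for anyone who hasn't hit it:

```javascript
// var is function-scoped, so every closure in the loop shares ONE i;
// let creates a fresh binding per iteration.
const withVar = [];
for (var i = 0; i < 3; i++) withVar.push(() => i);
console.log(withVar.map((f) => f())); // [ 3, 3, 3 ]

const withLet = [];
for (let j = 0; j < 3; j++) withLet.push(() => j);
console.log(withLet.map((f) => f())); // [ 0, 1, 2 ]
```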

For someone who has worked in js since the pre ES5 days, it's fine, because we know the history and we know "oh, forEach is a legacy thing, don't use that if you can avoid it" but to newcomers it is extremely overwhelming having this sea of available functions and syntaxes and not knowing what to use. I know because I have had to talk through this with hundreds of coding bootcamp students as I used to serve as a mentor for one of the major ones. I tell all of them that the JS of today is a spaghetti soup of 30 years of tech debt, because it's true.

What's worse, is for newcomers, there is no way of viewing just the new/blessed syntax without going through each ES version and reading through the changes. It is very hard to distinguish the skeletons from the things you should use unless you know the entire history, and that fact is javascript's biggest smell to date. And it's also impossible to explain the new syntax in isolation without a story about what it replaced and why what it replaced fell out of favor.

JS has been adding new language features at an alarming rate, doing so poorly, and the reason for this is that they have to walk on eggshells so as not to break existing web pages. Multiply that by 30 years and you get the mess that is everything since the Netscape days. Things really went off the rails with ES6+, though, in terms of adding "awesome" new stuff without fixing/reworking old stuff. It's just plastered on in time-sensitive, fragile layers.


That's not the experience I have actually using javascript on a day-to-day basis. People don't use prototypes anymore, they use classes. People don't use var anymore, they use const or let. People generally don't use `for...in` or .forEach() anymore, they use `for...of`. The javascript experience is a lot cleaner now, because of all the evolution in the past few years. For those who only code in javascript once every few years it can be jarring, but the ecosystem as a whole moves very fast (new frameworks and runtimes like Svelte or Deno), and that can be a good thing too.

JavaScript classes are nothing more than syntactic sugar. It’s still prototype-based inheritance under the hood.

Prototypal inheritance is considered one of JavaScript’s strong points, and many, including myself, don’t think the class syntactic sugar that was added has much value. If anything, it just confuses new developers.

A lot of JavaScript’s detractors simply don’t understand the language very well.
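
The "sugar" claim is directly observable at runtime:

```javascript
// `class` is observable sugar over prototypes.
class Animal { speak() { return 'noise'; } }

console.log(typeof Animal); // 'function' -- a class IS a constructor function
console.log(Object.getPrototypeOf(new Animal()) === Animal.prototype); // true

// The same shape, written the pre-ES6 way:
function Beast() {}
Beast.prototype.speak = function () { return 'noise'; };
console.log(new Beast().speak() === new Animal().speak()); // true
```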


Right. The whole language is held up by a bunch of "people generally" conventions that are non-obvious and frankly difficult to figure out if you haven't been watching the language for 1+ decades. It simply isn't designed for novices to pick up in 2022, and that's a smell.

"People don't use...". They did and they will and you will have to read and understand and maintain that cruft for years to come.

Rust solves this kind of thing using "editions" that ensure some features are available and disable deprecated ones. Perl goes super far in this direction with user-defined pragmas (https://perldoc.perl.org/perlpragma). Javascript has something like this in `"use strict"`. But the standards bodies have been against adding more of these, or gating new features behind them. The issue always comes down to: how would you choose which features to eliminate? Some, like `var`, can be turned off, no problem. But others, like `.forEach`, are more debatable. I like forEach; it doesn't seem like legacy to me to have both functional and imperative loops.
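
The `"use strict"` gate in action, at function scope:

```javascript
// Inside a strict scope, some sloppy-mode behavior becomes an error.
function strictAssign() {
  'use strict';
  try {
    oops = 1; // assigning to an undeclared name: silently creates a
              // global in sloppy mode, ReferenceError under "use strict"
    return 'assigned';
  } catch (e) {
    return e.name;
  }
}

console.log(strictAssign()); // 'ReferenceError'
```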

Another language that has continuously accreted language changes is C++, and I've really struggled to learn it for that reason. It's so hard to find a "how to do it right" kind of guidepost for C++. Like modern JS, there's so many build systems to pick before you even get started...


Backwards compatibility !== languishing

JS has added a ton of nice features while continuing to support the skeletons in its closet. But you're free to ignore the vast majority of them. You can block the old cruft using linters (e.g. preventing usage of `var`).
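
A minimal sketch of such a linter setup, as an ESLint flat-config file (these are real ESLint core rule names; the exact config file shape depends on your ESLint version):

```javascript
// eslint.config.js -- lint away the old cruft.
const config = [
  {
    rules: {
      'no-var': 'error',       // ban var in favor of let/const
      'prefer-const': 'error', // require const where nothing is reassigned
      eqeqeq: 'error',         // require === over ==
    },
  },
];

module.exports = config;
```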


Really? In 2022 you still think this, when people had the option to use any language they wanted on the server side, and yet Node's popularity is clear for even the most blinkered Rubyist or Pythonista to see... I've listened for decades while Python and Ruby advocates whined about how JS was terrible and should be replaced in the browser, and yet when people had a choice on the server side, they chose Node and proved you all wrong.

Maybe in the far future, but in the near- and medium- term the same backwards compatibility that you believe (a little unfairly, I think) has hobbled Javascript will keep it dominant. If existing web applications can't be bothered to update to support ES6 or Typescript, what will make them rewrite in Web Assembly? New applications will, maybe, but it's a very, very slow change, with so many opportunities for new events to change the conclusion.

Python is loaded with as much weirdness as JS.

Python is not popular because of its lack of technical debt, etc.


I'd love to learn more about the concept of JavaScript containers, but all I've got so far is several paragraphs about language supremacy (ick) and a veiled hiring advertisement.

They don't exist yet, not fully. But what you _really_ want to look at right now is WASM as a replacement for Docker.

For example:

https://developer.okta.com/blog/2022/01/28/webassembly-on-ku...

https://training.linuxfoundation.org/blog/how-wasi-makes-con...


OK, after I read this, I can't stop thinking about how much I love this idea. I've been searching and searching thinking that there must be a toolset in alpha out there somewhere that I can play with.

The only thing I can find right now is Krustlet, which is not really what I'm looking for. I want something that I can run on my own hardware. Have you (or anyone else 'round these parts) heard of a WASM replacement for Docker in development?

We're looking for a WASM runtime that can handle orchestration of multiple assemblies on the same machine, I suppose: https://www.docker.com/wp-content/uploads/2021/10/Docker-Web...

EDIT: I found what we're looking for! Docker Engine is a container runtime. What we need is an alternate container runtime that will work with WASM. K8s-compatible options are described in the following article: https://kubernetes.io/docs/setup/production-environment/cont...

It turns out that both CRI-O and containerd can run WASM assemblies via the WasmEdge runtime in crun:

CRI-O: https://wasmedge.org/book/en/kubernetes/cri/crio.html

containerd: https://wasmedge.org/book/en/kubernetes/cri/containerd.html

And you can create WASM containers without docker hub images by controlling crun directly: https://wasmedge.org/book/en/kubernetes/container/crun.html


> Scripting languages allow business logic to be written faster and cheaper. The scripting languages (Python, Ruby, Lua, Shell, Perl, Smalltalk, JavaScript) are pretty similar.

That is a pretty bold claim, that scripting languages allow business logic to be written faster and cheaper. I think it depends completely on the size and complexity of your business logic, and on whether you factor in the cost of maintaining that software over time.


Agreed. I've spent a whole lot of time hopelessly optimizing Python so that it ran only 10% the speed of a naive Go implementation (and naturally the Python version was less maintainable as it was a mess of numpy and multiprocessing). Similarly, when I write Python, I spend a lot of time trying to get accurate type information (I've never worked in a project that actually used mypy, so type annotations could be missing or incorrect).

I'm sure dynamic languages were more productive than 90s-era Java, C, and C++, but I don't think those productivity claims hold today.


Please sir, don't blame Python for a PhD's NumPy mess.

I'm no great fan of Numpy, but other Python performance solutions aren't more maintainable. :(

Well a naive Python solution would be much slower than the Numpy one, so...

Sometimes this is true, and the parent's criticism is certainly unhelpful. That said, I've definitely seen people try to optimize with numpy and end up with something that's even slower (Numpy isn't a substitute for mechanical sympathy, but it's often treated as magic dust by the Python community). Basically Python performance is just a hot mess all around. :(

I guess I'm the OP. Sorry my comment didn't help you; please let me know what you need.

> Basically Python performance is just a hot mess all around.

Maybe don't use Python where performance matters? Just a thought.


That's the right answer, but it's not helpful for people who don't get to make the call on what language to use. Moreover, a lot of people don't think an application will have a performance problem until suddenly it does (either because of scale or shifting requirements or whatever). Even worse, a whole lot of people simplistically believe that you can just throw C/multiprocessing/numpy at any Python performance problem, which is how you make Python even slower and less maintainable.

> whether you factor in the cost of maintaining that software over time

In many cases the largest cost indeed


The modern web browser as a universal VM is the single greatest technical achievement in the history of personal computing. I think Ryan is spot on here. There's plenty of reason to be critical of the web as a platform, but the reality is that we have finally achieved what people dreamed about in the 80s and 90s: a true write-once-run-everywhere platform. And Javascript (more specifically, V8) is core to that.

> The future of scripting languages is browser JavaScript. The fundamental mistake of Node.js was diverging from the browser as new APIs were standardized, inventing too much. In 2010, we didn’t have ES modules, but once it was standardized it should have been brought into Node

Everyone likes s**g on JS/Node.js, but things are a lot more nuanced than this. In Javascript it's been normal that first there's some user-land fix to some problem, then it's implemented within the language core, and then things move over, including (not surprisingly) ESModules. The problem is that the way ESM was implemented was totally incompatible with CommonJS, making the transition a really painful one. On the other hand and TBF, Node.js innovation has slowed significantly and instead focused on stability, which is good to some but not to others (see the founder creating Deno with an implementation closer to the browser).

So what has happened is, effectively, what has always happened: Node.js ("userland" for the standard) introduces new concepts, then the ECMAScript body makes a standard that is similar but not the same, then Node.js has to change everything to adapt to that standard, and it's a PITA that leaves everyone scalded and burned out. Node.js in particular had to innovate a lot, since it's running the same language in a different context, so all of these have followed that path: AJAX, fetch(), crypto, pipes, promises, request handling, Buffer, ESM/CommonJS, etc. Most of the things we use day-to-day.


> In Javascript it's been normal

What you're describing is not just "normal" but the way standards work in the web space, both in theory and in practice, driven by both the browsers and the standards bodies.

1. Developers/browser developers want to be able to do something in browsers

2. One engine implements it, developers start building with it

3. Second engine implements it

4. Work begins to standardize it

5. Both (all) engines start moving from their implementation to the standardized one

If the step after one of the steps doesn't happen (like only one browser implements a new API), the next step doesn't happen. Standards are not created for one engine, but only once at least two engines have implemented something.


It's completely wrong that performance doesn't matter for the server side, in fact it's the complete opposite.

Javascript is a fine language for client-side software because the client is paying for it (and the client's computer, tablet or phone is running idle most of the time anyway)

Firms that run web services pay dearly to run their servers on the cloud, expect to run at a high utilization fraction, and if they ran their infrastructure on a slow language like Ruby they would be paying cloud bills 10x what they'd get for infrastructure written in Go.

The people who make the decision are paying the bills and that means they make a very different decision.


I wouldn't lump JavaScript / Node.js with the other interpreted languages. While it may look similar on the surface, JS has an advantage over most other languages: It's used in browsers, and as such, there's an arms race between 4 very large companies to make their browser the fastest, resulting in enormous amounts of effort being made in making JS faster.

While it's never going to match compiled languages like C or Go, it's definitely the fastest interpreted language out there; I've seen studies putting it in the same order of magnitude as C (i.e. it's 2x slower, not 10x). Moreover, its concurrency model (callbacks / event handlers) makes it uniquely suited to handle IO-bound workloads easily, which is 99% of the web nowadays (getting a request, sending a query to the database, and sending the response back).


That’s only relevant for browser side JS, though. V8 has a monopoly on server side JS. It’s quite slow on its own, but the bloat that people pile on top of it makes it even slower. And V8 is definitely not “the fastest” interpreted language - I have found that LuaJIT performs much better for my scripting needs.

> And V8 is definitely not “the fastest” interpreted language - I have found that LuaJIT performs much better for my scripting needs.

According to a 2021 performance test¹, Lua with LuaJIT is slower than JavaScript with V8. Note that the "quite slow" JavaScript running on V8 is nearly as fast as Java.

¹ https://eklausmeier.goip.de/blog/2021/07-13-performance-comp...


Yeah, but V8 is Chrome's engine, which _is_ in competition for fastest engine against Safari, Firefox & Edge (although Edge isn't much of a competitor anymore).

The event loop is the original sin of Node.js. I feel like you either fight the broken concurrency model at every step, or you slip beneath the waves. The fact that languages like Erlang have been around forever makes it a real slap in the face.

I don't know Erlang, but honestly it's much easier to grasp / easier to avoid footguns with callbacks than with threads. The main "issue" is callback pyramids when you have a lot of dependent callbacks, but that's solved relatively easily with Promises, and even more easily with async/await.
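
A toy illustration of the pyramid and its flattened equivalent (`step` is a stand-in for any callback-style API):

```javascript
// Three dependent async steps, written both ways.
function step(n, cb) { setImmediate(() => cb(null, n + 1)); }

// The callback pyramid:
step(0, (err, a) => {
  step(a, (err, b) => {
    step(b, (err, c) => {
      console.log('callbacks:', c); // callbacks: 3
    });
  });
});

// The same flow, flattened with a Promise wrapper and async/await:
const stepP = (n) => new Promise((res) => step(n, (err, v) => res(v)));

(async () => {
  const a = await stepP(0);
  const b = await stepP(a);
  const c = await stepP(b);
  console.log('async/await:', c); // async/await: 3
})();
```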

It's not a binary choice between a run loop with callbacks or threads; that's why you should probably take the time to learn about Erlang's actor model implementation, which Dart's Isolates also implement to some degree.

> Moreover, its concurrency model

JS is a single-threaded language, there is no concurrency


This is just not the case for a vast majority of businesses. It doesn't matter if you are using Ruby or Go or ASM if you write N+1 SQL queries all over the place.

Similarly, caching is universal and is the answer to nearly all API issues. Raw compute is such a non-factor in a significant portion of use-cases.
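
For anyone unfamiliar, a toy illustration of the N+1 shape (the `db` object is an in-memory stand-in, not a real database API):

```javascript
// N+1: one query for the list, then one query PER item, versus a single
// batched query. We count queries to make the difference visible.
const db = {
  queries: 0,
  users: [{ id: 1 }, { id: 2 }, { id: 3 }],
  posts: { 1: ['a'], 2: ['b'], 3: ['c'] },
  postsFor(id) { this.queries++; return this.posts[id]; },
  postsForMany(ids) { this.queries++; return ids.map((id) => this.posts[id]); },
};

// N+1: 1 query for users + N queries for posts.
db.queries = 1; // pretend we just fetched db.users
const slow = db.users.map((u) => db.postsFor(u.id));
console.log(db.queries); // 4 -- grows with the number of users

// Batched: 2 queries total, no matter how many users.
db.queries = 1;
const fast = db.postsForMany(db.users.map((u) => u.id));
console.log(db.queries); // 2
```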


I generally agree and for one welcome our systems-language-based-backend overlords in the form of robust Rust and Crystal-based web frameworks (Go I dislike for other reasons, but its heart is in the right place).

That said, I've had to eat my words recently on this -- serverless compute is just so damn easy to use and scale that it almost doesn't matter how inefficient the language you're running on it is, as long as there are good ways of optimizing around cold starts (i.e. pre-warming). General efficiency of the backend language mattered a lot more when we had long-running web servers, where memory leaks and stability issues would inevitably rear their heads after the server had been running for a few days. With serverless, these things almost no longer matter, because every micro VM is so short-lived that there is no opportunity for these types of issues to arise. When it comes to raw performance, scripting languages _are_ sufficient if all you're doing is CRUD, in which case the main bottleneck is going to be your database anyway, especially if you are using a robust ORM like ActiveRecord, which heavily optimizes the server-side portion of this.

At Arist (YC S20) we are using Ruby on Jets in production, and it's quite efficient. We have a ~0.996 Apdex score, pages typically load within ~80-150ms, and many of our pages are server-side rendered. I also operate a personal project that uses a Rust-based web framework. If I run this project on a raw EBS cluster (instead of serverless), I can get pages to load from the server in as quick as 20-40ms, but my point is that the difference is usually visually imperceptible -- 110ms is fast enough 99% of the time. When you factor in that scripting languages, and in particular Ruby, have significantly higher productivity than some of the systems languages, it becomes pretty obvious why a lot of startups stick with scripting languages.

All of that said, bring on the good and robust Rust and crystal web frameworks!!!


What car do you use to go to the supermarket? An F1 car? A truck? A sedan?

JavaScript is totally fine for serving thousands of reqs per second. In many cases the bottleneck will be third party APIs or the DB. The cost of running that will be negligible.

If you have hundreds of thousands (or millions) of reqs per second, then of course cost can become an important factor. But at that stage you probably have the resources to build whatever you want with the best possible language for the use case.


I don't think it's unfair to say that 1 man-hour equates to many machine-hours (in terms of dollar cost). Then in terms of per-core performance, modern JavaScript runtimes are nearly competitive with Go and per-core performance is ultimately what matters at scale since you'll load-balance between cores or VMs to saturate your compute.

Per-core performance is one thing, realized performance on today's multi-core machines is something else.

I have been writing back ends in Java and C# for more than a decade and it is widespread for people to take advantage of multi-core systems in two ways if they can: (1) threads sharing data structures such as system configuration and caches (e.g. it is no problem to have 10 or 100 megabytes of configuration data for a Java-based system) and (2) using Executors to split up tasks into smaller pieces and running them concurrently.

In Node.js, Python, and other effectively single-threaded runtimes (whether due to a GIL or a single event loop), you can't do the above, and you slow down from "configuration at the speed of RAM" to "parsing configuration over and over again", "configuration at the speed of the database", etc.

I see people using Node.js for build systems but I think it's still an unusual choice (like Python) for a back end for a commercial system.


I'm not disagreeing with you, but I am curious: what's the effective difference for most people (concerning perf and utilization) between multicore support in Java and Go and just running multiple processes in Node? (Edit: let's just ignore Python to make this simpler.)

> what's the effective difference for most people (concerning perf and utilization) between multicore support in Java and Go and just running multiple processes in Node

The perf difference here probably isn't that great (although Java/Go will likely still be faster), but if you're at the point of running multiple processes you may well find that it's less dev effort to write your code in Java/Go (assuming you are familiar with both).


Yes I agree, it's always been simpler for me to operate Go apps rather than Node ones (but also because of memory usage in callback- and streaming-heavy JavaScript).

Multi-process web serving in Node.js is doable but operationally more of a pain. The cluster module will let you spawn a bunch of processes all sharing the same listening port, but sharing caches, connection pools, etc. becomes much more difficult. If some aspect is particularly CPU-bound you can use worker threads, but for run-of-the-mill web requests I'm not sure it's worth it.

Processes eat more resources, and even taking them out of the equation, dynamic-language runtimes have fewer opportunities for good JIT code optimization than the type systems of languages like Java and Go allow for.

That's in regards to Node; as for Python, 99% of deployments are CPython anyway.


Is a fixed process pool of one process per core really more resource-intensive overall than a goroutine-per-request model in Go?

It would kind of seem like a wash to me, naively.


Because you're missing that a goroutine doesn't incur the stack size, the OS-level heap data structures, or the CPU context switches into kernel code that a full-blown process does.

Let alone the detail that a goroutine is full-blown native code, while the Node/Python process is interpreted; and even when a JIT is used, many C2-level optimisations are out of reach for dynamic languages.

It is no wonder that even with the herculean effort that has gone into V8, for the ultimate performance it needs help from GPU shaders and WebAssembly, both typed.


I will extend this question to things like Fargate.

For most firms that run web services, developer time (the engineers’ salary) is 10x or 100x the cloud bills.

They'll choose Ruby over C++ any day for purely financial reasons (obviously there are more considerations, but this holds even if it were only about money).


So you are saying no scripted language is appropriate for the server? As far as I know, JS is extremely optimized and has no issues holding its own against Python.

I recently compared the performance of a Node vs. a Rust server.

JS has many tricks up its sleeve that shouldn't be underestimated.


"To summarize: scripting languages are useful, but they’re all pretty much the same, "

Yeah ... I'm just going to have to go ahead and um disagree about that.

But I agree with the point that a higher level and well defined universal container/vm can be valuable. I think it'll be wasm-based though, not js.


It should be WASM - I hope it will be WASM. JS leaves a lot to be desired and there are some things that will just be too hard to change.

It is WASM. This is implied in the post, even though the emphasis is on JS. WASM is still early and has a young ecosystem. Only now are we really starting to explore what WASM can do and I predict that will only accelerate.

Yea, I believe they made that fairly clear in this paragraph of the article.

> Instead of invoking Linux executables, like shell does, the JavaScript sandbox can invoke Wasm. If you have some computational heavy lifting, like image resizing, it probably makes sense to use Wasm rather than writing it in JS. Just like you wouldn’t write image resizing code in bash, you’d spawn imagemagick.

I agree with you all. WASM should be the solution to the problem in this space.


If you polled all the engineers that work on actually designing and building browsers, I bet they would all say "Of course Wasm containers sounds way better than JavaScript containers. What are you, a pathetic frontend noob? Oh, wait, even they don't use JavaScript anymore. They all use a superset of it that's slightly less stupid. JavaScript containers, sheesh. Go be quiet in the corner."

Agreed. The only thing JS has over WASM is that you don't need to ship a language runtime or standard library with your application. The counterpoint is that the JS standard library is pretty anemic so the savings might not be that great, and applications might need a more recent runtime version than the host offers (e.g., Java applications often bundle the JVM so they don't have to worry about the version installed on the target platform).

For me, using scripting languages like JavaScript and Python for everything they were not intended for (backends, desktop apps, mobile apps) is just a crazy train.

I do believe scripting languages are terribly useful for scripting the browser, a game, or a tool, for writing glue code, small tools, and use-once code. But not for everything.

People should try to learn and use more than one programming language.


The awful performance you can be stuck with when you can only use emulators and v8 is going to come back to haunt some of you, I believe. It is indeed easy to sell something like this, as it looks easy to work with, the same way that no-code looks easy. But inevitably even native performance is not nearly enough and you need to employ algorithms that use 256- and 512-bit vector operations. At that point, we are talking about running natively on the CPU, and for that you need something that isn't just a web app.

I remember I recently saw a lecture by an industry professional who took a Python algorithm and improved it 100 000x, in parts. That is, in stages he explained how to utilize the hardware on that particular machine, to improve the algorithm so much that it was hard to believe. Anyone have a link? I looked and I could not find it. It was a recent thing that I believe I saw on this site.

When it comes to configuring web services: It is extremely important to configure them in such a way that you can fully or partially cache content, reducing costs. I get it, if you are a startup you need to move fast, but with the way things are going these days, you should have someone on your team who can at the very least deal with the basics of caching.

You also cannot pre-initialize v8 as far as I know, so you inevitably end up putting it inside an emulator anyway, and guess what happens to the performance then? I could be wrong.


> Even native performance is not nearly enough and you need to employ algorithms that use 256- and 512-bit vector operations. At that point, we are talking about running natively on the CPU, and for that you need something that isn't just a web app.

Yes, as the article said, this is what WebAssembly is for. Shell : Executable :: JavaScript : WebAssembly. WebAssembly supports wide SIMD instructions (https://v8.dev/features/simd) and many WebAssembly runtimes pre-compile the Wasm to native machine code (e.g. https://crates.io/crates/wasmer-compiler-llvm).

In a way, you could think of WebAssembly like a more portable LLVM bitcode - it’s a compact, partially optimized representation of a program ready for an optimizing machine-specific compiler to lower to the native architecture. But, it’s also possible to interpret it in contexts that prefer fast start up.
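To make "compact representation" concrete: a complete Wasm module exporting an `add` function fits in 41 bytes. This is the standard hand-assembled example, runnable in Node or a browser:

```javascript
// A hand-assembled 41-byte Wasm module exporting add(a, b) -> a + b.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                                // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                          // code section: 1 body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                    // local.get 0; local.get 1; i32.add; end
]);

WebAssembly.instantiate(bytes).then(({ instance }) => {
  console.log(instance.exports.add(2, 3)); // 5
});
```

The runtime is free to interpret this, baseline-compile it, or hand it to an optimizing backend, which is exactly the "portable bitcode" property described above.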


> You also cannot pre-initialize v8 as far as I know

You can create a snapshot/image with V8 to reduce startup times (https://v8.dev/blog/custom-startup-snapshots).

Node already does this for some of its core libraries (https://github.com/nodejs/node/issues/17058). There are plans to expose this functionality to users so that any application and libraries can be snapshotted (https://github.com/nodejs/node/issues/35711).


> I remember I recently saw a lecture by an industry professional who took a Python algorithm and improved it 100 000x, in parts. That is, in stages he explained how to utilize the hardware on that particular machine, to improve the algorithm so much that it was hard to believe. Anyone have a link?

I suspect it's this one:

https://www.youtube.com/watch?v=e08kOj2kISU






“Any application that can be written in JavaScript, will eventually be written in JavaScript.”

Gary's predictions are also still on track https://www.destroyallsoftware.com/talks/the-birth-and-death...

My question is about the implementation. Are you running a Node.js executable in a Docker-style container for each of these? Or are you attempting to achieve the constraints that containers provide at a higher level of abstraction? How do you restrict access to network, disk, excessive RAM, and excessive CPU in a high level JavaScript container — is there an actual JavaScript runtime that supports constraining all of those?

I also wonder about applications where WebAssembly isn't the most performant thing — image resizing, one of your examples, is also an example of a situation where you get significantly better performance with SIMD instructions. The experiment was tried here:

https://www.libvips.org/2020/09/01/libvips-for-webassembly.h...


Reading this makes me think of a five-year-old confidently explaining how something works to their parent.

Stick to a detailed comparison with... Shell !? So that's the context for universality he's touting.

the birth and death of javascript is a perfect primer for this article in case anyone did not see it yet (https://www.destroyallsoftware.com/talks/the-birth-and-death...)

Ryan Dahl is totally on-point here. It's one of those things that will be obvious in 10 years or less. Only SaaS providers need bother with Linux containers or "real" operating systems; the vast majority of everything else will be glued together over emerging runtimes like Deno/CFW, HTTP, and WASM.

This is an incredible API I wish existed in the browser (maybe over WebRTC?)

    addEventListener("fetch", (event) => {
      event.respondWith(new Response("Hello world"));
    });

This is how it works with ServiceWorkers[0]

[0]: https://developer.mozilla.org/en-US/docs/Web/API/FetchEvent


To expand, this API is directly inspired by the service worker API. A service worker can be thought of as kind of a reverse proxy (logically speaking) that is right in the browser. People are doing very cool things with it. It's a great API for PWAs, SPAs and the like but it also has quite a bit of utility for traditional sites.

However, there is one big caveat: it's super easy to mess things up with it, and it is harder to test and debug (Chrome is the best tool for this IMO, as a fan of FF). You'd better have a very straightforward way to update your service worker right off the bat.
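One straightforward update strategy is a versioned cache name combined with `skipWaiting`/`clients.claim` (real Service Worker APIs); a minimal sketch, where the cache name and the environment guard are illustrative:

```javascript
// sw.js — sketch of an update-friendly service worker.
// Bump CACHE_NAME on every deploy so 'activate' can delete stale caches.
const CACHE_NAME = 'app-v2';

// Pure helper: which of the existing cache keys are stale?
function staleCaches(keys, current) {
  return keys.filter((k) => k !== current);
}

// Guarded so the file is also loadable outside a worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('install', (event) => {
    // Activate the new worker immediately instead of waiting for old tabs to close.
    event.waitUntil(self.skipWaiting());
  });
  self.addEventListener('activate', (event) => {
    event.waitUntil(
      caches.keys()
        .then((keys) => Promise.all(staleCaches(keys, CACHE_NAME).map((k) => caches.delete(k))))
        .then(() => self.clients.claim())
    );
  });
}
```

The key property is that a stale worker can never outlive a deploy by more than one page load, which avoids the classic "users stuck on an old version" failure mode.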


> Technology is difficult to predict, but certainly the World Wide Web will be here in 10 years.

I thought the WWW was already here

Probably should say "the World Wide Web will STILL be here in 10 years."


Some more technical details would have been lovely.

Gary Bernhardt is a prophet.

JavaScript is the next scripting language?

What happens to Lua then?


As a huge Lua fan and someone who maintains several top-10 npm packages, both of these languages have huge warts that end up hurting you in the mid- to long-term.

>Scripting languages allow business logic to be written faster and cheaper.

You can cat business logic to /dev/null. That's even faster and cheaper.

Correctness? Reliability? Speed? Extensibility?



