
I have a somewhat off topic question: is there anything in the design of Javascript that mandates single-threadedness? Could any Javascript engine implement threads?

I'm asking because I'm wondering if Node.js's evented approach is the only way to do things.

If you allow threads to share memory arbitrarily you need to add locking to all internal VM structures, which is going to be a significant slowdown.

Also, allowing arbitrary mutable memory sharing is no longer considered good language-design practice, since it makes software unreliable.

Without arbitrary memory sharing, multi-threading is already supported with web workers.

Web Workers are awkward in that you have to put the worker code in a separate file.

Agreed! FWIW this is why I made Operative. It gives you a way of writing "inline" JS that utilizes web workers (caveat: not actually inline; no scope/context access of course). It provides good support across browsers and fallbacks for envs where web workers don't exist (and ~all the in-between cases): https://github.com/padolsey/operative
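Independent of Operative, the standard trick for "inline" workers is to wrap the worker source in a Blob and hand its object URL to the Worker constructor. A minimal sketch (the doubler worker is hypothetical; the guard lets the snippet run outside browsers, where Worker doesn't exist):

```javascript
// Worker code as a string -- no separate file needed. As the parent notes,
// the worker still gets its own isolated scope, not the page's.
const workerSource = `
  self.onmessage = (e) => {
    self.postMessage(e.data * 2);
  };
`;

const blob = new Blob([workerSource], { type: 'application/javascript' });
const url = URL.createObjectURL(blob);

// Worker is a browser global; guard so the sketch is portable.
if (typeof Worker !== 'undefined') {
  const worker = new Worker(url);
  worker.onmessage = (e) => console.log('doubled:', e.data);
  worker.postMessage(21);
}
```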

Looks nice! I took a stab at this a while back https://gist.github.com/icodeforlove/deb0f19a9e7bd528bd48

Not with a little imagination and hackishness!


Sharing only typed arrays shouldn't be hard.

Well it's not sharing, but web workers have a zero-copy way of transferring typed arrays between web workers. [1]

It gives many of the benefits of sharing memory without all of the gotchas.
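A sketch of the transfer-list mechanics. A MessageChannel stands in for a Worker here so the snippet is self-contained; worker.postMessage accepts the same transfer list:

```javascript
const { port1, port2 } = new MessageChannel();  // stand-in for a Worker

const pixels = new Uint8Array(1024 * 1024);     // pretend image data

// Listing the underlying ArrayBuffer in the transfer list moves it
// to the other side instead of copying it (zero-copy).
port1.postMessage(pixels.buffer, [pixels.buffer]);

// The gotcha: the sender's buffer is now detached, so its length drops
// to zero and the old view can no longer read the data.
console.log(pixels.buffer.byteLength);  // 0

port1.close();
```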


I use that a lot already but I wish there was at least read-only shared access.

There's a draft spec for SharedArrayBuffer:


Shared memory in Javascript is a Bad Idea(tm). Run, don't walk, to your standards body, and tell them not to entertain such notions.

You know what happens with shared typed arrays? Shared Uint8 arrays.

You know what happens with shared Uint8 arrays? Multiple workers using JSON.parse.

Do you want ants? Because this is how we get ants. :(

Why would anybody ever want to JSON.parse typed arrays from web workers when the parsed data (or unparsed strings) can be passed around directly? Strings are immutable and are not copied when you pass them around. I don't see your point.

Remember: web workers communicate (to the best of my knowledge) using a full serialization/deserialization of message objects (which is why they added transferable objects). So, to create a one-to-many broadcast mechanism, you'd do exactly what I described. Additionally, I am unsure, but I could easily see immutable strings being copied nonetheless between web workers because of the desire to give every worker a private heap.

I still think it's faster to send the same message to all workers than to have each one run JSON.parse (not to mention much more compatible). Since each worker is going to make a copy anyway (via JSON.parse or by receiving a message), why does it matter?

Also, assuming what you say is true, what's the problem? It's much easier for the JS implementations to synchronize things only with a specific type of typed arrays than sharing all kinds of GC-managed data structures.
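For concreteness, the broadcast pattern under discussion amounts to this: JSON bytes sitting in a Uint8 array (shared or not), decoded and parsed by each consumer. A minimal sketch using TextEncoder/TextDecoder:

```javascript
// Producer side: serialize once into bytes that could live in a
// (hypothetically shared) Uint8 array visible to many workers.
const bytes = new TextEncoder().encode(JSON.stringify({ n: 42 }));

// Each consumer: decode the bytes back to a string, then JSON.parse --
// the per-worker cost the thread is debating.
const obj = JSON.parse(new TextDecoder().decode(bytes));
console.log(obj.n);  // 42
```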

There is nothing in the language itself that allows multi-threading. Going forward, they are adding support in the language for the async keyword, similar to F#. Any way to achieve parallelism has to come from APIs, e.g. Web Workers in the browser.
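A minimal sketch of what the proposed async keyword looks like in use (assuming the eventual async/await shape): an async function returns a promise, and await suspends it without blocking the single thread.

```javascript
// Small promise-based helper: resolves with `value` after `ms` milliseconds.
function delay(ms, value) {
  return new Promise((resolve) => setTimeout(resolve, ms, value));
}

async function main() {
  const a = await delay(10, 1);  // yields to the event loop; nothing blocks
  const b = await delay(10, 2);
  return a + b;
}

main().then((sum) => console.log('sum:', sum));  // logs "sum: 3"
```

Note this is concurrency on one thread, not parallelism; it cooperates with the event loop rather than replacing it.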

No! JavaScript the language and threads are not related. JavaScript implementations on the JVM, like RingoJS, use JVM threads to execute JavaScript in parallel. In that case you get both event-based execution within a thread and multi-threading at the same time.


The JVM implementations of Javascript (e.g. Nashorn) support multithreading. Imho it's not a good thing, because all existing Javascript code is written with single-threading in mind and lots of stuff will break if you use it from multiple threads. Multithreaded code for Nashorn requires the use of JVM synchronization primitives, which are then not supported by other Javascript engines.

It used to be possible to write Firefox extensions that used multi-threaded JS; however, as in any other language, accessing the DOM was not thread-safe (so you couldn't have a window as your global object, and the threads therefore needed to live in separate source files). As JS lacked native support for threading, you also had to be very, very careful and often ended up with less obvious threading issues anyway.

In the last few years SpiderMonkey (Firefox's JS engine) has dropped support for this more and more, and these days you can't anymore. But that's a consequence of the engine implementation, and not the language.

Mozilla is working on a spec for something called SharedArrayBuffer, which will allow Workers other than the main thread of execution to share memory.

The reason for this is that as soon as you have separate threads sharing memory, it introduces non-determinism into the mix. When developers use synchronization mechanisms incorrectly, or not at all, this can lead to deadlock between threads.

This must not be allowed to occur on the main thread shared with the rendering engine.

So keep your eyes out for SharedArrayBuffer.
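A sketch of the shape the SharedArrayBuffer draft proposes: a buffer whose memory is shared (postMessage sends a handle rather than a copy), with the Atomics operations supplying the synchronization the comments above worry about. Shown single-threaded for brevity; in practice each worker would hold a view onto the same buffer.

```javascript
const sab = new SharedArrayBuffer(4);   // 4 bytes = one Int32 slot
const counter = new Int32Array(sab);    // typed-array view onto shared memory

// With several workers incrementing concurrently, Atomics.add makes the
// read-modify-write indivisible, so no updates are lost.
Atomics.store(counter, 0, 0);
Atomics.add(counter, 0, 1);
Atomics.add(counter, 0, 1);
console.log(Atomics.load(counter, 0));  // 2
```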

The event loop is very important to the existing language semantics and isn't going anywhere.

My speculation is that before Node stepped into the picture, the primary use case for JS was the browser, hence UI manipulation. That usually calls for a single thread that can update the UI, unless you're willing to introduce a lot of mental overhead with synchronization primitives.

I don't think there is anything preventing JS from running shared-memory threads, apart from all the existing code and libraries that aren't thread-safe; but that's the case in many languages.
