

Non-Blocking Asynchronous JSON.parse Using the Fetch API - mohsen1
http://azimi.me/2015/07/30/non-blocking-async-json-parse.html

======
zamalek
> For each message to be sent to or received from a worker we need to convert
> it to a string.

Strange, according to MDN, data sent to web workers only needs to be deep-
cloneable[1]. Is this non-standard Gecko behavior?

> aMessage: This may be any value or JavaScript object handled by the
> structured clone algorithm, which includes cyclical references.

[1]: [https://developer.mozilla.org/en-US/docs/Web/API/Worker/postMessage](https://developer.mozilla.org/en-US/docs/Web/API/Worker/postMessage)

~~~
msoad
It converts it to a string anyway.

~~~
zamalek
Gecko, at the very least, maintains cycles in the object graph. This is
something that a naïve JSON round-trip can't do.
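
For illustration, here is a cyclic object that the structured clone algorithm
handles but a JSON round-trip cannot (this sketch uses `structuredClone`,
available in modern browsers and Node 17+):

```javascript
// A self-referencing object: fine for structured clone, fatal for JSON.
const obj = { name: "a" };
obj.self = obj;

// structuredClone preserves the cycle...
const copy = structuredClone(obj);
console.log(copy.self === copy); // true

// ...while JSON.stringify throws on circular structures.
try {
  JSON.stringify(obj);
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```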

------
fsdffs
If you add a wrapper, maybe use a simple heuristic to switch between
JSON.parse and fetch. Even something as simple as checking the length of the
string might do: shorter than X? Use JSON.parse. Longer? Use fetch.
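
A minimal sketch of such a wrapper; `THRESHOLD` is a made-up cutoff that
would need real per-browser benchmarking, and the large-string branch uses
the `Response`/`Blob` trick from the article:

```javascript
// Assumed cutoff (100 KB) -- purely illustrative, tune with real benchmarks.
const THRESHOLD = 100 * 1024;

function parseSmart(str) {
  if (str.length < THRESHOLD) {
    // Small payload: synchronous parse, wrapped for a uniform Promise API.
    return Promise.resolve(JSON.parse(str));
  }
  // Large payload: hand the string to the fetch machinery, which *may*
  // parse it off the main thread (not guaranteed -- see below).
  return new Response(new Blob([str])).json();
}
```

Both branches return a Promise, so callers don't need to know which path was
taken.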

~~~
nallerooth
Sounds reasonable. The right threshold might, however, differ between
browsers too.

------
jwmerrill
This doesn't actually succeed (yet?) at avoiding blocking the UI thread while
parsing large chunks of JSON. The async API means that browsers _could_ parse
JSON from fetch responses on a different thread, but it doesn't mean that they
do.

Here's a jsbin experiment that shows that the main thread is blocked while
parsing JSON using the author's technique (tested in Chrome 44 and Firefox
41):

[http://jsbin.com/yutahe/8/edit?js,output](http://jsbin.com/yutahe/8/edit?js,output)
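
A minimal version of that kind of experiment might look like the sketch
below: schedule a zero-delay timer, then parse a large string via the
`Response` trick. If parsing really happened off the main thread, the timer
would fire almost immediately; a long delay means the thread was blocked.

```javascript
// Build a large JSON string (~100k small objects).
const big = JSON.stringify(Array.from({ length: 100000 }, (_, i) => ({ i })));

const t0 = Date.now();
const timerDelay = new Promise((resolve) => {
  // With a free main thread this fires within a few ms.
  setTimeout(() => resolve(Date.now() - t0), 0);
});

// The "async" parse under test.
const parsed = new Response(new Blob([big])).json();

Promise.all([timerDelay, parsed]).then(([delay, data]) => {
  console.log(`timer fired after ${delay}ms; parsed ${data.length} items`);
});
```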

------
davej
My results on Chrome 43 running on an i7 MacBook Pro. JSON generated from:
[http://beta.json-generator.com/NJG1eN49](http://beta.json-generator.com/NJG1eN49)

~34KB json string (1st run):

    sync: total time (blocking): 0.412ms
    async: blocking time: 1.822ms
    async: total time: 3.943ms
    ➜ str.length
    34527

~34KB json string (10th run):

    sync: total time (blocking): 0.569ms
    async: blocking time: 0.438ms
    async: total time: 2.336ms

~112KB json string (1st run):

    sync: total time (blocking): 1.217ms
    async: blocking time: 1.036ms
    async: total time: 4.818ms
    ➜ str.length
    114745

~112KB json string (10th run):

    sync: total time (blocking): 1.658ms
    async: blocking time: 0.683ms
    async: total time: 4.444ms

This could do with some proper benchmarking by someone who knows more about
benchmarking this stuff for real-world usage (i.e. not me). By the way, the
2nd-10th run results were pretty similar; the amount of time the async
function was blocking gradually decreased as it was run more often (I ran the
~34KB tests first).

------
m1el
Maybe it would be better to create a streaming message-passing API, e.g. send
an object to the worker in smaller parts and consume it in parts, instead of
serializing the whole thing to JSON and _then_ sending it.
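
A rough sketch of the idea; in a real setup each chunk would travel through
`worker.postMessage`, but here the sender/receiver pair is modeled as plain
functions (all names are made up for illustration):

```javascript
// Sender side: slice the serialized payload into fixed-size chunks.
function* chunksOf(str, size) {
  for (let i = 0; i < str.length; i += size) {
    yield str.slice(i, i + size);
  }
}

// Receiver side: accumulate chunks, parse once the stream ends.
function makeReceiver() {
  const parts = [];
  return {
    onChunk(part) { parts.push(part); },
    onEnd() { return JSON.parse(parts.join("")); },
  };
}

// Simulated round trip.
const payload = { items: Array.from({ length: 1000 }, (_, i) => i) };
const receiver = makeReceiver();
for (const part of chunksOf(JSON.stringify(payload), 1024)) {
  receiver.onChunk(part);
}
const result = receiver.onEnd();
```

Spreading the chunks over multiple messages (or event-loop turns) is what
would keep any single blocking slice small.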

------
efxzsh
Hmm, ok but why not

    function asyncParse(string) {
      return new Promise(function(resolve, reject) {
        resolve(JSON.parse(string));
      });
    }

\- or -

    let asyncParse = async (string) => JSON.parse(string);

~~~
alt_
Promises still run on the main thread and will block other JavaScript. The
purpose of this hack is to move the parsing off the main thread.
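
A small demonstration of why the wrapper doesn't help: the Promise executor
runs synchronously, on the main thread, before the constructor even returns.

```javascript
const order = [];
const p = new Promise((resolve) => {
  // This runs immediately -- a big JSON.parse here would block right now.
  order.push("executor");
  resolve(JSON.parse('{"x": 1}'));
});
order.push("after constructor");
// At this point `order` already contains both entries, in that sequence:
// the parse finished before execution continued past `new Promise`.
p.then((value) => console.log(order, value.x));
```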

~~~
efxzsh
Ok, got it. Thx

------
jchomali
I totally agree with this

