
The Cost of JavaScript in 2019 - kiyanwang
https://v8.dev/blog/cost-of-javascript-2019
======
lootsauce
Code splitting can be a nice optimization, but it can also be a lot of effort
for little gain, as it is in our case. It's an optimization for the few times
a user hits our app with a stale or empty cache. And we have no mobile users;
this is an enterprise analytics app.

We do not grow organically by people stumbling on our app and thinking "wow
that was fast". We go through months of enterprise sales process to ink a
deal, then onboard maybe 20 key users at the company.

To put the effort into code splitting would be purely an exercise in keeping
up with the new hotness. That's not to say we don't keep a close eye on the
package size, just that it wouldn't do much for a regular user's experience in
our case.

Also serving all assets from the same domain saved us some time in domain
resolution.

~~~
nullwasamistake
The last part, absolutely yes. CDNs are obsolete, although they'll jump
through all sorts of hoops to convince you that's not the case.

As long as your service uses HTTP/2, it's far more efficient from a DNS and
multiplexing/TCP/TLS-handshake standpoint to serve from your own domain. And
better for security most of the time, since hardly anyone uses CSP and hash
signatures for their third-party scripts.
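
(By "hash signatures", think Subresource Integrity; a sketch with a placeholder
digest, not a real value:)

    <script src="https://cdn.example.com/lib.js"
            integrity="sha384-PLACEHOLDER-NOT-A-REAL-DIGEST"
            crossorigin="anonymous"></script>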

The original sell of CDN was that everybody would have the same libraries
cached. With the massive poliferation of JS you would have to have a 100 gig
cache for that to be remotely true.

A couple years back I moved a C# app to .NET Core for the HTTP/2 support. I
tried removing the ~4 external CDN dependencies just to see what happened.
Load speed improved around 30%: no additional DNS lookups, and the TCP window
issues were worked around by multiplexing.

As an aside, try not to use multiple subdomains. They trigger extra DNS
lookups and don't play well with CORS. It's easy to accidentally trigger CORS
preflights and a bunch of meaningless round trips by using different
subdomains.
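
For example (hypothetical hostnames), a JSON POST from one subdomain to another
is not a "simple" CORS request, so the browser adds an OPTIONS preflight round
trip before the actual request:

    // Page served from https://app.example.com, API on a different subdomain.
    // The application/json content type makes this a non-simple request,
    // so the browser sends an OPTIONS preflight before the POST.
    fetch('https://api.example.com/v1/report', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ range: '30d' })
    }).then(res => res.json());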

~~~
billyhoffman
I think you are conflating general Content Delivery Networks with “shared
JavaScript library repositories that happen to use a CDN”. While the “it’s a
shared repo of common JS files so it’s already cached” idea never really
worked (you are right about the added cost of DNS + TCP/TLS), general Content
Distribution Networks absolutely provide performance benefits by delivering
your static (and optionally dynamic) content from edge nodes that are much
geographically closer to the visitor. Usually these CDNs front the entire site
origin, so you don’t have the extra DNS overhead of subdomains like shared JS
repos.

(I work with many IR Top 100 retailers, and I’ve helped to build the
dashboards comparing edge vs origin. It’s valuable even for sites where the
majority of the visitors are in the US, and especially so if you have a
substantial international audience)

~~~
nullwasamistake
When they front the origin there's definitely an advantage, I agree.

But that implies:

\- You allow a CDN to host your initial asset, which is a possible security
risk.

\- You only use that one CDN for most/all non-dynamic content.

When latency becomes that important I would rather host my own AS. Giving a
CDN the origin and your first content load is effectively handing off your
entire site's security to a third party.

At that scale you could easily and cheaply run multi-homing on your own AS in
maybe 10 colos across the world. Maybe 100k a year to eliminate third-party
risk. Maybe worth it? I think so.

------
keymone
Ugh... does anybody else have this feeling that no matter how fast JavaScript
gets, average web app performance will not change at all? Kind of like the
risk compensation principle: the safer your gear, the more risks you take.
We're in the same spot with web/Electron apps.

JavaScript VMs got so much faster over the last 10 years, and yet all the
websites are much worse: memory and CPU hogs.

This is not something improving the VM can fix. There's just no competition
through which customers could send a feedback signal that performance is
unacceptable.

~~~
Spivak
What do you mean there's no competition? This feels like such a weird
statement when there's competition in just about every market segment on the
web. If you mean there's no competition where the differentiating feature
between competitors is performance then well sure, that's a lot rarer.

For user-facing applications performance only really matters in human time,
and is pretty far down on the list of how people choose software, below things
like features, price, ease of use, etc. Until performance becomes such a big
problem that the app/site is unusable, it's basically a non-issue.

Hackers might not like it, but we're such a weird market to sell to. Not
necessarily because performance is a weird preference, but because by and
large hackers are perfectly fine with woefully inefficient feature-packed apps
for their "primary" apps like their IDE, but then want super lightweight,
skeleton-featured experiences for everything secondary.

~~~
chrisweekly
Performance _is_ a feature. In fact it's the most important feature.

~~~
hedora
Well, correctness / security are probably more important than performance.

(Spoiler alert: If your system is well-enough implemented to be correct, it is
well on the way to being secure and performant too)

------
dmitriid
If you look at why the bundles are so big, the frameworks are so large etc.,
you’ll realise it all comes down to fighting browser deficiencies:

\- no declarative APIs for DOM updates

\- no good APIs to do batch updates

\- no native DOM diffing (so decisions on what to re-render have to be done in
userland; a sketch of this follows below)

\- no DOM lifecycle methods and hooks (you want to animate something before
it’s removed from DOM, good luck)

\- no built-in message queueing

\- no built-in push _and_ pull database/key-value store with sensible APIs
(something akin to Datascript)

\- no built-in observables and/or streaming

\- no standard library to speak of

\- no complex multi-layered layout, which matters for things like animations
(so that animating a div doesn't screw up the layout of the entire app/page)

etc. etc. etc.

As a result every single page/app has to carry around the whole world just to
be barely usable.
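
To illustrate the DOM-diffing point: the platform gives you nothing here, so
every framework ships some userland version of roughly this (a toy sketch, not
any particular library's code):

    // Toy diff: compare old and new attribute maps and patch only what changed.
    // Real frameworks do the same for whole element trees, keyed children, etc.
    function patchAttributes(el, oldAttrs, newAttrs) {
      for (const name of Object.keys(newAttrs)) {
        if (oldAttrs[name] !== newAttrs[name]) {
          el.setAttribute(name, newAttrs[name]);
        }
      }
      for (const name of Object.keys(oldAttrs)) {
        if (!(name in newAttrs)) {
          el.removeAttribute(name);
        }
      }
    }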

~~~
worble
> As a result every single page/app has to carry around the whole world just
> to be barely useable.

So pages from before this JavaScript bloat became commonly accepted were
unusable? Maybe at some point web developers have to accept that the web was
simply not built for the things they're trying to do with it, and that comes
with a cost. Maybe they should evaluate whether they _really_ need animations
on everything, whether everything _has_ to be a SPA made in [current popular
framework]. I'm not even saying these things are bad, they absolutely do have
value, but that value has a tradeoff, and usually that tradeoff is placed on
your customers.

~~~
derefr
The parent’s point is that, if all these things were browser JS runtime
features, there’d be no tradeoff to be made. They’d be “free.”

If every JS page in the world today includes the same line of code, isn’t it
obviously the fault of browser makers for not making that line of code part of
the JS prelude and thereby making it “free”?

~~~
moltar
That’s a slippery slope. We’ll eventually turn the browser into an OS.

And more features = more surface area for bugs.

~~~
mekoka
An OS? An exaggeration maybe, but only a slight one. The browser is pretty
much its own environment already, and has been for decades. It can display text,
process a variety of media, and run programs (written in JavaScript). The main
cause of frustration at this point is that, compared to other popular
programmable environments (C, PHP, Python, node.js, etc), where some real
focus is put into evolving the language along with a variety of primary
development tools, people in charge of the browser's ecosystem seem to still
be coming to terms with the programmable part of its identity.

All the dilly-dallying results in community efforts that pile on top of each
other to create all this bloat that is carried from one tool to the next
framework.

------
yyyk
"Avoid large inline scripts" should be 'Avoid all inline scripts' in practice.

Inline scripts prevent using CSP 'unsafe-inline' therefore increasing XSS
risk. Performance is only a secondary problem here.
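
A sketch of the kind of policy you can only ship once everything is external
(hypothetical header value, 'self' plus one allowed host):

    Content-Security-Policy: script-src 'self' https://static.example.com; object-src 'none'

With a policy like this, inline <script> blocks simply don't run; allowing them
back means adding 'unsafe-inline' or per-script hashes/nonces.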

~~~
bdibs
What’s wrong with the hash approach?

~~~
yyyk
It's not "wrong" in the sense of being intrinsically unsafe, but IMHO it's a
bad habit which can tempt to insecurity in the end.

When working with external files a programmer almost has to work with the
framework, and once (s)he does that the framework - well, a modern framework -
would take care of the security details and do it right.

When working inline it's too easy to add a script tag manually, and from there
it's a bit too easy for someone in the team to miss something (write a js
without the hash/nonce and not notice the warning) or talk him/herself down to
lowering security ("importing this 3rd party js is too hard, lets use just a
nonce and forget about hashes", "this policy is too constraining, it's just a
SMALL script, no risk here").

When working in a team, it's much better to have a hard and fast rule which
forces everyone to work right. There's really no reason to use inline when
using external files works really well now - and is apparently better for
responsiveness too.

------
dehrmann
It's interesting that they benchmark Facebook and Reddit, two sites I find
significantly more laggy than, say, Google. Are there choices in _how we build
sites_ that are more important than how V8 can optimize things?

~~~
nindalf
Are you comparing the Facebook news feed to the Google home page? Is that a
fair comparison, considering the news feed does a lot more than load static
assets? Unless you were comparing the news feed to, say, logged-in Gmail.

------
pepijndevos
I read the title as "the price we're paying for everything being written in JS
these days"

~~~
carapace
Me too. I keep asking people about the business value of Elm vs JS. Why isn't
Elm-lang wildly popular?

(I think I know the answer, but even so I'm interested in what other people
think.)

~~~
kolme
Because it's a business risk.

\- It's very easy to find JS programmers. There's a _lot_ of them. But
programmers proficient with pure functional programming languages are harder
to come by.

\- JS is natively supported by browsers and it is pretty much guaranteed that
the code you write is going to work forever. Elm, on the other hand, who
knows? It might lose steam and go into support mode, or drop support, or maybe
they introduce breaking changes in a future version. JS is a much safer bet.

\- Writing Elm does not guarantee a good polished product. You can write bad
software in good languages and vice-versa.

That being said, it's only a risk. Maybe your team is more comfortable with
Elm and gets more productive. Maybe the language design makes it easier to
write code with fewer defects. Maybe it actually gives you an edge.

It is hard to say whether Elm brings value to your business. It most likely
depends on what kind of business it is (a web agency that cranks out 3
visiting-card websites a day? Or one developing a complex app?).

In any case, managers get very nervous about languages that are not
mainstream. And they do have good reasons.

~~~
carapace
Thanks for replying.

> But programmers proficient with pure functional programming languages are
> harder to come by.

I used to get a similar argument when pushing Python over Java, and my
counterargument is the same: I wouldn't hire a JS programmer who was afraid or
unwilling or unable to learn Elm. I would bet that any normal person smart
enough to solve Sudoku problems could learn Elm; I wouldn't make that bet with
JS/CSS/HTML.

> It might lose steam and go into support mode, or drop support, or maybe they
> introduce breaking changes in a future version. JS is a much safer bet.

But who is using _just_ JS these days? Frameworks and transpiling are the
order of the day, no? Same argument applies to them. And JS is a moving target
too. As for "breaking changes", well, it is just version 0.19 so far. Once
they hit 1.0 I would bet on Elm being more stable than JS+Whatever.

Weigh this against _zero front end bugs_. That's a staggering (if hard-to-
quantify) cost savings.

> Writing Elm does not guarantee a good polished product. You can write bad
> software in good languages and vice-versa.

Yeah, but that's not an argument for or against any particular language, and
in the specific case of JS vs Elm I think it's clear Elm wins. If your
developers are truly crap then, yeah, Elm won't save them. But then you also
have bigger problems than what tech stack to use, eh?

~~~
orange8
> I wouldn't hire a JS programmer who was afraid or unwilling or unable to
> learn Elm.

Would you hire an Elm programmer who was afraid or unwilling to learn proper
JS?

~~~
busfahrer
Also, I want to throw in that learning a new paradigm (Elm) is something else
entirely from going from one imperative language to another.

~~~
0_gravitas
In all fairness, the functional paradigm isn't exactly all that hard to grasp
(when it comes to what you need to be effective, at least; you don't need to
understand what a monad is, I still don't!). I started learning Clojure
completely cold going through Exercism, and it took me maybe 2 weeks of doing
a problem a day (not much work at all) before I felt like I started to "get"
FP. In fact, I would say it's even easier in a lot of cases because FP is a
much "purer" skill set: any imperative language still uses functions, and
arrays, and switch statements, and overloading. In FP you have all of those
familiar concepts (just presented in a possibly different way), but you don't
need to worry about how classes work at all, or even loops in some cases!

------
snek
This is an excellent analysis of JS performance and usage, although it seems
to suggest optimizing your site for V8 a bit too much.

Also, the "LONG TASKS MONOPOLIZE THE MAIN THREAD. BREAK 'EM UP!" section
header felt a bit on the nose.

~~~
badfrog
> it seems to suggest optimizing your site for V8 a bit too much.

+1. After seeing some opinions that Chrome is becoming the new IE over the
past few weeks, I've switched to Firefox to do my small part in trying to
prevent another browser monopoly.

~~~
supakeen
Which parts of the article aren't applicable to Firefox? It seems like most of
these idioms should give advantages to SpiderMonkey (if it's still called
that) and V8.

------
NohatCoder
Tip no. 1: Don't put any code on the page that doesn't need to be there; then
any minute differences in the exact implementation likely won't add up to
much.

------
dep_b
I would also be interested in the cost of developer time lost to using the
language itself and compensating for its almost entirely lacking standard
library.

~~~
lacampbell
I have usually found, installed, and started using a good JavaScript library
in less time than it would take me to make sense of MSDN's enterprise-soup
documentation.

~~~
manigandham
What does MSDN have to do with this? Are you talking about C# and .NET
specifically? Have you seen the current documentation?
[https://docs.microsoft.com/en-us/](https://docs.microsoft.com/en-us/)

~~~
lacampbell
Yes, I felt brave, so I went after everyone's darling C#, and not an easy
target like Java.

I read C# documentation several times a week. I do not like it.

~~~
gmiller123456
Can you provide some examples of documentation you do like?

I can't ever remember reading documentation and saying "Wow, that was fun!",
so I can't say there's any documentation I like. But I do find that the MSDN
documentation usually gets me working (and not reading documentation) much
faster than the docs for, e.g., Python, Java, JavaScript, CSS, or my
pedometer.

~~~
lacampbell
> Can you provide some examples of documentation you do like?

MDN (Mozilla documentation for web stuff) is generally pretty good. Obnoxious
colors aside, this javascript library is very easy to get started with:

[https://localforage.github.io/localForage/](https://localforage.github.io/localForage/)

------
elbasti
As a JS dilettante, I found this incredibly surprising.

    
    
    const data = { foo: 42, bar: 1337 }; // object literal: slower on a cold load

>>> can be represented in JSON-stringified form, and then JSON-parsed at
runtime:

    const data = JSON.parse('{"foo":42,"bar":1337}'); // JSON.parse: faster

>>> As long as the JSON string is only evaluated once, the JSON.parse approach
is much faster compared to the JavaScript object literal, especially for cold
loads.

~~~
baron816
I hate these microperformance hacks. There’s no way you’re going to have a
bottleneck here and you’re making the quality of your code dramatically worse.

~~~
outside1234
I hate that this micro performance hack actually is better. Why don't they
just fix this in the JIT so that the literal is just as fast as the JSON
parse? Why do we have to think about this?

~~~
baddox
The runtime can't know whether it's parsing a plain old JSON-compatible object
literal until it, well, parses it.

~~~
pbhjpbhj
IANAProgrammer, but couldn't it do something like a try...catch: assume it's
JSON-compatible and fall back to the [much] slower method if the parse fails?
Does that cost too much time/resources?

~~~
daleharvey
That would make all JavaScript that has non-JSON object literals (most JS
code) slower.

~~~
pbhjpbhj
How much slower?

We already established that parsing JS is far slower; in most cases JSON
parsing would fail (IIRC) on the first character, ^[^{] ... so would it be
worth being slower by the time it takes to check the first character? Then
only JavaScript that started with { would be slower, which I guess is
virtually none.

I suppose those sorts of questions are part of what makes language design
interesting.

------
Zardoz84
"With HTTP/2 multiplexing, multiple request and response messages can be in
flight at the same time, reducing the overhead of additional requests."

Oh... we are using Servlet 3.0

~~~
hinkley
Odds are good that a reverse proxy will improve availability on your servers.
It will also allow you to fix this problem.
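
A minimal sketch of that setup with nginx in front of the servlet container
(hypothetical hostnames and paths; browsers only speak HTTP/2 over TLS, so
certificates are assumed to be in place):

    # nginx terminates TLS and speaks HTTP/2 to browsers, then proxies
    # plain HTTP/1.1 to the existing servlet container on localhost.
    server {
        listen 443 ssl http2;
        server_name app.example.com;

        ssl_certificate     /etc/ssl/example/fullchain.pem;
        ssl_certificate_key /etc/ssl/example/privkey.pem;

        location / {
            proxy_pass http://127.0.0.1:8080;
            proxy_set_header Host $host;
        }
    }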

~~~
Zardoz84
Looks that is true. Thanks for the information!

------
z3t4
Another strategy is to make the files cacheable and have some version naming
scheme to invalidate caches. You can also lazy-load features, as most users
will likely only visit the front page and then leave after 3 seconds. No need
to have users wait for functionality they won't use.

~~~
hinkley
Several of the bundling tools have had a way to emit the file with a hash of
the contents on the end of the filename, rather than using version numbers.

One of the places this helps you is when you push out a one-line JavaScript
bug fix without invalidating your CSS caching across the entire site, because
the CSS files are identical.

It also helps you with people with the website open when the new code gets
pushed. They either have all the old code or all the new code and no bizarre
mishmash of the two.
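
For the content-hash naming, with webpack for example (a sketch; other bundlers
have equivalent options):

    // webpack.config.js: name each emitted bundle after a hash of its contents,
    // so a one-line JS fix changes only that file's name and everything else
    // stays cached.
    module.exports = {
      output: {
        filename: '[name].[contenthash].js',
      },
    };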

~~~
z3t4
By making the files cacheable I mean _not_ bundling them. The first load will
be longer (unless you use lazy loading), but successive page loads will get
the files from cache, and if there's an update, only that file will change.
It's not unusual to have updates shipped several times per day, and if there
are many users there will also be many unnecessary reloads of the bundle.

It's a bit of an optimization: to be 100% sure users always get the latest
files, the version naming scheme is needed, but most of the time it will work
anyway, e.g. if you have no caching layers/CDN etc. in front of the web
server. So if you want to shave off milliseconds on first load, go with the
bundle, but if you want to save users' bandwidth and make your site/app more
lightweight, don't bundle!

~~~
vimslayer
Most bundlers have ways of splitting the bundle into smaller pieces that can
be cached (and cache-busted with the hash in the filename) independently. I
think that's what the parent commenter was talking about. So you don't have to
choose between one huge bundle and hundreds of individual non-bundled files;
with code splitting and lazy loading you can do something in the middle and
have, say, 3-30 "minibundles". Then, if you make a modification that only
affects one of those bundles, only that bundle needs to be reloaded and the
others can still be served from cache.
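
Sticking with the webpack example (a sketch, not the only way to set this up):
splitChunks breaks shared code into separately cacheable chunks, and a dynamic
import() in application code marks a lazy-loaded split point.

    // webpack.config.js: split shared/vendor code out of the main bundle.
    module.exports = {
      optimization: {
        splitChunks: { chunks: 'all' },
      },
    };

    // Application code (hypothetical module path): the image-editor chunk is
    // only fetched when the user actually clicks the button.
    document.querySelector('#edit-image').addEventListener('click', async () => {
      const { openEditor } = await import('./image-editor.js');
      openEditor();
    });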

------
AlchemistCamp
With their redesign and a few other choices, Reddit has gone from one of the
most beautiful places on the web to the second most hostile UX that I put up
with.

I don't want to "GET NEW REDDIT" as I'm urged to in the top left of every page
because I don't want a card-based layout. The whole reason I and so many
others left Digg for Reddit to begin with was for a highly skimmable,
information-dense site!

Similarly, I don't want to install a mobile app. The page worked fine on
mobile when I got my first iPod Touch a decade ago. Why do I have to see a
huge USE APP button in the nav bar and then lose another 20% of the page at
the bottom to a "See Reddit in..." section that's also urging me to install a
native app?

Infinite scroll actually slows down my browsing experience too, since it no
longer loads as many entries at a time and I have to keep waiting for
pagination.

~~~
DoingIsLearning
For desktop

old.reddit.com

or for mobile

reddit.com/.compact

Both are low-nuisance and both still work for me, although I am not sure if
that is region-based.

~~~
AlchemistCamp
> old.reddit.com

It doesn't put a big red "GET NEW REDDIT" in the upper left that does a
single-click update of your account settings? What region are you in?

> reddit.com/.compact

I just tried that and the top 40% of my screen was spent on a banner telling
me, "You've been invited to try out Reddit's new mobile website!" Clicking
that led me to a page where both the top bar and the bottom section harass me
to install their mobile app.

~~~
rcfox
I use a uBlock Origin rule to hide the upper-left button.

~~~
DoingIsLearning
Indeed I overlooked the fact that I am also using ff+ublock.

------
keymone
The largest cost of JavaScript is having to write JavaScript. It's insane that
anybody would willingly do that nowadays. I think literally everybody writing
JavaScript secretly wishes they were using some other language with a JS
transpiler.

I refuse to believe anybody can honestly say they want to write JavaScript
code, unless it's code that makes them write less JavaScript in the future.

~~~
NohatCoder
I have yet to find a better package. When you transpile, debugging always has
that funky thing with two different source codes: the one you wrote and the
one that is actually running. And it is not like you can do anything that you
can't do in JavaScript; either you have a feature subset, or you ship with a
library that the transpiler attached.

~~~
blunte
You miss the point. The issue isn't whether it's nicer to use a transpiled-to-
JS language or JS itself. The issue is that JS as the root is an unfortunate
situation, and there are so many other languages that would be superior.

Just look at the history of JavaScript. It does not deserve the attention or
human time that it currently gets (compared to other languages, assuming we
could all decide simultaneously to replace JavaScript with another language).

~~~
hombre_fatal
I have a hard time justifying using any other dynamically-typed language over
JS. And JS' static typing story is pretty mature. And it has cool features
like async-everything and a first-class Promise.

So, let's not be too extreme here just because you don't like something. It
gets really silly to hear HNers suggest the equivalent of "I don't understand
how people own blue cars. They must secretly aspire to drive $myFavoriteColor.
Don't they realize how stupid blue looks to me?"

------
blunte
They left out measurements of "cognitive load".

------
drewbt
Const data() = JSON...

------
nullwasamistake
Great article. I applaud the massive amount of effort that's resulted in JS
being far faster than anyone ever expected.

It's time to switch to WASM though. Half native speed with almost no parsing
overhead. As integration and tooling improve I see no reason to stick with JS.

It also rectifies a number of historical mistakes that are nearly impossible
to fix. Mainly threading. WebWorkers etc are a huge hack. No atomic data
structures, no shared memory, huge overhead. WASM will be many times faster
than JS from threads alone. Add the natural speed advantage and you're looking
at maybe a 20x performance difference.

~~~
xenospn
The last milestone I see on the WASM page is from 2017 - is it still a thing?

~~~
nullwasamistake
It's stable now and implemented in all major browsers.

~~~
Siwka
Last time I checked you didn't get DOM manipulation, so you'd still need to
use JS for that.

~~~
nullwasamistake
Yes. Rustaceans have stepped up and provided a ton of nice shim layers,
though. So practically, you don't have to actually write your own JS to
interface with the DOM.

------
superkuh
This isn't quite true. It's only true if you're browsing from a computer that
is moderately fast (at short timescales before throttling, at minimum) and
using a browser that does not respect your software freedoms. There are
literally hundreds of millions of us out there for whom executing JavaScript
on sites made entirely out of JavaScript can be significant, whether it's
because you use many, many tabs or browse using an old smartphone.

But of course that's not the real cost. That's the computation cost. The real
cost of websites switching to an application model directly targeting the DOM
is the loss of accessible webpages and the chance that, every time, you're
going to get owned, whether it's some information leak from speculative
execution or something else.

Running JS on every page these days is just as stupid as opening every
attachment you get emailed.

------
tyingq
_" On mobile, it takes 3–4× longer for a median phone (Moto G4) to execute
Reddit’s JavaScript compared to a high-end device (Pixel 3), and over 6× as
long on a low-end device (the <$100 Alcatel 1X)"_

Notable that they avoid the iPhone comparison. It runs circles around even the
top end Androids for JS performance.

~~~
simion314
Maybe you misunderstand what this article is about: it's not about which phone
is best, it's about how JS runs in the real world, where not everyone has
super-fast internet all the time and not everyone has a super-fast phone or
computer.

So if you are a web developer you get some good advice from this article
related to JS, not about what phone to buy.

~~~
tyingq
They are making the point that low end phones are much slower in executing JS
than high end phones. Perhaps to remind developers that their experience, on
their phone, isn't the experience everyone has.

I imagine a lot of developers have recent iPhones. They are, depending on
benchmark, up to 3x faster than a flagship Android. Which means the top to
bottom chasm is much larger than the snippet suggests. Noting it would
strengthen their point.

~~~
simion314
OK, sorry, I misunderstood the iPhone point then, my bad. I interpreted it as
"the iPhone is super fast, so no article should forget to mention that the
iPhone is the fastest phone and Intel i9s are the fastest CPUs".

I agree that we developers should not forget that a lot of people have less
powerful hardware.

I sometimes hit this problem at work: say we offer feature X, like uploading
an image which you can then crop and apply a filter to. How should I add this
feature without the code even loading in the browser if you don't intend to
use it? Is there a pure JS way to do it? AFAIK there is no good way to load a
script at runtime and get an event when the file is loaded and parsed.

~~~
Stratoscope
> _AFAIK [there] is no good way to load a script at runtime and get an [event]
> when the file is loaded and parsed._

You can do this easily by creating a script element dynamically and then
either:

1) Listen for the load event on that element, or

2) Have the dynamic script call a function that is already loaded.

Here is an example of the first technique:

    
    
      let script = document.createElement( 'script' );
      script.addEventListener( 'load', () => {
          alert( 'three.js is loaded: ' + THREE );
      });
      script.src = 'https://ajax.googleapis.com/ajax/libs/threejs/r84/three.min.js';
      document.head.appendChild( script );
    

[https://jsfiddle.net/geary/szmc1L0h/9/](https://jsfiddle.net/geary/szmc1L0h/9/)
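
And a minimal sketch of the second technique, where the dynamically loaded
file ends by calling a function you defined up front (onEditorReady and the
script path are hypothetical names, not any library's API):

    // Defined before the dynamic script is added. The loaded file calls
    // window.onEditorReady(...) as its last statement, so by the time this
    // runs you know it has been fully parsed and executed.
    window.onEditorReady = function( editor ) {
        editor.open();
    };

    let script = document.createElement( 'script' );
    script.src = '/static/image-editor.js';
    document.head.appendChild( script );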

~~~
simion314
Thanks, I use this technique but I had issues with it in the past. I tried for
a few minutes just now to find documentation confirming whether the onload
event guarantees the script is loaded and finished parsing/interpreting, but I
can't find it.

Your example code works in all the browsers I tested, so maybe it was an issue
with the particular script I was testing. There are third-party scripts, like
ones for embedding an image editor, that could also use this trick to load
some dependencies.

Great, so I was wrong then; this should work with most third-party scripts.

