The Cost of JavaScript (medium.com)
177 points by shawndumas on Nov 15, 2017 | 111 comments



Nice article, especially the data, but I'm disappointed they didn't cover more of the most effective approach: Design your site not to use javascript except where really needed!

I think most sites can be broadly classified as (1) "relatively static" such as hackernews itself, news/blogs, text-based sites in general, or (2) an entire program/application running in the browser, such as a game or collaborative text editor. Most sites fall into category 1 and therefore need very little JS to do their job effectively, but they come up with 50 different things for JS to do on their webpage, then spend time and effort optimizing and minifying instead of just not doing those things (or doing them in HTML).


> I think most sites can be broadly classified as (1) "relatively static" such as hackernews itself

My entire job experience would disagree. I have yet to work on a static content site. When I do, you can be sure I'll use hugo or some static site generator, but every job I've had so far was to build sites that act on user-specific data that changes frequently.


First, I want to clarify: you and I may be thinking of "static" somewhat differently (maybe I misused the term). I'm not thinking of e.g. HN as static in the sense of unchanging over time; actually it constantly changes in response to user interactions. But when I personally visit a page on HN, the page I receive is mostly "static" in the sense of essentially serving me a set of text, not responding dynamically to my actions (except a little bit in the case of HN).

Second, fair enough with regards to your work experience, but for me, most sites I visit are just serving text/images/video, not interactive or barely interactive.


That is where server side rendering comes into play.

Most JavaScript I write just boils down to fancy form validation or using something like Highcharts.
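That kind of light-touch validation doesn't need a framework, either. A sketch (the field names and rules here are invented, not from any real app):

```javascript
// Hypothetical stand-alone validator: a pure function, no framework needed.
// Returns a list of error messages; an empty list means the form is valid.
function validateInvoiceForm(fields) {
  const errors = [];
  if (!/^[^@\s]+@[^@\s]+$/.test(fields.email || '')) {
    errors.push('email: not a valid address');
  }
  const amount = Number(fields.amount);
  if (!Number.isFinite(amount) || amount <= 0) {
    errors.push('amount: must be a positive number');
  }
  return errors;
}

// In the browser you would wire it up on submit, e.g.:
// form.addEventListener('submit', e => {
//   const fields = Object.fromEntries(new FormData(form));
//   if (validateInvoiceForm(fields).length) e.preventDefault();
// });
```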


You are thinking only of two extremes. Between news/blogs and a game/collaborative text editor there are thousands of web apps that don't fit either category. What would you call a web app for generating and keeping track of invoices?


While it's possible I'm not understanding some important aspect of invoice generation, I don't believe JavaScript is necessary for this process... Plain HTML form inputs -> Server -> Generate PDF? What am I missing? Why's JavaScript required here?

I think we forget that the web was a pretty productive thing even before JavaScript was widely used. GMail existed (and still does) in a javascript-free form. You can even use Google Maps without JavaScript. Granted, JavaScript enhances both experiences, but it's not strictly required.


There was never a time in the modern web era when JavaScript wasn't widely used. You'd have to go way back pre-2000 to find a time when popular websites didn't use JS at all. The Cambrian explosion of 'Web 2.0' wouldn't have happened without JavaScript and we wouldn't be where we are - doing everything on the server sucked big time, I do not want to go back to party like it's 1996.

To use Gmail, the poster boy of Web 2.0, as an example of why sites don't need to use JavaScript is bizarre. It destroyed other webmail sites because it used JS to give users a real GUI mail client on the web - the HTML backup works, but if that was all they'd given us we'd still be on Hotmail.


I thought gmail’s big competitive advantage was giving users 1GB of space instead of 10MB or whatever that other free webmail providers gave us. I don’t recall gmail’s UI being remarkably better.


Also, good spam filtering and alternative access through protocols like imap.


Web 2.0, which "happened" in late 2004 was mostly about user participation a la Wikipedia, Wordpress, etc. It was about tagging things a la del.icio.us. Some JavaScript was used to do crazy things like show full-size images from clicking a thumbnail or making wysiwyg editors. But these thick-client "Single Page Apps" people build today didn't really take off until Backbone.js became popular in ~2010.

Re: Gmail. It won, not because of its slick JavaScript features (really, there weren't many when it came out in 2004). It won because of the impressive (at the time) storage space they gave you.


Although the Wikipedia page may say Web 2.0 was about user generated content and tagging, from a web dev point of view it was about building rich interfaces and using XHR to update content without a full page refresh. Full size images had nothing to do with it.

My point is that JavaScript has its place in modern web dev; saying we should just get rid of it is ridiculous, and not what this article says at all.

Gmail won for many reasons, but usability was a big part of it, and it wouldn't be possible to even auto-complete an email address without JavaScript.
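The client-side core of such an auto-complete is small, for what it's worth; a sketch (contact list and prefix-matching rule invented here):

```javascript
// Minimal address auto-complete: filter known contacts by what was typed.
// In 2004-era webmail this kind of lookup would sit behind an XHR call;
// here it's just a pure function over an in-memory contact list.
function suggestAddresses(contacts, typed) {
  const prefix = typed.toLowerCase();
  return contacts.filter(addr => addr.toLowerCase().startsWith(prefix));
}

// Wiring it to an <input> is one event listener:
// input.addEventListener('input', () =>
//   render(suggestAddresses(contacts, input.value)));
```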


Gmail had an exceptionally good UI, minimal and highly usable. What is bizarre is that webmail clients back then were extremely bad; you didn't even need to compare them to Gmail.

JS maybe had its share, but Gmail was minimal, condensed and fast. Literally no one else had that back then, and today no one knows because the competition is dead. Now every web app is JS-heavy, but there is still a great difference in minimalism. I'm not saying Google's products are small, but they seem to strike a better balance of "fancy" user interaction than the rest of the web. The reckless transition to a modern web with more bells and whistles takes its toll, mostly because "we can", when it should be "it makes sense here".


I agree, but I would say that Google are performance addicts because of their scale, where seconds actually count.

Most apps are never going to be 'web scale', and most of the time there's no trade-off to make. Either the devs use libraries to make something work fast and live with a big JS bundle, or they optimise too early, go down the perf rabbit hole and run out of runway. If your product is valuable enough your customers aren't going to care about 1-2 seconds, and if it becomes an issue you can do the perf work later.


Hell, even seconds seem like too much. Google's large enough that they can justifiably be concerned about milliseconds.


Your argument is essentially "sites that need JS as a core requirement of the UX need JS" and "now that everyone uses JS, no one should avoid it."


My argument is "most sites need JS as a core requirement of their UX". Whether they need 1Mb of it is very doubtful and that's what the original post is about.


> Granted, JavaScript enhances both experiences, but it's not strictly required.

Yep, JavaScript is not strictly required, but the enhancement part was what I was trying to refer to.

A blog or news site like HN doesn't really need many enhancements; a web app for invoices lends itself to being way more interactive than HN.

In your gmail example, the HTML version doesn't have:

Chat

Spell checker

Keyboard shortcuts

Adding or importing contacts

Custom "from" addresses

Rich formatting


JavaScript is like a permission in Android or in an add-on for the new Firefox. If I feel like the website really requires JavaScript or benefits greatly from JavaScript then I will allow it to use JavaScript. Otherwise I will use the website without JavaScript, or if it is not usable I will just leave. It's OK to add enhanced features using JavaScript, but basic features should work without it. If links (<a>) don't work without JavaScript I leave and pick the next search result.


One has to agree with your sentiment. In addition, I get spell checking on many sites where I do not have JavaScript active; this is part of Mozilla's built-in functionality.


> In your gmail example, the HTML version doesn't have:

> Chat

> Spell checker

> Keyboard shortcuts

> Adding or importing contacts

> Custom "from" addresses

> Rich formatting

Sounds excellent. I might give the Gmail Web interface another go (I've always used it through POP and IMAP in Emacs, due to monstrous "features" like these). Whilst I agree in principle with spell checking and keyboard shortcuts, not only does my browser provide both, but Google's poor wheel-reinventing interferes with them (as does that of many other sites, e.g. GitHub).

As for the "chat", my experience has been that even nudging one's mouse the wrong way ends up migrating a gtalk account to "hangouts", which doesn't support XMPP and is very difficult to undo.


I just tried it and wow, the "HTML-only" GMail is so much more responsive and usable! Unfortunately it still breaks my keyboard shortcuts, so there must still be some pesky Javascript somewhere :(


For that I use Thunderbird or my Android email client.


I definitely admit it's a simplification that won't fit every scenario. It just turns out to be pretty apt for my web experience personally.

Maybe another way to put it is that I don't mind enabling and loading JS for a site where interactivity is clearly core functionality, but if it's essentially serving text, then a JS-heavy design feels like it's working against the user, not for the user.


A natural candidate for a desktop app.


Every time one of these JS-related posts comes up (especially performance-related ones), I'm struck by the seemingly huge disconnect between the JS-hating audience of HN and the actual Silicon Valley corporate/startup world, where heavy-JS/React/SPA is everything.

Let me begin by stating that I'm not hating on either side of the fence -- I've been on both sides (and arguably, I still am on both sides, if that were possible).

As someone who works on a lot of side projects, I can very easily see the argument of keeping web sites JS-light, and not even dive into all the React stuff. Most projects don't necessarily call for it except for certain very specific things. (I actually do not use React on any side projects of mine, and still stick with jQuery.) On the other hand, after having joined a JS-fullstack startup this year, I can see that an article like this one is completely speaking to me/us. All I can say is, without the giant background context of working on a monolithic fullstack JS codebase, none of it makes sense; but when you're in such a job, everything of it makes sense.


I think it's so weird that many people on HN can't fathom a website being more complex than Medium. I've spent the last 5+ years doing nothing but develop data-heavy visualization apps. The alternative is not building a website without JS; it's building a desktop application using Swing or Qt.


> I've spent the last 5+ years doing nothing but develop data-heavy visualization apps.

Those aren't what this article is talking about, though. For a huge client-facing desktop web app, you can load things up front, expect the user to wait a bit, and then off you go.

But imagine if every page on HN took ~15 seconds to load. Heck the reddit mobile experience right now is that the comments section takes 4-5 seconds to load, and sometimes it doesn't load at all. Of course it is instant in their mobile app, and also instantaneous in their legacy mobile view. (Which makes me seriously question what in the world the modern mobile site is doing.)

Most content is temporary and short lived. Caching a script does next to no good if I only ever visit a site once or twice a month, I'd prefer there not be a script at all!

In regards to content, HN is one of the most valuable sites I use. The JavaScript is so tiny as to be inconsequential.

Give me a CNN.com that is as responsive.


I know your request is tongue-in-cheek, but here you go:

http://lite.cnn.io/en


An old-fashioned img tag at the top and I'd be delighted.

Need to resize the browser window to get a good column width, but that site is fast.

(pure black on pure white also isn't easy on the eyes, but plain ol HTML can change that as well! :) )


The problem is that most of the "bad" JS violators aren't building data viz, they're doing something much less complicated with the same tools.

Edit: the article mentions CNN, which is a mostly "static" site, but has enough JS to take 13 seconds to load on the "average" phone.


The counter-problem is that the "solution" that is peddled is to have pages that run perfectly with JS disabled rather than advocating for better usage.


Which is a pleasure, because I can go all the way down to OS APIs if needed and don't have to bother with browser rendering engine quirks and differences in how they understand the whole Web stack, sometimes even on the same browser just different versions.

The only way to achieve something similar on the Web is to stick with canvas and WebGL.


I don't see it as any different than the other disconnects between "what I want myself" and "what I get paid to do".

It's not unlike other fields. A composer who gets paid to write jingles for TV commercials won't listen only to TV commercial music. I once talked to an award-winning industrial designer at Fluke who admitted he couldn't care less about test and measurement, and didn't even know Ohm's Law. Michael Caine was paid $1M for "Jaws: The Revenge", and says he's never watched it.

As a user, I avoid JS-heavy web apps (I think even StackOverflow has way too much JS). As an employee, if my employer (when I have one) wants me to build that, then that's what they get. HN gets responses from people wearing both hats.


"everything of it makes sense"

No it doesn't. Because you are obviously not implementing something that is meant to be used by a human being.

That load times are measured in multiple seconds even on a 4G phone is mind boggling.


You don't have to be combative. I agree with you.

The article is advocating for JS developers to be aware of what makes their SPA react JS app heavy and load slowly. I'm saying that an article like this only makes sense when you have the full context behind working on a SPA react JS mono codebase. If you haven't worked on one, it all sounds like gibberish and your mind just goes off to wonder "why even build this heavy react JS app to begin with when you could build a site without JS".


We live in a world where everyone wants to make Electron apps instead of doing anything in a compiled language or anything other than JavaScript really.

Maybe it's because I've never been a webdev but I just don't understand the fascination with JavaScript. Personally I hate JavaScript any time I have to code in it. Give me literally any other language.


I programmed assembly, then C, then C++ through the 80s, 90s and 2000s. I love JavaScript. It's just so damn easy. No fighting with headers, include paths, linker options, make files, cmake files, scons files, etc. No cross-platform BS. When I want to share I just send a link. Electron does have a few platform issues to build the app but they're mostly solved for me. I get formatting, styles, animation, for free. I get cross platform graphics, video, and audio for free, networking is trivial. I also get pretty amazing debugging tools.

Iteration time is orders of magnitude faster than any C/C++ project I've ever worked on.

I hated JavaScript at first because I didn't understand it. It took about two years to stop trying to turn it back into C++. It didn't help that my first real intro was in an environment that used Google Closure and the Closure compiler, both of which were designed by Java programmers to treat JavaScript as Java and disallowed real JavaScript.

Now I mostly loathe going back to C++ and having to fight the dev tools on each platform which seem to change every 5 years such that all old samples fail.

Is JS perfect? No; I think I kind of miss static typing, but not always. It's often a joy to just do it.


> No fighting with headers, include paths, linker options, make files , cmake files, scons files, etc

Instead you get to fight with often poorly-versioned bower/npm dependency hell, and gulp/grunt/webpack config errors with inscrutable messages.

> having to fight the dev tools on each platform which seem to change every 5 years such that all old samples fail.

Such stability! If anything in javascript still works five months later, I count my lucky stars.


I hear you but since I've been using yarn to lock all package versions I haven't had any issues.

https://yarnpkg.com/lang/en/docs/yarn-lock/

JS development without yarn is complete madness.
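For anyone who hasn't used it: the pinning lives in yarn.lock entries like the following (an illustrative entry in the v1 lockfile format; the resolved URL's checksum fragment is elided):

```
# yarn lockfile v1

left-pad@^1.3.0:
  version "1.3.0"
  resolved "https://registry.yarnpkg.com/left-pad/-/left-pad-1.3.0.tgz#..."
```

Every install resolves to exactly these versions until the lockfile itself changes.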


Which means that you do not use an npm proxy.


You can just build an RPM package with your dependencies and another with the dist result.

We have something like that:

node_modules for some project goes to an RPM called $project_name-devel

and the result goes to $project_name

It's not really that complicated, and then we don't rely on anything outside of our control (e.g. GitHub, npm).

We never had an issue with this workflow. Our dependencies (the devel package) change maybe once every two months, if that.


If you went from asm/c/c++ to js, and you didn't use a half dozen other languages including Python, Ruby, and even Java back in the day, then it's no wonder you're happy with JavaScript.

That's like saying that chalk tastes so much better and is so much easier to digest than gravel. However, it's good to be happy with a language... so if it works for you, great!


> I get formatting, styles, animation, for free. I get cross platform graphics, video, and audio for free
That is only true if by "for free" you mean you (or your users) get to exchange an oversized portion of their free memory and CPU cycles for those things. Better hope those users have at least as powerful a system as you do or they'll be looking elsewhere for their text editor, cooperative workbench or IDE.


>No fighting with headers, include paths, linker options, make files , cmake files, scons files, etc

These downsides only appear when you're working on a project that did it the wrong way. Thousand-line autogenerated makefiles, pkg-config-unaware craplibs with headers based on different basedirs (Qt, I'm looking at you), idiots who think that using cmake/qmake/etc will magically ensure buildability on different platforms and never ever test that it does build there.

In pure JS you don't have the include/dependency problem simply because there is no require/include option at all. Then you install node/npm with 100 MB of JS backing it aaand it is yet another complex toolkit to learn forever. With no man pages and shady internet snippets. My hate of JS comes from a different direction though: I know it, and it is an ugly simpleton that dooms you to boilerplate, polyfill and spaghettize everything. Debuggers help until you get a "framework" that turns everything upside down and whirls.

I share your every-5-years annoyance though; even non-web does too much change for the sake of change. No good UI either, you're right. This comment was meant to be a counter-argument, but it appears that both worlds are crap.


> Maybe it's because I've never been a webdev but I just don't understand the fascination with JavaScript.

1) It's the only true "write once, run everywhere" language out there (along with languages that transpile to JS).

2) npm

3) Asynchronous I/O


JavaScript's "fame" came before npm/Node.js and before anyone had any idea async/await would ever be implemented.

Simply put, Javascript is the only language in the browser.


Asynchronous I/O refers to the fact that all I/O interactions default to being asynchronous in JS, not the fancy new `await` keyword, which is nice but just sugar around Promises. The result of this default is that it turns a best practice into a built-in feature, not something I have to manage as a developer.


Because of Node or the browser's event loop? It's not like JS was the first language to have a built-in event loop. Visual Basic did exist.


Yes, but not much before. It took heavy evangelism and Google Chrome's JIT to make JavaScript popular about 10 years ago.


Maybe you don't remember the early 2000s when people were adding flying butterflies around the cursor with Javascript?

Recently there has been hype and explosion but web development (development on the browser, which means Javascript for lack of alternatives) had a long way coming.


The only true "write once, run everywhere" is ANSI C; please don't overestimate JS in that regard. Unless you're using polyfills and Babels, it is feature-detection hell never imagined on desktop or embedded.

Asynchronous, parallel or progressive I/O has existed since forever, and is hardly an exclusive property of JS. Moreover, it is not required of the application programmer unless the platform is built in a way that makes it impossible to avoid.


> The only true “write once run everywhere” is ANSI C,

Sure, until your pointer size varies by platform, or long is 64 bits on one platform and 32 bits on another.

Don't forget compiler support: which version of C are you using? Does the compiler support it everywhere?

C runs everywhere with sufficient #IFDEFs.


If C code incorrectly depends on specific type sizes, is written for a less portable standard than ANSI C, or can only be compiled with a non-compliant compiler, then it will be less portable or not portable at all.

You can argue that these restrictions allow less freedom of expression and more quirks in some cases, but the same applies to JS without polyfills and Babels, because these are effectively ifdefs and compilers. The one main difference is that correct C code actually can be deployed to a variety of platforms unavailable to JavaScript, which is runnable on just a few selected powerful environments with heavy help from what is called compat-libs in the non-web world. Moreover, with all these compat layers you cannot just take JS code and use it in a non-hosted environment; that's why everything runs in embedded node/webkit. This is barely portable, and I think you're confusing [seeming] ubiquity with actual portability (like "everywhere").


> If C code incorrectly depends on specific type sizes

Nowadays C includes wonderful things such as int64_t and int32_t.

Before those existed, life was hard. Everyone had to implement their own type system abstraction layer on top of C, for good reasons. Making a structure that maps to a network data packet is irritating when the underlying types of the language vary in size from platform to platform.

Thus the million different prefixed IFDEF'd type systems in headers.

The size of an enum is up to the compiler. Some optimize so that enums with fewer than 256 elements get 8-bit values; other compilers don't, always setting the size to 32 bits. A simple solution is to set a value at the end of the enum to 0xffffffffu, but teams I've worked on have shied away from using enums for data modeling specifically because of how the C spec defines them.

#pragma pack not being part of the spec also causes headaches. I get why: some platforms don't support unaligned access (heck, ARM up until relatively recently). Most modern compilers implement it, and it'd be nice if the standard laid down a single, opt-in syntax for compilers to use. Even with pack, #pragma push and pop are not implemented everywhere, so there is still this mess of cross-plat code that is needed. (#pragma push and pop are seriously useful; it's also possible to get in a lot of trouble with them!)

All this can be worked around, but it very much makes C not write once run everywhere. It is write once, adapt for a bunch of platform inconsistencies, run most places until it segfaults, fix the segfault, run again, hope it works until some new platform is brought up.

I'm not saying JS runs everywhere, far from it. It is true that there is a C compiler for almost every platform, but for the deep embedded platforms, the C code is very much custom. Just a few years ago, a product I worked on had an 8bit micro on it with 256 byte memory pages. It was attached to a Cortex M4 with 256KB of SRAM. The Cortex M4 had more memory hanging off an external bus. A huge chunk of the code for all of this cross-compiled to run on Windows with lots of mocks for the hardware, including the entire Flash file system.

Each of the embedded systems relied on compiler intrinsics, accessing magic memory addresses, and the rare drop to raw assembly. Getting the code cross platform was less IFDEF's than one would imagine, mostly because code was kept in separate _win32.c/_arm.c files, and each file started with a giant #IFDEF(__win32) or #IFDEF(__ARM) (best way to do this IMHO, throw *.c into the build system, let the preprocessor figure it out).

C may be supported everywhere, but making the same C code run everywhere is a huge hassle. In comparison, cross-plat JS engines (e.g. Node) don't scale down nearly so much, but when they are brought to a platform, everything is going to be the same. The abstraction layer provided is a lot more robust, by design. Two very different goals.

Lua and Forth programmers all think this is silly, since their languages really do run everywhere. :-D


I'm confused. Are you naming npm as a plus? 5,000 individual downloads/dependencies to use a library that you could hand write in a couple of hours... not to mention embarrassing events like left-pad...
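For scale, the infamous left-pad was only about a dozen lines. A from-scratch version with the same general behavior (modeled on the package's documented behavior, not copied from it):

```javascript
// Pad `str` on the left with `ch` (default: space) until it reaches `len`.
// Non-string inputs are stringified first; strings already long enough
// are returned unchanged.
function leftPad(str, len, ch = ' ') {
  str = String(str);
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}
```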


> 5,000 individual downloads/dependencies to use a library that you could hand write in a couple of hours...

Yes, that's a trade I'm always willing to do. I'd rather use those couple of hours to write something new or spend some time with the family. "left-pad" was a minor annoyance that was quickly resolved.


Leftpad, a one time event that was fixed very quickly with a solid permanent solution? Being able to use libraries that 5,000+ other people have already vetted for quality for you so you don't have to waste a couple of hours hand writing them? That's 2 pluses right there.


Electron isn't great just because it runs javascript, it's great because you get to use the HTML/CSS you're used to making pretty UIs with to make pretty desktop UIs. HTML/CSS have their warts, but they're a fantastically efficient way to build out UIs.


So instead of running WinForms or Qt or GTK or, god help us, Swing, components in a tight, snappy desktop app, we're mostly running Bootstrap components in a webview. Six of one, half-dozen of the other, in my book.

The Stockholm syndrome around using browser technology everywhere, just because you've been brutalized into learning its wrinkles to work on the web, is scary.


Having built both native and Electron apps, my experience is that it's an order of magnitude faster to use Electron to ship on Win/Mac/Linux than any native tech.

Clone this repo and follow the steps at the bottom of the readme to ship on 3 platforms:

https://github.com/greggman/electron-hello-world


Every single platform has its 1-line hello world example. iOS has storyboards, Windows has XAML, Linux has Qt components.

If all you need is an action button to trigger some underlying code, then those hello-world products are very simple.

Electron is NOT that easy a platform to use. In fact, it's quite technical and requires a lot of knowledge about Node and HTML/CSS. It requires that the developer understand you're running a Node process that feeds its results to a browser with native chrome.

Visual Studio Code and Slack for example have native plugins that light-up the various platform's features. So you're never completely guarded from having to write native code. At the end of the day it's just another UI toolkit and language choice you can make when building out your software.


> it's quite technical and requires a lot of knowledge about node and HTML/CSS

HTML/CSS are already something with massive widespread know-how and insane numbers of online resources. That's a huge deal. This takes every web developer and makes them a cross-platform desktop app developer with no additional learning.


To a point that is correct. For simple button actions that trigger a web request and display content, I would agree the knowledge is there and available. For any deeper skillset (photo manipulation, live camera, geospatial or map-related code, WebGL, etc.), solutions are just as undocumented on any platform and require a sophisticated developer to write complex code.

A developer with the caliber required to solve those problems will be comfortable in any language they are asked to write in.


> Every single platform has their 1-line hello world example. iOS has storyboards, Windows has XAML, Linux has QT components.

The point is that Electron is cross platform. Building three different native apps with completely different UI toolkits is a huge amount of work and depends on much less common skillsets.


Sure, until you hit a wall with the sandbox. Then you will have to write native code to enable features only available on the platform that the application is run on.


Maybe but the UI is the lion's share of the code for most of these apps so you start way ahead.


The tooling in my opinion is much better and the barrier to entry is much lower. How many people have been deterred from writing Java because of Eclipse configuration? Setting up a remote webserver, launching an app with debug parameters, installing Eclipse, setting up a new project and connecting to the remote app. With client-side JavaScript (my specialty) the debugger is built right into the browser: hit F12 and you are debugging. The barrier to entry for developing a UI in JavaScript is extremely low. Mocking out JSON data and running a simple Python server to test a couple of ideas takes minutes. Try that with JSP, JSF, JSTL, PrimeFaces etc. See you in a week.


> How many people have been deterred from writing Java because of eclipse configuration?

And yet, this hypothetical person wants to write an Electron app to run on desktops?

I don't get this argument. If you are good enough to understand how to write and deploy an Electron app, you should certainly be able to understand Eclipse.

Note: I'm not advocating for Java here...


That was probably a bad argument; setting up Eclipse shouldn't be tougher than setting up tooling for modern SPA frameworks like React or Vue.

But Electron is there for a different reason. I can convert my SPA to a working desktop app with minimal coding (same CSS/JS front end, and a JS backend if you use Node.js). And it even works for all major platforms. So I literally have one codebase for the web and all three desktop platforms. Beat that!

I know it's not optimal to have everyone open a Chromium instance under the hood, but how much worse is it than the JVM?


Much worse: it consumes more RAM, executes slower, and lacks even the platform integration that Swing is capable of, reducing the UI/UX to the lowest common denominator of HTML5/CSS3 capabilities.

And with Java, I have the option to have an actual binary, 100% native code, if I feel inclined to do so.

Something that many Java haters keep being unaware of.


Yes, technologically worse. Decision to go with Electron is a business one.


> Give me literally any other language.

I'm torn between JS and PHP for the language I have the least fun programming in.


The system at work is written in terrible, terrible PHP. Over the last few years PHP has come a long way; many of its programmers haven't, alas.


That's exactly my experience too. I'm attributing this to "the web curse". Anything touched by the web is cursed with... I can't even formulate it in a few words; it must be complex ancient magic.


Next time you're forced to deal with JavaScript, try out TypeScript. It's subjective, but it's pretty fun to use and it compiles down to JavaScript. There are a lot of tools around this ecosystem, but as with anything, experience helps.


TypeScript, Flow, Reason, ClojureScript...


Surely you understand the value of a JavaScript runtime in every browser?


To make Ads and track user behavior?


> Maybe it's because I've never been a webdev but I just don't understand the fascination with JavaScript.

Native lexical scope. Do more with less and write/test it in a fraction of the time. Seriously, a couple of rockstar JavaScript devs could easily replace a large department of equally skilled Java developers.


It's not the cost of JavaScript. It's the cost of shipping the latest trend as fast as possible by throwing in yet another lib when you'll only ever use 1% of its options.

This is happening everywhere.

And this is also why we are wasting so much time waiting for that page or program to load.

It's a nice article, but isn't this something that should be obvious to the average programmer?


As a programmer I know all this, but I don't articulate it as well as Addy. So he's given me a fantastic link to share with the non-programmers above me.


It's happening because when you search for a job you get posting after posting calling for experience with specific JavaScript frameworks. "Looking for a React ninja". "Looking for an Angular hacker". So programmers push for using those frameworks even on projects that don't necessarily call for them, so they can pad their resumes and keep up with the ridiculous churn of frontend JS development. This is the advantage of working with a platform like iOS: there is one framework for building UIs, and that's all you need to know for your entire career. Yes, the framework upgrades and changes, but you don't have to learn a completely new paradigm for changing the contents of a text box for the hundredth time.


This is covered in the article

>Removing unused code


Nope, it is about including only the stuff you really need. I saw apps that were shipping Angular + jQuery just to provide a single interactive icon.

The article focuses on very specific cases of mobile web applications. In most SPAs it is not a big deal to ship 1 MB of JS.


They specifically mention tree-shaking and tools to strip unused code from libraries.


Read my answer more carefully. It is about using frameworks only when you really need them. A lot of people add dependencies to their projects that could be avoided by knowing JS and the Web APIs better.
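For example (a hypothetical case, not from the article): teams sometimes pull in a whole utility library just for one helper like debounce, which is a few lines of plain JS:

```javascript
// Trailing-edge debounce in plain JavaScript -- no library needed.
// Rapid calls are collapsed; only the last call within the quiet
// period actually runs.
function debounce(fn, waitMs) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), waitMs);
  };
}
```

Hook it to an input's keyup handler and the expensive work runs once the user pauses typing.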


This is one of the biggest things I have found remarkable about the modern Web: how slow everything feels.

Nothing just loads anymore! Instead everything loads in dribs and drabs, as various bits of it stream down to the device, get parsed and get rendered. I waste so much time staring at placeholder graphics and missing clicks because the element I was aiming at skidded across the page after something else popped in. The more front-end technology we throw at the Web, the worse it seems to get.

It's hard to remember that once upon a time Web pages were fast, even though we were viewing them on 1990s-era hardware that even today's budget mobile devices put to shame. We love the Web so much we are strangling it to death.


I've been paying attention to how slow the web is these days, and as often as not I can finish a newspaper article (newspapers are by far the worst offenders for third-party API callouts) before the tab's page-loading indicator completes.


If it completes at all!


This is one of the reasons I use NoScript. By blocking those dozens of calls to external trackers/ad networks/etc., you can significantly reduce the time between clicking the link and the page being readable.


> We love the Web so much we are strangling it to death.

"And I'm gonna hug 'im and squeeze 'im and name 'im George..."


I had a chance to hack through the new contributor interface of a leading stock photo seller after the UX became buggy and extremely slow following an update last week. They switched from an under-invested front-end interface to React/Redux for submitting images. I thought that maybe the implementation was bad, but I concluded that React/Redux and the set of front-end components were to blame. Go read the React/Redux websites: dozens of cool ideas, leading to a user revolt.

Didn't we learn from J2EE technologies that complexity is the source of all evil?


Hi, I'm a Redux maintainer. Can you give some more details on the issues you saw? What concerns do you have about using React and Redux?


Thx for the kind offer.

There's a list view where I select one or many video/image cards and edit their titles, tags, and descriptions in a single edit box, singly or in bulk. It's not about media editing, just titles, tags, and dozens of other fields. The Redux store contains the data for all 200 images as well as the undo history.

Here's the issue: when I put 200 things there to edit, the screen becomes very slow to use. The mere act of selecting a card and having its title/description etc. appear in the edit box takes about 2 seconds.

I installed the Redux and React dev tools out of curiosity to understand what's happening. My conclusion is that over 200 images submitted in a batch, combined with the undoable complex data store, can't be handled. So I went and examined the Redux website and source code, which appear quite thin and performant. In the end I think React gets slow updating 200 cards from the complex store after the Redux store is updated.

I currently don't have anything to submit, otherwise I would post some animated screenshots. But next time I do, I can file a GitHub issue; please let me know what kind of data I should include from the React dev tools for a performance check.


Yeah, at first glance I wouldn't _think_ that storing, displaying, and tracking selection for those items would cause performance issues, but obviously it depends on the actual code.

My React/Redux links list has a large section of articles on measuring and improving perf in React and Redux apps [0]. You might want to look through some of those. Also, my "Practical Redux" tutorial series [1] has some posts that talk about perf behavior, including key Redux perf concerns and ways to buffer form input change events.

[0] https://github.com/markerikson/react-redux-links/blob/master...

[1] http://blog.isquaredsoftware.com/series/practical-redux/
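For instance, one of the usual Redux perf concerns with long lists is unmemoized derived data; a bare-bones memoized selector (just a sketch of the idea, not library code — `state.images` is a hypothetical shape for your 200 cards) looks like:

```javascript
// Minimal memoized selector sketch: recompute the derived value only
// when the input slice of state changes by reference.
function createSelector(inputFn, computeFn) {
  let lastInput;
  let lastResult;
  return function select(state) {
    const input = inputFn(state);
    if (input !== lastInput) {
      lastInput = input;
      lastResult = computeFn(input);
    }
    return lastResult; // same reference => connected components can skip re-rendering
  };
}

// Hypothetical: derive the titles of the currently selected cards.
const selectSelectedTitles = createSelector(
  (state) => state.images,
  (images) => images.filter((img) => img.selected).map((img) => img.title)
);
```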


Wow, that's awesome. Thank you for your kind interest.


Not a hardcore JS-dev, but two things I noticed as someone who has optimized their JS output before:

- I am surprised that this blog post advertises webpack tree-shaking, as it is completely broken right now, and has been for a year[0]

- More and more people are switching to date-fns[1] from moment.js, since it is much smaller and composable by default, and works better with tree-shaking in Rollup (or webpack, once it's fixed). Yes, their recommendation of ContextReplacementPlugin is valid, but in the case of moment.js, it's probably better to just use a replacement.

[0] https://github.com/webpack/webpack/issues/2867

[1] https://github.com/date-fns/date-fns
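For reference, the ContextReplacementPlugin approach looks roughly like this (a webpack.config.js fragment; the locale regex is whatever your app actually needs, `/en/` here is just an example):

```javascript
// webpack.config.js (fragment) -- keep only the English moment locale.
// Without this, moment.js drags every locale file it ships into the bundle.
const webpack = require('webpack');

module.exports = {
  plugins: [
    new webpack.ContextReplacementPlugin(/moment[\/\\]locale$/, /en/),
  ],
};
```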


Angular 5 has been using the webpack ModuleConcatenationPlugin to reduce bundle sizes by half. It does Rollup-like scope hoisting. Google for "angular build optimizer".
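If you're on plain webpack rather than the Angular CLI, enabling the same scope hoisting is one plugin line (sketch; webpack 3+):

```javascript
// webpack.config.js (fragment) -- hoist modules into fewer closures,
// which shrinks output and speeds up execution (Rollup-style).
const webpack = require('webpack');

module.exports = {
  plugins: [
    new webpack.optimize.ModuleConcatenationPlugin(),
  ],
};
```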


I feel the web is so much faster on my phone since I started using brave and blocking all js by default. It's surprising how many pages load just fine with no javascript. Most HN articles do.


This phenomenon seems like an almost inevitable symptom of another thing companies do, this one completely understandable: Making sure the dev team has nice equipment to work with. If we're always working with reasonably high-end kit, it impairs our ability to empathize with the experience of someone who's using a cheap Chromebook or the "free" phone.

Contrast with a previous place I worked, where the team that was building in-house tools worked on hand-me-down equipment from their end users. That's a shop where the users almost never had to complain about poor UI performance.


This is why: the web browser was not built for every little thing. We as programmers need to get back to writing applications and using the web more as a data stream, letting the app do all the rendering ahead of time so that we are just refreshing the data.


Yes, the browser should be kept for HTML / CSS, and maybe some JavaScript for interactive documents.


Well, average desktop app size exploded too.


I wrote this [1] guide on how to reduce the size of your JS bundles via transpiler optimizations. There's a lot you can do to write code which results in less boilerplate from your toolchain. It also covers code splitting via webpack and some other goodies (like evaluating the size of your polyfills). We've been using these techniques for a while in production (our library is very focused on size) with good success.

[1] https://medium.com/@jbartos/to-ship-less-code-write-transpil...


Really neat article; I might try putzing around with Brotli, as I hadn't even heard of it before.

Was anyone else totally fascinated by those iPhone 8 processing times? Holy shit, it's faster than a MacBook Pro 16!! That's absolutely insane.


> adopt a Performance Budget for ensuring your team are able to keep an eye on their JavaScript costs.

This is really good advice for product people.
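If your build uses webpack, the simplest enforcement is its built-in budget (the byte limits below are arbitrary examples):

```javascript
// webpack.config.js (fragment) -- switch hints to 'error' to fail the
// build in CI instead of just warning.
module.exports = {
  performance: {
    hints: 'warning',
    maxEntrypointSize: 250000, // bytes of JS/CSS pulled in at startup
    maxAssetSize: 250000,      // bytes for any single emitted asset
  },
};
```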


Oh the irony, having one of the prime facilitators of web bloat talk about “the cost of JS”.


How much might precompiling to WebAssembly cut into that compile/parse time?


Anyone here deploy ES2015+ or know of any large sites that do?


I use AOT in dev mode.


Pay attention to how much slower all the Android devices are here, whereas iPhone has long since reached laptop parity. So depressing how bad Qualcomm hardware is.



