> Even if these libraries get cached in the browser it’s still quite a lot of JavaScript that’s executed every time a site is loaded.
Is it? 39 libraries that are 35 lines of code each, or that just define a bunch of functions your code can then call as needed, may well not take that long to parse. How many gems does the typical Rails application use? Libraries are the equivalent on the web, only nobody resolves the dependencies for you; you have to include them manually. Is it that surprising that people are using a variety of helper libraries to do ads, analytics, DOM manipulation, etc, etc? That's fairly standard for development.
I think saying we're “drowning in JavaScript” because we've gotten a lot better at modularizing our code instead of copy-pasting it from everywhere seems like lamenting entirely the wrong part of the problem. I would rather every site include Google Analytics, DoubleClick, jQuery, and 10 jQuery UI plugins than be subjected to the cascade of security and functionality bugs folks would run into if everyone tried to roll their own solution for everything. There are already enough of those even with the use of common code.
Using libraries for third party code is good, and the fact that we're doing it is a step forward, not a step back. Now, there may be room for folks to choose the third party code more wisely, but that's a perennial problem when you're willing to use code that's already been written.
Sure, but it's a leaky abstraction. Downloading 30 .js files (and thus waiting for 30/x HTTP round trips, where x is the number of allowed concurrent connections) before any JavaScript on the page can run is problematic, unless you're adding extra complexity by using JS concatenation, minification, etc.
> and thus wait for 30/x HTTP roundtrips, where x is the number of allowed concurrent connections
Which is 1.
This is JavaScript here, not CSS or images. The default behaviour is to stop everything, download the script file (synchronously), execute it (also synchronously), then resume. This means the default is to download and then execute one JS file at a time (which is why you should always put your CSS files before your JS files).
Concurrent download (let alone out-of-order execution) has to be opted into via specific attributes:
@async will queue the script download (asynchronously) and execute it whenever it can once it has downloaded. Multiple @async scripts may execute in any order, depending on how fast they arrive and when the browser finally decides to run them. It is supported in webkit-ish browsers, Firefox (>= 3.6) and MSIE >= 10.
@defer will also queue the script download, but it guarantees the scripts will only be executed 1. after parsing ends and before DOMContentLoaded fires, 2. in order. It is supported in webkit-ish browsers, Firefox (>= 3.5) and MSIE >= 10 (MSIE has had it since ~IE5, but as usual its behavior tends to be ill-defined and buggy).
(note: this is for <script src> tags in the downloaded source, not when they're dynamically inserted)
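As a rough sketch of the three cases (the file names here are just placeholders):

    <!-- default: parsing stops, the file is fetched and executed, then parsing resumes -->
    <script src="jquery.js"></script>

    <!-- @async: fetched in parallel, executed whenever it arrives, in no guaranteed order -->
    <script async src="analytics.js"></script>

    <!-- @defer: fetched in parallel, executed in document order after parsing
         finishes and before DOMContentLoaded fires -->
    <script defer src="app.js"></script>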
This isn't the case, I just verified it.
A test.html page with <script src="dummy32k-a.js"> and another <script> tag for dummy32k-b.js shows both files downloading concurrently in the Chrome devtools network timeline.
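Something along these lines, reconstructed from that description (so treat it as a sketch of the test, not the exact page):

    <!-- test.html: two plain script tags, no async/defer -->
    <script src="dummy32k-a.js"></script>
    <script src="dummy32k-b.js"></script>
    <!-- both fetches start at roughly the same time in the network timeline;
         execution still happens in document order: a, then b -->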
It makes sense since the HTTP fetch order doesn't affect semantics as long as you obey the execution order from the page. Unless you're doing something very contrived, such as serving them as no-cache and generating different js dynamically from server side if a previous script has been requested...
Last I checked, browsers (Chrome? Firefox?) only allow 6 concurrent downloads from a given server. Do all resources load concurrently if you have seven large JavaScript files in a page?
I'm not sure it's a leaky abstraction per se. But there are definitely issues with it—issues things like the Google Closure Compiler try to deal with. One of them is this: eventually, Rails realized you should probably link your application to a particular version of a gem (see bundler), just like the Java world realized web apps probably shouldn't be resolving dependencies at runtime (see WAR files).
Right now, JS apps are often just serving the latest version of libraries for things like DoubleClick and Google Analytics. Google can update what they're serving at any time without letting you know. If you use a build system to bundle the library into your code, you lose the benefits the user could get from already having the library cached. If you don't, you're exposed to someone else's whims on library updates, and you add HTTP requests to boot.
The same goes for bundling something like jQuery into your app. You can bundle it in, but that data has to be sent down the wire to the client along with the rest of your JS, even if it all goes in one HTTP request. Or you can include it via a CDN, and a lot of users won't have to download it again, at the cost of an extra HTTP request. So it's all about choosing the right tradeoffs.
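Concretely, the two options look roughly like this (the bundled file name is illustrative; the Google-hosted jQuery URL is the usual pinned-version form, but check the current docs):

    <!-- option 1: bundled with your own code -- one request, but jQuery's bytes
         ride along with every copy of your application JS -->
    <script src="/assets/application.min.js"></script>

    <!-- option 2: a pinned version from a shared CDN -- an extra request,
         but many users may already have it cached -->
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min.js"></script>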
Also, JavaScript runs as it downloads, in page order. So you don't have to wait for all 30/X round trips, though you may have to wait for X round trips for the last file (in page order) to run. Then it becomes a matter of prioritizing what needs to load first. Naturally this is stuff we wish we didn't have to worry about, but worrying about bandwidth and latency and how and when and in what order things download has always been necessary for network applications, and I don't see that need going away anytime soon. I'd love to be proven wrong though :)
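In practice that prioritization often just means ordering the tags in the page (script names here are hypothetical):

    <!-- what the page actually needs, early and in order -->
    <script src="jquery.min.js"></script>
    <script src="site.js"></script>

    <!-- ...page content... -->

    <!-- third-party extras last, so nothing above waits on their round trips -->
    <script src="ads.js"></script>
    <script src="analytics.js"></script>
    </body>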
Most properly built sites are already combining the JS and minifying it. It's a largely solved problem that is automated with the right tools. Not much added complexity in it.
Correct. Nowadays most web application frameworks have a built-in concept of an "asset pipeline". You drop your javascripts and stylesheets in a specific folder, and when your application starts in a production environment, everything is concatenated, minified and optimized, so your users have exactly one JavaScript file and one stylesheet to download.
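In Rails, for instance, that boils down to a Sprockets manifest along these lines (the exact requires depend on your app):

    // app/assets/javascripts/application.js -- the asset pipeline manifest
    //= require jquery
    //= require jquery_ujs
    //= require_tree .

In production those directives get compiled into a single minified, fingerprinted file.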
Wasn't HTTP pipelining supposed to fix the round-trip problem circa 1999? If everyone has given up on that, maybe HTML5 needs to bring back .jar files for JS or something.
The worst offender on this list is Adobe Analytics. If not loaded, it breaks most websites, as I can only imagine that its calls are ... (shudder) ... synchronous.