Stop Paying Your jQuery Tax (samsaffron.com)
357 points by sams99 on Feb 17, 2012 | 84 comments



Please go check out StackOverflow's source code before bandwagoning on this topic.

This is shifting blame to the tools. The problem lies in StackOverflow's poor design, not in jQuery.

Pushing jQuery to the bottom of the page is trivial if you do proper HTML architecture. Pages should have only HTML. JS files only JS. JS files referenced at the very bottom of the body. Very simple. (if your asp/c#/* framework doesn't make it easy, blame the framework and not jQuery)

Inlining JS, and even having JS code in title attributes, is ridiculous. Fix your HTML; only then do you get the right to say anything. Another telling gem is that the StackOverflow page doesn't even have a proper encoding declaration.

You don't need to use jQuery for everything. This is very typical of developers coming from the backend who don't take web development seriously. Use CSS as much as possible.

Another misplaced attack is the refresh argument: with proper cache headers it should not take that long. If some browsers are slow and don't keep a pre-parsed cache, blame the browser vendor and not jQuery.

jQuery taking 80ms on mobiles is quite OK. If you really care about mobile make a page optimized for mobile and minimize JS rendering and styling.

I absolutely love StackOverflow and it's one of the best things to happen to programmers in the last few years. But this self-righteous attack on a very important tool is very misleading and ungrateful.

Edit: the proposed "solution" of catching $.ready calls and invoking them later is insane.


Let's be clear ... I am not attacking jQuery here, just saying it is a misuse to chuck it in the header, and offering a practical mechanism to get out of a mess you've created. I explain the cost of placing it in the header. I am not perfect, I make mistakes daily, I try to learn.


You pointed out a few best practices, but it's a shame that this article is written in the context of jQuery, rather than recognizing that these "taxes" apply to every script tag you place on your website.


That is absolutely true. I was looking at cnn.com: http://z.cdn.turner.com/cnn/tmpl_asset/static/intl_homepage/... sits in the header and chews up 100k of compressed JS. This monster must take upwards of 20ms to parse in Chrome as well. It's way worse in crappy browsers, and the CDN network you need to support this is huge.


In defense of sloppily composed news websites: they're not run like normal sites. Despite the obvious benefit of optimizing and cleaning up the site markup, it's quite normal, and even encouraged, to sacrifice optimization -- so that things just work.

Two main differences from normal sites are:

(1) The sheer number of moving parts: how many hands have a say in the site's daily operation.

Before anyone says "duh, well, get a handle on your site", you have to understand that the news world is chaotic for the right reasons: content flexibility and innovation. You can't put hundreds of ongoing creative and newsworthy projects on small development teams. That kind of open exploration belongs in the hands of editors who specialize in their particular beats and are willing to pursue newsworthy projects (vendor or agency partnerships, special reports, etc.).

These people understand only rudimentary HTML, but they have enough access to be dangerous.

That's a good thing.

The more people doing this sort of thing, the better for any news agency: it gives us a collective chance for any random project to be naturally selected by the audience for success. The problem is how to corral the various implementations, band-aids, and so forth so that it all works flawlessly.

However, cutting off editors from being able to write markup is not the solution - and post-optimization regression testing is impossible due to the sheer volume of content and creative projects. As long as something works within acceptable limits, it's fine.

(2) The politics of how technology intersects with editorial and operations.

There are invisible lines of power in news organizations that outsiders don't see. Bureaucracy alone is tough to navigate, but the compartmentalization of the editorial, technology, and operations teams adds to the problem. Most news agencies have been transformed over decades (in some cases, centuries) and are pretty set in their ways.

The explosion of internet technology has dramatically sped up this transformation, and it'll take some time to iron out. The good news? The rise of devops in the news biz is happening.

Though I'm a fairly adept developer working for a news agency, I cannot update or fix our favicon file or optimize JS on the main page: for one, template control belongs to a different department that will prioritize their projects differently than I would. The communication overhead is too large for small matters. In their defense, their lean team is supporting 50 languages and 1,000 persons' worth of editors and producers at this point.

Secondly, even if something is simple to fix and done for the right reasons, navigating this field of artificial obstructions just to reduce page weight or load time doesn't justify me taking time off from working on my latest editorial project: there is a fast-approaching relevance deadline on it.


Where are you getting these numbers from?


My laptop, which could be a bit faster; the cold cache is the major issue, though. Distributing a file this big in a scalable way without blocking requires a lot of servers.


"Stop Paying Your jQuery Tax", that's an attention grabbing title with an attack on a free open-source project. It took you at least 20m to write that post, probably over an hour. You can't just say "oops".


I (not really a frontend web developer, though I've done a little bit of it) immediately went "oh, script loading in HEAD?" before even opening the article.

Most people, at least in this neck of the woods, use jQuery. Most people also put it in the HEAD and are susceptible to this sort of thing. "jQuery tax" is a succinct way of describing the phenomenon, and nobody's going to suddenly think less of jQuery because of it.

"An attack" on jQuery? Histronics.


1) If it took him a month to write that article, he could still say 'oops'. He's entitled to err.

2) It seems like you're the only one interpreting his post as an attack on jQuery. It isn't. You said in another thread that English isn't your first language. I'd suggest you temper your accusations in light of that knowledge.

3) Even if he were bashing jQuery, and even though it is a good and widely used framework, that's his right to do. Yes, it is your right to cry foul, as it is my right to attempt to set you straight.

Simply put, I think you're off the mark here. Your attack on him is more damning than his supposed attack on jQuery, and it strikes me as defensive.

In with the good air, out with the bad.


>You said in another thread that English isn't your first language. I'd suggest you temper your accusations in light of that knowledge.

As a native English speaker, I would like to point out that alecco is _correct_ when he interprets the article as slinging mud at jQuery; there is nothing wrong with his/her English vis-a-vis his interpretation of this article's title. The fact that you would tell him/her to pipe down because he's not a native speaker is disgusting and xenophobic. As a side note, your comment about alecco being "the only one" interpreting the post as an attack on jQuery is completely false. S/he's dead on about it being what I call a DCA, or "deliberately contentious assertion", designed to win more page views than an article merits.

Honestly, if alecco were expressing an opinion with which you agreed, would you still be saying s/he 'just doesn't understand the article due to poor language skills'? Or do you reserve this treatment for those foreigners who dare disagree with you? You dug through alecco's comments in other threads to find this ad-hominem rock to sling... and the reason you had to mention his/her non-native-speaker comment from another thread is that alecco's English is good enough that you wouldn't know s/he's a non-native speaker from this thread.

Non-English natives are allowed to have opinions at variance with your own, sir or madam. Leave


It wasn't meant to be an attack. The point was that I felt Alecco was being overly harsh, if not in this particular post, then in others on that thread.

I didn't dig through their history for a rock to sling, but they mentioned it elsewhere in the same thread.

I don't have a dog in this fight regarding the article. It isn't an opinion I particularly agree or disagree with. I have no affiliation with the author, the blog, or any of the individuals I've remarked on in this thread. I also like jQuery, so if anything, I'm arguing from the same bias as Alecco, in that I think it is a very high quality open source implementation of an amazing JS framework. I even read through the article multiple times to seek out the jQuery bashing Alecco alleges. I can't find it.

At first, I thought it was a popular opinion, until I realized that each of the remarks I believed to be overly negative were from the same person.

There wasn't any malice intended except to say "Hey, lighten up. People are allowed to make mistakes." One of the tenets I believe is core to Hacker News working is the assumption of good faith. I don't believe that Alecco was deliberately trying to malign the author, but the statements certainly come off as though they did not assume good faith.

I have tried to do so here, as in my original comment, but I do not believe you have extended me the same courtesy.

In the interest of promoting civility, I'll disregard your final remark, except to say that English is also not my first language.


I apologize for the last word of my reply. I'm not sure why it was truncated, it just said "Leave".


Another telling gem is that the StackOverflow page doesn't even have a proper encoding declaration.

They declare the encoding in the HTTP Response.

  Content-Type:text/html; charset=utf-8
That's totally proper. If you specify it again inside the HTML, are you gaining anything?
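For completeness, repeating it in the markup is a single line anyway; the first form below is the HTML5 shorthand, the second the older equivalent (both standard HTML, shown purely for illustration):

  <meta charset="utf-8">
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">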


It's not wrong by itself, but leaving the content encoding implicit might become an issue later (e.g. load balancers or CDNs, merging content, or improving support for East Asian languages). It's likely they've never faced those kinds of issues.


Again, it's not implied when you state it categorically in the headers. If there's a CDN or load balancer out there which breaks your Content-Type headers, it's much, much better to replace it with something which follows HTTP a bit more closely.


Very often you don't have control over that. It might be managed by a different team in the organization, or it might be a poor product purchased by higher management. Or it could be a third-party service. You have to code defensively.


Rest assured, I've been in that sort of situation. However, I've never found it viable long term to take on additional complexity rather than pushing back against a group which is obstructing progress - ultimately you're accepting more work and risk to help reduce the pressure on them to stop underperforming.


>> Please go check out StackOverflow's source code before bandwagoning on this topic.

Please read the article before replying.


If you read my comment you'll see I addressed exactly those topics (document ready and caching).


Agreed. I use Django and I had no idea what they were talking about. They should check out other frameworks before writing posts with generic headlines.


Re "Pages should have only HTML. JS files only JS. JS files referenced at the very bottom of the body. Very simple."

I suggest you view the source of http://api.jquery.com/ready/ before making that assertion. If it were that simple, why is the website providing jQuery's own documentation not doing so?


Did we read the same article? I didn't see this as an attack on jQuery at all. Furthermore, he didn't even blame any tools! He blamed the platform because as usual, the web platform itself is indeed the problem.

"...developers coming from the backend who don't take web development seriously."

That's because client-side web development is a joke. It's a really bad one too, compared to a nice native kit. The people who come up with all the silly ways of turning a fracking Hyper-TEXT system into an application platform should be shot. I'm kidding about shooting anyone, but honestly - it's horrible how many hoops you have to jump through. HTML is the worst thing that's ever happened to software development. The Internet is TCP/IP, not HTML, and personally I want something better every day, and I know you do too, because all you web developers are always, always, always complaining about something.

"self-righteousness attack on a very important tool is very misleading and ungrateful"

Once again, self-righteous? Sorry, I just don't see it. He wasn't even blaming the jQuery guys for one single thing. This kind of problem could happen with any large Javascript library that runs in the browser.

The proposed solution is not insane if you don't want to make your server framework do more work. There's absolutely nothing wrong with making code run in the browser so that you don't have to do something on the server. You need to get off your high horse, buddy.


I hate the whole web paradigm but it's what's in use and it's definitely improving. And in particular, jQuery makes it less painful so that's what many of us use (or similar JS frameworks). The same problem happens at the low end with the x86 instruction set. Yet we live with it.

None of that makes it OK to write a bashful title that should've read "Stop paying your ASP MVC JS tax" or something. It all comes down to how they need to have inlined JS requiring $(document).ready(). That's the problem, not jQuery.


'None of that makes it OK to write a bashful title that should've read "Stop paying your ASP MVC JS tax" or something. '

"Bashful" means "shy," not "full of bash," as you seem to think. Actually, I wonder if you speak English natively as you seem to be the only person who interpreted the article the way you did.


You are right, I meant to use "bashing".

I do not speak English natively. But that has nothing to do with the bashing title. There is no "jQuery tax"; that's a misleading statement.


He doesn't suggest that there IS a "jQuery tax", rather that people's common misuse of jQuery imposes a tax on page rendering.


That is a great point!

An interesting question is, why not just put ALL scripts at the end of the <body> tag, after the HTML of the page has loaded and the CSS probably has as well?

The only thing I can think of is if you have code ON the page which uses these scripts. But why not just put that code at the end of the page, too?


The main reason this happens is that most MVC platforms generate pages piecemeal. Say, for example, you are generating a 'product' page:

It has a 'template' that contains your header, footer and scripts that everyone relies on.

Then the product piece may also need a script or two, and finally it may need to add some dynamic love to the page, like, say, $(function () { myMagicHelper(779, 'magic'); }). In general people are used to just inlining these kinds of mini-scripts close to the bit that generates the product HTML. It can be migrated to a system that defers generation to the footer, but that usually involves a larger amount of change (at least on projects I've worked on). I guess this trick saves you a bit of time migrating some inline scripts to the bottom.
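To make that concrete, a hypothetical product page (purely illustrative markup, not SO's actual code) might come out of the template engine roughly like this, which is exactly what forces jQuery into the head:

  <head>
    <!-- shared template: jQuery sits in <head> so any inline snippet below can assume $ exists -->
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
  </head>
  <body>
    <!-- emitted by the 'product' piece -->
    <div class="product" id="product-779">...</div>
    <script>
      // hypothetical helper, borrowed from the example above
      $(function () { myMagicHelper(779, 'magic'); });
    </script>
  </body>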


rails has content_for (http://api.rubyonrails.org/classes/ActionView/Helpers/Captur...) that helps with this.


true, still verbose, but more elegant than the solution in asp.net mvc


out of sheer curiosity, what is the asp.net mvc solution?


It doesn't really have one. Annoying, because it's 90% of the way there: views can insert code into specific "sections" of the template, but sub-views can't.

If they just enabled that then you could make a "JS" section just before </body> and have all your JS inserted there.


My current thinking is: http://snipt.org/uNq8


A strong reason to not do so is when you want some part of the page to be immediately enhanced with JavaScript (like a login form). If the page has many images and other resources, your users will have an awkward experience, with the scripts suddenly engaging at a time that feels arbitrary. While some scripts absolutely belong at the bottom of the page, it is a very good idea to decorate JavaScript widgets immediately after the HTML is rendered, as in:

  <div id="widget1234"></div>
  <script>
      (function() {
          var widget = new Widget({
              el: document.getElementById('widget1234')
          });
          // ...
      })();
  </script>
Note that the script is also inlined here: Losing the overhead of an HTTP request is beneficial when a widget needs to spring to life immediately: You want it to be active the moment the HTML is finished.


Simply have the widgets start as display: none -- or even better -- visibility: hidden until they are rendered with JS. The CSS accompanying the widgets should do this.


Yes, that should be the case anyway: The widget shows itself when it is finished decorating. But even then, there's no sense in deferring decoration until after some other arbitrary resources have loaded.


Why do you still need a document ready function if all your scripts are loading after the html anyway?


The DOM will have been parsed if your scripts are at the end—but external components (links, imgs) may not have loaded yet.


The same is true whether or not you use a document.ready statement; you're thinking of window.onready, which fires once all images have been downloaded.


You are correct! Thanks. Been spending too long with document.readyState.


you meant window.onload


Well, it's more of a legacy problem: we are used to using it everywhere. Deferring external scripts is a fairly easy exercise, but those tiny snippets everywhere that have $(doStuff) are more annoying to port.


If all your scripts are before the closing </body>, the DOM will be ready by the time they execute, and you can safely ditch your legacy ready wrappers.
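A minimal illustration (assuming the scripts really are the last thing before </body>):

  <body>
    <div id="content">...</div>

    <script src="/js/jquery.min.js"></script>
    <script>
      // The markup above has already been parsed, so no ready wrapper is needed.
      // (window's "load" event, which waits for images, is a separate matter.)
      $('#content').addClass('enhanced');
    </script>
  </body>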


This linked article is good: the Performance Golden Rule http://www.stevesouders.com/blog/2012/02/10/the-performance-...

(submitted days ago but got no traction)

For years we have been told that we will have to wait on the database so it doesn't matter how fast your implementation language is...

Well it's half true, depending on what you consider your implementation language: server side or client side.

Front-end performance matters just as much as (or, by this article, more than) the back end.


How does this (http://37signals.com/svn/posts/3112-how-basecamp-next-got-to...) get to be number one, while this http://www.stevesouders.com/blog/2012/02/10/the-performance-... goes nowhere?

Are we upvoting simply by name recognition?

http://stevesouders.com/about.php

He only wrote the book on high performance web sites...

http://www.amazon.com/High-Performance-Web-Sites-Essential/d...


It's partly name recognition and partly novelty. The 37signals approach is unusual, while most competent front-end developers are already aware of the principles in Souders' post.


Souders' post is 5% halfheartedly evangelizing (maybe not-so-)common knowledge and 95% reiterating his point about the split between front-end and back-end time in fulfilling a request.

After reading it I'm pretty sure it could eliminate all but the last graph and be just as valuable at 1/4 the length. That he did a rather comprehensive evaluation is commendable, but I think the post loses track of its main point in favor of sailing the sea of data that was collected.


So is the article saying you basically put an alias for jQuery's `$` in the header, so that if you need to put `$(function() {});` or `$(document).ready(function(){})` elsewhere in your page you rely on that alias, and then once jQuery has loaded you map `$` back to it?


That is one technique you can use to help push your jQuery include to the bottom. The others are having a smart script-registration piece that takes care of it for you, or running code that rewrites the HTML like blaze.io does.
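Roughly, the header-stub technique might look like this (a simplified sketch of the idea, not the article's exact code; the queue name and paths are just for illustration):

  <script>
    // Tiny stub in <head>: collect callbacks passed to $() until the real jQuery arrives.
    window.q = [];
    window.$ = function (f) { q.push(f); };
  </script>

  <!-- ...page content; inline snippets keep calling $(function () { ... }) as before... -->

  <script src="/js/jquery.min.js"></script>
  <script>
    // jQuery has now replaced window.$; replay everything the stub collected.
    $.each(q, function (index, f) { $(f); });
  </script>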


Complex solution to simple problems.


Yes, WTF worthy.


The Grails framework has a plugin to organize resources like JavaScript - http://grails-plugins.github.com/grails-resources (scripts will by default be added to the end of the page and even if your page consists of several template files which depend on jQuery etc, they will only be included once).


I'm very confused... why aren't the scripts in the footer? Because of inline javascript? What problem is this solving? What does this have to do with jQuery in particular? I'm very confused...


This is clever. However, it depends on scripts using the shorthand notation for jQuery's ready method. Maybe it would be better form to use something explicit instead, rather than doubling up the $ operator.


True, we are used to using the shorthand notation; in fact at SO we use a special internal variant called StackExchange.ready for a similar purpose. You could get $(document).ready remapped with a similar trick if you wanted, or you could simply push the inlines to the bottom as well.
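For example (again just a sketch, assuming only .ready() is ever called on the wrapper), the stub could hand back a fake object whose ready method queues as well:

  <script>
    window.q = [];
    window.$ = function (x) {
      // $(fn) shorthand queues directly...
      if (typeof x === 'function') { q.push(x); return; }
      // ...while $(document).ready(fn) queues via a fake wrapper object.
      return { ready: function (f) { q.push(f); } };
    };
  </script>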


Here's my attempt at a similar trick, covering all 4 possible ways to bind to DOM ready using jQuery: http://blog.colin-gourlay.com/blog/2012/02/safely-using-read...


Nah, I'm going to pay the tax rather than optimize prematurely.

But I'll add that to my bag of tricks when I start optimizing.


One of the points the post makes is that by that time, making changes is a lot harder than it would have been if one had just started out with the proposed setup.


I'm not sure this qualifies as premature optimization. Allowing jQuery to block page load incurs a bit of load time where the user has to sit around waiting. Anything that can be done to minimize user frustration (and maximize user experience) is probably worth doing when it's so simple.


Agreed. Premature optimization is about "maybe", and feared performance loss.

Responsiveness matters to users; to them, speed is a feature, insofar as they can perceive it. The "JQuery tax" absolutely is a real, and user-perceptible issue.

"Avoiding premature optimization" can sometimes make it hard to undo poor design choices made early on that could have been avoided.


> Responsiveness matters to users; to them, speed is a feature, insofar as they can perceive it. The "JQuery tax" absolutely is a real, and user-perceptible issue.

I think the evil of premature optimization is attacking the wrong target first. 80ms isn't a small amount, but moving script tags around could have more potential side effects than, say, using proper caching, minifying, etc.


Good point. This hadn't even occurred to me.


After the initial load from the CDN the tax is 10ms (Chrome), 20ms (IE9) or 80ms (Firefox). So for modern browsers it's really a tiny amount you save. For IE7 users the tax is more than 100ms but it seems a waste of time to optimize for them given their market share.

I'd say saving 20ms is premature unless you are already doing everything else you can do.


I would tend to agree with the other commenters and say, why not just put it at the bottom in the first place? That way, you design the whole site with jQuery loading last and, hence, you never have a massive headache trying to change it!


This is not premature optimization. It's about not making spaghetti code (HTML/CSS/JS in this case). Have clear guidelines from the start, that's it. No optimizations. In fact having clean code makes development easier (this applies to webdev too). If the framework you picked makes it harder, ditch the framework.


"Chrome seems to be able to do it under 10ms, IE9 and Opera at around 20ms and Firefox at 80ms (though I probably have a plugin that is causing that pain). IE7 at over 100ms.

On mobile the pain is extreme, for example: on my iPhone 4S this can take about 80ms."

I'd be very interested in seeing some more concrete benchmarks for this.



While I personally would prefer to put most JavaScript in a separate file, rather than sprinkled inline, it happens. So, why doesn't the browser handle this problem for us???

Why not have the browser drop all of the js script fragments and references into a queue, then run them later? Why make people reshuffle the stuff that they have working?

Hell, for that matter, I'm a bit puzzled why js even runs before the document is ready anyway. Is there some user in a hurry to get his alerts and popup windows?


How does pushing jQuery to the footer solve the "constant tax"?


You get to see your content rendered on the page before jQuery is parsed and compiled. There are other approaches, like async or defer, that are preferred but harder to implement, and you could go for a leaner jQuery like, say, jQuip.
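For reference, the attribute-based version is just markup (a simplified sketch: defer downloads in parallel with parsing but executes in document order before DOMContentLoaded, while async executes as soon as the file arrives, so ordering between async scripts isn't guaranteed):

  <!-- downloaded in parallel with parsing, executed in order, before DOMContentLoaded -->
  <script defer src="/js/jquery.min.js"></script>
  <script defer src="/js/site.js"></script>

  <!-- downloaded in parallel, executed whenever it arrives; order not guaranteed -->
  <script async src="/js/analytics.js"></script>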


What you just said is this article summed up, I think. Getting to read the content before the jQuery is parsed is the key, and it solves the cache issues. The other part, where you collect all the jQuery functionality and push it to the array/.ready loader at once, forces you to be organized.


I hope it also forces people to use CSS where possible (which is more efficient for certain animations etc), and only use jQuery where necessary.


When we were working on WebPutty.net, we found headjs (http://headjs.com/) to be very useful for ensuring all other script calls are run outside of the <head> tag. The full library runs 2.7k gzipped, while just the loader is 1.3k and is the only JavaScript file you need to block on. (The full script includes a Modernizr-style library in addition to the script-loading functionality.)
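If I'm remembering its API right, basic usage is along these lines (file paths are illustrative):

  <head>
    <script src="/js/head.min.js"></script>
    <script>
      // load in parallel, execute in order
      head.js("/js/jquery.min.js", "/js/site.js");
    </script>
  </head>
  <body>
    ...
    <script>
      // runs once the files above have loaded
      head.ready(function () {
        $('#widget').show();
      });
    </script>
  </body>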


I think they're still missing the point. The idea is to stop using jQuery like that. Think of it as another required module of your application and move everything to RequireJS. You'll gain speed and peace of mind once you learn to modularize all your JavaScript code.
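A rough sketch of that style (assuming RequireJS is configured with a path for a 'jquery' module; file names are illustrative):

  <!-- the only blocking script tag; main.js is fetched and run asynchronously -->
  <script data-main="/js/main" src="/js/require.js"></script>

and then in /js/main.js:

  // jQuery becomes just another module dependency
  require(['jquery'], function ($) {
    $(function () {
      // page setup goes here
    });
  });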


That's yet another dependency. Not my cup of tea. I'd rather wait for the upcoming light/modularized jQuery rewrite.


I think the world should just go back to rich client apps and forego the web browser development tax in general. At least consider something like GWT, which lets developers code the server and client in Java while Google worries about how to optimize the JavaScript.


If you're looking into these kinds of optimisations then why not sidestep the problem by moving to a single-page application model?

Or is it still acceptable to support clients who do not enable javascript?


> Or is it still acceptable to support clients who do not enable javascript?

I deeply believe that yes, it is. Making Javascript mandatory on websites where it should only enhance the user experience still seems to me like a very bad practice, and I hope it will stay so.

I even want to say that, if your website is correctly designed, offering its content in a Javascript-less way should not be that hard...


I think it's less an absolute, and a lot more "it depends".

If you're building a 3 page website for a mom and pop operation, then ok, have a JavaScript free site. If you're happy with the sites from the 1990's, then no JavaScript is just fine.

It's possible in some cases to use progressive enhancement, so that those without JavaScript get a working, albeit really poor, experience. How much effort you put into creating that option, and maintaining it, depends on your cost-benefit analysis.

If however you want to build a site that the user interacts with, then JavaScript makes such an important difference that not to use JavaScript results in a crappy user experience.

I want every bit of data entry to be validated when it's entered, either on the client or the server, not wait for the user to click "submit".

I want forms to morph as the user enters data. Shipping address same as Billing address? Great, user clicks "Same As Billing" and all the Shipping Address fields vanish.

Shipping out-of-country? Ok, shippers that service the US only are removed from the Shipping options drop-down.

Don't get me started on a zillion user-interface improvements, like data-lookups, sliders, menus, autocomplete etc which enhance the user experience and make people want to use my app, and come back to it.

There are only 2 groups who don't have JavaScript support: those using truly ancient browsers, and those who've actively turned it off. I honestly don't see the point of degrading the experience for everyone based on these two demographics. It's unlikely that either group will contribute significantly to my income, certainly not in proportion to what it costs to support them.

We make this trade-off all the time. If I write a desktop Windows program, I'm ignoring the 10-15% of folk not on Windows. If I write an iOS app I'm ignoring the folks with Android et al.

So - it depends.


> There are only 2 groups who don't have JavaScript support

That's not entirely accurate. While those are two of the most common cases, there are plenty of other contributing factors (http://developer.yahoo.com/blogs/ydn/posts/2010/10/how-many-..., this comment makes a good case): errors stemming from network congestion, developer error, and even CDNs failing - only a few weeks ago the Google-hosted version of jQuery (IIRC) failed, leaving lots of sites with broken JS and an effectively no-JS experience.

Personally I think building a site that works well without JS and then progressively enhancing it is the Right Thing to do. It means you're building a robust site for the real web and the multitude of clients and conditions that comes with it. In your example I wouldn't want a JS failure to prevent a checkout process; it should be robust and continue working, albeit less smoothly.

> honestly don't see the point of degrading the experience for everyone based one these two demographics

IMO If you're doing progressive enhancement well then this is not the case. I don't think the iOS/Android comparison is accurate either. The web is much more varied, the differences less well defined.

You're right, it does depend on what you're building and for whom. But I don't think the general case is as binary as you've made out.


All your examples can be accomplished with progressive enhancement, so what's the problem?


> I deeply believe that yes, it is. Making Javascript mandatory on websites where it should only enhance the user experience still seems to me like a very bad practice, and I hope it will stay so.

> I even want to say that, if your website is correctly designed, offering its content in a Javascript-less way should not be that hard...

This is problematic when JS is the user experience. Many modern applications are just browser/JavaScript clients on top of REST APIs.


Stop paying your .NET tax and start using proper javascript.



