
I dislike the approach described, where every JavaScript file is concatenated into application.js and each file then tests for the presence of particular DOM elements or classes. This taxes the creation of new scripts somewhat, as you must ensure you're not accidentally creating a DOM element or class that will enable some undesired script. Another possible option besides "require_tree ." is to use a framework like http://requirejs.org/ and then use inline script tags within pages to require a particular module.

I also think it's a bad idea to enable asset compilation in production. From the Rails asset guides (http://guides.rubyonrails.org/asset_pipeline.html#live-compi...):

"This mode uses more memory, performs more poorly than the default and is not recommended."

My advice would be to use application.js to concatenate scripts you're highly likely to use on every page, e.g., jQuery, Bootstrap, etc. Then organize the rest of your page-specific scripts into app/assets/javascripts/<controller>/<action>.js<.extensions> and add a `javascript_include_tag "#{params[:controller]}/#{params[:action]}"` in any views that rely on page-specific scripts (within a content_for(:head) block). This way you'll likely have 2 local scripts loading for each page - the application.js file and any page-specific script. The upsides are that you don't need to worry about undesired JavaScript running, the application.js file will be cached and reused across all pages, and your lightweight page-specific JS file will be served the first time a user loads the page and then cached for every subsequent visit. The potential downside is that you either need to specify a precompile array by hand in your environment-specific config file or automatically glob files to be precompiled.
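A minimal sketch of the per-page include described above (the view path is a hypothetical example; `content_for` assumes the layout yields :head):

```erb
<%# app/views/posts/index.html.erb -- hypothetical page-specific view %>
<% content_for :head do %>
  <%= javascript_include_tag "#{params[:controller]}/#{params[:action]}" %>
<% end %>
```

And the globbed precompile list might look roughly like this (a sketch; adjust the glob for .coffee or other extensions):

```ruby
# config/environments/production.rb
# Precompile every per-controller script in addition to application.js
config.assets.precompile += Dir["app/assets/javascripts/*/*.js"].map do |path|
  path.sub("app/assets/javascripts/", "")
end
```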

Note that you can also use the manifest declarations inside of your regular javascript files, e.g., `//= require 'backbone'` at the top of one of your page specific javascript files.




I like your suggestion of using RequireJS. I'll add that to the next revision of the article.

Your strategy of using two JS scripts per view, one site-wide and one page-specific is interesting. I wonder how to test the performance to really determine the value. Maybe look at time-to-render?


I do this as well. I break apps into "components" -- each component has its own CSS and JavaScript file. A component covers a logical domain, such as administrative functions, or a reporting piece.

Then I have application-level CSS and JS that contains everything generic. This is for a reasonably big app, so the overhead of having a single CSS/JS file is significant - particularly when supporting mobile devices.

The other useful thing is that I add a dasherized version of the controller name and the action to the body tag - as an id and a class respectively. So the body tag is something like <body id='my-controller' class='show'>

I also expose these as JavaScript globals CONTROLLER and ACTION (I also have LOCALE and COMPONENT).
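In layout terms, the body tag and globals described above might look something like this (a sketch; the layout file and escaping choices are assumptions, and `j` is Rails' escape_javascript helper):

```erb
<%# app/views/layouts/application.html.erb %>
<body id="<%= params[:controller].dasherize %>" class="<%= params[:action] %>">
  <script>
    // Globals for page-specific scripts to branch on
    var CONTROLLER = "<%= j params[:controller] %>";
    var ACTION     = "<%= j params[:action] %>";
  </script>
```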

These constructs are to be used sparingly -- you don't want to mash everything together too much -- but they're very handy when used right.


Yes I guess that would be the right metric. You'd need to look at average time to render over a number of deploys with both strategies. I'm pleased to see articles approaching this subject as there's definitely not enough solid discussion around this.

One other optimization you may want to talk about in your next article is to use a gem like the asset_sync gem to upload your assets to S3 or CloudFront (or similar) at compile time.
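For reference, a typical asset_sync setup looks roughly like this (an illustrative sketch following the gem's README conventions; verify the option names against the current docs):

```ruby
# config/initializers/asset_sync.rb
AssetSync.configure do |config|
  config.fog_provider          = 'AWS'
  config.aws_access_key_id     = ENV['AWS_ACCESS_KEY_ID']
  config.aws_secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
  config.fog_directory         = ENV['FOG_DIRECTORY'] # the S3 bucket name
end
```

The upload then runs as part of `rake assets:precompile` at deploy time.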


I'd like to look deeper into this option. I didn't mention it in the article, but @jo_liss suggested hosting all assets on CloudFront in production. She points out that you can set up CloudFront and get your own content-delivery network in five minutes. Once you have your own CDN, there's no advantage to using any 3rd-party JS host.


That's somewhat true. However, using Google does have an advantage over your own CDN -- it's more likely to already be cached in someone's browser.

On those occasions the hit to download jQuery (or something big like jQuery UI) is zero.

This is more a consideration for a landing page than a heavy-use application though.


> I also think it's a bad idea to enable asset compilation in production.

I don't see where the article recommended that. The article (and the Rails Guides) seem to be pre-compiling for production. This is the standard & recommended practice.


Yes, the article and all of us in the comment thread are recommending pre-compiling. There's a Rails configuration option that enables run-time compilation of uncompiled assets - this is referred to in both the Rails guide and the linked HN article. It is not recommended, for performance reasons.
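The option in question is `config.assets.compile`; in production you'd leave it at its default:

```ruby
# config/environments/production.rb
# Serve only precompiled assets; never compile on demand at run time.
config.assets.compile = false
```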


Hi Finbarr, I'm interested in learning more about this strategy. I've been using the Garber-Irish method (http://viget.com/inspire/extending-paul-irishs-comprehensive...) for running scripts based on controller/action, but everything is still served to every page. With your solution, only truly global scripts seem to be served. I'll try to throw up a test app on github to test out what you've done here, but do you have any existing app that uses this paradigm? I'd love to take a closer look.


Please do test it and post the results. It seems a real point of contention whether a single JS script can outperform page-specific scripts. The team behind the Rails asset pipeline has pushed the single-script approach. It'll take some testing to lay the debate to rest.


I think there are a number of factors which will affect the outcome, including:

- number of $(document).ready() callbacks that check whether the current file should run within the single application.js approach. This could grow to be an issue with much larger apps.

- how often the application.js file gets invalidated by a deploy (if it is every deploy this may be annoying for users if there's always a large performance penalty on the first download - especially for apps where the usage isn't regular).

- in the 2 scripts version it's very unlikely that the heavyweight application.js global file will get invalidated regularly, but it is likely that several of the smaller specific page js files will get invalidated with each deploy.

I guess the question is whether downloading several smaller files for each deploy beats downloading the whole application.js file for each deploy (network overhead being the likely bottleneck here).
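The guard described in the first point above can be factored into a small helper so that each ready callback costs only a string comparison (a hypothetical sketch; `runFor` and the id/class body-tag convention are assumptions from the comments above):

```javascript
// Run fn only when the body tag matches the given controller id and
// action class, per the <body id='my-controller' class='show'> convention.
function runFor(controller, action, body, fn) {
  var actions = body.className.split(/\s+/);
  if (body.id === controller && actions.indexOf(action) !== -1) {
    fn();
  }
}

// In the browser, inside a $(document).ready() callback:
//   runFor('my-controller', 'show', document.body, initShowPage);
```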


These are good points. I'll experiment. Thanks!


One more thing - if you concatenate all of your JavaScript files into a single application.js, the cache will be invalidated every time you make a change to any of your JavaScript files. All in all, I think this is a bad approach.



