Why does a new Rails app need 106 MB of JavaScript? (gist.github.com)
64 points by rileytg 37 days ago | 74 comments

Rails 6 by default includes webpack to bundle assets for production. Aside from webpack, several of the top 10 on your list are libraries that ensure a great user & developer experience across browsers:

- babel transpiles ES6/7+ JavaScript syntax to the least-common-denominator of the browsers you designate as supported by your application

- core-js provides implementations of features that are not supported in older browsers -- things like array methods that cannot simply be transpiled

- caniuse-lite is a database of supported browser features to determine what needs to be transpiled and when

This toolchain allows developers to write code with the latest language features (which usually results in simpler, more elegant code) while seamlessly supporting legacy browsers as needed. If you decide to drop support for an older browser, you can simply remove it from your browserslist instead of going through all your code to figure out which hacks you put in just for that browser. It takes a ton of cycles out of development googling "can I use feature X in browser Y".
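To make that concrete, here is a small sketch. The modern syntax below runs natively in current engines; for older targets listed in a project's browserslist config, Babel rewrites the syntax and core-js supplies missing runtime methods. The transpiled output shown in comments is approximate, not exact Babel output:

```javascript
// Optional chaining + nullish coalescing (ES2020):
const config = { server: { port: 3000 } };
const port = config.server?.port ?? 8080;
// For an older browser target, Babel emits roughly:
//   var _config$server;
//   var port = (_config$server = config.server) == null
//     ? undefined : _config$server.port;
//   port = port != null ? port : 8080;

// Array.prototype.flat (ES2019) -- a runtime method, so syntax
// transpilation alone can't provide it; that's what core-js is for:
const flat = [[1, 2], [3]].flat();

console.log(port, flat);
```

Dropping an old browser from browserslist means Babel simply stops emitting the workaround, with no changes to your source.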

Prior to Rails 6, asset precompilation was powered by Sprockets by default. Sprockets still bundled many analogous libraries, just hidden in ruby gems instead of node_modules. For example, if you want to use SASS then you need a library that compiles SASS to CSS -- whether it's node-sass on your list or the sassc gem.

As noted in other comments, very little of this code is actually loaded in the frontend by end users.

Regardless of whether you think all those components are necessary, there is still the question of whether the amount of space they require is justified. JavaScript is a high-level language, in fact one of the higher-level ones compared to C/C++; 1KB of JS should be able to do far more than 1KB of C or even binary.

Thus, 100MB of JS is astoundingly huge. It's roughly the same size as all the binaries in a full install of Windows 95, with all its features and drivers.

In the spirit of https://prog21.dadgum.com/116.html perhaps we need a list of "Things an empty Rails app is smaller than".

> It takes a ton of cycles out of development googling "can I use feature X in browser Y".

...or you could just avoid the trendchasing and use a minimal subset that you are sure will work everywhere, but that's a different discussion.

"1KB of JS should be able to do far more than 1KB of C or even binary"

This doesn't make sense and isn't a relevant comparison.

I completely disagree with the premise that a high-level language should be able to do substantially more work with the same amount of code as C. That is not the point of a high-level language, not even close. The point of a high-level language is simply to abstract away some of the more difficult problems inherent to programming closer to the metal.

And as far as languages go, JavaScript is not nearly so far removed from raw machine code as most interpreted languages.

100MB of JavaScript code is not “astoundingly huge” at all, considering the bulk of that is tooling. Compared to the tooling for many other commonly used C/C++ development environments, 100MB is pretty modest.

At $50 for a 1TB hard drive, that 100MB is costing you half a cent. It doesn’t get sent to users over the network so it doesn’t affect performance. Why does that matter again?

106MB of source code is perhaps five million lines of code, if written in a typical way. Some people use longer lines, or fewer lines that are just "{", but anyway.
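A back-of-the-envelope check of that estimate (the bytes-per-line figure is an assumption; real averages vary by style):

```javascript
// If an average line of JavaScript source (indentation, identifiers,
// newline) is ~20 bytes, 106 MB works out to roughly 5.6 million lines.
const totalBytes = 106 * 1024 * 1024;
const avgBytesPerLine = 20; // assumption
const lines = Math.round(totalBytes / avgBytesPerLine);
console.log(lines);
```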

Storing five million lines of code is cheap, sure, no question. But don't you feel curious about why you need that much code in the first place?

This comparison done by Google is really helpful for understanding all the work that Webpack (and related tools) do: https://bundlers.tooling.report/

Click through each test and read through some of the descriptions. They do a lot of advanced techniques that are important for web performance.

Those optimizations are another big part of why modern JavaScript build tools are so complicated. There's no other language where we care nearly as much about optimizing for code delivery. On the web, a frontend app should (hopefully) be around 100 KB of JavaScript, with good caching and CDN behavior -- compare that to other platforms, where 10 MB to 100 MB binaries are considered just fine.
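One of those techniques in miniature: because ES module imports and exports are static, a bundler like webpack or Rollup can prove an export is never referenced and drop it from the production bundle ("tree shaking"). A sketch with hypothetical helpers, inlined so it runs as a single file:

```javascript
// utils.js (hypothetical module) might export two helpers:
//   export const double = (n) => n * 2;
//   export const heavyUnusedHelper = () => { /* lots of code */ };
//
// app.js imports only one:
//   import { double } from './utils.js';
//
// The bundler can statically see heavyUnusedHelper is never used and
// omit it entirely. Inlined equivalent of what survives bundling:
const double = (n) => n * 2;
console.log(double(21));
```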

Thank you for the well thought out answer. This is why I asked.

This was brought to my attention while working on a new project. This is the first time I've done Rails 6 on Heroku. Out of the box this app gives me a ~360MB slug, which exceeds their soft limit of 300MB. Historically this hasn't been an issue for me, but I guess now with webpack I need to customize the buildpack to clean up node_modules after compilation.

No problem! FYI there may be something else bloating your slug. With the ruby buildpack, heroku by default will install node_modules to a shared cache directory and not include them in the slug artifact. For example I have a Rails app that has 250MB of node_modules locally but my slug on Heroku is only 93.4MB.

Good to know, I'll keep digging on the Heroku side.

Actually, I guess I was wrong about node_modules not being included in the slug -- it's just that the slug is compressed so node_modules don't take up anything close to their stat size. One common gotcha is that if you have large non-code files in your git repo (binaries/images/gzips/etc), those will end up in your slug and they don't compress well so they'll add a lot of bloat. If that's the case you can use a .slugignore file[0] to tell Heroku to exclude them. Good luck!

[0]: https://devcenter.heroku.com/articles/slug-compiler#ignoring...
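A `.slugignore` works like a `.gitignore` relative to the repo root, though it doesn't support negation patterns. An illustrative example (entries hypothetical):

```text
spec/
docs/
*.psd
*.mp4
```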

Looks like it is just the asset pipeline, so not anything that is loaded on the frontend.

Little bit click baity. A lot of other frameworks do the same thing.

> A lot of other frameworks do the same thing.

God that is an awful justification.

The heading is the opposite of clickbait. 106 MB is not exaggerated; the link provides proof showing that number. I get pretty tired of people handwaving away bloat these days. It's a real problem across programming languages and communities, and it doesn't just go away by pretending it doesn't exist. It goes away by people testing for it, making it a priority for software to be streamlined, and pruning bloat where it makes sense.

You sort of missed the point. The point isn't "there's 106mb of code that will make it into production, but it's ok because other frameworks do it" -- the point is "looking at the size of the node_modules directory is not remotely representative of the final size of the bundled code that will make it to production, and this is in fact the way many modern frameworks work."

Two concrete examples: first, node_modules contains any development modules, which won't make it into the final code at all. Second, most of these packages are installed into the node_modules directory with typing support, full un-minified code, etc etc. When bundling the app for deployment, much of that goes away.

The issue for me is nuanced. While it doesn't go to end users in production, it still ends up in my Heroku slug. This sharp increase (33% of my allowed slug size) was alarming.

Having been a Rails and React developer for a very long time, I'm generally aware of the value added by all these files. I bring up these concerns almost to say "have we added too much? is it time to think about trimming back?".

Still, it's 106MB which has to be audited, checked for licenses, ...

I don't think anyone's forcing you to use the rails CLI's "new" command to start a new rails project, if you're concerned about all of that and just don't want to deal with it... the docs explicitly say that this is a convenience method for getting started quickly with everything they think a modern developer would want. If you don't want that, that's a perfectly fine choice -- just start a new project from scratch, instead.

It's build tools. This is like complaining that Xcode is many gigabytes.

I never had to install a new copy of xcode for every project.

So the problem is not the size, but that you want to have a single cache of all the npm modules?

And, what about OS X, it takes a huge amount of space... and they install things I never use!

It's almost as if these things run on code!

These kinds of comments are against HN guidelines FYI.

You're right, I was not in the best light last evening. My apologies to the community.

Xcode is from a single vendor, whom I trust as much as I trust the operating system.

With Rails and npm, I probably trust Rails and see them as license-wise OK, but due to version requirements I might get "newer" versions of a package which haven't been vetted by the Rails devs, might use a different license, might do stuff I don't want, and might pull in even more new deps.

Then all you have to do is specify the specific version you want in your package.json, rather than a semver range, and you'll never get a version you haven't explicitly approved.
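For instance (package names and versions illustrative), in package.json:

```json
{
  "dependencies": {
    "webpack": "4.42.0",
    "turbolinks": "5.2.0"
  }
}
```

Here "4.42.0" pins that exact version, whereas the default caret range "^4.42.0" would allow any 4.x at or above it. A committed lockfile (package-lock.json or yarn.lock) gives similar reproducibility even with ranges, provided you review lockfile diffs.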

Of each of the dependencies Rails installs.

don't use an opinionated framework then?

I don't upload Xcode.app to Apple every time I submit an iOS app. I do an npm install on every build on Heroku.

Then maybe don't use Heroku and use something that can work with a finished build?

It's not complicated to set up a build container locally and ship only your final result to a service that can run it.

Heroku provides value in other ways for us. We run on a number of other solutions, most involve shipping a ~100mb image. This can be dramatically reduced in size with a slimmer base image, which we plan on doing eventually.

In this case, heroku handles a lot of things we otherwise need to do ourselves:

- DB + backups + DR
- Network security
- Network routing
- SSL certs
- Easy scaling of resources
- Access control to operations (redeploy, restart etc)
- Audits of operations, etc

We can do all these things, and under various compliance frameworks, outside of heroku; its just exhausting and expensive. This project we want to focus on the product so we opt for heroku as it was a few lines of bash we were all familiar with to get up and running nearly prod ready.

Just build locally and vendor the asset artefacts. That's how we did it in the "good old days". No containers in containers needed.

To be clear, you don't do "containers in containers", you build a container and you copy your assets from the other one. You then have a shippable artifact with all your dependencies, including your runtime, vendored into it.

I'm not a k8s guy, but it's a lot nicer than a tarball.
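That pattern, sketched as a multi-stage Dockerfile (base images, paths, and the copied directories are illustrative, not a complete Rails setup):

```dockerfile
# Build stage: full toolchain, node_modules and all
# (in practice you'd also install node and yarn in this image first)
FROM ruby:2.7 AS builder
WORKDIR /app
COPY . .
RUN bundle install \
 && yarn install \
 && bundle exec rails assets:precompile

# Runtime stage: copy gems and the compiled app, but not node_modules
FROM ruby:2.7-slim
WORKDIR /app
COPY --from=builder /usr/local/bundle /usr/local/bundle
COPY --from=builder /app/public /app/public
COPY --from=builder /app/app /app/app
COPY --from=builder /app/config /app/config
COPY --from=builder /app/Gemfile /app/Gemfile.lock /app/
# ...plus whatever else your app needs at runtime; node_modules stays behind
CMD ["bundle", "exec", "rails", "server", "-b", "0.0.0.0"]
```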

It's really clickbait. The "proof" provided includes all sorts of crap that the npm packages ship with.

If you want to show how much JS a new Rails project has, show what gets delivered to the frontend not all the trash in node_modules.

How is it bloat if 160MB of JavaScript will never be in the production version of the app?

I mean, the Basecamp team just published Hey.com with less than 50KB of JavaScript, and their dev build of Rails for that project must've exceeded 160MB in node_modules.

Rails 6 has become significantly more JavaScript intensive and has bought into the ecosystem. IMHO, it's a mistake and they were better off staying in their niche. Server side rendering + Turbolinks is a very compelling story and much simpler to get into than what Rails 6 is now.

JS in rails is entirely opt-in.

Opt-in means the default is without JS and you can enable it if you want. As shown in the gist, the situation is actually that by default all the JS is installed and you have to explicitly opt-out to prevent it from being installed.

When you create a new rails project, the tooling for JS is set up, but none of that gets used (Turbolinks being the only exception) unless you write or include JS code. The tooling does not affect the user experience one bit.

No. That's what it used to be in the old days, but not anymore.

I've been away from rails for a couple of years and recently came back to build a side project. The latest rails version shoves all kinds of JS stuff down your throat, you have to literally opt out.

The whole point of rails is "convention over configuration", and they have adopted the bloated NPM + Webpack ecosystem as their "convention".

For reference, a full install of Windows 95 is about 50MB.

There is something deeply wrong with this entire industry.

While true, that's compiled windows 95. The binary of actually-used js compiled to native code would be tiny.

But those are compiled binaries, not source code.

I guarantee you the source code for Windows 95 is vastly larger than the 106 MB we're talking about for Rails.

And considering the ridiculously large number of things Rails does that Windows 95 didn't... the sheer number of different technologies, standards, formats and versions it's able to interface with... I don't see what's wrong at all.

> I guarantee you the source code for Windows 95 is vastly larger than the 106 MB we're talking about for Rails.

I wouldn't be so sure. Win95 has a lot of pieces written in Asm (source > binary), mostly the low-level parts, but also a lot of C/C++ (binary > source).

> also a lot of C/C++ (binary > source)

Unless you're doing a debug build, lots of metaprogramming, or have a tiny source where linker overhead is larger - C source is going to be larger than the binary.

I mean, I know that for a small "hello world", you'll get an executable much larger than the source code.

But with a decent-sized program, full of header files and comments and config files and descriptive variable names and all the rest, there quickly comes a point where the executable is vastly smaller, no?

I really can't believe that the C/C++ source code for Windows (or Word or Photoshop or whatever) winds up compiling to a larger file size, can it?

It really depends. Small amount of source code doesn't necessarily mean a small binary if you have a ton of template instantiations.

How big are all the development libraries and tools that made that 50MB final install pack?

Right, what I love about this is that it completely ignores the installed gems because they're installed somewhere else?

  $ bundle --path vendor
  $ du -h -d 1 .
  64M ./vendor
After building the rails site you can delete node_modules because you only need the built bundles. Those shouldn't be very big at all.

When building container images, yes. However, on Heroku I'm not sure how to do this without custom buildpacks (e.g. https://elements.heroku.com/buildpacks/istrategylabs/heroku-...). Custom buildpacks reduce the value-add of Heroku for my small team.

Ultimately my solution will likely be build the assets in CI and deliver via CDN or check into the repo on some release branch.

Edit: I intentionally omitted the gem sizes as that was a prior cost of using rails

I feel your pain, we also had this problem, and resorted to exactly the thing you've done.

Personally I always felt that this was a shortcoming of Heroku's buildpacks, which seem unwilling to support modern Rails.

Webpack-based projects are heavy. A Rails app comes with a Webpack-based Javascript scaffold out-of-the-box.

I wonder if something like Rollup or NCC could be used to create a single-file Javascript dependency in the general case that is only used for vanilla Webpacker builds, and relying on the individual packages could be something left for when you need to customize the Webpacker build.

Judging by some of the Hey developments, it's possible we could see a much more lightweight JS integration in Rails in the future:


I've worked on the whole loop of ¡server side! to ¡client side! and now back to ¡server side!. Using Hey for a little, I'm really impressed with the "feel" of it. I really hope they will write about how they do this and even contribute stuff upstream.

Going back to primarily server side was about long term projects that aren't very active. I have a rails app I wrote in 2011, I occasionally make fixes, update gems etc. No stress.

I have a client side app from 2016 that I eventually gave up on trying to get the build system running again (grunt?). I just edited the minified build and told the customer this was the last major edit. Only copy changes now.

This isn't an isolated incident. I frequently have to relearn an old project b/c everything changes so fast in js. First we were on prototype, then we went to jQuery (2011?). Homegrown UI components lacked the browser support we needed (IE5/6?) so we added jQuery UI. Our code was a spaghetti mess without components, so eventually we moved to React (2015?).

Those changes took 10+ years. The rest is all in the last 2.5-3.5 years. And personally, I find it way more complicated than previous js ecosystem changes. OK, now we really want to use modern js, so let's add a transpiler. Now our component state is getting out of hand. Let's use reflux. Oh no, don't use reflux, use redux. Oh, but now you need redux-forms. Remember that build system we had? Now everyone uses grunt. Oh no, everyone uses webpack now. Are you still using redux? Use react hooks. And while I value this ecosystem, I'm nervous that we won't "solve" our problems and will just keep writing new partial solutions.

It's a bit story-telly (rant?) but the gist is Rails with server-side rendering doesn't come with huge mental cost over time. Client side can. We still love and use the client-side apps, just only for complex "apps" like an interactive document viewer where we are actively developing year over year.

That can be answered in one phrase, coined by Django:

> Batteries included.

It's ultimately up to Rails how they want to do things. It's OK to complain about it but it's a shame when it discourages people from creating and maintaining open source projects.

Other uses in the Python ecosystem seem to predate Django (https://github.com/python/peps/commit/53474baa0790912b8a02f1...) although I'd certainly be interested in more context.

Yeah I wonder if they arrived at the phrase independently. I think Jacob Kaplan-Moss was inspired by Python's standard library.

It seems the owner of that PEP could probably provide more context. He wrote "Meditations on the Zen of Python" recently which means he's still thinking about pythonic software design:



I always appreciated the design philosophy behind rails, how it was always about "getting things done", and not about being the latest cool tech of the day, but this introduction of the webpack fad into the "rails convention" feels just like going backwards.

FWIW 106mb will fit ~11 million chars.

I use many of the features I get with Rails webpack. I also use create-react-app (which I think is even bigger). I think they are valuable to the development process.

I would think we can achieve most if not all of these goals with less code. I don't know exactly how we would get there or if there are any projects underway. This seems like a problem that can't be fixed in one place but must be addressed holistically. This being a decentralized ecosystem of packages, the only way I can see that happening is by publishing stats and creating a culture of reduction. For example: when contributing to one of the presumably hundreds if not thousands of packages that roll up into CRA or Rails, you should see whether that overall increases or decreases size. You may have introduced a huge tree of new dependencies thinking you're adding one package to your tiny package.

Just a rough idea, does someone more experienced in the community have ideas on what can be or is being done?

I think there are two primary reasons why JavaScript projects tend to have much larger dependency trees than in ruby or other ecosystems.

1. The JavaScript standard library is much more sparse than what you get out-of-the-box with ruby or many other languages. The existence of lodash is evidence of this in itself. The infamous left-pad fiasco would never have happened if JavaScript's native String had an equivalent to ruby's String#rjust. This part of the problem is significantly improving with time -- for example, node.js 8.0 and up do have a native string pad function, padStart. However, especially for tooling libraries, using the latest native JS features means dropping support for semi-recent versions of node, and there's also generally a lot of inertia.
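The left-pad case side by side, with a simplified stand-in for what the standalone package had to provide (this is an illustration, not the actual left-pad source):

```javascript
// Native since ES2017 (node 8+): no dependency needed.
const padded = '5'.padStart(3, '0');
console.log(padded); // "005"

// Roughly what a userland helper had to do before the language caught up:
function leftPad(str, len, ch = ' ') {
  str = String(str);
  while (str.length < len) str = ch + str;
  return str;
}
console.log(leftPad('5', 3, '0')); // "005"
```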

2. npm and yarn allow you to install multiple versions of the same package. If I have dependencies A and B, which both require different major versions of dependency C, npm and yarn will both happily install multiple copies of C. With ruby/bundler this is an error and you have to work out how to tweak your dependencies so they're all happy with the same version range. I suspect this architectural choice is largely a consequence of #1 -- especially in the earlier days of npm, getting all your dependencies on the same version of a commonly used helper library would have been nightmarish. And there are benefits to this approach; I have certainly spent long hours trying to upgrade ruby gems with version conflicts. But it results in a lot more bloat almost by default.
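What that duplication looks like on disk, with the hypothetical packages A, B, and C from above (npm hoists one version of C to the top level and nests the conflicting one):

```text
node_modules/
├── A/
│   └── node_modules/
│       └── C/          (C@2.x, as required by A)
├── B/                  (requires C@^1.0, satisfied by the hoisted copy)
└── C/                  (C@1.x, hoisted to the top level)
```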

To answer your original question -- I don't know that anything in particular is intentionally being done. I do think we're heading in the right direction and things will be much better in a few years as packages slowly move towards new native language features rather than helper libraries. I also just don't think this is of critical importance at the end of the day. If you add up the brain cycles that I've ever spent worrying about 100MB of hard drive space, it's almost certainly less than what I've spent typing this comment :)

> will fit ~11 million chars.

Do you mean something other than "characters", because 106M is almost exactly 106 million characters in UTF-8 (ignoring multibyte characters which are likely to be a tiny fraction of source code)?

This should also be done for create-react-app

"It's not frontend" isn't an answer. Why should a text-packager clock in at 100mb?

Because modern web development is a friggin' minefield of browser support matrices, front-end development languages with experimental features, and aging specifications that require the use of preprocessors for simple things like variables and functions. Almost everything in that 106 MB is a preprocessor of some type that's designed to make developers' lives easier, normalize code to a widely-supported standard, and/or optimize said code to deliver the smallest possible bundle to end users.

Why can't it be 50mb? Why shouldn't it be 200mb?

I don't understand the point you're trying to make. That's just how much disk space those tools and their dependencies take. You might as well ask why iron is as dense as it is.

Webpack is not an elementary particle. It's a man-made construction. Describing its disk size as an intrinsic aspect is a category mistake.

Webpack is no doubt a big program with a lot of features. But I have several even bigger programs on my PC that clock in at just a few MB, so saying it has a lot of features doesn't answer OP's original question; it simply substitutes an easier question and then answers with a "the size is what the size is" tautology.

When a native app links a static library, it performs dead-code elimination. It might have hundreds of megabytes of dependencies, but the overwhelming majority is not copied to the application. When I make a new rails project I don't download the text-source of Firefox or MySQL and all their downstream dependencies into every project root. There's no reason webpack has to do this, except because of arbitrary workflows imposed by NPM.

If no user-complaint about NPM is ever quantified or interrogated, then NPM will fade into irrelevancy.

I just tried it for a new Angular app. I chose to include Angular Routing (because everybody will need that) and then to use CSS as my style sheet because hey why not. End result was a folder that was 273MB in size, with the node modules being basically all of it.

I guess he hasn't seen the size of a create-react-app project.

Can't you just pass "--no-webpacker" to the "rails new" command to skip the npm packages? I haven't tried it myself.

Yes. Makes a huge difference. I try to stick with defaults as much as I can. I find that helps with long term maintainability. In this case, I don't know if I will (as I chose in the past to always use pg and rspec).

node_modules is insane. JavaScript needs an STL. Then we need an AI to rewrite all of these modules to use the STL.

Basically several compilers are bundled (node-sass, webpack and Babel) so that you don't have to download them as a user, and their version is stable for the app. You can use a tool like pnpm to only ever have one copy of them on your system, but these aren't deps that negatively impact the app.

Great fan of Rails but man, that’s a lot of bloat. Last time I used Rails it only came with jQuery. Live-reloading was an exotic thing, quite difficult to setup. Now it seems to be enabled by default, explaining some of the stuff in there.

It's also 99% server-side preprocessors. Only a single package in that list (lodash) is something that could be actually loaded via browser, and even that would only be a small percentage of the full package size. It's also not required for a new rails project, it's simply a convenience workflow that integrates many of the tools used in modern best practices.

Your point is valid that a large portion is server side only (which is still a concern for my use case). But note a lot is omitted from my gist, there are 768 packages.
