
The cost of small modules - okket
https://nolanlawson.com/2016/08/15/the-cost-of-small-modules/
======
doublerebel
The performance data is great to have, but gives me more reassurance than
concern. Between browserify and closure compiler, we're looking at a 15ms to
40ms difference until we get well over 1000 modules. Even complex SPA apps
I've written come in nowhere near 1000 modules, a few hundred at most.

Interestingly the performance difference between mobile and desktop on a fast
connection was negligible.

Browserify and webpack are really developer friendly, so it's an easy business
decision to sacrifice 15ms in overall load time and instead pre-hydrate the
first page or use another trick to provide responsiveness while modules load.

I think front end dev process can still use plenty of improvement, but my
primary takeaway here is to be conscious of the tools being used and the
dependencies they import. It doesn't need to be a complex issue.

~~~
asimuvPR
A few hundred modules? Could you talk more about that?

~~~
jahewson
"Modules" refers to individual source files, not npm packages. A single npm
package can contain many source files, each of which is a separate CommonJS
module.

Most npm modules are small, so I'd expect in the average case that the
majority of modules are actually your own app code.

------
smrq
This is really interesting. I think there's a lot of room for improvement in
the tooling, though, and my gut instinct is that theoretically,
Browserified/Webpacked code can be as fast if not faster than other compiled
methods. The reason for this is that the bundler has more structural
information about the code it's working with. More information theoretically
equals more opportunities for optimization.

Not saying those optimizations will be easy. I'm marginally familiar with both
the Browserify and Webpack codebases, and I certainly wouldn't want to be the
one to implement any of this. But, assuming all the benchmarks in this post
prove to be reproducible (and I'm expecting them to be), I hope that those
more familiar with the tooling will be able to build a world where we get to
have both the user experience and developer experience we want.

~~~
jahewson
> the bundler has more structural information about the code it's working with

For CommonJS modules this is only true to a very limited extent, as the export
binding process can run arbitrary code which can't be analysed (it's Turing
complete).

ES6 modules, on the other hand, are static and can be extensively analysed
without difficulty. This makes, for example, tree-shaking pretty easy. Rollup
is an ES6 module bundler which is doing this already. The future's bright.
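
The contrast can be sketched in a few lines (a hypothetical example; the
names are made up). A CommonJS module can build its exports at runtime, so a
bundler can't know the module's final shape without executing it:

```javascript
// A stand-in for the `module` object CommonJS gives each file.
const fakeModule = { exports: {} };

// Exports built at runtime -- opaque to static analysis:
if (Date.now() > 0) {                  // a condition the bundler can't evaluate
  fakeModule.exports.greet = () => 'hello';
}
const key = 'dyn' + 'amic';            // a computed export name
fakeModule.exports[key] = 42;

console.log(Object.keys(fakeModule.exports));  // [ 'greet', 'dynamic' ]

// An ES6 module, by contrast, declares its exports statically:
//   export function greet() { return 'hello'; }
// so a tool like Rollup can see every export without running the code,
// and drop the unused ones (tree-shaking).
```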

------
saskurambo
In Haxe, with its static type system, dead code elimination (DCE), and
function inlining, we don't have this problem when using Haxe libraries. The
libraries can be really big or little, but DCE includes only the classes and
methods actually used. The same can be done with Scala.js and ClojureScript
using the Google Closure compiler.

------
simonlc
I've been using rollupify in my projects for a few months now with no issues,
but I didn't know it didn't work on npm modules. It would be interesting to
see benchmarks of rollupify vs rollup on its own to see if it's worth changing
bundlers.

~~~
nolanl
(rollupify author here.) It works if you configure Rollup to use a plugin like
[https://github.com/rollup/rollup-plugin-node-resolve](https://github.com/rollup/rollup-plugin-node-resolve),
and it works best if the third-party packages also expose jsnext:main or the
new "module" field in package.json.

Overall though there are still lots of edge cases that aren't well-covered
without plugins (such as globals like process.nextTick etc.). Some details
here:
[https://github.com/rollup/rollup/issues/552](https://github.com/rollup/rollup/issues/552)
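
For reference, the setup described above might look roughly like this (a
sketch based on the plugin's documented options at the time, not a drop-in
config; paths and output format are made up):

```javascript
// rollup.config.js -- hedged sketch of using rollup-plugin-node-resolve
import resolve from 'rollup-plugin-node-resolve';

export default {
  entry: 'src/main.js',    // hypothetical entry point
  dest: 'dist/bundle.js',  // hypothetical output path
  format: 'iife',
  plugins: [
    resolve({
      jsnext: true,  // prefer a package's jsnext:main field when present
      main: true     // otherwise fall back to the ordinary main field
    })
  ]
};
```

Packages that only ship CommonJS would still need something like
rollup-plugin-commonjs on top of this.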

------
AWildDHHAppears
In pure functional programming, which is most of what I do, there are zillions
of "small modules" and everything seems to work just fine...

~~~
brandonbloom
The problem isn't (necessarily) small modules, it's the _dynamics_ of the
CommonJS (and similar) module systems. By conflating fully general,
extensible, dynamically typed objects with both namespaces and modules, the
semantics can't be constrained enough to perform certain kinds of both intra-
and inter-module optimizations at compile time.

By contrast, the Google Closure compiler's module system (also for JavaScript)
does not suffer from this problem, but in tradeoff, disallows many types of
runtime metaprogramming, such as merging modules with Object.assign or
similar.
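
As a minimal sketch of the kind of runtime metaprogramming in question (the
objects below stand in for hypothetical require() results):

```javascript
// In CommonJS, a module's exports are just a mutable object, so two
// "modules" can be merged into one namespace at runtime:
const mathUtils = { add: (a, b) => a + b };          // stand-in for require('./math')
const strUtils = { upper: (s) => s.toUpperCase() };  // stand-in for require('./str')

const utils = Object.assign({}, mathUtils, strUtils); // merge at runtime

console.log(utils.add(2, 3));    // 5
console.log(utils.upper('ok'));  // OK

// A Closure-style module system forbids this kind of merging, which is
// exactly what lets the compiler reason about module shape ahead of time.
```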

------
dukerutledge
Now lets see some charts about build times.

------
z3t4
I think it's a common misconception that code disappears once you break it
down into modules.

------
carsongross
Wait till people figure out the cost of managing and coordinating all these
microservices...

~~~
andybak
Yep. And the cycle will continue...

Which makes me wonder how one decides where and when to get off the treadmill.
Sadly I think it's a factor of age as much as technical merit. I hit the point
where I felt that we'd reached perfection with Postgres to do relational plus
the ability to go schemaless when it made sense, a nice MVC framework in a
sane language such as Django or Rails, a smattering of client side magic where
required using good old jQuery and finally something like PJAX or Turbolinks
to give you that satisfying 'no page reload' feeling.

Get off my lawn...

~~~
lj3
I saw a brilliant graphic a while back, but can't find it now. It shows 4
stages of development methodology in a big loop. It goes...

    Spaghetti code -> MVC -> DI -> advanced framework

Personally, I embrace the suck and get off the roundabout at 'spaghetti code'.
I define things as straightforwardly as I can, as a sequence of instructions,
and abstract as little and as usefully as possible. Things that get repeated go
into a function. Whole systems that get used more than once go into their own
module. If I find myself copying and pasting between projects, it becomes its
own library. If a library I'm using doesn't exactly fit my use case, I put it
away and write a new one from scratch.

That last one was especially hard for me. It's so tempting to spend hours or
even days on google searching for just the right library somebody else wrote
that will fit your use case. Don't do it. Write it yourself. It'll be quicker
and you'll learn more.

~~~
xweb
"...It'll be quicker and you'll learn more...." \- Not to be snarky, but yeah
- it'll be quicker to write up front, but you'll learn more...from finding and
debugging all the edge cases the library authors may have already dealt with.
And there goes the time you "saved" up front.

~~~
lj3
"may" being the key word here. The library you choose "may" have already dealt
with those edge cases. Or you may spend hours or days debugging the god damned
thing. In my experience in javascript development, the latter is far more
likely.

------
fibo
Interesting benchmarks; in particular, they invite one to try Rollup.

