
Closure Compiler: a tool for making JavaScript download and run faster - JacobJans
https://developers.google.com/closure/compiler/
======
dahart
The title of this submission omits the best part: Closure provides static type
checking. That is more valuable to me than the dead code removal &
minification.

I just discovered an amazing and nearly undocumented feature of the Closure
Compiler: the --module flag.

One of the most inconvenient parts of using closure is having to compile each
page separately. That's what the Closure docs say to do:
[https://developers.google.com/closure/compiler/docs/api-tutorial3#separately](https://developers.google.com/closure/compiler/docs/api-tutorial3#separately)

That means if you have 10 pages, you have 10 compile passes, even if 95% of
your code is shared. But, it turns out you can run a compile pass and output
multiple compiled .js files simultaneously! This has cut my build times by
10x.

So, anyone with long build times due to multiple separate builds, poke around
and find out how to use --module!
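
The flag syntax isn't obvious from the docs, so here's a rough sketch of one
compile pass emitting three files (the file names, module names, and output
prefix are all invented for illustration; `--module` takes
`name:number-of-js-files[:dependencies]`, and the `--js` inputs must be listed
in module order):

```shell
# One pass, three outputs. Code used by both pages can be hoisted
# into the shared module automatically.
java -jar compiler.jar \
  --js shared.js --js page1.js --js page2.js \
  --module shared:1 \
  --module page1:1:shared \
  --module page2:1:shared \
  --module_output_path_prefix build/
# Emits build/shared.js, build/page1.js, build/page2.js.
```

Each page then loads `build/shared.js` first, followed by its own module.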

~~~
sgrove
I think it's likely that long-term, the approach FB took with Flow (or more
general Abstract Interpretation methods) should be able to handle more cases
with less effort on the developer's side (albeit with increased computational
costs).

The modules/"code motion" feature is a bit strange at first glance, but it's
definitely a huge boon for building debuggers/admin interfaces/etc. that you
don't want included in your main application. David Nolen wrote about this
earlier this year when support started landing in ClojureScript:
[http://swannodette.github.io/2015/04/07/in-stillness-movement/](http://swannodette.github.io/2015/04/07/in-stillness-movement/)

Edit: wrt Flow, I'm specifically referring to FB's approach to retrofitting
type checking into a largely dynamic language. Thanks to swannodette for
pointing out the ambiguity.

~~~
swannodette
It's not clear what you're referring to wrt "the approach FB took with Flow"
and Abstract Interpretation. Type checking? Dead code elimination? In any
case, the Google Closure Compiler also leverages Abstract Interpretation.

~~~
sgrove
Thanks for pointing out the ambiguity.

Closure would probably be the most interesting (or close to the most
interesting) project to work on inside Google proper, for me personally. That
said, Google's approach is very much the opposite of FB's, in that Google
expects the vast majority of code to be written for it. FB is dealing with a
legacy codebase and has fewer engineers, so it has to strike a different set
of tradeoffs - less up-front work to integrate their tooling, for potentially
less benefit than something like Closure can deliver. It's interesting to see
the two different approaches.

Anyway, this thread is about Closure, which enables ClojureScript, which is my
go-to language these days. The developers working on the compiler and the
stdlib (which is amazing) should feel proud of all the people they've
enabled.

------
swannodette
Google Closure is not only a useful command-line tool; it's also an
incredibly powerful framework for manipulating, analyzing, checking, and
producing JavaScript in a dizzying number of ways. And now, thanks to Java 8's
Nashorn, you can skip the Java and script it quite productively with JS:
[https://gist.github.com/swannodette/aad077de18309a08cff3](https://gist.github.com/swannodette/aad077de18309a08cff3)

------
abritinthebay
Having used both this and Uglify... I can't recommend this.

It's slow. Like... really slow to build. Uglify takes ~10 seconds on my code,
whereas CC took around 10 minutes. Plus, the performance benefits - while real
- provide very little useful speedup that the V8/whoever JIT won't deliver
anyhow. Even in tight loops.

It's VERY cool, but will add minutes to your deploy/compile time for very
little practical benefit. If you're doing a ton of data processing on the
client (WHY??) then I guess it might be useful... maybe.

That said - I'm glad it exists. It's a technological benefit even if it's not
that useful in practice. Those who compare it to C++ optimization flags are
being hyperbolic, however: C++ isn't a JIT-compiled language, so it's apples
to oranges.

~~~
cromwellian
I work on Gmail/Inbox and not even our codebase takes 10 minutes. Closure
Compiler does not need to be constantly run in optimized mode during
development. Type-checking only mode is faster, and engineers at Google
typically develop code in uncompiled mode, and let the continuous integration
server run the fully optimized build + integration tests.

Secondly, this isn't strictly about speeding up runtime performance; it's
about startup latency. The biggest use of JavaScript compilers is shrinking
download size. The effect of even a 10% code-size reduction is readily
apparent in 95th-percentile latency graphs. On the mobile web, it's even more
of a benefit.

Third, on a large codebase, like Gmail, Docs, Maps, etc the type checking
provided by Closure is invaluable.

Fourth, Closure provided a good module system for JS far longer than any
alternative; like GWT, it's over 10 years old. That module system allows
cross-module code motion optimizations, which let you structure your code for
maximum readability and productivity but ship it down the wire so that only
the code needed immediately is retained, with "dead code" moved into
late-loaded modules.

Have a look at [https://photos.google.com/](https://photos.google.com/). Most
of the files loaded are tiny. Why? They're globally optimized, uglified, and
dead-stripped together, but code is moved around between fragments depending
on when, not if, the code is needed.

Soap Box Time: the way your message goes over the top reminds me of a frequent
irritation I have with part of the JS community; I guess I'd call it "tools
derangement syndrome": people who chafe at any imposed structure, be it
optional types, syntactic-sugar classes in ES6, IDEs, or build processes,
dismissing the benefits or overplaying the downsides.

That's fine if you don't have a lot of code, but if you have a big enough
project that Closure would take a long time to optimize, you are exactly the
kind of scale of project that needs a type checking globally optimizing
compiler.

The Closure Compiler is extremely useful in practice. Without it, Gmail, Maps,
Docs, et al. would be far larger applications that consume more resources.
It's simply the best JavaScript optimizer on the market, by far.

~~~
abritinthebay
> I work on Gmail/Inbox

Hi there, love the work you guys do. For perspective: I used to be the Closure
cheerleader in our company. You don't have to sell me on it, I know it has
very good features.

> not even our codebase takes 10 minutes

It's probably better than ours. Not going to argue that. I think if our
codebase wasn't a giant legacy beast it would perform better. It doesn't
though :(

> engineers at Google typically develop code in uncompiled mode

Well, we did that too, but when we had to bundle everything and compile... it
sucked. We managed to get it down to about ~3 minutes with less advanced
options, but at that point Uglify was on par with it size-wise, so... _shrugs_

We took development time over a minor (basically insignificant) startup
latency improvement.

> on a large codebase [..] the type checking provided by Closure is
> invaluable.

I don't disagree with this, but I find it best to separate the two (for
example, with something like Flow) rather than have the whole bundle/compile
step fail. But it's a good feature.

> Closure provided a good module system for JS for far longer than any
> alternative

True, but if you're using ES6 you get a better one now so I can't really say
this is a _plus_ anymore. If anything it's bloat now.

> The way your message goes over the top reminds me of a frequent irritation I
> have with part of the JS community; I guess I'd call it "tools derangement
> syndrome": people who chafe at any imposed structure, be it optional types,
> syntactic-sugar classes in ES6, IDEs, or build processes, dismissing the
> benefits or overplaying the downsides.

That's not what I'm doing. I'm saying _in my experience_ and _on our codebase_
it hasn't been worth using. I'm sure it will be for many people, but I can't
recommend it right now. That's a personal perspective, sure; I never claimed
it wasn't.

Ironically I'd say you're doing the exact thing you're arguing against -
getting massively riled up over criticism of your favorite tool chain.

> That's fine if you don't have a lot of code,

Yeah, I do. A lot. Like... a lot. Like... over 3,600,000 LOC. It's just it's
clearly not the kind of code and/or project that CC is fast at optimizing
_right now_. That's on the project: it's very legacy and bloated tbh. I'm
working to fix that as we speak ;)

> Closure compiler is extremely practically useful.

For projects it's useful for, yes - that's a tautology. For my project,
however, it wasn't at all. For projects that really just want minification and
obfuscation, it's also total overkill.

I'm not saying it's bad, GOD NO! Not at all, just that it was - for the large
codebase we have - really not the best choice. That may change as our project
gets cleaner, better organized, and less legacy. Right now though we're not
there.

I'm not attacking CC's existence; I was providing an alternative perspective
to the love-fest in here, where I've found it lacking. Blind devotion isn't
good for any project. It's a good project, but it doesn't help _everyone_.

~~~
cromwellian
If you just want minification/obfuscation (not dead code stripping), sure, CC
is overkill. But if you've got 3.6M LOC shipping to consumers and not a
B2B/intranet app, I can't believe you wouldn't care about an extra 10-20% of
dead code stripped; that's a non-trivial improvement.

Also, have you tried it recently? AFAIK, there was performance work done on it
over the past year or so, speeding up the type checker and other parts of the
compiler.

My view is that it's generally better to spend the time at compile time. Even
if a fully optimized build takes 10 minutes, it's not a big deal. Even in
uncompiled mode, your integration tests on a large app presumably take minutes
to run anyway, and with such a large app you may have a Q/A process as well.
That is, your releases will be gated by other factors, not compile times, so a
few extra minutes to save (100k download) * (number of users) is a big benefit
for your user base at the cost of a few minutes of your time.

I would say that if you have any deferred loading, CC's cross-module code
motion is a feature that all of the other existing tools lack, and it can get
you far, far more than a 10% reduction in your initial load, depending on how
your app is written. Generally, the larger the app, the more likely transitive
dead code will be pulled into the initial download without something like CC.

~~~
fixermark
Since we're reasoning about a concrete case (abritinthebay's codebase):
wouldn't the amount of code-stripping benefit be directly proportional to how
much unused code the codebase has?

It's possible that in abritinthebay's scenario, they simply haven't crammed in
so many JavaScript libraries that the code stripping can find 10-20% unused
code.

(I'm also having a hard time avoiding calling "the code stripper yanks 10-20%
of my code" an indicator that one's choice of libraries is wastefully large,
but that's a separate thread of debate).

~~~
cromwellian
It's unavoidable in a large project unless your build dependencies are down to
the method level. Typically library targets don't supply a single method; they
supply a file with a number of methods, and not all of them are used.

But even in the case where you don't get dead stripping, you'll still get
deferred code motion. And even if you don't get deferred code motion, you'll
still get better renaming, because with type analysis, you can know which
properties are disjoint from others.
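
To make the renaming point concrete, here's a small sketch (the type names and
values are invented; the "after" behavior is described in comments because
it's illustrative, not actual compiler output):

```javascript
// Two unrelated annotated types that happen to share a property name.
/** @constructor */
function Album() { this.title = 'OK Computer'; }

/** @constructor */
function Chapter() { this.title = 'Prologue'; }

// Without type analysis, a renamer must give every `title` in the program
// the same short name. With it, the compiler can prove Album and Chapter
// never flow into the same access site, so each `title` can be renamed
// independently (e.g. to `a` and `b`), without breaking either call site.
var albumTitle = new Album().title;
var chapterTitle = new Chapter().title;
```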

At some point the argument gets into "I don't need a C compiler, because I
already write fast, efficient ASM code" territory. If your code base is
perfectly gardened, OK, you don't need any compiler tools. The statistical
likelihood of that situation remaining stable as your project ages and adds
more developers approaches zero, IMHO.

------
Drakim
How can it determine dead code in a language as dynamic as JavaScript?

Is it limited to obvious stuff like things after a "return;" statement that
cannot run? Because I don't see how it could remove unused functions when you
can invoke them in such indirect ways: myobj['f'+'oo']();

~~~
mythz
Have a look at the advanced compilation option:
[https://developers.google.com/closure/compiler/docs/api-tutorial3?hl=en](https://developers.google.com/closure/compiler/docs/api-tutorial3?hl=en)

Basically, when enabled, it will rewrite normal property access, e.g.
`myObj.foo`, to something like `a.b`, but it won't shorten string-literal
access: `myObj['foo']` just becomes `a.foo`, which is what you would use for
any APIs you want to export.

For a real-world example, I use string literals for declaring jquip's public
API:

    p['removeAttr'] = function(name){
        return this['each'](function(){
            if (this.nodeType == 1) this.removeAttribute(name);
        });
    };

Which Closure's advanced mode rewrites to:

    q.removeAttr=function(a){return this.each(function(){this.removeAttribute(a)})};


Original:
[https://github.com/mythz/jquip/blob/master/dist/jquip.all.js](https://github.com/mythz/jquip/blob/master/dist/jquip.all.js)

Compiled:
[https://github.com/mythz/jquip/blob/master/dist/jquip.all.closure-advanced.js](https://github.com/mythz/jquip/blob/master/dist/jquip.all.closure-advanced.js)

~~~
sanderjd
How did it know it could leave out the `this.nodeType` check?

~~~
tomjen3
If it can prove that it will always be 1 (i.e., by code not shown here), the
check will be removed.

~~~
sanderjd
Yeah, I was curious how it could prove that in this specific case. Very neat,
in any case.

------
adrianh
I've been using Closure Compiler with ADVANCED_OPTIMIZATIONS (which does the
dead-code removal) for soundslice.com for several years. It is truly awesome!

Check out 39:30 in my 37signals talk at
[http://37signals.com/talks/soundslice](http://37signals.com/talks/soundslice)
to find out more about it.

The downside is that, in order for the dead-code elimination to work properly,
you need to make sure to "export" things that aren't explicitly called in your
JS module. For example, if your JavaScript module just provides some functions
that are called by a web page, you'll need to make sure those function calls
are "seen" by the compiler so that it doesn't delete them. It just takes a bit
of time and thinking to set this up; I believe it originally took me a day to
do so for Soundslice.
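
A minimal sketch of that export step (the `initPlayer` name is invented; in a
browser build you'd typically assign to `window`, but `globalThis` keeps the
sketch runnable anywhere): property access through a string literal is never
renamed, so the page can keep calling the function under its original name
even after the compiler renames everything internally.

```javascript
// Invented entry point that a web page calls directly.
function initPlayer(elementId) {
  return { id: elementId, ready: true };
}

// Bracket access with a string literal survives ADVANCED_OPTIMIZATIONS:
// the compiler may rename `initPlayer` internally, but this assignment
// pins the public name, so the function isn't dead-stripped either.
globalThis['initPlayer'] = initPlayer;

// What the page's own script would then do:
var player = globalThis['initPlayer']('player');
```

With the Closure Library, `goog.exportSymbol('initPlayer', initPlayer)` does
essentially the same assignment.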

------
Scarbutt
Why is the Closure Library not as popular as the alternatives (e.g. jQuery)?
It looks like a solid library, used by Gmail, Google Docs, and other Google
apps. I'm guessing it's the Java requirement for the compiler.

~~~
masklinn
Because really taking advantage of it (enabling advanced compilation) requires
using a fairly specific subset of the language; otherwise it will DCE half the
project. Integration with "regular" JavaScript libraries can also be
annoying.

ClojureScript uses Closure as an optimisation backend, since the primary cljs
compiler is already in java.

~~~
chrisoakman
FYI - I'm fairly certain that CLJS does not use Google Closure because it is
"already in java", but rather because of its maturity and awesome features
like DCE.

[https://github.com/clojure/clojurescript/wiki/Rationale#google-leads-the-way](https://github.com/clojure/clojurescript/wiki/Rationale#google-leads-the-way)

~~~
masklinn
The two are not mutually exclusive. There are good reasons to use Closure, but
would integrating Closure and the tooling it requires even have been
considered had the cljs compiler been in anything but Java?

How many alt.js languages implemented in !java use Closure?

------
kruhft
I keep trying to use Closure on my projects, but it doesn't compile jQuery to
working code. I always get errors after concatenating it with my sources,
which always brings me back to Uglify.

~~~
tantalor
[https://code.google.com/p/closure-compiler/wiki/ExternsForCommonLibraries#jQuery](https://code.google.com/p/closure-compiler/wiki/ExternsForCommonLibraries#jQuery)

------
paulddraper
I have used the Closure Compiler for years (and contributed too!).

It is really a fantastic tool (along with the Closure Library and other
members of the Closure family) for anyone that values maintainable, solid JS.

I can understand why it's not more popular; it feels a lot like Java --
industrial strength, non-flashy, and perhaps a little boring. It's also a
little hard to set up for a real project (the tutorials could be better).

It has been around since the creation of Gmail, and will be around to stay for
years to come. (Though it has improved substantially over the years.)

------
eltaco
Babel is working on minification [1] and dead-code elimination [2] as well now.

[1]: [https://github.com/babel/babel/issues/1828](https://github.com/babel/babel/issues/1828)
[2]: [https://github.com/babel-plugins/babel-plugin-dead-code-elimination](https://github.com/babel-plugins/babel-plugin-dead-code-elimination)

------
DenisM
Anyone want to compare this with TypeScript?

I'm using Visual Studio for the backend, in case it matters.

------
sa_su_ke_hx
With Haxe you can have the same effect, but with:

- better dead code elimination, thanks to the static type system
- code generation with macro functions
- inlining of functions, constructors, and objects
- a static code analyzer

all of this for every Haxe target (JavaScript, Flash, PHP, Python, Java, C#,
C++, Neko, and work-in-progress Lua).

~~~
klibertp
But, for what it's worth, without "seamless JS interop" - and without Haxe
_being_ JS.

For me these are of no concern, and I'd probably use Haxe had I the right
project for it, but for most JS-and-only-JS people it's a deal breaker.

------
pkmiec
how does this compare to uglifier?

~~~
abritinthebay
Different beasts. A direct comparison is unfair to either of them.

Uglify does minification and dead code removal. That's really it.

Closure does a lot of type checking and code _rewriting_, and has a very
specific subset of JS that you'll have to use if you _really_ want to get the
best performance out of it... _and_ it does minification and dead code
removal.

If you just want minification/dead-code removal, then Uglify is waaay faster,
JavaScript-native, and almost as small (105-120% of the size of
Closure-compressed code, depending on the code). It also doesn't use the JVM,
which can make dependency management/ops simpler if you don't already have
Java in your stack. YMMV.

If the advanced stuff sounds like something you'd want/need (if you have a
large project and don't mind writing to the Closure optimizations) then
Closure is certainly worth looking at. It's very good at what it does.

Choose the right tool for the job and all that.

