Interesting read. I wrote the original compiler back in 2002/2003, but a lot changed by the time it was open sourced (including the confusing name -- I just called it a javascript compiler).
One detail this story gets wrong though is the claim that, "The Gmail team found that runtime JavaScript performance was almost irrelevant compared to download times." Runtime performance was actually way more important than download size and we put a lot of effort into making the JS fast (keep in mind that IE6 was the _best_ browser at the time). One of the key functions of the js compiler was inlining and dead-code removal so that we could keep the code readable without introducing any extra overhead.
Thanks for the correction, Paul (and for the great email client and JS Compiler!). I've added a note to the article.
The focus on inlining as a performance win makes a lot of sense. It's hard to get back into the pre-JIT IE6 mindset where every getter and setter came at a cost. By the time I used Closure Compiler years later this had gotten simplified to just "minification good". I remember search (where I worked) in particular was extremely concerned with shaving bytes off our JS bundles.
To be clear, minification was absolutely a key feature/motivation for the compiler. Runtime performance was more important than code size, but as usual the key to improving runtime performance is writing better code -- there's not much a compiler can do to fix slow code. For example, I wanted the inbox to render in less than 100ms, which required not only making the JS fast but also minimizing the number of DOM nodes by a variety means (such as only having a single event handler for the entire inbox instead of one per active element).
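A modernized sketch of that single-handler idea (invented names, not actual Gmail code; IE6-era code would use attachEvent and walk parentNode manually):

    // One click listener for the whole inbox, instead of one per row.
    function openThread(threadId) { /* hypothetical: render the thread */ }

    document.getElementById('inbox').addEventListener('click', function (e) {
      // Walk up from the clicked node to the enclosing row.
      var row = e.target.closest('[data-thread-id]');
      if (row) openThread(row.getAttribute('data-thread-id'));
    });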
As others here have pointed out, JS was very much looked down upon by most people at Google, and there was a lot of resistance to our JS-heavy approach. One of their objections was that JS didn't have any tooling such as compilers, and therefore the language was "unscalable" and unmaintainable. Knocking down that objection was another of the motivations for writing the compiler, though honestly it was also just kind of fun.
I used Closure at Google after coming from a Java background. I always described it as "Closure puts the Java into JavaScript". The team I was working on also found several bugs where live code was removed by the dead-code removal.
Now Closure (at Google) meant a couple of different things (by 2010+). First, it was the compiler. But second, it was a set of components, many UI related. Fun fact: the Gmail team had written their own set of components (called Fava, IIRC) and those had a different component lifecycle, so they weren't interoperable. All of this was the most Google thing ever.
IMHO Closure was never heavily pushed by Google. In fact, at the time, publicly at least, Google was very much pushing GWT (Google Web Toolkit) instead. For those unfamiliar, that meant writing frontend code in Java that was transpiled to JavaScript. This was based on the very Google notion of both not understanding and essentially resenting JavaScript. It was never viewed as a "real" language. Then again, the C++ people didn't view Java as a real language either, so there was a hierarchy.
GWT obviously never went anywhere and there were several other JavaScript initiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.
But this idea of running the same code everywhere was really destructive, distracting and counter-productive. Most notably, Google runs on protobufs. Being a binary format, this doesn't work for JavaScript. Java API protobufs weren't compatible with GWT for many years. JS had a couple of encodings it tried to use. One was pblite, which basically used the protobuf tag numbers as array indices. Some Google protobufs had thousands of optional fields so the wire format became:
[null,null,null,null,...many times over...,null,"foo"]
Not exactly efficient. Another encoding used protobuf tag numbers as JSON object keys. I think this had other issues, but I can't remember what.
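For illustration, a sketch of the two encodings for a hypothetical two-field message (field numbers invented):

    // pblite: tag number N maps (roughly) to array index N-1, so every
    // unset tag in between becomes a null slot.
    var pblite = [42, null, null, /* ...~496 more nulls... */ null, "foo"];

    // Tag-keyed JSON: only populated fields are emitted, but the keys
    // are opaque tag numbers rather than field names.
    var tagKeyed = {"1": 42, "500": "foo"};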
Likewise, Google never focused on having a good baseline set of components. Around this time some Twitter engineers came out with Bootstrap, which became the new reset.css plus a component library and everything else kind of died.
Even Angular's big idea of two-way data binding came at a huge cost (component transclusion anyone?).
Google just never got the Web. The biggest product is essentially a text box. The way I always described it is "How do you know you're engineering if you're not over-engineering?" Google does some absolutely amazing technical infrastructure (in particular) but the Web? It just never seemed to be taken seriously or it was forced into an uncomfortable box.
Yes, definitely. I also worked there during that time, and agree with the idea that Google didn't get JS. This is DESPITE coming out with two of the greatest JS applications ever -- Gmail and Google Maps -- which basically started "Web 2.0" in JS.
I always found that odd, and I do think it was cultural. At a certain point, low-level systems programming became valued more, and IMO it was emphasized/rewarded too much over products. I also agree that GWT seemed to be more "staffed" and popular than Closure compiler. There were a bunch of internal sites using GWT.
There was much more JS talent at Yahoo, Facebook, etc. -- and even eventually Microsoft! Shocking... since early Google basically leap-frogged Microsoft's Hotmail and, I think, maps with their JS-based products. Google Docs/Sheets/Slides was supposedly a very strategic JS-based product to compete with Microsoft.
I believe a lot of it had to do with the interview process, which was very uniform for all the time I was there. You could never really hire a JS specialist -- they had to be a generalist and answer systems programming questions. I think there's a sound logic in that (JS programmers need to understand systems performance), but I also think there's room for diversity on a team. People can learn different skills from each other; not everyone has to jump through the same hoops
---
This also reminds me that I thought Alex Russell wrote something about Google engineering not getting the WEB! Not just JavaScript. It's not this, but maybe something similar:
I don't remember if it was an internal or external doc. I think it focused on Chrome side things.
But I remember thinking that too -- the C++ guys don't get the web. When I was in indexing, I remember the tech lead (Sitaram) encouraged the engineers to actually go buy a web hosting account and set up a web site!! Presumably because that would get them more in touch with web tech, and how web sites are structured.
So yeah it seems really weird and ironic that the company that owns the biggest web apps and the most popular web browser has a lot of employees who don't value that tech
---
Similarly, I have a rant about Google engineering not getting Python. The early engineers set up some pretty great Python infrastructure, and then it kind of rotted. There were arguably sound reasons for that, but I think it basically came back to bite the company with machine learning.
I've heard a bunch of complaints that the Tensorflow APIs are basically what a non-Python programmer would invent, and so PyTorch is more popular ... that's sort of second-hand, but I believe it, from what I know about the engineering culture.
A lot of it has to do with Blaze/Bazel, which is a great C++ build system, while users of every other language all find it deoptimizes their workflow (Java, Go, Python, JS, ...)
So basically I think in the early days there were people who understood JS (like paul) and understood Python, and wrote great code with them, but the eng culture shifted away from those languages.
It was definitely cultural. The engineering hierarchy was:
1. C++ engineers thought the only real language was C++
2. Java engineers thought the only real languages were C++ or Java
3. Python engineers thought the only real languages were Python, C++ or Java -- or, for some of them, only Python
At that time (I don't know about now), Google had a designation of frontend software engineer ("FE SWE"), and you'd see interview feedback where a particular interviewer would be neutral on a candidate and soft-reject them by explicitly stating they were maybe good enough to be an FE SWE, even though the official stance was that FE SWEs had to pass the regular SWE standard plus some extra.
Basically, anything JS/CSS/HTML related was very much looked down upon by many.
Blaze went through some growing pains. At one time it was a full Python interpreter, then an interpreter of a subset of Python, and ultimately not really Python at all. There was a time when you had to do things with genrules, which was an awful user experience and broke caching -- two reasons why they ultimately got rid of them.
But Blaze made sense because you had to be able to compile things beyond Java, Python and (later) Go, even though Java and Go in particular had better build systems for pure-Java and pure-Go code bases. It got better once there were tools for auto-generating Blaze config (ie the java_library build units).
Where Blaze was actually horrible was with protobufs. Auto-generated code wasn't stored in the repo (unlike at Facebook). There were protobuf versions (although, even by 2010, most things were protobuf version 2), but there were also API versions. And they weren't compatible.
So Java had API version 1 (mutable) and 2 (immutable) and if you needed to use some team's protobuf but they'd never updated to API v2, you'd either have to make everything v1 or do some horrible hacks or create your own build units for v2.
But I digress. Python essentially got supplanted by Go outside of DS/ML. The code review tool was originally written in Python (ie Mondrian) before being rewritten in (IIRC) GWT (ie Critique). For a very long time Critique lacked features Mondrian had.
Personally, I was always sympathetic to avoiding large Python code bases just for the lack of strict typing. You ended up having to write unit tests for spelling mistakes. I can't speak to Tensorflow vs PyTorch.
I suspect this institutional disdain for Python was probably a factor in GvR leaving to go join Dropbox.
You have good points overall, but I'd say AngularJS did have mass adoption for a time - it was a great way to build web applications compared to the alternatives. React just came by and did even better.
And Dart may not have much of a standing on its own, but Flutter is one of the most popular frameworks for creating cross-platform applications right now.
Maybe eventually someone realized this was horribly inefficient and added an extra step, but a series of commas in an array with nothing in between them isn't valid JSON.
True, although GP was comparing Dart to UI frameworks (GWT, Angular and React):
> GWT obviously never went anywhere and there were several other JavaScript initiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.
A few things, in my mind, held the Closure Compiler back from general adoption:
- First and foremost, it was a Google tool, not a tool meant for the masses, per se[0]. It could do great things, but it had the dependency on Closure Tools, which were not the easiest to use outside of Google (did anyone outside Google actually use the `goog.*` stuff?), and writing annotation files (externs) for libraries and such never caught on; they weren't shared in any meaningful way.
- Lacked evangelism. I imagine this had to do with the above. There weren't people from Google singing its benefits loud and proud everywhere.
- Docs weren't the best. Some parts are still broken (links and such).
- It didn't evolve. Closure could have been a pretty great bundler too, actually, but it didn't support ESM for a long time, and it was never clear how to get it to bundle things even when it did. I think you mostly had to declare each dependency file for it to work correctly, but I never got it to work 100% myself.
These are some of the things that I think ended up holding it back. It could have been a cool little ecosystem too. There was a sister project called Closure Stylesheets, for CSS, that was supposedly very good as well, though I think that's no longer maintained. I believe it was very similar to how people use PostCSS in terms of what it did.
[0]: Lots of projects use it to minify their JS but never really took advantage of the advanced functionality it can provide on top.
Beyond that the branding is also confusing. Is it Closure? The Closure Compiler? Closure Tools? Google Closure? I used it for years and I'm still not clear which is right.
I used the goog.* stuff once upon a time. I remember using it outside Google meant bundling it with a .jar built by a single guy that hadn't been updated in years.
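From memory, the style looked roughly like this (details approximate):

    goog.provide('myapp.ui.Badge');

    goog.require('goog.dom');

    /** @constructor */
    myapp.ui.Badge = function (count) {
      /** @type {number} */
      this.count_ = count;
    };

    myapp.ui.Badge.prototype.render = function () {
      return goog.dom.createDom('span', 'badge', String(this.count_));
    };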
Closure compiler was actually one of the biggest influences on the design of TypeScript, and even the early motivation for the approach that TypeScript took.
> There were many options already available, but none seemed to be resonating well with a broad enough section of the market. Internally at Microsoft, Script# was being used by some large teams. It let them use C# directly instead of JavaScript, but as a result, suffered from the kind of impedance mismatch you get when trying to stand at arms length from the runtime model you are really programming against. And there was Google’s Closure Compiler, which offered a rich type system embedded in comments inside JavaScript code to guide some advanced minification processes (and along the way, caught and reported type-related errors). And finally, this was the timeframe of a rapid ascendancy of CoffeeScript within the JavaScript ecosystem — becoming the first heavily used transpiled-to-JavaScript language and paving the way for transpilers in the JavaScript development workflow. (Aside — I often explained TypeScript in the early days using an analogy “CoffeeScript : TypeScript :: Ruby : C#/Java/C++”, often adding — “and there are 50x more C#/Java/C++ developers than Ruby developers :-)”)
> What we quickly discovered we wanted to offer was a “best of all worlds” at the intersection of these three — a language as close as possible to JavaScript semantics (like CoffeeScript) and syntax (like Closure Compiler) but able to offer typechecking and rich tooling (like Script#).
Excel Online, at least back in 2015, was written in Script#. The C# IDE support alone was miles ahead (this was pre-VS Code, pre-TypeScript days), but the biggest thing was the ability to author unit tests that leveraged lots of work from the then-dedicated testing organization. (Who wrote unit tests in JS 10 years ago, anyone?)
:raises-hand: - I was certainly writing unit tests in JS in 2012. Jasmine came out in 2010 and was already widely adopted.
Also, Jasmine wasn't the first test runner by a long shot (John Resig wrote one for jQuery before Jasmine was a thing and there were earlier ones too).
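For reference, an early Jasmine spec looked something like this (the function under test is a made-up stand-in):

    function countUnread(threads) {
      return threads.filter(function (t) { return !t.read; }).length;
    }

    describe('countUnread', function () {
      it('counts only unread threads', function () {
        expect(countUnread([{read: false}, {read: true}])).toBe(1);
      });
    });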
I've been using the Google Closure compiler for the last 8 years or so, in production. My SaaS depends on it, so you could say I make a living based on that tool. It's been working great, providing a significant performance increase to my ClojureScript code, along with a bunch of other benefits. I use advanced compilation mode.
I'm not sure why the author believes that "minification was a design goal". Minification is a side effect.
> "In the context of npm in 2023, this would be impossible. In most projects, at least 90+% of the lines of code are third-party. "
Well I guess that's why I avoid using npm and why I can maintain code for 8 years and still keep my sanity. I keep the use of third-party code to a minimum, carefully considering each addition, its cost over time, and the probability that it will be maintained.
As a side note, I think it's immature to use terms like "X won" or "Y is dead" in tech discussions.
Closure Compiler was an amazing tool, and still is.
When I was at Lucidchart, I helped convert the 600k line Closure codebase to TypeScript. [1] In fact, Lucidchart still uses Closure (for minification+library, not typechecking).
There are better approaches available in 2023, but Closure Compiler will always have special place in my heart.
+1. The comments bashing Closure in comparison to TypeScript feel like they're missing the timeline.
Closure brought modules, requires, compile-time type checking and optimizations to JavaScript years before TypeScript was on the scene. I wouldn't dare start a new project with Closure. But it was such a spiritual predecessor to what we have today in TypeScript, and has a special place in my heart too.
Closure was/is a technological marvel - years ahead of the competition in some aspects. Closure had dead-code removal years before "tree-shaking" became popular in the Javascript world - both the buzzword and the implementations.
The public-facing documentation was terrible, and it got very little evangelism.
Also: it allowed multiple teams to work on huge code bases without constantly stepping on each other's toes - it was basically impossible to write bigger JS apps before.
> Unless you've worked on frontend at Google at some point in the past 20 years, it's unlikely that you've ever encountered the Closure Compiler
Unless, of course, you were at the forefront of frontend stuff like a decade ago, when Closure Compiler was the absolute best for a long time at dead-code elimination and compressing JS artifacts. Or you're a developer using ClojureScript. Or...
> It occupied a similar niche to TypeScript, but TypeScript has absolutely, definitively won.
Huh? The Closure Compiler is so much more than just "JavaScript but with types". It also provides minification, a namespacing system, an additional standard library, and such.
> The Closure Compiler is so much more than just "JavaScript but with types". It also provides minification, a namespacing system, an additional standard library, and such.
That’s literally what the article is about if you read a few sentences more
Exactly. We built a large SPA back in 2008. Used Closure Compiler just for modularizing the code base and minification for production. It had some linting as well. The software is still running today; only last year did it move to parceljs.
Closure library/compiler was the only real way to write a large SPA back in 2012+.
You had access to an amazing compiler with a standard library that was like having every npm module you could ever need but written by Google and well maintained without the security risks.
We still use it, and the only real “wish” we have is maybe if the jsdoc “type system” could just be replaced by TypeScript while maintaining all the rest of the library/compiler.
Outside Google, ClojureScript (with a "j") used to depend on the Closure compiler (with an "s") - partly because the library that came with the compiler provided a Java-like API, which was convenient as the Clojure language was originally written targeting the JVM, and partly because the language had quite a large runtime and tree shaking was necessary for performance. You also had to write extern declarations if you used outside code, much like you have to manually declare types for untyped dependencies when using TypeScript.
Edit: ClojureScript still depends on Google Closure.
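For anyone who never wrote one, an externs file was just a set of bodiless declarations telling the compiler which external names it must not rename or remove; a minimal hypothetical example:

    // externs.js -- names declared here survive advanced compilation untouched.
    /** @const */
    var thirdPartyLib = {};

    /**
     * @param {string} selector
     * @return {Element}
     */
    thirdPartyLib.find = function (selector) {};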
I use ClojureScript (it's excellent, along with the rest of the Clojure ecosystem) and thus use the Closure compiler every day. It no longer requires manual extern declarations and is able to infer externs from your source.
I absolutely hate property renaming in the Closure compiler.
If you're not acutely aware of it and you do a `myObj['someProp']`, you're not going to get any in-code warnings, everything will work as you expect in development, and your tests and presubmit checks will pass. On multiple occasions for me, the problem surfaced in production, and there was no one around to tell me that property renaming was even a thing. I had to try to debug compiled code.
Worse still is that you don't even have to try to do a `myObj['someProp']` to get into trouble. There is very commonly used library code that will cause the same problem--you're just calling a method that tries to access a property on an object. But since it's abstracted, it's even harder to catch during a code review or to debug the problem.
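A distilled illustration of the pitfall (names invented):

    var state = {someProp: 1};

    // Dotted access is renamed consistently by ADVANCED_OPTIMIZATIONS,
    // so this still works after compilation (as something like a.b):
    console.log(state.someProp);

    // The string literal survives verbatim, but the property it named
    // has been renamed out from under it:
    console.log(state['someProp']); // undefined in the compiled build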
From what I recall, the Closure type system wasn't well suited to Javascript. I recall it as being quite Java-ish (nominal rather than structural). It's been a loooong time since I looked at it, though, so I might be misremembering.
TypeScript aims to make a type system that can model anything you can do in JS. Of course JS can modify its objects, at runtime, in literally any way imaginable, so the TS type system has to be Turing complete. Anything less powerful and you aren't going to be able to model what arbitrary JS code is doing.
My go-to example being a function that takes in an object with a bunch of camelCase parameterFields and makes them into snake_case parameter_fields.
Easy peasy in TypeScript, You can say, in a generic fashion, "pass an object with camelCase field names, and return the same object but the field names are snake_case".
This is incredibly useful when adapting between different systems!
Traditional OO systems fall flat on their face attempting such feats.
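A sketch of the idea with TypeScript template-literal types (4.1+; names are illustrative, not from any particular library):

    type SnakeCase<S extends string> =
      S extends `${infer Head}${infer Tail}`
        ? Head extends Lowercase<Head>
          ? `${Head}${SnakeCase<Tail>}`             // lowercase/digit: keep
          : `_${Lowercase<Head>}${SnakeCase<Tail>}` // uppercase: prefix '_'
        : S;

    type SnakeCaseKeys<T> = { [K in keyof T & string as SnakeCase<K>]: T[K] };

    type Wire = SnakeCaseKeys<{ parameterFields: string[]; userId: number }>;
    // => { parameter_fields: string[]; user_id: number }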
Funny, I actually wrote and maintain a library which, amongst other things, does almost the same thing as your contrived example, except it's camelCase to kebab-case. It's written in Flow though, because that was what I knew at the time.
> does almost the same thing as your contrived example
Sadly, not contrived!
Anyone using protocol buffers is actually familiar with this exact scenario. Not sure how the JS PB compiler handles things though, I never bothered to look under the covers.
Here is a random sample I found of both the types and the conversion code.
I don't know if its aim is necessarily to model anything you can do in JS. At the moment, TypeScript has zero understanding of prototypal inheritance and there hasn't been any visible progress in that direction.
Right, but if you can't give precise types to common coding patterns the type system isn't very useful. A lot of work in Typescript is about being able to type the kind of things people do in JS.
No. Idiomatic Closure code was extremely Java-like. That was the whole point. Google built it to abstract away the nasties of early JS for their huge stable of Java programmers at the time. Dedicated JS engineers weren't really a thing yet.
This brings back happy memories of the Closure Compiler and using ClojureScript. I built a product back in the day using these technologies. There was definitely a lot of innovation in the Clojure ecosystem. Also, I remember how confusing it was to talk about the difference between Closure and Clojure since they sound the same :)
Part of Closure Tools was also an early CSS transpiler called "Closure Stylesheets" that behaves very much like the CSS 'preprocessor' languages that followed it https://github.com/google/closure-stylesheets
It's easy to see this set of JavaScript and CSS tools as ahead of their time, and the first of the particular kind of toolset that followed in the years after it.
Steve Yegge once mentioned in a podcast that the tooling inside Google is somewhat like magic, and years ahead of anyone else outside.
It sucks a bit that it's all internal and unavailable to the outside. They're not even looking at commercializing it (presumably it's too dependent on google infrastructure for it to be sellable).
Unlike Amazon's ethos, where internal infrastructure is (at some point) made available to sell to the outside.
Typescript won because Microsoft shipped vscode with out of the box TS integration that just worked. This is touched on in the article but understated.
They used the exact same approach as Visual Studio and .NET, and managed to eat up the entire fragmented ecosystem in a matter of years.
I've never worked for Google, but have written a lot of Closure code.
Unfortunately, my opinions about Closure are colored by how it was used within my org -- which was far from ideal. We should have taken the best of Closure (the compiler, type checking, and maybe a few utility libraries) and ignored most of the rest.
When we switched all greenfield JS development from Closure to React around 2016, it resulted in a 2-3x speed-up in the development process. Not because the compiler stage added any significant time to our process, but because of poor design patterns that were implemented because Google was using them -- but perhaps not with the correct level of understanding about how to use them effectively.
Great article, I had forgotten that Closure existed.
Would have been interesting to include Facebook's Flow. That one seems pretty similar to TypeScript, but more oriented towards Facebook's specific needs than a genuinely open-source tool.
I'm not sure that it was just Facebook-specific needs. Aside from being faster (for their large code size), back in like 2015 Flow was also a better type checker than Typescript despite being a younger project. It made different design decisions, focusing on better correctness, at the expense of some convenience.
Typescript "won" because it was actually very loose, especially back then, especially with default configuration. It happily compiled heaps of incorrect code patterns without any warnings. Many JS devs had no prior experience with statically typed languages, and they saw it as an improvement over the status quo of plain JS, and didn't care enough about slightly better but newer and thus less popular alternatives like Flow. The frontend devs who expected soundness and correctness from a statically typed language were (and still are, by the looks of it) in a very small minority.
That is an interesting note (and news to me as the article author). Note that they used Closure in its "simple" mode, rather than "advanced", which is what the post is mainly talking about. I wonder if React still uses CC?
A footnote to this is that ClojureScript bet heavily on the Closure compiler at a time when, IIRC, it wasn't even open sourced. In particular, it was not just a bet on the technology but on the underlying methodology, so ClojureScript was designed to produce the kind of JS that the Closure compiler would find easy to optimise. A solid and clever design, but a bet on the wrong ecosystem.
Google has the curse that it can scale any problem across hundreds of machines. This isn't an environment that creates tools a single developer wants to run on their machine. Closure was always slower than the alternatives and not fun to use on your 6-year-old laptop to check your side project.
We still use the Closure compiler on one of our projects that originated about 10 years ago. It’s difficult (but not impossible) to debug with, but there’s no compelling reason to replace it yet. The code base has been completely rewritten over time with modern JS and amazingly the compiler works just fine.
Maybe in the not so distant future, TypeScript versus Closure Compiler won't matter and everyone will be compiling their favorite language to WebAssembly instead.
Google has a history of killing its own products and services. It is not wise to build on top of, or depend on, their tools. It's likely that you'll get rug-pulled next quarter.
Rust can "just" be run; as in, the developer can create a binary and run that. Or you can compile from source without too much trouble with "cargo build".
Java is a bit more complex, and was even more complex in the past. You had to download Java (which on Windows came with that annoying "Java updater" thing), and on Linux/BSD machines it couldn't "just" be installed via the package manager; you had to manually download it from the Sun website (at some point Canonical got a special agreement with Sun to allow this, though).
Later OpenJDK became mature enough to be used, which solved the entire problem, but that took quite a while.
This is why Java has traditionally not been huge in the open source world for writing applications.
You didn't need Ocaml to use Flow, and it's not like the average Typescript/Flow dev ever looks at the source code of Typescript/Flow, let alone contributes to it.
Flow died out because it focused on things that JS devs didn't care about – speed and correctness. The latter especially came at the expense of annoyance: Flow tended to throw errors when you used code patterns with unprovable correctness, whereas Typescript happily "compiled" such patterns without checking their correctness. That was a "good enough" improvement over plain Javascript, and it easily won, being a more established community, and with more support from Microsoft than Flow ever got from Facebook.
How does a programming language "win" against a code minifier? Typescript and Closure compiler solve entirely different problems, they are not in any way alternatives to each other.
It's not like TypeScript somehow solved code minification in a way that made Closure compiler redundant. It's still used where it makes sense (e.g. to compile Scala.js output, although other solutions like terser.js work too).
Closure does static analysis too; the fact that the types are shoved in comments, thus allowing the files to be used as-is in JS, is more of a technical nuance -- not different from Facebook's Flow, or JSDoc-driven TypeScript.
Ah! Right, I forgot that using JSDoc-style comments for static type checking was a thing back then. Nowadays we just feed Scala.js output to Closure compiler for it to optimize and minify, it would be too painful to write such comment-centric code manually. But we didn't have that many good options back then.
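For anyone who never saw it, Closure-style typing-in-comments looked roughly like this:

    /**
     * @param {string} name
     * @param {number=} opt_count Optional parameter, Closure-style.
     * @return {!Array<string>} Non-nullable array of strings.
     */
    function repeat(name, opt_count) {
      var out = [];
      for (var i = 0; i < (opt_count || 1); i++) out.push(name);
      return out;
    }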
Closure Compiler was way ahead of its time. I used it extensively at my first job. It was the only way to stay sane pre-ES5. It's still great for people who want to write JavaScript applications as if they were Java applications. But it lost because people hate writing Java applications.