Interesting read. I wrote the original compiler back in 2002/2003, but a lot changed by the time it was open sourced (including the confusing name -- I just called it a javascript compiler).
One detail this story gets wrong though is the claim that, "The Gmail team found that runtime JavaScript performance was almost irrelevant compared to download times." Runtime performance was actually way more important than download size and we put a lot of effort into making the JS fast (keep in mind that IE6 was the _best_ browser at the time). One of the key functions of the js compiler was inlining and dead-code removal so that we could keep the code readable without introducing any extra overhead.
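A minimal sketch of the kind of transformation being described, i.e. inlining plus dead-code removal so readable helpers cost nothing at runtime (hypothetical code, not actual Gmail source or real Closure Compiler output):

```javascript
// Before: a readable accessor, which on a pre-JIT engine like IE6's
// meant a real function call on every use.
function getSubject(msg) {
  return msg.subject;
}
function render(msg) {
  return "<b>" + getSubject(msg) + "</b>";
}

// After: roughly what an inlining pass produces -- the call is replaced
// with the property access, and the now-unused getSubject definition
// is deleted by dead-code removal.
function renderCompiled(msg) {
  return "<b>" + msg.subject + "</b>";
}
```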
Thanks for the correction, Paul (and for the great email client and JS Compiler!). I've added a note to the article.
The focus on inlining as a performance win makes a lot of sense. It's hard to get back into the pre-JIT IE6 mindset where every getter and setter came at a cost. By the time I used Closure Compiler years later this had gotten simplified to just "minification good". I remember search (where I worked) in particular was extremely concerned with shaving bytes off our JS bundles.
To be clear, minification was absolutely a key feature/motivation for the compiler. Runtime performance was more important than code size, but as usual the key to improving runtime performance is writing better code -- there's not much a compiler can do to fix slow code. For example, I wanted the inbox to render in less than 100ms, which required not only making the JS fast but also minimizing the number of DOM nodes by a variety of means (such as only having a single event handler for the entire inbox instead of one per active element).
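The single-event-handler trick is what's now usually called event delegation. A sketch of the core lookup, with hypothetical names since the real Gmail code isn't public -- the walk from the click target up to the enclosing row is written as a plain function so the idea is visible outside a browser:

```javascript
// One delegated click handler for the whole inbox instead of one
// listener per row: walk up from the event target until we find a
// node carrying a row id, or hit the inbox root.
// (Illustrative sketch; in a real DOM the id would live in an attribute.)
function findRowId(target, root) {
  var node = target;
  while (node && node !== root) {
    if (node.rowId) return node.rowId;
    node = node.parentNode;
  }
  return null;
}

// In a browser this would be wired up exactly once:
//   inboxEl.addEventListener("click", function (e) {
//     var id = findRowId(e.target, inboxEl);
//     if (id) openConversation(id);   // openConversation is hypothetical
//   });
```

With thousands of rows, this keeps both DOM setup cost and listener memory constant instead of linear in the number of rows.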
As others here have pointed out, JS was very much looked down upon by most people at Google, and there was a lot of resistance to our JS-heavy approach. One of their objections was that JS didn't have any tooling such as compilers, and therefore the language was "unscalable" and unmaintainable. Knocking down that objection was another of the motivations for writing the compiler, though honestly it was also just kind of fun.
I used Closure at Google after coming from a Java background. I always described it as "Closure puts the Java into JavaScript". The team I was working on also found several bugs where the dead-code removal stripped live code.
Now Closure (at Google) meant a couple of different things (by 2010+). First, it was the compiler. But second, it was a set of components, many UI-related. Fun fact: the Gmail team had written their own set of components (called Fava, IIRC), and those had a different component lifecycle so they weren't interoperable. All of this was the most Google thing ever.
IMHO Closure was never heavily pushed by Google. In fact, at the time, publicly at least, Google was very much pushing GWT (Google Web Toolkit) instead. For those unfamiliar, this is writing frontend code in Java that is transpiled to JavaScript. This was based on the very Google notion of both not understanding and essentially resenting JavaScript. It was never viewed as a "real" language. Then again, the C++ people didn't view Java as a real language either, so there was a hierarchy.
GWT obviously never went anywhere and there were several other JavaScript initiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.
But this idea of running the same code everywhere was really destructive, distracting and counter-productive. Most notably, Google runs on protobufs, and being a binary format, they don't work natively in JavaScript. Java API protobufs weren't compatible with GWT for many years. JS had a couple of encodings it tried to use. One was pblite, which basically used the protobuf tag numbers as array indices. Some Google protobufs had thousands of optional fields, so the wire format became:
[null,null,null,null,...many times over...,null,"foo"]
Not exactly efficient. Another used protobuf tag numbers as JSON object keys. I think this had other issues but I can't remember what.
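A rough sketch of the pblite idea described above -- tag numbers become array positions, so sparsely populated messages pay for every unset tag below the highest one (hypothetical encoder, not the real pblite library):

```javascript
// Encode a message (fields keyed by protobuf tag number) as a
// pblite-style array: the value for tag N lands at index N-1, and
// every unset tag in between becomes an explicit null.
function toPblite(fields, maxTag) {
  var arr = [];
  for (var tag = 1; tag <= maxTag; tag++) {
    arr.push(fields[tag] !== undefined ? fields[tag] : null);
  }
  return arr;
}

// Only two fields set, but the array still has 1000 slots:
var wire = toPblite({ 1: "id-123", 1000: "foo" }, 1000);
// wire is ["id-123", null, null, ..., null, "foo"]
```

Which is exactly how a message with a high-numbered field ends up as hundreds of nulls on the wire.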
Likewise, Google never focused on having a good baseline set of components. Around this time some Twitter engineers came out with Bootstrap, which became the new reset.css plus a component library and everything else kind of died.
Even Angular's big idea of two-way data binding came at a huge cost (component transclusion anyone?).
Google just never got the Web. The biggest product is essentially a text box. The way I always described it is "How do you know you're engineering if you're not over-engineering?" Google does some absolutely amazing technical infrastructure (in particular) but the Web? It just never seemed to be taken seriously or it was forced into an uncomfortable box.
Yes definitely, I also worked there during that time, and agree with the idea that Google didn't get JS. This is DESPITE coming out with 2 of the greatest JS applications ever -- Gmail and Google Maps -- which basically started "Web 2.0" in JS.
I always found that odd, and I do think it was cultural. At a certain point, low-level systems programming became valued more, and IMO it was emphasized/rewarded too much over products. I also agree that GWT seemed to be more "staffed" and popular than Closure compiler. There were a bunch of internal sites using GWT.
There was much more JS talent at Yahoo, Facebook, etc. -- and even eventually Microsoft! Shocking... since early Google basically leap-frogged Microsoft's Hotmail and, I think, maps with their JS-based products. Google Docs/Sheets/Slides was supposedly a very strategic JS-based product to compete with Microsoft.
I believe a lot of it had to do with the interview process, which was very uniform for all the time I was there. You could never really hire a JS specialist -- they had to be a generalist and answer systems programming questions. I think there's a sound logic in that (JS programmers need to understand systems performance), but I also think there's room for diversity on a team. People can learn different skills from each other; not everyone has to jump through the same hoops.
---
This also reminds me that I thought Alex Russell wrote something about Google engineering not getting the WEB! Not just JavaScript. It's not this, but maybe something similar:
I don't remember if it was an internal or external doc. I think it focused on Chrome side things.
But I remember thinking that too -- the C++ guys don't get the web. When I was in indexing, I remember the tech lead (Sitaram) encouraged the engineers to actually go buy a web hosting account and set up a web site!! Presumably because that would get them more in touch with web tech, and how web sites are structured.
So yeah, it seems really weird and ironic that the company that owns the biggest web apps and the most popular web browser has a lot of employees who don't value that tech.
---
Similarly, I have a rant about Google engineering not getting Python. The early engineers set up some pretty great Python infrastructure, and then it kind of rotted. There were arguably sound reasons for that, but I think it basically came back to bite the company with machine learning.
I've heard a bunch of complaints that the Tensorflow APIs are basically what a non-Python programmer would invent, and so PyTorch is more popular ... that's sort of second-hand, but I believe it, from what I know about the engineering culture.
A lot of it has to do with Blaze/Bazel, which is a great C++ build system, while users of every other language all find it deoptimizes their workflow (Java, Go, Python, JS, ...)
So basically I think in the early days there were people who understood JS (like paul) and understood Python, and wrote great code with them, but the eng culture shifted away from those languages.
It was definitely cultural. The engineering hierarchy was:
1. C++ engineers thought the only real language was C++
2. Java engineers thought the only real languages were C++ or Java
3. Python engineers either thought the only real languages were Python, C++ and Java, or thought the only real language was Python
At that time (I don't know about now), Google had a designation of frontend software engineer ("FE SWE"), and you'd see interview feedback where a particular interviewer would be neutral on a candidate and soft-reject them by explicitly stating they were maybe good enough to be an FE SWE, even though the official stance was that FE SWEs had to pass the regular SWE standard plus some extra.
Basically, anything JS/CSS/HTML related was very much looked down upon by many.
Blaze went through some growing pains. At one point it was a full Python interpreter, then an interpreter of a subset of Python, and ultimately not really Python at all. There was a time when you had to do things with genrules, which was an awful user experience and broke caching -- two reasons why they ultimately got rid of it.
But Blaze made sense because you had to be able to compile things beyond just Java, Python and (later) Go -- even though Java and Go in particular had better build systems for purely Java or Go codebases. It got better once there were tools for auto-generating Blaze config (ie the java_library build units).
Where Blaze was actually horrible was with protobufs. Auto-generated code wasn't stored in the repo (unlike at Facebook). There were protobuf versions (although, even by 2010, most things were protobuf version 2), but there were also API versions, and they weren't compatible.
So Java had API version 1 (mutable) and 2 (immutable) and if you needed to use some team's protobuf but they'd never updated to API v2, you'd either have to make everything v1 or do some horrible hacks or create your own build units for v2.
But I digress. Python essentially got supplanted by Go outside of DS/ML. The code review tool was originally written in Python (ie Mondrian) before being rewritten in (IIRC) GWT (ie Critique). For a very long time Critique lacked features Mondrian had.
Personally, I was always sympathetic to avoiding large Python code bases just for the lack of strict typing. You ended up having to write unit tests for spelling mistakes. I can't speak to Tensorflow vs PyTorch.
I suspect this institutional disdain for Python was probably a factor in GvR leaving to go join Dropbox.
You have good points overall, but I'd say AngularJS did have mass adoption at a time - it was a great way to build web applications compared to the alternatives. React just came by and did even better.
And Dart may not have much of a standing on its own, but Flutter is one of the most popular frameworks for creating cross-platform applications right now.
Maybe eventually someone realized this was horribly inefficient and added an extra step, but a series of commas in an array with nothing in between them isn't valid JSON.
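For reference, elided elements are valid in a JavaScript array literal (which `eval` would accept) but not in JSON, which is presumably why the wire format spelled out explicit nulls:

```javascript
// Valid JavaScript: a sparse array literal with holes at indexes 0 and 1.
var sparse = [, , "foo"];          // length 3

// Not valid JSON: JSON.parse rejects elided elements with a SyntaxError.
var threw = false;
try {
  JSON.parse('[,,"foo"]');
} catch (e) {
  threw = true;
}

// The explicit-null form is the valid JSON equivalent.
var ok = JSON.parse('[null,null,"foo"]');
```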
True, although GP was comparing Dart to UI frameworks (GWT, Angular and React):
> GWT obviously never went anywhere and there were several other Javascript intiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.