Hacker News | paul's comments

Agreed! Wonderful museum.

They also have some original codices along with translations that are interesting to look at https://photos.app.goo.gl/zeo3Hn2Q8v81cidX9


Interesting read. I wrote the original compiler back in 2002/2003, but a lot changed by the time it was open sourced (including the confusing name -- I just called it a javascript compiler).

One detail this story gets wrong though is the claim that, "The Gmail team found that runtime JavaScript performance was almost irrelevant compared to download times." Runtime performance was actually way more important than download size and we put a lot of effort into making the JS fast (keep in mind that IE6 was the _best_ browser at the time). One of the key functions of the js compiler was inlining and dead-code removal so that we could keep the code readable without introducing any extra overhead.
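For a rough illustration of what inlining plus dead-code removal buys you (hypothetical code, not actual compiler output): a trivial accessor costs a real function call on every use in a pre-JIT engine like IE6's, so the compiler substitutes its body at the call site and drops the now-unused function:

```javascript
// Hypothetical input: readable code with a tiny helper that a
// pre-JIT engine would pay a function-call cost for on every use.
function getSubject(thread) {
  return thread.subject;
}

function renderRow(thread) {
  return '<div>' + getSubject(thread) + '</div>';
}

// What an inlining + dead-code pass would effectively emit:
// getSubject's body is substituted at the call site, and the
// helper itself disappears from the output entirely.
function renderRowCompiled(thread) {
  return '<div>' + thread.subject + '</div>';
}
```

Both versions behave identically; the point is that the source stays readable while the shipped code avoids the call overhead.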


Thanks for the correction, Paul (and for the great email client and JS Compiler!). I've added a note to the article.

The focus on inlining as a performance win makes a lot of sense. It's hard to get back into the pre-JIT IE6 mindset where every getter and setter came at a cost. By the time I used Closure Compiler years later this had gotten simplified to just "minification good". I remember search (where I worked) in particular was extremely concerned with shaving bytes off our JS bundles.


To be clear, minification was absolutely a key feature/motivation for the compiler. Runtime performance was more important than code size, but as usual the key to improving runtime performance is writing better code -- there's not much a compiler can do to fix slow code. For example, I wanted the inbox to render in less than 100ms, which required not only making the JS fast but also minimizing the number of DOM nodes by a variety of means (such as only having a single event handler for the entire inbox instead of one per active element).
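That single-handler approach is classic event delegation. A DOM-free sketch of the idea (the names and object shapes here are mine for illustration, not Gmail's actual code -- `parent` stands in for the DOM's `parentNode`):

```javascript
// One dispatcher for the whole inbox: instead of attaching a
// handler to every row, clicks bubble up to a single root
// handler that routes based on the target element's id.
var inboxActions = {
  'row-42': function () { return 'open thread 42'; },
  'star-42': function () { return 'star thread 42'; }
};

// The lone event handler: walk up from the event target until
// we find an element id we know how to handle.
function handleInboxClick(event) {
  var node = event.target;
  while (node) {
    var action = inboxActions[node.id];
    if (action) return action();
    node = node.parent || null;
  }
  return null; // click landed on inert chrome
}
```

With thousands of messages, this means one handler registration instead of thousands, which keeps both setup time and memory down.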

As others here have pointed out, JS was very much looked down upon by most people at Google, and there was a lot of resistance to our JS-heavy approach. One of their objections was that JS didn't have any tooling such as compilers, and therefore the language was "unscalable" and unmaintainable. Knocking down that objection was another of the motivations for writing the compiler, though honestly it was also just kind of fun.


I used Closure at Google after coming from a Java background. I always described it as "Closure puts the Java into JavaScript". The team I was working on also found several bugs where the dead-code removal pass stripped code that was actually live.

Now Closure (at Google) meant a couple of different things (by 2010+). First it was the compiler. But second it was a set of components, many UI related. Fun fact: the Gmail team had written their own set of components (called Fava, IIRC) and those had a different component lifecycle, so they weren't interoperable. All of this was the most Google thing ever.

IMHO Closure was never heavily pushed by Google. In fact, at the time, publicly at least, Google was very much pushing GWT (Google Web Toolkit) instead. For those unfamiliar, this meant writing code in Java that was transpiled to JavaScript for frontend code. This was based on the very Google notion of both not understanding and essentially resenting JavaScript. It was never viewed as a "real" language. Then again, the C++ people didn't view Java as a real language either, so there was a hierarchy.

GWT obviously never went anywhere, and there were several other JavaScript initiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.

But this idea of running the same code everywhere was really destructive, distracting and counter-productive. Most notably, Google runs on protobufs. Being a binary format, they don't work natively in JavaScript. Java API protobufs weren't compatible with GWT for many years. JS had a couple of encodings it tried to use. One was pblite, which basically used the protobuf tag numbers as array indices. Some Google protobufs had thousands of optional fields, so the wire format became:

    [null,null,null,null,...many times over...,null,"foo"]
Not exactly efficient. Another used protobuf tag numbers as JSON object keys. I think this had other issues but I can't remember what.
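A minimal sketch of pblite-style encoding as described above (illustrative only, not Google's actual implementation): each field's tag number becomes its array index, so a message with a single high-numbered field pads everything before it with nulls.

```javascript
// Encode a map of {tagNumber: value} into a pblite-style array.
// Tag numbers start at 1, so tag N lands at array index N - 1;
// every unset tag up to maxTag becomes an explicit null.
function toPblite(fields, maxTag) {
  var arr = [];
  for (var tag = 1; tag <= maxTag; tag++) {
    arr.push(fields.hasOwnProperty(tag) ? fields[tag] : null);
  }
  return arr;
}
```

For example, `toPblite({2000: 'foo'}, 2000)` produces 1999 nulls followed by `"foo"` -- exactly the bloat described above.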

Likewise, Google never focused on having a good baseline set of components. Around this time some Twitter engineers came out with Bootstrap, which became the new reset.css plus a component library and everything else kind of died.

Even Angular's big idea of two-way data binding came at a huge cost (component transclusion anyone?).

Google just never got the Web. The biggest product is essentially a text box. The way I always described it is "How do you know you're engineering if you're not over-engineering?" Google does some absolutely amazing technical infrastructure (in particular) but the Web? It just never seemed to be taken seriously or it was forced into an uncomfortable box.


The whole time I read this article I kept thinking there was some write it in Java and compile to JS angle that wasn't being mentioned. GWT.


Yes definitely, I also worked there during that time, and agree with the idea that Google didn't get JS. This is DESPITE coming out with 2 of the greatest JS applications ever -- Gmail and Google Maps -- which basically started "Web 2.0" in JS.

I always found that odd, and I do think it was cultural. At a certain point, low-level systems programming became valued more, and IMO it was emphasized/rewarded too much over products. I also agree that GWT seemed to be more "staffed" and popular than Closure compiler. There were a bunch of internal sites using GWT.

There was much more JS talent at Yahoo, Facebook, etc. -- and even eventually Microsoft! Shocking... since early Google basically leap-frogged Microsoft's hotmail and I think maps with their JS-based products. Google Docs/Sheets/Slides was supposedly a very strategic JS-based product to compete with Microsoft.

I believe a lot of it had to do with the interview process, which was very uniform for all the time I was there. You could never really hire a JS specialist -- they had to be a generalist and answer systems programming questions. I think there's a sound logic in that (JS programmers need to understand systems performance), but I also think there's room for diversity on a team. People can learn different skills from each other; not everyone has to jump through the same hoops.

---

This also reminds me that I thought Alex Russell wrote something about Google engineering not getting the WEB! Not just JavaScript. It's not this, but maybe something similar:

https://changelog.com/jsparty/263

I don't remember if it was an internal or external doc. I think it focused on Chrome side things.

But I remember thinking that too -- the C++ guys don't get the web. When I was in indexing, I remember the tech lead (Sitaram) encouraged the engineers to actually go buy a web hosting account and set up a web site! Presumably because that would get them more in touch with web tech, and how web sites are structured.

So yeah, it seems really weird and ironic that the company that owns the biggest web apps and the most popular web browser has a lot of employees who don't value that tech.

---

Similarly, I have a rant about Google engineering not getting Python. The early engineers set up some pretty great Python infrastructure, and then it kind of rotted. There were arguably sound reasons for that, but I think it basically came back to bite the company with machine learning.

I've heard a bunch of complaints that the Tensorflow APIs are basically what a non-Python programmer would invent, and so PyTorch is more popular ... that's sort of second-hand, but I believe it, from what I know about the engineering culture.

A lot of it has to do with Blaze/Bazel, which is a great C++ build system, while users of every other language all find it deoptimizes their workflow (Java, Go, Python, JS, ...)

So basically I think in the early days there were people who understood JS (like paul) and understood Python, and wrote great code with them, but the eng culture shifted away from those languages.


It was definitely cultural. The engineering hierarchy was:

1. C++ engineers thought the only real language was C++

2. Java engineers thought the only real languages were C++ or Java

3. Python engineers thought the only real languages were Python, C++ or Java (though some held that only Python counted)

At that time (I don't know about now), Google had a designation of frontend software engineer ("FE SWE"), and you'd see interview feedback where an interviewer who was neutral on a candidate would soft-reject them by explicitly stating they were maybe good enough to be an FE SWE, even though the official stance was that FE SWEs had to pass the regular SWE standard plus some extra.

Basically, anything JS/CSS/HTML related was very much looked down upon by many.

Blaze went through some growing pains. At one point it was a full Python interpreter, then an interpreter for a subset of Python, and ultimately not really Python at all. There was a time when you had to do things with genrules, which was an awful user experience and broke caching -- two reasons why they ultimately got rid of it.

But Blaze made sense because you had to be able to compile things outside of Java, Python and (later) Go, even though Java and Go in particular had better build systems for pure-Java and pure-Go code bases. It got better once there were tools for auto-generating Blaze config (ie the java_library build units).

Where Blaze was horrible was actually with protobufs. Auto-generated code wasn't stored in the repo (unlike at Facebook). There were protobuf versions (although, even by 2010, most things were protobuf version 2), but there were also API versions. And they weren't compatible.

So Java had API version 1 (mutable) and 2 (immutable) and if you needed to use some team's protobuf but they'd never updated to API v2, you'd either have to make everything v1 or do some horrible hacks or create your own build units for v2.

But I digress. Python essentially got supplanted by Go outside of DS/ML. The code review tool was originally written in Python (ie Mondrian) before being rewritten in (IIRC) GWT (ie Critique). For a very long time Critique lacked features Mondrian had.

Personally, I was always sympathetic to avoiding large Python code bases just for the lack of strict typing. You ended up having to write unit tests for spelling mistakes. I can't speak to Tensorflow vs PyTorch.

I suspect this institutional disdain for Python was probably a factor in GvR leaving to go join Dropbox.


You have good points overall, but I'd say AngularJS did have mass adoption for a time - it was a great way to build web applications compared to the alternatives. React just came by and did even better.

And Dart may not have much of a standing on its own, but Flutter is one of the most popular frameworks for creating cross-platform applications right now.


> Some Google protobufs had thousands of optional fields so the wire format became:
>
>     [null,null,null,null,...many times over...,null,"foo"]

In pblite, they are serialized as `[,,,,...many times over...,,"foo"]`. Just comma, no "null".


    > JSON.parse('[null,null,null]')
    > (3) [null, null, null]
    > JSON.parse('[,,,]')
    Uncaught SyntaxError...
Maybe eventually someone realized this was horribly inefficient and added an extra step, but a series of commas in an array with nothing in between them isn't valid JSON.


Dart is alive and well in Flutter: https://flutter.dev/


It is, but it is no longer marketed as a general replacement for JavaScript.


True, although GP was comparing Dart to UI frameworks (GWT, Angular and React):

> GWT obviously never went anywhere and there were several other JavaScript initiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.


It’s both a novel virus and a novel vaccine. Without clinical trials we really don’t know if the boosters are helping or hurting.

We should be doing clinical trials on flu shots as well. New drugs, even if they are only slightly different, still require clinical trials. Why is hacking the immune system not subject to the same scrutiny?


Show your math. Unless those nukes are very small, your claim seems way off.


According to this source [1], flights departing GVA generated 1.3 million tons of CO2 in 2018. According to this source [2], jet fuel generates 3.16 kg CO2 per kg of fuel. Multiplying by the approximate fuel density, we get 2.57 kg CO2 per liter. We can therefore estimate that GVA consumes approximately 500 million (1.3 billion kg CO2 / 2.57 kg CO2 per liter) liters of fuel per year. Jet fuel contains about 35 MJ of energy per liter according to [3], so that's about 17.5 PJ. If the process to convert electricity to jet fuel is 50% efficient, that's 35 PJ. That is equivalent to a 1.1 GW reactor running at 100% capacity. At the global average capacity factor of 80%, that's about 1.4 GW required. Half a dozen is probably a pretty large overestimate. Alternatively, this would require around 6 GW of solar, although 6 GW of solar is probably quite a bit cheaper than 1.1 GW of nuclear power.
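For anyone who wants to check it, the arithmetic above works out like this (same figures as cited in the comment):

```javascript
// Estimate of the nuclear capacity needed to synthesize GVA's
// jet fuel, using the figures cited above.
var co2PerYearKg = 1.3e9;     // CO2 from flights departing GVA, 2018
var co2PerLiterKg = 2.57;     // 3.16 kg CO2/kg fuel x ~0.81 kg/L density
var litersPerYear = co2PerYearKg / co2PerLiterKg;      // ~506 million L

var joulesPerLiter = 35e6;    // ~35 MJ of energy per liter of jet fuel
var fuelEnergyJ = litersPerYear * joulesPerLiter;      // ~17.7 PJ

var conversionEfficiency = 0.5;                        // electricity -> fuel
var electricityJ = fuelEnergyJ / conversionEfficiency; // ~35 PJ

var secondsPerYear = 365.25 * 24 * 3600;
var avgPowerGW = electricityJ / secondsPerYear / 1e9;  // ~1.1 GW
```

Running the numbers gives roughly 1.1 GW of continuous power, matching the estimate in the comment.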



agreed, 1.1 GW seems reasonable for GVA. thanks!


My back-of-the-envelope calculation is that 1 GW for 1 year is 30 PJ. Jet fuel has 42 MJ/kg with a density of 0.8 kg/L, for a total of around 900 million liters of fuel at 100% efficiency. An A321 holds around 30000 L, which comes out to about 30000 flights' worth of fuel from one reactor. GVA had around 200000 flights in 2018, meaning about 6 1 GW reactors' equivalent of fuel used (obviously not all flights would be fully loaded, but I don't know what a normal load is).


Another quick calculation, in addition to the one from tfussell:

Google tells me [0] that total fuel consumption by commercial airlines in 2019 was 95 billion gallons. One gallon of fuel has around 33 kWh of energy.

Power capacity needed to produce that amount of energy in a year is around 358 GW.

Total world electricity production in 2020 was around 3000 GW-year [1].

[0] https://www.google.com/search?q=total+yearly+aircraft+fuel+u...

[1] https://www.google.com/search?q=total+world+electricity+prod...


Comparing global fuel consumption with global electricity production is a good approach. It’s clearly substantial, but doable, especially since fuel production can utilize “unreliable” renewables (make gas when the sun shines).


I agree it was overestimated, note to self: always redo ppl's maths.

1) number of barrels of jet fuel per day in switzerland:

34000 barrel / day = 0.39 barrel / s

https://www.indexmundi.com/energy/?product=jet-fuel&graph=co...

2) energy in a barrel of jet fuel:

1700 kWh / barrel = 6120000000 J / barrel

3) total power for switzerland:

2.4 GW

so more like 2-4 reactors for all switzerland.
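The same calculation in code form, using the numbers from the steps above:

```javascript
// Switzerland-wide jet fuel consumption converted to continuous power.
var barrelsPerDay = 34000;            // jet fuel consumption, per indexmundi
var joulesPerBarrel = 6.12e9;         // ~1700 kWh per barrel
var secondsPerDay = 86400;

// Energy per day divided by seconds per day gives average watts.
var watts = barrelsPerDay * joulesPerBarrel / secondsPerDay;
var gigawatts = watts / 1e9;          // ~2.4 GW
```

That's about 2.4 GW continuous, hence the 2-4 reactor estimate once you account for capacity factor and conversion losses.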


She's also one of the best founders I've ever worked with. Her talks at YC are my favorite -- incredibly honest, personal, and insightful.


I’m more disturbed by the fact that they can also edit or remove books that I’ve already purchased. How long until Amazon is forced to “deplatform” something offensive? Those old books contain a lot of words and ideas that have no place in 2019.

It’s one of several reasons why I mostly only buy paper books.


The denial is strong. Reminds me of how cell phone makers responded to the iPhone:

“The development of mobile phones will be similar in PCs. Even with the Mac, Apple has attracted much attention at first, but they have still remained a niche manufacturer. That will be in mobile phones as well,” Nokia chief strategist Anssi Vanjoki told a German newspaper at the time.

Back in the day, smartphones were pretty much defined by devices like the Palm Treo, and Palm CEO Ed Colligan doubted some computer maker was going to just waltz in and eat his lunch.

“We've learned and struggled for a few years here figuring out how to make a decent phone,” Colligan said. “PC guys are not going to just figure this out. They're not going to just walk in.”


CMGI and the other “incubators” of that era were nothing like YC. The label “accelerator” was applied to YC years after it was started because of the need for a generic term to describe YC and all of the clones. Whether or not that label was previously used for some other kind of business is irrelevant.


Makes sense. I suspected that might be the case, which is why I mentioned the possibility of something more specific intended. (In case other readers are struck by the same thought, it might be worth a footnote).



Thankfully, mine are more of a mild annoyance than cripplingly painful.

I'll check out Curable though, thank you!


Slashdot is where I first encountered both Google ('98 / early '99) and YC (the Summer Founders Program). So pretty valuable personally. Thanks for that :)

I miss the enthusiasm of that community.


Ditto. It almost felt like Google got its nerd cred from being featured on Slashdot multiple times. Bitcoin too (waaaay before it was "popular"), and probably Linux itself.


