
Why not allow the user to specify the level of optimization?



It's been a while since I looked at Closure's codebase but, as I recall, the individual optimizations aren't independent of one another; they're somewhat interdependent and specifically ordered, so "lower optimization levels" would mean manually deciding which optimizations could be removed from the pipeline without adversely affecting the others. But I could be wrong on that.
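As a rough illustration (a toy on source strings, not Closure's actual passes or pipeline), later passes often only find work because earlier ones ran, so you can't just drop passes and get a cheaper level that behaves the same:

    // Toy "passes" operating on a source string instead of a real AST.
    const src = 'var x = double(2); use(x);';
    const inline = s => s.replace(/double\((\d+)\)/, '($1 * 2)');       // inlining
    const fold   = s => s.replace(/\((\d+) \* 2\)/, (_, n) => n * 2);   // constant folding

    console.log(fold(inline(src)));  // "var x = 4; use(x);"
    console.log(fold(src));          // unchanged: folding has nothing to do
                                     // unless inlining ran first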

More importantly, this is a compiler targeted at one-time compilations that permanently shrink large JavaScript payloads downloaded millions of times, not a compiler that is required during development. As such, blunting its effect to save a few seconds is pretty meaningless, so I doubt the maintainers ever considered "less optimization".

That said, it does allow for "dangerous" but more aggressive optimizations that require assurances about the JavaScript you feed it (most famously around property renaming), or the compiled output will break. In that way, Closure does offer user-specified levels of optimization.
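The levels in the open-source compiler are WHITESPACE_ONLY, SIMPLE_OPTIMIZATIONS and ADVANCED_OPTIMIZATIONS; the classic "assurance" ADVANCED demands is that you don't refer to the same property by both dot and string access:

    // Compiled with something like:
    //   java -jar closure-compiler.jar --compilation_level ADVANCED_OPTIMIZATIONS \
    //        --js in.js --js_output_file out.js
    var settings = {pollInterval: 1000};
    console.log(settings.pollInterval);     // dot access: key and access get renamed together
    console.log(settings['pollInterval']);  // string access: the literal is preserved, so after
                                            // renaming it no longer matches and logs undefined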

EDIT: A secondary and less obvious effect is that a smaller set of total optimizations produces more internally consistent code, whereas rarely-applicable optimizations produce unusual, internally unique constructions. Internal consistency is great for the next step after compilation: gzip compression over the wire.
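You can see that effect with a quick sketch, using Node's built-in zlib as a stand-in for whatever gzips the payload on the way out:

    const zlib = require('zlib');

    // Same logic 300 times over: one sample always uses the same null-check
    // shape, the other alternates between two equivalent shapes.
    const uniform = Array.from({length: 300}, (_, i) => `x${i}&&go(x${i});\n`).join('');
    const mixed   = Array.from({length: 300}, (_, i) =>
      i % 2 ? `x${i}&&go(x${i});\n` : `if(x${i})go(x${i});\n`).join('');

    console.log(uniform.length, zlib.gzipSync(uniform).length);
    console.log(mixed.length,   zlib.gzipSync(mixed).length);  // typically gzips worse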


You might be interested in my work on sorting and clustering code to improve compression efficiency: http://timepedia.blogspot.com/2009/08/on-reducing-size-of-co...


> As such, blunting its effect to save a few seconds is pretty meaningless, so I doubt the maintainers ever considered "less optimization".

Yes, but still, some projects are orders of magnitude larger than others. Also, some users might be willing to wait an hour, others only a minute.


The point I was trying to make is that, in practice, everyone will run at full optimization, since that's the point of something like Closure. It's not like gzip, where "good enough" sometimes exists. JavaScript compilers are all about saving users time, and because of that, offering a product that breaks deployment cycles is a non-starter.

There are, essentially, an infinite number of optimizations you could add to Closure, though probably several thousand are reasonable. Every marginal optimization needs to run through the entire AST, and many of them require prior optimizations to be re-run. As 'cromwellian pointed out, the number of passes is the dominating factor in speed. At some point, it's no longer worth it.
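A toy version of that driver (again on a source string rather than a real AST, purely to show the shape of the cost):

    // Each pass walks the whole input, and any change can create new work for
    // passes that already ran, so the driver loops to a fixed point.
    // Cost grows with (number of passes) x (input size) x (rounds to converge).
    function optimize(ast, passes) {
      let changed = true, rounds = 0;
      while (changed) {
        changed = false;
        rounds++;
        for (const pass of passes) {
          const result = pass(ast);            // full traversal per pass
          changed = changed || result.changed;
          ast = result.ast;
        }
      }
      return {ast, rounds};
    }

    // Trivial stand-in pass: strips one " + 0" per call.
    const foldPlusZero = s => {
      const out = s.replace(' + 0', '');
      return {ast: out, changed: out !== s};
    };
    console.log(optimize('f(a + 0 + 0);', [foldPlusZero]));
    // { ast: 'f(a);', rounds: 3 } -- two fixes plus one round that finds nothing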


Yes, there are diminishing returns, where you spend polynomially more time to get an extra 0.2% code size reduction. At some point, you need to exit the optimization loop early.

For Google production code, we typically let things run long, because if you shave off, say, 30 KB from Gmail across 1 billion active users, you've just saved a lot of bandwidth.
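Back-of-the-envelope (assuming each active user fetches the uncached payload once):

    // 30 KB saved per download, across roughly a billion downloads
    const savedBytes = 30 * 1024 * 1e9;
    console.log((savedBytes / 1e12).toFixed(1) + ' TB');  // ~30.7 TB per full re-fetch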


The Closure compiler's optimization baseline is drastically ahead of any other JS optimizer's. It's a diminishing-returns sort of thing, I would think.

I'd be interested in seeing what a large typescript project would look like run through https://github.com/angular/tsickle.



