
Prepack helps make JavaScript code more efficient - jimarcey
https://prepack.io
======
chmod775
I just ran this on a huge JS project that has a quite intensive
"initialization" stage (modules being registered, importing each other, etc.),
and prepack basically pre-computed 90% of that, saving some 3k LOC. I had to
replace all references to "window" with a fake global object that only existed
within the outer (function() {..})() though (and move some other early stuff
with side effects to the end of the initialization), to get it to make any
optimizations at all.

Very impressive overall.
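For anyone curious what that wrapper looks like, here's a rough sketch (the property names are made up; the real initialization stage is obviously more involved):

```javascript
// Sketch of the "fake window" trick described above: instead of letting
// module-registration code touch the real (opaque-to-the-optimizer) window,
// give it a plain object the optimizer can reason about, then copy the
// results onto the real global in one side-effecting step at the end.
(function () {
  var fakeWindow = {}; // stands in for `window` during initialization

  // ...initialization code writes onto fakeWindow instead of window...
  fakeWindow.myModules = { one: 1, two: 2 };

  // Side effects moved to the end, outside the precomputable prefix:
  var realGlobal = typeof window !== "undefined" ? window : globalThis;
  realGlobal.myModules = fakeWindow.myModules;
})();
```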

~~~
adamsea
Do you happen to know if this is similar to what Google's Closure Compiler for
JavaScript would do?

~~~
wlib
It says on the bottom of the page that prepack optimizes for performance/less
computation while closure optimizes for file size.

------
yladiz
I hate to bring this up whenever I see a Facebook project, but it still
warrants saying: the patents clause in this project, like in others including
React, is too broad. I really wish they made a "version 3" that limited the
scope of the revocation to patents pertaining to the software in question
(e.g. Prepack or React), rather than a blanket statement that covers any
patent assertion against Facebook. While I suppose the likelihood is small, I
can imagine a company that holds valid patents unrelated to Prepack (say,
something VR-related) that Facebook infringes upon, and that also uses
software Facebook produces, like Prepack. If it sues Facebook for
infringement, it then loses the right to use Prepack as a result. From my
understanding these kinds of clauses are beneficial overall, but the specific
one that Facebook uses is too broad.

Tangentially related: what would happen if you did sue Facebook for patent
infringement, and continued to use this software?

~~~
abritinthebay
This gets brought up every time, even after it's been clarified many times
(even by lawyers) that it gives you _more_ rights overall (as the downsides to
it _are true either way_).

Remember - without that patent grant _you have no rights to any of Facebook's
patents anyhow_. With it, you do.

So the worst case is that you'd be in the same situation as if _you didn't_
have the grant.

~~~
andreyf
False: an implicit patent grant means that by open sourcing a piece of
software you imply that people can, you know, use it.

~~~
Lazare
Some people think so. Others aren't so sure. US courts have not yet ruled on
it.

However, even the people who DO think an implicit grant exists would mostly
agree that the implicit grant is not sublicensable, which makes it a horrible
mess and probably unusable.

An explicit grant is strongly preferable, IF people can agree on the terms.
Facebook's terms are on the harsh side, but there's clear advantages to it
existing.

~~~
abritinthebay
> US courts have not yet ruled on it.

They sort of have. A patent has to be a major and "dominant" part of the
implicitly licensed tech for the license to be granted.

Basically, all existing case law says that you get _some_ rights from
implicit grants, but it's also _far less_ than with explicit ones like FB's.

~~~
Lazare
Thanks for the info.

~~~
andreyf
Careful. I don't know what his motivations are, but the info is
misrepresenting both the context and the meaning of the language he
references:
[http://en.swpat.org/wiki/Implicit_patent_licence#USA](http://en.swpat.org/wiki/Implicit_patent_licence#USA)

------
dschnurr
This is cool–it's worth mentioning that you might be trading runtime
performance for bundle size though, here's a contrived example to demonstrate:
[http://i.imgur.com/38CR3Ws.jpg](http://i.imgur.com/38CR3Ws.jpg)
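For readers who can't load the image, here's a hand-written illustration of the same tradeoff (not actual Prepack output; the function names are made up):

```javascript
// The size/speed tradeoff in miniature: a compact loop can be
// residualized into a much larger literal. Zero runtime computation,
// but the payload grows with n.

// Before: small source, work done at runtime on every load.
function squaresLoop(n) {
  var out = [];
  for (var i = 0; i < n; i++) out.push(i * i);
  return out;
}

// After: precomputed, nothing left to execute - but more bytes to ship.
var squares10 = [0, 1, 4, 9, 16, 25, 36, 49, 64, 81];
```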

~~~
evv
Keep in mind that everything is gzipped nowadays, so it may not make a big
difference in network usage, although it is still likely to cause some memory
overhead.

~~~
franciscop
No, I have written and published an article about how gzip works with JS, and
the result will compress pretty well. Not as well as the original code, but I
would guess within the same order of magnitude.

~~~
vmasto
JavaScript parsing is still a huge bottleneck. 1 MB of it will still take 1
whole second to parse in V8 (note: just to parse it, not actually evaluate or
run it!).

~~~
franciscop
I think this is (so far) for snippets. I agree otherwise though

------
chime
This has promise but still needs more work. I added one line to their 9 line
demo ( [https://prepack.io/repl.html](https://prepack.io/repl.html) ) and it
ballooned to 1400+ lines of junk:

    (function() {
      function fib(x) {
        y = Date.now(); // the useless line I added
        return x <= 1 ? x : fib(x - 1) + fib(x - 2);
      }

      let x = Date.now();
      if (x * 2 > 42) x = fib(10);
      global.result = x;
    })();

I understand Date might not be acceptable for inner loops but a lot of my code
that deals with scheduling would benefit significantly if I could precompute
some of the core values/arrays using a tool like prepack.

~~~
shuzchen
It's not a useless line, because prepack has no idea what Date.now() does
(there are no guarantees in javascript that it hasn't been replaced with
another function). It might mutate some global state somewhere, so the
resulting code needs to call Date.now() as often as it would've if fib(10) was
called. Basically the output is the unrolled version of the recursion, which
cuts down on function invocation (comparatively expensive in dynamic
languages).

If you replaced the line with say: `var y = 4;` you'll notice that it has been
optimized out.
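To make the "no guarantees" point concrete, here's a small sketch showing that `Date.now` is just an ordinary writable property, so every call really is observable (the call counter here is purely illustrative):

```javascript
// Why a tool can't assume Date.now() is the pure built-in: it's an
// ordinary writable property that earlier code may have replaced.
var realNow = Date.now;

Date.now = function () {
  // An observable side effect: count how often we're called.
  Date.now.calls = (Date.now.calls || 0) + 1;
  return 0;
};

function fib(x) {
  var y = Date.now(); // must be preserved: each call is observable
  return x <= 1 ? x : fib(x - 1) + fib(x - 2);
}

fib(5); // 15 invocations of fib, hence 15 Date.now() calls
var calls = Date.now.calls;

Date.now = realNow; // restore the built-in
```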

~~~
magicalist
> _Basically the output is the unrolled version of the recursion, which cuts
> down on function invocation (comparatively expensive in dynamic languages)._

You're right about why it's included, but this is a bug, not a feature. If
this is a hot function it'll be optimized. The recursive call is a known,
non-dynamic invocation, so the function location will be inlined, leaving just
the fairly low function call overhead itself. There's a reason nobody does
arbitrary-length loop unrolls.

I would definitely bet that this is a bug to be fixed :) Right now you can see
this if you call fib(20) in that example...the compiler times out while trying
to unroll that far. Clearly that behavior won't stick around.

------
NTillmann
Hi, I am Nikolai Tillmann, a developer on the Prepack project. I am happy to
answer any questions!

~~~
GeoffreyPlitt
Can you prove that any of the optimizations in your docs aren't already done
by V8? I agree with the other commenters in this thread-- V8 likely does these
already. You have an extraordinary claim, which requires extraordinary proof.

~~~
batmansmk
The whole point is to not let V8 do any job at all.

V8 == user impacted.

Compile time V8 == compiler impacted, users happy.

It's like interpreting what you can at compile time, with caching
capabilities.

It speeds up boot time and init time, not runtime per se.

~~~
swsieber
This. By transforming and evaluating things that really could be constants
beforehand, V8 has less to do when it goes to run the javascript.

------
jamescostian
The examples are very far from the JS I see and read, but this is definitely a
very useful tool. It seems like gcc's -O levels. It would be interesting to
incorporate some sort of tailoring for JS engines into this, like how a
compiler might try to make x86-specific optimizations. For example, if you
know your target audience mostly runs Chrome (or if the code is to be run by
node), you might apply optimizations to change the code to be more performant
on V8 (see
[https://github.com/petkaantonov/bluebird/wiki/Optimization-killers](https://github.com/petkaantonov/bluebird/wiki/Optimization-killers)
for example).

I love it and can't wait to use it on some projects!

~~~
djsumdog
The examples seem more typical of Coffeescript output.

~~~
SamBam
You've mentioned that a couple times, but I'm really not seeing it. What about
it looks like CoffeeScript to you?

~~~
jessaustin
"Coffeescript output" is javascript so it doesn't look much like coffeescript.
Presumably GP doesn't like the specific javascript idioms to which
coffeescript transpiles.

------
ianbicking
A long time ago there was a theory about using Guile (the GNU Scheme) as a
general interpreter for languages using partial evaluation: you write an
interpreter for a language in Scheme, use a given program as input, and run an
optimizer over the program. This turns your interpreter into a compiler. I
played around with the concept (making a Tcl interpreter), and it even kind of
worked, often creating reasonably readable output.

Prepack looks like the same kind of optimizer – it could be a fun task to
write an interpreter and see if this can turn it into a compiler/transpiler.

~~~
dgreensp
What you're talking about -- writing an interpreter that's optimized into a
compiler -- is actually coming in the soon-to-be-released Java 9. Check out
Graal and Truffle. I'm pretty excited to play with it at some point.

Prepack reminds me more of a "supercompiler," because it focuses on partial
evaluation rather than optimization.

~~~
ianbicking
Your "supercompiler" phrase made me think about what happens if you start
applying the partial evaluation over and over. Of course nothing happens...
unless you know something more than what you knew before. Which you might!
That in turn made me think of the Wolfram Language, which feels like this to
me – you declare things, and as the set of declarations continues the language
starts to "know" more things, and your statements become more concrete. This
is interesting because it's all automated, you can undo things, change them,
implicitly loop them by considering multiple possibilities.

I'm not sure you could take Prepack and do this. But it sure seems
interesting. A kind of partial evaluation coding notebook... not so unlike a
symbolic spreadsheet I suppose.

------
untog
This should have a big impact on the "cost of small modules", as outlined
here:

[https://nolanlawson.com/2016/08/15/the-cost-of-small-modules/](https://nolanlawson.com/2016/08/15/the-cost-of-small-modules/)

Which is to say, one of its most effective use cases will be making up for
deficiencies in Webpack, Browserify and RequireJS. Which I'm a little
ambivalent about - I wish we could have seen improvements to those tools (it's
possible, as shown by Rollup and Closure Compiler) rather than adding another
stage to filter our JavaScript through. But progress is progress.

~~~
k__
I saw a Webpack example and it looked smaller & faster. So it seems like a
good thing.

------
xg15

      function define() {...}
      function require() {...}
      define("one", function() { return 1; });
      define("two", function() { return require("one") + require("one"); });
      define("three", function() { return require("two") + require("one"); });
      three = require("three");
    

--->

    three = 3;

There is a certain irony that now it's possible to do optimisations like that
in javascript - a dynamically typed language with almost no compile time
guarantees.

Meanwhile java used to have easy static analysis as a design goal (and I think
a lot of boilerplate is due to that goal) but the community relies so much on
reflection, unsafe access, dynamic bytecode generation, bytecode _parsing_ etc
that such an optimisation would be almost impossible to get right.

~~~
lurker456
It's possible to do such optimizations for a (safe) subset of JavaScript, such
as these pure functions.

Arguably Java has a larger such subset, even today.

~~~
xg15
I think the remarkable point in the above optimisation is that the non-pure
functions define() and require() were also subject to optimisation, even
though the optimizer had no special knowledge about them. Using symbolic
execution, the optimizer was nevertheless able to reason about them.
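For the curious, the homepage example is easy to reproduce with a toy implementation; nothing about define/require is special-cased, which is exactly why symbolic execution has to carry it all the way through:

```javascript
// A toy version of the define/require pair from the example above.
// An evaluator that tracks `modules` symbolically can reduce the
// whole thing to `three = 3`. (This `require` is our own sketch and
// shadows any outer module loader; fine for this illustration.)
var modules = {};
function define(name, factory) { modules[name] = factory; }
function require(name) { return modules[name](); }

define("one", function () { return 1; });
define("two", function () { return require("one") + require("one"); });
define("three", function () { return require("two") + require("one"); });
var three = require("three");
```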

------
gajus
A webpack plugin for prepack: [https://github.com/gajus/prepack-webpack-plugin](https://github.com/gajus/prepack-webpack-plugin)

------
tyingq
How "safe" is it? I'm thinking, for example, of Google's closure compiler and
the advanced optimizations, which can break some things.

Or roughly, if it compiles without errors, is it safe to assume it won't
introduce new bugs?

~~~
NTillmann
It's not yet ready for production, so there are some bugs and cases where we
should reject a program but don't do that yet.

Having said that, it's quite safe, but won't be undetectable. Code using eval
could detect injected identifiers, we don't currently aim at preserving
function names, and the method bodies you get with toString() are altered.
That should be roughly it.
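A small sketch of why rewritten bodies are detectable: Function.prototype.toString reflects a function's source text, so a rewrite that preserves behavior can still be observed (the "folded" variant here is hand-written, not Prepack output):

```javascript
// Two behaviorally identical functions whose rewrite is observable
// through toString(), which exposes the source text.
function original() { return 1 + 1; }
var folded = function original() { return 2; }; // what a tool might emit

var sameBehavior = original() === folded();
var bodiesDiffer = original.toString() !== folded.toString();
```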

~~~
Hydraulix989
The real litmus test: Does FB use it in production yet?

------
vikeri
I was under the impression that V8 and the like are so optimized that this
would give marginal gains. Would love to be wrong though. Do you have any
performance benchmarks?

~~~
gavinpc
The benefit of this would be at runtime when implementing interpreters:

> experiment with JavaScript features by tweaking a JavaScript engine written
> in JavaScript, all hosted just in a browser; think of it as a "Babel VM",
> realizing new JavaScript features that cannot just be compiled away

I've been playing with making toy languages inside of Javascript, and I
believe there's lots of untapped power there. The paradigm battles don't have
to be: we can run all sorts of paradigms in the same VM, with data executing
as code. This means that you can _decompose_ expressions to see where they
came from (e.g. the steps in a state machine that yielded an evaluation
result). If you believe (as I do) that invisibility-by-default is one of the
greatest pain points in the history of computing, then these sorts of
approaches are essential.

The problem with doing that is that some things will be slower than they would
in "native" JS. I've been proceeding anyway and thinking that could be dealt
with later. So I'm bookmarking this, because it attacks that exact problem:
runtime-compilation of generated AST's.

The name is a little unfortunate, in that respect, though, especially since it
will make people think it's a build tool like Webpack. Webpack is for dead
fish. This is incomparably more powerful.

 _edit_ so to answer your original question, this would piggyback on the
optimizations of the VM (V8 or SpiderMonkey, or whatever), taking for granted
that JS which is not needlessly verbose (as generated code must sometimes be),
can be run nearly optimally.

~~~
jamescostian
> Webpack is for dead fish.

I haven't used webpack but I have used many similar tools, and I've heard a
lot of praise for webpack. Can you elaborate on why you dislike webpack so
much?

~~~
gavinpc
I'm using "dead fish" in the Bret Victor sense.[0] It's a static tool by
nature. By the time you're in a live environment, webpack is gone.

[0] [https://vimeo.com/64895205](https://vimeo.com/64895205)

~~~
jamescostian
Oh. Isn't prepack also a "dead fish" then? Like you said, "It's a static tool
by nature. By the time you're in a live environment, [prepack] is gone." There
isn't any sort of prepack runtime

~~~
gavinpc
I was assuming that the intention was to eventually provide runtime support.
If that's correct, then I think we'd be in agreement about what "static"
means?

Babel and all sorts of other compilers can run in the browser. Whereas webpack
is mainly concerned with the text that you transmit and could not "by nature"
be made to do its job in the browser (because then it would be too late).
That's the difference I had in mind.

------
bthornbury
Awesome project, the performance gains seem real, but why wouldn't these
optimizations be happening at the javascript JIT level in the vm? (serious
question)

React/JavaScript programming is the most complex environment I've ever dug
into, and it's only getting more complex.

create-react-app is great for hiding that complexity until you need to do
something it doesn't support and then it's like gasping for air in a giant sea
of javascript compilers.

~~~
throwawaymsft
download js -> run through JIT compiler -> execute

vs.

run through JIT compiler -> download js -> execute

Whatever overhead the JIT compiler adds will be latency for the user.

~~~
mbel
In the second case it's AOT (ahead of time) not JIT (just in time) compiler.

------
Kamgunz
Very interesting. Nobody has mentioned how formal and technical this README
is: it goes into real detail about what the tool does, and even lays out
future plans in three sections across 30 bullet points. One bullet point in
the really-far-future section says "instrument JavaScript code [in a non
observable way]" (emphasis mine), and that phrase appears in several other
bullet points. It seems to me every compiler/transpiler/babel-plugin changes
JavaScript code in a non-observable way, no? Just a theory, but that undertone
sounds to me like the ability to change/inject into JavaScript code
undetectably, on the fly, in some super easy way.

Just another day at Facebook's office...

~~~
CGamesPlay
Your tin foil hat can go away when you consider that you're instrumenting code
that will permanently run in an observable sandbox. You control the server
(for node) and can watch what external requests it's making; and while you
don't directly control the client, you do have access (via the web inspector)
to all external requests it's making.

------
arota
This is exciting, and has a lot of potential to significantly improve JS
library initialization time.

I wonder if this is the same project[0] Sebastian McKenzie previewed at React
Europe 2016?

[0]
[https://www.youtube.com/watch?v=xbZzahWakGs](https://www.youtube.com/watch?v=xbZzahWakGs)

~~~
hzoo
Yep it is!
[https://twitter.com/dan_abramov/status/859817680857165824](https://twitter.com/dan_abramov/status/859817680857165824)

------
dandare
What is the business model for a tool like this? Who has the resources to
spend man-years of work while also creating such a fantastic, simple yet
comprehensive landing page?

~~~
kylemathews
It's by Facebook. They have 1.x billion users across their web and mobile
products all of which run JavaScript. Tiny improvements in product performance
equals big $$$. Easily justifies efforts like this. And the rest of us get to
free-ride :-)

------
drumttocs8
Coming from a non-CS background, I've always wondered why you can't "convert"
code from one framework or paradigm to another. For instance, converting a
project from jQuery to React. If you can define the outputs, why can't you
redefine the inputs? That's what it seems like this project does... I suppose
converting frameworks would be a few orders of magnitude harder though.

~~~
batmansmk
Can you turn lead into gold? Maybe, using nuclear transmutation, but the
potential cost of the process seems higher than the benefits.

Can you convert a code base to React from Angular? Maybe, but the effort to
write this converter is higher than rewriting the code base.

------
aylmao
Facebook's javascript / PL game doesn't disappoint. This is awesome!

------
mstade
I'm happy to see there's an option to use this with an AST as input, more
tools like this should follow suit. Hopefully it can then push us to a place
where there's a standard JS AST such that we don't reinvent that wheel over
and over. Babel seems to be winning here, but I don't think it matters so much
which one wins so long as any _one_ does.

This tool looks interesting, particularly its future direction, but I'm
wary about efficiency claims without a _single_ runtime metric posted. The
claims may be true, initializing is costly, but so is parsing huge unrolled
loops. For an optimization tool, I'd hope to see pretty strong metrics to go
along with the strong claims, but maybe that is coming?

Interesting work, nonetheless!

------
ericmcer
Pretty cool, but it did not make much difference in my application size, as it
has very little static data in it. It seems pretty rare to do something like:

    fib(2);

and more common to do:

    getInputOrHttpOrSomethingAsync().then(function(a){ fib(a); });

------
iamleppert
Not a comment about the tool, which looks cool and well done.

It's sad that there are developers and projects who write the type of code
that causes these sorts of performance trade offs. I stopped writing this kind
of fancy code a long time ago when I realized it wasn't worth it. You're just
shooting yourself in the foot in the long run.

I think static analysis performance optimization tools are great but a certain
part of me thinks it just raises the waterline for more shitty code and awful
heavy frameworks that sacrifice the user experience for the developer
experience.

"Just run it through the optimizer" so we don't actually have to think about
what a good design looks like...

~~~
gavinpc
There is some confusion in this thread about the purpose of this tool, which
is targeted at _generated_ code—specifically, code generated by other
compilers. In order to get language features that don't exist in javascript,
code has to be generated in a more or less context-agnostic way. This (as I
understand it) brings more context to bear, to reduce the cost of the new
abstractions _for your specific usage_.

~~~
wereHamster
Why improve the compilers when we can have yet another tool to layer on top of
our ever growing tool stack.

~~~
abritinthebay
I take it you've never looked at a modern C++ or similar build and execution
chain under the hood then?

There's a LOT of parts that do different things. They may be aliased under one
command (with tons of flags) but a modern system does a lot of stuff.

For modern JS we appear to have:

- Linters (ESLint, etc)

- Transcompilation to Object Code (Babel, etc, transpile to JS)

- AoT Compilation (this & Closure Compiler, etc, do optimizations on the code
ahead of running it)

- Recompilation (AST-based compression, like Uglify)

- Compiling (the actual JS VM, V8, etc)

- JIT (in the actual browser)

None of these steps are alien to other build chains.

------
Waterluvian
What percentage of typical code is packable like this? What I really need is a
way to easily determine, "is it worth bothering with a tool like this?"

------
hdhzy
This looks very good indeed, but the lack of an initial data model severely
limits the production usability of this tool. You can't use "document" and
"window" ...

It's the same problem TypeScript has/had: for external libs you need
definition files for it to work. Now if we had a
TypeScript-to-assumeDataProperty generator, that would be VERY interesting!

~~~
kylemathews
This is a very early release. I'm sure there'll be a "build for the web" mode
soon enough.

------
kasper93
I think that just-in-time compilers are better at doing their thing. Sure,
it's a nice project that can interpret and print preprocessed JS, but I think
it might in fact not bring speedups in most cases.

And the current state doesn't even know how to constant fold this loop:

    function foo() {
      const bar = 42;
      for (let i = 0; i <= bar; i++) {
        if (i === bar) { return bar; }
      }
    }

~~~
yairhaimo
Ahead-of-time optimizations that don't carry a large enough drawback on
another criterion (speed vs. size) are welcomed by me, even if the JIT can do
the same optimizations too. Regarding your code example, maybe prepack was
changed in the past week and a half, but it folded quite fine when I tried it:

    (function() {
      function foo() {
        const bar = 42;
        for (let i = 0; i <= bar; i++) {
          if (i === bar) { return bar; }
        }
      }
      console.log(foo());
    })();

------
kamranahmed_se
> helps make javascript code more efficient

[https://github.com/facebook/prepack/issues/543](https://github.com/facebook/prepack/issues/543)

Are you sure?

------
kccqzy
This reminds me of Morte, an experimental mid-level functional language
created by Gabriel Gonzalez. They both seem to be super-optimizing, that is,
partially executing the program in question. Of course it is a great deal
easier to do in a functional language than in JavaScript.

[http://www.haskellforall.com/2014/09/morte-intermediate-language-for-super.html](http://www.haskellforall.com/2014/09/morte-intermediate-language-for-super.html)

------
KirinDave
I wonder what this would do to Purescript code?

------
web-guy
I did an experiment to look for synergy from combining Prepack with the
Closure compiler: [http://www.syntaxsuccess.com/viewarticle/combining-prepack-and-closure-compiler](http://www.syntaxsuccess.com/viewarticle/combining-prepack-and-closure-compiler)

The result was pretty good.

------
jlebrech
I want something that can separate my code into what can be precompiled into
wasm and what has to stay in JS. Maybe just insert comments so I can see what
needs to be done.

------
Traubenfuchs
I can't get anything to work in it. Just for fun I put the non-minified
vue.js source in and I get:

    null or undefined TypeError at repl:537:23 at repl:5:16 at repl:2:2

------
reaction
Has anyone used this with webpack + reactjs?

------
austincheney
Errors on all my code:

* [http://prettydiff.com/lib/](http://prettydiff.com/lib/)

* [https://raw.githubusercontent.com/prettydiff/biddle/master/b...](https://raw.githubusercontent.com/prettydiff/biddle/master/biddle.js)

------
avodonosov
How does one measure the performance improvement a web page gains from such
tools?

------
k__
Just throw your webpack bundles in and be amazed.

------
frik
How does it compare to Google's Closure Compiler? It is considered by many to
be best in class. It understands the code (it uses the Java-based Rhino
JavaScript engine), while most alternatives (UglifyJS & co) just monkey-patch
things. You can trust the Closure Compiler's output.

Edit: @jagthebeetle: have you tried "advanced mode"? (One should read the
documentation before using it; it's really a game changer, but requires one to
read the docs first.)

~~~
matt4077
Closure compiler optimizes code size, while this optimises code execution.

~~~
bpicolo
I see that it claims that, but that's not entirely true. The closure compiler
does a variety of perf optimizations as well (e.g. inlining).

Closure is really a great compiler, it's just a shame it doesn't interact well
(that is, at all) with the modern JS ecosystem.

~~~
tadeegan
As someone who works on Closure Compiler, this is one of my biggest gripes
with the project. Things are getting better though! CC now supports node's
module resolution algorithm. It works pretty well with ES6 imports, but not so
well with CommonJS (mostly because the exports are impossible to statically
analyze).

Within Google, CC is heading towards being an optimizing backend for other,
less painful languages such as TypeScript (tsickle) and the yet-to-be-released
J2CL compiler.

CC does pretty well with these examples (our debugger is not quite as flashy):
[https://closure-compiler-
debugger.appspot.com/#input0%3D%252...](https://closure-compiler-
debugger.appspot.com/#input0%3D%252F%252F%2520From%2520Prepack.io%2520%2522Hello%2520World%2522%250Avar%2520s%253B%250A\(function%2520\(\)%2520%257B%250A%2520%2520function%2520hello\(\)%2520%257B%2520return%2520'hello'%253B%2520%257D%250A%2520%2520function%2520world\(\)%2520%257B%2520return%2520'world'%253B%2520%257D%250A%2520%2520s%2520%253D%2520hello\(\)%2520%252B%2520'%2520'%2520%252B%2520world\(\)%253B%250A%257D\)\(\)%253B%250A%250Aconsole.log\(s\)%253B%26input1%26conformanceConfig%26externs%26refasterjs-
template%26includeDefaultExterns%3D1%26CHECK_SYMBOLS%3D1%26MISSING_PROPERTIES%3D1%26TRANSPILE%3D1%26CHECK_TYPES%3D1%26COMPUTE_FUNCTION_SIDE_EFFECTS%3D1%26FOLD_CONSTANTS%3D1%26DEAD_ASSIGNMENT_ELIMINATION%3D1%26INLINE_CONSTANTS%3D1%26INLINE_FUNCTIONS%3D1%26INLINE_VARIABLES%3D1%26FLOW_SENSITIVE_INLINE_VARIABLES%3D1%26INLINE_PROPERTIES%3D1%26REMOVE_DEAD_CODE%3D1%26EXTRACT_PROTOTYPE_MEMBER_DECLARATIONS%3D1%26REMOVE_UNUSED_PROTOTYPE_PROPERTIES%3D1%26REMOVE_UNUSED_VARIABLES%3D1%26COLLAPSE_VARIABLE_DECLARATIONS%3D1%26COLLAPSE_ANONYMOUS_FUNCTIONS%3D1%26COLLAPSE_PROPERTIES%3D1%26DEVIRTUALIZE_PROTOTYPE_METHODS%3D1%26REWRITE_FUNCTION_EXPRESSIONS%3D1%26DISAMBIGUATE_PROPERTIES%3D1%26AMBIGUATE_PROPERTIES%3D1%26PROPERTY_RENAMING%3D1%26OPTIMIZE_CALLS%3D1%26OPTIMIZE_PARAMETERS%3D1%26OPTIMIZE_RETURNS%3D1%26MOVE_FUNCTION_DECLARATIONS%3D1%26MARK_NO_SIDE_EFFECT_CALLS%3D1%26CROSS_MODULE_CODE_MOTION%3D1%26CROSS_MODULE_METHOD_MOTION%3D1%26CLOSURE_PASS%3D1%26PRETTY_PRINT%3D1)

~~~
bpicolo
> CC is heading towards being a an optimizing backend for other less painful
> languages such as Typescript

I actually considered musing about something like this in my post. Typescript
support would be very interesting.

Fwiw, the tooling support around the CC has been a problem historically. We
relied on Plovr for a long time, but eventually it fell unmaintained, and
there wasn't an alternative for a lot of the relevant parts (e.g. gathering
source files you care about). Some important dev features also just didn't
quite work as intended (sourcemaps) for a long time.

------
iMark
The destination page looks uncomfortably like Webpack's.

Not the best idea, imho.

