
I Compiled 1M TypeScript Files in Under 40 Seconds. This Is How - breck
https://medium.com/@urish/yes-i-compiled-1-000-000-typescript-files-in-under-40-seconds-this-is-how-6429a665999c
======
mncharity
Community-scale bulk code analysis; community-scale compilation caching;
cloud-scale compilation parallelism.

What do future programming languages and environments then look like, as the
constraints and costs on the programming-language design space change again?

FORTRAN survived from punch cards to the present. But try to picture your
current favorite IDE/language experience... on a punch card machine. Or a Lisp
Machine, or a Smalltalk IDE... on paper tape and console switches.

Stanford/Winstein's deterministic gcc compilation on AWS Lambda enabled
`make -j 1000`-like parallelism - something vaguely like a cold-cache
compilation of ffmpeg in 1 minute for 10 cents. What changes when that's 100x
faster and cheaper? When your Linux kernel can be rebuilt as a boot step?

What changes when the time budget for your type inference is a couple of
CPU _years_?

What changes when your toy language, with compiler-performance optimization
still distant on the roadmap, is no longer restricted by performance to
toy-scale problems?

What changes when ML-based predictive code becomes a form of code reuse and
abstraction?

What changes when ...

... our languages and IDEs are allowed to become deeply different?

