
The smallest JavaScript libraries on the internet - 112233
http://minime.stephan-brumme.com/
======
yread
I've been wondering for some time whether minification could be done in a way
that helps gzip achieve better ratios.

For example, if you have two functions to minify, one with three local
variables and the other with two, it might be beneficial to minify the first as
"var a, b;var c;" if there are a lot of functions with two locals and only one
with three. Or always assign "this" to a variable called "t", so that "t=this"
(or "var t=this;") is repeated often.

This seems like low-hanging fruit, and there are a lot of micro-optimizations
like that; even if brute-forcing them only saves <5%, it might be worthwhile.
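
A rough way to sanity-check the idea (nothing from the article; the snippets
and the renaming scheme below are made up) is to gzip two variants of the same
source and compare sizes, e.g. with Node's zlib:

    // Sketch: does consistent renaming help gzip? Both snippets are hypothetical.
    const zlib = require('zlib');

    // Variant A: the minifier picks a different alias name in each function.
    const a = 'function f(x){var q=this;return q.a+x}' +
              'function g(y){var r=this;return r.b*y}';

    // Variant B: "this" is always aliased to "t", so the byte sequence
    // "var t=this;" repeats and DEFLATE can back-reference it.
    const b = 'function f(x){var t=this;return t.a+x}' +
              'function g(y){var t=this;return t.b*y}';

    for (const [name, src] of [['arbitrary', a], ['consistent', b]]) {
      const gz = zlib.gzipSync(Buffer.from(src), { level: 9 });
      console.log(name, src.length, '->', gz.length, 'bytes gzipped');
    }

On two tiny functions the difference is in the noise; the interesting question
is whether it adds up across a whole library.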

~~~
hawski
Google's Closure Compiler does something like this [1].

It also reminds me of Stack Overflow in 4096 bytes [2], where the author also
tried to write code that would compress better.

[1] [https://github.com/google/closure-compiler/wiki/FAQ#closure-...](https://github.com/google/closure-compiler/wiki/FAQ#closure-compiler-inlined-all-my-strings-which-made-my-code-size-bigger-why-did-it-do-that)

[2] [http://danlec.com/blog/stackoverflow-in-4096-bytes](http://danlec.com/blog/stackoverflow-in-4096-bytes)

------
lr4444lr
Why do these JS framework comparison articles never include
[http://vanilla-js.com/](http://vanilla-js.com/)? It's all the rage, and it has
these other contenders beat every time, both pre- and post-compression.

~~~
cwt137
Vanilla JS is nice, but I like Vapor JS better. It has some famous people
behind it. Check it out
[https://github.com/madrobby/vapor.js](https://github.com/madrobby/vapor.js)

~~~
lr4444lr
Wow - the packed version even gives asm.js a run for its money!

Wonder if they'll make a CoffeeScript wrapper soon.

Thanks for the tip!

------
nreece
As an aside, there's a good list of micro-frameworks and micro-libraries at
[http://microjs.com](http://microjs.com)

------
SimeVidas
> “Surprisingly, most Content Delivery Networks (CDNs) give you a quite bad
> level of GZIP compression.”

I think CDNs have good reasons for choosing their preferred compression
settings. File size alone isn't the whole picture.

~~~
wolfgke
Then explain to me what reasons a CDN has not to choose the best available
compression (at least for small files that are downloaded rather often).

~~~
hartator
Stress on the client CPU + proxy caching is harder + broken support on some
old IE

~~~
wolfgke
* Stress on the client CPU

Do you have any evidence that these smaller files stress the client CPU more?
(That's probably not an absurd claim, but I'd argue this point can only be
discussed seriously with a computable, i.e. not merely empirically measurable,
metric of how much a client CPU is stressed by a given file.)

* proxy caching is harder

Could you explain the reason for that?

~~~
hartator
> Do you have any evidence that these smaller files stress the client CPU
> more? (That's probably not an absurd claim, but I'd argue this point can
> only be discussed seriously with a computable, i.e. not merely empirically
> measurable, metric of how much a client CPU is stressed by a given file.)

[http://tukaani.org/lzma/benchmarks.html](http://tukaani.org/lzma/benchmarks.html)
Actually, I was wrong: gzip decompression doesn't seem to be impacted at all by
a higher compression level; if anything it's the reverse, with a small gain in
decompression speed. Gzip compression is impacted, though: for a mere 3% to 5%
gain in size, compression takes 4x to 6x as long.
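
For anyone who wants to reproduce that trade-off locally, a minimal sketch
with Node's zlib (the input file name is just an example):

    // Sketch: size vs. time at different gzip levels for one file.
    const fs = require('fs');
    const zlib = require('zlib');

    const src = fs.readFileSync('jquery.min.js');  // any reasonably large text file

    for (const level of [1, 6, 9]) {
      const t0 = process.hrtime.bigint();
      const out = zlib.gzipSync(src, { level });
      const ms = Number(process.hrtime.bigint() - t0) / 1e6;
      console.log(`level ${level}: ${out.length} bytes in ${ms.toFixed(1)} ms`);
    }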

> proxy caching is harder, could you explain the reason for that?

The tricky part is that each file can be compressed in different ways, so the
CDN might have to store several versions of the same file, which can lead to a
mismatch between the compression the client accepts and the compression of the
cached copy. A good post on the topic:
[https://www.maxcdn.com/blog/accept-encoding-its-vary-importa...](https://www.maxcdn.com/blog/accept-encoding-its-vary-important/)
It's doable, but it's not trivial.
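
For reference, the header in question is "Vary: Accept-Encoding". A minimal
sketch of an origin that sends it (file names and port are made up), so a cache
in front keys the gzipped and plain responses separately:

    // Sketch: serve a pre-gzipped asset with "Vary: Accept-Encoding" so that
    // proxies cache the gzipped and uncompressed responses as separate entries.
    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');
      res.setHeader('Vary', 'Accept-Encoding');
      res.setHeader('Content-Type', 'application/javascript');
      if (acceptsGzip) {
        res.setHeader('Content-Encoding', 'gzip');
        fs.createReadStream('jquery.min.js.gz').pipe(res);  // hypothetical file
      } else {
        fs.createReadStream('jquery.min.js').pipe(res);     // hypothetical file
      }
    }).listen(8080);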

------
willmacdonald
It'd be good to try Brotli compression too:
[https://en.wikipedia.org/wiki/Brotli](https://en.wikipedia.org/wiki/Brotli)

It's a newer compression algorithm from Google, already used for web fonts and
currently usable in Chrome and Firefox.

 _Replacing deflate with brotli typically gives an increase of 20% in
compression density for text files, while compression and decompression speeds
are roughly unchanged_
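
Newer Node versions expose Brotli in the zlib module, so the comparison is
easy to run yourself; a sketch, with a made-up input file:

    // Sketch: gzip vs. Brotli on the same file.
    const fs = require('fs');
    const zlib = require('zlib');

    const src = fs.readFileSync('jquery.min.js');
    const gz = zlib.gzipSync(src, { level: 9 });
    const br = zlib.brotliCompressSync(src, {
      params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },  // max quality
    });
    console.log(`original ${src.length}, gzip ${gz.length}, brotli ${br.length} bytes`);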

~~~
stbrumme
Scroll down to the bottom of each page (e.g.
[http://minime.stephan-brumme.com/jquery/2.2.0/](http://minime.stephan-brumme.com/jquery/2.2.0/)).
There you will find a comparison of the major non-DEFLATE algorithms, such as
Brotli, RAR, 7z, xz, bzip2.

------
stbrumme
Author here: my website isn't about JavaScript in the first place, it's about
compression. I could have done something similar with PNGs, PDFs or ZIPs (same
DEFLATE compression algorithm), but JavaScript is much more widespread.

~~~
marklawrutgers
I have an issue with the load local copy code:

    <script>window.jQuery || document.write('<script src="local_server_path/jquery.min.js"></script>')</script>

In the browser I always end up with a ') at the top of all my pages. I'm
assuming this is because the script tag actually closes inside the
document.write instead of at the actual end. Am I the only one with this issue?
Not sure what went wrong here.
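
If it is the early-closing issue, the usual workaround (just a sketch, not
something taken from this site) is to escape the inner closing tag so the HTML
parser doesn't end the outer script element at the string literal:

    <script>window.jQuery || document.write('<script src="local_server_path/jquery.min.js"><\/script>')</script>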

------
SlashmanX
Is the time taken to decompress these re-compressed libraries identical to
that of the original minified libraries? It'd be nice to see a measurement of
that.
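
A quick way to measure it locally would be to time repeated decompression of
both .gz files; a sketch with Node's zlib (file names are made up):

    // Sketch: compare decompression time of the original vs. re-compressed file.
    const fs = require('fs');
    const zlib = require('zlib');

    for (const file of ['jquery.min.js.gz', 'jquery.min.recompressed.js.gz']) {
      const buf = fs.readFileSync(file);
      const t0 = process.hrtime.bigint();
      for (let i = 0; i < 1000; i++) zlib.gunzipSync(buf);
      const ms = Number(process.hrtime.bigint() - t0) / 1e6;
      console.log(`${file}: ${(ms / 1000).toFixed(3)} ms per gunzip on average`);
    }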

------
PSeitz
[http://vanilla-js.com/](http://vanilla-js.com/) is also pretty small

