

Show HN: A Genetic approach to CSS compression - friggeri
http://friggeri.net/blog/a-genetic-approach-to-css-compression/

======
franze

      however, although the resulting file size is smaller,
      once gzipped it actually leads to larger files than 
      by only using minification, therefore this is a
      totally useless article,
    

no, it's not. mobile devices have a relatively small cache size, and assets
are cached in non-gzipped form, so the non-gzipped file size is important (on
mobile, that is)

~~~
justinsb
I really appreciate OP's honesty/humility, and (presuming you're right, franze)
it shows that even things we ourselves consider to be dead ends may in fact
be useful in different applications.

I'd be very happy to see lots more of these interesting experiments -
regardless of outcome.

------
merraksh
The problem might be NP-hard, but with modern commercial (CPLEX, Gurobi,
Xpress) and open-source (Cbc, SCIP) solvers for discrete optimization you can
find a provably optimal solution in a relatively short time, even for
thousands of nodes.

The disadvantage of genetic algorithms is that they are heuristics, with no
guarantees on the quality of the solution found. Exact solvers such as the
ones above do provide an estimate of how far you are from the optimal solution
if you decide to stop them before they terminate.
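
For a concrete sense of what such an exact formulation might look like, here is a
minimal sketch in Python (PuLP driving the open-source Cbc solver) of one
hypothetical simplification: choose a subset of candidate rule groups that covers
every (selector, property) pair while minimizing total output bytes. The candidate
groups and byte costs are invented for illustration; the article's actual bipartite
formulation would need more structure.

    import pulp

    # Hypothetical toy instance: each candidate "rule group" covers some
    # (selector, property) pairs and would emit a given number of bytes.
    groups = {
        "g1": {"pairs": {("a", "color"), ("b", "color")}, "cost": 18},
        "g2": {"pairs": {("a", "color")}, "cost": 12},
        "g3": {"pairs": {("b", "color"), ("b", "margin")}, "cost": 20},
        "g4": {"pairs": {("b", "margin")}, "cost": 14},
    }
    all_pairs = set().union(*(g["pairs"] for g in groups.values()))

    prob = pulp.LpProblem("css_rule_cover", pulp.LpMinimize)
    use = {n: pulp.LpVariable("use_" + n, cat="Binary") for n in groups}

    # Objective: minimize the total bytes of the emitted rules.
    prob += pulp.lpSum(groups[n]["cost"] * use[n] for n in groups)

    # Every (selector, property) pair must be emitted by at least one group.
    for pair in all_pairs:
        prob += pulp.lpSum(use[n] for n in groups
                           if pair in groups[n]["pairs"]) >= 1

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print(pulp.LpStatus[prob.status],
          [n for n in groups if use[n].value() > 0.5])

If you stop Cbc before it finishes, it reports the remaining optimality gap, which
is the guarantee a heuristic like a GA can't give you.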

------
muxxa
What if you reran the genetic algorithm with the fitness function being
the size of the final compressed+gzipped file? Maybe that could counter-
intuitively reorder the CSS so that there are more long runs of repetitive
rules, resulting in an optimal final level of compression.
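
As a rough illustration of that idea (a hedged sketch in Python, not the repo's
actual code), such a fitness function would just serialize a candidate individual
to CSS and score it by its deflated length; render_css is a hypothetical stand-in
for whatever serialization step the GA already uses:

    import gzip

    def gzipped_fitness(individual):
        # render_css() is a hypothetical placeholder for the GA's
        # individual -> CSS text serialization step.
        css = render_css(individual)
        # Score by the size after DEFLATE, so the GA optimizes what is
        # actually sent over the wire; lower is better.
        return len(gzip.compress(css.encode("utf-8"), compresslevel=9))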

~~~
friggeri
This is more or less what I hint at at the end of the article. I did take a
quick look at it, but it did not yield very interesting results. The way I see
it, this might be related to the fact that there is really no link between
the different genetic operators and the deflate algorithm. Anyway, I'd love
to be proven wrong, in case anyone wants to fork the repo and try that.

~~~
muxxa
Haven't run it yet, but here's a simple change to the cost function:
https://github.com/eoghanmurray/css-bipartite-compression/commit/2da0a52cb4968959001acecd7ae3095717ea8534
Maybe the assumption that merging should happen more frequently than splitting
doesn't hold with the new cost function.

~~~
friggeri
You might be onto something about the merge/split balance. Will definitely
investigate in that direction.

------
friggeri
OP here, available to answer your questions if you have any.

~~~
ganarajpr
Since a GA is non-deterministic, you never know how long, or for how many
iterations (generations), to run it.

If one could get pretty good compression in around 200-500 generations, then
perhaps it would be a good solution.
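
A common workaround, sketched below in Python (the fitness, crossover and mutate
functions are hypothetical placeholders, not the article's implementation), is to
cap the number of generations and also stop early once the best individual hasn't
improved for a while:

    import random

    def run_ga(population, fitness, crossover, mutate,
               max_generations=500, patience=50):
        """Run until max_generations, or stop early once `patience`
        generations pass with no improvement (lower fitness is better)."""
        best = min(population, key=fitness)
        stale = 0
        for _ in range(max_generations):
            # Keep the better half, refill by crossing over and mutating
            # random survivors.
            population.sort(key=fitness)
            survivors = population[:len(population) // 2]
            children = [mutate(crossover(random.choice(survivors),
                                         random.choice(survivors)))
                        for _ in range(len(population) - len(survivors))]
            population = survivors + children

            candidate = min(population, key=fitness)
            if fitness(candidate) < fitness(best):
                best, stale = candidate, 0
            else:
                stale += 1
                if stale >= patience:
                    break
        return best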

------
SageRaven
Is the best practice for serving .css files to gzip them? Is this done at the
file level or on-the-fly by the web server? In any case, why is this considered
best practice? (I ask as a sysadmin, not a web developer.)

~~~
icebraining
While most people do on-the-fly gzipping, I think it's best to just
include static gzipping as part of the deployment process. Then you can use
e.g. Nginx's HttpGzipStaticModule to serve the appropriate version.
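
For example (a minimal sketch of that deployment step, not icebraining's actual
setup), you could write a .css.gz file next to every stylesheet at build time;
with "gzip_static on;" nginx will then serve the pre-compressed file to clients
that send Accept-Encoding: gzip:

    import gzip
    import pathlib
    import shutil

    # Pre-compress every stylesheet once, at deploy time, instead of
    # gzipping on every request. "public" is a hypothetical asset directory.
    for css_path in pathlib.Path("public").rglob("*.css"):
        gz_path = css_path.parent / (css_path.name + ".gz")
        with open(css_path, "rb") as src, \
             gzip.open(gz_path, "wb", compresslevel=9) as dst:
            shutil.copyfileobj(src, dst)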

------
halbermensch
Quite interesting though, as far as "useless" ideas go.

