
Genetic Algorithm 2D Car Thingy - sconxu
http://rednuht.org/genetic_cars_2/
======
ColinWright
This is brilliantly cool, and utterly hypnotic. I still always wonder about
implementations of genetic algorithms, though. In real life, from which
inspiration has been taken, the phenotype is often radically affected by small
changes in the genotype. In computer implementation of the concept this is
rarely the case. The computer versions are small, clean, spare, and nowhere
near as messy as the real life version.

I wonder if the messiness actually matters. I wonder if "optimization by
genetic algorithm" would be significantly improved by using a more faithful
algorithm.

========

Anyway, here's a previous discussion of this particular toy:

[https://news.ycombinator.com/item?id=5942757](https://news.ycombinator.com/item?id=5942757)
(169 comments)

Other submissions:

[https://news.ycombinator.com/item?id=10596079](https://news.ycombinator.com/item?id=10596079)

[https://news.ycombinator.com/item?id=10134390](https://news.ycombinator.com/item?id=10134390)

[https://news.ycombinator.com/item?id=5952145](https://news.ycombinator.com/item?id=5952145)

[https://news.ycombinator.com/item?id=2226137](https://news.ycombinator.com/item?id=2226137)

Also, genetic development of walkers:

[https://news.ycombinator.com/item?id=8911719](https://news.ycombinator.com/item?id=8911719)
(74 comments)

~~~
spooningtamarin
Genetic algorithms aren't really that brilliant. They are simple, and that's
what makes them popular among researchers. There's no formal science behind
them.

I bet that thinking about the problem more would let you find an algorithm
that beats genetic algorithms every time.

Just look at the travelling salesman problem, or the vehicle routing problem.
Smart heuristics beat these hybrid-genetic-memetic-algorithms all the time.

~~~
TelmoMenezes
> There's no formal science behind them.

I'm not sure what that means. Science is empiricism, and genetic algorithms
are a wide field of research with a lot of experimentation behind them. There
are many hypotheses about how they work, why they work and what their
limitations are. These hypotheses are tested through experimentation, just
like in other scientific fields.

> Smart heuristics beat these hybrid-genetic-memetic-algorithms all the time.

Sure, this is almost guaranteed by the "no free lunch theorem":
[https://en.wikipedia.org/wiki/No_free_lunch_theorem](https://en.wikipedia.org/wiki/No_free_lunch_theorem)

There might be some exotic cases where genetic algorithms are actually the
best solution, but nobody knows that.

But that's not the point of genetic algorithms. The point is having a generic
solution that can be applied with some success when you don't have the time or
the resources to find those "smart heuristics". And there might be cases where
we are just not smart enough to find them. Some of these cases might be
related to the effort towards developing generic AI. You can also claim that
it is always possible to write a more efficient computer program directly in
machine code, if you ignore time and resources.

Another interesting property of evolutionary computation is enabling some form
of artificial creativity. Check out this antenna, created by NASA using
evolutionary computation:

[https://en.wikipedia.org/wiki/Evolved_antenna](https://en.wikipedia.org/wiki/Evolved_antenna)

Also, if you will forgive me a bit of self promotion, check out my own work on
the discovery of complex network generators:

[http://www.nature.com/articles/srep06284](http://www.nature.com/articles/srep06284)

~~~
spooningtamarin
I wouldn't say genetic algorithms are generic solutions. I have to try really
hard to implement efficient operators that allow for fast optimization.

Scientific contributions are mostly in operators, I guess. I can't think what
else someone working with genetic algorithms can contribute, because adding
elitism or a 3-way crossover isn't really the solution to a problem; it's just
metaheuristics jibber-jabber.

~~~
TelmoMenezes
I find the situation a bit similar to neural networks. It's a bio-inspired
model, and we know from nature that it can work spectacularly well. The actual
algorithms that became popular (both in neural networks and evolutionary
computation) are extreme simplifications of the biological reality.

So a lot of the research goes in the direction of finding the "missing
ingredients". On the evolutionary computation front, one of the topics I find
the most interesting is the genotype-phenotype mapping. Naive genetic
algorithms have genotypes that map directly to phenotypes (e.g. you are
evolving some vector that is a direct solution to a problem). More complex
mappings (like what we see in nature) encode a generator for a solution.
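
To make the distinction concrete, here's a toy sketch in Python (made-up
encodings, nothing to do with the car simulator): in the direct case the
genotype _is_ the solution; in the generative case the genotype encodes rules
whose execution grows the solution, so a small genotype change can reshape the
whole phenotype.

```python
# Direct mapping: the genotype *is* the solution vector.
def direct_phenotype(genotype):
    return genotype  # e.g. wheel radii, chassis vertex positions

# Generative mapping: the genotype encodes a seed value plus growth
# rules; executing the rules *produces* the phenotype.
def generative_phenotype(genotype):
    value, *rules = genotype
    phenotype = [value]
    for scale, offset in rules:
        value = value * scale + offset  # each rule extends the structure
        phenotype.append(value)
    return phenotype

direct = direct_phenotype([0.5, 1.2, 0.8])
grown = generative_phenotype([1.0, (2.0, 0.1), (0.5, -0.3)])
```

Note how mutating one rule in the generative genotype perturbs every later
element of the phenotype, which is closer to the biological situation.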

On this note, exciting stuff happens on the genetic programming front (where
you are evolving computer programs, commonly Lisp-like trees).

It really is a vast field, ranging from direct engineering applications to
artificial life simulations aimed at theoretical biology. Like everything
else, you have to be familiar with the literature.

~~~
spooningtamarin
I'm familiar with the literature.

I do not like the research. It is bad and rarely reproducible, and since the
source code is almost never public, time/convergence measurements are
meaningless.

------
sevensor
This is a very beautiful illustration of a woefully outdated algorithm. Its
proportional selection operator was superseded in 1994 (at the latest) when
Bäck published his comparison of selection operators. Its approach to
crossover looks like an ad-hoc extension of binary crossover, which Deb
improved on in the same year. It's using uniform mutation, although better
options have been demonstrated by Evolution Strategies since the '70s, if not
earlier.

But it's really fun to watch. I'll give it that.

~~~
mdda
I researched GAs / GPs back in the mid-90s (but then went in a different
direction).

Is there a paper/presentation that embodies the current best
practices/thinking that you could recommend? I'm not trying to be lazy, it's
just that there is clearly a lot of retro thinking among the top search
results on the net, and it's difficult to separate the wheat from the chaff...

~~~
sevensor
With the disclaimer that there are different schools of thought, I can give a
basic outline, and provide some references. This pertains mainly to optimizing
engineering models, which tend to be continuous or mixed integer. GAs are just
OK at combinatorial optimization (although GP is pretty nifty, from what I
hear about it).

* Steady-state: pioneered to the best of my knowledge by Deb's epsilon-MOEA, steady-state algorithms use function evaluations as soon as they are received, rather than waiting for a whole generation to complete. [1]

* Multi-objective: Pareto ranking as dominance relation lets us have multiple objectives [2]. This is good because it gives us a whole bunch of solutions at once and lets us decide which one we want, instead of tweaking objectives and constraints to get an idea of the tradeoffs involved.

* Epsilon resolution: comes from an idea by Laumanns et al. [3]. We can have more objectives if we don't care about differences below a certain threshold. Otherwise the Pareto front explodes in size.

* Restarting: Proposed by Coello-Coello [4] and demonstrated by his micro-GA, restarting helps you get out of local optima.

* Better search operators: Binary crossover doesn't work so well, especially for real-valued decision variables. It took some fancy theoretical footwork involving schema theory and domino convergence to support it. There were a lot of new search operators proposed in the late '90s, the best of which seem to be simulated binary crossover (SBX), and polynomial mutation (PM). [5]

* Alternatively, the Differential Evolution search operator [6] works really well too.

* Better selection operator: tournament selection beats the pants off of the alternatives [7].

Resources:

* Dave Hadka's MOEAFramework [8], which includes open-source multi-objective evolutionary algorithm implementations.

* Dave Hadka's dissertation [9], which describes a (regrettably patented) MOEA that's kind of a greatest-hits of good MOEA ideas.

* Deb's website [10], which includes questionably-licensed algorithm implementations, as well as a number of technical reports.

Links. I'm posting raw bibtex because I'm lazy.

[1] @article{deb_2005_emoea, author = {Deb, K. and Mohan, M. and Mishra, S.},
year = {2005}, title = {Evaluating the $\varepsilon$-domination based
multiobjective evolutionary algorithm for a quick computation of Pareto-
optimal solutions.}, journal = {Evolutionary Computation Journal}, volume=
{13}, number = {4}, pages ={501--525} }

[2] @book{goldberg_1989_book, title={Genetic algorithms in search,
optimization, and machine learning}, author={Goldberg, D.E.}, year={1989},
publisher={Addison-Wesley Professional} }

[3] @article{laumanns_2002_combining, title={Combining convergence and
diversity in evolutionary multiobjective optimization}, author={Laumanns, M.
and Thiele, L. and Deb, K. and Zitzler, E.}, journal={Evolutionary
computation}, volume={10}, number={3}, pages={263--282}, year={2002},
publisher={MIT Press} }

[4] @inproceedings{coello_2001_uga, author = {Carlos A Coello Coello and
Gregorio Toscano Pulido}, title = {Multi-Objective Optimization Using a Micro-
Genetic Algorithm}, booktitle = {Proceedings of the Genetic and Evolutionary
Computation Conference (GECCO 2001)}, pages = {274--282}, publisher = {Morgan
Kaufmann}, address = {San Francisco, CA}, year = {2001} }

[5] @techreport{deb_1994_sbx, author = {Deb, K. and Agrawal, R. B.}, year =
1994, title = {Simulated binary crossover for continuous search space}, number
= { Technical Report IITK/ME/SMD-94027}, institution = {Indian Institute of
Technology, Kanpur}, address = {Kanpur, UP, India} }

[6] @article{storn_1997_de, author = {Storn, R. and Price, K.}, year = 1997,
title = {Differential evolution --- a simple and efficient heuristic for
global optimization over continuous spaces}, journal = { Journal of Global
Optimization}, volume = 11, number = 4, pages = {341--359} }

[7] @inproceedings{back_1994_selection, title={Selective pressure in
evolutionary algorithms: A characterization of selection mechanisms},
author={B{\"a}ck, Thomas}, booktitle={Proceedings of the First IEEE Conference
on Evolutionary Computation}, pages={57--62}, year={1994}, organization={IEEE}
}

[8] [http://moeaframework.org/](http://moeaframework.org/)

[9] [https://etda.libraries.psu.edu/](https://etda.libraries.psu.edu/) You'll
have to search for it, I'm afraid. The site's not responding for me right now.
But his dissertation is publicly available on Penn State's ETD site.

[10]
[http://www.iitk.ac.in/kangal/deb.shtml](http://www.iitk.ac.in/kangal/deb.shtml)

edit: formatting

~~~
sevensor
Here's a better link for [9]:
[https://etda.libraries.psu.edu/search/1/50/31/author/term=ha...](https://etda.libraries.psu.edu/search/1/50/31/author/term=hadka)

------
pjs_
Here's my version from a decade ago:

[http://peteshadbolt.co.uk/ga/](http://peteshadbolt.co.uk/ga/)

It's so old that it's written in Flash! One of these days I will port it to
canvas, I swear...

~~~
hliyan
I remember this when it came out! I was so fascinated I wanted to write my
own, but I didn't have the programming skills back then...

------
bentpins
It's a little dead, but there are a few more similar natural selection
simulators over at
[https://www.reddit.com/r/NSIP/top/?sort=top&t=all](https://www.reddit.com/r/NSIP/top/?sort=top&t=all)

------
IanDrake
It would be cool if someone added a car designer to this. I wonder if I could
design a car based on intuition that would perform better than the algo could
generate.

~~~
sevensor
This is a frequent request. It can be very interesting to see how professional
designs stack up against algorithmically generated ones. I'm a proponent of
multi-objective algorithms, and in my experience professional designs tend to
be at least close to Pareto-optimal. Pros are pros for a reason. But they are
often unaware of nondominated alternatives. For really hard problems, problems
that evolutionary algorithms are bad at, experts can still come up with better
alternatives than algorithms.

------
shostack
Would be cool to display the current world seed somewhere if it isn't
explicitly entered.

I was going to paste an example track I came across that seemed literally
impossible to conquer within the constraints of the mutations, but I can't
grab the seed.

In any event, would love more depth to the mutations. Longer/taller cars, etc.
Maybe even other things like randomly firing after burners, wings to let it
glide, etc.

Nice work!

------
sanoli
I keep looking for the version of this simulator where _you_ get to draw your
own car, and then you run it and see how well it handles the terrain. I played
with it many years ago but never found it again. Maybe someone here can help?

------
542458
Bizarrely, I seem to get much better performance in safari than in chrome with
this, which runs contrary to my expectations. I wonder why that is.

------
nmc
OP, please append "(2012)" to the title.

------
ixtli
So how long do I have to leave it running before the cars create art or try to
figure out who I am?

------
raldi
What does "Floor: [fixed / mutable]" mean?

~~~
ChicagoBoy11
You can have each generation always run on the exact same terrain, or the
terrain can be randomly generated on each iteration.

------
scottmcdot
Does the best one look like a Strandbeest?

