
The "Hello World" of genetic algorithms - DanielHimmelein
http://www.generation5.org/content/2003/gahelloworld.asp
======
karpathy
Genetic algorithms, heuristic hill-climbing, simulated annealing, etc, etc,
etc. These are all basically equivalent ways of tackling an optimization
function that can only be queried point-wise, and they are all subject to the
same problems (most notably local optima, the curse of dimensionality, etc.).

I've seen these used rather sporadically in academia, where people generally
prefer to reformulate a difficult optimization problem as a convex (or
similar) relaxation that is provably solvable, and then throw solvers at it.

However, I get the impression that when you just need some reasonable
solution quick and dirty in practice (especially in discrete domains), one of
these is the method of choice. I'm not aware of any particularly good
arguments for GAs specifically, though.

~~~
Kartificial
The power of EAs (Evolutionary Algorithms) is that they are able to escape
local optima through random mutation and recombination.

EAs are suitable for finding solutions to complex problems, as long as you
state the problem correctly and have a decent fitness function. Beyond that,
parameter tuning is vital to creating a decent EA. This goes so far that there
has been research into tuning the parameters of an EA using another EA
(creating an EA Inception ;)).
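
As a rough sketch of how selection, recombination, and mutation fit together (parameters and names here are illustrative, not taken from the article):

```python
import random

rng = random.Random(42)
TARGET = "Hello, World!"
CHARS = [chr(c) for c in range(32, 127)]  # printable-ASCII gene pool

def fitness(s):
    # Lower is better: summed per-character distance to the target.
    return sum(abs(ord(a) - ord(b)) for a, b in zip(s, TARGET))

def mutate(s, rate=0.1):
    # Random mutation: each character has a small chance to be replaced.
    return "".join(rng.choice(CHARS) if rng.random() < rate else c for c in s)

def crossover(a, b):
    # One-point recombination of two parent strings.
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = ["".join(rng.choice(CHARS) for _ in TARGET) for _ in range(200)]
start_best = min(fitness(s) for s in pop)
for _ in range(500):
    pop.sort(key=fitness)
    if fitness(pop[0]) == 0:
        break
    elite = pop[:50]  # truncation selection: keep the fittest quarter
    pop = elite + [mutate(crossover(rng.choice(elite), rng.choice(elite)))
                   for _ in range(150)]
final_best = fitness(min(pop, key=fitness))
```

Drop the mutate() call and the population tends to stall on whatever material the elites already carry; the random mutations are what keep injecting the new variation that lets the search climb out of dead ends.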

------
Nvn
I really love the elegance of algorithms like these. A bunch of similar
algorithms are discussed in Clever Algorithms[1].

It also discusses genetic algorithms[2], providing a really simple
implementation that suffers from the same problem as the article in question:
the outcome you're looking for is already known, so the fitness function is
rather trivial. Nonetheless, it's a good place to get started.

[1] <http://www.cleveralgorithms.com/>

[2] <http://www.cleveralgorithms.com/nature-inspired/evolution/genetic_algorithm.html>

------
peteretep
Maybe I'm way off the mark here, but a function that iterates towards a
known target string doesn't really feel like a genetic algorithm. If you
already know the exact output, you aren't really testing 'fitness' in any
meaningful sense - you're testing closeness. Why not just use A* to path-find
your way to it?

Genetic algorithms are surely meant for iterating towards a goal you can't
define, other than fitness towards a certain set of criteria - if you know
what the target looks like, what's the point in iterating towards it, rather
than just going at it?

~~~
peteretep
I think what I'm trying to say is: if your fitness function contains the
solution, then your problem is entirely contrived and not a good illustration
of genetic algorithms - you're just using a very roundabout and inefficient
way of calculating something you already knew.

------
llambda
I love these. I think the original concept is based on the Weasel Program[1]
formulated by Richard Dawkins to demonstrate evolutionary change and natural
selection. Here's my attempt[2] in Clojure from a few weeks ago.

[1] <http://en.wikipedia.org/wiki/Weasel_program>

[2] <https://gist.github.com/2072019>

------
jordhy
Very basic but also very instructional. The key to these sorts of algorithms
is writing a very efficient fitness function. The best GAs use a GA to find
the best fitness function (sort of like metaprogramming, or
"meta-reflection").

This PPT has a great intro on GAs:
<http://cs.uga.edu/~khaled/ECcourse/chapter03.ppt>

~~~
sukuriant
It's so much more complex and amazing than that, especially in problem spaces
that have numerous local optima. My personal favorite weird genetic-algorithm
exploration paradigm is one where you start looking at the behavior of the
genome and, instead of searching for the _best_ behavior, you search for
_different_ behavior. It's called Novelty Search, and it's awesome!
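
For the curious, a toy sketch of the mechanism (the names and the random-walk behavior characterization are made up for illustration): candidates are scored by how far their _behavior_ lies from behaviors already seen, and sufficiently novel ones go into an archive.

```python
import random

rng = random.Random(0)

def behavior(genome):
    # Behavior characterization: here, the endpoint of the 2-D walk
    # that the genome's steps describe.
    return (sum(g[0] for g in genome), sum(g[1] for g in genome))

def novelty(b, archive, k=5):
    # Mean distance to the k nearest behaviors seen so far.
    dists = sorted(((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5
                   for a in archive)
    return sum(dists[:k]) / min(k, len(dists))

archive = [(0.0, 0.0)]
for _ in range(200):
    genome = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(10)]
    b = behavior(genome)
    if novelty(b, archive) > 1.0:  # novel enough: remember this behavior
        archive.append(b)
```

Note that no goal or target appears anywhere in the scoring; the archive simply fills out the space of behaviors the search has managed to produce.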

~~~
jal278
I'm glad you think novelty search is awesome; I'm doing my dissertation on
it! Out of curiosity, how did you hear about it?

~~~
sukuriant
I know Kenneth Stanley and loooove NEAT :)

------
Lambdanaut
I apologize for the cheese factor of the "goal" beforehand, but here's a
Python program I wrote recently that does the same thing:

<https://gist.github.com/1649116>

I sat down with my fiancée on Valentine's Day and asked her if she could
figure out what the letters on the screen were trying to spell. I think she
really loved it!

------
debacle
I did a zombie port of this to JavaScript so I could take a peek at what's
going on without having to remember my C++. Take a look:

<http://jsfiddle.net/zENKM/>

------
rheide
Quite a good read and a good introduction to the topic. I somehow ended up
writing a Python version. Here it is:
<https://github.com/rheide/Hello-genetic-algorithm/blob/master/hellogen.py>

------
agscala
Link is dead, google cache to the rescue:

<http://webcache.googleusercontent.com/search?q=cache:JBfk95WBF2kJ:www.generation5.org/content/2003/gahelloworld.asp+&cd=1&hl=en&ct=clnk&gl=us>

------
Caligula
Wow, it has been years since I saw generation5. It used to have the best AI
learning materials and a good forum about eight years ago. The site seems to
be down for me, however.

------
user24
I did the same thing in JavaScript a while ago with a more detailed writeup:
<http://www.puremango.co.uk/2010/12/genetic-algorithm-for-hello-world/>

Here's the HN discussion at the time:
<http://news.ycombinator.com/item?id=2002673>

------
arctangent
I wrote something similar over Christmas in both C and D:

<http://jacobconradmartin.com/experiments/evolution/>

The graph of mutation rate versus convergence time might be of interest.

------
Kartificial
A nice TEDx talk from Guszti Eiben about the possible applications of
Evolutionary Algorithms. <http://www.youtube.com/watch?v=WJX_wAKhg8A>

------
aerique
There's a more elaborate version on GitHub for multiple languages:
<http://github.com/jsvazic/GAHelloWorld>

------
tiles
An excellent, simple extension to this is to throw basic arithmetic operators
into the mix (the digits 0-9 and +, -, / and *, perhaps) and to try to breed
an expression which, when evaluated, yields a given number, like 500. You can
get startlingly close answers and interesting equations this way.
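
A quick sketch of that extension (all parameters here are my guesses, and eval() is tolerable only because the alphabet is restricted to digits and operators in a toy setting):

```python
import random

rng = random.Random(1)
GENES = "0123456789+-*/"
TARGET = 500

def value(expr):
    if "**" in expr:  # forbid accidental exponentiation: it can blow up
        return None
    try:
        return eval(expr, {"__builtins__": {}})  # toy use on digits/ops only
    except (SyntaxError, ZeroDivisionError, TypeError, NameError):
        return None

def fitness(expr):
    # Lower is better; unparsable expressions score worst.
    v = value(expr)
    return abs(v - TARGET) if v is not None else float("inf")

def mutate(expr):
    # Point mutation: replace one character with a random gene.
    i = rng.randrange(len(expr))
    return expr[:i] + rng.choice(GENES) + expr[i + 1:]

pop = ["".join(rng.choice(GENES) for _ in range(8)) for _ in range(100)]
for _ in range(300):
    pop.sort(key=fitness)
    pop = pop[:25] + [mutate(rng.choice(pop[:25])) for _ in range(75)]
best = min(pop, key=fitness)
```

Most random strings don't even parse, which is part of the fun: the population has to discover valid syntax before it can start closing in on the target value.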

------
jostmey
My impression of genetic algorithms is that they work in the context of
biology (you can actually watch it happen in real time with viruses and
bacteria), not computer science. I have yet to see an instance where a real
problem in computer science was best solved using genetic algorithms.

~~~
SeanLuke
The Genetic Algorithm ("genetic algorithms" is actually a misnomer) has
little to do with biology beyond its original inspiration. It is simply one of
a number of sample-based methods for doing global stochastic optimization.

There's an old saying about the Genetic Algorithm: that it's the "third best
way to do anything". Stochastic optimization methods (or "metaheuristics") in
general are knowledge-poor methods, essentially last-ditch techniques where
you don't have any other known way to solve your problem and you don't want to
jump off the cliff into random or brute-force search. They rely on a central
heuristic: that similar candidate solutions will likely have similar
performance (the "smoothness" criterion).

Here's the thing. There are a _huge and growing number of crunchy problems in
this category_. If you're trying to find a good tic-tac-toe solution, you can
almost certainly do better than a stochastic optimization method (just use
state-space search, say). But if what if you're trying to find the set of
behaviors for a two-thousand-agent multiagent simulation model which produces
statistics most closely resembling known historical data? Or what if you're
trying to find the best parameters for optimizing an aircraft engine whose
space is filled with local optima? It's ugly problems like these, for which
there is _no principled solution method_, where the Genetic Algorithm and its
ilk reign supreme. You might say that you never see problems in computer
science which are "best solved using genetic algorithms". My answer is: your
daily problems are too simple to need them.

Book plug: you might enjoy my free online text on the subject, called
Essentials of Metaheuristics. You can also get it in paperback.
<http://cs.gmu.edu/~sean/book/metaheuristics/>

~~~
NyxWulf
I've read through a large portion of the PDF version of your book quite a few
times. It is very well written and helped me understand a number of different
techniques.

I definitely recommend Sean's book as a starting point if you are interested
in this field.

(I don't know Sean, and I don't get anything for this recommendation other
than the happy glow of helping a deserving author get more recognition.)

~~~
SeanLuke
Thank you very much.

------
vellum
A good write-up. I played around with Pyevolve[1], which is also written in
Python. GAs are non-optimal for most problems, but they're a fun starting
point.

[1]: <http://pyevolve.sourceforge.net/>

------
eridius
Here's a fairly direct translation of the original program into Go:
<http://play.golang.org/p/Xg9f0lM40n>

------
eridius
There's a bug in the program: in the mate() function, when it recalculates
the string, it uses `esize-spos` when it means `tsize-spos`.
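
In Python terms (names only approximate the article's C++), the corrected crossover copies the remaining `tsize - spos` characters from the second parent:

```python
import random

def mate(p1, p2, rng):
    # Both parents are target-length strings; the child takes p1 up to a
    # random crossover point and p2 from there to tsize (not esize).
    tsize = len(p1)
    spos = rng.randrange(tsize)
    return p1[:spos] + p2[spos:tsize]  # i.e. tsize - spos chars: the fix

child = mate("Hello, Wxyz!!", "abcde, World!", random.Random(3))
```

With `esize - spos` the substring length is computed from the wrong size variable, so the child can end up the wrong length whenever esize and tsize differ.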

