
What problems have you solved using genetic algorithms/genetic programming? - dlowe
http://stackoverflow.com/questions/1538235/
======
thornad
A GA solved a hard optimization problem, under serious resource constraints, helping bring a breakthrough technology to market:

In January 2004 I was contacted by Philips New Display Technologies, who were
creating the electronics for the first ever commercial e-ink book reader, the
Sony Librie, which at that point had only been released in Japan, years before
the Amazon Kindle and the others hit the market in the US and Europe. The
Philips engineers had a major problem. A few months before the product was
supposed to hit the market, they were still getting ghosting on the screen
when changing pages. The problem was the 200 drivers that were creating the
electrostatic field. Each of these drivers had a certain voltage that had to
be set between zero and 1000 mV or something like this. But if you changed one
of them, it would change everything, so optimizing each driver's voltage
individually was out of the question. The number of possible combinations of
values was in the billions, and it took about 1 minute for a special camera to
evaluate a single combination. The engineers had tried many standard
optimization techniques, but nothing came close. The head engineer contacted
me (I was in the US at the time, they were in Holland) because I had
previously released a Genetic Programming library to the open-source
community. He asked if GP/GAs would help and if I could get involved. I did,
and for about a month we worked together, me writing and tuning the GA library
on synthetic data, and him integrating it into their system. Then, one
weekend, they let it run live on the real thing. The next Monday I got these
glowing emails from him and their hardware designer about how nobody could
believe the amazing results the GA had found. That was it. Later that year the
product hit the market. I didn't get paid one cent for it, but I got
'bragging' rights. They had said from the beginning that they were already
over budget, so I knew what the deal was before I started working on it. And
it's a great story for applications of GAs.
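The overall shape of such a search (a vector of 200 voltages, one expensive black-box fitness call per candidate) can be sketched in a few lines. Everything concrete below, including the population size, the mutation rate, and especially the quadratic stand-in for the camera measurement, is invented for illustration; the real Philips setup and library are not public:

```python
import random

N_DRIVERS = 200     # one voltage per driver
V_MAX = 1000        # millivolts
POP_SIZE = 20       # tiny population: every fitness call is expensive
MUT_RATE = 0.05

def evaluate(voltages):
    """Stand-in for the camera measurement (which took ~1 minute per
    combination). This toy fitness just prefers voltages near 500 mV;
    the real ghosting metric is unknown. Higher is better."""
    return -sum((v - 500) ** 2 for v in voltages)

def crossover(a, b):
    cut = random.randrange(1, N_DRIVERS)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(ind):
    return [random.randint(0, V_MAX) if random.random() < MUT_RATE else v
            for v in ind]

pop = [[random.randint(0, V_MAX) for _ in range(N_DRIVERS)]
       for _ in range(POP_SIZE)]
for generation in range(50):
    pop.sort(key=evaluate, reverse=True)
    elite = pop[:POP_SIZE // 2]            # truncation selection + elitism
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP_SIZE - len(elite))]

best = max(pop, key=evaluate)
```

With a fitness that costs a minute per call, you would keep the population small, cache already-measured individuals, and lean on elitism so no expensive measurement is wasted, which is roughly what the elite fraction above stands in for.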

------
ced
With regards to GAs, it's good to keep in mind the comments from Russell and
Norvig's AI book:

"At present, it is not clear whether the appeal of genetic algorithms arises
from their performance or from their aesthetically pleasing origins."

Furthermore:

"Most comparisons of genetic algorithms with other approaches (esp. stochastic
hill climbing) have found that the GAs are slower to converge."

The discussion at the page below is very enlightening as well.

[http://books.google.ca/books?id=8jZBksh-bUMC&lpg=PR8&...](http://books.google.ca/books?id=8jZBksh-bUMC&lpg=PR8&dq=aima&pg=PA155#v=onepage&q=genetic&f=false)

~~~
_delirium
Yeah, I see them as just one tool in the randomized-optimization toolbox. It's
not entirely clear to me why many people see them as the only or the first
tool (perhaps the appealing biology metaphor?). In particular, I tend to like
a complexity ramp: first try the dumbest thing, like purely random candidates,
then move on to randomized hill climbing or a variant like simulated
annealing, then add more complex machinery on top of that, whether GA
crossover or something that learns and exploits a statistical distribution,
like MIMIC, if the simpler methods didn't work and you need to exploit the
structure of the space. And with purely discrete things, I also like
considering whether completely different approaches, like constraint
programming, would work instead (instead of optimizing a numerical
fitness/cost function, you give a list of things that must be true about any
desired solution, and ask it to find one meeting the constraints).
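The first rung of that ramp really is only a few lines. A sketch of randomized hill climbing, using a trivial one-max fitness (count of 1 bits) as a placeholder for a real objective:

```python
import random

def hill_climb(fitness, n_bits=50, steps=2000, seed=0):
    """Randomized hill climbing: flip one random bit, keep the flip only
    if fitness does not get worse, otherwise revert it."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    best = fitness(x)
    for _ in range(steps):
        i = rng.randrange(n_bits)
        x[i] ^= 1                 # flip one bit
        f = fitness(x)
        if f >= best:
            best = f              # keep the (non-worsening) flip
        else:
            x[i] ^= 1             # revert
    return x, best

# One-max (count the 1 bits): a trivially smooth placeholder landscape.
solution, score = hill_climb(sum)
```

Simulated annealing is the same loop with one change: a worse neighbor is sometimes accepted, with a probability that shrinks as a temperature parameter cools.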

------
endtime
* Generating rule sets for classifying politicians by party, based on voting data - this was for a class project. IIRC my results were a little better than the standard GrowRule (or whatever it was called) algorithm that most of the class used.

* Generating iterated prisoner's dilemma strategies. It's not that this is an intractable problem...but I wanted to see what would happen in a large population of different strategies. This actually resulted in support for some theoretical results I'd read, which was nice. This was for a class paper.

* The Mona Lisa polygon thing that Roger Alsing came up with - wrote this just to test a new GA lib I'd written to learn Python.

* Optimizing a nano-scale emitter for a new scanning electron microscope - I didn't actually _solve_ this problem, as I had to leave the job where I was working on it before the project finished...but I'd like to think that the project will get somewhere one day.

* "Translating" English prose to English verse; that is, trying to increase the metricality of text while minimizing semantic drift. Very, uh, mixed results (I blame Wordnet!). This was also a class project.

I feel like I might be forgetting something, but that's at least most of it.

~~~
abecedarius
How'd you measure metricality of text? <http://github.com/darius/versecop> was
my own hack at that using the CMU pronouncing dictionary -- I'd like to come
back to it sometime.

~~~
endtime
Well, we had a solution that used the CMU Pronouncing Dictionary but also had
a lot of unknown-word handling on top of it. Then we used a heavily modified
version of Levenshtein distance to measure distance from Shakespearean iambic
pentameter, with weighted penalties for the various "common irregularities";
we consulted someone in the English dept. to set the weights reasonably. Our
metricality algorithm gave Shakespeare's sonnets 0.85, some random Dickens
0.7, and the NYT 0.4 or so, so we think it worked reasonably well.

Happy to send you the paper if you're interested in more depth than the above.
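The core of that metric, a Levenshtein distance from a line's stress pattern to strict iambic pentameter with per-operation weights, can be sketched as follows. The weights and the stress-string encoding here are invented placeholders; the actual paper tuned per-irregularity penalties that this toy version doesn't model:

```python
def weighted_edit_distance(line_stress, target="0101010101",
                           sub=1.0, ins=0.6, dele=0.6):
    """Weighted Levenshtein distance between a line's stress pattern
    ('0' = unstressed, '1' = stressed syllable) and strict iambic
    pentameter. The operation weights are arbitrary placeholders."""
    m, n = len(line_stress), len(target)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * dele
    for j in range(1, n + 1):
        d[0][j] = j * ins
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0.0 if line_stress[i - 1] == target[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + dele,       # drop a syllable
                          d[i][j - 1] + ins,        # add a syllable
                          d[i - 1][j - 1] + cost)   # substitute / match
    return d[m][n]

print(weighted_edit_distance("0101010101"))   # perfect iambs: 0.0
print(weighted_edit_distance("1101010101"))   # one trochaic inversion
```

A real version would derive `line_stress` from the CMU dictionary's stress markers and normalize the distance by line length to get a 0-1 metricality score.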

~~~
abecedarius
Yeah, unknown words were the biggest problem. I'd love to see your paper --
withal@gmail.com if you don't want to post a link here. Thanks!

I wonder if y'all ever went looking for 'hidden verse' in published prose,
too.

------
psawaya
For reference, here's the classic list of 36 known human competitive results
produced by GP: <http://www.genetic-programming.com/humancompetitive.html>

------
antirez
I needed to have a lot of hacking fun: stack-based genetic programming
provided me with tons of wonderful hacking days a few years ago. Seeing your
computer figure out how to create programs like "DUP * 1 +", actually doing
symbolic regression by discovering the program with the best fitness for a
set of inputs, is something you'll hardly forget in your life.

Btw, other than having fun I also tried to use this approach to create a few
hash functions with good distributions.
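An interpreter for such stack programs takes only a handful of lines. This is a guess at a typical minimal instruction set, not the actual library antirez used: numbers push themselves, `DUP` duplicates the top of the stack, and binary operators pop two values:

```python
def run(program, x):
    """Evaluate a stack-based GP program on input x. Numbers push
    themselves, DUP duplicates the top of the stack, + and * pop two
    values and push the result. The input starts on the stack."""
    stack = [x]
    for tok in program.split():
        if tok == "DUP":
            stack.append(stack[-1])
        elif tok == "+":
            stack.append(stack.pop() + stack.pop())
        elif tok == "*":
            stack.append(stack.pop() * stack.pop())
        else:
            stack.append(float(tok))
    return stack[-1]

def fitness(program, cases):
    """Sum of squared errors over the training cases (lower is better)."""
    return sum((run(program, x) - y) ** 2 for x, y in cases)

print(run("DUP * 1 +", 3))                                      # x*x + 1 -> 10.0
print(fitness("DUP * 1 +", [(x, x * x + 1) for x in range(5)])) # 0.0
```

The GP loop then just mutates and recombines token strings, scoring each candidate with `fitness` over the training cases.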

------
rouli
I'm trying to evolve a bot for the current Google AI Challenge using genetic
algorithms. The results so far: a mediocre bot that doesn't even win on all
maps against the sample bots, and a laptop that seems about to die from
exhaustion any minute now.

~~~
tocomment
I would think genetic programming might make more sense. Have you looked into
that?

I'd love to hear more about your approach. Email me? I thought of using GP for
the Tron Google challenge, but it was too slow to launch that Java thing they
give you to run the players.

------
arfrank
Bill Gross gave a TED talk a few years ago in which he discusses how they used
genetic algorithms to design a solar collector and a subsequent Stirling
engine. There's not much on the details of what went into creating the
program, but there were a few interesting pictures of their progress along the
way.

<http://www.ted.com/talks/bill_gross_on_new_energy.html>

------
cubicle67
Solving magic squares

I wrote a solver (conventional style) in Ruby that could solve squares in
about 30 minutes, and I was pretty pleased with that, so I challenged a friend
of mine to beat me.

He came back an hour later claiming he had code that could solve it in under
10s. He'd written a genetic algorithm that solved squares on my machine in
about 6s _and_ in only about 30 lines of Java (less than my Ruby). I was
pretty humbled.
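The friend's code isn't shown, but the idea does fit in roughly 30 lines. A sketch for the 3x3 case: individuals are permutations of 1..9, fitness is total deviation from the magic sum, and the only genetic operator here is a swap mutation (all the constants are arbitrary, and a real version might add crossover or random immigrants to avoid stalling in local optima):

```python
import random

N = 3
MAGIC = N * (N * N + 1) // 2      # 15 for a 3x3 square

def badness(perm):
    """Total deviation of all rows, columns and diagonals from the
    magic sum; 0 means perm (read row by row) is a magic square."""
    rows = [perm[i * N:(i + 1) * N] for i in range(N)]
    cols = [list(c) for c in zip(*rows)]
    diags = [[rows[i][i] for i in range(N)],
             [rows[i][N - 1 - i] for i in range(N)]]
    return sum(abs(sum(line) - MAGIC) for line in rows + cols + diags)

def evolve(pop_size=100, generations=500, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(range(1, N * N + 1), N * N) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=badness)
        if badness(pop[0]) == 0:
            break                              # found a magic square
        survivors = pop[:pop_size // 4]        # truncation selection
        pop = survivors[:]
        while len(pop) < pop_size:
            child = rng.choice(survivors)[:]
            i, j = rng.sample(range(N * N), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            pop.append(child)
    return min(pop, key=badness)

square = evolve()
```

Being stochastic, this can stall short of a perfect square within the generation budget; for larger squares you would want a permutation-preserving crossover such as PMX and a bigger population.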

------
motters
In 2007-9 I developed some software for reading Data Matrix patterns. Often
these patterns were difficult to read, being indented into scratched surfaces
with all kinds of reflectance properties, fuzzy chemically etched markings and
so on. I used a GA to fine-tune various parameters of the vision algorithms to
give the best results on a database of 300 images with known properties.
Parameters were things like downsampling resolution, RANSAC parameters, amount
of erosion and dilation, low-pass filtering radius, and a few others. Running
the optimisation over several days produced results that were about 20% better
than naive parameter values, measured on a test set of images unseen during
the optimisation phase.

This system was completely written from scratch, and I didn't use any other
libraries. I'm not opposed to using such things, provided that they give a
reliable result, but you have to be careful about license compatibility and
code portability issues.

------
dangrossman
I used some GP to find optimal ranges for sensors meant to detect malware on
computers. The sensors were things like process and thread counts, memory
usage, disk activity and so on, but there were around 40 sensors in total and
very large ranges of data seen on computers running normally and on computers
with some kind of malware running in the background. The detection software
built with the GP results worked pretty well. It was just a project I did for
a reverse engineering class in college.

------
terra_t
GA and GP are powerful methods for machine learning, but other methods are
often a better use of programmer and computer time, particularly the Support
Vector Machine.

------
jfager
Answering a slightly different question: if you're considering trying to solve
a problem with a GA, you might first try simulated annealing. For many
problems, it will probably converge to a decent answer more quickly, is quite
a bit easier to code, and is less resource intensive.
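For concreteness, a minimal simulated-annealing loop; the bit-string toy problem at the bottom and the schedule constants are arbitrary placeholders:

```python
import math
import random

def anneal(cost, neighbor, x0, t0=1.0, cooling=0.999, steps=20000, seed=0):
    """Simulated annealing: always accept an improvement; accept a worse
    neighbor with probability exp(-delta / temperature), and cool the
    temperature geometrically so the late search is nearly greedy."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy <= c or rng.random() < math.exp((c - cy) / t):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        t *= cooling
    return best, best_c

# Toy problem: match an arbitrary 50-bit target pattern.
TARGET = [1, 0] * 25
cost = lambda bits: sum(a != b for a, b in zip(bits, TARGET))

def neighbor(bits, rng):
    y = bits[:]
    y[rng.randrange(len(y))] ^= 1   # flip one random bit
    return y

best, best_cost = anneal(cost, neighbor, [0] * 50)
```

The whole trick is the acceptance rule: improvements are always taken, while worse moves are taken with a probability that shrinks as the temperature cools, so the search explores early and exploits late.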

------
abecedarius
Made <http://djb.deviantart.com/gallery/> using my eye for the fitness
function. Karl Sims did it better originally, but I had some fun with it.

------
c00p3r
_My first job as a professional programmer (1995) was writing a genetic-
algorithm based automated trading system for S&P 500 futures. The application
was written in Visual Basic 3_ - thanks! You made my day!

