Hacker News
Playing with genetic algorithms in Python (joseprupi.github.io)
104 points by melenaboija 19 days ago | 16 comments



GAs are super cool and I think underutilized in the NN era.

One thing that might improve the algorithms in the linked write-up is not choosing strictly the top results. GAs often perform better if you choose a random selection from the result set, then take the top performers from that selection.

So, if your algorithm generates 100 results, pick 10 randomly, then use the top 2 from that selection to produce the next generation. This allows more exploration of the solution space and mitigates getting stuck in local optima. The new parameters mean more tuning, but it's been shown to work better, and it makes sense in the context of biological evolution as well.
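A minimal sketch of that idea (all names here are illustrative, not from the article): sample a random subset of the population, then take only its fittest members as parents.

```python
import random

def select_parents(population, fitness, sample_size=10, n_parents=2):
    """Sample `sample_size` individuals at random, then return the
    `n_parents` fittest of that sample as parents."""
    subset = random.sample(range(len(population)), sample_size)
    # Rank the sampled indices by fitness, best first
    best = sorted(subset, key=lambda i: fitness[i], reverse=True)[:n_parents]
    return [population[i] for i in best]

# 100 candidate solutions with toy fitness scores
pop = [f"genome-{i}" for i in range(100)]
fit = [random.random() for _ in pop]
parents = select_parents(pop, fit)  # 2 parents drawn from a random 10
```

Because the top performers of the whole population usually aren't in the sampled 10, merely good genomes also get a chance to reproduce.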

Anyway - GAs are fun and I'd like to see them used more. Thanks for the article OP!


I agree. Personally, I had success adding randomness to an energy-based system: individuals start with some energy and compete to earn more, and once you hit 0 you're out. You then generate new individuals, some completely random and others combinations of random choices from the top N. Probably not the most theoretically sound approach, but it worked well enough in the real world.
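One way that description could be sketched in code (everything here is a hypothetical toy, including the "fitness" field and the averaging "crossover"):

```python
import random

def step(agents, top_n=5, energy_start=10):
    """One round of an energy-based GA sketch: two agents compete,
    the winner takes a unit of energy from the loser, and agents
    that hit zero energy are replaced by new ones."""
    a, b = random.sample(range(len(agents)), 2)
    winner, loser = (a, b) if agents[a]["fitness"] >= agents[b]["fitness"] else (b, a)
    agents[winner]["energy"] += 1
    agents[loser]["energy"] -= 1

    # Drop the dead, then refill: half fresh randoms, half children of the top N
    survivors = [ag for ag in agents if ag["energy"] > 0]
    elites = sorted(survivors, key=lambda ag: ag["fitness"], reverse=True)[:top_n]
    while len(survivors) < len(agents):
        if random.random() < 0.5 or not elites:
            child = {"fitness": random.random(), "energy": energy_start}
        else:
            p1, p2 = random.choice(elites), random.choice(elites)
            child = {"fitness": (p1["fitness"] + p2["fitness"]) / 2,  # toy "crossover"
                     "energy": energy_start}
        survivors.append(child)
    return survivors
```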


Evolutionary computation has, perhaps surprisingly, been successfully applied to many engineering (not software!) problems such as antenna design, aerodynamics, and whatever else involves shape optimization.


Rather than choosing 10 fully randomly, I think you would have more luck using fitness to guide the selection process, with methods like roulette-wheel selection (letting the probability of being selected be somewhat influenced by fitness). This retains the benefit of wide exploration of the solution space, as you mention, but also keeps some of the benefit of evolution: the fittest prevail.

Throughout the generations you can adjust the parameters (selection pressure) from only slightly favoring fitness to strongly favoring it, giving the GA a large initial search space that gradually narrows toward an optimum. If you are worried about the solution converging too soon, a high mutation rate can also be used.
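A small sketch of roulette-wheel selection with an adjustable pressure knob. The exponent here is just one simple way to implement selection pressure, not the only one, and it assumes non-negative fitness values:

```python
import numpy as np

def roulette_select(fitness, n, pressure=1.0, rng=None):
    """Fitness-proportionate selection of `n` indices; larger `pressure`
    sharpens the distribution toward the fittest individuals."""
    rng = rng or np.random.default_rng()
    f = np.asarray(fitness, dtype=float) ** pressure
    probs = f / f.sum()  # selection probability proportional to fitness^pressure
    return rng.choice(len(fitness), size=n, p=probs)

fitness = [1.0, 2.0, 3.0, 10.0]
early = roulette_select(fitness, 5, pressure=0.5)  # near-uniform exploration
late = roulette_select(fitness, 5, pressure=4.0)   # strongly favors the fittest
```

Ramping `pressure` up over the generations gives exactly the wide-then-narrow search described above.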


In my experience using tournament selection with a small elitism count works best. You keep the best `n` members of the population and for all other members you group them into random groups of size `m` and select the best value from that group.
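A sketch of that combination (this variant draws tournament groups with replacement, a common simplification, rather than partitioning the population into disjoint groups):

```python
import random

def next_generation_parents(population, fitness, elite_n=2, tourney_m=4):
    """Elitism plus tournament selection: carry the best `elite_n` over
    unchanged, then fill the rest of the pool with tournament winners."""
    ranked = sorted(range(len(population)), key=lambda i: fitness[i], reverse=True)
    elites = [population[i] for i in ranked[:elite_n]]
    parents = []
    while len(elites) + len(parents) < len(population):
        group = random.sample(range(len(population)), tourney_m)  # one tournament
        winner = max(group, key=lambda i: fitness[i])
        parents.append(population[winner])
    return elites, parents
```

Tournament size `m` acts as the selection-pressure knob: larger groups favor the fittest more strongly.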


Why would that be different from generating 10 results and picking the top 2? I assume any 2 generated results from the same generation would be "within the same distribution", if that makes sense.

Is there a term for that approach?


Larger population = faster state space exploration. Sampling relies on the structure of the population. Most individuals will typically be minor variations on one of the successful genotypes, with some smaller percentage of novel genotypes. Random sampling will usually ensure that the successful genes are broadly represented while occasionally including the others. Since the sampling probably won't include the most optimized individuals, those others can compete in an easier environment and the algorithm has a better chance to escape local minima.


If you want to read what I think is one of the coolest experiments with GAs I've ever seen, check out the paper "An evolved circuit, intrinsic in silicon, entwined with physics" by Adrian Thompson [0]: taking the bitstream of an FPGA, running it through a GA, and then testing the hardware in-circuit while judging fitness against a desired function. Crazy things ensue.

[0] https://link.springer.com/chapter/10.1007/3-540-63173-9_61 but likely can be found online in other places


I just looked up this paper from "other places". The author evolves an FPGA to discriminate between 1 kHz and 10 kHz square waves. I don't know that much about FPGAs, but it seems like such a task should not be possible without an external clock signal, since otherwise the output waveform for the 10 kHz signal should simply be the 1 kHz waveform sped up 10x. Somehow, evolution learned to make use of electrical and thermal effects to accomplish the task. There were even units that were electrically separated from the rest of the device which were crucial to its functioning. Pretty amazing.


That paper, and TS Ray's Tierra, left me with a life-long fascination with GA. But as I understand it, as long as you're chasing a fixed fitness function the technique doesn't really outperform simulated annealing.


I remember presenting that paper in grad school. Can confirm the ensuing WTF factor.


It seems like using np.random.choice is indeed a slow way to get a grid in which 5% of the values are 1. I would recommend using np.random.rand(rows, columns) <= mutations instead:

  > python3 -m timeit 'import numpy as np; mutations=0.05; rows=10; columns=10; np.random.choice([0,1], p=[(1-mutations), mutations],size=(rows,columns))'
  50000 loops, best of 5: 8.88 usec per loop

  > python3 -m timeit 'import numpy as np; mutations=0.05; rows=10; columns=10; np.random.rand(rows,columns) <= mutations'
  200000 loops, best of 5: 1.06 usec per loop
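The boolean mask drops straight into a mutation step; for example, to flip roughly 5% of the bits in a 0/1 grid (the grid here is a made-up stand-in for the article's population array):

```python
import numpy as np

mutations = 0.05
grid = np.random.randint(0, 2, size=(10, 10))   # toy 0/1 population grid
mask = np.random.rand(*grid.shape) < mutations  # True on ~5% of cells
grid[mask] ^= 1                                 # flip only the masked bits
```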


I've been fascinated by GAs ever since Blondie24 [1] demonstrated they could be used to create strong board-game players. Now that the Distributed Evolutionary Algorithms in Python (DEAP) [2] library exists, I am tempted to recreate Blondie24.

[1]. https://en.wikipedia.org/wiki/Blondie24

[2]. https://deap.readthedocs.io/en/master/


I recall this book [0] from my younger days; it was a very approachable introduction to the subject.

[0] An Introduction to Genetic Algorithms https://a.co/d/hKyQoqO


GAs always interest me. Sorry for being naive here, but I want opinions from HN. What really intrigues me is how the principle of evolution just works when implemented as an algorithm. I've wondered: is there an implementation of evolution in a similar way, but as a differential-equation model, like an ODE or PDE? A PDE model of natural selection, say. I did some searching 3 years ago, but Google was unreliable and a lot of unrelated things came up.


I love GAs and really like Python, but given Python's traditional struggle with the multithreading that GAs crave, it's not the most obvious combination.
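GAs are embarrassingly parallel at the fitness-evaluation step, though, so the usual workaround is processes rather than threads. A minimal sketch with the stdlib, where the fitness function is a placeholder for something expensive:

```python
from multiprocessing import Pool

def fitness(genome):
    """Placeholder fitness: sum of genes, standing in for an expensive evaluation."""
    return sum(genome)

def evaluate_population(population, workers=4):
    # Each worker process evaluates individuals independently,
    # sidestepping the GIL for CPU-bound fitness functions.
    with Pool(workers) as pool:
        return pool.map(fitness, population)

if __name__ == "__main__":
    pop = [[i, i + 1, i + 2] for i in range(8)]
    scores = evaluate_population(pop)
```

Since individuals in a generation are evaluated independently, this scales close to linearly when fitness evaluation dominates the runtime.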

