Show HN: A Genetic Algorithm library written in JavaScript (github.com/lodenrogue)
48 points by nyxyn on Aug 14, 2020 | hide | past | favorite | 41 comments



Remember that great old Flash thingy where you mutate a boxcar until it drives well? I just found out there's an HTML version: https://rednuht.org/genetic_cars_2/


I made something similar for teaching shapes to walk [0]. It's actually a workshop to teach people at my job how GAs work by having them implement the different parts themselves.

[0]: https://matsemann.github.io/walkingea/


The thing about GAs is that the GA part is easy -- it's the fitness function (the simulation here) that is the hard/interesting part. This problem has a nice evaluation function (the duration of the sim); a lot of problems have what amounts to a binary pass/fail, or it's not obvious where to do credit assignment in your genome.
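To make that concrete, here's a deliberately minimal sketch (not this library's API; all names are made up, and there's no crossover) of a GA on the OneMax problem. The loop, selection, and mutation are generic boilerplate; the only problem-specific piece is `fitness`, which here is trivially "count the ones" -- in a real problem that one function is the whole simulation:

```javascript
// OneMax: maximize the number of 1-bits in a fixed-length bit string.
const GENOME_LEN = 32;

// The problem-specific part: everything else below is generic.
const fitness = genome => genome.reduce((sum, bit) => sum + bit, 0);

const randomGenome = () =>
  Array.from({ length: GENOME_LEN }, () => (Math.random() < 0.5 ? 1 : 0));

// Flip each bit with probability 1/GENOME_LEN.
const mutate = genome =>
  genome.map(bit => (Math.random() < 1 / GENOME_LEN ? 1 - bit : bit));

function evolve(popSize = 50, generations = 200) {
  let pop = Array.from({ length: popSize }, randomGenome);
  for (let g = 0; g < generations; g++) {
    // Truncation selection: keep the fitter half as parents...
    const ranked = pop.slice().sort((a, b) => fitness(b) - fitness(a));
    const parents = ranked.slice(0, popSize / 2);
    // ...and refill the population with parents plus mutated copies
    // (keeping the parents unchanged makes this elitist).
    pop = parents.concat(parents.map(mutate));
  }
  return pop.reduce((best, ind) => (fitness(ind) > fitness(best) ? ind : best));
}
```

Swapping in a pass/fail fitness function here would stall the search immediately: with no gradient of scores, selection has nothing to rank, which is exactly the credit-assignment problem above.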


I like the cleanliness of the library. I learned about and used GAs in college. It is important for a library like this to be able to retain the fittest candidate seen so far; that avoids the situation where a good solution is lost because the population evolves away from it. Optionally, the best candidate can also be kept in the breeding pool. Deleting unfit solutions should be a parameter, and breeding-partner selection should be weighted by fitness.


The idea of choosing partners based on fitness is implemented in most GAs. It's called selection, and a basic form is to pick two individuals at random for each parent slot and breed only the fitter of the pair (tournament selection).
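A binary tournament is only a few lines; this is a generic sketch (names are mine, not from any particular framework):

```javascript
// Binary tournament selection: sample two individuals uniformly at
// random and return the fitter one as a parent.
function tournamentSelect(population, fitness) {
  const pick = () => population[Math.floor(Math.random() * population.length)];
  const a = pick();
  const b = pick();
  return fitness(a) >= fitness(b) ? a : b;
}
```

Selection pressure is tunable: larger tournaments (best of 3, 4, ...) favor the fittest more aggressively.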

Many frameworks also include a "hall of fame" to avoid evolving away from a good point like you mentioned.


The type of selection I was talking about is called "fitness proportionate selection"[1]. I prefer it, mainly because I am more familiar with it.

[1]: https://en.wikipedia.org/wiki/Fitness_proportionate_selectio...
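A minimal sketch of that idea, often called roulette-wheel selection (function name and shape are mine; it assumes non-negative fitness values):

```javascript
// Fitness-proportionate (roulette-wheel) selection: each individual's
// chance of being picked is its share of the population's total fitness.
function rouletteSelect(population, fitness) {
  const scores = population.map(fitness);
  const total = scores.reduce((a, b) => a + b, 0);
  let r = Math.random() * total; // spin the wheel
  for (let i = 0; i < population.length; i++) {
    r -= scores[i];
    if (r <= 0) return population[i];
  }
  return population[population.length - 1]; // guard against float rounding
}
```

One known weakness: if a single individual's fitness dwarfs the rest, it dominates the wheel and diversity collapses, which is one reason tournament selection is often preferred.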


What do you recommend as good GA frameworks?


DEAP is the best-known framework in Python.


Thanks for your comment. This algorithm keeps the best candidates through every generation.


I see that, but the best candidate of the current generation is not necessarily the best candidate ever seen. Reproduction and mutation can lead to the best solution of a given generation being worse than the generation before it.


The best candidate is always kept throughout the whole evolution process. If you find the best candidate in the second generation, for example, it will be kept all the way to the end.
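That guarantee amounts to tracking the best individual ever seen, independently of the current population. A generic sketch of the idea (not this library's code; `step` stands for whatever produces the next generation):

```javascript
// Elitism across the whole run: even if later generations regress,
// the best candidate ever seen is never lost.
function evolveWithElitism(initialPop, fitness, step, generations) {
  let pop = initialPop;
  let best = pop.reduce((a, b) => (fitness(b) > fitness(a) ? b : a));
  for (let g = 0; g < generations; g++) {
    pop = step(pop); // selection + crossover + mutation
    const genBest = pop.reduce((a, b) => (fitness(b) > fitness(a) ? b : a));
    if (fitness(genBest) > fitness(best)) best = genBest;
  }
  return best;
}
```

With this in place the reported best fitness is monotonically non-decreasing, even if `step` is destructive.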


Interesting.

I proposed a GA framework in my final thesis, but I ended up never actually sharing the code, because in the end you have to write most of the code yourself anyway: how to create your population, how to evaluate each individual, how to cross them...

My final thesis is in Portuguese, otherwise I'd share a link.


Do you have diagrams to share? Those would be easy to understand regardless of language.


http://dspace.unipampa.edu.br/bitstream/riu/4849/1/Victor%20...

I do intend to take the time to translate it to English.


Are genetic algorithms getting popular again? This was the topic of my senior project 20 years ago. I had barely heard about them since I left university, but all of a sudden within the last six months I'm hearing about them again.


They've been quite popular in my field for the past decade and I am doing some work on them right now.


What field are you in?


Particle accelerator physics. They are used in the optimization of charged particle optics.


Purely out of curiosity: is there something about them that makes them better than other optimization algorithms, e.g. simulated annealing, random local search, etc., or, for continuous functions, gradient descent and the like? GAs are often glossed over as a relic of past times, so it's pretty interesting to hear about them in practice somewhere!


We use them because they are particularly well suited for multi-objective optimization. They are also a global technique, as opposed to gradient-based methods, and have great scaling properties. That matters because each evaluation of the objective function takes ten minutes for us.

GAs have found a niche in a lot of "obscure" engineering situations. Anything where you need to optimize an expensive black-box function is at least worth a try with a GA. I know they are used in analog IC design, civil engineering, and aerospace.


They can be very good when there are multiple objectives / tradeoffs. I wrote my thesis on that, which includes an introduction to the concept and how it works if you're interested. http://master.matsemann.com/


Haven't worked on them a lot recently, but AFAIK the population-based nature helps them escape local minima and do a better job of exploring the solution space.


Where are you hearing about them? I haven't personally seen an increase in popularity, but that's not saying much.


Mostly links here on HN, links at /r/genetic_algorithms hitting my feed a little more often, and I'm seeing GAs referenced with designing neural nets in random articles.

I'm guessing that it's the last one that would drive renewed interest if that is indeed what I've been seeing, though I'm wondering if there is interest in applications for GAs outside of AI/ML.


I believe AutoCAD implemented a service which uses a GA to design an optimal solution for desired engineering specs.

At their core, GAs are search algorithms, so they are relevant in lots of fields. Combine them with digital simulations (e.g. physics, games, war) and you can search at far faster-than-life speeds.


GAs are used in electronic circuit routing (sorry, I don't know the correct English name for the lines printed on circuit boards that connect different pins): they solve the problem of laying the lines out while avoiding intersections. GAs are used to design antennas, and they are often used on travelling-salesman-like problems. GAs and GP would also be very useful in other contexts, like finding sequences of actions, if only they could benefit from GPUs the way neural nets did. But GPUs aren't well suited to GA/GP, which is a pity, since with GA/GP you can understand the inner structure more easily than with a neural net, where you simply don't know why the net made a given decision.


I'm sure you could find a way to run GAs on a GPU. If you could have generations measured in billions of individuals and compute all of their fitness, mutation, and crossover in parallel, it should drastically speed up the algorithm.


That's exactly the point: is it possible? From my understanding, GPUs allow massively parallel matrix calculations, but that's not the case for mutation and crossover (which involve random selection of which genes to mutate). And complex fitness functions (based on operations that take more data into account per point, like a moving average) can't easily be converted to matrix calculus either.


Circuit route design :)


Neural network tech went through a weird period and then came back to basics (backprop), just with massive computation (and simple activation functions), with fabulous results. I wonder if similar things could be done with other old-school techniques like GAs. Does anyone know if we have tried massive ones?


There is plenty of work running evolutionary algorithms on GPUs, but there are a number of limiting factors that make it less than ideal. Evolutionary algorithms in general can be extremely effective, they just solve very different problems from "is this a picture of a cat?" and aren't as sexy as neural networks.

In fact, much of "old school" CS is still very much alive and applying modern techniques and hardware to their problems. It's not that neural networks are some field of CS that is much more advanced than the rest of CS but rather neural networks happen to be the current hot topic.


What do you mean by a massive GA? Just using a huge population? The issue is that in a GA, runtime is dominated by evaluation of the fitness function. For my applications, every evaluation takes ten minutes. More individuals just means more cost, plus more of them randomly searching or duplicating each other's work. So in a sense there is a "too big".


For GAs the expensive part is often the fitness function, so it's very problem-dependent.


I think GAs and artificial neural networks would be a good combination.


It looks usable for someone who knows about genetic algorithms. It would be catchy if you added a GIF of a working demo to your repo, for a simple use case such as a snake game or something similar.


Thanks so much for your comment. There are some examples here: https://github.com/lodenrogue/genetic-algorithm-js/tree/mast...


If I were you I would take those examples, wrap them in some HTML, and put them on github.io or somewhere else, with a link at the top of the readme. There is so much stuff around and only so little time...


Nice work. I just wrote an algorithm myself; it's a quick one in Python.


A question for the author: is genetic data a thing? If it is, what are your thoughts?


I'm not the author, but DNA can be used to encode data. Interpreting your question another way, the parameters of each solution are data, and they metaphorically evolve like genes.


What do you mean by "genetic data"?



