Ask HN: Is genetic programming still actively researched?
51 points by mg on Aug 5, 2023 | 22 comments
It is fascinating that neural networks are having such a run at the moment. I wonder if this will continue "forever", or if we will see a different paradigm eclipse them in the future.

Is anybody still doing research in the area of genetic programming?

The genetic programming books of John R. Koza were the first I ever read about machine learning. It felt like magic at that time.

I have the feeling that the approach of generating programs for the CPU via evolution still has a lot to offer if it were explored further.

If there is research going on out there, I would love to follow it.




The genetic programming scene kind of evolved into NEAT, HyperNEAT, and ES-HyperNEAT [1] as a meta-learning concept.

Connections between layers/nodes are serialized as genes of agents (complete with phenotypes and dominant/recessive markers), genomes are grouped into species by similarity so that breeding happens mostly within niches, and in the HyperNEAT variants a CPPN (compositional pattern producing network) generates the connectivity pattern of the evolved network.
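(Roughly, in Python; a minimal hypothetical sketch of the gene encoding, not taken from any reference implementation:)

    from dataclasses import dataclass

    @dataclass
    class ConnectionGene:
        in_node: int     # source node id
        out_node: int    # target node id
        weight: float    # connection weight
        enabled: bool    # disabled genes stay in the genome unexpressed
        innovation: int  # historical marker; crossover aligns parent
                         # genomes by matching innovation numbers

    # A genome is a list of node genes plus connection genes; structural
    # mutations append new genes with fresh innovation numbers.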

It's a strong concept, and AFAIK it's still used a lot in the robotics world, where you have to guarantee behaviors and be able to reproduce them due to safety regulations.

There is a nice intro video on the underlying base concept, NEAT, by the YouTuber SethBling [2].

[1] http://eplex.cs.ucf.edu/ESHyperNEAT/

[2] https://m.youtube.com/watch?v=qv6UVOQ0F44


NEAT and neuroevolution in general are interesting approaches. I also suggest checking out techniques like DENSER [1], which can be used to evolve deep networks (by applying the evolutionary part to the network structure rather than the weights).
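(As a toy illustration of evolving structure rather than weights, not DENSER itself; the genome, mutation operators, and stub fitness below are all made up:)

    import random

    def mutate(layers):
        # Genome = list of hidden-layer widths; mutate the structure.
        layers = layers[:]
        op = random.choice(["add", "remove", "resize"])
        if op == "add":
            layers.insert(random.randrange(len(layers) + 1),
                          random.choice([16, 32, 64]))
        elif op == "remove" and len(layers) > 1:
            layers.pop(random.randrange(len(layers)))
        else:
            i = random.randrange(len(layers))
            layers[i] = max(1, layers[i] + random.choice([-16, 16]))
        return layers

    def evaluate(layers):
        # Stub: in practice, train a network with this layout (by normal
        # gradient descent) and return its validation accuracy.
        return -abs(sum(layers) - 96)

    population = [[32, 32] for _ in range(20)]
    for generation in range(50):
        population.sort(key=evaluate, reverse=True)
        parents = population[:5]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(15)]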

Genetic Programming (GP), however, has not evolved into NEAT (which itself is not very recent, having been published in 2002); rather, neuroevolution has simply become one of the topics that make up evolutionary computation (EC). For example, one of the largest yearly conferences on evolutionary computation, GECCO [2], was held just last month with both neuroevolution and GP tracks. It is true, however, that the success of neural techniques has had an effect on the community; some effects are the ongoing discussion of the role of EC and, for example, more space being given to hybrid work (see the joint track on evolutionary machine learning [3] inside the evostar event).

Related to the original post, places where recent research on GP can be found are the proceedings of GECCO (GP track), EuroGP (part of evostar), PPSN (Parallel Problem Solving from Nature), and IEEE CEC (IEEE Congress on Evolutionary Computation), as well as journals like Genetic Programming and Evolvable Machines (GPEM), Swarm and Evolutionary Computation (SWEVO), and IEEE Transactions on Evolutionary Computation (IEEE TEVC). The list is not exhaustive, but those are some well-known venues.

For a less "daunting" starting point, some recent techniques are being added to the SRBench benchmark suite [4], with links to both the code and the paper describing the technique.

[1] Assunção, F., Lourenço, N., Machado, P., & Ribeiro, B. (2019). Fast DENSER: Efficient deep neuroevolution. In European Conference on Genetic Programming (pp. 197-212). Cham: Springer International Publishing.

[2] https://gecco-2023.sigevo.org/HomePage

[3] https://www.evostar.org/2024/eml/

[4] https://github.com/cavalab/srbench


Thanks for the info. I will look into it.


It's completely mistaken to think that the NN craze means no one works on anything else. Academia has very many people researching whatever they want, full-time or on the side. AI in particular has many veteran researchers stubbornly following long-standing lines of research with unimpressive results. No one can say they're wrong: Hinton was once that guy doing unfashionable research into NNs.

Anyway, there are also memetic algorithms, which extend genetic algorithms by adding local search (some form of local improvement, such as gradient following or simple hand-coded heuristics) to the genetic global search. It's actually a very simple idea (e.g. alternate mutation and/or recombination steps with optimisation steps). They tend to perform better than pure genetic algorithms because they can actually use gradient information or heuristics. It's a very broad class of algorithms, which tend to have many hyperparameters.
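(The skeleton is just a GA with a polish step; in Python, with hill_climb standing in for whatever local improver fits the problem:)

    import random

    def hill_climb(x, fitness, steps=25, sigma=0.05):
        # Local search: keep small perturbations that improve fitness.
        for _ in range(steps):
            y = [xi + random.gauss(0, sigma) for xi in x]
            if fitness(y) > fitness(x):
                x = y
        return x

    def memetic_step(population, fitness):
        # Global (genetic) part: select parents, recombine, mutate.
        population.sort(key=fitness, reverse=True)
        parents = population[:len(population) // 2]
        children = [[(a + b) / 2 + random.gauss(0, 0.1)
                     for a, b in zip(*random.sample(parents, 2))]
                    for _ in range(len(population) - len(parents))]
        # Local (memetic) part: polish every individual.
        return [hill_climb(x, fitness) for x in parents + children]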


It doesn’t actually surprise me that hybrid genetic/memetic approaches outcompete purely genetic approaches: after all, hybrid genetic/memetic humans have outcompeted purely genetic species at every level.


Not every level


May I take this opportunity to recommend Phase IV https://www.imdb.com/title/tt0070531/


This is probably a good starting point:

https://sig.sigevo.org/index.html

Genetic programming is a bit of a misnomer; evolutionary algorithms is probably a better name.



Yes, although it is much more active in robotics. York still has quite active research into evolutionary algorithms and genetic programming (https://www.york.ac.uk/physics-engineering-technology/resear...).

It's been used to do things like find design parameters (https://pure.york.ac.uk/portal/en/publications/evolving-desi...) and attempt to evolve robots to fit an environment (https://www.york.ac.uk/robot-lab/are/).


Yes, it is going on. I wound up having to write a new Python GP library from scratch due to DEAP's license being GPL3. Now I'm translating that to Rust. One point of order: don't only try stuff randomly; also try enumerating candidates in shortlex order, as sketched below. Also, make sure you use hyperparameter optimization on the outside of the GP evolution process, or else you'll wind up with too many parameters to hand-tune.
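(By shortlex I mean: enumerate candidates by length first, then lexicographically; a minimal Python sketch over a made-up primitive set:)

    from itertools import count, product

    PRIMITIVES = sorted(["x", "1", "add", "mul", "neg"])  # toy alphabet

    def shortlex_programs():
        # All token sequences of length 1, then length 2, and so on,
        # each level in lexicographic order.
        for length in count(1):
            yield from product(PRIMITIVES, repeat=length)

    # Usage: test the first few thousand candidates deterministically
    # before falling back to random search.
    for _, prog in zip(range(1000), shortlex_programs()):
        pass  # evaluate(prog) here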

I think the link between Pascal's Simplex, Koza GP Tree Words, and Levin Search is fascinating.


I would take a look at the pymoo library, which has a good enough API for the applied problems I've run into in my research. It's also surprisingly flexible to extend/subclass for whatever you need.
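(Based on pymoo's documented single-objective interface, a run looks roughly like this; treat it as a sketch, version details may differ:)

    from pymoo.algorithms.soo.nonconvex.ga import GA
    from pymoo.optimize import minimize
    from pymoo.problems import get_problem

    problem = get_problem("rastrigin")   # built-in test problem
    algorithm = GA(pop_size=100)         # plain genetic algorithm
    res = minimize(problem, algorithm, ("n_gen", 200), seed=1)
    print(res.X, res.F)                  # best solution and its fitness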


One of the reasons I got my degree in Biotechnology is that I realized the technology of life is mind-bogglingly advanced, and learning how it does things can yield profound insights into how we solve other problems. The process of mutation and evolution is definitely a strong contender here, maybe one of the most important and powerful.


That's a very wise decision. We are far from done with learning from nature. As more and more of the DNA is unlocked, it never ceases to amaze me how incredibly complex it all is and how many parts of the process interact with each other, sometimes across several levels of abstraction.


I'm a non-expert here, but it seems intuitive to me that directly moving towards an improved model via linear regression is more efficient than randomly changing your model and then running a natural selection simulation to improve fitness.
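(As a toy illustration in Python: a gradient step moves straight downhill, while a (1+1)-style evolutionary step proposes a random change and keeps it only if it helps. The problem is made up:)

    import random

    f = lambda x: (x - 3.0) ** 2     # loss with its minimum at x = 3
    grad = lambda x: 2 * (x - 3.0)

    x_gd, x_es = 0.0, 0.0
    for _ in range(100):
        x_gd -= 0.1 * grad(x_gd)             # directed move
        candidate = x_es + random.gauss(0, 0.5)
        if f(candidate) < f(x_es):           # selection: keep if better
            x_es = candidate

    print(x_gd, x_es)  # gradient converges faster, but needs a gradient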


> directly moving towards an improved model via linear regression is more efficient than randomly changing your model and then running a natural selection simulation to improve fitness.

That is objectively true, but don't underestimate how much of that process is simulated by the way we train our models. The natural selection bit never was natural to begin with (it's obviously artificial), and it is the rough equivalent of the final step in training a model: verification on unseen data. If the model performs worse than a previous one, it is discarded!

Evolutionary algorithms are somewhat interesting because they can come up with weird stuff that works anyway; that random element can result in entirely novel approaches (to the point that we have a hard time understanding what is going on), and that's something I have not seen with neural nets.

There are some interesting hybrids:

https://www.sciencedirect.com/science/article/abs/pii/S09521...


This is a bit of a false dichotomy: genetic algorithms need not forgo directed or local search (for example, memetic and hybrid algorithms mix gradient search with populations).

Also, keep in mind that it is often very hard to frame problems in a way that makes linear regression or gradient descent practical.


Also not an expert, but I can imagine that models with a lot of binary and/or non-linear variables will work much better with genetic algorithms. In other words, genetic algorithms apply to a strictly broader class of models than linear regression.


But how do you know you’re in the right area? You could be linearly improving a really terrible solution, whereas random chance could take you somewhere completely new and better.


I have no research to back this up, but I'd expect that the more unknown and unpredictable your problem and search space are, the bigger the benefit of an evolutionary approach.

But it's never going to be efficient; it's inherently incredibly inefficient. It only really makes sense when no other method will work.


Google Scholar shows hundreds of genetic programming papers published in 2023. If we expand the range by a few years, several have hundreds of citations, so the field is seemingly alive and well. I hope to find some uses for it myself, just because I like the concept.


Intuitively to me, Monte Carlo would be better. Monte Carlo rocks.

Intuitively to me, intelligent design is going to beat genetic programming.

It's the constants, and knowing which intelligently designed algorithm is better, that are impossible to know, and that's what Monte Carlo solves.
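(By Monte Carlo here I mean plain random search: sample candidates and keep the best seen; a minimal sketch, with evaluate and sample left abstract:)

    import random

    def monte_carlo_search(evaluate, sample, n=100_000):
        # Draw random candidates and keep the best one seen so far.
        best, best_score = None, float("-inf")
        for _ in range(n):
            candidate = sample()
            score = evaluate(candidate)
            if score > best_score:
                best, best_score = candidate, score
        return best

    # Hypothetical usage, e.g. for antenna design: sample() returns
    # random segment angles, evaluate() runs a simulator.
    best = monte_carlo_search(lambda c: -sum(c),
                              lambda: [random.random() for _ in range(8)])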

Look at the evolved antenna on Wikipedia and think how easy that would be with Monte Carlo - https://en.wikipedia.org/wiki/Genetic_algorithm#:~:text=The%....

John R. Koza's book was from 1992; computational power now allows us to smash things.

Here's a comparison of Monte Carlo and genetic algorithms for wind farm design (Monte Carlo was better) - https://rera.shahroodut.ac.ir/article_2146_5e7bee97938fcd513...

But it's really interesting; have fun looking into it. Have a look through past HN articles - https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

[edit] I haven't differentiated between genetic algorithms and genetic programming here; see "What are the differences between genetic algorithms and genetic programming?" - https://stackoverflow.com/questions/3819977/what-are-the-dif...




