Show HN: My Book on Evolutionary Algorithms, Written in Python Notebooks (shahinrostami.com)
63 points by shahinrostami on June 18, 2020 | 8 comments



Your Patreon link to chord is broken, which may confuse some people.

This looks really great!

To be honest, I tend to tinker a bit in the simulation space, so all of this is really interesting. It seems like a GitHub pledge is better for you than Patreon at the moment, as you keep all the money; would you prefer that?


Thank you for spotting that and letting me know; I'll fix it now!

I'm glad you like it! I'm trying out both GitHub Sponsors and Patreon at the moment to fund more projects; I'm not sure which will stick! So far it looks like Patreon has better support for giving back to your backers, but GitHub also doesn't take a cut. I am of course very grateful and flattered, so please choose whichever you prefer!


Well, if you plan on mirroring access via Discord, then GitHub's better, as I can support you =)...

If you are thinking of making Patreon-only content, then that's better.

That's why I was asking ;)...


Ah, I see! Let's go with Patreon then :) As it grows, I think it will allow for better engagement from supporters. There are plenty of features for sharing updates and getting feedback that are perhaps worth the cut they take!


Cool! Bought a copy. Some questions:

- Have you looked into Latin Hypercube Sampling for your initial population creation? Or in general, some initial set that tries to cover the space better than a pseudo-random sample? LHS tries to guarantee some distance between the points, which can be beneficial for rather non-convex problems.

- For very high dimensionality, have you considered classification to identify which dimensions are significant?


Hello! I didn't realise this post had received any attention; I thought it had faded away!

With a sufficient population size, taking advantage of better sampling techniques for initial population generation can make a significant difference. I've used LHS and modified LHS approaches before, but I wanted to keep things simple, at least in the earlier parts of the book!
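
If it helps to make that concrete, here's a rough sketch of the idea using SciPy's qmc module (an illustration rather than the code from the book, and it assumes simple box bounds on the decision variables):

    import numpy as np
    from scipy.stats import qmc

    n_pop, n_dim = 20, 3
    lower = np.zeros(n_dim)
    upper = np.full(n_dim, 10.0)

    # Latin hypercube sampling: one sample per stratum in each
    # dimension, so the initial population covers the space more
    # evenly than plain uniform sampling.
    sampler = qmc.LatinHypercube(d=n_dim, seed=42)
    unit_samples = sampler.random(n=n_pop)        # points in [0, 1)^d
    population = qmc.scale(unit_samples, lower, upper)

    # For comparison: a plain pseudo-random initial population.
    rng = np.random.default_rng(42)
    random_pop = rng.uniform(lower, upper, size=(n_pop, n_dim))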

Something I've worked on with a colleague recently is using a more data-science-driven approach with pre-optimisation exploration (https://link.springer.com/chapter/10.1007/978-3-030-43722-0_...), and I'm hoping to do more work on this soon. With regard to high dimensionality in the search space, this kind of approach could be useful.

Previously I've been more interested in high-dimensional objective spaces and how to deal with them, primarily using progressive preference articulation.

If you have any requests for additional sections in the book, I would love to hear them! Something high up on my list is a section or two on my neuroevolution algorithm, but keeping it at the right level is tricky.


I'm not sure if dimensionality reduction fits in the book, but it is one other thing we are considering in our problem. I am HOPING that with good coverage of early populations, we can (1) drop an order of magnitude off our dimensions (probably not), or (2) select the most significant dimensions to apply our evolution techniques to.
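
To show the shape of what I mean by (2), something like this sketch, using a random forest's feature importances as a cheap significance screen over an evaluated early population (toy data and a hypothetical setup, not our actual problem):

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # X: decision vectors from the early population, y: their fitness.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(200, 50))    # 200 individuals, 50 dimensions
    y = 3 * X[:, 0] - 2 * X[:, 7] + 0.1 * rng.normal(size=200)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X, y)

    # Rank dimensions by importance and keep only the strongest,
    # then run the evolution on that reduced subset.
    ranked = np.argsort(model.feature_importances_)[::-1]
    significant = ranked[:5]
    print("most significant dimensions:", significant)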

Do you have any favorite approaches to reduction?


It depends on the nature of the problem! Are you able to share any specifics, e.g. the context, the number of decision variables and the relationships between them, the number of objectives (single or multi?), the optimisation operators in use, and so on?

My main use of dimensionality reduction is to improve visualisation to support decision making. I've seen PCA and differential evolution approaches in the decision space show promising results on benchmarks.
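
For the visualisation case, the rough shape of it is something like this sketch with scikit-learn (toy data rather than a real optimisation run):

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.decomposition import PCA

    # population: decision vectors from an evolutionary run.
    rng = np.random.default_rng(1)
    population = rng.normal(size=(100, 30))   # 100 solutions, 30 variables
    fitness = (population[:, :2] ** 2).sum(axis=1)

    # Project the 30-D decision space down to 2-D for plotting.
    projected = PCA(n_components=2).fit_transform(population)

    plt.scatter(projected[:, 0], projected[:, 1], c=fitness)
    plt.colorbar(label="fitness")
    plt.xlabel("PC 1")
    plt.ylabel("PC 2")
    plt.show()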



