New release of Gradient-Free-Optimizers with two new evolutionary algorithms (github.com/simonblanke)
32 points by 528491 3 months ago | 9 comments



When I read "evolution strategy" I was pretty sure I'd find some variant of the Canonical Evolution Strategy (as in https://arxiv.org/pdf/1802.08842), or maybe CMA-ES or something related. But the implementation looks like a GA. Maybe the term means different things to different people...?


Thanks for pointing that out. The current implementation does not self-adapt the parameters (like mutation strength) of the individuals in the population: https://github.com/SimonBlanke/Gradient-Free-Optimizers/issu...
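
For anyone curious what self-adaptation means here: the simplest textbook form is a (1+1)-ES with the 1/5 success rule, where the mutation strength is adjusted based on how often offspring improve on the parent. A minimal Python sketch (the objective and constants are just illustrative):

    import numpy as np

    def one_plus_one_es(objective, x0, sigma=1.0, n_iter=1000):
        # (1+1)-ES with the 1/5 success rule: the mutation strength
        # sigma is itself adapted based on whether offspring improve.
        x = np.asarray(x0, dtype=float)
        fx = objective(x)
        for _ in range(n_iter):
            y = x + sigma * np.random.randn(x.size)  # Gaussian mutation
            fy = objective(y)
            if fy < fx:              # success: keep offspring, grow sigma
                x, fx = y, fy
                sigma *= 1.5
            else:                    # failure: shrink sigma
                sigma *= 1.5 ** -0.25
        return x, fx

    # Example: minimize the sphere function
    best_x, best_f = one_plus_one_es(lambda v: float(np.sum(v**2)), [5.0, -3.0])

With a success rate of 1/5 the expected change in log(sigma) is zero, which is what keeps the step size calibrated.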


The title mentions two new evolutionary algorithms, and I think it would be good if the title clarified which two are new. It seems the new ones are the genetic algorithm and differential evolution.

I find it interesting that Gradient-Free-Optimizers is used in a library for hyperparameter optimization. So in essence it's using a gradient-free approach to optimize a gradient-based approach.


The title was too long, but I edited my comment for clarification. Hyperparameter optimization is normally done with gradient-free optimization techniques, since a gradient cannot be computed. This is in contrast to the optimization of parameters during training (e.g. neural network weights).
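
As a rough sketch of what that looks like with this library (a toy objective standing in for a real training-and-validation run; see the docs for the exact API):

    import numpy as np
    from gradient_free_optimizers import HillClimbingOptimizer

    # Toy stand-in for a training run: in practice this would train a
    # model with the given hyperparameters and return a validation score.
    def model(para):
        return -(para["learning_rate"] - 0.01) ** 2 - (para["n_layers"] - 3) ** 2

    search_space = {
        "learning_rate": np.arange(0.001, 0.1, 0.001),
        "n_layers": np.arange(1, 10, 1),
    }

    opt = HillClimbingOptimizer(search_space)
    opt.search(model, n_iter=50)  # GFO maximizes the returned score
    print(opt.best_para)

Note there is no gradient anywhere: the optimizer only ever sees scores for sampled points in the search space.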


I was looking for papers on this topic a while back. Here’s one which lists many derivative-free methods:

https://arxiv.org/abs/1904.11585


that's quite the comprehensive paper! it's almost like a mini textbook


The new release adds the Genetic Algorithm and Differential Evolution. Also check out the documentation for the new optimization algorithms: https://simonblanke.github.io/gradient-free-optimizers-docum...
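
Usage should follow the same pattern as the other optimizers; roughly (class names assumed from the library's `...Optimizer` convention, check the docs for the exact names and signatures):

    import numpy as np
    from gradient_free_optimizers import (
        GeneticAlgorithmOptimizer,
        DifferentialEvolutionOptimizer,
    )

    def sphere(para):
        # GFO maximizes, so negate to minimize x^2 + y^2
        return -(para["x"] ** 2 + para["y"] ** 2)

    search_space = {
        "x": np.arange(-5, 5, 0.1),
        "y": np.arange(-5, 5, 0.1),
    }

    for Opt in (GeneticAlgorithmOptimizer, DifferentialEvolutionOptimizer):
        opt = Opt(search_space)
        opt.search(sphere, n_iter=200)
        print(Opt.__name__, opt.best_para)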


I've previously used pagmo2 for this kind of thing with some amount of success. Might be worth giving this one a try, as pagmo2's C++ patterns can be something of a mindfuck.


Are there any advantages of this over scipy.optimize? That's what I've used in the past. Trying to understand if it's worth switching.

It looks like they have many of the same algorithms.
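
For comparison, scipy's version of e.g. differential evolution takes continuous per-dimension bounds and minimizes, whereas this library appears to search over discrete grids of values and maximizes:

    import numpy as np
    from scipy.optimize import differential_evolution

    # scipy minimizes over continuous bounds, one (low, high) pair per dimension
    result = differential_evolution(lambda v: float(np.sum(v**2)),
                                    bounds=[(-5, 5), (-5, 5)])
    print(result.x, result.fun)

So the choice may come down to whether your search space is naturally continuous or a discrete set of candidate values.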



