When I read "evolution strategy" I expected to find some variant of the canonical evolution strategy (as in https://arxiv.org/pdf/1802.08842), or maybe CMA-ES or something related. But the implementation looks like a GA. Maybe the term means different things to different people...?
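For reference, here's roughly what I mean by a canonical (mu, lambda)-ES: mutate a single mean point with Gaussian noise, keep the best offspring, and recombine them into a new mean, with no crossover or per-individual genomes like a GA has. This is my own toy sketch on the sphere function, not code from the library; all names and constants are made up for illustration.

```python
import numpy as np

def sphere(x):
    # toy objective: minimum 0 at the origin
    return np.sum(x ** 2)

def canonical_es(f, dim=5, lam=20, mu=5, sigma=0.1, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    mean = rng.normal(size=dim)
    # log-weighted recombination weights for the mu best offspring
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()
    for _ in range(iters):
        eps = rng.normal(size=(lam, dim))            # Gaussian perturbations
        fitness = np.array([f(mean + sigma * e) for e in eps])
        best = np.argsort(fitness)[:mu]              # select mu best (minimization)
        mean = mean + sigma * (w @ eps[best])        # move the mean; no crossover
    return mean

x = canonical_es(sphere)
print(sphere(x))
```

The key difference from a GA: there is one search distribution (mean + fixed sigma here) rather than a population of recombining individuals.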
the title mentions two new evolutionary algorithms, and I think it would be good if it clarified which two are new. it seems the new ones are the genetic algorithm and differential evolution
I find it interesting that Gradient-Free-Optimizers is used in a library for hyperparameter optimization. So in essence it's using a gradient-free approach to optimize a gradient-based approach.
The title was too long, but I edited my comment for clarification. Hyperparameter optimization is normally done with gradient-free optimization techniques, since a gradient cannot be computed. This is opposed to the optimization of parameters during training (e.g. neural network weights), which is gradient-based.
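A toy illustration of what I mean (my own example, not from the library): the learning rate of a gradient-descent inner loop is a hyperparameter with no usable gradient, so the outer loop tunes it with gradient-free random search.

```python
import random

def train(lr, steps=50):
    # gradient-BASED inner loop: fit w to minimize (w - 3)^2
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)
        w -= lr * grad
    return (w - 3) ** 2   # "validation loss" for this lr

random.seed(0)
best_lr, best_loss = None, float("inf")
for _ in range(30):                      # gradient-FREE outer loop
    lr = 10 ** random.uniform(-4, 0)     # sample lr on a log scale
    loss = train(lr)
    if loss < best_loss:
        best_lr, best_loss = lr, loss
print(best_lr, best_loss)
```

The inner loop differentiates the loss with respect to w; nothing differentiates the loss with respect to lr, which is why the outer search has to be gradient-free.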
I've previously used pagmo2 for this kind of thing with some amount of success. Might be worth giving this one a try, as pagmo2's C++ patterns can be something of a mindfuck.