ABSTRACT
In the field of evolutionary computation, one of the most challenging topics is algorithm selection. Knowing which heuristics to use for which optimization problem is key to obtaining high-quality solutions. We aim to extend this research topic by taking a first step towards a selection method for adaptive CMA-ES algorithms. We build upon the theoretical work done by van Rijn et al. [PPSN'18], in which the potential of switching between different CMA-ES variants was quantified in the context of a modular CMA-ES framework.
We demonstrate in this work that their proposed approach is not very reliable: implementing the suggested adaptive configurations does not yield the predicted performance gains. We propose a revised approach, which results in a more robust fit between predicted and actual performance. The adaptive CMA-ES approach obtains performance gains on 18 out of 24 tested functions of the BBOB benchmark, with stable advantages of up to 23%. An analysis of module activation indicates which modules are most crucial for the different phases of optimizing each of the 24 benchmark problems. The module activation also suggests that additional gains are possible when including the (B)IPOP modules, which we have excluded from the present work.
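The core switching idea can be illustrated with a minimal sketch: spend the first part of the evaluation budget on one variant, then warm-start a second variant from the inherited search state (incumbent solution and step size). The toy (1+1)-ES below only stands in for the modular CMA-ES variants used in the paper; all function names, constants, and the 50% split point are illustrative assumptions, not the paper's actual configurations.

```python
import random

def one_plus_one_es(f, x, sigma, budget, incr=1.5, decr=0.8, rng=None):
    """Toy (1+1)-ES with success-based step-size adaptation.

    Stands in for a single (CMA-)ES variant from a modular framework;
    `incr`/`decr` play the role of module/parameter choices.
    """
    rng = rng or random.Random(42)
    fx = f(x)
    for _ in range(budget):
        cand = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fc = f(cand)
        if fc < fx:          # elitist acceptance
            x, fx = cand, fc
            sigma *= incr    # success: enlarge step size
        else:
            sigma *= decr    # failure: shrink step size
    return x, fx, sigma

def adaptive_run(f, x0, sigma0, budget, split=0.5):
    """Run an exploratory variant for the first `split` fraction of the
    budget, then hand its incumbent and step size to a more conservative
    variant for the remainder (the online-switching idea)."""
    b1 = int(budget * split)
    x, fx, sigma = one_plus_one_es(f, x0, sigma0, b1, incr=2.0, decr=0.9)
    x, fx, sigma = one_plus_one_es(f, x, sigma, budget - b1, incr=1.2, decr=0.85)
    return x, fx

sphere = lambda x: sum(xi * xi for xi in x)
best, best_f = adaptive_run(sphere, [1.0] * 5, 0.5, budget=2000)
```

The key design point this sketch mirrors is that the second variant continues from the full internal state of the first rather than restarting, so the switch costs no extra function evaluations.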
REFERENCES
- Anne Auger, Dimo Brockhoff, and Nikolaus Hansen. 2011. Mirrored Sampling in Evolution Strategies with Weighted Recombination. In Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation (GECCO '11). ACM, New York, NY, USA, 861--868.
- Anne Auger and Nikolaus Hansen. 2005. A Restart CMA Evolution Strategy with Increasing Population Size. In 2005 IEEE Congress on Evolutionary Computation, Vol. 2. 1769--1776.
- Anne Auger, Mohamed Jebalia, and Olivier Teytaud. 2005. Algorithms (X, sigma, eta): Quasi-random Mutations for Evolution Strategies. In Artificial Evolution, 7th International Conference, Evolution Artificielle, EA 2005, Lille, France, October 26--28, 2005, Revised Selected Papers (Lecture Notes in Computer Science), El-Ghazali Talbi, Pierre Liardet, Pierre Collet, Evelyne Lutton, and Marc Schoenauer (Eds.), Vol. 3871. Springer, 296--307.
- Thomas Bäck, Christophe Foussette, and Peter Krause. 2013. Contemporary Evolution Strategies. Springer.
- Henri Bal, Dick Epema, Cees de Laat, Rob van Nieuwpoort, John Romein, Frank Seinstra, Cees Snoek, and Harry Wijshoff. 2016. A Medium-Scale Distributed System for Computer Science Research: Infrastructure for the Long Term. Computer 5 (2016), 54--63.
- Thomas Bartz-Beielstein, Marco Chiarandini, Luís Paquete, and Mike Preuss. 2010. Experimental Methods for the Analysis of Optimization Algorithms. Springer.
- Thomas Bartz-Beielstein, Christian Lasarczyk, and Mike Preuss. 2010. The Sequential Parameter Optimization Toolbox. In Experimental Methods for the Analysis of Optimization Algorithms. Springer, 337--362.
- Nacim Belkhir, Johann Dréo, Pierre Savéant, and Marc Schoenauer. 2017. Per Instance Algorithm Configuration of CMA-ES with Limited Budget. In Proc. of Genetic and Evolutionary Computation Conference (GECCO '17). ACM, 681--688.
- Dimo Brockhoff, Anne Auger, Nikolaus Hansen, Dirk V. Arnold, and Tim Hohm. 2010. Mirrored Sampling and Sequential Selection for Evolution Strategies. In Parallel Problem Solving from Nature, PPSN XI. Springer, Berlin, Heidelberg, 11--21.
- Edmund K. Burke, Michel Gendreau, Matthew R. Hyde, Graham Kendall, Gabriela Ochoa, Ender Özcan, and Rong Qu. 2013. Hyper-heuristics: A Survey of the State of the Art. JORS 64, 12 (2013), 1695--1724.
- Edmund K. Burke, Barry McCollum, Amnon Meisels, Sanja Petrovic, and Rong Qu. 2007. A Graph-Based Hyper-Heuristic for Educational Timetabling Problems. European Journal of Operational Research 176, 1 (2007), 177--192.
- Nikolaus Hansen. 2008. CMA-ES with Two-Point Step-Size Adaptation. arXiv:0805.0231 [cs] (May 2008). http://arxiv.org/abs/0805.0231
- Nikolaus Hansen. 2009. Benchmarking a BI-population CMA-ES on the BBOB-2009 Function Testbed. In Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers (GECCO '09). ACM, New York, NY, USA, 2389--2396.
- Nikolaus Hansen, Anne Auger, Dimo Brockhoff, Dejan Tušar, and Tea Tušar. 2016. COCO: Performance Assessment. arXiv:1605.03560 [cs] (May 2016). http://arxiv.org/abs/1605.03560
- Nikolaus Hansen, Anne Auger, Steffen Finck, and Raymond Ros. 2009. Real-Parameter Black-Box Optimization Benchmarking 2009: Experimental Setup. Technical report, INRIA. https://hal.inria.fr/inria-00362649/document
- Nikolaus Hansen, Anne Auger, Raymond Ros, Steffen Finck, and Petr Pošík. 2010. Comparing Results of 31 Algorithms from the Black-Box Optimization Benchmarking BBOB-2009. In Proceedings of the 12th Annual Conference Companion on Genetic and Evolutionary Computation. ACM, 1689--1696.
- Nikolaus Hansen and Andreas Ostermeier. 2001. Completely Derandomized Self-Adaptation in Evolution Strategies. Evolutionary Computation 9, 2 (2001), 159--195.
- Holger H. Hoos, Frank Neumann, and Heike Trautmann. 2016. Automated Algorithm Selection and Configuration (Dagstuhl Seminar 16412). Dagstuhl Reports 6, 10 (2016), 33--74.
- Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown. 2011. Sequential Model-Based Optimization for General Algorithm Configuration. In International Conference on Learning and Intelligent Optimization. Springer, 507--523.
- G. A. Jastrebski and D. V. Arnold. 2006. Improving Evolution Strategies through Active Covariance Matrix Adaptation. In 2006 IEEE International Conference on Evolutionary Computation. 2814--2821.
- Pascal Kerschke, Holger H. Hoos, Frank Neumann, and Heike Trautmann. 2018. Automated Algorithm Selection: Survey and Perspectives. CoRR abs/1811.11597 (2018). http://arxiv.org/abs/1811.11597
- Pascal Kerschke and Heike Trautmann. 2017. Automated Algorithm Selection on Continuous Black-Box Problems by Combining Exploratory Landscape Analysis and Machine Learning. arXiv preprint arXiv:1711.08921 (2017).
- Mario A. Muñoz, Yuan Sun, Michael Kirley, and Saman K. Halgamuge. 2015. Algorithm Selection for Black-Box Continuous Optimization Problems: A Survey on Methods and Challenges. Information Sciences 317 (2015), 224--245.
- A. Piad-Morffis, S. Estévez-Velarde, A. Bolufé-Röhler, J. Montgomery, and S. Chen. 2015. Evolution Strategies with Thresheld Convergence. In 2015 IEEE Congress on Evolutionary Computation (CEC). 2097--2104.
- Sander van Rijn. 2018. Modular CMA-ES Framework from [27], v0.3.0. https://github.com/sjvrijn/ModEA. Also available as a PyPI package at https://pypi.org/project/ModEA/0.3.0/.
- Sander van Rijn, Carola Doerr, and Thomas Bäck. 2018. Towards an Adaptive CMA-ES Configurator. In Proc. of 15th International Conference on Parallel Problem Solving from Nature (PPSN '18) (Lecture Notes in Computer Science), Vol. 11101. Springer, 54--65.
- Sander van Rijn, Hao Wang, Matthijs van Leeuwen, and Thomas Bäck. 2016. Evolving the Structure of Evolution Strategies. In 2016 IEEE Symposium Series on Computational Intelligence (SSCI). 1--8.
- Diederick Vermetten, Sander van Rijn, Thomas Bäck, and Carola Doerr. 2019. On-line Selection of CMA-ES Variants. CoRR abs/1904.07801 (2019). http://arxiv.org/abs/1904.07801. A GitHub repository containing more data and experiments from this project is available at https://github.com/Dvermetten/Online_CMA-ES_Selection.
- Hao Wang, Michael Emmerich, and Thomas Bäck. 2014. Mirrored Orthogonal Sampling with Pairwise Selection in Evolution Strategies. In Proceedings of the 29th Annual ACM Symposium on Applied Computing (SAC '14). ACM, New York, NY, USA, 154--156.