ABSTRACT
We explore the arguably simplest way to build an effective surrogate fitness model in continuous search spaces. The model complexity is linear, diagonal-quadratic, or full-quadratic, depending on the number of available data points. The model parameters are computed using the Moore-Penrose pseudoinverse. The model serves as a surrogate fitness for CMA-ES when the rank correlation between true fitness and surrogate value of recently sampled points is high; otherwise, further samples from the current population are successively added to the model's data. We empirically compare the IPOP scheme of the new model-assisted lq-CMA-ES with a variety of previously proposed methods and with a simple portfolio algorithm combining SLSQP and CMA-ES. We conclude that a global quadratic model and a simple portfolio algorithm are viable options to enhance CMA-ES. The model-building code is available as part of the pycma Python module on GitHub and PyPI.
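The core idea of the abstract can be illustrated with a minimal sketch (this is not the pycma implementation; all function names here are illustrative): fit a full-quadratic model by least squares via the Moore-Penrose pseudoinverse, then measure how well the surrogate ranks unseen points with a Kendall rank correlation.

```python
import numpy as np

def quadratic_features(X):
    """Full-quadratic design matrix: [1, x_i, x_i * x_j for i <= j]."""
    n, d = X.shape
    cols = [np.ones((n, 1)), X]
    for i in range(d):
        for j in range(i, d):
            cols.append((X[:, i] * X[:, j])[:, None])
    return np.hstack(cols)

def fit_surrogate(X, y):
    """Least-squares fit of the model weights via the pseudoinverse."""
    return np.linalg.pinv(quadratic_features(X)) @ y

def surrogate(w, X):
    """Evaluate the fitted quadratic surrogate at the rows of X."""
    return quadratic_features(X) @ w

def kendall_tau(a, b):
    """Kendall rank correlation (simple O(n^2) pairwise version)."""
    n = len(a)
    s = sum(np.sign(a[i] - a[j]) * np.sign(b[i] - b[j])
            for i in range(n) for j in range(i + 1, n))
    return 2.0 * s / (n * (n - 1))

# Toy check on the 3-D sphere function, which is exactly quadratic,
# so the surrogate should rank held-out points perfectly (tau = 1).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
w = fit_surrogate(X, np.sum(X**2, axis=1))
Xtest = rng.normal(size=(20, 3))
tau = kendall_tau(np.sum(Xtest**2, axis=1), surrogate(w, Xtest))
print(round(tau, 3))
```

In the paper's scheme, a high rank correlation like this is the signal to trust the surrogate in place of the true fitness; a low correlation triggers evaluating further samples from the current population and adding them to the model's data. With fewer data points than full-quadratic parameters, one would fall back to the diagonal-quadratic or linear feature subset instead.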
Index Terms
- A global surrogate assisted CMA-ES