DOI: 10.1145/3321707.3321842

A global surrogate assisted CMA-ES

Published: 13 July 2019

ABSTRACT

We explore arguably the simplest way to build an effective surrogate fitness model in continuous search spaces. The model complexity is linear, diagonal-quadratic, or full quadratic, depending on the number of available data points. The model parameters are computed from the Moore-Penrose pseudoinverse. The model is used as a surrogate fitness for CMA-ES if the rank correlation between true fitness and surrogate value of recently sampled data points is high. Otherwise, further samples from the current population are successively added as data to the model. We empirically compare the IPOP scheme of the new model-assisted lq-CMA-ES with a variety of previously proposed methods and with a simple portfolio algorithm using SLSQP and CMA-ES. We conclude that a global quadratic model and a simple portfolio algorithm are viable options to enhance CMA-ES. The model-building code is available as part of the pycma Python module on GitHub and PyPI.
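The core mechanism described above — fitting a full quadratic model by least squares via the Moore-Penrose pseudoinverse, then checking the rank correlation between true and surrogate fitness before trusting the model — can be sketched as follows. This is a minimal illustration, not the paper's pycma implementation; the feature map, sample sizes, and the use of Kendall's tau as the rank-correlation measure are assumptions for the sketch.

```python
import numpy as np
from scipy.stats import kendalltau

def quadratic_features(X):
    """Map points to [1, x_i, x_i * x_j (i <= j)] features of a full quadratic model."""
    n, d = X.shape
    cross = np.array([[x[i] * x[j] for i in range(d) for j in range(i, d)] for x in X])
    return np.hstack([np.ones((n, 1)), X, cross])

def fit_surrogate(X, f):
    """Least-squares fit of the model coefficients via the Moore-Penrose pseudoinverse."""
    return np.linalg.pinv(quadratic_features(X)) @ f

def surrogate_value(w, X):
    """Evaluate the fitted surrogate at the points in X."""
    return quadratic_features(X) @ w

# Toy true fitness: the sphere function, which a full quadratic model can fit exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))          # archive of evaluated points
f = (X ** 2).sum(axis=1)              # their true fitness values

w = fit_surrogate(X, f)

# Gate: only trust the surrogate if the rank correlation on fresh samples is high.
X_new = rng.normal(size=(15, 3))
tau, _ = kendalltau((X_new ** 2).sum(axis=1), surrogate_value(w, X_new))
use_surrogate = tau > 0.85            # threshold is an assumption of this sketch
```

On the sphere the fit is exact, so `tau` is 1 and the surrogate would be accepted; on a function the quadratic model ranks poorly, `tau` drops and more true evaluations would be added to the data before retrying.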


Published in

GECCO '19: Proceedings of the Genetic and Evolutionary Computation Conference
July 2019, 1545 pages
ISBN: 9781450361118
DOI: 10.1145/3321707

Copyright © 2019 ACM


Publisher

Association for Computing Machinery, New York, NY, United States



        Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

