ABSTRACT
We present a surrogate-assisted multiobjective optimization algorithm. The aggregation of the objectives relies on the Uncrowded Hypervolume Improvement (UHVI), which is partly replaced by a linear-quadratic surrogate integrated into the CMA-ES algorithm. Surrogating the UHVI poses two challenges. First, the UHVI is a dynamic function that changes with the empirical Pareto set. Second, it is a composite function, defined differently for dominated and nondominated points. The presented algorithm is intended for expensive functions of moderate dimension (up to about 50), with a quadratic surrogate that is updated based on its ranking ability. We report numerical experiments that include tests on the COCO benchmark. In particular, the algorithm shows linear convergence on the double-sphere function, with a convergence rate 6--20 times faster than without surrogate assistance.
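The composite structure of the UHVI mentioned above can be made concrete for the biobjective minimization case: a nondominated candidate receives its plain hypervolume improvement, while a dominated candidate receives the negative distance to the boundary of the region dominated by the empirical Pareto set. The following is a minimal illustrative sketch under that convention, not the paper's implementation; the function names `hypervolume_2d` and `uhvi_2d` are hypothetical.

```python
import math

def hypervolume_2d(points, ref):
    """2-D hypervolume (minimization) of the region dominated by
    `points` and bounded by the reference point `ref`."""
    hv, top = 0.0, ref[1]
    for f1, f2 in sorted(points):        # ascending first objective
        if f1 < ref[0] and f2 < top:     # point adds a new slab
            hv += (ref[0] - f1) * (top - f2)
            top = f2
    return hv

def uhvi_2d(p, pareto, ref):
    """Sketch of the uncrowded hypervolume improvement of p w.r.t. a
    nondominated set `pareto` (biobjective minimization).
    Nondominated p: hypervolume improvement.
    Dominated p: negative distance to the staircase boundary of the
    dominated region."""
    if not any(q[0] <= p[0] and q[1] <= p[1] for q in pareto):
        return (hypervolume_2d(pareto + [p], ref)
                - hypervolume_2d(pareto, ref))
    q = sorted(pareto)                   # f1 ascending, hence f2 descending
    dists = []
    for i, (x, y) in enumerate(q):
        x_next = q[i + 1][0] if i + 1 < len(q) else math.inf
        y_prev = q[i - 1][1] if i > 0 else math.inf
        # horizontal staircase edge at height y, for x in [x, x_next]
        dists.append(math.hypot(max(x - p[0], p[0] - x_next, 0.0), p[1] - y))
        # vertical staircase edge at x, for y in [y, y_prev]
        dists.append(math.hypot(p[0] - x, max(y - p[1], p[1] - y_prev, 0.0)))
    return -min(dists)
```

The sign change at the domination boundary is what makes the UHVI hard to fit with a single smooth surrogate, motivating the paper's separate treatment of the two regimes.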