
Per-run Algorithm Selection with Warm-Starting Using Trajectory-Based Features

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13398)

Abstract

Per-instance algorithm selection seeks to recommend, for a given problem instance and a given performance criterion, one or several suitable algorithms that are expected to perform well for the particular setting. The selection is classically done offline, using openly available information about the problem instance or features that are extracted from the instance during a dedicated feature extraction step. This ignores valuable information that the algorithms accumulate during the optimization process. In this work, we propose an alternative, online algorithm selection scheme which we coin “per-run” algorithm selection. In our approach, we start the optimization with a default algorithm, and, after a certain number of iterations, extract instance features from the observed trajectory of this initial optimizer to determine whether to switch to another optimizer. We test this approach using the CMA-ES as the default solver and a portfolio of six different optimizers as potential algorithms to switch to. In contrast to other recent work on online per-run algorithm selection, we warm-start the second optimizer using information accumulated during the first optimization phase. We show that our approach outperforms static per-instance algorithm selection. We also compare two different feature extraction principles, based on exploratory landscape analysis and time series analysis of the internal state variables of the CMA-ES, respectively. We show that a combination of both feature sets provides the most accurate recommendations for our test cases, taken from the BBOB function suite of the COCO platform and the YABBOB suite of the Nevergrad platform.
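
To make the per-run idea concrete, below is a minimal, self-contained sketch of the selection loop described in the abstract. It is an illustration under stated assumptions rather than the authors' implementation: a toy (1, λ)-ES stands in for CMA-ES, a single hand-crafted trajectory feature (the slope of the log best-so-far values) replaces the ELA and time-series feature sets, a fixed threshold rule replaces the trained selector, the portfolio contains only two scipy optimizers, and warm-starting is reduced to continuing from the best point found in the first phase. All function names and thresholds here are hypothetical.

```python
# Minimal sketch of the "per-run" selection scheme from the abstract.
# NOT the authors' implementation: a toy (1, lambda)-ES stands in for
# CMA-ES, one hand-crafted feature stands in for the ELA/time-series
# feature sets, and a threshold rule stands in for the trained selector.
import numpy as np
from scipy.optimize import minimize


def phase_one_es(f, dim, budget, rng, sigma=0.5, lam=10):
    """Default optimizer: a toy (1, lambda)-ES that records its trajectory."""
    mean = rng.uniform(-5, 5, dim)
    best_x, best_f, history = mean, f(mean), []
    for _ in range(budget // lam):
        pop = mean + sigma * rng.standard_normal((lam, dim))
        fit = np.array([f(x) for x in pop])
        i = int(fit.argmin())
        mean = pop[i]  # comma selection: continue from the best offspring
        if fit[i] < best_f:
            best_x, best_f = pop[i].copy(), float(fit[i])
        history.append(best_f)  # best-so-far value, one entry per generation
    return best_x, np.array(history)


def trajectory_feature(history):
    """Stand-in trajectory feature: slope of the log best-so-far values."""
    y = np.log(history - history.min() + 1e-12)  # shift to keep log defined
    return np.polyfit(np.arange(len(y)), y, 1)[0]


def per_run_selection(f, dim, switch_budget=500, total_budget=2000, seed=0):
    rng = np.random.default_rng(seed)
    # Phase 1: run the default optimizer and observe its trajectory.
    x0, history = phase_one_es(f, dim, switch_budget, rng)
    # Per-run decision: choose the second optimizer from the trajectory
    # feature. A steep, steady descent suggests local structure that a
    # quasi-Newton method can exploit; -0.05 is an arbitrary illustrative cut.
    method = "BFGS" if trajectory_feature(history) < -0.05 else "Nelder-Mead"
    # Phase 2 with warm start: the chosen optimizer continues from the best
    # point found in phase 1 instead of starting from scratch.
    res = minimize(f, x0, method=method,
                   options={"maxiter": total_budget - switch_budget})
    return method, res.x, res.fun


if __name__ == "__main__":
    sphere = lambda x: float(np.sum(np.asarray(x) ** 2))
    print(per_run_selection(sphere, dim=5))
```

In the paper's actual setup, the switch decision is made by a model trained on exploratory landscape analysis features of the sampled points together with time-series features of the CMA-ES internal state variables, and the warm start may transfer richer information accumulated in the first phase than a single starting point.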



Acknowledgment

The authors acknowledge financial support by the Slovenian Research Agency (research core grants No. P2-0103 and P2-0098, project grant No. N2-0239, and young researcher grant No. PR-09773 to AK), by the EC (grant No. 952215 - TAILOR), by the Paris Ile-de-France region, and by the CNRS INS2I institute.

Author information

Correspondence to Anja Jankovic.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Kostovska, A. et al. (2022). Per-run Algorithm Selection with Warm-Starting Using Trajectory-Based Features. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol 13398. Springer, Cham. https://doi.org/10.1007/978-3-031-14714-2_4


  • DOI: https://doi.org/10.1007/978-3-031-14714-2_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-14713-5

  • Online ISBN: 978-3-031-14714-2

  • eBook Packages: Computer Science, Computer Science (R0)
