
Experimental Analysis of Optimization Algorithms: Tuning and Beyond

Theory and Principled Methods for the Design of Metaheuristics

Part of the book series: Natural Computing Series (NCS)

Abstract

This chapter distills several years of tutorials the authors have given on experimental research in evolutionary computation. We highlight the renaissance of experimental techniques in other fields and then focus on the specific conditions of experimental research in computer science, more concretely in metaheuristic optimization. The experimental setup is discussed together with the pitfalls awaiting the inexperienced (and sometimes even the experienced). We present the severity criterion as a meta-statistical concept for evaluating statistical inferences; it can be used to avoid fallacies, i.e., misconceptions resulting from incorrect reasoning, for instance in the presence of floor or ceiling effects. Sequential parameter optimization is discussed as a meta-statistical framework that integrates concepts such as severity. Parameter tuning is treated as a relatively new tool in method design and analysis, and it leads to the question of the adaptability of optimization algorithms. Another branch of experimentation aims at obtaining more concrete problem knowledge; we term it “exploratory landscape analysis”, and it comprises sampling and visualization techniques that are often applied but rarely recognized as a methodological contribution. However, this chapter is not merely a retelling of well-known facts. We also attempt to look into the future and estimate which methodological research topics will dominate in the coming years and what changes the community as a whole may expect.
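
To make the severity criterion concrete, the following is a minimal sketch of the severity computation in the sense of Mayo and Spanos, assuming a one-sided test of H0: μ ≤ μ0 against H1: μ > μ0 for a Gaussian sample mean with known standard deviation; the symbols x̄0, μ0, μ1, σ, and n are illustrative and not the chapter's own notation. With observed test statistic

    d(x_0) = \frac{\bar{x}_0 - \mu_0}{\sigma / \sqrt{n}},

the severity with which the data support the claim μ > μ1 is

    \mathrm{SEV}(\mu > \mu_1)
      = P\bigl(d(X) \le d(x_0) \,;\, \mu = \mu_1\bigr)
      = \Phi\!\left( \frac{\bar{x}_0 - \mu_1}{\sigma / \sqrt{n}} \right).

The claim passes with high severity only if a result as extreme as the observed one would have been improbable were μ in fact no larger than μ1; this is the kind of meta-statistical check the abstract refers to.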

Notes

  1. SPOT can generate 100 randomly chosen design points for the SANN algorithm by using the following settings in the CONF file: init.design.size = 100 and init.design.repeats = 1 (a minimal sketch is given after these notes).

  2. R is a freely available language and environment for statistical computing and graphics that provides a wide variety of statistical and graphical techniques. CRAN is a network of FTP and web servers around the world that store identical, up-to-date versions of the code and documentation for R; see http://cran.r-project.org.
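
As a small illustration connecting these two notes, the R lines below obtain SPOT from CRAN and list the two CONF-file settings quoted in note 1. Only install.packages() and library() are standard R calls; the file name sann.conf and the comments are illustrative assumptions, and the two settings are taken verbatim from note 1.

    ## R console: obtain the SPOT package from CRAN (note 2).
    install.packages("SPOT")
    library("SPOT")

    ## Hypothetical CONF file, e.g. sann.conf (note 1): request an initial
    ## design of 100 randomly chosen points, each evaluated once.
    init.design.size = 100
    init.design.repeats = 1

How the CONF file is handed to SPOT depends on the package version, so this is a sketch of the settings rather than a complete tuning run.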

Acknowledgements

This work was supported by the Bundesministerium für Bildung und Forschung (BMBF) under the grants FIWA (AIF FKZ 17N2309) and MCIOP (AIF FKZ 17N0311), and by the Cologne University of Applied Sciences under the research focus grant COSA.

Author information

Correspondence to Thomas Bartz-Beielstein.

Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Bartz-Beielstein, T., Preuss, M. (2014). Experimental Analysis of Optimization Algorithms: Tuning and Beyond. In: Borenstein, Y., Moraglio, A. (eds) Theory and Principled Methods for the Design of Metaheuristics. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33206-7_10

  • DOI: https://doi.org/10.1007/978-3-642-33206-7_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33205-0

  • Online ISBN: 978-3-642-33206-7
