
Review of Parameter Tuning Methods for Nature-Inspired Algorithms

  • Chapter in: Benchmarks and Hybrid Algorithms in Optimization and Applications

Abstract

Almost all optimization algorithms have algorithm-dependent parameters, and the settings of such parameter values can strongly influence the behavior of the algorithm under consideration. Thus, proper parameter tuning should be carried out to ensure that the algorithm performs well and is sufficiently robust for solving different types of optimization problems. This chapter reviews some of the main methods for parameter tuning and then highlights important issues concerning the latest developments in parameter tuning. A few open problems are also discussed, with some recommendations for future research.
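To make the tuning problem concrete, the sketch below illustrates one of the simplest offline tuning strategies covered by reviews of this kind: random search over an algorithm's control parameters. It is a minimal illustration under stated assumptions, not the chapter's own method; the small particle swarm optimizer, the sphere benchmark, the parameter ranges, and all function names are illustrative choices made here.

```python
import numpy as np

# Illustrative benchmark: the sphere function (global minimum 0 at the origin).
def sphere(x):
    return float(np.sum(x ** 2))

# A deliberately small PSO; w, c1 and c2 are the algorithm-dependent
# parameters whose values we want to tune.
def pso(objective, dim=10, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest_val

# Offline tuning by random search: sample candidate (w, c1, c2) settings,
# average each over a few independent runs, and keep the best performer.
def random_search_tuning(n_candidates=30, n_runs=3, seed=42):
    rng = np.random.default_rng(seed)
    best_params, best_score = None, np.inf
    for _ in range(n_candidates):
        params = {
            "w": rng.uniform(0.3, 0.9),
            "c1": rng.uniform(0.5, 2.5),
            "c2": rng.uniform(0.5, 2.5),
        }
        score = np.mean([pso(sphere, seed=r, **params) for r in range(n_runs)])
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

if __name__ == "__main__":
    params, score = random_search_tuning()
    print("best parameters:", params, "mean objective:", score)
```

Averaging each candidate setting over several independent runs, as done above, is a common precaution because nature-inspired algorithms are stochastic and a single run can be misleading.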



Author information

Corresponding author: Xin-She Yang.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Joy, G., Huyck, C., Yang, XS. (2023). Review of Parameter Tuning Methods for Nature-Inspired Algorithms. In: Yang, XS. (eds) Benchmarks and Hybrid Algorithms in Optimization and Applications. Springer Tracts in Nature-Inspired Computing. Springer, Singapore. https://doi.org/10.1007/978-981-99-3970-1_3

