Abstract
Almost all optimization algorithms have algorithm-dependent parameters, and the values chosen for these parameters can strongly influence the behavior of the algorithm under consideration. Thus, proper parameter tuning should be carried out to ensure that the algorithm performs well and is sufficiently robust for solving different types of optimization problems. This chapter reviews the main methods for parameter tuning and then highlights important issues concerning the latest developments in parameter tuning. A few open problems are also discussed, with some recommendations for future research.
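To make the tuning problem concrete, the sketch below shows one of the simplest offline tuning strategies the chapter's topic covers: random search over a parameter of a toy optimizer. The hill climber, the `step_size` parameter, and the sphere benchmark are illustrative assumptions, not methods from the chapter itself; real studies would tune real metaheuristics (e.g., firefly or particle swarm algorithms) over several benchmark problems.

```python
import random

def sphere(x):
    """Benchmark objective: f(x) = sum(x_i^2), with minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def hill_climber(objective, dim, step_size, iters=200, seed=0):
    """A toy (1+1) stochastic hill climber; its performance depends on step_size."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = objective(x)
    for _ in range(iters):
        cand = [xi + rng.gauss(0, step_size) for xi in x]
        fc = objective(cand)
        if fc < fx:        # accept only improving moves
            x, fx = cand, fc
    return fx

def random_search_tuning(n_trials=30, seed=42):
    """Offline tuning: sample step_size values and keep the best performer."""
    rng = random.Random(seed)
    best_param, best_score = None, float("inf")
    for _ in range(n_trials):
        step = 10 ** rng.uniform(-3, 0)  # log-uniform sample in [0.001, 1]
        # Average over a few independent runs to reduce noise in the estimate.
        score = sum(hill_climber(sphere, 5, step, seed=s) for s in range(3)) / 3
        if score < best_score:
            best_param, best_score = step, score
    return best_param, best_score
```

The averaging over several runs matters because metaheuristics are stochastic: a single run gives a noisy performance estimate, which is one of the issues that makes parameter tuning computationally expensive in practice.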
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this chapter
Joy, G., Huyck, C., Yang, XS. (2023). Review of Parameter Tuning Methods for Nature-Inspired Algorithms. In: Yang, XS. (eds) Benchmarks and Hybrid Algorithms in Optimization and Applications. Springer Tracts in Nature-Inspired Computing. Springer, Singapore. https://doi.org/10.1007/978-981-99-3970-1_3
Print ISBN: 978-981-99-3969-5
Online ISBN: 978-981-99-3970-1
eBook Packages: Intelligent Technologies and Robotics (R0)