A survey on optimization metaheuristics
Introduction
We roughly define hard optimization problems as problems that cannot be solved to optimality, or to any guaranteed bound, by any exact (deterministic) method within a “reasonable” time limit. These problems can be divided into several categories depending on whether they are continuous or discrete, constrained or unconstrained, mono- or multi-objective, static or dynamic. To find satisfactory solutions for these problems, metaheuristics can be used. A metaheuristic is an algorithm designed to approximately solve a wide range of hard optimization problems without having to be deeply adapted to each problem. Indeed, the Greek prefix “meta”, present in the name, indicates that these algorithms are “higher level” heuristics, in contrast with problem-specific heuristics. Metaheuristics are generally applied to problems for which no satisfactory problem-specific algorithm is available. They are widely used to solve complex problems in industry and services, in areas ranging from finance to production management and engineering.
Almost all metaheuristics share the following characteristics: they are nature-inspired (based on some principles from physics, biology or ethology); they make use of stochastic components (involving random variables); they do not use the gradient or Hessian matrix of the objective function; they have several parameters that need to be fitted to the problem at hand.
In the last thirty years, great interest has been devoted to metaheuristics. We can point out some of the steps that have marked their history. One pioneering contribution is the proposition of the simulated annealing method by Kirkpatrick et al. in 1982 [150]. In 1986, the tabu search was proposed by Glover [104], and the artificial immune system was proposed by Farmer et al. [83]. In 1988, Koza registered his first patent on genetic programming, later published in 1992 [154]. In 1989, Goldberg published a well-known book on genetic algorithms [110]. In 1992, Dorigo completed his PhD thesis, in which he describes his innovative work on ant colony optimization [69]. In 1993, the first algorithm based on bee colonies was proposed by Walker et al. [277]. Another significant advance was the development of particle swarm optimization by Kennedy and Eberhart in 1995 [145]. The same year, Hansen and Ostermeier proposed CMA-ES [121]. In 1996, Mühlenbein and Paaß proposed the estimation of distribution algorithm [190]. In 1997, Storn and Price proposed differential evolution [253]. In 2002, Passino introduced an optimization algorithm based on bacterial foraging [200]. Then, Simon proposed a biogeography-based optimization algorithm in 2008 [247].
The considerable development of metaheuristics can be explained by the significant increase in the processing power of computers, and by the development of massively parallel architectures. These hardware improvements help offset the high CPU-time cost of metaheuristics.
A metaheuristic will be successful on a given optimization problem if it can provide a balance between exploration (diversification) and exploitation (intensification). Exploration is needed to identify parts of the search space containing high-quality solutions. Exploitation is important to intensify the search in the promising areas identified by the accumulated search experience. The main differences between the existing metaheuristics concern the particular way in which they try to achieve this balance [28]. Many classification criteria may be used for metaheuristics, for instance the search path they follow, the use of memory, the kind of neighborhood exploration used, or the number of current solutions carried from one iteration to the next. For a more formal classification of metaheuristics we refer the reader to [28], [258]. The distinction between single-solution based metaheuristics and population-based metaheuristics is often taken to be fundamental in the literature. Roughly speaking, basic single-solution based metaheuristics are more exploitation oriented, whereas basic population-based metaheuristics are more exploration oriented.
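The exploration–exploitation trade-off can be made concrete with simulated annealing, whose temperature parameter explicitly shifts the search from exploratory (almost any move accepted) to exploitative (only improvements accepted). The following minimal Python sketch is illustrative only and not taken from the survey; the cost function, step size, cooling schedule, and parameter values are arbitrary choices for demonstration.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=10.0, cooling=0.99, iters=2000, seed=0):
    """Minimize f from x0. High temperature favors exploration;
    as t cools, the search increasingly exploits the current region."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)   # random neighbor of the current point
        fc = f(cand)
        # Always accept improvements; accept worsening moves with
        # probability exp(-delta/t), which shrinks as t decreases.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Example: a 1-D multimodal function with several local minima
xmin, fmin = simulated_annealing(lambda x: x * x + 3 * math.sin(5 * x), x0=4.0)
```

At high temperature the acceptance test behaves like a random walk (diversification); near the end it degenerates into greedy local descent (intensification), which is exactly the balance discussed above.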
The purpose of this paper is to present a global overview of the main metaheuristics and their principles. The survey is structured as follows. Section 2 briefly presents the class of single-solution based metaheuristics and the main algorithms that belong to this class, i.e. the simulated annealing method, the tabu search, the GRASP method, the variable neighborhood search, the guided local search, the iterated local search, and their variants. Section 3 describes the class of population-based metaheuristics, which manipulate a collection of solutions rather than a single solution at each stage. Section 3.1 describes the field of evolutionary computation and outlines the common search components of this family of algorithms (e.g., selection, variation, and replacement). In this subsection, the focus is on evolutionary algorithms such as genetic algorithms, evolution strategies, evolutionary programming, and genetic programming. Section 3.2 presents other evolutionary algorithms such as estimation of distribution algorithms, differential evolution, coevolutionary algorithms, cultural algorithms, and scatter search and path relinking. Section 3.3 contains an overview of a family of nature-inspired algorithms related to Swarm Intelligence. The main algorithms belonging to this field are ant colony optimization, particle swarm optimization, bacterial foraging, bee colonies, artificial immune systems, and biogeography-based optimization. Finally, a discussion of the current research status and the most promising paths of future research is presented in Section 4.
Single-solution based metaheuristics
In this section, we outline single-solution based metaheuristics, also called trajectory methods. Unlike population-based metaheuristics, they start with a single initial solution and move away from it, describing a trajectory in the search space. Some of them can be seen as “intelligent” extensions of local search algorithms. Trajectory methods mainly encompass the simulated annealing method, the tabu search, the GRASP method, the variable neighborhood search, the guided local search, the iterated local search, and their variants.
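Among the trajectory methods listed here, tabu search illustrates well how a single-solution method escapes local optima: it always moves to the best admissible neighbor, even a worsening one, while a short-term memory forbids recently reversed moves. The sketch below is a hypothetical minimal implementation for bit-vector problems, not code from the survey; the neighborhood (single bit flips), tenure, and test function are illustrative assumptions.

```python
from collections import deque

def tabu_search(f, x0, tenure=5, iters=100):
    """Minimal tabu search over bit vectors. Neighbors differ by one
    bit flip; recently flipped positions stay tabu for `tenure` moves."""
    x = list(x0)
    best, fbest = x[:], f(x)
    tabu = deque(maxlen=tenure)       # short-term memory of flipped positions
    for _ in range(iters):
        candidates = []
        for i in range(len(x)):
            y = x[:]
            y[i] ^= 1
            fy = f(y)
            # Aspiration criterion: a tabu move is still allowed
            # if it beats the best solution found so far.
            if i not in tabu or fy < fbest:
                candidates.append((fy, i, y))
        if not candidates:
            break
        fy, i, y = min(candidates)    # best admissible neighbor,
        x = y                         # accepted even if it worsens the cost
        tabu.append(i)
        if fy < fbest:
            best, fbest = y[:], fy
    return best, fbest

# Toy objective: reach exactly three 1-bits in an 8-bit vector
best, fbest = tabu_search(lambda b: (sum(b) - 3) ** 2, [1] * 8)
```

Because the search must keep moving even from an optimum, the best solution found is recorded separately, a pattern shared by most trajectory methods.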
Population-based metaheuristics
Population-based metaheuristics deal with a set (i.e. a population) of solutions rather than with a single solution. The most studied population-based methods are related to Evolutionary Computation (EC) and Swarm Intelligence (SI). EC algorithms are inspired by Darwin’s evolutionary theory, where a population of individuals is modified through recombination and mutation operators. In SI, the idea is to produce computational intelligence by exploiting simple analogs of social interaction rather than purely individual cognitive abilities.
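To make the EC loop of selection, recombination, and mutation concrete, here is a minimal generational genetic algorithm in Python. It is an illustrative sketch rather than any specific algorithm from the survey; the tournament size, one-point crossover, bit-flip mutation rate, and the OneMax test problem are all arbitrary demonstration choices.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=60,
                      cx_rate=0.9, mut_rate=0.02, seed=1):
    """Minimal generational GA: binary tournament selection,
    one-point crossover, and bit-flip mutation (maximization)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        # Pick two random individuals, keep the fitter one
        a, b = rng.sample(pop, 2)
        return max(a, b, key=fitness)

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < cx_rate:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1 = p1[:cut] + p2[cut:]
                c2 = p2[:cut] + p1[cut:]
            else:                               # or copy parents unchanged
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):                  # bit-flip mutation
                nxt.append([b ^ 1 if rng.random() < mut_rate else b
                            for b in c])
        pop = nxt[:pop_size]                    # generational replacement
    return max(pop, key=fitness)

# OneMax: maximize the number of 1-bits in the chromosome
best = genetic_algorithm(sum)
```

The same loop structure reappears, with different variation operators, in most of the evolutionary algorithms discussed in Section 3.1.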
Discussion and conclusions
This work surveyed several important metaheuristic methods as they are described in the literature. Some of them are single-solution based and others are population-based; although they rest on different philosophies, a number of these metaheuristics are implemented in an increasingly similar way. A unified presentation of these methods has been proposed under the name of adaptive memory programming (AMP) [256]. An important principle behind AMP is that a memory containing a
References (287)
Ant colony optimization: introduction and recent trends, Physics of Life Reviews (2005)
Hybrid metaheuristics in combinatorial optimization: a survey, Applied Soft Computing (2011)
Physical mechanisms for chemotactic pattern formation by bacteria, Biophysical Journal (1998)
Gaussian variable neighborhood search for continuous optimization, Computers & Operations Research (2012)
Optimization of type-2 fuzzy systems based on bio-inspired methods: a concise review, Information Sciences (2012)
The noising method: a new method for combinatorial optimization, Operations Research Letters (1993)
The noising methods: a generalization of some metaheuristics, European Journal of Operational Research (2001)
Tabu search applied to global optimization, European Journal of Operational Research (2000)
Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art, Computer Methods in Applied Mechanics and Engineering (2002)
Recent advances in artificial immune systems: models and applications, Applied Soft Computing (2011)
Ant colony optimization theory: a survey, Theoretical Computer Science
Threshold accepting: a general purpose optimization algorithm appearing superior to simulated annealing, Journal of Computational Physics
The immune system, adaptation, and machine learning, Physica D
A probabilistic heuristic for a computationally difficult set covering problem, Operations Research Letters
Swarm Intelligence in Data Mining
Danger theory: the link between AIS and IDS?, Artificial Immune Systems
Parallel Metaheuristics: A New Class of Algorithms
A survey of parallel distributed genetic algorithms, Complexity
Evolutionary optimization versus particle swarm optimization: philosophy and performance differences
Multiple objective ant colony optimisation, Swarm Intelligence
Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms
An overview of evolutionary algorithms for parameter optimization, Evolutionary Computation
Using optimal dependency-trees for combinatorial optimization: learning the structure of the search space
A review of particle swarm optimization. Part I: background and development, Natural Computing
A review of particle swarm optimization. Part II: hybridisation, combinatorial, multicriteria and constrained optimization, and indicative applications, Natural Computing
Reactive Search and Intelligent Optimization
The reactive tabu search, ORSA Journal on Computing
Local optima avoidance in depot location, Journal of the Operational Research Society
An overview of genetic algorithms. Part I: fundamentals, University Computing
An overview of genetic algorithms. Part II: research topics, University Computing
Evolution strategies – a comprehensive introduction, Natural Computing
A survey on metaheuristics for stochastic combinatorial optimization, Natural Computing
Particle swarm optimization in dynamic environments
A comparison of selection schemes used in genetic algorithms, Evolutionary Computation
Metaheuristics in combinatorial optimization: overview and conceptual comparison, ACM Computing Surveys
Self-adaptive differential evolution algorithm using population size reduction and three strategies, Soft Computing – A Fusion of Foundations, Methodologies and Applications
Artificial Immune Systems: A New Computational Intelligence Approach
aiNet: an artificial immune network for data analysis
Learning and optimization using the clonal selection principle, IEEE Transactions on Evolutionary Computation