Evolutionary multimodal optimization using the principle of locality
Introduction
Since the genetic algorithm was proposed by Holland [17] in the early 1970s, researchers have been exploring the power of evolutionary algorithms in domains such as biological pattern discovery [64] and computer vision [65]. In particular, their function optimization capability has been highlighted [14] because of their high adaptability to non-convex function landscapes to which traditional optimization techniques cannot be applied.
Real-world problems often have multiple optimal solutions. For instance, in the varied-line-spacing holographic grating design problem, optical engineers need to tune the recording parameters to obtain as many optimal solutions as possible across multiple trials, because the design constraints are too difficult to express and solve in closed mathematical form [46]. Unfortunately, most traditional optimization techniques solve for a single optimal solution; they must be applied several times, and even then all solutions are not guaranteed to be found. The multimodal optimization problem was therefore proposed: we are interested not only in a single optimal point but in all of them. Given an objective function, an algorithm is expected to find all optimal points in a single run. With their strong parallel search capability, evolutionary algorithms have been shown to be particularly effective at solving this type of problem [14]: given $f : X \to \mathbb{R}$, we would like to find all global and local maxima (or minima) of $f$ in a single run.
Definition 1 (Local Maximum [60]). A local maximum $\hat{x}_l \in X$ of an (objective) function $f : X \to \mathbb{R}$ is an input element with $f(\hat{x}_l) \ge f(x)$ for all $x$ neighboring $\hat{x}_l$. If $X \subseteq \mathbb{R}$, then $\forall \hat{x}_l \; \exists \varepsilon > 0 : f(\hat{x}_l) \ge f(x) \;\; \forall x \in X, \ |x - \hat{x}_l| < \varepsilon$.
Definition 2 (Global Maximum [60]). A global maximum $\hat{x} \in X$ of an (objective) function $f : X \to \mathbb{R}$ is an input element with $f(\hat{x}) \ge f(x)$ for all $x \in X$.
Although the objective is clear, it is not easy to satisfy in practice because some problems may have too many optima to locate. Nonetheless, how these problems can be solved remains of great interest to researchers, because algorithms for multimodal optimization usually not only locate multiple optima in a single run but also preserve population diversity throughout the run, which gives them global optimization ability on multimodal functions. This ability is demonstrated in two applications and an extended numerical experiment in this study. Moreover, techniques for multimodal optimization are often borrowed as diversity maintenance techniques in other problem settings.
Section snippets
Related works
The work by De Jong [20] is one of the first known attempts to solve the multimodal optimization problem with an evolutionary algorithm. He introduced the crowding technique to increase the chance of locating multiple optima: an offspring replaces the parent that is most similar to the offspring itself. Such a strategy preserves diversity and maintains different niches within a run. Twelve years later, Goldberg and Richardson [15] proposed a fitness-sharing niching technique.
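To make the crowding replacement concrete, here is a minimal Python sketch of the similarity-based replacement step described above (an illustration only, not the authors' implementation; Euclidean distance as the similarity measure and maximization are assumptions):

import numpy as np

def crowding_replace(population, fitness, offspring, offspring_fitness):
    """Crowding replacement: the offspring competes with the most similar
    individual in the population and replaces it only if fitter."""
    # Most similar individual under Euclidean distance (assumed metric).
    distances = np.linalg.norm(population - offspring, axis=1)
    nearest = int(np.argmin(distances))
    # Maximization assumed: higher fitness wins the pairwise competition.
    if offspring_fitness > fitness[nearest]:
        population[nearest] = offspring
        fitness[nearest] = offspring_fitness
    return population, fitness

Because each offspring can only displace the individual nearest to it, winners in one basin of attraction never overwrite members of a distant niche, which is what preserves multiple niches in a run.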
Differential evolution
Differential evolution was first proposed by Storn and Price [57]. Without loss of generality, a typical strategy in differential evolution (DE/rand/1) [13] is shown in Algorithm 1.

Algorithm 1 Differential Evolution
Pt: population at time t; TP: transient population
t ← 0;
Initialize Pt;
Evaluate Pt;
while not termination condition do
  TP ← ∅;
  for ∀ indivi ∈ Pt do
    Offspring ← TrialVectorGeneration(indivi);
    Evaluate Offspring;
    if Offspring is fitter than indivi then
      Put Offspring into TP;
    else
      Put indivi into TP;
    end if
  end for
  t ← t + 1;
  Pt ← TP;
end while
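As a concrete rendering of Algorithm 1, the following Python sketch implements DE/rand/1 with binomial crossover for maximization (a minimal sketch; the scale factor F, crossover rate CR, and bound handling are standard defaults assumed here, as the snippet does not specify them):

import numpy as np

def de_rand_1(f, bounds, pop_size=50, F=0.5, CR=0.9, generations=200, rng=None):
    """DE/rand/1/bin maximizing f over a box given as a (d, 2) array of bounds."""
    rng = np.random.default_rng(rng)
    lo, hi = bounds[:, 0], bounds[:, 1]
    d = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, d))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        new_pop, new_fit = pop.copy(), fit.copy()   # transient population TP
        for i in range(pop_size):
            # Three mutually distinct individuals, all different from i.
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])   # differential mutation
            mutant = np.clip(mutant, lo, hi)             # keep within bounds
            # Binomial crossover with one guaranteed mutant gene.
            mask = rng.random(d) < CR
            mask[rng.integers(d)] = True
            trial = np.where(mask, mutant, pop[i])
            trial_fit = f(trial)
            if trial_fit > fit[i]:                       # greedy one-to-one selection
                new_pop[i], new_fit[i] = trial, trial_fit
        pop, fit = new_pop, new_fit
    return pop, fit

For example, de_rand_1(lambda x: -float(np.sum(x ** 2)), np.array([[-5.0, 5.0]] * 2)) maximizes a two-dimensional concave function; new_pop plays the role of TP in Algorithm 1.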
Evolutionary multimodal optimization using the principle of locality
In this section, the proposed methods are described in detail. As a matter of nomenclature, a 'parent' refers to the parent individual used as the input to a trial vector generation (i.e., indivi in Algorithm 2), whereas an 'offspring' refers to the resultant individual returned by a trial vector generation (i.e., Offspring in Algorithm 2).
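The excerpt above establishes only the nomenclature; the paper's actual locality-based operators are not reproduced here. Purely as a speculative illustration of how spatial locality could enter trial vector generation, the sketch below restricts the base and difference vectors to the parent's k nearest neighbors. The neighborhood size k and all parameter values are assumptions, not the authors' operator:

import numpy as np

def trial_vector_spatial(pop, parent_idx, k=10, F=0.5, CR=0.9, rng=None):
    """DE/rand/1-style trial vector whose donors are drawn only from the k
    nearest neighbors of the parent, as one possible reading of spatial locality."""
    rng = rng or np.random.default_rng()
    parent = pop[parent_idx]
    d = pop.shape[1]
    # k nearest neighbors of the parent, excluding the parent itself.
    distances = np.linalg.norm(pop - parent, axis=1)
    neighbors = np.argsort(distances)[1:k + 1]
    r1, r2, r3 = rng.choice(neighbors, size=3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])
    # Binomial crossover with one guaranteed mutant gene.
    mask = rng.random(d) < CR
    mask[rng.integers(d)] = True
    return np.where(mask, mutant, parent)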
Numerical experiments
Experiments were conducted on ten benchmark functions to compare the performance of CrowdingDE-STL, CrowdingDE-TL, and CrowdingDE-SL against other algorithms: Crowding Genetic Algorithm (CrowdingGA) [20], CrowdingDE [59], Fitness Sharing Genetic Algorithm (SharingGA) [15], SharingDE [59], Species Conserving Genetic Algorithm (SCGA) [30], SDE [31], and UN [21]. The first five benchmark functions are widely adopted in the literature: Deb's 1st function [61], the Himmelblau function
Applications
To demonstrate the effectiveness of the proposed methods, two applications are described in the following sections. One is the varied-line-spacing holographic grating design problem, while the other is the protein structure prediction problem on a lattice model.
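For the second application, the lattice model referenced in this paper's bibliography is the hydrophobic-hydrophilic (HP) model, in which a protein is a self-avoiding walk on a lattice and each pair of non-consecutive hydrophobic residues on adjacent sites contributes -1 to the energy. The following two-dimensional sketch of that energy evaluation is an illustration only (the paper's exact encoding and lattice dimension may differ):

def hp_energy_2d(sequence, coords):
    """Energy of a 2D HP-lattice conformation: -1 for every pair of
    non-consecutive H residues occupying adjacent lattice sites.
    sequence: string of 'H'/'P'; coords: list of (x, y) lattice points."""
    # A self-avoiding conformation must not revisit a lattice site.
    assert len(set(coords)) == len(coords), "conformation overlaps itself"
    occupied = {xy: i for i, xy in enumerate(coords)}
    energy = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for nxy in ((x + 1, y), (x, y + 1)):  # count each undirected contact once
            j = occupied.get(nxy)
            if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                energy -= 1
    return energy

For example, hp_energy_2d('HHHH', [(0, 0), (1, 0), (1, 1), (0, 1)]) returns -1: the square conformation brings the first and last residues, which are not consecutive in the chain, onto adjacent sites.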
Extended numerical experiments
To assess the performance of the proposed methods thoroughly, extended numerical experiments were conducted to compare them with other state-of-the-art methods. Two types of recently published methods were selected. We strictly followed the corresponding experimental setups and compared our results with those reported in the related literature.
Conclusion
In this work, the preliminary study of spatial locality reported in [62] has been extended. Besides spatial locality, temporal locality has been proposed and incorporated, and the two localities have also been integrated together. The resulting methods have been compared with other state-of-the-art methods on the benchmark functions. To further analyse the effect of locality, experiments were conducted to observe the synergistic effect between spatial and temporal locality.
Acknowledgments
The authors would like to express their deep gratitude to the anonymous reviewers for their constructive comments. The authors would also like to thank Ling Qing for his source code and insightful discussions. Last but not least, the authors would like to thank Kwong-Sak Leung and Man-Hon Wong for their contributions. ZZ acknowledges funding support from an NSERC Discovery Grant (RGPIN 327612-09).
References (67)
- et al., Adaptive fuzzy particle swarm optimization for global optimization of multimodal functions, Information Sciences (2011)
- et al., Force-imitated particle swarm optimization using the near-neighbor effect for locating multiple optima, Information Sciences (2012)
- et al., Crowding clustering genetic algorithm for multimodal function optimization, Applied Soft Computing (2008)
- et al., Ensemble of niching algorithms, Information Sciences (2010)
- Handbook of Computational Molecular Biology (Chapman & Hall/CRC Computer and Information Science Series) (2005)
- et al., Protein structure prediction and structural genomics, Science (2001)
- et al., A sequential niche technique for multimodal function optimization, Evolutionary Computation (1993)
- B. Berger, T. Leighton, Protein folding in the hydrophobic–hydrophilic (HP) model is NP-complete, in: RECOMB ’98: Proceedings...
- H. Bersini, M. Dorigo, S. Langerman, G. Seront, L. Gambardella, Results of the first international contest on...
- M. Bessaou, A. Pétrowski, P. Siarry, Island model cooperating with speciation for multimodal optimization, in: PPSN VI:...
- Time Series Analysis: Forecasting & Control
- An immune algorithm for protein structure prediction on lattice models, IEEE Transactions on Evolutionary Computation
- The locality principle, Communications of the ACM
- Theory for the folding and stability of globular proteins, Biochemistry
- Introduction to Evolutionary Computing (Natural Computing Series)
- Differential Evolution: In Search of Solutions (Springer Optimization and Its Applications)
- Genetic Algorithms in Search
- Completely derandomized self-adaptation in evolution strategies, Evolutionary Computation
- Adaptation in Natural and Artificial Systems
- A framework for evolutionary optimization with approximate fitness functions, IEEE Transactions on Evolutionary Computation
- Evolutionary Computation. A Unified Approach
- Minimal representation multisensor fusion using differential evolution, IEEE Transactions on Systems, Man and Cybernetics Part A: Systems and Humans
- Mechanical engineering design optimization by differential evolution, New Ideas in Optimization
- Emergence of preferred structures in a simple model of protein folding, Science
- A species conserving genetic algorithm for multimodal function optimization, Evolutionary Computation