Gravitational swarm optimizer for global optimization
Introduction
Constrained optimization problems (COPs) are an important class of problems in the field of optimization, because many real-life problems arising in engineering, computer science, finance, and business science can be modeled as nonlinear constrained optimization problems. The formulation of any real-life problem into a constrained optimization problem involves the use of many parameters. Determination of the optimal value of each of these parameters is very important because together these values provide the solution to the problems.
Mathematically, a constrained optimization problem can be formulated as an objective function subject to linear and nonlinear constraints. The following model provides the mathematical description of a nonlinear constrained optimization problem:

$$\min_{x \in S} f(x), \quad x = (x_1, x_2, \ldots, x_D),$$

subject to a set of inequality constraints

$$g_j(x) \le 0, \quad j = 1, 2, \ldots, q,$$

as well as equality constraints

$$h_j(x) = 0, \quad j = q+1, \ldots, m,$$

where the objective function $f$ is defined over a subspace $S$ of the $D$-dimensional real vector space $\mathbb{R}^D$ and $x$ is a member of this space. The set of $q$ inequality constraints and $m-q$ equality constraints defines the feasible region $F \subseteq S$. $l_i$ and $u_i$ are the lower and upper bounds of the decision variables in the domain $S$, where $l_i \le x_i \le u_i$, $i = 1, \ldots, D$.
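As a concrete illustration (not taken from the paper), the formulation above can be encoded for a toy problem. The objective, constraints, bounds, and the equality tolerance below are invented purely for illustration.

```python
# Toy constrained problem (illustrative only):
#   minimize  f(x) = x1^2 + x2^2
#   subject to g(x) = 1 - x1 - x2 <= 0   (inequality constraint)
#              h(x) = x1 - x2     = 0    (equality constraint)
#   with bounds 0 <= x1, x2 <= 5 defining the search space S.

def f(x):
    return x[0] ** 2 + x[1] ** 2

def g(x):          # inequality constraint, satisfied when g(x) <= 0
    return 1.0 - x[0] - x[1]

def h(x):          # equality constraint, satisfied when h(x) == 0
    return x[0] - x[1]

def is_feasible(x, eps=1e-4):
    """Check feasibility; the equality constraint is relaxed to
    |h(x)| <= eps, as is customary in evolutionary constrained
    optimization."""
    in_bounds = all(0.0 <= xi <= 5.0 for xi in x)
    return in_bounds and g(x) <= 0.0 and abs(h(x)) <= eps

x = (0.5, 0.5)                  # lies exactly on both constraint boundaries
print(f(x), is_feasible(x))     # -> 0.5 True
```

Here the feasible region is the portion of the line x1 = x2 with x1 + x2 >= 1 inside the box, a much smaller set than S itself, which is exactly the kind of small feasible-to-search-region ratio discussed later.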
A class of optimization techniques is available in the literature for use with COPs. In principle, both deterministic and nondeterministic techniques are used to solve COPs. Unfortunately, the predefined assumptions of deterministic techniques restrict their applicability to specific classes of problems. This restriction directed us to focus on nondeterministic techniques, among which derivative-free, nature-inspired optimization techniques have become very popular because of their applicability to a wide range of optimization problems. This paper focuses on the development of a new meta-heuristic method for constrained optimization problems.
In recent years, many nature-inspired optimization techniques have been developed to solve constrained optimization problems. Initially, these techniques were only used to solve unconstrained optimization problems. Particle Swarm Optimization (PSO) [7], [28], [44], Differential Evolution (DE) [5], [33], [8], and the Gravitational Search Algorithm (GSA) [45] are known to deliver excellent performance for unconstrained optimization problems, but have been found to perform inconsistently on COPs. In particular, performance degrades strongly on problems of high complexity. The complexity in COPs mainly arises when the ratio of the feasible region to the search region is very small [29]. This level of complexity requires the combined use of different classes of algorithms to provide a more powerful constrained optimizer. Many modifications and hybridizations intended to improve the efficiency and robustness of these algorithms have appeared in the literature. Banks et al. [3] provide detailed information about possible improvements of the PSO algorithm by hybridization and exhaustively discuss the major benefits of this development. Huang [23] improved the DE algorithm by evolving two subpopulations. Lwin and Qu [31] proposed a hybrid algorithm integrating population-based incremental learning and DE for the solution of constrained portfolio selection problems.
The success of any constrained optimization algorithm mostly depends on the strength of its constraint-handling technique, the design of which has to be customized for the individual optimization algorithm. A few good constraint-handling mechanisms that are capable of performing well have been proposed. For example, Deb [19] proposed an efficient constraint-handling approach for genetic algorithms, whereas Coello [14] published a comprehensive survey of constraint-handling approaches for a large number of optimization algorithms. Mezura-Montes and Coello [35] also furnished a detailed report presenting the future scope and trends of constraint-handling mechanisms. In [18], the design of a very good constraint-handling method for multiple-swarm-based cultural PSO is described. The constraint-handling method discussed in [1] was successfully embedded within DE based on a penalty function. The FPBRM constraint-handling method proposed by Mun and Cho [40] for a modified harmony search algorithm also produced good results for optimization problems. The advantage of these algorithms lies in the fact that they were specifically designed for the technique being used for the optimization. This kind of constraint-handling mechanism is naturally compatible with the algorithm and enhances the performance of the optimizer. The overall message emerging from these studies is that an effective constraint-handling method should be based on the individual algorithm in which it will be utilized. This inspired the authors of this paper to propose a new constraint-handling mechanism that is appropriate for and compatible with the proposed optimization algorithm.
This research extends the concept of the recently proposed shrinking hypersphere PSO (SHPSO) [51], which was designed for unconstrained optimization and engineering design problems, in contrast to the method in [52], which was extended to constrained optimization problems. The performance of the SHPSO approach was improved using the GSA [45], and a global constrained optimizer was established with theoretical proofs of its convergence and stability.
The organization of the paper is as follows. Section 2 briefly introduces the concept of the GSA, and Sections 2.1 and 2.2 discuss the principles of PSO and SHPSO, respectively. Subsequently, in Section 3, the proposed SHPSO-GSA is presented, and Section 4 contains a detailed theoretical and experimental analysis of the proposed algorithm. Section 5 discusses the proposed constraint-handling method, and in Section 6 the experimental results are discussed, followed by the conclusions. A flow chart of the paper is depicted in Fig. 1.
Gravitational search algorithm
The GSA [45] is a recent meta-heuristic algorithm for solving nonlinear optimization problems. It is inspired by Newton's law of universal gravitation, which states that a force of attraction acts between every pair of particles in the universe, directly proportional to the product of their masses and inversely proportional to the square of the distance between their positions. In the GSA, each particle is equipped with four kinds of properties: position, inertial mass (Mii), active gravitational mass (Mai),
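The gravitational mechanics described above can be sketched as a minimal, generic GSA step. This is an illustrative simplification, not the paper's implementation: the mass normalization and the gravitational coefficient G follow the original GSA [45] in spirit, but details (random scaling, epsilon terms) are assumptions.

```python
import numpy as np

def gsa_step(X, fitness, G, rng, eps=1e-10):
    """One simplified GSA acceleration computation (minimization).
    X: (N, D) particle positions; fitness: (N,) objective values.
    Returns the (N, D) accelerations of all particles."""
    best, worst = fitness.min(), fitness.max()
    # Masses: better fitness -> larger mass, normalized to sum to 1.
    m = (fitness - worst) / (best - worst + eps)
    M = m / (m.sum() + eps)
    N, D = X.shape
    acc = np.zeros((N, D))
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            diff = X[j] - X[i]
            R = np.linalg.norm(diff)  # Euclidean distance between i and j
            # Randomly weighted pull of particle j on particle i; note that
            # the GSA divides by R (not R^2 as in Newton's law) in practice.
            acc[i] += rng.random() * G * M[j] * diff / (R + eps)
    return acc
```

Because acceleration equals force divided by inertial mass, the passive mass of particle i cancels out of its own acceleration, which is why only M[j] appears in the update.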
Motivation of hybridization
The fundamental motivation for designing the SHPSO-GSA was to utilize the memory-enabled behavior of PSO within the memory-less approach of the GSA; i.e., the GSA does not keep track of the path of any individual particle in its memory. The memory functionality was incorporated into the GSA by using it jointly with the recently proposed SHPSO. The advantage of the GSA is the constitutional diversity in the algorithm, which originates from the fundamental concept of defined acceleration of a particle. Because the
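The hybridization idea, combining GSA-driven acceleration (diversity, memory-less) with PSO's attraction to a remembered personal best (memory), can be illustrated generically. The update below is a common template for PSO-GSA hybrids and is not necessarily the authors' exact SHPSO-GSA equation; the coefficients w, c1, and c2 are assumed values.

```python
import numpy as np

def hybrid_velocity(v, x, pbest, acc_gsa, rng, w=0.7, c1=1.5, c2=1.5):
    """Generic PSO-GSA hybrid velocity update (illustrative template):
    inertia term + GSA gravitational acceleration (exploration)
    + attraction toward the particle's personal best (PSO memory)."""
    r1 = rng.random(v.shape)
    r2 = rng.random(v.shape)
    return w * v + c1 * r1 * acc_gsa + c2 * r2 * (pbest - x)
```

When acc_gsa is zero and the particle sits on its personal best, the update reduces to pure inertia, so the two added terms cleanly separate the GSA and PSO contributions.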
A detailed analysis of the proposed algorithm
This section presents a rigorous analysis of the proposed SHPSO-GSA. The effect of the modified velocity update equation, the theoretical convergence, and the converging ability are discussed in detail.
A new constraint handling method
A parameter-free constraint-handling approach is used to ensure the feasibility of the particles. The degree of violation of each constraint is evaluated using Eq. (42), and the total degree of violation of an individual x is evaluated as the sum of its violations over all constraints. In each iteration the swarm is sorted in the following three ways:
- (i)
The feasible solutions are listed in front of the infeasible solutions.
- (ii)
The feasible solutions are sorted in ascending order of their objective function values.
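A sketch of such feasibility-based sorting in Python follows. Since the snippet above is truncated, rule (iii), ordering the infeasible solutions by increasing total violation, is assumed here from the standard feasibility rules of Deb [19]; the equality tolerance eps is likewise an assumption.

```python
def total_violation(ineq_vals, eq_vals, eps=1e-4):
    """Sum of constraint violations: max(0, g_j(x)) for each inequality
    constraint, plus the amount by which |h_j(x)| exceeds the tolerance
    eps for each equality constraint."""
    v = sum(max(0.0, g) for g in ineq_vals)
    v += sum(max(0.0, abs(h) - eps) for h in eq_vals)
    return v

def sort_swarm(swarm):
    """swarm: list of dicts with keys 'f' (fitness) and 'viol' (total
    violation). Feasible solutions come first, ordered by fitness;
    infeasible solutions follow, ordered by violation (assumed rule (iii))."""
    return sorted(
        swarm,
        key=lambda s: (s["viol"] > 0.0,
                       s["f"] if s["viol"] == 0.0 else s["viol"]),
    )
```

Because the sort key is a tuple, all feasible individuals (first element False) precede all infeasible ones (True) regardless of their fitness values, which is the essence of a parameter-free feasibility rule.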
Experimental analysis and results
The proposed SHPSO-GSA is tested on the twenty-four benchmark problems proposed in IEEE CEC 2006 [29]. The results are compared with those of eleven state-of-the-art algorithms, which are listed in Table 4. The experiments were performed using the following experimental setup.
Performance of SHPSO-GSA on unconstrained problems
To study the performance of SHPSO-GSA on unconstrained optimization problems, it is applied to the CEC 2015 expensive optimization benchmark test problems [6]. All 15 problems are solved, and the results are compared with those of the state-of-the-art algorithms listed in Table 6.
The results are listed in the form of the best, worst, mean, and standard deviation (stdev) of the fitness values for each problem in Table 12, Table 13. The best result for each algorithm is presented in
Algorithm complexity
The time complexity of SHPSO-GSA is studied based on the strategy defined in the CEC 2015 benchmark [6]. The measurement strategy employed is presented in Algorithm 5.

Algorithm 5 Strategy for the calculation of algorithm complexity.
1: Run the test program below:
2: for i = 1:1,000,000 do
3:    x = 0.55 + i;
4:    x = x + x; x = x/2; x = x*x; x = sqrt(x); x = log(x); x = exp(x); x = x/(x+2);
5: end for
6: T0 = computing time of the above loop;
7: T1 = average complete computing time of the algorithm;
8: Complexity = T1/T0.
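The timing protocol of Algorithm 5 can be implemented as below. The number of averaging runs and the placeholder optimizer call are illustrative assumptions; the baseline loop of elementary floating-point operations follows the CEC benchmark convention.

```python
import math
import time

def baseline_time():
    """T0: time for a standard loop of basic floating-point operations,
    used as a machine-independent baseline."""
    t = time.perf_counter()
    for i in range(1, 1_000_001):
        x = 0.55 + float(i)
        x = x + x
        x = x / 2.0
        x = x * x
        x = math.sqrt(x)
        x = math.log(x)
        x = math.exp(x)
        x = x / (x + 2.0)
    return time.perf_counter() - t

def algorithm_complexity(run_algorithm, runs=5):
    """Complexity = T1 / T0, where T1 is the average complete computing
    time of the optimizer (run_algorithm) over several runs."""
    t0 = baseline_time()
    t = time.perf_counter()
    for _ in range(runs):
        run_algorithm()
    t1 = (time.perf_counter() - t) / runs
    return t1 / t0
```

Normalizing by T0 makes the reported complexity comparable across machines, since both the baseline loop and the optimizer run on the same hardware.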
Conclusion
This paper presents a new algorithm named SHPSO-GSA, which was developed by hybridizing shrinking hypersphere-based PSO with the GSA to produce an optimizer capable of improved constrained optimization. The need for and design of the proposed algorithm are well established and justified in various respects. The validity of the designed hybrid was tested in multiple ways with positive results. An effective constraint-handling technique, which is compatible with the proposed algorithm, was defined to ensure
Acknowledgment
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIP) (No. 2013R1A2A1A01013886) and the National Institute of Technology Uttarakhand, India. We would like to express our gratitude to the anonymous reviewers for their valuable suggestions to improve the quality of the paper.
References (55)
- et al., A prototype classifier based on gravitational search algorithm, Appl. Soft Comput. (2012)
- et al., Enhancing distributed differential evolution with multicultural migration for global numerical optimization, Inf. Sci. (2013)
- Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art, Comput. Methods Appl. Mech. Eng. (2002)
- et al., A note on teaching-learning-based optimization algorithm, Inf. Sci. (2012)
- et al., Replication and comparison of computational experiments in applied evolutionary computing: common pitfalls and guidelines to avoid them, Appl. Soft Comput. (2014)
- An efficient constraint handling method for genetic algorithms, Comput. Methods Appl. Mech. Eng. (2000)
- et al., A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms, Swarm Evol. Comput. (2011)
- et al., An effective co-evolutionary differential evolution for constrained optimization, Appl. Math. Comput. (2007)
- et al., Hybrid ICA-PSO algorithm for continuous optimization, Appl. Math. Comput. (2013)
- et al., A hybrid approach based on an improved gravitational search algorithm and orthogonal crossover for optimal shape design of concrete gravity dams, Appl. Soft Comput. (2014)
- Teaching and peer-learning particle swarm optimization, Appl. Soft Comput.
- Differential evolution algorithm with ensemble of parameters and mutation strategies, Appl. Soft Comput.
- Constraint-handling in nature-inspired numerical optimization: past, present and future, Swarm Evol. Comput.
- Differential evolution in constrained numerical optimization: an empirical study, Inf. Sci.
- Constrained optimization based on modified differential evolution algorithm, Inf. Sci.
- Modified harmony search optimization for constrained design problems, Expert Syst. Appl.
- Compact particle swarm optimization, Inf. Sci.
- GSA: a gravitational search algorithm, Inf. Sci.
- A two-swarm cooperative particle swarms optimization, Swarm Evol. Comput.
- The particle swarm optimization algorithm: convergence analysis and parameter selection, Inf. Process. Lett.
- A study of particle swarm optimization particle trajectories, Inf. Sci.
- Shrinking hypersphere based trajectory of particles in PSO, Appl. Math. Comput.
- A new approach for unit commitment problem via binary gravitational search algorithm, Appl. Soft Comput.
- Differential evolution with dynamic stochastic selection for constrained optimization, Inf. Sci.
- A penalty function-based differential evolution algorithm for constrained global optimization, Comput. Optim. Appl.
- A review of particle swarm optimization. Part II: hybridisation, combinatorial, multicriteria and constrained optimization, and indicative applications, Nat. Comput. Ser.
- A hybrid particle swarm with a time-adaptive topology for constrained optimization, Swarm Evol. Comput.