Abstract
Differential evolution (DE) is a powerful yet simple evolutionary algorithm for optimizing real-valued, multimodal functions, and is generally considered a reliable, accurate and robust optimization technique. However, the algorithm suffers from premature convergence and/or a slow convergence rate, resulting in poor solution quality and/or a large number of function evaluations, which in turn means high CPU time when optimizing computationally expensive objective functions. An attempt to speed up DE is therefore worthwhile. This research introduces a modified differential evolution (MDE) that enhances the convergence rate without compromising solution quality. The proposed MDE algorithm maintains a failure_counter (FC) to keep tabs on the performance of the algorithm by monitoring the individuals. Individuals that fail to show any improvement in function value for a given number of successive generations are subjected to Cauchy mutation, in the hope of pulling them out of the local attractor that may be the cause of their deteriorating performance. The performance of the proposed MDE is investigated on a comprehensive set of 15 standard benchmark problems with varying degrees of complexity and 7 nontraditional problems suggested in the CEC2008 special session. Numerical results and statistical analysis show that the proposed modifications help locate the global optimal solution in fewer function evaluations than basic DE and several other contemporary optimization algorithms.
Acknowledgments
The authors would like to thank the anonymous referees whose comments and suggestions helped improve the paper. The authors also gratefully acknowledge the financial support provided by MHRD, India, and DST, India.
Appendix
Benchmark problems:
1. Easom function (EP):
It is a multimodal, nonseparable function having several local optima.
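The defining equation did not survive extraction; the standard two-dimensional Easom function (bounds as commonly used, which may differ slightly from the authors') is

\[ f_{\text{EP}} (x) = - \cos (x_{1} )\cos (x_{2} )\exp \left( { - (x_{1} - \pi )^{2} - (x_{2} - \pi )^{2} } \right),\quad - 100 \le x_{i} \le 100,\quad \min f_{\text{EP}} (\pi ,\pi ) = - 1. \]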
2. Foxhole function (FX):
It is a multimodal nonseparable function having several local optima. Many standard optimization algorithms get stuck in the first peak they find.
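The equation itself was lost in extraction; in its standard two-dimensional De Jong form, Shekel's foxholes function is

\[ f_{\text{FX}} (x) = \left[ {\frac{1}{500} + \sum\limits_{j = 1}^{25} {\frac{1}{{j + \sum\nolimits_{i = 1}^{2} {(x_{i} - a_{ij} )^{6} } }}} } \right]^{ - 1} ,\quad - 65.536 \le x_{i} \le 65.536, \]

where the constants \( a_{ij} \) place the 25 foxholes on a regular \( 5 \times 5 \) grid with coordinates drawn from \( \{ - 32, - 16,0,16,32\} \); the global minimum \( f_{\text{FX}} ( - 32, - 32) \approx 0.998 \) lies in the first foxhole.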
3. Six-hump camel back function (CB6):
It is a multimodal, nonseparable function having two global optima and four local optima.
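The standard form of this function (the equation did not survive extraction; bounds as commonly used) is

\[ f_{\text{CB6}} (x) = \left( {4 - 2.1x_{1}^{2} + \frac{{x_{1}^{4} }}{3}} \right)x_{1}^{2} + x_{1} x_{2} + \left( { - 4 + 4x_{2}^{2} } \right)x_{2}^{2} ,\quad - 5 \le x_{i} \le 5, \]

with the two global minima \( f_{\text{CB6}} \approx - 1.0316 \) at \( ( \pm 0.0898, \mp 0.7126) \).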
4. Goldstein problem (GP):
It is a multimodal, nonseparable function having one global minimum and four local minima.
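The standard Goldstein-Price function (equation lost in extraction; commonly used form) is

\[ f_{\text{GP}} (x) = \left[ {1 + (x_{1} + x_{2} + 1)^{2} \left( {19 - 14x_{1} + 3x_{1}^{2} - 14x_{2} + 6x_{1} x_{2} + 3x_{2}^{2} } \right)} \right] \times \left[ {30 + (2x_{1} - 3x_{2} )^{2} \left( {18 - 32x_{1} + 12x_{1}^{2} + 48x_{2} - 36x_{1} x_{2} + 27x_{2}^{2} } \right)} \right], \]

with \( - 2 \le x_{i} \le 2 \) and \( \min f_{\text{GP}} (0, - 1) = 3 \).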
5. Hartman 3 function (H3):
| i | c_i | a_i1 | a_i2 | a_i3 | p_i1   | p_i2   | p_i3   |
|---|-----|------|------|------|--------|--------|--------|
| 1 | 1   | 3    | 10   | 30   | 0.3689 | 0.117  | 0.2673 |
| 2 | 1.2 | 0.1  | 10   | 35   | 0.4699 | 0.4387 | 0.747  |
| 3 | 3   | 3    | 10   | 30   | 0.1091 | 0.8732 | 0.5547 |
| 4 | 3.2 | 0.1  | 10   | 35   | 0.3815 | 0.5743 | 0.8828 |
It is a multimodal nonseparable function having four local minima and one global minimum.
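The equation did not survive extraction; the standard Hartman 3 function, using the coefficients tabulated above, is

\[ f_{{{\text{H}}3}} (x) = - \sum\limits_{i = 1}^{4} {c_{i} \exp \left( { - \sum\limits_{j = 1}^{3} {a_{ij} \left( {x_{j} - p_{ij} } \right)^{2} } } \right)} ,\quad 0 \le x_{j} \le 1, \]

with global minimum \( f_{{{\text{H}}3}} \approx - 3.86 \).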
6. Sphere function:
It is a simple, continuous, unimodal, separable and highly convex function, and serves as a test case for validating the convergence speed of an algorithm.
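Its standard form (equation lost in extraction; bounds as commonly used) is

\[ f_{\text{SP}} (x) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } ,\quad - 5.12 \le x_{i} \le 5.12,\quad \min f_{\text{SP}} (0, \ldots ,0) = 0. \]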
7. Ackley's function (ACK):
With \( -30 \le x_{i} \le 30, \) min \( f_{\text{ACK}} \left( {0, \ldots ,0} \right) = 0 \)
The presence of an exponential term covers the surface of Ackley's function with numerous local minima. The complexity of this function is moderate. An algorithm relying only on gradient steepest descent will be trapped in a local optimum, but any search strategy that analyzes a wider region will be able to cross the valleys among the optima and achieve better results.
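The equation itself did not survive extraction; the definition consistent with the bounds and optimum stated above is

\[ f_{\text{ACK}} (x) = - 20\exp \left( { - 0.2\sqrt {\frac{1}{n}\sum\limits_{i = 1}^{n} {x_{i}^{2} } } } \right) - \exp \left( {\frac{1}{n}\sum\limits_{i = 1}^{n} {\cos (2\pi x_{i} )} } \right) + 20 + e. \]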
8. Schwefel's problem (SWF):
where \( s = 420.97 \).
It is a multimodal function with very deep sinusoidal interactions, and is generally considered difficult for optimization algorithms.
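In its commonly used form (the equation was lost in extraction), Schwefel's function is

\[ f_{\text{SWF}} (x) = - \sum\limits_{i = 1}^{n} {x_{i} \sin \left( {\sqrt {\left| {x_{i} } \right|} } \right)} ,\quad - 500 \le x_{i} \le 500, \]

with the global minimum at \( x_{i} = s = 420.97 \) and minimum value \( \approx - 418.9829n \).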
9. Griewank function (GW):
The Griewank function is a highly multimodal, nonseparable function. It has many regularly distributed local minima. It tests both convergence speed and the ability to escape from a shallow local minimum.
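The standard Griewank definition (equation lost in extraction; bounds as commonly used) is

\[ f_{\text{GW}} (x) = \frac{1}{4000}\sum\limits_{i = 1}^{n} {x_{i}^{2} } - \prod\limits_{i = 1}^{n} {\cos \left( {\frac{{x_{i} }}{\sqrt i }} \right)} + 1,\quad - 600 \le x_{i} \le 600,\quad \min f_{\text{GW}} (0, \ldots ,0) = 0. \]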
10. Levy and Montalvo 2 problem (LM2):
With \( -50 \le x_{i} \le 50 \), min \( f_{{{\text{LM}}2}} \left( {1, \ldots ,1} \right) = 0 \)
It is a multimodal, nonseparable function having several local optima.
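The equation did not survive extraction; the Levy and Montalvo 2 function is commonly defined, consistently with the bounds and optimum above, as

\[ f_{{{\text{LM}}2}} (x) = 0.1\left( {\sin^{2} (3\pi x_{1} ) + \sum\limits_{i = 1}^{n - 1} {(x_{i} - 1)^{2} \left( {1 + \sin^{2} (3\pi x_{i + 1} )} \right)} + (x_{n} - 1)^{2} \left( {1 + \sin^{2} (2\pi x_{n} )} \right)} \right). \]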
11. Step function (ST):
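The equation was lost in extraction; the standard step function (bounds as commonly used) is

\[ f_{\text{ST}} (x) = \sum\limits_{i = 1}^{n} {\left( {\left\lfloor {x_{i} + 0.5} \right\rfloor } \right)^{2} } ,\quad - 100 \le x_{i} \le 100,\quad \min f_{\text{ST}} (0, \ldots ,0) = 0. \]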
12. Rosenbrock problem (RB):
The Rosenbrock function is unimodal in lower dimensions; however, as the number of dimensions increases, it ceases to be unimodal. Like the Colville function, its optimum lies inside a long, narrow, parabolic-shaped flat valley, which makes it useful for testing the ability of an algorithm to prevent premature convergence.
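Its standard generalized form (equation lost in extraction; bounds as commonly used) is

\[ f_{\text{RB}} (x) = \sum\limits_{i = 1}^{n - 1} {\left[ {100\left( {x_{i + 1} - x_{i}^{2} } \right)^{2} + (x_{i} - 1)^{2} } \right]} ,\quad - 30 \le x_{i} \le 30,\quad \min f_{\text{RB}} (1, \ldots ,1) = 0. \]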
13. Rastrigin's function (RG):
This is a highly multimodal, nonseparable function having a large number of local optima.
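The standard Rastrigin definition (equation lost in extraction; bounds as commonly used) is

\[ f_{\text{RG}} (x) = \sum\limits_{i = 1}^{n} {\left( {x_{i}^{2} - 10\cos (2\pi x_{i} ) + 10} \right)} ,\quad - 5.12 \le x_{i} \le 5.12,\quad \min f_{\text{RG}} (0, \ldots ,0) = 0. \]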
14. Schwefel's problem 2.22:
15. Rotated hyper-ellipsoid function (RHE):
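The equation was lost in extraction; the rotated hyper-ellipsoid is commonly defined as

\[ f_{\text{RHE}} (x) = \sum\limits_{i = 1}^{n} {\left( {\sum\limits_{j = 1}^{i} {x_{j} } } \right)^{2} } ,\quad - 100 \le x_{i} \le 100,\quad \min f_{\text{RHE}} (0, \ldots ,0) = 0. \]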
Cite this article
Ali, M., Pant, M. Improving the performance of differential evolution algorithm using Cauchy mutation. Soft Comput 15, 991–1007 (2011). https://doi.org/10.1007/s00500-010-0655-2