
Improving the performance of differential evolution algorithm using Cauchy mutation

Soft Computing

Abstract

Differential evolution (DE) is a powerful yet simple evolutionary algorithm for the optimization of real-valued, multimodal functions. DE is generally considered a reliable, accurate and robust optimization technique. However, the algorithm suffers from premature convergence and/or a slow convergence rate, resulting in poor solution quality and/or a large number of function evaluations, and hence long CPU times when optimizing computationally expensive objective functions. Therefore, an attempt to speed up DE is considered necessary. This research introduces a modified differential evolution (MDE) that enhances the convergence rate without compromising solution quality. The proposed MDE algorithm maintains a failure_counter (FC) that monitors the performance of each individual. Individuals that fail to show any improvement in function value over a successive number of generations are subjected to Cauchy mutation, in the hope of pulling them out of a local attractor that may be the cause of their deteriorating performance. The performance of the proposed MDE is investigated on a comprehensive set of 15 standard benchmark problems with varying degrees of complexity and 7 nontraditional problems suggested in the special session of CEC2008. Numerical results and statistical analysis show that the proposed modifications help locate the global optimal solution in fewer function evaluations than basic DE and several other contemporary optimization algorithms.
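Since this preview contains no pseudocode, the following is a minimal Python sketch of the mechanism the abstract describes, grafted onto the standard DE/rand/1/bin scheme. The threshold `fc_max`, the unit scale of the Cauchy step, and the greedy acceptance of the mutated point are illustrative assumptions, not the paper's exact control rules.

```python
import numpy as np

def mde(f, bounds, pop_size=50, F=0.5, CR=0.9, fc_max=5,
        max_evals=100_000, seed=0):
    """Sketch of MDE: classic DE/rand/1/bin, plus Cauchy mutation for
    individuals whose failure counter exceeds fc_max generations."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    fc = np.zeros(pop_size, dtype=int)    # failure_counter per individual
    evals = pop_size
    while evals < max_evals:
        for i in range(pop_size):
            # DE/rand/1 mutation with binomial crossover.
            r1, r2, r3 = rng.choice(
                [k for k in range(pop_size) if k != i], 3, replace=False)
            v = pop[r1] + F * (pop[r2] - pop[r3])
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True        # keep at least one trial component
            u = np.clip(np.where(mask, v, pop[i]), lo, hi)
            fu = f(u)
            evals += 1
            if fu < fit[i]:                       # improvement resets the counter
                pop[i], fit[i], fc[i] = u, fu, 0
            else:
                fc[i] += 1
            if fc[i] > fc_max:                    # stagnation: apply Cauchy mutation
                c = np.clip(pop[i] + rng.standard_cauchy(dim), lo, hi)
                fc_val = f(c)
                evals += 1
                if fc_val < fit[i]:
                    pop[i], fit[i] = c, fc_val
                fc[i] = 0
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Example: minimize the 10-dimensional sphere function.
best_x, best_f = mde(lambda x: float(np.sum(x**2)),
                     (np.full(10, -100.0), np.full(10, 100.0)),
                     max_evals=20_000)
```

The heavy tails of the Cauchy distribution make occasional long jumps far more likely than Gaussian perturbation would, which is what gives a stagnating individual a chance to escape its local attractor.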



Acknowledgments

The authors would like to thank the anonymous referees whose comments and suggestions helped improve the paper. The authors also gratefully acknowledge the financial support provided by MHRD, India, and DST, India.

Author information

Correspondence to Musrrat Ali.

Appendix

Benchmark problems:

1. Easom function (EP):

$$ f_{\text{EP}}(x) = -\cos(x_1)\cos(x_2)\exp\left[ -(x_1 - \pi)^2 - (x_2 - \pi)^2 \right], \quad \text{with } -10 \le x_i \le 10,\ \min f_{\text{EP}}(\pi, \pi) = -1 $$

It is a multimodal, nonseparable function having several local optima.

2. Foxhole function (FX):

$$ f_{\text{FX}}(x) = \left( \frac{1}{500} + \sum_{i=1}^{25} \frac{1}{i + \sum_{j=1}^{2} \left( x_j - a_{ij} \right)^6} \right)^{-1}, \quad \text{with } -65.536 \le x_j \le 65.536,\ \min f_{\text{FX}}(-32, -32) \approx 0.998004 $$

$$ \left( a_{ij} \right) = \begin{pmatrix} -32 & -16 & 0 & 16 & 32 & -32 & \cdots & -32 & -16 & 0 & 16 & 32 \\ -32 & -32 & -32 & -32 & -32 & -16 & \cdots & 32 & 32 & 32 & 32 & 32 \end{pmatrix} $$

It is a multimodal nonseparable function having several local optima. Many standard optimization algorithms get stuck in the first peak they find.

3. Six-hump camel back function (CB6):

$$ f_{\text{CB6}}(x) = 4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4, \quad \text{with } -5 \le x_i \le 5 $$

$$ \min f_{\text{CB6}}(0.0898, -0.7126) = f_{\text{CB6}}(-0.0898, 0.7126) = -1.0316285 $$

It is a multimodal, nonseparable function having two global optima and four local optima.

4. Goldstein and Price problem (GP):

$$ \begin{aligned} f_{\text{GP}}(x) = {} & \left[ 1 + (x_1 + x_2 + 1)^2 \left( 19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2 \right) \right] \\ & \times \left[ 30 + (2x_1 - 3x_2)^2 \left( 18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2 \right) \right] \end{aligned} $$

with \( -2 \le x_i \le 2 \), min \( f_{\text{GP}}(0, -1) = 3 \)

It is a multimodal, nonseparable function having one global minimum and four local minima.

5. Hartman 3 function (H3):

$$ f_{\text{H3}}(x) = -\sum_{i=1}^{4} c_i \exp\left[ -\sum_{j=1}^{3} a_{ij} \left( x_j - p_{ij} \right)^2 \right], \quad \text{with } 0 \le x_j \le 1 $$

$$ \min f_{\text{H3}}(0.114614, 0.555649, 0.852547) = -3.862782 $$

 

The parameters \( c_i \), \( a_{ij} \) and \( p_{ij} \) are:

| i | \(c_i\) | \(a_{i1}\) | \(a_{i2}\) | \(a_{i3}\) | \(p_{i1}\) | \(p_{i2}\) | \(p_{i3}\) |
|---|---------|------------|------------|------------|------------|------------|------------|
| 1 | 1       | 3          | 10         | 30         | 0.3689     | 0.117      | 0.2673     |
| 2 | 1.2     | 0.1        | 10         | 35         | 0.4699     | 0.4387     | 0.747      |
| 3 | 3       | 3          | 10         | 30         | 0.1091     | 0.8732     | 0.5547     |
| 4 | 3.2     | 0.1        | 10         | 35         | 0.3815     | 0.5743     | 0.8828     |

It is a multimodal nonseparable function having four local minima and one global minimum.

6. Sphere function (SP):

$$ f_{\text{SP}}(x) = \sum_{i=1}^{n} x_i^2, \quad \text{with } -100 \le x_i \le 100,\ \min f_{\text{SP}}(0, \ldots, 0) = 0 $$

It is a simple, continuous, unimodal, separable and highly convex function. It serves as a test case for validating the convergence speed of an algorithm.

7. Ackley's function (ACK):

$$ f_{\text{ACK}}(x) = -20\exp\left( -0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2} \right) - \exp\left( \frac{1}{n}\sum_{i=1}^{n} \cos(2\pi x_i) \right) + 20 + e $$

with \( -30 \le x_i \le 30 \), min \( f_{\text{ACK}}(0, \ldots, 0) = 0 \)

The exponential terms cover the surface of the Ackley function with numerous local minima; its complexity is moderate. An algorithm that relies only on gradient steepest descent will be trapped in a local optimum, but any search strategy that analyzes a wider region will be able to cross the valleys among the optima and achieve better results.

8. Schwefel's problem (SWF):

$$ f_{\text{SWF}}(x) = 418.9829 \, n - \sum_{i=1}^{n} x_i \sin\left( \sqrt{|x_i|} \right), \quad \text{with } -500 \le x_i \le 500,\ \min f_{\text{SWF}}(s, \ldots, s) = 0, \text{ where } s = 420.97 $$

It is a multimodal function with very deep sinusoidal interactions. It is generally considered to be difficult for optimization algorithms.

9. Griewank function (GW):

$$ f_{\text{GW}}(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1, \quad \text{with } -600 \le x_i \le 600,\ \min f_{\text{GW}}(0, \ldots, 0) = 0 $$

The Griewank function is a highly multimodal, nonseparable function. It has many regularly distributed local minima. It tests both convergence speed and the ability to escape from a shallow local minimum.

10. Levy and Montalvo 2 problem (LM2):

$$ f_{\text{LM2}}(x) = 0.1\left[ \sin^2(3\pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 \left( 1 + \sin^2(3\pi x_{i+1}) \right) + (x_n - 1)^2 \left( 1 + \sin^2(2\pi x_n) \right) \right] $$

with \( -50 \le x_i \le 50 \), min \( f_{\text{LM2}}(1, \ldots, 1) = 0 \)

It is a multimodal, nonseparable function having several local optima.

11. Step function (ST):

$$ f_{\text{ST}}(x) = \sum_{i=1}^{n} \left( \lfloor x_i + 0.5 \rfloor \right)^2, \quad \text{with } -100 \le x_i \le 100,\ \min f_{\text{ST}}(x) = 0 \text{ for } -0.5 \le x_i < 0.5 $$

12. Rosenbrock problem (RB):

$$ f_{\text{RB}}(x) = \sum_{i=1}^{n-1} \left[ 100\left( x_{i+1} - x_i^2 \right)^2 + \left( x_i - 1 \right)^2 \right], \quad \text{with } -30 \le x_i \le 30,\ \min f_{\text{RB}}(1, \ldots, 1) = 0 $$

The Rosenbrock function is unimodal in lower dimensions; however, as the number of dimensions increases it ceases to be unimodal. Like the Colville function, its optimum lies inside a long, narrow, parabolic-shaped flat valley, and it tests the ability of an algorithm to prevent premature convergence.

13. Rastrigin's function (RG):

$$ f_{\text{RG}}(x) = 10n + \sum_{i=1}^{n} \left( x_i^2 - 10\cos(2\pi x_i) \right), \quad \text{with } -5.12 \le x_i \le 5.12,\ \min f_{\text{RG}}(0, \ldots, 0) = 0 $$

It is a highly multimodal, separable function having a large number of local optima.

14. Schwefel's problem 2.22 (SWF 2.22):

$$ f_{\text{SWF2.22}}(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|, \quad \text{with } -10 \le x_i \le 10,\ \min f_{\text{SWF2.22}}(0, \ldots, 0) = 0 $$

15. Rotated hyper-ellipsoid function (RHE):

$$ f_{\text{RHE}}(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2, \quad \text{with } -100 \le x_i \le 100,\ \min f_{\text{RHE}}(0, \ldots, 0) = 0 $$
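For concreteness, a few of the benchmarks above translate directly into code. The sketch below is a plain NumPy transcription of the sphere, Ackley, Rastrigin and Rosenbrock definitions (not the authors' test harness), together with a check that each stated optimum evaluates to zero.

```python
import numpy as np

def sphere(x):          # f_SP: min 0 at the origin
    return float(np.sum(x**2))

def ackley(x):          # f_ACK: min 0 at the origin
    n = len(x)
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)

def rastrigin(x):       # f_RG: min 0 at the origin
    return float(10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def rosenbrock(x):      # f_RB: min 0 at (1, ..., 1)
    return float(np.sum(100 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1)**2))

# Each known optimum should evaluate to (numerically) zero.
for fn, x_star in [(sphere, np.zeros(30)), (ackley, np.zeros(30)),
                   (rastrigin, np.zeros(30)), (rosenbrock, np.ones(30))]:
    print(f"{fn.__name__}: {fn(x_star):.2e}")
```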


Cite this article

Ali, M., Pant, M. Improving the performance of differential evolution algorithm using Cauchy mutation. Soft Comput 15, 991–1007 (2011). https://doi.org/10.1007/s00500-010-0655-2
