Chaos, Solitons & Fractals

Volume 44, Issue 9, September 2011, Pages 710-718

Modified cuckoo search: A new gradient free optimisation algorithm

https://doi.org/10.1016/j.chaos.2011.06.004

Abstract

A new robust optimisation algorithm, which can be regarded as a modification of the recently developed cuckoo search, is presented. The modification involves the addition of information exchange between the top eggs, or the best solutions. Standard optimisation benchmarking functions are used to test the effects of these modifications and it is demonstrated that, in most cases, the modified cuckoo search performs as well as, or better than, the standard cuckoo search, a particle swarm optimiser, and a differential evolution strategy. In particular the modified cuckoo search shows a high convergence rate to the true global minimum even at high numbers of dimensions.

Highlights

► Modified cuckoo search (MCS) is a new gradient free optimisation algorithm.
► MCS shows a high convergence rate, able to outperform other optimisers.
► MCS is particularly strong at high dimension objective functions.
► MCS performs well when applied to engineering problems.

Introduction

The practicality of using gradient based optimisation techniques has been reduced by the difficulty of automatically generating objective functions and their derivatives for highly non-linear engineering problems [1]. During the 1950s and 1960s, computer scientists investigated the possibility of applying the concepts of evolution as an optimisation tool for engineers, and this gave birth to a subclass of gradient free methods called genetic algorithms (GA) [2]. Since then many other algorithms have been developed that have been inspired by nature, for example particle swarm optimisation (PSO) [3], differential evolution (DE) [4] and, more recently, the cuckoo search (CS) [5]. These are heuristic techniques which make use of a large population of possible designs at each iteration. For each member of the population, the objective function is evaluated and a fitness is assigned. A set of rules is then used to move the population towards the optimum solution. Although this results in a global search without the need to calculate objective function gradients [5], the large number of objective function evaluations means the computational efficiency of these procedures is often inferior to classical gradient-based methods [6]. The problem is exacerbated when considering applications where a single objective function evaluation represents a significant computational cost, such as fluid flow problems [6], [7], [8].
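The evaluate-then-update loop common to GA, PSO, DE and CS can be summarised in a few lines. The following is an illustrative sketch, not an implementation from the paper; the function names (`population_optimise`, `jitter_rule`) and the toy update rule are ours, standing in for the algorithm-specific rules each method defines.

```python
import random

def population_optimise(f, bounds, step_rule, pop_size=20, generations=50):
    """Generic gradient-free loop shared by GA/PSO/DE/CS: evaluate every
    member of the population, then let a set of rules (step_rule) propose
    the next population. No objective function gradients are required."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [f(x) for x in pop]          # one objective evaluation per member
        pop = step_rule(pop, fitness, bounds)  # algorithm-specific update
    return min(pop, key=f)

def jitter_rule(pop, fitness, bounds):
    """Toy update rule for illustration: resample every member near the
    current best, clipped to the search bounds."""
    best = pop[min(range(len(pop)), key=fitness.__getitem__)]
    return [[min(max(b + random.gauss(0, 0.1), lo), hi)
             for b, (lo, hi) in zip(best, bounds)] for _ in pop]

# Usage: minimise the sphere function in 3 dimensions.
sphere = lambda v: sum(t * t for t in v)
x_best = population_optimise(sphere, [(-5.0, 5.0)] * 3, jitter_rule)
```

Note that the loop costs `pop_size` objective evaluations per generation, which is exactly why these methods become expensive when a single evaluation is costly.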

New optimisation algorithms are often tested by applying them to benchmark problems which have known analytical optimum fitnesses. For example, Bratton and Kennedy [3] compare several PSO algorithms by running 30 trials each for 300,000 objective function evaluations and comparing the fitness value obtained with the known optimum fitness. Yang and Deb [5] compared CS to PSO over 100 trials for each objective function. They allowed the algorithms to run until the spread of fitnesses in the population was smaller than 10⁻⁵. This resulted in large numbers of objective function evaluations, from 3015 to 110,523, depending upon the method used and the benchmark problem. This number of objective function evaluations would not be feasible for application to practical engineering problems with costly objective functions. The common threads in these benchmarking tests are the application of these techniques to problems with known minima, with a defined stopping criterion, and the large number of objective function evaluations.
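The stopping rule used by Yang and Deb can be phrased as a simple predicate on the population's fitness values. A minimal sketch follows; the function name is ours, not from the paper.

```python
def fitness_spread_converged(fitnesses, tol=1e-5):
    """Stopping test of the kind described above: declare convergence when
    the spread of fitnesses across the population falls below tol."""
    return max(fitnesses) - min(fitnesses) < tol
```

Because the number of generations needed to satisfy such a criterion is unknown in advance, the total objective-function-evaluation count can vary widely, as the 3015 to 110,523 range quoted above shows.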

In a real world application of an optimisation technique, it is very difficult to define a stopping criterion. For example, in the design of an aerofoil, the objective function might represent the calculation of the ratio of the lift coefficient to the drag coefficient. In this case, the designer would want to maximise this quantity under certain constraints; the maximum value is unknown and may not be unique. Furthermore, in most applications, the objective function is a computer model of the physical design, which means there is an inherent uncertainty in both the inputs, in terms of manufacturing or measurement errors, and the outputs of the objective function [1]. Therefore, the idea of finding a unique exact optimum makes little sense in terms of real engineering applications. When choosing an optimisation technique, a designer will want to know how much of an improvement is possible at a given, acceptable, computational cost.

It is these motivations which led to the modified cuckoo search (MCS) procedure that is introduced here as a possible improvement over the standard CS. The formulations of the DE, PSO, CS and MCS methods used here are presented, and their relative performance at different numbers of objective function evaluations on a variety of benchmarks is then measured and discussed.

Section snippets

Differential evolution (DE)

Like most evolutionary algorithms, DE follows the traditional modus operandi: initialisation, mutation, selection and recombination. However, compared to other evolutionary strategies DE has many additional attractive characteristics: it employs a differential operator to create new candidate solutions, uses a one-to-one competition scheme to select new population members and naturally employs real numbers [4]. Furthermore, it has memory and constructive cooperation between population members, …
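The operators named in this snippet (differential mutation, crossover, one-to-one competition) can be sketched in the classic DE/rand/1/bin form. This is an illustrative pure-Python sketch under our own parameter choices (F, CR, population size), not the MATLAB implementation used in the paper.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.5, CR=0.9, max_evals=2000):
    """Minimise f over box bounds with DE/rand/1/bin: a differential
    mutation operator, binomial crossover, and one-to-one greedy
    competition between each trial vector and its parent."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    evals = pop_size
    while evals < max_evals:
        for i in range(pop_size):
            # Differential mutation: combine three distinct members a, b, c.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover with the parent; j_rand guarantees at least
            # one component is inherited from the mutant.
            j_rand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # One-to-one competition: the trial replaces its parent only if fitter.
            ft = f(trial)
            evals += 1
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Usage: minimise the sphere function in 5 dimensions.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 5)
```

The "memory" the snippet mentions is visible in the loop: the population persists between generations, so good solutions are retained rather than regenerated.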

Results

The DE, PSO, CS and MCS algorithms were all implemented in MATLAB [14] as described in the preceding sections. To compare the relative performance of the methods, a series of seven test functions, taken from those originally used by Yang and Deb [5], was used as objective functions. For each function, 30 trials were performed for each method. Both unimodal and multimodal functions were selected and a range of dimensions was tested to gauge the robustness of each method. The initial population of 20 individuals was …
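The trial protocol described here (30 independent trials per method and function, unimodal and multimodal objectives) can be expressed as a small harness. The sketch below is ours: the two benchmark functions are standard examples of each class, and the random-search stand-in marks where any of the DE/PSO/CS/MCS optimisers would be slotted in.

```python
import random
import statistics
from math import cos, pi

def sphere(x):
    """Unimodal benchmark: single minimum of 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """Multimodal benchmark: many local minima, global minimum 0 at origin."""
    return 10 * len(x) + sum(v * v - 10 * cos(2 * pi * v) for v in x)

def random_search(f, bounds, max_evals):
    """Placeholder optimiser with a fixed evaluation budget; any of the
    DE/PSO/CS/MCS methods compared in the paper would replace this."""
    best = float("inf")
    for _ in range(max_evals):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        best = min(best, f(x))
    return best

def run_trials(f, dim, trials=30, max_evals=1000):
    """Repeat independent trials and summarise the best fitness found."""
    bounds = [(-5.12, 5.12)] * dim
    results = [random_search(f, bounds, max_evals) for _ in range(trials)]
    return statistics.mean(results), statistics.stdev(results)

mean_fit, sd_fit = run_trials(sphere, dim=2)
```

Fixing the evaluation budget per trial, rather than running to a convergence tolerance, matches the paper's concern with performance at a given, acceptable computational cost.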

Conclusions

A new modified cuckoo search algorithm has been presented. For all of the standard test examples that have been considered, the MCS has been shown to outperform the standard CS. For all the considered examples, the MCS performs as well as, or better than, the PSO, with the MCS performing significantly better in some examples. The differences between the methods are less pronounced for low numbers of objective function evaluations but, in general, the PSO and the MCS outperform CS in this …

Acknowledgements

The authors thank the UK Engineering and Physical Sciences Research Council for the financial support provided under a DTA grant at Swansea University.
