Summary
Particle Swarm Optimization (PSO) is an effective algorithm for solving many global optimization problems. Much of the current research, however, focuses on optimizing functions of relatively few dimensions (typically around 20 to 30, and generally fewer than 100).
Recently, PSO has been applied to a different class of problems, such as neural network training, that can involve hundreds or even thousands of variables. High-dimensional problems such as these pose their own unique set of challenges for any optimization algorithm.
This chapter investigates the use of PSO on very high-dimensional problems. Combining PSO with concepts from evolutionary algorithms, such as a mutation operator, is found in many cases to significantly improve the performance of PSO on very high-dimensional problems. A further improvement comes from the addition of a random constriction coefficient.
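As a concrete illustration of the idea, the sketch below runs a constricted PSO loop with a GA-style mutation operator and a randomly jittered constriction coefficient. It is a minimal sketch only: the parameter values, the uniform-reset mutation, and the way the constriction coefficient is randomized are assumptions made for illustration, not the chapter's exact algorithm.

```python
import numpy as np

def pso_with_mutation(f, dim, n_particles=30, iters=500,
                      c1=2.05, c2=2.05, mutation_rate=0.01,
                      bounds=(-5.0, 5.0), seed=0):
    # Minimise f over `dim` dimensions using constricted PSO plus a
    # GA-style mutation operator. All parameter values here are
    # illustrative assumptions, not the settings used in the chapter.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))              # particle velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()      # global best

    # Clerc's constriction coefficient; requires phi = c1 + c2 > 4.
    phi = c1 + c2
    chi = 2.0 / abs(2.0 - phi - np.sqrt(phi * phi - 4.0 * phi))

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # "Random constriction": jitter chi each iteration. The exact
        # randomization scheme is an assumption for illustration.
        chi_t = chi * rng.uniform(0.5, 1.0)
        v = chi_t * (v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
        x = x + v
        # Mutation: re-randomize a small fraction of coordinates,
        # injecting diversity that is easily lost in high dimensions.
        mask = rng.random((n_particles, dim)) < mutation_rate
        x = np.where(mask, rng.uniform(lo, hi, (n_particles, dim)), x)

        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    return gbest, pbest_val.min()

# Usage: a 1000-dimensional sphere function as a stand-in benchmark.
best_x, best_val = pso_with_mutation(lambda z: float(np.sum(z * z)), dim=1000)
print(best_val)
```

The mutation step mirrors the role it plays in genetic algorithms: resetting occasional coordinates keeps the swarm from stagnating along many dimensions at once, which is one plausible reason such hybrids help on very high-dimensional problems.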
© 2008 Springer-Verlag Berlin Heidelberg
Cite this chapter
Achtnig, J. (2008). Particle Swarm Optimization with Mutation for High Dimensional Problems. In: Abraham, A., Grosan, C., Pedrycz, W. (eds) Engineering Evolutionary Intelligent Systems. Studies in Computational Intelligence, vol 82. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75396-4_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-75395-7
Online ISBN: 978-3-540-75396-4