
Comparative Analysis of Genetic Algorithm, Simulated Annealing and Cutting Angle Method for Artificial Neural Networks

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3587)

Abstract

Learning is the central task in an artificial neural network (ANN), and gradient-based training suffers from the many local minima of the error surface. Global optimization methods are, in principle, capable of finding the globally optimal solution. In this paper we present a comparative study of the effects of probabilistic and deterministic global search methods for training an artificial neural network with a fully connected feed-forward multilayer perceptron architecture. We investigate two probabilistic global search methods, the genetic algorithm and simulated annealing, and a deterministic cutting angle method, to find the weights of the network. Experiments were carried out on UCI benchmark datasets.
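To make the comparison concrete, the sketch below illustrates one of the two probabilistic approaches named in the abstract: simulated annealing used to search the weight space of a small feed-forward MLP directly, with no gradients. This is a minimal illustration, not the authors' implementation; the network size, cooling schedule, step size, and the XOR toy data (standing in for a UCI benchmark) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR problem stands in for a UCI benchmark dataset (assumption).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_IN, N_HID = 2, 4
N_WEIGHTS = (N_IN + 1) * N_HID + (N_HID + 1)  # hidden + output layer, with biases


def forward(w, X):
    """Evaluate a 2-4-1 MLP given a flat weight vector w."""
    w1 = w[: (N_IN + 1) * N_HID].reshape(N_IN + 1, N_HID)
    w2 = w[(N_IN + 1) * N_HID:]
    h = np.tanh(X @ w1[:-1] + w1[-1])                   # hidden layer
    return 1 / (1 + np.exp(-(h @ w2[:-1] + w2[-1])))    # sigmoid output


def mse(w):
    return np.mean((forward(w, X) - y) ** 2)


def anneal(n_iter=20000, t0=1.0, cooling=0.9995, step=0.3):
    """Simulated annealing over the flat weight vector (illustrative schedule)."""
    w = rng.normal(scale=0.5, size=N_WEIGHTS)
    e, t = mse(w), t0
    best_w, best_e = w.copy(), e
    for _ in range(n_iter):
        cand = w + rng.normal(scale=step, size=N_WEIGHTS)  # random perturbation
        e_cand = mse(cand)
        # Metropolis rule: always accept downhill moves; accept uphill moves
        # with Boltzmann probability, which lets the search escape local minima.
        if e_cand < e or rng.random() < np.exp((e - e_cand) / t):
            w, e = cand, e_cand
            if e < best_e:
                best_w, best_e = w.copy(), e
        t *= cooling  # geometric cooling
    return best_w, best_e


w, err = anneal()
print(f"final MSE: {err:.4f}, outputs: {forward(w, X).round(2)}")
```

The same loop structure applies to the genetic algorithm variant (replace the single perturbed candidate with a population evolved by selection, crossover, and mutation), while the cutting angle method instead builds a deterministic piecewise-linear underestimate of the error function over a simplex; both substitutions are left out here for brevity.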




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ghosh, R., Ghosh, M., Yearwood, J., Bagirov, A. (2005). Comparative Analysis of Genetic Algorithm, Simulated Annealing and Cutting Angle Method for Artificial Neural Networks. In: Perner, P., Imiya, A. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2005. Lecture Notes in Computer Science, vol. 3587. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11510888_7


  • DOI: https://doi.org/10.1007/11510888_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26923-6

  • Online ISBN: 978-3-540-31891-0
