Cooling schedules for learning in neural networks

Tom M. Heskes, Eddy T. P. Slijpen, and Bert Kappen
Phys. Rev. E 47, 4457 – Published 1 June 1993

Abstract

We derive cooling schedules for the global optimization of learning in neural networks. We discuss a two-level system with one global and one local minimum. The analysis is extended to systems with many minima. The optimal cooling schedule is (asymptotically) of the form η(t) = η*/ln t, with η(t) the learning parameter at time t and η* a constant, dependent on the reference learning parameters for the various transitions. In some simple cases, η* can be calculated. Simulations confirm the theoretical results.
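As a rough illustration of the schedule's form (not a reproduction of the authors' simulations), the sketch below applies a logarithmic cooling schedule η(t) = η*/ln t to stochastic gradient descent on a toy one-dimensional cost with one local and one global minimum. The particular cost function, the value of η*, the noise level, and the time offset t0 are assumptions chosen for illustration only.

```python
# Illustrative sketch: logarithmic cooling of the learning parameter,
# eta(t) = eta_star / ln(t), applied to noisy ("on-line") gradient descent
# on a toy double-well cost with one local and one global minimum.
# eta_star, the noise scale, and t0 are illustrative assumptions, not values
# taken from the paper.
import numpy as np

def cost(w):
    # Double well: local minimum near w = -0.95, global minimum near w = +1.05.
    return 0.25 * w**4 - 0.5 * w**2 - 0.1 * w

def grad(w):
    return w**3 - w - 0.1

def eta(t, eta_star=2.0, t0=2):
    # Logarithmic cooling schedule; t0 > 1 avoids division by ln(1) = 0.
    return eta_star / np.log(t + t0)

rng = np.random.default_rng(0)
w = -1.0                                            # start in the basin of the local minimum
for t in range(1, 20001):
    noisy_grad = grad(w) + rng.normal(scale=1.0)    # stochastic gradient estimate
    w -= eta(t) * noisy_grad

print(f"final w = {w:.3f}, cost = {cost(w):.3f}")   # often ends near the global minimum w ≈ +1
```

Early on, the large learning parameter lets the weight hop between the two basins; as η(t) decays logarithmically, escapes from the deeper (global) basin become increasingly unlikely, which is the mechanism the cooling-schedule analysis formalizes.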

  • Received 2 November 1992

DOI:https://doi.org/10.1103/PhysRevE.47.4457

©1993 American Physical Society

Authors & Affiliations

Tom M. Heskes, Eddy T. P. Slijpen, and Bert Kappen

  • Department of Medical Physics and Biophysics, University of Nijmegen, Geert Grooteplein Noord 21, 6525 EZ Nijmegen, The Netherlands

Issue

Vol. 47, Iss. 6 — June 1993
