• Rapid Communication

Generalization in a two-layer neural network

Holm Schwarze, Manfred Opper, and Wolfgang Kinzel
Phys. Rev. A 46, R6185(R) – Published 1 November 1992

Abstract

Statistical mechanics is applied to study the generalization properties of a two-layer neural network trained to implement a linearly separable problem. For a stochastic learning algorithm the generalization error as a function of the training set size is calculated exactly. The network with three hidden units experiences two first-order phase transitions due to an asymmetric freezing of the hidden units. Compared to a simple perceptron, the committee machine is found to generalize worse.
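The committee machine studied in the abstract computes a majority vote over the binary outputs of its hidden units. A minimal sketch of that output rule is given below; the weight shapes, the sign convention, and the function name `committee_output` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def committee_output(x, W):
    """Majority-vote committee machine.

    x: input vector of shape (N,)
    W: hidden-unit weight matrix of shape (K, N), K odd
       (here K = 3, as in the network analyzed in the paper)
    Returns +1 or -1: the sign of the sum of the K hidden-unit outputs.
    """
    h = np.sign(W @ x)        # binary output of each hidden perceptron
    return int(np.sign(h.sum()))  # majority vote of the hidden units
```

With an odd number of hidden units and nonzero local fields, the vote is never tied, so the output is always ±1.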

  • Received 26 August 1992

DOI:https://doi.org/10.1103/PhysRevA.46.R6185

©1992 American Physical Society

Authors & Affiliations

Holm Schwarze, Manfred Opper, and Wolfgang Kinzel

  • Institut für Theoretische Physik, Justus-Liebig-Universität Giessen, 6300 Giessen, Federal Republic of Germany

Issue

Vol. 46, Iss. 10 — November 1992
