Hierarchical self-programming in recurrent neural networks

T Uezu and A C C Coolen

Published 15 March 2002 under licence by IOP Publishing Ltd

Citation: T Uezu and A C C Coolen 2002 J. Phys. A: Math. Gen. 35 2761, DOI 10.1088/0305-4470/35/12/306


Abstract

We study self-programming in recurrent neural networks where both the neurons (the 'processors') and the synaptic interactions (the 'programme') evolve in time simultaneously, according to specific coupled stochastic equations. The interactions are divided into a hierarchy of L groups with adiabatically separated and monotonically increasing time-scales, representing sub-routines of the system programme of decreasing volatility. We solve this model in equilibrium, assuming ergodicity at every level, and find as our replica-symmetric solution a formalism whose structure is similar, but not identical, to Parisi's L-step replica symmetry breaking scheme. Apart from differences in the details of the equations (due to the fact that here interactions, rather than spins, are grouped into clusters with different time-scales), in the present model the block sizes m_i of the emerging ultrametric solution are not restricted to the interval [0, 1], but are independent control parameters, defined in terms of the noise strengths of the various levels in the hierarchy, which can take any value in [0, ∞). This is shown to lead to extremely rich phase diagrams, with an abundance of first-order transitions, especially when the level of stochasticity in the interaction dynamics is chosen to be low.
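
The coupled dynamics described above can be illustrated schematically. The sketch below is not the paper's actual equations: it assumes Glauber dynamics for the fast neuron ('processor') updates and a simple Hebbian-drift-plus-decay Langevin-type rule for the slow coupling ('programme') updates, with the interactions split into L groups whose update rates (epsilons) and noise strengths (etas) stand in for the adiabatically separated time-scales and the level-dependent stochasticity. All parameter names, values and functional forms are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N = 50            # number of neurons ("processors")
L = 3             # number of hierarchy levels of couplings
beta = 2.0        # inverse temperature of the fast neuron dynamics
etas = [0.5, 0.2, 0.05]        # noise strength per coupling level (assumed values)
epsilons = [1e-2, 1e-3, 1e-4]  # widely separated coupling update rates (assumed)

# Assign each neuron pair (i, j) to one of the L coupling groups.
lvl = rng.integers(0, L, size=(N, N))
lvl = np.triu(lvl, 1) + np.triu(lvl, 1).T      # symmetric level assignment

J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)
s = rng.choice([-1.0, 1.0], size=N)

def glauber_sweep(s, J, beta):
    # Fast stochastic dynamics of the neurons at fixed couplings.
    for i in rng.permutation(N):
        h = J[i] @ s
        s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1.0
    return s

def coupling_step(J, s, lvl, etas, epsilons):
    # Slow stochastic dynamics of the couplings: Hebbian drift with decay plus
    # level-dependent noise, applied at a level-dependent (small) rate.
    for ell in range(L):
        noise = rng.normal(size=(N, N))
        noise = (noise + noise.T) / np.sqrt(2.0)               # keep J symmetric
        dJ = epsilons[ell] * (np.outer(s, s) - J + etas[ell] * noise)
        J = np.where(lvl == ell, J + dJ, J)
    np.fill_diagonal(J, 0.0)
    return J

# Interleave many fast neuron sweeps with one slow coupling update,
# mimicking the adiabatic separation of time-scales.
for t in range(1000):
    for _ in range(10):
        s = glauber_sweep(s, J, beta)
    J = coupling_step(J, s, lvl, etas, epsilons)

In this toy version the adiabatic separation is only approximated by the small, widely spaced update rates and the inner block of neuron sweeps; in the model studied in the paper the separation of time-scales is taken to be exact, so that each level equilibrates given the slower ones.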

