Abstract
To study the dynamical behaviour of the Little-Hopfield model of a neural network, we develop a new perturbation theory for memory retrieval. The theory is organized in powers of a time-dependent perturbation variable, in which we calculate the overlap m(t) as well as other order parameters, such as the non-retrieval parameter r(t), where t denotes time. The second-order approximation in our perturbation scheme shows that, as time increases, the trajectories of the model system, represented in the m(t)-r(t) plane, converge onto a curve identical to the so-called 'freezing' line obtained from the replica-symmetric solution. In the course of developing our perturbation theory we generalize the exact approach proposed by Gardner et al. (1987). We also recast previous approaches, such as the theories of Amari and Maginu (1988) and of Coolen and Sherrington (1994), within the scheme of our perturbation theory.
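For readers unfamiliar with the order parameter m(t), the following is a minimal sketch (not the paper's perturbation theory) of zero-temperature retrieval dynamics in a Little-Hopfield network with Hebbian couplings and synchronous (parallel) updates; the system sizes, noise level, and random seed are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 500, 10                          # neurons, stored patterns (alpha = P/N = 0.02)
xi = rng.choice([-1, 1], size=(P, N))   # random +/-1 patterns
J = (xi.T @ xi) / N                     # Hebbian coupling matrix
np.fill_diagonal(J, 0.0)                # no self-coupling

# Initial state: pattern 0 corrupted by flipping ~15% of its bits.
S = xi[0].copy()
S[rng.random(N) < 0.15] *= -1

for t in range(10):
    m = (xi[0] @ S) / N                 # overlap with the retrieved pattern
    S = np.sign(J @ S)                  # parallel (Little) zero-temperature update
    S[S == 0] = 1                       # break ties deterministically

m = (xi[0] @ S) / N                     # final overlap
```

Well below the storage capacity, as here, the overlap with the cued pattern grows toward m ≈ 1 within a few parallel sweeps, which is the retrieval regime whose transients the perturbation theory describes.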