Contributed article
Kick-out learning algorithm to reduce the oscillation of weights

https://doi.org/10.1016/0893-6080(94)90101-5

Abstract

The back-propagation algorithm, when used with a plain gradient descent term, converges very slowly because the weights oscillate in regions where the error surface forms a ravine. The momentum term was introduced to improve convergence, but its effect on reducing these oscillations has received insufficient attention. In this paper, we point out that the momentum term is not, by itself, effective in reducing the oscillation. To overcome the oscillations, we focus on the very bottom of a ravine, where the direction of steepest descent coincides with the downward direction along the ravine floor. We describe a method for correcting the weight values near the bottom of a ravine and propose a new acceleration algorithm based on this correction. Its distinctive feature is a correction term, invoked during the oscillation, that uses the difference between successive gradients. We show that the proposed algorithm substantially improves the convergence speed in ravine regions.
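The exact update rule appears only in the full text, which is not reproduced here. The following is a minimal NumPy sketch of the idea as the abstract states it: a standard momentum update plus a correction, proportional to the difference between successive gradients, applied only where a weight is oscillating. The function name, the parameter `kick`, and the sign-flip test for oscillation are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def kick_out_step(w, grad, prev_grad, velocity,
                  lr=0.01, momentum=0.9, kick=0.005):
    """One momentum step plus a sketch of a 'kick-out'-style correction.

    Per weight, a sign change between successive gradients is used as a
    proxy for oscillation across a ravine; where it occurs, a correction
    proportional to the gradient difference is added, as the abstract
    describes. All hyperparameter values are illustrative.
    """
    velocity = momentum * velocity - lr * grad     # standard momentum update
    oscillating = (grad * prev_grad) < 0           # sign flip => oscillation
    correction = -kick * (grad - prev_grad)        # gradient-difference term
    velocity = np.where(oscillating, velocity + correction, velocity)
    return w + velocity, velocity

# Toy ravine: E(w) = 0.5 * (100*w0**2 + w1**2), steep across, shallow along.
w = np.array([1.0, 1.0])
v = np.zeros(2)
g_prev = np.zeros(2)
for _ in range(50):
    g = np.array([100.0 * w[0], w[1]])             # gradient of the toy ravine
    w, v = kick_out_step(w, g, g_prev, v)
    g_prev = g
print(w)  # should approach the ravine bottom at the origin
```

On the toy ravine, the correction acts only on the steep coordinate, where the gradient sign flips step to step, and damps exactly the oscillation that plain momentum leaves in place; the shallow coordinate descends undisturbed.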

