Effect of initial weights on back-propagation and its variations

Hossein Lari-Najafi, Mohammed Nasiruddin, Tariq Samad

Research output: Contribution to journal › Conference article › peer-review

Abstract

The effect of the range from which initial weight values are randomly selected on the convergence properties of the backpropagation learning rule is studied. In addition to the standard backpropagation rule, two variations are also considered, namely symmetric backpropagation and expected-value backpropagation. In most applications of backpropagation, the range of initial weights is small. It is shown that significantly larger initial weights can substantially improve learning rates. If the initial weight range is increased beyond a problem-dependent limit, however, performance degrades. Symmetric backpropagation is most sensitive to the initial weight range, while expected-value backpropagation is least sensitive. The authors describe an improvement on the symmetric variation that produces faster learning rates with low initial weights.
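
The paper itself does not include code; the sketch below merely illustrates the kind of experiment the abstract describes, varying the range from which initial weights are drawn and observing its effect on the convergence of plain backpropagation. The network size, learning rate, number of epochs, and the XOR task are assumptions made for illustration, not details taken from the paper.

# Illustrative sketch (not the authors' code): vary the range from which
# initial weights are drawn uniformly and compare the resulting training
# error of standard backpropagation. Network size, learning rate, epoch
# count, and the XOR task are assumptions, not taken from the paper.
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def train_xor(init_range, hidden=4, lr=0.5, epochs=5000, seed=0):
    """Train a one-hidden-layer network on XOR; return the final MSE."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # Initial weights drawn uniformly from [-init_range, +init_range].
    W1 = rng.uniform(-init_range, init_range, size=(2, hidden))
    b1 = rng.uniform(-init_range, init_range, size=hidden)
    W2 = rng.uniform(-init_range, init_range, size=(hidden, 1))
    b2 = rng.uniform(-init_range, init_range, size=1)

    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backward pass: standard backpropagation with sigmoid derivatives.
        err = out - y
        delta_out = err * out * (1 - out)
        delta_h = (delta_out @ W2.T) * h * (1 - h)

        W2 -= lr * h.T @ delta_out
        b2 -= lr * delta_out.sum(axis=0)
        W1 -= lr * X.T @ delta_h
        b1 -= lr * delta_h.sum(axis=0)

    return float(np.mean(err ** 2))


if __name__ == "__main__":
    # Compare small, moderate, and large initial-weight ranges.
    for r in (0.1, 1.0, 10.0):
        print(f"init range ±{r}: final MSE = {train_xor(r):.4f}")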

Original language: English (US)
Pages (from-to): 218-219
Number of pages: 2
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 1
State: Published - Dec 1 1989
Event: 1989 IEEE International Conference on Systems, Man, and Cybernetics. Part 1 (of 3) - Cambridge, MA, USA
Duration: Nov 14 1989 - Nov 17 1989
