The effect of the range from which initial weight values are randomly selected on the convergence properties of the backpropagation learning rule is studied. In addition to the standard backpropagation rule, two variations are considered, namely symmetric backpropagation and expected-value backpropagation. In most applications of backpropagation, the range of initial weights is small. It is shown that a significantly wider initial weight range can substantially improve learning rates. If the initial weight range is increased beyond a problem-dependent limit, however, performance degrades. Symmetric backpropagation is the most sensitive to the initial weight range, while expected-value backpropagation is the least sensitive. The authors describe an improvement to the symmetric variation that produces faster learning rates with low initial weights.
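The core experimental setup can be illustrated with a minimal sketch: a small sigmoid network trained by standard backpropagation, with initial weights drawn uniformly from [-r, +r] for a configurable range r. The network size, learning rate, XOR task, and function names here are illustrative assumptions, not the authors' actual benchmark.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(init_range, epochs=2000, lr=0.5, seed=0):
    """Train a 2-2-1 sigmoid network on XOR by standard backpropagation.

    Weights are drawn uniformly from [-init_range, +init_range];
    returns (initial MSE, final MSE). Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    # Initial weights sampled from the configurable range under study
    W1 = rng.uniform(-init_range, init_range, (2, 2))
    b1 = rng.uniform(-init_range, init_range, (1, 2))
    W2 = rng.uniform(-init_range, init_range, (2, 1))
    b2 = rng.uniform(-init_range, init_range, (1, 1))

    def forward():
        h = sigmoid(X @ W1 + b1)
        o = sigmoid(h @ W2 + b2)
        return h, o

    _, o = forward()
    initial_mse = float(np.mean((o - y) ** 2))
    for _ in range(epochs):
        h, o = forward()
        # Backpropagate the squared-error gradient through both layers
        d_o = (o - y) * o * (1 - o)
        d_h = (d_o @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_o
        b2 -= lr * d_o.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)
    _, o = forward()
    return initial_mse, float(np.mean((o - y) ** 2))

# Compare a conventional small range against a wider one; which wins
# is problem-dependent, as the abstract notes.
for r in (0.1, 1.0, 10.0):
    start, end = train_xor(r)
    print(f"init_range={r:5.1f}  initial MSE={start:.4f}  final MSE={end:.4f}")
```

Very large ranges saturate the sigmoids, shrinking the gradients, which is one mechanism consistent with the degradation the abstract reports beyond a problem-dependent limit.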
Original language: English (US)
Number of pages: 2
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
State: Published - Dec 1 1989
Event: 1989 IEEE International Conference on Systems, Man, and Cybernetics. Part 1 (of 3) - Cambridge, MA, USA
Duration: Nov 14 1989 → Nov 17 1989