Abstract
Learning complex tasks in a multilayer perceptron is a nonlinear optimization problem that is often very difficult and painstakingly slow. The use of conjugate gradient methods to speed up convergence is proposed. These methods require only a moderate increase in storage and computational complexity compared to the commonly used backpropagation algorithm. The algorithm used is a modified conjugate gradient method with inexact line searches, which reduces the number of function evaluations needed in the line-search part of the algorithm. Simulation results are presented that show the improved convergence rate compared to the backpropagation algorithm.
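The abstract does not give the algorithm's details, but the general idea it describes can be sketched as follows: train a small MLP by nonlinear conjugate gradients, where each iteration picks a search direction that combines the new gradient with the previous direction, and the step length comes from an inexact (backtracking/Armijo) line search rather than an exact minimization. Everything below is an illustrative assumption, not the paper's exact method: the network size, the XOR task, the Polak-Ribière update, and the Armijo parameters are all choices made for the sketch.

```python
import numpy as np

# Illustrative sketch only: a one-hidden-layer MLP trained on XOR with
# Polak-Ribiere conjugate gradients and an inexact (Armijo backtracking)
# line search. All sizes and constants are assumptions for the demo.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

sizes = [(2, 4), (4, 1)]                       # (in, out) shape of each layer
n_params = sum(r * c + c for r, c in sizes)    # weights plus biases

def unpack(w):
    """Split the flat parameter vector into (W, b) pairs per layer."""
    params, i = [], 0
    for r, c in sizes:
        W = w[i:i + r * c].reshape(r, c); i += r * c
        b = w[i:i + c]; i += c
        params.append((W, b))
    return params

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(w):
    """Mean squared error and its gradient via standard backpropagation."""
    (W1, b1), (W2, b2) = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    loss = 0.5 * np.mean(np.sum(err ** 2, axis=1))
    d_out = err * out * (1 - out) / len(X)
    gW2 = h.T @ d_out; gb2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    gW1 = X.T @ d_h; gb1 = d_h.sum(0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def armijo(w, f, g, d, alpha=1.0, shrink=0.5, c=1e-4):
    """Inexact line search: shrink alpha until sufficient decrease holds."""
    slope = g @ d                               # directional derivative (< 0)
    while loss_and_grad(w + alpha * d)[0] > f + c * alpha * slope:
        alpha *= shrink
        if alpha < 1e-12:
            break
    return alpha

w = rng.normal(scale=0.5, size=n_params)
f, g = loss_and_grad(w)
f0 = f                                          # remember the starting loss
d = -g                                          # first direction: steepest descent
for _ in range(500):
    a = armijo(w, f, g, d)
    w = w + a * d
    f_new, g_new = loss_and_grad(w)
    # Polak-Ribiere+ coefficient; clipped at zero to allow automatic restarts
    beta_pr = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta_pr * d
    if g_new @ d >= 0:                          # not a descent direction: restart
        d = -g_new
    f, g = f_new, g_new

print(f"loss: {f0:.4f} -> {f:.4f}")
```

Because every step uses a descent direction and the Armijo test enforces sufficient decrease, the loss is monotonically non-increasing; the inexact search typically needs only a handful of function evaluations per iteration, which is the saving the abstract points to.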
| Field | Value |
| --- | --- |
| Original language | English (US) |
| Pages | 736-739 |
| Number of pages | 4 |
| State | Published - Dec 1 1989 |
| Event | Proceedings of the 32nd Midwest Symposium on Circuits and Systems Part 2 (of 2) - Champaign, IL, USA |
| Duration | Aug 14 1989 → Aug 16 1989 |