Back-propagation is significantly faster if the expected value of the source unit is used for update

Research output: Contribution to journal › Conference article › peer-review

6 Scopus citations

Abstract

It has frequently been remarked that a major drawback of the back-propagation learning rule, and one that does not bode well for its application to real-world problems, is its poor scaling properties: with large networks, back-propagation can take infeasibly long to converge. The research outlined here shows that an intuitively straightforward modification of back-propagation can greatly improve its performance, particularly for large and structured networks.
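The abstract does not spell out the modification, but the title suggests updating a weight with the expected value of its source unit rather than a sampled state. The sketch below is a minimal illustration of that idea for a single weight, assuming a stochastic binary source unit whose expected value is its sigmoid firing probability; the names (`w`, `eta`, `delta_j`, `p_i`, `s_i`) are hypothetical and not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    # Logistic activation: the firing probability of a stochastic binary unit.
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# One weight from source unit i to target unit j (illustrative values).
w = 0.1          # weight w_ij
eta = 0.5        # learning rate
net_i = 0.8      # net input to source unit i
delta_j = 0.3    # back-propagated error signal at target unit j

p_i = sigmoid(net_i)             # expected value of source unit i
s_i = float(rng.random() < p_i)  # one sampled binary state of unit i

# Sampled-state update: noisy, depends on the particular draw (0 or eta*delta_j).
dw_sampled = eta * delta_j * s_i

# Expected-value update: deterministic, and equal to the mean of the sampled rule.
dw_expected = eta * delta_j * p_i
```

Because the expected-value update is the average of the sampled rule, it removes sampling noise from each step, which is one plausible reason such a modification could speed convergence on large networks.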

Original language: English (US)
Pages (from-to): 216
Number of pages: 1
Journal: Neural Networks
Volume: 1
Issue number: 1 SUPPL
DOIs
State: Published - 1988
Externally published: Yes
Event: International Neural Network Society 1988 First Annual Meeting - Boston, MA, USA
Duration: Sep 6, 1988 to Sep 10, 1988
